News Column

Patent Issued for Structured Polygonal Mesh Retesselation

March 6, 2014



By a News Reporter-Staff News Editor at Computer Weekly News -- From Alexandria, Virginia, VerticalNews journalists report that a patent by the inventors Yu, Meng (San Francisco, CA); Baraff, David (Oakland, CA), filed on September 29, 2010, was published online on February 18, 2014.

The patent's assignee for patent number 8654121 is Pixar (Emeryville, CA).

News editors obtained the following quote from the background information supplied by the inventors: "This disclosure relates to computer-generated imagery (CGI) and computer-aided animation. More specifically, this disclosure relates to an interactive multi-mesh garment modeling system and structured polygon mesh retesselation techniques for use in CGI and computer-aided animation.

"With the wide-spread availability of computers, computer graphics artists and animators can rely upon computers to assist in the production process for creating animations and computer-generated imagery (CGI). This may include using computers to represent physical models as virtual models in computer memory. Typically, two-dimensional (2D) or three-dimensional (3D) computer-aided animation combines 2D/3D models of objects and programmed movement of one or more of the models. In 3D computer animation, the first step is typically the object modeling process. Objects can be sculpted much like real clay or plaster, working from general forms to specific details, for example, with various sculpting tools. Models may then be constructed, for example, out of geometrical vertices, faces, and edges in a 3D coordinate system to represent the objects. These virtual models can then be manipulated using computers to, for example, simulate physics, design aesthetic actions such as poses or other deformations, create lighting, coloring and paint, or the like, of characters or other elements of a computer animation display.

"Pixar is one of the pioneering companies in the computer-generated imagery (CGI) and computer-aided animation industry. Pixar is more widely known as Pixar Animation Studios, the creators of animated features such as 'Toy Story' (1995) and 'Toy Story 2' (1999), 'A Bug's Life' (1998), 'Monsters, Inc.' (2001), 'Finding Nemo' (2003), 'The Incredibles' (2004), 'Cars' (2006), 'Ratatouille' (2007), and others. In addition to creating animated features, Pixar develops computing platforms and tools specially designed for computer-aided animation and CGI. One such example is now known as PhotoRealistic RenderMan, or PRMan for short. PRMan is a photorealistic RenderMan-compliant rendering software system based on the RenderMan Interface Specification (RISpec), which is Pixar's technical specification for a standard communications protocol (or interface) between 3D computer graphics programs and rendering programs. PRMan is produced by Pixar and used to render their in-house 3D animated movie productions. It is also available as a commercial product licensed to third parties, sold as part of a bundle called RenderMan Pro Server, a RenderMan-compliant rendering software system developed by Pixar based on their own interface specification. Other examples include tools and plug-ins for programs such as the AUTODESK MAYA high-end 3D computer graphics software package from Autodesk, Inc. of San Rafael, Calif.

"One core functional aspect of PRMan can include the use of a 'rendering engine' to convert geometric and/or mathematical descriptions of objects into images. This process is known in the industry as 'rendering.' For movies, other animated features, shorts, and special effects, a user (e.g., a skilled computer graphics artist) can specify the geometric or mathematical description of objects to be used in the rendered image or animation sequence, such as characters, props, background, or the like. In some instances, the geometric description of the objects may include a number of animation control variables (avars) and values for the avars. An animator may also pose the objects within the image or sequence and specify motions and positions of the objects over time to create an animation.

"As such, the production of CGI and computer-aided animation may involve the extensive use of various computer graphics techniques to produce a visually appealing image from the geometric description of an object that may be used to convey an essential element of a story or provide a desired special effect. One of the challenges in creating these visually appealing images can be the balancing of a desire for a highly-detailed image of a character or other object with the practical issues involved in allocating the resources (both human and computational) required to produce those visually appealing images.

"Therefore, one issue with the production process is the time and effort involved when a user undertakes to model the geometric description of an object. One practice in computer animation can be to use multiple versions of the same geometric model according to a specific task. One might, for example, create and render a model for final display, but prefer to use a somewhat simplified model for other purposes, such as creating animation or performing simulation. However, it may take several hours to several days for a user to design, create, rig, pose, paint, or otherwise prepare a model that can be used to produce the visually desired look for one task. This investment of time and effort can limit the ability of the user to create enough variants of the model for use in different stages of the production process or in a single scene to convey a particular element of the story or to provide the desired visual effect. Additionally, artistic control over the look of a model or its visual effect when placed in a scene may also be lost by some attempts at reducing the time and effort in preparing a model that rely too much on automated procedural creation of models.

"Accordingly, what is desired is to solve one or more of the problems relating to creating objects for use in CGI and computer-aided animation, some of which may be discussed herein. Additionally, what is desired is to reduce some of the drawbacks relating to creating objects for use in CGI and computer-aided animation, some of which may be discussed herein."

As a supplement to the background information on this patent, VerticalNews correspondents also obtained the inventors' summary information for this patent: "This disclosure relates to computer-generated imagery (CGI) and computer-aided animation. More specifically, this disclosure relates to an interactive multi-mesh modeling system and structured polygon mesh retesselation techniques for use in CGI and computer-aided animation.

"In various embodiments, an interactive multi-mesh modeling system may allow users to employ a variety of modeling techniques (e.g., solid modeling or shell/boundary modeling) to interactively create one or more computer-generated representations of objects (e.g., animated characters, static props, and objects whose motions are determined by computer simulations) for a variety of different tasks or tools associated with phases of modeling, layout and animation, and rendering. Some of these different tasks or tools can have requirements for the computer-generated representations of objects on which they operate. These requirements may differ from how some computer-generated representations were originally created (e.g., 3D solid objects output using solid modeling techniques vs. 2D flat 'panel constructed' objects required for some computer simulations). Thus, the interactive multi-mesh modeling system may further employ a variety of techniques for taking a source computer-generated representation of an object and providing the automatic creation, management, and maintenance of instances or versions of the source, and any information defined thereon or associated therewith, that are suitable for several different tasks.

"Accordingly, in one aspect, an interactive multi-mesh modeling system may automatically create and/or maintain one or more computer-generated representations of an object designated for each of a plurality of different tasks based on a source computer-generated representation of the object which may not be suitable to the requirements of some of the tasks. In another aspect, in maintaining each of the one or more computer-generated representations of the object designated for each of the plurality of different tasks, the interactive multi-mesh modeling system may automatically create and maintain correspondences between the source computer-generated representation of the object and any of the computer-generated representations of the object designated for the different tasks such that information defined on or otherwise associated with the source computer-generated representation of the object can be shared with or transferred to any of the other computer-generated representations of the object, and vice versa. In a further aspect, in maintaining each of the one or more computer-generated representations of the object designated for each of the plurality of different tasks, the interactive multi-mesh modeling system may automatically create and maintain correspondences between versions or iterations of computer-generated representations of the object that may be generated for the same task such that information defined on or otherwise associated with one computer-generated representation of the object used for the task can be shared with or transferred to any other computer-generated representation of the object for the same task, and vice versa. In a still further aspect, in maintaining each of the one or more computer-generated representations of the object designated for each of the plurality of different tasks, the interactive multi-mesh modeling system may automatically create and maintain correspondences between one of the computer-generated representations of the object designated for a first task and another computer-generated representation of the object designated for a second task such that information defined on or otherwise associated with one computer-generated representation of the object for one task can be shared with or transferred to another computer-generated representation of the object for another different task, and vice versa.
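The correspondence mechanism described in the summary can be illustrated with a minimal sketch: match each vertex of a task-specific mesh to its nearest source vertex, then transfer per-vertex information (such as paint or skin weights) through that mapping. The function names and the nearest-vertex scheme here are assumptions for illustration only; the patent does not specify how correspondences are built.

```python
# Minimal sketch of vertex-level correspondences between a source mesh and a
# task-specific mesh. Each derived vertex is matched to its nearest source
# vertex, and per-vertex attributes are shared through that mapping.
# All names are illustrative, not taken from the patent.

def build_correspondence(source_verts, derived_verts):
    """Map each derived vertex index to the nearest source vertex index."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return {
        i: min(range(len(source_verts)),
               key=lambda j: dist2(v, source_verts[j]))
        for i, v in enumerate(derived_verts)
    }

def transfer_attribute(corr, source_attr):
    """Carry a per-vertex attribute from the source mesh to the derived mesh."""
    return [source_attr[corr[i]] for i in sorted(corr)]

source = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
derived = [(0.1, 0.1), (0.9, 0.0)]
corr = build_correspondence(source, derived)
weights = transfer_attribute(corr, [1.0, 2.0, 3.0])  # e.g. paint weights
```

The same mapping can be inverted to push information from a derived mesh back to the source, which is the "vice versa" direction the summary describes.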

"In one embodiment, a computer-implemented method for mesh retesselation can include receiving a first non-planar mesh. A mesh generation specification for a second mesh may be received. The mesh generation specification can define a sampling strategy and rejection criteria. For each vertex of the second mesh beginning with a seed vertex, one or more potential vertices that neighbor the vertex are determined based on the mesh generation specification. Each of the one or more potential vertices satisfies the rejection criteria. Further, one or more edges are immediately determined based on the vertex and the one or more potential vertices. Information specifying the second mesh can be generated based on each vertex of the second mesh and the determined edges.
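The vertex-by-vertex growth described in this embodiment can be sketched as a breadth-first loop: starting from a seed, a sampling strategy proposes candidate neighbors, the rejection criteria filter them, and each accepted candidate yields an edge back to the vertex that proposed it. The function names and the concrete grid-based strategy below are assumptions for illustration, not Pixar's implementation.

```python
# Hedged sketch of seed-based mesh growth: propose_neighbors plays the role of
# the sampling strategy, accept plays the role of the rejection criteria.
from collections import deque

def retesselate(seed, propose_neighbors, accept):
    """Grow vertices and edges breadth-first from a seed vertex."""
    vertices, edges = [seed], []
    index = {seed: 0}          # vertex position -> vertex index
    frontier = deque([seed])
    while frontier:
        v = frontier.popleft()
        for cand in propose_neighbors(v):   # sampling strategy
            if not accept(cand):            # rejection criteria
                continue
            if cand not in index:
                index[cand] = len(vertices)
                vertices.append(cand)
                frontier.append(cand)
            edges.append((index[v], index[cand]))
    return vertices, edges

# Example: grow a unit grid inside a small rectangular domain.
def grid_neighbors(v):
    x, y = v
    return [(x + 1, y), (x, y + 1)]

verts, edges = retesselate((0, 0), grid_neighbors,
                           lambda v: 0 <= v[0] <= 2 and 0 <= v[1] <= 2)
```

On the 3-by-3 domain above this produces nine vertices and twelve edges, i.e. a complete grid; a real retesselation would propose candidates on the surface of the first non-planar mesh rather than in a flat parameter domain.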

"In some embodiments, generating the information specifying the second mesh can include generating a triangular mesh. In another aspect, generating the information specifying the second mesh can include generating the second mesh with the plurality of uniform structures having a user-specified size. In a further aspect, one or more vertices for the second mesh can be determined based on a boundary associated with the first mesh. One or more vertices for the second mesh may be determined in response to traversing a predetermined set of edges associated with the first mesh.

"In further embodiments, a set of vertices of the second mesh may be determined that fail to satisfy finishing criteria specified in the mesh generation specification. One or more edges may be determined based on the mesh generation specification and at least some of the vertices in the set of vertices that fail to satisfy the finishing criteria. In one aspect, an ordering may be determined for each vertex of the second mesh for which to determine potential vertices and edges associated with the vertex. The determined ordering may include a raster scan traversal of a space associated with the first mesh.
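The raster scan traversal mentioned here is simply a row-by-row ordering of positions in a 2D space. A minimal sketch of such an ordering, with illustrative names, might look like:

```python
# Illustrative raster-scan ordering: visit candidate vertex positions row by
# row (lowest y first), left to right within each row.
def raster_order(points):
    return sorted(points, key=lambda p: (p[1], p[0]))

order = raster_order([(2, 1), (0, 0), (1, 1), (1, 0)])
```

Processing vertices in such an order makes the growth deterministic, since each vertex's neighbors and edges are determined in a fixed sequence.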

"In some embodiments, receiving the mesh generation specification may include receiving a resolution of the second mesh. In other embodiments, receiving the mesh generation specification may further include receiving one or more edge constraints that preserve one or more structures associated with the first mesh.
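Taken together, these embodiments suggest a mesh generation specification bundling a resolution, edge constraints, a sampling strategy, and rejection criteria. One possible shape for such a specification (all field names and defaults are assumptions for illustration only) is:

```python
# Hypothetical container for the "mesh generation specification" the patent
# refers to; the patent does not define a concrete data layout.
from dataclasses import dataclass, field

@dataclass
class MeshGenerationSpec:
    resolution: float                  # target edge length of the second mesh
    edge_constraints: list = field(default_factory=list)  # (v0, v1) pairs to preserve
    sampling: str = "uniform-grid"     # identifier for the sampling strategy
    max_deviation: float = 0.01        # example rejection criterion: surface distance

spec = MeshGenerationSpec(resolution=0.5,
                          edge_constraints=[(0, 1), (1, 2)])
```

The edge constraints would pin structural edges of the first mesh (seams, trims, panel boundaries) so the second mesh reproduces them exactly.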

"In one embodiment, a non-transitory computer-readable medium stores computer-executable code for mesh retesselation. The computer-readable medium can include code for receiving a first non-planar mesh, code for receiving a mesh generation specification for a second mesh, the mesh generation specification defining a sampling strategy and rejection criteria, code for, for each vertex of the second mesh beginning with a seed vertex, determining one or more potential vertices that neighbor each vertex of the second mesh based on the mesh generation specification, each of the one or more potential vertices satisfying the rejection criteria, and determining one or more edges based on the vertex and the one or more potential vertices, and code for generating information specifying the second mesh based on each vertex of the second mesh and the determined edges.

"In another embodiment, a system for mesh retesselation can include a processor and a memory in communication with the processor and configured to store instructions which when executed by the processor cause the processor to receive a first non-planar mesh, receive a mesh generation specification for a second mesh, the mesh generation specification defining a sampling strategy and rejection criteria, for each vertex of the second mesh beginning at a seed vertex, determine one or more potential vertices that neighbor the vertex based on the mesh generation specification, each of the one or more potential vertices satisfying the rejection criteria, and determine one or more edges based on the vertex and the one or more potential vertices, and generate information specifying the second mesh based on each vertex of the second mesh and the determined edges.

"A further understanding of the nature of and equivalents to the subject matter of this disclosure (as well as any inherent or express advantages and improvements provided) should be realized by reference to the remaining portions of this disclosure, any accompanying drawings, and the claims in addition to the above section."

For additional information on this patent, see: Yu, Meng; Baraff, David. Structured Polygonal Mesh Retesselation. U.S. Patent Number 8654121, filed September 29, 2010, and published online on February 18, 2014. Patent URL: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=58&u=%2Fnetahtml%2FPTO%2Fsearch-bool.html&r=2892&f=G&l=50&co1=AND&d=PTXT&s1=20140218.PD.&OS=ISD/20140218&RS=ISD/20140218

Keywords for this news article include: Pixar, Software.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC






