
Patent Issued for Directing Indirect Illumination to Visibly Influenced Scene Regions

July 22, 2014



By a News Reporter-Staff News Editor at Information Technology Newsweekly -- From Alexandria, Virginia, VerticalNews journalists report that a patent by the inventors Budge, Brian Christopher (Tiburon, CA); Arbree, Adam Joseph (Tiburon, CA), filed on May 26, 2011, was published online on July 8, 2014.

The patent's assignee for patent number 8773434 is Autodesk, Inc. (San Rafael, CA).

News editors obtained the following quote from the background information supplied by the inventors: "The present invention relates generally to lighting a three-dimensional (3D) model, and in particular, to a method, apparatus, and article of manufacture for indirectly illuminating particular scene regions.

"Rendering algorithms often trace light particles from a scene's light sources as part of the scene's calculation of global illumination (photon mapping, instant radiosity, lightcuts, etc.). Generating these particle paths is a difficult problem because, in expansive cases, much of the scene is not even indirectly viewable from the camera. If a particle set could be restricted to only the regions affecting the camera, the set of stored photons becomes much smaller and later operations using these photons become more efficient. This increased efficiency translates into faster rendering times and increased rendering quality. The prior art fails to provide a mechanism for generating particle paths in such a way that the indirect illumination garnered from the paths is likely to contribute to the final image. Such problems may be better understood with a description of prior art lighting techniques.

"Many applications attempt to perform simulated photography in a large scene. In other words, the application simulates the placement of a camera at a particular location in a scene and attempts to illuminate the scene (referred to as global illumination). Further, the light received at the camera's location must have a sufficient density to produce an accurate/acceptable rendering. As part of the illumination process, paths between the camera and the light source(s) are computed. If the path is complicated, there may not be sufficient density or enough paths relative to the entire scene (e.g., millions of paths in the scene but only a few that reach the camera location). For example, the camera may be located in a single room of a multi-room hotel looking out a window. Prior art global illumination techniques do not take the camera's location into account and may therefore insufficiently illuminate the hotel room where the camera is located (i.e., for simulated light). More specifically, the global illumination computation will consume processing resources by globally illuminating the scene without regard to a targeted area such as a camera's location. Accordingly, it is useful to label the camera's location in a scene and bias the scene illumination based on such a location. In other words, it is desirable to isolate a small important subset of the global illumination computation. However, prior art methodologies fail to perform global computations with such a bias. To better understand global illumination, a description of prior art global illumination computations is useful.

"Two different types of prior art are often utilized to compute global illumination--geometric based computations (e.g., progressive radiosity) and lighting based computations.

"In geometric based prior art solutions, the geometric representations of areas not important in the scene are computed at a low quality or are actually modified to produce more/fewer vertices for use in the analysis. Thus, the geometry of a scene is analyzed and the rendering of non-important geometry may be performed at a reduced resolution/quality.

"Most prior art systems that compute global illumination are lighting based. One such method utilizes virtual point lights (VPLs). A VPL is a point light source that is placed virtually in a scene location. A VPL is used as a source of light and may not be the originating light source. For example, if the light source is the sun that reflects or bounces off many objects in a scene (e.g., ground, window, wall, etc.), during a process of forward ray tracing (i.e. tracing rays from light sources) a VPL may be placed at any of the intersecting/reflecting locations (e.g., on the ground, window, wall, etc.). To use the VPLs, they must first be created. In a small scene, (e.g., a light source and a single room with a window), many VPLs can be projected into the scene with sufficient density to obtain a reasonable estimate of global illumination. However, many applications attempt to compute global illumination in a large scene (e.g., 1 mile by 1 mile with a camera in one room of a large house that has a window to the outside, or to a multi-room hotel). While millions of VPLs may be traced, the density in the room with the camera may be very low and insufficient. Thus, prior art methods fail to provide an efficient mechanism for computing global illumination that is biased in a particular location.

"Another prior art approach is that of traditional ray tracing or backward ray tracing. Ray tracing is a technique that models the path taken by light by following rays of light as they interact with optical surfaces. In a 3D graphics environment, backward ray tracing follows rays from the camera eyepoint outward, rather than originating at a light source. Thus, visual information on the appearance of the scene is viewed from the point of view of the camera, and lighting conditions specified are interpreted to produce a shading value. The ray's reflection, refraction, or absorption are calculated when it intersects objects and media in the scene.

"Building on ray tracing is another prior art technique called photon mapping. Photon mapping is noted for its ability to handle caustics (specular indirect effects) (e.g., rather than radiosity which is for diffuse indirect effects) as well as diffuse inter-reflection. Photon mapping uses forward ray tracing to deposit photons from the light sources into objects in the scene. The photons are stored in a binary space partitioning (BSP) tree data structure where neighbors can be quickly discovered and photons merged to constrain memory use. BSP is a method for recursively subdividing a space into convex sets by hyperplanes. The subdivision gives rise to a representation of the scene by means of a tree data structure referred to as the BSP tree. In the case of reflective or refractive objects, new photons are generated from the incoming set and further propagated through the scene, again using ray tracing, and the resulting photons are added to the tree. Each photon stored contains information including the direction the photon came from, where the photon hits a surface, and reflection properties at the surfaces.

"A photon mapping algorithm usually proceeds in two phases. First a coarse illumination solution is prepared as described above. Second, the coarse illumination is 'gathered,' pixel by pixel, to produce a smooth final output. This gathering step requires many rays for quality results and is the subject of much research.

"In many non-geometric based prior art solutions, global illumination is often analyzed in an unbiased context. Unbiased prior art techniques attempt to account for every path in a scene, a process that is computationally and processor intensive. Few prior art solutions use biased approaches to ensure relevant global illumination computations. One method that attempts to perform a bias based computation is that of 'Importons'. Importons are similar to 'visual particles' that are emitted from the camera and bounce towards light. Importons move in the opposite direction to how photons travel but in contrast to photons, importons store color data that describes the factor with which an illumination at a certain location would contribute to the final image. Thus, with importons, the path is followed from the camera eyepoint, and importons are deposited at intersecting/bounce locations. Thereafter, photon mapping is performed. During the photon mapping (i.e., from the light source to the camera eyepoint), if a photon lands near an importons, it is stored, otherwise, the photon/location is ignored. Accordingly, since only particular photons are stored, the importons serve to bias the global illumination using photon mapping based on where importons are stored.

"Global illumination may also be based on 'daylight portals' to enable faster and better targeting for particle tracing. A daylight portal is a portal that is placed by the user (i.e., it is not automatically generated) and identifies a region of interest. These portals also present a major problem in scenes that are partially outdoors and partially indoors. Namely, using portals allows fast particle tracing into the indoor portion of the scene, but essentially disables global illumination due to the environment in outdoor portions of the scene. Portals can be disabled, allowing global illumination in all portions of the scene, but particle densities will be far from desirable (in large scenes, particles will be distributed over a large area). Moreover, even with portals, this density problem may occur, since portals may be located far from the camera, in other rooms, or even in other buildings. Thus, prior art solutions fail to allow particle densities to be relatively high in areas that actually contribute illumination to the final image.

"In summary, the prior art global illumination computations have many deficiencies and problems. Geometric based prior art solutions modify scene geometry and are processor intensive. Unbiased lighting-based prior art solutions attempt to compute the global illumination for an entire scene and may reach the maximum capacity for memory/processor usage. The biased prior art computations focus on the final computation used during photon tracing/mapping and therefore also may exceed memory and processing limitations, or fail to have sufficient density of relevant data to compute a high quality global illumination solution.

"In view of the above, one may note that all of the prior art solutions, whether geometric, lighting based, unbiased, or biased, fail to easily and efficiently illuminate a particular scene location without wasting precious memory and processing resources for illuminating an entire scene."

As a supplement to the background information on this patent, VerticalNews correspondents also obtained the inventors' summary information for this patent: "Embodiments of the invention are directed towards global illumination while focusing/targeting a region of interest. The location of the camera/eyepoint is used to influence and isolate the rendering computation for a final image in a sophisticated manner. Light is emitted from the camera/eyepoint in a reverse rendering process to identify targets in a scene. Thereafter, photons are projected/emitted from a light source. Only those photons that fall on photon paths that intersect with the targets are accepted and later utilized as virtual point lights to illuminate a region of interest in a scene."
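The workflow described in this summary might be sketched loosely as follows (a hypothetical illustration only; the patent's claims define the actual method, and all data and names here are assumptions): a camera pass marks targets, light-traced photons are accepted only if their paths pass near a target, and the accepted hits are reused as virtual point lights.

    # Targets identified by emitting particles from the camera/eyepoint.
    targets = [(0.0, 0.0, 0.0), (0.5, 0.0, 1.0)]

    def path_hits_target(path, radius=0.6):
        """True if any vertex of a photon path lies within 'radius' of a target."""
        r2 = radius * radius
        return any(sum((a - b) ** 2 for a, b in zip(vertex, target)) <= r2
                   for vertex in path for target in targets)

    # Photon paths traced from the light source, each a list of hit points.
    photon_paths = [
        [(3.0, 2.0, 0.0), (0.3, 0.1, 0.2)],   # reaches the region of interest
        [(8.0, 2.0, 0.0), (9.0, 0.0, 0.0)],   # never comes near a target
    ]

    # Accepted photon hits become virtual point lights for the region of interest.
    vpls = [path[-1] for path in photon_paths if path_hits_target(path)]
    print("VPLs placed at:", vpls)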

For additional information on this patent, see: Budge, Brian Christopher; Arbree, Adam Joseph. Directing Indirect Illumination to Visibly Influenced Scene Regions. U.S. Patent Number 8773434, filed May 26, 2011, and published online on July 8, 2014. Patent URL: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=8773434.PN.&OS=PN/8773434&RS=PN/8773434

Keywords for this news article include: Autodesk Inc., Information Technology, Information and Data Architecture.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC





Source: Information Technology Newsweekly

