News Column


May 27, 2014



Patent Issued for Measuring Method of 3D Image Depth and a System for Measuring 3D Image Depth Using Boundary Inheritance Based Hierarchical Orthogonal Coding

By a News Reporter-Staff News Editor at China Weekly News -- A patent by the inventors Lee, Sukhan (Yongin-si, KR); Bui, Quang Lam (Suwon-si, KR), filed on June 22, 2012, was published online on May 13, 2014, according to news reporting originating from Alexandria, Virginia, by VerticalNews correspondents.

Patent number 8724121 is assigned to Research & Business Foundation Sungkyunkwan University (Suwon-si, KR).

The following quote was obtained by the news editors from the background information supplied by the inventors: "The present invention relates to a method of decoding hierarchically orthogonal structured light and a 3-D depth measurement system using the same and, particularly, to a method of decoding hierarchically orthogonal structured light which enables precise 3-D depth measurement because a deformed signal can be precisely restored using a new scheme called signal separation coding, and a 3-D depth measurement system using the same.

"In general, 3-D depth measurement using structured light has recently been in the spotlight because it is well suited to sensing a 3-D environment in service robotics. The basic principle of depth measurement using structured light, that is, an active stereo scheme, is as follows: a ray of light is radiated onto an object using a projection apparatus, such as a projector; the illuminated object is captured using an image reception apparatus, such as a camera; the depth of the object is calculated by measuring how much the ray of light is distorted by the object; and a depth image is obtained from the calculated depth.

"FIG. 1 is a schematic diagram illustrating the principle of a 3-D depth measurement system based on structured light. As shown in FIG. 1, the 3-D position of a point x on the target object 100 is determined as the intersection of the straight line coupling the origin Op of the projection means with a point p on the image plane 200 of the projection apparatus, and the straight line coupling the origin Oc of the image reception apparatus with a point q on the image plane 300 of the image reception apparatus. Accordingly, if the projector (i.e., the projection apparatus) and the camera (i.e., the image reception apparatus) have been calibrated, the depth image may be obtained by calculating the coordinates of the point x from the pair of address values of the points p and q in their respective image planes. That is, in this stereo scheme, the core of the depth-image measurement method is to determine the point in the projected image that corresponds to each pixel in the received image. Once the corresponding point is determined, the distance may be easily calculated by simple geometry.
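The triangulation described above can be sketched in a few lines of code. The midpoint method used here is a common way to intersect two nearly-meeting rays; it is an illustrative choice, not necessarily the computation used in the patent, and the names (Op, dp, Oc, dc) merely follow the article's notation.

```python
# Active-stereo triangulation sketch: recover the 3-D point x as the
# (near-)intersection of the projector ray Op + t*dp (through p) and
# the camera ray Oc + s*dc (through q).

def sub(a, b): return [ai - bi for ai, bi in zip(a, b)]
def add(a, b): return [ai + bi for ai, bi in zip(a, b)]
def dot(a, b): return sum(ai * bi for ai, bi in zip(a, b))
def scale(a, s): return [ai * s for ai in a]

def triangulate(Op, dp, Oc, dc):
    """Midpoint of the closest points between rays Op + t*dp and Oc + s*dc."""
    w0 = sub(Op, Oc)
    a, b, c = dot(dp, dp), dot(dp, dc), dot(dc, dc)
    d, e = dot(dp, w0), dot(dc, w0)
    denom = a * c - b * b            # near zero when the rays are parallel
    t = (b * e - c * d) / denom      # parameter along the projector ray
    s = (a * e - b * d) / denom      # parameter along the camera ray
    p1 = add(Op, scale(dp, t))       # closest point on the projector ray
    p2 = add(Oc, scale(dc, s))       # closest point on the camera ray
    return scale(add(p1, p2), 0.5)   # midpoint = estimated 3-D point x
```

With two rays aimed at the same scene point from different origins, the function returns that point; with slightly noisy rays it returns the midpoint of their closest approach, which is why the method is robust to small calibration errors.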

"For accuracy of depth measurement, the light pattern projected from the projection apparatus is coded according to spatial and/or time sequences on the pixel array, so that the spatial and/or temporal address of a signal detected by the image reception apparatus uniquely determines the corresponding pixel of the projection apparatus.

"As related to this application, Korean Patent Registration No. 10-0588296 discloses a 3-D image generation apparatus using orthogonally hierarchically structured light, in which each pixel of the projection apparatus is addressed and projected using orthogonally hierarchically structured light and the results are decoded in an image reception apparatus. The conventional technique for hierarchically structured light proposed in Korean Patent Registration No. 10-0588296 is briefly described using FIG. 2. FIG. 2 is an exemplary diagram showing projection light of hierarchically orthogonal light, having three hierarchical layers, projected on a projection region. As shown in FIG. 2(a), in layer 1 the projection region is addressed by light projected between points of time t0 and t3: the projection region is addressed into (1 0 0 0) and projected at t0, into (0 1 0 0) at t1, into (0 0 1 0) at t2, and into (0 0 0 1) at t3. Likewise, as shown in FIG. 2(b), in layer 2 each of the regions of layer 1 is addressed into a more detailed region and projected, over points of time t4 to t7: the projection region is addressed into (1000 1000 1000 1000) and projected at t4, into (0100 0100 0100 0100) at t5, into (0010 0010 0010 0010) at t6, and into (0001 0001 0001 0001) at t7. Finally, as shown in FIG. 2, in layer 3 each of the regions classified in layer 2 is addressed into a more detailed region and projected, over points of time t8 to t11: the projection region is addressed into (1000100010001000 1000100010001000 1000100010001000 1000100010001000) and projected at t8, into (0100010001000100 0100010001000100 0100010001000100 0100010001000100) at t9, into (0010001000100010 0010001000100010 0010001000100010 0010001000100010) at t10, and into (0001000100010001 0001000100010001 0001000100010001 0001000100010001) at t11.
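The layering in FIG. 2 amounts to a base-4 address: each layer projects four one-hot patterns in turn, and the time slot in which a given pixel lights up in each layer yields one base-4 digit of its projector column address. The following is a toy sketch of that idea (not the patent's implementation); the function names are illustrative.

```python
# Toy encoder/decoder for three-layer hierarchical orthogonal addresses,
# as in FIG. 2: layer 0 uses slots t0-t3, layer 1 uses t4-t7, layer 2
# uses t8-t11, and each layer contributes one base-4 digit.

LAYERS = 3               # three hierarchical layers
REGIONS = 4 ** LAYERS    # 64 addressable projector columns

def lit(addr, layer, slot):
    """True if projector column `addr` is illuminated in time slot
    `slot` (0-3) of `layer` (0 = coarsest)."""
    digit = (addr // 4 ** (LAYERS - 1 - layer)) % 4
    return digit == slot

def decode(observations):
    """observations[layer] = slot index in which the pixel was lit;
    returns the projector column address."""
    addr = 0
    for digit in observations:
        addr = addr * 4 + digit
    return addr
```

For example, a pixel lit in slot 2 of layer 1, slot 1 of layer 2, and slot 1 of layer 3 decodes to column 2·16 + 1·4 + 1 = 37.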

"As shown in FIG. 2, when such an image is projected by the projection apparatus, the image reception apparatus may obtain information about the depth of a 3-D image by decoding the address of the projection apparatus corresponding to each pixel. A system using hierarchically structured light as in FIG. 2 is problematic, however, in that the boundary lines in the image signal received by the image reception apparatus become unclear depending on the geometric shape of the target object placed in the relevant boundary region, the degree to which light is reflected, external environments, and so on, even though the regions are classified and the light is projected according to precise timing. Nevertheless, the conventional technique classifies the image according to that timing when decoding the addresses included in a received image, and restores those addresses despite the above problem, which causes many errors."

In addition to the background information obtained for this patent, VerticalNews journalists also obtained the inventors' summary information for this patent: "An object of the present invention is to provide a method of decoding hierarchically orthogonal structured light using hierarchically structured light and a 3-D depth measurement system using the same, which can reduce decoding errors by clarifying the boundary region of a lower layer in checking a 3-D depth using hierarchically structured light.

"The present invention relates to a method of precisely searching for the boundary line of a pattern of radiated structured light based on the real coordinate system of an image plane, and its objects are to search for a boundary line irrespective of the reflection of the object's surface, to clearly distinguish a true boundary line from a false boundary line in a poor radiance environment, and to increase the accuracy of a boundary line placed in another layer through inheritance. The object of the present invention relates to a method of decoding hierarchically orthogonal structured light, the method obtaining a 3-D image by encoding structured light including hierarchically orthogonal address information, radiating the encoded light to a target object, and decoding the relevant addresses from a camera image obtained by capturing the target object to which the encoded light is radiated; it may be achieved by a step of detecting boundary lines encoded at an identical position between layers, and a boundary line inheritance step of converting the boundary lines detected in that step into identical boundary lines.

"The step of detecting boundary lines encoded at an identical position between layers may include a step of correcting the boundary lines by taking different reflectivities of surfaces of the target object into consideration in order to precisely measure positions of the boundary lines.

"The step of correcting the boundary lines by taking different reflectivities of surfaces of the target object into consideration may use

"fs(x) = R(x)·[(s * gp(·, σp)) * gc(·, σc)](x) + A(x) + W(x)   (Equation 1)

"fc(x) ≈ (H − L)·(fs(x) − f0(x)) / (f1(x) − f0(x)) + L   (Equation 2)

"(wherein * denotes convolution; fs(x) is the captured light stripe signal in the camera image; s(x) is the pattern for projector illumination; R(x) is the reflection index of the local surface at x; fc(x) is the canonical form of the light pattern; A(x) is ambient light; gp(x, σp) is the Gaussian blur kernel of the projector lens at position x with blur radius σp; gc(x, σc) is the Gaussian blur kernel of the camera lens at position x with blur radius σc; W(x) is the noise of the imaging sensor; f0(x) is the captured image when a black pattern is projected; f1(x) is the captured image when a white pattern is projected; H is the maximum intensity of the pattern image (here 255); and L is the minimum intensity of the pattern image (here 0).)
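Under the definitions above, the black- and white-pattern captures f0(x) and f1(x) bracket the surface's response, so dividing by their difference cancels the per-pixel reflectance R(x) and ambient term A(x). A minimal per-pixel sketch of that normalization, assuming the (H − L)-scaled form of Equation 2, follows; the `eps` guard is an added practical detail, not part of the patent text.

```python
# Per-pixel canonical-form normalization: map the captured stripe
# intensity fs into the [L, H] range using the black-pattern capture
# f0 and white-pattern capture f1 as the local dark/bright references.

H, L = 255, 0  # max/min pattern intensities, as stated in the patent text

def canonical(fs, f0, f1, eps=1e-6):
    """Canonical form fc of a captured intensity fs, correcting for the
    local surface reflectance and ambient light at that pixel."""
    return (H - L) * (fs - f0) / max(f1 - f0, eps) + L
```

For instance, a pixel reading 100 whose black/white references are 50 and 150 normalizes to 127.5, i.e., a half-intensity stripe regardless of how dark or reflective the surface is.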

"The boundary line inheritance step of converting the detected boundary lines into identical boundary lines may include a first step of obtaining a detected boundary point d from which an image is checked in a current layer, a second step of obtaining a time-series boundary point D detected in time series, a third step of calculating a difference value d-D between the time-series boundary point D and the detected boundary point d, and a fourth step of changing the detected boundary point d into an identical boundary with the time-series boundary point D if the difference value calculated in the third step falls within a range .DELTA.d and setting the detected boundary point d as a new boundary point if the difference value calculated in the third step does not fall within the range .DELTA.d. Here, the first step and the second step are performed irrespective of order.
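The four-step inheritance rule above reduces to a simple tolerance test. The sketch below is an illustrative reading of those steps, not the patent's implementation: `d` is the boundary detected in the current layer, `D` is the time-series (or upper-layer) boundary point, and `delta_d` is the tolerance range Δd from the text.

```python
# Boundary inheritance rule: if the newly detected boundary d lies
# within delta_d of the reference boundary D, snap it to D (inherit);
# otherwise keep d as a genuinely new boundary point.

def inherit_boundary(d, D, delta_d):
    """Return the boundary position to use in the current layer."""
    if abs(d - D) <= delta_d:
        return D    # steps 3-4: |d - D| within range -> inherit D
    return d        # otherwise d is set as a new boundary point
```

Snapping to D rather than averaging keeps boundaries at identical positions across layers, which is the stated goal of the inheritance step.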

"The boundary line inheritance step of converting the detected boundary lines into identical boundary lines may include a first step of obtaining an upper layer boundary point D detected in an upper layer, a second step of obtaining a detected boundary point d from which an image is checked in a current layer, a third step of calculating a difference value d-D between the detected boundary point d and the upper layer boundary point D, and a fourth step of changing the detected boundary point d into an identical boundary with the upper layer boundary point D if the difference value calculated in the third step falls within a range .DELTA.d and setting the detected boundary point d as a new boundary point if the difference value calculated in the third step does not fall within the range .DELTA.d.

"The boundary line inheritance step of converting the detected boundary lines into identical boundary lines may include setting a region between d and D in the current layer so that the region between d and D in the current layer may be included in a region next to a time-series boundary point D of the upper layer or the upper layer boundary point D, if the difference value d-D calculated in the third step has a value smaller than zero.

"The boundary line inheritance step of converting the detected boundary lines into identical boundary lines may include setting a region between d and D in the current layer so that the region between d and D in the current layer may be included in a region anterior to a time-series boundary point D of the upper layer or the upper layer boundary point D, if the difference value d-D calculated in the third step has a value greater than zero.
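The last two paragraphs give a sign rule for the pixels caught between a merged pair of boundaries: when d − D < 0 the strip between them joins the region after D, and when d − D > 0 it joins the region before D. A minimal sketch of that rule follows; the region labels are hypothetical stand-ins, not the patent's data structure.

```python
# Region reassignment for pixels lying between the detected boundary d
# and the reference boundary D after the two are merged.

def assign_between(d, D, region_before_D, region_after_D):
    """Region that absorbs the strip between d and D in the current layer."""
    if d - D < 0:
        return region_after_D    # d fell short: strip joins the region next to D
    elif d - D > 0:
        return region_before_D   # d overshot: strip joins the region anterior to D
    return region_after_D        # d == D: boundaries coincide, strip is empty
```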

"Another object of the present invention relates to a 3-D depth measurement system, including a projection apparatus for encoding structured light including hierarchically orthogonal addresses and radiating the encoded structured light to a target object, an image reception apparatus for receiving light radiated to the target object, and an operation processing apparatus for decoding an address of the structured light, corresponding to each pixel of the image reception apparatus, from an image received from the image reception apparatus and calculating and processing a 3-D depth of the target object, wherein the operation processing apparatus detects boundary lines encoded at an identical position between layers and performs a boundary line inheritance process of converting the detected boundary lines into identical boundary lines. The boundary inheritance process may include obtaining a detected boundary point d from which an image is checked in a current layer, obtaining a time-series boundary point D detected in time series, calculating a difference value d-D between the time-series boundary point D and the detected boundary point d, and changing the detected boundary point d into an identical boundary with the time-series boundary point D if the difference value falls within a range .DELTA.d and setting the detected boundary point d as a new boundary point if the difference value does not fall within the range .DELTA.d. 
The boundary inheritance process may include a first step of obtaining an upper layer boundary point D detected in an upper layer, a second step of obtaining a detected boundary point d from which an image is checked in a current layer, a third step of calculating a difference value d-D between the detected boundary point d and the upper layer boundary point D, and a fourth step of changing the detected boundary point d into an identical boundary with the upper layer boundary point D if the difference value calculated in the third step falls within a range .DELTA.d and setting the detected boundary point d as a new boundary point if the difference value calculated in the third step does not fall within the range .DELTA.d.

"In the method of decoding hierarchically orthogonal structured light and the 3-D depth measurement system using the same according to the present invention, in calculating the address of hierarchical light provided by the projection means, a substantial boundary region is calculated, compared with a boundary point checked in time-series, and then made identical with a boundary point placed in an upper layer if a difference between the substantial boundary region and the boundary point falls within a specific range. Accordingly, 3-D depth image generation errors can be reduced.

"The present invention relates to a method of precisely searching for the boundary line of a pattern of radiated structured light based on the real coordinate system of an image plane and can search for a boundary line irrespective of the reflection of a surface of an object, clearly classify a true boundary line and a false boundary line in a poor radiance environment, and increase the accuracy of a boundary line placed in another layer through inheritance."

For the URL and more information on this patent, see: Lee, Sukhan; Bui, Quang Lam. Measuring Method of 3D Image Depth and a System for Measuring 3D Image Depth Using Boundary Inheritance Based Hierarchical Orthogonal Coding. U.S. Patent Number 8724121, filed June 22, 2012, and published online on May 13, 2014. Patent URL: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=57&u=%2Fnetahtml%2FPTO%2Fsearch-bool.html&r=2823&f=G&l=50&co1=AND&d=PTXT&s1=20140513.PD.&OS=ISD/20140513&RS=ISD/20140513

Keywords for this news article include: Asia, Korea, Research & Business Foundation Sungkyunkwan University.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC


Source: China Weekly News
