News Column


May 14, 2014



Researchers Submit Patent Application, "Imaging Optical System for 3d Image Acquisition Apparatus, and 3d Image Acquisition Apparatus Including the Imaging Optical System", for Approval

By a News Reporter-Staff News Editor at Electronics Newsweekly -- From Washington, D.C., VerticalNews journalists report that a patent application by the inventors PARK, Yong-hwa (Yongin-si, KR); GORELOV, Alexander (Moscow, RU); YOU, Jang-woo (Yongin-si, KR); SHIRANKOV, Alexander (Moscow, RU); LEE, Seung-wan (Suwon-si, KR), filed on October 22, 2013, was made available online on May 1, 2014.

The patent's assignee is Samsung Electronics Co., Ltd.

News editors obtained the following quote from the background information supplied by the inventors: "Methods and apparatuses consistent with exemplary embodiments relate to an imaging optical system for a three-dimensional (3D) image acquisition apparatus, and a 3D image acquisition apparatus including the imaging optical system, and more particularly, to an imaging optical system having a decreased size and a 3D image acquisition apparatus including the imaging optical system so that the size of the 3D image acquisition apparatus may be decreased.

"As the demand for 3D display apparatuses has increased, the use of and request for three-dimensional (3D) contents have also increased. Accordingly, 3D image acquisition apparatuses such as 3D cameras for producing 3D contents have been developed. A 3D camera should acquire general two-dimensional (2D) color image information along with depth information via one photographing operation.

"The depth information regarding a distance between surfaces of a target object and the 3D camera may be obtained by a stereo vision method using two cameras or by a triangulation method using structured light and a camera. However, as the distance from the target object is increased, the accuracy of the depth information substantially deteriorates when the aforementioned methods are used. Also, the depth information varies with the states of the surfaces of the target object, and thus, it is difficult to acquire accurate depth information when these methods are used.

"In order solve these problems, a Time-of-Flight (TOF) method has been developed. In the TOF method, illumination light is irradiated to a target object, and then an optical flight time until light reflected from the target object is received by a light receiving unit is measured. The illumination light has a particular wavelength (e.g., a near infrared ray of 850 nm) and is irradiated to the target object by an illuminating optical system including a light-emitting diode (LED) or a laser-diode (LD), and light that has the same wavelength and is reflected from the target object is received by the light receiving unit. Thereafter, a series of processes in which the received light is modulated by using a modulator having a known gain waveform are performed to extract depth information. Various TOF methods using a predetermined series of optical processes have been introduced.

"In general, a 3D camera using a TOF method includes an illuminating optical system for emitting illumination light to acquire depth information, and an imaging optical system for acquiring an image of a target object. The imaging optical system generates a general color image by sensing visible light reflected from the target object and simultaneously generates a depth image only having depth information by sensing illumination light reflected from the target object. For this purpose, the imaging optical system may separately include an object lens and an image sensor for visible light, and an object lens and an image sensor for illumination light (i.e., a two-lens and two-sensor structure). However, in the two-lens and two-sensor structure, a color image and a depth image have different fields of view, and thus, a separate process is required to accurately match the two images. Accordingly, a size of the 3D camera and the manufacturing costs are increased.

"Thus, a 3D camera having one common object lens and two image sensors (i.e., a one-lens and two-sensor structure) has been developed. However, even in the one-lens and two-sensor structure, there remains a need to prevent volume and weight increases of an imaging optical system and the 3D camera and also an increase of the manufacturing costs."

As a supplement to the background information on this patent application, VerticalNews correspondents also obtained the inventors' summary information for this patent application: "One or more exemplary embodiments may provide an imaging optical system having a decreased size, whereby a size of a 3D image acquisition apparatus having one common object lens and two image sensors with different sizes may be decreased.

"One or more exemplary embodiments may further provide a 3D image acquisition apparatus including the imaging optical system.

"Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented exemplary embodiments.

"According to an aspect of an exemplary embodiment, there is provided an imaging optical system including an object lens configured to transmit light; first and second image sensors having different sizes; a beamsplitter on which the light transmitted by the object lens is incident, the beamsplitter being configured to split the light incident thereon into light of a first wavelength band and light of a second wavelength band, and direct the light of the first wavelength band to the first image sensor and the light of the second wavelength band to the second image sensor; and at least one optical element disposed between the beamsplitter and the second image sensor, being configured to reduce an image that is incident on the second image sensor, wherein the at least one optical element includes at least one of a Fresnel lens and a diffractive optical element (DOE).

"The at least one optical element may include at least two of the Fresnel lenses which are sequentially disposed along an optical path between the beamsplitter and the second image sensor.

"The at least one optical element may include a first optical element and a second optical element which are sequentially disposed along an optical path between the beamsplitter and the second image sensor, where the first optical element is a Fresnel lens and the second optical element is a DOE.

"The Fresnel lens may be a collimating element that converts light reflected from the beamsplitter into parallel light, and the DOE may reduce an image by converging the parallel light onto the second image sensor.

"The imaging optical system may further include an optical shutter which is disposed between the at least one optical element and the second image sensor and is configured to modulate the light of the first wavelength band light and provide the modulated light to the second image sensor.

"The size of the second image sensor may be less than the size of the first image sensor, and the light of the first wavelength band may include visible light and the light of the second wavelength band may include infrared light.

"The beamsplitter may be configured to transmit the light of the first wavelength band and may reflect the light of the second wavelength band.

"According to an aspect of another exemplary embodiment, there is provided an imaging optical system including an object lens configured to transmit light; first and second image sensors having different sizes; and a beamsplitter on which the light transmitted by the object lens is incident, the beamsplitter being configured to split the light incident thereon into light of a first wavelength band and light of a second wavelength band, and direct the light of the first wavelength band to the first image sensor and the light of the second wavelength band to the second image sensor, wherein the beamsplitter is inclined by more than about 45 degrees with respect to an optical axis of the object lens.

"The beamsplitter may include a plurality of first slopes which are formed on a light-incident surface of the beamsplitter to be inclined by about 45 degrees with respect to the optical axis of the object lens; and a plurality of second slopes which are formed on a light-exit surface of the beamsplitter to have a complementary shape with respect to the plurality of minute first slopes, wherein the plurality of first slopes and the plurality of second slopes are parallel to each other.

"The beamsplitter may further include a wavelength separation filter configured to transmit the light of the first wavelength band and reflects the light of the second wavelength band, and the wavelength separation filter may be coated on the plurality of first slopes.

"The beamsplitter may include reflective first diffraction patterns that are formed on a light-incident surface of the beamsplitter, wherein the reflective first diffraction patterns are configured to transmit the light of the first wavelength band and reflect the light of the second wavelength band, and are positioned so that a reflection angle of the reflected light of the second wavelength band is about 45 degree with respect to the optical axis; and second diffraction patterns that are formed on a light-exit surface of the beamsplitter and have a complementary shape with respect to the reflective first diffraction patterns.

"The imaging optical system may further include at least one optical element disposed between the beamsplitter and the second image sensor, the at least one optical element being configured to reduce an image which is incident on the second image sensor, wherein the at least one optical element includes at least one of a Fresnel lens and a diffractive optical element (DOE).

"According to an aspect of another exemplary embodiment, there is provided an imaging optical system including an object lens configured to transmit light; first and second image sensors having different sizes; and a beamsplitter on which the light transmitted by the object lens is incident, the beamsplitter being configured to split the light incident thereon into light of a first wavelength band and light of a second wavelength band, and to direct the light of the first wavelength band to the first image sensor and the light of the second wavelength band to the second image sensor, wherein the beamsplitter has a concave reflective surface coated with a wavelength separation filter which is configured to to transmit the light of the first wavelength band and to reflect the light of the second wavelength band.

"The imaging optical system may further include a convex minor configured to reflect the light of the second wavelength band reflected by the beamsplitter, and a flat minor configured to reflect the light of the second wavelength band reflected by the convex minor toward the second image sensor.

"The imaging optical system may further include a flat mirror configured to reflect the light of the second wavelength band reflected by the beamsplitter, and a convex minor configured to reflect the light of the second wavelength band reflected by the flat minor toward the second image sensor.

"According to an aspect of another exemplary embodiment, there is provided an imaging optical system including an object lens configured to transmit light; first and second image sensors having different sizes; and a beamsplitter on which the light transmitted by the object lens is incident, the beamsplitter being configured to split the light incident thereon into light of a first wavelength band and light of a second wavelength band, and to direct the light of the first wavelength band to the first image sensor and the light of the second wavelength band to the second image sensor, wherein the beamsplitter includes a first dichroic mirror and a second dichroic mirror which are disposed in an upper region and a lower region, respectively, with respect to an optical axis of the object lens, wherein the first dichroic minor and the second dichroic minor partially contact each other along the optical axis of the object lens, and are folded by a predetermined angle with respect to a reflective surface of the second dichroic mirror, wherein the first dichroic mirror is configured to transmit the light of the first wavelength band and reflect the light of the second wavelength band toward the upper region, and wherein the second dichroic minor is configured to transmit the light of the first wavelength band and reflect the light of the second wavelength band toward the lower region.

"The imaging optical system may further include a first minor that is disposed to face the first dichroic minor and is configured to reflect the light of the first wavelength band reflected by the first dichroic mirror toward the second image sensor; and a second minor that is disposed to face the second dichroic minor and is configured to reflect the light of the first wavelength band reflected by the second dichroic minor toward the second image sensor.

"Reflective diffraction patterns having an image reduction function may be formed on a reflective surface of the first minor and a reflective surface of the second mirror.

"According to an aspect of another exemplary embodiment, there is provided an imaging optical system including an object lens configured to transmit light; first and second image sensors having different sizes; a beamsplitter on which the light transmitted by the object lens is incident, the beamsplitter being configured to split the light incident thereon into light of a first wavelength band and light of a second wavelength band, and to direct the light of the first wavelength band to the first image sensor and the light of the second wavelength band to the second image sensor; and a fiber optic taper disposed between the beamsplitter and the second image sensor, and having a light-incident surface greater than a light-exit surface.

"The imaging optical system may further include at least one optical element disposed between the beamsplitter and the fiber optic taper, the at least one optical element being configured to reduce an image which is incident on the second image sensor, wherein the at least one optical element includes at least one of a Fresnel lens and a diffractive optical element (DOE).

"The at least one optical element may include the Fresnel lenses which is configured to convert light from the beamsplitter into parallel light, and the fiber optic taper may be configured to reduce an image by converging the parallel light onto the second image sensor.

"The optical element may include the Fresnel lens which is configured to convert light from the beamsplitter into parallel light, and the DOE which is configured to reduce an image by converging the parallel light, and the fiber optic taper may be configured to additionally reduce the image that is reduced by the DOE.

"According to an aspect of another exemplary embodiment, there may be provided a three-dimensional (3D) image acquisition apparatus including the imaging optical system, a light source configured to generate light of the second wavelength band and irradiate the light of the second wavelength band onto a target object; an image signal processor (ISP) configured to generate a 3D image by using an image output from the first image sensor and an image output from the second image sensor; and a control unit configured to control operations of the light source and the ISP.

"The light source may be configured to irradiate the second wavelength band light having a predetermined period and a predetermined waveform to the target object, according to a control operation performed by the control unit.

"The light of the first wavelength band may include visible light and the light of the second wavelength band may include infrared light, the first image sensor may be configured to generate a color image having a red ® component, a green (G) component, and a blue (B) component for each of a plurality of pixels in the first image sensor, and the second image sensor may be configured to generate a depth image regarding a distance between the 3D image acquisition apparatus and the target object.

"The ISP may be configured to calculate a distance between the target object and the 3D image acquisition apparatus for each of the plurality of pixels in the first image sensor by using the depth image output from the second image sensor, and combine calculation results with the color image output from the first image sensor, to therebygenerate a 3D image.

BRIEF DESCRIPTION OF THE DRAWINGS

"These and/or other exemplary aspects will become apparent and more readily appreciated from the following description of exemplary embodiments, taken in conjunction with the accompanying drawings in which:

"FIG. 1 is a conceptual drawing illustrating an imaging optical system and a structure of a 3D image acquisition apparatus including the imaging optical system according to an exemplary embodiment;

"FIG. 2 is a cross-sectional view illustrating a structure of an optical element of FIG. 1;

"FIG. 3 is a conceptual diagram illustrating an imaging optical system including a diffractive optical element (DOE) and a structure of a 3D image acquisition apparatus including the imaging optical system according to another exemplary embodiment;

"FIG. 4A is a conceptual diagram illustrating a structure of an imaging optical system, according to another exemplary embodiment;

"FIG. 4B is a conceptual diagram illustrating a modification of the exemplary embodiment of FIG. 4A;

"FIG. 5 is a conceptual diagram illustrating a structure of an imaging optical system, according to another exemplary embodiment;

"FIG. 6A is a conceptual diagram illustrating a structure of an imaging optical system, according to another exemplary embodiment;

"FIG. 6B is a conceptual diagram illustrating a structure of an imaging optical system, according to another exemplary embodiment;

"FIGS. 7A and 7B are a side view and a front view illustrating a structure of an imaging optical system, according to another exemplary embodiment;

"FIG. 7C is a conceptual diagram illustrating a modification of the exemplary embodiment of FIG. 7B;

"FIG. 8 is a conceptual diagram illustrating a structure of an imaging optical system, according to another exemplary embodiment;

"FIG. 9 is a perspective view of a fiber optic taper of FIGS. 8; and

"FIG. 10 is a conceptual diagram illustrating a structure of an imaging optical system, according to another exemplary embodiment."

For additional information on this patent application, see: PARK, Yong-hwa; GORELOV, Alexander; YOU, Jang-woo; SHIRANKOV, Alexander; LEE, Seung-wan. Imaging Optical System for 3d Image Acquisition Apparatus, and 3d Image Acquisition Apparatus Including the Imaging Optical System. Filed October 22, 2013 and posted May 1, 2014. Patent URL: http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=4123&p=83&f=G&l=50&d=PG01&S1=20140424.PD.&OS=PD/20140424&RS=PD/20140424

Keywords for this news article include: Samsung Electronics Co. Ltd.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC





Source: Electronics Newsweekly

