News Column

Patent Application Titled "Method of 3d Model Morphing Driven by Facial Tracking and Electronic Device Using the Method the Same" Published Online

February 13, 2014



By a News Reporter-Staff News Editor at Computer Weekly News -- According to news reporting originating from Washington, D.C., by VerticalNews journalists, a patent application by the inventors Ye, Zhou (Foster City, CA); Lu, Ying-Ko (Taoyuan County, TW); Hsu, Yi-Chia (Tainan City, TW); Jeng, Sheng-Wen (Taipei City, TW), filed on July 12, 2013, was made available online on January 30, 2014.

No assignee has been named for this patent application.

Reporters obtained the following quote from the background information supplied by the inventors: "The present invention relates to a method of 3D model morphing, and more particularly, to a method of 3D model morphing driven by facial tracking, and an electronic device adopting the method of 3D model morphing driven by facial tracking.

"Please refer to FIG. 1. FIG. 1 is a diagram showing a traditional 3D model morphing and motion capture technique. In FIG. 1, the traditional 3D model morphing and motion capturing that are commonly applied in commercial movies' visual effects usually use high speed camera to capture a large number of color-dot tags disposed on a subject or user (i.e. a whole person). These color-dot tags are typically pasted in advance on the subject or user's face to track facial expression. However, it's highly inconvenient and non-intuitive to paste such large number of color-dot tags on the face.

"Hence, how to improve the method of 3D morphing has become an important topic in this field, and thus there is room for improvement in the art."

In addition to obtaining background information on this patent application, VerticalNews editors also obtained the inventors' summary information for this patent application: "It is one of the objectives of the present invention to provide a method of 3D morphing driven by facial tracking that does not require pasting a large number of color-dot tags on the subject.

"It is one of the objectives of the present invention to provide an electronic device adopted to use the method of 3D morphing driven by facial tracking that does not require of using large number of color-dot tags being pasted on the subject.

"According to an embodiment of the present invention, a method of 3D morphing driven by facial tracking is provided. In the method of 3D morphing according to the embodiment, the use of any color-dot tag for pasting on the subject(s) is not required.

"According to an embodiment of the present invention, an electronic device adapted to use the method of real-time 3D morphing driven by real-time facial tracking is provided. In the usage of the electronic device for providing real-time 3D morphing, the use of any color-dot tag for pasting on the subject(s) is not required.

"According to an embodiment of the present invention, a plurality of facial feature control points and a plurality of boundary control points are defined and picked up from vertices on the 3D model to generate deformation to a plurality of tracking points during facial tracking. Each facial feature control point or boundary control point is corresponding to one tracking point. In addition, a plurality of boundary control points are defined and picked up from other vertices on the 3D model to define the physical boundary formed between a deformable region that includes the face, the forehead, and an upper portion of the neck region, and a non-deformable region which includes the head, hair, a lower portion of the neck region, and the rest of the body that are not affected by facial expressions of the 3D model.

"According to an embodiment of the present invention, a control point pickup method is provided. In addition, the position or location of the picked up facial feature or boundary control points can be revised at any time by dragging a cursor using a mouse or a touch screen.

"According to an embodiment of the present invention, a control point reassignment and reconfiguration method is provided.

"According to an embodiment of the present invention, a boundary control point pickup method is provided.

"According to an embodiment of the present invention, a deformation method to control a deformation of the positions of the vertices in the 3d model using equations for producing a real-time morphed 3D avatar image is provided.

"According to an embodiment of the present invention, a method for achieving morphing effects on specified individual region of a real-time facial image of a person to obtain more exaggerated bigger movements is provided.

"According to an embodiment of the present invention, a pupil movement detection method is provided during facial tracking.

"According to an embodiment of the present invention, a method to track teeth and tongue of the real-time captured facial image of a person by using a camera, in particular to tongue and teeth displacement is provided.

"According to an embodiment of the present invention, scaling, translation and rotation of the real-time 3D avatar image is provided.

"According to an embodiment of the present invention, a hair flowing effect simulation implementation method is provided.

"According to an embodiment of the present invention, another hair flowing effect simulation method incorporated with 3D animation software is further provided.

"These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

"The application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.

"The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements. Besides, many aspects of the disclosure can be better understood with reference to the following drawings. Moreover, in the drawings like reference numerals designate corresponding elements throughout. Wherever possible, the same reference numerals are used throughout the drawings to refer to the same or like elements of an embodiment.

"FIG. 1 is a diagram showing a traditional 3D model morphing technique.

"FIG. 2 is a diagram illustrating a basic concept of a method of 3D morphing driven by facial tracking according to an embodiment of the present invention.

"FIG. 3 is a flow chart illustrating a method of 3D morphing driven by facial tracking according to the embodiment of the present invention.

"FIG. 4 is a diagram illustrating a method for configuring and setting up a plurality of facial feature control points and boundary control points on a 3D model according to the embodiment.

"FIGS. 5A and 5B are diagrams illustrating a control point pick up method according to a first embodiment of the present invention.

"FIG. 6 is a diagram illustrating a control point reassignment and reconfiguration method according to a second embodiment of the present invention.

"FIG. 7 is a diagram illustrating a boundary control point pickup method according to the embodiment of the present invention.

"FIG. 8 is a diagram containing detailed description of the location data for the control points corresponding to the 3D model to be organized as a file according to the embodiment of the present invention.

"FIG. 9 is a diagram illustrating a 3D model morphing method driven by real-time facial tracking according to a fourth embodiment of the present invention.

"FIG. 10 is a diagram illustrating pupil movement detection according to an embodiment of the present invention.

"FIG. 11 is a diagram illustrating how to track teeth and tongue according to an embodiment of the present invention.

"FIG. 12 is a diagram illustrating scaling, translation and rotation of the real-time 3D avatar image according to the embodiment of the present invention.

"FIGS. 13A and 13B are diagrams illustrating hair flowing effect simulation of the embodiment of present invention.

"FIG. 14 illustrates a conceptual representation of the 3D avatar image prior to morphing when being subjected to the constraints of the facial feature control points and the boundary control points and using deformation method to produce the morphed 3D avatar image thereafter.

"FIG. 15 illustrates a graphical representation of respective functional variables utilized in the deformation method.

"FIG. 16 illustrates a method for determination of deformation rate at before and after adjustment for each facial tracking point.

"FIG. 17 is a diagram illustrating a pupil movement detection method according to a third embodiment of the present invention.

"FIG. 18 shows a Gaussian distribution to amplify the pupils' gray level of color at point x to find the pupils' positions.

"FIG. 19 shows a flow chart for a method for pupils' movement detection.

"FIG. 20 shows a flow chart of a hair flowing effect simulation implementation method.

"FIG. 21 shows a flow chart of another hair flowing effect simulation implementation method incorporated with 3D animation software."

For more information, see this patent application: Ye, Zhou; Lu, Ying-Ko; Hsu, Yi-Chia; Jeng, Sheng-Wen. Method of 3d Model Morphing Driven by Facial Tracking and Electronic Device Using the Method the Same. Filed July 12, 2013 and posted January 30, 2014. Patent URL: http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=4028&p=81&f=G&l=50&d=PG01&S1=20140123.PD.&OS=PD/20140123&RS=PD/20140123

Keywords for this news article include: Patents, Software.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC





Source: Computer Weekly News

