SYSTEM AND METHOD FOR CORRECTION BASED ON THE LIGHT FIELD OF COLORED SURFACES IN AN IMAGE
Patent abstract:
A computer-implemented method for correcting a makeup or skin effect on a surface region of an image of a body part of a person. The method and system correct the makeup or skin effect by taking into account image-specific light field parameters, such as a light profile estimate and a minimum light field estimate, and render the corrected makeup or skin effect on the image to generate a corrected image.

Publication number: FR3062739A1
Application number: FR1752547
Filing date: 2017-03-28
Publication date: 2018-08-10
Inventor: Parham Aarabi
Applicant: Modiface Inc.
Patent description:
FIELD The present subject relates to a system and to a method for correcting a makeup or skin effect to be rendered on a surface region of an image of a part of a person's body.

BACKGROUND [0002] Being able to see the precise color of makeup correctly under multiple lighting conditions is a crucial aspect of a beauty routine. Although makeup may look beautiful in artificial bathroom light, under natural light a person may discover that their foundation is too dark and that their shade of lipstick does not flatter their complexion. These errors could have been avoided by allowing the person to try the makeup on a digital image, where that makeup has been corrected in relation to the lighting environment of the image. For example, although the color of a red lipstick may appear dark and dull in a room with warm lighting, outdoors the same color may appear bright and vivid. Another example is checking whether the color of a lipstick matches the color of the dress the person is wearing. Being able to see what a color will look like in multiple lighting settings makes it easier for a person to choose a shade that will flatter their complexion in any light, and also ensures that the person has spent their time and money wisely. The same color can look very different under different lighting conditions. Additionally, the same cosmetic formulation (which has a specific texture/finish and chemical composition) will reflect light differently in different lighting situations.

Document US9495582, entitled "Digital Makeup" and published/issued on November 15, 2016, discloses a system which uses a combination of low-pass, band-pass and high-pass filters to denoise a facial image. However, document US9495582 does not disclose any method for simulating color layers on an image, for retaining the texture of the underlying surface on which a color is simulated, or any model for dynamically adjusting a coloring layer to simulate a realistic color in different lighting conditions.

Document US9547883, entitled "Systems And Methods For Dewarping Images" and published/issued on January 17, 2017, discloses a system for removing image distortions caused by wide-angle lenses. Document US9547883 discloses no method for simulating color layers on an image, for retaining the texture of the underlying surface on which a color is simulated, or any model for dynamically adjusting a coloring layer in order to simulate a realistic color in different lighting conditions.

Document US9542599, entitled "Image Processing Device And Image Processing Method" and published/issued on January 10, 2017, discloses a system for eliminating halos around a pupil in a digital photo. Document US9542599 discloses no method for simulating a layer of makeup color on an image, for retaining the texture of the underlying surface on which a color is simulated, or any model for dynamically adjusting a coloring layer to simulate a realistic color in different lighting conditions.

Document US9449412, entitled "Adaptive, Calibrated Simulation Of Cosmetic Products On Consumer Devices" and published/issued on September 20, 2016, discloses a system for simulating cosmetic products on images. Document US9449412 does not disclose a non-linear transformation of the luminance index based on a light field, and does not disclose the estimation of minimum bright pixels which is essential to the luminance index transformation. Finally, document US9449412 discloses only pixel adjustments, not the region-based adjustments disclosed in this document.

Document US9460462, entitled "Monetization Using Video-Based Simulation Of Cosmetic Products" and published/issued on October 4, 2016, discloses a system for the sale of makeup products.
Document US9460462 discloses no method for preserving the texture of the underlying surface on which a color is simulated, or any model for the dynamic adjustment of a coloring layer in order to simulate a realistic color under different lighting conditions.

SUMMARY In accordance with the present disclosure, there is provided a system and method for correcting a makeup or color effect to be rendered on a surface region of an image of a person, which takes into account the illumination environment of the image and allows an exact rendering of a color layer under different lighting conditions (for example, low light, high-intensity light, etc.). As a corollary, adjusting the rendering of the color layer based on light field parameters can help users determine the makeup shade that will work in all lighting settings without having to buy and test several shades, which can be expensive and impractical.

There is disclosed a method for correcting a makeup or skin effect to be rendered on a surface region of an image of a part of the body of a person, the method comprising: calculating a light intensity estimate of the image as the average color of pixels in a first region of the image; calculating a minimum light field estimate of the image as the minimum color of pixels in a second region of the image; receiving visual rendering parameters for the makeup or skin effect; correcting the visual rendering parameters of the makeup or skin effect by a non-linear transformation taking into account the light intensity estimate and the minimum light field estimate in order to generate a makeup or skin effect corrected on the basis of the light field; rendering the corrected makeup or skin effect on the basis of the light field on the surface region in order to generate a corrected image; and providing the corrected image for display. The part of a person's body can be any part of the body, such as the face, hair or skin.
The exemplary embodiment of the method disclosed here is specific to a face; however, it is intended that the method can be applied to any part of the body. The method may include receiving a video image as the image. The first region of the image may include a face region of the person. The second region of the image may include a region of the person's eye. There is provided a computer system comprising at least one processor and a memory storing instructions and data which, when executed by the processor, cause the computer system to carry out at least one of the methods described. There is also provided a non-transient computer-readable medium storing instructions which, when executed by a computer, cause the computer to perform at least one of the methods described.

BRIEF DESCRIPTION OF THE DRAWINGS Exemplary embodiments of the subject will now be described in conjunction with the following drawings, by way of example only, where: Figure 1 is a flow diagram showing a method for correction based on the light field of colored surfaces in an image as a sequence of steps with inputs, according to one embodiment; and Figure 2 is a flowchart showing in more detail the method shown in Figure 1, in accordance with one embodiment.

DETAILED DESCRIPTION As described more fully below with reference to Figures 1 and 2, a computer-implemented system and method for correction based on the light field of colored surfaces in an image are provided. Such a system and such a method can be useful for correcting a makeup or skin effect to be rendered on a surface region of an image 100 of a part of the body of a person, which takes into account a light field of the image. The method set out in this disclosure is implemented on a computer using at least one processor and a memory (or other non-transient computer-readable medium) associated with the computer.
The memory can store instructions and/or data to configure the at least one processor during execution, and therefore the computer, to provide and/or carry out the methods described here. The instructions can be configured as software such as a native application or a browser-based application. The computer can be coupled for communication, either locally (for example via a bus) or remotely (for example via a network), to one or more storage devices, image and/or video capture devices (for example a camera), and other devices or services (for example via the web). An exemplary embodiment is the use of a mobile device using static images or a real-time live video stream which is processed image by image. The mobile device can include the camera. Another exemplary embodiment is a browser running on a computer which simulates makeup shades on static images or on a real-time live video stream processed image by image. Another exemplary embodiment is a magic mirror which simulates makeup on a real-time live video stream processed image by image. The proposed method could be used in any number of color spaces (RGB, HSV, HSL, LAB, CMYK, etc.). For simplicity, the following calculations are based on the HSV domain. However, it is noted that anyone skilled in the art will be able to reproduce these equations in different color spaces. The following derivations adopt an RGB scale in the range 0-255 per channel and an HSV scale with a 0-1 range for the S and V channels. In this disclosure, a reference to a color means the color for a single region, which is determined by averaging the colors of this region. A region could be as small as a single pixel, but could also be made up of a group of pixels forming a single coloring region. A region on the image to be colored with the new coloring layer has an average base color for the region, defined by H for hue, S for saturation and V for value.
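As a concrete illustration of the conventions above (this sketch is not part of the claimed method, and the function name and inputs are illustrative), the average base color of a region can be obtained by averaging its RGB pixels and converting to HSV with Python's standard `colorsys` module:

```python
import colorsys

def region_base_color(pixels_rgb):
    """Average a region's RGB pixels (0-255 per channel) and return the
    region's base color as (H, S, V), with S and V on a 0-1 scale."""
    n = len(pixels_rgb)
    r = sum(p[0] for p in pixels_rgb) / n
    g = sum(p[1] for p in pixels_rgb) / n
    b = sum(p[2] for p in pixels_rgb) / n
    # colorsys expects channels on a 0-1 scale.
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)

# Example: a small reddish two-pixel region.
h, s, v = region_base_color([(200, 60, 60), (180, 50, 50)])
```

A single-pixel region is simply the degenerate case of a one-element list.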
Referring to Figure 1, a flowchart showing a method for correction based on the light field of colored surfaces in an image 100 is shown. The received image 100 can be provided from a camera, a storage device, or another image source. The image can be in a video format or a single-image format. In step 110, the estimation of light field parameters is carried out. In step 120, the visual rendering correction is performed, using the received effect information 150. In step 130, the corrected rendering based on the light field is performed on the received image having the effect, and in step 140 the corrected image having the effect is displayed. Steps 110, 120 and 130 in particular are described in more detail with reference to Figure 2. The received effect information 150 can be a makeup effect, a skin effect or any other color effect. The received effect information 150 can be provided by a user, a storage device, or any other effect source.

Referring to Figure 2, and in brief, the exemplary embodiment of the correction of the makeup or skin effect comprises the following steps: (1) the detection of the face 230 and of facial features in the image, (2) the calculation of the estimation of light field parameters, comprising the calculation of the light profile estimate 210 of the image and the calculation of the minimum light field estimate 220 of the image, (3) the reception of the color effect 150 which is to be applied to the image, (4) the application of a non-linear transformation of the light field 200 to the intensity of the color effect, (5) the application of the non-linear transformation of the light field with a minimum light compensation 240 to the intensity of the color effect, (6) the application of the saturation correction 250 to the color effect, (7) the corrected rendering based on the light field 130 of the received effect 150 on the image 100 to generate a corrected image, and (8) the display of the corrected image 140. In Figure 1, the estimation of light field parameters 110 is detailed as step 2 following the preliminary processing of step 1, and the visual rendering correction 120 is detailed as steps 4 to 6.

Step one in the exemplary embodiment is the detection of the face 230 and of facial features in the image. This can be achieved by gradient-based methods, model-based methods, or other methods that determine the specific placement of the face and of facial features such as the eyes, nose and lips. The facial skin regions can then be detected, from which the skin color can be determined by averaging the pixels in the skin region.

Step two in the exemplary embodiment is the calculation of the estimation of light field parameters 110, including the light profile estimate 210 and the minimum light field estimate 220. In the exemplary embodiment, the light profile estimate 210 is determined by defining V_face 300 as the value of the average face color. V_face 300 could also be determined from other intrinsic parameters of a photo in order to obtain an indication of the lighting of the environment. For example, V_face 300 could be determined from the shine on the hair, using the extremes of shine and shade on the hair to provide a method for calibrating the face color value V_face (for example, computing V_face on a scale between the smallest V and the largest V on the hair and reporting it on a normalized scale). V_face 300 could also be obtained by using the average background lighting in a similar manner, using the minimum and maximum values (or other statistical measures) to calibrate the average skin pixels from which V_face is estimated. In fact, V_face could be determined by calibrating any source of light measurement, including the background, the hair or the face, or a combination of some or all of them. In the exemplary embodiment, the method also accounts for low-light camera contrast adjustments by calculating the minimum light field estimate 220.
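A minimal sketch of the light profile estimate described above, assuming the face-skin pixels have already been segmented and converted to HSV (the function name and inputs are illustrative, not taken from the patent):

```python
def estimate_v_face(face_pixels_hsv):
    """Light profile estimate 210: V_face is the V (value) channel of the
    average face color, with V on a 0-1 scale."""
    return sum(v for (_h, _s, v) in face_pixels_hsv) / len(face_pixels_hsv)

# Two sample skin pixels with V values 0.8 and 0.6; V_face is their mean.
v_face = estimate_v_face([(0.05, 0.4, 0.8), (0.05, 0.5, 0.6)])
```

The alternative calibrations described above (hair shine extremes, background statistics) would replace the simple mean with a normalized rescaling of the same V channel.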
V_black 310 is defined as the smallest value V around the eyes (which is one method of estimating the minimum luminance index of the face). This is done by searching a region of pixels around the eyes, finding the pixel or region that has the smallest value parameter V, and defining it as V_black 310. V_black 310 could also be defined as the minimum V value around the face, or in the whole image. It is a measure of the minimum V value, which can be affected by various image and camera parameters, obtained by observing the minimum V value in a region which is assumed to contain a black color (an ideal V_black is equal to 0, although in practice V_black may be greater than 0 due to the lighting of the image / video image). In fact, having a face or facial features in the region is not a requirement, although knowledge of facial features can be useful (e.g., the black color of the eyelashes or the shadows around the nose provide target regions useful for estimating V_black).

Step three in the exemplary embodiment is the receipt of the effect information 150, which may be a makeup effect, a skin effect, or another color effect which is to be applied to the image. The color of this effect which is intended to be rendered at the current region is defined as having a hue H_original, a saturation S_original and a value V_original.

Step four of the exemplary embodiment consists of applying a non-linear transformation of the light field 200 to the intensity of the color, calculated as a scaled value parameter defined below, which is used to calculate the adjusted intensity of the color, V_adjusted 320:

V_adjusted = V_original * (1 - V_original * c_1)

In the above equation, c_1 is a constant-value distortion correction factor in the range 0-1. A preferred value of c_1 is in the range 0.4-0.9. This adjustment is applied because the color that is intended to be rendered is generally specified under brighter conditions than the lighting conditions in which the image is captured.
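The minimum light field estimate and the step-four intensity transformation can be sketched as follows, assuming HSV values on a 0-1 scale (function names and the default constant are illustrative; c_1 is only constrained by the patent to the range 0-1, preferably 0.4-0.9):

```python
def estimate_v_black(dark_region_pixels_hsv):
    """Minimum light field estimate 220: the smallest V found in a region
    assumed to contain a black color (e.g., around the eyes)."""
    return min(v for (_h, _s, v) in dark_region_pixels_hsv)

def adjust_intensity(v_original, c_1=0.6):
    """Non-linear light field transformation 200 of the color intensity:
    V_adjusted = V_original * (1 - V_original * c_1)."""
    return v_original * (1.0 - v_original * c_1)
```

Note that the transformation darkens bright colors more than dim ones, since the attenuation factor itself grows with V_original.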
Step five of the exemplary embodiment consists of applying the non-linear transformation of the light field with a minimum light compensation 240 to calculate the new intensity of the color to be rendered on the image. The color is adjusted to correct any problems with low-light contrast in the image or video image. V_new 330 is calculated based on the following set of non-linear equations:

if (V_face > V_black) and (c_2 > V_black) and (V_adjusted > V_black):
V_new = (V_adjusted - V_black) * (V_face - V_black) / (c_2 - V_black) + V_black
otherwise:
V_new = V_black

In the above equations, c_2 is a base value limit in the range 0-1. A preferred value of c_2 is in the range 0.3-0.8. The non-linear transformations defined above include the minimum luminance index parameter V_black 310. This provides a unique distinction from other color adjustment methods such as that disclosed in US9449412, which uses no non-linear luminance index transformation and does not estimate the minimum luminance index parameter that is critical in the non-linear luminance index transformation.

Step six of the exemplary embodiment consists of applying the saturation correction 250 to the saturation value of the received effect information 150, since the saturation of the received color effect 150 can also be affected by the light field. In the exemplary embodiment, this is a non-linear transformation taking into account the minimum light. The corrected saturation value S_new 340 is calculated using the following equation:

S_new = S_original * max(0, 1 - V_black / c_3)

In the above equation, c_3 is a saturation compensation scaling value in the range 0-1. A preferred value of c_3 is in the range 0.4-1.0. The corrected color value which is rendered on the image is defined by H_original, S_new 340 and V_new 330.

Step seven of the exemplary embodiment is the corrected rendering based on the light field 130 of the received effect information 150 on the image 100.
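Steps five and six can be sketched together, assuming all quantities are on a 0-1 scale (function names and default constants are illustrative; the patent only constrains c_2 to 0-1, preferably 0.3-0.8, and c_3 to 0-1, preferably 0.4-1.0):

```python
def compensate_minimum_light(v_adjusted, v_face, v_black, c_2=0.5):
    """Minimum light compensation 240: linearly rescale V_adjusted from the
    [V_black, c_2] range onto [V_black, V_face]; otherwise clamp to V_black."""
    if v_face > v_black and c_2 > v_black and v_adjusted > v_black:
        return (v_adjusted - v_black) * (v_face - v_black) / (c_2 - v_black) + v_black
    return v_black

def correct_saturation(s_original, v_black, c_3=0.7):
    """Saturation correction 250: S_new = S_original * max(0, 1 - V_black / c_3),
    attenuating saturation as the image's minimum light level rises."""
    return s_original * max(0.0, 1.0 - v_black / c_3)
```

The `c_2 > v_black` guard keeps the denominator positive, and the fallback branch prevents the corrected intensity from dropping below the estimated black level of the image.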
This is done by replacing the original H, S and V of the pixels in the current region of the image with the corrected color values H_original, S_new 340 and V_new 330, respectively. Again, a region could be as small as a pixel, or could be a group of pixels whose average H, S and V values are replaced by H_original, S_new 340 and V_new 330. Other improvements could be achieved by a more complex lighting model for H, S and V, including how each of them is affected by the lighting. This process is then repeated for all the other regions of the received image 100 where a received color effect 150 must be applied. The color in one or more regions of the image 100 is replaced by the corrected coloring value, thus creating a corrected image. This corrected image can be stored in a computer memory and/or displayed 140 to the user by means of a screen or a display. If the corrected image is a video image, the resulting video image can be displayed 140 and/or stored as part of a group of video images.

It should be obvious to those skilled in the art that the description and the Figures are merely illustrative and not limiting. They are presented by way of example only. Modifications and other embodiments are within the reach of the skilled person and are envisaged as falling within the subject claimed.
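Putting steps four through seven together, a self-contained sketch of the per-region color correction follows (all names and default constants are illustrative; region detection, texture preservation and the HSV/RGB conversions are left to the surrounding system, as in the description above):

```python
def correct_effect_color(h_original, s_original, v_original,
                         v_face, v_black, c_1=0.6, c_2=0.5, c_3=0.7):
    """Correct one region's effect color for the image's light field,
    returning (H_original, S_new, V_new) with S and V on a 0-1 scale."""
    # Step 4: non-linear intensity transformation.
    v_adjusted = v_original * (1.0 - v_original * c_1)
    # Step 5: minimum light compensation.
    if v_face > v_black and c_2 > v_black and v_adjusted > v_black:
        v_new = (v_adjusted - v_black) * (v_face - v_black) / (c_2 - v_black) + v_black
    else:
        v_new = v_black
    # Step 6: saturation correction.
    s_new = s_original * max(0.0, 1.0 - v_black / c_3)
    # Step 7 replaces the region's (average) H, S, V with these values.
    return h_original, s_new, v_new
```

In a video pipeline, this function would be applied region by region to every image of the stream, with V_face and V_black re-estimated per image.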
Claims:
Claims (11)

1. A computer-implemented method for correcting a makeup or skin effect to be rendered on a surface region of an image of a part of the body of a person, which takes into account a light field of the image, the method comprising: calculating a light intensity estimate of the image as the average color of pixels in a first region of the image; determining a minimum light field estimate of the image as the minimum color of pixels in a second region of the image; receiving color effect parameters of the makeup or skin effect; correcting the color effect parameters by a non-linear transformation taking into account the light intensity estimate and the minimum light field estimate to generate a makeup or skin effect corrected on the basis of the light field; rendering the corrected makeup or skin effect on the basis of the light field on the surface region to generate a corrected image; and providing the corrected image for display.

2. The method of claim 1, wherein the image is a video image.

3. The method of claim 1, wherein the body part of the person comprises the face of a user.

4. The method of claim 3, wherein the first image region comprises a face region of the person.

5. The method of claim 3, wherein the second image region comprises an eye region of the person.

6. A computer system for correcting a makeup or skin effect to be rendered on a surface region of an image of a part of the body of a person, which takes into account a light field of the image, the system comprising at least one processor coupled to a memory storing instructions and data which, when executed by the processor, cause the computer system to: calculate a light intensity estimate of the image as the average color of pixels in a first region of the image; determine a minimum light field estimate of the image as the minimum color of pixels in a second region of the image; receive color effect parameters of the makeup or skin effect; correct the color effect parameters of the makeup or skin effect by a non-linear transformation, taking into account the light intensity estimate and the minimum light field estimate, to generate a makeup or skin effect corrected on the basis of the light field; render the makeup or skin effect corrected on the basis of the light field on the surface region to generate a corrected image; and provide the corrected image for display.

7. The system of claim 6, wherein the image is a video image.

8. The system of claim 6, wherein the body part of the person comprises the face of a user.

9. The system of claim 8, wherein the first image region comprises a face region of the person.

10. The system of claim 8, wherein the second image region comprises an eye region of the person.

11. A non-transient computer-readable medium storing instructions which, when executed by a computer, cause the computer to perform steps comprising: calculating a light intensity estimate of the image as the average color of pixels in a first region of the image; determining a minimum light field estimate of the image as the minimum color of pixels in a second region of the image; receiving color effect parameters of the makeup or skin effect; correcting the color effect parameters by a non-linear transformation taking into account the light intensity estimate and the minimum light field estimate to generate a makeup or skin effect corrected on the basis of the light field; rendering the corrected makeup or skin effect on the basis of the light field on the surface region to generate a corrected image; and providing the corrected image for display.
Family patents:

Publication number | Publication date
EP3358523A1 | 2018-08-08
EP3358523B1 | 2021-01-06
FR3062739B1 | 2020-03-20
US20180225845A1 | 2018-08-09
US10565741B2 | 2020-02-18
US20200027744A1 | 2020-01-23
US10892166B2 | 2021-01-12
Cited documents:

Publication number | Filing date | Publication date | Applicant | Title
US10866716B2 | 2019-04-04 | 2020-12-15 | Wheesearch, Inc. | System and method for providing highly personalized information regarding products and services
US11281366B2 | 2020-11-18 | 2022-03-22 | Hillary Sinclair | System and method for providing highly personalized information regarding products and services
FR2799022B1 | 1999-09-29 | 2002-02-01 | L'Oréal | Makeup assistance device and assembly consisting of such a device and a device for delivering a product having a predetermined BRDF, selected by the makeup assistance device
US6937755B2 | 2000-06-27 | 2005-08-30 | Rami Orpaz | Make-up and fashion accessory display and marketing system and method
KR20140099319A | 2011-12-04 | 2014-08-11 | Digital Makeup Ltd | Digital makeup
US9460462B1 | 2012-05-22 | 2016-10-04 | Image Metrics Limited | Monetization using video-based simulation of cosmetic products
US9449412B1 | 2012-05-22 | 2016-09-20 | Image Metrics Limited | Adaptive, calibrated simulation of cosmetic products on consumer devices
JP6396890B2 | 2013-04-08 | 2018-09-26 | Panasonic Intellectual Property Corporation of America | Image processing apparatus, image processing method, and program capable of virtually reproducing a state where a makeup coating material is applied
CN105263399B | 2013-06-07 | 2017-06-23 | Fujifilm Corporation | Transparent sensation evaluating device and transparent appraisal method
WO2015144563A1 | 2014-03-28 | 2015-10-01 | Koninklijke Philips N.V. | Image processing system and method
GB201411975D0 | 2014-07-04 | 2014-08-20 | Rolls-Royce plc | Aircraft control method
US20160042557A1 | 2014-08-08 | 2016-02-11 | Asustek Computer Inc. | Method of applying virtual makeup, virtual makeup electronic system, and electronic device having virtual makeup electronic system
JP6327071B2 | 2014-09-03 | 2018-05-23 | Omron Corporation | Image processing apparatus and image processing method
JP6583660B2 | 2015-03-26 | 2019-10-02 | Panasonic IP Management Co., Ltd. | Image composition apparatus and image composition method
US9547883B1 | 2016-08-19 | 2017-01-17 | Intelligent Security Systems Corporation | Systems and methods for dewarping images
US10565741B2 | 2017-02-06 | 2020-02-18 | L'Oréal | System and method for light field correction of colored surfaces in an image
US10863059B2 | 2018-06-26 | 2020-12-08 | Microsoft Technology Licensing, LLC | Environmental-based spatial color uniformity correction by using statistical distribution of camera modules
EP3628187A1 | 2018-09-26 | 2020-04-01 | Chanel Parfums Beauté | Method for simulating the rendering of a make-up product on a body area
US11222446B2 | 2019-09-19 | 2022-01-11 | Prêt-À-Template Criação e Comercialização de Software de Moda Ltda. | Virtual face chart
Legal status:

2018-02-23 | PLFP | Fee payment | Year of fee payment: 2
2019-02-13 | PLFP | Fee payment | Year of fee payment: 3
2019-12-13 | TP | Transfer of property | Owner name: L'OREAL SA, FR | Effective date: 20191105
2020-02-14 | PLFP | Fee payment | Year of fee payment: 4
2020-04-03 | CA | Change of address | Effective date: 20200226
2021-02-10 | PLFP | Fee payment | Year of fee payment: 5
2022-02-10 | PLFP | Fee payment | Year of fee payment: 6
Priority:

Application number | Filing date | Title
US 15/425,326 | 2017-02-06 | System and method for light field correction of colored surfaces in an image