Patent abstract:
The invention relates to a method for measuring a distance between an object in a scene (24) and a time-of-flight camera system (10), and providing a depth map of the object, the time-of-flight camera system comprising an illumination unit (20), an imaging sensor (21) having an array of pixels (22, 23) and image processing means (5). According to the method, the illumination of said illumination unit (20) is discretely modified so as to illuminate elementary areas (31, 32) of the scene (24) with different incident intensities, respectively, in order to distinguish direct incident light beams from indirect incident light beams; the beams reflected by said elementary zones are received on the pixels (22, 23) of the sensor array and the corresponding data are supplied to the image processing means (5); said corresponding data are processed to eliminate the influence of the indirect light beams in the depth map of the object.
Publication number: BE1022486B1
Application number: E2014/0578
Filing date: 2014-07-25
Publication date: 2016-05-03
Inventor: Ward Van Der Tempel
Applicant: Softkinetic Sensors Nv
IPC main class:
Patent description:

A TOF camera system and a method for measuring a distance with the system
TECHNICAL FIELD OF THE INVENTION
The invention relates to i) a method for measuring a distance between an object of a scene and a time-of-flight camera system and ii) the associated time-of-flight camera system. In particular, the invention relates to the problem of direct and indirect reflections of light in a scene and the depth measurement errors induced by these multiple reflections. It must be understood that by scene we mean all the surfaces surrounding the object on which a light beam could be directly or indirectly reflected.
Background of the invention
Time-of-Flight (TOF) technology is a promising technology for depth perception. The well-known basic operating principle of a standard TOF camera system 3 is shown in Figure 1. The TOF camera system 3 captures 3D images of a scene 15 by analyzing the time of flight of light from a dedicated illumination unit 18 to an object. The TOF camera system 3 includes a camera, for example with a matrix of pixels 1, and data processing means 4. The scene 15 is actively illuminated with a modulated light 16 at a predetermined wavelength using the dedicated illumination unit 18, for example with light pulses of at least one predetermined frequency. The modulated light is reflected by objects in the scene. A lens 2 collects the reflected light 17 and forms an image of the objects on the imaging sensor 1 of the camera. Depending on the distance of the objects from the camera, a delay is experienced between the emission of the modulated light, for example the so-called light pulses, and the reception of these reflected light pulses at the camera. The distance between reflecting objects and the camera can be determined from the observed time delay and the constant value of the speed of light.
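To make the relation between the observed delay and the measured distance concrete, here is a minimal illustrative sketch (not part of the patent text; the 20 MHz modulation frequency and the function names are assumptions for illustration):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_delay(delay_s: float) -> float:
    """Pulsed operation: the light travels to the object and back,
    so the one-way distance is half the round-trip time times c."""
    return C * delay_s / 2.0

def distance_from_phase(phase_rad: float, mod_freq_hz: float = 20e6) -> float:
    """Continuous-wave operation: the delay appears as a phase shift
    of the modulated light at the modulation frequency."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# An observed delay of 33 ns corresponds to roughly 4.95 m.
print(distance_from_delay(33e-9))
```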
One of the disadvantages of this technology is illustrated in Figure 2 and is related to the phenomenon called "multipath". A standard TOF camera system 9 is shown, comprising an illumination unit 8 for illuminating a scene 24 in multiple directions, a TOF sensor 6 for detecting the reflections of the emitted light and processing means 7 for processing the data acquired by the TOF sensor 6. The pixels of the TOF sensor 6, not shown, measure the direct path 25 from the illumination unit 8 to the scene 24 and from the scene 24 back to the pixels. But secondary reflections 26, or higher-order reflections, can also be captured on the same pixel and distort the perceived delay of the first, direct reflection 25. As the light captured by the sensor 6 can come from both the direct path 25 and the secondary reflection 26, the measured depth map 27, representing the depth associated with each point of the scene, is thus wrong.
In the prior art, several methods have been implemented to recover the direct component of the incident light. For example, multi-frequency approaches acquire a set of depth measurements with different modulation frequencies, but the resolution obtained remains low.
Another approach uses a set of different spatial patterns generated, for example, by a digital light processing (DLP) projector. By creating dark and bright spots in the scene, the direct and indirect components can be separated, as the depth acquired in the dark parts of the pattern is created only by the indirect, multipath signal. The different patterns are chosen so that each part of the scene is captured in a dark situation. Edge effects are limited by defining patterns with sufficient overlap. However, creating these different patterns is expensive.
A solution remains to be found to recover only the direct component of the reflected light in the most economical way, so as to achieve more precise measurements of the distances between objects of a scene and the TOF camera system.
Summary of the invention
The present invention relates to a method for measuring a distance between an object of a scene and a time-of-flight camera system, and providing a depth map of the object, the time-of-flight camera system comprising an illumination unit, an imaging sensor having a matrix of pixels and image processing means, the method being characterized by the following steps: discretely modifying the illumination of said illumination unit so as to illuminate elementary areas of the scene with different incident intensities, respectively, to distinguish direct incident light beams from indirect incident light beams; receiving, on the pixels of the matrix of the sensor, the beams reflected by said elementary zones and supplying the corresponding data to the image processing means; processing said corresponding data to eliminate the influence of the indirect light beams in the depth map of the object.
Advantageously, during the data processing, the method comprises a step of identifying, for example on an intermediate depth map, but not only there, peaks associated with the elementary zones on which only indirect incident light beams can strike. By identifying such peaks, the data can be processed to eliminate the influence of the indirect light beams and obtain a final and accurate depth map of the scene.
The present invention also relates to a Time-of-Flight (TOF) camera system for measuring a distance between an object of a scene and the TOF camera system, and providing a depth map of the object, the TOF camera system comprising: - an illumination unit for illuminating the scene with a modulated light; - an imaging sensor having a matrix of pixels for receiving, on the pixels of the matrix of the sensor, the beams reflected by the scene; - image processing means for receiving, from the imaging sensor, data corresponding to the reflected beams and for processing said corresponding data; characterized in that it further comprises: - pattern forming means for discretely modifying the illumination of said illumination unit so as to illuminate elementary areas of the scene with different incident intensities, respectively, for distinguishing direct incident light beams from indirect incident light beams and eliminating the influence of the indirect light beams in the depth map of the object by processing said corresponding data.
Advantageously, the modification of the illumination is performed by masking the illumination unit in a discrete manner. The pattern-forming means could for example be a mask for preventing direct incident light beams from striking elementary areas of the scene.
More preferably, the patterning means could comprise a series of identical pattern groups to allow easier data processing.
Other advantages and new features of the invention will be better understood on reading the detailed description which follows and with reference to the accompanying drawings.
Brief description of the drawings
Figure 1 illustrates the basic operating principle of a TOF camera system; Figure 2 illustrates the multipath phenomenon; Figure 3 shows a TOF camera system according to an embodiment of the invention; Figure 4 shows an example of pattern forming means; Figure 5 shows an intermediate depth map of a scene and the associated pixels of the TOF camera system; Figure 6 shows a scene and a luminous pattern projected on the scene comprising two different spatial zones.
Description of the invention
Figure 3 shows a TOF camera system 10 according to one embodiment of the invention. The TOF camera system 10 includes an illumination unit 20 for illuminating a scene 24 with modulated light. The light emitted by this illumination unit 20 is adapted to suit distance measurement using Time-of-Flight technology. For example, the illumination unit 20 may be adapted to emit light pulses with an appropriate pulse width. Indeed, when pulses are used, the pulse width of each light pulse determines the range of the camera. For example, for a pulse width of 50 ns, the range is limited to 7.5 m. As a result, scene illumination becomes critical to the operation of the TOF camera system, and the high-speed driving frequency requirements for illumination units require the use of specialized light sources, such as light-emitting diodes (LEDs) or lasers, to generate such short pulses of light. The illumination unit is arranged to emit multidirectional light, as represented by the plurality of emitted light rays 25, 26 and 28 shown in Figure 3.
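The 7.5 m figure quoted above follows directly from the pulse width; a one-line check (illustrative sketch only, not part of the patent text):

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_pulse_width(pulse_width_s: float) -> float:
    """The camera range is bounded by the distance light can cover,
    out and back, during one pulse width."""
    return C * pulse_width_s / 2.0

print(round(range_from_pulse_width(50e-9), 1))  # -> 7.5 (metres)
```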
The TOF camera system further comprises an imaging sensor 21 typically comprising a matrix array of pixels for receiving and detecting the reflected beams and forming an image of the scene 24. For the sake of clarity and ease of illustration, only two pixels 22 and 23 have been shown in Figure 3, but the invention is not limited to a matrix of two pixels. By pixel, it is meant the picture element sensitive to the electromagnetic radiation of the light and its associated electronic circuits. The pixel output may be used to determine the time of flight of the light travelling from the illumination unit 20 to an object in the scene 24 and reflected by the object back to the TOF imaging sensor 21.
The TOF camera system further comprises pattern forming means 30 for creating a luminous pattern on the scene 24. The luminous pattern may be a native pattern of laser speckles obtained directly from laser light interference, or a pattern obtained by pattern forming means which can be placed in front of the illumination unit 20, or a combination of both laser speckles and pattern forming means 30. Using the pattern forming means 30, the light emitted by the illumination unit 20 passes through these pattern forming means, the light then being modified and a light pattern being formed on the scene, with delimited elementary zones 31, 32 of different intensities. Before striking the scene, the light emitted by the illumination unit 20 may be blocked, or its intensity may be reduced, on given areas of the pattern forming means 30 and not be blocked on other areas, causing the creation of low light intensity areas 31 and high light intensity areas 32, respectively, on the scene. For the sake of illustration, these zones have been represented by thick lines 31 and 32, but it is understood that the luminous pattern created on the scene is not a solid or physical motif fixed on the scene 24, but the result of light effects emanating from the pattern forming means 30 placed in front of the illumination unit 20. The light pattern is projected onto the scene by the illumination unit 20. The pattern forming means 30 may be a filter, a mask, a grid or any other means for discretely changing the illumination. Ideally, the pattern forming means should provide on the scene a spatially periodic light pattern 31, 32, 45, 46, to easily find the areas 31 where only secondary reflections are measured. The pattern forming means 30 could also comprise a series of identical pattern groups 50, as shown in Figure 4, or a series of different pattern groups that could be used sequentially in time, synchronously with a multiple of the frame rate of the TOF camera.
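As an illustration of what such pattern forming means could produce, the sketch below builds a spatially periodic transmission mask with alternating low- and high-intensity bands, analogous to the zones 31 and 32 (the band width and intensity values are arbitrary assumptions, not taken from the patent):

```python
import numpy as np

def periodic_stripe_mask(height: int, width: int, band: int = 4,
                         low: float = 0.1, high: float = 1.0) -> np.ndarray:
    """Transmission mask with vertical stripes: columns in even bands get
    reduced transmission (zones 31), columns in odd bands full transmission
    (zones 32). A `low` value > 0 models the limited, non-100% contrast."""
    band_index = (np.arange(width) // band) % 2
    row = np.where(band_index == 0, low, high)
    return np.tile(row, (height, 1))

mask = periodic_stripe_mask(240, 320)  # one transmission value per direction
```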
It is important to note that the invention does not require a pattern with a 100% contrast, and that there is no need to align the pattern forming means with the image sensor.
The TOF camera system 10 further comprises processing means 5 for determining the time of flight of the light emitted by the illumination unit 20 and, from this, the distance between an object of the scene 24 and the imaging sensor 21. The processing means 5 are arranged to receive data from the pixels of the imaging sensor 21 and to process them to eliminate the influence of the indirect light beams in the depth map of the object. The method for determining this distance, and a final and accurate depth map of the object, will be described in the following paragraphs. The time of flight can be calculated in a separate processing unit that can be coupled to the TOF sensor 21 or can be integrated directly into the TOF sensor itself. In Figure 3, the processing means 5 have been shown coupled to the illumination unit 20, but the invention is not limited thereto.
A method will now be described, with reference to Figures 3 to 6, for measuring a distance between an object of a scene and the TOF camera system and providing a depth map of the object, the time-of-flight camera system 10 comprising an illumination unit 20, an imaging sensor 21 having a pixel array 22, 23 and image processing means 5.
The method comprises the steps of discretely modifying the illumination of the illumination unit 20 so as to illuminate elementary areas 31, 32 of the scene with different incident intensities, respectively, to distinguish the direct incident light beams 25 from the indirect incident light beams 26, 28. This modification can be implemented for example by creating a luminous pattern on the scene, the luminous pattern comprising delimited elementary zones with high and low light intensities. The light pattern can be created by placing the pattern forming means 30 in front of the illumination unit 20 and thus projecting a light pattern onto the scene.
The pixels of the sensor 21 receive the beams reflected by these elementary areas 31, 32 and provide the corresponding data to the image processing means 5.
These data are then processed to eliminate the influence of indirect light beams and obtain an accurate depth map of the object.
The luminous pattern projected on the scene can be found on an intermediate depth map 29. This is illustrated in Figure 3 and Figure 5. By comparing the depth maps 27 in Figure 2 and 29 in Figure 3, obtained without and with the use of pattern forming means, respectively, it is possible to notice the occurrence of peaks 33. These peaks 33 correspond to the areas 31 of the scene 24 where only secondary reflections are measured. In fact, in these zones, without secondary reflections, the pixels of the imaging sensor 21 should measure no light, or light with a low intensity, because, by definition, these zones 31 are associated with zones of the pattern forming means where the light is blocked or where the light intensity is reduced. The light measured by the pixel 23 is thus dominated by secondary reflections 28, whereas the light measured by the pixel 22 corresponds to both the direct and indirect components 25, 26. The identification, for example on an intermediate depth map 29, of the elementary zones 31 on which no direct incident light beam can strike can be used to eliminate the influence of the indirect light beams and obtain a final and accurate depth map of the object. The complex data obtained by measuring the time of flight of the light dominated by the indirect components in the pixel 23 can for example be subtracted from the complex data obtained from both the direct and indirect components in the pixel 22 to form new complex data CN. If the contributions of the indirect components to the complex data are equal in the pixels 22 and 23, the resulting complex data CN contain only the direct component. Even though the pixel 23 receives a small direct component due to the limited contrast of the pattern forming means 30, the resulting complex data CN will have a smaller amplitude but will have the correct phase representing the time of flight of the direct component.
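The complex subtraction described in this paragraph can be written compactly; a minimal sketch, assuming each pixel delivers a complex value whose phase encodes the flight time and whose amplitude the signal strength (the 20 MHz modulation frequency is an assumption for illustration):

```python
import cmath
import math

C = 299_792_458.0   # speed of light in m/s
MOD_FREQ = 20e6     # assumed modulation frequency in Hz

def direct_depth(c_pixel22: complex, c_pixel23: complex) -> float:
    """Subtract the indirect-only measurement (pixel 23, behind a zone 31)
    from the mixed measurement (pixel 22): if the indirect contributions are
    equal, the difference CN keeps only the direct component, and its phase
    gives the direct flight time."""
    cn = c_pixel22 - c_pixel23
    phase = cmath.phase(cn) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * MOD_FREQ)
```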
Figure 5 shows an intermediate depth map 29 of an object in the scene and the associated pixels of the TOF camera system. The pixels 40 measure only the indirect components and are associated with greater depths and the peaks 33, while the pixels 41 measure both direct and indirect components and are associated with the zones 34 of the depth map. The identification of the pixels corresponding to the zones 31 can also be obtained by using a signal intensity map, where these pixels will have a lower intensity due to the missing direct components. Confidence maps or noise maps may also be used to identify the pixels associated with the areas 31.
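A hypothetical sketch of such an identification on a signal intensity map: pixels behind the blocked zones 31 keep only weak indirect light, so a simple relative amplitude threshold (the 0.3 factor is an arbitrary assumption) can flag them:

```python
import numpy as np

def find_indirect_only_pixels(complex_map: np.ndarray,
                              rel_threshold: float = 0.3) -> np.ndarray:
    """Boolean mask of pixels whose amplitude falls well below the median:
    candidates for the zones 31, where the direct component is missing."""
    amplitude = np.abs(complex_map)
    return amplitude < rel_threshold * np.median(amplitude)
```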
Referring to Figure 3, to determine only the direct component of the reflected light, for the pixel 22 for example, the complex value measured by the pixel 23 can be subtracted from the complex value measured by the pixel 22 to form a new complex value CN.
In a real system, there will be more than two pixels, and an indirect component function can be built from the samples taken in the zones 31. This indirect component function can then be interpolated for all the pixels having both direct and indirect components and subtracted from these pixels, leaving only the direct components.
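A sketch of that interpolation step, assuming the per-pixel measurements are available as a complex-valued array and the zone-31 pixels have been identified as above (scipy's griddata stands in here for whatever interpolator a real system would use):

```python
import numpy as np
from scipy.interpolate import griddata

def subtract_interpolated_indirect(complex_map: np.ndarray,
                                   indirect_mask: np.ndarray) -> np.ndarray:
    """Sample the indirect component at the zone-31 pixels, interpolate it
    over the whole pixel array, and subtract it from the mixed pixels."""
    h, w = complex_map.shape
    ys, xs = np.nonzero(indirect_mask)    # sample positions (zones 31)
    samples = complex_map[ys, xs]         # indirect-only measurements
    gy, gx = np.mgrid[0:h, 0:w]
    # Interpolate the real and imaginary parts separately.
    re = griddata((ys, xs), samples.real, (gy, gx),
                  method='linear', fill_value=0.0)
    im = griddata((ys, xs), samples.imag, (gy, gx),
                  method='linear', fill_value=0.0)
    return complex_map - (re + 1j * im)
```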
When the scene is fairly simple, and the reflectance of the scene is ideal, the value associated with the indirect components is a continuous function that can be easily sampled by all the zones 31, because the indirect components come from a Lambertian reflectance of the scene 24.
When the scene is more complex, the calculation can be done differently. The scene 40 of Figure 6 comprises for example a first wall 43 with a door and a second wall 41, on which a cabinet 42, with a given depth, is fixed. In this example, indirect reflections coming from the cabinet 42 or from the wall 43 do not lead to similar measurements. To determine the direct component of the reflected light, different spatial areas 45, 46 of the light pattern projected on the scene 40 can be determined. For the sake of illustration, various shapes have been used in Figure 6, but it is understood that the two sub-patterns 45 and 46 come from the same pattern forming means 30 placed in front of the illumination unit 20 of the TOF camera system 10. The scene is first segmented using the available depth data or any other additional data useful for segmenting the scene. For each segment of the scene, a continuous function can again be associated with the indirect components, which can be sampled by the zones 31 belonging to each segment respectively. This indirect component function related to each segment can then be used to compensate for the unwanted indirect components present in the pixels with both direct and indirect components.
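A rough sketch of that per-segment compensation, simplified to a piecewise-constant indirect component per segment (the segmentation itself, for example from coarse depth data, is assumed to be given; a real system would fit a smooth function per segment instead of a single mean value):

```python
import numpy as np

def compensate_per_segment(complex_map: np.ndarray,
                           indirect_mask: np.ndarray,
                           segment_labels: np.ndarray) -> np.ndarray:
    """For each scene segment, estimate the indirect component from the
    zone-31 pixels inside that segment and subtract it from the segment's
    mixed (direct + indirect) pixels."""
    out = complex_map.copy()
    for label in np.unique(segment_labels):
        segment = segment_labels == label
        samples = complex_map[segment & indirect_mask]
        if samples.size:                       # skip unsampled segments
            out[segment & ~indirect_mask] -= samples.mean()
    return out
```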
Claims:
Claims (8)
[1]
A method for measuring a distance between an object of a scene (24, 40) and a Time-of-Flight camera system (10), and providing a depth map of the object, the time-of-flight camera system (10) comprising an illumination unit (20), an imaging sensor (21) having a pixel array (22, 23) and image processing means (5), the method being characterized by the following steps: - discretely modifying the illumination of said illumination unit (20) so as to illuminate elementary zones (31, 32) of the scene with different incident intensities, respectively, for distinguishing direct incident light beams (25) from indirect incident light beams (26, 28); - receiving on the pixels of the array of the sensor (21) the beams reflected by said elementary zones (31, 32) and supplying the image processing means (5) with the corresponding data; - processing said corresponding data to eliminate the influence of the indirect light beams in the depth map of the object.
[2]
The method of claim 1, wherein the step of modifying the illumination is a step of masking the illumination unit (20) in a discrete manner to create elementary zones (31) on the scene on which no direct incident light beam can strike.
[3]
3. The method of claim 1 or 2, wherein the step of processing said corresponding data comprises the step of identifying elementary areas (31) on which only indirect incident light beams can strike.
[4]
The method of any one of the preceding claims, further comprising the step of determining two different spatial areas (45, 46) of elementary areas of the scene (24, 40) associated with different sets of pixels of said matrix of pixels (22, 23), respectively.
[5]
5. A Time-of-Flight (TOF) camera system (10) for measuring a distance between an object of a scene (24, 40) and the TOF camera system (10), and providing a depth map of the object, the TOF camera system comprising: - an illumination unit (20) for illuminating the scene (24) with modulated light; - an imaging sensor (21) having a matrix of pixels (22, 23) for receiving on the pixels of the array of the sensor (21) the beams reflected by the scene (24, 40); - image processing means (5) for receiving, from the imaging sensor (21), data corresponding to the reflected beams and for processing said corresponding data; characterized in that it further comprises: - pattern forming means (30) for discretely modifying the illumination of said illumination unit (20) so as to illuminate elementary areas (31, 32) of the scene with different incident intensities, respectively, to distinguish the direct incident light beams (25) from the indirect incident light beams (26, 28) and to eliminate the influence of the indirect light beams in the depth map of the object by processing said corresponding data.
[6]
The TOF camera system (10) of claim 5, wherein the pattern forming means (30) is placed in front of the illumination unit (20) to project a light pattern (31, 32, 45, 46) on the scene.
[7]
The TOF camera system (10) of claim 5 or 6, wherein the pattern forming means (30) comprises a series of identical pattern groups (50).
[8]
The TOF camera system (10) of any one of claims 5 to 6, wherein the pattern forming means (30) includes a mask to prevent direct incident light beams from striking elementary areas (31) of the scene (24, 40).
Similar technologies:
Publication number | Publication date | Patent title
BE1022486B1|2016-05-03|A TOF CAMERA SYSTEM AND A METHOD FOR MEASURING A DISTANCE WITH THE SYSTEM
US20170322309A1|2017-11-09|Specular reflection removal in time-of-flight camera apparatus
US10255682B2|2019-04-09|Image detection system using differences in illumination conditions
CN103477644A|2013-12-25|Method of recording an image and obtaining 3D information from the image, and camera system
JP2020515811A|2020-05-28|System for characterizing the surroundings of a vehicle
EP2813809A1|2014-12-17|Device and method for measuring the dimensions of an objet and method for producing an item using said device
US20180210070A1|2018-07-26|Active Brightness-Based Strategy for Invalidating Pixels in Time-of-Flight Depth-Sensing
US20200096637A1|2020-03-26|Time-of-flight camera
CA2880145C|2020-07-28|Method for the non-destructive testing of a blade preform
EP3073441B1|2019-07-10|Method for correcting an image of at least one object presented remotely to an imager and illuminated by an illumination system and camera system for carrying out said method
FR3091356A1|2020-07-03|ACTIVE SENSOR, OBJECT IDENTIFICATION SYSTEM, VEHICLE AND VEHICLE LIGHT
US20200320725A1|2020-10-08|Light projection systems
EP3230713B1|2020-07-29|Method for obtaining an image of a sample, and associated lens-free imaging system
FR2551210A1|1985-03-01|METHOD OF CONTROLLING A NETWORK
FR3048104A1|2017-08-25|METHOD AND DEVICE FOR CAPTURING IMAGES OF A VEHICLE
WO2018229358A1|2018-12-20|Method and device for constructing a three-dimensional image
FR2966257A1|2012-04-20|METHOD AND APPARATUS FOR CONSTRUCTING A RELIEVE IMAGE FROM TWO-DIMENSIONAL IMAGES
FR3088160A1|2020-05-08|IMAGE SENSOR FOR OPTICAL CODE | RECOGNITION
WO2020260783A1|2020-12-30|Method and device for passive ranging by image processing
FR3097974A1|2021-01-01|PASSIVE TELEMETRY METHOD AND DEVICE BY IMAGE PROCESSING AND USE OF THREE-DIMENSIONAL MODELS
FR3103301A1|2021-05-21|Method for detecting specularly reflected light beam intensity peaks
FR3091249A1|2020-07-03|Method and device for determining the positioning of a motor vehicle in a traffic lane
WO2017025680A1|2017-02-16|Method and device for deleting stray information in images of at least one illuminated specular optical test pattern, and location system comprising said device
FR3086071A1|2020-03-20|METHOD, MICROSCOPE AND OFF-AXIS TRANSMISSION DIGITAL HOLOGRAPHIC IMAGING ASSEMBLY.
FR3079041A1|2019-09-20|METHOD FOR DETERMINING DISTANCE SEPARATING AN OBJECT AND AN OPTICAL DETECTION SYSTEM, AND CORRESPONDING SYSTEM
Patent family:
Publication number | Publication date
WO2015189311A1|2015-12-17|
US20170123067A1|2017-05-04|
US10901090B2|2021-01-26|
JP2017517737A|2017-06-29|
CN106662651B|2019-08-06|
KR20170041681A|2017-04-17|
EP2955544A1|2015-12-16|
CN106662651A|2017-05-10|
EP2955544B1|2020-06-17|
JP2021039131A|2021-03-11|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
US20130148102A1|2011-12-12|2013-06-13|Mesa Imaging Ag|Method to Compensate for Errors in Time-of-Flight Range Cameras Caused by Multiple Reflections|
US20140055771A1|2012-02-15|2014-02-27|Mesa Imaging Ag|Time of Flight Camera with Stripe Illumination|
US20140049609A1|2012-08-14|2014-02-20|Microsoft Corporation|Wide angle depth detection|
US20040213463A1|2003-04-22|2004-10-28|Morrison Rick Lee|Multiplexed, spatially encoded illumination system for determining imaging and range estimation|
US9002511B1|2005-10-21|2015-04-07|Irobot Corporation|Methods and systems for obstacle detection using structured light|
US20100157280A1|2008-12-19|2010-06-24|Ambercore Software Inc.|Method and system for aligning a line scan camera with a lidar scanner for real time data fusion in three dimensions|
EP2380038B1|2008-12-22|2019-01-09|Koninklijke Philips N.V.|Cmos imager|
US8491135B2|2010-01-04|2013-07-23|Microvision, Inc.|Interactive projection with gesture recognition|
EP2395369A1|2010-06-09|2011-12-14|Thomson Licensing|Time-of-flight imager.|
DE102011081561A1|2011-08-25|2013-02-28|Ifm Electronic Gmbh|Time of flight camera system with signal path monitoring|
JP2013078433A|2011-10-03|2013-05-02|Panasonic Corp|Monitoring device, and program|
WO2013052781A1|2011-10-07|2013-04-11|Massachusetts Institute Of Technology|Method and apparatus to determine depth information for a scene of interest|
US9462255B1|2012-04-18|2016-10-04|Amazon Technologies, Inc.|Projection and camera system for augmented reality environment|
US9069080B2|2013-05-24|2015-06-30|Advanced Scientific Concepts, Inc.|Automotive auxiliary ladar sensor|
DE102013109020B4|2013-08-21|2016-06-09|Pmdtechnologies Gmbh|Scattered light reference pixel|
US9874638B2|2014-03-06|2018-01-23|University Of Waikato|Time of flight camera system which resolves direct and multi-path radiation components|
US10302768B2|2016-05-09|2019-05-28|Microsoft Technology Licensing, Llc|Multipath signal removal in time-of-flight camera apparatus|
US10928489B2|2017-04-06|2021-02-23|Microsoft Technology Licensing, Llc|Time of flight camera|
US10598768B2|2017-05-24|2020-03-24|Microsoft Technology Licensing, Llc|Multipath mitigation for time of flight system|
US10215856B1|2017-11-27|2019-02-26|Microsoft Technology Licensing, Llc|Time of flight camera|
US10901087B2|2018-01-15|2021-01-26|Microsoft Technology Licensing, Llc|Time of flight camera|
CN108259744B|2018-01-24|2020-06-23|北京图森智途科技有限公司|Image acquisition control method and device, image acquisition system and TOF camera|
EP3791210A1|2018-05-09|2021-03-17|Sony Semiconductor Solutions Corporation|Device and method|
CN109459738A|2018-06-06|2019-03-12|杭州艾芯智能科技有限公司|A kind of more TOF cameras mutually avoid the method and system of interference|
US20200041620A1|2018-07-31|2020-02-06|Waymo Llc|Hybrid Time-of-Flight and Imager Module|
JP2020020681A|2018-08-01|2020-02-06|ソニーセミコンダクタソリューションズ株式会社|Light source device, image sensor and sensing module|
KR20200020184A|2018-08-16|2020-02-26|엘지이노텍 주식회사|Method and apparatus for sensing|
US11029149B2|2019-01-30|2021-06-08|Microsoft Technology Licensing, Llc|Multipath mitigation for time of flight system|
TWI732424B|2019-05-02|2021-07-01|大陸商廣州印芯半導體技術有限公司|Image sensor with distance sensing function and operating method thereof|
US11070757B2|2019-05-02|2021-07-20|Guangzhou Tyrafos Semiconductor Technologies Co., Ltd|Image sensor with distance sensing function and operating method thereof|
US20210004567A1|2019-07-01|2021-01-07|Samsung Electronics Co., Ltd.|Electronic apparatus and control method thereof|
CN110378946B|2019-07-11|2021-10-01|Oppo广东移动通信有限公司|Depth map processing method and device and electronic equipment|
Legal status:
Priority:
Application number | Filing date | Patent title
EP14171985.6A|EP2955544B1|2014-06-11|2014-06-11|A TOF camera system and a method for measuring a distance with the system|