Patent abstract:
According to an aspect, there is provided a method comprising controlling a structural light source of a modelling arrangement to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model; recording a first image of the surface comprising the diffraction pattern with a first camera of the modelling arrangement and a second image of the surface comprising the diffraction pattern with a second camera of the modelling arrangement substantially simultaneously, the first camera and the second camera being calibrated and their positions relative to each other being known; determining a point cloud comprising primary points from the diffraction pattern visible in the first and second images; and using each primary point of the point cloud in the first and second images as an initial point for search spaces for secondary points in the first and second images.
Publication number: FI20185522A1
Application number: FI20185522
Filing date: 2018-06-07
Publication date: 2019-12-08
Inventors: Sami Ruuskanen; Jorma Palmén; Oona Räisänen
Applicant: Ladimo Oy
IPC main class:
Patent description:

BACKGROUND OF THE INVENTION
Photogrammetry makes use of mapping or modeling based on photographs taken of an object of interest. There are numerous applications in which, for example, three-dimensional imaging conducted by means of machine vision is very useful. In general, these systems may include, for example, a camera, a lighting device and a control system managing the equipment. In stereophotogrammetry, a three-dimensional object is modeled by taking two or more photographs of an object from different camera positions and identifying the common points in each photograph.
The challenge in photogrammetry and stereophotogrammetry is to be able to model a three-dimensional object quickly, for example, in real-time or near real-time, and to do this very accurately.
20185522 prh 07-06- 2018
SUMMARY OF THE INVENTION
According to a first aspect of the invention, there is provided a method comprising controlling a structural light source of a modelling arrangement to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model. The method further comprises recording a first image of the surface comprising the diffraction pattern with a first camera of the modelling arrangement and a second image of the surface comprising the diffraction pattern with a second camera of the modelling arrangement substantially simultaneously, the first camera and the second camera being calibrated and their positions relative to each other being known. The method further comprises determining a point cloud comprising primary points from the diffraction pattern visible in the first and second images, and using each primary point of the point cloud in the first and second images as an initial point for search spaces for secondary points in the first and second images.
In an implementation form of the first aspect, the method further comprises forming partial image pairs of the first and second images by dividing the first image and the second image to a set of partial images, each partial image pair comprising a partial image of the first image and a corresponding partial image of the second image, and wherein each partial image of the first image and the second image comprise the same single primary point of the point cloud; identifying corresponding secondary points in the partial image pairs, each secondary point having x and y image coordinates; and calculating X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
In a further implementation form of the first aspect, the method further comprises dividing each partial image into sub-blocks; and identifying corresponding secondary points in the sub-blocks.
In a further implementation form of the first aspect, identifying corresponding secondary points in the partial image pairs comprises: searching a secondary point in the partial image of the second image, the secondary point corresponding to a secondary point in the corresponding partial image of the first image, in a search area in the partial image of the second image having substantially the same distance from the primary point in the partial image of the second image as the distance between the secondary point in the partial image of the first image and the primary point in the partial image of the first image.
According to a second aspect of the invention, there is provided a modelling system comprising a modelling arrangement comprising a structural light source configured to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model; a first camera configured to photograph the surface to be modeled at wavelengths emitted by the light source; and a second camera configured to photograph the surface to be modeled at wavelengths emitted by the light source. The modelling arrangement is calibrated so that relative orientations of the optical axes of the first camera and the second camera and the diffraction axis are known, and wherein the position, distance and orientation of the output point of the structural light source to the first camera and the second camera are known. The modelling system further comprises a control system connected to the modelling arrangement and being configured to control the structural light source to produce the diffraction pattern of a known geometry on the surface to be modeled, record a first image of the surface comprising the diffraction pattern with the first camera and a second image of the surface comprising the diffraction pattern with the second camera substantially simultaneously; determine a point cloud comprising primary points from the diffraction pattern visible in the first and second images; and use each primary point of the point cloud in the first and second images as initial points for search spaces for secondary points in the first and second images.
In an implementation form of the second aspect, the control system connected to the modelling arrangement is further configured to form partial image pairs of the first and second images by dividing the first image and the second image to a set of partial images, each partial image pair comprising a partial image of the first image and a corresponding partial image of the second image, and wherein each partial image of the first image and the second image comprise the same single primary point of the point cloud; identify corresponding secondary points in the partial image pairs, each secondary point having x and y image coordinates; and calculate X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
In a further implementation form of the second aspect, the control system is configured to divide each partial image into sub-blocks, and identify corresponding secondary points in the sub-blocks.
In a further implementation form of the second aspect, the control system is configured to search a secondary point in the partial image of the second image, the secondary point corresponding to a secondary point in a corresponding partial image of the first image, in a search area in the partial image of the second image having substantially the same distance from the primary point in the partial image of the second image as the distance between the secondary point in the partial image of the first image and the primary point in the partial image of the first image.
According to a third aspect, there is provided an apparatus comprising at least one processor and at least one memory connected to the at least one processor. The at least one memory stores program instructions that, when executed by the at least one processor, cause the apparatus to control a structural light source of a modelling arrangement to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model; record a first image of the surface comprising the diffraction pattern with a first camera of the modelling arrangement and a second image of the surface comprising the diffraction pattern with a second camera of the modelling arrangement substantially simultaneously, the first camera and the second camera being calibrated and their positions relative to each other being known; determine a point cloud comprising primary points from the diffraction pattern visible in the first and second images; and use each primary point of the point cloud in the first and second images as an initial point for search spaces for secondary points in the first and second images.
In an implementation form of the third aspect, the at least one memory stores program instructions that, when executed by the at least one processor, cause the apparatus to form partial image pairs of the first and second images by dividing the first image and the second image to a set of partial images, each partial image pair comprising a partial image of the first image and a corresponding partial image of the second image, and wherein each partial image of the first image and the second image comprise the same single primary point of the point cloud; identify corresponding secondary points in the partial image pairs, each secondary point having x and y image coordinates; and calculate X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
In a further implementation form of the third aspect, the at least one memory stores program instructions that, when executed by the at least one processor, cause the apparatus to divide each partial image into sub-blocks; and identify corresponding secondary points in the sub-blocks.
In a further implementation form of the third aspect, the at least one memory stores program instructions that, when executed by the at least one processor, cause the apparatus to search a secondary point in the partial image of the second image, the secondary point corresponding to a secondary point in the corresponding partial image of the first image, in a search area in the partial image of the second image having substantially the same distance from the primary point in the partial image of the second image as the distance between the secondary point in the partial image of the first image and the primary point in the partial image of the first image.
According to a fourth aspect, there is provided an apparatus comprising means for controlling a structural light source of a modelling arrangement to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model; means for causing recording of a first image of the surface comprising the diffraction pattern with a first camera of the modelling arrangement and a second image of the surface comprising the diffraction pattern with a second camera of the modelling arrangement substantially simultaneously, the first camera and the second camera being calibrated and their positions relative to each other being known; means for determining a point cloud comprising primary points from the diffraction pattern visible in the first and second images; and means for using each primary point of the point cloud in the first and second images as an initial point for search spaces for secondary points in the first and second images.
In an implementation form of the fourth aspect, the apparatus comprises means for forming partial image pairs of the first and second images by dividing the first image and the second image to a set of partial images, each partial image pair comprising a partial image of the first image and a corresponding partial image of the second image, and wherein each partial image of the first image and the second image comprise the same single primary point of the point cloud; means for identifying corresponding secondary points in the partial image pairs, each secondary point having x and y image coordinates; and means for calculating X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
In a further implementation form of the fourth aspect, the apparatus comprises means for dividing each partial image into sub-blocks; and means for identifying corresponding secondary points in the sub-blocks.
In a further implementation form of the fourth aspect, the apparatus comprises means for searching a secondary point in the partial image of the second image, the secondary point corresponding to a secondary point in the corresponding partial image of the first image, in a search area in the partial image of the second image having substantially the same distance from the primary point in the partial image of the second image as the distance between the secondary point in the partial image of the first image and the primary point in the partial image of the first image.
According to a fifth aspect of the invention, there is provided a computer program comprising program code arranged to perform the method of the first aspect.
According to a sixth aspect of the invention, there is provided a computer-readable medium comprising a computer program comprising program code arranged to perform the method of the first aspect.
At least some of the aspects and implementation forms may provide one or more effects and/or advantages. The solution may enable a significant increase in the spatial resolution of the modelling arrangement in real-time or near real-time solutions with a substantially reduced processing time. This may also increase the usability of the modelling arrangement for various applications involving accurate object identification and for algorithms requiring denser three-dimensional data about a scene. An increased spatial resolution means that scene objects will be sampled much more densely than with prior solutions while simultaneously keeping the accuracy of the measured points higher than with known sensors that produce dense point clouds. Furthermore, the disclosed solution may allow separating the extremely high accuracy primary points from the secondary points for applications that must use extremely high quality three-dimensional data.
LIST OF FIGURES
The invention will be described below in detail by means of examples of embodiments, wherein
FIG. 1 illustrates a modeling system according to an aspect of the invention.
FIG. 2 illustrates a block diagram of a method for obtaining and processing images.
FIG. 3 illustrates a simplified image comprising a point cloud determined from a diffraction pattern.
FIG. 4A illustrates an example of a partial image of a larger image captured with a camera.
FIGS. 4B and 4C illustrate an example of a pair of corresponding partial images.
FIGS. 5A and 5B illustrate an example of identifying corresponding secondary points in a pair of partial images.
FIGS. 6A - 6N illustrate another example of identifying corresponding secondary points in a pair of partial images.
FIG. 7 illustrates an exemplary apparatus for controlling a modelling arrangement.
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 illustrates a modeling system according to an aspect of the invention. The modelling system comprises a modelling arrangement 100 and a control system 112. The modelling arrangement 100 comprises a structural light source 102 configured to produce a diffraction pattern of a known geometry on a surface to be modeled. The structural light source 102 may comprise a laser light source and a diffraction element through which light from the laser light source travels. The structural light source 102 may be configured to produce substantially monochromatic and coherent electromagnetic radiation. The diffraction pattern provided by the structural light source 102 accurately complies with a mathematical-physical model, and the beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model. In other words, the diffraction pattern provided by the structural light source 102 is modelled based on the mathematical-physical model. The geometry of the point cloud produced by the diffraction may depend on a grating used in the diffraction source, and any suitable point geometry can be used.
In one example, the structural light source 102 may produce radiation that contains only one wavelength or a plurality of wavelengths at the same time, for example, red, green and blue coherent light.
The modelling arrangement 100 further comprises a first camera 108 configured to photograph the surface to be modeled at wavelengths emitted by the structural light source 102, and a second camera 104 configured to similarly photograph the surface to be modeled at wavelengths emitted by the structural light source 102. The image files may be automatically transferred from the first and second cameras to the control system 112, for example, by means of wireless data transfer.
The modelling arrangement 100 is calibrated so that relative orientations of the optical axes of the first camera 108 and the second camera 104 and the diffraction axis of the structural light source 102 are known, and the position, distance and orientation of the output point of the structural light source 102 relative to the first camera 108 and the second camera 104 are known. The calibration may also take into account optical qualities and distortions of the lenses of the cameras 104, 108.
The modelling system also comprises a control system 112 connected to the modelling arrangement 100.
The control system 112 is configured to control operations of the modelling arrangement 100. The control system 112 may comprise a computer equipped with appropriate software to control the cameras 104, 108 and to process images recorded with the cameras 104, 108.
The control system 112 may also be configured to control the structural light source 102 to produce the diffraction pattern of known geometry on the surface to be modeled. The control system 112 may further be configured to cause recording of a first image of the surface comprising the diffraction pattern with the first camera 108 and a second image of the surface comprising the diffraction pattern with the second camera 104 substantially simultaneously. The term "substantially simultaneously" means that the first camera and the second camera are configured to photograph the same object with the diffraction pattern simultaneously or nearly simultaneously.
When the first and second images have been recorded, the control system 112 is configured to determine a point cloud comprising primary points from the diffraction pattern visible in the first and second images. The point cloud is designed and implemented so that unambiguous identification of each primary point within an image is possible. In practice this means that in a specific section in the image there is only one possible primary point that can be there. The control system 112 is further configured to use each primary point of the point cloud in the first and second images as initial points for search spaces for secondary points in the first and second images. The secondary points may be artifacts of lesser intensity that lie between the primary points of the image and for which accurate three-dimensional positions can be calculated. The secondary points may be background light that falls within the used bandwidth; further, its intensity may be significantly lesser than that of the primary points. As another example, the secondary points may be noise generated by the diffraction element. As another example, the secondary points may be patterns or noise generated by a separate light source. In one example, the separate light source may be integrated into the modelling arrangement 100. As the possible search spaces are significantly reduced in size, this enables a solution that is able in real-time or nearly real-time to provide an accurate three-dimensional representation of the surface.
The control system 112 may be provided in immediate proximity to the modeling arrangement 100. Alternatively, the control system 112 may be physically present in any other location, as long as the control system 112 is able to download the images taken by the cameras 104, 108 via a data communication network, for example, the Internet.
The control system 112 may comprise one or more processors and one or more memories connected to the processor. Through a data transfer interface, the control system 112 may receive information from external devices or systems. The memory may contain one or more computer programs containing program code which is arranged to perform the method steps described herein.
FIG. 2 illustrates a block diagram of a method for obtaining and processing images. The method may be implemented as a computer program performed by a computer.
At 200, a structural light source of a modelling arrangement is controlled to produce a diffraction pattern of a known geometry on a surface to be modeled. The diffraction pattern accurately complies with a mathematical-physical model, and the beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model. The geometry of the diffraction depends on a grating used in the diffraction source, and any suitable point geometry can be used.
At 202, recording of a first image of the surface comprising the diffraction pattern with a first camera of a modelling arrangement and a second image of the surface comprising the diffraction pattern with a second camera of the modelling arrangement is caused substantially simultaneously. The structural light source, the first camera and the second camera have been calibrated so that relative orientations of the optical axes of the first camera and the second camera and the diffraction axis are known. Further, the position, distance and orientation of the output point of the structural light source to the first camera and the second camera are known. The calibration may also take into account optical distortions of the lenses of the cameras.
At 204, a point cloud comprising primary points is determined from the diffraction pattern visible in the first and second images.
At 206, each primary point of the point cloud in the first and second images is used as an initial point for search spaces for secondary points in the first and second images. The secondary points may be artifacts of lesser intensity that lie between two primary points of the image and for which accurate three-dimensional positions can be calculated. The secondary points may be background light that falls within the used bandwidth; further, its intensity may be significantly lesser than that of the primary points. As another example, the secondary points may be noise generated by the diffraction element. As another example, the secondary points may be patterns or noise generated by a separate light source. The separate light source may be integrated into the modelling arrangement. As the possible search spaces are significantly reduced in size, this enables a solution that is able in real-time or nearly real-time to provide an accurate three-dimensional representation of the surface.
Corresponding secondary points in the first and second images may be identified using any image processing solution or algorithm that is able to find corresponding points from two images. When the corresponding secondary points in the images have been identified, each secondary point has x and y image coordinates. Based on the x and y image coordinates of the secondary point and the camera positions, X, Y and Z coordinates in an external coordinate system may be calculated for each secondary point.
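As an illustration of this last step, the back-projection of a matched point pair into X, Y and Z coordinates can be sketched as below. This is a minimal sketch, not the patent's implementation: it assumes an idealized, rectified stereo pair in which both cameras share a focal length f (in pixels) and are separated by a known baseline, and all function names and numeric values are hypothetical.

```python
def triangulate_rectified(x1, y1, x2, f, baseline):
    """Compute X, Y, Z for a matched point pair from a rectified stereo rig.

    x1, y1 -- image coordinates in the first image (pixels, measured
              from the principal point)
    x2     -- x coordinate of the corresponding point in the second image
    f      -- focal length in pixels; baseline -- camera separation (metres)
    """
    disparity = x1 - x2
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    Z = f * baseline / disparity  # depth from disparity
    X = x1 * Z / f                # back-project image x to metres
    Y = y1 * Z / f                # back-project image y to metres
    return X, Y, Z

# Hypothetical numbers: f = 1000 px, baseline = 0.1 m, disparity = 20 px
X, Y, Z = triangulate_rectified(50.0, 30.0, 30.0, 1000.0, 0.1)
print(X, Y, Z)  # 0.25 0.15 5.0
```

In the general, non-rectified case described in the document, the calibrated relative orientation of the two cameras would be used for the triangulation instead of a single shared focal length and baseline.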
FIG. 3 illustrates a simplified image 300 comprising a point cloud 306 determined from a diffraction pattern visible in an image recorded with a first camera. Each point 304 of the point cloud 306 is accurately identifiable and has very accurate three-dimensional coordinates.
FIG. 3 illustrates an embodiment in which a plurality of partial images 302 are formed from the whole image 300. Although FIG. 3 illustrates only six partial images 302, other partial images of the image 300 are generated similarly. A similar operation, i.e. forming partial images, may be done to an image recorded substantially simultaneously with a second camera.
Each partial image 302 of the image 300 comprises a single primary point 304 of the point cloud. This is exemplified in FIG. 3 with a dot in a center of each partial image 302. By dividing the image 300 into partial images 302, it is possible to process each partial image separately and in a parallel manner. Each primary point of the point cloud in a partial image 302 can be used as an initial point for search spaces for secondary points in the partial image 302. The secondary points are points that lie between two primary points and for which accurate three-dimensional positions can be calculated.
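The division into primary-point-centred partial images can be sketched as follows. This is a hedged illustration that uses plain Python lists as a stand-in for image data; the function name and window size are hypothetical, not taken from the document.

```python
def partial_image(image, cx, cy, half):
    """Extract a square partial image of side (2*half + 1) centred on the
    primary point at pixel (cx, cy); image is a list of pixel rows."""
    return [row[cx - half:cx + half + 1]
            for row in image[cy - half:cy + half + 1]]

# Toy 8x8 "image" whose pixel values encode their (x, y) coordinates,
# with a primary point assumed at (4, 4).
img = [[(x, y) for x in range(8)] for y in range(8)]
patch = partial_image(img, 4, 4, 1)   # 3x3 partial image
print(patch[1][1])  # centre pixel is the primary point: (4, 4)
```

Because each partial image is independent of the others, the patches can be processed in parallel, as the paragraph above notes.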
FIG. 4A illustrates an example of a partial image 400 of a larger image captured with a camera. The partial image 400 may correspond to a partial image 302 illustrated in FIG. 3.
The partial image 400 comprises a single primary point 402 of a point cloud as a center point. As discussed earlier, a point cloud comprising primary points is determined from the diffraction pattern visible in an image. The diffraction pattern accurately complies with a mathematical-physical model, and the beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model. Similar partial images are formed from the image for each other primary point of the point cloud.
FIGS. 4B and 4C illustrate a pair of corresponding partial images. A partial image 400A is a partial image of an image recorded with a first camera and a partial image 400B is a corresponding partial image of an image recorded with a second camera. Both partial images 400A, 400B include the same primary point 402A, 402B of the point cloud as a center point.
FIGS. 5A and 5B illustrate an example of identifying corresponding secondary points in a pair of partial images 514A, 514B. In addition to FIGS. 4B and 4C, FIGS. 5A and 5B now illustrate that both partial images include shapes 504A, 506A, 508A, 510A, 512A; 504B, 506B, 508B, 510B, 512B that can be identified using one or more image processing algorithms. A shape may also refer to a secondary point already discussed earlier. For example, when a shape or secondary point 504A has been identified in the partial image 514A, a corresponding shape or secondary point 504B can be identified in the other partial image 514B. A search for a corresponding secondary point in the partial image 514B can be directed to a search area having substantially the same distance from the primary point in the partial image 514B as the distance between the corresponding secondary point in the partial image 514A and the primary point in the partial image 514A. This means that the secondary point 504B in the partial image 514B can be found accurately and quickly.
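This distance constraint can be expressed directly in code. The sketch below, with hypothetical names, coordinates and a hypothetical one-pixel tolerance, keeps only those candidate points in the second partial image whose distance from that image's primary point is approximately the distance observed in the first partial image:

```python
import math

def filter_by_distance(candidates, primary, target_dist, tol):
    """Return the candidate secondary points whose distance from the primary
    point is within tol pixels of target_dist (the distance measured between
    the secondary point and the primary point in the first partial image)."""
    px, py = primary
    return [(x, y) for (x, y) in candidates
            if abs(math.hypot(x - px, y - py) - target_dist) <= tol]

primary = (50, 50)
candidates = [(53, 54), (70, 70), (46, 47)]
# The matching secondary point lay 5 px from the primary point in image one:
print(filter_by_distance(candidates, primary, 5.0, 1.0))  # [(53, 54), (46, 47)]
```

Restricting the search to such an annulus around the primary point is what shrinks the search space and enables the real-time operation described above.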
FIGS. 6A - 6F illustrate another example of identifying corresponding secondary points in a pair of partial images 600A, 600B. In FIG. 6A, the partial image 600A of a first image has been divided to a set of sub-blocks 604. In this particular example, the partial image 600A is divided to nine sub-blocks. A center sub-block comprises a primary point 602A. As the center block comprises the primary point 602A, there is no need to separately search for a corresponding point in the partial image 600B as it is already known (i.e. it is the primary point 602B). FIG. 6B illustrates a partial image 600B of a second image corresponding to the partial image 600A of the first image.
FIGS. 6C and 6D illustrate an example where a sub-block or a part 616 of the sub-block is used as a template for a search for a corresponding sub-block or a part of the sub-block in the partial image 600B. The sub-block or the part 616 of the sub-block of the partial image 600A may then be compared to pixels of the partial image 600B.
FIGS. 6E - 6N illustrate an exemplary process for finding a corresponding secondary point in the partial image 600B.
Again, a section 616 in the sub-block in the left-hand corner of the partial image 600A is used as a template. FIGS. 6F - 6N illustrate that the template is sequentially compared to a set of pixels 618A-618E of the partial image 600B in the corner of the partial image 600B. Although in FIGS. 6F - 6N it may be illustrated that the set of pixels 618A-618E in the partial image 600B in each sequence moves only in one direction, for example, in the horizontal direction, in other examples there may be both horizontal and vertical movement of the set of pixels 618A-618E in the partial image 600B between two consecutive sequences.
At each comparison, a correlation coefficient may be calculated for the template 616 in the partial image 600A and the set of pixels 618A-618E in the partial image 600B. One possible solution for calculating the correlation coefficient is to use the Pearson Correlation Coefficient:

ρ(X, Y) = E[(X - μ_X)(Y - μ_Y)] / (σ_X σ_Y)

where μ_X and μ_Y are the means and σ_X and σ_Y the standard deviations of the two variables. The quantitative measure of the degree of association of two distinct variables is often coined as the correlation coefficient, and it typically ranges between -1 and 1. If the correlation coefficient is 1, there is a perfect positive correlation between the two variables.
When all correlation coefficients have been calculated, the position of the set of pixels 618A-618E that gave the highest correlation coefficient as it was moved within the partial image 600B is the best correspondence for the template 616.
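A plain-Python sketch of this sliding comparison is given below for a one-dimensional row of pixels; a real implementation would slide a two-dimensional template over the partial image, but the logic is the same. All names and pixel values are hypothetical.

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length pixel sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    sd_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (sd_a * sd_b)

def best_match(template, row):
    """Slide the template along the row; return (offset, correlation)
    of the position with the highest Pearson correlation coefficient."""
    best_off, best_r = 0, -2.0
    for off in range(len(row) - len(template) + 1):
        r = pearson(template, row[off:off + len(template)])
        if r > best_r:
            best_off, best_r = off, r
    return best_off, best_r

row = [10, 10, 12, 40, 80, 40, 12, 10]   # pixel intensities in image two
template = [12, 40, 80, 40, 12]          # profile around a secondary point
off, r = best_match(template, row)
print(off)  # 2 -- the template matches row[2:7] exactly
```

Note that Pearson correlation normalizes both mean and variance, so the match is robust to uniform brightness and contrast differences between the two cameras.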
Center points of the template 616 and the best corresponding set of pixels 618A-618E in the partial image 600B form a pair of observed image points.
Based on the pair of observed image points and based on the positions of the two calibrated cameras used to capture the two images, it is possible to calculate three-dimensional coordinates in an external coordinate system. In other words, firstly, corresponding secondary points in the partial image pairs can be identified, each secondary point having x and y image coordinates. Secondly, X, Y and Z coordinates in the external coordinate system for each secondary point can be calculated based on the x and y image coordinates of the secondary point and the camera positions by using, for example, triangulation.
In an embodiment, the examples and embodiments disclosed above provide pixel level accuracy in finding corresponding points in the two images. In a further embodiment, the pixel level accuracy may be extended even further to sub pixel accuracy. The sub pixel accuracy refers to identifying the location of an object with accuracy greater than the pixel level.
For example, this may mean defining a center point of a circle observed from the image in pixel coordinates with decimal point accuracy instead of merely referring to the integer pixel values. Furthermore, achieving sub pixel accuracy might involve interpolating and/or re-sampling pixel values of the image, which is one commonly used method in finding a best match for a template image. In another example, achieving sub pixel accuracy might involve fitting a continuous smooth function to the correlation values in different locations and finding the local maxima.
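One common realization of the smooth-function approach is fitting a parabola through the correlation values around the discrete maximum. The helper below is an illustrative sketch of that idea, not the patent's own method.

```python
def subpixel_peak(c_left, c_peak, c_right):
    # Fit a parabola through three correlation values sampled at integer
    # offsets -1, 0, +1 around the discrete maximum and return the offset of
    # its vertex, i.e. the sub-pixel correction in the range (-0.5, 0.5).
    denom = c_left - 2.0 * c_peak + c_right
    if denom == 0.0:
        return 0.0      # degenerate (flat) sample: keep the integer peak
    return 0.5 * (c_left - c_right) / denom

# Correlation samples whose true peak lies a quarter pixel to the right of
# the integer maximum: values of 1 - (x - 0.25)^2 at x = -1, 0, 1.
samples = [1 - (x - 0.25) ** 2 for x in (-1, 0, 1)]
correction = subpixel_peak(*samples)
```

Adding `correction` to the integer peak position gives the match location with decimal-point accuracy.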
The term calibrated camera may refer to 1) determining distortion errors of a lens or lens system of the camera and to 2) determining an internal geometry of the camera. The distortion errors may refer to changes in the image geometry caused by physical properties of the lens. In an image, distortion errors may be seen, for example, as barrel distortion and pincushion distortion. The distortion errors may be mathematically corrected. The determination of the internal geometry of the camera may comprise determining the focal length of the lens or lens system, the principal point of the camera and an aspect ratio.
The focal length depends on the physical properties of the lens and describes the distance between the optical center point of the lens system and the image plane. Furthermore, together with image observation coordinates it describes the direction in which a phenomenon observed from the image lies, and the scale of objects on the image plane in relation to their distance from the camera.
The principal point is the point on the image plane where the optical axis crosses it. Lens distortion depends on the distance from the principal point, where the distortion is zero.
The aspect ratio describes the asymmetry of the scale of the different image axes. This may be due to lens properties or the structure or assembly of the photo-voltaic cells on the image plane. The aspect ratio may be defined by using different focal lengths for the x and y coordinates. The internal geometry is defined simultaneously with the lens distortion parameters by observing objects with known geometries.
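A minimal sketch of this internal geometry, assuming a simple pinhole model with a single radial distortion coefficient; the numeric intrinsics below are hypothetical, and real calibrations typically use richer distortion models.

```python
def project(X, Y, Z, fx, fy, cx, cy, k1=0.0):
    # Pinhole projection with one radial distortion coefficient k1.
    # Distinct fx and fy encode the aspect ratio via different focal lengths
    # for the x and y coordinates; (cx, cy) is the principal point, where the
    # distortion is zero.
    xn, yn = X / Z, Y / Z              # normalized image coordinates
    r2 = xn * xn + yn * yn             # squared radial distance from the axis
    d = 1.0 + k1 * r2                  # radial distortion factor
    return fx * d * xn + cx, fy * d * yn + cy

# A point on the optical axis projects exactly to the principal point.
u, v = project(0.0, 0.0, 2.0, fx=800.0, fy=820.0, cx=320.0, cy=240.0)
# With a negative (barrel) coefficient, off-axis points are pulled inward.
u2, v2 = project(0.5, 0.25, 2.0, fx=800.0, fy=820.0, cx=320.0, cy=240.0,
                 k1=-0.1)
```

Calibration inverts this mapping: the distortion parameters and intrinsics are estimated from observations of known geometries, after which observed pixels can be moved to their corrected positions.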
Further, in one embodiment, the calibration of the camera lens is performed only once, and optical distortions of the images are corrected by the obtained calibration information by moving the pixels of the images to the correct positions.
The two cameras may also be calibrated with respect to each other. The camera 104 positioned farther from a structural light source 102 of a modelling arrangement 100 illustrated in FIG. 1 determines the measurement coordinate system, and the other camera 108 is positioned with respect to that. The origin of the measurement coordinate system is a projection center of the camera 104. The axes of the measurement coordinate system are parallel with axes of an image coordinate system of an image, and the third axis, being perpendicular to the image plane, is the optical axis of the camera. After the calibration, observations of the camera 104 are by definition directly in the measurement coordinate system, whereas observations of the camera 108 can be converted to the measurement coordinate system.
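Converting an observation of the camera 108 into the measurement coordinate system of the camera 104 amounts to a rigid transform. The rotation and translation below are hypothetical stand-ins for values that stereo calibration would produce.

```python
import numpy as np

def to_measurement_frame(X_cam2, R, t):
    # A point expressed in the coordinate system of the second camera (108)
    # is converted into the measurement coordinate system of the first
    # camera (104) by the rigid transform X_meas = R @ X_cam2 + t, where R
    # and t come from the calibration of the two cameras with respect to
    # each other.
    return R @ np.asarray(X_cam2, dtype=float) + np.asarray(t, dtype=float)

# Hypothetical calibration: the second camera sits 0.1 units along the X
# axis of the measurement frame with no relative rotation.
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])
X_meas = to_measurement_frame([0.0, 0.0, 2.0], R, t)
```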
Although the embodiments and examples above have only disclosed analysis based on separate photographs or images, it is obvious for a person skilled in the art also to apply the invention so that several individual photographs are combined into a larger unity of the surface to be modeled. In this case, the modeling arrangement described in the invention can be manually or automatically moved in the photographing situation so that the desired arrangement can be modeled.
The embodiments of the invention described above may be used in many different application environments, for example in measuring walls of a tunnel, rock material identification, forest applications (for example in evaluating a forest stand), automotive industry or any other application for modeling a surface.
FIG. 7 illustrates an exemplary apparatus 700 that may include a variety of optional hardware and software components. The illustrated apparatus 700 can include one or more controllers or processors 702 (e.g., signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions.
The illustrated apparatus 700 can also include a memory or memories 702. The memory 702 can include a non-removable memory and/or a removable memory. The non-removable memory can include RAM, ROM, flash memory, a hard disk, or other well-known memory storage technologies. The removable memory can include flash memory or other well-known memory storage technologies. The memory can be used for storing data and/or code for running an operating system and/or one or more applications.
The apparatus 700 may be configured to implement the various features, examples and embodiments illustrated, for example, in FIGS. 2-4, 4A-4C, 5A-5B and 6A-6N partially or completely. The functionality described herein can be performed, at least in part, by one or more computer program product components such as software components. According to an example, the processor 702 may be configured by the program code which when executed performs the examples and embodiments of the operations and functionality described. Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include
Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and Graphics Processing Units (GPUs).
The embodiments and examples are described above by way of example only, and the hardware used to carry out these embodiments may vary in many ways, as persons skilled in the hardware and/or software art will appreciate. The functionality of one or more components of the example embodiments may be implemented for example by one or more apparatus and/or a computer program executed on a computer.
The example embodiments may store information related to the different processes described herein. This information may be stored in one or more memories, such as a hard disk, optical disk, RAM memory etc. One or more memories or databases may store the information used to carry out the example embodiments of the present invention.
The example embodiments as a whole or parts of them may be carried out using one or more general-purpose processors, microprocessors, DSP processors, microcontrollers etc., which are programmed according to the teachings of the example embodiments of the present invention, as persons skilled in the computer and/or software art will appreciate.
Any computer-readable medium or combination of media may store the computer program or computer program product for executing the processing to be executed to carry out the invention as a whole or in part (if the processing is distributed).
The devices for use in the embodiments of the invention may include computer-readable media or memories containing commands programmed according to the teachings of the present invention with data structures, tables, records and/or other data described herein. The computer-readable media may include any suitable medium participating in providing commands to the processor for their execution. Such a medium may be provided in many different forms, including non-volatile memories, volatile memories, transfer media, etc., without being limited to the afore-said. The non-volatile memories may include for example optical or magnetic disks, magneto-optical disks, etc. The volatile memories may include dynamic memories, etc.
The transfer media may include coaxial cables, copper wire, optical fiber, etc. The transfer media may also be provided in the form of acoustic, optical, electromagnetic etc. waves, such as in the form of waves formed during radio-frequency communication, infrared data transfer, etc. General embodiments of computer-readable media may include for example a computer disk, hard disk, magnetic tape, any other suitable magnetic medium, CD-ROM disk, CD-R disk, CD-RW disk, DVD disk, DVD-ROM disk, DVD±RW disk, DVD±R disk, any other suitable optical medium, RAM memory, ROM memory, EPROM memory, FLASH-EPROM memory, any other suitable memory chip or any other suitable medium readable by a processor or a computer. The devices for use in the embodiments of the invention may also include data transfer means by which information is sent and received using a wired or wireless data transfer connection.
The invention is not limited merely to the above embodiment examples; instead, many modifications are possible within the scope of the inventive idea defined by the claims.
Claims:
Claims (14)
[1] 1. A method comprising:
controlling (200) a structural light source (102) of a modelling arrangement (100) to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model;
recording (202) a first image of the surface comprising the diffraction pattern with a first camera (104) of the modelling arrangement (100) and a second image of the surface comprising the diffraction pattern with a second camera (108) of the modelling arrangement (100) substantially simultaneously, the first camera (104) and the second camera (108) being calibrated and their positions relative to each other being known;
determining (204) a point cloud (306) comprising primary points from the diffraction pattern visible in the first and second images; and
using (206) each primary point of the point cloud (306) in the first and second images as an initial point for search spaces for secondary points in the first and second images.
[2] 2. The method of claim 1, further comprising:
forming partial image pairs (400A, 400B) of the first and second images by dividing the first image and the second image to a set of partial images (302), each partial image pair (400A, 400B) comprising a partial image (400A) of the first image and a corresponding partial image (400B) of the second image, and wherein each partial image (400A, 400B) of the first image and the second image comprise the same single primary point (304) of the point cloud (306);
identifying corresponding secondary points in the partial image pairs (400A, 400B), each secondary point having x and y image coordinates; and
calculating X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
[3] 3. The method of claim 2, further comprising:
dividing each partial image (400A, 400B) into sub-blocks (604); and
identifying corresponding secondary points in the sub-blocks (604).
[4] 4. The method of claim 2, wherein identifying corresponding secondary points in the partial image pairs comprises:
searching a secondary point in the partial image (400B) of the second image, the secondary point corresponding to a secondary point in the corresponding partial image (400A) of the first image, in a search area in the partial image (400B) of the second image having substantially the same distance from the primary point (402B) in the partial image of the second image as a distance between the secondary point in the partial image (400A) of the first image and the primary point (402A) in the partial image (400A) of the first image.
[5] 5. A modelling system comprising:
a modelling arrangement (100) comprising:
a structural light source (102) configured to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model;
a first camera (104) configured to photograph the surface to be modeled at wavelengths emitted by the light source (102); and
a second camera (108) configured to photograph the surface to be modeled at wavelengths emitted by the light source (102);
wherein the modelling arrangement (100) is calibrated so that relative orientations of the optical axis of the first camera (104) and the second camera (108) and the diffraction axis are known, and wherein the position, distance and orientation of the output point of the structural light source (102) to the first camera (104) and the second camera (108) are known;
a control system (112) connected to the modelling arrangement (100) and being configured to:
control the structural light source (102) to produce the diffraction pattern of a known geometry on the surface to be modeled;
record a first image of the surface comprising the diffraction pattern with the first camera (104) and a second image of the surface comprising the diffraction pattern with the second camera (108) substantially simultaneously;
determine a point cloud (306) comprising primary points (302) from the diffraction pattern visible in the first and second images; and
use each primary point (302) of the point cloud (306) in the first and second images as initial points for search spaces for secondary points in the first and second images.
[6] 6. The modelling system of claim 5, wherein the control system (112) connected to the modelling arrangement (100) is further configured to:
form partial image pairs (400A, 400B) of the first and second images by dividing the first image and the second image to a set of partial images, each partial image pair (400A, 400B) comprising a partial image (400A) of the first image and a corresponding partial image (400B) of the second image, and wherein each partial image (400A, 400B) of the first image and the second image comprise the same single primary point of the point cloud (306);
identify corresponding secondary points in the partial image pairs (400A, 400B), each secondary point having x and y image coordinates; and
calculate X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
[7] 7. The modelling system of claim 6, wherein the control system (112) is configured to:
divide each partial image (400A) into sub-blocks (604); and
identify corresponding secondary points in the sub-blocks (604).
[8] 8. The modelling system of claim 6 or 7, wherein the control system (112) is configured to:
search a secondary point in the partial image (400B) of the second image, the secondary point corresponding to a secondary point in a corresponding partial image (400A) of the first image, in a search area in the partial image (400B) of the second image having substantially the same distance from the primary point (402B) in the partial image (400B) of the second image as a distance between the secondary point in the partial image (400A) of the first image and the primary point (402A) in the partial image (400A) of the first image.
[9] 9. An apparatus comprising:
means for controlling a structural light source (102) of a modelling arrangement (100) to produce a diffraction pattern of a known geometry on a surface to be modeled, the diffraction pattern accurately complying with a mathematical-physical model and wherein beam output angles of the diffraction pattern are accurately known based on the mathematical-physical model;
means for causing recording of a first image of the surface comprising the diffraction pattern with a first camera (104) of the modelling arrangement (100) and a second image of the surface comprising the diffraction pattern with a second camera (108) of the modelling arrangement (100) substantially simultaneously, the first camera (104) and the second camera (108) being calibrated and their positions relative to each other being known;
means for determining a point cloud (306) comprising primary points from the diffraction pattern visible in the first and second images; and
means for using each primary point of the point cloud (306) in the first and second images as an initial point for search spaces for secondary points in the first and second images.
[10] 10. The apparatus of claim 9, further comprising:
means for forming partial image pairs (400A, 400B) of the first and second images by dividing the first image and the second image to a set of partial images (302), each partial image pair (400A, 400B) comprising a partial image (400A) of the first image and a corresponding partial image (400B) of the second image, and wherein each partial image (400A, 400B) of the first image and the second image comprise the same single primary point (304) of the point cloud (306);
means for identifying corresponding secondary points in the partial image pairs (400A, 400B), each secondary point having x and y image coordinates; and
means for calculating X, Y and Z coordinates in an external coordinate system for each secondary point based on the x and y image coordinates of the secondary point and the camera positions.
[11] 11. The apparatus of claim 10, further comprising:
means for dividing each partial image (400A, 400B) into sub-blocks (604); and
means for identifying corresponding secondary points in the sub-blocks (604).
[12] 12. The apparatus of claim 10, further comprising:
means for searching a secondary point in the partial image (400B) of the second image, the secondary point corresponding to a secondary point in the corresponding partial image (400A) of the first image, in a search area in the partial image (400B) of the second image having substantially the same distance from the primary point (402B) in the partial image of the second image as a distance between the secondary point in the partial image (400A) of the first image and the primary point (402A) in the partial image (400A) of the first image.
[13] 13. A computer program comprising program code arranged to perform the method according to any one of claims 1-4, when the program code is executed by at least one processor.
[14] 14. A computer-readable medium comprising a computer program comprising program code arranged to perform the method according to any one of claims 1-4, when the program code is executed by at least one processor.
Similar technologies:
Publication number | Publication date | Patent title
JP2015057612A|2015-03-26|Device and method for performing non-contact measurement
JP6323993B2|2018-05-16|Information processing apparatus, information processing method, and computer program
EP3049756B1|2021-12-01|Modeling arrangement and method and system for modeling the topography of a three-dimensional surface
Beaubier et al.2014|CAD-based calibration and shape measurement with stereoDIC
Chen et al.2002|Self-recalibration of a colour-encoded light system for automated three-dimensional measurements
WO2018201677A1|2018-11-08|Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
Bernal et al.2013|Performance evaluation of optical scanner based on blue LED structured light
Li et al.2007|Large depth-of-view portable three-dimensional laser scanner and its segmental calibration for robot vision
El-Hakim et al.1993|Multicamera vision-based approach to flexible feature measurement for inspection and reverse engineering
Hosseininaveh et al.2014|Towards fully automatic reliable 3D acquisition: From designing imaging network to a complete and accurate point cloud
Karaszewski et al.2013|Automated full-3D digitization system for documentation of paintings
Chang et al.1995|360-deg profile noncontact measurement using a neural network
FI20185522A1|2019-12-08|Modeling the topography of a three-dimensional surface
Aguilar et al.2005|Development of a stereo vision system for non-contact railway concrete sleepers measurement based in holographic optical elements
Hinz et al.2020|A 3d measuring endoscope for use in sheet-bulk metal forming: Design, algorithms, applications and results
Yang et al.2018|A method for measuring the thermal geometric parameters of large hot rectangular forgings based on projection feature lines
CN111353997A|2020-06-30|Real-time three-dimensional surface defect detection method based on fringe projection
Breuckmann et al.1997|3D-metrologies for industrial applications
Weckenmann et al.2006|Testing formed sheet metal parts using fringe projection and evaluation by virtual distortion compensation
Kuhmstedt et al.2004|Phasogrammetric optical 3D sensor for the measurement of large objects
Drouin et al.2017|Characterizing the impact of optically induced blurring of a high-resolution phase-shift 3D scanner
Weckenmann et al.2005|3D surface coordinate inspection of formed sheet material parts using optical measurement systems and virtual distortion compensation
Breitbarth et al.2008|Measurement of optical free-form surfaces with fringe projection
Isa et al.2020|Laser triangulation
Engelke et al.2013|Measurement based simulation of microscope deviations for evaluation of stitching algorithms for the extension of Fourier-based alignment
Patent family:
Publication number | Publication date
EP3803268A4|2022-03-09|
CN112243485A|2021-01-19|
FI128523B|2020-07-15|
WO2019234307A1|2019-12-12|
JP2021527285A|2021-10-11|
US20210247181A1|2021-08-12|
EP3803268A1|2021-04-14|
KR20210019486A|2021-02-22|
References cited:
Publication number | Application date | Publication date | Applicant | Patent title

US6711293B1|1999-03-08|2004-03-23|The University Of British Columbia|Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image|
WO2008062407A2|2006-11-21|2008-05-29|Mantisvision Ltd.|3d geometric modeling and 3d video content creation|
CN101373135B|2008-07-01|2010-06-23|南京航空航天大学|Method and apparatus for measuring three-dimensional curved surface based on instantaneous accidental light irradiation|
US20140307055A1|2013-04-15|2014-10-16|Microsoft Corporation|Intensity-modulated light pattern for active stereo|
FI20135961A|2013-09-25|2015-03-26|Aalto Korkeakoulusäätiö|Imaging arrangements and procedures as well as systems for mapping the topography of three-dimensional surface|
WO2016040229A1|2014-09-10|2016-03-17|Faro Technologies, Inc.|Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device|
CN105091748B|2015-05-29|2017-11-28|南京中车浦镇城轨车辆有限责任公司|Rail vehicle tolerance dimension measuring system|
US9989357B2|2015-09-09|2018-06-05|Faro Technologies, Inc.|Aerial device that cooperates with an external projector to measure three-dimensional coordinates|
CN106878697A|2016-06-29|2017-06-20|鲁班嫡系机器人|A kind of image pickup method and its imaging method, device and equipment|
US10074160B2|2016-09-30|2018-09-11|Disney Enterprises, Inc.|Point cloud noise and outlier removal for image-based 3D reconstruction|
CN106840037A|2017-01-17|2017-06-13|黑龙江科技大学|A kind of three-dimensional appearance digitized measurement system and method for reverse-engineering|
CN106802138B|2017-02-24|2019-09-24|先临三维科技股份有限公司|A kind of 3 D scanning system and its scan method|
Legal status:
2020-07-15| FG| Patent granted|Ref document number: 128523 Country of ref document: FI Kind code of ref document: B |
Priority:
Application number | Application date | Patent title
FI20185522A|FI128523B|2018-06-07|2018-06-07|Modeling the topography of a three-dimensional surface|
PCT/FI2019/050446| WO2019234307A1|2018-06-07|2019-06-07|Modeling the topography of a three-dimensional surface|
CN201980037971.4A| CN112243485A|2018-06-07|2019-06-07|Modeling topography of three-dimensional surfaces|
EP19814728.2A| EP3803268A4|2018-06-07|2019-06-07|Modeling the topography of a three-dimensional surface|
JP2021517916A| JP2021527285A|2018-06-07|2019-06-07|Modeling of topography on a three-dimensional surface|
KR1020217000111A| KR20210019486A|2018-06-07|2019-06-07|Modeling of three-dimensional surface topography|
US16/972,994| US20210247181A1|2018-06-07|2019-06-07|Modeling the topography of a three-dimensional surface|