METHOD FOR DETERMINING THE POSITION OF AN AIRCRAFT
Patent abstract:
The invention relates to a method for determining the position and attitude of an aircraft (1), in particular an airplane (1), within a predetermined geographical area, starting from an estimated position and estimated attitude of the aircraft obtained by estimation, in particular by means of GPS positioning, inclination measurement and compass, a) wherein at least one camera image (10) of the ground below the aircraft (1) is recorded with a camera (2) located in the aircraft (1), b) wherein georeferenced orthonormalized reference images (12) of the geographical area are provided in a database (6), the pixels of the reference images (12) each carrying height information indicating the geographical height of the areas imaged on the pixels, c) wherein a comparison image (11) is created in that three-dimensional objects of a scene (13) are created from the reference images (12) stored in the database (6), taking into account the height information associated with their pixels, and wherein the comparison images (11) are created by means of a rendering algorithm based on the scene (13) and the estimated position and attitude of the aircraft (1), d) wherein the comparison image (11) is compared with the camera image (10), and depending on this comparison a new estimated position and a new estimated attitude are determined, e) wherein steps c) and d) are repeated and the estimated position and attitude are changed until the comparison in step d) yields a sufficient agreement of the comparison image (11) with the camera image (10), and the last estimated position and attitude are regarded as the position and attitude of the aircraft (1).
Publication number: AT511460A4
Application number: T1788/2011
Filing date: 2011-12-02
Publication date: 2012-12-15
Inventor:
Applicant: Ait Austrian Inst Technology
IPC main class:
Patent description:
The invention relates to a method for determining the position and attitude of an aircraft, in particular an airplane, according to independent claim 1. Furthermore, the invention relates to a position and attitude determination unit for an aircraft according to independent claim 5.

The invention is advantageously used commercially in the field of aviation as well as in the operation of unmanned aerial vehicles. Furthermore, the invention makes it possible to create a pilot assistance system which assists the pilot, in particular during the approach, in finding the runway and bringing the aircraft safely onto it. As a result, errors and accidents can be avoided. Furthermore, the position and attitude determination method according to the invention can be used as a whole for a pilot assistance system, thus enabling an improved autopilot system for both manned aircraft and autonomously operating unmanned aerial vehicles.

A variety of mechanisms for locating unmanned aircraft with the support of video imaging are known in the art. In principle, these known methods comprise a method step in which a camera is positioned in the aircraft and images of the ground beneath the aircraft are taken. Furthermore, some methods are known in which visual information is combined with information determined by means of satellite navigation, allowing a higher precision of the determined position data.

The object of the invention is to provide a method for determining the position and attitude of an aircraft, as well as an aircraft which determines its own position and attitude, wherein the determination of the position and attitude of the aircraft is to take place substantially on the basis of a camera carried in the aircraft.

The invention relates to a method for determining the position and attitude of an aircraft, in particular an airplane, within a predetermined geographical area, starting from an estimated position and estimated attitude of the aircraft predetermined by estimation, in particular by means of GPS positioning, inclination measurement and compass. It is envisaged that at least one camera image of the ground located below the aircraft is recorded with a camera in the aircraft, and that georeferenced orthonormalized reference images of the geographical area are made available in a database, the reference images carrying, in addition to the aerial photographs, height information indicating the geographic height of the associated mapped areas. A comparison image is created in that three-dimensional objects of a scene are created from the reference images stored in the database, taking into account the height information associated with their pixels, and the comparison images are generated by means of a rendering algorithm based on the scene and the estimated position and attitude of the aircraft. The comparison image is compared with the camera image and, depending on this comparison, a new estimated position and a new estimated attitude are determined.
The steps of creating the comparison image and re-estimating the position and attitude of the aircraft are repeated, and the estimated position and attitude are changed, until the comparison yields a sufficient agreement of the comparison image with the camera image; the last estimated position and attitude are then regarded as the position and attitude of the aircraft. The method according to the invention offers the particular advantage that the exact position and attitude of the aircraft can be determined at any time on the basis of a relatively rough estimate. The aircraft itself only requires a camera and a data processing unit with a database. The method according to the invention can be operated with relatively little resource consumption in real time and provides about 10 position and attitude determinations per second with conventional computing power.

In order to achieve a particularly advantageous matching of camera images with comparison images, it can be provided that, when verifying the agreement of the camera image with the respective comparison image, pairs of corresponding points are sought, one of which lies in the camera image and the other in the comparison image, both points having the same features; that the corresponding points are compared and a direction vector between the corresponding points is determined; that, starting from the resulting direction vectors and the respectively associated points in the comparison image or camera image, a vector field is created by interpolation; and that the vector field is used to redetermine the estimated position and attitude.

For the same purpose, it may additionally or alternatively be provided that, when checking the correspondence between a pixel of the camera image and a pixel of the comparison image, the individual points and/or image neighborhoods of these pixels are examined for agreement and a correspondence measure is determined, a correspondence being recognized when the correspondence measure exceeds a predetermined threshold.

In order to avoid the detection of random matches, provision may be made for certain pixels on the reference images stored in the database to be annotated, and for the correspondence measure to be increased or weighted more heavily if the rendered image of an annotated pixel in the comparison image matches the corresponding point in the camera image.

In order to achieve a faster finding of the position and attitude of the aircraft during its flight, provision can be made for a plurality of camera images to be taken in succession at predetermined intervals, the position and attitude of the aircraft determined at the time a camera image is taken being used as the estimated position and attitude for the next camera image.

In order to further improve the estimated position and attitude, it may be provided that the velocity and direction of flight of the aircraft are determined and the estimated position and attitude relative to the position and attitude of the aircraft at the immediately preceding shot are offset by a vector whose length corresponds to the distance traveled since the last shot at the determined speed and whose direction coincides with the direction of flight of the aircraft.
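By way of illustration (not part of the original disclosure), the iterative procedure described above can be sketched in Python as follows; the function interfaces, the pose representation and the convergence criterion are illustrative assumptions:

```python
import numpy as np

def estimate_pose(camera_image, render_fn, compare_fn, update_fn,
                  pose0, tol=1.0, max_iter=50):
    """Iteratively refine an initial pose (position + attitude) estimate.

    render_fn(pose)        -> comparison image rendered from the 3-D scene
    compare_fn(cam, comp)  -> (H, W, 2) vector field of offsets between
                              corresponding points (zero where images agree)
    update_fn(field, pose) -> new pose estimate derived from the field
    pose0                  -> rough estimate from GPS, inclinometer, compass
    """
    pose = np.asarray(pose0, dtype=float)
    for _ in range(max_iter):
        comparison = render_fn(pose)                  # create comparison image
        field = compare_fn(camera_image, comparison)  # compare with camera image
        residual = np.mean(np.linalg.norm(field, axis=-1))
        if residual < tol:                            # sufficient agreement
            break
        pose = update_fn(field, pose)                 # new estimated pose
    return pose   # last estimate is taken as the aircraft's position/attitude
```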
Furthermore, the invention relates to a position and attitude determination unit for determining the position and attitude of an aircraft within a predetermined geographical area, starting from an estimated position and estimated attitude of the aircraft obtained by estimation, in particular by means of GPS positioning, inclination measurement and compass, comprising a) a camera for creating a camera image, b) a database in which georeferenced orthonormalized reference images of the geographical area are stored, the pixels of the reference images each carrying height information indicating the geographic height of the areas imaged on the pixels, c) a rendering unit that creates three-dimensional objects of a scene from the reference images stored in the database, taking into account the height information associated with their pixels, and that creates the comparison images by means of a rendering algorithm based on the scene and the estimated position and attitude of the aircraft, d) a comparison unit which compares the comparison image with the camera image and, depending on this comparison, determines a new estimated position and a new estimated attitude, and e) a control unit for iteratively controlling the rendering unit and the comparison unit, which instructs the rendering unit to create a comparison image, forwards it to the position determination for determining a new estimated position and attitude, and feeds the position thus determined back to the rendering unit until a sufficient agreement of the comparison image with the camera image is reached, the last estimated position and attitude being available as the position and attitude of the aircraft at the output of the position and attitude determination unit.

The position and attitude determination unit according to the invention offers the particular advantage that the position and attitude of the aircraft can be determined at any time on the basis of a relatively rough estimate; the aircraft itself requires only a camera and a data processing unit with a database. The unit can be operated with relatively little resource consumption in real time and provides about 10 position and attitude determinations per second with conventional computing power.

In order to achieve a faster locating of the position and attitude of the aircraft during its flight, it can be provided that the camera is designed to record a plurality of camera images in succession at predetermined time intervals, the control unit specifying to the comparison unit and the rendering unit the position and attitude of the aircraft determined at the time of taking a camera image as the estimated position and attitude for the next camera image.

In order to further improve the estimated position and attitude, it may be provided that the control unit is supplied with the speed and direction of flight of the aircraft, and the control unit specifies to the comparison unit and the rendering unit the estimated position and attitude relative to the position and attitude of the aircraft at the immediately preceding recording, offset by a vector whose length corresponds to the distance traveled since the last shot at the determined speed and whose direction coincides with the direction of flight of the aircraft.
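A minimal sketch of this velocity-based prediction (again illustrative, not part of the disclosure), assuming a pose vector [x, y, z, roll, pitch, yaw] and a unit vector for the direction of flight:

```python
import numpy as np

def predict_next_estimate(prev_pose, speed, flight_dir, dt):
    """Offset the previously determined pose by a vector whose length is the
    distance travelled since the last shot (speed * dt) and whose direction
    is the flight direction; the attitude part is carried over unchanged.

    prev_pose  : [x, y, z, roll, pitch, yaw]  (layout is an assumption)
    flight_dir : unit vector of the direction of flight
    """
    pose = np.asarray(prev_pose, dtype=float).copy()
    pose[:3] += speed * dt * np.asarray(flight_dir, dtype=float)
    return pose
```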
Furthermore, the invention relates to an aircraft with a position and attitude determination unit according to the invention, the camera being arranged in the aircraft in such a way that it has an outward view, the camera in particular being directed at an outside window of the aircraft. This measure ensures that the ground below the aircraft is visible to the camera. In order to make the largest possible area of the ground visible to the camera and at the same time achieve a particularly advantageous determination of the position and attitude of the aircraft, it can be provided that the camera is arranged such that, in straight flight of the aircraft, it is directed downward at an angle of 10° to 50° from the horizontal.

In the following, an embodiment of the invention is illustrated in detail with reference to the accompanying drawings. Fig. 1 shows an aircraft equipped with a position and attitude determination unit according to the invention. Fig. 2 schematically shows the interconnection of the embodiment of the position and attitude determination unit. Fig. 3 shows a reference image whose surface is provided with color information, and a three-dimensional scene created from the reference image. Fig. 4 shows the comparison between a camera image and a comparison image as well as corresponding pixels in the two images. Fig. 5 shows the two images and the prominent pixels shown superimposed. Fig. 6 schematically shows a vector field formed by comparing the images shown in Fig. 4.

Fig. 1 shows an aircraft 1 which is equipped with a position and attitude determination unit 2 according to the invention. In the aircraft 1, a camera 3 is arranged. In the present case, the camera 3 is arranged so that, in straight flight of the aircraft, it is directed downward at an angle of 10° to 50° from the horizontal. The camera 3 is arranged centrally in the front region of the aircraft 1 and directed in the direction of flight of the aircraft 1. In the front area of the aircraft 1, a transparent exterior window 8 is provided, which is arranged immediately in front of the camera 3 in order to protect the camera from external effects such as wind, heat, cold and moisture.

The camera 3 records individual camera images 10 at predetermined time intervals and delivers them to a comparison unit 4 belonging to the position and attitude determination unit 2, which compares the respective recorded camera image 10 with a comparison image 11. In this preferred embodiment, the camera image 10 is rectified in the course of image preprocessing so that possible lens defects are removed as far as possible. In a further step, some features such as edge information etc. can be determined in advance. However, such image preprocessing is not absolutely necessary and can of course also be omitted. In the present case, an image preprocessing unit 14 which corrects the lens aberration is used. The creation of the comparison images 11 is described in more detail below.

Fig. 2 schematically shows the interconnection of the embodiment of the position and attitude determination unit. The position and attitude determination unit 2 has a database 6 in which georeferenced orthonormalized reference images 12 of the geographical area are stored. Depending on the size of the required area and the required resolution, the database requires corresponding storage space for storing the reference images 12.
A reference image 12 shows the area, or a part of this area, from above, all image distortions resulting from the recording having been removed from the images, so that a distortion-free image of the area is available. As the region, a limited geographical area can be selected within which the method according to the invention is performed, i.e. within which a position and attitude determination is made. The reference image 12, or a plurality of reference images 12, is stored in the database 6 and can be retrieved on request. Each pixel of the reference image 12 can be uniquely assigned to a point on the earth's surface. Each pixel is assigned color and brightness information. This color and brightness information is determined during the creation of the reference image 12 and corresponds to the color and brightness of the recorded area prevailing at the time the reference image was recorded. In addition, each pixel of the reference image 12 is associated with height information which indicates how far the point associated with the pixel is from a reference height. In the present case, the standard ellipsoid WGS1984 commonly used by GPS receivers is used.

The rendering unit 5 is designed to create comparison images 11 in the form of virtual aerial images of the respective area. For this purpose, the rendering unit 5 is supplied at its input by a control unit 7 with position and attitude information indicating from which point and with which viewing direction the rendering unit 5 is to create a virtual image of the respective area. The rendering unit 5 receives the reference images 12 together with the height information and creates a three-dimensional scene on the basis of the reference images 12. The three-dimensional scene 13 comprises a surface representation of the respective area, each pixel of the reference image 12 acting as a surface point of the three-dimensional scene 13. Fig. 3 shows a reference image 12 whose surface is provided with color information, as well as a three-dimensional scene 13 created from the reference image 12. The three-dimensional scene 13 is defined with respect to a coordinate system, two of the three coordinates of each point of the scene 13 depending on the location of the respectively associated pixel within the reference image 12, and the third coordinate of the point, in the present case the height, corresponding to the height information stored for the pixel. The rendering unit 5 creates the scene 13 and assigns to the individual points of the scene 13 those colors that the pixels of the reference image 12 respectively associated with the points have. The rendering unit 5 receives an estimate of the current position and attitude at its input and, based on the created scene 13 and the position and attitude information, creates a virtual three-dimensional comparison image 11 that shows the respective scene as it is visible from the position and attitude specified in the position and attitude information. For this purpose, the reference images and height information relevant for the estimated position and viewing direction are loaded from the database 6. These are used to create a perspective-correct mapping ("rendering").
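The construction of scene 13 from a reference image and its per-pixel height information can be sketched as follows (an illustrative simplification assuming a regular metric grid; variable names are not from the disclosure):

```python
import numpy as np

def build_scene(reference_rgb, height_map, origin, gsd):
    """Create coloured 3-D surface points of scene 13 from a reference image.

    reference_rgb : (H, W, 3) colour/brightness per pixel of reference image 12
    height_map    : (H, W) height of each pixel above the WGS1984 ellipsoid
    origin        : (x0, y0) world coordinates of pixel (0, 0)
    gsd           : ground sampling distance in metres per pixel
    """
    h, w = height_map.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Two coordinates follow from the pixel location, the third is the height.
    points = np.stack([origin[0] + xs * gsd,
                       origin[1] + ys * gsd,
                       height_map], axis=-1).reshape(-1, 3)
    colors = reference_rgb.reshape(-1, 3)   # colour taken from the same pixel
    return points, colors
```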
This view represents a virtual camera image, hereinafter referred to as comparison image 11, which a real camera would have taken if it had been placed at the previously defined location at the time the scene 13 was recorded.

The reference images 12 and height data are stored, in this preferred embodiment of the invention, according to the folder structure defined in the TMS (Tile Map Service) specification. Alternatively, other formats could be used. As a tool for the rendering unit, the terrain rendering toolkit "osgEarth" is used in this preferred embodiment of the invention. TMS or alternative formats allow access to georeferenced map material. Alternatively, of course, other rendering engines may be used by the rendering unit 5 for creating the comparison image 11.

Fig. 4 shows the comparison between a camera image 10 and a comparison image 11. The comparison unit 4 is supplied, on the one hand, with a camera image 10 and, on the other hand, with a comparison image 11. The comparison unit 4 compares the camera image 10 with the comparison image 11. In this preferred embodiment of the invention, matches of the camera image 10 with the respective comparison image 11 are sought. For this purpose, corresponding pixels are searched for in both images 10, 11, i.e. an attempt is made to find pairs 15 of corresponding pixels, one of which is located in the camera image 10 and the other in the comparison image 11, both pixels having the same characteristics in each case. Pixels can thus be considered as corresponding if they have common features or a similar point neighborhood. A particular challenge here is that the images 10, 11 come from different sources: the first image was created by the camera 3, while the comparison image 11 has been produced artificially by means of the rendering unit 5. In order to cope with this and to find corresponding pixels in the images 10, 11, relevant image features are determined for the individual pixels of the two images 10, 11, the image features of a pixel from one of the images 10, 11 being compared in each case with the image features of a pixel from the other of the two images 10, 11. In order to allow corresponding pairs between images from different sources, relevant image features are based less on local pixel intensity and rather on the presence of certain structures such as edges, corners or other structure descriptors, e.g. normalized orientation histograms. In order to establish an actual correspondence between a (partial) set of image features, individual image features are compared directly.

During preprocessing in the preprocessing unit, the camera images are already rectified, the rectification compensating for the distortion created by the optics of the camera. Furthermore, in the course of the preprocessing, certain features of the camera images and the comparison images 11 can be emphasized which remain unchanged over time and are not subject to seasonal fluctuations, unlike, for example, the color of the ground, which can vary greatly depending on season, snow conditions, etc. For detecting edges, common edge extraction methods such as Canny come into consideration.
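A sketch of such preprocessing with OpenCV (illustrative only; the intrinsic matrix and distortion coefficients are hypothetical values that would in practice come from a prior calibration of camera 3):

```python
import cv2
import numpy as np

# Hypothetical intrinsics and distortion coefficients; real values would be
# obtained from a calibration procedure (e.g. cv2.calibrateCamera).
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0,    0.0,   1.0]])
dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])   # k1, k2, p1, p2, k3

def preprocess(camera_image):
    """Rectify the camera image (remove lens distortion, as done by the
    preprocessing unit 14) and extract edge information with Canny."""
    rectified = cv2.undistort(camera_image, K, dist)
    gray = cv2.cvtColor(rectified, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)   # edges as seasonal-invariant structure
    return rectified, edges
```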
For detecting features, common feature extraction techniques such as SIFT or SURF come into consideration. Alternatively or additionally, a texture analysis can be performed. In the context of image processing, the term matching is commonly used for finding correspondences. When matching features, for example, SIFT features or vectorized image information are compared as such. In the specific case, when examining the correspondence between a pixel of the camera image 10 and a pixel of the comparison image 11, the individual pixels and/or image neighborhoods of these pixels are examined for agreement and a correspondence measure is determined, a correspondence being recognized when the correspondence measure exceeds a predetermined threshold. In a particular embodiment of the invention, certain pixels on the reference images 12 stored in the database 6 are provided with annotations. Corresponding pixels are searched for these pixels, and a possible match is weighted more heavily if the rendered image of an annotated pixel in the comparison image 11 matches the corresponding pixel in the camera image 10.

Fig. 5 shows the two images 10, 11 and the prominent pixels shown superimposed. For all pairs of mutually corresponding pixels, the difference vector is formed in each case, i.e. a direction vector is determined between the corresponding points. Starting from the distribution of the direction vectors in the comparison image 11 or in the camera image 10, a vector field, shown in Fig. 6, is determined and created by interpolation. The vector field can also be created by area-based matching, which applies certain correlation methods to the pixels of the images. The result of a matching process is, on the one hand, an overall measure for the degree of agreement and, on the other hand, information about the extent and direction of the deviations in different parts of the image. To determine such vector fields, among others, the SIFT flow method can be used, which is described in: C. Liu, J. Yuen, A. Torralba, J. Sivic, and W. T. Freeman, "SIFT flow: dense correspondence across different scenes", European Conference on Computer Vision (ECCV) 2008, http://people.csail.mit.edu/celiu/ECCV2008/SIFTflow.pdf.

If the vectors in this vector field are equal to zero over the entire image 10, 11, the positions of the pixels of all pairs of prominent pixels coincide, and the images 10, 11 are in this case considered to be completely coincident. The estimated position and attitude coincide with the position and attitude of the aircraft when the sum of the vector lengths is equal to zero. The position and attitude of the camera 3 at the time of taking the camera image 10 is thus equated with the position and attitude from which the comparison image has been created. If required, the position and attitude of another object arranged in or on the aircraft 1 can be determined from the position and attitude of the camera 3 by coordinate transformation, in particular by translation and rotation. If, on the other hand, the vector field has vectors which are nonzero, the associated prominent points differ from each other. A new comparison image 11 based on a position or attitude correction is then created so that the comparison image 11 is further approximated to the camera image 10.
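A simplified sketch of the correspondence search and the interpolation of the vector field, using OpenCV's SIFT implementation (one of the descriptors named above) and SciPy's griddata in place of the toolchain actually used; the ratio test and grid step are illustrative parameters:

```python
import cv2
import numpy as np
from scipy.interpolate import griddata

def corresponding_pairs(camera_image, comparison_image, ratio=0.75):
    """Find pairs of corresponding points via SIFT features; Lowe's ratio
    test serves as the threshold on the correspondence measure."""
    g1 = cv2.cvtColor(camera_image, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(comparison_image, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(g1, None)
    kp2, des2 = sift.detectAndCompute(g2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    return np.array([(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt)
                     for m, n in matches if m.distance < ratio * n.distance])

def vector_field(pairs, shape, step=16):
    """Interpolate a dense vector field from the sparse difference vectors;
    a field of zeros means the two images coincide."""
    src = pairs[:, 0]                   # points in the camera image
    vec = pairs[:, 1] - pairs[:, 0]     # difference vector of each pair
    ys, xs = np.mgrid[0:shape[0]:step, 0:shape[1]:step]
    grid = np.stack([xs.ravel(), ys.ravel()], axis=-1)
    fx = griddata(src, vec[:, 0], grid, method='linear', fill_value=0.0)
    fy = griddata(src, vec[:, 1], grid, method='linear', fill_value=0.0)
    return np.stack([fx, fy], axis=-1).reshape(xs.shape + (2,))
```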
Starting from the vector field, a position and attitude correction is performed; the rendering unit 5 creates a new comparison image 11 based on the new position and attitude, which is again compared with the camera image 10. In this preferred embodiment of the invention, the re-estimation of the position based on the vector field of positional shifts is established by means of an error measure. The aim of this procedure is to use the corresponding points in a pair of images, consisting of a camera image and an artificially created image, to calculate a position estimate for the real camera.

The re-estimation of the position and attitude from the previously known estimated position and attitude can be made as follows: starting from an equal number of correspondence parameters in the form of vectors in the vector field, it is possible to approximately create a model for the change of the camera position. Such a model can be created, for example, using artificially rendered, altered image pairs for which the change of position is known. In the course of the modeling, individual artificially produced images are compared with each other, the respective camera position and attitude being known in advance. By means of a matching method, mutually corresponding points which map the same virtual point are searched for, points corresponding to one another being combined into point pairs. As described above, a vector field (Fig. 6) is created which indicates the offset between the individual points of a pair of points. Subsequently, the vector field is fed to a modeling step together with the two camera positions and attitudes, a model function being sought with which, on the basis of a vector field and a first camera position and attitude, a second camera position and attitude can be determined. This modeling can be done, for example, by a multivariate linear model, it finally being possible to conclude from a first camera position and the vector field to a second camera position. A suitable modeling technique is, e.g., linear regression, see Schmidt, Trenkler: Introduction to Modern Matrix Algebra: With Applications in Statistics, Springer, ISBN 978-3-540-33007-3. The linear regression provides a matrix that allows an estimated change of position and attitude to be calculated directly from the vector field by matrix multiplication. The position of the virtual camera can then be adjusted by means of the estimated change, the position being adjusted only slightly in the direction of the estimated change in order to prevent the behavior from becoming unstable. In further iteration steps, the change in position is estimated and the position of the virtual camera adjusted alternately, until the position of the virtual camera is adapted as well as possible to that of the real camera.

The control of this iterative process is directed by a control unit 7 (Fig. 2). The comparison unit 4 indicates the respective position and attitude correction at its output and makes it available to the control unit 7.
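The regression step can be sketched as follows (illustrative; a 6-component pose vector is assumed, and the training data would come from artificially rendered image pairs with known pose changes, as described above):

```python
import numpy as np

def fit_pose_model(fields, pose_deltas):
    """Fit the multivariate linear model: flattened vector fields from
    rendered image pairs with known pose changes are regressed onto the
    6-DoF pose change (x, y, z, roll, pitch, yaw)."""
    X = np.stack([f.ravel() for f in fields])     # (n_samples, n_features)
    Y = np.stack(pose_deltas)                     # (n_samples, 6)
    M, *_ = np.linalg.lstsq(X, Y, rcond=None)     # regression matrix
    return M                                      # (n_features, 6)

def update_pose(pose, field, M, damping=0.25):
    """One iteration step: obtain the estimated pose change directly by
    matrix multiplication and apply only a fraction of it, so that the
    iteration does not become unstable."""
    delta = field.ravel() @ M
    return np.asarray(pose, dtype=float) + damping * delta
```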
The control unit 7 changes the position and attitude within the scene 13 from which the comparison image 11 is created, and forwards the position and attitude information thus determined to the rendering unit 5, which creates a new comparison image 11.

If, in an iterative process, the position of the aircraft 1 is now to be determined for the time at which the camera image 10 was taken, an estimated position and attitude is indicated in a first step. This estimated position can be determined, for example, by means of a GPS device, the estimated attitude by means of a compass and an inclinometer and/or gyroscope. The estimated position and attitude need not be very precise; rather, this information need only be sufficiently close to the actual position of the aircraft 1 that some common object points of the camera image 10 and of a comparison image 11 based on the estimated position and attitude are visible, so that some point correspondences can be found. By means of the iterative algorithm described above, the actual position and attitude of the aircraft 1 can be determined by incrementally varying the estimated position and attitude of the aircraft 1. The determined actual position and attitude of the aircraft 1 is present at the output of the comparison unit 4 at the end of the iteration and is available for applications within the aircraft 1. For example, the available position and attitude information can be used to check the functioning of position- and attitude-indicating instruments. Furthermore, this information can also be used for automated flight maneuvers such as, for example, visually controlled landing.

In the following, a second embodiment of the invention is presented, which substantially corresponds to the first illustrated embodiment. In particular, the differences from the first embodiment of the invention are shown. In this embodiment of the invention, it is considered that a plurality of shots are taken in short succession. In a typical flight, the aircraft 1 first lifts off a runway and climbs. While the camera arranged in the aircraft 1 still perceives only a very small part of the ground on the runway, so that a localization of the aircraft 1 is not yet possible, the area of the ground below the aircraft 1 detected by the camera 3 becomes ever larger, so that after some time a detection of the position and attitude of the aircraft 1 on the basis of the reference images 12 stored in the database 6 becomes possible. In order to detect an initial position and attitude of the aircraft 1, the method illustrated in the first embodiment of the invention is applied. After the position and attitude of the aircraft 1 have been detected, the next camera image 10 taken by the camera 3 is transmitted to the comparison unit 4. Since the camera images 10 are typically recorded by the camera 3 in rapid succession, the position and attitude of the aircraft 1 do not change significantly between the taking of two camera images 10. In order to obtain the best possible estimate for the position and attitude determination of the second camera image 10, it is advantageous to use the position and attitude determined for the first camera image 10 as the estimated value.
This estimated value can be improved by offsetting the estimated position and attitude of the aircraft used for the creation of the comparison image 11, relative to the position and attitude determined for the immediately preceding camera image 10, by a vector whose length corresponds to the distance traveled since the last acquisition at the determined speed and whose direction coincides with the direction of flight of the aircraft. In connection with this procedure, one says that after a first detection the system locks onto the aircraft, the detection of the position and attitude of the aircraft 1 at a subsequent time then being possible with much less effort. If a match can no longer be found, the method according to the first embodiment of the invention can be repeated until a position and attitude determination is possible again. Here, a new detection can take place either on the basis of continuously determined GPS data or on the basis of the position and attitude determined last.

A third embodiment of the invention provides that a plurality of cameras 3 are arranged on the same aircraft 1, which are directed from different positions on the aircraft 1 - in normal attitude - at the ground below the aircraft 1. Each camera 3 is assigned its own position and attitude determination unit 2, which detects the position and attitude of the respective camera 3. If the positions and orientations of the individual cameras relative to one another are known, the position and attitude of the entire aircraft 1 can be better estimated by averaging the individual results. In this context, it is particularly advantageous to use for the averaging exclusively those cameras 3 with whose camera images 10 valid position and attitude information could be determined. Such a system with multiple cameras 3 has the particular advantage that a position and attitude detection is still possible even if, for example, a part of the ground below the aircraft 1 is covered by fog or clouds.
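A minimal sketch of the averaging over several cameras (illustrative; each per-camera result is assumed to have already been transformed into a common aircraft-fixed frame, and the naive averaging of attitude angles is a simplification):

```python
import numpy as np

def fuse_camera_poses(poses, valid):
    """Average the aircraft poses obtained from the individual cameras,
    using only those cameras whose images yielded valid position and
    attitude information."""
    good = [np.asarray(p, dtype=float) for p, ok in zip(poses, valid) if ok]
    return np.mean(good, axis=0) if good else None
```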
Claims:
Claims (9)
[1] 1. A method for determining the position and attitude of an aircraft (1), in particular an airplane (1), within a predetermined geographical area, starting from an estimated position and estimated attitude of the aircraft (1) predetermined by estimation, in particular by means of GPS positioning, inclination measurement and compass, a) wherein at least one camera image (10) of the ground below the aircraft (1) is recorded with a camera (2) arranged in the aircraft (1), b) wherein georeferenced orthonormalized reference images (12) of the geographical area are provided in a database (6), the pixels of the reference images (12) each carrying height information indicating the geographic height of the areas imaged on the pixels, c) wherein a comparison image (11) is created in that three-dimensional objects of a scene (13) are created from the reference images (12) stored in the database (6), taking into account the height information associated with their pixels, and wherein the comparison images (11) are created by means of a rendering algorithm based on the scene (13) and the estimated position and attitude of the aircraft (1), d) wherein the comparison image (11) is compared with the camera image (10), and depending on this comparison a new estimated position and a new estimated attitude are determined, e) wherein steps c) and d) are repeated and the estimated position and attitude are changed until the comparison in step d) yields a sufficient agreement of the comparison image (11) with the camera image (10), and the last estimated position and attitude are regarded as the position and attitude of the aircraft (1).
[2] 2. The method according to claim 1, characterized in that, when verifying the agreement of the camera image (10) with the respective comparison image (11), pairs of corresponding points are sought, one of which lies in the camera image (10) and the other in the comparison image (11), both points having the same features in each case; that the corresponding points are compared with each other and a direction vector between the corresponding points is determined; that, starting from the resulting direction vectors and the respectively assigned points in the comparison image (11) or camera image (10), a vector field is created by interpolation; and that the vector field is used to redetermine the estimated position and attitude.
[3] 3. The method according to any one of the preceding claims, characterized in that, when examining the correspondence between a pixel of the camera image (10) and a pixel of the comparison image (11), the individual points and/or image neighborhoods of these pixels are examined for agreement and a correspondence measure is determined, a correspondence being deemed recognized if the correspondence measure exceeds a predetermined threshold, wherein in particular certain pixels on the reference images (12) stored in the database (6) are annotated and the correspondence measure is increased or weighted more heavily when the rendered image of an annotated pixel in the comparison image matches the corresponding point in the camera image (10).
[4] 4. The method according to any one of the preceding claims, characterized in that a plurality of camera images (10) is taken in succession at predetermined time intervals, the position and attitude of the aircraft (1) determined at the time of recording a camera image (10) being used as the estimated position and attitude for the next camera image (10), wherein in particular the speed and flight direction of the aircraft are determined and the estimated position and attitude relative to the position and attitude of the aircraft (1) at the immediately preceding shot are offset by a vector whose length corresponds to the distance traveled since the last shot at the determined speed and whose direction coincides with the direction of flight of the aircraft (1).
[5] 5. A position and attitude determination unit (2) for determining the position and attitude of an aircraft (1) within a predetermined geographical area, starting from an estimated position and estimated attitude of the aircraft (1) predetermined by estimation, in particular by means of GPS positioning, inclination measurement and compass, comprising a) a camera (3) for creating a camera image (10), b) a database (6) in which georeferenced orthonormalized reference images (12) of the geographical area are stored, the pixels of the reference images (12) each carrying height information indicating the geographic height of the areas imaged on the pixels, c) a rendering unit (5) which creates three-dimensional objects of a scene (13) from the reference images (12) stored in the database (6), taking into account the height information associated with their pixels, and which creates the comparison images (11) by means of a rendering algorithm based on the scene (13) and the estimated position and attitude of the aircraft, d) a comparison unit (4) which compares the comparison image (11) with the camera image (10) and, depending on this comparison, determines a new estimated position and a new estimated attitude, e) a control unit (7) for iteratively controlling the rendering unit (5) and the comparison unit (4), which instructs the rendering unit (5) to create a comparison image (11), forwards it to the position determination for determining a new estimated position and a new estimated attitude, and feeds the position thus determined back to the rendering unit (5) until a sufficient agreement of the comparison image (11) with the camera image (10) exists, the last estimated position and attitude being available as the position and attitude of the aircraft (1) at the output of the position and attitude determination unit.
[6] 6. The position and attitude determination unit (2) according to claim 5, characterized in that the camera (3) is designed to record a plurality of camera images (10) in succession at predetermined time intervals, the control unit (7) specifying to the comparison unit (4) and the rendering unit (5) the position and attitude of the aircraft (1) determined at the time of taking a camera image (10) as the estimated position and attitude for the next camera image (10).
[7] 7. The position and attitude determination unit according to claim 6, characterized in that the control unit (7) is supplied with the speed and direction of flight of the aircraft (1), and the control unit (7) specifies to the comparison unit (4) and the rendering unit (5) the estimated position and attitude relative to the position and attitude of the aircraft (1) at the immediately preceding recording, offset by a vector whose length corresponds to the distance traveled since the last shot at the determined speed and whose direction coincides with the direction of flight of the aircraft.
[8] 8. An aircraft (1) with a position and attitude determination unit (2) according to one of claims 5 to 7, characterized in that the camera (3) is arranged in the aircraft (1) so that it has an outward view, the camera (3) in particular being directed at an outer window (8) of the aircraft (1).
[9] 9. The aircraft (1) according to claim 8, characterized in that the camera (3) is arranged such that, in straight flight of the aircraft (1), it is directed downward at an angle of 10° to 50° from the horizontal.