APPARATUS AND METHOD FOR GUIDING A MOBILE PLATFORM WITHIN AN ENVIRONMENT
Patent abstract:
A method and apparatus for guiding a mobile platform within an environment can be provided. A number of first type data streams and a number of second type data streams can be generated using a plurality of data systems. A probability distribution can be applied to each of the number of second type data streams to form a number of modified data streams. The number of first type data streams and the number of modified data streams can be merged to generate a position estimate with a desired level of accuracy for the mobile platform with respect to an environment around the mobile platform.

Publication number: BR102015007157B1
Application number: R102015007157-4
Filing date: 2015-03-30
Publication date: 2021-07-27
Inventors: Peter F. Trautman; Hui Li; Robert P. Higgins
Applicant: The Boeing Company
IPC primary class:
Patent description:
BACKGROUND INFORMATION

1. Field:

[001] The present disclosure is generally related to identifying the position of a mobile platform in an environment. More particularly, the present disclosure relates to a method and apparatus for forming and merging data streams, each including an uncertainty measure, to generate a position estimate for the mobile platform within the environment.

2. Background:

[002] In some situations, it may be desirable to have a mobile robot that can move freely within an environment just as a human would. Physical reference points, such as paint, tape, or magnets, which typically can be used to help a mobile robot move within an environment, can constrain a mobile robot to follow only predefined routes. Furthermore, installing these types of physical reference points can be more time-consuming and expensive than desired. To move more freely within an environment, a mobile robot may need to perform localization, which includes identifying the mobile robot's position within the environment. As used herein, a "position" includes a location, an orientation, or both with respect to a reference coordinate system.

[003] A mobile robot can use an external sensor system to perform localization. However, in some cases, the line of sight between a mobile robot and the external sensor system may be obstructed by other objects, robots, and/or people within the industrial environment. As an example, in an aircraft industrial environment, line of sight can be lost when the mobile robot operates under an aircraft wing, inside the wing, in the vicinity of factory objects such as cranes or columns, and/or in restricted areas. Once line of sight is lost, the mobile robot may no longer receive position updates and may need to stop operations until line of sight has been regained. Without localization, the mobile robot is not able to navigate the environment as precisely as desired.

[004] Furthermore, in a dynamic environment, cars, planes, workstations, vehicles, equipment platforms, other types of devices, human operators, or some combination thereof may move. Consequently, a mobile robot may not be able to rely solely on its surroundings to move through this type of environment, or through a cluttered, unsegmented, or inefficiently structured environment. Currently available mobile robots may not operate at the desired levels of performance and efficiency, or maneuver around human operators as safely as desired, in these different types of environments.

[005] Additionally, in some cases, the equipment or devices used to perform localization may be more expensive, larger, or heavier than desired. In certain situations, the processing required to perform localization with a desired level of precision may take longer or require more processing resources than desired. Therefore, it would be desirable to have a method and apparatus that take into account at least some of the issues discussed above, as well as other possible issues.

SUMMARY

[006] In an illustrative embodiment, an apparatus may include a plurality of data systems, a modifier, and a position estimator. The plurality of data systems can be configured to generate a plurality of data streams. The plurality of data streams may include a number of first type data streams and a number of second type data streams. The modifier can be configured to apply a probability distribution to each of the number of second type data streams to form a number of modified data streams.
The position estimator can be located onboard a mobile platform and can be configured to receive and merge the number of first type data streams and the number of modified data streams to generate a position estimate with a desired level of accuracy for the mobile platform with respect to an environment around the mobile platform.

[007] In another illustrative embodiment, a mobile platform may include a base, a controller associated with the base, and a motion system associated with the base. The controller may be configured to receive data streams from a plurality of data systems, in which the data streams may include a number of first type data streams and a number of second type data streams. The controller can include a modifier and a position estimator. The modifier can be configured to apply a probability distribution to each of the number of second type data streams to form a number of modified data streams. The position estimator can be configured to receive the number of first type data streams and the number of modified data streams. The position estimator can be further configured to merge the plurality of data streams together to generate a position estimate with a desired level of accuracy for the mobile platform with respect to an environment around the mobile platform. The motion system can be configured to be controlled by the controller, based on the position estimate, to move the mobile platform within the environment.

[008] In yet another illustrative embodiment, a method for guiding a mobile platform within an environment may be provided. A number of first type data streams and a number of second type data streams can be generated using a plurality of data systems. A probability distribution can be applied to each of the number of second type data streams to form a number of modified data streams. The number of first type data streams and the number of modified data streams can be merged to generate a position estimate with a desired level of accuracy for the mobile platform with respect to an environment around the mobile platform.

[009] The features and functions may be achieved independently in various embodiments of the present disclosure or may be combined in still other embodiments, in which further details can be seen with reference to the following description and drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] The novel features believed characteristic of the illustrative embodiments are set forth in the appended claims.
The illustrative embodiments, however, as well as a preferred mode of use, further objectives, and features thereof, will best be understood by reference to the following detailed description of an illustrative embodiment of the present disclosure when read in conjunction with the accompanying drawings, in which:

[0011] Figure 1 is an illustration of an environment in the form of a block diagram in accordance with an illustrative embodiment;
[0012] Figure 2 is an illustration of a plurality of data systems in the form of a block diagram in accordance with an illustrative embodiment;
[0013] Figure 3 is an illustration of the components of a plurality of data systems that are located onboard and the components of a plurality of data systems that are located externally in accordance with an illustrative embodiment;
[0014] Figure 4 is an illustration of an industrial environment in accordance with an illustrative embodiment;
[0015] Figure 5 is an illustration of a mobile robot in accordance with an illustrative embodiment;
[0016] Figure 6 is an illustration of a process for generating a position estimate for a mobile platform in an environment in the form of a flowchart in accordance with an illustrative embodiment;
[0017] Figure 7 is an illustration of a process for guiding a mobile robot within an industrial environment in the form of a flowchart in accordance with an illustrative embodiment;
[0018] Figure 8 is an illustration of a data processing system in the form of a block diagram in accordance with an illustrative embodiment;
[0019] Figure 9 is an illustration of an aircraft manufacturing and service method in the form of a block diagram in accordance with an illustrative embodiment; and
[0020] Figure 10 is an illustration of an aircraft in the form of a block diagram in accordance with an illustrative embodiment.

DETAILED DESCRIPTION

[0021] The illustrative embodiments recognize and take into account different considerations. For example, the illustrative embodiments recognize and take into account that it may be desirable to have a method and apparatus capable of more accurately and quickly performing localization for a number of mobile platforms within an industrial environment. Furthermore, the illustrative embodiments recognize and take into account that it may be desirable to have a method and apparatus for generating a position estimate for a mobile robot within an environment onboard the mobile robot.

[0022] Thus, the illustrative embodiments provide a method and apparatus for generating a position estimate for a mobile robot onboard the mobile robot using sensor fusion. The method and apparatus provided by the illustrative embodiments can reduce the time needed to generate a position estimate, while increasing the accuracy of the estimate. Furthermore, the solution provided by the illustrative embodiments may be simpler and more economical than some currently available solutions.

[0023] Referring now to the figures and, in particular, to Figure 1, an illustration of an environment is depicted in accordance with an illustrative embodiment. In this illustrative example, environment 100 can be any environment in which multiple mobile platforms 102 can be used. As used herein, "multiple" items may be one or more items. In this way, multiple mobile platforms 102 can include one or more mobile platforms.

[0024] In an illustrative example, environment 100 can take the form of industrial environment 101, in which object 103 is being manufactured. Object 103 can take many different forms.
For example, without limitation, object 103 may take the form of a door, a skin panel, a wing for an aircraft, a fuselage for an aircraft, a structural component for a building, an assembly of components, or some other type of object.

[0025] As depicted, mobile platform 104 can be an example of one implementation for a mobile platform in the multiple mobile platforms 102. In this illustrative example, mobile platform 104 can take the form of mobile robot 106. Of course, depending on the implementation, mobile platform 104 can take the form of any type of platform, structure, device, or object capable of moving at least partially autonomously within environment 100.

[0026] As depicted, mobile robot 106 may include base 108, motion system 110, a number of tools 112, and controller 114. Motion system 110, the number of tools 112, and controller 114 may be associated with base 108. As used herein, when a component is "associated" with another component, the association is a physical association in the depicted examples.

[0027] For example, without limitation, a first component, such as motion system 110, may be considered to be associated with a second component, such as base 108, by being secured to the second component, connected to the second component, mounted to the second component, welded to the second component, fastened to the second component, and/or connected to the second component in some other suitable manner. The first component can also be connected to the second component using a third component. Furthermore, the first component can be considered to be associated with the second component by being formed as part of, and/or as an extension of, the second component.

[0028] Motion system 110 can be used to move mobile robot 106 within environment 100. For example, without limitation, motion system 110 can be used to move mobile robot 106 within industrial environment 101. Depending on the implementation, motion system 110 may include at least one of a plurality of wheels, a plurality of rollers, a plurality of legs, a plurality of holonomic wheels, or other types of devices capable of providing movement.

[0029] As used herein, the phrase "at least one of," when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one of the items in the list may be needed. The item can be a particular object, thing, or category. In other words, "at least one of" means that any combination of items or number of items can be used from the list, but not all of the items in the list may be required.

[0030] For example, "at least one of item A, item B, and item C" can mean item A; item A and item B; item B; item A, item B, and item C; or item B and item C. In some cases, "at least one of item A, item B, and item C" can mean, for example, without limitation, two of item A, one of item B, and ten of item C; four of item B and seven of item C; or some other suitable combination.

[0031] In this illustrative example, the number of tools 112 can be used to perform a number of operations 116 within environment 100. At least one of the number of operations 116 can be performed on object 103. The number of operations 116 can include, for example, without limitation, at least one of a drilling operation, a fastening operation, a sanding operation, a painting operation, a machining operation, a testing operation, an imaging operation, or some other type of operation.
In this way, the number of tools 112 may include, for example, without limitation, at least one of a drilling device, a fastening device, a sanding tool, a painting tool, a fluid dispensing system, a sealant application device, a machining device, a milling device, a testing system, an imaging device, a scanner, a marker, a pen, a label applicator, or some other type of tool.

[0032] In this illustrative example, controller 114 may be configured to control the operation of at least the number of tools 112. In addition, controller 114 may be configured to control motion system 110. In particular, controller 114 may control motion system 110 to move mobile robot 106 along path 117 in environment 100. Path 117 may be at least partially along floor 115 of environment 100. As used herein, floor 115 may include a floor surface, a surface on a bridge, a surface formed by one or more pallets, a platform surface, an elevator floor, a conveyor belt floor, some other type of surface, or some combination thereof.

[0033] Path 117 can be a dynamically updated path, since controller 114 can update path 117 as mobile robot 106 moves through environment 100. Controller 114 can update path 117 as mobile robot 106 moves through environment 100 to help mobile robot 106, for example, without limitation, do at least one of avoid obstacles, move around objects that have been recently placed or moved within environment 100, respond to changes in the number of operations 116 to be performed by mobile robot 106, maneuver around human operators who are located or moving within environment 100, or respond to some other type of new or changed circumstance within environment 100. Controller 114 can use localization to help navigate mobile robot 106.

[0034] As depicted, controller 114 may include, for example, without limitation, position estimator 118 and modifier 120. Position estimator 118 may generate position estimate 122 for mobile robot 106. Position estimate 122 may be an estimate of the position of mobile robot 106 within environment 100. The position of mobile robot 106, as used herein, may be comprised of at least one of a location of mobile robot 106 or an orientation of mobile robot 106 with respect to reference coordinate system 124 for environment 100. Thus, position estimate 122 can be comprised of at least one of location estimate 121 and orientation estimate 123 of mobile robot 106 with respect to reference coordinate system 124 for environment 100.

[0035] Mobile robot 106 can be configured to move with six degrees of freedom in environment 100. Thus, position estimate 122 can be a six degrees of freedom (6DoF) position estimate for mobile robot 106. In some cases, position estimate 122 may be referred to as an estimate of the position of mobile robot 106 in the six spatial degrees of freedom of mobile robot 106.

[0036] In this illustrative example, position estimator 118 may merge the plurality of data streams 126 to generate position estimate 122. At least a portion of the plurality of data streams 126 may be received from the plurality of data systems 128. As used herein, a data system in the plurality of data systems 128 can include multiple sensor systems, multiple processor units, or some combination thereof. As used herein, a "sensor system" can be comprised of any number of sensor devices, active devices, passive devices, or some combination thereof.
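To make the preceding definitions concrete, the sketch below models one estimate in a data stream as a small record in Python. The field names, and the optional covariance slot that anticipates the two stream types introduced in the following paragraphs, are illustrative assumptions rather than structures defined in this disclosure.

```python
# Minimal sketch of one estimate in a data stream: a six degrees of
# freedom (6DoF) position (x, y, z, roll, pitch, yaw) plus a timestamp.
# The optional covariance distinguishes estimates that carry an
# uncertainty measure from those that do not; all names here are
# illustrative assumptions.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class PoseEstimate:
    xyz: np.ndarray                           # location in the reference coordinate system
    rpy: np.ndarray                           # roll, pitch, yaw orientation
    stamp: float                              # time at which the estimate applies
    covariance: Optional[np.ndarray] = None   # 6x6 uncertainty measure, if provided

# Example: an estimate carrying a 6x6 covariance as its uncertainty measure.
est = PoseEstimate(np.zeros(3), np.zeros(3), stamp=0.0, covariance=np.eye(6) * 0.01)
print(est.covariance is not None)
```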
[0037] As used herein, "merging" the plurality of data streams 126 may mean combining and processing the different data streams in the plurality of data streams 126 to generate a single position estimate 122. Each of the plurality of data streams 126 may be comprised of estimates generated over time.

[0038] Data stream 125 is an example of one of the plurality of data streams 126. Data stream 125 may be generated by one of the plurality of data systems 128. In this illustrative example, data stream 125 may be comprised of estimates generated over time.

[0039] As used herein, an "estimate" can be an estimate of the six degrees of freedom position of mobile robot 106. This estimate can be generated based on measurements generated either at a single point in time or over a period of time. In some cases, the estimate can also include metadata. In some illustrative examples, the estimate may be referred to as an output data point, such that data stream 125 can be comprised of a plurality of output data points.

[0040] Position estimator 118 can use the plurality of data streams 126 and a Bayesian estimation algorithm to generate position estimate 122. In particular, position estimator 118 can use the following Bayesian estimation equation:

p(x_{t+1} \mid z_{1:t+1}) \propto \left[ \prod_{i=1}^{n} p\left(z^{i}_{t+1} \mid x_{t+1}\right) \right] \int p(x_{t+1} \mid x_t) \, p(x_t \mid z_{1:t}) \, dx_t \qquad (1)

where t is time, x_{t+1} is the position of mobile robot 106 at time t+1, z_{1:t+1} is the collection of all estimates in the plurality of data streams 126 up to time t+1, p(x_{t+1} \mid z_{1:t+1}) is the probability of x_{t+1} given z_{1:t+1}, x_t is the position of mobile robot 106 at time t, n is the total number of data systems in the plurality of data systems 128, z^{1}_{t+1} is the estimate generated by a first data system in the plurality of data systems 128 at time t+1, and z^{n}_{t+1} is the estimate generated by the nth data system in the plurality of data systems 128 at time t+1. The product for a given time t+1 only includes the data systems that provided estimates at time t+1.

[0041] The error in position estimator 118 can be further reduced by increasing the number of data streams in the plurality of data streams 126 used to generate position estimate 122, and thereby the number of data systems in the plurality of data systems 128. In other words, as the number of data streams in the plurality of data streams 126 increases, the error in position estimate 122 generated by position estimator 118 decreases.

[0042] Using Bayesian estimation techniques to generate position estimate 122 may require that all data used to generate position estimate 122 be probabilistic. In other words, all data may need to include randomness or uncertainty.

[0043] However, data streams 127 generated by the plurality of data systems 128 may include a number of first type data streams 132 and a number of second type data streams 130. A "first type data stream," such as one of the number of first type data streams 132, may be comprised of data points in which each data point includes, or is coupled with, an uncertainty measure based on some probability distribution. In particular, a first type data stream may be generated by a probabilistic system, model, or algorithm in which the output, or the manner in which an output is generated for a given input, takes into account randomness or a degree of uncertainty. In this way, each of the number of first type data streams 132 may be referred to as a probabilistic data stream in some illustrative examples.

[0044] As used herein, a "second type data stream," such as one of the number of second type data streams 130, may be comprised of data points in which each data point does not include, or is not coupled with, an uncertainty measure. For example, without limitation, the data point may only include a single data value. In some illustrative examples, this second type of data stream can be referred to as a pseudo-deterministic data stream.
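As a minimal sketch of how the update in equation (1) could be carried out under Gaussian assumptions, the code below treats each reporting data system as contributing one Gaussian likelihood and approximates the motion term p(x_{t+1} | x_t) by added process noise. The function names, the process noise Q, and the example covariances are illustrative assumptions, not values taken from this disclosure.

```python
# Minimal sketch of the fusion in equation (1) under Gaussian assumptions.
# Each data stream i supplies an estimate z_i with covariance R_i; the
# prediction term p(x_{t+1} | x_t) is approximated by adding process noise Q.
import numpy as np

def predict(mean, cov, motion_delta, Q):
    """Propagate the previous posterior through the motion term."""
    return mean + motion_delta, cov + Q

def fuse_streams(prior_mean, prior_cov, estimates):
    """Multiply the Gaussian prior by one Gaussian likelihood per stream.

    estimates: list of (z_i, R_i) pairs, one per data system that
    reported at this time step (streams may drop in and out).
    """
    info = np.linalg.inv(prior_cov)            # precision of the prior
    info_mean = info @ prior_mean
    for z, R in estimates:
        R_inv = np.linalg.inv(R)
        info += R_inv                          # product of Gaussians adds precisions
        info_mean += R_inv @ z
    cov = np.linalg.inv(info)
    return cov @ info_mean, cov

# Example: fuse a 2D position from two streams of differing quality.
mean, cov = predict(np.array([1.0, 2.0]), np.eye(2) * 0.5,
                    motion_delta=np.array([0.1, 0.0]), Q=np.eye(2) * 0.01)
mean, cov = fuse_streams(mean, cov, [
    (np.array([1.15, 2.02]), np.eye(2) * 0.05),   # e.g., a laser system stream
    (np.array([1.05, 1.90]), np.eye(2) * 0.20),   # e.g., a modified odometry stream
])
print(mean, np.diag(cov))
```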
[0045] The number of second type data streams 130 in data streams 127 can be received by modifier 120. Modifier 120 can be configured to modify a second type data stream to make the data stream usable by position estimator 118. In particular, modifier 120 can convert the number of second type data streams 130 into data streams that can be processed as first type data streams. All of the first type data streams can then be processed by position estimator 118 to generate position estimate 122.

[0046] As an illustrative example, second type data stream 133 may be generated by one of the plurality of data systems 128. In this example, second type data stream 133 may have been generated using one or more odometry techniques. Modifier 120 can be configured to modify second type data stream 133 to form modified data stream 135, which is usable by position estimator 118. Modifier 120 can convert second type data stream 133 into modified data stream 135 using any number of techniques. As an illustrative example, modifier 120 may apply probability distribution 137 to second type data stream 133 to form modified data stream 135.

[0047] Depending on the implementation, probability distribution 137 may be a Gaussian distribution or some other type of probability distribution. Probability distribution 137 can be a predetermined probability distribution. For example, without limitation, probability distribution 137 may have been determined empirically, using a mathematical model, or in some other manner before mobile robot 106 was used to perform the number of operations 116 in environment 100.

[0048] By converting second type data stream 133 into modified data stream 135, the need for probabilistic data generated using a physics-based model of mobile robot 106 can be eliminated. In particular, the term p(x_{t+1} \mid x_t) in equation (1) described above can be provided using at least one of data streams 127 from the plurality of data systems 128 instead of a physics-based model. For example, without limitation, position estimator 118 can use a data stream from an odometry system in the plurality of data systems 128 to provide the term p(x_{t+1} \mid x_t) in equation (1) described above. The term p(x_{t+1} \mid x_t) is the probability of the position of mobile robot 106 at time t+1 given the position of mobile robot 106 at time t.

[0049] In this manner, a probability distribution can be applied to each of the number of second type data streams 130 by modifier 120 to form the number of modified data streams 129. Each of the number of modified data streams 129 can be treated in the same manner as each of the number of first type data streams 132. The number of modified data streams 129 and the number of first type data streams 132 may together form the plurality of data streams 126 used by position estimator 118 to form position estimate 122.
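A minimal sketch of the conversion just described follows: each bare value of a pseudo-deterministic stream is paired with a predetermined Gaussian covariance playing the role of probability distribution 137. The covariance value and all names are illustrative assumptions.

```python
# Minimal sketch of the modifier: a pseudo-deterministic stream yields
# bare values, and a predetermined Gaussian covariance (e.g., determined
# empirically beforehand) is attached to each value to form a modified
# data stream. The covariance value and names are illustrative.
import numpy as np

ODOMETRY_COV = np.diag([0.02, 0.02, 0.005])  # predetermined, e.g. empirical

def modify(second_type_stream, cov=ODOMETRY_COV):
    """Convert bare estimates into (value, covariance) pairs."""
    for value in second_type_stream:
        yield np.asarray(value), cov

# Example: a wheel-odometry stream of (x, y, heading) values.
raw = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.01)]
for z, R in modify(raw):
    print(z, np.diag(R))
```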
[0050] In these illustrative examples, the plurality of data systems 128 may include a number of embedded data systems 134 and a number of external data systems 136. As used herein, an "embedded data system," such as one of the number of embedded data systems 134, may be a data system configured to generate a data stream onboard mobile robot 106. In some cases, an embedded data system may be completely separate from controller 114. In other illustrative examples, at least a portion of an embedded data system may be implemented within or integrated with controller 114. The data stream generated by the embedded data system can be received by position estimator 118 or modifier 120, depending on whether the data stream is a first type data stream or a second type data stream.

[0051] Embedded data system 144 may be an example of one of the number of embedded data systems 134. Embedded data system 144 may include at least one of a passive element, an active element, a processor unit, an integrated circuit, a microprocessor, a sensor system, a target, or some other type of device or element. At least the portion of embedded data system 144 that generates a data stream is located onboard mobile robot 106. In this way, all of embedded data system 144 can be located onboard mobile robot 106, or a portion of embedded data system 144 may be located onboard, while another portion may be located externally.

[0052] As used herein, an "external data system," such as one of the number of external data systems 136, may be a data system configured to generate a data stream remotely with respect to mobile robot 106. The data stream generated by the external data system can then be sent to controller 114 using, for example, without limitation, a wireless communications link.

[0053] External data system 145 can be an example of one of the number of external data systems 136. External data system 145 can include at least one of a passive element, an active element, a processor unit, an integrated circuit, a microprocessor, a sensor system, a target, or some other type of device or element. At least the portion of external data system 145 that generates a data stream is located externally to mobile robot 106. In this way, all of external data system 145 can be located externally, or a portion of external data system 145 can be located externally, while another portion may be located onboard.

[0054] Additionally, controller 114 can be configured to reduce error in moving mobile robot 106 along path 117. In particular, controller 114 can reduce random error in moving mobile robot 106 from initial location 138 along path 117 to desired location 140 along path 117 to within selected tolerances. In an illustrative example, controller 114 may use one or more of the number of embedded data systems 134 to reduce this random error.

[0055] In particular, controller 114 may use one or more of the number of embedded data systems 134 configured to observe multiple reference points 142 within environment 100 to reduce the random error in moving mobile robot 106 from initial location 138 along path 117 to desired location 140 along path 117. A reference point in the multiple reference points 142 can be any recognizable feature in environment 100. For example, without limitation, a reference point can take the form of a pillar, a platform, a structural feature, a piece of equipment, a man-made structure, a target, a label, or some other type of reference point.

[0056] In this illustrative example, embedded data system 144 may include a sensor system capable of observing at least one of the multiple reference points 142 in environment 100 while mobile robot 106 is at initial location 138 within environment 100. For example, without limitation, embedded data system 144 may observe reference point 146 of the multiple reference points 142 while at initial location 138. Reference point 146 may be observed a selected number of times. For example, N observations can be made of reference point 146.
[0057] Reference point 146 can be a natural or man-made reference point, depending on the implementation. In this illustrative example, reference point 146 may be a stationary reference point. However, in another illustrative example, reference point 146 may be mobile and capable of moving within environment 100 as needed. In some illustrative examples, reference point 146 may be a person.

[0058] Embedded data system 144 can be used to identify an initial relative distance between initial location 138 of mobile robot 106 and reference point 146. As the number of observations made of reference point 146 increases, the error in the initial relative distance between the initial location of mobile robot 106 and reference point 146 decreases. This reduction in error is based on the central limit theorem.

[0059] In particular, the central limit theorem can be exploited such that the error can be reduced by a factor of the square root of n, where n is the total number of observations made. The central limit theorem states that, under certain conditions, the sum of n independent, identically distributed random variables, when properly scaled, converges in distribution to a standard normal distribution. Thus, in an illustrative example, as n increases, the empirical covariance will decrease at a rate given by \sigma / \sqrt{n}, where \sigma is the standard deviation with respect to the mean.

[0060] Mobile robot 106 can then be moved to a new location as far as possible in the direction of desired location 140 without losing reference point 146 from the field of view of the sensor system of embedded data system 144. Embedded data system 144 can identify a new relative distance between the new location of mobile robot 106 and reference point 146. The difference between the initial relative distance and the new relative distance can then be computed with minimal error and used to determine the new location of mobile robot 106.

[0061] If the new location is not desired location 140 within selected tolerances, mobile robot 106 can then be moved closer to desired location 140 using new reference point 147. In particular, embedded data system 144 can search for new reference point 147 among the multiple reference points 142 while at the new location. While at the new location, embedded data system 144 can then observe new reference point 147 the selected number, N, of times. In this way, new reference point 147 can be considered "correlated" with reference point 146 at the new location.

[0062] The process of moving to another location as close as possible to desired location 140, and the operations performed while at this other location as described above, can then be repeated. This type of movement and processing can be repeated until mobile robot 106 has reached desired location 140 within selected tolerances. This type of process can reduce the overall error associated with moving mobile robot 106 from initial location 138 to desired location 140 within selected tolerances, as compared to moving mobile robot 106 without using the multiple reference points 142 and observing each reference point of the multiple reference points 142 a selected number, N, of times.
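The square root of n behavior described above can be checked numerically. The sketch below averages N noisy observations of a fixed relative distance to a reference point and compares the empirical spread of the averaged estimate against sigma divided by the square root of N; the noise level and distance are illustrative.

```python
# Minimal sketch of the sqrt(N) error reduction: averaging N independent
# observations of the relative distance to a reference point shrinks the
# standard deviation of the averaged estimate toward sigma/sqrt(N),
# per the central limit theorem. Numbers here are illustrative.
import numpy as np

rng = np.random.default_rng(7)
true_distance = 5.0   # meters, robot to reference point
sigma = 0.05          # per-observation sensor noise

for n in (1, 4, 16, 64):
    # Empirical spread of the averaged estimate over many trials.
    trials = rng.normal(true_distance, sigma, size=(10000, n)).mean(axis=1)
    print(f"N={n:3d}  empirical std={trials.std():.4f}  "
          f"sigma/sqrt(N)={sigma / np.sqrt(n):.4f}")
```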
[0063] Referring now to Figure 2, an illustration of the plurality of data systems 128 from Figure 1 is depicted in the form of a block diagram in accordance with an illustrative embodiment. As depicted, the plurality of data systems 128 may include the number of embedded data systems 134 and the number of external data systems 136.

[0064] In this illustrative example, the plurality of data systems 128 may include inertial measurement unit 202, color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, light detection and ranging system 210, indoor global positioning system 212, motion capture system 214, and laser system 216. Inertial measurement unit 202, color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, and light detection and ranging system 210 can be part of the number of embedded data systems 134. Indoor global positioning system 212, motion capture system 214, and laser system 216 can be part of the number of external data systems 136.

[0065] In this illustrative example, inertial measurement unit 202 can measure the relative displacement of mobile robot 106 within environment 100 by sensing velocity, orientation, and acceleration. Inertial measurement unit 202 can generate data stream 203, which can be sent to controller 114 as one of data streams 127. Depending on the manner in which inertial measurement unit 202 is implemented, data stream 203 can be considered one of the number of first type data streams 132 or one of the number of second type data streams 130.

[0066] Color and depth odometry system 204 can be used to provide color data and depth data for environment 100. Wheel odometry system 206 can be used to measure the relative displacement of mobile robot 106 within environment 100 when motion system 110 in Figure 1 includes wheels. Visual odometry system 208 can use cameras to estimate the relative displacement of mobile robot 106 within environment 100. Light detection and ranging system 210 can generate a laser scan of environment 100.

[0067] Each of color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, and light detection and ranging system 210 can be located completely onboard mobile robot 106. In one illustrative example, color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, and light detection and ranging system 210 can generate data stream 205, data stream 207, data stream 209, and data stream 211, respectively, which can be sent to controller 114 as part of data streams 127. Each of data stream 205, data stream 207, data stream 209, and data stream 211 may be one of the number of first type data streams 132 or one of the number of second type data streams 130, depending on the implementation. In this illustrative example, each of data stream 205, data stream 207, data stream 209, and data stream 211 may be included in the number of second type data streams 130.

[0068] In other illustrative examples, one or more of data stream 205, data stream 207, data stream 209, and data stream 211, generated by color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, and light detection and ranging system 210, respectively, may be sent to locator and mapper 218. Locator and mapper 218 may be implemented within controller 114 in Figure 1 or separate from controller 114, depending on the implementation.

[0069] Furthermore, locator and mapper 218 can take the form of two-dimensional locator and mapper 220 or three-dimensional locator and mapper 222, depending on the implementation.
In some cases, color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, light detection and ranging system 210, and locator and mapper 218 can together form location and mapping system 224. Location and mapping system 224 can be considered an embedded data system in the number of embedded data systems 134.

[0070] Locator and mapper 218 can be configured to simultaneously estimate a metric map of environment 100 and the position of mobile robot 106 within this metric map based on all of the data streams received by locator and mapper 218. The metric map can be two-dimensional or three-dimensional, depending on the implementation. In an illustrative example, locator and mapper 218 may be referred to as a simultaneous localization and mapping (SLAM) system. In these examples, the estimate of the metric map of environment 100 and the estimate of the position of mobile robot 106 within this metric map can be sent in the form of data stream 213 to controller 114 in Figure 1 as one of data streams 127. Data stream 213 may be one of the number of first type data streams 132 or one of the number of second type data streams 130.

[0071] In this illustrative example, indoor global positioning system 212 includes multiple sensor devices 226, multiple transmitters 228, and server 230. The multiple transmitters 228 can be located externally, while the multiple sensor devices 226 can be located onboard mobile robot 106.

[0072] The multiple transmitters 228 may be configured to generate multiple light signals 229. The multiple light signals 229 may include at least one of a laser signal, an infrared signal, or some other type of light signal. The multiple sensor devices 226 may be passive and used to sense the multiple light signals 229 transmitted from the multiple transmitters 228. The multiple sensor devices 226 may send light data 231 about the sensed multiple light signals 229 to server 230.

[0073] Server 230 may be configured to use this data to estimate the position of mobile robot 106 within environment 100 over time. The estimates generated over time may form data stream 233, which may be sent to controller 114 as one of data streams 127. Data stream 233 may be one of the number of first type data streams 132 or one of the number of second type data streams 130, depending on the implementation. Server 230 may be located offboard. In this way, server 230 can be an external data source that makes indoor global positioning system 212 one of the number of external data systems 136.

[0074] Motion capture system 214 may include motion capture target 232, imaging system 234, and motion capture server 236. Motion capture target 232 may be passive and located onboard mobile robot 106. Imaging system 234 may be located externally within environment 100 in Figure 1 and used to generate motion capture data 235 and track motion capture target 232. Motion capture data 235 generated by motion capture system 214 may be sent to motion capture server 236 for further processing.

[0075] Motion capture server 236 may then send motion capture data 235 in the form of data stream 237 to controller 114 as one of data streams 127. In some cases, motion capture server 236 may process motion capture data 235 to form data stream 237. Data stream 237 can be one of the number of first type data streams 132 or one of the number of second type data streams 130, depending on the implementation. Motion capture server 236 can be located externally within environment 100.
In this way, motion capture server 236 can be considered an external data source, which makes motion capture system 214 one of the number of external data systems 136.

[0076] As depicted, laser system 216 may include laser target 238 and laser sensor 240. Laser target 238 can be passive and located onboard mobile robot 106. Laser sensor 240 can be located externally within environment 100 and used to track the movement of laser target 238. Laser sensor 240 can measure the position of laser target 238 and process this data to generate an estimate of the position of mobile robot 106, which can form data stream 241 over time. Data stream 241 may be one of the number of first type data streams 132 or one of the number of second type data streams 130, depending on the implementation. Laser sensor 240 can send data stream 241 to controller 114 as one of data streams 127.

[0077] In this way, various types of sensor systems and devices can be used to generate data streams 127. The number of second type data streams 130 in data streams 127 can be processed by modifier 120 in Figure 1 to form the number of modified data streams 129. Together, the number of modified data streams 129 in Figure 1 and the number of first type data streams 132 can form the plurality of data streams 126 in Figure 1 used by position estimator 118 to generate position estimate 122.

[0078] Referring now to Figure 3, an illustration of the components of the plurality of data systems 128 that are located onboard and the components of the plurality of data systems 128 that are located externally, as described in Figure 2, is depicted in accordance with an illustrative embodiment. As depicted, some of the components of the plurality of data systems 128 of Figure 2 are located onboard 300, while other components of the plurality of data systems 128 of Figure 2 are located externally 302. In Figure 3, onboard 300 means onboard mobile robot 106 in Figure 1, and external 302 means external with respect to mobile robot 106 in Figure 1.

[0079] In particular, inertial measurement unit 202, color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, light detection and ranging system 210, and locator and mapper 218 of Figure 2 are located onboard 300. In addition, the multiple sensor devices 226, motion capture target 232, and laser target 238 of Figure 2 are also located onboard 300. The multiple transmitters 228, server 230, imaging system 234, motion capture server 236, and laser sensor 240 may be located externally 302.

[0080] In an illustrative example, color and depth odometry system 204, wheel odometry system 206, visual odometry system 208, and light detection and ranging system 210 send data stream 205, data stream 207, data stream 209, and data stream 211, respectively, to locator and mapper 218. Locator and mapper 218 can then use these data streams to form data stream 213 and send data stream 213 to controller 114 of Figure 1, which is also located onboard 300. Inertial measurement unit 202 can send data stream 203 directly to controller 114. In this illustrative example, these data streams can be sent to controller 114 using any number of wired or wireless communications links.

[0081] Furthermore, server 230, motion capture server 236, and laser sensor 240 can send data stream 233, data stream 237, and data stream 241 to controller 114. In this illustrative example, these data streams can be sent to controller 114 by wireless communication.

[0082] A data stream that is sent to controller 114 can be received by position estimator 118 if the data stream is a first type data stream in the number of first type data streams 132 in Figure 1, or by modifier 120 if the data stream is a second type data stream in the number of second type data streams 130 in Figure 1.
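A minimal sketch of this routing might look as follows, with estimates modeled as plain dictionaries; the predetermined covariance and all names are illustrative assumptions rather than interfaces defined in this disclosure.

```python
# Minimal sketch of the dispatch described above: first type streams go
# directly to the position estimator, second type streams pass through
# the modifier first. All names are illustrative assumptions.
import numpy as np

PREDETERMINED_COV = np.eye(6) * 0.05  # modifier's fixed distribution (illustrative)

def receive(estimate, estimator_inbox):
    """Route one incoming estimate inside the controller."""
    if estimate.get("covariance") is None:
        # Second type data stream: the modifier attaches the
        # predetermined probability distribution.
        estimate["covariance"] = PREDETERMINED_COV
    # First type (or now modified) stream: the position estimator consumes it.
    estimator_inbox.append(estimate)

inbox = []
receive({"pose": np.zeros(6), "covariance": np.eye(6) * 0.01}, inbox)  # first type
receive({"pose": np.zeros(6), "covariance": None}, inbox)              # second type
print(len(inbox), inbox[1]["covariance"] is not None)
```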
[0083] The illustrations of environment 100 in Figure 1, the plurality of data systems 128 in Figure 2, and the components located onboard 300 and externally 302 in Figure 3 are not meant to imply physical or architectural limitations on the manner in which an illustrative embodiment can be implemented. Components other than, or in place of, those illustrated may be used. Some components may be optional. Also, the blocks are presented to illustrate some functional components. One or more of these blocks may be combined, split, or combined and split into different blocks when implemented in an illustrative embodiment.

[0084] Referring now to Figure 4, an illustration of an industrial environment is depicted in accordance with an illustrative embodiment. In this illustrative example, industrial environment 400 can be an example of one implementation for industrial environment 101 in Figure 1. As depicted, aircraft wing 402 can be manufactured within industrial environment 400. Aircraft wing 402 can be an example of one implementation for object 103 in Figure 1.

[0085] Mobile robots 404 can be used to perform the operations needed to manufacture aircraft wing 402. Mobile robots 404 can be an example of one implementation for the multiple mobile platforms 102 in Figure 1. In this illustrative example, mobile robots 404 can be configured to move on floor 406 of industrial environment 400. Each of mobile robots 404 may be able to identify its position within, and navigate through, industrial environment 400.

[0086] Referring now to Figure 5, an illustration of a mobile robot is depicted in accordance with an illustrative embodiment. In this illustrative example, mobile robot 500 can be an example of one implementation for mobile robot 106 in Figure 1. Furthermore, mobile robot 500 can be an example of one manner in which each of mobile robots 404 in Figure 4 can be implemented.

[0087] As depicted, mobile robot 500 may include base 502, motion system 504, and the plurality of devices 506. In this illustrative example, the plurality of devices 506 may include light detection and ranging system 508, color and depth odometry system 510, and targets 512. Light detection and ranging system 508 can be an example of one implementation for light detection and ranging system 210 in Figure 2. Color and depth odometry system 510 can be an example of one implementation for color and depth odometry system 204 in Figure 2. Targets 512 can be an example of one implementation for motion capture target 232 and laser target 238 in Figure 2.

[0088] The illustrations in Figures 4-5 are not intended to imply physical or architectural limitations on the manner in which an illustrative embodiment may be implemented. Components other than, or in place of, those illustrated may be used. Some components may be optional.

[0089] The different components shown in Figures 4-5 can be illustrative examples of how components shown in block form in Figures 1-3 can be implemented as physical structures.
Additionally, some of the components in Figures 4-5 can be combined with components in Figures 1-3, used with components in Figures 1-3, or a combination of the two.

[0090] Referring now to Figure 6, an illustration of a process for generating a position estimate for a mobile platform in an environment is depicted in the form of a flowchart in accordance with an illustrative embodiment. The process illustrated in Figure 6 can be implemented to manage the movement of the multiple mobile platforms 102 in Figure 1.

[0091] The process may begin by generating the plurality of data streams 126, wherein the plurality of data streams 126 includes the number of first type data streams 132 and the number of second type data streams 130 (operation 600). Next, probability distribution 137 can be applied to each of the number of second type data streams 130 to form the number of modified data streams 129 (operation 602).

[0092] Thereafter, the number of first type data streams 132 and the number of modified data streams 129 can be merged to generate position estimate 122 for mobile platform 104 with respect to environment 100 around mobile platform 104 with a desired level of accuracy (operation 604), with the process terminating thereafter. In operation 604, merging may mean using Bayesian estimation techniques to generate position estimate 122.

[0093] Referring now to Figure 7, an illustration of a process for guiding a mobile robot within an industrial environment is depicted in the form of a flowchart in accordance with an illustrative embodiment. The process illustrated in Figure 7 can be implemented to manage the movement of mobile robot 106 within industrial environment 101 in Figure 1. In particular, the process in Figure 7 can be used to reduce the error in moving mobile robot 106 along path 117 in environment 100.

[0094] The process can begin by identifying desired location 140 to which mobile robot 106 is to be moved (operation 700). Next, a sensor system of embedded data system 144 for mobile robot 106 is used to search for reference point 146 in environment 100 while mobile robot 106 is at a current location (operation 702). Next, the current reference point in environment 100 is observed a selected number of times using the sensor system of embedded data system 144 with mobile robot 106 at the current location (operation 704). Next, embedded data system 144 identifies an initial relative distance between the current location of mobile robot 106 and the current reference point (operation 706). In operation 706, this identification can be an estimation.

[0095] Thereafter, mobile robot 106 is moved to a new location as far as possible in the direction of desired location 140 for mobile robot 106 without losing the current reference point from the field of view of the sensor system of embedded data system 144 (operation 708). The current reference point is observed again a selected number of times using the sensor system of embedded data system 144 while mobile robot 106 is at the new location (operation 710). Embedded data system 144 identifies a new relative distance between the new location of mobile robot 106 and the current reference point (operation 712). In operation 712, this identification can be an estimation.

[0096] Embedded data system 144 computes the difference between the initial relative distance and the new relative distance (operation 714). An estimate of the new location of mobile robot 106 is then identified using the difference (operation 716). A determination is then made as to whether the new location is at desired location 140 within selected tolerances (operation 718). If the new location is at desired location 140 within selected tolerances, the process terminates.

[0097] Otherwise, embedded data system 144 searches for new reference point 147 while at the new location (operation 720). While at the new location, embedded data system 144 then observes new reference point 147 the selected number of times using the sensor system of embedded data system 144 (operation 722). Embedded data system 144 then identifies a relative distance between the new location of mobile robot 106 and new reference point 147 (operation 722). In operation 722, this identification can be an estimation. In this way, new reference point 147 can be considered "correlated" with reference point 146 at the new location. The process then re-identifies new reference point 147 as the current reference point, the new location as the current location, and the relative distance as the initial relative distance (operation 724), with the process then returning to operation 708 as described above.
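The loop of Figure 7 can be sketched end to end in one dimension, as below: the robot hops toward the desired location, averages N observations of the current reference point at each stop, and correlates a new reference point before the next hop. The sensor range, noise level, landmark positions, and helper names are illustrative assumptions.

```python
# Minimal runnable sketch of the Figure 7 process in one dimension.
# Each observation of a relative distance is averaged over N noisy
# readings; the averaged difference between hops updates the location
# estimate. Numbers and the field-of-view model are illustrative.
import numpy as np

rng = np.random.default_rng(3)
SIGMA, RANGE, N, TOL = 0.05, 4.0, 25, 0.1

def observe(true_dist, n=N):
    """Average n noisy observations of a relative distance (ops 704/710/722)."""
    return rng.normal(true_dist, SIGMA, size=n).mean()

def guide(start, desired, landmarks):
    true_pos, est_pos = start, start
    ref = min(landmarks, key=lambda l: abs(l - true_pos))      # operation 702
    d_initial = observe(ref - true_pos)                        # operations 704-706
    while abs(est_pos - desired) > TOL:                        # operation 718
        # Move as far toward the goal as possible without losing the
        # reference point from the sensor's field of view (operation 708).
        step = np.clip(desired - true_pos, -RANGE / 2, RANGE / 2)
        true_pos += step
        d_new = observe(ref - true_pos)                        # operations 710-712
        est_pos += d_initial - d_new                           # operations 714-716
        ref = min(landmarks, key=lambda l: abs(l - true_pos))  # operation 720
        d_initial = observe(ref - true_pos)                    # operations 722-724
    return est_pos, true_pos

print(guide(0.0, 10.0, landmarks=[1.0, 3.0, 5.0, 7.0, 9.0, 11.0]))
```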
[0098] Turning now to Figure 8, an illustration of a data processing system is depicted in the form of a block diagram in accordance with an illustrative embodiment. Data processing system 800 can be used to implement controller 114 in Figure 1. As depicted, data processing system 800 includes communications framework 802, which provides communications between processor unit 804, storage devices 806, communications unit 808, input/output unit 810, and display 812. In some cases, communications framework 802 can be implemented as a bus system.

[0099] Processor unit 804 is configured to execute instructions for software to perform a number of operations. Processor unit 804 can include multiple processors, a multiprocessor core, and/or some other type of processor, depending on the implementation. In some cases, processor unit 804 may take the form of a hardware unit, such as a circuit system, an application-specific integrated circuit (ASIC), a programmable logic device, or some other suitable type of hardware unit.

[00100] Instructions for the operating system, applications, and/or programs run by processor unit 804 may be located on storage devices 806. Storage devices 806 may be in communication with processor unit 804 through communications framework 802. As used herein, a storage device, also referred to as a computer readable storage device, is any piece of hardware capable of storing information on a temporary and/or permanent basis. This information may include, but is not limited to, data, program code, and/or other information.

[00101] Memory 814 and persistent storage 816 are examples of storage devices 806. Memory 814 can take the form of, for example, a random access memory or some type of volatile or non-volatile storage device. Persistent storage 816 can include any number of components or devices. For example, persistent storage 816 may include a hard disk, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above. The media used by persistent storage 816 may or may not be removable.

[00102] Communications unit 808 allows data processing system 800 to communicate with other data processing systems and/or devices. Communications unit 808 may provide communications using physical and/or wireless communications links.

[00103] Input/output unit 810 allows input to be received from, and output to be sent to, other devices connected to data processing system 800.
For example, input/output unit 810 can allow user input to be received through a keyboard, a mouse, and/or some other type of input device. As another example, input/output unit 810 may allow output to be sent to a printer connected to data processing system 800.

[00104] Display 812 is configured to display information to a user. Display 812 may include, for example, without limitation, a monitor, a touch screen, a laser display, a holographic display, a virtual display device, and/or some other type of display device.

[00105] In this illustrative example, the processes of the different illustrative embodiments can be performed by processor unit 804 using computer-implemented instructions. These instructions may be referred to as program code, computer usable program code, or computer readable program code, and may be read and executed by one or more processors in processor unit 804.

[00106] In these examples, program code 818 is located in functional form on computer readable media 820, which is selectively removable, and can be loaded onto or transferred to data processing system 800 for execution by processor unit 804. Program code 818 and computer readable media 820 together form computer program product 822. In this illustrative example, computer readable media 820 may be computer readable storage media 824 or computer readable signal media 826.

[00107] Computer readable storage media 824 is a physical or tangible storage device used to store program code 818, rather than a medium that propagates or transmits program code 818. Computer readable storage media 824 may be, for example, without limitation, an optical or magnetic disk or a persistent storage device that is connected to data processing system 800.

[00108] Alternatively, program code 818 may be transferred to data processing system 800 using computer readable signal media 826. Computer readable signal media 826 may be, for example, a propagated data signal containing program code 818. This data signal may be an electromagnetic signal, an optical signal, and/or some other type of signal that can be transmitted over physical and/or wireless communications links.

[00109] The illustration of data processing system 800 in Figure 8 is not meant to provide architectural limitations on the manner in which the illustrative embodiments may be implemented. The different illustrative embodiments may be implemented in a data processing system that includes components in addition to, or in place of, those illustrated for data processing system 800. In addition, the components shown in Figure 8 may be varied from the illustrative examples shown.

[00110] The flowcharts and block diagrams in the different depicted embodiments illustrate the architecture, functionality, and operation of some possible implementations of apparatus and methods in an illustrative embodiment. In this regard, each block in a flowchart or block diagram can represent a module, a segment, a function, and/or a portion of an operation or step.

[00111] In some alternative implementations of an illustrative embodiment, the function or functions noted in the blocks may occur out of the order noted in the figures. For example, in some cases, two blocks shown in succession may be performed substantially concurrently, or the blocks may sometimes be performed in the reverse order, depending on the functionality involved. Also, other blocks may be added in addition to the blocks illustrated in a flowchart or block diagram.
[00112] Illustrative embodiments of the disclosure may be described in the context of aircraft manufacturing and service method 900 as shown in Figure 9 and aircraft 1000 as shown in Figure 10. Turning first to Figure 9, an illustration of an aircraft manufacturing and service method is depicted in the form of a block diagram in accordance with an illustrative embodiment. During pre-production, aircraft manufacturing and service method 900 may include specification and design 902 of aircraft 1000 in Figure 10 and material procurement 904.

[00113] During production, component and subassembly manufacturing 906 and system integration 908 of aircraft 1000 in Figure 10 take place. Thereafter, aircraft 1000 in Figure 10 may go through certification and delivery 910 in order to be placed in service 912. While in service 912 by a customer, aircraft 1000 in Figure 10 is scheduled for routine maintenance and service 914, which may include modification, reconfiguration, refurbishment, and other maintenance or service.

[00114] Each of the processes of aircraft manufacturing and service method 900 may be performed or carried out by a system integrator, a third party, and/or an operator. In these examples, the operator may be a customer. For the purposes of this description, a system integrator may include, without limitation, any number of aircraft manufacturers and major-system subcontractors; a third party may include, without limitation, any number of vendors, subcontractors, and suppliers; and an operator may be an airline, a leasing company, a military entity, a service organization, and so on.

[00115] Referring now to Figure 10, an illustration of an aircraft is depicted in the form of a block diagram in which an illustrative embodiment may be implemented. In this example, aircraft 1000 is produced by aircraft manufacturing and service method 900 in Figure 9 and may include airframe 1002 with the plurality of systems 1004 and interior 1006. Examples of systems 1004 include one or more of propulsion system 1008, electrical system 1010, hydraulic system 1012, and environmental system 1014. Any number of other systems may be included. Although an aerospace example is shown, the different illustrative embodiments may be applied to other industries, such as the automotive industry.

[00116] The apparatus and methods embodied herein may be employed during at least one of the stages of aircraft manufacturing and service method 900 in Figure 9. In particular, the multiple mobile platforms 102 can be used during any of the stages of aircraft manufacturing and service method 900. For example, without limitation, the multiple mobile platforms 102 can be used to perform operations during at least one of component and subassembly manufacturing 906, system integration 908, routine maintenance and service 914, or some other stage of aircraft manufacturing and service method 900.

[00117] In one illustrative example, components or subassemblies produced in component and subassembly manufacturing 906 in Figure 9 may be fabricated or manufactured in a manner similar to components or subassemblies produced while aircraft 1000 is in service 912 in Figure 9. As yet another example, one or more apparatus embodiments, method embodiments, or a combination thereof may be used during production stages, such as component and subassembly manufacturing 906 and system integration 908 in Figure 9.
One or more apparatus embodiments, method embodiments, or a combination thereof may be used while aircraft 1000 is in service 912 and/or during maintenance and service 914 in Figure 9. The use of a number of the different illustrative embodiments can substantially expedite the assembly of and/or reduce the cost of aircraft 1000.

[00118] The description of the different illustrative embodiments has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Furthermore, different illustrative embodiments may provide different features as compared to other desirable embodiments. The embodiment or embodiments selected are chosen and described in order to best explain the principles of the embodiments and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.

[00119] Thus, in summary, according to a first aspect of the present invention, there is provided:

A1. An apparatus including: a plurality of data systems configured to generate a plurality of data streams, in which the plurality of data streams includes a number of first type data streams and a number of second type data streams; a modifier configured to apply a probability distribution to each of the number of second type data streams to form a number of modified data streams; and a position estimator embedded in a mobile platform and configured to receive and merge the number of first type data streams and the number of modified data streams to generate a position estimate with a desired level of accuracy for the mobile platform with respect to an environment around the mobile platform.

A2. Also provided is the apparatus of paragraph A1, wherein at least one of the number of second type data streams is received from an odometry system in the plurality of data systems.

A3. Also provided is the apparatus of paragraph A1, wherein an embedded data system in the plurality of data systems is used to observe a reference point in the environment a selected number of times for use in moving the mobile platform to a desired location within the environment.

A4. Also provided is the apparatus of paragraph A3, wherein increasing the selected number of times the reference point is observed reduces an overall error in moving the mobile platform to the desired location within the environment.

A5. Also provided is the apparatus of paragraph A1, wherein the plurality of data systems includes a number of embedded data systems and a number of external data systems.

A6. Also provided is the apparatus of paragraph A5, wherein the number of embedded data systems includes at least one of an inertial measurement unit, a light detection and ranging system, a color and depth odometry system, a wheel odometry system, a visual odometry system, or a localization and mapping system.

A7. Also provided is the apparatus of paragraph A5, wherein the number of external data systems includes at least one of an indoor global positioning system, a motion capture system, or a laser system.

A8. Also provided is the apparatus of paragraph A1, wherein the position estimate includes a position and an orientation of the mobile platform with respect to the environment.
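As an illustrative aside: the modifier of paragraph A1 can be read as attaching a Gaussian distribution to every raw data point of a second type stream, using the empirical covariance mentioned in paragraph C3 below. The Python sketch that follows is only one way such a modifier might look; the function name, the sliding-window size, and the fallback prior are assumptions of this sketch, not anything prescribed by the disclosure.

```python
import numpy as np

def attach_empirical_covariance(raw_points, window=20):
    """Pair each raw 2-D data point (which carries no uncertainty measure)
    with an empirical covariance over a sliding window of recent points,
    yielding a "modified" stream of (mean, covariance) Gaussians."""
    raw_points = np.asarray(raw_points, dtype=float)
    modified = []
    for i in range(len(raw_points)):
        recent = raw_points[max(0, i - window + 1):i + 1]
        if len(recent) < 2:
            cov = 1e-3 * np.eye(2)  # assumed prior until enough samples accumulate
        else:
            # Empirical covariance of the recent samples; the small jitter
            # keeps the matrix invertible for the downstream merge step.
            cov = np.cov(recent, rowvar=False) + 1e-9 * np.eye(2)
        modified.append((raw_points[i], cov))
    return modified
```

A real system would more likely estimate the covariance from measurement residuals than from the raw trajectory itself; the windowed version above is only meant to show the shape of the data handed to the position estimator.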
A9. Also provided is the apparatus of paragraph A1, wherein the mobile platform is a mobile robot and the environment is an industrial environment.

A10. Also provided is the apparatus of paragraph A1, wherein the position estimator is configured to merge the number of first type data streams and the number of modified data streams using a Bayesian estimation algorithm to generate the position estimate.

A11. Also provided is the apparatus of paragraph A1, further including: a controller configured to use the position estimate to guide the mobile platform along a path within the environment.

A12. Also provided is the apparatus of paragraph A11, wherein the mobile platform includes: a motion system configured to be controlled by the controller based on the position estimate.

B2. A mobile platform including: a base; a controller associated with the base and configured to receive data streams from a plurality of data systems, in which the data streams include a number of first type data streams and a number of second type data streams, and wherein the controller includes: a modifier configured to apply a probability distribution to each of the number of second type data streams to form a number of modified data streams; and a position estimator configured to receive the number of first type data streams and the number of modified data streams as a plurality of data streams and merge the plurality of data streams together to generate a position estimate with a desired level of accuracy for the mobile platform with respect to an environment around the mobile platform; and a motion system associated with the base and configured to be controlled by the controller based on the position estimate to move the mobile platform within the environment.

C1. A method for guiding a mobile platform within an environment, the method including: generating a number of first type data streams and a number of second type data streams using a plurality of data systems; applying a probability distribution to each of the number of second type data streams to form a number of modified data streams; and merging the number of first type data streams and the number of modified data streams to generate a position estimate with a desired level of accuracy for the mobile platform with respect to the environment around the mobile platform.

C2. Also provided is the method of paragraph C1, further including: guiding the mobile platform along a path in the environment using the position estimate generated for the mobile platform.

C3. Also provided is the method of paragraph C1, wherein applying the probability distribution to each of the number of second type data streams to form the number of modified data streams includes: applying an empirical covariance to each data point in each of the number of second type data streams to form the number of modified data streams.

C4. Also provided is the method of paragraph C1, wherein merging the number of first type data streams and the number of modified data streams includes: merging the number of first type data streams and the number of modified data streams using a Bayesian estimation algorithm to generate the position estimate with the desired level of accuracy.

C5. Also provided is the method of paragraph C1, wherein generating the number of first type data streams and the number of second type data streams includes: generating the number of second type data streams using at least one of a color and depth odometry system, a wheel odometry system, a visual odometry system, a light detection and ranging system, or a localization and mapping system.
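Paragraphs A10 and C4 name a Bayesian estimation algorithm without prescribing one. Assuming every incoming stream has already been reduced to a Gaussian (mean, covariance) pair, a minimal product-of-Gaussians (information-filter) update is one way to make the merge step concrete; the function below and its example values are illustrative only.

```python
import numpy as np

def merge_gaussian_estimates(estimates):
    """Merge independent Gaussian position estimates: precisions
    (inverse covariances) add, and each mean is weighted by its
    precision, so tighter estimates pull the result harder."""
    information = np.zeros((2, 2))
    information_mean = np.zeros(2)
    for mean, cov in estimates:
        precision = np.linalg.inv(cov)
        information += precision
        information_mean += precision @ np.asarray(mean, dtype=float)
    merged_cov = np.linalg.inv(information)
    return merged_cov @ information_mean, merged_cov

# Hypothetical inputs: a tight external fix and a looser wheel-odometry
# point that was assigned an empirical covariance upstream.
external_fix = (np.array([2.00, 1.00]), np.diag([0.01, 0.01]))
wheel_point = (np.array([2.10, 0.95]), np.diag([0.09, 0.04]))
position, covariance = merge_gaussian_estimates([external_fix, wheel_point])
print(position)                      # close to the tighter external fix
print(np.sqrt(np.diag(covariance)))  # merged sigma is below either input's
```

A full implementation would run this inside a recursive filter (for example a Kalman or particle filter) with a motion model between updates; the static merge above isolates only the fusion step.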
C6. Also provided is the method of paragraph C1, further including: observing a current reference point in the environment a selected number of times using an embedded data system while the mobile platform is at a current location; identifying an initial relative distance between the current location of the mobile platform and the current reference point; and moving the mobile platform from the current location to a new location as far as possible toward a desired location without losing the current reference point from within a field of view of a sensor system of the embedded data system.

C7. Also provided is the method of paragraph C6, further including: re-observing the current reference point the selected number of times using the embedded data system; identifying a new relative distance between the new location of the mobile platform and the current reference point; computing a difference between the initial relative distance and the new relative distance; and identifying the new location of the mobile platform in the environment using the difference.

C8. Also provided is the method of paragraph C7, further including: determining whether the new location is at the desired location within selected tolerances; and moving the mobile platform closer to the desired location using a new reference point in response to a determination that the new location is not at the desired location within selected tolerances.

C9. Also provided is the method of paragraph C8, wherein moving the mobile platform closer to the desired location using the new reference point includes: searching for the new reference point using the embedded data system; observing the new reference point the selected number of times using the embedded data system while the mobile platform is at the new location; and identifying a relative distance between the new location of the mobile platform and the new reference point.

C10. Also provided is the method of paragraph C8, wherein moving the mobile platform closer to the desired location using the new reference point further includes: re-identifying the new reference point as the current reference point, the new location as the current location, and the relative distance as the initial relative distance; and repeating the steps of moving the mobile platform from the current location to the new location, identifying the new relative distance between the new location of the mobile platform and the current reference point, computing the difference between the initial relative distance and the new relative distance, identifying the new location of the mobile platform in the environment using the difference, and determining whether the new location is at the desired location within selected tolerances.

C11. Also provided is the method of paragraph C10, further including: repeating the step of moving the mobile platform closer to the desired location using the new reference point in response to a determination that the new location is not at the desired location within selected tolerances, until the mobile platform reaches the desired location within selected tolerances.
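Paragraphs C6 through C11 describe a loop: observe a reference point several times, move toward the goal without losing the point from the sensor's field of view, re-observe, and take the change in relative distance as the displacement. The simulation below is a loose sketch of that loop under stated assumptions: planar motion, observations reduced to 2-D offsets, a fixed sensor range standing in for the field-of-view limit, and invented noise levels and tolerances. Averaging the repeated observations also illustrates paragraph A4, since the standard error of the mean falls roughly as one over the square root of the number of observations.

```python
import numpy as np

rng = np.random.default_rng(7)

def observe(landmark, robot, n_obs, noise=0.05):
    """Average n_obs noisy observations of the landmark's offset from
    the robot; more observations mean a smaller averaged error."""
    samples = landmark - robot + rng.normal(0.0, noise, size=(n_obs, 2))
    return samples.mean(axis=0)

def hop_to_goal(robot, goal, landmarks, n_obs=10, sensor_range=1.5,
                tolerance=0.05, max_hops=50):
    """Sketch of the C6-C11 loop: pick the nearest reference point,
    note its relative offset, step toward the goal by at most
    sensor_range (a stand-in for keeping the point in view),
    re-observe, and treat the offset difference as the displacement."""
    estimate = robot.copy()
    for _ in range(max_hops):
        remaining = goal - estimate
        distance = np.linalg.norm(remaining)
        if distance <= tolerance:
            break                                  # within selected tolerances
        landmark = min(landmarks, key=lambda lm: np.linalg.norm(lm - estimate))
        before = observe(landmark, robot, n_obs)   # initial relative distance
        robot = robot + remaining / distance * min(sensor_range, distance)
        after = observe(landmark, robot, n_obs)    # re-observe the same point
        estimate = estimate + (before - after)     # difference gives the new location
    return robot, estimate

landmarks = [np.array([1.0, 2.0]), np.array([4.0, 1.0]), np.array([6.0, 3.0])]
true_pose, estimated = hop_to_goal(np.array([0.0, 0.0]), np.array([6.0, 2.5]), landmarks)
print(true_pose, estimated)
```

Switching to the nearest reference point at each hop is this sketch's stand-in for paragraphs C8 to C10, where a new reference point is adopted whenever the current one can no longer carry the platform all the way to the desired location.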
Claims:
Claims (13)

[0001] 1. Apparatus for guiding a mobile platform within an environment, characterized in that it comprises: a plurality of data systems (128) configured to generate a plurality of data streams (126), in which the plurality of data streams includes a number of first type data streams (132) and a number of second type data streams (130), wherein each data point in the second type data streams does not include an uncertainty measure; a modifier (120) configured to apply a probability distribution (137) to the second type data streams to form a plurality of modified data streams (135); and a position estimator (118) embedded in a mobile platform (104) and configured to receive and merge the first type data streams and the modified data streams to generate a position estimate (122) with a desired level of accuracy for the mobile platform with respect to an environment (100) around the mobile platform, wherein the apparatus is configured to: observe a current reference point in the environment a selected number of times using an embedded data system (134) while the mobile platform is at a current location; identify an initial relative distance between the current location of the mobile platform and the current reference point; and move the mobile platform from the current location to a new location as far as possible toward a desired location without losing the current reference point from within the field of view of a sensor system of the embedded data system.

[0002] 2. Apparatus according to claim 1, characterized in that at least one of the number of second type data streams is received from an odometry system in the plurality of data systems.

[0003] 3. Apparatus according to any one of claims 1 or 2, characterized in that the plurality of data systems includes a number of embedded data systems and a number of external data systems.

[0004] 4. Apparatus according to any one of claims 1 to 3, characterized in that the position estimate includes a position and an orientation of the mobile platform with respect to the environment.

[0005] 5. Apparatus according to any one of claims 1 to 4, characterized in that the mobile platform is a mobile robot and the environment is an industrial environment.

[0006] 6. Apparatus according to any one of claims 1 to 5, characterized in that the position estimator is configured to merge the number of first type data streams and the number of modified data streams using a Bayesian estimation algorithm to generate the position estimate.

[0007] 7. Apparatus according to any one of claims 1 to 6, characterized in that it further comprises: a controller configured to use the position estimate to guide the mobile platform along a path within the environment.

[0008] 8.
Method for guiding a mobile platform (104) within an environment (100), the method characterized in that it comprises: generating a number of first type data streams (132) and a number of second type data streams (130) using a plurality of data systems (128), wherein each data point in the first type data streams includes an uncertainty measure, and wherein each data point in the second type data streams does not include an uncertainty measure; applying a probability distribution (137) to the second type data streams to form a plurality of modified data streams (135); merging the first type data streams and the modified data streams to generate a position estimate (122) with a desired level of accuracy for the mobile platform with respect to the environment around the mobile platform; observing a current reference point in the environment a selected number of times using an embedded data system (134) while the mobile platform is at a current location; identifying an initial relative distance between the current location of the mobile platform and the current reference point; and moving the mobile platform from the current location to a new location as far as possible toward a desired location without losing the current reference point from within the field of view of a sensor system of the embedded data system.

[0009] 9. Method according to claim 8, characterized in that it further comprises: guiding the mobile platform along a path in the environment using the position estimate generated for the mobile platform.

[0010] 10. Method according to any one of claims 8 or 9, characterized in that applying the probability distribution to each of the number of second type data streams to form the number of modified data streams comprises: applying an empirical covariance to each data point in each of the number of second type data streams to form the number of modified data streams.

[0011] 11. Method according to any one of claims 8 to 10, characterized in that merging the number of first type data streams and the number of modified data streams comprises: merging the number of first type data streams and the number of modified data streams using a Bayesian estimation algorithm to generate the position estimate with the desired level of accuracy.

[0012] 12. Method according to any one of claims 8 to 11, characterized in that generating the number of first type data streams and the number of second type data streams comprises: generating the number of second type data streams using at least one of a color and depth odometry system, a wheel odometry system, a visual odometry system, a light detection and ranging system, or a localization and mapping system.

[0013] 13. Method according to any one of claims 8 to 12, characterized in that it further comprises: re-observing the current reference point the selected number of times using the embedded data system; identifying a new relative distance between the new location of the mobile platform and the current reference point; computing a difference between the initial relative distance and the new relative distance; and identifying the new location of the mobile platform in the environment using the difference.
Similar technologies:
Publication number | Publication date | Title
BR102015007157B1 | 2021-07-27 | APPARATUS AND METHOD FOR GUIDING A MOBILE PLATFORM WITHIN AN ENVIRONMENT
Huai et al. (2018) | Robocentric visual-inertial odometry
Filion et al. (2018) | Robot calibration using a portable photogrammetry system
Bloesch et al. (2013) | State estimation for legged robots - consistent fusion of leg kinematics and IMU
Li et al. (2014) | LIDAR/MEMS IMU integrated navigation method for a small UAV in indoor environments
Huang et al. (2014) | Towards consistent visual-inertial navigation
Fethi et al. (2018) | Simultaneous localization, mapping, and path planning for unmanned vehicle using optimal control
Yang et al. (2019) | Observability analysis of aided INS with heterogeneous features of points, lines, and planes
Schauwecker et al. (2012) | Markerless visual control of a quad-rotor micro aerial vehicle by means of on-board stereo processing
Demim et al. (2017) | Cooperative SLAM for multiple UGVs navigation using SVSF filter
Caruso et al. (2017) | Robust indoor/outdoor navigation through magneto-visual-inertial optimization-based estimation
Ding et al. (2021) | VID-Fusion: Robust visual-inertial-dynamics odometry for accurate external force estimation
Chen et al. (2019) | Perception system design for low-cost commercial ground robots: sensor configurations, calibration, localization and mapping
Zhang et al. (2013) | Cumulative error estimation from noisy relative measurements
Alaeddini et al. (2013) | Autonomous state estimation using optic flow sensing
Shin et al. (2014) | Precision improvement of MEMS gyros for indoor mobile robots with horizontal motion inspired by methods of TRIZ
Perez-Paina et al. (2016) | Fusion of monocular visual-inertial measurements for three dimensional pose estimation
Martínez-García et al. (2019) | Divergent trinocular vision observers design for extended Kalman filter robot state estimation
Lee et al. (2020) | Analysis on the performance of VIO according to Trajectory Planning of UAV
Sadeghzadeh-Nokhodberiz et al. (2015) | Interconnected maximum likelihood estimator and extended Kalman filter for inertial measurement unit calibration fusing three-dimensional camera information
Guráň et al. (2014) | Localization of iRobot Create using inertial measuring unit
Chen et al. (2018) | Multi-Sensor Fusion Technology in Inertial Navigation System Using Factor Graph
Quan (2017) | State Estimation
Wan et al. (2016) | Multi-sensor devices for UAV in both simulation platform and realistic tests
Panahandeh et al. (2013) | Observability analysis of a vision-aided inertial navigation system using planar features on the ground: Supplemental material
Patent family:
Publication number | Publication date
JP6804825B2 | 2020-12-23
CN104977941B | 2019-04-30
EP2927769A2 | 2015-10-07
US20150283700A1 | 2015-10-08
KR102359186B1 | 2022-02-04
EP2927769B1 | 2019-12-04
BR102015007157A2 | 2016-03-29
EP2927769A3 | 2016-07-20
KR20150114892A | 2015-10-13
US9427867B2 | 2016-08-30
CN104977941A | 2015-10-14
JP2015197925A | 2015-11-09
CA2883622C | 2017-07-11
PT2927769T | 2020-02-14
ES2770730T3 | 2020-07-02
CA2883622A1 | 2015-10-02
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
EP0570581B1 | 1991-11-01 | 2001-12-05 | Motorola, Inc. | An improved accuracy sensory system for vehicle navigation
JP2004133882A | 2002-05-10 | 2004-04-30 | Royal Appliance Mfg Co | Autonomous multi-platform robot system
AU2003300959A1 | 2002-12-17 | 2004-07-22 | Evolution Robotics, Inc. | Systems and methods for visual simultaneous localization and mapping
KR100506097B1 | 2004-02-04 | 2005-08-03 | 삼성전자주식회사 | Method and apparatus for making magnetic field map and method and apparatus for checking pose of the moving body using the map
US7689321B2 | 2004-02-13 | 2010-03-30 | Evolution Robotics, Inc. | Robust sensor fusion for mapping and localization in a simultaneous localization and mapping system
US7831094B2 | 2004-04-27 | 2010-11-09 | Honda Motor Co., Ltd. | Simultaneous localization and mapping using multiple view feature descriptors
US7317836B2 | 2005-03-17 | 2008-01-08 | Honda Motor Co., Ltd. | Pose estimation based on critical point analysis
WO2008013568A2 | 2005-12-30 | 2008-01-31 | Irobot Corporation | Autonomous mobile robot
KR100809352B1 | 2006-11-16 | 2008-03-05 | 삼성전자주식회사 | Method and apparatus of pose estimation in a mobile robot based on particle filter
US8351646B2 | 2006-12-21 | 2013-01-08 | Honda Motor Co., Ltd. | Human pose estimation and tracking using label assignment
KR100877071B1 | 2007-07-18 | 2009-01-07 | 삼성전자주식회사 | Method and apparatus of pose estimation in a mobile robot based on particle filter
CN101576384B | 2009-06-18 | 2011-01-05 | 北京航空航天大学 | Indoor movable robot real-time navigation method based on visual information correction
KR101083394B1 | 2009-10-30 | 2011-11-14 | 주식회사 유진로봇 | Apparatus and Method for Building and Updating a Map for Mobile Robot Localization
CN101712381B | 2009-11-13 | 2013-01-02 | 北京航空航天大学 | Multi-sensor-based attitude determination system
US8401225B2 | 2011-01-31 | 2013-03-19 | Microsoft Corporation | Moving object segmentation using depth images
US8401242B2 | 2011-01-31 | 2013-03-19 | Microsoft Corporation | Real-time camera tracking using depth maps
US9864909B2 | 2014-04-25 | 2018-01-09 | Huntington Ingalls Incorporated | System and method for using augmented reality display in surface treatment procedures
US9947138B2 | 2014-04-15 | 2018-04-17 | Huntington Ingalls Incorporated | System and method for augmented reality display of dynamic environment information
WO2015164755A1 | 2014-04-25 | 2015-10-29 | Huntington Ingalls Incorporated | Augmented reality display of dynamic target object information
US10504294B2 | 2014-06-09 | 2019-12-10 | Huntington Ingalls Incorporated | System and method for augmented reality discrepancy determination and reporting
WO2015191346A1 | 2014-06-09 | 2015-12-17 | Huntington Ingalls Incorporated | System and method for augmented reality display of electrical system information
US10915754B2 | 2014-06-09 | 2021-02-09 | Huntington Ingalls Incorporated | System and method for use of augmented reality in outfitting a dynamic structural space
WO2016011149A1 | 2014-07-16 | 2016-01-21 | Huntington Ingalls Incorporated | System and method for augmented reality display of hoisting and rigging information
CN105180955A | 2015-10-21 | 2015-12-23 | 福州华鹰重工机械有限公司 | Real-time precise positioning method and real-time precise positioning device of motor vehicles
CN105938619A | 2016-04-11 | 2016-09-14 | 中国矿业大学 | Visual odometer realization method based on fusion of RGB and depth information
US10353400B2 | 2016-05-23 | 2019-07-16 | Asustek Computer Inc. | Navigation system and navigation method
CN106227220A | 2016-09-28 | 2016-12-14 | 关健生 | Independent navigation crusing robot based on Distributed Architecture
CN107160419A | 2017-06-09 | 2017-09-15 | 东华大学 | A kind of robot movement mechanism of electromyographic signal control
CN109866217B | 2017-12-01 | 2021-09-17 | 深圳市优必选科技有限公司 | Robot mileage positioning method, device, terminal equipment and computer storage medium
CN110286682A | 2019-07-08 | 2019-09-27 | 国网山东省电力公司枣庄供电公司 | A kind of electric power storage Multifunctional security sniffing robot, method and system
KR102259247B1 | 2019-12-06 | 2021-06-01 | 한국로봇융합연구원 | Strong Localization Method and System through Convergence of Multiple Localization Based on Covariance Matrix
Legal status:
2016-03-29 | B03A | Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]
2018-10-30 | B06F | Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]
2020-04-22 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2021-06-01 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2021-07-27 | B16A | Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]. Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 03/30/2015, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Filing date
US14/243,419 | 2014-04-02