Patent Abstract:
METHODS AND SYSTEMS FOR INSPECTING A WORKPIECE The present invention relates to methods (300) and systems (100) for inspecting a workpiece (108). The method includes entering model data associated with the workpiece (108) into an inspection system, determining a relative position of a depth capture device (102) in relation to the workpiece (108), and calibrating a pose view for the inspection system in relation to the model based on the position of the depth capture device in relation to the workpiece (108). The method further includes measuring actual depth distance data of at least one pixel of the depth capture device in relation to the workpiece (108) and determining, based on the actual depth distance data, whether the workpiece (108) meets predetermined inspection criteria.
Publication number: BR102013021722B1
Application number: R102013021722-0
Filing date: 2013-08-26
Publication date: 2020-12-01
Inventors: Joseph D. Doyle; Paul R. Davies
Applicant: The Boeing Company
IPC primary class:
Patent Description:

Background of the Invention
[001] The present invention relates to non-destructive inspection of workpieces and, more particularly, to methods and systems for the automated inspection of a workpiece.
[002] The production or assembly of a workpiece may require the use of multiple parts and tools. It is desirable to carry out an accurate and reliable inspection of the assembly during and after construction to ensure production quality by identifying potential problems, such as assembly errors, damage to a component in the assembly, and/or foreign object debris (FOD).
[003] At least some known inspections of assemblies are carried out manually. In such inspections, inspectors typically use visual and tactile inspections, in combination with personal knowledge, to compare the workpiece with a model or drawing. However, manual inspections often require inspectors to visually and physically examine the workpiece to detect differences between the assembly and the model. As such, manual inspections introduce possibilities for human error. For example, large assemblies, such as aircraft, trains, automobiles, and ships, may include small objects in areas that have limited accessibility, and such areas may therefore be difficult for inspectors to inspect. In addition, manual inspections can be time-consuming and/or require expensive skilled labor. In some cases, inspectors may use a small optical inspection tool, such as a borescope, or X-ray technology; however, such inspection methods still introduce a potential for human error, as they require inspectors to visually recognize the differences between the workpiece and the model. In addition, X-ray technology can only be used on a limited group of materials, as some materials are not visible using X-ray technology.
[004] Other known inspection methods use automated image processing to perform inspections. During such inspections, images of a workpiece are captured and analyzed using image processing, so that features within the images are compared with a library of standard features. Features can be identified using edge fitting, color matching, and resizing of flexible objects. However, such inspection methods can still introduce inaccuracies when identifying small parts and inspecting objects of the same color. In addition, the areas that can be inspected using such technologies may be limited.
[005] US 2006/0265177 describes a method for determining 3D coordinates of an object's surface. A tracking system tracks a 3D measuring device that measures the surface of an object.
Brief Description of the Invention
[006] In one aspect, a method is provided for inspecting a workpiece. The method includes entering model data associated with the workpiece into an inspection system, determining a relative position of a depth capture device in relation to the workpiece, and calibrating a pose view for the inspection system in relation to the model based on the position of the depth capture device in relation to the workpiece. The method further includes measuring actual depth distance data of at least one pixel of the depth capture device in relation to the workpiece and determining, based on the actual depth distance data, whether the workpiece meets predetermined inspection criteria.
[007] In another aspect, a computer system is provided for inspecting a workpiece. The computer system includes a processor and a computer-readable storage device that encodes computer-readable instructions that are executable by the processor to perform functions. The processor performs functions that include inserting model data associated with the workpiece into the storage device and determining a relative position of a depth capture device in relation to the workpiece. The processor additionally performs functions that include calibrating a pose view for the computer system in relation to the model based on the position of the depth capture device in relation to the workpiece. The processor additionally performs functions that include measuring actual depth distance data of at least one pixel of the depth capture device in relation to the workpiece and determining, based on the actual depth distance data, whether the workpiece satisfies predetermined inspection criteria.
[008] According to another aspect, a system is provided for inspecting a workpiece. The system includes a depth capture device configured to measure actual depth distance data of at least one pixel from the depth capture device in relation to the workpiece. The system also includes a pose detection system configured to determine a relative position of the depth capture device in relation to the workpiece. The system also includes an inspection computer system in communication with the depth capture device and the pose detection system. The inspection computer system is programmed to enter model data associated with the workpiece and calibrate a pose view for the inspection system against the model based on the position of the depth capture device in relation to the workpiece. The inspection computer system is additionally programmed to determine, based on the actual depth distance data, whether the workpiece meets predetermined inspection criteria.
[009] In accordance with one aspect of the present description, a method is provided for inspecting a workpiece, said method comprising inserting model data associated with the workpiece into an inspection system, determining a relative position of a depth capture device in relation to the workpiece, calibrating a pose view for the inspection system in relation to the model based on the position of the depth capture device in relation to the workpiece, measuring actual depth distance data of at least one pixel of the depth capture device in relation to the workpiece, and determining, based on the actual depth distance data, whether the workpiece meets predetermined inspection criteria. Advantageously, inserting the model data comprises inserting computer-aided design model data associated with the workpiece. Advantageously, inserting the model data may further comprise inserting a previously scanned model of the workpiece.
[0010] Advantageously, determining the relative position of the depth capture device comprises defining a coordinate system origin at a designated position of the workpiece and determining a location of the depth capture device in relation to the workpiece using a plurality of position detection cameras. Advantageously, the method may further comprise transmitting the location of the depth capture device to the inspection system.
[0011] Advantageously, measuring the actual depth distance includes moving the depth capture device around the workpiece to capture a plurality of actual depth distances, said method further comprising comparing the plurality of actual depth distances with a plurality of associated model depth distances to identify non-equivalent portions.
[0012] Advantageously, the determination based on the actual depth distance data further comprises calculating model depth distance data for the pose view of the inspection system, where the model depth distance data represents a model depth distance from the inspection system pose view to the model workpiece, determining whether the actual depth distance differs from the model depth distance, and determining whether the difference meets the inspection criteria when the actual depth distance differs from the model depth distance, where compliance with the limit indicates that the workpiece is in an acceptable condition. Furthermore, the method may comprise displaying a portion of the workpiece that is not present in the model workpiece using an overlay applied to a view of the depth capture device. In addition, the overlay can be configured to illustrate the presence of at least one of foreign object debris, a missing component, an assembly error, and a damaged component. Advantageously, measuring the actual depth distance data further comprises generating a three-dimensional point cloud.
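By way of illustration only, the per-pixel comparison described above can be sketched as follows. This is a minimal sketch, not the patented implementation; the function name and the 5 mm tolerance are illustrative assumptions.

```python
def find_nonconforming_pixels(actual_depths, model_depths, limit=0.005):
    """Return indices of pixels whose measured (actual) depth differs from
    the model depth by more than `limit` (same units as the inputs).
    `limit` stands in for the predetermined inspection limit."""
    flagged = []
    for i, (actual, model) in enumerate(zip(actual_depths, model_depths)):
        if abs(actual - model) > limit:
            flagged.append(i)
    return flagged

# Example: pixel 2 deviates by 0.020 m, beyond the 0.005 m tolerance.
mask = find_nonconforming_pixels([1.000, 1.001, 1.020],
                                 [1.000, 1.000, 1.000])
# mask == [2]
```

Flagged indices correspond to the "non-equivalent portions" of the workpiece; compliance (an empty list) indicates an acceptable condition.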
[0013] According to an example, a computer system is provided for inspecting a workpiece, said computer system comprising a processor and a computer-readable storage device having encoded thereon computer-readable instructions that are executable by the processor to perform functions that include inserting model data associated with the workpiece into the storage device, determining a relative position of a depth capture device in relation to the workpiece, calibrating a pose view for the computer system in relation to the model based on the position of the depth capture device in relation to the workpiece, measuring actual depth distance data of at least one pixel of the depth capture device in relation to the workpiece, and determining, based on the actual depth distance data, whether the workpiece meets predetermined inspection criteria.
[0014] Advantageously, the functions performed by the processor additionally include inserting at least one of computer-aided design model data associated with the workpiece and a previously scanned model of the workpiece. Advantageously, the functions performed by the processor additionally comprise calculating model depth distance data for the pose view of the computer system, where the model depth distance data represents a model depth distance from the pose view of the computer system to the model workpiece, determining whether the actual depth distance differs from the model depth distance, and determining whether the difference meets the inspection criteria when the actual depth distance differs from the model depth distance, where compliance with the limit indicates that the workpiece is in an acceptable condition. In addition, the functions performed by the processor may comprise displaying a portion of the workpiece that is not present in the model workpiece using an overlay applied to a view of the depth capture device.
[0015] In accordance with an additional aspect of the present description, a system is provided for inspecting a workpiece, said system comprising a depth capture device configured to measure actual depth distance data of at least one pixel of the depth capture device relative to the workpiece, a pose detection system configured to determine a relative position of the depth capture device relative to the workpiece, and an inspection computer system in communication with the depth capture device and the pose detection system, the inspection computer system being programmed to insert model data associated with the workpiece, calibrate a pose view for the inspection computer system against the model based on the position of the depth capture device in relation to the workpiece, and determine, based on the actual depth distance data, whether the workpiece meets predetermined inspection criteria.
[0016] Advantageously, to insert the model data, the inspection computer system is additionally programmed to insert computer-aided design model data associated with the workpiece. Furthermore, to insert the model data, the inspection computer system can be programmed to store a previously scanned model of the workpiece.
[0017] Advantageously, the inspection computer system can be further programmed to calculate model depth distance data for the pose view of the inspection computer system, where the model depth distance data represents a model depth distance from the pose view of the inspection computer system to the model workpiece, determine whether the actual depth distance differs from the model depth distance, and determine whether the difference meets the inspection criteria when the actual depth distance differs from the model depth distance, where compliance with the limit indicates that the workpiece is in an acceptable condition. In addition, the overlay can be displayed in real time or near real time. Furthermore, the inspection computer system can be configured to display a portion of the workpiece that is not present in the model workpiece using an overlay applied to a view of the depth capture device.
Brief Description of the Drawings
[0018] Figure 1 is a schematic illustration of an exemplary inspection system.
[0019] Figure 2 is a schematic illustration of an exemplary inspection computer system that can be used with the inspection system shown in Figure 1.
[0020] Figure 3 is a flow chart of an exemplary method that can be implemented by the inspection system shown in Figure 1.
Detailed Description of the Invention
[0021] This description relates generally to the inspection of workpieces and, more particularly, to methods and systems that allow automated inspection of a workpiece. In one embodiment, an inspection system includes a depth capture device that measures actual depth distance data for at least one pixel of the depth capture device, a position and orientation (pose) detection system that determines the pose of the depth capture device in relation to the workpiece, and an inspection computer system that is coupled to the depth capture device and the pose detection system. The embodiments of the methods and systems described in this document allow the computing system to (i) enter the model data associated with the workpiece, (ii) determine a relative position of the depth capture device in relation to the workpiece, (iii) calibrate a pose view for the inspection computer system against the model based on the position of the depth capture device in relation to the workpiece, (iv) measure actual depth distance data of at least one pixel of the depth capture device in relation to the workpiece, and (v) determine, based on the actual depth distance data, whether a predetermined limit in relation to the workpiece has been exceeded.
[0022] The methods and systems described in this document may be implemented using computer engineering or programming techniques that include computer software, firmware, hardware, or any combination or subset of these, where the technical effects may include at least one of: a) loading model data for the workpiece onto the inspection computer system; b) determining the pose data of a depth capture device in relation to the workpiece being inspected; c) calibrating a pose view for the inspection computer system in relation to the model and the pose of the depth capture device in relation to the workpiece being inspected; d) measuring actual depth distance data for at least one pixel, where the actual depth distance data represents an actual depth distance between the depth capture device and the workpiece being inspected; e) calculating model depth distance data for the pose view of the inspection computer system, where the model depth distance data represents a model depth distance from the inspection computer system pose view to the model workpiece; f) comparing the actual depth distance data with the model depth distance data; g) determining whether the actual depth distance differs from the model depth distance and determining whether the difference exceeds a predetermined limit; and h) displaying a portion of the workpiece that is not present in the model workpiece using an overlay applied to a view of the depth capture device.
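The technical effects a) through h) above form a pipeline. By way of illustration only, the sketch below strings them together using stand-in callables; every name and value is an assumption made for clarity, not the actual system's interface.

```python
def inspect(model_depths_for, measure_actual, compute_pose, limit=0.005):
    """One inspection pass: pose -> measure -> model compare -> overlay mask.
    `model_depths_for`, `measure_actual`, and `compute_pose` are illustrative
    stand-ins for the model renderer, depth capture device, and pose
    detection system, respectively."""
    pose = compute_pose()                    # b) pose of the capture device
    actual = measure_actual(pose)            # d) per-pixel actual depths
    model = model_depths_for(pose)           # e) model depths for that pose view
    # f)-g) compare, flagging pixels whose difference exceeds the limit
    overlay = [abs(a - m) > limit for a, m in zip(actual, model)]
    return overlay                           # h) mask to display as an overlay

# Toy usage with constant stand-ins: the second pixel deviates by 0.02.
overlay = inspect(lambda pose: [1.00, 1.00],
                  lambda pose: [1.00, 1.02],
                  lambda: (0.0, 0.0, 0.0))
# overlay == [False, True]
```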
[0023] As used in this document, an element or step cited in the singular and preceded by the word "a" or "an" should be understood not to exclude plural elements or steps, unless such exclusion is explicitly stated. In addition, references to "one embodiment" and/or an "exemplary embodiment" are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
[0024] Figure 1 is a schematic illustration of an example inspection system 100 that can be used to inspect an assembly or workpiece 108. Generally, the workpiece is the product of an engineering environment in which the elements of the structure are assembled in a predefined manner, so that the constituent elements are positioned and oriented in a predefined manner with respect to each other and to the workpiece as a whole. The inspection system 100 can be used in a wide variety of applications. For example, inspection system 100 can be used to inspect large assemblies, such as aircraft, trains, ships, or any other large assembly that has numerous elements. Alternatively, inspection system 100 can also be used to inspect small assemblies, such as tools or gas/fluid tubes, and the like.
[0025] As shown in Figure 1, the inspection system 100 includes a depth capture device 102, a pose detection system 104, and an inspection computer system 106. As used herein, the term "pose" is defined as a position and orientation of one object in relation to another. The inspection system 100 is used to inspect a workpiece, for example, workpiece 108, as described in more detail below, and the depth capture device 102 communicates with the inspection computer system 106. The depth capture device 102 transmits a signal 110 indicating a distance D between the depth capture device 102 and workpiece 108 for each pixel in a field of view of the depth capture device 102. The pose detection system 104 communicates with the inspection computer system 106 and transmits a signal 112 indicative of a pose of the depth capture device 102 in relation to workpiece 108. Alternatively, or in addition, the depth capture device 102 and pose detection system 104 may include a transmitter, a transceiver, and/or any other signal transmission device that allows the inspection system 100 to work as described in this document.
[0026] The depth capture device 102 can be any suitable depth capture device or camera capable of measuring a real distance between the depth capture device 102 and the workpiece 108. In some embodiments, the depth capture device 102 is a laser or 3D light depth capture device. In one embodiment, the depth capture device 102 determines the actual distance data by calculating the two-way travel time of a laser beam transmitted towards workpiece 108 and reflected from workpiece 108. In another embodiment, the depth capture device 102 projects an infrared (IR) pattern towards workpiece 108. The depth capture device 102 includes an infrared camera (not shown) that captures an image of the IR pattern. The depth data is then determined by comparing the expected IR pattern with the actual IR pattern seen by the depth capture device 102. Alternatively, to calculate the distance, the depth capture device 102 can determine a phase difference of the laser beam. From the travel time or phase difference, the depth capture device 102 determines the distance and generates 3D coordinate components (i.e., points on the X, Y, and Z geometric axes) in a point cloud in which multiple points are grouped.
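By way of illustration only, the two ranging principles mentioned above (two-way travel time and phase difference) reduce to standard formulas; the sketch below is a minimal example using illustrative function names, not the device's firmware.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds):
    """Time-of-flight ranging: the beam covers the distance twice,
    so d = c * t / 2."""
    return C * t_seconds / 2.0

def distance_from_phase(delta_phi_rad, modulation_hz):
    """Phase-shift ranging: d = c * delta_phi / (4 * pi * f), valid
    within one phase-ambiguity interval of the modulation frequency."""
    return C * delta_phi_rad / (4.0 * math.pi * modulation_hz)

# A 20 ns round trip corresponds to roughly 3 m.
d = distance_from_round_trip(20e-9)   # ~2.998 m
```

Each per-pixel distance, combined with the pixel's viewing ray, yields one X, Y, Z point of the point cloud.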
[0027] In the exemplary embodiment, the depth capture device 102 communicates with the inspection computer system 106 via a wired connection or via wireless transmissions, and transmits the actual depth distance data to the inspection computer system 106. In the exemplary embodiment, the depth capture device 102 includes an image processor that allows a real-time or substantially real-time video image of any object within its field of view to be generated. In an alternative embodiment, the depth capture device 102 can capture and store images of any object within its field of view. During use, in the exemplary embodiment, a user manually positions the depth capture device 102 at a desired location in relation to workpiece 108. Because the depth capture device 102 generates a video image, the user can move the depth capture device 102 in relation to the workpiece 108 without causing an error or inaccuracy in the inspection. In alternative embodiments, the depth capture device 102 can be positioned using automated control devices, or the depth capture device 102 can remain stationary while workpiece 108 is moved relative to the depth capture device 102.
[0028] In the exemplary embodiment, the pose detection system 104 determines a pose of the depth capture device 102 in relation to the workpiece 108. More specifically, in the exemplary embodiment, the pose detection system 104 includes a processor that allows the pose detection system 104 to determine the pose of the depth capture device 102 in real time, or near real time. The pose detection system 104 communicates with the inspection computer system 106 via a wired connection or via wireless transmissions.
[0029] The pose detection system 104 can determine the position of the depth capture device 102 using different methods. In the exemplary embodiment, the pose detection system 104 is a motion capture system that includes a plurality of cameras 116 positioned around workpiece 108. A plurality of small reflective markers 118 is attached to each object being tracked (that is, to the depth capture device 102 and the workpiece 108). Such markers 118 facilitate the calibration of the pose of the depth capture device 102 in relation to workpiece 108. Cameras 116 emit near-infrared light around workpiece 108, which is reflected back from markers 118. In the exemplary embodiment, because workpiece 108 remains stationary during the inspection process, it is calibrated at an origin (0, 0, 0) of the coordinate system. When multiple cameras 116 observe a reflective marker 118, the pose detection system 104 can determine, i.e., essentially triangulate, a position of that marker 118 in 3D space. In addition, when multiple markers 118 are attached to the same object, the pose detection system 104 can also determine a relative orientation of that object.
Other systems and methods of determining the pose of the depth capture device 102 may include, but are not limited to, marker-based tracking, two-dimensional (2D) planar natural feature tracking, 3D model-based tracking, 3D depth sensing, 3D tracking using an iterative closest point algorithm, mechanical tracking devices that physically connect the depth capture device 102 to a reference location (i.e., the marker on workpiece 108), magnetic tracking devices that determine an intensity and location of a pulsed magnetic field, sourceless non-inertial tracking devices that use passive magnetic sensors referenced to the earth's magnetic field, optical tracking devices, acoustic tracking devices, and/or any other tracking device, combination of devices, or method that allows the pose of the depth capture device 102 to be determined.
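By way of illustration only, the triangulation of a marker observed by two cameras can be sketched as the midpoint of the shortest segment between the two sighting rays. This is a minimal pure-Python sketch under assumed inputs (ray origins and directions in workpiece coordinates); function names are illustrative.

```python
def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t*d1 and
    o2 + s*d2 (standard closest-point-between-lines solution)."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def add(a, b): return [x + y for x, y in zip(a, b)]
    def scale(a, k): return [x * k for x in a]

    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # zero when the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t))       # closest point on ray 1
    p2 = add(o2, scale(d2, s))       # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# Two cameras on the X axis both sighting a marker at (0, 0, 5):
p = triangulate([-1, 0, 0], [1, 0, 5], [1, 0, 0], [-1, 0, 5])
# p == [0.0, 0.0, 5.0]
```

With three or more markers per tracked object, the same positions also yield the object's orientation.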
[0030] Figure 2 is a schematic illustration of an exemplary inspection computer system 106 (shown in Figure 1) that can be used with inspection system 100 (shown in Figure 1). In the exemplary embodiment, the inspection computer system 106 includes a memory device 200 and a processor 202 coupled to the memory device 200 for use in executing instructions. More specifically, in the exemplary embodiment, the inspection computer system 106 is configurable to perform one or more operations described in this document by programming memory device 200 and/or processor 202. For example, processor 202 may be programmed by encoding an operation as one or more executable instructions and providing the executable instructions on the memory device 200.
[0031] Processor 202 may include one or more processing units (for example, in a multi-core configuration). As used in this document, the term "processor" is not limited to the integrated circuits referred to in the art as a computer, but broadly refers to a controller, a microcontroller, a microcomputer, a programmable logic controller (PLC), an application-specific integrated circuit, and other programmable circuits. In the exemplary embodiment, processor 202 is configured to load model workpiece data 120 (shown in Figure 1) for workpiece 108 (shown in Figure 1), receive pose data from the pose detection system 104 (shown in Figure 1), calibrate a pose view for the inspection computer system 106 in relation to the model and the pose of the depth capture device 102 (shown in Figure 1) in relation to workpiece 108, receive the actual depth distance data from the depth capture device 102, calculate model workpiece 120 depth distance data for the pose view of the inspection computer system 106, where the model depth distance data represents a model depth distance from the pose view of the inspection computer system 106 to the model workpiece 120, and compare the actual depth distance data with the model depth distance data.
[0032] In the exemplary embodiment, the memory device 200 includes one or more devices (not shown) that allow information, such as executable instructions and/or other data, to be selectively stored and retrieved. In the exemplary embodiment, such data may include, but is not limited to, pose data, positional data, directional data, previously scanned model workpiece data 120, computer-aided design (CAD) model data, GPS data, map data, graphic plan data, floor plan data, operational data, inspection limit data, and/or control algorithms. Alternatively, the inspection computer system 106 can be configured to use any algorithm and/or method that allows the methods and systems to function as described in this document. Memory device 200 may also include one or more computer-readable media, such as, without limitation, dynamic random access memory (DRAM), static random access memory (SRAM), a solid state disk, and/or a hard disk. In the exemplary embodiment, memory device 200 stores data related to the inspection process, for example, previously scanned model workpiece data 120, CAD model data for workpiece 108, and/or inspection limit data. Point clouds detected by depth capture device 102 can also be saved to memory device 200 and used as documentation of an as-built condition or a verified inspection of workpiece 108.
[0033] In the exemplary embodiment, the inspection computer system 106 includes a display interface 204 that is coupled to processor 202 for use in presenting information to a user. For example, the display interface 204 may include a display adapter (not shown) that can be attached to a display device (not shown), such as, without limitation, a cathode ray tube (CRT), a liquid crystal display (LCD), a light-emitting diode (LED) screen, an organic LED (OLED) screen, an "electronic ink" screen, and/or a printer. In some embodiments, the display interface 204 includes one or more display devices. In the exemplary embodiment, processor 202 is configured to compare a distance D (shown in Figure 1) measured by the depth capture device 102 with a distance D2 (shown in Figure 1) calculated for the model workpiece 120 by the inspection computer system 106. If there is a difference, processor 202 compares the difference with the predetermined limit data stored in memory device 200. In the exemplary embodiment, when a difference between distances D and D2 exceeds the predetermined limit, processor 202 displays for a user a portion of workpiece 108 that is not present on the model workpiece 120 using the display interface 204. The overlay can be displayed using different methods. In one embodiment, only the portion of workpiece 108 that exceeds the predetermined limit is displayed. In another embodiment, the entire workpiece 108 is displayed, and the non-compliant portions of workpiece 108 are displayed in a different color than the rest of workpiece 108. Alternatively, any other display method that allows selected areas of a given workpiece with predetermined tolerances to be displayed can be used.
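By way of illustration only, the second overlay method described above (rendering non-compliant portions in a different color) can be sketched per pixel as follows. The colors, tolerance, and function name are illustrative assumptions; a flat list of pixels stands in for the camera image.

```python
OK_COLOR = (128, 128, 128)    # compliant pixels, rendered gray here
FLAG_COLOR = (255, 0, 0)      # portion absent from the model, in red

def render_overlay(actual_depths, model_depths, limit=0.005):
    """One output color per pixel: FLAG_COLOR where the measured depth D
    deviates from the model depth D2 by more than `limit`, OK_COLOR
    elsewhere."""
    return [FLAG_COLOR if abs(a - m) > limit else OK_COLOR
            for a, m in zip(actual_depths, model_depths)]

colors = render_overlay([0.50, 0.48, 0.50], [0.50, 0.50, 0.50])
# colors[1] == FLAG_COLOR: the middle pixel is highlighted for the user
```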
[0034] Inspection computer system 106, in the exemplary embodiment, includes an input interface 206 to receive input from the user. For example, in the exemplary embodiment, input interface 206 receives information suitable for use in any of the methods described in this document. Input interface 206 is coupled to processor 202 and may include, for example, a joystick, a keyboard, a pointing device, a mouse, a stylus, a touch panel (for example, a touch keyboard or a touch screen), and/or a position detector. It should be noted that a single component, for example, a touch screen, can function as both the display interface 204 and the input interface 206.
[0035] In the exemplary embodiment, the inspection computer system 106 includes a communication interface 208 that is coupled to processor 202. In the exemplary embodiment, the communication interface 208 communicates with at least one remote device, such as the depth capture device 102 and/or pose detection system 104. For example, the communication interface 208 can use, without limitation, a wired network adapter, a wireless network adapter, and/or a mobile telecommunications adapter. A network (not shown) used to couple inspection computer system 106 to the remote device may include, without limitation, the Internet, a local area network (LAN), a wide area network (WAN), a wireless LAN (WLAN), a mesh network, and/or a virtual private network (VPN), or other suitable means of communication.
[0036] Figure 3 is a flow chart of an example method 300 that can be implemented to inspect a workpiece, such as workpiece 108 (shown in Figure 1), using an inspection system, such as inspection system 100 (shown in Figure 1). During operation, in the exemplary embodiment, the inspection system 100 is operated by a user using the inspection computer system 106 (shown in Figure 1). Input interface 206 (shown in Figure 2) allows the user to enter 302 model workpiece data 120 associated with workpiece 108 into inspection computer system 106. In one embodiment, the model data is 3D CAD model data that is stored in memory device 200 (shown in Figure 2).
[0037] After insertion 302 of the model data, the inspection computer system 106 transmits 304 a signal 112 (shown in Figure 1) that requests that pose detection system 104 (shown in Figure 1) determine 306 a pose of the depth capture device 102 (shown in Figure 1) in relation to workpiece 108.
[0038] To determine 306 the pose of the depth capture device 102, in the exemplary embodiment, the user defines 308 a 3D coordinate system origin at a position on the workpiece 108. The pose detection system 104 determines 306 a pose of the depth capture device 102 relative to workpiece 108 using a plurality of position sensing cameras, such as cameras 116 (shown in Figure 1). The pose detection system 104 transmits 310 the pose data back to the inspection computer system 106 as signal 112.
[0039] Using the pose data from the depth capture device 102 and the model data for the workpiece 108 stored in the memory device 200, the inspection computer system 106 generates 312 a pose view for the inspection computer system 106 in relation to the model workpiece. The inspection computer system 106 calibrates 314 the pose view of the model workpiece 120 to the pose of the depth capture device 102, which allows the pose view of the model workpiece 120 displayed by the inspection computer system 106 to remain in sync with the view of the depth capture device 102 in relation to the workpiece 108 as the depth capture device 102 is repositioned around the workpiece 108.
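The synchronization described above amounts to transforming model points into the camera's coordinate frame whenever the tracked pose changes. By way of illustration only, a minimal sketch under assumed inputs (a rotation matrix R and translation t reported in workpiece coordinates); the function name is illustrative.

```python
def world_to_camera(point, R, t):
    """p_cam = R^T (p_world - t): the inverse of the camera-to-world pose,
    so a model point expressed in workpiece (world) coordinates is
    re-expressed in the depth capture device's frame."""
    d = [p - ti for p, ti in zip(point, t)]
    # Apply R^T: row i of R^T is column i of R.
    return [sum(R[r][i] * d[r] for r in range(3)) for i in range(3)]

# Camera at (0, 0, -2) with identity rotation, model point at (0, 0, 1):
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
p_cam = world_to_camera([0.0, 0.0, 1.0], R, [0.0, 0.0, -2.0])
# p_cam == [0.0, 0.0, 3.0]: the point sits 3 m in front of the camera
```

Re-rendering the model from each such pose keeps the displayed pose view locked to what the physical device sees.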
[0040] The user positions 316 the depth capture device 102 to inspect a desired portion of the workpiece 108. In the exemplary embodiment, the depth capture device 102 is positioned 316 manually by the user. In alternative embodiments, the depth capture device 102 can be positioned 316 by an automated positioning system, or it can remain stationary while workpiece 108 is moved for inspection.
[0041] After calibration 314 and positioning 316, depth capture device 102 measures 318 depth distance data to determine an actual depth distance between depth capture device 102 and workpiece 108 for each pixel of depth capture device 102. In one embodiment, the user can continuously move or sweep depth capture device 102 around workpiece 108 to capture a plurality of actual depth distances, allowing comparisons against a plurality of associated model depth distances to identify non-equivalent portions that may represent an assembly error and/or damage to workpiece 108. Depth capture device 102 transmits 324 the actual depth distance data as a signal 110 (shown in Figure 1) to the inspection computer system 106.
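The per-pixel depth measurement of step 318 yields a depth image; back-projecting each pixel through the sensor's optics gives the three-dimensional point cloud mentioned in claim 9. The sketch below assumes a pinhole camera model with hypothetical intrinsic parameters (fx, fy, cx, cy); the patent does not prescribe a particular camera model.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (one distance in meters per pixel)
    into a 3D point cloud in the sensor frame, using pinhole
    intrinsics: focal lengths (fx, fy) and principal point (cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

# Illustrative 2x2 depth frame and made-up intrinsics:
depth = np.array([[1.0, 1.0],
                  [2.0, 2.0]])
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

Sweeping the device around the workpiece produces a sequence of such frames; transforming each cloud by the pose from step 306 accumulates them in the common workpiece coordinate frame.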
[0042] The inspection computer system 106 then calculates 326 model workpiece 120 depth distance data representing a model depth distance between the pose view generated 312 by the inspection computer system 106 and the model workpiece 120. The inspection computer system 106 compares 328 the actual depth distance data with the model depth distance data to determine 330 whether the actual depth distance differs from the model depth distance. If the actual depth distance differs from the model depth distance, the inspection computer system 106 determines 332 whether the difference exceeds predetermined limits. Compliance with the limits indicates that workpiece 108 is in an acceptable condition. If the limits are exceeded, the inspection computer system 106 generates 334 an alert or event to indicate a potential assembly error, the presence of foreign object debris, and/or damage to workpiece 108. In addition, the inspection computer system 106 displays 336 any portion of the workpiece 108 that is not present in the model workpiece 120 using an overlay applied to a view of the depth capture device 102.
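Steps 326 through 334 amount to a per-pixel comparison of the measured depth image against a model depth image rendered from the same pose, with a threshold deciding pass or fail. The sketch below illustrates that comparison; the 5 mm limit, function name, and example values are illustrative assumptions rather than values given in the patent.

```python
import numpy as np

def inspect_depth(actual, model, limit=0.005):
    """Compare measured depth vs. model depth (meters) per pixel.
    Returns (passed, mask): passed is True when every pixel deviates
    by no more than the predetermined limit; mask marks the failing
    pixels, e.g. foreign object debris or a missing component."""
    diff = np.abs(actual - model)
    mask = diff > limit          # pixels that exceed the limit
    return not mask.any(), mask

# Illustrative frames: one pixel sits 20 mm closer than the model
# predicts, as a foreign object resting on the surface would appear.
model = np.full((2, 2), 1.000)
actual = model.copy()
actual[1, 1] = 0.980
ok, mask = inspect_depth(actual, model, limit=0.005)
# mask highlights the offending pixel for the alert overlay of step 334
```

The failure mask maps directly onto the sensor's view, which is what makes the overlay display of step 336 straightforward: flagged pixels are simply highlighted on top of the live image.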
[0043] The embodiments described in this document generally relate to workpiece inspection and, more particularly, to methods and systems for automated inspection of a workpiece. The embodiments described in this document provide accurate and reliable systems and methods for workpiece inspection. More specifically, the embodiments described in this document provide an automated method that facilitates reducing the human error component of workpiece inspection. A depth capture device measures a distance for each pixel in its field of view and transmits the distance data to an inspection computer system. A pose detection system tracks the position and orientation of the depth capture device relative to the workpiece and transmits the pose data to the inspection computer system. Using a model of the workpiece stored in the inspection computer system, the system generates a model view of the model workpiece, where the model view tracks the view observed by the depth capture device in real time or near real time. The inspection computer system then compares the actual distance observed by the depth capture device with the model distance calculated by the inspection computer system, and creates an alert if the difference between the distances exceeds a predetermined limit. The inspection computer system can also generate a display to illustrate to a user which section of the workpiece caused the alert. Human error is substantially reduced by the embodiments described in this document. In addition, the inspection system allows the measurement of both large and small workpieces, as well as workpieces with limited physical accessibility. The embodiments described in this document also facilitate reducing the time and expense of manual inspections.
[0044] Exemplary embodiments of methods and systems for workpiece inspection are described in detail above. The methods and systems are not limited to the specific embodiments described in this document; rather, components of the systems and/or steps of the methods can be used independently and separately from the other components and/or steps described in this document. Each method step and component can also be used in combination with other method steps and/or components. Although specific features of various embodiments may be shown in some drawings and not in others, this is for convenience only. Any feature of a drawing can be referenced and/or claimed in combination with any feature of any other drawing.
[0045] This written description uses examples to describe the embodiments, including the best mode, and also to enable anyone skilled in the art to practice the embodiments, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the description is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims:
Claims (12)
[0001]
1. Method (300) for inspecting a workpiece (108), said method characterized by the fact that it comprises: inserting model data (302) associated with the workpiece (108) into an inspection system; determining (306) a relative position of a depth capture device (102) in relation to the workpiece (108); calibrating (314) a pose view for the inspection system in relation to the model (120) based on the position of the depth capture device (102) in relation to the workpiece (108); measuring (318) actual depth distance data of at least one pixel of the depth capture device (102) in relation to the workpiece (108); and determining (332), based on the actual depth distance data, whether the workpiece (108) meets predetermined inspection criteria; and wherein determining, based on the actual depth distance data, whether the workpiece (108) meets the predetermined inspection criteria further comprises: calculating model depth distance data for the pose view of the inspection system, wherein the model depth distance data represents a model depth distance from the pose view of the inspection system to the model workpiece (120); determining whether the actual depth distance differs from the model depth distance; and determining whether the difference meets the inspection criteria when the actual depth distance differs from the model depth distance, wherein compliance with the limit indicates that the workpiece (108) is in an acceptable condition.
[0002]
2. Method according to claim 1, characterized by the fact that it further comprises displaying a portion of the workpiece (108) that is not present in the model workpiece (120) using an overlay applied to a view of the workpiece (108).
[0003]
3. Method according to claim 2, characterized by the fact that the overlay is configured to illustrate the presence of at least one of foreign object debris, a missing component, an assembly error, and a damaged component.
[0004]
4. Method according to claim 1, characterized by the fact that inserting the model data comprises inserting computer-aided design model data associated with the workpiece (108).
[0005]
5. Method according to claim 1, characterized by the fact that inserting the model data comprises inserting a previously scanned model of the workpiece (108).
[0006]
6. Method according to claim 1, characterized by the fact that determining the relative position of the depth capture device (102) comprises: defining a coordinate system origin at a designated position on the workpiece (108); and determining a location of the depth capture device (102) in relation to the workpiece (108) using a plurality of position detection cameras.
[0007]
7. Method according to claim 1, characterized by the fact that it further comprises transmitting the location of the depth capture device (102) to the inspection system.
[0008]
8. Method according to claim 1, characterized by the fact that measuring the actual depth distance comprises moving the depth capture device (102) around the workpiece (108) to capture a plurality of actual depth distances, said method further comprising: comparing the plurality of actual depth distances with a plurality of associated model depth distances to identify non-equivalent portions.
[0009]
9. Method according to claim 1, characterized by the fact that measuring the actual depth distance data further comprises generating a three-dimensional point cloud.
[0010]
10. System (100) for inspecting a workpiece (108), said system characterized by the fact that it comprises: a depth capture device (102) configured to measure actual depth distance data of at least one pixel of the depth capture device (102) in relation to the workpiece (108); a pose detection system (104) configured to determine a relative position of the depth capture device (102) in relation to the workpiece (108); and an inspection computer system (106) in communication with the depth capture device (102) and the pose detection system (104), the inspection computer system (106) programmed to: insert model data associated with the workpiece (108); calibrate a pose view for the inspection computer system in relation to the model (120) based on the position of the depth capture device (102) in relation to the workpiece (108); and determine, based on the actual depth distance data, whether the workpiece (108) meets predetermined inspection criteria; and wherein the inspection computer system is further programmed to: calculate model depth distance data for the pose view of the inspection computer system, wherein the model depth distance data represents a model depth distance from the pose view of the inspection computer system to the model workpiece (120); determine whether the actual depth distance differs from the model depth distance; and determine whether the difference meets the inspection criteria when the actual depth distance differs from the model depth distance, wherein compliance with the limit indicates that the workpiece (108) is in an acceptable condition.
[0011]
11. System according to claim 10, characterized by the fact that the inspection computer system is configured to display a portion of the workpiece (108) that is not present in the model workpiece (120) using an overlay applied to a view of the depth capture device (102).
[0012]
12. System according to claim 10, characterized by the fact that, to insert the model data, the inspection computer system (106) is further programmed to insert a previously scanned model of the workpiece (108).
Similar technologies:
Publication No. | Publication Date | Patent Title
BR102013021722B1|2020-12-01|method and system for inspecting a workpiece
US9488589B2|2016-11-08|Mapping damaged regions on objects
EP2576156B1|2019-05-29|Methods and systems for inspection sensor placement
Du et al.2013|Online robot calibration based on vision measurement
Becerik-Gerber et al.2011|Assessment of target types and layouts in 3D laser scanning for registration accuracy
JP2014044203A5|2016-09-29|
JP2011519419A|2011-07-07|Method and computer program for improving object dimension acquisition
US20180293801A1|2018-10-11|Near field maneuvering for ar-device using image tracking
US20160075020A1|2016-03-17|Methods and systems for automatically inspecting an object
Samper et al.2013|Analysis of Tsai calibration method using two-and three-dimensional calibration objects
US10854015B2|2020-12-01|Real-time quality control during manufacturing using augmented reality
JP2015049192A|2015-03-16|Position detection device of underwater mobile body
Martín et al.2014|Deterioration of depth measurements due to interference of multiple RGB-D sensors
Maboudi et al.2018|INVESTIGATION OF GEOMETRIC PERFORMANCE OF AN INDOOR MOBILE MAPPING SYSTEM.
KR101403377B1|2014-06-03|Method for calculating 6 dof motion of object by using 2d laser scanner
KR101438514B1|2014-09-11|Robot localization detecting system using a multi-view image and method thereof
KR101746792B1|2017-06-13|Method and apparatus for estimating transformation between distance sensor and rotating platform
JP2020172784A|2020-10-22|Mountain tunnel concrete thickness measuring method and measuring device
KR20210056893A|2021-05-20|Apparatus and method for data visualization in 3D digital twin for facility safety inspection
US20210366098A1|2021-11-25|Artificial intelligence based registration support for environmental scans
JP6581514B2|2019-09-25|Three-dimensional position calibration apparatus and program thereof, and free viewpoint image generation apparatus
Patent family:
Publication No. | Publication Date
RU2013139448A|2015-03-10|
EP2703776A1|2014-03-05|
CN103630071B|2018-01-23|
RU2643619C2|2018-02-02|
JP2014044203A|2014-03-13|
KR20140027868A|2014-03-07|
US20140056507A1|2014-02-27|
JP6251519B2|2017-12-20|
CA2820320C|2015-11-24|
BR102013021722A2|2015-07-21|
CN103630071A|2014-03-12|
EP2703776B1|2020-03-18|
KR102030854B1|2019-10-10|
US8855404B2|2014-10-07|
CA2820320A1|2014-02-27|
ES2785302T3|2020-10-06|
Cited references:
Publication No. | Filing Date | Publication Date | Applicant | Patent Title

FR2551860B1|1983-09-08|1987-05-07|Sciaky Sa|INSTALLATION FOR THE DETERMINATION OF SPATIAL COORDINATES OF A POINT OF A WORKPIECE, PARTICULARLY FOR THE CHECKING OF A TOOLS SUCH AS A WELDING TOOL FOR A VEHICLE BODY|
US5175601A|1991-10-15|1992-12-29|Electro-Optical Information Systems|High-speed 3-D surface measurement surface inspection and reverse-CAD system|
DE19526526A1|1995-07-20|1997-01-23|Bayerische Motoren Werke Ag|Device for the optical scanning of measuring surfaces|
JP3893169B2|1996-06-18|2007-03-14|コニカミノルタホールディングス株式会社|3D measuring device|
JP2001241927A|2000-03-01|2001-09-07|Sanyo Electric Co Ltd|Shape measuring apparatus|
JP2002053198A|2000-08-10|2002-02-19|Asahi Breweries Ltd|Method and apparatus for manufacturing bottled beer and method for manufacturing enclosure|
US6856407B2|2000-09-13|2005-02-15|Nextengine, Inc.|Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels|
JP2003065736A|2001-08-24|2003-03-05|Sanyo Electric Co Ltd|Three-dimensional modeling apparatus|
CN1648667A|2004-01-30|2005-08-03|英业达股份有限公司|Method and system for detecting mounting missing of parts and assemblies|
DE102005022344B4|2005-05-13|2008-06-19|Siemens Ag|Apparatus and method for workpiece measurement|
DE102005043912B4|2005-05-18|2011-08-18|Steinbichler Optotechnik GmbH, 83115|Method for determining the 3D coordinates of the surface of an object|
US7495758B2|2006-09-06|2009-02-24|Theo Boeing Company|Apparatus and methods for two-dimensional and three-dimensional inspection of a workpiece|
WO2009038465A1|2007-09-17|2009-03-26|Conoptica As|Rotating part position and change finding method and apparatus|
US7778794B2|2007-11-26|2010-08-17|The Boeing Company|System and method for identifying an element of a structure in an engineered environment|
US8131055B2|2008-01-31|2012-03-06|Caterpillar Inc.|System and method for assembly inspection|
JP2010277164A|2009-05-26|2010-12-09|Seiko Epson Corp|Apparatus and method for model data generation, and apparatus and method for object recognition|
JP2012078105A|2010-09-30|2012-04-19|Mitsubishi Heavy Ind Ltd|Attitude controller, control method, and program|
US9082208B2|2011-07-12|2015-07-14|Spirit Aerosystems, Inc.|System and method for locating and displaying aircraft information|
CN102538677A|2012-01-16|2012-07-04|苏州临点三维科技有限公司|Optics-based quick pipeline detection method|
US9448758B2|2012-07-18|2016-09-20|The Boeing Company|Projecting airplane location specific maintenance history using optical reference points|
US9124801B2|2012-07-26|2015-09-01|Omnivision Technologies, Inc.|Image processing system and method using multiple imagers for providing extended view|
KR102245648B1|2012-09-10|2021-04-29|에이매스, 아이엔씨.|Multi-dimensional data capture of an environment using plural devices|
EP2939423A1|2012-12-28|2015-11-04|Metaio GmbH|Method of and system for projecting digital information on a real object in a real environment|
TW201446386A|2013-06-14|2014-12-16|Hon Hai Prec Ind Co Ltd|Laser processing system|
EP3088977B1|2013-12-27|2019-08-28|Makino Milling Machine Co., Ltd.|Control apparatus for machine tool|
US9489724B2|2014-03-31|2016-11-08|The Boeing Company|Three-dimensional stereoscopic projection on complex surfaces|
NO343149B1|2014-04-22|2018-11-19|Vision Io As|Procedure for visual inspection and logging|
CN104200464B|2014-08-19|2018-03-09|苏州华兴致远电子科技有限公司|A kind of detection method and system for identifying that train is abnormal|
CN104986539B|2015-05-21|2017-03-01|上海神机软件有限公司|Gang form automatic conveying system|
US20180374239A1|2015-11-09|2018-12-27|Cognex Corporation|System and method for field calibration of a vision system imaging two opposite sides of a calibration object|
US10812778B1|2015-11-09|2020-10-20|Cognex Corporation|System and method for calibrating one or more 3D sensors mounted on a moving manipulator|
US10757394B1|2015-11-09|2020-08-25|Cognex Corporation|System and method for calibrating a plurality of 3D sensors with respect to a motion conveyance|
CN105486249B|2015-11-26|2018-01-16|北京市计算中心|The adaptive bottom surface removing method of 3 d scan data|
US9721334B2|2015-12-03|2017-08-01|International Business Machines Corporation|Work-piece defect inspection via optical images and CT images|
US9880075B2|2016-02-11|2018-01-30|Caterpillar Inc.|Wear measurement system using a computer model|
US9875535B2|2016-02-11|2018-01-23|Caterpillar Inc.|Wear measurement system using computer vision|
DE102016106696A1|2016-04-12|2017-10-12|Carl Zeiss Industrielle Messtechnik Gmbh|Coordinate Measuring System|
US9950815B2|2016-07-01|2018-04-24|United Technologies Corporation|Systems and methods for detecting damage|
US10249033B1|2016-12-20|2019-04-02|Palantir Technologies Inc.|User interface for managing defects|
US10620618B2|2016-12-20|2020-04-14|Palantir Technologies Inc.|Systems and methods for determining relationships between defects|
WO2019185153A1|2018-03-29|2019-10-03|Carl Zeiss Industrielle Messtechnik Gmbh|Method and apparatus for determining 3d coordinates of an object|
US10958843B2|2018-05-04|2021-03-23|Raytheon Technologies Corporation|Multi-camera system for simultaneous registration and zoomed imagery|
US11079285B2|2018-05-04|2021-08-03|Raytheon Technologies Corporation|Automated analysis of thermally-sensitive coating and method therefor|
US11268881B2|2018-05-04|2022-03-08|Raytheon Technologies Corporation|System and method for fan blade rotor disk and gear inspection|
US10488371B1|2018-05-04|2019-11-26|United Technologies Corporation|Nondestructive inspection using thermoacoustic imagery and method therefor|
US10928362B2|2018-05-04|2021-02-23|Raytheon Technologies Corporation|Nondestructive inspection using dual pulse-echo ultrasonics and method therefor|
US10685433B2|2018-05-04|2020-06-16|Raytheon Technologies Corporation|Nondestructive coating imperfection detection system and method therefor|
US10914191B2|2018-05-04|2021-02-09|Raytheon Technologies Corporation|System and method for in situ airfoil inspection|
US10902664B2|2018-05-04|2021-01-26|Raytheon Technologies Corporation|System and method for detecting damage using two-dimensional imagery and three-dimensional model|
US10473593B1|2018-05-04|2019-11-12|United Technologies Corporation|System and method for damage detection by cast shadows|
US10943320B2|2018-05-04|2021-03-09|Raytheon Technologies Corporation|System and method for robotic inspection|
KR102134590B1|2018-10-12|2020-08-27|주식회사 화승알앤에이|Method of extracting difference between 3d model of measurement object and 3d scan data of the same|
WO2021250633A1|2020-06-11|2021-12-16|Interaptix Inc.|A system and method for remote inspection of a space|
Legal status:
2015-07-21| B03A| Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]|
2018-11-21| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2020-03-31| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-08-04| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-12-01| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: PRAZO DE VALIDADE: 20 (VINTE) ANOS CONTADOS A PARTIR DE 26/08/2013, OBSERVADAS AS CONDICOES LEGAIS. |
Priority:
Application No. | Filing Date | Patent Title
US13/594,928|2012-08-27|
US13/594,928|US8855404B2|2012-08-27|2012-08-27|Methods and systems for inspecting a workpiece|