GUIDED NAVIGATION OF AN ULTRASOUND PROBE
Patent Abstract:
Embodiments of the invention relate to guided navigation of an ultrasound probe. In one embodiment of the invention, a method of assisting ultrasound navigation includes acquiring, by an ultrasound probe, an image of a target organ of a body. The method also includes processing the image in connection with an estimator such as a neural network. The processing then determines a deviation between a contemporary pose emerging from the acquired image and an optimal pose of the ultrasound probe for imaging the target organ. Finally, the method includes presenting the calculated deviation to an end-user operator of the ultrasound probe.

Publication number: FR3059541A1
Application number: FR1662038
Filing date: 2016-12-07
Publication date: 2018-06-08
Inventors: Charles Cadieu; Ha Hong; Kilian Koepsell; Johan Mathe; Martin Wojtczyk
Applicant: Bay Labs Inc.
Primary IPC class: A61B 8/00 (2017.01)
Patent Description:
GUIDED NAVIGATION OF AN ULTRASOUND PROBE

BACKGROUND OF THE INVENTION

Field of the Invention

[0003] The present invention relates to ultrasound imaging and more particularly to ultrasound image acquisition.

[0004] Description of the Related Art

[0005] Ultrasound imaging, also known as ultrasonography, is a medical imaging technique which employs high-frequency sound waves to visualize three-dimensional structures inside the body of a living being. Since ultrasound images are captured in real time, they also show the movement of the internal organs of the body, the blood flowing through the blood vessels of the human body, and the stiffness of tissue. Unlike X-ray imaging, ultrasound imaging does not involve ionizing radiation, allowing prolonged use of ultrasound imaging without threatening tissue and internal organs with damage from prolonged exposure to radiation.

To acquire ultrasound imagery, during an ultrasound examination, a transducer, commonly referred to as a probe, is placed directly on the skin or inside a body opening. A thin layer of gel is applied to the skin so that the ultrasonic waves are transmitted from the transducer, through the gel, into the body. Ultrasound imagery is produced based on a measurement of the reflection of the ultrasonic waves from body structures. The intensity of the ultrasonic signal, measured as the amplitude of the detected reflection of the sound waves, together with the time it takes for the sound wave to travel through the body, provides the information necessary to calculate an image.

Compared to other prominent medical imaging modalities, ultrasound has many advantages for the diagnostician and the patient. Above all, ultrasound imaging provides real-time images. Likewise, ultrasound imaging requires equipment that is portable and can be brought to the bedside. In addition, in practice, ultrasound imaging equipment costs significantly less than other medical imaging equipment and, as noted, does not use harmful ionizing radiation.
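As a concrete illustration of that time-of-flight relationship, the depth of a reflecting structure follows from half the round-trip echo time multiplied by the speed of sound, which is approximately 1540 m/s in soft tissue. The snippet below is an illustrative sketch only and is not part of the patent text; the function name and example values are assumptions.

```python
# Illustrative sketch: pulse-echo depth estimation from round-trip time.
# Assumes an average speed of sound of ~1540 m/s in soft tissue.
SPEED_OF_SOUND_M_PER_S = 1540.0

def echo_depth_m(round_trip_time_s: float) -> float:
    """Depth of a reflector: the echo travels to the structure and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

# Example: an echo arriving 65 microseconds after the pulse
# corresponds to a reflector roughly 5 cm deep.
print(echo_depth_m(65e-6))  # ~0.050 m
```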
Nevertheless, the production of quality ultrasound images remains extremely dependent on a qualified operator. In this regard, depending on the part of the body selected for imaging, the qualified operator must know where to place the ultrasound probe initially. Next, the qualified operator must know how to orient the probe spatially and, finally, where to move the probe so as to acquire the desired imagery. Generally speaking, the ultrasound operator is guided during the initial placement, orientation and movement of the probe by the visual feedback provided by the imagery produced during the examination. Navigating the probe is therefore essentially a manual process of iterative trial and error. Clearly, then, the modern ultrasound navigation process is not optimal.

BRIEF SUMMARY OF THE INVENTION

Embodiments of the present invention address shortcomings in the art relating to ultrasound probe navigation and provide a new and non-obvious method, system, and computer program product for guided navigation of an ultrasound probe. In one embodiment of the invention, an ultrasound navigation aid method comprises the acquisition, by an ultrasound probe, of an image of a target organ of a body. The method also includes subjecting the image to processing in connection with an estimator implemented as a programmatic approximator comprising, for example, a classifier, a predictor, a state machine or a neural network. The processing by the estimator produces as output a deviation between a contemporary pose of the ultrasound probe, namely the position and orientation of the ultrasound probe relative to the target organ, and an optimal pose of the ultrasound probe for imaging the target organ. Finally, the method includes presenting the deviation to an end-user operator of the ultrasound probe.

In one aspect of the embodiment, the contemporary pose of the ultrasound probe is further refined as a function of linear and angular displacement data received from an inertial measurement system comprising at least one of an accelerometer, a gyroscope and a magnetometer. In another aspect of the embodiment, the calculated deviation is presented visually in a display of a computer system coupled to the probe; audibly, by varying a tone as a function of the probe's proximity to the optimal pose; audibly, by varying the repetition frequency of a short sound as a function of the probe's proximity to the optimal pose; or haptically, by varying vibrations of the probe as a function of the probe's proximity to the optimal pose.

In another embodiment of the invention, an ultrasound imaging data processing system is configured for ultrasound navigation aid. The system includes a computer with memory and at least one processor, a display coupled to the computer, beamforming circuitry coupled to the computer and the display, and an ultrasound probe that has a transducer array connected to the beamforming circuitry. The system further includes a navigation aid module running in the computer memory.
The module includes program code activated, upon execution by the processor of the computer, to acquire, by the ultrasound probe, an image of a target organ of a body; to subject the image to processing in connection with an estimator, for example a neural network, so as to produce a deviation between a contemporary pose of the ultrasound probe with respect to the target organ and an optimal pose of the ultrasound probe for imaging the target organ; and to present the calculated deviation to an end-user operator of the ultrasound probe.

Additional aspects of the invention will be set forth in part in the following description, and will in part be apparent from the description or may be learned by practice of the invention. The aspects of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It should be understood that both the foregoing general description and the following detailed description are given by way of example and explanation only, and do not limit the invention as claimed.

BRIEF DESCRIPTION OF THE VARIOUS VIEWS OF THE DRAWINGS

The accompanying drawings, which are incorporated into and form part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention. The embodiments illustrated here are presently preferred, it being understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown, in which:

Figure 1 is a pictorial illustration of a process for guided navigation of an ultrasound probe;

Figure 2 is a schematic illustration of an ultrasound data processing system configured for guided navigation of an ultrasound probe; and,

Figure 3 is a flowchart illustrating a process for guided navigation of an ultrasound probe.

DETAILED DESCRIPTION OF THE INVENTION

Embodiments of the invention relate to guided navigation of an ultrasound probe. According to one embodiment of the invention, an ultrasound probe is placed on the surface of a body. Then, imagery of a target organ of the body is acquired, and a deviation between a contemporary pose of the ultrasound probe emerging from the acquired image and an optimal pose for imaging the target organ is presented to an operator who is the end user of the ultrasound probe. For example, the deviation is presented visually in a corresponding display; audibly, by means of an audible guidance signal or, alternatively, by means of a text-to-speech presentation of textual instructions; or haptically, through the external housing of the ultrasound probe.

In illustration, Figure 1 is a pictorial illustration of a process for guided navigation of an ultrasound probe. As shown in Figure 1, an ultrasound probe 120 is placed on an external surface of a body 110 such as a human form. An image 130 of a target organ is acquired by the operator of the ultrasound probe 120, and the image 130 of the target organ is presented as input to an estimator 140, such as a neural network. The estimator 140 is trained on a set of training images 150 of one or more different target organs, each with a known probe pose deviation from the optimal probe pose, so that inputting the contemporaneously acquired image 130 into the estimator 140 produces as output a deviation 190 between the contemporary pose of the ultrasound probe 120 and an optimal pose of the ultrasound probe 120.
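By way of illustration only, and not taken from the patent, the following is a minimal sketch of one way the estimator 140 could be realized and trained: a small convolutional regressor, written in PyTorch, that maps a grayscale ultrasound image to a six-component pose deviation. The architecture, layer sizes, six-degree-of-freedom deviation encoding and training hyperparameters are all assumptions of this sketch; the patent only requires some programmatic approximator trained on images with known deviations from the optimal pose.

```python
# Hedged sketch of one possible estimator f: image -> pose deviation.
# The patent covers any programmatic approximator; this assumes a small
# convolutional regressor and a 6-DoF deviation (dx, dy, dz, droll,
# dpitch, dyaw). All architecture choices here are illustrative.
import torch
import torch.nn as nn

class PoseDeviationEstimator(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Linear(32 * 4 * 4, 6)  # 6-DoF deviation output

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        x = self.features(image)
        return self.head(x.flatten(1))

# Training pairs each image with its known deviation from the optimal
# pose, mirroring the labeled training images 150 described above.
model = PoseDeviationEstimator()
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.randn(8, 1, 128, 128)   # stand-in training batch
known_deviations = torch.randn(8, 6)   # stand-in deviation labels
optimizer.zero_grad()
loss = loss_fn(model(images), known_deviations)
loss.backward()
optimizer.step()
```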
Optionally, the ultrasound probe 120 acquires probe orientation and displacement data 180, comprising magnetometric information 180A, gyroscopic information 180B and accelerometric information 180C indicating an orientation and a displacement of the ultrasound probe 120, so as to calculate the change in the pose of the ultrasound probe 120. A guided navigation logic 160 then processes the probe orientation and displacement data 180 so as to better calculate the deviation 190 from the optimal pose, taking into consideration not only the pose deviation 190 output by the estimator 140 for the acquired image 130, but also the change in the pose of the ultrasound probe 120 determined from the probe orientation and displacement data 180.

The guided navigation logic 160 then processes the pose deviation 190 of the ultrasound probe 120 and emits feedback 170: visual feedback, such as a three-dimensionally rendered scene with two probe models showing the current and optimal probe poses along with a suggested maneuver, the degree of match between the current and optimal poses, or red, green or yellow colors indicating how much adjustment of the orientation of the ultrasound probe 120 is required to approach the optimal pose; audible feedback, such as a tone; or haptic feedback. Concerning the latter, in one aspect of the invention, the ultrasound probe 120 can be made to vibrate more intensely or at a greater frequency in response to the pose deviation 190. Concerning audible feedback, in one aspect of the invention, the ultrasound probe 120 can be made to emit a sound of greater intensity or of a different tone as a function of the magnitude of the pose deviation 190. Similarly, the ultrasound probe 120 may be made to emit a short sound, such as a click or popping noise, repeatedly at a frequency related to the magnitude of the pose deviation 190. As a variant, in another aspect of the invention, the ultrasound probe 120 can be made to vibrate more intensely or at a greater frequency as the ultrasound probe 120 is moved, as a function of the magnitude of the pose deviation 190.

The process described in connection with Figure 1 can be implemented in an ultrasound data processing system. By way of further illustration, Figure 2 schematically illustrates an ultrasound data processing system configured for guided navigation of an ultrasound probe. The system includes an ultrasound probe 210 coupled to a host computer system 200 of one or more computers, each with memory and at least one processor. The ultrasound probe 210 is operable to acquire ultrasound imagery by means of a transducer connected to beamforming circuitry in the host computer system 200, and to transmit the acquired ultrasound imagery to the beamforming circuitry of the host computer system 200 for display in a display of the host computer system 200 via an ultrasound user interface 290 provided in the memory of the host computer system 200.

The ultrasound probe 210 includes an electromechanical vibration generator 230, such as a piezoelectric actuator, and a tone generator 240. The electromechanical vibration generator 230 can be driven in the ultrasound probe 210 to cause the ultrasound probe 210 to vibrate at a specific frequency and for a specific duration according to the instructions of the host computer system 200.
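To make the proportionality concrete, here is a hedged sketch of how feedback parameters might be derived from the magnitude of the pose deviation 190. The value ranges, the `feedback_parameters` helper and the choice of which quantities rise or fall with proximity are illustrative assumptions, since the text permits either direct or inverse proportionality.

```python
# Hedged sketch of mapping pose-deviation magnitude to the feedback 170.
# The ranges below are illustrative assumptions, not values from the
# patent, which allows direct or inverse proportionality.
import math

def feedback_parameters(deviation: list[float],
                        max_deviation: float = 1.0) -> dict[str, float]:
    """Map a pose-deviation vector to haptic and audible feedback.

    Smaller deviation magnitude (closer to the optimal pose) yields
    faster clicks, a rising tone, and weaker vibration in this sketch.
    """
    magnitude = math.sqrt(sum(c * c for c in deviation))
    closeness = max(0.0, 1.0 - magnitude / max_deviation)  # 1.0 = optimal
    return {
        "click_rate_hz": 1.0 + 9.0 * closeness,     # short-sound repetition
        "tone_freq_hz": 220.0 + 660.0 * closeness,  # tone varies with proximity
        "vibration_intensity": 1.0 - closeness,     # strong when far away
    }

# Example: a probe moderately far from the optimal pose.
print(feedback_parameters([0.1, 0.0, 0.05, 0.0, 0.0, 0.0]))
```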
Likewise, the tone generator 240 can be driven in the ultrasound probe 210 to cause the ultrasound probe 210 to emit an audible tone at a specific frequency and amplitude and for a specific duration according to the instructions of the host computer system 200. Optionally, the tone generator 240 can be arranged in the host computer system 200.

An image data store 260 stores a multiplicity of different previously acquired ultrasound images captured in a regulated setting in which the pose deviation of each image is known. The images from the image data store 260 are provided as training images during the training of an estimator 220, such as a neural network, providing decision making for a deviation of a probe pose from the optimal pose for an input image of a target organ of a human form. In this regard, the estimator 220 includes a multiplicity of nodes processing different features extracted from an acquired image in order to decide on a pose of the ultrasound probe, providing a deviation between the decided pose and an optimal pose known for imaging a target organ. In this respect, the pose can be represented mathematically in Euclidean space or as any array of numbers.

Finally, a navigation aid module 300 is coupled to the ultrasound user interface 290. The navigation aid module 300 comprises program code which, when executed in the memory of the host computer system 200, acquires a contemporary ultrasound image by the ultrasound probe 210 and processes the acquired ultrasound image in the host computer system 200 using the estimator 220. The program code of the navigation aid module 300, upon execution in the memory of the host computer system 200, then receives from the estimator 220 a calculated deviation between a pose emerging from the acquired image and an optimal pose of the ultrasound probe 210.

Optionally, the ultrasound probe 210 includes an inertial measurement unit 250. The inertial measurement unit 250 includes a magnetometer 250A, a gyroscope 250B and an accelerometer 250C. As such, the data acquired by the inertial measurement unit 250 is translated in the host computer system 200 into an estimate of a change in pose during a given time interval, for example by measuring the linear acceleration and the angular velocity of the probe 210. This change in pose can be combined with the pose deviation derived by the estimator to obtain a more precise pose estimate. One possible approach involves the use of a Kalman filter. For example, algorithmically, the process using the inertial measurement unit 250 to refine a deviation determined by the estimator 220 can be expressed as follows:

1. Let $I(t)$ be the image acquired by the ultrasound probe 210 at time $t$.

2. Let $f$ be an estimator, such as the neural network 220, which, from an acquired image $I(t)$, outputs the image-based estimate of the pose at time $t$ with respect to the optimal pose (that is, a deviation from the optimal pose), namely $p(t)_{\mathrm{img}} = f(I(t))$.

3. Between $t_0$ and $t_1$ ($t_1 > t_0$), the image-based change in pose $\Delta p(t_1, t_0)_{\mathrm{img}}$ can be calculated as $\Delta p(t_1, t_0)_{\mathrm{img}} = p(t_1)_{\mathrm{img}} - p(t_0)_{\mathrm{img}}$. Clearly, $p(t_1)_{\mathrm{img}} = p(t_0)_{\mathrm{img}} + \Delta p(t_1, t_0)_{\mathrm{img}}$.
4. The change in pose between $t_0$ and $t_1$ is measured by the inertial measurement unit 250 and denoted $\Delta p(t_1, t_0)_{\mathrm{IMU}}$. $\Delta p(t_1, t_0)_{\mathrm{img}}$ and $\Delta p(t_1, t_0)_{\mathrm{IMU}}$ are combined to produce a better estimate of the pose change using a Kalman filter, expressed as $\Delta p(t_1, t_0)_K = K(\Delta p(t_1, t_0)_{\mathrm{img}}, \Delta p(t_1, t_0)_{\mathrm{IMU}}; p(t_0)_{\mathrm{img}})$, where $K$ is a Kalman filter (which takes as inputs a change of pose according to the image, a change of pose according to the inertial measurement unit 250, and the pose at $t_0$). $\Delta p(t_1, t_0)_K$ is the combined pose change, which is expected to be more precise than $\Delta p(t_1, t_0)_{\mathrm{img}}$ or $\Delta p(t_1, t_0)_{\mathrm{IMU}}$ alone.

5. With the more precise $\Delta p(t_1, t_0)_K$, it is then possible to estimate a more precise absolute pose $p(t_1)_K$ at time $t_1$: $p(t_1)_K = p(t_0)_{\mathrm{img}} + \Delta p(t_1, t_0)_K$.

6. For all following instants: $p(t_{j+1})_K = p(t_j)_K + \Delta p(t_{j+1}, t_j)_K$.

In all cases, depending on the calculated deviation, the program code of the navigation aid module 300 then determines corresponding feedback to be presented by the ultrasound probe 210. For example, the program code of the navigation aid module 300 can instruct the tone generator 240 to emit a particular tone pattern of specific periodicity, proportional or inversely proportional to a determined proximity of the ultrasound probe 210 to the optimal pose. As another example, the program code of the navigation aid module 300 can instruct the electromechanical vibration generator 230 of the ultrasound probe 210 to emit a particular vibration of specific intensity, proportional or inversely proportional to a determined proximity of the ultrasound probe 210 to the optimal pose.

As yet a further illustration of the operation of the navigation aid module 300, Figure 3 is a flowchart illustrating a process for guided navigation of an ultrasound probe. Starting at block 310, a target organ within the body is selected in a user interface of an ultrasound application displaying ultrasound imagery acquired by the ultrasound probe. Then, at block 320, an estimator, such as a neural network, relating to the target organ is loaded into a memory of a computer system coupled to the ultrasound probe. At block 330, contemporary ultrasound imagery is acquired by the ultrasound probe and, at block 340, optionally, probe orientation and displacement data received from an inertial measurement unit of the ultrasound probe are acquired.

At block 350, the contemporary ultrasound imagery is processed in conjunction with the estimator in the computer system. At block 360, a deviation between a pose of the ultrasound probe emerging from the acquired image and an optimal pose is determined according to the application of the estimator to the acquired image. Optionally, the probe orientation and displacement data is used to further improve the accuracy of the determined pose deviation. At block 370, corresponding feedback as a function of the deviation is determined: a graphical representation of the deviation; a particular vibration intensity in the case of haptic feedback; or a tone of particular frequency, periodicity or amplitude, or any combination of these, in the case of audible feedback. Then, at block 390, the determined feedback is output by the computer system or the coupled ultrasound probe. Finally, at decision block 400, the inertial measurement unit of the ultrasound probe indicates whether or not a threshold change of position or orientation has occurred with respect to the ultrasound probe. If so, or if an inertial measurement unit is neither present nor active, the process can then be repeated from block 340.
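Returning to steps 1 through 6 above, the following sketch illustrates the fusion idea in simplified form. A full Kalman filter would propagate state covariances over time; this stand-in uses a static gain derived from assumed noise variances for the two sources, which corresponds to the single-step special case of the Kalman update for two noisy measurements of the same quantity. The variance values and helper names are assumptions of the sketch.

```python
# Hedged sketch of the pose-fusion procedure in steps 1-6 above. A full
# Kalman filter would track covariances; this simplified stand-in fuses
# the two pose-change estimates with a static gain derived from assumed
# (illustrative) noise variances for each source.
import numpy as np

VAR_IMG = 0.04  # assumed variance of image-based pose changes
VAR_IMU = 0.01  # assumed variance of IMU-based pose changes

def fuse_pose_change(dp_img: np.ndarray, dp_imu: np.ndarray) -> np.ndarray:
    """Combine image- and IMU-derived pose changes (step 4).

    The gain weights the lower-variance source more heavily, the same
    trade-off a Kalman update makes for two noisy measurements of one
    quantity.
    """
    gain = VAR_IMU / (VAR_IMG + VAR_IMU)  # weight given to the image term
    return gain * dp_img + (1.0 - gain) * dp_imu

def track_pose(p0_img: np.ndarray,
               dp_img_seq: list[np.ndarray],
               dp_imu_seq: list[np.ndarray]) -> np.ndarray:
    """Steps 5-6: accumulate fused changes onto the initial image pose."""
    p = p0_img.copy()
    for dp_img, dp_imu in zip(dp_img_seq, dp_imu_seq):
        p += fuse_pose_change(dp_img, dp_imu)
    return p

# Example with a 6-DoF pose vector and one time step.
p0 = np.zeros(6)
print(track_pose(p0, [np.full(6, 0.10)], [np.full(6, 0.08)]))
```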
The present invention can be embodied in a system, a method, a computer program product or any combination thereof. The computer program product may include a computer-readable storage medium or media bearing computer-readable program instructions for causing a processor to carry out aspects of the present invention. The computer-readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium can be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device or any suitable combination of the foregoing devices. The computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device over a network.

Computer-readable program instructions can be executed entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products in accordance with embodiments of the invention. It should be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.

These computer-readable program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer or another programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block or blocks of the flowchart and/or block diagram. These computer-readable program instructions may also be stored in a computer-readable storage medium which can direct a computer, a programmable data processing apparatus and/or other devices to operate in a particular manner, such that the computer-readable storage medium in which the instructions are stored comprises an article of manufacture including instructions that implement aspects of the function/act specified in the block or blocks of the flowchart and/or block diagram.

The computer-readable program instructions can also be loaded onto a computer, another programmable data processing apparatus or another device to cause a series of operational steps to be carried out on the computer, other programmable apparatus or other device to produce a computer-implemented process, such that the instructions that execute on the computer, other programmable apparatus or other device perform the functions/acts specified in the block or blocks of the flowchart and/or block diagram.

The flowchart and block diagrams in the figures illustrate the architecture, functionality and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention.
In this regard, each block in the flowchart or block diagrams can represent a module, a segment or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical function or functions. In some alternative implementations, the functions noted in the block can occur in a different order than that noted in the figures. For example, two blocks shown in succession can in fact be executed substantially concurrently, or the blocks can sometimes be executed in the reverse order, depending on the functionality involved. It should also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems which perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.

Finally, the terminology used here is for the purpose of describing particular embodiments only and is not intended to limit the invention. It should also be understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. The corresponding structures, materials, acts and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material or act for performing the function in combination with other claimed elements as specifically claimed.

The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those skilled in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application, and to enable others skilled in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Having thus described the invention of the present application in detail and with reference to its embodiments, it will be apparent that modifications and variations are possible without departing from the scope of the invention defined in the appended claims, as follows.
Claims (20)

1. An ultrasound navigation aid method comprising: acquiring, by an ultrasound probe, an image of a target organ of a body; processing the image in connection with an estimator, the estimator producing a deviation between a contemporary pose emerging from the image and an optimal pose of the ultrasound probe for imaging the target organ; and presenting the calculated deviation to an operator who is the end user of the ultrasound probe.

2. The method of claim 1, wherein the estimator comprises a neural network.

3. The method of claim 1, wherein the contemporary pose is further determined as a function of probe orientation and displacement data received from an inertial measurement system comprising at least one of an accelerometer, a gyroscope and a magnetometer.

4. The method of claim 1, wherein the calculated deviation is presented visually in a display of a computer system coupled to the probe.

5. The method of claim 1, wherein the calculated deviation is presented audibly by a variation of a tone as a function of the proximity of the probe to the optimal pose.

6. The method of claim 1, wherein the calculated deviation is presented audibly by varying the repetition frequency of a short sound as a function of the proximity of the probe to the optimal pose.

7. The method of claim 1, wherein the calculated deviation is presented haptically by a variation in vibrations of the probe as a function of the proximity of the probe to the optimal pose.

8. An ultrasound imaging data processing system configured for ultrasound navigation aid, the system comprising: a computer with memory and at least one processor; a display coupled to the computer; beamforming circuitry coupled to the computer and the display; an ultrasound probe comprising a transducer connected to the beamforming circuitry; and a navigation aid module executing in the memory of the computer, the module comprising program code activated, upon execution by the processor of the computer, to acquire, by the ultrasound probe, an image of a target organ of a body; to process the acquired image with an estimator, determining a deviation between a contemporary pose of the ultrasound probe with respect to the target organ emerging from the acquired image and an optimal pose of the ultrasound probe for imaging the target organ; and to present the calculated deviation to an operator who is the end user of the ultrasound probe.

9. The system of claim 8, wherein the estimator is a neural network.

10. The system of claim 8, wherein the ultrasound probe comprises an inertial measurement unit comprising at least one of an accelerometer, a gyroscope and a magnetometer, and wherein the contemporary pose is further determined as a function of probe orientation and displacement data received from the inertial measurement unit.

11. The system of claim 8, wherein the display is configured to visually present the calculated deviation.

12. The system of claim 8, wherein the ultrasound probe comprises a tone generator configured to present the calculated deviation audibly by a variation of a tone as a function of the proximity of the probe to the optimal pose.
13. The system of claim 8, wherein the ultrasound probe comprises an electromechanical vibration generator configured to present the calculated deviation haptically by a variation of vibrations of the probe as a function of the proximity of the probe to the optimal pose.

14. A computer program product for an ultrasound navigation aid, the computer program product comprising a computer-readable storage medium with program instructions embodied therewith, the program instructions being executable by a device to cause the device to carry out a method comprising: acquiring, by an ultrasound probe, an image of a target organ of a body; processing the image in connection with an estimator, the estimator producing a deviation between a contemporary pose of the ultrasound probe emerging from the acquired image and an optimal pose of the ultrasound probe for imaging the target organ; and presenting the calculated deviation to an end-user operator of the ultrasound probe.

15. The computer program product of claim 14, wherein the program instructions of the storage medium are executable by a device to cause the device to perform a method in which the estimator is a neural network.

16. The computer program product of claim 14, wherein the program instructions of the storage medium are executable by a device to cause the device to perform a method in which the pose is further determined as a function of probe orientation and displacement data received from an inertial measurement system comprising at least one of an accelerometer, a gyroscope and a magnetometer.

17. The computer program product of claim 14, wherein the program instructions of the storage medium are executable by a device to cause the device to perform a method in which the calculated deviation is presented visually in a display of a computer system coupled to the probe.

18. The computer program product of claim 14, wherein the program instructions of the storage medium are executable by a device to cause the device to perform a method in which the calculated deviation is presented audibly by a variation of a tone as a function of the proximity of the probe to the optimal pose.

19. The computer program product of claim 14, wherein the program instructions of the storage medium are executable by a device to cause the device to perform a method in which the calculated deviation is presented audibly by varying the repetition frequency of a short sound as a function of the proximity of the probe to the optimal pose.

20. The computer program product of claim 14, wherein the program instructions of the storage medium are executable by a device to cause the device to perform a method in which the calculated deviation is presented haptically by a variation of vibrations of the probe as a function of the proximity of the probe to the optimal pose.