ROBOTIC DEVICE FOR MINI-INVASIVE MEDICAL INTERVENTION ON SOFT TISSUE
Patent abstract:
The present invention relates to a robotic device (10) for medical intervention on a patient (30) with a medical instrument (13), comprising: - a robot arm (11) having several degrees of freedom, comprising an end adapted to receive the medical instrument; - an image acquisition system (14) adapted to acquire position information of the patient's anatomy; - a storage medium (15) comprising a biomechanical model of the human body; - a processing circuit (17) configured to determine a position setpoint and an orientation setpoint for said medical instrument according to the biomechanical model, according to the position information and according to a trajectory to be respected by the medical instrument (13) to carry out the medical intervention; - a control circuit (16) configured to control the robot arm (11) to place the medical instrument (13) in the position setpoint and the orientation setpoint.
Publication number: FR3073135A1
Application number: FR1760553
Filing date: 2017-11-09
Publication date: 2019-05-10
Inventors: Lucien Blondel; Fernand Badano; Bertin Nahum
Applicant: Quantum Surgical
Main IPC class:
Patent description:
TECHNICAL FIELD
The present invention belongs to the field of medical procedures, and relates more particularly to a robotic device for minimally invasive medical procedures on deformable tissues of a patient, for example for treatment or diagnosis on deformable organs or anatomical structures.

STATE OF THE ART
Medical interventions (diagnostic, therapeutic and / or surgical) by minimally invasive or percutaneous route are becoming increasingly important, especially in oncology for local cancer treatments, which act directly on the cells of the affected organ such as the liver, kidney, lung, pancreas, breast, prostate, etc. Beyond oncology, there are many medical procedures and applications using a minimally invasive or percutaneous approach, for example by inserting a needle: biopsies (removal of tissue for pathological analysis), placement of drains (suction of fluids), injections of therapeutic products (pain treatment), etc.

Unlike open or conventional surgery, which can require an incision of several tens of centimeters, minimally invasive medical procedures use one or more small incisions or openings through which an endoscope, a probe, a needle or other medical instruments are introduced to reach, visualize and / or treat the targeted anatomical area. Minimally invasive medical procedures can bring many benefits, such as limiting the pain and trauma of the operation, reducing bleeding during surgery and reducing the length of the hospital stay. They make it possible to carry out the medical intervention as ambulatory surgery, which allows faster recovery of the patient, smaller scars, a reduction in the risk of infection, etc.

In addition to the conventional technique of surgical resection with forceps, scissors and other medical instruments, several technologies are validated or being evaluated for tissue destruction by minimally invasive or percutaneous route. Examples include laser surgery, cryotherapy, radiofrequency, microwaves, electroporation, focused ultrasound and brachytherapy. Most of these techniques have in common the making of a very small incision and the insertion, down to the targeted anatomical area, of one or more needles, probes or electrodes to deliver a treatment in a precise and localized manner (thermal, non-thermal or radioactive treatment).

In most cases, minimally invasive medical procedures require the operator to insert a medical instrument inside the patient's body to a certain depth in order to reach the targeted anatomical area. Performing these gestures is sometimes long and difficult because, unlike open surgery, the operator does not always have direct vision of the anatomy of the patient and of the organ to be treated. This complicates the identification of anatomical structures, the precise placement of the medical instrument and the avoidance of sensitive anatomical structures (nerves, vessels, healthy organs, etc.). Surgeons can use preoperative medical images (CT or “Computerized Tomography”, MRI or “Magnetic Resonance Imaging”, X-ray, etc.) taken for diagnostic purposes to facilitate identification of the anatomy and prior planning of the medical intervention. Preoperative images provide a representation of the anatomy that is valid at a given point in time, which is however not that of the medical intervention.
To correctly insert the medical instrument inside the patient's body to the desired position and depth without damaging sensitive anatomical structures during the operation, the operator must be able to know where the medical instrument is located inside the patient's body. Several systems and methods are available today to determine the position and orientation of the medical instrument during a minimally invasive medical intervention when direct vision of the anatomy is not available through a microscope or an endoscope.

Image-guided navigation (or computer-assisted surgery) systems track the position and orientation of a medical instrument in real time by displaying a virtual medical instrument superimposed on images of the patient's body. They use 3D localization technologies to locate both the patient and the medical instrument, the most common of which are optical or electromagnetic. The use of an image acquisition system is necessary before, at the start of and / or during the medical intervention to acquire one or more images of the patient (by CT scanner, MRI, X-rays, ultrasound, etc.). Before starting the medical intervention, these images are matched with the real position of the anatomy of the patient installed on the operating table by various known registration methods, such as rigid or deformable registration of remarkable points and / or areas, or referencing of the position of the image acquisition system itself.

Optical navigation systems identify the position of the medical instrument using infrared cameras and transmitters or reflectors placed in a known geometry on the medical instrument and on the patient to serve as a reference and track their movements. Electromagnetic navigation systems locate the position of the medical instrument using a low-intensity magnetic field generator placed near the patient's body, sensors that can be incorporated into the medical instrument, and reference sensors placed on the patient. These electromagnetic navigation systems are compact and do not suffer from the field-of-view obstruction problem of optical navigation systems. However, they require a specific and restrictive environment linked to the presence of the magnetic field formed by the magnetic field generator.

All of these known navigation systems, even if they can improve the precision of the medical gesture compared to the conventional manual method by providing in real time the position and orientation of the medical instrument in images, nevertheless present strong limitations for minimally invasive medical procedures on deformable tissue. A first limitation is that the final gesture of introducing the medical instrument to the targeted anatomical area is carried out manually by the operator, which makes the result dependent on the skill of the operator and does not allow a high accuracy to be achieved. A second limitation is that the functioning of these navigation systems assumes that the targeted organs or anatomical structures do not move and do not deform between the time when the reference examination is carried out and the time when the operator introduces the medical instrument.
In the case where the examination was carried out several days before the medical intervention and with the patient in a position on the operating table different from the one he had on the examination table, the targeted organs or anatomical structures may have moved or deformed, and the difference between the displayed position and the actual position of the target organ or anatomical structure may introduce a high inaccuracy. In addition, the targeted organs or anatomical structures can be deformed simply by the patient's breathing, and known navigation systems rely on the patient's control of his breathing, which greatly limits the accuracy attainable by these navigation systems.

There are also robotic medical assistance devices for minimally invasive surgery. It is notably known, from US Pat. No. 8,795,188, a system for medical intervention on a patient which comprises a robot, a device for recording patient movements and a method for automatically taking into account periodic movements of the patient, typically the movements of the chest due to breathing. However, the variants describing the use of a navigation technology or of a continuous laser scan require the acquisition of images before the operation and assume that the organ or the targeted anatomical structure does not move and does not deform relative to the patient's outer envelope (skin). The variant describing the use of X-ray type images during the intervention requires the complex and irradiating implementation of continuous image acquisition.

STATEMENT OF THE INVENTION
The present invention aims to remedy all or part of the limitations of the solutions of the prior art, in particular those set out above, by proposing a solution which makes it possible to assist the operator in positioning a medical instrument relative to an organ or anatomical structure in the body of a patient, for the purpose of carrying out a localized diagnosis or therapeutic treatment, taking into account the fact that the organ or the anatomical structure can move or deform inside the patient's body.

To this end, and according to a first aspect, the invention relates to a robotic device for medical intervention on a patient with a medical instrument, comprising:
- a robot arm with several degrees of freedom, having an end adapted to receive the medical instrument,
- an image acquisition system adapted to acquire position information of the patient's anatomy,
- a storage medium comprising a biomechanical model of the anatomical structures of the human body,
- a processing circuit configured to determine a position setpoint and an orientation setpoint for said medical instrument as a function of the biomechanical model, as a function of the position information and as a function of a trajectory to be observed by the medical instrument in order to carry out the medical intervention,
- a control circuit configured to control the robot arm to place or assist in the placement of the medical instrument in the position setpoint and the orientation setpoint.

Thanks to the robot arm, the positioning accuracy and repeatability of the medical instrument are much higher than those of an operator. This gain in precision makes it possible to execute the treatment chosen by the operator as close as possible to the target organ or anatomical structure, and therefore to improve its clinical efficiency. It makes it possible to envisage treating lesions that are still inoperable because they are too small or located close to or within critical areas.
Precision and repeatability also reduce the risk of complications such as bleeding, pain and loss of function due to damage to sensitive anatomical structures present on the trajectory to be respected, following manual positioning errors of the medical instrument.

The robotic device also uses a priori knowledge of a biomechanical model of the human body. By “biomechanical model” of the human body is meant a mathematical model of the different anatomical structures (muscles, tendons, bone structures, organs, vascular network, etc.) of the human body, and therefore of the patient in the anatomical area considered, which makes it possible to model the deformations of said anatomical structures as well as the mechanical interactions between said anatomical structures. Such a biomechanical model therefore makes it possible, in particular, to determine the deformations and mechanical interactions (and therefore the displacements) of the internal anatomical structures of the patient induced by, for example, a modification of the external envelope of said patient, a modification of the positions of the vessels of an organ, a modification of the external envelope of an organ, etc. Such modifications can for example be induced by the patient's breathing (displacement of the organs induced by the movement of the rib cage and the diaphragm), by a change in position of said patient (displacement of the organs induced by gravity), by contact with a medical instrument (local deformation), etc. The anatomical zone considered corresponds for example to the thoracic zone and / or to the abdominal zone and / or to the pelvic zone of the patient.

Thus, the robotic device uses the trajectory, the biomechanical model and the position information acquired during the medical intervention to determine the real position of a mobile and deformable anatomical structure in the patient's body, regardless of the patient's position on the operating table and of the level of his breathing. This functionality greatly improves the reliability of the medical intervention by avoiding errors due to compensation by the operator for movements linked to breathing and for internal deformations of the organs not taken into account by the navigation or robotic systems known from the prior art. For all these reasons, the robotic device is particularly suitable for minimally invasive medical procedures on deformable tissue of a patient.
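To give a concrete, purely illustrative picture of what such a biomechanical model computes, the sketch below reduces the problem to a one-dimensional chain of elastic elements between the skin and a fixed deep structure and solves for the displacement of the intermediate nodes when the skin moves. The node count, stiffness values and boundary conditions are invented for the example and are in no way the model used by the device described in this patent.

```python
import numpy as np

def chain_stiffness(k):
    """Assemble the stiffness matrix of a 1-D chain of springs.

    k: array of spring stiffnesses between consecutive nodes.
    Returns the (n_nodes x n_nodes) stiffness matrix.
    """
    n = len(k) + 1
    K = np.zeros((n, n))
    for i, ki in enumerate(k):
        K[i, i] += ki
        K[i + 1, i + 1] += ki
        K[i, i + 1] -= ki
        K[i + 1, i] -= ki
    return K

def propagate_surface_displacement(K, u_skin, skin_node=0, fixed_node=None):
    """Solve the static equilibrium K u = 0 with u[skin_node] = u_skin imposed
    (measured skin motion) and u[fixed_node] = 0 (node anchored in depth),
    and return the displacement of every node of the chain."""
    n = K.shape[0]
    if fixed_node is None:
        fixed_node = n - 1                      # deepest node anchored (e.g. to the spine)
    prescribed = {skin_node: u_skin, fixed_node: 0.0}
    free = [i for i in range(n) if i not in prescribed]
    u = np.zeros(n)
    for i, ui in prescribed.items():
        u[i] = ui
    # Condensed system: K_ff u_f = -K_fp u_p (no external load on free nodes)
    K_ff = K[np.ix_(free, free)]
    K_fp = K[np.ix_(free, list(prescribed))]
    u_p = np.array(list(prescribed.values()))
    u[free] = np.linalg.solve(K_ff, -K_fp @ u_p)
    return u

# Example: 5 nodes from the skin (node 0) to a fixed deep structure (node 4),
# softer tissue near the surface, stiffer tissue in depth (arbitrary values).
K = chain_stiffness(np.array([1.0, 2.0, 4.0, 8.0]))
u = propagate_surface_displacement(K, u_skin=10.0)   # 10 mm of breathing motion at the skin
print(np.round(u, 2))                                # displacement decreases with depth
```

A real biomechanical model works on three-dimensional meshes of the organs rather than on a chain of nodes, but the idea of prescribing a measured displacement on some structures and solving the mechanical equilibrium for the others follows the same general pattern.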
In particular embodiments, the robotic device may further include one or more of the following characteristics, taken in isolation or in any technically possible combination.

In particular embodiments, the image acquisition system is of the non-irradiating type. Indeed, by taking into account the biomechanical model, the image acquisition system used during the intervention can be of the non-irradiating type. By “non-irradiating” is meant that no ionizing radiation (in particular X-rays) is generated towards the patient to acquire the images during the medical intervention. The irradiation dose is therefore greatly reduced, both for the patient and for the medical team working near the image acquisition system. In addition, the image acquisition system can be much less costly and bulky than, for example, a CT scanner, so that the robotic device can be used even in operating theaters of reduced dimensions and without a CT scanner, which makes its use much less restrictive.

In particular embodiments, the image acquisition system comprises at least one so-called contactless equipment suitable for acquiring position information without contact with the patient.

In particular embodiments, the image acquisition system comprises at least one of the following contactless equipment: a stereoscopic camera, a structured light camera, a time-of-flight camera, a depth measurement camera, etc.

In particular embodiments, the image acquisition system comprises at least one so-called contact equipment adapted to acquire position information with contact with the patient.

In particular embodiments, the image acquisition system comprises at least one of the following contact equipment: an ultrasonic probe, an endoscope, etc.

In particular embodiments, the control circuit is configured to control the robot arm according to at least one of the following modes: an automatic mode, a collaborative mode, an automatic tracking mode, a collaborative tracking mode, etc.

In particular embodiments, the processing circuit is configured to determine or assist in determining the trajectory of the medical instrument based on images of the patient.

In particular embodiments, the processing circuit is configured to adjust or assist in the adjustment of the parameters of a treatment to be carried out during the medical intervention by simulating the effects of said parameters based on images of the patient.

In particular embodiments, the robotic device includes a guide tool adapted to guide the medical instrument, fixed or intended to be fixed to one end of the robot arm.

In particular embodiments, the robotic device comprises at least one man-machine interface equipment from among the following equipment: a display screen, a touch display screen, a keyboard, 2D and / or 3D vision glasses, a joystick, a motion detection module, a voice control module, etc.

In particular embodiments, the robotic device comprises at least one entry point location equipment from among the following equipment: a medical instrument with an atraumatic tip, a laser aiming module, etc.

In particular embodiments, the medical instrument is one of the following medical instruments: a biopsy needle, a catheter, an endoscope, or an instrument for treatment by focused ultrasound, by laser, by cryotherapy, by radiofrequency, by electroporation, by brachytherapy, etc.

In particular embodiments, the robotic device comprises a mobile carriage carrying the robot arm, said mobile carriage comprising immobilization means.

PRESENTATION OF THE FIGURES
The invention will be better understood on reading the following description, given by way of nonlimiting example and made with reference to the figures, which represent:
- Figure 1: a schematic representation of an embodiment of a robotic device for minimally invasive soft tissue medical procedures,
- Figure 2: a schematic representation of an alternative embodiment of the robotic device of Figure 1,
- Figure 3: a schematic representation of another embodiment of a robotic device.
In these figures, identical references from one figure to another denote identical or analogous elements. For the sake of clarity, the elements shown are not to scale, unless otherwise stated.

DETAILED DESCRIPTION OF EMBODIMENTS
FIG. 1 schematically represents an exemplary embodiment of a robotic device 10 for assisting an operator during a medical intervention, for example of the minimally invasive type on soft tissue.
As illustrated in FIG. 1, the robotic device 10 comprises a robot arm 11 with several degrees of freedom. The robot arm 11 has one end adapted to receive a medical instrument 13. In the example illustrated in FIG. 1, the medical instrument 13 is mounted on the end of the robot arm 11 by means of a guide tool 12 adapted to guide said medical instrument 13. To this end, the robot arm 11 comprises at its end an interface adapted to receive said guide tool 12. Preferably, the robot arm 11 has at least 6 degrees of freedom in order to allow large ranges of control, in space, of the position and the orientation of the guide tool 12 relative to a patient 30, for example lying on an operating table 20.

The guide tool 12 is adapted to guide the medical instrument 13, that is to say to constrain the movement of said medical instrument 13 relative to said guide tool 12. For example, the guide tool 12 is a slide adapted to guide the medical instrument 13 in translation, in order to constrain the movement of said medical instrument 13 for example during its insertion into the body of the patient 30. The guide tool 12 is for example removably attached to the robot arm 11, which is preferably adapted to receive different types of guide tools 12, for example associated with different medical instruments 13 and / or with different medical gestures. The interface of the robot arm 11 may include, for example, a keying mechanism to ensure correct mounting of the guide tools 12 on the robot arm 11. In preferred embodiments, the interface may further include an electronic system allowing the guide tool 12 mounted by the operator to be automatically recognized, in order to then use in the calculations the characteristics of the guide tool 12 such as its reference, its dimensions, its weight, its center of mass, and any other data useful for its function or its performance in use.

The robotic device 10 is preferably adapted to receive, on a guide tool 12 carried by the robot arm 11, any type of medical instrument 13, in particular any type of medical instrument used for minimally invasive medical interventions on soft tissue. For example, the robotic device 10 is preferably adapted to receive and move at least one of the following surgical medical instruments:
- a biopsy needle,
- a catheter,
- an endoscope,
- a focused ultrasound treatment instrument,
- a laser treatment instrument,
- a cryotherapy treatment instrument,
- a radiofrequency treatment instrument,
- an electroporation treatment instrument,
- a brachytherapy treatment instrument, etc.

The robotic device 10 also includes a control circuit 16 adapted to control the robot arm 11 to modify the position and the orientation of the guide tool 12 in a reference frame associated with the robotic device 10. The control circuit 16 comprises for example one or more processors and storage means (magnetic hard disk, electronic memory, optical disk, etc.) in which a computer program product is stored, in the form of a set of program code instructions to be executed to control the robot arm 11. Alternatively or in addition, the control circuit 16 comprises one or more programmable logic circuits (FPGA, PLD, etc.), and / or one or more specialized integrated circuits (ASIC, etc.), and / or a set of discrete electronic components, etc., adapted to control said robot arm 11.
Thanks to the control circuit 16, the robot arm 11 and the guide tool 12 carried by the robot arm 11, the medical instrument 13 can be positioned, oriented and guided with much greater precision than in the case of a medical instrument 13 handled directly by an operator.

In the example illustrated in FIG. 1, the robotic device 10 comprises a mobile carriage 18, for example mounted on wheels, on which the robot arm 11 is mounted. Such arrangements are particularly advantageous in that it is then particularly easy to move the robot arm 11 from one side of the operating table to the other, from one operating room to another, etc. The carriage 18 comprises immobilization means (not shown in the figures) making it possible to immobilize the carriage 18 relative to the operating table 20. The immobilization means may be of any suitable type, and may in particular include brakes on the wheels, retractable pads or feet, mechanical attachment systems to the operating table 20, mechanical attachment systems to the ground, etc. However, nothing excludes, according to other examples, having the robot arm 11 mounted directly on the operating table, in a removable or permanent manner (in which case the operating table is an integral part of the robotic device 10). FIG. 2 schematically represents an alternative embodiment of the robotic device 10 in which the robot arm 11 is removably mounted on the operating table 20. In the example illustrated by FIG. 2, the robot arm 11 is mounted on a support 110 making a rigid mechanical connection with rails 21 of the operating table 20.

As illustrated in FIGS. 1 and 2, the robotic device 10 also includes an image acquisition system 14 adapted to acquire position information of the patient's anatomy 30, either in the reference frame associated with the robotic device 10, or in a different reference frame for which the transformation matrix to said reference frame is known a priori or can be determined. In preferred embodiments, the image acquisition system 14 is of the non-irradiating type, in order to limit the radiation dose undergone by the patient 30 and by the medical team.

The image acquisition system 14 makes it possible to acquire position information of the patient's anatomy 30. The position information of the patient's anatomy 30 corresponds for example to the position of the external surface of the body of the patient 30 in the reference frame, the position of the bone structure of the body of said patient 30 in the reference frame, the position of an organ or of vessels inside the body of said patient 30 in the reference frame, etc. In general, any type of image acquisition system 14 adapted to provide position information of the patient's anatomy 30 can be implemented in the robotic device 10. For example, the image acquisition system 14 may include one or more so-called contactless equipment suitable for acquiring position information without contact with the patient 30 and / or one or more so-called contact equipment suitable for acquiring position information with contact with the patient 30.

In particular embodiments, the image acquisition system 14 includes at least one of the following contactless equipment:
- a stereoscopic camera,
- a structured light camera,
- a time-of-flight camera (“ToF camera”),
- a depth measurement camera (for example an RGB-D camera), etc.
Such contactless equipment makes it possible, for example, to acquire position information representative of the position of the external surface of the body of the patient 30 relative to the contactless equipment.
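The contactless equipment listed above typically delivers a depth image of the scene. As a minimal sketch, assuming a simple pinhole camera model and invented intrinsic and extrinsic parameters (none of which are specified in this document), such an image can be turned into a point cloud of the patient's external surface and expressed in the reference frame of the robotic device once the camera pose in that frame is known.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert a depth image (metres) into an (N, 3) point cloud in the camera frame,
    using a pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth."""
    v, u = np.indices(depth.shape)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]               # drop invalid (zero-depth) pixels

def to_robot_frame(points_cam, T_robot_cam):
    """Express camera-frame points in the robot reference frame.
    T_robot_cam is the assumed-known 4x4 pose of the camera in that frame."""
    homo = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (T_robot_cam @ homo.T).T[:, :3]

# Illustrative values only: a flat 480x640 depth map and arbitrary intrinsics/extrinsics.
depth = np.full((480, 640), 1.2)                  # patient surface about 1.2 m from the camera
surface_cam = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
T_robot_cam = np.eye(4)
T_robot_cam[:3, 3] = [0.5, 0.0, 1.0]              # camera offset in the robot frame (made up)
surface_robot = to_robot_frame(surface_cam, T_robot_cam)
print(surface_robot.shape)                        # (307200, 3) surface points
```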
In particular embodiments, the image acquisition system 14 comprises at least one of the following contact equipment:
- an ultrasonic probe (acquisition with non-intrusive contact),
- an endoscope (acquisition with intrusive contact), etc.
Such contact equipment makes it possible, for example, to acquire position information representative of the position of an organ or of vessels inside the body of the patient 30.

The image acquisition system 14 is for example integrated into the robot arm 11 or mounted at the end of said robot arm 11. In the examples illustrated in FIGS. 1 and 2, the image acquisition system 14 is mounted on a support separate from the robot arm 11. The support is for example an articulated arm 140, possibly motorized, in which case it forms a robot arm distinct from the robot arm 11 carrying the guide tool 12 of the medical instrument 13. In the example illustrated in FIG. 1, the articulated arm 140 is carried, like the robot arm 11, by the mobile carriage 18. In the example illustrated in FIG. 2, the articulated arm 140 carrying the image acquisition system 14 is carried by a mobile carriage 18. Nothing further excludes, according to other examples, having an image acquisition system 14 worn by the operator to acquire position information of the anatomy of the patient 30. The position and the orientation in space of the image acquisition system 14 are for example known in the reference frame of the robotic device 10, that is to say through knowledge of its geometry when it is carried by the robot arm 11, or through the use of a 3D localization system such as an optical or electromagnetic navigation system.

As illustrated in FIGS. 1 and 2, the robotic device 10 also includes a storage medium 15 storing a biomechanical model of the anatomical structures of the human body. In the examples illustrated by FIGS. 1 and 2, the storage medium 15 is shown as being distinct from the control circuit 16. The storage medium 15 can however also, according to other exemplary embodiments, be one of the storage means of said control circuit 16. It should be noted that the biomechanical model of the human body is not necessarily specific to the patient 30 considered, and may be a biomechanical model of a generic patient, for example of the same sex, size, build, etc. as the patient 30 on whom the medical gesture must be performed.

The biomechanical model preferably includes the main anatomical structures of the thoraco-abdomino-pelvic area, such as the thoracic and abdominal walls, the muscles, tendons, bones and joints, the organs, the vascular network, etc., as well as their deformation models and their mechanical interactions. The biomechanical model also preferably takes into account the effects of gravity as a function of the position of the patient 30. Such biomechanical models are known in the scientific literature; see for example the following publications:
- “SOFA: A Multi-Model Framework for Interactive Physical Simulation”, F. Faure et al., Soft Tissue Biomechanical Modeling for Computer Assisted Surgery - Studies in Mechanobiology, Tissue Engineering and Biomaterials, Volume 11, Springer,
- “A Personalized Biomechanical Model for Respiratory Motion Prediction”, B. Fuerst et al., International Conference on Medical Image Computing and Computer-Assisted Intervention, 2012,
- “Patient-Specific Biomechanical Model as Whole-Body CT Image Registration Tool”, Mao Li et al., Medical Image Analysis, 2015, May, pages 22-34.

The biomechanical model can for example be created by transcribing a database of three-dimensional medical images (CT scans, MRI, etc.). The geometry of the structures of interest can be extracted from the medical images by segmentation and reconstruction algorithms. Analysis of the image database makes it possible to calculate an average geometry of the components of the biomechanical model, as well as the main deformation parameters representative of all the medical images in the database. It is possible to assign to each of the structures mechanical characteristics and boundary conditions in order to create a biomechanical model. The biomechanical model preferably includes a modeling of the musculoskeletal system composed of the bones, muscles, tendons, ligaments and cartilage.

As illustrated in FIGS. 1 and 2, the robotic device 10 also includes a processing circuit 17. The processing circuit 17 is configured to determine, as a function of the biomechanical model of the anatomical structures of the human body and as a function of the position information acquired by the image acquisition system 14, a position setpoint and an orientation setpoint for the guide tool 12. The processing circuit 17 comprises for example one or more processors and storage means (magnetic hard disk, electronic memory, optical disk, etc.) in which a computer program product is stored, in the form of a set of program code instructions to be executed to determine the position setpoint and the orientation setpoint. Alternatively or in addition, the processing circuit 17 comprises one or more programmable logic circuits (FPGA, PLD, etc.), and / or one or more specialized integrated circuits (ASIC, etc.), and / or a set of discrete electronic components, etc., suitable for determining the position setpoint and the orientation setpoint. In the examples illustrated by FIGS. 1 and 2, the processing circuit 17 is shown as distinct from the control circuit 16. The processing circuit 17 can however also, according to other embodiments, be combined with or use equipment also used by said control circuit 16. In addition, the storage medium 15 is shown as distinct from the processing circuit 17. The storage medium 15 may however also be, according to other embodiments, one of the storage means of said processing circuit 17.

The position setpoint and the orientation setpoint for the guide tool 12 are also determined as a function of a trajectory to be observed by the medical instrument 13 during the medical intervention. In the case of a medical intervention requiring the medical instrument 13 to be introduced into the body of the patient 30, the trajectory corresponds to the path which must be traveled by the medical instrument 13 inside the body of the patient 30, and along which said medical instrument must be guided during the medical intervention. The trajectory corresponds for example to the position of an entry point, for example on the external surface of the anatomy of the patient 30, through which the medical instrument 13 must enter the body of the patient 30, as well as to the position of a target point inside the patient 30, at the targeted anatomical structure, to be reached with said medical instrument 13. The entry point and the target point are for example stored in the form of coordinates in a coordinate system associated with the anatomy of the patient 30.
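Purely by way of illustration of the preceding paragraph, a trajectory reduced to an entry point and a target point can be represented as two coordinate triplets in a frame attached to the patient's anatomy; the class name, the units and the numerical values below are assumptions made for the example, not a format defined by the device.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Trajectory:
    """A planned trajectory: an entry point on the skin and a target point inside
    the body, both as (x, y, z) coordinates (mm) in a frame attached to the
    patient's anatomy (illustrative convention)."""
    entry: np.ndarray    # entry point on the external surface
    target: np.ndarray   # target point in the lesion or organ

    def direction(self):
        """Unit vector along which the instrument must be inserted."""
        d = self.target - self.entry
        return d / np.linalg.norm(d)

    def depth(self):
        """Distance the instrument tip must travel from entry to target (mm)."""
        return float(np.linalg.norm(self.target - self.entry))

# Hypothetical coordinates of a liver lesion plan, in mm.
plan = Trajectory(entry=np.array([102.0, -40.0, 55.0]),
                  target=np.array([61.0, -12.0, 98.0]))
print(np.round(plan.direction(), 3), round(plan.depth(), 1))
```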
In the case of a medical intervention which does not require the medical instrument 13 to be introduced into the body of the patient 30, for example in the case of a focused ultrasound treatment instrument, the trajectory corresponds to the path which must be traveled by the ultrasonic waves inside the body of the patient 30. The trajectory corresponds for example to the position of an entry point, for example on the external surface of the anatomy of the patient 30, through which the ultrasonic waves must penetrate the body of the patient 30, as well as to the position of a target point inside the patient 30 towards which the ultrasonic waves are to be focused. The entry point and the target point are for example stored in the form of coordinates in a reference frame associated with the anatomy of the patient 30.

The trajectory can be predetermined by means other than the robotic device 10, in which case it is for example stored in the storage medium 15 before the medical intervention. Alternatively or in addition, the trajectory can also be determined by means of the robotic device 10, as described in the following description.

For example, the processing circuit 17 integrates algorithms for matching the biomechanical model with the position information of the anatomy of the patient 30 provided by the image acquisition system 14. Thus, the processing circuit 17 can determine the position and the orientation of the patient 30 in the reference frame associated with the robotic device 10. The processing circuit 17 can also determine the position of the entry point and the position of the target point of the trajectory in said reference frame, taking into account the deformations of the anatomical structures of the patient 30 (due to gravity, breathing, mechanical contact with a medical instrument, etc.) relative to the configuration of the anatomical structures of the patient 30 that was considered when determining said trajectory. For example, an algorithm makes it possible to propagate the movements of the surface of the skin to the internal volume and to correctly calculate the position of the internal anatomical structures. According to another example, it is possible to determine the position and the deformation of an organ from position information of the vessels of this organ (position information provided for example by an ultrasound probe). According to another example, it is possible to determine the position and the deformation of an organ from position information of the external surface of said organ (position information provided for example by an endoscope).

The acquisition of the position information of the anatomy of the patient 30 and the calculation of the correspondence of the biomechanical model with said position information of the anatomy of the patient 30 are preferably carried out in real time or almost real time, so that the position of the entry point and the position of the target point in the reference frame can be updated in real time or almost real time to follow the movements and deformations of the anatomical structures of the patient 30. This update can also be performed during the insertion of the medical instrument 13 into the body of the patient 30, in order to take account of the deformations induced by the movement of said medical instrument 13.
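A schematic way to picture the propagation of skin-surface movements to the internal volume mentioned above is a weighted interpolation of the measured surface displacements. This is only a stand-in for the biomechanical computation performed by the processing circuit 17; the weighting law, the length scale and the data are invented for the example.

```python
import numpy as np

def update_internal_point(point, surface_ref, surface_now, length_scale=80.0):
    """Shift an internal point according to the displacement of nearby surface points.

    surface_ref : (N, 3) surface points acquired at the reference instant (mm)
    surface_now : (N, 3) the same points at the current instant (mm)
    The displacement of each surface point is averaged with Gaussian weights that
    decrease with the distance to the internal point (length_scale in mm).
    """
    displacements = surface_now - surface_ref
    d2 = np.sum((surface_ref - point) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * length_scale ** 2))
    return point + (w[:, None] * displacements).sum(axis=0) / w.sum()

# Toy data: 200 skin points that rise by 8 mm along z (e.g. an inhalation),
# and a target point 60 mm below the skin (all values are made up).
rng = np.random.default_rng(0)
skin_ref = rng.uniform([-100, -100, 0], [100, 100, 5], size=(200, 3))
skin_now = skin_ref + np.array([0.0, 0.0, 8.0])
target_ref = np.array([10.0, -20.0, -60.0])
target_now = update_internal_point(target_ref, skin_ref, skin_now)
print(np.round(target_now - target_ref, 2))       # target shifted by the interpolated motion
```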
After having determined the parameters of the trajectory (positions of the entry point and of the target point) in the reference frame associated with the robotic device 10, or simultaneously with this determination, the processing circuit 17 determines a position setpoint and an orientation setpoint for the guide tool 12 allowing said trajectory to be respected. The control circuit 16 can then control the robot arm 11 to place, or assist the operator in placing, the guide tool 12 in said position setpoint and said orientation setpoint determined by the processing circuit 17. In preferred embodiments, the control circuit 16 is adapted to control the robot arm 11 according to at least one of the following modes:
- an automatic mode,
- a collaborative mode,
- an automatic tracking mode,
- a collaborative tracking mode.

In automatic mode, the control circuit 16 moves the robot arm 11 from its current position and orientation to the position setpoint and the orientation setpoint by automatically calculating the path between the current position and the position setpoint.

In collaborative mode, the control circuit 16 moves the robot arm 11 in the direction of the forces exerted by the operator, these forces being able to be exerted on the guide tool 12 or on one of the axes of the robot arm 11. The forces are measured and calculated using one or more sensors (not shown in the figures) fitted to the end of the robot arm 11 and / or to each of its axes. Geometric constraints can be integrated into the collaborative mode to restrict the movements of the robot arm 11 in order to facilitate the medical procedure. For example, movements can be constrained within an area, outside an area, along an axis or a curve, around a point, etc. The constraint can be defined by any type of geometric shape and associated behavior (inclusion / exclusion). In collaborative mode, the control circuit 16 assists the operator in placing the guide tool 12 in the position setpoint and the orientation setpoint.

In automatic tracking mode (or “tracking” mode), the control circuit 16 moves the robot arm 11 in the direction of the movements of the patient 30 to keep the guide tool 12 in the position setpoint and the orientation setpoint updated in real time or almost real time by the processing circuit 17. In this case, the medical instrument 13 moves in the reference frame associated with the robotic device 10, but remains substantially immobile over time in a coordinate system linked to the target organ.

In collaborative tracking mode, the control circuit 16 moves the robot arm 11 in the direction of the movements of the patient 30, with flexibility around the commanded position. The operator can for example exert forces on the guide tool 12 and slightly and temporarily deflect the position of the guide tool 12 relative to the position setpoint and the orientation setpoint. The robot arm 11 exerts forces opposite to those of the operator which aim to bring the guide tool 12 back to the position setpoint and the orientation setpoint as soon as no effort is exerted by the operator. The level of flexibility can for example be adjusted by a stiffness or distance parameter.
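To make the notions of orientation setpoint and of collaborative tracking more tangible, the sketch below aligns the tool axis with the entry-to-target direction and lets the commanded position yield to the operator's effort through an adjustable stiffness. The compliance law and all numerical values are assumptions for illustration and do not describe the device's actual control laws.

```python
import numpy as np

def orientation_setpoint(entry, target):
    """Rotation matrix whose z-axis is aligned with the insertion direction
    (entry -> target); x and y complete a right-handed orthonormal frame."""
    z = target - entry
    z = z / np.linalg.norm(z)
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, z)) > 0.9:              # avoid a degenerate cross product
        helper = np.array([0.0, 1.0, 0.0])
    x = np.cross(helper, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])

def collaborative_tracking_command(position_setpoint, operator_force, stiffness=2000.0):
    """Commanded position in collaborative tracking mode: the arm follows the
    setpoint but yields to the operator's force, with a stiffness (N/m) that sets
    the level of flexibility; the deviation vanishes when the force does."""
    return position_setpoint + operator_force / stiffness

entry = np.array([0.10, -0.04, 0.05])             # metres, illustrative
target = np.array([0.06, -0.01, 0.10])
R_set = orientation_setpoint(entry, target)       # orientation setpoint
p_set = entry                                     # position setpoint at the entry point
p_cmd = collaborative_tracking_command(p_set, operator_force=np.array([4.0, 0.0, 0.0]))
print(np.round(R_set, 3))
print(np.round((p_cmd - p_set) * 1000, 1))        # 2 mm deflection along x under a 4 N push
```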
FIG. 3 schematically represents a preferred embodiment, in which the robotic device 10 comprises a man-machine interface equipment 19. In the example illustrated in FIG. 3, the man-machine interface equipment 19 is a display screen, preferably a touch screen. The man-machine interface equipment 19 allows the operator to control the robotic device 10 and, possibly, to view images related to the medical procedure to be performed. For example, the man-machine interface equipment 19 can be used during the planning of the medical gesture, in order to establish the trajectory of the medical instrument 13, or to visualize the progress of the medical instrument 13 in the body of the patient 30, for example by displaying in real time or almost real time the position of the medical instrument 13 relative to the position of the target point in the reference frame. The man-machine interface equipment 19 can also be used to display the images supplied by the image acquisition system 14.

In the example illustrated by FIG. 3, the man-machine interface equipment 19 is carried by a mobile carriage 18 also carrying the robot arm 11. Nothing excludes, according to other examples, having the man-machine interface equipment 19 carried by a separate console, or mounted on a rail of the operating table 20, for example removably. In addition, other man-machine interface equipment 19 can be considered, alternatively or in addition. For example, the man-machine interface equipment 19 may include a mouse, a keyboard, a touchpad, a control lever (“joystick”), a non-contact motion detection module which locates the movements of the hand, fingers, head or eyes of the operator, or a voice command module, etc. In addition, 2D and / or 3D vision glasses can also replace or supplement the display screen. In order to secure the use of the robotic device 10, the man-machine interface equipment 19 can also include an acknowledgment module (pedal with or without a wire, button box, remote control, switch on the robot arm 11) for securing the movements of the robot arm 11, which are then conditional on the activation of said acknowledgment module.

In preferred embodiments, the processing circuit 17 is configured to determine or assist the operator in determining the trajectory of the medical instrument 13 based on images of the patient. In the following description, a non-limiting example of implementation of the robotic device 10 is described for planning a medical intervention requiring the introduction of the medical instrument into the body of the patient 30. The medical intervention can be defined for example by the trajectory, the medical instrument 13 to be used and the treatment parameters. The trajectory is for example made up of a target point located in the organ to be treated and an entry point located for example at the level of the skin. The medical instrument 13 is defined by several properties such as its length, its diameter, its 3D geometric shape, etc. The treatment parameters can include ablation technology settings such as the power of the delivered current, the treatment time, the diameter of the treated area, the distance margins, etc.
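As a purely illustrative aside, the planning data just described (trajectory, properties of the medical instrument 13, treatment parameters) can be pictured as a simple structured record; the field names, units and values below are assumptions and not a file format defined by the robotic device 10.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class InstrumentSpec:
    """Properties of the medical instrument used for planning."""
    name: str
    length_mm: float
    diameter_mm: float

@dataclass
class TreatmentParameters:
    """Ablation-technology settings of the planned treatment."""
    power_w: float
    duration_s: float
    ablation_diameter_mm: float
    margin_mm: float

@dataclass
class PlannedIntervention:
    """A planned minimally invasive gesture: one trajectory (entry and target
    points, in mm, in a frame attached to the patient) plus the instrument
    and the treatment parameters."""
    entry_mm: Tuple[float, float, float]
    target_mm: Tuple[float, float, float]
    instrument: InstrumentSpec
    treatment: TreatmentParameters

# Purely hypothetical example of a radiofrequency ablation plan.
plan = PlannedIntervention(
    entry_mm=(102.0, -40.0, 55.0),
    target_mm=(61.0, -12.0, 98.0),
    instrument=InstrumentSpec("RF needle", length_mm=150.0, diameter_mm=1.5),
    treatment=TreatmentParameters(power_w=50.0, duration_s=720.0,
                                  ablation_diameter_mm=30.0, margin_mm=5.0),
)
print(plan.instrument.name, plan.treatment.ablation_diameter_mm)
```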
The robotic device 10 can, for example, load images of the patient 30 (CT scanner, PET or “Positron Emission Tomography”, MRI, radiography, ultrasound, etc.) from the hospital network or from an external network (cloud computing), or from an external storage medium (USB key, CD, DVD, etc.), and allow them to be viewed, for example on the man-machine interface equipment 19, according to cutting planes in two dimensions and as images reconstructed in three dimensions. For example, rigid and non-rigid registration algorithms make it possible to merge several images of the same patient 30 in order to provide the operator with all the anatomical and functional information necessary for planning the medical intervention.

The operator can then plan one or more trajectories depending on the medical intervention to be performed. In the case of ablation by irreversible electroporation, for example, the robotic device 10 makes it possible to create exactly parallel trajectories to optimize the effectiveness of the treatment. The positions of the target point and of the entry point of the trajectory are for example identified manually in the images by the operator. The trajectory of the medical instrument 13 in the anatomy can be viewed in the images and modified to ensure that the end of the medical instrument 13 reaches an optimal target point and that the insertion of the medical instrument 13 does not damage sensitive anatomical structures between the entry point and the target point. To facilitate the planning decision-making process, the robotic device 10 can integrate segmentation algorithms which automatically identify the contours and volumes of certain organs of interest, of nerves, arteries, veins and vessels, of bones, as well as of the lesions to be treated.

Alternatively, the robotic device 10 can automatically determine the target point and the entry point. For example, the target point is calculated by pattern recognition methods and from the parameters of the treatment and the volume of the target lesion. The entry point can be calculated by criteria optimization methods, such as the distance between the trajectory of the medical instrument 13 and sensitive anatomical structures and the position relative to preferential insertion zones defined at the level of the skin. Alternatively or in addition, the robotic device 10 can also accumulate, as it is used, large amounts of planning data which it re-uses and analyzes in order to propose, by an artificial intelligence algorithm, a selection of optimal entry points and target points.

In preferred embodiments, the processing circuit 17 is configured to adjust or assist the operator in adjusting the parameters of the treatment to be carried out during the medical intervention, by simulating the effects of said parameters as a function of images of the patient. For example, the robotic device 10 can, from the parameters of the treatment, the trajectory information and the medical instruments 13, calculate the effects of the treatment on the anatomy and make it possible to visualize a precise simulation thereof on the images of the patient 30. For a thermal ablation, for example, the calculation can take into account in particular the presence of adjacent vessels and their cooling impact (“heat-sink effect”). The operator can then adjust the planning data to optimize the clinical outcome of the treatment.

The planning process described above therefore makes it possible to plan a wide variety of medical procedures such as laser surgery, ablation by cryotherapy, radiofrequency, microwave or electroporation, brachytherapy, endoscopy, and any technique requiring the insertion of one or more medical instruments 13 into the body of a patient 30. The planning process described above also makes it possible to plan a wide variety of medical gestures not requiring the introduction of a medical instrument 13 into the body of a patient 30, as for example in the case of a focused ultrasound treatment.
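The criteria-optimization idea mentioned above for the automatic choice of an entry point can be illustrated by a toy example: candidate entry points are scored by the smallest distance between the corresponding straight path and a set of points sampled on sensitive structures, and the safest candidate is kept. The geometry, the clearance threshold and the scoring rule are invented for the sketch and are much simpler than an actual planning algorithm.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Shortest distance between point p and the segment [a, b]."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def safest_entry_point(candidates, target, sensitive_points, min_clearance=10.0):
    """Return the candidate entry point whose straight path to the target stays
    farthest from every sensitive structure, provided the clearance (mm) is met."""
    best, best_clearance = None, -np.inf
    for entry in candidates:
        clearance = min(point_to_segment_distance(s, entry, target)
                        for s in sensitive_points)
        if clearance > best_clearance:
            best, best_clearance = entry, clearance
    if best_clearance < min_clearance:
        return None, best_clearance               # no candidate is safe enough
    return best, best_clearance

# Invented geometry (mm): two candidate skin entry points, one target point,
# and two points sampled on sensitive structures (e.g. vessels).
target = np.array([60.0, -10.0, 95.0])
candidates = [np.array([100.0, -40.0, 50.0]), np.array([95.0, 20.0, 40.0])]
vessels = [np.array([80.0, -30.0, 70.0]), np.array([70.0, 10.0, 80.0])]
entry, clearance = safest_entry_point(candidates, target, vessels)
print(entry, round(clearance, 1))                 # keeps the second candidate here
```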
All the data necessary for planning can be saved by the robotic device 10 in the storage means of the processing circuit 17 or in the storage medium 15, or on an external storage medium, and reloaded later, either to modify certain elements or to execute the treatment on the day of the operation with the robotic device 10.

We now describe an example of implementation of the robotic device 10 for performing a previously planned medical gesture requiring the insertion of a medical instrument 13 into the body of the patient 30. Before starting the operation, the robotic device 10 is brought into the room and placed next to the patient 30. In the case of a robot arm 11 mounted on a mobile carriage 18, the robotic device 10 is immobilized prior to a phase of registration of the patient 30. The operator then controls the robotic device 10, for example by means of the man-machine interface equipment 19, to start the registration phase of the patient 30. The registration phase of the patient 30 aims to determine the position of the patient 30 in the reference frame associated with the robotic device 10, but also the position of the entry point and of the target point, as well as the position setpoint and the orientation setpoint of the guide tool 12, using the biomechanical model, the position information provided by the image acquisition system 14 and the planned trajectory.

Once the position of the patient 30 is known and matched with the preoperative images, the operator launches the positioning phase, which aims to place the guide tool 12 in the position setpoint and the orientation setpoint adapted to the medical intervention to be performed. For example, the robotic device 10 takes into account the dimensions of the guide tool 12, the position of the entry point and the direction towards the target point to automatically position the robot arm 11 so that the guide tool 12 is aligned with the chosen trajectory, at an adjustable safety distance from the entry point. The operator can then control the robot arm 11, for example in collaborative mode, to adjust the position of the guide tool 12 as close as possible to the entry point while maintaining alignment on the trajectory, then block the movements of the robot arm 11 and finally insert a medical instrument 13 through the guide tool 12.

In preferred embodiments, the robotic device 10 includes entry point location equipment (not shown in the figures). Thus, thanks to the robotic device 10, the operator can pinpoint the entry point on the skin of the patient 30, at which an incision must be made. For example, the entry point location equipment corresponds to a medical instrument with an atraumatic tip which is inserted into the guide tool 12, or to a laser aiming module integrated in a medical instrument 13 or in the guide tool 12.

After making the incision, the operator can start the guiding phase by inserting the medical instrument 13 through the guide tool 12 until the end of the medical instrument reaches the planned target point. The management of the insertion depth can be based simply on the length of the medical instrument 13 and / or on a mechanical stop system integrated into the guide tool 12. Alternatively or in addition, the guide tool 12 may include a sensor for measuring the insertion depth of the medical instrument 13. The robotic device 10 can then display in real time or almost real time the position of the medical instrument 13 in the images and provide messages to the operator when the target point is near, reached or exceeded.
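The depth management just described amounts to comparing a measured insertion depth with the planned depth derived from the entry and target points. The sketch below shows this bookkeeping with an arbitrary tolerance and invented message wording; it is not the device's actual software.

```python
import numpy as np

def planned_depth_mm(entry, target):
    """Depth to be covered by the instrument tip between entry and target (mm)."""
    return float(np.linalg.norm(np.asarray(target) - np.asarray(entry)))

def insertion_status(inserted_mm, planned_mm, tolerance_mm=2.0):
    """Classify the progress of the insertion with respect to the planned depth."""
    if inserted_mm > planned_mm + tolerance_mm:
        return "target exceeded"
    if abs(inserted_mm - planned_mm) <= tolerance_mm:
        return "target reached"
    if inserted_mm > planned_mm - 10.0:
        return "target near"
    return "inserting"

depth = planned_depth_mm(entry=[102.0, -40.0, 55.0], target=[61.0, -12.0, 98.0])
# Depth values as they might be reported, e.g., by a depth sensor on the guide tool.
for measured in (20.0, depth - 5.0, depth, depth + 5.0):
    print(round(measured, 1), insertion_status(measured, depth))
```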
In another variant, the medical instrument 13 is mechanically attached to the guide tool 12 and the robot arm 11 automatically inserts the medical instrument 13 to the planned target point. During insertion, the processing circuit 17 can use the biomechanical model of the patient 30 to estimate the local deformations of the organs or anatomical structures traversed by the medical instrument 13 and take them into account to update the position of the target point.

Depending on the needs of the operation during the guidance phase, the robotic device 10 is for example activated in automatic tracking mode or in collaborative tracking mode to maintain the position of the guide tool 12 relative to the target anatomy whatever the movements of the patient 30. The robotic device 10 can also be activated in collaborative mode during the guidance phase, with or without applying geometric constraints. The collaborative mode constrained on the axis of the trajectory is, for example, useful for performing staged biopsies.

When the medical instrument 13 has reached the target point, the planned medical intervention can be carried out for diagnostic or localized treatment purposes: for example the removal of tissue for a biopsy, the delivery of liquid nitrogen for cryotherapy, the generation of an electric current for radiofrequency ablation, the injection of radioactive sources for brachytherapy, the direct visualization of the anatomy and the insertion of medical instruments into the working channel of the endoscope for endoscopic surgery, etc.

At any time during or after the guidance phase, the operator can verify the correct execution of the insertion of the medical instrument 13 by control imaging. Depending on the equipment available at the hospital and in the operating room, an examination of the anatomical area of interest can be performed with a fixed or mobile imaging device (CT scanner, MRI, radiology C-arm, ultrasound probe, etc.). In preferred embodiments, the images are directly transferred to the robotic device 10, the processing circuit 17 of which comprises, for example, registration algorithms which make it possible to automatically merge these intraoperative images with the preoperative images. The robotic device 10 then displays the planning information superimposed on the intraoperative images to assess the progress or the effectiveness of the treatment and to determine the corrections to be made if necessary. The processing circuit 17 can also include, in preferred embodiments, segmentation algorithms for automatically identifying a necrotic area, comparing it with the planned area, calculating and displaying the margins obtained in diameter or in volume, and indicating the diameter or volume remaining to be treated. The robotic device 10 can also, optionally, provide the information necessary for a complementary treatment, such as the positions and parameters of one or more additional ablation trajectories.
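As a simplified illustration of the margin computation mentioned above, the sketch below approximates both the lesion and the segmented necrotic area as spheres and returns the worst-case margin; a real implementation would work directly on the segmented volumes, so the spherical assumption, the names and the numbers are illustrative only.

```python
import numpy as np

def minimal_margin_mm(lesion_center, lesion_radius, ablation_center, ablation_radius):
    """Worst-case (minimal) margin between a spherical lesion and a spherical
    ablation zone: positive if the lesion is fully covered with at least that
    margin, negative if part of the lesion lies outside the necrotic area."""
    offset = float(np.linalg.norm(np.asarray(ablation_center) - np.asarray(lesion_center)))
    return ablation_radius - (offset + lesion_radius)

# Invented example (mm): lesion radius 5, ablation radius 15, centers about 2.8 mm apart.
margin = minimal_margin_mm([61.0, -12.0, 98.0], 5.0, [63.0, -12.0, 96.0], 15.0)
print(round(margin, 1), "mm of margin remaining" if margin >= 0 else "mm of margin missing")
```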
More generally, it should be noted that the embodiments and implementations considered above have been described by way of nonlimiting examples, and that other variants are therefore possible. In particular, the invention has been described by considering that the medical instrument 13 is mounted on the robot arm 11 by means of a guide tool 12. It should however be noted that the robotic device 10 can also be implemented without using a guide tool 12. For example, the use of a guide tool 12 is not necessary in the case where the medical instrument 13 does not have to be introduced into the body of the patient 30, for example in the case of an external treatment with focused ultrasound. In addition, in the case where the medical instrument 13 must be inserted into the body of the patient 30, the use of a guide tool 12 is above all useful if it is the operator who inserts the medical instrument 13 into the body of the patient 30, but not necessarily if it is the robot arm 11 which automatically inserts the medical instrument 13 into the body of the patient 30.
Claims:
Claims (14)

1 - Robotic device (10) for medical intervention on a patient (30) with a medical instrument (13), comprising:
- a robot arm (11) having several degrees of freedom, having an end adapted to receive the medical instrument,
- an image acquisition system (14) adapted to acquire position information of the patient's anatomy,
said robotic device being characterized in that it comprises:
- a storage medium (15) comprising a biomechanical model of the anatomical structures of the human body,
- a processing circuit (17) configured to determine a position setpoint and an orientation setpoint for said medical instrument as a function of the biomechanical model, as a function of the position information of the anatomy of said patient and as a function of a trajectory to be respected by the medical instrument (13) to carry out the medical intervention,
- a control circuit (16) configured to control the robot arm (11) to place or assist in the placement of the medical instrument in the position setpoint and the orientation setpoint.

2 - Robotic device (10) according to claim 1, wherein the image acquisition system (14) is of the non-irradiating type.

3 - Robotic device (10) according to one of claims 1 to 2, wherein the image acquisition system (14) comprises at least one so-called contactless equipment suitable for acquiring position information without contact with the patient (30).

4 - Robotic device (10) according to claim 3, wherein the image acquisition system (14) comprises at least one of the following contactless equipment:
- a stereoscopic camera,
- a structured light camera,
- a time-of-flight camera,
- a depth measurement camera.

5 - Robotic device (10) according to one of claims 1 to 4, wherein the image acquisition system (14) comprises at least one so-called contact equipment suitable for acquiring position information with contact with the patient (30).

6 - Robotic device (10) according to claim 5, wherein the image acquisition system (14) comprises at least one of the following contact equipment:
- an ultrasonic probe,
- an endoscope.

7 - Robotic device (10) according to one of claims 1 to 6, wherein the control circuit (16) is configured to control the robot arm according to at least one of the following modes:
- an automatic mode,
- a collaborative mode,
- an automatic tracking mode,
- a collaborative tracking mode.

8 - Robotic device (10) according to one of claims 1 to 7, wherein the processing circuit (17) is configured to determine or assist in determining the trajectory of the medical instrument based on images of the patient.

9 - Robotic device (10) according to one of claims 1 to 8, wherein the processing circuit (17) is configured to adjust or assist in the adjustment of parameters of a treatment to be performed during the medical intervention by simulating the effects of said parameters as a function of images of the patient.

10 - Robotic device (10) according to one of claims 1 to 9, comprising a guide tool (12) adapted to guide the medical instrument (13), fixed or intended to be fixed to one end of the robot arm (11).
11 - Robotic device (10) according to one of claims 1 to 10, comprising at least one man-machine interface equipment (19) from among the following equipment:
- a display screen,
- a touch display screen,
- a keyboard,
- 2D and / or 3D vision glasses,
- a joystick,
- a motion detection module,
- a voice command module.

12 - Robotic device (10) according to one of claims 1 to 11, comprising at least one entry point location equipment from among the following equipment:
- a medical instrument with an atraumatic tip,
- a laser aiming module.

13 - Robotic device (10) according to one of claims 1 to 12, wherein the medical instrument (13) is one of the following medical instruments:
- a biopsy needle,
- a catheter,
- an endoscope,
- a focused ultrasound treatment instrument,
- a laser treatment instrument,
- a cryotherapy treatment instrument,
- a radiofrequency treatment instrument,
- an electroporation treatment instrument,
- a brachytherapy treatment instrument.

14 - Robotic device (10) according to one of claims 1 to 13, comprising a mobile carriage (18) carrying the robot arm (11), said mobile carriage comprising immobilization means.
Patent family:
Publication number | Publication date
IL274453D0 | 2020-06-30
CN110573105A | 2019-12-13
JP2021502195A | 2021-01-28
AU2018364406A1 | 2020-05-28
CA3082046A1 | 2019-05-16
EP3706656A1 | 2020-09-16
WO2019092372A1 | 2019-05-16
KR20200085762A | 2020-07-15
US20200281667A1 | 2020-09-10
SG11202004181VA | 2020-06-29
FR3073135B1 | 2019-11-15
Cited references:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题 US20080186378A1|2007-02-06|2008-08-07|Feimo Shen|Method and apparatus for guiding towards targets during motion| US20100063514A1|2008-05-09|2010-03-11|Michael Maschke|Device and method for a medical intervention| US20110257514A1|2008-10-03|2011-10-20|Centre National De La Recherche Scientifique |Image processing method for estimating a brain shift in a patient| EP2468207A1|2010-12-21|2012-06-27|Renishaw Limited|Method and apparatus for analysing images| US20150366546A1|2014-06-18|2015-12-24|Siemens Medical Solutions Usa, Inc.|System and method for real-time ultrasound guided prostate needle biopsies using a compliant robotic arm| US20170265947A1|2016-03-16|2017-09-21|Kelly Noel Dyer|Trajectory guidance alignment system and methods|WO2021130670A1|2019-12-23|2021-07-01|Mazor Robotics Ltd.|Multi-arm robotic system for spine surgery with imaging guidance| WO2022003154A1|2020-07-03|2022-01-06|Squaremind|Robotic device for guiding a robotic arm|JP3492697B2|1994-10-07|2004-02-03|セントルイスユニバーシティー|Surgical guidance device with reference and localization frame| EP3416560A4|2015-12-28|2019-12-25|Metritrack, Inc.|System and method for the coregistration of medical image data|WO2021206161A1|2020-04-10|2021-10-14|川崎重工業株式会社|Diagnosis/treatment assistance robot, diagnosis/treatment assistance robot system, and diagnosis/treatment assistance method| CN111437174A|2020-04-16|2020-07-24|深圳瀚维智能医疗科技有限公司|Physiotherapy massage robot|
Legal status:
2018-11-30 | PLFP | Fee payment | Year of fee payment: 2
2019-05-10 | PLSC | Publication of the preliminary search report | Effective date: 20190510
2019-11-22 | CA | Change of address | Effective date: 20191014
2019-11-29 | PLFP | Fee payment | Year of fee payment: 3
2020-11-27 | PLFP | Fee payment | Year of fee payment: 4
2021-11-30 | PLFP | Fee payment | Year of fee payment: 5
Priority:
申请号 | 申请日 | 专利标题 FR1760553A|FR3073135B1|2017-11-09|2017-11-09|ROBOTIC DEVICE FOR MINI-INVASIVE MEDICAL INTERVENTION ON SOFT TISSUE| FR1760553|2017-11-09|FR1760553A| FR3073135B1|2017-11-09|2017-11-09|ROBOTIC DEVICE FOR MINI-INVASIVE MEDICAL INTERVENTION ON SOFT TISSUE| KR1020207013357A| KR20200085762A|2017-11-09|2018-11-08|Robotic device for minimally incised medical intervention on soft tissue| CA3082046A| CA3082046A1|2017-11-09|2018-11-08|Robotic device for a minimally invasive medical intervention on soft tissues| AU2018364406A| AU2018364406A1|2017-11-09|2018-11-08|Robotic device for a minimally invasive medical intervention on soft tissues| PCT/FR2018/052769| WO2019092372A1|2017-11-09|2018-11-08|Robotic device for a minimally invasive medical intervention on soft tissues| CN201880027746.8A| CN110573105A|2017-11-09|2018-11-08|Robotic device for minimally invasive medical intervention on soft tissue| JP2020526272A| JP2021502195A|2017-11-09|2018-11-08|Robotic equipment for minimally invasive medical interventions in soft tissues| US16/762,876| US20200281667A1|2017-11-09|2018-11-08|Robotic device for a minimally invasive medical intervention on soft tissues| SG11202004181VA| SG11202004181VA|2017-11-09|2018-11-08|Robotic device for a minimally invasive medical intervention on soft tissues| EP18819178.7A| EP3706656A1|2017-11-09|2018-11-08|Robotic device for a minimally invasive medical intervention on soft tissues| IL274453A| IL274453D0|2017-11-09|2020-05-05|Robotic device for a minimally invasive medical intervention on soft tissues| 相关专利