Imaging of areas outside the abdomen to improve placement and control of an in-use surgical device
Patent abstract:
Various systems and methods for controlling surgical devices are disclosed. A computer system, such as a central surgical controller, can be configured to be communicably coupled to a surgical device and to a camera configured to view an operating room. The computer system can be programmed to receive images of surgical team members and surgical devices within the operating room during a surgical procedure. The computer system can additionally determine conditions or characteristics associated with the surgical team members and surgical devices, such as whether a surgical team member is performing a particular gesture or the position of a surgical device, such as a surgical instrument, during the surgical procedure. The computer system can then control the paired surgical devices accordingly.
Publication number: BR112020013175A2
Application number: R112020013175-0
Filing date: 2018-11-14
Publication date: 2020-12-01
Inventors: Frederick E. Shelton IV; Jason L. Harris; Chester O. Baxter III; Mark S. Zeiner
Applicant: Ethicon LLC
IPC main classification:
Patent description:
[0001] The present application claims the priority benefit of US non-provisional patent application serial No. 16/182,269, entitled IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF AN IN-USE SURGICAL DEVICE.

[0002] The present application claims priority under 35 U.S.C. § 119(e) to US provisional patent application No. 62/729,191, entitled SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION, filed on September 10, 2018, the description of which is incorporated herein by reference.

[0003] The present application also claims priority under 35 U.S.C. § 119(e) to US provisional patent application No. 62/692,747, entitled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE, filed on June 30, 2018, to US provisional patent application No. 62/692,748, entitled SMART ENERGY ARCHITECTURE, filed on June 30, 2018, and to US provisional patent application No. 62/692,768, entitled SMART ENERGY DEVICES, filed on June 30, 2018, the description of each of which is incorporated herein by reference in its entirety.

[0004] The present application also claims priority under 35 U.S.C. § 119(e)

[0005] The present application also claims priority under 35 U.S.C. § 119(e) to US provisional patent application No. 62/650,898, entitled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS, filed on March 30, 2018, to US provisional patent application serial No. 62/650,887, entitled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES, filed on March 30, 2018, to US provisional patent application serial No. 62/650,882, entitled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM, filed on March 30, 2018, and to US provisional patent application serial No. 62/650,877, entitled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS, filed on March 30, 2018, the description of each of which is incorporated herein by reference in its entirety.

[0006] The present application also claims priority under 35 U.S.C. § 119(e) to US provisional patent application serial No. 62/640,417, entitled TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR, filed on March 8, 2018, and to US provisional patent application serial No. 62/640,415, entitled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR, filed on March 8, 2018, the description of each of which is incorporated herein by reference in its entirety.

[0007] The present application also claims priority under 35 U.S.C. § 119(e) to US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, to US provisional patent application serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017, and to US provisional patent application serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the description of each of which is incorporated herein by reference in its entirety.

BACKGROUND

[0008] The present invention relates to various surgical systems. Surgical procedures are typically performed in surgical operating theaters or rooms of a healthcare facility, such as a hospital. A sterile field is typically created around the patient. The sterile field can include the scrubbed team members, who are appropriately attired, and all the furniture and accessories in the area. Various surgical devices and systems are used in the performance of a surgical procedure.

SUMMARY

[0009] In one general aspect, a computer system is provided that is configured to be communicatively coupled to a surgical device and to a camera configured to view an operating room.
The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of an individual inside the operating room through the camera; determine whether the individual is making a gesture based on the image; and control the surgical device according to the gesture.

[0010] In another general aspect, a computer system is provided that is configured to be communicatively coupled to a surgical device and to a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of the surgical device inside the operating room through the camera; determine a position of the surgical device based on the image; and control the surgical device according to the position.

[0011] In yet another general aspect, a computer system is provided that is configured to be communicatively coupled to a surgical device and to a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of the surgical device or of an individual inside the operating room through the camera; determine a position of the surgical device based on the image, if the image is of the surgical device; determine whether the individual is making a gesture based on the image, if the image is of the individual; and control the surgical device according to at least one of the position of the surgical device or the gesture.

FIGURES

[0012] The novel features of the various aspects are set forth with particularity in the appended claims. The described aspects, however, both as to organization and methods of operation, can be better understood by reference to the following description, taken in conjunction with the accompanying drawings, in which:

[0013] The various aspects described herein, both as to organization and methods of operation, together with additional objects and advantages thereof, can be better understood by reference to the description presented below, considered together with the accompanying drawings, as follows.

[0014] Figure 1 is a block diagram of a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.

[0015] Figure 2 is a surgical system being used to perform a surgical procedure in an operating room, in accordance with at least one aspect of the present description.

[0016] Figure 3 is a central surgical controller paired with a visualization system, a robotic system, and an intelligent instrument, in accordance with at least one aspect of the present disclosure.

[0017] Figure 4 is a partial perspective view of a central surgical controller housing, and of a combination generator module slidably received in a drawer of the central surgical controller housing, in accordance with at least one aspect of the present description.

[0018] Figure 5 is a perspective view of a combination generator module with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, in accordance with at least one aspect of the present description.

[0019] Figure 6 illustrates individual power bus connectors for a plurality of lateral docking ports of a modular side cabinet configured to receive a plurality of modules, in accordance with at least one aspect of the present description.
[0020] Figure 7 illustrates a modular vertical cabinet configured to receive a plurality of modules, in accordance with at least one aspect of the present description.

[0021] Figure 8 illustrates a surgical data network comprising a central modular communication controller configured to connect modular devices located in one or more operating rooms of a healthcare facility, or in any room of a healthcare facility specially equipped for surgical operations, to the cloud, in accordance with at least one aspect of the present description.

[0022] Figure 9 illustrates a computer-implemented interactive surgical system, in accordance with at least one aspect of the present description.

[0023] Figure 10 illustrates a central surgical controller comprising a plurality of modules coupled to the modular control tower, in accordance with at least one aspect of the present description.

[0024] Figure 11 illustrates one aspect of a universal serial bus (USB) central network controller device, in accordance with at least one aspect of the present disclosure.

[0025] Figure 12 is a block diagram of a cloud computing system comprising a plurality of intelligent surgical instruments coupled to central surgical controllers that can connect to the cloud component of the cloud computing system, in accordance with at least one aspect of the present description.

[0026] Figure 13 is a functional module architecture of a cloud computing system, in accordance with at least one aspect of the present description.

[0027] Figure 14 illustrates a diagram of a surgical system with situational awareness, in accordance with at least one aspect of the present description.

[0028] Figure 15 is a timeline representing the situational awareness of a central surgical controller, in accordance with at least one aspect of the present description.

[0029] Figure 16 is a diagram of an illustrative operating room (OR) configuration, in accordance with at least one aspect of the present description.

[0030] Figure 17A is a logic flowchart of a process for controlling a surgical device, in accordance with at least one aspect of the present disclosure.

[0031] Figure 17B is a logic flowchart of a process for generating surgical metadata, in accordance with at least one aspect of the present description.

[0032] Figure 18 is a block diagram of a gesture recognition system, in accordance with at least one aspect of the present description.

[0033] Figure 19 is a representation of a surgeon holding a surgical instrument, in accordance with at least one aspect of the present description.

[0034] Figure 20 is a scatter plot of wrist angle versus surgical procedure outcomes, in accordance with at least one aspect of the present description.

DESCRIPTION

[0035] The applicant of the present application holds the following US patent applications, filed on November 6, 2018, the description of each of which is incorporated herein by reference in its entirety: US patent application No. 16/182,224, entitled SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED

[0036] The applicant of the present application holds the following US patent applications, filed on September 10, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application No.
62/729,183, entitled A CONTROL FOR A SURGICAL NETWORK OR SURGICAL

[0037] The applicant of the present application holds the following US patent applications, filed on August 28, 2018, the description of each of which is incorporated herein by reference in its entirety: US patent application No. 16/115,214, entitled ESTIMATING

[0038] The applicant of the present application holds the following US patent applications, filed on August 23, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application No. 62/721,995, entitled CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION; US provisional patent application No. 62/721,998, entitled SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS; and US provisional patent application No. 62/721,999, entitled IN-

[0039] The applicant of the present application holds the following US patent applications, filed on June 30, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application No. 62/692,747, entitled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE; US provisional patent application No. 62/692,748, entitled SMART ENERGY ARCHITECTURE; and US provisional patent application No. 62/692,768, entitled SMART ENERGY DEVICES.

[0040] The applicant of the present application holds the following US patent applications, filed on June 29, 2018, the description of each of which is incorporated herein by reference in its entirety: US patent application serial No. 16/024,090, entitled CA-

[0041] The applicant of the present application holds the following US provisional patent applications, filed on June 28, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application serial No. 62/691,228, entitled A METHOD OF USING REINFORCED FLEX CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES; US provisional patent application serial No. 62/691,227, entitled CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS; US provisional patent application serial No. 62/691,230, entitled SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE; US provisional patent application serial No. 62/691,219, entitled SURGICAL EVACUATION SENSING AND MOTOR CONTROL; and US provisional patent application serial No. 62/691,257, entitled COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM;

[0042] The applicant of the present application holds the following US provisional patent applications, filed on April 19, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application serial No. 62/659,900, entitled METHOD OF HUB COMMUNICATION.

[0043] The applicant of the present application holds the following US provisional patent applications, filed on March 30, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application serial No. 62/650,898, filed on March 30, 2018, entitled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS; US provisional patent application serial No. 62/650,887, entitled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES; and US provisional patent application serial No.
62/650,882, entitled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM; and US provisional patent application serial No. 62/650,877, entitled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS.

[0044] The applicant of the present application holds the following US patent applications, filed on March 29, 2018, the description of each of which is incorporated herein by reference,

[0045] The applicant of the present application holds the following US provisional patent applications, filed on March 28, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application No. 62/649,302, entitled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES; US provisional patent application serial No. 62/649,294, entitled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD; US provisional patent application serial No. 62/649,300, entitled SURGICAL HUB SITUATIONAL AWARENESS; US provisional patent application serial No. 62/649,309, entitled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER; US provisional patent application No. 62/649,310, entitled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS; and US provisional patent application No. 62/649,291, entitled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;

[0046] The applicant of the present application holds the following US provisional patent applications, filed on March 8, 2018, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application serial No. 62/640,417, entitled TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR; and US provisional patent application serial No. 62/640,415, entitled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR.

[0047] The applicant of the present application holds the following US provisional patent applications, filed on December 28, 2017, the description of each of which is incorporated herein by reference in its entirety: US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM; US provisional patent application serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS; and US provisional patent application serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM.

[0048] Before explaining the various aspects of surgical devices and generators in detail, it should be noted that the illustrative examples are not limited, in application or use, to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples can be implemented or incorporated in other aspects, variations, and modifications, and can be practiced or carried out in various ways. Furthermore, except where otherwise indicated, the terms and expressions used herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader, and not for the purpose of limitation. In addition, it is to be understood that one or more of the aspects, aspect expressions, and/or examples described below can be combined with any one or more of the other aspects, aspect expressions, and/or examples described below.
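To make the gesture-driven control flow summarized in paragraphs [0009] to [0011] above more concrete before the detailed description, the following is a minimal illustrative sketch of one iteration of such a control loop. It is a sketch under stated assumptions, not the claimed implementation: the class names, gesture labels, `classify_gesture` stub, and `device.execute` interface are hypothetical stand-ins for the camera, recognition, and device-control components described in the remainder of this document.

```python
# Minimal sketch of the gesture-driven control loop from the Summary.
# All names (Frame, Command, classify_gesture, GESTURE_TO_COMMAND) are
# hypothetical placeholders, not an actual surgical hub API.
from dataclasses import dataclass
from enum import Enum, auto


class Command(Enum):
    NONE = auto()
    PAUSE_ENERGY = auto()       # e.g., suspend an RF/ultrasonic generator
    RESUME_ENERGY = auto()


@dataclass
class Frame:
    pixels: bytes               # raw image data from the operating room camera


def classify_gesture(frame: Frame) -> str:
    """Stub for the gesture-recognition step; a real system would run a
    trained vision model here (compare the recognition system of Figure 18)."""
    return "none"


GESTURE_TO_COMMAND = {
    "none": Command.NONE,
    "open_palm": Command.PAUSE_ENERGY,
    "thumbs_up": Command.RESUME_ENERGY,
}


def control_step(frame: Frame, device) -> Command:
    """One iteration: receive an image, determine the gesture, and control
    the paired surgical device accordingly."""
    command = GESTURE_TO_COMMAND.get(classify_gesture(frame), Command.NONE)
    if command is not Command.NONE:
        device.execute(command)  # hypothetical device interface
    return command
```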
Central surgical controllers

[0049] Referring to Figure 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., a cloud 104 that can include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one central surgical controller 106 in communication with the cloud 104, which can include a remote server 113. In one example, as illustrated in Figure 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and an intelligent hand-held surgical instrument 112, which are configured to communicate with one another and/or with the central controller 106. In some aspects, a surgical system 102 can include a number M of central controllers 106, a number N of visualization systems 108, a number O of robotic systems 110, and a number P of intelligent hand-held surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.

[0050] Figure 2 represents an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as part of the surgical system 102. The robotic system 110 includes a surgeon's console 118, a patient cart 120 (surgical robot), and a robotic central surgical controller.

[0051] Other types of robotic systems can be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical instruments that are suitable for use with the present description are described in US provisional patent application serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.

[0052] Various examples of cloud-based analysis that are performed by the cloud 104, and that are suitable for use with the present description, are described in US provisional patent application serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.

[0053] In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, charge-coupled device (CCD) sensors and complementary metal-oxide semiconductor (CMOS) sensors.

[0054] The optical components of the imaging device 124 can include one or more light sources and/or one or more lenses. The one or more light sources can be directed to illuminate portions of the surgical field. The one or more image sensors can receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.

[0055] The one or more light sources can be configured to radiate electromagnetic energy in the visible spectrum as well as in the invisible spectrum. The visible spectrum, sometimes called the optical spectrum or luminous spectrum, is the portion of the electromagnetic spectrum that is visible to (that is, can be detected by) the human eye, and can be called visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.
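The band edges just quoted reduce to a simple numeric classification rule; the sketch below merely encodes the approximately 380 nm and 750 nm boundaries given above (the labels for out-of-band light anticipate the next paragraph, and the function name is illustrative only).

```python
def classify_wavelength(nm: float) -> str:
    """Classify a wavelength in nanometers against the approximate
    visible band of about 380 nm to about 750 nm described above."""
    if nm < 380.0:
        return "invisible (ultraviolet side)"
    if nm > 750.0:
        return "invisible (infrared side)"
    return "visible"


assert classify_wavelength(532.0) == "visible"               # green light
assert classify_wavelength(1064.0).startswith("invisible")   # near-infrared
```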
[0056] The invisible spectrum (that is, the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (that is, wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths longer than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths shorter than about 380 nm are shorter than the violet visible spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.

[0057] In various aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present description include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cytoscope, duodenoscope, enteroscope, esophagogastroduodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.

[0058] In one aspect, the imaging device uses multispectral monitoring to discriminate topography and underlying structures. A multispectral image is one that captures image data within specific wavelength ranges along the electromagnetic spectrum. The wavelengths can be separated by filters or through the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet light. Spectral imaging can allow the extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. The use of multispectral imaging is described in greater detail under the heading "Advanced Imaging Acquisition Module" in US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety. Multispectral monitoring can be a useful tool for relocating a surgical field after a surgical task is completed, to perform one or more of the previously described tests on the treated tissue.

[0059] It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in an "operating room", that is, an operating or treatment room, justify the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes into contact with the patient or penetrates the sterile field, including the imaging device 124 and its connectors and components. It will be understood that the sterile field can be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field can be considered an area, immediately around a patient, that has been prepared for a surgical procedure. The sterile field can include the scrubbed team members, who are appropriately attired, and all the furniture and accessories in the area.

[0060] In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more screens that are strategically arranged with respect to the sterile field, as illustrated in Figure 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR.
Various components of the visualization system 108 are described under the heading "Advanced Imaging Acquisition Module" in US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.

[0061] As illustrated in Figure 2, a primary screen 119 is positioned in the sterile field so as to be visible to the operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile screen 107 and a second non-sterile screen 109, which face opposite each other. The visualization system 108, guided by the central controller 106, is configured to use the screens 107, 109, and 119 to coordinate the flow of information to operators inside and outside the sterile field. For example, the central controller 106 can cause the visualization system 108 to display a snapshot of a surgical site, as recorded by an imaging device 124, on a non-sterile screen 107 or 109, while maintaining a live feed of the surgical site on the primary screen 119. The snapshot on the non-sterile screen 107 or 109 can allow a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.

[0062] In one aspect, the central controller 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary screen 119 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input can be in the form of a modification to the snapshot displayed on the non-sterile screen 107 or 109, which can be routed to the primary screen 119 by the central controller 106.

[0063] Referring to Figure 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system 102. The central controller 106 is also configured to coordinate the flow of information to a screen of the surgical instrument 112. The coordinated flow of information is further described in US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the contents of which are incorporated herein by reference in its entirety. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 can be routed by the central controller 106 to the surgical instrument screen 115 within the sterile field, where it can be viewed by the operator of the surgical instrument 112. Exemplary surgical instruments that are suitable for use with the surgical system 102 are described under the heading "Surgical Instrument Hardware" in US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety, for example.

[0064] Referring now to Figure 3, a central controller 106 is shown in communication with a visualization system 108, a robotic system 110, and an intelligent hand-held surgical instrument 112. The central controller 106 includes a central controller screen 135, an imaging module 138, a generator module 140 (which can include a monopolar generator 142, a bipolar generator 144, and/or an ultrasonic generator 143), a communication module 130, a processor module 132, and a storage array 134.
In certain aspects, as illustrated in Figure 3, the central controller 106 additionally includes a smoke evacuation module 126, a suction/irrigation module 128, and/or an OR mapping module 133.

[0065] During a surgical procedure, the application of energy to tissue, for sealing and/or cutting, is usually associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost in addressing this issue during a surgical procedure. To untangle the lines, it may be necessary to disconnect them from their respective modules, which may require a restart of the modules. The modular housing of the central controller 136 provides a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.

[0066] Aspects of the present description present a central surgical controller for use in a surgical procedure that involves the application of energy to tissue at a surgical site. The central surgical controller includes a central controller housing and a combination generator module slidably received in a docking station of the central controller housing. The docking station includes data and power contacts. The combination generator module includes two or more of an ultrasonic energy generating component, a bipolar RF energy generating component, and a monopolar RF energy generating component, which are housed in a single unit. In one aspect, the combination generator module also includes a smoke evacuation component, at least one energy delivery cable to connect the combination generator module to a surgical instrument, at least one component,

[0067] In one aspect, the fluid line is a first fluid line, and a second fluid line extends from the remote surgical site to a suction/irrigation module slidably received in the central controller housing. In one aspect, the central controller housing comprises a fluid interface.

[0068] Certain surgical procedures may require the application of more than one type of energy to the tissue. One type of energy may be more beneficial for cutting the tissue, while a different type of energy may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue, while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present description present a solution in which the modular housing of the central controller 136 is configured to accommodate different generators and to facilitate interactive communication between them. One of the advantages of the modular housing of the central controller 136 is that it allows the quick removal and/or replacement of multiple modules.

[0069] Aspects of the present description present a modular surgical housing for use in a surgical procedure that involves the application of energy to tissue. The modular surgical housing includes a first energy generating module configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, the first energy generating module being slidably movable into electrical engagement with the first power and data contacts, and the first energy generating module being slidably movable out of electrical engagement with the first power and data contacts.
[0070] In addition to the above, the modular surgical housing also includes a second energy generating module configured to generate a second energy, different from the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, the second energy generating module being slidably movable into electrical engagement with the second power and data contacts, and the second energy generating module being slidably movable out of electrical engagement with the second power and data contacts.

[0071] In addition, the modular surgical housing also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy generating module and the second energy generating module. A schematic software analogue of this docking and bus arrangement is sketched after paragraph [0076] below.

[0072] With reference to Figures 3 to 7, aspects of the present description are presented for a modular housing of the central controller 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The modular housing of the central controller 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in Figure 5, the generator module 140 can be a generator module with integrated monopolar, bipolar, and ultrasonic components, supported in a single housing unit 139 slidably insertable into the modular housing of the central controller 136. As illustrated in Figure 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module

[0073] In one aspect, the modular housing of the central controller 136 comprises a modular power and communication backplane 149 with external and wireless communication heads to allow the removable attachment of the modules 140, 126, 128 and interactive communication among the three of them.

[0074] In one aspect, the modular housing of the central controller 136 includes docking stations, or drawers, 151, herein also referred to as drawers, which are configured to slidably receive the modules 140, 126, 128. Figure 4 illustrates a partial perspective view of a central surgical controller housing 136, and a combination generator module 145 slidably received in a docking station 151 of the central surgical controller housing.

[0075] In various aspects, the smoke evacuation module 126 includes a fluid line 154 that carries captured smoke/fluid away from a surgical site and to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path that extends toward the smoke evacuation module 126, which is received in the housing of the central controller 136.

[0076] In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an aspiration fluid line and a suction fluid line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes that extend from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.
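As referenced after paragraph [0071] above, the sketch below is a rough software analogue of the docking arrangement of paragraphs [0069] to [0074]: docking stations whose keyed contacts accept only a matching module (compare paragraph [0081] below) and a shared bus over which docked modules communicate. Every name is hypothetical; this illustrates the architecture and is not an actual hub implementation.

```python
# Illustrative model of keyed docking stations and an inter-module
# communication bus; all names are hypothetical.
class Module:
    def __init__(self, name: str, key: str):
        self.name, self.key = name, key

    def on_message(self, sender: str, payload: str) -> None:
        print(f"{self.name} received {payload!r} from {sender}")


class DockingStation:
    def __init__(self, key: str):
        self.key = key           # contact keying prevents mismatched insertion
        self.module = None

    def dock(self, module: Module) -> bool:
        if module.key != self.key:   # mismatched contacts: refuse the module
            return False
        self.module = module
        return True


class CommunicationBus:
    """Backplane linking the docking ports, as in paragraph [0071]."""
    def __init__(self, stations):
        self.stations = stations

    def broadcast(self, sender: Module, payload: str) -> None:
        for station in self.stations:
            if station.module is not None and station.module is not sender:
                station.module.on_message(sender.name, payload)


# Example: a generator module notifies a smoke evacuation module.
stations = [DockingStation("energy"), DockingStation("evacuation")]
bus = CommunicationBus(stations)
generator = Module("generator", "energy")
evacuator = Module("smoke_evacuator", "evacuation")
assert stations[0].dock(generator) and stations[1].dock(evacuator)
bus.broadcast(generator, "energy_on: begin smoke evacuation")
```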
[0077] In one aspect, the surgical tool includes a drive shaft that has an end effector at a distal end thereof, at least one energy treatment tool associated with the end effector, a suction tube, and an irrigation tube. The suction tube can have an inlet port at a distal end thereof, and the suction tube extends through the drive shaft. Similarly, an irrigation tube can extend through the drive shaft and can have an inlet port near the energy delivery tool. The energy delivery tool is configured to deliver ultrasonic and/or RF energy to the surgical site, and is coupled to the generator module 140 by a cable that initially extends through the drive shaft.

[0078] The irrigation tube can be in fluid communication with a fluid source, and the suction tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the housing of the central controller 136 separately from the suction/irrigation module 128. In such an example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.

[0079] In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations in the modular housing of the central controller 136 can include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the modular housing of the central controller 136. For example, as illustrated in Figure 4, the combination generator module 145 includes side brackets 155 that are configured to slidably engage the corresponding brackets 156 of the corresponding docking station 151 of the modular housing of the central controller 136. The brackets cooperate to guide the docking port contacts of the combination generator module 145 into electrical engagement with the docking port contacts of the modular housing of the central controller 136.

[0080] In some aspects, the drawers 151 of the modular housing of the central controller 136 are of the same, or substantially the same, size, and the modules are sized to be received in the drawers 151. For example, the side brackets 155 and/or 156 can be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are different in size and are each designed to accommodate a specific module.

[0081] In addition, the contacts of a specific module can be keyed to engage the contacts of a specific drawer, to avoid inserting a module into a drawer with mismatched contacts.

[0082] As illustrated in Figure 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 via a communication link 157 to facilitate interactive communication between the modules housed in the modular housing of the central controller 136. The docking ports 150 of the modular housing of the central controller 136 can, alternatively or additionally, facilitate wireless interactive communication between the modules housed in the modular housing of the central controller 136. Any suitable wireless communication can be used, such as Air Titan-Bluetooth.

[0083] Figure 6 illustrates individual power bus connectors for a plurality of lateral docking ports of a modular side cabinet 160 configured to receive a plurality of modules of a central surgical controller 206.
The modular side cabinet 160 is configured to receive and interconnect laterally the modules 161. The modules 161 are slidably inserted into docking stations 162 of the modular side cabinet 160, which includes a backplane for interconnecting the modules 161. As illustrated in Figure 6, the modules 161 are arranged laterally in the modular side cabinet 160. Alternatively, the modules 161 can be arranged vertically in a modular side cabinet.

[0084] Figure 7 illustrates a modular vertical cabinet 164 configured to receive a plurality of modules 165 of the central surgical controller 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of the modular vertical cabinet 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the modular vertical cabinet 164 are arranged vertically, in certain cases a modular vertical cabinet 164 can include drawers that are arranged laterally. In addition, the modules 165 can interact with one another through the docking ports of the modular vertical cabinet 164. In the example of Figure 7, a screen 177 is provided to display data relevant to the operation of the modules 165. In addition, the modular vertical cabinet 164 includes a master module 178 that houses a plurality of submodules slidably received in the master module 178.

[0085] In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source, and is adapted for use with various imaging devices. In one aspect, the imaging device is composed of a modular housing that can be assembled with a light source module and a camera module. The housing can be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned beam imaging. Similarly, the light source module can be configured to provide a white light or a different light, depending on the surgical procedure.

[0086] During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field can lead to undesirable consequences. The imaging device module of the present description is configured to allow the replacement of a light source module or a camera module midstream during a surgical procedure, without having to remove the imaging device from the surgical field.

[0087] In one aspect, the imaging device comprises a tubular housing that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured to snap-fit engage with the first channel. A second channel is configured to slidably receive the light source module, which can be configured to snap-fit engage with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded engagement can be used instead of a snap fit.

[0088] In several examples, multiple imaging devices are placed in different positions in the surgical field to provide multiple views.
The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In various aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.

[0089] Various image processors and imaging devices suitable for use with the present description are described in US patent No. 7,995,045, entitled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, granted on August 9, 2011, which is incorporated herein by reference in its entirety. In addition, US patent No. 7,982,776, entitled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, granted on July 19,

[0090] Figure 8 illustrates a surgical data network 201 comprising a central modular communication controller 203 configured to connect modular devices located in one or more operating rooms of a healthcare facility, or in any room of a healthcare facility specially equipped for surgical operations, to a cloud-based system (e.g., a cloud 204 that can include a remote server 213 coupled to a storage device 205). In one aspect, the central modular communication controller 203 comprises a central network controller 207 and/or a network switch 209 in communication with a network router. The central modular communication controller 203 can also be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 can be configured as a passive, intelligent, or switching network. A passive surgical data network serves as a conduit for the data, allowing the data to be transmitted from one device (or segment) to another and to the cloud computing resources. An intelligent surgical data network includes features to allow the traffic passing through the surgical data network to be monitored, and to configure each port in the central network controller 207 or in the network switch 209.

[0091] The modular devices 1a to 1n located in the operating room can be coupled to the central modular communication controller 203. The central network controller 207 and/or the network switch 209 can be coupled to a network router 211 to connect the devices 1a to 1n to the cloud 204 or to the local computer system 210.

[0092] It will be understood that the surgical data network 201 can be expanded by interconnecting multiple central network controllers 207 and/or multiple network switches 209 with multiple network routers 211. The central modular communication controller

[0093] In one aspect, the surgical data network 201 can comprise a combination of central network controllers, network switches, and network routers that connect the devices 1a to 1n/2a to 2m to the cloud. Any or all of the devices 1a to 1n/2a to 2m coupled to the central network controller or to the network switch can collect data in real time and transfer the data to the cloud computers for data processing and manipulation. It will be understood that cloud computing relies on sharing computing resources rather than having local servers or personal devices to handle software applications. The word "cloud" can be used as a metaphor for "the Internet", although the term is not limited as such. Consequently, the term "cloud computing" can be used herein to refer to "a type of Internet-based computing".

[0094] By applying cloud computing data processing techniques to the data collected by the devices 1a to 1n/2a to 2m, the surgical data network provides improved surgical outcomes, reduced costs, and improved patient satisfaction.
At least some of the devices 1a to 1n/2a to 2m can be used to view tissue states to assess the occurrence of leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a to 1n/2a to 2m can be used to identify pathology, such as the effects of disease, with the use of cloud-based computing to examine data, including images of samples of body tissue, for diagnostic purposes. This includes confirmation of tissue localization and margins and phenotypes. At least some of the devices 1a to 1n/2a to 2m can be used to identify anatomical structures of the body with the use of a variety of sensors integrated with imaging devices and techniques, such as the overlaying of images captured by multiple imaging devices. The data collected by the devices 1a to 1n/2a to 2m, including image data, can be transferred to the cloud 204 or to the local computer system 210, or to both, for data processing and manipulation, including image processing and manipulation. The data can be analyzed to improve surgical procedure outcomes by determining whether additional treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, and precise robotics directed to tissue-specific sites and conditions, can be pursued. Such data analysis can additionally use analytical processing of the outcomes and, with the use of standardized approaches, can provide beneficial standardized feedback, either to confirm surgical treatments and the surgeon's behavior or to suggest modifications to surgical treatments and the surgeon's behavior.

[0095] In one implementation, the operating room devices 1a to 1n can be connected to the central modular communication controller 203 via a wired channel or a wireless channel, depending on the configuration of the devices 1a to 1n for a central network controller. The central network controller 207 can be implemented, in one aspect, as a local network transmission device that acts on the physical layer of the open systems interconnection (OSI) model. The central network controller provides connectivity to the devices 1a to 1n located in the same operating room network. The central network controller 207 collects the data in the form of packets and sends them to the router in half-duplex mode. The central network controller 207 does not store any media access control/Internet protocol (MAC/IP) addresses to transfer the device data. Only one of the devices 1a to 1n at a time can send data through the central network controller 207. The central network controller 207 has no routing tables or intelligence on where to send the information, and transmits all the network data across each connection, as well as to a remote server 213 (Figure 9) in the cloud 204. The central network controller 207 can detect basic network errors, such as collisions, but having all the information broadcast to multiple ports can be a security risk and cause bottlenecks.

[0096] In another implementation, the operating room devices 2a to 2m can be connected to a network switch 209 via a wired channel or a wireless channel. The network switch 209 works on the data link layer of the OSI model. The network switch 209 is a multicast device for connecting the devices 2a to 2m located in the same operating room to the network. The network switch 209 sends data in the form of frames to the network router 211 and works in full-duplex mode. Multiple devices 2a to 2m can send data at the same time through the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a to 2m to transfer data.
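The operational contrast drawn in paragraphs [0095] and [0096], a hub that repeats every frame out of every port against a switch that learns MAC addresses and forwards selectively, can be captured in a few lines. The classes below are toy models for illustration only; real hubs and switches do this in hardware, and the attached devices are assumed to expose a hypothetical `receive` method.

```python
# Toy contrast between hub-style broadcast and switch-style MAC forwarding.
class Hub:
    """Physical-layer repeater: no MAC table; every frame is repeated out of
    every port except the one it arrived on (half-duplex in real hardware)."""
    def __init__(self):
        self.ports = {}                      # port id -> attached device

    def forward(self, in_port, frame):
        for port, device in self.ports.items():
            if port != in_port:
                device.receive(frame)        # broadcast to all other ports


class Switch:
    """Data-link-layer device: learns source MAC addresses and forwards each
    frame only to the port of its destination (full-duplex in real hardware)."""
    def __init__(self):
        self.ports = {}                      # port id -> attached device
        self.mac_table = {}                  # MAC address -> port id

    def forward(self, in_port, frame):
        self.mac_table[frame["src"]] = in_port     # learn the sender's port
        out_port = self.mac_table.get(frame["dst"])
        if out_port is not None:
            self.ports[out_port].receive(frame)    # unicast to the known port
        else:
            for port, device in self.ports.items():
                if port != in_port:                # unknown destination: flood
                    device.receive(frame)
```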
[0097] The central network controller 207 and/or the network switch 209 are coupled to the network router 211 for a connection to the cloud 204.

[0098] In one example, the central network controller 207 can be implemented as a USB central controller, which allows multiple USB devices to be connected to a host computer. The USB central controller can expand a single USB port into several levels so that there are more ports available to connect devices to the system's host computer. The central network controller 207 can include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a short-range, broadband wireless USB radio communication protocol can be used for communication between the devices 1a to 1n and the devices 2a to 2m located in the operating room.

[0099] In other examples, the operating room devices 1a to 1n/2a to 2m can communicate with the central modular communication controller 203 via standard Bluetooth wireless technology to exchange data over short distances (with the use of short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices, and to build personal area networks (PANs). In other aspects, the operating room devices 1a to 1n/2a to 2m can communicate with the central modular communication controller 203 via a number of wireless or wired communication standards or protocols, including, but not limited to, Wi-Fi

[0100] The central modular communication controller 203 can serve as a central connection for one or all of the operating room devices 1a to 1n/2a to 2m, and handles a type of data known as frames. Frames carry the data generated by the devices 1a to 1n/2a to 2m. When a frame is received by the central modular communication controller 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources with the use of a number of wireless or wired communication standards or protocols, as described herein.

[0101] The central modular communication controller 203 can be used as a standalone device or be connected to compatible central network controllers and network switches to form a larger network. The central modular communication controller 203 is generally easy to install, configure, and maintain, which makes it a good choice for networking the operating room devices 1a to 1n/2a to 2m.

[0102] Figure 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to the surgical systems 102. Each surgical system 202 includes at least one central surgical controller 206 in communication with a cloud 204 that can include a remote server 213. In one aspect, the computer-implemented interactive surgical system 200 comprises a modular control tower 236 connected to multiple operating room devices, such as intelligent surgical instruments, robots, and other computerized devices located in the operating room.

[0103] Figure 10 illustrates a central surgical controller 206 comprising a plurality of modules coupled to the modular control tower 236.
The modular control tower 236 comprises a central modular communication controller 203, e.g., a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in Figure 10, the central modular communication controller 203 can be connected in a tiered configuration to expand the number of modules (e.g., devices) that can be connected to the central modular communication controller 203 and to transfer the data associated with the modules to the computer system 210, to the cloud computing resources, or to both. As shown in Figure 10, each of the central network controllers/switches of the central modular communication controller 203 includes three downstream ports and one upstream port. The upstream central network controller/switch is connected to a processor to provide a communication connection with the cloud computing resources and a local screen 217. Communication with the cloud 204 can take place over a wired or a wireless communication channel.

[0104] The central surgical controller 206 uses a non-contact sensor module 242 to measure the dimensions of the operating room and to generate a map of the operating room with the use of non-contact laser or ultrasonic measurement devices. An ultrasound-based non-contact sensor module scans the operating room by transmitting a burst of ultrasound and receiving the echo as it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in US provisional patent application serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, which is hereby incorporated by reference in its entirety, in which the sensor module is configured to determine the size of the operating room and to adjust the Bluetooth pairing distance limits. A laser-based non-contact sensor module scans the operating room by transmitting pulses of laser light, receiving pulses of laser light that bounce off the perimeter walls of the operating room, and comparing the phase of the transmitted pulse with that of the received pulse to determine the size of the operating room and to adjust the Bluetooth pairing distance limits, for example.

[0105] The computer system 210 comprises a processor 244 and a network interface 245. The processor 244 is coupled to a communication module 247, a storage 248, a memory 249, a non-volatile memory 250, and an input/output interface 251 via a system bus. The system bus can be any of several types of bus structures, including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus, with the use of any variety of available bus architectures, including, but not limited to, a 9-bit bus, Industry Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Accelerated Graphics Port (AGP), and the PCMCIA (Personal Computer Memory Card International Association) bus.

[0106] The processor 244 can be any single-core or multi-core processor, such as those known under the trade name ARM Cortex, available from Texas Instruments.
In one aspect, the processor can be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises an integrated 256 KB single-cycle flash memory, or other non-volatile memory, of up to 40 MHz, a prefetch buffer to optimize performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, an electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analog modules, and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, the details of which are available in the product data sheet.

[0107] In one aspect, the processor 244 can comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also from Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.

[0108] The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during startup, is stored in the non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. The volatile memory includes random access memory (RAM), which acts as external cache memory. In addition, RAM is available in many forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).

[0109] The computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, e.g., disk storage. The disk storage includes, but is not limited to, devices such as a magnetic disk drive, a floppy disk drive, a tape drive, a Jaz drive, a Zip drive, an LS-60 drive, a flash memory card, or a memory stick. In addition, the disk storage can include storage media separately or in combination with other storage media, including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), a recordable compact disc drive (CD-R drive), a rewritable compact disc drive (CD-RW drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of the disk storage devices to the system bus, a removable or non-removable interface can be used.

[0110] It is to be understood that the computer system 210 includes software that acts as an intermediary between the users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored in the disk storage,

[0111] A user enters commands or information into the computer system 210 through the input device(s) coupled to the I/O interface 251. The input devices include, but are not limited to, a pointing device such as a mouse, a trackball, a stylus, a touchpad, a keyboard, a microphone, a joystick, a game pad, a satellite dish, a scanner, a TV tuner card, a digital camera, a digital video camera, a web camera, and the like.
These and other input devices connect to the processor via the system bus through the interface port(s). Interface ports include, for example, a serial port, a parallel port, a game port, and a USB port. Output devices use some of the same types of ports as input devices. In this way, for example, a USB port can be used to provide input to the computer system and to provide information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices, such as monitors, displays, speakers, and printers, among other output devices, that need special adapters. Output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connecting the output device to the system bus. It should be noted that other devices and/or device systems, such as remote computers, provide input and output capabilities. [0112] [0112] Computer system 210 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computers, or local computers. Remote cloud computers can be a personal computer, server, router, networked personal computer, workstation, microprocessor-based device, peer device, or other common network node, and the like, and typically include many or all of the elements described in relation to the computer system. For the sake of brevity, only one memory storage device is illustrated with the remote computer. Remote computers are logically connected to the computer system through a network interface and then physically connected through a communication link. The network interface covers communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. [0113] [0113] In various aspects, the computer system 210 of Figure 10, the imaging module 238 and/or display system 208, and/or the processor module 232 of Figures 9 to 10, may comprise an image processor, an image processing engine, a media processor, or any specialized digital signal processor (DSP) used for processing digital images. The image processor can employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image processing engine can perform a number of tasks. The image processor can be a system on an integrated circuit with a multi-core processor architecture. [0114] [0114] Communication connections refer to the hardware/software used to connect the network interface to the bus. Although the communication connection is shown for illustrative clarity within the computer system, it may also be external to the computer system 210. The hardware/software required for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular serial telephone modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards. [0115] [0115] Figure 11 illustrates a functional block diagram of an aspect of a USB network central controller device 300, in accordance with at least one aspect of the present disclosure. In the illustrated aspect, the USB network central controller device 300 employs a TUSB2036 integrated circuit central controller available from Texas Instruments.
The USB network central controller 300 is a CMOS device that provides one USB upstream transceiver port 302 and up to three USB downstream transceiver ports 304, 306, 308 in compliance with the USB 2.0 specification. The upstream USB transceiver port 302 is a differential data root port comprising a "minus" differential data input (DM0) paired with a "plus" differential data input (DP0). The three downstream USB transceiver ports 304, 306, 308 are differential data ports, with each port including "plus" differential data outputs (DP1-DP3) paired with "minus" differential data outputs (DM1-DM3). [0116] [0116] The USB network central controller device 300 is printed [0117] [0117] The USB network central controller device 300 includes a serial interface engine (SIE) 310. The SIE 310 is the hardware front end of the USB network central controller 300 and handles most of the protocol described in Chapter 8 of the USB specification. The SIE 310 typically comprehends signaling up to the transaction level. The functions it handles could include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero inverted (NRZI) data encoding/decoding, CRC generation and verification (token and data), packet ID (PID) generation and verification/decoding, and/or serial-parallel/parallel-serial conversion (an illustrative sketch of the NRZI function is provided below). The SIE 310 receives a clock input 314 and is coupled to a suspend/resume and frame timer logic circuit 316 and a central controller repeater circuit 318 to control communication between the USB upstream transceiver port 302 and the USB downstream transceiver ports 304, 306, 308 via the port logic circuits 320, 322, 324. [0118] [0118] In various aspects, the USB network central controller 300 can connect 127 functions configured in up to six logical layers (levels) to a single computer. In addition, the USB network central controller 300 can connect all peripherals using a standard four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The USB network central controller 300 can be configured to support four power management modes: a bus-powered central controller, with single-port power management or grouped-port power management, and a self-powered central controller, with single-port power management or grouped-port power management. In one aspect, using a USB cable, the USB upstream transceiver port 302 of the USB network central controller 300 is connected to a USB host controller, and the USB downstream transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices, and so on. [0119] [0119] Additional details regarding the structure and function of the central surgical controller and/or networks of central surgical controllers can be found in US Provisional Patent Application No. 62/659,900, entitled METHOD OF HUB COMMUNICATION, filed on April 19, 2018, which is incorporated herein by reference in its entirety.
Cloud system hardware and functional modules
[0120] [0120] Figure 12 is a block diagram of the computer-implemented interactive surgical system, in accordance with at least one aspect of the present description. In one aspect, the computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems that include central surgical controllers, surgical instruments, robotic devices, and operating rooms or healthcare facilities.
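By way of illustration only, and not as a description of the TUSB2036 implementation, the following minimal Python sketch shows the non-return-to-zero inverted (NRZI) encoding and decoding named among the SIE 310 functions above: on the USB wire, a 0 bit produces a level transition and a 1 bit holds the level. USB bit stuffing is omitted for brevity.

```python
def nrzi_encode(bits, idle_level=1):
    """NRZI as used by USB: a 0 bit toggles the line level, a 1 bit holds it."""
    level, levels = idle_level, []
    for b in bits:
        if b == 0:
            level ^= 1  # a transition encodes a zero
        levels.append(level)
    return levels

def nrzi_decode(levels, idle_level=1):
    """Recover the bits by comparing each line level with the previous one."""
    prev, bits = idle_level, []
    for lv in levels:
        bits.append(1 if lv == prev else 0)
        prev = lv
    return bits

data = [1, 0, 1, 1, 0, 0, 0, 1]
assert nrzi_decode(nrzi_encode(data)) == data  # round trip recovers the data
```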
[0121] [0121] In addition, the 7012 surgical instruments may comprise transceivers for transmitting data to and from their corresponding 7006 central surgical controllers (which may also comprise transceivers). Combinations of 7012 surgical instruments and corresponding 7006 central controllers can be assigned to specific locations, such as operating rooms in healthcare facilities (e.g., hospitals), to perform medical operations. For example, the memory of a 7006 central surgical controller can store location data. As shown in Figure 12, the 7004 cloud comprises 7013 central servers (which may be the same as or similar to the remote server 113 in Figure 1 and/or the remote server 213 in Figure 9), 7002 application servers for central controllers, 7034 data analysis modules, and a 7007 input/output ("I/O") interface. The 7013 central servers of the 7004 cloud collectively administer the cloud computing system, which includes monitoring requests from 7006 client central controllers and managing the processing capacity of the 7004 cloud to execute the requests. Each of the 7013 central servers comprises one or more 7008 processors coupled to suitable 7010 memory devices, which may include volatile memory, such as random access memory (RAM), and non-volatile memory, such as magnetic storage devices. The 7010 memory devices may comprise machine-executable instructions that, when executed, cause the 7008 processors to execute the 7034 data analysis modules for the cloud-based data analysis, operations, recommendations, and other operations described below. In addition, the 7008 processors can run the 7034 data analysis modules independently or in conjunction with central controller applications independently run by the 7006 central controllers. The 7013 central servers also comprise the 7011 aggregated medical databases, which can reside in the 7010 memory. [0122] [0122] Based on connections to multiple 7006 central surgical controllers via the 7001 network, the 7004 cloud can aggregate the specific data generated by various 7012 surgical instruments and their corresponding 7006 central controllers. Such aggregated data may be stored in the 7011 aggregated medical data databases of the 7004 cloud. In particular, the 7004 cloud may advantageously perform data analysis and operations on the aggregated data to produce insights and/or perform functions that individual 7006 central controllers could not achieve on their own. For this purpose, as shown in Figure 12, the 7004 cloud and the 7006 central surgical controllers are communicatively coupled to transmit and receive information. The 7007 I/O interface is connected to the plurality of 7006 central surgical controllers through the 7001 network. In this way, the 7007 I/O interface can be configured to transfer information between the 7006 central surgical controllers and the 7011 aggregated medical databases. Consequently, the 7007 I/O interface can facilitate the read/write operations of the cloud-based data analysis system. Such read/write operations can be performed in response to requests from the 7006 central controllers. These requests can be transmitted to the 7006 central controllers through the central controller applications. The 7007 I/O interface can include one or more high-speed data ports, which can include universal serial bus (USB) ports, IEEE 1394 ports, as well as Wi-Fi and Bluetooth I/O interfaces for connecting the 7004 cloud to the 7006 central controllers.
The 7002 application servers for central controllers of the 7004 cloud are configured to host and provide shared capabilities to the software applications (e.g., central controller applications) run by the 7006 central surgical controllers. For example, the 7002 application servers for central controllers can manage requests made by the central controller applications through the 7006 central controllers, control access to the 7011 aggregated medical databases, and perform load balancing. The 7034 data analysis modules are described in more detail with reference to Figure 13. [0123] [0123] The specific cloud computing system configuration described in this description is specifically designed to address various issues raised in the context of medical operations and procedures performed using medical devices, such as the 7012, 112 surgical instruments. In particular, the 7012 surgical instruments can be digital surgical devices configured to interact with the 7004 cloud to implement techniques to improve the performance of surgical operations. Multiple 7012 surgical instruments and/or 7006 central surgical controllers can comprise touch-controlled user interfaces so that clinicians can control aspects of the interaction between the 7012 surgical instruments and the 7004 cloud. [0124] [0124] Figure 13 is a block diagram illustrating the functional architecture of the computer-implemented interactive surgical system, in accordance with at least one aspect of the present description. The cloud-based data analysis system includes a plurality of 7034 data analysis modules that can be executed by the 7008 processors of the 7004 cloud. [0125] [0125] For example, the 7022 data collection and aggregation module could be used to generate self-describing data (e.g., metadata), including identifying notable features or configurations (e.g., trends), managing redundant data sets, and storing the data in paired datasets that can be grouped by surgery, but not necessarily keyed to actual surgical dates and surgeons. In particular, paired datasets generated from operations of the 7012 surgical instruments may comprise the application of a binary classification, for example, a bleeding or non-bleeding event. More generally, the binary classification can be characterized as a desirable event (e.g., a successful surgical procedure) or an undesirable event (e.g., a failed or misused surgical instrument 7012). Aggregated self-describing data can correspond to individual data received from various groups or subgroups of 7006 central surgical controllers. Consequently, the 7022 data collection and aggregation module can manage aggregated metadata or other organized data based on the raw data received from the 7006 central surgical controllers. For this purpose, the 7008 processors can be operationally coupled to the 7014 central controller applications and the 7011 aggregated medical databases to run the 7034 data analysis modules. The 7022 data collection and aggregation module can store the organized aggregated data in the 7011 aggregated medical databases. [0126] [0126] The 7020 resource optimization module can be configured to analyze this aggregated data to determine an optimal use of resources for a specific facility or group of healthcare facilities. For example, the 7020 resource optimization module can determine an optimal ordering point for 7012 surgical stapling instruments for a group of healthcare facilities based on the corresponding predicted demand for such 7012 instruments.
The 7020 resource optimization module could also assess the resource usage or other operational configurations of various healthcare facilities to determine whether the resource usage could be improved. Similarly, the 7030 recommendations module can be configured to analyze the aggregated data organized by the 7022 data collection and aggregation module to provide recommendations. For example, the 7030 recommendations module could recommend to healthcare facilities (e.g., medical providers such as hospitals) that a specific 7012 surgical instrument should be upgraded to an improved version, based on a higher-than-expected error rate, for example. Additionally, the 7030 recommendations module and/or the 7020 resource optimization module could recommend better supply chain parameters, such as product restock points, and provide suggestions for different uses of the 7012 surgical instruments, or procedural steps, to improve surgical outcomes. Healthcare facilities can receive such recommendations through their corresponding 7006 central surgical controllers. More specific recommendations on parameters or settings of the various 7012 surgical instruments can also be provided. The 7006 central controllers and/or the 7012 surgical instruments can each also have display screens that display data or recommendations provided by the 7004 cloud. [0127] [0127] The 7028 patient outcome analysis module can analyze surgical outcomes associated with the currently used operating parameters of the 7012 surgical instruments. The 7028 patient outcome analysis module can also analyze and evaluate other potential operating parameters. In this context, the 7030 recommendations module could recommend the use of these other potential operating parameters based on achieving better surgical outcomes, such as better sealing or less bleeding. For example, the 7030 recommendations module could transmit recommendations to a 7006 central surgical controller on when to use a particular cartridge for a corresponding 7012 surgical stapling instrument. In this way, the cloud-based data analysis system, while controlling for common variables, can be configured to analyze the large collection of raw data and provide centralized recommendations to multiple healthcare facilities (advantageously determined based on aggregated data). For example, the cloud-based data analytics system could analyze, evaluate, and/or aggregate data based on the type of medical practice, the type of patient, the number of patients, the geographic similarity between medical providers, which medical providers/facilities use similar types of instruments, etc., in a way that no single healthcare facility would be able to analyze independently. [0128] [0128] The 7026 control program update module can be configured to implement recommendations for the various 7012 surgical instruments when the corresponding control programs are updated. For example, the 7028 patient outcome analysis module could identify correlations that link specific control parameters with successful (or unsuccessful) outcomes. These correlations can be addressed when updated control programs are transmitted to the 7012 surgical instruments through the 7026 control program update module. Updates to the 7012 instruments that are transmitted through a corresponding 7006 central controller can incorporate performance metric data that were gathered and analyzed by the 7022 data collection and aggregation module of the 7004 cloud.
Additionally, the 7028 patient outcome analysis module and the 7030 recommendations module could identify better methods of using the 7012 instruments based on the aggregated performance data. [0129] [0129] The cloud-based data analytics system can include security features implemented by the 7004 cloud. These security features can be managed by the 7024 authorization and security module. Each 7006 central surgical controller can have unique credentials comprising a username, a password, and other appropriate security credentials. These credentials could be stored in the 7010 memory and associated with a permitted cloud access level. For example, based on providing accurate credentials, a 7006 central surgical controller may be granted access to communicate with the cloud to a predetermined extent (e.g., only certain defined types of information may be transmitted or received). For this purpose, the 7011 aggregated medical databases of the 7004 cloud may comprise a database of authorized credentials to verify the accuracy of the credentials provided. Different credentials can be associated with varying levels of permission to interact with the 7004 cloud, such as a predetermined access level for receiving the data analyses generated by the cloud. [0130] [0130] Additionally, for security purposes, the cloud could maintain a database of 7006 central controllers, 7012 instruments, and other devices that may comprise a "blacklist" of prohibited devices. In particular, a blacklisted 7006 central surgical controller may not be allowed to interact with the cloud, while blacklisted 7012 surgical instruments may not have functional access to a corresponding 7006 central controller and/or may be prevented from fully functioning when paired with their corresponding 7006 central controller. Additionally or alternatively, the 7004 cloud can flag 7012 instruments based on incompatibility or other specified criteria. In this way, counterfeit medical devices and the inappropriate reuse of such devices throughout the cloud-based data analytics system can be identified and addressed. [0131] [0131] The 7012 surgical instruments can use wireless transceivers to transmit wireless signals that can represent, for example, authorization credentials for access to the corresponding 7006 central controllers and the 7004 cloud. Wired transceivers can also be used to transmit signals. These authorization credentials can be stored in the respective memory devices of the 7012 surgical instruments. The 7024 authorization and security module can determine whether the authorization credentials are accurate or counterfeit. The 7024 authorization and security module can also dynamically generate authorization credentials for increased security. The credentials could also be encrypted, such as by using hash-based encryption. After transmitting the proper authorization, the 7012 surgical instruments can transmit a signal to the corresponding 7006 central controllers, and ultimately to the 7004 cloud, to indicate that the 7012 instruments are ready to obtain and transmit medical data. In response, the 7004 cloud may transition to an enabled state to receive medical data for storage in the 7011 aggregated medical databases. This availability for data transmission could be indicated, for example, by an indicator light on the 7012 instruments. The 7004 cloud can also transmit signals to the 7012 surgical instruments to update their associated control programs.
The 7004 cloud can transmit signals that are directed to a specific class of 7012 surgical instruments (e.g., electrosurgical instruments), so that software updates for control programs are transmitted only to the appropriate 7012 surgical instruments. Additionally, the 7004 cloud could be used to implement system-wide solutions to address local or global issues based on selective data transmission and authorization credentials. For example, if a group of 7012 surgical instruments is identified as having a common manufacturing defect, the 7004 cloud can change the authorization credentials that correspond to that group to implement an operational lockout of the group. [0132] [0132] The cloud-based data analytics system can enable the monitoring of multiple healthcare facilities (e.g., medical facilities such as hospitals) to determine improved practices and recommend changes accordingly (through the 7030 recommendations module, for example). In this way, the 7008 processors of the 7004 cloud can analyze the data associated with an individual facility to identify the facility and aggregate its data with other data associated with other facilities in a group. Groups could be defined based on similar operational practices or geographic location, for example. In this way, the 7004 cloud can provide analysis and recommendations for the entire group of facilities. The cloud-based data analysis system could also be used to increase situational awareness. For example, the 7008 processors can predictively model the effects of recommendations on cost and effectiveness for a specific facility (in relation to general operations and/or various medical procedures). The cost and effectiveness associated with that particular facility can also be compared to a corresponding local region of other facilities or to any other comparable facilities. [0133] [0133] The 7032 data classification and prioritization module can prioritize and classify data based on criticality (e.g., the severity of a medical event associated with the data, unexpectedness, suspiciousness). This classification and prioritization can be used in conjunction with the functions of the other 7034 data analysis modules described above to improve the cloud-based data analysis and operations described herein. For example, the 7032 data classification and prioritization module can assign a priority to the data analyses performed by the 7022 data collection and aggregation module and the 7028 patient outcome analysis module. Different prioritization levels can result in specific responses from the 7004 cloud (corresponding to a level of urgency), such as escalation to an expedited response, special processing, exclusion from the 7011 aggregated medical databases, or other appropriate responses. Furthermore, if necessary, the 7004 cloud can transmit a request (e.g., a push message) through the application servers for central controllers for additional data from the corresponding 7012 surgical instruments. The push message may result in a notification displayed on the corresponding 7006 central controllers to request support or additional data. This push message may be necessary in situations where the cloud detects an irregularity or results outside significant limits and cannot determine the cause of the irregularity. The 7013 central servers can be programmed to activate this push message in certain significant circumstances, such as when data is determined to be different from an expected value beyond a predetermined threshold, or when it appears that security has been compromised, for example.
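By way of illustration only, the following Python sketch suggests how a data classification and prioritization module of the kind described above might rank data by criticality and trigger a push message when a value deviates from its expected value beyond a predetermined threshold. The function names, severity scores, and thresholds are hypothetical assumptions, not values taken from the present description.

```python
from enum import IntEnum

class Priority(IntEnum):
    ROUTINE = 0
    SPECIAL_PROCESSING = 1
    EXPEDITED = 2  # escalation to an expedited response

# Hypothetical severity scores per medical event type (0.0 - 1.0).
EVENT_SEVERITY = {"bleeding": 0.9, "misfire": 0.6, "telemetry_gap": 0.2}

def classify_priority(event_type: str, suspiciousness: float) -> Priority:
    """Rank incoming data by severity and suspiciousness, in the spirit of
    the data classification and prioritization module."""
    score = EVENT_SEVERITY.get(event_type, 0.1) + suspiciousness
    if score >= 1.0:
        return Priority.EXPEDITED
    if score >= 0.5:
        return Priority.SPECIAL_PROCESSING
    return Priority.ROUTINE

def needs_push_request(value: float, expected: float, threshold: float) -> bool:
    """True when a datum differs from its expected value beyond a
    predetermined threshold, triggering a push message to the central
    controller requesting additional data."""
    return abs(value - expected) > threshold

# Example: a reading far from its expected value triggers a push message.
if needs_push_request(value=38.0, expected=20.0, threshold=10.0):
    print(classify_priority("misfire", suspiciousness=0.3).name)  # SPECIAL_PROCESSING
```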
[0134] [0134] Additional details related to the cloud data analysis system can be found in US Provisional Patent Application No. 62/659,900, entitled METHOD OF HUB COMMUNICATION, filed on April 19, 2018, which is incorporated herein by reference in its entirety. [0135] [0135] While a "smart" device, including control algorithms responsive to detected data, may be an improvement over a "dumb" device that operates without accounting for detected data, some detected data may be incomplete or inconclusive when considered in isolation, that is, without the context of the type of surgical procedure being performed or the type of tissue undergoing the surgery. [0136] [0136] One solution utilizes a central surgical controller including a system configured to derive information about the surgical procedure being performed based on data received from various data sources, and then to control the paired modular devices accordingly. In other words, the central surgical controller is configured to infer information about the surgical procedure from the data received and then control the modular devices paired with the central surgical controller based on the inferred context of the surgical procedure. Figure 14 illustrates a diagram of a situational awareness surgical system 5100, in accordance with at least one aspect of the present disclosure. In some examples, the data sources 5126 include, for example, the modular devices 5102 (which may include sensors configured to detect parameters associated with the patient and/or the modular device itself), databases 5122 (e.g., an EMR database containing the patient's chart), and patient monitoring devices 5124 (e.g., a blood pressure (BP) monitor and an electrocardiography (ECG) monitor). [0137] [0137] A central surgical controller 5104, which can be similar to the central surgical controller 106 in many respects, can be configured to derive contextual information related to the surgical procedure from the data based on, for example, the combination(s) of data received or the specific order in which the data are received from the data sources 5126. The contextual information inferred from the received data may include, for example, the type of surgical procedure being performed, the specific step of the surgical procedure that the surgeon is performing, the type of tissue being operated on, or the body cavity that is the object of the procedure. This ability of some aspects of the central surgical controller 5104 to derive or infer information related to the surgical procedure from received data can be called "situational awareness". In one example, the central surgical controller 5104 may incorporate a situational awareness system, which is the hardware and/or programming associated with the central surgical controller 5104 that derives contextual information related to the surgical procedure based on the data received. [0138] [0138] The situational awareness system of the central surgical controller 5104 can be configured to derive contextual information from the data received from the data sources 5126 in various ways. In one example, the situational awareness system includes a pattern recognition system, or machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from the databases 5122, the patient monitoring devices 5124, and/or the modular devices 5102) to corresponding contextual information pertaining to a surgical procedure.
In other words, the machine learning system can be trained to accurately derive contextual information regarding a surgical procedure from the inputs provided. In another example, the situational awareness system may include a lookup table that stores pre-characterized contextual information referring to a surgical procedure in association with one or more entries (or ranges of entries) corresponding to the contextual information. In response to a query with one or more entries, the lookup table may return the corresponding contextual information to the situational awareness system for controlling the modular devices 5102. In one exemplification, the contextual information received by the situational awareness system of the central surgical controller 5104 is associated with a specific control setting or set of control settings for one or more modular devices 5102. In another example, the situational awareness system includes an additional machine learning system, lookup table, or other such system, which generates or retrieves one or more control settings for one or more modular devices 5102 when provided with the contextual information as input. [0139] [0139] A central surgical controller 5104 that incorporates a situational awareness system provides several benefits to the surgical system 5100. One benefit includes improving the interpretation of detected and captured data, which in turn improves the accuracy of the processing and/or use of the data during the course of a surgical procedure. To return to an earlier example, a central surgical controller 5104 endowed with situational awareness could determine what type of tissue was being operated on; therefore, when an unexpectedly high force to close the surgical instrument's end effector is detected, the situationally aware central surgical controller 5104 could correctly accelerate or decelerate the surgical instrument's motor for the tissue type. [0140] [0140] As another example, the type of tissue being operated on can affect the adjustments that are made to the load thresholds and compression rate of a surgical stapling and cutting instrument for a specific tissue gap measurement. A situationally aware central surgical controller 5104 could infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the central surgical controller 5104 to determine whether the tissue clamped by the end effector of the surgical stapling and cutting instrument is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The central surgical controller 5104 can then appropriately adjust the surgical stapling and cutting instrument's compression rate and load thresholds for the tissue type. [0141] [0141] As yet another example, the type of body cavity being operated on during an insufflation procedure can affect the function of a smoke evacuator. A situationally aware central surgical controller 5104 can determine whether the surgical site is under pressure (by determining that the surgical procedure is using insufflation) and determine the type of procedure. As one type of procedure is generally performed in a specific body cavity, the central surgical controller 5104 can then properly control the smoke evacuator's motor speed for the body cavity being operated on. In this way, a situationally aware central surgical controller 5104 can provide a consistent amount of smoke evacuation for both thoracic and abdominal procedures.
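By way of illustration only, the lookup-table variant of the situational awareness system described above can be sketched in Python as follows. The table contents (procedure names, compression rates, load thresholds, and motor speeds) are invented placeholders rather than values from the present description.

```python
# Hypothetical pre-characterized contextual information: each entry maps an
# inferred surgical context to control settings for paired modular devices.
CONTROL_SETTINGS = {
    # (procedure type, tissue type): settings for the stapling/cutting instrument
    ("thoracic", "lung"):     {"compression_rate": "slow", "load_threshold_n": 60},
    ("abdominal", "stomach"): {"compression_rate": "normal", "load_threshold_n": 90},
}

SMOKE_EVACUATOR_SPEED = {
    # body cavity under insufflation -> smoke evacuator motor speed setting
    "thoracic": "high",
    "abdominal": "medium",
}

def settings_for_context(procedure: str, tissue: str) -> dict:
    """Return the control settings associated with the inferred context, as a
    lookup table of pre-characterized contextual information would."""
    try:
        return CONTROL_SETTINGS[(procedure, tissue)]
    except KeyError:
        # No pre-characterized entry: fall back to conservative defaults.
        return {"compression_rate": "slow", "load_threshold_n": 50}

# Example: an inferred thoracic procedure implies lung rather than stomach
# tissue, so the controller adjusts the instrument accordingly.
print(settings_for_context("thoracic", "lung"))
print(SMOKE_EVACUATOR_SPEED["thoracic"])
```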
[0142] [0142] As yet another example, the type of procedure being performed can affect the optimal energy level at which an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument operates. Arthroscopic procedures, for example, require higher energy levels because the end effector of the ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. A situationally aware central surgical controller 5104 can determine whether the surgical procedure is an arthroscopic procedure. The central surgical controller 5104 can then adjust the generator's RF power level or ultrasonic amplitude (i.e., the "power level") to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on can affect the optimal energy level at which an ultrasonic surgical instrument or RF electrosurgical instrument operates. A situationally aware central surgical controller 5104 can determine what type of surgical procedure is being performed and then customize the power level for the ultrasonic surgical instrument or RF electrosurgical instrument, respectively, according to the expected tissue profile for the surgical procedure. [0143] [0143] As yet another example, data can be extracted from additional data sources 5126 to improve the conclusions that the central surgical controller 5104 draws from a given data source 5126. A situationally aware central surgical controller 5104 can automatically supplement the data it receives from the modular devices 5102 with contextual information it has accumulated regarding the surgical procedure from other data sources 5126. For example, a situationally aware central surgical controller 5104 can be configured to determine whether hemostasis has occurred (i.e., whether bleeding has stopped at a surgical site) according to video or image data received from a medical imaging device. However, in some cases, the video or image data may be inconclusive. Therefore, in one example, the central surgical controller 5104 can be further configured to compare a physiological measurement (e.g., blood pressure detected by a BP monitor communicatively connected to the central surgical controller 5104) with the visual or imaging data of hemostasis (e.g., from a medical imaging device 124 (Figure 2) communicably coupled to the central surgical controller 5104) to make a determination on the integrity of the staple line or tissue weld. In other words, the situational awareness system of the central surgical controller 5104 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own. [0144] [0144] Another benefit includes proactively and automatically controlling the paired modular devices 5102 according to the specific step of the surgical procedure being performed, to reduce the number of times that medical personnel are required to interact with or control the surgical system 5100 during the course of a surgical procedure. For example, a situationally aware central surgical controller 5104 can proactively activate the generator to which an RF electrosurgical instrument is connected if it is determined that a subsequent step in the procedure requires the use of the instrument. Proactively activating the power source allows the instrument to be ready for use once the preceding step in the procedure is complete.
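By way of illustration only, the proactive generator activation described above can be sketched in Python as follows, assuming a hypothetical ordered procedure plan and a generator object with an activate() method, neither of which is specified in the present description.

```python
# Hypothetical ordered procedure plan: each step names the devices it requires.
PROCEDURE_PLAN = [
    {"step": "dissection",      "requires": {"rf_generator"}},
    {"step": "ligation",        "requires": {"stapler"}},
    {"step": "node_dissection", "requires": {"rf_generator"}},
]

class Generator:
    """Stand-in for a paired energy generator."""
    def __init__(self):
        self.active = False

    def activate(self):
        self.active = True
        print("Generator energized and ready.")

def on_step_completed(current_index: int, rf_generator: Generator) -> None:
    """When a step completes, pre-activate the generator if the next step in
    the plan requires the RF electrosurgical instrument."""
    nxt = current_index + 1
    if nxt < len(PROCEDURE_PLAN) and "rf_generator" in PROCEDURE_PLAN[nxt]["requires"]:
        if not rf_generator.active:
            rf_generator.activate()  # ready before the surgeon asks for it

# Example: completing ligation (index 1) pre-activates the generator for the
# node dissection step that follows.
gen = Generator()
on_step_completed(1, gen)
```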
[0145] [0145] As another example, a situationally aware central surgical controller 5104 could determine whether the current or a subsequent step of the surgical procedure requires a different view or degree of magnification on the display, according to the feature(s) at the surgical site that the surgeon is expected to need to view. The central surgical controller 5104 could then proactively change the displayed view (provided, for example, by a medical imaging device for the visualization system 108) so that the display automatically adjusts throughout the surgical procedure. [0146] [0146] As yet another example, a situationally aware central surgical controller 5104 could determine which step of the surgical procedure is being performed or will be performed subsequently and whether specific data or comparisons between data will be required for that step of the surgical procedure. The central surgical controller 5104 can be configured to automatically call up data screens based on the step of the surgical procedure being performed, without waiting for the surgeon to request the specific information. [0147] [0147] Another benefit includes error checking during the setup of the surgical procedure or during the course of the surgical procedure. For example, a situationally aware central surgical controller 5104 could determine whether the operating room is properly or ideally configured for the surgical procedure to be performed. The central surgical controller 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding checklists, product locations, or configuration needs (e.g., from a memory), and then compare the current operating room layout with the standard layout for the type of surgical procedure that the central surgical controller 5104 determines is being performed. In one example, the central surgical controller 5104 can be configured to compare the list of items for the procedure, scanned by a suitable scanner 5132, for example, and/or a list of devices paired with the central surgical controller 5104, with a recommended or anticipated manifest of items and/or devices for the given surgical procedure. If there are any discrepancies between the lists, the central surgical controller 5104 can be configured to provide an alert indicating that a specific modular device 5102, patient monitoring device 5124, and/or other surgical item is missing. [0148] [0148] As another example, the situationally aware central surgical controller 5104 could determine whether the surgeon (or other medical personnel) was making a mistake or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the central surgical controller 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding list of steps or order of equipment use (e.g., from a memory), and then compare the steps being performed or the equipment being used during the course of the surgical procedure with the steps or equipment expected for the type of surgical procedure that the central surgical controller 5104 has determined is being performed. In one example, the central surgical controller 5104 can be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being used at a specific step in the surgical procedure.
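By way of illustration only, the list comparison described above amounts to a set difference between the scanned and paired items and a recommended manifest; the item names and function name in the following Python sketch are hypothetical.

```python
def check_room_setup(scanned_items: set, paired_devices: set, manifest: set) -> list:
    """Compare the items scanned for the procedure, plus the devices paired
    with the central controller, against the recommended manifest, returning
    alerts for anything missing."""
    present = scanned_items | paired_devices
    missing = manifest - present
    return [f"ALERT: expected item '{item}' is missing" for item in sorted(missing)]

# Example for a hypothetical thoracic procedure manifest.
manifest = {"stapler", "rf_generator", "smoke_evacuator", "insufflator"}
alerts = check_room_setup(
    scanned_items={"stapler", "insufflator"},
    paired_devices={"smoke_evacuator"},
    manifest=manifest,
)
for alert in alerts:
    print(alert)  # ALERT: expected item 'rf_generator' is missing
```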
[0149] [0149] Overall, the situational awareness system of the central surgical controller 5104 improves surgical procedure outcomes by adjusting the surgical instruments (and other modular devices). [0150] [0150] Referring now to Figure 15, a timeline 5200 is shown representing the situational awareness of a central controller, such as the central surgical controller 106 or 206 (Figures 1 to 11), for example. The timeline 5200 represents an illustrative surgical procedure and the contextual information that the central surgical controller 106, 206 can derive from the data received from the data sources at each step in the surgical procedure. The timeline 5200 represents the typical steps that would be taken by the nurses, surgeons, and other medical personnel during the course of a lung segmentectomy procedure, starting with the operating room setup and ending with the transfer of the patient to a postoperative recovery room. [0151] [0151] The situationally aware central surgical controller 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel use a modular device that is paired with the central surgical controller 106, 206. The central surgical controller 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data are received, such as which step of the procedure is being performed at any given time. The situational awareness system of the central surgical controller 106, 206 is capable of, for example, recording data concerning the procedure to generate reports, verifying the steps being taken by the medical personnel, providing data or warnings (for example, through a display screen) that may be pertinent to the specific step of the procedure, adjusting modular devices based on the context (e.g., activating monitors, adjusting the field of view (FOV) of the medical imaging device, or changing the power level of an ultrasonic surgical instrument or RF electrosurgical instrument), and taking any other action described above. [0152] [0152] In the first step 5202 of this illustrative procedure, the hospital staff members retrieve the patient's electronic medical record (EMR) from the hospital's EMR database. Based on the select patient data in the EMR, the central surgical controller 106, 206 determines that the procedure to be performed is a thoracic procedure. [0153] [0153] In the second step 5204, the team members scan the incoming medical supplies for the procedure. The central surgical controller 106, 206 cross-references the scanned supplies with a list of supplies that are used in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. Additionally, the central surgical controller 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies lack certain supplies that are needed for a thoracic wedge procedure, or otherwise do not correspond to a thoracic wedge procedure). [0154] [0154] In the third step 5206, the medical personnel scan the patient's wristband with a scanner that is communicably connected to the central surgical controller 106, 206. The central surgical controller 106, 206 can then confirm the patient's identity based on the scanned data. [0155] [0155] In the fourth step 5208, the medical personnel turn on the auxiliary equipment.
The auxiliary equipment being used may vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke evacuator, an insufflator, and a medical imaging device. When activated, the auxiliary equipment that are modular devices can automatically pair with the central surgical controller 106, 206 that is located within a particular vicinity of the modular devices, as part of their startup process. The central surgical controller 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this preoperative or startup phase. In this particular example, the central surgical controller 106, 206 determines that the surgical procedure is a VATS (video-assisted thoracic surgery) procedure based on this specific combination of paired modular devices. Based on the combination of the electronic medical record (EMR) data, the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the central controller, the central surgical controller 106, 206 can, in general, infer the specific procedure that the surgical team will perform. After the central surgical controller 106, 206 recognizes which specific procedure is being performed, the central surgical controller 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices). [0156] [0156] In the fifth step 5210, the team members attach the electrocardiogram (ECG) electrodes and other patient monitoring devices to the patient. The ECG electrodes and other patient monitoring devices are capable of pairing with the central surgical controller 106, 206. As the central surgical controller 106, 206 begins to receive data from the patient monitoring devices, the central surgical controller 106, 206 thereby confirms that the patient is in the operating room. [0157] [0157] In the sixth step 5212, the medical personnel induce anesthesia in the patient. The central surgical controller 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including ECG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step 5212, the preoperative portion of the lung segmentectomy procedure is completed and the operative portion begins. [0158] [0158] In the seventh step 5214, the lung of the patient being operated on is collapsed (while ventilation is switched to the contralateral lung). The central surgical controller 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The central surgical controller 106, 206 can infer that the operative portion of the procedure has begun, as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this specific procedure. [0159] [0159] In the eighth step 5216, the medical imaging device (e.g., an endoscopic probe) is inserted and video from the medical imaging device is initiated. [0160] [0160] In the ninth step 5218 of the procedure, the surgical team begins the dissection step.
The central surgical controller 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The central surgical controller 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at that point in the process (i.e., after completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain cases, the energy instrument may be an energy tool mounted on a robotic arm of a robotic surgical system. [0161] [0161] In the tenth step 5220 of the procedure, the surgical team proceeds to the ligation step. The central surgical controller 106, 206 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. Similar to the previous step, the central surgical controller 106, 206 can derive this inference by cross-referencing the data received from the surgical stapling and cutting instrument with the retrieved steps of the procedure. In certain cases, the surgical instrument may be a surgical tool mounted on a robotic arm of a robotic surgical system. [0162] [0162] In the eleventh step 5222, the segmentectomy portion of the procedure is performed. The central surgical controller 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data may correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are used for different types of tissue, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is used for parenchyma (or other similar tissue types), which allows the central surgical controller 106, 206 to infer that the segmentectomy portion of the procedure is being performed. [0163] [0163] In the twelfth step 5224, the node dissection step is then performed. The central surgical controller 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an ultrasonic or RF instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being used after the parenchyma has been transected corresponds to the node dissection step, which allows the central surgical controller 106, 206 to make this inference. It should be noted that surgeons regularly switch between surgical stapling/cutting instruments and surgical energy instruments (i.e., RF or ultrasonic) depending on the specific step in the procedure, because different instruments are better suited for specific tasks. Therefore, the specific sequence in which the stapling/cutting instruments and surgical energy instruments are used may indicate which step of the procedure the surgeon is performing. Furthermore, in certain cases, robotic tools may be used for one or more steps in a surgical procedure, and/or handheld surgical instruments may be used for one or more steps in the surgical procedure. The surgeon can switch between robotic tools and handheld surgical instruments and/or can use the devices simultaneously, for example.
Upon completion of the twelfth step 5224, the incisions are closed and the postoperative portion of the procedure begins. [0164] [0164] In the thirteenth step 5226, the patient's anesthesia is reversed. The central surgical controller 106, 206 can infer that the patient is emerging from anesthesia based on the ventilator data (i.e., the patient's respiratory rate begins to increase), for example. [0165] [0165] Finally, in the fourteenth step 5228, the medical personnel remove the various patient monitoring devices from the patient. The central surgical controller 106, 206 can thus infer that the patient is being transferred to a recovery room when the central controller loses the ECG, blood pressure, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the central surgical controller 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to the data received from the various data sources that are communicably coupled to the central surgical controller 106, 206. [0166] [0166] Situational awareness is further described in US Provisional Patent Application Serial No. 62/659,900, entitled METHOD OF HUB COMMUNICATION, filed on April 19, 2018, which is incorporated herein by reference in its entirety. In certain cases, the operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, may be controlled by the central controller 106, 206 based on its situational awareness and/or feedback from components thereof, and/or based on information from the cloud 104. [0167] [0167] In various aspects, computer systems, such as the central surgical controllers 106, 206 described in connection with Figures 1 to 11, can be programmed to track the surgical devices and/or the surgical team within the OR through a set of cameras. Imaging of the surgical devices and/or surgical staff within the OR can provide a wide variety of data, including the pose (i.e., position and orientation) of the surgical devices, the placements of the surgical devices (e.g., through which trocar a surgical instrument is currently inserted), which surgical team member is operating the surgical devices, gestures or activities being performed by the surgical team, surgical team locations, and so on. In one aspect, the data generated from the visual assessment of the surgical team and/or surgical instruments can be saved as metadata associated with or linked to the perioperative data received from the surgical devices. Such metadata can provide additional context to the perioperative data generated by the surgical devices, which can be utilized by a situational awareness system executed by a central surgical controller, as described above under the heading SITUATIONAL AWARENESS, or by a cloud-based computer system, as described above under the heading CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES. [0168] [0168] In one aspect, the computer systems may utilize videos/images of the OR that are external to a surgical site (e.g., a patient's abdomen undergoing a laparoscopic procedure). In this respect, the set of cameras that capture images for analysis by the computer system described here may exclude images from a laparoscope, thoracoscope, or other endoscope and/or video camera used to visualize the interior of a patient's body.
Instead, the set of cameras can include cameras positioned around the OR to view how the surgical devices are being used and how the surgical team is interacting with the surgical devices, to provide a broader context for the actions taking place within the OR. In another aspect, the images/videos captured externally can be used together with images/videos from endoscopes for analysis and/or to optimize the control of the surgical devices in use. [0169] [0169] Figure 16 is a diagram of an illustrative OR configuration, in accordance with at least one aspect of the present description. In various implementations, a 211801 central surgical controller can be communicatively connected to one or more 211802 cameras, 211810 surgical instruments, 211806 displays, 211808 overhead lights, and other surgical devices within the OR 211800 via a communications protocol (e.g., Bluetooth), as described above under the heading SURGICAL HUBS. The 211802 cameras can be oriented to capture images and/or video of the 211803 surgical team members and/or the 211810 surgical instruments (or other surgical devices) within the OR 211800 during the course of a surgical procedure. The captured image(s) may include still images or moving images (i.e., video). Images of the 211803 surgical team members and/or the 211810 surgical instruments can be captured from a variety of angles and magnifications, using different filters, and so on. In one implementation, the 211802 cameras are arranged within the OR 211800 so that they can collectively view each member of the surgical team performing the procedure. Consequently, the 211801 central surgical controller can receive the captured image and/or video data from the 211802 cameras to visually analyze the 211803 surgical team members and/or the 211810 surgical instruments during the surgical procedure. The image and/or video data can be processed using a variety of optical tracking, machine vision, image processing, and object recognition techniques to track the characteristics, properties, actions, and movements of the 211803 surgical team members and/or the 211810 surgical instruments. [0170] [0170] Figure 17A is a logic flowchart of a process 211600 for controlling a surgical device, in accordance with at least one aspect of the present disclosure. The process 211600 may be executed by a processor or control circuit of a computer system, such as the processor 244 of the central surgical controller 206 illustrated in Figure 10. Accordingly, the process 211600 may be embodied as a set of computer-executable instructions stored in a memory 249 that, when executed by the processor 244, cause the computer system (e.g., a central surgical controller 211801) to perform the described steps. [0171] [0171] Consequently, the processor 244 executing the process 211600 captures 211602 image and/or video data of the surgical team members and/or surgical devices within the OR via the 211802 cameras during the course of a surgical procedure. [0172] [0172] Consequently, the processor 244 determines 211604 a characteristic or condition of the surgical team and/or the surgical devices captured in the images. These characteristics or conditions may include physical properties, actions, interactions with other objects or individuals, and so on.
More particularly, the characteristics or conditions of the 211803 surgical team members may include determining whether a 211803 surgical team member is performing a 211804 gesture (as shown in Figure 16), whether a 211803 surgical team member is holding a given 211810 surgical instrument, where a 211803 surgical team member is located, the number of 211803 surgical team members within the OR, whether a 211803 surgical team member is interacting with a surgical device (and with which surgical device the member is interacting), whether a 211803 surgical team member is passing a 211810 surgical instrument or other surgical device to another 211803 surgical team member, the physical properties associated with a 211803 surgical team member (e.g., arm position, wrist angle), and so on. The characteristics or conditions of the surgical devices may include their poses, whether they are being actively used (e.g., whether a generator is actively supplying power to a connected 211810 surgical instrument), whether a 211810 surgical instrument is being inserted through a trocar (and the location or identity of that trocar), and so on. [0173] [0173] Consequently, the processor 244 controls 211606 a surgical device that is paired with the 211801 central surgical controller in a manner that depends on the specific characteristic or condition. For example, if the processor 244 determines 211604 that a 211803 surgical team member is making a "change instrument mode" gesture, then the processor 244 may transmit a signal to, or otherwise control 211606, a specific 211810 surgical instrument (or its associated generator) connected to the 211801 central surgical controller to change the operating mode of the 211810 surgical instrument (e.g., changing an electrosurgical instrument from a cautery mode to a cutting mode). This would allow the surgical team to control the 211810 surgical instruments without needing to interact with the 211810 surgical instruments directly. As another example, if the processor 244 determines 211604 that a 211810 surgical instrument is being passed (or being prepared to be passed) from one member of the 211803 surgical team (e.g., a nurse) to another member of the 211803 surgical team (e.g., a surgeon), then the processor 244 can transmit a signal to, or otherwise control 211606, the power generator to activate and begin supplying power to the connected 211810 surgical instrument. This would allow the 211801 central surgical controller to preemptively activate the 211810 surgical instruments so that they are ready for use without the surgeon having to take any affirmative action. As yet another example, if the processor 244 determines 211604 that a 211810 surgical instrument is in a particular orientation when being (or about to be) fired, the processor 244 may transmit a signal to, or otherwise control 211606, the 211810 surgical instrument to modify the operating parameters of the 211810 surgical instrument (e.g., force to fire or maximum allowable articulation angle) accordingly. This allows the 211801 central surgical controller to control the functions of the 211810 surgical instruments to account for differences in the placements and orientations of the 211810 surgical instruments. [0174] [0174] In another aspect, the 211801 central surgical controller may include a voice recognition system in addition to, or in place of, the gesture recognition system 211500 described below.
[0174] In another aspect, the central surgical controller 211801 may include a voice recognition system in addition to, or in place of, the gesture recognition system 211500 described below. In this regard, the central surgical controller 211801 can be programmed to identify and respond to a variety of voice commands and control the functions of any connected surgical device accordingly. [0175] In another aspect, Figure 17B is a logic flowchart of a process 211620 for generating surgical metadata, in accordance with at least one aspect of the present description. As described above in connection with Figure 17A, the process 211620 can be executed by a processor 244. Accordingly, the processor 244 executing the process 211620 can capture 211622 image/video data and determine 211624 a characteristic of the surgical team members 211803 and/or the surgical instruments 211810, as described above in connection with Figure 17A. However, in this aspect, the processor 244 saves 211626 the characteristic or condition as metadata associated with, or linked to, the perioperative data generated by the surgical devices during the course of the surgical procedure. As noted above, the characteristics or conditions saved 211626 as metadata may include a wide range of physical properties of, actions by, and interactions between the surgical team members 211803 and the surgical instruments 211810 within the OR 211800.
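A minimal sketch of the save step 211626 follows, assuming a record store with an append(str) method and a JSON-lines layout; both are illustrative choices, not taken from the present description.

    import json
    import time

    def save_condition_as_metadata(record_store, procedure_id, device_id, condition):
        """Sketch of save 211626: attach a visually determined characteristic
        or condition to the perioperative data generated by a surgical device.
        The timestamp is what links the condition to the device's data stream."""
        entry = {
            "procedure_id": procedure_id,
            "device_id": device_id,      # e.g., "stapler_211810"
            "condition": condition,      # e.g., {"gesture": "turn_on_generator"}
            "timestamp": time.time(),
        }
        record_store.append(json.dumps(entry))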
[0176] In one implementation of the processes 211600, 211620 described in connection with Figures 17A and 17B, the central surgical controller 211801 can be configured to recognize and respond to gestures performed by individuals within the OR 211800. For example, Figure 18 is a block diagram of a gesture recognition system 211500, in accordance with at least one aspect of the present disclosure. In the following description of Figure 18, reference should also be made to Figures 10 and 16. The gesture recognition system 211500 includes a gesture recognition module 211504 that can be executed by a processor or control circuit of a computer system, such as the processor 244 of the central surgical controller 206 illustrated in Figure 10. Accordingly, the gesture recognition module 211504 can be embodied as a set of computer-executable instructions stored in a memory 249 that, when executed by the processor 244, cause the computer system (e.g., a central surgical controller 211801) to perform the described steps. [0177] The gesture recognition system 211500 is programmed to receive image or video data from image recognition hardware (e.g., the cameras 211802), recognize various gestures 211804 that can be performed by the surgical team members 211803 (i.e., determine 211604, 211624 whether a gesture is being performed, per the processes 211600, 211620 described in connection with Figures 17A and 17B), and take a corresponding action or otherwise respond to the specific gesture 211804 detected (i.e., control 211606 a surgical device, or save 211626 the data as metadata, per the processes 211600, 211620 described in connection with Figures 17A and 17B). In one aspect, the gesture recognition module 211504 may include a feature extraction module 211506 and a gesture classification module 211508. The feature extraction module 211506 is programmed to extract measurable, discriminative properties or characteristics (i.e., features) from the image/video data. The features can include edges (extracted using a Canny edge detection algorithm, for example), curvature, corners (extracted using a Harris & Stephens corner detection algorithm, for example), and so on. The gesture classification module 211508 determines whether the extracted features correspond to a gesture from a set of gestures. In one aspect, the gesture classification module 211508 may include a machine learning model (e.g., an artificial neural network or a support vector machine) that has been trained, using supervised or unsupervised learning techniques, to correlate a feature vector of the extracted features with one or more gestures. In another aspect, the gesture classification module 211508 may include a Hu invariant moment-based algorithm or a k-curvature algorithm for classifying the gestures. In yet another aspect, the gesture classification module 211508 may include a template matching algorithm programmed to correlate the characterized image/video data (or portions thereof) with templates corresponding to predefined gestures. Other aspects may include various combinations of the above techniques and other techniques for classifying gestures.
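The sketch below illustrates, under stated assumptions, the two-stage path through the modules 211506 and 211508: a toy feature extractor (a normalized gradient-magnitude histogram standing in for Canny or Harris outputs) followed by template matching via cosine similarity. It is a sketch of the template-matching variant only, not the trained-model variant described above.

    from typing import Dict, Optional

    import numpy as np

    def extract_features(frame: np.ndarray) -> np.ndarray:
        """Toy stand-in for the feature extraction module 211506.
        frame is assumed to be a 2-D grayscale image."""
        gy, gx = np.gradient(frame.astype(float))
        magnitude = np.hypot(gx, gy)
        hist, _ = np.histogram(magnitude, bins=32,
                               range=(0.0, float(magnitude.max()) + 1e-9))
        return hist / (hist.sum() + 1e-9)

    def classify_gesture(features: np.ndarray,
                         templates: Dict[str, np.ndarray],
                         threshold: float = 0.9) -> Optional[str]:
        """Template-matching stand-in for the gesture classification module
        211508: cosine similarity against per-gesture template vectors;
        returns None when no template clears the threshold."""
        best_name: Optional[str] = None
        best_score = threshold
        for name, template in templates.items():
            score = float(features @ template) / (
                np.linalg.norm(features) * np.linalg.norm(template) + 1e-9)
            if score > best_score:
                best_name, best_score = name, score
        return best_name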
[0178] Upon recognizing a gesture via the gesture recognition module 211504, the gesture recognition system 211500 can take an action 211510 or give a response that corresponds to the identified gesture. In one aspect, the action 211510 taken by the computer system includes controlling a surgical device within the OR 211800, as discussed above in connection with Figure 17A. For example, the central surgical controller 211801 executing the gesture recognition module 211504 can recognize a "brightness control" gesture and then dim or increase the intensity of the overhead lights 211808 that are paired with the central surgical controller 211801 accordingly. As another example, the central surgical controller 211801 executing the gesture recognition module 211504 can recognize a "turn on generator" gesture and then activate a power generator paired with the central surgical controller 211801, which, in turn, can energize an ultrasonic surgical instrument or an electrosurgical instrument connected to the generator. Gestures can also be used to change the information being shown on the displays 211806 (e.g., navigating menus associated with a surgical instrument 211810 or toggling between video feeds being displayed); to change the mode, function, or operating parameters of a surgical instrument 211810 (e.g., changing an electrosurgical instrument from a cautery mode to a transection mode); to cause an endoscope to start or stop a video recording; to change the power level of a power generator; and so on. Gestures can be beneficial for controlling surgical devices that are outside the sterile barrier from within the sterile barrier without creating a risk of contamination, for allowing individuals who are not directly handling a surgical device, or who are not close to the surgical device within the OR, to control the surgical device's functions, and so on. [0179] In another aspect, the action 211510 taken by the computer system includes saving the gestures made by the surgical team as metadata associated with, or linked to, the perioperative data generated by the surgical devices during the course of the surgical procedure, as discussed above in connection with Figure 17B. Such metadata can be useful in determining whether surgical teams are controlling surgical devices manually or through gestures, which in turn can be correlated with surgical team performance, procedure times, and other similar metrics. In various other aspects, the computer system can control one or more surgical devices, save the gesture data as metadata, or both. [0180] In another aspect, the gesture recognition system 211500 utilizes a magnetic sensing system to receive non-contact input from users, in addition to or in place of the cameras 211802 that visually identify gestures. In this aspect, the gesture recognition system 211500 may include, for example, a magnetic sensing array that can be positioned within the OR 211800. The magnetic sensing array can be configured to monitor the positions of magnetic elements controlled by the surgical team members 211803. In one aspect, the magnetic elements may be embedded in a surgical glove or other article of clothing. In another aspect, the magnetic elements may be situated within an object or token that is manipulated by the surgical team members 211803. Accordingly, the magnetic sensing array may be configured to detect the position of the magnetic elements over time and to identify any gestures performed by the individual controlling the magnetic elements. As with the camera-based implementation of the gesture recognition system 211500, users can navigate menus or select items from menus shown on the displays 211806 within the OR 211800, or make other gestures to control the functions of various surgical devices within the OR 211800. Consequently, the position, movement, and/or orientation of the magnetic element can be used as a tracking marker to control the displays 211806 or other surgical devices that are connected to the central surgical controller 211801, whether they are located inside or outside the sterile field. [0181] In a prophetic implementation of the processes 211600, 211620 described in connection with Figures 17A and 17B, the computer system (e.g., a central surgical controller 211801) can be configured to determine the pose of a surgical instrument 211654, as shown in Figure 19, and control 211606 the surgical instrument 211654 accordingly, or save 211626 the wrist angle as metadata for analysis. In this specific implementation, the wrist angle of the individual 211650 is defined as the angle α between the longitudinal axis 211656 of the surgical instrument 211654 being held by the surgeon and the longitudinal axis 211652 (i.e., the proximal-distal axis) of the individual's hand. In other implementations, the wrist angle can be defined as the angle between the individual's hand and forearm, for example. The central surgical controller 211801 can determine the wrist angle α by visually identifying the surgical instrument 211654 being manipulated by the surgeon and the surgeon's hand, using the object recognition techniques described above, for example.
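As a worked sketch of the geometry just described, the angle α can be computed from two direction vectors, assuming that an upstream object-recognition stage has already extracted the image-plane axes of the hand and of the instrument (that stage is assumed, not implemented here):

    import math
    from typing import Tuple

    def wrist_angle_deg(hand_axis: Tuple[float, float],
                        instrument_axis: Tuple[float, float]) -> float:
        """Angle between the instrument's longitudinal axis 211656 and the
        proximal-distal axis 211652 of the hand, both as 2-D vectors."""
        hx, hy = hand_axis
        ix, iy = instrument_axis
        dot = hx * ix + hy * iy
        norms = math.hypot(hx, hy) * math.hypot(ix, iy)
        cos_a = max(-1.0, min(1.0, dot / norms))  # clamp for numerical safety
        return math.degrees(math.acos(cos_a))

    # e.g., wrist_angle_deg((1.0, 0.0), (0.906, 0.423)) is approximately 25.0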
[0182] In one aspect of the process 211620 described in Figure 17B, the wrist angle α can be saved 211626 as metadata and used to perform analyses of recommended surgical techniques. For example, the scatterplot 211700 of Figure 20 represents such a prophetic analysis of the relationship between the wrist angle α and surgical procedure outcomes. In the scatterplot 211700, the vertical axis 211702 represents the wrist angle α and the horizontal axis 211704 represents procedure outcomes. The portions of the horizontal axis 211704 to the right and left of the vertical axis 211702 can correspond to positive and negative procedure outcomes, respectively, for example. A variety of different procedure outcomes can be compared with the surgeon's wrist angle α, such as whether a particular step of the procedure or firing of the surgical instrument 211654 resulted in excessive bleeding, the incidence of reoperation for the surgical procedure, and so on. Additionally, procedure outcomes can be quantified in several different ways depending on the specific type of outcome being compared with the surgeon's wrist angle α. For example, if the procedure outcome is bleeding occurring after a specific firing of the surgical instrument 211654, the horizontal axis 211704 may represent the degree or amount of blood along the incision line resulting from that firing of the surgical instrument 211654. Additionally, the wrist angle α of each point plotted on the scatterplot 211700 … [0183] Additionally, this data can then be used to establish thresholds or baselines, which, in turn, can be used to provide recommendations to the surgical team members 211803 during or after the completion of a surgical procedure, as described in attorney docket No. END9018USNP2/180517-2, entitled "USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE TO OPTIMIZE DEVICE UTILIZATION AND PERFORMANCE FOR BOTH CURRENT AND FUTURE PROCEDURES". [0184] In one aspect of the process 211600 described in Figure 17A, a surgical instrument 211810 can be controlled 211606 according to the determined wrist angle. For example, the central surgical controller 211801 can adjust the control program parameters of the surgical instrument 211810, such as the force to fire, the force to close, or the maximum permitted articulation angle, to compensate for the orientation of the surgical instrument 211810. This compensation can ensure that the end effector of the surgical instrument 211810 applies the same force that would have been applied had the surgical instrument 211810 been oriented more properly, for example. [0185] In one aspect, the computer system can be programmed to create an orientation index that defines the pose of a surgical instrument 211810 relative to a predefined or normalized frame of reference. This can allow data captured in ORs of different sizes to be compared seamlessly. The orientation index can be established when the central surgical controller 206 scans its surroundings using a non-contact sensor module 242, as described under the heading SURGICAL HUBS, for example. Consequently, the computer system can detect and save the orientation of the surgical instrument 211810 as a function of the predefined frame of reference. [0186] In other implementations, the computer system may track the locations and orientations of the trocars used for a specific type of surgical procedure, which can then be saved as metadata and/or used to control the displays 211806 or other surgical devices to provide recommendations to the surgical team. The trocar positions can be analyzed to determine which range of positions (or combination of positions, for surgical procedures using multiple trocars) is most highly correlated with positive procedural outcomes. Consequently, the computer system can then provide recommendations for the placement of trocars in future surgical procedures.
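One plausible form of such an analysis is sketched below: recorded trocar positions are binned onto a coarse grid and the mean outcome score per cell is reported, so that the best-performing placements can be recommended. The record layout and the scalar outcome score are assumptions made for illustration only.

    import statistics
    from typing import Dict, Iterable, List, Tuple

    Position = Tuple[float, float]  # trocar position in cm, in a fixed OR frame

    def outcome_by_trocar_cell(records: Iterable[Tuple[Position, float]]
                               ) -> Dict[Tuple[int, int], float]:
        """Bin (position, outcome_score) records onto a 1 cm grid and
        return the mean outcome score per grid cell."""
        cells: Dict[Tuple[int, int], List[float]] = {}
        for (x_cm, y_cm), outcome_score in records:
            cell = (round(x_cm), round(y_cm))
            cells.setdefault(cell, []).append(outcome_score)
        return {cell: statistics.mean(scores) for cell, scores in cells.items()}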
[0187] In other implementations, the computer system may track the location of the grip relative to surrounding objects (e.g., the operating table or other equipment), which can then be saved as metadata and/or used to control the displays … [0188] In other implementations, the computer system may identify the surgical instruments 211810 and other surgical devices that are set up on the back table preoperatively, to provide additional context for the surgical procedure data and/or the inferences made by the situational awareness system, as described under the heading SITUATIONAL AWARENESS. Identifying which surgical devices are (or are not) in the preoperative setup can inform subsequent inferences made by the situational awareness system. [0189] In other implementations, the computer system can identify circulating nurses and/or scrub nurses among the surgical team members 211803 and track their location and activity to help inform the next step of the surgical procedure. The nurses' activities can be informative because the scrub nurse usually retrieves the surgical instrument 211810 that will be needed next and then transfers that surgical instrument 211810 to the surgeon when needed. Additionally, some surgical instruments 211810 or other devices require preparation prior to use (e.g., when dictated by tissue conditions, a buttress may be loaded onto a surgical stapler). Conse… [0190] Various aspects of the subject matter described in this document are defined in the following numbered examples: [0191] Example 1. A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of an individual inside the operating room through the camera; determine whether the individual is making a gesture based on the image; and control the surgical device according to the gesture. [0192] Example 2. The computer system of Example 1, wherein the surgical device comprises a screen and the instructions stored in the memory, when executed by the processor, cause the computer system to control the information displayed on the screen according to the gesture. [0193] Example 3. The computer system of Example 2, wherein the information displayed on the screen corresponds to a surgical instrument controlled by the individual. [0194] Example 4. The computer system of Example 1, wherein the surgical device comprises a surgical instrument and the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical instrument according to the gesture. [0195] Example 5. The computer system of Example 4, wherein the surgical instrument is selected from the group consisting of an electrosurgical instrument, an ultrasonic surgical instrument, and a surgical stapling instrument. [0196] Example 6. The computer system of any one of Examples 1 to 5, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to extract features from the image received from the camera and determine whether the individual is making the gesture according to whether the extracted features correspond to the gesture. [0197] Example 7.
A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of the surgical device inside the operating room through the camera; determine a position of the surgical device based on the image; and control the surgical device according to the position of the surgical device. [0198] Example 8. The computer system of Example 7, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical device according to the position. [0199] Example 9. The computer system of Example 8, wherein the surgical device comprises an end effector and the operation comprises an orientation of the end effector. [0200] Example 10. The computer system of Example 8, wherein the surgical device comprises an end effector configured to clamp or supply energy to a tissue in accordance with a control algorithm, and the operation comprises the control algorithm. [0201] Example 11. The computer system of Example 7, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to cause the surgical device to display information corresponding to the position. [0202] Example 12. The computer system of Example 11, wherein the displayed information corresponds to a surgical context. [0203] Example 13. The computer system of Example 12, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to: receive perioperative data from one or more surgical devices, wherein the one or more surgical devices comprise the surgical device; and determine the surgical context based, at least in part, on the perioperative data from the one or more surgical devices. [0204] Example 14. The computer system of any one of Examples 7 to 13, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to determine the pose of the surgical device according to a static frame of reference associated with the operating room. [0205] Example 15. A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of a surgical device or of an individual inside the operating room through the camera; determine a position of the surgical device based on the image, if the image is of the surgical device; determine whether the individual is making a gesture based on the image, if the image is of the individual; and control the surgical device according to at least one of the position of the surgical device or the gesture. [0206] Example 16. The computer system of Example 15, wherein the surgical device comprises a screen and the instructions stored in the memory, when executed by the processor, cause the computer system to control the information displayed on the screen according to the gesture. [0207] Example 17. The computer system of Example 15, wherein the surgical device comprises a surgical instrument and the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical instrument according to the gesture. [0208] Example 18.
The computer system of any one of Examples 15 to 17, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical device according to the position. [0209] Example 19. The computer system of any one of Examples 15 to 17, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to cause the surgical device to display information corresponding to the position. [0210] Example 20. The computer system of any one of Examples 15 to 19, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to determine the pose of the surgical device according to a static frame of reference associated with the operating room. [0211] While various forms have been illustrated and described, it is not the applicant's intention to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, alterations, substitutions, combinations, and equivalents of these forms can be implemented and will occur to those skilled in the art without departing from the scope of the present description. Furthermore, the structure of each element associated with a form may alternatively be described as a means for providing the function performed by the element. In addition, where materials for certain components are disclosed, other materials may be used. It is to be understood, therefore, that the foregoing description and the appended claims are intended to cover all such modifications, combinations, and variations within the scope of the forms presented. The appended claims are intended to cover all such modifications, variations … [0212] The preceding detailed description has presented various forms of devices and/or processes through the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, and/or examples may be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize, however, that some aspects of the forms disclosed herein, in whole or in part, may be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and firmware would be well within the skill of one skilled in the art in light of this description. Furthermore, those skilled in the art will understand that the mechanisms of the subject matter described herein may be distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein is applicable regardless of the specific type of signal-bearing medium used to actually carry out the distribution.
[0213] The instructions used to program logic to perform the various disclosed aspects can be stored in a memory in the system, such as a dynamic random access memory (DRAM), a cache, … [0214] As used in any aspect of the present invention, the term "control circuit" may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor that includes one or more individual instruction processing cores, a processing unit, a processor, a microcontroller, a microcontroller unit, a controller, a digital signal processor (DSP), a programmable logic device (PLD), a programmable logic array (PLA), or a field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by the programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, e.g., an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system-on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smartphones, etc. Accordingly, as used in the present invention, "control circuit" includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application-specific integrated circuit, electrical circuitry forming a general-purpose computing device configured by a computer program (e.g., a general-purpose computer configured by a computer program that at least partially carries out the processes and/or devices described herein, or a microprocessor configured by a computer program that at least partially carries out the processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, a communications switch, or optical-electrical equipment). Those skilled in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion, or in some combination thereof. [0215] As used in any aspect of the present invention, the term "logic" may refer to an application, software, firmware, and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets, and/or data recorded on non-transitory computer-readable storage media. Firmware may be embodied as code, instructions or instruction sets, and/or data that are hard-coded (e.g., non-volatile) in memory devices. [0216] As used in any aspect of the present invention, the terms "component", "system", "module", and the like may refer to a computer-related entity, whether hardware, a combination of hardware and software, software, or software in execution. [0217] As used herein in an aspect of the present invention, an "algorithm" refers to a self-consistent sequence of steps leading to a desired result, where a "step" refers to the manipulation of physical quantities and/or logical states that may, although they need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities and/or states. [0218] A network may include a packet-switched network. The communication devices may be capable of communicating with each other using a selected packet-switched network communications protocol. One exemplary communications protocol may include an Ethernet communications protocol, which may be capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may comply or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE), entitled "IEEE 802.3 Standard", published in December 2008, and/or later versions of this standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communications protocol. The X.25 communications protocol may comply or be compatible with a standard promulgated by the International Telecommunication Union (ITU). … [0219] Unless expressly stated to the contrary, as is evident from the preceding description, it is understood that, throughout the preceding description, discussions using terms such as "processing", "computation", "calculation", "determination", "display", or the like refer to the action and processes of a computer, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer's registers and memories into other data similarly represented as physical quantities within the computer's memories or registers, or within other such information storage, transmission, or display devices. [0220] One or more components may be referred to in the present invention as "configured to", "configurable to", "operable/operative to", … [0221] The terms "proximal" and "distal" are used in the present invention with reference to a clinician manipulating the handle portion of a surgical instrument. The term "proximal" refers to the portion closest to the clinician, and the term "distal" refers to the portion located away from the clinician. It will further be understood that, for the sake of convenience and clarity, spatial terms such as "vertical", "horizontal", "up", and "down" may be used in the present invention in connection with the drawings. However, surgical instruments may be used in many orientations and positions, and these terms are not intended to be limiting and/or absolute. [0222] Persons skilled in the art will recognize that, in general, the terms used herein, and particularly in the appended claims (e.g., the bodies of the appended claims), are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including, but not limited to", the term "having" should be interpreted as "having at least", the term "includes" should be interpreted as "includes, but is not limited to", etc.). It will further be understood by those skilled in the art that, where a specific number of an introduced claim recitation is intended, such intent will be explicitly recited in the claim, and, in the absence of such recitation, no such intent is present.
For example, as an aid to understanding, the following appended claims may contain the use of the introductory phrases "at least one" and "one or more" to introduce claim recitations. … [0223] Furthermore, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such a recitation typically should be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations", without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in cases where a convention analogous to "at least one of A, B, and C, etc." is used, such a construction is generally intended in the sense in which one skilled in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include, but not be limited to, systems having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In cases where a convention analogous to "at least one of A, B, or C, etc." is used, such a construction is generally intended in the sense in which one skilled in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include, but not be limited to, systems having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will further be understood by those skilled in the art that, typically, a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood as contemplating the possibility of including one of the terms, either of the terms, or both terms, unless the context dictates otherwise. For example, the phrase "A or B" will typically be understood to include the possibilities of "A", "B", or "A and B". [0224] With respect to the appended claims, those skilled in the art will understand that the operations recited therein can generally be performed in any order. Furthermore, although several operational flow diagrams are presented in one or more sequences, it should be understood that the various operations may be performed in orders other than those illustrated, or may be performed concurrently. Examples of such alternative orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless the context dictates otherwise. Furthermore, terms such as "responsive to", "related to", or other adjectival past participles are generally not intended to exclude such variants, unless the context dictates otherwise. [0225] It is worth noting that any reference to "one (1) aspect", "an aspect", "an exemplification", or "one (1) exemplification", and the like, means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, the use of expressions such as "in one (1) aspect", "in an aspect", "in an exemplification", or "in one (1) exemplification" in various places throughout this specification does not necessarily refer to the same aspect. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more aspects.
[0226] Any patent application, patent, non-patent publication, or other descriptive material mentioned in this specification and/or listed in any application data sheet is incorporated herein by reference, to the extent that the incorporated material is not inconsistent herewith. Accordingly, and to the extent necessary, the description as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is incorporated herein by reference but conflicts with the existing definitions, statements, or other descriptive material set forth herein is incorporated only to the extent that no conflict arises between the incorporated material and the existing descriptive material. [0227] In summary, numerous benefits resulting from the use of the concepts described in this document have been set forth. The foregoing description of one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications or variations are possible in light of the above teachings. One or more forms were chosen and described in order to illustrate the principles and the practical application, thereby enabling one skilled in the art to utilize the various forms, with various modifications, as suited to the particular use contemplated. The appended claims are intended to define the overall scope.
Claims (20)
1. A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room, characterized in that it comprises: a processor; and a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of an individual inside the operating room through the camera; determine whether the individual is making a gesture based on the image; and control the surgical device according to the gesture.
2. The computer system according to claim 1, characterized in that: the surgical device comprises a screen; and the instructions stored in the memory, when executed by the processor, cause the computer system to control the information displayed on the screen according to the gesture.
3. The computer system according to claim 2, characterized in that the information displayed on the screen corresponds to a surgical instrument controlled by the individual.
4. The computer system according to claim 1, characterized in that: the surgical device comprises a surgical instrument; and the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical instrument according to the gesture.
5. The computer system according to claim 4, characterized in that the surgical instrument is selected from the group consisting of an electrosurgical instrument, an ultrasonic surgical instrument, and a surgical stapling instrument.
6. The computer system according to claim 1, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to: extract features from the image received from the camera; and determine whether the individual is making the gesture according to whether the extracted features correspond to the gesture.
7. A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room, characterized in that it comprises: a processor; and a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of the surgical device inside the operating room through the camera; determine a position of the surgical device based on the image; and control the surgical device according to the position of the surgical device.
8. The computer system according to claim 7, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical device according to the position.
9. The computer system according to claim 8, characterized in that: the surgical device comprises an end effector; and the operation comprises an orientation of the end effector.
10. The computer system according to claim 8, characterized in that: the surgical device comprises an end effector configured to clamp or apply energy to tissue in accordance with a control algorithm; and the operation comprises the control algorithm.
11. The computer system according to claim 7, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to cause the surgical device to display information corresponding to the position.
12. The computer system according to claim 11, characterized in that the displayed information corresponds to a surgical context.
13.
The computer system according to claim 12, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to: receive perioperative data from one or more surgical devices, wherein the one or more surgical devices comprise the surgical device; and determine the surgical context based, at least in part, on the perioperative data from the one or more surgical devices.
14. The computer system according to claim 7, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to determine the position of the surgical device according to a static frame of reference associated with the operating room.
15. A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room, characterized in that it comprises: a processor; and a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of a surgical device or of an individual inside the operating room through the camera; determine a position of the surgical device based on the image, if the image is of the surgical device; determine whether the individual is making a gesture based on the image, if the image is of the individual; and control the surgical device according to at least one of the position of the surgical device or the gesture.
16. The computer system according to claim 15, characterized in that: the surgical device comprises a screen; and the instructions stored in the memory, when executed by the processor, cause the computer system to control the information displayed on the screen according to the gesture.
17. The computer system according to claim 15, characterized in that: the surgical device comprises a surgical instrument; and the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical instrument according to the gesture.
18. The computer system according to claim 15, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical device according to the position.
19. The computer system according to claim 15, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to cause the surgical device to display information corresponding to the position.
20. The computer system according to claim 15, characterized in that the instructions stored in the memory, when executed by the processor, cause the computer system to determine the position of the surgical device according to a static frame of reference associated with the operating room.