Patent abstract:
The display system (10) for a cockpit (3) of an aircraft (1) comprises a display device (20) configured to be worn integral with the head of a user, a display computer (18), a sensor (16) of position and orientation of the user's head, and a database (28) of the interior geometry of the cockpit. In a procedure management assistance mode, the display computer is configured to: receive, from a procedure management computer (13), at least one item of information relating to a first control element (45a) of the cockpit on which a user action is expected; determine a display position of a symbol (46) on the display device, as a function of the position of the first control element acquired from the database (28) and of the position and orientation information of the user's head in the cockpit acquired from the sensor (16), so that this symbol is visible to the user superimposed on the first control element; and control the display of the symbol on the display device.
Publication number: FR3063713A1
Application number: FR1751917
Filing date: 2017-03-09
Publication date: 2018-09-14
Inventors: Javier Manjon Sanchez; Cedric DESCHEEMAEKER
Applicant: Airbus Operations SAS
IPC main classification:
Patent description:

FRENCH REPUBLIC
NATIONAL INSTITUTE OF INDUSTRIAL PROPERTY, COURBEVOIE
Publication number: 3 063 713 (to be used only for reproduction orders)
National registration number: 1751917
Int Cl 8: B 64 D 11/00 (2017.01), B 64 D 45/00, G 02 B 27/01
A1 PATENT APPLICATION
Date of filing: 09.03.17
Applicant(s): AIRBUS OPERATIONS (S.A.S.), simplified joint stock company - FR
Inventor(s): MANJON SANCHEZ JAVIER and DESCHEEMAEKER CEDRIC
Date of public availability of the application: 14.09.18, Bulletin 18/37
List of documents cited in the preliminary search report: refer to the end of the present booklet
Holder(s): AIRBUS OPERATIONS (S.A.S.)
Agent(s): AIRBUS OPERATIONS SAS
DISPLAY SYSTEM AND METHOD FOR AN AIRCRAFT.
FR 3 063 713 - A1
The display system (10) for a cockpit (3) of an aircraft (1) comprises a display device (20) configured to be worn integral with the head of a user, a display computer (18), a sensor (16) of position and orientation of the head of the user and a database (28) of the interior geometry of the cockpit. In a so-called procedure management assistance operating mode, the display computer is configured to:
- receive, from a procedure management computer (13), at least one item of information relating to a first control element (45a) of the cockpit on which action by the user is expected;
- determine a position for displaying a symbol (46) on the display device, as a function of the position of the first control element, acquired from the database (28), and of the position and orientation information of the head of the user in the cockpit, acquired from the sensor (16), so that this symbol is visible to the user superimposed on the first control element; and
- control the display of the symbol on the display device.

Display system and method for an aircraft.
The invention relates to the display of piloting aid information in an aircraft cockpit.

Modern aircraft, particularly transport aircraft, generally have a system for displaying piloting assistance information in their cockpit. Such a system, for example of the CDS ("Control and Display System") type, controls the display of information on cockpit screens. This system generally includes a screen configured to display procedures ("checklists") to be carried out by an aircraft pilot. This screen is for example an ECAM ("Electronic Centralized Aircraft Monitor") screen, which is associated with at least one procedure management computer controlling the display of the procedures on this screen. To carry out the procedures displayed on this screen, the pilot must in particular interact with cockpit control elements such as, for example, buttons located on panels of the cockpit ceiling, thrust levers for controlling the engines of the aircraft, etc. However, an aircraft cockpit includes a large number of such control elements. It would therefore be advantageous to help the pilot quickly identify a control element on which he must act, in order to facilitate the piloting of the aircraft.

Document US7260453B2 describes a system which illuminates a cockpit control element on which the pilot is supposed to act, by means of a set of projectors arranged in the cockpit so as to allow the lighting of all of the control elements of the cockpit. The use of such projectors requires their installation in the cockpit, which must be planned during the design of the aircraft cockpit. In addition, in an aircraft cockpit with a large number of controls and display screens, it can be difficult to find suitable locations for the installation of the projectors. In a variant, the system includes display means surrounding the control elements or located nearby. This requires significant wiring, which is costly and which increases the weight of the aircraft.

There is therefore a need for a system to help the pilot quickly identify a control element on which he must act, which would not require a complex and costly installation in the cockpit of the aircraft.
PRESENTATION OF THE INVENTION:
The object of the present invention is in particular to provide a solution to these problems. It relates to a display system for a cockpit of an aircraft comprising a set of control elements, the display system comprising:
- a display device configured to be worn integral with the head of a user in the cockpit of the aircraft;
- a display computer configured to control the display of information on the display device;
- a position and orientation sensor for the user's head in the cockpit; and
- a database including information on the interior geometry of the cockpit.
The display system is remarkable in that it includes a so-called procedure management assistance operating mode, in which the display computer is configured to:
- receive, from an aircraft procedure management computer, at least one item of information relating to a first cockpit control element on which action by the user is expected;
- acquire, from the database, position information of the first control element in the cockpit;
- acquire position and orientation information of the user's head in the cockpit, provided by the sensor;
- determine a position for displaying a symbol on the display device, as a function of the position of the first control element in the cockpit and of the position and orientation information of the user's head in the cockpit, so that this symbol is visible to the user superimposed on the first control element; and
- control the display of the symbol on the display device.
Thus, when the procedure management assistance mode is activated, the display system controls the display of a symbol on the display device, superimposed on the first control element visible to the user through the display device. This allows the user to quickly and easily identify the first control element by means of which he must perform an action requested in a procedure managed by the procedure management computer. The display system has the advantage of not requiring a specific installation in the cockpit of the aircraft, since the display is carried out by means of the display device worn integral with the head of the user.
In a particular embodiment, in the procedure management assistance mode, the display computer is further configured to control the display on the display device, near the symbol, of help information relating to the action expected from the user on the first cockpit control element.
In an advantageous embodiment, the display system includes a second sensor configured to detect at least part of a user's hand in a detection volume of the second sensor and, in the procedure management assistance mode, the display computer is further configured to:
- acquire information from the second sensor;
- determine if the information acquired from the second sensor corresponds to the presence of a part of a user's hand in the detection volume of the second sensor; and
- if the information acquired from the second sensor corresponds to the presence of part of a user's hand:
. determine a position of said part of the user's hand in the cockpit;
. as a function of the position of said part of the user's hand in the cockpit, search the database for information making it possible to determine whether there is a cockpit control element located at a distance from said part of the user's hand less than a distance threshold; and
. if there is a cockpit control element, other than the first control element, located at a distance from said part of the user's hand less than this distance threshold, control the display of user alert information on the display device.
In particular, the second sensor is configured to detect said at least part of a user's hand, said part of the hand comprising at least one part of a finger, in particular an end of the finger.
Advantageously, the second sensor is integral with the display device. It corresponds in particular to an optical sensor.
According to a first variant, the display system further comprises a device for activating the procedure management assistance mode, and the display computer is configured to activate or deactivate the procedure management assistance mode based on user actions on this activation device.
According to a second variant, the display system further comprises at least one sensor of physiological signals of the user, and the display computer is configured to:
- acquire information from at least one physiological signal sensor;
- determine a stress level for the user based on information acquired from at least one physiological signal sensor;
- compare the user's stress level with a predetermined stress level;
- activate the procedure management assistance mode if the user's stress level is higher than the predetermined stress level.
According to a third variant, the display computer is configured to:
- receive, from the procedure management computer, at least one piece of information relating to inappropriate user behavior;
- activate the procedure management assistance mode when it receives this information relating to inappropriate user behavior.
The invention also relates to a display method for an aircraft cockpit, the aircraft comprising a set of control elements in the cockpit, a procedure management computer and a display system comprising:
- a display device configured to be worn integral with the head of a user in the cockpit of the aircraft;
- a display computer configured to control the display of information on the display device;
- a position and orientation sensor for the user's head in the cockpit; and
- a database including information on the interior geometry of the cockpit.
The method is remarkable in that, in a so-called procedure management assistance operating mode of the display system, it comprises the following steps implemented by the display computer:
- receive, from the procedure management computer, at least one item of information relating to a first cockpit control element on which action by the user is expected;
- acquire, from the database, position information of the first control element in the cockpit;
- acquire position and orientation information of the user's head in the cockpit, provided by the sensor;
- determine a position for displaying a symbol on the display device, as a function of the position of the first control element in the cockpit and of the position and orientation information of the user's head in the cockpit, so that this symbol is visible to the user superimposed on the first control element; and
- control the display of the symbol on the display device.
In an advantageous embodiment, in which the display system comprises a second sensor configured to detect at least part of a hand of the user in a detection volume of the second sensor, the method further comprises, in the procedure management assistance mode, the following steps implemented by the display computer:
- acquire information from the second sensor;
- determine if the information acquired from the second sensor corresponds to the presence of a part of a user's hand in the detection volume of the second sensor; and
- if the information acquired from the second sensor corresponds to the presence of part of a user's hand:
. determine a position of said part of the user's hand in the cockpit;
. as a function of the position of said part of the user's hand in the cockpit, search the database for information making it possible to determine whether there is a cockpit control element located at a distance from said part of the user's hand less than a distance threshold; and
. if there is a cockpit control element, other than the first control element, located at a distance from said part of the user's hand less than this distance threshold, control the display of user alert information on the display device.
The invention also relates to an aircraft comprising a display system as mentioned above.
DETAILED DESCRIPTION:
The invention will be better understood on reading the description which follows and on examining the appended figures.
Figure 1 illustrates in a simplified manner an aircraft comprising a cockpit.
Figures 2a and 2b schematically illustrate embodiments, in accordance with the invention, of a display system for a cockpit of an aircraft.
FIGS. 3a, 3b and 3c illustrate the orientation of the head of a user of the display system, respectively in top view, in side view and in back view.
FIGS. 4a, 4b and 5 illustrate display examples, on a display device worn integral with the head of a user.
The aircraft 1 shown in FIG. 1 comprises a cockpit 3 in a front part 4 of said aircraft. It has a longitudinal axis 5, corresponding to a roll axis of the aircraft. This longitudinal axis is substantially horizontal when the aircraft is parked on the ground. The aircraft also has a yaw axis (not shown), which is substantially vertical when the aircraft is parked on the ground. By convention, in the following description, the term horizontal designates a line or a substantially horizontal plane when the aircraft is parked on the ground, so that this line or this plane is perpendicular to the yaw axis of the aircraft. Similarly, the term vertical designates a line or a plane that is substantially vertical when the aircraft is parked on the ground, such that this line or this plane is parallel to (or contains) the yaw axis of the aircraft.
The display system 10 according to an embodiment of the invention, shown in Figure 2a, includes a display computer 18 comprising a processing unit (labeled PROC in the figure). This processing unit can in particular correspond to a processor or a microprocessor of the display computer. According to different embodiments, the display computer 18 is a common display computer controlling several display devices of the aircraft, or a computer dedicated to the display system 10. In a particular embodiment, this computer corresponds to a modular avionics computer of the IMA ("Integrated Modular Avionics") type, which also supports functions other than display. The display computer 18 is connected to at least one procedure management computer 13 of the aircraft. In the particular example shown in Figure 2a, the display computer is connected to the procedure management computer 13 by a link 15 and a communication network 14 (labeled "Net" in the figure), to which the procedure management computer 13 is also connected, as well as avionics computers 12. The procedure management computer 13 and the avionics computers 12 are for example located in an avionics bay 2 of the aircraft.

The display system 10 further comprises a display device 20 configured to be worn integral with the head of a user, in particular a pilot, in the cockpit of the aircraft. This display device is connected to the display computer 18 by a link 19. The display device 20 corresponds to a display device commonly called HMD, for "Head Mounted Display". It is sometimes also called HWD, for "Head Worn Display". It includes a display mounted integral with glasses or a helmet, so that the user can see information displayed on the display when he is wearing these glasses or this helmet. The display is semi-reflective, so that the user can view the displayed information, in augmented reality, superimposed on the environment. In the following description, the term HMD denotes both an HMD device and an HWD device.

The display system 10 further comprises a sensor 16 for the orientation and position of the user's head and a database 28 comprising information on the interior geometry of the cockpit. The head orientation and position sensor 16 is connected to the display computer 18 by a link 17 and the database 28 is connected to the display computer 18 by a link 27. In one particular embodiment, the sensor 16 is mounted integral with the display device 20, as symbolized by the arrow 21 in broken lines. It then corresponds, for example, to a set of inertial sensors integrated into the display device 20. In another particular embodiment, the sensor 16 is mounted integral with the cockpit 3 of the aircraft. It then corresponds, for example, to a camera arranged so as to automatically monitor the relative position of the head of the user of the display device 20 with respect to the cockpit. Without departing from the scope of the invention, the sensor 16 can correspond to a group of sensors, for example a user head orientation sensor and a user head position sensor.

The cockpit interior geometry information contained in the database 28 is for example obtained from a digital model of the aircraft or of the cockpit of the aircraft, in particular from a three-dimensional digital model. Advantageously, this geometry information is simplified compared with the information from the digital model of the aircraft, so as to reduce the volume of said information as well as the calculation times.
This cockpit interior geometry information contained in the database 28 relates at least to the cockpit control elements, such as, for example, buttons located on cockpit control panels, thrust control levers for the aircraft engines, etc.
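As an illustration of the data handled by the display computer 18, the sketch below shows a minimal data model for the entries of the database 28 and for the head pose delivered by the sensor 16. All names, fields and units are assumptions made for this sketch; the patent does not specify a storage format.

```python
# Minimal illustrative data model; names, fields and units are assumptions
# made for this sketch, not a format specified by the patent.
from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]  # coordinates in the aircraft reference frame, in metres

@dataclass
class ControlElement:
    """One entry of the cockpit interior geometry database (28)."""
    element_id: str   # identifier also known to the procedure management computer (13)
    label: str        # human-readable name, usable as help information (48)
    position: Vec3    # simplified position of the control element in the cockpit

@dataclass
class HeadPose:
    """Position and orientation of the user's head, as provided by the sensor (16)."""
    position: Vec3    # centre of gravity of the head (50) in the aircraft frame
    yaw: float        # psi, in radians
    pitch: float      # theta, in radians
    roll: float       # phi, in radians

# The database (28) can then simply be queried by element identifier:
CockpitGeometry = Dict[str, ControlElement]

def find_element(database: CockpitGeometry, element_id: str) -> ControlElement:
    return database[element_id]
```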
In operation, when a procedure must be executed by a user, in particular an aircraft pilot, the procedure management computer 13 displays this procedure on an ECAM-type screen of the aircraft cockpit. The display system 10 includes a procedure management assistance mode. When the procedure management assistance mode is activated, the procedure management computer 13 transmits to the display computer 18, through the communication network 14, information relating to a first cockpit control element on which a user action is expected for the execution of the procedure. The display computer 18, which receives this information, then queries the database 28 to acquire position information of the first control element in the cockpit. It also acquires position and orientation information of the user's head in the cockpit, provided by the sensor 16. As a function of the position of the first control element in the cockpit and of the position and orientation information of the user's head in the cockpit, the display computer 18 determines a position for displaying a symbol on the display device 20, so that this symbol is visible to the user superimposed on the first control element. The display computer 18 then controls the display of the symbol on the display device 20.
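A minimal sketch of this display-position computation is given below: the position of the first control element (from the database 28) is expressed in a frame attached to the user's head (from the pose provided by the sensor 16) and projected onto the display with a pinhole model. The axis conventions (x forward, y to the right, z down in the head frame), the rotation sequence and the focal length are assumptions made for the sketch; the patent does not specify them.

```python
# Sketch of the determination of the display position of the symbol (46);
# axis conventions, rotation sequence and focal length are assumptions.
import numpy as np

def rotation_aircraft_from_head(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Rotation matrix mapping head-frame vectors to the aircraft frame,
    using a yaw-pitch-roll sequence about the z, y and x axes respectively."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return rz @ ry @ rx

def symbol_display_position(element_pos, head_pos, yaw, pitch, roll, focal_px=800.0):
    """Return the (u, v) position of the symbol (46) on the display (8), or None
    when the control element is behind the user and cannot be overlaid."""
    r = rotation_aircraft_from_head(yaw, pitch, roll)
    # Express the control element in the head frame (x forward, y right, z down).
    p = r.T @ (np.asarray(element_pos, dtype=float) - np.asarray(head_pos, dtype=float))
    if p[0] <= 0.0:
        return None
    u = focal_px * p[1] / p[0]   # horizontal offset from the display centre
    v = focal_px * p[2] / p[0]   # vertical offset from the display centre
    return (u, v)
```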
In an example illustrated by FIG. 5, the user can see a set of control elements 45 of the cockpit through the display 8 of the display device 20. These control elements correspond in particular to buttons of a control panel located on the cockpit ceiling. According to the procedure being executed, the user must perform an action on a first element 45a of the set of control elements 45. As described above, the display computer 18 controls the display of a symbol 46 superimposed on the first control element 45a. In the example illustrated by the figure, the symbol 46 corresponds to a frame surrounding the first control element 45a. Other types of representations of the symbol are possible without departing from the scope of the invention.
Advantageously, the display computer 18 is further configured to control the display on the display device 20, near the symbol 46, of help information 48 relating to the action expected from the user on the first control element 45a of the cockpit. For this, in addition to the information relating to the first control element, the display computer 18 receives from the procedure management computer 13 either the help information 48 itself or an identifier of the help information. The display of the help information 48 makes it possible to inform the user of the next action to be carried out by means of the first control element 45a.
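The sketch below illustrates one way the help information 48 could be resolved and placed next to the symbol 46. The local lookup table, the example identifier in it and the fixed pixel offset are hypothetical; the patent only states that either the text or an identifier of it is received.

```python
# Illustrative placement of the help information (48) next to the symbol (46);
# the lookup table, the example identifier and the pixel offset are hypothetical.
HELP_TEXTS = {"ENG1_MASTER": "Set ENG 1 MASTER switch to OFF"}  # hypothetical identifier

def help_label(received: str, symbol_uv, offset=(30.0, 20.0)):
    """Return (text, (u, v)): the help text and its display position near the symbol."""
    text = HELP_TEXTS.get(received, received)  # identifier -> text, otherwise use the received text
    u, v = symbol_uv
    return text, (u + offset[0], v + offset[1])
```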
In special circumstances, the user may not be looking towards the first control element 45a when an action on this first control element is required by the procedure. FIG. 4a illustrates such a situation, in which the user looks at the exterior of the aircraft through a windshield of the cockpit. The display on the display 8 of the display device 20 then comprises, for example, an aircraft reference symbol 38, an aircraft speed vector symbol 40 and an attitude scale 36 of the aircraft, displayed in conformity with the environment of the aircraft. The display also includes an altitude scale 32, a speed scale 30 and an aircraft roll scale 34. In such a situation, the symbol 46 cannot be displayed superimposed on the first control element 45a. The display computer 18 then determines a direction in which the user must turn his head to see the first control element 45a and displays an arrow pointing in this direction. In an example illustrated in FIG. 4b, the user looks through a windshield of the cockpit, towards the front of the aircraft, and the first control element 45a is located on a control panel of the cockpit ceiling. Consequently, the display computer 18 controls the display, on the display 8, of an arrow 44 pointing upwards. Advantageously, the display computer 18 also controls the display of information 42 near the arrow 44 so as to inform the user that action on his part is required. If he turns his head upwards, the user then obtains the display, already described, illustrated in FIG. 5.
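A minimal sketch of this out-of-view case follows. The control element position is assumed to be already expressed in the head frame (for example the vector p computed in the projection sketch above); the field-of-view half-angle and the quantisation of the arrow into four directions are assumptions made for the sketch.

```python
# Sketch of the choice of the guidance arrow (44) when the first control
# element (45a) is outside the display's field of view.  The element position
# is assumed to be expressed in the head frame (x forward, y right, z down);
# the field-of-view half-angle and the four arrow directions are assumptions.
import numpy as np

def guidance_arrow(p_head, half_fov=np.radians(20.0)):
    """Return None if the symbol can be overlaid directly, otherwise 'up',
    'down', 'left' or 'right' indicating where the user should turn his head."""
    x, y, z = p_head
    az = np.arctan2(y, x)                  # azimuth, positive towards the right
    el = np.arctan2(-z, np.hypot(x, y))    # elevation, positive upwards
    if x > 0.0 and abs(az) <= half_fov and abs(el) <= half_fov:
        return None
    if abs(el) >= abs(az):
        return "up" if el > 0.0 else "down"
    return "right" if az > 0.0 else "left"
```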
In an advantageous embodiment illustrated in FIG. 2b, the display system 10 further comprises a second sensor 22 configured to detect at least part of a user's hand in a detection volume of the second sensor. In a first example, when the first control element corresponds to a lever for controlling the thrust of the engines, this part of a hand of the user corresponds to any part of the hand capable of acting on the lever. In a second example, when the first control element on which user action is expected corresponds to a button, this part of a user's hand corresponds to a part of a user's finger, preferably at least an end of the finger with which the user is likely to press the button. The second sensor 22, which corresponds for example to an optical sensor such as a camera, is connected to the display computer 18 by a link 25. According to a first alternative, the second sensor 22 is integrated into the display device 20, as illustrated in the figure. According to a second alternative, the second sensor 22 is integrated into the cockpit of the aircraft.

In the procedure management assistance mode, the display computer is further configured to acquire information from the second sensor 22 and to determine whether the information acquired from the second sensor corresponds to the presence of part of a user's finger in the detection volume of the second sensor (in the context of the second example mentioned above). If the information acquired from the second sensor corresponds to the presence of part of a finger of the user, the display computer 18 determines a position of said part of the finger of the user in the cockpit and, as a function of this position, it searches the database 28 for information making it possible to determine whether there is a cockpit control element located at a distance from the part of the user's finger less than a distance threshold. If there is a cockpit control element, other than the first control element 45a, located at a distance from the part of the user's finger less than this distance threshold, the display computer 18 controls the display of user alert information on the display device 20. This provides better assistance to the user by monitoring his actions on the cockpit control elements and alerting him when he risks acting on a control element other than the one provided for in the procedure. The system is particularly advantageous, since it makes it possible to anticipate the alert even before the user performs an action on this other control element, by detecting the presence of a part, preferably an end, of a finger of the user near said other control element.
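The proximity check described above can be sketched as follows; the dictionary layout used for the database query result and the 5 cm distance threshold are assumptions chosen for illustration.

```python
# Sketch of the monitoring performed with the second sensor (22): the detected
# fingertip position is compared with the positions of the control elements
# from the database (28).  The dictionary layout and the 5 cm threshold are
# assumptions made for this sketch.
import numpy as np

def wrong_element_near_finger(finger_pos, expected_element_id, element_positions,
                              threshold_m=0.05):
    """element_positions maps an element identifier to its (x, y, z) position in
    the cockpit.  Return the identifier of a control element, other than the
    expected one, that the finger is about to touch, or None if no alert is needed."""
    finger = np.asarray(finger_pos, dtype=float)
    for element_id, position in element_positions.items():
        if element_id == expected_element_id:
            continue
        if np.linalg.norm(finger - np.asarray(position, dtype=float)) < threshold_m:
            return element_id   # the display computer then shows the user alert
    return None
```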
According to a first variant, the display system further comprises a device for activating the procedure management assistance mode. This device corresponds for example to a button in the aircraft cockpit or to a man-machine interface in the cockpit comprising a display screen as well as a keyboard and/or a pointing device of the trackball or mouse type. As a function of the actions of the user on this activation device, the device transmits information either to the procedure management computer 13, according to a first alternative, or to the display computer 18, according to a second alternative. As a function of said information, the procedure management computer 13 or the display computer 18 activates or deactivates the procedure management assistance mode. In the first alternative, the procedure management computer 13 informs the display computer 18 of the activation or deactivation of the procedure management assistance mode.
According to a second variant, the display system also comprises at least one sensor of physiological signals of the user. This at least one sensor is for example mounted integral with the HMD display device 20. According to a first option, this sensor corresponds to one or more electrodes intended to be placed on the head of the user when the latter is wearing the HMD display device. These electrodes measure electroencephalography signals, which reflect the activity of the user's brain. According to a second option, which can be combined with the first, the sensor uses infrared signals of the FLIR ("Forward Looking InfraRed") type in order to measure a temperature of the user's head. The display computer 18 acquires information from the at least one physiological signal sensor and determines a stress level of the user as a function of the information acquired from this sensor. The display computer 18 compares the stress level of the user with a predetermined stress level and activates the procedure management assistance mode if the stress level of the user is greater than the predetermined stress level. Thus, the procedure management assistance mode is activated automatically when the user's stress level exceeds the predetermined stress level.
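The patent does not specify how the stress level is derived from the physiological signals; the sketch below therefore uses an assumed score combining a normalised EEG band power with a head-temperature excess, compared with the predetermined level. The weighting, the baselines and the threshold value are all assumptions.

```python
# Illustrative stress estimation; the weighting, the baselines and the
# predetermined level are assumptions.  The patent only requires that a stress
# level be computed and compared with a predetermined stress level.
def stress_level(eeg_band_power: float, head_temp_c: float,
                 baseline_power: float = 1.0, baseline_temp_c: float = 36.5) -> float:
    return (0.7 * (eeg_band_power / baseline_power)
            + 0.3 * max(0.0, head_temp_c - baseline_temp_c))

def assistance_mode_required(eeg_band_power: float, head_temp_c: float,
                             predetermined_level: float = 1.5) -> bool:
    """True when the procedure management assistance mode must be activated."""
    return stress_level(eeg_band_power, head_temp_c) > predetermined_level
```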
According to a third variant, the display computer 18 activates the procedure management assistance mode when it receives, from the procedure management computer, information relating to inappropriate behavior of the user. Thus, the procedure management assistance mode is activated automatically when the user does not perform the actions expected within the framework of a procedure.
In an advantageous embodiment, the orientation information of the user's head corresponds to at least one angle from a set of angles, as illustrated in FIGS. 3a, 3b and 3c. In these figures, the orientation of the user's head 50 is represented by a straight line 52. In an exemplary embodiment, this straight line 52 corresponds to a theoretical direction of gaze of the user when he looks straight ahead without turning his eyes either to the right or to the left and without raising or lowering them. Other definitions of the orientation of the head of the user are however possible without departing from the scope of the invention. In said advantageous embodiment, the orientation information of the user's head corresponds to at least one angle from a yaw angle ψ, a pitch angle θ and a roll angle φ, illustrated respectively in FIGS. 3a, 3b and 3c. These angles are defined in a coordinate system linked to the aircraft. Thus, the yaw angle ψ is an angle, defined in projection in a horizontal plane, between a straight line 5' parallel to the longitudinal axis 5 of the aircraft and the straight line 52 representing the orientation of the head of the user. The pitch angle θ is an angle, defined in projection in a vertical plane parallel to the longitudinal axis 5 of the aircraft, between a straight line 5' parallel to the longitudinal axis 5 of the aircraft and the straight line 52 representing the orientation of the user's head. The roll angle φ is an angle, defined in projection in a vertical plane perpendicular to the longitudinal axis 5 of the aircraft, between a vertical line 56 and a yaw axis 54 of the head of the user. The orientation information of the head of the user acquired by the display computer 18 corresponds to at least one of the angles ψ, θ and φ.
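As a worked illustration of these definitions, take an aircraft frame with x along the longitudinal axis 5, y horizontal and perpendicular to it, and z vertical (this choice of axes is an assumption, not stated in the text), and let (d_x, d_y, d_z) be the components of a unit vector along the line 52. The yaw and pitch angles defined above by projection can then be written as:

\[
\psi = \operatorname{atan2}\!\left(d_y,\; d_x\right),
\qquad
\theta = \operatorname{atan2}\!\left(d_z,\; d_x\right).
\]

In this illustration, the roll angle φ, measured between the vertical line 56 and the yaw axis 54 of the head, does not appear in these two expressions because it does not change the direction of the line 52; it nevertheless affects where a given point of the cockpit projects on the display and is therefore still needed by the display computer 18.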
Also advantageously, the position information of the head of the user corresponds to Cartesian coordinates of the center of gravity of the head 50 in an orthonormal reference frame. In particular, a first axis of the orthonormal coordinate system is parallel to the longitudinal axis 5, a second axis of the orthonormal coordinate system is vertical and a third axis of the orthonormal coordinate system is horizontal and perpendicular to the first two axes.
Claims (12)
1- Display system (10) for a cockpit (3) of an aircraft (1) comprising a set of control elements (45), the display system comprising:
- a display device (20) configured to be worn integral with the head of a user in the cockpit of the aircraft;
- a display computer (18) configured to control the display of information on the display device;
- a sensor (16) for position and orientation of the user's head in the cockpit; and
- a database (28) comprising information on the interior geometry of the cockpit, the display system being characterized in that it includes a so-called procedure management assistance operating mode, in which the display computer is configured to:
- receive, from a procedure management computer (13) of the aircraft, at least one item of information relating to a first control element (45a) of the cockpit (3) on which action by the user is expected;
- acquire, from the database (28), position information of the first control element in the cockpit;
- acquire position and orientation information of the user's head in the cockpit, provided by the sensor (16);
- determine a position for displaying a symbol (46) on the display device (20), as a function of the position of the first control element (45a) in the cockpit and of the position and orientation information of the head of the user in the cockpit, so that this symbol is visible to the user superimposed on the first control element; and
- control the display of the symbol (46) on the display device.
2- System according to claim 1, characterized in that, in the procedure management assistance mode, the display computer is further configured to control the display on the display device, near the symbol (46), of help information (48) relating to the action expected from the user on the first cockpit control element.
3- System according to claim 1 or 2, characterized in that it comprises a second sensor (22) configured to detect at least part of a hand of the user in a detection volume of the second sensor and, in the procedure management assistance mode, the display computer is further configured to:
- acquire information from the second sensor;
- determine if the information acquired from the second sensor corresponds to the presence of a part of a user's hand in the detection volume of the second sensor; and
- if the information acquired from the second sensor corresponds to the presence of part of a user's hand:
. determine a position of said part of the user's hand in the cockpit;
. as a function of the position of said part of the user's hand in the cockpit, search the database (28) for information making it possible to determine whether there is a cockpit control element located at a distance from said part of the user's hand less than a distance threshold; and
. if there is a cockpit control element, other than the first control element (45a), located at a distance from said part of the user's hand less than this distance threshold, control the display of user alert information on the display device.
4- System according to claim 3, characterized in that the second sensor is configured to detect said at least part of a hand of the user, said part of the hand comprising at least part of a finger, in particular an end of the finger.
5- System according to claim 4, characterized in that the second sensor is integral with the display device (20).
6- System according to one of claims 3 to 5, characterized in that the second sensor is an optical sensor.
7- System according to any one of the preceding claims, characterized in that it further comprises a device for activating the procedure management assistance mode, and the display computer is configured to activate or deactivate the procedure management assistance mode based on user actions on the device for activating the procedure management assistance mode.
8- System according to any one of claims 1 to 6, characterized in that it further comprises at least one sensor of physiological signals of the user and the display computer is configured to:
- acquire information from at least one physiological signal sensor;
- determine a stress level for the user based on information acquired from at least one physiological signal sensor;
- compare the user's stress level with a predetermined stress level;
- activate the procedure management assistance mode if the user's stress level is higher than the predetermined stress level.
9- System according to any one of claims 1 to 6, characterized in that the display computer is configured to:
- receive, from the procedure management computer, at least one piece of information relating to inappropriate user behavior;
- activate the procedure management assistance mode when it receives this information relating to inappropriate user behavior.
10- Display method for a cockpit (3) of an aircraft (1), the aircraft comprising a set of control elements (45) in the cockpit, a procedure management computer (13) and a display system (10) comprising:
- a display device (20) configured to be worn integral with the head of a user in the cockpit of the aircraft;
- a display computer (18) configured to control the display of information on the display device;
- a sensor (16) for position and orientation of the user's head in the cockpit; and
- a database (28) comprising information on the interior geometry of the cockpit, characterized in that, in a so-called procedure management assistance operating mode of the display system, the method comprises the following steps implemented by the display computer (18):
- receive, from the procedure management computer (13), at least one item of information relating to a first control element (45a) of the cockpit on which action by the user is expected;
- acquiring, from the database (28), position information of the first control element (45a) in the cockpit;
- acquire position and orientation information of the user's head in the cockpit, provided by the sensor (16);
- determine a position for displaying a symbol (46) on the display device, as a function of the position of the first control element (45a) in the cockpit and of the position and orientation information of the head of the user in the cockpit, so that this symbol is visible to the user superimposed on the first control element; and
- control the display of the symbol (46) on the display device.
11- Method according to claim 10, characterized in that, the display system comprising a second sensor (22) configured to detect at least part of a hand of the user in a detection volume of the second sensor, the method further comprises, in the procedure management assistance mode, the following steps implemented by the display computer:
- acquire information from the second sensor;
- determine if the information acquired from the second sensor corresponds to the presence of a part of a user's hand in the detection volume of the second sensor; and
- if the information acquired from the second sensor corresponds to the presence of part of a user's hand:
. determine a position of said part of the user's hand in the cockpit;
. as a function of the position of said part of the user's hand in the cockpit, search the database (28) for information making it possible to determine whether there is a cockpit control element located at a distance from said part of the user's hand less than a distance threshold; and
. if there is a cockpit control element, other than the first control element (45a), located at a distance from said part of the user's hand less than this distance threshold, control the display of user alert information on the display device.
12- Aircraft (1) characterized in that it comprises a display system (10) according to any one of claims 1 to 9.
Family patents:
Publication number | Publication date
FR3063713B1|2019-07-05|
US20180261148A1|2018-09-13|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
WO2006013560A2|2004-08-02|2006-02-09|David Cohen|Emergency situation detector|
US7260453B2|2005-01-06|2007-08-21|The Boeing Company|Checklist error mitigation system|
EP2254023A2|2009-05-19|2010-11-24|Honeywell International Inc.|Systems, apparatus and fast methods for aligning images to external markers in Near-to-Eye display systems|
US20130050069A1|2011-08-23|2013-02-28|Sony Corporation, A Japanese Corporation|Method and system for use in providing three dimensional user interface|
EP3029510A2|2014-12-02|2016-06-08|Honeywell International Inc.|Near-to-eye display systems and methods for verifying aircraft components|
US5388990A|1993-04-23|1995-02-14|The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration|Virtual reality flight control display with six-degree-of-freedom controller and spherical orientation overlay|
US5803738A|1994-06-24|1998-09-08|Cgsd Corporation|Apparatus for robotic force simulation|
US5831584A|1995-07-28|1998-11-03|Chrysler Corporation|Hand calibration system and virtual display selection for vehicle simulator|
US6127990A|1995-11-28|2000-10-03|Vega Vista, Inc.|Wearable display and methods for controlling same|
US5689619A|1996-08-09|1997-11-18|The United States Of America As Represented By The Secretary Of The Army|Eyetracker control of heads-up displays|
US6053737A|1997-11-04|2000-04-25|Northrop Grumman Corporation|Intelligent flight tutoring system|
US6714141B2|2002-04-09|2004-03-30|Colm C. Kennedy|Electronic cockpit vision system|
US10039445B1|2004-04-01|2018-08-07|Google Llc|Biosensors, communicators, and controllers monitoring eye movement and methods for using them|
US20070198141A1|2006-02-21|2007-08-23|Cmc Electronics Inc.|Cockpit display system|
US8055412B2|2007-05-29|2011-11-08|Bayerische Motoren Werke Aktiengesellschaft|System and method for displaying control information to the vehicle operator|
US8687056B2|2007-07-18|2014-04-01|Elbit Systems Ltd.|Aircraft landing assistance|
FR2922323B1|2007-10-12|2012-08-03|Airbus France|CROSS-MONITORING DEVICE FOR HIGH HEAD DISPLAYS|
US8903430B2|2008-02-21|2014-12-02|Microsoft Corporation|Location based object tracking|
US9207758B2|2008-05-30|2015-12-08|Honeywell International Inc.|Operator assistance methods and systems|
US8126600B2|2008-06-18|2012-02-28|Honeywell International Inc.|Method and apparatus for improving pilot situational awareness during flare to touchdown|
US9728006B2|2009-07-20|2017-08-08|Real Time Companies, LLC|Computer-aided system for 360° heads up display of safety/mission critical data|
TWI400467B|2009-05-11|2013-07-01|Acer Inc|Electronic device with object guidance function and its object guidance method|
US9057874B2|2010-12-30|2015-06-16|GM Global Technology Operations LLC|Virtual cursor for road scene object selection on full windshield head-up display|
US9113050B2|2011-01-13|2015-08-18|The Boeing Company|Augmented collaboration system|
US8886372B2|2012-09-07|2014-11-11|The Boeing Company|Flight deck touch-sensitive hardware controls|
US9390559B2|2013-03-12|2016-07-12|Honeywell International Inc.|Aircraft flight deck displays and systems and methods for enhanced display of obstacles in a combined vision display|
US9460314B2|2013-12-01|2016-10-04|Apx Labs, Inc.|Systems and methods for providing task-based instructions|
US9615742B2|2014-01-21|2017-04-11|Osterhout Group, Inc.|Eye imaging in head worn computing|
US9772712B2|2014-03-11|2017-09-26|Textron Innovations, Inc.|Touch screen instrument panel|
US9428056B2|2014-03-11|2016-08-30|Textron Innovations, Inc.|Adjustable synthetic vision|
US10042456B2|2014-03-11|2018-08-07|Textron Innovations Inc.|User interface for an aircraft|
US9685090B2|2014-03-11|2017-06-20|Textron Innovations Inc.|Navigational aids|
WO2015138318A1|2014-03-11|2015-09-17|Cessna Aircraft Company|Standby instrument panel for aircraft|
EP2933707B1|2014-04-14|2017-12-06|iOnRoad Technologies Ltd.|Head mounted display presentation adjustment|
US10529248B2|2014-06-19|2020-01-07|Embraer S.A.|Aircraft pilot training system, method and apparatus for theory, practice and evaluation|
US9561866B2|2015-05-27|2017-02-07|The Boeing Company|Wearable electronic display and method for displaying information to a pilot|
US10108010B2|2015-06-29|2018-10-23|Rockwell Collins, Inc.|System for and method of integrating head up displays and head down displays|
US9842388B2|2015-07-02|2017-12-12|Honeywell International Inc.|Systems and methods for location aware augmented vision aircraft monitoring and inspection|
US10598932B1|2016-01-06|2020-03-24|Rockwell Collins, Inc.|Head up display for integrating views of conformally mapped symbols and a fixed image source|
US10289263B2|2016-01-08|2019-05-14|The Boeing Company|Data acquisition and encoding process linking physical objects with virtual data for manufacturing, inspection, maintenance and repair|
US10419053B2|2016-04-22|2019-09-17|Seabeck Holdings, Llc|Smart aviation communication headset and peripheral components|
US9995936B1|2016-04-29|2018-06-12|Lockheed Martin Corporation|Augmented reality systems having a virtual image overlaying an infrared portion of a live scene|
US10338885B1|2017-05-04|2019-07-02|Rockwell Collins, Inc.|Aural and visual feedback of finger positions|
JP6874448B2|2017-03-17|2021-05-19|株式会社デンソーウェーブ|Information display system|
US10559135B1|2019-03-15|2020-02-11|Microsoft Technology Licensing, Llc|Fixed holograms in mobile environments|
FR3098932A1|2019-07-15|2021-01-22|Airbus Helicopters|Method and system for assisting the piloting of an aircraft by adaptive display on a screen|
Legal status:
2018-03-23| PLFP| Fee payment|Year of fee payment: 2 |
2018-09-14| PLSC| Publication of the preliminary search report|Effective date: 20180914 |
2020-03-19| PLFP| Fee payment|Year of fee payment: 4 |
2021-03-23| PLFP| Fee payment|Year of fee payment: 5 |
Priority:
Application number | Filing date | Patent title
FR1751917 | 2017-03-09 | DISPLAY SYSTEM AND METHOD FOR AN AIRCRAFT
US15/910,842 | 2018-03-02 | Display system and method for an aircraft