INFORMATION PRESENTATION APPARATUS AND INFORMATION PRESENTATION METHOD
ABSTRACT
The present invention relates to an information presentation apparatus 1 used in an autonomous vehicle capable of switching between autonomous driving control and manual driving control. The information presentation apparatus 1 determines a response action for verifying that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control, executes control to request the driver to perform the determined response action, and detects the response action performed by the driver.
Publication number: BR112016022178A2
Application number: R112016022178-8
Filing date: 2014-03-26
Publication date: 2021-09-08
Inventor: Masami Funakawa
Applicant: Nissan Motor Co. Ltd.
DESCRIPTION
[001] The present invention relates to an information presentation apparatus for use in an autonomous vehicle capable of switching between autonomous driving control, in which the vehicle is driven under autonomous cruise control, and manual driving control, in which the vehicle is driven by a driver. The present invention also relates to an information presentation method.
BACKGROUND ART
[002] Autonomous vehicles, whose cruise control is performed autonomously by an autonomous driving system, have been developed in recent years. This type of autonomous vehicle determines the driver's perception level by testing the driver's perception during autonomous driving control. Patent Literature 1 discloses a conventional method for determining a driver's perception level in a vehicle capable of autonomous cruise control.
[003] The driver perception level determination method disclosed in Patent Literature 1 presents a test item to the driver after a predetermined driving time has elapsed and determines the driver's perception level by evaluating the response to the test item.
[004] In this regard, during autonomous driving control the driver of an autonomous vehicle entrusts the driving to the autonomous driving system and is therefore highly likely not to be recognizing the surroundings or keeping track of the state of the vehicle, which the driver does as a matter of course when driving the vehicle manually. An autonomous vehicle, however, may switch from autonomous driving control to manual driving control while the driver is in this state. The driver is therefore required to always be prepared to take over driving from the autonomous driving system.
CITATION LIST
PATENT LITERATURE
[005] Patent Literature 1: Japanese Patent Application Publication No. Hei 11-185200
SUMMARY OF THE INVENTION
[006] The conventional driver perception level determination method described above simply determines whether the driver is attentive, and therefore cannot determine whether or not the driver can actually take over driving when autonomous driving control is switched to manual driving control.
[007] The present invention was conceived in light of these circumstances, and aims to provide an information presentation apparatus and an information presentation method capable of verifying that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control.
[008] To solve the problem identified above, an information presentation apparatus according to an aspect of the present invention is used in an autonomous vehicle capable of switching between autonomous driving control and manual driving control. The information presentation apparatus determines a response action to be performed by the driver to verify that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control, executes control to request the driver to perform the determined response action, and detects the response action performed by the driver.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] [Fig. 1] Figure 1 is a block diagram showing the configuration of an information presentation apparatus according to a first embodiment of the present invention. [Fig. 2] Figure 2 is a diagram illustrating examples of driver-operated buttons of the information presentation apparatus according to the first embodiment of the present invention.
[010] The first to third embodiments of the present invention are described below with reference to the drawings.
[011] Figure 1 is a block diagram showing the configuration of an information presentation apparatus according to the present embodiment.
[012] The information presentation apparatus 1 according to the present embodiment is installed in an autonomous vehicle capable of switching between autonomous driving control, in which the vehicle is driven under autonomous cruise control, and manual driving control, in which the vehicle is driven by a driver (the autonomous vehicle is hereinafter simply referred to as the vehicle). The vehicle is equipped with an autonomous driving system, which performs autonomous cruise control of the vehicle based on the driving conditions and the external environment of the vehicle. The autonomous driving system switches from autonomous driving control to manual driving control at a predetermined time or as needed. For example, the autonomous driving system transfers operational authority to the driver as soon as a sensor installed in the steering wheel or a pedal detects that the driver has placed a hand on the steering wheel or a foot on the pedal. Specifically, the autonomous driving system transfers the operational authority over the steering wheel and the operational authority over the pedal separately, according to the driver's operation.
[013] The driver analysis unit 3 detects the driver's state and actions through devices installed in the vehicle, acquires information such as the vehicle's operational status, and analyzes what kinds of movements and driving operations the driver performed in that state. Specifically, the driver analysis unit 3 of the present embodiment is connected to the button switch 31. The driver analysis unit 3 detects operations performed on the button switch 31 by the driver, and thereby detects a response action performed by the driver.
[014] In addition, the driver analysis unit 3 can estimate the driver's arousal level, fatigue level, driving accuracy, driving stability, and other such elements by analyzing the driver's biological signals. The driver analysis unit 3 acquires physiological information such as alpha and theta brain waves, heart rate, frequency analysis of heartbeat waveforms, breathing rate, frequency of eye movements, eye-closure intervals, head posture, and cortisol and amylase levels in saliva. The driver analysis unit 3 also acquires vehicle information, such as the degree of change in the vehicle's lateral position, and behavior information, such as driving patterns and frequency analysis of the driving waveforms. Based on these items of information, the driver analysis unit 3 estimates the arousal level, the fatigue level, the driving accuracy, and the driving stability.
[015] The driver response recording unit 5 records the driver's state and the responses given by the driver, such as movements and driving operations performed by the driver. Specifically, the driver response recording unit 5 records, for example, the results analyzed by the driver analysis unit 3, manual driving operations performed by the driver, tasks entered through the task input unit 11, and whether or not a response action was performed. In particular, the driver response recording unit 5 records whether and when the button switch 31 was operated as a response action.
[016] The response action determination unit 7 determines a response action for verifying that the vehicle's driver is prepared to take over driving when autonomous driving control is switched to manual driving control. In the present embodiment, the response action is an action for verifying whether the driver can carry out the decision-making process of manual driving. The response action is an action in which the driver responds to a request to make a decision about a driving operation of the autonomous vehicle. In manual driving, the driver normally performs the following processes: "recognize" the environment around the vehicle, "decide" on an operation to perform, and "operate" the vehicle based on the decision. In the present embodiment, the response action determination unit 7 sets a response action for verifying whether the driver can carry out the "decide" process of the above three processes, and requests the driver to perform this response action in order to verify that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control. In the present embodiment, the response action determination unit 7 requests the driver, as a response action, to decide on a driving operation of the vehicle and respond by pressing a button. Satisfactory completion of the decision process means that the driver is also capable of the recognition process. In this way, the information presentation apparatus 1 of the present embodiment can check whether the driver is capable of both the recognition process and the decision process.
[017] Specifically, the response action determination unit 7 acquires multiple autonomous vehicle behavior candidates generated by the vehicle autonomous behavior generating unit 25, and also acquires, from the vehicle autonomous behavior generating unit 25, information about an area where the vehicle can travel safely (safely travelable area information) and information about the positions of obstacles in the surroundings (obstacle position information). The response action determination unit 7 then determines, from among the acquired autonomous vehicle behavior candidates, a response action to request from the driver.
[018] The response action control unit 9 executes control to request the driver to perform the response action determined by the response action determination unit 7. Specifically, the response action control unit 9 determines the time to request the driver to perform the response action, determines whether or not the driver has successfully performed the response action, and sets a pass indicator accordingly.
[019] The task input unit 11 acquires the autonomous vehicle behavior candidates generated by the vehicle autonomous behavior generating unit 25 and displays these candidates on the display device 33 as tasks. The driver operates the vehicle by touching the display device 33, which has a touch panel, and selecting one of the displayed tasks to be performed next. Tasks are minimal units of driving operation, for example going straight, turning right or left, changing to the left or right lane, stopping, starting, accelerating, or decelerating. Tasks such as going straight and turning right or left are represented by pictograms and displayed on a screen presented to the driver.
The task input unit 11 displays the tasks so that the driver can identify the task selected by the autonomous driving system, the tasks selectable by the driver, and the tasks not selectable by the driver. The driver can thus change the task selected by the autonomous driving system by selecting a task that is selectable but has not been selected by the autonomous driving system. Although the final behavior of the vehicle is determined by the vehicle autonomous behavior determination unit 27, the task entered through the task input unit 11 is preferentially determined as the autonomous behavior of the vehicle.
[020] The displayed image generating unit 13, which is connected to the display device 33, generates an image to be presented to the driver and outputs the image to the display device 33 when, for example, requesting the driver to perform a response action or displaying tasks. The display device 33 may be a liquid crystal monitor, and is placed, for example, in a lower position in front of the driver at an angle of 10 degrees and at a viewing distance of 90 cm. The display device 33 has a touch panel and receives input from the driver.
[021] The alert control unit 15, which is connected to the alert sound generating unit 35, acquires information from the response action control unit 9, and upon determining that an alert needs to be issued, sets an appropriate alert sound and outputs a signal to the alert sound generating unit 35.
[022] The audio generating unit 17, which is connected to the loudspeaker 37, generates an appropriate message and outputs the message to the loudspeaker 37 when, for example, requesting the driver to perform a response action.
[023] The surroundings recognition unit 21 acquires information about the surroundings of the host vehicle from a group of sensors 41 incorporated in the host vehicle, such as a camera, a radar, and a GPS, and also acquires map data and the current location of the host vehicle from a database 43. Based on these items of information, the surroundings recognition unit 21 recognizes the surroundings of the host vehicle and generates surroundings information.
[024] The route control unit 23 acquires the map data and the current location of the host vehicle stored in the database 43, and manages the entire travel route from the current location of the host vehicle to the destination. The route control unit 23 also generates route information containing the plan of the entire travel route of the host vehicle.
[025] The vehicle autonomous behavior generating unit 25 acquires the surroundings information from the surroundings recognition unit 21 to recognize the surroundings of the host vehicle, and acquires the route information from the route control unit 23 to grasp the travel route of the host vehicle. Based on these items of information, the vehicle autonomous behavior generating unit 25 obtains the safely travelable area information and the obstacle position information for the host vehicle at the present time, and calculates multiple autonomous vehicle behavior candidates (path trajectories and speed profiles) along which the host vehicle can travel within the safely travelable area.
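As a rough illustration of paragraphs [019] and [025], the following Python sketch shows one possible way to represent tasks and autonomous vehicle behavior candidates, including the selectability states that the task input unit 11 presents to the driver. All class, field, and function names are assumptions introduced for illustration and are not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Tuple

class Task(Enum):
    """Minimal units of driving operation used as tasks (paragraph [019])."""
    GO_STRAIGHT = auto()
    TURN_RIGHT = auto()
    TURN_LEFT = auto()
    CHANGE_LANE_LEFT = auto()
    CHANGE_LANE_RIGHT = auto()
    STOP = auto()
    START = auto()
    ACCELERATE = auto()
    DECELERATE = auto()

@dataclass
class BehaviorCandidate:
    """One autonomous vehicle behavior candidate: a path trajectory plus a speed profile."""
    task: Task
    trajectory: List[Tuple[float, float]]  # (x, y) waypoints inside the safely travelable area
    speed_profile: List[float]             # target speed [m/s] at each waypoint
    selected_by_system: bool = False       # task currently chosen by the autonomous driving system
    selectable_by_driver: bool = True      # feasible given the surroundings and the full itinerary

def displayable_tasks(candidates: List[BehaviorCandidate]) -> dict:
    """Group candidates the way the task input unit 11 distinguishes them on the display."""
    return {
        "system_selected": [c.task for c in candidates if c.selected_by_system],
        "driver_selectable": [c.task for c in candidates
                              if c.selectable_by_driver and not c.selected_by_system],
        "not_selectable": [c.task for c in candidates if not c.selectable_by_driver],
    }
```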
[026] The vehicle autonomous behavior determination unit 27 acquires the surroundings information from the surroundings recognition unit 21 to recognize the surroundings of the host vehicle, and acquires the route information from the route control unit 23 to grasp the travel route of the host vehicle. Based on these items of information, the vehicle autonomous behavior determination unit 27 determines an autonomous vehicle behavior from among the autonomous vehicle behavior candidates generated by the vehicle autonomous behavior generating unit 25, and outputs the autonomous vehicle behavior to the vehicle behavior control unit 29. When the driver is to perform a response action, the vehicle autonomous behavior determination unit 27 always evaluates whether or not the response action is valid in light of the current surroundings. When a driver-selected task is entered through the task input unit 11, the vehicle autonomous behavior determination unit 27 preferentially determines the entered task as the autonomous behavior of the vehicle.
[027] The vehicle behavior control unit 29 sends control signals to each actuator 45 installed in the host vehicle to execute the autonomous vehicle behavior determined by the vehicle autonomous behavior determination unit 27. The vehicle behavior control unit 29 also outputs the executed autonomous vehicle behavior to the driver analysis unit 3 as a control result.
[028] The button switch 31 is a device for receiving a YES or NO input from the driver requested to perform a response action. Specifically, the button switch 31 may have only a YES button, as shown in Figure 2(a), so that the driver presses the button only when entering an affirmative decision.
[029] As shown in Figure 2(a), an image of a button may be displayed on the display device 33 so that the driver can enter a selection by touching the image on the display device 33. In this way, a dedicated switch does not need to be installed as the button switch 31. Alternatively, a dedicated push button may be fitted to the vehicle's instrument panel, as shown in Figure 2(b). A switch may also be installed on the steering wheel, as shown in Figure 2(c), for better operability. Figure 2, however, provides mere examples. The YES button may be installed on the instrument panel or the steering wheel, or a NO button may be displayed on the display device 33.
[030] The information presentation apparatus 1 is configured by a general-purpose electronic circuit that includes a microcomputer, a microprocessor, and a CPU, as well as peripherals, and operates as the units described above by executing particular programs.
[031] Next, a procedure of a response action control process performed by the information presentation apparatus 1 according to the present embodiment is described with reference to the flowchart in Figure 3.
[032] As shown in Figure 3, in Step S101, the response action control unit 9 reads the data necessary to execute the response action control process. Specifically, the response action control unit 9 reads the results of the analysis of the driver's movements, driving operations, and similar parameters from the driver analysis unit 3, reads the surroundings information from the surroundings recognition unit 21, and reads the route information from the route control unit 23. The response action control unit 9 also reads a history of operations on the button switch 31 and a history of driver responses, which are recorded in the driver response recording unit 5.
[033] In Step S102, the response action control unit 9 analyzes, based on the read data, the response action requested from the driver and the history of the driver's responses to response actions.
[034] In Step S103, the response action control unit 9 determines whether or not the number of times a response action has not been performed is equal to or greater than a reference value. The number of times a response action has not been performed includes the driver not responding to a request for a response action and the driver responding inappropriately. The response action control unit 9 proceeds to Step S104 when the number of times a response action has not been performed is equal to or greater than the reference value, and proceeds to Step S105 when that number is less than the reference value.
[035] In Step S104, the response action control unit 9 sets the pass indicator to "false" because the number of times a response action has not been performed is equal to or greater than the reference value. By means of this indicator, the response action control unit 9 determines that the driver has not approved the autonomous driving control, and executes control to stop the autonomous driving control. The response action control unit 9 also determines that the driver is not prepared to take over driving when autonomous driving control is switched to manual driving control. Furthermore, the response action control unit 9 notifies the vehicle autonomous behavior determination unit 27 that the pass indicator has been set to "false".
[036] When the pass indicator is thus set to "false", the response action control process according to the present embodiment is terminated.
[037] In Step S105, the response action determination unit 7 determines a response action to request from the driver. More specifically, the response action determination unit 7 acquires, from the vehicle autonomous behavior generating unit 25, a plurality of autonomous vehicle behavior candidates, as well as the safely travelable area information and the obstacle position information. From the autonomous vehicle behavior candidates thus acquired, the response action determination unit 7 determines a response action to present to the driver. Examples of a driving operation requested from the driver as a response action include turning right or left, overtaking, changing to the left or right lane, accelerating or decelerating, merging, stopping, starting, stopping temporarily, and following the vehicle ahead. These driving operations requested from the driver are driving operations that are feasible in view of the surroundings. Driving operations considered rational in view of the host vehicle's entire itinerary are set as options for the response action.
[038] Next, in Step S106, the response action control unit 9 determines the time to request the response action from the driver. Specifically, the response action control unit 9 acquires, from the driver analysis unit 3, the result of the analysis of the driver's state as well as the vehicle information, and acquires, from the driver response recording unit 5, the time at which the previous response action was performed and information about the time elapsed since then.
The response action control unit 9 also acquires the safely travelable area information and the obstacle position information from the response action determination unit 7, and acquires information concerning the cruise control performed by the autonomous driving system of the host vehicle. Based on these items of information, the response action control unit 9 determines the time to request the response action from the driver.
[039] For example, the response action control unit 9 requests the driver to perform a response action whenever a predetermined time interval has elapsed or whenever the host vehicle performs a predetermined number of cruise control operations. The response action control unit 9 may set the time to request the response action from the driver based on the driver's state. For example, upon detecting that the driver is not awake or is not looking ahead, the response action control unit 9 requests the driver to perform a response action. The response action control unit 9 may also set the time to request a response action from the driver based on the result of analyzing a manual driving operation performed by the driver. For example, when the manual driving operation performed by the driver is analyzed and the driver demonstrates high levels of perception, driving accuracy, and driving stability, the response action control unit 9 lengthens the time interval before requesting the response action, and when the driver demonstrates low levels of perception, driving accuracy, and driving stability, the response action control unit 9 shortens the time interval before requesting the response action. Similarly, the response action control unit 9 lengthens the time interval before requesting the response action when the driver demonstrates a high level of fatigue, and shortens the interval when the driver demonstrates a low level of fatigue.
[040] The response action control unit 9 may also set the time to request a response action from the driver based on the route conditions and the external circumstances of the host vehicle. For example, the response action control unit 9 sets the time to request the response action based on, for example, the speed of the host vehicle, the volume of traffic, or whether or not the host vehicle is traveling on a highway. Moreover, if the driver performs manual driving control voluntarily, or manually controls the vehicle when prompted by information from the autonomous driving system, this manual driving operation may be regarded as equivalent to a response action, provided that the manual driving was performed safely. After the driver drives manually, the response action control unit 9 postpones the time to request a response action from the driver or reduces the number of response action requests.
[041] When the time for switching from autonomous driving control to manual driving control is predetermined, the response action control unit 9 shortens the interval for requesting a response action from the driver as the predetermined time approaches. For example, if autonomous driving control is to be switched to manual driving control when the vehicle leaves a highway to join local roads, the response action control unit 9 shortens the interval for requesting a response action from the driver as the time to exit the highway approaches.
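As a rough illustration of the interval adjustment described in paragraphs [039] to [041], the following Python sketch computes a request interval from the driver's state and the route context. The function name, thresholds, and scaling factors are hypothetical assumptions, not values from the patent.

```python
from typing import Optional

def next_request_interval(base_interval_s: float,
                          perception: float,       # 0.0 (low) .. 1.0 (high), from the driver analysis unit
                          fatigue: float,          # 0.0 (low) .. 1.0 (high)
                          driver_drove_manually: bool,
                          time_to_manual_switch_s: Optional[float]) -> float:
    """Return the time in seconds until the next response action request (illustrative only)."""
    interval = base_interval_s

    # Paragraph [039]: high perception/accuracy/stability -> longer interval, low -> shorter.
    interval *= 0.5 + perception              # 0.5x .. 1.5x of the base interval

    # Paragraph [039]: fatigue also shifts the interval (mapping as stated in the text).
    interval *= 1.0 + 0.5 * fatigue

    # Paragraph [040]: a safely performed manual driving operation counts as a response action,
    # so the next request is postponed.
    if driver_drove_manually:
        interval *= 2.0

    # Paragraph [041]: when a switch to manual driving is scheduled, shorten the
    # interval as the scheduled time approaches.
    if time_to_manual_switch_s is not None:
        interval = min(interval, max(10.0, time_to_manual_switch_s / 4.0))

    return interval
```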
[042] Next, in Step S107, the response action control unit 9 determines whether or not the request time determined in Step S106 has arrived. If the request time has not yet arrived, the response action control unit 9 continues to determine whether or not the request time has arrived. If the request time has arrived, the response action control unit 9 proceeds to Step S108.
[043] In Step S108, the response action control unit 9 requests the driver to perform the response action. Specifically, through an image on the display device 33 and a message from the loudspeaker 37, the response action control unit 9 asks the driver whether or not to perform the driving operation determined by the response action determination unit 7, and prompts the driver to press the button. For example, the response action control unit 9 asks the driver a question such as "Do you want to change to the right lane?" or "Do you want to accelerate?" In response, the driver operates the button switch 31 and thus performs the response action.
[044] A specific example of the processing performed when requesting the driver to perform a response action will now be described. In this example, the driver is requested to perform a response action concerning a change to the right lane while the host vehicle is in autonomous cruise mode, following the vehicle ahead with a certain inter-vehicle distance maintained.
[045] The response action control unit 9 asks the driver "Do you want to change to the right lane?", with the message displayed on the display device 33 and the audio output from the loudspeaker 37 at the same time. In the case where the button switch 31 has only a YES button, if the driver presses the YES button, it is determined that the driver has performed the response action. If the button is not pressed, it may be determined that the driver has not performed the response action.
[046] In the case where the button switch 31 has a YES button and a NO button, if the driver presses the YES button, it is determined that the driver has performed the response action, and if the driver presses the NO button, it is determined that the driver has not performed the response action. If neither button is pressed, it may be determined that the driver has not performed the response action.
[047] In the case where the button switch 31 has only a NO button, it is determined that the driver has performed the response action if the driver does not press the NO button, and it is determined that the driver has not performed the response action if the driver presses the NO button.
[048] In the cases above where it is determined that a response action has been performed without any button being pressed, the host vehicle can proceed with autonomous driving even though the button was not pressed. To avoid a situation in which the vehicle continues driving autonomously while the driver ignores the request and does not press the button, the response action control unit 9 poses a question to the driver for which pressing the button is unavoidable. This question is called a trap test, and it is asked, for example, when the button has not been pressed at a predetermined frequency or for a predetermined number of times in a row. In the case where the button switch 31 has only a YES button, the response action control unit 9 poses a question such as "Red light ahead. Do you want to stop?"
Not pressing the button would be an inappropriate reaction to this question, so the driver has to press the button. When the button switch 31 has only a NO button, the response action control unit 9 poses a question such as "Do you want to stop in the pedestrian crossing?" This question proposes a driving operation that is clearly inappropriate. Not pressing the NO button would be an inappropriate reaction, so the driver has to press the button. In the case where there are both a YES button and a NO button, a question of the same kind as in the NO-button-only case is posed.
[049] If the driver does not press the button in response to these trap tests, the response action control unit 9 determines that the driver is ignoring the request to perform a response action or is unable to perform the response action, and alerts the driver by sounding an alert and flashing the monitor screen.
[050] When requesting the driver to perform a response action by asking the type of question described above, the response action control unit 9 may display the pictograms shown in Figure 4. Among the pictograms shown in Figure 4, reference numeral 51 indicates turning right, 52 turning left, 53 changing to the left lane, 54 changing to the right lane, 55 going straight (starting or following the car ahead), and 56 stopping. In Figure 4, the pictograms shown in dotted lines (such as turning left 52 and changing to the right lane 54) are driving operations not selectable by the driver, because these driving operations are determined by the autonomous driving system to be infeasible in view of the entire itinerary and the surroundings. The white pictogram (going straight 55) is the driving operation currently being performed. The heavily shaded pictogram (turning right 51) is the driving operation instructed by the autonomous driving system. The lightly shaded pictogram (changing to the left lane 53) is a driving operation that is feasible in view of the surroundings but has not been selected as the response action.
[051] The response action control unit 9 may process the pictograms shown in Figure 4 so that they can be recognized by peripheral vision, as shown in Figure 5. In this case, the response action control unit 9 displays a pattern formed by spatial and temporal frequencies without luminance edges. For example, the response action control unit 9 displays, on the display device 33, a pattern formed primarily by components at a maximum spatial frequency of 1 CPD (cycle per degree) and a temporal frequency of 1 Hz to 7 Hz. The driver can thus recognize the response action by peripheral vision without looking directly at the display device 33.
[052] In each pictogram image shown in Figure 5, the mountain-shaped elements that make up the pictogram image move smoothly from bottom to top, forming a continuous flow. Since the flow of a moving pictogram image can easily attract the driver's gaze, low-pass filtering is applied to the entire image to eliminate high spatial frequency components.
[053] In Figure 5, the response action control unit 9 sets the pictogram speeds as follows. The pictogram representing the driving operation currently being performed flows at a relatively high speed, the pictogram representing the driving operation proposed as the response action flows more slowly, and the pictogram representing a default driving operation flows at the slowest speed.
By adding flowing motion to the pictograms, the pictograms can be distinguished on the basis of multiple attributes, for example color, flow direction and speed, and contrast. Since the pictograms are blurred by the image processing, the driver can distinguish, through peripheral vision and without fixing the eyes, the driving operation currently being performed, the driving operation proposed as the response action, and the driving operation set by default. For more information, see, for example, Japanese Patent Application Publication No. 2006-184854 and Funakawa, Masami, "Ambient Information Display Method Based upon Spatial-Temporal Properties of Driver's Field of View," Automotive Engineering and Technologies, 40(5), 1191-1196. The pictogram representing the driving operation proposed as the response action may blink instead of flowing. Discrimination is then easier than when different flow speeds are used. If the pictogram blinks with a square-wave change in contrast, the blinking may disturb the driver and attract the driver's gaze. It is therefore preferable that the pictogram blinks with a gentle, sinusoidal contrast modulation.
[054] In synchronization with the images displayed on the display device 33, the response action control unit 9 may provide a sensory stimulus to the driver. For example, the response action control unit 9 may give the driver a vibration stimulus through the steering wheel, or a sound effect. This ensures that the driver notices that information is being presented.
[055] In Step S109, the response action control unit 9 compares the button operation performed by the driver with the response action requested from the driver, and determines whether or not the driver has performed the response action. The specific processing for determining whether the response action has been performed will be described later with reference to Figure 6.
[056] Upon determining in Step S109 that the driver has not performed the response action, the response action control unit 9 proceeds to Step S110. In Step S110, the response action control unit 9 records, in the driver response recording unit 5, an indication that the response action was not performed, and returns to Step S101.
[057] Upon determining in Step S109 that the driver has performed the response action, the response action control unit 9 proceeds to Step S111. In Step S111, the response action control unit 9 records in the driver response recording unit 5 an indication that the response action has been performed, and proceeds to Step S112.
[058] In Step S112, the response action control unit 9 sets the pass indicator to "true". The response action control unit 9 thereby determines that the driver has approved the autonomous driving control, and executes control so that the autonomous driving control can continue. In addition, the response action control unit 9 determines that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control. The response action control unit 9 also notifies the vehicle autonomous behavior determination unit 27 that the pass indicator has been set to "true".
[059] When the pass indicator is set to "true" in this manner, the response action control process according to the present embodiment is terminated.
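Paragraphs [032] through [059] walk through the response action control process of Figure 3. The following Python sketch condenses that flow; the object, attribute, and method names are hypothetical stand-ins for the units described above, not an implementation disclosed by the patent.

```python
def response_action_control_process(ctrl, determiner, recorder, behavior_unit,
                                    fail_count_limit: int) -> bool:
    """Illustrative condensation of Steps S101-S112 (Figure 3). Returns the pass indicator."""
    while True:
        data = ctrl.read_inputs()                         # S101: driver analysis, surroundings, route, histories
        history = ctrl.analyze_response_history(data)     # S102

        if history.missed_count >= fail_count_limit:      # S103
            ctrl.pass_indicator = False                   # S104: stop autonomous driving control
            behavior_unit.notify_pass_indicator(False)
            return False

        action = determiner.determine_response_action()   # S105: pick from the behavior candidates
        request_time = ctrl.determine_request_time(data)  # S106
        ctrl.wait_until(request_time)                     # S107
        ctrl.request_response_action(action)              # S108: display image plus voice message

        if ctrl.response_action_performed(action):        # S109 (detailed in Figure 6)
            recorder.record(action, performed=True)       # S111
            ctrl.pass_indicator = True                    # S112: autonomous driving may continue
            behavior_unit.notify_pass_indicator(True)
            return True
        recorder.record(action, performed=False)          # S110, then loop back to S101
```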
[060] Next, a processing procedure for determining the execution of the response action, performed in Step S109 of the response action control process described above, is described with reference to the flowchart in Figure 6.
[061] As shown in Figure 6, in Step S201, the response action control unit 9 reads the data indicating an operation detected at the button switch 31 from the driver analysis unit 3, so that the response action control unit 9 can recognize the operation performed by the driver on the button switch 31.
[062] In Step S202, the response action control unit 9 determines whether or not the response action requested from the driver is a trap test. The response action control unit 9 proceeds to Step S203 if it is not a trap test, and proceeds to Step S206 if it is a trap test.
[063] In Step S203, the response action control unit 9 compares the driver's operation on the button switch 31 with the response action determined in Step S105, and determines whether or not the operation on the button switch 31 is consistent with the response action. When the operation on the button switch 31 is consistent with the response action, the response action control unit 9 proceeds to Step S204. In Step S204, the response action control unit 9 outputs an indication that the response action has been performed, and determines that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control.
[064] When the operation on the button switch 31 is not consistent with the response action in Step S203, the response action control unit 9 proceeds to Step S205. In Step S205, the response action control unit 9 outputs an indication that the response action has not been performed, and determines that the driver is not prepared to take over driving when autonomous driving control is switched to manual driving control.
[065] In Step S206, to which the response action control unit 9 proceeds after determining in Step S202 that the response action requested from the driver is a trap test, the response action control unit 9 determines whether or not the driver has pressed the button switch 31. When the response action is a trap test, a question that makes operating the button switch 31 unavoidable has been asked. Thus, if the button switch 31 has been operated, the response action control unit 9 proceeds to Step S204 to output an indication that the response action has been performed. If it is determined in Step S206 that the button switch 31 has not been operated despite the trap test, the response action control unit 9 proceeds to Step S205 to output an indication that the response action has not been performed.
[066] After outputting the indication that the response action has or has not been performed, the response action control unit 9 ends the processing for determining the execution of the response action, and returns to the response action control process.
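A compact Python sketch of the determination of Steps S201 to S206 (Figure 6) follows. It is illustrative only; the parameter names and the way the button operation and trap-test flag are represented are assumptions.

```python
from typing import Optional

def response_action_performed(button_pressed: Optional[str],
                              expected_button: str,
                              is_trap_test: bool) -> bool:
    """Figure 6, Steps S201-S206 (illustrative): decide from the recorded button
    operation whether the driver performed the requested response action."""
    if is_trap_test:
        # S206: a trap test is phrased so that pressing the button is unavoidable;
        # any button press therefore counts as a performed response action.
        return button_pressed is not None        # S204 if pressed, S205 otherwise
    # S203: compare the actual operation with the response action determined in S105.
    return button_pressed == expected_button     # S204 if consistent, S205 otherwise
```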
[067] [Autonomous Driving Control Process Procedure] The following describes a procedure of an autonomous driving control process performed by the information presentation apparatus 1 according to the present embodiment, with reference to the flowchart in Figure 7.
[068] As shown in Figure 7, in Step S301, the vehicle autonomous behavior generating unit 25 reads the surroundings information from the surroundings recognition unit 21 and the route information from the route control unit 23.
[069] In Step S302, the vehicle autonomous behavior generating unit 25 generates autonomous vehicle behavior candidates in accordance with the system logic of a typical autonomous driving system. More than one autonomous vehicle behavior candidate is generated. Each of these autonomous vehicle behavior candidates contains a trajectory of the host vehicle's path and a speed profile.
[070] The system logic of an autonomous driving system is the logic used to determine the autonomous behaviors of the vehicle in consideration of driving efficiency and safety. For example, the autonomous behavior of the vehicle is determined by prioritizing the shortest route to the destination or, in a high-traffic area, by prioritizing a shorter time rather than a shorter distance. On the route, necessary lane changes are performed, but unnecessary lane changes are not performed, in order to prioritize safety.
[071] According to this system logic, the vehicle autonomous behavior generating unit 25 generates a path trajectory, for example turning right, turning left, or going straight, and a speed profile, as autonomous vehicle behavior candidates.
[072] In Step S303, the vehicle autonomous behavior determination unit 27 determines, based on the input from the response action control unit 9, whether or not the pass indicator is set to "true". When the pass indicator is "true", the vehicle autonomous behavior determination unit 27 determines that the driver approves the continuation of autonomous driving control, and proceeds to Step S304. When the pass indicator is not "true", the vehicle autonomous behavior determination unit 27 returns to Step S301.
[073] In Step S304, the vehicle autonomous behavior determination unit 27 determines whether or not the driver has entered a task through the task input unit 11. The vehicle autonomous behavior determination unit 27 proceeds to Step S305 when no task has been entered, and proceeds to Step S306 when a task has been entered.
[074] In Step S305, the vehicle autonomous behavior determination unit 27 determines an autonomous behavior of the vehicle according to the system logic of the autonomous driving system, because no task has been entered by the driver. The vehicle autonomous behavior determination unit 27 determines an autonomous vehicle behavior from among the autonomous vehicle behavior candidates generated by the vehicle autonomous behavior generating unit 25, according to the system logic.
[075] For example, since the system logic of the autonomous driving system determines an autonomous behavior of the vehicle in consideration of driving efficiency and safety, the system logic determines an autonomous behavior of the vehicle by prioritizing the shortest route to the destination when traffic is light, and by prioritizing a shorter time, rather than a shorter distance, when traffic is heavy.
[076] The vehicle autonomous behavior determination unit 27 thus determines, as the autonomous behavior of the vehicle, a path trajectory, such as turning right, turning left, or going straight, and its speed profile, and proceeds to Step S307.
[077] In Step S306, the vehicle autonomous behavior determination unit 27 determines, as the autonomous behavior of the vehicle, the task entered by the driver through the task input unit 11.
The vehicle autonomous behavior determination unit 27 then records in the driver response recording unit 5 that the task entered by the driver has been determined to be the autonomous behavior of the vehicle, and proceeds to Step S307.
[078] In Step S307, the vehicle behavior control unit 29 executes the autonomous behavior of the vehicle determined in Step S305 or Step S306. Specifically, the vehicle behavior control unit 29 executes the autonomous vehicle behavior determined according to the system logic in Step S305, or the driver-entered task determined in Step S306. The vehicle behavior control unit 29 executes the autonomous vehicle behavior by outputting the control signals necessary to execute it to the actuators 45. The vehicle behavior control unit 29 also inputs the executed autonomous vehicle behavior into the driver analysis unit 3.
[079] When the autonomous behavior of the vehicle has been executed in this way, the information presentation apparatus 1 according to the present embodiment terminates the autonomous driving control process.
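The autonomous driving control process of Figure 7 (Steps S301 to S307, paragraphs [067] to [079]) can be summarized by the following Python sketch. The objects and method names are hypothetical stand-ins for the units described above, under the assumption that the pass indicator and any driver-entered task are re-read on each pass.

```python
def autonomous_driving_control_process(generator, determiner, controller, task_input):
    """Illustrative condensation of Steps S301-S307 (Figure 7)."""
    while True:
        surroundings, route = generator.read_inputs()                     # S301
        candidates = generator.generate_candidates(surroundings, route)   # S302

        if not determiner.pass_indicator:                                 # S303
            continue                                                      # back to S301 until approved

        driver_task = task_input.pending_task()                           # S304
        if driver_task is not None:
            behavior = driver_task                                        # S306: driver-entered task takes priority
            determiner.record_driver_task(driver_task)
        else:
            behavior = determiner.select_by_system_logic(candidates)      # S305

        controller.execute(behavior)                                      # S307: send control signals to the actuators 45
        return behavior
```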
[080] As described above, the information presentation apparatus 1 according to the present embodiment requests the driver to perform a response action in order to verify that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control. By verifying that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control, the autonomous vehicle can continue traveling safely even if autonomous driving control is switched to manual driving control.
[081] In the information presentation apparatus 1 according to the present embodiment, a response action is an action in which the driver responds to a request to make a decision regarding a driving operation of the autonomous vehicle. The information presentation apparatus 1 can thereby easily verify that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control.
[082] The information presentation apparatus 1 according to the present embodiment receives only an affirmative response from the driver, thus requiring only one button to receive an affirmative input. This eliminates the need for multiple buttons and allows a simplified configuration.
[083] Alternatively, the information presentation apparatus 1 according to the present embodiment receives only a negative response from the driver, thus requiring only one button to receive a negative input. This eliminates the need for multiple buttons and allows a simplified configuration.
[084] Alternatively, the information presentation apparatus 1 according to the present embodiment receives either an affirmative or a negative response from the driver. In this way, the information presentation apparatus 1 always receives a response regardless of whether the response is yes or no, and can therefore capture the driver's intention with certainty.
[085] When the frequency of response actions falls below a predetermined frequency, the information presentation apparatus 1 according to the present embodiment poses a question to the driver for which a response is unavoidable. This can prevent the vehicle from continuing to drive autonomously without any response action being performed, and therefore increases the safety of the autonomous vehicle.
[086] The information presentation apparatus 1 according to the present embodiment uses the pressing of a button by the driver as a response action, and can therefore easily receive a decision from the driver.
[087] The information presentation apparatus 1 according to the present embodiment stops autonomous driving control when the driver fails to perform a response action, and can thereby ensure the safety of the autonomous vehicle.
[088] The information presentation apparatus 1 according to the present embodiment requests the execution of a response action whenever a predetermined time interval has elapsed or whenever the autonomous vehicle performs a predetermined number of cruise control operations. Since the driver is periodically requested to perform a response action, safety during autonomous driving control can be improved.
[089] The information presentation apparatus 1 according to the present embodiment requests the driver to perform a response action upon detecting that the driver is not awake or is not looking ahead. The information presentation apparatus 1 can therefore request the driver to perform a response action when the driver's perception level has decreased, improving the safety of the autonomous vehicle.
[090] The information presentation apparatus 1 according to the present embodiment sets the time to request the driver to perform a response action based on the driver's state and on the route conditions and external circumstances of the autonomous vehicle. The information presentation apparatus 1 can therefore request the driver to perform a response action at an appropriate time in accordance with the driver's state and the route conditions and external circumstances of the autonomous vehicle.
[091] If the time for switching from autonomous driving control to manual driving control is set in advance, the information presentation apparatus 1 according to the present embodiment shortens the interval for requesting the execution of a response action as the preset time approaches. This ensures that the driver will be prepared to take over driving when autonomous driving control is switched to manual driving control.
[092] If the driver performs a manual driving operation, the information presentation apparatus 1 according to the present embodiment postpones the time for requesting the driver to perform a response action. This reduces the number of response action requests, reducing the system load and the burden on the driver.
[093] When requesting the execution of a response action, the information presentation apparatus 1 according to the present embodiment displays a pattern formed by spatial and temporal frequencies without luminance edges. This allows the driver to recognize the request for a response action through peripheral vision, without looking directly at the display device 33.
[094] When requesting a response action, the information presentation apparatus 1 according to the present embodiment provides the driver with a sensory stimulus in synchronization with a displayed image. This ensures that the driver notices the request for a response action.
[095] In the following, an information presentation apparatus according to a second embodiment of the present invention is described with reference to the drawings. The same components as in the first embodiment are indicated by the same reference numerals, and the description of these components will not be repeated in detail.
[096] The information presentation apparatus according to the first embodiment described above requests the driver to press a button as a response action. In contrast, the information presentation apparatus according to the present embodiment requests the driver to respond by speaking as a response action.
[097] Figure 8 is a block diagram showing the configuration of the information presentation apparatus according to the present embodiment.
[098] The voice analysis unit 83 acquires the voice of the driver through, for example, a microphone, and analyzes the voice. The voice analysis unit 83 then detects the kinds of words spoken by the driver and outputs voice information. The voice analysis unit 83 also analyzes, for example, the speed, pitch, and rhythm variation of the driver's voice.
[099] The driver analysis unit 3 detects the word or words spoken by the driver based on the voice information output by the voice analysis unit 83, and analyzes the content of the driver's speech or conversation.
[0100] The driver response recording unit 5 records the voice information output by the voice analysis unit 83 and records the result of the analysis by the driver analysis unit 3, that is, the content of the driver's speech or conversation.
[0101] The response action determination unit 7 determines a response action for verifying that the vehicle's driver is prepared to take over driving when autonomous driving control is switched to manual driving control. In the present embodiment, the response action determination unit 7 requests, as a response action, that the driver make a decision about a driving operation of the autonomous vehicle and respond by speaking.
[0102] The response action control unit 9 includes a conversation system for holding a conversation with the driver. The conversation system can converse with the driver about the surroundings and driving operations by asking the driver questions and responding to the driver's words. Regarding the setting of the time for requesting a response action, the response action control unit 9 shortens the interval for requesting the driver to perform a response action when the driver produces inappropriate speech or dialogue.
[0103] The following describes a procedure of a response action control process performed by the information presentation apparatus 81 according to the present embodiment. The response action control process of the present embodiment is identical to the flowchart of the first embodiment shown in Figure 3. Therefore, only the steps whose processing differs from the first embodiment are described in detail.
[0104] In the flowchart of Figure 3, after a response action is determined through the processing in Steps S101 to S105, the time to request the response action is determined in Step S106. Then, when the time to request the response action arrives in Step S107, the response action control unit 9 requests the driver to perform the response action in Step S108. Specifically, the response action control unit 9 asks the driver, through an image on the display device 33 and a message from the loudspeaker 37, whether or not to perform the driving operation determined by the response action determination unit 7, and prompts the driver to speak. For example, the response action control unit 9 asks the driver a question such as "Do you want to change to the right lane?" or "Do you want to accelerate?" The driver answers the question by speaking, and in this way performs the response action. For example, the driver performs the response action by speaking an affirmative word such as "Yes" or a negative word such as "No".
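As a rough illustration of the spoken response described in paragraph [0104], the following Python sketch classifies a transcribed reply as affirmative, negative, or absent. The vocabularies and the assumption that a transcript is available from the voice analysis unit 83 are illustrative, not part of the patent.

```python
# Illustrative word lists; the actual recognition vocabulary is not specified in the patent.
AFFIRMATIVE_WORDS = {"yes", "sure", "ok", "go", "ahead"}
NEGATIVE_WORDS = {"no", "stop", "don't"}

def classify_spoken_response(transcript: str) -> str:
    """Classify the driver's spoken reply to a response action request (paragraph [0104])."""
    words = transcript.lower().split()
    if any(w in AFFIRMATIVE_WORDS for w in words):
        return "affirmative"   # response action performed
    if any(w in NEGATIVE_WORDS for w in words):
        return "negative"      # proposed driving operation declined
    return "no_response"       # silence or unrelated speech
```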
[0105] The response action control unit 9 may also use a trap test as described in the first embodiment. For example, the response action control unit 9 asks a clearly inappropriate question such as "Do you want to stop in the pedestrian crossing?", to which the driver inevitably says a negative word such as "No". When the driver gives no response to the trap test, the response action control unit 9 determines that the driver is ignoring the request to perform the response action or is unable to perform the response action, and alerts the driver by sounding an alert and flashing the monitor screen.
[0106] Next, in Step S109, the response action control unit 9 determines whether or not the driver has performed the response action. The processing for determining the execution of the response action will be described in detail later with reference to Figure 9.
[0107] After that, the processing in Steps S110 to S112 is performed, and the response action control process according to the present embodiment is terminated.
[0108] The following describes a procedure for determining the execution of the response action, performed in Step S109 of the response action control process described above, with reference to the flowchart in Figure 9.
[0109] As shown in Figure 9, in Step S401, the response action control unit 9 reads the data obtained by the voice analysis unit 83 from the driver response recording unit 5. This data allows the response action control unit 9 to recognize the content of the driver's speech.
[0110] In Step S402, the response action control unit 9 determines whether or not the driver has spoken. The response action control unit 9 proceeds to Step S403 if the driver has spoken, and proceeds to Step S404 if the driver has not spoken.
[0111] In Step S403, the response action control unit 9 determines whether or not the driver's speech regarding the response action was affirmative. The response action control unit 9 proceeds to Step S405 when the speech was affirmative, and proceeds to Step S406 when it was not.
[0112] In Step S405, the response action control unit 9 outputs an indication that the response action has been performed, and determines that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control.
[0113] In Step S406, the response action control unit 9 executes the autonomous behavior of the vehicle determined by the vehicle autonomous behavior determination unit 27, because the driver's speech was not affirmative, and outputs an indication that the response action has not been performed. The response action control unit 9 determines that the driver is not prepared to take over driving when autonomous driving control is switched to manual driving control.
[0114] In Step S404, to which the response action control unit 9 proceeds after determining in Step S402 that the driver has not spoken, the response action control unit 9 determines whether or not the response action requested from the driver is a trap test. The response action control unit 9 proceeds to Step S406 if the response action is a trap test, and proceeds to Step S407 if the response action is not a trap test.
[0115] In Step S406, the response action control unit 9 executes the autonomous behavior of the vehicle determined by the vehicle autonomous behavior determination unit 27, because the driver has not spoken despite the trap test, and outputs an indication that the response action has not been performed.
The response action control unit 9 determines that the driver is not prepared to take over driving when autonomous driving control is switched to manual driving control.
[0116] In Step S407, the response action control unit 9 executes a default driving operation, such as following the car ahead, because the response action is not a trap test and the driver has not spoken.
[0117] After outputting the indication that the response action has or has not been performed, the response action control unit 9 ends the processing for determining the execution of the response action, and returns to the response action control process.
[0118] The autonomous driving control process according to the present embodiment is identical to that of the first embodiment and is therefore not described in detail here.
[0119] As described in detail above, the information presentation apparatus 81 according to the present embodiment requests, as a response action, that the driver make a decision about a driving operation of the autonomous vehicle and respond by speaking. The information presentation apparatus 81 can thereby easily verify that the driver is prepared to take over driving when autonomous driving control is switched to manual driving control.
[0120] When the frequency of speech or dialogue by the driver falls below a predetermined frequency, the information presentation apparatus 81 according to the present embodiment poses a question to the driver that inevitably forces the driver to speak or engage in dialogue. This can prevent the vehicle from continuing autonomous driving in the absence of speech or dialogue, thus increasing the safety of the autonomous vehicle.
[0121] When the driver produces inappropriate speech or dialogue, the information presentation apparatus 81 according to the present embodiment shortens the interval for requesting the driver to perform a response action.
[0122] Next, an information presentation apparatus according to a third embodiment is described with reference to the drawings. The same components as in the first and second embodiments are indicated by the same reference numerals, and the description of these components will not be repeated in detail.
[0123] The information presentation apparatus according to the second embodiment requests the driver to speak as a response action. Instead, the information presentation apparatus according to the present embodiment requests, as a response action, that the driver engage in dialogue about a driving operation of the autonomous vehicle.
[0124] The configuration of the information presentation apparatus according to the present embodiment is identical to that of the second embodiment shown in Figure 8, and is therefore not described again in detail.
[0125] The following describes a procedure of the response action control process performed by the information presentation apparatus according to the present embodiment. The response action control process of the present embodiment is identical to the flowchart of the first embodiment shown in Figure 3. Therefore, only the steps whose processing differs from the first and second embodiments are described in detail.
[0126] In the flowchart of Figure 3, after a response action is determined through the processing in Steps S101 to S105, the time to request the response action is determined in Step S106. Then, when the time to request the response action arrives in Step S107, the response action control unit 9 requests the driver to perform the response action in Step S108.
[0127] [0127] Before requesting the driver to perform the response action, the response action control unit 9 holds a dialogue about the surroundings. For example, the response action control unit 9 opens the dialogue by saying something like "The weather looks bad at the destination.", "It has started snowing.", or "Traffic seems to be flowing smoothly." The driver carries on the conversation by saying something like "Does the weather service say it is going to snow?", "Just keep going.", or "What time will I arrive?" in response to these remarks.
[0128] [0128] The response action control unit 9 may also request a trap test as described in the first embodiment. For example, the response action control unit 9 asks a clearly inappropriate question such as "Do you want to stop in the pedestrian lane?", to which the driver inevitably responds with a negative answer such as "No." When the driver does not respond to the trap test, the response action control unit 9 determines that the driver is ignoring the request to perform the response action or is unable to perform the response action, and alerts the driver by sounding an alert and flashing the monitor screen.
[0129] [0129] Next, in Step S109, the response action control unit 9 determines whether or not the driver has performed the response action. The response action execution determination processing will be described in detail later with reference to Figure 10.
[0130] [0130] After that, the processing in Steps S110 to S112 is performed, and the response action control circuit according to the present embodiment is terminated.
[0131] [0131] The following describes the procedure for determining the execution of the response action performed in Step S109 of the response action control circuit described above, with reference to the flowchart in Figure 10.
[0132] [0132] As shown in Figure 10, in Step S501, the response action control unit 9 reads the data obtained by the voice analysis unit 83 from the driver response recording unit 5. This data allows the response action control unit 9 to recognize the content of a dialogue by the driver.
[0133] [0133] In Step S502, the response action control unit 9 determines, based on the dialogue content recognized in Step S501, whether or not the driver is prepared to hold the dialogue. The response action control unit 9 proceeds to Step S503 when the driver is prepared to hold the dialogue, and proceeds to Step S507 when the driver is not prepared to hold the dialogue.
[0134] [0134] In Step S503, based on the dialogue content recognized in Step S501, the response action control unit 9 assesses the driver's level of awareness of the surroundings. For example, if the driver held a normal dialogue about the surroundings, the response action control unit 9 determines that the driver has a high level of awareness of the surroundings. If the driver did not respond to the dialogue or held an inappropriate dialogue, the response action control unit 9 determines that the driver has a low level of awareness of the surroundings.
[0135] [0135] In Step S504, the response action control unit 9 determines whether or not the driver has a high level of awareness of the surroundings. The response action control unit 9 proceeds to Step S505 when the driver has a high level of awareness of the surroundings, and returns to Step S501 to repeat the dialogue when the driver has a low level of awareness of the surroundings.
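By way of illustration, the dialogue-based assessment of Steps S501 to S504, including the return to Step S501 when the level of awareness is low, might be sketched as follows. The helper functions and the bounded loop are hypothetical assumptions made for this sketch; the affirmative-dialogue check of Step S505 onward is described in the paragraphs that follow.

```python
# Illustrative sketch of the Step S501-S504 awareness-assessment loop described
# above. The helper functions and returned dict keys are assumptions for this
# example; Step S505 onward is described in the following paragraphs.

def read_dialogue_content():
    # Stand-in for Step S501: data from the voice analysis unit 83 recorded by
    # the driver response recording unit 5. The returned keys are hypothetical.
    return {"driver_ready": True, "normal_surroundings_dialogue": True}

def assess_awareness(dialogue) -> str:
    # Step S503: a normal dialogue about the surroundings -> high awareness;
    # no response or an inappropriate dialogue -> low awareness.
    return "high" if dialogue["normal_surroundings_dialogue"] else "low"

def dialogue_based_check(max_repeats: int = 3) -> bool:
    """Repeat the dialogue (Steps S501-S504) until awareness is high; True if reached."""
    for _ in range(max_repeats):               # bounded here only to keep the sketch finite
        dialogue = read_dialogue_content()     # Step S501
        if not dialogue["driver_ready"]:       # Step S502: driver not prepared to dialogue
            return False                       # -> handled as in Step S507
        if assess_awareness(dialogue) == "high":   # Step S504
            return True                        # -> proceed to Step S505
        # Low awareness: return to Step S501 and repeat the dialogue.
    return False

print(dialogue_based_check())  # -> True with the stub data above
```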
[0136] [0136] In Step S505, the response action control unit 9 analyzes the content of the dialogue by the driver and determines whether or not the driver has held an affirmative dialogue regarding the response action. The response action control unit 9 proceeds to Step S506 when the driver has held an affirmative dialogue, and proceeds to Step S507 when the driver has not held an affirmative dialogue.
[0137] [0137] In Step S506, to which the response action control unit 9 proceeds after determining that the driver has held an affirmative dialogue, the response action control unit 9 issues an indication that the response action has been performed, and determines that the driver is prepared to take command when autonomous driving control is switched to manual driving control.
[0138] [0138] On the other hand, in Step S507, to which the response action control unit 9 proceeds after determining that the driver is not prepared to hold a dialogue or that the driver has not held an affirmative dialogue, the response action control unit 9 issues an indication that the response action has not been performed, and determines that the driver is not prepared to take command when autonomous driving control is switched to manual driving control.
[0139] [0139] After issuing an indication that the response action was or was not performed, the response action control unit 9 ends the response action execution determination processing and returns to the response action control circuit.
[0140] [0140] [Autonomous Driving Control Circuit Procedure] The autonomous driving control circuit according to the present embodiment is identical to that of the first embodiment and is therefore not described in detail here.
[0141] [0141] As described in detail above, the information display apparatus according to the present embodiment requests, as a response action, that the driver hold a dialogue about a driving operation of the autonomous vehicle. The information display apparatus can thereby easily verify whether the driver is prepared to take command when autonomous driving control is switched to manual driving control.
[0142] [0142] The information display apparatus according to the present embodiment repeats a dialogue when it determines, based on the dialogue held with the driver, that the driver has a low level of awareness of the surroundings. Since repeating the dialogue raises the driver's level of awareness of the surroundings, the driver can be prepared to take command when autonomous driving control is switched to manual driving control.
[0143] [0143] When the frequency of speech or dialogue by the driver is below a predetermined frequency, the information presentation apparatus according to the present embodiment puts a question to the driver for which speech or a dialogue is unavoidable. This can prevent the vehicle from continuing autonomous driving in the absence of speech or dialogue, thus increasing the safety of the autonomous vehicle.
[0144] [0144] When the driver makes an inappropriate speech or dialogue, the information presentation apparatus according to the present embodiment shortens the interval at which the driver is requested to perform a response action.
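As an illustrative sketch only, the adjustment of the request interval described in the two preceding paragraphs, together with the shortening of the interval as a predetermined switching point approaches that is recited in the claims, might look like the following. The identifiers, thresholds, and numeric factors are hypothetical assumptions and are not values given in the patent.

```python
# Illustrative sketch of adjusting the request interval. The identifiers,
# thresholds, and numeric factors are assumptions made for this sketch; the
# shortening toward a predetermined switching point follows the wording of the claims.

from typing import Optional

def next_request_interval(base_interval_s: float,
                          inappropriate_dialogue: bool,
                          time_to_switch_point_s: Optional[float]) -> float:
    """Return the interval until the next response action request, in seconds."""
    interval = base_interval_s
    if inappropriate_dialogue:
        interval *= 0.5  # shorten after an inappropriate speech or dialogue
    if time_to_switch_point_s is not None:
        # Shorten further as the predetermined switching point approaches.
        interval = min(interval, max(10.0, time_to_switch_point_s / 4.0))
    return interval

def should_force_question(speech_count: int, window_s: float,
                          min_frequency_per_min: float = 1.0) -> bool:
    """True when the driver's speech frequency falls below the predetermined frequency."""
    return (speech_count / (window_s / 60.0)) < min_frequency_per_min

# Example: an inappropriate dialogue was detected and 120 s remain until the
# predetermined switching point.
print(next_request_interval(60.0, True, 120.0))               # -> 30.0
print(should_force_question(speech_count=0, window_s=300.0))  # -> True
```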
[0145] [0145] Note that the above embodiments are mere examples of the present invention. Therefore, the present invention is not limited to these embodiments, and various design modifications may be made to them, provided that such modifications do not depart from the technical scope of the present invention.
LIST OF REFERENCE SIGNALS
1, 81 information display apparatus
3 driver analysis unit
5 driver response recording unit
7 response action determination unit
9 response action control unit
11 task input unit
13 displayed image generating unit
15 alert control unit
17 audio generating unit
21 surroundings recognition unit
23 itinerary control unit
25 vehicle autonomous behavior generating unit
27 vehicle autonomous behavior determination unit
29 vehicle behavior control unit
31 button switch
33 display device
35 alert sound generating unit
37 loudspeaker
41 sensor group
43 database
45 actuators
83 voice analysis unit
CLAIMS
[1] 1. Information presentation apparatus (1) for use in an autonomous vehicle capable of switching between autonomous driving control, in which the vehicle is driven under autonomous cruise control, and manual driving control, in which the vehicle is driven by a driver, the information presentation apparatus (1) being CHARACTERIZED in that it comprises: a response action determination unit (7) configured to determine a response action to be performed by the driver of the autonomous vehicle to maintain autonomous driving control by determining that the driver approves autonomous driving control; a response action control unit (9) configured to perform control to request the driver to perform the response action determined by the response action determination unit; and a driver analysis unit (3) configured to detect the response action performed by the driver, wherein the response action is an action in which the driver responds to a request to make a decision about a driving operation of the autonomous vehicle, and wherein, when the point for switching from autonomous driving control to manual driving control is predetermined, the response action control unit (9) shortens an interval for requesting the driver to perform the response action as the predetermined point approaches.
[2] 2. Information presentation apparatus (1) according to claim 1, CHARACTERIZED in that the response action determination unit (7) acquires a vehicle autonomous behavior candidate generated according to the system logic of an autonomous driving system, and determines, as the response action, a driving operation from the vehicle autonomous behavior candidate that is feasible in view of the surroundings of the autonomous vehicle.
[3] 3. Information presentation apparatus (1) according to claim 1 or 2, CHARACTERIZED in that the response action control unit (9) receives only an affirmative response from the driver.
[4] 4. Information presentation apparatus (1) according to claim 1 or 2, CHARACTERIZED in that the response action control unit (9) receives only a negative response from the driver.
[5] 5. Information presentation apparatus (1) according to claim 1 or 2, CHARACTERIZED in that the response action control unit (9) receives both an affirmative response and a negative response from the driver.
[6] 6. Information presentation apparatus (1) according to any one of claims 1 to 5, CHARACTERIZED in that the response action control unit (9) puts a question to the driver for which a response is unavoidable when a frequency of the response action is below a predetermined frequency.
[7] 7. Information presentation apparatus (1) according to any one of claims 1 to 6, CHARACTERIZED in that the response action is the activation of a button by the driver.
[8] 8. Information presentation apparatus (1) according to claim 1 or 2, CHARACTERIZED in that the response action is a response speech by the driver.
[9] 9. Information presentation apparatus (1) according to claim 1 or 2, CHARACTERIZED in that the response action is a dialogue with the driver.
[10] 10. Information presentation apparatus (1) according to claim 9, CHARACTERIZED in that, when the driver has a low level of awareness of the surroundings in the dialogue with the driver, the dialogue with the driver is repeated.
[11] 11. Information presentation apparatus (1) according to any one of claims 8 to 10, CHARACTERIZED in that, when a frequency of speech or dialogue by the driver is below a predetermined frequency, a question for which speech or a dialogue is unavoidable is put to the driver.
[12] 12. Information presentation apparatus (1) according to any one of claims 1 to 11, CHARACTERIZED in that the response action control unit (9) interrupts the autonomous driving control of the autonomous vehicle when the driver does not perform the response action.
[13] 13. Information presentation apparatus (1) according to any one of claims 1 to 12, CHARACTERIZED in that the response action control unit (9) requests the driver to perform the response action whenever a predetermined time interval has elapsed or whenever the autonomous vehicle performs a predetermined number of cruise control operations.
[14] 14. Information presentation apparatus (1) according to any one of claims 1 to 13, CHARACTERIZED in that the response action control unit (9) requests the driver to perform the response action upon detecting that the driver is not awake or is not looking straight ahead.
[15] 15. Information presentation apparatus (1) according to any one of claims 1 to 14, CHARACTERIZED in that the response action control unit (9) sets the time to request the driver to perform the response action based on the state of the driver, the condition of the route, and the external circumstances of the autonomous vehicle.
[16] 16. Information presentation apparatus (1) according to any one of claims 1 to 15, CHARACTERIZED in that the response action control unit (9) postpones the time to request the driver to perform the response action when the driver manually controls the autonomous vehicle.
[17] 17. Information presentation apparatus (1) according to any one of claims 8 to 16, CHARACTERIZED in that the response action control unit (9) shortens the interval for requesting the driver to perform the response action when the driver makes an inappropriate speech or dialogue.
[18] 18. Information presentation apparatus (1) according to any one of claims 1 to 17, CHARACTERIZED in that, when requesting the execution of the response action, the response action control unit (9) displays a pattern formed by a spatial frequency and a temporal frequency without a luminance edge.
[19] 19. Information presentation apparatus (1) according to any one of claims 1 to 18, CHARACTERIZED in that, when requesting the execution of the response action, the response action control unit (9) provides the driver with a sensory stimulus in synchronization with a displayed image.
[20] 20. Information presentation method performed by an information presentation apparatus (1) for use in an autonomous vehicle capable of switching between autonomous driving control, in which the vehicle is driven under autonomous cruise control, and manual driving control, in which the vehicle is driven by a driver, the information presentation method being CHARACTERIZED in that it comprises: determining a response action to maintain autonomous driving control by determining that the driver of the autonomous vehicle approves autonomous driving control; performing control to request the driver to perform the determined response action; and detecting the response action performed by the driver, wherein the response action is an action in which the driver responds to a request to make a decision about a driving operation of the autonomous vehicle, and wherein, when the point for switching from autonomous driving control to manual driving control is predetermined, an interval for requesting the driver to perform the response action is shortened as the predetermined point approaches.