Patent Abstract:
VEHICLE DRIVING GUIDANCE DEVICE AND GUIDANCE METHOD A travel route data generation section (101) serves as a predicted travel route configuration section and is configured to set a predicted travel route for a host vehicle. A monitor (18) serves as a display unit and is configured to display a route image corresponding to the predicted travel route set for the host vehicle. The route image is displayed on the monitor (18) in a display mode corresponding to a driving force and/or a braking force of the host vehicle.
Publication number: BR112016004552B1
Application number: R112016004552-1
Filing date: 2013-08-29
Publication date: 2021-02-17
Inventor: Makiko Sasaki
Applicant: Nissan Motor Co., Ltd.
Primary IPC class:
Patent Description:

[0001] [001] The present invention relates to a vehicle driving guidance device and method for providing guidance that informs an occupant of the predicted driving states in which a vehicle will be driven. BACKGROUND ART
[0002] [002] Patent Literature 1 describes a technique for dividing a target route into colored sections depending on the distance from the current position of a host vehicle to a guidance point (for example, an intersection). CITATION LIST PATENT LITERATURE Patent Literature 1: Unexamined Patent Publication No. JP 2011-185951 SUMMARY OF THE INVENTION TECHNICAL PROBLEM
[0003] [003] The technique described in Patent Literature 1 does not sufficiently allow an occupant to recognize whether the host vehicle will be in a driving or accelerating state or in a braking or decelerating state ahead of the current position.
[0004] [004] An object of the present invention is to provide a vehicle driving guidance device and method for providing guidance that informs an occupant in advance of which state the host vehicle will be in: a driving or accelerating state, or a braking or decelerating state. SOLUTION TO THE PROBLEM
[0005] [005] In order to achieve the objective described above, the present invention displays, on a display unit, a route image corresponding to a predicted travel route set for a host vehicle, in a display mode corresponding to a driving force and/or a braking force of the host vehicle, or to an acceleration and/or deceleration of the host vehicle. ADVANTAGEOUS EFFECTS
[0006] [006] The vehicle driving guidance device and method according to the present invention can provide guidance informing an occupant of which state the host vehicle will be in: a driving or accelerating state, or a braking or decelerating state. BRIEF DESCRIPTION OF THE DRAWINGS
[0007] [007] [Figure 1] Figure 1 is a block diagram showing a vehicle driving guidance device according to an embodiment.
[0008] [008] [Figure 2] Figure 2 is a perspective view showing an example of a vehicle in which the vehicle driving guidance device according to the embodiment is installed.
[0009] [009] [Figure 3] Figure 3 is a view showing the pieces of information included in the map information used in the vehicle driving guidance device and method according to the embodiment.
[0010] [010] [Figure 4] Figure 4 is a block diagram showing a functional internal configuration of a controller 10 shown in Figure 1.
[0011] [011] [Figure 5] Figure 5 is a view for explaining an outline of a displacement plane formulated by the vehicle driving guidance device and method according to the embodiment.
[0012] [012] [Figure 6] Figure 6 is a view schematically showing a state in which a vehicle travels according to the displacement plane shown in Figure 5.
[0013] [013] [Figure 7] Figure 7 is a view showing a first example of a predicted travel line used in the vehicle driving guidance device and method according to the embodiment.
[0014] [014] [Figure 8] Figure 8 is a view for explaining details of the first example of the predicted travel line shown in Figure 7.
[0015] [015] [Figure 9] Figure 9 is a view showing a second example of the predicted travel line used in the vehicle driving guidance device and method according to the embodiment.
[0016] [016] [Figure 10] Figure 10 is a view showing a first example of a displacement plane image indicating a constant-speed driving state and a deceleration state, used in the vehicle driving guidance device and method according to the embodiment.
[0017] [017] [Figure 11] Figure 11 is a view showing a first example of a displacement plane image indicating a constant-speed driving state and an acceleration state, used in the vehicle driving guidance device and method according to the embodiment.
[0018] [018] [Figure 12] Figure 12 is a view showing a second example of the displacement plane image indicating a constant-speed driving state and a deceleration state, used in the vehicle driving guidance device and method according to the embodiment.
[0019] [019] [Figure 13] Figure 13 is a view showing a second example of the displacement plane image indicating a constant-speed driving state and an acceleration state, used in the vehicle driving guidance device and method according to the embodiment.
[0020] [020] [Figure 14] Figure 14 is a view showing a third example of the displacement plane image indicating an acceleration state and a deceleration state, used in the vehicle driving guidance device and method according to the embodiment.
[0021] [021] [Figure 15] Figure 15 is a view showing a fourth example of the displacement plane image indicating an acceleration state, a deceleration state and a disengaged state, used in the vehicle driving guidance device and method according to the embodiment.
[0022] [022] [Figure 16] Figure 16 is a flowchart showing an operation of the vehicle driving guidance device according to the embodiment and a process of the vehicle driving guidance method according to the embodiment.
[0023] [023] [Figure 17] Figure 17 is a view showing a fifth example of the displacement plane image used in the vehicle driving guidance device and method according to the embodiment. DESCRIPTION OF THE EMBODIMENTS
[0024] [024] Hereinafter, a vehicle driving guidance device and method according to an embodiment will be explained with reference to the accompanying drawings. The vehicle driving guidance device according to the embodiment is installed, for example, in a vehicle 100 shown in Figure 2. In the following descriptions, the vehicle 100 is illustrated as an autonomous driving vehicle having autonomous driving capability controlled by the vehicle 100 itself, even when a driver is not present in it.
[0025] [025] A positioning section 11 shown in Figure 1 obtains a current position of the vehicle 100. The positioning section 11 receives GPS radio waves emitted from a GPS satellite through a GPS antenna 11a shown in Figure 2 to obtain the current position of the vehicle 100. The positioning data D11 of the current position of the vehicle 100 obtained through the positioning section 11 is inserted into a controller 10.
[0026] [026] Positioning section 11 can obtain a current position of vehicle 100 by positioning means such as a gyroscope instead of a GPS. Positioning section 11 can obtain a current position of vehicle 100 through a combination of GPS and gyroscope. The function of the positioning section 11 can be implemented inside the controller 10.
[0027] [027] A map information storage section 12 stores map information. The map information can be three-dimensional (3D) information. The map information storage section 12 can be any storage medium, such as a hard disk drive, a semiconductor memory or an optical disk. The map data D12 output from the map information storage section 12 is input into the controller 10.
[0028] [028] As shown in Figure 3, the map data D12 includes multiple pieces of information as the map information. The map data D12 includes node information D12N indicating intersections or nodes between the roads and link information D12L on links corresponding to the road sections between the nodes indicated by the node information D12N. The map data D12 includes speed limit information D12s indicating a speed limit on the respective links (namely, the respective road sections).
[0029] [029] The map data D12 can include altitude information D12H. The controller 10 can obtain the gradient of a road on which the vehicle 100 travels based on the altitude information D12H. The controller 10 can also obtain a road gradient based on GPS radio waves. The map data D12 includes road surroundings information D12E, such as information on buildings around the roads. The road surroundings information is assumed to be 3D information.
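As an informal illustration of how the pieces of map information just listed relate to each other, the following sketch organizes the node information D12N, link information D12L, speed limit information D12s, altitude information D12H and road surroundings information D12E into simple data structures; the class and field names are assumptions made for this example and are not part of the described device.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """An intersection or connection point between roads (node information D12N)."""
    node_id: int
    latitude: float
    longitude: float

@dataclass
class Link:
    """A road section between two nodes (link information D12L)."""
    link_id: int
    start_node: int
    end_node: int
    speed_limit_kmh: float                                 # speed limit information D12s
    altitudes_m: list = field(default_factory=list)        # altitude information D12H along the link
    surroundings_3d: list = field(default_factory=list)    # road surroundings information D12E (3D)

@dataclass
class MapDataD12:
    """Container corresponding to the map data D12."""
    nodes: dict   # node_id -> Node
    links: dict   # link_id -> Link

    def gradient_of(self, link_id: int, sample_spacing_m: float) -> float:
        """Rough road gradient estimated from the altitude information D12H (rise over run)."""
        alts = self.links[link_id].altitudes_m
        if len(alts) < 2:
            return 0.0
        return (alts[-1] - alts[0]) / ((len(alts) - 1) * sample_spacing_m)
```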
[0030] [030] As shown in Figure 2, the vehicle 100 is equipped with a camera 13 inside to capture an image of the area ahead of the vehicle 100. The camera 13 captures an image of the surroundings ahead such that a road on which the vehicle 100 travels is included within its viewing angle. The mounting position of the camera 13 is not limited to that shown in Figure 2. Although Figure 2 illustrates an example in which a single camera 13 is provided, a stereo camera including two cameras 13 can be provided instead. The imaging data D13 of the surroundings ahead of the vehicle 100 captured through the camera 13 is input into the controller 10.
[0031] [031] A range sensor 14 is attached to the vehicle 100, for example, inside a front grille. The range sensor 14 can include laser radar or millimeter wave radar. The range sensor 14 measures a distance between the vehicle 100 and an object present in front of the vehicle 100. The range data D14 obtained through the range sensor 14 is entered in the controller 10.
[0032] [032] The range sensor 14 is used to detect obstacles located ahead of the vehicle 100. Obstacles located ahead of the vehicle 100 include an obstacle moving in front of the vehicle 100. A preceding vehicle traveling in front of the vehicle 100 is one such obstacle. The controller 10 detects an obstacle based on the range data D14. The controller 10 can detect an object based on the range data D14 and the imaging data D13. The controller 10 can also detect an object based on the imaging data D13 only.
[0033] [033] The vehicle 100 includes a speed sensor 15 for detecting the speed of the traveling vehicle 100. The speed sensor 15 is a speed detection section for detecting the speed of the vehicle 100 and is, for example, a wheel speed sensor. The speed data D15 obtained through the speed sensor 15 is input into the controller 10.
[0034] [034] Once an occupant operates an operating section 16 to set a destination, the configuration data D16 of the destination is input into the controller 10. The operating section 16 can also be used to set information needed for route guidance other than the destination, such as a waypoint or a road to be given priority over others. When such other information is set, the configuration data D16 also includes the configuration data of the other information.
[0035] [035] The controller 10 generates travel route data D101 on a route from the current position to the destination based on the positioning data D11, the map data D12 and the configuration data D16. The travel route data D101 is stored in a travel route data storage section 17. The travel route data storage section 17 is a non-volatile memory.
[0036] [036] As described below, in the vehicle driving guidance device and method according to the embodiment, the controller 10 generates displacement plane image data D102 visually indicating how the state of the traveling vehicle 100 is controlled, based on the travel route data D101 and the respective pieces of information described below.
[0037] [037] The displacement plane image data D102 is obtained by superimposing a predicted travel line along which the vehicle 100 travels and a displacement state distinction image on the camera image in the imaging data D13 obtained through the camera 13.
[0038] [038] The displacement state distinction image indicates a displacement state of either a driving state in which the vehicle 100 is moving while a driving force is applied or a braking state in which the vehicle 100 is moving while a braking force is applied. The displacement state distinction image can instead indicate a displacement state of either an acceleration state in which the vehicle 100 accelerates or a deceleration state in which the vehicle 100 decelerates, regardless of whether the vehicle 100 is in the driving state or in the braking state. A displacement plane image included in the displacement plane image data D102 is explained in detail below.
[0039] [039] Controller 10 supplies the displacement plane image data D102 to a monitor (display unit) 18.
[0040] [040] The vehicle 100 includes a displacement device 22 for causing the vehicle 100 to travel. The displacement device 22 includes a steering wheel 22s, an accelerator 22a and a brake 22b. The displacement device 22 may include a transmission, such as gears for acceleration and deceleration. A displacement device drive section 21 drives the displacement device 22 under the control of the controller 10.
[0041] [041] The specific operations of controller 10 are explained below with reference to Figure 4. Figure 4 shows a functional internal configuration of controller 10. Controller 10 may include a microcomputer. The operations of controller 10 described below can be implemented through a computer program.
[0042] [042] As shown in Figure 4, the positioning data D11, the map data D12 and the configuration data D16 are input into a travel route data generation section 101. The travel route data generation section 101 generates, based on the map data D12, travel route data D101 indicating a predicted travel route from the current position indicated by the positioning data D11 to the destination indicated by the configuration data D16.
[0043] [043] The travel route data D101 is stored in the travel route data storage section 17, and the portion of the travel route data D101 required for route guidance in association with the movement of the vehicle 100 is read from the travel route data storage section 17. When the vehicle 100 deviates from the predicted travel route, the travel route data generation section 101 generates new travel route data D101 indicating another predicted travel route in order to update the travel route data D101. The travel route data generation section 101 is a predicted travel route configuration section for setting a predicted travel route for the host vehicle.
[0044] [044] The travel route data D101 read from the travel route data storage section 17 through the travel route data generation section 101 is input into a displacement plane image data generation section 102 and a displacement plane formulation section 103.
[0045] [045] In addition to the displacement route data D101, positioning data D11, map data D12 and imaging data D13 are inserted in the displacement plane image data generation section 102. The processing implemented through the displacement plane image data generation section 102 is explained below.
[0046] [046] In addition to the travel route data D101, the positioning data D11, the map data D12, the imaging data D13 and the speed data D15 are input into the displacement plane formulation section 103. The displacement plane formulation section 103 includes a speed plan formulation section 1031 for formulating a plan indicating at which speed the vehicle 100 is driven, and a travel target point configuration section 1032 for setting travel target points along which the vehicle 100 travels.
[0047] [047] As described above, the map data D12 includes the road surroundings information D12E. Therefore, the travel plan formulation section 103 can specify at which point on the predicted travel route indicated by the travel route data D101 the vehicle 100 is currently traveling, by comparing the map data D12 with the imaging data D13. The travel plan formulation section 103 can specify the location on the road where the vehicle 100 travels with reference to the positioning data D11 in addition to the map data D12 and the imaging data D13.
[0048] [048] The operations of the speed plan formulation section 1031 and the travel target point configuration section 1032 are explained below with reference to Figure 5. The vehicle 100 is assumed to travel on the road section indicated by the link L1, turn left at the node N1 and further travel on the road section indicated by the link L2, as indicated by the arrow, according to the travel route data D101. The travel target point configuration section 1032 distributes travel target points P1, P2, P3, ... at predetermined intervals on the road on which the vehicle is traveling.
[0049] [049] The travel target point configuration section 1032 positions the respective travel target points, at the intervals, within a distance from the position where the vehicle 100 is located on the road to a position that the vehicle 100 reaches after traveling for a predetermined period of time, or to a position that the vehicle 100 reaches after traveling a predetermined distance, on the road ahead in the direction of travel. Figure 5 illustrates an example in which the travel target point configuration section 1032 sets the respective travel target points within the distance from the current position of the vehicle 100 to the travel target point P7.
[0050] [050] The distance from the current position of the vehicle 100 to the travel target point furthest ahead of the vehicle 100 is referred to as a travel plan formulation distance. The travel target point configuration section 1032 can determine the travel plan formulation distance depending on the speed of the vehicle 100. In particular, the travel target point configuration section 1032 can increase the travel plan formulation distance as the speed of the vehicle 100 increases. As the travel plan formulation distance increases, the displacement state distinction images described below extend over a greater range.
[0051] [051] In particular, the travel target point configuration section 1032 divides the speed into a plurality of ranges, assigning a shorter travel plan formulation distance to a range of slower speeds and a longer travel plan formulation distance to a range of faster speeds. For example, the travel target point configuration section 1032 can set the travel plan formulation distance to 50 m when the speed is 40 km/h and to 80 m when the speed is 60 km/h.
[0052] [052] The travel target point configuration section 1032 can vary the interval at which the respective travel target points are positioned depending on the speed. The travel target point configuration section 1032 shortens the interval in ranges where the speed is slower and lengthens it in ranges where the speed is faster. For example, the travel target point configuration section 1032 can set the interval to 1.5 m when the speed is 30 km/h and to 4 m when the speed is 60 km/h.
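As a concrete reading of the speed-dependent settings above, the sketch below maps the current speed to a travel plan formulation distance and a target-point interval using the example values given in the text; the boundaries between the speed ranges are assumptions chosen for illustration.

```python
def plan_formulation_distance_m(speed_kmh: float) -> float:
    """Longer planning distance at higher speed (example values from the description;
    the 50 km/h boundary between the two ranges is an assumed threshold)."""
    if speed_kmh <= 50.0:
        return 50.0   # e.g. 50 m at 40 km/h
    return 80.0       # e.g. 80 m at 60 km/h

def target_point_interval_m(speed_kmh: float) -> float:
    """Shorter interval between travel target points at lower speed (example values;
    the 45 km/h boundary is an assumed threshold)."""
    if speed_kmh <= 45.0:
        return 1.5    # e.g. 1.5 m at 30 km/h
    return 4.0        # e.g. 4 m at 60 km/h

def distribute_target_points(current_position_m: float, speed_kmh: float) -> list:
    """Place travel target points along the road ahead, measured as distance along the lane center."""
    distance = plan_formulation_distance_m(speed_kmh)
    interval = target_point_interval_m(speed_kmh)
    points, s = [], current_position_m + interval
    while s <= current_position_m + distance:
        points.append(s)
        s += interval
    return points
```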
[0053] [053] The travel target point configuration section 1032 sets, as the respective travel target points, positions on the road through which, for example, the center of the vehicle 100 or the middle portion between the front wheels passes. The displacement plane formulation section 103 can recognize the lane in which the vehicle 100 travels based on the imaging data D13. The travel target point configuration section 1032 can distribute the travel target points P1, P2, P3, ... along the middle of the lane.
[0054] [054] As described above, the map data D12 includes the speed limit information D12s for the respective road sections. Therefore, the speed plan formulation section 1031 can formulate the speed plan indicating the speed at which the vehicle 100 is driven, set for the respective road sections of the links L1 and L2.
[0055] [055] Figure 5 illustrates an example in which the speed limit on the link L1 is 40 km/h and the speed limit on the link L2 is 30 km/h. The speed plan formulation section 1031 sets the speed at the respective travel target points P1 to P3 to 40 km/h, sets the speed at the travel target point P4 to, for example, 36 km/h, and sets the speed at the travel target point P5 to, for example, 32 km/h in the road section of the link L1.
[0056] [056] The node N1 is located at a position different from the travel target points set through the travel target point configuration section 1032. The speed plan formulation section 1031 can also assign a speed to the node N1. The speed plan formulation section 1031 sets the speed at the node N1 to 30 km/h. The speed plan formulation section 1031 sets the speed at the respective travel target points P6 and P7 in the road section of the link L2 to 30 km/h.
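A minimal sketch of the speed plan formulation described above: each travel target point takes the speed limit of its road section, and the points just before a slower section are ramped down, as with the points P4 and P5 in Figure 5. The ramp step is an assumption, so the intermediate values differ slightly from the 36 km/h and 32 km/h given as examples in the text.

```python
def formulate_speed_plan(point_limits_kmh: list, ramp_step_kmh: float = 4.0) -> list:
    """point_limits_kmh[i] is the speed limit of the road section containing travel target point i.
    Returns the planned speed at each point, decelerating gradually toward any slower section."""
    planned = list(point_limits_kmh)
    # Walk backwards so that a lower limit ahead pulls down the speeds of the preceding points.
    for i in range(len(planned) - 2, -1, -1):
        planned[i] = min(planned[i], planned[i + 1] + ramp_step_kmh)
    return planned

# Example corresponding to Figure 5: link L1 limited to 40 km/h, link L2 (after node N1) to 30 km/h.
# P1..P5 lie on L1 and the node/P6/P7 on L2; the result ramps 40, 40, 40, 38, 34, 30, 30, 30.
print(formulate_speed_plan([40, 40, 40, 40, 40, 30, 30, 30]))
```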
[0057] [057] As described above, the displacement plane formulation section 103 determines the travel plan formulation distance on the road ahead on which the vehicle 100 travels and distributes the travel target points on the road at predetermined intervals within the travel plan formulation distance. The displacement plane formulation section 103 formulates the displacement plane indicating the driving speed of the vehicle 100 set for the respective travel target points.
[0058] [058] The displacement plane formulation section 103 supplies the displacement plane data D103 indicating the formulated displacement plane to the displacement device drive section 21 and the displacement plane image data generation section 102. The displacement plane data D103 includes the travel target point data and the speed data for the respective travel target points. The displacement plane formulation section 103 can supply only the travel target point data to the displacement plane image data generation section 102.
[0059] [059] Travel device drive section 21 drives travel device 22 based on travel plane data D103. The vehicle 100, therefore, moves automatically on the predicted travel route indicated by the travel route data D101 according to the travel plan formulated through the travel plan formulation section 103.
[0060] [060] Figure 6 is a view as seen from the vehicle 100 and schematically shows a state in which the vehicle 100 travels on the road sections of the links L1 and L2 according to the displacement plane indicated by the displacement plane data D103. The travel target points P1, P2, P3, ... are distributed, for example, along the middle of the lane. The vehicle 100 travels along the travel target points P1, P2, P3, ... in sequence at the speed assigned to the respective travel target points.
[0061] [061] Figure 5 shows, for convenience of explanation, a state in which obstacles, such as preceding vehicles traveling ahead of the vehicle 100, are not present, so that the vehicle 100 can keep traveling at a speed within the speed limit. In practice, the vehicle 100 must stop or slow down at traffic lights or due to the presence of obstacles.
[0062] [062] Traffic lights are placed adjacent to intersections and their heights are approximately fixed. Therefore, the displacement plane formulation section 103 can recognize the color of a traffic light based on the imaging data D13. The speed plan formulation section 1031 varies the speed assigned to the respective travel target points depending on the traffic light color, used as a condition in addition to the speed limit.
[0063] [063] The travel plan formulation section 103 serves as a road information detection section for detecting information on the road on which the vehicle 100 travels on the predicted travel route. The road information is, in particular, traffic information including, for example, speed limits, traffic signs and traffic lights on the road.
[0064] [064] The travel plan formulation section 103 also serves as a predicted travel state configuration section for setting the vehicle 100 to either the driving state (or acceleration state) or the braking state (or deceleration state) when the vehicle 100 travels on the predicted travel route, based on the road information and the speed of the vehicle 100.
[0065] [065] As shown in Figure 4, the imaging data D13 and the range data D14 are input into an obstacle detection section 104. The obstacle detection section 104 detects obstacles through the use of at least one of the imaging data D13 and the range data D14. The obstacle detection section 104 can detect obstacles through the use of both the imaging data D13 and the range data D14.
[0066] [066] In this document, obstacles are objects (including people) that impede the safe driving of the vehicle 100. Specific examples of obstacles include a previous vehicle, a person or a vehicle (such as a car or bicycle) that crosses in front of vehicle 100 and a vehicle parked on a street.
[0067] [067] When the obstacle detection section 104 detects an obstacle, the obstacle detection section 104 supplies the displacement plane formulation section 103 with obstacle detection data D104 indicating that the obstacle has been detected. The obstacle detection data D104 includes distance data on the distance between the vehicle 100 and the obstacle. The speed plan formulation section 1031 varies the speed assigned to the respective travel target points depending on the presence or absence of an obstacle or the distance between the vehicle 100 and the obstacle, used as conditions in addition to the speed limit and the traffic light color.
[0068] [068] The travel plan formulation section 103, which serves as the predicted travel state configuration section, sets the vehicle 100 to either the driving state (or acceleration state) or the braking state (or deceleration state) depending on the presence or absence of an obstacle or the distance between the vehicle 100 and the detected obstacle.
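As an illustration of how these additional conditions could cap the speed assigned to a travel target point, the sketch below limits the planned speed so that a stop is possible at a red traffic light or behind a detected obstacle; the deceleration value, the margin and the stopping-distance relation v = sqrt(2·a·d) are assumptions introduced for this example, not taken from the description.

```python
import math
from typing import Optional

def capped_target_speed_kmh(limit_kmh: float,
                            light_is_red: bool,
                            distance_to_stop_line_m: float,
                            obstacle_distance_m: Optional[float],
                            comfortable_decel_mps2: float = 1.5,
                            standstill_margin_m: float = 5.0) -> float:
    """Cap the planned speed at a travel target point so that the vehicle can stop
    at a red traffic light or behind an obstacle (v = sqrt(2*a*d), converted to km/h)."""
    caps = [limit_kmh]
    if light_is_red:
        d = max(distance_to_stop_line_m, 0.0)
        caps.append(math.sqrt(2.0 * comfortable_decel_mps2 * d) * 3.6)
    if obstacle_distance_m is not None:
        d = max(obstacle_distance_m - standstill_margin_m, 0.0)
        caps.append(math.sqrt(2.0 * comfortable_decel_mps2 * d) * 3.6)
    return min(caps)

# Example: 40 km/h limit, green light, preceding vehicle detected 30 m ahead
# -> roughly 31 km/h is the highest speed from which a comfortable stop is still possible.
print(capped_target_speed_kmh(40.0, False, 0.0, 30.0))
```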
[0069] [069] The displacement plane image data generation section 102 specifies the position at which the vehicle 100 travels on the road and recognizes the lane in the same manner as the displacement plane formulation section 103. The displacement plane image data generation section 102 generates, according to the travel target point data included in the displacement plane data D103, image data obtained by superimposing a predicted travel line generated based on the travel target points on the camera image in the imaging data D13.
[0070] [070] Figure 7 shows a first example of the predicted travel line. The vehicle 100 is presumed to travel upward in the lane LN0. The first example is a predicted travel line R1, which is a straight line (or a curved line) connecting the travel target points and having a predetermined width Wr1. The predicted travel line R1 of the first example is a line through which approximately the center of the vehicle width W100 of the vehicle 100 passes.
[0071] [071] The displacement plane image data generation section 102 determines the width Wr1 of the predicted travel line R1 as follows. A straight line R10 shown in Figure 8 is a line connecting the respective travel target points. Here, the vehicle 100 may not travel exactly along the straight line R10 due to irregularities on the road surface, even when the displacement device drive section 21 drives the steering wheel 22s based on the displacement plane data D103 so as to drive the vehicle 100 in a straight line. The vehicle 100 moves in a zigzag pattern as indicated by the curved line R20.
[0072] [072] When the predicted travel line is the narrow straight line R10 as shown in Figure 8, an occupant may feel uncomfortable because the predicted travel line does not conform to the zigzag movement of the vehicle 100. Thus, in view of such a zigzag movement of the vehicle 100, the width Wr1 of the predicted travel line R1 can be set to the width of the zigzag movement or greater.
[0073] [073] The level of zigzag movement of the vehicle 100 corresponds to a control error Dss derived from the deviation of the steering wheel 22s caused when the steering wheel 22s is driven so as to drive the vehicle 100 in a straight line. As shown in Figure 4, the control error Dss is input into the displacement plane image data generation section 102 from the displacement device drive section 21. The displacement plane image data generation section 102 determines the width Wr1 of the predicted travel line R1 depending on the control error Dss of the steering wheel 22s.
[0074] [074] Figure 9 shows a second example of the predicted travel line. The second example includes two predicted travel lines R2 aligned in parallel with the straight line connecting the respective travel target points. The displacement plane image data generation section 102 sets the distance between the two predicted travel lines R2 to approximately the vehicle width W100 of the vehicle 100. The displacement plane image data generation section 102 can set the distance between the two predicted travel lines R2 to a width W110 greater than the vehicle width W100, taking the control error Dss into consideration.
[0075] [075] The width W110 can be the width between the left predicted travel line R2 defined outside the zigzag movement toward the left of the vehicle 100 and the right predicted travel line R2 defined outside the zigzag movement toward the right of the vehicle 100. Setting the distance between the two predicted travel lines R2 to the width W110 prevents the vehicle 100 from projecting out of the predicted travel lines R2 even when the vehicle 100 moves in a zigzag pattern. This helps to prevent an occupant from feeling uncomfortable.
[0076] [076] As described below, the single predicted travel line R1 or the area defined by the two predicted travel lines R2 is the portion on which the distinction images indicating the state of the traveling vehicle 100 (displacement state distinction images) are superimposed. The displacement plane image data generation section 102 sets the width of the superimposed portion within a range from a minimum width corresponding to the width of the control error of the steering wheel 22s of the vehicle 100 to a maximum width obtained by adding the width of the control error to the vehicle width W100 of the vehicle 100.
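A minimal sketch of the width rule just described, with illustrative names: the width of the superimposed portion is clamped between the width of the steering control error and that width plus the vehicle width W100.

```python
def overlay_width_m(control_error_width_m: float,
                    vehicle_width_m: float,
                    requested_width_m: float) -> float:
    """Clamp the requested width of the superimposed portion to the range described above:
    minimum = width of the steering control error (zigzag movement),
    maximum = control error width + vehicle width W100."""
    minimum = control_error_width_m
    maximum = control_error_width_m + vehicle_width_m
    return max(minimum, min(requested_width_m, maximum))

# Example: with a 0.3 m control error and a 1.8 m wide vehicle, any requested width
# is kept between 0.3 m and 2.1 m.
print(overlay_width_m(0.3, 1.8, 2.5))  # -> 2.1
```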
[0077] [077] The displacement plane image data generation section 102 generates the displacement plane image data D102 obtained by superimposing the predicted travel line R1 or the predicted travel lines R2 on the camera image and further superimposing thereon the displacement state distinction images indicating the state of the traveling vehicle 100. The state of the traveling vehicle 100 is, in particular, either the driving state (or acceleration state) or the braking state (or deceleration state).
[0078] [078] The state of the moving vehicle 100 may include a disengaged state in which neither a driving force nor a braking force is applied. Alternatively, the disengaged state can include a state in which vehicle 100 accelerates and a state in which vehicle 100 decelerates.
[0079] [079] Each of Figure 10 and Figure 11 shows a first example of displacement state distinguishing images displayed on monitor 18 and superimposed on the camera image. Each of Figure 10 and Figure 11 shows a case of using the two predicted displacement lines R2 explained with reference to Figure 9.
[0080] [080] In Figure 10, the obstacle detection section 104 detects, as an obstacle, a preceding vehicle 150 located in front of the vehicle 100 (host vehicle) traveling in the lane LN0 and supplies the obstacle detection data D104 to the travel plan formulation section 103. The travel plan formulation section 103 supplied with the obstacle detection data D104 assigns the speed to the respective travel target points so as to gradually decelerate the vehicle 100.
[0081] [081] As shown in Figure 10, the displacement plane image data generation section 102 superimposes a displacement state distinction image (first distinction image) Im10 on the region between the two predicted travel lines R2 where the vehicle 100 travels at a constant speed in the driving state. The displacement plane image data generation section 102 additionally superimposes a displacement state distinction image (second distinction image) Im11 on the region between the two predicted travel lines R2 in which the vehicle 100 decelerates in the braking state.
[0082] [082] The displacement plane image data generation section 102 generates, for example, displacement plane image data D102 in which the displacement state distinction image Im10 and the displacement state distinction image Im11 differ from each other in color. The displacement state distinction image Im10 and the displacement state distinction image Im11 can be semitransparent images. When the displacement state distinction images Im10 and Im11 are semitransparent images, the road (lane image) is prevented from being completely blocked, so that road signs or objects on the road can be recognized.
[0083] [083] An occupant views the displacement plane image indicated by the displacement plane image data D102 displayed on the monitor 18 and can thus easily recognize that the vehicle 100 is controlled so as to change from the constant-speed travel state to the deceleration state.
[0084] [084] Figure 11 shows a state in which the vehicle 100 has stopped in front of an intersection because a traffic light TL1 is red, but starts moving as the traffic light TL1 turns green. The travel plan formulation section 103 sets the speed at the respective travel target points such that the vehicle 100 gradually accelerates from the start until it has traveled a predetermined distance. Once the vehicle 100 reaches the speed limit, the travel plan formulation section 103 sets the respective travel target points so as to drive the vehicle 100 at a constant speed.
[0085] [085] As shown in Figure 11, the displacement plane image data generation section 102 superimposes a displacement state distinction image (third distinction image) Im12 on the region between the two predicted travel lines R2 where the vehicle 100 accelerates in the driving state. The displacement plane image data generation section 102 additionally superimposes the displacement state distinction image Im10 on the region between the two predicted travel lines R2 in which the vehicle 100 travels at a constant speed in the driving state, as in the case of Figure 10.
[0086] [086] In Figure 11, the displacement plane image data generation section 102 also generates displacement plane image data D102 in which the displacement state distinction image Im10 and the displacement state distinction image Im12 differ from each other in color. The displacement state distinction image Im12 can also be a semitransparent image for the same reason.
[0087] [087] An occupant views the displacement plane image indicated by the displacement plane image data D102 displayed on the monitor 18 and can thus easily recognize that the vehicle 100 is controlled so as to change from the acceleration state to the constant-speed travel state.
[0088] [088] In Figure 10 and Figure 11, the displacement plane image data generation section 102 can set the color of the predicted travel lines R2 so that it differs from that of the respective displacement state distinction images Im10, Im11 and Im12, or set the color of the predicted travel lines R2 to match the respective displacement state distinction images Im10, Im11 and Im12. When the predicted travel lines R2 are colored in the same way as the displacement state distinction images Im10, Im11 and Im12, the displacement plane image thus obtained is substantially the same as an image including only the displacement state distinction images Im10, Im11 and Im12.
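The color-coded, semitransparent superimposition described for Figures 10 and 11 can be sketched as a simple alpha blend of a state-dependent color into the camera image between the two predicted travel lines. The colors, the alpha value and the way the region masks are obtained are assumptions for illustration only.

```python
import numpy as np

# Illustrative colors (B, G, R) for the displacement state distinction images;
# the actual colors are not specified in the description.
STATE_COLORS = {
    "constant": (0, 200, 0),      # e.g. Im10: constant-speed travel in the driving state
    "decelerate": (0, 0, 220),    # e.g. Im11: deceleration in the braking state
    "accelerate": (220, 120, 0),  # e.g. Im12: acceleration in the driving state
}

def superimpose_state_image(camera_bgr: np.ndarray,
                            region_mask: np.ndarray,
                            state: str,
                            alpha: float = 0.4) -> np.ndarray:
    """Blend a semitransparent distinction image into the camera image.
    region_mask is a boolean array marking the pixels between the two predicted travel lines
    that belong to the given displacement state."""
    out = camera_bgr.astype(np.float32).copy()
    color = np.array(STATE_COLORS[state], dtype=np.float32)
    out[region_mask] = (1.0 - alpha) * out[region_mask] + alpha * color
    return out.astype(np.uint8)
```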
[0089] [089] The displacement state distinction images are not limited to those having different colors as shown in Figure 10 and Figure 11. Distinction images that vary in shape depending on the displacement state can be used instead.
[0090] [090] Each of Figure 12 and Figure 13 shows a second example of the displacement state distinction images displayed on monitor 18 and superimposed on the camera image.
[0091] [091] Figure 12 shows an example of distinction images that vary in shape depending on the displacement state, instead of the Im10 displacement state distinction image and the Im11 displacement state distinction image shown in Figure 10. As shown in Figure 12, the displacement plane image data section 102 superimposes displacement state distinction images (first distinction images) Im20 that have, for example, a square shape in the region where the vehicle 100 travels at a constant speed in the driving state.
[0092] [092] The displacement plane image generation section 102 also superimposes displacement state distinction images (second distinction images) Im21 that have, for example, a downward arrow shape in the region where the vehicle 100 decelerates in the braking state.
[0093] [093] Figure 13 shows an example of distinction images that vary in shape depending on the displacement state, instead of the Im10 displacement state distinction image and the Im12 displacement state distinction image shown in Figure 11. As shown in Figure 13, the displacement plane image data section 102 superimposes displacement state distinction images (third distinction images) Im22 that have, for example, a rising arrow shape in the region where the vehicle 100 accelerates in the driving state.
[0094] [094] The displacement plane image generation section 102 also superimposes the displacement state distinction images Im20, which have a square shape, over the region in which the vehicle 100 travels at a constant speed in the state of driving, as in the case of Figure 12.
[0095] [095] The square shape, the downward arrow shape and the rising arrow shape of the displacement state distinction images Im20, Im21 and Im22 are merely examples, and any shape can be used as long as it can convey images of constant speed, deceleration and acceleration to an occupant. The displacement state distinction images Im20, Im21 and Im22, which are white on the inside, hardly influence the recognition of traffic signs or objects on the road.
[0096] [096] Here, when the predicted travel line R1 explained with reference to Figure 7 is used instead, the predicted travel line R1 can be divided into parts with different colors in the same way as the displacement state distinction images Im10, Im11 and Im12. Alternatively, distinction images similar to the displacement state distinction images Im10, Im11 and Im12 can be superimposed on the predicted travel line R1.
[0097] [097] Figure 10 to Figure 13 illustrate examples in which the displacement plane image is indicated with three different types of sections showing the driving state and the braking state, which vary depending on the acceleration of the vehicle 100.
[0098] [098] For example, when the vehicle 100 travels on a road with an increasing gradient, the vehicle 100 may be in the driving state in which the accelerator 22a is depressed in order to travel at a constant speed. When the vehicle 100 travels on a road with a decreasing gradient, the vehicle 100 may be in the braking state in which the accelerator 22a is released in order to travel at a constant speed. The displacement plane image data generation section 102 may use a displacement plane image indicated with different displacement state distinction images indicating which state the vehicle 100 is in, the driving state or the braking state, regardless of the acceleration of the vehicle 100.
[0099] [099] The travel plan formulation section 103 can also formulate a travel plan that indicates a predicted travel state of the driving state and the braking state depending on the information on the road the vehicle 100 travels on, such as a gradient or curvature. In this case, the displacement plane image data generation section 102 superimposes, in the camera image, the displacement state distinction images configured depending on a gradient or curvature of the road.
[0100] [0100] Since the displacement state distinction images vary depending on the road conditions, an occupant can recognize the predicted displacement state within the distance from the current position of the vehicle 100 for which the displacement plane has been formulated, while also recognizing the road conditions.
[0101] [0101] Additionally, the displacement plane image data generation section 102 can indicate, as the displacement state, either the acceleration state or the deceleration state of the vehicle 100, regardless of whether the vehicle 100 is in the driving state or the braking state.
[0102] [0102] Figure 14 shows a third example of the displacement state distinction images, which indicate only the acceleration state and the deceleration state, regardless of whether the vehicle 100 is in the driving state or the braking state. In Figure 14, a traffic light TL2 ahead is red. As shown in Figure 14, the displacement plane image data generation section 102 superimposes a displacement state distinction image Im32 on a region where the vehicle 100 accelerates and superimposes a displacement state distinction image Im31 on a region where the vehicle 100 decelerates.
[0103] [0103] Figure 15 shows a fourth example of the displacement state distinction images. Here, the vehicle 100 may, depending on the road conditions, be in a disengaged state within the acceleration state, in which the vehicle 100 is neither in the driving state nor in the braking state. The vehicle 100 may likewise, depending on the road conditions, be in a disengaged state within the deceleration state, in which the vehicle 100 is neither in the driving state nor in the braking state. Figure 15 is an example indicating displacement state distinction images assigned to a disengaged state that is part of the acceleration state and to a disengaged state that is part of the deceleration state.
[0104] [0104] As shown in Figure 15, the displacement plane image data generation section 102 superimposes the displacement state distinction image Im32 on the region where the vehicle 100 accelerates and superimposes a displacement state distinction image Im32c on a region where the vehicle 100 is in the disengaged state within the acceleration state. In addition, the displacement plane image data generation section 102 superimposes the displacement state distinction image Im31 on the region where the vehicle 100 decelerates and superimposes a displacement state distinction image Im31c on a region where the vehicle 100 is in the disengaged state within the deceleration state.
[0105] [0105] The displacement state distinction images according to the vehicle driving guidance device and method are summarized below. The displacement state distinction images constitute a route image corresponding to the predicted travel route set for the host vehicle. As a first aspect, the route image can be displayed in a display mode corresponding to each of the driving state (driving force) and the braking state (braking force). Each of Figure 10 and Figure 11 indicates the route image such that the driving state varies depending on the acceleration state and the constant-speed travel state. The first aspect provides the route image displayed in the modes corresponding to the driving state and the braking state without differentiating the acceleration state from the constant-speed driving state.
[0106] [0106] Each of Figure 10 to Figure 13 shows a second aspect in which the route image is displayed in a display mode corresponding to each of the driving state and the braking state, and indicates the route image such that the driving state varies depending on the acceleration state and the constant-speed travel state.
[0107] [0107] As a third aspect, as explained with reference to Figure 14, the route image can be displayed in a display mode corresponding to each of the acceleration state and the deceleration state. As a fourth aspect, as explained with reference to Figure 15, the route image can be displayed in a display mode corresponding to each of the acceleration state, the deceleration state and the disengaged state. In Figure 15, only the disengaged state within the deceleration state may be displayed, without displaying the disengaged state within the acceleration state.
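The aspects summarized above can be read as a small classification rule that picks a display mode for each section of the route image from the planned driving force, braking force and acceleration. The sketch below is one possible interpretation; the thresholds, the mode names and the numbering of the aspects as a function argument are assumptions.

```python
def display_mode(drive_force_n: float,
                 brake_force_n: float,
                 accel_mps2: float,
                 aspect: int,
                 eps: float = 1e-3) -> str:
    """Pick a display mode for a section of the route image.
    aspect 1: driving vs. braking only (first aspect).
    aspect 2: as aspect 1, but the driving state is split into acceleration and
              constant-speed travel (second aspect, Figures 10 to 13).
    aspect 3: acceleration vs. deceleration only (third aspect, Figure 14).
    aspect 4: as aspect 3, plus the disengaged state where neither force is applied
              (fourth aspect, Figure 15)."""
    driving = drive_force_n > eps
    braking = brake_force_n > eps
    if aspect in (1, 2):
        if braking:
            return "braking"
        if aspect == 2:
            return "driving-accelerating" if accel_mps2 > eps else "driving-constant"
        return "driving"
    if aspect == 4 and not driving and not braking:
        return "disengaged-accelerating" if accel_mps2 > eps else "disengaged-decelerating"
    return "accelerating" if accel_mps2 > eps else "decelerating"
```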
[0108] [0108] As other aspects, the route image can be displayed in display modes depending on the level of the driving force and the braking force, or in display modes depending on the level of acceleration and deceleration.
[0109] [0109] The operation of the vehicle driving guidance device according to the embodiment and the process of the vehicle driving guidance method according to the embodiment described above are explained in more detail below with reference to the flowchart shown in Figure 16. As shown in Figure 16, the controller 10 obtains the travel route data D101 from the travel route data storage section 17 in step S1.
[0110] [0110] In step S2, the controller 10 distributes the travel target points at predetermined intervals on the lane in which the host vehicle travels, from the current position of the host vehicle (vehicle 100) to the position that the host vehicle reaches after traveling the travel plan formulation distance, according to the predicted travel route indicated by the travel route data D101. In step S3, the controller 10 assigns the speed to the respective travel target points to formulate the speed plan depending on the current speed of the host vehicle, the speed limit on the road traveled, the traffic light color, the presence or absence of an obstacle and the distance to the obstacle.
[0111] [0111] Through step S2 and step S3, the displacement plane data D103 is generated so as to indicate the predicted travel state, either the driving state or the braking state, to which the vehicle 100 is set.
[0112] [0112] In step S4, the controller 10 supplies the displacement plane data D103 to the travel device drive section 21. In step S5, the controller 10 superimposes the predicted travel line or the predicted travel lines on the lane image in the camera image. In step S6, the controller 10 superimposes the displacement state distinction images on the predicted travel line or between the predicted travel lines. Through step S5 and step S6, the displacement plane image data D102 indicating the displacement plane image is generated.
[0113] [0113] In step S7, controller 10 supplies the displacement plane image data D102 to monitor 18. Monitor 18 therefore displays the camera image on which the displacement plane image is superimposed. Controller 10 returns the process to step S1 after completing step S7 and repeats the process from step S1 to step S7.
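Bringing the steps of Figure 16 together, the controller's repeated processing could be sketched as the loop below; the method names are placeholders standing in for the sections described above rather than an actual interface of the device.

```python
def guidance_loop(controller):
    """Schematic of steps S1 to S7 in Figure 16 (placeholder methods, illustrative only)."""
    while True:
        route = controller.read_travel_route_data()            # S1: travel route data D101
        points = controller.distribute_target_points(route)    # S2: travel target points
        plan = controller.formulate_speed_plan(points)         # S3: speed plan -> plan data D103
        controller.send_plan_to_drive_section(plan)            # S4: drive the travel device 22
        frame = controller.overlay_predicted_travel_lines()    # S5: predicted travel line(s) on camera image
        frame = controller.overlay_state_images(frame, plan)   # S6: displacement state distinction images
        controller.show_on_monitor(frame)                      # S7: display image data D102, then repeat
```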
[0114] [0114] The above process exemplifies the case where camera 13 is used and the displacement plane image is superimposed on the track image on the camera image. A laser range finder can be used in place of camera 13. The laser range finder radiates laser light and the controller 10 therefore detects reflected waves that are reflected on the road in order to obtain the image data that indicate the road lane.
[0115] [0115] When the laser range finder is used in place of the camera 13, the image data indicating the road lane output from the laser range finder is input into the displacement plane image data generation section 102 instead of the imaging data D13. The displacement plane image data generation section 102 can superimpose the displacement plane image on the lane image indicated by the image data indicating the road lane.
[0116] [0116] In a fifth example shown in Figure 17, the displacement plane image is not superimposed on the camera image capturing the actual road on which the vehicle 100 travels, but can instead be superimposed on a map image included in the map data D12. The map image indicates a route image corresponding to the predicted travel route of the host vehicle. Figure 17 is a view showing the roads around the current position of the vehicle 100 indicated on the monitor 18 based on the map data D12. A current position mark M100 indicates the current position of the vehicle 100.
[0117] [0117] The example display shown in Figure 17 indicates the direction of travel of the vehicle 100 as upward. Here, a traffic light TL3 in front of the vehicle 100 is assumed to be red. The travel plan formulation section 103 formulates a travel plan such that the vehicle 100 is gradually decelerated from a constant speed so as to stop in front of the traffic light TL3.
[0118] [0118] As shown in Figure 17, the displacement plane image data generation section 102 superimposes, on the lane image in the map image, the displacement state distinction image Im10 indicating that the vehicle 100 travels at a constant speed in the driving state and the displacement state distinction image Im11 indicating that the vehicle 100 decelerates in the braking state.
[0119] [0119] In the example display shown in Figure 17, the displacement state distinction images Im10 and Im11 are superimposed over the entire lane image, while the predicted travel line R1 or the predicted travel lines R2 are omitted. The example display shown in Figure 17 can be implemented by omitting step S5 shown in Figure 16 and superimposing the displacement state distinction images on the lane image in the map image in step S6.
[0120] [0120] The predicted travel line R1 or the predicted travel lines R2 can be superimposed on the lane image, and the displacement state distinction images Im10 and Im11 can be additionally superimposed on them. Although the image thus shown indicates only the constant-speed state and the deceleration state, the displacement state distinction image Im12 can also be superimposed when the acceleration state is to be indicated.
[0121] [0121] As shown in Figure 17, when displacement state distinguishing images Im10, Im11 and Im12 are superimposed on the map image, displacement state distinguishing images Im10, Im11 and Im12 do not need to be semitransparent. Instead of the displacement state distinction images Im10, Im11 and Im12, the distinction images that vary in shape depending on the displacement state, such as the displacement state distinction images Im20, Im21 and Im22, can also be used.
[0122] [0122] Although Figure 17 is a two-dimensional (2D) map image, a three-dimensional (3D) map image may be used instead. The 3D map image can approximate a camera image.
[0123] [0123] The vehicle driving guidance device according to the embodiment as illustrated above includes the displacement plane formulation section 103, the displacement plane image data generation section 102 and the monitor 18.
[0124] [0124] The displacement plane formulation section 103 sets, as a travel plan formulation distance, a predetermined distance ahead of the current position of the vehicle 100 traveling on a road on a predicted travel route. The displacement plane formulation section 103 formulates a travel plan indicating which predicted travel state the vehicle 100 will be in, a driving state or a braking state, when the vehicle 100 travels along the travel plan formulation distance.
[0125] [0125] The displacement plane image generation section 102 generates displacement plane image data D102 in which the displacement state distinguishing images (such as Im10, Im11 and Im12 or Im20, Im21 and Im22) which vary visually depending on the driving state and the braking state indicated by the displacement plane are positioned on a track image of the road on which the vehicle is traveling. Monitor 18 displays a displacement plane image indicated through displacement plane image data D102.
[0126] [0126] The vehicle driving guidance device according to the embodiment is configured as follows. The travel plan formulation section 103, which serves as a road information detection section, detects information on the road on which the host vehicle (vehicle 100) travels on the predicted travel route. The speed sensor 15, which serves as a speed detection section, detects the speed of the host vehicle.
[0127] [0127] The travel plan formulation section 103, which serves as a predicted travel state configuration section, sets the host vehicle to either the driving state or the braking state when the host vehicle travels on the predicted travel route, based on the road information detected through the road information detection section and the speed detected by the speed sensor 15.
[0128] [0128] The displacement plane image data generation section 102 generates image data (displacement plane image data) of an image in which the displacement state distinction images indicating the driving state and the braking state set by the predicted travel state configuration section are positioned on the road indicating the predicted travel route. The monitor 18 serving as a display unit displays the image of the image data.
[0129] [0129] Herein, positioning the displacement state distinction images on the road may be any one of superimposing the displacement state distinction images on the lane image indicating the road, replacing a part of the lane image with the displacement state distinction images, and replacing the entire lane image with the displacement state distinction images.
[0130] [0130] The vehicle driving guidance method according to the embodiment includes the following steps. A road information detection step detects information on a road on which the host vehicle travels on a predicted travel route. A speed detection step detects the speed of the host vehicle. A predicted travel state configuration step sets the host vehicle to either a driving state or a braking state when the host vehicle travels on the predicted travel route, based on the road information detected in the road information detection step and the speed detected in the speed detection step.
[0131] [0131] An image data generation step generates image data of an image in which the displacement state distinction images indicating the driving state and the braking state set in the predicted travel state configuration step are positioned on the road indicating the predicted travel route. A display step displays the image of the image data on the display unit.
[0132] [0132] According to the configuration described above, the vehicle driving guidance device and method according to the embodiment can provide guidance informing an occupant of which travel state the host vehicle will be in, the driving state or the braking state.
[0133] [0133] In the vehicle driving guidance device according to the embodiment, the image data generation section visually differentiates a displacement state distinction image indicating the driving state from a displacement state distinction image indicating the braking state, so that an occupant can easily distinguish the driving state from the braking state.
[0134] [0134] In the vehicle driving guidance device according to the embodiment, the displacement plane image data generation section 102 can generate the image data as follows.
[0135] [0135] The displacement plane image data generation section 102 generates displacement plane image data of displacement state distinction images in which a first distinction image is positioned in a region on the road where the vehicle 100 travels at a constant speed. The displacement plane image data generation section 102 also generates displacement plane image data of displacement state distinction images in which a second distinction image is positioned in a region on the road where the vehicle 100 decelerates and a third distinction image is positioned in a region on the road where the vehicle 100 accelerates.
[0136] [0136] Consequently, the vehicle driving guidance device can provide not only guidance merely informing the driving state and the braking state, but also guidance informing the acceleration state of the vehicle 100.
[0137] [0137] The travel plan formulation section 103 can include both the travel target point configuration section 1032 and the speed plan formulation section 1031. The travel target point configuration section 1032 distributes a plurality of travel target points at predetermined intervals on the road within the travel plan formulation distance along which the vehicle 100 travels. The speed plan formulation section 1031 assigns the speed of the vehicle 100 to the respective travel target points and formulates the speed plan for driving the vehicle 100 over the travel plan formulation distance.
[0138] [0138] Consequently, vehicle 100 can be driven autonomously.
[0139] [0139] The displacement plane image data generation section 102 can use, as the displacement state distinction images, distinction images that vary in color or shape depending on the driving state and the braking state. Consequently, an occupant can easily recognize the driving state and the braking state.
[0140] [0140] The displacement plane image data generation section 102 can use semitransparent distinction images as the displacement state distinction images. Consequently, the road image is prevented from being completely blocked, so that road signs or objects on the road can be recognized.
[0141] [0141] The displacement plan image data generation section 102 can set the width of each displacement state distinction image within a range from a minimum width corresponding to the zigzag movement width of the vehicle 100 caused by a control error of the steering wheel 22s of the vehicle 100, to a maximum width obtained by adding that zigzag movement width to the width of the vehicle 100. Consequently, an occupant is prevented from feeling uncomfortable when viewing the displacement state distinction images.
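The width constraint described above can be expressed in a few lines; the numeric values in the example are assumptions, not figures from the embodiment.

```python
# Sketch of the width rule described above: the drawn width of a distinction image is
# kept between the zigzag (steering-error) width and that width plus the vehicle width.
def clamp_distinction_image_width(requested_width_m: float,
                                  zigzag_width_m: float,
                                  vehicle_width_m: float) -> float:
    minimum = zigzag_width_m                    # lower bound from steering control error
    maximum = zigzag_width_m + vehicle_width_m  # upper bound adds the vehicle width
    return max(minimum, min(requested_width_m, maximum))


# Example with assumed values: 0.3 m of zigzag width, 1.8 m vehicle width
print(clamp_distinction_image_width(2.5, 0.3, 1.8))  # 2.1 (clamped to the maximum)
print(clamp_distinction_image_width(0.1, 0.3, 1.8))  # 0.3 (clamped to the minimum)
```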
[0142] [0142] The displacement plan formulation section 103 can increase the travel plan formulation distance as the speed of the vehicle 100 increases. The displacement plan image data generation section 102 can increase the extent of the respective displacement state distinction images in the direction of travel of the host vehicle as the speed of the vehicle 100 increases. Consequently, an occupant can easily recognize the predicted travel state of the vehicle 100 even when the speed increases.
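One simple way to lengthen the formulation distance with speed is a fixed look-ahead time, sketched below under assumed constants (look-ahead time and bounds are example values only).

```python
# Sketch: lengthen the travel plan formulation distance (and hence the drawn extent of
# the distinction images) as the vehicle speed increases. Constants are assumed values.
def plan_formulation_distance_m(speed_mps: float,
                                lookahead_s: float = 8.0,
                                min_distance_m: float = 50.0,
                                max_distance_m: float = 300.0) -> float:
    """Distance covered in a fixed look-ahead time, clamped to sensible bounds."""
    return max(min_distance_m, min(speed_mps * lookahead_s, max_distance_m))


print(plan_formulation_distance_m(5.0))   # 50.0  (low speed, lower bound applies)
print(plan_formulation_distance_m(15.0))  # 120.0 (grows with speed)
print(plan_formulation_distance_m(40.0))  # 300.0 (capped at the upper bound)
```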
[0143] [0143] The displacement plan image data generation section 102 may vary the displacement state distinction images depending on the gradient or curvature of the road on which the vehicle 100 travels. Consequently, an occupant can also recognize the road conditions.
[0144] [0144] The device for vehicle driving guidance according to the modality may additionally include an obstacle detection section 104 to detect an obstacle located ahead of the vehicle 100. The displacement plan formulation section 103 may formulate the displacement plan depending on the presence or absence of an obstacle detected by the obstacle detection section 104, or on the distance from the vehicle 100 to the detected obstacle, in order to set the host vehicle to either the driving state or the braking state.
[0145] [0145] Since the host vehicle can be set to either the driving state or the braking state while the displacement plan is formulated in view of the presence or absence of an obstacle, the displacement state distinction images can be displayed according to a travel plan that reflects the actual road conditions, in which an obstacle may be present. Consequently, an occupant can be provided with accurate displacement state distinction images corresponding to the actual road conditions.
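As a hypothetical sketch of how obstacle information could feed into the driving/braking decision (the stopping-distance model, deceleration value, and margin are assumptions, not figures from the embodiment):

```python
# Sketch: choose braking when a detected obstacle is closer than the distance needed
# to stop (plus a safety margin); otherwise keep the driving state. The deceleration
# value and margin are assumed, not taken from the embodiment.
from typing import Optional


def choose_state_with_obstacle(speed_mps: float,
                               obstacle_distance_m: Optional[float],
                               comfortable_decel_mps2: float = 2.0,
                               margin_m: float = 5.0) -> str:
    """Return 'braking' or 'driving' given the distance to the nearest obstacle ahead."""
    if obstacle_distance_m is None:
        return "driving"                         # no obstacle detected ahead
    stopping_distance = speed_mps ** 2 / (2.0 * comfortable_decel_mps2)
    return "braking" if obstacle_distance_m <= stopping_distance + margin_m else "driving"


print(choose_state_with_obstacle(20.0, 80.0))   # braking (needs ~100 m to stop)
print(choose_state_with_obstacle(10.0, 80.0))   # driving (needs ~25 m to stop)
print(choose_state_with_obstacle(20.0, None))   # driving (nothing detected)
```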
[0146] [0146] Figure 4 shows the configuration in which the travel route data generation section 101 generates the travel route data D101 and the travel route data storage section 17 stores the generated travel route data D101. When the predicted travel route on which the vehicle 100 travels is fixed, the travel route data generation section 101 is not necessarily provided in the device for vehicle driving guidance, as long as the travel route data storage section 17 that stores the predetermined travel route data D101 is provided.
[0147] [0147] The displacement plan formulation section 103 formulates the speed plan of the vehicle 100 based on the speed limit information D12s included in the map data D12. However, the method of obtaining the speed limit information is not limited to reading it from the map data D12.
[0148] [0148] For example, transmission devices can be installed adjacent to the roads on which the vehicle 100 travels to transmit the speed limit information of each road via radio, so that a receiver mounted on the vehicle 100 can receive the speed limit information.
[0149] [0149] The vehicle 100 can be either an autonomous vehicle or an ordinary vehicle driven by a driver.
[0150] [0150] As described above, the vehicle driving guidance device and method according to the modality can display the route image corresponding to the predicted travel route of the host vehicle on the monitor 18 in a display mode corresponding to at least the driving state (driving force) and the braking state (braking force) of the host vehicle. They can likewise display the route image corresponding to the predicted travel route of the host vehicle on the monitor 18 in a display mode corresponding to at least the acceleration state (acceleration) and the deceleration state (deceleration) of the host vehicle.
[0151] [0151] The vehicle driving guidance device and method according to the present invention are not limited to the vehicle driving guidance device and method according to the modality described above, and various modifications and improvements can be made without departing from the scope of the present invention. INDUSTRIAL APPLICABILITY
[0152] [0152] The present invention can be applied to a vehicle driving guidance device that provides guidance informing an occupant of a vehicle of which travel state the vehicle will be in while traveling on a predicted travel route, a travel state of driving or acceleration or a travel state of braking or deceleration. LIST OF REFERENCE SIGNS 18 - MONITOR (DISPLAY UNIT) 101 - TRAVEL ROUTE DATA GENERATION SECTION (EXPECTED TRAVEL ROUTE CONFIGURATION SECTION)
Claims (9)
[0001]
Vehicle driving guidance device comprising a predicted travel route configuration section (101) configured to determine a predicted travel route for a host vehicle (100), the host vehicle (100) being an autonomous driving vehicle configured for autonomous driving controlled by the host vehicle (100) itself without operation by a driver, the vehicle driving guidance device further comprising: a travel plan formulation section (103) configured to formulate a travel plan indicative of how to autonomously drive the host vehicle (100) along the predicted travel route, said travel plan formulation section (103) being also configured to serve as a road information detection section to detect information on a road on which the host vehicle (100) travels on the predicted travel route, the travel plan formulation section (103) being also configured to serve as a predicted travel state configuration section provided to set the host vehicle (100) to either an acceleration state or a deceleration state when the host vehicle (100) travels on the predicted travel route on a road ahead of the host vehicle (100), based on the road information and a vehicle speed, CHARACTERIZED by comprising: a displacement plan image generation section (102) configured to generate a displacement plan image in which a plurality of displacement state distinction images indicating the acceleration state or the deceleration state set by the predicted travel state configuration section (103) are positioned on a route image corresponding to the predicted travel route; and a display unit (18) configured to display the displacement plan image.
[0002]
Vehicle driving guidance device according to claim 1, CHARACTERIZED by the fact that the travel plan formulation section (103) formulates the travel plan based on information about the road on which the host vehicle (100) travels, including a road speed limit and a traffic light state, and based on the speed of the host vehicle (100).
[0003]
Vehicle driving guidance device according to claim 1 or 2, CHARACTERIZED by the fact that it additionally comprises an obstacle detection section configured to detect an obstacle located ahead of the host vehicle (100), wherein the predicted travel state configuration section (103) formulates the travel plan depending on the presence or absence of an obstacle detected by the obstacle detection section or on a distance between the host vehicle (100) and the obstacle.
[0004]
Vehicle driving guidance device according to claim 2, CHARACTERIZED by the fact that the predicted travel state configuration section (103) determines the traffic light state based on image data obtained through a camera that captures an image of the area ahead of the host vehicle (100).
[0005]
Vehicle driving guidance device according to claim 3, CHARACTERIZED by the fact that the obstacle detection section detects the obstacle based on at least one of image data obtained through a camera that captures an image of the area ahead of the host vehicle (100) and range data obtained through a range sensor that measures a distance between the host vehicle (100) and an object present ahead of the host vehicle (100).
[0006]
Vehicle driving guidance device according to any one of the preceding claims, CHARACTERIZED by the fact that the displacement plan image generation section (102) differentiates a color of the displacement state distinction image that indicates the acceleration state from a color of the displacement state distinction image that indicates the deceleration state.
[0007]
Vehicle driving guidance device according to any one of the preceding claims, CHARACTERIZED by the fact that the displacement plan image generation section (102) sets a width of the route image within a range from a minimum width corresponding to a zigzag movement width of the host vehicle (100) due to a steering control error of the host vehicle (100) to a maximum width corresponding to a width obtained by adding the zigzag movement width of the host vehicle (100) due to the steering control error to a width of the host vehicle (100).
[0008]
Vehicle driving guidance device according to any one of the preceding claims, CHARACTERIZED by the fact that the predicted travel state configuration section (103) comprises a travel target point configuration section (1032) that increases a travel plan formulation distance of the route image in the direction in which the host vehicle (100) travels as the speed of the host vehicle (100) increases.
[0009]
Vehicle driving guidance method comprising a first configuration step (S1) of configuring a predicted travel route for a host vehicle (100), the host vehicle (100) being an autonomous driving vehicle configured for autonomous driving controlled by the host vehicle (100) itself without operation by a driver, the vehicle driving guidance method further comprising: a second configuration step (S2 and S3) of formulating, by means of a travel plan formulation section (103), a travel plan indicative of how to autonomously drive the host vehicle (100) along the predicted travel route and of setting the host vehicle (100) to either an acceleration state or a deceleration state when the host vehicle (100) travels on the predicted travel route on a road ahead of the host vehicle (100), based on road information and a vehicle speed, CHARACTERIZED by the fact that it comprises: a generation step (S5 and S6) of generating a displacement plan image in which a plurality of displacement state distinction images indicating the acceleration state and the deceleration state configured in the second configuration step (S2 and S3) are positioned on a route image corresponding to the predicted travel route; and a display step (S7) of displaying the displacement plan image on a display unit (18).