SELF-CONTAINED DRONE-BASED SHOOTING SYSTEM WITH TARGET TRACKING AND HOLDING OF THE TARGET VIEWING ANGLE.
Patent abstract:
The invention relates to a system for taking animated shots, comprising a drone (D) provided with a camera (C) and a ground station (S) communicating with the drone, the camera (C) being oriented along an aiming axis (3), the movements of the drone being defined by flight instructions applied to a set of propulsion units of the drone, the drone being able to move autonomously to take an animated shot of a target (T) moving with the ground station (S). The system comprises control means (2) configured to generate said flight instructions so as to keep substantially constant the angle between the line of sight of the camera and the direction of movement of the target upon activation of target tracking. The ground station comprises means, controlled by at least one piloting means forming a target-tracking activation button, for switching the piloting mode of the drone alternately between a mode activating the target-tracking system and a deactivation mode.
Publication number: FR3058238A1
Application number: FR1660538
Filing date: 2016-10-28
Publication date: 2018-05-04
Inventor: Guillaume Pinto
Applicant: Parrot Drones SAS
IPC main class:
Patent description:
Publication no.: 3 058 238 (to be used only for reproduction orders)
National registration number: 16 60538
FRENCH REPUBLIC, NATIONAL INSTITUTE OF INDUSTRIAL PROPERTY, COURBEVOIE
Int Cl 8: G 05 D 1/12 (2017.01), B 64 C 27/08
A1 PATENT APPLICATION
Date of filing: 28.10.16
Applicant(s): PARROT DRONES, simplified joint-stock company, FR
Inventor(s): PINTO GUILLAUME
Date of public availability of the request: 04.05.18, Bulletin 18/18
List of documents cited in the preliminary search report: refer to the end of the present booklet
Holder(s): PARROT DRONES, simplified joint-stock company
Extension request(s):
Agent(s): BARDEHLE PAGENBERG

The invention relates to remote-controlled flying motorized devices, hereinafter generally designated under the name of drones.
The invention is particularly applicable to rotary-wing drones such as quadricopters, a typical example of which is the Bebop 2 from Parrot SA, Paris, France: a drone equipped with a series of sensors (accelerometers, three-axis gyrometers, altimeter), a front camera capturing an image of the scene towards which the drone is directed, and a vertical-view camera capturing an image of the terrain overflown. Rotary-wing drones are provided with multiple rotors driven by respective motors that can be controlled in a differentiated manner in order to pilot the drone in attitude and speed. Documents WO 2010/061099 A2 and EP 2 364 757 A1 (Parrot SA) describe such a drone as well as the principle of piloting it by means of a station, generally on the ground, such as a multimedia telephone or tablet with an integrated touch screen and accelerometer, for example an iPhone-type cellular phone or an iPad-type multimedia tablet (registered trademarks). These stations incorporate the various control devices necessary for the detection of piloting commands and for the bidirectional exchange of data with the drone via a wireless link of the local Wi-Fi (IEEE 802.11) or Bluetooth type. They are also provided with a touch screen displaying the image captured by the drone's front camera, with a number of symbols superimposed on it allowing the activation of commands by a simple touch of the user's finger on this touch screen. The drone's front video camera can be used to capture sequences of images of a scene towards which the drone is turned. The user can thus use the drone in the same way as a camera or camcorder that, instead of being held in the hand, is carried by the drone. The images collected can be saved and then broadcast, posted on websites hosting video sequences, sent to other Internet users, shared on social networks, etc.
The front camera can be an orientable camera, making it possible to orient the line of sight, and therefore the field of the images transmitted with the video stream, in a controlled manner in a predetermined direction. A technique implemented in particular in the aforementioned Bebop 2 device and described in EP 2 933 775 A1 consists in using a high-definition wide-angle camera provided with a hemispherical-field lens of the fish-eye type covering a field of approximately 180°, and in windowing the raw image delivered by this sensor in real time, by software processing ensuring the selection of the useful pixels of the raw image in a capture zone determined as a function of a certain number of parameters, including pointing commands in the direction of a particular target chosen by the user or automatically followed by the drone. As a variant of, or even in addition to, controlling the line of sight of the camera by windowing software, it is also possible to mount the camera on a three-axis articulated support of the gimbal type with Cardan suspension, fitted with servomotors controlled according to gyrometric data and pointing commands. The invention obviously applies to any type of camera, orientable or not, and whatever its pointing mode. In a so-called tracking mode, the drone can be programmed to follow a moving target whose coordinates are known, so that during its flight the camera's line of sight is directed towards said target. This target is typically the station itself, carried by a user who may be in motion (for example practicing a sport in which he is moving: running, sliding, driving, etc.). In this mode, the drone is able to film the user's movements without the user having to act on the movements of the drone or on the line of sight of the camera. The drone following the target object adjusts its position and/or the position of the camera unit so that the target object is always filmed by the drone.
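The windowing described above can be sketched as follows. This is a minimal illustration, not Parrot's implementation: it assumes a simple model in which the capture-window offset is proportional to the virtual aiming angle, and the function name and pixels-per-radian scale are hypothetical.

```python
import numpy as np

def window_raw_image(raw, pan_rad, tilt_rad, out_w=640, out_h=480, px_per_rad=500.0):
    """Select the useful pixels of the raw wide-angle image: a capture
    window whose centre moves with the requested virtual aiming angle
    (hypothetical linear model: offset proportional to the angle)."""
    h, w = raw.shape[:2]
    cx = w / 2.0 + pan_rad * px_per_rad   # window centre shifts with pan
    cy = h / 2.0 + tilt_rad * px_per_rad  # and with tilt
    x0 = int(np.clip(cx - out_w / 2, 0, w - out_w))
    y0 = int(np.clip(cy - out_h / 2, 0, h - out_h))
    return raw[y0:y0 + out_h, x0:x0 + out_w]

# e.g. extract a 640x480 view panned 0.3 rad right, 0.1 rad up
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
view = window_raw_image(frame, pan_rad=0.3, tilt_rad=-0.1)
```

A real fish-eye pipeline would also dewarp the selected zone; the sketch only shows the windowing step.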
Since the drone is autonomous, that is to say its displacement is calculated by the drone and not piloted by a user, it determines its trajectory according to the movements of the target object and controls the camera unit so that it always points in the direction of the target object to be filmed. Thus, when the target moves, the drone is not only capable of following the target, but it also positions itself so as to orient the camera so that its line of sight points towards the target. For this purpose, the coordinates of the ground station, obtained by a GPS unit equipping it in a manner known per se, are communicated to the drone over the wireless link, and the drone can therefore adjust its movements so as to follow the target and keep the camera's line of sight directed towards it, so that the image remains framed on the subject. In the tracking mode, it is known for the drone to follow the movement of the target located in the field of view of the camera. Thus, when the target follows a trajectory that moves away from the drone, the drone detects the movement of the target and in turn moves towards it so that it remains in the field of view of the camera. In this process the drone positions itself behind the target so as to follow it at a distance, which yields images shot from behind the target. In addition, the angle of view of the images changes as the target moves: when the target makes a turn while the drone follows it, the drone ends up behind or in front of the target depending on the direction of the turn, so the angle of view of the camera relative to the path of the target becomes very different from that preceding the turn. An object of the invention is to propose a system allowing a drone, in the autonomous shooting mode of a moving target, firstly to maintain the same shooting angle of the target during the pursuit, and secondly to maintain the relative positioning of the drone around the target.
To this end, the invention provides a system for taking animated shots, comprising a drone provided with a camera and a ground station communicating with the drone by a wireless link, the camera being oriented along an aiming axis, the movements of the drone being defined by flight instructions applied to a propulsion unit or a set of propulsion units of the drone, the drone being able to move autonomously to carry out an animated shooting of a target moving with the ground station, the orientation of the line of sight being such that the target remains present in the successive images produced by said shooting. According to a characteristic of the invention, the system comprises control means configured to generate said flight instructions so as to keep substantially constant the angle between the line of sight of the camera and the direction of movement of the target upon activation of target tracking, and the ground station comprises means, controlled by at least one piloting means forming a target-tracking activation button, for switching the piloting mode of the drone alternately between:
• a mode activating the target-tracking system, able to activate means for activating the tracking of the target by the drone, and
• a deactivation mode, able to deactivate said means for activating the tracking of the target by the drone.
The following features can be taken together or separately:
- the ground station further comprises means able to detect signals emitted by at least one piloting means having a piloting function for the user, and means for transforming said detected signals into flight instructions and for transmitting these flight instructions to the drone when the activation mode is activated;
- the ground station also comprises: a screen; means for displaying on the screen an image taken by a camera on board the drone, said image comprising the target; and means for displaying on the screen a dynamic icon when the activation mode is activated, the icon comprising at least one representation of the target and a representation of the viewing angle of the on-board camera;
- the dynamic icon includes a first representation of the target in the target-tracking activation mode and a second representation of the target when the target is moving, showing the direction of movement of the target;
- the ground station further comprises means for locking the angle between the line of sight of the camera and the direction of movement of the target, the locking means forming an activation/deactivation button for alternately locking and unlocking the value of said angle;
- the system further comprises means for determining the speed vector of the target and the position of the target in a given frame of reference, and said control means are configured to generate said flight instructions from the determined speed vector, the determined position and a predetermined directional angle (αP), so as to keep the angle between the line of sight of the camera and the direction of the speed vector substantially at the value of said predetermined directional angle (αP);
- the means for activating the tracking of the target by the drone are further able to calculate the value of said predetermined directional angle (αP) from the displacement of the target during a predetermined period of time following the activation of the tracking of said target;
- the deactivation mode is a mode in which the piloting commands generate flight instructions from the determined position of the target;
- the ground station includes: (i) a touch screen displaying a plurality of touch zones, and (ii)
means for detecting contact signals emitted by the touch zones, at least one touch zone forming said at least one piloting means;
- at least one touch zone forms said means for locking said angle.
According to other optional features:
- said flight instructions generated by the control means can be generated from a servo loop on a setpoint of maintaining said predetermined directional angle;
- the control means are configured to generate said flight instructions so as further to control the movement of the drone at a predetermined distance between the drone and the target;
- the means for activating the tracking of the target by the drone are further able to calculate the value of said predetermined distance at the time of said activation;
- said flight instructions generated by the control means can also be generated from a servo loop on a setpoint of maintaining said predetermined distance;
- the control means are configured to generate said flight instructions so as further to control the movement of the drone so as to maintain a predetermined elevation angle, the predetermined elevation angle being the angle between the line of sight of the camera and a horizontal plane;
- the means for activating the tracking of the target by the drone are also able to calculate the value of said predetermined elevation angle at the time of said activation;
- said flight instructions generated by the control means can also be generated from a servo loop on a setpoint of maintaining said predetermined elevation angle;
- the direction of the line of sight of the camera is fixed relative to a main axis of the drone, the control means being configured to generate flight instructions so as to direct the line of sight of the camera towards the target when the drone is tracking the target;
- the direction of the line of sight of the camera can be modified with respect to a main axis of the drone by modification means, the modification means being configured to direct the line of sight of the camera at least partially towards the target when the drone is tracking the target;
- said means for determining the speed vector and the position of the target can operate by observation of the successive GPS geographic positions of the target, the given frame of reference being a terrestrial frame of reference;
- said means for determining the speed vector and the position of the target can operate by analysis of the images delivered by the drone's camera, the given frame of reference being a frame of reference linked to the drone;
- the analysis of the images delivered by the camera is preferably an analysis of the position of the target in the images generated successively by the drone's camera, the system including means for locating and tracking said position in the successive images.
We will now describe an example of implementation of the present invention, with reference to the accompanying drawings, in which the same references designate identical or functionally similar elements from one figure to another.
Figure 1 is a schematic overview of a shooting system comprising a drone and a ground station.
Figure 2 is a schematic top view of the system of Figure 1 according to the invention, the target and the drone each being represented in an initial position and in a subsequent position.
Figure 3 is a schematic representation of the means implemented in the system of Figure 1.
Figure 4 is a schematic side view of the system of Figure 1 according to the invention, the target and the drone each being represented in an initial position and in a subsequent position.
Figure 5 is an example showing a ground station in accordance with the invention upon activation of the activation means.
Figure 6 illustrates the ground station of Figure 5 when a displacement of the target is detected.
Figure 7 illustrates the ground station of Figure 6 when the value of the angle between the line of sight of the camera and the direction of movement of the target is locked.
The invention applies to a drone D, for example a quadricopter-type drone such as the Parrot Bebop 2 Drone, the various technical aspects of which are described in EP 2 364 757 A1, EP 2 613 213 A1, EP 2 450 862 A1 or EP 2 613 214 A1 mentioned above. The invention relates to a system 1 for taking animated shots, comprising a drone D provided with a camera C and a ground station S communicating with the drone D by a wireless link, shown in Figure 1. The drone D comprises a propulsion unit or a set of propulsion units comprising coplanar rotors whose motors are independently controlled by an integrated navigation and attitude control system. It is provided with a front-view camera C making it possible to obtain an image of the scene towards which the drone is oriented. The camera C is oriented along an aiming axis 3, as shown in Figure 2. Inertial sensors (accelerometers and gyrometers) make it possible to measure with a certain precision the angular speeds and the attitude angles of the drone, that is to say the Euler angles (pitch, roll and yaw) describing the inclination of the drone with respect to a horizontal plane of a fixed terrestrial frame of reference. An ultrasonic rangefinder placed under the drone D also provides a measurement of the altitude relative to the ground. The drone D is also provided with location means making it possible to determine its absolute position DP1, DP2 in space, in particular from data coming from a GPS receiver. The drone D is piloted by the ground station S, typically in the form of a remote control, for example of the model-aircraft remote-control type, a smartphone or a smart tablet, as shown for example in Figures 5, 6 and 7.
The ground station S comprises at least one screen E and piloting means, at least one piloting means being produced in the form of a button. The screen is capable of displaying the image captured by the front camera C. According to a particular embodiment, the screen E is a touch screen. According to this embodiment, the touch screen, superimposed on the displayed captured image, displays a certain number of touch zones provided with symbols forming piloting means, allowing the activation of piloting commands by a simple touch of a user's finger on the touch screen E. The ground station S further comprises means for detecting contact signals emitted by the piloting means, in particular by the touch zones. When the drone D is piloted by a remote-control station S, the user can be provided with immersion piloting glasses, often called FPV (First Person View) glasses. The station S is also provided with radio link means to the drone D, for example of the local Wi-Fi network type (IEEE 802.11), for the bidirectional exchange of data: from the drone D to the station S, in particular for the transmission of the image captured by the camera C and of flight data, and from the station S to the drone D for sending piloting commands. The system constituted by the drone D and the station S is configured to give the drone the ability to track and film a target autonomously. Typically, the target is constituted by the station S itself, carried by the user. According to the invention, the tracking of the target by the drone is carried out while maintaining the same angle of view of the target by the camera C of the drone D. The displacements of the drone D are defined by flight instructions generated by means for controlling the navigation system of the drone D, and applied to the propulsion unit or to the set of propulsion units of the drone D.
According to the invention, illustrated in Figures 2 and 3, to maintain the same angle of view of the target in the successive images, the system comprises control means 2 configured to generate said flight instructions so as to keep substantially constant the angle between the line of sight 3 of the camera C and the direction of movement of the target T upon activation of the tracking of the target T. In particular, according to one embodiment, the instructions of the control means 2 are generated so as to maintain a predetermined directional angle αP formed between the line of sight 3 of the camera C and the direction of the speed vector VT1, VT2 of the target T. This angle corresponds substantially to the angle of view of the target by the camera C of the drone D. In accordance with the invention, the ground station S comprises means, controlled by at least one piloting means forming a target-tracking activation button, for switching the piloting mode of the drone alternately between an activation mode of the target-tracking system, able to activate activation means 7 for the tracking of the target T by the drone D, and a deactivation mode, able to deactivate said activation means 7 for the tracking of the target T by the drone D. In this way, the user can, by means of the ground station S, activate the tracking of the target T so as to keep substantially constant the angle between the line of sight of the camera and the direction of movement of the target, and deactivate the tracking of the target T. Deactivating the tracking of the target T makes it possible, for example, to switch the piloting of the drone into a mode in which the piloting commands generate flight instructions from the position of the target determined in particular by said determination means 6. Thus, disabling the tracking of the target T makes it possible, for example, to pilot the drone in the so-called tracking mode.
Namely, the drone follows the target T by means of its coordinates, adjusting its position and/or the position of the camera unit so that the target is always filmed by the drone. In particular, when the target moves, the drone determines its trajectory according to the movements of the target and controls the camera so that it always points in the direction of the target to be filmed. In other words, the drone positions itself and directs the camera so that its line of sight points towards the target. According to a particular embodiment, the ground station S further comprises means able to detect signals emitted by at least one piloting means having a piloting function for the user, and means for transforming said detected signals into flight instructions and for transmitting these flight instructions to the drone when the activation mode is activated. According to this embodiment, the user can, via the piloting means of the ground station S having a piloting function, pilot the drone as he wishes despite the activation of the target-tracking system. In this way, the user's flight instructions have priority for piloting the drone over the flight instructions generated by the control means 2. The piloting means having a piloting function provide the elementary piloting functions. They include in particular the following flight instructions: climb, descend, pivot to the right, pivot to the left, move forward, move backward, translate to the right, translate to the left, as illustrated in Figures 5 to 7 with reference 6 on the screen E of the ground station S. According to a particular embodiment detailed below with reference to Figures 5 to 7, a value, for example fixed, of the predetermined directional angle αP, which is substantially maintained in the tracking mode, is determined at a given time during the tracking of the target T by the drone D.
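The priority rule above, whereby user instructions take precedence over the generated ones, can be sketched as a per-axis arbitration. The axis names and the None-means-idle convention are illustrative assumptions, not the patent's interface.

```python
def merge_commands(user, tracking):
    """Per-axis arbitration: any axis the user actuates overrides the
    tracking controller's instruction for that axis (sketch; a value
    of None means the corresponding control is untouched)."""
    return {axis: (user.get(axis) if user.get(axis) is not None
                   else tracking.get(axis, 0.0))
            for axis in set(user) | set(tracking)}

# user pushes yaw; pitch and gaz come from the tracking controller
cmd = merge_commands({"yaw": 0.4, "pitch": None},
                     {"yaw": -0.1, "pitch": 0.2, "gaz": 0.1})
```

In a real autopilot the arbitration would also handle timeouts and smooth hand-back, which the sketch omits.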
In other words, the drone D follows a displacement as a function of the displacement of the target T so that the current directional angle α is substantially equal to the value of the predetermined directional angle αP during the respective displacements of the target T and the drone D. Thus, the predetermined directional angle αP is the angle at which it is desired to carry out the continuous shooting of the target T. The value of the predetermined directional angle αP can, according to another embodiment, be chosen from a set of values prerecorded in the system 1. According to one embodiment, the control means 2 of the system are configured to generate said flight instructions from:
i. the speed vector VT1, VT2 of the target T,
ii. the position TP1, TP2 of the target, and
iii. a predetermined directional angle αP.
The orientation of the line of sight 3 of the drone's camera is such that the target T remains present in the successive images produced by said shooting. In a first embodiment, the direction of the aiming axis 3 of the camera C is fixed relative to a main axis of the drone. The control means 2 are therefore configured to generate flight instructions so as to position the main axis of the drone so that the line of sight 3 of the camera C is directed towards the target T during the tracking of the target T by the drone D. In a second embodiment, the direction of the aiming axis 3 of the camera C can be modified relative to a main axis of the drone by modification means. The modification means are configured to direct the line of sight 3 of the camera at least partially towards the target T during the tracking of the target by the drone D. The camera C is for example a fixed camera of the fish-eye type with a hemispherical field, as described for example in EP 2 933 775 A1 (Parrot).
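From the three inputs listed above, the drone's desired placement around the target can be read geometrically. The sketch below is an illustrative interpretation, assuming a flat horizontal frame and borrowing the distance and elevation setpoints described later in this text; all names, the left/right side choice and the frame conventions are assumptions.

```python
import math

def desired_drone_position(target_pos, vel_heading, alpha_p, d_p, beta_p):
    """Place the drone at distance d_p from the target, on the bearing
    making the directional angle alpha_p with the target's heading,
    at the altitude giving the elevation angle beta_p (sketch)."""
    sight_heading = vel_heading + alpha_p        # heading of the line of sight
    dh = d_p * math.cos(beta_p)                  # horizontal component of d_p
    x = target_pos[0] - dh * math.cos(sight_heading)
    y = target_pos[1] - dh * math.sin(sight_heading)
    z = target_pos[2] + d_p * math.sin(beta_p)   # drone above the target
    return (x, y, z)

# target at the origin heading along +x; alpha_p = 0 puts the drone
# directly behind it, at the same altitude when beta_p = 0
pos = desired_drone_position((0.0, 0.0, 0.0), 0.0, 0.0, 10.0, 0.0)
```

The controller then only has to steer the drone towards this moving setpoint as the target's position and heading evolve.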
With such a camera, the changes in the aiming axis 3 of the camera C are made not by physical movement of the camera, but by cropping and reprocessing the images taken by the camera as a function of a virtual aiming angle, determined relative to the main axis of the drone, given as a setpoint. The camera C can also be a mobile camera assembled with the drone, for example under the body of the drone; in this case, the modification means comprise motors for rotating the camera along at least one of the three axes, or even all three axes, in order to orient the line of sight of the camera so that the target remains present in the successive images produced by said shooting. The coordinates of the position TP1, TP2 of the target T make it possible to determine the orientation of the aiming axis 3 of the camera C so that the target T remains present in the successive images produced during the shooting. The coordinates of the line of sight 3 of the camera C are determined using the drone's sensors, which determine the position of the drone D. The coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T relative to the drone D make it possible to determine the current directional angle α between the line of sight 3 of the camera C and the direction of the speed vector VT1, VT2. The control means 2 are for example configured to generate said flight instructions from a servo loop on a setpoint of maintaining said predetermined directional angle αP, for example using a calculation unit provided with an execution program designed for this purpose. The principle of servo-control is to continuously measure the difference between the current value of the variable to be servo-controlled and the predetermined value to be reached, in order to determine the appropriate control instructions for reaching the predetermined value.
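The current directional angle α can be computed, for example, from the horizontal coordinates of the drone, the target and the target's speed vector. This is a minimal sketch with hypothetical names, not the patent's implementation.

```python
import math

def directional_angle(drone_pos, target_pos, target_vel):
    """Current directional angle: signed angle between the camera line
    of sight (drone towards target) and the target's speed vector, in
    radians.  Inputs are (x, y) pairs in a common horizontal frame."""
    sight = (target_pos[0] - drone_pos[0], target_pos[1] - drone_pos[1])
    a_sight = math.atan2(sight[1], sight[0])
    a_vel = math.atan2(target_vel[1], target_vel[0])
    # wrap the difference to (-pi, pi]
    return (a_sight - a_vel + math.pi) % (2 * math.pi) - math.pi
```

For example, a drone at the origin looking at a target at (10, 0) that is moving along +y sees a directional angle of -90°.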
Thus, the control means 2 first determine the current directional angle α, then give flight instructions for the drone D to move into a position DP2 in which the current directional angle α corresponds to the predetermined directional angle αP. The servo loop is continuously repeated by the control means 2 to maintain the value of the predetermined directional angle αP. Referring to Figure 2, the drone D is shown schematically in autonomous movement, equipped with the camera C, which produces a sequence of animated images of the target T. The target T has an initial position TP1 and the drone D has an initial position DP1, defined in the terrestrial frame of reference. The target T moves with a speed vector VT1 at position TP1 and VT2 at position TP2, the direction and the value of which change over time. In the initial position DP1, the axis of the camera C is directed towards the target T and forms, with the direction of the speed vector VT1, a directional angle which corresponds to the predetermined directional angle αP. In the same Figure 2, the target is shown in a subsequent position TP2 and the drone in a subsequent position DP2. The target T passes from the initial position TP1 to the subsequent position TP2. In the tracking mode, the drone D moves from the initial position DP1 to the subsequent position DP2, thanks to flight instructions generated by the control means. The flight instructions are defined so as to keep the same predetermined directional angle αP as that of the initial positions. Thus, in their respective subsequent positions TP2, DP2, it is observed that the directional angle formed between the line of sight 3 of the camera C and the direction of the speed vector VT2 is substantially identical to that defined in the initial positions TP1, DP1 of the target T and the drone D. Thus, thanks to the system 1 of the invention, the angle of view of the target T by the camera C remains the same despite the displacements of the target T and of the drone D.
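The servo loop just described can be sketched as a simple proportional correction. The gain and bound are illustrative choices and nothing here reflects Parrot's actual control law; the same loop shape also serves the distance and elevation-angle setpoints described below.

```python
import math

def wrap(err):
    """Wrap an angular error to (-pi, pi]."""
    return (err + math.pi) % (2 * math.pi) - math.pi

def servo_step(current, setpoint, gain=0.8, limit=1.0, angular=False):
    """One iteration of the servo loop: measure the difference between
    the current value and the predetermined value, and return a bounded
    corrective instruction proportional to that difference (sketch)."""
    err = setpoint - current
    if angular:
        err = wrap(err)
    return max(-limit, min(limit, gain * err))

# e.g. correct a current directional angle of 100 deg toward 90 deg
cmd = servo_step(math.radians(100), math.radians(90), angular=True)
```

Repeating `servo_step` at each control tick reproduces the continuously repeated loop described in the text; a production controller would typically add integral and derivative terms.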
As shown in Figures 2 and 3, and in order to allow the control means 2 to calculate the current directional angle α, the system 1 also comprises means 6 for determining the position TP1, TP2 and the speed vector VT1, VT2 of the target T in a given frame of reference. The determination means 6 transmit the coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T to the control means 2. The determination means 6 determine these coordinates repeatedly so as to transmit updated values to the control means 2. In a first embodiment, said means 6 for determining the speed vector VT1, VT2 and the position of the target T operate by observing the successive GPS geographic positions of the target T. The given frame of reference for determining the speed vector VT1, VT2 and the position TP1, TP2 of the target T is therefore a terrestrial frame of reference. The determination means 6 receive the successive GPS positions of the target T over time and can thus deduce from them the coordinates of the speed vector VT1, VT2 of the target T. The position of the target T is given by the GPS coordinates of the ground station. According to a first variant of this first embodiment, the determination means 6 are arranged in the drone D: the GPS positions of the target T are transmitted by the target T to the determination means 6 of the drone D. According to a second variant of this first embodiment, the determination means 6 are arranged in the ground station of the target T: here, the coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T are determined in the ground station, then transmitted to the drone D. In a second embodiment, said means 6 for determining the speed vector VT1, VT2 and the position of the target T operate by analyzing the images delivered by the camera C of the drone D. The frame of reference given here is a frame of reference linked to the drone D.
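Deducing the speed vector from successive GPS positions can be sketched by finite differencing, with a light exponential filter to smooth the roughly once-per-second fixes. The class name and smoothing factor are illustrative choices, not the patent's method.

```python
class SpeedEstimator:
    """Estimate the target speed vector by differencing successive
    GPS positions (x, y) in metres; an exponential filter smooths
    the noisy low-rate fixes (sketch)."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha        # 1.0 = no smoothing, raw differences
        self.prev = None
        self.vel = (0.0, 0.0)

    def update(self, pos, dt):
        """Feed one fix taken dt seconds after the previous one and
        return the current speed-vector estimate."""
        if self.prev is not None and dt > 0:
            raw = ((pos[0] - self.prev[0]) / dt,
                   (pos[1] - self.prev[1]) / dt)
            self.vel = (self.alpha * raw[0] + (1 - self.alpha) * self.vel[0],
                        self.alpha * raw[1] + (1 - self.alpha) * self.vel[1])
        self.prev = pos
        return self.vel
```

Converting latitude/longitude fixes to local metric (x, y) coordinates first is assumed and omitted here.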
In this case, the analysis of the images delivered by the camera C is an analysis of the position TP1, TP2 of the target T in the images successively generated by the camera C of the drone D. The determination means 6 comprise means for locating and tracking the position TP1, TP2 of the target T in the successive images. In this particular embodiment, the determination means 6 are located in the drone D. For this purpose, an image analysis program provided in the determination means 6 on board the drone D, or in a dedicated circuit, is configured to follow the displacement of the target T in the sequence of images generated by the camera C, and to deduce from it in which angular direction the target T is located relative to the line of sight 3 of the camera C. More precisely, this program is configured to locate and follow, in successive images, a visual pattern or a spot of color representative of the visual aspect of the target with respect to a background (for example a pattern developed by analysis of the gray levels of the image). For example, for the shooting of a target user practicing a snow sport, the background will generally be white and the color of the spot in the images will be that of the user's clothes. This approach also makes it possible to have angular position data of the user to be followed at a rate substantially faster than that at which the GPS coordinates are delivered (in general once per second), namely the frame rate, which is typically 30 frames per second for this type of application. In this second embodiment, the image analysis is associated with another measuring means which provides, at least in part, a geographical position TP1, TP2 of the target T. These data can in particular come from the GPS unit of the ground station, or from a barometric-type pressure sensor arranged in the ground station of the target T.
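Locating a colour spot against a contrasting background, as in the snow-sport example above, can be sketched with a simple per-channel threshold followed by a centroid. Real trackers are considerably more elaborate; the bounds and names here are illustrative.

```python
import numpy as np

def locate_spot(frame, lo, hi):
    """Locate a colour spot against the background: keep the pixels
    whose RGB values fall between the per-channel bounds lo and hi,
    and return the (x, y) centroid of the matching pixels, or None
    if nothing matches (sketch)."""
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# white "snow" background with one reddish pixel at x=3, y=4
frame = np.full((10, 10, 3), 255, dtype=np.uint8)
frame[4, 3] = (200, 0, 0)
```

Feeding the centroid of each successive frame to the controller gives the fast angular-position signal mentioned in the text.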
The means for determining the line of sight 3 of the camera C, being capable of indicating an angular position of the target T relative to the main axis of the drone, are thus supplemented by taking a geographical signal into account. The on-board electronics of the drone are capable of knowing, by cross-checking the geographical data against the angular detection data, the position of the target T. In this way, very precise coordinates of the position TP1, TP2 and of the speed vector VT1, VT2 of the target T are obtained.

According to a particular embodiment, the control means 2 are further configured to generate said flight instructions to control the movement of the drone D at a predetermined distance dP between the drone D and the target T. In other words, in this tracking mode, the distance d between the target T and the camera C is also maintained, in addition to maintaining the viewing angle of the camera C. The predetermined distance dP has a fixed value during the tracking. Thus, the perceived dimensions of the target T remain substantially the same during the shooting, with a constant focal length of the camera C.

The current distance d between the target T and the camera C is calculated by the control means 2 from the position TP1, TP2 of the target determined by the determination means 6 and from the position DP1, DP2 of the drone determined by its inertial sensors. The control means 2 are for example configured to generate the flight instructions from a servo loop on a command to maintain said predetermined distance dP. The process is similar to that of the servo loop on the command to maintain the predetermined directional angle aP. The control means 2 calculate the current distance d and generate instructions for moving the drone to a position DP2 whose distance from the target T corresponds to the predetermined distance dP. Thus, the distance between the subsequent positions TP2, DP2 of the drone D and of the target T in FIG.
3 is substantially identical to the distance between the initial positions TP1, DP1 of the drone D and of the target T.

According to a particular embodiment, shown in FIGS. 2 and 4, the control means 2 are further configured to generate flight instructions to control the movement of the drone D so as to maintain a predetermined elevation angle βP defined between the line of sight 3 of the camera C and the horizontal plane π. This predetermined elevation angle βP makes it possible to determine the relative altitude of the drone with respect to the target. By keeping the predetermined elevation angle βP constant, the drone D maintains its altitude relative to the target T. The horizontal plane π is defined with respect to the terrestrial reference frame, and can be defined at any altitude.

The value of the current elevation angle β is determined by the control means 2 as a function of the line of sight 3 of the camera C and of the horizontal plane π. The control means 2 are for example configured to generate the flight instructions from a servo loop on a command to maintain said predetermined elevation angle βP. The method is similar to that of the servo loop on the command to maintain the predetermined directional angle aP and that of the predetermined distance dP between the camera C and the target T. Thus, as shown in FIG. 4, the current elevation angle β relative to the horizontal plane π has the same value between the initial positions TP1, DP1 of the target T and of the drone D and the subsequent positions TP2, DP2.

In a particular embodiment, the control means are configured to generate flight instructions making it possible to modify the position of the drone so as to simultaneously bring the current directional angle a, the current distance d between the camera C and the target T and the current elevation angle β to the three corresponding predetermined values. In the embodiment shown in FIG.
2, the system 1 comprises activation means 7 for the tracking of the target by the drone. The tracking activation means 7 are able to control the activation and deactivation of the tracking of the target T. According to a particular embodiment, the activation means 7 are for example arranged in the ground station. According to another embodiment, the activation means 7 are arranged in the drone.

The ground station S comprises means, controlled by at least one piloting means forming a target-tracking activation button, for switching the piloting mode of the drone alternately between a mode of activation of the target tracking system, able to activate the activation means 7, and a deactivation mode, able to deactivate the activation means 7. According to a particular embodiment in which the ground station S comprises a touch screen E provided with touch zones, one of the touch zones forms the button for activating target tracking. Thus, when the user actuates the target-tracking activation button, the drone switches to the target tracking mode in which the control means 2 generate said flight instructions. According to a particular embodiment, the flight instructions are generated in particular from the determined speed vector VT1, VT2, the determined position TP1, TP2 and said predetermined directional angle aP.

According to a particular embodiment of the invention, the activation means 7 for the tracking of the target by the drone are capable of calculating the value of said predetermined directional angle following said activation. In other words, the activation means 7 define the value of the predetermined directional angle aP which will be maintained during the tracking of the target by the drone. In particular, the value of said directional angle aP is calculated from the displacement of the target T over a predetermined period of time consecutive to the activation of the tracking of said target.
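To make the geometry of the three held parameters concrete, here is an illustrative sketch (the coordinate conventions and function name are assumptions, not from the patent) computing the position the drone must occupy so that a camera aimed at the target satisfies the directional angle aP, the distance dP and the elevation angle βP simultaneously:

```python
import math

def drone_setpoint(target_xy, heading_deg, alpha_p_deg, d_p, beta_p_deg):
    """Return the (x, y, z) position the drone should reach so that the
    sight line toward the target keeps the directional angle alpha_p to
    the target's heading, the distance d_p and the elevation angle beta_p.
    Frame: x east, y north, z up, target at altitude 0; compass headings
    and angles in degrees."""
    azimuth = math.radians(heading_deg + alpha_p_deg)  # bearing of the sight line
    elevation = math.radians(beta_p_deg)
    horiz = d_p * math.cos(elevation)  # ground-plane projection of d_p
    x = target_xy[0] - horiz * math.sin(azimuth)  # drone sits behind the sight line
    y = target_xy[1] - horiz * math.cos(azimuth)
    z = d_p * math.sin(elevation)
    return x, y, z
```

The servo loops described above can then be seen as steering the drone toward this setpoint each time the determination means deliver updated target coordinates.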
According to another particular embodiment, the activation means 7 are for example configured so that the value of the predetermined directional angle is the current directional angle at the time of activation, in particular at the moment when the button is actuated. The activation means 7 calculate the current directional angle, for example, as a function of the coordinates of the speed vector and of the position of the target, which are transmitted to them by the determination means 6. Thus, the user positions the drone and the camera at a viewing angle he wishes to maintain, and activates the tracking mode by means of the activation means 7 so that the drone follows the target while maintaining the chosen viewing angle.

As illustrated in FIGS. 5, 6 and 7, the ground station S comprises a screen E and means for displaying on the screen E an image 8 taken by a camera on board the drone, said image comprising the target T. In addition, the ground station S comprises means for displaying on the screen E a dynamic icon 10, in particular when the activation mode is activated. The dynamic icon 10 includes at least one representation of the target 12 and a representation of the angle of view 14 of the on-board camera C.

As illustrated in FIG. 5, when the mode for activating target tracking is activated, the dynamic icon 10 includes a first representation of the target 12, for example a circle, so as to signal the position of the target within the camera's angle of view. In particular, this first representation makes it possible to illustrate the position of the target before the displacement of the target is known. FIG. 6 illustrates a second representation of the target 12′ in the dynamic icon 10. The representation of the target is modified so as to illustrate the direction of movement of the target T once it has been determined.
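A minimal sketch of this variant, assuming planar coordinates (the function name is hypothetical): the value frozen as the predetermined directional angle is the angle between the drone-to-target sight line and the target's speed vector at the instant the button is pressed.

```python
import math

def current_directional_angle(drone_xy, target_xy, target_speed):
    """Angle, in degrees in [-180, 180), between the camera's line of
    sight (drone toward target) and the target's speed vector: the value
    captured as the predetermined directional angle on activation."""
    sight = math.atan2(target_xy[1] - drone_xy[1],
                       target_xy[0] - drone_xy[0])
    motion = math.atan2(target_speed[1], target_speed[0])
    angle = math.degrees(motion - sight)
    # wrap into [-180, 180)
    return angle - 360.0 * math.floor((angle + 180.0) / 360.0)
```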
As seen previously, the direction of movement, in particular the value of the predetermined directional angle aP, is calculated after activation of the tracking of said target, during movement of the target T, in particular from the movement of the target T achieved over a predetermined period of time. After the direction of movement of the target has been determined, the dynamic icon 10 is modified so as to no longer display the first representation of the target 12 but to display a second representation of the target 12′ indicating the direction of movement of the target. The second representation of the target 12′ is for example a triangle, in order to illustrate the direction of movement of the target.

The ground station S further comprises locking means 16, 16′ for the angle between the line of sight 3 of the camera C and the direction of movement of the target T, the locking means forming an activation/deactivation button for alternately locking and unlocking the value of said angle. FIGS. 5 and 6 illustrate the locking means 16 in unlocked mode, for example in the form of an open padlock, so that the angle between the line of sight 3 of the camera C and the direction of movement of the target T can be modified. Thus, the user can modify, via the control means of the ground station, the position of the drone according to the desired angle. FIG. 7 illustrates the locking means 16′ in locked mode, for example in the form of a closed padlock, so that the angle between the line of sight 3 of the camera C and the direction of movement of the target T can no longer be modified and is therefore fixed. The angle thus fixed will be the angle maintained between the line of sight of the camera and the direction of movement of the target as long as the target tracking activation mode is activated. However, at any time, the user can unlock the locking means to modify this value.
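The padlock behaviour can be modelled as a toggle that freezes or releases the tracked angle value (a hypothetical sketch; the class and method names are not from the patent):

```python
class AngleLock:
    """Lock/unlock toggle for the angle between the line of sight and the
    target's direction of movement: while locked, updates are ignored and
    the frozen value is served; while unlocked, the current angle passes
    through unchanged."""
    def __init__(self):
        self.locked = False
        self._frozen = None

    def toggle(self, current_angle):
        """Press the padlock button: lock at current_angle, or unlock."""
        self.locked = not self.locked
        self._frozen = current_angle if self.locked else None

    def angle(self, current_angle):
        """Angle value actually used by the control means."""
        return self._frozen if self.locked else current_angle
```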
In addition, as seen above, the user can, despite the activated target-tracking mode, pilot the drone in the direction he desires: the flight instructions issued from the piloting means of the ground station have priority over the instructions generated by the control means 2. According to a particular embodiment in which the ground station S comprises a touch screen E having a plurality of touch zones, at least one touch zone of the touch screen forms the means for locking said angle.

According to a particular embodiment, the activation means 7 for the tracking of the target by the drone are also capable of calculating the value of the predetermined distance dP between the target and the drone at the time of said activation. Similarly, the activation means 7 are for example able to calculate the value of said predetermined elevation angle βP at the time of said activation. The predetermined values are transmitted to the control means 2, which store them in a memory. Thus, when the activation means 7 are actuated, the values of the three predetermined parameters are calculated by the activation means 7 and are then maintained during the tracking of the target by the control means 2 as long as the user has not deactivated the tracking of the target T.
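The priority rule described above, under which piloting commands from the ground station override the generated tracking instructions, can be sketched as a simple arbitration (the function name and command shapes are assumptions):

```python
def arbitrate(manual_cmd, tracking_cmd, tracking_active):
    """Select the flight instruction actually sent to the propulsion
    units: a manual piloting command always wins; otherwise the tracking
    instruction applies while tracking is active; otherwise hold."""
    if manual_cmd is not None:  # the user is actively piloting
        return manual_cmd
    if tracking_active and tracking_cmd is not None:
        return tracking_cmd
    return (0.0, 0.0)  # no command: hold position
```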
Claims (10)

1. Animated shooting system, comprising a drone (D) provided with a camera (C) and a ground station (S) communicating by a wireless link with the drone, the camera (C) being oriented along a line of sight (3), the displacements of the drone (D) being defined by flight instructions applied to a propulsion unit or a set of propulsion units of the drone (D), the drone (D) being able to move autonomously to carry out an animated shooting of a target (T) moving with the ground station (S), the orientation of the line of sight (3) being such that the target (T) remains present in the successive images produced by said shooting, the system being characterized in that it comprises:
- control means (2) configured to generate said flight instructions so as to maintain substantially constant the angle between the line of sight (3) of the camera (C) and the direction of movement of the target (T) upon activation of the tracking of the target (T),
and in that the ground station comprises:
- means, controlled by at least one piloting means forming a target-tracking activation button, for switching the drone piloting mode alternately between:
• a mode of activation of the target tracking system, capable of activating activation means (7) for tracking the target (T) by the drone (D), and
• a deactivation mode, capable of deactivating said activation means (7) for tracking the target (T) by the drone (D).

2. System according to claim 1, characterized in that the ground station further comprises means capable of detecting signals transmitted by at least one control means having a piloting function operated by the user, means for transforming said detected signals into flight instructions, and means for transmitting these flight instructions to the drone when the activation mode is activated.

3. System according to claim 1 or claim 2, characterized in that the ground station further comprises:
- a screen (E),
- means for displaying on the screen an image taken by a camera on board the drone, said image comprising the target, and
- means for displaying on the screen a dynamic icon (10) when the activation mode is activated, the icon comprising at least one representation of the target (12, 12′) and a representation of the angle of view (14) of the on-board camera.

4. System according to claim 3, characterized in that the dynamic icon comprises a first representation of the target (12) in the target tracking activation mode and a second representation of the target (12′) at the time of target movement, showing the direction of movement of the target.

5. System according to any one of the preceding claims, characterized in that the ground station further comprises locking means (16, 16′) for the angle between the line of sight (3) of the camera (C) and the direction of movement of the target (T), the locking means forming an activation/deactivation button for alternately locking and unlocking the value of said angle.

6. System according to any one of the preceding claims, characterized in that the system further comprises means (6) for determining the speed vector (VT1, VT2) of the target (T) and the position (TP1, TP2) of the target (T) in a given frame of reference, and said control means (2) are configured to generate said flight instructions from:
i. the determined speed vector (VT1, VT2),
ii. the determined position (TP1, TP2), and
iii. a predetermined directional angle (aP),
so as to maintain the angle between the line of sight (3) of the camera (C) and the direction of the speed vector (VT1, VT2) substantially at the value of said predetermined directional angle (aP).

7. System according to the preceding claim, characterized in that the activation means (7) for tracking the target (T) by the drone (D) are further capable of calculating the value of said predetermined directional angle (aP) from the displacement of the target (T) over a predetermined period of time consecutive to the activation of the tracking of said target.

8. System according to any one of the preceding claims, characterized in that the deactivation mode is a mode in which the piloting commands generate flight instructions from the determined position of the target.

9. System according to any one of the preceding claims, characterized in that the ground station comprises:
- a touch screen displaying a plurality of touch zones, and
- means for detecting contact signals emitted by the touch zones, at least one touch zone forming said at least one control means.

10. System according to claim 9 taken in combination with claim 5, characterized in that at least one touch zone forms said means for locking said angle.
Patent family:
US20180143636A1, published 2018-05-24
CN108021145A, published 2018-05-11
FR3058238B1, published 2019-01-25
EP3316068B1, published 2019-03-06
EP3316068A1, published 2018-05-02
Legal status:
2017-10-19: PLFP, fee payment (year of fee payment: 2)
2018-05-04: PLSC, search report ready (effective date: 2018-05-04)
2018-10-19: PLFP, fee payment (year of fee payment: 3)
2020-10-16: ST, notification of lapse (effective date: 2020-09-06)
Priority:
FR1660538A, filed 2016-10-28, granted as FR3058238B1: SELF-CONTAINED DRONE-BASED SHOOTING SYSTEM WITH TARGET TRACKING AND HOLDING OF THE TARGET SHOOTING ANGLE
EP17198785.2A, filed 2017-10-27, granted as EP3316068B1
US 15/798,245, filed 2017-10-30, published as US20180143636A1
CN201711038302.8, filed 2017-10-30, published as CN108021145A