Methods for agronomic and agricultural monitoring using unmanned aerial systems
Patent abstract:
METHODS FOR AGRONOMIC AND AGRICULTURAL MONITORING USING UNMANNED AERIAL SYSTEMS. A method for agronomic and agricultural monitoring that includes designating an area for imaging, determining a flight path above the designated area, operating an unmanned aerial vehicle (UAV) along the flight path, acquiring images of the area using a camera system attached to the UAV, and processing the acquired images.

Publication number: BR112017003678A2
Application number: BR112017003678-9
Filing date: 2015-08-20
Publication date: 2020-10-27
Inventors: Doug Sauder; Justin L. Koch; Troy L. Plattner; Phil Baurer
Applicant: The Climate Corporation
Patent description:
[001] This application claims the benefit under 35 U.S.C. §119(e) of Provisional Application No. 62/040,859, filed on August 22, 2014, and Provisional Application No. 62/046,438, filed on September 5, 2014, the complete contents of which are incorporated by reference in their entirety in this document.

FIELD

[002] This disclosure relates, in general, to agronomic and agricultural monitoring and, more specifically, to methods for agronomic and agricultural monitoring using unmanned aerial systems, or drones.

BACKGROUND

[003] Unmanned aerial vehicles (UAVs), sometimes called drones, are remotely piloted or self-piloted aircraft that can carry sensors, communications equipment, cameras or other payloads. UAVs have been used for military reconnaissance and intelligence surveys, as well as for capturing terrestrial images for civilian applications. Although UAVs have also been used for agricultural monitoring, such systems are not entirely satisfactory. A UAV optimized for agricultural use is needed.

[004] This Background section is intended to introduce the reader to various aspects of the art that may be related to the various aspects of the present disclosure, which are described and/or claimed below. This discussion is believed to be useful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Consequently, it should be understood that these statements are to be read in this light, and not as admissions of prior art.

BRIEF SUMMARY

[005] One aspect is a method for agronomic and agricultural monitoring. The method includes designating an area for imaging, determining a flight path above the designated area, operating an unmanned aerial vehicle (UAV) along the flight path, acquiring images of the area using a camera system attached to the UAV, and processing the acquired images.

[006] Another aspect is a system for agronomic and agricultural monitoring. The system includes a computing device configured to designate an area for imaging and to determine a flight path above the designated area. The system also includes an unmanned aerial vehicle communicatively coupled to the computing device and having a camera system, and the unmanned aerial vehicle is configured to travel along the flight path, acquire images of the area using the camera system, and process the acquired images.

[007] There exist several refinements of the features noted in relation to the aspects mentioned above. Additional features can also be incorporated into the aspects mentioned above. These refinements and additional features can exist individually or in any combination. For example, the various features discussed below in relation to any of the illustrated embodiments can be incorporated into any of the aspects described above, either alone or in any combination.

BRIEF DESCRIPTION OF THE DRAWINGS

[008] Figure 1 is a block diagram of a system for use in agronomic and agricultural monitoring.

[009] Figure 2 is a flow chart of a method for operating an unmanned aerial vehicle for agronomic and agricultural monitoring that can be used with the system shown in Figure 1.

[010] Figure 3 is a flow chart of a mission planning stage of the method shown in Figure 2.

[011] Figure 4 is a flow chart of a flight execution stage of the method shown in Figure 2.
[012] Figure 5 is a flow chart of a post-flight data transfer/processing stage of the method shown in Figure 2.

[013] Figures 6 through 11 are examples of maps that can be created by the system shown in Figure 1.

[014] The corresponding reference characters indicate corresponding parts throughout the several views of the Figures.

DETAILED DESCRIPTION

[015] Referring initially to Figure 1, an example of an unmanned aerial system of the present disclosure is indicated generally at 100. The unmanned aerial system 100 includes a plurality of components that include an unmanned aerial vehicle (UAV) 110, a cloud 120, a graphical user interface (GUI) 130 (for example, deployed using a tablet-type computing device), a base station 140, a personal computer 150 and a user input device (UID) 160. The components of system 100 are described in more detail below.

[016] In this embodiment, the components of system 100 are communicatively coupled to one another through one or more means of communication (for example, direct cable connections, cloud computing networks, the Internet, local area networks (LAN), wireless local area networks (WLAN) (for example, the 802.11ac standard) or wide area networks (WAN)). Consequently, components of system 100 may include a wireless transmitter and receiver (for example, 118, 135, 143, 151, 165) and/or a cellular module (for example, 113, 131, 142, 162) to facilitate wireless communication between components. Additionally, one or more of the components (110, 120, 130, 140, 150 and 160) can include a global positioning system (GPS) therein (for example, 111, 133, 145, 153 and 161) to determine a position of the associated component, normalize GPS data between components and enable triangulation calculations for position determinations.

[017] In this embodiment, the unmanned aerial vehicle 110 is a remotely piloted or self-piloted aircraft that may have the capacity to hover (for example, a helicopter or ultralight) or may have a fixed wing. An example of a "quadcopter" hover-type UAV is described in Patent Application Publication No. US 2013/0176423, which is incorporated by reference in its entirety. In the systems and methods described in this document, UAV 110 assists agricultural operations and farming by mapping and monitoring agricultural evolution and status.

[019] In the exemplary embodiment, UAV 110 also includes a camera system 117 mounted on its underside to acquire images during flight. The camera system 117 can hang from UAV 110 under gravity using a set of gimbals that allow rotation about a plurality of axes. The gimbals can include dampers that damp reactions to changes in the orientation of UAV 110 during flight. Alternatively, the camera system 117 can be mounted directly on UAV 110 and controlled by movement actuators. The camera system 117 can include a still camera, a video camera, a thermal imaging camera and/or a near-infrared (NIR) camera to capture normalized difference vegetation index (NDVI) images. Alternatively, camera system 117 may include any image acquisition device that enables system 100 to function as described in this document.
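For reference, the NDVI values mentioned in paragraph [019] are computed per pixel from the near-infrared and red reflectance bands. Below is a minimal sketch in Python, assuming the two bands are available as co-registered arrays; the function name and inputs are illustrative, not part of the disclosure.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Compute the normalized difference vegetation index, pixel by pixel.

    NDVI = (NIR - Red) / (NIR + Red); values range from -1 to 1, with
    healthy vegetation typically scoring well above bare soil.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Guard against division by zero on pixels where both bands are dark.
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out
```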
[020] The camera system 117 and its positioning are controlled by an onboard central processing unit (CPU) and memory storage unit 114. The CPU may include a microprocessor. The CPU and memory storage unit 114 facilitates arithmetic, logic and input/output operations of the onboard CPU. The CPU and memory storage unit 114 can also assist and/or control other aspects of UAV 110, as discussed in this document. For example, in some embodiments, the CPU and memory storage unit 114 receives information from the IMUs 112 during in-flight operations to assist with aiming of the camera system 117 and/or to detect whether or not conditions (for example, light, speed, angle, etc.) are suitable for capturing visible and useful images. UAV 110 may also include one or more sensors (for example, an incident light sensor) coupled to the CPU and memory storage unit 114 to monitor ambient conditions.

[021] In the exemplary embodiment, base station 140 includes a drone communication system 141 that comprises a radio transmitter and receiver (for example, 900 MHz or 1.2 GHz) to facilitate communication with UAV 110 while in flight. Base station 140 also includes a GPS system 145 and a CPU and memory storage unit 144 similar to those discussed above in relation to UAV 110.

[022] In this embodiment, the personal computer (PC) 150 is a computing device, such as a laptop or desktop computer. The PC 150 includes a CPU and memory storage unit 153 and also includes software for mapping and processing spatial agricultural data (for example, Farm Works Software or SST Summit Professional) installed on it. In one embodiment, PC 150 can serve as a user interface for system 100.

[023] System 100 also includes a graphical user interface (GUI) 130 that serves as a portable user interface. GUI 130 can be deployed using a tablet-type computer or other portable computing device that allows the user or operator to control system 100. In particular, GUI 130 can allow the user to designate flight paths of UAV 110 and/or identify aerial obstacles that may obstruct the flight path of UAV 110. In this embodiment, GUI 130 includes an application ("app") or visualization software 136 that allows the user to remotely access spatial maps that include crop, yield and/or nitrogen content data created from the images taken by UAV 110. For example, GUI 130 may include software similar to that described in International Patent Application Publication No. WO 2014/026183, which is incorporated by reference in its entirety. Consequently, GUI 130 includes a CPU and memory storage unit 132, and is in communication with the other components of system 100.

[024] System 100 also includes the user input device (UID) 160 (for example, a controller or keyboard) that allows the user or operator to control system 100. In particular, UID 160 can allow the user to designate UAV 110 flight paths and/or identify aerial obstacles that may obstruct the UAV 110 flight path. In this embodiment, UID 160 includes a display 164 that allows the user to remotely view images from the camera system 117. Consequently, UID 160 includes a CPU and memory storage unit 163, and is in communication with the other components of system 100. In one embodiment, UID 160 can allow the user or operator to control UAV 110 while viewing images from camera system 117 on the touchscreen display 134 of GUI 130.
[025] In this embodiment, cloud 120 is a data storage, image processing and computing hub for the unmanned aerial system 100. More specifically, cloud 120 is a set of interconnected computers and servers connected through a communication network to allow distributed computing. For example, cloud 120 can serve as a remote data storage center. The cellular module 113 mounted on UAV 110 allows photographs to be transferred to cloud 120 while UAV 110 is in flight. Cloud 120 can receive and store current and forecast weather information that includes air temperature and amount of precipitation. Cloud 120 can also communicate with one or more recommendation and analysis services that provide recommendations and/or analyses based on image data acquired using UAV 110.

[026] In one embodiment, UAV 110 transmits images taken with the camera system 117 during flight to other components (for example, 130, 140, 150, 160) for storage and/or processing. Images and metadata transferred from UAV 110 to cloud 120 can be orthorectified and stitched together to create a single contiguous image. Examples of orthorectifying oblique images to a single view are described, for example, in U.S. Patent No. 8,512,266, which is incorporated by reference in its entirety.

[027] With reference to Figure 2, an example of a method for operating an unmanned aerial system, such as system 100, is indicated generally at 200. In this embodiment, method 200 includes three stages: a mission planning stage 201, a flight execution stage 202 and a post-flight data transfer/processing stage 203. The three stages of method 200 are described in more detail below.

[028] With reference to Figure 3, an example of the mission planning stage of method 200 is indicated generally at 300. The mission planning stage 300 of method 200 includes a sequence of actions performed by the user and by system 100. In Figure 3, actions performed by the user are shown in circles and actions performed by system 100 are shown in squares.

[029] In the example method 200, following activation of system 100, the user first indicates the flight area 301 for mapping. In one embodiment, the user outlines the flight area to be covered by UAV 110 in GUI 130 or UID 160 using map data from Google Maps or other GPS software.

[030] In one embodiment, system 100 analyzes the user's flight area input 301, calculates the possible flight path (or flight paths) to generate a contiguous image of the flight area, and provides the user with the possible UAV flight paths 302. System 100 can also identify potential obstacles (for example, telephone poles and/or power lines) in the flight path based on previous flights and/or user input, and can adjust the flight path accordingly. In another embodiment, system 100 provides the user with multiple possible UAV flight paths at different elevations and speeds, depending on the desired image resolution and flight duration. For example, and by way of illustration, system 100 could provide the user with two optional UAV flight paths in GUI 130 or UID 160, as provided in Table 1 below:

Table 1 (flight path options; elevations in feet)

[031] In the example method 200, once presented with the possible flight paths, the user selects a desired flight path 303. In one embodiment, the user can request that system 100 provide additional flight path options by entering specific parameters for the flight path (i.e., elevation, image resolution, duration, etc.).
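The flight path calculation of paragraph [030] is not detailed in the disclosure; one common approach to full coverage of a rectangular area is a serpentine ("lawnmower") pattern whose pass spacing follows from the flight elevation and the camera's field of view. A hedged sketch, in which the parameter names and the default overlap are illustrative assumptions:

```python
import math

def serpentine_waypoints(width_m, height_m, altitude_m, hfov_deg, overlap=0.3):
    """Generate (x, y, z) waypoints covering a width_m x height_m rectangle.

    The ground swath imaged per pass is determined by the altitude and the
    camera's horizontal field of view; adjacent passes overlap by `overlap`.
    """
    swath = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    spacing = swath * (1 - overlap)  # distance between parallel passes
    n_passes = max(1, math.ceil(width_m / spacing) + 1)
    waypoints = []
    for i in range(n_passes):
        x = min(i * spacing, width_m)
        # Alternate the flight direction on each pass.
        ys = (0.0, height_m) if i % 2 == 0 else (height_m, 0.0)
        waypoints.extend([(x, ys[0], altitude_m), (x, ys[1], altitude_m)])
    return waypoints
```

Lowering the altitude narrows the swath, which increases the number of passes and the flight duration while improving ground resolution; this is the same elevation/resolution/duration trade-off presented to the user in the flight path options above.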
[032] In the example method 200, once the user selects a desired flight path, system 100 provides the user with a selection of possible image types to be taken by the camera system 117. In one embodiment, the user has the option to select 305 from still photos, thermal images, near-infrared (NIR) images and visible light, thermal and/or NIR imaging. For example, GUI 130 or UID 160 can provide the user with a list that allows the user to select the desired image type 304 (for example, by displaying a checkbox or other selection mechanism). Based on the types of images selected by the user, in some embodiments, GUI 130 or UID 160 determines an optimized elevation and/or estimates a flight time.

[033] In the example method 200, system 100 provides the user with the flight details 306 in GUI 130 or UID 160. In one embodiment, system 100 can provide the user with the route, elevation and/or duration of the UAV flight, as well as the expected resolution of the images to be taken in the selected image type. In another embodiment, before generating flight paths, system 100 determines whether flight obstacles (for example, telephone poles or power lines) have previously been identified in the applicable flight area. In yet another embodiment, the user identifies flight obstacles 307 in GUI 130 or UID 160 using satellite imagery from Google Earth or another image provider. Specifically, in one embodiment, GUI 130 or UID 160 enables the user to draw a border around any flight obstacle and enter the approximate height of the obstacle to prevent the UAV from entering the obstructed area. Using the input from the user, system 100 recalculates the flight path to avoid the obstacles.

[034] Referring to Figure 4, an example of the flight execution stage of method 200 is indicated generally at 400. The flight execution stage 400 of method 200 includes a sequence of actions performed by the user and by system 100.

[035] In the example method 200, the flight execution stage 400 occurs after the mission planning stage 201. In one embodiment, the user directs system 100 to begin the flight execution stage 400 using GUI 130 or UID 160. In another embodiment, the flight execution stage 400 begins automatically following the identification of obstacles, if present, in the UAV flight path.

[036] In the example method 200, the flight execution stage 400 begins with system 100 comparing the flight duration and elevation with a battery life 401 of UAV 110. When the battery charge level is insufficient, system 100 provides an indication to the user (for example, in GUI 130 or UID 160) that charging is required. In addition to the energy check, system 100 can perform other preflight system checks.

[037] In the example method 200, following confirmation by system 100 that UAV 110 is capable and ready to operate, the user is alerted by system 100, through GUI 130 or UID 160, to start the flight 403. In one embodiment, the user presses a "start flight" or "depart" button in GUI 130 or UID 160. Upon flight initiation, system 100 starts the UAV flight and continuously monitors the UAV systems 404. In one embodiment, UAV 110 performs one or more test maneuvers. For example, UAV 110 can take off vertically from base station 140 and perform simple maneuvers (for example, moving forward and backward, side to side, up and down, etc.) to verify its operation and maneuvering capabilities. In the event of a UAV or system malfunction at any time during the flight, system 100 and the user have the ability to end the flight prematurely 405. In such a case, the selected flight path is terminated and UAV 110 returns to base station 140 and/or attempts to return to the ground without damaging UAV 110.
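Paragraph [036] compares the planned flight against the battery state before takeoff. A minimal sketch of such a preflight energy check, assuming a simple constant-power endurance model; the names, units and reserve fraction are illustrative assumptions:

```python
def preflight_energy_ok(path_length_m, cruise_speed_mps,
                        battery_capacity_wh, cruise_power_w,
                        reserve_fraction=0.25):
    """Return True if the battery can cover the planned flight plus a reserve.

    Flight time is estimated from the path length and cruise speed, and
    endurance from the battery energy and the average cruise power draw.
    """
    flight_time_s = path_length_m / cruise_speed_mps
    endurance_s = (battery_capacity_wh * 3600.0) / cruise_power_w
    return flight_time_s <= endurance_s * (1.0 - reserve_fraction)
```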
[038] In the example method 200, during the flight of UAV 110, the camera system 117 takes photos or videos of the selected flight area and stores the images in the onboard CPU and memory storage unit 114. In one embodiment, the onboard CPU and memory storage unit 114 orthorectifies the images to a single view and identifies areas with poor image quality.

[039] In some embodiments, UAV 110 acquires an initial set of images and then returns to one or more target areas to acquire additional images at a higher resolution after reviewing the initial image maps.

[040] In another embodiment, UAV 110 automatically flies a "Back and Check" route 407 that follows a first pass over the selected flight area to take additional high-resolution photos of target areas (for example, areas with low nitrogen levels) shown in the NDVI image. In yet another embodiment, additional high-resolution photos of target areas are taken to eliminate crop shadows. In such embodiments, to reduce the processing time, the additional passes can be limited to the identified target areas.

[041] In the example method 200, following completion of the UAV flight path, the UAV lands (for example, at base station 140) to end the flight 408.

[042] In the example method 200, upon completion of the flight execution stage 400, the post-flight data transfer/processing stage 203 begins. Alternatively, data transfer/processing can take place while UAV 110 remains airborne, so that the data transfer/processing stage 203 overlaps the flight execution stage 400. Accordingly, transfer and processing of the images obtained by UAV 110 can occur in real time as the data is captured, or shortly thereafter (for example, within 10 minutes of data capture). In one embodiment, low-quality images are continually transmitted to GUI 130 or UID 160 during the flight to keep the user informed of the flight status.

[043] Transferring the data and images captured by UAV 110 can be done through wireless and/or cellular communication between the components of system 100. The transfer is typically directed to the components where the processing will be performed.

[044] Processing the data and images can include orthorectification and stitching of the aerial images into a single map of the contiguous area. Notably, the data and image processing can be performed using any component of system 100. For example, processing can be performed on board UAV 110 and the processed images can then be transferred to base station 140, GUI 130 and/or UID 160.

[045] In one embodiment, the acquired images are superimposed (for example, with 50% transparency) on Google Maps tiles or aerial geographic images and displayed to the user. Alternatively, the aerial images can be processed and displayed alongside Google Maps tiles or the aerial geographic image in a locked side-by-side orientation, so that panning and/or zooming one image pans and/or zooms the other image by the same amount. The central points of the images in the side-by-side orientation can be indicated by an icon (for example, crosshairs), similar to the techniques described in International Patent Application Publication No. WO 2014/026183. In another embodiment, a sequence of aerial images taken at different times during a growing season (for example, daily, weekly, monthly, etc.) is processed into an animation that steps through the images in sequence. In some embodiments, the animation is activated automatically, displaying the images at defined periods of time; in other embodiments, the next image in the sequence is displayed in response to a user input in a graphical interface (for example, selection using an arrow icon or by dragging a slider icon along a scale). The animation can be superimposed over Google Maps tiles or aerial geographic images. The images can be, for example, NDVI images, aerial maps and/or emergence maps.
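The 50% transparency overlay described in paragraph [045] amounts to alpha blending the acquired image over the base map. A sketch assuming both inputs are already co-registered RGB arrays of the same size; the names are illustrative:

```python
import numpy as np

def overlay(base_rgb: np.ndarray, aerial_rgb: np.ndarray,
            alpha: float = 0.5) -> np.ndarray:
    """Superimpose an aerial image on a base map with the given transparency.

    alpha=0.5 corresponds to the 50% transparency overlay described above.
    Both inputs are uint8 arrays of shape (H, W, 3) covering the same extent.
    """
    blended = ((1.0 - alpha) * base_rgb.astype(np.float64)
               + alpha * aerial_rgb.astype(np.float64))
    return blended.round().astype(np.uint8)
```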
[046] Processing the images can also include filtering the images using software to filter out subject matter and shadows that can affect image quality. Filtering creates a color contrast between the crop and the soil, which can be difficult to distinguish from each other in the unfiltered image. For example, in one embodiment, image processing removes everything in the aerial photograph below a reflectance threshold or color value.

[047] In one example, an expected greenness density is identified based on a planted population and/or a plant development stage in the imaged area. The planted population can be determined from an as-planted map, and the development stage can be determined, for example, using a hybrid-specific chart that relates the number of growing degree days to an expected development stage. Once the expected greenness density is identified, everything in an image that is above the expected greenness density can be represented in shades of green, and everything in the image that is below the expected greenness density can be represented in shades of red.

[048] System 100 can also use the imagery data to generate emergence maps in which a number of plants per area is calculated and areas devoid of plants or desirable vegetation in the images are marked as "gaps." Gaps are areas where plants or vegetation failed to grow or were not initially planted. In one embodiment, system 100 correlates the gap data with initial planting data (for example, the as-planted map) to remove any gaps that occurred due to no initial planting, which leaves only true gaps that are indicative of areas where seeds were planted but did not emerge. This processing can be applied to NDVI image data or other image data acquired by the camera system 117.

[049] In some embodiments, spatial application decisions can be made automatically based on the images acquired by UAV 110. For example, the levels of an NDVI map can be associated with a post-planting application (for example, top-dress fertilization or crop spraying) to generate an application map based on the NDVI map. The generated application map can be displayed to the user to allow the user to reject, modify or accept the application map. In some embodiments, the generated application map is passed on to a service provider (for example, a worker or contractor) with instructions for applying the application map.

[050] The data acquired by UAV 110 can also be used to produce general agronomic recommendations. For example, if an NDVI map generated using system 100 shows a nitrogen level below a threshold, system 100 may recommend that nitrogen be applied as a top dressing to increase nitrogen levels. The threshold can be determined based on a crop development stage, for example. In another example, if a plant health map indicates that the area of healthy plants is below a threshold, system 100 may recommend that nitrogen be applied as a top dressing. In yet another example, if an emergence map shows an emerged area below a threshold before a critical time in development, system 100 may recommend that the field be replanted.
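A hedged sketch of the kind of threshold-based recommendation logic described in paragraphs [049] and [050]; the growth stage names, threshold values and function names are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Illustrative stage-dependent NDVI thresholds (not values from the patent).
NDVI_THRESHOLDS = {"V6": 0.45, "V10": 0.55, "VT": 0.65}

def nitrogen_recommendation(ndvi_map: np.ndarray, growth_stage: str) -> str:
    """Recommend a top-dress nitrogen application when mean NDVI is low.

    The threshold is selected by crop development stage, mirroring the
    stage-dependent threshold described in the text.
    """
    threshold = NDVI_THRESHOLDS[growth_stage]
    if np.nanmean(ndvi_map) < threshold:
        return "recommend top-dress nitrogen application"
    return "no action recommended"
```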
[052] The data transfer/processing stage 500 includes, in general, the following eight stages: the stage of obtaining an NDVI image (or NDVI images) from the flight 501; the stage of converting the NDVI image (or NDVI images) into a map 502; the stage of filtering non-crop matter 503; the stage of identifying crop rows 504; the stage of dividing individual plants 505; the stage of identifying individual plant features 506; the stage of estimating crop yield potential 507; and the stage of generating a report 508.

[053] In the example method 200, the post-flight data transfer/processing stage 500 begins with system 100 obtaining an NDVI image (or NDVI images) from the flight 501. Again, data transfer/processing can occur while UAV 110 remains airborne, so that the data transfer/processing stage 500 overlaps the flight execution stage 400. The data transfer/processing stage 500 can occur in real time as the data is captured by UAV 110, or shortly thereafter (for example, within 10 minutes of data capture).

[054] In this example, the images obtained from the flight execution are converted by system 100 into a map 502 (for example, a bitmap, an emergence map, etc.). In one embodiment, an expected greenness density is established based on a planted population and/or a plant development stage in the imaged area. Once the expected greenness density is identified, pixels in each image that are above the expected greenness density are represented in white in the generated map, and pixels that are below the expected greenness density are represented in black. Consequently, a map is created with unitary white spaces 601 that roughly correlate with the location and area of each plant in the images. An exemplary map 600 is provided in Figure 6. In Figure 6, individual plants in the planted population (identified using the expected greenness density) are represented as white spaces 601. Surrounding features 602 (for example, surrounding soil, weeds, etc.) are slightly shaded. Prior to further processing, the map may include single white spaces 601 that include multiple plants (for example, as shown on the right side of map 600) and/or white spaces 601 that are weeds or other non-crop plant material (for example, as shown on the lower left side of map 600).

[055] The stage of filtering non-crop matter 503 in the example method includes identifying "anomalies" in the generated map. "Anomalies," as used in this document, refers to areas on the generated map that do not correspond to plants of the planted crop.

[056] The stage of identifying crop rows 504 in the example method includes marking a centroid 801 for each remaining white space 601. Figure 8 shows an example map 600 with a row 802 of white spaces 601 marked with centroids 801. Figure 9 shows another example map 600 with two rows 901 and 902 of white spaces 601 marked with centroids 801. System 100 identifies the rows (for example, 802, 901, 902) by calculating, approximating and assigning a best-fit line for the rows based on the centroid positions 801. Specifically, system 100 uses a row spacing distance 805, which can be a standard value (for example, 76.20 centimeters (30 inches)) or a value entered by the user, to identify the approximate locations of parallel rows through the white spaces 601 and/or centroids 801. In other embodiments, stage 504 can overlap or occur at the same time as stage 503 to assist system 100 in identifying anomalies.
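The best-fit row line of stage 504 can be computed by least squares over the centroid positions. A minimal sketch, assuming rows run closer to horizontal than vertical in image coordinates (otherwise the axes should be swapped); the names are illustrative:

```python
import numpy as np

def fit_row_line(centroids: np.ndarray):
    """Fit y = m*x + b through the centroids of one candidate row.

    `centroids` is an (N, 2) array of (x, y) positions. Returns the slope,
    intercept and RMS residual; a large residual suggests the grouping
    mixes plants from more than one row.
    """
    x, y = centroids[:, 0], centroids[:, 1]
    m, b = np.polyfit(x, y, 1)  # least-squares straight-line fit
    rms = float(np.sqrt(np.mean((y - (m * x + b)) ** 2)))
    return m, b, rms
```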
[057] The stage of dividing individual plants 505 in the example method includes identifying two or more overlapping white spaces 601 (that is, two or more overlapping plants). Figure 10 shows an example of a pair of overlapping white spaces 601 within circle 1000. In the example method, to identify a pair of overlapping white spaces 601, system 100 first compares (i) the in-row offset (for example, 1001 and 1003) between adjacent centroids; and (ii) the row spacing value (for example, 1002 and 1004) determined by (a) a nominal value from the user, (b) an as-planted spacing value from an as-planted map or (c) the median or average spacing between plants in a row. In the case of a pair of overlapping white spaces 601, system 100 can divide the white space into individual white spaces that each correspond to a single plant.

[058] In this example, system 100 also assigns a "confidence value" (for example, 90%) to each white space 601 that indicates the probability or statistical certainty that the white space 601 correlates with the location and/or area of a distinct plant in the images. In one example, the confidence value for an individual white space 601 is greater when (i) the location of the respective centroid 801 is approximately equal to a row offset value (for example, 1002 and 1004); and (ii) the respective area is approximately equal to the median and/or average area of the white spaces 601 on map 600. Consequently, system 100 can store the confidence value for each white space 601 on each map 600 for reference, for several purposes, as described below.

[060] In another example, correcting images captured by the camera system 117 may include a "Back and Check" route 407 flown by UAV 110 to take additional high-resolution photos of the target areas.

[061] In this example, system 100 also analyzes individual plants (for example, those identified within white spaces 601) by examining one or more images of each plant captured by camera system 117 from different positions and elevations. Similar to stage 504, in which each white space 601 is marked with a centroid 801, system 100 locates the structures (for example, leaves, stems, ears, etc.) of each plant and marks each structure with a centroid. In one example, system 100 finds the plant structures using a length:width ratio for structures consistent with the planted population. In addition, leaf spines can be located by calculating midpoints between the leaf edges. In this example, system 100 also locates an updated and more accurate centroid of the plant using the centroids of the individual plant structures. In another example, system 100 may use a point of intersection of lines fitted along the length or width and through the centroids of a plurality of plant structures (for example, leaf spines) to find the updated plant centroid. In still other embodiments, system 100 can return to the previous stages to improve the identification of white spaces 601 and/or the placement of centroids 801, for example.

[062] In this example, system 100 uses the imaged structures and plant locations to determine data about plant characteristics in the planted population. Plant characteristics of particular interest include, for example and without limitation, leaf length (for example, average spine length), width and area (for example, of the entire plant) and number of leaves (which can be, for example, the number of spines identified). Again, system 100 can use the image data points to adjust unclear or distorted views of the plant characteristics. Consequently, system 100 can store the information regarding the plant characteristics of each plant for reference, for the various purposes described below.
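As a sketch of the overlap test of the dividing stage 505 above: a white space can be flagged as two merged plants when its area is roughly double the typical blob area and the gap to the next centroid in the row is roughly double the expected plant spacing. The tolerance and the names are illustrative assumptions, not the disclosed test:

```python
import numpy as np

def flag_merged_blobs(areas, in_row_gaps, expected_spacing, tol=0.25):
    """Flag white spaces that likely contain two overlapping plants.

    `areas` holds the pixel area of each blob along a row, and `in_row_gaps`
    the distance from each blob's centroid to the next centroid in the row.
    A blob is suspect when its area is near twice the median blob area and
    its gap is near twice the expected plant spacing.
    """
    areas = np.asarray(areas, dtype=float)
    gaps = np.asarray(in_row_gaps, dtype=float)
    median_area = np.median(areas)
    big = np.abs(areas - 2 * median_area) <= tol * 2 * median_area
    wide = np.abs(gaps - 2 * expected_spacing) <= tol * 2 * expected_spacing
    return big & wide
```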
[063] The stage of estimating the crop yield potential 507 in the example method includes using information gathered and calculated by system 100 to estimate a yield potential. The information gathered includes, for example, the number of plants in the planted population, the confidence value for each white space 601 and/or information about the plant characteristics. In this example, system 100 may not consider the plant characteristics when the confidence value for a particular plant is below a first threshold (for example, 95%). Also in this example, system 100 may not include that particular plant in the planted population count when the confidence value is below a lower second threshold (for example, 80%).

[064] In one example, system 100 can use the following Equation 1 to estimate a yield potential of a plant or planted population:

Equation 1: yield potential = Ax + By + Cz

where:
x = number of leaves
y = leaf area
z = maximum leaf length, or the average of the two longest leaves
A = 0 if x < threshold value; A > 0 if x > threshold value
B = 0 if y < threshold value; B > 0 if y > threshold value
C = 0 if z < threshold value; C > 0 if z > threshold value

[065] In one example, system 100 can calculate an ear potential using a Boolean approach per Equation 2.

[066] In both Equations 1 and 2, the threshold value for determining A, B and C can be (i) a nominal value from the user; (ii) an expected value based on previous planted populations; (iii) a value extrapolated from the individual plants; or (iv) a value interpolated from larger planted populations.

[067] The stage of generating a report 508 in the example method includes creating a map or data report about the planted population. In this example, the generated map compares the as-planted map with another map from later in the development of the planted population. The map can show, for example, the regularity of plant spacing, skipped plantings, double-planted plants, etc. Also in this example, the generated report can include a yield potential (for example, number of ears, seeds, stems, etc.) generated from the planted population.

[068] In some embodiments, one or more measurements and spatial maps can be generated and displayed to the user based on information gathered from an aerial image.

[069] In one embodiment, a weed pressure value is determined for each location or region in the field based on the relative amount of weeds among the growing crop. The weed pressure value is, in some embodiments, reported as a field average or presented to the user as a spatial map.

[070] In another embodiment, the leaf width of the crop plants identified in the field (determined as described above) is reported as a field average or presented to the user as a spatial map of the average leaf width in the field.

[071] In another embodiment, an estimated emergence date of the identified crop plants is determined for each plant or region in the field. The estimated emergence date can be estimated based on the size of each identified crop plant; further, where no crop plant is observed in a portion of the field on a given date, it is assumed that the emergence date for that portion of the field may be after that date. The spatial variation in the estimated emergence date can be presented to the user as a spatial map.
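Returning to Equation 1, a minimal sketch of the thresholded linear combination as reconstructed above; the coefficient magnitudes and threshold values are left to the caller, since the disclosure does not fix them:

```python
def yield_potential(x_leaves, y_leaf_area, z_leaf_length,
                    a, b, c, x_thr, y_thr, z_thr):
    """Evaluate Equation 1: yield potential = A*x + B*y + C*z.

    Each coefficient is zeroed when its plant characteristic falls below
    the corresponding threshold, so a sub-threshold characteristic
    contributes nothing to the estimate.
    """
    A = a if x_leaves > x_thr else 0.0
    B = b if y_leaf_area > y_thr else 0.0
    C = c if z_leaf_length > z_thr else 0.0
    return A * x_leaves + B * y_leaf_area + C * z_leaf_length
```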
[072] In another embodiment, an estimated plant vigor of the crop plants identified in the field is reported as a field average or presented to the user as a spatial map of the plant vigor in the field. The plant vigor value for each plant or group of plants is preferably determined by calculating a weighted sum or the product of the plant characteristics (for example, leaf width and number of leaves). For example, a plant vigor value for a crop plant can be calculated by multiplying the average leaf width by the number of leaves, or by adding the average leaf width to a value 10 times the number of leaves. A statistical variation (for example, standard deviation) of the plant vigor value in relation to the average plant vigor value for the field (or for a region that includes multiple fields) can also be measured and used to generate a spatial map of the deviation in plant vigor.

[073] In another embodiment, a plant disease identification is determined by comparing the reflectivity (for example, visible spectrum, infrared or NDVI value) of portions of a single identified crop plant with a threshold value or with the average reflectivity value of the crop plant.

[074] In another embodiment, a pest identification is determined by comparing the reflectivity (for example, visible spectrum, infrared or NDVI value) of portions of a single identified crop plant with a threshold value or with the average reflectivity value of the crop plant. If one or more portions of a crop plant have a reflectivity greater than the selected reflectivity threshold (and preferably have an area greater than an area threshold), the user is preferably alerted to the presence of a potential pest and can be presented with a photographic image of the crop plant.

[075] Because the pest and disease identification methods discussed above can be enhanced by higher-resolution images, in some embodiments UAV 110 returns to areas that have low NDVI values (either those selected by the user or those that have NDVI values below the threshold) and captures a high-resolution image, for example, by flying at lower altitudes (for example, 6.09 meters (20 feet) or lower) over the identified area, or by hovering (i.e., stopping in a stationary position) over the identified area and taking an image at a higher resolution and/or closer range than the initial images.

[076] In some embodiments, the yield potential and/or ear potential of plants (for example, plants in the seedling stage), as discussed above, can alternatively be determined by taking images of the crop at a significant angle (for example, between 30 and 60 degrees) in relation to the vertical, in order to observe and compare the heights of individual plants. Plants shorter than neighboring plants by a threshold percentage are preferably identified as late-emerging plants that have a lower yield potential.

[077] In some embodiments, the orientation of the identified crop plants can be determined by determining the plant orientation (for example, in relation to north) of a best-fit line through the spines of one or more leaves (for example, a line passing through two opposite leaves 180 degrees apart around the stem). A correlation of plant orientation with crop performance can also be determined.

[078] In some embodiments of the methods described in this document, a measurement based on an image of a first portion of the field can be generalized to a larger portion of the field for the purpose of generating a measurement map across the field. In some embodiments, the larger portion of the field may comprise a surrounding and/or adjacent area.
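A sketch of the two example vigor calculations of paragraph [072], together with the deviation values used for a spatial map of vigor deviation; the function names are illustrative:

```python
import numpy as np

def vigor_product(avg_leaf_width, n_leaves):
    """Plant vigor as the product of average leaf width and leaf count."""
    return avg_leaf_width * n_leaves

def vigor_weighted_sum(avg_leaf_width, n_leaves):
    """Plant vigor as average leaf width plus ten times the leaf count."""
    return avg_leaf_width + 10.0 * n_leaves

def vigor_deviation(vigor_values):
    """Standard-score deviation of each plant's vigor from the field mean."""
    v = np.asarray(vigor_values, dtype=float)
    std = v.std()
    # Zero spread means every plant matches the field average exactly.
    return np.zeros_like(v) if std == 0 else (v - v.mean()) / std
```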
[079] When introducing the elements of the present invention or the various embodiments thereof, the articles "a", "an", "the" and "said" are intended to mean that there are one or more of the elements. The terms "comprising", "including" and "having" are intended to be inclusive and mean that there may be additional elements other than the listed elements. The use of terms indicating a particular orientation (for example, "top", "bottom", "side", etc.) is for convenience of description and does not require any particular orientation of the item described. As various changes could be made in the above constructions and methods without departing from the scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawing figures shall be interpreted as illustrative and not in a limiting sense.
Claims:
AMENDED CLAIMS

1. Method for agronomic and agricultural monitoring, the method characterized by the fact that it comprises: using a first central processing unit (CPU) at a base station, monitoring an unmanned aerial vehicle (UAV) as the UAV flies along a flight path above an area and as the UAV performs: using a second CPU of the UAV, capturing a plurality of initial images of the area as the UAV flies along the flight path; using the second CPU of the UAV, receiving an identification of one or more target areas for which different additional images are to be taken; using the second CPU of the UAV, in response to receiving the identification: causing the UAV to capture one or more different additional images of the one or more target areas; transmitting the plurality of initial images and the one or more different additional images to an image recipient.

2. Method, according to claim 1, characterized by the fact that it further comprises the UAV itself identifying the one or more target areas for which the one or more different additional images are required, as the UAV orthorectifies the plurality of initial images and identifies areas with low image quality.

3. Method, according to claim 1, characterized by the fact that it further comprises: using the UAV, sending to a computing device an indication of areas that have certain characteristics comprising one or more of low nitrogen levels, low crop maturity levels or high crop shade levels; receiving, from the computing device, instructions to acquire one or more images of the one or more target areas for which the one or more different additional images are required; receiving, from the computing device, an identification of the one or more target areas for which the one or more different additional images are required.

4. Method, according to claim 1, characterized by the fact that it further comprises, using the UAV, capturing the one or more different additional images at a higher resolution than a resolution at which the plurality of initial images was captured.

5. Method, according to claim 1, characterized by the fact that it further comprises, using the UAV, capturing the one or more different additional images at a lower elevation.

6. Method, according to claim 1, characterized by the fact that it further comprises transferring the plurality of initial images and the one or more different additional images to the image recipient as the images are captured by the UAV and while the UAV is airborne.

7. Method, according to claim 1, characterized by the fact that it further comprises determining the flight path by: receiving, at the base station, an input that indicates a type of image to be acquired; receiving, at the base station, obstacle data that indicates an obstacle within the area; and determining, using the base station, the flight path based, at least in part, on the input and the obstacle data.

8. Method, according to claim 1, characterized by the fact that it further comprises, using the UAV, processing the plurality of initial images and the one or more different additional images on board, as the UAV flies over the area.

9. Method, according to claim 1, characterized by the fact that it further comprises, using the UAV, processing the plurality of initial images and the one or more different additional images to orthorectify and stitch the plurality of initial images and the one or more different additional images into a single continuous map of the area.
10. Method, according to claim 1, characterized by the fact that it further comprises: processing, using the base station or a cloud computer, the plurality of initial images and the one or more different additional images, performing one or more of: orthorectifying and stitching the plurality of initial images and the one or more different additional images into a single continuous map of the area; superimposing the plurality of initial images and the one or more different additional images over other types of aerial geographic images; displaying the plurality of initial images and the one or more different additional images in a graphical user interface to a user; based on the plurality of initial images and the one or more different additional images, generating a visual animation and displaying the visual animation in the graphical user interface to the user; filtering the plurality of initial images and the one or more different additional images by applying one or more filters to the plurality of initial images and to the one or more different additional images; based on the plurality of initial images and the one or more different additional images, determining a greenness density map for the area, the greenness density map having an expected greenness density area represented in a first color and other areas represented in a second color; based on the plurality of initial images and the one or more different additional images, generating at least one of a normalized difference vegetation index map or an application map for the area; based on the plurality of initial images and the one or more different additional images, determining an emergence map for the area; based on the plurality of initial images and the one or more different additional images, generating at least one of a normalized difference vegetation index map or an application map; based on the plurality of initial images and the one or more different additional images, generating one or more agronomic recommendations for the area; wherein the processing of the plurality of initial images and the one or more different additional images is performed on one or more of: the base station or a cloud system.

11. Method, according to claim 10, characterized by the fact that it further comprises processing the emergence map for the area to: identify rows of areas above an expected greenness density on the emergence map; divide the areas above the expected greenness density into individual areas that each correspond to a plant of a plurality of plants and are consistent with planting data; identify structures of at least one plant among the plurality of plants; calculate an estimated yield of the at least one plant among the plurality of plants based on the identified structures; and generate a report that includes the estimated yield.
12. System for agronomic and agricultural monitoring, the system characterized by the fact that it comprises: a base station comprising a first central processing unit (CPU) and configured to monitor one or more unmanned aerial vehicles as the one or more unmanned aerial vehicles fly along a flight path above an area; an unmanned aerial vehicle (UAV) comprising a second CPU and configured to: using the second CPU of the UAV, capture a plurality of initial images of the area as the UAV flies along the flight path; using the second CPU of the UAV, receive an identification of one or more target areas for which different additional images are to be taken; using the second CPU of the UAV, in response to receiving the identification: cause the UAV to capture one or more different additional images of the one or more target areas; transmit the plurality of initial images and the one or more different additional images to an image recipient.

13. System, according to claim 12, characterized by the fact that the UAV is further configured to itself identify the one or more target areas for which the one or more different additional images are required, as the UAV orthorectifies the plurality of initial images and identifies areas with low image quality.

14. System, according to claim 12, characterized by the fact that the UAV is further configured to: send to a computing device an indication of areas that have certain characteristics comprising one or more of low nitrogen levels, low crop maturity levels or high crop shade levels; receive, from the computing device, instructions to acquire one or more images of the one or more target areas for which the one or more different additional images are required; receive, from the computing device, an identification of the one or more target areas for which the one or more different additional images are required.

15. System, according to claim 12, characterized by the fact that the UAV is further configured to capture the one or more different additional images at a higher resolution than a resolution at which the plurality of initial images was captured.
CN108537164B|2018-04-08|2020-08-28|北京农业智能装备技术研究中心|Method and device for monitoring germination rate of dibbling and sowing based on unmanned aerial vehicle remote sensing| CN109155668A|2018-08-06|2019-01-04|北京小米移动软件有限公司|flight path configuration method and device| US10660277B2|2018-09-11|2020-05-26|Pollen Systems Corporation|Vine growing management method and apparatus with autonomous vehicles| US10779476B2|2018-09-11|2020-09-22|Pollen Systems Corporation|Crop management method and apparatus with autonomous vehicles| AU2019360153A1|2018-10-19|2021-05-20|The Climate Corporation|Detecting infection of plant diseases by classifying plant photos| US11240961B2|2018-10-26|2022-02-08|Deere & Company|Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity| US11178818B2|2018-10-26|2021-11-23|Deere & Company|Harvesting machine control system with fill level processing based on yield data| US11059582B2|2019-02-11|2021-07-13|Cnh Industrial Canada, Ltd.|Systems for acquiring field condition data| US11001380B2|2019-02-11|2021-05-11|Cnh Industrial Canada, Ltd.|Methods for acquiring field condition data| US11009625B2|2019-03-27|2021-05-18|The Climate Corporation|Generating and conveying comprehensive weather insights at fields for optimal agricultural decision making| US11079725B2|2019-04-10|2021-08-03|Deere & Company|Machine control using real-time model| US11234366B2|2019-04-10|2022-02-01|Deere & Company|Image selection for machine control| CN109977924A|2019-04-15|2019-07-05|北京麦飞科技有限公司|For real time image processing and system on the unmanned plane machine of crops| CN110222903B|2019-06-13|2021-08-31|苏州市农业科学院|Rice yield prediction method and system based on unmanned aerial vehicle remote sensing| KR102101355B1|2019-07-08|2020-04-17|주식회사 이노드|Agricultural Drone and Measuring System for Remaining Quantity of Pesticide|
Legal status:
2020-11-10 | B06F | Objections, documents and/or translations needed after an examination request, according to [chapter 6.6 patent gazette]
2020-11-17 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2021-12-07 | B350 | Update of information on the portal [chapter 15.35 patent gazette]
Priority:
Application number | Application date | Patent title
US 62/040,859 (provisional) | 2014-08-22 |
US 62/046,438 (provisional) | 2014-09-05 |
US 14/831,165 (granted as US9922405B2) | 2015-08-20 | Methods for agronomic and agricultural monitoring using unmanned aerial systems
PCT/US2015/046165 (published as WO2016029054A1) | 2015-08-20 | Methods for agronomic and agricultural monitoring using unmanned aerial systems