Patent abstract:
A system and method for monitoring field operations, and non-transitory storage media, are disclosed for capturing images of a field and performing agricultural data analysis on the images. In one embodiment, a computer system for monitoring field operations includes a database for storing agricultural image data that includes images from at least one crop development stage that are captured with at least one of a device and a remote sensor that moves across a field. The computer system includes at least one processing unit that is coupled to the database. The at least one processing unit is configured to execute instructions to analyze the captured images, to determine relevant images that indicate a change in at least one crop development condition, and to generate a localized view map layer to view the field at the at least one crop development stage based on at least the relevant captured images.
Publication number: BR112017028605A2
Application number: R112017028605-0
Filing date: 2016-06-28
Publication date: 2019-11-12
Inventors: Stoller Brad; Sauder Doug; Koch Justin; Baurer Phil
Applicant: Climate Corp
IPC main classification:
Patent description:

SYSTEM AND METHOD FOR MONITORING FIELD OPERATIONS, AND NON-TRANSITORY STORAGE MEDIA
COPYRIGHT NOTICE [001] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves any and all copyright rights whatsoever. © 2016 The Climate Corporation.
RELATED APPLICATIONS [002] This application claims the benefit of U.S. Serial No. 62/187,183, filed on June 30, 2015, the contents of which are hereby incorporated by reference.
TECHNICAL FIELD [003] Embodiments of the present disclosure relate to systems and methods for image capture and analysis of agricultural data.
BACKGROUND OF THE INVENTION [004] Planters are used to plant crop seeds (for example, corn, soybeans) in a field. Some planters include a display monitor inside a cab for displaying a coverage map that shows regions of the field that have been planted. The planter coverage map is generated based on planting data collected by the planter.
[005] A combine harvester, or combine, is a machine for harvesting crops. A combine coverage map shows the regions of a field that have been harvested by that combine. A coverage map lets the combine operator know that a region of the field has already been harvested by the same combine. Yield data for a field can then be generated after the field is harvested. The yield data can be analyzed to potentially improve agricultural operations for a subsequent growing season.
SUMMARY [006] In one embodiment, a computer system for monitoring field operations includes a database for storing agricultural image data that includes images from at least one crop development stage that are captured with at least one of a device and a remote sensor that moves across a field. At least one processing unit is coupled to the database. The at least one processing unit is configured to execute instructions to analyze the captured images, to determine relevant images that indicate a change in at least one crop development condition, and to generate a localized view map layer to view the field at the at least one crop development stage based on at least the relevant captured images.
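As a non-limiting illustration of the data flow just described, the following Python sketch (hypothetical class and field names; not the claimed implementation) shows how captured images tagged with a crop development stage might be filtered for relevance and grouped into a localized view map layer.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class CapturedImage:
    field_id: str
    growth_stage: str      # e.g., "V4" or "R1" for corn
    latitude: float
    longitude: float
    change_score: float    # assumed output of an upstream image-analysis step

def relevant_images(images: List[CapturedImage], threshold: float = 0.5) -> List[CapturedImage]:
    """Keep only images whose analysis indicates a change in a crop development condition."""
    return [img for img in images if img.change_score >= threshold]

def build_localized_view_layer(images: List[CapturedImage], stage: str) -> Dict:
    """Group the relevant images by location so a map layer can render them for one stage."""
    layer = {"growth_stage": stage, "points": []}
    for img in relevant_images(images):
        if img.growth_stage == stage:
            layer["points"].append(
                {"lat": img.latitude, "lon": img.longitude, "score": img.change_score})
    return layer
```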
[007] In one embodiment, an image capture and analysis method for an application pass for an agricultural crop includes making an application pass across a field with a device and, at the same time, capturing images of the field that include crops, if they are visible, during the application pass. The method further includes generating a localized view map layer to view the field during the application pass based on the captured images, and automatically analyzing the application pass, which includes at least one of a planting analysis, a fertilizer analysis, a harvest analysis and a tillage analysis based on the images captured during the application pass.
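The automatic analysis of an application pass could be organized as a simple dispatch on the pass type; the sketch below is only a schematic outline under that assumption, with placeholder results standing in for the actual planting, fertilizer, harvest, or tillage analyses.

```python
def analyze_application_pass(pass_type: str, images: list) -> dict:
    """Route the captured images of an application pass to the matching analysis.

    The lambdas below are placeholders standing in for the planting, fertilizer,
    harvest, and tillage analyses mentioned in the summary.
    """
    analyses = {
        "planting":   lambda imgs: {"images_analyzed": len(imgs), "emergence_ok": True},
        "fertilizer": lambda imgs: {"images_analyzed": len(imgs), "coverage_pct": 100.0},
        "harvest":    lambda imgs: {"images_analyzed": len(imgs), "downed_crop_regions": 0},
        "tillage":    lambda imgs: {"images_analyzed": len(imgs), "residue_pct": 30.0},
    }
    if pass_type not in analyses:
        raise ValueError(f"unknown pass type: {pass_type}")
    return analyses[pass_type](images)
```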
BRIEF DESCRIPTION OF THE DRAWINGS [008] The present disclosure is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
[009] Figure 1 illustrates an exemplary computer system that is configured to perform the functions described in this document, shown in a field environment with another device with which the system can interoperate.
[010] Figure 2 illustrates two views of an example logical organization of instruction sets in main memory when an example mobile application is loaded for execution.
[011] Figure 3 illustrates a programmed process by which the agricultural intelligence computer system generates one or more pre-configured agronomic models using agronomic data provided by one or more data sources.
[012] Figure 4 is a block diagram illustrating a computer system 400 upon which an embodiment of the invention can be implemented.
[013] Figure 5 depicts an example embodiment of a timeline view for data entry.
[014] Figure 6 depicts an example embodiment of a spreadsheet view for data entry.
[015] Figure 7 illustrates a flowchart of an embodiment of a method 700 of capturing images and creating a localized view map layer for an application pass for an agricultural crop.
[016] Figure 8 illustrates a flowchart of an embodiment of a method 800 of capturing images and creating a localized view map layer for an agricultural application pass.
[017] Figure 9 illustrates a flowchart of an embodiment of a method 900 of capturing images of an agricultural crop in a field and determining crop information for the agricultural crop in the field.
[018] Figure 10 illustrates a flowchart of an embodiment of a method 1000 of determining characteristics of an agricultural crop (for example, corn) in a field based on captured images of the crop in the field.
[019] Figure 11 illustrates a diagram for capturing images of a crop from multiple points of view in accordance with one embodiment.
DETAILED DESCRIPTION [020] Described herein are systems and methods for capturing images of a field and performing agricultural data analysis on the images. In one embodiment, a method includes moving at least one of a device and a remote sensor across a field at a crop development stage and capturing images of the field that include a crop (for example, corn, soybeans). The method additionally includes analyzing the captured images and determining relevant images that indicate a change in at least one crop development condition. The method additionally includes generating a localized view map layer for visualizing the field at the crop development stage based on at least the relevant captured images.
[021] The computer system includes at least one processing unit that is configured to execute instructions to analyze the captured images, to determine relevant images that indicate a change in at least one crop development condition, and to generate a localized view map layer for visualizing the field at at least one crop development stage based on at least the relevant captured images. A user can view the localized view map layer in order to have a better understanding of the actual current field conditions for the selected region. The user can identify any potential problems and take corrective or different action during a current growing season to improve crop yield for that season.
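One simple way to flag an image as "relevant" is to compare it with an earlier image of the same location and test whether enough pixels have changed. The sketch below is a minimal stand-in for that analysis step, assuming co-registered images supplied as NumPy arrays; it is not the specific analysis of the disclosure.

```python
import numpy as np

def change_detected(prev_image: np.ndarray, curr_image: np.ndarray,
                    pixel_threshold: float = 25.0, area_fraction: float = 0.10) -> bool:
    """Return True when the fraction of pixels that differ by more than
    pixel_threshold exceeds area_fraction, marking the image as relevant."""
    diff = np.abs(curr_image.astype(float) - prev_image.astype(float))
    changed_fraction = float(np.mean(diff > pixel_threshold))
    return changed_fraction >= area_fraction
```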
[022] In the following description, numerous details are set forth. It will be apparent, however, to one skilled in the art, that embodiments of the present disclosure can be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present disclosure.
[023] Figure 1 illustrates an example computer system that is configured to perform the functions described in this document, shown in a field environment with other apparatus with which the system can interoperate. In one embodiment, a user 102 owns, operates or possesses a field manager computing device 104 in a field location or associated with a field location, such as a field intended for agricultural activities or a management location for one or more agricultural fields. The field manager computing device 104 is programmed or configured to provide field data 106 to an agricultural intelligence computer system 130 via one or more networks 109.
[024] Examples of field data 106 include (a) identification data (for example, acreage, field name, field identifiers, geographic identifiers, boundary identifiers, crop identifiers, and any other suitable data that can be used to identify farm land, such as a common land unit (CLU), lot and block number, a parcel number, geographic coordinates and boundaries, Farm Serial Number (FSN), farm number, tract number, field number, section, township and/or range), (b) harvest data (for example, crop type, crop variety, crop rotation, whether the crop was grown organically, harvest date, Actual Production History (APH), expected yield, yield, crop price, crop revenue, grain moisture, tillage practice, and previous crop yield information), (c) soil data (for example, type, composition, pH, organic matter (OM), cation exchange capacity (CEC)), (d) planting data (for example, planting date, seed type (or seed types), relative maturity (RM) of the planted seed (or seeds), seed population, input cost information (for example, seed cost), and property indices (for example, ratio of seed population to a soil parameter) for the fields that are monitored), (e) fertilizer data (for example, nutrient type (Nitrogen, Phosphorus, Potassium), application type, application date, amount, source, method, nutrient cost), (f) pesticide data (for example, pesticide, herbicide, fungicide, other substance or mixture of substances intended for use as a plant regulator, defoliant or desiccant, application date, amount, source, method), (g) irrigation data (for example, application date, amount, source, method), (h) weather data (for example, precipitation, rainfall rate, predicted rainfall, water runoff rate, temperature, wind, forecast, pressure, visibility, clouds, heat index, dew point, humidity, snow depth, air quality, sunrise, sunset), (i) image data (for example, image and light spectrum information from an agricultural apparatus sensor, camera, computer, smartphone, tablet computer, unmanned aerial vehicle, drone, self-guided apparatus, self-propelled apparatus, airplanes or satellites), (j) scouting observations (photos, videos, free-form notes, voice recordings, voice transcriptions, weather conditions (temperature, precipitation (current and past), soil moisture, crop growth stage, wind speed, relative humidity, condensation, black layer)), and (k) soil, seed, crop phenology, pest and disease reporting, and prediction sources and databases.
[025] An external data server computer 108 is communicatively coupled to the agricultural intelligence computer system 130 and is programmed or configured to send external data 110 to the agricultural intelligence computer system 130 via the network (or networks) 109. The external data server computer 108 may be owned or operated by the same legal person or entity as the agricultural intelligence computer system 130, or by a different person or entity, such as a government agency, a non-governmental organization (NGO) and/or a private data service provider. Examples of external data include climate data, image data, soil data, field conditions or statistical data related to crop yields, among others. External data 110 may consist of the same type of information as field data 106. In some embodiments, external data 110 is provided by an external data server 108 that belongs to the same entity that owns and/or operates the agricultural intelligence computer system 130. For example, the agricultural intelligence computer system 130 may include a data server focused exclusively on a type of data that might otherwise be obtained from third-party sources, such as weather data. In some embodiments, an external data server 108 may actually be incorporated into system 130.
[026] An agricultural apparatus 111 may have one or more remote sensors 112 fixed to it, which sensors are communicatively coupled, directly or indirectly, via the agricultural apparatus 111 to the agricultural intelligence computer system 130 and are programmed or configured to send sensor data to the agricultural intelligence computer system 130. Examples of agricultural apparatus 111 include tractors, combines, harvesters, planters, trucks, fertilizer equipment, unmanned aerial vehicles, drones, self-guided apparatus, self-propelled apparatus, and any other item of physical machinery or hardware, typically mobile machinery, that can be used in tasks associated with agriculture. In some embodiments, a single unit of apparatus 111 may comprise a plurality of sensors 112 that are locally coupled to a network on the apparatus; the controller area network (CAN) is an example of such a network that can be installed on combines or harvesters. Application controller 114 is communicatively coupled to the agricultural intelligence computer system 130 via the network (or networks) 109 and is programmed or configured to receive one or more scripts to control an operating parameter of an agricultural vehicle or implement from the agricultural intelligence computer system 130. For example, a controller area network (CAN) bus interface can be used to enable communications from the agricultural intelligence computer system 130 to the agricultural apparatus 111, such as in the way the CLIMATE FIELD VIEW DRIVE, available from The Climate Corporation, San Francisco, California, is used. Sensor data may consist of the same type of information as field data 106. In some embodiments, remote sensors 112 may not be fixed to an agricultural apparatus 111, but may instead be positioned remotely in the field and may communicate with the network 109.
[027] Apparatus 111 may optionally comprise a cabin computer 115 that is programmed with a cabin application, which may comprise a version or variant of the mobile application for device 104 that is further described in other sections of this document. In one embodiment, the cabin computer 115 comprises a compact computer, often a tablet-sized computer or smartphone, with a graphical screen display, such as a color display, that is mounted in an operator's cab of the apparatus 111. The cabin computer 115 may implement some or all of the operations and functions that are described further in this document for the mobile computer device 104.
[028] The network (or networks) 109 broadly represents any combination of one or more data communication networks, including local area networks, wide area networks, internetworks or internets, using any wired or wireless connection, including terrestrial or satellite links. The network (or networks) can be implemented by any medium or mechanism that provides data exchange between the various elements of Figure 1. The various elements of Figure 1 can also have direct (wired or wireless) communication links. Sensors 112, controller 114, external data server computer 108 and other system elements each comprise an interface compatible with the network (or networks) 109 and are programmed or configured to use standardized protocols for communication over the networks, such as TCP/IP, Bluetooth, CAN and higher-layer protocols such as HTTP, TLS and the like.
[029] The agricultural intelligence computer system 130 is programmed or configured to receive agricultural data that includes field data 106 from the field manager computing device 104, external data 110 from the external data server computer 108 and sensor data from the remote sensor 112. The agricultural intelligence computer system 130 can be further configured to host, use or run one or more computer programs, other software elements, digitally programmed logic, such as FPGAs or ASICs, or any combination thereof to perform translation and storage of data values, build digital models of one or more crops in one or more fields, generate recommendations and notifications, and generate and send scripts to application controller 114, in the manner further described in other sections of this disclosure.
[030] In one embodiment, the agricultural intelligence computer system 130 is programmed with, or comprises, a communication layer 132, instructions 136, presentation layer 134, data management layer 140, hardware/virtualization layer 150, and model and field data repository 160. Layer, in this context, refers to any combination of electronic digital interface circuits, microcontrollers, firmware, such as drivers, and/or computer programs or other software elements.
[031] Communication layer 132 can be programmed or configured to perform input/output interfacing functions, including sending requests to the field manager computing device 104, the external data server computer 108 and the remote sensor 112 for field data, external data and sensor data, respectively. The communication layer 132 can be programmed or configured to send the received data to the model and field data repository 160 to be stored as field data 106.
[032] Presentation layer 134 can be programmed or configured to generate a graphical user interface (GUI) to be displayed on the field manager computing device 104, cabin computer 115 or other computers that are coupled to system 130 through network 109. The GUI can comprise controls for inputting data to be sent to the agricultural intelligence computer system 130, generating requests for models and/or recommendations, and/or displaying recommendations, notifications, models and other field data.
[033] Data management layer 140 can be programmed or configured to manage read operations and write operations involving repository 160 and other functional elements of the system, including queries and result sets communicated between the functional elements of the system and the repository. Examples of data management layer 140 include JDBC, SQL server interface code, and/or HADOOP interface code, among others. The repository 160 can comprise a database. As used in this document, the term database can refer to a body of data, a relational database management system (RDBMS), or both. As used in this document, a database can comprise any collection of data including hierarchical databases, relational databases, flat file databases, object-relational databases, object-oriented databases, and any other structured collection of records or data that is stored in a computer system. Examples of RDBMSs include, but are not limited to, ORACLE®, MYSQL, IBM® DB2, MICROSOFT® SQL SERVER, SYBASE® and POSTGRESQL databases. However, any database can be used as long as it supports the systems and methods described in this document.
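For concreteness, the sketch below shows one possible relational layout for the stored agricultural image data, using Python's built-in sqlite3 module as a stand-in for whichever RDBMS is actually chosen; the table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect("agricultural_images.db")   # hypothetical database file
conn.execute("""
    CREATE TABLE IF NOT EXISTS captured_image (
        id INTEGER PRIMARY KEY,
        field_id TEXT, growth_stage TEXT,
        lat REAL, lon REAL,
        captured_at TEXT, is_relevant INTEGER
    )""")
conn.execute(
    "INSERT INTO captured_image (field_id, growth_stage, lat, lon, captured_at, is_relevant) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("field-42", "V6", 41.59, -93.62, "2016-06-28T14:05:00", 1))
conn.commit()

# Fetch the relevant images for one field and growth stage to feed a map layer.
rows = conn.execute(
    "SELECT lat, lon, captured_at FROM captured_image "
    "WHERE field_id = ? AND growth_stage = ? AND is_relevant = 1",
    ("field-42", "V6")).fetchall()
```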
[034] When field data 106 is not provided directly to the agricultural intelligence computer system via one or more agricultural machines or agricultural machine devices that interact with the agricultural intelligence computer system, the user may be prompted via one or more user interfaces on the user device (served by the agricultural intelligence computer system) to input such information. In an example embodiment, the user can specify identification data by accessing a map on the user device (served by the agricultural intelligence computer system) and selecting specific CLUs that are shown graphically on the map. In an alternative embodiment, user 102 can specify identification data by accessing a map on the user device (served by the agricultural intelligence computer system 130) and drawing boundaries of the field over the map. Such CLU selection or map drawings represent geographic identifiers. In alternative embodiments, the user can specify identification data by accessing field identification data (provided as shapefiles or in a similar format) from the U.S. Department of Agriculture Farm Service Agency or another source via the user device and providing such field identification data to the agricultural intelligence computer system.
[035] In an example embodiment, the agricultural intelligence computer system 130 is programmed to generate and cause the display of a graphical user interface comprising a data manager for data entry. After one or more fields have been identified using the methods described above, the data manager can provide one or more graphical user interface widgets which, when selected, can identify changes to the field, soil, crops, tillage or nutrient practices. The data manager can include a timeline view, a spreadsheet view and/or one or more editable programs.
[036] Figure 5 depicts an example embodiment of a timeline view 501 for data entry. Using the display depicted in Figure 5, a user computer can input a selection of a particular field and a particular date for adding an event. Events depicted at the top of the timeline may include Nitrogen, Planting, Practices and Soil. To add a nitrogen application event, a user computer can provide input selecting the nitrogen tab. The user computer can then select a location on the timeline for a particular field in order to indicate an application of nitrogen on the selected field. In response to receiving a selection of a location on the timeline for a particular field, the data manager may display a data entry overlay, allowing the user computer to enter data pertaining to nitrogen applications, planting procedures, soil application, tillage procedures, irrigation practices or other information related to the particular field. For example, if a user computer selects a portion of the timeline and indicates an application of nitrogen, then the data entry overlay may include fields for entering an amount of nitrogen applied, an application date, a type of fertilizer used, and any other information related to the nitrogen application.
[037] In one embodiment, the data manager provides an interface for creating one or more programs. Program, in this context, refers to a set of data pertaining to nitrogen applications, planting procedures, soil application, tillage procedures, irrigation practices or other information that can be related to one or more fields, and that can be stored in digital data storage for reuse as a set in other operations. After a program has been created, it can be conceptually applied to one or more fields, and references to the program can be stored in digital storage in association with data identifying those fields. Thus, instead of manually entering identical data relating to the same nitrogen applications for multiple different fields, a user computer can create a program that indicates a particular application of nitrogen and then apply the program to multiple different fields. For example, in the timeline view of Figure 5, the top two timelines have the Autumn applied program selected, which includes an application of 168 kg N/ha (150 lbs N/ac) in early April. The data manager can provide an interface for editing a program. In one embodiment, when a particular program is edited, each field that has selected the particular program is edited. For example, in Figure 5, if the Autumn applied program is edited to reduce the application of nitrogen to 146 kg N/ha (130 lbs N/ac), the top two fields can be updated with a reduced application of nitrogen based on the edited program.
[038] In one embodiment, in response to receiving edits to a field that has a program selected, the data manager removes the field's association with the selected program. For example, if a nitrogen application is added to the top field in Figure 5, the interface can be updated to indicate that the Autumn applied program is no longer being applied to the top field. Although the nitrogen application in early April may remain, updates to the Autumn applied program would not alter the April application of nitrogen.
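The two paragraphs above imply a simple sharing-and-detachment data model: fields reference a shared program, program edits propagate to every referencing field, and a field-level edit detaches that field. The Python sketch below (hypothetical names, simplified to nitrogen applications only) illustrates one way such behavior could be modeled.

```python
class NitrogenProgram:
    """A named set of nitrogen applications that can be shared by several fields."""
    def __init__(self, name, applications):
        self.name = name
        self.applications = applications          # e.g. [{"date": "04-05", "lbs_n_per_ac": 150}]

class Field:
    def __init__(self, name, program=None):
        self.name = name
        self.program = program                    # shared reference, so program edits propagate
        self.local_applications = []

    def effective_applications(self):
        base = self.program.applications if self.program else []
        return base + self.local_applications

    def add_local_application(self, application):
        """A field-level edit detaches the field from its shared program,
        keeping the already-applied entries as local data."""
        if self.program:
            self.local_applications = list(self.program.applications)
            self.program = None
        self.local_applications.append(application)

autumn = NitrogenProgram("Autumn applied", [{"date": "04-05", "lbs_n_per_ac": 150}])
north, south = Field("North 40", autumn), Field("South 40", autumn)
autumn.applications[0]["lbs_n_per_ac"] = 130      # editing the program updates both fields
north.add_local_application({"date": "06-10", "lbs_n_per_ac": 40})  # North 40 is now detached
```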
[039] Figure 6 depicts an example embodiment of a spreadsheet view 601 for data entry. Using the view depicted in Figure 6, a user can create and edit information for one or more fields. The data manager can include spreadsheets for entering information regarding Nitrogen, Planting, Practices and Soil, as shown in Figure 6. To edit a particular entry, a user computer can select the particular entry in the spreadsheet and update the values. For example, Figure 6 depicts an update in progress to a target yield value for the second field. In addition, a user computer can select one or more fields in order to apply one or more programs. In response to receiving a program selection for a particular field, the data manager can automatically complete the entries for the particular field based on the selected program. As with the timeline view, the data manager can update the entries for each field associated with a particular program in response to receiving an update for the program. Additionally, the data manager can remove the field's association with the selected program in response to receiving an edit to one of the entries for the field.
[040] In one embodiment, model and field data are stored in the model and field data repository 160. The model data comprises data models created for one or more fields. For example, a crop model may include a digitally constructed model of the development of a crop in one or more fields. Model, in this context, refers to an electronic, digitally stored set of executable instructions and associated data values that are capable of receiving and responding to a programmatic or other digital call, invocation or resolution request based on specified input values, to yield one or more stored output values that can serve as the basis for computer-implemented recommendations, output data displays or machine control, among other things. Persons of skill in the field find it convenient to express models using mathematical equations, but that form of expression does not confine the models disclosed in this document to abstract concepts; rather, each model in this document has a practical application on a computer in the form of stored executable instructions and data that implement the model using the computer. The model data can include a model of past events in one or more fields, a model of the current state of one or more fields, and/or a model of predicted events in one or more fields. Model and field data can be stored in data structures in memory, in rows of a database table, in flat files or spreadsheets, or in other forms of stored digital data.
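To make the notion of a model as "executable instructions plus stored data values" concrete, here is a deliberately generic Python sketch: a callable that turns stored daily temperature inputs into an accumulated growing-degree-day output. The 10 °C base temperature is the value commonly used for corn; this is a generic agronomic calculation offered only as an illustration, not the specific model of the disclosure.

```python
def corn_development_model(daily_temps_c, base_temp_c=10.0):
    """Accumulate growing degree days from (min, max) daily temperatures and
    return a stored output value a recommendation could later be based on."""
    gdd = 0.0
    for t_min, t_max in daily_temps_c:
        gdd += max(0.0, (t_min + t_max) / 2.0 - base_temp_c)
    return {"accumulated_gdd_c": gdd}

# Invoking the model with specified input values yields a stored output value.
output = corn_development_model([(12.0, 24.0), (14.0, 27.0), (10.0, 22.0)])
```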
[041] The hardware/virtualization layer 150 comprises one or more central processing units (CPUs), memory controllers, and other devices, components or elements of a computer system, such as volatile or non-volatile memory, non-volatile storage such as disk, and I/O devices or interfaces, as illustrated and described, for example, in connection with Figure 4. Layer 150 can also comprise programmed instructions that are configured to support virtualization, containerization or other technologies. In one example, instructions 136 include different types of instructions for monitoring field operations, capturing images of crop development and field operations, and performing agricultural data analysis based on the captured images. Instructions 136 may include agricultural data analysis instructions that include instructions for performing the operations of the methods described in this document. Instructions 136 can be included in the programmed instructions of layer 150.
[042] For the purpose of illustrating a clear example, Figure 1 shows a limited number of instances of certain functional elements. However, in other embodiments, there may be any number of such elements. For example, embodiments can use thousands or millions of different mobile computing devices 104 associated with different users. In addition, system 130 and/or external data server computer 108 can be implemented using two or more processors, cores, clusters or instances of physical machines or virtual machines, configured in a separate location or co-located with other elements in a data center, shared computing facility or cloud computing facility.
[043] In one embodiment, the implementation of the functions described in this document using one or more computer programs or other software elements that are loaded into and executed using one or more general-purpose computers will cause the general-purpose computers to be configured as a particular machine or as a computer that is specially adapted to perform the functions described in this document. In addition, each of the flowcharts that are further described in this document can serve, alone or in combination with the descriptions of processes and functions in prose in this document, as algorithms, plans or directions that can be used to program a computer or logic to implement the functions that are described. In other words, all the prose text in this document, and all the figures of the drawings, together are intended to provide a disclosure of algorithms, plans or directions that is sufficient to allow a person skilled in the art to program a computer to perform the functions that are described in this document, in combination with the skill and knowledge of such a person given the level of skill that is appropriate for inventions and disclosures of this type.
[044] In one embodiment, user 102 interacts with the agricultural intelligence computer system 130 using a field manager computing device 104 configured with an operating system and one or more application programs or apps; the field manager computing device 104 can also interoperate with the agricultural intelligence computer system independently and automatically under program control or logical control, and direct user interaction is not always required. The field manager computing device 104 broadly represents one or more of a smartphone, PDA, tablet computing device, laptop computer, desktop computer, workstation, or any other computing device capable of transmitting and receiving information and performing the functions described in this document. The field manager computing device 104 can communicate over a network using a mobile application stored on the field manager computing device 104, and in some embodiments the device can be coupled using a cable 113 or connector to the sensor 112 and/or the controller 114. A particular user 102 may own, operate or possess and use, in connection with system 130, more than one field manager computing device 104 at a time.
[045] The mobile application can provide client-side functionality, over the network, to one or more mobile computing devices. In an example embodiment, the field manager computing device 104 can access the mobile application via a web browser or a local client application or app. The field manager computing device 104 can transmit data to, and receive data from, one or more front-end servers using web-based protocols or formats, such as HTTP, XML and/or JSON, or app-specific protocols. In an example embodiment, the data can take the form of requests and user information input, such as field data, into the mobile computing device. In some embodiments, the mobile application interacts with location tracking hardware and software on the field manager computing device 104 that determines the location of the field manager computing device 104 using standard tracking techniques, such as multilateration of radio signals, the global positioning system (GPS), WiFi positioning systems, or other mobile positioning methods. In some cases, location data or other data associated with device 104, user 102 and/or the user account (or user accounts) can be obtained by querying an operating system of the device or by requesting that an app on the device obtain data from the operating system.
[046] In one embodiment, the field manager computing device 104 sends field data 106 to the agricultural intelligence computer system 130, comprising or including, but not limited to, data values representing one or more of: a geographic location of one or more fields, tillage information for one or more fields, crops planted in one or more fields, and soil data extracted from one or more fields. The field manager computing device 104 can send field data 106 in response to user input from user 102 specifying the data values for the one or more fields. In addition, the field manager computing device 104 can automatically send field data 106 when one or more of the data values become available to the field manager computing device 104. For example, the field manager computing device 104 can be communicatively coupled to remote sensor 112 and/or application controller 114. In response to receiving data indicating that application controller 114 has released water in one or more fields, the field manager computing device 104 can send field data 106 to the agricultural intelligence computer system 130 indicating that water has been released in the one or more fields. Field data 106 identified in this disclosure can be input and communicated using electronic digital data that is communicated between computing devices using parameterized URLs over HTTP, or another appropriate communication protocol or message format.
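A hedged sketch of what such a transmission could look like, using the third-party requests library; the endpoint URL, query parameter names and payload fields are all placeholders rather than the actual protocol of the system described.

```python
import requests  # assumed HTTP client; endpoint and parameter names are hypothetical

field_data = {
    "field_id": "field-42",
    "event": "irrigation",
    "water_released": True,
    "timestamp": "2016-06-28T14:05:00Z",
}
response = requests.post(
    "https://agintel.example.com/api/field-data",   # placeholder endpoint
    params={"user_id": "102", "device_id": "104"},  # parameterized URL query values
    json=field_data,                                # body serialized as JSON
    timeout=10,
)
response.raise_for_status()
```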
[047] A commercial example of the mobile application is CLIMATE FIELD VIEW, commercially available from The Climate Corporation, San Francisco, California, USA. The CLIMATE FIELD VIEW application, or other applications, may be modified, extended or adapted to include features, functions and programming that had not been disclosed as of the filing date of this disclosure. In one embodiment, the mobile application comprises an integrated software platform that allows a producer to make fact-based decisions for his or her operation because it combines historical data about the producer's fields with any other data that the producer wishes to compare. The combinations and comparisons can be performed in real time and are based on scientific models that provide potential scenarios to enable the producer to make better, more informed decisions.
[048] Figure 2 illustrates two views of an example logical organization of instruction sets in main memory when an example mobile application is loaded for execution. In Figure 2, each named element represents a region of one or more pages of RAM or other main memory, or one or more blocks of disk storage or other non-volatile storage, and the programmed instructions in those regions. In one embodiment, in view (a), a mobile computer application 200 comprises account, fields, data ingestion and sharing instructions 202, overview and alert instructions 204, digital map book instructions 206, seeds and planting instructions 208, nitrogen instructions 210, weather instructions 212, field health instructions 214, and performance instructions 216.
[049] In one embodiment, a mobile computer application 200 comprises account, fields, data ingestion and sharing instructions 202 that are programmed to receive, translate and ingest field data from third-party systems via manual upload or APIs. The data types can include field boundaries, yield maps, planting maps, soil test results, application maps, and/or management zones, among others. The data formats can include shapefiles, native data formats of third parties, and/or farm management information system (FMIS) exports, among others. Receiving data can occur via manual upload, e-mail with attachment, external APIs that push data to the mobile application, or instructions that call APIs of external systems to pull data into the mobile application. In one embodiment, the mobile computer application 200 comprises a data inbox. In response to receiving a selection of the data inbox, the mobile computer application 200 can display a graphical user interface for manually uploading data files and importing the uploaded files to a data manager.
[050] In one embodiment, digital map book instructions 206 comprise field map data layers stored in device memory and are programmed with data visualization tools and geospatial field notes. This provides producers with convenient information close at hand for reference, logging and visual insights into field performance. In one embodiment, overview and alert instructions 204 are programmed to provide an operation-wide view of what is important to the producer, and specific recommendations for taking action or focusing on particular problems. This allows the producer to focus time on what needs attention, saving time and preserving yield throughout the season. In one embodiment, seeds and planting instructions 208 are programmed to provide tools for seed selection, hybrid placement and script creation, including variable rate (VR) script creation, based on scientific models and empirical data. This enables producers to maximize yield or return on investment through optimized seed purchase, placement and population.
[051] In one embodiment, script generation instructions 205 are programmed to provide an interface for generating scripts, including variable rate (VR) fertility scripts. The interface allows producers to create scripts for field implements, such as nutrient applications, planting and irrigation. For example, a planting script interface can comprise tools for identifying a type of seed for planting. Upon receiving a selection of the seed type, the mobile computer application 200 can display one or more fields divided into management zones, such as the field map data layers created as part of digital map book instructions 206. In one embodiment, the management zones comprise soil zones along with a panel identifying each soil zone and a soil name, texture and drainage for each zone, or other field data. The mobile computer application 200 can also display tools for editing or creating such zones, such as graphical tools for drawing management zones, such as soil zones, over a map of one or more fields. Planting procedures can be applied to all management zones, or different planting procedures can be applied to different subsets of management zones. When a script is created, the mobile computer application 200 can make the script available for download in a format readable by an application controller, such as an archived or compressed format. In addition and/or alternatively, a script can be sent directly to the cabin computer 115 from the mobile computer application 200 and/or uploaded to one or more data servers and stored for further use. In one embodiment, nitrogen instructions 210 are programmed to provide tools for informing nitrogen decisions by visualizing the availability of nitrogen to crops. This enables producers to maximize yield or return on investment through improved nitrogen application during the season. Example programmed functions include displaying images, such as SSURGO images, to enable drawing of nitrogen application zones, and/or images generated from subfield soil data, such as data obtained from sensors, at a high spatial resolution (such as 10 meters or less, owing to their proximity to the soil); uploading existing producer-defined zones; providing an application graph and/or a map to enable tuning the nitrogen application (or applications) across multiple zones; issuing scripts to drive machinery; tools for mass data entry and adjustment; and/or maps for data visualization, among others. Mass data entry, in this context, can mean entering data once and then applying the same data to multiple fields that have been defined in the system; example data can include nitrogen application data that is the same for many fields of the same producer, but such mass data entry applies to the entry of any type of field data into the mobile computer application 200. For example, nitrogen instructions 210 can be programmed to accept definitions of nitrogen planting and practice programs and to accept user input specifying that such programs be applied across multiple fields. Nitrogen planting programs, in this context, refer to a stored, named data set that associates: a name, color code or other identifier; one or more application dates; types of material or product for each of the dates, and amounts; a method of application or incorporation, such as injected or dribbled; and/or amounts or application rates for each of the dates, and the crop or hybrid that is the subject of the application,
among others. Nitrogen practice programs, in this context, refer to a stored, named data set that associates: a practice name; a previous crop; a tillage system; a primary tillage date; one or more previous tillage systems that were used; and one or more application type indicators, such as manure, that were used. Nitrogen instructions 210 can also be programmed to generate and cause the display of a nitrogen graph, which indicates projections of plant use of the specified nitrogen and whether a surplus or a shortfall is predicted; in some embodiments, different color indicators can signal the magnitude of the surplus or the magnitude of the shortfall. In one embodiment, a nitrogen graph comprises a graphical display on a computer display device comprising a plurality of rows, each row associated with and identifying a field; data specifying which crop is planted in the field, the field size, the field location, and a graphical representation of the field perimeter; in each row, a month-by-month timeline with graphical indicators specifying each nitrogen application and amount at points correlated with the month names; and numerical and/or colored indicators of surplus or shortfall, in which the color indicates magnitude.
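The "stored, named data set" described above maps naturally onto a simple record structure. The dictionaries below are only a hypothetical illustration of what a nitrogen planting program and a nitrogen practice program might hold; every field name and value is made up.

```python
# Hypothetical nitrogen planting program record: a named data set tying
# application dates, products, methods and amounts to a crop or hybrid.
nitrogen_planting_program = {
    "name": "Spring split N",
    "color_code": "#2e7d32",
    "crop_or_hybrid": "corn",
    "applications": [
        {"date": "2016-04-05", "product": "UAN 32%", "method": "injected",
         "amount_lbs_n_per_ac": 80},
        {"date": "2016-06-10", "product": "urea", "method": "broadcast",
         "amount_lbs_n_per_ac": 70},
    ],
}

# Hypothetical nitrogen practice program record.
nitrogen_practice_program = {
    "name": "Corn after soybeans, strip till",
    "previous_crop": "soybeans",
    "tillage_system": "strip till",
    "primary_tillage_date": "2015-11-01",
    "application_type_indicators": ["manure"],
}
```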
[052] In one embodiment, the nitrogen graph can include one or more user input features, such as dials or sliders, for dynamically changing the nitrogen planting and practice programs so that a user can optimize his or her nitrogen graph. The user can then use the optimized nitrogen graph and the related nitrogen planting and practice programs to implement one or more scripts, including variable rate (VR) fertility scripts. Nitrogen instructions 210 can also be programmed to generate and cause the display of a nitrogen map, which indicates projections of plant use of the specified nitrogen and whether a surplus or a shortfall is predicted; in some embodiments, different color indicators can signal the magnitude of the surplus or the magnitude of the shortfall. The nitrogen map can display projections of plant use of the specified nitrogen, and whether a surplus or shortfall is predicted, for different times in the past and the future (such as daily, weekly, monthly or yearly) using numerical and/or colored indicators of surplus or shortfall, in which the color indicates magnitude. In one embodiment, the nitrogen map can include one or more user input features, such as dials or sliders, for dynamically changing the nitrogen planting and practice programs so that a user can optimize his or her nitrogen map, for example to obtain a preferred amount of surplus relative to shortfall. The user can then use the optimized nitrogen map and the related nitrogen planting and practice programs to implement one or more scripts, including variable rate (VR) fertility scripts. In other embodiments, instructions similar to nitrogen instructions 210 could be used for the application of other nutrients (such as phosphorus and potassium), pesticide application, and irrigation programs.
[053] In one embodiment, weather instructions 212 are programmed to provide field-specific recent weather data and forecast weather information. This enables producers to save time and have an efficient integrated display with respect to daily operational decisions.
[054] In one embodiment, field health instructions 214 are programmed to provide timely remotely sensed images highlighting in-season crop variation for at least one crop development stage and potential problems. Example programmed functions include cloud checking to identify possible clouds or cloud shadows; determining nitrogen indices based on field images; graphical visualization of scouting layers, including, for example, those related to field health, and viewing and/or sharing of scouting observations; and/or downloading satellite images from multiple sources and prioritizing the images for the producer, among others.
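As a rough illustration of the kind of per-image checks mentioned above, the sketch below estimates a cloud fraction from an RGB frame and computes NDVI from near-infrared and red bands; NDVI is used here only as a common stand-in for the field-health and nitrogen index layers, not necessarily the index used by the disclosure.

```python
import numpy as np

def cloud_fraction(rgb: np.ndarray, brightness_threshold: int = 220) -> float:
    """Very rough cloud check: fraction of near-white pixels in an RGB image."""
    bright = np.all(rgb > brightness_threshold, axis=-1)
    return float(bright.mean())

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index, (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)
```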
[055] In one embodiment, performance instructions 216 are programmed to provide reports, analysis and insight tools using on-farm data for evaluation, insights and decisions. This enables the producer to seek improved results for the following year through fact-based conclusions about why return on investment was at its prior level, and insight into yield-limiting factors. Performance instructions 216 can be programmed to communicate over the network (or networks) 109 with back-end analytics programs executed on the agricultural intelligence computer system 130 and/or the external data server computer 108 and configured to analyze metrics such as yield, hybrid, population, SSURGO, soil tests or elevation, among others. Programmed reports and analysis can include correlations between yield and another agricultural data parameter or variable, analysis of yield variability, benchmarking of yield and other metrics against other producers based on anonymized data collected from many producers, or data for seeds and planting, among others.
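Such correlation reports amount to standard statistics over per-field records. A minimal, hypothetical example (made-up numbers) of correlating yield with one agronomic variable:

```python
import numpy as np

# Anonymized per-field records (made-up numbers) pairing yield with one agronomic variable.
yields_bu_ac = np.array([172.0, 181.5, 158.0, 190.2, 167.4])
seeding_rate_k = np.array([32.0, 34.0, 30.0, 36.0, 31.0])   # thousand seeds per acre

# Pearson correlation between yield and the chosen parameter, as such a report might compute.
r = np.corrcoef(yields_bu_ac, seeding_rate_k)[0, 1]
print(f"yield vs. seeding-rate correlation: {r:.2f}")
```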
[056] Applications having instructions configured in this way can be implemented for different computing device platforms while retaining the same general user interface appearance. For example, the mobile application can be programmed for execution on tablet computers, smartphones or server computers that are accessed using browsers on client computers. In addition, the mobile application as configured for tablet computers or smartphones can provide a full app experience or a cabin app experience that is suited to the display and processing capabilities of the cabin computer 115. For example, referring now to view (b) of Figure 2, in one embodiment a cabin computer application 220 can comprise map cabin instructions 222, remote view instructions 224, data collection and transfer instructions 226, machine alert instructions 228, script transfer instructions 230, and scouting cabin instructions 232. The code base for the instructions of view (b) can be the same as for view (a), and the executables implementing the code can be programmed to detect the type of platform on which they are executed and to expose, through a graphical user interface, only the functions that are appropriate for a cabin platform or a full platform. This approach allows the system to recognize the distinctly different user experience that is appropriate for an in-cabin environment and the different technology environment of the cabin. Map cabin instructions 222 can be programmed to provide map views of fields, farms or regions that are useful in directing machine operation. Remote view instructions 224 can be programmed to enable, manage and provide views of machine activity in real time or near real time to other computing devices connected to system 130 via wireless networks, wired connectors or adapters, and the like. Data collection and transfer instructions 226 can be programmed to enable, manage and provide transfer of data collected from machine sensors and controllers to system 130 via wireless networks, wired connectors or adapters, and the like. Machine alert instructions 228 can be programmed to detect problems with operations of the machine or tools that are associated with the cabin and to generate operator alerts. Script transfer instructions 230 can be configured to transfer in scripts of instructions that are configured to direct machine operations or data collection. Scouting cabin instructions 232 can be programmed to display location-based alerts and
information received from system 130 based on the location of the agricultural apparatus 111 or of sensors 112 in the field, and to ingest, manage and provide transfer of location-based scouting observations to system 130 based on the location of the agricultural apparatus 111 or sensors 112 in the field.
[057] In one embodiment, the external data server computer 108 stores external data 110, including soil data representing the soil composition for the one or more fields and weather data representing temperature and precipitation in the one or more fields. The weather data can include past and present weather data, as well as forecasts of future weather data. In one embodiment, the external data server computer 108 comprises a plurality of servers hosted by different entities. For example, a first server can contain soil composition data, while a second server can include weather data. Additionally, soil composition data can be stored on multiple servers. For example, one server can store data representing the percentage of sand, silt and clay in the soil, while a second server can store data representing the percentage of organic matter (OM) in the soil.
[058] In one embodiment, remote sensor 112 comprises one or more sensors that are programmed or configured to produce one or more observations. Remote sensor 112 can be an aerial sensor, such as a satellite; a vehicle sensor; an image sensor (for example, an image capture device for capturing images of crop or soil conditions); a planting equipment sensor; a tillage sensor; a fertilizer or insecticide application sensor; a harvester sensor; or any other implement capable of receiving data from the one or more fields. In one embodiment, application controller 114 is programmed or configured to receive instructions from the agricultural intelligence computer system 130. Application controller 114 can also be programmed or configured to control an operating parameter of an agricultural vehicle or implement. For example, an application controller can be programmed or configured to control an operating parameter of a vehicle or implement, such as a tractor, planting equipment, tillage equipment, fertilizer or insecticide equipment, harvester equipment, or other farm implements, such as a water valve. Other embodiments can use any combination of sensors and controllers, of which the following are merely selected examples.
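A minimal sketch of how an application controller of the kind just described might translate a received instruction into a single operating-parameter change (a water valve), assuming a callback that writes to the implement's control bus; the message fields and class name are hypothetical.

```python
class ApplicationController:
    """Hypothetical controller that applies instructions received from the
    agricultural intelligence computer system to one operating parameter."""
    def __init__(self, send_to_implement):
        self.send_to_implement = send_to_implement   # callback to the actual hardware bus

    def handle_instruction(self, instruction: dict) -> None:
        if instruction.get("target") == "water_valve":
            # e.g. {"target": "water_valve", "state": "open", "flow_l_per_min": 120}
            self.send_to_implement({"valve_state": instruction["state"],
                                    "flow_l_per_min": instruction.get("flow_l_per_min", 0)})
        else:
            raise ValueError(f"unsupported target: {instruction.get('target')}")

controller = ApplicationController(send_to_implement=print)   # print stands in for a bus write
controller.handle_instruction({"target": "water_valve", "state": "open", "flow_l_per_min": 120})
```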
[059] System 130 can obtain or ingest data, under control of user 102, on a mass basis from a large number of producers who have contributed data to a shared database system. This way of obtaining data can be called manual data ingestion, since one or more user-controlled computer operations are requested or triggered to obtain data for use by system 130. As an example, the CLIMATE FIELD VIEW application, commercially available from The Climate Corporation, San Francisco, California, USA, can be operated to export data to system 130 for storage in repository 160.
[060] For example, seed monitoring systems can both control planter apparatus components and obtain planting data, including signals from seed sensors via a signal harness comprising a CAN backbone and point-to-point connections for registration and/or diagnostics. Seed monitoring systems can be programmed or configured to display seed spacing, population and other seed information to the user via the cabin computer 115 or other devices within system 130. Examples are disclosed in U.S. Patent No. 8,738,243 and U.S. Patent Publication No. 20150094916, and the present disclosure assumes knowledge of those other patent disclosures.
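As an illustration of the kind of seed information such a monitor could display, the sketch below derives average seed spacing and population from seed-sensor pulse timestamps and a constant ground speed; the 0.76 m row width and all numbers are assumptions for the example, not values from the disclosure.

```python
def seed_spacing_and_population(pulse_times_s, ground_speed_m_s, row_width_m=0.76):
    """Estimate seed spacing and population from seed-sensor pulse timestamps
    (one pulse per seed) and a constant ground speed; illustrative only."""
    gaps = [t2 - t1 for t1, t2 in zip(pulse_times_s, pulse_times_s[1:])]
    spacing_m = ground_speed_m_s * (sum(gaps) / len(gaps))       # average seed-to-seed distance
    seeds_per_m2 = 1.0 / (spacing_m * row_width_m)
    return spacing_m, seeds_per_m2 * 10_000                      # seeds per hectare

spacing, population_ha = seed_spacing_and_population(
    pulse_times_s=[0.00, 0.12, 0.25, 0.37, 0.50], ground_speed_m_s=1.3)
```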
[061] Similarly, yield monitoring systems can contain yield sensors for harvester apparatus that send yield measurement data to the cabin computer 115 or other devices within system 130. Yield monitoring systems can use one or more remote sensors 112 to obtain grain moisture measurements on a combine or other harvester and transmit those measurements to the user via the cabin computer 115 or other devices within system 130.
[062] In one embodiment, examples of sensors 112 that can be used with any moving vehicle or apparatus of the type described elsewhere in this document include kinematic sensors and position sensors. Kinematic sensors can comprise any speed sensors, such as radar or wheel speed sensors, accelerometers, or gyros. Position sensors can comprise GPS receivers or transceivers, or WiFi-based positioning or mapping apps that are programmed to determine location based on nearby WiFi access points, among others.
[063] In one embodiment, examples of sensors 112 that can be used with tractors or other moving vehicles include engine speed sensors, fuel consumption sensors, area counters or distance counters that interact with GPS or radar signals, PTO (power take-off) speed sensors, tractor hydraulic sensors configured to detect hydraulic parameters such as pressure or flow and/or hydraulic pump speed, wheel speed sensors, or wheel slippage sensors. In one embodiment, examples of controllers 114 that can be used with tractors include hydraulic directional controllers, pressure controllers and/or flow controllers; hydraulic pump speed controllers; speed controllers or governors; hitch position controllers; or wheel position controllers that provide automatic steering.
[064] In one embodiment, examples of sensors 112 that can be used with seed planting equipment, such as planters, drills or air seeders, include seed sensors, which can be optical, electromagnetic or impact sensors; downforce sensors, such as load pins, load cells or pressure sensors; soil property sensors, such as reflectivity sensors, moisture sensors, electrical conductivity sensors, optical residue sensors, or temperature sensors; component operating criteria sensors, such as planting depth sensors, downforce cylinder pressure sensors, seed disc speed sensors, seed drive motor encoders, seed conveyor system speed sensors, or vacuum level sensors; or pesticide application sensors, such as optical or other electromagnetic sensors, or impact sensors. In one embodiment, examples of controllers 114 that can be used with such seed planting equipment include: toolbar fold controllers, such as controllers for valves associated with hydraulic cylinders; downforce controllers, such as controllers for valves associated with pneumatic cylinders, air bags or hydraulic cylinders, programmed to apply downforce to individual row units or to an entire planter frame; planting depth controllers, such as linear actuators; metering controllers, such as electric seed meter drive motors, hydraulic seed meter drive motors, or swath control clutches; hybrid selection controllers, such as seed meter drive motors or other actuators programmed to selectively allow or prevent seed, or a mixture of air and seed, from delivering seed to or from seed meters or central bulk hoppers; metering controllers, such as
Petition 870190060828, of 06/28/2019, p. 43/85
38/71 electric seed meter drive, or hydraulic seed meter drive motors; seed conveyor system controllers, such as controllers for a seed distribution conveyor motor; marker controllers, such as a controller for a pneumatic or hydraulic actuator; or pesticide application rate controllers, such as metering trigger controllers, orifice size or position controllers.
[065] In one embodiment, examples of sensors 112 that can be used with tillage equipment include position sensors for tools such as shanks or discs; tool position sensors for such tools that are configured to detect depth, gang angle, or lateral tool spacing; downforce sensors; or draft force sensors. In one embodiment, examples of controllers 114 that can be used with tillage equipment include downforce controllers or tool position controllers, such as controllers configured to control depth, gang angle, or lateral tool spacing.
[066] In one embodiment, examples of sensors 112 that can be used in connection with apparatus for applying fertilizer, insecticide, fungicide, and the like, such as on-planter starter fertilizer systems, subsoil fertilizer applicators, or fertilizer sprayers, include: fluid system criteria sensors, such as flow sensors or pressure sensors; sensors indicating which spray head valves or fluid line valves are open; sensors associated with tanks, such as fill level sensors; sectional or system-wide supply line sensors, or row-specific supply line sensors; or kinematic sensors, such as accelerometers disposed on sprayer booms. In one embodiment, examples of controllers 114 that can be used with such apparatus include pump speed controllers; valve controllers that are programmed to control pressure, flow, direction, PWM, and the like; or position actuators, such as for boom height, subsoiler depth, or boom position.
[067] In one embodiment, examples of sensors 112 that can be used with harvesters include yield monitors, such as impact plate strain gauges or position sensors, capacitive flow sensors, load sensors, weight sensors, or torque sensors associated with elevators or augers, or optical or other electromagnetic grain height sensors; grain moisture sensors, such as capacitive sensors; grain loss sensors, including impact, optical, or capacitive sensors; header operating criteria sensors, such as header height, header type, deck plate gap, feeder speed, and reel speed sensors; separator operating criteria sensors, such as concave clearance, rotor speed, shoe clearance, or chaffer clearance sensors; auger sensors for position, operation, or speed; or engine speed sensors. In one embodiment, examples of controllers 114 that can be used with harvesters include header operating criteria controllers for elements such as header height, header type, deck plate gap, feeder speed, or reel speed; separator operating criteria controllers for features such as concave clearance, rotor speed, shoe clearance, or chaffer clearance; or controllers for auger position, operation, or speed.
[068] In one embodiment, examples of sensors 112 that can be used with grain carts include weight sensors, or sensors for auger position, operation, or speed. In one embodiment, examples of controllers 114 that can be used with grain carts include controllers for auger position, operation, or speed.
[069] In one embodiment, examples of sensors 112 and controllers 114 may be installed in unmanned aerial vehicle (UAV) apparatus or drones. Such sensors may include image capture devices or cameras with detectors effective for any range of the electromagnetic spectrum, including visible light, infrared, ultraviolet, near-infrared (NIR) light, and the like; accelerometers; altimeters; temperature sensors; humidity sensors; pitot tube sensors or other sensors for airspeed or wind velocity; battery life sensors; or radar emitters and apparatus for detecting reflected radar energy. Such controllers may include guidance or motor control apparatus, control surface controllers, camera controllers, or controllers programmed to turn on, operate, obtain data from, manage, and configure any of the foregoing sensors. Examples are disclosed in U.S. Patent Application No. 14/831,165, and the present disclosure assumes knowledge of that other patent disclosure.
[070] In one embodiment, sensors 112 and controllers 114 can be attached to soil sampling and measurement apparatus that is configured or programmed to sample soil and perform soil chemistry tests, soil moisture tests, and other tests pertaining to soil. For example, the apparatus disclosed in U.S. Patent No. 8,767,194 and U.S. Patent No. 8,712,148 can be used, and the present disclosure assumes knowledge of those other patent disclosures.
[071] In another embodiment, sensors 112 and controllers 114 can comprise weather devices for monitoring weather conditions of fields. For example, the apparatus disclosed in International Patent Application No. PCT/US2016/029609 can be used, and the present disclosure assumes knowledge of that patent disclosure.
[072] In one embodiment, the agricultural intelligence computer system 130 is programmed or configured to create an agronomic model. In this context, an agronomic model is a data structure in memory of the agricultural intelligence computer system 130 that comprises field data 106, such as identification data and harvest data for one or more fields. The agronomic model can also comprise calculated agronomic properties that describe conditions that can affect the growth of one or more crops in a field, or properties of one or more crops, or both. Additionally, an agronomic model can comprise recommendations based on agronomic factors, such as crop recommendations, irrigation recommendations, planting recommendations, and harvest recommendations. Agronomic factors can also be used to estimate one or more crop-related results, such as agronomic yield. The agronomic yield of a crop is an estimate of the quantity of crop that is produced or, in some examples, the revenue or profit obtained from the crop produced.
[073] In one embodiment, the agricultural intelligence computer system 130 can use a preconfigured agronomic model to calculate agronomic properties related to currently received location and crop information for one or more fields. The preconfigured agronomic model is based on previously processed field data, including, without limitation, identification data, harvest data, fertilizer data, image data, and weather data. The preconfigured agronomic model may have been cross-validated to ensure accuracy of the model. Cross-validation can include comparison to ground truthing that compares predicted results with actual results in a field, such as a comparison of a precipitation estimate with a rain gauge or sensor providing weather data at the same or a nearby location, or an estimate of nitrogen content with a soil sample measurement.
[074] Figure 3 illustrates a programmed process by which the agricultural intelligence computer system generates one or more preconfigured agronomic models using field data provided by one or more data sources. Figure 3 can serve as an algorithm or instructions for programming the functional elements of the agricultural intelligence computer system 130 to perform the operations that are now described.
[075] In block 305, the agricultural intelligence computer system 130 is configured or programmed to implement agronomic data preprocessing of field data received from one or more data sources. The field data received from one or more data sources can be preprocessed for the purpose of removing noise and distorting effects within the agronomic data, including measured outliers that would skew received field data values. Embodiments of agronomic data preprocessing may include, but are not limited to, removing data values commonly associated with outlier data values, removing specific measured data points that are known to unnecessarily skew other data values, data smoothing techniques used to remove or reduce additive or multiplicative noise effects, and other filtering or data derivation techniques used to provide clear distinctions between positive and negative data inputs.
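By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch shows one possible realization of the preprocessing of block 305 under the assumption that the field data arrive as a plain list of numeric observations; the z-score threshold and window size are assumptions.

```python
# Illustrative sketch only: outlier removal plus smoothing, one way block 305's
# preprocessing could be realized for a list of numeric agronomic observations.
import statistics

def remove_outliers(values, z_max=3.0):
    """Drop points more than z_max standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return list(values)
    return [v for v in values if abs(v - mean) / stdev <= z_max]

def smooth(values, window=3):
    """Simple moving average to reduce additive noise effects."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

raw = [181, 179, 184, 990, 182, 178, 185, 180]   # 990 is a measured outlier
clean = smooth(remove_outliers(raw))
print(clean)
```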
[076] In block 310, the agricultural intelligence computer system 130 is configured or programmed to perform data subset selection using the preprocessed field data in order to identify data sets useful for generating an initial agronomic model. The agricultural intelligence computer system 130 can implement data subset selection techniques that include, but are not limited to, a genetic algorithm method, an all-subset models method, a sequential search method, a stepwise regression method, a particle swarm optimization method, and an ant colony optimization method. For example, a genetic algorithm selection technique uses an adaptive heuristic search algorithm, based on evolutionary principles of natural selection and genetics, to determine and evaluate data sets within the preprocessed agronomic data.
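By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch shows a toy genetic-algorithm subset selection in the spirit of block 310; the population size, mutation rate, and fitness function are stand-ins, since a real system would score each subset with a model-quality metric such as the cross-validation error of block 315.

```python
# Illustrative sketch only: a toy genetic algorithm that evolves a boolean mask
# selecting a subset of candidate data sets.
import random

def ga_select(num_sets, fitness, pop_size=20, generations=40, p_mut=0.05):
    """Return a boolean mask choosing a subset of num_sets data sets."""
    pop = [[random.random() < 0.5 for _ in range(num_sets)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]              # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, num_sets)        # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([(not g) if random.random() < p_mut else g for g in child])
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: prefer subsets containing the first three (pretend-informative)
# data sets while lightly penalizing overall subset size.
best = ga_select(10, lambda mask: sum(mask[:3]) - 0.1 * sum(mask))
print(best)
```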
[077] In block 315, the agricultural intelligence computer system 130 is configured or programmed to implement field data set evaluation. In one embodiment, a specific field data set is evaluated by creating an agronomic model and using specific quality thresholds for the created agronomic model. Agronomic models can be compared using cross-validation techniques that include, without limitation, root mean square error of leave-one-out cross-validation (RMSECV), mean absolute error, and mean percentage error. For example, RMSECV can cross-validate agronomic models by comparing predicted agronomic property values created by the agronomic model against historical agronomic property values that were collected and analyzed. In one embodiment, the agronomic data set evaluation logic is used as a feedback loop in which agronomic data sets that do not satisfy the configured quality thresholds are used during future data subset selection steps (block 310).
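By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch computes leave-one-out RMSECV for a simple linear agronomic model; the numpy arrays and the example predictor (applied nitrogen) are assumptions.

```python
# Illustrative sketch only: leave-one-out RMSECV for a linear model, assuming
# numpy arrays X (samples x properties) and y (observed agronomic values).
import numpy as np

def rmsecv_loo(X, y):
    """Root mean square error of leave-one-out cross-validation."""
    n = len(y)
    errors = []
    for i in range(n):
        keep = np.arange(n) != i
        A = np.c_[np.ones(keep.sum()), X[keep]]          # intercept + predictors
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        pred = np.r_[1.0, X[i]] @ coef                    # predict the held-out sample
        errors.append(pred - y[i])
    return float(np.sqrt(np.mean(np.square(errors))))

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])   # e.g., applied nitrogen
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])             # e.g., observed yield
print(rmsecv_loo(X, y))
```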
[078] In block 320, the agricultural intelligence computer system 130 is configured or programmed to implement the creation of an agronomic model based on cross-validated agronomic data sets. In one embodiment, the creation of an agronomic model can implement multivariate regression techniques to create preconfigured agronomic data models.
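By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch reduces the model creation of block 320 to an ordinary least-squares multivariate regression; the toy predictors (seasonal rainfall and planting population) and their values are assumptions.

```python
# Illustrative sketch only: multivariate regression model creation with numpy;
# a real preconfigured model would use the cross-validated data sets selected
# in the earlier blocks.
import numpy as np

def fit_agronomic_model(X, y):
    """Return intercept and coefficients for y ~ X (samples x properties)."""
    A = np.c_[np.ones(len(y)), X]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0], coef[1:]

# Two toy predictors: seasonal rainfall (mm) and planting population (k seeds/ha).
X = np.array([[520, 75], [480, 80], [600, 78], [430, 72], [560, 85]], dtype=float)
y = np.array([10.8, 10.1, 12.0, 9.2, 11.9])          # yield, t/ha
intercept, coefs = fit_agronomic_model(X, y)
print(intercept, coefs)
```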
[079] In block 325, the agricultural intelligence computer system 130 is configured or programmed to store pre-configured agronomic data models for future field data evaluation.
[080] According to one embodiment, the techniques described in this document are implemented by one or more special-purpose computing devices. The special-purpose computing devices can be hard-wired to perform the techniques, or can include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or can include one or more general-purpose hardware processors programmed to perform the techniques according to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices can also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices can be desktop computer systems, portable computer systems, handheld devices, network devices, or any other device that incorporates hard-wired and/or program logic to implement the techniques.
[081] For example, Figure 4 is a block diagram illustrating a computer system 400 upon which an embodiment of the invention can be implemented. Computer system 400 includes a bus 402 or other communication mechanism for communicating information, and a hardware processor 404 coupled to bus 402 for processing information. The hardware processor 404 can be, for example, a general-purpose microprocessor.
[082] Computer system 400 also includes main memory 406, such as random access memory (RAM) or another dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 can also be used to store temporary variables or other intermediate information during execution of instructions to be executed by processor 404. Such instructions, when stored in a non-transitory storage medium accessible to processor 404, render computer system 400 into a special-purpose machine that is customized to perform the operations specified in the instructions.
[083] Computer system 400 further includes a read-only memory (ROM) 408 or another static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk, optical disk, or solid-state drive, is provided and coupled to bus 402 for storing information and instructions.
[084] Computer system 400 can be coupled via bus 402 to a display 412, such as a cathode ray tube (CRT), for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to processor 404. Another type of user input device is cursor control 416, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 404 and for controlling cursor movement on display 412. This input device typically has two degrees of freedom on two axes, a first axis (for example, x) and a second axis (for example, y), which allows the device to specify positions in a plane.
[085] Computer system 400 can implement the techniques described in this document using custom hard-wired logic, one or more ASICs or FPGAs, firmware, and/or program logic that, in combination with the computer system, causes or programs computer system 400 to be a special-purpose machine. According to one embodiment, the techniques in this document are performed by computer system 400 in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions can be read into main memory 406 from another storage medium, such as storage device 410. Execution of the sequences of instructions contained in main memory 406 causes processor 404 to perform the process steps described in this document. In alternative embodiments, hard-wired circuitry can be used in place of or in combination with software instructions.
[086] The term storage media, as used in this document, refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific manner. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media include, for example, optical discs, magnetic disks, or solid-state drives, such as storage device 410. Volatile media include dynamic memory, such as main memory 406. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[087] Storage media is distinct from, but can be used in conjunction with, transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
[088] Various forms of media can be involved in carrying one or more sequences of one or more instructions to processor 404 for execution. For example, the instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 400 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector can receive the data carried in the infrared signal, and appropriate circuitry can place the data on bus 402. Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions. The instructions received by main memory 406 can optionally be stored on storage device 410 either before or after execution by processor 404.
[089] Computer system 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420 that is connected to a local network 422. For example, communication interface 418 can be an integrated services digital network (ISDN) card, a cable modem, a satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 418 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links can also be implemented. In any such implementation, communication interface 418 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
[090] Network link 420 typically provides data communication through one or more networks to other data devices. For example, network link 420 may provide a connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426, in turn, provides data communication services through the worldwide packet data communication network now commonly referred to as the Internet 428. Both local network 422 and Internet 428 use electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 420 and through communication interface 418, which carry the digital data to and from computer system 400, are examples of transmission media.
[091] Computer system 400 can send messages and receive data, including program code, through the network(s), network link 420, and communication interface 418. In the Internet example, a server 430 can transmit a requested code for an application program through Internet 428, ISP 426, local network 422, and communication interface 418.
[092] The received code can be executed by processor 404 as it is received and / or stored in storage device 410 or other non-volatile storage for later execution.
[093] Figure 7 illustrates a flowchart of one embodiment of a method 700 of capturing images and creating a localized view map layer for an application pass of an agricultural crop. Method 700 is performed by processing logic that can comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine or device), or a combination thereof. In one embodiment, method 700 is performed by processing logic of at least one computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112, drone, self-guided device, self-propelled device, etc.). The computer system executes instructions of an application or software program with the processing logic. The application or software program can be launched by the computer system, a device, or a remote sensor. In one example, a computer system, a field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs some or all of the operations of method 700. In another example, a computer system 130 in combination with the field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs the operations of method 700.
[094] In block 702, at least one of a device (for example, field manager computing device 104, cabin computer 115, application controller 114, device 111) and a remote sensor (for example, remote sensor 112, image sensor, image capture device, drone, self-guided device, self-propelled device, etc.) moves across a field to capture images of the field that include crops, if they are visible. A launched software application can control operations of an image capture device of the device or of the remote sensor. The remote sensor can be integrated with or coupled to the apparatus (for example, agricultural apparatus 111) that performs an application pass (for example, planting, plowing, fertilization). The source of the images during any pass could be a drone with a camera that is instructed to track (for example, lead or follow) the machine (for example, agricultural apparatus 111) performing the field pass and to capture images of standing crops in front of the machine, of processed crop (for example, ears of corn) entering the machine, or of soil and crop residue on the ground over which the machine has already moved. In another example, a user walks through a field and captures images with a mobile device or tablet device that has an image capture device (for example, a camera) and the software application. In another example, a user guides a device (for example, a wheeled device with a support frame for positioning image capture devices) that has at least one image capture device (for example, remote sensor 112) through a field to capture images. In another example, a self-guided or self-propelled device moves through a field to capture images with the software application. The software application controls whether images are captured continuously or during periods of more stable movement as opposed to unstable movement.
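By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch shows one way a software application could gate capture on motion stability using recent accelerometer magnitudes; the window size and jitter threshold are assumptions.

```python
# Illustrative sketch only: capture only when recent motion is steady enough
# for sharp images, judged by the spread of accelerometer magnitudes (m/s^2).
from collections import deque
import statistics

class CaptureGate:
    def __init__(self, window=20, max_jitter=0.8):
        self.samples = deque(maxlen=window)
        self.max_jitter = max_jitter

    def add_accel(self, magnitude):
        self.samples.append(magnitude)

    def should_capture(self):
        """True once the window is full and recent motion is steady."""
        if len(self.samples) < self.samples.maxlen:
            return False
        return statistics.pstdev(self.samples) <= self.max_jitter

gate = CaptureGate()
for a in [9.8, 9.9, 10.0, 9.7] * 5:      # steady readings near gravity
    gate.add_accel(a)
print(gate.should_capture())             # True for this steady window
```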
[095] In block 704, a computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112, drone, self-guided device, self-propelled device, etc.) analyzes the captured images and determines relevant images that indicate a change in at least one crop development condition (for example, images relevant for further analysis, images showing a change in crop development, a change in ear potential, a change in yield, a change in weed cover, images showing lower predicted yield, etc.). In block 706, the computer system generates a localized view map layer (for example, an image-based map layer, a time-lapse video, a 360 degree view) for viewing the field (for example, at a particular crop development stage, during the application pass) based on at least the relevant captured images (or a subset of all captured images). An image-based map layer can comprise a map layer of image capture locations (for example, locations along the path of a vehicle or implement that crosses the field). In this way, fewer and more relevant images can be saved, reducing the memory resources needed to store these images and a localized view map layer.
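By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch keeps "relevant" images by comparing a simple per-image score against the previously kept image and emits georeferenced entries for a localized view map layer; the green-pixel proxy, the 10% change threshold, and the field names are assumptions.

```python
# Illustrative sketch only: relevance filtering and map-layer assembly.
import numpy as np

def green_fraction(rgb):
    """rgb: HxWx3 uint8 array; crude proxy for visible crop canopy."""
    r, g, b = rgb[..., 0].astype(int), rgb[..., 1].astype(int), rgb[..., 2].astype(int)
    return float(np.mean((g > r + 15) & (g > b + 15)))

def build_map_layer(captures, min_change=0.10):
    """captures: iterable of (lat, lon, rgb). Returns the relevant subset."""
    layer, last_score = [], None
    for lat, lon, rgb in captures:
        score = green_fraction(rgb)
        if last_score is None or abs(score - last_score) >= min_change:
            layer.append({"lat": lat, "lon": lon, "green_fraction": score, "image": rgb})
            last_score = score
    return layer

# Two synthetic frames: bare soil, then canopy.
soil = np.zeros((4, 4, 3), dtype=np.uint8)
soil[:] = (120, 100, 80)
crop = np.zeros((4, 4, 3), dtype=np.uint8)
crop[:] = (40, 160, 40)
print(len(build_map_layer([(41.0, -89.0, soil), (41.0001, -89.0, crop)])))
```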
[096] In block 708, the computer system generates and causes a graphical user interface to display yield data that includes a yield map in response to a user input. In block 710, the computer system receives a user selection of a region of the yield map. In block 712, the computer system generates and causes the graphical user interface to display a localized view map layer that is geographically associated with the selected region of the yield map in response to the user selection. The localized view map layer can be overlaid with a second map layer (for example, an agronomic information layer such as a yield map, a planting population map, a seed spacing map, a planting downforce map, or a field health map such as an NDVI map). The user can then identify a region of interest on the second map layer and select the region of interest on the overlaid localized view map layer in order to view captured images and/or video for the region of interest and gain a better understanding of the actual field conditions for the selected region. If the selected region has lower yield than other regions, then a user may be able to identify any problems (for example, weed cover, shorter crops compared to crops in other regions, smaller ear size for corn, crops with fewer leaves compared to crops in other regions) that cause the lower yield. If the selected region has higher yield than other regions, then a user may be able to identify certain crop characteristics or parameters (for example, lack of weed cover, taller crops compared to crops in other regions, larger ear size for corn, crops with more leaves compared to crops in other regions) that cause the higher yield.
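By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch answers "show me the images behind this yield-map region" by matching a selected region's bounding box against the localized view map layer entries sketched above; the field names and coordinates are assumptions.

```python
# Illustrative sketch only: selecting map-layer entries for a yield-map region.

def images_for_region(map_layer, region):
    """region: dict with lat_min/lat_max/lon_min/lon_max from a yield-map selection."""
    return [
        entry for entry in map_layer
        if region["lat_min"] <= entry["lat"] <= region["lat_max"]
        and region["lon_min"] <= entry["lon"] <= region["lon_max"]
    ]

layer = [
    {"lat": 41.0000, "lon": -89.0000, "green_fraction": 0.05},   # low-yield spot
    {"lat": 41.0010, "lon": -89.0005, "green_fraction": 0.62},
]
selection = {"lat_min": 40.9995, "lat_max": 41.0005,
             "lon_min": -89.0005, "lon_max": -88.9995}
print(images_for_region(layer, selection))   # returns only the first entry
```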
[097] In another embodiment, blocks 708 and 710 are optional. A user may wish to view the localized view map layer earlier in a growing season or before yield data is available. In this case, in block 714, the computer system identifies regions where a higher yield potential is expected (for example, a higher ear potential for corn) and regions where a lower yield potential is expected (for example, a lower ear potential for corn). Given the predictions of regions with higher and lower yield potential, a user may be able to take action to increase yield in regions with lower yield potential. For example, a user can increase or decrease fertilization, spraying, etc. as appropriate. A user can also remove crops that are expected to have a lower yield potential, and this can increase yield for neighboring crops.
[098] Figure 8 illustrates a flowchart of one embodiment of a method 800 of image capture and analysis for an application pass of an agricultural crop. Method 800 is performed by processing logic that can comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine or device), or a combination thereof. In one embodiment, method 800 is performed by processing logic of at least one computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112, drone, self-guided device, self-propelled device, etc.). The computer system executes instructions of an application or software program with the processing logic. The application or software program can be launched by the computer system, a device, or a remote sensor. In one example, a computer system, a field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs some or all of the operations of method 800. In another example, a computer system 130 in combination with the field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs the operations of method 800.
[099] In block 802, a device (for example, device 111, vehicle, planter, tractor, harvester, sprayer, implement, etc.) performs an application pass (for example, planting, plowing, fertilizing, etc.) over a field and, at the same time, captures images of the field that include crops, if they are visible, during the application pass. Alternatively, a remote sensor (for example, remote sensor 112, drone, image capture device) associated with the device captures images during the application pass. The source of the images during any pass could be a drone with a camera that is instructed to track (for example, lead or follow) the machine that performs the field pass.
[100] In block 804, the device (or a computer system in communication with the device) generates a localized view (for example, an image-based map layer, a time-lapse video, a 360 degree view) for viewing the field during the application pass based on the captured images. In block 806, the device (or a computer system in communication with the device) automatically analyzes (for example, planting analysis, fertilizer analysis, harvest analysis, tillage analysis) the images captured during the application pass. The device (or a computer system in communication with the device) automatically performs the analysis in real time during the application pass or, alternatively, if necessary, communicates with an agricultural system for the image analysis. In block 808, the device (or a computer system in communication with the device) adjusts settings of the application pass, if appropriate, based on the analysis of the images captured during the application pass.
[101] For example, in a planting application pass, the planting analysis may include determining the current field conditions (for example, wet soil, dry soil, weather conditions, etc.) from the captured images, and that analysis can cause an adjustment to parameters (for example, planter speed, downforce, etc.) of the planter during the planting pass. In another example, in a fertilizer application pass, a remote sensor (for example, a drone camera) could lead the device, machine, or implement, gather images of the plants in front of the device, machine, or implement, determine a crop health criterion (for example, crop growth stage, percentage or amount of weed cover) based on the images as part of the fertilizer analysis in block 806, and then adjust settings automatically in block 808 or pass the criterion to the device, machine, or implement, which could adjust an application rate for the fertilizer based on the crop health criterion. In one example, a lower amount of weed cover in a given region or swath of a field would result in less fertilizer being applied to that particular region or swath of the field. In contrast, a greater amount of weed cover in a given region or swath of a field would result in more fertilizer being applied to that particular region or swath of the field.
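By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch maps the measured cover criterion to an application rate following the relationship described above (lower cover, lower rate; higher cover, higher rate); the minimum and maximum rates in kg/ha are assumptions.

```python
# Illustrative sketch only: variable-rate fertilizer adjustment driven by a
# 0..1 cover criterion computed from drone images of the zone ahead.

def fertilizer_rate(cover_fraction, min_rate=100.0, max_rate=200.0):
    """Return an application rate (kg/ha) scaled linearly with measured cover."""
    cover_fraction = min(max(cover_fraction, 0.0), 1.0)
    return round(min_rate + cover_fraction * (max_rate - min_rate), 1)

for cover in (0.1, 0.5, 0.9):
    print(cover, "->", fertilizer_rate(cover), "kg/ha")
```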
[102] In another example, in a harvest application pass, the analysis in block 806 includes identifying crop components (for example, ears of corn) in a crop processing device (for example, a corn head) and identifying the size and health of the crop component (for example, ears of corn). The analysis may also include determining a delay between cutting a crop stalk (for example, a corn stalk) and identifying a crop component (for example, an ear of corn). A device or structure could be added to a harvester duct to guide crop components (for example, ears of corn) into a position suitable for an image capture device of the harvester. A light source (for example, a halogen lamp, an infrared LED) can be installed on the exterior of the harvesting equipment (for example, on the combine header) or in an interior region of the harvesting equipment (for example, in the feeder house) in order to illuminate the crops about to be harvested and/or the harvested crop components (for example, ears of corn) for better analysis. The stalk or ear analysis would then be used to adjust combine settings in block 808.
[103] In another example, in a tillage application pass, the analysis in block 806 includes analysis of soil residue after the tillage pass. The analysis can be used to adjust settings of the device or implement during the tillage application pass.
[104] Figure 9 illustrates a flowchart of one embodiment of a method 900 of capturing images of an agricultural crop in a field and determining crop information for the agricultural crop in the field. Method 900 is performed by processing logic that can comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine or device), or a combination thereof. In one embodiment, method 900 is performed by processing logic of at least one computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, self-propelled device, etc.). The computer system executes instructions of an application or software program with the processing logic. The application or software program can be launched by the computer system, a device, or a remote sensor. In one example, a computer system, a field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112, drone, self-guided device, or self-propelled device performs some or all of the operations of method 900. In another example, a computer system 130 in combination with the field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs the operations of method 900.
[105] In block 902, at least one of a device (for example, field manager computing device 104, cabin computer 115, application controller 114, device 111) and a remote sensor (for example, remote sensor 112, image sensor, image capture device, drone, self-guided device, self-propelled device, etc.) moves across a field to capture images of the field that include crops, if they are visible. A launched software application (for example, an image capture software application, a field application) can control operations of one image capture device or can control operations of multiple image capture devices that are associated with at least one of the device and the remote sensor. In one example, two rows of a crop are captured in the images. The remote sensor can be integrated with or coupled to the apparatus (for example, agricultural apparatus 111) that performs an application pass (for example, planting, plowing, fertilizing) or moves through the field.
[106] The source of the images during any pass could be a remote sensor (for example, a drone with a camera) that is instructed to track (for example, lead or follow) the device that performs the field pass. In another example, a user walks through a field and captures images with a mobile device or tablet-type device that has an image capture device (for example, a camera) and the software application. In another example, a user guides a device (for example, a wheeled device with a support frame for positioning image capture devices) that has at least one image capture device through a field to capture images. In another example, a self-guided or self-propelled device or robot moves across a field to capture images with the software application. The software application controls whether images are captured continuously or during periods of more stable movement as opposed to unstable movement.
[107] In block 904, a computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112 (for example, image sensor, image capture device), drone, self-guided device, self-propelled device, etc.) determines location information (for example, GPS data), orientation information (for example, from a gyroscope or accelerometer), time information (for example, time of day, date, position of the sun), and crop row information (for example, crop row spacing) and associates that information with each captured image. In one example, the crop row spacing for corn is defined at a fixed spacing (for example, 76.2 cm (30 inches)) that can be entered by the user and can act as a reference for determining crop characteristics. In block 906, the computer system analyzes the captured images and determines relevant images (for example, images relevant for further analysis) for saving. In block 908, the computer system generates a localized view map layer (for example, an image-based map layer, a time-lapse video, a 360 degree view) for viewing the field (for example, at a crop development stage, during an application pass) based on at least the relevant captured images (or a subset of all captured images). In this way, fewer and more relevant images can be saved, reducing the memory resources needed to store those images and a localized view map layer. In block 910, the computer system generates and causes a graphical user interface to display yield data that includes a yield map in response to a user input. In block 912, the computer system receives a user selection of a region of the yield map. In block 914, the computer system generates and causes the graphical user interface to display a localized view map layer that is geographically associated with the selected region of the yield map in response to the user selection. The localized view map layer can be overlaid with the yield map. The user can view the localized view map layer in order to gain a better understanding of the actual field conditions for the selected region. If the selected region has lower yield than other regions, then a user may be able to identify any problems (for example, weed cover) that cause the lower yield or to identify crop characteristics or parameters (for example, shorter crops compared to crops in other regions, smaller ear size for corn, crops with fewer leaves compared to crops in other regions) that correlate with the lower yield. If the selected region has higher yield than other regions, then a user may be able to identify any factors (for example, lack of weed cover) that cause the higher yield or to identify certain crop characteristics or parameters (for example, taller crops compared to crops in other regions, larger ear size for corn, crops with more leaves compared to crops in other regions) that correlate with the higher yield.
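By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch shows a per-image record that block 904 could attach to each capture; the 76.2 cm row-spacing default is taken from the text, while the field names and the overall structure are assumptions.

```python
# Illustrative sketch only: the metadata record associated with each captured image.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CaptureRecord:
    image_id: str
    lat: float                      # GPS data
    lon: float
    heading_deg: float              # orientation from gyroscope/accelerometer
    captured_at: datetime           # time of day / date (sun position derivable)
    row_spacing_m: float = 0.762    # user-entered row spacing used as a reference
    relevant: bool = False          # set later by the relevance analysis

rec = CaptureRecord(
    image_id="img_000123",
    lat=41.0003, lon=-89.0002, heading_deg=87.5,
    captured_at=datetime(2016, 6, 28, 14, 30, tzinfo=timezone.utc),
)
print(rec)
```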
[108] Figure 10 illustrates a flowchart of one embodiment of a method 1000 of determining characteristics of an agricultural crop (for example, corn) in a field based on captured images of the crop in the field. Method 1000 is performed by processing logic that can comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine or device), or a combination thereof. In one embodiment, method 1000 is performed by processing logic of at least one computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112, drone, self-guided device, self-propelled device, etc.). The computer system executes instructions of an application or software program with the processing logic. The application or software program can be launched by the computer system, a device, or a remote sensor. In one example, a computer system, a field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs some or all of the operations of method 1000. In another example, a computer system 130 in combination with the field manager computing device 104, cabin computer 115, application controller 114, device 111, remote sensor 112, drone, self-guided device, or self-propelled device performs the operations of method 1000.
[109] In block 1002, at least one of a device (for example, field manager computing device 104, cabin computer 115, application controller 114, device 111) and a remote sensor (for example, remote sensor 112, image sensor, image capture device, drone, self-guided device, self-propelled device, etc.) moves across a field to capture images of the field that include crops, if they are visible. A launched software application (for example, an image capture software application, a field application) can control operations of one image capture device or can control operations of multiple image capture devices that are associated with at least one of the device and the remote sensor. In one example, two rows of a crop are captured in the images. Images can be captured from different points of view (for example, a top view image from above a crop, a side view image from the side of a crop). A launched software application can control operations of an image capture device of the device or of the remote sensor. The remote sensor can be integrated with or coupled to the device (for example, agricultural apparatus 111) that performs an application pass (for example, planting, plowing, fertilization). The source of the images during any pass could be a remote sensor (for example, a drone with a camera) that is instructed to track (for example, lead or follow) the machine that performs the field pass. In another example, a user walks through a field and captures images with a mobile device or tablet-type device that has an image capture device (for example, a camera) and the software application. In another example, a user guides a device (for example, a wheeled device with a support frame for positioning image capture devices) that has at least one image capture device through a field to capture images. In another example, a self-guided or self-propelled device or robot moves across a field to capture images with the software application. The software application controls whether images are captured continuously or during periods of more stable movement as opposed to unstable movement.
[110] In block 1004, a computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112 (for example, image sensor, image capture device), drone, self-guided device, self-propelled device, etc.) determines location information (for example, GPS data), orientation information (for example, a gyroscope provides orientation of the device or remote sensor relative to a reference, an accelerometer provides orientation of the device or remote sensor relative to a reference), time information (for example, time of day, date, position of the sun), shadow information (for example, shadow regions as extracted from the images), and crop row information (for example, crop row spacing) and associates this information with each captured image. In one example, the row spacing for corn is defined at a fixed spacing (for example, 76.2 cm (30 inches)) that can be entered by the user and can act as a reference for determining characteristics of a crop. In block 1006, the computer system analyzes the captured images and the information associated with the images to determine a planting area (for example, a green planting area), a shadow area, and different characteristics of the crop, including the location of a stalk or stem in the field, a relative size of a crop compared to neighboring crops to determine a relative growth stage (for example, corn ear potential), leaf size, leaf length, the number of leaves for each stalk or stem, and a location of a top of a stalk or stem (for example, the tip of an ear of corn). The planting area and shadow area can be used in determining the different characteristics of the crop. In block 1008, the computer system generates and causes a graphical user interface to display the different characteristics of the crop. If the crop characteristics (for example, ear potential) indicate or predict a lower yield than other regions, then a user may be able to identify any problems (for example, weed cover, shorter crops compared to crops in other regions, smaller ear size for corn, crops with fewer leaves compared to crops in other regions) that cause the lower yield and take a corrective action (for example, applying fertilizer, spraying) or a different action (for example, replanting). If the selected region has a higher yield than other regions, then a user may be able to identify certain crop characteristics or parameters (for example, lack of weed cover, taller crops compared to crops in other regions, larger ear size for corn, crops with more leaves compared to crops in other regions) that cause the higher yield.
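By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch estimates a green planting area and a shadow area from an RGB frame with plain numpy thresholds; the thresholds and the synthetic frame are assumptions, not values from the disclosure.

```python
# Illustrative sketch only: green planting area and shadow area fractions.
import numpy as np

def plant_and_shadow_fractions(rgb):
    """rgb: HxWx3 uint8 array. Returns (green_fraction, shadow_fraction)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    brightness = (r + g + b) / 3.0
    green = (g > r + 15) & (g > b + 15)          # likely leaf pixels
    shadow = (brightness < 60) & ~green          # dark, non-vegetation pixels
    n = rgb.shape[0] * rgb.shape[1]
    return float(green.sum()) / n, float(shadow.sum()) / n

frame = np.zeros((2, 3, 3), dtype=np.uint8)
frame[0, :] = (40, 170, 50)     # canopy row
frame[1, :] = (30, 35, 30)      # shaded soil between rows
print(plant_and_shadow_fractions(frame))   # about (0.5, 0.5)
```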
[111] The computer system can be integrated with a device or attached to it that performs an application pass (for example, planting, plowing, fertilization). Alternatively, the computer system can be integrated with a remote sensor (for example, drone, image capture device) associated with the device that captures images during the application pass.
[112] In some embodiments, the operations of the method (or methods) disclosed in this document may be altered, modified, combined, or deleted. The methods in the embodiments of the present disclosure can be performed with a device, apparatus, or computer system as described in this document. The device, apparatus, or computer system can be a conventional general-purpose computer system, or special-purpose computers that are designed or programmed to perform only one function can also be used.
[113] Figure 11 illustrates a diagram 1100 for capturing images of a crop from multiple points of view according to one embodiment. The images are captured with processing logic that can comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine or device), or a combination thereof. In one embodiment, the processing logic of at least one computer system (for example, computer system 130, computer system 400, field manager computing device 104, cabin computer 115, application controller 114, remote sensor 112 (for example, image sensor, image capture device), drone, self-guided device, robot, self-propelled device, etc.) captures images while moving along a direction 670 that is substantially parallel to a row of a crop that has stems 650 to 652. The computer system executes instructions of an application or software program with the processing logic. The application or software program can be launched by the computer system.
[114] A launched software application (for example, an image capture software application, a field application) of the computer system captures images of the field that include stems 650 to 652. In one example, images can be captured from different views (for example, for each stem of a row 690). For stem 650, one or more images are captured at each viewpoint 610, 620, and 630. The images captured at each viewpoint have a respective viewing angle, and in one example each viewing angle is approximately 90 to 120 degrees (for example, 95 to 105 degrees). An image of stem 650 that is captured at viewpoint 630 will capture a larger portion of stem 650 than images of stem 650 that are captured at viewpoints 620 and 610, because viewpoint 630 captures a larger portion (or all) of stem 650. Stem 650 is approximately centered within the viewing angle 632. A line drawn from viewpoint 630 to a center of stem 650 is approximately perpendicular to the direction 670. The stem width measured using such methods can be used, for example, to estimate a growth stage, relative growth stage, yield potential, or ear potential for each plant.
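By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch picks, for each stem, the frame in which the stem is most nearly centered (as with viewpoint 630) and converts its pixel width into a physical width using the known row spacing as a scale reference; the candidate frame data and the pixels-per-row-spacing value are invented for the example.

```python
# Illustrative sketch only: choosing the best-centered view and estimating stem width.

def best_centered_width(candidates, row_spacing_m=0.762, row_spacing_px=600.0):
    """candidates: list of dicts with 'offset_from_center_px' and 'stem_width_px'."""
    best = min(candidates, key=lambda c: abs(c["offset_from_center_px"]))
    metres_per_px = row_spacing_m / row_spacing_px
    return best["stem_width_px"] * metres_per_px * 100.0   # stem width in cm

frames = [
    {"offset_from_center_px": 140, "stem_width_px": 14},   # oblique view 610
    {"offset_from_center_px": 60,  "stem_width_px": 16},   # view 620
    {"offset_from_center_px": 5,   "stem_width_px": 18},   # near-perpendicular view 630
]
print(best_centered_width(frames))   # about 2.3 cm, usable for growth-stage estimates
```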
[115] A high frequency of images is captured from different lateral views in order to obtain an estimate of the stem orientation (for example, along an oblong length (for example, major axis 656) of the stem, along a shorter length (for example, minor axis 657)) and of the dimensions along the major and minor axes of a cross-sectional area of a stem. In one example, stem 650 has a major axis along a y axis of a coordinate system 680, a minor axis along an x axis, and grows vertically along a z axis.
[116] The computer system can be integrated with or coupled to a device that performs an application pass (for example, planting, plowing, fertilization). Alternatively, the computer system can be integrated with a remote sensor (for example, remote sensor 112, drone, image capture device) associated with the device that captures images during the application pass. The source of the images during any pass could be a remote sensor (for example, a drone with a camera) that is instructed to track (for example, lead or follow) the device that performs the field pass. In another example, a user walks through a field and captures images with a mobile device or tablet-type device that has an image capture device (for example, a camera) and the software application. In another example, a user guides a device (for example, a wheeled device with a support frame for positioning image capture devices) that has at least one image capture device through a field to capture images. In another example, a self-guided or self-propelled device or robot moves across a field to capture images with the software application. The software application controls whether images are captured continuously or during periods of more stable movement as opposed to unstable movement.
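By way of a non-limiting illustration (not part of the original disclosure) tied to the stem-orientation estimate described in paragraph [115], the following Python sketch estimates major and minor axis directions and lengths from 2-D points sampled on a stem cross-section using a principal component analysis; the boundary-sampled ellipse and the scaling factor are assumptions for the example.

```python
# Illustrative sketch only: major/minor axis estimation for a stem cross-section.
import numpy as np

def stem_axes(points_xy):
    """points_xy: Nx2 array. Returns (major_length, minor_length, major_direction)."""
    pts = np.asarray(points_xy, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = np.cov(centered.T)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    # Full axis length for points sampled on an ellipse boundary.
    lengths = 2.0 * np.sqrt(2.0 * eigvals[order])
    return lengths[0], lengths[1], eigvecs[:, order[0]]

# Synthetic boundary points: 3 cm extent along y (major), 2 cm along x (minor).
t = np.linspace(0, 2 * np.pi, 200)
ellipse = np.c_[1.0 * np.cos(t), 1.5 * np.sin(t)]
major, minor, direction = stem_axes(ellipse)
print(round(major, 2), round(minor, 2), direction)    # approximately 3.0, 2.0, [0, +/-1]
```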
[117] It should be understood that the above description is intended to be illustrative and not restrictive. Many other embodiments will become apparent to persons of ordinary skill in the art upon reading and understanding the above description. Therefore, the scope of the disclosure should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
权利要求:
Claims (7)
[1]
1. Computer system to monitor field operations, characterized by the fact that it comprises:
a database for storing agricultural image data that includes images from at least one crop development stage that are captured with at least one of a device and a remote sensor that moves across a field; and at least one processing unit coupled to the database, at least one processing unit is configured to execute instructions to analyze the captured images, to determine relevant images that indicate a change in at least one condition of culture development, and to generate a localized view map layer to view the field at at least one culture development stage based on at least the relevant captured images.
[2]
2/7
2. System according to claim 1, characterized by the fact that at least one processing unit is additionally configured to execute instructions for generating yield data that include a yield map to be displayed in a graphical user interface and to receive a user selection from a yield map region.
[3]
3/7 characterized by the fact that at least one processing unit is additionally configured to execute instructions to adjust, with the apparatus or an agricultural computer system in communication with the apparatus, settings of the application passage, if appropriate, with based on the analysis of the images captured during the application pass.
8. System, according to claim 6, characterized by the fact that:
the application pass comprises a plantation application pass and the plantation analysis includes determining the current field conditions from the captured images, where that plantation analysis causes an adjustment to the device parameters during the application pass; or the application pass comprises a fertilizer application pass and a remote sensor causes the device to collect crop images in front of the device, determines a crop health criterion based on the images captured as part of the fertilizer analysis, and then, it automatically adjusts settings that include adjusting an application rate for the fertilizer based on the crop health criterion; or the application pass comprises a harvest application pass and the harvest analysis includes identifying culture components in a culture processing device and identifying the size and health of the culture component; or the application pass comprises a plowing application pass and the plowing analysis includes analysis
Petition 870190060828, of 06/28/2019, p. 80/85 of soil residues after application.
9. System according to claim 1, characterized by the fact that at least one processing unit is additionally configured to execute instructions for capturing images of an agricultural crop in a field, the instructions comprising:
move at least one of a device and a remote sensor across a field at a crop development stage and capture images of the field that include the crop;
determine location information, guidance information from the data processing system, weather information, shadow information and crop row information and associate that information with each captured image; and analyze the captured images and the information associated with the images to determine a planting area, a shade area, and different crop characteristics.
10. System, according to claim 9, characterized by the fact that:
the different characteristics of the crop include the location of a stalk or stem in the field; or the different characteristics of the crop include a relative size of a crop compared to neighboring crops to determine a relative growth stage, leaf size, leaf length, number of leaves
for each stalk or stalk is location of a top of one stalk or stem. 11. System, in wake up with claim 9, characterized by fact that at least one unit in
Petition 870190060828, of 06/28/2019, p. 81/85
3. System, according to claim 1, characterized by the fact that at least one processing unit is additionally configured to execute instructions to generate a localized view map layer that is geographically associated with the selected region of the field map in response to user selection.
Petition 870190060828, of 06/28/2019, p. 78/85
[4]
4. System according to claim 3, characterized by the fact that at least one processing unit is additionally configured to execute instructions for overlaying the view map layer located with the yield map.
[5]
5/7 processing is additionally configured to execute instructions for:
generate and make a graphical user interface display different characteristics of the culture; and
identify any problems with culture, if characteristics in culture indicate or provide for a lower yield of what others regions. 12. System, according with the claim 1,
characterized by the fact that at least one processing unit is additionally configured to execute instructions for:
access, in an agricultural device memory, agricultural image data that includes images from at least one crop development stage that are captured with a remote sensor that moves across a field; and analyze the captured images, to determine relevant images that indicate a change in at least one culture development condition, and to generate a localized view map layer to visualize the field at at least one culture development stage based on the least in the relevant captured images.
13. System according to claim 12, characterized by the fact that at least one processing unit is additionally configured to execute instructions for generating yield data that include a yield map to be displayed in a graphical user interface and to receive a user selection from a yield map region.
12. System according to claim 1, characterized by the fact that at least one processing unit is additionally configured to execute instructions to:
access, in an agricultural device memory, agricultural image data that includes images from at least one crop development stage that are captured with a remote sensor that moves across a field; and
analyze the captured images to determine relevant images that indicate a change in at least one crop development condition, and generate a localized view map layer to visualize the field at the at least one crop development stage based at least on the relevant captured images.
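Claims 1, 12, and 16 all turn on keeping only the relevant images, meaning those that indicate a change in a crop development condition, and building a localized view map layer from them. A minimal sketch of that filter, assuming each capture carries a simple per-stage condition score (an assumption for illustration, not the patent's metric):

# Sketch: keep images whose crop-condition score changed versus the previous
# capture at the same location, then group them into a per-location map layer.
# The score, the change threshold, and the keys are illustrative assumptions.

def relevant_images(captures, min_change=0.15):
    """captures: list of dicts with 'location', 'stage', 'condition' (0..1)."""
    last_seen = {}     # location -> last condition score
    relevant = []
    for cap in sorted(captures, key=lambda c: c["stage"]):
        prev = last_seen.get(cap["location"])
        if prev is None or abs(cap["condition"] - prev) >= min_change:
            relevant.append(cap)
        last_seen[cap["location"]] = cap["condition"]
    return relevant

def localized_view_layer(captures):
    """Group the relevant images by location so each map cell can be viewed."""
    layer = {}
    for cap in captures:
        layer.setdefault(cap["location"], []).append(cap["image_id"])
    return layer

if __name__ == "__main__":
    captures = [
        {"image_id": "a", "location": (12, 4), "stage": 1, "condition": 0.80},
        {"image_id": "b", "location": (12, 4), "stage": 2, "condition": 0.78},  # no real change
        {"image_id": "c", "location": (12, 4), "stage": 3, "condition": 0.55},  # change -> relevant
    ]
    kept = relevant_images(captures)
    print(localized_view_layer(kept))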
13. System according to claim 12, characterized by the fact that at least one processing unit is additionally configured to execute instructions to generate yield data that include a yield map to be displayed in a graphical user interface and to receive a user selection of a yield map region.

14. System according to claim 13, characterized by the fact that at least one processing unit is additionally configured to execute instructions to:
generate a localized view map layer that is geographically associated with the selected region of the field map in response to the user selection; and
identify regions where a higher yield potential is expected and regions where a lower yield potential is expected.

15. System according to claim 12, characterized by the fact that the agricultural image data that includes images from at least one crop development stage includes multiple points of view of each stem in order to determine a growth stage, a relative growth stage, a yield potential, or an ear potential for each stem.
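Claim 15 uses multiple points of view of each stem to estimate a growth stage or yield potential. As a sketch, combining the per-view estimates into one value per stem could be as simple as the aggregation below; the view data and the median rule are assumptions, not the claimed method.

# Sketch: combine several per-view measurements of the same stem into one
# growth-stage estimate. The aggregation (median of views) is an assumption.

from statistics import median
from collections import defaultdict

def growth_stage_per_stem(views):
    """views: list of (stem_id, estimated_leaf_collars) from different viewpoints."""
    by_stem = defaultdict(list)
    for stem_id, collars in views:
        by_stem[stem_id].append(collars)
    # Corn vegetative stage V(n) roughly tracks the number of visible leaf collars.
    return {stem_id: "V%d" % round(median(vals)) for stem_id, vals in by_stem.items()}

if __name__ == "__main__":
    views = [("stem-101", 5), ("stem-101", 6), ("stem-101", 6), ("stem-102", 4)]
    print(growth_stage_per_stem(views))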
16. Method for monitoring field operations, characterized by the fact that it comprises:
obtain, from a database that stores agricultural image data, a plurality of images from at least one crop development stage, wherein the plurality of images is captured with at least one of a device and a remote sensor that moves across a field; and
analyze the captured images to determine relevant images that indicate a change in at least one crop development condition, and generate a localized view map layer to visualize the field at the at least one crop development stage based at least on the relevant captured images.

17. Non-transitory storage medium containing instructions, characterized by the fact that, when said instructions are executed by at least one processing unit, they cause the at least one processing unit to execute the method as defined in claim 16.
Similar technologies:
Publication number | Publication date | Patent title
US11191219B2|2021-12-07|Systems and methods for image capture and analysis of agricultural fields
BR112017026437B1|2022-01-18|COMPUTER SYSTEM AND COMPUTER DEPLOYED METHOD FOR MONITORING ONE OR MORE FIELDS OPERATIONS
AU2016344427B2|2021-08-12|Computer-implemented calculation of corn harvest recommendations
BR112020004115A2|2020-08-18|non-transient storage medium, computer implemented method of real-time disease recognition in a crop field and computer implemented method of determining a number of grains from an image of a corn
BR112019009310A2|2019-10-15|work layer image and analysis for implementing operator monitoring, control, and feedback
US10761075B2|2020-09-01|Detecting infection of plant diseases by classifying plant photos
BR112020010226A2|2020-11-10|digital modeling of crop diseases in agronomic fields
BR112021009659A2|2021-08-17|mapping of field anomalies using digital images and machine learning models.
BR112021000429A2|2021-04-06|GENERATION OF AGRONOMIC PRODUCTIVITY MAPS FROM FIELD HEALTH IMAGES
EP3869934A1|2021-09-01|Using machine learning-based seed harvest moisture predictions to improve a computer-assisted agricultural farm operation
US20200042890A1|2020-02-06|Automatic prediction of yields and recommendation of seeding rates based on weather data
BR112020009205A2|2020-10-13|selection of hybrid seeds and optimization of seeds portfolio by field
US11238283B2|2022-02-01|Hybrid vision system for crop land navigation
US20220019795A1|2022-01-20|Predicting horticultural yield for a field location using multi-band aerial imagery
CA3121005A1|2020-06-25|Utilizing spatial statistical models for implementing agronomic trials
BR112021010533A2|2021-08-24|Predictive script generation for soybeans
BR112019018057A2|2020-03-31|LOCATION SELECTION FOR TREATMENT SAMPLING
Patent family:
Publication number | Publication date
AR105191A1|2017-09-13|
US20180108123A1|2018-04-19|
AU2016287397B2|2021-05-20|
WO2017004074A1|2017-01-05|
CA2990438A1|2017-01-05|
US20200104988A1|2020-04-02|
EP3317812A4|2019-02-27|
AU2016287397A1|2018-02-01|
AU2021218100A1|2021-09-09|
EP3317812A1|2018-05-09|
US11191219B2|2021-12-07|
ZA201800608B|2019-09-25|
US10438343B2|2019-10-08|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

AU1177500A|1999-03-15|2000-10-04|Kumamoto Technopolis Foundation|Soil survey device and system for precision agriculture|
CA2283767C|1999-09-27|2007-06-19|Monsanto Company|Method and system for spatial evaluation of field crop perfomance|
US8737720B2|2003-04-30|2014-05-27|Deere & Company|System and method for detecting and analyzing features in an agricultural field|
US7765780B2|2003-12-12|2010-08-03|Vision Robotics Corporation|Agricultural robot system and method|
AT383743T|2004-12-18|2008-02-15|Deere & Co|HARVESTER|
EP2104413B2|2007-01-08|2020-03-25|The Climate Corporation|Planter monitor system and method|
WO2009149389A1|2008-06-06|2009-12-10|Monsanto Technology Llc|Generating agricultural information products using remote sensing|
US8477295B2|2009-05-07|2013-07-02|Solum, Inc.|Automated soil measurement device|
US10285325B2|2010-07-02|2019-05-14|Deere & Company|Seeding apparatus and method of determining a seed spacing variability value|
US8855937B2|2010-10-25|2014-10-07|Trimble Navigation Limited|Crop characteristic estimation|
JP5886054B2|2012-01-16|2016-03-16|株式会社日立製作所|Image analysis system and method|
US20130199807A1|2012-02-03|2013-08-08|Mark Hoffman|Tillage System with Interchangeable Modules|
US9030549B2|2012-03-07|2015-05-12|Blue River Technology, Inc.|Method and apparatus for automated plant necrosis|
EP3967121A1|2012-07-25|2022-03-16|Precision Planting LLC|System for multi-row agricultural implement control and monitoring|
US20140089045A1|2012-09-27|2014-03-27|Superior Edge, Inc.|Methods, apparatus and systems for determining stand population, stand consistency and stand quality in an agricultural crop and alerting users|
RU2537908C2|2013-03-06|2015-01-10|Государственное научное учреждение Агрофизический научно-исследовательский институт Российской академии сельскохозяйственных наук |Device for subsurface measuring agrotechnological characteristics of soil arable layer in motion|
US9943027B2|2013-03-14|2018-04-17|Precision Planting Llc|Systems, methods, and apparatus for agricultural implement trench depth control and soil monitoring|
US9686471B2|2013-11-01|2017-06-20|Light Labs Inc.|Methods and apparatus relating to image stabilization|
US9265187B2|2013-11-20|2016-02-23|Rowbot Systems Llc|Robotic platform and method for performing multiple functions in agricultural systems|
US9401030B2|2014-04-25|2016-07-26|Tazco Soil Service Co.|Image processing system for soil characterization|
WO2016029054A1|2014-08-22|2016-02-25|The Climate Corporation|Methods for agronomic and agricultural monitoring using unmanned aerial systems|
AU2015353587B2|2014-11-24|2021-05-20|The Climate Corporation|System and methods for identifying fields and tasks|
EP3854749A1|2015-04-29|2021-07-28|The Climate Corporation|Droplet counter and circuit for measuring precipitation|
CA2990438A1|2015-06-30|2017-01-05|The Climate Corporation|Systems and methods for image capture and analysis of agricultural fields|CA2990438A1|2015-06-30|2017-01-05|The Climate Corporation|Systems and methods for image capture and analysis of agricultural fields|
US10754063B2|2016-06-14|2020-08-25|The Climate Corporation|Supervised neural network to predict unlabeled rain rates|
WO2018049289A1|2016-09-09|2018-03-15|Cibo Technologies, Inc.|Systems for adjusting agronomic inputs using remote sensing, and related apparatus and methods|
EP3529771A1|2016-10-19|2019-08-28|BASF Agro Trademarks GmbH|Determining the grain weight of an ear|
US10721859B2|2017-01-08|2020-07-28|Dolly Y. Wu PLLC|Monitoring and control implement for crop improvement|
US10255670B1|2017-01-08|2019-04-09|Dolly Y. Wu PLLC|Image sensor and module for agricultural crop improvement|
US20180210456A1|2017-01-20|2018-07-26|Kubota Corporation|Travel route generation device and travel route generation method|
JP6971577B2|2017-01-20|2021-11-24|株式会社クボタ|Travel route generator and travel route generation program|
US10524409B2|2017-05-01|2020-01-07|Cnh Industrial America Llc|System and method for controlling agricultural product application based on residue coverage|
CN107122922B|2017-05-23|2020-06-16|哈尔滨工业大学|Agricultural machinery subsoiling operation quality evaluation method based on sequencing learning|
US11116155B2|2017-06-14|2021-09-14|Grow Solutions Tech Llc|Systems and methods for bypassing harvesting for a grow pod|
IT201700067764A1|2017-06-19|2018-12-19|Dinamica Generale S P A|Self-propelled apparatus for the analysis and optimal management of fields for agricultural cultivation.|
US10247717B2|2017-07-14|2019-04-02|SafeNet International LLC|Method of efficient acquisition of soil data using image mapping|
US10779458B2|2017-12-01|2020-09-22|International Business Machines Corporation|Monitoring aerial application tasks and recommending corrective actions|
WO2019133973A1|2017-12-29|2019-07-04|Ohio State Innovation Foundation|Crop health sensing system|
US11227382B2|2018-01-11|2022-01-18|Intelinair, Inc.|Change detection system|
US10477756B1|2018-01-17|2019-11-19|Cibo Technologies, Inc.|Correcting agronomic data from multiple passes through a farmable region|
US11006577B2|2018-02-26|2021-05-18|Cnh Industrial America Llc|System and method for adjusting operating parameters of an agricultural harvester based on estimated crop volume|
US11144775B2|2018-06-25|2021-10-12|Cnh Industrial Canada, Ltd.|System and method for illuminating the field of view of a vision-based sensor mounted on an agricultural machine|
US10650538B2|2018-06-27|2020-05-12|Cnh Industrial Canada, Ltd.|Detecting and measuring the size of clods and other soil features from imagery|
US10936870B2|2018-07-05|2021-03-02|Iron Ox, Inc.|Method for selectively deploying sensors within an agricultural facility|
US11234357B2|2018-08-02|2022-02-01|Cnh Industrial America Llc|System and method for monitoring field conditions of an adjacent swath within a field|
US10419902B1|2018-08-10|2019-09-17|Arvus Tecnologia Ltda.|Agricultural data transportation system|
US10959367B2|2018-10-03|2021-03-30|Cnh Industrial America Llc|System and method for controlling an agricultural tillage implement|
US10813262B2|2018-10-16|2020-10-27|Cnh Industrial America Llc|System and method for generating a prescription map for an agricultural implement based on yield map and/or crop biomass|
US11240961B2|2018-10-26|2022-02-08|Deere & Company|Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity|
US11178818B2|2018-10-26|2021-11-23|Deere & Company|Harvesting machine control system with fill level processing based on yield data|
US20200217830A1|2019-01-08|2020-07-09|AgroScout Ltd.|Autonomous crop monitoring system and method|
WO2020148661A2|2019-01-15|2020-07-23|Tata Consultancy Services Limited|Method and system for plant health estimation|
US11079725B2|2019-04-10|2021-08-03|Deere & Company|Machine control using real-time model|
US11234366B2|2019-04-10|2022-02-01|Deere & Company|Image selection for machine control|
US11270189B2|2019-10-28|2022-03-08|International Business Machines Corporation|Cognitive decision platform for honey value chain|
US20210243936A1|2020-02-06|2021-08-12|Deere & Company|Predictive weed map generation and control system|
Legal status:
2019-12-03| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2020-06-30| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-10-13| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US201562187183P| true| 2015-06-30|2015-06-30|
PCT/US2016/039880|WO2017004074A1|2015-06-30|2016-06-28|Systems and methods for image capture and analysis of agricultural fields|