Patent Abstract:
A computer system and related computer-implemented methods for recognizing plantation diseases from large FoV images are disclosed. In some embodiments, the computer system is configured to initially create a first digital model in memory to identify a region that captures a leaf and a second model to identify a region that captures a leaf infected with a disease. When a large FoV image is provided, the system is programmed, under program control, to automatically identify candidate regions that could capture individual leaves from the large FoV image using the first model. The system is further programmed to determine whether the candidate regions capture symptoms of a plantation disease on individual leaves using the second model.
Publication number: BR112020006760A2
Application number: R112020006760-1
Filing date: 2018-10-04
Publication date: 2020-10-06
Inventors: Yaqi Chen; Wei Guan
Applicant: The Climate Corporation
IPC main classification:
Patent description:

[0001] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office files or records, but otherwise reserves all copyright or other rights whatsoever. © 2015-2018 The Climate Corporation. FIELD OF THE DISCLOSURE
[0002] The present disclosure provides improvements in the technical fields of digital image processing and machine vision. The disclosure relates generally to the detection of plantation diseases using digital images and relates more specifically to the recognition of plantation diseases from digital images that have a large field of view. BACKGROUND
[0003] The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.
[0004] Plantation diseases are a major concern for farmers, since plantation diseases can cause reduced production and economic loss. Automated disease recognition and treatment can begin with capturing digital images of crops in the field. Many machine-captured images have a large field of view ("FoV"), such as aerial scouting photos taken at a distance from a field. In such an image, the symptoms of a disease on an individual plant can be a minor feature of the image, which often cannot be detected with traditional image processing methods. It would be useful to be able to quickly and accurately identify disease symptoms from large FoV images. SUMMARY
[0005] The appended claims may serve as a summary of the disclosure. BRIEF DESCRIPTION OF THE DRAWINGS
[0006] In the drawings:
[0007] FIG. 1 illustrates an example computer system that is configured to perform the functions described here, shown in a field environment with other equipment with which the system can interoperate.
[0008] FIG. 2 illustrates two views of an example logical organization of sets of instructions in main memory when an example mobile application is loaded for execution.
[0009] FIG. 3 illustrates a programmed process through which the agricultural intelligence computing system generates one or more pre-configured agronomic models using agronomic data provided by one or more data sources.
[0010] FIG. 4 is a block diagram illustrating a computer system upon which an embodiment of the invention can be implemented.
[0011] FIG. 5 represents an example embodiment of a timeline view for data entry.
[0012] FIG. 6 represents an example embodiment of a spreadsheet view for data entry.
[0013] FIG. 7 illustrates an example of computing histogram of oriented gradients feature vectors for regions of a large FoV image.
[0014] FIG. 8 illustrates an example convolutional neural network architecture.
[0015] FIG. 9 illustrates an example process of obtaining scales to generate an image pyramid.
[0016] FIG. 10 illustrates an example region selection result for an image pyramid.
[0017] FIG. 11 illustrates an example output of a second model for identifying regions that are likely to capture infected targets.
[0018] FIG. 12 illustrates an example of applying non-maximum suppression.
[0019] FIG. 13 illustrates an example process for identifying infected targets from a large FoV image. DETAILED DESCRIPTION
[0020] In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, that embodiments may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present disclosure. Embodiments are disclosed in sections according to the following outline:
[0021] 1. OVERVIEW
[0022] A computer system and related computer-implemented methods for recognizing plantation diseases from large FoV images are disclosed. In some embodiments, the computer system is configured to initially build a first digital model in memory for identifying a region capturing a leaf and a second model for identifying a region capturing a leaf infected with a disease. When a large FoV image is provided, the system is programmed, under program control, to then automatically identify candidate regions that could capture individual leaves from the large FoV image using the first model. The system is programmed to further determine whether the candidate regions capture the symptoms of a plantation disease on individual leaves using the second model.
[0023] A large FoV image of a plantation field can capture dozens to hundreds of leaves. As the size of the field of view is unknown, the system is programmed to examine the large FoV image at different scales and to apply the first model to each of the scaled images to identify the candidate regions. The first model can include or communicate with a module that computes a histogram of oriented gradients ("HOG") for a region, and further includes a support vector machine ("SVM") that determines whether the region captures a leaf based on the HOG value. The SVM can be trained with images of a certain size that mainly capture individual leaves. Specifically, the system is programmed to run the first model on each scaled image to compute the HOG for each sliding window of that size over the scaled image and to classify the corresponding region using the HOG value, to determine whether the corresponding region is a candidate region that is likely to capture a leaf.
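To make the first-model flow concrete, the following is a minimal sketch in Python using scikit-image and scikit-learn. The window size, HOG parameters, pyramid scales and stride are illustrative assumptions, not values taken from this disclosure; grayscale input is assumed.

import numpy as np
from skimage.feature import hog
from skimage.transform import rescale
from sklearn.svm import LinearSVC

WINDOW = 64   # assumed training-patch size in pixels (not specified in the disclosure)
STRIDE = 16   # assumed sliding-window step

def hog_vector(patch):
    """Compute the HOG descriptor for one fixed-size grayscale patch."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def train_leaf_svm(patches, labels):
    """First model: a linear SVM over HOG descriptors; labels are 1 = leaf, 0 = not a leaf."""
    clf = LinearSVC()
    clf.fit(np.array([hog_vector(p) for p in patches]), labels)
    return clf

def propose_leaf_regions(image, clf, scales=(1.0, 0.75, 0.5)):
    """Slide a fixed-size window over each level of an image pyramid and keep
    windows classified as leaves, mapped back to original-image coordinates."""
    candidates = []
    for s in scales:
        scaled = rescale(image, s, anti_aliasing=True)
        h, w = scaled.shape
        for y in range(0, h - WINDOW + 1, STRIDE):
            for x in range(0, w - WINDOW + 1, STRIDE):
                if clf.predict([hog_vector(scaled[y:y + WINDOW, x:x + WINDOW])])[0] == 1:
                    # undo the scaling so the region indexes the unscaled image
                    candidates.append((int(x / s), int(y / s), int(WINDOW / s)))
    return candidates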
[0024] In addition to the HOG values for leaf recognition, additional features can be computed for each candidate region for recognizing symptoms of plantation diseases on the leaves. The second model can include a convolutional neural network ("CNN") that includes a set of convolutional layers that compute feature maps associated with an infected leaf for an area, and a fully connected layer that determines whether the area, or a region within the area, captures an infected leaf based on the feature maps. The CNN may also include an additional region of interest ("RoI") pooling layer before the fully connected layer. Training data for the CNN can start with an initial set of images of a certain size that capture infected and healthy leaves at various scales. The training data can then be augmented with variants of the initial set of images through rotation, cropping and application of an improved Wasserstein generative adversarial network ("WGAN"). Rather than computing the additional features for different candidate regions separately, the system is programmed to process the large FoV image with the second model once to identify the additional features for the entire image, and to perform the classification for each candidate region by extracting the features for the candidate region from the features for the entire image.
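The rotation and cropping augmentations described above can be sketched as follows, again in Python with scikit-image. The angles and crop fraction are illustrative assumptions, and the WGAN-based augmentation, which would require a trained generator, is deliberately omitted.

import numpy as np
from skimage.transform import rotate

def augment_training_set(images, angles=(90, 180, 270), crop_frac=0.8, seed=0):
    """Expand a set of training images with rotated and randomly cropped variants.
    (The disclosure additionally augments with an improved WGAN; omitted here.)"""
    rng = np.random.default_rng(seed)
    out = []
    for img in images:
        out.append(img)
        for angle in angles:
            out.append(rotate(img, angle, preserve_range=True).astype(img.dtype))
        h, w = img.shape[:2]
        ch, cw = int(h * crop_frac), int(w * crop_frac)
        y0 = rng.integers(0, h - ch + 1)
        x0 = rng.integers(0, w - cw + 1)
        out.append(img[y0:y0 + ch, x0:x0 + cw])  # one random crop per image
    return out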
[0025] Specifically, the system is programmed to execute the set of convolutional layers on the large FoV image to generate a set of feature maps for the FoV image. Next, the system is programmed to map each candidate region back to the large FoV image to obtain a mapped region, extract a corresponding portion of each of the set of feature maps based on the mapping, and execute the RoI pooling layer and the fully connected layer on the set of feature map portions to determine whether the mapped region is an intermediate region that is likely to capture an infected leaf. Finally, since some intermediate regions can overlap, the system is programmed to merge some of the intermediate regions into a final region that is likely to capture an infected leaf.
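Merging overlapping intermediate regions is commonly done with non-maximum suppression (illustrated in FIG. 12). Below is a minimal sketch, assuming each intermediate region is a scored, axis-aligned box; the IoU threshold is an illustrative assumption.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring box in each cluster of overlapping boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    kept = []
    for i in order:
        # keep box i only if it does not overlap a higher-scoring kept box
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in kept):
            kept.append(i)
    return [boxes[i] for i in kept]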
[0026] The system produces many technical benefits. In one embodiment, the system analyzes a large FoV image in two stages, applying efficient and robust computational algorithms in each stage to achieve the intended purposes. This systematic approach allows quick and accurate targeting of the candidate regions that are likely to capture the intended targets. For example, instead of looking directly into each window of the large FoV image, the system is programmed to first identify certain regions that are likely to contain individual leaves and to search only within those regions for infected leaves, using a relatively basic classifier to identify regions that capture individual leaves and an advanced classifier to recognize regions that capture infected leaves. The choice of an SVM classifier over HOG feature vectors appears to outperform many prior methods in efficiently identifying regions that capture individual leaves from a large FoV image that captures multiple leaves and crops. While maintaining an overall framework to achieve an optimal combination of high speed and high quality, the systematic approach also offers flexibility within each stage to accommodate a potentially wide range of computational techniques. In addition, the system applies a variety of specific techniques to achieve an optimal combination of efficiency and accuracy. For example, the system is programmed to take a large FoV image of any size and resize the image internally to find infected leaves of varying sizes. Furthermore, all relevant features of the large FoV image can be computed in one pass, and individual processing of focus regions is deferred to the final stages of the data processing pipeline.
[0027] 2. EXAMPLE AGRICULTURAL INTELLIGENCE COMPUTER SYSTEM
[0028] 2.1 STRUCTURAL OVERVIEW
[0029] FIG. 1 illustrates an example computer system that is configured to perform the functions described here, shown in a field environment with other equipment with which the system can interoperate. In one embodiment, a user 102 owns, operates or possesses a field management computing device 104 in a field location or associated with a field location, such as a field intended for agricultural activities or a management location for one or more agricultural fields. The field management computing device 104 is programmed or configured to provide field data 106 to an agricultural intelligence computing system 130 via one or more networks 109.
[0030] Examples of field data 106 include (a) identification data (for example, acreage, field name, field identifiers, geographic identifiers, boundary identifiers, crop identifiers and any other suitable data that can be used to identify farm land, such as a common land unit (CLU), lot and block number, a subdivision number, geographic coordinates and boundaries, Farm Serial Number (FSN), farm number, tract number, field number, section, township and/or range), (b) harvest data (for example, crop type, crop variety, crop rotation, whether the crop is grown organically, harvest date, Actual Production History (APH), expected yield, yield, crop price, gross crop revenue, grain moisture, tillage practice and previous growing season information), (c) soil data (for example, type, composition, pH, organic matter (OM), cation exchange capacity (CEC)), (d) planting data (for example, planting date, seed type(s), relative maturity (RM) of planted seed(s), seed population), (e) fertilizer data (for example, nutrient type (nitrogen, phosphorus, potassium), application type, application date, amount, source, method), (f) chemical application data (for example, pesticide, herbicide, fungicide, other substance or mixture of substances intended for use as a plant regulator, defoliant or desiccant, application date, amount, source, method), (g) irrigation data (for example, application date, amount, source, method), (h) weather data (for example, precipitation, rainfall rate, predicted rainfall, water runoff rate region, temperature, wind, forecast, pressure, visibility, clouds, heat index, dew point, humidity, snow depth, air quality, sunrise, sunset), (i) imagery data (for example, imagery and light spectrum information from an agricultural apparatus sensor, camera, computer, smartphone, tablet, unmanned aerial vehicle, plane or satellite), (j) scouting observations (photos, videos, free-form notes, voice recordings, voice transcriptions, weather conditions (temperature, precipitation (current and over time), soil moisture, crop growth stage, wind velocity, relative humidity, dew point, black layer)) and (k) soil, seed, crop phenology, pest and disease reporting, and prediction sources and databases.
[0031] A data server computer 108 is communicatively coupled to the agricultural intelligence computing system 130 and is programmed or configured to send external data 110 to the agricultural intelligence computing system 130 via the network(s) 109.
[0032] An agricultural equipment 111 can have one or more remote sensors 112 fixed to it, which sensors are communicatively coupled, directly or indirectly via the agricultural equipment 111, to the agricultural intelligence computing system 130 and are programmed or configured to send sensor data to the agricultural intelligence computing system 130. Examples of agricultural equipment 111 include tractors, combines, harvesters, planters, trucks, fertilizer equipment, aerial vehicles including unmanned aerial vehicles, and any other item of physical machinery or hardware, typically mobile machinery, that can be used in tasks associated with agriculture. In some embodiments, a single unit of equipment 111 can comprise a large number of sensors 112 that are coupled locally to a network on the equipment; a controller area network (CAN) is an example of such a network that can be installed in combines, harvesters, sprayers and cultivators. The application controller 114 is communicatively coupled to the agricultural intelligence computing system 130 via the network(s) 109 and is programmed or configured to receive from the agricultural intelligence computing system 130 one or more scripts that are used to control an operating parameter of a vehicle or agricultural implement. For example, a controller area network (CAN) bus interface can be used to enable communications from the agricultural intelligence computing system 130 to the agricultural equipment 111, such as how the CLIMATE FIELDVIEW DRIVE, available from The Climate Corporation, San Francisco, California, is used. The sensor data may consist of the same type of information as the field data 106. In some embodiments, remote sensors 112 may not be fixed to agricultural equipment 111, but may be remotely located in the field and may communicate with the network 109.
[0033] The equipment 111 may comprise a cab computer 115 that is programmed with a cab application, which may comprise a version or variant of the mobile application for device 104 that is described further in other sections here. In one embodiment, the cab computer 115 comprises a compact computer, often a tablet-sized computer or smartphone, with a graphical display screen, such as a color display, that is mounted within the operator's cab of the equipment 111. The cab computer 115 can implement some or all of the operations and functions that are described further here for the mobile computing device 104.
[0034] The network(s) 109 broadly represent any combination of one or more data communication networks, including local area networks, wide area networks, internetworks or internets, using any of wired or wireless links, including terrestrial or satellite links. The network(s) can be implemented by any medium or mechanism that provides for the exchange of data between the various elements of FIG. 1. The various elements of FIG. 1 can also have direct (wired or wireless) communication links. Each of the sensors 112, the controller 114, the external data server computer 108 and the other elements of the system comprises an interface compatible with the network(s) 109 and is programmed or configured to use standardized protocols for communication over the networks, such as TCP/IP, Bluetooth, the CAN protocol and higher-layer protocols such as HTTP, TLS and the like.
[0035] The agricultural intelligence computing system 130 is programmed or configured to receive field data 106 from the field management computing device 104, external data 110 from the external data server computer 108 and sensor data from the remote sensor 112. The agricultural intelligence computing system 130 can further be configured to host, use or run one or more computer programs, other software elements, digitally programmed logic such as FPGAs or ASICs, or any combination thereof, to perform the transmission and storage of data values, the construction of digital models of one or more crops in one or more fields, the generation of recommendations and notifications, and the generation and sending of scripts to the application controller 114, in the manner described further in other sections of this disclosure.
[0036] In one embodiment, the agricultural intelligence computing system 130 is programmed with or comprises a communication layer 132, a presentation layer 134, a data management layer 140, a hardware/virtualization layer 150 and a model and field data repository 160. "Layer", in this context, refers to any combination of electronic digital interface circuits, microcontrollers, firmware such as drivers, and/or computer programs or other software elements.
[0037] The communication layer 132 can be programmed or configured to perform input/output interfacing functions, including sending requests to the field management computing device 104, the external data server computer 108 and the remote sensor 112 for field data, external data and sensor data, respectively. The communication layer 132 can be programmed or configured to send the received data to the model and field data repository 160 to be stored as field data 106.
[0038] The presentation layer 134 can be programmed or configured to generate a graphical user interface (GUI) to be displayed on the field management computing device 104, on the cab computer 115 or on other computers that are coupled to the system 130 through the network 109. The GUI can comprise controls for entering data to be sent to the agricultural intelligence computing system 130, generating requests for models and/or recommendations, and/or displaying recommendations, notifications, models and other field data.
[0039] The data management layer 140 can be programmed or configured to manage read and write operations involving the repository 160 and other functional elements of the system, including queries and result sets communicated between the functional elements of the system and the repository. Examples of the data management layer 140 include JDBC, SQL server interface code and/or HADOOP interface code, among other things. The repository 160 can comprise a database. As used here, the term "database" can refer to a body of data, a relational database management system (RDBMS), or both. As used here, a database can comprise any collection of data, including hierarchical databases, relational databases, flat file databases, object-relational databases, object-oriented databases, distributed databases and any other structured collection of records or data stored in a computer system. Examples of RDBMSs include, but are not limited to, ORACLE®, MYSQL, IBM® DB2, MICROSOFT® SQL SERVER, SYBASE and POSTGRESQL databases. However, any database that enables the systems and methods described here can be used.
[0040] When field data 106 is not provided directly to the agricultural intelligence computing system via one or more agricultural machines or agricultural machine devices that interact with the agricultural intelligence computing system, the user may be prompted via one or more user interfaces on the user device (served by the agricultural intelligence computing system) to input such information. In an example embodiment, the user can specify identification data by accessing a map on the user device (served by the agricultural intelligence computing system) and selecting specific CLUs that have been graphically shown on the map. In an alternative embodiment, the user 102 can specify identification data by accessing a map on the user device (served by the agricultural intelligence computing system 130) and drawing boundaries of the field over the map. Such CLU selection or map drawings represent geographic identifiers. In alternative embodiments, the user can specify identification data by accessing field identification data (provided as shapefiles or in a similar format) from the U.S. Department of Agriculture Farm Service Agency or another source via the user device and providing such field identification data to the agricultural intelligence computing system.
[0041] In an example embodiment, the agricultural intelligence computing system 130 is programmed to generate and cause displaying of a graphical user interface comprising a data manager for data input. After one or more fields have been identified using the methods described above, the data manager can provide one or more graphical user interface widgets which, when selected, can identify changes to the field, soil, crops, tillage or nutrient practices. The data manager can include a timeline view, a spreadsheet view and/or one or more editable programs.
[0042] FIG. 5 represents an example embodiment of a timeline view for data entry. Using the display shown in FIG. 5, a user computer can input a selection of a particular field and a particular date for adding an event. Events represented at the top of the timeline can include Nitrogen, Planting, Practices and Soil. To add a nitrogen application event, a user computer can provide input to select the nitrogen tab. The user computer can then select a location on the timeline for a particular field in order to indicate a nitrogen application on the selected field. In response to receiving a selection of a location on the timeline for a particular field, the data manager can display a data entry overlay, allowing the user computer to input data pertaining to nitrogen applications, planting procedures, soil application, tillage procedures, irrigation practices or other information relating to the particular field. For example, if a user computer selects a portion of the timeline and indicates an application of nitrogen, then the data entry overlay can include fields for inputting an amount of nitrogen applied, a date of application, a type of fertilizer used and any other information related to the nitrogen application.
[0043] In one embodiment, the data manager provides an interface for creating one or more programs. "Program", in this context, refers to a set of data pertaining to nitrogen applications, planting procedures, soil application, tillage procedures, irrigation practices or other information that may be related to one or more fields and that can be stored in digital data storage for reuse as a set in other operations. After a program has been created, it can be conceptually applied to one or more fields, and references to the program can be stored in digital storage in association with data identifying the fields. Thus, instead of manually entering identical data relating to the same nitrogen applications for several different fields, a user computer can create a program that indicates a particular application of nitrogen and then apply the program to several different fields. For example, in the timeline view of FIG. 5, the top two timelines have the "Spring applied" program selected, which includes an application of 150 lbs N/ac in early April. The data manager can provide an interface for editing a program. In one embodiment, when a particular program is edited, each field that has selected the particular program is edited. For example, in FIG. 5, if the "Spring applied" program is edited to reduce the application of nitrogen to 130 lbs N/ac, the top two fields can be updated with a reduced application of nitrogen based on the edited program.
[0044] In one embodiment, in response to receiving edits to a field that has a program selected, the data manager removes the correspondence of the field with the selected program. For example, if a nitrogen application is added to the top field in FIG. 5, the interface can update to indicate that the "Spring applied" program is no longer being applied to the top field. While the application of nitrogen in early April can remain, updates to the "Spring applied" program would not alter the April application of nitrogen.
[0045] FIG. 6 represents an example embodiment of a spreadsheet view for data entry. Using the display shown in FIG. 6.
[0046] In one embodiment, model and field data are stored in the model and field data repository 160. The model data comprises models created for one or more fields. For example, a crop model can include a digitally constructed model of the development of a crop in one or more fields. "Model", in this context, refers to a digitally stored electronic set of executable instructions and data values, associated with one another, that are capable of receiving and responding to a programmatic or other digital call, invocation or request for resolution based on specified input values, to produce one or more stored or calculated output values that can serve as the basis for computer-implemented recommendations, output data displays or machine control, among other things. Persons of skill in the field find it convenient to express models using mathematical equations, but that form of expression does not confine the models disclosed here to abstract concepts; instead, each model here has a practical application in a computer in the form of stored executable instructions and data that implement the model using the computer. The model can include a model of past events in one or more fields, a model of the current status of one or more fields and/or a model of predicted events in one or more fields. Model and field data can be stored in data structures in memory, in rows of a database table, in flat files or spreadsheets, or in other forms of stored digital data.
[0047] In one embodiment, the agricultural intelligence computing system 130 is programmed with or comprises a large-FoV disease recognition server computer or set of instructions ("server") 170. The server 170 is further configured to comprise a plantation identification component 172, a disease recognition component 174 and a client interface 176. The plantation identification component 172 is configured to create and execute an object identification model. The object is typically a single leaf of a certain crop, but can comprise several leaves or other parts of the crop plant. The object identification model can take a large FoV image and produce information identifying regions of the image that are likely to capture desired objects. "Large" in this context means covering more than one object, and often at least tens or hundreds of objects. The disease recognition component 174 is configured to create and execute a disease recognition model. As noted above, "model" in this context refers to a digitally stored electronic set of executable instructions and data values, associated with one another, that are capable of receiving and responding to a programmatic or other digital call, invocation or request for resolution based on specified input values, to produce one or more stored or calculated output values that can serve as the basis for computer-implemented recommendations, output data displays or machine control, among other things. The disease can be any plantation disease, such as Northern Leaf Blight ("NLB"), represented by its symptoms on an infected object, such as a corn leaf. The disease recognition model can take a large FoV image and produce information identifying regions of the image that are likely to capture objects infected with a disease. The client interface 176 is configured to communicate with a client device, such as the field management computing device 104 or the cab computer 115, over a communication network, through the communication layer 132. The client interface 176 can also be configured to communicate with a display device through the presentation layer 134. The communication can include receiving training data for creating the object identification model or the disease recognition model, receiving a large FoV image, or transmitting information that identifies regions of a large FoV image that are likely to capture infected objects.
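As a structural illustration only, the division of labor among components 172, 174 and 176 could be composed as below; the class and method names are hypothetical and not taken from this disclosure.

class DiseaseRecognitionServer:
    """Sketch of server 170: a leaf identifier (component 172) proposes regions,
    a disease recognizer (component 174) classifies them, and handle_image plays
    the role of the client interface (component 176)."""

    def __init__(self, leaf_model, disease_model):
        self.leaf_model = leaf_model          # first model: object identification
        self.disease_model = disease_model    # second model: disease recognition

    def handle_image(self, large_fov_image):
        """Accept a large FoV image; return regions likely to capture infected objects."""
        candidates = self.leaf_model.propose_regions(large_fov_image)
        return [region for region in candidates
                if self.disease_model.is_infected(large_fov_image, region)]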
[0048] In one embodiment, each component of the server 170 comprises a set of one or more pages of main memory, such as RAM, in the agricultural intelligence computing system 130 into which executable instructions have been loaded and which, when executed, cause the agricultural intelligence computing system to perform the functions or operations that are described here with reference to those modules. For example, the disease recognition component 174 can comprise a set of pages in RAM that contain instructions which, when executed, cause performance of the location selection functions that are described here.
[0049] The hardware/virtualization layer 150 comprises one or more central processing units (CPUs), memory controllers and other devices, components or elements of a computer system, such as volatile or non-volatile memory, non-volatile storage such as disk, and I/O devices or interfaces as illustrated and described, for example, in connection with FIG. 4. The layer 150 can also comprise programmed instructions that are configured to support virtualization, containerization or other technologies.
[0050] For the purpose of illustrating a clear example, FIG. 1 shows a limited number of instances of certain functional elements. However, in other embodiments, there can be any number of such elements. For example, embodiments can use thousands or millions of different mobile computing devices 104 associated with different users. Furthermore, the system 130 and/or the external data server computer 108 can be implemented using two or more processors, cores, clusters or instances of physical machines or virtual machines, co-located or located in different places in a data center, shared computing facility or cloud computing facility.
[0051] 2.2. APPLICATION PROGRAM OVERVIEW
[0052] In one embodiment, the implementation of the functions described here using one or more computer programs or other software elements that are loaded into and executed using one or more general-purpose computers will cause the general-purpose computers to be configured as a particular machine or as a computer that is specially adapted to perform the functions described here. Furthermore, each of the flow diagrams described further here can serve, alone or in combination with the descriptions of processes and functions in prose presented here, as algorithms, plans or directions that can be used to program a computer or logic to implement the functions that are described. In other words, all of the prose text presented here and all of the drawing figures, together, are intended to provide a disclosure of algorithms, plans or directions that is sufficient to permit a skilled person to program a computer to perform the functions that are described here, in combination with such person's skill and knowledge, given the level of skill that is appropriate for inventions and disclosures of this type.
[0053] In one embodiment, the user 102 interacts with the agricultural intelligence computing system 130 using the field management computing device 104 configured with an operating system and one or more application programs or apps; the field management computing device 104 can also interoperate with the agricultural intelligence computing system independently and automatically under program control or logical control, and direct user interaction is not always necessary. The field management computing device 104 broadly represents one or more of a smartphone, a PDA, a tablet computing device, a laptop computer, a desktop computer, a workstation or any other computing device capable of transmitting and receiving information and performing the functions described here. The field management computing device 104 can communicate over a network using a mobile application stored on the field management computing device 104, and in some embodiments the device can be coupled using a cable 113 or connector to the sensor 112 and/or the controller 114. A particular user 102 can own, operate or possess and use, in connection with the system 130, more than one field management computing device 104 at a time.
[0054] The mobile application can provide client-side functionality, over the network, to one or more mobile computing devices. In an example embodiment, the field management computing device 104 can access the mobile application via a web browser or a local client application or app. The field management computing device 104 can transmit data to, and receive data from, one or more front-end servers using web-based protocols or formats such as HTTP, XML and/or JSON, or app-specific protocols. In an example embodiment, the data can take the form of requests and user input of information, such as field data, into the mobile computing device. In some embodiments, the mobile application interacts with location tracking hardware and software on the field management computing device 104 which determines the location of the field management computing device 104 using standard tracking techniques such as multilateration of radio signals, the global positioning system (GPS), WiFi positioning systems or other methods of mobile positioning. In some cases, location data or other data associated with the device 104, the user 102 and/or the user account(s) can be obtained by queries to an operating system of the device or by requesting an app on the device to obtain data from the operating system.
[0055] In one embodiment, the field management computing device 104 sends field data 106 to the agricultural intelligence computing system 130 comprising or including, but not limited to, data values representing one or more of: a geographical location of one or more fields, tillage information for the one or more fields, crops planted in the one or more fields and soil data extracted from the one or more fields. The field management computing device 104 can send field data 106 in response to user input from the user 102 specifying the data values for the one or more fields. Furthermore, the field management computing device 104 can automatically send field data 106 when one or more of the data values become available to the field management computing device 104. For example, the field management computing device 104 can be communicatively coupled to the remote sensor 112 and/or the application controller 114, which include an irrigation sensor and/or an irrigation controller. In response to receiving data indicating that the application controller 114 released water onto the one or more fields, the field management computing device 104 can send field data 106 to the agricultural intelligence computing system 130 indicating that water was released on the one or more fields. The field data 106 identified in this disclosure can be input and communicated using electronic digital data that is communicated between computing devices using parameterized URLs over HTTP, or another suitable communication or messaging protocol.
[0056] A commercial example of the mobile application is CLIMATE FIELDVIEW, commercially available from The Climate Corporation, San Francisco, California. The CLIMATE FIELDVIEW application, or other applications, can be modified, extended or adapted to include features, functions and programming that were not disclosed before the filing date of this disclosure. In one embodiment, the mobile application comprises an integrated software platform that allows a farmer to make fact-based decisions for his or her operation because it combines historical data about the farmer's fields with any other data that the farmer wishes to compare. The combinations and comparisons can be performed in real time and are based on scientific models that provide potential scenarios to permit the farmer to make better, more informed decisions.
[0057] FIG. 2 illustrates two views of an example logical organization of sets of instructions in main memory when an example mobile application is loaded for execution. In FIG. 2, each named element represents a region of one or more pages of RAM or other main memory, or one or more blocks of disk storage or other non-volatile storage, and the programmed instructions within those regions. In one embodiment, in view (a), a mobile computer application 200 comprises account, fields, data ingestion and sharing instructions 202, overview and alert instructions 204, digital map book instructions 206, seeds and planting instructions 208, nitrogen instructions 210, weather instructions 212, field health instructions 214 and performance instructions 216.
[0058] In one embodiment, a mobile computer application 200 comprises account, fields, data ingestion and sharing instructions 202 that are programmed to receive, transmit and ingest field data from third-party systems via manual upload or APIs. Data types can include field boundaries, yield maps, as-planted maps, soil test results, as-applied maps and/or management zones, among other things. Data formats can include shapefiles, native data formats of third parties and/or farm management information system (FMIS) exports, among other things. Receiving data can occur via manual upload, e-mail with attachment, external APIs that push data to the mobile application, or instructions that call APIs of external systems to pull data into the mobile application. In one embodiment, the mobile computer application 200 comprises a data inbox. In response to receiving a selection of data in the data inbox, the mobile computer application 200 can display a graphical user interface for manually uploading data files and importing the uploaded files to a data manager.
[0059] In one embodiment, the digital map book instructions 206 comprise field map data layers stored in device memory and are programmed with data visualization tools and geospatial field notes. This provides farmers with convenient information close at hand for reference, logging and visual insights into field performance. In one embodiment, the overview and alert instructions 204 are programmed to provide an operation-wide view of what is important to the farmer, and timely recommendations to take action or focus on particular issues. This permits the farmer to focus time on what needs attention, to save time and preserve yield throughout the season. In one embodiment, the seeds and planting instructions 208 are programmed to provide tools for seed selection, hybrid placement and script creation, including variable rate (VR) script creation, based on scientific models and empirical data. This enables farmers to maximize yield or return on investment through optimized seed purchase, placement and population.
[0060] In one embodiment, the script generation instructions 205 are programmed to provide an interface for generating scripts, including variable rate (VR) fertility scripts. The interface enables farmers to create scripts for field implements, such as nutrient applications, planting and irrigation. For example, a planting script interface can comprise tools for identifying a type of seed for planting. Upon receiving a selection of the seed type, the mobile computer application 200 can display one or more fields divided into management zones, such as the field map data layers created as part of the digital map book instructions 206. In one embodiment, the management zones comprise soil zones together with a panel identifying each soil zone and a soil name, texture and drainage for each zone, or other field data. The mobile computer application 200 can also display tools for editing or creating such zones, such as graphical tools for drawing management zones, such as soil zones, over a map of one or more fields. Planting procedures can be applied to all management zones, or different planting procedures can be applied to different subsets of management zones. When a script is created, the mobile computer application 200 can make the script available for download in a format readable by an application controller, such as an archived or compressed format. Additionally and/or alternatively, a script can be sent directly to the cab computer 115 from the mobile computer application 200 and/or uploaded to one or more data servers and stored for later use.
[0061] In one embodiment, the nitrogen instructions 210 are programmed to provide tools to inform nitrogen decisions by visualizing the availability of nitrogen to crops. This enables farmers to maximize yield or return on investment through optimized nitrogen application during the season. Examples of programmed functions include displaying images such as SSURGO images to enable drawing of fertilizer application zones, and/or images generated from subfield soil data, such as data obtained from sensors, at a high spatial resolution (as fine as millimeters or smaller, depending on sensor proximity and resolution); upload of existing farmer-defined zones; providing a graph of plant nutrient availability and/or a map to enable tuning nitrogen application(s) across multiple zones; output of scripts to drive machinery; tools for mass data entry and adjustment; and/or maps for data visualization, among other things. "Mass data entry", in this context, can mean entering data once and then applying the same data to several fields and/or zones that have been defined in the system; example data can include nitrogen application data that is the same for many fields and/or zones of the same farmer, but such mass data entry applies to the entry of any type of field data into the mobile computer application 200. For example, the nitrogen instructions 210 can be programmed to accept definitions of nitrogen application and practices programs and to accept user input specifying application of those programs across multiple fields. "Nitrogen application programs", in this context, refer to stored, named sets of data that associate: a name, a color code or other identifier, one or more application dates, types of material or product for each of the dates and amounts, method of application or incorporation such as injected or broadcast, and/or amounts or rates of application for each of the dates, and the crop or hybrid that is the subject of the application, among other things. "Nitrogen practices programs", in this context, refer to stored, named sets of data that associate: a practice name; a previous crop; a tillage system; a date of primary tillage; one or more previous tillage systems that were used; one or more indicators of application type, such as fertilizer, that were used. The nitrogen instructions 210 can also be programmed to generate and cause displaying of a nitrogen graph, which indicates projections of plant use of the specified nitrogen and whether a surplus or shortfall is predicted; in some embodiments, different color indicators can signal a magnitude of surplus or a magnitude of shortfall. In one embodiment, a nitrogen graph comprises a graphical display in a computer display device comprising a large number of rows, each row associated with and identifying a field; data specifying which crop is planted in the field, the field size, the field location and a graphical representation of the field perimeter; in each row, a timeline by month with graphical indicators specifying each application of nitrogen and the amount, at points correlated to month names; and numerical and/or colored indicators of surplus or shortfall, in which color indicates magnitude.
[0062] In one embodiment, the nitrogen graph can include one or more user input features, such as dials or slider bars, to dynamically change the nitrogen planting and practices programs so that a user can optimize his or her nitrogen graph. The user can then use the optimized nitrogen graph and the related nitrogen planting and practices programs to implement one or more scripts, including variable rate (VR) fertility scripts. The nitrogen instructions 210 can also be programmed to generate and cause displaying of a nitrogen map, which indicates projections of plant use of the specified nitrogen and whether a surplus or shortfall is predicted; in some embodiments, different color indicators can signal a magnitude of surplus or a magnitude of shortfall. The nitrogen map can display projections of plant use of the specified nitrogen, and whether a surplus or shortfall is predicted, for different times in the past and the future (such as daily, weekly, monthly or yearly), using numerical and/or colored indicators of surplus or shortfall, in which color indicates magnitude. In one embodiment, the nitrogen map can include one or more user input features, such as dials or slider bars, to dynamically change the nitrogen planting and practices programs so that a user can optimize his or her nitrogen map, for example to obtain a preferred amount of surplus to shortfall. The user can then use the optimized nitrogen map and the related nitrogen planting and practices programs to implement one or more scripts, including variable rate (VR) fertility scripts. In other embodiments, instructions similar to the nitrogen instructions 210 could be used for application of other nutrients (such as phosphorus and potassium), pesticide application and irrigation programs.
[0063] In one embodiment, the weather instructions 212 are programmed to provide field-specific recent weather data and forecast weather information. This enables farmers to save time and have an efficient integrated view with respect to daily operational decisions.
[0064] In one embodiment, the field health instructions 214 are programmed to provide timely remote-sensing images highlighting in-season crop variation and potential concerns. Examples of programmed functions include cloud checking, to identify possible clouds or cloud shadows; determining nitrogen indices based on field images; graphical visualization of scouting layers, including, for example, those related to field health, and viewing and/or sharing of scouting notes; and/or downloading satellite images from multiple sources and prioritizing the images for the farmer, among other things.
[0065] In one embodiment, the performance instructions 216 are programmed to provide reports, analysis and insight tools that use on-farm data for evaluation, insights and decisions. This enables the farmer to pursue improved outcomes for the next year through fact-based conclusions about why the return on investment was at prior levels, and insight into the yield-limiting factors. The performance instructions 216 can be programmed to communicate via the network(s) 109 with back-end analytics programs executed in the agricultural intelligence computing system 130 and/or the external data server computer 108 and configured to analyze metrics such as yield, yield differential, hybrid, population, SSURGO zone, soil test properties or elevation, among other things. Programmed reports and analysis can include yield variability analysis, treatment effect estimation, benchmarking of yield and other metrics against other farmers based on anonymized data collected from many farmers, or data for seeds and planting, among other things.
[0066] Applications having instructions configured in this way can be implemented for different computing device platforms while retaining the same general user interface appearance. For example, the mobile application can be programmed for execution on tablets, smartphones or server computers that are accessed using browsers on client computers. Furthermore, the mobile application as configured for tablet computers or smartphones can provide a full app experience or a cab app experience that is suitable for the display and processing capabilities of the cab computer 115. For example, referring now to view (b) of FIG. 2, in one embodiment a cab computer application 220 can comprise cab map instructions 222, remote view instructions 224, data collection and transfer instructions 226, machine alert instructions 228, script transfer instructions 230 and scouting-cab instructions 232. The code base for the instructions of view (b) can be the same as for view (a), and the executables implementing the code can be programmed to detect the type of platform on which they execute and to expose, through a graphical user interface, only those functions that are appropriate for a cab platform or a full platform.
[0067] 2.3. DATA INGESTION TO THE COMPUTER SYSTEM
[0068] In one embodiment, the external data server computer 108 stores external data 110, including soil data representing the soil composition for the one or more fields and weather data representing temperature and precipitation over the one or more fields. The weather data can include past and present weather data, as well as forecasts for future weather data. In one embodiment, the external data server computer 108 comprises a large number of servers hosted by different entities. For example, a first server can contain soil composition data while a second server can include weather data. Additionally, soil composition data can be stored on multiple servers. For example, one server can store data representing the percentage of sand, silt and clay in the soil, while a second server can store data representing the percentage of organic matter (OM) in the soil.
[0069] In one embodiment, the remote sensor 112 comprises one or more sensors that are programmed or configured to produce one or more observations. The remote sensor 112 can be aerial sensors, such as satellites, vehicle sensors, planting equipment sensors, tillage sensors, fertilizer or insecticide application sensors, harvester sensors and any other implement capable of receiving data from the one or more fields. In one embodiment, the application controller 114 is programmed or configured to receive instructions from the agricultural intelligence computing system 130. The application controller 114 can also be programmed or configured to control an operating parameter of an agricultural vehicle or implement. For example, an application controller can be programmed or configured to control an operating parameter of a vehicle, such as a tractor, planting equipment, tillage equipment, fertilizer or insecticide equipment, harvesting equipment or other farm implements such as a water valve. Other embodiments can use any combination of sensors and controllers, of which the following are merely selected examples.
[0070] The system 130 can obtain or ingest data, under the control of the user 102, on a mass basis from a large number of farmers who have contributed data to a shared database system. This form of obtaining data can be termed "manual data ingestion", since one or more user-controlled computer operations are requested or triggered to obtain data for use by the system 130. As an example, the CLIMATE FIELDVIEW application, commercially available from The Climate Corporation, San Francisco, California, can be operated to export data to the system 130 for storage in the repository 160.
[0071] For example, seed monitor systems can both control planter apparatus components and obtain planting data, including signals from seed sensors via a signal harness that comprises a CAN backbone and point-to-point connections for registration and/or diagnostics. Seed monitor systems can be programmed or configured to display seed spacing, population and other information to the user via the cab computer 115 or other devices within the system 130. Examples are disclosed in U.S. Pat. No. 8,738,243 and U.S. Pat. Pub. 2015/0094916, and the present disclosure assumes knowledge of those other patent disclosures.
[0072] Likewise, yield monitor systems can contain yield sensors for harvester apparatus that send yield measurement data to the cab computer 115 or other devices within the system 130. Yield monitor systems can use one or more remote sensors 112 to obtain grain moisture measurements in a combine or other harvester and transmit those measurements to the user via the cab computer 115 or other devices within the system 130.
[0073] In one embodiment, examples of sensors 112 that can be used with any moving vehicle or apparatus of the type described elsewhere here include kinematic sensors and position sensors. Kinematic sensors can comprise any speed sensors, such as radar or wheel speed sensors, accelerometers or gyroscopes. Position sensors can comprise GPS receivers or transceivers, or WiFi-based positioning or mapping apps that are programmed to determine location based on nearby WiFi hotspots, among other things.
[0074] In one embodiment, examples of sensors 112 that can be used with tractors or other moving vehicles include engine speed sensors, fuel consumption sensors, area counters or distance counters that interact with GPS or radar signals, power take-off (PTO) speed sensors, tractor hydraulic sensors configured to detect hydraulic parameters such as pressure or flow and/or hydraulic pump speed, wheel speed sensors or wheel slip sensors. In one embodiment, examples of controllers 114 that can be used with tractors include hydraulic directional controllers, pressure controllers and/or flow controllers; hydraulic pump speed controllers; speed controllers or governors; hitch position controllers; or wheel position controllers that provide automatic steering.
[0075] In one embodiment, examples of sensors 112 that can be used with seed planting apparatus such as planters, drills or air seeders include seed sensors, which can be optical, electromagnetic or impact sensors; downforce sensors such as load pins, load cells or pressure sensors; soil property sensors such as reflectivity sensors, moisture sensors, electrical conductivity sensors, optical residue sensors or temperature sensors; component operating criteria sensors such as planting depth sensors, downforce cylinder pressure sensors, seed disc speed sensors, seed drive motor encoders, seed conveyor system speed sensors or vacuum level sensors; or pesticide application sensors such as optical or other electromagnetic sensors or impact sensors. In one embodiment, examples of controllers 114 that can be used with such seed planting apparatus include: toolbar fold controllers, such as controllers for valves associated with hydraulic cylinders; downforce controllers, such as controllers for valves associated with pneumatic cylinders, air bags or hydraulic cylinders, programmed to apply downforce to individual row units or to an entire planter frame; planting depth controllers, such as linear actuators; metering controllers, such as electric seed meter drive motors, hydraulic seed meter drive motors or swath control clutches; hybrid selection controllers, such as seed meter drive motors or other actuators programmed to selectively allow or prevent a seed or an air-seed mixture from delivering seed to or from seed meters or central bulk hoppers; metering controllers, such as electric seed meter drive motors or hydraulic seed meter drive motors; seed conveyor system controllers, such as controllers for a belt seed delivery conveyor motor; marker controllers, such as a controller for a pneumatic or hydraulic actuator; or pesticide application rate controllers, such as metering drive controllers, orifice size or position controllers.
[0076] [0076] In one embodiment, examples of sensors 112 that can be used with tillage equipment include position sensors for tools such as shanks or discs; tool position sensors for such tools that are configured to detect depth, gang angle or lateral spacing; downforce sensors; or draft force sensors. In one embodiment, examples of controllers 114 that can be used with tillage equipment include downforce controllers or tool position controllers, such as controllers configured to control tool depth, gang angle or lateral spacing.
[0077] [0077] In one embodiment, examples of sensors 112 that can be used in relation to apparatus for applying fertilizers, insecticides, fungicides and the like, such as on-planter starter fertilizer systems, subsoil fertilizer applicators or fertilizer sprayers, include: fluid system criteria sensors, such as flow sensors or pressure sensors; sensors that indicate which spray head valves or which fluid line valves are open; sensors associated with tanks, such as fill level sensors.
[0078] [0078] In one embodiment, examples of sensors 112 that can be used with harvesters include yield monitors, such as impact plate strain gauges or position sensors, capacitive flow sensors, load sensors, weight sensors or torque sensors associated with elevators or augers, or optical or other electromagnetic grain height sensors; grain moisture sensors, such as capacitive sensors; grain loss sensors, including impact, optical or capacitive sensors; header operating criteria sensors such as header height, header type, deck plate gap, feeder speed and reel speed sensors; separator operating criteria sensors, such as sensors for concave clearance, rotor speed, shoe clearance or chaffer clearance; auger sensors for position, operation or speed; or engine speed sensors. In one embodiment, examples of controllers 114 that can be used with harvesters include header operating criteria controllers for elements such as header height, header type, deck plate gap, feeder speed or reel speed; separator operating criteria controllers for characteristics such as concave clearance, rotor speed, shoe clearance or chaffer clearance; or controllers for auger position, operation or speed.
[0079] [0079] In one embodiment, examples of sensors 112 that can be used with grain carts include weight sensors or sensors for auger position, operation or speed. In one embodiment, examples of controllers 114 that can be used with grain carts include controllers for auger position, operation or speed.
[0080] [0080] In one embodiment, examples of sensors 112 and controllers 114 can be installed on unmanned aerial vehicle (UAV) apparatus or "drones". Such sensors can include cameras with detectors effective for any range of the electromagnetic spectrum including visible light, infrared, ultraviolet, near-infrared (NIR) and the like; accelerometers; altimeters; temperature sensors; humidity sensors; pitot tube sensors or other airspeed or wind speed sensors; battery life sensors; radar emitters and reflected radar energy detection apparatus; or other electromagnetic radiation emitters and reflected electromagnetic radiation detection apparatus. Such controllers can include guidance or motor control apparatus, control surface controllers, camera controllers or controllers programmed to turn on, operate, obtain data from, manage and configure any of the foregoing sensors. Examples are disclosed in U.S. Patent Application No. 14/831,165 and the present disclosure presumes knowledge of that other patent disclosure.
[0081] [0081] In one embodiment, sensors 112 and controllers 114 can be attached to soil sampling and measurement apparatus that is configured or programmed to obtain soil samples and perform soil chemistry tests, soil moisture tests and other soil-related tests. For example, the apparatus disclosed in U.S. Pat. No. 8,767,194 and U.S. Pat. No. 8,712,148 can be used, and the present disclosure presumes knowledge of those patent disclosures.
[0082] [0082] In one embodiment, sensors 112 and controllers 114 can comprise weather stations for monitoring weather conditions of fields. For example, the apparatus disclosed in U.S. Provisional Patent Application No. 62/154,207, filed on April 29, 2015, U.S. Provisional Patent Application No. 62/175,160, filed on June 12, 2015, U.S. Provisional Patent Application No. 62/198,060, filed on July 28, 2015, and U.S. Provisional Patent Application No. 62/220,852, filed on September 18, 2015, can be used, and the present disclosure presumes knowledge of those patent disclosures.
[0083] [0083] 2.4. PROCESS OVERVIEW - AGRONOMIC MODEL TRAINING
[0084] [0084] In one embodiment, the agricultural intelligence computing system 130 is programmed or configured to create an agronomic model. In this context, an agronomic model is a data structure in the memory of the agricultural intelligence computing system 130 that comprises field data 106, such as identification data and harvest data for one or more fields. The agronomic model can also comprise calculated agronomic properties that describe conditions that can affect the growth of one or more crops in a field, or properties of one or more crops, or both. In addition, an agronomic model can comprise recommendations based on agronomic factors such as crop recommendations, irrigation recommendations, planting recommendations, fertilizer recommendations, fungicide recommendations, pesticide recommendations, harvest recommendations and other crop management recommendations. Agronomic factors can also be used to estimate one or more crop-related results, such as agronomic yield. The agronomic yield of a crop is an estimate of the quantity of the crop that is produced or, in some examples, the gross revenue or profit obtained from the crop produced.
[0085] [0085] In one embodiment, the agricultural intelligence computing system 130 can use a preconfigured agronomic model to calculate agronomic properties related to the currently received location and crop information for one or more fields. The preconfigured agronomic model is based on previously processed field data, including, but not limited to, identification data, harvest data, fertilizer data and weather data. The preconfigured agronomic model may have been cross-validated to ensure the accuracy of the model. Cross-validation can include a comparison with ground truth that compares predicted results with actual results in a field, such as comparing a rainfall estimate with a rain gauge or rain sensor providing weather data at the same or a nearby location, or an estimate of nitrogen content with a soil sample measurement.
[0086] [0086] FIG. 3 illustrates a programmed process through which the agricultural intelligence computing system generates one or more preconfigured agronomic models using field data provided by one or more data sources. FIG. 3 can serve as an algorithm or instructions for programming the functional elements of the agricultural intelligence computing system 130 to perform the operations that are now described.
[0087] [0087] In block 305, the agricultural intelligence computing system 130 is configured or programmed to implement agronomic data preprocessing of field data received from one or more data sources. The field data received from one or more data sources can be preprocessed for the purpose of removing noise, distortion effects and confounding factors within the agronomic data, including measured outliers that could adversely affect the received field data values. Embodiments of agronomic data preprocessing may include, but are not limited to, removing data values commonly associated with outlier data values, specific measured data points that are known to unnecessarily skew other data values, data smoothing, aggregation or sampling techniques used to remove or reduce additive or multiplicative noise effects, and other data filtering or derivation techniques used to provide clear distinctions between positive and negative data inputs.
[0088] [0088] In block 310, the agricultural intelligence computing system 130 is configured or programmed to perform data subset selection using the preprocessed field data in order to identify datasets useful for generating the initial agronomic model. The agricultural intelligence computing system 130 can implement data subset selection techniques including, but not limited to, a genetic algorithm method, an all-subset-models method, a sequential search method, a stepwise regression method, a particle swarm optimization method and an ant colony optimization method. For example, a genetic algorithm selection technique uses an adaptive heuristic search algorithm, based on the evolutionary principles of natural selection and genetics, to determine and evaluate datasets within the preprocessed agronomic data.
[0089] [0089] In block 315, the agricultural intelligence computing system 130 is configured or programmed to implement field dataset evaluation. In one embodiment, a specific field dataset is evaluated by creating an agronomic model and using specific quality thresholds for the created agronomic model. Agronomic models can be compared and/or validated using one or more comparison techniques, such as, but not limited to, root mean square error with leave-one-out cross-validation (RMSECV), mean absolute error and mean percentage error. For example, RMSECV can cross-validate agronomic models by comparing agronomic property values predicted by the agronomic model against historical agronomic property values collected and analyzed. In one embodiment, the agronomic dataset evaluation logic is used as a feedback loop in which agronomic datasets that do not meet the configured quality thresholds are used during future data subset selection steps (block 310).
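For illustration only, the following minimal Python sketch computes an RMSECV score with scikit-learn's leave-one-out utilities; the arrays X and y are hypothetical stand-ins for agronomic factors and observed yields, and a linear model stands in for whatever model block 315 evaluates.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical stand-in data: X holds agronomic factors, y holds observed yields.
X = np.random.rand(30, 4)
y = np.random.rand(30)

# Each sample is predicted by a model trained on all remaining samples.
predicted = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
rmsecv = np.sqrt(np.mean((predicted - y) ** 2))
print("RMSECV:", rmsecv)
```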
[0090] [0090] In block 320, the agricultural intelligence computing system 130 is configured or programmed to implement the creation of the agronomic model based on the cross-validated agronomic data sets. In one embodiment, the creation of the agronomic model can implement multivariate regression techniques to create pre-configured agronomic data models.
[0091] [0091] In block 325, the agricultural intelligence computing system 130 is configured or programmed to store pre-configured agronomic data models for future assessment of field data.
[0092] [0092] 2.5. IMPLEMENTATION EXAMPLE - HARDWARE OVERVIEW
[0093] [0093] According to one embodiment, the techniques described here are implemented by one or more special purpose computing devices. The special purpose computing devices can be hard-wired to perform the techniques, or can include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or can include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage or a combination. Such special purpose computing devices can also combine custom hard-wired logic, ASICs or FPGAs with custom programming to perform the techniques. Special purpose computing devices can be desktop computing systems, portable computing systems, handheld devices, network devices or any other device that incorporates hard-wired and/or programmed logic to implement the techniques.
[0094] [0094] For example, FIG. 4 is a block diagram illustrating a computing system 400 in which an embodiment of the invention can be implemented. Computing system 400 includes a bus 402 or other communication mechanism for transmitting information and a hardware processor 404 coupled to bus 402 for processing information. The hardware processor 404 can be, for example, a general purpose microprocessor.
[0095] [0095] The computing system 400 also includes a main memory 406, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 can also be used to store temporary variables or other intermediate information during the execution of instructions to be executed by processor 404.
[0096] [0096] Computing system 400 further includes a read-only memory (ROM) 408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk, an optical disk or a solid state drive is provided and attached to bus 402 for the storage of information and instructions.
[0097] [0097] The computing system 400 can be coupled via bus 402 to a display 412, such as a cathode ray tube (CRT), for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to the processor 404. Another type of user input device is cursor control 416, such as a mouse, a trackball or cursor direction keys for communicating direction information and command selections to the processor 404 and for controlling cursor movement on display 412. This input device typically has two degrees of freedom on two axes, a first axis (for example, x) and a second axis (for example, y), which allows the device to specify positions on a plane.
[0098] [0098] The computing system 400 can implement the techniques described here using custom hard-wired logic, one or more ASICs or FPGAs, firmware and/or programmed logic which, in combination with the computing system, causes or programs the computing system 400 to be a special purpose machine. According to one embodiment, the techniques here are performed by the computing system 400 in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions can be read into main memory 406 from other storage media, such as storage device 410. Execution of the instruction sequences contained in main memory 406 causes processor 404 to perform the process steps described here. In alternative embodiments, hard-wired circuitry can be used in place of or in combination with software instructions.
[0099] [0099] The term "storage media" as used here refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific way. Such storage media can comprise non-volatile media and/or volatile media. Non-volatile media include, for example, optical discs, magnetic discs or solid state drives, such as the storage device 410. Volatile media include dynamic memory, such as main memory 406. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, or any other memory chip or cartridge.
[0100] [0100] Storage media are distinct from, but can be used in conjunction with, transmission media. Transmission media participate in transferring information between storage media. For example, transmission media include coaxial cables, copper wire and optical fibers, including the wires that comprise bus 402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
[0101] [0101] Various forms of media can be involved in carrying one or more sequences of one or more instructions to the processor 404 for execution. For example, the instructions can initially be carried on a magnetic disk or a solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the computing system 400 can receive the data on the telephone line and use an infrared transmitter to convert the data into an infrared signal. An infrared detector can receive the data carried in the infrared signal and appropriate circuitry can place the data on bus 402. Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions. The instructions received by main memory 406 can optionally be stored in storage device 410 either before or after execution by processor 404.
[0102] [0102] The computing system 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420 that is connected to a local network 422. For example, communication interface 418 can be an integrated services digital network (ISDN) card, a cable modem, a satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 418 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links can also be implemented. In any such implementation, communication interface 418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
[0103] [0103] Network link 420 typically provides data communication through one or more networks to other data devices. For example, network link 420 can provide a connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426 in turn provides data communication services through the worldwide packet data communication network now commonly referred to as the "Internet" 428. Both the local network 422 and the Internet 428 use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 420 and through communication interface 418, which carry the digital data to and from the computing system 400, are examples of forms of transmission media.
[0104] [0104] The computing system 400 can send messages and receive data, including program code, through the network(s), network link 420 and communication interface 418. In the Internet example, a server 430 could transmit requested code for an application program through the Internet 428, ISP 426, local network 422 and communication interface 418.
[0105] [0105] The received code can be executed by the processor 404 as it is received and / or stored in the storage device 410 or other non-volatile storage for later execution.
[0106] [0106] 3. FUNCTIONAL DESCRIPTION
[0107] [0107] 3.1 MODEL DEVELOPMENT
[0108] [0108] In some embodiments, the large FoV disease recognition server 170 is programmed or configured with data structures and/or database records that are arranged to create a first model to identify a candidate region that is likely to capture an intended target in a large FoV image. The intended target can be a single leaf, a set of several leaves or a set of several crop plants regardless of the number of leaves. Server 170 is programmed to receive a first set of training images, such as approximately 2,000 images cropped from aerial reconnaissance images of a field. Each of the training images can be of a consistent size, such as 224 pixels by 224 pixels. When the intended target is a single leaf, the training images that serve as positive samples, which can be approximately half of all training images, are typically clear enough that the computer can detect a single healthy or infected leaf in the center of the image occupying more than 50% of the total area. The rest of the images, which might be very dark, capture several leaves or capture only soil, would serve as negative samples.
[0109] [0109] In some embodiments, server 170 is programmed to next create a first model based on the first set of training images. The first model can be an SVM that accepts a vector of HOG features for a training image and produces an indication of whether the training image captures an intended target. Specifically, the training data for the SVM can be the set of HOG features for the training images that serve as positive examples and the set of HOG features for the training images that serve as negative examples.
[0110] [0110] FIG. 7 illustrates an example of computing HOG feature vectors for regions of a large FoV image. A region 704 can be extracted from a large FoV image 702. The gradient values shown as the black arrows over region 704 can be computed for region 704, as further described below. Such gradient values generally trace an outline of foreground objects in an image, as illustrated by the white arrows over the large FoV image 702. The HOG feature vector for a region, such as the histogram 706 of different gradient values, can thus capture specific shapes of foreground objects in the region.
[0111] [0111] In some embodiments, the first model can be built using any SVM module or library known to a person skilled in the art. For example, the SVC function in the scikit-learn library can be executed with default parameter values. Alternatively, the first model can include one or more other classification methods, such as the k-nearest neighbors algorithm or a decision tree. In addition, the first model does not need to work with HOG feature vectors or otherwise rely on edge detection. For example, the first model can be based on pattern matching that looks for characteristic leaf shapes.
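As a sketch of how such a first model might be assembled, assuming scikit-image for the HOG descriptor and scikit-learn for the SVM, the following code trains an SVC with default parameters on HOG vectors; the random crops and labels are placeholders for the actual training images.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_vector(gray_crop):
    """HOG descriptor for one 224x224 grayscale crop."""
    return hog(gray_crop, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

# Placeholder data; real training would use the cropped field images,
# with label 1 for a single centered leaf and 0 for a negative sample.
crops = np.random.rand(20, 224, 224)
labels = np.random.randint(0, 2, 20)

svm = SVC()  # default parameter values, as suggested above
svm.fit(np.stack([hog_vector(c) for c in crops]), labels)
```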
[0112] [0112] In some embodiments, server 170 is programmed to also create a second model to identify intermediate regions that are likely to capture an infected target in a large FoV image. Server 170 is programmed to receive a second set of training images, such as approximately 1,000 images generated from mobile-phone photos of infected leaves for each disease to be recognized, 1,000 images of healthy leaves and 1,000 background images generated by randomly cropping aerial reconnaissance images, with each image at or above 224 pixels by 224 pixels. As with the first set of training images, when the intended target is a single leaf, each of the training images that is to serve as a positive example contains a single leaf in the center of the image occupying more than 50% of the total area and is sufficiently clear to show any disease symptoms. Server 170 can also be programmed to include images of varying sizes and scales in the second set of training images.
[0113] [0113] In some embodiments, server 170 can be programmed to refine the second set of training images as needed by the second model. Each of the second set of training images can be cropped to a fixed size, such as 224 pixels by 224 pixels. The second set of training images can also be extended by subjecting existing training images to various imaging operations. For example, the short edge of each training image can be resized to between 256 pixels and 384 pixels for scaling, as further described below, and the resized image can be randomly cropped to 224 pixels by 224 pixels. As an additional example, the training images can be randomly rotated in the range (-90 degrees, 90 degrees) and randomly sheared with a shear factor of 0.2. For multi-size training, which is also described further below, the training images can be scaled at successively smaller scales and used directly.
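A possible realization of these operations, assuming Pillow for resizing and cropping and the Keras ImageDataGenerator for rotation and shear, is sketched below; the function and variable names are illustrative only.

```python
import numpy as np
from PIL import Image
from keras.preprocessing.image import ImageDataGenerator

def random_resize_crop(img, short_min=256, short_max=384, out=224):
    """Resize the short edge into [256, 384], then take a random 224x224 crop."""
    short = np.random.randint(short_min, short_max + 1)
    scale = short / min(img.size)
    img = img.resize((int(img.width * scale), int(img.height * scale)),
                     Image.LANCZOS)
    x = np.random.randint(0, img.width - out + 1)
    y = np.random.randint(0, img.height - out + 1)
    return img.crop((x, y, x + out, y + out))

# Random rotation in (-90, 90) degrees and random shear with factor 0.2.
augmenter = ImageDataGenerator(rotation_range=90, shear_range=0.2)
```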
[0114] [0114] In some embodiments, server 170 is programmed to further augment the second set of training images. Data augmentation can be accomplished using a generative adversarial network ("GAN"), which can be used to generate photographs that look authentic to human observers and is thus useful for enhancing the performance of a machine learning model by expanding the training dataset. Specifically, a GAN involves a system of two neural networks, namely a generative network and a discriminative network, competing against each other in a zero-sum game framework. The set of authentic images is used as the training set for the discriminative network, while the generative network produces a set of artificial images that look authentic to human observers. An example of a GAN is an improved Wasserstein GAN ("WGAN"), which is described in the article by Gulrajani et al., arXiv:1704.00028v2 (2017). The current second set of training images can be used as the training set for the discriminative network in a semi-supervised learning fashion.
[0115] [0115] In some embodiments, data augmentation can be implemented using any WGAN module or library known to a person skilled in the art. For example, the improved_wgan_training package on GitHub can be adapted by replacing the discriminator with a ResNet50 network (to accept training data of certain sizes), which can be implemented using the ResNet50 model from the Keras library compatible with Python 2.7-3.5.
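One way to realize the discriminator swap is sketched here, under the assumption that a headless Keras ResNet50 topped with a single linear unit can serve as the WGAN critic; the wiring is illustrative, not the package's actual code.

```python
from keras.applications.resnet50 import ResNet50
from keras.layers import Dense, Input
from keras.models import Model

# A ResNet50 backbone without the 1000-class head, followed by one linear
# output unit acting as the Wasserstein critic score.
inputs = Input(shape=(224, 224, 3))
backbone = ResNet50(include_top=False, weights=None, pooling='max',
                    input_tensor=inputs)
score = Dense(1)(backbone.output)  # no activation: critic outputs a raw score
critic = Model(inputs, score)
critic.summary()
```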
[0116] [0116] In some embodiments, server 170 is programmed to next create a second model based on the second set of training images. The second model can be a CNN that accepts a large FoV image and performs a classification for each of the candidate regions, which can be identified by the first model, indicating whether the region captures symptoms of one of the predetermined crop diseases.
[0117] [0117] FIG. 8 illustrates an example of a CNN architecture according to one embodiment. The CNN can comprise a set of convolutional layers 802 configured to generate feature maps for the provided large FoV image. The output of executing the set of convolutional layers is a set of feature maps that correspond to different features of infected targets, which can be at different scales.
[0118] [0118] The CNN can further comprise a RoI pooling layer 804 to generate fixed-length representations for arbitrarily sized regions of the large FoV image. Specifically, a RoI projection can be performed to project a region of interest, such as one identified by the first model, onto the input large FoV image in order to extract a corresponding portion ("projection") of each feature map produced by the set of convolutional layers 802. The set of portions of the feature maps can then be submitted to the RoI pooling layer 804. An example of the RoI pooling layer is a max pooling layer that converts a projection into a 7 pixel by 7 pixel feature map.
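For concreteness, a minimal NumPy sketch of RoI max pooling follows; it reduces an arbitrarily sized region of a feature map to a fixed 7 by 7 grid, with the names and the exact binning scheme chosen for illustration.

```python
import numpy as np

def roi_max_pool(feature_map, roi, out_size=7):
    """Max-pool a region of interest of a (H, W, C) feature map to out_size x out_size."""
    x0, y0, x1, y1 = roi  # region corners in feature-map coordinates
    rows = np.linspace(y0, y1, out_size + 1).astype(int)
    cols = np.linspace(x0, x1, out_size + 1).astype(int)
    pooled = np.zeros((out_size, out_size, feature_map.shape[2]))
    for i in range(out_size):
        for j in range(out_size):
            cell = feature_map[rows[i]:max(rows[i] + 1, rows[i + 1]),
                               cols[j]:max(cols[j] + 1, cols[j + 1]), :]
            pooled[i, j, :] = cell.max(axis=(0, 1))
    return pooled
```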
[0119] [0119] Finally, the CNN can comprise a fully connected layer 806 to generate a final classification using the corresponding set of 7 pixel by 7 pixel feature maps. The fully connected layer can include the softmax function, which produces a class membership probability for each class, or in this case an infection probability for each disease or a probability of no infection.
[0120] [0120] In some embodiments, the second model can be built using any CNN module or library known to a person skilled in the art. For example, the ResNet50 model of the Keras library compatible with Python 2.7-3.5 can be used, with include_top set to True, pooling set to max and classes set to 5. The five classes would correspond to background, gray leaf spot (GLS), Goss's Wilt (GW), NLB and healthy leaf.
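A minimal sketch of that instantiation follows; weights=None reflects that the stock 1000-class ImageNet head is replaced by a 5-way softmax, and the optimizer and loss are illustrative choices.

```python
from keras.applications.resnet50 import ResNet50

# Five classes: background, GLS, GW, NLB and healthy leaf.
model = ResNet50(include_top=True, weights=None, classes=5)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
```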
[0121] [0121] In some embodiments, to achieve scale invariance, the second model, such as a CNN, can be trained with single-size training or multi-size training. In other embodiments, the second model can include one or more other classification methods known to one skilled in the art, such as linear regression, a Markov chain or a recurrent neural network.
[0122] [0122] 3.2 IMAGE SCALING AND REGION SELECTION
[0123] [0123] In some embodiments, given a large FoV image, server 170 is programmed to resize the provided image at different scales to generate an image pyramid, so that a fixed-size sliding window can be used to find a leaf of an arbitrary size. The size of the sliding window can correspond to the expected size of each of the second set of training images. Any scaling method known to a person skilled in the art can be used, such as nearest-neighbor scaling or Lanczos resampling.
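A minimal sketch of such pyramid generation, assuming Pillow with Lanczos resampling and an illustrative set of scale factors:

```python
from PIL import Image

def image_pyramid(image, scales=(1.0, 0.6, 0.5, 0.35, 0.25)):
    """Yield (scale, resized image) pairs at successively smaller scales."""
    for s in scales:
        size = (int(image.width * s), int(image.height * s))
        yield s, image.resize(size, Image.LANCZOS)
```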
[0124] [0124] FIG. 9 illustrates an example of a scaling process to generate an image pyramid.
[0125] [0125] As an example, an original large FoV image can be 1,600 pixels by 1,200 pixels in size and images 904, 906, 908 and 910 can represent scaled images at successively smaller scales, corresponding to sizes of 900 pixels by 800 pixels, 800 pixels by 600 pixels, 560 pixels by 420 pixels and 400 pixels by 300 pixels, respectively. The smallest resized image 910 would still be larger than the size of a sliding window 902, which is 224 pixels by 224 pixels. As the resized images become smaller, the scaling method is expected to maintain the main characteristics of the original image even as more information is lost. Therefore, as the images get smaller, the same sliding window 902 would correspond to a larger region in the original image, ideally with more or less the same characteristics. Consequently, at successively smaller scales, sliding window 902 corresponds to a region in the original image that captures more and more leaves. For an image of a specific size, when the size of the field of view is large, the image could capture many leaves and each leaf could be represented in only a small area, and thus a large scale would suffice for sliding window 902 to correspond to a region that captures a single leaf.
[0126] [0126] Similarly, when the size of the field of view is small, the image could capture fewer leaves and each leaf could be represented in a large area, and thus a small scale would be necessary for sliding window 902 to correspond to a region that captures a single leaf. In general, any number of scales can be used, although a larger number of scales would mean a greater number of regions to consider.
[0127] [0127] In some embodiments, server 170 is programmed to run sliding window 902 over each resized image and execute the first model discussed above on the corresponding region to identify a candidate region that is likely to capture an intended target, such as a single leaf. Specifically, for each corresponding region, server 170 is programmed to compute the vector of HOG features. The computation can be performed using any HOG module or library known to a person skilled in the art. For example, the hog function in the scikit-image library can be executed with default parameter values.
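The sliding-window pass over one pyramid level might look like the following sketch, which assumes a grayscale image, the scikit-image hog function and an SVM trained as above; the stride and HOG parameters are illustrative.

```python
from skimage.feature import hog

def candidate_regions(gray_image, svm, win=224, stride=112):
    """Return windows of one pyramid level that the SVM labels as a single leaf."""
    regions = []
    h, w = gray_image.shape
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            patch = gray_image[y:y + win, x:x + win]
            features = hog(patch, orientations=9, pixels_per_cell=(8, 8),
                           cells_per_block=(2, 2))
            if svm.predict(features.reshape(1, -1))[0] == 1:
                regions.append((x, y, x + win, y + win))
    return regions
```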
[0128] [0128] FIG. 10 illustrates an example of a region selection result over an image pyramid. Graphs 1002, 1004, 1006 and 1008 can represent the original and resized images mentioned above that are 1,600 pixels by 1,200 pixels, 900 pixels by 800 pixels, 800 pixels by 600 pixels and 560 pixels by 420 pixels, respectively. The x-axis and y-axis of each graph indicate the image size in units of 40 pixels, although each graph is shown at the same size to indicate how the resized image is mapped back to the original image. Candidate regions are shown as dark areas in these graphs, such as dark region 1010. As can be seen from graphs 1002, 1004 and 1006, different candidate regions can be identified when the original image is resized to different scales. For example, dark region 1010 appears clearly in graph 1004, but only partially in graph 1002 and not at all in graphs 1006 and 1008.
[0129] [0129] 3.3 COMPUTING IMAGE FEATURES
[0130] [0130] In some embodiments, server 170 is programmed to run the second model on the input large FoV image to initially compute the features for the image that will be used to determine whether the image captures any infected target, such as an infected leaf. Specifically, server 170 is configured to run the CNN's set of convolutional layers over the large FoV image. A CNN is generally able to extract features of the target at different scales. Executing the set of convolutional layers on the large FoV image once is clearly more efficient than additionally executing the set of convolutional layers on the different resized images. Therefore, it is desirable to feed the large FoV image to the CNN, to allow different regions of the image to be evaluated through convolution operations, but to trim the feature maps to concentrate on the candidate regions identified by the first model for the RoI pooling layer and the fully connected layer.
[0131] [0131] 3.4. MAPPING AND CLASSIFICATION OF IMAGE REGIONS
[0132] [0132] In some embodiments, when the second model is a CNN, server 170 is programmed to perform the RoI projection, as previously mentioned. Specifically, server 170 is configured to take each candidate region identified by the first model, map it to the input large FoV image to obtain a mapped region, and extract a corresponding portion ("projected feature map") from each of the feature maps produced by the CNN's set of convolutional layers. An example of this mapping procedure can be found in the article by He et al., arXiv:1406.4729v4 [cs.CV] (2015).
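A minimal sketch of that coordinate mapping follows, assuming the backbone's total stride is the only quantity needed and using the floor/ceil rounding of He et al.; the stride value of 32 is illustrative of a ResNet50-like network.

```python
import math

def project_roi(box, total_stride=32):
    """Map an image-space box (x0, y0, x1, y1) to feature-map coordinates."""
    x0, y0, x1, y1 = box
    return (math.floor(x0 / total_stride), math.floor(y0 / total_stride),
            math.ceil(x1 / total_stride), math.ceil(y1 / total_stride))
```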
[0133] [0133] In some embodiments, server 170 is programmed to execute the RoI pooling layer on each of the projected feature maps that correspond to a candidate region identified by the first model. Since the RoI pooling layer is designed to work over a reasonably sized region to produce a meaningful fixed-size representation, it is desirable to send selected regions of the feature maps produced by the set of convolutional layers to the RoI pooling layer. Executing the RoI pooling layer on the projected feature maps that correspond to the selected regions of the large FoV image is clearly more efficient than executing the RoI pooling layer on all possible portions of the feature maps that correspond to all possible sliding-window regions of the large FoV image. Therefore, the combination of executing the set of convolutional layers on the large FoV image and executing the RoI pooling layer on the projected feature maps achieves optimal execution efficiency.
[0134] [0134] In some embodiments, the initial image scaling to generate an image pyramid can be replaced by implementing the RoI pooling layer as a spatial pyramid pooling layer. In this case, server 170 can be configured to identify different sets of spatial bins in each feature map that correspond to different scales. Server 170 can be further configured to feed each of the spatial bins as a region of interest into the RoI pooling layer, or to use the first model to filter out some of the spatial bins.
[0135] [0135] In some embodiments, server 170 is configured to run the fully connected layer over each set of projected feature maps produced by the RoI pooling layer for each mapped region of the large FoV image, to determine whether the mapped region can be classified as an intermediate region that is likely to capture a target infected with each disease. The fully connected layer can produce, for each mapped region, an infection probability for each disease class or a probability of no infection. Server 170 can be further configured to classify a mapped region as an intermediate region based on predetermined criteria related to an infection probability, such as when an infection probability exceeds a certain threshold or is within the highest 5%.
[0136] [0136] FIG. 11 illustrates an example output of a second model for identifying regions that are likely to capture infected targets. Graphs 1102, 1104, 1106 and 1108 can represent the original and resized images mentioned above that are 1,600 pixels by 1,200 pixels, 900 pixels by 800 pixels, 800 pixels by 600 pixels and 560 pixels by 420 pixels, respectively. The x-axis and y-axis of each graph indicate the image size in units of 40 pixels, although each graph is shown at the same size to indicate how the resized image is mapped back to the original image. Candidate regions (which have mapped regions) that were determined to be likely to capture infected leaves are shown in different shading patterns that correspond to different diseases in these graphs, as indicated by the legend in each graph. As can be seen from graphs 1102, 1104 and 1106, the same infected leaf can be detected from images at different scales. It can also be seen from graph 1102 that the same leaf, different portions of the same leaf or nearby leaves can be determined to be infected with different diseases.
[0137] [0137] In some embodiments, server 170 is programmed to condense the set of intermediate regions identified by the second model through non-maximum suppression. Since a sliding window is used by the first model to identify candidate regions, many of the candidate regions identified by the first model, many of the mapped regions identified by the second model and many of the intermediate regions can overlap. Therefore, server 170 is configured to systematically narrow the overlapping regions down to final regions. Specifically, out of the intermediate regions, the one that has the highest infection probability and an associated disease is chosen first. Any region that overlaps the chosen intermediate region by more than a predetermined limit, such as 40% of either of the overlapping regions, can then be removed from consideration as a final region. This process can then be repeated over the rest of the intermediate regions a number of times until a predetermined criterion is satisfied, such as more than a certain number of iterations having been performed or fewer than a certain number of intermediate regions remaining. Alternatively, server 170 can be configured to divide the set of intermediate regions into clusters of overlapping regions and select a region from each of the clusters simultaneously or in no particular order.
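For illustration, a minimal sketch of this greedy suppression follows; each region is a (box, disease, probability) tuple, and overlap is measured against the smaller of the two boxes, matching the 40% example above.

```python
def non_max_suppression(regions, overlap_limit=0.4):
    """Greedily keep the most probable region, dropping regions that overlap it."""
    def overlap(a, b):
        ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
        iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
        smaller = min((a[2] - a[0]) * (a[3] - a[1]),
                      (b[2] - b[0]) * (b[3] - b[1]))
        return ix * iy / smaller

    remaining = sorted(regions, key=lambda r: r[2], reverse=True)
    final = []
    while remaining:
        best = remaining.pop(0)
        final.append(best)
        remaining = [r for r in remaining
                     if overlap(best[0], r[0]) <= overlap_limit]
    return final
```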
[0138] [0138] FIG. 12 illustrates an example of applying non-maximum suppression. In a large FoV image 1208, some candidate regions of the resized images, including those corresponding to the mapped regions 1202, 1204 and 1206, were determined to be likely to capture infected leaves. Since these regions overlap substantially, they may capture the same infected leaf. Therefore, the one associated with the highest infection probability, namely region 1204 associated with a 99% probability, would be selected as a final region. Server 170 can report that the leaf captured in region 1204 may be infected with GLS with a 99% probability. After removing the mapped regions 1202, 1204 and 1206 from further consideration, if any mapped region remains, the process can be repeated.
[0139] [0139] In some embodiments, server 170 is programmed to transmit data related to the final regions to a display device, a client device or a remote server via a communication network. The data can include, for each final region, the position within the input large FoV image, the disease that is likely infecting the target, or the infection probability.
[0140] [0140] 3.5 EXAMPLE PROGRAMMABLE PROCESSES
[0141] [0141] FIG. 13 illustrates an example of a process for identifying infected targets from a large FoV image.
[0142] [0142] In some embodiments, server 170 is programmed to create a first model to determine whether a region is likely to capture a single target, such as a leaf. The first model can include or communicate with a HOG module that computes a vector of HOG features for a region. The first model can also include an SVM that classifies a region into one of several classes based on the feature vector, such as an individual-leaf class and a no-leaf or multiple-leaf class, which can lead to the identification of a candidate region that is likely to capture a single target.
[0143] [0143] In some embodiments, server 170 is also programmed to create a second model to determine whether a large FoV image is likely to capture infected targets, such as infected leaves, or whether specific regions of the FoV image are likely to capture individual infected targets. The second model can include a CNN comprising a set of convolutional layers, a RoI pooling layer and a fully connected layer. The set of convolutional layers functions like the HOG module of the first model in computing feature values for an input image, which can be represented in the form of vectors, maps or matrices. However, the set of convolutional layers can extract features of the single object at different scales directly from the large FoV image. The RoI pooling layer can process a set of projected feature maps produced by the set of convolutional layers for each candidate region identified by the first model in the feature maps. Finally, the fully connected layer can classify each candidate region, or the corresponding mapped region in the large FoV image, into each of several classes, such as a class of a leaf infected with a certain disease and a class of a healthy leaf, with a certain probability based on the output of the RoI pooling layer. The output of the fully connected layer can also lead to the identification of an intermediate region that is likely to capture a single target infected with a certain disease.
[0144] [0144] In some embodiments, with the first and second models in place, in step 1302, server 170 is configured to receive a digital image that captures numerous objects, such as a large FoV photo of a corn field. In step 1304, server 170 is programmed to generate a large number of scaled images from the received digital image. The number or dimensions of the scales can depend on the possible sizes of the field of view, the amount of computational resources available, or other factors.
[0145] [0145] In some embodiments, server 170 is programmed to execute the first model on each of the scaled images to identify candidate regions of the received digital image, each of which is likely to capture a single object, and to execute the second model to ultimately identify intermediate regions from the candidate regions, each of which is likely to capture the single object infected with a certain disease, such as an NLB-infected corn leaf. When the second model is a CNN, using the set of convolutional layers to obtain feature maps separately for different candidate regions can be time consuming. In step 1306, therefore, server 170 is configured to use the set of convolutional layers once to extract features of the single object at different scales directly from the received digital image.
[0146] [0146] In some embodiments, in step 1308, server 170 is configured to execute the first model on each of the scaled images to identify the candidate regions, as previously mentioned. This step can also be performed before step 1306.
[0147] [0147] In some embodiments, in step 1316, server 170 is programmed to clean up the list of intermediate regions to obtain the final regions. The cleanup can consolidate duplicate or highly overlapping regions, or further filter regions based on the infection probabilities. The elimination of overlapping regions can be accomplished through non-maximum suppression. Finally, in step 1318, server 170 is programmed to transmit data regarding the final regions to the display device, a client device or a remote server. The data can include, for each final region, the position within the received digital image, the associated infection probability, the infecting disease or other information.
Claims (20)
[1]
1. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, characterized in that it comprises: receiving, by a processor, a digital image that captures numerous objects; generating, by the processor, a large number of scaled images from the digital image corresponding to a large number of scales, respectively; computing a group of feature matrices for the digital image; for each of the large number of scaled images: selecting a list of candidate regions from the scaled image, each of which is likely to capture a single object; and for each of the list of candidate regions: mapping the candidate region back to the digital image to obtain a mapped region; identifying a corresponding portion of each of the group of feature matrices based on the mapping; and determining whether the candidate region is likely to capture the single object infected with a disease based on the group of corresponding portions; choosing a group of final regions from the list of mapped regions based on the determination; and causing information to be displayed in relation to the group of final regions.
[2]
2. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the numerous objects are several leaves in a crop field, the single object being one of the several leaves.
[3]
3. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the determination comprises computing an infection probability associated with the disease, the choice being based on the infection probability being greater than a certain threshold.
[4]
4. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the determination comprises computing an infection probability associated with the disease, the choice comprising: identifying the one of the list of mapped regions that is most likely to be infected; eliminating any mapped region that overlaps the identified mapped region by an amount that exceeds a certain threshold; and repeating the previous two steps until a stopping criterion is satisfied.
[5]
5. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the computation comprises executing one or more convolutional layers of a convolutional neural network (CNN) on the digital image, the set of convolutional layers being configured to extract feature values of the single object at the large number of scales, the determination comprising: executing a pooling layer of the CNN on each of the groups of corresponding portions of the feature matrices; and executing a fully connected layer of the CNN on the output data of the pooling layer, the fully connected layer being configured to classify a combination of feature values into different classes that correspond to the single object respectively infected with different diseases.
[6]
6. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the selection comprises: generating a large number of regions from the scaled image using a sliding window of a specific size; and for each of the large number of regions: computing a feature vector for the region; and determining whether the region is likely to capture the single object based on the feature vector.
[7]
7. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 6, characterized in that the computation comprises constructing a histogram of oriented gradients (HOG) for the region.
[8]
8. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 6, characterized in that determining whether the region is likely to capture the single object comprises executing a support vector machine (SVM) on the feature vector, the SVM being configured to classify a set of feature values into different classes that correspond to the presence or absence of the single object.
[9]
9. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the single object is a corn leaf and the disease is gray leaf spot (GLS), Goss's Wilt (GW) or Northern Leaf Blight (NLB).
[10]
10. Computer-implemented Method for Detecting Infected Objects from Images of Large Field of View, according to Claim 1, characterized in that the information indicates, for one of the group of final regions, a position of the final region within the digital image and the corresponding disease.
[11]
11. Non-transitory computer-readable storage media storing one or more instructions which, when executed by one or more processors, cause the one or more processors to perform a method of detecting infected objects from images of large field of view, characterized in that the method comprises: receiving a digital image that captures numerous objects; generating a large number of scaled images from the digital image corresponding to a large number of scales, respectively; computing a group of feature matrices for the digital image; for each of the large number of scaled images: selecting a list of candidate regions from the scaled image, each of which is likely to capture a single object; and for each of the list of candidate regions: mapping the candidate region back to the digital image to obtain a mapped region; identifying a corresponding portion of each of the group of feature matrices based on the mapping; and determining whether the candidate region is likely to capture the single object infected with a disease based on the group of corresponding portions; choosing a group of final regions from the list of mapped regions based on the determination; and causing information to be displayed in relation to the group of final regions.
[12]
12. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the numerous objects are several leaves in a crop field, the single object being one of the several leaves.
[13]
13. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the determination comprises computing an infection probability associated with the disease, the choice being based on the infection probability being greater than a certain threshold.
[14]
14. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the determination comprises computing an infection probability associated with the disease, the choice comprising: identifying the one of the list of mapped regions that is most likely to be infected; eliminating any mapped region that overlaps the identified mapped region by an amount that exceeds a certain threshold; and repeating the previous two steps until a stopping criterion is satisfied.
[15]
15. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the computation comprises executing one or more convolutional layers of a convolutional neural network (CNN) on the digital image, the set of convolutional layers being configured to extract feature values of the single object at the large number of scales, the determination comprising: executing a pooling layer of the CNN on each of the groups of corresponding portions of the feature matrices; and executing a fully connected layer of the CNN on the output data of the pooling layer, the fully connected layer being configured to classify a combination of feature values into different classes that correspond to the single object respectively infected with different diseases.
[16]
16. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the selection comprises: generating a large number of regions from the scaled image using a sliding window of a specific size; and for each of the large number of regions: computing a feature vector for the region; and determining whether the region is likely to capture the single object based on the feature vector.
[17]
17. Non-transitory computer-readable storage media, according to Claim 16, characterized in that the computation comprises constructing a histogram of oriented gradients (HOG) for the region.
[18]
18. Non-transitory computer-readable storage media, according to Claim 16, characterized in that determining whether the region is likely to capture the single object comprises executing a support vector machine (SVM) on the feature vector, the SVM being configured to classify a set of feature values into different classes that correspond to the presence or absence of the single object.
[19]
19. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the single object is a corn leaf and the disease is gray leaf spot (GLS), Goss's Wilt (GW) or Northern Leaf Blight (NLB).
[20]
20. Non-transitory computer-readable storage media, according to Claim 11, characterized in that the information indicates, for one of the group of final regions, a position of the final region within the digital image and the corresponding disease.
类似技术:
公开号 | 公开日 | 专利标题
BR122020015963A2|2020-11-10|computer-implemented method of determining a number of grains from an image of a corn and non-transitory storage media
US10755129B2|2020-08-25|Disease recognition from images having a large field of view
AU2016365336B2|2020-11-12|Forecasting field level crop yield during a growing season
EP3496526A1|2019-06-19|Generating pixel maps from non-image data and difference metrics for pixel maps
US10713542B2|2020-07-14|Detection of plant diseases with multi-stage, multi-scale deep learning
US10761075B2|2020-09-01|Detecting infection of plant diseases by classifying plant photos
BR112020009202A2|2020-10-20|selection of hybrid seeds and optimization of seeds portfolio by field
BR112021000429A2|2021-04-06|GENERATION OF AGRONOMIC PRODUCTIVITY MAPS FROM FIELD HEALTH IMAGES
BR112021006133A2|2021-06-29|machine learning techniques to identify clouds and cloud shadows in satellite imagery
BR112020009205A2|2020-10-13|selection of hybrid seeds and optimization of seeds portfolio by field
BR112020022715A2|2021-02-02|analysis and presentation of agricultural data
BR112021010115A2|2021-08-24|Using spatial statistical models to implement agronomic trials
同族专利:
公开号 | 公开日
US20190384998A1|2019-12-19|
AR113283A1|2020-03-11|
EP3692468A1|2020-08-12|
US10423850B2|2019-09-24|
AU2018346351A1|2020-04-30|
CN111406261B|2021-08-10|
CA3078119C|2020-10-13|
US20190108413A1|2019-04-11|
CN111406261A|2020-07-10|
AU2018346351B2|2020-09-03|
US10755129B2|2020-08-25|
ZA202002225B|2021-05-26|
EP3692468A4|2021-06-30|
WO2019070963A1|2019-04-11|
CA3078119A1|2018-10-04|
CN113486846A|2021-10-08|
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

IL122258A|1997-11-20|2002-08-14|Israel Aircraft Ind Ltd|Method and system for determining temperature and/or emissivity function of objects by remote sensing|
US6654506B1|2000-01-25|2003-11-25|Eastman Kodak Company|Method for automatically creating cropped and zoomed versions of photographic images|
CA2443098A1|2001-04-13|2002-10-24|Cargill, Incorporated|Processes for evaluating agricultural and/or food materials; applications; and, products|
US7054496B2|2002-06-27|2006-05-30|Seiko Epson Corporation|Method and apparatus for displaying arbitrarily magnified high resolution images using compressed domain processing|
US8538966B2|2008-11-21|2013-09-17|Randall Reese|Machine, program product, and computer-implemented method for file management, storage, and access utilizing a user-selected trigger event|
BRPI0905641A2|2009-07-08|2011-03-29|Basf Se|plant anomaly diagnostics system|
WO2012124066A1|2011-03-15|2012-09-20|富士通株式会社|Portable terminal, work assisting program, information providing method, and information providing program|
US20150206255A1|2011-05-13|2015-07-23|HydroBio, Inc|Method and system to prescribe variable seeding density across a cultivated field using remotely sensed data|
US8493460B2|2011-09-15|2013-07-23|DigitalOptics Corporation Europe Limited|Registration of differently scaled images|
TWI435234B|2011-11-24|2014-04-21|Inst Information Industry|Plant disease identification method, system and record media|
US20130169834A1|2011-12-30|2013-07-04|Advanced Micro Devices, Inc.|Photo extraction from video|
US10520482B2|2012-06-01|2019-12-31|Agerpoint, Inc.|Systems and methods for monitoring agricultural products|
CA2879220A1|2012-07-23|2014-01-30|Dow Agrosciences Llc|Kernel counter|
US9113590B2|2012-08-06|2015-08-25|Superior Edge, Inc.|Methods, apparatus, and systems for determining in-season crop status in an agricultural crop and alerting users|
US20160000020A1|2013-02-01|2016-01-07|Brilliantservice Co., Ltd.|Cultivation system, cultivation program, and cultivation method|
US9147110B2|2013-04-05|2015-09-29|Pioneer Hi-Bred International, Inc.|Field and crop evaluation tool and methods of use|
EP3161124B1|2014-06-26|2020-06-03|Indigo Ag, Inc.|Endophytes, associated compositions, and methods of use thereof|
US10068354B2|2014-01-02|2018-09-04|Deere & Company|Obtaining and displaying agricultural data|
US10074006B2|2014-03-17|2018-09-11|Carnegie Mellon University|Methods and systems for disease classification|
US9364005B2|2014-06-26|2016-06-14|Ait Austrian Institute Of Technology Gmbh|Plant-endophyte combinations and uses therefor|
WO2016025848A1|2014-08-15|2016-02-18|Monsanto Technology Llc|Apparatus and methods for in-field data collection and sampling|
US10349584B2|2014-11-24|2019-07-16|Prospera Technologies, Ltd.|System and method for plant monitoring|
US9928410B2|2014-11-24|2018-03-27|Samsung Electronics Co., Ltd.|Method and apparatus for recognizing object, and method and apparatus for training recognizer|
IL236606A|2015-01-11|2020-09-30|Gornik Amihay|Systems and methods for agricultural monitoring|
US9792557B2|2015-01-14|2017-10-17|Accenture Global Services Limited|Precision agriculture system|
US10115187B2|2015-01-30|2018-10-30|Raytheon Company|Apparatus and processes for classifying and counting corn kernels|
US9734400B2|2015-01-30|2017-08-15|AgriSight, Inc.|System and method for field variance determination|
US9638678B2|2015-01-30|2017-05-02|AgriSight, Inc.|System and method for crop health monitoring|
CN104680149B|2015-03-10|2018-07-03|苏州科达科技股份有限公司|A kind of object type recognition methods and system|
WO2016145334A1|2015-03-12|2016-09-15|Raytheon Company|Apparatus and processes for corn moisture analysis and prediction of optimum harvest date|
US20160309646A1|2015-04-24|2016-10-27|360 Yield Center, Llc|Agronomic systems, methods and apparatuses|
US10019631B2|2015-11-05|2018-07-10|Qualcomm Incorporated|Adapting to appearance variations when tracking a target object in video sequence|
US10491879B2|2016-01-15|2019-11-26|Blue River Technology Inc.|Plant feature detection using captured images|
CN106682704B|2017-01-20|2019-08-23|中国科学院合肥物质科学研究院|A kind of disease geo-radar image recognition methods of integrating context information|
CN106960179B|2017-02-24|2018-10-23|北京交通大学|Rail line Environmental security intelligent monitoring method and device|
US10438302B2|2017-08-28|2019-10-08|The Climate Corporation|Crop disease recognition and yield estimation|
US10423850B2|2017-10-05|2019-09-24|The Climate Corporation|Disease recognition from images having a large field of view|
Cited by:
Publication number | Filing date | Publication date | Applicant | Patent title
US10438302B2|2017-08-28|2019-10-08|The Climate Corporation|Crop disease recognition and yield estimation|
US10423850B2|2017-10-05|2019-09-24|The Climate Corporation|Disease recognition from images having a large field of view|
US20190108631A1|2017-10-06|2019-04-11|AgriSight, Inc.|System and method for field pattern analysis|
AU2019360153A1|2018-10-19|2021-05-20|The Climate Corporation|Detecting infection of plant diseases by classifying plant photos|
US10713542B2|2018-10-24|2020-07-14|The Climate Corporation|Detection of plant diseases with multi-stage, multi-scale deep learning|
US11240961B2|2018-10-26|2022-02-08|Deere & Company|Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity|
US11178818B2|2018-10-26|2021-11-23|Deere & Company|Harvesting machine control system with fill level processing based on yield data|
US11079725B2|2019-04-10|2021-08-03|Deere & Company|Machine control using real-time model|
US11234366B2|2019-04-10|2022-02-01|Deere & Company|Image selection for machine control|
WO2020257791A1|2019-06-20|2020-12-24|InnerPlant, Inc.|Methods for deploying biosentinels to agricultural fields and monitoring biotic and abiotic stresses in crops remotely|
US11238283B2|2019-10-04|2022-02-01|The Climate Corporation|Hybrid vision system for crop land navigation|
CN111325809B|2020-02-07|2021-03-12|广东工业大学|Appearance image generation method based on double-impedance network|
CN112735070A|2020-12-29|2021-04-30|姜庆娟|Internet-based forestry monitoring method|
Legal status:
2021-11-23| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US15/725,284|2017-10-05|
US15/725,284|US10423850B2|2017-10-05|2017-10-05|Disease recognition from images having a large field of view|
PCT/US2018/054348|WO2019070963A1|2017-10-05|2018-10-04|Disease recognition from images having a large field of view|