Patent abstract:
A soil imaging system with a working layer sensor disposed on an agricultural implement to generate an electromagnetic field across an area of interest in the soil as the agricultural implement traverses a field. A monitor in communication with the working layer sensor is adapted to generate a working layer image of the soil layer of interest based on the generated electromagnetic field. The working layer sensor can also generate a reference image by generating an electromagnetic field through undisturbed ground. The monitor can compare at least one feature of the reference image with at least one feature of the working layer image to generate a characterized image of the working layer of interest. The display can show operator feedback and can affect operational control of the agricultural implement based on the characterized image.
Publication number: BR112019009310B1
Application number: R112019009310-9
Filing date: 2017-10-25
Publication date: 2021-06-08
Inventors: Dale Koch;Dexter Travis;Todd Swanson;Dillon Sloneker
Applicant: The Climate Corporation;
Primary IPC class:
Patent description:

BACKGROUND
[001] It is well known that the proper and uniform depth of the seed trench, the precise placement of the seed within the seed trench (at the proper depth and proper spacing), good seed-to-soil contact, and minimal crop residue in the seed trench are all critical factors in uniform seed emergence and high yields. Consequently, several planting improvements have been proposed to achieve each of these factors. While performing spot checks of the seed trench can help ensure that these critical factors are being met, such spot checks only identify conditions at the specific location being checked. Consequently, there is a need for a seed trench imaging system to verify and ensure that these critical factors are being achieved during planting operations and to allow automatic or remote adjustment of the planter on the move based on the images. There is a similar need for belowground surface imaging and control for other types of agricultural implements, including tillage implements, side-dress or in-ground fertilizing implements, and agricultural data collection implements.
BRIEF DESCRIPTION OF THE DRAWINGS
[002] The present disclosure is illustrated by way of example, and not limitation, in the attached drawings and in which:
[003] FIG. 1 schematically illustrates an embodiment of a working layer sensor, in elevation view, disposed relative to a seed trench.
[004] FIG. 2A, FIG. 2B, FIG. 2C are representative examples of working layer images generated by the working layer sensor of FIG. 1.
[005] FIG. 3 schematically illustrates another embodiment of a working layer sensor, in plan view, disposed relative to a seed trench.
[006] FIG. 4A, FIG. 4B are representative examples of working layer images generated by the working layer sensor of FIG. 3.
[007] FIG. 5 schematically illustrates another embodiment of a working layer sensor, in elevation view, disposed in relation to a seed trench.
[009] FIG. 6 is a representative example of a working layer image generated by the working layer sensor of FIG. 5.
[009] FIG. 7 is a side elevation view of an embodiment of a row unit of an agricultural planter incorporating a working layer sensor of FIG. 1, FIG. 3 or FIG. 5.
[0010] FIG. 8 illustrates an embodiment of a working layer implement operator monitoring, control, and feedback system.
[0011] FIG. 9 is a flow chart showing a process for monitoring, control, and operator feedback of working layer implements.
[0012] FIG. 10 schematically illustrates another embodiment of a working layer sensor, in plan view, disposed relative to a seed trench.
[0013] FIG. 11 schematically illustrates another embodiment of a working layer sensor, in plan view, disposed relative to a seed trench.
[0014] FIG. 12 schematically illustrates another embodiment of a working layer sensor, in side view, disposed relative to a seed trench.
[0015] FIG. 13 schematically illustrates, in side view, a spatial relationship between a transmitter and a receiver.
[0016] FIG. 14 schematically illustrates another embodiment, in side view, of a laser system.
[0017] FIG. 15 is a representative example of the working layer image generated by any of the working layer sensors.
[0018] FIG. 16 illustrates an example computer system that is configured to perform the functions described herein, shown in a field environment with other apparatus with which the system can interoperate.
[0019] FIG. 17 illustrates two views of an example logical organization of instruction sets in main memory when an example mobile application is loaded for execution.
[0020] FIG. 18 illustrates a programmed process by which the agricultural intelligence computational system generates one or more preconfigured agronomic models using agronomic data provided by one or more data sources.
[0021] FIG. 19 is a block diagram illustrating a computer system in which an embodiment of the invention can be implemented.
[0022] FIG. 20 depicts an example embodiment of a timeline view for data entry.
[0023] FIG. 21 depicts an exemplary embodiment of a spreadsheet view for data entry.
SUMMARY OF THE INVENTION
[0024] The appended claims can serve as a summary of the invention, a ground imaging system.
DETAILED DESCRIPTION
1. DISCLOSURE OVERVIEW
[0025] Referring now to the drawings, in which like reference numerals designate identical or corresponding parts throughout the various views, FIG. 1, FIG. 3 and FIG. 5 schematically illustrate alternative embodiments of a working layer sensor 100 for generating a signal or image representative of soil densities or other soil characteristics across a soil region of interest, hereinafter referred to as the "working layer" 104. The representative image or signal generated by the working layer sensor 100 is hereinafter referred to as the "working layer image" 110. In a particular application discussed later, the working layer sensors 100 can be mounted on a planter row unit 200 (FIG. 7) to generate a working layer image 110 of the seed trench as the planter traverses the field. The working layer image 110 may be displayed on a display 300 visible to an operator inside a tractor cab, and the planter may be equipped with various actuators to control the planter based on the characteristics of the working layer 104 as determined from the working layer image 110.
[0026] The working layer sensor 100 for generating the working layer image 110 may comprise a ground penetrating radar system, an ultrasound system, an audible range sound system, an electric current system or any other suitable system for generating an electromagnetic field 102 through the working layer 104 to produce the working layer image 110. It should be understood that the depth and width of the working layer 104 may vary depending on the agricultural implement and the operation being performed.
[0027] FIG. 1 is a schematic illustration of an embodiment of a working layer sensor 100-1 disposed with respect to a seed trench 10 formed in the soil 11 by a planter, wherein the seed trench 10 comprises the soil region of interest or working layer 104. In this embodiment, the working layer sensor 100-1 comprises a transmitter (T1) arranged on one side of the seed trench 10 and a receiver (R1) arranged on the other side of the seed trench 10 to produce the electromagnetic field 102 across the seed trench and generate the working layer image 110.
[0028] In some embodiments, the working layer sensor 100 may comprise a ground penetrating radar subsurface inspection system such as any of the following commercially available systems: (1) the StructureScan™ Mini HR available from GSSI in Nashua, New Hampshire; (2) the 3d-Radar GeoScope™ Mk IV coupled to a 3d-Radar VX-Series and/or DX-Series multichannel antenna, all available from 3d-Radar AS in Trondheim, Norway; or (3) the MALA Imaging Radar Array System available from MALA Geoscience in Mala, Sweden. In such embodiments, the commercially available system can be mounted on the planter or other implement, or it can be mounted on a cart that moves with the implement; in either case, the system is preferably arranged to capture an image of a working layer in the area of interest (e.g., the seed trench). In some embodiments, the working layer image 110 may be generated from the signal outputs of the working layer sensor 100 using commercially available software such as GPR-SLICE (e.g., version 7.0) available from GeoHiRes International Ltd., located in Borken, Germany.
[0029] FIG. 2A, FIG. 2B, FIG. 2C are intended to be representative examples of working layer images 110 generated by the working layer sensor 100-1 of FIG. 1 showing various characteristics of the seed trench 10, including, for example, trench depth, trench shape, depth of the seed 12, seed depth relative to trench depth, crop residue 14 in the trench, and void spaces 16 within the trench. As described in more detail later, the working layer images 110 can be used to determine other characteristics of the working layer 104, including, for example, seed-to-soil contact, percentage of the trench closed, percentage of the upper half of the trench closed, percentage of the lower half of the trench closed, soil moisture, etc.
[0030] FIG. 3 schematically illustrates, in plan view, another embodiment of a working layer sensor 100-2 arranged with respect to a seed trench 10. In this embodiment, a transmitter (T1) is arranged on one side of the seed trench 10, a first receiver (R1) is disposed on the other side of the seed trench 10, and a second receiver (R2) is disposed adjacent and rearward of the transmitter (T1). FIG. 4A is a representative illustration of the working layer image 110 generated across the trench between the transmitter (T1) and the first receiver (R1), and FIG. 4B is a representative illustration of the working layer image 110 generated between the transmitter (T1) and the second receiver (R2), providing an image of the undisturbed soil adjacent to the seed trench.
[0031] FIG. 5 is an elevation view schematically illustrating another embodiment of the working layer sensor 100-3 disposed with respect to a seed trench 10. In this embodiment, the working layer sensor 100-3 comprises a plurality of transmitter and receiver pairs arranged above and across the seed trench 10.
[0032] FIG. 6 is a representative illustration of the working layer image 110 generated by the working layer sensor 100-3 of FIG. 5, which provides a view not only of the seed trench, but also of a portion of soil adjacent to each side of the seed trench.
[0033] FIG. 10 schematically illustrates, in plan view, another embodiment of a working layer sensor 100-4 arranged with respect to a seed trench 10. In this embodiment, a transmitter (T1) is arranged over the seed trench 10. Arranged behind the transmitter (T1) in the direction of travel are three receivers (R1), (R2) and (R3). Receivers (R1) and (R3) are arranged along each side of the seed trench 10, respectively. The receiver (R2) is disposed over the seed trench 10. Working layer images similar to those shown in FIG. 2A, FIG. 2B, FIG. 2C can be generated by the working layer sensor 100-4.
[0034] FIG. 11 schematically illustrates, in plan view, another embodiment of a working layer sensor 100-5 arranged with respect to a seed trench 10. In this embodiment, the transmitter (T2) is arranged over the seed trench 10 and transmitters (T1) and (T3) are disposed along each side of the seed trench 10, respectively. Arranged behind the transmitters (T1), (T2) and (T3) in the direction of travel are three receivers (R1), (R2) and (R3). Receivers (R1) and (R3) are arranged along each side of the seed trench 10, respectively. The receiver (R2) is disposed over the seed trench 10. Working layer images similar to those shown in FIG. 2A, FIG. 2B, FIG. 2C can be generated by the working layer sensor 100-5.
[0035] FIG. 12 schematically illustrates, in side view, another embodiment of a working layer sensor 100-6 arranged with respect to the seed trench 10. In this embodiment, the transmitter (T1) is arranged over the seed trench 10 and has a transmission angle that encompasses both sides of the seed trench 10. The receiver (R1) can be arranged adjacent to or behind the transmitter (T1). Because the transmission angle reaches both sides of the seed trench 10, the reflected signal received by the receiver (R1) is an average of both sides of the seed trench 10, yielding a single measurement that is an average of the distance between the transmitter (T1) and the seed trench 10.
[0036] Any of the working layer sensor embodiments 100-1, 100-2, 100-3, 100-4, 100-5, 100-6 can also produce a working layer image as illustrated in FIG. 15. FIG. 15 is a profile of an open seed trench 10, shown with an optional seed.
[0037] For each of the working layer sensor embodiments 100-1, 100-2, 100-3, 100-4, 100-5, 100-6, the operating frequency of the working layer sensors 100, the vertical position of the transmitters (T) and receivers (R) above the ground, and the spacing between the transmitters (T) and receivers (R) are selected to maximize the signal-to-noise ratio while also capturing the desired depth and width of the soil region of interest (the working layer 104) for which the working layer image 110 is generated. In an embodiment illustrated in FIG. 13, the height of the receiver (R) above the ground can be less than the height of the transmitter (T) above the ground. An angle formed between the transmitter (T) and the receiver (R) can be from 0 to 80°.
[0038] In an embodiment illustrated in FIG. 14, a laser (L1) is positioned above a seed trench 10 and projects a laser beam into the seed trench 10. A receiver (R1), such as a camera, is positioned to receive the reflected laser signal. The receiver (R1) is at a height above the ground less than the height of the laser (L1) above the ground. An angle b formed between the laser (L1) and the receiver (R1) can be greater than 0° and up to 80°. The same control system can be used, with the laser (L1) replacing a transmitter (T).
[0039] In one embodiment, the selected transmitter frequency may be one that can penetrate vegetation and see the ground below. By not seeing the vegetation, a more accurate measure of the depth of the seed trench 10 is obtained. It has been determined that the higher the frequency, the more the radar signal is reflected by the vegetation. In one embodiment, the frequency is 24 GHz. In another embodiment, the selected frequency may be one that can penetrate dust, which can be generated as an agricultural vehicle traverses a field. Frequencies in a range of 1 to 100 GHz can penetrate dust. In any of the working layer sensor embodiments 100-1, 100-2, 100-3, 100-4, 100-5, 100-6, any of the transmitters (T) or receivers (R) can have a frequency that penetrates vegetation and dust. In another embodiment, in any of the working layer sensor embodiments 100-1, 100-2, 100-3, 100-4, 100-5, any of the transmitters (T) or receivers (R) can be replaced by multiple transmitters (T) or receivers (R) at the locations illustrated, with each transmitter (T) or receiver (R) having a different frequency, such as one that penetrates vegetation and one that penetrates dust. A composite of the two working layer images can be used to generate the profile of the seed trench 10.
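For illustration only (this sketch is not part of the patent disclosure), the compositing step described above can be pictured as a per-cell blend of two co-registered working layer images: one acquired at a vegetation-penetrating frequency and one at a dust-penetrating frequency. The function name, list-of-lists image representation, and equal-weight blend are all assumptions.

```python
def composite_images(veg_img, dust_img, weight=0.5):
    """Blend two equally sized working layer images cell by cell.

    veg_img / dust_img are hypothetical 2D grids (lists of rows) of
    reflected-signal intensity on the same raster; `weight` is the share
    given to the vegetation-penetrating image.
    """
    if len(veg_img) != len(dust_img):
        raise ValueError("images must have the same dimensions")
    out = []
    for row_v, row_d in zip(veg_img, dust_img):
        if len(row_v) != len(row_d):
            raise ValueError("images must have the same dimensions")
        # Weighted average of the two measurements for this cell.
        out.append([weight * v + (1.0 - weight) * d
                    for v, d in zip(row_v, row_d)])
    return out
```

With the default equal weighting, `composite_images([[1, 2]], [[3, 4]])` yields `[[2.0, 3.0]]`; in practice the blend (or a max/min rule) would be chosen empirically.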
[0040] In one embodiment, the radar is a Doppler radar. Doppler radar can provide the speed of a row unit 200, which can then be used in a control system to change the application rate of an agricultural input to obtain a selected application per linear distance or per area. Agricultural inputs include, but are not limited to, seeds, fertilizers, insecticides, herbicides and fungicides. The Doppler radar can be pulse coherent, pulse-Doppler, continuous wave, or frequency modulated. Doppler radar can be used with any of the working layer sensor embodiments 100-1, 100-2, 100-3, 100-4, 100-5, 100-6.
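To make the rate-control idea concrete, here is a minimal sketch (not from the patent) of converting a Doppler-measured ground speed into a dispensing rate for a linear-distance target or an area target. Function names, units, and the single-row assumption are illustrative.

```python
def seed_rate_hz(speed_m_s, target_seeds_per_m):
    """Seeds per second the meter must dispense to hold the target
    seeds-per-linear-meter at the measured ground speed."""
    return speed_m_s * target_seeds_per_m


def seed_rate_hz_by_area(speed_m_s, row_spacing_m, target_seeds_per_ha):
    """Seeds per second for an area-based target (seeds per hectare),
    assuming one row sweeps a strip `row_spacing_m` wide."""
    covered_m2_per_s = speed_m_s * row_spacing_m
    return covered_m2_per_s * target_seeds_per_ha / 10_000.0  # m^2 per ha
```

For example, at 2 m/s with a target of 5 seeds per meter the meter must run at 10 seeds per second; the same speed on 0.76 m rows at 80,000 seeds/ha gives about 12.2 seeds per second.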
[0041] In one embodiment, the radar is a phased array radar. With a phased array, the signals generated by the array can be swept back and forth across the seed trench 10 to provide a more detailed profile of the seed trench 10. The phased array radar can be used with any of the working layer sensor embodiments 100-1, 100-2, 100-3, 100-4, 100-5, 100-6.
[0042] Planter Applications - FIG. 7 illustrates an example of a particular application of working layer sensors 100 arranged on a row unit 200 of an agricultural planter. The row unit 200 includes a working layer sensor 100A disposed at a front end of the row unit 200 and a working layer sensor 100B disposed at the rear end of the row unit 200. The front and rear working layer sensors 100A, 100B may comprise any of the embodiments of working layer sensors 100-1, 100-2, 100-3, 100-4, 100-5, 100-6 described above.
[0043] The front working layer sensor 100A is arranged to generate a reference working layer image (hereinafter a "reference layer image") 110A of the soil before the soil is disturbed by the planter, while the rear working layer sensor 100B generates the working layer image 110B, which in this example is the image of the closed seed trench 10 in which the seed was deposited and covered with soil. For reasons explained further below, it is desirable to obtain both a reference image 110A and a working layer image 110B for analyzing soil characteristics through the working layer 104.
[0044] It should be appreciated that the front and rear working layer sensors 100A, 100B referenced in FIG. 7 can employ any one of embodiments 100-1, 100-2, 100-3, 100-4, 100-5, 100-6 described above. However, it should be appreciated that if embodiments 100-2, 100-3, 100-4 or 100-5 are employed, the front working layer sensor 100A may be eliminated because embodiments 100-2, 100-3, 100-4 and 100-5 are configured to generate working layer images 110 of undisturbed soil adjacent to the seed trench 10, which could serve as the reference layer image 110A.
[0045] Referring to FIG. 7, the row unit 200 comprises a frame 204 pivotally connected to the toolbar 202 by a parallel linkage 206, allowing each row unit 200 to move vertically independently of the toolbar 202. The frame 204 operatively supports one or more hoppers 208, a seed meter 210, a seed delivery mechanism 212, a down force control system 214, a seed trench opening assembly 220, a trench closure assembly 250, a packer wheel assembly 260 and a row cleaning assembly 270. It should be understood that the row unit 200 shown in FIG. 7 may be for a conventional planter, or the row unit 200 may be for a central-fill planter, in which case the hoppers 208 may be replaced with one or more mini hoppers and the frame 204 modified as would be recognized by those skilled in the art.
[0046] Down force control system 214 is arranged to apply lifting and/or lowering force to row unit 200, as disclosed in US Publication No. US2014/0090585, which is incorporated herein in its entirety by reference.
[0047] The seed trench opening assembly 220 includes a pair of opening discs 222 rotatably supported by a shank member 205 which extends below the frame 204. The opening discs 222 are arranged to diverge outwardly and rearwardly to open a V-shaped trench 10 in the soil 11 as the planter traverses the field. The seed delivery mechanism 212, such as a seed tube or seed conveyor, is positioned between the opening discs 222 to deliver seeds from the seed meter 210 and deposit them in the open seed trench 10. The depth of the seed trench 10 is controlled by a pair of gauge wheels 224 positioned adjacent to the opening discs 222. The gauge wheels 224 are rotatably supported by gauge wheel arms 226 which are pivotally fixed at one end to the frame 204 about the pivot pin 228. A swing arm 230 is pivotally supported on the frame 204 by a pivot pin 232. It should be appreciated that rotation of the swing arm 230 about the pivot pin 232 defines the depth of the trench 10 by limiting the upward travel of the gauge wheel arms 226 (and thus the gauge wheels) relative to the opening discs 222. The swing arm 230 can be adjustably positioned via a linear actuator 234 mounted on the row unit frame 204 and pivotally coupled to an upper end of the swing arm 230. The linear actuator 234 can be remotely controlled or automatically actuated as disclosed, for example, in International Publication No. WO2014/186810, which is incorporated herein in its entirety by reference.
[0048] A down force sensor 238 is configured to generate a signal related to the amount of force imposed by the gauge wheels 224 on the ground. In some embodiments, pivot pin 232 for swingarm 230 may comprise down force sensor 238, such as the instrumented pins disclosed in US Patent No. 8561472, which is incorporated herein in its entirety by reference.
[0049] The seed meter 210 may be any commercially available seed meter, such as a finger-type meter or a vacuum seed meter, such as the VSet® meter, available from Precision Planting LLC, 23207 Townline Rd, Tremont, IL 61568.
[0050] The trench closure assembly 250 includes a closure wheel arm 252 pivotally connected to the row unit frame 204. A pair of offset closure wheels 254 are rotatably connected to the closure wheel arm 252 and angularly arranged to direct soil back into the open seed trench so as to "close" the seed trench. An actuator 256 may be pivotally connected at one end to the closure wheel arm 252 and at its other end to the row unit frame 204 to vary the down pressure exerted by the closure wheels 254 depending on soil conditions. The closure wheel assembly 250 may be of the type disclosed in International Publication No. WO2014/066650, which is incorporated herein in its entirety by reference.
[0051] The packer wheel assembly 260 comprises an arm 262 pivotally connected to the row unit frame 204 and extending behind the closure wheel assembly 250 in alignment therewith.
[0052] The arm 262 rotatably supports a packer wheel 264. An actuator 266 is pivotally connected at one end to the arm 262 and at its other end to the row unit frame 204 to vary the amount of down force exerted by the packer wheel 264 to pack the soil over the seed trench 10.
[0053] The row cleaning assembly 270 may be the CleanSweep® system available from Precision Planting LLC, 23207 Townline Rd, Tremont, IL 61568. The row cleaning assembly 270 includes an arm 272 pivotally attached to the front end of the row unit frame 204 and aligned with the trench opening assembly 220. A pair of row cleaning wheels 274 are rotatably connected to the front end of the arm 272. An actuator 276 is pivotally connected at one end to the arm 272 and at its other end to the row unit frame 204 for adjusting the down pressure on the arm to vary the aggressiveness of action of the row cleaning wheels 274 depending on the amount of crop residue and soil conditions.
[0054] It should be appreciated that instead of positioning the working layer sensors 100 as shown in FIG. 7, the working layer sensors can be positioned after the row cleaning assembly 270 and before the trench opening assembly 220, or at one or more locations between the opening discs 222 and the closure wheels 254 or the packer wheel 264, depending on the soil region or characteristics of interest.
[0055] Planter Control and Operator Feedback - FIG. 8 is a schematic illustration of a system 500 that employs working layer sensors 100 to provide operator feedback and to control the planter row unit 200. The working layer sensors 100A, 100B are arranged to generate a reference layer image 110A of undisturbed soil and a working layer image 110B of the closed seed trench (i.e., after the seed is deposited, covered with soil by the closure wheel assembly 250, and the soil packed by the packer wheel assembly 260). As described above, the working layer sensors 100A, 100B may be separate working layer sensors disposed at the front and rear of the row unit 200, as illustrated in FIG. 7, or the working layer sensors 100A, 100B may comprise a single working layer sensor with transmitters (T) and receivers (R) arranged to generate both a reference layer image 110A and a working layer image 110B.
[0056] The working layer image 110B can be communicated and presented to the operator on a monitor 300 comprising a display, a controller and a user interface, such as a graphical user interface (GUI), within the tractor cab.
[0057] The monitor 300 may be in signal communication with a GPS unit 310, the row cleaning actuator 276, the down force control system 214, the depth adjustment actuator 234, the trench closure assembly actuator 256 and the packer wheel assembly actuator 266 to allow operational control of the planter based on characteristics of the working layer image 110B.
[0058] For example, if the working layer image 110B indicates that the residue in the seed trench 10 is above a predetermined threshold (as explained below), a signal is generated by the monitor 300 to actuate the row cleaning actuator 276 to increase the row cleaning down force. As another example, if the seed depth is less than a predetermined threshold (as explained below), a signal is generated by the monitor 300 to actuate the down force control system 214 to increase the down force and/or to actuate the depth adjustment actuator 234 to adjust the gauge wheels 224 relative to the opening discs 222 to increase the trench depth. Likewise, if the seed depth is greater than a predetermined threshold, a signal is generated by the monitor 300 to actuate the down force control system 214 to decrease the down force and/or to actuate the depth adjustment actuator 234 to decrease the trench depth. As another example, if the upper portion of the trench has more than a threshold level of void space (as explained below), a signal is generated by the monitor 300 to actuate the trench closure wheel assembly actuator 256 to increase the down force on the closure wheels 254. As another example, if the lower portion of the trench has more than a threshold level of void space (as explained below), a signal is generated by the monitor 300 to actuate the packer wheel assembly actuator 266 to increase the down force on the packer wheel 264.
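The threshold comparisons just described amount to a simple decision table. The sketch below (illustrative only, not part of the patent; the measurement keys, threshold names, and actuator labels are all hypothetical) shows one way a monitor could map characterized measurements to actuator commands:

```python
def control_actions(m, thresholds):
    """Map characterized working-layer measurements to actuator commands.

    `m` holds hypothetical measurement keys ('residue_pct', 'seed_depth_cm',
    'upper_void_pct', 'lower_void_pct'); `thresholds` holds the operator-set
    limits. Returns (actuator, command) pairs, with labels echoing the
    reference numerals used in the text.
    """
    actions = []
    if m["residue_pct"] > thresholds["residue_pct"]:
        actions.append(("row_cleaner_276", "increase_down_force"))
    if m["seed_depth_cm"] < thresholds["min_seed_depth_cm"]:
        actions.append(("down_force_214", "increase"))
        actions.append(("depth_actuator_234", "deepen"))
    elif m["seed_depth_cm"] > thresholds["max_seed_depth_cm"]:
        actions.append(("down_force_214", "decrease"))
        actions.append(("depth_actuator_234", "shallower"))
    if m["upper_void_pct"] > thresholds["upper_void_pct"]:
        actions.append(("closure_wheels_256", "increase_down_force"))
    if m["lower_void_pct"] > thresholds["lower_void_pct"]:
        actions.append(("packer_wheel_266", "increase_down_force"))
    return actions
```

A shallow, residue-laden trench would thus trigger both the row cleaner and the depth/down-force commands in a single pass.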
[0059] In other examples, the working layer image 110B can identify and/or analyze (for example, determine the depth, area, volume, density or other qualities or quantities of) underground features of interest such as drainage lines, large rocks, or layers of compaction resulting from plowing and other field traffic. Such underground features can be displayed to the user on the monitor 300 and/or identified by the monitor 300 using an empirical correlation between the image properties and a set of underground features expected to be found in the field. In one example, the area traversed by the gauge wheels (or other wheels) of the planter (or tractor or other implement or vehicle) can be analyzed to determine a depth and/or soil density of a compaction layer under the wheels. In some of these examples, the working layer image area can be divided into sub-regions for analysis based on the underground features predicted in such sub-regions (e.g., the area traversed by the gauge wheels can be analyzed for compaction).
[0060] In other examples, the monitor 300 can estimate a soil property (e.g., soil moisture, organic matter, electrical conductivity, or groundwater level) based on the image properties of the working layer image 110B and display the soil property to the user as a numerical value (e.g., an average or current value) or as a spatial map of the soil property at georeferenced locations in the field associated with each soil property measurement (e.g., by correlating measurements with concurrent georeferenced locations reported by the GPS unit 310).
[0061] Alternatively or additionally, the monitor 300 can be programmed to display operational recommendations based on characteristics of the working layer image 110B. For example, if the working layer image 110B identifies that the seed 12 is unevenly spaced in the trench 10, or that the seed 12 is not being uniformly deposited at the base of the trench, or that the spacing of the seed 12 in the trench does not match the expected seed spacing based on signals generated by the seed sensor or the seed meter speed, such irregular spacing, non-uniform positioning or other inconsistencies with the expected spacing may be due to excessive speed causing the seed to bounce in the trench or to excessive vertical acceleration of the row unit. As such, the monitor 300 can be programmed to recommend decreasing the planting speed or to suggest increasing the down force (if not automatically controlled as described above) to reduce the vertical acceleration of the planter row units. Likewise, if the other actuators 276, 214, 234, 256, 266 are not integrated with the monitor controller, the monitor can be programmed to present recommendations to the operator to make manual or remote adjustments as described above based on the characteristics of the working layer image 110B.
[0062] FIG. 9 illustrates the process steps for controlling the planter and providing operator feedback. At steps 510 and 512, the reference image 110A and the working layer image 110B are generated by the working layer sensor(s) 100. At step 514, the working layer image 110B can be displayed to the operator on the monitor 300 in the tractor cab. At step 516, the reference layer image 110A is compared to the working layer image 110B to characterize the working layer image. At step 518, the characterized working layer image 110B is compared against predetermined thresholds. At step 520, control decisions are made based on the comparison of the characterized working layer image 110B with the predetermined thresholds. At step 522, planter components may be controlled by signals generated by the monitor 300 to actuate one or more of the corresponding actuators 276, 214, 234, 256, 266, and/or at step 524, corresponding recommendations may be displayed to the operator on the monitor display.
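The step sequence above can be sketched as one loop iteration wired from interchangeable callbacks. This is an illustrative skeleton only (not part of the patent); the function names and the callback decomposition are assumptions, with step numbers from FIG. 9 noted in comments.

```python
def work_layer_cycle(sense_ref, sense_work, characterize, decide,
                     actuate, display):
    """One pass of the FIG. 9 control loop, with each stage supplied as a
    callable so sensors, image analysis, and actuators stay decoupled."""
    ref_img = sense_ref()                        # step 510: reference image 110A
    work_img = sense_work()                      # step 512: working layer image 110B
    display(work_img)                            # step 514: show image to operator
    features = characterize(ref_img, work_img)   # step 516: characterize image
    decisions = decide(features)                 # steps 518/520: thresholds, decisions
    actuate(decisions)                           # step 522 (and/or step 524 advice)
    return features, decisions
```

In a real system the loop would run continuously as the planter moves; here a single call with stub callbacks is enough to exercise the data flow.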
[0063] To characterize the working layer image 110B at step 516, the monitor 300 compares one or more features (e.g., density) of the reference image 110A with the same features of the working layer image 110B. In some embodiments, a characterized image may be generated comprising only the portions of the working layer image that differ from the reference image by at least a threshold. The characterized image can then be used to identify and define features of the working layer image 110B, such as the shape of the trench, the depth of the trench, the residue in the trench, the placement of seeds within the trench, voids within the trench, and soil density differences within the trench.
[0064] For example, to determine the depth of the seed, the seed is identified or identifiable from the working layer image 110B by determining regions within the working layer image having a size or shape corresponding to a seed and having a range of densities empirically corresponding to the seed.
[0065] Once a region is identified as a seed, the vertical position of the seed in relation to the soil surface is easily measurable or determined.
[0066] As another example, the amount of residue in the trench can be determined by: (a) defining the cross-sectional area of the trench (based on the differences in soil density between the reference image 110A and the working layer image 110B); (b) identifying the regions within the trench having a density range empirically corresponding to residue; (c) totaling the area of the regions corresponding to residue; and (d) dividing the residue area by the cross-sectional area of the trench.
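Steps (a)-(d) can be sketched directly over a gridded image. This illustrative example (not part of the patent) assumes the trench cross-section has already been delimited as a 0/1 mask and that the residue density band is known empirically:

```python
def residue_fraction(density_img, trench_mask, residue_lo, residue_hi):
    """Fraction of the trench cross-section occupied by residue.

    `density_img` is a 2D grid of density values, `trench_mask` a same-size
    grid of 0/1 flags marking cells inside the trench (step a), and
    [residue_lo, residue_hi] the empirically determined residue density band.
    """
    trench_cells = 0
    residue_cells = 0
    for dens_row, mask_row in zip(density_img, trench_mask):
        for dens, inside in zip(dens_row, mask_row):
            if inside:
                trench_cells += 1                      # trench area (a)
                if residue_lo <= dens <= residue_hi:   # residue band (b)
                    residue_cells += 1                 # running total (c)
    # (d) residue area divided by trench cross-sectional area
    return residue_cells / trench_cells if trench_cells else 0.0
```

Comparing this fraction against the predetermined residue threshold is what drives the row cleaning actuator command described earlier.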
[0067] Other Applications - It should be appreciated that the working layer sensors 100 can be employed with other implements and agricultural operations, such as, for example, tillage operations and/or side-dress fertilizing operations, or in connection with agricultural data collection operations to visualize or analyze subsurface soil characteristics, seed placement, root structure, the location of underground water management features such as drains, etc.
[0068] When employed with tillage implements, the working layer sensors 100 can be arranged in front of any tillage tools (i.e., shank, disc, blade, knife, sweep, coulter, etc.), between fore-and-aft spaced tillage tools, and/or behind any tillage tools.
[0069] When employed with side fertilizing tools or other in-ground fertilizing tools, the working layer sensors 100 can be disposed in front of or behind any side fertilizing or in-ground fertilizing tool (e.g., shank, disc, blade, knife, sweep, coulter, etc.).
[0070] When employed with a dedicated measuring instrument, the working layer sensors 100 can be arranged above undisturbed ground, which may or may not have debris removed by a row cleaner.
[0071] For tillage implements and side fertilizing or in-ground fertilizing tools, the actuators on these implements can be automatically controlled to adjust the depth of the tillage tools, or the monitor 300 can be programmed to provide feedback or recommendations to the operator to manually or remotely adjust the actuators as described above with respect to the planter application. For example, if the operator feedback or recommendations indicate that the depth of the tillage tools should be adjusted, an actuator associated with the ground support wheels that support the toolbar or a section of the toolbar can be actuated to increase or decrease the depth of penetration of the tillage tools. Alternatively, separate actuators can be associated with individual tillage tools to adjust the depth of the individual tillage tools.
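The depth-adjustment feedback described above might reduce, in its simplest form, to a dead-band rule: no command while the measured depth is within tolerance of the target, and a signed correction otherwise. The tolerance value and function name are assumptions for illustration, not taken from the disclosure.

```python
def depth_adjustment(measured_depth, target_depth, tolerance=0.25):
    """Return a signed depth correction for the actuator (positive means
    lower the tools, negative means raise them), or 0.0 when the measured
    depth is already within tolerance of the target."""
    error = target_depth - measured_depth
    if abs(error) <= tolerance:
        return 0.0
    return error

# Tools running 1 inch shallow -> command a 1.0 correction;
# a 0.1 inch deviation stays inside the dead band.
shallow = depth_adjustment(measured_depth=2.0, target_depth=3.0)
in_band = depth_adjustment(measured_depth=3.1, target_depth=3.0)
```

The same rule could drive either the toolbar support-wheel actuator or the per-tool actuators mentioned above, or be surfaced as an operator recommendation instead of a direct command.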
[0072] As another example, if the working layer images indicate that the implement is approaching denser or compacted soil, the actuators associated with adjusting down force or down pressure can be actuated to increase the down force as the implement passes through the denser or more compacted soil. In other embodiments, if working layer images across the width of the implement indicate that one side or the other is tilling the soil more aggressively, an actuator associated with a wing of the implement can be actuated to balance the aggressiveness of the tillage tools across the side-to-side width of the implement. Likewise, an actuator associated with the fore-and-aft leveling of the implement can be actuated to ensure that the aggressiveness of the tools at the front of the implement is balanced with those at the rear. In still other embodiments, actuators can be provided to adjust the angle of attack of a gang of discs or a wing of a tillage implement, or of individual tillage tools, depending on the working layer images and operator feedback as the implement crosses the field and encounters different soil conditions.
2. EXAMPLE AGRICULTURAL INTELLIGENCE COMPUTER SYSTEM
2.1. STRUCTURAL OVERVIEW
[0073] FIG. 16 illustrates an example computer system that is configured to perform the functions described herein, shown in a field environment with other apparatus with which the system can interoperate. In one embodiment, a user 1602 owns, operates, or possesses a field manager computing device 1604 in a field location or associated with a field location, such as a field intended for agricultural activities or a management location for one or more agricultural fields. Field manager computing device 1604 is programmed or configured to provide field data 1606 to an agricultural intelligence computing system 1630 via one or more networks 1609.
[0074] Examples of field data 1606 include (a) identification data (e.g., acreage, field name, field identifiers, geographic identifiers, boundary identifiers, crop identifiers, and any other suitable data that may be used to identify farm land, such as a common land unit (CLU), lot and block number, a parcel number, geographic coordinates and boundaries, Farm Serial Number (FSN), farm number, tract number, field number, section, township, and/or range), (b) harvest data (e.g., crop type, crop variety, crop rotation, whether the crop is grown organically, harvest date, Actual Production History (APH), expected yield, yield, crop price, crop revenue, grain moisture, and tillage practice), (c) soil data (e.g., type, composition, pH, organic matter (OM), cation exchange capacity (CEC)), (d) planting data (e.g., planting date, seed(s) type, relative maturity (RM) of planted seed(s), seed population), (e) fertilizer data (e.g., nutrient type (nitrogen, phosphorus, potassium), application type, application date, amount, source, method), (f) chemical application data (e.g., pesticide, herbicide, fungicide, other substance or mixture of substances intended for use as a plant regulator, defoliant, or desiccant, application date, amount, source, method), (g) irrigation data (e.g., application date, amount, source, method), (h) weather data (e.g., precipitation, precipitation rate, predicted precipitation, water runoff rate region, temperature, wind, forecast, pressure, visibility, clouds, heat index, dew point, humidity, snow depth, air quality, sunrise, sunset), (i) imagery data (e.g., imagery and light spectrum information from an agricultural apparatus sensor, camera, computer, smartphone, tablet, unmanned aerial vehicle, aircraft, or satellite), (j) scouting observations (photos, videos, free-form notes, voice recordings, voice transcripts, weather conditions (temperature, precipitation (current and over time), soil moisture, crop growth stage, wind speed, relative humidity, dew point, black layer)), and (k) soil, seed, crop phenology, pest and disease reporting, and prediction sources and databases.
[0075] An external data server computer 1608 is communicatively coupled to agricultural intelligence computing system 1630 and is programmed or configured to send external data 1610 to agricultural intelligence computing system 1630 via network(s) 1609. External data server computer 1608 may be owned or operated by the same person or legal entity as agricultural intelligence computing system 1630, or by a different person or entity, such as a government agency, non-governmental organization (NGO), and/or a private data service provider. Examples of external data include weather data, imagery data, soil data, or statistical data relating to crop yields, among others. External data 1610 may consist of the same type of information as field data 1606. In some embodiments, external data 1610 is provided by an external data server 1608 owned by the same entity that owns and/or operates agricultural intelligence computing system 1630. For example, agricultural intelligence computing system 1630 may include a data server focused exclusively on a type of data that might otherwise be obtained from third-party sources, such as weather data. In some embodiments, an external data server 1608 may actually be incorporated within system 1630.
[0076] An agricultural apparatus 1611 may have one or more remote sensors 1612 fixed thereon, which sensors are communicatively coupled, directly or indirectly via agricultural apparatus 1611, to the agricultural intelligence computing system 1630 and are programmed or configured to send sensor data to agricultural intelligence computing system 1630. Examples of agricultural apparatus 1611 include tractors, harvesters, combines, planters, trucks, fertilizer equipment, aerial vehicles including unmanned aerial vehicles, and any other item of physical machinery or hardware, typically mobile machinery, that may be used in tasks associated with agriculture. In some embodiments, a single unit of apparatus 1611 may comprise a plurality of sensors 1612 that are coupled locally in a network on the apparatus; controller area network (CAN) is an example of such a network that can be installed in combines, harvesters, sprayers, and cultivators. Application controller 1614 is communicatively coupled to agricultural intelligence computing system 1630 via network(s) 1609 and is programmed or configured to receive one or more scripts that are used to control an operating parameter of an agricultural vehicle or implement from the agricultural intelligence computing system 1630. For instance, a controller area network (CAN) bus interface may be used to enable communications from the agricultural intelligence computing system 1630 to the agricultural apparatus 1611, such as how the CLIMATE FIELDVIEW DRIVE, available from The Climate Corporation, San Francisco, California, is used. Sensor data may consist of the same type of information as field data 1606. In some embodiments, remote sensors 1612 may not be fixed to an agricultural apparatus 1611, but may be remotely located in the field and may communicate with network(s) 1609.
[0077] Apparatus 1611 may comprise a cabin computer 1615 that is programmed with a cabin application, which may comprise a version or variant of the mobile application for device 1604 that is further described in other sections herein. In one embodiment, cabin computer 1615 comprises a compact computer, often a tablet-sized computer or smartphone, with a graphical screen display, such as a color display, that is mounted within an operator's cabin of the apparatus 1611. Cabin computer 1615 may implement some or all of the operations and functions that are described further herein for the mobile computer device 1604.
[0078] Network(s) 1609 broadly represents any combination of one or more data communication networks, including local area networks, wide area networks, internetworks, or internets, using any wired or wireless links, including terrestrial or satellite links. The network(s) may be implemented by any medium or mechanism that provides for the exchange of data between the various elements of FIG. 16. The various elements of FIG. 16 may also have direct (wired or wireless) communication links. Sensors 1612, controller 1614, external data server computer 1608, and other elements of the system each comprise an interface compatible with network(s) 1609 and are programmed or configured to use standardized protocols for communication across the networks, such as TCP/IP, Bluetooth, CAN protocol, and higher-layer protocols such as HTTP, TLS, and the like.
[0079] Agricultural intelligence computing system 1630 is programmed or configured to receive field data 1606 from field manager computing device 1604, external data 1610 from external data server computer 1608, and sensor data from remote sensor 1612. Agricultural intelligence computing system 1630 may be further configured to host, use, or execute one or more computer programs, other software elements, digitally programmed logic such as FPGAs or ASICs, or any combination thereof, to perform translation and storage of data values, construction of digital models of one or more crops on one or more fields, generation of recommendations and notifications, and generation and sending of scripts to application controller 1614, in the manner described further in other sections of this disclosure.
[0080] In one embodiment, agricultural intelligence computing system 1630 is programmed with or comprises a communication layer 1632, a presentation layer 1634, a data management layer 1640, a hardware/virtualization layer 1650, and a model and field data repository 1660. "Layer" in this context refers to any combination of electronic digital interface circuits, microcontrollers, firmware such as drivers, and/or computer programs or other software elements.
[0081] Communication layer 1632 may be programmed or configured to perform input/output interfacing functions, including sending requests to field manager computing device 1604, external data server computer 1608, and remote sensor 1612 for field data, external data, and sensor data, respectively. Communication layer 1632 may be programmed or configured to send the received data to model and field data repository 1660 to be stored as field data 1606.
[0082] Presentation layer 1634 may be programmed or configured to generate a graphical user interface (GUI) to be displayed on field manager computing device 1604, cabin computer 1615, or other computers that are coupled to the system 1630 through network(s) 1609. The GUI may comprise controls for inputting data to be sent to agricultural intelligence computing system 1630, generating requests for models and/or recommendations, and/or displaying recommendations, notifications, models, and other field data.
[0083] Data management layer 1640 may be programmed or configured to manage read operations and write operations involving repository 1660 and other functional elements of the system, including queries and result sets communicated between the functional elements of the system and the repository. Examples of data management layer 1640 include JDBC, SQL server interface code, and/or HADOOP interface code, among others. Repository 1660 may comprise a database. As used herein, the term "database" may refer to a body of data, a relational database management system (RDBMS), or both. As used herein, a database may comprise any collection of data, including hierarchical databases, relational databases, flat file databases, object-relational databases, object-oriented databases, distributed databases, and any other structured collection of records or data that is stored in a computer system. Examples of RDBMSs include, but are not limited to, ORACLE®, MYSQL, IBM® DB2, MICROSOFT® SQL SERVER, SYBASE®, and POSTGRESQL databases. However, any database may be used that enables the systems and methods described herein.
[0084] When field data 1606 is not provided directly to the agricultural intelligence computing system via one or more agricultural machines or agricultural machine devices that interact with the agricultural intelligence computing system, the user may be prompted via one or more user interfaces on the user device (served by the agricultural intelligence computing system) to input such information. In an example embodiment, the user may specify identification data by accessing a map on the user device (served by the agricultural intelligence computing system) and selecting specific CLUs that have been graphically shown on the map. In an alternative embodiment, user 1602 may specify identification data by accessing a map on the user device (served by agricultural intelligence computing system 1630) and drawing boundaries of the field over the map. Such CLU selections or map drawings represent geographic identifiers. In alternative embodiments, the user may specify identification data by accessing field identification data (provided as shape files or in a similar format) from the U.S. Department of Agriculture Farm Service Agency or other source via the user device and providing such field identification data to the agricultural intelligence computing system.
[0085] In an example embodiment, agricultural intelligence computing system 1630 is programmed to generate and cause displaying of a graphical user interface comprising a data manager for data entry. After one or more fields have been identified using the methods described above, the data manager may provide one or more graphical user interface widgets which, when selected, can identify changes to the field, soil, crops, tillage, or nutrient practices. The data manager may include a timeline view, a spreadsheet view, and/or one or more editable programs.
[0086] FIG. 20 depicts an example embodiment of a timeline view for data entry. Using the display depicted in FIG. 20, a user computer can input a selection of a particular field and a particular date for the addition of an event. Events depicted at the top of the timeline may include Nitrogen, Planting, Practices, and Soil. To add a nitrogen application event, a user's computer may provide input to select the nitrogen tab. The user's computer may then select a location on the timeline for a particular field in order to indicate an application of nitrogen on the selected field. In response to receiving a selection of a location on the timeline for a particular field, the data manager may display a data entry overlay, allowing the user's computer to input data pertaining to nitrogen applications, planting procedures, soil application, tillage procedures, irrigation practices, or other information relating to the particular field. For example, if a user's computer selects a portion of the timeline and indicates an application of nitrogen, then the data entry overlay may include fields for inputting an amount of nitrogen applied, a date of application, a type of fertilizer used, and any other information related to the nitrogen application.
[0087] In one embodiment, the data manager provides an interface to create one or more programs. "Program" in this context refers to a set of data relating to nitrogen applications, planting procedures, soil application, tillage procedures, irrigation practices, or other information that may be related to one or more fields, and that can be stored in digital data storage for reuse as a set in other operations. Once a program has been created, it can be conceptually applied to one or more fields, and references to the program can be stored in digital storage in association with the data identifying the fields. So, instead of manually entering identical data related to the same nitrogen applications for several different fields, a user's computer can create a program that indicates a specific nitrogen application and then apply the program to several different fields. For example, in the timeline view of FIG. 20, the top two timelines have the "Spring Applied" program selected, which includes an application of 150 lbs (68.0389 kg) N/ac in early April. The data manager can provide an interface for editing a program. In one embodiment, when a particular program is edited, each field that selected the particular program is edited. For example, in FIG. 20, if the "Spring Applied" program is edited to reduce nitrogen application to 130 lbs (58.967 kg) N/ac, the top two fields can be updated with a reduced nitrogen application based on the edited program.
[0088] In one embodiment, in response to receiving edits in a field that has a selected program, the data manager removes the field's match with the selected program. For example, if an application of nitrogen is added to the upper field in FIG. 20, the interface may be updated to indicate that the "Spring Applied" program is no longer being applied to the upper field. While early April nitrogen application may remain, the "Spring Applied" program updates would not change April nitrogen application.
[0089] FIG. 21 depicts an example embodiment of a spreadsheet view for data entry. Using the display depicted in FIG. 21, a user can create and edit information for one or more fields. The data manager may include spreadsheets for inputting information with respect to Nitrogen, Planting, Practices, and Soil, as depicted in FIG. 21. To edit a particular entry, a user's computer may select the particular entry in the spreadsheet and update the values. For example, FIG. 21 depicts an in-progress update to a desired yield value for the second field. Additionally, a user's computer may select one or more fields in order to apply one or more programs. In response to receiving a selection of a program for a particular field, the data manager may automatically complete the entries for the particular field based on the selected program. As with the timeline view, the data manager may update the entries for each field associated with a particular program in response to receiving an update to the program. Additionally, the data manager may remove the correspondence of the selected program with the field in response to receiving an edit to one of the field's entries.
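The program behavior described in the preceding paragraphs — a shared program whose edits propagate to every field that selected it, while a direct edit to a field removes that field's association with the program — can be sketched with a toy data model. All class, method, and field names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Program:
    name: str
    nitrogen_rate: float        # e.g., lbs N/ac

class DataManager:
    def __init__(self):
        self.programs = {}      # field name -> shared Program reference
        self.overrides = {}     # field name -> field-specific rate

    def apply_program(self, field_name, program):
        self.programs[field_name] = program

    def edit_program(self, program, new_rate):
        # Editing a program updates every field that selected it.
        program.nitrogen_rate = new_rate

    def edit_field(self, field_name, new_rate):
        # A direct field edit removes the program association.
        self.programs.pop(field_name, None)
        self.overrides[field_name] = new_rate

    def rate(self, field_name):
        if field_name in self.overrides:
            return self.overrides[field_name]
        return self.programs[field_name].nitrogen_rate

dm = DataManager()
spring = Program("Spring Applied", 150.0)
dm.apply_program("north", spring)
dm.apply_program("south", spring)
dm.edit_program(spring, 130.0)   # both fields now read 130
dm.edit_field("north", 140.0)    # north is detached from the program
dm.edit_program(spring, 120.0)   # only south still follows the program
```

After these edits, the north field retains its field-specific rate while the south field continues to track the shared program, mirroring the behavior described for FIG. 20 and FIG. 21.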
[0090] In one embodiment, model and field data are stored in model and field data repository 1660. Model data comprises data models created for one or more fields. For example, a crop model may include a digitally constructed model of the development of a crop on the one or more fields. "Model" in this context refers to an electronic, digitally stored set of executable instructions and data values, associated with one another, which are capable of receiving and responding to a programmatic or other digital call, invocation, or resolution request based on specified input values, to yield one or more stored or calculated output values that can serve as the basis of computer-implemented recommendations, output data displays, or machine control, among other things. Those skilled in the art find it convenient to express models using mathematical equations, but that form of expression does not confine the models disclosed herein to abstract concepts; instead, each model herein has a practical application in a computer in the form of stored executable instructions and data that implement the model using the computer. The model may include a model of past events on the one or more fields, a model of the current status of the one or more fields, and/or a model of predicted events on the one or more fields. Model and field data may be stored in data structures in memory, rows in a database table, in flat files or spreadsheets, or other forms of stored digital data.
[0091] In one embodiment, the imaging support instructions 1636 comprise a set of one or more pages of main memory, such as RAM, in the agricultural intelligence computing system 1630 into which executable instructions have been loaded and which, when executed, cause the agricultural intelligence computing system to perform the functions or operations that are described herein with reference to those modules. For example, the imaging support instructions 1636 may comprise a set of pages in RAM that contain instructions which, when executed, cause performance of imaging support functions to assist in performing or supporting steps 516, 518, 520 of FIG. 9. For example, monitor 300 may be programmed to upload, via cabin computer 1615, working layer images to computer system 1630 for cloud-based image analysis, and to receive, in response messages from the computer system, characterizations of the working layer images for use in the other processes that have been described in relation to FIG. 1 to FIG. 15. The instructions may be in machine executable code in the instruction set of a CPU and may have been compiled based upon source code written in JAVA, C, C++, OBJECTIVE-C, or any other human-readable programming language or environment, alone or in combination with scripts in JAVASCRIPT, other scripting languages, and other programming source text. The term "pages" is intended to refer broadly to any region within main memory, and the specific terminology used in a system may vary depending on the memory architecture or processor architecture.
In another embodiment, each of the imaging support instructions 1636 also may represent one or more files or projects of source code that are digitally stored in a mass storage device such as non-volatile RAM or disk storage, in the agricultural intelligence computing system 1630 or a separate repository system, which, when compiled or interpreted, generate executable instructions which, when executed, cause the agricultural intelligence computing system to perform the functions or operations that are described herein with reference to those modules. In other words, the drawing figure may represent the manner in which programmers or software developers organize and arrange source code for later compilation into an executable, or interpretation into bytecode or the equivalent, for execution by the agricultural intelligence computing system 1630.
[0092] Hardware/virtualization layer 1650 comprises one or more central processing units (CPUs), memory controllers and other devices, components or elements of a computer system, such as volatile or non-volatile memory, non-volatile storage, such as disk, and I/O devices/interfaces as illustrated and described, for example, in connection with FIG. 19. Layer 1650 may also comprise programmed instructions that are configured to support virtualization, containerization, or other technologies.
[0093] For purposes of illustrating a clear example, FIG. 16 shows a limited number of instances of certain functional elements. However, in other embodiments, there may be any number of such elements. For example, embodiments may use thousands or millions of different mobile computing devices 1604 associated with different users. Further, the system 1630 and/or external data server computer 1608 may be implemented using two or more processors, cores, clusters, or instances of physical machines or virtual machines, configured in a discrete location or co-located with other elements in a datacenter, shared computing facility, or cloud computing facility.
2.2. APPLICATION PROGRAM OVERVIEW
[0094] In one embodiment, the implementation of the functions described herein using one or more computer programs or other software elements that are loaded into and executed using one or more general-purpose computers will cause the general-purpose computers to be configured as a particular machine or as a computer that is specially adapted to perform the functions described herein. Further, each of the flow diagrams that are described further herein may serve, alone or in combination with the descriptions of processes and functions in prose herein, as algorithms, plans, or directions that may be used to program a computer or logic to implement the functions that are described. In other words, all the prose text herein, and all the drawing figures, together are intended to provide disclosure of algorithms, plans, or directions that are sufficient to permit a person skilled in the art to program a computer to perform the functions that are described herein, in combination with the skill and knowledge of such a person, given the level of skill that is applicable to inventions and disclosures of this type.
[0095] In one embodiment, user 1602 interacts with agricultural intelligence computing system 1630 using field manager computing device 1604 configured with an operating system and one or more application programs or apps; the field manager computing device 1604 also may interoperate with the agricultural intelligence computing system independently and automatically under program control or logical control, and direct user interaction is not always required. Field manager computing device 1604 broadly represents one or more of a smartphone, PDA, tablet computing device, laptop computer, desktop computer, workstation, or any other computing device capable of transmitting and receiving information and performing the functions described herein. Field manager computing device 1604 may communicate via a network using a mobile application stored on field manager computing device 1604, and in some embodiments, the device may be coupled using a cable 1613 or connector to the sensor 1612 and/or controller 1614. A particular user 1602 may own, operate, or possess and use, in connection with system 1630, more than one field manager computing device 1604 at a time.
[0096] The mobile application may provide client-side functionality, via the network, to one or more mobile computing devices. In an example embodiment, field manager computing device 1604 may access the mobile application via a web browser or a local client application or app. Field manager computing device 1604 may transmit data to, and receive data from, one or more front-end servers, using web-based protocols or formats such as HTTP, XML, and/or JSON, or app-specific protocols. In an example embodiment, the data may take the form of requests and user information input, such as field data, into the mobile computing device. In some embodiments, the mobile application interacts with location tracking hardware and software on field manager computing device 1604 which determines the location of field manager computing device 1604 using standard tracking techniques such as multilateration of radio signals, the global positioning system (GPS), WiFi positioning systems, or other methods of mobile positioning. In some cases, location data or other data associated with the device 1604, user 1602, and/or user account(s) may be obtained by queries to an operating system of the device or by requesting an app on the device to obtain data from the operating system.
[0097] In one embodiment, field manager computing device 1604 sends field data 1606 to agricultural intelligence computing system 1630 comprising or including, but not limited to, data values representing one or more of: a geographical location of the one or more fields, tillage information for the one or more fields, crops planted in the one or more fields, and soil data extracted from the one or more fields. Field manager computing device 1604 may send field data 1606 in response to user input from user 1602 specifying the data values for the one or more fields. Additionally, field manager computing device 1604 may automatically send field data 1606 when one or more of the data values becomes available to field manager computing device 1604. For example, field manager computing device 1604 may be communicatively coupled to remote sensor 1612 and/or application controller 1614, which include an irrigation sensor and/or irrigation controller. In response to receiving data indicating that application controller 1614 released water onto the one or more fields, field manager computing device 1604 may send field data 1606 to agricultural intelligence computing system 1630 indicating that water was released on the one or more fields. Field data 1606 identified in this disclosure may be input and communicated using electronic digital data that is communicated between computing devices using parameterized URLs over HTTP, or another suitable communication or messaging protocol.
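The parameterized-URL transport mentioned above could look like the following sketch; the endpoint, parameter names, and values are invented for illustration and are not part of the disclosure.

```python
from urllib.parse import urlencode

def field_data_url(base_url, field_id, values):
    """Encode field data values as query parameters on a base URL,
    suitable for transmission over HTTP. The parameter schema here
    (field, event, amount_in) is purely hypothetical."""
    params = {"field": field_id, **values}
    return f"{base_url}?{urlencode(params)}"

# Hypothetical report that an irrigation controller released water.
url = field_data_url("https://example.invalid/fielddata", 42,
                     {"event": "irrigation", "amount_in": "1.5"})
```

A real deployment would also need authentication and would more likely use a structured message body (e.g., JSON over HTTPS) for anything beyond simple values.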
[0098] A commercial example of the mobile application is CLIMATE FIELDVIEW, commercially available from The Climate Corporation, San Francisco, California. The CLIMATE FIELDVIEW application, or other applications, may be modified, extended, or adapted to include features, functions, and programming that were not disclosed prior to the filing date of this disclosure. In one embodiment, the mobile application comprises an integrated software platform that allows a producer to make fact-based decisions for its operation because it combines historical data about the producer's fields with any other data the producer wishes to compare. Combinations and comparisons can be performed in real time and are based on scientific models that provide potential scenarios to allow the producer to make better and more informed decisions.
[0099] FIG. 17 illustrates two views of an example logical organization of sets of instructions in main memory when an example mobile application is loaded for execution. In FIG. 17, each named element represents a region of one or more pages of RAM or other main memory, or one or more blocks of disk storage or other non-volatile storage, and the programmed instructions within those regions. In one embodiment, in view (a), a mobile computer application 1700 comprises account, fields, data ingestion, sharing instructions 1702, overview and alert instructions 1704, digital map book instructions 1706, seeds and planting instructions 1708, nitrogen instructions 1710, weather instructions 1712, field health instructions 1714, and performance instructions 1716.
[00100] In one embodiment, a mobile computer application 1700 comprises account, fields, data ingestion, sharing instructions 1702 which are programmed to receive, translate, and ingest field data from third-party systems via manual upload or APIs. Data types may include field boundaries, yield maps, as-planted maps, soil test results, as-applied maps, and/or management zones, among others. Data formats may include shape files, native data formats of third parties, and/or Farm Management Information System (FMIS) exports, among others. Receiving data may occur via manual upload, e-mail with attachment, external APIs that push data to the mobile application, or instructions that call APIs of external systems to pull data into the mobile application. In one embodiment, mobile computer application 1700 comprises a data inbox. In response to receiving a selection of the data inbox, the mobile computer application 1700 may display a graphical user interface for manually uploading data files and importing uploaded files to a data manager.
[00101] In one embodiment, digital map book instructions 1706 comprise field map data layers stored in device memory and are programmed with data visualization tools and geospatial field notes. This provides growers with convenient information close at hand for reference, logging, and visual insights into field performance. In one embodiment, overview and alert instructions 1704 are programmed to provide an operation-wide view of what is important to the grower, and timely recommendations to take action or focus on particular issues. This permits the grower to focus time on what needs attention, to save time and preserve yield throughout the season. In one embodiment, seeds and planting instructions 1708 are programmed to provide tools for seed selection, hybrid placement, and script creation, including variable rate (VR) script creation, based upon scientific models and empirical data. This enables growers to maximize yield or return on investment through optimized seed purchase, placement, and population.
[00102] In one embodiment, script generation instructions 1705 are programmed to provide an interface for generating scripts, including variable rate (VR) fertility scripts. The interface allows growers to create scripts for field implements, such as for nutrient applications, planting, and irrigation. For example, a planting script interface might include tools to identify a type of seed for planting. Upon receiving a seed type selection, mobile computer application 1700 may display one or more fields divided into management zones, such as the field map data layers created as part of digital map book instructions 1706. In one embodiment, the management zones comprise soil zones along with a panel that identifies each soil zone and a soil name, texture, and drainage for each zone, or other field data. Mobile computer application 1700 can also display tools for editing or creating such maps, such as graphical tools for drawing management zones, such as soil zones, on a map of one or more fields. Planting procedures can be applied to all management zones, or different planting procedures can be applied to different subsets of management zones. When a script is created, mobile computer application 1700 can make the script available for download in a format readable by an application controller, such as an archived or compressed format. Additionally and/or alternatively, a script may be sent directly to cabin computer 1615 from mobile computer application 1700 and/or uploaded to one or more data servers and stored for later use.
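By way of illustration only, a variable rate script of the kind described above can be viewed as a mapping from management zone to target application rate, serialized into a form a downstream application controller could read. The JSON field names, zone identifiers, and rates in this sketch are hypothetical assumptions, not part of this disclosure:

```python
import json

def build_vr_script(zones, rates, default_rate):
    """Serialize a hypothetical variable-rate script: one target rate per
    management zone, falling back to default_rate for unlisted zones."""
    return json.dumps({
        "type": "variable_rate",
        "default_rate": default_rate,
        "zones": [{"zone_id": z, "rate": rates.get(z, default_rate)} for z in zones],
    })
```

For example, `build_vr_script(["A", "B"], {"A": 32000}, 30000)` would assign a seeding rate of 32,000 to zone A and the default of 30,000 to zone B.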
[00103] In one embodiment, nitrogen instructions 1710 are programmed to provide tools to inform nitrogen decisions by visualizing nitrogen availability to crops. This allows growers to maximize yield or return on investment through optimized in-season nitrogen application. Examples of programmed functions include displaying images such as SSURGO images to allow the design of fertilizer application zones and/or images generated from subfield soil data, such as sensor data, at high spatial resolution (as fine as millimeters or smaller, depending on sensor proximity and resolution); upload of existing producer-defined zones; providing a graph of plant nutrient availability and/or a map to allow adjusted nitrogen application(s) across multiple zones; issuing scripts to drive machines; tools for bulk data entry and adjustment; and/or maps for data visualization, among others. "Bulk data entry" in this context can mean entering data once and then applying the same data to multiple fields and/or zones that have been defined in the system; example data may include nitrogen application data that is the same for many fields and/or zones of the same producer, but such bulk data entry applies to entering any type of field data into mobile computer application 1700. For example, nitrogen instructions 1710 can be programmed to accept definitions of nitrogen application and practice programs, and to accept user input specifying the application of these programs across multiple fields. "Nitrogen application programs" in this context refers to stored, named data sets that associate: a name, color code or other identifier; one or more application dates; material or product types for each of the dates and quantities; method of application or incorporation, such as injected or broadcast; and/or quantities or application rates for each of the dates, the crop or hybrid that is the subject of the application, among others.
"Nitrogen practice programs" in this context refers to stored, named data sets that associate: a practice name; a previous crop; a tillage system; a primary tillage date; one or more previous tillage systems that were used; one or more application type indicators, such as manure, that were used. The nitrogen instructions 1710 can also be programmed to generate and cause the display of a nitrogen graph, which indicates projections of plant use of the specified nitrogen and whether a surplus or deficit is predicted; in some embodiments, different color indicators can signal a magnitude of surplus or a magnitude of deficit. In one embodiment, a nitrogen graph comprises a graphical display on a computer display device comprising a plurality of rows, each row associated with and identifying a field; data specifying which crop is planted in the field, the field size, the field location, and a graphical representation of the field perimeter; in each row, a timeline by month with graphical indicators specifying each nitrogen application and amount at points correlated to month names; and numerical and/or colored indicators of surplus or deficit, in which the color indicates magnitude.
[00104] In one embodiment, the nitrogen graph may include one or more user input features, such as dials or sliders, to dynamically change the nitrogen planting and practices programs so that a user can optimize their nitrogen graph. The user can then use their optimized nitrogen graph and the related nitrogen planting and practices programs to implement one or more scripts, including variable rate (VR) fertility scripts. The nitrogen instructions 1710 can also be programmed to generate and cause the display of a nitrogen map, which indicates projections of plant use of the specified nitrogen and whether a surplus or deficit is predicted; in some embodiments, different color indicators can signal a magnitude of surplus or a magnitude of deficit. The nitrogen map can display projections of plant use of the specified nitrogen and whether a surplus or deficit is predicted for different times in the past and future (such as daily, weekly, monthly or yearly) using numerical and/or colored indicators of surplus or deficit, in which the color indicates magnitude. In one embodiment, the nitrogen map can include one or more user input features, such as dials or sliders, to dynamically change the nitrogen planting and practices programs so that a user can optimize their nitrogen map, for example to obtain a preferred amount of surplus to deficit. The user can then use their optimized nitrogen map and the related nitrogen practice and planting programs to implement one or more scripts, including variable rate (VR) fertility scripts. In other embodiments, instructions similar to the nitrogen instructions 1710 can be used for application of other nutrients (such as phosphorus and potassium), pesticide application, and irrigation programs.
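The surplus or deficit projections described above amount to a running balance of nitrogen applied versus projected plant use over time. The following minimal sketch, with hypothetical monthly amounts and units, illustrates the idea only and is not the disclosed implementation:

```python
def nitrogen_balance(applications, uptake):
    """Running nitrogen surplus (+) or deficit (-) per period, given the
    amount applied and the projected crop uptake for each period."""
    balance, series = 0.0, []
    for applied, used in zip(applications, uptake):
        balance += applied - used  # carry the remainder forward
        series.append(balance)
    return series
```

For example, `nitrogen_balance([100, 0, 50], [30, 40, 60])` yields `[70.0, 30.0, 20.0]`: a shrinking surplus that a colored indicator could render month by month.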
[00105] In one embodiment, weather instructions 1712 are programmed to provide recent field-specific weather data and predicted weather information. This allows producers to save time and have an efficient integrated view of daily operational decisions.
[00106] In one embodiment, field health instructions 1714 are programmed to provide timely remote sensing images highlighting in-season crop variation and potential concerns. Examples of programmed functions include cloud checking to identify possible clouds or cloud shadows; determination of nitrogen indices based on field images; graphical visualization of scouting layers, including, for example, those related to field health, and viewing and/or sharing of scouting notes; and/or downloading satellite images from multiple sources and prioritizing the images for the producer, among others.
[00107] In one embodiment, performance instructions 1716 are programmed to provide reporting, analysis, and insight tools that use on-farm data for evaluation, insights, and decisions. This allows the producer to seek improved results for the next year through fact-based conclusions about why return on investment was at prior levels, and insight into yield-limiting factors. Performance instructions 1716 can be programmed to communicate over network(s) 1609 with back-end analytics programs executed on agricultural intelligence computing system 1630 and/or external data server computer 1608 and configured to analyze metrics such as yield, yield differential, hybrid, population, SSURGO zone, soil test properties, or elevation, among others. Programmed reporting and analysis may include yield variability analysis, treatment effect estimation, benchmarking of yield and other metrics against those of other growers based on anonymized data collected from many growers, or data for seeds and planting, among others.
[00108] Applications with instructions configured in this way can be implemented for different computing device platforms while keeping the same general user interface appearance. For example, the mobile application can be programmed to run on tablets, smartphones or server computers that are accessed using browsers on client computers. In addition, the mobile application as configured for tablet computers or smartphones can provide a full app experience or a cab app experience that is suitable for the display and processing capabilities of cab computer 1615. For example, referring now to view (b) of FIG. 17, in one embodiment a cab computer application 1720 may comprise maps-cab instructions 1722, remote view instructions 1724, data collect and transfer instructions 1726, machine alerts instructions 1728, script transfer instructions 1730, and scouting-cab instructions 1732. The code base for the view (b) instructions can be the same as for view (a), and executables implementing the code can be programmed to detect the type of platform on which they are executing and to expose, through a graphical user interface, only those functions that are appropriate for a cab platform or full platform. This approach allows the system to recognize the distinctly different user experience that is appropriate for an in-cab environment and the different in-cab technology environment. The maps-cab instructions 1722 can be programmed to provide map views of fields, farms, or regions that are useful in directing machine operation. The remote view instructions 1724 can be programmed to turn on, manage, and provide views of machine activity in real time or near real time to other computing devices connected to system 1630 via wireless networks, wired connectors or adapters, and the like.
The data collect and transfer instructions 1726 can be programmed to turn on, manage, and provide transfer of data collected at sensors and controllers to system 1630 via wireless networks, wired connectors or adapters, and the like. The machine alerts instructions 1728 can be programmed to detect problems with operations of the machine or tools that are associated with the cab and generate operator alerts. The script transfer instructions 1730 can be configured to transfer scripts of instructions that are configured to direct machine operations or data collection. The scouting-cab instructions 1732 can be programmed to display location-based alerts and information received from system 1630 based on the location of field manager computing device 1604, agricultural apparatus 1611, or sensors 1612 in the field, and to ingest, manage, and provide transfer of location-based scouting observations to system 1630 based on the location of agricultural apparatus 1611 or sensors 1612 in the field.
2.3. DATA INGESTION FOR THE COMPUTER SYSTEM
[00109] In one embodiment, external data server computer 1608 stores external data 1610, including soil data representing soil composition for the one or more fields and weather data representing temperature and precipitation in the one or more fields. Weather data can include past and present weather data as well as predictions of future weather data. In one embodiment, external data server computer 1608 comprises a plurality of servers hosted by different entities. For example, a first server might contain soil composition data, while a second server might include weather data. In addition, soil composition data can be stored on multiple servers. For example, one server might store data representing the percentage of sand, silt, and clay in the soil, while a second server might store data representing the percentage of organic matter (OM) in the soil.
[00110] In one embodiment, remote sensor 1612 comprises one or more sensors that are programmed or configured to produce one or more observations. Remote sensor 1612 can be aerial sensors, such as satellites, vehicle sensors, planting equipment sensors, crop sensors, fertilizer or insecticide application sensors, harvester sensors, and any other implement capable of receiving data from one or more fields. In one embodiment, application controller 1614 is programmed or configured to receive instructions from agricultural intelligence computing system 1630. Application controller 1614 may also be programmed or configured to control an operating parameter of an agricultural vehicle or implement. For example, an application controller can be programmed or configured to control an operating parameter of a vehicle, such as a tractor, planting equipment, tillage equipment, fertilizer or insecticide equipment, harvesting equipment, or other agricultural implements, such as a water valve. Other embodiments may use any combination of sensors and controllers, of which the following are merely selected examples.
[00111] System 1630 can obtain or ingest data, under the control of user 1602, on a bulk basis from a large number of producers who have contributed data to a shared database system. This form of data acquisition may be termed "manual data ingestion", as one or more user-controlled computer operations are requested or triggered to obtain data for use by system 1630. As an example, the commercially available CLIMATE FIELDVIEW application from The Climate Corporation of San Francisco, California, can be operated to export data to system 1630 for storage in repository 1660.
[00112] For example, seed monitoring systems can control planter apparatus components and obtain planting data, including signals from seed sensors via a signal harness comprising a CAN backbone and point-to-point connections for registration and/or diagnostics. Seed monitoring systems can be programmed or configured to display seed spacing, population, and other information to the user via cabin computer 1615 or other devices within system 1630. Examples are disclosed in US Patent No. 8,738,243 and US Patent Publication No. 20150094916, and the present disclosure assumes knowledge of those other patent disclosures.
[00113] Likewise, yield monitoring systems may contain yield sensors for harvesters that send yield measurement data to cabin computer 1615 or other devices within system 1630. Yield monitoring systems may use one or more remote sensors 1612 to obtain grain moisture measurements on a combine or other harvester and transmit these measurements to the user via cabin computer 1615 or other devices within system 1630.
[00114] In one embodiment, examples of sensors 1612 that can be used with any moving vehicle or apparatus of the type described elsewhere include kinematic sensors and position sensors. Kinematic sensors can include any of the speed sensors, such as wheel or radar speed sensors, accelerometers or gyroscopes. Position sensors can include GPS receivers or transceivers, or WiFi-based mapping or position applications that are programmed to determine location based on nearby WiFi hotspots, among others.
[00115] In one embodiment, examples of sensors 1612 that can be used with tractors or other moving vehicles include engine speed sensors, fuel consumption sensors, area counters or distance counters that interact with GPS or radar signals, PTO (Power Take-Off) speed sensors, tractor hydraulic sensors configured to detect hydraulic parameters such as pressure or flow and/or hydraulic pump speed, wheel speed sensors, or wheel slip sensors. In one embodiment, examples of controllers 1614 that can be used with tractors include hydraulic directional controllers, pressure controllers and/or flow controllers; hydraulic pump speed controllers; speed regulators or controllers; hitch position controllers; or wheel position controllers to provide automated steering.
[00116] In one embodiment, examples of sensors 1612 that can be used with seed planting equipment, such as planters, drills or air seeders, include seed sensors, which can be optical, electromagnetic or impact sensors; down force sensors such as load pins, load cells, or pressure sensors; soil property sensors, such as reflectivity sensors, moisture sensors, electrical conductivity sensors, optical residue sensors, or temperature sensors; component operational criteria sensors, such as planting depth sensors, down force cylinder pressure sensors, seed disk speed sensors, seed drive motor encoders, seed conveyor system speed sensors, or vacuum level sensors; or pesticide application sensors, such as optical sensors or other electromagnetic sensors, or impact sensors. In one embodiment, examples of controllers 1614 that can be used with such seed planting equipment include: toolbar folding controllers, such as controllers for valves associated with hydraulic cylinders; down force controllers, such as controllers for valves associated with pneumatic cylinders, airbags, or hydraulic cylinders, and programmed to apply down force to individual row units or an entire planter structure; planting depth controllers, such as linear actuators; metering controllers, such as electric seed meter drive motors, hydraulic seed meter drive motors, or swath control clutches; hybrid selection controllers, such as seed meter drive motors, or other actuators programmed to selectively allow or prevent seed or an air-seed mixture from delivering seed to or from seed meters or central bulk hoppers; metering controllers, such as electric seed meter drive motors or hydraulic seed meter drive motors; seed transport system controllers, such as controllers for a belt seed delivery conveyor motor; marker controllers, such as a controller for a pneumatic or hydraulic actuator; or pesticide application rate controllers, such as metering drive controllers, orifice size controllers or position controllers.
[00117] In one embodiment, examples of sensors 1612 that can be used with tillage equipment include position sensors for tools such as shanks or discs; tool position sensors for such tools that are configured to detect depth, gang angle, or lateral spacing; down force sensors; or traction force sensors. In one embodiment, examples of controllers 1614 that can be used with tillage equipment include down force controllers or tool position controllers, such as controllers configured to control tool depth, gang angle, or lateral spacing.
[00118] In one embodiment, examples of sensors 1612 that can be used in connection with apparatus for applying fertilizers, insecticides, fungicides and the like, such as on-planter starter fertilizer systems, subsoil fertilizer applicators, or fertilizer sprayers, include: fluid system criteria sensors, such as flow sensors or pressure sensors; sensors that indicate which spray head valves or fluid line valves are open; sensors associated with tanks, such as fill level sensors; sectional or system-wide supply line sensors, or row-specific supply line sensors; or kinematic sensors such as accelerometers disposed on sprayer booms. In one embodiment, examples of controllers 1614 that can be used with such apparatus include pump speed controllers; valve controllers that are programmed to control pressure, flow, direction, PWM and the like; or position actuators, such as for boom height, subsoiler depth, or boom position.
[00119] In one embodiment, examples of sensors 1612 that can be used with combines include yield monitors, such as impact plate strain gauges or position sensors, capacitive flow sensors, load sensors, weight sensors, or torque sensors associated with elevators or augers, or optical or other electromagnetic grain height sensors; grain moisture sensors, such as capacitive sensors; grain loss sensors, including impact, optical, or capacitive sensors; header operating criteria sensors, such as header height, header type, deck plate gap, feeder speed, and reel speed sensors; separator operating criteria sensors, such as concave clearance, rotor speed, shoe clearance, or chaffer clearance sensors; auger sensors for position, operation, or speed; or engine speed sensors. In one embodiment, examples of controllers 1614 that can be used with combines include header operating criteria controllers for elements such as header height, header type, deck plate gap, feeder speed, or reel speed; separator operating criteria controllers for features such as concave clearance, rotor speed, shoe clearance, or chaffer clearance; or controllers for auger position, operation, or speed.
[00120] In one embodiment, examples of sensors 1612 that can be used with grain carts include weight sensors, or sensors for auger position, operation or speed. In one embodiment, examples of controllers 1614 that can be used with grain carts include controllers for auger position, operation, or speed.
[00121] In one embodiment, examples of sensors 1612 and controllers 1614 can be installed in unmanned aerial vehicle (UAV) apparatus or "drones". Such sensors may include cameras with detectors effective for any band of the electromagnetic spectrum, including visible light, infrared, ultraviolet, near infrared (NIR) and the like; accelerometers; altimeters; temperature sensors; humidity sensors; pitot tube sensors or other airspeed or wind speed sensors; battery life sensors; or radar emitters and reflected radar energy detection apparatus, or other electromagnetic radiation emitters and reflected electromagnetic radiation detection apparatus. Such controllers may include motor control or guidance apparatus, control surface controllers, camera controllers, or controllers programmed to turn on, operate, obtain data from, manage and configure any of the above sensors. Examples are disclosed in US Patent Application No. 14/831,165, and the present disclosure assumes knowledge of that other patent disclosure.
[00122] In one embodiment, sensors 1612 and controllers 1614 can be affixed to soil sampling and measurement apparatus that is configured or programmed to sample soil and perform soil chemistry tests, soil moisture tests, and other soil-related tests. For example, the apparatus disclosed in US Patent No. 8,767,194 and US Patent No. 8,712,148 can be used, and the present disclosure assumes knowledge of those patent disclosures.
[00123] In one embodiment, sensors 1612 and controllers 1614 may comprise meteorological devices for monitoring the weather conditions of fields. For example, the apparatus disclosed in US Provisional Patent Application No. 62/154,207, filed April 29, 2015, US Provisional Patent Application No. 62/175,160, filed June 12, 2015, US Provisional Patent Application No. 62/198,060, filed July 28, 2015 and US Provisional Patent Application No. 62/220,852, filed September 18, 2015, all of which are now US Patent 10,768,340, may be used, and the present disclosure assumes knowledge of these patent disclosures.
2.4. PROCESS OVERVIEW - FORMATION OF AGRONOMIC MODELS
[00124] In one embodiment, agricultural intelligence computing system 1630 is programmed or configured to create an agronomic model. In this context, an agronomic model is an in-memory data structure of agricultural intelligence computing system 1630 that comprises field data 1606, such as identification data and harvest data for one or more fields. The agronomic model can also comprise calculated agronomic properties that describe conditions that may affect the growth of one or more crops in a field, or properties of one or more crops, or both. In addition, an agronomic model can include recommendations based on agronomic factors such as crop recommendations, irrigation recommendations, planting recommendations, fertilizer recommendations, fungicide recommendations, pesticide recommendations, harvesting recommendations, and other crop management recommendations. Agronomic factors can also be used to estimate one or more crop-related outcomes, such as agronomic yield. The agronomic yield of a crop is an estimate of the amount of crop produced or, in some instances, the income or profit obtained from the crop produced.
[00125] In one embodiment, agricultural intelligence computing system 1630 may use a preconfigured agronomic model to calculate agronomic properties related to currently received location and crop information for one or more fields. The preconfigured agronomic model is based on previously processed field data, including, but not limited to, identification data, harvest data, fertilizer data, and meteorological data. The preconfigured agronomic model may have been cross-validated to ensure model accuracy. Cross-validation can include comparison against ground truth that compares predicted results with actual results in a field, such as comparing a precipitation estimate with a rain gauge or sensor providing meteorological data at the same or a nearby location, or comparing an estimate of nitrogen content with a soil sample measurement.
[00126] FIG. 18 illustrates a programmed process by which the agricultural intelligence computational system generates one or more preconfigured agronomic models using field data provided by one or more data sources. FIG. 18 can serve as an algorithm or instructions for programming the functional elements of agricultural intelligence computing system 1630 to perform the operations that are now described.
[00127] At block 1805, agricultural intelligence computing system 1630 is configured or programmed to implement agronomic data pre-processing of field data received from one or more data sources. The field data received from one or more data sources can be pre-processed for the purpose of removing noise, distortion effects, and confounding factors in the agronomic data, including measured outliers that could negatively affect received field data values. Embodiments of agronomic data pre-processing may include, but are not limited to, removing data values commonly associated with outlier data values, specific measured data points that are known to unnecessarily skew other data values, data smoothing, aggregation, or sampling techniques used to remove or reduce additive or multiplicative effects of noise, and other filtering or data derivation techniques used to provide clear distinctions between positive and negative data inputs.
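For illustration only, the outlier removal and smoothing described above might be sketched as a z-score filter followed by a moving average. The 3-standard-deviation threshold and the window size below are illustrative assumptions, not parameters of the disclosed embodiments:

```python
from statistics import mean, stdev

def preprocess(values, z_threshold=3.0, window=3):
    """Drop values farther than z_threshold standard deviations from the
    mean, then smooth the survivors with a simple moving average."""
    mu, sigma = mean(values), stdev(values)
    kept = [v for v in values if sigma == 0 or abs(v - mu) / sigma <= z_threshold]
    smoothed = []
    for i in range(len(kept)):
        # Centered window, clipped at the ends of the series.
        lo = max(0, i - window // 2)
        hi = min(len(kept), i + window // 2 + 1)
        smoothed.append(mean(kept[lo:hi]))
    return smoothed
```

In practice the threshold would be tuned to the sensor and data type; robust statistics (median, IQR) are a common alternative for small samples.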
[00128] In block 1810, agricultural intelligence computing system 1630 is configured or programmed to perform data subset selection using the pre-processed field data in order to identify data sets useful for initial agronomic model generation. Agricultural intelligence computing system 1630 can implement data subset selection techniques including, but not limited to, a genetic algorithm method, an all-subset-models method, a sequential search method, a stepwise regression method, a particle swarm optimization method, and an ant colony optimization method. For example, a genetic algorithm selection technique uses an adaptive heuristic search algorithm, based on the evolutionary principles of natural selection and genetics, to determine and evaluate data sets within the pre-processed agronomic data.
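As an illustrative sketch of the genetic algorithm selection technique mentioned above, individuals can be bitmasks over the candidate feature set, evolved by selection, crossover, and mutation. The population size, generation count, and operators here are assumptions for illustration, not the disclosed method:

```python
import random

def ga_select(features, fitness, pop_size=20, generations=30, seed=42):
    """Toy genetic-algorithm subset selection: each individual is a bitmask
    over `features`; higher fitness(subset) is better."""
    rng = random.Random(seed)
    n = len(features)

    def subset(mask):
        return [f for f, bit in zip(features, mask) if bit]

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(subset(m)), reverse=True)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)               # single-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n)                    # single-point mutation
            child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    best = max(pop, key=lambda m: fitness(subset(m)))
    return subset(best)
```

A real fitness function would score a candidate subset by the quality of the agronomic model built from it (as in block 1815); the sketch simply assumes any callable that maps a subset to a number.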
[00129] At block 1815, agricultural intelligence computing system 1630 is configured or programmed to implement field data set evaluation. In one embodiment, a specific field data set is evaluated by creating an agronomic model and using specific quality thresholds for the created agronomic model. Agronomic models can be compared and/or validated using one or more comparison techniques, such as, but not limited to, root mean square error of leave-one-out cross validation (RMSECV), mean absolute error, and mean percentage error. For example, RMSECV can cross-validate agronomic models by comparing predicted agronomic property values created by the agronomic model against historical agronomic property values that were collected and analyzed. In one embodiment, the agronomic data set evaluation logic is used as a feedback loop in which agronomic data sets that do not meet configured quality thresholds are used during future data subset selection steps (block 1810).
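The RMSECV comparison can be illustrated with a minimal leave-one-out sketch: fit on all points but one, predict the held-out point, and take the root mean square of the errors. The `fit` and `predict` callables below are hypothetical placeholders for an agronomic model, not part of the disclosure:

```python
from math import sqrt
from statistics import mean

def rmse_loocv(xs, ys, fit, predict):
    """Leave-one-out cross validation: for each observation, fit on the
    remaining data, predict the held-out value, and return the RMSE."""
    errors = []
    for i in range(len(xs)):
        train_x = xs[:i] + xs[i + 1:]
        train_y = ys[:i] + ys[i + 1:]
        model = fit(train_x, train_y)
        errors.append((predict(model, xs[i]) - ys[i]) ** 2)
    return sqrt(mean(errors))
```

For instance, with a trivial "predict the training mean" model, `rmse_loocv([0, 1, 2], [1, 2, 3], lambda xs, ys: sum(ys) / len(ys), lambda m, x: m)` evaluates to sqrt(1.5).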
[00130] In block 1820, the agricultural intelligence computing system 1630 is configured or programmed to implement the creation of agronomic models based on the cross-validated agronomic datasets. In one embodiment, the creation of agronomic models can implement multivariate regression techniques to create pre-configured agronomic data models.
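As one illustrative form of the multivariate regression techniques mentioned above (a sketch under the assumption of ordinary least squares, not the disclosed implementation), the coefficients can be obtained by solving the normal equations:

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved by Gaussian elimination; each row of X is [1, x1, x2, ...]."""
    n = len(X[0])
    # Build A = X^T X and c = X^T y.
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    c = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):                      # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            c[r] -= f * c[col]
    b = [0.0] * n                             # back substitution
    for i in reversed(range(n)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, n))) / A[i][i]
    return b
```

Fitting the points (0, 1), (1, 3), (2, 5), (3, 7) recovers intercept 1 and slope 2; production systems would typically use a numerical library rather than hand-rolled elimination.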
[00131] In block 1825, the agricultural intelligence computing system 1630 is configured or programmed to store the pre-configured agronomic data models for future evaluation of field data.
2.5. EXAMPLE OF IMPLEMENTATION - HARDWARE OVERVIEW
[00132] According to one embodiment, the techniques described herein are implemented by one or more special purpose computing devices. Special purpose computing devices can be hard-wired to perform the techniques, or can include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques in accordance with program instructions in firmware, memory, other storage, or a combination. Such special purpose computing devices can also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. Special purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, network devices, or any other device that incorporates hard-wired logic and/or program logic to implement the techniques.
[00133] For example, FIG. 19 is a block diagram illustrating a computer system 1900 in which an embodiment of the invention can be implemented. Computer system 1900 includes a 1902 bus or other communication mechanism for communicating information, and a 1904 hardware processor coupled with the 1902 bus for processing information. The 1904 hardware processor can be, for example, a general purpose microprocessor.
[00134] Computer system 1900 also includes main memory 1906, such as random access memory (RAM) or other dynamic storage device, coupled to bus 1902 to store information and instructions to be executed by processor 1904. Main memory 1906 can also be used to store temporary variables or other intermediate information during execution of instructions to be executed by processor 1904. Such instructions, when stored on non-transitory storage media accessible to processor 1904, make computer system 1900 a special-purpose machine that is customized to perform the operations specified in the instructions.
[00135] Computer system 1900 further includes a read-only memory (ROM) 1908 or other static storage device coupled to bus 1902 for storing static information and instructions for processor 1904. A storage device 1910, such as a magnetic disk, optical disk, or solid state drive, is provided and coupled to bus 1902 to store information and instructions.
[00136] Computer system 1900 can be coupled via bus 1902 to a display 1912, such as a cathode ray tube (CRT), to display information to a computer user. An input device 1914, including alphanumeric and other keys, is coupled to bus 1902 to communicate information and command selections to processor 1904. Another type of user input device is cursor control 1916, such as a mouse, a trackball, or cursor direction keys to communicate direction information and command selections to processor 1904 and to control cursor movement on display 1912. This input device typically has two degrees of freedom along two axes, a first axis (for example, x) and a second axis (for example, y), which allows the device to specify positions in a plane.
[00137] Computer system 1900 can implement the techniques described herein using custom hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which, in combination with the computer system, causes or programs computer system 1900 to be a special purpose machine. According to one embodiment, the techniques described herein are performed by computer system 1900 in response to processor 1904 executing one or more sequences of one or more instructions contained in main memory 1906. Such instructions can be read into main memory 1906 from another storage medium, such as storage device 1910. Execution of the sequences of instructions contained in main memory 1906 causes processor 1904 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
[00138] The term "storage media", as used herein, refers to any non-transitory media that stores data and/or instructions that cause a machine to operate in a specific manner. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as storage device 1910. Volatile media includes dynamic memory, such as main memory 1906. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[00139] Storage media is distinct from, but may be used in conjunction with, transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 1902. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
[00140] Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 1904 for execution. For example, the instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 1900 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector can receive the data carried in the infrared signal, and appropriate circuitry can place the data on bus 1902. Bus 1902 carries the data to main memory 1906, from which processor 1904 retrieves and executes the instructions. The instructions received by main memory 1906 may optionally be stored in storage device 1910 either before or after execution by processor 1904.
[00141] Computer system 1900 also includes a communication interface 1918 coupled to bus 1902. Communication interface 1918 provides a two-way data communication coupling to a network link 1920 that is connected to a local network 1922. For example, communication interface 1918 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 1918 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 1918 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
[00142] Network link 1920 typically provides data communication through one or more networks to other data devices. For example, network link 1920 may provide a connection through local network 1922 to a host computer 1924 or to data equipment operated by an Internet Service Provider (ISP) 1926. ISP 1926 in turn provides data communication services through the worldwide packet data communication network now commonly referred to as the "Internet" 1928. Local network 1922 and Internet 1928 both use electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 1920 and through communication interface 1918, which carry the digital data to and from computer system 1900, are example forms of transmission media.
[00143] Computer system 1900 can send messages and receive data, including program code, through the network(s), network link 1920 and communication interface 1918. In the Internet example, a server 1930 might transmit requested code for an application program through Internet 1928, ISP 1926, local network 1922 and communication interface 1918.
[00144] The received code may be executed by processor 1904 as it is received, and/or stored in storage device 1910 or other non-volatile storage for later execution.
Claims:
Claims (14)
[0001]
1. A soil imaging system comprising: at least one working layer sensor (100) on an agricultural implement; and a monitor (300) in communication with the working layer sensor (100) and adapted to generate a working layer image (110) of a soil layer based on the electromagnetic field (102) generated through the soil layer; the soil imaging system characterized in that: the working layer sensor (100) generates an electromagnetic field (102) through a seed trench (10) as the agricultural implement traverses a field, wherein the working layer sensor (100) comprises a first transmitter, a second transmitter, a third transmitter, a first receiver, a second receiver, and a third receiver; the second receiver rearward of the second transmitter over the seed trench (10) in a first linear direction, which is a direction of travel of the agricultural implement; the first transmitter and the third transmitter on opposite sides of the seed trench (10) in a second linear direction with the second transmitter, the second linear direction being different from the first linear direction; the first receiver and the third receiver on opposite sides of the seed trench (10) in a third linear direction with the second receiver, the third linear direction being different from the first linear direction; and the first receiver and the third receiver rearward of the first transmitter and the third transmitter, respectively, in the first linear direction.
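The spatial relations recited in claim 1 can be sketched in code. This is a hypothetical illustration only: the `Element` class, coordinate values, and helper names are assumptions, not part of the patent; the x axis stands in for the first linear direction (direction of travel) and y for the lateral direction across the seed trench.

```python
from dataclasses import dataclass

@dataclass
class Element:
    """One transmitter or receiver in the working layer sensor array (hypothetical)."""
    name: str
    x: float  # position along the direction of travel, in meters
    y: float  # lateral position; 0.0 is the seed trench centerline, in meters

# Transmitters: T2 over the trench; T1 and T3 on opposite sides of it.
t1, t2, t3 = Element("T1", 0.0, -0.15), Element("T2", 0.0, 0.0), Element("T3", 0.0, 0.15)
# Receivers: each rearward of its corresponding transmitter, R2 over the trench.
r1, r2, r3 = Element("R1", -0.10, -0.15), Element("R2", -0.10, 0.0), Element("R3", -0.10, 0.15)

def rearward_of(a: Element, b: Element) -> bool:
    """True if a sits behind b along the direction of travel."""
    return a.x < b.x

def opposite_sides(a: Element, b: Element) -> bool:
    """True if a and b lie on opposite sides of the trench centerline."""
    return a.y * b.y < 0

# The claimed relations hold for this example layout.
assert rearward_of(r2, t2)
assert opposite_sides(t1, t3) and opposite_sides(r1, r3)
assert rearward_of(r1, t1) and rearward_of(r3, t3)
```

The specific offsets (0.15 m lateral, 0.10 m rearward) are arbitrary; the claim constrains only the relative geometry, not the dimensions.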
[0002]
2. Soil imaging system according to claim 1, characterized in that the working layer sensor (100) is mounted on a planter row unit (200) or on a planter, and the monitor (300) is inside a cab of a tractor connected to the planter.
[0003]
3. Soil imaging system according to claim 1, characterized in that the working layer sensor (100) is configured to generate a reference image (110A) by generating an electromagnetic field (102) through undisturbed soil; wherein the monitor (300) is configured to compare at least one feature of the reference image (110A) with at least one feature of the working layer image (110) to generate a characterized image of the working layer of interest.
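One simple way to realize the comparison recited in claim 3 is per-pixel differencing of the working layer image against the undisturbed-soil reference. The sketch below is an illustrative assumption, not the patent's algorithm: the array shapes, response values, and the −0.2 threshold are invented for the example.

```python
# Reference image: uniform response over undisturbed soil (arbitrary units).
reference = [[0.80] * 6 for _ in range(4)]

# Working layer image: same scene, but the cut trench lowers the response
# in a 2x2 patch (rows 2-3, columns 2-3).
working = [row[:] for row in reference]
for r in (2, 3):
    for c in (2, 3):
        working[r][c] = 0.35

# Characterized image: per-pixel deviation of the working layer from reference.
characterized = [
    [w - ref for w, ref in zip(w_row, ref_row)]
    for w_row, ref_row in zip(working, reference)
]

# Count pixels that differ markedly from undisturbed soil (threshold assumed).
disturbed = sum(v < -0.2 for row in characterized for v in row)
print(disturbed)  # → 4
```

In practice a real sensor image would be noisy, so the thresholding step would likely be preceded by filtering and calibration against the reference.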
[0004]
4. Soil imaging system according to claim 3, characterized in that the monitor (300) is configured to affect operational control of the agricultural implement based on the characterized image using one or more actuators on the agricultural implement.
[0005]
5. Soil imaging system according to claim 4, characterized in that the monitor (300) is configured to affect operational control of the agricultural implement based on the characterized image by performing one or more of: actuating a row cleaner actuator (276) to increase row cleaner downforce; actuating a downforce control system (214) to increase downforce; actuating a depth adjustment actuator (234) to increase or decrease trench depth; actuating a trench closing wheel assembly actuator (256) to increase downforce on closing wheels (254); or actuating a packer wheel assembly actuator (266) to increase downforce on a packer wheel (264).
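The control responses enumerated in claim 5 amount to a mapping from a detected trench condition to an actuator command. A minimal sketch of such a dispatch table follows; the condition names, actuator names, and `respond` function are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical lookup from a condition detected in the characterized image
# to the actuator and adjustment claim 5 associates with it.
ACTUATOR_RESPONSES = {
    "residue_in_trench": ("row_cleaner_actuator", "increase_downforce"),
    "trench_too_shallow": ("depth_adjustment_actuator", "increase_depth"),
    "trench_too_deep": ("depth_adjustment_actuator", "decrease_depth"),
    "voids_in_trench": ("closing_wheel_actuator", "increase_downforce"),
    "poor_seed_firming": ("packer_wheel_actuator", "increase_downforce"),
}

def respond(condition: str) -> tuple[str, str]:
    """Return the (actuator, command) pair for a detected trench condition."""
    if condition not in ACTUATOR_RESPONSES:
        raise ValueError(f"no response defined for condition: {condition}")
    return ACTUATOR_RESPONSES[condition]

print(respond("trench_too_shallow"))  # → ('depth_adjustment_actuator', 'increase_depth')
```

A production controller would additionally rate-limit commands and confirm the adjustment in subsequent images, but the claim itself only requires the condition-to-actuator association shown here.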
[0006]
6. Soil imaging system according to claim 1, characterized in that: the monitor (300) is configured to perform one or more operations of: identifying that a seed is irregularly spaced in the trench; identifying that a seed is not uniformly deposited at a base of the trench; or identifying that a spacing of a seed in the trench does not correspond to an anticipated seed spacing; and the monitor (300) is configured, in response to any of the identifying operations, to display a recommendation to decrease planting speed or increase downforce.
[0007]
7. Soil imaging system according to claim 1, characterized in that the working layer sensor (100) comprises any one of: a ground-penetrating radar system, an ultrasound system, an audible-range sound system, or an electrical current system.
[0008]
8. Soil imaging system according to claim 1, characterized in that the monitor (300) is adapted to generate the working layer image (110) comprising one or more of: trench depth, trench shape, seed depth, seed depth relative to trench depth, crop residue in the trench, or voids in the trench.
[0009]
9. Soil imaging system according to claim 1, characterized in that at least the first transmitter has an operating frequency from 1 GHz to 100 GHz.
[0010]
10. Soil imaging system according to claim 1, characterized in that at least one transmitter is over the seed trench (10) and has a transmission angle such that a signal from the at least one transmitter reaches each side of the seed trench (10).
[0011]
11. Soil imaging system according to claim 1, characterized in that the working layer sensor (100) is one of: a phased array radar and a Doppler radar.
[0012]
12. A soil imaging system comprising: at least one working layer sensor (100) on an agricultural implement; and a monitor (300) in communication with the working layer sensor (100) and adapted to generate a working layer image (110) of the soil layer based on the electromagnetic field (102) generated through the soil layer; the soil imaging system characterized in that the working layer sensor (100) generates an electromagnetic field (102) through a seed trench (10) as the agricultural implement traverses a field; wherein the working layer sensor (100) comprises a transmitter, a first receiver, a second receiver, and a third receiver; the second receiver rearward of the transmitter over the seed trench (10) in a first linear direction, which is a direction of travel of the agricultural implement; the first receiver and the third receiver on opposite sides of the seed trench (10) in a second linear direction with the second receiver, the second linear direction being different from the first linear direction; the first receiver rearward of the transmitter in a third linear direction which is different from the first linear direction; and the third receiver rearward of the transmitter in a fourth linear direction which is different from the first linear direction.
[0013]
13. Soil imaging system according to claim 12, characterized in that the transmitter is over the seed trench (10) and has a transmission angle such that a signal from the transmitter reaches each side of the seed trench (10).
[0014]
14. Soil imaging system according to claim 12, characterized in that the working layer sensor (100) is one of: a phased array radar and a Doppler radar.
Similar technologies:
Publication number | Publication date | Patent title
BR112019009310B1|2021-06-08|ground imaging systems
US10827669B2|2020-11-10|Method for recommending seeding rate for corn seed using seed type and sowing row width
US10769733B2|2020-09-08|Forecasting national crop yield during the growing season using weather indices
US20200279179A1|2020-09-03|Computer-implemented calculation of corn harvest recommendations
BR112017026437B1|2022-01-18|COMPUTER SYSTEM AND COMPUTER DEPLOYED METHOD FOR MONITORING ONE OR MORE FIELDS OPERATIONS
US20200034759A1|2020-01-30|Generating agronomic yield maps from field health imagery
BR112021007121A2|2021-07-20|use of machine learning-based seed crop moisture predictions to improve a computer-assisted farm farming operation
BR112021001667A2|2021-05-04|automatic yield forecast and sowing rate recommendation based on weather data
US20210267117A1|2021-09-02|Flagging operational differences in agricultural implements
US20200045898A1|2020-02-13|Digital nutrient models using spatially distributed values unique to an agronomic field
US20220067027A1|2022-03-03|Automated detection of errors in location data in agricultural operations
US20200201269A1|2020-06-25|Utilizing spatial statistical models for implementing agronomic trials
Patent family:
Publication number | Publication date
EP3534685A4|2020-10-21|
WO2018085095A1|2018-05-11|
US10768331B2|2020-09-08|
AR110141A1|2019-02-27|
EP3534685A1|2019-09-11|
BR112019009310A2|2019-10-15|
AU2017355315A1|2019-06-13|
ZA201903414B|2020-09-30|
US20180128933A1|2018-05-10|
CA3043200A1|2018-05-11|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

US5598794A|1995-02-13|1997-02-04|Fluid Power Industries, Inc.|High accuracy automatically controlled variable linear seed spacing planting apparatus|
US6082466A|1998-10-28|2000-07-04|Caterpillar Inc.|Rowcrop machine guidance using ground penetrating radar|
US6501413B2|1999-03-22|2002-12-31|Sensors & Software Inc.|Timing and control and data acquisition for a multi transducer ground penetrating radar system|
US6700526B2|2000-09-08|2004-03-02|Witten Technologies Inc.|Method and apparatus for identifying buried objects using ground penetrating radar|
US6597992B2|2001-11-01|2003-07-22|Soil And Topography Information, Llc|Soil and topography surveying|
DE10214648A1|2002-04-02|2003-10-16|Claas Selbstfahr Erntemasch|Measuring device on an agricultural machine|
US7467810B2|2004-07-07|2008-12-23|Sensors & Software Inc.|Apparatus for transporting a sensor|
US20080087447A1|2006-10-16|2008-04-17|Richard Paul Piekutowski|Control and method of control for an earthmoving system|
US8618465B2|2008-11-13|2013-12-31|Deere & Company|Seed sensor system and method for improved seed count and seed spacing|
KR101011575B1|2010-04-13|2011-01-27|경북대학교 산학협력단|Non invasive method and apparatus for selecting superior seed|
US8843281B2|2010-09-17|2014-09-23|Kinze Manufacturing, Inc.|Seed characteristic sensor|
US9743578B2|2011-04-27|2017-08-29|Kinze Manufacturing, Inc.|Agricultural devices, systems, and methods for determining soil and seed characteristics and analyzing the same|
US8869629B2|2012-03-08|2014-10-28|Cnh Industrial Canada, Ltd.|System and method for monitoring agricultural product delivery|
CA2888970C|2012-10-24|2020-12-08|Precision Planting Llc|Agricultural trench depth sensing systems, methods, and apparatus|
RU2537908C2|2013-03-06|2015-01-10|Государственное научное учреждение Агрофизический научно-исследовательский институт Российской академии сельскохозяйственных наук |Device for subsurface measuring agrotechnological characteristics of soil arable layer in motion|
US10145837B2|2013-03-14|2018-12-04|Robert Ernest Troxler|Systems and methods for asphalt density and soil moisture measurements using ground penetrating radar|
US8849523B1|2013-05-20|2014-09-30|Elwha Llc|Systems and methods for detecting soil characteristics|
US9401030B2|2014-04-25|2016-07-26|Tazco Soil Service Co.|Image processing system for soil characterization|
US10262413B2|2015-05-08|2019-04-16|The Climate Corporation|Work layer imaging and analysis for implement monitoring, control and operator feedback|
WO2017197274A1|2016-05-13|2017-11-16|Precision Planting Llc|Seed trench closing sensors|
US10867377B2|2016-12-21|2020-12-15|Massachusetts Institute Of Technology|Determining soil state and controlling equipment based on captured images|
US10262413B2|2015-05-08|2019-04-16|The Climate Corporation|Work layer imaging and analysis for implement monitoring, control and operator feedback|
WO2018049289A1|2016-09-09|2018-03-15|Cibo Technologies, Inc.|Systems for adjusting agronomic inputs using remote sensing, and related apparatus and methods|
US10231376B1|2017-08-31|2019-03-19|Cnh Industrial America Llc|Systems and method for determining trench closure by a planter or seeder|
US10473559B2|2017-10-10|2019-11-12|James A. Case|Automated soil sample collection machine with scraper for removing surface debris|
US10477756B1|2018-01-17|2019-11-19|Cibo Technologies, Inc.|Correcting agronomic data from multiple passes through a farmable region|
US20190387668A1|2018-06-25|2019-12-26|Deere & Company|Prescription cover crop seeding with combine|
US20200000006A1|2018-06-29|2020-01-02|Deere & Company|Method of mitigating compaction and a compaction mitigation system|
FR3083959B1|2018-07-18|2020-08-14|Ev Dev|SEEDER AND SEEDER ASSEMBLY COUPLABLE TO A TRACTOR UNIT, WITH A VIEW TO THE REALIZATION OF THE SAID SEEDER|
US11147204B2|2018-08-07|2021-10-19|Cnh Industrial America Llc|System and related methods for adjusting a down force applied to a row unit of an agricultural implement|
EP3836783A1|2018-08-13|2021-06-23|Farmwise Labs, Inc.|Method for autonomous detection of crop location based on tool depth and location|
WO2020039322A1|2018-08-24|2020-02-27|Precision Planting Llc|Agricultural trench depth sensing systems, methods, and apparatus|
US10820468B2|2018-09-19|2020-11-03|Cnh Industrial America Llc|System and method for determining soil roughness of a field across which an agricultural implement is being moved based on ground engaging tool acceleration|
US10813265B2|2018-09-27|2020-10-27|Cnh Industrial America Llc|System and method for automatically resetting ground engaging tools|
US10959367B2|2018-10-03|2021-03-30|Cnh Industrial America Llc|System and method for controlling an agricultural tillage implement|
US11051446B2|2018-10-05|2021-07-06|Cnh Industrial America Llc|System and related methods for monitoring and adjusting actual seed depths during a planting operation|
US10973171B2|2018-10-17|2021-04-13|Cnh Industrial America Llc|System and method for monitoring field profiles based on data from multiple types of sensors|
DE102018221250A1|2018-12-07|2020-06-10|Robert Bosch Gmbh|Method and system for controlling an agricultural machine|
US11191204B2|2019-02-18|2021-12-07|Cnh Industrial Canada, Ltd.|System and method for monitoring soil conditions within a field|
US11071245B2|2019-04-25|2021-07-27|Deere & Company|Seeding machine including variable force trench closer|
US11202402B2|2019-07-24|2021-12-21|Cnh Industrial America Llc|System and method for identifying plugging of ground engaging tools based on wireless signal detection|
Legal status:
2020-12-15| B06A| Patent application procedure suspended [chapter 6.1 patent gazette]|
2021-03-30| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-06-08| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 25/10/2017, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Application date | Patent title
US201662418645P| true| 2016-11-07|2016-11-07|
US62/418,645|2016-11-07|
PCT/US2017/058332|WO2018085095A1|2016-11-07|2017-10-25|Work layer imaging and analysis for implement monitoring, control and operator feedback|