Date: Mon Aug 1, 2016
Time: 10:20 AM - 12:00 PM
Digital aerial imagery (DAI) of the crop canopy collected by aircraft and unmanned aerial vehicles is a cornerstone of precision agriculture. However, the quantitative use of this imagery is often limited by its variable characteristics, low quality, and lack of radiometric calibration. To increase the quality and utility of DAI in crop management, it is important to evaluate and address these limitations. Even though spatial resolution and ease of imagery access have improved, current DAI sources appear unable to meet end-user demand for products that provide more than an aesthetic, place-specific snapshot at a given time. The objective of this study was to establish a site to test imagery quality and different methods for radiometric calibration of DAI over time. A 120-ha study area located in Story County, Iowa was used during the 2015 growing season. Commercial calibration tarps with known reflectance values (3, 6, 12, 22, 44, and 56%) were deployed in a study area with two fields of corn (Zea mays L.) and two fields of soybean (Glycine max (L.) Merr.). Two commercial DAI providers collected 0.2 m and 0.5 m resolution multispectral (blue, green, red, and NIR) digital imagery every 10 to 12 days throughout the growing season. Empirical line and segmented linear regression calibration techniques were used to convert digital numbers (DNs) to percent reflectance, which in turn was used to create standardized NDVI maps that could be compared across time (date to date) and across space (field to field). Image processing challenges resulted from a highly non-linear mathematical relationship between DN values and percent reflectance, and from significant inaccuracies in the spatial registration of pixels from flight to flight. Even so, calibrated temporal vegetation indices were produced, allowing identification of agronomic areas that had similar vegetative health over time.
The analyses showed that it is critical for DAI providers to maintain the purest form of the original digital values (with minimal post-processing manipulation) to allow for radiometric calibration of the data for use in spatiotemporal vegetation indices, crop modeling, and any other standardized image comparisons for use in crop management.
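The empirical line calibration described above can be sketched as a per-band linear fit between the digital numbers observed over the tarps and their known reflectances. Only the six reflectance levels below come from the study; the tarp DN values and the example band arrays are fabricated for illustration.

```python
import numpy as np

# Known tarp reflectances (%) from the study, and hypothetical mean DN values
# extracted over each tarp in one band (these DN values are illustrative only).
tarp_reflectance = np.array([3, 6, 12, 22, 44, 56], dtype=float)
tarp_dn = np.array([31, 52, 95, 160, 278, 334], dtype=float)  # assumed

# Empirical line method: fit reflectance = gain * DN + offset for the band.
gain, offset = np.polyfit(tarp_dn, tarp_reflectance, 1)

def dn_to_reflectance(dn):
    """Convert raw digital numbers to percent reflectance for this band."""
    return gain * np.asarray(dn, dtype=float) + offset

# Apply the calibration to red and NIR bands (illustrative 2x2 DN arrays),
# then compute a calibrated NDVI that is comparable across dates.
red_dn = np.array([[120.0, 140.0], [90.0, 200.0]])
nir_dn = np.array([[300.0, 310.0], [280.0, 250.0]])
red = dn_to_reflectance(red_dn)
nir = dn_to_reflectance(nir_dn)
ndvi = (nir - red) / (nir + red)
```

In practice one such fit would be made per band and per flight date, which is why the abstract stresses access to unmanipulated original DN values.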
Cotton root rot, caused by the soilborne fungus Phymatotrichopsis omnivora, is a severe plant disease that has affected cotton production for over a century. Recent research found that a commercial fungicide, Topguard (flutriafol), was able to control this disease. As a result, Topguard Terra Fungicide, a new and more concentrated formulation developed specifically for this market, was registered in 2015, so cotton producers can now use this product to control the disease. Cotton root rot infects only isolated portions of the field and tends to recur in the same general areas within the field in successive years. This characteristic makes it an excellent candidate for site-specific management based on historical infection maps. Remote sensing in conjunction with image classification techniques has been used successfully to detect cotton root rot and create classification maps. Although these maps can be used directly as prescription maps for site-specific fungicide application, some practical issues need to be considered. Two of these are accommodating the variation or potential expansion of the disease over the years and accounting for the minimum areas that can be practically managed. Moreover, it is important to select an accurate and effective image classification method that can be easily implemented. The objective of this study was to develop practical procedures for creating prescription maps from remotely sensed imagery for site-specific treatment of cotton root rot. Airborne multispectral imagery taken of a cotton field with a history of cotton root rot in south Texas in 2002 and 2012 was used to illustrate the process. The images were rectified and resampled to the same pixel size (1 m) for the two years.
Normalized difference vegetation index (NDVI) images were generated, and unsupervised classification was then used to classify the NDVI images into root rot-infected and non-infected zones. Small inclusions within the dominant zones were eliminated using different area thresholds. Other artifacts, such as missing plants due to planter skips and crop damage caused by the wheel tracks of the center-pivot system, were merged into the non-infected zone. Change detection analysis was performed to assess the consistency of and change in root rot infection between the two growing seasons. To account for the potential expansion and temporal variation of the disease, buffer zones of 1-20 m around the infected areas were created and the effect of the buffers on treatment areas was analyzed. The selection of buffer distance and minimum management area for the prescription maps is discussed. This study demonstrates the practical procedures and considerations for creating prescription maps from historical images. The results will provide cotton producers, consultants, and service providers with practical guidelines for developing prescription maps for site-specific control of cotton root rot.
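The zoning steps above (two-class unsupervised split of the NDVI image, removal of small inclusions, and buffering of the infected zone) can be sketched as follows. The synthetic NDVI image, the 25-pixel minimum area, and the 5 m buffer are illustrative assumptions rather than the study's actual parameters, and the simple threshold iteration stands in for whichever unsupervised classifier the study used.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
# Synthetic 1 m resolution NDVI image: a low-NDVI (infected) patch in a
# healthy field. Values and patch location are fabricated for illustration.
ndvi = rng.normal(0.8, 0.03, size=(100, 100))
ndvi[30:50, 40:70] = rng.normal(0.3, 0.03, size=(20, 30))

# Two-class unsupervised split: iterate the threshold to the midpoint of
# the two class means (a 1-D stand-in for an unsupervised classifier).
thresh = ndvi.mean()
for _ in range(10):
    lo = ndvi[ndvi < thresh].mean()
    hi = ndvi[ndvi >= thresh].mean()
    thresh = (lo + hi) / 2.0
infected = ndvi < thresh

# Eliminate small inclusions below a minimum-area threshold (25 pixels,
# i.e. 25 m^2 at 1 m pixels; an assumed value).
labels, n = ndimage.label(infected)
sizes = ndimage.sum(infected, labels, index=range(1, n + 1))
for i, s in enumerate(sizes, start=1):
    if s < 25:
        infected[labels == i] = False

# Buffer the infected zone by ~5 m (five one-pixel dilations at 1 m pixels)
# to allow for expansion of the disease between seasons.
treat = ndimage.binary_dilation(infected, iterations=5)
```

The resulting `treat` mask is the kind of raster a prescription map would be derived from; varying the dilation count corresponds to the 1-20 m buffer analysis in the abstract.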
High-resolution satellite and aerial imagery enables multi-scale analysis that was previously impossible. We consider the task of localized linear regression and show that window-based techniques can return results at different length scales with very high efficiency. The ability to inspect multiple length scales is important for distinguishing factors that vary over different length scales. For example, variations in fertilization are expected to occur on shorter length scales than changes in soil type. We demonstrate the effectiveness of our approach on a small, agriculturally relevant use case in which regression lines are calculated for the dependency of yield on the normalized difference vegetation index (NDVI). This use case is relevant to the in-season estimation of yield (INSEY). Conventionally, yield vs. NDVI dependencies are established from data collected on test plots. However, the results from test plots may not be representative of the growing conditions in a particular production field. On the other hand, when production-field data are used, dependencies on soil types and other factors may interfere with the fertilization dependency that is of interest. Our approach promises to distinguish such factors, provided they result in variations on different scales. We compare our technique with geographically weighted regression (GWR). Even a single application of GWR takes over one hour for 10,000 data points, while our approach completes in under one minute while simultaneously returning multiple maps, each corresponding to a different resolution.
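A minimal sketch of the window-based idea, assuming gridded NDVI and yield values: fit a local regression of yield on NDVI in a square window around each cell, once per window size, so each length scale produces its own slope map. The data here are synthetic, and this naive loop is only a sketch of the concept, not the paper's optimized implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50  # 50 x 50 grid of cells, each with an NDVI and a yield value
ndvi = rng.uniform(0.3, 0.9, size=(n, n))
# Synthetic yield with a known linear NDVI dependency plus noise.
yield_ = 2.0 + 8.0 * ndvi + rng.normal(0.0, 0.5, size=(n, n))

def local_slopes(x, y, half):
    """Slope of y on x within a (2*half+1)^2 window around each cell."""
    out = np.full(x.shape, np.nan)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            xs = x[max(0, i - half):i + half + 1,
                   max(0, j - half):j + half + 1].ravel()
            ys = y[max(0, i - half):i + half + 1,
                   max(0, j - half):j + half + 1].ravel()
            out[i, j] = np.polyfit(xs, ys, 1)[0]
    return out

# One slope map per length scale; short windows respond to short-scale
# factors (e.g. fertilization), long windows to long-scale ones (e.g. soil).
maps = {half: local_slopes(ndvi, yield_, half) for half in (2, 5, 10)}
```

On this noise-only synthetic field every map's slopes scatter around the true value of 8; on real data, divergence between the short- and long-window maps is what flags scale-dependent factors.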
In precision agriculture, yield maps are important for farmers' planning. Farmers can manage a field better if a yield map can be created early in the season. In Florida, citrus is a very important agricultural product. To predict citrus production, a fruit detection method has to be developed, and the earlier the prediction can be made, the better the management plan that can follow. Thus, detection of fruit before the mature stage is desired. This study aims to develop a thermal-visible camera system that registers thermal images with visible images, so that information fusion can later be used to detect immature fruit. The registration method used in this study was based on photogrammetry and could be applied to register multiple cameras as well. The camera system used in the study consisted of two identical visible cameras and a thermal camera, mounted on a single frame in fixed positions. Bundle adjustment was utilized to calibrate the cameras' relative orientations with respect to each other and the intrinsic parameters of each camera. Image registration was conducted in real time after each set of images was taken by the three cameras. Common points of interest in the two visible images were selected by running random sample consensus (RANSAC). Coordinates of the corresponding points in the thermal image were calculated using the image intersection method. A transformation matrix was then solved from the selected corresponding points in the thermal image and one of the two visible images. Finally, image registration between the thermal image and the visible image was completed by applying the transformation to the thermal image. The stated method is expected to be fast and can be extended to systems with more cameras.
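The final registration step, solving a transformation matrix from corresponding points and applying it, can be sketched with a least-squares affine fit. This is a simplification: the study obtains its correspondences via RANSAC and image intersection, and the point coordinates and transform below are fabricated for illustration.

```python
import numpy as np

# Hypothetical pixel coordinates of the same features in the thermal image
# (src) and in one visible image (dst).
src = np.array([[10, 12], [200, 30], [40, 180], [220, 210], [120, 100]],
               dtype=float)
# Ground-truth 2x3 affine transform used only to fabricate the example
# correspondences.
A_true = np.array([[1.02, 0.05, 4.0],
                   [-0.03, 0.98, -2.5]])
dst = src @ A_true[:, :2].T + A_true[:, 2]

def fit_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine matrix mapping src_pts onto dst_pts."""
    X = np.hstack([src_pts, np.ones((len(src_pts), 1))])  # rows [x, y, 1]
    # Solve X @ A.T = dst_pts in the least-squares sense.
    A_T, *_ = np.linalg.lstsq(X, dst_pts, rcond=None)
    return A_T.T

A = fit_affine(src, dst)

def apply_affine(A, pts):
    """Map points through the fitted affine transform."""
    return pts @ A[:, :2].T + A[:, 2]
```

Applying `apply_affine` to every thermal pixel coordinate (or warping the thermal image with the matrix) completes the registration; a projective (homography) model could be substituted the same way if the scene geometry required it.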
Mapping the natural variability of crops and land is the first step of the management cycle in crop production. Several methods have been developed and employed for recording and analyzing the data that generate prescription maps, such as yield monitoring, soil mapping, and remote sensing. Although conventional remote sensing, capturing images via satellites, has been a very popular tool for monitoring the earth's surface, it has several drawbacks, such as the orbital period, unattended capture, and investment cost. By contrast, an unmanned aerial vehicle (UAV) is more flexible to deploy, better suited to monitoring small areas, and easy to acquire at low cost.
In this context, the objective of this study was to develop a low-cost and easy-to-implement technical solution to map spatial variability and to explore its relationship with crop conditions. The idea was to build a closed-loop process, from standardized UAV-based image acquisition to a simple but sound production of NDVI and derived prescription maps, especially for vineyards.
The main components of the image acquisition system are an unmanned aerial vehicle (UAV) and a modified commercial digital camera. Within the study, three different UAVs (two fixed-wing and one flying-wing) were built, based on commercial model airplanes fitting the purpose and using an open-source autopilot. As sensors, two small-format digital cameras (one Nikon and one Canon) were tested, some of which were modified in order to also acquire NIR radiation.
Laboratory tests were conducted to calibrate the cameras, after which the UAV-based image acquisition system was assembled. Further tests were planned to assess its practical use in situ.