Produced by the International Association for the Plant Protection Sciences (IAPPS). To join IAPPS and receive the Crop Protection journal online go to: www.plantprotection.org
In farming, weeds can strangle crops and destroy yields. Unfortunately, spraying herbicides to deal with the unwanted plants pollutes the environment and harms human health, and there simply aren’t enough workers to tackle all the weeds by hand.
A new startup called FarmWise has come up with a solution: autonomous weeding robots that use artificial intelligence to cut out weeds while leaving crops untouched, according to an MIT report published on Thursday.
“We have a growing population, and we can’t expand the land or water we have, so we need to drastically increase the efficiency of the farming industry,” company co-founder Sebastien Boyer told MIT. “I think AI and data are going to be major players in that journey.”
The company currently boasts two weeding robots: the Titan and the Vulcan. Both are powered by an AI that directs hundreds of tiny blades to snip out weeds around each crop without harming the healthy plants. Both also allow for human supervision as the robots work to remove the pesky weeds.
But that’s not all.
More than just weeding
FarmWise now has over 15,000 commercial hours under its belt and has ambitious plans to use the data it collects for more than just weeding.
“It’s all about precision,” Boyer said. “We’re going to better understand what the plant needs and make smarter decisions for each one. That will bring us to a point where we can use the same amount of land, much less water, almost no chemicals, much less fertilizer, and still produce more food than we’re producing today. That’s the mission. That’s what excites me.”
Boyer added that his company’s mission is to turn AI into a tool that is as reliable and dependable as GPS is now in the farming industry.
“Twenty-five years ago, GPS was a very complicated technology. You had to connect to satellites and do some crazy computation to define your position. But a few companies brought GPS to a new level of reliability and simplicity. Today, every farmer in the world uses GPS. We think AI can have an even deeper impact than GPS has had on the farming industry, and we want to be the company that makes it available and easy to use for every farmer in the world,” Boyer concluded in the report.
Fusarium head blight, commonly called scab disease, is a highly destructive wheat disease that leads to substantial yield loss and contamination of wheat grain with deoxynivalenol.
Jessica Rutkoski, pictured, is part of a University of Illinois team using cell phone images and AI to detect fungal toxins in wheat kernels. The goal is to quickly identify wheat lines with lower susceptibility to the fungus, making it easier to breed for disease resistance in the crop. Image Credit: University of Illinois College of ACES
Deoxynivalenol (DON) is a mycotoxin that can cause adverse health effects in humans and animals. Phenotyping for Fusarium-damaged kernels (FDKs) provides an accurate assessment of resistance to accumulation of DON; however, it is a time-consuming and subjective process.
A study published in The Plant Phenome Journal implemented sophisticated object recognition technology to filter DON-contaminated wheat kernels out of the food supply chain and to assist scientists in developing wheat with stronger resistance to FHB.
Fusarium Head Blight – A Significant Threat to Wheat
Fusarium head blight (FHB) is a serious disease of wheat that has caused billions of dollars in crop losses to date. FHB causes deoxynivalenol to build up in wheat grains. DON is a mycotoxin belonging to the trichothecene family of vomitoxins. FHB is of great concern because DON ingestion from infected wheat end products has detrimental health effects in people and animals.
In humans, DON consumption may cause nausea, headaches, vomiting, and diarrhea. The adverse health consequences of DON consumption differ amongst animals, but most typically result in weight loss, nutritional deficiencies, and immunological deficiencies.
Detecting Fusarium-Damaged Kernels Using AI
FDK is a well-established visual assessment of grain damage caused by Fusarium, observed post-harvest. It is used as a ‘proxy’ phenotype to indirectly select for resistance to DON accumulation within the grain.
The team developed a simple and user-friendly method to identify FDKs by training a convolutional neural network (CNN) model on images of healthy and infected wheat kernels.
The images were taken with a smartphone and uploaded to the app, which then used the trained CNN model to determine the percentage of infected kernels. The model achieved an accuracy of around 90% in detecting FDKs in wheat, which was comparable to manual FDK counting.
While alternative techniques for quantifying DON levels in wheat grain samples exist, they entail lab-intensive tests such as mass spectrometry (MS) and enzyme-linked immunosorbent assays (ELISA), which can be costly and time-consuming.
The CNN model used in the study was trained on numerous images of wheat kernels taken with a smartphone, half of which were healthy, and the other half were infected with Fusarium graminearum.
The model was then used to classify new images of kernels as healthy or infected. The researchers tested the model on additional images of wheat kernels, achieving a high accuracy in detecting FDKs in wheat.
Girish Chowdhary, an author of the study, remarked on the novelty of their research, “One of the unique things about this advance is that we trained our network to detect minutely damaged kernels with good enough accuracy using just a few images. We made this possible through meticulous pre-processing of data, transfer learning, and bootstrapping of labeling activities.”
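The classification step Chowdhary describes can be illustrated with a toy forward pass. The sketch below is not the study’s model: the 3x3 averaging filter, the bias, and the pixel statistics are hypothetical stand-ins for learned weights, chosen only to show how a convolutional layer plus pooling and a sigmoid turns a kernel image into a healthy-versus-infected probability.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def classify(img, kernel, bias=0.0):
    """Forward pass: convolution -> ReLU -> global average pool -> sigmoid."""
    feat = np.maximum(conv2d(img, kernel), 0.0)  # ReLU activation
    score = feat.mean() + bias                   # global average pooling
    return 1.0 / (1.0 + np.exp(-score))          # sigmoid probability

# Hypothetical data: damaged kernels imaged as brighter, chalky regions.
rng = np.random.default_rng(0)
healthy = rng.normal(0.2, 0.05, (16, 16))
damaged = rng.normal(0.8, 0.05, (16, 16))
edge = np.ones((3, 3)) / 9.0  # averaging filter standing in for learned weights
bias = -0.45                  # hand-set decision threshold for this toy example
p_healthy = classify(healthy, edge, bias)  # below 0.5
p_damaged = classify(damaged, edge, bias)  # above 0.5
```

In the real pipeline, the weights would come from a pretrained network fine-tuned on labeled kernel images, and the per-kernel probabilities would be aggregated into the percent-FDK figure reported per sample.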
Potential Applications
According to the researchers, the mobile app has the potential to make the process of phenotyping for FDKs more accessible and affordable, especially in developing countries where laboratory assays are not readily available.
It can also be used in the field to identify infected wheat kernels, enabling farmers to monitor FHB in real time and take necessary measures to minimize yield loss and mycotoxin contamination.
The app can also help researchers and industries to screen large numbers of wheat varieties for resistance to FHB and DON accumulation.
The CNN model can be fine-tuned to identify specific resistance mechanisms and to develop wheat varieties that are resistant to FHB and have low DON levels, thus contributing to global food safety and security.
Fusarium head blight remains one of the most destructive diseases in wheat, resulting in significant yield losses and the contamination of wheat grain with deoxynivalenol.
Phenotyping for Fusarium-damaged kernels is a critical component of identifying resistance to DON accumulation in wheat, but manual phenotyping can be time-consuming.
This study has developed and tested an open-access and easy-to-use method for the phenotyping of FDKs using a convolutional neural network trained on cell phone images.
The method achieved an accuracy of around 90% when tested on a separate dataset, demonstrating its potential to greatly improve the efficiency and accuracy of FDK phenotyping.
Future research in this area could focus on further refining the CNN model, as well as combining this method with other technologies to develop a more comprehensive system for monitoring crop health and identifying disease outbreaks.
Reference
Wu, J., Ackerman, A., Gaire, R., Chowdhary, G., & Rutkoski, J. (2023). A neural network for phenotyping Fusarium-damaged kernels (FDKs) in wheat and its impact on genomic selection accuracy. The Plant Phenome Journal, 6(1). https://doi.org/10.1002/ppj2.20065
Shaheer is a graduate of Aerospace Engineering from the Institute of Space Technology, Islamabad, and works as a freelance consultant and technical writer.
UTA to use tiny sensors to track bugs and combat infestations
The University of Texas at Arlington is helping develop tiny sensors that attach to insects, tracking their movements and life cycles in an effort to combat infestations and increase farm production.
The project is led by computer science Professor Gautam Das and electrical engineering Professor Wei-Jen Lee, working with the U.S. Department of Agriculture (USDA). The $122,057 USDA grant runs through June 2023.
“This is a unique approach to the problem of infestations, and we hope to produce results that will allow us to expand our research later,” Das said. “The use of artificial intelligence in agriculture is a growing field, and this is just one small example of how it can make an impact.”
Das will work to develop a sensor that can be attached to the tarnished plant bug, a plant-feeding insect known to ruin crops of small fruits and vegetables. The sensors would relay information to a base station that tracks the insect’s coordinates and movements. Das and Jianzhong Su, professor and chair of mathematics, will perform data analysis to find patterns.
Lee will work on a radio-frequency identification (RFID) tag for the insects and use multiple readers to pinpoint their locations. A wireless sensor network will transmit data for analysis.
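Pinpointing a tag from multiple readers typically reduces to multilateration on the estimated ranges. The article does not describe the team’s algorithm; the sketch below is a generic least-squares formulation, with hypothetical reader positions and ranges.

```python
import numpy as np

def trilaterate(readers, dists):
    """Least-squares tag position from range estimates to three or more
    fixed readers. Subtracting the first range equation from the others
    linearizes the system:
        2 (r_i - r_0) . p = d_0^2 - d_i^2 + |r_i|^2 - |r_0|^2
    """
    readers = np.asarray(readers, dtype=float)
    dists = np.asarray(dists, dtype=float)
    r0, d0 = readers[0], dists[0]
    A = 2.0 * (readers[1:] - r0)
    b = (d0**2 - dists[1:]**2
         + np.sum(readers[1:]**2, axis=1) - np.sum(r0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical layout: three readers at the corners of a test room,
# tagged insect at (3, 4) meters.
est = trilaterate([(0, 0), (10, 0), (0, 10)],
                  [5.0, 65**0.5, 45**0.5])
```

With noisy real-world range readings, the same least-squares form simply returns the best-fit position, which is why more than three readers improve accuracy.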
The researchers must also develop a way to provide power to the sensor, possibly by tapping into the insect’s movements. The team is working with University of Central Florida mechanical engineering Assistant Professor Wendy Shen.
“Insects can positively or negatively affect agricultural quality and production,” Lee said. “Understanding their behavior is an important step to taking advantage of their benefits and mitigating potential damages. Applying advanced sensor technologies and artificial intelligence will have a profound impact on the future development of agriculture.”
The insects will be released into special rooms maintained by the USDA that have large spaces where plants are grown, and insects can fly around in a controlled environment. This way, the team can test its technology without worrying about negative impacts on actual crops.
Since 2020, the USDA and the National Science Foundation have poured millions of dollars into artificial intelligence research in agriculture. Su has led a university-wide research collaboration with the USDA since 2018 with researchers from the Colleges of Science and Engineering, through funding from an earlier USDA Hispanic Serving Institution grant focused on agriculture data and Internet of Things.
“We have built a good relationship with the USDA, and we are happy that they have provided funding for this project,” Das said. “Hopefully, this is the beginning of a series of opportunities.”
This is what you will see from an automated camera trap: a photo of sticky paper delineating target pests, in this example codling moth, provided by CropVue Technologies of British Columbia. Camera traps, manufactured by several companies but typically packaged with a service that includes weather sensors, artificial intelligence and entomology expertise, are becoming more common in tree fruit orchards. (Courtesy CropVue Technologies)
Editor’s note: Teah Smith did not collaborate with Washington State University in her 2015 test of Semios camera traps, as reported in the print version of the October 2021 issue. This online version of the story has been corrected. Good Fruit Grower regrets the error.
The combination of increasingly affordable technology and improved artificial intelligence has led to a rise in automated camera traps for growers considering new precision pest control tools.
Camera traps allow users to check pest pressure over hundreds of acres from a computer, as often as needed, instead of driving through row after row to check by hand maybe once a week, while artificial intelligence recognizes key insects and alerts growers when target bugs are caught.
Camera traps are an important tool for the future in the management of many pests, including codling moth, especially since market trends have the industry shifting toward organic methods, said Chris Adams, an Oregon State University assistant professor of tree fruit entomology and chair of the Washington Tree Fruit Research Commission codling moth task force.
“What we have left is to make smarter and more timely decisions, and I think camera traps help us do that,” Adams said.
Adams has trials underway with two camera trap vendors, Semios and Trapview.
The traps themselves catch insects on old-fashioned sticky paper, and an internal camera uploads one or more photos in the wee hours of each morning. But the traps are provided as part of a service plan that includes data from the traps, artificial intelligence to identify pests, miniature weather stations and pheromone emitters, all connected remotely. Vendors typically have a team of entomologists to monitor and provide quality control for the artificial intelligence.
Three examples
Good Fruit Grower interviewed representatives from Semios and CropVue Technologies, both based in Vancouver, British Columbia, and Trapview, a Slovenia-based company with North American offices in Vancouver, Washington. Other examples include Isagro of Italy, Adama of Israel, FarmSense of the United Kingdom, DTN of Minnesota and Pessl Instruments of Austria.
Semios has about 10,000 camera traps in specialty crops throughout the world, working with tree fruit growers since 2015, said James Watson, director of sales and marketing.
Trapview has thousands of clients in specialty crops globally but has been operating in the United States only since 2018, said Jorge Pacheco, the North American managing director. So far, the company has more presence in California vegetables than Northwest tree fruit.
CropVue Technologies entered the arena in 2019. The company currently supports about 5,000 acres with a few Washington pilot growers but is poised for a full commercial launch next year, said Terry Arden, CEO.
Left: CropVue camera traps come with a solar panel to charge the camera battery. Center: Camera traps built by Trapview, based in Slovenia, come with self-scrolling replacement sticky paper, a solar panel and a weather node. Right: A Semios camera trap hangs in a Washington apple orchard. (Left to right: Courtesy CropVue Technologies, Courtesy Trapview, Courtesy Semios)
All three use similar technology but different business models.
Semios directly works with and sells to growers. Its software acts as a one-stop shop, which will allow growers to pull data from companies Semios acquired over the summer, including the company that owns the ApRecs online spray recommendation writing tool. The company hangs the traps, replenishes liners, installs the tools, remotely monitors the functions and maintains all equipment.
Trapview and CropVue distribute through suppliers and management companies such as Wilbur-Ellis, G.S. Long or Chamberlin Agriculture.
“They have a direct connection with those growers with other inputs, not just pest monitoring,” said Pacheco of Trapview.
CropVue’s Arden agreed. “The distributors have long-term relationships with growers,” he said.
The trap companies also differ in connectivity. Hinging their future on the build-out of cellular IoT (Internet of Things) service, CropVue and Trapview install a network in which each device independently uploads data to the cloud.
Semios, which built its infrastructure before IoT, relies on a “meshed network” with repeaters that talk to a gateway, which in turn uploads bigger lumps of data to its cloud. However, the company has already deployed IoT in some spots.
Camera traps work with a network of in-orchard weather monitors and mapping software, shown in this screenshot from Semios, to help growers make pest control decisions. (Courtesy Semios)
Costs
Semios and Trapview declined to discuss pricing because each orchard requires a unique level of service. Trapview is still in the process of setting its subscription rates.
CropVue is shooting for roughly $25 per acre per year, assuming one trap and one canopy weather node for every 10 acres, but that’s flexible. Washington State University researchers recommend a ratio of one trap per 2.5 acres for codling moth. Other entomologists’ recommendations range from one trap per acre to one per five acres.
Sold individually, traps run $400 to $1,000 per season per trap, enough to be a barrier to entry, said Pete McGhee, research and development coordinator for Pacific Biocontrol Corp. in Corvallis, Oregon, and a former Michigan State University researcher who has worked with camera traps.
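Trap price and trap density together determine the per-acre cost growers actually compare. A quick calculation with the figures quoted here (the $400 to $1,000 price range and the density ratios come from the article; the pairings below are illustrative):

```python
# Back-of-the-envelope per-acre trap cost at the densities discussed:
# one trap per 10 acres (CropVue's assumption) up to one per 2.5 acres
# (the WSU codling moth recommendation).
def cost_per_acre(price_per_trap, acres_per_trap):
    """Seasonal trap cost spread over the acreage one trap covers."""
    return price_per_trap / acres_per_trap

low = cost_per_acre(400, 10)     # cheapest trap, sparsest density
high = cost_per_acre(1000, 2.5)  # priciest trap, densest recommendation
```

The spread (from $40 to $400 per acre per season) shows why the recommended trap density matters as much as the sticker price.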
But the price will come down and the technology will continue to improve in all the cameras. Resolution is getting sharper, artificial intelligence is getting better at recognizing species and processing power continues to increase, McGhee said.
The main benefit to the camera traps and surrounding services is recognizing the threshold early in the season for “setting the biofix,” McGhee said — triggering the phenology model that will give predictive advice for when to spray.
After that, growers or consultants can monitor progress.
His concern is that many of the vendors have not publicly validated their approaches against the growing degree-day models and thresholds based on 30 years of university research.
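The degree-day bookkeeping behind those university models is straightforward to sketch. The thresholds below (a 50°F lower and 88°F upper cutoff, and a 250 degree-day spray target after biofix) are commonly cited for codling moth but vary by model and region, so treat them as placeholders rather than the figures any particular vendor uses.

```python
def daily_dd(tmin, tmax, lower=50.0, upper=88.0):
    """Degree-days for one day by the simple averaging method,
    clamping temperatures to the insect's developmental thresholds."""
    tmin = min(max(tmin, lower), upper)
    tmax = min(max(tmax, lower), upper)
    return max((tmin + tmax) / 2.0 - lower, 0.0)

def spray_timing(temps, biofix_day, target_dd=250.0):
    """Accumulate degree-days starting at the biofix (first sustained
    trap catch) and return the day index when the target accumulation
    is reached, or None if it never is."""
    total = 0.0
    for day, (tmin, tmax) in enumerate(temps):
        if day < biofix_day:
            continue
        total += daily_dd(tmin, tmax)
        if total >= target_dd:
            return day
    return None

# Illustrative season: 30 days of 55/80 F lows and highs, biofix on day 0.
spray_day = spray_timing([(55, 80)] * 30, biofix_day=0)
```

Production models typically use a single-sine approximation rather than the simple average, but the structure, a biofix that starts the clock and a degree-day target that triggers advice, is the same.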
All three companies in this story say they have run trials with university researchers. Meanwhile, in addition to running trusted models, their own vast datasets and artificial intelligence can refine the models and apply them uniquely to each orchard and its microclimate.
“They are changing the way decisions are made,” said Watson of Semios.
One grower’s take
Teah Smith, entomologist and agricultural consultant for Zirkle Fruit, is a fan of camera traps. The company is based in Yakima, Washington, though she is based in the Wenatchee area.
Smith is responsible for steering pest control over 6,500 acres of orchards. She used to send her team of scouts to check traps every week. Camera traps save them time for other things, she said.
She first experimented with Semios camera traps in 2015 on two 100-acre orchards. She hung standard delta sticky traps next to the automated traps, alternating locations each week, and she found comparable catch rates. She also double-checked the computer results with her visual inspections and found similar data.
Convinced, she expanded the use of camera traps for monitoring leafroller and codling moth over a lot more Zirkle acreage. If the company experienced oriental fruit moth pressure, she would use the traps for that pest, too, she said.
Smith also believes she gets more accurate pheromone emission and timing with camera traps, hung one per eight acres.
She has experienced some data processing limits in areas. Another challenge is the rise of sterile insect release. Currently, somebody or something has to smoosh a moth to find out if it’s irradiated or not, and the camera traps can’t do that.
However, technology will overcome those problems, she said. “It’s definitely the wave of the future.”
Ross Courtney is an associate editor for Good Fruit Grower, writing articles and taking photos for the print magazine and website. He has a degree from Pacific Lutheran University.
Just as meteorologists incorporate data into models to forecast weather, ecological scientists are using data to improve forecasting of environmental events – including pest or pathogen spread. – Photo: NCSU/Vaclav Petras
North Carolina State University researchers have developed a computer simulation tool to predict when and where pests and diseases will attack crops.
The computer modeling system is called “PoPS”, for the Pest or Pathogen Spread Forecasting Platform. Working with the U.S. Department of Agriculture’s Animal and Plant Health Inspection Service, the North Carolina State University (NCSU) researchers created the tool to forecast any type of disease or pathogen, no matter the location.
Model improves by adding data
The system works by combining information on climate conditions suitable for spread of a certain disease or pest with data on where cases have been recorded, the reproductive rate of the pathogen or pest and how it moves in the environment. Over time, the model improves as natural resource managers add data they gather from the field. This repeated feedback with new data helps the forecasting system get better at predicting future spread, the researchers said.
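That feedback loop sits on top of a simple core: infested locations produce propagules in proportion to the reproductive rate and local climate suitability, and propagules disperse to nearby cells where they may or may not establish. The toy simulation below illustrates that structure only; it is not the PoPS implementation, and the dispersal kernel, rates, and grid are hypothetical.

```python
import random

def simulate_spread(infected, suitability, rate, steps, grid, seed=0):
    """Toy pest-spread model in the spirit of PoPS: each infested cell
    emits propagules scaled by rate and local suitability; each propagule
    moves to a neighboring cell (or stays) and establishes with
    probability equal to that cell's climate suitability."""
    rng = random.Random(seed)
    rows, cols = grid
    infected = set(infected)
    for _ in range(steps):
        new = set()
        for (r, c) in infected:
            n_propagules = int(rate * suitability.get((r, c), 0.0))
            for _ in range(n_propagules):
                dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)])
                rr, cc = r + dr, c + dc
                in_bounds = 0 <= rr < rows and 0 <= cc < cols
                if in_bounds and rng.random() < suitability.get((rr, cc), 0.0):
                    new.add((rr, cc))
        infected |= new
    return infected

# Hypothetical 5x5 landscape, fully suitable, one initial infestation.
full = {(r, c): 1.0 for r in range(5) for c in range(5)}
spread = simulate_spread({(2, 2)}, full, rate=4, steps=2, grid=(5, 5))
```

The calibration step the researchers describe corresponds to re-fitting the rate and dispersal parameters each time new field observations arrive, which is what makes the forecasts improve over time.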
Increasing number of threats to crops
According to NCSU this tool can be put into the hands of a non-technical user to learn about disease dynamics and management, and how management decisions will affect spread in the future. The researchers say the tool is needed as state and federal agencies in the U.S. charged with controlling pests and crop diseases face an increasing number of threats to crops, trees and other important natural resources.
PoPS used to track 8 emerging pests and diseases
Researchers have been using PoPS to track the spread of 8 different emerging pests and diseases. They are improving the model to track spotted lanternfly, an invasive pest in the United States that primarily infests a certain invasive type of tree known as “tree of heaven.” Spotted lanternfly has been infesting fruit crops in Pennsylvania and neighboring states since 2014. It can attack grape, apple and cherry crops, as well as almonds and walnuts.
The study, “Iteratively Forecasting Invasions with PoPS and a Little Help From Our Friends,” was published June 3, 2021, in the journal Frontiers in Ecology and the Environment.
“AI and robotics will bring us to the Olympic version of IPM”
“Data-driven growing is a big thing in horticulture in general. Many growers are into autonomous growing, data-driven greenhouse management, and advanced analytics. We’re convinced that this revolution will impact biological crop protection as well”, says Karel Bolckmans, COO with Biobest. “After all, if artificial intelligence (AI) can help you grow more efficiently and achieve higher yields, it will definitely render further improvement to your IPM program as well.”
“Since retailers want to offer a complete produce range year-round, for example of greenhouse tomatoes, and deal with as few suppliers as possible, we’re seeing an evolution toward rapid scale increase of greenhouse operations. Growers need to grow sufficient quantities of a complete offering twelve months per year, from cherry to beef tomato and everything in between. The result is bigger, multi-site, international companies that can be complex to control. Data-driven growing enables you to keep track”, Karel explains.
“We also see that data-driven growing performs much better than growers themselves when it comes to optimizing plant growth. We’ll be moving to grow based on hard data, not on gut feeling.”
“The same is true for IPM. The results of biocontrol-based IPM tools are largely dependent on knowing exactly what is going on in the greenhouse. The better you know how your plants, their pests, and their natural enemies are doing, the more efficiently and effectively you will be able to deploy your crop protection tools and the less chemical pesticide you will need to use.”
Partnerships and own development

In May last year, Biobest launched Crop-Scanner, which comprises a scouting app for recording the location, severity, and identity of pests and diseases in the crop. Clearly visualizing these data via its web-based interface, through heatmaps and graphs, gives the grower a better overview of the situation in his crop while allowing his Biobest advisor to give him the best possible technical advice.

More recently, Biobest also entered into a partnership with the Canadian company Ecoation, which developed a mobile data harvesting platform that combines deep biology, computer vision and sensor technology, artificial intelligence, and robotics. “We’ve been in touch for several years now and recently decided to work together on creating IPM 3.0. Their cameras, sensors, and autonomous vehicles allow us to collect the best possible data, which serve as input for an artificial intelligence-based Decision Support System (DSS) that allows us to provide growers with best-in-class technical advice regarding integrated pest and disease management (IPM)”, Karel explains. “At the same time, growers have been struggling with several severe virus outbreaks, of which ToBRFV and COVID were only a few. This has made it harder for us to frequently visit our customers in person to provide them with technical advice. But how do you get accurate information about the situation in the crop if you can’t visit? Ecoation’s web-based user interface allows for remote counseling, making frequent on-site technical visits unnecessary.”
There’s more…

Earlier this month, Biobest announced its investment in Arugga, an Israeli developer of a robotic tomato pollinator. It might look like an alternative to the Biobest bumblebees – and actually, it is. “But our goal is not to sell the most bumblebees or beneficial insects and mites. We want to be the grower’s most reliable provider of the most effective solutions in pollination and integrated pest management in a world characterized by rapid innovation.” Although this might sound like a big change in policy for the company, Karel emphasizes that it is not nearly as rash a decision as it might seem. “We’re convinced that having access to more accurate information on the status of pests, diseases, and natural enemies in their crop will allow growers to develop more trust in biocontrol-based IPM and reach for the pesticide bottle less quickly.”
“We have done extensive research for over three years, studying the available technologies and patents. That way, we concluded that Ecoation was a wonderful match, not only in terms of technology but also in vision and company culture. The same goes for Arugga. Their respective technologies support the horticultural business in dealing with the ever-increasing challenges of scale increase, labor shortage, and market demand.”
The technologies Biobest now participates in go beyond IPM. The Ecoation technology for example also concerns yield prediction, high-resolution climate measurements, and controlling the quality of crop work. “Through the Ecoation technology anomalies can be detected much earlier, that way predicting and preventing outbreaks of pests and diseases. Non-stop measuring everywhere is our ideal. This way we will learn more about the effect of climate on the plant and, more importantly, the effects of the crop protection measures.”
Karel notices an increasing interest of growers in this kind of technology. “There is an increasing market demand for residue-free fruits and vegetables. That’s the direction we’re heading to. Our aim is to help growers do this in the best way possible: with the support of robotics and AI.”
Data collection will convince more growers

He is convinced that the data that can be collected will convince more growers to start using the Ecoation and Arugga technologies. “We see now that pioneers in North America are highly interested and are currently successfully trialing these technologies. But it’s more than that: what we sell is a production increase, because of less plant stress from pests and diseases. Moreover, every single pesticide treatment causes plant stress and therefore negatively influences crop yield. This is very well known among experienced growers.”
He remembers that a couple of decades ago they saw the same thing, when growers started switching from chemical crop protection to IPM. “I vividly remember 2006-2007 in Spain, when many growers made the switch to biological control. They didn’t want to; they were forced by the retailers after the publication of a report on pesticide residues on Spanish produce by Greenpeace Germany. But at the end of that year, everybody was picking more and better peppers. In Kenya and elsewhere, rose growers who switch to biocontrol-based IPM pick more flowers, with longer stems and a better vase life. Stories like this have never been scientifically quantified and published, but they are very well known to everyone in the industry. With our technologies, we will be able to immediately and continuously measure the exact effects of IPM on crop yield. Less work, more objective data. That means harvesting more kilos with less effort. AI and robotics will bring us to the Olympic version of IPM.”

For more information:
Biobest
info@biobestgroup.com
www.biobestgroup.com