
Archive for the ‘Artificial intelligence’ Category

‘Bud’, from Carbon Robotics

Robot lasers weeds from the fields without herbicide

Seattle autonomous robotics company Carbon Robotics aims to confront the multi-billion dollar global herbicide market with its laser-armed weed elimination robot. The machine, named “Bud”, rolls through farm fields, using artificial intelligence to discern weeds from crops and a high-power laser to kill the weeds. This will enable farmers to cultivate crops with less herbicide and reduced labor, improving crop yields and saving money.

https://www.youtube.com/embed/AP0yiOI8Qas

Bud’s robot brain is an Nvidia AI processor that gathers information from a dozen high-resolution cameras to feed its crop and weed computer vision models. Bud carries lighting so that it can illuminate the scene to let the cameras spot weeds at night.
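The decision logic downstream of such a vision model can be sketched in a few lines. The snippet below is a hypothetical illustration, not Carbon Robotics’ actual software: it assumes the cameras and model have already produced labeled detections, and it plans laser aim points only for weeds classified above a high confidence floor, since misfiring on a crop plant would destroy it.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str          # "crop" or "weed", as returned by the vision model
    confidence: float   # model confidence in [0, 1]
    x: float            # detection position in the frame (metres)
    y: float

def plan_laser_targets(detections: List[Detection],
                       min_confidence: float = 0.95) -> List[Tuple[float, float]]:
    """Return (x, y) aim points for every confidently classified weed.

    The high confidence floor is the point: misclassifying a crop
    plant as a weed would destroy it, so ambiguous detections are
    skipped rather than lasered.
    """
    return [(d.x, d.y) for d in detections
            if d.label == "weed" and d.confidence >= min_confidence]

# One simulated camera frame: two weeds (one uncertain) and one crop plant.
frame = [
    Detection("weed", 0.99, 0.12, 0.40),
    Detection("weed", 0.60, 0.30, 0.10),  # too uncertain: skipped, not lasered
    Detection("crop", 0.98, 0.00, 0.25),
]
print(plan_laser_targets(frame))  # [(0.12, 0.4)]
```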

Source: designnews.com

Read Full Post »

UTA to use tiny sensors to track bugs and combat infestations

The University of Texas at Arlington is helping develop tiny sensors that attach to insects, tracking their movements and life cycles in an effort to combat infestations and increase farm production.

The project is led by computer science Professor Gautam Das and electrical engineering Professor Wei-Jen Lee, working with the U.S. Department of Agriculture (USDA). The $122,057 USDA grant runs through June 2023.

“This is a unique approach to the problem of infestations, and we hope to produce results that will allow us to expand our research later,” Das said. “The use of artificial intelligence in agriculture is a growing field, and this is just one small example of how it can make an impact.”

Das will work to develop a sensor that can be attached to the tarnished plant bug, a plant-feeding insect known to ruin crops of small fruits and vegetables. The sensors would relay information to a base station that tracks the insect’s coordinates and movements. Das and Jianzhong Su, professor and chair of mathematics, will perform data analysis to find patterns.

Lee will work on a radio-frequency identification (RFID) tag for the insects and use multiple readers to pinpoint their locations. A wireless sensor network will transmit data for analysis.
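Pinpointing a tag from multiple readers typically reduces to trilateration: each reader yields an estimated distance, and the intersection of the distance circles gives a position. The sketch below is a generic textbook solution under idealized assumptions (exact distance estimates, a 2-D plane), not the UTA team’s actual method.

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Estimate a 2-D position from distances to three readers at known spots.

    Subtracting the three circle equations pairwise yields a linear
    2x2 system in (x, y), solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Three readers at known positions; distances measured to a tag at (3, 4).
readers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist((3.0, 4.0), r) for r in readers]
est = trilaterate(readers[0], dists[0], readers[1], dists[1], readers[2], dists[2])
print(tuple(round(v, 6) for v in est))  # (3.0, 4.0)
```

Real RFID ranging is noisy, so a deployed system would use more than three readers and a least-squares fit rather than this exact closed form.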

The researchers must also develop a way to provide power to the sensor, possibly by tapping into the insect’s movements. The team is working with University of Central Florida mechanical engineering Assistant Professor Wendy Shen.

“Insects can positively or negatively affect agricultural quality and production,” Lee said. “Understanding their behavior is an important step to taking advantage of their benefits and mitigating potential damages. Applying advanced sensor technologies and artificial intelligence will have a profound impact on the future development of agriculture.”

The insects will be released into special rooms maintained by the USDA that have large spaces where plants are grown, and insects can fly around in a controlled environment. This way, the team can test its technology without worrying about negative impacts on actual crops.

Since 2020, the USDA and the National Science Foundation have poured millions of dollars into artificial intelligence research in agriculture. Su has led a university-wide research collaboration with the USDA since 2018 with researchers from the Colleges of Science and Engineering, through funding from an earlier USDA Hispanic Serving Institution grant focused on agriculture data and Internet of Things.

“We have built a good relationship with the USDA, and we are happy that they have provided funding for this project,” Das said. “Hopefully, this is the beginning of a series of opportunities.” 

Source: www.uta.edu

Publication date: Fri 17 Sep 2021

Read Full Post »

A new image of pest control

Camera traps monitor pests and inform decisions remotely.

October 2021 Issue | Ross Courtney // October 21, 2021

This is what you will see from an automated camera trap: a photo of sticky paper delineating target pests, in this example codling moth, provided by CropVue Technologies of British Columbia. Camera traps, manufactured by several companies but typically packaged with a service that includes weather sensors, artificial intelligence and entomology expertise, are becoming more common in tree fruit orchards. (Courtesy CropVue Technologies)

Editor’s note: Teah Smith did not collaborate with Washington State University in her 2015 test of Semios camera traps, as reported in the print version of the October 2021 issue. This online version of the story has been corrected. Good Fruit Grower regrets the error.

The combination of increasingly affordable technology and improved artificial intelligence has led to a rise in automated camera traps for growers considering new precision pest control tools.

Camera traps allow users to check pest pressure over hundreds of acres from a computer, as often as needed, instead of driving through row after row to check by hand perhaps once a week. Meanwhile, artificial intelligence recognizes key insects and alerts growers when target bugs are caught.

Camera traps are an important tool for the future in the management of many pests, including codling moth, especially since market trends have the industry shifting toward organic methods, said Chris Adams, an Oregon State University assistant professor of tree fruit entomology and chair of the Washington Tree Fruit Research Commission codling moth task force. 

“What we have left is to make smarter and more timely decisions, and I think camera traps help us do that,” Adams said.

Adams has trials underway with two camera trap vendors, Semios and Trapview. 

The traps themselves catch insects on old-fashioned sticky paper, and an internal camera uploads one or more photos in the wee hours of each morning. But the traps are provided as part of a service plan that includes data from the traps, artificial intelligence to identify pests, miniature weather stations and pheromone emitters, all connected remotely. Vendors typically have a team of entomologists to monitor and provide quality control for the artificial intelligence.

Three examples

Good Fruit Grower interviewed representatives from Semios and CropVue Technologies, both based in Vancouver, British Columbia, and Trapview, a Slovenia-based company with North American offices in Vancouver, Washington. Other examples include Isagro of Italy, Adama of Israel, FarmSense of the United Kingdom, DTN of Minnesota and Pessl Instruments of Austria.

Semios has about 10,000 camera traps in specialty crops throughout the world, working with tree fruit growers since 2015, said James Watson, director of sales and marketing. 

Trapview has thousands of clients in specialty crops globally but has been operating in the United States only since 2018, said Jorge Pacheco, the North American managing director. So far, the company has more presence in California vegetables than Northwest tree fruit.

CropVue Technologies entered the arena in 2019. The company currently supports about 5,000 acres with a few Washington pilot growers but is poised for a full commercial launch next year, said Terry Arden, CEO. 

Left: CropVue camera traps come with a solar panel to charge the camera battery. Center: Camera traps built by Trapview, based in Slovenia, come with self-scrolling replacement sticky paper, a solar panel and a weather node. Right: A Semios camera trap hangs in a Washington apple orchard. (Left to right: Courtesy CropVue Technologies, Courtesy Trapview, Courtesy Semios)

All three use similar technology but different business models.

Semios directly works with and sells to growers. Its software acts as a one-stop shop, which will allow growers to pull data from companies Semios acquired over the summer, including the company that owns the ApRecs online spray recommendation writing tool. The company hangs the traps, replenishes liners, installs the tools, remotely monitors the functions and maintains all equipment.

Trapview and CropVue distribute through suppliers and management companies such as Wilbur-Ellis, G.S. Long or Chamberlin Agriculture. 

“They have a direct connection with those growers with other inputs, not just pest monitoring,” said Pacheco of Trapview. 

CropVue’s Arden agreed. “The distributors have long-term relationships with growers,” he said. 

The trap companies also differ in connectivity. Hinging their future on the build-out of cellular IoT (Internet of Things) service, CropVue and Trapview install a network in which each device independently uploads data to the cloud.

Semios, which built its infrastructure before IoT, relies on a “meshed network” with repeaters that talk to a gateway, which in turn uploads bigger lumps of data to its cloud. However, the company has already deployed IoT in some spots.

Camera traps work with a network of in-orchard weather monitors and mapping software, shown in this screenshot from Semios, to help growers make pest control decisions. (Courtesy Semios)

Costs

Semios and Trapview declined to discuss pricing because each orchard requires a unique level of service. Trapview is still in the process of setting its subscription rates.

CropVue is shooting for roughly $25 per acre per year, assuming one trap and one canopy weather node for every 10 acres, but that’s flexible. Washington State University researchers recommend a ratio of one trap per 2.5 acres for codling moth. Other entomologists suggest ratios ranging from one trap per acre to one trap per 5 acres.

Sold individually, traps run $400 to $1,000 per season per trap, enough to be a barrier to entry, said Pete McGhee, research and development coordinator for Pacific Biocontrol Corp. in Corvallis, Oregon, and a former Michigan State University researcher who has worked with camera traps. 
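The per-acre economics follow directly from trap price and trap density. A quick back-of-the-envelope calculation using the figures quoted above:

```python
def per_acre_cost(trap_price: float, acres_per_trap: float) -> float:
    """Seasonal trap cost spread over the acreage one trap covers."""
    return trap_price / acres_per_trap

# A $400 trap at a one-per-10-acres density vs. the denser
# one-per-2.5-acres ratio recommended by WSU for codling moth.
print(per_acre_cost(400, 10))    # 40.0
print(per_acre_cost(400, 2.5))   # 160.0
print(per_acre_cost(1000, 2.5))  # 400.0
```

At the denser research-recommended ratio, even the cheapest individually sold traps run well past CropVue’s $25-per-acre subscription target, which is the barrier to entry McGhee describes.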

But the price will come down and the technology will continue to improve in all the cameras. Resolution is getting sharper, artificial intelligence is getting better at recognizing species and processing power continues to increase, McGhee said.

The main benefit to the camera traps and surrounding services is recognizing the threshold early in the season for “setting the biofix,” McGhee said — triggering the phenology model that will give predictive advice for when to spray.

After that, growers or consultants can monitor progress.
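The biofix-and-degree-day workflow McGhee describes can be sketched numerically. The figures below (a 50°F lower threshold, an 88°F upper cutoff, and a first spray around 250 degree-days after biofix) are commonly published codling moth values, and the simple averaging method is only an approximation; growers should rely on locally validated models.

```python
def daily_degree_days(t_min, t_max, lower=50.0, upper=88.0):
    """Degree-days (deg F) by the simple averaging method, with both
    daily extremes clamped to the model's lower and upper thresholds."""
    t_min = min(max(t_min, lower), upper)
    t_max = min(max(t_max, lower), upper)
    return (t_min + t_max) / 2 - lower

def days_until_spray(temps, threshold=250.0):
    """Accumulate degree-days from biofix (day 1) and return the day
    the spray threshold is crossed, or None if it never is."""
    total = 0.0
    for day, (t_min, t_max) in enumerate(temps, start=1):
        total += daily_degree_days(t_min, t_max)
        if total >= threshold:
            return day
    return None

# With daily lows of 55 F and highs of 75 F (15 DD per day), the
# 250-DD mark after biofix falls on day 17.
print(days_until_spray([(55, 75)] * 30))  # 17
```

Setting the biofix accurately is what drives everything downstream, which is why a camera trap that catches the first sustained flight the night it happens is worth more than a weekly hand check.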

His concern is that many of the vendors have not publicly validated their approaches against the growing degree-day models and thresholds based on 30 years of university research.

All three companies in this story say they have run trials with university researchers. Meanwhile, in addition to running trusted models, their own vast datasets and artificial intelligence can refine the models and apply them uniquely to each orchard and its microclimate.

“They are changing the way decisions are made,” said Watson of Semios.

One grower’s take

Teah Smith, entomologist and agricultural consultant for Zirkle Fruit, is a fan of camera traps. The company is based in Yakima, Washington, though she is based in the Wenatchee area.

Smith is responsible for steering pest control over 6,500 acres of orchards. She used to send her team of scouts to check traps every week. Camera traps save them time for other things, she said. 

She first experimented with Semios camera traps in 2015 on two 100-acre orchards. She hung standard delta sticky traps next to the automated traps, alternating locations each week, and she found comparable catch rates. She also double-checked the computer results with her visual inspections and found similar data.

Convinced, she expanded the use of camera traps for monitoring leafroller and codling moth over a lot more Zirkle acreage. If the company experienced oriental fruit moth pressure, she would use the traps for that pest, too, she said.

Smith also believes she gets more accurate pheromone emission and timing with camera traps, hung one per eight acres.

She has experienced some data processing limits in areas. Another challenge is the rise of sterile insect release. Currently, somebody or something has to smoosh a moth to find out if it’s irradiated or not, and the camera traps can’t do that. 

However, technology will overcome those problems, she said. “It’s definitely the wave of the future.”


About the Author: Ross Courtney


Ross Courtney is an associate editor for Good Fruit Grower, writing articles and taking photos for the print magazine and website. He has a degree from Pacific Lutheran University. Contact: 509-930-8798.

Read Full Post »

Simulation tool predicts pests and disease spread in crops

Future Farming

08-06 | Updated on 19-07 | Crop solutions | Weed/Pest control | News

Just as meteorologists incorporate data into models to forecast weather, ecological scientists are using data to improve forecasting of environmental events - including pest or pathogen spread. - Photo: NCSU/Vaclav Petras

North Carolina State University researchers have developed a computer simulation tool to predict when and where pests and diseases will attack crops.

The computer modeling system is called “PoPS”, for the Pest or Pathogen Spread Forecasting Platform. Working with the U.S. Department of Agriculture’s Animal and Plant Health Inspection Service, the North Carolina State University (NCSU) researchers created the tool to forecast any type of disease or pathogen, no matter the location.

Model improves by adding data

The system works by combining information on climate conditions suitable for spread of a certain disease or pest with data on where cases have been recorded, the reproductive rate of the pathogen or pest and how it moves in the environment. Over time, the model improves as natural resource managers add data they gather from the field. This repeated feedback with new data helps the forecasting system get better at predicting future spread, the researchers said.
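The ingredients listed above (climate suitability, reproductive rate, movement) can be illustrated with a toy grid model. This is a deliberately simplified sketch of the idea, not the actual PoPS implementation:

```python
def step(grid, climate, reproduction=2.0, p_disperse=0.25):
    """One generation of a toy pest-spread model on an n x n grid.

    Each cell's population produces offspring scaled by local climate
    suitability; a fraction of the offspring disperse evenly to the
    four neighbouring cells, and the rest stay put.
    """
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            offspring = grid[i][j] * reproduction * climate[i][j]
            new[i][j] += offspring * (1 - p_disperse)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                if 0 <= i + di < n and 0 <= j + dj < n:
                    new[i + di][j + dj] += offspring * p_disperse / 4
    return new

# A single infested cell in the middle of a uniformly suitable field.
grid = [[0.0] * 3 for _ in range(3)]
grid[1][1] = 1.0
climate = [[1.0] * 3 for _ in range(3)]
after = step(grid, climate)
print(after[1][1], after[0][1])  # 2.5 0.125
```

In the real platform, parameters like the reproductive rate and dispersal behavior are what the repeated field-data feedback iteratively re-calibrates; here they are fixed constants.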

Increasing number of threats to crops

According to NCSU this tool can be put into the hands of a non-technical user to learn about disease dynamics and management, and how management decisions will affect spread in the future. The researchers say the tool is needed as state and federal agencies in the U.S. charged with controlling pests and crop diseases face an increasing number of threats to crops, trees and other important natural resources.

PoPS used to track 8 emerging pests and diseases

Researchers have been using PoPS to track the spread of 8 different emerging pests and diseases. They are improving the model to track spotted lanternfly, an invasive pest in the United States that primarily infests a certain invasive type of tree known as “tree of heaven.” Spotted lanternfly has been infesting fruit crops in Pennsylvania and neighboring states since 2014. It can attack grape, apple and cherry crops, as well as almonds and walnuts.

The study, “Iteratively Forecasting Invasions with PoPS and a Little Help From Our Friends,” was published June 3, 2021, in the journal Frontiers in Ecology and the Environment.

Hugo Claver, web editor for Future Farming

Read Full Post »

Three main components causing the digital agriculture revolution

“The farming industry is undergoing a digital revolution. Thanks to the large-scale availability of sensors, cameras and other mobile and computing technologies that we could only dream of in the past, growers have great amounts of data at their disposal,” says Dr. Gajendra Pratap Singh.

Cause of the revolution
In his view, three main components are driving the digital revolution in agriculture. First, affordable and portable sensors enable growers to monitor their crops more closely. Second, communication technology has allowed growers to be in closer contact with each other and with suppliers. But most of all, the large-scale data analytics that Artificial Intelligence (AI) technology makes possible is helping growers make smart decisions on time and increase productivity.


Gajendra Pratap Singh

Gajendra is Principal Investigator and Scientific Director at Disruptive & Sustainable Technologies for Agricultural Precision (DiSTAP) at Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise.

“AI technologies allow real-time interpretation of crop health data obtained from field sensors. Sensors in irrigation systems have been designed to provide water only when no rain is forecast, just to give one example. This both saves water and improves crop yield,” Dr. Singh explains.
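The irrigation example reduces to a simple rule: water only when the soil is dry and no meaningful rain is forecast. A minimal sketch, with illustrative threshold values that a real system would calibrate per crop and soil type:

```python
def should_irrigate(soil_moisture: float, rain_forecast_mm: float,
                    moisture_floor: float = 0.30,
                    rain_skip_mm: float = 5.0) -> bool:
    """Water only when the soil is dry AND no meaningful rain is forecast.

    Threshold values here are illustrative placeholders, not agronomic advice.
    """
    return soil_moisture < moisture_floor and rain_forecast_mm < rain_skip_mm

print(should_irrigate(0.22, 0.0))   # dry soil, no rain forecast -> irrigate
print(should_irrigate(0.22, 12.0))  # dry, but rain is coming -> skip
print(should_irrigate(0.45, 0.0))   # soil already wet enough -> skip
```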

The wealth of useful information that new technologies provide can also be used to breed resilient crops to withstand plant diseases. “In my view, AI technologies combined with sensor and mobile communication technologies have the potential to empower farmers like never before in history. This way, the huge amounts of data obtained from plants using novel sensors can help to increase farm productivity, develop new heat-tolerant varieties of crops, and to curtail the predicted food shortage due to climate change and population increase.”

Still room for improvements
However, the industry is not there yet, Dr. Singh claims. “AI technology on its own is not powerful enough to boost agriculture. Collaboration is needed with sensor and communication technologies to render them useful. Right now, sensors only measure the morphology or appearance of the plants or environmental factors such as temperature. They don’t monitor the biochemical changes occurring inside the plants in real-time yet. But when a plant is stressed due to the lack of nutrients or proper light, it generates a wealth of biochemical information. Being able to use this information will help growers even further.”

And this is exactly what DiSTAP wants to achieve with its new, portable Raman sensors, which measure data on nitrate stress, shade avoidance syndrome and bacterial infections in plants within a few hours. “For that reason, we’ve developed nano-sensors that can measure plant hormones, giving vital feedback to the farmer on a daily basis. Thus, it is possible to monitor the health of each plant every day by measuring the plant itself, not only the symptoms or the environment. For vertical farms, too, it is technologies like these nano-sensors combined with AI that have the potential to improve crop productivity several times over.”
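Daily per-plant readings only become useful with some screening logic on top. As a hypothetical stand-in for whatever analytics a real DiSTAP deployment runs, a simple z-score screen can flag plants whose hormone reading deviates sharply from the block average:

```python
from statistics import mean, stdev

def flag_stressed_plants(readings, z_cutoff=2.0):
    """Flag plants whose reading sits far from the block average.

    `readings` maps plant id -> today's sensor value. A z-score screen
    is a generic illustration, not the actual DiSTAP analytics.
    """
    values = list(readings.values())
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [pid for pid, v in readings.items() if abs(v - mu) / sigma > z_cutoff]

# Nine plants reading near baseline and one clear outlier.
readings = {f"plant{i}": 10.0 for i in range(1, 10)}
readings["plant10"] = 30.0
print(flag_stressed_plants(readings))  # ['plant10']
```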

This is important, according to Dr. Gajendra Pratap Singh, as agriculture has a profound impact on every human being. “In Bangladesh, for example, just two days of heat in April this year destroyed more than 60,000 hectares of rice, affecting more than a quarter-million farmers with losses of about US$40 million. The availability of mobile technology in the remotest parts of the world will allow universal participation of farmers in the digital revolution in agriculture.”

For more information:
Dr. Gajendra Pratap Singh, Principal Investigator and Scientific Director
Disruptive & Sustainable Technologies for Agricultural Precision (DiSTAP)
Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore
gajendra@smart.mit.edu
distap@mit.edu 
www.smart.mit.edu

Publication date: Thu 12 Aug 2021

Read Full Post »