U of A partners on project combining AI, virtual reality and agriculture

Image: A virtual reality representation of sorghum plants created by VR-Bio-Talk, a platform developed by researchers from the U of A and Purdue University that uses artificial intelligence to analyze agricultural data. Credit: Bedrich Benes, Voicu Popescu, Alejandra Magana, Jorge Askur, Bosheng Li and Radim Peša.

Using information gathered by agricultural sensors, researchers from the University of Arizona and Purdue University are developing a tool that uses virtual reality and artificial intelligence to better understand large datasets. Once finished, they expect the platform, called VR-Bio-Talk, will lead to groundbreaking research that bridges the gap between biology, agriculture and computer science.

The project is funded by the National Science Foundation, which awarded $860,000 to researchers from the U of A and $1.2 million to their Purdue counterparts to develop VR-Bio-Talk over the next three years. The goal of the project is to create a virtual reality environment that mirrors real-world agricultural environments and is paired with an artificial intelligence program.

Modern agricultural sensors allow researchers to collect acres' worth of 3D data that can be used to construct realistic "digital twins" of plants in a computer program. While these digital twins can help scientists better understand complex biological systems like plant growth, much of the agricultural data collected by scientists is never analyzed.

Researchers using VR-Bio-Talk will be able to ask the platform to carry out processes like selecting all plants older than two weeks and calculating their average height. The AI software will convert the request into code, extract information from the appropriate dataset and display the results in virtual reality. Working in a 3D environment, researchers will be able to see each plant, enhanced with data such as height and leaf area index, and ask follow-up questions.
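As a rough illustration of that translation step, the code below shows the kind of query the AI layer might generate from the spoken request "select all plants older than two weeks and calculate their average height." This is a hypothetical sketch; the column names, dataset and use of pandas are assumptions for illustration, not details of the actual VR-Bio-Talk platform.

```python
# Hypothetical sketch: turning a spoken request into a data query.
# Dataset and column names are illustrative, not from VR-Bio-Talk.
import pandas as pd

# Toy stand-in for a plant phenotyping dataset gathered by field sensors.
plants = pd.DataFrame({
    "plant_id": [1, 2, 3, 4],
    "age_days": [10, 15, 21, 30],
    "height_cm": [12.0, 20.0, 35.0, 50.0],
})

# "Select all plants older than two weeks" -> filter rows on age.
older = plants[plants["age_days"] > 14]

# "...and calculate their average height" -> aggregate the filtered rows.
avg_height = older["height_cm"].mean()
print(avg_height)  # 35.0
```

In the envisioned platform, the result of a query like this would be rendered back onto the individual plants in the virtual environment rather than printed as a number.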

"Extracting information from data has always required programming expertise, but having the ability to break down technical barriers and let users converse natively with their data will enable researchers to focus more on their science and generate greater societal impact," said Duke Pauli, director of the Center for Agroecosystems Research and associate professor in the School of Plant Sciences, part of the College of Agriculture, Life and Environmental Sciences.

Pauli will develop VR-Bio-Talk alongside Nirav Merchant, director of the University of Arizona Data Science Institute and principal investigator for CyVerse, a cyberinfrastructure platform for managing data and analysis workflows from phenotyping projects. Joining Pauli and Merchant from Purdue are Bedrich Benes, professor and associate head of the Department of Computer Science; Voicu Popescu, associate professor of computer science; and Alejandra Magana, the W.C. Furnas Professor of Enterprise Excellence in computer and information technology and professor of engineering education.

"One of the main goals of this project is to bridge the domain gap between biology, agriculture and computer science researchers," Benes said. "Traditional data science projects require domain expertise to process the data. Someone needs to write programs to extract information from data. We aim to lower this barrier by using the latest advances in AI for voice recognition and conversational interfaces."

Image: The Maricopa Field Scanner, a fully automated system that measures crop growth and development, will provide datasets to be explored using VR-Bio-Talk. Credit: Alejandra Magana

Researchers will use VR-Bio-Talk to interact with agricultural data from the Field Scanalyzer, a fully automated system that measures crop growth and development. Its suite of high-resolution cameras and sensors sits in a weatherproof cabinet that a large robotic frame moves over a field.

"We have over two petabytes of data that potentially include information yet to be discovered," Merchant said. "With recent advances and maturity in conversational technologies and the ability to chat and interact with large systems, we believe that this platform has the potential to dramatically change the way we extract information and knowledge from data."

Once developed, the VR-Bio-Talk platform will be used on various datasets and tested with a wide variety of users, ranging from experts to those with limited knowledge of biology and agriculture.