Abstract

A staggering $220 billion worth of crops is lost globally each year to plant disease, and the figure below illustrates a portion of these economic losses. Much of this cost stems from agriculture’s traditional reliance on time-tested, experience-driven methods for crop health monitoring. While rich in historical context, such practices have shown their limitations, leading to inefficiencies and, at times, wasted resources.

Recognizing this gap, our “Agricultural Field Crop Health Monitor Robot” project endeavors to usher in a transition from this age-old, experience-based system to a more contemporary and efficient data-driven model. The robot is expected to detect, recognize, and collect diseased leaves in the agricultural field, using a convolutional neural network (CNN) together with a depth camera to achieve this goal.


Figure caption from the original site: “Insects, diseases and weeds are the three main biological factors for losing crop yield and causing economic loss to farmers. Unlike the visible impact of diseases and insects, the impact of weeds goes unnoticed,” said Dr Yogita Gharde, lead author of the paper and scientist at the Directorate of Weed Research at the Indian Council of Agricultural Research. “If weed growth is not stopped at a critical time, it results in massive crop loss, sometimes as high as 70%,” said Gharde.

 

Introduction

Agriculture remains an indispensable facet of human civilization, sustaining an ever-increasing global population. In the face of escalating demands and the imperative to conserve energy and resources, it is critical to devise strategies that minimize agricultural losses. Crop diseases are a formidable impediment to food security, often exacerbated by limited infrastructure. Traditional reliance on human acumen, while invaluable, has clear limits, especially given the scale of potential crop devastation. Hence, there is a pressing need to combine human expertise with technological advancements to transform how we detect, manage, and thwart crop diseases, propelling us toward a more resilient and productive agricultural framework.

 

An innovative project published on Hackster.io introduces ‘Farmaid,’ an autonomous robot engineered for greenhouse deployment, capable of precise navigation to safeguard plant integrity and soil [1]. Farmaid’s chief role is detecting plant diseases, bolstered by an SMS notification system powered by Twilio for real-time disease severity alerts [1].

 

In disease detection, traditional methods are increasingly outpaced by technological advances. A review of contemporary work in Machine Learning (ML) and Deep Learning (DL) from 2015 to 2022 shows a marked leap in the efficacy and accuracy of these technologies for plant disease diagnosis, despite challenges such as limited data, imaging constraints, and the difficulty of differentiating diseased from healthy vegetation [3].

 

Furthermore, the research by Mohanty et al. capitalizes on the surge in smartphone proliferation coupled with deep learning breakthroughs to usher in a novel paradigm of mobile-assisted disease diagnostics [4]. A deep convolutional neural network, trained on an extensive dataset of plant leaf images, exhibits remarkable proficiency—identifying various crop species and diseases with 99.35% accuracy, suggesting a scalable solution for precision agriculture [4].

Figure: Example leaf images from the PlantVillage dataset, representing every crop-disease pair used [4].

Our initiative aims to synthesize the capabilities of robotics and ML algorithms into a mobile rover, a technological sentinel in the agricultural landscape. This rover is designed to autonomously detect and categorize plant ailments by analyzing symptoms on leaves, and then navigate precisely to collect affected samples for comprehensive examination. Field tests of this integrated system have yielded promising advancements toward automating and refining crop health management.

Brainstorming

The figure below shows our group’s initial mind map; we ultimately settled on the agricultural robot project. The mind map outlines various applications and considerations for autonomous vehicles in different contexts, which could inform future projects, and each application has specific goals, customer groups, and technological requirements. It also weighs the pros and cons, including the benefits of saving time and labor and the drawback of research and development costs.

Project Goals

This section outlines the project’s objectives from two distinct perspectives. First, we discuss the expected performance and functional goals, which encompass the desired outcomes and benchmarks of success for our project. Second, we delve into the educational objectives, which align closely with a curated selection of courses offered at Duke University. These courses are strategically chosen not only to provide a foundational platform for initiating the project but also to facilitate its advancement to more sophisticated stages. This integrated approach ensures that both practical outcomes and comprehensive academic support underpin the project’s progression.

Project Decomposition

Our team has designed an autonomous agricultural robot system, which integrates a suite of sophisticated components. At the heart is the Jetson processor, which interprets data from a mounted camera and GPS for real-time navigation and task execution. We’ve equipped the system with motor drivers that translate the Jetson’s commands into precise movements of the rover, providing the agility to traverse farm terrain. Additionally, a robotic arm, controlled through the system, is ready to perform various tasks such as picking or treating plants. Our design prioritizes seamless communication, denoted by blue signal lines for data, red command lines for actions, and green software lines for operational logic, ensuring a harmonious interplay between technology and agriculture.
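To make the command path above concrete, the minimal sketch below shows one way the Jetson side could push a command to the Arduino controllers over USB serial. It is an illustrative POSIX example under stated assumptions, not our production code: the device path /dev/ttyACM0 is hypothetical, and the “start” command is the one understood by the arm firmware listed later on this page.

// Illustrative sketch: send one command from the Jetson to an Arduino over USB serial.
// Assumption: the board enumerates as /dev/ttyACM0 (adjust for your setup).
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
  int fd = open("/dev/ttyACM0", O_RDWR | O_NOCTTY);
  if (fd < 0) { perror("open"); return 1; }

  termios tty{};
  if (tcgetattr(fd, &tty) != 0) { perror("tcgetattr"); return 1; }
  cfmakeraw(&tty);                  // Raw 8N1 mode: no echo or line editing
  cfsetispeed(&tty, B9600);         // Match Serial.begin(9600) on the Arduino
  cfsetospeed(&tty, B9600);
  tty.c_cflag |= (CLOCAL | CREAD);  // Enable receiver, ignore modem control lines
  if (tcsetattr(fd, TCSANOW, &tty) != 0) { perror("tcsetattr"); return 1; }

  sleep(2); // Opening the port resets most Arduinos; give the firmware time to boot

  const char *cmd = "start\n";      // Command string the arm firmware parses
  write(fd, cmd, strlen(cmd));
  close(fd);
  return 0;
}

The same channel works for the rover controller, which expects a signed distance in centimeters instead of a keyword.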


Project Breakdown

The following sections chronicle the journey of our project, categorized by level of complexity. We guide you through a curated progression starting from ‘Beginner’, advancing to ‘Medium’, then ‘Advanced’, and culminating at the ‘Expert’ level. Each tier is designed both to showcase the evolution and milestones of our project and to capture the growing sophistication and depth of our work, giving a comprehensive view of the project’s scope and the incremental challenges we surmounted at each stage.

Detailed Procedure of Each Robot Part

Final Presentation Video

Code for Operating the Arm and Rover

 

Arm

 

#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();

uint16_t servoMin = 150;  // Minimum pulse length (PCA9685 ticks)
uint16_t servoMax = 600;  // Maximum pulse length (PCA9685 ticks)

int initialAngles[5] = {30, -30, 180, -180, 30};  // Home angle of each joint
int currentAngles[5];                             // Last commanded angle of each joint

void setup() {
  Serial.begin(9600);
  pwm.begin();
  pwm.setPWMFreq(60);  // Analog servos expect roughly a 60 Hz update rate

  for (int i = 0; i < 5; i++) {
    setMotorToInitialPosition(i);
  }

  Serial.println("Enter 'start' to run the sequence or 'home' to return all motors to initial positions");
}

void loop() {
  if (Serial.available() > 0) {
    String inputString = Serial.readStringUntil('\n');
    inputString.trim();

    if (inputString == "home") {
      for (int i = 0; i < 5; i++) {
        setMotorToInitialPosition(i);
      }
    } else if (inputString == "start") {
      // Sequence of movements
      moveMotorToPosition(1, -120); // Motor 1 to -120 degrees
      moveMotorToPosition(2, 120);  // Motor 2 to 120 degrees
      moveMotorToPosition(4, 0);    // Motor 4 to 0 degrees
      moveMotorToPosition(4, 120);  // Motor 4 to 120 degrees
      moveMotorToPosition(3, 180);  // Motor 3 to 180 degrees
      moveMotorToPosition(2, -180);
      moveMotorToPosition(4, 60);
      moveMotorToPosition(1, -75);

      // Return to initial positions
      for (int i = 0; i < 5; i++) {
        setMotorToInitialPosition(i);
      }
    } else {
      Serial.println("Invalid input, please re-enter!");
    }
  }
}

// Ramp one joint toward the target angle in 10-degree steps so the arm
// moves smoothly instead of snapping to the new position.
void moveMotorToPosition(int motorNum, int targetAngle) {
  int currentAngle = currentAngles[motorNum];
  while (currentAngle != targetAngle) {
    if (abs(targetAngle - currentAngle) < 10) {
      currentAngle = targetAngle;
    } else if (targetAngle > currentAngle) {
      currentAngle += 10;
    } else {
      currentAngle -= 10;
    }

    uint16_t pulseLength = map(currentAngle, -180, 180, servoMin, servoMax);
    pwm.setPWM(motorNum, 0, pulseLength);
    delay(200); // Slower movement, increased delay
  }
  currentAngles[motorNum] = targetAngle; // Remember where this joint ended up
}

// Send one joint directly back to its home angle.
void setMotorToInitialPosition(int motorNum) {
  int targetAngle = initialAngles[motorNum];
  uint16_t pulseLength = map(targetAngle, -180, 180, servoMin, servoMax);
  pwm.setPWM(motorNum, 0, pulseLength);
  currentAngles[motorNum] = targetAngle;
  delay(1000); // Slower return to initial position
}
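A detail worth checking in the arm code is the angle-to-pulse conversion: map(angle, -180, 180, servoMin, servoMax) turns a joint angle into a tick count out of the PCA9685’s 4096 ticks per 60 Hz period. The short desktop program below is only a sanity check, not firmware; it reimplements Arduino’s integer map() so the resulting pulse widths can be inspected with any C++ compiler.

// Sanity check of the angle-to-pulse mapping used by the arm firmware.
// Reimplements Arduino's integer map() for a desktop build.
#include <cstdio>

long map(long x, long inMin, long inMax, long outMin, long outMax) {
  return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

int main() {
  const long servoMin = 150, servoMax = 600; // Tick limits from the firmware above
  const double tickMs = 1000.0 / 60 / 4096;  // Duration of one tick at 60 Hz, 12-bit resolution

  for (long angle : {-180L, -90L, 0L, 90L, 180L}) {
    long ticks = map(angle, -180, 180, servoMin, servoMax);
    printf("%5ld deg -> %3ld ticks (%.2f ms pulse)\n", angle, ticks, ticks * tickMs);
  }
  return 0;
}

For example, 0 degrees maps to 375 ticks, roughly a 1.53 ms pulse, close to the center position of a typical hobby servo.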

 

Rover

#include <AFMotor.h>

AF_DCMotor motor3(3);
AF_DCMotor motor4(4);

const float wheelDiameterCM = 5.0;
const float wheelCircumferenceCM = wheelDiameterCM * 3.1416; // Distance covered per wheel revolution
const int motorRPM = 350; // Rated motor speed at full throttle

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    int distanceCM = Serial.parseInt(); // Signed distance: positive = forward, negative = backward
    if (distanceCM != 0) {
      moveDistance(distanceCM);
    }
  }
}

void moveDistance(int cm) {
  int direction = (cm > 0) ? FORWARD : BACKWARD;
  cm = abs(cm) / 2; // Use the magnitude; the halving is an empirical tuning step
  float revolutions = cm / wheelCircumferenceCM;            // Wheel turns needed for the distance
  unsigned long runTime = revolutions * 60000.0 / motorRPM; // Milliseconds at full speed

  motor3.setSpeed(255);
  motor4.setSpeed(255);
  motor3.run(direction);
  motor4.run(direction);

  delay(runTime);

  motor3.run(RELEASE);
  motor4.run(RELEASE);

  // Back off a little
  delay(500); // Short pause
  unsigned long backRunTime = runTime / 20;
  motor3.run(direction == FORWARD ? BACKWARD : FORWARD); // Reverse direction
  motor4.run(direction == FORWARD ? BACKWARD : FORWARD);
  delay(backRunTime);
  motor3.run(RELEASE);
  motor4.run(RELEASE);
}
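To see what the timing math in moveDistance() commands in practice, the standalone example below walks through the same arithmetic for a hypothetical 40 cm request (the input value is arbitrary; the constants come from the firmware above).

// Worked example of the rover's distance-to-run-time conversion (desktop build).
#include <cstdio>

int main() {
  const float wheelDiameterCM = 5.0f;
  const float wheelCircumferenceCM = wheelDiameterCM * 3.1416f; // ~15.7 cm per revolution
  const int motorRPM = 350;

  int cm = 40;                  // Hypothetical commanded distance
  int scaled = cm / 2;          // Same empirical halving as moveDistance()
  float revolutions = scaled / wheelCircumferenceCM;
  float runTimeMs = revolutions * 60000.0f / motorRPM; // 60000 ms per minute

  printf("%d cm -> %.2f revolutions -> run motors for about %.0f ms\n",
         cm, revolutions, runTimeMs);
  return 0;
}

At 350 RPM one wheel revolution takes about 171 ms, so run time grows linearly with distance. Open-loop timing like this is approximate once wheel slip and motor load enter the picture, which is likely why the halving factor was tuned in by hand.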

 

Milestones: 

  1. Understand how to use all the hardware and software. (10/4 – 10/13)

  2. Finish training the plant-disease detection algorithm. (10/13 – 10/27)

  3. Get all subsystems functioning well. (10/13 – 10/27)

  4. Finish the robot arm and the vehicle and assemble them. (10/27 – 11/3)

  5. Test and debug the whole system and write the final report. (11/3 – 11/17)

Deliverables:

  1. A system that can detect plant illnesses. (10/27)

  2. Each part functioning on its own. (10/27)

  3. A functional robot that can run in the field, detect and locate diseased plants, and operate its arm; it may not be perfect, but it works as an assembled whole. (11/3)

  4. A robot that meets the initial goal, plus a completed final report. (11/17)

Literature Review

  1. “Farmaid: Plant Disease Detection Robot.” Hackster.io, 30 October 2018, https://www.hackster.io/teamato/farmaid-plant-disease-detection-robot-55eeb1. Accessed 27 July 2023.

  2. Dev, Medidi J. V. S. A. S., Ratna, T. V., Tharun, P. S., Harsha, M. S., & Daniya, T. (2023). Plant disease detection and crop recommendation using deep learning. 2023 2nd International Conference on Applied Artificial Intelligence and Computing (ICAAIC). https://doi.org/10.1109/icaaic56838.2023.10141294

  3. Kasinathan, T., Singaraju, D., & Uyyala, S. R. (2021). Insect classification and detection in field crops using modern machine learning techniques. Information Processing in Agriculture, 8(3), 446–457. https://doi.org/10.1016/j.inpa.2020.09.006

  4. Mohanty, S. P., Hughes, D. P., & Salathé, M. (2016). Using deep learning for image-based plant disease detection. Frontiers in Plant Science, 7. https://doi.org/10.3389/fpls.2016.01419

  5. Shruthi, U., Nagaveni, V., & Raghavendra, B. K. (2019). A review on machine learning classification techniques for plant disease detection. 2019 5th International Conference on Advanced Computing & Communication Systems (ICACCS). https://doi.org/10.1109/icaccs.2019.8728415

  6. Wang, T., Chen, B., Zhang, Z., Li, H., & Zhang, M. (2022). Applications of machine vision in agricultural robot navigation: A review. Computers and Electronics in Agriculture, 198, 107085. https://doi.org/10.1016/j.compag.2022.107085

  7. Baratov, R., & Valixanova, H. (2023). Smart system for early detection of agricultural plant diseases in the vegetation period. E3S Web of Conferences, 386, 01007. https://doi.org/10.1051/e3sconf/202338601007

  8. Miscio, M. Arduino Robotic Arm Controlled by Touch Interface. Instructables. https://www.instructables.com/Arduino-Robotic-Arm-Controlled-by-Touch-Interface/

  9. Kouao, E. (2021, January 10). DIY Arduino Robot Arm – Controlled by Hand Gestures [Video]. YouTube. https://www.youtube.com/watch?v=F0ZvF-FbCr0

  10. Danielgass. (2022, January 8). Robot Arm Automation. Hackster.io. https://www.hackster.io/danielgass/robot-arm-automation-26a97f

Who We Are (Team Members)

 



 
Chenxi Bai
Tingwei Ao
Dawen Huang
Yi Li