The objects can be further verified by the object detection system before being counted.

Hyperspectral imaging was deployed to identify contaminated mangoes. The algorithm’s overall error rate for highly infested samples ranges between 2% and 6%, whereas its overall error rate for lightly infested samples is 12.3%. To detect contaminated cherries, Xing et al. used reflectance and transmittance spectra. According to the extent of damage, the cherries were separated into two categories, “acceptable” and “non-acceptable.” On transmittance spectra, Canonical Discriminant Analysis achieved 85% classification accuracy. Potamitis et al. used optoacoustic spectrum analysis to construct an olive fruit fly detection system; the approach identifies insect species from wing-beat analysis. The authors examined the recorded signal in both the time and frequency domains and fed the extracted features to a random forest classifier, which achieved a precision of 0.93, a recall of 0.93, and an F1-score of 0.93. The optoacoustic approach, however, cannot distinguish between different species of fruit flies, such as those attacking peaches and figs. Furthermore, solar radiation affects the sensor readings, and the trap is susceptible to sudden strikes or shocks that cause false alarms on windy days. Böckmann et al. use a Bag of Visual Words to encode clusters of key points, extracted with the scale-invariant feature transform (SIFT), into meaningful local features in a so-called visual codebook. This dictionary is then used to encode how frequently each feature appears in each patch of newly extracted key points, and the resulting histograms are used to train an SVM classifier for the different classes of flies as well as a background class for patches containing nothing of interest.
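To make the Bag of Visual Words pipeline concrete, the following is a minimal Python sketch of the approach just described, assuming an OpenCV build with SIFT support and scikit-learn. The patch lists, label names, dictionary size, and the RBF-kernel SVM are illustrative assumptions, not the exact configuration used by Böckmann et al.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def sift_descriptors(patches):
    """Extract SIFT descriptors from each greyscale image patch."""
    sift = cv2.SIFT_create()
    all_desc = []
    for img in patches:
        _, desc = sift.detectAndCompute(img, None)
        all_desc.append(desc if desc is not None else np.empty((0, 128), np.float32))
    return all_desc

def build_codebook(descriptor_lists, n_words=200):
    """Cluster all descriptors into a visual codebook (the 'dictionary')."""
    stacked = np.vstack(descriptor_lists)
    return KMeans(n_clusters=n_words, n_init=10).fit(stacked)

def bovw_histogram(descriptors, codebook):
    """Encode one patch as a normalized histogram of visual-word frequencies."""
    n_words = codebook.n_clusters
    if len(descriptors) == 0:
        return np.zeros(n_words)
    words = codebook.predict(descriptors)
    hist, _ = np.histogram(words, bins=np.arange(n_words + 1))
    return hist / hist.sum()

# Hypothetical usage: train_patches / train_labels are greyscale patches and
# their classes (fly species or background).
# desc = sift_descriptors(train_patches)
# codebook = build_codebook(desc, n_words=200)
# X = np.vstack([bovw_histogram(d, codebook) for d in desc])
# clf = SVC(kernel="rbf").fit(X, train_labels)
```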

In contrast, for the blueberry containers, the precision values decreased by approximately 20% after the insects had remained on the yellow sticky paper for 7 days, compared to the initial measurement on day 0. The dictionary size had no obvious influence on the class mean accuracy, but it did affect the recall of the individual categories. Within the individual categories, the recall of the background class was the highest, as expected; a maximum value of 99.13% was achieved regardless of color space conversion or dictionary size. The best classification results were achieved with greyscale images and dictionary sizes of 200 and 500 words. Regarding deep learning techniques, Zhong et al. created a deep-learning-based multi-class classifier that can classify and count six different types of flying insects. The You Only Look Once (YOLO) algorithm is used for detection and coarse counting. To increase the number of training images required by the YOLO deep learning model, the authors treated the six species of flying insects as a single class and augmented the images with translation, rotation, flipping, scaling, noise addition, and contrast adjustment to extend the dataset. They also fine-tuned a pre-trained YOLO model on the insect dataset. A Support Vector Machine with global features is used for classification and fine counting. The technique runs on a Raspberry Pi, with detection and counting performed locally in each trap. The system attained a 92.5% average counting accuracy and a 90.18% average classification accuracy. The Dacus Image Recognition Toolkit (DIRT) was created by Kalamatianos et al.; the toolkit includes MATLAB code samples for fast experimentation, as well as a collection of annotated olive fruit fly photos acquired by McPhail traps.
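The augmentation step described above can be illustrated with a short Python sketch using Pillow and NumPy. The transformation ranges (rotation angle, translation offset, scale factor, noise level, contrast factor) are assumed values for illustration, not the parameters reported by Zhong et al.

```python
import random
import numpy as np
from PIL import Image, ImageEnhance, ImageOps

def augment(image: Image.Image) -> Image.Image:
    """Apply one random combination of the listed augmentations:
    translation, rotation, flipping, scaling, noise addition, contrast adjustment."""
    w, h = image.size
    # Random rotation combined with a random translation offset
    image = image.rotate(random.uniform(-20, 20),
                         translate=(random.randint(-15, 15), random.randint(-15, 15)))
    # Random horizontal flip
    if random.random() < 0.5:
        image = ImageOps.mirror(image)
    # Random scaling, then resize back to the original resolution
    s = random.uniform(0.8, 1.2)
    image = image.resize((int(w * s), int(h * s))).resize((w, h))
    # Random contrast adjustment
    image = ImageEnhance.Contrast(image).enhance(random.uniform(0.7, 1.3))
    # Additive Gaussian noise
    arr = np.asarray(image).astype(np.float32)
    arr += np.random.normal(0.0, 8.0, arr.shape)
    return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))
```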

On the DIRT dataset, the authors tested various forms of the pre-trained Faster Region-based Convolutional Neural Network (Faster R-CNN) deep learning detection technique. R-CNNs are convolutional neural networks preceded by a region proposal stage that suggests candidate object regions prior to classification. Faster R-CNN achieved a mAP of 91.52%, where mAP (mean average precision) averages the maximum precision over various recall levels. The authors demonstrated that image size has a substantial impact on detection, whereas RGB and grayscale images yield almost the same detection accuracy. Because Faster R-CNN is computationally costly, each e-trap regularly uploads its collected image to a server for processing. Ding et al. created a technique for detecting moths. Translation, rotation, and flipping are used to augment the images. To balance the average intensities of the red, green, and blue channels, the photos are pre-processed with a color-correcting algorithm. The moths in the photos are then detected using a sliding-window Convolutional Neural Network (CNN). CNNs are supervised learning algorithms that apply learned filters to image pixels, with the weights learned through back-propagation. Finally, Non-Maximum Suppression (NMS) is used to remove overlapping bounding boxes. Using an end-to-end deep learning neural network, Xia et al. detect 24 kinds of insects in agricultural fields. A pre-trained VGG-19 network is utilized to extract the features, and the insect’s position is then determined through a Region Proposal Network (RPN). The proposed model achieved a mAP of 89.22%. Recently, YOLO has shown notable performance in pest detection. In particular, the reported results of YOLO v5 show a mAP of 94.7% and the highest recall score, 0.92, among state-of-the-art methods such as Fast R-CNN, Faster R-CNN, and RetinaNet.
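Since Non-Maximum Suppression recurs in most of the detectors discussed here, a minimal NumPy sketch of the standard greedy algorithm is given below. The corner-format boxes and the IoU threshold of 0.5 are conventional assumptions, not parameters taken from the cited works.

```python
import numpy as np

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Greedy NMS: keep the highest-scoring boxes, drop overlaps above the IoU threshold.
    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) confidences."""
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]          # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Intersection of the current box with all remaining boxes
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Keep only boxes whose overlap with the current box is below the threshold
        order = order[1:][iou <= iou_threshold]
    return keep
```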

The models were pre-trained on the COCO dataset and later fine-tuned on a training dataset of 4480 sub-images made from 280 images of yellow sticky pheromone traps. However, YOLO v5 is considered slower than YOLOv4. Regarding AI implementation on edge devices, existing works demonstrate AI applications for pest monitoring on edge devices as well. In [30], a Lynfield-inspired trap was used, with naled- and fipronil-intoxicated methyl eugenol replacing the yellow sticky paper, combined with an object detection system to detect only the targeted oriental fruit flies. Unlike the yellow sticky paper, this attractant has been shown to attract only the harmful fruit flies, so the detection problem is reduced to single-class detection: detecting the presence of fruit flies and verifying whether each detection is correct. That work was preliminary and provided a foundation for further developing a real-time system for fruit fly detection in on-field scenarios. Compared to [27], applying the Single Shot Multibox Detector with different backbones and YOLOv4-tiny shows a significant speed advantage over YOLO v5, while taking raw images as input instead of segmented sub-images. Nevertheless, that work also showed the limitation of applying detection models on edge devices due to slow processing speed, which is further addressed in this article. Most of the time, insects are not stationary, so it is difficult to get a clear image of flying insects. In studies [32–35], the authors chose insect specimens that were well preserved in an ideal laboratory environment to capture high-resolution images of the insects. However, since fewer environmental factors are considered in this method, it is limited to specific applications. In this study, we designed a unique autonomous environment-data-reading and pest-identification system to address the above problems. Being largely motivated by preventing oriental fruit flies from destroying citrus fruits such as oranges and grapefruits, we propose a trap that targets only that one species, B. dorsalis. This is achieved by replacing the yellow sticky paper with the naled- and fipronil-intoxicated methyl eugenol attractant to ensure that only B. dorsalis flies are lured into the trap. This eases the classification and counting process, as no other insects are attracted by the methyl eugenol attractant. The system involves a two-fold setting: (a) an electronic system reads environmental data, with a sticky trap installed and a digital camera set up to collect images of the flies; (b) object detection software recognizes fruit flies in the image before all information is sent via email or SMS to alert farmers. The whole system is autonomous, powered by a solar system, and implemented on an Arduino Uno and a Raspberry Pi. The results provide precise prevention and treatment methods based on the combination of pest information and other environmental information. Based on this edge computing design, the computation pressure on the server is alleviated and the network burden is largely reduced.
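As an illustration of the notification step, the following is a minimal Python sketch of sending an email alert from the trap using the standard library. The SMTP server, addresses, and credentials are placeholders, and the deployed system may instead use an SMS gateway or a different mail provider.

```python
import smtplib
from email.message import EmailMessage

def send_alert(fly_count, temperature_c, humidity_pct):
    """Email the fly count and environmental readings collected by the trap."""
    msg = EmailMessage()
    msg["Subject"] = f"Fruit fly trap alert: {fly_count} flies detected"
    msg["From"] = "trap@example.com"      # placeholder sender address
    msg["To"] = "farmer@example.com"      # placeholder recipient address
    msg.set_content(
        f"Detected flies: {fly_count}\n"
        f"Temperature: {temperature_c} C\n"
        f"Humidity: {humidity_pct} %\n"
    )
    # Placeholder SMTP host and credentials
    with smtplib.SMTP_SSL("smtp.example.com", 465) as server:
        server.login("trap@example.com", "password")
        server.send_message(msg)
```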

The edge-computing traps are designed to work separately and to individually report the count of fruit flies to the farmers. They are spread, based on the effectiveness of the attractant, such that every 2-3 devices cover an area of 1000 square meters. Overall, the hardware part of the system consists of five interconnected subsystems with distinct functions and behaviors, described in Figure 1: the solar panel system, the control system, the sensor system, the trap, and the object detection and communication system. The power system of the trap contains a solar panel, a battery, and a solar charge controller. The solar panel converts solar energy to a DC current of 830 mA to power the trap system, and the converted energy is stored in an electro-chemical battery with a capacity of 5 Ah and a voltage of 12 V. The Arduino in the control system checks the battery voltage with a voltage sensor to make sure it is above the level required for the system’s operation; if this condition is not met, the object detection module is not operated. A Pulse Width Modulation (PWM) solar charge controller is used to regulate the device voltage, open the circuit, and halt the charging process if the battery voltage rises above a certain level. The control system is driven by an Arduino micro-controller board. As mentioned, the Arduino reads the battery voltage from the sensor system to decide whether to turn the object detection system, controlled by the Raspberry Pi module, on or off. An SSR10D is used to activate and deactivate the object detection system. The SSR10D is a solid-state relay: a low-power electrical signal generates an optical semiconductor signal that activates an opto-transistor, allowing the high-voltage supply to power the output device, in this case the Raspberry Pi. This low-power signal is the output of a 2N2222 bipolar junction transistor that receives its control signal from the Arduino. Hence, the Arduino can stop the Raspberry Pi 3B+ computer from drawing current from the solar system after it has been shut down. The sensor system is responsible for measuring three important factors: temperature, humidity, and light. It also records the current produced by the solar system and the battery voltage. Humidity and temperature, which affect the living environment of the yellow flies, are measured with the AM2315 I2C sensor, while RGB and clear light are measured with the TCS34725 light sensor with IR filter and white LED. In addition, an INA219 is used to read the solar current and battery voltage, and a DS1307, a battery-backed real-time clock, helps the microcontroller keep track of time. The information from the sensors, along with the corresponding timestamps, is stored on an SD card attached to the device. Together, the control system and sensor system allow the microcontroller to decide whether to turn the object detection on or off. The object detection system, shown in Figure 1e, is operated by the Raspberry Pi 3B+ and collects images for its fruit fly detection algorithm with a 5 MP Waveshare Pi camera. The camera is placed at the top of a double-sized Lynfield-shaped trap with several holes at the bottom, shown in Figure 1d.
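For clarity, the battery-gating decision described above can be summarized with the conceptual sketch below, written in Python for consistency with the other examples; the actual logic runs as Arduino firmware, and the cut-off threshold and relay abstraction are assumptions for illustration only.

```python
# Assumed minimum battery voltage for safely running the Raspberry Pi detection system
BATTERY_CUTOFF_V = 11.0

def should_power_detection_system(battery_voltage: float) -> bool:
    """Return True if the measured battery voltage is high enough to run detection."""
    return battery_voltage >= BATTERY_CUTOFF_V

def control_relay(battery_voltage: float, set_relay) -> None:
    """Switch the solid-state relay (driven through the 2N2222 control line) on or off.
    `set_relay` abstracts the digital output that activates the SSR10D."""
    set_relay(should_power_detection_system(battery_voltage))
```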
To attract and capture only the yellow flies, methyl eugenol is used as the attractant, which helps simplify the detection and classification problem. The Raspberry Pi module receives data from the sensor system and sends all data to the notification system, which notifies or alerts farmers about the environmental data and the number of detected fruit flies through email or SMS. The behavior of the whole system is described in the flow chart shown in Figure 2. The architectures used to train the yellow fly detection models are SSD with MobileNetV1 and MobileNetV2 backbones, and YOLOv4-tiny. The selected models are all single-stage detection models since, compared to their two-stage counterparts, single-stage detection models have been shown to offer faster processing speed with competitive performance. Moreover, these three models were selected because of their comparable parameter sizes and their feasibility for real-time implementation on edge devices.
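To give a sense of how such a single-stage detector can run on the Raspberry Pi, the following is a minimal Python sketch using the TensorFlow Lite runtime. The model file name, the uint8 input pre-processing, and the output tensor ordering are assumptions that depend on how the SSD-MobileNet model is exported; they are not the exact artifacts of this work.

```python
import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter  # lightweight TFLite runtime for the Pi

def count_flies(image_path, model_path="ssd_mobilenet_flies.tflite", score_threshold=0.5):
    """Run an SSD-MobileNet detector on one trap image and count detections
    above the confidence threshold."""
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    outs = interpreter.get_output_details()

    # Resize the trap photo to the model's expected input size (e.g. 300x300)
    _, height, width, _ = inp["shape"]
    img = Image.open(image_path).convert("RGB").resize((width, height))
    data = np.expand_dims(np.asarray(img, dtype=np.uint8), axis=0)

    interpreter.set_tensor(inp["index"], data)
    interpreter.invoke()

    # Typical SSD TFLite outputs: boxes, class ids, scores, number of detections.
    # The exact ordering depends on how the model was exported (assumed here).
    scores = interpreter.get_tensor(outs[2]["index"])[0]
    return int((scores > score_threshold).sum())
```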