
In this study, we develop a virtual reality (VR)–based agricultural (hereinafter, agri-) support technique to assist in training newcomers and trainees in agricultural work. The system consists of a head-mounted display (HMD), namely an HTC Vive Pro Eye, a gaming personal computer, and peripheral components. The HMD provides the following: (1) a precision eye-tracking system that tracks and interprets eye movements to enable lifelike interactions, manage GPU workload more efficiently, and simplify input and navigation; (2) room-scale tracking of areas up to 100 m²; and (3) a VR-space experience with high tracking accuracy via SteamVR™ Tracking. The agri-situation considered in this study is non-specific tomato harvesting performed by inexperienced agri-workers. The study aims to (1) measure and analyze indicators of human cognition and behavior in VR environments that simulate the work site; (2) provide suggestions for analyzing the process from the cognition of targets to specific behaviors using objective indexes; and (3) enable measures related to specific agri-work to be verified and compared in advance, without the need for actual agri-work sites. Specifically, we utilize the eye-tracking function incorporated in the HMD, and we develop a Unity-based VR system with sound notifications that indicate the validity of eye tracking and the motions of manual agri-workers and managers. Subsequently, we conduct experimental trials in a non-specific room. In the trials, subjects wearing the HMD sequentially gaze at figures of small red tomatoes in the VR space, and the VR system plays alerts to notify subjects when they gaze at or miss a target tomato. The system provides quick and accurate operation, and its eye-tracking function distinguishes it from existing agri-training-oriented techniques and products.
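The gaze-and-alert trial logic described above can be sketched as a dwell-based hit test: a target counts as "gazed at" when the angle between the gaze ray and the eye-to-target ray stays below a tolerance for a dwell period, at which point the success alert would sound; otherwise the miss alert plays. This is a minimal illustrative sketch, not the authors' Unity implementation; the function names, the 2° tolerance, and the 0.5 s dwell time are assumptions.

```python
import math

# Assumed parameters, chosen for illustration only.
GAZE_ANGLE_DEG = 2.0   # angular tolerance around the target
DWELL_S = 0.5          # dwell time before a gaze "hit" is confirmed


def angle_deg(u, v):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))


def classify_gaze(samples, eye, target, dt):
    """samples: gaze direction vectors recorded at interval dt (seconds).
    Returns 'hit' once the gaze dwells on the target long enough,
    else 'miss' after the sample stream ends."""
    to_target = tuple(t - e for t, e in zip(target, eye))
    dwell = 0.0
    for g in samples:
        if angle_deg(g, to_target) <= GAZE_ANGLE_DEG:
            dwell += dt
            if dwell >= DWELL_S:
                return "hit"   # the system would play the success alert here
        else:
            dwell = 0.0        # gaze left the target; restart the dwell timer
    return "miss"              # the system would play the miss alert here
```

For example, a subject holding their gaze straight at a tomato directly ahead for two-thirds of a second (60 samples at 90 Hz, a typical HMD eye-tracker rate) would register a hit, while a gaze fixed 90° away would register a miss.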
The system has several advantages, such as lower cost compared with similar mechanical systems found in the literature and with similar commercial products. In addition, the Unity-based system is minimal and flexibly scalable, so it can be adjusted in future studies and extended to different agri-situations. This study consists of five phases: (1) designing the system and confirming its validity; (2) accumulating image data on outdoor farmland; (3) constructing the entire system and tuning minor settings such as program parameters and other specifications; (4) executing experiments in an indoor room; and (5) assessing and discussing the results, gathering comments from the subjects, and presenting the characteristics of the trials. From the limited trials, we judge the system to be valid to some extent in certain situations; however, we could not perform broader or more generalizable experiments with it. We present experimental characteristics and numerical ranges for the trials, particularly the speed and the likelihood of mistakes in the system's practical operation. The novel contribution of this study lies in the fusion of the latest HMD technology with Unity-based agricultural training facilities. In the future, agri-workers and managers can use the proposed system for training, particularly of eye movement. Furthermore, we believe that combining this system with other existing systems can greatly improve agriculture.
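The speed and mistake-likelihood characteristics mentioned above could be computed from per-target trial logs, for instance as the mean time between consecutive gaze outcomes and the fraction of targets missed. This is a hypothetical sketch under an assumed log format of (timestamp, outcome) pairs, not the authors' analysis code.

```python
def trial_metrics(events):
    """events: list of (timestamp_s, outcome) pairs, one per target,
    with outcome in {'hit', 'miss'}. Returns (mean seconds between
    consecutive outcomes, miss rate over all targets)."""
    times = [t for t, _ in events]
    gaps = [b - a for a, b in zip(times, times[1:])]
    mean_gap = sum(gaps) / len(gaps) if gaps else 0.0
    misses = sum(1 for _, outcome in events if outcome == "miss")
    return mean_gap, misses / len(events)
```

For a trial where a subject resolved four targets at one-second intervals and missed one of them, this would report a mean gap of 1.0 s and a miss rate of 0.25.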
