=Paper=
{{Paper
|id=Vol-3293/paper17
|storemode=property
|title=Digital Representation of Smart Agricultural Environments for Robot Navigation
|pdfUrl=https://ceur-ws.org/Vol-3293/paper17.pdf
|volume=Vol-3293
|authors=Luis Emmi,Rebeca Parra,Pablo González-de-Santos
|dblpUrl=https://dblp.org/rec/conf/haicta/EmmiPS22
}}
==Digital Representation of Smart Agricultural Environments for Robot Navigation==
Luis Emmi 1, Rebeca Parra 1 and Pablo González-de-Santos 1

1 Centre for Automation and Robotics (CSIC-UPM), Arganda del Rey, Madrid, 28500, Spain

Abstract

In recent years, digitization has had a significant impact on food production systems, allowing various technologies and advanced data processing strategies to be implemented. Alongside the introduction of tools for the digitalization of the field, the automation of tasks through the use of mobile robots has also been growing in recent decades. These systems rely on the acquisition of field data to carry out autonomous operations, including weed management, fertilizer application and harvesting, among others. One of the current challenges of integrating robotic solutions in the digital age of agriculture is the development of scalable and interoperable systems, able to manage data obtained from third parties. This work presents an overall methodology for map creation using open-source tools, which allows the data generated in the daily activities of an agricultural farm to be managed, and the autonomous tasks of robotic systems to be planned and supervised.

Keywords: Smart farming, digital representation, autonomous robot

1. Introduction

The current state of the art in precision agriculture (PA) is shifting in favor of production quality, with the aim of maximizing resources and reducing environmental impact. In this context, various applications are being developed that favor the digital management of the infrastructures involved in crop monitoring through the Internet of Things (IoT) [1], the optimization and management of resources [2] and the prevention of diseases and pests [3], among others. On the one hand, the digitization of the agents involved in the different agricultural activities (e.g., seeding, cultivation and harvesting) constitutes a significant advance in the prototyping and implementation phases of configurations in unstructured environments. However, analyzing and processing these data remains a real challenge in a sector governed by the dynamic laws of nature [4]. On the other hand, autonomous robots have drawn the attention of farmers over the last decades. They can carry advanced sensors and tools throughout the field, execute the most complex tasks efficiently, and reduce both the workload of the operator and the impact on the environment.

One of the characteristics that existing robotic systems have in common is that they depend on their own mapping system (including manual field surveying) as well as on their own mission and task planning. The interoperability of these robotic systems with other tools, such as Farm Management Systems, is a current challenge, as the learning curve for farmers and their teams is steep. Decision-making is not carried out by a single individual, so these systems need to be user-friendly. Another problem is that the integration of a global system is complicated by the volume of data to be processed and the scarce development of specialized software [5], and it depends largely on the use of standard communication systems and the definition of common data models.
This paper presents a methodology to digitally represent the working area of an agricultural environment based on standard data models, which can be used by robotic systems to enable autonomous navigation both on the farm and in the field. To achieve this objective, a methodology is proposed that is capable of providing information related to the conceptual state of a farm, as well as to the devices involved in the processes carried out on it. Open-source standards and tools such as FIWARE [6] and geojson.io [7] are used to provide semantic structures to the agricultural entities being digitized. The final purpose is to virtually represent the ecosystem of an autonomous robot for laser-based weeding, as a use case within the WeLASER project [8].

2. Context and Related Work

Establishing interconnections between infrastructures of different types can be complex, especially when some of them have dynamic singularities. Fortunately, there is a multitude of applications capable of achieving this. In the case of PA, one of the most convenient links is the one established between the farmer, the field and the agricultural machinery that performs any production technique on it.

The FIWARE Foundation is a network of European organizations promoting the development of a data ecosystem of open-source technologies. One of its domains, Smart AgriFood, proposes a sustainability-oriented optimization of farm production based on monitoring, so that every decision is supported, cost-effective, scalable and interoperable [9]. Few use cases applying this technology have been developed in this sector. One of them, OpenPD, is based on the exchange of information through an open community for the rapid identification of pests and diseases in crops [10]. In another instance, the QUHOMA platform provides architectures for crop data mining to process, distribute and monetize sustainable-agriculture information through a web environment [11]. These applications share different technologies with the end users, providing them with a direct interface for interacting with the ecosystem.

In agricultural robotics, however, such digital representations are still limited. This approach has not yet been explored in sufficient depth, so certain tasks remain cumbersome and repetitive for humans. Our use case requires harmonizing the final results provided by the examples given above.

3. Methodology

One of our objectives is to streamline the coordination of systems, supporting farmers in using the robotic system in a simple, reliable and robust way. To achieve this, the farmer must be able to give the robot enough information a priori (e.g., field locations, crop type and status, boundaries, etc.). This task (field surveying or field mapping) can be carried out in different ways. A graphical model of the operational environment (the farm) has been created using geojson.io, a mapping tool based on various cartographic databases, which allows maps and geospatial data to be created, visualized and shared in a simple, multi-format way.
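As an illustration of this mapping step, the sketch below shows, as a Python dictionary, what a single parcel drawn in geojson.io might look like once exported. The coordinates and the property names (name, category, cropType) are invented placeholders, not the actual attribute schema used in the WeLASER project.

```python
# Illustrative only: one parcel as geojson.io might export it.
# Coordinates and property names are placeholders, not project data.
parcel_feature = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[          # a single exterior ring
            [-3.3612, 40.3131],
            [-3.3604, 40.3131],
            [-3.3604, 40.3125],
            [-3.3612, 40.3125],
            [-3.3612, 40.3131],    # GeoJSON rings close on the first vertex
        ]],
    },
    "properties": {                # static data drawn/typed in the editor
        "name": "Field A",
        "category": "arable",
        "cropType": "maize",
    },
}
```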
This map is later used by the robot to navigate on the farm, although the precision of its geometric points is not at the centimeter level. This is possible because the robot is expected to rely on on-board sensors to correct map inaccuracies, increasing the robustness of the system without relying heavily on accurate and up-to-date maps.

This graphic interaction generates static data that, after a transformation, has been adapted to the FIWARE smart data model formats and that will be of great use throughout the agricultural process (see Fig. 1). In this particular case, two main standardized FIWARE entities have been selected and used to describe the operational area:

• AgriFarm [12], referring to the harmonized environment of a generic farm made up of buildings and parcels (fields).
• AgriParcel [13], related to a generic parcel of land, i.e., an agricultural field. This demonstrates the verticality of the standard, as an AgriParcel will always belong to an AgriFarm entity.

Figure 1: Process plan in the WeLASER project.

However, within the WeLASER project it has been detected that the description of the AgriParcel entity lacks some relevant information, in particular for autonomous navigation tasks. Therefore, some properties have been added to those already defined by FIWARE:

1. bearing, the clockwise angle of a field's crop lines from true north.
2. headlandWidth, the width of the headland in each field, in meters.
3. gateLocation, the entry point into a field.
4. interRowDistance, the distance between crop rows, in meters.
5. cropRow, representing the crop rows.
6. weedStatus, the weed cover map.

Regarding the AgriFarm entity, it is also essential to define restricted areas and roads that help the robotic systems to plan and navigate properly. Both the RestrictedTrafficArea [14] and Road [15] entities are part of the FIWARE data models in another domain (Transportation). Figure 2 presents the relations and properties of the FIWARE entities under study. To incorporate these relationships, the inclusion of the following properties in the AgriFarm entity is proposed:

1. hasRestrictedArea, which delimits restricted areas within the farm.
2. hasRoad, which marks the roads present on the farm.

Figure 2: Relations and properties of several FIWARE entities.

To convert geojson-based maps into the FIWARE entities, a local API (Application Programming Interface) called WeLaserMB (WeLaser Map Builder) has been developed. The methodology followed for the creation of this API is described as follows (a minimal conversion sketch is given after the list):

1. Selection of the place of action in geojson.io: This platform, consisting of a geographic viewer, an editor and a series of geometric tools, allows territorial elements to be added, edited and characterized.
2. Assignment of essential attributes to comply with the standard proposed by FIWARE: The attributes added correspond to those that the farmer must provide on the basis of their intrinsic knowledge. Two types of data can be integrated into the API: static and dynamic. Static data, as shown in Fig. 2, are those parameters identified in the geojson.io interface (e.g., location, type, category). Dynamic data, on the other hand, are those provided by means of forms or configuration parameters (e.g., crop type and status, planting date, etc.).
3. Export in *.geojson format: The map obtained is imported and processed by WeLaserMB, which extracts the necessary information to fill in the predefined templates for each type of FIWARE entity contained in the map.
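The paper does not include the WeLaserMB source code, so the following is only a minimal Python sketch of the kind of conversion described above: it takes one exported geojson.io feature plus the farmer's form data and produces an AgriParcel-like entity in the NGSI-v2 "type"/"value" attribute style, including the navigation properties proposed in this work. The function name, the entity id scheme and all values are hypothetical.

```python
# A minimal sketch of the geojson.io -> FIWARE conversion step; it is not
# the WeLaserMB implementation. Function name, id scheme and all values are
# hypothetical, chosen only to illustrate the entity structure.

def feature_to_agriparcel(feature: dict, parcel_id: str, form: dict) -> dict:
    """Build an AgriParcel-like entity from one exported geojson.io feature.

    `feature` carries the static data drawn on the map (geometry, name,
    category); `form` carries the dynamic data supplied by the farmer
    (bearing, headland width, gate location, inter-row distance, ...).
    """
    props = feature.get("properties", {})
    return {
        "id": parcel_id,
        "type": "AgriParcel",
        # Static data taken directly from the *.geojson export.
        "location": {"type": "geo:json", "value": feature["geometry"]},
        "category": {"type": "Text", "value": props.get("category", "")},
        # Properties proposed in this work for autonomous navigation.
        "bearing": {"type": "Number", "value": form["bearing"]},                    # degrees clockwise from true north
        "headlandWidth": {"type": "Number", "value": form["headlandWidth"]},        # meters
        "gateLocation": {"type": "geo:json", "value": form["gateLocation"]},        # entry point into the field
        "interRowDistance": {"type": "Number", "value": form["interRowDistance"]},  # meters between crop rows
        "cropRow": {"type": "StructuredValue", "value": form.get("cropRow", [])},
        "weedStatus": {"type": "StructuredValue", "value": form.get("weedStatus", {})},
    }


# Example call with invented form values for a single field.
if __name__ == "__main__":
    feature = {
        "type": "Feature",
        "geometry": {"type": "Polygon", "coordinates": [[
            [-3.3612, 40.3131], [-3.3604, 40.3131],
            [-3.3604, 40.3125], [-3.3612, 40.3125], [-3.3612, 40.3131]]]},
        "properties": {"name": "Field A", "category": "arable"},
    }
    entity = feature_to_agriparcel(
        feature,
        parcel_id="urn:ngsi-ld:AgriParcel:welaser:field-a",
        form={
            "bearing": 12.5,
            "headlandWidth": 6.0,
            "gateLocation": {"type": "Point", "coordinates": [-3.3612, 40.3128]},
            "interRowDistance": 0.75,
        },
    )
    print(entity["id"], entity["bearing"]["value"])
```

An analogous template would be filled in for the AgriFarm entity, where the proposed hasRestrictedArea and hasRoad properties would be expressed as relationships pointing to RestrictedTrafficArea and Road entities.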
All this data collection will be essential to sustain an interoperability structure within the WeLASER project. Thanks to this conversion, connecting the robot is facilitated through the standardization of data, with a view to achieving autonomous and safe human-machine management when defining the robot's context of action.

Figure 3: Example of a study field designed in geojson.io.

4. Discussion

The results obtained after testing a use case at the facilities of the Centre for Automation and Robotics of the Spanish National Research Council (CAR-CSIC) show that the WeLaserMB API is well adapted to its function. A site consisting of several fields and a road was characterized, and the information was extracted and transformed into .json format following the standard implemented by FIWARE.

The adoption of Smart Data models within this project offers the opportunity to provide new archetypes that have not yet been studied in the field of Smart AgriFood. The proposed model is also scalable, as it could support data from other mapping applications such as QGIS, Geoman.io or OpenStreetMap.

One of the main purposes is to simulate, in a virtual way, the ecosystem underlying the WeLASER project. Based on this and on the information collected previously, the different behaviors of the robot within its domain will be examined, as well as the optimal conditions of use to achieve the maximum performance and capacity of the system. A predictive behavioral model is also planned to avoid situations that could lead to a decrease in the efficiency of agricultural processes.

5. Conclusions

The context shown gives us the opportunity to implement new archetypes that have not yet been studied. Following the conceptual design, development and implementation of the use case presented, several main conclusions can be identified:

1. Standardizing the data relating to the physical elements that make up a precision farming system improves optimization and efficiency. This homogenization of information addresses one of the still great challenges in the agriculture of the future, given the diversity of interconnections and existing formats.
2. Adaptability is essential for the correct implementation of the model. It needs to be integrated into intuitive interfaces that are scalable across different devices.
3. Once processed, the data become an essential source for the design of processes by the robot; in addition, their storage and subsequent analysis can provide forecasts with negligible margins of error when predicting future scenarios in the field or in the behavior of the robot. From these data it is possible to implement a digital twin that replicates the behavior of the environment and analyzes the optimum operating conditions in each case.

As a future line of research, the creation of a digital twin instance (DTI) in a 3D robotic simulator is proposed, based on the data obtained in the use case described above, which will allow testing in different usage scenarios.

6. Acknowledgements

This article is part of the WeLASER project, funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101000256.

7. References
[1] P. Corista, D. Ferreira, J. Giao, J. Sarraipa, and R. J. Goncalves, ‘An IoT Agriculture System Using FIWARE’, presented at the 2018 IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC 2018), 2018. doi: 10.1109/ICE.2018.8436381.
[2] R. Martínez, J. Á. Pastor, B. Álvarez, and A. Iborra, ‘A testbed to evaluate the FIWARE-based IoT platform in the domain of precision agriculture’, Sensors, vol. 16, no. 11, 2016. doi: 10.3390/s16111979.
[3] G. S. Kuaban, P. Czekalski, E. L. Molua, and K. Grochla, ‘An Architectural Framework Proposal for IoT Driven Agriculture’, in Computer Networks, Cham, 2019, pp. 18–33. doi: 10.1007/978-3-030-21952-9_2.
[4] M.-R. Miguel Alejandro and L. Serguei, ‘Modelo conceptual de entornos geográficos dinámicos’, Ing. Investig. Tecnol., vol. 15, no. 2, pp. 163–174, Apr. 2014. doi: 10.1016/S1405-7743(14)72207-3.
[5] A. Kaloxylos et al., ‘Farm management systems and the Future Internet era’, Comput. Electron. Agric., vol. 89, pp. 130–144, Nov. 2012. doi: 10.1016/j.compag.2012.09.002.
[6] ‘About FIWARE - FIWARE’, Oct. 20, 2021. https://www.fiware.org/about-us/ (accessed May 13, 2022).
[7] ‘About Edit GeoJSON’. http://geojson.io/about.html (accessed May 13, 2022).
[8] ‘Overview’, WeLASER. https://welaser-project.eu/overview/ (accessed May 16, 2022).
[9] S. Wolfert, L. Ge, C. Verdouw, and M.-J. Bogaardt, ‘Big Data in Smart Farming – A review’, Agric. Syst., vol. 153, pp. 69–80, May 2017. doi: 10.1016/j.agsy.2017.01.023.
[10] C. Verdouw and J. W. Kruize, Digital twins in farm management: illustrations from the FIWARE accelerators SmartAgriFood and Fractals. 2017. doi: 10.5281/zenodo.893662.
[11] H. Moysiadis et al., ‘Combining FIWARE and IoT technologies for smart, small-scale farming: The case of QUHOMA platform architecture’, in Cloud and Fog Computing in 5G Mobile Networks: Emerging Advances and Applications, 2017, pp. 239–270.
[12] Entity: AgriFarm. Smart Data Models, 2022. Accessed: May 16, 2022. [Online]. Available: https://github.com/smart-data-models/dataModel.Agrifood/blob/master/AgriFarm/doc/spec.md
[13] Entity: AgriParcel. Smart Data Models, 2022. Accessed: May 16, 2022. [Online]. Available: https://github.com/smart-data-models/dataModel.Agrifood/blob/master/AgriParcel/doc/spec.md
[14] Entity: RestrictedTrafficArea. Smart Data Models, 2022. Accessed: May 17, 2022. [Online]. Available: https://github.com/smart-data-models/dataModel.Transportation/blob/master/RestrictedTrafficArea/doc/spec.md
[15] Entity: RoadSegment. Smart Data Models, 2022. Accessed: May 17, 2022. [Online]. Available: https://github.com/smart-data-models/dataModel.Transportation/blob/master/RoadSegment/doc/spec.md