=Paper=
{{Paper
|id=Vol-2533/invited3
|storemode=property
|title=Use of Augmented Reality Technology to Develop an Application for Smart Factory Workers
|pdfUrl=https://ceur-ws.org/Vol-2533/invited3.pdf
|volume=Vol-2533
|authors=Ivan Tsmots,Vasyl Teslyuk,Viktor Khavalko,Yurii Lukashchuk,Aneta Poniszewska-Maranda
|dblpUrl=https://dblp.org/rec/conf/dcsmart/TsmotsTKLP19
}}
==Use of Augmented Reality Technology to Develop an Application for Smart Factory Workers ==
Ivan Tsmots1[0000-0002-4033-8618], Vasyl Teslyuk1[0000-0002-5974-9310], Viktor Khavalko1[0000-0002-9585-3078], Yurii Lukashchuk1[0000-0002-8933-8635] and Aneta Poniszewska-Maranda2[0000-0001-7596-0813]

1 Lviv Polytechnic National University, S. Bandery Str., 12, Lviv 79013, Ukraine
{ivan.tsmots, vasylteslyuk, khavalkov, urijlukas}@gmail.com
2 Technical University of Lodz, Wolczanska Str., 215, Lodz, Poland
aneta.poniszewska-maranda@p.lodz.pl

Abstract. The relevance of introducing augmented reality technology into the production process is substantiated. A structure for a new system based on AR technology is developed, which will increase the motivation of employees and raise the level of information assimilation. The research used modern development technologies, namely ASP.NET Core 2 and the Unity 3d environment. The Vuforia platform was used to demonstrate augmented reality, and virtual models of equipment were created in the Solid Works simulation environment. The Factory Maintance Visualisation (FMV) software has been developed, which consists of four scenes and includes all the features described above. Two models of equipment were created for the demonstration: the Cable 6-Axis Robot Arm and the EXCT 100 movement system.

Keywords: Augmented Reality Technology, Smart Factory, Virtual Reality, Client Interface Development, Marker-based AR Approach.

1 Analysis of the state

With the advancement of technology, the general informatization of our lives, and the proliferation of computers and smartphones, humanity has been confronted with a problem called cognitive overload [1-3]: a situation in which the number of operations that must be carried out by the human brain exceeds its capabilities.
Augmented Reality (AR) [2] is precisely the technology that can unload our brains, release part of the cognitive effort and help optimize its use. At first glance, augmented reality may not seem as exciting as virtual reality, but it brings considerable benefit. In contrast to the non-existent virtual space in which fans of computer games are immersed, AR technology is designed to enrich the everyday world with additional information [2-4]. It contains enormous potential, as it moves elements from the virtual world into the real one, complementing the things we are able to see, hear, or even feel. It supplements the physical world with digital data provided by computer devices (smartphones, tablets and AR glasses) in real time. AR uses the environment around the human (as opposed to virtual reality, which "absorbs" a person) and simply imposes a layer of virtual information over it, such as graphics, sounds and reaction to touch. As the virtual and real worlds coexist harmoniously, users of augmented reality experiences have the opportunity to try a whole new, improved world, where virtual information serves as an additional useful tool that assists in daily activities. Thus, augmented reality is a concept that describes the process of supplementing the existing reality with virtual objects, allowing visualization of information [5-10]. Communication with virtual reality is performed in on-line mode, and to produce the necessary effect only a webcam is needed; the image from it is supplemented with virtual objects. The impact of this technology on social development can be comparable to the effect of the appearance of the internet.

Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). 2019 DCSMart Workshop.
2 Formulation of the problem

The relevance of introducing augmented reality technology into the production process is that the use of such a new system will undoubtedly increase the motivation of employees, as well as raise the level of information assimilation due to the diversity and interactivity of its visual representation.

The purpose of this work is to develop a simple and understandable application for employees of an automated ("smart") factory. Such software will facilitate the diagnostics of the equipment at the factory and help prevent breakage. The application also demonstrates equipment through augmented reality technologies, which allows new employees to master their skills faster with any type of equipment.

3 Augmented reality and its classification

Augmented Reality is a term that covers all projects aimed at supplementing reality with virtual elements [3]. This technology is part of mixed reality (MR), which also includes "augmented virtuality" (when real objects are integrated into a virtual environment). In other words, it is a technology for overlaying information in the form of text, graphics, audio and other virtual objects onto real objects in real time. It is this interaction of computing devices with the picture of the real world that distinguishes augmented reality from virtual reality.

As early as the middle of the twentieth century, helmets with a built-in additional display began to be made for military pilots. With its help, the pilot received important information: he saw, for example, how much fuel was left in the tank of the aircraft and what course the machine was on [7]. The year 1999 deserves special attention: that year, "ARToolKit" was released, an open source library for developing applications with augmented reality elements. A few years later, the smartphone era brought a new wave of AR development, this time with many mobile applications for a wide range of consumers.
The very first program of this kind was Wikitude, which represents something similar to Wikipedia. The application has been available for Google Android since the end of 2008, for the iPhone since October 2009 and more recently for Symbian OS. EligoVision was the first to implement augmented reality systems, in 2008 [1-3].

Augmented reality is divided into several types:

1. Marker-based AR [6]. Some also call this image recognition, because it requires a special visual object and a camera. The marker can be anything from a printed QR code to special marks. The AR device also calculates the position and orientation of the markers to place the content in some cases. In this way, the marker initiates the digital animations for users to view, so that, for example, images in a catalog can turn into 3D models.

2. Markerless AR [7]. Also known as location-based or position-based augmented reality, it uses a GPS, compass, gyroscope and accelerometer to provide data based on the user's location. This data then defines what AR content you find or receive in a specific area. On smartphones, this type of AR usually produces maps and directions, or information about nearby businesses. Applications include events and information, pop-ups for business advertisement, and navigation support.

3. Projection-based AR [6, 7]. This projects synthetic light onto a physical surface and in some cases allows you to interact with it. These are the holograms that we have all seen in science fiction movies like "Star Wars". The user's interaction with the projection is detected by its changes.

4. Superimposition-based AR [6]. This replaces the original view, either completely or partially. Object recognition plays a key role; without it, the whole concept is simply impossible. We have all seen an example of an augmented reality overlay in the IKEA Catalog application: it allows users to place virtual furniture items from the catalog in their rooms.

Many modern devices already support augmented reality.
From smartphones and tablets to devices such as Google Glass or other wearables, these technologies continue to evolve. In order to process and render AR, devices and hardware must first meet certain requirements, providing components such as sensors, cameras, an accelerometer, a gyroscope, a digital compass, GPS, a processor and displays [3].

Augmented reality devices are classified into the following categories [3, 6-9]:

Mobile devices (smartphones and tablets) are the most affordable and best suited for AR mobile applications, ranging from pure gaming and entertainment to business analytics, sports and social networks.

Special AR devices are intended primarily and exclusively for work with augmented reality. One such example is head-up displays (HUD), which transfer data onto a transparent display directly in the user's view. Originally introduced to train pilots, such devices are now used in aviation, the automotive industry, production, sports, etc.

AR glasses (or smart glasses): Google Glasses, Meta 2 Glasses, Laster See-Thru Glasses, Laforge AR Accessories and more. These devices are capable of displaying notifications from your smartphone, assisting conveyor line workers, accessing content hands-free and more.

AR contact lenses (or smart lenses) move augmented reality to a new level. Manufacturers such as Samsung and Sony have announced the development of AR lenses. Samsung is working on lenses as a smartphone accessory, while Sony is developing lenses as standalone AR devices (with features such as photography or data storage).

Virtual retina displays (VRD) create images by projecting laser light into the human eye. The goal is to produce bright, high-contrast, high-resolution images.

The scope of this technology is quite large, for example:

Security [6, 7]. Aeroflot has invested in the development of a number of systems to increase the safety of flights using augmented reality technology.
Among them is a pocket complex with glasses for pilots, which enables blind landing in inclement weather (fog, rain, high clouds, etc.) and at night. In addition to the glasses system for pilots, simulators were developed for air traffic controllers and drivers of airfield vehicles. Boeing introduced the Enhanced Vision System (EVS) back in 2007, designed to increase pilots' awareness during approach to the airfield, landing, take-off and taxiing. Using an infrared camera and sensors, it shows the environment on a transparent panel between the pilot and the windshield.

Industry [4, 11, 15]. Augmented reality allows you to show what cannot be shown live, for example, a working hydroelectric power plant floating in the air with a detailed text description of its operation. Moreover, augmented reality does not require any special equipment: an ordinary webcam, an ordinary computer, an ordinary plasma screen and a marker printed on plain paper are enough.

Advertising and promotions [3, 16, 17]. Augmented reality technology is able to breathe life into the concept of promotional games and competitions, turning them into city quests. What if the task is to sell not the product but the emotions associated with that product? For example, face and emotion recognition technology can be used to make people smile, as Unilever did when they created an ice cream vending machine that gave out free ice cream for a smile.

Medicine [9, 10]. During surgery, the surgeon can see important information about the patient's condition in augmented reality glasses. This allows decisions to be made instantly, without wasting time on communication.

Games (Box! Open Me, AppTag, Mosquitos, SpecTrek) are an oasis for AR. The PlayStation Vita, for example, comes with a set of augmented reality maps.

Education [5]. In education, augmented reality is now very popular: albums and encyclopedias are equipped with special markers that help children learn in a more engaging way. AR is an effective mobile learning tool.
There have already been specialized educational projects, such as LearnAR, that use augmented reality technology in lessons. Using this novelty in the field of education will not only make learning clearer, but also help to keep the attention of the audience: like it or not, the slide will remain visible and will even be partially remembered, regardless of the viewer's intent.

4 FactoryMaintanceVisualization Application Structure

For convenient development of the application [13, 14], the work was conventionally divided into two parts:

1. Client interface development in the Unity 3d environment.
2. Development of the server part based on ASP.NET Core 2 technology.

The client part consists of four created "scenes", namely: MainScene, EquipmentScene, EquipmentDetailsScene, ModelVisualisationScene. The development of the client environment used the capabilities of Unity3d, a cross-platform game engine developed by Unity Technologies, together with an augmented reality platform, including libraries and a software development kit (SDK) for creating augmented reality applications.

The creation of the server part used the capabilities of the Microsoft ASP.NET Core platform, which is designed for creating a variety of web applications, from small websites to large web portals and web services, and the Entity Framework platform, a set of ADO.NET technologies that support the development of data processing applications. The FactoryManager database created for the program is also described below.

4.1 Client part structure

4.1.1 MainScene

MainScene is the employee authorization page in the factory system. The purpose of this scene is to check whether the user's login and password coincide with those existing in the database (Fig. 1).

Fig. 1. Employee authorization page in factory system

Canvas, a standard feature in Unity, was used to create it; this is the area where all elements for user interaction are placed.
Two input fields were added to it, LoginInputField and PassInputField, for the login and password respectively.

All the business logic that relates to this scene is implemented in the MainSceneScript file. ChangeSceneObject was created to bind this file to MainScene. This is an empty game object, which serves as an intermediary between the appearance of the scene and its functionality. The query to the test server was based on the UnityEngine WWW method, which is a built-in Unity tool.

The response and user data validation were performed in the OnResponse() function. This was done using a third-party library, SimpleJSON. This library is used in all scenes. If all is well and the data entered by the user coincide with the data in the database, then after clicking the Login button the worker is forwarded to the EquipmentScene.

4.1.2 EquipmentScene

When an employee is successfully logged into the system, the capture device starts to work. The program looks for a marker on which the image will be rendered. Once the necessary section of space is found, the user just has to click on the corresponding button. This work uses a marker-based AR approach, which means that to show something, a plane with the corresponding symbols must be found.

For the project, two markers were created (Fig. 2). As can be seen, a marker consists of two parts. On the left is the plane where the models are shown and the data is displayed. On the right is a virtual button that responds to the movement of the employee and shows information about the equipment (Fig. 3).

Fig. 2. Marker structure

The IVirtualButtonEventHandler interface has been connected in order to work with such a button. It also has the OnButtonReleased method, but in this case it is not needed.

Fig. 3. EquipmentScene window

For requests to the test server, as in the previous case, WWW, the UnityEngine class method, was used. Fig. 4 demonstrates the procedure for calling the server.
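The request-and-validate pattern used in these scenes (send a query to the test server, then check the JSON reply in an OnResponse()-style handler backed by SimpleJSON) can be sketched as follows. This is an illustrative sketch in Python, not the paper's actual C#/Unity code; the JSON field names (login, password) are assumptions.

```python
import json

def on_response(response_text, entered_login, entered_password):
    """Validate the credentials entered on MainScene against the server's
    JSON reply, mirroring the OnResponse() check done with SimpleJSON."""
    try:
        record = json.loads(response_text)
    except json.JSONDecodeError:
        return False  # malformed reply: stay on the login scene
    # Forward to EquipmentScene only if both fields coincide with the stored record
    return (record.get("login") == entered_login
            and record.get("password") == entered_password)

# Example: a reply the test server might send for an employee record
reply = '{"login": "worker1", "password": "secret"}'
print(on_response(reply, "worker1", "secret"))   # True: switch to EquipmentScene
print(on_response(reply, "worker1", "wrong"))    # False: show an error
```

In the real client the comparison happens after a UnityEngine WWW request completes; the sketch only captures the validation step itself.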
The last parameter in the address string corresponds to the id (identifier) of the device whose data the employee is interested in. All the data extraction logic is described in the OnResponse(request) function.

Plane, a standard Unity object, was created to display all this data. It contains three text boxes, characteristiValue1, characteristiValue2 and characteristiValue3, which show the employee the current characteristics of the equipment. There is also an equipmentTitle text box that shows the name of the equipment.

There are also two additional buttons on this scene: View Model and View Details. Each button has its own functionality, switching the scene to ModelVisualisationScene and EquipmentDetailsScene respectively.

Fig. 4. The procedure for calling the server

It should also be noted that the work described above is similar for all models created for the project.

4.1.3 ModelVisualisationScene

ModelVisualisationScene is a scene where a factory worker can get acquainted with a 3D model of the equipment. To switch to it, the program provides the View Model button. Pressing this button redirects the user to the described scene (Fig. 5).

Fig. 5. Cable 6-Axis Robot Arm

Fig. 6 depicts the linear gantry portal EXCT-100 provided by FESTO. This is a so-called Cartesian movement system. Fig. 5 shows the Cable 6-Axis Robot Arm, sponsored by 3DMEISTER.

These models cannot simply be displayed through Unity as-is. They were originally built in Solid Works 2014 and saved in the IGES (Initial Graphics Exchange Specification) format. The next step was to open this file through Autodesk 3ds Max. Due to these manipulations, most textures disappeared from the models. From the 3ds Max environment, the models were exported to Unity in the FBX (Filmbox) format.

Fig. 6. Linear gantry portal EXCT-100

Each piece of equipment has its own characteristics and description.
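The equipment request described above, with the device id as the last parameter of the address string and field extraction in OnResponse(request), might look like the following sketch. Python is used for illustration only; the real client is C#/Unity, and the route shape and JSON field names are assumptions rather than the paper's actual API.

```python
import json

BASE_URL = "http://localhost:57144"  # address of the test server (from the paper)

def equipment_url(equipment_id):
    """Build the request address; the last parameter is the equipment id.
    The 'Equipment' path segment is an assumed controller name."""
    return f"{BASE_URL}/Equipment/{equipment_id}"

def extract_equipment(response_text):
    """Map the server's JSON reply onto the EquipmentScene text boxes:
    equipmentTitle plus characteristiValue1..3 (box names from the paper)."""
    data = json.loads(response_text)
    return {
        "equipmentTitle": data.get("name", ""),
        "characteristiValue1": data.get("characteristic1", ""),
        "characteristiValue2": data.get("characteristic2", ""),
        "characteristiValue3": data.get("characteristic3", ""),
    }

reply = ('{"name": "Cable 6-Axis Robot Arm", "characteristic1": "6 axes", '
         '"characteristic2": "cable drive", "characteristic3": "operational"}')
fields = extract_equipment(reply)
print(equipment_url(3))            # http://localhost:57144/Equipment/3
print(fields["equipmentTitle"])    # Cable 6-Axis Robot Arm
```

Missing fields simply leave the corresponding text box empty, which is one reasonable way to handle an incomplete reply.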
In the case of the program, this was implemented using Plane, a standard Unity tool. An image with all the characteristics of the object was added to it.

The employee also has the ability to rotate the model in arbitrary directions. A special controller was added for this purpose: a module from the Unity Standard Assets suite called MobileSingleStickControl. Also, to implement the movement, a Rigidbody component was attached to the model. Its role is to add physics to the object.

4.1.4 EquipmentDetailsScene

EquipmentDetailsScene is the scene where the detailed condition of the equipment is described (Fig. 7). The main purpose of this scene is to give the employee full information about the current state of the equipment and to show when and by whom the last technical inspection was performed.

Fig. 7. EquipmentDetailsScene

The user is also able to request an additional inspection. Data sent by the employee are stored in the corresponding FactoryManager database table.

For convenience, it was decided not to make this scene in virtual form, but in 2D format. This was done for ease of use. As in the previous scenes, the OnResponse(request) function was created. The function body is similar to the previous one; the only difference is the object where the data is stored.

The second section is historyViewBlock. The history of equipment maintenance is displayed here, and the New Request button is located here. Clicking this button displays a new input box where the employee writes a message. After clicking Submit, the data is sent to the server and stored in the database.

4.2 Server part structure

The test server was created to simulate the operation of the actual plant, at least the part related to the project (Fig. 8).

Fig. 8. Server structure

The server was created as a project in the Visual Studio 2017 development environment. In turn, it is divided into two subprojects.
The first is responsible for creating a database based on a written data model (the so-called code-first approach). The second contains all the server-related logic: controllers, links and configuration files.

4.2.1 DataAccess class library

DataAccess is a class library created to store the data models of the future database. It describes all the classes that will create the corresponding tables in FactoryManager. Employee is a class that describes the user fields, that is, the future columns of the corresponding table. Equipment is the equipment data model. History is a class that describes the fields of the table where the last maintenance of the equipment will be stored. As a result, the FactoryManager database is formed (Fig. 9).

Fig. 9. Database structure

4.2.2 Created controllers

Three controllers were created to receive and process requests from the client side. A controller is a regular class written in the C# programming language. It is inherited from the abstract class Microsoft.AspNetCore.Mvc.Controller. In total, EmployeeController, EquipmentController and HistoryController were created. Each controller is responsible for its own part. Each implements the so-called CRUD operations (create, read, update, delete), the four basic functions of data management: creation, reading, modification and deletion. Through these operations, the controller knows which data to insert into the database, which to update, and which to return to the user.

In order to make a request to the test server, you must correctly create a URL (Uniform Resource Locator). The server runs on "http://localhost:57144". A prefix must be appended to this address; depending on the request, the name of the corresponding controller is substituted into it.

EmployeeController is responsible for the application user data; all operations to extract user data pass through it. In the case of the application, this is the login, password, first and last name.
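As a rough illustration of the routing and CRUD dispatch just described, the sketch below implements an in-memory stand-in for EmployeeController. The real server is C#/ASP.NET Core with Entity Framework; Python is used here only for illustration, and the "api" route segment and field names are assumptions, since the paper does not spell out the exact prefix.

```python
BASE_URL = "http://localhost:57144"

def controller_url(controller, item_id=None):
    """Substitute the controller name into the route prefix.
    The 'api' segment is an assumption; the paper's exact prefix is not given."""
    url = f"{BASE_URL}/api/{controller}"
    return f"{url}/{item_id}" if item_id is not None else url

class EmployeeController:
    """In-memory stand-in for the ASP.NET Core controller: the four CRUD
    operations decide what to insert, update, delete or return."""
    def __init__(self):
        self._table = {}     # stands in for the Employee table in FactoryManager
        self._next_id = 1

    def create(self, login, password, first_name, last_name):
        row = {"id": self._next_id, "login": login, "password": password,
               "firstName": first_name, "lastName": last_name}
        self._table[self._next_id] = row
        self._next_id += 1
        return row

    def read(self, employee_id):
        # Returns user data by its id, as MainScene requests it
        return self._table.get(employee_id)

    def update(self, employee_id, **fields):
        self._table[employee_id].update(fields)

    def delete(self, employee_id):
        self._table.pop(employee_id, None)

ctrl = EmployeeController()
row = ctrl.create("worker1", "secret", "Ivan", "Tsmots")
print(controller_url("Employee", row["id"]))  # http://localhost:57144/api/Employee/1
print(ctrl.read(row["id"])["firstName"])      # Ivan
```

EquipmentController and HistoryController would follow the same shape, with their own tables and fields.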
A method is provided that returns user data based on its id (identifier). Inquiries here come from MainScene.

EquipmentController is responsible for providing the user with information about the relevant equipment. To display data on the EquipmentScene, it returns the name of the equipment, its characteristics and its current state. Part of the controller code is shown below.

HistoryController is responsible for storing and returning data on the latest equipment inspections. The user receives the date of the last inspection, the message and the author of the message. The user can also create a new record in the corresponding database table. This controller is used to process requests from the EquipmentDetailsScene.

So, in this work the Factory Maintance Visualization (FMV) software was developed. It includes all the features described above. Two models of equipment were created for the demonstration: the Cable 6-Axis Robot Arm and the EXCT 100 movement system. Markers, which attach to the equipment, have been developed to show the virtual version of the equipment as well as information about it.

5 Conclusions

This article substantiates the relevance of augmented reality technology and considers various directions of its use in the production process. It substantiates the necessity of developing a mobile educational system based on augmented reality technology in order to increase the level of information assimilation among production sphere employees. The use of such a training system will certainly simplify the perception of theoretical material, as well as allow knowledge to be consolidated in practice by simulating virtual objects in real time.

During this research, the FactoryMaintanceVisualization software was developed. This application will be useful for "smart" factory employees. Such an application will facilitate the diagnostics of equipment at the factory and help prevent breakage.
Also, this application demonstrates equipment through augmented reality technologies, which allows new employees to develop their skills faster with any type of equipment.

FactoryMaintanceVisualization consists of four scenes:

MainScene is responsible for logging the employee into the system;
EquipmentScene provides the user with general equipment information;
EquipmentDetailsScene shows a detailed description of the equipment and displays inspection histories;
ModelVisualisationScene shows a 3D model of the equipment.

A test server was also created to simulate the work of the plant, and a database called "FactoryManager" was developed for easy data sharing.

The Vuforia platform was explored to create this application, as it is a powerful tool for creating augmented reality applications. The capabilities of the Unity 3d development environment and the Solid Works 2014 design environment were also exploited. The latest ASP.NET Core 2 technology was explored to create a server that could provide the user with the necessary information.

References

1. Carmigniani, J., Furht, B., Anisetti, M., Ceravolo, P., Damiani, E., Ivkovic, M.: Augmented reality technologies, systems and applications. Multimedia Tools and Applications, vol. 51(1), pp. 341-377 (2010). doi:10.1007/s11042-010-0660-6
2. Magee, D., Zhu, Y., Ratnalingam, R., Gardner, P., Kessel, D.: An augmented reality simulator for ultrasound guided needle placement training. Medical & Biological Engineering & Computing, vol. 45(10), pp. 957-967 (2007). doi:10.1007/s11517-007-0231-9
3. Krevelen, D., Poelman, R.: A Survey of Augmented Reality Technologies, Applications and Limitations. International Journal of Virtual Reality, vol. 9(2), pp. 1-20 (2010)
4. Kryvenchuk, Y., Shakhovska, N., Melnykova, N., Holoshchuk, R.: Smart Integrated Robotics System for SMEs Controlled by Internet of Things Based on Dynamic Manufacturing Processes. In: Conference on Computer Science and Information Technologies, pp. 535-549 (2018)
5.
Wu, H.-K., Lee, S. W.-Y., Chang, H.-Y., Liang, J.-C.: Current status, opportunities and challenges of augmented reality in education. Computers & Education, vol. 62, pp. 41-49 (2013). doi:10.1016/j.compedu.2012.10.024
6. Peddie, J.: Augmented Reality: Where We Will All Live. Springer (2017)
7. Aukstakalnis, S.: Practical Augmented Reality: A Guide to the Technologies, Applications, and Human Factors for AR and VR. Addison-Wesley, Boston (2016)
8. Borycki, D.: Programming for Mixed Reality with Windows 10, Unity, Vuforia, and UrhoSharp. Pearson Education, London (2018)
9. Tsmots, I., Teslyuk, V., Batyuk, A., Khavalko, V., Mladenow, A.: Information-Analytical Support to Medical Industry. CEUR Workshop Proceedings, vol. 2488, pp. 246-257 (2019). http://ceur-ws.org/Vol-2488/
10. Kryvenchuk, Y., Shakhovska, N., Shvorob, I., Montenegro, S., Nechepurenko, M.: The Smart House based System for the Collection and Analysis of Medical Data. CEUR Workshop Proceedings, vol. 2255, pp. 215-228 (2018)
11. Krasko, O., Kolodiy, R., Khavalko, V.: Wavelength rearrangement and load balancing algorithm for OWTDMA-PON network. In: Modern Problems of Radio Engineering, Telecommunications and Computer Science, Proceedings of the 13th International Conference TCSET-2016, pp. 950-952 (2016)
12. Mourtzis, D., Zogopoulos, V., Xanthi, F.: Augmented reality application to support the assembly of highly customized products and to adapt to production re-scheduling. The International Journal of Advanced Manufacturing Technology, pp. 1-12 (2019). doi:10.1007/s00170-019-03941-6
13. Glover, J.: Unity 2018 Augmented Reality Projects: Build four immersive and fun AR applications using ARKit, ARCore, and Vuforia. Packt Publishing, Birmingham (2018)
14. Glover, J., Linowes, J.: Complete Virtual Reality and Augmented Reality Development with Unity. Packt Publishing, Birmingham (2018)
15.
Mourtzis, D., Zogopoulos, V., Katagis, I., Lagios, P.: Augmented Reality based Visualization of CAM Instructions towards Industry 4.0 paradigm: a CNC Bending Machine case study. Procedia CIRP, vol. 70, pp. 368-373 (2018)
16. Pavlik, J. V., McIntosh, S.: Augmented Reality. In: Converging Media: A New Introduction to Mass Communication, 5th ed., Oxford University Press, pp. 184-185 (2017)
17. Feiner, S., MacIntyre, B., Seligmann, D.: Knowledge-based augmented reality. Communications of the ACM, vol. 36(7), pp. 53-62 (1993). doi:10.1145/159544.159587