<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">System and method of automatic collection of objects in the room</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Mariia</forename><forename type="middle">Yu</forename><surname>Tiahunova</surname></persName>
							<email>mary.tyagunova@gmail.com</email>
						</author>
						<author>
							<persName><forename type="first">Halyna</forename><forename type="middle">H</forename><surname>Kyrychek</surname></persName>
						</author>
						<author>
							<persName><forename type="first">Tetiana</forename><forename type="middle">O</forename><surname>Bohatyrova</surname></persName>
							<email>t.bohatyrova.un@gmail.com</email>
						</author>
						<author>
							<persName><forename type="first">Daryna</forename><forename type="middle">D</forename><surname>Moshynets</surname></persName>
						</author>
						<author>
							<affiliation key="aff0">
								<orgName type="institution">National University &quot;Zaporizhzhia Polytechnic&quot;</orgName>
								<address>
									<addrLine>64 Zhukovsky Str</addrLine>
									<postCode>69063</postCode>
									<settlement>Zaporizhia</settlement>
									<country key="UA">Ukraine</country>
								</address>
							</affiliation>
						</author>
						<author>
							<affiliation key="aff1">
								<address>
									<settlement>Kryvyi Rih</settlement>
									<country key="UA">Ukraine</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">System and method of automatic collection of objects in the room</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">F7EA30C0A05F4B1824BB3A7051304B91</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T22:17+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>roboplatform</term>
					<term>Alphabot</term>
					<term>Arduino</term>
					<term>optimization</term>
					<term>evolutionary method</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>A system and method for the automatic collection of objects in a room, focused on minimum energy consumption, is proposed in this paper. This result is achieved by implementing an improved method of automatic collection of items in the room together with a mathematical method for calculating the desired motion trajectory; developing and implementing algorithms that realize the proposed method; and building a software and hardware implementation of the system for automatic collection of items with minimal energy consumption, featuring a small number of system components and an improved movement trajectory. The system's main component is the Arduino Uno, which acts as the controller. The developed software makes it possible to evaluate the implemented method's effectiveness in a real-life system. An application example of the proposed method is given.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>Humanity is on the verge of a new era of technological development. Robotics is rapidly entering people's daily lives, helping them interact more effectively with automated systems, improving existing jobs and, in general, giving people more time to focus on what they find important, interesting and enjoyable. We are talking about cyber-physical systems for modeling complex sociotechnical systems that largely control themselves.</p><p>The principle of operation of such systems is to combine physical production processes, or any other processes that require continuous real-time control, with software <ref type="bibr" target="#b0">[1]</ref> and electronic systems.</p><p>A robot differs from a conventional automatic system in its multipurpose nature, versatility, and ability to be reconfigured to perform various functions. Automation of robotic systems requires the development of robots capable of intellectual activity, using either a pre-prepared program or artificial intelligence technologies <ref type="bibr" target="#b1">[2,</ref><ref type="bibr" target="#b2">3,</ref><ref type="bibr" target="#b3">4]</ref>.</p><p>Today, the use of robots covers almost all industries and tasks. With the rapid development of robotics, human-machine interaction will soon become a common daily practice. Moreover, technological advances keep increasing the adaptability and flexibility of robots. Most people in developed countries prefer home automation, which allows control of lighting, air conditioning, audio content, security systems, and home appliances. Much attention is paid to mobile robots that are used at home or in situations where human presence is dangerous or unnecessary. Modern robotics focuses on the complete automation and autonomy of robots and robotic systems. A characteristic feature of robots is their ability to partially or completely take over the mechanical, motor and intellectual functions of a person.</p><p>Additional support is provided by connected robots that cover all types of services, such as robotic vacuum cleaners. Modern companies are actively developing this area and creating new assistant robots. The number and popularity of robotic assistants in homes are growing rapidly, owing both to the expansion of the functions they provide and to their falling prices.</p><p>Most robotic home assistants are entertaining in nature, such as a voice assistant built into a music speaker. Another group of robots is designed to free humans from household chores: they can guard houses, wash windows, clean pools and perform small household tasks. Robots greatly improve the quality of our life at home, at work and during leisure. They are adapted to recognize objects, distinguish objects in the environment, repeat simple human movements, and coordinate with other robots. Such functionality has become possible thanks to innovations in robotics, as well as improved algorithms that control the perception, thinking, control and coordination of robots.</p><p>The purpose of the study is to organize the automatic collection of items in a room by developing a system and a collection method that allow not only assessing the results of the work, but also carrying out the collection process with minimal energy consumption.</p><p>The objective of the research is to develop an autonomous robotic system for collecting objects in a room, with increased energy-saving requirements met by the developed method.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">System structure</head><p>The basis of the collection system is the Alphabot double-deck robotic platform, which can be controlled remotely via Bluetooth or WiFi. The robotic platform is equipped with two motors and provides mounting holes for various sensors, controllers and power supplies. The main component of the system is the Arduino Uno hardware and software complex, which acts as the controller. It is a board that houses the microcontroller and other electronic components.</p><p>The MG995R servo motor is used for automatic remote control of the claw, providing its opening and closing functions. The wheels are driven by DC motors with a gearbox; the rotation speed of the motors is determined with the help of photo-interrupter sensors. To measure distance, the HC-SR04 ultrasonic sensor is connected to the robot platform <ref type="bibr" target="#b4">[5,</ref><ref type="bibr" target="#b5">6]</ref>.</p><p>The robotic claw is made of light metal. The claw opens approximately 55 mm and, depending on the servo used, can lift relatively heavy objects.</p><p>The main connections to the Arduino board are shown in figure <ref type="figure" target="#fig_0">1</ref>. The board is connected in the standard way to the Alphabot platform, according to the documentation <ref type="bibr" target="#b4">[5]</ref>. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">System operation</head><p>For the effective functioning of the system, a method for collecting items from the floor of the room was developed that allows collecting with minimal energy consumption. For energy consumption to be minimal, the time spent on the task must also be minimal. This result can be achieved by moving the robot along the path of minimum total distance.</p><p>The developed method of collecting objects from the floor is as follows. The room where the robot moves is treated as a matrix of points, each of which is equal to the length of the robot platform with a claw. The robot is installed at points where it can turn to the left. With the help of an ultrasonic sensor, it determines how far it can travel forward, that is, the distance to a wall or an object <ref type="bibr" target="#b6">[7,</ref><ref type="bibr" target="#b7">8]</ref>.</p><p>As soon as the system reaches an obstacle and it turns out to be a wall, this point becomes the "basket" where objects will be collected. The robot then makes a 180-degree turn and continues to the next obstacle. If there is an object on the way, the robot tries to grab it and, if successful, determines the shortest path to the "basket" from the point at which it stopped, and begins to move in this direction.</p><p>Having reached the collection point, the robot leaves the object, returns to the point where it picked the object up, and continues to search for further objects. If the robot was unable to grip an object because its dimensions differ from the permissible parameters, it makes a detour around the object. When the robot reaches the other end of the room, it finds the shortest route back to the starting point. As soon as the robot is at the starting point of the room, it turns off.</p><p>Based on the above, the algorithm of the system implementing the developed collection method is as follows:</p><p>1. The robot scans the distance to the obstacle using the ultrasonic sensor. 2. The robot moves towards the obstacle. 3. Having stopped at the obstacle, the robot grips the object: the claw opens, the robotic platform travels 4 cm and the claw closes. 4. The robot moves backward 8 cm and checks the distance:</p><p>• if the distance does not change and the "basket" is designated, go to step 8;</p><p>• if the distance does not change and the "basket" is not designated, the robot turns in place by 180 degrees, moves to the wall with the help of the obstacle sensors and turns back; • if the distance changes and the "basket" is designated, go to step 6;</p><p>• if the distance changes and the "basket" is not designated, the robot defines this point as the collection point; then step 6 is performed.</p><p>5. The robot opens the claw, moves forward 4 cm, closes the claw. Then step 9 is performed. 6. Turn left. 7. If the distance allows movement, a 90-degree turn is performed and step 1 follows; otherwise the robot is at the end of the room and step 10 is performed. 8. The shortest path to the "basket" is determined; the robot moves towards it and step 5 is carried out. 9. The shortest path to the point where the object was grasped is determined; when the robot arrives at this point, step 1 is performed. 10. The shortest path to the starting point is determined. 11. The robot moves to the starting point and finishes its work.</p></div>
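As an illustration only, the steps above can be approximated by a grid-world simulation. This is a hypothetical sketch, not the system's firmware: the grid model, the coordinates, the basket position and the Manhattan-distance notion of "shortest path" are all assumptions introduced for the example.

```python
# Hypothetical grid simulation of the collection method (steps 1-11).
# Assumptions: the room is a width x height grid of points, the "basket"
# and the starting point are both at (0, 0), and shortest paths are
# Manhattan distances on an obstacle-free grid.

def manhattan(a, b):
    """Shortest-path length between two grid points (assumed obstacle-free)."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def collect(width, height, objects, basket=(0, 0), start=(0, 0)):
    """Sweep the room serpentine-fashion; return (total distance, objects collected)."""
    objects = set(objects)
    collected = []
    pos, total = start, 0
    for y in range(height):
        # alternate sweep direction, mirroring the turn-and-continue pattern
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        for x in xs:
            cell = (x, y)
            total += manhattan(pos, cell)  # steps 1-2: move to the next point
            pos = cell
            if cell in objects:            # steps 3, 8, 5, 9: grab, deliver, return
                objects.remove(cell)
                collected.append(cell)
                total += 2 * manhattan(cell, basket)
    total += manhattan(pos, start)         # steps 10-11: return to start, stop
    return total, collected
```

In this toy model the delivery detours dominate the total path, which is why the method routes each delivery along the shortest path to the basket.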
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Mathematical justification</head><p>To show that energy consumption will be minimal with the smallest distance traveled by the robot and with the minimum number of devices, we construct the following mathematical model.</p><p>First, let us show that a system with fewer connected devices uses less power. Let S be the distance that the robot covers when moving around the room while collecting all reachable objects. Denote the energy it spends as E_np.</p><p>Let N be the number of connected devices powered from one source.</p><p>The energy of each i-th device is denoted E_i, where i = 1, …, N. Then the energy spent by the robot over the entire path traveled is determined as follows:</p><formula xml:id="formula_0">E_np = ∑_{i=1}^{N} E_i.</formula><p>Since each value E_i cannot be less than zero, by additivity the lower the energy E_i of the i-th device, the lower the total energy E_np spent by the robot while covering the entire distance. This means that the fewer devices connected to the robotic platform, the less energy they spend during a given operating time. Thus, the device developed in this work meets the energy-saving requirement. Secondly, we show that energy consumption is lowest for the minimum distance traveled by the robot. Suppose that the robot moves uniformly, since the time for gripping an object does not depend on the robot's path but remains unchanged regardless of which path was traversed. Then the formula S = vt holds, where t is the time during which the movement is carried out at the given robot speed v.</p><p>Hence v = S/t. Taking into account the energy of motion E = mv²/2, we get</p><formula xml:id="formula_1">E = m(S/t)²/2.</formula><p>This means that the shorter the path traveled by the robot, the less energy is spent by the device, which is what we needed to prove. Thus, it has been shown that the developed system meets the energy-saving requirements and has significant advantages in this respect.</p></div>
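The two arguments above can be collected into one short derivation, restating the paper's notation in LaTeX:

```latex
% additivity over the N connected devices, each with nonnegative energy
E_{np} = \sum_{i=1}^{N} E_i, \qquad E_i \ge 0 .
% uniform motion: S = v t \;\Rightarrow\; v = S/t, hence
E = \frac{m v^{2}}{2}
  = \frac{m}{2}\left(\frac{S}{t}\right)^{2}
  = \frac{m S^{2}}{2 t^{2}} .
```

For fixed mass m and operating time t, E grows as S², so the minimal total path gives the minimal kinetic energy; likewise, removing any device with E_i > 0 strictly lowers E_np.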
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Analysis of similar systems</head><p>Minimizing energy costs is always an urgent problem. Many scientists deal with this issue, and roboticists are no exception. Naturally, there are many studies on similar topics, some of which can be considered distant analogs of the developed system.</p><p>For example, Taniguchi et al. <ref type="bibr" target="#b8">[9]</ref> consider autonomous planning based on spatial concepts for tidying up a house using service robots. To complete the tidy-up task, the robot must recognize the environment, evaluate objects to clean while moving, and determine where to place objects in three-dimensional (3D) space.</p><p>The proposed tidy-up planning method first selects the object whose tidied place is the most clearly defined. In the spatial concept model learned in the tidied environment, the likelihood is highest when each object position is tidied. Therefore, when tidying up one object from a set of scattered objects, the object with the highest likelihood is selected first. In this study, tidying up means moving the positions of the observed objects to increase the likelihood under the spatial concept model.</p><p>First, the robot detects cluttered objects while moving. Next, if multiple scattered objects are detected, the robot decides on an object and a place to be tidied up. The robot simultaneously estimates the order and positions of the objects to tidy up from the multiple objects it observed in the cluttered environment. Moreover, the robot can determine from the corresponding equation whether the tidied position of an object is unknown. If it is unknown, the robot can ask where to move the object.</p><p>Finally, the robot performs motion planning. After estimating the tidied positions and planning the object order, the robot must accurately manipulate the objects to move them. The developed system uses the MoveIt! framework for motion planning of the robot arm when grasping an object. Furthermore, the robot moves to the appropriate search positions while performing self-localization based on its map and observations. When the robot finishes tidying an object, the state of the other objects may have changed owing to external factors. To handle such scenarios, the tidy-up task can be planned sequentially by redoing object detection after tidying each object.</p><p>The approach is divided into learning and planning phases. In the learning phase, the robot observes the tidied environment. Spatial concepts are formed from observed data about objects and their 3D positions by multimodal learning. In the planning phase, the robot observes scattered objects in the environment and estimates the order and positions of objects to be tidied up based on the spatial concepts formed in the learning phase.</p><p>According to this flow, the robot repeatedly tidies a selected object to its target position. In the case of an unknown object, the robot tidies it up by asking the user for its place name.</p><p>Compared with the system developed here, the system described above is much more advanced: a robot that has undergone machine learning scans the surrounding space using a camera and builds the shortest paths based on the data obtained and built-in algorithms. On the one hand, such a system is more advantageous because it finds goals and shortest paths in advance; on the other hand, it requires much higher development costs than the one presented in this article.</p><p>A large number of works consider the construction of trajectories for robots with obstacle detection. For example, in <ref type="bibr" target="#b9">[10]</ref> space exploration uses an algorithm based on the "Spiral" and "Movement along the wall" motion algorithms.</p><p>The "Spiral" algorithm is movement along a circle of increasing radius. 
The "Movement along the wall" algorithm has the robot move along the perimeter of the space. Unlike the "Spiral" algorithm, here the robot can go around obstacles. The proposed algorithm for teaching a mobile robot to detect obstacles is a modified combination of the two algorithms described above. The learning algorithm consists of three parts. The first, main part is the algorithm for passing one circle (figure <ref type="figure" target="#fig_1">2</ref>). The robot begins its exploration of space from the outermost feasible perimeter, and with each circle it narrows the radius by one. For the robot to narrow the radius with each completed circle, the second part of the algorithm is needed: the algorithm for starting the exploration of a new circle (figure <ref type="figure" target="#fig_2">3</ref>).</p><p>After finishing the movement according to this algorithm, the robot needs to check the space for unexplored areas (figure <ref type="figure" target="#fig_3">4</ref>). When there are few obstacles and their shape is simple, the robot explores the space in one pass. Otherwise, there will be areas unknown to the robot. To prevent the robot from spending a lot of time on the way to such an area, a shortest-path algorithm must be used. There are many obstacle-avoidance algorithms; among the classical ones, Dijkstra's algorithm is the most efficient.</p><p>In <ref type="bibr" target="#b10">[11]</ref>, the trajectory planning problem is solved with an approach based on Hopfield artificial neural networks, which, together with fuzzy and genetic algorithms, belong to the class of intelligent algorithms; their main advantage is computation speed with a moderate load on onboard computing complexes. However, the use of this mathematical apparatus imposes significant restrictions on the minimum system requirements of the mobile robot's computing platform, which excludes implementing the method on devices with limited computing resources, such as AVR or STM microcontrollers with less than 32-bit processors; this is a significant drawback.</p><p>Based on everything discussed above, we can conclude that the developed system is indeed relevant, having no exact analogs and showing an advantage over others for the stated goals.</p></div>
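Dijkstra's algorithm, named above as the most efficient classical choice, can be sketched on a grid as follows. The grid encoding (0 for free, 1 for obstacle), 4-connected moves and unit step cost are illustrative assumptions for the example.

```python
# Minimal Dijkstra sketch on a grid with obstacles (0 = free, 1 = obstacle).
import heapq

def dijkstra(grid, start, goal):
    """Return the length of the shortest obstacle-free path, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                       # stale queue entry, already improved
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1                 # unit cost per cell
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None
```

With unit edge costs this degenerates to breadth-first search, but the same code accepts weighted cells, which is why the classical literature prefers Dijkstra here.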
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.">System implementation</head><p>When the Arduino board is connected to power, the firmware begins executing. The microcontroller is configured so that the bootloader is responsible for control at system startup. First, the bootloader waits 1-2 seconds to check whether the user is sending a new program. If the reprogramming process is started, the sketch is loaded into memory and control is transferred to it. If there is no new program, the bootloader executes the previously saved one <ref type="bibr" target="#b11">[12]</ref>. After starting program execution, the Arduino performs a number of routine operations initializing and setting up the environment, and only then proceeds to execute the code contained in the sketch. Thus, Arduino eliminates the need to remember all the details of the microprocessor architecture and lets the developer concentrate on the task at hand.</p><p>The robotic platform contains a large number of sensors and actuators, which can be controlled through the AlphaBot.h library, available from the official AlphaBot website <ref type="bibr" target="#b12">[13]</ref>. The instance used to send commands from the sketch is Car1, and the methods it provides are: SetSpeed(), Forward(), Brake(), Backward(), Left(), Right(), LeftCircle(), RightCircle(), MotorRun().</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Connections to the Arduino board.</figDesc><graphic coords="3,89.29,149.26,416.68,150.07" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: Algorithm for passing one circle.</figDesc><graphic coords="7,141.38,84.19,312.52,406.94" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Algorithm for starting the study of a new circle.</figDesc><graphic coords="8,203.88,84.19,187.52,233.15" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: Algorithm for check the space for unexplored areas.</figDesc><graphic coords="8,203.88,356.75,187.51,149.10" type="bitmap" /></figure>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" xml:id="foot_0">CS&amp;SE@SW 2021: 4th Workshop for Young Scientists in Computer Science &amp; Software Engineering, December 18, 2021,</note>
		</body>
		<back>

			<div type="funding">
<div xmlns="http://www.tei-c.org/ns/1.0"><p>D. D. Moshynets) https://www.facebook.com/mary.tyagunova.7 (M. Yu. Tiahunova); http://csn.zntu.edu.ua/galina-grigorivna-kirichek (H. H. Kyrychek); https://www.facebook.com/profile.php?id=100074077952151 (T. O. Bohatyrova); https://www.facebook.com/profile.php?id=100012258404028 (D. D. Moshynets) 0000-0002-9166-5897 (M. Yu. Tiahunova); 0000-0002-0405-7122 (H. H. Kyrychek); 0000-0002-3430-2117 (T. O. Bohatyrova); 0000-0002-7743-3985 (D. D. Moshynets</p></div>
			</div>

			<div type="annex">
<div xmlns="http://www.tei-c.org/ns/1.0"><p>A set of functions for controlling a servo drive is provided by the Servo.h library:</p><p>• servo.attach() -indicates the pin to which the servo control wire is connected; • servo.write() -sets the turning angle; • servo.writeMicroseconds(uS) -passes a control value in microseconds (uS), setting the rotation angle to the corresponding value; • servo.read() -reads the current rotation angle of the servo; • servo.attached() -determines whether a pin is bound to the servo; • servo.detach() -disconnects a pin from the Servo library.</p><p>After the program is loaded, the Arduino allows the code to take part in system initialization. To do this, the microcontroller is given commands that it executes at boot time and never returns to (that is, these commands are executed only once at system startup). For this purpose, a block is allocated in the program to hold such commands: void setup() <ref type="bibr" target="#b13">[14,</ref><ref type="bibr" target="#b14">15]</ref>. In this function, the necessary properties are set: the configuration of the ultrasonic sensor, the speed of the robot, and the pin for connecting the servo drive. 
Variables used in the program for the functioning of the robot:</p><p>• run -sets the start of the robot's operation;</p><p>• side -determines the side to which the robot is returning;</p><p>• distance -the current distance to the obstacle;</p><p>• spacing -stores the previous value of the distance to the obstacle; • cart -indicates the presence of a "basket"; • pos_last_point -stores the distance to the point where the object was taken;</p><p>• time_turn -stores the time in which a turn is carried out;</p><p>• pos_cart -stores the distance to the "basket".</p><p>After setup() there must be a loop() function, the entry point of the program, which ensures the execution of commands while the Arduino board is turned on. This function contains all commands that are executed cyclically. Starting from the first command, the microcontroller performs all subsequent steps to the very end and immediately returns to the beginning to repeat the same instructions. Distance checking in the program is performed by the check_distance() function. It checks the driving distance in the range from 2 to 30 cm. If the distance to the object is less than the value of the distance before the capture, the object is considered to have been taken by the robot. Otherwise, the obstacle is defined as a wall.</p><p>To leave an item in the "basket", the program calls the leave_object() method. The obstacle sensors are controlled by the sensor_check() function: the robot is set to move backward, and a cycle runs until the sensors report the object; upon completion, the system performs a 180-degree turn in place. Determining the shortest path to the "basket" is provided by the road_to_busket() function. 
The side variable determines the side to return to (1 is left, 2 is right).</p><p>The road_to_last_point() function returns the robot along the shortest path to the point where the item was taken.</p><p>To return the robot to its starting position, the road_to_start_point() function is called. The turn() function determines which side to turn to and carries out movement in that direction. </p></div>
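The capture check described for check_distance() can be modeled by a small decision function. This is an illustrative Python model, not the sketch's code: the function name and return values are invented for the example, while the 2-30 cm working range and the comparison rule come from the text.

```python
# Illustrative model of the capture check: per the text, a reading smaller
# than the pre-grab distance means the object was taken by the robot;
# otherwise the obstacle is classified as a wall.

RANGE_MIN_CM = 2    # working range of the distance check, per the text
RANGE_MAX_CM = 30

def capture_result(before_cm, after_cm):
    """Classify a grab attempt as 'out_of_range', 'taken' or 'wall'."""
    if not (RANGE_MIN_CM <= after_cm <= RANGE_MAX_CM):
        return "out_of_range"   # reading outside the 2-30 cm window
    if after_cm < before_cm:
        return "taken"
    return "wall"
```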
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.">Conclusion</head><p>On the basis of the Arduino family platform, an object collection system was created that is able to: estimate the distance to an obstacle using an ultrasonic sensor; move to a given point; grab an object no larger than 5.5 cm; deliver the item to the collection point; and determine the shortest path to a given point. Movement to the right is performed analogously to the left-movement algorithm.</p><p>There are limitations associated with the use of an ultrasonic distance sensor:</p><p>1. Partial reflections can distort the measurement results (this can be caused by surfaces that are curved or inclined with respect to the direction of signal emission). 2. Objects made of sound-absorbing, insulating materials, or with a woolen surface, can weaken the signal. 3. The size of the object affects the signal quality: the smaller it is, the weaker the reflected signal. 4. High humidity (rain, snow) contributes to signal distortion.</p><p>The system can be improved with a movable ultrasonic sensor mount, which will help determine the width of an object in order to calculate the distance required to avoid the obstacle.</p><p>The scientific novelty of the work lies in the fact that a new method of collecting objects in a room, and a system implementing it, have been developed that organize the collection process with minimal energy consumption.</p><p>The practical significance of the work is that software has been developed which makes it possible to assess the adequacy and effectiveness of the developed method in the real operation of the system.</p></div>			</div>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Implementation of web system optimization method</title>
		<author>
			<persName><forename type="first">G</forename><surname>Kirichek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Skrupsky</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Tiahunova</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Timenko</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">CEUR Workshop Proceedings</title>
		<imprint>
			<biblScope unit="volume">2608</biblScope>
			<biblScope unit="page" from="199" to="210" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">The neural network for emotions recognition under special conditions</title>
		<author>
			<persName><forename type="first">M</forename><surname>Tiahunova</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><surname>Tronkina</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Kirichek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Skrupsky</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">CEUR Workshop Proceedings</title>
		<imprint>
			<biblScope unit="volume">2864</biblScope>
			<biblScope unit="page" from="121" to="134" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Computer simulation of neural networks using spreadsheets: The dawn of the Age of Camelot</title>
		<author>
			<persName><forename type="first">S</forename><surname>Semerikov</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Teplytskyi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Yechkalo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Kiv</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">CEUR Workshop Proceedings</title>
		<imprint>
			<biblScope unit="volume">2257</biblScope>
			<biblScope unit="page" from="122" to="147" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">System for detecting network anomalies using a hybrid of an uncontrolled and controlled neural network</title>
		<author>
			<persName><forename type="first">G</forename><surname>Kirichek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Harkusha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Timenko</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Kulykovska</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">CEUR Workshop Proceedings</title>
		<imprint>
			<biblScope unit="volume">2546</biblScope>
			<biblScope unit="page" from="138" to="148" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<monogr>
		<title level="m" type="main">AlphaBot</title>
		<author>
			<persName><surname>Waveshare-Electronics</surname></persName>
		</author>
		<ptr target="http://www.waveshare.com/wiki/AlphaBot" />
		<imprint>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<monogr>
		<title level="m" type="main">Designing Embedded Systems with Arduino: A Fundamental Technology for Makers</title>
		<author>
			<persName><forename type="first">T</forename><surname>Pan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Zhu</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-981-10-4418-2</idno>
		<imprint>
			<date type="published" when="2018">2018</date>
			<publisher>Springer</publisher>
			<pubPlace>Singapore</pubPlace>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Development of a robot for transporting small-sized objects based on an AVR microcontroller (Razrabotka robota dlya transportirovki malogabaritnykh obyektov na baze mikrokontrollera AVR)</title>
		<author>
			<persName><forename type="first">M</forename><surname>Boranbaev</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Molodoy uchenyy</title>
		<imprint>
			<biblScope unit="page" from="277" to="286" />
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">A review of active mechanical driving principles of spherical robots</title>
		<author>
			<persName><forename type="first">R</forename><surname>Chase</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Pandya</surname></persName>
		</author>
		<idno type="DOI">10.3390/robotics1010003</idno>
		<ptr target="https://www.mdpi.com/2218-6581/1/1/3" />
	</analytic>
	<monogr>
		<title level="j">Robotics</title>
		<imprint>
			<biblScope unit="volume">1</biblScope>
			<biblScope unit="page" from="3" to="23" />
			<date type="published" when="2012">2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Autonomous planning based on spatial concepts to tidy up home environments with service robots</title>
		<author>
			<persName><forename type="first">A</forename><surname>Taniguchi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Isobe</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">E</forename><surname>Hafi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Hagiwara</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Taniguchi</surname></persName>
		</author>
		<idno type="DOI">10.1080/01691864.2021.1890212</idno>
	</analytic>
	<monogr>
		<title level="j">Advanced Robotics</title>
		<imprint>
			<biblScope unit="volume">35</biblScope>
			<biblScope unit="page" from="471" to="489" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Development of a learning algorithm for a mobile robot in order to detect obstacles in a confined space (Razrabotka algoritma obucheniya mobil&apos;nogo robota v tselyakh obnaruzheniya prepyatstviy v zamknutom prostranstve)</title>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">V</forename><surname>Avseeva</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">V</forename><surname>Larina</surname></persName>
		</author>
		<idno type="DOI">10.20914/2310-1202-2017-3-65-67</idno>
	</analytic>
	<monogr>
		<title level="j">Voronezh State University of Engineering Technologies</title>
		<imprint>
			<biblScope unit="volume">79</biblScope>
			<biblScope unit="page" from="65" to="67" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Synthesis of a neural network path planning system for a group of mobile robots (Sintez neyrosetevoy sistemy planirovaniya trayektorii dlya grupp mobilnykh robotov)</title>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">S</forename><surname>Yudintsev</surname></persName>
		</author>
		<idno type="DOI">10.24411/2410-9916-2019-10406</idno>
	</analytic>
	<monogr>
		<title level="j">Systems of Control, Communication and Security</title>
		<imprint>
			<biblScope unit="page" from="163" to="186" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<monogr>
		<title level="m" type="main">Exploring Arduino®: Tools and Techniques for Engineering Wizardry</title>
		<author>
			<persName><forename type="first">J</forename><surname>Blum</surname></persName>
		</author>
		<imprint>
			<date type="published" when="2013">2013</date>
			<publisher>Wiley</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<monogr>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">V</forename><surname>Antipov</surname></persName>
		</author>
		<ptr target="https://cxem.net/uprav/uprav66.php" />
		<title level="m">Robot platform RedBoard (Robot-platforma RedBoard)</title>
		<imprint>
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<monogr>
		<author>
			<persName><forename type="first">V</forename><surname>Petin</surname></persName>
		</author>
		<title level="m">Designing with an Arduino controller (Proyekty s ispolzovaniyem kontrollera Arduino)</title>
		<imprint>
			<publisher>BHV-Peterburg</publisher>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<monogr>
		<title level="m" type="main">Understanding and design of an Arduino-based PID controller</title>
		<author>
			<persName><forename type="first">V</forename><surname>Bista</surname></persName>
		</author>
		<idno type="DOI">10.25772/790F-JP22</idno>
		<imprint>
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
