<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Feature Extraction for Terrain Classification with Crawling Robots</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Jakub</forename><surname>Mrva</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Czech Technical University in Prague</orgName>
								<address>
									<addrLine>Technická 2</addrLine>
									<postCode>166 27</postCode>
									<settlement>Prague</settlement>
									<country key="CZ">Czech Republic</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Jan</forename><surname>Faigl</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Czech Technical University in Prague</orgName>
								<address>
									<addrLine>Technická 2</addrLine>
									<postCode>166 27</postCode>
									<settlement>Prague</settlement>
									<country key="CZ">Czech Republic</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Feature Extraction for Terrain Classification with Crawling Robots</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">52FC17797904E0368B0A5F1EB8E7CB6A</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T16:09+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>In this paper, we address the problem of terrain classification using a technically blind hexapod walking robot. The proposed approach is built on top of an existing method based on analysis of the feedback from the robot's actuators and the desired trajectory. The former method uses features for the Support Vector Machine classification method that assume a regular time-invariant gait to control the robot. However, such a gait does not allow the robot to traverse rough terrains, and therefore, it is necessary to consider an adaptive motion gait to deal with small obstacles, which is, unfortunately, not a regular gait with a fixed predefined period. Therefore, we propose to alter the feature extraction process to utilize the terrain classification method also with an adaptive motion gait, which enables the robot to traverse rough terrains. The proposed method has been experimentally verified on several terrains that are not traversable by the default regular gait. The achieved results not only confirm a classification accuracy as high as that of the existing approach, but also expand the area of operation of a hexapod walking robot to more challenging terrains.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">Introduction</head><p>Crawling robots can operate on a much wider range of terrains than classical wheeled robots. The control complexity is, however, much greater due to the high number of degrees of freedom (DOF). One way to handle a high number of DOF is to generate a walking pattern-a gait <ref type="bibr" target="#b0">[1]</ref>. A simple regular gait gives the robot predefined trajectories for all legs, which therefore alternate between their support and transfer phases.</p><p>In order to increase the robot's perception of the environment-for example, to classify the terrain the robot is traversing-one can equip the robot with a variety of sensors. Two complementary approaches can be found, based on exteroceptive and interoceptive sensors. In the case of exteroceptive sensing, we can utilize a camera <ref type="bibr" target="#b1">[2,</ref><ref type="bibr" target="#b2">3]</ref> or laser-based range measurements <ref type="bibr" target="#b3">[4]</ref> for terrain classification.</p><p>However, if the robot is technically blind and dependent solely on interoceptive sensing, we can use force, torque <ref type="bibr" target="#b4">[5,</ref><ref type="bibr" target="#b5">6]</ref>, or other tactile sensors to gather data about the interaction of the robot with the terrain. Moreover, we can utilize the robot's actuators themselves and develop a classifier based on the differences between the expected and real trajectories of the robot's servo drives <ref type="bibr" target="#b6">[7]</ref> without the need for any additional sensor. The existing method <ref type="bibr" target="#b6">[7]</ref> uses a default robot motion gait with regular and periodic phases of the leg movements, and therefore, it is suitable only for flat terrains without significant obstacles. 
Based on this method, we consider several terrains with obstacles or stairs and their classification using an adaptive motion gait <ref type="bibr" target="#b7">[8]</ref> that allows a smooth transition while reducing the workload of the servos and thus avoiding overheating. Such a gait does not preserve the predefined trajectories of each leg as the default motion gait does. Hence, the existing method of feature extraction proposed in <ref type="bibr" target="#b6">[7]</ref> is not directly applicable because it assumes a regular time-invariant gait with fixed trajectories. Therefore, in this paper, we propose a modification of the feature extraction process of <ref type="bibr" target="#b6">[7]</ref> to enable terrain classification based only on the servo drive feedback also when crawling rough terrain using the adaptive motion gait. As a result, the method can be used in more challenging terrains up to the structural limits of the hexapod walking robot.</p><p>The paper is organized as follows. A brief overview of adaptive motion gaits for rough terrains and terrain classification methods is provided in the next section. A description of the considered robotic platform and a definition of the problem are presented in Section 3. The utilized adaptive motion gait is briefly described in Section 4. The proposed feature extraction method is presented in Section 5 and experimental results in Section 6. Concluding remarks are in Section 7.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">Related Work</head><p>Adaptive motion of a walking robot to traverse rough terrain has been addressed by many researchers, and several approaches can be found in the literature. A complex control architecture for a quadruped walking robot traversing challenging terrains has been presented in <ref type="bibr" target="#b8">[9]</ref>, using several sensors attached to the robot and a precise map created off-line. The off-line scanning can be avoided by using an elevation map created from an on-board laser scanner, which is further used to alter the gait according to the terrain structure <ref type="bibr" target="#b9">[10]</ref>.</p><p>Another direction of adaptive motion gaits is based on approaches that do not utilize a terrain map. They rely on tactile information from force sensors <ref type="bibr" target="#b10">[11]</ref> (or torque-based estimation of the force <ref type="bibr" target="#b11">[12]</ref>) to adapt the gait according to the terrain and to ensure the leg reaches a foothold. A passive actuator to measure the ground reaction force has been proposed in <ref type="bibr" target="#b12">[13]</ref> to substitute for direct force or torque sensors, which is a suitable approach for the deployment of cheap robotic platforms. In <ref type="bibr" target="#b7">[8]</ref>, we proposed a similar approach that is even more minimalist, since it does not need additional servos and is thus based solely on the robot's actuators.</p><p>The problem of terrain classification is also widely investigated with respect to on-board processing. A camera can be used to estimate the terrain class based on extracted features <ref type="bibr" target="#b2">[3]</ref> that can be further used to select an energy efficient motion gait <ref type="bibr" target="#b1">[2]</ref>. 
Authors of <ref type="bibr" target="#b13">[14]</ref> used a laser range finder to distinguish between twelve terrains and achieved promising results; however, only under specific laboratory conditions.</p><p>Focusing on a structural point of view, an off-line scan of the terrain from a precise external laser-scanning system was used in <ref type="bibr" target="#b14">[15]</ref> to generate a database of terrain templates that are used for proper foothold planning. Authors of <ref type="bibr" target="#b3">[4]</ref> proposed an approach to avoid building a large database of templates. Their idea is based on creating a set of templates that define good and poor footholds based on local concavity and slope, which are useful attributes for predicting the slipperiness of the terrain.</p><p>Besides exteroceptive sensing, tactile sensors are used to classify the terrain based on direct measurements of the robot's interaction with the terrain. In <ref type="bibr" target="#b4">[5]</ref>, the authors used features extracted from the measurements of force sensors placed at the tip of the leg, combined with the measurements of the motor current of the knee joint of a single vibrating robot leg detached from the body. A 6-DOF torque-force sensor was used in <ref type="bibr" target="#b15">[16]</ref> (under the same laboratory conditions as in <ref type="bibr" target="#b13">[14]</ref>) for a discriminant analysis between six types of terrain. However, all of these approaches are based on additional sensors, and therefore, they increase the complexity of the robot.</p><p>A slightly different approach that utilizes only interoceptive sensors built into the actuators was presented in <ref type="bibr" target="#b6">[7]</ref>. The actuators contain position controllers that can report both the desired and the current position of the servo. 
The difference between these positions is then analyzed in the time and frequency domains to extract a 660-dimensional feature vector from the two front legs during each gait cycle. This method is limited to a periodic time-invariant gait and is thus applicable almost exclusively to flat terrains without obstacles.</p><p>In this paper, we propose a combination of the terrain classification method <ref type="bibr" target="#b6">[7]</ref> with the adaptive motion gait <ref type="bibr" target="#b7">[8]</ref>. Both of these approaches are based solely on interoceptive sensing using active actuators, and we propose a new feature extraction procedure to overcome the limitations of <ref type="bibr" target="#b6">[7]</ref> and enable on-line classification of rough terrains.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">Problem Statement</head><p>The main problem addressed in this paper is to extend the existing terrain classification approach <ref type="bibr" target="#b6">[7]</ref> to the adaptive motion gait <ref type="bibr" target="#b7">[8]</ref> and thus generalize terrain classification also to traversing rough terrains. A new feature extraction method is needed to deal with rough terrains because the adaptive motion gait does not preserve the condition required by <ref type="bibr" target="#b6">[7]</ref>, i.e., a time-invariant motion gait. In the proposed approach, we consider a relatively cheap and easy-to-use platform, the PhantomX Hexapod Mark II with Dynamixel AX-12A actuators, see Fig. <ref type="figure" target="#fig_0">1</ref>, which is further described in the next section. An overview of the terrain classification method <ref type="bibr" target="#b6">[7]</ref> is provided in Section 3.2 as a background for the proposed approach.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1">Hexapod Structure</head><p>The used hexapod platform has six legs, each with three joints formed by the Dynamixel actuators. The schema of the leg and the description of its parts are depicted in Fig. <ref type="figure" target="#fig_1">2</ref>. All joints (θ C , θ F , and θ T ) are controlled by a position controller that provides the following information every 33 ms:</p><formula xml:id="formula_0">• Desired position θ des ; • Current position θ cur ; • Error in position e = |θ des − θ cur |.</formula><p>Using the adaptive motion gait <ref type="bibr" target="#b7">[8]</ref>, the robot can traverse small obstacles up to the limits of the robot structure. The joint θ C is fixed to the body with a vertical rotation axis, while the other two joints have a horizontal axis.</p><p>We consider that the robot operates in an environment that satisfies the robot's structural limits and contains no obstacle so large that the robot cannot traverse it. Hence, we are not addressing obstacle avoidance and other high-level navigation problems in this paper. We are strictly focused on the problem of terrain classification and its practical validation in real experiments.</p></div>
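The per-joint feedback listed above can be modeled with a minimal sketch (the class and field names are illustrative, not the actual Dynamixel interface):

```python
from dataclasses import dataclass

@dataclass
class JointFeedback:
    """Feedback reported by one position-controlled joint every 33 ms."""
    theta_des: float  # desired position (rad)
    theta_cur: float  # current position (rad)

    @property
    def error(self) -> float:
        """Position error e = |theta_des - theta_cur| used for classification."""
        return abs(self.theta_des - self.theta_cur)

# one leg carries three such joints: coxa, femur, tibia
leg = {name: JointFeedback(0.5, 0.47) for name in ("coxa", "femur", "tibia")}
print(round(leg["femur"].error, 3))
```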
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.2">Terrain Classification</head><p>The terrain classification method <ref type="bibr" target="#b6">[7]</ref> has been proposed for the same hexapod platform as we are using; however, a regular time-invariant gait is utilized for the robot motion. The general idea of the terrain classification is based on the small errors in position control (e) of all servo drives of the front legs (i.e., six servos) that are measured in the time domain at a non-uniform sample rate of approximately 20 Hz.</p><p>In order to obtain denser data and a uniform sample rate for the FFT used in the feature extraction, the signal is interpolated using a cubic Hermite spline interpolation method that creates a continuous function with a continuous first derivative. The interpolated function is then resampled at a frequency of 100 Hz. After that, a feature vector for the classification is created from the sampled data and computed characteristics of the signal.</p><p>Having the default regular gait, the data are windowed using a uniform window that contains the last three full gait cycles; so, each terrain-class prediction is based on the past three gait cycles' worth of data. Motivated by the observation that different terrain surfaces induce a specific behavior in different sections of the gait cycle, the data are divided into 16 equally wide segments within a gait cycle to form a gait-phase domain. Respective segments from the last three gait cycles are joined together, and basic statistics of all data samples that fall within are computed, yielding 5 values (features) for each segment (i.e., minimum, maximum, mean, median, and standard deviation). Repeated for each servo, we obtain a total of 480 gait-phase features, i.e., 2 (legs) × 3 (servos per leg) × 5 (features) × 16 (segments).</p><p>An additional 180 features are calculated in the frequency domain. 
A Hamming window and a discrete Fourier transform are applied to the same resampled position error signal to obtain a frequency spectrum of 25 bins (0-12 Hz). All amplitude values of the frequency bins are used directly, giving another 25 features (for each servo), supplemented by 4 features obtained from the shape of the spectrum (i.e., centroid, standard deviation, skewness, and kurtosis) and finalized with the energy of the spectrum. The overall 660-dimensional feature vector consists of:</p><p>• Statistics of each segment for each servo (5 × 16 × 6 = 480); • Bins of the frequency spectrum (25 × 6 = 150); • Shape of the frequency spectrum (4 × 6 = 24); • Energy of the frequency spectrum (1 × 6 = 6).</p><p>Such 660-dimensional feature vectors, collected for each type of terrain over several trials of traversing the terrain, are used to train a multi-class linear Support Vector Machine (SVM) classifier, and the authors of <ref type="bibr" target="#b6">[7]</ref> report 95% accuracy in distinguishing between 3 terrain classes (concrete, grass, and rocks/mulch).</p><p>In the approach presented in this paper, we follow the same idea of an SVM-based terrain classifier, but we propose a new feature extraction process to address the absence of regularity in the adaptive motion gait for crawling rough terrains.</p></div>
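The resampling and segment-statistics steps of [7] can be sketched as follows. This is a simplified illustration, not the authors' code; the cubic Hermite spline uses finite-difference tangents, and the synthetic signal stands in for real servo data:

```python
import numpy as np

def cubic_hermite_resample(t, e, fs=100.0):
    """Resample a non-uniformly sampled signal at a uniform rate fs (Hz) with
    a cubic Hermite spline, i.e., a continuous function with a continuous
    first derivative. Tangents are estimated by finite differences."""
    m = np.gradient(e, t)                        # tangent estimate per sample
    tq = np.arange(t[0], t[-1], 1.0 / fs)        # uniform query times
    i = np.clip(np.searchsorted(t, tq, side="right") - 1, 0, len(t) - 2)
    h = t[i + 1] - t[i]
    s = (tq - t[i]) / h                          # normalized position in interval
    h00 = 2 * s**3 - 3 * s**2 + 1                # cubic Hermite basis polynomials
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return tq, h00 * e[i] + h10 * h * m[i] + h01 * e[i + 1] + h11 * h * m[i + 1]

def segment_statistics(e_windowed, n_segments=16):
    """Divide one servo's windowed error signal into equally wide segments
    and compute the 5 per-segment statistics of the gait-phase domain."""
    feats = []
    for seg in np.array_split(np.asarray(e_windowed), n_segments):
        feats += [seg.min(), seg.max(), seg.mean(), np.median(seg), seg.std()]
    return np.asarray(feats)

# non-uniform samples at roughly 20 Hz of a synthetic position-error signal
rng = np.random.default_rng(0)
t = np.cumsum(rng.uniform(0.03, 0.07, 80))
e = np.abs(np.sin(2.0 * np.pi * 0.8 * t))
tq, eq = cubic_hermite_resample(t, e)
assert segment_statistics(eq).shape == (80,)     # 5 stats x 16 segments per servo
```

The full 480-feature gait-phase part of the vector then follows by repeating `segment_statistics` for all six front-leg servos over the three-gait-cycle window.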
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">Adaptive Motion Gait</head><p>The adaptive gait is originally based on a regular tripod gait in terms of predefined trajectories for each particular leg. However, the trajectories can be changed, and the gait cycle is divided into separate phases of the leg and body motion. The gait diagram is shown in Fig. <ref type="figure" target="#fig_2">3</ref>, where it can be seen that legs in the transfer phase move through predefined checkpoints (up and forward) and then begin approaching another predefined checkpoint, which is situated far below the current ground level (but still reachable). The ground sensing is done by observing the position error e of the joint θ F while lowering the leg, with respect to a certain threshold value. Although all legs in the transfer phase move simultaneously, each ground contact stops only the particular corresponding leg.</p><p>After the legs have found their new footholds (the remaining legs stay motionless), a new body posture is found given the feet positions in order to adapt to the terrain the robot is traversing. The body motion itself is performed by moving all legs according to a transformation of the feet positions.</p><p>In summary, the leg motion phase consists of 3 steps (up, forward, and down) and is followed by a body leveling step. Given the tripod gait, in which the legs are grouped into two alternating triplets, we repeat the same steps for the other triplet to obtain a total of 8 discrete steps per one full gait cycle. Notice that, as a consequence of the adaptive-gait model, a leg moves only in the transfer phase (3 steps) and in both body leveling steps (the 4th and 8th steps), where all legs are needed to move the body. A more detailed description can be found in <ref type="bibr" target="#b7">[8]</ref>.</p></div>
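The threshold-based ground sensing described above can be illustrated with a short sketch; the threshold value and the simulated servo are hypothetical, chosen only to demonstrate the mechanism:

```python
import numpy as np

def lower_leg_until_contact(desired_traj, read_current_position, threshold=0.06):
    """Command the femur joint down through the desired positions and stop at
    the first commanded position where the position error e exceeds the
    threshold, indicating the foot has touched the ground.
    The threshold value is illustrative, not the one used on the robot."""
    for theta_des in desired_traj:
        theta_cur = read_current_position(theta_des)
        if abs(theta_des - theta_cur) > threshold:
            return theta_des           # ground contact: this leg stops here
    return None                        # no contact within the leg's reach

# simulated servo: the foot physically stops at ground level -0.3 rad,
# so the actual position cannot follow commands below that level
read = lambda cmd: max(cmd, -0.3)
traj = np.linspace(0.0, -0.6, 25)      # checkpoint far below the ground level
contact_at = lower_leg_until_contact(traj, read)
assert contact_at is not None and contact_at < -0.3
```

Because each transferring leg runs this check independently, a contact stops only the corresponding leg while the others continue lowering.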
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5">Feature Extraction</head><p>The proposed feature extraction process is based on the terrain classification originally developed for a regular gait <ref type="bibr" target="#b6">[7]</ref>, which has to be altered to deal with the different behavior of the adaptive gait <ref type="bibr" target="#b7">[8]</ref>. The key difference between these two gaits is in their regularity and in the fact that the regular gait is synchronized by a time signal and a leg never stops moving, whereas the adaptive gait splits the leg and body motion according to the gait phases.</p><p>A leg trajectory during the phases of the gait is depicted in Fig. <ref type="figure" target="#fig_4">4</ref>. The regular gait is periodic, and the robot is able to traverse a flat terrain at a constant speed. Although the robot can pass very small obstacles at the cost of a high servo load, it is incapable of traversing a rough terrain using this regular default gait <ref type="bibr" target="#b7">[8]</ref>.</p><p>On the other hand, the adaptive motion gait utilizes tactile information to detect the ground-contact point, and thus it is able to decrease the servo load and adjust the robot to the terrain. However, the ground-contact points along the vertical line (during moving the leg down) are not known in advance. Hence, the time the leg spends in the ground-approaching phase is also not known. Moreover, the trajectory of a particular foot in the support phase is also influenced by the contacts of the other legs with the ground, and therefore, the trajectory is not regular during crawling rough terrain and may vary significantly. These variances have to be considered in the analysis of the servo position signal in the feature extraction process to avoid possible misinterpretation of the data.  
Due to the variances of the gait phases, which depend on the roughness of the terrain the robot is traversing, we cannot rely on a uniform partitioning of the gait cycle into 16 segments for the feature extraction as in <ref type="bibr" target="#b6">[7]</ref>. On the other hand, we can utilize the gait phases of the adaptive gait, as shown in the diagram in Fig. <ref type="figure" target="#fig_2">3</ref>, and the data from one gait cycle can therefore be divided into 8 segments according to the gait phases.</p><p>Authors of <ref type="bibr" target="#b6">[7]</ref> extended the feature vector by features extracted from a frequency analysis. However, such an analysis requires periodicity, which is not fulfilled by the adaptive gait. Nevertheless, during practical experimentation, it has been observed that the absence of frequency-based features did not prevent the classifier from achieving accurate classification, which is shown in Section 6. Notice that the authors also did not consider feature selection to reduce the 660-dimensional feature vector; so, the frequency analysis may be expendable.</p></div>
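The phase-aligned segmentation can be sketched as follows, assuming the gait controller tags every error sample with the index (0-7) of the gait phase in which it was taken; with the same 5 statistics per segment, 6 servos × 8 phase segments × 5 statistics gives 240 features per gait cycle (the exact vector layout here is illustrative):

```python
import numpy as np

def phase_segment_features(errors, phases, n_phases=8):
    """Per-phase statistics of one servo's position-error samples.
    `phases` holds the adaptive-gait phase index (0..n_phases-1) of each
    sample, so the segments follow the gait phases instead of a uniform
    partition of time."""
    errors, phases = np.asarray(errors), np.asarray(phases)
    feats = []
    for p in range(n_phases):
        seg = errors[phases == p]
        if seg.size == 0:              # a phase may yield no samples
            seg = np.zeros(1)
        feats += [seg.min(), seg.max(), seg.mean(), np.median(seg), seg.std()]
    return np.asarray(feats)

def gait_cycle_features(servo_errors, servo_phases):
    """Concatenate the per-phase features of all 6 front-leg servos."""
    return np.concatenate([phase_segment_features(e, p)
                           for e, p in zip(servo_errors, servo_phases)])

rng = np.random.default_rng(1)
errors = [rng.random(200) for _ in range(6)]          # synthetic error signals
phases = [np.repeat(np.arange(8), 25) for _ in range(6)]
assert gait_cycle_features(errors, phases).shape == (240,)
```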
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6">Experimental Results</head><p>Since the proposed method extends an existing approach by adding rougher terrains where the robot can operate, we focused the experimental evaluation of the proposed method solely on those challenging terrains. Nevertheless, we also used datasets from simple outdoor terrains for completeness.</p><p>The proposed multi-class SVM classifier (with a linear kernel) was trained on feature vectors collected from 7 different classes:  Table <ref type="table">1</ref>: Confusion matrix of 2-fold cross-validation with overall accuracy 99.4%</p><p>Each terrain class was trained from several trials with 3-7 minutes' worth of data; the exact number of feature vectors extracted from the data can be read from the columns of Table <ref type="table">1</ref>.</p><p>The evaluation strategy is based on verifying the distinguishability of the terrains using the features. Then, we evaluate on-line detection of the terrain in a separate scenario, where particular types of terrain alternate and the robot is requested to traverse them while continuously detecting the terrain. These two evaluation scenarios are described in the following sections.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.1">Distinguishability of the Terrains</head><p>Two-fold cross-validation with all datasets involved has been used to validate whether the classifier is able to distinguish between the considered 7 terrain classes. As can be seen from the confusion matrix in Table <ref type="table">1</ref>, the overall accuracy of 99.4% is very high even for only 2-fold cross-validation (with more folds, we can easily get 100%).</p><p>However, notice that this test always uses data from the same single experiment in both the training and testing partitions, and therefore, the data more likely refer to themselves than to a generalized model of the particular terrain class. In <ref type="bibr" target="#b6">[7]</ref>, the authors achieved the same high accuracy when evaluating on the same datasets that were used for training.</p></div>
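The training and 2-fold cross-validation procedure can be sketched with scikit-learn; the feature vectors below are synthetic stand-ins (well-separated Gaussian clusters), and the dataset sizes are illustrative, not those of the real experiments:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# synthetic stand-in for the extracted feature vectors: 7 terrain classes
# as separable clusters in a 240-dimensional feature space
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(40, 240)) for c in range(7)])
y = np.repeat(np.arange(7), 40)

clf = SVC(kernel="linear")                 # multi-class linear SVM
scores = cross_val_score(clf, X, y, cv=2)  # 2-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f}")
```

On such cleanly separated synthetic clusters the accuracy is near perfect, which also illustrates the caveat raised above: high cross-validation scores on homogeneous data do not guarantee generalization across experiments.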
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.2">Terrain Classification</head><p>A more realistic practical scenario is based on evaluating the classification while traversing rough terrains in a single run, where undefined terrains occur at the overlap of the particular terrain types. The scenario setup is shown in Fig. <ref type="figure" target="#fig_7">6</ref>, and it consists of a sequence of the rough terrains used for learning. The robot starts from a defined position and progressively crosses a few small obstacles, a pool of wooden blocks, and wooden stairs. This scenario was repeated five times, and the extracted feature vectors were evaluated against the model previously learned from a single pass over the individual terrains.</p><p>The predicted terrain labels from each of the five runs are shown in Fig. <ref type="figure" target="#fig_9">7</ref>. The class prediction is made once per gait cycle and is computed from the last three gait cycles' worth of data. Therefore (with respect to the robot's length), there are long transition areas of the overlapping terrains.</p><p>The transition between the office floor and the pool of wooden blocks is mostly classified as the stairs, which corresponds to the entry side of the pool with increasing height of the blocks.</p><p>The other transition, between the blocks and the stairs, is undefinable and can be predicted as either terrain class, or as a similar class (obstacles), based on the actual footholds in the area during the experiment. 
We can also see that there is some confusion between the dirt and the office-floor-with-obstacles terrain, which are both relatively flat and slippery for the robot.</p><p>Although not analyzed, if another terrain class for stairs being traversed downward had been trained, a transition from the blocks to the office floor would most likely have been classified as this terrain (for the same reason as the opposite transition mentioned above).</p><p>The main aspect of traversing challenging terrain, which cannot be seen on the simple flat terrains, is the occurrence of a foot slippage on the edge of an obstacle (stair) that results in a sudden fall to a lower level and impacts all legs. Several slippages in a short time can lead to confusion in the prediction, as can be seen in Fig. <ref type="figure" target="#fig_9">7</ref>, where the transition between the blocks and the stairs was once predicted as a grass terrain.</p><p>The unequal lengths of the runs depend purely on the event when the robot steps over the side edge of the stairs and thus stops the experiment. This happens because the robot cannot steer and strictly goes straight ahead. Notice that it is not possible to show the ground truth for the predictions in Fig. <ref type="figure" target="#fig_9">7</ref> because the robot can spend a different number of gait cycles to reach the same point in the scenario in particular runs. Therefore, the only available comparison is between the results in Fig. <ref type="figure" target="#fig_9">7</ref> and the overview of the testing scenario shown in Fig. <ref type="figure" target="#fig_7">6</ref>.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7">Conclusion</head><p>We proposed an alternative method to extract features from servo drives to classify terrains for a technically blind robot traversing rough terrains. Although the proposed method simplifies the original feature extraction process, the results indicate it is sufficient to distinguish the evaluated terrain classes. Moreover, the results also indicate we can employ the learned classifier for on-line terrain classification in scenarios with rough terrains.</p><p>The classifier is based on features of the robot motion and its interaction with the terrain. However, the features of the terrain itself (e.g., slope, slipperiness, convexity) are not analyzed directly-they may be hidden inside the SVM layer and could be addressed in future work.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Used hexapod walking robot for the terrain classification</figDesc><graphic coords="1,310.48,171.92,237.74,140.93" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 2 :</head><label>2</label><figDesc>Figure2: Schema of the leg consisting of three parts (links)-Coxa, Femur, and Tibia. The three joints (θ C , θ F , and θ T ) are indexed according to the next respective link. The joint θ C is fixed to the body with a vertical rotation axis while the other two joints have a horizontal axis.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Diagram of a gait cycle. Firstly, the legs in the transfer phase move to find new footholds. Secondly, the body is leveled to adapt to the new footholds. Finally, other legs are chosen for the next transfer phase. Orange color highlights the motion of legs in the transfer phase only, while red color highlights the motion of all legs.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head></head><label></label><figDesc>Default gait (b) Adaptive gait</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: Comparison of the leg trajectory using a regular default gait and an adaptive gait.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_5"><head>•</head><label></label><figDesc>Wooden stairs • Wooden blocks of different height • Office floor with small obstacles • Office floor • Asphalt • Grass • Dirt (a) Small obstacles (b) Wooden blocks (c) Wooden stairs</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_6"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: Terrains traversable by the adaptive gait.</figDesc><graphic coords="4,308.06,587.77,78.43,58.85" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_7"><head>Figure 6 :</head><label>6</label><figDesc>Figure 6: Testing scenario consisting of small obstacles (bottom right corner) on the office floor, followed by a pool of wooden blocks and ending with climbing the stairs.</figDesc><graphic coords="5,309.17,249.18,240.36,181.52" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_9"><head>Figure 7 :</head><label>7</label><figDesc>Figure 7: Predictions of terrain labels in the testing scenario.</figDesc></figure>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" xml:id="foot_0">J. Mrva, J. Faigl</note>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgments</head><p>The presented work has been supported by the Czech Science Foundation (GA ČR) under research project No. 15-09600Y.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<monogr>
		<author>
			<persName><forename type="first">G</forename><surname>Dudek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Jenkin</surname></persName>
		</author>
		<title level="m">Computational principles of mobile robotics</title>
				<meeting><address><addrLine>New York, NY, USA</addrLine></address></meeting>
		<imprint>
			<publisher>Cambridge University Press</publisher>
			<date type="published" when="2000">2000</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Visual terrain classification for selecting energy efficient gaits of a hexapod robot</title>
		<author>
			<persName><forename type="first">S</forename><surname>Zenker</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Aksoy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Goldschmidt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Worgotter</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Manoonpong</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM)</title>
				<imprint>
			<date type="published" when="2013">2013</date>
			<biblScope unit="page" from="577" to="584" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Feature-based terrain classification for LittleDog</title>
		<author>
			<persName><forename type="first">P</forename><surname>Filitchkin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Byl</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IROS</title>
				<imprint>
			<date type="published" when="2012">2012</date>
			<biblScope unit="page" from="1387" to="1392" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Rough terrain mapping and classification for foothold selection in a walking robot</title>
		<author>
			<persName><forename type="first">D</forename><surname>Belter</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Skrzypczyński</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Field Robotics</title>
		<imprint>
			<biblScope unit="volume">28</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page" from="497" to="528" />
			<date type="published" when="2011">2011</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Haptic terrain classification for legged robots</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">A</forename><surname>Hoepflinger</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">D</forename><surname>Remy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Hutter</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Spinello</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Siegwart</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">ICRA</title>
		<imprint>
			<biblScope unit="page" from="2828" to="2833" />
			<date type="published" when="2010">2010</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">The classification of the terrain by a hexapod robot</title>
		<author>
			<persName><forename type="first">A</forename><surname>Schmidt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Walas</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 8th International Conference on Computer Recognition Systems (CORES)</title>
				<meeting>the 8th International Conference on Computer Recognition Systems (CORES)</meeting>
		<imprint>
			<date type="published" when="2013">2013</date>
			<biblScope unit="page" from="825" to="833" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Terrain classification using a hexapod robot</title>
		<author>
			<persName><forename type="first">G</forename><surname>Best</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Moghadam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Kottege</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Kleeman</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the Australasian Conference on Robotics and Automation (ACRA)</title>
				<meeting>the Australasian Conference on Robotics and Automation (ACRA)</meeting>
		<imprint>
			<date type="published" when="2013">2013</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Tactile sensing with servo drives feedback only for blind hexapod walking robot</title>
		<author>
			<persName><forename type="first">J</forename><surname>Mrva</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Faigl</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 10th International Workshop on Robot Motion and Control (RoMoCo)</title>
				<meeting>the 10th International Workshop on Robot Motion and Control (RoMoCo)</meeting>
		<imprint>
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Learning, planning, and control for quadruped locomotion over challenging terrain</title>
		<author>
			<persName><forename type="first">M</forename><surname>Kalakrishnan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Buchli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Pastor</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Mistry</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Schaal</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">The International Journal of Robotics Research</title>
		<imprint>
			<biblScope unit="volume">30</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="236" to="258" />
			<date type="published" when="2011">2011</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Gait modification strategy for a six-legged robot walking on rough terrain</title>
		<author>
			<persName><forename type="first">D</forename><surname>Belter</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 15th International Conference on Climbing and Walking Robots, Adaptive Mobile Robotics, World Scientific</title>
				<editor>
			<persName><forename type="first">A</forename><surname>Azad</surname></persName>
		</editor>
		<meeting>the 15th International Conference on Climbing and Walking Robots, Adaptive Mobile Robotics, World Scientific<address><addrLine>Singapore</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2012">2012</date>
			<biblScope unit="page" from="367" to="374" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Path planning with force-based foothold adaptation and virtual model control for torque controlled quadruped robots</title>
		<author>
			<persName><forename type="first">A</forename><surname>Winkler</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Havoutis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Bazeille</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Ortiz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Focchi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Dillmann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Caldwell</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Semini</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">ICRA</title>
		<imprint>
			<biblScope unit="page" from="6476" to="6482" />
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Supporting locomotive functions of a six-legged walking robot</title>
		<author>
			<persName><forename type="first">K</forename><surname>Walas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Belter</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">International Journal of Applied Mathematics and Computer Science</title>
		<imprint>
			<biblScope unit="volume">21</biblScope>
			<biblScope unit="issue">2</biblScope>
			<date type="published" when="2011">2011</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Blind hexapod walking over uneven terrain using only local feedback</title>
		<author>
			<persName><forename type="first">L</forename><surname>Palmer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Palankar</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE International Conference on Robotics and Biomimetics (ROBIO)</title>
				<imprint>
			<date type="published" when="2011">2011</date>
			<biblScope unit="page" from="1603" to="1608" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Terrain classification using laser range finder</title>
		<author>
			<persName><forename type="first">K</forename><surname>Walas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Nowicki</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IROS</title>
				<imprint>
			<date type="published" when="2014">2014</date>
			<biblScope unit="page" from="5003" to="5009" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Learning locomotion over rough terrain using terrain templates</title>
		<author>
			<persName><forename type="first">M</forename><surname>Kalakrishnan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Buchli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Pastor</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Schaal</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IROS</title>
				<imprint>
			<date type="published" when="2009">2009</date>
			<biblScope unit="page" from="167" to="172" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">Tactile sensing for ground classification</title>
		<author>
			<persName><forename type="first">K</forename><surname>Walas</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Automation, Mobile Robotics &amp; Intelligent Systems</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="18" to="23" />
			<date type="published" when="2013">2013</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
