<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
<journal-title>CEUR Workshop Proceedings</journal-title>
      </journal-title-group>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
<article-title>Formalization of Word-Order Shifts by Restarting Automata</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Markéta Lopatková</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Martin Plátek</string-name>
          <email>martin.platek@ufal.mff.cuni.cz</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
<institution>Charles University in Prague, Faculty of Mathematics and Physics</institution>
          <addr-line>Malostranské nám. 25, 118 00 Prague</addr-line>
          ,
          <country country="CZ">Czech Republic</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2013</year>
      </pub-date>
      <volume>1003</volume>
      <fpage>2</fpage>
      <lpage>9</lpage>
      <abstract>
<p>The phenomenon of word order freedom plays an important role in the syntactic analysis of many natural languages. This paper introduces the notion of a word-order shift, an operation reflecting the degree of word-order freedom of natural languages. The word-order shift is an operation based upon word-order changes preserving syntactic correctness, individual word forms, their morphological characteristics, and their surface dependency relations in the course of a stepwise simplification of a sentence, the so-called analysis by reduction. The paper provides linguistic motivation for this operation and concentrates on a formal description of the whole mechanism through a special class of automata, so-called restarting automata with the shift operation enhanced with a structured output, and their computations. The goal of this study is to clarify the properties of computations needed to perform (enhanced) analysis by reduction for free word order languages.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
<title>Introduction</title>
      <p>The phenomenon of word order freedom plays an
important role in syntactic analysis of many natural languages.
A relatively free word order combined with additional
linguistic constraints constitutes a big challenge for a
formal description of syntactic structures.Here we introduce
a notion of a word order shift, an operation reflecting the
degree of word order freedom of natural languages. The
word order shift is a word order change preserving
syntactic correctness, individual word forms, their
morphological characteristics, and their surface dependency relations
in the course of a stepwise simplification of a sentence
under investigation, referred to as an analysis by reduction.
Section 2 provides a linguistic motivation and informal
description of the process.</p>
      <p>Section 3 concentrates on a formal description of the whole mechanism through a special class of automata – so-called restarting automata with the shift operation enhanced with structured output, pRLe-automata – and their computations.</p>
      <p>
        Previous Work. The paper [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] introduced a specific approach to the problem of measuring word-order freedom. It focused upon the interplay of two phenomena related to word order: the (non-)projectivity of a sentence and the number of word-order shifts within the analysis by reduction. This interplay was exemplified on a sample of Czech sentences with clitics and non-projectivities (long-distance dependencies, see esp. [
        <xref ref-type="bibr" rid="ref10 ref4">10, 4</xref>
        ]). This approach is based upon the method of analysis by reduction – a stepwise simplification of a sentence preserving its correctness. The analysis by reduction was applied in a relatively free manner, with no constraints on the projectivity of its individual steps. Our investigation focused on a set of syntactically analyzed Czech sentences from the Prague Dependency Treebank (PDT) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. The experiment on treebank data showed that, with only basic constraints on the analysis by reduction, see [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], the minimal number of necessary shifts for any sentence from the test set did not exceed one. This number confirmed the initial assumption for Czech as a language with a relatively high degree of word-order freedom; nevertheless, we continued the search for more complicated sentences. This led to the discovery of a special construction requiring (at least) two shifts, as discussed in Section 2.1.
      <p>
        The analysis by reduction has served as a motivation for a new family of automata, the so-called restarting automata, see esp. [
        <xref ref-type="bibr" rid="ref13 ref14">14, 13</xref>
        ]. As a first step, analysis by reduction was modelled as a formal string-to-string translation using a suitable model of restarting automata [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Then a class of
enhanced restarting automata with an output consisting of
a single DR-tree was introduced, [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. Finally, the model was further enhanced – a model of restarting automata that assigns a set of dependency structures (DR-structures) to every reduction of an input sentence was discussed in [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. DR-structures can capture a set of dependency trees representing a sentence on different layers of linguistic description in a parallel way. Here we further enrich the performance of these automata with the shift operation.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Analysis by Reduction</title>
      <p>
        Let us first describe the main ideas behind the method used
for sentence analysis. Analysis by reduction (AR) is based
on a stepwise simplification of an analyzed sentence, see
[
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. It defines possible sequences of reductions (deletions) in the sentence – each step of AR is represented by deleting a sentence member, possibly accompanied by rewriting a word of the sentence (and thus by its shortening); in specific cases, the deletion is accompanied by a shift of a word to another word-order position.
      </p>
      <p>Let us stress the basic constraints imposed on the
analysis by reduction:
(i) the obvious constraint on preserving individual words
(word forms), their morphological characteristics
and/or their surface dependency relations;
(ii) the constraint on preserving the correctness (i.e., a
grammatically correct sentence must remain correct
after its simplification);
(iii) moreover, the application of the shift operation is limited to cases when a shift is enforced by the correctness-preserving principle of AR, i.e., when a simple deletion would result in an incorrect word order and thus violate principle (ii) above.</p>
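      <p>Constraints (i)–(iii) can be made concrete in a small executable sketch. The following Python fragment is not part of the paper; the helper names delete, shift, ar_step, and the oracle czech_toy_correct are our own assumptions – in particular, the oracle only checks that the clitic se does not open the sentence, a crude stand-in for full grammatical correctness.</p>

```python
# A minimal sketch of one step of analysis by reduction (AR).
# Assumptions: sentences are token lists; `is_correct` is a hypothetical
# oracle for grammatical correctness (here a toy rule about the clitic 'se').

def delete(words, positions):
    """A deletion step: remove the words at the given positions."""
    return [w for i, w in enumerate(words) if i not in set(positions)]

def shift(words, src, dst):
    """Move the word at index src to the position directly behind index dst."""
    words = list(words)
    w = words.pop(src)
    words.insert(dst + 1 if src > dst else dst, w)
    return words

def ar_step(words, positions, is_correct, shift_pair=None):
    """One AR step obeying constraints (i)-(iii): delete, and apply a shift
    only if plain deletion alone would violate correctness."""
    reduced = delete(words, positions)
    if is_correct(reduced):
        return reduced                      # constraint (iii): no shift needed
    if shift_pair is not None:
        repaired = shift(reduced, *shift_pair)
        if is_correct(repaired):
            return repaired                 # shift enforced by correctness
    raise ValueError("no correctness-preserving reduction")

def czech_toy_correct(words):
    """Toy stand-in for a human judge: 'se' must not open the sentence."""
    return bool(words) and words[0] != "se"

s = ["Petr", "se", "bojí", "o", "otce"]
print(ar_step(s, {3, 4}, czech_toy_correct))        # → ['Petr', 'se', 'bojí']
print(ar_step(s, {0}, czech_toy_correct, (0, 1)))   # → ['bojí', 'se', 'o', 'otce']
```

      <p>The two calls correspond to the two branches of the reduction of sentence (1) below: a plain deletion, and a deletion that enforces a shift of the clitic behind the verb.</p>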
      <p>
        As a consequence of AR, it is possible to derive formal dependency relations between individual sentence members – pairs of governing and dependent nodes – based on the possible order(s) of reductions, as described in [
        <xref ref-type="bibr" rid="ref15 ref8">8, 15</xref>
        ], see ex. (1) below. Compare also other approaches aiming to determine dependency relations, summed up esp. in [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] and [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ].
      </p>
      <sec id="sec-2-1">
        <title>Example:</title>
        <p>(1) Petr se bojí o otce.</p>
        <p>‘Peter – REFL – fears – for – father’
‘Peter fears for his father.’
The analysis by reduction can be summarized in the
following scheme:</p>
        <sec id="sec-2-1-1">
          <title>Petr se bojí o otce.</title>
          <p>delete ↙ ↘ delete</p>
        </sec>
        <sec id="sec-2-1-2">
          <title>Petr se bojí. | * se bojí o otce.</title>
          <p>delete ↓ | ↓ shift</p>
          <p>* se bojí. | bojí se o otce.</p>
          <p>shift ↘ ↙ delete</p>
          <p>bojí se.</p>
          <p>Example (1) can be simplified in two ways (for simplicity, we do not make a distinction between upper- and lower-case letters at the beginning of the sentence):
(i) Either by a simple deletion of the prepositional group o otce ‘for father’ (following the constraint on correctness of the simplified sentence, the pair of word forms must be deleted in a single step; see the left branch of the scheme).
(ii) Or by deleting the subject Petr (the right part of the scheme); however, this simplification results in an incorrect word-order variant starting with the clitic se (such a position of a clitic is forbidden in Czech). Thus the shift operation correcting the word order is enforced: →shift bojí se o otce.</p>
          <p>As these possible reductions are independent of each other,
we can conclude that the words Petr and o otce are
independent – in other words, both of them depend on / modify
a word remaining in the simplified sentence, i.e., the verb
and its clitic bojí se.</p>
          <p>Now, we can proceed in a similar way until we get the minimal correct simplified sentence → bojí se.</p>
          <p>We can notice that the order of reductions reflects the dependency relations in the corresponding dependency tree. Informally, the words are ‘cut from the bottom of the tree’ (in other words, a ‘pruning’ operation is applied to the tree); i.e., a governing node must be preserved in a simplified sentence until all its dependent words are deleted.1 In other words, AR makes it possible to identify the dependency tree for sentence (1): the root bojí governs Petr, se, o, and the final period, and the node o in turn governs otce.</p>
          <p>
            The analysis by reduction has been designed to model a
human who is able to check the correctness of the analysis.
However, it is possible to model correct reductions on the
basis of large corpora of natural language sentences, see
[
            <xref ref-type="bibr" rid="ref11">11</xref>
            ].
2.1 The role of shifts. The above-mentioned experiment on a set of non-projective sentences from PDT [
            <xref ref-type="bibr" rid="ref2">2</xref>
            ] (namely
on the sentences with a non-projectivity given by a modal
verb, see [
            <xref ref-type="bibr" rid="ref3">3</xref>
            ]) constituted only the first step.
          </p>
          <p>
            As a follow-up experiment, we decided to supplement the data with ‘suspicious’ types of sentences identified in previous work on Czech word-order freedom, esp. in [
            <xref ref-type="bibr" rid="ref4">4</xref>
            ]. This allowed us to include less frequent phenomena in our investigation. As a result, we found a well-formed Czech sentence that requires at least two shifts in the course of AR, see the following example (2).
          </p>
          <p>Let us stress that the role of shifts in the course of the
analysis by reduction was limited to cases where a shift
‘corrects’ word order of a simplified sentence – this
concerns primarily the cases when an input sentence contains
a clitic (as in sentences (1)-(2)).</p>
        </sec>
      </sec>
      <sec id="sec-2-2">
        <title>Example:</title>
        <p>(2) S těžkým se bála pomoci úkolem.</p>
        <p>‘with – difficult – REFL – (she) was afraid – to help – task’
‘With a difficult task, she was afraid to help.’
Although the sample sentence is rather strange, the splitting of the prepositional group is a grammatical construction in Czech that allows putting a strong stress on the adjective modifying the noun rather than on the whole prepositional group.</p>
        <p>Due to the dependency relations present in the sentence, there is only one way to reduce it: the reduction of the adjective těžkým ‘difficult’. Unfortunately, it results in a syntactically incorrect word order:
→delete * s se bála pomoci úkolem.</p>
        <p>This situation can be corrected in two possible ways; let us focus on one of them:
→shift s úkolem se bála pomoci. (A shift of the noun úkolem ‘task’ next to the preposition.)
→delete * se bála pomoci. (Unfortunately, the next reduction must remove the prepositional group s úkolem ‘with task’, making the sentence ungrammatical again due to the clitic in the sentence-initial position.)
→shift bála se pomoci. (Now we can repair the sentence by shifting the verb bála to the left.)
→delete bála se. (The reduction of the word pomoci ‘help’ ends the process.)
Similarly for the second sequence of reductions and shifts. Regardless of the selected sequence, we must apply at least two shifts in the course of AR.</p>
        <p>1Let us note that some relations – e.g., the relation between a preposition and its ‘head’ noun, as well as between a verb and some clitics – are rather technical (not only) from the point of view of AR: such pairs must be reduced in a single step (despite being represented as two nodes in a dependency tree); here we adhere to the practice used in the PDT annotation.</p>
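        <p>The two-shift derivation above can be replayed mechanically. This is only an illustrative sketch with our own helper names (delete by word form, and a generic move, since the verb bála is shifted to the left); correctness is not checked here – we merely reproduce the sequence of strings from the text.</p>

```python
# Replaying the reduction sequence for sentence (2); helper names are ours.

def delete(words, *drop):
    """Remove the listed word forms (first occurrence of each)."""
    words = list(words)
    for w in drop:
        words.remove(w)
    return words

def move(words, word, new_index):
    """A generic word-order shift: place `word` at position `new_index`."""
    words = list(words)
    words.remove(word)
    words.insert(new_index, word)
    return words

w = ["s", "těžkým", "se", "bála", "pomoci", "úkolem", "."]
w = delete(w, "těžkým")          # -> * s se bála pomoci úkolem .
w = move(w, "úkolem", 1)         # shift 1: úkolem next to the preposition s
w = delete(w, "s", "úkolem")     # -> * se bála pomoci .
w = move(w, "bála", 0)           # shift 2: the verb bála to the left
w = delete(w, "pomoci")          # -> bála se .
print(" ".join(w))               # bála se .
```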
        <p>Remark: Note that we limit our observations to simple sentences, i.e., to sentences with a single clause. The reason is obvious: complex sentences might blur the whole picture by bringing additional phenomena into play. For example, a coordination of clauses might completely change the total number of shifts in a complex sentence regardless of the phenomena contained in the individual clauses. In general, each (coordinated) clause can induce a shift in the course of AR; thus the overall number of shifts cannot be bounded.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Restarting Automaton with a Structured Output</title>
      <p>
        The crucial issue in the application of a word-order shift in the process of AR is identifying WHAT should be shifted WHERE. In order to model the whole process by means of a special kind of automaton, we have to use special markers – pebbles – which mark the possible locations of shifts. For this purpose we are going to define a restarting automaton with a finite set of special meta-instructions using pebbles – a pRL-automaton. The automaton with pebbles was originally introduced in [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]
in order to define dependency output structures, and we
are going to enhance its abilities with the shift operation.
      </p>
      <p>We proceed in several steps. First, a restarting automaton with pebbles is enhanced with a shift operation (Section 3.1). Second, a structured output is added (Section 3.2). Finally, a computation of such an enhanced automaton is described (Section 3.3).</p>
      <sec id="sec-4-2">
        <title>3.1 Restarting Automaton with the Shift Operation Operating on Strings</title>
        <p>Here we present a class of nondeterministic pRL-automata that are suitable for modeling AR – these automata allow for checking the whole input sentence (and marking selected words with pebbles) prior to any changes. This resembles a linguist who can read the whole sentence first and then reduce it in a correct way, thus making correct steps of the analysis by reduction. We have chosen a nondeterministic model to enable various sequences of reductions. This can serve for the verification of the (in)dependency between the individual parts of a sentence, as sketched in Section 2.</p>
        <p>A pRL-automaton M is (in general) a nondeterministic machine with a finite alphabet Γ, a head q (window of size 1) working on a flexible tape delimited by the left sentinel c and the right sentinel $ (c, $ ∉ Γ), and two finite sets of restarting meta-instructions R(M) and accepting meta-instructions A(M), respectively. Such an automaton makes use of k pebbles p1, . . . , pk.</p>
        <p>Let us first describe informally the way a pRL-automaton works. Each of its finite computations consists of certain phases called cycles, and a last – halting – phase called a tail. In each cycle, the automaton performs two passes through the tape: during the first pass, it marks selected symbols of the processed sentence with pebbles; then, during the second pass, it performs the editing operations described by meta-instructions from R(M); the operations can be applied (only) to the symbols marked by pebbles. In each (accepting) tail, the automaton marks all symbols with pebbles – according to a meta-instruction from A(M) – halts, and accepts the analyzed sentence.</p>
      </sec>
      <sec id="sec-4-3">
        <title>3.1.1 Restarting meta-instructions</title>
        <p>Each cycle of the pRL-automaton M is controlled by a single restarting meta-instruction IR ∈ R(M) of the form</p>
        <p>IR = (E0, I1, . . . , Is; Op; Restart)   (1)
where:
– each Ii is an instruction of the form (ai, Ei), 1 ≤ i ≤ s;
– each Ei is a regular language over Γ; Ei is interpreted as the i-th context of IR (the contexts are usually represented by regular expressions);
– each instruction Ii marks a symbol ai ∈ Γ on the tape with the pebble pi within the first pass;
– Op = {o1, . . . , op} is an ordered set of editing operations, oj ∈ {dl[i], wr[i, b], sh[i, l] | 1 ≤ i, l ≤ s, i ≠ l, b ∈ Γ};
– oj is the j-th editing operation performed in the second pass:
— if oj = dl[i] then it deletes the symbol ai,
— if oj = wr[i, b] then it rewrites ai by b,
— if oj = sh[i, l] then it shifts ai to the position behind al.</p>
        <p>We require the ‘uniqueness’ of operations on a single symbol, i.e., if dl[i] ∈ Op then Op does not contain the operations wr[i, b] or sh[i, l] for any b, l.</p>
        <p>Performing a meta-instruction: For an (input) sentence w ∈ Γ∗, we define the tape inscription cw$ and a reduction w ⊢IR w′. When trying to execute a cycle C by performing the meta-instruction IR, starting with the tape inscription cw$, M will get stuck (and so reject) if w does not permit a factorization of the form w = v0a1v1a2 . . . vs−1asvs (vi ∈ Ei for all i = 0, . . . , s). On the other hand, if w permits factorizations of this form, then one of them is (nondeterministically) chosen.</p>
        <p>The automaton processes the input word in two passes:
1. During the first pass of C, M puts the pebble pi on each ai (this corresponds to the instruction Ii, for each i, 1 ≤ i ≤ s).
2. During the second pass of C, controlled by the ordered set of operations Op, the tape inscription cw$ is transformed (reduced) into a new tape inscription cw′$, where w′ = v0r1v1 · · · vs−1rsvs and, for 1 ≤ i ≤ s:
• M performs its operations in the prescribed order o1, o2, · · · , op;
• ri = λ if there is a deleting operation oj ∈ Op such that oj = dl[i]; i.e., ai is deleted during the cycle C;
• ri = b if there is a rewriting operation oj ∈ Op such that oj = wr[i, b]; i.e., ai is rewritten with b during the cycle C;
• ri = λ and rl = al ai if there is a shifting operation oj ∈ Op such that oj = sh[i, l]; i.e., ai is shifted behind the symbol marked by the pebble pl during the cycle C;
• ri = ai if there is no operation oj ∈ Op applicable to ai. (Note that such an ai may – together with the symbols to which some of the above-mentioned operations are applied – enter the output structure, see the following section, and thus it must be marked with a pebble.)
• At the end of each cycle, the Restart operation is performed: all the pebbles are removed from the new tape inscription w′, and the head is placed on the left sentinel.</p>
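        <p>The second pass can be sketched directly in code: assuming a factorization has already been chosen in the first pass (the tape positions of the pebbled symbols a1, . . . , as), the rules for the ri are applied as follows. The function name apply_operations and the tuple encoding of Op are our own; the nondeterministic choice among factorizations is not modelled.</p>

```python
# Sketch of the second pass of a cycle: compute w' from a chosen
# factorization. `pebbled` lists the tape positions of a_1..a_s;
# `ops` is the ordered operation set Op.

def apply_operations(w, pebbled, ops):
    """w: list of symbols; pebbled: 0-based positions of a_1..a_s;
    ops: tuples ('dl', i), ('wr', i, b), ('sh', i, l) with 1-based i, l."""
    # r[i] is the replacement string for the pebbled symbol a_i.
    r = {i: [w[pos]] for i, pos in enumerate(pebbled, start=1)}
    for op in ops:                       # prescribed order o_1, ..., o_p
        if op[0] == 'dl':                # r_i = λ: a_i is deleted
            r[op[1]] = []
        elif op[0] == 'wr':              # r_i = b: a_i is rewritten
            r[op[1]] = [op[2]]
        elif op[0] == 'sh':              # r_i = λ, r_l = a_l a_i: shift
            _, i, l = op
            r[l] = r[l] + r[i]
            r[i] = []
    # Reassemble w': unpebbled symbols stay, each a_i is replaced by r[i].
    marks = {pos: i for i, pos in enumerate(pebbled, start=1)}
    out = []
    for pos, sym in enumerate(w):
        out.extend(r[marks[pos]] if pos in marks else [sym])
    return out

print(apply_operations(["Petr", "se", "bojí", "."], [0, 1, 2],
                       [('dl', 1), ('sh', 2, 3)]))   # ['bojí', 'se', '.']
```

        <p>The call above corresponds to the reduction used for sentence (1): with pebbles on Petr, se, and bojí, the operations dl[1] and sh[2, 3] yield bojí se.</p>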
        <p>Of course, we can delete, rewrite or shift neither the left
sentinel c nor the right sentinel $. It is required that during
a cycle, M executes at least one delete operation.</p>
        <p>If no further cycle is performed, each finite (accepting)
computation necessarily finishes in a tail performed
according to an accepting meta-instruction.</p>
      </sec>
      <sec id="sec-4-4">
        <title>3.1.2 Accepting meta-instructions</title>
        <p>Tails of accepting computations are described by a set of accepting meta-instructions A(M), each of the form:</p>
        <p>IA = (a1, . . . , as, Accept)   (2)
where the ai are symbols from Γ.</p>
        <p>Performing a meta-instruction: The tail performed by the meta-instruction IA starts with the tape inscription cz$; if z = a1 · · · as, then M marks each symbol ai with the pebble pi and accepts z (and thus the whole computation). Otherwise, the computation halts with rejection. Note that only operations distributing pebbles are allowed within an accepting meta-instruction – they will serve for building the structured output, as described in the following sections.</p>
        <p>The notation u ⊢cM v (a reduction of u to v by M) denotes a reduction performed during a cycle of M that begins with the tape inscription cu$ and ends with the tape inscription cv$; the relation ⊢cM∗ is the reflexive and transitive closure of ⊢cM. By REM(M) we denote the set of reductions performed by M.</p>
        <p>A string w ∈ Γ∗ is accepted by M if there is an accepting computation which starts with the tape inscription cw$ and ends by executing an Accept operation. By L(M) we denote the language consisting of all words accepted by M; we say that M recognizes (accepts) the language L(M).2</p>
        <p>Note that we limit ourselves only to such combinations
of operations that are adequate (and necessary) for known
linguistic examples.</p>
        <p>Example: Let us illustrate the notions of restarting and accepting meta-instructions on the analysis of sentence (1) from Section 2. This example formalizes the AR of the sentence Petr se bojí o otce. ‘Peter fears for his father.’, namely the left branch of its AR (see the scheme in Section 2). The respective pRLe-automaton M is described by two restarting meta-instructions IR1 and IR2, and one accepting meta-instruction IA, namely:</p>
        <p>IR1 = (E0, I11, I21, I31; O12; Restart), where E0 = {Petr se}, I11 = (bojí, {λ}), I21 = (o, {λ}), I31 = (otce, {·}), and O12 = {dl[2], dl[3]};</p>
        <p>IR2 = (E0, I12, I22, I32; O22; Restart), where E0 = {λ}, I12 = (Petr, {λ}), I22 = (se, {bojí}), I32 = (bojí, {λ}), and O22 = {dl[1], sh[2, 3]}.</p>
        <p>IA = (bojí, se, ., Accept)
The computation consists of two cycles. Within the first cycle, described by the meta-instruction IR1, the instructions I11, I21 and I31 put the pebbles p1, p2 and p3 on the symbols containing the words bojí (with the left context Petr se and empty right context), o, and otce, respectively; the operations dl[2] and dl[3] delete the words o (marked with the pebble p2) and otce (marked with the pebble p3), respectively. Then the automaton restarts (in particular, all pebbles are removed). Similarly in the second cycle (meta-instruction IR2), the instructions I12, I22 and I32 mark the respective words; then the operation dl[1] deletes the word Petr (marked with p1) and the operation sh[2, 3] shifts the word se with the pebble p2 behind the word bojí (with the pebble p3). The accepting instruction IA just marks the remaining words bojí and se with pebbles.</p>
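        <p>The two cycles and the accepting tail just described can be replayed as a short script. This is only a toy trace with our own helper names (dl, sh); the final period is treated as the token '.'.</p>

```python
# Toy replay of the computation: cycle IR1 deletes 'o' and 'otce',
# cycle IR2 deletes 'Petr' and shifts 'se' behind 'bojí', and IA accepts.

def dl(w, i):
    """Delete the token at index i."""
    return w[:i] + w[i+1:]

def sh(w, i, l):
    """Shift the token at index i to the position behind the token at index l."""
    rest = w[:i] + w[i+1:]
    l = l - 1 if i < l else l        # account for the removed token
    return rest[:l+1] + [w[i]] + rest[l+1:]

w = ["Petr", "se", "bojí", "o", "otce", "."]
w = dl(dl(w, 4), 3)                  # cycle IR1: dl[2] and dl[3]
w = sh(dl(w, 0), 0, 1)               # cycle IR2: dl[1] and sh[2, 3]
assert w == ["bojí", "se", "."]      # matches IA = (bojí, se, ., Accept)
print("accepted:", " ".join(w))      # accepted: bojí se .
```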
        <p>The following property – obviously ensured by pRL-automata – has a crucial role in our applications of restarting automata.</p>
        <p>Weak Correctness Preserving Property. Any pRL-automaton M is weakly correctness preserving, i.e., (for all u, v ∈ Γ∗)</p>
        <p>v ∈ L(M) and u ⊢cM∗ v imply that u ∈ L(M).</p>
        <p>2Note that the language L(M) accepted by the automaton M is the same language as LC(M) in [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ].</p>
        <p>Our goal is to model the analysis by reduction by automata
with the following stronger property:</p>
      </sec>
      <sec id="sec-4-5">
        <title>Correctness Preserving Property</title>
        <p>A pRL-automaton M is correctness preserving if (for all u, v ∈ Γ∗) u ∈ L(M) and u ⊢cM∗ v imply that v ∈ L(M).</p>
      </sec>
      <sec id="sec-4-7">
        <title>3.2 Restarting Automaton with the Shift Operation Enhanced with Structured Output</title>
        <p>In this section, we introduce enhanced restarting automata with structured output – the so-called pRLe-automata. During their enhanced computations, these automata build – with the help of pebbles – so-called DR-structures, i.e., graphs consisting of items representing individual occurrences of symbols (words) in the input sentence and directed edges between them.</p>
        <p>In order to represent a structured output by a pRLe-automaton, the items of the flexible tape (representing just the sentinels and the symbols (= words) of a processed sentence) are enriched with two integers denoting the horizontal and vertical position of the respective word in the resulting DR-structure.</p>
        <p>A pRLe-automaton is specified in two steps: First, the instructions of a pRL-automaton are enhanced with graphs (Sections 3.2.1 and 3.2.2). Second, the application of these enhanced instructions (i.e., the computations) is specified with the help of DR-structures (Sections 3.2.3 and 3.3).</p>
        <p>A pRLe-automaton Me = (Γ, c, $, ER(Me), EA(Me)) consists of an alphabet Γ, sentinels c, $ ∉ Γ, a set of enhanced restarting meta-instructions ER(Me), and a set of enhanced accepting meta-instructions EA(Me).</p>
        <p>An enhanced meta-instruction is a pair IEX = (IX, G), where IX is a (restarting or accepting) meta-instruction of a pRL-automaton, and G = (U, H) is a directed acyclic graph. The graph G represents the required structure of the symbols processed during the application of the meta-instruction IX. In this paper we require G to be a rooted tree. (This restriction ensures linguistically adequate output structures.)</p>
      </sec>
      <sec id="sec-4-8">
        <title>3.2.1 Enhanced restarting meta-instruction IER.</title>
        <p>Let IR be a restarting meta-instruction of the form (1) introduced in Section 3.1.1. We define the corresponding graph G = (U, H) in the following way:
1. The set of nodes U consists of three disjoint subsets Un, Uw, and Up:
• Un = {[i, ai] | dl[i] ∈ Op or wr[i, b] ∈ Op, 1 ≤ i ≤ s}. We can see that Un covers all (original) words which are deleted or rewritten within IR (ai ∈ Γ is the word marked with the pebble pi).
• Uw = {[i, b] | wr[i, b] ∈ Op, 1 ≤ i ≤ s}. The set Uw represents the words resulting from the rewriting operations within IR.
• Up contains the root of G, of the form [i, ai] (unless the root is already contained in Uw).
2. An edge (u, v) ∈ H represents:
• either deleting: if dl[i] ∈ Op then the edge has the form u = [i, ai] and v = [j, aj] for some j ≠ i (j is interpreted as the position of the respective governing node);
• or rewriting: if wr[i, b] ∈ Op then the edge has the form u = [i, ai] and v = [i, b] (i.e., u represents the original symbol ai and v the rewritten one).</p>
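        <p>The construction of G = (U, H) can be sketched as follows. The governor assignment for deleted nodes (the choice of j) is an input here, since the definition only requires some j ≠ i; the function name build_graph and the tuple encodings are our own.</p>

```python
# Sketch of building the enhancing graph G = (U, H) for a restarting
# meta-instruction: Un holds deleted/rewritten originals, Uw rewriting
# results; deletion edges point to a chosen governing pebble j != i.

def build_graph(ops, symbols, governor):
    """ops: ('dl', i) / ('wr', i, b); symbols: dict i -> a_i;
    governor: dict i -> j, the governing pebble of each deleted a_i."""
    U, H = set(), set()
    for op in ops:
        if op[0] == 'dl':
            _, i = op
            j = governor[i]
            U.add((i, symbols[i]))                     # node in Un
            U.add((j, symbols[j]))                     # its governing node
            H.add(((i, symbols[i]), (j, symbols[j])))  # deletion edge
        elif op[0] == 'wr':
            _, i, b = op
            U.add((i, symbols[i]))                     # original, in Un
            U.add((i, b))                              # result, in Uw
            H.add(((i, symbols[i]), (i, b)))           # vertical edge
    return U, H

# Reconstructing the graph for IR1 of sentence (1): dl[2] governed by
# pebble 1, dl[3] governed by pebble 2.
U1, H1 = build_graph([('dl', 2), ('dl', 3)],
                     {1: "bojí", 2: "o", 3: "otce"},
                     {2: 1, 3: 2})
print(sorted(U1))   # [(1, 'bojí'), (2, 'o'), (3, 'otce')]
```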
      </sec>
      <sec id="sec-4-9">
        <title>3.2.2 Enhanced accepting meta-instruction IA.</title>
        <p>Let IA be an accepting meta-instruction of the form IA = (a1, . . . , as, Accept) (i.e., as introduced in Section 3.1.2, X = A). Then the corresponding enhanced accepting meta-instruction has the form IEA = (IA, G), where G = (U, H) is an enhancing graph; the symbols a1, . . . , as are pebbled and correspond to the nodes {[1, a1], [2, a2], . . . , [s, as]} of U. For each edge h = (u, v) ∈ H, we suppose that u = [i, ai], v = [j, aj] (1 ≤ i, j ≤ s and i ≠ j).</p>
        <p>Example: Let us return to the analysis of sentence (1) Petr se bojí o otce. ‘Peter fears for his father.’, see Section 2. In Section 3.1, we have already presented two restarting instructions IR1 and IR2, and one accepting instruction IA. Here we enrich these instructions with graphs describing the (parts of the) resulting DR-structures – the respective pRLe-automaton Me is described by the enhanced instructions IER1 = (IR1, G1), IER2 = (IR2, G2), and IEA = (IA, GA), namely:</p>
        <p>G1 = (U1, H1), where U1 = {[1, bojí], [2, o], [3, otce]} and H1 = {([2, o], [1, bojí]), ([3, otce], [2, o])};</p>
        <p>G2 = (U2, H2), where U2 = {[1, Petr], [3, bojí]} and H2 = {([1, Petr], [3, bojí])};</p>
        <p>GA = (UA, HA), where UA = {[1, bojí], [2, se], [3, ·]} and HA = {([2, se], [1, bojí]), ([3, ·], [1, bojí])}.</p>
      </sec>
      <sec id="sec-4-10">
        <title>3.2.3 DR-structures.</title>
        <p>In this paragraph, we introduce so-called DR-structures (Delete-Rewrite structures), which serve for the representation of the output structures of restarting automata. A DR-structure captures syntactic units (words and their categories used in an input sentence and also in its reduced forms) as nodes of a graph and their mutual syntactic relations as edges; moreover, word order is represented by means of a total ordering of the nodes.</p>
        <p>A DR-structure is a directed acyclic graph D = (V, E), where V is a finite set of nodes and E ⊂ V × V is a finite set of edges. A node u ∈ V is a triple u = [i, j, a], where a is the symbol (word) assigned to the node and i, j are natural numbers: i represents the horizontal position of the node u (i.e., the word order in an input sentence), and j represents the vertical position of u (it is equal to the number of rewritings of a word on the particular horizontal position). Each edge h = (u, v) ∈ E, where u = [iu, ju, a] and v = [iv, jv, b] for some non-negative integers iu, iv, ju, jv and a, b ∈ Γ, is either
• an oblique edge iff iu ≠ iv; such edges are interpreted as representing syntactic relations between the respective symbols (words), or
• a vertical edge iff iu = iv and jv = ju + 1; such edges are interpreted as representing a rewriting of a symbol (word) on the respective position iu.
We say that D = (V, E) is a DR-tree if the graph D is a rooted tree (i.e., all maximal paths in D end in its single root).</p>
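        <p>The edge classification and the DR-tree condition can be stated operationally. A sketch with our own names, representing nodes as Python tuples (i, j, a):</p>

```python
# Sketch of the DR-structure definitions: nodes are triples (i, j, a);
# an edge is oblique (i_u != i_v) or vertical (i_u == i_v, j_v == j_u + 1).

def edge_kind(u, v):
    (iu, ju, _), (iv, jv, _) = u, v
    if iu != iv:
        return "oblique"
    if jv == ju + 1:
        return "vertical"
    raise ValueError("not a valid DR-structure edge")

def is_dr_tree(nodes, edges):
    """Rooted tree: out-degree <= 1, a single root, and no cycles,
    so every maximal path ends in the single root."""
    out = {u: [] for u in nodes}
    for u, v in edges:
        out[u].append(v)
    roots = [u for u in nodes if not out[u]]
    if len(roots) != 1 or any(len(vs) > 1 for vs in out.values()):
        return False
    for u in nodes:                      # follow each path; bound its length
        steps = 0
        while out[u]:
            u = out[u][0]
            steps += 1
            if steps > len(nodes):
                return False             # a cycle was found
    return True

# The dependency tree of sentence (1) as a DR-tree (all j = 0, no rewritings):
N = [(1, 0, "Petr"), (2, 0, "se"), (3, 0, "bojí"),
     (4, 0, "o"), (5, 0, "otce"), (6, 0, ".")]
E = [(N[0], N[2]), (N[1], N[2]), (N[3], N[2]), (N[5], N[2]), (N[4], N[3])]
print(is_dr_tree(N, E), edge_kind(N[4], N[3]))   # True oblique
```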
        <p>Remark: It is important to say that a DR-structure preserves the word order of the original sentence, i.e., the order of the nodes in a DR-structure (corresponding to words and their categories) is not affected by (possible) shifts. However, shifts affect the way a DR-structure corresponds to the individual reductions of the automaton – information about both the original and the changed positions of individual words (and their categories) must be recorded during the computation.</p>
      </sec>
      <sec id="sec-4-11">
        <title>3.3 Computations and (Structured) Languages by pRLe-automata</title>
        <p>Based on the definition, it is clear that for a pRLe-automaton Me, a reduction relation ⊢Me can be defined in the same way as the reduction relation ⊢M of the corresponding pRL-automaton M; further, Me accepts the same language as M (leaving aside the output DR-structures), and the sets of reductions coincide: REM(Me) = REM(M).</p>
        <p>
          Similarly to [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ], a computation of a pRLe-automaton
Me is characterized by sequences of its configurations. A
configuration is a pair of the form Cw = (Tw, Dw), where
Tw (the so called tape content of Cw) is a string of items
representing the current sentence w on the tape and Dw = (V, E)
is a DR-structure (see above). The items of Tw form a
subset of V such that Tw contains exactly the roots of the
individual components (trees) of the DR-structure Dw (thus each
item [i, j, xi] of Tw consists of a word position i (i.e., the
horizontal index), a symbol xi on this position (a word of
the original sentence), and the number j of its rewritings).
Moreover, Tw contains at most one item for each horizontal
position i.
        </p>
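        <p>The requirement that the tape content consists exactly of the roots of the components of Dw can be sketched as follows (a minimal illustration with our own naming):</p>

```python
# Minimal sketch (our own naming) of the tape/DR-structure invariant:
# the tape items are exactly the nodes with no outgoing edge, i.e., the
# roots of the individual components of the DR-structure.

def roots(nodes, edges):
    """Nodes with no outgoing edge - the roots of the components."""
    sources = {u for (u, v) in edges}
    return sorted(nodes - sources)

nodes = {(1, 0, "Petr"), (3, 0, "bojí"), (4, 0, "o"), (5, 0, "otce")}
edges = {((4, 0, "o"), (3, 0, "bojí")), ((5, 0, "otce"), (4, 0, "o"))}
print(roots(nodes, edges))  # [(1, 0, 'Petr'), (3, 0, 'bojí')]
```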
        <p>
          Note that – contrary to [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ] – the items of Tw are
ordered; however, their order need not reflect the original
word order of an input sentence as the items (triples) may
be re-ordered by a shift operation.
        </p>
        <p>An (accepting) computation C of Me on an input
sentence w starts in the initial configuration Cwin = (Twin, Dwin):
for the given sentence w = x1 . . . xn (xj ∈ Γ, 1 ≤ j ≤ n),
the initial tape content consists of the string of items Twin =
[1, 0, x1] · · · [n, 0, xn], and the initial DR-structure is Dwin = (Vwin, ∅),
where Vwin = {[1, 0, x1], · · · , [n, 0, xn]}. The computation
continues by performing several cycles, and it ends with a
tail.</p>
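        <p>The construction of the initial configuration can be sketched as follows (again our own illustrative encoding):</p>

```python
# Sketch (our own naming) of the initial configuration C_in = (T_in, D_in)
# for a sentence w = x_1 ... x_n: the tape holds the items [i, 0, x_i]
# and the DR-structure starts with exactly these nodes and no edges.

def initial_configuration(words):
    tape = [(i, 0, x) for i, x in enumerate(words, start=1)]
    nodes, edges = set(tape), set()
    return tape, (nodes, edges)

tape, (nodes, edges) = initial_configuration(
    ["Petr", "se", "bojí", "o", "otce", "."])
print(tape[0])    # (1, 0, 'Petr')
print(len(edges)) # 0
```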
        <p>A cycle means an application of an enhanced
restarting meta-instruction to a given configuration, resulting in
a new configuration. During each cycle of C, some items
of the tape content are deleted, rewritten, or shifted (as in
a cycle of the original automaton without structures);
further, some edges and (rewritten) nodes are added to the
DR-structure of the corresponding configuration, in
accordance with the graph attached to the respective enhanced
meta-instruction (see below for a more formal description).</p>
        <p>A tail consists of an application of an enhanced
accepting meta-instruction – it results in the acceptance of a
configuration; further, some edges are added to the current
DR-structure in accordance with the graph attached to this
enhanced meta-instruction.</p>
        <p>Let us stress that during a computation, nodes and
edges may only be added to the DR-structures of the
computation, never removed. At the end of each accepting computation,
we obtain a DR-tree covering the whole input sentence.</p>
      </sec>
      <sec id="sec-4-12">
        <title>3.3.1 Application of an enhanced restarting meta-instruction IER</title>
        <p>Let IER = (IR, G) be an enhanced restarting meta-instruction in
the previously defined form, with G = (U, H) (see Section
3.2). An application of IER to an enhanced configuration
Cw = (Tw, Dw) of a word w results in a new configuration
Cw′ = (Tw′, Dw′). The application consists of the following
steps:
1. Choosing a factorization of w of the form w =
v0a1v1a2 . . . vs−1asvs such that vi ∈ Ei for all i =
0, . . . , s (see the form of a restarting meta-instruction
(1)). Let us consider the items from Tw
containing the symbols a1, · · · , as to be pebbled.</p>
        <p>If w does not permit any factorization of this form,
then IER cannot be applied to the configuration Cw.
2. Transforming the tape content Tw containing the
sentence w into the tape Tw′ containing the sentence
w′ = v0r1v1 · · · vs−1rsvs ; the ri (1 ≤ i ≤ s) are described
in Section 3.1. The transformation of Tw to Tw′ obeys
the following rules:
• each item within some vi (0 ≤ i ≤ s) which does not
undergo an editing operation is copied to Tw′,
• no deleted item from Tw is transferred to Tw′,
• each item [ki, ji, ai] from Tw that corresponds to
a rewritten symbol ai (hence oi = wr[i, b], b ∈ Γ,
1 ≤ i ≤ s) is replaced by [ki, ji + 1, b] in Tw′, and
• the shift of an item [i, j, ai] is performed in such
a way that the indices i, j remain unchanged in Tw′.
3. All edges and nodes from the DR-structure Dw are
transferred into the new DR-structure Dw′.
4. For each edge e ∈ H, a new edge is inserted into the
new DR-structure Dw′ = (V′, E′):
• if e = ([i, ai], [r, ar]) is a deleting edge from
H (1 ≤ r ≤ s), and [ki, ji, ai], [kr, jr, ar] ∈ Tw
are the items covering the pebbled symbols ai
and ar, respectively, then an oblique edge eo =
([ki, ji, ai], [kr, jr, ar]) is inserted into Dw′ (i.e.,
into E′);
• if e = ([i, ai], [i, bi]) is a rewriting edge and
[ki, ji, ai] ∈ Tw is the item covering the pebbled
symbol ai, then a vertical edge ev is inserted into
E′: ev leads from the item [ki, ji, ai] to the item
[ki, ji + 1, bi].</p>
        <p>At the same time, the node [ki, ji + 1, bi] is added
to the set V′.</p>
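        <p>Steps 2–4 above can be sketched in simplified form (our own encoding; the choice of a factorization and the handling of shifts are omitted):</p>

```python
# Simplified sketch (our own naming) of applying an enhanced restarting
# meta-instruction: deleted items are not transferred to the new tape,
# a rewritten item (k, j, a) is replaced by (k, j + 1, b) together with
# a vertical edge, and the oblique edges of the instruction graph are
# added. Factorization choice and shifts are omitted.

def apply_reduction(tape, graph, delete=(), rewrite=(), oblique=()):
    nodes, edges = graph
    delete = set(delete)
    rewrites = dict(rewrite)          # old item -> new item
    new_tape = []
    for item in tape:
        if item in delete:
            continue                  # deleted items are not transferred
        if item in rewrites:
            new = rewrites[item]
            nodes.add(new)            # the rewritten node joins V'
            edges.add((item, new))    # vertical edge old -> new
            new_tape.append(new)
        else:
            new_tape.append(item)
    edges.update(oblique)             # oblique edges from the graph G
    return new_tape, (nodes, edges)

tape = [(1, 0, "Petr"), (2, 0, "se"), (3, 0, "bojí"),
        (4, 0, "o"), (5, 0, "otce"), (6, 0, ".")]
graph = (set(tape), set())

# A cycle deleting 'o' and 'otce' while attaching them under 'bojí' / 'o':
tape, graph = apply_reduction(
    tape, graph,
    delete=[(4, 0, "o"), (5, 0, "otce")],
    oblique=[((4, 0, "o"), (3, 0, "bojí")), ((5, 0, "otce"), (4, 0, "o"))])
print(tape)  # the two deleted items are gone

# A hypothetical rewriting of 'se' would add a vertical edge:
tape2, (nodes2, edges2) = apply_reduction(
    tape, graph, rewrite=[((2, 0, "se"), (2, 1, "SE"))])
print(tape2[1])  # (2, 1, 'SE')
```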
        <p>We say that the configuration Cw′ can be reached in one
(restarting) step from the configuration Cw by Me (using
the instruction IER), and we denote this by Cw |=Me^IER Cw′, or simply
by Cw |=Me Cw′.</p>
      </sec>
      <sec id="sec-4-13">
        <title>3.3.2 Application of an enhanced accepting meta-instruction IEA</title>
        <p>Let IEA = (I, G) be an enhanced accepting
meta-instruction, where I = (a1, . . . , as, Accept) and G = (U, H).
Let Cw = (T, D) be an enhanced configuration of Me with
the tape content T = [i1, j1, a1] · · · [is, js, as] covering the
word w = a1 · · · as. Then the application of IEA to the
configuration Cw results in a new configuration Cw′ = (T′, D′),
where D′ = (V′, E′):
1. T′ = [i1, j1, a1] · · · [is, js, as] = T;
2. all edges and nodes from the DR-structure D are
transferred into the new DR-structure D′;
3. moreover, for each edge e = ([r, ar], [t, at]) ∈ H the
new edge ([ir, jr, ar], [it, jt, at]) is inserted into D′.</p>
        <p>Similarly to the previous case, we say that the
configuration Cw′ can be reached in one (accepting) step from
the configuration Cw by Me using the instruction IEA, and we
denote this by C |=Me^IEA C′.</p>
        <p>We can finally define an accepting computation of an
enhanced pRLe-automaton and subsequently also a
DR-language consisting of the output DR-structures.</p>
        <p>Let C = C0, C1, . . . , Ck be a sequence of configurations
such that
– C0 is the initial configuration for a word w,
– Ci |=Me^Ii Ci+1 holds for all i = 0, . . . , k − 1,
– Ii are restarting instructions for all i = 0, . . . , k − 2, and
– Ik−1 is an accepting instruction.</p>
        <p>Then we say that C is an accepting computation of Me,
the word w is accepted by Me, and the DR-structure Dk
(from the last configuration Ck = (Tk, Dk)) is the output
DR-structure of the computation C . We write DR(C ) =
Dk.</p>
        <p>Let AC(Me) denote the set of all accepting computations
of the pRLe-automaton Me. Then the DR-language of Me
is the set DR(Me) = {DR(C ) | C ∈ AC(Me)}.</p>
        <p>Remark. From the linguistic point of view, DR(Me)
should be considered as a set of (shallow) dependency-based
syntactic trees / analyses computed by the
automaton Me.</p>
        <p>Example: Let us return to example (1) in Section 2,
Petr se bojí o otce. ‘Peter fears for his father.’ In
Sections 3.1.2 and 3.2.2, the pRLe-automaton Me –
described by two restarting meta-instructions and one
accepting meta-instruction, together with the graphs
enhancing these meta-instructions – was presented. The sequence
S = IER1, IER2, IEA performs the left branch of the analysis
by reduction. In the course of the corresponding
computation, the DR-tree D3 is gradually constructed.</p>
        <p>Now we can present the accepting computation C =
C0, C1, C2, C3 on the sentence (1) determined by the
sequence of meta-instructions S. Let C0 = Cin = (Tin, Din), C1 =
(T1, D1), C2 = (T2, D2), C3 = (T3, D3), where C0 is the initial
configuration of C and C3 is the accepting configuration. Below
we gradually describe Tin, T1, T2, T3, and Din, D1, D2, D3,
where D3 = DR(C).</p>
        <p>Tin = [1, 0, Petr][2, 0, se][3, 0, bojí][4, 0, o][5, 0, otce][6, 0, ·]
Din = (Vin, ∅)
Vin = {[1, 0, Petr], [2, 0, se], [3, 0, bojí], [4, 0, o], [5, 0, otce], [6, 0, ·]}
—————–
T1 = [1, 0, Petr][2, 0, se][3, 0, bojí][6, 0, ·]
D1 = (Vin, E1)
E1 = {([4, 0, o], [3, 0, bojí]), ([5, 0, otce], [4, 0, o])}
—————–
T2 = [3, 0, bojí][2, 0, se][6, 0, ·]
D2 = (Vin, E1 ∪ E2)
E2 = {([1, 0, Petr], [3, 0, bojí])}
—————–
T3 = T2
D3 = (Vin, E1 ∪ E2 ∪ E3)</p>
        <p>E3 = {([2, 0, se], [3, 0, bojí]), ([6, 0, ·], [3, 0, bojí])}
We can see that the presented example is very simple –
the context languages from the instructions always contain
just one string (sometimes the empty one λ). Moreover, we
have not used the operation wr[i] at all. As a consequence,
all items used in C have their vertical index equal to 0.</p>
        <p>
          We can notice that D3 is a formalization of the
dependency tree from the example (1). (Let us note that a
computation performing the other branch of AR should
create the same DR-tree.)
Observations. Let us finally (and informally)
formulate some observations concerning the presented notions.
1. The classes of introduced languages, sets of reductions,
and DR-languages form infinite hierarchies with respect
to the number of used pebbles. We are able to prove
such propositions exactly.
2. On the other hand, we can observe that very few
pebbles (six, according to our preliminary hypothesis) suffice
to adequately analyze any sentence from the Prague
Dependency Treebank (see [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]). It means that the complexity
of analysis by reduction of natural languages is not too
high; however, it is also not trivial.
3. In a similar way, we can see that the number of
shifts used in one cycle need not exceed one.
4. As far as we can foresee, the use of rewritings within
the analysis by reduction is forced by only two linguistic
phenomena: coordination and noun groups with
numerals. Moreover, the number of rewritings in one cycle
need not exceed one either.
5. Further, it is not hard to see that the shift operation
cannot be simulated by rewritings within the class of
pRLe-automata, due to the limited number of rewritings
encoded in meta-instructions.
6. Finally, the measure of word-order complexity from [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ]
is not bounded by any natural number for DR-languages
of pRLe-automata.
        </p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Conclusion and Perspectives</title>
      <p>The main novelty of the work presented in this paper is an
enhancement of a formal system in a way that makes it
possible to adequately capture relations between linguistic
notions in the course of (enhanced) analysis by reduction;
namely, it describes dependency structures on the one hand,
and (the complexity of) word order on the other hand (for the
sake of clarity, we do not distinguish between input
symbols and their categories in this paper). However, the
investigation of these notions is complete neither from the
linguistic point of view nor from the point of view of
formal automata theory.</p>
      <p>As for future work, we plan to continue the research
in the direction of combining more types of linguistic
phenomena which influence the degree of word order freedom
of natural languages, as well as enhancing the proposed
formal system so that it captures the added
phenomena. Further research on the presented type of automata
should also establish its properties in an exact way.</p>
      <p>
        Further, we plan to propose a refinement of AR
that allows setting constraints on possible reduction steps, namely a
constraint permitting projective reductions only, as additional
experiments with a different or modified set of constraints
applied to the shift operation will probably provide new
measures of word order freedom. An interesting research
direction in this respect would be the investigation of the
projectivization of sentences prior to the individual reductions
of AR, which might completely change the picture and
allow for a more detailed investigation of the interplay
between clitics and non-projective sentences; see also [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>Further investigation of the shift operation might also
help to describe the relationship between different levels of
formal description of natural languages in the Functional
Generative Description, namely between the
tectogrammatical and analytical layer. This would strengthen the
formal background of the theory.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Gerdes</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kahane</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Defining dependencies (and constituents)</article-title>
          . In: Gerdes,
          <string-name>
            <surname>Hajičová</surname>
          </string-name>
          , Wanner (eds.)
          <source>Proceedings of Depling</source>
          <year>2011</year>
          . pp.
          <fpage>17</fpage>
          -
          <lpage>27</lpage>
          .
          Barcelona
          (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Hajič</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Panevová</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hajičová</surname>
          </string-name>
          , E.,
          <string-name>
            <surname>Sgall</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pajas</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Štěpánek</surname>
          </string-name>
          , J.,
          <string-name>
            <surname>Havelka</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mikulová</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Žabokrtský</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ševčíková-Razímová</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          : Prague Dependency Treebank 2.0. Linguistic Data Consortium
          , Philadelphia (
          <year>2006</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Hajičová</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Havelka</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sgall</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Veselá</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zeman</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Issues of Projectivity in the Prague Dependency Treebank</article-title>
          .
          <source>The Prague Bulletin of Mathematical Linguistics</source>
          <volume>81</volume>
          ,
          <fpage>5</fpage>
          -
          <lpage>22</lpage>
          (
          <year>2004</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Holan</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kuboň</surname>
          </string-name>
          , V.,
          <string-name>
            <surname>Oliva</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Plátek</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>On Complexity of Word Order</article-title>
          . Les grammaires de dépendance - Traitement
          <source>automatique des langues 41(1)</source>
          ,
          <fpage>273</fpage>
          -
          <lpage>300</lpage>
          (
          <year>2000</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Kuboň</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lopatková</surname>
            ,
            <given-names>M.:</given-names>
          </string-name>
          <article-title>A case study of a free word order (manuscript)</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Kuboň</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lopatková</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Plátek</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Studying formal properties of a free word order language</article-title>
          . In: Youngblood,
          <string-name>
            <given-names>G.</given-names>
            ,
            <surname>McCarthy</surname>
          </string-name>
          ,
          <string-name>
            <surname>P</surname>
          </string-name>
          . (eds.)
          <source>Proceedings of the FLAIRS 25 Conference</source>
          . pp.
          <fpage>300</fpage>
          -
          <lpage>305</lpage>
          . AAAI Press, Palo Alto (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Kunze</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          : Abhängigkeitsgrammatik, Studia Grammatica, vol. XII. Akademie Verlag, Berlin (
          <year>1975</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Lopatková</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Plátek</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kuboň</surname>
          </string-name>
          , V.:
          <article-title>Modeling Syntax of Free Word-Order Languages: Dependency Analysis by Reduction</article-title>
          . In: Matoušek,
          <string-name>
            <surname>V.</surname>
          </string-name>
          et al. (ed.)
          <source>Proceedings of TSD 2005. LNCS</source>
          , vol.
          <volume>3658</volume>
          , pp.
          <fpage>140</fpage>
          -
          <lpage>147</lpage>
          . Springer (
          <year>2005</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Lopatková</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Plátek</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sgall</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Towards a Formal Model for Functional Generative Description: Analysis by Reduction and Restarting Automata</article-title>
          .
          <source>The Prague Bulletin of Mathematical Linguistics</source>
          <volume>87</volume>
          ,
          <fpage>7</fpage>
          -
          <lpage>26</lpage>
          (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Marcus</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          : Sur la notion de projectivité.
          <source>Zeitschrift fur Mathematische Logik und Grundlagen der Mathematik</source>
          <volume>11</volume>
          (
          <issue>1</issue>
          ),
          <fpage>181</fpage>
          -
          <lpage>192</lpage>
          (
          <year>1965</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Mareček</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Žabokrtský</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          :
          <article-title>Exploiting reducibility in unsupervised dependency parsing</article-title>
          .
          <source>In: Proceedings of EMNLP-CoNLL 2012</source>
          . pp.
          <fpage>297</fpage>
          -
          <lpage>307</lpage>
          . ACL (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Mel'čuk</surname>
            ,
            <given-names>I.A.</given-names>
          </string-name>
          :
          <article-title>Dependency in language</article-title>
          . In: Gerdes,
          <string-name>
            <surname>Hajičová</surname>
          </string-name>
          , Wanner (eds.)
          <source>Proceedings of Depling</source>
          <year>2011</year>
          . pp.
          <fpage>1</fpage>
          -
          <lpage>16</lpage>
          .
          Barcelona
          (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Otto</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>Restarting Automata and Their Relation to the Chomsky Hierarchy</article-title>
          . In: Ésik,
          <string-name>
            <given-names>Z.</given-names>
            ,
            <surname>Fülöp</surname>
          </string-name>
          ,
          <string-name>
            <surname>Z</surname>
          </string-name>
          . (eds.)
          <source>Proceedings of DLT 2003. LNCS</source>
          , vol.
          <volume>2710</volume>
          , pp.
          <fpage>55</fpage>
          -
          <lpage>74</lpage>
          . Springer, Berlin (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Otto</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>Restarting Automata</article-title>
          . In: Ésik,
          <string-name>
            <given-names>Z.</given-names>
            ,
            <surname>Martin-Vide</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            ,
            <surname>Mitrana</surname>
          </string-name>
          , V. (eds.)
          <source>Recent Advances in Formal Languages and Applications, Studies in Computational Intelligence</source>
          . vol.
          <volume>25</volume>
          , pp.
          <fpage>269</fpage>
          -
          <lpage>303</lpage>
          . Springer-Verlag, Berlin (
          <year>2006</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Plátek</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mráz</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lopatková</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>(In)Dependencies in Functional Generative Description by Restarting Automata</article-title>
          . In: Bordihn,
          <string-name>
            <surname>H.</surname>
          </string-name>
          et al. (ed.)
          <source>Proceedings of NCMA</source>
          <year>2010</year>
          .
          <article-title>books@ocg</article-title>
          .at, vol.
          <volume>263</volume>
          , pp.
          <fpage>155</fpage>
          -
          <lpage>170</lpage>
          . Österreichische Computer Gesellschaft, Wien, Austria (
          <year>2010</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Plátek</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mráz</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lopatková</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Restarting Automata with Structured Output and Functional Generative Description</article-title>
          . In:
          <article-title>Dediu A</article-title>
          . et al. (ed.)
          <source>Proceedings of LATA 2010. LNCS</source>
          , vol.
          <volume>6031</volume>
          , pp.
          <fpage>500</fpage>
          -
          <lpage>511</lpage>
          . Springer, Berlin Heidelberg (
          <year>2010</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>