<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Technology and the Reincarnation Cycles of Software</journal-title>
      </journal-title-group>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Technology and the Reincarnation Cycles of Software</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>HANNU JAAKKOLA</string-name>
          <email>hannu.jaakkola@tut.fi</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>JUKKA MÄKELÄ</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Tampere University of Technology</institution>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Lapland</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2017</year>
      </pub-date>
      <volume>5</volume>
      <issue>7</issue>
      <fpage>11</fpage>
      <lpage>13</lpage>
      <abstract>
        <p>The idea to write this paper was based on a discussion with colleagues about the progress of Artificial Intelligence (AI) over the last five decades. The authors have been active in the field since the 1970s. AI is just one example in our paper, but it is also a good example of a “wave-driven technology”: it resurfaces from time to time in the role of an emerging technology (a technology that has the capability to cause major changes) in ICT. Technology analyst companies now consider it to be at the frontier again, having an influence both by itself and embedded in a wide variety of applications. It has a wide influence on the whole of society and business, e.g., on employment structure and business processes. In our discussion, we wish to find answers to the questions: What is the reason for the same technology reappearing time after time and being reported as one of the important sources of changes? What are the differences between the waves or generations, and the reasons behind them? We have built a hypothesis that the kernel of AI has remained more or less the same but that the changes are triggered by certain enabling technologies, used to implement the AI applications of each era. We term these enabling technologies “resources”. The same pattern of progress as in AI fits many other areas of ICT. We transferred our discussion to the area of data management, which became the second goal of our analysis. The current data-driven ICT manifests itself in big data analysis and new database technologies, but has roots in the early decades of computing. In general, at the beginning of the computer era (1940s), the aim was always to develop intelligent computer-based software systems within the framework of technical restrictions. We also see the same limitations in the changes of software engineering tools and practices, as well as in the interpretations of ICT-related concepts (the semantics of concepts change with time). In our paper, we will briefly introduce our ICT Change Analysis and Categorization framework and introduce the current state of the art of emerging ICTs. The aim of our paper is to act as a discussion opener on the question “How was the progress of software and software engineering born – what are the factors controlling and enabling the changes?” Our aim is to point out that, in addition to excellence in the area of frontier technologies and tools, a lot of basic-level understanding in the area of computing fundamentals is needed – just to understand the roots of the current situation.</p>
      </abstract>
      <kwd-group>
        <title>Additional Key Words and Phrases</title>
        <kwd>Technology Analysis</kwd>
      </kwd-group>
      <kwd-group>
        <title>Categories and Subject Descriptors</title>
        <kwd>I.2 [Artificial Intelligence]: Applications and Expert Systems</kwd>
        <kwd>H.2 [Database Management]: Systems, Languages, Database Machines</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. INTRODUCTION</title>
      <p>Emerging or emergent technologies (also known as frontier technologies) are technologies that are
perceived as capable of changing the status quo. Usually these technologies are new. Sometimes older
technologies also provide the potential for remarkable changes. Emerging technologies have a radical
novelty and potential for fast growth and impact. Uncertainty is also sometimes connected to
emerging technologies – the progress is not as expected. The tools of technology analysis provide a
means for the detailed analysis and recognition of the real life cycle of a technology – see e.g.
[Jaakkola et al., 2017].</p>
      <p>The starting point of this paper is the finding that many current frontier innovations have their
roots somewhere in the past. The innovations of today may be evolutions or new generations of earlier
innovations. The analysis of long-term progress also points out waves of the same innovations; some
innovations appear in waves, some appear only once in such a role and remain a part of the
mainstream. The idea to write this paper relates to the discussion of the role of Artificial Intelligence
(AI) as a part of current ICT. We noticed that since the 1950s it has reappeared time after time and
been reported as one of the important emerging technologies – it is at the top of the wave once again.
We count the current wave as the fourth generation of AI ranked among emerging technologies; there
could be more, depending on the interpretation. Similar waves can be found in other technologies, such
as data management, data analysis, autonomous devices, networking, certain fields of user interfaces,
etc. This observation pushed us to think about the reasons for this wave-formed development.</p>
      <p>Our hypothesis is that the essence of these wave-formed technologies has remained the same for
decades, but the new resources available enable new modifications and capabilities in the applications.
We define a wave-based technology as a technology that is reported several times in technology
analysis to be one of the emergent technologies.</p>
      <p>The aim of this paper is not to focus on technology studies in a wide scope. We will touch that topic
only in the form of a short overview. Our recent paper [Jaakkola et al., 2017] from May 2017 contains
a good analysis of the current situation of technological changes. We collected data from more than
twenty reports of leading analysts with titles like “Technology outlook”, “Leading technologies 2017”,
“Technology Forecast” etc. The perspective of the analysis was five years ahead from 2017; some of the
reports had a longer time span, up to 20 years. The analyst companies include Gartner, Cisco, Deloitte,
Forrester, Fjord, Forbes, IDC, PWC, and the World Economic Forum among others. An overview of
this study is in Section 2 of this paper; the aim of this overview is to provide a means to understand
how the world appears now through the eyes of a technology analyst.</p>
      <p>Above we pointed out “what the aim of this paper is not.” By this exclusion we wished to inform the
reader that we want to focus on a different topic, even though at the beginning of the paper there is a
lot of discussion of technological changes. The research problem of this paper is defined as follows:
“What are the reasons for certain technologies appearing in waves in the role of emerging technologies
in the ICT sector?” The problem is divided into three research questions:
1. What is the current state of emerging ICT related technologies?
2. What kind of evidence can be found to support the hypothesis of the wave-based appearance
of certain technologies?
3. What kind of explaining factors can be found for the wave-based appearance?</p>
      <p>The research questions also form the structure of this paper. In Section 2 we give a short overview
of the current state in the area of emerging ICT technologies. In Section 3 we give two examples of
wave-based technologies – Artificial Intelligence and Data Management (DM). Both of these have as
long a history as computers. Their role has changed during the decades. Both of them belong to
the category of leading emerging technologies in the ICT sector today (and also in the past) – AI in the
form of deep learning and neural network based (embedded) applications; DM in the form of
intelligent data analytics and block chain technology. Section 4 handles our hypothesis related to the
role of enabling technologies that have remained more or less the same during the decades. In Section
5 we conclude the paper.</p>
    </sec>
    <sec id="sec-2">
      <title>2. A SHORT LESSON IN TECHNOLOGY ANALYSIS</title>
      <p>For our conference paper [Jaakkola et al., 2017], we analyzed close to twenty technology analysis
reports made by leading analysts. The aim of this analysis was to build a big picture of the role of ICT
technologies in current society. The focus of that paper was on the education sector, especially on
changes in professions / employment and the needs to renew education. The results are organized in
table format (p. 743) and illustrated as a classification structure (ICT Change Analysis and
Categorization Framework – ICAC) (p. 747). We will not return to this topic in detail in this paper, but
quote the presented framework as an introduction to the topic (Figure 1).</p>
      <p>In the figure, the current key technologies are classified according to Freeman and Perez [Freeman
and Perez, 1988], whose classification is commonly used in the area of technology research. They classify the
improvement capability of new technologies into four categories:
• Incremental changes appear continuously in existing products and services (continuing the
existing trend);
• Radical changes appear when new research findings are applied in products to transfer their
properties or performance to a new step or cycle (movement to a new trend);
• Changes in technological systems are caused by combinations of several incremental and
radical innovations in societal and organizational systems;
• Changes in paradigms are revolutionary and lead to pervasive effects throughout the target
system under discussion (in our case, the information society).</p>
      <p>We have organized our findings in the form of a mind map. The arrows between the categories
indicate transfers between categories. The classification of technologies into categories is not
completely unambiguous, but represents the authors’ best, admittedly subjective, understanding.</p>
      <p>In the “Incremental changes” category, there are technologies that are widely adopted in
mainstream software products and development. Some of these have been revolutionary at the time of
their first appearance, but in time have become a part of the mainstream. Technologies in the
“Radical” category still have such a competitive edge that, applied in a novel way, some benefits in
the form of faster growth or better quality can be reached. The current trend accelerates or continues
on a higher level after a step upward. The technologies classified in the category “Changes in
technological systems” enable new opportunities in society or business in the form of process changes.
The effect of changes appears in behavioral models, which are enabled by certain technologies or a
combination of them. The “Paradigm” category is the most interesting part of the analysis. These
technologies are in some way revolutionary now. They change the world permanently, like cloud
computing, which has provided access to scalable computing resources at minimal cost, or open
data, which has opened enormous data sources from which knowledge can be cultivated using big data analytics. Our
term “emerging” or “frontier technology” corresponds to the paradigm category.</p>
      <p>This short overview, we hope, has provided insight into our way of thinking about the role of technologies
as a part of progress in ICT. For further analysis (Section 3), we have selected two technologies, which
are handled from a historical perspective as evidence of wave-based changes:
• Artificial intelligence is reported as belonging to the category of paradigm changes. All current
analysts report it as the most important source of changes today. It provides new products,
replaces human work, and is embedded in a wide variety of intelligent systems. AI has,
however, a long history, and it is easy to find it on the frontier a couple of times earlier.
• Data management is found in our classification in three categories. Enriched data and context
sensitivity have radical characteristics, and the management of unstructured data, open data
(management), and big data analytics still have the potential for paradigm-level changes.
In Section 3 we will handle these two technologies from a historical point of view, as a part of progress
during the decades of computing.</p>
    </sec>
    <sec id="sec-3">
      <title>3. WAVE-BASED EMERGING TECHNOLOGIES – TWO EXAMPLES</title>
      <p>In this section we will provide the reader with two examples of wave-based technologies – Artificial
Intelligence and Data Management (DM). Both of these have as long a history as computers although
their role has changed during the decades. Both of them belong to the category of leading emerging
technologies in the ICT sector today – AI in the form of deep learning and neural network based
(embedded) applications; DM in the form of intelligent data analytics and block chain technology. The
discussion below points out their earlier waves (over a period of 60 years). In this section we
concentrate on describing the changes they brought about; the reasons behind the phenomena are
discussed in Section 4.</p>
    </sec>
    <sec id="sec-4">
      <title>3.1 Artificial Intelligence</title>
      <p>The term “Artificial Intelligence” was coined by John McCarthy in 1955. He is also the inventor of
the Lisp programming language. The first wave of AI in the role of emerging technology focused on
programming languages like Lisp (in the 1950s) and later Prolog (early 1970s; Alain Colmerauer and
Philippe Roussel)1. The main feature was modifiable code – the program (application) was able to
modify itself, which was the source of the learning capability and “intelligence”. The second wave relates
to expert systems (1970s – 1980s), which were based on rule- or frame-based reasoning and hypertext2.
Ultimately, neither of these waves had a significant role in the progress of ICT, but they attracted wide
interest among the scientific community and industry – for a while.</p>
      <p>Somewhere on the timeline parallel to the era of simple expert systems (1980s) there was an aim to
build AI directly into the computer architecture; we call this the third wave of AI. One of these
commercial architectures was Symbolics3, which implemented Lisp in its processor. The origins of the
company developing this computer, Symbolics Inc., were at MIT, in Cambridge, Massachusetts.
Symbolics computers were produced in the 1980s; however, they never became a commercial success
and finally disappeared from the market. Another architecture-level innovation – or actually a set of
innovations – was developed in the Japanese 5th Generation Computer System (FGCS Project)4. This
project was an initiative of Japan's Ministry of International Trade and Industry beginning in 1982.
The aim was to create massively parallel computer architectures based on the concept of logic
programming (Prolog). The Institute for New Generation Computer Technology (ICOT) had a
revolutionary ten-year plan for the development of large computer systems, which were applicable to
knowledge information processing systems. The idea never materialized on the commercial product
level. It provided, however, an enormous step forward for industrial software development in Japan,
as well as being a source of (parallel) computer architectures implemented by Japanese electronics
companies in their products. To conclude, AI support on the architecture level remained a promise, in
spite of enormous investments in Japan and marketing efforts in the case of Symbolics. Why? That would
be a topic for another conference paper. In spite of the lack of commercial success, the FGCS project in
particular was followed with interest by the international scientific community and industry5.
1 About Lisp: https://en.wikipedia.org/wiki/John_McCarthy_(computer_scientist); About Prolog: https://en.wikipedia.org/wiki/
Prolog.
2 Our contribution to this wave: H. Jaakkola, J. Henno. 1991 (1989). Using Hypertext to Implement a Knowledge Acquisition
System. In H. Jaakkola, H. Kangassalo, S. Ohsuga (eds.), Advances in Information Modelling and Knowledge Bases. IOS,
Amsterdam.
3 https://en.wikipedia.org/wiki/Symbolics
4 https://en.wikipedia.org/wiki/Fifth_generation_computer
5 The first author collaborated actively with ICOT; three scientific papers were published in 1984-1985.
</p>
      <p>
        Nowadays – in the fourth wave of AI – the key elements are machine learning technologies, neural
networks, intelligent systems (as an application), and a variety of technologies related to data
analytics. All technology analysts see AI providing revolutionary opportunities in a wide variety of
applications that replace human work or support humans in their work – in the form of robotics, as part
of a variety of intelligent devices, etc. A good example, and evidence of our hypothesis about the
role of enabling resources, is deep learning. Intelligence is based on learning. Deep learning theory
has its roots in the 1980s. In practice it is a question of using nonlinear statistics in data analysis
and of the opportunity to learn from masses of data. More than thirty years were needed to make these
theories work in practice – parallel computing and big data technologies have made it possible; earlier,
data was the bottleneck (quoting Professor Aapo Hyvärinen’s seminar presentation, Helsinki, August
31st, 2017).
      </p>
    </sec>
    <sec id="sec-5">
      <title>3.2 Data Management</title>
      <p>The second example we have selected for overall discussion in our paper is DM. In Section 4 below we
will handle the progress of magnetic storage devices. In the period before these came into commercial use, data
was stored and handled in the form of punched cards, punched tapes, etc., which have their roots in the
pre-computer era. Magnetic devices, first tapes and later disks, replaced these, but for a long time
these primitive media remained a part of daily computing. The role of data in the ICT sector has been
discussed by scientists over the decades. The traditional term “computer science” refers to the
importance of computing. The term “Data Science” was presented as a replacement by Peter Naur6
as early as 1960. Nowadays the term is in common use and represents the interdisciplinary
field of methods used to extract knowledge from data, either structured or unstructured. It unifies
theories drawn from mathematics, statistics, information science, and computer science, in particular
machine learning, classification, cluster analysis, data mining, databases, and visualization.</p>
      <p>Due to this history and poor computing power, the first wave of data management was based on the
sequential handling of data. When magnetic devices became available, this first wave of simple data
management was still in use. The first database systems appeared in the late 1960s and early
1970s, and then the second wave of DM began. The first databases were implemented as
navigational, linked datasets that were close to the physical implementation of the data. The progress
enablers were the growth of computers’ processing capacity, novel mass memories, and high-level
programming languages (like COBOL in business applications).</p>
      <p>The third wave of data management was based on the idea of Edgar Codd in the early 1970s. His
invention – using the table format for data presentation and relational calculus to handle the data –
separated the database architecture from the physical implementation of the data. During the 1970s, several
projects around the world were developing relational database ideas into commercial products.
Computer users had to wait until the 1980s for the first relational database products. Eventually
some mainstream products became worldwide standards. One of these was the SQL database
language. As a result, the era is also known as the SQL database era. In the 1990s, progress
continued in the form of object databases and some other variants, which we consider more as an
evolution step of relational databases than as a real reincarnation wave. The 1990s also gave birth to
the open source relational database system MySQL, which became one of the forerunners of the open
source software movement.</p>
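      <p>As a brief illustration of Codd’s idea described above – declarative, table-oriented access that is
separated from the physical layout of the data – the following minimal Python sketch uses the standard
sqlite3 module. The table and column names are our own illustrative assumptions, not taken from the
sources discussed in this paper.</p>
      <preformat>
# A minimal sketch of the relational idea: data is presented as tables and
# queried declaratively, without reference to its physical storage layout.
# Table and column names are illustrative assumptions only.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE waves (tech TEXT, wave_no INTEGER, decade TEXT)")
conn.executemany(
    "INSERT INTO waves VALUES (?, ?, ?)",
    [("AI", 1, "1950s"), ("AI", 4, "2010s"), ("DM", 3, "1970s")],
)

# The query states WHAT is wanted; HOW the rows are stored and located is left
# to the database engine - the separation that Codd's model introduced.
for row in conn.execute("SELECT tech, decade FROM waves WHERE wave_no = 4"):
    print(row)
      </preformat>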
      <p>The next step in DM was recognized with the appearance of a growing need to handle
unstructured data. This was based on the need to handle live data streams, Internet-based distributed
data collections, as well as open data. This started the fourth wave of DM; on the timeline it
begins in the early 2000s and is continuing. It is interesting to note that the same actors are
participating in the present wave as earlier. Where MySQL was one of the important products of the
SQL database era, there is now a non-SQL-era counterpart, MariaDB, in the same position. Both have the same
innovator, the Finnish developer Monty Widenius.
6 See https://en.wikipedia.org/wiki/Peter_Naur.</p>
      <p>At the moment we can see the beginning of the fifth DM wave – i.e., the appearance of block chain
technology. In analysts’ reports, it is seen to be in its breakthrough phase and has growing potential
in a variety of applications that need trusted data (handling). Block chain is not a true replacement of
other DM technologies, but extends the scope of DM into new areas (security, trust creation in data
intensive operations, trusted contracts).</p>
      <p>The analysis above is based on the “common sense” of the authors. Some Internet resources7 have
been used to check time-related details. Consequently, there are no references in this part of our text.
7 http://www.dataversity.net/brief-history-database-management/ and
http://www.quickbase.com/articles/timeline-of-database-history
</p>
    </sec>
    <sec id="sec-6">
      <title>4. THE ROLE OF ENABLING TECHNOLOGIES</title>
      <p>Our hypothesis is that enabling technologies as key resources are the most important explanatory
factor for the wave-based periodical re-appearance of certain emerging technologies. We list three
such technologies in the ICT sector:
• computing power and memory capacity (= VLSI, circuit technologies),
• data storage capacity, and
• data transmission speed.</p>
      <p>There are several others, which we do not handle in this paper.</p>
      <p>
        The book by <xref ref-type="bibr" rid="ref6">[Endres &amp; Rombach, 2003]</xref> lists a variety of ICT-related trends that have affected the
progress of systems and applications. The laws it collects indicate regularities in technological change. They
are based on theory, have evidence in practice, and also indicate a systematic continuum in time.
The book covers more than two hundred laws relevant to ICT, classified in more than 20 categories.
We have picked out the three most important laws that explain a lot of the progress in ICT:
• Moore’s law: the price/performance of processors is halved every 18 months;
• Hoagland’s law: the capacity of magnetic devices increases by a factor of ten every decade; and
• Cooper’s law: wireless bandwidth doubles every 2.5 years.
      </p>
      <p>Additional laws that supplement the above-mentioned are:
• Butters’ Law: the amount of data one can transmit using optical fibers doubles every nine
months;
• Rose’s Law: the number of qubits in a scalable quantum computing architecture should double
every year; and
• Nielsen’s law: a high-end user’s connection speed grows by 50% per year.
The book provides references to the original sources or to the sources reporting the principles of the
laws introduced. The latter three are from separate sources. A simple numerical comparison of the
growth rates implied by these laws is sketched below.</p>
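      <p>To make the laws listed above comparable, the quoted growth rates can be converted into equivalent
annual growth factors. The following short Python sketch is our own illustration, not taken from the cited
sources; the doubling times in parentheses are those quoted above, and pure exponential growth is assumed.</p>
      <preformat>
# Our own illustrative sketch: convert each law's quoted rate into an annual
# growth factor and the resulting factor per decade, assuming pure exponential
# growth. The doubling times are the ones quoted in the text above.

def annual_factor(doubling_time_years):
    """Annual growth factor implied by a fixed doubling time."""
    return 2.0 ** (1.0 / doubling_time_years)

laws = {
    "Moore (processor capacity, doubling in 1.5 years)": annual_factor(1.5),
    "Hoagland (magnetic storage, tenfold per decade)": 10.0 ** 0.1,
    "Cooper (wireless bandwidth, doubling in 2.5 years)": annual_factor(2.5),
    "Butters (optical fiber, doubling in 0.75 years)": annual_factor(0.75),
    "Rose (qubits, doubling in 1 year)": annual_factor(1.0),
    "Nielsen (connection speed, +50% per year)": 1.5,
}

for name, per_year in laws.items():
    per_decade = per_year ** 10
    print(f"{name}: {per_year:.2f}x per year, about {per_decade:,.0f}x per decade")
      </preformat>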
      <p>
        Moore’s law <xref ref-type="bibr" rid="ref6">[Endres &amp; Rombach 2003, pp. 244-247]</xref> is attributed to Gordon Moore, the co-founder
and chair of Intel in the 1960s. The law was first published in an article [Moore, 1965]. Even though
the law deals with the packing density of transistors in VLSI chips, the practical consequences are
seen in the doubling of processor processing capacity (every 18 months) and the doubling of memory
chip capacity (for the same price, every 15 months). Moore’s finding is based on his observations from
the late 1950s, and his “optimistic” prediction was that “There is no reason to believe it will not remain
nearly constant for at least 10 years”. The evidence of today (Figure 2), however, shows that the law
has remained valid until now (50 years), and the current understanding of progress in material
and VLSI technology does not yet foresee any physical limits to its continuation.
      </p>
      <p>Fig 2. Exponential growth of computing power - Moore’s Law [Roser &amp; Ritchie, 2017]</p>
      <p>Rose’s law [Jurvetson, 2017] extends the progress defined by Moore to the era of quantum
computers. In his article, Jurvetson describes that, as in Moore’s Law, the growth of the number of
qubits in a quantum computer is exponential – currently the doubling time of the number of qubits in
a quantum computer is one year. In addition, the computational power of the quantum computer also
grows exponentially with the number of entangled qubits (parallelism). This makes the growth
compounded. We agree that the quantum computer era is not yet a reality outside laboratories. It is
seen as a promising technology of the future, and it fosters belief in the continuation of this progress
even after Moore’s era.</p>
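      <p>A small numerical illustration of our own may clarify what “compounded” growth means here: if the
number of qubits doubles every year and the dimension of the state space grows as two to the power of the
number of qubits, the overall growth is doubly exponential. The starting point of four qubits below is an
arbitrary assumption used only for illustration.</p>
      <preformat>
# Our own illustration of the compounded growth described above: the qubit
# count doubles every year (Rose's law), and the dimension of the associated
# state space grows as 2**n, so the overall growth is doubly exponential.
n_qubits = 4  # arbitrary, purely illustrative starting point
for year in range(5):
    state_space = 2 ** n_qubits  # dimension of the state space for n_qubits
    print(f"year {year}: {n_qubits} qubits, state space 2**{n_qubits} = {state_space}")
    n_qubits *= 2  # Rose's law: the qubit count doubles every year
      </preformat>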
      <p>
        Hoagland’s law <xref ref-type="bibr" rid="ref6">[Endres &amp; Rombach, 2003, pp. 247-249]</xref> predicts the capacity of magnetic devices
to increase by a factor of ten every decade. The law is credited to Albert Hoagland. He was one of the
developers of the IBM magnetic disk. The IBM 305 RAMAC was the first commercial computer that used a
moving-head hard disk drive, the RAMAC 350. At the time of its introduction, in the middle of the
1950s, the RAMAC 350 storage unit could hold the equivalent of 62,500 punched cards, or 5 million
characters. Hoagland predicted a remarkable increase in the areal density of magnetic storage devices,
which he expected to increase by 25-30% per year, i.e., to double every three years. The growth has
been exponential and provides the means for the large-scale data storage of today (Figure 3).
      </p>
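      <p>A quick arithmetic check (our own, not from the cited book) shows that the figures quoted for
Hoagland’s law are mutually consistent: 25-30% annual growth corresponds to doubling roughly every three
years and to roughly a tenfold increase per decade.</p>
      <preformat>
# Our own consistency check of the figures quoted for Hoagland's law:
# 25-30% annual growth vs. doubling every ~3 years vs. tenfold per decade.
import math

for annual_growth in (1.25, 1.30):
    doubling_time = math.log(2) / math.log(annual_growth)  # years to double
    per_decade = annual_growth ** 10                        # factor per decade
    print(f"{annual_growth:.2f}x per year: doubles in {doubling_time:.1f} years, "
          f"about {per_decade:.1f}x per decade")
      </preformat>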
      <p>
        There are three laws above that refer to the growth of data transmission. Cooper’s law <xref ref-type="bibr" rid="ref6">[Endres &amp; Rombach 2003, pp. 249-250]</xref> reports that wireless bandwidth doubles every 2.5 years, which indicates
growing opportunities for wireless data. Martin Cooper is an American pioneer and visionary in the
area of wireless communication and mobile technology. His prediction (from 2000) is based on
empirical analysis without theoretical evidence. Cisco has reported on current progress in two reports
[<xref ref-type="bibr" rid="ref11">Cisco 2017a</xref>; <xref ref-type="bibr" rid="ref12">Cisco 2017b</xref>]. Global mobile data traffic in 2015 reached 3.7 exabytes (1 exabyte = 10^18 bytes) per
month; in 2020 it is expected to be 30.6 exabytes. The traffic has grown 4000-fold over the past 10
years and almost 400-million-fold over the past 15 years. Even though these numbers represent the
growth of the traffic itself, the numbers also indicate the improvement in transmission technologies.
Butters’ Law [Roser &amp; Ritchie, 2017] supplements the vision of progress in the data transmission area.
It states that the amount of data that can be transmitted using optical fibers doubles every nine
months, which indicates that the cost of transmission by optical fiber is halving every nine months.
Nielsen’s law indicates the same progress [Nielsen, 2017]. When combining the progress in the area of
wired and wireless data transmission, we can simplify the progress in the form of Fred’s Law: the
regular-priced data transmission capacity doubles every year. The origin of this law is unknown (to
the authors of this paper), but it was encountered some twenty years ago. However, the law summarizes
real progress in a simple and intelligible way, in spite of the fact that it is not exact – it only indicates the
expected exponential growth.
      </p>
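      <p>As a rough calculation of our own, the Cisco traffic figures quoted above imply much shorter doubling
times than the 2.5 years of Cooper’s law, which is consistent with the remark that they measure traffic
growth rather than the bandwidth of a single link.</p>
      <preformat>
# Our own rough calculation: the doubling times implied by the Cisco traffic
# figures quoted above, compared with Cooper's 2.5-year bandwidth doubling.
import math

observations = {
    "4000-fold growth in 10 years": (4000.0, 10.0),
    "about 400-million-fold growth in 15 years": (400e6, 15.0),
}

for label, (factor, years) in observations.items():
    doubling_time = years * math.log(2) / math.log(factor)
    print(f"{label}: traffic doubles roughly every {doubling_time:.2f} years")

print("Cooper's law (wireless bandwidth): doubles every 2.5 years")
      </preformat>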
      <p>Fig 3. Increasing hard drive capacity, 1980-2011 [Roser and Ritchie, 2017]</p>
    </sec>
    <sec id="sec-7">
      <title>5. CONCLUSION</title>
      <p>The aim of this paper is to open up discussion on the factors behind the periodical reappearance of, and
changes in, important areas of ICT. We have given a general overview of the current situation based
on selected technology analyst reports. We have selected two key technologies that have followed the
pattern of periodical, wave-based changes.</p>
      <p>In Section 4 we handled three enabling technologies that explain – at least partially – the changes
in AI and DM. The answers to our research questions and the solution to our research problem are
illustrated in Figure 4. There are dependencies between the elements discussed:
• Progress in VLSI technology brings fast growth in processing power and memory capacity,
which provide a platform for more complex software and for bigger amounts of data (in primary memory)
to be processed fast;
• The availability of big mass storage capacity allows access to big data repositories for
intelligent data handling; and
• Fast data transmission allows distributed (and also parallel) processing.</p>
      <p>All of these are key resources in new emergent applications and act as evidence for our hypothesis.</p>
      <p>No doubt the progress in VLSI technology is the most important of these. Both massive data
storages and data transmission technology use improved and effective processing elements as their
components. We claim that the essence of applications remains the same over the waves. AI still has
the same principal task to solve as applications had in the 1950s during the first wave. Lack of
resources allowed only simple applications; currently the resources allow more complex applications
based on big data and scalable computing resources. The applications are constructed from the
permanent essence and a technology-enabled variant.</p>
      <p>An additional aspect worth recognizing, which explains the sequence of waves, is the
evolution of concepts. It is clear that the semantics of concepts also changes over time. The AI of
today does not have semantically the same meaning that it had in the 1950s, in the 1980s, etc. The
same goes for the concept of DM. In Figure 4 we have included a box “software quality” with a reference
to “data quality” to indicate that even this concept has different semantics at different times.</p>
      <p>In the 1950s, the period of a shortage of resources, quality software was based on minimal usage of
the main memory (small size) and effective processing – the first wave. In the 1980s the focus was on
the logical structure and maintainability of software – the second wave. The next wave (third) focused
on quality (process) management. It was based on the idea that quality software is a product that is
produced by high-capability and high-maturity processes.</p>
      <p>Software product quality also has its own standard. The current version (2011), the ISO 25000 series<xref ref-type="bibr" rid="ref3">8</xref>,
is the evolution of and replaces the series of ISO 9126<xref ref-type="bibr" rid="ref4">9</xref> standards (from 1991 and 2001). These
indicate the evolution of the concept of software product quality in ten-year cycles. One of the
remarkable changes is that ISO 25000 includes a new area, data quality. This indicates that focus
must also be placed on the quality of the data that is handled by software products; is this the fourth
wave?</p>
      <p>At the beginning of the paper, we set our research problem and three research questions derived
from it:
1. What is the current state of emerging ICT related technologies? Section 2 has given the
answer to this in the form of a short technology review.
2. What kind of evidence can be found to support the hypothesis of the wave-based appearance
of certain technologies? In Section 3, we have handled AI and DM as example technologies.
Several others can be found in addition to these. In Section 5, we have also handled
concept evolution, with software quality as an example.
3. What kind of explaining factors can be found for the wave-based appearance? We have given
an example of three enabling technologies that have an effect on the progress in the
application fields discussed.</p>
      <p>These answers act as a solution to our research problem “What are the reasons for certain
technologies appearing in waves in the role of emerging technologies in the ICT sector?”.</p>
      <p>Why must the waves be recognized? To be prepared to understand the changes, we have to be
eager to recognize both the essence and the technological changes that provide new opportunities in
applications. To understand the future, we have to know not only the history, but also the trends that
lead us to the future.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <given-names>Hannu</given-names>
            <surname>Jaakkola</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Jaak</given-names>
            <surname>Henno</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Jukka</given-names>
            <surname>Mäkelä</surname>
          </string-name>
          and
          <string-name>
            <given-names>Bernhard</given-names>
            <surname>Thalheim</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Today is the Future of Yesterday, What is the Future of Today</article-title>
          ? In Petar Biljanović (Ed.),
          <source>Proceedings of the 40th Jubilee International Convention. May 22-26</source>
          ,
          <year>2017</year>
          , Opatija, Croatia.
          <source>Mipro and IEEE</source>
          ,
          <fpage>741</fpage>
          -
          <lpage>749</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <given-names>C.</given-names>
            <surname>Freeman</surname>
          </string-name>
          and
          <string-name>
            <given-names>C.</given-names>
            <surname>Perez</surname>
          </string-name>
          .
          <year>1988</year>
          .
          <article-title>Structural Crises of Adjustment, Business Cycles and Investment Behavior</article-title>
          . In G. Dosi,
          <string-name>
            <given-names>C.</given-names>
            <surname>Freeman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Nelson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Silverberg</surname>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Soete</surname>
          </string-name>
          (Eds.),
          <source>Technical Change and Economic Theory</source>
          , London, Pinter Publishers.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <article-title>8 The series of standards ISO/IEC 25000 (SQuaRE)</article-title>
          . See details in http://iso25000.com/index.php/en/iso-25000-standards.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          9 ISO/IEC 9126-1:2001: Software engineering --
          <string-name>
            <surname>Product</surname>
          </string-name>
          quality -- Part 1. See details in
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>https://www.iso.org/standard/22749.html and in https://en.wikipedia.org/wiki/ISO/IEC_9126.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <given-names>A.</given-names>
            <surname>Endres</surname>
          </string-name>
          . and
          <string-name>
            <given-names>D.</given-names>
            <surname>Rombach</surname>
          </string-name>
          .
          <year>2003</year>
          .
          <article-title>A Handbook of Software and Systems Engineering - Empirical Observations, Laws and Theories</article-title>
          . Pearson and
          <string-name>
            <given-names>Addison</given-names>
            <surname>Wesley</surname>
          </string-name>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <given-names>Gordon E.</given-names>
            <surname>Moore</surname>
          </string-name>
          .
          <year>1965</year>
          .
          <article-title>Cramming more components onto integrated circuits</article-title>
          .
          <source>Electronics</source>
          <volume>38</volume>
          ,
          <issue>8</issue>
          (April
          <year>1965</year>
          ).
          <source>Retrieved August 21st</source>
          ,
          <year>2017</year>
          from http://download.intel.com/museum/Moores_Law/Articles-Press _Releases/Gordon_Moore_1965_Article.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <given-names>Max</given-names>
            <surname>Roser</surname>
          </string-name>
          and
          <string-name>
            <given-names>Hannah</given-names>
            <surname>Ritchie</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <string-name>
            <given-names>Technological</given-names>
            <surname>Progress</surname>
          </string-name>
          .
          <source>Retrieved August 21st</source>
          ,
          <year>2017</year>
          from https://ourworldindata.org/ technological-progress/. OurWorldInData.org.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <given-names>Steve</given-names>
            <surname>Jurvetson</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Rose's Law for Quantum Computers</article-title>
          .
          <source>Retrieved August 21st</source>
          ,
          <year>2017</year>
          from https://www.flickr.com/ photos/jurvetson/8054771535.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <given-names>Jacob</given-names>
            <surname>Nielsen</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Nielsen's Law of Internet Bandwidth</article-title>
          .
          <source>Retrieved August 28</source>
          ,
          <year>2017</year>
          from https://www.nngroup.com/articles/ law-of-bandwidth/.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <given-names>Cisco. 2017a. Cisco</given-names>
            <surname>VNI Mobile Forecast</surname>
          </string-name>
          (
          <year>2015</year>
          -
          <fpage>2020</fpage>
          ).
          <source>Retrieved August 21st</source>
          ,
          <year>2017</year>
          from http://www.cisco.com/c/en/us/solutions/ collateral/service-provider/
          <article-title>visual-networking-index-vni/mobile-white-paper-c11-520862</article-title>
          .html.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Cisco. 2017b.</surname>
          </string-name>
          <article-title>The Zettabyte Era - Trends and Analysis</article-title>
          .
          <source>Retrieved August 21st</source>
          ,
          <year>2017</year>
          from http://www.cisco.com/c/en/us/ solutions/collateral/service-provider/
          <article-title>visual-networking-index-vni/vni-hyperconnectivity-wp</article-title>
          .html.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>