The Data Dilemma: Google Analytics’ Untapped Potential and Web Data Literacy

Tom Alby1

1 Humboldt-Universität zu Berlin, Unter den Linden 6, 10099 Berlin, Germany


Abstract

Since its introduction in 2005, Google Analytics has become the most popular web analytics system due to its free availability, easy installation, and the power of the Google brand. However, despite the abundance of freely available training resources and the obvious need to understand and optimize the user experience, the majority of Google Analytics users do not leverage this tool in a useful way. Given that data literacy has become a key competence in an increasingly data-driven world, the question arises why users fail to derive meaningful insights from their data when all necessary resources for effective use are readily available.

According to previous research, Google Analytics users assume there is value in the data but fail to find it; why this is the case has been unclear until now. This work is based on expert interviews with Google Analytics consultants and trainers in Germany and Austria who have consulted hundreds of users. Using a widely referenced data literacy framework, these experts assessed their clients’ data competence and elaborated on the reasons for any shortcomings. While none of the consultants and trainers had used a formal data literacy framework before, all of them had very similar approaches to examining their clients’ data analysis capabilities and helping them derive value from data. Not only do users have the inflated expectation that data literacy is built into Google Analytics, they also have a hard time even asking questions that can be answered with data. The findings of our research contribute to the augmentation of existing data literacy frameworks, in particular for the workforce.

Keywords

web analytics, google analytics, digital analytics, data literacy, user experience management




                                1. Introduction
The Digital Analytics Association defines web analytics as “the collection, measurement, analysis,
                                and reporting of internet data” [1]. Understanding and optimizing user experience through web
                                analytics is crucial across a variety of website types, including eCommerce, media, government,
                                corporate, social communities, educational, and personal websites [2]. Insights from web
                                analytics help businesses understand the effectiveness of their marketing campaigns and the
                                website itself, improving the return on investment and the contribution to business overall
                                [3, 4, 5]. For non-eCommerce websites, access to web analytics data allows authors to understand
what content is preferred, where users drop off, or the demographics of their users to fine-tune
the persona they are writing for [6, 7, 8]. Not doing web analytics right can result in a lack of

                                Proceedings of the LWDA 2023 Workshops: BIA, DB, IR, KDML and WM. Marburg, Germany, 09.-11. October 2023.
* Corresponding author.
Email: thomas.alby@hu-berlin.de (T. Alby)
ORCID: 0000-0002-6696-5185 (T. Alby)
© 2023 by the paper’s authors. Copying permitted only for private and academic purposes.
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073, http://ceur-ws.org




understanding of customer needs, leading to a poor user experience [9], which can decrease
customer satisfaction and loyalty [10]. Businesses that do not effectively use web analytics
may allocate resources to areas that do not generate sufficient returns and fall behind
competitors who leverage data-driven insights to optimize their strategies and operations,
ultimately leading to a loss of market share [11]. Given this, companies and individuals with a
website should be highly motivated to leverage their web analytics tool, understand the data
they collect, and generate actionable insights.
   The web analytics market is dominated by Google Analytics with a market share of up to
55.2%, depending on the measurement approach [12, 13]. Previous work has shown, however,
that not all Google Analytics users find it easy to derive insights from the system [14]. In fact,
the vast majority of Google Analytics implementations reveal a lack of competence on the part
of their users [15]. While non-transactional site owners had the impression that Google Analytics is not made
for them due to Google’s focus on marketing [16], the reasons behind this perception for owners
of other types of sites have remained unclear until now. Google has been adding more and more
content to its help database; an abundance of free tutorials, YouTube videos, and podcasts
exists; and for those who can afford it, there are books and professional training courses.
   To our knowledge, our contribution is the first study of the reasons why web analytics users
find it so difficult to gain insights from the data, based on interviews with six web analytics
consultants and trainers in Austria and Germany who have worked with hundreds of clients.
As this group helps users to leverage their data, they also have insights into the data literacy
levels of their clients and the problems that they encounter.
   This paper is organized as follows: In Section 2, we provide an overview of web analytics and
data literacy. Section 3 describes the setup of our interviews and analysis process, and Section 4
details the results. We close with a discussion and thoughts about future research in Section 5.


2. Background and Related Work
2.1. Web Analytics
Before Google acquired Urchin in 2005 and offered it as Google Analytics for free a few months
later [17, 18], web analytics in its first era required either a substantial monetary investment,
technical expertise or both in order to get meaningful data [15]. With Google Analytics, every
website owner had access to advanced reports for free, just by including a little piece of code in
the HTML. This second era, the democratization of web analytics, was not without its critics,
as real insights require more than just placing code on a website [19].
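For illustration, that “little piece of code” in a present-day Google Analytics 4 installation is a short JavaScript embed along the following lines; G-XXXXXXX is a placeholder for a property’s measurement ID, not a value taken from this paper:

```html
<!-- Illustrative Google Analytics 4 tag; G-XXXXXXX stands in for
     the site's measurement ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>
```

Pasting these lines into a page’s HTML is all a standard installation requires, which is precisely why customization beyond the defaults is so easily skipped.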
   Google Analytics has been enriched with new features and companion tools such as the
Google Tag Manager since its introduction in 2005 [20, 21, 22, 23, 24]. This, however, has not
made it easier for its users to work with their data. Sleeper et al. conducted 18 interviews
with owners of non-transactional websites in 2014 and reported that some participants felt
overwhelmed by the data provided by Google Analytics, recommending that owners of such
sites should be taken into account in the design of analytics tools [16]. Google Analytics
users in the small and medium enterprises (SME) segment found it difficult to understand their
data and rather played with it, according to a study conducted by Petersen and Martin with
six participants in 2015 [14]. Zumstein et al. also identified problems in the adoption of web
analytics in the SME segment in 2022 [25]. In fact, most of the Google Analytics installations in
2022 did not go beyond a standard installation and were unlikely to provide actionable insights
to website owners [15]. The third era of web analytics was driven by implementation of the
General Data Protection Regulation (GDPR) in 2018, forcing website owners to modify their
site in order to request European users’ consent for tracking by Google Analytics and other
tools. However, the vast majority of cookie banners do not work as expected [26], thus exposing
the same technical shortcomings that website owners demonstrate in their Google Analytics
implementations. Web analytics users have thus split into a two-class society in which a
minority has access to experts who can exploit tools such as Google Analytics, whereas the
majority of site owners are unable to leverage the benefits of this free tool [15].

2.2. Data Literacy
Data Literacy has become an essential skill, especially for the business world, resulting in
an increase in data literacy programs in education curricula since the 2000s [27]. Several
frameworks for the education sector as well as meta-studies exist; we will focus on the latter
and augment these with additional research.
   In a well-referenced study, Ridsdale et al. conducted an analysis of existing strategies for
teaching data literacy. They define data literacy as “the ability to collect, manage, evaluate, and
apply data, in a critical manner” [28]. Their work includes five knowledge areas, that is, the
conceptual framework of data, data collection, data management, data evaluation, and data
application (see Figure 1). These knowledge areas are further differentiated into 23 competences;
for example, “data collection” includes “data discovery and collection” and “evaluating and
ensuring quality of data and sources”. Each competence in turn includes tasks; for “data
discovery and collection”, three tasks are included: “performs data exploration”, “identifies
useful data”, and “collects data”. The competences are categorized as conceptual, core, and
advanced.
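The three-level hierarchy described above can be sketched as a nested structure; the following is a partial illustration restricted to the “data collection” slice named in the text, not a reproduction of the full 23-competence framework:

```python
# Partial sketch of Ridsdale et al.'s hierarchy: knowledge areas
# contain competences, which in turn contain concrete tasks.
# Only the "data collection" slice described in the text is filled in.
ridsdale_framework = {
    "data collection": {
        "data discovery and collection": [
            "performs data exploration",
            "identifies useful data",
            "collects data",
        ],
        "evaluating and ensuring quality of data and sources": [],
    },
}

# Navigating from a knowledge area down to its tasks:
tasks = ridsdale_framework["data collection"]["data discovery and collection"]
print(tasks)
```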
   Schüller and Busch provide another analysis of existing approaches [29], including the paper
by Ridsdale et al., but also a non-peer-reviewed framework that was created by Jarvis [30].
Schüller et al. defined a data literacy framework with competences in a process model of data
value creation [31]; this framework consists of six competence fields that are derived from
process steps: establishing a data culture, providing data, evaluating data, interpreting results,
interpreting data, and deriving action.
   Nath and Kirby present an empirical examination of the factors of data literacy and propose
a measurement scale, the Global Data Literacy Benchmark survey [32]. Using this scale, they
identified three factors of data literacy: data analysis skills including decision-making based on
the data, the acquisition and organization of data from different sources, and the identification of
data issues. They suggest that educational programs should focus on improving students’ data
preparation skills to enhance their overall data literacy. Carlson and Stowell Bracke emphasized
the importance of finding a context close to students in order to teach data literacy effectively
[33]. As all of these frameworks focus on the education sector, the question of how the existing
workforce can be trained has hardly been covered so far [28]. There are also concepts such as
statistical literacy that can be regarded as a subset of data literacy [34, 35], and the concept of
information literacy of which data literacy can be seen as a subset [36].
Figure 1: Data Literacy Knowledge areas, Competences, and Knowledge/Tasks according to Ridsdale et
al. [28]. Only tasks pertinent to the focus of this paper are presented herein.


   Data literacy frameworks should not be confused with analytics maturity models that are
used to assess the competence of organizations and the gaps to fill to get to the level that these
organizations aspire to have. Gartner’s Analytics Maturity approach, for example, is structured
in stages, from initial, ad-hoc use of analytics to sophisticated, predictive, and automated
decision-making using advanced data analysis techniques [37]; at the same time, Gartner came
to the conclusion that companies are slow in their journey to advance in data and analytics.
Lismont et al. came to a similar conclusion using the DELTA model (Data, Enterprise, Leadership,
Techniques, and Applications) [38]. Finally, apart from data literacy frameworks and analytics
maturity models, there are also processes for data analysis, for instance, the CRISP-DM (CRoss-
Industry Standard Process for Data Mining) [39]. The use of appropriate processes could be
considered a component of data literacy, yet it is not addressed in any of the aforementioned
data literacy frameworks. We will be using Ridsdale et al.’s work as a reference point [28],
allowing us to highlight competences that may be underrepresented in their work but covered
in other frameworks.
Data literacy specifically for web analytics has not been the subject of research, although a
few industry efforts exist. Hamel created a non-peer-reviewed Web Analytics Maturity Model
(WAMM) in 2009 [40]. Hausmann et al. created a framework for web analytics in 2012 that starts
with clear understanding of business requirements, continues with planning, development of the
data collection capabilities, and closes with actionable results and an evaluation of actions [41].
The German Bitkom association for IT and digital media published its own maturity model, the
Digital Analytics & Optimization Maturity Index, in 2017 [42]; however, while the corresponding
website still exists, the offered tool no longer works.


3. Methodology
3.1. Approach to Data Collection and Analysis
Instead of surveying Google Analytics users in order to understand why they struggle to make
beneficial use of the available materials and the tool, six independent web analytics experts
were interviewed. These experts have assisted hundreds of clients in using Google Analytics
profitably and are able to provide a perspective on the challenges that users face. At the same
time, this approach offers an insight into the data literacy competences that these experts regard
as essential to successfully work with data in the business world.
   This study makes use of an inductive thematic analysis to collect the opinions and experiences
of the experts [43]. All these interviews were recorded, transcribed, reviewed for notable features
that were then coded, refined, reviewed again for coherence, and then finally defined and named.

3.2. Participants
Six interview partners were selected based on several criteria: the experts had to own their
businesses, have authored a book or blog about web analytics, have at least 10 years of
experience in web analytics, and have contributed to industry conferences such as the Search
Marketing Expo.1 In addition, all participants were asked about other possible candidates to
interview, and every expert had to be recommended by at least two other experts in order to be
included.

3.3. Interview Process
The interviews took between 30 and 60 minutes and were conducted online from September 2021
until January 2023. While the interview structure was the same for all participants, there was
also room for an open part where thoughts could be explored more thoroughly (semi-structured
interviews). The interview questions (see section A) consisted of three groups:

       • A description of their own company, the clients they are consulting, and the problems
         that they are solving
       • An estimate of the data literacy of their clients
       • Potential opportunities to improve data literacy

  As it was unclear if participants were aware of data literacy frameworks, the frameworks
provided by Ridsdale et al. [28] and the levels described by Jarvis [30] were kept ready as
diagrams and tables to support the discussion. To avoid introducing bias, all experts were
initially asked about their approaches to data literacy prior to the introduction of any existing
framework.

1
    https://smxmuenchen.de/en/

3.4. Limitations
It is worth noting that a selection bias exists in this study in that the interviewees only
see clients who seek help, and almost all of their clients are businesses. As a consequence, the
results will not cover owners of private blogs, institutional websites, etc.; however, given
the large number of suboptimal Google Analytics installations, it is plausible that these
groups face the same problems as the consultants’ clients but may not have the resources to
afford professional help.


4. Results
4.1. Thematic analysis of interview transcripts
The inductive thematic analysis produced seven themes and insights:

Choosing a Tool: In most cases, a decision for Google Analytics was made without adequate
    consideration of the specific requirements.
Data Collection: Users are concerned about the validity of their data, even though they
     expected Google Analytics to automatically ensure its accuracy.
Limited Data Literacy: Most experts see low to moderate data competency among their
     clients.
Understanding Business Requirements: The lack of business-related goals, or the inability to
    distill them into web analytics terms, is a big challenge for web analytics users.
Help-seeking mainly initiated by external factors: GDPR compliance and the phasing out
     of older Google Analytics versions are the primary drivers for clients seeking help, apart
     from issues related to data quality.
Critical Thinking as a competence: Critical thinking and formulating the right questions
      emerged as paramount skills within the domain of web analytics.
Artificial Intelligence as an Enabler: None of the experts believed that AI will be able to
      fill the current gap in data literacy.

  Details for each theme are provided in the next subsections.

4.2. Choosing a Tool
Experts agreed that in all cases, the tool had already been chosen, no matter what the business
problem to solve was. Google Analytics was often chosen due to its popularity and its low or
nonexistent cost, without a full understanding of the tool’s capabilities, how the data it provides
can be used, and how to use the tool effectively.
   All consultants agreed that most of their clients do not question Google Analytics itself but, at
the same time, have trouble working with the data presented by the tool. The trust that Google
as a leading technology provider already knows what data is important plays an important
role in the selection of a web analytics tool. Moreover, according to the experts, some clients
think that they are using a data-driven strategy already just because they have installed Google
Analytics, even if they are not actively analyzing or leveraging the data.

4.3. Data Collection
Understanding how data is collected by a tool was highlighted as one of the key competences
by all experts. They emphasized that if data is not collected correctly, every following step will
produce incorrect results. In addition, all experts said that they were either approached because
a client was concerned about their data quality or data quality issues were found during the
consultancy project. While Google Analytics is easy to install at first sight and there is even a
debugging mode for the Google Tag Manager, debugging requires substantial expertise. Again,
the tool itself was not challenged in terms of data collection, and users initially assumed that
Google was acquiring the right data correctly, and without the need to customize anything.

4.4. Limited Data Literacy
All participants recognized the importance of data literacy and could enumerate elements of
data literacy frameworks, yet they had not formalized an approach to it or consulted an existing
framework prior. Rather, they were applying a mental model of data literacy and emphasized
those competences that were relevant for creating business value out of the data. When exposed
to the frameworks [28, 30], these were regarded as helpful but not to a degree that they would
be put to use in the future. Two of the experts mentioned the DAOMI model [42] when asked
for a data literacy model; however, even this model was not used, and the one expert who
endorsed it was not aware that the tool no longer worked at the time the interview was
conducted. Another expert mentioned Hamel’s WAMM [40] but was not using it either.
   With the exception of the CEO of the largest web analytics consultancy, all experts saw
most of their clients in the lower areas of data literacy, with a few exceptions. In addition,
only the largest consultancy saw an improvement in data literacy competences over the last
years. The other experts mainly worked with medium-sized companies but saw no difference
in data literacy and maturity with respect to the size of a company. Rather, some industries
such as eCommerce tend to be more advanced than traditional companies. Bigger
companies may be able to afford web analytics teams but are also more likely to be slow
due to internal processes, whereas smaller companies cannot afford web analytics specialists
and ask subject-matter experts of marketing domains to analyse the data of their own channels.
The difference between the largest consultancy and the others may also be a result of the client
base they cater to. The largest consultancy is a reseller of the premium version of Google
Analytics, potentially attracting a different caliber of clients, further highlighting the existence
of a two-class web analytics society [15].
According to the five experts working with medium-sized companies, none of these companies
possess the competences outlined in Ridsdale et al.’s framework that would allow them to
effectively utilize Google Analytics. Clients do not
overestimate their capabilities [44]; they are aware that something is missing but they attribute
the problem to a different cause than their missing data literacy as they overestimate the tool’s
capabilities without being able to clearly articulate what is missing. One participant emphasized
that some clients were surprised that Google Analytics is just a tool and that data literacy is not
part of the tool itself.
    While every expert agreed that free training and documentation exist, these resources do not
provide enough practical guidance or industry-specific examples relevant to a client’s business
context. Additionally, one participant assumed that analysis and mathematics, in general, may
intimidate a portion of the population.

4.5. Understanding Business Requirements
All experts underscored the vital competence of understanding business requirements and
translating them into web analytics key performance indicators (KPIs). At the same time, all
experts agreed that most of their clients do not really understand what the problem to solve
is, beginning with what the goals are that need to be achieved with the website, connected
to overall business goals. More than once, the experts mentioned that asking for business
requirements overwhelms clients. This is less of a problem if a manager is part of a consulting
mandate or a training course which is, however, rarely the case. Either business goals are not
communicated strongly enough or, if they are, the role of a website to contribute towards these
goals is unclear. For instance, there is a desire to improve user experience, but in a standard
installation, Google Analytics merely offers KPIs such as Bounce Rate and Time on Page. These
metrics do not provide any information on how users interact with the content of a page,
resulting in no clear measurement of user experience, and therefore, its improvement becomes
impossible.
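To make the limitation concrete, both KPIs can be computed from page-view counts and timestamps alone, with no information about how users interact with the content. The following sketch uses invented session records for illustration:

```python
# Illustrative only: computes the two standard KPIs mentioned above
# from invented session records. Note that neither metric says
# anything about how users interacted with the page content.
sessions = [
    {"pages_viewed": 1, "seconds_on_page": 12},   # bounce: left after one page
    {"pages_viewed": 3, "seconds_on_page": 95},
    {"pages_viewed": 1, "seconds_on_page": 4},    # bounce
    {"pages_viewed": 2, "seconds_on_page": 60},
]

# Bounce rate: share of sessions that viewed only a single page.
bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)

# Average time on page across sessions.
avg_time_on_page = sum(s["seconds_on_page"] for s in sessions) / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")
print(f"Avg. time on page: {avg_time_on_page:.1f}s")
```

Whether a user scrolled, clicked, or actually read anything is invisible in such data, which is why a standard installation offers no clear measurement of user experience.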
   A decision to use Google Analytics is often made before business requirements are defined.
Consequently, data from a standard installation starts populating pre-configured reports, which,
due to their generic nature, may not meet specific business requirements. A gap thus emerges,
as these pre-configured reports struggle to align with needs that have not been adequately
defined in the form of KPIs. As a result, clients often ask consultants to explain the features of
Google Analytics itself, instead of seeking guidance on making better data-driven decisions.
This is due to their assumption that there is valuable information hidden within the tool which
they simply have not discovered yet. Most of the consultants conduct workshops with their
clients to define goals and KPIs before they start to refine the Google Analytics implementation.
Indeed, all interview partners unknowingly used an approach similar or identical to CRISP-DM,
which starts with business understanding.

4.6. Help-seeking mainly initiated by external factors
The GDPR and the resulting need to install cookie consent banners is one of the main triggers
for clients to seek help. One of the experts assumed that 70% of consent banners are not working
correctly (which is a correct assumption [26, 15]). Another main trigger has been the planned
shutdown of older Google Analytics versions and Google’s focus on Google Analytics 4. As data
is collected differently in this new version, clients want to revamp their installations.

Figure 2: Screenshot of Google Analytics 4 with prompts for automated insights.

Finally,
the assumption that something is wrong with the data also triggers clients to seek help, notable
as one of the few triggers that does not come from external sources.
Increasing data literacy is not a driver for clients; web analytics training is provided to
people new on the job to make sure that they can support the business according to their
responsibilities and tasks, but not as part of a data literacy curriculum.

4.7. Critical Thinking as a Competence
All consultants expressed concerns about the lack of critical thinking demonstrated by Google
Analytics users. Their concerns encompassed the users’ approach to tool selection, data col-
lection, and the application of this data to support their businesses. Just by virtue of the tool
being offered by Google, users seem not to question the usefulness and quality of the data
provided. Clients rarely start with a hypothesis about how to make sense of a number. Also,
clients tend to focus on a solution instead of understanding the problem first. Asking questions
was regarded as a key competence as well, although all experts were concerned that their
clients do not know what questions to ask due to the missing business goals.
4.8. Artificial Intelligence as an Enabler
All experts agreed that it would be difficult if not impossible for a machine learning-driven or
AI-powered tool to understand the business requirements as they are different from company
to company, and thus an AI could not fill the current gap. While one participant believed that
80% of all questions in a company are comparable to those of other companies, all agreed that
the AI-driven suggestions currently made by Google Analytics (see Figure 2) do not meet
expectations. Worse, as noted before, users may not be able to phrase the right questions,
so Natural Language Processing-based interactions will not help; and if data collection has
not been verified as a first step, the foundation of training as well as the interpretation may
already be wrong.


5. Discussion and Conclusion
The thematic analysis of the expert interviews reflected several competencies from Ridsdale
et al.’s data literacy framework, despite the experts not having prior exposure to any such
framework. Competencies such as data collection, data quality, decision-making based on data,
and critical thinking were the most frequently mentioned.
   External pressures such as the introduction of the GDPR and the sunsetting of older Google
Analytics versions were primary reasons for hiring an expert. Interestingly, while experts
generally perceived their clients as having low levels of data literacy, the lack of this competency
was not a key motivation for these clients to seek expert help, as Google Analytics was expected
to have the solution built-in, no matter what the problem was. The interviewed experts were
primarily tasked with showing where the desired information is “hidden” in the tool.
   Instead, the experts focused on a key competence not explicitly emphasized in data literacy
frameworks and often lacking in their clients before choosing a tool: the ability to translate
(business) goals into data use cases. The cognitive translation required to turn goals into data
application may be encompassed in Ridsdale et al.’s “conceptual framework” knowledge area
[28], but it is not explicitly elaborated on. Similarly, Schüller et al. [31] focus their process
on value creation by leveraging data, in a manner similar to the CRISP-DM approach, but it
may not entirely bridge the gap identified in the expert interviews. This shortcoming in clearly
defining goals, or the inability to link them to web analytics software, is the primary reason
for the underutilization of Google Analytics.
This also explains why training content is mostly not relevant to users and their specific
needs: they are missing a step that comes beforehand. The configurability of Google Analytics
for different business needs thus seems to be its curse, as users lack an understanding of why
they should configure it. While Google Analytics already includes insights based on Machine Learning and
natural language questions, these were not considered to be useful by the consultants.
   Future research could investigate how data analytics tools can foster the definition and
translation of goals into analyses. This could include formulating objectives and KPIs, configuring
and reviewing implementations, and guiding user inquiry, for instance with respect to user
experience beyond bounce rates and time spent on site. At the same time, connecting goals to
analytics could enrich current data literacy frameworks, especially for workplace applications.
References
 [1] Digital Analytics Association, 2022. URL: https://www.digitalanalyticsassociation.org/.
 [2] F. Palomino, F. Paz, A. Moquillaza, Web Analytics for User Experience: A Systematic
     Literature Review, in: M. M. Soares, E. Rosenzweig, A. Marcus (Eds.), Design, User
     Experience, and Usability: UX Research and Design, volume 12779, Springer International
     Publishing, Cham, 2021, pp. 312–326. doi:10.1007/978-3-030-78221-4_21.
 [3] A. Phippen, L. Sheppard, S. Furnell, A practical evaluation of Web analytics, Internet
     Research (2004). doi:10.1108/10662240410555306.
 [4] D. Chaffey, M. Patron, From web analytics to digital marketing optimization: Increasing
     the commercial value of digital analytics, Journal of Direct, Data and Digital Marketing
     Practice 14 (2012) 30–45. doi:10.1057/dddmp.2012.20.
 [5] C. Coursaris, W. Van Osch, C. López-Nicolás, F.-J. Molina-Castillo, N. Rapp, Driving
     website performance using web analytics: A case study, in: 19th Americas Conference
     on Information Systems, AMCIS 2013-Hyperconnected World: Anything, Anywhere,
     Anytime, Chicago, Illinois, 2013.
 [6] W. Fang, Using google analytics for improving library website content and design: A case
     study, Library Philosophy and Practice 9 (2007). doi:10.7282/T3MK6B6N.
 [7] A. Wiggins, Information architecture: Data-driven design: Using web analytics to validate
     heuristics system, Bulletin of the American Society for Information Science and Technology
     33 (2008) 20–24. doi:10.1002/bult.2007.1720330508.
 [8] B. M. Stoesz, Using Google Analytics to Measure Engagement with a Teaching and
     Learning Centre During COVID-19, Journal on Centers for Teaching and Learning 14
     (2022) 39–57.
 [9] C. Brauer, D. Reischer, F. Moedritscher, What web analysts can do for human-computer
     interaction?, in: F. F.-H. Nah (Ed.), HCIB 2014, Lecture Notes in Computer Science,
     volume 8527, Springer International Publishing, Cham, 2014, pp. 471–481. doi:10.1007/
     978-3-319-07293-7_46.
[10] C. Flavián, M. Guinalíu, R. Gurrea, The influence of familiarity and usability on
     loyalty to online journalistic services: The role of user experience, Journal of
     Retailing and Consumer Services 13 (2006) 363–375. doi:10.1016/
     j.jretconser.2005.11.003.
[11] C. L. C. de Oliveira, F. J. B. Laurindo, A framework of web analytics: Deploying the
     emergent knowledge of customers to leverage competitive advantage, in: Proceedings of
     the international conference on e-Business, 2011, pp. 1–6.
[12] A. Lerner, A. K. Simpson, T. Kohno, F. Roesner, Internet jones and the raiders of the lost
     trackers: An archaeological study of web tracking from 1996 to 2016, in: Proceedings of
     the 25th USENIX conference on security symposium, SEC’16, USENIX Association, USA,
     2016, pp. 997–1013.
[13] Historical trends in the usage statistics of traffic analysis tools for websites, 2013. URL:
     https://w3techs.com/technologies/history_overview/traffic_analysis/all.
[14] E. J. Petersen, B. M. Martin, Misuse, play, and disuse: Technical and professional commu-
     nication’s role in understanding and supporting website owners’ engagement with Google
     Analytics, in: 2015 IEEE International Professional Communication Conference (IPCC),
     IEEE, Limerick, Ireland, 2015, pp. 1–5. doi:10.1109/IPCC.2015.7235786.
[15] T. Alby, Popular, but hardly used: Has google analytics been to the detriment of
     web analytics?, in: Proceedings of the 15th ACM web science conference 2023, Web-
     Sci ’23, Association for Computing Machinery, New York, NY, USA, 2023, pp. 304–311.
     doi:10.1145/3578503.3583601.
[16] M. Sleeper, S. Consolvo, J. Staddon, Exploring the benefits and uses of web ana-
     lytics tools for non-transactional websites, in: Proceedings of the 2014 conference
     on Designing interactive systems, ACM, Vancouver BC Canada, 2014, pp. 681–684.
     doi:10.1145/2598510.2598555.
[17] K. Regan, Google Buys Web Analytics Firm Urchin Software, 2005. URL:
     https://www.ecommercetimes.com/story/google-buys-web-analytics-firm-urchin-
     software-41857.html.
[18] B. Crosby, The circle of analytics, 2005. URL: https://googleblog.blogspot.com/2005/11/
     circle-of-analytics.html.
[19] D. Clark, D. Nicholas, H. R. Jamali, Evaluating information seeking and use in the changing
     virtual world: the emerging role of Google Analytics, Learned Publishing 27 (2014) 185–194.
     doi:10.1087/20140304.
[20] L. Holmes, Digital marketing made (much) easier: Introducing Google Tag Manager,
     2012. URL: https://analytics.googleblog.com/2012/10/google-tag-manager.html.
[21] M. Mishra, Re-imagining Google Analytics to support the versatile usage patterns of today’s
     users, 2012. URL: https://analytics.googleblog.com/2012/10/universal-analytics.html.
[22] J. Wang, Expanding Universal Analytics into Public Beta, 2013. URL: https://
     analytics.googleblog.com/2013/03/expanding-universal-analytics-into.html.
[23] R. Ketchum, A new way to unify app and website measurement in Google Analytics, 2019.
     URL: https://blog.google/products/marketingplatform/analytics/new-way-unify-app-and-
     website-measurement-google-analytics/.
[24] R. Ketchum, Prepare for the future with Google Analytics 4, 2022. URL: https://blog.google/
     products/marketingplatform/analytics/prepare-for-future-with-google-analytics-4/.
[25] D. Zumstein, C. Brauer, A. Zelic, Benefits, challenges and future developments in digital
     analytics in German-speaking countries: An empirical analysis, Applied Marketing
     Analytics 7 (2022) 246–259.
[26] I. Sanchez-Rola, M. Dell’Amico, P. Kotzias, D. Balzarotti, L. Bilge, P.-A. Vervier, I. San-
     tos, Can I opt out yet? GDPR and the global illusion of cookie control, in: Proceed-
     ings of the 2019 ACM asia conference on computer and communications security, Asia
     CCS ’19, Association for Computing Machinery, New York, NY, USA, 2019, pp. 340–351.
     doi:10.1145/3321705.3329806.
[27] M. Leon-Urrutia, D. Taibi, V. Pospelova, S. Splendore, L. Urbsiene, U. Marjanovic, Data
     literacy: An essential skill for the industry, in: B. Lalic, D. Gracanin, N. Tasic, N. Simeunović
     (Eds.), Proceedings on 18th international conference on industrial systems – IS’20, Springer
     International Publishing, Cham, 2022, pp. 326–331.
[28] C. Ridsdale, J. Rothwell, M. Smit, M. Bliemel, D. Irvine, D. Kelley, S. Matwin, B. Wuetherick,
     H. Ali-Hassan, Strategies and best practices for data literacy education knowledge synthesis
     report, 2015. doi:10.13140/RG.2.1.1922.5044.
[29] K. Schüller, P. Busch, Data Literacy: Ein Systematic Review zu Begriffsdefinition, Kompe-
     tenzrahmen und Testinstrumenten, Arbeitspapier 46, Hochschulforum Digitalisierung,
     Berlin, 2019. doi:10.5281/zenodo.3349865.
[30] J. Jarvis, If I Ran a Newspaper…, 2017. URL: https://medium.com/whither-news/if-i-
     ran-a-newspaper-220a065d2232.
[31] K. Schüller, P. Busch, C. Hindinger, Ein Framework für Data Literacy, Arbeitspapier 47,
     Hochschulforum Digitalisierung, Berlin, 2019. doi:10.1007/s11943-019-00261-9.
[32] R. Nath, J. Kirby, An Empirical Examination of the Factors of Data Literacy, Journal of
     Digital Science 4 (2022) 3–20. doi:10.33847/2686-8296.4.1_1.
[33] J. Carlson, M. Stowell Bracke, Planting the Seeds for Data Literacy: Lessons Learned from
     a Student-Centered Education Program, International Journal of Digital Curation 10 (2015)
     95–110. doi:10.2218/ijdc.v10i1.348.
[34] J. Watson, R. Callingham, Statistical Literacy: A Complex Hierarchical Construct, Statistics
     Education Research Journal (2002).
[35] R. Callingham, J. Watson, Statistical Literacy: From Idiosyncratic to Critical Thinking,
     in: Curricular Development in Statistics Education IASE Roundtable Conference, Interna-
     tional Association for Statistical Education, 2004, pp. 116–162. doi:10.52041/SRAP.04301.
[36] M. Shields, Information literacy, statistical literacy, data literacy, IASSIST quarterly 28
     (2005) 6–6.
[37] Gartner Survey Shows Organizations Are Slow to Advance in Data and Analytics, 2018.
     URL: https://www.gartner.com/en/newsroom/press-releases/2018-02-05-gartner-survey-
     shows-organizations-are-slow-to-advance-in-data-and-analytics.
[38] J. Lismont, J. Vanthienen, B. Baesens, W. Lemahieu, Defining analytics maturity indicators:
     A survey approach, International Journal of Information Management 37 (2017) 114–124.
     doi:10.1016/j.ijinfomgt.2016.12.003.
[39] C. Shearer, The CRISP-DM model: the new blueprint for data mining, Journal of data
     warehousing 5 (2000) 13–22.
[40] S. Hamel, A strategic approach based on business maturity and critical suc-
     cess factors,        2009. URL: https://www.cardinalpath.com/wp-content/uploads/
     WAMM_ShortPaper_091017.pdf.
[41] V. Hausmann, S. P. Williams, P. Schubert, Developing a framework for web analytics, in:
     25th Bled eConference - eDependability: Reliable and Trustworthy eStructures, eProcesses,
     eOperations and eServices for the Future, Proceedings, 2012.
[42] Bitkom, Reifegradmodell zum Digital Analytics & Optimization Maturity In-
     dex (DAOMI). Leitfaden zur Anwendung und Interpretation., 2018. URL:
     https://www.bitkom.org/sites/default/files/file/import/20181018-Reifegradmodell-
     zum-Digital-Analytics-Optmization-Maturity-In.pdf.
[43] V. Braun, V. Clarke, Thematic analysis., in: APA handbook of research methods in psychol-
     ogy, Vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological.,
     APA handbooks in psychology®., American Psychological Association, Washington, DC,
     US, 2012, pp. 57–71. doi:10.1037/13620-004.
[44] J. Kruger, D. Dunning, Unskilled and unaware of it: How difficulties in recognizing one’s
     own incompetence lead to inflated self-assessments, Journal of Personality and Social
     Psychology 77 (1999) 1121–1134. doi:10.1037/0022-3514.77.6.1121.
A. Interview Questions
The interviews were conducted in German; the questions have been translated into English for this article.

   1. Please describe your company and your work.
   2. How long have you been working in the field of web analytics/digital analytics?
   3. Why do website operators come to you? What problem or problems do you solve?
   4. Have the problems you solve changed over time?
   5. Do clients ever come to you with one problem, only for you to discover that they have a
      completely different problem? Could you please provide examples?
   6. Are you familiar with the term data literacy?
   7. What do you understand by this term?
   8. Is it important for users of web analytics systems to have data literacy?
   9. Do you know of a model that can be used to assess a user’s data literacy?
  10. (Participants are introduced to the data literacy framework, their questions are answered
      before the next question is asked.) How would you describe your clients’ data literacy?
  11. What do you think is the reason for this level of data literacy?
  12. What is the impact of this degree of data literacy?
  13. How do you go about assessing a client’s data literacy?
  14. Have you been able to observe a development in the data literacy of users?
  15. Not everyone can afford a consultancy. How do you see the data literacy of users who, for
      example, manage personal pages?
  16. Can you already gauge a user’s data literacy from how a web analytics system has been
      installed?
  17. Do you think that the users should acquire more data literacy, or should the systems
      provide more support to the users?
  18. How could the web analytics systems support the users?
  19. How do you train your own staff?
  20. What skills should already be taught in school to improve data literacy?