=Paper=
{{Paper
|id=Vol-3667/GenAILA-paper2
|storemode=property
|title=An AI Agent Facilitating Student Help-Seeking: Producing Data on Student Support Needs
|pdfUrl=https://ceur-ws.org/Vol-3667/GenAILA-paper2.pdf
|volume=Vol-3667
|authors=Joonas Merikko,Anni Silvola
|dblpUrl=https://dblp.org/rec/conf/lak/MerikkoS24
}}
==An AI Agent Facilitating Student Help-Seeking: Producing Data on Student Support Needs ==
Joonas Merikko¹·², Anni Silvola³
¹ University of Helsinki, Finland
² Annie Advisor Ltd, Helsinki, Finland
³ University of Oulu, Finland
Abstract
Large language models (LLMs) have provided unprecedented possibilities for personalizing educational
experiences. Studies have addressed the potential of these models in supporting the learning process. Still,
less attention has been given to how LLMs could help students to sustain their academic well-being. The
current paper examines the use of LLMs in facilitating students’ help-seeking behaviors in an educational
context. We build on earlier work on a rule-based chatbot providing students with support opportunities.
First, we apply thematic analysis to student support experts’ wordings of student support needs to
build a support need classification model. Then, we use this classification model, the GPT-4 API, and
the WhatsApp API to build a support bot prototype, and we describe the development process and
technological architecture. We discuss the possibilities of such technology in lowering barriers to help-seeking and
producing data on student support needs and well-being for learning analytics applications.
Keywords
AI agents, student support, well-being, help-seeking, large language models
1. Introduction
New technologies have long inspired both utopian visions and dystopian concerns about
the future of education [1, 2]. Large Language Models (LLMs), a recent breakthrough in
artificial intelligence, have sparked fresh debates in this continuum. As general-purpose AI
models capable of generating natural language responses, LLMs like ChatGPT are poised to
fundamentally transform education by tailoring learning experiences to individual needs [3, 4].
The integration of AI in education has accelerated rapidly, with AI agents now more accessible
and easier to develop than ever before [5]. These agents extend the set of social and material support
available to students and help mirror students’ situations [6, 7]. Studies have addressed the potential
of AI in predicting learner status, providing discipline-specific learning support, personalizing
learning experiences, and supporting evaluation and assessment [8].
At present, AI is transforming social and behavioral processes that underpin learning and
knowledge creation [9, 10]. Unlike previous educational technologies, LLM-based AI agents
provide an anthropomorphic interaction experience, adding a new dimension to the learner’s
Joint Proceedings of LAK 2024 Workshops, co-located with 14th International Conference on Learning Analytics and
Knowledge (LAK 2024), Kyoto, Japan, March 18-22, 2024.
✉ joonas.merikko@gmail.com (J. Merikko); anni.silvola@oulu.fi (A. Silvola)
ORCID: 0000-0003-4166-4762 (J. Merikko); 0000-0002-2191-2194 (A. Silvola)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073
engagement with digital tools [6, 11]. This presents an unprecedented opportunity for person-
alization in learning, going beyond the feedback mechanisms of existing learning analytics
solutions. However, how these tools will change the realities in which students operate, how
students can utilize these resources, and what the ethical aspects of such technologies are remain
open questions [9, 10].
1.1. Help-Seeking Behavior and AI in Education
Help-seeking is a critical self-regulated learning strategy where students seek assistance to
achieve their academic goals or manage mental health concerns [12, 13]. Students’ help-seeking
skills are related to their academic well-being and sense of belonging in their studies [14]. Studies
also suggest that students’ willingness to seek help is linked to their academic achievement
and motivation; those with mastery goals are more likely to ask for help than those with
performance goals, who may avoid it due to fear of judgment [15, 16]. Teacher support is also a
critical factor that encourages students to engage in help-seeking [17]. However, barriers like
stigma, negative beliefs towards support services, difficulties in recognition of symptoms and a
preference for self-reliance often hinder this process [18, 19].
Technological development has expanded the avenues for help-seeking: In addition to social
support from peers, family, teachers, and student support professionals, help-seeking can include
assistance from sources that do not comprise communication with an actual person [20, 21].
Moreover, people may also form trusting relationships with nonhuman entities and assign them
human characteristics [22, 23]. Help-seeking from nonhuman sources includes benefits like
immediacy, ease of access, and greater control over the help-seeking journey, and artificial
agents can provide a non-judgmental space [20, 7, 24]. Especially in the educational context, AI
agents have the potential to facilitate students’ help-seeking for challenges that extend beyond
specific academic tasks and relate to their overall well-being and study habits. However, current
research on AI in education has not yet addressed how AI could support students to sustain
their well-being and engagement [25].
1.2. Current study
This study explores the role of an AI agent in facilitating secondary school students’ help-
seeking behavior. The focus is on help-seeking processes related to students’ academic well-being and
engagement, e.g., resource management skills, social support, peer relationships, study plans,
and daily routines [26]. As help-seeking attitudes are negatively associated with low levels of
emotional engagement [27], scaffolding help-seeking can lead to increased study engagement
and well-being, and thus also to improved academic outcomes. We build on previous experiences
of using a rule-based chatbot to provide students with support opportunities [28]. First, we
aim to build a student support need classification based on data gathered from an existing
rule-based support bot. Second, we aim to build an AI agent to facilitate help-seeking. Our
research questions are the following:
• RQ1: What kind of student support needs are educational organizations targeting with a
student support bot?
• RQ2: How might we use generative AI to facilitate students’ help-seeking process and
gather data on students’ support needs?
2. Methods
2.1. Context
This study was conducted in the context of the Finnish education system, where student support
services include professionals such as special education teachers, guidance counselors, nurses,
social workers, and psychologists [29]. To increase the match between students’ support needs
and the institutions’ support services, some institutions use a digital student support agent
called Annie [28]. Student support experts use Annie to create rule-based bot conversations,
designed to determine what kind of help a student needs. Once the need is identified, the
bot suggests appropriate help. In the current study, a novel support bot prototype was created using
LLM technology instead of rule-based dialogues, enabling students to use natural language to
interact with the bot.
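A rule-based guided conversation of the kind described above can be pictured as a small decision tree that maps each student choice to a follow-up question or a support suggestion. The sketch below is a hypothetical illustration: the node texts, choices, and structure are ours, not taken from the actual Annie tool.

```python
# Minimal sketch of a rule-based guided conversation: each node asks a
# question and maps the student's choice to a follow-up node or a support
# suggestion. Node texts and structure are hypothetical illustrations.
TREE = {
    "start": {
        "question": "Hi! How are your studies going?",
        "options": {"fine": "end_ok", "struggling": "topic"},
    },
    "topic": {
        "question": "What is worrying you most?",
        "options": {"coursework": "suggest_counselor",
                    "well-being": "suggest_psychologist"},
    },
    "end_ok": {"message": "Great to hear! You can always reach out later."},
    "suggest_counselor": {
        "message": "A guidance counselor can help; shall I ask them to contact you?"},
    "suggest_psychologist": {
        "message": "The school psychologist can help; shall I ask them to contact you?"},
}

def step(node_id, choice=None):
    """Return the bot's next utterance, following the student's choice if given."""
    node = TREE[node_id]
    if choice is not None:
        node = TREE[node["options"][choice]]
    return node.get("question") or node["message"]
```

The contrast with the LLM-based prototype is that every path through such a tree must be authored in advance, whereas natural-language interaction lets the student describe a need the tree's authors did not anticipate.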
2.2. Thematic analysis (RQ1)
To answer the first research question (What kind of student support needs are educational
organizations targeting with a student support bot?) we collected data from the digital student
support tool Annie. In the tool, multidisciplinary expert groups have identified what kind of
worries or needs students might have in different phases during their educational path, and
what kind of support resources the educational organization can offer for each of these needs.
These needs and resources are then encapsulated in guided conversations initiated by a student
support bot. Different wordings (N=263) used by expert groups in 16 Finnish educational
organizations (two in higher education, 14 in secondary) to describe student need categories
were collected from the Annie database.
These wordings were analyzed using qualitative thematic analysis [30], aiming to create a
support need classification. First, all the entries were read through. Second, identical or nearly
identical entries were grouped together. Third, these groups were collated into potential themes.
Fourth, the potential themes were critically investigated and adjusted, aiming to find a balanced
model (not too specific, yet not too general) considering the research question. Fifth, each theme
was named and compelling extract examples were chosen for reporting.
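The second step, grouping identical or nearly identical entries, can be approximated mechanically by normalizing each wording before counting. The sketch below is only an illustration of that step: the example wordings are invented, and in the study the grouping was done manually by the analyst.

```python
from collections import Counter

def normalize(wording):
    """Collapse trivial variation (case, whitespace, trailing punctuation)
    so that nearly identical wordings fall into the same group."""
    return wording.strip().lower().rstrip(".!?")

# Hypothetical example entries; the study analyzed N=263 expert wordings.
wordings = ["Study progress", "study progress", "Stress and coping",
            "Stress and coping!", "Missing course credits"]

groups = Counter(normalize(w) for w in wordings)
# groups counts 2 x "study progress", 2 x "stress and coping",
# and 1 x "missing course credits"
```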
2.3. Design Science Research (RQ2)
To answer the second research question (How might we use generative AI to facilitate students’
help-seeking process and gather data on students’ support needs?), design science research (DSR) methodology
[31] was used. The starting points for the DSR process were the experience from a rule-based
proactive support bot [28], the support need categorization (RQ1) and the availability of GPT-4
API. Using literature on help-seeking behavior and human-computer interaction as a knowledge
base, we designed an initial version of the support bot prototype (Design cycle of DSR). More
specifically, we attempted to create a low-threshold, low-stigma environment where students
could openly discuss their worries. However, we also wanted to prevent the support bot from
attempting to solve students’ problems, acknowledging the hallucination problem, i.e., the risk
of GPT-4 generating content that is not based on factual or accurate information [32]. Instead,
the bot was instructed to validate students’ worries and help them find support from human
professionals. A psychologist was consulted in the creation of the initial GPT-4 prompt.
The first version of the prototype was released as a beta version in a blog post. Informed
consent was required from the beta testers, stating that the prototype is for testing purposes
only and not to be used for requesting help, and that the gathered data may be used for research
and development purposes. By analyzing the discussions created by beta testers, we were able
to further iterate the bot infrastructure and the prompts for GPT-4 (Relevance cycle of DSR).
3. Results
3.1. Support Need Classification (RQ1)
In the thematic analysis, we found 12 different themes for the student support need wordings
used by the experts. The results of the thematic analysis of support need wordings are presented
in Table 1. The themes with the highest number of original wordings included Study planning
(N=49, 18.6%), Psychological well-being (N=44, 16.7%), and Other issues (N=43, 16.3%).
Table 1
Support need classification based on thematic analysis of experts’ support need wordings.
Code and Theme Examples N %
SP Study Planning “Study progress”, “Missing course credits” 49 18.6%
SM Subject matter “Assignments”, “Vocational studies” 9 3.4%
TL Teaching & learning arrangements “Online learning”, “Learning environment” 14 5.3%
LS Learning skills & special needs “Learning difficulties”, “Concentration” 17 6.5%
LC Life plans, career & identity “Life situation”, “Career planning” 27 10.3%
PH Physical well-being “Physical health”, “Vaccinations” 17 6.5%
PS Psychological well-being “Stress and coping”, “Mental well-being” 44 16.7%
SO Social well-being “Relationships”, “Belongingness” 20 7.6%
FH Finance & housing “Financial issues”, “Financial aid, housing” 8 3.0%
TM Tools & materials “Study equipment”, “System credentials” 8 3.0%
DC Documents & certificates “Certificates”, “Applications” 7 2.7%
OT Other issues “Other”, “Several issues”, “Needs more info” 43 16.3%
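The counts and percentages reported in Table 1 can be cross-checked directly against the total of 263 wordings:

```python
# Theme counts from Table 1 (code -> N).
counts = {"SP": 49, "SM": 9, "TL": 14, "LS": 17, "LC": 27, "PH": 17,
          "PS": 44, "SO": 20, "FH": 8, "TM": 8, "DC": 7, "OT": 43}

total = sum(counts.values())  # equals the 263 analyzed wordings
shares = {code: round(100 * n / total, 1) for code, n in counts.items()}
# e.g. shares["SP"] -> 18.6, shares["PS"] -> 16.7, shares["OT"] -> 16.3
```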
3.2. Support Bot Prototype (RQ2)
The technological infrastructure of the Support Bot Prototype is presented in Figure 2. While
the starting point was to leverage LLMs, especially GPT-4 in this case, to build the support
bot prototype, additional components were developed in the process. The central part of the
prototype is the Support Bot Engine, which interacts with the student through WhatsApp API
and controls the conversation.
There are five possible actions that the support bot can take (Figure 1):
1. GPT-4 response: the Support Bot Engine calls the GPT-4 API, including role instructions
for the bot, the chat history, and the support need classification in the API call. The
bot instructions were developed through open beta testing with continuous iterations.
For example, we instructed the bot to support students’ reflection if the student did not
want the personnel to know about the issue. Furthermore, we experimented with the
messaging style (with/without emojis, message length instructions, and tone of voice).
The final bot instructions are presented in Table 2.
2. Contact request: In each call to the GPT-4 API, the classification of support needs (Section
3.1) is provided. If, based on the chat history and the classification, a support need of
a certain theme is recognized and the student gives their consent, a contact request to
a member of school personnel is created using the GPT-4 API’s function calling feature (e.g.,
if the theme is psychological well-being, the school psychologist will be requested to
contact the student).
3. Remember or forget: Realizing that the bot discussions may contain sensitive data
that students might want full control of, we developed a feature that enables the student
to erase the discussion logs from the server after the discussion. The student is asked
whether they want the discussion logs to be preserved (remember) or erased from the
server (forget). Additionally, the feature aims to raise students’ awareness of data
security.
4. How to proceed: We noticed that in the open beta tests the discussions often trailed off
without reaching a logical endpoint. We developed a feature that nudges the student
if the conversation has been inactive for five minutes, allowing the student to continue or
end the discussion or to view the available support options.
5. Support options: If the student chooses to view the support options, the bot changes
into a rule-based mode, guiding the student through the available support options with
multiple-choice questions.
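The contact-request action (2) relies on the GPT-4 API's function calling: the request declares a function the model may call, and when the model returns a call with a recognized theme, the engine creates the contact request. A simplified sketch of how such a request payload might be assembled follows; the function name, parameter names, and instruction text are our illustration, not the paper's actual implementation.

```python
# Support need codes from the classification in Section 3.1 (Table 1).
THEME_CODES = ["SP", "SM", "TL", "LS", "LC", "PH", "PS", "SO",
               "FH", "TM", "DC", "OT"]

def build_chat_request(role_instructions, chat_history):
    """Assemble a GPT-4 chat completion request in which the model may either
    reply in text or call a function that creates a contact request."""
    return {
        "model": "gpt-4",
        "messages": [{"role": "system", "content": role_instructions},
                     *chat_history],
        "functions": [{
            "name": "create_contact_request",  # hypothetical function name
            "description": ("Request that a school professional contacts the "
                            "student about a recognized support need. Call "
                            "only after the student has given consent."),
            "parameters": {
                "type": "object",
                "properties": {
                    "theme": {"type": "string", "enum": THEME_CODES},
                    "summary": {"type": "string"},
                },
                "required": ["theme"],
            },
        }],
    }

req = build_chat_request(
    "You are a student support bot called Annie.",
    [{"role": "user", "content": "I feel really stressed lately."}],
)
```

Constraining the `theme` parameter to the enum of classification codes keeps the model's output within the support need model rather than free text.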
4. Discussion
4.1. Overview of the results
First, we created a student support need classification model by analyzing student support
professionals’ wordings for support needs. The classification model is a conceptual tool for
framing the diverse needs of students. It is more of a practical tool than an exact representation
of student needs. However, it provides a starting point to understand what students might
require from support services, and a shared language for discussing student needs within the
educational community. We were impressed with the capability of GPT-4 to utilize this model
in recognizing the themes of discussions, prompting us to consider how AI might shape the
landscape of student support.
Second, we created a support bot prototype combining LLMs and rule-based logic in an
instant messaging environment. Developing prompts for the support bot was less about coding
and more about continuous prompt iteration and figuring out what tasks AI can handle and
what might be better suited for rule-based processes. Integrating the bot with WhatsApp made
the interaction feel rather personal. We are optimistic about its potential, yet cautious about
the ethical considerations. Overall, our results are promising in the sense that the possibilities
of AI agents in scaffolding help-seeking should be investigated further.

Figure 1: Features of the support bot prototype from a student perspective. 1. GPT-4 response,
2. Contact request, 3. Remember or forget, 4. How to proceed, 5. Support options.

Figure 2: The technological infrastructure of the support bot prototype. The Support Bot Engine
sends the role instructions, chat history, and support need classification to the GPT-4 API and
receives GPT-4 responses; it relays GPT-4 responses, contact requests, how-to-proceed nudges,
support options, and remember-or-forget prompts to the student's phone through the WhatsApp
API, which carries the student's messages back.

Table 2
The final bot instructions after several iterations.

General role: You are a student support bot called Annie. Your role is to help students find right
kind of support and services for their worries or needs regarding their studies or well-being.

Detailed instructions: You need to act as a first listener of their worries and find out what is
their issue or worry. You do not need to solve their problem but show empathy and let them
know that support is available, and that it is a good thing that they talked about their worries.
You can chat about their issue for a good while, and when a good moment appears, you can
carefully try to suggest them relevant support options. Make sure to ask the student whether
you can create a contact request. They might also like to just chat with you about their worries,
and that is perfectly alright. You can encourage students that reaching out for help is a good
thing, but you do not need to push them.

Messaging style: Use emojis, ask one question at a time, message length appropriate for mobile
instant messaging, use warm and positive Gen Z language.
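In our reading of Table 2, the three prompt sections are concatenated into a single system message sent with each GPT-4 call; the exact joining is our assumption, and the middle section is abridged below.

```python
# Prompt sections from Table 2; joining them with blank lines into one
# system message is our assumption about the engine's behavior.
GENERAL_ROLE = ("You are a student support bot called Annie. Your role is to "
                "help students find right kind of support and services for "
                "their worries or needs regarding their studies or well-being.")
DETAILED_INSTRUCTIONS = ("You need to act as a first listener of their worries "
                         "... you do not need to push them.")  # abridged here
MESSAGING_STYLE = ("Use emojis, ask one question at a time, message length "
                   "appropriate for mobile instant messaging, use warm and "
                   "positive Gen Z language.")

def system_message():
    """Join the prompt sections into the system message sent with each call."""
    return "\n\n".join([GENERAL_ROLE, DETAILED_INSTRUCTIONS, MESSAGING_STYLE])

msg = {"role": "system", "content": system_message()}
```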
4.2. Limitations and future work
This work comes with limitations that should be addressed in future work. First, the thematic
analysis was based on only one rater, and therefore interrater reliability could not be measured,
possibly limiting the reliability of the classification. Moreover, the sample of wordings used
by student support experts might not be an extensive set of student worries and needs, but
rather a set of such needs that the educational organizations are equipped to address. Second,
the ethical considerations of using an AI agent in scaffolding help-seeking must be discussed
thoroughly before this technology can be used at scale. Next, we will evaluate the prototype
with students in various fictional scenarios and use a stimulated recall interview to capture
students’ experiences with the prototype.
4.3. Implications for Learning Analytics
Academic well-being is an aspect that the learning analytics field has somewhat overlooked,
perhaps stemming from the relative scarcity of well-being data compared to the data on learning
processes and outcomes. Additionally, handling well-being data involves navigating more
stringent ethical and privacy considerations. Despite these challenges, focusing on well-being
within learning analytics could be profoundly influential. While it holds intrinsic value, student
well-being is also increasingly recognized as a critical factor influencing learning outcomes.
The presented prototype addresses privacy concerns by allowing students to remove all
discussion logs after discussing with the AI agent. However, using the support need classification
model allows us to collect data on students’ support needs and well-being at a level that is
helpful for decision-makers but preserves the privacy of individual students.
Figure 3: A draft of support need analytics.
Figure 3 illustrates
a proposed model for reporting student support needs to various stakeholders within educational
institutions. Such analytics could play a crucial role in optimizing the allocation of student
support resources, ensuring that resources are utilized where they can have the most significant
effect.
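Aggregate reporting of the kind drafted in Figure 3 could, for example, count recognized support needs per theme and suppress small groups so that individual students cannot be singled out. The sketch below is hypothetical: the suppression threshold and the input data are our assumptions, not the paper's design.

```python
from collections import Counter

def aggregate_needs(recognized_themes, min_group=5):
    """Count recognized support needs per theme, suppressing any theme whose
    count falls below min_group so individuals cannot be singled out."""
    counts = Counter(recognized_themes)
    return {theme: n for theme, n in counts.items() if n >= min_group}

# Hypothetical recognized themes from a semester of bot conversations:
themes = ["PS"] * 12 + ["SP"] * 8 + ["FH"] * 2
report = aggregate_needs(themes)  # {"PS": 12, "SP": 8}; "FH" suppressed
```

Such thresholding trades some reporting detail for privacy, which seems consistent with the paper's aim of informing decision-makers without exposing individual students.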
Disclosure
Joonas Merikko is employed as Chief Product Officer at Annie Advisor Ltd, receives a salary
from the company, and owns stocks of the company.
References
[1] L. Cuban, P. Jandrić, The dubious promise of educational technologies: Historical patterns
and future challenges, E-learning and Digital Media 12 (2015) 425–439.
[2] N. Selwyn, Looking beyond learning: Notes towards the critical study of educational
technology, Journal of computer assisted learning 26 (2010) 65–73.
[3] E. Kasneci, K. Seßler, S. Küchemann, M. Bannert, D. Dementieva, F. Fischer, U. Gasser,
G. Groh, S. Günnemann, E. Hüllermeier, et al., Chatgpt for good? on opportunities and
challenges of large language models for education, Learning and individual differences
103 (2023) 102274.
[4] J. Achiam, S. Adler, S. Agarwal, L. Ahmad, I. Akkaya, F. L. Aleman, D. Almeida, J. Al-
tenschmidt, S. Altman, S. Anadkat, et al., Gpt-4 technical report, arXiv preprint
arXiv:2303.08774 (2023).
[5] T. Brown, B. Mann, N. Ryder, M. Subbiah, J. D. Kaplan, P. Dhariwal, A. Neelakantan,
P. Shyam, G. Sastry, A. Askell, et al., Language models are few-shot learners, Advances in
neural information processing systems 33 (2020) 1877–1901.
[6] A. Carolus, Y. Augustin, A. Markus, C. Wienrich, Digital interaction literacy model–
conceptualizing competencies for literate interactions with voice-based ai systems, Com-
puters and Education: Artificial Intelligence 4 (2023) 100114.
[7] B. Maples, M. Cerit, A. Vishwanath, R. Pea, Loneliness and suicide mitigation for students
using GPT3-enabled chatbots, npj Mental Health Research 3 (2024) 1–6.
[8] H. Crompton, D. Burke, Artificial intelligence in higher education: the state of the field,
International Journal of Educational Technology in Higher Education 20 (2023) 1–22.
[9] L. Markauskaite, R. Marrone, O. Poquet, S. Knight, R. Martinez-Maldonado, S. Howard,
J. Tondeur, M. De Laat, S. B. Shum, D. Gašević, et al., Rethinking the entwinement between
artificial intelligence and human learning: What capabilities do learners need for a world
with ai?, Computers and Education: Artificial Intelligence 3 (2022) 100056.
[10] I. Tuomi, Beyond mastery: Toward a broader understanding of ai in education, Interna-
tional Journal of Artificial Intelligence in Education (2023) 1–12.
[11] J. L. Steele, To gpt or not gpt? empowering our students to learn with ai, Computers and
Education: Artificial Intelligence 5 (2023) 100160.
[12] S. A. Karabenick, E. N. Gonida, Academic help seeking as a self-regulated learning
strategy: Current issues, future directions, in: Handbook of self-regulation of learning
and performance, Routledge, 2017, pp. 421–433.
[13] D. Rickwood, K. Thomas, Conceptual measurement framework for help-seeking for mental
health problems, Psychology research and behavior management (2012) 173–183.
[14] K. Mäkitalo-Siegl, F. Fischer, Stretching the limits in help-seeking research: Theoretical,
methodological, and technological advances, Learning and Instruction 21 (2011) 243–246.
[15] R. Butler, O. Neuman, Effects of task and ego achievement goals on help-seeking behaviors
and attitudes., Journal of educational Psychology 87 (1995) 261.
[16] S. A. Karabenick, Perceived achievement goal structure and college student help seeking.,
Journal of educational psychology 96 (2004) 569.
[17] R. A. Federici, E. M. Skaalvik, Students’ perceptions of emotional and instrumental teacher
support: Relations with motivational and emotional responses., International education
studies 7 (2014) 21–36.
[18] A. Aguirre Velasco, I. S. S. Cruz, J. Billings, M. Jimenez, S. Rowe, What are the barriers,
facilitators and interventions targeting help-seeking behaviours for common mental health
problems in adolescents? a systematic review, BMC psychiatry 20 (2020) 1–22.
[19] A. Gulliver, K. M. Griffiths, H. Christensen, Perceived barriers and facilitators to mental
health help-seeking in young people: a systematic review, BMC psychiatry 10 (2010) 1–9.
[20] C. Pretorius, D. Chambers, D. Coyle, Young people’s online help-seeking and mental health
difficulties: Systematic narrative review, Journal of medical Internet research 21 (2019)
e13873.
[21] M. Puustinen, J.-F. Rouet, Learning with new technologies: Help seeking and information
searching revisited, Computers & Education 53 (2009) 1014–1019.
[22] N. K. Lankton, D. H. McKnight, J. Tripp, Technology, humanness, and trust: Rethinking
trust in technology, Journal of the Association for Information Systems 16 (2015) 1.
[23] A. Waytz, J. Heafner, N. Epley, The mind in the machine: Anthropomorphism increases
trust in an autonomous vehicle, Journal of experimental social psychology 52 (2014)
113–117.
[24] V. Ta, C. Griffith, C. Boatfield, X. Wang, M. Civitello, H. Bader, E. DeCero, A. Loggarakis,
et al., User experiences of social support from companion chatbots in everyday contexts:
thematic analysis, Journal of medical Internet research 22 (2020) e16235.
[25] H.-C. Chu, G.-H. Hwang, Y.-F. Tu, K.-H. Yang, Roles and research trends of artificial
intelligence in higher education: A systematic review of the top 50 most-cited articles,
Australasian Journal of Educational Technology 38 (2022) 22–42.
[26] K. Raetsaari, T. Suorsa, H. Muukkonen, ”Lopettasko lukion?”: toimintaperusteet, toimijuuden
jännitteet ja ohjaus lukiolaisten arjen haasteissa [”Should you quit upper secondary school?”:
grounds for action, tensions of agency, and guidance in upper secondary students’ everyday
challenges], Kasvatus 52 (2021) 297–309.
[27] K. A. Kosyluk, K. O. Conner, M. Al-Khouja, A. Bink, B. Buchholz, S. Ellefson, K. Fokuo,
D. Goldberg, D. Kraus, A. Leon, et al., Factors predicting help seeking for mental illness
among college students, Journal of Mental Health 30 (2021) 300–307.
[28] J. A. Pesonen, ’are you ok?’ students’ trust in a chatbot providing support opportunities,
in: P. Zaphiris, A. Ioannou (Eds.), Learning and Collaboration Technologies: Games and
Virtual Environments for Learning: 8th International Conference, LCT 2021, Held as Part
of the 23rd HCI International Conference, HCII 2021, Springer, 2021.
[29] Finnish National Agency for Education, Support to learning and pupil welfare sys-
tem, 2024. URL: https://www.oph.fi/en/education-and-qualifications/
support-learning-and-pupil-welfare-system.
[30] V. Braun, V. Clarke, Using thematic analysis in psychology, Qualitative research in
psychology 3 (2006) 77–101.
[31] A. R. Hevner, A three cycle view of design science research, Scandinavian journal of
information systems 19 (2007) 4.
[32] V. Rawte, A. Sheth, A. Das, A survey of hallucination in large foundation models, arXiv
preprint arXiv:2309.05922 (2023).