<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>EATing Smart: Student-Educator Reflections on Assessment Literacy</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Wendy Rowan</string-name>
          <email>wendy.rowan@ucc.ie</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Katie O'Reilly</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Stephanie Larkin</string-name>
          <email>S.Larkin@ucc.ie</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Evan Murphy</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ciara</string-name>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Business Information Systems, Cork University Business School, University College Cork</institution>
          ,
          <addr-line>Cork</addr-line>
          ,
          <country country="IE">Ireland</country>
        </aff>
      </contrib-group>
      <abstract>
<p>This study investigated the Assessment Literacy (AL) Dimension of the Enhancing Equity, Agency, and Transparency in Assessment Practices (EAT) Framework, which aimed to support the development of students' self-regulatory assessment skills by enhancing their comprehension and implementation of assessment practices. Through the application of the EAT Framework, both student and educator agency and success in assessment were expected to be improved [1]. The research focused on gaining a deeper understanding of the student perspective of assessment literacy within a Business Information Systems (BIS) undergraduate programme, while also incorporating an overview of the educator's perspective to inform the assessment design process. The Assessment Literacy (AL) dimension of the framework encompassed four key self-regulatory competencies: AL1 - What Constitutes "Good"; AL2 - How Assessment Tasks Fit Together; AL3 - Student and Educator Entitlement; and AL4 - Clarity Around the Requirements of the Discipline. A purposive sample of students from across the first three years of the BIS programme participated in interviews to discuss their experiences and understanding of the AL dimension, guided by the EAT student assessment framework. To complement these insights, the educator's perspective was captured to explore their approach to assessment design. Data collection instruments included the EAT mapping framework diagram (student and educator versions), the CV assessment used across the three years of study, and open-ended interview questions. The findings provided valuable insights into students' self-regulatory capacities and their engagement with assessment practices. Furthermore, the educator's input offered direction for the refinement of future assessment design. This study contributed to the enhancement of assessment literacy, offering recommendations for improved assessment co-design between students and educators.
The outcomes aimed to enrich academic discourse and foster a more collaborative, student-informed approach to assessment in higher education through dissemination to students, peers, and the wider academic community.</p>
      </abstract>
      <kwd-group>
        <kwd>Assessment Literacy</kwd>
        <kwd>EAT Framework</kwd>
        <kwd>Student Voice</kwd>
        <kwd>Assessment Design</kwd>
        <kwd>Self-Regulation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
To understand students’ engagement with assessment and learning more fully requires a research
approach that recognises the myriad learning contexts and learner agencies involved. The Enhancing Equity,
Agency, and Transparency (EAT) Framework developed by Evans [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] offered a valuable lens through which to
examine assessment practices. By focusing on the Assessment Literacy (AL) dimension of EAT, this
study explored the ways in which undergraduate Business Information Systems (BIS) students
developed self-regulatory skills. These skills were explored through the four key elements of the
Assessment Literacy dimension, namely: understanding what constitutes ‘good’; how assessment
tasks fit together; what the entitlements are during the assessment process; and finding clarity in the
professional discipline. The EAT framework is grounded in a Personal Learning Styles Pedagogy
(PLSP), which positions both students and educators as co-constructors of assessment and the
educational journey [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        To explore the AL dimension in practice, this study concentrated on the student and educator
perspectives to better understand the situated nature of assessment literacy. Students and educators
bring distinct but connected experiences to the assessment process. To more fully understand this, a
qualitative approach was taken to capture the depth and diversity of these views. The EAT
framework [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], with an emphasis on shared understanding and self-regulatory development,
provided a flexible basis to explore how assessment literacy is experienced within this specific
disciplinary context.
      </p>
      <p>This research aimed to spotlight how students internalise standards and their interpretations of
‘what is good’ in academic work. It also sought to discover how students navigate the broader
educational landscape, especially assessment and feedback processes. Furthermore, it questioned
how students perceive professional disciplinary expectations and their role in the educational
journey, and their entitlement expectations during this experience. Ultimately, the research desired
to surface the conditions that support meaningful student engagement in assessment, as well as the
ways in which educator practices can more effectively foster a transparent and inclusive learning
environment.</p>
      <p>
        The remainder of this paper is structured as follows: Section 2 reviews the Enhancing Equity,
Agency, and Transparency in Assessment Practices (EAT) Framework by Evans [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] to position our
research on the conceptions of assessment with students as partners in the educational journey.
Section 3 then introduces the research design of our study, exploring students' and educators'
perceptions of assessment literacy and assessment co-design. Section 4 presents findings from our
study, followed by a discussion on the contributions, limitations, and avenues for future research in
Section 5. Section 6 concludes the paper.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. The EAT Framework</title>
      <p>
        The Enhancing Equity, Agency, and Transparency in Assessment Practices (EAT) Framework by Evans
[
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] is a vital tool in providing transparency in assessment for equality of opportunity and student
self-regulation. The EAT framework can be used to improve assessment practices and to re-envision
what assessment looks like in the future. EAT provides an opportunity to transform conceptions in
the assessment of learning, refocusing on students as partners in the educational journey.
      </p>
      <p>
        Assessment is a fundamental component of the educational process, influencing both teaching
practices and student learning outcomes. Assessments can work to reinforce good performance for
students but also work as a guide for students who are looking to improve [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Educators can improve
instruction, give feedback, and determine students' strengths and weaknesses with the help of
assessment [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Assessment literacy is the ability of a student to understand the assessment they are
undertaking and the skills needed to undertake the assessment [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Assessment literacy is critical to
ensuring students’ success in assessments. It ensures students understand the purpose of the
assessment, the strategies needed to undertake it, self-reflection, and agency in seeking feedback
[
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. Assessment literacy within the EAT framework is connected to feedback literacy and assessment
design [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. When students understand what is good within their assessments, they can better engage
in peer assessment and feedback interpretation [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. Transparency in the assessment process can help
to further assessment literacy. Problems around transparency can arise when educators focus only on
students’ understanding; it is important that assessors also work from shared
understandings to ensure fairness and reliability in grading [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
      </p>
      <p>
        A clear understanding of assessment is crucial for students to engage with it
meaningfully. An essential aspect is an assessment's readability, which encompasses layout,
language, and organisation, among other factors that influence how quickly a reader comprehends
written text. Students may have trouble understanding assessment instructions that are unclear
or difficult to access, which can have a detrimental effect on performance [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. Clarity in
assessments covers not only students’ understanding of the task itself but also the
purpose of the assessment: students should understand why the assessment is being conducted as
well as how [11]. A lack of clarity within assessments can negatively impact students' perceptions of
teaching and their learning satisfaction [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. Students’ understanding of assessment criteria is vital
to their success; when students struggle to understand the criteria, they can
disengage [12]. Ensuring clarity in assessments can help to alleviate the cognitive load on students,
allowing them to work efficiently and improve their learning outcomes [13]. A student’s ability
to understand the assessment plays a crucial role in their potential success and has a positive impact
on their overall learning outcomes.
      </p>
      <p>Co-design in student assignments has emerged as a collaborative approach that actively involves
students in shaping assessment processes, with the aim of enhancing engagement, transparency, and
inclusivity in higher education. Previous research has highlighted the role that students can play in
the learning experience. It is important that students are viewed as having an active role and not a
passive role [14]. Students' involvement in the formation of assessment can be achieved through
processes such as co-design. Using co-design involves the students taking on an active role in the
formation of assessments in areas such as creating the tasks and the grading criteria [15]. Involving
students in the assessment design process not only enhances their engagement but also contributes
to more inclusive assessment practices by providing all students with equitable opportunities to
succeed [16]. While co-design approaches to assessment have many benefits, it is important that all
students have the opportunity to contribute, supporting diversity and inclusion. Clear
communication, inclusive recruitment practices, and incentives are necessary to ensure equity in
assessment co-design [15]. Co-design in student assessments can work to increase student
engagement in their learning process as well as fostering a sense of inclusion.</p>
      <p>As discussed in the introduction, the EAT framework is built on PLSP. Pedagogy is the careful
and planned way teachers help students learn by connecting what they teach, how they teach, and
how students are supported, aiming to help students grow both academically and personally [17].
PLSP emphasises the importance of understanding and responding to individual learning differences
in educational settings, particularly in how feedback is designed and delivered [18]. Feedback plays
a vital role in the assessment process. While there is no uniform definition of feedback, it can serve
as a tool for improvement and motivation, and can promote self-regulation [19]. Self-regulation in learning is
students' engagement in behaviour, motivation, and cognition [20]. For feedback to be effective for
students it is important they understand the entire assessment process, so they can act upon the
feedback they receive [21]. Feedback should be easy for students to engage with and implement in
their future work. Feedback can also come from students themselves, in the form of self-feedback and
self-assessment, a valuable skill for students to develop [22]. When a student is able to carry out
self-assessment, this can help to increase their feedback literacy, a relationship that
educators should encourage [23]. Feedback literacy refers to a student’s knowledge,
skills and attitude needed to effectively interpret and use feedback [24].</p>
      <p>Previous research has called for assessment to become more integrated into the learning process
to more effectively support student learning experiences [25]. Traditional methods of assessment can
negatively impact students; in comparison, learner-centred methods can encourage deeper learning
and autonomy [26]. Students often focus on the grade they receive from assessment, while educators
tend to focus on feedback given to students [27]. This disconnect between educators and students
can make it challenging to fully support students through the process.</p>
      <p>This study focuses on addressing the following four research questions which align with the four
elements of the Assessment Literacy (AL) dimension of EAT. For AL1, the research question posed
was: How do students interpret assessment criteria and what influences their understanding of what
constitutes ‘good’? For AL2 we sought to understand: How do students perceive progression between
assessment tasks and how does this shape their development? To explore AL3 we asked: How do
students approach feedback and dialogue with educators? And finally, for AL4: How do students
come to understand the professional expectations in their field and how does this shape their
academic identity? The following section will introduce the research methods used in this explorative
study.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Methodological Approach</title>
      <p>
        This research adopted a qualitative approach to obtain an in-depth understanding of students’ and
educators’ perceptions of the assessment literacy (AL) dimension and assessment co-design
embodied in the EAT Framework. The research aim was to take an in-depth look at the students’
self-regulatory approaches to their learning and self-assessment of educational progress. The
educator's input had the potential to offer direction for the refinement of future assessment co-design
ideologies. This information helped to provide both insights and suggestions for implementing good
higher education practices based on ideas within the EAT [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] Framework.
      </p>
      <p>The participants in this study included Business Information Systems (BIS) Undergraduate
students studying the professional development modules running across three years of the four-year
programme, as well as educators providing teaching and learning support on this undergraduate BIS
programme.</p>
      <p>The student participants were recruited from three professional development modules. Only
students over the age of 18 years were invited to take part in this exploratory research. Those under
18 years of age were excluded to conform with ethical approval requirements for working with adult
populations, as such participants are deemed more vulnerable.</p>
      <p>As the average number of students within each year of the BIS undergraduate programme is
approximately 150, it was decided to recruit 8 to 9 students from each of BIS years 1, 2, and 3 for this study.</p>
      <p>This sample of students from across the first three years of the four-year BIS programme were
invited to take part in a 45-60 minute interview to discuss Assessment Literacy as part of the EAT
student assessment framework. Data collection tools used as part of the interview process included
the study version of the EAT mapping framework diagram, the educator designed CV continuous
assessment outline with rubric from each of the three modules across each of the 3 years of the BIS
programme and open-ended interview questions to structure the one-to-one peer-led session.</p>
      <p>The gender mix of participants aimed to be representative of the student cohorts within
each year group.</p>
      <p>In addition to the student-focused interview sessions, the intention was also to interview
4 to 5 educators teaching on the BIS undergraduate programme to obtain their insights into the
continuous assessment process. The only exclusion criterion employed in relation to educator
recruitment was that only BIS lecturers teaching on the BIS UG programme would be invited to
take part in the study. The reason for this decision was to ensure familiarity with the programme
design, learning objectives and anticipated learning outcomes.</p>
      <p>These interview sessions with both the student and educator participants were conducted by a
BIS 2nd year student on a summer research assistant internship with BIS. All interviews were
recorded on digital devices, thereafter transcribed into a Word document, and later analysed for
thematic streams.</p>
      <p>Students were recruited to this study through networking with student year representatives, local
university societies such as the BIS society, through social media such as LinkedIn, and a notification
sent through the relevant university learning management system and respective modules.
Recruitment was open to all students (years 1 to 3) on the BIS programme, which currently enrols
150 students in each year of the degree. This research was conducted outside of the teaching period,
during the summer of 2024.</p>
      <p>Educators were recruited via an email message sent out to all BIS UG teaching staff through the
universal email address system. Educator recruitment was supported by professional and
educational staff on the research team.</p>
      <sec id="sec-3-1">
        <title>3.1. Participant Demographics</title>
        <p>In the end, a total of 4 educators and 15 BIS students took part in the interview sessions for this study
(see Table 1).</p>
        <p>Qualitative interviews were analysed using MS Excel software and a thematic approach,
adopting the qualitative coding practices of Saldaña [28] and the reflexive thematic approach of
Braun and Clarke [29], which centres on “the researcher’s reflective and
thoughtful engagement with the data and their reflexive and thoughtful engagement with the analytic
process” [29, p. 594]. Multiple members of the team had access to the data set to explore multiple
assumptions and interpretations of the data when undertaking structural and descriptive-level coding
and thematic analysis.</p>
        <p>MS Excel was used as the software of choice for the analysis as it was an easy platform on which
to reflect on the coding process and for each member of the team to have shared access to the coding
structure.</p>
        <p>Interview transcripts were analysed by combining structural and descriptive coding processes
[28]. Codes were determined through both top-down and bottom-up approaches: referring to the AL
dimensions of the EAT framework (structural) whilst also allowing emergent codes to be
recognised. The coding took place over several weeks, with first- and second-order coding and themes
reviewed and revised by the members of the research team. Each code was allocated an operational
definition, to support consistency in coding. An illustration of the coding tree can be found below
in Figure 2.</p>
        <p>Following this approach to qualitative coding, a frequency count was performed. The frequencies
of each code were counted by the number of participants and combined with a qualitative analysis
of each code. The frequency count allowed us to identify which ideas were deemed most relevant by
participants, whereas the qualitative analysis allowed us to determine their reflections on each topic.</p>
        <p>The following section presents the key themes that emerged from the qualitative analysis,
offering insights into the participants’ reflections on assessment literacy.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Insights</title>
      <p>4.1. ‘What is Good’</p>
      <p>In alignment with the Assessment Literacy dimension of the EAT framework, both students and educators reflected
that providing and receiving explicit guidance from the outset was essential to success – or in
knowing what ‘Good’ should look like. From the educator perspective knowing what is ‘good’ is
conveyed through connections between content-assessment-criteria: “I use a rubric with students to
both explain the assessment and also to prompt them to think about what needs to go into the assessment.
How it links to the current material that we're working on. How it links to the learning objectives of the
module” (E3).</p>
      <p>To understand ‘Good’, students suggested that the provision of model examples and
opportunities to practise and build confidence through small, progressive educational
challenges would help achieve this goal. Knowing what ‘Good’ is comes not only from educator
feedback but also from discussions and feedback from peers. Clarity on what is expected from
assessment was seen as needing to be clearly aligned to course content, assessment design and
assessment guidance in the form of rubrics.</p>
      <p>“Our lecturers would provide us a rubric and post on canvas and for all the required courses,
giving the required criteria for the assessment” (S5).
“If you have any questions related to the criteria you can choose to meet the lecturer for the
assessment, or you can just ask the lecturers in class” (S11).</p>
      <p>The rubrics were seen as a powerful resource for the students; they helped guide them and provide
concrete advice on what was expected.</p>
      <p>“I usually check the rubric quite often throughout the assignment, so like you would
check it at the start, but I tend to if it's over time that I'm doing this piece of work, I'll
check a good few times throughout the assignment” (S12).</p>
      <p>Peer support was viewed by students as a vital resource to understand what the course and
assessments were seeking from them.</p>
      <p>“I will talk to my friends about what they would do. I mean, what are your plans for? What do
you think is good? I try and talk to people in the year ahead of me, see if they have any
suggestions” (S6).</p>
      <p>Of course, the downside to this strategy could be the potential for a peer to accidentally mislead
the enquirer, if they were also unclear about the work requirements.</p>
      <p>Building confidence comes with practice: the more effort put into completing assessments
and reviewing the feedback and grade obtained, the more confidence was seen to be built through
this process.</p>
      <p>“I feel like the work that you put in, you always get back, not just in grades, but even learning
new skills and how to do assignments” (S4).</p>
      <p>Small gains in achievements made for large impressions on student confidence.</p>
      <sec id="sec-4-1">
        <title>4.2. Assessment</title>
        <p>Although the rationale underpinning how assessment fits together may be clear to the educator, it
is in the translation of this knowledge to students that sometimes comprehension goes adrift. The
educator aim is to engage students in the learning process so that both content and assessment design
are relevant and valuable for the student in meeting the learning objectives, but also to help learners
appreciate how these skills can be used beyond the educational experience.</p>
        <p>However, educators expressed that the connections between assessment elements and programme
design may not always be as transparent as they would wish.</p>
        <p>“In a reflective fashion and being very upfront with the students saying this builds on that. Even
if I had built it that way, it may not have been spelled out in the Learning Management System
really clearly” (E2).</p>
        <p>Students expressed a preference for assessments designed to build
on each other and become progressively harder: “I think the ones … that work perfectly are the ones that
are directly correlated to each other, … like introduce skills and … you're learning it in sections … this is
how you're going to do this and this is how you're going to do that, and it builds on each other” (S3).</p>
        <p>The opportunity for iterative assessment, such as a combined formative and summative approach,
was viewed as a supportive action that allowed students to demonstrate the evolution of
their learning. “Last year we had a module …with four or five deliverables set every two weeks ... And
we definitely used the feedback from the early ones to help us learn what was expected for the later ones,
… we knew the standard then and that helped us push on” (S10).</p>
        <p>The building block approach really appealed to students; they found modules that adopted this
approach took the learner on a journey, instead of just focusing on the destination. “If you start with
the basics, … then get introduced to more complex ideas and … I always really like … applying knowledge
from another assignment to the next one” (S10).</p>
        <p>This approach also appealed to educators and, as evidenced by both educator and student
comments, was clearly seen as adding value to the learning experience. “I give students an opportunity to
submit a draft and gather feedback … and implement changes in stage two, very much a building blocks
approach to design and development” (E3).</p>
        <p>The student perspective was positive: “One of our assessments was based on a step-by-step guide.
…Because of what I've learned in project phase one, it's helped me massively with project phase two”
(S2).</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.3. Feedback Exchange</title>
        <p>As much as educators try to reach out to students, it can be challenging if there is a reluctance
to engage. The educator’s view on this was: “I invite them to contact me by e-mail. Some students
contact me by canvas. I will stay around for 20 minutes if anybody wants to come and chat” (E2).
Whereas the student version reads quite differently: “Most of the time I won't go directly to the lecturer,
probably just be like, yeah, peers or even the tutors if they're available” (S2).</p>
        <p>Students preferred, in the first instance, to ask peers or tutors for support prior to reaching out to
the lecturer. “Well, I definitely I do ask the tutors for feedback, and I know that would be more peer
feedback, but I do think it helps, and they can give really good insight because you know they've done it
themselves as well” (S4). And when reaching out for individual feedback the end-of-lecture option is
viewed as the simplest method: “I haven't fully actually tried to get feedback from lecturers
individually, but for the ones I did it was usually either seeing them straight after the lecture and
then you just you talk about your situation or e-mail them” (S7).</p>
        <p>Engaging through feedback mechanisms both in relation to assessment and general progress can
involve a variety of forms. For students, some of the novel approaches such as peer-to-peer feedback
have made a firm impression such as: “I remember we had … peer feedback. So, like your peers would
review your work and then that'll be good because you get honest feedback on what could be improved…
you get constructive feedback, it always helps to build on things” (S6). This is positive support for
student engagement, as there are struggles in getting timely feedback from educators, as expressed
here: “Getting feedback to students very quickly … this is something that I have struggled with [due
to the nature of the subject I teach and large class sizes] … in any sort of a timely fashion back to
students” (E2).</p>
        <p>However, it is clear that feedback, timely or otherwise, has a beneficial effect if it highlights ways
to improve performance: “If you do something that receives a worse grade for a module, for example, if
they kind of give you your feedback, you'd actually read over it, and you definitely absorb it” (S15).
Students’ role is then to take on board recommendations for change: “Seeing if there's anything you
can improve on and then like, obviously you're not going to improve if you don't take on board the
feedback” (S11).</p>
      </sec>
      <sec id="sec-4-3">
        <title>4.4. Professional Skills</title>
        <p>Some of the essential skills needed to understand the discipline involve networking with others beyond
the university. For students, this was achieved through the invitation of industry guest speakers and
researchers to university sessions. “Having guest speakers, I found helped a lot, but since they've gone
through what we are going through and then they've gone on to do” (S9). In conjunction with this
approach, the three professional development modules running through years 1 to 3 of the
undergraduate course were viewed by students as a beneficial addition to their core IS studies. Such
that, “the professional development module throughout the entire year … was hugely helpful, especially
as they brought in guest speakers from industry” (S7). Not only functioning for student discovery of
industry options but also raising awareness of professional skill requirements. “The professional
development module, … is pretty good because it gives you an understanding of what's expected by
people, how your career develops, what professional life is … getting to talk about the corporate and
professional world, because it's not something you're going to get until you go into the corporate
environment” (S3).</p>
        <p>Aside from organised sessions, students also expressed the actions they take to build their
professional skills. For instance, “I'm going to be chairperson of society this coming year, which I think
some of what I've learnt in professional studies got me to that point” (S12). Other students shared
personal methods for self-improvement: “I'd use resources like information we're given in our lectures,
and I'd also like to research it myself - online and follow social media channels - on what a BIS
professional would need and how you can improve on that” (S6).</p>
        <p>Another view was that professional skills grow as the individual matures, immersed in lived experiences both within and outside university life: “For professional requirements you are going to pick them up yourself … as you go” (S2). But collegial activities do help: “Of course problem solving and other communication skills all build up … during college life” (S1).</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Lessons learnt (Discussion)</title>
      <p>
        This study has highlighted the value of using the Assessment Literacy (AL) dimension of the EAT
Framework [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] to better understand how undergraduate Business Information Systems (BIS) students
and their educators engage with assessment. The themes that emerged - understanding what
constitutes “good”, how tasks build upon each other, entitlement and feedback, and clarity in
disciplinary expectations - resonated closely with the wider literature on assessment and feedback
literacy [
        <xref ref-type="bibr" rid="ref7 ref8">7, 8, 24</xref>
        ]. In particular, the findings reinforce the importance of transparency and student
involvement in assessment design, pointing to the value of co-constructive and dialogic practices in
enhancing self-regulated learning.
      </p>
      <p>Students shared thoughtful reflections on how they understand academic standards and
expectations, often through practical tools such as rubrics, peer discussion, and iterative feedback
opportunities. These reflections indicate growing confidence in navigating assessment tasks,
especially when tasks are structured to build upon one another over time. Educators similarly
acknowledged the importance of scaffolding and offering formative support, recognising the ongoing
challenges in ensuring clarity and consistency. These insights reflect the value of the EAT
Framework in creating space for shared understanding between students and educators.</p>
      <p>Importantly, this study has responded to its four research questions, providing a situated account
of how students interpret assessment criteria (AL1), understand progression between tasks (AL2),
engage with feedback and dialogue (AL3), and begin to form a sense of professional identity (AL4).
By placing the students’ voice at the centre, this study offers a meaningful contribution to the
conversation around assessment literacy in higher education and demonstrates how reflective
insights can inform the refinement of assessment practices.</p>
      <p>As an exploratory study situated within a single disciplinary context, the findings are limited by
the relatively small sample size. The research was also conducted within a specific institutional and
discipline-specific setting, and further work would be needed to explore how assessment literacy is
experienced across other disciplines and over time.</p>
      <p>Future research could expand this work by applying the EAT Framework in a range of
disciplinary contexts to examine cross-programme trends. Further exploration of how co-design in
assessment shapes student engagement and identity would be valuable. It would also be useful to
explore assessment literacy from the perspective of more diverse student cohorts, to better
understand the experiences of underrepresented voices in assessment design.</p>
      <p>This study suggests making assessment criteria and expectations explicit through clear rubrics, exemplars, and formative dialogue, all of which were highly valued by students. Creating structured, scaffolded tasks
that allow students to build skills over time appeared to support deeper learning and confidence.
Encouraging a culture of peer support and feedback also emerged as beneficial. Moreover, embedding
opportunities for professional reflection and industry engagement, as seen in the professional
development modules, offers a valuable bridge between academic study and the workplace. Where
possible, involving students in aspects of assessment design may further promote inclusivity and
ownership in the learning process.</p>
    </sec>
    <sec id="sec-6">
      <title>6. Conclusion</title>
      <p>This study has demonstrated how the EAT Framework can be effectively used to explore
assessment literacy from both student and educator perspectives. It offers insights into the
development of professional skills among Business Information Systems undergraduates and
highlights current educators’ opinions on practice. The findings suggest opportunities for further
research examining how structured skills development combined with peer-learning and
professional engagement can enhance the academic-to-workplace transition. Ongoing dialogue and a collaborative, student-centred focus remain key to enhancing the academic experience for students and contributing to wider disciplinary and educational discussions.</p>
    </sec>
    <sec id="sec-7">
      <title>Ethics</title>
      <p>Ethical approval for this research was obtained from the University Social Research Ethics
Committee (SREC) in June 2024. Ethical practices were strictly followed throughout this research to
ensure the protection of participants' rights, dignity, and confidentiality. This approach complied
with institutional and legal requirements to uphold the integrity and credibility of the research
findings.</p>
    </sec>
    <sec id="sec-8">
      <title>Acknowledgements</title>
      <p>This research was supported by funding from the Department of Business Information Systems,
Cork University Business School, University College Cork, Ireland, for the summer research assistant
opportunity.</p>
    </sec>
    <sec id="sec-9">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
    </sec>
    <sec id="sec-10">
      <title>References</title>
      <p>[11] P. Broadfoot, P. Black, “Redefining assessment? The first ten years of assessment in education,” Assessment in Education: Principles, Policy &amp; Practice, vol. 11, no. 1, pp. 7–26, Mar. 2004. doi: 10.1080/0969594042000208976.</p>
      <p>[12] H. Tillema, “Student Involvement in Assessment of their Learning,” in Enabling Power of Assessment, vol. 1, Springer Science and Business Media B.V., pp. 39–53, 2014. doi: 10.1007/978-94-007-5902-2_3.</p>
      <p>[13] N. Serki, S. Bolkan, “The effect of clarity on learning: impacting motivation through cognitive load,” Communication Education, vol. 73, no. 1, pp. 29–45, Jan. 2024. doi: 10.1080/03634523.2023.2250883.</p>
      <p>[14] J. M. Civikly, “Clarity: Teachers and students making sense of instruction,” Communication Education, vol. 41, no. 2, pp. 138–152, Apr. 1992. doi: 10.1080/03634529209378876.</p>
      <p>[15] A. Smith, L. McConnell, P. Iyer, M. Allman-Farinelli, J. Chen, “Co-designing assessment tasks with students in tertiary education: a scoping review of the literature,” Assessment &amp; Evaluation in Higher Education, vol. 50, no. 2, pp. 199–218, Feb. 2024. doi: 10.1080/02602938.2024.2376648.</p>
      <p>[16] J. H.-M. Tai, M. Dollinger, R. Ajjawi, T. St Jorre, S. Krattli, D. McCarthy, D. Prezioso, “Designing assessment for inclusion: an exploration of diverse students’ assessment experiences,” Assessment &amp; Evaluation in Higher Education, vol. 48, no. 3, pp. 403–417, Apr. 2023. doi: 10.1080/02602938.2022.2082373.</p>
      <p>[17] A. Loveless, “Technology, pedagogy and education: reflections on the accomplishment of what teachers know, do and believe in a digital age,” Technology, Pedagogy and Education, vol. 20, no. 3, pp. 301–316, Oct. 2011. doi: 10.1080/1475939X.2011.610931.</p>
      <p>[18] C. Evans, M. Waring, “Enhancing Feedback Practice,” in Style Differences in Cognition, Learning, and Management, 1st ed., S. Rayner and E. Cools, Eds., New York: Routledge, pp. 196–213, 2011.</p>
      <p>[19] C. Evans, “Making Sense of Assessment Feedback in Higher Education,” Review of Educational Research, vol. 83, no. 1, pp. 70–120, Mar. 2013. doi: 10.3102/0034654312474350.</p>
      <p>[20] C. A. Wolters, A. C. Brady, “College Students’ Time Management: a Self-Regulated Learning Perspective,” Educational Psychology Review, vol. 33, no. 4, pp. 1319–1351, Dec. 2021. doi: 10.1007/s10648-020-09519-z.</p>
      <p>[21] C. Evans, M. Waring, “Enhancing Students’ Assessment Feedback Skills Within Higher Education,” in Oxford Research Encyclopedia of Education, Oxford University Press, 2020. doi: 10.1093/acrefore/9780190264093.013.932.</p>
      <p>[22] S. Rutherford, C. Pritchard, N. Francis, “Assessment IS learning: Developing a student‐centred approach for assessment in Higher Education,” FEBS Open Bio, vol. 15, no. 1, pp. 21–34, Jan. 2025. doi: 10.1002/2211-5463.13921.</p>
      <p>[23] Z. Yan, D. Carless, “Self-assessment is about more than self: the enabling role of feedback literacy,” Assessment &amp; Evaluation in Higher Education, vol. 47, no. 7, pp. 1116–1128, Oct. 2022. doi: 10.1080/02602938.2021.2001431.</p>
      <p>[24] D. Carless, D. Boud, “The development of student feedback literacy: enabling uptake of feedback,” Assessment &amp; Evaluation in Higher Education, vol. 43, no. 8, pp. 1315–1325, Nov. 2018. doi: 10.1080/02602938.2018.1463354.</p>
      <p>[25] D. Wiliam, “Assessment and learning: some reflections,” Assessment in Education: Principles, Policy &amp; Practice, vol. 24, no. 3, pp. 394–403, Jul. 2017. doi: 10.1080/0969594X.2017.1318108.</p>
      <p>[26] D. Pereira, M. A. Flores, L. Niklasson, “Assessment revisited: a review of research in Assessment and Evaluation in Higher Education,” Assessment &amp; Evaluation in Higher Education, vol. 41, no. 7, pp. 1008–1032, Oct. 2016. doi: 10.1080/02602938.2015.1055233.</p>
      <p>[27] S. Lynam, M. Cachia, “Students’ perceptions of the role of assessments at higher education,” Assessment &amp; Evaluation in Higher Education, vol. 43, no. 2, pp. 223–234, Feb. 2018. doi: 10.1080/02602938.2017.1329928.</p>
      <p>[28] J. Saldaña, The Coding Manual for Qualitative Researchers, 4th ed., London: SAGE Publications, 2021.</p>
      <p>[29] V. Braun, V. Clarke, “Reflecting on reflexive thematic analysis,” Qualitative Research in Sport, Exercise and Health, vol. 11, no. 4, pp. 589–597, 2019.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>C.</given-names>
            <surname>Evans</surname>
          </string-name>
          ,
          <article-title>Enhancing Assessment Feedback Practice in Higher Education: The EAT Framework, Underpinning Principles of Evans' Assessment Tool</article-title>
          (EAT),
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>C.</given-names>
            <surname>Evans</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Waring</surname>
          </string-name>
          ,
          <article-title>Enhancing feedback practice: A personal learning styles pedagogy approach</article-title>
          , in S. Rayner and E. Cools (Eds.), Style Differences in Cognition, Learning, and Management: Theory, Research, and Practice, New York: Routledge,
          <year>2012</year>
          . doi: 10.4324/9780203841853.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>C.</given-names>
            <surname>Evans</surname>
          </string-name>
          , “
          <article-title>The EAT Framework: Enhancing assessment feedback practice in higher education</article-title>
          ,” InclusiveHE,
          <year>2022</year>
          . Available: https://inclusivehe.org/inclusive-assessment/
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>I.</given-names>
            <surname>Stăncescu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. M.</given-names>
            <surname>Drăghicescu</surname>
          </string-name>
          , “
          <article-title>The Importance of Assessment in the Educational Process - Science Teachers' Perspective</article-title>
          ,”
          <source>European Proceedings of Social and Behavioural Sciences</source>
          ,
          <year>2017</year>
          . doi: 10.15405/epsbs.2017.07.03.89.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>I.</given-names>
            <surname>Tosuncuoglu</surname>
          </string-name>
          , “
          <article-title>Importance of Assessment in ELT</article-title>
          ,”
          <source>Journal of Education and Training Studies</source>
          , vol.
          <volume>6</volume>
          , no.
          <issue>9</issue>
          , p.
          <fpage>163</fpage>
          , Aug.
          <year>2018</year>
          . doi: 10.11114/jets.v6i9.3443.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>C. D.</given-names>
            <surname>Smith</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Worsfold</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Davies</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Fisher</surname>
          </string-name>
          , and
          <string-name>
            <given-names>R.</given-names>
            <surname>McPhail</surname>
          </string-name>
          , “
          <article-title>Assessment literacy and student learning: the case for explicitly developing students' assessment literacy</article-title>
          ,”
          <source>Assessment &amp; Evaluation in Higher Education</source>
          , vol.
          <volume>38</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>44</fpage>
          -
          <lpage>60</lpage>
          , Feb.
          <year>2013</year>
          . doi: 10.1080/02602938.2011.598636.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>C.</given-names>
            <surname>Hannigan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Alonzo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. Z.</given-names>
            <surname>Oo</surname>
          </string-name>
          , “
          <article-title>Student assessment literacy: indicators and domains from the literature</article-title>
          ,”
          <source>Assessment in Education: Principles, Policy &amp; Practice</source>
          , vol.
          <volume>29</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>482</fpage>
          -
          <lpage>504</lpage>
          , Jul.
          <year>2022</year>
          . doi: 10.1080/0969594X.2022.2121911.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>X.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Evans</surname>
          </string-name>
          , “
          <article-title>Enhancing the development and understanding of assessment literacy in higher education</article-title>
          ,”
          <source>European Journal of Higher Education</source>
          , vol.
          <volume>14</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>80</fpage>
          -
          <lpage>100</lpage>
          , Jan.
          <year>2024</year>
          . doi: 10.1080/21568235.2022.2118149.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>C.</given-names>
            <surname>Gonsalves</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Lin</surname>
          </string-name>
          , “
          <article-title>Clear in advance to whom? Exploring 'transparency' of assessment practices in UK higher education institution assessment policy</article-title>
          ,”
          <source>Studies in Higher Education</source>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>17</lpage>
          , Jul.
          <year>2024</year>
          . doi: 10.1080/03075079.2024.2381124.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>S.</given-names>
            <surname>Roy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Beer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Lawson</surname>
          </string-name>
          , “
          <article-title>The importance of clarity in written assessment instructions</article-title>
          ,”
          <source>Journal of Further and Higher Education</source>
          , vol.
          <volume>44</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>143</fpage>
          -
          <lpage>155</lpage>
          , Feb.
          <year>2020</year>
          . doi: 10.1080/0309877X.2018.1526259.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>