   Towards Measurement of the Relationship between
   Student Engagement and Learning Outcomes at a
            Bricks-and-Mortar University

                  Carmel Kent*, Chris A. Boulton, Hywel Williams
                              University of Exeter, UK
{c.kent, c.a.boulton, h.t.p.williams}@exeter.ac.uk

         Abstract. The relationship between student engagement and
         learning outcomes has been extensively studied in the context of
         online learning. However, it has been less investigated in face-to-
         face learning. In this paper, we describe initial findings from a study
         of student engagement and outcomes at a ‘bricks-and-mortar’
         (BaM) university, where engagement is characterized by a diverse
         set of systems and agents, spanning both physical and digital
         spaces. We ask whether the substantial relationship between
         engagement and outcomes often found in online environments also
         holds in a BaM setting. We present an initial analysis of data traces from various
         sources, each relating to a different dimension of engagement.
         Initial results indicate a weak relation between engagement and
         outcomes, suggesting that this important relationship may be
         substantively different in face-to-face/BaM and online learning
         environments. These preliminary findings highlight the need for
         further research, tackling challenges which are specific to face-to-
         face learning in a bricks-and-mortar university environment.

         Keywords: Learning analytics · Bricks and mortar · Learning outcome ·
         Engagement


1 Introduction
Student engagement and learning outcomes are amongst the most ill-defined and
broadly interpreted theoretical concepts, with no currently agreed-upon frameworks
for their operationalization [2, 5]. This vagueness is one of the reasons for the
general lack of clarity about the relation between them [16]. Nevertheless, the
relationship between engagement and outcomes has been extensively investigated in
the context of online learning, in which participation is shown to be highly correlated
with various types of learning outcomes [1, 3-4, 7, 11, 20-22]. In purely online
learning settings, engagement is typically defined narrowly in terms of student
interactions with a Virtual Learning Environment (VLE), usually in the context of a
specific course or module. However, student engagement in ‘bricks-and-mortar’
(BaM) institutions, where most teaching is delivered face-to-face, is much less clearly
defined, and there are associated difficulties in its measurement and analysis. Thus,
much less evidence exists to determine whether and how engagement is useful
for predicting outcomes in face-to-face learning.

          Online learning is not analogous to face-to-face learning and each requires
different conceptualization and operationalization frameworks [8, 12]. Moreover,
students have been shown to engage differently in an online learning environment
as opposed to a BaM environment, which also results in different learning
outcomes. This difference might be explained by the more self-regulated nature of
online learning [10]. Within a BaM environment, learners
interact with a wide variety of systems, some of which relate directly to their course
performance (e.g. lectures, assessments, VLEs) while others address learning
outcomes in a wider context (e.g. career planning). Comparative study of higher
education learning across different contexts and environments is still in its infancy
[10] and holds many technical challenges relating to the collection, integration and
ethical aspects of data from multiple sources [18]. In this paper, we present initial
insights into the relationship between student engagement and learning outcomes in a
BaM university. To capture this relationship's flexible and sometimes elusive
nature, we adopt a holistic approach that aims to integrate data captured
from various sources and interaction points in order to provide a multidimensional
image of student engagement.


1.1 Measuring Engagement
The phrase “student engagement” has come to refer to the level of involvement
students appear to have within their classes and their institutions in the context of
learning [5]. Moore [14] proposed three types of interactivity: learner-content,
learner-instructor and learner-learner. We suggest extending this framework and
viewing students' engagement at a BaM institution as a multi-dimensional construct
entailing the measurement of interactions between the student and various types of
resources and agents (such as systems, people and devices) associated with the
individual learning experience [19]. For our purpose, an interaction denotes a singular
instance or event in which a student uses a resource, and represents a temporal
relationship between the student and the resource [6]. For instance, an interaction may
be attending a lecture, submitting a quiz, speaking to a lecturer, or accessing the VLE.
It is very difficult to separate the net contribution of each type of interaction to the
learning process; even in the field of online learning, where interactions are easier to
identify, this debate remains open [5]. In addition, it is very complicated to study
'engagement' across different learning designs, goals and student backgrounds.
Thus, in this paper, we perform an initial cross-design analysis and add demographic
parameters in order to support future work with more fine-grained cohorts.


1.2 Measuring Learning Outcome
It is enormously challenging to measure the depth of understanding represented by a
course-specific learning outcome [17, 19]. Module results are usually based on
assessment tools defined around specific learning objectives [13]. There are important
differences between face-to-face and online learning, including the pedagogical basis
for assessment. Instructivism, which is common in face-to-face learning at BaM institutions,
maintains that knowledge should be transferred directly from the instructor to the
learner without further interactions [15]. On the other hand, social constructivism is
often implemented in collaborative online learning environments, whereby the teacher
is seen as a facilitator between students, content and platforms, and social interactions
are more central [9]. Accordingly, the definition of learning outcomes may reflect
this difference, and face-to-face learning assessment in BaM institutions could
therefore be less correlated with interactive behaviors. While we recognize that there are
many kinds of learning outcome, in this initial study we focus on student performance
as measured by module grades.


2 Method
Engagement has been shown to correlate with performance in online learning. In this
study, we ask how this relation manifests in a BaM university. More specifically, we
attempt to determine what types of interactions and student characteristics can predict
specific learning outcomes. Working with a traditional BaM university in
the UK, we collected data from various university systems for 30,781
undergraduate students across three academic years commencing in Autumn 2013,
2014 and 2015. Tables 1, 2 and 3 below summarize the variables extracted to
operationalize engagement, demographic characteristics and learning outcomes
respectively. The systems from which the variables are extracted, as well as basic
descriptive statistics, are also presented. Our initial unit of analysis was the aggregate
of all interactions involving a specific student in a specific year, resulting in a dataset
of 52,553 records.
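
To make the aggregation step concrete, the sketch below shows how raw interaction events might be rolled up to the student-year unit of analysis described above. It is a minimal illustration assuming a hypothetical long-format event table; the column names and toy values are ours, not the university's actual schema.

```python
# A minimal sketch of the student-year aggregation, assuming a hypothetical
# long-format table of raw interaction events (one row per event).
import pandas as pd

events = pd.DataFrame({
    "student_id": [101, 101, 102, 102, 102],
    "academic_year": [2013, 2013, 2013, 2014, 2014],
    "system": ["VLE", "Library", "VLE", "VLE", "MACE"],
})

# Aggregate to the paper's unit of analysis: one record per student per year,
# with one engagement count per system plus a total interaction count.
student_year = (
    events.groupby(["student_id", "academic_year", "system"])
          .size()
          .unstack(fill_value=0)          # one column per system
          .assign(all_interactions=lambda d: d.sum(axis=1))
          .reset_index()
)
print(student_year)
```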
                       Table 1. Engagement variables and data sources

Variable                   System                                   Missing      Mean      St Dv
Number of attended         Career Events System. Events are            0          1.60      2.88
career events              optional and cover a wide range of
                           topics.
Number of signed-up        Career Events System                        0          1.99       3.34
career events
Proportion of career       Career Events System                        0          0.46       0.46
events signed-up to that
were attended
Number of logins           Virtual Learning Environment                0          79.72    68.83
                           (VLE)
Number of logins           Inter Library Loans (Library ILL).          0          0.00       0.63
                           Online access to borrow books from
                           other UK libraries.
Number of logins           Library. Online access to                   0          0.19       0.70
                           academic journals and e-
                           resources and manage library
                           resource loans.
Number of library fines    Library                                     0          0.12       0.54
paid
Number of logins           MACE (Module and Course                     0          1.17       1.45
                           Evaluation) system. Optional
                           quality questionnaires for students.
Number of submitted        MACE                                        0          0.83       1.07
evaluations
Number of logins           Exams archive system                        0           3.29     6.64
Number of past paper       Exams archive system                        0          15.35    28.20
views
Number of all              All systems (VLE, Library ILL,              0         100.66    87.66
interactions               Library, MACE, exams archive
                           system)
Number of committee        Students' Guild (buying tickets to          0          0.03       0.18
interactions               Guild events, holding positions on
                           volunteering project committees)
Number of enrolled         Registration system                         0          1.07       0.26
programs

                      Table 2. Demographics variables and data sources

Variable                      System             Missing     Type             Dominant
                                                                              category
Gender                        Registration           30      Categorical      Female 55.1%
                              system
Disability (type of           Registration           72      Categorical      No known
disability)                   system                                          disability 87.2%
National identity             Registration          5,700    Categorical      British 41.3%
                              system
Nationality                   Registration          1,584    Categorical      UK 69.8%
                              system
Country of domicile            Registration         56       Categorical       England 69%
                               system
Ethnicity                      Registration       1,595      Categorical       White 73.9%
                               system
Age at enrollment to the       Registration        265       Numerical         Mean: 19.80
university                     system
Age at the beginning of        Registration        256       Numerical         Mean: 21.04
the year                       system
Living away from home          Registration         0        Binary flag       Away: 72.6%
                               system
Parents' occupational          Registration       6,339      Categorical       Higher managerial
background                     system                                          23.2%

                           Table 3. Outcome variables and data sources

Variable                                        System              Missing        Mean      St Dv
Average number of attempts for all modules      Module               2,814         1.01       0.09
in a year                                       Assessment
Average results for all modules in a year,      Module                 8,106       49.80     21.31
normalized by credit weights (i.e. the          Assessment
summative ‘end of year result’)
Number of failures in all modules in a year     Module                7,697         0.15     0.84
                                                Assessment
Number of pass grades in all modules in a       Module                7,697         4.21     2.70
year                                            Assessment
Proportion of passes out of all passes and      Module                7,697         0.96     0.16
failures                                        Assessment
Number of results which were not agreed in      Module                     0        0.07     0.31
a year                                          Assessment
Number of agreed upon results in a year         Module                     0        6.84     2.77
                                                Assessment
Average gap between module result and its       Module                8,106        0.003     2.64
class average                                   Assessment


3 Findings
Here we present our two-step analysis. First, we present the significant pairwise
relations found between outcome variables and engagement or demographic
variables. Second, we present a multivariate model that attempts to predict student
success based on features found to be significantly correlated with outcome in the first step.


3.1 Pairwise Relations between Outcome, Demographics and Engagement
Since none of our outcome variables are normally distributed, we used Spearman's
rank correlation test to find significant relationships between them and any numeric
engagement variable. A quick exploration of the engagement variables shows that
VLE logins, Past Exam views, Library logins, MACE submissions and event
attendances follow a typical power-law distribution, as one might expect, where many
students use each individual system sparingly and few students use each system often.
When correlating outcome with categorical variables, we used the Mann-Whitney U
test and the Kruskal-Wallis H test. Due to space limitations, we show only significant
relations with the normalized result outcome variable, omitting significant relations
which were found to be very weak, as well as some of the post-hoc results.
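
As an illustration of this testing procedure, the sketch below applies the three tests with SciPy. The arrays are synthetic stand-ins for the real variables; only the choice of tests mirrors the paper.

```python
# Hedged sketch of the pairwise tests; the data are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
outcome = rng.normal(60, 10, size=300)        # normalized year result (placeholder)
vle_logins = rng.pareto(2.0, size=300)        # heavy-tailed engagement count
gender = rng.choice(["F", "M"], size=300)
ethnicity = rng.choice(["A", "B", "C"], size=300)

# Numeric engagement vs. outcome: Spearman's rank correlation,
# chosen because the outcome variables are not normally distributed.
rho, p_rho = stats.spearmanr(vle_logins, outcome)

# Binary categorical vs. outcome: Mann-Whitney U test.
u, p_u = stats.mannwhitneyu(outcome[gender == "F"], outcome[gender == "M"])

# Multi-category categorical vs. outcome: Kruskal-Wallis H test.
groups = [outcome[ethnicity == g] for g in np.unique(ethnicity)]
h, p_h = stats.kruskal(*groups)

print(f"rho={rho:.3f} (p={p_rho:.3g}); U={u:.0f} (p={p_u:.3g}); H={h:.2f} (p={p_h:.3g})")
```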

Table 4. Pairwise relations between the average result for all modules in a year (normalized
by credit weights) and demographic and engagement variables.

 Significant demographic                      Selected post-hoc results                      Engagement
 variables                                                                                   variables
 Gender U = 231120953.50**                    Female (Med = 60.16) > Male                  MACE logins
                                              (Med = 57.03)                                (r = 0.262)**
                                                                                           MACE submitted
                                                                                           evaluations
                                                                                           (r = 0.250)**
 Away from home U =                           Away (Med = 60.25) > Local
 152140073.00**                               (Med = 52.01)
 Disability H(10) = 168.02**                  Long-standing illness, mental
                                              health, mobility issues, learning
                                              difficulty > Information refused
 Is Disabled flag H(3) = 73.89**              Refused > No disability > Disability
 Country of domicile
 H(140)=1,554.98**
 Ethnicity H(18) = 627.97**                   White > Arab, Asian, Black
 National identity H(7)=360.69**
 Nationality H(187)=1,880.03**
 Parents' occupational                        All managers > All routine roles
 H(326) = 869.74**
*Correlation/difference is significant at the .05 level (two-tailed test); **significant at the .01
level or below (two-tailed test).


3.2 Multivariate Model to Predict Outcome out of Demographics and
     Engagement
For our regression model, we used 'logged' versions of some of the numeric
variables in an attempt to make them more normally distributed; while helpful, this
has not fully solved the problem of non-normality. As some students have not
accessed some of the systems at all, we applied the transformation x → log(x + 1),
which maps 0 to 0 and avoids the problem of taking log(0). We fitted a model to
predict the weighted average results for a year, resulting in F(11, 44425) = 512.7,
R² = 0.1126, adjusted R² = 0.1124, p < 2.2e-16, and residual standard error = 20.08.
The parameter estimates and significances are detailed in Table 5 below. For the
categorical variables in the table, the first level that appears in the data is assumed
to have a coefficient of 0 (e.g. "Female" acts as the baseline in our model), and the
other levels of that category are assigned coefficients whose significance is
then determined.
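
The sketch below illustrates this modelling step: the log(x + 1) transform and treatment coding of categoricals, where the first level serves as the zero-coefficient baseline. It uses synthetic data and the statsmodels formula API as an assumed tool, not the authors' actual pipeline.

```python
# Hedged sketch of the regression step on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "result": rng.normal(55, 20, n),              # weighted average year result
    "gender": rng.choice(["Female", "Male"], n),  # "Female" becomes the baseline level
    "age": rng.integers(18, 30, n),
    "vle_logins": rng.poisson(80, n),             # count variables, zero for non-users
    "mace_logins": rng.poisson(1, n),
})

# x -> log(x + 1) maps zero usage to zero and avoids log(0).
df["log_vle"] = np.log1p(df["vle_logins"])
df["log_mace"] = np.log1p(df["mace_logins"])

# C(gender) is treatment-coded: the first level gets an implicit coefficient of 0.
model = smf.ols("result ~ C(gender) + age + log_vle + log_mace", data=df).fit()
print(model.summary())
```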
 Table 5. Multivariate model parameters and significances (engagement variables are in bold)

Coefficients                      Estimate         Std. Error           t value              p-value
(Intercept)                         52.446              0.823           63.751               < 2e-16 **
Gender (Male)                       -0.239            0.19286           -1.242               0.2143
Age at beginning of year            -0.328              0.035           -9.399               < 2e-16 **
Away from home                       5.973              0.224           26.616               < 2e-16 **
Disability Type (Unknown)            2.107              1.936            1.088               0.2764
Is Disabled (Yes)                   -2.834              0.284           -9.978               < 2e-16 **
log(events attended + 1)             2.832              0.132           21.495               < 2e-16 **
Committee interactions               8.914              0.489           18.215               < 2e-16 **
log(VLE + 1)                        -1.944              0.085          -22.870               < 2e-16 **
log(Past exams + 1)                  0.130               0.067           1.937               0.0528 *
log(Library logins + 1)              4.842               0.302          16.009               < 2e-16 **
log(MACE + 1)                        9.224               0.187          49.364               < 2e-16 **
*Coefficient is significantly different from 0 at the .1 level. **Coefficient is significantly different from 0
at the < 2e-16 level or below.

         Our model appears to struggle with the low number of high scores in the
dataset. Generally, we find that being male and older is associated with lower
assessment results, as is living at home and having a disability. It also appears that
being 'more engaged' is beneficial, except that logging onto the VLE frequently
could be a disadvantage for the overall result.


4 Conclusions
One of the major challenges of learning analytics in a BaM setting is the need to
integrate analytics across different spaces and tools. In this study, we describe initial
steps into exploring the relationship between learning outcome and engagement
variables, where measures about engagement are integrated from students'
interactions with a variety of systems and services, both physical and digital. In addition,
we added demographic variables so that finer-grained cohorts can easily be identified
for further analysis. Following the collection and integration phase, we have presented
a regression model predicting the aggregate score of all module grades
at the end of the year. Our model shows the predictive value of demographic
variables such as age, disability and living away from home, along with engagement
variables reflecting interactions with some of the university's systems and services,
partially supporting existing evidence of the relation between engagement and
outcome. Interestingly, most of the significant estimates related to systems
that are not directly connected to learning, but rather to a wider framework of
interactions held between students and the university facilities, such as career events,
committee activities and quality questionnaires. Moreover, interactions with the VLE,
the digital system which coordinates most learning activities, were shown to be
negatively correlated with module grades. Given that the VLE, as well as library
resources, are not used uniformly across all modules, this finding requires further
investigation.


4.1 Limitations and Future Work
A problematic aspect of utilizing our findings is the observed range of residuals in
the modelling exercise. This suggests, for example, that using a predictive
modelling technique to identify failing students is unlikely to be effective, at least
when using a student-per-year timescale and observing how an individual student's
performance changes over time. We could hopefully explain more variance by
reducing this to a termly timescale. Module assessments are usually derived directly
from different learning objectives and designs [13]. This variation could further cause
a differentiation in the dependencies on different systems. The weight of individual
module assessments in the total aggregate score also varies. Thus, a more predictive
model could result from analyzing students enrolled in a specific module, course or
programme. Age was shown to negatively affect the outcome. In addition to the
relations reported above, when exploring secondary relations among engagement and
demographic variables, age's negative relations with some engagement variables
(such as all interactions with digital systems and attendance at career events) suggest
an explanation for this negative effect and are subject to further analysis. In addition,
some positive correlations among the engagement variables themselves (such as VLE
logins, MACE, the exams archive and all digital interactions) support the notion that
"students who do stuff also do more stuff".1

1 http://blogs.edweek.org/edweek/edtechresearcher/2014/03/big_data_mooc_research_breakthrough_learning_activities_lead_to_achievement.html

Traditional approaches within the teaching-centered paradigm are usually measured by
summative scores. Nonetheless, an educational institution's definition of learning
outcomes, as well as the subjective expectations each student adopts, do not
necessarily align with the summative measurements taken. Some students are after
First Class Honors, while others aim simply to complete the course or find a
decent career. Therefore, our goal should be to enable a flexible, multi-dimensional,
possibly subjective framework for 'learning outcomes', and to seek relations between
various dimensions of engagement, demographics and various dimensions of learning
outcome. For example, adding data from surveys (or other sources) could broaden the
current limits of our data by complementing it with students' self-perceived
interactions, data about their face-to-face interactions with each other or with their
instructors, informal interactions (such as interacting over social media), their own
perceptions of what constitutes their 'learning outcomes', and more. In addition, some
of our current variables are too coarse. For example, adding finer-grained VLE
activities (such as posting on a bulletin board or downloading material) and separating
career event attendance data by event type are crucial steps that can benefit our
overall understanding of students' engagement.


5 References
[1] Adeyinka, T., & Abdulmumin, I. (2011). Pattern of undergraduates' participation
      in the online discussion forum at the University of Ilorin, Nigeria. Management,
      Journal of Information Technology, 12(3), 59–76.
[2] Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an
      online and blended community of inquiry: Assessing outcomes and processes
      for deep approaches to learning. British Journal of Educational Technology,
      42(2), 233–250. http://doi.org/10.1111/j.1467-8535.2009.01029.x
[3] Andresen, M. A. (2009). Asynchronous discussion forums: success factors,
      outcomes, assessments, and limitations. Educational Technology & Society,
      12(1), 249–257.
[4] Asterhan, C. S. C., & Hever, R. (2015). Learning from reading argumentive group
      discussions in Facebook: Rhetoric style matters (again). Computers in Human
      Behavior, 53, 570-576. http://doi.org/10.1016/j.chb.2015.05.020
[5] Axelson, R. D., & Flick, A. (2010). Defining Student Engagement. Change: The
      Magazine of Higher Learning, 43(1), 38–43.
      http://doi.org/10.1080/00091383.2011.533096
[6] Brooks, C., Thompson, C., & Teasley, S. (2015). A time series interaction analysis
      method for building predictive models of learners using log data. Proceedings
      of the Fifth International Conference on Learning Analytics and Knowledge -
      LAK ’15, 126–135. http://doi.org/10.1145/2723576.2723581
[7] Cerezo, R., Sánchez-Santillán, M., Paule-Ruiz, M. P., & Núñez, J. C. (2016).
      Students’ LMS interaction patterns and their relationship with achievement: A
      case study in higher education. Computers & Education, 96, 42–54.
[8] Deboer, J., Ho, A. D., Stump, G. S., & Breslow, L. (2014). Changing “Course”:
      Reconceptualizing Educational Variables for Massive Open Online Courses.
      Educational Researcher, http://doi.org/10.3102/0013189X14523038
[9] Garrison, D. R. (2006). Online collaboration principles. Journal of Asynchronous
      Learning Networks, 10(1), 25–34. Retrieved from
      http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.96.4536&rep=rep1&t
      ype=pdf
[10] Harris, R., & Nikitenko, G. (2014). Comparing online with brick and mortar
      course learning outcomes An analysis of quantitative methods curriculum in
      public administration. Teaching Public Administration, 32(1), 95–107.
      http://doi.org/10.1177/0144739414523284
[11] Kuzilek, J., Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU
      analyse: Analysing at-risk students at The Open University. Learning Analytics
      Review, 1-16.
[12] McConnell, D. (2000). Implementing computer supported cooperative learning.
      London: Kogan Page.
[13] Millar, R. (2013). Improving science education: Why assessment matters. Valu-
      ing Assessment in Science Education: Pedagogy, Curriculum, Policy, 55–68.
[14] Moore, M. G. (1989). Editorial: Three types of interaction. The American
      Journal of Distance Education, 3(2), 1–6.
[15] Onyesolu, M. O., Nwasor, V. C., Ositanwosu, O. E., & Iwegbuna, O. N. (2013).
      Pedagogy: Instructivism to Socio-Constructivism through Virtual Reality.
      International Journal of Advanced Computer Science and Applications, 4(9),
      40–47. Retrieved from http://ijacsa.thesai.org/
[16] Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction,
      presence, and performance in an online course. Journal of Asynchronous
      Learning Network, 6(1), 21–40.
[17] Prince, M. (2004). Does Active Learning Work? A Review of the Research.
      Journal of Engineering Education, 93(July), 223–231.
      http://doi.org/10.1002/j.2168-9830.2004.tb00809.x
[18] Reyes, J. A. (2015). The skinny on big data in education: Learning analytics
      simplified. TechTrends, 59(2), 75–80. http://doi.org/10.1007/s11528-015-0842-
      1
[19] Shavelson, R. J., & Huang, L. (2003). Responding Responsibly. Change: The
      Magazine of Higher Learning, 35(1), 10–19.
      http://doi.org/10.1080/00091380309604739
[20] Song, L., & McNary, S. W. (2011). Understanding students’ online interaction:
      Analysis of discussion board postings. Journal of Interactive Online Learning,
      10(1), 1–14.
[21] Wei, H., Peng, H., & Chou, C. (2015). Can more interactivity improve learning
      achievement in an online course? Effects of college students’ perception and
      actual use of a course-management system on their learning achievement.
      Computers & Education, 83, 10–21.
      http://doi.org/10.1016/j.compedu.2014.12.013
[22] Zhu, E. (2006). Interaction and cognitive engagement: An analysis of four
      asynchronous online discussions. Instructional Science, 34(6), 451–480.
      http://doi.org/10.1007/s11251-006-0004-0
Acknowledgments

This research was supported by the Effective Learning Analytics Project at the
University of Exeter, a project which aims to help students reach their full academic
potential while studying at Exeter.