=Paper=
{{Paper
|id=Vol-1925/paper07
|storemode=property
|title=Study of the Flexibility of a Learning Analytics Tool to Evaluate Teamwork Competence Acquisition in Different Contexts
|pdfUrl=https://ceur-ws.org/Vol-1925/paper07.pdf
|volume=Vol-1925
|authors=Miguel Ángel Conde,Francisco J. García-Peñalvo,Ángel Fidalgo-Blanco,María Luisa Sein-Echaluce
|dblpUrl=https://dblp.org/rec/conf/lasi-spain/CondeGBS17
}}
==Study of the Flexibility of a Learning Analytics Tool to Evaluate Teamwork Competence Acquisition in Different Contexts==
Miguel Ángel Conde¹, Francisco J. García-Peñalvo², Ángel Fidalgo-Blanco³, María Luisa Sein-Echaluce⁴

¹ Department of Mechanics, Computer Science and Aerospace Engineering, Robotics Group, University of León, Campus de Vegazana S/N, 24071 León, Spain
miguel.conde@unileon.es
² Computer Science Department, Research Institute for Educational Sciences, GRIAL research group, University of Salamanca, Faculty of Science, Plaza de los Caídos S/N, 37008 Salamanca, Spain
fgarcia@usal.es
³ Laboratory of Innovation in Information Technologies, Universidad Politécnica de Madrid, Calle de Ríos Rosas 21, 28003 Madrid, Spain
angel.fidalgo@upm.es
⁴ Department of Applied Mathematics, University of Zaragoza, Campus Río Ebro, Calle de María de Luna 3, 50018 Zaragoza, Spain
mlsein@unizar.es
Abstract. Learning analytics tools and methodologies aim to provide teachers and/or decision makers with information and knowledge about what is happening in virtual learning environments in a straightforward and effortless way. However, these tools and methodologies need to work in different contexts with similar success; that is, they should be flexible and portable enough. Several existing learning analytics tools only work properly with very specific versions of learning platforms. In this paper, the authors evaluate the flexibility and portability of a methodology and a learning analytics tool that support the individual assessment of teamwork competence. To do so, the methodology and the tool are applied to a similar course in two different academic contexts. After the experiment, the learning analytics tool seems to work properly, and the new functionalities suggested by users are similar in both contexts. The methodology can also be applied, but results could be improved if meetings are held to check how teams are progressing with their tasks.
Keywords: Learning Analytics tool, teamwork competence, participation, forums
Copyright © 2017 for the individual papers by the papers' authors. Copying permitted for private and academic purposes. This volume is published and copyrighted by its editors.
1 Introduction
Nowadays we live in a digital society. Many of our daily activities are mediated by technology. We use Information and Communications Technology (ICT) anywhere, anytime, and for different purposes: to work, to access information, to play games, to listen to music or watch films, to learn, to interact with others, etc. For most of these activities the technology is also recording information about what we are doing (not always with the user's awareness or consent), and this information can later be analyzed to support decision making [1].
When talking about the application of ICT in learning contexts, and from a formal learning perspective, most educational institutions provide students with tools such as Virtual Learning Environments (VLE) and/or Learning Management Systems (LMS) [2, 3]. These platforms offer spaces and tools that extend and support the traditional concept of a class. They are mostly centered on helping teachers, given their emphasis on facilitating the administrative and management work related to learning (document management, automated questionnaire correction, discussion spaces, etc.) [4]. For students, they constitute spaces where they can carry out their course activities or complement their classes. For these reasons, VLEs and LMSs have been widely accepted both by educational institutions [2, 5, 6] and by businesses [7].
These platforms generate a great amount of information, and dealing with it and extracting useful knowledge from that data is not easy. It is necessary to apply methodologies and tools that provide knowledge about students' effort and competence development, how resources are being used, the moments of highest activity in the platform, the impact of specific contents on students' performance, teachers' performance, etc. [8, 9]. Such methodologies and tools are provided by what is known as Learning Analytics and by related disciplines such as Educational Data Mining and Academic Analytics.
Learning analytics is a research field devoted to understanding how learning takes place online [10]. It is becoming an essential tool to inform and support learners, teachers and their institutions in better understanding and predicting personal learning needs and performance [11]. According to [12], Learning Analytics is the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs". The final goal of learning analytics is to improve learning via the interpretation and contextualization of educational data [13].
Given this context and the different possibilities provided by learning analytics, we should state the problem we aim to address: the individual assessment of teamwork competence (TWC). The development of this competence is highly demanded by employers [14, 15] and is supported both by policy makers [16] and by higher education institutions [17]. But why is TWC so appreciated? Because: 1) teamwork involves sharing information and discussing among students to build mental models cooperatively, ultimately improving students' learning [18, 19]; 2) companies expect prospective employees to have developed TWC, because members of an organization work together in groups to achieve common goals [20]; 3) the application of the Bologna process positions TWC as a key competence that students should develop in Higher Education.
Learning platforms may provide us with evidence about students' development of teamwork competence. However, evaluating the acquisition of this competence from such data requires a great deal of time, so we need two things: a methodology to assess TWC development and a learning analytics tool [21].
Regarding the former, in this research work we have used CTMTC (Comprehensive Training Model of the Teamwork Competence). It explores the group results and how each individual has acquired the competence. The methodology relies on the analysis of learning evidence from data generated by student teams' use of IT-based learning tools during project development [4]. Moreover, applying CTMTC entails that teams develop the project in several stages adapted from the International Project Management Association (IPMA) [22].
In relation to the latter, that is, the learning analytics tool, several options exist. It is possible to use some of the existing cross-platform and platform-specific general-purpose dashboards (Moodle Dashboard, Google Analytics, etc.) or learning analytics frameworks (GISMO, VeLA), but these are not adapted to the methodology and would require extra effort from teachers. In this sense it is better to develop an ad hoc tool [23]. For CTMTC, an ad hoc learning analytics tool was developed and successfully applied in several experiments, but in specific environments [21, 24, 25]. For instance, it was used in 7 different courses at the University of León [26].
In this project, we aim to examine the differences between the application of this tool in two different universities, in two courses with similar aims, contents and students. From the experiment, our goal is to learn how flexible the CTMTC methodology and the learning analytics tool are. We also aim to check what happens to students' grades when we schedule on-going interviews during the application of the methodology to review groups' progress, compared with not holding them.
To achieve this, the paper is structured as follows. In the next section, we introduce the CTMTC methodology and briefly describe the tool. In Section 3 we present the experiments carried out. Section 4 shows the results obtained, which are discussed in the following section. Finally, some conclusions are posed.
2 Background of the research
In this section, we describe the CTMTC methodology and the learning analytics tool to
facilitate understanding the experiment.
2.1 CTMTC Methodology
The CTMTC method [27, 28] focuses on TWC components such as leader behaviour, cooperation between peers, problems between team members and the performance of each member. It takes into account the group results and how each individual has acquired the competence.
CTMTC is conceptually based on the phases described by Bruce W. Tuckman [29] and used by AEIPRO-IPMA [30] and MIT [31] as a helpful framework for recognizing a team's behavioral patterns and assessing the development of teamwork competence. The defined stages are: Forming, Storming, Norming and Performing.
For each stage, CTMTC defines a set of individual evidence that each group member should produce. This evidence is generated through the use of web-based learning tools during project development and can be exploited by a learning analytics tool [32]. The evidence evaluated for each of the stages is the following:
• Forming. This phase consists of the definition of the working team, which can be defined by the teachers, by the students, or automatically depending on the students' profiles. The evidence in this case is the team as defined.
• Storming. It consists of the definition of the mission, goals, target audience, purpose and the reason to develop the work. In addition, it also requires the definition of responsibilities for each team member. The evidence in this case is the description of this information about the project development.
• Norming. It is based on the definition of a set of norms to be applied by the team members in order to develop the project. The evidence is the set of norms and the interactions to define it.
• Performing. In this stage, each team should define a tracking map to know when each member has completed a task. This includes the distribution of tasks, scheduling, and the definition of milestones and indicators to know when they are achieved. These elements can be used as evidence, as can the interactions required to define them.
• Final process. This stage is not included in Tuckman's stages; however, it is widely used in academic contexts. It consists of the final outcomes of the project.
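The stage/evidence model above can be captured as a simple data structure. The following is only a sketch: the stage names come from the text, but the evidence labels are illustrative summaries, not the authors' schema, and the completeness check is a hypothetical helper.

```python
# Sketch of the CTMTC stage model described above. Stage names follow the
# text; the evidence labels are illustrative, not the authors' exact schema.
CTMTC_STAGES = {
    "Forming": ["team definition"],
    "Storming": ["mission, goals, audience and purpose",
                 "responsibilities per member"],
    "Norming": ["set of norms", "interactions defining the norms"],
    "Performing": ["tracking map (tasks, schedule, milestones, indicators)",
                   "interactions defining the tracking map"],
    "Final process": ["final outcomes of the project"],
}

def missing_evidence(collected):
    """Return the stages for which a team has not yet produced any evidence.

    `collected` maps stage name -> list of evidence items found for a team.
    """
    return [stage for stage in CTMTC_STAGES if not collected.get(stage)]
```

For example, a team that has only formed would still owe evidence for the four remaining stages.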
However, CTMTC and similar methodologies are not, on their own, completely effective. The reason is that monitoring individual evidence in the teamwork and evaluating its performance requires a great deal of time from the teaching staff (the effort is multiplied by the number of students), because monitoring and assessment (formative and summative) of the individual evidence require a qualitative analysis of all of the interactions in the forum (what students say, how they say it, and when they say it) [21].
2.2 The Learning Analytics tool
In order to facilitate the application of CTMTC, an ad hoc Learning Analytics tool was developed. This tool aims to facilitate access to the information stored in Moodle logs. This information is used to analyze the evidence required for each of the stages described above. It should be noted that the tool does not focus on what we defined as group evidence, which can be checked by assessing the results published in the wiki, but on exploring the students' interactions carried out to achieve those results.
In order to describe the CTMTC learning analytics tool it is necessary to explore two issues: how the tool was implemented and its functionalities.
Regarding the implementation, it is necessary to take into account that the tool is intended to access the students' records in the LMS. This feature could be articulated in several ways: 1) direct access to the database; 2) definition of a standard extension or plug-in for the LMS; 3) use of web services.
The first of these options is limited by the version of the LMS; that is, if there is a change in the database, changes would also be necessary in the tool. The second option would tie the development to a specific LMS, which would limit the flexibility and portability of the tool. Given these facts we decided to use web services. The use of web services ensures, amongst other things, that the solutions defined are independent of the underlying implementation [33], which solves the problems previously mentioned.
Once this was decided, the Moodle web service layer was used to access information, and some additional functions were added to the Moodle External Layer so that the logs could be accessible. This was necessary because Moodle did not expose the information we needed through its web services. In addition, the definition of a web service client was necessary in order to access the information without logging into Moodle. More information about the connection of the tool to Moodle, the changes made in the Moodle external layer, and the client can be found in [21, 32].
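A minimal sketch of how such a client might call Moodle's REST web service endpoint. The host and token are placeholders, and the custom log-access functions the authors added to the external layer are not public, so a standard Moodle function (`core_course_get_courses`) is used here purely for illustration.

```python
import json
import urllib.parse
import urllib.request

MOODLE_URL = "https://moodle.example.org"  # hypothetical host
TOKEN = "REPLACE_WITH_WS_TOKEN"            # web-service token (placeholder)

def build_ws_request(function, **params):
    """Build the URL for a Moodle REST web-service call.

    Moodle's REST server expects wstoken, wsfunction and
    moodlewsrestformat alongside the function's own parameters.
    """
    query = {"wstoken": TOKEN,
             "wsfunction": function,
             "moodlewsrestformat": "json"}
    query.update(params)
    return MOODLE_URL + "/webservice/rest/server.php?" + urllib.parse.urlencode(query)

def call_ws(function, **params):
    """Invoke the web service and decode the JSON reply (requires a live Moodle)."""
    with urllib.request.urlopen(build_ws_request(function, **params)) as resp:
        return json.load(resp)

# Example (needs a Moodle instance with web services enabled):
# courses = call_ws("core_course_get_courses")
```

Because the client authenticates with a token rather than a session, no interactive login into Moodle is needed, which matches the design choice described above.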
With regard to the functionalities of the learning analytics tool, it is necessary to explore the information that the web services client provides to the user and how it is represented. Fig. 1 shows the different navigational contexts. These are:
• Courses context. When users access the client they see a list of the available courses in the explored LMS; the name of each course is a link that leads to the forums view.
• Forums context. It includes a list of the existing forums in the course. Information about each of them can be obtained by clicking on the forum name, which leads the user to the groups context. It is also possible to navigate back to the previous context.
• Groups context. For a specific forum, it provides information about the number of posts, users, and posts per user. Moreover, it includes two lists: one with information about groups and another with information about the students involved in the forum. The first list includes the name of each group with a link to the discussion threads context, the number of messages of each group (and its percentage of the total), information about the number of long and short messages, and the number of students in each group. The second list shows the same information per student.
• Discussion threads context. Clicking on a group gives access to this context. In it the user can see general information about the group messages (short, long, total number, first and last post author and date) and two lists. The first has information about the discussions for this group and forum; for each one, it is possible to see when it was created and the distribution of messages and views among the students in the thread. The thread name can be clicked, which leads the user to the posts info context. The second list includes information about the students of the group. Fig. 2 shows an example of the discussion threads context.
• Posts info context. It includes general information about the messages of a specific thread (short, long, total number, first and last post author and date), a list, and a form. The list shows the students involved in the discussion and information about the messages and views for each of them. The form allows defining date ranges to see which messages were posted outside those dates.
Fig. 1. OOWS Navigation Map [34] for the Learning Analytic tool
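The per-student counters displayed in these contexts can be derived from the raw post list with a simple aggregation. The sketch below assumes a 200-character threshold to distinguish "long" from "short" messages; the tool's actual criterion is not stated in the text.

```python
from collections import defaultdict

LONG_THRESHOLD = 200  # characters; illustrative threshold, not the tool's actual one

def per_student_stats(posts):
    """Aggregate forum posts into per-student counters like those the tool shows.

    `posts` is an iterable of (author, message_text) pairs.
    Returns {author: {"posts": n, "long": n, "short": n}}.
    """
    stats = defaultdict(lambda: {"posts": 0, "long": 0, "short": 0})
    for author, text in posts:
        s = stats[author]
        s["posts"] += 1
        s["long" if len(text) >= LONG_THRESHOLD else "short"] += 1
    return dict(stats)
```

Group-level figures (messages per group, percentage of the total) follow by summing these counters over each group's members.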
3 The experiment
The tool and the methodology have been tested in different contexts, as described in the introduction. However, those experiments were carried out within a single educational institution, in several cases across different courses. From those experiments, it was possible to say that the methodology and the tool can be easily adapted to the context and particular features of a course. In this work, our idea was to compare what happens when we apply the methodology and the tool in similar courses at different institutions. In this section we describe how this was carried out.
Fig. 2. Group context view of the Learning Analytic Tool
3.1 Context of the experiment
The experiment was carried out over two different courses, one from the University of León and the other from the Universidad Politécnica de Madrid, both in Spain. These courses were:
• S1. Informatics. This is a first-year course of the Bachelor's degree in Electronic Engineering at the University of León, with 70 students. In this course students learn programming concepts using the C language. The course has an intermediate assignment to which CTMTC is applied; this assignment has a weight of 24% of the final grade. Although the choice of team members and coordinators is open, each group must choose one of the three possible topics for the work. Groups have 3 or 4 members, who use the LMS forums to interact with each other; additionally, some of the students also use instant messaging tools such as WhatsApp. Each group publishes its partial outcomes in the LMS wiki and delivers its final outcome using the Moodle LMS assignment block.
• S2. Informatics and Programming. This is a first-year course of the Bachelor's degree in Energy Engineering at the Universidad Politécnica de Madrid, with 186 students enrolled. In it, students learn the fundamentals of algorithms and programs. The course has an assignment to which CTMTC is applied, with a weight of 15% of the final grade. Students could choose a team up to a deadline; after that, teachers defined groups with the unassigned students. Team members choose their leader. Groups have from 5 up to 7 members, who use the LMS forums to interact with each other. Each group publishes its partial outcomes in the LMS wiki, publishes its final outcome on a website, and produces a video presentation of its work.
For the assessment of the results a rubric described in [32] is used. It explores both
individual and group outcomes.
3.2 The method
In order to explore the possible differences between the applications of CTMTC and the Learning Analytics tool, we decided to use a mixed methodology [35] that consists of a quantitative and a qualitative analysis.
First, quantitative data from the application of CTMTC and the learning analytics tool are compared between both case studies. We check the participation, the grades for individual and group work, and the number of posts/discussions per student. This information can be seen in Tables 1 and 2.
Next, two satisfaction questionnaires were administered. They collect teachers' and students' perceptions. Teachers' perceptions concern the learning analytics tool and the methodology, while students' perceptions concern only the methodology, because students did not interact with the learning analytics tool. The information gathered through the satisfaction questionnaires is analysed following a qualitative methodology. The qualitative analysis consists of an examination of the text of the responses given by participants [36]. This procedure includes grouping responses based on topic-proximity criteria for both courses involved. After classification, we combined the results in a matrix in order to extract conclusions. We had 8 teachers involved in the experiment, and the matrix of their perceptions can be seen in Table 3. On the other hand, more than 250 students were involved. A matrix with 250 rows is quite difficult to read, so for this analysis we have taken a sample of 30 students (15 per course) with the most relevant results (Table 4).
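The topic-proximity grouping and the resulting course-by-category matrix can be sketched programmatically. In the study the coding was done manually by the researchers; the keyword lists below are purely illustrative stand-ins for that manual judgment.

```python
from collections import Counter

# Illustrative keyword map; the study's actual categories were assigned manually.
CATEGORIES = {
    "LA tool": ["tool", "interface", "warning", "information"],
    "methodology": ["ctmtc", "assess", "teamwork", "groups"],
    "problems": ["problem", "lack", "motivation", "forum"],
}

def classify(response):
    """Assign a free-text response to the first category whose keyword it mentions."""
    text = response.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in text for k in keywords):
            return category
    return "other"

def build_matrix(responses):
    """Count responses per (course, category), e.g. {("S1", "problems"): 2}.

    `responses` is an iterable of (course, free_text) pairs.
    """
    return Counter((course, classify(text)) for course, text in responses)
```

The resulting counter plays the role of the matrix used to compare both courses before extracting conclusions.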
4 Results
The results are shown in this section following the methodology mentioned above. Firstly, Table 1 presents general information about the students involved in the experience and their actions. It shows that there is more participation in S2 than in S1, as well as a higher number of interactions per student. There are also more groups in S2 than in S1.
Table 1. Information about participation, activity and number of groups
|    | Number of Students | Average actions/user | Number of Groups |
|----|--------------------|----------------------|------------------|
| S1 | 64/70 (91.42%)     | 607.5                | 23               |
| S2 | 177/186 (95.16%)   | 645.2                | 32               |
Without the use of the learning analytics tool, a manual inspection of each group's activity takes between 40 minutes and 1 hour (this time does not include assessment) [24]. This would mean between 15 and 23 hours to check S1 and between 21 and 32 hours for S2. By applying the learning analytics tool, 12 minutes were needed per group; that is, around four and a half hours for S1 and six and a half hours for S2.
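These workload estimates are just minutes-per-group multiplied by the number of groups; a quick check of the arithmetic:

```python
def review_hours(groups, minutes_per_group):
    """Total review time in hours for a cohort of team groups."""
    return groups * minutes_per_group / 60

# Manual inspection, 40-60 minutes per group:
#   S1 (23 groups): 15.3 to 23 hours; S2 (32 groups): 21.3 to 32 hours.
# With the learning analytics tool, ~12 minutes per group:
#   S1: ~4.6 hours; S2: ~6.4 hours.
```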
Table 2 shows the results attending to number of posts, average individual grade and
average group grade.
Table 2. Information about CTMTC methodology application
|    | Posts/User | Average Group Grade | Average Individual Grade |
|----|------------|---------------------|--------------------------|
| S1 | 16.2       | 7.08                | 6.80                     |
| S2 | 25.5       | 8.26                | 8.56                     |
The results in Tables 1 and 2 were obtained from the information gathered by the learning analytics tool.
The information about teachers' satisfaction with the tool and the methodology can be seen in Table 3. In this case, the categories chosen to group the terms of the open questions were the LA tool, the methodology, and the problems found with both.
Table 3. Teachers' perceptions about the methodology and tools

|       | LA Tool | Methodology | Problems |
|-------|---------|-------------|----------|
| S1 T1 | Cool | Some students did not apply it | Lack of interest of the students |
| S1 T2 | More information | Allows us to objectively measure individual TWC | None |
| S1 T3 | Time saving | Students do not understand how important interaction with their peers is | Access through the tool to specific information about one student |
| S1 T4 | Very useful, although the interface should be improved | Students do not like to use Moodle forums | Warnings about students' application of CTMTC |
| S2 T5 | Check CTMTC indicators effortlessly | Students learn to work in groups | Include WhatsApp analysis |
| S2 T6 | It can be improved with a warning system | It allows students to know how to deal with real projects | Individual information in the tool |
| S2 T7 | All the information at a glance | Something to assess what each one does in a group | None |
| S2 T8 | Include leadership indicators | It allows TWC development | None |
Table 4. Students' perceptions about advantages and problems of CTMTC and the tools used to apply it

|         | Advantages | Problems | Tools |
|---------|------------|----------|-------|
| S1 ST1  | None | Problems with other group members (distribution of tasks) | None |
| S1 ST2  | Planning and deadlines | Involvement of others | WhatsApp |
| S1 ST3  | Organization improvement | Randomly defined groups | None |
| S1 ST4  | Work as a team | Distribution of tasks | None |
| S1 ST5  | Work organization | Lack of interest of peers | Instant messaging |
| S1 ST6  | Distribute work to achieve our goals | Problems with coordination to integrate the parts | None |
| S1 ST7  | Deadlines, work distribution, work together | None | None |
| S1 ST8  | Leadership, agile methodology, tracking tools | Coordination problems | Dropbox |
| S1 ST9  | Good distribution of tasks | Integration is not always easy | None |
| S1 ST10 | We work best as a team | None | WhatsApp |
| S1 ST11 | Working together and having my work assessed | Coordination | None |
| S1 ST12 | Work with peers | Communication tools | WhatsApp |
| S1 ST13 | Goals and deadlines | None | None |
| S1 ST14 | Better planning | Work completion | None |
| S1 ST15 | Collaboration with peers | Necessity of using the forum | Trello |
| S2 ST1  | Organization | None | Redmine |
| S2 ST2  | Coordination | Communication is not straightforward | WhatsApp |
| S2 ST3  | Planning and scheduling | Completing your tasks | Tools for scheduling |
| S2 ST4  | Task distribution | None | None |
| S2 ST5  | None | Maintaining motivation | Skype |
| S2 ST6  | Working together | Dealing with team members' capabilities | WhatsApp |
| S2 ST7  | Making decisions as a group | None | None |
| S2 ST8  | Distributed leadership | Coordination problems | Video editing tools |
| S2 ST9  | Dialogue to find solutions | Communication | Skype, WhatsApp |
| S2 ST10 | Constructive criticism | Moodle forums | WhatsApp |
| S2 ST11 | Distribution of tasks | Involvement of peers | None |
| S2 ST12 | Work organization | Discussion with the others | WhatsApp |
| S2 ST13 | Improvement in problem solving | Deadlines stress team members | None |
| S2 ST14 | Improve our work | None | None |
| S2 ST15 | Supporting others' work | Tracking what other members have done is not easy | Version control system |
5 Discussion and conclusions
During the experiment it was possible to explore two different issues: results related to the application of the methodology, which can easily be analyzed and compared thanks to the Learning Analytics tool, and perceptions about the tool and the methodology.
Regarding the first issue, the learning analytics tool provides us with information about the students' and groups' interactions while applying CTMTC (number of messages per student, short messages, long messages, messages per group, distribution of the messages between team members, number of views, etc.). Taking into account that such indicators have been shown to be related to students' performance [21, 37], and combined with the application of a rubric [32], it is possible to obtain the individual and group grades for the tasks carried out during the project.
For the present experiment and the data shown in Table 1, there is a slightly higher participation in S2 than in S1, along with a higher number of interactions per user. This may be because S2 is in its third year of applying the methodology, with good results, while for S1 this is the first edition. However, there is also an interesting difference between the number of messages posted by students of S1 and S2. This is usually an indicator of students' performance, and indeed both individual and group grades are better for S2 students than for S1 students. This difference may be explained by the fact that in S1 there were no checking meetings to review what groups were doing, so no corrective interventions could be applied, while in S2 two such meetings were held. This means that applying the methodology does not only require a good description of what to do, but also checking groups' progress during its application.
We should also consider the time necessary to check each group's activity: between 40 minutes and 1 hour without the tool, and around 12 minutes with it. That is a saving of around 75% of the time when using the learning analytics tool.
Regarding teachers' perception of the tool, all of them find it useful and consider that it helps them save time when checking the learning evidence. There are several suggestions to improve the tool: improving the interface (which is quite simple because the tool was implemented as a proof of concept); including a warning system that lets teachers know whether the methodology is being applied properly and whether teams meet deadlines; and providing access to specific information about individual students' actions. With regard to the methodology, they also seem to be happy, because it allows them to assess the individual acquisition of TWC in an objective way and helps the students to deal with complex projects in their courses. Finally, some of the teachers of S1 complain about their students' lack of motivation with the activity and note that students have problems using Moodle forums as the main interaction tool.
Attending to students' perceptions, it should be noted that most of them are happy dealing with a complex project, working as a group, distributing the effort, learning how to plan and schedule the tasks, etc. That is, they highlight advantages related to teamwork behaviours, as described in other works on teamwork behaviour [26, 38]. Regarding the problems, several students do not find any, but others report problems with the distribution of work, the completion of tasks by their peers, and their peers' involvement. Moreover, several of them are not happy with the use of the Moodle forum for interaction. They suggest the use of instant messaging tools such as Skype or WhatsApp, and tools to manage projects such as Redmine and/or a version control system. It is interesting to see that the opinions gathered for S1 and S2 are quite similar.
After this experiment, we can conclude that the learning analytics tool and the meth-
odology are flexible enough to be applied in different academic contexts.
The tool can be improved by including more information about students, developing a friendlier interface, including information from instant messaging tools, and providing a warning system for teachers; these improvements are going to be addressed as future research lines.
We have also seen that the methodology can be applied successfully in different contexts, but that it is not enough to provide students with content describing the methodology and to explain it to them; they also need teachers to check how they are progressing during its application and to define corrective actions if needed. This is a lesson learned for future applications of the methodology.
6 References
1. Conde, M.Á., Hernández-García, Á.: Learning analytics for educational decision making.
Computers in Human Behavior 47, 1-3 (2015)
2. Browne, T., Hewitt, R., Jenkins, M., Voce, J., Walker, R., Yip, H.: Survey of Technology
Enhanced Learning for higher education in the UK. UCISA - Universities and Colleges
Information Systems Association
http://www.ucisa.ac.uk/groups/ssg/~/media/groups/ssg/surveys/TEL survey
3. García-Peñalvo, F.J., Seoane-Pardo, A.M.: An updated review of the concept of eLearning.
Tenth anniversary. Education in the Knowledge Society 16, 119 (2015)
4. Avgeriou, P., Papasalouros, A., Retalis, S., Skordalakis, M.: Towards a Pattern Language
for Learning Management Systems. Educational Technology & Society 6, 11-24 (2003)
5. Prendes, M.P.: Plataformas de campus virtuales de Software Libre: Análisis compartivo de
la situación actual de las Universidades Españoles., Informe del proyecto EA-2008-0257 de
la Secretaría de Estado de Universidades e Investigación
http://www.um.es/campusvirtuales/informe.html (2009)
6. Arroway, P., Davenport, E., Guangning, X., Updegrove, D.: Educause Core Data Service
Fiscal Year 2009 summary report. EDUCAUSE
http://net.educause.edu/ir/library/pdf/PUB8007.pdf (2010)
7. Wexler, S., Dublin, L., Grey, N., Jagannathan, S., Karrer, T., Martinez, M., Mosher, B.,
Oakes, K., Barneveld, A.v.: LEARNING MANAGEMENT SYSTEMS. The good, the bad,
the ugly,... and the truth., The eLearning Guild
http://www.elearningguild.com/research/archives/index.cfm?id=130&action=viewonly
(2008)
8. Marks, A., AL-Ali, M., Rietsema, K.: Learning Management Systems: A Shift Toward
Learning and Academic Analytics. International Journal of Emerging Technologies in
Learning (iJET) 11, 77-82 (2016)
9. Iglesias-Pradas, S., Ruiz-de-Azcárate, C., Agudo-Peregrina, Á.F.: Assessing the suitability
of student interactions from Moodle data logs as predictors of cross-curricular competencies.
Computers in Human Behavior 47, 81-89 (2015)
10. Ferguson, R.: Learning analytics: drivers, developments and challenges. International
Journal of Technology Enhanced Learning 4, 304-317 (2012)
11. Greller, W., Drachsler, H.: Translating Learning into Numbers: A Generic Framework for
Learning Analytics. Journal of Educational Technology & Society 15, 42-57 (2012)
12. Long, P.D., Siemens, G.: Penetrating the Fog: Analytics in Learning and Education.
EducaUSE Review 46, (2011)
13. Siemens, G.: Learning Analytics The Emergence of a Discipline. American Behavioral
Scientist 57, 1380-1400 (2013)
14. Volkwein, J.F., Lattuca, L.R., Terenzini, P.T., Strauss, L.C., Sukhbaatar, J.: Engineering
change: A study of the impact of EC2000. International Journal of Engineering Education
20, 318-328 (2004)
15. EPYCE: Posiciones y competencias más demandadas. http://humanageinstitute.org/wp-
content/uploads/2017/02/Informe-EPyCE-2016.-Posiciones-y-competencias-m%C3%A1s-
demandadas-en-la-empresa.pdf (2016)
16. European Commission: Key competences for lifelong learning. Recommendation
2006/962/EC of the European Parliament and of the Council of 18 December 2006 on key
competences for lifelong learning. Official Journal of the European Union L394/10 (2006)
17. ANECA: REFLEX - El profesional flexible en la Sociedad del Conocimiento. ANECA
(Unidad de Estudios)
https://observatorio.um.es/observatorio/observatorio.contenidos.ver_fichero.do?codigo=43
(2007)
18. Leidner, D.E., Jarvenpaa, S.L.: The use of information technology to enhance
management school education: A theoretical view. MIS Quarterly 19, 265-291 (1995)
19. Vogel, D.R., Davison, R.M., Shroff, R.H.: Sociocultural learning: A perspective on GSS-
enabled global education. Communications of the Association for Information Systems 7,
(2001)
20. Iglesias-Pradas, S., Ruiz-de-Azcárate, C., Agudo-Peregrina, Á.F.: Assessing the suitability
of student interactions from Moodle data logs as predictors of cross-curricular competencies.
Computers in Human Behavior 47, 81-89 (2015)
21. Fidalgo-Blanco, Á., Sein-Echaluce, M.L., García-Peñalvo, F.J., Conde, M.Á.: Using
Learning Analytics to improve teamwork assessment. Computers in Human Behavior 47,
149-156 (2015)
22. NCB: Bases para la competencia en dirección de proyectos.
http://www.lpzconsulting.com/images/CP-_Trabajo_en_Equipo.pdf (Last accessed
28/04/2017)
23. Conde, M.Á., Hernández-García, Á., García-Peñalvo, F.J., Sein-Echaluce, M.L.: Exploring
Student Interactions: Learning Analytics Tools for Student Tracking. In: Zaphiris, P.,
Ioannou, A. (eds.) Learning and Collaboration Technologies: Second International
Conference, LCT 2015, Held as Part of HCI International 2015, Los Angeles, CA, USA,
August 2-7, 2015, Proceedings, pp. 50-61. Springer International Publishing, Cham (2015)
24. Conde, M.Á., Hernández-García, Á., García-Peñalvo, F.J., Fidalgo-Blanco, Á., Sein-
Echaluce, M.: Evaluation of the CTMTC Methodology for Assessment of Teamwork
Competence Development and Acquisition in Higher Education. In: Zaphiris, P., Ioannou,
A. (eds.) Learning and Collaboration Technologies: Third International Conference, LCT
2016, Held as Part of HCI International 2016, Toronto, ON, Canada, July 17-22, 2016,
Proceedings, pp. 201-212. Springer International Publishing, Cham (2016)
25. Fidalgo-Blanco, Á., Lerís, D., Sein-Echaluce, M.L.: Monitoring Indicators for CTMTC:
Comprehensive Training Model of the Teamwork Competence in Engineering Domain.
International Journal of Engineering Education (IJEE) 31, 829-838 (2015)
26. Conde, M.Á., Rodríguez-Sedano, F.J., Sánchez-González, L., Fernández-Llamas, C.,
Rodríguez-Lera, F.J., Matellán-Olivera, V.: Evaluation of teamwork competence acquisition
by using CTMTC methodology and learning analytics techniques. Proceedings of the
Fourth International Conference on Technological Ecosystems for Enhancing
Multiculturality, pp. 787-794. ACM, Salamanca, Spain (2016)
27. Lerís, D., Fidalgo, Á., Sein-Echaluce, M.L.: A comprehensive training model of the
teamwork competence. International Journal of Learning and Intellectual Capital 11, 1-19
(2014)
28. Fidalgo-Blanco, Á., Lerís, D., Sein-Echaluce, M.L., García-Peñalvo, F.J.: Monitoring
Indicators for CTMTC: Comprehensive Training Model of the Teamwork Competence in
Engineering Domain. International Journal of Engineering Education (IJEE) 31, 829-838
(2015)
29. Tuckman, B.W.: Developmental sequence in small groups. Psychological Bulletin 63, 384-
399 (1965)
30. NCB: Bases para la competencia en dirección de proyectos.
http://aeipro.com/index.php/es/mainmenu-publicaciones/mainmenu-publicaciones-
libros/223-ncb-30-bases-para-la-competencia-en-direccion-de-proyectos (Last accessed
06/05/2015)
31. Learning and Development. http://hrweb.mit.edu/learning-development/learning-
topics/teams/articles/stages-development (Last accessed 30/04/2017)
32. Conde-González, M.Á., Colomo-Palacios, R., García-Peñalvo, F.J., Larrucea, X.:
Teamwork assessment in the educational web of data: A learning analytics approach towards
ISO 10018. Telematics and Informatics (2017, in press)
33. Gottschalk, K., Graham, S., Kreger, H., Snell, J.: Introduction to web services architecture.
IBM Syst. J. 41, 170-177 (2002)
34. Pastor, O., Abrahao, S., Fons, J.: An Object-Oriented Approach to Automate Web
Applications Development. In: Bauknecht, K., Madria, S.K., Pernul, G. (eds.) Electronic
Commerce and Web Technologies: Second International Conference, EC-Web 2001
Munich, Germany, September 4–6, 2001 Proceedings, pp. 16-28. Springer Berlin
Heidelberg, Berlin, Heidelberg (2001)
35. Green, J.L., Camilli, G., Elmore, P.B.: Handbook of Complementary Methods in Education
Research. Published for the American Educational Research Association by Lawrence
Erlbaum Associates (2006)
36. Miles, M.B., Huberman, A.M.: Qualitative Data Analysis: An Expanded Sourcebook. Sage
Publications (1994)
37. Agudo-Peregrina, Á.F., Iglesias-Pradas, S., Conde-González, M.Á., Hernández-García, Á.:
Can we predict success from log data in VLEs? Classification of interactions for learning
analytics and their relation with performance in VLE-supported F2F and online learning.
Computers in Human Behavior 31, 542-550 (2014)
38. Tasa, K., Taggar, S., Seijts, G.H.: The development of collective efficacy in teams: a
multilevel and longitudinal perspective. Journal of Applied Psychology 92, 17-27 (2007)