   Proceedings of the workshop on What Went Wrong? What Went Right? - 2007




    Experiences of Technology Enhanced Learning: What
                      Went Wrong?

                                Su White and Hugh C Davis
   Learning Societies Lab, Electronics and Computer Science, University of Southampton, UK
                           saw@ecs.soton.ac.uk, hcd@ecs.soton.ac.uk
  Abstract.
       The e-learning community is beginning to amass a great deal of
       experience of successful practice, but typically final project reports and
       associated papers concentrate only on the successful outcomes. There
       has been very little published on the innovations that failed or the
       unexpected and unwanted outcomes of such projects. This experiences
       paper presents four case studies of projects in which the authors
       have been involved over the last 15 years. The contribution of this
       paper is to focus on the aspects of the projects that were not successful
       or were unwanted, analyzing the causes. The paper concludes by
       suggesting that most projects have both successful and unsuccessful
       components, and that the community would be better informed if they
       were more often provided with the complete picture.

       Keywords: e-learning, project failure, institutional change management, higher
       education




1 Introduction

The authors have worked together on many projects introducing technology into learning at the University of Southampton, UK, since the early 1990s. The first author has worked primarily as an educational developer, while the second author has worked primarily as a computer scientist. This experiences paper revisits some of the
projects they have worked on, all of which were successes in the eyes of the funders.
Outputs were produced, results were published, changes were made and the budget
was accounted for. In this second examination we consider those aspects of the projects that did not go as planned, or that went as planned but did not result in the consequences we had anticipated.
   In order to analyze these results we have examined them using a framework which considers the context of the project and the expected technical and pedagogical outputs, as represented in Figure 1. The learners are (of course) at the centre; the project environment, with its processes and objectives (shaded grey), produces technical and pedagogical innovations which will, hopefully, have an impact on their learning. The project managers, and the innovations they make, will be affected by the context in which the project is carried out: the local (institutional) environment and the wider (external) environment. Together these establish strategic priorities and influence the way the institution manages change and the culture of the organization. Ideally the








organizational learning which results from the project will also feed back into the
institutional culture and management – leading to some change.
   In this paper we use case studies to explain why projects can indeed be both
successes and failures; we are all aware of cases where a project has produced some
excellent technical innovations, but learning has not been changed. Similarly we see
projects where student learning has clearly been improved but the lessons have not
been learned by the institution/environment, so the change does not benefit a wider
community. For these reasons funding bodies such as JISC1 in the UK now set great store by “embedding” project results.
   We now present four case studies of real e-learning projects: their successes, the problems they encountered, and their shortcomings and failures, which we will evaluate against this framework.

[Figure 1 here: Learners at the centre, surrounded by the project's Pedagogical Innovations and Technical Innovations, set within the Local environment and the wider External environment.]

Figure 1: The Framework used for Technology Enhanced Learning Project evaluation.




2 Microcosm and the Scholar Project

These were two separate but inter-related projects. Microcosm was an educationally
oriented hypermedia system while The Scholar Project developed e-learning
applications for institutional change using Microcosm as its e-learning platform.
   The Microcosm project was a pre-web Open Hypermedia System (OHS) [2, 9]. It was developed within the Department of Electronics and Computer Science (ECS) at

1 JISC funds technology infrastructure for UK universities and has an extensive development

  programme primarily managed via competitively awarded projects addressing agreed
  strategic priorities. http://www.jisc.ac.uk/








the University of Southampton. It was designed as a testbed for novel and emerging
ideas in hypertext and hypermedia presentation and implementation. The project was
active between 1988 and 1998. The technical innovations in Microcosm were
important, earning ECS a world leading reputation within the hypertext research
community; the system won the BCS software prize in 1996 and it earned venture
capital investment for a spin-off company. The second author managed the
Microcosm research team.
   The Scholar Project began in 1993. It was funded by an initiative of the UK
Universities Funding Council (UFC) designed to promote “effectiveness and
efficiency” in Universities by stimulating growth in the use of technology for learning
through the Teaching and Learning Technology Programme (TLTP) [13]. Institutions
and consortia of subject specialists were encouraged to apply for funding for using
technology in learning. In those pre-web days a number of consortia became
interested in using Microcosm as their delivery engine. Amongst the
projects that received funding was the University of Southampton’s institution wide
Scholar Project, which was managed by the first author. This project set out to “shift
the culture of the university” by creating a campus-wide network for multimedia
learning. The project sponsored the design, development and implementation of
computer-based learning resources. It used the device of creating an Interactive Learning Centre to provide a number of early adopters from a range of academic disciplines with the educational and technical assistance necessary to prepare Microcosm-based learning materials [16].
   Almost as soon as the Scholar Project began, the World Wide Web started to make its presence felt within the academic community, and some consortia that were using Microcosm transferred their development to the Web. Others, however, including the Scholar Project and the consortium formed to develop Physics teaching resources (SToMP), argued for the pedagogic merits of using Microcosm. History, of course, now shows that those who changed made the right decision, and it became
necessary for those who wished to maintain their investment in learning materials to
move to some internet based delivery [1, 4]. This points us to our first, and for these
projects the most important cause of failure: the external environment changed
radically and this meant that their choice of technical platform was inappropriate.
   There were good outputs from The Scholar Project, many related to individual and
organizational learning which resulted from the activities which the project
undertook. A number of people around the university became familiar with the use of
technology in teaching. The focus and intensity of project activities created a climate
where they gained sophisticated insights into appropriate technology use in their disciplines; they networked together and formed professional and friendship bonds. Many of these people have now reached senior positions, and we see that they are indeed
using their experience to change the culture of the university – even if somewhat later
than anticipated!
   But even without the advent of the web, we would have to confess that there were a
number of flaws in the Scholar/Microcosm project, both from a technical viewpoint
and from the institutional context.
   Microcosm, as an open system which interconnected many different media types and commercial applications, was technically robust. It had a very small and specialized user base, but it had none of the strengths that Morris et al attribute to








“worldware” [12]. It was a system in perpetual beta, not ready for mainstream use.
The fundamental problem was that this software was being produced by a research
lab, who tended to concentrate on solutions to interesting new problems rather than on
code maintenance, thorough documentation and resolving uninteresting limitations.
The academic user-base actually made things worse as they continually fed the
research lab with a stream of interesting feature requests. By the time the code was
put into the hands of a commercial team there were so many options and features it
was impossible to document or test systematically. The resulting product became
extremely difficult to describe – it was really a framework allowing users to tailor
their own hypertext system – and it became far too complex for an average teacher to
understand how they were supposed to use it [3].
   From an institutional point of view, it became clear that while the funding council required the University to sign up for full institutional commitment to the project, as
soon as the funding ceased the University management took a close look at the
Interactive Learning Centre and significantly changed its brief, bringing it in line with
core institutional objectives. Consequently there was effectively no longer any cost
free support within the university centre for teachers wishing to use learning
technology. As a result, the individual Schools within the University have
subsequently tended to develop their own approaches and support.


3 The Modular MSc

In 1994 the senior management of a large computer manufacturer, with development
premises near Southampton, understood the need for change in their organization, and
the need to move their emphasis from selling their own hardware and software
towards selling software services. The staff of this company tended to be highly
skilled in a particular area of the organization’s overall portfolio, and had previously
shown little interest in tracking technological development elsewhere. The senior
management realized that if they were to start to change the culture within the
company they would need to start an education programme to make the staff aware of
the emerging world of internet based, open source, multimedia interoperable open
software. The senior management of the UK branch approached the University of
Southampton and asked them to run an MSc, using a mode of delivery suitable for full
time employees, that would be designed to broaden their understanding of current
computer science. The company would guarantee 12 new students per year, but the
University was free to advertise the course elsewhere.
    The second author was appointed course leader; the solution we adopted was to produce a highly modular course. Modules represented the leading edge of that time and reflected the research strengths of ECS, for example:
• Open Distributed Systems
• The Multimedia Revolution
• Object Oriented Technology
• Networking in the '90s: The Information Superhighway
• The Social Impact of the Information Revolution
• Interactive Entertainment Systems








   Each module would require one week of attendance at the University for teaching,
followed by 6 weeks to complete some coursework which would be assessed. All the
course materials would be on the Web, and once off campus the students would be
supported by on-line tutoring, both synchronously (chat) and asynchronously via
email and course forums.
   A charging structure was introduced whereby the cost of the first module was at
commercial rates, but each subsequent module that a student elected to take would
cost less. Any student who completed the necessary number of modules would
receive free supervision for their project. This charging structure was designed to
encourage those who had started to complete, and also to enable us to advertise the
courses to industry at commercial rates – with the expectation that such attendees
might only be interested in the one week taught component.
   This solution presented some significant challenges. In 1995 the internet was not standard issue, particularly at this company, which had a long history of developing its own proprietary network protocols and communication software; although the senior management had specifically encouraged this solution in order to expose employees to these new technologies, the middle management were enormously concerned about security issues. In the end we needed to set up a small network of internet-connected PCs, not only outside the company's firewall but also outside the
regular working premises, so that there could be no possibility, for example, of a disk
being accidentally moved from a company machine to one of these PCs. Without the
technology on the employees’ desktops, Web based learning was significantly
hindered, and mostly the learners simply printed the notes.
   A more significant problem was that of recruitment to the MSc. The course started well with a full cohort of students, mostly recruited from the company but with some recruited from other companies, plus occasional attendees using the modules as short courses. Once the first batch of modules had been run over a period
of two years, we discovered that there was very much reduced attendance (single
figures) when the modules were run again – so the courses had become financially
unviable. The reason for this turned out to be quite straightforward: although the senior management wanted the up-and-coming middle managers to broaden their education, when offered a sponsored place on the MSc the manager of a unit was inevitably too busy, and more interested in acquiring skills very specifically aligned to their current project and problems. In the first instance such managers offered their places to junior colleagues, some of whom were barely qualified to participate in an MSc course. They then resented the loss of that colleague's time, and when asked to recommend further participants they refused to do so.
   The course, which had been designed for industry leaders, was thus being delivered mainly to technicians. They would much have preferred an in-depth course to increase
their specific skills, rather than this broad look at the latest technologies, which they
did not see at that time to be relevant to their working life. At the same time managers
from the target group were happily signing up to another part time MSc in Software
Engineering and Formal Methods which was seen as providing relevant skills.
   So in summary, the very managers whose understanding the company wished to broaden were the force that scuppered the degree, simply because they did not understand the significance of the way the world was changing. When the university
asked the company to address their commitment to providing numbers for this course








the senior managers responsible had moved on – and the new ones were unable to
confirm this commitment. In 1998 the University was forced to close the course
down while “teaching out” the existing students - a very expensive process which
took until 2002 to complete.


4 The e3an project

The Electrical and Electronic Engineering Assessment Network (e3an) was
established in 2000 as a three year initiative under Phase 3 of the Fund for the
Development of Teaching and Learning (project no. 53/99) [8]. The project was led
by the University of Southampton in partnership with three other UK south coast
higher education institutions: Bournemouth University, the University of Portsmouth and Southampton Institute. The first author was the Principal Investigator, with
technical assistance from the second author. The project collated sets of peer-
reviewed questions in electrical and electronic engineering which had been authored
by academics from UK Higher Education. The questions were stored in a database
and available for export in a variety of formats chosen to enable widespread use
across the sector. Around half of the questions were objective questions and could be
imported into test engines such as QuestionMark [6, 15].
    The objectives of the project were twofold. The first objective was to produce a
database of high quality questions which teachers could use to create stage tests or
worksheets for their students. The questions all had worked solutions or hints, so that
the students could obtain feedback. The second objective was to form a community of
practice in Electrical and Electronic Engineering assessment. Around 100 academics
contributed to this database, and they all received some training on modern
assessment methods, and contributed their own experiences and skills to the network.
    This short description demonstrates that the primary objectives of the project were
entirely in the area of pedagogic innovation. The project plan had assumed that the
technology of collecting and storing the questions and some associated metadata
would be straightforward. In the event this was not the case. Numerous technical
issues arose. The XML-based QTI specification for representing questions was new, poorly specified and hardly used. The use of XML representations for equations was in its infancy, and again there was little support for MathML in any of the software. The range of software for delivering questions was large (and non-standard). Many of
the academics we were working with did not have regular access to the Internet, and
were not likely to set tests on-line – even if they used MCQs they preferred to have
them printed. Furthermore, educational metadata was not sufficiently fine-grained in
its descriptive facility to distinguish between numerous similar questions at the level
users required.
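   To make the representation problem concrete, the sketch below shows, in Python, how a single multiple-choice question might be serialized from a database record into a simplified, QTI-like XML item. The element names, record fields and the sample question are illustrative assumptions only; they do not reproduce the actual e3an schema or the QTI markup of the time.

import xml.etree.ElementTree as ET

def question_to_xml(question):
    """Serialize one multiple-choice question record into an illustrative,
    QTI-like XML item (element names are assumptions, not the real schema)."""
    item = ET.Element("item", ident=question["id"], title=question["title"])
    # Question stem as plain text; stems containing equations would have
    # needed MathML, which was the poorly supported part in 2000.
    ET.SubElement(item, "stem").text = question["stem"]
    choices = ET.SubElement(item, "choices")
    for label, text in question["options"].items():
        ET.SubElement(choices, "choice", label=label).text = text
    ET.SubElement(item, "answer").text = question["correct"]
    # Worked solution stored so that students can obtain feedback.
    ET.SubElement(item, "feedback").text = question["solution"]
    return ET.tostring(item, encoding="unicode")

example = {
    "id": "EEE-001",
    "title": "Ohm's law",
    "stem": "A 10 V supply drives a 5 ohm resistor. What current flows?",
    "options": {"A": "0.5 A", "B": "2 A", "C": "50 A"},
    "correct": "B",
    "solution": "I = V / R = 10 V / 5 ohm = 2 A.",
}
print(question_to_xml(example))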
    The development of a community of practice was successful, and the educational
development approaches developed by the project have been established [ref here].
However the project, which had been designed and financed as a pedagogic intervention, rapidly became a technical firefighting exercise, even though the staff who had been employed to manage the project were not strongly technical. In the end all
the technical problems were solved, and a database of a few thousand questions was








released on a CD – and could also be downloaded from the web site. The project has
been enormously successful technically for Southampton; some of the solutions we
chose were adopted by a number of later projects and by examination boards, and e3an
is much used as an exemplar item bank. The authors’ lab is now a centre of expertise
in e-assessment with many follow up technical projects.
   However, as the creation of the database neared completion, the project staff
acquired new jobs, based on their newfound expertise, leaving the project before the
results were rolled out to students. The resulting dissemination phase was much lower
key than had been planned, and the work has not been used as much as was
envisaged.
   As an aside, technical advances since 2000 have been significant. Tools exist for
capturing new questions, and the database can be accessed on line as a web service. It
can then be connected to other question rendering tools to allow students to work
directly with the questions on-line.
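   As a rough illustration of what such web-service access might look like, the sketch below exposes a stand-in question bank over HTTP as JSON, so that an external rendering tool could fetch individual questions. The route, field names and port are hypothetical; this is not the actual e3an interface.

from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the question database; a real deployment would query a
# persistent store rather than an in-memory dictionary.
QUESTIONS = {
    "EEE-001": {
        "stem": "A 10 V supply drives a 5 ohm resistor. What current flows?",
        "options": {"A": "0.5 A", "B": "2 A", "C": "50 A"},
        "correct": "B",
    },
}

@app.route("/questions/<qid>")
def get_question(qid):
    # Return one question as JSON so a rendering tool can display it on-line.
    question = QUESTIONS.get(qid)
    if question is None:
        return jsonify({"error": "unknown question"}), 404
    return jsonify(question)

if __name__ == "__main__":
    app.run(port=8080)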


5 The DialogPLUS project

   The DialogPLUS project was a collaboration between Pennsylvania State
University, the University of Leeds, UCSB, and the University of Southampton. It
began in February 2003 to investigate ‘Digital Libraries in Support of Innovative
Approaches to Teaching and Learning in Geography’. The project was funded for
three years by the Joint Information Systems Committee (JISC) in the UK and the
National Science Foundation (NSF) in the USA under the Digital Libraries in the
Classroom Programme. The second author was the Principal Investigator.
   According to JISC [10]: “This programme aims to examine how integrating recent
technical developments with digital content will improve the learning experience of
students and provide new models for the classroom including the impact of
integration on student achievement, retention, recruitment and on institutional
structures and practices.” Specific objectives were to:
     • Bring emerging technologies and available digital content into core teaching
          and learning
     • Develop and use innovative approaches in integrating technologies for the
          benefit of undergraduate teaching
     • Demonstrate how the pedagogical process needs to be adapted or developed
          to support the learning process when using technology
     • Examine the human and organisational issues associated with implementing
          new modes of teaching.
   Martin and Treves (2007) described aspects of the DialogPLUS project from the
standpoint of the geographers [11], addressing the first three bullet points above in
some detail. The authors of this paper were involved in managerial, technical,
educational and evaluative support roles at the University of Southampton and for the
project as a whole. We became increasingly aware of the effect the project had on our
own institution, particularly with respect to its influence on e-learning strategy and
policy making, as described in [5].








   A primary objective of the DialogPLUS project was to investigate the practicalities
of the joint design and sharing of learning activities, based upon existing digital
resources. JISC and the NSF have already funded the production and licensing of
many digital resources for use in education and research, and this programme was
particularly concerned with deploying such resources in blended learning, exploring
the associated technical, educational and organisational issues, and evaluating the
impact on students and staff.
   The final report for the DialogPLUS project [14] examines both the successes and
failures of the project. To summarize the main successes briefly: the extended project
team produced and implemented a large volume of online learning material, re-using
digital resources from multiple sources, as was envisaged, and the vast majority of
this continues to be used after the funding has ceased. Evaluation indicated that high
quality online learning activities, as part of a blended approach, did enrich
programmes of study. Team members now have a sound understanding of good
practice in design for learning, e-learning and blended learning, and are able to make
informed contributions to ongoing institutional, national and international work on
digital repositories, sharing and reuse of resources, pedagogical planning, design and
implementation tools. At the Leeds, Southampton and Penn State Geography departments the project is seen to have been a success, and the international aspect of the
collaboration has made the resources more widely applicable.
   However some of the objectives of the programme were not met, and there were
also some interesting “negative outcomes”.
   First, the funders were very keen to see the results embedded. Since the resources
continue to be used we can certainly claim that they are embedded. On the other hand
one could argue that it was not just the use of the resources that should have been
embedded, but the continued production of further resources and the increased use of
blended learning within the Schools. In fact, at Southampton at least, the lesson that
was learned was that in order to carry out such innovation one needed to keep earning
more grant money. The School of Geography has continued to be successful in this
respect, and the approach of funding educational development through grant funding
has become enshrined in university e-learning strategy.
   Secondly, the DialogPLUS project was intended to place an emphasis on sharing the resources developed. The Geography departments collaborated on producing resources to support teaching objectives they shared in common, and we might have expected that this would have encouraged them to share the use of those resources. In fact we saw very little of this, except in the case of the most generic materials. (A
learning object on Academic Integrity was reused by all departments, but only after
they had completely changed the content in each case – it was only the learning
design they shared). When it came to course materials, once one teacher had
developed a course that used excellent blended learning resources we found that the
Geography departments either decided to share the whole course, including the
teacher, or to send the students to “attend” the course virtually at another university. It
seems that it is easier, and more beneficial, to share students than to share materials.
Rather than emphasize the development and sharing of common learning objects to be
deployed in redundant modules at multiple institutions, the project found itself
facilitating the partners in extending access to their most unique module offerings to
students from other institutions through Shibboleth-enabled federations.








   Whilst the technical aspects of sharing resources were not problematic, there were
legal barriers, including IPR and copyright, to sharing resources either directly or via
repositories, particularly with parties outside the UK, as many of the UK licenses
limit the resource use to UK higher education. An example of this was Digimap, which licenses the use of the Ordnance Survey maps of the UK to UK HE only, so
resources that made use of such maps could not be shared with the US partners.
   Finally, attempts to prove that any initiative such as this has actually improved
learning are necessarily going to be limited, as it will never be possible to do double-
blind trials. As part of the DialogPLUS project both we and various external agencies
conducted extensive evaluations with the learners and the teaching staff, the majority
of which produced positive and encouraging results. A particular issue that we did
encounter was the extent to which younger undergraduates were prepared to work on
their own (as opposed to being taught) [7]. In general we produced further evidence to
support the view that blended learning approaches are better suited to more mature students, who have developed a clear understanding of what they want to learn, than to less mature learners who may still treat knowledge as something they are taught rather than something they seek to acquire.


6 Conclusion

The four case studies that we have described above were all successful projects – at
least according to their associated publications and to some of the stakeholder
community, and yet this paper has shown that there were still aspects of each project
in which the original objectives were not fully realized, or where some unexpected
and possibly unwanted results have been achieved.

   Of course it is necessary to ask what makes a successful project. A final project report which shows that the objectives were met and the budget kept to? Or a class of happy students? Or an evaluation which demonstrates that a novel method improves the student experience? Or a change in institutional culture? The correct answer may be any of these. One person's failure is another's success, and vice versa. The important lesson is that we should disseminate bad practice and unwanted outcomes as well as good practice, and thus learn from our mistakes.


References

[1]       Bacon, R., Swithenby, S., A Strategy for the Integration of IT-Led Methods
          into Physics - the Stomp Approach. Computers & Education 26:1-3 (1996)
          135-141
[2]       Davis, H., Hall, W., Heath, I., Hill, G., Wilkins, R., Towards an Integrated
          Information Environment with Open Hypermedia Systems. In: Lucarella et
          al (ed.): ACM Conference on Hypertext ECHT'92. ACM Press (1992) 181-
          190








[3]       Davis, H.C., White, S.A., Linking Experiences: Issues Raised Developing
          Linkservices for Resource Based Learning and Teaching. In: Okamoto, T.,
          Hartley, R., Kinshuk, Klus, J. (eds.): Second IEEE International
          Conference on Advanced Learning Technologies. IEEE, Madison, WI, USA
          (2001) 0401-0405
[4]       Davis, H.C., Bacon, R.A., Experiences Migrating Microcosm Learning
          Materials. Proceedings of the fifteenth ACM conference on Hypertext and
          hypermedia ACM Press New York, NY, USA, Santa Cruz, CA, USA (2004)
          141-142
[5]       Davis, H.C., Fill, K.E., Embedding Blended Learning in a University’s
          Teaching Culture: Experiences and Reflections. British Journal of
          Educational Technology 38:5 (2007)
[6]       Davis, H.C., White, S.A., Dickens, K.P., Focusing on the Question: An Xml
          Testbank. 8th International Conference ALT-C Edinburgh (2001)
[7]       DiBiase, D., Kidwai, K., Wasted on the Young? Comparing the Efficacy of
          Instructor-Led Online Education in Giscience for Post-Adolescent
          Undergraduates and Adult Professionals. Association of American
          Geographers 2007 Annual Meeting. Association of American Geographers,
          San Francisco, California
[8]       ECS Learning Societies Lab, e3an: http://www.e3an.ac.uk/ Electronics and
          Computer Science, University of Southampton, Southampton (2007)
[9]       Hall, W., Davis, H.C., Hutchings, G., Rethinking Hypermedia the
          Microcosm Approach. Kluwer, Boston, MA (1996)
[10]      JISC,      Digital    Libraries    in     the     Classroom    Programme:
          http://www.jisc.ac.uk/whatwedo/programmes/programme_dlitc.aspx (2007)
[11]      Martin, D., Treves, R., Dialogplus: Embedding Elearning in Geographical
          Practice. British Journal of Educational Technology 38:5 (2007)
[12]      Morris, P., Ehrmann, S.C., Goldsmith, R., Howat, K., Kumar, V., Valuable,
          Viable Software in Education: Cases and Analysis. McGraw-Hill (Primis),
          New York (1994)
[13]      Universities Funding Council (UFC), Teaching and Learning Technology
          Programme: Circular 8/92. UFC, Bristol (1992)
[14]      University of Southampton School of Geography, Dialog+ Final Project
          Report
          http://www.dialogplus.soton.ac.uk/outcomes/dialogplus_final_report.pdf.
          University of Southampton, Southampton
[15]      Wellington, S.J., Davis, H.C., White, S.A., Populating the Testbank:
          Experiences within the Electrical and Electronic Engineering Curriculum. In:
          Danson, M. (ed.): The 5th International Computer Assisted Assessment
          Conference. University of Loughborough, Loughborough (2001)
[16]      White, S., Scholar - a Campus Wide Structure for Multimedia Learning. In:
          Hoey, R. (ed.): AETT Annual Conference: Designing for Learning. Kogan
          Page, Jordanhill Campus, University of Strathclyde, Glasgow (1993) 194-
          196



