=Paper=
{{Paper
|id=None
|storemode=property
|title=Learning from User Experience in the Design of a Playful Feedback Tool
|pdfUrl=https://ceur-ws.org/Vol-656/paper9.pdf
|volume=Vol-656
|dblpUrl=https://dblp.org/rec/conf/iused/VoldW10
}}
==Learning from User Experience in the Design of a Playful Peer Feedback Tool==
Vibeke Vold, Uni Digital, Uni Research, Allégaten 27, 5020 Bergen, Norway, +47 90013289, Vibeke.Vold@uni.no

Barbara Wasson, Department of Information Science and Media Studies, University of Bergen, Norway, +47 55584120, Barbara.Wasson@uib.no
ABSTRACT
Today's youth thrive in informal participatory communities where they not only consume but also act as contributors or producers. In a participatory culture of learning, students' active contributions to their learning are stressed, and peer assessment is considered an important component. In this paper we investigate which user experiences should be supported in a playful peer feedback tool within a participatory culture of learning.

Keywords
User Experience (UX), participatory culture of learning, active learning, peer feedback.

INTRODUCTION
Today's youth participate in a variety of social media where they create and disseminate ideas or news, collaborate, and connect with people. Examples of social media software are numerous and enable communication, collaboration, multimedia and entertainment through blogs, social networking (Facebook, MySpace), wikis, Flickr, YouTube, Second Life, etc. Through their participation in these informal communities today's youth develop new media skills [28]. In their 2006 report Confronting the Challenges of a Participatory Culture, Jenkins et al. [15] identify eleven new skills (play, performance, simulation, appropriation, multitasking, distributed cognition, collective intelligence, judgement, transmedia navigation, negotiation, and networking) developed through collaboration and networking.

While there is an increasing view of learning as a participative activity in the learning community [17], schools and institutions have been slow to react to the emergence of this new participatory culture [15]. An important component in the design of learning environments is implementing this contemporary culture of learning [17]. Peer assessment is used as a means to empower students and peers by enabling students to take charge of their learning and become active learners who take responsibility for, and manage, their own learning [1].

In our work on peer assessment we are inspired by the ease and playfulness with which participants interact and give each other feedback in participatory environments. We try to harness this and draw on the new media skills the students are developing in our design of a playful peer assessment tool. A review of current research in the field of peer assessment and feedback, and observations of user experiences during a field trial, are used to inform the design of a playful peer feedback tool in a participatory culture of learning.

LEARNING IN SCY
The EU 7th Framework SCY (Science Created by You; www.scynet.eu) project addresses learning in science, offering learners a learning experience based on real-life, challenging assignments [3]. In SCY-Lab (the SCY learning environment) learners work individually and collaboratively on "Missions" which are guided by socio-scientific questions such as "How can we design a climate-friendly house?" [3]. Learners have to gather and process information, design and conduct experiments, make interpretations and abstractions, and communicate their conclusions; in other words, they engage in processes of active learning based on inquiry, knowledge building, and learning by design [28].

SCY uses a pedagogical approach that centres around products called "emerging learning objects" (ELOs) that are created by learners [3]. The ELOs, such as a CO2-friendly house design or a concept map, are the vehicles for gaining an understanding of the general science skills, social and presentation skills, and domain concepts the student has developed [28]. Thus assessment in SCY is centred on these ELOs.

PEER ASSESSMENT IN A PARTICIPATORY CULTURE OF LEARNING
Peer assessment enables students to take charge of their learning and become active learners who take responsibility for, and manage, their own learning [1, 2, 4, 26, 30]. For example, it enables students to learn to assess and to develop assessment skills, either when they enact peer assessment themselves or when they receive an assessment from their peers, and at the same time it enhances students' learning through knowledge diffusion and exchange of ideas, even when they are incorrect [28].
Peer assessment has also been found to motivate students to engage in the learning process [22]. Research on students' views about peer assessment has shown that students are motivated by the fact that they want to impress their peers [11] and by the fact that peer assessment is productive: it makes them think, learn more, be critical, and be structured [2, 6, 23, 24]. In addition, peer assessment introduces the students to the perspective that the focus of instruction is not only on the end product(s) but also on the process, and it highlights the value of collaboration (e.g., social interactions, trust in others) [19]. Peer feedback is a form of peer assessment where peers give opinions, suggestions for improvements, ideas, etc. to each other. It has been found that students are more willing to accept feedback given in "student-speak", and students may be more willing to accept feedback from peers [7]. It has also been emphasized that the accuracy of the peer feedback may not be that crucial [9] and that the varying accuracy of peer feedback might even be a benefit [26].

A PLAYFUL PEER FEEDBACK TOOL
In our work in the SCY project we focus on providing "playful" peer assessment possibilities in a science learning environment and in this manner empower the users to become active learners who take responsibility for, and manage, their own learning [29]. The tool is "playful" because it is lightweight and designed to take advantage of new media skills.

User experience
Over the last decade "user experience" became the buzzword in the field of human-computer interaction (HCI) and interaction design [14]. It has become a catchphrase calling for a holistic perspective and an enrichment of traditional quality models with non-utilitarian concepts such as fun [18, 5], joy [10], pleasure [16], hedonic value [12] or ludic value [8].

Good user experience (UX) is the goal of most product development projects today [20]. Hassenzahl [13] argued that future HCI must be concerned with the pragmatic aspects of interactive products (i.e. fit to behavioural goals) as well as with hedonic aspects such as stimulation (i.e. personal growth, an increase of knowledge and skills), identification (i.e. self-expression, interaction with relevant others) and evocation (i.e. self-maintenance, memories). A focus on the positive aspects of technology use has also been a trend in psychology [21], and within UX this idea has been adopted, making it one of HCI's main objectives to contribute to our quality of life by designing for pleasure (creating outstanding quality experiences) rather than for absence of pain (preventing usability problems) [14].

How to sustain a good user experience?
Many UX researchers argue that good UX comes from the value and meaning of the product concept itself [20]. In order to select the right concept, we need to evaluate the concept ideas, the potential value of the concept idea itself (experiential evaluation), and how the concept idea would fit into the participants' own context of living [20]. According to Roto et al. [20] the value of the anticipated interaction outcome can be evaluated even though there is no user interface or interaction design available.

In this paper we address: How can we design a playful peer feedback tool to sustain good user experiences?

FIELD RESEARCH
During a March 2010 field trial of the SCY Mission "Create a CO2 Friendly House" we observed how peers interact and give each other feedback. The trial was arranged at Sandvika Upper Secondary School in Oslo. It ran for 20 hours, divided over 4 successive Wednesdays, 5 hours each day.

Participants
Three science classes of approximately 30 first-year high school students (16-17 years old) were introduced to the SCY project, and volunteers for the 4-week field trial were solicited. A selection of 20 students from the volunteers across these classes was chosen to participate. The 3 teachers divided the students into 4-person design teams, each of which chose their own name:
• BioNorway (3 girls and 1 boy)
• New Energy (3 boys and 1 girl)
• Power Puff (4 girls)
• PikenesJens (2 girls and 2 boys)
• ThumbsUp (2 girls and 2 boys)

Learning Environment
The learning environment comprised SCY-Lab (with its resources and tools), the Google search engine, Google SketchUp (for 3D drawings), PowerPoint and Word. No feedback tool was available in SCY-Lab; feedback was given spontaneously and orally within and between groups. Figure 1 shows a student working with SCYSimulation in SCY-Lab.

Figure 1. Student working with SCYSimulation in SCY-Lab

The Student Mission and Tasks Given
The Mission challenge given to the students, "Your job is to design a CO2 friendly house", included 9 tasks:
1. Create one concept map where you explain the importance of reducing global CO2 levels.
2. Create one concept map where you brainstorm on the design aspects of a CO2-friendly house.
3. Make an initial plan on how your design group will proceed with the tasks to ensure a successful project.
4. Become an expert in one of the four fields:
   a. Production of energy,
   b. Laws of energy,
   c. Solar cells and solar thermal collectors, and
   d. Heat pumps.
5. Experts present their work in their original design groups.
6. Revise the initial plan.
7. Design, build and analyze your CO2-friendly house using different tools that will be provided for you.
8. Write a report for the mayor of your town.
9. Present your group's findings in front of your classmates.

Data Collection
Empirical data, collected during the field trial through observations, videos, and data recordings, included: field notes, video recordings, reports, PowerPoint presentations and the collection of ELOs.

Analysis for Assessment Design
During the field trial we were interested in the following questions: Are the students active and do they take initiative in their own learning process? Do they look at each other's ELOs and engage in peer interaction? Do they give feedback? Do they need any support to share and give feedback on each other's ELOs?

Thus the analysis of the empirical data focused on whether the students:
1) shared their ELOs
2) asked questions or presented an argument
3) gave feedback to one another
4) took the feedback into consideration
and the implications of these for the design of a feedback tool.

Episode 1
Student Jens looked at another team's house design on their screen and asked a question. The other student, Magnus (team New Energy), pointed at their ELO (see Figure 2) on his computer screen.

Figure 2. Team New Energy House Design ELO

Excerpt 1 (from Field Notes):
Jens (PikenesJens): Do you have a CO2 reason for building a round house?
Magnus (New Energy): We have chosen to design a round house with one floor. We did this to save area and by this also energy. Because of the smaller square footage of exterior walls we don't need to insulate as much. We also chose to only use one floor in the house. In this manner we don't have the problem that the heat rises to the 2nd floor, and we get an even heat throughout the whole house.

Excerpt 1 shows how a student question, "Do you have a CO2 reason for building a round house?", triggered a discussion about why Team New Energy made a circular house.

The relevance of this for the design of SCY assessment is that:
1) This dialogue should be supported by a SCYFeedback tool.
2) The content of the dialogue illustrates that a) Jens can ask a question (skill: formulate questions) and b) Magnus can explain and argue for their choice of design (skill: argumentation/reasoning). This shows some of the skills that the teacher will look for in a summative evaluation.

Episode 2
Student Jens looked at another team's house simulation in SCY-Lab (see Figure 3) on his own computer and got a reply from the teacher.

Figure 3. Team New Energy house simulation
Excerpt 2 (from Field Notes):
Jens (PikenesJens): How can the walls have less surface area (96 m2) than the floor and roof (both 172 m2)?
Teacher: You have to use the formula for calculating the surface area for circles instead of rectangles.
Jens (PikenesJens): What is the formula?
Teacher:
Jens (PikenesJens): I have now calculated and I think that their answer is correct. The walls do have less surface area when using a circle than a rectangle!!!
Teacher: (Laughing.) Yes, that is correct. You did not expect that, did you?
Jens (PikenesJens): No, humm, well then I guess that I have understood something new.

Excerpt 2 shows how a student question, "How can the walls have less surface area (96 m2) than the floor and roof (both 172 m2)?", triggered a discussion between the student and the teacher.

The relevance of this for the design of SCY peer feedback is that:
1) This dialogue should be supported by the SCYFeedback tool. The peers designing the round house could, just as well as the teacher, help the students with information about how to calculate the surface area of a circle.
2) The content of the dialogue shows that Jens found that Team New Energy had correctly used the formula and calculated the area and volume of a circle (mathematics domain). It is also plausible that, after the communication with the teacher, Jens has also gained this geometry skill of calculating area and volume of complex shapes.

Episode 3
Students' sharing of house simulations generates discussion around the elements in the data simulation of a CO2-friendly house. Figure 4 shows the house simulation of team New Energy.

Figure 4. Team New Energy house simulation showing the heat loss coefficient of their house

Excerpt 3 (from Field Notes):
Jens (PikenesJens): Wow! Your graph bar for the door is very small compared to ours! The door area is 2 m2 and the door's material is glass. How many m2 does a door need to be? Is 2 m2 enough? Is a glass door better than wood?
Teacher: I would think that wood is better isolation material than glass.
Jens (PikenesJens): I have checked and you get better values for glass than for wood. But the glass is triple!!!
Teacher: Ok, that might explain it. A triple glass door might provide better isolation than a single wood door.
Jens (PikenesJens): But is 2 m2 enough for the door?
Teacher: How big is the door into the classroom? And how big are you?
Jens (PikenesJens): (Checking the classroom door and walking through it.) I do not think that it is more than 2 m2. Great, then I can reduce the door size and get a better heat coefficient. I will also experiment with various door materials.

Excerpt 3 shows that the student Jens displays general science skills such as being able to visualize, interpret and make judgements about data. By investigating the simulation of another team and comparing this with their own, a student gains experience in interpreting data and in investigating how the house simulation variables are related to the overall heat transfer coefficient. The application of the concept of overall heat transfer coefficient to the transfer of heat is a skill within physics and thermodynamics. The student discussion and application of this concept in their house simulation model could demonstrate that they have gained this skill.
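As a concrete illustration of the reasoning in Excerpts 2 and 3, the short sketch below reproduces the kind of check Jens made: for a given floor area a circular footprint needs less exterior wall than a square one, and the heat lost through a building element grows with its U-value and area. The wall height, the square-footprint comparison and the U-values are illustrative assumptions only and are not taken from the SCY simulation.

<pre>
import math

# Values reported in Excerpt 2: floor and roof 172 m2 each, walls 96 m2.
floor_area = 172.0                               # m2, circular floor
radius = math.sqrt(floor_area / math.pi)         # ~7.4 m
wall_height = 96.0 / (2 * math.pi * radius)      # ~2.1 m, implied by the reported wall area

# A square house with the same floor area (assumed for comparison) needs more wall.
side = math.sqrt(floor_area)                     # ~13.1 m
square_wall_area = 4 * side * wall_height        # ~108 m2, versus 96 m2 for the circle

# Excerpt 3: heat loss through an element is roughly U * A * dT, so a smaller
# or better-insulated door (lower U, smaller area) loses less heat.
U_triple_glass, U_wood = 0.8, 2.0                # W/(m2*K), illustrative values only
door_area, delta_T = 2.0, 20.0                   # m2, K
print(round(square_wall_area, 1))                # 108.3
print(U_triple_glass * door_area * delta_T)      # 32.0 W through a triple-glazed door
print(U_wood * door_area * delta_T)              # 80.0 W through a single wood door
</pre>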
The skill of interpreting another team's (Team ThumbsUp) house simulation proved to be useful for Jens (Team PikenesJens), as he got a new perspective on how low the heat loss coefficient for the door could be. Based on the comparison of the two teams' simulation models and feedback from the other student, Team ThumbsUp changed their values and managed to reduce the heat loss coefficient for their door.

The relevance of this for the design of the SCY assessment is that:
1) The commenting and questioning of a student-made ELO could be supported by the SCYFeedback tool. The students in Team New Energy might, just as well as the teacher, answer questions related to their simulation ELO and the choices behind their selection of values.
2) The ELO sharing led to changes in a student ELO, and the discussion shows that the students display skills such as being able to visualize, interpret and make judgements about data.

Episode 4
Students' presentation of their house design gave a good opportunity for peer feedback in plenary. Figure 5 shows the students in Team New Energy presenting their use of isolation in their house design.

Figure 5. Team New Energy presenting their house design

Excerpt 4 (from Field Notes):
Team ThumbsUp: We see that you have chosen tar paper for the roof, but is that an environmentally friendly material?
Team New Energy: It is perhaps not the most environmentally friendly, but it is very isolating and thus we do not have to use too much electricity to heat the house.
Team ThumbsUp: We think that you should avoid using a material that is not environmentally friendly.

Excerpt 4 shows that Team ThumbsUp is questioning the environmental friendliness of Team New Energy's choice of tar paper as one of the roof materials. Team ThumbsUp and New Energy discuss whether a material that is not environmentally friendly can be used. The discussion and peer feedback could be supportive for the students in gaining general science skills such as being able to reflect on one's own knowledge and interpret data.

The relevance of this for the design of SCY assessment is that:
1) This student dialogue should be supported by the SCYFeedback tool.
2) The student questions would then be documented, and the teacher could look back at the student dialogue when assessing the student skills.

CONCLUSIONS AND SUMMARY
In this paper we have explored which user experiences should be designed for and supported by a playful peer feedback tool within a participatory culture of learning. Today's youth participate in a variety of social media and develop new skills (e.g., play, simulation, judgement, multitasking). Within learning research, the view of learning as a participatory activity, where the students themselves participate actively in the learning community, has been gaining ground. Peer assessment has been suggested as a method to empower students to take charge of and manage their own learning.

UX researchers argue that good UX comes from the value and meaning of the product itself. The concept of participatory peer feedback has been further investigated in a school setting with the SCY-Lab learning environment and the "Create a CO2 friendly house" Mission, in order to see if the concept idea would fit into the participants' own context of learning.

The field study showed that:
- students were looking at each other's products (ELOs) and took initiative by asking each other questions
- students naturally engaged in peer feedback dialogues
- students were able to make judgements about other students' ELOs and use this to further develop their own skills
- the students seemed to be comfortable with switching between working on their own ELOs and investigating other students' ELOs
- the students seemed to be motivated by playing with other students' simulations
- students need support to communicate with each other and give each other feedback on ELOs
- students showed skills in their discussions (e.g. collaboration, formulating questions, argumentation, reasoning, mathematical calculation, judgement, simulation)

The idea of creating a good user experience and also cultivating the students as active learners with a peer-based assessment tool seems promising. Findings show that students act, take initiative, and also seem to take pleasure in sharing their products (ELOs) and engaging in peer discussions. Students do not seem to need instructions and guidance, as this playfulness comes naturally to them. However, a need for a means to link peer feedback to ELOs was identified. The goal of the design of the playful SCYFeedback tool for peer assessment should be to facilitate student sharing of ELOs together with opportunities for student feedback on the ELOs. The tool should lay the foundation for a good user experience where students themselves can engage in ELO sharing and take charge of having fun and creating their own pleasure.
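To make this design goal concrete, the sketch below outlines one possible data model for such a tool: a published ELO to which peers can attach lightweight feedback (a comment and a playful score), so that the dialogue stays linked to the ELO and can be revisited by the teacher. The class and field names are hypothetical illustrations, not the actual SCYFeedback or SCY-Lab API.

<pre>
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class PeerFeedback:
    """One piece of peer feedback attached to an ELO (hypothetical model)."""
    author: str                 # e.g. "Jens (PikenesJens)"
    comment: str                # free-text feedback in "student-speak"
    score: int                  # playful rating, e.g. 1-5 thumbs up
    created: datetime = field(default_factory=datetime.now)

@dataclass
class ELO:
    """An emerging learning object published in the ELO Gallery (hypothetical model)."""
    title: str                  # e.g. "CO2-friendly house simulation"
    team: str                   # e.g. "New Energy"
    published: bool = False
    feedback: List[PeerFeedback] = field(default_factory=list)

    def add_feedback(self, author: str, comment: str, score: int) -> None:
        # Keeping the feedback attached to the ELO lets the teacher look back
        # at the dialogue when assessing student skills.
        self.feedback.append(PeerFeedback(author, comment, score))

# Usage sketch: a peer comments on a shared house-design ELO.
elo = ELO(title="House design", team="New Energy", published=True)
elo.add_feedback("Jens (PikenesJens)",
                 "Do you have a CO2 reason for building a round house?", 4)
print(len(elo.feedback), elo.feedback[0].score)
</pre>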
Figures 6 and 7 show screenshots of how the ELO display and the linking of peer feedback comments to an ELO could be facilitated. Figure 6 shows the ELO Gallery where the students can find published ELOs, while Figure 7 shows an ELO and how students could give peer feedback and score the ELO.

Figure 6. ELO Gallery showing students' published ELOs

Figure 7. ELO feedback screen with comment and score field

ACKNOWLEDGMENTS
This study was conducted in the context of Science Created by You (SCY), which is funded by the European Community under the Information and Communication Technologies (ICT) theme of the 7th Framework Programme for R&D (Grant agreement 212814). This document does not represent the opinion of the European Community, and the European Community is not responsible for any use that might be made of its content. We would like to acknowledge the contributions of the many people working on SCY to the work presented here.

REFERENCES
[1] Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2002). Working Inside the Black Box: Assessment for learning in the classroom. London: King's College.
[2] Boud, D. (1995). Enhancing learning through self-assessment. London: Kogan Page.
[3] de Jong, T. et al. (2009). Learning by creating and exchanging objects: the SCY experience. Special issue on learning objects, British Journal of Educational Technology.
[4] Dochy, F., Segers, M., & Sluijsmans, D. (1999). The use of self-, peer-, and co-assessment in higher education: A review. Studies in Higher Education, 24, 331-350.
[5] Draper, S.W. (1999). Analysing fun as a candidate software requirement. Personal Technology, 3(1), 1-6.
[6] Falchikov, N. (1995). Peer feedback marking: Developing peer assessment. Innovations in Education and Training International, 32, 175-187.
[7] Frost, J. & Turner, T. (Eds.) (2005). Learning to Teach Science in the Secondary School, Second Edition. London: Routledge Falmer.
[8] Gaver, W.W., & Martin, H. (2000). Alternatives: Exploring Information Appliances through Conceptual Design Proposals. In Proceedings of the CHI 2000 Conference on Human Factors in Computing (pp. 209-216). New York: ACM, Addison-Wesley.
[9] Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2009). Improving the effectiveness of peer feedback for learning. Learning and Instruction, doi:10.1016/j.learninstruc.2009.08.007.
[10] Glass, B. (1997). Swept away in a sea of evolution: new challenges and opportunities for usability professionals. In R. Liskowsky, B.M. Velichkovsky, & W. Wünschmann (Eds.), Software-Ergonomie '97. Usability Engineering: Integration von Mensch-Computer-Interaktion und Software-Entwicklung (pp. 17-26). Stuttgart: B.G. Teubner.
[11] Hanrahan, S.J. & Isaacs, G. (2001). Assessing self- and peer-assessment: the students' views. Higher Education Research and Development, 20, 53-70.
[12] Hassenzahl, M. (2002). The effect of perceived hedonic quality on product appealingness. International Journal of Human-Computer Interaction, 13, 479-497.
[13] Hassenzahl, M. (2003). The thing and I: understanding the relationship between user and product. In M. Blythe, C. Overbeeke, A.F. Monk, & P.C. Wright (Eds.), Funology: From Usability to Enjoyment (pp. 31-42). Dordrecht: Kluwer.
[14] Hassenzahl, M. & Tractinsky, N. (2006). User experience – a research agenda. Behaviour & Information Technology, 25(2), 91-97.
[15] Jenkins, H., Clinton, K., Purushotma, R., Robison, A.J. & Weigel, M. (2006). Confronting the Challenges of Participatory Culture: Media Education for the 21st Century. Chicago: The MacArthur Foundation.
[16] Jordan, P. (2000). Designing Pleasurable Products: An Introduction to the New Human Factors. London, New York: Taylor & Francis.
[17] Kollar, I. & Fischer, F. (2009). Commentary: Peer assessment as collaborative learning: a cognitive perspective. Learning and Instruction, doi:10.1016/j.learninstruc.2009.08.002.
[18] Monk, A.F. & Frohlich, D. (1999). Computers and fun. Personal Technology, 3, 91.
[19] Noonan, B. & Duncan, C.R. (2005). Peer and self-assessment in high schools. Practical Assessment, Research and Evaluation, 10(17).
[20] Roto, V., Rantavuo, H. & Väänänen-Vainio-Mattila, K. (2009). Evaluating user experience of early product concepts. International Conference on Designing Pleasurable Products and Interfaces, DPPI09, 13-16 October 2009, Compiegne University of Technology, Compiegne, France.
[21] Seligman, M.E.P. & Csikszentmihalyi, M. (2000). Positive Psychology: An Introduction. American Psychologist, 55, 5-14.
[22] Sluijsmans, D. (2002). Establishing learning effects with integrated peer assessment tasks. The Higher Education Academy. Retrieved from http://www.heacademy.ac.uk/resources/detail/id437
[23] Stefani, L. (1992). Comparison of collaborative, self, peer and tutor assessment in a biochemical practical. Biochemical Education, 20(3), 148-151.
[24] Stefani, L.A.J. (1994). Peer, self and tutor assessment: relative reliabilities. Studies in Higher Education, 19, 69-75.
[25] Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68, 249-276.
[26] Topping, K. (2003). Self and peer assessment in school and university: Reliability, validity and utility. In M. Segers, F. Dochy, & E. Cascallar (Eds.), Optimizing new modes of assessment: In search of qualities and standards. Dordrecht/Boston/London: Kluwer Academic Publishers.
[27] van Zundert, M., Sluijsmans, D. & van Merriënboer, J. (2010). Effective peer assessment processes: Research findings and future directions. Learning and Instruction, 20(4), 270-279.
[28] Vold, V., Wasson, B. & de Jong, T. (forthcoming). Assessing Emerging Learning Objects: ePortfolios and Peer Assessment. In Digital Assessment of 21st Century Inquiry Skills, Routledge 'Psychology in Education' series.
[29] Wasson, B. & Vold, V. (forthcoming 2010). Leveraging New Media Skills for Peer Feedback in Collaborative Inquiry Learning. First Nordic Symposium on Technology-Enhanced Learning (TEL), NORDITEL 2010.
[30] Yang, M., Badger, R., & Yu, Z. (2006). A comparative study of peer and teacher feedback in a Chinese EFL writing class. Journal of Second Language Writing, 15, 179-200.