<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Using Teaching Analytics to Inform Assessment Practices in Technology Mediated Problem Solving Tasks</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Geneviève Gauthier</string-name>
          <email>Genevieve.gauthier@ualberta.ca</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Educational Psychology, University of Alberta</institution>
          ,
          <country country="CA">Canada</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Teaching analytics can provide a useful framework to conceptualise the visual methods used in previous work on teachers' assessment practices. I describe two examples of visual methods and discuss their assumptions and potential usefulness beyond the original context in which they were designed. Rethinking visual data analysis around the teaching process provides different lenses and ways to use data to inform and improve both the teaching and learning processes.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Teaching analytics focuses on the development or adaptation of visual methods and
technological tools to inform and support teaching practices within technology
enhanced learning contexts [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. This perspective on data analysis and interpretation to support teachers’ instructional decision-making process acknowledges the importance of designing and thinking about technology-mediated learning in ways that give teachers an active role. In a discourse that has emphasized the power of educational technology around an individualistic and personalized approach to learning [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], teaching
analytics brings a new perspective into the technology mediated discourse. It reframes
the use of educational technology as enabling teachers and their teaching to occur in
new ways instead of trying to replace the teacher variable in the learning equation.
As argued by Goggins [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], isolating teaching analytics from the broader learning
analytics perspective reframes the learning equation by making this layer explicit,
acknowledging that learning happens in a social context in which teachers’ expertise is
instrumental and beneficial to this learning. In terms of research, this perspective
changes the focus from analysing how teachers can make sense of technology to
finding ways that technology can enhance or inform teachers’ practices. Teachers’ assessment practice relates to learners’ data because it is built and developed through years of interaction with learners and content in similar contexts.
      </p>
      <p>
        Investigating teachers’ practice instead of solely focusing on learners’ concrete instances of data enables us to explore teachers’ pedagogical content knowledge [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
This type of knowledge that experienced teachers have can be defined as their ability to select, represent and communicate components of the domain knowledge in a way that stimulates learning for novice learners. Some aspects of pedagogical content knowledge, like the understanding of what is difficult for learners, the ability to provide good examples and explanations, or the ability to predict difficulties or challenges that will likely be encountered by novice learners, can be linked to the concept of the student model in intelligent tutoring systems [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Excellent teachers have an implicit
understanding of how students tend to learn and struggle with the content knowledge
in their domain. An in-depth study of these predictive aspects of experienced teachers’ pedagogical content knowledge represents an indirect way to investigate typical learning patterns for specific tasks.
      </p>
      <p>
        In this paper I discuss how teaching analytics provide a useful framework to conceptualise the tools I have developed in previous work on teachers’ assessment practices. Thinking of data analysis in ways that can improve teaching enables me to rethink tools and methods beyond the research context in which they were developed and to propose ways to apply them in other contexts and domains. I begin by situating the context and purpose of the research we do before describing two examples of visual methods that can be described as teaching analytics. I then propose to use these examples to build and expand on the concept of teaching analytics as a tool to improve teaching expressed by previous authors [
        <xref ref-type="bibr" rid="ref3 ref6">3, 6</xref>
        ].
      </p>
    </sec>
    <sec id="sec-2">
      <title>Background and purpose of research</title>
      <p>
        The research is situated in a higher education context where teachers design and use
interactive teaching cases, which are conceived as problem solving activities or tools to help
students anchor knowledge [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. In this context, a teaching case typically consists of a
story about one or more problems affecting a patient and a scenario for addressing the
diagnosis and management of these problems. The development of interactive cases is
a technology-mediated task that changes the sequencing in which teachers need to
plan and implement their assessment. Teachers need to specify the expected outcomes or answers for each case ahead of time, not in reaction to students’ productions. These answers, along with the rules and arguments sustaining them, are used to give feedback to students. The development and testing of interactive teaching cases is done through BioWorld and its Case Builder companion (Lajoie, Lavigne, Guerrera, &amp; Munsie, 2001). BioWorld is a computer-based learning environment where participants are presented with patient cases to solve. The structure of the environment is non-linear; participants can interact with the problem by selecting potential hypotheses, ordering tests, checking vital signs and scrutinizing the patient problem in any sequence or order they want. While solving the case, participants collect evidence supporting their reasoning and are asked to sort and prioritize the items supporting their final diagnosis.
      </p>
      <p>
        The development of teaching cases and their corresponding answers and grading rules for the computer-based learning environment has led us to unravel
interesting assessment issues that highlight the tacit nature of teachers’ assessment
knowledge [
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ]. When experts and case creators were asked to perform relatively
easy cases using the computer-based learning environment, we encountered validity
and reliability issues regarding the proposed “good” answers for the cases. Experts
and case creators, who were both experienced practitioners and teachers, could neither
reproduce the “proposed” good answers for the cases, nor, upon replication of the cases, repeat their first answers. Subsequent analysis of the computer logs representing the different diagnostic tests or actions, comparing the different performances, did not reveal any meaningful patterns.
      </p>
      <p>To gain insight into how teaching cases could be assessed while allowing for the variability in the reasoning process that we observed in competent practitioners, we
investigated how experienced teachers, who manage this variability in their assessment
practice on a daily basis, conceptualize the notion of a good answer to teaching cases.
Through technology-mediated interactions structured to capture teachers’ knowledge in
action, we studied how teachers plan, design and interpret students’ reasoning
performance in the context of open-ended interactive scenarios.
</p>
    </sec>
    <sec id="sec-3">
      <title>Teaching Analytics as a tool for practitioners to reflect about their assessment practices</title>
      <p>
        This first example of how the methods developed for our research on assessment can gain from being framed as teaching analytics builds on the idea expressed by Rebholz,
Libbrecht and Müller [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] of using visual representations of data as a tool for practitioners
to improve their decision-making process. We propose to use the concept of tool to go beyond the investigation of learners’ individual performance and include analyses related to the practitioner’s judgment over time or over a series of different tasks. As
expressed by Goggins [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], teaching analytics have the potential to enhance teaching and subsequently have a positive impact on learning. Going beyond the raw data
produced by learners and focusing on teachers’ practices and experiences of interaction
with these learners opens up possibilities in the ways we use different sources of data.
In this example we describe and explain how teaching analytics can be thought of as tools to help practitioners reflect and gain insight about their practice. We use the concept of the student model, as internalized tacit knowledge teachers gain through experience, as an assumption to interrogate their assessment practice for specific teaching
cases.
      </p>
      <p>
        Teachers’ instructional practice happens in a constant flux of decision-making in action. Teachers make a lot of decisions while interacting with students,
content and context but they are not always able to remember or reason about these
decisions [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. When investigating teachers’ knowledge about assessment we realised
that most of it was implicit; they tend to know what a good performance looks like
when they see it but they cannot easily articulate the criteria they use to make this
judgment [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Throughout a number of our experiments, teachers expressed
surprise about the variability in the problem solving process of relatively easy cases
[
        <xref ref-type="bibr" rid="ref12 ref9">9, 12</xref>
        ]. They were puzzled by the variability between the answers they had proposed and the ones they could produce when doing the case in the computer learning environment. They were incredulous about the computer log and the recording of
their performances when we asked them to solve the same cases twice. Their beliefs
about assessment of teaching cases were not aligned with the reality captured by the
computer. As a result we had to design a method to capture their knowledge in action
instead of relying on what they thought the answer should be. The method aimed at
capturing teachers’ implicit knowledge about their assessment practice anchored in
concrete events instead of relying on memory or predictions about the problem
solving process and answer [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
      </p>
      <p>
        We build on teachers’ case-based knowledge by relying on their verbal abilities, which tend to be more developed than their written abilities. Teachers’ ability to verbalize their thinking for an external audience is well developed, and using think-aloud protocols with them provides richer verbal descriptions and explanations than with non-teaching experts. To gain insight about the problem solving process within
the technology mediated problem-solving task, we framed a think-aloud protocol
analysis into a teach-aloud task, which is anchored in a familiar case presentation task
for these teachers [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. The use of verbal data combined with video and computer
logs provides a better retrospective understanding of the meaning of the data in
relationship to both the global and contextual nature of the problem solving performance.
This labour intensive strategy addresses the incompleteness of the data collected
through computer log interactions as expressed by Goggins [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], and it helps teachers
reflect back on the data as they provide rich narrative context and cues.
      </p>
      <p>
        The method developed throughout a number of experiments in my doctoral studies [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] can be conceptualized as an example of teaching analytics since it
uses a visual method to display computer log data in the context of a think-aloud
protocol. Even though we have mainly used data on teachers’ performance, given our research question related to assessment judgment, we use the information to inform teachers’ understanding of the assessment process, which has had an impact on their actual assessment practice. The use of their own performance data is justified because teachers’ assessment judgments have been shown to be anchored in personal knowledge about the task [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] and inference or interpretation of their own behaviour is less prone to different types of biases than their assessment of students’ performance [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ].
      </p>
      <sec id="sec-3-1">
        <title>Brief description of design and use of visual representation of data</title>
        <p>
          Data collection and analysis to develop, validate and interact with the visual
representation occurs in two phases. In phase 1 teachers solve the cases while performing a
think-aloud protocol [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ] during their interaction with the computer-based learning
environment. In the first phase of analysis the computer log data and the think-aloud
protocol are combined and transformed into a sequential representation of the problem
solving performance. The framing of the think-aloud, as a presentation for a specific
audience of learners, enables the use of conversational analysis [
          <xref ref-type="bibr" rid="ref19">19</xref>
          ] for these
monologues where the focus is on the intentions and meaning(s) of the utterances and
actions performed by the participants. The goal in building these visual representations
is to use empirical qualitative models as tools to study the complexity of problem
solving performance with participants.
        </p>
        <p>
          We use these visual representations with participants to have them validate
the analysis before asking them to use the visual representation as a tool to reflect on
the assessment of this specific teaching case. The validation phase, where teachers
can inspect their entire verbal protocol, is similar to a retrospective think-aloud
protocol where participants have the opportunity to add or comment on their previous
performance [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ]. After the validation task, participants are asked to reflect on their
problem solving process by categorizing the key features of their resolution process
for the cases that are indicative of a good performance. Figure 1 below shows a
section of the categorized individual visual representation.
This visual teach-aloud method was designed to study how experienced teachers use
their contextual case knowledge. We aimed at extracting their concept of competent
reasoning for specific cases to inform a reflection on their individual and shared
assessment practice. We do not intend to question teachers’ expertise and knowledge, but rather to encourage retrospective contemplation of their actions after the fact.
This strategy is what Schön [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ] refers to as “reflection-on-action”; it builds on their implicit knowledge and promotes a better understanding of the strengths and limitations of their assessment judgment.
        </p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Beyond the individual learner: analysing different units of data</title>
      <p>
        In this second example we build on Goggins’ [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] idea of using teaching analytics as a
tool to bring a social perspective into the assessment of learning analytics. Yet, our
concept of group is more elusive and not focused on groups that are interacting and
learning together in time and space. Teaching analytics can enable teachers to
capitalize on the social component of learning beyond the analysis of the individual learner’s
performance.
      </p>
      <p>This perspective on the analysis and use of data has the potential to open the
door of the classroom by rethinking the unit, purpose and use of the data within the
learning process. Individual performance data may be relevant in some situations but
different grouping of data may be more useful or informative if we frame different
questions about the learning process. What defines a ‘meaningful unit’ depends on
the context and aspect of the learning process that we decide to explore. We suggest
that a group of teachers grading the same task, or an entire class doing the same task or problem, is a useful unit of analysis. Analysing
data from these two groups can provide insight and better contextualize judgment and
interpretation of performance in a computer-based learning environment. We briefly define the different groups or units of analysis that we have used in our experiments and discuss how these uses of data can inform teachers about their assessment practices.</p>
      <sec id="sec-4-1">
        <title>Triangulating data and comparing teachers’ assessment criteria</title>
        <p>In our work we used a group of teachers who teach the same course to different students as a useful unit of analysis. These teachers sometimes discuss or build teaching cases together, but they do not teach or learn together per se. We compared their concepts of a good performance for each case and analysed the convergence in their judgment. We created combined visual representations for each case resolution by merging each participant’s individual representation into one complex multi-layered representation. These representations were the result of the analysis of their convergence in decision making about the key elements required for students to demonstrate successful reasoning.</p>
        <p>The analysis process was very insightful for teachers, as they had never had
the opportunity to see how competent colleagues would solve these types of
problems. We did not capture their reflection about the process, but teachers’ comments on a number of occasions gave us insight into the impact of the process on their perspective on assessment. For example, at the beginning of the analysis, one participant asked which student had produced one of the visual representations I had on the wall. As this visual representation happened to be produced by one of their colleagues, it completely changed the way they were looking at it. This comment was revealing of their typical assessment experience, where they mostly had to compare and evaluate students’ performances solely based on their own understanding. The exercise of comparing with colleagues opened their minds to different “good” ways to solve these cases, which impacts how teachers assess them as well.</p>
      </sec>
      <sec id="sec-4-2">
        <title>Visual representations as a tool to contextualize performance and its assessment</title>
        <p>
          The use of visual representations has the potential to inform teachers about the nature
of competent performance. It provides a concrete trace of the context and process of
the problem solving performance, which includes tangents and mistakes and differs from their ideal image of what the answer should be. This reminder about the
complexity and potential variation in the performance of competent self or colleagues
improves the transparency of the assessment procedure. It enables a better evaluation
of the validity claims and the corresponding inferences of proficiency related to its
scoring in small-scale educational settings [
          <xref ref-type="bibr" rid="ref21">21</xref>
          ]. While collecting students’ performance on the cases, we have experimented with using the visual representation tools described above as an assessment tool to promote teachers’ critical perspective about their own judgment. Teachers asked more questions in their feedback, but they also ended up using the visual representations as tools to communicate their assessment criteria to students when they gave them their grades and feedback. Students said that the representations of the problem solving process helped them see how they could have solved the case in different ways and made them aware of how much more depth they could have gone into.
        </p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Conclusion and discussion</title>
      <p>The two examples described above show how the use of visual displays of data analysis can provide insight into the teaching process in the context of research on assessment. If we think about these tools beyond the context in which they were designed and used, they can become more generic tools that have the potential to support the teaching and learning process in other contexts. Teaching analytics provide a new way to frame questions related to the teaching process by acknowledging the role of
the teacher as the orchestrator of learning in classroom settings. A better
understanding of the socio-technical context in which learning occurs will lead to the design and implementation of technologies that better align technical requirements and affordances with teachers’ analytic strengths and weaknesses.</p>
      <p>Future research using teaching analytics as tools to explore assessment practices will go beyond the realm of computer-mediated interaction and examine whether these tools can have an impact on classroom assessment judgment in contexts where no technology is involved. We will also explore the use of visual representations to anchor discussions and training about assessment with groups of teachers in disciplinary and interdisciplinary contexts in health.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Vatrapu</surname>
            ,
            <given-names>R.K.</given-names>
          </string-name>
          :
          <article-title>Towards semiology of teaching analytics</article-title>
          .
          <source>TaPTA Workshop</source>
          at EC-TEL, Saarbrücken, Germany (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Cuban</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Oversold &amp; underused: Computers in the classroom</article-title>
          . Harvard University Press, Cambridge, MA (
          <year>2001</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Goggins</surname>
            ,
            <given-names>S.P.</given-names>
          </string-name>
          :
          <article-title>Group informatics: A multi-domain perspective on the development of teaching analytics</article-title>
          .
          <source>TaPTA Workshop</source>
          at EC-TEL, Saarbrücken, Germany (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Shulman</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Knowledge and teaching: Foundations of the new reform</article-title>
          .
          <source>Harvard Educational Review</source>
          <volume>57</volume>
          ,
          <fpage>1</fpage>
          -
          <lpage>23</lpage>
          (
          <year>1987</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Murray</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Authoring Intelligent Tutoring Systems: An Analysis of the State of the Art</article-title>
          .
          <source>International Journal of Artificial Intelligence in Education</source>
          <volume>10</volume>
          ,
          <fpage>98</fpage>
          -
          <lpage>129</lpage>
          (
          <year>1999</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Rebholz</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Libbrecht</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Müller</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          :
          <article-title>Learning analytics as an investigation tool for teaching practitioners</article-title>
          .
          <source>TaPTA Workshop</source>
          at EC-TEL, Saarbrücken, Germany (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7. Kolodner, J.L.,
          <string-name>
            <surname>Camp</surname>
            ,
            <given-names>P.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Crismond</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fasse</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gray</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Holbrook</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Puntambekar</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ryan</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Problem-based learning meets case-based reasoning in the middle-school science classroom: Putting Learning by Design (TM) into practice</article-title>
          .
          <source>Journal of the Learning Sciences</source>
          <volume>12</volume>
          ,
          <fpage>495</fpage>
          -
          <lpage>547</lpage>
          (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8. Gauthier, G.,
          <string-name>
            <surname>Lajoie</surname>
            ,
            <given-names>S.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Richard</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wiseman</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Mapping and validating diagnostic reasoning through interactive case creation</article-title>
          .
          <source>In: E-Learn</source>
          <year>2007</year>
          , pp.
          <fpage>2553</fpage>
          -
          <lpage>2562</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9. Gauthier, G.,
          <string-name>
            <surname>Conway</surname>
            ,
            <given-names>J.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taylor</surname>
          </string-name>
          , R.:
          <article-title>Assessment planning for interactive case-based learning scenario</article-title>
          .
          <source>European Association for Research on Learning and Instruction (Earli) Special Interest Group on Assessment and Evaluation</source>
          , Brussels, Belgium (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Fenstermacher</surname>
          </string-name>
          , K.D.:
          <article-title>The tyranny of tacit knowledge: What artificial intelligence tells us about knowledge representation</article-title>
          .
          <source>In: 38th Hawaii International Conference on System Sciences</source>
          , pp.
          <fpage>1530</fpage>
          -
          <lpage>1605</lpage>
          . IEEE
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Sadler</surname>
            ,
            <given-names>D.R.</given-names>
          </string-name>
          :
          <article-title>Specifying and promulgating achievement standards</article-title>
          .
          <source>Oxford Review of Education</source>
          <volume>13</volume>
          ,
          <fpage>191</fpage>
          -
          <lpage>209</lpage>
          (
          <year>1987</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12. Gauthier, G.,
          <string-name>
            <surname>Conway</surname>
            ,
            <given-names>J.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taylor</surname>
          </string-name>
          , R.:
          <article-title>Variability in the expert solution for case based learning scenarios: Reliability issues</article-title>
          .
          <source>Canadian Society for the Study of Education (CSSE)</source>
          , Waterloo, Ontario (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13. Gauthier, G.:
          <article-title>Teach aloud: A modified version of the think-aloud protocol to study the teaching of clinical reasoning</article-title>
          .
          <source>Qualitative Health Research (QHR)</source>
          , Montreal, Quebec (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Anspach</surname>
            ,
            <given-names>R.R.</given-names>
          </string-name>
          :
          <article-title>Notes on the Sociology of Medical Discourse: The Language of Case Presentation</article-title>
          .
          <source>Journal of Health and Social Behavior</source>
          <volume>29</volume>
          ,
          <fpage>357</fpage>
          -
          <lpage>375</lpage>
          (
          <year>1988</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Gauthier</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Capturing and Representing the Reasoning Processes of Expert Clinical Teachers for Case-Based Teaching</article-title>
          . McGill University, Montreal (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Wyatt-Smith</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Castleton</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Freebody</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cooksey</surname>
            ,
            <given-names>R.W.</given-names>
          </string-name>
          :
          <article-title>The nature of teachers' qualitative judgements: A matter of context and salience. Part one: In-context judgements</article-title>
          .
          <source>Australian Journal of Language and Literacy</source>
          <volume>26</volume>
          ,
          <fpage>11</fpage>
          -
          <lpage>32</lpage>
          (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Bowers</surname>
            ,
            <given-names>A.J.</given-names>
          </string-name>
          :
          <article-title>What's in a grade? The multidimensional nature of what teacher-assigned grades assess in high school</article-title>
          .
          <source>Educational Research and Evaluation</source>
          <volume>17</volume>
          ,
          <fpage>141</fpage>
          -
          <lpage>159</lpage>
          (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Ericsson</surname>
            ,
            <given-names>K.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Simon</surname>
            ,
            <given-names>H.A.</given-names>
          </string-name>
          :
          <article-title>Protocol Analysis: Verbal Reports as Data</article-title>
          . MIT Press, Cambridge, Mass. (
          <year>1993</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Hutchby</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wooffitt</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <source>Conversation Analysis: Principles, Practices and Applications</source>
          . Blackwell Publishers Inc, Oxford (
          <year>1998</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Schön</surname>
            ,
            <given-names>D.A.</given-names>
          </string-name>
          :
          <article-title>The Reflective Practitioner: How Professionals Think in Action</article-title>
          . Basic Books, New York (
          <year>1983</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Kane</surname>
            ,
            <given-names>M.T.</given-names>
          </string-name>
          :
          <article-title>An argument-based approach to validation</article-title>
          .
          <source>Psychological Bulletin</source>
          ,
          <fpage>527</fpage>
          -
          <lpage>535</lpage>
          (
          <year>1992</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>