=Paper= {{Paper |id=None |storemode=property |title=The Value Chain for Web Accessibility: Challenges and Opportunities |pdfUrl=https://ceur-ws.org/Vol-792/Petrie.pdf |volume=Vol-792 }} ==The Value Chain for Web Accessibility: Challenges and Opportunities== https://ceur-ws.org/Vol-792/Petrie.pdf
The value chain for web accessibility: challenges and opportunities

Helen Petrie1, Christopher Power1, David Swallow1, Carlos A. Velasco2, Blaithin
 Gallagher3, Mark Magennis3, Emma Murphy3, Sam Collin4 and Keren Down4
1 Human Computer Interaction Research Group
Department of Computer Science
University of York
Deramore Lane, York YO10 5GH, UK
{apfreire|helen.petrie|cpower}@cs.york.ac.uk

2 Web Compliance Center,
Fraunhofer Institute for Applied Information Technology (FIT)
Schloss Birlinghoven, D-53757 Sankt Augustin, Germany
Carlos.Velasco@fit.fraunhofer.de

3 National Council for the Blind of Ireland
Whitworth Rd, Drumcondra,
Dublin 9, Ireland
{Blaithin.Gallagher|mark.magennis|emma.murphy}@ncbi.ie

4 Foundation for Assistive Technology
302 Tower Bridge Business Centre, 46-48 East Smithfield,
London E1W 1AW, UK




    Abstract. This paper presents the results of interviews with
    representatives of the three key groups of stakeholders in the value chain for
    web accessibility: website commissioners, web developers and web
    accessibility experts. Twenty-six web commissioners, seven web developers and 14 web
    accessibility experts were interviewed. The results show that in spite of great
    efforts by the World Wide Web Consortium, the European Commission and
    other international and national organizations to promote web accessibility,
    knowledge of this topic is still low. More critically, the tools to support
    commissioners, developers and accessibility experts are still very poor and do
    not provide much of the functionality that the various groups in the value chain
    need. We believe these results highlight some of the reasons why the state of
    web accessibility is still as poor as it is.

    Keywords: web accessibility, value chain, website commissioners, website
    owners, web developers, web accessibility experts

1    Introduction

    At the moment it is clear that much Web content and many Web 2.0 applications
are not accessible to disabled and older users (Kane, Shulman, Shockley, and Ladner,
2007; Lazar et al, 2011). If this situation is to change, there is a chain of stakeholders
who need to be both aware of accessibility issues and able to play their appropriate
role in creating accessible Web content and Web 2.0 applications and services.
    The first link in this chain is the individuals who commission, own and manage
websites and Web 2.0 applications. In this paper we will refer to these individuals as
“website commissioners” for brevity. These individuals may not understand a great
deal about the technicality of the websites and applications, and in particular, they
may not understand the technicalities of how these are created to be accessible. But
they do need to be able to understand the general issues of accessibility and to
monitor whether the websites and applications they are responsible for are indeed
accessible. This might involve running accessibility tests or at least understanding the
output from accessibility testing of websites and applications, and discussions with
web developers and web accessibility evaluators.
    The second group in the chain is individuals who design and develop websites and
Web 2.0 applications (we will refer to this group as “web developers” for brevity).
Some of these individuals may do only the design of a website or application, such as
the layout and colour schemes. They should know about aspects of accessibility such
as easy to read fonts and good colour contrast for partially sighted people, colour
combinations not suitable for people with colour vision deficiencies and line length
and spacing requirements for people with dyslexia. They need tools to help them
assess whether their designs are going to be accessible. Developers are those
individuals who actually code a website, application or service. They need to
understand all the “nitty gritty” of how to code for accessibility, from marking up
tables correctly to ensuring that Flash animations are coded appropriately. They need
tools to help them produce accessible code most effectively and efficiently.
    The third group in the chain are the accessibility experts, who may well advise the
first two groups in the chain, and who test websites, applications and services for
accessibility. They may be employed by web commissioners or web developers,
either directly or as consultants or they may be working for external services that are
benchmarking websites and applications, such as government agencies or
organizations that represent disabled and older people. Obviously, they need to
understand all the complexities of technical accessibility. Nonetheless, they need
tools to support them in assessing the accessibility of websites and applications
effectively and efficiently as this is a very time consuming process, and they also need
to be able to communicate the results of these assessments to web commissioners and
web developers in ways that will be understandable to these individuals who may
have different levels of understanding of accessibility.
    This paper will present the results of in-depth interviews with samples of
individuals from each of these three groups in the chain to achieve accessible websites
and applications. These results will provide us with a greater understanding of the


needs of each of these groups and requirements for the development of tools to
support each group.


2    Terminology about web accessibility testing

   We have discovered that there are considerable differences between experts in the
field as to the terms used to describe different forms of web accessibility testing.
Therefore we will set out our definitions for a number of relevant terms. “Manual
accessibility testing” (sometimes shortened to “manual testing”) is any process where
the human web accessibility evaluator (sometimes shortened to “accessibility
evaluator”, who may or may not be a web accessibility expert, see above) makes the
decision about whether something is an accessibility issue or not. To do this, they
may use Web Accessibility Testing Tools (WATTs) to provide information to help
them make this decision. These tools are often plug-ins to web browsers and include:
       •      WAVE Toolbar1
        •     Web Accessibility Toolbar from the Paciello Group2
        •     AIS Web Accessibility Toolbar3
   These tools allow a human web accessibility evaluator to work with the source
code to assess accessibility issues. For example, a tool can find all the <img>
elements on a web page and indicate whether they have an alt text or not. However, a
human accessibility evaluator decides whether the alt text, if provided, is appropriate
for the image. Accessibility evaluators also often use testing tools not specifically
developed for accessibility testing but for other web development purposes (Non-Web
Accessibility Testing Tools, nWATTs), such as:
        •     Firebug4
        •     Web Developer Toolbar for Mozilla5,6
        •     Opera Dragonfly7
        •     Code inspector for Safari8
   For example, many WATTs allow the evaluator to isolate specific lines of code;
however, many nWATTs provide more useful functionality, and allow the
accessibility evaluator to dynamically move from a line of code to view and access
the elements that encapsulate it, or allow the accessibility evaluator to view the
Document Object Model (DOM) directly.
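The division of labour described here — the tool surfaces element-level facts, the human evaluator judges them — can be illustrated with a minimal sketch in Python's standard `html.parser`. This is not how any of the toolbars above is implemented; the class name and page snippet are invented for illustration:

```python
from html.parser import HTMLParser

class ImgAltCollector(HTMLParser):
    """Collect every <img> element and its alt attribute (or None if absent)."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # Record the src and the alt value; alt=None means the
            # attribute is missing entirely.
            self.images.append((attr_map.get("src", ""), attr_map.get("alt")))

page = '<p><img src="logo.png" alt=""><img src="chart.png"></p>'
collector = ImgAltCollector()
collector.feed(page)
for src, alt in collector.images:
    # The tool can only report presence or absence; whether the alt text
    # is *appropriate* remains a human judgement.
    print(src, "alt present" if alt is not None else "alt MISSING")
```

A WATT does essentially this kind of extraction at scale and then presents the results to the evaluator for judgement.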


1 http://wave.webaim.org/toolbar (retrieved 25 April 2011)
2 http://www.paciellogroup.com/resources/wat-ie-about.html (retrieved 25 April 2011)
3 http://www.visionaustralia.org.au/ais/toolbar/ (retrieved 25 April 2011)
4 https://addons.mozilla.org/en-US/firefox/addon/firebug/ (retrieved 25 April 2011)
5 http://chrispederick.com/work/web-developer/ (retrieved 25 April 2011)
6 https://addons.mozilla.org/en-us/firefox/addon/web-developer/ (retrieved 25 April 2011)
7 http://www.opera.com/dragonfly/ (retrieved 25 April 2011)
8 http://www.apple.com/safari/features.html (retrieved 25 April 2011)

   “Automatic accessibility testing” (sometimes shortened to “automatic testing”) is
any process where an algorithm makes the decision about whether something is an
accessibility problem or not. For example, an algorithm can decide whether an <img>
element has an alt text or not. The first and still most famous Automatic Accessibility
Testing Tool (AATT) was Bobby9, now no longer available. An extensive list of
AATTs is available at the WAI website10. AATTs inevitably only assess a subset of
web accessibility problems, as many problems require a human judgement, as
discussed above.
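The boundary between what an algorithm can decide and what it must hand back to a human evaluator can be sketched as a single hypothetical check (the function name, categories and message strings are invented for illustration, not taken from any existing AATT):

```python
def check_img_alt(attrs):
    """Automatic test on one <img> element's attributes.

    An algorithm can decide *presence* of alt text (a definite error if
    missing), but appropriateness of the text still needs a human, so a
    present alt yields only a warning for manual review.
    """
    if "alt" not in attrs:
        return ("error", "img element has no alt attribute")
    return ("warning", "alt text present; a human must judge its appropriateness")

# Missing alt: the algorithm can decide this on its own.
print(check_img_alt({"src": "logo.png"}))
# Present alt: only flagged for human judgement.
print(check_img_alt({"src": "logo.png", "alt": "Company logo"}))
```

This split is why AATTs can only ever cover a subset of accessibility problems: every "warning" category still requires the manual testing described above.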



3    Method

3.1       Participants

   Twenty-six website commissioners participated in this study. Three (11.5%) work in
the private sector, 13 (50%) in the public sector and 10 (38.5%) in the third sector (for
charities, NGOs etc.). Ten (38.5%) work for very small organizations (49 employees or
fewer), 7 (26.9%) for small organizations (50 to 250 employees) and 9 (34.6%) for large
organizations (more than 250 employees).
   Seven web developers took part in the study. All were male. Four were in their 20s and
three were in their 30s. They had between 4 and 13 years of experience of web
development, with an average of 8 years. Three participants work for large
enterprises, three work for SMEs and one is self-employed.
   The web developers use different tools for their Web2.0 development. Only two of
the developers used integrated development environments (IDEs), and then only for a
minority of their development tasks. One developer uses Eclipse, but only for
coding; debugging and validation are done in separate applications. Another developer
uses Dreamweaver, but stressed it was purely for the code highlighting, and he
preferred the old version because it was much simpler. The other five developers
used simple text editors such as Notepad++ or vim for all their development work.
   Fourteen web accessibility experts participated in this study. Three were women and
11 were men. Interviewees had an average of 8.7 years of experience with accessibility
evaluations, with the range being 0.5 years to 13 years. The web accessibility experts
were from a range of types of organizations: x were from private organizations and x
from public organizations; x were from SMEs and x were from large
organizations. For all of those working for large organizations and for one of the small
organizations, the interviewee worked in a smaller unit/group within the
organization. Across all organizations, three reported that they only did evaluations of
web applications internal to the company, while two reported that they do only
evaluations for external clients. The remaining nine reported that they do evaluations


9 http://www.cast.org/learningtools/Bobby/index.html (retrieved 25 April 2011)
10 http://www.w3.org/WAI/ER/tools/complete.html (retrieved 25 April 2011)


for internal and external clients, with two saying their work was more heavily
weighted to external clients.


3.2    Interview schedules


   For the web commissioners and web accessibility experts, interviews were
conducted either face-to-face or via phone. For the web developers, a contextual
interview was used – the interview took place at the participant’s place of work and
the participant was asked to work through their typical working methods with the
interviewer, illustrating how they do typical tasks (Beyer and Holtzblatt, 1997).

  For the web commissioners, the interview schedule covered the following topics:
   • The organization
   • The website, and web applications if relevant
   • The interviewee, particularly their knowledge/training in web technologies,
       accessibility and usability
   • Web quality processes

   As the interviewer did not want to cue the interviewee that the interview was about
web accessibility in particular, the recruitment information and interview schedule
were presented as being about web quality processes. This was reasonable, as web
accessibility is one aspect of web quality. Therefore questions about web usability as
well as web accessibility were asked.

  For the web developers, the interview schedule covered the following topics:
   • The web developer: their knowledge and skills, their organisation, and the
        nature of the work they are involved in. As well as asking for objective,
        factual information to put the participant at ease, this section also established
        whether the participant felt their knowledge of the field was up-to-date and
        what they considered to be their strengths and weaknesses.
   • Workspace configuration: an “introduction” to the participant’s workspace
        and immediate surroundings. This section investigated the hardware used by
        the participant (e.g. monitors, input devices etc.) as well as any development-
        related artefacts (e.g. post-it note reminders, whiteboards, notepads etc.). It
        also established the software applications and webpages that the participant
        typically uses to develop websites.
   • Communication: who the web developer communicates with during a
        typical day, and how. This section focused on how and when the participant
        communicates with clients, colleagues, and management in order to gain an
        understanding of their day-to-day interaction with others. It also established
        how web development requirements and progress reports are formalized in
        the participant’s organization.
   • Help and information: who and what resources the web developer turns to
        when they need help. Having asked the participant to assume they have run
        into a typical technical problem, this section investigated who and what
         resources they turn to for help, the type of information they are typically
         looking for, and the stage of development at which such problems usually occur.
    •    Standards-compliance: what standards-compliance means to the web
         developer and how they achieve it. This section explored the participant’s
         understanding of a standards-compliant website, any challenges they have
         faced in making websites standards-compliant, and the extent to which their
         websites are standards-compliant.
    • Preview: how the web developer previews a website they have created. This
         section required the participant to demonstrate the browsers, applications,
         and devices that they typically use to preview websites. It also established
         what the developer looks for when previewing a website and how dynamic
         webpages and third-party components are previewed.
    • Validation: how the web developer validates a website they have created.
         This section required the participant to demonstrate how they validate a
         website. It also established how they tackle any validation errors, how
         frequently they validate a website, how useful they find the feedback from
         validation tools, and how easily validation checks fit into the participant’s
         workflow.
    • Users: what consideration the web developer gives to the users of their
         websites. This section focused upon the usability of the developer’s websites,
         whether instructions are required, and established who is responsible for the
         development of feedback, instructions, and error messages.
    • Accessibility: what consideration the web developer gives to the accessibility
         of their websites. This section explored the web developer’s understanding of
         accessibility, the importance they place upon accessibility, and who they
         feel is responsible for making websites accessible. It also established the
         extent to which the participant’s websites are accessible as well as the factors
         that motivate or prevent the participant from making a website accessible.
    •    Accessibility Testing: how the web developer tests the accessibility of a
         website they have created. This section required the participant to
         demonstrate how they test the accessibility of a website. It also established
         how they tackle any accessibility problems, how frequently they test the
         accessibility of a website, how useful they find the feedback from
         accessibility tools, and how easily accessibility tests fit into the participant’s
         workflow.
    • Future Improvements: how the web developer could improve the
         accessibility of the website they have created in future. This forward-looking
         section investigated the type of help and information that the participant
         would find useful to develop accessible websites in future. It considered the
         form such information might take, the granularity of the information, when
         such help and information might be useful, and how it might fit into the
         developer’s workflow.

   For the web accessibility experts, the interview schedule covered the following
topics:
     • The organization and its culture regarding accessibility evaluations
     • The experience and training of the interviewee in accessibility evaluations


    •     The tools used during automated and/or manual accessibility evaluations
    •     Which features in the tools are used most commonly and how useful those
          features are to the evaluators
    •     Sampling methods used to select web pages for testing
    •     Tracking of tests performed on web pages
    •     Reporting to developers/commissioners
    •     Procedures for maintenance of web accessibility


4    Results

   The interviews with website commissioners showed that they are responsible for
websites which include a range of complex features that are challenges for
accessibility, particularly the use of media players (found in 61.5% of the websites)
and link sharing for social network sites (found in 57.7% of the websites). The use of
CMSs was very common, with 88.5% of the websites using them. There were a
number of comments from participants indicating that they rely on the CMS to ensure
accessibility of the website. This strategy is only as good as the CMS used and its
ability to ensure accessibility correctly. Web commissioners need to be able to
evaluate accessibility independently of their CMS, and in the case of several
participants, this was clearly not happening.
   Web commissioners’ organizations often out-source part or all of the website work
(34.6% total out-sourcing, 46.2% partial out-sourcing). Out-sourcing of web design
and development work led to a number of issues around accessibility. Several
participants remarked that companies claimed to have expertise in accessibility, but
they found this hard to judge independently. In addition, out-sourcing could create
accessibility-related conflicts, as one participant found to their cost, when their design
company proposed a design for the website that the implementation company argued
would be inaccessible (and unfortunately, design won over accessibility).
   In terms of disabled and older people as audiences for websites, only 11.5% of web
commissioners spontaneously mentioned these groups as potential audiences for their
website. However, when prompted, 69.2% agreed that these groups might be
potential audiences for the website. These results show that website commissioners
are not really thinking about the potential audiences of their websites, as disabled and
older people are almost always possible audiences for websites. This means that
accessibility will often be omitted from the agenda in the commissioning of websites.
   In contrast, when asked whether their organization has a policy on accessibility of
the website for disabled and older people, 61.5% of the web commissioners said they
did. This difference between the question about audiences and the question about web
accessibility policy probably shows the subject bias effect (Rosnow and Rosenthal,
1997) one gets in asking questions about web accessibility. It may also show the lack
of realism in the minds of most people about people with disabilities and older people
– they fail to realize the range of things that disabled and older people may wish
and need to do using the web. An example was a school for children with disabilities
that did not think its website needed to be accessible to disabled or older people –

although parents of the pupils were a key target audience. Could the parent of a
disabled child not be disabled themselves, or the guardian of a disabled child not be
their grandparent?
   Most website commissioners (88.5%) correctly understood the meaning of web
accessibility, with somewhat fewer showing a correct understanding of web usability
(69.2%). About half the participants had some training or knowledge of web
technologies, web accessibility and web usability. However, only a third had any
training in web accessibility and usability; the others had picked up information on
the job. In retrospect, it would have been better to ask separate questions about
training and knowledge, to tease out this difference. Even the figure of one third of
web commissioners having knowledge of web accessibility may overestimate their
knowledge, as a number of the participants had worked with CFIT, part of NCBI,
and had training or knowledge in web accessibility from this collaboration.
   Finally, web commissioners were asked about their web quality processes for
accessibility and usability. For web accessibility, more than half (61.5%) of
participants said that they (or the companies they out-source to) assess the
accessibility of their website. However, it was worrying that only a quarter of these
participants could say what kind of testing had been done. This does not indicate a
good understanding of the process. Only 7.7% of participants named a WCAG
conformance level that they were aiming for or were tested against. Few participants
had a formal process for ensuring that accessibility problems identified in testing were
addressed and that accessibility of the website was maintained going forward from the
testing. This would indicate a clear gap in the support provided to website
commissioners in tracking accessibility of their site.
   Results for web usability quality control were very similar and participants
appeared to deal with accessibility and usability together. This may be a “hook” that
those concerned with web accessibility can use to convince website commissioners to
implement quality control for accessibility – by bundling it with usability and
showing that commissioners can achieve two goals with one system or tool.
   The contextual interviews with web developers showed that, amongst the seven
developers, who represented a range of organization and Web 2.0 application types,
only two used IDEs for their work, and those two only in a small way.
   The points of particular interest that emerged from these interviews with
minimalist environment developers are:
     • These web developers use very minimalist environments for coding and
          separate both physically and temporally their coding from their testing and
          validation of webpages
     • Many of these developers use and are very positive about browser extensions
          such as FireBug for debugging, and thought that a similar system for
          accessibility support would be appropriate (interestingly, none of the
          developers were aware of the browser extensions for accessibility such as the
          AIS Web Accessibility Toolbar)
     • These developers felt that inline active accessibility tools in their main editor
          would be very annoying


    These points suggest that, for these developers, accessibility tool developers should
follow the route started by the AIS Toolbar and similar browser extensions and
provide accessibility testing support for minimalist environment web developers as
browser extensions.
    In addition:
     • These developers want a clearer quantitative checklist of accessibility
         problems (while acknowledging that not all problems can be
         “programmatically determined”) or a bullet-point headline list of problems in
         clear “human” language with more technical code suggestions and examples
         behind each bullet-point
     • These developers want explanations of accessibility problems from
         accessibility checkers that show the reasoning behind them
     • Feedback from accessibility checkers was considered too vague; warnings
         such as “you may need to check the alt description of this image” were
         considered unhelpful

   These points suggest that accessibility checking tools need to organize information
in different ways from current practices and need to provide more information,
particularly the reasoning behind accessibility problems, which they generally do not
do at the moment.
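One possible shape for such a reorganized checker finding — a plain-language headline with the reasoning and a concrete code suggestion behind it, as the developers requested — might look like the following sketch. The class, field names and example content are all invented for illustration; no existing checker uses this structure:

```python
from dataclasses import dataclass

@dataclass
class AccessibilityFinding:
    """One checker finding, organized as the interviewed developers asked:
    a bullet-point headline in clear 'human' language, the reasoning behind
    the problem, and a technical code suggestion behind the bullet point."""
    headline: str
    reasoning: str
    code_suggestion: str
    programmatically_determined: bool  # False => still needs human confirmation

finding = AccessibilityFinding(
    headline="Image in the homepage banner has no text alternative",
    reasoning=("Screen reader users hear only the file name, so any "
               "information carried by the image is lost to them."),
    code_suggestion='<img src="banner.png" alt="Summer sale: 20% off">',
    programmatically_determined=True,
)
print(finding.headline)
```

A checklist of such findings, with the vague warnings filtered into the `programmatically_determined=False` pile, would address both the "too vague" and the "no reasoning" complaints above.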
     • These developers were open to the idea of simulations of the experiences of
          web users with disabilities as a tool in developing accessible websites and
          applications
     • They thought these could provide definitive information about whether a
          piece of code would be accessible or not and “settle arguments”
     • However, they were cautious about the amount of time it would take to run
          and in particular to interpret the results from such simulations

   These points suggest that simulations of disabled users can be a useful tool for
minimalist environment web developers, but they need to be developed with regard to
the time to run and interpret them.
   The interviews with web accessibility experts indicate that there is currently a gap
between the support that current tools provide to accessibility experts and the reality
of practice for these individuals.
   The statements from accessibility experts regarding AATTs are particularly
interesting. The current tools do not provide adequate information to the expert about
what the tools test or what the problems are on a webpage. These statements are
supported by the fact that so few experts report using AATTs. If new AATTs are to
be widely adopted they must provide more information about what is being tested. It
is not sufficient for a tool to simply “dump out” a set of problems. It must be possible
for the user to query what tests are being performed, and engage in a dialogue with
the tool regarding how decisions are made by a tool. All of this information must be
presented in the language of the user and frame the results in terms of how the
problems are likely to impact the web user, and in terms of repair of the issue.
   Further to this, most experts feel that they bring more knowledge and experience to
an evaluation than can be captured in an AATT. When new automated tools are

developed, they must be created in such a way that the expert can contribute their
knowledge into the system in order to eliminate false positives or false negatives. For
example, assume that an AATT returns a list of alternative text warnings, such as
having an empty (alt=””) text for an image. Many will be decorative images not
needing an alternative text, thus creating a false positive. However, in the first
instance of such a test, the expert must manually check all warnings. In current
AATTs, each time the automated test is run, all warnings are produced, and as a result
each error may need to be checked by an expert every time. In a future AATT one
could imagine a case where the expert could annotate the information about which
warnings are real and which are false positives for propagation to the next time the
tests are run. In this way, such manual checking would only need to be done the first
time after elements on web pages change.
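One way such propagation could work — purely a sketch of the idea, not a feature of any existing AATT — is to key each expert verdict on a hash of the flagged element, so a warning only comes back for manual review when the underlying markup actually changes:

```python
import hashlib

def element_key(page_url, element_html):
    """Stable key for a flagged element: changes only when the markup changes."""
    return hashlib.sha256((page_url + element_html).encode()).hexdigest()

class WarningTriage:
    """Persist an expert's verdicts so that repeated AATT runs re-raise only
    the warnings the expert has not already dismissed as false positives."""
    def __init__(self):
        self.verdicts = {}  # key -> "real" or "false_positive"

    def record(self, key, verdict):
        self.verdicts[key] = verdict

    def needs_review(self, key):
        return key not in self.verdicts

# First run: the expert checks a decorative image and dismisses the warning.
k = element_key("http://example.org/", '<img src="spacer.gif" alt="">')
triage = WarningTriage()
assert triage.needs_review(k)
triage.record(k, "false_positive")

# Second run on the unchanged page: same key, so no manual re-check is needed.
assert not triage.needs_review(k)

# If the element's markup changes, the key changes and review is required again.
k2 = element_key("http://example.org/", '<img src="photo.jpg" alt="">')
assert triage.needs_review(k2)
```

Under this scheme the manual check is paid once per element version, rather than once per test run, which is exactly the saving described above.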
    In terms of manual testing, there are a large number of experts who rely on
multiple tools in order to get the complete set of features needed to undertake
evaluations. The use of different toolbars, browsers and development tools (e.g.
Firebug) in an ad hoc way makes it very difficult to integrate different tests and
related reports together. A more unified approach, or at least a unified view of these
different tools through a common interface, is something that would likely be
welcomed by experts.
    Web accessibility experts are in need of tools that will help them manage the pages
they have been asked to test and what tests have been undertaken on those pages.
These features must be flexible enough to accommodate different strategies that
experts have when undertaking an evaluation. For example, in some cases experts
will take the traditional approach of performing all tests on one page, and that is a
strategy that should continue to be supported. However, the tools should also support
the approach of applying one test (e.g. checking for alternative texts on images) to all
pages in sequence. Tools that help in these seemingly mundane and tedious tasks will
allow experts who are working in opportunistic and/or scrambled styles of evaluation
to move their practices towards more strategic approaches. Hopefully, this will
reduce the overall potential for missing problems in a given set of pages and increase
the reliability of an individual evaluator.
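The bookkeeping described here can be sketched as a simple (page, test) matrix that supports both evaluation strategies: all tests on one page, or one test across all pages. The class and the page/test names are invented for illustration:

```python
import itertools

class EvaluationTracker:
    """Track which accessibility tests have been run on which pages,
    supporting both 'all tests on one page' and 'one test on all pages'."""
    def __init__(self, pages, tests):
        # One completion flag per (page, test) pair.
        self.done = {pair: False for pair in itertools.product(pages, tests)}

    def mark_done(self, page, test):
        self.done[(page, test)] = True

    def remaining_for_page(self, page):
        return [t for (p, t), d in self.done.items() if p == page and not d]

    def remaining_for_test(self, test):
        return [p for (p, t), d in self.done.items() if t == test and not d]

tracker = EvaluationTracker(
    pages=["home", "contact"],
    tests=["alt text", "colour contrast"],
)
# Strategy B: apply one test to all pages in sequence.
for page in ["home", "contact"]:
    tracker.mark_done(page, "alt text")

print(tracker.remaining_for_test("alt text"))   # that test is now complete
print(tracker.remaining_for_page("home"))       # what is still open per page
```

Even this trivial matrix makes missed (page, test) combinations visible, which is the reliability gain argued for above.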
    The comments regarding the lack of training that many experts have, in
conjunction with the rapidly changing technology environment, show a need to
provide more structured support and help during evaluation sessions through tools. If
it is the case that experts are reluctant to engage in formal training, and yet there are
still issues that they do not understand, then tools that provide comprehensive
guidance and structured dialogues for repair would be of value.
    It is absolutely essential that any future tools that are produced be able to generate
a variety of different formats of reports. Reports that only contain lists of violations
found in tests are not of interest to the majority of experts or their clients. While it is
important to still provide this feature in situations where an organization must have a
complete audit of their web application/websites against a set of guidelines, it is
equally important that tools support annotation of problems by experts. Further,
the ability to collect together similar problems across pages and then


annotate those problems with examples or solutions is a feature that would be
well received by the expert community.


5    Conclusions and Future Work

This paper has presented the results of interviews with web commissioners, web
developers and web accessibility experts about their experiences of commissioning,
developing and evaluating websites for accessibility. The results show that in spite of
great efforts by the World Wide Web Consortium, the European Commission and
other international and national organizations to promote web accessibility,
knowledge of this topic is still low. More critically, the tools to support
commissioners, developers and accessibility experts are still very poor and do not
provide much of the functionality that the various groups in the value chain need. We
believe these results highlight some of the reasons why the state of web accessibility
is still as poor as it is.

Acknowledgments. We would like to thank all the participants who took part in this
study for their time and valuable input. We would also like to thank all the i2Web
consortium for their work and support. Finally, we would like to particularly thank the
European Commission for supporting this work through FP7 Project 257623 from the
Software and Service Architectures and Infrastructures Unit.


References

Beyer, H. and Holtzblatt, K. (1997). Contextual design: defining customer-centred
       systems. Morgan Kaufmann.
Kane, S.K., Shulman, J.A., Shockley, T.J. and Ladner, R.E. (2007). A Web
       Accessibility Report Card for Top International University Web
       Sites. Proceedings of the 2007 international cross-disciplinary conference on
       Web accessibility (W4A '07). New York: ACM Press.
Lazar, J. et al. (2011). Societal Inclusion: evaluating the accessibility of job placement
       and travel web sites. Proceedings of I+CLUDE 2011. London, UK: Royal
       College of Art.
Rosnow, R.L. and Rosenthal, R. (1997). People studying people: artifacts and ethics
       in behavioural research. W. H. Freeman and Co.