Empirical Results from an Evaluation of the Accessibility
             of Websites by Dyslexic Users

                 André P. Freire, Helen Petrie and Christopher Power

                        Human Computer Interaction Group
                         Department of Computer Science
                               University of York
                       Deramore Lane, York UK YO10 5GH
              {apfreire|helen.petrie|cpower}@cs.york.ac.uk



       Abstract. This paper presents an empirical study of the problems encountered
       by users with dyslexia when using websites. The study consisted of a user
       evaluation of 16 websites by a panel of 13 participants with dyslexia, each
       participant evaluating 10 websites. The results presented in the paper are based
       on 693 instances of accessibility and usability problems. The most frequent
       problems were related to navigation, the presentation and organisation of
       information, missing or malfunctioning functionality in websites, and issues
       with language.

       Keywords: dyslexia, web accessibility, learning difficulties, user evaluation


1    Introduction

The field of web accessibility has been concerned with the development of techniques
to make websites more accessible to people with disabilities. Most of the research in
this field, however, has been limited to investigating accessibility issues for people
with visual, physical and hearing disabilities. Far less attention has been given to the
study of the accessibility of websites for people with cognitive disabilities and specific
learning difficulties, such as dyslexia (see McCarthy & Swierenga, 2010).
   A number of sets of web accessibility guidelines have been developed. These
include both general accessibility guidelines for disabled users, such as the
Web Content Accessibility Guidelines (WCAG) 1.0 (W3C, 1999) and 2.0 (W3C,
2008) from the World Wide Web Consortium, and guidelines specific to
dyslexic users (British Dyslexia Association, 2011; Bradford, 2005; Kolatch, 2000;
Zarach, 2002). However, very little empirical evidence about the kinds of problems
dyslexic users face when using websites has been reported.
   A large-scale user-based evaluation of websites, conducted as part of a formal
investigation into the accessibility of websites commissioned by the Disability Rights
Commission of Great Britain (Disability Rights Commission, 2004), provided empirical
information on the problems that dyslexic users encounter with websites. The results
of that study pointed to the need for more in-depth empirical studies to better
understand the nature of problems encountered by dyslexic users when using
websites.
   This paper presents an empirical study involving dyslexic participants and a wide
range of websites. The study involved a user evaluation of 16 websites by a panel of 13
dyslexic participants. The results show the main problems encountered by these
users, and the nature of these problems and the implications for design that can be
drawn from the findings are discussed.
   The paper is organised as follows: Section 2 discusses related work on dyslexia
and web accessibility; Section 3 presents the method used for the empirical study;
Section 4 presents the main results obtained and a discussion of these results; and,
finally, Section 5 presents the main conclusions and future work.


2    Dyslexia and Web Accessibility

In a recent literature survey of web accessibility and dyslexia, McCarthy & Swierenga
(2010) reported that there is little work in the literature regarding the study of the
accessibility of websites for dyslexic users. Their findings highlight the fact that
most of the research on web accessibility has focused on users with visual disabilities
or with severe cognitive disabilities.
    Empirical studies with participants with more severe cognitive disabilities have
pointed out that the problems found in their results are not covered by current
accessibility guidelines, as reported by Small et al. (2005) and Sevilla et al. (2007).
    Small et al. (2005) conducted an empirical study involving the evaluation of two
WCAG 1.0-compliant websites with 27 users with developmental cognitive
disabilities (corrected vision, cerebral palsy, obsessive-compulsive disorder). They
found that users had a substantial number of problems with the websites, which led
the authors to argue that the cognitive disabilities analysed in the study were not
accounted for in WCAG 1.0.
    Sevilla et al. (2007) conducted another empirical study with 20 participants with
cognitive disabilities using two different versions of a web interface. A conventional
version of a website was evaluated as well as another version with simplified
navigation. Besides reporting improvements in performance with the version with
simplified navigation, the authors also argue that the needs of people with cognitive
deficits are poorly addressed in WCAG 1.0 and WCAG 2.0.
    With respect to dyslexia in particular, there are few studies that provide rigorous
empirical results on dyslexic participants using websites. The majority of the
literature on dyslexia and web accessibility relates to guidelines for producing
web content accessible to dyslexic users, derived from general guidelines for dyslexia.
    A number of sets of guidelines have been produced to help developers produce
more accessible web content for dyslexic users (British Dyslexia Association, 2011;
Bradford, 2005; Kolatch, 2000; Zarach, 2002), as reported in a review undertaken by
McCarthy & Swierenga (2010). Friedman & Bryen (2007) also conducted a review
of 20 sets of guidelines from research studies and websites maintained by professionals
and advocacy organisations connected to dyslexia and other cognitive disabilities, and
compiled the guidelines most cited by these sources; most of the guidelines had to do
with other cognitive disabilities, but some were applicable to dyslexia. Evett & Brown
(2005) also performed an analysis comparing guidelines for producing accessible
content for dyslexic and blind users, and reported a high degree of overlap between
the guidelines for these two user groups.
   With respect to empirical studies with dyslexic participants using websites, the
largest study to date reported in the literature was conducted by the Disability Rights
Commission of Great Britain in 2004 (Disability Rights Commission, 2004). The
study involved tests on 100 websites, performed by a panel of 50 participants, who
included participants with dyslexia and with visual, hearing and physical disabilities. Out of
the 50 participants, 12 had dyslexia (Petrie, Hamilton, & King, 2004). The study
resulted in a total of 585 accessibility problems. In particular, the study found that the
most recurring problems encountered by dyslexic users were: confusing page layout,
unclear navigation, poor colour selections, graphics and text too small and
complicated language.
   The DRC study helped to provide empirical evidence of the problems that people with
dyslexia have when using websites. Further studies along these lines could
provide more detailed analyses of the types of problems dyslexic users find when
using websites. In particular, in the DRC study only 22% of the tasks were
performed in a laboratory environment, whilst the remaining 78% were
performed remotely, with self-reports provided by the participants. Petrie, Hamilton,
King and Pavan (2006) compared the data from the two methods of data collection
and found that the quality of the data was not compromised, but that the quantity of data
provided by the remote evaluations was lower. Thus the data from the DRC study
may have underestimated the problems that disabled users, including dyslexic users, were
having with the web. A study performed in the laboratory allows for the identification
of more problems that users would not necessarily report, and also provides richer
details about the nature of the problems.
   Other small-scale studies have also reported on the experiences of dyslexic users when
using websites. Harrison & Petrie (2007) conducted a study of six websites involving
six participants, of whom two were visually disabled and two were dyslexic. In their
study, they analysed the relationship between the severity ratings of accessibility
problems given by users, the ratings given by accessibility experts, and the
priority of the related issues in guidelines. The results showed that there was no
correlation between the ratings given by users and experts and the priorities of the
corresponding problems in WCAG 1.0.
   Al-Wabil et al. (2007) conducted a study investigating navigation issues faced by
dyslexic users. Their study comprised interviews with 10 participants with dyslexia.
The participants were shown examples of web pages and asked to discuss their
experiences with navigation elements in websites. The results pointed to how dyslexic
users use search features, breadcrumb trails and other navigation resources. Although
the study provided good insight into users’ opinions, there was no empirical evidence
from participants using real websites.
   Studies on problems encountered by dyslexic users have reported interesting
findings, and have pointed to a clear need for broader empirical studies. This paper
presents an in-depth study of dyslexic people using a range of websites, to
provide more insight into the nature of the problems dyslexic users have when using
websites and to contribute to implications for the design of websites.


3     Method

3.1    Participants

The participants were recruited from the University of York Students’ Union mailing
list and from personal contacts of the authors.
    A total of 13 participants with dyslexia took part in the study, of whom 6 were
male and 7 were female. Their ages ranged from 19 to 49 years (median = 20). The
majority of the participants (12 out of 13) had English as their first language; one
participant had Persian as first language, but was fluent in English. All participants
had been diagnosed with dyslexia either by the University of York’s Disability Office
or by other appropriate professionals. Each participant was reimbursed £15 per
hour for their time.
    With respect to their experience with computers, the participants rated their
experience on a scale from 1 (not at all) to 7 (extensive). Their ratings of computer
experience ranged from 3 to 7, with 84% of the participants rating their experience as 5
or above. All the participants had been using the Internet for 7 years or more. The
participants spent between 1 and 20 or more hours per week on websites; 6 out of 13
reported spending more than 20 hours a week using websites.
    Participants were asked to provide details about their dyslexia, in terms of how
severe it was and in what aspects they were affected by it. Most participants reported
having been assigned a severity level on a scale that ranged from “mild” through
“moderate” and “severe” to “profound”. In the sample of participants, 3 reported
having mild dyslexia, 3 mild-moderate dyslexia, 2 moderate dyslexia, 1 moderate-
severe dyslexia, and 4 were not able to report their level of dyslexia.
    The aspects in which they were affected by their dyslexia were very broad, and
varied considerably from participant to participant. The issues reported and the
numbers of participants affected by each of them are as follows:
• Difficulties with spelling (8 participants)
• Difficulties with reading and comprehension (7 participants)
• Difficulties with reading text with black printing on white background (7
  participants)
• Limited short-term memory (4 participants)
• Low writing speed (2 participants)
• Difficulties with processing of verbal information (2 participants)
• Speech difficulties (2 participants)
• Difficulties with motor coordination (1 participant)
• Limited spatial awareness (1 participant)
• Asperger’s syndrome (1 participant)
   Only 5 participants reported using some kind of assistive technology: 2
participants reported using Dragon Dictate and 2 participants used a Dictaphone, both for
speech recognition; 1 participant reported using TextHelp as speech synthesis
software for reading texts on a computer. Regarding presentation adjustments, 6
participants reported changing the background colour of text in order to be able to read
it comfortably, and 1 participant reported often increasing the font size on websites to
read text comfortably.

3.2      Websites evaluated

   This study involved the evaluation of a sample of 16 websites, comprising websites
at different conformance levels with the Web Content Accessibility Guidelines 1.0
and 2.0. The selection included websites from both the private and public
sectors, and involved local and central government websites, public services, non-profit
organisations and commercial websites.
   One of the goals of the selection was to have a varied range of websites with different
levels of conformance to WCAG 1.0 and 2.0. This was envisaged to enable further analyses
comparing the problems found by disabled users with the problems identified in
accessibility audits performed with the guidelines.
   In order to obtain a wide range of websites at different conformance levels, around
400 home pages of websites were analysed with automatic accessibility evaluation
tools. Websites that had some potential level of conformance were further analysed
using manual accessibility audits with WCAG 1.0. The websites were drawn from a
sample of 100 websites evaluated in the formal investigation conducted by the
Disability Rights Commission of Great Britain (Disability Rights Commission, 2004)
and other websites found by search procedures.
   The selection was performed soon after WCAG 2.0 was published. At
that point there was very little support available to perform accessibility audits with
the new guidelines. Because of this, the selection was initially based on the conformance of
the home pages to WCAG 1.0. A follow-up evaluation of the home pages of the
websites with WCAG 2.0 was performed using the archived web pages that had been
evaluated with WCAG 1.0, so that the evaluation covered exactly the same home pages.
   Table 1 shows the list of websites selected and information about their level of
conformance to WCAG 1.0 and WCAG 2.0. For each website, the level of
conformance (A, AA, or AAA) is shown, as well as the numbers of instances of
violations of checkpoints/success criteria and the number of different
checkpoints/success criteria violated. If a web page contains five images that do not
have alternative text associated with them, this would count as five instances of
violations of checkpoint 1.1 in WCAG 1.0 and success criterion 1.1.1 in WCAG 2.0,
but would count as only one checkpoint/success criterion violated.
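   To make this counting convention concrete, the following is a minimal sketch in
Python (illustrative only; the audited page and its violations are hypothetical and not
taken from the study data) of how instances of violations and different checkpoints
violated could be tallied for a single home page:

    from collections import Counter

    # Hypothetical audit result for one home page: each entry is the WCAG 1.0
    # checkpoint violated, listed once per instance found on the page.
    violations = [
        "1.1", "1.1", "1.1", "1.1", "1.1",  # five images without alternative text
        "3.5",                              # one heading mark-up issue
        "12.4",                             # one form label issue
    ]

    instances_of_violations = len(violations)      # 7 instances in total
    different_checkpoints = len(set(violations))   # 3 different checkpoints violated

    print(Counter(violations))  # Counter({'1.1': 5, '3.5': 1, '12.4': 1})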


Table 1. List of websites with WCAG 1.0 and WCAG 2.0 conformance levels
(Inst. = instances of violations; Diff. = number of different checkpoints (WCAG 1.0) or
success criteria (WCAG 2.0) violated; Conf. = conformance level)

 Website                                                 Type     WCAG 1.0               WCAG 2.0
                                                                  Inst.  Diff.  Conf.    Inst.  Diff.  Conf.
 Lflegal – Law Office (www.lflegal.com)                  Private      5      2     AA        0      0    AAA
 Green Beast Design (www.green-beast.com)                Private     23      3     AA        9      3     AA
 York City Council (www.york.gov.uk)                     Public      16      4     AA        7      5      -
 NHS – National Services for Scotland (www.nhsnss.org)   Public      30      6     AA       31      9      -
 Copac - Libraries network (www.copac.ac.uk)             Private     21      8      A        6      2      A
 The Automobile Association (www.theaa.com)              Private     68      9      A       58      9      -
 Department of Health (www.dh.gov.uk)                    Public      91      9      A       31      6      A
 Digizen (www.digizen.org.uk)                            Private     80      9      A       46     12      -
 JISC (www.jisc.ac.uk)                                   Public      58     12      A      216     13      -
 Royal Mail (www.royalmail.com)                          Private     50     15      A      103      7      -
 Pret (www.pret.co.uk)                                   Private    184     16      A      141     21      -
 Trades Union Congress (www.tuc.org.uk)                  Private    146     23      A       97     17      -
 British Museum (www.britishmuseum.org)                  Public     130      8      -       86      8      -
 NHS Direct (www.nhsdirect.nhs.uk)                       Public      30     10      -      163     20      -
 Ford (www.ford.co.uk)                                   Private    124     27      -      244     33      -
 TicketMaster (www.ticketmaster.co.uk)                   Private    757     29      -     1118     35      -


   The selection contains websites with a range of conformance levels,
instances of violations and numbers of different checkpoints/success criteria violated.
In relation to WCAG 1.0, there are 4 level-AA conformant websites, 8 level-A
conformant websites, and 4 websites not conformant to WCAG 1.0. In relation to
WCAG 2.0, there is 1 level-AAA conformant website, 1 level-AA conformant, 2 level-A
conformant and 12 non-conformant websites. Despite the larger number of
websites in the sample that are not conformant to WCAG 2.0, it is worth noting that
these websites show wide variability in the number of instances and of individual
checkpoints/success criteria violated, which will enable future analyses of the coverage
by WCAG 1.0 and WCAG 2.0 of the problems found by user evaluation.
   A set of 3 tasks was developed for each website (except for NHS Direct, which had two
tasks). The tasks involved typical activities that would be performed on the websites,
such as consulting council tax charges, buying tickets online, finding a local health
service, finding a used vehicle, and others. We attempted to have tasks with different
difficulty levels for all the websites.


3.3      Design

   The study consisted of observing the participants while they used the selected websites
and recording information about the way they experienced the websites. Participants
were asked to use a concurrent think-aloud protocol, i.e. to “speak aloud” what they were
thinking as they were carrying out their tasks. The main variables analysed
in the tests were:
• Problems encountered by users and their severity
• Task completion
• Difficulty of completing each task (measured on a scale from 1 to 5)
• Time to complete tasks
• User satisfaction with the website
   All the variables analysed are important in helping to understand whether disabled
users can use the websites or not. Special attention was given to the problems found
and to how users rated the severity of the problems. During the study, users were asked
to rate the severity of each problem found on a website while attempting to perform a
given task, using the following scale, adapted from the severity rating scale defined by
Nielsen (1993):
   1 – Cosmetic – an irritation that is unlikely to cause serious interruption in
completing the task.
   2 – Minor – a problem that is likely to interrupt the users for a short period of time
or from which they can recover easily.
   3 – Major – a problem that is likely to interrupt the users for a long period of time
or from which they will recover but with some difficulty.
   4 – Catastrophe – a problem which will stop the user from completing their task.
   Problems coded by the researchers during the analysis of the videos were also
assigned severity levels according to this same scale.
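   As an illustration only (this representation is ours and not part of the study
materials), the severity scale could be encoded as follows when logging coded problems:

    from enum import IntEnum

    class Severity(IntEnum):
        """Severity ratings adapted from Nielsen (1993), as used in the study."""
        COSMETIC = 1     # irritation, unlikely to seriously interrupt the task
        MINOR = 2        # short interruption, or easy recovery
        MAJOR = 3        # long interruption, or recovery with some difficulty
        CATASTROPHE = 4  # stops the user from completing the task

    # Hypothetical example of one coded problem instance:
    problem = {
        "website": "www.example.org",  # placeholder, not one of the study websites
        "description": "link destination not clear",
        "severity": Severity.MINOR,
    }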


3.4    Procedure

   Evaluations took place in the Interaction Labs in the Department of Computer
Science at the University of York. Participants were briefed on the nature of the
study and then asked to sign an informed consent form. Participants were then asked
to make any adjustments they needed in their preferred internet browser.
Participants were also instructed about how they should proceed during the study,
including instructions about talking aloud as they did the tasks and reporting any
problems they encountered while attempting to do the tasks. An explanation of the
scale they should use to rate the severity of problems was also given. After this,
participants were asked to open the first website and were given the tasks they would
do.

3.5    Data preparation and coding

   The analysis process comprised an initial phase to establish a categorisation of
accessibility problems and adequate levels of inter-coder reliability. The second phase
was the main coding of all the problems.
     In order to gather a representative set of problems, the selection of videos for the
initial phase included a range of different websites. Each video was initially coded
independently by three different coders, who identified accessibility problems and
gave them an initial classification and severity rating.
     After the independent coding of the videos, the three coders met to compare their
initial identifications and classifications. During these meetings, a unified list of the
problems identified by all the coders was produced, a mutually agreed classification was
assigned, and the classification scheme itself was built up.
   The second phase, comprising the main coding of the full corpus of data, involved
the analysis of approximately 45 hours of video recordings of dyslexic participants.
This phase was performed by one coder.


4     Results and Discussion

A total of 693 instances of problems encountered by dyslexic participants were
identified in the 16 websites. Each website was evaluated by 10 different participants.
The mean number of problems per participant on each website was 4.33 (693 instances
across 16 websites, each evaluated by 10 participants).
   Table 2 shows the websites with the total number of instances of problems
encountered by all participants. The total number of instances of problems ranged
from 13 on the Digizen website to 70 on the Department of Health’s website, with a
mean of 43.25 problems per website.


Table 2. Total number of instances of user problems per website

 Website                                  Type      WCAG 1.0       WCAG 2.0       Instances of
                                                    conf. level    conf. level    user problems
 Lflegal – Law Office                     Private   AA             AAA                 49
 Green Beast Design                       Private   AA             AA                  32
 York City Council                        Public    AA             -                   63
 NHS – National Services for Scotland     Public    AA             -                   46
 Copac - Libraries network                Private   A              A                   29
 The Automobile Association               Private   A              -                   37
 Department of Health                     Public    A              A                   70
 Digizen                                  Private   A              -                   13
 JISC                                     Public    A              -                   46
 Royal Mail                               Private   A              -                   49
 Pret                                     Private   A              -                   18
 Trades Union Congress                    Private   A              -                   57
 British Museum                           Public    -              -                   57
 NHS Direct                               Public    -              -                   44
 Ford                                     Private   -              -                   52
 TicketMaster                             Private   -              -                   30

   The problems encountered by users during the evaluation were also categorised
according to how they affected the user when trying to perform their tasks. Table 3
presents a list of the categories with the most problems assigned to them. It is worth
noting that these fifteen categories accounted for 80% of the instances of problems
encountered by users.



Table 3. List of the most frequent problems found in the evaluation by dyslexic users

 Category                                                           Instances   Percentage of
                                                                                problems
 1. Information not in page where users expected it to be               111         16%
 2. Navigation elements do not help the users find what they
    are seeking                                                          86         12.4%
 3. Difficult to scan page for specific item                             72         10.4%
 4. Default presentation of text is not adequate                         43          6.2%
 5. Expected functionality not present                                   34          4.9%
 6. Too much information on page                                         33          4.8%
 7. Organisation of content is inconsistent with web conventions
    or common sense logic                                                30          4.3%
 8. Functionality does not (or appears not to) work
    correctly/as expected                                                30          4.3%
 9. User cannot make sense of information                                29          4.2%
 10. User does not perceive that action has had any effect
     (no/insufficient feedback given to an action)                       17          2.5%
 11. English too complicated for perceived target audience               16          2.3%
 12. Link destination not clear                                          16          2.3%
 13. User cannot understand sequence of interaction in
     functionality                                                       16          2.3%
 14. User inferred the existence of functionality where there
     is not one                                                          13          1.9%
 15. Too much irrelevant content before task content                     12          1.7%
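   As a small illustrative calculation (the category names and instance counts are those
reported in Table 3, and the total of 693 instances is from the beginning of this section),
the percentage column can be derived from the instance counts as follows:

    # Instance counts for the three largest categories in Table 3, expressed as a
    # share of the 693 problem instances identified across all websites.
    TOTAL_INSTANCES = 693

    top_categories = {
        "Information not in page where users expected it to be": 111,
        "Navigation elements do not help the users find what they are seeking": 86,
        "Difficult to scan page for specific item": 72,
    }

    for category, count in top_categories.items():
        print(f"{category}: {count} ({100 * count / TOTAL_INSTANCES:.1f}%)")
    # 111/693 = 16.0%, 86/693 = 12.4%, 72/693 = 10.4%, matching Table 3.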

   Navigation and information architecture
   The categories with the most instances of problems encountered by users were
“information not in page where users expected it to be” and “navigation elements do
not help the users find what they are seeking”, which together account for 28.4% of
the instances of problems. These two categories relate to navigation and
information architecture issues. The first occurred when users followed a path
that seemed logical to them on the website, but the pages did not contain what they
expected them to present. The second category relates to problems where the
elements in the navigation did not help users to decide where to go to find the
information they were seeking.
   The prevalence of navigation-related problems points to the need for more attention
to the design of the information architecture of websites. It is important that designers
investigate how users find it most logical for the information on websites to be
organised.


   Highlighting and scanning information
   The third most frequent category was “difficult to scan page for specific item”,
which accounted for 10.4% of the problems. This category relates to problems
where the user encountered difficulties scanning for specific items in a web page, often
due to lack of structural or visual aids that would make the content they needed stand
out from the rest of the web page.

    Presentation of text
    The fourth most frequent category was “default presentation of text is not
adequate”, with 6.2% of the problems. Problems in this category involved numerous
issues reported in guidelines on how to design text for dyslexic readers (British
Dyslexia Association, 2011; Bradford, 2005). Common issues included the use of
italics, inadequate spacing between lines and paragraphs, small font size,
inappropriate font style, and an inappropriate background colour.
    The occurrence of complaints about background colour is noteworthy. Many
participants encountered problems with black writing on a white background. For these
users, reading text on a white background for a long time caused the text to start
forming “visual patterns”, or “dancing around”. Although most web browsers have
features to change the background colour of a website, none of the participants in this
study knew about this feature, or, if they did, they found it very difficult to use. In
most cases when this problem was reported, participants expected that the websites
would provide them with a colour selector feature instead.

   Functionality expected
   The category “expected functionality not present” accounted for 4.9% of the
problems. This category included issues where a given functionality was not present
on the expected page, or not present on the website at all. It also included problems
where users expected certain functionality to be provided by websites, but it was
not. Some users considered it a given that websites would provide them with an internal
search feature. Another expected functionality noted by some users was the presence
of an “auto-complete” feature for input fields. Many users who have difficulties with
spelling felt it was beneficial to have this feature in popular search websites, and
expected that other websites would also offer this feature to help them spell words
when inputting information.

   Amount of information and organisation of content
   Users being bombarded with too much information on a page accounted for 4.8%
of the total number of problems. Illogical organisation of information within a web
page accounted for a further 4.3%. The problems with illogical organisation included
issues with related information not being displayed along with other related
information, illogical ordering of information (not in alphabetical order, for example)
and a lack of consistent patterns in the way information was listed. A separate category
of problems where there was too much irrelevant information before relevant content
accounted for 1.7% of the problems.


   Malfunctioning functionality
   Issues with features not working correctly or in the way users expected accounted
for 4.3% of the problems. Many of these problems were related to a lack of proper
testing procedures to ensure that the functionality of the websites works correctly.

   Making sense of content and language
   Users not being able to make sense of content accounted for 4.2% of the problems.
These problems related to cases where specific information was displayed out of
context, abbreviations were used with no explanation, and nonsense text was shown in
CAPTCHAs. A separate category contained problems where the general level of English
was too complicated for the perceived target audience, which accounted for 2.3% of the
problems.

   Other categories included a lack of clear feedback that an action has had an effect
(2.5%), unclear destination of a link (2.3%), users not being able to understand the
sequence of interaction in a given functionality (2.3%), and users inferring the
existence of functionality where there is none (1.9%), such as text in the
imperative mood that looks like a link but is not.


5     Conclusions and Future Work

This paper has presented initial analyses of the problems encountered by dyslexic web
users on a range of websites with different levels of conformance to WCAG 1.0 and
2.0. Users encountered many problems, which have been categorised into 15 groups.
The analysis of these problems will help refine guidelines for the development of
websites that dyslexic users can use easily. Future analyses will investigate to what
extent these problems are covered by WCAG 2.0 and by guidelines that specifically
address the needs of dyslexic web users, as well as the relationship between WCAG 1.0
and 2.0 conformance and the number of problems encountered by dyslexic users.

Acknowledgments. We would like to thank all the participants who took part in this
study for their time and valuable input. Thanks to CNPq – Brazilian Ministry of
Science for supporting author André P. Freire.


References

Al-Wabil, A., Zaphiris, P., & Wilson, S. (2007). Web Navigation for Individuals with Dyslexia:
         An Exploratory Study. In C. Stephanidis (Ed.), Universal Access in Human Computer
         Interaction. Coping with Diversity (Vol. 4554, pp. 593-602): Springer Berlin /
         Heidelberg.
British Dyslexia Association. (2011). Dyslexia Style Guide. Retrieved 29/06/2011, from
         http://www.bdadyslexia.org.uk/about-dyslexia/further-information/dyslexia-style-guide.html
Bradford, J. (2005). Designing Web pages for dyslexic users. Dyslexia Online Magazine.
           Retrieved 29/06/2011, from http://www.dyslexia-parent.com/mag35.html
Disability Rights Commission. (2004). The Web: Access and inclusion for disabled people. A
           formal investigation conducted by the Disability Rights Commission. London: The
           Stationery Office.
Evett, L., & Brown, D. (2005). Text formats and web design for visually impaired and dyslexic
           readers--Clear Text for All. Interacting with Computers, 17(4), 453-472. doi:
           10.1016/j.intcom.2005.04.001
Friedman, M. G., & Bryen, D. N. (2007). Web accessibility design recommendations for people
           with cognitive disabilities. Technology and Disability, 19(4), 205-212.
Harrison, C., & Petrie, H. (2007). Severity of usability and accessibility problems in eCommerce
           and eGovernment websites. In N. Bryan-Kinns, A. Blanford, P. Curzon & L. Nigay
           (Eds.), People and Computers XX — Engage (pp. 255-262): Springer London.
Kolatch, E. (2000). Designing for users with cognitive disabilities. The Universal Usability
           Guide, University of Maryland, College Park. Retrieved 29/06/2011, from
           http://www.otal.umd.edu/UUGuide/erica
McCarthy, J., & Swierenga, S. (2010). What we know about dyslexia and Web accessibility: a
           research review. Universal Access in the Information Society, 9(2), 147-152. doi:
           10.1007/s10209-009-0160-5
Nielsen, J. (1993). Usability engineering. Boston, MA: Morgan Kaufmann.
Petrie, H., Hamilton, F., & King, N. (2004). Tension, what tension?: Website accessibility and
           visual design. Proceedings of the 2004 International Cross-disciplinary Workshop on
           Web Accessibility (W4A). New York, NY, USA.
Petrie, H., Hamilton, F., King, N., & Pavan, P. (2006). Remote usability evaluations with
           disabled people. Proceedings of CHI 2006. New York: ACM Press.
Sevilla, J., Herrera, G., Martínez, & Alcantud, F. (2007). Web accessibility for individuals with
           cognitive deficits: A comparative study between an existing commercial Web and its
           cognitively accessible equivalent. ACM Transactions on Computer-Human
           Interaction, 14(3), 12. doi: 10.1145/1279700.127970
Small, J., Schallau, P., Brown, K., & Appleyard, R. (2005). Web accessibility for people with
           cognitive disabilities. Proceedings of CHI '05 (extended abstracts). New York: ACM
           Press.
World Wide Web Consortium. (1999). Web Content Accessibility Guidelines 1.0. Retrieved
           14/04/2011, from http://www.w3.org/TR/WCAG10/
World Wide Web Consortium. (2008). Web Content Accessibility Guidelines 2.0. Retrieved
           14/04/2011, from http://www.w3.org/TR/WCAG20
Zarach, V. (2002). Ten guidelines for improving accessibility for people with dyslexia. CETIS,
           University of Wales Bangor. Retrieved 29/06/2011, from
           http://wiki.cetis.ac.uk/Ten_Guidelines_for_Improving_Accessibility_for_People_with_Dyslexia