              Convenient Mobile Usability Reporting with UseApp

            Johannes Feiner [1]               Keith Andrews [2]               Elmar Krainz [1]
     johannes.feiner@fh-joanneum.at          kandrews@iicm.edu         elmar.krainz@fh-joanneum.at

      [1] Institute for Internet Technologies & Applications (IIT), FH JOANNEUM, Austria
      [2] Institute for Information Systems and Computer Media (IICM), Graz University of Technology, Austria




Abstract

Usability reporting is necessary to communicate the results of usability tests to developers and managers. Writing usability reports (data aggregation, interpretation, formatting, and writing for specific readerships) can be tedious. Particularly for mobile usability evaluation, where recording user task performance outside a lab is often necessary, testing and reporting can be costly. In many cases, automated extraction of usability findings would be helpful, but is rather difficult to achieve with commonly used report formats such as Word or PDF.

UseApp is a tablet-based web application developed to overcome some of these limitations. It supports the capture of usability data in the field during testing, simplifying data collection and aggregation. Live reports are generated on-the-fly and usability findings can be exported electronically to bug tracking systems.

[Figure 1: UseApp helps evaluators record the performance of users electronically during usability testing. Diagram labels: Facilitator, User, UseApp, UsabML, UseApp Server.]

1  Mobile Usability Reporting

Usability evaluations are performed to validate the usability (and user experience) of software products. For example, experts might conduct heuristic evaluations (HE) to detect potential flaws in applications based on their experience and judgement. Thinking aloud (TA) tests might be conducted with representative test users to discover problems in realistic usage scenarios.

Smaller software development teams often do not have the resources to conduct extensive user studies. Furthermore, the modern practice of agile software development encourages rapid, incremental testing. In both cases, testing has to be simple and efficient. A tool supporting electronic capture of usability data as easily as using pen and paper can be of great benefit.

Nowadays, many applications are designed to run on mobile phones, shifting the focus of usability testing to mobile usability testing. This shift requires a tool set supporting mobile reporting. Reports in structured formats such as UsabML [FAK10] allow evaluation results to be processed electronically: findings can be extracted and then imported into bug tracking systems automatically.

Copyright © by the paper's authors. Copying permitted for private and academic purposes.
In: W. Aigner, G. Schmiedl, K. Blumenstein, M. Zeppelzauer (eds.): Proceedings of the 9th Forum Media Technology 2016, St. Pölten, Austria, 24-11-2016, published at http://ceur-ws.org

2  Related Work

Many methods for evaluating user interfaces have been developed over the past three decades [DR93; Nie95]. Formative usability evaluation [Red+02] seeks to discover potential usability problems during the development of an interface, so they can be fixed. Of the formative evaluation techniques, Heuristic Evaluation (HE) [NM90; Nie94b; HLL07] and Thinking Aloud (TA) [RC08] testing are particularly widely used. Nielsen [Nie94a] suggested that some usability evaluation, however simplified, is always better than none.

Brooke [Bro96] proposed the System Usability Scale (SUS) to make assessment through questionnaires simpler and results comparable through normalised scoring.
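
To make the normalisation concrete, the following is a minimal Ruby sketch of SUS scoring as defined in [Bro96] (an illustration only; UseApp's actual scoring code is not shown in this paper):

    # SUS scoring [Bro96]: ten 5-point Likert responses
    # (1 = strongly disagree .. 5 = strongly agree) are normalised
    # to a single score between 0 and 100.
    def sus_score(responses)
      raise ArgumentError, 'SUS needs exactly 10 responses' unless responses.size == 10
      contributions = responses.each_with_index.map do |answer, i|
        # 0-based index: even i are the positively worded items 1, 3, 5, 7, 9.
        i.even? ? answer - 1 : 5 - answer
      end
      contributions.sum * 2.5   # scale the raw 0..40 sum to 0..100
    end

    sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1])   # => 85.0

It is this normalisation which makes scores from different tests and products directly comparable.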
Usability reporting feeds back the findings of usability evaluations to development teams [Que05; Her16]. According to [Yus15] and [YGV15], using conventional bug tracking systems for usability reporting does not work well. Structured written reports have traditionally been used [FH03; LCA97]. Some efforts have been made to standardise the structure of such reports, including the Common Industry Format (CIF) [NIS99] for formal experiments. However, usability reports are still largely delivered in traditional document formats such as Microsoft Word and PDF, which are extremely hard to process automatically. UsabML [FAK10] is a structured, XML-based format for usability reports, which allows tool-based extraction and/or conversion and thus fosters simpler automation and reuse.
Reporting usability defects to software developers is still a challenge (cf. [Hel+11]). [YGV16] investigated such reporting and analysed 147 responses. They detected a gap between what reporters provide and what developers need when fixing defects. UseApp aims in the same direction, as it narrows this gap by supporting semi-automated handover of usability results into bug tracking systems.
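
To illustrate why a structured format enables this handover, the following Ruby sketch extracts findings from a UsabML-like fragment and converts them into generic issue payloads for a bug tracker. The element names are simplified placeholders, not the exact UsabML schema from [FAK10]:

    require 'rexml/document'
    require 'json'

    # Hypothetical UsabML-like fragment (placeholder element names).
    xml = <<~USABML
      <findings>
        <finding severity="major">
          <title>Submit button hard to reach one-handed</title>
          <description>8 of 10 users needed a second hand for task 3.</description>
        </finding>
      </findings>
    USABML

    doc = REXML::Document.new(xml)

    # Map each structured finding to a generic issue payload, ready to be
    # POSTed to whichever bug tracking system the team uses.
    issues = REXML::XPath.match(doc, '//finding').map do |finding|
      {
        title:  finding.elements['title'].text,
        body:   finding.elements['description'].text,
        labels: ['usability', finding.attributes['severity']]
      }
    end

    puts JSON.pretty_generate(issues)

With Word or PDF reports, the equivalent extraction step would require fragile text scraping.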
Modern usability evaluation has shifted towards open use situations and takes the mobile context into account, as discussed in [BH11], [KSV12] and [Lan13]. ISO standards support objective measurement of the usability of mobile applications, as reported in [MIA16]. Several tools to assist mobile usability testing can be found in the literature. [Sto+15] present MARS, a mobile app rating scale; the tool helps assess and classify apps in the health sector. The challenges of automatic UI observation and event logging to improve the usability of mobile apps are addressed in [Ma+13], but support for usability engineering methods (like TA or HE) is missing. Frameworks with a set of different tools and methods to support mobile usability evaluation can be found in [And+01] and [Che16].

Also, some commercial products are on the market. For example, Usertesting [1] is a product which helps add usability testing on mobile platforms. Besides premium/paid support for testing, simple tests can be created with the help of an online tool. Another tool supporting the testing of mobile web sites is UXRecorder [2], which records users' touch interactions and facial expressions.

The systematic mapping study [ZSG16] about mobile application testing techniques categorised the different approaches and stated that 19 out of 79 studies employed usability testing. The paper discusses many challenges of mobile testing, such as context-awareness, lab vs. in-the-wild testing, video recording, and mobile eye-tracking. One of the main challenges addressed in several papers was improving the test suite. Furthermore, [ZSG16] refer to research groups working on improved toolkits and testing frameworks: [Can+13] for the Advanced Test Environment (ATE), a platform which supports automatic execution of user experience tests, [LH12] for a toolkit for unsupervised evaluation of mobile applications, [BH09] for a logging-based framework to evaluate the usability of apps on mobile devices, and [VCD15] for automated mobile testing as a service. For research crowdsourcing in mobile testing, [Sta13] created the lightweight Cloud Testing of Mobile Systems (CTOMS) framework.

In contrast to this work, which focuses on reporting, only few of the cited approaches mention post-processing and reuse of reports at all.

[1] https://www.usertesting.com/
[2] http://www.uxrecorder.com/

  Simple and Fast: minimise input, use templates. In UseApp: no pen and paper required; placeholders and default values.
  Context Awareness: sensor support (GPS), timing. In UseApp: auto-timing of task duration.
  Don't Repeat Yourself (DRY): manage and store project and user details. In UseApp: reuse existing user details and questionnaires.
  Export and Reuse: structured formats, post-processing. In UseApp: export as UsabML.

Table 1: Selected design criteria for a mobile usability reporting tool.

3  UseApp Concept

UseApp is a client-server web application, as shown in Figure 1. The facilitator manages the mobile user testing and typically enters data into the system using a web browser on a tablet. The criteria used to design UseApp are shown in Table 1. Data entry should be fast and simple, through a minimal interface and the use of templates, placeholders, and default values. Sensors should be used to automate procedures as far as possible. Data should only have to be entered once. Overviews and reports should be generated on-the-fly and results should be exported in a structured reporting format.

Recipes for common evaluation tasks, such as a thinking aloud test or administering a standard questionnaire, should be available pre-canned. The interface should support focus and context: giving an overview whilst simultaneously allowing the facilitator to focus on the details of current actions. Colour-coded indicators should give feedback about already completed sections, and highlight where data is still incomplete.



To remove the need for pen and paper, everything should be possible directly on the tablet: from signing a consent form with a stylus or via audio, to answering questionnaire questions by tapping.

[Figure 2: Just six steps – from setup, through data entry, to a final report – motivate even small development teams to perform mobile usability tests.]

[Figure 3: Many built-in placeholders and the timer functionality allow simple and fast reporting.]


4  UseApp Implementation

UseApp currently has built-in support for running Thinking Aloud (TA) tests and administering SUS [Bro96] questionnaires. In future versions, support for Heuristic Evaluations (HE) and other questionnaires and rating scales will be added.

The client implementation uses many features of modern HTML5 web technologies, in order to support the features outlined in Section 3. Responsive web design is used to support several screen resolutions and provide sensible fallbacks where features are not supported by a particular device or browser. Offline storage, sensors, audio input and output, and canvas-based charts are all used.
                                                                      installation of the web app on an iPad was prepared in ad-
The UseApp server is built in Ruby/Rails and exposes a RESTful application programming interface (API). Thus, the client only retrieves and stores data on the server; the layout and rendering are completely server independent.
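
A sketch of what one resource of such a RESTful API might look like in Rails (the route, model, and field names here are assumptions for illustration, not taken from the actual UseApp source):

    # config/routes.rb -- nested REST resources.
    Rails.application.routes.draw do
      resources :projects do
        resources :results, only: [:index, :create]
      end
    end

    # app/controllers/results_controller.rb
    class ResultsController < ApplicationController
      # The client does all layout and rendering itself,
      # so the server only ever ships JSON.
      def index
        render json: Project.find(params[:project_id]).results
      end

      def create
        result = Project.find(params[:project_id]).results.build(result_params)
        if result.save
          render json: result, status: :created
        else
          render json: result.errors, status: :unprocessable_entity
        end
      end

      private

      def result_params
        params.require(:result).permit(:task_id, :duration, :notes)
      end
    end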
                                                                      their iPad in hand to guide the session, enter observations,
The workflow for a TA test comprises six steps (Project Details, Agreement, User Info, Test, Inquiry, and Finish), as indicated in the top bar in Figure 2. The workflow starts with entering the project details. Test users then give their consent and answer demographic and background questions.
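
The linear progression through these steps could be modelled as simply as the following sketch (an illustration, not actual UseApp code):

    # The six TA workflow steps from the top bar in Figure 2,
    # modelled as a minimal linear state machine.
    class TestSession
      STEPS = [:project_details, :agreement, :user_info,
               :test, :inquiry, :finish].freeze

      def initialize
        @step = 0
      end

      def current_step
        STEPS[@step]
      end

      # Drives the colour-coded top bar: steps before the current one
      # are complete, steps after it are still open.
      def advance!
        raise 'session already finished' if current_step == :finish
        @step += 1
      end
    end

    session = TestSession.new
    session.advance!
    session.current_step   # => :agreement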
                                                                          Feedback from the first users of UseApp (the usability
The facilitator can track individual or collective performance directly with the help of UseApp. Placeholders and templates support and speed up facilitator input, as shown in Figure 3. Timing of task duration is supported by built-in timers. Task completeness can be indicated just by moving a slider. After completing tasks, the users are asked for feedback. As the questions have all been prepared and assembled in advance, the answers are collected in electronic form.

The results can be viewed for single participants, or for a group of participants, including means and summaries. Multiple charts are available to support interpretation and communication of the results. Figure 4 shows an example. Notes and annotations can be added by the facilitator.

5  UseApp in Action

UseApp was trialled for a number of mobile usability evaluations. The UseApp server was set up in-house and the installation of the web app on an iPad was prepared in advance. The manager of each study entered the project details, task descriptions, and questionnaire questions in advance. As users performed their tasks, the facilitator had their iPad in hand to guide the session, enter observations, and record task duration. After completing the tasks, an interview was conducted and a questionnaire was filled out. Immediately after each test user had finished, the usability managers had access to the results and could add any comments or notes relevant to that test.

Feedback from the first users of UseApp (the usability evaluation managers and facilitators) has indicated some of its benefits and limitations:

• Feedback: the top bar indicating the six steps to completion provided useful feedback.


[Figure 4: Results are calculated on-the-fly and made available instantly.]

• No Paper: having no need for paper keeps the test environment uncluttered.

• Re-Use: where user testing is required multiple times, the reuse of already prepared evaluation documents, such as the same or similar questions for the questionnaire, saves time.

• Export: software developers liked the idea of post-processing reports. After exporting the usability reports in structured UsabML, automated import into bug tracking systems is not difficult.

UseApp acts as a practical companion when running a mobile usability test. Although UseApp can help, preparing and conducting usability tests still takes time and effort.

A minor limitation was the lack of support for freehand writing when signing the consent form. A tablet supporting a stylus might be useful for future versions, instead of forcing users to draw with their fingers.

6  Concluding Remarks

UseApp has the potential to support usability evaluators in multiple ways. It simplifies data entry when conducting mobile usability tests, provides templates for input, automation for recording tasks, and reuse of project data. Instant reporting and flexible export into structured UsabML help accelerate the provision of usability findings by the test team to the appropriate software developers.

Ongoing improvement of UseApp will expand the evaluation methods supported and the palette of built-in templates. The use of GPS sensors to track location might also be useful in some evaluation contexts.

References

[And+01] Terence S. Andre, H. Rex Hartson, Steven M. Belz, and Faith A. McCreary. "The User Action Framework: A Reliable Foundation for Usability Engineering Support Tools". In: International Journal of Human-Computer Studies 54.1 (Jan. 2001), pages 107–136. doi:10.1006/ijhc.2000.0441.

[BH09] Florence Balagtas-Fernandez and Heinrich Hussmann. "A Methodology and Framework to Simplify Usability Analysis of Mobile Applications". In: Proc. IEEE/ACM International Conference on Automated Software Engineering. ASE 2009. IEEE Computer Society, 2009, pages 520–524. ISBN 9780769538914. doi:10.1109/ASE.2009.12.

[BH11] Javier A. Bargas-Avila and Kasper Hornbæk. "Old Wine in New Bottles or Novel Challenges: A Critical Analysis of Empirical Studies of User Experience". In: Proc. SIGCHI Conference on Human Factors in Computing Systems. CHI '11. Vancouver, BC, Canada: ACM, May 7, 2011, pages 2689–2698. ISBN 9781450302289. doi:10.1145/1978942.1979336.

[Bro96] J. Brooke. "SUS: A Quick and Dirty Usability Scale". In: Usability Evaluation in Industry. Edited by Patrick W. Jordan, Bruce Thomas, Bernard A. Weerdmeester, and Ian L. McClelland. Taylor & Francis, 1996. Chapter 21, pages 189–194. ISBN 0748404600.

[Can+13] Gerardo Canfora, Francesco Mercaldo, Corrado Aaron Visaggio, Mauro D'Angelo, Antonio Furno, and Carminantonio Manganelli. "A Case Study of Automating User Experience-Oriented Performance Testing on Smartphones". In: Proc. 6th International Conference on Software Testing, Verification and Validation. ICST 2013. Mar. 18, 2013, pages 66–69. doi:10.1109/ICST.2013.16.

[Che16] Lin Chou Cheng. "The Mobile App Usability Inspection (MAUi) Framework as a Guide for Minimal Viable Product (MVP) Testing in Lean Development Cycle". In: Proc. 2nd International Conference in HCI and UX on Indonesia 2016. CHIuXiD 2016. Jakarta, Indonesia, Apr. 13, 2016, pages 1–11. ISBN 9781450340441. doi:10.1145/2898459.2898460.


[DR93] Joseph S. Dumas and Janice Redish. A Practical Guide to Usability Testing. Ablex, Dec. 1993. ISBN 089391990X.

[FAK10] Johannes Feiner, Keith Andrews, and Elmar Krajnc. "UsabML – The Usability Report Markup Language: Formalising the Exchange of Usability Findings". In: Proc. 2nd ACM SIGCHI Symposium on Engineering Interactive Computing Systems. EICS 2010. Berlin, Germany: ACM, May 2010, pages 297–302. ISBN 1450300839. doi:10.1145/1822018.1822065.

[FH03] Andy P. Field and Graham Hole. How to Design and Report Experiments. Sage Publications, Feb. 2003. ISBN 0761973826.

[Hel+11] Florian Heller, Leonhard Lichtschlag, Moritz Wittenhagen, Thorsten Karrer, and Jan Borchers. "Me Hates This: Exploring Different Levels of User Feedback for (Usability) Bug Reporting". In: Proc. Extended Abstracts on Human Factors in Computing Systems. CHI EA 2011. Vancouver, BC, Canada: ACM, May 7, 2011, pages 1357–1362. ISBN 9781450302685. doi:10.1145/1979742.1979774.

[Her16] Morten Hertzum. "A Usability Test is Not an Interview". In: interactions 23.2 (Feb. 2016), pages 82–84. doi:10.1145/2875462.

[HLL07] Ebba Thora Hvannberg, Effie Lai-Chong Law, and Marta Kristín Lárusdóttir. "Heuristic Evaluation: Comparing Ways of Finding and Reporting Usability Problems". In: Interacting with Computers 19.2 (2007), pages 225–240. doi:10.1016/j.intcom.2006.10.001. http://kth.diva-portal.org/smash/get/diva2:527483/FULLTEXT01.

[KSV12] Artur H. Kronbauer, Celso A. S. Santos, and Vaninha Vieira. "Smartphone Applications Usability Evaluation: A Hybrid Model and Its Implementation". In: Proc. 4th International Conference on Human-Centered Software Engineering. HCSE 2012. Toulouse, France: Springer-Verlag, Oct. 29, 2012, pages 146–163. ISBN 9783642343469. doi:10.1007/978-3-642-34347-6_9.

[Lan13] Tania Lang. "Eight Lessons in Mobile Usability Testing". In: UX Magazine 998 (Apr. 10, 2013). https://uxmag.com/articles/eight-lessons-in-mobile-usability-testing.

[LCA97] Darryn Lavery, Gilbert Cockton, and Malcolm P. Atkinson. "Comparison of Evaluation Methods Using Structured Usability Problem Reports". In: Behaviour & Information Technology 16.4 (1997), pages 246–266. doi:10.1080/014492997119824.

[LH12] Florian Lettner and Clemens Holzmann. "Automated and Unsupervised User Interaction Logging as Basis for Usability Evaluation of Mobile Applications". In: Proc. 10th International Conference on Advances in Mobile Computing & Multimedia. MoMM 2012. Bali, Indonesia: ACM, 2012, pages 118–127. ISBN 9781450313070. doi:10.1145/2428955.2428983.

[Ma+13] Xiaoxiao Ma, Bo Yan, Guanling Chen, Chunhui Zhang, Ke Huang, Jill Drury, and Linzhang Wang. "Design and Implementation of a Toolkit for Usability Testing of Mobile Apps". In: Mobile Networks and Applications 18.1 (2013), pages 81–97.

[MIA16] Karima Moumane, Ali Idri, and Alain Abran. "Usability Evaluation of Mobile Applications Using ISO 9241 and ISO 25062 Standards". In: SpringerPlus 5.1 (2016), page 1. doi:10.1186/s40064-016-2171-z.

[Nie94a] Jakob Nielsen. Guerrilla HCI – Using Discount Usability Engineering to Penetrate the Intimidation Barrier. Jan. 1994. http://www.nngroup.com/articles/guerrilla-hci/.

[Nie94b] Jakob Nielsen. Ten Usability Heuristics. Nielsen Norman Group. 1994. https://www.nngroup.com/articles/ten-usability-heuristics/.

[Nie95] Jakob Nielsen. "Usability Inspection Methods". In: Conference Companion on Human Factors in Computing Systems. CHI '95. Denver, Colorado, USA: ACM, 1995, pages 377–378. ISBN 0897917553. doi:10.1145/223355.223730.

[NM90] Jakob Nielsen and Rolf Molich. "Heuristic Evaluation of User Interfaces". In: Proc. Conference on Human Factors in Computing Systems. CHI '90. Seattle, Washington, USA: ACM, 1990, pages 249–256. ISBN 0201509326. doi:10.1145/97243.97281.


[NIS99] NIST. Common Industry Format for Usability Test Reports. National Institute of Standards and Technology. Oct. 1999. http://zing.ncsl.nist.gov/iusr/documents/cifv1.1b.htm.

[Que05] Whitney Quesenbery. Reporting Usability Results – Creating Effective Communication. Tutorial Slides. Dec. 2005. http://www.wqusability.com/handouts/reporting_usability.pdf.

[Red+02] Janice (Ginny) Redish, Randolph G. Bias, Robert Bailey, Rolf Molich, Joe Dumas, and Jared M. Spool. "Usability in Practice: Formative Usability Evaluations – Evolution and Revolution". In: Extended Abstracts on Human Factors in Computing Systems. CHI EA 2002. Minneapolis, Minnesota, USA: ACM, 2002, pages 885–890. ISBN 1581134541. doi:10.1145/506443.506647.

[RC08] Jeffrey B. Rubin and Dana Chisnell. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. 2nd edition. John Wiley & Sons, May 2008. ISBN 0470185481.

[Sta13] Oleksii Starov. "Cloud Platform for Research Crowdsourcing in Mobile Testing". Master's Thesis. East Carolina University, Jan. 2013. http://thescholarship.ecu.edu/bitstream/handle/10342/1757/Starov_ecu_0600M_10953.pdf.

[Sto+15] Stoyan R. Stoyanov, Leanne Hides, David J. Kavanagh, Oksana Zelenko, Dian Tjondronegoro, and Madhavan Mani. "Mobile App Rating Scale: A New Tool for Assessing the Quality of Health Mobile Apps". In: JMIR mHealth and uHealth 3.1 (2015), e27. doi:10.2196/mhealth.3422. http://mhealth.jmir.org/2015/1/e27/.

[VCD15] Isabel Karina Villanes, Erick Alexandre Bezerra Costa, and Arilo Claudio Dias-Neto. "Automated Mobile Testing as a Service (AM-TaaS)". In: Proc. IEEE World Congress on Services. June 2015, pages 79–86. doi:10.1109/SERVICES.2015.20.

[Yus15] Nor Shahida Mohamad Yusop. "Understanding Usability Defect Reporting in Software Defect Repositories". In: Proc. 24th Australasian Software Engineering Conference. ASWEC '15 Vol. II. Adelaide, SA, Australia: ACM, 2015, pages 134–137. ISBN 9781450337960. doi:10.1145/2811681.2817757.

[YGV15] Nor Shahida Mohamad Yusop, John Grundy, and Rajesh Vasa. "Reporting Usability Defects: Limitations of Open Source Defect Repositories and Suggestions for Improvement". In: Proc. 24th Australasian Software Engineering Conference. ASWEC '15 Vol. II. Adelaide, SA, Australia: ACM, Sept. 28, 2015, pages 38–43. ISBN 9781450337960. doi:10.1145/2811681.2811689.

[YGV16] Nor Shahida Mohamad Yusop, John Grundy, and Rajesh Vasa. "Reporting Usability Defects: Do Reporters Report What Software Developers Need?" In: Proc. 20th International Conference on Evaluation and Assessment in Software Engineering. EASE '16. Limerick, Ireland: ACM, May 2016, 38:1–38:10. ISBN 9781450336918. doi:10.1145/2915970.2915995.

[ZSG16] Samer Zein, Norsaremah Salleh, and John Grundy. "A Systematic Mapping Study of Mobile Application Testing Techniques". In: Journal of Systems and Software 117 (2016), pages 334–356. http://www.ict.swin.edu.au/personal/jgrundy/papers/jss2016.pdf.