AHP Supported Evaluation of LMS Quality

Bojan Srđević, University of Novi Sad, Trg D. Obradovica 8, 21000 Novi Sad, Serbia, bojans@polj.uns.ac.rs
Matija Pipan, Jozef Stefan Institute, Jamova cesta 39, 1000 Ljubljana, Slovenia, matic@e5.ijs.si
Zorica Srđević, University of Novi Sad, Trg D. Obradovica 8, 21000 Novi Sad, Serbia, srdjevicz@polj.uns.ac.rs
Tanja Arh, Jozef Stefan Institute, Jamova cesta 39, 1000 Ljubljana, Slovenia, tanja@e5.ijs.si

ABSTRACT
A Learning Management System (LMS) provides a platform for an on-line learning environment by enabling the management, delivery, and tracking of the learning process and of learners. Selection of the most suitable LMS is usually prolonged by the time- and effort-consuming evaluation of its numerous features. To reduce the number of features and at the same time obtain a reliable result from an evaluation, we propose a decomposition of this complex problem into more easily comprehended sub-problems that can be analyzed independently through a multi-criteria method called the Analytic Hierarchy Process (AHP). To verify the approach, an expert is asked to use AHP on an originally developed reduced hierarchy of the problem of selecting the most appropriate LMS for the student target group. Results of the application are compared with the results obtained by the DEXi multi-criteria model.

Keywords: LMS, Evaluation, Analytic Hierarchy Process

INTRODUCTION
The Organization for Economic Cooperation and Development defined LMS technology as a technology used by instructors to build and maintain courses. It features personal communication via email; group communication via chatting and forums; posting of content, including syllabi, papers, presentations and lesson summaries; performance evaluation via question-and-answer repositories, self-assessment tests, assignments, quizzes and exams; instruction management via messaging, grade posting and surveys; and more.

There are many LMS systems on the market that can be obtained for free and are Open Source (e.g. Moodle, Sakai, Claroline, ATutor, etc.) or through payment (e.g. Blackboard, WebCT, CLIX, and many others). All of them support many different features which can be used as evaluation criteria and analyzed from different aspects [6]:
1. Pedagogical aspect
2. Learner environment
3. Instructor tools
4. Course and curriculum design
5. Administrator tools
6. Technical specification.

Pedagogical criteria can, for example, include [15]: Learner control, Learner activity, Cooperative/collaborative learning, Goal orientation, Applicability, Added value, Motivation, Valuation of previous knowledge, Flexibility and Feedback.

On the other hand, Kurilovas [12] groups technical criteria as follows:
1. Overall architecture and implementation: Scalability of the system; System modularity and extensibility; Possibility of multiple installations on a single platform; Reasonable performance optimizations; Configurable look and feel; Security; Modular authentication; Robustness and stability; Installation, dependencies and portability;
2. Interoperability: Integration is straightforward; LMS standards support (IMS Content Packaging, SCORM);
3. Cost of ownership;
4. Strength of the development community (for open source products): Installed base and longevity; Documentation; End-user community; Developer community; Open development process; Commercial support community;
5. Licensing;
6. Internationalization and localization: Localizable user interface; Localization to relevant languages; Unicode text editing and storage; Time zones and date localization; Alternative language support;
7. Accessibility: Text-only navigation support; Scalable fonts and graphics; and
8. Document transformation.

It is obvious that selection of the most suitable LMS is a complex task that involves defining the evaluation criteria and selecting a method for criteria evaluation that will be systematic, comprehensive, easy to use, etc.

Once defined, the criteria can be evaluated using a self-evaluation questionnaire that employs a 7-point Likert scale: 1 (strongly disagree) – 5 (strongly agree), 6 (not applicable), 7 (don't know) [7, 13, 14, 15]. Other evaluation tools include an MS-Excel spreadsheet application [1], fuzzy logic [6], the expert system shell for multi-attribute decision support DEXi [2], a hybrid multi-criteria decision-making (MCDM) model based on factor analysis and DEMATEL [21], etc.

The number of features for evaluation is usually very high in all these applications (e.g. 57 in Pipan [16]; 52 offered in Cavus [6]). To evaluate such a great number of features, a significant amount of time and effort is required of the evaluator.

We believe that reliable results can be obtained with fewer criteria if the problem is decomposed into more easily comprehended sub-problems that can be analyzed independently, i.e. presented as a hierarchy. One of the most popular methods that deal with decision hierarchies is the Analytic Hierarchy Process (AHP) [17], and we propose this method for the evaluation of selected LMS products because: (1) it supplies management in both education and industry with a less complex, more appropriate and flexible way to effectively analyze LMSs, (2) it supports their selection of an appropriate product, and (3) it contributes to a higher level of e-learner satisfaction [18]. Other advantages of AHP that should be emphasized are that it provides a measure of the consistency of the evaluator and that it can be used for participative evaluation of LMS products.

To verify AHP applicability, an expert is asked to use AHP on an originally developed hierarchy of the problem of selecting the most appropriate LMS for the student target group. Consistency of the expert is checked throughout the process. At the end, results of the evaluation are compared with the results presented in [16].

AHP IN BRIEF

Main features
One of the key issues in decision making is eliciting judgments from the decision maker (DM) about the importance of a given set of decision elements. If a problem can be structured hierarchically, then a ratio scale can serve as an efficient tool for eliciting these judgments through pair-wise comparisons. The core of AHP [17] lies in presenting the problem as a hierarchy and comparing the hierarchical elements in a pair-wise manner using Saaty's 9-point scale (Table 1). In this way, the importance of one element over another is expressed with respect to the element in the higher level. AHP is a multi-criteria optimization method which creates so-called local comparison matrices at all levels of a hierarchy and performs logical syntheses of their (local) priority vectors. A major feature of AHP is that it accommodates a variety of tangible and intangible goals, attributes, and other decision elements. In addition, it reduces complex decisions to a series of pair-wise comparisons; implements a structured, repeatable, and justifiable decision-making approach; and builds consensus.

Judgment term                                Numerical value
Absolute preference (element i over j)       9
Very strong preference (i over j)            7
Strong preference (i over j)                 5
Weak preference (i over j)                   3
Indifference of i and j                      1
Weak preference (j over i)                   1/3
Strong preference (j over i)                 1/5
Very strong preference (j over i)            1/7
Absolute preference (j over i)               1/9
Intermediate values 2, 4, 6, 8 and 1/2, 1/4, 1/6, 1/8 can be used as well.

Table 1: Saaty's fundamental scale for comparative judgments

In standard AHP, an eigenvector (EV) method is used for deriving weights from local matrices; the EV method is called the prioritization method, and the computational procedure is consequently called prioritization. After local weights are calculated at all levels of the hierarchy, a synthesis consists of multiplying the criterion-specific weight of each alternative by the corresponding criterion weight and summing up the results to obtain the composite weight of the alternative with respect to the goal; this procedure is uniform for all alternatives and all criteria.

AHP is aimed at supporting decision-making processes in both individual and group contexts. In the latter case, various aggregation schemes are applicable, e.g. AIJ and AIP [9], and various consensus-reaching procedures are easy to implement. This issue is out of scope here; the paper deals strictly with an individual application of AHP.

Measuring consistency
The DM makes judgments more or less consistently, depending not only on his knowledge of the decision problem itself, but also on his ability to remain focused and to ensure that his understanding of the cardinal preferences between elements is always, or as much as possible, formalized properly while using a verbal scale or the related numerical ratios [20]. For example, if Saaty's 9-point scale is used, the question could be: will the DM put aij = 3 or aij = 2 if he considers element Ei slightly more important than Ej? Or, if there are seven elements to be compared, then matrix A is of size 7x7, and the question could be: is the DM really capable of preserving consistency while comparing all pairs of elements head-to-head 21 times? How is the DM to overcome the difficulty imposed by Saaty's scale when he compares elements Ei and Ek, after he has already judged the pairs Ei, Ej and Ej, Ek?
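The mechanics behind these questions can be made concrete with a small numerical sketch. The following Python fragment is ours, not part of the original study: the matrices, function names and the power-iteration shortcut are illustrative assumptions, while the CI and CR formulas follow equations (1) and (2) and Saaty's commonly tabulated random index values.

```python
# Illustrative sketch (not from the paper): reciprocal pairwise comparison
# matrices on Saaty's scale, eigenvector prioritization by power iteration,
# and the CI/CR consistency measures of equations (1) and (2).

RI = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]  # Saaty's random index, n = 1..10

def principal_eigen(A, iters=200):
    """Power iteration on a positive matrix: returns (lambda_max, priority vector)."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]          # normalize so the weights sum to 1
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n   # estimate of lambda_max
    return lam, w

def consistency_ratio(A):
    n = len(A)
    lam, w = principal_eigen(A)
    CI = (lam - n) / (n - 1)            # equation (1)
    return CI / RI[n - 1], w            # equation (2)

# Perfectly transitive judgments: a13 = a12 * a23 = 3 * 3 = 9, so CR = 0.
A = [[1.0, 3.0, 9.0],
     [1 / 3, 1.0, 3.0],
     [1 / 9, 1 / 3, 1.0]]
CR, weights = consistency_ratio(A)      # CR numerically zero; weights about (0.692, 0.231, 0.077)

# With a12 = 3 and a23 = 4, transitivity would demand a13 = 12, which the
# 1-9 scale cannot express; capping a13 at 9 makes the matrix inconsistent
# and CR becomes positive.
B = [[1.0, 3.0, 9.0],
     [1 / 3, 1.0, 4.0],
     [1 / 9, 1 / 4, 1.0]]
CR_B, _ = consistency_ratio(B)          # small but strictly positive
```

On Saaty's 0.10 threshold, both matrices would still be accepted, but only the first is perfectly transitive; the second quietly accumulates the kind of inconsistency discussed next.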
If he has already made the judgments aij = 3 and ajk = 4, he should logically put aik = 12 without any further judging, because a simple transitivity rule applies: aik = aij * ajk = 3 x 4 = 12. However, because the maximum value in Saaty's scale is 9, reserved for declaring the absolute dominance of one element over another, there is a problem in attaining consistency while judging certain elements. The inconsistencies generally accumulate until the need for measuring them arises.

Consistency analysis of the individual DM can be based on the consistency ratio (CR) defined by Saaty [17] and on the total Euclidean distance (L2 ED) for each comparison matrix. Whichever method is used to derive the priority vector from a given local AHP matrix [19], once all its entries have been elicited from the DM, measuring consistency is necessary in order to ensure the integrity of the outcomes.

Standard AHP uses the EV prioritization method and the consistency ratio CR to indicate the inconsistency of the DM [17]. The other commonly used consistency measures are the total Euclidean distance and the minimum violations measure.

The CR is calculated as a part of the standard AHP procedure. First, the consistency index (CI) is calculated using the following equation:

CI = (λmax − n) / (n − 1)    (1)

where λmax is the principal eigenvalue of the given comparison matrix and n is its size. Knowing the consistency index and the random consistency index (RI), also defined by Saaty [17], the consistency ratio is obtained:

CR = CI / RI    (2)

Saaty [17] suggested considering the maximum acceptable level of the DM's inconsistency to be 0.10; that is, CR should be less than or equal to 0.10.

EXAMPLE APPLICATION

Problem statement
The problem is stated so as to assess and rank by applicability three learning management systems based on three typical qualitative criteria and a number of qualitative sub-criteria. An expert is asked to perform the decision-making process by applying the AHP model.

Hierarchy of the problem
The original hierarchy of the problem [16] consists of five levels: goal – criteria set – sub-criteria set (4+4+3 per criterion in the upper level), represented by specific groups of attributes – sub-sub-criteria (24 in total under the sub-criteria), represented by groups of more detailed attributes – and three alternatives (LMSs). In order to reduce the number of decision elements, the fourth level of the hierarchy (sub-sub-attributes) is omitted, and thus the reduced hierarchy of the problem is created as shown in Figure 1.

Figure 1: Reduced hierarchy of the decision problem (goal: identify the LMS with the best applicability characteristics)

Criteria set (with attributes as sub-criteria)
The set of criteria is the key component of the decision-making model. In creating the model [16], an attempt is made to meet the requirements set by Bohanec & Rajkovič [5] by taking into account the principles of criteria integrity (inclusion of all relevant criteria), appropriate structure, non-redundancy, comprehensiveness and measurability [4]. Comprehensiveness means that all the data about the subject are actually present in the database. Non-redundancy means that each individual piece of data exists only once in the database. Appropriate structure means that the data are stored in such a way as to minimize the cost of expected processing and storage [3].

The criteria set is stated under three main scopes: Student's learning environment; System, technology & standards; and Tutoring & didactics. These three scopes represent the global skeleton of the multi-attribute model, with attributes (considered as sub-criteria) associated with each criterion.

(1) SLE (Student's learning environment): The first scope is adopted as the first criterion and declared as the Student's learning environment. It is composed of four basic attributes:
• (EASE) Ease of use,
• (COMM) Communication,
• (FUEV) Functional environment and
• (HELP) Help.

(2) STS (System, technology & standards): The second group of attributes is grouped into the System, technology & standards category. These groups of criteria are assessed through four basic attributes:
• (TEIN) Technological independence. This attribute is used for the evaluation of an LMS from the perspective of its technological accessibility, which is a pre-condition that has to be met if we wish to talk about system applicability and efficiency.
• (SECR) Security and privacy. This criterion focuses on two issues: user security and privacy, and security and privacy of the LMS itself. User security and privacy should be at the forefront of attention; an LMS must therefore keep communication and personal data safe and avoid dangers and attacks on user computers. Application security and privacy assessment is made using authentication, authorization, logging, monitoring and validation of input.
• (LIHO) Licensing & hosting.
• (STAN) Standards support. It is also important to consider e-learning standards – standards for the description of learners' profiles and standards for the description of learning resources [11]. In the context of e-learning technology, standards are generally developed to be used in system design and implementation for the purposes of ensuring interoperability, portability and reusability, especially for learning resources, since their preparation requires qualified professionals and is very time consuming [10].

(3) T&D (Tutoring & didactics): The third group of criteria is merged into Tutoring & didactics. The quality of the tutor's environment is assessed using the:
• (CODE) Course development,
• (ACTR) Activity tracking and
• (ASSE) Assessment criteria.
Activity tracking undoubtedly provides important support to the tutor in the learning process. Here we have focused on monitoring students in the process of learning and on the possibility of displaying students' progress, analysis of presence data, sign-in data and time analysis.

Decision alternatives
The multi-attribute decision-making model was completed with three learning management systems (LMSs):

A1. Blackboard 6 (www.blackboard.com): Blackboard is among the most perfected and complex LMSs on the market. The system offers various communication options (both synchronous and asynchronous) within the learning environment. The Blackboard LMS is designed for institutions dedicated to teaching and learning. Blackboard technology and resources power the online, web-enhanced, and hybrid education programs at more than 2,000 academic institutions (research universities, community colleges, high schools, virtual MBA programs, etc.). Blackboard has 5,500 clients representing 200 million users (2.5 million from its largest hosted client; 100,000 from its largest self-hosted client) in 60 countries [8].

A2. CLIX 5.0 (www.im-c.de): CLIX is targeted most of all at big corporations, because it provides efficient, manageable, connected and expandable internet-based learning solutions. This scalable, multilingual and customizable software aims at providing process excellence for educational institutions. For educational administrators, CLIX offers powerful features for course management and distribution. Additionally, it provides personalized learning paths for students, a tutoring centre for lectures and a whole range of innovative collaboration tools for both users and groups, e.g. a virtual classroom. Altogether, CLIX makes planning, organizing, distributing, tracking and analyzing learning and teaching a smooth and efficient process.

A3. Moodle 1.5.2 (www.moodle.org): Moodle is a free, open source PHP application for producing internet-based educational courses and web sites on any major platform (Linux, UNIX, Windows and Mac OS X). The fact that it is free of charge is especially attractive for schools and companies, which always lack resources for the introduction of new learning technologies. Furthermore, the Moodle system is not only price-efficient – it can easily be compared to costly commercial solutions in all aspects. Courses are easily built up using modules such as forums, chats, journals, quizzes, surveys, assignments, workshops, resources, choices and more. Moodle supports localization and has so far been translated into 34 languages. Moodle has been designed to support modern pedagogies based on social constructionism, and focuses on providing an environment that supports collaboration, connected knowing and a meaningful exchange of ideas. It has nearly 54,000 registered sites (over 9,800 from the U.S.) in over 200 countries, with 44.3 million users and 4.6 million courses. Moodle's widespread international use, coupled with its continued growth over the past six years, has made it the leading open source LMS solution.

Evaluation of decision elements
After a brief explanation of the basics and concepts of AHP, the expert compared in pairs first the criteria versus the goal, then the sub-criteria versus the criteria, and finally the alternatives with respect to each of the sub-criteria. The comparison matrices and the related calculated local weights of the decision elements are presented in Figures 2-3.

Figure 2: Criteria versus goal and their local weights

Figure 3: Sub-criteria versus criteria and their local weights

After the local weights (W) of all decision elements are calculated, a synthesis is performed to obtain the composite weights of the alternatives with respect to the goal (Table 2).

                 Weights
Blackboard 6     0.257
CLIX 5.0         0.590
Moodle 1.5.2     0.152
HCR = 0.059

Table 2: Final (composite) weights of alternatives

The alternative with the highest final weight is CLIX 5.0 (0.590), and it can be considered the most applicable LMS for the students. The second-ranked alternative is Blackboard, while Moodle 1.5.2 is the least applicable LMS. It is worth mentioning that the expert was very consistent during the whole evaluation process; the overall HCR is 0.059.

DISCUSSION AND CONCLUSIONS
One of the important problems in the field of e-learning is the selection of an appropriate LMS that will satisfy most of the users' preferences and requirements. The complexity of the problem increases with the growing number of LMSs each year and with the number of features that should be taken into account while evaluating each LMS. To reduce that complexity and facilitate the selection of an appropriate LMS, we propose a decomposition of the problem into more easily comprehended sub-problems that the evaluator can analyze independently. The AHP methodology, based on pair-wise comparison of decision elements at one hierarchy level, was found to be appropriate for such analysis. Also, the final result of the AHP application, which found CLIX 5.0 to be the most applicable LMS, proved that the proposed approach was justified: the reduced hierarchy and the use of AHP led to the same result as the one provided by the DEXi evaluation of 57 criteria.

If AHP and DEXi are further compared, it should also be emphasized that:
a) AHP treats the consistency of the DM (or DMs); DEXi does not.
b) DEXi uses a simplified 3-point scale (linguistic semantic statements such as low, average and high); AHP most commonly uses Saaty's 9-point (fundamental) scale; other scales in use are the geometric (Lootsma's), balanced, Ma-Feng scales, etc. In practical implementations the first seems easier, especially if many decision elements have to be assessed. If one has to compare 7 or more elements at a time using any AHP scale, it can be time consuming and inconsistent (e.g. due to 'short-term memory' and/or 'brain channel capacity' limits).
c) AHP produces cardinal information represented by weights at all hierarchical levels of the decision problem; DEXi does this only very approximately and with limited theoretical justification.
d) Both AHP and DEXi run easily on any standard PC platform.

Both AHP and DEXi can be used in individual and group decision-making frameworks. In group contexts AHP enables the direct application of various aggregation schemes (e.g. AIJ, AIP; different weights allocated to DMs; different consensus-reaching procedures), while in DEXi there are no implemented aggregation schemes.

REFERENCES
1. 3Waynet Inc. LMS evaluation tool user guide. Under License to Commonwealth.
2. Arh, T., Blazic, J.B. Application of multi-attribute decision making approach to learning management systems evaluation. J Comput 2, 10 (2007), 28–37.
3. Awad, E.M., Gotterer, M.H. Database management. Danvers, MA: Boyd & Fraser, 1992.
4. Baker, D., Bridges, D., Hunter, R., Johnson, G., Krupa, J., Murphy, J., Sorenson, K. Guidebook to Decision-Making Methods. WSRC-IM-2002-00002, Department of Energy, USA, 2002.
5. Bohanec, M., Rajkovič, V. Multi-Attribute Decision Modeling: Industrial Applications of DEX. Informatica 23 (1999), 487−491.
6. Cavus, N. The evaluation of Learning Management Systems using an artificial intelligence fuzzy logic algorithm. Advances in Engineering Software 41 (2010), 248–254.
7. Chen, Y., Hwang, R., Wang, C. Development and evaluation of a Web 2.0 annotation system as a learning tool in an e-learning environment. Computers & Education 58, 4 (2012), 1094-1105.
8. Cobb, J., Steele, C. Association learning management systems 2011: Special Blackboard edition. Tagoras (2011), 30-31.
9. Forman, E., Peniwati, K. Aggregating individual judgments and priorities with the analytic hierarchy process. Eur J Oper Res 108 (1998), 165–169.
10. IEEE Computer Society. Learning Technology Standards Committee (LTSC), Draft Standard for Learning Objects Metadata (LOM) (Tech. Rep. 1484.12/D4.0). Washington, DC, USA, IEEE Computer Society, 2002.
11. Jerman Blažič, B., Klobučar, T. Privacy provision in e-learning standardized systems: status and improvements. Computer Standards & Interfaces 27 (2005), 561−578.
12. Kurilovas, E. Methods of Multiple Criteria Evaluation of the Quality of Learning Management Systems for Personalised Learners Needs. Presented at the EC-TEL 2009 Workshop "Learning Management Systems meet Adaptive Learning Environments", September 29 (2009), Nice, France.
13. Lee, J., Lee, W. The relationship of e-Learner's self-regulatory efficacy and perception of e-Learning environmental quality. Computers in Human Behavior 24, 1 (2008), 32-47.
14. Naveh, G., Tubin, D., Pliskin, N. Student LMS use and satisfaction in academic institutions: The organizational perspective. The Internet and Higher Education 13, 3 (2010), 127-133.
15. Nokelainen, P. An empirical assessment of pedagogical usability criteria for digital learning material with elementary school students. Educational Technology & Society 9, 2 (2006), 178-197.
16. Pipan, M., Arh, T., Jerman Blažič, B. The Evaluation Cycle Management - Method Applied to the Evaluation of Learning Management Systems. In Integrating Usability Engineering for Designing the Web Experience: Methodologies and Principles (Eds. Spiliotopoulos, T., Papadopoulou, P., Martakos, D., Kouroupetroglou, G.), IGI Global, Hershey, PA, USA, 2010.
17. Saaty, T.L. The Analytic Hierarchy Process. McGraw-Hill, 1980.
18. Shee, D.Y., Wang, Y. Multi-criteria evaluation of the web-based e-learning system: A methodology based on learner satisfaction and its applications. Computers & Education 50, 3 (2008), 894-905.
19. Srdjevic, B. Combining different prioritization methods in analytic hierarchy process synthesis. Computers & Operations Research 32 (2005), 1897–1919.
20. Srdjevic, B., Srdjevic, Z. Bi-criteria evolution strategy in estimating weights from the AHP ratio-scale matrices. Applied Mathematics and Computation 218 (2011), 1254-1266.
21. Tzeng, S., Chiang, C., Li, C. Evaluating intertwined effects in e-learning programs: A novel hybrid MCDM model based on factor analysis and DEMATEL. Expert Systems with Applications 32, 4 (2007), 1028-1044.