Assessment and Improvement of Information Quality through the Information Management Process Concept

Ismael Caballero, Mario Piattini

• Ismael Caballero is with the ALARCOS Research Group, Escuela Superior de Informática (UCLM), Paseo de la Universidad 4, 13071 Ciudad Real (Spain). E-mail: Ismael.Caballero@uclm.es
• Mario Piattini is with the ALARCOS Research Group, Escuela Superior de Informática (UCLM), Paseo de la Universidad 4, 13071 Ciudad Real (Spain). E-mail: Mario.Piattini@uclm.es

Abstract — It is a well-known fact that information is one of the most important assets for today's enterprises, since it is the basis for organizational decisions. However, as information is produced from data, both data and information quality must be assured. Although many researchers have proposed technical and managerial solutions to specific information quality problems, an integrative framework which brings these kinds of solutions together is still lacking. Our proposal consists of a framework for assessing and improving information quality through the concept of Information Management Process (IMP). An IMP is assessed against an information quality maturity model by means of an assessment and improvement methodology. The framework provides a consistent roadmap for coordinating efforts and resources to manage information quality from a strategic perspective. A case study is included in the paper as an application example.

Index Terms — Information Quality, Information Quality Management, Information Quality Assessment

1 INTRODUCTION

As [39] states, information can be obtained as the result of a data manufacturing process in which data must be considered non-ideal raw material. Data is said to be non-ideal because of different and specific "potholes" related to special characteristics of data: multiple sources of data can generate different values; systematic errors can cause information losses; a large amount of data can be unmanageable for an application in a reasonable time; distributed and heterogeneous systems can generate inconsistent formats, values or definitions [37]. These potholes are all sources of ongoing information quality problems, such as unused data, barriers to data accessibility or difficulty in data utilization. The need to consider data and information as one of the most important assets for organizations [19] (and therefore one of their most precious resources [30]) has been demonstrated, since it is the basis for tactical, strategic and operational decisions [33, 36]. Poor data and information quality has a negative impact on the global efficiency of organizations [33].

Fortunately, more and more organizations have at last realized the importance of information quality and try to implement some of the frameworks proposed by researchers [1, 7, 13, 19, 23, 28, 30, 33, 36, 38, 40] for improving specific information quality issues, although many of these organizations do not yet have the right techniques, tools and practices to achieve a high information quality level [25]. One of the reasons for this is that information quality problems are not usually understood as a problem of the entire organization. It is a matter for the quality management team, encouraged by the organization's heads, who must implement several quality management concepts such as information quality policy, information strategy, information quality planning, information quality control and information quality assurance throughout the organization [16, 17], involving all the workers and trying to coordinate efforts and commitments in order to control and improve information quality issues [29].

Unfortunately, there is no integrative framework that guides organizations towards achieving information quality goals through management by implementing the above-mentioned concepts [14, 23]. Although some researchers have provided information quality measurements and/or assessment methodologies [7, 19, 30, 33, 40], none of them focus on group efforts or commitments extending to the entire organization in both analytic and pragmatic ways [9]. What is required is to know how an organization works, and to develop the ability to identify major problems and to establish an information quality culture by involving both leaders and infrastructure [15, 20] in technical and management tasks.

In an effort to fill this void, we propose an integrative framework in which information is considered as a product (which allows an engineering point of view of information quality to be taken [4, 23]). Taking into account the Software Process definition given by [12] (which allows the identification of who is doing what, when, using which resources and how), both Information Management and Information Quality Management activities can be perceived as an Information Management Process (IMP). An IMP is intended to model how information quality might be managed by drawing the relationships among the main components of the Information System. With this perception in mind, the information quality of the entire organization can be managed by assessing and improving each IMP in the organization, taking into account that several IMPs can share several resources.

It is true that there are several frameworks for assessing and improving software processes, such as CMM (Capability Maturity Model) [20, 31], CMMI [35], ISO 9001 [5], BootStrap [3] and SPICE [21], but none of them focus on information quality, nor do they even take it into account.

In [9], four conditions for a good information quality framework are established:
1. It should provide a systematic and concise set of criteria according to which information can be evaluated.
2. It should provide a scheme to analyze and solve information quality problems.
3. It should provide the basis for information quality measurement and proactive management.
4. It should provide the research community with a conceptual map that can be used to structure a variety of approaches, theories and information quality related phenomena.

Our proposal defines two main components:
• An information quality model based on staged maturity levels, known as CALDEA, whose main aim is to coordinate the relationships among the IMP's components by defining several maturity levels with growing information quality goals and requirements. For each level, several Key Process Areas (KPAs) are proposed, focused on both management and technical issues. For each KPA, tools, techniques, standards, practices and metrics are proposed as required, but not imposed; due to length restrictions they are not included in this paper.
• An assessment and improvement methodology, known as EVAMECAL, in the style of CBA-IPI [6], SCAMPI [34] or SPICE [22], which consists of a set of steps that provides a basis for data/information quality measurement and proactive management. Its main elements are a set of questionnaires for assessment and a rule system for identifying the path to improvement based on the results of that assessment.

This structure satisfies the four conditions of [9]: as CALDEA is structured in maturity levels with KPAs, a systematic and concise set of criteria for information quality assessment is provided, satisfying the first condition. By defining KPAs for each level, some of which are focused on management issues, the basis for proactive management and measurement is provided, satisfying the third condition. Finally, by being structured in staged levels that describe KPAs and propose (but do not impose) tools, techniques and standards, a conceptual map is provided for the research community in order to address a variety of approaches, theories and information quality related phenomena, satisfying the fourth condition. EVAMECAL, in turn, provides a schema for analyzing and solving information quality problems, satisfying the second condition.

The main idea of the framework is to use EVAMECAL for assessing and improving an IMP, using CALDEA's levels as the reference that guides the optimization of information quality in organizations. By optimizing the most important and critical IMPs, an organization can reach a satisfactory information quality state.

The remainder of this paper is structured as follows: section 2 describes CALDEA; section 3 presents a brief summary of EVAMECAL; section 4 shows the results of applying EVAMECAL to a specific organization. Lastly, some conclusions and future lines of research related to the model are explained.
2 CALDEA: AN INFORMATION QUALITY MODEL BASED ON MATURITY LEVELS

CALDEA defines five information quality maturity levels for an IMP: Initial, Definition, Integration, Quantitative Management and Optimizing. The levels are ordered by taking into account several information quality goals and their relative importance, providing a systematic and concise set of criteria according to which information can be evaluated. Thus, at higher levels, where more information quality issues are assured, more organizational requirements are satisfied. It is also possible to affirm that the higher the information quality maturity levels an organization has reached for its most important IMPs, the more competitive this organization can be, due to the absence of information quality problems. The levels are staged like those of CMMI, because it appears to be easier to work with a well-defined sequence of improvements, covering everything from basic project management fundamentals to complex data quality management issues. For each level, CALDEA addresses specific KPAs which meet specific information quality goals. These KPAs are focused not only on technical but also on managerial issues, providing the basis for information quality measurement and management and integrating both aspects in order to compensate for the lack of integrative frameworks mentioned in the introductory section. Each KPA is divided into activities and tasks, which can be satisfied by using several techniques, practices and tools in order to transform a set of incoming products into outgoing ones. In order to make the framework as universal and general as possible, none of the techniques, practices and tools are mandatory, as previously mentioned; organizations must choose those best suited to each KPA on their own. We should emphasize that the chosen KPAs are based on CMMI's KPAs [35], and the chosen activities and tasks on our experience with industrial and scientific initiatives regarding information quality, which has been the main rationale for their choice. The contents of this paper are in continuous progress in order to achieve theoretical validation. Due to paper length restrictions, neither techniques nor tools are looked at in detail.
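To make this structure of levels, KPAs, activities and tasks concrete, the sketch below shows one possible way an assessment tool could represent CALDEA as data objects. It is a minimal illustration: the class and field names are our own assumptions rather than part of the model, and only the level names, KPA acronyms and criticality degrees (see Table 2) are taken from CALDEA itself.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical data model: class and field names are illustrative, not prescribed by CALDEA.

@dataclass
class Task:
    name: str
    satisfied: bool = False      # outcome of the assessment questions for this task

@dataclass
class Activity:
    name: str
    tasks: List[Task] = field(default_factory=list)

@dataclass
class KPA:
    acronym: str                 # e.g. "URM"
    name: str
    criticality: float           # weight later used for the level qualification (Table 2)
    activities: List[Activity] = field(default_factory=list)

@dataclass
class MaturityLevel:
    number: int                  # 2 = Definition, 3 = Integration, 4 = Quantitative Mgmt, 5 = Optimizing
    name: str
    kpas: List[KPA] = field(default_factory=list)

# A fragment of the Definition level, using three of its six KPAs as an example.
definition_level = MaturityLevel(
    number=2,
    name="Definition",
    kpas=[
        KPA("IQATM", "Information Quality Assurance Team Management", 0.10),
        KPA("URM", "User Requirements Management", 0.25),
        KPA("DSTM", "Data Sources and Data Targets Management", 0.10),
    ],
)
```

Keeping the criticality degree attached to each KPA allows the same objects to be reused later, when the level qualification is computed in EVAMECAL (section 3.1).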
2.1 Initial Level
An IMP is said to be at the Initial Level when no efforts are made to achieve any information quality goals.

2.2 Definition Level
An IMP is said to be at the Definition Level, or Defined, when it has been defined and planned. This implies identifying all its components and their relationships to the requirements. To achieve this goal, the following KPAs need to be satisfied:
• (IQATM) Information Quality Assurance Team Management. The aim of this KPA is to form a team composed of people having direct responsibility for information and for its integrity. This team encourages the entire organization to take on commitments regarding information quality policies [2] and to make the corresponding efforts to support the activities of this maturity model.
• (IPM) IMP Project Management. This is a management KPA aimed at developing a plan for the IMP in order to coordinate both managerial and technical efforts and to elaborate all related documentation.
• (URM) User Requirements Management. User requirements must be collected and documented. Three kinds of requirements can be identified: those related to the final product (URS), those related to the IMP, which must be gathered in the User Requirements Specification for the IMP (URS-IMP) document, and those related to information quality, which must be gathered in the Information Quality User Requirements Specification (URS-IQ).
• (DSTM) Data Sources and Data Targets Management. Due to the intrinsic characteristics of data, both data sources and data targets must be identified and documented, in order to avoid problems such as uncontrolled data redundancy or problems with data format interchange [28].
• (ADMPM) Database or Data Warehouse Acquisition, Development or Maintenance Project Management. In order to improve information quality, it is highly recommendable to draw up a project for the acquisition, development or maintenance of a database or data warehouse management system supporting both the URS-IQ and the URS-IMP.
• (IQM) Information Quality Management in IMP Components. For each IMP component, the information quality dimensions from the URS-IQ must be identified, controlled and monitored. It is necessary to identify from the URS-IQ the information quality dimensions that must be controlled [19], as well as the metrics adapted to each of those dimensions [23, 32].

2.3 Integration Level
An IMP is said to be at the Integration Level, or Integrated, when, besides having been Defined (the Definition level has been achieved), efforts are made to assure that the IMP complies with organizational information quality requirements and standards. This implies standardizing the different information quality lessons learned in order to avoid previous errors and improve future work. The following KPAs must be satisfied:
• (VV) Information Products and IMP Components Validation and Verification. Both information products (obtained as a result of the data transformation process) and IMP components must be verified and validated to correct defects and/or discordance with the URS-IMP, the URS-IQ and the organizational information quality policies.
• (RM) Risk and Poor Information Quality Impact Management. Authors like [7] affirm that it is necessary to determine the impact of the risks due to poor information quality in the IMP in order to limit them at the organizational level.
• (IQSM) Information Quality Standardization Management. All lessons learned through specific experiences should be properly gathered, documented and transmitted to all new people who are going to be part of an IMP. Thus, IMP performance will be higher than it would otherwise be.
• (OIQPM) Organizational Information Quality Policies Management. This is the means by which all the efforts previously mentioned can be implemented, and consists of defining information quality policies based on the previously defined standards, affecting not only single IMPs but the whole organization.

2.4 Quantitative Management Level
An IMP is said to be at the Quantitative Management Level, or Quantitatively Managed, when, after having been Integrated (the Integration level has been achieved), several measurement plans have been developed and implemented and the measurement procedures have been automated. The main information quality goal of this level is therefore to obtain quantitative confirmation that IMP performance, over a reasonable time period, remains as consistent as required in terms of variation and stability, through a reliable set of measurements [11] of the information quality characteristics of the IMP. This level is composed of the following KPAs:
• (MM) IMP Measurement Management. Since metrics about IMP components have been drawn up at the Definition level, the aim of this KPA is to define when and how to take the measurements, and how and to whom to present the results. These metrics are used to check conformity to specifications [15, 24].
• (AMP) IMP Measurement Plan Automation Management. In order to increase the reliability and repeatability of measures, measurement procedures must be automated, as required by [18]. This KPA addresses all the issues related to the automation of these measurement procedures.
2.5 Optimizing Level
An IMP is said to be at the Optimizing Level if, while being quantitatively managed, the obtained measurements are used to drive continuous improvement, by eliminating defects or by proposing and implementing improvements. The following two KPAs must be satisfied:
• (CADPM) Causal Analysis for Defect Prevention Management. From the study of the measurement results, typical quality techniques and tools such as Statistical Process Control (SPC) or Ishikawa diagrams can be applied to detect information quality defects and identify their root causes (a small sketch of such an SPC check is given after this list). The conclusions obtained must form the basis for a corresponding maintenance process that removes the detected defects from the affected resources.
• (IODM) Innovation and Organizational Development Management. Similarly to the previous KPA, here the results can be used to improve the IMP in terms of performance, planned time or budget. This is the basis for the idea of continuous improvement.
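As an illustration of how SPC might be used within this KPA, the following sketch computes the control limits of an individuals (XmR) chart for a periodic information quality measurement and flags out-of-control points as candidate defects for causal analysis. The function names, the choice of an XmR chart and the example data are assumptions made for illustration; CALDEA does not prescribe a particular chart.

```python
from statistics import mean
from typing import List, Tuple

def xmr_limits(measurements: List[float]) -> Tuple[float, float, float]:
    """Center line and 3-sigma control limits of an individuals (XmR) chart.

    Sigma is estimated from the average moving range divided by the d2
    constant for subgroups of size 2 (1.128), as is usual for XmR charts.
    """
    center = mean(measurements)
    moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
    sigma_hat = mean(moving_ranges) / 1.128
    return center, center + 3 * sigma_hat, center - 3 * sigma_hat

def out_of_control(measurements: List[float]) -> List[int]:
    """Indices of measurements beyond the control limits (candidate IQ defects)."""
    center, ucl, lcl = xmr_limits(measurements)
    return [i for i, x in enumerate(measurements) if x > ucl or x < lcl]

# Illustrative data: weekly percentage of incomplete training records.
weekly_incomplete = [2.1, 1.8, 2.4, 2.0, 2.2, 1.9, 2.3, 2.0, 9.5, 2.1]
print(out_of_control(weekly_incomplete))   # -> [8]: week 9 warrants root-cause analysis
```

Points flagged in this way would then feed the Ishikawa-style root-cause analysis and the maintenance process mentioned above.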
2.6 Achieving higher levels of CALDEA
In the proposed framework, a KPA can be in one of the following states: {"Fully Satisfied", "Satisfied", "Partially Satisfied", "Not Satisfied"}. An information quality maturity level is said to be achieved when all the KPAs it contains are at least "Satisfied"; that is to say, in order to achieve higher levels of CALDEA, the Information Quality Management Team must guide the IMP to the "Satisfied" state for all the KPAs contained in the lower levels.
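A minimal sketch of this achievement rule, assuming the state of each KPA has already been determined by the assessment described in section 3 (the function names and the dictionary layout are illustrative):

```python
from typing import Dict, List

# KPA states ordered from worst to best (state names as used in the framework).
STATE_ORDER = ["Not Satisfied", "Partially Satisfied", "Satisfied", "Fully Satisfied"]

def level_achieved(kpa_states: List[str]) -> bool:
    """A maturity level is achieved when every one of its KPAs is at least 'Satisfied'."""
    threshold = STATE_ORDER.index("Satisfied")
    return all(STATE_ORDER.index(state) >= threshold for state in kpa_states)

def imp_maturity(levels: Dict[int, List[str]]) -> int:
    """Highest CALDEA level reached by an IMP.

    `levels` maps a level number (2..5) to the assessed states of its KPAs;
    level 1 (Initial) requires nothing.  A level only counts if all lower
    levels have been achieved as well, so the scan stops at the first failure.
    """
    reached = 1
    for number in sorted(levels):
        if not level_achieved(levels[number]):
            break
        reached = number
    return reached

# Hypothetical assessment result: Definition achieved, Integration not yet.
example = {
    2: ["Satisfied", "Fully Satisfied", "Satisfied", "Satisfied", "Satisfied", "Satisfied"],
    3: ["Partially Satisfied", "Satisfied", "Not Satisfied", "Satisfied"],
}
print(imp_maturity(example))   # -> 2
```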
This goal can be achieved by applying the assessment and improvement methodology, which is described next.

3 EVAMECAL: AN ASSESSMENT AND IMPROVEMENT METHODOLOGY

As previously mentioned, an assessment and improvement methodology is required in order to guide organizations towards higher information quality maturity levels for each IMP. Basically, the methodology consists of a PDCA cycle. The following is a brief summary of the steps to be taken:
1. Choose an IMP which needs to be optimized.
2. Elaborate a plan for its assessment.
3. Execute the assessment plan by conducting the surveys and measuring as required (see section 3.1, where the set of surveys is described).
4. Analyze the results and elaborate an Improvement Plan.
5. Study the viability of the Improvement Plan and the proposed solution.
6. Execute the Improvement Plan.
7. Confirm the improvements, and obtain and standardize conclusions.

3.1 The surveys
For the assessment process, a set of surveys has been elaborated. This set consists of four different classes of questionnaires, with different goals:
A. In order to delimit and characterize the organization, a total of fifteen questions.
B. In order to delimit and characterize the IMP to be assessed, a total of six questions.
C. In order to assess the degree of achievement of each maturity level, several questions organized in different and selective blocks have been developed. The questions are focused on the KPAs, activities, tasks, proposed techniques, tools and practices, and required work products. The questionnaire is organized in several depth levels so that questions are asked top-down and only if necessary. Thus, the block of questions of the first depth level serves to evaluate whether KPAs are satisfied or not, avoiding at this depth level other questions which only matter for establishing specific aspects of more detailed issues, which are dealt with at lower depth levels. So, if all the answers to the questions of the first depth level differ from "Not Satisfied", then the questions of the level immediately below should be answered, and so on. Altogether, one hundred and ninety questions would be answered if all the answers to the questions of the first and second depth levels differ from "Not Satisfied". On the other hand, a maturity level of a given IMP can be achieved only if the lower ones have been achieved. Table 1 gathers the number of questions for each maturity level and each depth level. Due to length restrictions, none of the questions are included in this paper. We are currently checking the validity and efficiency of each block, and of each question inside the blocks, for each depth level. The answer to each question must be a number between 0 and 100, in order to quantitatively assess the degree of satisfaction of each task, activity and KPA. Thus, it is possible to set a numeric qualification for each state by calculating a weighted average of the qualifications obtained for each of the KPAs of that level, according to a proposed weight given by a criticality degree. In the proposal, it has been established that if this qualification is between 0 and 20, the KPA is said to be "Not Satisfied"; if it is between 20 and 60, "Partially Satisfied"; if it is between 60 and 90, "Satisfied"; otherwise, "Fully Satisfied" (a small illustrative sketch of this scoring scheme is given after Table 2 below). Table 2 shows in its first column the KPAs of each maturity level and in its second column the criticality degree of each KPA for that level. The criticality degrees also serve as a criterion for choosing which KPAs must be satisfied first in the third step of EVAMECAL. These criticality degrees are a hypothesis based on the supposed degree of importance of each KPA; determining these weights from the demands of the different organizations interviewed, and fine-tuning the qualification ranges, is a line of future research.
D. Finally, in order to collate and compare the answers with those of the previous questionnaires, there is a last block of questions in descriptive, textual language.
The surveys are conducted at assessment time.

TABLE 1
NUMBER OF QUESTIONS BY MATURITY LEVEL AND DEPTH LEVEL

                        Depth level 1   Depth level 2   Depth level 3
Maturity Level 2              12              18              82
Maturity Level 3               8               7              24
Maturity Level 4               3               2              11
Maturity Level 5               4               2              11
Total per depth level         28              31             131
Total questions in the questionnaire: 190

TABLE 2
CRITICALITY DEGREE FOR EACH KPA IN CALDEA

Definition Level
  (IQATM) Information Quality Assurance Team Management: 10%
  (IPM) IMP Project Management: 15%
  (URM) User Requirements Management: 25%
  (DSTM) Data Sources and Data Targets Management: 10%
  (ADMPM) Database or Data Warehouse Acquisition, Development or Maintenance Project Management: 25%
  (IQM) Information Quality Management in IMP Components: 25%
Integration Level
  (VV) Information Products and IMP Components Validation and Verification: 25%
  (RM) Risk and Poor Information Quality Impact Management: 25%
  (IQSM) Information Quality Standardization Management: 25%
  (OIQPM) Organizational Information Quality Policies Management: 25%
Quantitative Management Level
  (MM) IMP Measurement Management: 70%
  (AMP) IMP Measurement Plan Automation Management: 30%
Optimizing Level
  (CADPM) Causal Analysis for Defect Prevention Management: 50%
  (IODM) Innovation and Organizational Development Management: 50%
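As referenced in item C above, the following sketch illustrates one way the numeric answers could be turned into states and into a level qualification: a 0-100 score is mapped to a state through the proposed ranges, and the qualification of a level is the criticality-weighted average of its KPA scores. The weights reproduce the Definition level rows of Table 2; the handling of the range boundaries and the example scores are assumptions, since the paper only gives the ranges.

```python
from typing import Dict

def state_of(score: float) -> str:
    """Map a 0-100 qualification to a state using the ranges proposed in the paper."""
    if score < 20:
        return "Not Satisfied"
    if score < 60:
        return "Partially Satisfied"
    if score <= 90:
        return "Satisfied"
    return "Fully Satisfied"

def level_qualification(scores: Dict[str, float], criticality: Dict[str, float]) -> float:
    """Criticality-weighted average of the KPA qualifications of one maturity level."""
    total_weight = sum(criticality[kpa] for kpa in scores)
    return sum(scores[kpa] * criticality[kpa] for kpa in scores) / total_weight

# Criticality degrees of the Definition level, as listed in Table 2.
DEFINITION_CRITICALITY = {
    "IQATM": 0.10, "IPM": 0.15, "URM": 0.25,
    "DSTM": 0.10, "ADMPM": 0.25, "IQM": 0.25,
}

# Hypothetical questionnaire results for the six Definition-level KPAs.
scores = {"IQATM": 15, "IPM": 10, "URM": 40, "DSTM": 55, "ADMPM": 35, "IQM": 5}

print({kpa: state_of(value) for kpa, value in scores.items()})
print(round(level_qualification(scores, DEFINITION_CRITICALITY), 1))   # about 25.9
```

Normalizing by the sum of the weights keeps the result on the 0-100 scale even if the criticality degrees of a level do not add up to exactly 100%.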
4 EXPERIENCES APPLYING THE FRAMEWORK

In order to empirically validate the framework and test its practical applicability and efficiency, it has been applied to different IMPs from several organizations. The following shows the results obtained from applying the framework to a particular organization with proven experience in the information management field. EVAMECAL was applied by following the steps previously detailed. The results of questionnaires A and B are presented in subsections 4.1 and 4.2; subsection 4.3 presents the results of questionnaire C and a list of proposed improvements. Since this paper was finished, the organization has continued working on the improvement plan, and several subgoals have already been achieved.

4.1 Characterization of the company
The main activity of the company is software development, with solid knowledge of and training in software quality standards and Software Engineering (all developments are carried out by following one of the most important and widely used national software development methodologies). The company has obtained ISO 9000 certification. The services offered are consulting, development, training courses, technical assistance, sale of licenses, database and data warehouse administration, system planning projects and migration to an important commercial DBMS (Oracle, of which they are certified partners). Of a total of eighty-nine employees, the eighteen in the systems department organize the computing support resources for the rest of the departments.

4.2 Characterization of the studied IMP
Among all the IMPs, the framework was applied to the one related to the Training Management Process, which is the responsibility of the consulting department. The main goal of this IMP is to manage data regarding training, which consists of gathering both internal and external requests for training, choosing who is going to be the trainer, determining which resources are going to be used, and managing several training quality issues. This process is adequately specified and documented in the quality manual of the organization. Several different forms exist for gathering data about course demands, assignations and quality evaluations of the proposed exercises, didactic materials, trainer capability, installations, assistance and resources used. The organization runs a software application, an internally developed tool, to manage all the previously mentioned data. One of the employees of the consulting department, normally always the same one, is responsible for transcribing data from the forms to the tool and for obtaining the information, which is then given to the appropriate person.
4.3 Assessment and Improvement of the IMP
All the questions in the surveys were put to the head of the consulting department. Table 3 shows the main results of these surveys. These results reflect that none of the KPAs belonging to level 2 or higher is at least "Satisfied", which means that the Definition level is in a "Not Achieved" state.

TABLE 3
RESULTS OBTAINED FROM APPLYING EVAMECAL TO THE TRAINING MANAGEMENT IMP

Definition Level: Not Achieved
  (IQATM) Information Quality Assurance Team Management: Not Satisfied
  (IPM) IMP Project Management: Not Satisfied
  (URM) User Requirements Management: Not Satisfied
  (DSTM) Data Sources and Data Targets Management: Partially Satisfied
  (ADMPM) Database or Data Warehouse Acquisition, Development or Maintenance Project Management: Partially Satisfied
  (IQM) Information Quality Management in IMP Components: Not Satisfied
Integration Level: Not Achieved
  (VV) Information Products and IMP Components Validation and Verification: Not Satisfied
  (RM) Risk and Poor Information Quality Impact Management: Not Satisfied
  (IQSM) Information Quality Standardization Management: Not Satisfied
  (OIQPM) Organizational Information Quality Policies Management: Not Satisfied
Quantitative Management Level: Not Achieved
  (MM) IMP Measurement Management: Not Satisfied
  (AMP) IMP Measurement Plan Automation Management: Not Satisfied
Optimizing Level: Not Achieved
  (CADPM) Causal Analysis for Defect Prevention Management: Not Satisfied
  (IODM) Innovation and Organizational Development Management: Not Satisfied

As an example, the conclusions drawn include:
• (IQATM) Information Quality Assurance Team Management. In spite of not properly satisfying this KPA, the organization has a quality infrastructure that could adequately support information quality.
• (IPM) IMP Project Management. This KPA is also not satisfied.
• (URM) User Requirements Management. User requirements have been managed for the training procedures, although information quality requirements have not been taken into account.
• (DSTM) Data Sources and Data Targets Management. In the IMP definition, both data sources and data product targets are identified and documented, and there are some forms that standardize data interchange formats.
• (ADMPM) Database or Data Warehouse Acquisition, Development or Maintenance Project Management. The database where the data is stored is an organizational one, adequately modified to support the training management software tool. Thus, data and procedure models were modified and extended, although none of the information quality issues were considered.
• (IQM) Information Quality Management in IMP Components. This KPA has not been satisfied because information quality is not one of the goals of the IMP.

Taking into account the criticality degree of each of the KPAs (see Table 2), the following recommendations were proposed in order to satisfy the KPAs of the Definition level:
1. Create an Information Quality Assurance Team, which assumes responsibility for IMP project management.
2. Adequately manage user requirements, both IMP and information quality ones.
3. Identify and define both data sources and data product targets, as well as the data interchange formats.
4. From these requirement specifications, adequately manage the information quality dimensions of each component of the system.
5. Modify the database or data warehouse to give support to information quality.
6. Plan a project for the development of the IMP.
5 CONCLUSIONS AND FUTURE WORK

In this paper, an integrative framework for assessing and improving information quality in organizations has been briefly presented, and one experience of applying this framework has been described. The framework consists of two main components: CALDEA, an information quality model based on maturity levels which serves as a reference when assessing and as guidance when improving, and EVAMECAL, which supports both activities, assessment and improvement.

On the one hand, the IMP concept together with CALDEA and EVAMECAL satisfies the four conditions required of a good information quality framework [9]:
1. CALDEA provides a systematic and concise set of information quality criteria according to which information can be evaluated.
2. EVAMECAL provides a schema for analyzing and solving information quality problems.
3. Some KPAs in CALDEA provide the basis for information quality measurement and proactive management.
4. CALDEA is by itself a conceptual map that can be used to structure a variety of approaches, theories and information quality related phenomena, since its KPAs do not propose a closed set of tools, techniques and methodologies.

On the other hand, the experience of applying the framework to real case studies has allowed both CALDEA and EVAMECAL to be refined, and has demonstrated that, although it is known that information quality is becoming increasingly important, organizations do not have, or do not provide, enough time or resources to deal with it. This fact is aggravated by the lack of an information quality enterprise culture.

Although the presented framework is becoming more widely used, much work remains to be done, beginning with the validation of both models, of all the questionnaires of the surveys, and of the criticality degrees for each KPA. Another line of work being pursued consists of choosing (or even developing, when they do not exist) standards, practices, techniques and tools that satisfy the majority of the information quality requirements of the majority of organizations.

ACKNOWLEDGMENT

This research is part of both the CALIPO project, supported by the Dirección General de Investigación of the Ministerio de Ciencia y Tecnología (TIC2003-07804-C05-03), and the MESSENGER project, supported by the Consejería de Ciencia y Tecnología de la Junta de Comunidades de Castilla-La Mancha (PCC-03-003-1).

REFERENCES

[1] Ballou, D., Wang, R., Pazer, H., and Tayi, G.K. "Modeling Information Manufacturing Systems to Determine Information Product Quality". Management Science, 44(4), 1998, pp. 462-484.
[2] Ballou, D. and Tayi, G.K. "Enhancing Data Quality in Data Warehouse Environments". Communications of the ACM, Vol. 42, No. 1, January 1999.
[3] Bicego, A. and Kuvaja, D. "Bootstrap: Europe's Assessment Method". IEEE Software, 1993, pp. 93-95.
[4] Bobrowski, M., Marré, M., Yankelevich, D. "A Software Engineering View of Data Quality". Proceedings of the Second International Software Quality in Europe, Belgium, November 1998.
[5] Coallier, F. "How ISO 9001 Fits into the Software World". IEEE Software, January 1994, pp. 98-100.
[6] Dunaway, D.K. CMM-Based Appraisal for Internal Process Improvement (CBA IPI) Lead Assessor's Guide (CMU/SEI-96-HB-003). Software Engineering Institute, Carnegie Mellon University, Pittsburgh, 1996.
[7] English, L.P. Improving Data Warehouse and Business Information Quality: Methods for Reducing Costs and Increasing Profits. Wiley & Sons, 1999.
[8] Eppler, M.J. "Increasing Information Quality through Knowledge Management Systems Services". Proceedings of the 2001 International Symposium on Information Systems and Engineering (ISE'01), June 25-28, 2001, Las Vegas, Nevada, USA.
[9] Eppler, M.J. and Wittig, D. "Conceptualizing Information Quality: A Review of Information Quality Frameworks from the Last Ten Years". Proceedings of the 2000 Conference on Information Quality, pp. 83-96.
[10] Firth, C.M. and Wang, R. "Closing the Data Quality Gap: Using ISO 9000 to Study Data Quality". TDQM Working Paper TDQM-93-03, Total Data Quality Management (TDQM) Research Program, MIT Sloan School of Management, April 1993.
[11] Florac, W.A. and Carleton, A.D. "Using Statistical Process Control to Measure Software Process". In Fundamental Concepts for the Software Quality Engineer, Taz Daughtrey (ed.), American Society for Quality, 2002.
[12] Fuggetta, A. "Software Process: A Roadmap". In The Future of Software Engineering, A. Finkelstein (ed.), ACM Press, 2000, pp. 27-34.
[13] Genero, M. and Piattini, M. "Quality in Conceptual Modelling". In Information and Database Quality, Kluwer Academic Publishers, 2002, pp. 13-44.
[14] Giannoccaro, A., Shanks, G., and Darke, P. "Stakeholder Perceptions of Data Quality in a Data Warehouse Environment". Proceedings of the 10th Australasian Conference on Information Systems, 1999, pp. 344-355.
[15] Grimmer, U. and Hinrichs, H. "A Methodological Approach to Data Quality Management Supported by Data Mining". Proceedings of the Sixth International Conference on Information Quality, 2001, pp. 217-232.
[16] Helfert, M. and von Maur, E. "A Strategy for Managing Data Quality in Data Warehouse Systems". Proceedings of the Sixth International Conference on Information Quality, 2002, pp. 62-76.
[17] Hinrichs, H. "CLIQ: Intelligent Data Quality Management". Proceedings of the Fourth IEEE International Baltic Workshop on Databases and Information Systems, 2000.
[18] Hinrichs, H. and Aden, T. "An ISO 9001:2000 Compliant Quality Management System for Data Integration in Data Warehouse Systems". Proceedings of the International Workshop on Design and Management of Data Warehouses (DMDW'2001), Interlaken, Switzerland, June 4, 2001.
[19] Huang, K.T., Lee, Y., Wang, R. Quality Information and Knowledge. Prentice Hall, Upper Saddle River, 1999.
[20] Humphrey, W. Managing the Software Process. Addison-Wesley, Reading, Mass., 1989.
[21] ISO/IEC TR 15504:1998, Software Process Assessment, Part 2: A Reference Model for Processes and Process Capability. ISO/IEC JTC1/SC7, 1998.
[22] ISO/IEC TR 15504:1998, Software Process Assessment, Part 7: Guide for Use in Process Improvement. ISO/IEC JTC1/SC7, 1998.
[23] Kahn, B., Strong, D., Wang, R. "Information Quality Benchmarks: Product and Service Performance". Communications of the ACM, Vol. 45, No. 4, April 2002.
[24] Kan, S. Metrics and Models in Software Quality Engineering, Second Edition. Addison-Wesley, 2002.
[25] Kim, W. and Choi, B. "Towards Quantifying Data Quality Costs". Journal of Object Technology, Vol. 2, No. 4, July-August 2003, pp. 69-76.
[26] Lee, Y.W., Strong, D., Kahn, B., Wang, R. "AIMQ: A Methodology for Information Quality Assessment". Information & Management, 2001.
[27] Liu, L. and Chi, L.N. "Evolutional Data Quality: A Theory-Specific View". Proceedings of the Seventh International Conference on Information Quality (ICIQ-02), 2002, pp. 292-304.
[28] Loshin, D. Enterprise Knowledge Management: The Data Quality Approach. Morgan Kaufmann, San Francisco, California, 2001.
[29] Motha, W.M. and Viktor, H.L. "Expanding Organizational Excellence: The Interplay between Data Quality and Organizational Performance". International Conference on Systems, Cybernetics and Informatics (SCI'2001), Orlando, USA, July 22-25, Vol. XI, 2001, pp. 60-65.
[30] Olson, J.E. Data Quality: The Accuracy Dimension. Morgan Kaufmann Publishers, 2003.
[31] Paulk, M., Weber, C., Curtis, B., and Chrissis, M. The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley, Reading, Mass., 1995.
[32] Pipino, L., Lee, Y., Wang, R. "Data Quality Assessment". Communications of the ACM, Vol. 45, No. 4, April 2002.
[33] Redman, T.C. Data Quality for the Information Age. Artech House Publishers, Boston, 1996.
[34] Standard CMMI Appraisal Method for Process Improvement (SCAMPI), Version 1.1: Method Definition Document. CMU/SEI-2001-HB-001.
[35] SEI. Capability Maturity Model Integration (CMMI), Version 1.1 (CMMI-SE/SW/IPPD/SS, V1.1), Staged Representation. CMU/SEI-2002-TR-012, ESC-TR-2002-012. http://www.sei.cmu.edu/publications/documents/02.reports/02tr002.html (last accessed June 2004).
[36] Strong, D.M., Lee, Y.W., Wang, R.Y. "Data Quality in Context". Communications of the ACM, May 1997, pp. 103-110.
[37] Strong, D.M., Lee, Y.W., Wang, R.Y. "10 Potholes in the Road to Information Quality". IEEE Computer, August 1997, pp. 38-46.
[38] Wand, Y. and Wang, R. "Anchoring Data Quality Dimensions in Ontological Foundations". Communications of the ACM, 39(11), 1996, pp. 86-95.
[39] Wang, R. "A Product Perspective on Data Quality Management". Communications of the ACM, Vol. 41, No. 2, February 1998, pp. 58-65.
[40] Wang, R., Storey, V.C., Firth, C.F. "A Framework for Analysis of Data Quality Research". IEEE Transactions on Knowledge and Data Engineering, Vol. 7, No. 4, 1995, pp. 623-640.

Ismael Caballero has an MSc in Computer Science from the Escuela Superior de Informática of Ciudad Real and is currently working on his PhD thesis on Information Quality Management. He is a professor in the Department of Computer Science at the University of Castilla-La Mancha, in Ciudad Real, Spain. Author of several papers on data quality, information quality and information quality management, he belongs to the ALARCOS Research Group, where he develops his research work.

Mario Piattini has an MSc and a PhD in Computer Science from the Polytechnical University of Madrid and is a Certified Information System Auditor and Manager by ISACA (Information System Audit and Control Association). He is a professor in the Department of Computer Science at the University of Castilla-La Mancha, in Ciudad Real, Spain. Author of several books and papers on databases, software engineering and information systems, he leads the ALARCOS Research Group of the Department of Computer Science at the University of Castilla-La Mancha. His research interests are advanced database design, database quality, software metrics, software maintenance and security in information systems.