=Paper=
{{Paper
|id=Vol-2218/paper27
|storemode=property
|title=Developing a Tool for Self-Assessment of IT Process Maturity: A Design Science Research Initiative
|pdfUrl=https://ceur-ws.org/Vol-2218/paper27.pdf
|volume=Vol-2218
|authors=Henn Jaadla,Björn Johansson
|dblpUrl=https://dblp.org/rec/conf/bir/Jaadla018
}}
==Developing a Tool for Self-Assessment of IT Process Maturity: A Design Science Research Initiative==
Henn Jaadla¹, Björn Johansson²
¹ Swedbank AS, Liivalaia 8, 15040 Tallinn, Estonia
henn.jaadla@swedbank.ee
² Department of Informatics, School of Economics and Management, Lund University,
Ole Römers väg 6, 223 63 Lund, Sweden
bjorn.johansson@ics.lu.se
Abstract: Today’s IT organizations must ensure that IT services are aligned to
business needs and actively support ongoing business processes. This means
that internal IT service management processes are under constant improvement.
However, to know whether IT service provision is developing in the right
direction, there is a need to perform some kind of self-assessment of IT process
maturity. In this paper we present an initial review of IT process maturity
frameworks with a focus on self-assessment models. The main aim of the paper
is to present a design science research (DSR) project with the goal of
developing a tool for self-assessment of IT process maturity. The context of the project
is a large bank, and the developed tool should become a permanent part of the
bank's toolkit, used to continuously describe a baseline of the current state:
“where are we today?”. Such a baseline will assist the IT organization in
identifying the gap to a desired future state, and will thereby become the basis
for any improvement plans. This paper presents the first steps in this DSR
project and highlights the need and benefits of conducting the project as a DSR
project.
Keywords: IT Service Management (ITSM), Self-assessment, CMMI, Design
Science Research (DSR), Continuous Improvement
1 Introduction
Many IT Service Management organizations are adopting Agile software
development methodologies to improve time-to-market [1] and to increase customer
satisfaction [2]. However, while an Agile way of working promotes fast feedback
loops and better alignment with customer needs, this informal way of working may
create gaps in process compliance and maturity [3] – especially during the transition
to the new way of working, when process participants are still adjusting to the new
roles and responsibilities. Organizational change may impact the control and feedback
cycles of IT processes due to low process awareness, incomplete role adoption and
other transitional effects. In addition, the differences between Waterfall and
Agile may exacerbate these negative effects if not mitigated properly.
Therefore, when IT enterprises are undergoing organizational changes to an Agile
way of working, it would be prudent to evaluate IT process maturity throughout the
change, to ensure that lapses in process compliance and maturity can be handled
swiftly.
This paper introduces a design science research (DSR) project for developing a
tool for self-assessment of IT process maturity at a large bank.
Process maturity level is an indication of how well a process achieves its
objectives, and whether the process is capable of continuous improvement [4].
Process maturity assessments are commonly used as the starting point for ITIL (a set
of practices for IT Service Management; formerly an acronym for Information
Technology Infrastructure Library) implementations, to pinpoint the improvements
which would bring the most benefit, but they are equally valuable for understanding
the as-is state for planning continuous improvements and evaluating the overall
performance of the IT organization. So, whenever an organization is undertaking a
process improvement initiative, or going through organizational change, there is an
increased need for process maturity measurement. Furthermore, to gauge the progress
of improvements, or the impact organizational changes have on processes over time,
the measurement should be applied at regular intervals, across various roles and
organizational departments.
The most common maturity assessments, however, are qualitative assessments
conducted through interviews, which are complex, time-consuming, and expensive to
apply.
The DSR project presented in this paper attempts to design a quantitative
assessment based on the Capability Maturity Model Integration (CMMI) framework,
conducted as a questionnaire-based survey among process participants. The
simplified nature of a self-assessment means that the survey can be applied to
different organizational units and performed regularly, making it useful for
monitoring IT process maturity trends in the organization.
The next section of this paper presents an initial review of IT process maturity
self-assessment, and describes the need to have a clear picture of IT process maturity
in today’s organizations.
We then proceed by describing the context for the proposed DSR project, which is
the IT organization of a large bank. The way the bank has been working with
IT development is presented, as well as the changes that have recently been
implemented as the IT development process has shifted to an Agile development
process.
The penultimate section presents the suggested DSR project and describes the steps
and activities that are planned, as well as why these activities are suggested.
In the final section we present some concluding remarks on why we believe a DSR
project is the most appropriate approach in this case, and describe the benefits that
are expected from conducting the research in this way.
2 An initial review of IT process maturity self-assessment
The importance of internal services and their impact on the quality of manufactured
products was a central principle of the Total Quality Management approach developed
in the 1980s by Deming. Today, there appears to be a common understanding that
internal service quality strongly influences, and is a key contributor to, the quality of
external services [5]. Exemplifying this is the multitude of international standards available for
managing IT Services.
There is a high demand on IT organizations to deliver value-added IT services, and
IT services are under constant pressure to become better, faster, and cheaper [6].
Therefore, improvement and optimization of an IT organization's service processes is
ever-ongoing work. It is important to have well-working IT service management
processes in order to gain and maintain a competitive advantage. IT
Service Management (ITSM) is the discipline that strives to improve the alignment of
information technology efforts to business needs and to manage an efficient
provisioning of IT services with guaranteed quality [7].
Regardless of where an organization is in the ITSM journey, understanding the
current state of IT process maturity is critical when deciding on improvement
priorities [7]. To define the current state by establishing an ‘as-is’ baseline, several
different methods, or combinations of methods, are available [8]. One of the most
commonly used methods is a maturity assessment, which determines the maturity
level of an organization's IT processes against a best-practice reference set of
processes [9]. IT process maturity is a good indicator of the organization's ability to
perform and deliver value-added IT services. The underlying idea is that a maturity
model defines different maturity levels, and the higher up the maturity scale an IT
organization is, the better it performs.
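In CMMI, for example, five maturity levels are distinguished, ranging from level 1 (Initial) through Managed, Defined, and Quantitatively Managed up to level 5 (Optimizing) [22].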
Apart from illuminating areas for improvement, self-assessment provides an
important cultural benefit because it encourages an ethos of continuous improvement,
promotes a holistic perspective, and allows people to gain a broader understanding of
the area in question [10, 11]. Regular use of self-assessment ensures that sound
approaches are used and developed in the organization [12].
There is no universal method for such self-assessment. On the contrary, findings
indicate that several approaches to self-assessment are successful as long as they fit
the organization, are used continuously, and foster participation [13].
One way of performing a maturity self-assessment is qualitatively, through
conducting interviews and collecting evidence. This is, however, a long and costly
method, as the interview process and data collection are highly complex and
specialized tasks that need to be performed by competent assessors. Because of the
complexity of these methods, maturity assessment becomes an expensive and
burdensome activity for organizations [14].
Therefore, it can be more appealing for an organization to select a quantitative
approach [7], in which a representative selection of the process participants is
surveyed using a simplified questionnaire.
From a business perspective, the notion that it is easier to convince top
management when a large number of people have had a say can also weigh in favor
of a quantitative approach [15].
In quantitative assessments, a large number of respondents is surveyed, and it is
therefore important that the respondents understand the context and the questions in
a similar way. To create a suitable assessment tool for the organization, it is thus
important to adjust the questions so that they are set in the appropriate
organizational context.
There are several aspects which impact the choice of assessment method, including
the need for independent external validation of results, applicability for
benchmarking, and the cost to the business in time, effort, and resources. But
perhaps the most important factor is whether the assessment method is appropriate
to support a long-lasting improvement program.
Laszlo [16] concludes that few programs can withstand the test of time without
appropriate follow-up. Experience has shown that organizations that do not
manage to control the improvement initiatives they have established will lose focus
on achieving the basic organizational objectives [17]. Continued success means that
progress must be monitored continually to identify what has gone well and what
needs to be improved; then strategies and actions to increase the pace of improvement
can be developed [18]. With all this in mind, we next present a DSR project that
aims at developing a tool for self-assessment of IT process maturity.
3 The context of the DSR project
This section describes the IT organization of the company in question, the
Waterfall- and Agile-based versions of the company's IT Development Process, and
the implications that their differences have for the IT processes.
Swedbank is a large multinational financial institution with around 16,000
employees. Swedbank's main IT operations are distributed across four countries. The
company has a long history of software development, and has utilized different
techniques in different projects and developments, including Agile and Extreme
Programming approaches. Due to the nature of the business, however, the
development of software has been heavily influenced by hardware-oriented
development approaches, and because of the need to control the development of
complex software-intensive systems, Swedbank's Development Process has
historically been built on the Waterfall approach. This process has been well
integrated with IT Governance, Resource Management, and the Financial Process.
However, for the Business customers, the Waterfall approach has the downside of
long lead times and slow feedback cycles.
To mitigate the downsides of the implemented Waterfall approach, Swedbank has
been introducing an Agile approach in some teams over the past several years, and as
of 2018, the Agile approach is implemented throughout all Business Areas.
The department-based way of working, with a clear distinction between IT
development and maintenance roles, is replaced with cross-functional teams that
handle both development and operations and work from a common backlog.
Swedbank's ITSM processes are based on the ITIL framework. The change to an
Agile development process will affect the ITSM processes by changing roles and
responsibilities, the organization structure, and the speed of introducing new services
into the production environment.
The changed dynamics of the way services are developed and operated will impact
the IT process maturity in various ways, and therefore it is important to evaluate the
process maturity changes throughout this organizational change, across the different
affected teams.
3.1 Swedbank Development Process framework
The existing Swedbank Development Process, which is shown in figure 1, is based on
the Waterfall approach.
Figure 1 - Swedbank Development Process (Waterfall)
The “Waterfall” development process consists of four phases as shown in figure 1.
Phase 1, Business Needs Analysis, aims to capture ideas and identified needs, and
to prepare a rough Business Case to understand whether it is worth investing in
continuing with a Pre-study.
During Phase 2, Pre-study, the new or changed business model and the
requirements are analyzed, alternative solutions are assessed and a recommendation
on the approach is made. Then the architectural description is prepared and approved,
and based on this, the project risks are assessed. Before moving to the next phase, the
business case is refined, and the initial value realization plan is created.
Phase 3, Project Development, comprises the traditional waterfall steps of
initiation, development, verification, and delivery.
Phase 4, Value Realization, contains the activities in business operations to fully
utilize the output of the project, and the measurement of outcomes and effects to
assess the achievement of the business case and to provide input to future investment
decisions.
3.2 Agile Development Process
The new, Agile Development Process is fitted into the same framework, represented
in the same three levels. The Agile Development Process framework is only visibly
different from the Waterfall Development Process framework in that the
Development and Verification phases are combined into a single phase called
Iterative Incremental Development. This phase is repeated for each iteration.
Figure 2 - Swedbank Development Process (Agile)
However, the patterns of work within the project organization, which comprises the
steering committee and the development team, are quite different in Agile:
- The project budget and time are fixed at the beginning; functionality is prioritized
to deliver business value.
- Requirements are initially described in the form of user stories, in simple business
language. There is less detail upfront than in a Waterfall project; detailed
documentation may be created at the end of the project, if needed.
- Development of the solution is performed in time-boxed iterations. Detailed
planning is done at the beginning of each iteration, although there is no detailed
plan for the whole project.
- All team members are jointly responsible for the planning and for monitoring the
progress of the iteration. A business representative is part of the development team
throughout the project.
- Agile-specific techniques and tools are used for project planning and management,
e.g. estimation using abstract story points, planning using the task backlog, and
status reporting in daily stand-ups and burn-down charts.
3.3 Challenges of an Agile approach
There are several risks arising from the organizational transition from a centralized,
Waterfall-based way of working to a decentralized, Agile way of working [3].
Among the transitional effects are incomplete role adoption, low process awareness,
and team motivation issues during the formation of the new cross-functional teams.
Decentralization may lead to uneven performance between different business units
due to different adoption speeds of the new way of working. Additionally,
decentralization may result in inefficiencies and duplication of control and
management activities.
However, the biggest change introduced by the Agile way of working concerns the
way Business Areas, departments, and teams are structured. Previously, the
development and maintenance teams were mostly separated, and all IT teams
belonged to the IT divisions linked to the Business Areas.
In the Agile setup, Business Areas are divided into value streams, which in turn are
divided into Agile teams, which handle both development and maintenance of the
services. The teams will belong directly to the Business Areas, and there will be a
dedicated Business representative in each team. This will give the teams increased
autonomy in how they build and maintain their services.
As a result, ITSM processes will also be directly affected by this change, as both
development work and maintenance tasks will be handled by the same cross-
functional team, and the prioritization of the tasks will be done in one backlog. This
creates a risk that maintenance, lifecycle management, and service operation tasks
may be under-prioritized in favor of development tasks.
3.4 Swedbank approach to measuring IT process maturity
In order to manage the impact that the transformation has on the IT Service
organization, there is a need to measure the effect this move has on process
compliance and IT process maturity.
Process maturity assessments take a comprehensive look at how an organization
integrates people, processes, tools, products, and management. This detailed
understanding is commonly used for identifying and prioritizing process
improvements [7]. However, in this case, the goal is to identify trends in process
compliance and maturity.
To be able to gauge the impact of organizational changes on the IT process maturity
level, the maturity assessment needs to be performed regularly, to identify trends and
provide feedback while the new way of working becomes the norm. A full Capability
Maturity Model Integration (CMMI) assessment is unsuitable for establishing trends
in a short timeframe due to the cost, disruption and long feedback cycle. Therefore,
the IT Process Maturity Assessment project at Swedbank aims to implement a survey-
based self-assessment, which can be applied repeatedly across a broad spectrum of
roles and business areas within the IT organization.
4 The design science research project
Swedbank aims to improve on the off-the-shelf maturity assessments by establishing a
Swedbank-specific, recurring IT process maturity assessment program. The
assessment will build on the CMMI framework, but will be adjusted to the Swedbank
context and supplemented with questions regarding motivational and business benefit
aspects.
The Swedbank IT process maturity self-assessment tool will be developed as a design
science research project. This approach will allow us to formalize the design, testing,
and verification steps, and to ascertain the validity, reliability, and accuracy of the
results.
The reason for our choice of the DSR method is that the method itself aims to
create an artifact (e.g. a method, models, constructs, instantiations) and is therefore
suitable for the purpose of our research.
Regarding our specific research, we have used the framework of Hevner et al. [19]
and adapted it to our research context (Figure 3). The environment defines
the problem space [20] and here we find the goals, problems, and opportunities that
define requirements, as they are perceived by people within the Swedbank IT
organization.
Design science addresses research through the building and evaluation of artifacts
designed to meet the identified business need [19]. The purpose of our research is to
create the artifact to be evaluated in collaboration with the IT organization in an
iterative way. In this way the project is also related to Action Design Research (ADR)
as presented by Sein et al. [21].
Figure 3 - DSR framework for the project
The Relevance Cycle provides input from the contextual environment of the research
project to the design science activities. The Rigor Cycle bridges the design science
activities with the knowledge base of scientific foundations, domain experience, and
expertise that provides guidance to the research project. The central Design Cycle
iterates between the core activities of building and evaluating the design artifacts and
processes of the research [19].
4.1 The Relevance Cycle
An application domain consists of the people, organizational systems, and technical
systems that interact to work toward a goal. The application domain determines the
requirements and acceptance criteria for the research.
In this case, the people perspective consists of the IT organization, the Process Office,
and the Agile teams; the organizational systems are the departments and the IT
Service Management framework; and the technical system is the survey tool and the
analytics tool used to gather and process the results.
The scope of the maturity assessment will be the IT Service Management
processes, and the assessment will be based on the Capability Maturity Model
Integration for Services (CMMI-SVC) model, gauging the IT process maturity and
performance at the Process Area level, with the topics divided into three dimensions:
People, Process, and Technology.
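CMMI-SVC process areas include, for example, Service Delivery, Incident Resolution and Prevention, and Capacity and Availability Management [22].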
The questionnaire will be created in cooperation with the Process Office to engage
the subject matter experts in tailoring the CMMI framework for the Swedbank context.
The output from the design science research will be returned into the environment
for study and evaluation in the application domain, i.e. the maturity assessment will
be carried out on a test group in the organization, and the results verified and
validated with the participants and subject matter experts from the Process Office.
The results of the field testing will determine whether additional iterations of the
relevance cycle are needed. The new artifact may have deficiencies in functionality or
in its inherent qualities (e.g. performance, usability) that may limit its utility in
practice. Another result of field testing may be that the requirements input to the
design science research were incorrect or incomplete, with the resulting artifact
satisfying the requirements but still being inadequate for the opportunity or problem
presented.
4.2 The Rigor Cycle
Design science draws from a knowledge base of scientific theories and engineering
methods that provides the foundations for rigorous design science research. As
importantly, the knowledge base also contains additional knowledge: firstly, from the
experiences and expertise that define the state-of-the-art in the application domain of
the research, and secondly, from the existing artifacts and processes found in the
application domain and the artifacts and processes developed in the iterative design
cycle.
The proposed approach builds on the CMMI-SVC framework, which represents
a best-practice approach. This choice was based on the fact that CMMI is an
established model widely recognized in the industry, and that it allows the tailoring of
the model to better suit specific projects [22].
There are several works concerning the usefulness of IT process self-assessments,
which will provide a foundation for the improvements to be made in the design cycle
[5, 15].
There is a question of the accuracy of quantitative process maturity self-assessments
when compared to full qualitative process maturity assessments. Quantitative
assessments have a tendency to score maturity higher than it actually is, especially in
the people and process dimensions, but also in the tools dimension, all of which
require specialist knowledge of the area in question [15]. The same tendency has been
identified in health sciences [23]. It is important to be aware of this upward bias,
especially when identifying improvements to implement on the path towards the next
maturity level.
It can also be questioned whether IT process maturity alone is a good framework
for covering the compliance, performance, value, quality, and effectiveness of IT
processes. It may be insufficient to rate the effectiveness of IT without the context of
the business customer's viewpoint. The IT capability maturity needs to be assessed
against actual business needs, and against the value the processes provide to the
Business in terms of cost and organizational risk. Also, the actual practice or
operation of processes is strongly affected by the culture and behavior of the
participants. The CMMI framework does not specifically address topics related to
culture and motivation.
In the rigor cycle the data and artifacts from the design cycle are collected, stored
and analyzed. This includes the coding and mapping of questions for each iteration,
the functional setup of the survey tool, the interpretation and reporting artifacts of the
survey results, and the detailed feedback received from project participants.
4.3 The Design Cycle
The internal design cycle is the central part of the design science research project.
This cycle of research activities iterates between the construction of an artifact, its
evaluation, and subsequent feedback to refine the design further.
The goal of this cycle is to generate design alternatives and evaluate the
alternatives against requirements until a satisfactory design is achieved [20].
As discussed above, the requirements are defined in the relevance cycle and the
design methods and theories are provided in the rigor cycle.
The IT process maturity assessment tool will contain two main components. The
data collection functionality will be developed as a web-based survey, using a
common survey platform. This platform will store the questionnaire, recipient lists
and raw results data.
The second component is the translation table for the results, where the processing
and aggregation of results is performed. This will initially be built in Excel, with more
advanced tools considered as the project continues.
The questions will be based on CMMI-SVC, modified to suit the organization’s
processes and language. Each question is mapped to a CMMI process area and
maturity level, a Swedbank process, and the respective dimension of People, Process,
or Tools.
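As an illustration of this mapping, the sketch below shows one possible way to represent a questionnaire item in code. The field names, the example question, and its classification are our own illustrative assumptions, not the actual Swedbank questionnaire; only the CMMI-SVC process area and maturity level named in the example follow the published model [22].

```python
from dataclasses import dataclass
from enum import Enum

class Dimension(Enum):
    PEOPLE = "People"
    PROCESS = "Process"
    TOOLS = "Tools"

@dataclass(frozen=True)
class SurveyQuestion:
    """One questionnaire item, mapped to the CMMI-SVC model and the local process landscape."""
    question_id: str      # internal identifier (illustrative)
    text: str             # question wording, adjusted to the organization's language
    process_area: str     # CMMI-SVC process area the question probes
    maturity_level: int   # CMMI maturity level (2-5) of that process area
    local_process: str    # corresponding local ITSM process (hypothetical name)
    dimension: Dimension  # People, Process, or Tools

# A hypothetical example item; the wording and local process name are invented.
example = SurveyQuestion(
    question_id="SD-01",
    text="Our team delivers services according to the agreed service agreements.",
    process_area="Service Delivery",
    maturity_level=2,
    local_process="Service Level Management",
    dimension=Dimension.PROCESS,
)
```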
To cover the culture and motivation perspective not specifically addressed by
CMMI-SVC, the People dimension will be extended with questions relating to team
collaboration, motivation and self-improvement aspects. The Process dimension will
be supplemented with questions about process relevance to business goals.
The focus of interest is on the roles that are the most frequent participants in the
operational processes: Cross-Functional Team managers, Cross-Functional Team
members, and Agile Product Owners.
The results will be aggregated by business area and role in the new organization.
The assessment results are mainly an input for the Process Office, which is the
organizational unit in charge of IT processes at Swedbank. The Process Office will
validate the results against the process documentation. Where the results indicate
shortcomings and issues, the Process Office will, with the help of the tool, be able to
identify the likely causes of process gaps and propose appropriate countermeasures,
e.g. process training, updates to documentation and work instructions, or process
improvements.
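A minimal sketch of how this aggregation and flagging step could work is shown below. It assumes, purely for illustration, numeric answers on a 1-5 scale and a simple threshold rule; the actual translation table and reporting at Swedbank may differ.

```python
from collections import defaultdict
from statistics import mean

# Each response row: (business_area, role, question_id, answer on a 1-5 scale).
# The names and values below are invented for illustration.
responses = [
    ("Business Area A", "Team member",   "SD-01", 4),
    ("Business Area A", "Product Owner", "SD-01", 3),
    ("Business Area B", "Team member",   "SD-01", 2),
]

THRESHOLD = 3.0  # illustrative cut-off below which a result is flagged for follow-up

def aggregate(rows):
    """Average the answers per (business area, role, question) group."""
    groups = defaultdict(list)
    for area, role, question_id, answer in rows:
        groups[(area, role, question_id)].append(answer)
    return {key: mean(values) for key, values in groups.items()}

for (area, role, qid), score in sorted(aggregate(responses).items()):
    flag = "  <- possible process gap, review" if score < THRESHOLD else ""
    print(f"{area} / {role} / {qid}: {score:.1f}{flag}")
```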
5 Concluding remarks
Designing an IT process maturity self-assessment tool is essentially a pragmatic
exercise due to its emphasis on relevance – the outcome has practical utility for the
application environment.
However, practical utility alone does not make a good solution, and therefore it is
suggested to conduct the project as a design science research project. It is the synergy
between relevance and rigor, and the contributions along both the relevance cycle and
the rigor cycle, that define good design science research [24] and also produce a
solution that is both relevant and practical.
By utilizing the DSR approach in designing an IT process maturity assessment
tool, we hope to develop a tool that is both useful and theoretically sound, to make
sure that the assessment results will reflect the true maturity state of the organization.
We hope that by engaging the line organization in the relevance cycle, and process
experts from the organization in the rigor cycle, we will succeed in creating a tool that
is based on the Swedbank way of working and matched to the CMMI-SVC maturity
model.
The initiative is part of a long-term commitment to process improvement by
Swedbank. As the self-assessment of IT process maturity is developed into a
continuous practice, we hope that it will foster awareness, participation and a
continual improvement culture. We also hope that the project as such will provide
both practical and theoretical contributions to the area of IT process maturity
assessment, as well as to design science research and action design research.
References
1. Ince, C.S., Approaches and Benefits for Adopting Agile Methods. INSIGHT,
2015. 18(3): p. 18-20.
2. Lindvall, M., et al., Agile software development in large organizations.
Computer, 2004. 37(12): p. 26-34.
3. Dikert, K., M. Paasivaara, and C. Lassenius, Challenges and success factors
for large-scale agile transformations: A systematic literature review. Journal
of Systems and Software, 2016. 119: p. 87-108.
4. Srinivasan, S. and M. Murthy. Process Maturity Model Can Help Give a
Business an Edge. 2018 [cited 2018-04-01]; Available from:
https://www.isixsigma.com/methodology/business-process-management-
bpm/process-maturity-model-can-help-give-business-edge/.
5. Machado, R.F., S. Reinehr, and A. Malucelli. Towards a maturity model for
IT service management applied to small and medium enterprises. in
European Conference on Software Process Improvement. 2012. Springer.
6. Leopoldi, R., Employing ITSM in Value Added Service Provisioning. 2015,
RL Information Consulting LLC. p. 5.
7. Lloyd, V., et al., ITIL continual service improvement. 2011: TSO.
8. Addy, R., Effective IT service management: to ITIL and beyond! 2007,
Berlin; New York: Springer.
9. Marquis, H., ITIL: What It Is And What It Isn't. Business Communications
Review, 2006. 36(12): p. 49.
10. Zink, K. and A. Schmidt, Practice and implementation of self-assessment.
International Journal of Quality Science, 1998. 3(2): p. 147-170.
11. Gadd, K.W., Business self-assessment: a strategic tool for building process
robustness and achieving integrated management. Business Process Re-
engineering & Management Journal, 1995. 1(3): p. 66-85.
12. Povey, B., Continuous business improvement: linking the key improvement
processes for your critical long-term success. 1996: McGraw-Hill.
13. Samuelsson, P. and L.-E. Nilsson, Self-assessment practices in large
organisations: Experiences from using the EFQM excellence model.
International Journal of Quality & Reliability Management, 2002. 19(1): p.
10-23.
14. Proença, D. and J. Borbinha, Maturity models for information systems-A
state of the art. Procedia Computer Science, 2016. 100: p. 1042-1049.
15. Johansson, B., J. Eckerstein, and J. Malmros. Evaluating a Quantitative IT
Maturity Self-Assessment Approach: Does it give a good view of the as-is
state? in ICMLG2016 - 4th International Conference on Management,
Leadership and Governance: ICMLG2016. 2016. Academic Conferences
and Publishing Limited.
16. Laszlo, G.P., Implementing a quality management program–three Cs of
success: commitment, culture, cost. The TQM magazine, 1999. 11(4): p.
231-237.
17. Heskett, J.L., W.E. Sasser, and L.A. Schlesinger, The Service Profit Chain:
How Leading Companies Link Profit and Growth to Loyalty, Satisfaction
and Value. 1997: The Free Press, New York, NY.
18. Porter, L.J. and S.J. Tanner, Assessing Business Excellence - A Guide to Self-
assessment. 1996: Butterworth-Heinemann, Oxford.
19. Hevner, A., et al., Design science in information systems research. MIS
quarterly, 2004. 28(1): p. 75-105.
20. Simon, H.A., The sciences of the artificial. 1996: MIT press.
21. Sein, M.K., et al., Action design research. MIS quarterly, 2011: p. 37-56.
22. CMMI Product Team, CMMI for Services Version 1.3. 2010: Carnegie
Mellon, Software Engineering Institute, Pittsburgh, PA.
23. Ward, M., L. Gruppen, and G. Regehr, Measuring self-assessment: current
state of the art. Advances in Health Sciences Education, 2002. 7(1): p. 63-
80.
24. Hevner, A.R., A three cycle view of design science research. Scandinavian
journal of information systems, 2007. 19(2): p. 4.