Journal of Operational Risk


An approach to simultaneously assess operational risk and maturity levels in information technology management

Hossein Moinzad, Mohammad Jafar Tarokh and Mohammad Taghi Taghavifard

  • Simultaneous assessment of operational risk and maturity level of IT management.
  • Assessing the maturity level based on the COBIT architecture using the AHP model.
  • Utilization of AHP to weight each activity in the organization.
  • Utilization of time and cost to determine the relative importance of processes.

The development of information technology (IT) has had a considerable effect on organizational management and success. Due to the widespread use of IT, the risk of error increases, which can significantly affect organizations. In order to reduce the impact of these errors, organizational maturity and risk should be calculated. The aim of this paper is to investigate the operational risk and maturity level of IT in an anonymized financial institution, based on the American Productivity and Quality Center benchmark and control objectives for information and related technologies. To this end, the operational risk and maturity levels of 34 IT service management processes were investigated by collecting required data from electronic forms as well as through expert opinions. Results were obtained at three levels: the assessment of the operational risk and maturity level (level one), the domains (level two) and the processes (level three) of the IT organization. The findings showed that the organization had a maturity level of 1.11 (indicating ad hoc and disorganized processes) and therefore there was a 47.9% chance of operational risk to IT management goals. The results can help managers allocate limited resources to activities, optimize the use of capital, reduce administrative costs and use IT effectively for growth and development.

1 Introduction

Ease of communication and data exchange has made information technology (IT) a major and irreplaceable part of the modern world. Along with increasing use of IT at community level, new frameworks have been introduced for optimizing available resources and projects in institutions and organizations that benefit from IT. These requirements have led to the development of various strategies and methods for controlling, exploiting and managing IT projects. As a result, IT governance has become indispensable for companies, which spend 3–5% of their annual revenue on IT to stay competitive in the dynamic business environment of the modern world (Webb et al 2006). Therefore, different methods have been developed to measure the maturity and the risk of IT governance in organizations (Haes and Grembergen 2008).

In 2006, the IT Governance Institute conducted a global survey of 695 organizations. According to this survey, 87% of the participants emphasized the necessity of using IT to realize their business perspectives and strategies (ITGI 2007). The growing need for IT in the majority of industries and organizations creates a great deal of dependence on IT and a high risk of vulnerability. A wide range of threats, such as mistakes, failures, abuse, fraud and cybercrime (Grembergen and Haes 2004), is associated with these risk factors; as a result, most organizations are vulnerable to IT risks, and IT governance helps to reduce them. The concepts of risk and risk assessment have a long history: the Athenians, for example, demonstrated their capacity to assess risk over 2400 years ago.

Risk analysis should be seen not only as a means of reducing threats but also as a means of identifying opportunities that enable an organization to improve its performance. Effective risk management is a central function in successful planning and implementation (De Zoysa and Russell 2003).

Many studies have confirmed the correlation between the maturity of IT governance and IT performance. The performance of activities related to each process refers to the quality of the service when the activities are conducted and is determined by the completion percentage of the activity. IT management performance is defined as the effectiveness of IT management in achieving the following four goals, based on their importance in the organization (Weill and Ross 2004):

  1. reduction of costs,
  2. investment,
  3. growth and
  4. economic flexibility.

Processes defined in the control objectives for information and related technologies (COBIT) framework consist of various activities. COBIT defines several process control objectives to ensure that IT complies with business goals, that security and profitability are maximized and that risks are minimized. Successive versions of COBIT (from V4.1 in 2007 to COBIT 2019) have categorized the IT processes differently, but every version uses this categorization as an organizational architecture for IT management.

In this paper, the COBIT 4.1 architecture was used to identify IT processes in an anonymized organization. Cognitive maps and the American Productivity and Quality Center (APQC) benchmarks were then used to define the relationships between activities in organizational processes. To measure the operational risk of IT, the analytic hierarchy process (AHP) was applied within the COBIT architecture to quantitatively calculate risk at different organizational levels. The AHP divides the COBIT 4.1 architecture into five levels: IT organization, domains, processes, activities and risk alternatives.

In COBIT 4.1, the weights of all activities are considered equal when calculating the maturity of the IT organization. In this paper, to measure maturity more precisely, a unique weight was allocated to each activity based on its time and cost; an activity's impact on maturity therefore depended on its weight. A financial institution was selected as a case study, and risk and maturity levels were measured based on the COBIT model. Owing to the lack of infrastructure and human resource development in the case study, the experts decided that, in the initial stage, four domains would be classified and evaluated based on COBIT 4.1: planning and organization (PO); acquisition and implementation (AI); delivery and support (DS); and monitoring and evaluation (ME).

Since the analysis of maturity and risk in a financial institution is very complex and time-consuming, a graphical user interface (GUI) was designed to ease and accelerate this analysis and also to mechanize data entry. The GUI also improved the accuracy of computing and the integrity of results at each level of the organization, based on changes to activities and processes.

2 Background of risk and maturity analysis

Event tree analysis (ETA), fault tree analysis (FTA) and failure modes and effects analysis (FMEA) are among the common methods for risk assessment. In ETA, risk is assessed based on the probability of success or failure of an event; consequently, the risk of a new system cannot be assessed before its implementation (Win et al 2004). For this reason, some risk assessment models use the AHP to calculate the weight of risk factors (Win et al 2004). The AHP method assumes that factors are independent of each other (Saaty 1999). Overall, enterprise risk and maturity should be assessed together in order to describe risk perceptions, to assess how well processes have been implemented and to measure how well the organization can deal with risk (Hillson 1997).

Quantitative risk analysis is often criticized during probability assessment (eg, of human errors or software defects) due to uncertainty about assumptions behind the assessment model and also the inherent risk uncertainty. However, it is necessary to use quantitative measures to make a rational and effective decision that combines knowledge and subjective beliefs. Measuring uncertainties and identifying at-risk audiences can help to understand risk and provide useful information for risk management. Kaplan and Garrick (2011) indicated that “the concept of risk involves uncertainty in addition to the loss or damage that may be received”. Instead of estimating the probability of undesirable events in risk assessment, alternative valuation methods are introduced and evaluated according to the factors affecting the occurrence of these incidents (Wideman 1986).

Progress has been made with network risk assessment (NRA), but three challenges have been identified. First, the nature of vulnerability is rarely considered in NRA; instead, assessments have concentrated on interactions between risks or between technology systems and on the effect of vulnerabilities on cascading sequences. Since vulnerability is an important element of risk, it is difficult to fully understand how disasters occur without considering it (Pescaroli and Alexander 2016). Second, to the best of our knowledge, results from NRA and linear risk assessment (LRA) have not yet been compared. Without such comparisons, it is not clear how NRA and LRA are similar, or how adopting a network approach may change risk results; this challenges the view that NRA results are in line with LRA results. Third, the theoretical foundations of NRA and LRA are unclear and may rest on different concepts. Since risk is nonlinear, emergent and complex in nature, NRA appears suited to digital transformation, whereas LRA rests on simpler assumptions of causality rooted in the modernist worldview of the Enlightenment. Such challenges may indicate irreconcilable differences between the assessment methods (Ginsberg et al 2018).

Maturity models in information systems research reflect a growing interest in maturity assessment (Becker et al 2010; Mettler 2009), emphasizing the need to support maturity assessment in order to continuously improve an organization. Although the Software Engineering Institute's capability maturity model (CMM) and capability maturity model integration (CMMI) are the most common in maturity studies for software development (Becker et al 2010), several new maturity models have been developed in recent years. These have improved maturity assessment for IT/business alignment (Luftman 2003; Khaiata and Zualkernan 2009), business process management (Rosemann and de Bruin 2005), business intelligence (Packard 2007), project management (Crawford 2014), information cycle management (Sun 2005; Gottschalk 2009), the adoption of inter-organizational systems (Ali et al 2011) and organizational planning systems (Holland and Light 2001). Despite the growing interest in this area, information systems research rarely attempts to reflect on and develop theoretical models, and therefore lacks evidence of precise scientific methods in its development processes (Mettler 2009; Becker et al 2010). Methods such as design science (Hevner et al 2010) are useful tools for the precise development of new maturity models, as Becker et al (2010) validated. However, to ensure their acceptance by users, the proposed models should be tested experimentally and practically. It is also challenging to reduce the gap between current and optimal maturity: Mettler (2009) suggests that many models are incapable of describing how improvements can be made.

As a result, previous approaches have focused either on risk analysis and maturity assessment separately or on the maturity analysis of risk management, while the simultaneous assessment of risk and maturity has been neglected. Therefore, this paper suggests a model for the maturity and risk analysis of IT processes in organizations by monitoring the complex relationships in the IT architecture. It also uses the AHP to allocate a weight to each activity in the organization in order to improve on previous models and to clarify the effect of all activities and processes on the risk and maturity of an IT organization.

The strategic use of IT has become a key factor in giving an organization a competitive advantage. Without consistent development over all sectors, organizations will not reach the required effectiveness in IT. Therefore, the first step is to establish a link between strategic IT planning and a comprehensive organizational program, and identify how much IT increases the value of the business. In the second step, we determine any IT risks to the organization. The third step reduces these risks through enhancing maturity and reinforcing infrastructure. In the fourth step, we review the previous steps thoroughly and assess the results.

In this study, the concept of risk was defined as

  • an action with more than one result,

  • an action with unpredicted results and

  • a possible outcome with an adverse consequence.

The most important risks for a financial institution, such as a bank, are divided into three categories.

  1. Credit risk: the risk of losses resulting from a borrower's failure to repay a loan at all or to repay it according to contractual obligations.
  2. Market risk: the risk of probable losses to the bank's assets resulting from changes and fluctuations in market factors (such as exchange rates, interest rates and stock prices).
  3. Operational risk: the risk of direct or indirect losses resulting from inadequate or incorrect processes within an organization, from individuals or systems, or from events outside the organization. This risk is the most common in financial and banking institutions and is regarded as risk that is not directly related to credit risk or market risk.

The risk mentioned in this study relates to inadequate or incorrect processes in IT management and is considered to be a part of the operational risk in financial institutions.

The IT framework and the IT working environment are important and fast-changing factors, so their application horizon is very limited. Because large IT organizations require integrity among their business entities, risk assessment has become a promising and enabling field of study and plays a key role in supporting the mission and goals of large organizations. The success of IT in realizing business goals is key to achieving a competitive advantage.

This paper seeks to simultaneously assess the operational risk and the maturity of IT management and to develop solutions in order to achieve an optimal level of maturity and to reduce operational risk. Therefore, this study aims to assess the maturity and risk of IT management based on real data.

3 Methodology

3.1 COBIT

COBIT is a framework designed to control IT performance. It was initially developed by the Information Systems Audit and Control Association (ISACA) and later maintained by the IT Governance Institute (ITGI), an independent organization. COBIT 4.1, the version used in this study, was released in 2007. According to ISACA, the COBIT framework is a high-level process model that organizes a wide range of IT-related activities within 34 processes.

The ITGI focuses on IT management considerations for executive management. Figure 1 illustrates the ITGI's areas of interest within COBIT 4.1. Based on the structure of COBIT 4.1, an organization consists of four domains and 34 processes.

Figure 1: Process structure in the COBIT model. Source: Information Systems Audit and Control Association (2012).

COBIT consists of a framework and a set of tools that allow managers to fill the gaps in control needs, technical issues and business risks. COBIT structure and the hierarchical approach to COBIT provide a comprehensive and continuous view of IT and related decision-making.

In summary, the benefits of implementing COBIT as a framework for IT governance are as follows.

Strategic alignment.

This ensures communication between business plans and IT operations.

Value creation.

This ensures that IT operations meet the expected goals of the organization’s strategic plan by focusing on cost optimization and incremental value creation.

View.

This ensures that managers have a clear view of what IT is doing.

Resource management.

This refers to the investment and management of critical IT resources (information, applications, infrastructure, individuals, etc).

Risk management.

This ensures that senior executives are aware of the risks and responsibilities associated with IT.

Measurement of efficiency.

This refers to resource consumption, implementation of strategies, tracking and controlling the process and computing the effectiveness of the results.

Common understanding.

This ensures a common understanding between stakeholders, based on a single language.

3.2 COBIT customization using the APQC reference model

Since this study focuses on a financial institution, the COBIT framework has been customized using the APQC for the case study. One of the most important functions of the APQC is to develop a model or framework for the identification and classification of processes as well as a database of organizational activities for the customization of organizational processes.

3.3 Calculation of the weights for each activity

To measure IT risk and maturity in the case study, the organization was first divided into the major domains, processes and activities of the COBIT architecture. For each of the IT-related activities, time and cost were estimated and related data was gathered, as shown in Figure 2. Based on the time and cost values associated with each activity, the relative importance of the activities in the processes was calculated. In the same way, all four domains and 34 processes in the organization were ranked. It was necessary to estimate time, cost and percentage of implementation for all the processes and activities. This information was gathered using a GUI that was designed for the assessment.

Figure 2: The information entry form for the relative importance of activities.

These steps formed the cornerstone of measuring organizational maturity and risk in the study. The relative importance of activities was dynamically determined by the execution time of the activity and its implementation cost. Changes in both these parameters would have a direct effect on relative importance.
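The relative-importance calculation described above can be sketched as follows. The paper states only that both the execution time and the implementation cost of an activity feed its relative importance; the equal-weighted blend of the normalized time and cost shares used here is an assumption, and the activity names are purely illustrative.

```python
def relative_importance(activities):
    """Weight each activity by its share of the process's total time and cost.

    `activities` maps an activity name to a (time, cost) pair. The
    combination rule (average of the normalized time and cost shares)
    is an assumption; the paper specifies only that both parameters
    determine the relative importance.
    """
    total_time = sum(t for t, _ in activities.values())
    total_cost = sum(c for _, c in activities.values())
    return {
        name: 0.5 * (t / total_time) + 0.5 * (c / total_cost)
        for name, (t, c) in activities.items()
    }  # weights sum to 1.0 within the process

# hypothetical activities of one process: (hours, cost)
acts = {"define plan": (10, 200), "review plan": (5, 100), "approve": (5, 100)}
w = relative_importance(acts)
```

Because the weights are recomputed from the raw time and cost figures, a change in either parameter propagates directly to the relative importance, as the text notes.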

3.4 Evaluation of process maturity in COBIT

The senior management of companies and enterprises needs to monitor the organization's IT management regularly, and IT management should be able to find effective solutions.

By using the maturity model built for each of the 34 COBIT processes, management would be able to identify

  • the actual performance of an enterprise in its current position,

  • the current state of the firm and

  • the goals of an enterprise that would improve its position (ie, the target position the firm wants to achieve).

To make the results more useful, maturity assessment was used as a tool to support special business cases. This is shown in Figure 3. For each of the 34 COBIT processes, the maturity model was defined by an incremental measurement scale from 0 (nonexistent) to 5 (optimized).

Figure 3: Graphical representation of maturity model. Source: Information Systems Audit and Control Association (2012).

3.5 Maturity level of activities

The maturity calculation was based on the calculation method in COBIT 4.1. The only difference was in calculating process maturity: here, the weights of the activities affect the process maturity levels, whereas in COBIT all activities are considered to have the same weight and effect on process maturity. The process maturity measurement was a two-step procedure. In the first step, the realization degree of the process was measured by obtaining the realization percentage of its activities.

In the second step, maturity was measured by evaluating the process attributes. COBIT has nine attributes that measure maturity. As Table 1 shows, these management attributes describe the evolution of IT processes from level 0 to level 5 (the optimized maturity level), and they can be used to analyze gaps in IT provision and to improve planning. The nine attributes are grouped into five levels, as represented in Table 1, and process maturity levels can then be calculated according to their realization.

Table 1: COBIT process maturity levels. [As COBIT process maturity levels decrease from 5 to 1, the risk increases. Source: Information Systems Audit and Control Association (2012).]
  5 Optimizing: focus on process improvement. Level 5 – Optimizing: PA 5.1 Process innovation; PA 5.2 Process optimization.
  4 Quantitatively managed: processes measured and controlled. Level 4 – Predictable: PA 4.1 Process measurement; PA 4.2 Process implementation control.
  3 Defined: processes characterized by the organization and proactive (projects tailor their processes to the organization’s standards). Level 3 – Established: PA 3.1 Process definition; PA 3.2 Process deployment.
  2 Managed: processes characterized by projects and often reactive. Level 2 – Managed: PA 2.1 Process management; PA 2.2 Process output management.
  1 Initial: processes unpredictable, poorly controlled and reactive. Level 1 – Performed: PA 1.1 Process implementation.

3.6 Calculation of risk with the help of COBIT organizational architecture

After the maturity measurement, risk was another indicator that was calculated from the case study organization’s processes. The risk in this project was an operational risk. In the banking industry, the following operational risk definition is given by the Basel Committee on Banking Supervision (2010): “Operational risk refers to the risk of losses resulting from processes, people, and systems that are inadequate or impaired, or external events. For example, operational risk definitions in financial institutions can include embezzlement, fraud, money laundering, misuse of confidential customer information.”

The concept of risk, as presented in this study, does not exist within the framework of COBIT. One reason for evaluating the process risk is that it provides a good basis for other calculations, in particular, some types of optimization. Since the COBIT framework looks comprehensively at the financial organization and structures all microactivities to meet organizational goals, the context provided by this framework provides a good opportunity to audit all micro-enterprise activities from the risk point of view. Moreover, the impact of various kinds of risk to the organization can also be assessed and analyzed on a larger scale using the hierarchy of goals in the COBIT organization architecture.

The risk level in its material sense means “a hazard” and it is calculated by multiplying damage severity by the risk probability. Generally, the risk management operation consists of three main stages.

Risk identification:

at this stage, the risks that can affect the achievement of business goals are identified.

Risk assessment:

at this stage, the probability of the risks’ occurrence, the severity of the damage and the results from each identified risk are investigated and evaluated.

Risk control:

at this stage, an appropriate strategy is presented to deal with each risk. It also evaluates the effectiveness of the strategy adopted through regular reviews.

Although different frameworks and standards may determine the number of risk management phases differently, the main standard structure is based on the steps illustrated in Figure 4.

Figure 4: Algorithm of risk management in the organization.

In the first phase of this study, the risk factors were identified. Thirteen operational risks were considered as the most common and riskiest incidents in the field of IT projects. These risks included

  1. inappropriate use of IT tools,
  2. unstable published software,
  3. integration of incorrect IT technologies with existing infrastructure,
  4. unrealistic expectations,
  5. inadequate support from senior management,
  6. insufficient attention to user requirements,
  7. lack of adequate counseling,
  8. higher than expected expenditure,
  9. invalid IT management in the project,
  10. too much need for reliability,
  11. knowledge management/invalid asset management,
  12. security challenges and
  13. low efficiency.

This study assessed the identified risks as the second phase of risk management. COBIT was used to calculate the weight of each activity, the maturity of the IT-related processes and the maturity of the organization's domains. The AHP model was also used to calculate risk. In this model, the IT organization was placed at the top of the hierarchy. At the second level, the four main domains (planning and organization; acquisition and implementation; delivery and support; monitoring and evaluation) were evaluated.

These four domains (the second level) were divided into 34 processes (the third level). Processes were then divided into activities (the fourth level). At the fifth level, decision alternatives were placed, which included the 13 most common risks mentioned above. The hierarchy model is shown in Figure 5.

Regarding the values of the variables, the numerical risk level ranged from 0 to 25, which was normalized to a 0–100 scale. As mentioned above, 13 risk categories were considered for each activity, process and expected risk level, resulting in 13 risk-alternative probabilities and severities.

However, only one value should be assigned to the activity as representative of the 13 risk factors. Therefore, by weighting the risks and calculating the weighted average, the activity-related risk can be calculated.
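The collapse of the 13 risk alternatives into one activity-level value might look like the sketch below. Each alternative's probability and severity are on the 0–100 scales of Table 2; scoring each alternative as probability × severity rescaled to 0–100, and weighting the alternatives equally, are both assumptions, since the paper states only that a weighted average is taken.

```python
def activity_risk(risks):
    """Collapse an activity's risk alternatives into a single value.

    Each element of `risks` is a (probability, severity) pair on the
    0-100 scales of Table 2. Probability x severity gives 0-10000,
    which is rescaled to 0-100. Equal weighting of the alternatives
    is an assumption; the paper averages weighted risks without
    stating the weights.
    """
    scores = [p * s / 100 for p, s in risks]  # each score on 0-100
    return sum(scores) / len(scores)

# three of the 13 alternatives shown for brevity: (probability, severity)
r = activity_risk([(60, 80), (30, 90), (10, 50)])
```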

Figure 5: The implementation model of IT risk assessment based on the AHP process.

In this way, an evaluation template was created based on COBIT to compute the maturity and risk of IT management. For each activity, an interview was held with at least five experts selected from among executives, IT management experts, IT management consultants, IT executives, IT system operators, IT system developers and information security management system (ISMS) experts. The time, cost, completion percentage, potential risks, severity and probable harm were determined for each activity. Each interview lasted between 60 and 100 minutes. The case study organization had approximately 20 000 employees working in 1600 branches, and its annual revenue was nearly US$10 billion.

After this form was filled in for each activity, the assessment software calculated the risk based on Table 2. After data had been gathered for all activities in the organization, they were used to calculate the risk to processes, domains and the organization according to the relative importance of each process and the AHP. The risk for the four domains and the whole organization was calculated based on the concepts of relative importance and expected value.
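The expected-value roll-up described above applies the same rule at every level of the AHP hierarchy: process risk from activity risks, domain risk from process risks, and organizational risk from domain risks. The sketch below illustrates one such step; the names and numbers are hypothetical, and normalizing the weights inside the function is an implementation choice, not something the paper specifies.

```python
def rollup(children):
    """Expected-value aggregation used at each level of the hierarchy.

    `children` maps a child name to a (weight, risk) pair, where the
    weight is the child's relative importance and the risk is on a
    0-100 scale. Weights are normalized here so that partial data
    still yields a value; this normalization is an assumption.
    """
    total_w = sum(w for w, _ in children.values())
    return sum(w * risk for w, risk in children.values()) / total_w

# hypothetical processes of one domain: (relative importance, risk level)
domain_risk = rollup({
    "PO1": (0.40, 55.0),
    "PO2": (0.35, 40.0),
    "PO3": (0.25, 50.0),
})
```

Applying `rollup` once per level, bottom-up, reproduces the three result levels reported in Section 4 (organization, domains, processes).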

Table 2: Calculating the probability and severity of damage.

Risk severity | Score
A critical effect that completely stops the system | 100
A severe effect that compromises the security of the client and disrupts the functioning of the system | 90
An important effect that dissatisfies the customer and requires a lot of money to resolve | 80
A normal effect that degrades the system and prevents it from functioning correctly | 70
A normal effect that prevents the system from functioning correctly | 60
An insignificant effect that inconveniences the customer, who seeks to fix it | 50
An insignificant effect that inconveniences the customer, who does not seek to fix it | 40
A negligible effect that does not inconvenience the customer | 30
An unnoticeable effect that affects the functioning of the system and is not negligible | 20
A negligible effect that shows an error in system performance | 10
A negligible effect that shows a negligible failure in system performance | 1

Risk probability | Score
Every day | 100
Every three days | 90
Every week | 80
Every two weeks | 70
Every month | 60
Every three months | 50
Every six months | 40
Every year | 30
Every two years | 20
Every four years | 10
Less frequently than every four years | 1

Description | Probability | Severity
Very high | 80–100 | 80–100
High | 60–80 | 60–80
Medium | 40–60 | 40–60
Low | 20–40 | 20–40
Very low | 0–20 | 0–20

To obtain the risk of all 34 processes, the four domains and the whole organization using the AHP, myriad calculations were required. A form was designed in the assessment stage to reduce the errors and to accelerate the calculation process. This risk information form consisted of three parts. The first part included 13 possible risks in the IT organization. Each activity could be vulnerable to all risks. In the second part, the time of the risk’s occurrence could be selected. In the third part, the severity of damage for each risk could be set. The risk information form is shown in Figure 6.

Figure 6: Risk information form for risk in activities.

4 Results

In this paper, the APQC's IT management framework for financial institutions was used as a reference for customizing the IT processes in the banking industry. Each process comprises a number of activities; in total, the organization includes 34 processes and 1712 activities, as shown in Table 3. Empirical data were needed to assess the maturity and risk of IT management, and data collection was done according to the case study strategy.

Table 3: The four domains and 34 COBIT processes.

Num | Process | 1.1 | 2.1 | 2.2 | 3.1 | 3.2 | 4.1 | 4.2 | 5.1 | 5.2 | Sum
01 | PO1 Define strategic IT plan | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 11 | 51
02 | PO2 Define information architecture | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 9 | 49
03 | PO3 Determine technological direction | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 4 | 44
04 | PO4 Define IT processes, organization and relationships | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 6 | 46
05 | PO5 Manage IT investment | 7 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 47
06 | PO6 Communicate management aims and directions | 3 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 43
07 | PO7 Manage IT human resources | 5 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 45
08 | PO8 Manage quality | 5 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 45
09 | PO9 Assess and manage IT risks | 10 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 50
10 | PO10 Manage projects | 8 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 48
11 | AI1 Identify automated solutions | 24 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 64
12 | AI2 Acquire and maintain application software | 8 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 48
13 | AI3 Acquire and maintain technology infrastructure | 4 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 44
14 | AI4 Enable operation and use | 6 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 46
15 | AI5 Procure IT resources | 5 | 6 | 4 | 5 | 6 | 6 | 5 | 5 | 3 | 45
16 | AI6 Manage changes | 3 | 5 | 4 | 5 | 6 | 5 | 4 | 6 | 9 | 49
17 | AI7 Install and accredit solutions and changes | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 15 | 55
18 | DS1 Define and manage service levels | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 8 | 48
19 | DS2 Manage third-party services | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 10 | 50
20 | DS3 Manage performance and capacity | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 8 | 48
21 | DS4 Ensure continued service | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 8 | 48
22 | DS5 Ensure systems security | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 9 | 49
23 | DS6 Identify and allocate costs | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 4 | 44
24 | DS7 Educate and train users | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 5 | 45
25 | DS8 Manage service desk and incidents | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 6 | 46
26 | DS9 Manage configuration | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 4 | 44
27 | DS10 Manage problems | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 6 | 46
28 | DS11 Manage data | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 26 | 66
29 | DS12 Manage physical environment | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 5 | 45
30 | DS13 Manage operations | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 8 | 48
31 | ME1 Monitor and evaluate IT performance | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 9 | 49
32 | ME2 Monitor and evaluate internal control | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 7 | 47
33 | ME3 Ensure compliance with external requirements | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 5 | 45
34 | ME4 Provide IT governance | 3 | 5 | 5 | 6 | 6 | 5 | 4 | 6 | 7 | 47
Total | | | | | | | | | | | 1712

A bottom-up approach was used to integrate the maturity and risk indicators at the levels of processes, domains and the organization. The maturity of a process was taken as the expected value of the maturity of its activities (based on occurrence probability and completion rate). Similarly, the maturity level of the organization was equal to the expected value of maturity across the four domains. The definitions of the maturity and risk indicators in IT management are presented in Figure 7.
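The expected-value aggregation described above can be sketched as follows. The function name and the sample scores and weights are hypothetical; the paper does not publish its exact formula, so this is a minimal illustration of the idea rather than the authors' implementation.

```python
# Hypothetical sketch of the bottom-up aggregation: the maturity of a process
# is the expected value of its activities' maturity scores (0-5), weighted by
# each activity's relative importance. All numbers here are illustrative.

def expected_maturity(scores, weights):
    """Importance-weighted expected value of maturity scores (0-5)."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Three activities of one hypothetical process
activity_scores = [1, 2, 1]         # maturity score per activity
activity_weights = [0.5, 0.3, 0.2]  # relative importance per activity
print(round(expected_maturity(activity_scores, activity_weights), 2))  # 1.3
```

The same function applied one level up (processes weighted by their relative importance within a domain, then domains within the organization) yields the domain and organization maturity levels.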

Figure 7: Maturity and risk indicators.
Realization percentage for the activities of a process.

These activities are classified into five general categories; full realization of each category of activities increases the maturity level of that process.

Relative importance of each activity.

The relative importance is calculated based on the cost and time spent on the implementation of that activity. Increasing costs and runtime would increase the relative importance of the activity.
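This cost-and-time weighting can be sketched as below. The simple cost-plus-time normalization is an assumption for illustration (the paper does not give its exact weighting formula), and all names and numbers are invented.

```python
# Illustrative sketch: the relative importance of each activity grows with the
# cost and time spent on its implementation. This (cost + time) normalization
# is an assumption, not the paper's exact scheme.

def relative_importance(costs, times):
    """Normalize (cost + time) per activity so the weights sum to 1."""
    raw = [c + t for c, t in zip(costs, times)]
    total = sum(raw)
    return [r / total for r in raw]

costs = [40, 25, 35]  # hypothetical cost units per activity
times = [10, 5, 5]    # hypothetical time units per activity
print([round(w, 2) for w in relative_importance(costs, times)])
# [0.42, 0.25, 0.33]
```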

Risk of each activity.

This criterion is determined by data entered by the user, based on pre-identified risks. The total risk of an activity, obtained using the AHP, is analyzed at various levels based on 13 types of IT risk information.
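A minimal sketch of how the AHP can produce such weights is given below, using three invented risk types instead of the paper's 13. The pairwise-comparison matrix is hypothetical; the geometric-mean (row) method is a standard approximation of Saaty's principal-eigenvector weights, not necessarily the variant the authors used.

```python
from math import prod

# Minimal AHP sketch: derive weights for risk types from a pairwise-comparison
# matrix via the geometric-mean method, a standard approximation of the
# principal-eigenvector weights. The 3x3 matrix is invented for illustration;
# the paper works with 13 IT risk types.

def ahp_weights(matrix):
    n = len(matrix)
    geo_means = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# matrix[i][j]: how much more important risk type i is than j (Saaty 1-9 scale)
pairwise = [
    [1,   3,   5],
    [1/3, 1,   3],
    [1/5, 1/3, 1],
]
weights = ahp_weights(pairwise)
print([round(w, 2) for w in weights])  # weights sum to 1; type 1 dominates
```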

Maturity level of the process.

This criterion, which has a sixfold discrete value (from 0 to 5), represents the concept of process maturity based on the COBIT definition.

Based on the entered information, the risk level and process maturity are assessed and rolled up from the processes to the domains and the whole organization. The results of these calculations for 1712 activities and 34 processes are presented in Tables 4 and 5.

Table 4: Ranking and assessing maturity and risk at the level of domains based on relevant processes. [“G” denotes gained maturity. “P” denotes process relative importance in domain. “RL” denotes risk level. “N” denotes not achieved. “I” denotes initial. “R” denotes repeatable. “D” denotes defined. “Q” denotes quantitatively managed. “O” denotes optimizing.]
(a) PO1: IT strategic plan description

Process attribute      PA 1.1   PA 2.1   PA 2.2   PA 3.1   PA 3.2   PA 4.1   PA 4.2   PA 5.1   PA 5.2
(maturity level)      (Lvl 1)  (Lvl 2)  (Lvl 2)  (Lvl 3)  (Lvl 3)  (Lvl 4)  (Lvl 4)  (Lvl 5)  (Lvl 5)
Criteria evaluation      F        L        L        L        L        L        P        P        P

Maturity level: Level 2 (managed process)
(b) Organization assessment at domain level

Collected processes for planning and organization (domain maturity: Level 1.20)

       Process                                              G   P (%)  RL (%)
  PO1  Define a strategic IT plan                           2   10.8   52.2
  PO2  Define information architecture                      1   11.8   59.6
  PO3  Determine technological direction                    1    9.0   57.1
  PO4  Define IT processes, organization and relationships  1    9.5   58.5
  PO5  Manage IT investment                                 1    9.3   55.7
  PO6  Communicate management aims and direction            1    8.5   56.6
  PO7  Manage IT human resources                            1    9.5   52.3
  PO8  Manage quality                                       2    9.7   49.9
  PO9  Assess and manage IT risks                           1   11.0   46.4
  PO10 Manage projects                                      1   10.9   49.0

Collected processes for acquisition and implementation (domain maturity: Level 1.11)

       Process                                              G   P (%)  RL (%)
  AI1  Identify automated solutions                         1   21.1   46.5
  AI2  Acquire and maintain application software            1   12.1   47.4
  AI3  Acquire and maintain technology infrastructure       2   10.8   47.2
  AI4  Enable operation and use                             1   15.6   45.1
  AI5  Procure IT resources                                 1   14.3   47.7
  AI6  Manage changes                                       1   12.3   42.4
  AI7  Install and accredit solutions and changes           1   12.7   43.7

Collected processes for delivery and support (domain maturity: Level 1.08)

       Process                                              G   P (%)  RL (%)
  DS1  Define and manage service levels                     1    7.2   44.3
  DS2  Manage third-party services                          1    8.5   45.0
  DS3  Manage performance and capacity                      1    7.9   42.1
  DS4  Ensure continuous service                            1    7.8   45.5
  DS5  Ensure systems security                              1    7.7   43.6
  DS6  Identify and allocate costs                          1    7.3   45.2
  DS7  Educate and train users                              1    7.0   44.0
  DS8  Manage service desk and incidents                    1    7.6   46.0
  DS9  Manage configuration                                 1    6.8   45.8
  DS10 Manage problems                                      2    7.6   46.1
  DS11 Manage data                                          1   10.2   49.6
  DS12 Manage physical environment                          1    7.1   47.9
  DS13 Manage operations                                    1    7.1   44.5

Collected processes for monitoring and evaluation (domain maturity: Level 1.00)

       Process                                              G   P (%)  RL (%)
  ME1  Monitor and evaluate IT performance                  1   24.8   43.8
  ME2  Monitor and evaluate internal control                1   25.3   47.5
  ME3  Ensure compliance with external requirements         1   23.9   46.5
  ME4  Provide IT governance                                1   26.1   45.4
Table 5: Ranking and assessing maturity and risk at the level of domains and the whole organization.

                                 Domain maturity  Risk level  Domain relative  Organization    Organization
Domain                           level (0-5)      (%)         weight (%)       maturity level  risk assessment (%)
Planning and organization             1.20          53.70         29.20             1.11            47.94
Acquisition and implementation        1.11          45.70         23.40             1.11            47.94
Delivery and support                  1.08          45.40         36.20             1.11            47.94
Monitoring and evaluation             1.00          45.80         11.20             1.11            47.94
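The organization-level figures in Table 5 can be checked directly, assuming simple weighted averaging of the domain values under the domain relative weights (which matches the published organization-level figures):

```python
# Reproduce Table 5's organization-level maturity and risk as weighted
# averages of the four domain values under the domain relative weights.
# Tuples: (maturity level, risk level %, relative weight), from Table 5.
domains = {
    "Planning and organization":      (1.20, 53.70, 0.292),
    "Acquisition and implementation": (1.11, 45.70, 0.234),
    "Delivery and support":           (1.08, 45.40, 0.362),
    "Monitoring and evaluation":      (1.00, 45.80, 0.112),
}

org_maturity = sum(m * w for m, _, w in domains.values())
org_risk = sum(r * w for _, r, w in domains.values())
print(round(org_maturity, 2), round(org_risk, 2))  # 1.11 47.94
```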

The results showed that, given the maturity and risk levels of the PO4 and PO10 processes in the PO domain, it is impossible to succeed in IT management without the necessary coordination.

In terms of maturity, the organization is at level 1.11 (indicating ad hoc and disorganized processes); therefore, there is a 47.9% chance that the organization faces operational risk in achieving its IT management goals.

The PO domain, which has a maturity level of 1.2 (indicating ad hoc and disorganized processes), has about a 53.7% chance of facing risk. Quality management (the PO8 process) is effective in achieving IT organization goals in terms of maturity and risk levels. However, in a less experienced organization, quality management is considered only to be a commercial issue.

The AI domain, which has a maturity level of 1.11 (indicating ad hoc and disorganized processes), has about a 45.7% chance of facing risk. Changes in infrastructure and applications (the AI3 process) ensure IT management excellence. Failure of initiatives such as enterprise resource planning and business intelligence may negatively affect the perception of business IT management.

The DS domain, which has a maturity level of 1.08 (indicating ad hoc and disorganized processes), has about a 45.4% chance of facing risk. The maturity and risk profile of the DS6 process highlights the need to emphasize cost allocation in order to shift IT's position from a purely cost-based unit to a finance-based unit.

The ME domain, which has a maturity level of 1 (indicating ad hoc and disorganized processes), has about a 45.8% chance of facing risk. The maturity and risk of the ME1 process show that good IT management is more achievable in the long term. Therefore, continuous monitoring of IT is a key factor in achieving IT management goals.

5 Discussion

The findings suggested that the organization had a maturity level of 1.11 out of 5 using COBIT, which represents the initial stage of maturity and can be interpreted as unpredictable, poorly controlled and reactive processes. Moreover, there is a 47.94% probability of operational risk in the field of IT in the organization.

According to the research calculations, the highest process maturity was obtained by PO1 (defining a strategic IT plan), reflecting the new approach of state-owned banks and the concentration of these organizations’ managers on the benefits derived from the use of IT in electronic banking. The two processes PO9 and PO10 in IT governance have the lowest process maturities. PO9 is concerned with the progress, safety and security processes of IT.

The results suggest the current maturity level of IT governance in state-owned banks is not desirable, and state-owned banks need an IT plan to improve maturity. These results also indicate that state-owned banks have realized the significance of strategic alignment between business and IT. Therefore, IT goals and business goals must be aligned in order to achieve a competitive advantage and a larger market share.

The banking system compels banks to treat society directly as their client. Client behavior has also transformed: nowadays, most clients exclusively use e-banking services such as internet banking, mobile banking, automated teller machines and point-of-sale terminals. If service delivery is inefficient, clients will lose trust in the banks. This may partly explain why the risk levels of ensuring continuous service (the DS4 process), evaluating internal control (the ME2 process) and ensuring compliance with supervisory requirements (the ME3 process) differ greatly from bank to bank.

In order to reduce the chance of risk and improve the maturity of the IT processes in the case study, the processes could be prioritized according to their weight and level of maturity. As a result, the company could allocate resources effectively between IT processes and improve company performance in order to gain advantage in the market.

6 Limitations of the study

The most important limitations of this research were the lack of IT infrastructure, the lack of expertise in human resources, the time limitation for project implementation (due to the entry of new rivals into the market of the studied financial institution), budget constraints (due to expensive services and the withdrawal of customers) and the lack of prior research related to the simultaneous assessment of risk and maturity in an IT organization.

Also, since this research used a single case study, the results cannot be generalized to other businesses and organizations. Additionally, the literature related to this research was limited.

The lack of infrastructure and specialists made it impossible to use newer tools, such as COBIT 5 or COBIT 2019. The most compelling reasons for migrating from COBIT 4.1 to COBIT 5 and then to COBIT 2019 are the need for integrated knowledge of standards, for a process-based framework that covers all IT and business operations, for effective guidelines in IT and management areas such as enterprise architecture and emerging technologies, and for integration with other standards. In terms of the infrastructure and executive structures, the executives needed to migrate to COBIT 5 and then to COBIT 2019 after the development of the infrastructure.

7 Conclusion

Nowadays, given the widespread use and importance of IT in different organizations, and since accidents and errors in IT organizations can lead to significant problems, special attention should be paid to measuring IT maturity, risk and the severity of potential damage. Since IT involves many processes, measuring risk and maturity at the level of activities, processes, domains and organizations is a complicated and overwhelming task that is prone to calculation errors.

Most managers have fundamental problems in selecting an appropriate risk management framework and methodology. Moreover, the board of directors, managers, business managers and IT and risk specialists throughout the organization need guidance for the implementation of the risk management process based on the internal and external conditions of their organization; therefore, it is necessary to use effective methods and frameworks to provide guidance for managers and others and solve the problem of risk assessment and management in IT-based organizations. The COBIT framework can be a useful tool for implementing a risk management process. At first glance, this framework is a common language for communication between business managers and IT managers to achieve a common understanding of each other’s intentions and objectives. More fundamentally, it puts the operational objectives of the governing and management system in the hands of the organization. In other words, it enables value creation by maintaining a balance between realizing interests, managing risks and optimizing resources.

To accelerate this value creation process and ensure more reliable output data, software should be developed based on IT architecture to calculate the maturity and risk of organizations based on the IT maturity and risk assessment models, by obtaining initial information at all IT levels. This study used COBIT and the balanced scorecard to assess IT maturity and risk separately, which has not been done in previous studies.

In this paper, IT management risk and maturity were evaluated by collecting data related to 1712 activities, 34 processes and four domains. The results of this paper can be applied to IT organizations, academics and stakeholders. After the results were calculated and confirmed, they were analyzed and sent to the organization’s management. They were then used for:

  (1) allocating limited resources to activities and processes;

  (2) optimizing and effectively investing in IT;

  (3) reducing administrative costs;

  (4) increasing growth and development; and

  (5) increasing flexibility in commercial areas.

In addition, this study provided a dynamic measurement of the relative importance of each IT-related activity, process and domain in the organization. To manage operational IT risk, resource allocation can be based on a relative importance index in future studies. The COBIT model used in this study, where activities were extracted based on the APQC framework, can also be implemented in similar organizations in future research, since most large organizations have proven their ability to implement COBIT.

8 Recommendations

Future research should use this method in other industries and organizations to assess risk and maturity simultaneously and to analyze the results, in order to provide managers with more useful data.

In contrast to risk identification based on a single routine process, as is often the case currently, the risk management in this study identified and controlled risks to the IT goals and business objectives of a financial organization. Risk was also measured separately from maturity. Therefore, it is suggested that risk be measured and analyzed by identifying the relationship between process maturity and each risk.

Identifying factors with a major impact on IT in banks was the priority of this study for building process models. The factors found in this research should be validated at the level of large domestic organizations. In this way, hidden variables can be identified and categorized.

Periodic measurements of the factors in success and failure, as well as a comprehensive view of an organization’s current situation, can provide the necessary information for allocating resources to policy makers. Measurements will also help assess the improvement of performance indicators for managers in order to facilitate the use of IT.

To increase users’ and stakeholders’ business satisfaction, IT management should focus on improving IT governing processes so that they have greater relative importance, greater maturity and lower risk.

These results can also be used to improve the current IT management template. Repeating and generalizing these findings with the same methodology will lead to better business performance. It is suggested that increased profits, financial reporting and market share are some of the parameters that could be considered for future studies to provide a method for linking business performance to IT management and IT maturity management.

According to the literature review on risk management and reliability, there is no detailed study on the reliability of the processes. Therefore, it is necessary to study theoretical foundations and reliability engineering and management, by extracting concepts that fit the nature of organizational risk and maturity.

Furthermore, a method design for evaluating the combined effect of risks on an organization is recommended, to evaluate the impact of risks on the organization by considering the interactions between risks.

Declaration of interest

This article is part of the PhD project of Hossein Moinzad and is not related to any governmental organizations. There is no funding for this research. No potential conflict of interest was reported by the authors.

References

  • Ali, M., Kurnia, S., and Johnston, R. (2011). Understanding the progressive nature of Inter-Organizational Systems (IOS) adoption. In E-Collaboration Technologies and Organizational Performance: Current and Future Trends, pp. 124–144. Information Science Publishing, IGI Global, Hershey, PA (https://doi.org/10.4018/978-1-4666-2625-6.ch004).
  • Basel Committee on Banking Supervision (2010). The Basel Committee’s response to the financial crisis: report to the G20. Report, October, Bank for International Settlements. URL: http://www.bis.org/publ/bcbs179.pdf.
  • Becker, J., Poppelbus, J., Niehaves, B., and Simons, A. (2010). Maturity models in information systems research: literature search and analysis. Communications of the Association for Information Systems 29(27) (https://doi.org/10.17705/1CAIS.02927).
  • Crawford, J. K. (2014). Project Management Maturity Model, 3rd edn. Auerbach Publications, New York (https://doi.org/10.1201/b17643).
  • De Zoysa, S., and Russell, A. D. (2003). Structuring of risk information to assist in knowledge-based identification of the life cycle risks of the civil engineering project. In Proceedings of the 5th Construction Specialty Conference of the Canadian Society for Civil Engineering, June, Canada, pp. 4–7. Canadian Society for Civil Engineering.
  • Ginsberg, A. C., Abolhassani, L., and Rahmati, E. A. (2018). Comparing networked and linear risk assessments: from theory to evidence. International Journal of Disaster Risk Reduction 30, 216–224 (https://doi.org/10.1016/j.ijdrr.2018.04.031).
  • Gottschalk, P. (2009). Maturity levels for interoperability in digital government. Government Information Quarterly 26(1), 75–81 (https://doi.org/10.1016/j.giq.2008.03.003).
  • Grembergen, V. W., Haes, D. S., and Guldentops, E. (2004). Structures, Processes and Relational Mechanisms for IT Governance: Strategies for Information Technology Governance. IGI Global (https://doi.org/10.4018/978-1-59140-140-7.ch001).
  • Haes, D. S., and Grembergen, V. W. (2008). Analyzing the relationship between IT governance and business/IT alignment maturity. In Proceedings of the 41st Hawaii International Conference on System Sciences. IEEE (https://doi.org/10.1109/HICSS.2008.66).
  • Hevner, A., and Chatterjee, S. (2010). Design science research in information systems. In Design Research in Information Systems, pp. 9–22. Integrated Series in Information Systems, Volume 22. Springer (https://doi.org/10.1007/978-1-4419-5653-8_2).
  • Hewlett Packard (2007). The HP Business intelligence maturity model. Report, HP. URL: https://download.101com.com/pub/tdwi/Files/BI_Maturity_Model_4AA1_5467ENW.pdf.
  • Hillson, D. A. (1997). Toward a risk maturity model. International Journal of Business Risk Management 1(1), 33–45.
  • Holland, C. P., and Light, B. (2001). A stage maturity model for enterprise resource planning systems use. Database for Advances in Information Systems 32(2), 24–45 (https://doi.org/10.1145/506732.506737).
  • Information Systems Audit and Control Association (2012). COBIT 5: a business framework for the governance and management of enterprise IT. Working Paper, ISACA.
  • IT Governance Institute (2007). COBIT 4.1 framework, control objectives, management guidelines, maturity models. Working Paper, ISACA.
  • Kaplan, S., and Garrick, B. J. (1981). On the quantitative definition of risk. Risk Analysis 1(1), 11–27 (https://doi.org/10.1111/j.1539-6924.1981.tb01350.x).
  • Khaiata, M., and Zualkernan, I. A. (2009). A simple instrument to measure IT business alignment maturity. Information Systems Management 26(2), 138–152 (https://doi.org/10.1080/10580530902797524).
  • Luftman, J. (2003). Assessing IT/business alignment. Information Systems Management 20(4), 9–15 (https://doi.org/10.1201/1078/43647.20.4.20030901/77287.2).
  • Mettler, T. (2009). A design science research perspective on maturity models in Information Systems. Working Paper, Institute of Information Management, University of St. Gallen. URL: http://www.alexandria.unisg.ch/publications/214531.
  • Pescaroli, G., and Alexander, D. (2016). Critical infrastructure, panarchies and the vulnerability paths of cascading disasters. Natural Hazards 82(1), 175–192 (https://doi.org/10.1007/s11069-016-2186-3).
  • Rosemann, M., and de Bruin, T. (2005). Towards a business process management maturity model. In Proceedings of the European Conference on Information Systems, Regenburg, Germany. dblp Computer Science.
  • Saaty, T. L. (1999). Fundamentals of the analytic network process. Working Paper, August, ISAHP, Japan, pp. 12–14.
  • Sun Microsystems (2005). Information lifecycle management maturity model. White Paper, April, Sun Microsystems. URL: https://vbds.nl/downloads/pub10.pdf.
  • Webb, P., Pollard, C., and Ridley, G. (2006). Attempting to define it governance: wisdom or folly? In Proceedings of the 39th Hawaii International Conference on System Sciences. IEEE Press, Piscataway, NJ (https://doi.org/10.1109/HICSS.2006.68).
  • Weill, P., and Ross, J. W. (2004). IT Governance: How Top Performers Manage IT Decision Rights for Superior Results. Harvard Business Press, Cambridge, MA.
  • Wildeman, R. M. (1986). Risk management. Project Management Journal 17(4), 20–26 (https://doi.org/10.1097/00006247-198608000-00012).
  • Win, K. T., Phung, H., Young, L., Tran, M., Alcock, C., and Hillman, K. (2004). Electronic health record system risk assessment: a case study from the MINET. Health Information Management 33(2), 43–48 (https://doi.org/10.1177/183335830403300205).
