Information Security Maturity Model (ISMM)


Information Security Maturity Model (ISMM)

A dissertation submitted to The University of Manchester for the degree of Master of Science in the Faculty of Engineering and Physical Sciences

2013

By Nnatubemugo Innocent Ngwum
School of Computer Science

Table of Contents

List of Tables
List of Figures
Abstract
Declaration
Intellectual Property Statement
Acknowledgement
Dedication
Chapter 1: Introduction
    Research Motivation and Questions
    Project Aims and Objectives
    Project Scope
    Report Structure
Chapter 2: Literature Review
    Maturity Modelling: An Improvement Approach
    Trends in Maturity Modelling
    Crosby's Quality Management Maturity Grid (QMMG)
    Overview of the CMM and CMMi
    ISO/IEC Model
    HMG Information Assurance Maturity Model
    Security Maturity Models
    Economics of Security
    Summary
Chapter 3: Methodology
    Overview and Deliverables
    Methodology

        Preliminary Research
        Data Collection
        Data Validation and Analysis
        ISMM Development
        ISMM Testing/Project Evaluation
        Conclusion and Report Writing
    Project Tools
        Data Collection Tools
        Communication Tools
        Report Writing Tools
    Summary
Chapter 4: ISMM Development and Evaluation
    Iterative Development of ISMM
    ISMM Components Description
        Maturity Levels
        Security Factors
        Security Areas
    ISMM Evaluation
    Summary
Chapter 5: Results
    Economic Spending of Organisations
    Security Governance
    Process and Procedures
    Technology and Innovation
    Risk Management
    Results Summary

Chapter 6: Evaluation of Selected Organisations
    Structure of Information Security Assessment Framework (ISAF)
    Testing the ISMM and ISAF by Evaluating Selected Organisations
    Assessments on Processes and Procedures Standardisation
    Assessments on Technology and Innovation
    Assessments on Security Governance
    Assessments on Risk Management Capability
    Security Cost Estimation
    Discussion
    Summary
Chapter 7: Conclusion
    Summary of Achievements
    Project Limitations
    Recommendation for Future Work
    Conclusion
List of References
Appendix A: Cover Letter and Questionnaire
Appendix B: Information Security Maturity Model (ISMM)
Appendix C: Information Security Assessment Framework (ISAF)
Appendix D: Evaluation Feedback Form and Results
Appendix E: Project Gantt Chart
Appendix F: Security Cost Estimation Template
Appendix G: Project Plan Gantt Chart

Word Count: 35,157

List of Tables

Table 2.1: The Quality Management Maturity Grid
Table 2.2: The Five Levels of Software Process Maturity
Table 2.3: Capability Maturity Model Integration
Table 2.4: ISO/IEC-2 Maturity Model
Table 2.5: The HMG Information Assurance Maturity Model
Table 2.6: Summary of Security Maturity Models
Table 2.7: Works on Economics of Security
Table 2.8: Overall costs of an organisation's worst incident
Table 3.1: Research Methodologies
Table 5.1: Participants' roles and years of experience
Table 5.2: Economic spending of participating organisations
Table 5.3: Additional results under 'security governance' category
Table 5.4: Additional results under 'process and procedures' category
Table 5.5: Additional results under 'technology and innovation' category
Table 5.6: Additional result under 'risk management' category
Table 6.1: Assessment of case study organisations on process and procedure standardisation
Table 6.2: Assessment of case study organisations on technology and innovation
Table 6.3: Assessment of case study organisations on security governance
Table 6.4: Assessment of case study organisations on risk management capability
Table 6.5: Breakdown of case study organisations' information security maturity
Table 6.6: Contractor Salary Rates
Table 6.7: Cost of security of participating organisations
Table 6.8: Projected cost of security per maturity level

List of Figures

Figure 2.1: Continuous representation of the CMMI
Figure 2.2: Staged representation of the CMMI
Figure 2.3: ISO/IEC Model process categories and process groups
Figure 4.1: Iterative Procedure for ISMM Development
Figure 4.2: Information Security Maturity Levels
Figure 4.3: Information Security Assessment Categories
Figure 4.4: Security Factors and their Security Areas
Figure 4.5: Four Aspects of Cost of Security
Figure 5.1: Participants by industry sector
Figure 5.2: Management's perception of information security
Figure 5.3: Formulation of security policy
Figure 5.4: Levels of awareness training programme
Figure 5.5: Employees' security awareness level and compliance
Figure 5.6: Stages for security consideration in a project
Figure 5.7: Information management procedures as established by participants
Figure 5.8: Rate of information security audit and penetration testing
Figure 5.9: Rate of security incidents by participants
Figure 5.10: Rate of disruption of business processes by security attacks
Figure 5.11: Average uptime rate for participants' networks
Figure 5.12: Participant risk assessment and treatment capability levels
Figure 5.13: Intervals for malware scanning and software updates
Figure 5.14: Intervals for checking and ensuring business continuity plans' activeness

Abstract

The main focus of this project is to investigate the popular perception among security experts that up-front security investments by organisations to achieve high maturity eventually result in a lower cost of ensuring security. Ross Anderson et al., however, take a different view in their recent report, Measuring the Cost of Cybercrime. They argue that huge security investments by organisations have little or no influence on their eventual cost of security; their study showed that cyber criminals still find their way in and inflict costs on organisations despite high security investments[6]. The question between these arguments, then, is: are the security experts wrong in their opinion? Or could it be that organisations' investments are not properly guided by standards using effective security maturity models? Through literature research, various existing maturity models, especially the security models, are compared. The inspiration and ideas derived enabled an information security maturity model (ISMM) to be developed. The developed five-layer ISMM is used to evaluate selected organisations to determine their maturity levels and how those translate into their economic postures. The assessment results and the project outcome show that organisations that make up-front security investments stand to gain far more than those that do not. The results also indicate that good security controls alone are not enough to guarantee information security without complementary good security governance and culture.

Declaration

No portion of the work referred to in this dissertation has been submitted in support of an application for another degree or qualification of this or any other university or other institute of learning.

Intellectual Property Statement

i. The author of this dissertation (including any appendices and/or schedules to this dissertation) owns certain copyright or related rights in it (the "Copyright") and s/he has given The University of Manchester certain rights to use such Copyright, including for administrative purposes.

ii. Copies of this dissertation, either in full or in extracts and whether in hard or electronic copy, may be made only in accordance with the Copyright, Designs and Patents Act 1988 (as amended) and regulations issued under it or, where appropriate, in accordance with licensing agreements which the University has entered into. This page must form part of any such copies made.

iii. The ownership of certain Copyright, patents, designs, trademarks and other intellectual property (the "Intellectual Property") and any reproductions of copyright works in the dissertation, for example graphs and tables ("Reproductions"), which may be described in this dissertation, may not be owned by the author and may be owned by third parties. Such Intellectual Property and Reproductions cannot and must not be made available for use without the prior written permission of the owner(s) of the relevant Intellectual Property and/or Reproductions.

iv. Further information on the conditions under which disclosure, publication and commercialisation of this dissertation, the Copyright and any Intellectual Property and/or Reproductions described in it may take place is available in the University IP Policy, in any relevant dissertation restriction declarations deposited in the University Library, in The University Library's regulations and in The University's Guidance for the Presentation of Dissertations.

Acknowledgements

My heartfelt appreciation goes to my supervisor, Dr. Daniel Dresner, for his guidance and feedback throughout the project. I must say that his challenging questions and reasoning during the project greatly propelled me to success. I also wish to thank all the security experts in industry and academia who participated and contributed during the iterative development and evaluation of the project.

Dedication

I dedicate this report to God, who is the reason for my existence and achievements; I cannot do without you. To my loving parents: you have jealously sown a lot in me, and I remain indebted to you. To NITDA, Nigeria: thank you for this great opportunity.

Chapter 1: Introduction

It is a known fact that organisations, in both the public and private sectors, have evolved from the era of documents to that of data in the course of their business[53]. Ever-growing volumes of data and information are generated by the increasing number of technological devices in the field of information technology (IT). These advancements in information technology have been empowering businesses, and still hold a lot for the future. Companies can now transact business with partners and reach their numerous customers, irrespective of distant geographical locations[40]. It is becoming generally accepted that ''the technological underpinnings of business are fundamental to its success in today's digital world''[50]. Hence, every business is now automating its processes and adopting the best technologies so as to remain competitive. Indeed, information technology has virtually become the platform for every activity in society, and will remain indispensable. Yet even as information technology can empower a business, it can also ruin it fast. Information has become a valuable and integral part of every business, and the ''currency of the twenty-first century'' that ''needs to be suitably protected''[30, 39]. This is because the loss of an organisation's vital information (secrets) might imply loss of business, while the gain of certain sensitive information could imply gain of power. The shift of these information assets from the physical world (in the form of documents) to the digital world therefore compels criminals to realign their attack strategies to the trend. Cybercrimes now outnumber conventional burglary cases[6]. A recent survey reported in one of the UK's newspapers, The Telegraph, announced a fifty per cent upsurge in cybercrime last year, revealing that small businesses, and especially large ones, are ''struggling to cope with losses of between £450,000 and £850,000''[29].
Governments are also not spared in these online attacks, as witnessed in South Korea[1] and in Malaysia[41] in the past. Trends show that society is fast moving into an era of cyber war[47, 55]. Therefore, if businesses and economies are to thrive in the face of these threats, they must take information security seriously and invest in it. But how can they know when their security investments and systems are effective enough to withstand these evolving threats? This question suggests the need for a tool for evaluating and guiding organisations towards information security maturity.

A tool for appraising an organisation's security level, an information security maturity model, is developed in this project. What is an information security maturity model? Information security means the confidentiality (restriction of unauthorised access or disclosure), integrity (preservation of originality) and availability (access whenever required) of information and information systems. Any tool with well-defined, evolutionary levels for determining the extent to which an organisation can reliably achieve its set objectives is a maturity model[54]. Bringing these two definitions together, an information security maturity model can be defined as a tool for assessing the capability of organisations' information systems to meet security requirements, while ensuring organisations' objectives are met, in the midst of security attacks and incidents[54]. This project, in particular, is concerned with investigating and determining the value of security investments in organisations: whether those investments are benefitting them, or whether they merely have an illusion of protection while continually losing resources to security incidents and breaches. I start by addressing the research motivation and questions.

1.1 Research Motivation and Questions

As stated in the previous section, the maturity model will enable organisations to determine their security status. But why would organisations bother to know their information security levels if such an undertaking were not critical to their business goals? The information security maturity model offers practical benefits to organisations in several direct and indirect ways. It liberates an organisation from the comfort and laxity of not having a clear picture of its maturity status and spurs it towards improvement; knowing what one does not know is the beginning of knowledge. The organisation is then inspired to action, as the model presents the identified limitations as the barriers preventing it from meeting its desired goals.
By so doing, the maturity and standards of the organisation are raised towards achieving its objectives. The maturity model can be used to improve a process, and consequently to improve the quality of systems, products or services resulting from the

improved process. It is believed that ''the quality of a system is highly influenced by the process used to acquire, develop and maintain it''[62]. By raising service and product quality, stakeholders' and customers' satisfaction is increased, implying more business for the organisation. Crosby, in his book Quality is Free, argues that the earlier a defect is identified and corrected in a manufacturing process, the lower the cost of ensuring quality[21]. Similarly, I can argue that this maturity model, which can be used to identify improvement areas in organisational processes for remedy, is a cost-saving tool. These benefits derivable from the use of the ISMM explain the motivation behind this research project, which is to help organisations achieve them. Furthermore, since maturity models ensure coordinated processes and activities within organisations, they enable environments in which organisations can quantitatively evaluate their investments against returns. Unfortunately, while most organisations invest in information security to secure clients' private data, to keep business secrets untapped, and for many other reasons geared towards business success, they are often not sure their investments are yielding the intended purpose, which is security and business gain in general. This has prompted the following research questions:

- What is the relationship between the maturity level of an organisation and its cost of security?
- Do organisations that make up-front security investments benefit more than those that do not?
- Do security controls alone ensure information security?

From the results of this project, answers to these questions emerged. To achieve that, the aims and objectives of the project were set so as to guide it to successful completion. The next section discusses them.

1.2 Project Aims and Objectives

The project has two main aims.
Firstly, to develop a tool for evaluating the information security maturity of organisations: an information security maturity

model. Secondly, to investigate whether there exists a relationship between an organisation's (or supply chain's) development activities and its cost of security. It could be that higher maturity implies higher cost for maintaining infrastructure and sustaining the maturity; or perhaps lower cost (more savings) because incidents and breaches are prevented. This project sets out to verify, and conclude on, the popular claim that organisations with higher maturity are likely to save and gain more than counterparts who have not invested in security. Therefore, I proposed the following objectives to enable me to achieve these two aims:

- To gain good knowledge of best practices as specified in the information security standards.
- To carry out a detailed systematic literature review of existing capability maturity models, with specific interest in security maturity models.
- To conduct a credible information security survey of organisations in different industries.
- To develop a five-layer information security maturity model.
- To determine selected organisations' positions on the developed five-layer information security maturity model.
- To investigate selected organisations' security investments in relation to their cost of security.

1.3 Project Scope

This project primarily focuses on developing an information security maturity model that can be used to evaluate the maturity level of an organisation. A systematic literature review of previous capability maturity models (CMMs), especially security maturity models, was carried out in order to understand this project in a wider context. Some previous maturity frameworks and methodological concepts were adopted and improved upon in the course of this project. Case study organisations were also evaluated to determine their positions on the developed five-layer maturity model and the impact on their economic situations. By carrying out these evaluations and cost-benefit analyses, the motive of the project was achieved.
This motive was to empirically confirm whether initial security investments to attain maturity actually influence the economic gains or losses of an organisation. The nature of such a relationship was also

determined. It is worth highlighting, however, that the proposed maturity model is not a substitute for an information security improvement programme. It can only be used to appraise an organisation; it is not meant to prescribe solutions to raise the organisation's level.

1.4 Report Structure

Having introduced this project and defined its objectives and scope, the remaining parts of this report are structured thus:

Chapter 2: Literature Review. This chapter covers detailed background research on maturity models so as to situate the project in a wider context. It starts by explaining the maturity modelling concept, and proceeds to discuss trends in maturity modelling. Security maturity models are discussed to build the understanding necessary for the project. The chapter also explores the literature on the economics of security before ending with a summary section.

Chapter 3: Research Methods and Project Plans. This chapter focuses on the methodology used, those discarded, and the reasons for the choice of methodology. It also details all the individual stages of the methodology and their activities. An overview of the project is presented first, while the tools used for the project are presented in the concluding part.

Chapter 4: ISMM Development and Evaluation. This chapter explains in detail how the ISMM was developed using an iterative procedure. It also describes all the components, maturity levels, security factors and areas, and ends by describing the evaluation strategies for the project.

Chapter 5: Results. This chapter presents the results of the survey conducted and highlights salient points using charts and tables.

Chapter 6: Evaluation of Selected Organisations. The assessment of the case studies is done in this chapter. This is preceded by a description of the information security assessment framework (ISAF) that is used in

assessing the organisations. This chapter further discusses the results of the project and the outcome of the assessment conducted on the case studies.

Chapter 7: Conclusion. This chapter summarises all the work done in the project. It starts by presenting the project achievements, limitations and areas for future work, and then finalises by giving the conclusions, which answer the research questions.

Chapter 2: Literature Review

This chapter presents a systematic review of maturity models. The concept of maturity modelling is treated in the first section. The following section discusses the early key maturity models, and extends to existing security maturity models. An economics of security section is included, which highlights and discusses relevant works that investigated the costs of security. Finally, the chapter ends with a summary section that also provides the rationale for the project.

2.1 Maturity Modelling: An Improvement Approach

According to the Oxford Advanced Learner's Dictionary, maturity is ''the state of being fully grown or developed'', while a model is a thing that is considered an excellent example of something. Combining these two words, I explain a maturity model as a good example of an evolutionary development of something to full growth. In the information technology context, a number of explanations of what a maturity model is have been made: it is a set of well-defined activities which describes the features of effective processes[62]; it is a systematic framework for evaluating and improving processes in an evolutionary pattern[54]. Philip Crosby, in his book Quality is Free, argues that quality is often misinterpreted, holding that quality is simply meeting requirements[21]. However, if requirements are unclear or unknown, it will be difficult to achieve them; great success is always the product of great vision. This implies that activities and efforts will keep revolving in an uncoordinated sphere, within boundaries of little or no progress, until there is a clear picture of a goal or destination. Maturity modelling is perceived as a standardised approach for guiding and connecting these activities, processes and efforts to their destination, which is the desired goals and objectives.
By systematically evaluating the practices, culture and operations of an organisation against standards, the extent to which the organisation can achieve its set objectives is determined. Any means or tool with well-defined progressive levels for conducting this exercise can be termed a maturity model[54].

Over the years, many maturity models have evolved, all with the common purpose of improving a process. But before focusing on the information security maturity model, which is the topic of this project, it is important to look at trends in maturity modelling.

2.2 Trends in Maturity Modelling

The continuing rise of a variety of maturity models for improvement purposes in different domains was preceded by some basic early models. Indeed, it would be difficult to discuss these models as a mix, or each as a collection of ideas and arguments from different researchers, since the series of versions of each model has always been handled solely by one body (e.g., the Software Engineering Institute) or a particular set of individuals or bodies. In this section, I discuss these basic foundation models (whose structures many other models adopt) in linear form, identifying their weaknesses, their strengths and how they apply to information security.

Crosby's Quality Management Maturity Grid (QMMG)

The Maturity Grid is one of the earliest attempts at maturity modelling. Philip Crosby, the quality professional, introduced the Quality Management Maturity Grid in his attempt to address the difficulties of quality management[21]. He proposed that the earlier a defect is identified and corrected in a manufacturing process, the lower the cost of achieving quality, because rework expenses are greatly saved. According to Crosby, desired quality results can only be achieved and maintained by conscious planning, and not by chance or brute force[21]. This led him to develop a system for measuring and controlling quality in an organisation's operations: the quality management maturity grid. The grid, according to him, was intended as a tool which operations managers (who may not really be quality experts) could use in appraising their operations and identifying improvement actions.
The maturity grid is divided into five stages of maturity, with six management categories, as shown below:

The six management categories assessed at each stage are: management understanding and attitude; quality organisation status; problem handling; cost of quality as % of sales; quality improvement actions; and summation of company posture.

Stage I: Uncertainty
- Management understanding and attitude: random blames and agitations due to poor management awareness.
- Quality organisation status: quality responsibility is left to the operations departments only.
- Problem handling: chaotic, with blames and fighting.
- Cost of quality as % of sales: reported unknown; actual 20%.
- Quality improvement actions: not recognised and not done.
- Summation of company posture: "We don't know why we have problems with quality."

Stage II: Awakening
- Management understanding and attitude: management recognises quality needs, but is unsupportive.
- Quality organisation status: a better quality manager, but operations are still reactive.
- Problem handling: temporary approach to resolving problems; no long-term strategy.
- Cost of quality as % of sales: reported 3%; actual 18%.
- Quality improvement actions: a short-term reactive approach is practised.
- Summation of company posture: "Is it absolutely necessary to always have problems with quality?"

Stage III: Enlightenment
- Management understanding and attitude: emerging knowledge and support for quality.
- Quality organisation status: the quality department receives top management attention; the quality manager partakes in management functions.
- Problem handling: problems are handled with a sense of ownership and maturity, in a repeatable way.
- Cost of quality as % of sales: reported 8%; actual 12%.
- Quality improvement actions: a well-defined quality improvement programme is followed.
- Summation of company posture: "Through management commitment and quality improvement we are identifying and resolving our problems."

Stage IV: Wisdom
- Management understanding and attitude: management participates in the quality campaign.
- Quality organisation status: the quality manager is part of the organisation; reporting is effective, with more preventive actions.
- Problem handling: problems are identified earlier and prevented; improved functions.
- Cost of quality as % of sales: reported 6.5%; actual 8%.
- Quality improvement actions: the quality programme is delivering desired results.
- Summation of company posture: "Defect prevention is a routine part of our operation."

Stage V: Certainty
- Management understanding and attitude: quality is considered part of business essentials.
- Quality organisation status: the quality manager is on the board; quality is taken seriously, and defect prevention is highly achieved.
- Problem handling: problems are prevented except in rare cases.
- Cost of quality as % of sales: reported 2.5%; actual 2.5%.
- Quality improvement actions: quality improvement is a routine.
- Summation of company posture: "We know why we do not have problems with quality."
Table 2.1: The Quality Management Maturity Grid[21]
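The cost-of-quality row in Table 2.1 makes Crosby's economic argument concrete: the actual cost of quality falls from 20% of sales at Stage I to 2.5% at Stage V. A quick calculation illustrates the scale of the implied saving; the £10m turnover figure below is invented purely for illustration, while the percentages are Crosby's:

```python
# Actual cost-of-quality figures from Crosby's grid (as % of sales),
# applied to a hypothetical annual turnover to show the saving implied
# by progressing through the stages.
actual_cost_pct = {"I": 20.0, "II": 18.0, "III": 12.0, "IV": 8.0, "V": 2.5}

turnover = 10_000_000  # hypothetical annual sales, in pounds

for stage, pct in actual_cost_pct.items():
    cost = turnover * pct / 100
    print(f"Stage {stage}: actual cost of quality = £{cost:,.0f}")

# Annual saving from reaching Stage V, relative to Stage I.
saving = turnover * (actual_cost_pct["I"] - actual_cost_pct["V"]) / 100
print(f"Implied annual saving from Stage I to Stage V: £{saving:,.0f}")
# → Implied annual saving from Stage I to Stage V: £1,750,000
```

On these (hypothetical) numbers, the saving dwarfs what most quality programmes cost, which is exactly Crosby's point that "quality is free".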

Table 2.1 above specifies the five evolutionary phases of development which an organisation is likely to go through in the course of attaining maturity. These phases go from uncertainty through awakening, enlightenment and wisdom to certainty. Although Crosby provided the basic framework on which most maturity models were developed, his work is aimed at improving the quality of goods and services. Notwithstanding that security is considered a primary system quality characteristic[37], the QMMG cannot be used to evaluate and improve an information system. It does not cover the technicalities and other key aspects of information security, which include technology, governance, process and risk management; it focuses only on service and product quality management and improvement. As seen in table 2.1, the QMMG describes the features of processes at each level of maturity; it does not spell out core manufacturing activities or key process areas (as seen in the CMM for software) which actively influence maturity. Nor does it specify the metrics for properly calculating the cost of quality.

Overview of the CMM and CMMi

The initial Capability Maturity Model (CMM) for software is also known as, and used interchangeably with, the Software Capability Maturity Model (SW-CMM). This is the software model upon which many maturity models were built[65]. It was developed to address the problems in software development processes and to evaluate the capability of software companies[45]. The lack of a consistent, well-defined approach to software development and management accounted for the loss of gains (i.e., unmet software goals) which a mature software process could probably have secured[16]. In an attempt to address this, the Software Engineering Institute of Carnegie Mellon University developed the Capability Maturity Model, in response to the U.S. government's need for a way of assessing the software process maturity of contractors[45].
The main aim of the Capability Maturity Model was to help software organisations gain good control of their software development and management processes[45]. It achieves this by providing them with standard, well-defined stages of best practice that, if followed consistently, will lead them to maturity. However, the CMM and its successor model (the CMMi) have their limitations in software process management. They do not provide satisfactory means for handling issues relating to software

maintenance[8]. This limitation necessitated the Software Maintenance Maturity Model (SMmm), which specifically identifies and addresses those unique maintenance-specific concerns, and which can be used to evaluate and improve the capability of software maintenance suppliers. The CMM framework was originally inspired by Crosby's quality management maturity grid[16]. However, the model was based on Watts Humphrey's process maturity framework, as described in his book Managing the Software Process[32]. It is structured into five evolutionary stages with key process areas that must be achieved cumulatively in order to attain a given maturity level[45]. This type of structuring is termed the 'staged' representation, as also featured in the CMMi[62]. The five stages of the Capability Maturity Model, their key process areas and their characteristics, as developed by the SEI, are as follows:

Level 1: Initial
- Key process areas: none.
- Characteristics: the software process is characterised as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort.

Level 2: Repeatable
- Key process areas: software configuration management; software quality assurance; software subcontract management; software project tracking and oversight; software project planning; requirements management.
- Characteristics: basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.

Level 3: Defined
- Key process areas: peer reviews; intergroup coordination; software product engineering; integrated software management; training program; organisation process definition; organisation process focus.
- Characteristics: the software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software.

Level 4: Managed
- Key process areas: software quality management; quantitative process management.
- Characteristics: detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled.

Level 5: Optimised
- Key process areas: process change management; technology change management; defect prevention.
- Characteristics: continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.

Table 2.2: The Five Levels of Software Process Maturity[45]
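The cumulative gating of the staged representation can be sketched briefly. This is a minimal illustration, not part of the CMM itself; the KPA names are abbreviated from Table 2.2, and the sample assessment is invented for the example:

```python
# Staged-representation gating: an organisation attains a maturity level
# only if all key process areas (KPAs) of that level AND of every lower
# level are satisfied. KPA names abbreviated from the SW-CMM; the sample
# assessment below is invented for illustration.

KPAS_BY_LEVEL = {
    2: {"configuration mgmt", "quality assurance", "subcontract mgmt",
        "project tracking", "project planning", "requirements mgmt"},
    3: {"peer reviews", "intergroup coordination", "product engineering",
        "integrated software mgmt", "training program",
        "process definition", "process focus"},
    4: {"quality mgmt", "quantitative process mgmt"},
    5: {"process change mgmt", "technology change mgmt",
        "defect prevention"},
}

def maturity_level(satisfied_kpas):
    """Return the highest level whose KPAs, cumulatively, are all met."""
    level = 1  # Level 1 (Initial) has no KPAs and is always attained
    for lvl in sorted(KPAS_BY_LEVEL):
        if KPAS_BY_LEVEL[lvl] <= satisfied_kpas:
            level = lvl
        else:
            break  # levels are cumulative: stop at the first unmet one
    return level

# All level-2 KPAs plus only some level-3 ones → still level 2.
sample = KPAS_BY_LEVEL[2] | {"peer reviews", "training program"}
print(maturity_level(sample))  # → 2
```

The `break` encodes the staged rule: satisfying scattered higher-level practices (here, two level-3 KPAs) earns no credit until every lower-level KPA is in place.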

In table 2.2 above, each level of maturity shows its key process areas. These are the sensitive areas which actively influence an organisation's software process capability. Reaching the goals of each key process area demands implementing all the practices required for that process area, and collective achievement of the practices of those key process areas means the organisation has achieved that particular level. This notion, however, does not undermine the contributions or effects of other process areas on an organisation's software process capability; rather, these key process areas form the indispensable activities which must be properly handled for a level to be achieved[45].

Due to the SW-CMM's success in software process improvement, the value experienced from it encouraged the rise of many other capability maturity models in various disciplines[20, 23, 62]. These include the Software Acquisition CMM, Systems Engineering CMM, Systems Security CMM, ISO 15504, People CMM, and so on[62]. This development led to various inconsistencies and difficulties in integrating more than one maturity model in a single improvement programme, owing to their varying structures, formats and definitions of maturity[62]. Consequently, the idea of Capability Maturity Model Integration (CMMi) arose in an effort to address this problem, and to provide a framework for integrating future capability maturity models[62].

While its predecessor (the CMM for software) is specifically tailored to software issues, the CMMi claims to support all purposes[58]: developing products and services, as seen in CMMI for Development (CMMI-DEV)[63]; delivering and managing services, as seen in CMMI for Services (CMMI-SVC); and acquiring products and services, as seen in CMMI for Acquisition. From the security perspective, the idea of using the CMMI to effectively improve all processes in a project, or for conducting organisation-wide evaluation, is still questionable. This is because the model has barely considered security-specific issues, as seen in one of its integrated security models, the Systems Security Engineering CMM (SSE-CMM)[58]. This subset model focuses on security engineering, i.e., security while executing information systems projects[31]. It does not cover in detail security concerns after project execution, such as issues around the human factor, risk management

and so on. Hence, one can argue that it might not be an effective tool for completely evaluating and improving the information security maturity of an organisation.

The CMMI, like its predecessor the CMM for software, has a staged representation[45, 58]; it also includes a continuous representation. Both representations have the same contents, but are categorised as such to give organisations the option of choosing the ideal improvement route for their operations[58]. This can be considered a major strength of the CMMI over other maturity models. In the continuous representation, functions are grouped according to their process areas. This gives organisations the opportunity to choose the process area that best matches and supports their business objectives, and enables comparisons across and among organizations on a process-area-by-process-area basis[58]. Figure 2.1 below shows the continuous view of the CMMI:

Figure 2.1: Continuous representation of the CMMI[58]

Observing figure 2.1 above, we can view the Process Management, Project Management, Engineering and Support categories. These categories have their process areas (with activities under each) which must be achieved for attainment of a maturity level. As clearly shown above, the CMMI has not considered information security important enough to adopt it as a unique category with its own process areas. The staged representation, in contrast, categorises a variety of activities/functions into maturity levels, as shown in figure 2.2 below:

Figure 2.2: Staged representation of the CMMI[58]

The CMMI, like the CMM, adopts a cumulative approach in its ascent to maturity. This demands that lower levels of maturity must be attained (i.e., not skipped) before a higher one is ever achieved[63]. Unlike the CMM, which has five levels of maturity, the CMMI extends to six levels, as shown in table 2.3 below:

Level 5 - Optimising: Focus on continuous process improvement.
Level 4 - Quantitatively Managed: Process measured and controlled.
Level 3 - Defined: Process characterised for the organization and is proactive.
Level 2 - Managed: Process characterised for projects and is often reactive.
Level 1 - Performed: Process unpredictable, poorly controlled, and reactive.
Level 0 - Incomplete

Table 2.3: Capability Maturity Model Integration[58]

Having been widely accepted for process improvement in both the private and public sectors, the CMMI has benefitted many world-class organisations and government agencies, including Accenture, the U.S. Air Force, Samsung, J.P. Morgan, Bank of America, Intel, and so on[58]. Notwithstanding its high acceptability, it is a more generalised model and does not concentrate on, or comprehensively treat, issues relating to information security. Another model which has had a good shot at software process improvement like the CMM is the ISO/IEC 15504 model.
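Before moving on, the cumulative (staged) attainment rule shared by the CMM and CMMI can be sketched in code: an organisation sits at the highest level whose key process areas, and those of every lower level, are all satisfied. This is an illustrative reading of the rule only; the process-area names below are abbreviated examples, not the full SEI lists from tables 2.2 and 2.3.

```python
# Hypothetical sketch of the cumulative ("staged") maturity rule: a level is
# attained only if its key process areas AND those of every lower level are
# satisfied. KPA names are illustrative placeholders, not the full SEI lists.
REQUIRED_KPAS = {
    2: {"project planning", "configuration management"},
    3: {"peer reviews", "training programme"},
    4: {"quantitative process management"},
    5: {"defect prevention"},
}

def maturity_level(satisfied_kpas):
    level = 1  # Level 1 (Initial) has no required process areas
    for lvl in sorted(REQUIRED_KPAS):
        if REQUIRED_KPAS[lvl] <= satisfied_kpas:  # all KPAs of this level met
            level = lvl
        else:
            break  # a gap at this level blocks all higher levels (no skipping)
    return level

print(maturity_level({"project planning", "configuration management",
                      "peer reviews", "training programme"}))  # 3
```

Note how satisfying the level 5 areas alone would not raise the result: the `break` enforces that lower levels cannot be skipped, which is precisely the staged property described above.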

2.2.3 ISO/IEC 15504 Model

The ISO/IEC 15504 model was jointly developed by the International Organisation for Standardisation (ISO) and the International Electrotechnical Commission (IEC)[38]. Though both the CMM and the ISO/IEC 15504 model are used for software process assessment and capability determination, the two differ slightly in their structuring. The CMM groups key process areas across the maturity levels: all the key process areas under a given maturity level are the requirements which must be achieved in order to attain that level, as seen in table 2.2. Taking a different approach, ISO/IEC 15504 has two dimensions: the process dimension and the capability dimension. The process dimension is made up of three process categories, nine process groups and forty-eight processes, while the capability dimension comprises six capability levels. For clarity, figure 2.3 below shows the process categories, process groups, and processes.

Figure 2.3: ISO/IEC 15504 model process categories and process groups[38]

As seen from figure 2.3, information security is not acknowledged in this model as deserving a process category, or at least a group. The model focuses solely on software business improvement, and has no defined security processes or measurement indicators which can be used to appraise an organisation's information security capability.

The ISO/IEC 15504 model has six capability levels, with a clear description of how each lower level connects to the next higher level (i.e., cumulative progress). Unlike in other capability maturity models, such descriptive links between the capability levels are nicely made in the ISO/IEC 15504 model, as seen in table 2.4 below.

Level 5 - Optimising process: The previously described Predictable process is continuously improved to meet relevant current and projected business goals.
Level 4 - Predictable process: The previously described Established process now operates within defined limits to achieve its process outcomes.
Level 3 - Established process: The previously Managed process is now implemented using a defined process that is capable of achieving its process outcomes.
Level 2 - Managed process: The previously Performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.
Level 1 - Performed process: The implemented process achieves its process purpose.
Level 0 - Incomplete process: The process is not implemented, or fails to achieve its process purpose (there is little or no evidence of any systematic achievement of the process purpose).

Table 2.4: ISO/IEC 15504-2 Maturity Model[38]

Table 2.4 above shows the six capability levels an organisation's software process will go through to achieve maturity: from incomplete, through performed, managed, established and predictable, to optimising processes. We can further observe that Level 1 as witnessed in the CMM appears to be split into two levels here: level 0 and level 1. This separation can be argued to be reasonable. Some organisations fail in their objectives because they do not follow procedures/processes in their activities - these can be likened to level zero organisations. Others achieve accidental success through a process that has not been systematically organised for repeatability - these can be likened to level one organisations.

The ISO/IEC 15504 model might be viewed as more effective in assessing processes than the CMM, since it clearly identifies and evaluates process categories separately across capability levels. However, the extension of the CMM into the CMMI to include some of the features of ISO/IEC 15504 (e.g., the same number of capability levels), while retaining the values of the CMM, has given the CMMI greater

acceptability[58]. Still, neither is an effective measurement and improvement tool for security, because neither has established information security as a domain in its structure. The ISO/IEC 15504 model has been used to conduct many assessments, and has also been extended into other domain-specific models such as Automotive SPICE and SPICE for SPACE[11, 17]. However, none of the ISO/IEC 15504 series of models has considered information security as a domain for comprehensive information security assessment and improvement. Nevertheless, there are some existing maturity models for assuring information security. One of them is the HMG Information Assurance Maturity Model.

2.2.4 HMG Information Assurance Maturity Model

In response to the Transformational Government and Shared Services initiatives, information sharing amongst UK government departments became necessary. More important still was the need for uniform standards to ensure that shared information is handled properly and securely. As a result, the HMG information assurance (IA) maturity model and assessment framework was developed. It serves as a means by which every participating department can assess, ensure and be assured of the maturity of the others[9]. The HMG Information Assurance Maturity Model adopts five levels of maturity, like the CMM and Crosby's maturity grid[9, 21, 45], as shown in table 2.5 below:

Level 1 - Initial: Awareness of the criticality of IA to the business and legal requirements.
Level 2 - Established: IA processes are institutionalised.
Level 3 - Business Enabling: IA processes are implemented in critical areas of the business.
Level 4 - Quantitatively Managed: The number of corporate exceptions to implementing IA processes is known and reported.
Level 5 - Optimised: Responsive IA processes are integrated as part of normal business.

Table 2.5: The HMG Information Assurance Maturity Model[9]

From table 2.5, we see that level one of the HMG IA maturity model portrays a stage where there is awareness of information security and an attempt to improve it by establishing policy. This awareness state is what is obtainable from

level two of the previous models. Their level one has always represented a maturity state where activities/processes are uncoordinated and ad hoc, and where the awareness and interest of management are entirely lacking[45, 63]. Secondly, the HMG IA maturity model is detailed in its approach of focusing on information security, identifying the following key process areas[9]:

- Leadership and Governance
- Training, Education and Awareness
- Information Risk Management (IRM)
- Through-Life IA Measures
- Assured Information Sharing and Compliance

Under each of these process areas are ''Areas to Probe'', which are responsibilities, and their ''Evidence Expected'', which is the proof expected to show that a responsibility has been met[9]. This maturity model has such a good focus on information security that it clearly defines the levels of compliance for these responsibilities, and the evidence expected, across the five maturity levels. Hence, it is easy to map or benchmark an organisation's activities against these clearly stated process areas.

Notwithstanding the detailed work done, the HMG IA maturity model is still not the destination as an all-encompassing security maturity model. Though it treats information risk management (IRM) carefully, the technology aspect of information security is not addressed, as seen in its list of process areas. As we know, system architectural design, controls for supporting security, their flexibility for innovation, and everything else that comes under technology are critical factors which influence information security. The HMG IA maturity model has barely made distinct specifications to that effect. Meanwhile, there are other security maturity models which are worth reviewing, as seen in the next section.

2.3 Security Maturity Models

The Systems Security Engineering Capability Maturity Model (SSE-CMM) is a security maturity model which adopts a similar context to the Software Engineering Capability Maturity Model[61, 67].
The SSE-CMM, just like every other software CMM, was

designed for process improvement, contractor selection, and so on. More importantly, it claims to offer a solution of reduced cost at higher capability. Unlike the other security maturity models discussed in this section, the SSE-CMM has six maturity levels: not performed, performed informally, planned and tracked, well defined, quantitatively controlled, and continuously improving[31]. The SSE-CMM defines three process area categories, with a number of processes/activities under each: the engineering, project and organisational process areas. It is worth noting that the main focus of the SSE-CMM is to improve systems security engineering, especially during acquisition. It has barely considered some critical aspects of information security such as the human factor and continuous risk management.

Control Objectives for Information and Related Technology (COBIT), a framework created by the Information Systems Audit and Control Association (ISACA), has its own security maturity model. The model, derived from the SE-CMM, focuses on managing information technology (IT) and IT governance[10]. The COBIT maturity model defines five maturity levels, with a main focus on auditing procedures. COBIT IT processes integrate application controls into business processes, because the framework assumes these controls are governed by the business process owners. These IT processes are described in the COBIT maturity levels. Table 2.6 below compares the levels with those of other security models.

The National Institute of Standards and Technology (NIST) maturity model focuses on documentation of procedures[43, 67]. It defines five maturity levels whose contents are similar to those of the HMG IA maturity model: the first level realises security needs and establishes policies; in the second, procedures/processes are laid out according to the policies established; the third level deals with implementing the procedures/processes; and so on.
Unfortunately, neither model recognises a level at which processes are uncoordinated and ad hoc. Both depict the first level as a level of awareness, which is the characteristic of level 2 in most earlier software maturity models. Table 2.6 below shows a summary of these and other security maturity models:

Model: SSE-CMM
Levels: 0. Not performed; 1. Performed informally; 2. Planned and tracked; 3. Well defined; 4. Quantitatively controlled; 5. Continuously improving
Focus area: Security engineering and software design

Model: COBIT Maturity Model
Levels: 1. Initial/ad hoc; 2. Repeatable but intuitive; 3. Defined process; 4. Managed and measurable; 5. Optimised
Focus area: Auditing of procedures

Model: NIST Maturity Model
Levels: 1. Policy; 2. Procedure; 3. Implementation; 4. Testing; 5. Integration
Focus area: Checks level of documentation

Model: Citigroup's Information Security Evaluation Model (CITI-ISEM)
Levels: 1. Complacency; 2. Acknowledgement; 3. Integration; 4. Common practice; 5. Continuous improvement
Focus area: Organisation awareness and adoption

Model: CERT/CSO Security Capability Assessment
Levels: 1. Exists; 2. Repeatable; 3. Designated person; 4. Documented; 5. Reviewed and updated
Focus area: Measurement of quality in relation to levels of documentation

Table 2.6: Summary of Security Maturity Models[19, 66]

Though we have these security maturity models in place, an observable limitation is that they are customised to the organisations that developed them. This makes it difficult to use them to evaluate and improve the information security status of a variety of organisations. Examples of such cases are the COBIT model from ISACA, the NIST model from the National Institute of Standards and Technology, and so on.

Finally, one other recent work on information security maturity modelling is that by Malik Saleh. His information security maturity model defines five levels of maturity: non-compliance, initial compliance, basic compliance, acceptable compliance and full compliance[54]. Although Saleh incorporates several key security factors as metrics/indicators for evaluation, his survey, and hence his work, has a major limitation in its qualitative methodology. The questionnaire uses a scale of 1 to 5 to gather respondents' opinions of situations, leaving the respondents alone to judge and score. Results will therefore likely be influenced by the subjective

opinion of the respondents. For credible results that yield reliable findings or appraisals, practical events, practices and so on should be considered and quantified objectively.

2.4 Economics of Security

Economics of security, as a term, generally concerns the perceived gain achieved by investing in security. 'Return on investment' might be a closely related term, portraying the profit made out of any investment. However, Bruce Schneier, like many others[2, 18], disagrees with the appropriateness of this term in the context of information security. He argues that security, unlike other business investments, does not provide a return but indirectly ensures one by preventing losses[56]. These financial and reputational losses are what security investments prevent, thereby saving the costs that would otherwise have been incurred in the absence of security.

Discussing the economics of security involves determining the total cost of security for organisations, both those who invest and those who do not invest in information security. By comparing the economic status of similar organisations (one investing and the other not), the perceived gain of implementing security is determined. The lifecycle of a security investment includes the initial cost of procuring the security solutions, then maintaining and updating them; the cost of remedying breaches and incidents; the cost of business continuity plans; the cost of organising security awareness training; the cost of paying legal fines and compensation; and other indirect costs such as business and time loss, together with reputational loss (see figure 4.5). The sum of all these costs over a given security solution's lifecycle is seen as the total cost of security. For organisations that do not invest at all in security, the total of all costs borne tackling incidents is deemed their total cost of security.
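The total cost of security just described is simply a sum over cost components, and the perceived gain of investing compares that total between an investing and a non-investing organisation. The sketch below illustrates this arithmetic; the component names and pound amounts are made-up assumptions for illustration, not data from this project.

```python
# Illustrative sketch: total cost of security as the sum of lifecycle cost
# components named in the text. All figures are hypothetical (in pounds).
investing_org = {
    "procurement": 40_000,             # initial purchase of security solutions
    "maintenance_and_updates": 15_000,
    "incident_remediation": 10_000,
    "business_continuity": 8_000,
    "awareness_training": 5_000,
    "fines_and_compensation": 0,
    "indirect_losses": 7_000,          # business/time loss, reputational loss
}
# A non-investing organisation bears only incident-driven costs, which may
# exceed the investment it avoided.
non_investing_org = {
    "incident_remediation": 70_000,
    "fines_and_compensation": 25_000,
    "indirect_losses": 40_000,
}

total_investing = sum(investing_org.values())          # total cost of security
total_non_investing = sum(non_investing_org.values())
perceived_gain = total_non_investing - total_investing  # saving from investing
print(total_investing, total_non_investing, perceived_gain)
```

With these illustrative figures the investing organisation's total cost of security is lower, and the difference is the perceived gain of implementing security described above.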
Many security reports, articles and academic works have dealt with the economics of security from different perspectives[4, 27, 42]. While some concentrated on investigating the general cost of cybercrime or security incidents[28, 35], many engaged in identifying and separating the costs of various incident types[6, 34]. Others took the path of computing the return on security investment (ROSI)[2, 59]. Table 2.7

below outlines some of the key materials on security economics that were explored, and their areas of focus.

Ponemon Institute - "2012 Cost of Cyber Crime Study: United Kingdom"[35]: focuses on the costs of cybercrime by country, industry, nature of crime, etc.
PricewaterhouseCoopers - "2013 Information Security Breaches Survey"[49]: outlines the different types of incidents/breaches and their estimated costs by year, etc.
Ross Anderson - "Measuring the cost of cybercrime"[6]: gives a global view of the various cybercrime types and their estimated costs.
Computer Security Institute - "2010/2011 Computer Crime and Security Survey"[34]: specifies attack types and their rate of change over the years and by industry, etc.
European Network and Information Security Agency - "Introduction to Return on Security Investment"[2]: focuses on Return on Security Investment (ROSI) calculation.
Lawrence A. Gordon - "The Economics of Information Security Investment"[27]: focuses on determining the optimal amount that should be invested in securing a given set of information.

Table 2.7: Works on Economics of Security

The cost-of-cybercrime study by the Ponemon Institute details the overall costs of cybercrime by country. It goes further to break down the percentage proportions of different types of cybercrime, and also suggests the causes of the variance in the nature and costs of cybercrime across the countries surveyed. The report reveals a positive relationship between the annualised total cost of cybercrime and organisational size. On a per capita basis, however, it estimated larger organisations to be spending eighty-nine pounds (£89), while smaller ones spend three hundred and ninety-nine pounds (£399)[35].

Another good source of security cost estimation is the 2013 Information Security Breaches Survey by PricewaterhouseCoopers (PwC), as shown in table 2.7 above. This technical report provides detailed information on trends in information breaches/incidents. But of importance to this project is how it presents the various

security incidents/breaches and their respective estimated costs. These costs are shown in table 2.8 below:

Table 2.8: Overall costs of an organisation's worst incident[49]

Table 2.8 above makes clear the various types of breaches/incidents and their estimated overall costs. It also sums up all the costs for both small and large organisations, and compares them with figures from other years to determine the rate of change. However, the report did not include, or try to estimate, the cost of initial investments in information security. Nor did it consider the costs spent maintaining and upgrading security mechanisms; it concentrated only on costs that emerged as a result of incidents/breaches.

Ross Anderson et al., in their work "Measuring the Cost of Cybercrime", explain the various forms of cybercrime, their sources and values. Their work concludes by presenting a global view and summary of those various cybercrimes and their respective

estimated costs[6]. They were also careful to demarcate between loose and accurate data, so as to allow a better understanding of the cybercrime cost landscape. Other works on the economics of security, as seen in table 2.7, include those from the Computer Security Institute, the European Network and Information Security Agency, and so on. They all explain, from their own perspectives, issues surrounding security economics, with focus areas as described in table 2.7.

Most of these technical reports and academic studies on the economics of security support the opinion that security investments save costs, thereby leaving a positive impact on an organisation's economic status[7, 56]. They also identified a trend in which larger organisations seem to spend less per capita than smaller organisations[35]. However, this project takes a different approach: not just investigating the various final costs of security, but also determining how far they are affected by security maturity. Chief Information Security Officers (CISOs) typically strive to increase the security budget[14] and invest in information security, but in most cases they do not properly analyse the economic impact of such undertakings on their businesses. Hence, it is important that organisations who aim to succeed and grow economically should quantitatively understand the cost-benefit realities of their investments[24]. In this research project, this worrying concern of how security investments impact organisations economically is addressed.

2.5 Summary

An overview of the term 'maturity modelling' was given. Trends in maturity modelling were also addressed in detail, with a focus on the security models. Finally, the chapter ended with a discussion of the economics of security. Numerous other existing capability/maturity models have not been discussed, due to the scope of this project. They have evolved over the years in many versions and different domains, especially the software engineering domain[65]. Despite the ever-increasing versions and drift of these maturity models towards specialised domains[65], attempts in the information security domain have not taken the economics of security seriously. It would be irrational for high security maturity to be achieved while an organisation's goals are forfeited; there should always be a balance

and alignment of security investments and maturity with an organisation's business goals. To monitor and ensure this balance consistently, an information security maturity model with a good focus on the economics of security (i.e., monetary gain-loss analysis) is necessary.

Chapter 3: Methodology

This chapter presents an overview and the deliverables of this project. It then details the methodology and, finally, the tools used in carrying out the project.

3.1 Overview and Deliverables

As already mentioned, this project entails investigating and establishing the relationship between the maturity of an organisation and its cost of security. To achieve this, an information security maturity model is developed for evaluating and determining the positions of selected organisations on the developed five-layer maturity model. Also, information gathered concerning the economics of security (how much is invested, and the resulting monetary gains or losses after investment) is used to analyse whether security investments actually reduce the cost of security. This was achieved by evaluating six case study organisations: their maturity levels were obtained and their security investments investigated. By comparing the results of the maturity assessments with the eventual economic outcomes of the case studies, various findings were made that answered the research questions. The deliverables of this project are as follows:

- A literature review on maturity models
- A 5-layer cost-of-security maturity model for evaluating the information security maturity of organisations
- An evaluation of where selected organisation(s) are positioned on the developed information security maturity model

3.2 Methodology

Different methodologies with which the project could be carried out to answer the research questions were considered. Table 3.1 below shows these three methodologies and the stages of activities under each.

Quantitative approach: 1. Data collection (quantitative); 2. Develop model; 3. Evaluate organisation(s); 4. Analyse results; 5. Conclude and write report.
Qualitative approach: 1. Literature research; 2. Opinion poll; 3. Sort results; 4. Conclude and write report.
Mixed approach: 1. Preliminary preparation (literature research and event participation); 2. Data collection; 3. Data validation and analysis; 4. Develop model; 5. Test model and evaluate organisation(s); 6. Conclude and write report.

Table 3.1: Research Methodologies

The methodologies in table 3.1 above are briefly described as follows:

Quantitative Approach

The stages of activities involved in this approach are:

- Data collection: collecting primary data from organisations.
- Develop model: the maturity model is then developed at this stage.
- Evaluate organisation(s): evaluation of case studies is done here, relying solely on the data they provide to determine their maturity levels.
- Analyse results: the results of the assessment are analysed to identify trends.
- Conclude and write report: a conclusion is drawn and the report articulated.

This would have been an ideal approach for this project if respondents/security experts found it convenient to report their security issues in detail. Unfortunately, this has not been the case in the security industry, where security experts reveal their security issues with caution and hesitation[5]. Relying solely on data collected from respondents through this approach might therefore greatly reduce the accuracy of the project outcome, because responses might be loose (neither accurate nor specific). There is also the chance of respondents not responding in as much detail as the research demands, in which case the researcher might not have enough data to draw conclusions.

Qualitative Approach

This approach involves the following activities:

- Literature research: gathering arguments and opinions from previous research works and technical reports.
- Opinion poll: gathering opinions on the research questions from experts in industry and academia by means of a survey.
- Sort results: the results of the survey are then sorted and the leading views statistically identified.
- Conclude and write report: a conclusion is drawn based on the leading views from the survey.

As seen in this brief sequence, the approach is not ideal, since conclusions based on literature research, security reports and the opinions of experts are prone to ideological bias. Exploring practical situations in organisations is vital to gaining a good understanding of their security problems and economic realities. Based on the limitations inherent in these two approaches, the mixed approach was considered appropriate for this research topic, for the following reasons.

Mixed Approach

The mixed approach entails collecting primary data from organisations (using a survey) and supplementing that with data obtained from literature research. This mixed approach ensured that quality data, which lends credibility to the research project, was used: while the primary data allowed the realities of the assessed organisations to be felt, the secondary data sources ensured that outliers/deviations in their responses were checked. The following stages of activities constitute the methodology of the project; each is detailed in its own subsection.

3.2.1 Preliminary Research

Before proceeding with the project, time was taken to prepare, reflect and research, so as to understand what the project is all about, the best ways to conduct it, and the main project focus. The following preliminary activities, which prepared me for success in this project, were undertaken:

40 Background reading Detailed systematic literature review of previous capability maturity models was conducted, with focus on the security maturity models. Information security requirements as in the relevant information technology/security standards were also explored and understood. Some of the materials used in this process include articles in journals, digital libraries (e.g., IEEE Xplore, ACM etc.), textbooks, conference proceedings, articles on reputable websites (e.g. BBC, The Telegraph etc.). Online daily and weekly newsletters from security industry (e.g., searchsecurity.com) continuously remained sources of current information too. These enabled me gain understanding of the project topic in a wider context. Participation in seminars and events in IT/security industry By participating in seminars, other intellectual events and also following trends in the security industry, I gained good insight into this topic/project. These events were often organised by IT/security professional bodies that I belong to. Some of these bodies include the British Computer Society (BCS), the Institute of Information Security Professional (IISP), Internet Society (ISOC), and so on. By interacting with professionals and sharing in their opinions and experiences, I was able to expand my awareness and align with trending issues in the security industry Data Collection The data collection techniques and sources for this project include: use of questionnaires I considered and used questionnaires to gather primary data from organisations due the sensitivity of this project. Not many organisation, and in fact few, were willing to grant an interview session. Hence, questionnaire was an easy approach to extend my survey to as many respondents as possible, irrespective of their locations. 
Although some other approaches were used as supplements, the reasons for using the questionnaire as the major data collection approach are as follows. Firstly, questionnaires allow respondents the convenience and time to provide well-informed and verified responses. This is because the physical

presence of interviewers could impose undue pressure on respondents, which might result in quick, inaccurate answers being provided under tension. Secondly, requests to schedule physical visits to some organisations were declined, as were requests for physical interview sessions: they would not permit an external, unknown researcher to access their confidential and sensitive data or witness their operations live, for fear of losing their business strategy or reputation. Hence the major reliance on the questionnaire for collecting data. Thirdly, the time available for the project did not allow for an extensive push to obtain interview sessions from organisations spread worldwide, which the questionnaire was able to reach.

The data collection process lasted three months, while other project activities were in progress. The first batch of questionnaires was distributed electronically in April 2013 to organisations in the government and financial sectors. Some banks I had written to referred my request to their head offices for attention; I therefore mailed hard copies of the questionnaire with covering letters to the head offices of those banks. Follow-up emails and calls (where applicable) were used to remind reluctant respondents to fill in the questionnaires. Two months later, in June 2013, the questionnaires were distributed extensively using a collection of email addresses of organisations from all over the world; organisations in the Fortune 500 list were largely contacted with the survey questionnaire.

Interviews (via Skype and phone)

Given the difficulty of arranging live interview sessions with organisations, phone and Skype calls became useful. By this means, some willing respondents were at ease to grant anonymous interviews outside their corporate environments. This approach was helpful to the project, as some who could not respond due to company policy were more responsive through this channel.
Some data for the project were also gathered through scheduled phone and Skype calls.

Information security survey reports

Several annual information security reports served as key sources of secondary research data. These include the 2012 Cost of Cyber Crime Study: United Kingdom[35], the 2013 information security breaches survey by PricewaterhouseCoopers (PwC)[49], the 2013 data breach investigations report by the Verizon RISK Team[64], and the 2013 Cisco Annual Security Report[33]. I used these materials to supplement the primary research data collected through the surveys.

Ross Anderson argues in his paper 'Why cryptosystems fail'[5] that information security is one of the fields in science that suffers the greatest challenges and setbacks. He explains that this is because organisations are usually very secretive about their security incidents, attacks and failing operations. They do this for fear of reputational damage[28], not understanding that such attitudes hinder collaborative and cumulative efforts towards building solutions that would prevent similar attacks on other organisations. Hence, several instances of a particular attack hitting different organisations from time to time, without remedy, are witnessed in the information security field. This same reason accounts for the relatively small number of responses collected from the large number of questionnaires distributed for this project. In addition to the mitigating approaches already mentioned, I embarked on background research into selected organisations through reliable sources such as the internet, media outlets like CNN and the BBC, other reputable newspapers, and organisations' annual reports (where obtainable). This enabled me to gather useful information about such organisations to supplement what they had provided through the survey.
Notwithstanding, the primary data collected through the survey were carefully validated, as described in the next section.

3.2.3 Data Validation and Analysis

The validity of research data can greatly influence the final outcome of a research project. It is therefore natural to be concerned about how valid and truthful the data provided by respondents are. Nevertheless, I was optimistic that the data were of good quality, since any respondent likely to provide dishonest answers

could as well have avoided granting the interview or filling in the questionnaire in the first place. Most importantly, I used some credible security reports, as previously mentioned, to verify the validity and consistency of the collected data. These survey-based reports reflect the yearly status quo of organisations, governments and so on as regards information security breaches, incidents, costs incurred on fines, and the like. The bodies and teams who conduct these surveys work in close collaboration with security service organisations, government security agencies and others, which contribute continually to their information databases[49, 64], offering first-hand information collected during forensic investigations and other rescue services to victim organisations[64]. Interestingly, a greater percentage of the collected data was found to be consistent with these yearly information security reports. Lastly, my years of personal experience in both the private and public sectors, in network administration, IT service delivery, and hardware/software implementation and maintenance, were another knowledge asset used in scrutinising and validating these responses. Hence, I am confident that these data are of good quality and that proper analysis was carried out on them.

Other activities at this stage included sorting and arranging the data into a form ready for use. Microsoft Excel was very useful during this sorting and presentation of data for analysis. The different currencies in which respondents reported figures were converted into pounds sterling for consistency, open-ended responses were summarised, and so on.

3.2.4 ISMM development

An iterative procedure is considered effective[13, 46], and was used in developing the ISMM.
Through this step-by-step approach, I initially articulated the draft model based on the extensive literature research I had done on existing security models and on views gathered from security professionals. Secondly, I improved the ISMM by going through the requirements specified in ISO/IEC 27001 and ISO/IEC 27002, ensuring that the 'clauses' and 'security categories' therein were well reflected. The maturity model then went through a review session with my supervisor, whose challenging questions and guidance helped me re-align the model for the better. Lastly, a feedback survey form was designed (see Appendix D) and distributed to security experts in industry and academia. Their inputs and contributions

in no small measure helped me apply a final professional touch of improvement to the information security maturity model. While the evaluation approach used for this project is discussed next, detailed information on how the ISMM was developed is found in chapter 4.

3.2.5 ISMM Testing/Project Evaluation

In the project evaluation stage, the ISMM was tested for effectiveness. Three evaluation strategies were proposed and used for testing the developed ISMM. The first involves evaluating the case studies over a period of two years, so as to allow the influence of the model to be observed (see section 4.3). The second involved security experts in industry and academia who reviewed, criticised and submitted feedback on the model; this was achieved through distributed surveys. In the last approach, results obtained using the developed ISMM were compared with those obtained using other published, proven security models. Consistency between the different evaluation outcomes confirmed the effectiveness of the developed ISMM. Detailed explanation of these evaluation strategies is given in section 4.3. The tools which helped to accomplish this project are discussed in the next section.

3.2.6 Conclusion and report writing

During this stage, conclusions were drawn from the assessment conducted on the case studies, and the dissertation report was prepared. The writing of the dissertation report took forty-three (43) days, as reflected in the project plan Gantt chart.

3.3 Project Tools

The tools or instruments used for this research project are divided into two groups for clarity: data collection tools and communication tools.

3.3.1 Data Collection Tools

The following data collection tools were used:

Google Drive (online questionnaire)

An electronic version of the questionnaire was used, as it enabled a wider audience to be reached faster.
It also ensured the safe return of completed questionnaires. Besides being free, the Google Drive questionnaire helped organise the survey results in a

Microsoft Excel template, which is easy to handle and analyse. Another advantage over the hard-copy questionnaire is that respondents would find it more convenient to complete by simply checking the option boxes. It also allowed them the flexibility to edit their responses in the event that they obtained more or better information. Most importantly, it allowed me to modify the questionnaire at some point to include some vital questions initially omitted.

First, I created a free Gmail account, which gave me access to the 'Google document' feature that was used to develop the questionnaire. Before then, I had taken time to articulate all the necessary questions on paper in collaboration with my colleague, who undertook the same project topic independently; two good heads, they say, are better than one. Though the initial drafting was team work, further fine-tuning of the questionnaire came through individual effort. The online questionnaire was then developed to include all the necessary questions. A link to it was distributed via email to respondents, with follow-up reminders sent every other week. As always, initial responses and comments on the survey led to further fine-tuning of the online questionnaire to include some key security factors, such as 'risk management', which had initially been omitted. In addition, hard-copy questionnaires were used as a supplement.

Paper questionnaires

The paper questionnaire was used to reach some organisations whose email addresses could not be obtained from their websites or through other means. Such organisations, especially in the financial sector, were contacted through their online complaint channels. Most of them responded with their head office addresses, directing my request there. Consequently, I resorted to sending paper questionnaires with covering letters to those addresses.
This situation proved that paper questionnaires can still be indispensable, despite the strong preference for online versions in present-day research.

3.3.2 Communication Tools

Phone

The number of responses obtained in this project, despite the general difficulty in obtaining security information, was made feasible by extensive use of the phone to follow up with respondents. Repeated calls were made to some respondents, reminding them to assist by filling in the questionnaires.

Emails

Electronic mail was used extensively for distributing the questionnaires, sending reminders and communicating with my supervisor for feedback. It is worth noting the effectiveness of emails over telephone calls in some cases: almost all the emails I sent using the official university/academic domain received attention and were at least replied to.

Skype

Some respondents in distant locations were able to grant brief Skype calls, through which I gathered further insight into the activities of their organisations. Several brainstorming sessions with fellow researchers, during which we resolved pressing challenges and issues, were also made possible via Skype video calls. All those Skype communications were free of charge, which encouraged my extensive use of the tool.

3.3.3 Report Writing Tools

Microsoft applications were used in preparing this dissertation report: Microsoft Word for writing the report and drawing figures and tables; Microsoft Project for preparing the project Gantt chart; and Microsoft Excel for sorting the survey results, drawing charts and computing the case study organisations' maturity levels.

3.4 Summary

This chapter has described the methodology and tools used in carrying out the project. The first section gave an overview of the project and presented the project deliverables. The various stages of the methodology were explained in the second section, which highlighted the iterative method used in developing the model. The third and last section described the project tools.
With this clear description of the project methodology, the next chapter details the specific activities involved in developing the maturity model.

Chapter 4: ISMM Development and Evaluation

This chapter comprises three main sections. The first section details the iterative procedure and activities involved in developing the ISMM. The second section describes the ISMM and its components in detail, while the final section explains how the model was evaluated.

4.1 Iterative Development of ISMM

An iterative procedure has proven effective in delivering maturity models that fulfil their intended purposes[13]; the success track record of this approach is evident in other key, popular existing models[13]. I therefore used an iterative procedure to develop the ISMM, while adopting major components similar to those used in developing previous renowned models. Some of these components are discussed under the four distinct phases of the procedure shown in Figure 4.1 below:

[Figure 4.1: Iterative Procedure for ISMM Development — four phases: (1) literature research/consultations, (2) application of security standards, (3) supervisor's review, (4) review by industry experts, leading to the final ISMM.]

Phase 1 - Comparison of existing maturity models

To articulate the first draft of the ISMM, many existing maturity models were compared carefully with the aim of identifying strengths that could be adopted. In addition to a considerable level of originality, some good features of existing models were adopted, combined, enhanced and applied to the ISMM. The maturity models reviewed ranged from models on quality improvement, software process improvement, e-governance and e-learning to security models[3, 9, 21, 23, 31, 45, 62]. After this extensive review and comparison, the first draft of the ISMM was achieved, its design influenced in the following ways:

The HMG Information Assurance Maturity Model (IAMM)[9] guided the structuring of the security factors, which are explained in the next sections. Though the HMG IAMM inspired the choice of security factors, the ISMM includes more factors than the HMG IAMM reflects, especially the economics of security component. The factors are Process/Procedure, Security Governance, Technology & Innovation and Risk Management. The last and important factor is 'Economics of Security'. This factor does not contribute to determining the maturity level of an assessed organisation; in the assessment framework (ISAF) it is separated, measured and its results presented in their own section. It is used to understand the economic implications of each maturity level. Most importantly, the HMG model's quantitative approach of defining the evidence and characteristics expected of each level was adopted; this was followed in establishing a firm pattern for quantitatively assessing maturity, as seen in the security assessment framework (see Appendix C).

Crosby's Quality Management Maturity Grid (QMMG) inspired the design and layout of the ISMM.
Key ideas adopted from the QMMG include the number of levels (though specified by the project) and the discrete gradation or ascension of the levels exhibited in the QMMG. Thus, the first level in both models depicts a total absence of capability in their fields, while the fifth and last level shows a state of proactive ability. The approach of determining and expressing the likely economic realities of each maturity level

as exhibited in the QMMG is also seen in the developed ISMM. Hence, a projection of the expected cost of security for each maturity level is estimated accordingly.

The best practices contained in the two security standards (ISO/IEC 27001 and ISO/IEC 27002) were used. These best practices were articulated as the expected characteristics of each security area to be probed, and compliance-level indicators were defined across the maturity levels as appropriate.

The scoring pattern for computing the maturity level of an assessed organisation as used by Malik Saleh[54] was also applied in this project: the average point across all the security areas probed becomes the final maturity point/level of an assessed organisation (see chapter 5 for details on assessment).

Casual consultations and discussions with industry experts during various security events were another source of ideas for this phase. After this first phase, the first draft of the ISMM was produced, which was further refined in phase 2.

Phase 2 - Application of Security Standards

The draft ISMM was further improved using the two industry security standards, ISO/IEC 27001 and ISO/IEC 27002. This was a more demanding process, as I had to examine the two standards carefully to ensure that all the requirements they specify were reflected and considered in the ISMM. That reshaped the ISMM further to include more relevant security areas, with characteristics as specified in the standards. At this stage, it was necessary for my supervisor to review and comment on the artefacts developed so far.

Phase 3 - Supervisor's Review

During this review, my supervisor raised some critical, challenging questions about how discrete the levels are and what makes one level different from another. The review stirred me to modify and enhance the model so that the characteristics defining all the security factors for each level evolve in a more discrete and balanced pattern, and are clear (see Appendix C).
This effort eliminated the chance of an evaluated organisation falling between two maturity levels (i.e., being unable to be placed squarely in a particular maturity level). Having made

several amendments that resulted in a more balanced model, I further sought reassurance by seeking expert opinion about the product.

Phase 4 - Review by Industry Experts

Lastly, the ISMM went through another phase of review by security experts in industry and academia, to obtain their professional opinions, criticisms, contributions and assessments of the ISMM. I designed a feedback form, attached it alongside the two artefacts (the ISMM and the assessment framework), and distributed them to these professionals. Their assessment of the model, which came back positive, was also sought in the survey. The feedback and comments received helped give the ISMM a final professional polish.

The next activity after developing the model was to evaluate it, to ensure the purpose of developing it had been achieved. Before discussing the evaluation process, however, it is important to understand the components of the model and the reasons for including them. These components include the maturity levels and their definitions, the security factors, the security areas, and so on.

4.2 ISMM Components Description

This section describes the components of the ISMM. It describes the maturity levels in the first sub-section, the security factors in the second, and the security areas in the third, and discusses the cost of security component in the last sub-section.

4.2.1 Maturity Levels

This section defines and explains the five evolutionary and cumulative levels which an organisation seeking maturity would need to go through in order to achieve full maturity. These levels, from level one to level five, are defined as Vulnerable, Security Awareness, Basic Compliance, Meeting Requirements and Robust Security respectively. The maturity levels are further illustrated in Figure 4.2 below:

[Figure 4.2: Information Security Maturity Levels (inspired by F.S. Malik, 2011[54]) — Level 1: Vulnerable; Level 2: Security Awareness; Level 3: Basic Compliance; Level 4: Meeting Requirements; Level 5: Robust Security.]

Level 1: Vulnerable

At this level, an organisation is completely unconscious of its security needs. It understands neither the impact of information security on its business nor its weak and vulnerable state. Organisations at this level are usually victims of all kinds of attacks, both targeted and opportunistic. No policies or procedures are adopted by such an organisation, and safety lasts only as long as there is no attack attempt. In a nutshell, organisations at this level are very vulnerable to security attacks.

Level 2: Security Awareness

Organisations at this level understand the role of information security in their business: they understand that information security can be instrumental to business success. Policies and procedures exist, though enforcing them has been a struggle, with little consistency in the way things are done. Operations and security activities at this level are usually temporary and implemented irregularly.

Level 3: Basic Compliance

At this level, security policy and procedures are established, with staff understanding their roles and responsibilities. This is the level at which an organisation makes distinct moves to invest in security: top management is aware of and supports security programmes; staff are scheduled for security training; strategies are devised to ensure staff observe the policy and procedures; and so on. Though security is improved at this level, due diligence and strong compliance are not yet achieved. The management of organisations at this level is willing to buy security solutions and implement controls, but continuous attention to improving and innovating in line with trends is far from their interest.

Level 4: Meeting Requirements

At this level, an organisation is consistent with best practices through active monitoring and review of policy and procedures in line with industry standards. Policies and procedures are followed consciously, and members of staff exhibit a strong information security culture while discharging their duties. Regular internal and external audits are an important management strategy for ensuring security through compliance. Effective risk management characterises such organisations, and security issues are managed centrally.

Level 5: Robust Security

Organisations at this level are more proactive in their security approach. They go beyond active compliance with industry standards into modelling and adopting preventive strategies and controls against possible threats and attacks. Not only are they assumed to be fully secured, they continually research and improve their technologies and practices. Comprehensive and effective risk management is in place. On the rare occasions when incidents or breaches arise, they are identified and resolved without visible impact on business operations. Such organisations are generally agile and sensitive to evolving security challenges, which enables them to plan prompt preventive actions.
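The scoring pattern adopted from Malik Saleh[54], described in section 4.1, averages the points awarded across all probed security areas and maps the result onto these five levels. The following is a minimal illustrative sketch only, assuming a 1-5 point scale per security area and simple rounding to the nearest level; the authoritative scoring rules are those defined in the assessment framework (Appendix C).

```python
# Sketch of the ISMM scoring pattern: each probed security area receives a
# point between 1 and 5, and the organisation's maturity level is the average
# of those points. The area names and scores below are hypothetical.

LEVEL_NAMES = {
    1: "Vulnerable",
    2: "Security Awareness",
    3: "Basic Compliance",
    4: "Meeting Requirements",
    5: "Robust Security",
}

def maturity_level(area_scores):
    """Average the per-area scores and map the result to one of the five levels."""
    if not area_scores:
        raise ValueError("at least one security area must be scored")
    average = sum(area_scores.values()) / len(area_scores)
    # Rounding to a whole level is an assumption here; the refined ISMM is
    # designed so that an organisation does not fall between two levels.
    level = min(5, max(1, round(average)))
    return level, LEVEL_NAMES[level], average

scores = {  # hypothetical assessment results for the eight areas
    "access management": 3,
    "compliance": 4,
    "management of information security": 3,
    "training and awareness programme": 2,
    "roles and responsibilities": 3,
    "through-life technological implementations": 4,
    "risk assessment and treatment": 3,
    "incident and business continuity management": 2,
}
level, name, avg = maturity_level(scores)
print(level, name, avg)  # prints: 3 Basic Compliance 3.0
```

This sketch shows only the averaging step; in practice each area's score is itself derived from the compliance-level indicators in the ISAF.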
Having defined and explained the five maturity levels, into one of which an assessed organisation must fall, I went further to identify the security factors which serve as the indicators for determining the security of an organisation. These security factors, which I articulated in line with the provisions of the two main security standards

(ISO/IEC 27001 and ISO/IEC 27002), were grouped into four assessment categories (see Appendix A). The last category, as seen in the ISMM, is the 'economics of security' category, which assesses the monetary dealings (investments and gains) of an assessed organisation. All these categories are discussed in detail in the next section.

4.2.2 Security Factors

In my research and review of previous work on security maturity models, four key assessment categories have always been prominent[60]; these I call the 'security factors' in this project. They are four security categories, each comprising a number of security areas. Eight different security areas have been articulated under these categories, as seen in Figure 4.4: access management; compliance; through-life technological implementation; management of information security; training/awareness programme; roles/responsibilities; risk assessment and treatment; and incident and business continuity management. Figure 4.3 below shows the four security factors, with a discussion of each to follow.

[Figure 4.3: Information Security Assessment Categories — the four categories under which the different security areas are grouped.]

For information security to be ensured, these four categories are critical and must be handled properly and effectively in line with the standards' requirements. For clarity, I further discuss what these categories entail and the rationale for affirming their criticality to information security.

Security Governance

Information security governance is one of the major areas in which organisations have failed, resulting in failures to ensure information security. Under this factor, management's coordinating ability and other human factors are the main concerns examined: how far management understands the criticality of information security, how it allocates resources and responsibilities, and how it monitors the organisation's alignment with policy, regulations and standards. Poor security governance is the major source of staff irresponsiveness to the demands of information security[48]. Human factor limitations and staff carelessness are a major source of security breaches[26]; if these are coordinated properly, most of the security challenges faced by organisations will be reduced. Current trends in the security industry show that targeted attacks are increasingly becoming the preferred approach of cyber criminals[52]. These attacks are often enabled unconsciously by an organisation's own employees, due to their low security awareness. Hence, the need for organisations to take security governance seriously cannot be over-emphasised: no matter the security mechanisms and controls an organisation implements, if the security governance which raises staff competence is low, the organisation will continue to fall victim to security breaches and incidents. I have articulated three security areas under this factor in the maturity model, to be measured when assessing any organisation to determine its final governance outlook. Another factor, as important as security governance, is 'process and procedures'.

Process and Procedures

The 'process and procedures' category generally examines how standardised the operations of an organisation are. Are members of staff following pre-defined, repeatable processes?
Or are they approaching situations in different ways, at the concerned staff member's discretion? For information security to be guaranteed, operations must follow certain guidelines, articulated in policy and international standards. In line with the demands of the security standards, two security areas were included as indicators to be measured when appraising an organisation's performance on this factor: 'access management' and 'compliance', discussed in detail in the next section. Establishing and

enforcing standard processes and procedures in an organisation is a major way of reducing risks due to human factors. A security assessment is therefore incomplete if the organisation's ability to ensure its operations repeatedly comply with and follow security standards is not measured. Next are the technological implementation considerations.

Technology & Innovation

It is widely believed across industries that the strength of an organisation's technological implementation indicates how secure it is and how likely it is to meet its business objectives[50]. The security standards themselves specify various technical controls which must be in place for security to be assured; this informs the distinct identification of this category in the maturity model. Under it, the effectiveness of an organisation's technical controls, and how it innovates in line with best practices and the changing security landscape, is measured.

Risk Management Capability

Lack of proper risk management accounts for over seventy percent (70%) of project failures[25] and most security incidents experienced by organisations. Consequently, risk management was considered critical and included as one of the assessment categories, because without proper assessment of the risks to its information assets, an organisation might not be able to protect itself adequately. Value and protection should be assigned to information assets according to their sensitivity and relevance to the business; otherwise, expensive assets might carelessly be neglected while better protection is given to less valuable assets. According to ISO/IEC 27002, good risk management demands understanding and defining an organisation's risk appetite[39]; it involves identifying risks and planning their treatment in accordance with their importance to the organisation. Hence two security areas are grouped under this category: risk assessment/treatment and incident & business continuity management.
Economics of Security

This factor was designed into the maturity model and assessment framework in light of the main aim of the model, which is to determine the economic impact of security investments on organisations. It does not form part of the

considerations for determining the security maturity level of an assessed organisation; it only investigates the organisation's cost of security for analysis purposes. Without obtaining the total cost of security of an assessed organisation alongside its maturity level, and benchmarking it against that of another similar organisation, the question of whether or not security investments save costs might be difficult to answer. Having explained the security factors/categories, the security areas grouped under them are discussed in the next section.

4.2.3 Security Areas

Eight security areas to be probed when assessing an organisation were identified and incorporated during the ISMM design and development. These areas reflect all the activities that should be put right by any organisation desiring information security maturity. Figure 4.4 below groups these areas under their respective security factors or categories.

[Figure 4.4: Security Factors and their Security Areas — access management; compliance; management of information security; training and awareness programme; roles and responsibilities; through-life technological implementations; risk assessment and treatment; incident and business continuity management.]
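The grouping of the eight security areas under the four factors, shown in Figure 4.4, can be written out as a simple mapping. This is an illustrative sketch only, with the grouping derived from the factor discussions above (three areas under security governance, two under process and procedures, one under technology, two under risk management); the authoritative structure is that of the ISMM and the ISAF (Appendices B and C).

```python
# Sketch of the ISMM structure: four security factors, each grouping the
# security areas probed during assessment (per Figure 4.4).
SECURITY_FACTORS = {
    "Security Governance": [
        "management of information security",
        "training and awareness programme",
        "roles and responsibilities",
    ],
    "Process and Procedures": [
        "access management",
        "compliance",
    ],
    "Technology and Innovation": [
        "through-life technological implementations",
    ],
    "Risk Management": [
        "risk assessment and treatment",
        "incident and business continuity management",
    ],
}

# Flattening the mapping recovers the eight assessed areas.
all_areas = [area for areas in SECURITY_FACTORS.values() for area in areas]
assert len(all_areas) == 8
```

Representing the structure this way makes explicit that a per-factor view (e.g. the governance outlook mentioned above) is obtained by aggregating only the areas listed under that factor.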

As shown in figure 4.4 above, the security areas were assigned to the security factors in line with the standards. Furthermore, these security areas comprise the specific activities to be probed when assessing an organisation, as detailed in the Information Security Assessment Framework (see Appendix C). A brief explanation of these security areas, and the rationale behind their inclusion, follows.

Access management
Access control involves preventing unauthorised access to information assets[36]. In assessing an organisation, the access management section considers, in general, an organisation's ability to control unauthorised access to information within its network and information systems. This is important for ensuring information security because, if it is not handled properly, sensitive information can flow uncontrolled to unintended recipients; once this happens, privacy is compromised and various forms of loss are incurred.

Compliance
This aspect of the assessment framework checks the ability of the organisation's information systems to fulfil the requirements specified in policy, legal and regulatory standards. Technical and procedural compliance are checked here to ensure controls and operational procedures are consistent with requirements.

Through-life technological implementations
This aspect examines the controls and technical innovativeness of an assessed organisation. Without good technological controls conforming to standard requirements, information security will be loose and cannot be guaranteed. Therefore, an organisation's technological know-how, and how it innovates in line with the changing security landscape, is vital to sustaining information security.

Management of information security
Management awareness, support and involvement in security are important information security determinants. This also involves management understanding of the impact of security on the business, allocating resources and monitoring performance. If the technical controls are sound, but management and staff attitudes to security are

poor, information security will remain an illusion. This area of the assessment gauges an assessed organisation's information security management ability, and how proactive it is.

Roles and responsibilities
Without roles and responsibilities being assigned properly and activities accounted for, operations will be uncoordinated, resulting in a lack of security. This area must be assessed to ascertain the level of coordination, ownership and protection of the various sensitive information assets in an organisation.

Risk assessment and treatment
A proper risk assessment and treatment plan is another key requirement to guard against targeted and opportunistic attacks. An organisation's ability to continually maintain and review a comprehensive risk register is indispensable in ensuring security. Therefore, such capability should be assessed during the evaluation process.

Incident and business continuity management
If an organisation is unprepared for unforeseen incidents, which could include natural disasters or internal breakdowns, then its security is temporary. Some hold that data does not exist until it is held in three places. For an organisation to be considered secure, it should have solid and active backup arrangements for such accidental situations. Evaluating this aspect is essential in confirming how reliable and truly secure an organisation is.

Total Cost of Security
This part of the model computes the total expenditure an organisation incurs in adopting and implementing security controls. The costs incurred in the event of breaches/incidents, regulatory or legal fines, etc. are all incorporated and summed under this heading, as shown in figure 4.5 below.
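This aggregation can be expressed as a simple sum of planned and unplanned components. The sketch below is illustrative only; the component names follow the breakdown described here, but the amounts and field names are invented assumptions:

```python
# Total cost of security = planned costs + unplanned costs.
# Component names mirror the model's cost breakdown; amounts are invented.
planned = {
    "acquisition_and_disposal": 120_000,
    "business_continuity_dr":    45_000,
    "maintenance":               30_000,
}
unplanned = {
    "breaches_and_incidents":    18_000,
    "legal_fines_compensation":   5_000,
    "lost_assets":                7_500,
    "reputational_damage":       10_000,
    "lost_business_and_time":    12_000,
}

planned_total = sum(planned.values())
unplanned_total = sum(unplanned.values())
total_cost_of_security = planned_total + unplanned_total
print(f"Total cost of security: £{total_cost_of_security:,}")
```

In practice the unplanned components are the hardest to quantify, which is why the survey in chapter 5 investigates them separately from the acquisition and maintenance costs.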

[Figure 4.5: Four Aspects of Cost of Security (inspired by [6, 15]). The figure divides the total cost of security into the planned cost of security (acquisition cost; business continuity/disaster recovery; disposal costs; maintenance cost) and the unplanned cost of security (cost of breaches/incidents; legal fines and compensations; lost information assets; reputational damage; lost business and time), distinguishing direct from indirect costs.]

As shown in figure 4.5 above, the total cost of security is divided into two parts: the planned cost of security and the unplanned cost of security. The planned cost of security covers all security expenditures that were considered and expected by an organisation at the outset. These include the costs of acquiring and disposing of security controls/mechanisms, business continuity costs and maintenance costs. The unplanned cost of security covers expenditures that were never anticipated but nonetheless occurred, such as expenditure on breaches/incidents, legal fines and compensations, and also costs due to suspended business, lost assets and reputational damage. With the security factors and areas discussed, concluding the explanation of the components of the ISMM, the evaluation approach for testing the efficacy of the model follows.

4.3 ISMM Evaluation
In evaluating the maturity model, I adopt three strategies to determine and ensure that the model is effective in achieving its intended purpose:

1. Applying the ISMM practically to organisations over a period of time is an effective method for testing the maturity model. Hence, I propose an evaluation project that will test the model by evaluating the case study organisations over a period of two years. During this period, an evaluation

exercise is to be conducted on these organisations at 6-month intervals, to determine changes in their maturity levels and how such changes influence their economic status. This is because the ISMM's intended effect of inspiring improvement in the organisations can only become visible over a period of time after its application. Continuous re-evaluation of the same organisations at successive 6-month intervals will confirm whether or not the ISMM is effective: if there is growth in the organisations' maturity and monetary savings within the period, then the model should be considered effective. Given the short duration of this project, this evaluation proposal can be carried forward by subsequent researchers, who would test the case study organisations every 6 months to see how the model sustains and improves their maturity levels.

2. Secondly, an evaluation of the case study organisations was conducted (in chapter 6) with the developed ISMM, and their maturity levels recorded. I then conducted further evaluation exercises on the same organisations using other existing, proven security models (e.g., the HMG IAMM and that by Malik Saleh) to obtain their rankings of the organisations, and compared those results with the results from the developed ISMM. Interestingly, both sets of results showed consistency in the maturity levels assigned to the organisations. This suggests that the security factors, areas, levels and their gradations have been carefully designed in a manner that delivers the intent of the ISMM, which is to improve and sustain organisations' maturity.

3. Thirdly, I involved security experts in industry and academia in reviewing and evaluating the ISMM. A feedback form was designed and distributed alongside the ISMM and ISAF templates, seeking their review, criticism, contributions and evaluation of the ISMM.
Respondents were also expected to score the model on how comprehensive it is and how likely it is to achieve its intended purpose. The average overall score from the survey was eighty-six percent (86%) (see Appendix D). That confirmed the acceptability of the developed ISMM and reassured me of its efficacy.

4.4 Summary
This chapter discussed the iterative procedure and activities followed in developing the ISMM. Details of the ISMM structure and components were also discussed. The chapter ended by presenting the methods and strategies by which the project was evaluated.

Chapter 5: Results
This chapter presents the results of the survey conducted for this project. The results are presented in categories according to the security factors they fall under. First, results for key questions that bear on the purpose of this project, or that are relevant to answering the research questions, are presented separately using charts. A table then summarises the remaining results under each category before moving to the next category or factor. It is worth highlighting that only large organisations were considered as case studies in this project, which determines the impact of upfront security investments on the economic status of an organisation. This is because, for the effects of security investments in relation to maturity level to be observed, an organisation has to be benchmarked against another of the same or similar business and size. Before the results are presented, information such as the survey response rate and the respondents' demographics is necessary. Two hundred and twenty-three (223) questionnaires were distributed in total: two hundred and fourteen (214) were distributed by email, while nine (9) hardcopy questionnaires were sent by post. Eventually, six (6) organisations were used as case studies owing to their interest in participating in this research. The demographics of the participating organisations are specified in the following section.

Demographic information of participating organisations
Three (3) of the participating organisations are located in Europe, while the other three (3) are located in Africa. Figure 5.1 below presents the participants' distribution by industry.

[Figure 5.1: Participants by industry sector. Government 50%; Financial Services 33.3%; IT Services 16.6%.]

In figure 5.1 above, three of the participants are from the government sector, comprising fifty percent of the participating organisations; two are from the financial services sector, while one is from the IT services sector. Other information about the participants covers the job roles/positions of the responding officers and their years of experience in the role, as shown below.

[Table 5.1: Participants' roles and years of experience. For each participant the table lists the role (director; assistant manager; deputy manager (security unit); head of IT operations; director; chief information officer (CIO)), the years of experience in the role, and the number of employees of the organisation.]

From table 5.1 above, two of the respondents are in directorate positions, while the others hold positions ranging from assistant manager, deputy manager and head of IT operations to chief information officer (CIO). On average, the respondents have five (5) years of experience in their roles, as seen in table 5.1 above. The numbers of employees of the participating organisations are also shown in table 5.1 above. In

furthering this data presentation chapter, I group and present these results in their respective categories.

5.1 Economic Spending of Organisations
This section presents the monetary expenditure, or security costs, incurred by the organisations under the various factors or business issues that drive them. The costs detailed in table 5.2 below include costs of acquisition/disposal, maintenance, business continuity/disaster recovery, breaches/incidents, legal fines/compensations, lost information assets, reputational damage, and lost business and time.

[Table 5.2: Economic spending of participating organisations. For each participant (two financial, three government, one IT services), the table records the planned costs of security (acquisition/disposal; business continuity/disaster recovery; maintenance) and the unplanned costs of security (breaches/incidents; legal fines/compensations; lost assets; damaged reputation; loss of business/time). All figures are in pounds sterling; several entries were reported as "commercially sensitive", "no info available", "unknown" or "N/A".]

Table 5.2 above represents the costs as reported by the respondents. In the questionnaire (see Appendix A), security costs were investigated under three categories: the acquisition costs, comprising the acquisition/disposal costs and the business continuity/disaster recovery costs; the maintenance cost, which includes the cost of running, updating and maintaining security controls, investigated separately; and other costs related to incidents and losses, grouped and investigated in a single question. That is why the breakdown is presented as seen in table 5.2 above. As

indicated in the table, all currencies reported by the respondents were converted to pounds sterling for consistency. While some participants were comfortable providing their security costs in their responses, others were not prepared to do so. Participants two and four provided full details of their security expenditures, while participants three, five and six provided partial details. Participant one, however, avoided answering questions related to the economics of security entirely, as suggested by the response 'commercially sensitive'. The next section presents results on security governance.

5.2 Security Governance
As already explained in section 4.2.2, this aspect of security concerns how well an organisation coordinates its information security activities so as to ensure security. Survey results concerning this aspect are presented here.

What is the perception of top management? Do they understand their roles in ensuring information security?

[Figure 5.2: Management's perception of information security. Each participant is rated on a scale of 1 (totally ignorant), 2 (aware and uninterested), 3 (supportive), 4 (very supportive), 5 (partakers).]

The results in figure 5.2 show that management commitment to information security is increasing in the participating organisations, though participant five seems indifferent despite being aware of the criticality of information security to the business.

Does the organisation have a detailed security policy?

[Figure 5.3: Formulation of security policy. Each participant is rated on a scale of 1 (no policy exists), 2 (word-of-mouth instructions), 3 (security manual only), 4 (comprehensive security policy), 5 (regularly-reviewed policy).]

It is clear that all the participants, except participant five, understand the need for a comprehensive security policy to guide their operations. However, none showed proactive review of policy to address emerging threats and security demands.

Does the organisation have an established information security training and awareness programme?

[Figure 5.4: Levels of awareness training programme. Each participant is rated on a scale of 1 (no awareness programme), 2 (individual efforts to train), 3 (annual general training scheduled), 4 (need-specific security trainings), 5 (regular refresher trainings).]

From the results in figure 5.4 above, participants two (2), four (4) and six (6) take awareness training very seriously: they conduct proper training-needs assessments for their staff and provide role-specific security training. Conversely, the other three participants only conduct an annual general security awareness campaign for their staff.

What is the employees' information security awareness and compliance level in the organisation?

[Figure 5.5: Employees' security awareness level and compliance. Each participant is rated on a scale of 1 (unaware), 2 (aware/not compliant), 3 (weak compliance), 4 (compliant), 5 (very compliant).]

Figure 5.5 above shows that the employees of participants two and six are very aware of, and compliant with, security policy and procedures. Participant four's employees are compliant, while participants one, three and five exhibit weak compliance with their information security obligations. Further results regarding security governance by the six participants are summarised in table 5.3 below:

[Table 5.3: Additional results under the 'security governance' category. For each participant (two financial, three government, one IT services), the table records: whether the information asset inventory is well defined and documented, with asset owners identified (five participants reported outdated inventories; one maintains an auto-updating inventory); whether information security is a key consideration in every business plan; whether engagements/communications with external parties are controlled; whether data/information is classified (one participant classifies data as public, internal, confidential and secret); how often staff misuse access rights/privileges (mostly rarely, with two participants reporting monthly cases); how well the requirements of the policy spread across departments/functions (scale of 1 to 5); how detailed the recruitment screening is for key functions; whether security checks are conducted on staff before security privileges are granted; whether an information security unit exists in the IT department; and whether officers are assigned specific information security roles/responsibilities.]

According to the results, although organisations make efforts to maintain inventories of their information assets, many still do not actively update them to incorporate new changes. Table 5.3 above also shows that the consideration given to information security during the various business dealings of organisations is still at a modest degree, on

the average. Fifty percent of the respondents indicated low regard for information security, while the other fifty percent indicated moderate attention to it in their business engagements. The other results shown in table 5.3 above are self-explanatory: the numbers represent the participants' subjective assessments of their organisations on a scale of one (1) to five (5) in the respective areas. Results under the 'processes and procedures' category are presented next.

5.3 Processes and Procedures
Results under this category show the ability of the participating organisations to define and adopt best practices, and to abide by policies, regulatory and other security requirements.

At what stage in a project life cycle is security considered and/or introduced?

[Figure 5.6: Stages of security consideration in a project. Each participant is rated on a scale of 1 (not considered), 2 (as the need arises), 3 (implementation stage), 4 (design stage), 5 (project inception stage).]

It is clear from figure 5.6 above that participants two and six take security seriously, as they consider it at project inception. Participants one and three are in the habit of introducing security controls while implementing a project, while participant four considers security at the design stage of its projects. Unfortunately, participant five only arranges for security when there is a problem or a disruption of business, as indicated in the results.

Is there a formal information security incident management procedure?

[Figure 5.7: Incident management procedures as established by participants. Each participant is rated on a scale of 1 (do not exist), 2 (uncoordinated procedures), 3 (existing procedures inadequately enforced), 4 (well-defined procedures), 5 (constant review of best-practice procedures).]

Most of the participants (four) indicated that they have incident management procedures, though following them has been a challenge. Figure 5.7 above also shows that two of the participants have not established good incident management procedures and remain uncoordinated in their approach to incident reporting and response.

How regularly are information system audits for compliance conducted? Are penetration testing and vulnerability assessments conducted?

[Figure 5.8: Rate of information security audit and penetration testing. Each participant is rated on a scale of 1 (never), 2 (rarely), 3 (yearly), 4 (6-month intervals), 5 (quarterly).]

From the results in figure 5.8 above, only participants one and two indicated that system audits by internal and external auditors are carried out at 6-month intervals. Results from the other participants indicate low interest in consistently checking for vulnerabilities and auditing their information systems for compliance.

What was the rate of security incidents and breaches in the past year?

[Figure 5.9: Rate of security incidents by participants. Each participant is rated on a scale from 1 (several incidents) to 5 (no incident), with intermediate bands for decreasing incident counts.]

The results in figure 5.9 show a higher rate of security incidents at participant five, unlike its counterparts in the same government sector (participants three and four). Participant six (6), in the IT services sector, seems to have more control over security incidents, as shown in the results. The two remaining participants (participants one and two), in the financial sector, also show considerable control of their security incidents; however, the results show that participant two (2) exhibits better control than participant one. The remaining results under this category are presented in table 5.4 below.

[Table 5.4: Additional results under the 'process and procedures' category. For each participant (two financial, three government, one IT services), the table records: whether an access control policy is in place for managing user access (most participants have one; one maintains a detailed, formal and documented policy, while another leaves access control to system administrators); how regularly granted rights/privileges are reviewed and updated (monthly to 6-month intervals); the period of inactivity before a user's account is deactivated (1 to 6 months); whether staff violations of the security policy are punished (several punish staff only when there are tangible losses; in others breaches go unnoticed, and one participant was not prepared to answer); whether a Bring Your Own Device (BYOD) policy is in place (responses ranged from allowing only configured devices on the network to placing devices on a guest VLAN); whether proper arrangements exist for mobile computing/teleworking; how often computer systems are scanned for malicious applications/activities (daily to quarterly); how often staff are prompted to change passwords (weekly/monthly to annually); how often firewall and IDS logs are monitored and acted upon (daily to 6-month intervals); how compliant the organisation's operations are with policy, regulatory and legal requirements (weak compliance to compliant); how often staff misuse access rights/privileges (mostly rarely; one weekly, one monthly); whether users fulfil their user responsibilities; and the time between a security incident and its resolution (1 day for three participants, 1 to 5 days for one, and 1 week for two).]

The summary of the remaining results for the 'process and procedures' category is shown in table 5.4 above. The results show that participants two and six (from the financial and IT services sectors respectively) exercise a high degree of access management and procedural coordination in their operations. Also, participant four (in the government sector), unlike participants three and five in the same sector, shows a considerable level of good security practice under this category. The results in the next section show the degree of adoption of best technologies and security controls by the participants in ensuring information security.

5.4 Technology and Innovation
This section presents results on the security controls and mechanisms the participating organisations use to ensure security.

How often do malware or other security attacks disrupt key business processes?

[Figure 5.10: Rate of disruption of business processes by security attacks. Each participant is rated on a scale of 1 (severally), 2 (twice a month), 3 (once in 6 months), 4 (barely once in a year), 5 (not at all).]

As shown in figure 5.10 above, participants one, two and four experience, on average, one instance of business disruption per year due to a security breach/incident. Participants three and five experience at least one instance of business disruption every six months. Participant six exhibits a proactive ability to prevent breaches/incidents, or perhaps handles them in a way that does not hinder business processes when they occur.

What is the average percentage uptime of the organisation's network in a year (intranet and internet)?

[Figure 5.11: Average uptime rate for participants' networks. Each participant is rated on a scale of 1 (10% to 30%), 2 (30% to 50%), 3 (50% to 70%), 4 (70% to 90%), 5 (90% to 100%).]

Participants one, two and six experience consistency in their network availability (90% to 100%), while participants three and four are in the 70% to 90% uptime range. Figure 5.11 shows that participant five has relatively lower network uptime, struggling in the 50% to 70% range. Table 5.5 below summarises the remaining results under this 'technology and innovation' category.

[Table 5.5: Additional results under the 'technology and innovation' category. For each participant (two financial, three government, one IT services), the table records: the measures in place for authentication and authorisation (e.g., Active Directory with role-based access, identity and access management and regular reviews; tokens, passwords and secret PINs; Secure Access Control Systems; coded access cards; RADIUS and TACACS+ with role delegation in Active Directory); whether mechanisms for non-repudiation exist (most answered yes; one relies on digital signatures, and one answered no); the technological measures in place against malicious code, intrusion, etc. (combinations of intrusion prevention and detection systems, firewalls, encryption, anti-malware software, strong authentication/authorisation mechanisms and secure coding); the vulnerability scanning tools in place (e.g., Nessus, SIEM, BoomScan, Nmap; one participant has none) and the effectiveness of technical vulnerability management; the access control mechanisms in place (mostly role-based; one uses discretionary access controls to restrict unauthorised access); whether information system controls reflect security considerations at the design stage (rated on a scale of 1 to 5); whether a policy exists for implementing cryptographic techniques/controls, and whether cryptographic controls are used to secure information (e.g., Triple DES, private and public key infrastructure, RSA, AES, DES, and symmetric and asymmetric cryptography); whether the information systems design supports technological innovation for improvement (flexibility rated on a scale of 1 to 5); the rate of denial-of-service experiences in a year ('occasionally' for two participants, 'hardly' for four); and the effectiveness of mechanisms for logging and tracing activities (rated on a scale of 1 to 5).]

The next results specify the risk management ability of the participating organisations.

5.5 Risk Management Results
Risk management results show the extent of an organisation's recovery mechanisms, and how prepared and robust it is in the face of security challenges. Can it bounce back shortly after an incident, or does it collapse after an attack or natural disaster? The related results from the survey conducted for this project are presented below.

Is there proper risk assessment and treatment for all information assets? Estimate the organisation's ability to avoid recurrence of a solved security problem on a scale of 1 to 5.

[Figure 5.12: Participants' risk assessment and treatment capability levels. Each participant is rated on a scale of 1 to 5.]

The results in figure 5.12 show that participant four has a robust risk management strategy, while participants two and six indicated confidence and reliability in their risk management ability. Participants one and three indicated level 3 ability in risk assessment and treatment, while participant five seems not to have proper arrangements in place to ensure that incidents do not recur.

How often are virus scanning and updates carried out?

[Figure 5.13: Intervals for malware scanning and software updates. Each participant is rated on a scale of 1 (annually), 2 (quarterly), 3 (monthly), 4 (weekly), 5 (daily).]

The results above show the interval between successive rounds of malware scanning and fixing. The level five indicated by participants two, four and six in figure 5.13 shows that those organisations scan, fix and update their anti-malware software daily. Participant one conducts quarterly general checks to ensure systems are updated and well protected against malware, while participants three and five undertake the exercise monthly.

Is there a business continuity/disaster recovery plan for the company's information systems? How often is it checked and tested?

[Figure 5.14: Intervals for checking and testing business continuity plans. Each participant is rated on a scale of 1 (annually), 2 (6-month intervals), 3 (quarterly), 4 (monthly), 5 (weekly).]

Participants two and six constantly check that their business continuity plans are still effective and ready at all times. However, participants three and four

check their business continuity plans every quarter to confirm their readiness, while another participant does such checks every six months. Further results under this 'risk management' category are found in table 5.6 below.

[Table 5.6: Additional results under the 'risk management' category. For each participant (two financial, three government, one IT services), the table records: whether physical protections exist for information assets (all answered yes); how often data are backed up; and whether an active data back-up policy and strategy are in place (participants one and six back up weekly or monthly, while the others reported automatic real-time back-up).]

As seen in table 5.6 above, the participating organisations understand the importance of risk planning and of protecting their information assets. Apart from participants one and six, who undertake their data backup weekly or monthly, the others reported automatic, even real-time, backup during daily business operations.

5.6 Summary
This chapter presented the results of the survey conducted during this project. The demographics of the participants were explained first, followed by the results, organised according to the security factors or categories they fall under, in the following sequence: economic spending of organisations, security governance, processes and procedures, technology and innovation, and risk management. With these results presented, the testing of the maturity model by evaluating the participating organisations follows in the next chapter.

Chapter 6: Evaluation of Selected Organisations

This chapter details the testing of the ISMM developed, through the evaluation of the case study organisations. The first section explains the structure of the information security assessment framework (ISAF) used for the assessment; the second evaluates the six participating organisations to determine their maturity levels. This is followed by the discussion section, which analyses the results of the assessment, while the last section summarises the chapter.

6.1 Structure of Information Security Assessment Framework (ISAF)

The information security assessment framework is the tool developed for assessing the maturity of organisations. The ISAF comprises many columns and rows (see Appendix C). The first column contains all the 'security areas', under which specific security activities tagged 'areas to probe' are grouped. These 'areas to probe' are security best practices articulated from the ISO/IEC standards which should be investigated to determine how well an organisation ensures them. They constitute all the relevant security requirements as specified in the security standards (ISO/IEC and ISO/IEC 27002), grouped in their various categories. The subsequent five columns, labelled 'expected evidence for each maturity level', specify the degree of performance that characterises an organisation at each particular maturity level. The last column, labelled 'Point', is the space provided for recording the score, or point achieved, for each corresponding security area examined. The ISAF rows are grouped according to the security factors, with each group/category introduced by its title heading (e.g., Security Governance). The second row from the top specifies the maturity levels, while the third row shows the weighting of each corresponding maturity level.
An organisation being assessed is assigned a score (from 1 to 5) for demonstrating a degree of security performance equivalent to that specified for the maturity level matching the score. Further down the rows are all the relevant 'areas to probe' and the gradation of security best practices ideally expected at each maturity level. At the end of each security category/factor, a demarcating row shows the average score achieved for that security factor before the next security-factor category is introduced.

The first security factor on the ISAF is 'Process and Procedures', followed by 'Technology and Innovation', 'Security Governance' and then 'Risk Management'. The last assessed category of the ISAF considers the security economics of an assessed organisation, and is hence tagged 'Economics of Security'. Before the economics of security category is another row, which computes and presents the average total points achieved under the previous four security factors. Finally, the ISAF shows the score boundaries and their associated maturity levels. An organisation is considered to be at the maturity level corresponding to the 'score boundary' in which its final average falls. Further details on how the ISAF is used to evaluate an organisation are discussed in the next section, in which the case study organisations are evaluated.

6.2 Testing the ISMM and ISAF by evaluating selected organisations

The evaluation of the six participating organisations using the ISAF is presented in this section. Each sub-section presents the assessment of these organisations under one security factor. The first assessment presented is on process and procedures, followed by technology and innovation, security governance, risk management capability and, lastly, a summary of their economic spending. For simplicity and clarity, the assessment for each security factor is presented in a table, with the various security areas probed populated in each table and the scores/points earned by the participants specified numerically. These numbers are, in other words, the maturity levels of the participants in the respective security areas probed. At the end, the average of all the points earned by each participant across all the areas probed is computed, giving the participant's maturity level for that particular security factor/category. The same approach was used to generate the participants' maturity levels for all four security factors.
After this, the average of the total points achieved under those four factors was computed as the final average point, or maturity level, of each respective participant (see appendix C). Using the 'overall rating and maturity level' section of the ISAF, the average score for each participant is placed in the appropriate boundary to determine the maturity level. The following sub-sections detail the assessments conducted on the six case study organisations under the four security factors.
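The averaging-and-banding procedure just described can be sketched in Python. The score boundaries below are illustrative assumptions: the actual boundaries are defined in the ISAF (Appendix C, not reproduced here), and these values are merely inferred from the banded results reported later in the chapter (e.g., an average of 4.4 maps to level 4 and 4.6 to level 5). The factor scores are hypothetical.

```python
# Sketch of the ISAF scoring procedure: average the 1-5 points per factor,
# average the four factor averages, then band the result into a maturity level.
# ASSUMPTION: the boundaries below are illustrative, not the official ISAF ones.

def factor_average(scores):
    """Average of the 1-5 points earned on the areas probed for one factor."""
    return sum(scores) / len(scores)

def maturity_level(average):
    """Place a final average score in its score boundary to get a maturity level."""
    boundaries = [(4.5, 5), (3.5, 4), (2.5, 3), (1.5, 2)]
    for lower, level in boundaries:
        if average >= lower:
            return level
    return 1

# Hypothetical participant: scores on the areas probed under each factor.
factor_scores = {
    "Process and Procedures": [3, 4, 3, 4, 3],
    "Technology and Innovation": [4, 4, 3, 4],
    "Security Governance": [4, 3, 4, 4],
    "Risk Management": [4, 5, 4, 4, 5],
}
averages = {name: factor_average(s) for name, s in factor_scores.items()}
final_average = sum(averages.values()) / len(averages)  # average of the four factors
print(round(final_average, 3), maturity_level(final_average))
```

For this hypothetical participant the final average falls in the 3.5 to 4.5 band, so the routine reports level 4 (meeting requirements).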

Assessments on Processes and Procedures Standardisation

Table 6.1 below presents the assessment of the six case studies on processes and procedures standardisation. The participants are drawn from the Financial (participants one and two), Government (participants three, four and five) and IT Services (participant six) sectors. The areas probed, each rated from 1 to 5 per participant, were:

1. Is there an access control policy in place for managing user access?
2. How regularly are the granted rights/privileges reviewed and updated?
3. Is there a formal information security incident management procedure?
4. What is the period of inactivity before a user's account is deactivated?
5. At what stage in a project life cycle is security considered and/or introduced?
6. Is there punishment for staff violation of the security policy?
7. Are there standard operational procedures across all departments which support information security?
8. Is there a Bring Your Own Device (BYOD) policy in place? Are there proper arrangements for mobile computing/teleworking?
9. How often are computer systems scanned for malicious applications/activities?
10. How often are members of staff prompted for a password change to allow access to the operating system and other applications?
11. How often are the firewall and IDS logs monitored and acted upon?
12. How compliant are the operations of the organisation with policy, regulatory and legal requirements?
13. How often are cases of staff misuse of access rights/privileges? Are users fulfilling their user responsibilities?
14. How regularly is the information system audit for compliance conducted?
15. What is the rate of security incidents and breaches in the past year? What is the time between a security incident and its resolution (response rate)?

Table 6.1: Assessment of case study organisations on process and procedure standardisation

Responses provided in the questionnaires by the participants were used to determine their rating for each security area probed. The assessment framework was designed carefully, with the gradation of maturity levels well balanced so as not to allow the assessment of an organisation to fall between two maturity levels (i.e., it is easy to place respondents' responses at their ideal levels). The format of some questions, however, allowed respondents to evaluate their degree of performance in a security area on a scale of 1 to 5 (see appendix A). Such subjective ratings by participants were adopted directly as the organisation's score in those areas. However, there were some open-ended questions that demanded specific answers from respondents, for example: what are the technological measures in place to ensure security? Responses to such open-ended questions needed to be, and were, benchmarked against the security standards in order to determine their compliance level and ideal score. Having defined the characteristics and compliance level expected at each maturity level, I tried to objectively identify the maturity level that specifies the same degree of security practice as reported by the respondent; the value of that level then becomes the score earned by the organisation in that security area. As seen in table 6.1 above, questions seven (7) and eight (8) are examples of such open-ended questions. Scores were assigned to the participants on these two questions/areas by matching the degree of performance exhibited in their responses to the appropriate maturity level on the ISAF. My appraisal of each reported response or performance, as already stated, was guided and supported by the security requirements specified in the standards (ISO/IEC and ISO/IEC 27002). For example, question eight (8) asks: 'Is there a Bring Your Own Device (BYOD) policy in place? Are there proper arrangements for mobile computing/teleworking?'
Participant one just indicated yes without further explanation; hence I gave him/her the benefit of the doubt but assigned only three (3) points, since there was no professional explanation as evidence of the claim. Participants two and six went straight to proving the existence of such consideration through the professionalism exhibited in their responses: participant two said that only configured devices are allowed on the network, while participant six indicated that such devices are put on a guest VLAN. For such specific and problem-solving responses, I assigned four (4) points to each of the two participants. Participant three indicated not prepared to answer, while participant five indicated that the necessary

mobile computing arrangements are in place. To these two, I gave two (2) points each because neither was specific in their claims; I assumed them to be merely aware (i.e., level 2) of such security concerns, since I could not find any professional evidence of such practice in their responses. Participant four indicated no, implying that there is no such BYOD policy or measures in place; hence I scored it one (1) point, which reflects a state of no performance. In addition, most of the questions in the questionnaire were designed in a way that eased the evaluation exercise. Multiple choice answers were provided for most questions, allowing a respondent to simply select the option that reflects his/her organisation's security status. The levels of maturity are arranged in ascending order, such that the first level is assigned option 'a' while the highest level is 'e'. So, once a respondent selects the option that describes the organisation's security status, the scoring is done automatically. Apart from the two open-ended questions mentioned above, the remaining questions under this security factor take this multiple choice format (see questionnaire in appendix A). The bottom row of table 6.1 shows the average points achieved by each participant under this security factor/category. These were obtained by summing all the scores/points earned by each participant in all the areas probed and dividing by the number of questions. This shows the strength of these participant organisations in defining and coordinating their security processes and procedures. While participants one, three, four and five are on level three (3) maturity, participants two and six exhibit level four (4) maturity. This implies that the last two participants are meeting security requirements, while the former participants are just fulfilling the basics under the process and procedures category. However, none of the participants measured up to level five (5) maturity.
Discussion of the dynamics of the participants' performances across the four security factors is presented in a subsequent section. Before then, the assessment of these participant organisations under the remaining security factors/categories must be concluded. The next subsection presents the assessment of the participants under the technology and innovation category.

Assessments on Technology and Innovation

This sub-section presents the assessment conducted on the six participant organisations in respect of their technological implementations to ensure information security. The assessment method described for the previous security factor applies to this factor and to the rest of the factors discussed. The areas probed were:

1. How effective are the organisation's authentication and authorisation controls for network access?
2. How effective is the organisation's mechanism for ensuring non-repudiation and fostering e-commerce?
3. Are there technological measures in place to ensure information security (e.g., against malicious code, intrusion, etc.)?
4. How often do malware and/or other security attacks disrupt key business processes?
5. Are there vulnerability scanning tools in place? How effective is the technical vulnerability management?
6. Describe the rate of the organisation's Denial of Service experience in a year.
7. How effective is the organisation's access control mechanism?
8. Does technology reflect security consideration at the design stage?
9. Is there a policy for the implementation of cryptographic techniques/controls? Are cryptographic controls used to secure information?
10. Is the information system design flexible to innovation?
11. What is the average percentage uptime of the organisation's network in a year (Intranet and Internet)?
12. How effective is the mechanism for logging and tracing activities within the organisation?

Table 6.2: Assessment of case study organisations on technology and innovation

Question one in table 6.2 above is an open-ended question that examined the effectiveness of the organisations' authentication and authorisation controls.

Participant one was scored five (5) points there, owing to the level of evidence shown in the response: the respondent listed technical measures which ensure proper authentication/authorisation. Not only that, the respondent's organisation seems to understand that, through policy and proper governance, strict authentication/authorisation would be proactively enforced. Participants two, three and six were scored four (4) points each on that aspect for the good technical controls evident in their responses (see table 5.5). Participant four got three (3) points for being specific about at least one technical control, while participant five earned two (2) points for vaguely indicating passwords as their authentication/authorisation mechanism, without details. A similar pattern played out in the next security area probed: are there mechanisms for ensuring non-repudiation? Participants one, two, three and four just indicated yes without including details of the mechanisms. As such, I presumed they were playing safe and scored each of them three (3) points (level 3, basic security) in that area. However, participant six, from the IT industry, was more precise and professional in responding, indicating digital signatures as the mechanism deployed to ensure non-repudiation; on that note, I scored it four (4) points. This strategy of comparing and weighing the responses of respondents to open-ended questions against best practices was used, where necessary, to decide the ideal points to award. On average, the maturity scores of the six participants on this security factor are 3.6, 4.0, 3.7, 3.6, 2.2 and 4.4. Referring to the score boundaries in the assessment framework (see appendix C), this means the participants are at maturity levels 4, 4, 4, 4, 2 and 4 respectively. Looking further at the maturity scores, we see that, besides participant five, who is only at the security awareness level, all the participants are meeting requirements.
Further details of the responses provided by the respondents on this security factor are in the results chapter, in table 5.5. The next assessments presented are on security governance, as seen in the next subsection.

Assessments on Security Governance

As highlighted earlier, the concerns assessed here relate to the ability of the participating organisations to manage information security. Addressing human factor concerns is

the key aim of this security factor. Table 6.3 below presents the assessment of the participants under this category. The areas probed were:

1. What is the perception of top management? Do they understand their roles in ensuring information security?
2. Does the organisation have a detailed security policy?
3. Is the organisation's information assets inventory well defined and documented, and are information asset owners identified?
4. Does information security constitute a key consideration in every business plan of the organisation? Are engagements/communications with external parties controlled?
5. Does the organisation classify data/information?
6. How often are cases of staff misuse of access rights/privileges?
7. Does the organisation have an established information security training and awareness programme?
8. What is the employees' information security awareness and compliance level in the organisation?
9. How well do the requirements of the policy spread across departments/functions? Are staff from all relevant functions involved in coordinating information security?
10. How detailed is the organisation's recruitment screening to ensure qualified staff for key functions? Are security checks conducted on staff before security privileges are granted?
11. Is there an information security unit in the IT department of the organisation? Are officers assigned specific information security roles/responsibilities?

Table 6.3: Assessment of case study organisations on security governance

In table 6.3 above, questions one, two, six, seven and eight are similar in format, as they were provided with multiple choice answers from which respondents could select (see questionnaire in appendix A). Therefore, the scores or points presented in table 6.3 correspond to the respective maturity levels selected by the respondents. Conversely, questions four, nine, ten and eleven were designed to enable

respondents to evaluate their degree of compliance on a scale of one to five. The subjective judgment of the respondents in this case is taken as the maturity level of the respective organisations in the security areas assessed. However, question five is an open-ended question: does your organisation classify information? Most respondents, except respondent one, just answered yes without specifying the categories of classification; hence, they were scored three (3) points each for not providing details. Conversely, respondent one was scored four (4) points for listing all the categories to which information assets could belong in his/her organisation. One point went to participant five for indicating the absence of information asset classification in the organisation. In this rating, one point signifies a state of no performance (a vulnerable state), as earlier stated and as shown in the ISMM and ISAF. Lastly, the average points, or maturity levels, attained by the six participants are seen in the bottom row of table 6.3. While participants one and three possess level three (basic compliance) in their security governance ability, participants two, four and six are meeting requirements (level 4). Only participant five exhibits a lack of good security governance. The next subsection presents the assessments on risk management capability.

Assessments on Risk Management Capability

The assessment conducted on the six participant organisations for risk management capability is presented in this subsection, as shown in table 6.4 below. The areas probed were:

1. Are there physical protections for information assets?
2. Is there proper risk assessment and treatment for all information assets? Estimate the organisation's ability to avoid recurrence of solved security problems.
3. Is there a business continuity/disaster recovery plan for the company's information systems? How often is it checked and tested?
4. How often are data backed up? Are there active data back-up policies and strategies in place?
5. How often is virus scanning and updates carried out?

Table 6.4: Assessment of case study organisations on risk management capability

Five security areas were probed to determine the risk management capability of the participants. The first was the physical protection available for information assets. As the results in table 5.6 show, all the participants indicated the existence of adequate physical protection for their information assets; therefore, four (4) points each were assigned to all of them on that aspect. Another key area examined was the risk assessment and treatment activities of the organisations; this concerns how well they are able to predict and arrange preventive measures against security incidents. The question allowed participants to submit their own evaluation on a scale of one to five, and their subjective assessments were adopted as their maturity scores for this security area. The business continuity plans of the participants were assessed using a multiple-choice question. As explained earlier, the options selected by the participants, which tally with maturity levels, were reflected accordingly in the assessment exercise. Participants two and six showed proactive/innovative business continuity plans which are tested weekly for activeness, and were scored five points each. Participant one indicated monthly testing and checking of their business continuity plans and earned four points, as specified in the framework. Participants three and four do their checks quarterly and were scored three points each. Participant five indicated a six-month interval for checking their business continuity plans, and hence earned two points, in line with the design of the ISAF and questionnaire. The average points achieved by participants one to six are shown in table 6.4 as 3.6, 4.6, 3.6, 4.4, 2.8 and 4.6 respectively. According to the maturity boundaries defined in the ISAF, this translates to maturity levels 4, 5, 4, 4, 3 and 5 for participants one to six respectively.
At this point, it is necessary to bring together the average performances of the participants across all the security factors/categories in a summary table, as shown below.

Table 6.5: Breakdown of case study organisations' information security maturity. For each participant (sectors: Financial, Financial, Government, Government, Government, IT Services), the table lists the average points achieved on Process and Procedures Standardisation, Technology and Innovation, Security Governance and Risk Management Capability, followed by the final average and the resulting maturity level.

A breakdown of the maturity levels of the participants under the four security factors is presented above. In table 6.5, the final average of each participating organisation on the four assessed categories is shown in bold. Following the score boundaries specified in the ISAF (see appendix C), the final averages are translated into the appropriate maturity levels. It is clear from the table that none of the assessed organisations attained level five (5) maturity. However, participants two, four and six were able to achieve level four maturity. Participants one and three are both on level three (3) maturity, while participant five is on level two (2) maturity. Further inferences and discussions on the meaning of the results and of the assessment conducted on these organisations are presented in the discussion section. But before then, it is important that I present a summary table of the costs of security of these participants. However, as noted earlier in table 5.2, some of the organisations avoided reporting some of their costs. For this reason, I had to use some relevant standard guides and security reports to estimate the costs for these organisations, as detailed in the next section.

Security Cost Estimation

This section focuses on estimating and extrapolating the costs of security for those participants who did not provide such information, or who provided it only in part. These costs include the acquisition cost, the maintenance cost and all costs related to incidents/breaches. The presentation follows this order, starting with the acquisition and maintenance costs.

Acquisition and Maintenance Costs Estimation for Security Controls

The National Institute of Standards and Technology (NIST), in its Self-Assessment Guide, specifies the security standards required for protecting an information system[44]. Based on this specification, the U.S. Department of Education developed an Information Technology Security Cost Estimation Guide. The purpose of this guide is to enable estimation of the costs associated with implementing information security controls[22]. These costs are broken down by specific security controls, which are based on the management, operational and technical controls in the NIST Special Publication (SP) 'Security Self-Assessment Guide for Information Technology Systems'[22]. In computing the costs of acquiring security controls for this project, the following assumptions and considerations were made:

- The average rate of inflation from 2002 to 2013 in the two regions (Europe and Africa) from which the case studies were drawn is 1.5%[12, 51]; 2002 is used as the base year, as the guide was issued that year.
- I assumed that each case study organisation operates two networks (intranet and internet) in which security requirements were incorporated.
- All estimates in dollars were converted to, and computations done in, pound sterling.
- Two network Intrusion Detection System (IDS) probes, one analysis console, and one full-time employee (FTE) for one year to monitor the IDS were budgeted per organisation.
- The currency conversion rate used is one pound sterling to approximately one and a half dollars.
- Cost estimates per staff member or system were multiplied by the number of employees of each organisation computed.

The cost estimation guide presents two options between which system managers can choose when estimating the cost of full-time employee (FTE) labour: the contractors' rates and the government rates[22].
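Before turning to the labour rates, the inflation and currency assumptions listed above can be combined into a small adjustment routine. This is a minimal sketch using the figures stated in the text (1.5% annual inflation from a 2002 base year to 2013, and roughly $1.50 to the pound); the $200,000 sample figure is the annual contractor FTE salary from the guide, used purely for illustration.

```python
# Sketch of the cost adjustment implied by the assumptions above:
# inflate a 2002 US-dollar estimate at 1.5% per year to 2013, then
# convert it to pound sterling at approximately $1.50 = 1 pound.

def adjust_to_2013_gbp(usd_2002, annual_inflation=0.015, usd_per_gbp=1.5):
    """Inflate a 2002 dollar cost to 2013 and convert it to pounds."""
    years = 2013 - 2002
    usd_2013 = usd_2002 * (1 + annual_inflation) ** years
    return usd_2013 / usd_per_gbp

# Illustration: the guide's $200,000 annual contractor FTE salary.
print(round(adjust_to_2013_gbp(200_000), 2))
```

The same routine can be applied to any per-control cost line in the estimation guide before the per-staff or per-system multiplications are performed.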
I chose the contractors' rates because most organisations' IT implementations and projects are usually contracted out rather than deployed by the organisations' own employees. The contractors' rates as

published by the US General Services Administration (GSA) IT Federal Supply Schedule, and as reported by the cost estimation guide, are as follows:

Fiscal Year      Annual FTE Salary   Monthly    Weekly    Daily   Hourly
FY 01 - FY 06    $200,000            $16,667    $3,846    $769    $96

Table 6.6: Contractor Salary Rates[22]

The figures shown in table 6.6 above were used in estimating the acquisition and maintenance costs for the concerned case study organisations. Full details of the elements considered in the estimation, and their respective costs, are shown in appendix F, together with the final total costs of acquiring security controls for the two case studies that failed to provide them. Furthermore, the maintenance costs for the two case studies were deduced by identifying all the elements in the estimation that implied maintenance. These include the costs of maintaining, updating, running and reviewing the security controls, and of conducting security awareness training. These costs were identified, summed and subtracted from the total estimated costs, thereby separating the acquisition and maintenance costs (see appendix F).

Cost of Incidents/Breaches Estimation

As shown in the results in table 5.2, participants one, three and five did not provide information on their costs of incidents/breaches. However, they did provide information on the annual rate of occurrence of incidents/breaches, as seen in figure 5.9 of section 5.3. As cited in the literature review (section 2.4), the Ponemon Institute published a 2012 Cost of Cybercrime Study which reveals cybercrime costs. This study shows that large organisations (the focus of this project) incur a per capita cost of eighty-nine pounds (£89) for each security incident/breach experienced[35].
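This per-capita figure lends itself to a simple extrapolation. The sketch below assumes a hypothetical headcount and incident range, and uses the range-midpoint convention this section applies to the questionnaire's incident-rate answers.

```python
# Sketch of the incident-cost extrapolation: 89 pounds per employee for
# each incident, with the incident count taken as the midpoint of the
# range a respondent selected in the questionnaire.
# The headcount and range below are hypothetical examples.

PER_CAPITA_COST_GBP = 89  # 2012 Ponemon Cost of Cybercrime figure cited above

def incident_cost(employees, incident_range):
    """Estimate annual incident/breach cost from headcount and a reported range."""
    low, high = incident_range
    midpoint = (low + high) / 2       # e.g. (1 + 5) / 2 = 3 incidents
    return PER_CAPITA_COST_GBP * employees * midpoint

# A 1,000-employee organisation reporting "1 to 5" incidents a year:
print(incident_cost(1_000, (1, 5)))   # -> 267000.0
```

The same calculation, with each organisation's actual headcount and selected range, yields the supplementary figures recorded in appendix F.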
In calculating the costs of incidents for the case studies (participants) concerned, I multiplied the estimated per capita cost (£89) by the number of employees in each organisation. The result of this multiplication was further multiplied by the number of incidents/breaches reported by each participating organisation to

derive their final estimated costs of security incidents. The multiple-choice answers to the question on incident/breach rates are designed as ranges (i.e., 1 to 5 incidents, 20 to 100 incidents, etc.), so I used the midpoint of the range boundaries selected by each respondent as the respective number of incidents. For example, if 1 to 5 is the selected range, then [(1+5)/2] = 3, and three (3) is taken as the organisation's annual number of incidents for the computation. The details of the calculations/estimation are in the bottom table of appendix F. In table 6.7 below, I present the estimated costs of security for the six participating organisations. The figures presented are a combination of the primary data provided by respondents (seen in table 5.2) and the supplementary estimated costs seen in appendix F.

Table 6.7: Cost of security of participating organisations (all figures in pound sterling). For each participant, the table breaks the cost of security down into planned costs (acquisition/disposal, business continuity/disaster recovery, maintenance) and unplanned costs (breaches/incidents, legal fines/compensations, lost assets, damaged reputation, loss of business/time), alongside the participant's maturity level. The total costs of security are: participant one (Financial) 9,642,232; participant two (Financial) 6,191,410; participant three (Government) 8,332,007; participant four (Government) 7,253,529; participant five (Government) 27,263,658; participant six (IT Services) 3,682,542.

Table 6.7 above summarises the cost of security for the six participating organisations. Although the maintenance cost for participant three was estimated alongside its unreported acquisition cost, the reported maintenance cost was used. As

seen in table 6.7 above, a column showing the maturity levels of the participants is included to make referencing easy while discussing the results/findings in the next section.

6.3 Discussion

For clarity, I discuss the results of this project in separate paragraphs, each focusing on one security factor/category; where appropriate, cross-references are made to the other security categories. As seen in figure 5.2, the management perception of security at participant five is at level two. This obviously explains why the organisation's level of security investment is very low compared with the other participants, as seen in table 6.7 above. Table 5.3 shows that all the participants except participant five have security units and detailed security policies. This, by implication, shows an initial desire to abide by security requirements. However, evidence from the same table shows that these same organisations have weak attitudes towards updating their inventory records. In addition, figure 5.5 underlines that fact by showing these organisations' weak awareness of, and compliance with, information security. These findings reveal two things: management has the urge for information security, but voluntary individual commitment by employees is found wanting. Generally, participants one, three and five have weak information security cultures, especially participant five, whose maturity is relatively very low. Conversely, level four maturity is maintained by participants two, four and six. This shows that they are not just investing in security controls but are complementing those investments with good security consciousness among their employees. Table 6.1, which shows the assessment of organisations under process and procedures, reveals a trend in which participants one and three consider security investment at the implementation stage.
In contrast, participants two, four and six consider security investment at project inception or design. For the latter participants, this seems to explain why their final maturity level is higher (level 4), in line with the prevailing opinion discussed in the literature review. Crosby, like that prevailing view, argues that planning and investing at project inception is more likely to remove defects/errors and ensure

quality [21]. Quality in this context means information security [37]. This observation suggests that early security planning can positively influence the final maturity posture of an organisation. It is also observed that participants one and two both appear to take access control seriously. This could be due to the nature of their business, as both are in the financial industry. Again, participant five has the lowest performance in coordinating processes and procedures. While participants two, four and six are meeting requirements (level 4), participants one and three maintain a basic level of maturity. This is also the case with participants one and three for security governance. The fact that participants one and three maintain only a basic level (level 3) of maturity in these two security factors suggests that human factors hinder them from achieving greater maturity despite their investments. A key general observation should first be made about the results and assessments under the technology and innovation security factor/category: all the participants except participant five generally consider and implement security controls. Their responses in table 5.5 show that they have technological measures, such as cryptographic technologies and activity-logging mechanisms, with which they protect their information. Under this security factor, all participants except participant five achieved maturity level four (meeting requirements), confirming that they actually invest in security. But the main finding here is that participants one and three maintain level four maturity on 'technology and innovation' yet only level three under both 'security governance' and 'processes and procedures', as discussed earlier. The final maturity level of these two participants, as seen in table 6.5, is also level three (basic security).
Consequently, this points to the fact that good technological measures alone are not enough to ensure security if good security governance is neglected. In risk management, participants two, four and six, as in every other security factor discussed, again show consistency in their ability. Their generally good performance across all the security factors/categories tends to explain their overall higher maturity level compared to the other participants. Unfortunately, participant five has exhibited generally poor performance in all the security categories, which suggests that information security is not considered critical and important by participant five.

From the economics-of-security perspective, table 6.7 shows that participants two and four spend relatively more than the other participants on initial investment. Participants four and five are in the same industry (government) and are of roughly the same size, but participant four makes a higher initial security investment (£3,925,045) than participant five (£163,032). Their maintenance costs also vary considerably, as participant five, judging from the earlier results, does not take security seriously. Table 6.7 then shows a great difference in the final cost of security of these two government-sector participants: participant five eventually spends far more than participant four, because participant five spends heavily (£27,052,440) on remedying incidents, unlike participant four, which spends a relatively small amount (£1,391,960). It is therefore observable that, while participant four invested more than participant five to achieve security, participant five eventually spent far beyond its counterpart owing to the many incidents handled as a result of poor security. Similarly, participants one and two are in the same industry (financial services). Both invest in security, with participant two investing somewhat more, as seen in table 6.7. However, their final costs of security vary considerably, with participant one spending more (£9,642,232) than participant two (£6,191,410). Recall that participant one achieved maturity level three, while participant two achieved level four. This finding, together with the previous finding on the two government organisations, suggests that up-front security investment eventually reduces the cost of security. Since security investments influence organisations' maturity and, eventually, their cost of security, it is worth also estimating the cost expected at each maturity level.
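The government-sector comparison above can be made concrete. A minimal sketch using only the figures quoted in this section (all in pounds; the remaining cost categories, such as maintenance, account for the difference between these components and the totals):

```python
# Worked comparison of the two government participants, using the
# initial-investment, incident-remediation and total figures quoted
# in this section (pounds sterling).
p4 = {"initial": 3_925_045, "remediation": 1_391_960, "total": 7_253_529}
p5 = {"initial": 163_032, "remediation": 27_052_440, "total": 27_263_658}

# Participant four invested more up front...
extra_invested = p4["initial"] - p5["initial"]   # 3,762,013
# ...but ended with a far lower final cost of security.
net_saving = p5["total"] - p4["total"]           # 20,010,129

print(f"extra invested: £{extra_invested:,}")
print(f"net saving:     £{net_saving:,}")
```

The net saving dwarfs the extra up-front investment, which is the quantitative basis for the claim that up-front security investment eventually reduces the cost of security.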
Generalising from the cost ranges presented below for each maturity level could be misleading, because organisational size influences such costs. The projections made in this project are therefore based on the size range of the participating organisations (between 1,500 and 6,500 employees). The cost realities of the assessed organisations, as seen in this project, also form the basis of the cost projections in table 6.8 below.

Maturity level   Planned costs of security (£)   Unplanned costs of security (£)   Cost of security by maturity level (£)
1                0-250,000                       50,000,000 and above              50,000,000 and above
2                250,000-1,000,000               7,500,000-50,000,000              7,750,000-51,000,000
3                1,000,000-4,000,000             3,000,000-6,000,000               4,000,000-10,000,000
4                3,000,000-6,000,000             500,000-2,000,000                 3,500,000-8,000,000
5                3,500,000-6,500,000             No cost expected                  3,500,000-6,500,000

All figures are in pounds sterling. Planned costs of security comprise acquisition/disposal, business continuity/disaster recovery and maintenance; unplanned costs comprise breaches/incidents, legal fines/compensations, lost assets, damaged reputation and loss of business/time.

Table 6.8: Projected cost of security per maturity level

It should be noted that the figures in table 6.8 above are estimates projected from the economic outcomes of the case studies in this project. Although a rigorous methodology was used to conduct the project, supported by relevant security materials and resources, deviations might be witnessed when using this projection to predict the economic outcomes of organisations. These possible deviations could be due to some limitations of this project, as explained in the next chapter. In projecting the expected cost of incidents/breaches for each maturity level, I referred back to the frequency of incidents defined earlier in the assessment framework, and shown in figure 5.9, for each maturity level. I multiplied the defined incident rate or frequency for each maturity level by eighty-nine pounds (£89), the estimated per capita cost per incident according to the survey conducted by the Ponemon Institute [35]. The figures obtained by this multiplication were further multiplied by four thousand, three hundred and thirty-two (4,332), the average organisational size of the participating organisations, to obtain the cost range projections.
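The two numeric rules used in this chapter, the midpoint rule for converting a respondent's selected incident range into a single count and the per-level breach-cost projection just described, can be sketched as follows. The incident-frequency range in the usage example is an illustrative placeholder, not a figure taken from figure 5.9:

```python
# Sketch of the two estimation rules used in this chapter.
# PER_INCIDENT_COST (£89) is the Ponemon per capita cost per
# incident [35]; AVG_ORG_SIZE (4332) is the average size of the
# participating organisations.
PER_INCIDENT_COST = 89
AVG_ORG_SIZE = 4332

def incidents_from_range(low, high):
    """Midpoint rule: a selected range of 1 to 5 incidents -> 3."""
    return (low + high) / 2

def projected_breach_cost_range(freq_low, freq_high):
    """Project a breach-cost range from an incident-frequency range."""
    return (freq_low * PER_INCIDENT_COST * AVG_ORG_SIZE,
            freq_high * PER_INCIDENT_COST * AVG_ORG_SIZE)

print(incidents_from_range(1, 5))   # 3.0
# Hypothetical frequency range of 8 to 16 incidents a year:
low, high = projected_breach_cost_range(8, 16)
print(f"£{low:,} - £{high:,}")
```

Each projected bound is simply frequency × £89 × 4,332; the published ranges in table 6.8 are then rounded to convenient figures.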

As shown in table 6.8 above, the cost-of-security ranges for maturity levels three and four seem close, but their ranges for acquisition cost show considerable variance. Recall that participant one invested in security but ended up at level three maturity due to poor security governance. This tends to explain the wider range of cost of security at level three. The point here is that organisations can easily slip from level four maturity to level three if they lag in performance on the other security factors, despite their security investments. It takes good performance in all the security factors/categories for level four or level five maturity to be achieved.

6.4 Summary

This chapter discussed the testing of the ISMM and the evaluation of the case study organisations. It started by explaining the structure of the developed information security assessment framework (ISAF) used in evaluating the organisations, and then proceeded to evaluate the case study organisations. During this process, some costs were estimated and extrapolated to supplement the primary data provided by the organisations. The chapter concluded by discussing the results and findings of the project, during which trends relevant to answering the research questions were observed.

Chapter 7: Conclusion

This chapter brings the project to completion by summarising its achievements, limitations and conclusions. It also points the direction for future work.

7.1 Summary of Achievements

This project has delivered the following:
- A detailed literature review of maturity models, especially security maturity models. This can give future researchers who wish to work further on information security maturity models a good understanding of the topic.
- A cost-benefit information security maturity model (ISMM) which can be used to evaluate an organisation's information security maturity in relation to its economic status.
- A contribution to knowledge by establishing the relationship between the maturity of an organisation and its cost of security. Unlike other existing work on security maturity modelling, this project provides a projection of the likely security costs associated with each maturity level.
- A methodology, evaluation plan and iterative development approach for the ISMM which can serve as a stepping stone for future researchers in this area.

7.2 Project Limitations

Some limitations of this project include:
- Large-organisation bias: samples were deliberately drawn only from large organisations, for consistency. Results and economic outcomes might therefore differ for small organisations, whose economic realities around security investment can differ markedly.

- Estimated costs: a mixed research methodology was used, combining primary data from respondents with secondary research data. It is still possible that the economic realities of the participants were not perfectly reflected by this means, because some of their security costs, which were not provided, were estimated and extrapolated using industry cost estimation guides and security reports. There is also the possibility that respondents were dishonest and provided inaccurate data.
- Non-response: questionnaires distributed across the globe received no attention in some regions and industries. The project outcome might therefore not generalise, as the samples were only from Europe and Africa.

7.3 Recommendation for Future Work

The following recommendations are identified for future work.

Detailed metrics and evidence for assessment. As observed in the data collection tool and assessment framework, some of the investigations allowed the respondents to provide subjective evaluations on a scale of one to five. This approach could be improved by comprehensively specifying the metrics and expected evidence of performance for each security area across the maturity levels. Efforts could also be intensified in penetrating industries to gain physical interviews, inspection exercises and sighting of documentary evidence. This way, greater accuracy could be achieved in the results of such a project.

ISMM with improvement guidelines. The project set out to develop a maturity model which evaluates an organisation to determine its maturity level and how that level affects its economic status. Although the model may inspire improvement actions, providing improvement guidelines or a formula to the assessed organisation is outside the scope of this project. This is an area that could be worked on by future researchers.
They could extend the research by producing a security maturity model which not only evaluates an organisation but also provides guidelines or a formula for raising its maturity.

7.4 Conclusion

The project has produced a five-layer maturity model for evaluating the information security maturity of an organisation. The major aim, achieved by this project, was to determine the relationship between the maturity of an organisation and its cost of security. An overview of the project, and how it was motivated by the ever-increasing dependence on information technology, was presented in the introductory chapter. The literature review chapter explained the term 'maturity modelling' and proceeded to discuss trends in maturity modelling. Previous attempts and efforts to produce security maturity models were also highlighted in that chapter. The economics of security section explored efforts to determine the cost of cybercrime and the cost-benefit analysis of security investments. All of this set the scene for understanding and furthering the development of an economic security maturity model, which had not previously been attempted. Chapter three stated and briefly explained the various methodologies that could have been used to conduct the project; the choice of methodology was made and the reasons explained. The individual stages of the chosen methodology were described, detailing all the activities undertaken in developing the model, and the chapter closed by discussing the tools used to carry out the project. Chapter four explained in detail the iterative procedure used in developing the maturity model, along with the individual activities that characterised the iterative stages. The components, maturity levels, security factors and areas, and the general structure of the ISMM were also explained, and the chapter concluded by enumerating and explaining the three evaluation approaches used to test and ensure the efficacy of the ISMM. Results were presented in chapter five.
That chapter concentrated on presenting results using charts and tables, emphasising key points without discussing or interpreting them.

Chapter six detailed the processes and activities involved in testing the ISMM by evaluating six participants. It started by explaining the structure of the information security assessment framework (ISAF) used for the evaluation; a description of the actual assessment of the case studies followed, and finally a section discussed the results of the survey and the assessment outcomes. The discussion ended with a summary table showing the projected expected cost of security for each maturity level. The concluding chapter summarises the project by stating the achievements, limitations, recommendations for future work and conclusions drawn. In conclusion, the project focused on determining the relationship between the maturity of an organisation and its cost of security. Numerous security models exist with the similar goal of evaluating and ensuring the alignment of organisations' operations with security standards. However, complex and sophisticated technologies and controls are useless if the main aim of most organisations - which is profit making - is not achieved [57]. Unlike other models and research that concentrated only on increasing an organisation's technical capability, this project answers the common management question of whether or not security investment is worth making. From the assessment, organisations at higher maturity (levels four and five) spent more on initial investment than those at the lower levels (levels one and two especially). Consequently, their final costs of security were far lower than those of the organisations at the lower maturity levels. That is to say, the higher the maturity of an organisation, the lower its eventual cost of security, and vice versa. Considering the outcome of this project, I subscribe to the opinion of most security experts, who maintain that it is profitable to make up-front security investments [35, 56].
Furthermore, making an investment without proper guidance and human attitudinal support might still be a wasteful venture, as seen in this project: participants one and three invested in security, but poor security governance hindered them from meeting security requirements. On this note, I conclude that good technological controls alone, without adequate complementary security governance, are not enough to ensure information security.

List of References

[1]. BBC News. South Korea Network Attack 'a computer virus' [Online] Available at: [Accessed 20 March 2013]
[2]. European Network and Information Security Agency. Introduction to Return on Security Investment. Greece [Online] Available at: [Accessed 20 August 2013].
[3]. Andersen, K.V. and Henriksen, H.Z. E-government maturity models: Extension of the Layne and Lee model. Government Information Quarterly, 23 (2)
[4]. Anderson, R., Security economics: a personal perspective. in Proceedings of the 28th Annual Computer Security Applications Conference, (2012), ACM,
[5]. Anderson, R., Why cryptosystems fail. in Proceedings of the 1st ACM conference on Computer and communications security, (1993), ACM,
[6]. Anderson, R., Barton, C., Böhme, R., Clayton, R., van Eeten, M., Levi, M., Moore, T. and Savage, S., Measuring the Cost of Cybercrime. in WEIS, (2012).
[7]. Anderson, R. and Moore, T. The economics of information security. Science, 314 (5799)
[8]. April, A., Huffman Hayes, J., Abran, A. and Dumke, R. Software Maintenance Maturity Model (SMmm): the software maintenance process model. Journal of Software Maintenance and Evolution: Research and Practice, 17 (3)
[9]. Great Britain. The National Technical Authority for Information Assurance. HMG Information assurance maturity model. Version 4.0 [14] CESG. London: Cabinet Office and CESG, [Online] Available at: aturity+model [Accessed 10 April 2013]
[10]. ISACA. Control objectives for information and related technology, [Online] Available at: [Accessed 14 July 2013]
[11]. Automotive, S. Automotive SPICE Process Assessment Model. Final Release, v4,
[12]. The World Bank. Inflation, consumer prices (annual %), [Online] Available at: C5-A9?display=graph [Accessed 28 August 2013]

[13]. Becker, J., Knackstedt, R. and Pöppelbuß, D.-W.I.J. Developing maturity models for IT management. Business & Information Systems Engineering, 1 (3)
[14]. Bodin, L.D., Gordon, L.A. and Loeb, M.P. Evaluating information security investments using the analytic hierarchy process. Communications of the ACM, 48 (2)
[15]. Brecht, M., Nowey, T. and Krones, A., A Closer Look at Information Security Costs. in WEIS, (2012).
[16]. Brooks, F., Basili, V., Boehm, B., Bond, E., Eastman, N., Jones, A., Shaw, M. and Zraket, C. Report of the Defense Science Board Task Force on Military Software. Office of the Undersecretary of Defense for Acquisition.
[17]. Cass, A., Volcker, C., Winzer, L., Carranza, J. and Dorling, A. SPiCE for SPACE: a process assessment and improvement method for space software development. ESA bulletin,
[18]. Cavusoglu, H., Mishra, B. and Raghunathan, S. A model for evaluating IT security investments. Communications of the ACM, 47 (7)
[19]. Chapin, D.A. and Akridge, S. How can security be measured. Information Systems Control Journal,
[20]. Cooper, J., Fisher, M. and Sherer, S.W. Software Acquisition Capability Maturity Model (SA-CMM) Version 1.02, DTIC Document,
[21]. Crosby, P.B. Quality is free: The art of making quality certain. McGraw-Hill, New York,
[22]. United States. Department of Education, Information Technology Security. Information Technology Security Cost Estimation Guide, 2002. [Online] Available at: Estimation_Guide_NIST.doc [Accessed 20 August 2013]
[23]. Fisher, D.M. The business process maturity model: a practical approach for identifying opportunities for optimization. Business Process Trends, 9 (4)
[24]. Fumey-Nassah, G., The management of economic ramification of information and network security on an organization. in Proceedings of the 4th annual conference on Information security curriculum development, (2007), ACM, 25.
[25]. G., J.H. Why 70% of Government IT Projects Fail - Quality Project Management for Education Agencies. ESP Solutions Group, Project Management Series - Part 1, 2nd Edition.
[26]. Gonzalez, J.J. and Sawicka, A., A framework for human factors in information security. in WSEAS International Conference on Information Security, Rio de Janeiro, (2002).

[27]. Gordon, L.A. and Loeb, M.P. The economics of information security investment. ACM Transactions on Information and System Security (TISSEC), 5 (4)
[28]. Gordon, L.A., Loeb, M.P., Lucyshyn, W. and Richardson, R. CSI/FBI computer crime and security survey. Computer Security Institute,
[29]. Gribben, R. Warning for small firms over cyber crime explosion. The Telegraph,
[30]. Harris, L.E. Digital property: currency of the 21st century. McGraw-Hill Ryerson,
[31]. Hefner, R. and Monroe, W., System security engineering capability maturity model. in Conference on Software Process Improvement, (1997).
[32]. Humphrey, W.S. Managing the Software Process (Hardcover). Addison-Wesley Professional,
[33]. CISCO Systems Inc. Cisco annual security report [Online] Available at: [Accessed 03 May 2013].
[34]. Computer Security Institute. 2010/2011 CSI Computer Crime and Security Survey, New York, [Online] Available at: [Accessed 20 August 2013]
[35]. Ponemon Institute. Cost of Cyber Crime Study: United Kingdom, [Online] Available at: 20Cost%20of%20Cyber%20Crime%20Study%20FINAL%204.pdf [Accessed 17 July 2013]
[36]. NIST. Guide to selecting information technology security products, Gaithersburg, MD,
[37]. British Standards Institution. BS ISO/IEC 25010:2011. Systems and software engineering - Systems and software Quality Requirements and Evaluation (SQuaRE) - System and software quality models. London: BSI,
[38]. British Standards Institution. BS ISO/IEC :2006. Information technology - process assessment - Part 5: An exemplar process assessment model. London: BSI,
[39]. British Standards Institution. BS ISO/IEC 27002: Information Technology - Security Techniques - Code of Practice for Information Security Management. London: BSI,

[40]. Ives, B., Jarvenpaa, S.L. and Mason, R.O. Global business drivers: aligning information technology to global business strategy. IBM Systems Journal, 32 (1)
[41]. M. Madhavan. Authorities knew about Malaysian parliament hack. CyberSecurity Malaysia [Online] Available at: detail/1307/index.html [Accessed 28 April 2013]
[42]. Moore, T. and Anderson, R. Economics and Internet security: A survey of recent analytical, empirical and behavioral research. Harvard University Computer Science Group.
[43]. NIST. Guide for Security-Focused Configuration Management of Information Systems, Gaithersburg, MD,
[44]. NIST. Security Self-Assessment Guide for Information Technology Systems. Washington: U.S. Government Printing Office,
[45]. Paulk, M.C., Weber, C.V., Curtis, B. and Chrissis, M. The capability maturity model: Guidelines for improving the software process. Addison-Wesley, Reading,
[46]. Paulk, M.C., Weber, C.V., Garcia, S.M., Chrissis, M.B.C. and Bush, M. Key practices of the Capability Maturity Model version 1.1.
[47]. N. Perlroth. Hackers in China Attacked The Times for Last 4 Months. The New York Times [Online] Available at: [Accessed 03 May 2013]
[48]. Posthumus, S. and Von Solms, R. A framework for the governance of information security. Computers & Security, 23 (8)
[49]. PricewaterhouseCoopers. Information security breaches survey. A technical report prepared by PricewaterhouseCoopers for the Department of Business, Innovation and Skills, BIS/13/P [Online] Available at: [Accessed 27 April 2013]
[50]. Security Lancaster, Small Business: Cyber Security Survey [Online] Available at: [Accessed 28 April 2013]
[51]. RateInflation. United Kingdom Inflation Rate History to 2013, [Online] Available at: [Accessed 29 August 2013]

[52]. Richardson, R. CSI computer crime and security survey. Computer Security Institute,
[53]. Rosenzweig, R. Scarcity or abundance? Preserving the past in a digital era. The American Historical Review, 108 (3)
[54]. Saleh, M.F. Information Security Maturity Model. International Journal of Computer Science and Security (IJCSS), 5 (3). 21.
[55]. D. E. Sanger. In Cyberspace, New Cold War. The New York Times [Online] Available at: [Accessed 03 May 2013]
[56]. B. Schneier. Schneier on Security, [Online] Available at: [Accessed 29 August 2013]
[57]. National Computing Centre. IT Governance. Developing a successful governance strategy. A best practice guide for decision makers in IT. Manchester: NCC, [Online] Available at: Enterprise-IT/Prepare-for-the-Exam/Study-Materials/Documents/Developing-a-Successful-Governance-Strategy.pdf [Accessed 20 August 2013]
[58]. Software Engineering Institute. Capability Maturity Model Integration (CMMi) Overview. Pittsburgh, PA, [Online] Available at: [Accessed 17 April 2013]
[59]. Sonnenreich, W., Albanese, J. and Stout, B. Return on security investment (ROSI) - a practical quantitative model. Journal of Research and Practice in Information Technology, 38 (1)
[60]. Stambul, M.A.M. and Razali, R., An assessment model of information security implementation levels. in Electrical Engineering and Informatics (ICEEI), 2011 International Conference on, (2011), IEEE, 1-6.
[61]. Sun, L., Srivastava, R.P. and Mock, T.J. An information systems security risk assessment model under the Dempster-Shafer theory of belief functions. Journal of Management Information Systems, 22 (4)
[62]. Team, C.P. Capability Maturity Model Integration (CMMI SM), Version 1.1. Software Engineering Institute, Carnegie Mellon University/SEI-2002-TR-012. Pittsburgh, PA.
[63]. Team, C.P. CMMI for Development, Version 1.3.
[64]. Team, V.R. Data breach investigation report. A technical report prepared by the Verizon RISK Team,

[65]. von Wangenheim, C.G., Hauck, J.C.R., Salviano, C.F. and von Wangenheim, A., Systematic literature review of software process capability/maturity models. in Proceedings of International Conference on Software Process Improvement and Capability Determination (SPICE), Pisa, Italy, (2010).
[66]. Woodhouse, S., An ISMS (im)-maturity capability model. in Computer and Information Technology Workshops, CIT Workshops, IEEE 8th International Conference on, (2008), IEEE,
[67]. Xiao-yan, G., Yu-qing, Y. and Li-lei, L. An Information Security Maturity Evaluation Mode. Procedia Engineering,

Appendix A: Cover Letter and Questionnaire

School of Computer Science,
Kilburn Building,
The University of Manchester,
M13 9PL, Manchester,
18th April,

,...,...,...,...,...

Sir/Madam

RESEARCH QUESTIONNAIRE

I am a postgraduate student on the Advanced Computer Science - Computer Security programme at the University of Manchester, UK. I am currently working on my dissertation, which involves research on the topic Information Security Maturity Model. This is intended to be a tool for evaluating where an organisation is positioned on the proposed 5-layer maturity model, and it should inspire an organisation along an evolutionary path to achieve maturity. The focus of the research is primarily to investigate and establish the relationship between the maturity of an organisation's (or supply chain's) development activity and its cost of security; that is, to determine whether organisations that invest in security actually save and gain more economically than those that do not.

Please note: this interview is strictly for academic purposes, to aid research decisions; information provided will be treated with the utmost confidentiality. You are also at liberty to discontinue participation in this research at any time. If you are interested in participating, it would be appreciated if your answers were as accurate and honest as possible. Attached to this cover letter is the Information Security Maturity Questionnaire for you to fill in. Thank you for your cooperation and time.

Nnatubemugo Innocent Ngwum
ngwumn@cs.man.ac.uk

Information Security Maturity Questionnaire

Instructions: Circle the option which corresponds to your answer (e.g., a, b, c, d, e, or Yes/No). Provide answers to the other questions by writing in the space provided immediately beside or below each of them.

1. What percentage of the total IT budget was invested in security in the past 5 years?
2. Estimate the monetary value of investments in security measures in the past 5 years.
3. Estimate the total cost of maintaining and updating security mechanisms in the past 5 years.
4. How much in total was spent on remedying security breaches, fines and other losses (i.e., cost of security) in the past 5 years?
5. Estimate the total loss due to business disruption, reputational damage and direct asset loss during security incidents in the past 5 years.
6. What percentage of the entire organisation's business revenue is the loss?
7. How many legal/regulatory fines has your organisation faced in the past 5 years?
8. Does the organisation have a security policy? If yes, how comprehensive is it? (a. No policy, b. Word-of-mouth instructions, c. Security manual, d. Comprehensive security policy, e. Regularly-reviewed policy)
9. How well do the requirements of the policy spread across departments/functions? Specify on a scale of 1-5.
10. Does information security constitute a key consideration in every business plan of the organisation? If yes, indicate the overall degree of its consideration on a scale of 1 to 5.
11. Are there controls to limit risks due to human factors? If yes, please state a few.
12. What stages does an information/information system go through during its life cycle in the organisation? Specify:
13. At what stage in a project life cycle is security considered and/or introduced? (a. Not considered, b. As the need arises, c. Implementation stage, d. Design stage, e. Project conception/inception stage)
14. Approximately how much was spent in setting up the security features? (a. $10,000-$20,000, b. $20,000-$50,000, c. $50,000-$100,000, d. Other - Specify)

15. Approximately how much is spent on maintaining the security of information systems annually? (a. $2,000-$5,000, b. $5,000-$10,000, c. $10,000-$20,000, d. Other - Specify)
16. Is information classified in your organisation (i.e., unclassified, classified, restricted, confidential)? Yes/No
17. If yes above, please specify the organisation's information categories and the classifications assigned (e.g., unclassified, classified, restricted, confidential, private, commercial in confidence).
18. Are there technological measures in place to ensure information security (e.g., against malicious code, intrusion, etc.)? Specify:
19. What is the perception of top management as regards information security? (a. Totally ignorant, b. Aware & uninterested, c. Supportive, d. Very supportive, e. Partakers)
20. How detailed is the organisation's recruitment screening to ensure qualified staff for key functions? Specify on a scale of 1-5 (1 is Min., 5 is Max.).
21. Is there proper risk assessment and treatment for all information assets? Estimate the organisation's ability to avoid recurrence of solved security problems on a scale of 1 to 5 (1 is Min., 5 is Max.).
22. How often is IT equipment changed/upgraded? Every: (a. 20 yrs, b. 10 yrs, c. 5 yrs, d. 3 yrs)
23. List the technical controls in place in your organisation for securing data.
24. How often do malware and/or other security attacks disrupt key business processes? (a. Several times a week, b. Twice a month, c. Once in 6 months, d. Barely once a year, e. Not at all)
25. Does the organisation have an established information security training and awareness programme? (a. No training, b. Individual efforts to train, c. Annual general training, d. Specialised security training provided, e. Training is provided in line with staff's changing job requirements)
26. Describe the rate of the organisation's denial-of-service experience in a year. (a. Daily, b. Weekly, c. Occasionally, d. Hardly, e. Never)
27. Are there standard operational procedures across all departments which support information security? If yes, please name a few:
28. Are there controls to limit risks due to human factors? If yes, please state a few:

29. Is the organisation's IT equipment inventory well defined? (a. No inventory, b. Few records, c. Incomplete/outdated inventory, d. Detailed up-to-date inventory, e. Auto-updated inventory)
30. Are there access control mechanisms (role-based, mandatory, etc.)? If yes, what type of access control mechanism is used?
31. How regularly are the granted rights reviewed and updated? (a. Not reviewed, b. Annually, c. 6-month intervals, d. Quarterly, e. Monthly)
32. Are there physical protections for information assets (e.g., servers, paper files, etc.)? Please specify below:
33. What are the measures in place for authentication and authorisation in the organisation? Please specify below:
34. Is there an access control policy in place for managing user access? (a. No access control policy, b. System administrators control access, c. Access control policy exists, d. Detailed, formal & documented policy, e. Policy is agile)
35. What is the period of inactivity before a user's account is deactivated? (a. Never deactivated, b. 1 year, c. 6 months, d. 3 months, e. 1 month)
36. Grade employees' information security awareness and compliance level in your organisation. (a. Unaware, b. Aware/not compliant, c. Weak compliance, d. Compliant, e. Very compliant)
37. Does the organisation follow any information security guidelines? Yes/No
38. Is there an information security unit in the IT department of the organisation? If Yes, state how established it is on a scale of 1 to 5.
39. Does the information policy cover all operations of all departments in the organisation? Yes/No
40. How often are there cases of staff misuse of access rights/privileges? (a. Daily, b. Weekly, c. Monthly, d. Rarely happen in a year, e. Rarely)
41. Does your organisation have an incident report/response procedure? If Yes, how defined is it on a scale of 1 to 5?
42. Are data grouped according to their sensitivity in the organisation's policy? Yes/No (if Yes, specify below)
43. Do information system controls reflect security considerations at the design stage? Rate on a scale of 1 to 5.
44. Does the information systems design support technological innovation for improvement? Evaluate the degree of flexibility on a scale of 1 to 5.
45. Are there vulnerability scanning tools in place (specify if Yes)? Yes/No
46. Are there data loss prevention tools in place (specify if Yes)? Yes/No

47. How comprehensive is the implementation of security requirements/policy (if any) and standards in the security architecture? Specify on a scale of 1-5.
48. Is there a policy for the implementation of cryptographic techniques/controls? Are cryptographic controls used to secure information? If Yes, specify the type.
49. Are there standard operational procedures across all departments which support information security? Yes/No
50. How often are computer systems checked/scanned for malicious applications/activities? (a. Occasionally, b. Quarterly, c. Weekly, d. Daily, e. Real-time auto-scanning/monitoring)
51. Are there mechanisms for ensuring non-repudiation (i.e., people are made accountable for their activities) in the organisation's communications system? Yes/No
52. How willing and ready is top management in adopting and supporting a security policy? (a. Unwilling, b. Likely to be willing, c. Willing, d. Very willing, e. Drivers)
53. Is there a Bring Your Own Device (BYOD) policy in place? Are there proper arrangements for mobile computing/teleworking? Specify below:
54. How often are members of staff prompted for password change? (a. No prompt, b. At users' discretion, c. 6-month intervals, d. Monthly, e. Active monitoring alerts for change as needed)
55. Are staff accounts ever deactivated for inactivity, and for how long prior to deactivation? (a. Forever active, b. Yearly, c. 6 months, d. 60 days, e. 30 days)
56. How formalised are your organisation's reporting procedures for security issues? Specify on a scale of 1-5.
57. Members of the security team comprise staff from: (a. No team, b. Visiting expert, c. Staff of IT department, d. Staff from key process departments, e. Staff from all departments)
58. Does the organisation have any access control mechanism in place? Yes/No
59. If yes, what type of access control is used? (a. Mandatory, b. Discretionary, c. Role-based, d. Other - specify)
60. Are the access rights granted to users reviewed regularly? Yes/No
61. Are there malicious code detection tools in place (specify if Yes)? Yes/No
62. How often are the firewall and IDS logs monitored and acted upon? (a. Rarely, b. Annually, c. Monthly, d. Weekly, e. Daily)
63. Are there mechanisms for monitoring and restricting malicious activities over the organisation's network? Yes/No (if Yes, specify)
64. What is the average percentage uptime of the organisation's network in a year (Intranet and Internet)? (a. …, b. …, c. …, d. …, e. …)

65. How often is penetration testing carried out? (a. 5-year intervals, b. 2-year intervals, c. Annually, d. 6-month intervals, e. Quarterly)
66. How often are data backed up? (a. No back-up exercise, b. Annually, c. Weekly/Monthly, d. Daily, e. Auto real-time back-up)
67. How often are virus scanning and updates carried out? (a. Annually, b. Quarterly, c. Monthly, d. Weekly, e. Daily)
68. How effective, if any, is the mechanism for logging and tracing activities within the organisation? Specify on a scale of 1-5.
69. Does the organisation's system allow multiple logins with one ID? Yes/No
70. How regularly is the information system audit for compliance conducted? (a. Never, b. Rarely, c. Yearly, d. 6-monthly, e. Quarterly)
71. How far does the security architecture allow for innovative improvements? Specify on a scale of 1-5.
72. Is there a business continuity/disaster recovery plan for the company's information systems? Yes/No (if Yes, specify the type)
73. How often is this plan checked and tested? (a. Annually, b. 6-month intervals, c. Quarterly, d. Monthly, e. Weekly)
74. How compliant are the operations of the organisation with policy, regulatory, and legal requirements? (a. Several breaches, b. Not compliant, c. Weak compliance, d. Compliant, e. Very compliant)
75. Are there controls to limit risks due to human factors (admin-only privilege, etc.)? Specify, if any:
76. How knowledgeable and responsive are staff to information security concerns? (a. Ignorant, b. Aware, c. Complying, d. Seriously complying, e. Innovatively complying)
77. Is there a procedure for reporting security incidents? Yes/No (if Yes, specify)
78. What is the rate of security incidents and breaches in the past year? (a. Several incidents, b. … cases, c. … cases, d. 1-5 cases, e. No incident)
79. Is there punishment for staff violation of the security policy? (a. Staff security breaches go unnoticed, b. Breaches are noted without punishment, c. Staff are punished when there are tangible losses, d. Staff are punished appropriately, e. Punishments are also recorded for future reference)
80. What is the time between a security incident and its resolution (response rate)? (a. 1 year, b. 1 month, c. 1 week, d. 1-5 days, e. 1 day)
81. Would you describe your organisation's business growth rate as 'neutral' in the past five (5) years? Yes/No (If No, answer the following two questions.)

82. Describe, in percentage, your organisation's business growth rate (i.e., how far the organisation has gained more customers/businesses) compared to the past five (5) years.
83. Describe, in percentage, your organisation's business drop rate (i.e., how far the organisation has lost its usual customers/businesses) compared to the past five (5) years.
84. Does your organisation have a staff information security training plan, and how often is it run?
85. Evaluate your organisation's information risk management capability, across all departments, on a scale of 1-5.
86. Does your organisation have a well-developed risk register (i.e., a document containing all likely information risks, their owners/managers and the countermeasures planned)? Yes/No
87. If Yes above, specify how comprehensive the risk register is on a scale of 1-5.
88. How often is the risk register (i.e., risk assessment) reviewed to reflect evolving information security risks?
89. What is the average occurrence of security incidents/breaches resulting from unidentified risks in a year?
90. Evaluate your organisation's level of assignment of security roles/responsibilities (i.e., who handles what sensitive data, etc.), and staff's understanding of and compliance with their roles, on a scale of 1-5.
91. Are the various information risks assessed/identified and mitigation actions planned? Yes/No

If there is any other relevant information you may wish to add which has not been covered in these questions, concerning your organisation's security practices, please feel free to write it below:

What is your organisation's industry? ...
What is the number of employees in your organisation? ...
What is your job role/position? ...
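A minimal sketch of how the questionnaire's lettered answers could be turned into numeric scores. The helper below is hypothetical — the dissertation does not prescribe a letter-to-point mapping — but it illustrates the common convention of scoring options a-e as 1-5 points and averaging them per security area.

```python
# Hypothetical scoring helper (the a-e -> 1-5 mapping is an assumption,
# not taken from the dissertation): average the point values of the
# lettered answers collected for one security area of the questionnaire.

OPTION_POINTS = {"a": 1, "b": 2, "c": 3, "d": 4, "e": 5}

def score_answers(answers):
    """Return the mean point value of a list of lettered answers."""
    points = [OPTION_POINTS[answer.lower()] for answer in answers]
    return sum(points) / len(points)

# e.g. two answers in one area: 'c' (3 points) and 'e' (5 points)
print(score_answers(["c", "e"]))  # -> 4.0
```

An assessor would compute one such average per security area, which is the shape of the "Average Point" fields used in the ISAF of Appendix C.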

Appendix B: Information Security Maturity Model (ISMM)

The ISMM maps security factors and their security areas against five maturity levels: Level 1: Vulnerable; Level 2: Security Awareness; Level 3: Basic Security; Level 4: Meeting Requirement; Level 5: Robust Security.

Process and Procedures Standardisation — Access management:
Level 1: Complete ignorance of the criticality of information security. No policy/procedures in place for coordinating and protecting information within the organisation and while in transit on the networks.
Level 2: Awareness of the criticality of information security, with little effort. An access control policy is in place, but implementation across the various essential information facilities is still minimal and uncoordinated.
Level 3: Basic compliance with information security requirements. Access control mechanisms for users, networks, applications, operating systems, etc. are in place, with little effort to track their efficacy and correct shortfalls.
Level 4: Full compliance with standard information security requirements. Standard ways of managing access to information in its various forms are practised; a mobile computing (BYOD) policy is well defined and followed. Performance is measured.
Level 5: Innovative ways of ensuring information security. Active control of information access is practised, with regular review of the access control policy to match evolving threats and dynamic management of information assets to ensure security.

Process and Procedures Standardisation — Compliance:
Level 1: Operations are completely non-compliant with legal, technical, and security regulations and standards.
Level 2: The compliance regime shows a desire to comply; constant review/audit of information systems is not yet practicable.
Level 3: Regular reviews by internal audits, with actions followed to achieve compliance.
Level 4: Both internal and external auditors ensure full compliance with security policies, standards and legal requirements.
Level 5: External audit shows that best practices are consistently followed, resulting in no critical compliance issues.

Technology and Innovation — Through-life technological implementations:
Level 1: Technology used is still sub-standard and vulnerable to threats/attacks.
Level 2: Security investment in information systems is reactive, made as the need arises.
Level 3: Basic security requirements are met; maintenance and upgrades to match standards are lacking.
Level 4: Full implementation of security requirements; standard cryptographic controls are used.
Level 5: Innovative use of standard and the latest technology to ensure security.

Security Governance — Management of information security:
Level 1: Lack of management awareness of, and interest in, the need for information security.
Level 2: Management understands the criticality of information security to the business, but is not supportive.
Level 3: Management understands its role in ensuring security, with investment and efforts made to ensure security.
Level 4: Management is fully involved; security investments are well planned, implemented and measured against performance.
Level 5: Information security is considered an essential/integral part of the business by management.

Security Governance — Training and awareness programme:
Level 1: No awareness and training programme initiated.
Level 2: The awareness programme is weak and not supported with active staff training.
Level 3: An annual awareness and security training programme is considered.
Level 4: Regular training to align staff with best practices; staff security awareness/competence is tested and improved on.
Level 5: Proactive ways of ensuring information security show high staff security awareness.

Security Governance — Roles and Responsibility:
Level 1: Information security roles are not assigned.
Level 2: Roles/responsibilities are specified, though not strictly followed.
Level 3: Information asset owners understand and follow their roles/responsibilities for their assets.
Level 4: Information Risk Owners are appointed and continually redirected to achieve security objectives.
Level 5: Appointment of a Chief Information Officer to the board.

Risk Management capability — Risk assessment and treatment:
Level 1: There is no information risk policy; assessment and treatment of risks is not done.
Level 2: A risk policy and risk owners exist, but there is no proper handling of risks.
Level 3: Risk appetite, policy and owners are well defined and followed; the risk register specifies core business risks and their treatment.
Level 4: Adequate risk management plans, with periodic accreditation of old and new information systems to reduce risks.
Level 5: Active review of risks, threats and vulnerabilities ensures risks are kept within the organisation's risk appetite.

Risk Management capability — Incident and Business continuity management:
Level 1: No incident reporting and response procedure in place; a business continuity plan does not exist.
Level 2: Reporting of security events lacks coordination; business continuity plans are not in place.
Level 3: A procedure for reporting security events exists; business impact analysis and plans are implemented.
Level 4: Standard and effective mechanisms for reporting and responding to security events; business continuity plans cover all areas of the business.
Level 5: Procedures and mechanisms for security events and business continuity plans are continually tested, maintained and reviewed to keep them up to standard.
Economics of Security — Planned Cost of Security, Unplanned Cost of Security, Total Cost of Security (cost bands per maturity level; figures as given): 0-250,000; …,000; 1,000,000; 50,000,000 and above; 7,500,000-50,000,000; 50,000,000 and above; 7,750,000-51,000,000; 1,000,000-4,000,000; 3,000,000-6,000,000; 4,000,000-10,000,000; 3,000,000-6,000,000; 500,000-2,000,000; 3,500,000-8,000,000; 3,500,000-6,500,000; No cost expected; 3,500,000-6,500,…
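The structure of Appendix B can be summarised as a small data model. The illustrative sketch below (the dissertation contains no code) encodes only the factor, area and level names from the table; the nesting itself — factors grouping areas, each area described at five levels — is what it shows.

```python
# The ISMM's shape: five maturity levels, and security factors that each
# group one or more security areas (names taken from Appendix B).

MATURITY_LEVELS = [
    "Vulnerable", "Security Awareness", "Basic Security",
    "Meeting Requirement", "Robust Security",
]

ISMM_FACTORS = {
    "Process and Procedures Standardisation": ["Access management", "Compliance"],
    "Technology and Innovation": ["Through-life technological implementations"],
    "Security Governance": [
        "Management of information security",
        "Training and awareness programme",
        "Roles and Responsibility",
    ],
    "Risk Management capability": [
        "Risk assessment and treatment",
        "Incident and Business continuity management",
    ],
    "Economics of Security": [
        "Planned Cost of Security", "Unplanned Cost of Security",
        "Total Cost of Security",
    ],
}

print(len(MATURITY_LEVELS), len(ISMM_FACTORS))  # -> 5 5
```

Each (area, level) pair in this structure would map to one descriptive cell of the Appendix B matrix.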

Appendix C: Information Security Assessment Framework (ISAF)

PROCESS/PROCEDURE STANDARDISATION

Access Management. For each area to probe, the expected evidence at each maturity level (Level 1: Vulnerable through Level 5: Robust Security) is listed; each area is assigned a point per the level weightings.

- Is there an access control policy in place for managing user access?
  Level 1: No access control policy exists.
  Level 2: Staff with administrative privileges control access at will.
  Level 3: A formal access control policy is documented.
  Level 4: The access control policy covers all requirements related to information access control.
  Level 5: Constant review and documentation of the access control policy to reflect changing business and security requirements.

- How regularly are the granted rights/privileges reviewed and updated?
  Level 1: Not reviewed at all. Level 2: Annually. Level 3: 6-month intervals. Level 4: Quarterly. Level 5: Monthly.

- Is there a formal information security incident management procedure?
  Level 1: Does not exist.
  Level 2: Reporting of incidents is uncoordinated and does not follow specific procedures.
  Level 3: Procedures exist, though they are not strictly followed and are thus ineffective in ensuring timely response.
  Level 4: Well-defined procedures for reporting and responding to security incidents, with a well-defined escalation path and proper documentation of every report/response given.
  Level 5: Standard procedures are strictly followed and constantly improved on to improve security; proactive measures are in place to prevent past documented incidents.

- What is the period of inactivity before a user's account is deactivated?
  Level 1: Never deactivated. Level 2: 1 year. Level 3: 6 months. Level 4: 3 months. Level 5: 1 month.

- At what stage in a project life cycle is security considered and/or introduced?
  Level 1: Not considered at all. Level 2: As the need arises. Level 3: Implementation stage. Level 4: Design stage. Level 5: Inception/conception.

- Is there punishment for staff violation of the security policy?
  Level 1: Staff security breaches go unnoticed.
  Level 2: Breaches are noted without punishment.
  Level 3: Staff are punished when there are tangible losses.
  Level 4: Appropriate mechanisms exist for tracking, identifying and disciplining staff security misconduct.
  Level 5: Staff breaches of security are punished and documented for future reference.

- Are there standard operational procedures across all departments which support information security?
  Level 1: No defined operating procedure in place.
  Level 2: Understanding of the need for an operating procedure exists, but it is not implemented.
  Level 3: An operating procedure exists, but is not strictly practised.
  Level 4: The operating procedure details the proper use of all information processing facilities and users' responsibilities.
  Level 5: The operating procedure is constantly reviewed for efficacy; there is proper segregation of duties to prevent negligence of duty and to ensure security.

- Is there a Bring Your Own Device (BYOD) policy in place? Are there proper arrangements for mobile computing/teleworking?
  Level 1: Policy and provisions for the use of mobile devices do not exist.
  Level 2: Individual efforts to ensure security, without a formal BYOD policy.
  Level 3: Detailed requirements for staff use of personal devices are not fully specified in the policy.
  Level 4: A detailed policy specifying how mobile devices may access sensitive information within and outside the organisation is in place.
  Level 5: The mobile computing policy is continually reviewed to meet evolving security requirements/challenges.

- How often are computer systems checked/scanned for malicious applications/activities?
  Level 1: Not done so long as there is no issue.
  Level 2: Quarterly, or as problems arise.
  Level 3: Weekly or monthly.
  Level 4: Daily.
  Level 5: Active/real-time monitoring of system activities to identify and resolve malware.

- How often are staff prompted for password change to allow access to operating systems and other applications?
  Level 1: Access restriction to operating systems/applications is minimal.
  Level 2: Password management/change is at the discretion of the user.
  Level 3: Security mechanisms prompt the user for a password change every six months.
  Level 4: Policy and security mechanisms insist that user passwords be changed monthly.
  Level 5: Passwords are changed in accordance with the access control policy, with active identification and blocking of password/user account misuse.

- How often are the firewall and IDS logs monitored and acted upon?
  Level 1: No such operation. Level 2: Annually. Level 3: Monthly/Quarterly. Level 4: Weekly. Level 5: Daily.

Average Point

Compliance

- How compliant are the operations of the organisation with policy, regulatory and legal requirements?
  Level 1: Several breaches are recorded often.
  Level 2: Compliance is still reactive and not consistent in all aspects.
  Level 3: Technical compliance is assured, but legal and regulatory compliance is not fully enforced.
  Level 4: Full compliance with legal, regulatory, technical and other security requirements is practised.
  Level 5: Compliance with legal, regulatory and security requirements is continually checked and reviewed as necessary.

- How often are there cases of staff misuse of access rights/privileges? Are users obliging their user responsibilities?
  Level 1: Several daily cases of user violation of the security policy due to ignorance.
  Level 2: Users understand their security responsibilities, but do not actively abide by them.
  Level 3: Few cases of violation in a month, due to conscious efforts to fulfil security responsibilities.
  Level 4: Users understand their security responsibilities and actively abide by them.
  Level 5: Access control is effective, and cases of misuse and information compromise are rare due to users' proactive compliance with their security responsibilities.

- How regularly is the information system audit for compliance conducted? Are penetration testing and vulnerability assessments conducted?
  Level 1: Never done.
  Level 2: Individual staff efforts to troubleshoot and correct errors and system vulnerabilities.
  Level 3: System audits are only carried out by internal auditors.
  Level 4: Penetration testing and vulnerability scanning are carried out every six months by external experts; system audits by internal officers are also done regularly.
  Level 5: Penetration testing and vulnerability scanning are regularly done; both internal and external auditors regularly confirm compliance of information systems with security requirements.

- What is the rate of security incidents and breaches in the past year?
  Level 1: Several breaches. Level 2: … incidents. Level 3: … incidents. Level 4: 1-5 incidents. Level 5: No incident.

- What is the time between a security incident and its resolution (response rate)?
  Level 1: A year or more.
  Level 2: Incidents are resolved within one month of occurrence.
  Level 3: Incidents do not exceed one week before resolution.
  Level 4: Incidents are promptly handled within one to five days.
  Level 5: It takes no more than a day to resolve incidents.

TECHNOLOGY AND INNOVATION

Through-life technological implementations

- How effective are the organisation's authentication and authorisation controls for network access?
  Level 1: No mechanism to ensure effective network access control for both internal and external networks.
  Level 2: An internal resources and network access control policy is in place, but not strictly followed.
  Level 3: Network access control and a policy for internal resources/equipment are in place for users.
  Level 4: Adequate network controls, and a policy to ensure proper identification/authentication on internal and external network services, are enforced.
  Level 5: Mechanisms for ensuring access control on both internal and publicly owned networks are continually monitored/reviewed for efficacy and regulatory compliance.

- How effective is the organisation's mechanism for ensuring non-repudiation and fostering e-commerce?
  Level 1: No mechanism at all.
  Level 2: The need to ensure non-repudiation is identified, with no actions to implement it.
  Level 3: Logging of access to critical information systems is done.
  Level 4: Well-established mechanisms for logging activities and protecting e-commerce activities are in place.
  Level 5: All activities are appropriately logged, with real-time monitoring of network and system activities available.

- Are there technological measures in place to ensure information security (e.g., against malicious code, intrusion, etc.)?
  Level 1: No security controls in place.
  Level 2: Awareness of the dangers of malware to information facilities exists, but no action follows.
  Level 3: Basic controls such as antivirus software, which are not upgraded as necessary.
  Level 4: Standard controls for monitoring, detecting and fixing malware, intrusions and malicious code are in place and continually checked for efficacy.
  Level 5: Innovative controls/technologies are constantly adopted; strong user awareness and attitude help reduce targeted attacks.

- How often do malware and/or other security attacks disrupt key business processes?
  Level 1: Uncountable. Level 2: Twice in a month. Level 3: Once in six months. Level 4: Barely once a year. Level 5: Not at all.

- Are there vulnerability scanning tools in place? How effective is the technical vulnerability management?
  Level 1: No detection tools in place; vulnerability is high.
  Level 2: Antivirus software, though outdated and ineffective. Awareness of other vulnerabilities exists, but the motivation for remedy is not there.
  Level 3: Some vulnerability tools are in place. Quantitative evaluation and corrective measures for handling all threats are not fully established.
  Level 4: Quantitative evaluation of all information assets and vulnerabilities, reporting mechanisms, and measures to handle them are fully established and followed in a timely manner.
  Level 5: Sophisticated malware and vulnerability tools are continually evaluated for performance; review and upgrade of the vulnerability management approach is taken seriously.

- Describe the rate of the organisation's Denial of Service experience in a year.
  Level 1: Daily. Level 2: Weekly. Level 3: Occasionally. Level 4: Hardly. Level 5: Never.

- How effective is the organisation's access control mechanism?
  Level 1: None exists.
  Level 2: The access control mechanism is minimal, with a mostly manual approach in use.
  Level 3: The access control mechanism is not fully restrictive (loose).
  Level 4: Standard and ideal use of access control types; only the privilege necessary for a duty is allowed.
  Level 5: Dynamic and innovative implementation and review of access control mechanisms for maximum security.

- Does technology reflect security consideration at the design stage?
  Level 1: No indication of security consideration.
  Level 2: Post-deployment make-up implementations.
  Level 3: Basic security requirements are reflected.
  Level 4: Technology shows careful consideration of security during the design phase.
  Level 5: Technology allows for flexibility and innovation in security provisions.

- Is there a policy for the implementation of cryptographic techniques/controls? Are cryptographic controls used to secure information?
  Level 1: No sense of information protection.
  Level 2: The need for confidentiality through cryptographic techniques is understood, without implementation plans.
  Level 3: One form of cryptographic control is used; multiple and strong cryptographic controls are not deployed.
  Level 4: A good combination of different relevant cryptographic controls, as stipulated in policy.
  Level 5: Constant review and improvement of the cryptographic approach and key management technique.

- Is the information system design flexible to technological innovation?
  Level 1: Not at all.
  Level 2: Adequate technological requirements are yet to be met.
  Level 3: Current technical requirements are met.
  Level 4: Technology meets all requirements and allows for improvement.
  Level 5: Technology is highly flexible to change and improvement.

- What is the average percentage uptime of the organisation's network in a year (Intranet and Internet)? (Levels 1-5: …)

- How effective is the mechanism for logging and tracing activities within the organisation?
  Level 1: Does not exist.
  Level 2: An activity log applies to a few operations.
  Level 3: An activity log is in place for every critical information system across departments.
  Level 4: Sophisticated mechanisms for logging and monitoring application, system, network, etc. use; high traceability.
  Level 5: All activities are logged, monitored and acted upon regularly for all information assets; past activities can be retrieved/traced.

Average Point

SECURITY GOVERNANCE

Management of information security

- What is the perception of top management? Do they understand their roles in ensuring information security?
  Level 1: Top management is ignorant of the criticality of information security to business objectives.
  Level 2: Top management understands the criticality of information security to the business, but is not yet acting.
  Level 3: Top management understands its roles/responsibilities and supports plans for information security.
  Level 4: Information security is seen as a business enabler, and detailed plans to ensure information security are actively being implemented.
  Level 5: Top management considers information security an essential part of the business; an information officer is part of the management team.

- Does the organisation have a detailed security policy?
  Level 1: No policy exists.
  Level 2: Word-of-mouth instructions.
  Level 3: A manual containing security procedures; not comprehensive.
  Level 4: A detailed policy which covers all areas of the business and all legal and security requirements.
  Level 5: Constant and proactive review of a comprehensive policy to address emerging threats and security trends.

- Is the organisation's information assets inventory well defined and documented, and are information asset owners identified?
  Level 1: No inventory of information assets; assets are not assigned owners.
  Level 2: Asset acquisition records exist, but there is no proper inventory of assets with assigned owners kept in line with changes.
  Level 3: An inventory of assets was once completed, but is not regularly updated; asset owners are not reviewed as needed.
  Level 4: Comprehensive and up-to-date inventory of assets with respective owners; asset owners understand and follow their responsibilities for protecting their assets.
  Level 5: Real-time update of inventory records; detailed track record of asset owners; periodic refresher knowledge for all asset owners for proactive protection of their information assets.

- Does information security constitute a key consideration in every business plan of the organisation? Are engagements/communications with external parties controlled?
  Level 1: Information security is not remembered in internal and external dealings.
  Level 2: The importance of information security for all internal and external business activities/communications is understood, but not seen as critical.
  Level 3: Information security during internal and external dealings is considered; monitoring of activities is not done to measure and enforce compliance.
  Level 4: Information security is considered in every aspect of business activities/plans; agreements with external parties detail the information security requirements for partnership.
  Level 5: Information security is seen as an integral part of the organisation's business; active monitoring of all communications, with proper reporting and correction of deviations from the security requirements agreed with external parties.

- Does the organisation classify data/information?
  Level 1: No form of information classification.
  Level 2: An information classification mind-set exists, but without formal documentation and enforcement.
  Level 3: There is basic classification of information according to value/sensitivity.
  Level 4: Detailed classification of information according to value, sensitivity and criticality; an appropriate level of protection is assigned to each category.
  Level 5: Information asset classification is continually reviewed in line with changing business needs; new assets are classified as soon as they arise.

- How often are there cases of staff misuse of access rights/privileges?
  Level 1: Numerous cases per day. Level 2: One instance per week. Level 3: Once a month. Level 4: Once a year. Level 5: Rarely.

Training and awareness programme

- Does the organisation have an established information security training and awareness programme?
  Level 1: No awareness or training programme in place.
  Level 2: Individual developmental efforts are encouraged; no formal programme adopted yet.
  Level 3: A general annual awareness and training programme exists.
  Level 4: Training needs are assessed and training provided according to staff job requirements; staff in key functions are scheduled for specialised security training.
  Level 5: The information security awareness programme is championed by the management board; regular refresher training is provided for staff with changing job requirements.

- What is the employees' information security awareness and compliance level in the organisation?
  Level 1: Unaware. Level 2: Aware and non-compliant. Level 3: Weak compliance. Level 4: Full compliance. Level 5: Complying proactively.

- How well do the requirements of the policy spread across departments/functions? Are staff from all relevant functions involved in coordinating information security?
  Level 1: No sense of coordination or information security consciousness across relevant departments.
  Level 2: Information security awareness exists among key representatives of the various functions, but the needed actions are far-fetched.
  Level 3: Representatives from relevant functions are involved in information security campaigns and efforts.
  Level 4: A team of departmental heads, managers, auditors, etc. actively instils an information security culture in staff in their various functions.
  Level 5: Security strategies are continually improved/implemented by a team of representatives across relevant functions to ensure strict compliance of all operations with the defined policy and standards.

Roles and Responsibilities

- How detailed is the organisation's recruitment screening to ensure qualified staff for key functions? Are security checks conducted on staff before security privileges are granted?
  Level 1: The need for staff checks or screening before granting security privileges is not considered.
  Level 2: Screening of staff for competence and credibility is not formalised; the integrity of the screening process is poor.
  Level 3: Staff checks for competence are in place, but an evidenced, reliable/formalised approach is not yet adopted.
  Level 4: A detailed, formalised procedure for screening the integrity and competence of staff, contractors and third parties is carried out before security responsibility/privilege is assigned.
  Level 5: Continuous review of the competence and activities of staff, contractors, etc., in line with the changing security landscape, ensures high information security.

- Is there an information security unit in the IT department of the organisation? Are officers assigned specific information security roles/responsibilities?
  Level 1: An information security unit does not exist; roles/responsibilities are not assigned at all.
  Level 2: Information security functions are part of the IT department's tasks; no separate security unit exists.
  Level 3: An information security unit exists, but does not fully understand and practise its functions effectively.
  Level 4: The information security unit is active; specific roles/responsibilities are assigned to and carried out by designated staff, as required by policy and standards.
  Level 5: Assigned roles are continually monitored, reviewed and re-assigned in line with staff's changing skills, and whenever necessary, to ensure the highest security performance.

Average Point

RISK MANAGEMENT CAPABILITY

Risk assessment and treatment

- Are there physical protections for information assets?
  Level 1: Information assets are easily accessible by all.
  Level 2: Physical protection for information assets is loose.
  Level 3: Information assets are given physical protection, though proper risk assessment is not done to determine commensurate protection.
  Level 4: Proper risk assessment is performed, and commensurate physical protection is provided to prevent all identified human and environmental threats to information assets.
  Level 5: Physical protection controls are constantly upgraded in accordance with current technological trends, to ensure best practice and strong security.

- Is there proper risk assessment and treatment for all information assets? Estimate the organisation's ability to avoid recurrence of solved security problems.
  Level 1: No form of risk assessment.
  Level 2: Little effort to treat risks, and only when those risks materialise.
  Level 3: Risk assessment is done, but treatment plans are not fully implemented.
  Level 4: Detailed risk assessments; the organisation's risk appetite is established and treatment plans are carried out appropriately.
  Level 5: Continuous review of risk registers and risk appetite; treatments for identified risks are continually reviewed to match the changing nature of risk.

Total Cost of Security

Business continuity management

Questions: Is there a business continuity/disaster recovery plan for the company's information systems? How often is it checked and tested? How often are data backed up? Is there an active data back-up policy and strategy in place? How often are virus scanning and updates carried out?
- Level 1: No business continuity plans are in existence; no back-up policy or plans exist for information assets. An incident leads only to emergency actions.
- Level 2: The recognised need for data back-up is not taken seriously. Business continuity plans cover only data/databases. Data back-up is done monthly, but active measures to ensure fast recovery are not in place.
- Level 3: A full incident and business continuity plan is in place; regular monthly checks guarantee readiness for activation.
- Level 4: An adequate policy and strategy for data back-up and restoration are in place and continually tested for efficacy.
- Level 5: Business continuity plans are continually reviewed and tested to ensure all-time readiness to support the business. Policy is regularly reviewed and the strategy continually rehearsed for all-time readiness; migration to better back-up strategies as needs arise is the norm.
Back-up and scanning frequency, from lowest to highest maturity: Annually, Quarterly, Monthly, Weekly, Daily.

Average Point
Final Average Point / Maturity Level

ECONOMICS OF SECURITY
Planned Cost of Security
Unplanned Cost of Security

Overall rating and maturity level (score boundary to maturity level): Vulnerable; Security Awareness; Basic Security; Meeting Requirements; above that, Robust Security.
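The scoring described above averages per-question points (1 for the lowest maturity level through 5 for the highest) and maps the final average to one of the five ISMM levels via the score boundaries. A minimal sketch of that mapping follows; the boundary values are illustrative assumptions, since the boundary figures in the table are not preserved in this transcription, while the level names follow the table.

```python
# Hedged sketch: map an averaged ISAF assessment score to one of the five
# ISMM maturity levels. The boundary thresholds below are illustrative
# assumptions; only the level names come from the appendix table.

LEVELS = [
    (1.5, "Vulnerable"),
    (2.5, "Security Awareness"),
    (3.5, "Basic Security"),
    (4.5, "Meeting Requirements"),
    (float("inf"), "Robust Security"),
]

def maturity_level(scores):
    """Average the per-question points (1-5) and map to a level name."""
    avg = sum(scores) / len(scores)
    for upper, name in LEVELS:
        if avg < upper:
            return avg, name

avg, level = maturity_level([3, 4, 3, 4, 4])
print(f"{avg:.2f} -> {level}")  # prints "3.60 -> Meeting Requirements"
```

An organisation scoring mostly 3s and 4s thus lands in a single discrete level rather than between two, which is the balance property the feedback form asks evaluators to judge.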

Appendix D: Evaluation Feedback Form

The University of Manchester, School of Computer Science

FEEDBACK FORM

Project: Information Security Maturity Model (ISMM)

Instruction: Please indicate your answer to each question by writing the number corresponding to your opinion in the 'Opinion' column, where 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree. In addition, give more detail for each question by assigning, in the last column (Percentage rating), an ideal percentage score that says how well each question element has been handled. Example:

S/N 1. Question: Do you think the model is balanced enough to position any organisation in any of the five levels without the organisation falling between two levels? Opinion: 1; Percentage rating: 80%

Please attend to the two sections.

Section A: Information Security Maturity Model (ISMM)
1. Do you think the model is balanced enough to position any organisation in any of the five levels without the organisation falling between two levels?
2. Are the key elements which determine the information security status of an organisation considered?
3. How self-explanatory and user-friendly is the ISMM?
4. Do you think the choice of definitions of the five levels is well thought out?
5. Do you think the transitions or progression between the levels are good and discrete enough?
6. Is the information security maturity model recommendable for the security industry?

Please provide below any other comments/contributions that could improve the ISMM that the researcher has not considered, if any:

Section B: Information Security Assessment Framework (ISAF)
1. Does the assessment framework cover the requirements of an information security management system as specified in ISO/IEC and ISO/IEC 27002?
2. Do you think the assessment framework will be effective in assessing the information security maturity level of an organisation?
3. Do you think the framework will produce consistent results when used to assess an organisation by different assessors?
4. How self-explanatory and user-friendly is the ISAF?
5. Are the characteristics/evidences attributed to and expected of each maturity level across the various processes ideal?
6. Are the transitions or progression between the levels of the expected evidences good and discrete enough?
7. Is the information security assessment framework recommendable for the security industry?

Please provide below any other comments/contributions that could improve the ISAF that the researcher has not considered, if any:

Appendix E: Project Evaluation Results

ISMM Evaluation (all figures are percentages): ratings from Evaluators 1 to 4 for Questions 1 to 6, with an average for each question.

ISAF Evaluation (all figures are percentages): per-question ratings for Questions 1 to 7, with an average for each question and a total average.

Appendix F: Security Cost Estimation

Acquisition and Maintenance Costs Estimation
Costs are given in pounds sterling (£) for Participant 1 and Participant 3, by number of employees.

Management Controls

Risk Management; Review of Security Controls:
- Perform risk assessment / independent review / self-assessment: 1 FTE, 5 months
- Mission Impact Analysis: 1 FTE, 12 days
- Perform risk assessment / independent review / self-assessment: 1 FTE, 5 months
- Key Control Testing: 24 hours per quarter per network environment, including reading of results
- Incident/Alert Response and Reporting: 1 FTE as needed (approximately two weeks per year)
- Management Oversight of Corrective Actions: 1 FTE, 1 month

Life Cycle:
- Mission Impact Analysis: 1 FTE, 12 days
- Perform risk assessment / independent review / self-assessment: 1 FTE, 5 months
- Incorporation of security requirements into procurement documents and commercial off-the-shelf acquisition specifications: 1 FTE, 15 days per procurement (depending on the complexity of the security challenge that the procurement holds)

Authorize Processing (Certification and Accreditation):
- Design Review and System Testing before Production ("Application Security Assessment"): 1 FTE, 4 months
- Officially Document the Certification and Accreditation: 1 FTE, 4 weeks
- Testing of Security Controls after Changes: 1 FTE, 3 weeks
- Management Oversight of Corrective Actions: 1 FTE, 1 month
- Development of New System Security Plan: 1 FTE, 9 weeks
- Perform risk assessment / independent review / self-assessment: 1 FTE, 5 months
- Production, distribution, and signing of Rules of Behaviour: 1 FTE, 5 weeks
- Contingency Plan Development: 1 FTE, 32 weeks
- Development of New System Security Plan: 1 FTE, 9 weeks
- Management oversight of corrective actions: 1 FTE, 1 month

System Security Plan:
- Revision and update of system security plan: 1 FTE, 4 weeks

Operational Controls

Personnel Security:
- On-going oversight of user account authorization and associated policies: 1 FTE, 1 year

Physical and Environmental Protection:
- On-going oversight of user account authorization and associated policies: 1 FTE, 1 year
- Encryption Software: $25 (£16) per user

Production and Input/Output Controls:
- Purchase and Install Intrusion Detection System with Data Integrity and Validation Controls: $4,800 ( ) per network IDS probe; $2,400 ( ) for analysis console; 1 FTE, 1 year for monitoring IDS

Contingency Planning:
- Contingency plan development: 1 FTE, 32 weeks
- Procurement of off-site storage and processing site: $30K (£19,335) per year
- Back-up file and off-site management and documentation: 1 FTE, 3 months
- Security training courses: $150 (£96.6) per employee per year

Hardware and System Software Maintenance; Data Integrity:
- Testing and Revision of Contingency Plan: 1 FTE, 10 weeks
- Hardware and software authorization and documentation: 1 FTE, 4 months
- Update virus signatures and oversee network activity and access control logs: 1 FTE, 6 weeks
- Purchase and Install Intrusion Detection System with Data Integrity and Validation Controls: $4,800 ( ) per network IDS probe; $2,400 ( ) for analysis console; 1 FTE, 1 year for monitoring IDS
- System performance monitoring in real time: $6,900 per year
- Design review and system testing prior to production ("Application Security Assessment"): 1 FTE, 3-8 months

Documentation; Security Awareness, Training, and Education:
- Create network diagrams for setups of routers, switches, and firewalls: 1 FTE, 3 weeks
- Design review and system testing prior to production ("Application Security Assessment"): 1 FTE, 3-8 months
- Revision and update of system security plan: 1 FTE, 4 weeks
- Contingency Plan development: 1 FTE, 32 weeks
- Development of New System Security Plan: 1 FTE, 9 weeks
- Contingency plan development: 1 FTE, 32 weeks
- Incorporate a missing element into an incomplete system security plan: 1 FTE, 4 weeks
- Officially Document the Certification and Accreditation: 1 FTE, 4 weeks
- Security training courses: $150 (£96.6) per employee per year
- Security refresher courses: $150 (£96.6) per employee per year
- General Security Awareness Training: $870 (£560.7) per year
- Production, distribution, and signing of Rules of Behaviour: 1 FTE, 5 weeks

Technical Controls

Identification and Authentication:
- On-going oversight of user account authorization and associated policies: 1 FTE, 1 year
- Purchase and Install Intrusion Detection System with Data Integrity and Validation Controls: $4,800 ( ) per network IDS probe; $2,400 ( ) for analysis console; 1 FTE, 1 year for monitoring IDS
- On-going oversight of user account authorization and associated policies: 1 FTE, 1 year

Logical Access Controls; Audit Trails:
- Purchase and Install Intrusion Detection System with Data Integrity and Validation Controls: $4,800 ( ) per network IDS probe; $2,400 ( ) for analysis console; 1 FTE, 1 year for monitoring IDS
- Encryption Software: $25 (£16) per user
- Setting and maintenance of access controls: 1 FTE per year
- On-going oversight of user account authorization and associated policies: 1 FTE per year
- Update virus signatures and oversee network activity and access control logs: 1 FTE, 6 weeks
- Authorization and management of inter-system relationships: 1 FTE, 4 months
- Firewalls and Secure Gateways: $9,000 ( ) for carrier-managed security and VPN services for large offices, higher speed/interface capabilities; 1 FTE, 1 year
- On-going oversight of user account authorization and associated policies: 1 FTE, 1 year
- Development of Public Warnings: 1 FTE, 1 day
- Purchase and Install Intrusion Detection System with Data Integrity and Validation Controls: $4,800 ( ) per network IDS probe; $2,400 ( ) for analysis console; 1 FTE, 1 year for monitoring IDS
- Setting and maintenance of access controls: 2 FTEs per year
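Most entries in the template express cost as staff effort ("1 FTE, 5 months", "2 FTEs per year") rather than a pound amount, and the totals are later adjusted for inflation since 2002. A minimal sketch of turning such entries into monetary estimates, assuming an illustrative loaded annual cost per FTE and an illustrative cumulative inflation rate (neither figure is preserved in this transcription):

```python
# Hedged sketch: convert the template's effort-based entries into pound
# amounts. ANNUAL_FTE_COST and the 0.30 inflation rate are illustrative
# assumptions, not figures from the appendix.

ANNUAL_FTE_COST = 40_000  # assumed loaded annual cost per FTE, in GBP

def effort_cost(ftes, months):
    """Cost of `ftes` full-time staff working for `months` months."""
    return ftes * ANNUAL_FTE_COST * months / 12

def with_inflation(cost, rate):
    """Adjust a historical cost estimate by a cumulative inflation rate."""
    return cost * (1 + rate)

risk_assessment = effort_cost(1, 5)  # "1 FTE, 5 months"
print(round(risk_assessment))                        # prints 16667
print(round(with_inflation(risk_assessment, 0.30)))  # prints 21667
```

Summing such per-entry estimates gives the template's acquisition/implementation and maintenance totals, before and after the inflation adjustment.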

- Development and communication of user account authorization policies: 1 FTE, 6 months
- On-going oversight of user account authorization and associated policies: 1 FTE, 1 year

Totals:
- Total estimated acquisition/implementation costs only
- Estimated maintenance cost only
- Inflation rate from 2002 to: %
- Estimated acquisition/implementation costs only (with inflation)
- Estimated maintenance cost only (with inflation)

Cost of Incidents and Breaches Estimation
Columns: Participant; Number of Employees; Annual incidents/breaches; Incident cost computation; Cost per participant.
- Participant 1: 10 to 20 incidents; midpoint {[(10+20)/2] = 15}; computation (89*3911)*
- Participant 2: 10 to 20 incidents; midpoint {[(10+20)/2] = 15}; computation (89*3574)*
- Participant 3: 2 to 100 incidents; midpoint {[(2+100)/2] = 51}; computation (89*5960)*
- Participant 4: no incident reported
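The incident cost computation in the table takes the midpoint of each participant's reported annual incident range (e.g. [(10+20)/2] = 15) and multiplies it by a per-incident cost rate. A minimal sketch follows; the per-incident cost of 500 is an assumed placeholder, since the table's own rate expression is truncated in this transcription:

```python
# Hedged sketch of the table's incident-cost estimate: midpoint of the
# reported annual incident range times a per-incident cost rate. The
# rate (500) is an illustrative assumption.

def midpoint(low, high):
    """Midpoint of a reported incident range, e.g. (10, 20) -> 15."""
    return (low + high) / 2

def annual_incident_cost(low, high, cost_per_incident):
    """Estimated annual cost of incidents for one participant."""
    return midpoint(low, high) * cost_per_incident

print(midpoint(10, 20))                   # prints 15.0
print(annual_incident_cost(10, 20, 500))  # prints 7500.0
```

A participant reporting no incidents contributes no midpoint-based cost, matching the final row of the table.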

Appendix G: Project Plan Gantt Chart


CGEIT ITEM DEVELOPMENT GUIDE

CGEIT ITEM DEVELOPMENT GUIDE CGEIT ITEM DEVELOPMENT GUIDE Updated March 2017 TABLE OF CONTENTS Content Page Purpose of the CGEIT Item Development Guide 3 CGEIT Exam Structure 3 Writing Quality Items 3 Multiple-Choice Items 4 Steps

More information

Sub-section Content. 1 Preliminaries - Post title: Head of Group Risk - Reports to: CRO - Division: xxx - Department: xxx - Location: xxx

Sub-section Content. 1 Preliminaries - Post title: Head of Group Risk - Reports to: CRO - Division: xxx - Department: xxx - Location: xxx Sub-section Content 1 Preliminaries - Post title: Head of Group Risk - Reports to: CRO - Division: xxx - Department: xxx - Location: xxx 2 Job Purpose - To assist in the maintenance and development of

More information

Quality Systems Frameworks. RIT Software Engineering

Quality Systems Frameworks. RIT Software Engineering Quality Systems Frameworks Some Major Quality Frameworks ISO 9000 family of standards A general international standard for organizational quality systems Oriented towards assessment and certification Malcolm-Baldrige

More information

The Quality Maturity Model: Your roadmap to a culture of quality

The Quality Maturity Model: Your roadmap to a culture of quality The Quality Maturity Model: Your roadmap to a culture of quality F R A N K I E W I L S O N H E A D O F A S S E S S M E N T B O D L E I A N L I B R A R I E S, O X F O R D F R A N K I E. W I L S O N @ B

More information

INTERNATIONAL STANDARD

INTERNATIONAL STANDARD INTERNATIONAL STANDARD ISO/IEC 27004 First edition 2009-12-15 Information technology Security techniques Information security management Measurement Technologies de l'information Techniques de sécurité

More information

Single or multi-sourcing model?

Single or multi-sourcing model? Outsourcing 2007/08 Single and multi-sourcing models Anthony Nagle and Alistair Maughan, Morrison & Foerster www.practicallaw.com/8-380-6567 Once an organisation decides to outsource, one of its biggest

More information

PRM - IT IBM Process Reference Model for IT

PRM - IT IBM Process Reference Model for IT PRM-IT V3 Reference Library - A1 Governance and Management Sysem PRM-IT Version 3.0 April, 2008 PRM - IT IBM Process Reference Model for IT Sequencing the DNA of IT Management Copyright Notice Copyright

More information

WAMITAB Level 3 Certificate in Facilities Management

WAMITAB Level 3 Certificate in Facilities Management WAMITAB Level 3 Certificate in Facilities Management Guided Learning Hours: 24 Total Qualification time: 76 Total Credits: 21 Qualification Code: 601/1722/9 WAMITAB Code: FMCER3 Version 7, October 2017

More information

This resource is associated with the following paper: Assessing the maturity of software testing services using CMMI-SVC: an industrial case study

This resource is associated with the following paper: Assessing the maturity of software testing services using CMMI-SVC: an industrial case study RESOURCE: MATURITY LEVELS OF THE CUSTOMIZED CMMI-SVC FOR TESTING SERVICES AND THEIR PROCESS AREAS This resource is associated with the following paper: Assessing the maturity of software testing services

More information

DEAF DIRECT: Performance Management Policy: April Performance Management Policy

DEAF DIRECT: Performance Management Policy: April Performance Management Policy Performance Management Policy 1 Contents Introduction Aims of the Performance Management Process Benefits of the Performance Management Process Key Principles of the Process Job Descriptions Planning Performance

More information

Organisational Readiness and Software Process Improvement

Organisational Readiness and Software Process Improvement Organisational Readiness and Software Process Improvement Mahmood Niazi a, David Wilson b and Didar Zowghi b a School of Computing and Mathematics, Keele University, ST5 5BG, UK mkniazi@cs.keele.ac.uk

More information

The importance of targeted communication in SFIA implementations. Nicole Minster. July 2013

The importance of targeted communication in SFIA implementations. Nicole Minster. July 2013 The importance of targeted communication in SFIA implementations Nicole Minster July 2013 People. They are an organisation s greatest asset. However it s that most important asset that can make or break

More information

ERROR! BOOKMARK NOT DEFINED.

ERROR! BOOKMARK NOT DEFINED. TABLE OF CONTENTS LEAD AND LAG INDICATORS... ERROR! BOOKMARK NOT DEFINED. Examples of lead and lag indicators... Error! Bookmark not defined. Lead and Lag Indicators 1 GLOSSARY OF TERMS INTRODUCTION Many

More information

CMMI Conference November 2006 Denver, Colorado

CMMI Conference November 2006 Denver, Colorado Make Middle Managers the Process Owners CMMI Conference November 2006 Denver, Colorado Welcome CMMI Mid Mgrs Process Owners - 2 WelKom Huan Yín Bienvenido Bienvenue Wilkommen ЌАΛΟΣ ΟΡΙΣΑΤΕ Välkommen Witamy

More information

6. IT Governance 2006

6. IT Governance 2006 6. IT Governance 2006 Introduction The Emerging Enterprise Model 3 p IT is an integral part of the business p IT governance is an integral part of corporate governance 4 Challenges for the IT IT gets more

More information

Reflection on Software Process Improvement

Reflection on Software Process Improvement Reflection on Software Process Improvement Keynote for SEPG Conference in Japan - 2005 John D. Vu, Ph. D Technical Fellow The Boeing Company BOEING is a trademark of Boeing Management Company. John Vu

More information

Translate stakeholder needs into strategy. Governance is about negotiating and deciding amongst different stakeholders value interests.

Translate stakeholder needs into strategy. Governance is about negotiating and deciding amongst different stakeholders value interests. Principles Principle 1 - Meeting stakeholder needs The governing body is ultimately responsible for setting the direction of the organisation and needs to account to stakeholders specifically owners or

More information

Continuous Process Improvement - Why Wait Till Level 5?

Continuous Process Improvement - Why Wait Till Level 5? Continuous Process Improvement - Why Wait Till Level 5? Girish Seshagiri Advanced Information Services, Inc. Peoria, IL USA Abstract Continuous improvement is generally considered to be a paradigm shift

More information

At Equinor, the way we deliver is as important as what we deliver

At Equinor, the way we deliver is as important as what we deliver The Equinor Book At Equinor, the way we deliver is as important as what we deliver This is a notebook version of the Equinor Book. The latest version of the Equinor Book at any time is found at www.equinorbook.com.

More information

TenStep Project Management Process Summary

TenStep Project Management Process Summary TenStep Project Management Process Summary Project management refers to the definition and planning, and then the subsequent management, control, and conclusion of a project. It is important to recognize

More information

The IT Management Mistakes Report 2016

The IT Management Mistakes Report 2016 The IT Management Mistakes Report 2016 As enterprises move into the digital age, if we continue to make the same mistakes the challenges will only increase. The Survey Early in 2016 we ran a webinar on

More information

The BEST Framework EDF Group s Expectations for Managing Health and Safety. The EDF Group BEST Framework

The BEST Framework EDF Group s Expectations for Managing Health and Safety. The EDF Group BEST Framework Version 1 The BEST Framework EDF Group s Expectations for Managing Health and Safety The EDF Group BEST Framework 2 CONTENTS 1 2 3 4 5 6 7 8 Leadership in Health and Safety 07 Incident Management 09 Contractor

More information

Chapter 02. Professional Standards. Multiple Choice Questions. 1. Control risk is

Chapter 02. Professional Standards. Multiple Choice Questions. 1. Control risk is Chapter 02 Professional Standards Multiple Choice Questions 1. Control risk is A. the probability that a material misstatement could not be prevented or detected by the entity's internal control policies

More information

IT GOVERNANCE AND MANAGED SERVICES Creating a win-win relationship

IT GOVERNANCE AND MANAGED SERVICES Creating a win-win relationship IT GOVERNANCE AND MANAGED SERVICES Creating a win-win relationship TABLE OF CONTENTS IT Governance and Managed Services 3 ROLE OF IT GOVERNANCE AND OUTSOURCING 3 IT GOVERNANCE AND THE OUTSOURCING CONTRACT

More information

Ofcom Consultation Response (Ref: Review of quality of service information Phase 1: information on quality of customer service)

Ofcom Consultation Response (Ref: Review of quality of service information Phase 1: information on quality of customer service) Ofcom Consultation Response (Ref: Review of quality of service information Phase 1: information on quality of customer service) Response from Enigma QPM Limited V2 6 th Oct 2008 Prepared by Ray Murphy

More information

How to plan an audit engagement

How to plan an audit engagement 01 November 2017 How to plan an audit engagement Chartered Institute of Internal Auditors Planning audit projects, or engagements, well will ensure you deliver a quality assurance and consulting service

More information

Contract Management Part One Making the Business Case for Investment

Contract Management Part One Making the Business Case for Investment Contract Management Part One Making the Business Case for Investment Executive Summary This paper is the first in a short series of three which will look at the business case for organisations to invest

More information

ASL and BiSL self-assessments: an analysis

ASL and BiSL self-assessments: an analysis ASL and BiSL self-assessments: an analysis Machteld Meijer & René Sieders Published in IT Beheer, May 2009. In 2003 an article by René Sieders was published in the IT Beheer Jaarboek entitled Experiences

More information

A successful implementation of an ERM dashboard - Remko Riebeek

A successful implementation of an ERM dashboard - Remko Riebeek A successful implementation of an ERM dashboard - Remko Riebeek Abstract: One of the key tools to implement ERM in an organisation is the ERM dashboard. In essence, the dashboard should provide the management

More information

CERT Resilience Management Model Capability Appraisal Method (CAM) Version 1.1

CERT Resilience Management Model Capability Appraisal Method (CAM) Version 1.1 CERT Resilience Management Model Capability Appraisal Method (CAM) Version 1.1 Resilient Enterprise Management Team October 2011 TECHNICAL REPORT CMU/SEI-2011-TR-020 ESC-TR-2011-020 CERT Program http://www.sei.cmu.edu

More information

This is the third and final article in a series on developing

This is the third and final article in a series on developing Performance measurement in Canadian government informatics Bryan Shane and Gary Callaghan A balanced performance measurement system requires that certain principles be followed to define the scope and

More information

Cost Engineering Health Check - a limited survey. Prepared by QinetiQ. For Society for Cost Analysis and Forecasting (SCAF)

Cost Engineering Health Check - a limited survey. Prepared by QinetiQ. For Society for Cost Analysis and Forecasting (SCAF) Cost Engineering Health Check - a limited survey Prepared by QinetiQ For Society for Cost Analysis and Forecasting (SCAF) QINETIQ/TIS/S&AS/IP1203326 ver. 1.0 10th December 2012 Copyright QinetiQ 2012 Page

More information

Measuring Your ROI On Social Media

Measuring Your ROI On Social Media Measuring Your ROI On Social Media So What? * Measuring what matters means measuring relationships * All transactions conducted today are still driven by relationships * Building, managing, and measuring

More information

A FRAMEWORK FOR AUDIT QUALITY. KEY ELEMENTS THAT CREATE AN ENVIRONMENT FOR AUDIT QUALITY February 2014

A FRAMEWORK FOR AUDIT QUALITY. KEY ELEMENTS THAT CREATE AN ENVIRONMENT FOR AUDIT QUALITY February 2014 A FRAMEWORK FOR AUDIT QUALITY KEY ELEMENTS THAT CREATE AN ENVIRONMENT FOR AUDIT QUALITY February 2014 This document was developed and approved by the International Auditing and Assurance Standards Board

More information

Asset Management Maturity

Asset Management Maturity Asset Management Maturity A Position Statement First Edition English Version ISBN 978-0-9870602-4-2 Published October 2015 www.gfmam.org Forward With the publication of the ISO 55000 series of standards,

More information

Contents. viii. List of figures. List of tables. OGC s foreword. 6 Organizing for Service Transition 177. Chief Architect s foreword.

Contents. viii. List of figures. List of tables. OGC s foreword. 6 Organizing for Service Transition 177. Chief Architect s foreword. iii Contents List of figures List of tables OGC s foreword Chief Architect s foreword Preface Acknowledgements v vii viii 1 Introduction 1 ix xi xii 1.1 Overview 3 1.2 Context 3 1.3 Goal and scope of Transition

More information