The history of ITSM
This chapter describes the origin and development of the IT service management (ITSM) discipline. This relatively short history has delivered important knowledge, experience, and insights that will help to develop ITSM further. The most important models, past and present, are discussed in this chapter. These models contain the most important components of the way an IT management organization works and how it is managed. The chapter ends with an evaluation and conclusions about the requirements for a proper method.
1.1 The beginning
With the emergence of automation came the growing need for organizing its management. IT changed rapidly over the years and became quite complex. It developed from a mainframe with linked terminals and a limited number of updates per year (1964-1985) to networked midrange servers with an internal client/server architecture (1985-2000). And from this network it changed to an internet-linked system of on-premises servers and outsourced servers and network services, through PCs, thin clients, or cloud computing (2000-present).
Furthermore, the functionalities available to the user changed. In the past, users of terminals were restricted to a fixed package of functionality available on the mainframe. Users of PCs, working on a server, had access to more varied office functionalities. This created greater dependency upon the IT department, and a growing need for better service delivery. At the same time, the complexity of IT services was growing, and in its wake IT management became more complex as well. The level of interaction between user and service provider increased, as did the level of interaction between the teams of the IT organization involved in providing the IT services.
The increase in functionality caused tremendous growth in the number of users. While initially IT supported specific functions, it came to support more and more generic functions, used by almost everyone. IT management had to support many different kinds of users, each with their own specific demands for support and communication.
As a consequence, the boundaries between the different roles blurred, and members of different teams were forced to increase their cooperation. There were hardly any mechanisms available to manage this new way of working, causing all sorts of problems. This was not only the result of introducing new systems, but also the result of an increase in the number of user questions about new applications with a shorter lifespan. In a mainframe environment, applications were usually updated no more
than twice a year, but many of the newer applications required monthly or even weekly updates. This made systems highly unstable.
Failure of IT should never be allowed to put an organization’s operational management at risk. It is unacceptable – if only for the sake of competition – for production to stagnate because IT is unable to print the order forms.
Simultaneously, users became more and more dependent on information systems functioning properly. The contribution of IT to the primary company activities increased exponentially. Nowadays, IT is crucial for the operational management of organizations. Many organizations have become highly dependent on IT, and they would not survive long without it. However, it still seems to be quite difficult to provide adequate IT services and to monitor them properly.
1.2 Increasing need for control
There is an ever-growing need for higher-quality IT services: a process that still continues today. Internal and external customers require better information delivery to fulfill their organizational needs. More and more organizations have to deal with legislation and regulation, requiring correctly arranged organizational processes.
The stock-exchange scandals surrounding companies such as Enron, WorldCom, and Ahold have resulted in increased attention to corporate governance. This led to the Sarbanes-Oxley Act in the USA, overseeing the internal regulation of organizations quoted on the stock exchange. The legislation and regulations continue to grow, including the Code Tabaksblat, anti-spam legislation, SAS 70 statements, IFRS for financial markets, and Basel II.
More than ever, organizations need to be able to prove that they are in control.
1.3 The rise of the process-based approach
It has become obvious to most organizations that they need to manage their processes properly to be in control. Organizations have to be able to adapt to constantly changing conditions – something that has become a habit for many. Employees have accepted the fact that the ‘job for life’ no longer exists and job hopping has become a common phenomenon. The only truly stable factors in an organization are the processes.
For decades, IT organizations have developed along hierarchical structures. Based on knowledge, technology, or responsibilities, organizational structures have become ‘stove-pipes’, focusing on line management. This structure has to be changed to a more process-based approach. Unfortunately, it appears that managing processes in combination with line management (the matrix organization) is very difficult for many organizations. Organizations tend to start enthusiastically, applying all sorts of processes without having a clear goal in mind. They often use the trial-and-error approach, to see ‘how far this gets us’ – an approach that tends to be quite unsuccessful.
In a more structured approach, organizations often use the PDCA cycle as an improvement instrument. PDCA stands for plan-do-check-act, the four sequential stages of an improvement initiative. The PDCA cycle is used in several reference models discussed in this chapter.
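The four sequential PDCA stages can be sketched as a simple loop. This is a minimal, illustrative sketch only: the numeric ‘quality’ score and the fixed target and gain values are hypothetical examples, not part of any reference model.

```python
def pdca(quality: int, cycles: int) -> int:
    """Run a number of PDCA iterations over a hypothetical quality score."""
    for _ in range(cycles):
        target = quality + 10           # Plan: set an improvement target
        result = quality + 7            # Do: execute; here a simulated gain
        target_met = result >= target   # Check: compare result to target
        # Act: consolidate what was achieved and feed it into the next cycle
        # (a real organization would also adjust its plan when target_met
        # is False)
        quality = result
    return quality

print(pdca(50, 3))  # three improvement cycles starting from a score of 50
```

The point of the sketch is the sequencing: each ‘act’ stage closes one cycle and provides the baseline for the next ‘plan’, which is why PDCA is drawn as a wheel in most reference models.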
1.4 Process-based reference models
The increased attention to process-based management, for better control, gave rise to many ‘models’. Processes have played an important part in these models since the early 1980s. In the following sections, we discuss the most relevant and prominent models in chronological order (Figure 1.2). At the end of this chapter you will find an overview of the trends that are indicative of an integrated solution.
1.5 ISMA (from 1979)
In the 1970s, IBM conducted research into the quality of IT services. In 1979, Edward A. van Schaik and some of his IBM colleagues developed ISMA (Information Systems Management Architecture) for mainframe computers. The publication ‘A Management System for the Information Business’ followed in 1985. In ISMA, OGC recognized the first initiative in developing a process description and it used the method as a reference for the development of ITIL.
In 1994, IBM started a project to replace ISMA. This led to the IT Process Model (ITPM: Ommeren & Kapoor 1997), which was designed as a generic management model for specifying and managing functions in IT management organizations. Since the model was quite complex, IBM also developed ITPM Light (Buijs & Kapoor 1998).
The purpose of ISMA was to provide an efficient management system for IT services.
The contribution to the development of ITSM is mainly the distinction between people, process, and product, and the recognition of several identifiable processes.
Description and main graphics
ISMA contained a series of grouped activities in which (steps of) processes and particularly procedures were described (Figure 1.3).
ISMA contained 42 ‘processes’, arranged as strategic, tactical, and operational processes. This set already contained several elements that would be given a lot of attention in later models, sometimes under a different name:
- service level planning
- security planning
- capacity planning
- change control
- resource and data inventory
- production and distribution scheduling
- problem control
- service evaluation
- financial administration
ISMA had a maturity model according to the five familiar levels, in this case renamed to Startup, Growth, Control, Planning, and Strategic Planning.
ITPM, a derivative of ISMA, was based on the assumption that every IT organization knows a series of fundamental processes, whatever the organizational structure and the technology that is used. A distinction was made between people, process, and product. The ITPM process model regrouped the 42 ISMA processes and contained 11 process groups. ITPM Light contained only ten (abstract) processes.
Looking at these ten processes, one can easily identify several important ITIL processes and functions, as well as several steps from the ITIL change management process:
● Provide operational support (in ITIL: service desk)
● Make and monitor agreements (in ITIL: service level management)
● Design the solution (in ITIL: change management)
● Select or create the solution (in ITIL: change management)
● Integrate and test the solution (in ITIL: change management)
● Implement the solution (in ITIL: change management)
● Monitor availability (in ITIL: availability management)
● Monitor resources (in ITIL: configuration management)
● Create strategy
● Create and monitor IT plan
ISMA is no longer used in practice and has never received much attention. The success of ITSM came many years after the publication of ISMA. Most of the attention in the 1980s was spent on developments in hardware and software. ISMA became known in later years because it was seen as ‘the origin of ITIL’.
ITPM never received much attention either. This is probably because it was a proprietary product and it could not, therefore, compete with ITIL.
Pros and cons
From a historical perspective, ISMA became well known because of the recognition that it was the first initiative to document process management, and that it influenced the creation of ITIL. The names and descriptions of several processes that later became the core of ITIL are derived directly from ISMA. However, just like ITIL, ISMA mainly described procedures as opposed to processes. The lack of an underlying
process model, which was also missing in ITPM, led to a fragmented view of the classified activities, especially their technological and organizational aspects.
1.6 Triple management model (Looijen, from 1986)
Separation of duties (or separation of concerns) has been used for many years as an instrument to make a system manageable and controllable. In separation of duties, a domain is divided into two parts that can monitor each other. Separation of duties provides a controllable system, where one domain specifies what the other domain has to do. This method avoids a situation where the provider has to monitor himself. This instrument can readily be applied to the information delivery domain of an organization.
In this respect, the work of Prof. Dr. Maarten Looijen (Delft University of Technology, the Netherlands) has been crucial. In his publications and courses, Looijen focused on the role of information management and exploitation. The increased attention to developing and implementing information systems had led to the understanding that better management of information systems was required to realize the desired
information delivery. The phases ‘design’ and ‘delivery’ became increasingly important and the lifecycle approach became the basis of Looijen’s work.
In 1985, Looijen described his ideas on management for the first time in the article ‘Exploitation of automation resources and data processing’, in the Dutch journal Informatie [Looijen 1985]. In 1986, his book ‘Exploitation of automation resources’ was published. Looijen finished his PhD in 1988 and was appointed at Delft University of Technology. In his thesis, Looijen published the FATO model, which distinguished between the areas of Functionality, Automation resources, Task areas and fields, and Organization. In his analysis, Looijen distinguished tasks, organization, and resources, which in fact already followed the structure of people, process, and product.
In 1991, Looijen published ‘EBM – A management method with SDM’, in which he emphasized that the IT supply organization should be involved in the early stages of the development of information systems, to prevent systems from being ‘thrown over the wall’. In 1992, in collaboration with Guus Delen, he published his views on the professionalization of IT management in the book ‘Management of information systems’ [Looijen, 1998].
The purpose of the triple management model is to create an efficient and effective system for the management of information systems.
The contribution to the development of ITSM consists mainly of the identification of the three sub-domains ‘information management’, ‘application management’ and ‘technical management’, and the resulting separation of demand and supply.
Description and main graphics
In the FATO model, Looijen described the activities that can be found in the later ITIL books. Among these are the following processes and functions:
● Change Management (ITIL: change management)
● Problem Handling (ITIL: incident management)
● Service Level Management (ITIL: service level management)
● Capacity and Planning (ITIL: capacity management)
● Information Center (ITIL: service desk)
● Protect (ITIL: security management)
● Availability (ITIL: availability management)
● Disaster Recovery (ITIL: continuity management)
Looijen distinguished three types of management (Figure 1.6):
● Information management (IM) – carried out by the User Organization (UO)
● Application management (AM) – carried out by the Maintenance Organization (MO)
● Technical management (TM) – carried out by the Processing Organization (PO)
Looijen described each of these domains in a Mintzberg Chart.
With the state model (Figure 1.5), Looijen described the lifecycle of the information system. In this model, the phases from development to operation and use were positioned, and a distinction was made between influences from the user organization and influences from the management organization.
Looijen provided a very important contribution to the recognition and professionalization of IT management. With his documentation of the three management forms (information, application, and technical management) and the corresponding task areas, he laid a solid basis for structuring the IT management discipline. He used this for many years in his teaching at Delft University, where many IT management specialists were educated in the nineties, until he retired in 2001.
The triple management model had a leading position only in the Netherlands. In other countries this view was hardly known, and instead ideas focused on a separation between demand and supply, which was analogous to the dichotomy between information management and technical/application management (Figure 1.6). This separation of information management and application/technical management
(or demand and supply) led to a separation of duties, one of the most basic principles of governance.
Opportunities and challenges
The idea that application management and technical management are two subdomains of the larger IT management domain, and that managing the service delivery (ITSM) is a specific discipline, did not become popular until the end of the nineties. In the IT management domain, the disciplines of application management and technical management should cooperate as effectively as possible, in order to deliver the best output of this domain (IT services) to the user organization. In this respect, application management and technical management have the same structure, and differ only in the object of management (applications versus other infrastructure). This difference makes sense, because the areas focus on different objects, but it can also be a risk: applying strict separation can threaten cooperation. At the end of the 1990s, ITSM was given a position in the work of Looijen, next to technical and application management.
1.7 ITIL (from 1988)
In the UK, the problems of the 1980s led to the development of a series of books that described the best practices in IT organizations. These books, known as ITIL, the IT Infrastructure Library, were developed as an initiative of the British government, in an attempt to mitigate government IT costs and improve IT quality.
ITIL started in 1986 as the ‘Government Information Technology Infrastructure Management Method (GITIMM)’, an initiative led by John Stewart of the Central Computer and Telecommunications Agency (CCTA). The objective of GITIMM was to enable people to manage IT infrastructure in a standardized manner. This would result in a decrease in the dependency on individuals and in streamlined work processes. This way, efficiency would increase and as a result costs would be reduced.
Stewart involved a large number of organizations, studied the available literature (such as the ISMA book), and set up a project team in order to document a series of best practices. The project team produced several small books with these best practices from 1988 onwards. The books initially appeared under the name GITIMM, but this was soon changed to the IT Infrastructure Library (ITIL). After the release of ITIL v2 and ITIL v3, this first version became known as ITIL v1.
Several organizations were involved in managing this first version of ITIL:
● OGC – Office of Government Commerce, an organization of the British government that merged with CCTA in 2000. The objective of the OGC was to help customers to modernize their procurement activities and improve their service delivery by using IT as efficiently as possible.
● itSMF – IT Service Management Forum. The itSMF was set up in 1991 as a British user group for ITIL, followed by a similar Dutch organization in 1993. In the following years, similar organizations appeared in several countries around the world. Until 1995, the organizations went by the name IT Infrastructure Management Forum, but the increasing focus on service instead of infrastructure led to the more appropriate new name. Nowadays, itSMF is a worldwide, independent, internationally renowned nonprofit umbrella organization, supporting the development of ITSM and mostly focusing on ITIL.
● Exam organizations – From 1991 onwards, ITIL exams were developed and an increasing number of professionals tested their knowledge. This development was stimulated by EXIN, which developed a foundation exam: ITIL Essentials, later renamed ITIL Foundation. The British BCS/ISEB and the Canadian LCS also used this exam, giving rise to its worldwide distribution. In the following years, other exam institutes joined these organizations. In 2007, OGC contracted out the responsibilities involving the ITIL exams to APMG.
From 2000 onwards, several of the first ITIL books were updated and a new edition was published. This reduced the overlaps (and sometimes inconsistencies) of the first series, and the coherence between subjects became more obvious. The vision for ITSM also became clearer.
Organizations increasingly became dependent upon IT. The contact between the IT management organization and the customer intensified, making the service aspect more important. ITIL v2 responded to this cultural shift from technology focus to service management.
The quality and efficiency of IT processes are core topics in ITIL v2. This version also examines the process maturity of an IT organization. Although ITIL v2 consisted of seven parts (Figure 1.7), two books received the most attention: Service Support (‘the red book’) and Service Delivery (‘the blue book’). This had everything to do with the fact that the main ITIL courses and exams dealt only with these two books.
In 2007, the third version of ITIL (v3) was published. ITIL v3 is a further development of v2 and focuses on the lifecycle of an IT service. The processes of v2 were elaborated to create a better connection with the company strategy. The service lifecycle and continual service improvement were developed to serve this purpose. The starting point for ITIL v3 is not technology (as in version 1), or processes (as in version 2), but service delivery (IT services). The service lifecycle forms the core of ITIL v3.
In 2011, a fourth version of ITIL was published, and from here on all ITIL versions will be marked with the year in which they appear, as in ‘ITIL 2011’. This version mainly fixed issues in the ITIL 2007 version. Besides these corrections, several processes were added, and others were removed or rewritten. The Service Strategy book was almost completely rewritten. From January 2012, all training and exams were adapted to this latest version. In terms of the nature of ITIL, nothing has changed: it is still a huge collection of suggestions, examples of procedures, and descriptions of resources (best practices) that can be used in an IT management organization.
The purpose of ITIL is to provide best-practice guidance for IT service management.
The contribution to the development of ITSM consists mainly of practical documentation of a large number of activities in IT management organizations, particularly of various core processes, defining the concept of IT service, and providing a uniform terminology for various organizations.
Description and main graphics
The first series of nearly 50 books that became known as ITIL v1 offered guidelines on a variety of subjects that concern the average IT organization on a day-to-day basis. This series discussed a number of challenges in the IT management domain in titles such as Service Level Management, Help Desk, Contingency Planning, and Change Management. The most famous books were arranged in ‘sets’ (Figure 1.7).
ITIL v2 is seen as a ‘process-based framework’. Where the individual ITIL v1 books were presented in ‘sets’, the components of ITIL v2 were documented in just a few books (Figure 1.8):
● Service Support
● Service Delivery
● Security Management
● Application Management
● Planning to Implement Service Management
● ICT Infrastructure Management
● Business Perspective volumes 1 and 2
In ITIL v3, the familiar ITIL processes from the previous versions are divided over several lifecycle phases. ITIL v3 distinguishes five phases in this service lifecycle (Figure 1.9). These phases are:
- Service strategy – What services do you offer?
- Service design – How do you design a new service?
- Service transition – How will the new service become operational?
- Service operation – How do you deliver good services?
- Continual service improvement – How do you improve the services regularly?
Some ITIL v3 processes have interfaces with multiple other processes.
ITIL 2011 was released as an update to ITIL v3, but it also added new practices and changed the descriptions of many other practices. Practices that were added are: strategy management for IT services, business relationship management and design coordination. Simultaneously, several practices disappeared, particularly in the completely rewritten Service Strategy book and in the Continual Service Improvement book.
The IT Infrastructure Library (ITIL) is widely regarded as the de facto standard for ITSM. ITIL is still the most complete reference model for ITSM.
The first set of ITIL books approached ITSM mainly from the viewpoint of technology, and the rapid changes in the IT environment soon caused some parts of ITIL v1 to be regarded as old-fashioned. However, the ITIL books had very little competition during the 1990s and were therefore regarded as the de facto standard for ITSM. This development was strongly encouraged by the worldwide market that arose around
ITIL. ITIL developed into a set of products and services that became the core business of many companies; the offering included books, courses, examinations, conferences, consultancy, and tools. With this huge market, ITIL became a great success – something that is clearly visible when looking at the increase in the number of ITIL certificates issued since 1992 (Figure 1.10). ITIL Foundation exams were a particular success. A small proportion of the certificates are ‘Practitioner’ certificates and an even smaller proportion are ‘ITIL Service Manager’ certificates. By the year 2000, after ten years of existence, 60,000 ITIL certificates had been issued. In 2008 the magic number of 1,000,000 certificates was reached. In 2012, approximately 2,000,000 people held an ITIL certificate.
In 2006, OGC subcontracted APMG for the management of ITIL rights and ITIL certification exams, and for the accreditation of training organizations. APMG created a whole new certification scheme for ITIL v3. OGC subcontracted the publishing rights for ITIL to TSO.
ITIL v2 was phased out by OGC on June 30, 2011. Since January 2012 all exams and training have been based on ITIL 2011. In spite of criticism of the broader and more abstract approach of ITIL v3, the distribution of ITIL continues.
Opportunities and challenges
ITIL brought many benefits. It finally delivered a reference model for introducing ITSM. The great value of ITIL lies in the wide acceptance of a single ‘language’, allowing people in different organizations around the world to understand each other, and defining a series of key ‘best practices’ for a process-based management organization.
However, ITIL also came with some disadvantages. For many years, ITIL focused on technology, and it took until 2000 for the emphasis to shift to services. Furthermore, the complex ITIL books were by no means an easy read. The absence of a good architecture and control led to many inconsistencies.
In the course of time, ITIL was often applied to the letter, with limited attention to the cultural change that was especially necessary in management. Through this overly restrictive and incomplete approach, many ITIL projects did not have the desired effect. ITIL was not developed as a framework that is to be implemented in its entirety.
Although ITIL is widely perceived as a set of process descriptions, this is not actually what it is. In all versions this is explicitly not even the goal of ITIL. ITIL is a set of practices for many IT management topics, documented in a series of books. ITIL is no more and no less than a set of best practices, of which only a small number are described in a process-like manner. An often-heard criticism of ITIL is that it does not provide a proper process description. But in fact this criticism is not justified: even though ITIL uses the word ‘process’ in many instances, the introduction states clearly that ITIL is about best practices, and not about processes.
Describing best practices is a good way of making positive experiences available to others, and has proven to be useful. However, this approach also has a major disadvantage. A collection of individual best practices does not enforce an architecture. If the descriptions are designed and developed differently, this will create an inconsistent and/or redundant structure. The relationship between the documented practices is not enforced from a shared architecture and is sometimes difficult to recognize.
ITIL describes many topics as a function, while other topics are described with the focus on operational elements. Thus, the practices described in the ITIL books are not actual processes, but often functions or procedures. In fact, all these ‘processes’ are a mix of process, product, and organizational elements (people, process, and product), resulting in a rather opaque and inconsistent framework.
A major change in ITIL v3 was its focus on the integration of IT and business: the focus shifted from executing processes to managing IT in a lifecycle. ITIL v3 was poorly received. Its increased complexity was not beneficial to the market, because numerous organizations were not yet ready for a more complex version 3 of ITIL.
What many organizations were still missing in v2 – for example, the focus on culture, the ITIL process model, and its relationship with the supporting products – was not available in v3 either.
Resistance was not only caused by the increased complexity: the inconsistencies between different parts of ITIL v3 also provoked criticism. This led to OGC’s announcement that a project would be started in 2010 in which these defects would be repaired (‘ITIL v3.1’). In July 2011, this new version was launched as ‘ITIL 2011’. The Strategy book was rewritten to some extent, to improve its structure, and a number of inconsistencies and shortcomings in the other books were repaired.
ITIL has a major influence on ITSM and is very important to the discipline, but it does not provide a complete and coherent structure for the set-up and management of an IT organization. ITIL needs to be completed and adjusted to fit into local practice.
This makes ITIL a very useful tool for setting up an IT management organization, but it is not a solid and complete method. This means that ITIL cannot be successfully applied without additional improvement of the structure, the culture, and the product.
1.8 PRINCE (from 1989)
The PRINCE project management method (PRojects IN Controlled Environments) was developed in 1989 by CCTA as the British standard for IT project management. The reason for the development of PRINCE was that the turnaround times and costs of IT projects were constantly exceeded. In 1996, a revised version was released under the name PRINCE2. After some minor modifications in 2002 and 2005, PRINCE2 was substantially revised in 2009 and released as PRINCE2:2009. Today, PRINCE2 is managed by OGC.
The purpose of PRINCE2 is to manage projects successfully.
PRINCE2 is not part of the field of ITSM: it is used only as a supporting method, but it fits in well with ITIL.
PRINCE2 is not an ITSM reference model. However, it is discussed here anyway, based on the following arguments:
● In Europe, PRINCE2 is the leading project management method.
● Many ITSM implementations use PRINCE2.
● PRINCE2 is often used in major changes.
PRINCE2 is based on four approaches:
● seven principles
● seven themes
● seven processes
● tailoring the method
Seven principles form the basis of PRINCE2. These can be used in each project. Without the use of these principles, a project does not qualify as a PRINCE2 project:
- business justification
- learning lessons
- roles and responsibilities
- management by stages
- management by exception
- product focused
- tailoring to suit the project environment
The PRINCE2 themes describe the aspects of project management that require attention during the whole project:
- business case
- organization
- quality
- plans
- risk
- change
- progress
PRINCE2 uses a process-based approach. The method recognizes seven processes that control, manage, and deliver the projects:
- starting up a project
- initiating a project
- directing a project
- controlling a stage
- managing stage boundaries
- managing product delivery
- closing a project
Figure 1.11 shows how the principles, themes, and processes together form the PRINCE2 project environment.
PRINCE2 is a method commonly used in Europe for managing IT projects. Since the method is generic, it is also increasingly used outside IT. Furthermore, PRINCE2 projects can be tailored to the type, size, and environment of the project.
In the United States, another project management method is popular: PMBoK (Project Management Body of Knowledge).
Opportunities and challenges
Some well-known benefits of PRINCE2 are:
● controlled start, implementation, and completion of a project
● good insight into the progress of the project
● continuous focus on the justification of the project
● clearly defined roles and responsibilities
● quality control during the whole lifecycle of the project
● common vocabulary
● learning from experience
The PRINCE2 management products are available as a template at no cost. Although PRINCE2 can be applied by any organization, the training and test market is highly regulated.
A major pitfall of PRINCE2 is that organizations sometimes apply it too rigidly. If the method is not tailored to the size and complexity of the project, it can lead to excessive bureaucracy. PRINCE2 is a tool, not an end in itself. Furthermore, PRINCE2 pays little attention to the leadership qualities of the project manager and to other competencies needed to perform the various project roles.