An Iterative Requirements Specification Procedure for Decision Support Systems


Journal of Medical Systems, Vol. 11, No. 4, 1987

An Iterative Requirements Specification Procedure for Decision Support Systems

Cyril H. P. Brookes

Requirements specification is a key element in a DSS development project because it not only determines what is to be done, it also drives the evolution process. A procedure for requirements elicitation is described that is based on the decomposition of the DSS design task into a number of functions, subfunctions, and operators. It is postulated that the procedure facilitates the building of a DSS that is complete and integrates MIS, modelling, and expert system components. Some examples given are drawn from the health administration field.

INTRODUCTION

Objectives

The procedures described in this paper are intended to guide the designers of a DSS--assumed to be a team comprising, at least, one user and a facilitator--through the specification stages of the project. Their primary focus is on assisting the team to decompose the support environment into elements that can then be considered in detail. The result will be a modular DSS that utilizes the technology available in a complete manner. The examples given are drawn from a health administration environment.

The Specification of Effective Decision Support Systems

The underlying objective assumed for a DSS is improved performance in the decision processes of the system's users. Requirements specification is seen to be a critical element in DSS development since so much uncertainty surrounds these projects, particularly in terms of (a) the explication of actual requirements, as opposed to those that are initially perceived to be desirable; (b) the most effective development path that leads from initial ideas to the final product; (c) the feasibility of implementing the specifications successfully; (d) the difficulty of knowing when the project is complete, or, at least, when further work is unjustified.

From the University of New South Wales, Sydney, Australia. © 1987 Plenum Publishing Corporation.

Facilitating the Evolutionary Implementation of DSS in a Series of Independent Iterations

It is now widely accepted that decision support systems are most effective if they are permitted to evolve rather than requiring the design and implementation to be performed to a specification that is determined at the start of the project. 1,2 What has not been addressed in the literature is consideration of those procedures that can guide the DSS requirements specification process so that it is compatible with the need for evolutionary development. It seems to be assumed that the evolution through a number of versions or prototypes should be a somewhat random process without any attempt at using formal guidelines. The design team searches for an idea or ideas as a starting point, builds version 0, and then initiates an iterative process that is directed by perceptions of utility that may, or may not, lead to a comprehensive assessment of the benefits a DSS could yield. This paper proposes a structure within which the evolution of a DSS can take place in such a way that (a) the iterations are "independent"--that is, later iterations build on earlier ones, thereby avoiding the need to adjust to new reporting and interface procedures; (b) the scope of the specification is complete, implying that the procedure leads the designer and user to investigate all possible sources of DSS support.

Basic Assumptions Underlying the Specification Process

The requirements elicitation method is based on a functional decomposition procedure developed following extensive research and pilot studies. 3 The following assumptions form an essential foundation to the procedure:

1. The model of the decision-making process first proposed by Mintzberg et al. 4 is an appropriate formulation of the cognitive tasks that need to be supported by the DSS.
2.
Equal consideration should be given during the elicitation process to support for problem finding as well as to support for problem solving.
3. There is an important need to recognize the complementary nature of hard and soft data, and to specify the system accordingly. 5,6
4. There is a need to integrate the specifications of the "expert system type" elements of the DSS with those for the more conventional analysis and MIS reporting components.

The Importance of Soft Data as a Source of Management Information

Almost all the research work and publications dealing with decision support systems focus on the analysis and reporting of numerical, formatted, "factual," historical data. However, increasing emphasis is being placed on the role of the informal information system, which frequently communicates "soft data." Mintzberg's work 7 on the nature of managerial work outlines a number of important findings,

including the following: (a) a high proportion of managers' time (between 60 and 90%) is spent in verbal (oral) communication; (b) managers dislike, and have no time for, reflective activities; (c) managers have a strong preference for "soft" information--e.g., explanations, gossip, ideas, opinions, scenarios, predictions. In earlier work 5,8 I have outlined a classification of soft data types that appears relevant to the decision support system designer, and I have also indicated the ways in which soft data might be incorporated within DSS design. Two aspects of these findings are particularly relevant in this context: the problem of communicating soft data, and the types of soft data used by managers. The problems in communicating soft data are significantly different from those experienced in the reporting of factual, numerical, historical data from a database. Many of these difficulties arise, as might be expected, from the informal nature of much soft data and also from the fact that it is almost always encoded in text rather than numeric form. Summarizing the communication problems for soft data:

1. It is not formatted, hence access and indexation are difficult.
2. Its existence is often not known to those who may need it.
3. The meaning of text can be obscure, causing ambiguity.
4. Accuracy depends on the "source" reliability and cannot be verified by a remote user.
5. Security restrictions set by the "source" may not follow organizational or hierarchical access rules normally applied to hard data in a database.
6. It often has a short life-span.

It follows that the means for incorporating soft data into a DSS environment must be on a basis that removes or negates most of these difficulties. Usually this requires that the source or author of soft data and the user be brought into direct communication so that they can assess accuracy, meaning, security, etc.
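A minimal sketch of how a DSS knowledge base might represent soft data so that most of these six difficulties can at least be surfaced, and so that source and user can be brought into direct contact, is given below. Every field name and the matching rule are illustrative assumptions of mine, not part of the paper's design.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SoftDataItem:
    topic: str            # problem 1: an explicit index term, since free text is unformatted
    author: str           # problems 4 and 5: accuracy and access are judged via the source
    summary: str
    reliability: str      # e.g. "high", "unverified" -- cannot be checked by a remote user
    expires: date         # problem 6: soft data often has a short life-span
    restricted_to: set = field(default_factory=set)  # problem 5: security set by the source

def matchmake(items, topic, user, today):
    """Return the *authors* a user should talk to, rather than the raw text:
    the system brings source and user into direct communication."""
    return sorted({
        it.author for it in items
        if it.topic == topic
        and it.expires >= today
        and (not it.restricted_to or user in it.restricted_to)
    })
```

The key design choice the sketch illustrates is that the system returns people, not documents, so that meaning, accuracy, and security can be negotiated directly.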
This indicates a need for systems other than the "arm's-length" information retrieval relationships that are commonly used for factual data retrieval, and a primary role for the system as a "matchmaker," bringing together those who have soft information with those who want it.

The Interface Between Sources of Hard and Soft Data

Obviously, the formal inclusion of soft information reporting within a DSS can be difficult, although attempts can be made to turn soft into hard data by conversions such as the Likert scale. There is evidence, however, that the effectiveness of hard information reporting, the determination of the implications of model results, and the results of rule interpretation in expert systems can all be improved if the appropriate soft data are available at the same time. 5 It should be noted that soft information is frequently not "fully explicated," and it may need to be created by the interaction of a number of people taking part in a meeting or a decision conference. Specifically, six techniques that can be used to facilitate the soft/hard linkage are as follows:

1. Design numeric reports for an area so they match human sources of soft data about the area--e.g., "John and Bill's payroll report," "Bill and Jill's sales report."

2. Include free-format fields in databases to contain text comments.
3. Arrange meetings and design matching reports so they encourage soft and hard data exchange about the same area.
4. Utilize meetings, decision conferences, etc., to propose problem structure and to estimate probabilities and utilities for incorporation into models.
5. Maintain links between items on the word-processing or electronic mail files and the numeric database.
6. Build special-purpose systems to facilitate asynchronous exchange and "brainstorming" of data on a routine basis--e.g., bulletin boards or the "corporate intelligence system." The concept of a corporate intelligence system has been developed in a separate publication. 6

THE STRUCTURE OF A DECISION SUPPORT SYSTEM

Specific Components of the Decision Process to Be Supported

As a result of an analysis of a large number of MIS reports, case studies, and interviews with middle and senior managers of a range of organisations, and using Mintzberg's 1976 model 4 as a guide, it is apparent that the relevant cognitive tasks that should be the focus of DSS design are as follows:

Status Assessment--Seeking Comfort and Finding Problems. The decision maker seeks both to evaluate and to understand the current status, and scans the environment until he/she identifies a situation that requires a response--i.e., finds problems, crises, opportunities, or previous decisions that did not have the anticipated effect.

Diagnosis. The availability of information and intelligence about the problem area is investigated. The decision maker learns more about the kind of problem being faced, its implications, and its possible causes. Details of similar problems existing in other parts of the organization, and general cases of specific problems, are sought.

Generating Alternative Solutions.
The decision maker scans the environment looking for "packaged" solutions which have been used on previous occasions or which have been prepared for situations such as the one now encountered. Alternatively, the decision maker designs one or more new alternatives specifically for this problem occurrence.

Evaluating Outcomes of Alternatives. The decision maker reviews the various alternatives and determines those that appear viable. The "utilities" of anticipated outcomes are compared and ranked according to preference criteria elicited from problem owners. The utility of each alternative and the preference criteria will include abstract as well as concrete value components.

Judgment. This task is deemed to be entirely subjective and cannot be directly supported by the DSS other than through the provision of support for earlier phases.

Overall DSS Architecture

General Principles. The preceding analysis can lead to the formulation of a general architecture for DSS that ought to provide the complete functionality required (Figure 1).

[Figure 1. Tasks supported by the DSS functions.]

Of absolutely critical importance is the need to distinguish between those elements of the DSS that support mental activity before a problem is recognized (status assessment) and those that follow problem finding (problem solving).

Supporting Status Assessment. In the absence of a "situation requiring a response," the manager will expect the DSS to be scanning the environment and reporting information. Although the manager may expect to do some browsing through the knowledge base, it has been my experience during all of the studies I have conducted on decision support system environments that managers rely almost entirely upon routine output from the information system and on soft information contacts in order to satisfy the needs of status assessment. The routine analysis and reporting emanating from this function of the DSS can be seen to comprise three components: support for comfort generation, support for problem finding using prespecified inference rules and models approved by the user, and support for problem finding using "alerting" techniques based on general rules for recognizing unusual trends or coincidences.

Support of Problem-Solving Activity. Problems will be identified both as a result of the analysis of routine reports and following many other stimuli, some of them completely outside the scope of the DSS.
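One common way to realize a general alerting rule of the kind just described--offered purely as an illustrative sketch, not as the paper's own method--is a statistical outlier test applied to each monitored time series:

```python
from statistics import mean, stdev

def alert_unusual(series, threshold=3.0):
    """Flag the latest observation when it lies more than `threshold`
    standard deviations from the historical mean.  The threshold and the
    test itself are illustrative assumptions."""
    history, latest = series[:-1], series[-1]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # a perfectly flat history: any change at all is unusual
        return latest != mu
    return abs(latest - mu) / sigma > threshold
```

A routine introspective scan would run such a rule over every parameter in the knowledge base and report only the exceptions, which is what allows alerting to operate without a prior request from the manager.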
Following Mintzberg's 4 analysis, it is proposed that the three cognitive tasks that need to be supported in order to aid the problem-solving activity are (no matter how the problem was first identified): (1) diagnosis, to help in assessing implications and potential causes of the problem; (2) generation of alternatives, including the search for precedents and related problems and assistance with the design of custom-built solutions; and (3)

evaluation of alternatives, using models, inference rules, and detailed analysis of the knowledge base in order to assess sensitivity, risk, and likely comparative benefits.

DSS Functions, Subfunctions, Operators, and Techniques

A basic objective of the procedure described in this paper is to provide guidelines for the decomposition of the total DSS project. This segmentation is done at three levels--functions, operators, and techniques. In the section on specific components, the functions to be supported by the DSS were introduced, and these form the basis for the first level of decomposition (Figure 2). The second level comprises a set of operators that, in total, implement the functionality implied by the first-level concepts. These operators focus the attention of both the user and the designer on narrow aspects of potential support from the system. Each of these operators is likely to form the basis for one or more iterations of the evolutionary development life cycle as suggestions are conceived, implemented, and evaluated. Examples of the required operators are presented in subsequent sections. The third level of decomposition is the set of computational, data retrieval, and evaluative tools and techniques that must be employed by the system developers to implement the specification as it unfolds following the consideration of each operator. There are a large number of techniques that may be applicable in the implementation of a DSS, and many of them can be used to implement more than one operator. They include, as might be expected, a wide range of arithmetical, statistical, modeling, inference rule building, data editing and retrieval, time series analysis, and cataloguing procedures. The requirements elicitation procedure is sequential, leading the design team to concentrate its attention on one function at a time.
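The decomposition just described can be sketched as a simple data structure that drives the sequential elicitation walk. The function, subfunction, and operator names below follow the sections of this paper; the data structure and function names themselves are assumptions of mine, not part of the procedure's definition.

```python
# Function -> subfunction -> operators, as catalogued in this paper.
DSS_DECOMPOSITION = {
    "status assessment": {
        "problem finding and comfort": ["summarization", "inferential assessment",
                                        "standards comparison"],
        "alerting": ["factual alerting", "intelligence alerting"],
    },
    "problem solving": {
        "diagnosis": ["detail amplification", "structure modeling",
                      "implication assessment"],
        "generation of alternatives": ["soft data cross-reference",
                                       "model augmentation"],
        "evaluation of alternatives": ["detailed fact retrieval",
                                       "preference formulation",
                                       "what-if processing",
                                       "evaluation assessment"],
    },
}

def elicitation_walk(decomposition):
    """Yield (function, subfunction, operator) in the sequential order in
    which the design team would consider each operator in turn."""
    for function, subs in decomposition.items():
        for subfunction, operators in subs.items():
            for op in operators:
                yield function, subfunction, op
```

Each tuple yielded by the walk corresponds to one or more candidate iterations of the evolutionary development life cycle.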
Each operator relevant to that function is considered for its applicability to the problem area, and individual techniques are selected for their compatibility with the manager's way of working. The design process then

[Figure 2. Hierarchy diagram of the DSS architecture.]

proceeds iteratively in an evolutionary manner until all operators have been evaluated. Progressive implementation is usually desirable--at the rate of one or more iterations per operator. Considerable experience has been gained with the use of the procedure. Different sequences have been found to be desirable, depending on the type of problem environment being studied.

AN OVERVIEW OF THE PROCEDURE

DSS Functions and Subfunctions

These include (1) status assessment, which comprises (a) problem finding and managerial comfort (routine reporting with prespecified models and inference rule sets), and (b) alerting (routine introspective analysis of the knowledge base, with reporting as necessary depending on results); and (2) problem solving, comprising (a) diagnosis (prespecified, on demand, triggered, or ad hoc analysis), (b) generation of alternatives (usually ad hoc analysis and reporting, but may be prespecified), and (c) evaluation of alternatives (usually ad hoc, but may involve prespecified models, inference rule sets, or reports).

Examples of Operators Implementing the Functions and Subfunctions

There are a number of operators that have been found to be essential to the implementation of the procedure.

Status Assessment--Problem Finding and Managerial Comfort

Summarization. Factual analysis, time series statistics, performance indices. Reporting should be compatible with sources of soft data.

Inferential Assessment. Best and worst cases, good and bad, acceptable or not. Sets of rules to facilitate support of this assessment.

Standards Comparison. Actual versus budget, last year versus this year, plus other comparisons that assist in assessment of relative merit.

Status Assessment Operators--Introspective Alerting

Factual Alerting. Search for unusual events, coincidences, trends.

Intelligence Alerting. Search for unusual intelligence messages or reports.

Problem-Solving Operators--Diagnosis

Detail Amplification.
Expand the factual background of the problem area.

Structure Modeling. Build financial, simulation, optimizing, or multiattribute models of the problem. Refine models, if required, as problem solving proceeds.

Implication Assessment. What result if no action? Use models to predict. What happened last time? Extract and analyze performance indices for precedents.
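The "use models to predict" step of implication assessment might, under the simplest possible assumptions, look like the sketch below: a least-squares trend projection answering "what result if no action?". The function name and the choice of a linear model are mine, not the paper's.

```python
def project_no_action(history, periods_ahead):
    """Fit a least-squares line to past observations and project it
    forward, estimating the outcome if no corrective action is taken."""
    n = len(history)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(history) / n
    # ordinary least-squares slope and intercept
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, history)) \
            / sum((x - x_bar) ** 2 for x in xs)
    intercept = y_bar - slope * x_bar
    return intercept + slope * (n - 1 + periods_ahead)
```

In practice the model would have been chosen and refined during the structure-modeling iteration; the point here is only that the operator reduces to "extrapolate the status quo."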

Problem-Solving Operators--Generation of Alternatives

Soft Data Cross-Reference. Analyze the soft data available and cross-reference it for relevance to precedents or suggested options.

Model Augmentation. Augment the models used to determine structure to incorporate characteristics of options under review.

Problem-Solving Operators--Evaluation of Alternatives

Detailed Fact Retrieval. Extract facts about options from the database.

Preference Formulation. Elicit and codify goals, utility measures, preference rule sets, and/or optimization criteria relevant to feasible options.

What-If Processing. Navigate models, perform sensitivity checks, calculate utilities, store results in the database.

Evaluation Assessment. Process utility, risk, and preference models; determine a recommended ranking of options where appropriate; and report.

Tools and Techniques for Implementing the Operators

Assessing the usefulness of, and performing detailed analysis on, the relevant tools and techniques to implement each operator will be relatively simple for the designer who is experienced with this type of system implementation. This is the type of design activity frequently used in DSS projects. The essential difference between the procedure outlined in this paper and conventional practice is the focus of the work. Many DSS projects are tool- or technique-driven. Because Lotus 1-2-3, DBASE III, or an expert system shell is available, the user or designer considers how he/she may use this tool in a specific problem environment. Although the result may be valuable, it probably does not represent the total, or even the most useful, support that is possible in this problem situation. Using tool/technique availability to guide development usually reduces the creative element in the design. Experience with the use of this procedure shows that by focusing attention on specific operators, in turn, a complete and satisfying DSS evolves in a series of independent iterations.
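To make the evaluation operators concrete, here is a hedged sketch of Preference Formulation feeding Evaluation Assessment, assuming a simple additive utility model. The paper does not prescribe any particular utility model; the weights, criteria, and function names below are all illustrative.

```python
def rank_alternatives(alternatives, weights):
    """Combine per-criterion scores into a single utility using
    user-elicited weights, then rank the alternatives best-first."""
    def utility(scores):
        # additive utility: an assumption, not the paper's prescription
        return sum(weights[c] * scores[c] for c in weights)
    return sorted(alternatives, key=lambda a: utility(a["scores"]), reverse=True)

# Hypothetical usage, echoing the cash flow example discussed later:
alts = [
    {"name": "cost control", "scores": {"cash": 0.9, "service": 0.4}},
    {"name": "raise prices", "scores": {"cash": 0.7, "service": 0.6}},
]
ranked = rank_alternatives(alts, {"cash": 0.7, "service": 0.3})
```

The weights are exactly what Preference Formulation elicits from the problem owner; Evaluation Assessment then reduces to applying them and reporting the ranking.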
This appears to be superior to the relatively ad hoc approach described in an earlier section. The same tools may be used, but their application is controlled and likely to be more comprehensive. The appropriate tools and techniques cover a wide spectrum, including database profile analysis, time series analysis, quality control analysis, modeling (spreadsheet, multiattribute, statistical, simulation, and optimizing), inference rule building or expert system shells, knowledge base catalogs, and text communication systems (especially to assist with soft data interfaces).

EXAMPLES DRAWN FROM HEALTH ADMINISTRATION

To illustrate the application of the procedure, some examples are given of the approach that may be taken to requirements specification.

Status Assessment

For a hospital administrator, the status assessment segment of a DSS could include the following aspects. Naturally, it is not essential to implement each operator for every

DSS. The procedure requires that each operator be considered for its utility, but many of them will not be included in the final system.

Summarization. To support comfort generation and problem finding, MIS-style reports would be appropriate that allowed the administrator to feel "comfortable" that he/she was aware of the status quo. Typically, the reports would include statistics on admissions, transfers, and separations; bed occupancy; staff levels; overall costs; areas of staff shortage; volume of clinical tests; etc. A variety of performance indices, such as cost per patient bed-day or operations per bed-day, may also be included. The essential factor in this iteration of the specification is the focus on comfort and problem finding. Discussion with the executive should ensure that these concepts are the primary criteria for determining the parameters to be included and the cutoff level for detail. It is, of course, critical that the level of detail reporting required for diagnosis and other problem-solving subfunctions be avoided for this component of the DSS. Soft data relevant to these reports might be identified, such as the supervisors' comments on the level of performance in their clinics or wards and predictions as to future performance. The relationship between the presentations of the two classes of data should then be specified.

Performance Assessment. To assist the administrator's inferential reasoning processes, a set of rules, in effect a mini expert system, may be devised to support problem finding and comfort generation by advising the best and worst sections, wards, clinics, etc., or those whose performance is or is not acceptable.

Standards Comparison. Apart from common concepts such as actual versus budget reporting, this operator is intended to focus the design team's attention on comparison as an indicator of problems and a generator of a "comfort" feeling.
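A standards-comparison rule of this kind might be sketched as follows; the benchmark, the tolerance, and the function name are all illustrative assumptions rather than part of any specified system:

```python
def out_of_specification(actual, benchmark, tolerance=0.20):
    """Flag a monitored parameter when it exceeds its benchmark
    (e.g. a case-mix-adjusted standard) by more than `tolerance`."""
    return actual > benchmark * (1 + tolerance)
```

A comfort report would list the parameters for which this returns False; the exceptions become candidates for problem finding.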
For example, costs per patient bed-day for a hospital may be compared against a standard, or average, of all hospitals of this type. Rules may be formulated that highlight comparisons that are out of specification, such as finding cases where the number of pathology tests per patient is more than 20% above a benchmark calculated to reflect the current mix of patients.

Alerting. The alerting operators all have the purpose of bringing potential problem situations to the administrator's attention. Common means for implementing them include the monitoring of time series parameters and advising when statistically significant changes occur. For example, the trend in pathology services per patient for a particular clinic, or the pharmacy charge rates, may alter significantly.

Problem-Solving Support

The specification of support for the problem-solving subfunctions (diagnosis, alternative generation, and alternative evaluation) follows similar steps, except that it may be necessary to focus on establishing a suitable decision support environment in which future (but unknown) problem situations can be handled. Since the procedure is similar to those already presented in the preceding section, only two problem-solving operators are considered in detail.

Diagnosis--Implication Assessment. This process will be entered when a specific problem becomes apparent--for example, a serious cash flow deficiency for the hospital--or when a class of problems of this type is envisaged. Its aim is to support the administrator in determining the seriousness of the situation. Implication assessment could be implemented using budget or regression models to forecast cash flows over the next 12 months. The models would have been specified in the "structure modeling" iteration.

Alternative Evaluation--Preference Formulation. During the evaluation of alternatives phase of problem solving, the administrator will need to codify his/her preferences and utilities for incorporation into the models and rule sets used for the What-If Processing iteration and that for Evaluation Assessment. Thus, for the cash flow problem, the relative merits of cost control versus increasing prices, medical services versus catering or laundry services, etc., may need to be formulated.

CONCLUSION

This requirements elicitation procedure has been applied in over 50 situations of varying complexity and urgency. In each case the problem decomposition framework has been valuable for one or more of the following reasons:

1. It is not necessary for the manager to answer directly the global question "What information do you want?" Rather, the answer to that question is obtained as a series of responses to questions relating to operators, which appear relevant to the manager's way of thinking and working.
2. The managers find that the procedure is closely related to the way they use their information resources in practice. Therefore, iterations can be independently implemented.
3. It is possible to integrate the benefits of applying MIS, modeling, and rule-based procedures within one specification and implementation process.
4. The elicitation process provides an effective development framework, since the sequence of operators will also determine the implementation evolution.

REFERENCES

1. Keen, P.G.W., & Scott Morton, M.S., Decision Support Systems: An Organizational Perspective. Addison-Wesley, Reading, Massachusetts, 1978.
2. Keen, P.G.W., & Gambino, T.J., Building a decision support system: The mythical man-month revisited. In Building Decision Support Systems (J.L.
Bennett, ed.), Addison-Wesley, Reading, Massachusetts, 1983.
3. Brookes, C.H.P., A framework for DSS development. Transactions DSS-85, Institute for the Advancement of Decision Support Systems, San Francisco, April 1985.
4. Mintzberg, H., Raisinghani, D., & Theoret, A., The structure of "unstructured" decision processes. Admin. Sci. Q. 21 (June 1976).
5. Brookes, C.H.P., Text processing as a tool for DSS design. In Processes and Tools for Decision Support (H.G. Sol, ed.), North-Holland, Amsterdam.
6. Brookes, C.H.P., A corporate intelligence system for soft information exchange. In Knowledge Representation for Decision Support Systems (L.B. Methlie and R.H. Sprague, eds.), North-Holland, Amsterdam.
7. Mintzberg, H., The manager's job: Folklore and fact. Harvard Bus. Rev., July/August 1975.
8. Brookes, C.H.P., Incorporating Text Based Information within a Decision Support System. DSS-81, Execucom Systems Corporation, 1981.