Impact and Effectiveness of End-User Developed Information Systems in a Process Control Environment: A Case Study

William Thomas Gaillard
Herbert E. Longenecker, Jr. [1]
Roy J. Daigle [2]
University of South Alabama
School of Computer and Information Sciences
Mobile, AL 36688, USA

Abstract

Can end-users in a process-oriented organization develop effective information systems? In anticipation of a positive response, management at a major pulp and paper manufacturing company recently invested in an end-user-centered system at one of its plants. The system consists of a data warehouse and an end-user application development environment for accessing data for process-oriented decision-making. Over time, user-developed systems came into use throughout the plant. In this paper we discuss a synergistic, management-developed approach to assessing the effectiveness of the end-user development environment and the implications for its use in any organization. Consistently, these end-user developed systems enjoyed a high degree of perceived satisfaction and produced an excellent return on investment.

Keywords: End-user application development, user satisfaction, key measures

Traditionally, systems analysts and designers are the builders of information systems. The people who employ those systems in their day-to-day job activities are referred to as end-users. The major reasons identified for software development failures involve lack of user participation, inadequate management support, ill-defined requirements, large implementation delays, and lack of good project management (Johnson 1995; Wilder 1998). Tangible evidence of failure includes late delivery, budget overruns, and outright failure of systems to meet performance demands (Johnson 1995).

A goal of information systems is to apply information technology to support people in increasing their productivity and thereby increase the profitability of the organization (McNurlin 1997). One way to achieve this goal is through end-user application development. End-user application development is an approach in which end-users acquire and/or develop software without the assistance of programmers, and quite often without systems analysts (Hicks 1990). End-user application development, which originated in the late 1970s, has been growing rapidly for a number of years. It now represents a very significant, and in many cases the dominant, form of computing in many organizations (Huff 1992).

1. BACKGROUND

Prior to 1995, the Information Systems (IS) department in a major pulp and paper manufacturing company had traditional responsibilities: most IS personnel time was spent developing COBOL applications for payroll and accounting. Over time, computer-literate end-users (line supervisors and operators at the process control level) began to see ways to use computing capabilities to improve their day-to-day activities. They requested assistance from the IS department to build decision-support applications. The IS personnel's initial approach was a traditional systems development approach in which end-users worked with IS personnel to develop process control decision-support applications. However, end-users were really in an exploratory mode; they did not know what they wanted. They could not give clear specifications . . . and the rest of the story is predictable.
As end-users developed awareness of existing data sources, computing capabilities, and reporting alternatives, they made wholesale changes to application specifications, leading to delays in development. These delays resulted in a loss of momentum and interest. Each party questioned the wisdom of the time and energy invested in the application. The resulting experience was a frustrating one for both end-users and IS personnel.

2. DEUDS

The objective of a process-oriented information system is to bridge the gap between process control and business information systems and to improve the economic performance of the organization by providing adequate information for optimizing day-to-day processes. Breaking with the traditional approach in 1995, the company invested approximately one million dollars in a sophisticated software product infrastructure to provide end-users direct access to data for process control-oriented decision-making. The IS department and management selected a development environment for end-user developed systems (DEUDS). The system consists of a data warehouse and an end-user application development environment with direct access to the data warehouse. Process data from thousands of strategic process locations within the plant are collected from sensor devices and recorded in a centralized data warehouse facility. In the paper industry, process control data are the foundation of many operating decisions.

With training in the use of the DEUDS from the IS department, end-users were empowered to develop their own decision-support application systems. End-users were able to design their own screens and reports through the built-in software functionality, make on-line calculations of process performance, build applications in spreadsheets with automatic links to data, and integrate information with higher-level business systems. Entire subsystems were written to transform raw data into meaningful information for decision-making, trend monitoring, and process control performance prediction. These subsystems consisted of many individual applications relevant to specific data in specific process areas of the plant. Summary reports and graphics were derived from the system.
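To give a concrete flavor of the kind of application end-users built, the following is a minimal sketch of an on-line process performance calculation against warehoused sensor data. It is illustrative only: the table name process_history, its schema, and the readings are hypothetical stand-ins, not the actual DEUDS product or its interfaces.

```python
# Minimal sketch (hypothetical schema) of an end-user style calculation
# against warehoused process data. An in-memory SQLite table stands in
# for the plant data warehouse / historian.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE process_history (
        ts TEXT,                 -- hourly timestamp
        tons_produced REAL,      -- tons of paper produced during the hour
        machine_running INTEGER  -- 1 if the paper machine was up, else 0
    )
""")
# A few hours of made-up sensor readings.
conn.executemany(
    "INSERT INTO process_history VALUES (?, ?, ?)",
    [
        ("1998-06-01 00:00", 41.2, 1),
        ("1998-06-01 01:00", 39.8, 1),
        ("1998-06-01 02:00", 0.0, 0),   # one hour of downtime
        ("1998-06-01 03:00", 42.5, 1),
    ],
)

# Two key operating parameters: production and machine up time.
total_tons, hours_up, hours_total = conn.execute(
    "SELECT SUM(tons_produced), SUM(machine_running), COUNT(*) FROM process_history"
).fetchone()

print(f"Paper produced: {total_tons:.1f} tons")
print(f"Machine up time: {100.0 * hours_up / hours_total:.1f}%")
```

In the plant, results like these would typically feed a screen, report, or spreadsheet linked to the live data rather than a console printout.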
Within three years of the initial installation in a test plant, at least thirty-three (33) end-user developed systems were detected through contacts with end-user departments. Before expanding the use of the DEUDS to its other plants, corporate top management asked the local plant managers and the local IS department to answer the question, "Has the budget been spent wisely?" One of the authors, a member of the IS department and a student enrolled in the IS graduate program at a local university, was involved in the assessment. He coordinated a process to answer this question by investigating the use of the DEUDS, a process that involved all the key mill managers and other executives in developing language they all found satisfactory for assessing the perceived value of the resulting end-user applications. The focus of this paper is to discuss the results of the case study and the implications for that organization.

Plant management wanted a way to assess whether or not the systems significantly improved processes they felt were critical to their financial success. They were not particularly interested in conventional attitude and user success indicators (Barki 1994), nor were they interested in the acceptance of the technology per se (Davis 1989). Rather, they were concerned that the systems measure up to their own perceptions of system value and to their own definition of their ability to derive meaningful information. In addition, they wanted to know whether the system-derived information assisted them in influencing processes that lead to a more satisfactory bottom line. Specifically, they rejected conventional instruments and instead were satisfied with a homegrown instrument judged equitable by all nine of the principal process managers.

3. METHODS

The IS department investigator coordinated the following plan of action:
1) Determine the existence of all end-user developed systems.
2) Obtain participation at key plant management levels.
3) With plant management, participatively develop an instrument perceived to be effective and fair for assessing system value as self-evaluated by the end-user community.
4) Survey management and the user community.
5) Analyze the data, present results, and draw conclusions.
Each of these steps is discussed in the remainder of this section.

System Existence

Since the IS department had not been involved in development with the DEUDS, it in fact had no idea which applications existed. Therefore, time was devoted to interviewing most of the organizational personnel to determine where systems existed and who the user community was for each system. Thirty-three systems were identified. The systems came from all departments, and each system had from 3 to 5 users.

Participation

When developing work teams, the key players must have "the right mix of knowledge about the business and the authority to make decisions about the design, and they have to communicate well" (Martin 1990a). A comprehensive 1997 Harvard study (Leibs 1997) revealed that maximum efficiency comes from carefully blending clerks, managers, professionals, and information technologists. Participative, cross-functional teams in the paper industry can provide a competitive advantage because they are composed of users who are close to the business processes (Shanahan 1999). To simultaneously achieve full participation and balance, the investigator's strategy was to involve upper management, middle management, functional area supervisors, and operators in different parts of the study. A steering committee comprised of management representatives from all functional areas provided support, input, and guidance throughout the duration of the project. Key-user experts were selected to join management in a participative, cross-functional team that prepared an instrument for assessing the System Value and Key Measure Alignment of end-user developed systems.

Assessment Instrument

The ground rules established by the steering team were that the team would negotiate, under facilitation, the development of an instrument perceived to be fair by all involved. Existing instruments were considered and rejected. While the literature is replete with studies that have established methods for assessing user satisfaction as well as technology acceptance (Doll 1988; Davis 1989; Adams 1992; Barki 1994), the team decided to self-define "System Value" (economic impact, productivity, and satisfaction) and "Key Measure Alignment" (Parker 1988) as measures of "System Effectiveness". "System Value" refers to whether a user perceives a system as helping them do their job or providing them with information, whether or not the system affected productivity, and the relative economic impact.
Regardless of how good a system may be perceived to be by the one who designed it, it has no value if it is not used (Malhotra 1997). "Key Measure Alignment" refers to whether a system is aligned with the organization's business functions. "Key Measures," defined as key operating parameters or process variables, establish the acceptable limits for process variability. Table 1 lists some examples of key measures for functional process areas within the organization.

Examples of Key Measures
Paper produced / day (tons)
Pulp produced / day (tons)
Machine Up Time
Top Grade Yield
Operating Income
Maintenance $ / Ton
Operating Efficiency
Overtime
Fixed Cost / Ton
Variable Cost / Ton
Manufacturing $ / Ton
Ash %
Fiber Length
Dirt
Brightness
Variable Cost / Ton
Fiber Flow & Inventory
Moisture
Basis Weight
Temperature
Surface Size
Smoothness
Freeness
Waste
Table 1. Key Measures

A Delphi-like technique was used to determine a list of assessment factors that represented alignment with business functions. Presenting System Value and Key Measure Alignment together, as illustrated in Figure 1, could identify successful systems from the perspective of both the end-user community and management. The team identified 32 factors they perceived to be relevant. The team then constructed a weighting factor for each item, to be used in producing a weighted sum score derived from the user group associated with each application. The team felt that, although no one person saw more than a few of the applications, the weighted score would be comparable across applications. Table 2 shows the weighted factors in descending order of weight.

Seq | Question | Score | Category
17 | This application has had a positive impact on product quality. | 103 | Economic Impact
9 | Because of this application, usable information is more accurate. | 100 | Productivity
13 | Because of this application, usable information provides quicker response to problems. | 99 | Productivity
16 | This application has had a positive impact on business performance. | 98 | Economic Impact
18 | As a result of this application, costs have been reduced. | 98 | Economic Impact
19 | As a result of this application, costs have been avoided. | 98 | Economic Impact
8 | Because of this application, usable information is more timely. | 97 | Productivity
14 | Because of this application, usable information aids in the solution of problems. | 97 | Productivity
15 | Because of this application, usable information helps me make better decisions. | 97 | Productivity
10 | Because of this application, usable information is more complete. | 96 | Productivity
12 | Because of this application, usable information makes me more productive. | 96 | Productivity
4 | This application has helped me or others make better decisions. | 95 | Productivity
7 | Because of this application, usable information is more accessible. | 93 | Productivity
20 | As a result of this application, revenues have been increased. | 93 | Economic Impact
6 | This application would benefit other operational processes. | 92 | Productivity
23 | I am satisfied with the quality of the data from this application. | 91 | Satisfaction
5 | This application reduced manual effort. | 89 | Productivity
1 | This application has helped me. | 87 | Productivity
34 | This application is documented as saving the company (in dollars). | 87 | Economic Impact
31 | This application has really made a difference. | 86 | Satisfaction
33 | This application has saved in effort hours (range of hours saved). | 86 | Productivity
11 | Because of this application, usable information makes my job easier. | 85 | Productivity
25 | I am satisfied with the system functionality. | 85 | Satisfaction
26 | I am satisfied with the ease of use. | 85 | Satisfaction
28 | I am satisfied with the technical support associated with this application. | 85 | Satisfaction
21 | This application was developed as a result of focus on a key measure. | 83 | Economic Impact
3 | This application has helped many people. | 82 | Productivity
29 | I am satisfied with the training I received for this application. | 82 | Satisfaction
32 | This application could be improved. | 82 | Satisfaction
27 | I am satisfied with the flexibility of the system. | 81 | Satisfaction
24 | I am satisfied with the presentation of the data from this application. | 77 | Satisfaction
30 | Before this application I did not have access to this information. | 73 | Satisfaction
2 | This application has helped a few people. | 70 | Productivity
22 | I am satisfied with the volume of data from this application. | 69 | Satisfaction
Table 2. Prioritization of Value Assessment Questions in Order of Importance
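To illustrate how such a weighted sum score might be produced, the sketch below combines one application's user-group responses with the question weights of Table 2. The five-point response scale, the normalization by total weight, and the response values themselves are assumptions made for illustration; the paper does not specify the exact scoring formula.

```python
# Sketch of a weighted "System Value" score for a single application.
# Weights are question scores taken from Table 2; the 1-5 response scale
# and the normalization are assumptions for illustration.

weights = {17: 103, 9: 100, 13: 99, 16: 98, 18: 98}  # subset of Table 2

# Hypothetical responses from one application's user group, keyed by
# question number (1 = strongly disagree ... 5 = strongly agree).
responses = [
    {17: 5, 9: 4, 13: 4, 16: 5, 18: 3},  # user 1
    {17: 4, 9: 4, 13: 5, 16: 4, 18: 4},  # user 2
    {17: 5, 9: 5, 13: 4, 16: 4, 18: 4},  # user 3
]

def weighted_score(response, weights):
    """Weight each answer by its question's importance, then normalize."""
    total_weight = sum(weights[q] for q in response)
    return sum(weights[q] * answer for q, answer in response.items()) / total_weight

# Averaging the per-user scores keeps applications with different numbers
# of respondents comparable, as the team intended.
app_score = sum(weighted_score(r, weights) for r in responses) / len(responses)
print(f"Weighted System Value score: {app_score:.2f} (on the assumed 1-5 scale)")
```

Each application's score, together with its Key Measure Alignment assessment, positions it in the Value Matrix of Figure 1.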
Survey

A Value Assessment Questionnaire was distributed, via a Visual Basic program, to the end-users in each of the 33 DEUDS system user groups. Responses were collected from 2 to 5 participants for each system. The end-users were asked to complete the questionnaire for each of the systems with which they had familiarity. The questionnaire consisted of the weighted items of Table 2 regarding perceptions of productivity (15 questions), economic impact (6 questions), satisfaction (11 questions), time savings (1 question), documented dollar savings (1 question), and importance to key measure support (2 questions). Upper management prepared an analysis of plant process efficiency based on actual operating costs.

4. RESULTS

Figure 2 shows the overall plant process efficiency from 1991 to 1998. Figure 3 shows that all of the assessed systems developed with the DEUDS have a high degree of perceived "System Value" and that all but one have a high degree of "Key Measure Alignment". Table 3 shows that the estimated annual cost savings are roughly between $2 million and $5 million. Moreover, Figure 2 indicates a significant elevation of process efficiency over the period. Table 4 shows a perceived total time savings, estimated at 400 to 500 hours per week, by those involved in operating decisions; this is the equivalent of 14 full-time key personnel.

Documented Savings in Dollars (based on responses to 33 systems evaluated by the Assessment Questionnaire Survey)
Savings Range (in dollars) | Low Range $ Savings | High Range $ Savings
$0 - $5,000 | $0 | $35,000
$5,000 - $50,000 | $45,000 | $450,000
$50,000 - $100,000 | $550,000 | $1,100,000
$100,000 - $500,000 | $400,000 | $2,000,000
$500,000 - $750,000 | $1,000,000 | $1,500,000
Total | $1,995,000 | $5,085,000
Table 3. Documented Savings in Dollars

Savings in Effort Hours Per Week (based on responses to 33 systems evaluated by the Assessment Questionnaire Survey)
Savings Range (in effort hours per week) | Low Range Effort Hours | High Range Effort Hours
0 - 5 | 0 | 0
5 - 10 | 55 | 110
10 - 15 | 70 | 105
15 - 20 | 45 | 60
20 - 25 | 240 | 300
Total | 410 | 575
Table 4. Savings in Effort Hours Per Week
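The plant-wide figures in Table 3 are simply the sums of the per-range subtotals reported in the survey. The short roll-up below, with the subtotals copied from Table 3 into an assumed list structure, reproduces those totals and the implied average per system.

```python
# Roll-up of the documented dollar savings in Table 3.
# Each entry: (savings range, low-range subtotal, high-range subtotal).
table3 = [
    ("$0 - $5,000",          0,         35_000),
    ("$5,000 - $50,000",     45_000,    450_000),
    ("$50,000 - $100,000",   550_000,   1_100_000),
    ("$100,000 - $500,000",  400_000,   2_000_000),
    ("$500,000 - $750,000",  1_000_000, 1_500_000),
]

low_total = sum(low for _, low, _high in table3)
high_total = sum(high for _, _low, high in table3)

print(f"Documented annual savings: ${low_total:,} to ${high_total:,}")
# -> Documented annual savings: $1,995,000 to $5,085,000
print(f"Implied average per system (33 systems): "
      f"${low_total // 33:,} to ${high_total // 33:,}")
```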
5. DISCUSSION

Tables 3 and 4 indicate a perception of significant savings, in both dollars and effort, from a sampling of DEUDS-developed systems. Although the actual number of DEUDS-developed systems is not known, the investigator estimates that there are at least four times as many as were involved in the study, which lends greater credibility to the minimum estimates.

Management has expressed confidence in the estimates because knowledgeable employees made them based on their individual experiences and because they appear to corroborate the increase in plant operating efficiency during the case study period. Although there is no scientific basis for concluding that the estimated dollar and effort savings caused the increase in plant operating efficiency, management believes the savings estimates are sufficient evidence to justify the initial investment in the DEUDS.

Based on the results of this study, all DEUDS-developed systems were effective according to User Satisfaction, and all but one of the systems was effective according to Key Measure Alignment. In the opinion of the authors, there is insufficient information to conclude that all end-user systems in the study were successful. An alternative interpretation may be that, since only systems developed by end-users of the DEUDS were involved in the study, any system that was not satisfactory to the end-users may have been discarded or modified until the desired level of satisfaction had been reached. This seems to suggest that the detection of unsatisfactory systems could be achieved earlier in end-user development environments. The most significant result seems to be that perceived alignment with Key Measures was reported for all but one of the systems. Key Measures and their relative importance were determined independently of any systems development and alignment. Only 30% of those surveyed were end-user developers, with management and supervisory personnel comprising the remainder.

6. CONCLUSIONS

Taken together, the results of the study are insightful but still pose a dilemma familiar to IS and the business organization: the problem of value assessment. Before approving investments in information technology, most managers need evidence of concrete benefits to offset development costs. Evaluations of the success of existing systems also depend on assessing benefits. User and management assessment of value to the organization is critical in providing continuing direction to IT development. Successful introduction of information technology into an organization may then depend on targeting applications where substantial time and cost benefits can be demonstrated.

During this period, the role of the IS department evolved from development of information systems to support for end-user developers. Instead of gathering specifications and requirements for applications from the end-user community and writing code, IS personnel became responsible for maintaining the technological infrastructure supporting the DEUDS, for ensuring data collection and data integrity, and for supporting and training end-users in their use of the DEUDS.

At the time the DEUDS was installed, the user community was accustomed to frustration and disappointment over system development failures. They were skeptical of any new development that proposed to assist them in their jobs. Over time, the users trained in the use of the DEUDS became convinced of management's support for developing applications that would assist them in their day-to-day activities. As the use of the DEUDS became more sophisticated, systems comprising dozens of individual applications were developed that produced savings in dollars and hours. In addition, this system information is now available over the corporate wide area network (WAN). In the past, access to and communication of data were tightly controlled.
Now, with the WAN and online information systems such as the data historian, information is available in real time to anyone with a PC, the correct address, and a password. The integrity of the system and its data are now critical to the organization.

7. REFERENCES

Adams, D. A., Nelson, R. R., and Todd, P. A. (1992). "Perceived usefulness, ease of use, and usage of information technology: A replication." MIS Quarterly, 16(2), 227-247.

Barki, H., and Hartwick, J. (1994). "Measuring user participation, user involvement, and user attitude." MIS Quarterly, 18(1), 59-79.

Hicks, James O. (1990). Information Systems in Business: An Introduction. West Publishing, St. Paul, MN.

Huff, Sid L. (1992). "Modeling and Measuring End Users' Sophistication." Proceedings of the Annual Computer Personnel Research Conference.

Johnson, J. (1995). "Chaos: The Dollar Drain of IT Project Failures." Application Development Trends, January 1995.

Leibs, Scott, and Carrillo, Karen M. (1997). "Insuring Productivity." Information Week, November 1997, pp. 46-62.

Martin, James (1990a). "Motivation of Players Crucial to Success of RAD Process." PC Week, February 12, 1990, p. 58.

Martin, James (1990b). "JAD Workshops Help Capture Design Specifications." PC Week, February 19, 1990, p. 58.

McNurlin, Barbara C., and Sprague, Ralph H. (1997). Information Systems Management in Practice, 4th Edition. Prentice Hall, Chapter 18, "Creating the New Work Environment."

Neumann, Seev, and Ahituv, Niv (1990). Principles of Information Systems for Management. Wm. C. Brown Publishers.

Parker, Marilyn M., and Benson, Robert J. (1988). Information Economics.

Shanahan, Michael (1999). "Competing Globally: Human Intervention the Key." Pulp & Paper, February 1999.

Figure 1. The Value Matrix: System Value versus Key Measure Alignment, classifying the assessed systems into four quadrants.

[1] bart@cis.usouthal.edu
[2] daigle@cis.usouthal.edu