Applying Principles of Cognitive Psychology to Designing Information Systems Interfaces: What We Can Learn from Students and Practitioners

Barbara J. Levine
Communication Department
Robert Morris University
Pittsburgh, PA 15108, USA
levine@rmu.edu

ABSTRACT

Although there is substantial research showing that user interfaces can be improved by incorporating principles of cognitive psychology, these principles are commonly omitted in the design of information systems interfaces. To explore why, this paper presents the results of a study that examined the design projects of practitioners who were completing an interdisciplinary, graduate-level degree in information systems. The results indicate that the kinds of data students collect and the degree to which they value and understand cognitive principles can help explain the omission. The paper also offers strategies educators can employ to prepare students and professionals to incorporate cognitive principles in real-life information systems interfaces.

Key words: cognitive principles, information systems, curricula, interface design, usability, strategies for educators

1. INTRODUCTION

The goal of user-centered design (UCD) is to create interfaces that provide users with satisfying experiences (Rubin, 1994, p. 10). Academics and usability specialists increasingly recognize that cognitive psychology, the discipline whose goal is “to understand the nature of human intelligence and how it works” (Anderson, 2000, p. 1), and related fields (including human factors, human-computer interaction (HCI), ergonomics, and cognitive neuroscience) can inform our understanding of why something poses problems for users (e.g., Barnum, 2002; Dumas and Redish, 1999; Hackos and Redish, 1998; McCracken and Wolfe, 2004; Norman, 1988; Raskin, 2000; Shneiderman and Plaisant, 2005; Wharton and Lewis, 1994). Hackos and Redish (1998, p. 15) describe the relationship between UCD and cognitive psychology as follows:

It is from work in cognitive psychology over the last several decades that we have come to appreciate that we cannot just impose designs on users. People are active parts of the system…Cognitive psychology shows us that we must accept the users as reality because it is they and not the designers (nor their supervisors) who will in the end determine how the product is used (or not used).

For more than a decade, there has been growing interest in integrating cognitive psychology, HCI, and related areas in IS interfaces. For example, More (1990) maintained that not enough attention had been paid to human factors, which she called “the most critical element in the success of IS” (p. 311). Sparked by Norman’s seminal work The Design of Everyday Things (1988), various designers began to incorporate cognitive psychology into their IS applications. Recent examples include Web design (Badre, 2002), information systems (Siau and Tan, 2005), and complex dynamic systems for driver interfaces (Jansson et al., 2006). Recent textbooks, too, emphasize the importance of applying cognitive theory and research findings to IS interface design. Two such texts are Esgate and Groome (2005), who illustrate numerous applications of theory and research to real-world interfaces, and Shneiderman and Plaisant (2005), who devote eight chapters of their book to cognitive aspects of interface design. The authors of both texts consistently maintain that designers of interactive systems must understand the cognitive abilities of their users.
In real-life settings, however, those charged with finding and fixing problems with human interfaces often lack expertise in cognitive psychology and related disciplines. As a result, Raskin (2000) and others hold that designers commonly omit a “crucial first step” in the design process: “making sure that the interface design accords with universal psychological facts” (Raskin, 2000, p. 4). Possible explanations for why cognitive psychology is excluded from the design process include the following:

* Practitioners may not be aware of how cognitive psychology can aid in making interfaces more usable.
* Practitioners may lack the knowledge and skills needed to collect, evaluate, and apply data anchored in cognitive psychology to interface design.
* Practitioners may not value data rooted in cognitive psychology.
* Practitioners may rely more on industry standards than on academic research to make design decisions.

This study seeks to learn more about why cognitive psychology is commonly omitted in interface design. It further aims to help students and practitioners acquire expertise in cognitive psychology and related disciplines so that they can apply cognitive principles to the design of information systems (IS) interfaces in the workplace. To achieve these goals, this research studied practitioners who were working full time in their fields while pursuing a doctor of science degree in information systems and communication. These professionals’ dual perspective from industry and academia helps address the study’s research questions:

1. What kinds of data do students and practitioners collect to discover problems with user interfaces?
2. To what extent do students and practitioners understand the relationship between cognitive psychology and interface design?
3. How do students and practitioners apply data rooted in cognitive psychology to identify and fix problems with user interfaces?
4. To what extent do students and practitioners value data anchored in cognitive psychology and related disciplines?

2. METHODOLOGY

To address the research questions, two sources of data were analyzed: portions of students’ projects for a graduate-level usability course and reflective memos. Designed to help students develop the knowledge and skills they need to create effective IS interfaces, the projects required students to identify problems with existing interfaces, redesign the interfaces to eliminate those problems, conduct usability tests, recommend to management ways to enhance the interface’s usefulness, and communicate all aspects of their research in a written report. Students also chronicled their reflections on the design process in memos. These data sources are described in greater detail in the results section. Twenty-eight students, who were enrolled in an interdisciplinary graduate program in information systems and communication and were completing the program’s graduate-level course on usability design and testing, participated in this study. Equally divided by gender and spanning a broad age range (26 to 65), the participants worked full time while earning their degrees. These participants were selected because they provided perspectives from both industry and academia. Students who did not wish to participate signed an opt-out form. To assess students’ written reports and memos both for the frequency of occurrence of events (the manifest content) and for meaning inferred from the data (the latent content), this research employed content analytic measures (e.g., Krippendorff, 1980).
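One such measure, used below to report intercoder reliability, is Cohen’s Kappa. As a concrete illustration, the following minimal Python sketch (offered only for exposition; the coder ratings are hypothetical, not data from this study) computes Kappa for two coders who independently assign the same ten excerpts to categories:

    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        # Cohen's Kappa: kappa = (p_o - p_e) / (1 - p_e), where p_o is the
        # observed proportion of agreement and p_e is the agreement expected
        # by chance, given each coder's marginal category proportions.
        n = len(coder_a)
        p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical ratings of ten project excerpts by two independent coders.
    coder_a = ["high", "high", "moderate", "low", "high",
               "moderate", "high", "low", "high", "moderate"]
    coder_b = ["high", "high", "moderate", "low", "high",
               "high", "high", "low", "high", "moderate"]
    print(round(cohens_kappa(coder_a, coder_b), 2))  # prints 0.83

Kappa corrects raw percent agreement for the agreement expected by chance; values near 1 indicate strong agreement, the range into which the .91 and .80 values reported below fall.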
The results section describes both types of analyses: those that required little or no inference by coders and those that required coders’ judgments to classify the data into categories. The analyses followed procedures for conducting content analysis outlined by Robson (2002), Singleton (1993), and others. These include defining the unit of analysis for coding the data, developing categories to classify the data, sampling from the universe or population of relevant data, assigning the data to categories, and assessing the extent to which individuals consistently classify the data into the categories. Gauging the extent to which coders agree on how to classify information is essential to the credibility of the analysis and the interpretation of the results. One measure of agreement among coders in assigning information to categories, Cohen’s Kappa (K), was calculated for the analyses that required coders to interpret information in order to categorize it (Robson, 2002, pp. 240-242). The K values for these analyses are stated in the results section.

3. RESULTS

Question 1: What kinds of data do students and practitioners collect to discover problems with user interfaces?

To address this question and to determine whether participants collected data that could have yielded insights into cognitive processes, I analyzed the information they reported in their projects’ methodology sections in two ways. First, I determined how frequently participants collected each type of data, regardless of which participant collected it. To do so, I recorded each type of data participants listed in their projects’ methodology sections. The results, displayed in Appendix 1, show that the students collected a variety of data. The most frequently collected kinds of data were: verbal responses to interview questions (17), observations of users as they completed tasks associated with using the interface (16), users’ comments as they expressed their thoughts aloud while using the interface (14), and demographic data and information concerning users’ level of expertise from screening questionnaires (13). It is important to note that most participants collected more than one kind of data for their inquiries. For example, one student observed users performing a task, recorded the number of errors, and interviewed users after they completed the task. In this case, all three kinds of data were counted. The following lists the kinds of data that were most likely to reveal information about cognitive principles underlying design issues, with the number of participants who collected each in parentheses: verbal responses to interview questions (17), observations of users as they completed tasks associated with the interface (16), users’ comments as they expressed their thoughts out loud (14), surveys or questionnaires about the task (9), and number of errors committed by users while completing tasks (4). These are marked by an asterisk in Appendix 1. Analysis of these data shows that approximately two thirds (60 of the 89 instances of data collection in the aggregate, or 67%) could have yielded insights into cognitive processes. But because most students gathered more than one type of data, it is not possible to glean from these data how many individual students actually collected information that could lead to cognitive insights. Therefore, I also tracked the number of participants who collected each kind of data.
These data show that all but 1 of the 28 participants collected at least some data that could have yielded information about cognitive processes. Jointly, these two analyses show that most participants collected data that could heighten their understanding of the cognitive processes that affect users’ performance. However, many advocates of user-centered design recommend that researchers use a data collection method that lets them learn more about what users are thinking while they are performing tasks. This method, the think-aloud protocol, can yield essential information about the cognitive processes that underlie users’ actions. Asking users to say out loud what they are thinking may reveal problems that researchers cannot detect by other means (Dumas and Redish, 1999; Hackos and Redish, 1998). In this study, one half of the participants (14 of 28) collected data derived from think-aloud protocols. The other two most frequently collected kinds of data, observations and responses to interview questions, could also uncover underlying reasons why users experience difficulties with an interface, depending on the researchers’ purpose and methods of collecting the data. Because in most cases participants did not provide this information, we can only speculate as to their intentions.

Question 2: To what extent do students and practitioners understand the relationship between cognitive psychology and interface design?

To answer this question, I examined the portion of students’ projects where they analyzed cognitive aspects of design, including the cognitive principles that were incorporated in the original interface design, those that they felt should be included in redesigning the interface, and the ones they incorporated in their own projects. Students were instructed to draw on cognitive psychology and related literature throughout their discussions. The discussions of cognitive processes were classified into three categories: high understanding (participants discussed in depth three or more ideas related to cognitive processes, connected the ideas to UCD, and cited relevant literature), moderate understanding (participants discussed two or more ideas related to cognitive processes, partially connected the ideas to UCD, and cited some literature), or low understanding (participants discussed fewer than three ideas related to cognitive processes, failed to connect the ideas to UCD, and cited little or no literature). Cohen’s Kappa value for this analysis was high (.91). Following are excerpts from participants’ discussions.

High understanding: Donald Norman (1990), in his book The Design of Everyday Things, points out several principles of design relating to cognitive and human factors that should be incorporated into any interface. In this interface, I found that a few of these factors were considered and included. Norman states one of his principles as “Making things visible bridge the gulf of execution and evaluation” (Norman, p. 188). One area where information is made explicitly visible is in the breadcrumb trail. Though the users did not recognize the term used to describe this feature, all instantly recognized what it was and how it was to be used. Writing out the path that had been taken by their clicks explicitly puts the knowledge in their heads onto the physical page, where they do not need to guess about where they have come from within the site. Another area where this is done well is in the top navigation.
When a section is selected, it becomes highlighted clearly in yellow, and all users instantly recognized this indicator that told them which section they were located in. Also, the system is standardized throughout, with items in alphabetical order in all navigation lists and standard functionality throughout the pages. A couple of Norman’s principles were partially utilized, but could be improved. For example, Norman states, “Use both knowledge in the world and knowledge in the head” (p. 188). Several of the navigation items had terms on them that confused the users. This could possibly be because words like “requirements” and “program” are vague and different people interpret them in different ways. This is not making “knowledge in the head” explicit enough in the interface. The terms used need to be improved in order to be as understandable as possible. Also, Norman states, “Get the mappings right” (p. 188). Two examples where the mappings could be improved were discovered…there seems to be a disconnect with the mental model of some of the users.

Moderate understanding: …I believe that few of the cognitive factors were incorporated. The seven cognitive factors are as follows…[Participant listed the seven]…However, with a close look I can see that this interface does simplify the structure of the task and does display proper mapping. The structure of the task is simplified in the original interface, as the questions asked are based on information stored in short-term memory. The lack of drop-down boxes and links makes it simple and creates very few mapping issues. There is proper mapping throughout the interface with little room to get lost, which allows users to stay on track and complete the form in a timely manner. Lastly, the original interface is very standard and supports factor seven: when all else fails, standardize.

Low understanding: Steve Krug suggests our goal should be to make everything self-evident, and if not, at least self-explanatory (p. 18). All the comments and user suggestions pointed back to the same goal.

The results indicated that the majority of participants (71%, 20 of 28) demonstrated a high understanding of cognitive processes, how they relate to interface design, and how they can ameliorate design problems. The remaining 8 participants were equally divided between 4 who demonstrated a moderate understanding (14%) and 4 who had a low understanding (14%) of cognitive processes and their application to interface design.

Question 3: How do students and practitioners apply cognitive principles to finding and fixing problems with user interfaces?

To assess how participants actually applied principles from cognitive psychology and related disciplines to their interfaces, I analyzed the portion of students’ projects where they discussed how they redesigned their interfaces after conducting a usability study. Each participant’s response was categorized as high application (participants applied several cognitive principles and discussed in depth the reasons for their decisions), moderate application (participants applied a few cognitive principles and cited some reasons for their decisions), or low application (participants applied two or fewer cognitive principles and failed to provide reasons for their decisions). Cohen’s Kappa value for this analysis was high (.80). The following excerpts illustrate these categories.
High application: …in addition to embracing the principles contained in the original web design, the redesign incrementally focused attention on “making things visible” and “getting the mappings right” (Norman, 2002, p. 189). Also, some content revisions were made to the page to make it more meaningful for the intended users. This included changes to existing wording and the addition of new content categories to respond to user information needs. Changes in task descriptions were made to help “bridge the gulfs of execution and evaluation” (p. 189), while changes to site navigation were made to more closely link user expectations, content and roles. These changes made information more visible and accessible for the intended user group. The changes…were necessary to reflect the desired content from the various user types in a manner that was easy to use and to understand…In addition to the cognitive design principles…research findings from…were also considered…these research findings were consistent with the user feedback…other research…argues the importance of organizing web sites by…each of these design concepts was also incorporated into the prototype…

Moderate application: In creating the interface, I incorporated knowledge of the world and knowledge in the head. As Norman notes, people “feel more comfortable when the knowledge required for a task is available externally” (p. 189). I also simplified the structure of the tasks that were asked of the participants…People like simple things that don’t cause them to become frustrated and annoyed…It is also important to make things visible: bridge the gulfs of execution and evaluation. It’s important that people know what is possible and what action needs to be taken. It is also important that people easily recognize what effect their actions had. “Make the outcome of an action obvious” (p. 189).

Low application: …A prototype was created after completing the observations and compiling and reviewing the users’ comments and suggestions. Our goal was to design a product that a novice or first-time user would have no problem navigating. The users all wanted to be able to navigate easily…and expected to be able to find some sort of instructional information in the product.

The results indicated that approximately one half of the participants (15 of the 28, or 54%) were classified in the high application category, and nearly one third (8 of the 28, or 29%) were classified in the moderate application category. Of the remaining 5, who were categorized as low application, 1 participant applied 2 cognitive principles but did not discuss the reasons for this decision. The other 4 (14% of the 28 participants) failed to apply any of the principles to their interfaces and did not provide reasons for omitting them.

Question 4: To what extent do students and practitioners value data anchored in cognitive psychology and related disciplines?

To assess the value or importance participants attributed to data rooted in cognitive psychology, I analyzed information from two sources: their recommendations to management for implementing ways to improve the interface and their reflections on designing user interfaces. Although “value” could not be assessed directly in this study, it can be inferred from these data. The rationale is as follows: if participants advocated enhancements based wholly or partially on cognitive processes, it seems likely that they attach some value or importance to them.
However, we cannot determine whether the converse is true: if they did not ground their recommendations in cognitive principles, we cannot conclude that they do not value this information. In fact, the recommendations may tell us more about what participants think their managers value than about their own perspectives. Similarly, the data from students’ reflections on designing user interfaces may present problems of interpretation. The data were culled from written responses contained in short, separate memos where students addressed three questions: What did you learn from doing this research? What would you have done differently? How will this research change the way you think about the design process? Because comments about cognitive processes were not solicited in the instructions, the extent to which students discussed them may indicate the importance or value they attribute to them. I emphasize “may” because when students mention cognitive psychology, there is naturally a confound between whether they value it or simply perceive that their teacher values it. However, when they fail to mention it, after participating in a class where they are taught to value it, this confound disappears. The inference is more direct that students do not assign it importance. Approaching this research question from more than one source of information is a way of triangulating the data. Doing so can reduce threats to validity, including researcher bias, and enhance the credibility of the interpretation of the findings (Robson, 2002). Although the data for the analyses of the recommendations and memos were created by the same participants, it can be argued that they provide two views on the research question.

To analyze the data from participants’ recommendations, I tracked the number of participants who based their decision to improve the interface on at least one cognitive insight. The results showed that approximately two thirds of the participants (64%, 18 of the 28) included cognitive processes as part of their justification for altering the interface; about one third (36%, 10 of the 28) did not justify their decisions based on cognitive psychology or related fields. In addition, I wanted to know how the 18 participants who included cognitive processes used them to advocate change. To do so, I evaluated whether participants merely listed ideas pertaining to cognitive psychology and related areas (a list) or explicated their points by connecting cognitive processes to design issues (an elaborated argument). An example of a list is:

1. On the first page of the site, the school colors should be used for a backdrop and a photo of happy students included.
2. Remove information that is not necessary and incorporate relevant information that students and faculty could easily recognize in their search.
3. On the business division page, as with all relevant pages, include all classes offered and a description of the class.
4. Include all teachers associated with the college. Instructor contact info should be made available to anyone searching for it.
5. Utilize empty space with valuable information and reduce the students’ photos throughout the site.

An excerpt from an elaborated argument is:

…The first recommendation is to update the form by removing any sections that are irrelevant to technology that currently exists in the organization…By leaving these sections in the form, we may confuse the end user and make it more difficult to complete the form.
Additionally, this can lead to inaccurate information being delivered to an auditor during an FDA audit. This change also adopts Norman’s principles by simplifying the structure of tasks, making things visible, and ensuring that the mappings are right. Lastly, to support this recommendation, the users at the site visit each commented on this issue. This change was noted and well accepted during the usability test…

The results indicated that more than half (56%, 10 of the 18 participants) simply listed cognitive factors without explicating them. Of the 18 who included cognitive processes in their recommendations to managers, less than half (44%, 8 of the 18) connected them to problems with the interface and to ways to improve its usability. It should be noted that when we consider the entire sample of 28 participants, only 8 (less than one third, or 29%) directly advocated change based on an in-depth discussion of cognitive processes. The data from the reflective memos showed that about one third of the participants (36%, 10 of the 28) discussed the value of understanding or incorporating cognitive psychology in user interfaces. The following illustrate students’ reflections on interface design:

Excerpt 1: Although many of the design and usability concepts were familiar to me…it was worthwhile to gain an understanding of the research and academic underpinnings of how these design approaches and concepts were developed. Research data gathered from others involved with web site design, along with the cognitive principles, provided a rational basis for redesign that could be easily communicated to the management…

Excerpt 2: This research project was very interesting and was more detailed than I originally expected…Since my work often deals with analyzing requirements for systems, this course has allowed me to add more “tools” to my professional “toolkit.” From this experience, I have broadened my scope of analyses and can now apply Norman’s principles as they relate to almost anything, not just systems.

Excerpt 3: I have always thought that design was very important…we try to make the students realize that no one will use their programs if they are not intuitive, easy to use, and efficient. …He [Norman] really opened my eyes to just how off everything in the world is. Then again, humans have been forced to adjust for…well…since the beginning of time. Perhaps this is why usability is so often overlooked! We just adapt to the interface instead of the interface adapting to us.

In light of the findings that 71% of participants demonstrated a high understanding of cognitive processes and that more than half applied them to their interfaces, the number of participants who discussed cognitive processes seemed low. To better understand their views, I analyzed these data from a broader perspective: I asked how many students discussed the value or importance of studying “users,” irrespective of whether they mentioned cognitive processes. Presumably, those who reflected on this sought a deeper understanding of their users, which may include cognitive psychology. These data were more in line with the prior analyses: fully two thirds (68%, 19 of the 28 participants) discussed the value of learning more about their users.
The following are excerpted from students’ reflective memos:

Excerpt 1: I have learned to include as many people as I possibly can in the usability and design…Conducting this research has opened my mind concerning…objectives and goals, to include anyone who wants to render input…My new motto is “people always.”

Excerpt 2: I have designed countless web sites and applications without performing a user/task analysis or a usability test. This will definitely be an important consideration in any future project that I undertake.

Excerpt 3: To design a functional process, you must listen to a variety of users to satisfy their requirements, where feasible, for making a usable process.

4. DISCUSSION OF FINDINGS

One aim of this research was to explore why cognitive psychology is commonly omitted in interface design. The results indicate that such omission is less common than expected: approximately one half of the participants applied the cognitive psychology data they collected to a high degree, and nearly another third (29%) applied this information to a moderate degree. Nonetheless, the results indicate that such omissions can be profound. Five of the 28 participants (18%) applied cognitive principles only to a low extent, and 4 of these 5 failed to incorporate any cognitive principles in their designs. If the group that applied cognitive psychology moderately is pooled with this last group, one finds that nearly one half (47%) of the participants did not apply cognitive psychology and related disciplines to their interfaces to a high degree. What makes this 47% disturbingly high is that students were designing their interfaces for an academic course in which they were acquiring the knowledge and skills needed to design effective interfaces and were encouraged to incorporate them in their own projects. In light of this 47%, one can conclude that this research does find evidence of omitting, or making less than optimal use of, cognitive psychology. The remaining research questions may help us understand why this occurred.

The findings for the first research question, which ascertained the kinds of data participants collected, may help us understand whether they gathered information that could have led to insight into how cognitive processes affect users’ performance. In the best case, these data showed that one half of the participants collected data that directly related to cognitive processes, and almost all collected some data that could yield this information. However, it can also be argued that only the one half who collected data derived from the think-aloud method gathered information that truly led to cognitive insights about user difficulties with their interfaces. This interpretation may partially explain why many of the students omitted cognitive psychology.

The findings for the extent to which participants understood principles of cognitive psychology and related disciplines may also help explain the omission. While the results are mostly encouraging (71% demonstrated a high understanding of cognitive processes), a sizeable portion (28%) did not. And the 14% of the total sample that demonstrated little understanding of these disciplines and their relationship to interface design may indicate that students who did not fully understand this material were unable to apply it.
Finally, the results of the fourth research question shed light on the value or importance participants attribute to data anchored in cognitive psychology and related fields, and they may help us understand further why so many students omitted findings from cognitive psychology from their interfaces. Although the two analyses suggest that most participants value data that increase their understanding of their users, the data cannot tell us whether this encompasses broadening their knowledge of cognitive processes and of how this knowledge relates to finding and fixing problems with an interface. The data do show that as few as one third of the sample reflected directly on the value of cognitive processes for interface design, and about the same proportion did not include cognitive processes in their recommendations. Recall that interpreting these data is confounded when students mention cognitive psychology in their reflections. However, this confound disappears for students whose reflections make no mention of cognitive processes, or of the user’s situation, at all. In such cases, it is safer to infer that there is a problem with the way, or the extent to which, this population of students values this information.

Limitations

The following should be considered when interpreting the findings. First, because the data (students’ projects) were not created in order to answer the study’s research questions, this study shares a weakness with others that employ content analytic methods: the data may not adequately address the research aims (Singleton, 1993). For this reason, some advocate supplementing content analysis with additional research methods and measures (e.g., Robson, 2002; Singleton, 1993).

Sampling is also a limitation. Because the participants were students completing the same degree program at one university, they are not representative of the broader population of students. Hence, the results may not generalize to other students or to the way professionals carry out projects in the workplace. However, it is important to note that the participants were solicited precisely for their dual perspectives as both students and practitioners. A related concern is the potential for researcher bias in interpreting the results, stemming from the researcher’s role as instructor of the usability course and familiarity with the students in the program. An awareness of possible biases and the relatively high reliability values reported in the results section may mitigate this concern.

5. CONCLUSIONS

This study’s underlying purpose was to address the need to better prepare students and practitioners to design effective IS interfaces. As a first step, this research sought to learn more about the extent to which students actually understand cognitive psychology and related disciplines, value data rooted in these areas, and apply cognitive principles to the design of IS interfaces. The findings point to a gap between what students are learning in the classroom and how they apply it. To strengthen the link between what we teach and real-life applications, the findings support the following strategies.

First, educators should explain the value of applying data rooted in cognitive psychology to IS interfaces and uncover the reasons some students do not attribute importance to this information. Uncovering their reasons is central to addressing the possible gap between knowledge and practice. Exploring further why some students do not value data anchored in cognitive psychology is also a question for future research.
Second, educators should demonstrate how incorporating cognitive psychology can help designers identify and remedy problems with real-life IS interfaces. Illustrating the connection between cognitive processes and effective design reinforces the value of applying cognitive principles to user interfaces. In my own teaching, students frequently report an “aha” experience when they connect cognitive psychology with improvements in design.

A third strategy is to include testimony from professionals in the field that attests to the value of applying cognitive psychology to IS interfaces. Testimonials that equate “value” with achieving organizational goals, and in some cases with increasing the bottom line, help link educators’ concern for enhancing students’ understanding of and ability to apply cognitive principles with industry’s concern for producing effective IS interfaces.

Future research could determine how these strategies, separately and in combination, can help prepare students to incorporate cognitive information in IS interfaces. On a final note, tracking students’ progress as they learn about cognitive principles in college classrooms and move into the workplace may yield valuable insights for curricular design.

6. REFERENCES

Anderson, John R. (2000) Cognitive Psychology and Its Implications (5th ed.). Worth, New York.
Badre, Albert N. (2002) Shaping Web Usability. Pearson Education, Inc., Boston, MA.
Barnum, Carol M. (2002) Usability Testing and Research. Longman, New York.
Dumas, Joseph S. and Janice C. Redish (1999) A Practical Guide to Usability Testing. Intellect, Portland, OR.
Esgate, Anthony and David Groome (2005) An Introduction to Applied Cognitive Psychology. Psychology Press, New York.
Hackos, JoAnn T. and Janice C. Redish (1998) User and Task Analysis for Interface Design. Wiley, New York.
Jansson, Anders; Eva Olsson; and Mikael Erlandsson (2006) “Bridging the Gap Between Analysis and Design: Improving Existing Driver Interfaces with Tools from the Framework of Cognitive Work Analysis.” Cognition, Technology & Work, v 8, pp. 41-49.
Krippendorff, Klaus (1980) Content Analysis: An Introduction to Its Methodology. Sage, Newbury Park, CA.
McCracken, Daniel D. and Rosalee J. Wolfe (2004) User-Centered Website Development. Pearson, Upper Saddle River, NJ.
More, Elizabeth (1990) “Information Systems: People Issues.” Journal of Information Science, v 16, iss 5, pp. 311-320.
Norman, Donald A. (1988) The Design of Everyday Things. Doubleday, New York.
Raskin, Jef (2000) The Humane Interface. Addison-Wesley, Reading, MA.
Robson, Colin (2002) Real World Research (2nd ed.). Blackwell Publishers, Inc., Malden, MA.
Rubin, Jeffrey (1994) Handbook of Usability Testing. John Wiley and Sons, New York.
Shneiderman, Ben and Catherine Plaisant (2005) Designing the User Interface (4th ed.). Addison-Wesley, Boston.
Siau, Keng and Xin Tan (2005) “Technical Communication in Information Systems Development.” IEEE Transactions on Professional Communication, v 48, iss 3, p. 269.
Singleton, Royce A.; Bruce C. Straits; and Margaret M. Straits (1993) Approaches to Social Research. Oxford University Press, Inc., New York.
Wharton, Cathleen and Clayton Lewis (1994) “The Role of Psychological Theory in Usability Inspection Methods.” In Jakob Nielsen and Robert L. Mack (Eds.), Usability Inspection Methods (pp. 341-350). John Wiley and Sons, New York.

APPENDIX 1. KINDS OF DATA

Kinds of Data                                    Frequency
1. Verbal responses to interview questions*      17 (19%)
2. Observations of users*                        16 (18%)
3. Users’ comments during think alouds*          14 (16%)
4. Demographic/screening information             13 (15%)
5. Survey responses to the task*                  9 (10%)
6. Time to complete the task                      7 (8%)
7. Number of errors in completing the task*       4 (5%)
8. Stories/narratives about the task              2 (2%)
9. Task scenarios                                 2 (2%)
10. Users’ comments (not from think alouds)       2 (2%)
11. Information requirements                      1 (1%)
12. Number of clicks                              1 (1%)
13. Artifacts                                     1 (1%)
Total                                            89

* denotes data most likely to reveal information about cognitive principles
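As a check on the aggregate figures, the following small Python sketch reproduces the Appendix 1 total of 89 and the roughly two-thirds share of cognitively relevant data reported under Question 1. The category labels are abbreviated paraphrases of the rows above, introduced here only for illustration:

    # Aggregate counts from Appendix 1; True marks the starred kinds of data,
    # those judged most likely to reveal information about cognitive principles.
    kinds = {
        "interview responses":        (17, True),
        "observations of users":      (16, True),
        "think-aloud comments":       (14, True),
        "demographic/screening info": (13, False),
        "task surveys":               (9,  True),
        "time on task":               (7,  False),
        "error counts":               (4,  True),
        "stories/narratives":         (2,  False),
        "task scenarios":             (2,  False),
        "other user comments":        (2,  False),
        "information requirements":   (1,  False),
        "click counts":               (1,  False),
        "artifacts":                  (1,  False),
    }
    total = sum(n for n, _ in kinds.values())                        # 89
    cognitive = sum(n for n, starred in kinds.values() if starred)   # 60
    print(f"{cognitive}/{total} = {cognitive / total:.0%}")          # 60/89 = 67%

Under these assumptions the script prints 60/89 = 67%, matching the two-thirds figure reported in the results section.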