Multimedia Learning Environments: Exploring Student and Faculty Perceptions of Streaming Video

Jennifer Nicholson
nicholsonj@rowan.edu

Darren Nicholson
nicholson@rowan.edu

Marketing and Business Information Systems Department
Rowan University
Glassboro, NJ 08028, United States

Abstract

This paper reviews the use of multimedia, specifically streaming video, as a teaching and learning vehicle for procedural knowledge. While prior, experimental research in this area has illustrated how various technology attributes can be employed to positively affect learning outcomes, we seek to provide the perceptions of both students and instructors when a multimedia environment is used to supplement or replace the teaching of procedural knowledge in the classroom. We found that creating tutorials using streaming video provided benefits to students in the form of greater satisfaction with the learning process, a greater understanding of the material, and a reduction in the amount of effort required to complete a homework assignment. Furthermore, from an instructor perspective, we found that while considerable up-front time investments are needed, we experienced a marked reduction in visits from students who required additional exposure to previously covered material, a decrease in prep time during subsequent semesters, and seamless portability to online learning contexts. Lastly, we explored student perceptions of perceived difficulty, perceived learning, and overall satisfaction with the multimedia environment and, contrary to expectations, subjects who perceived the task to be of greater difficulty also perceived that they learned more and that the learning environment was more satisfying.

Keywords: multimedia, instruction, education, information systems, streaming video, computer-mediated

1. INTRODUCTION

In recent years, there has been a dramatic increase in the utilization of technology as an educational tool. From PowerPoint presentations to podcasting, instructors are experimenting with unique and creative ways to convey information to the learner. With increased access to multimedia capabilities, many course developers are under the impression that adding multimedia elements to instructional material will inherently make the dissemination of information more effective. It is important to keep in mind, however, that “the objective of using technology in learning should be to positively influence learning in one way or another; that is, the student should either learn something that he/she would not have learned without the technology or learn it in a more efficient manner” (Alavi and Leidner, 2001, p. 4). This view is shared by many scholars (Alavi and Leidner, 2001; Goodhue, Klein, and March, 2000; Large, 1996; Steuer, 1992), indicating a need to investigate how delivery technologies can be utilized to maximize the learning experience. Thus, research that examines the effects of multimedia on learning outcomes has been ongoing.

Multimedia research views technology as the collection of tools used to deliver information to an individual (Piccoli, Ahmad, and Ives, 2001). Examples of delivery technologies in this context include text, hypertext, graphics, streaming audio and video, computer animations and simulations, embedded tests, and dynamic content (Piccoli, Ahmad, and Ives, 2001). Prior research has been conducted to examine how various factors of a multimedia environment affect learning outcomes and to offer guidance in the development of effective multimedia systems.
Results from this research suggest that some degree of learner control can lead to greater intrinsic interest in an activity and satisfaction with the learning experience, which ultimately leads to improved academic performance (Kinzie, Sullivan, and Berdel, 1988; Lepper, 1985; Merrill, 1983, 1994; Williams, 1996). Additionally, learner control can avoid overloading the learner’s working memory (Rieber, 1994), as learners can move through the information at a rate and in a sequence that is comfortable for them. Learner control also allows for repeatability, and the more information is repeated, the better and longer it is remembered (Alessi and Trollip, 2001).

Similar research has also shown that interactive multimedia environments (i.e., those offering control over features of the presentation) positively influence user attitudes (e.g., Haseman et al., 2002; Kettanurak et al., 2001) and that information complexity interacts with the multimedia environment to influence learning outcomes (Andres, 2004). Furthermore, research has shown that a more vivid (e.g., sensorially rich, such as animation and narration) and more interactive environment results in an increase in satisfaction and interest, and that task complexity interacts with vividness and interactivity to affect performance and perceived mental effort (Nicholson, Nicholson, and Valacich, 2008). Additionally, Mayer and colleagues have published numerous studies on using multimedia to help students understand scientific explanations (e.g., Mayer and Anderson, 1991; Mayer and Anderson, 1992; Mayer, Heiser, and Lonn, 2001; Mayer and Moreno, 1998), which resulted in the development of ten principles for the effective use of multimedia (Mayer, 2008). Some of these principles include: the multimedia principle – deeper learning occurs when animation and narration are used together rather than narration alone; the modality principle – deeper learning occurs when animation and narration are used together rather than animation and on-screen text; and the personalization principle – deeper learning occurs when narration and on-screen text are conversational rather than formal.

Based on this prior research, we are learning a great deal about when it may be most beneficial to add a multimedia element to instructional material, as well as what technology attributes to incorporate into a multimedia presentation to positively affect the learner’s experience and performance. We believe, however, that in addition to looking at perceptions of multimedia from the learner’s perspective, it is also important to examine the effects of multimedia from the instructor’s perspective. After all, multimedia production is an expensive, time-consuming endeavor not to be taken lightly (Lim and Benbasat, 2002); hence, it is important for instructors to understand the pros and cons of implementing a multimedia component in their courses.

The purpose of this paper is to provide student and faculty perceptions of using a multimedia component in a course to convey procedural knowledge. We solicited feedback from students regarding their experience with the multimedia delivery method via an open-ended questionnaire and also collected rating data regarding their perceptions of the task and the multimedia environment. First, we provide a description of the task and the multimedia dimensions employed. Second, we present results from the open-ended questionnaire, including comments from students, as well as observations from a faculty perspective. Finally, we present initial, exploratory findings based on the rating data collected from the students, along with the implications of these findings.
2. THE TASK AND THE MULTIMEDIA ENVIRONMENT

As part of the curriculum in an introduction to management information systems (MIS) course, which is a required course for all business majors in their junior or senior year, faculty members are required to teach a specific set of skills using Excel. This type of knowledge, which can be classified as procedural, is best conveyed through a demonstration (Gagne, 1985; ten Berge and van Hezewijk, 1999) or a richer representation (Park, 1994). Thus, these skills have typically been taught in a computer lab setting where the instructor would walk students through a hands-on tutorial. Students would then be given a homework assignment to complete on their own time to illustrate their proficiency in these skills.

We often felt that this pedagogical approach for imparting these skills was not serving the needs of all students. Some students could barely keep up and seemed overwhelmed by the task, while others seemed bored and disinterested because they already knew much of the material. Because students’ knowledge of these subjects varied, we decided the students would best be served if they were given more control over the learning process. Our answer to providing learner control, given the aforementioned studies illustrating the benefits of using an interactive and vivid multimedia environment, was to create a tutorial using streaming video to supplant the lab sessions. Rather than purchase an off-the-shelf tutorial product, we chose to create our own video tutorials so that they would be tailored to meet the objectives of the rubric associated with this portion of the course.

We used Camtasia Studio to create streaming videos for the Excel tutorials. The streaming videos used animation and narration to demonstrate each step of the tutorial, and each step corresponded to a specific skill set or tool. For example, showing students how to use the Solver tool was one of the steps in the Excel tutorial. Each step of the tutorial was listed in a table-of-contents fashion, giving students complete control over which steps they viewed and the order in which they viewed them. Furthermore, students had the ability to pause, skip, fast forward, and rewind the tutorials. Students were also provided the Excel file used in the tutorial, which allowed them to practice the skills covered in the tutorial. Students were given access to the tutorials approximately three weeks prior to the homework due date.

This approach was also integrated into a Business Web Applications (BWA) course, a required course for Management Information Systems students in their senior year, where students were required to build an ASP.Net application. In this course, streaming video was used as a supplement to reinforce, and provide context for, coding examples that were given during the in-class lecture. In other words, students were provided streaming video illustrating the process of incorporating the code/concepts into a real-world application using Visual Studio.Net.
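To make the learner-control features concrete, the listing below shows how a clickable table of contents and skip/rewind controls might be wired to a streaming video in a modern browser. This is a hypothetical TypeScript sketch, not the player that Camtasia Studio actually generates; the chapter titles, times, and element ids are invented for illustration.

    // Minimal sketch of learner-controlled playback. Assumes an HTML page
    // containing <video id="tutorial" src="excel-tutorial.mp4" controls>
    // and an empty <ol id="toc"> element; all names here are hypothetical.
    interface Chapter {
      title: string;
      startSeconds: number;
    }

    const chapters: Chapter[] = [
      { title: "Formatting a worksheet", startSeconds: 0 },
      { title: "Writing formulas", startSeconds: 180 },
      { title: "Using the Solver tool", startSeconds: 420 },
    ];

    const video = document.getElementById("tutorial") as HTMLVideoElement;
    const toc = document.getElementById("toc") as HTMLOListElement;

    // Build the table of contents: clicking an entry jumps straight to that
    // step, so students can view steps in any order and skip familiar material.
    for (const chapter of chapters) {
      const item = document.createElement("li");
      item.textContent = chapter.title;
      item.onclick = () => {
        video.currentTime = chapter.startSeconds;
        void video.play();
      };
      toc.appendChild(item);
    }

    // Rewind (negative) or fast forward (positive) by a given number of
    // seconds; pausing is provided by the player's built-in controls.
    function skip(seconds: number): void {
      video.currentTime = Math.max(0, video.currentTime + seconds);
    }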
Data Collection

Data for this study was collected in two phases. In the first phase, students in the MIS and BWA courses were given an open-ended questionnaire to fill out at the end of the fall semester regarding their experience with the multimedia environment. Essentially, this questionnaire asked students whether the multimedia learning environment, when compared to a traditional, face-to-face, lab-type setting, increased their satisfaction, understanding, and interest in the material, and whether it decreased the amount of effort it took to complete their assignments. Students were asked to answer either yes or no to each of the questions and to provide reasoning for their answers. A total of 28 questionnaires were returned in this first phase of data collection.

The second phase of data collection occurred at the end of the spring semester. The same open-ended questionnaire administered in the first phase was administered to additional sections of MIS and BWA students (note that no subjects from the first phase of data collection participated in the second phase) who experienced the same multimedia course component as students in the first phase. In this phase, however, we also solicited feedback using Likert-type scales to measure pre- and post-knowledge, perceived difficulty, and satisfaction. Specifically, these questionnaires were used to measure the amount of prior experience students had with the software application, the amount of knowledge they felt they had regarding the software application after viewing the tutorials and completing the assignments, their level of satisfaction with the learning environment, and the amount of difficulty they had in completing the assignments. Satisfaction was assessed with an instrument adapted from Doll and Torkzadeh (1988) and Rai et al. (2002). A scale developed to measure perceived mental effort (Nicholson, 2006) was used to assess the students’ perception of difficulty in completing the assignments. A total of 81 questionnaires were returned in the second phase of data collection. Of the 109 subjects who participated in this study, demographic data revealed that the average age was 21 and that 67% of subjects were male.
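For concreteness, the scoring behind these measures is simple arithmetic: a respondent's score on a construct is the mean of their ratings on that scale's seven-point items, and perceived learning is the post-knowledge score minus the pre-knowledge score. The sketch below, again in TypeScript, illustrates the computation; the item counts and ratings shown are hypothetical.

    // Sketch of scale scoring for the Likert-type instruments. Assumes
    // ratings on a seven-point scale (1-7); item values are hypothetical.
    function scaleScore(itemRatings: number[]): number {
      const sum = itemRatings.reduce((total, rating) => total + rating, 0);
      return sum / itemRatings.length;
    }

    // Perceived learning is operationalized as the difference between
    // self-reported post- and pre-knowledge scores.
    function perceivedLearning(preItems: number[], postItems: number[]): number {
      return scaleScore(postItems) - scaleScore(preItems);
    }

    // Hypothetical respondent with little prior knowledge of the software:
    const pre = [2, 1, 2];   // pre-knowledge ratings, anchored none (1) to extensive (7)
    const post = [6, 5, 6];  // post-knowledge ratings after the tutorials
    console.log(perceivedLearning(pre, post)); // 4.0 points on the seven-point scale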
Next, we discuss student perceptions, drawn from the data collected via the open-ended questionnaires, as well as instructor perceptions of the multimedia learning experience.

3. STUDENT AND FACULTY PERCEPTIONS

Student Perceptions

As illustrated in Table 1 (see Appendix), students responded positively to the multimedia component used in both the MIS and BWA courses. Specifically, students reported an increase in satisfaction and understanding, as well as a decrease in the effort it took to complete the assigned homework, when comparing the multimedia environment to a traditional, face-to-face, lab-type setting. The same cannot be said for interest; however, we present student comments below that provide insight into this response.

Satisfaction and Understanding

Overall, students reported that they were more satisfied with this type of learning environment and that it helped them understand the material better than a traditional, face-to-face, lab-type setting. Students attributed their increase in satisfaction and understanding to the ability to go through the material at their own pace; to pause, rewind, and fast forward as needed; and to skip over material they were already familiar with and focus only on material that was unfamiliar. It is interesting to note from Table 1 that all of the BWA students reported an increase in understanding. For the most part, those in the MIS course who responded that the multimedia environment did not increase their level of understanding (n=15) stated that they were already familiar with Excel prior to the assignment. Based on this feedback, we believe that for students with little prior knowledge in a given subject area, the use of a multimedia environment may meaningfully increase their level of understanding of the material. Below are some of the comments made by students pertaining to their increase in satisfaction and understanding.

“I thought the videos were amazing. It added tremendous value to the lectures. It was the equivalent of having [the professor] over your shoulder for hours. I can’t express how helpful they were and how much more I learned.”

“I am one of those students that get it in class, but forget when I step out of the classroom and a tool such as this one helps refresh my memory.”

“There was no confusion about missed directions…”

“…the multimedia allowed for hands on learning at the users pace.”

“It allowed me to revisit sections I was unsure about, so it gave me the opportunity to gain a deeper level of understanding.”

Interest

Almost two-thirds of the students replied that the multimedia environment increased their interest in the material, with the highest percentage occurring in the BWA course. Some of the reasons given for the increased level of interest included not having to sit through material that they were already familiar with and that it was a new and novel approach that they hadn’t seen used in the classroom. Most of the respondents who did not report an increase in interest actually stated that their interest in the material did not decrease but was about the same as with any other delivery method. Below are some of the comments made by students pertaining to their level of interest in the material.

“I feel that if it was taught in lecture format I may find myself bored with it if the professor was going over something I already knew how to do.”

“the material itself was interesting but the videos made it more interesting because every session was available for review and quick reference which made learning more engaging and more memory permanent.”

“It didn’t make it more or less interesting, just more convenient.”

“Instead of getting frustrated by only seeing how something was done once, it made the material more interesting because it was available whenever I wanted and I could watch it as many times as needed.”

Effort

When it came to effort, a little more than eighty-five percent of the students believed that the multimedia environment decreased the level of effort it took to complete the homework assignment. Some students did remark that it took a lot of effort, in the way of time, to go through the tutorials, but that this effort subsequently made it easier for them to complete the homework assignment. Below are some of the comments made by students pertaining to the level of effort required to complete the homework assignment.

“It decreased the level of effort for me to complete the homework assignment because I listened to the multimedia delivery and did the practice during that. Then right after I did the homework and if I had any problems I could just go back to the video and play again the instructions.”
“…decreased the level of effort it took to complete the homework assignment. I could skip over the videos of what I knew, and I could repeat the videos of what I didn’t know to complete the homework assignment.”

“If you took the time to learn from the tutorial, then you gained the knowledge to make the assignment easier.”

“The multimedia delivery method both increased and decreased the level of effort to complete the assignment. Some of the videos were long; finding the time to watch the video and understand what was being said slightly increased the effort. However, after watching the videos, it was easier to complete the assignment, thus decreasing the effort.”

“The media accelerated the time it took to complete the assignments because of the breeze of quick references on unsure material and the ability to skip through areas already mastered.”

Multimedia Environment

When asked what features of the multimedia delivery method were most beneficial, the unanimous response was the ability to move through the tutorial at one’s own pace, pausing, rewinding, and fast forwarding when necessary. Students also liked the step-by-step instructions, which allowed them to skip the steps they were familiar with, and liked the fact that they could watch the videos anywhere and anytime they wished. Another feature pointed out by some students was the ability to keep the tutorial as a future reference for other projects and classes. Following are some general comments students made about the multimedia environment:

“It is better than an in class demonstration because it can be paused and re-watched as many times as the viewer wants.”

“I think these videos were some of, it not thee, best educational tool used in any of my classes”

“I hope more classes adopt this method in the years to come”

“I truly enjoyed this method of learning, and enjoyment played a big part in learning the material”

“I was not intimidated by what was being taught because I had immediate access to the video with the ability to replay until I understood”

“Liked this method so much and I am coordinating my employer to test and purchase the recording equipment so that I can adapt it to my work functions.”

“Although there isn’t enough time to meet with each individual student, these videos act like a face to face meeting. I’m 1 on 1 with the instructor and he is showing me with great detail how and why things functions. Hands down this is a great addition to in class learning.”

Overall, students reacted favorably to the multimedia learning environment. They were overwhelmingly more satisfied with this type of delivery method than with a traditional, face-to-face, lab-type setting. It appears from their responses that the streaming videos aided their understanding of the material and, while it may have taken time to watch the videos (although this would have been time spent in the classroom or lab anyway), the environment seems to have decreased the amount of effort expended to complete the homework assignment. Next, we report on the impact of using streaming video from an instructor perspective.

Faculty Perceptions

The principal investigators of this research were also the instructors who incorporated the multimedia component into their courses. Several issues emerged as salient: the initial investment, reliance on technology, a reduction in the amount of time spent re-teaching, and the archiving of knowledge for future opportunities. These are discussed in more detail below.
Initial Investment

Faculty members must invest the necessary time in acquiring and learning the ins and outs of multimedia development tools (e.g., Camtasia), which involves the following processes: capturing the graphical user interface, recording audio, editing the raw recording, updating older recordings, securing the content against redistribution, and ultimately publishing the content for consumption. Furthermore, when a new version of a software application is released, the faculty member may need to re-create all materials using the most current version of that software tool. Hence, depending on the software application, streaming videos may have a shelf life of only about two years.

Reliance on Technology

Faculty members must have access to the appropriate IT infrastructure (e.g., Blackboard, ANGEL, etc.) and support to effectively organize and stream the content to users. Although other contextual and technical issues may emerge, problems with the IT infrastructure surfaced as the primary issue that may hamper the development, implementation, and distribution of multimedia. We fielded several complaints from students about trouble with the streaming media, which was a product of the IT infrastructure where the videos were housed and streamed from. Unfortunately, this caused frustration for some students and may have tainted their experience with the multimedia environment.

Reduction in the Amount of Time Spent Re-teaching

Faculty members face a three-horned dilemma in deciding how to distribute their limited resources across research, teaching, and service/professional demands. We have found, with few exceptions, that students use office hours to seek assistance with what we call re-teaching. That is, they attended a lab or hands-on tutorial but for one reason or another were unable to grasp the material. After all, the classroom, especially in hands-on labs and tutorials, can move quite quickly, and students may be reluctant to slow down the class or call attention to themselves by asking for clarification or help. Hence, the net effect is time spent re-teaching, oftentimes outside of scheduled office hours, thereby cutting into valuable resources (the instructor’s or research assistant’s time to meet other obligations). We found that using streaming video to either supplant or support face-to-face, hands-on labs led to remarkable resource savings. Specifically, the number of students emailing or stopping by to receive help outside of class diminished greatly, which allowed us to use this time to meet research and service demands. Furthermore, when students did stop by, instead of spending valuable face-to-face time re-teaching, we were able to use the time to truly mentor them in life and career.

Archiving of Knowledge for Future Opportunities

While the process of creating the streaming videos is front-loaded, our experience is that the time and effort it takes is well worth it. For as long as the curriculum remains the same, and barring any software version upgrades, these videos can be used as a tool to impart knowledge in future semesters. Furthermore, with the prevalence of online and distance education, these videos can easily become a component of an online course. In the following section, we provide support for further inquiry into student perceptions and present the results.
4. EXPLORATORY INSIGHTS

As an exploratory component of this study, we also collected rating data in an effort to gain additional insight into student perceptions of the task as well as the multimedia environment. As mentioned previously (see the section on Data Collection), we administered instruments to measure perceived difficulty, perceived learning, and satisfaction.

Learner satisfaction is an important outcome of a good learning experience and has been employed in both academic and business settings to evaluate the effectiveness of learning environments (e.g., Alavi, Wheeler, and Valacich, 1995; Piccoli, Ahmad, and Ives, 2001; Wolfram, 1994). In this context, satisfaction has been described as a sense of accomplishment felt by learners when they reach the end of a learning event and feel that the learning environment helped to facilitate information processing, resulting in successful comprehension outcomes (Keller, 1987; Song and Keller, 2001). In this study, students were asked to rate their level of satisfaction with using the multimedia environment.

Perceived difficulty was measured in order to gain insight into the amount of mental effort that students felt it took to perform a task. Mental effort refers to the amount of capacity that an individual allocates to meet instructional demands and is considered an indicator of cognitive load (Paas, 1992). Prior research has illustrated that an increase in cognitive load can reduce performance and learning (Bannert, 2002; Sweller et al., 1998). In this study, students were asked to rate the difficulty of the homework assignment they were given following the multimedia tutorial.

“Perceived learning is defined as changes in the learner’s perceptions of skill and knowledge levels before and after the learning experience” (Alavi, Marakas, and Yoo, 2002, p. 406). Prior research also supports the notion of using perceived learning as a measure of learning effectiveness. This study operationalized perceived learning as the difference between self-reported pre- and post-knowledge levels.

We did not develop hypotheses regarding the relationships between these variables; hence, we present only descriptive statistics (see Table 2 in Appendix). Nonetheless, there are some interesting findings with respect to the direction of the means, as they appear to be somewhat counterintuitive. Specifically, as perceived difficulty increases, both perceived learning and overall satisfaction with the learning environment also increase. One might instead expect that as perceived difficulty increases, perceived learning and/or satisfaction would decrease; that is, without any additional learning aids or instruction, as the difficulty of concepts increases, the effort that an individual must expend to acquire the requisite understanding to complete the task could lead to lower relative levels of perceived learning and satisfaction with the instruction. Yet it appears from the descriptive data that although the content of the BWA course was considered rather difficult, students were highly satisfied and perceived that they learned a lot. This finding calls for further investigation, which is discussed in the next section. Overall, what is important to note from these data is that students were very satisfied with the multimedia delivery method.
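Because we report only descriptive statistics, the values in Table 2 reduce to per-course means and standard deviations of the student scale scores. The sketch below shows that computation; the sample scores fed in are hypothetical, and we assume the conventional sample (n - 1) standard deviation.

    // Sketch of the descriptive statistics reported in Table 2: the mean and
    // sample standard deviation of per-student scale scores within a course.
    function mean(values: number[]): number {
      return values.reduce((total, value) => total + value, 0) / values.length;
    }

    function standardDeviation(values: number[]): number {
      const m = mean(values);
      const sumOfSquares = values
        .map((value) => (value - m) ** 2)
        .reduce((total, square) => total + square, 0);
      return Math.sqrt(sumOfSquares / (values.length - 1)); // sample SD (n - 1)
    }

    // Hypothetical perceived-difficulty scores for a handful of BWA students.
    const bwaDifficulty = [3, 4, 5, 4, 3.5];
    console.log(mean(bwaDifficulty).toFixed(2));              // "3.90"
    console.log(standardDeviation(bwaDifficulty).toFixed(2)); // "0.74"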
5. LIMITATIONS AND FUTURE RESEARCH

As this study is exploratory in nature, there are a few limitations that need to be addressed. In particular, the use of homogeneous student subjects can lead to issues regarding the generalizability of the results. However, one could argue that this limitation is mitigated because this research examines a pedagogical technique that directly affects and applies to students. Second, our study was cross-sectional and cannot predict students’ attitudes toward this type of learning environment if they were required to use it on a regular basis or if it were integrated into multiple courses. Third, some of the faculty perceptions would be a non-issue if one decided to purchase a commercial, off-the-shelf tutorial product rather than create one’s own. In some cases, however, off-the-shelf products do not meet the needs of a particular course or do not even exist for certain subject areas. Finally, we examined the multimedia delivery of two very different subject areas with what appear to be varying levels of difficulty. We believe, however, that the differences found between these two groups call for further research.

Based on the findings from our descriptive statistics, it appears that perceived difficulty does not negatively influence perceived learning and satisfaction. Although the current study merely represents an initial, exploratory investigation into the utilization of multimedia as a learning aid, the results nonetheless reveal interesting and unexpected relationships between perceived difficulty, perceived learning, and overall satisfaction with the learning environment. These results present an opportunity for future experimental research – that is, to disentangle the factors leading to higher levels of learning and satisfaction in the context of higher levels of perceived difficulty. Furthermore, based on the data collected, it would appear that those with little prior knowledge may benefit most from this type of instructional delivery method, as it appears to have meaningfully increased their level of understanding relative to those who were already familiar with the material. This also lends itself to future investigation. It may be that a multimedia delivery method is best used when the material is perceived as difficult and/or when one has little prior knowledge in a given subject area.

6. CONCLUSION

In the education equation, much focus is placed on the learner, and rightfully so. However, when it comes to pedagogical techniques and methods, there are salient issues and benefits that should be considered from a faculty perspective as well. Hence, this paper provides a glimpse into the perceptions of students and faculty, as well as descriptive statistics of perceived learning, perceived difficulty, and satisfaction, when a multimedia learning component is incorporated into a course. Through our experience with using streaming video as a means to supplement or replace the transfer of procedural knowledge, we hope others are able to see the benefits of implementing this type of technology in the classroom, as well as possible factors to consider that might undermine the meaningful outcomes of technology-mediated learning. We found that creating tutorials using streaming video provided benefits to students in the form of greater satisfaction with the learning environment, a greater understanding of the material, and a reduction in the amount of effort required to complete a homework assignment.
Additionally, our findings appear to reveal a relationship between perceived learning, perceived difficulty, and satisfaction that warrants further investigation. From a faculty perspective, we experienced a marked reduction in visits from students who required additional exposure to previously covered material, freeing up resources that could instead be used for pursuing scholarly or service-oriented endeavors. Moreover, we experienced a decrease in prep time during subsequent semesters and seamless portability to online learning contexts. While there are a few drawbacks to implementing a multimedia component in a course, such as the initial investment in creating the videos and the reliance on technology, we believe the benefits significantly outweigh the drawbacks and hope our experience will encourage others to consider using streaming video as a complementary teaching tool.

7. REFERENCES

Ajzen, I. (1988). Attitudes, Personality, and Behavior. Chicago, IL: Dorsey Press.

Alavi, M., and Leidner, D.E. (2001). “Research Commentary: Technology-Mediated Learning: A Call for Greater Depth and Breadth of Research.” Information Systems Research, 12(1), 1-10.

Alavi, M., Marakas, G.M., and Yoo, Y. (2002). “A Comparative Study of Distributed Learning Environments on Learning Outcomes.” Information Systems Research, 13(4), 404-415.

Alavi, M., Wheeler, B., and Valacich, J. (1995). “Using IT to reengineer business education: An exploratory investigation of collaborative telelearning.” MIS Quarterly, 19, 159-174.

Alessi, S.M., and Trollip, S.R. (2001). Multimedia for Learning: Methods and Development. Needham Heights, MA: Allyn and Bacon.

Andres, H.P. (2004). “Multimedia, information complexity, and cognitive processing.” Information Resources Management Journal, 17(1), 63-78.

Cohen, J., and Cohen, P. (1983). Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Cole, P. (1992). “Constructivism Revisited: A Search for Common Ground.” Educational Technology, 32(2), 27-34.

Doll, W.J., and Torkzadeh, G. (1988). “The Measurement of End-User Computing Satisfaction.” MIS Quarterly, 12(2), 259-274.

Fishbein, M., Guenther-Grey, C., Johnson, W., Wolitski, R.J., McAlister, A., Rietmeijer, C.A., et al. (1996). “Using theory-based interventions to reduce AIDS risk behaviours: The CDC’s AIDS community demonstration projects.” In S. Oskamp and S. Thompson (Eds.), Safer Sex and Drug Use: Understanding and Preventing HIV Risk Behaviour (pp. 177-206). Thousand Oaks, CA: Sage.

Gagne, R.M. (1985). The Conditions of Learning (4th ed.). New York, NY: CBS College Publishing.

Goodhue, D.L., Klein, B.D., and March, S.T. (2000). “User Evaluations of IS as Surrogates for Objective Performance.” Information and Management, 38, 87-101.

Hackman, M.Z., and Walker, K.B. (1990). “Instructional Communication in the Televised Classroom: The Effects of System Design and Teacher Immediacy on Student Learning and Satisfaction.” Communication Education, 39(3), 196-206.

Haseman, W.D., Polatoglu, V., and Ramamurthy, K. (2002). “An empirical investigation of the influences of the degree of interactivity on user-outcomes in a multimedia environment.” Information Resources Management Journal, 15(2), 31-48.

Keller, J.M. (1987). “Strategies for Stimulating the Motivation to Learn.” Performance and Instruction, 26(8), 1-7.

Kettanurak, V., Ramamurthy, K., and Haseman, W.D. (2001). “User attitude as a mediator of learning performance improvement in an interactive multimedia environment: An empirical investigation of the degree of interactivity and learning styles.” International Journal of Human-Computer Studies, 54, 541-583.
Kinzie, M.B., Sullivan, H.J., and Berdel, R.L. (1988). “Learner Control and Achievement in Science Computer-Assisted Instruction.” Journal of Educational Psychology, 80(3), 299-303.

Kraft, P., Rise, J., Sutton, S., and Roysamb, E. (2005). “Perceived difficulty in the theory of planned behaviour: Perceived behavioural control or affective attitude?” British Journal of Social Psychology, 44, 479-496.

Large, A. (1996). “Computer Animation in an Instructional Environment.” Library and Information Science Research, 18(1), 3-23.

Lee, Y., Tseng, S., Liu, F., and Liu, S. (2007). “Antecedents of Learner Satisfaction Toward E-Learning.” Journal of American Academy of Business, 11(2), 161-168.

Lepper, M. (1985). “Microcomputers in Education: Motivational and Social Issues.” American Psychologist, 40, 1-18.

Mayer, R.E. (2008). “Applying the Science of Learning: Evidence-Based Principles for the Design of Multimedia Instruction.” American Psychologist, 63(8), 760-769.

Mayer, R.E., and Anderson, R.B. (1991). “Animations need narrations: An experimental test of a dual-coding hypothesis.” Journal of Educational Psychology, 83, 484-490.

Mayer, R.E., and Anderson, R.B. (1992). “The instructive animation: Helping students build connections between words and pictures in multimedia learning.” Journal of Educational Psychology, 84, 444-452.

Mayer, R.E., Heiser, J., and Lonn, S. (2001). “Cognitive constraints on multimedia learning: When presenting more material results in less understanding.” Journal of Educational Psychology, 93(1), 187-198.

Mayer, R.E., and Moreno, R. (1998). “A split-attention effect in multimedia learning: Evidence for dual processing systems in working memory.” Journal of Educational Psychology, 90(2), 312-320.

Merrill, M.D. (1983). “Component Display Theory.” In C.M. Reigeluth (Ed.), Instructional Design Theories and Models. Hillsdale, NJ: Erlbaum Associates.

Merrill, M.D. (1994). Instructional Design Theory. Englewood Cliffs, NJ: Educational Technology Publications.

Montano, D., Kasprzyk, D., von Haeften, I., and Fishbein, M. (2001). “Toward an understanding of condom use behaviours: A theoretical and methodological overview of Project SAFER.” Psychology, Health and Medicine, 6(2), 139-150.

Nicholson, J., Nicholson, D., and Valacich, J. (2008). “Examining the Effects of Technology Attributes on Learning: A Contingency Perspective.” Journal of Information Technology Education, 7, 184-204.

Park, O. (1994). “Dynamic Visual Displays in Media-Based Instruction.” Educational Technology, 21-25.

Petty, R.E., and Cacioppo, J.T. (1996). Attitudes and Persuasion: Classic and Contemporary Approaches. Boulder, CO: Westview Press.

Piccoli, G., Ahmad, R., and Ives, B. (2001). “Web-based virtual learning environments: A research framework and a preliminary assessment of effectiveness in basic IT skills training.” MIS Quarterly, 25(4), 401-426.

Rai, A., Lang, S.S., and Welker, R.B. (2002). “Assessing the Validity of IS Success Models: An Empirical Test and Theoretical Analysis.” Information Systems Research, 13(1), 50-69.

Rieber, L.P. (1994). Computers, Graphics, and Learning. Madison, WI: Brown and Benchmark.

Song, S.H., and Keller, J.M. (2001). “Effectiveness of Motivationally Adaptive Computer-Assisted Instruction on the Dynamic Aspects of Motivation.” Educational Technology Research and Development, 49(2), 5-22.
Steuer, J. (1992). “Defining virtual reality: Dimensions determining telepresence.” Journal of Communication, 42(4), 73-93.

Tabachnick, B.G., and Fidell, L.S. (2001). Using Multivariate Statistics (4th ed.). Needham Heights, MA: Allyn and Bacon.

ten Berge, T., and van Hezewijk, R. (1999). “Procedural and declarative knowledge: An evolutionary perspective.” Theory and Psychology, 9(5), 605-624.

Williams, M.D. (1996). “Learner-Control and Instructional Technologies.” In D.H. Jonassen (Ed.), Handbook of Research for Educational Communications and Technology. New York, NY: Simon and Schuster Macmillan.

Wolfram, D. (1994). “Audio-Graphics for Distance Education: A Case Study in Student Attitudes and Perceptions.” Journal of Education for Library and Information Science, 35(3), 179-186.

Appendix

Table 1: Summary of Student Responses

Compared to a traditional, face-to-face,                MIS      BWA      Overall
lab-type setting:                                       (n=88)   (n=21)   (N=109)

Were you more satisfied with the multimedia
delivery method?                                        95.4%    90.5%    94.5%

Did the multimedia delivery method increase
your level of interest in the material?                 54.5%    90.5%    61.5%

Did the multimedia delivery method increase
your level of understanding of the material?            83.0%    100%     86.2%

Did the multimedia delivery method make
completing the homework assignment easier?              84.1%    95.2%    86.2%

All cells report the percentage of students answering yes.

Table 2: Means and Standard Deviations

Variable                        MIS                BWA                Overall

Perceived Difficulty            M=2.74, SD=1.20    M=3.91, SD=1.20    M=2.94, SD=1.27

Perceived Learning
(Post Know - Pre Know)          M=1.08, SD=0.93    M=4.00, SD=1.14    M=1.59, SD=1.47

Satisfaction                    M=5.90, SD=0.89    M=6.54, SD=0.35    M=6.01, SD=0.86

All items were measured on a seven-point Likert-type scale. The items for perceived difficulty and satisfaction were anchored by strongly disagree (1) and strongly agree (7), where a higher rating indicated greater perceived difficulty or greater satisfaction, respectively. The items for pre- and post-knowledge were anchored by none (1) and extensive (7), where a higher rating indicated more knowledge.