Computer Competency of Incoming College Students: Yet More Bad News

Frank Hulick
frank.hulick@sru.edu

David Valentine
david.valentine@sru.edu

Computer Science Department
Slippery Rock University
Slippery Rock, PA 16057

Abstract

We describe the results of an effort to assess the computer competency level of the entire incoming freshman class at Slippery Rock University. A competency test was developed and administered to more than 1500 students during the spring and summer of 2008. We found that 58% of the students failed to meet the normally passing grade of 60% on the test. These results conform to those of every other study we could find in the literature. Formal testing of current college students, no matter who the authors were or what the subjects and content areas were, has returned the same bad news: well over half of students do not have the computer skills necessary to function in an information-based society.

Keywords: Computer Competency, IS Education, Proficiency Testing

1. THE ELUSIVE GOAL OF COMPUTER LITERACY

The desire for "computer literacy" in the academy has been around for decades. The term has been so used and abused that no clear consensus on what it means seems possible. A glossary of terms from the Harvard Law website defines computer literacy as "the degree to which individuals are familiar with computer operating systems and applications." The website for Austin Community College, Austin, TX offers this definition: "a user has basic knowledge of computer operations (copying files, printing documents, etc.); use of the Internet/World Wide Web (browsers, search engines); basic software applications (word processing, spreadsheets, etc.); email functions (sending/receiving messages, attaching files, etc.)." The Women to Women (W2W) Network states that computer literacy is the "knowledge about and the ability to learn about computers," whereas law.justia.com defines it as "the general range of skills and understanding needed to function effectively in a society increasingly dependent on computer and information technology."

The terms "computer literacy" and "computer competency" span all of these definitions. There is a general sense that graduates today should have enough mastery of current, widely used technology to be effective workers in an information-based society. But of course, the devil is in the details when you get down to defining what "mastery" is and what the "current technology" is. While we prefer the term "computer competency," we will treat "computer literacy" as a synonym, since so many others use "literacy."

A major problem, of course, is that the pace of technological change forces a corresponding change in the definition of competency or literacy. Not so long ago, people thought some word processing and maybe a little programming in BASIC made you competent; the Internet was a distant dream. Because competency is tied to the state of commonly used technology, we are always aiming at a moving target.

Computer skills are highly prized by corporate America. Moody, Stewart and Bolt-Lee (2002) surveyed 1500 corporate recruiters and found the most valued skills were (a) communication (oral and written), (b) computer literacy, (c) interpersonal/social, (d) critical thinking/leadership (tied), and (e) teamwork. Only communication skills were more highly valued than computing skills.
A common belief is that the current cohort of college students is somehow innately "computer competent." Anecdotal reports point to the vast array of electronic gadgets students own (from cell phones to MP3 players to laptops); their ability to spend hours surfing the Web and "facebooking" friends is cited as further proof. However, the skills these proponents point to are exactly the activities that will get you fired from a job, not hired into it. An employee who spends significant amounts of the workday "facebooking" or loading songs onto an MP3 player will not be (and should not be) long on the job. As we shall see, the research literature does not support this competency assertion.

Over the last few years there has been an increasing demand for accountability in higher education from the funding stakeholders. State legislatures have mandated a 120 credit hour cap on degree requirements (Shannon, 2007). Additionally, many programs have been seeking external accreditation from the appropriate professional boards, and that has put additional pressure on the curriculum. The unfortunate result is that, across the land, many degree programs have decided to jettison the computer literacy requirement and cut that three credit course from the major. The defense is the aforementioned belief that current students do not need such a course: they know it already.

2. SURVEY OF THE LITERATURE

Many Information Systems educators have begun to assess exactly what incoming freshmen understand about the technology. We question the assumption that knowing how to send text messages and Google topics means you are ready to use the technology in your professional life.

While computing courses are widely available in secondary schools, they are not yet part of the mandated (core) curriculum. Ceccucci (2006) surveyed 100 randomly selected high schools across America to see what courses they offered in five areas. The survey reported that:

* 99% of high schools have a "computer literacy" course (apparently defined as application software), but only 13% require the course for graduation. Only two states (Nevada and North Carolina) have a computer literacy graduation requirement.
* 80% have a graphics/desktop publishing course, and Adobe PageMaker dominates.
* 56% have a web/e-commerce course.
* 50% have a programming course. VB dominates as the language of choice, with the others equally split between C++ and Java.
* 33% have a hardware/networking course; many of these follow the Cisco CCNA curriculum.

So while the course offerings, especially in applications software and in graphics, are widespread, there is little compulsion for students to actually take them; they sit predominantly in the electives category of secondary education. It seems that higher education must continue to fill the gap between secondary education and the corporate world for necessary computer skills.

Shannon (2007) conducted a large study built around pre-tests and post-tests of 400 students in a typical college computer literacy course. The study consisted of self-reporting surveys on (a) demographic data and (b) computer skills. The subjects self-reported their belief in their own mastery of 13 different computer skills. The highest confidence levels on the pretest were for file management, word processing, web browsing, email and CD burning; the lowest were for spreadsheets, databases, web design and file transfer. Shannon found statistically significant (df = 798, p < 0.01) increases in all thirteen areas.
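For readers wondering where the degrees of freedom come from, df = 798 is consistent with comparing the 400 pre-test surveys and the 400 post-test surveys as two independent samples (an inference from the reported numbers rather than a detail given above):

\[ \mathrm{df} = n_{\text{pre}} + n_{\text{post}} - 2 = 400 + 400 - 2 = 798 \]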
The highest self-reported gains were in spreadsheets, databases, presentation software, file transfer and multimedia editing. In other words, students had things to learn in all thirteen areas. The demographic data showed the strongest increases for African American and Hispanic students.

Of course, self-reported confidence levels are open to question. Dettori, Steinbach, and Kalin (2006) describe a program at DePaul where incoming students use a suite of online competency exams to help place them into the appropriate introductory CTI course. They report that an immediate advantage of the program is that it sets aside the students' inflated beliefs: "Students tend to believe they are better prepared than they really are. Being able to point to and immediately discuss the result of the self-test proved a convenient and convincing advising tool." Of course, these are the same students who proclaim that they "know computers" before taking the placement battery.

VanLengen (2007) reports on a well-designed pretest/posttest study around a college competency course involving 61 students. They were tested across eight different areas, and fewer than one third of them passed the pretest. The average pretest grade was failing.

              Avg Pretest Score   Avg Posttest Score
              54%                 67%

Furthermore, when the scores of the students who scored above 60% (i.e., passed the pretest) were examined, VanLengen found that they increased their average from 64% to 73%. His department is using this study to argue that computer competency should become part of the school's general education requirements.

Rafaill and Peach (2001) discuss a bold program at Georgetown College (Kentucky). Since 1999, Georgetown has had a computer literacy graduation requirement. Students are given a massive, 3.5 hour online exam during orientation. The exam covers seven computing areas, and students who pass five of the seven are deemed computer literate. In the first two years of the program, fewer than 50% of students passed the test. Other than the Internet area, where the average score was 72%, students averaged well below 60% on the area tests. Students not passing the exam are advised into a traditional computer competency course, where a grade of C will meet the graduation requirement.

McDonald (2004) reports a study at Georgia State where a suite of six competency tests was developed to measure student computer literacy. When those tests were given to students majoring in computer information systems, more than 50% could not pass all six. These surprising results sent the whole project back to square one.

Wallace and Clariana (2005) describe a well-defined study of 140 freshman business majors in a computer literacy course. Two separate exams were designed using the textbook publisher's test bank. One test covered basic computer concepts (file management, email, web, etc.) and the other focused on spreadsheet use (mathematical functions, cell references, graphics, linking worksheets, etc.). They report that only 36% were able to pass both pretests. The averages of the study are given below.

              Pretest Average   Posttest Average
Concepts      57.6%             78.0%
Spreadsheet   59.5%             82.4%

The authors conclude in a classic understatement that sums up all of these studies: The assumption that incoming freshmen business students possess adequate knowledge of both computer concepts and computer literacy skills is not accurate.
In fact, the average score of 58 percent on the Concepts test and 60 percent on the Excel pre-test suggests that students do not possess the necessary skills to function in an undergraduate School of Business.

The consistency of results in these studies is stunning. In no case did we see the actual questions. The content areas ranged all over the map. The authoring groups ranged from campus committees to single faculty members. The subjects ranged from individual computer class members to every student in the freshman class. Yet in every study we see "pass" rates that range from about one third to less than half. It does not seem to matter whose definition of computer literacy we use, or which testing scheme we adopt: the vast majority of students actually tested fail to demonstrate the necessary mastery. "Not accurate" is the mildest verdict we can place on the assumption that this college cohort is innately computer competent.

3. THE STUDY AT SLIPPERY ROCK UNIVERSITY

The administration at Slippery Rock University decided in 2003 that we would like to certify that every graduate had achieved the computer competency needed to function in a modern, information-based society. This was the latest such initiative in a long line of failures dating back more than ten years. A campus committee agreed on a set of content areas, and the Computer Science department was charged with creating a test to measure student comprehension in those areas. After a series of false starts and logistical nightmares, the department joined with the Academic Services staff to administer an examination to all incoming freshmen as they completed their new-student orientation.

The six areas to be tested were hardware, software, the Internet, networking, ethical issues, and security/privacy. Because the exam had to be administered within a sixty minute period (including getting students in and out of the room, distributing the materials, giving instructions and collecting the results), we limited it to sixty questions distributed across the six areas. Furthermore, we had to give the test to groups of 250 students at a time, which precluded an online exam: we simply lacked sufficient lab space. So we reluctantly used mark sense forms and #2 pencils to assess computer competency.

Previous experience had prepared us for the low level of actual knowledge possessed by incoming students: they believe they know a lot more about computer technology than they actually do. That same experience pushed us to design a test aimed at the most minimal set of skills deemed necessary. We wanted critics across campus to see the test and say, "yes, students should certainly know that." For example, much of the hardware section of the test revolved around an actual computer advertisement. We presented the fact sheet for a standard desktop machine and then asked questions such as:

* How much memory is in this machine?
* What is the capacity of its hard disk?
* Does it have wireless access?

During the spring and summer of 2008 this test was given to 1517 incoming freshmen. Our results confirm those of all the previous studies: only 42% of the students earned a normally passing grade of 60%. The scores are given in Table 1 in the Appendix. The lowest score among all the students who took the exam was 15 (out of 60); the highest was 57. Some of the more surprising (and discouraging) results were:

* 9% of students knew that the contents of memory are lost when a computer is turned off.
* 17% knew that Windows Explorer is an application used to work with files.
* 34% could determine the amount of RAM listed in a standard computer advertisement.
* 20% knew what a device driver was.
* Less than 50% knew that "reply to all" in an email program sends the reply to everyone in both the "To:" line and the "CC:" line.
* Most students identified the Internet as "a large multi-user program" rather than as an example of a computer network.
* 45% of students knew that .txt, .mp3, .pdf, etc. are examples of file extensions.
* 60% of students could identify Google as a search engine.

The students deemed to have passed the exam will have their student records marked as "computer competent." The remaining students have several options. Three existing computer science courses are identified as meeting the computer competency goal, and many students will take one of the three as part of their undergraduate degree program. All School of Business students, for example, take CpSc210 Productivity Software, and so will meet the graduation requirement by completing that course. Other programs that have some form of "technology" course in their curriculum may also have that course registered as meeting this requirement.

Furthermore, a new one credit course has been developed to provide the necessary help for those students who just missed the cutoff score. This course will be run in the spring term and is offered on a pass/no credit basis in an auditorium setting (up to 130 students per section). Students will receive a formal introduction to the fundamental concepts in each of our six content areas and, by passing a final exam, will receive the credit and meet the requirement for graduation. That final exam will (of course) have strong similarities to the original screening exam. Students who fail this one credit course will be advised into one of the traditional courses that address computer competency. We look forward to reporting the outcome of this new course.

4. CONCLUSION

Slippery Rock University conducted computer competency testing of some 1500 incoming students in 2008. We found that 58% of the students failed to demonstrate an understanding of computer technology that would normally be deemed "passing" on that exam. This result conforms to the vast weight of previous studies in the field; the same result has been obtained in studies that differ widely in their design, their origin, their content and their audience. It seems that whatever standard professionals use, and whatever they consider "computer competent," current college students do not have those skills. To assume that they possess that competency, and to expect them to perform technically either in the academy or in industry, puts those students at risk. We conclude that incoming students, even though they are masters of their technological gadgets, need a formal introduction to the computer skills necessary to succeed in a modern, information-based society.

5. REFERENCES

Ceccucci (2006). "What Do Students Know When They Enter College?" Information Systems Education Journal, 4 (90).

Dettori, L., Steinbach, T., & Kalin, M. (2006). "Is this Course Right for You? Using Self-Tests for Student Placement." Information Systems Education Journal, 4 (77), 1-12.

McDonald, D. S. (2004). "Computer Literacy Skills for Computer Information Systems Majors: A Case Study." Journal of Information Systems Education, April 1, 2004.
http://www.allbusiness.com/human-resources/careers-job-training/1165970-1.html (retrieved August 20, 2008).

Moody, J., Stewart, B., & Bolt-Lee, C. (2002). "Showcasing the Skilled Business Graduate: Expanding the Tool Kit." Business Communication Quarterly, 65, 21-33.

Rafaill, W. S., & Peach, A. C. (2001). "Are Your Students Ready for College? Technology Literacy at Georgetown College." Proceedings of the Annual Mid-South Instructional Technology Conference (6th, Murfreesboro, TN, April 8-10, 2001).

Shannon, L.-J. (2007). "Information and Communication Technology Literacy Issues in Higher Education." ISECON Proceedings 2007, v24 (Pittsburgh): §3112.

VanLengen, C. (2007). "Computer Literacy Proficiency Testing." ISECON Proceedings 2007, v24 (Pittsburgh): §2742.

Wallace and Clariana (2005). "Perception versus Reality: Determining Business Students' Computer Literacy Skills and Need for Instruction in Information Concepts and Technology." Journal of Information Technology Education, 4.

Appendix

Table 1: Exam scores of the 1517 incoming freshmen