Abstract Hands-on laboratory exercises that teach students the applied side of what they learn in lecture are vital to student success. This fact has been recognized by both the IEEE/ACM and NCAE (formerly CAE) model curricula, in 2017 and 2020, respectively. Both mandate the use of lab environments to give students a chance to practice the skills they have learned.
However, for the instructor, developing and maintaining labs creates many challenges. Among these are the need for deep technical knowledge, time to create the lab exercises, and the time to maintain the lab in the face of constant changes to operating systems, user and server software and other tools, and the changing threat environment.
Much has been written about automating lab development and grading, which greatly helps instructors with the time demands. Unfortunately, very little has been written about whether the labs so developed and automated are pedagogically sound. Do the labs actually teach what they claim to teach? Are they efficient in teaching it? These and similar questions remain unanswered. There must be a better way. Fortunately, about 60 years ago the same question was asked in the engineering discipline, and 20 years ago a comprehensive push was undertaken to map lab outcomes to ABET accreditation standards (Feisel, 2002). This question apparently has not yet been asked in IT, CS, IS, or cybersecurity programs. As a field, we can no longer coast along on unoptimized labs; we need labs that demonstrably teach mandated content.
Based largely on my own experience over the last 16 years of teaching, and supported by literature reviews of hands-on learning theory in other fields that use labs, I have identified the following issues with labs. I do not claim this is a comprehensive list, but it represents a starting point for the conversations that need to occur in cybersecurity education:
1. Labs are often written by a single instructor. This may limit topic choices and technical quality.
2. Student engagement may be lacking in lab exercises.
3. Even an engaging lab may not be appropriate for the level of the course or the student preparation.
4. The scope of the hands-on exercise may be inappropriate for the course or time allotted.
5. A lab that is too detailed or too sparse in terms of instructions will limit student learning.
6. Non-standard or proprietary tools may limit the applicability of skills learned in career settings.
7. Many labs contain bugs, especially as software changes.
8. Student collusion and cheating may be difficult to prevent in lab environments and exercises.
9. Grading can be very time consuming.
Of course, there are other issues with cybersecurity labs: the cost of equipment, upgrade cycles, security concerns on campus networks, and so on. There are also commercial and open, community-supported sources of labs, but the question remains: are these labs pedagogically effective? Without guidelines or standards, who can answer that definitively?