In Defence of Academic Qualifications in Cybersecurity

The consensus at the moment seems to be that academic qualifications for cybersecurity are expensive, out of date, and do not prepare students for life in the real world.

I agree.

However, that does not mean they are not useful, and indeed I would say that they are probably the first thing hiring managers should look at.

Especially in the USA, but increasingly in the UK and the rest of the English-speaking world, the path into security is laid out like this: get the CompTIA trifecta (A+, Network+, Security+). Create some projects on GitHub. Go spend months on TryHackMe to get into the top n%. Cram as many other certs as you can, and you should get a cybersecurity role. If all else fails, take a helpdesk job and jump after 18 months, once you have some experience to point to.

Whilst the above scenario may work for some, it does not demonstrate that you actually know anything at all. The CompTIA exams are perfectly decent, but the point of certification is to demonstrate what you already know. Current advice, however, seems to be to learn from books or videos purely in order to pass the exam. The two are different mindsets: the first challenges you to apply what you know against the aims of the qualification; the second gives you the answers and expects you to hit the ground running once you are let loose in the real world. I do not blame new entrants for choosing the second path. Getting experience is hard without a job, and getting a job is harder now (much harder than it ever was for me as a GenX job seeker) because employers expect certification as a minimum. So people learn to pass the exams, not how to do the job that requires them.

Similarly, whilst there is definitely merit in creating your own projects, or working through tens of hours of hacking challenges, they do not represent the real world. The real world does not have pages of YouTube tutorials, nor does it have influencers suggesting projects and giving hints and walkthroughs on how to complete them. As a learning experience, they are heavily influenced by the person doing them. Some may work through everything from scratch, filling GitHub repos and pwning boxes left, right, and centre. Others may produce the same output, but have got there on the back of video hints and answers in blog posts. Again, the two are not the same, and a hiring manager cannot tell which type of person you are from your CV.

Which brings me to academia. To flag my potential biases up front: I have a Master's in Cybersecurity, and am studying for a PhD in the same. Degrees are expensive. They require serious thought about the financial commitment if you are in the UK, and in the USA they can be almost ruinously expensive. This has to change. Education benefits everyone, and we should make tertiary education as cheap as possible for everyone to access. My Master's cost me £7,200 over two years, and my PhD will be around £15,000 over six years, both studied part-time. An undergraduate degree done full-time can easily reach £40,000 to £50,000 once living expenses are included. It is a massive commitment.

What a degree does provide, however, is evidence that the person taking the course has reached a particular level of competency. Note that I'm not saying that they should be immediately employed in a security role. But what a degree proves is that you can follow a syllabus and submit work that reaches a particular standard. It is possible to cheat, for example by having someone else do the work, but universities have years of experience in detecting plagiarism and other forms of cheating, and I would argue that they are more adept at spotting it than any certificate-awarding body. Because students must submit multiple pieces of coursework and conduct independent research, hiring managers can be reasonably sure that a degree holder is capable of learning, and of applying that knowledge.

What then of the course material itself? I would agree that many institutions are stuck using ageing software and teaching techniques that may be out of date. This needs to change: core modules should be updated much more frequently, and industry should help design training pathways. However, to return to my point, the key takeaway from a degree is the ability to learn. What you have actually learned is almost irrelevant; this is why many roles specify a degree requirement without requiring a specific subject. During your working life, the technology you work with will be replaced, replaced, and replaced again. What is important is that you have the ability to learn. A degree, for all its faults, is the best indicator of this.

What of the future? More certification bodies should move away from multiple-choice testing and towards a thorough test of the candidate's knowledge through practical demonstrations and longer-form coursework. This, of course, is more labour-intensive and more expensive. Ultimately, though, if everyone has a certificate, then nobody has one, so hiring managers need another way of differentiating candidates. For the foreseeable future, that is often going to be an undergraduate or postgraduate degree.
