The Cyber Gap Series — Part 3

Political Influence

As noted in the Introduction to this series, the existence of this gap levies a real cost. Different people will quantify this cost differently, depending on their views, experiences, and responsibilities. There is no shortage of articles, studies, and insider quotes from which to determine the financial cost of inadequate or non-existent cyber protections. Each year, businesses and governmental bodies spend billions of dollars defending against cyberattacks, and each year these same entities spend millions recovering from successful ones. The cost of successful attacks has also been quantified in lives ruined, reputations tarnished, and lasting psychosocial impacts.

However, it is just as important to be aware of and address the costs of this gap that fall short of national headlines. The core issue at hand is not simply a jobs gap, but a skills gap. What I call “The Great Divide” is not simply hundreds of thousands of jobs going unfilled because no one is available to fill them. Rather, it is the fact that there exists a sizeable population of people who could become, and may even have tried to become, an integral part of the industry but are simply not qualified, either because their skills do not address industry needs or because they fail to meet traditional qualification metrics such as a four-year degree. The problem is thus two-fold: qualified persons may be barred by traditional requirements, while those who chose to pursue traditional qualifications may now possess an irrelevant or lacking skillset.

This two-fold problem impacts the industry in several ways: it limits the size of the potential applicant pool; candidates may be ill-fitting or incapable of fulfilling position responsibilities (a tenuous situation for both employee and employer); some may end up underemployed for lack of official credentials (a status which will likely follow them for the majority of their careers); and others may be unable to justify the cost of traditional education given the perceived decrease in value of such credentials. Each of these outcomes has cascading effects on the industry itself, as well as on all interconnecting realms. To reverse this trend, we must look to the economic, social, political, and educational factors that contribute to it.


The politics of cybersecurity begins in the 1970s with the Ware Report[1]. This report, declassified in 1975, was written by a Defense Science Board task force, funded by ARPA, that was created to study and recommend computer security safeguards to protect classified information on resource-sharing computer systems[2]. While certain recommendations within the Report were adopted by government agencies such as the NSA, the nature of the document left it relatively ignored by the Carter Administration[3]. The succeeding administration reversed this position when Reagan, inspired by the 1983 movie WarGames[4], signed National Security Decision Directive 145, “National Policy on Telecommunications and Automated Information Systems Security,” officially acknowledging the threat foreign governments posed within the cyber realm[5]. This directive, along with the anti-hacking laws that would soon pass in its wake, represents one of the earliest examples of federal intent to improve the nation’s cybersecurity posture. These measures, however, did not explicitly address the issue of personnel.

Following the administrations of Carter and Reagan, the next major political influence occurred during George W. Bush’s presidency. This influence can be divided into three distinct parts: Richard A. Clarke’s tenure as National Coordinator for Security, Infrastructure Protection, and Counterterrorism; the September 11th terrorist attacks; and the 2007 cyberattacks against Estonia and the DoD[6]. Clarke’s position in the Bush administration signified, pre-9/11, the adoption of cybersecurity policy within the national government’s overarching information security concerns. Of course, there is little doubt that the events of September 11th drastically changed how the United States government, as well as many other governments, handled national security. Following 9/11, national interest in security grew considerably, as seen in the establishment of the Department of Homeland Security and the shift in operational focus of agencies such as the Federal Bureau of Investigation from law enforcement to national security[7].

Naturally, the nation’s interest in cybersecurity grew as well, with figures such as Richard Clarke speaking at the RSA Conference in 2002 on the importance of improving an organization’s cybersecurity posture[8]. During his presentation, Clarke cited statistics indicating that the average corporation designates less than 1% of its revenue to information technology security. In the wake of this security-focused paradigm shift, President Bush would push for a four-billion-dollar information technology security budget (a 64% increase). That allotment pales in comparison to the $17.4 billion budgeted for FY 2020[9]. Between these two points in time, we find our third distinct part: Estonia and the Department of Defense. Both events cemented cyberspace as an integral part of a nation’s infrastructure and, more importantly, marked the only time when supply and demand for qualified cybersecurity personnel were equal.

This equality existed because the supply of qualified personnel met, and even exceeded, the perceived demand, as prior initiatives to improve the nation’s cyber infrastructure had focused on products rather than people. The period of 2001–2007 restructured the cyber landscape as the government increasingly shouldered the burden of security. After 2007, the equality between supply and demand ceased to exist, with supply falling further and further behind demand as the perception of what was needed to accomplish the mission grew in scope[10]. This perception has been upheld by both the Obama and Trump administrations and is forecast to continue well into the 2020s.

Finally, it is worth noting that there is a measurable delay between the founding of initiatives in the political and social fields and their resultant effects in the economic and educational fields. This delay will be examined further in successive articles, but initiatives with delays of several years or more may outlive the political lifespans of their proponents. As a result, politicians may have a disincentive to commit time and resources to something they may not be able to capitalize upon later.

[1] Gourley, R. (2015, June 7). List of cyber threat “wake-up calls” Growing.

[2] Defense Science Board. (1975). Security Controls for Computer Systems: Report of Defense Science Board Task Force on Computer Security (R-609–1). RAND Corporation.

[3] Gourley, R. (2016, December 4). Report the cybersecurity commission should have sent the president.

[4] VanHooker, B. (2020, March 17). How the movie ‘WarGames’ inspired Reagan’s cybersecurity policies. MEL Magazine.

[5] Federation of American Scientists. (n.d.). National Security Decision Directive Number 145: National Policy on Telecommunications and Automated Information Systems Security.

[6] RAND National Security Research Division. (2014). Hackers Wanted (RR-430). RAND Corporation.

[7] Lemos, R. (2002, February 19). Security guru: Let’s secure the net. ZDNet.

[8] Lemos, R. (2002, February 19). Security guru: Let’s secure the net. ZDNet.

[9] White House. (2020). Cybersecurity Funding (24).

[10] RAND National Security Research Division. (2014). Hackers Wanted (RR-430). RAND Corporation.