Entertainment built around the information kept inside machines is more recent and less mainstream. With the popular focus on intelligent, evil machines rising to rival humanity, the idea that the data inside non-sentient machines really is changing the way we all live, and not always for the better, is hard to dramatize. Even when it is done well, the problem for Infosec educators is that, once the show is over, its (sometimes false) narratives may have been burned into viewers' heads as facts.

While we should welcome anything that raises public awareness of security around technology, these misconceptions can make our job harder. This is especially true for those of us working to break down barriers to computer usage among older and less computer-savvy citizens, a problem highlighted by the Digital Divide. [1] Fear of technology is perhaps the last big hurdle to the widest possible use of IT. Entertainment has stoked this fear for many years, even though the spikes in its popular genres have not always synched with actual scientific progress. Yet the idea that machines might somehow come to equal and ultimately master us (or even threaten our very existence) is much older in origin than the moving picture.

I'm no critic, so the examples of entertainment used in this piece are drawn from my own reading and viewing. Even with this small sample, you can clearly see how they pick up on concerns about computer security. Fears about the dominance of machines through AI are prolific in the world of entertainment, and some of the concerns about the unintended consequences of machine intelligence predate the growth of modern computing, a growth which has been more rapid and more ubiquitous than most works of speculative fiction imagined.
From Ambrose Bierce's 'Moxon's Master' (1899), in which a chess-playing automaton develops homicidal human impulses, through to the more calculatingly murderous logic of the HAL 9000 computer in '2001: A Space Odyssey' (1968), the ground was laid for nervous speculation about machines programmed with human logic. More recently, some of these concerns have gained traction in the scientific community [2] too.

Fears about the nuclear arms race have frequently been taken up in movies and books. Among the works that speculated on the role of intelligent machines is 'Colossus: The Forbin Project' (1970), in which a sentient, worldwide defense system decides to lead humankind forward to a better existence, using nuclear blackmail. 'WarGames' (1983) puts the nuclear button in the charge of another sentient machine, though this one proves benign and merely warns its creators about the futility of nuclear war.

The movie Superman III (1983) contained futuristic fantasy themes centered on AI, yet an incidental and more mundane plotline was a true harbinger of computer crime: a minor character (played by Richard Pryor) breaks into the action by covertly creaming off tiny amounts of money from many accounts. In the real world, this technique, known as salami slicing, was later highlighted in Clifford Stoll's 'The Cuckoo's Egg' (1989). In this account of what is likely the first-ever investigation into hacking, Stoll follows up an accounting discrepancy of less than a dollar which, after much intrigue, turns out to have been caused by illegal use of computing time. Worldwide investigations ultimately uncover the hacker, Markus Hess, who sold the industrial secrets he stole to the Soviet Union.

The movie Independence Day (1996) follows some of the plotlines of H. G. Wells's The War of the Worlds (1898). In that book, the otherwise invincible aliens are (spoiler alert) killed incidentally by Earth's common microbes.
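As an aside for the technically curious, the salami-slicing fraud mentioned above is easy to sketch: each transaction is rounded down to a whole cent, and the shaved sub-cent remainders, individually too small to notice, are diverted to a single account. The snippet below is a minimal, purely illustrative sketch; the account names, balances, and interest rate are all invented for the example.

```python
from decimal import Decimal, ROUND_DOWN

def credit_interest(balances, rate):
    """Apply interest to each account, rounding every credit DOWN to a
    whole cent, and accumulate the shaved fractions in one place."""
    skimmed = Decimal("0")
    credited = {}
    for account, balance in balances.items():
        exact = balance * rate  # e.g. 12.34567 dollars
        paid = exact.quantize(Decimal("0.01"), rounding=ROUND_DOWN)
        skimmed += exact - paid  # sub-cent remainder, quietly diverted
        credited[account] = paid
    return credited, skimmed

# Hypothetical bank: 10,000 accounts, 1.23% interest.
balances = {f"acct{i}": Decimal("1000.00") + Decimal(i) for i in range(10_000)}
paid, skimmed = credit_interest(balances, Decimal("0.0123"))
print(skimmed)  # fractions of a cent add up across thousands of accounts
```

No individual customer loses more than a cent per transaction, which is exactly why the technique historically evaded reconciliation checks that only flagged large discrepancies.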
In the movie update, the invaders are destroyed (though in this case by design) through a computer virus. Around the time the movie was released, only about a third of US households had computers, and fewer than one in five had Internet access. [3] Firewalls were a new concept then, and interconnectivity was still largely exclusive to government, academia, and commerce. Malware, too, was a far less developed threat to computing than it is now. The movie therefore introduced the idea that advanced systems could be vulnerable to viruses to audiences who might never have heard the term before.

There is a paradox about the vulnerability to cyber-attack of older systems, in particular industrial control and management-level systems. On the one hand, these services, some of which have long supported parts of the critical national infrastructure (CNI), are vulnerable to exploits that have been available for years. On the other hand, the great age (in computing terms) of some makes them resistant to the newest, more sophisticated exploits, which are developed to target the most up-to-date technologies.

In the TV series Battlestar Galactica (2004), an independent (and sophisticated) AI entity attacks its human creators through all-out cyber warfare, including detonating the humans' own nuclear weapons and disabling their defense systems. What is left of humanity can fight back only with older weapons that are invulnerable to these cyber-attack techniques. Clearly a major failure in the AI entity's war plan!

The Machine Stops (1909) is a novella by E.M. Forster, best known for his novels of Edwardian upper-middle-class society.
In this uncharacteristic science-fiction departure, Forster depicts a society that functions solely through a machine which, as it becomes more and more flawed, attempts to control its supine adherents ever more tightly. In Forster's vision, the ultimate breakdown of this machine becomes the key to mankind's future and its survival.

In the TV series Mr. Robot (2015), activists of ambivalent motivation fight to shut down not an AI but all of the systems and services that modern society relies on to support the capitalist economy. Unlike in Forster, however, the results of this apocalyptic event are less straightforward. The series marks a turn away from storylines where machines have the upper hand: as one character declares, "people always make the best exploits" (which also happens to be a helpful tagline for Infosec educators). One plotline, for example, involves a hoax music CD that actually contains malware; through a very sophisticated social engineering attack, an employee is induced to load it onto his own corporate network.

Popular entertainment can be a helpful platform for drawing public attention to Infosec issues and can provide useful narratives to illustrate real-world concerns. Only recently have fictional genres addressed the state of computer technology as it really is, perhaps reflecting the fact that (as of 2014) [4] 84% of U.S. households owned a computer, and 73% of those had a broadband connection to the Internet.

Many of us take pleasure in speculative fiction, so let's hope for even more opportunities to use entertainment as a way of illustrating real points about Infosec. You can try this yourself by applying your own reading and watch lists to your knowledge of Infosec, and have some fun reviewing them against real-world developments. Wrapping your conclusions in a good Infosec narrative can add popular recognition and appeal to your awareness and education efforts.
[1] https://www.whitehouse.gov/share/heres-what-digital-divide-looks-united-states?thanks=1&sid=93187656
[2] For example, the 2015 public warning, endorsed by Stephen Hawking and Elon Musk, about the future development of autonomous weapon systems: http://futureoflife.org/open-letter-autonomous-weapons/
[3] Figures (for 1997) taken from the US Census Bureau: https://www.census.gov/prod/2001pubs/p23-207.pdf
[4] Source: Pew Research Center: http://www.pewresearch.org/fact-tank/2014/09/19/census-computer-ownership-internet-connection-varies-widely-across-u-s/