Leak of US Military Plans Highlights Shadow IT Issues

Yesterday, The Atlantic magazine revealed an extraordinary national security blunder in the United States. Top US government officials had discussed plans for a bombing campaign in Yemen against Houthi rebels in a Signal group chat which inadvertently included The Atlantic's editor in chief, Jeffrey Goldberg.

Author

  • Toby Murray

    Professor of Cybersecurity, School of Computing and Information Systems, The University of Melbourne

This is hardly the first time senior US government officials have used non-approved systems to handle classified information. In 2009, the then US Secretary of State Hillary Clinton fatefully decided to accept the risk of storing her emails on a server in her basement because she preferred the convenience of accessing them using her personal BlackBerry.

Much has been written about the unprecedented nature of this latest incident. Reporting has suggested the US officials involved may have also violated federal laws that require any communication, including text messages, about official acts to be properly preserved.

But what can we learn from it to help us better understand how to design secure systems?

A classic case of 'shadow IT'

Signal is regarded by many cybersecurity experts as one of the world's most secure messaging apps. It has become an established part of many workplaces, including in government.

Even so, it should never be used to store and send classified information. Governments, including in the US, define strict rules for how national security classified information must be handled and secured. These rules prohibit the use of non-approved systems - including commercial messaging apps such as Signal, as well as cloud services such as Dropbox or OneDrive - for sending and storing classified data.

The sharing of military plans on Signal is a classic case of what IT professionals call "shadow IT".

The term refers to the all-too-common practice of employees setting up parallel IT infrastructure for business purposes without the approval of central IT administrators.

This incident highlights the potential for shadow IT to create security risks.

Government agencies and large organisations employ teams of cybersecurity professionals whose job it is to manage and secure the organisation's IT infrastructure from cyber threats. At a minimum, these teams need to track what systems are being used to store sensitive information. Defending against sophisticated threats requires constant monitoring of IT systems.

In this sense, shadow IT creates security blind spots: systems that adversaries can breach while going undetected, not least because the IT security team doesn't even know these systems exist.

It's possible that part of the motivation for the US officials using shadow IT in this instance was to avoid the scrutiny and record-keeping requirements of official channels. For example, some of the messages in the Signal group chat were set to disappear after one week, and some after four weeks.

However, we have known for at least a decade that employees who build shadow IT systems are usually not trying to weaken their organisation's cybersecurity. Instead, a common motivation is that shadow IT often lets them get their work done faster than the official, approved systems do.

Usability is key

The latest incident highlights an important but often overlooked lesson in cybersecurity: whether a security system is easy to use has an outsized impact on the degree to which it helps improve security.

To borrow from US Founding Father Benjamin Franklin, we might say that a system designer who prioritises security at the expense of usability will produce a system that is neither usable nor secure.

The belief that to make a system more secure requires making it harder to use is as widespread as it is wrong. The best systems are the ones that are both highly secure and highly usable.

The reason is simple: a system that is secure yet difficult to use securely will invariably be used insecurely, if at all. Anyone whose email auto-complete has caused them to send a message to the wrong person will understand this risk. It likely also explains how The Atlantic's editor-in-chief came to be mistakenly added by US officials to the Signal group chat.

While we cannot know for certain, reporting suggests Signal displayed the name of Jeffrey Goldberg to the chat group only as "JG". Signal doesn't make it easy to confirm the identity of someone in a group chat, except by their phone number or contact name.

In this sense, Signal gives relatively few clues about the identities of people in chats. This makes it relatively easy to inadvertently add the wrong "JG" from one's contact list to a group chat.

A highly secure - and highly usable - system

Fortunately, we can have our cake and eat it too. My own research shows how.

In collaboration with Australia's Defence Science and Technology Group, I helped develop what's known as the Cross Domain Desktop Compositor. This device allows secure access to classified information while being easier to use than traditional solutions.

It is easier to use because it lets users access both classified systems and the internet from a single desktop. At the same time, it keeps sensitive data physically separate - and therefore secure - while allowing it to be displayed alongside internet applications such as web browsers.

One key to making this work was employing mathematical reasoning to prove the device's software provided rock-solid security guarantees. This allowed us to marry the flexibility of software with strong hardware-enforced security, without introducing additional vulnerabilities.

Where to from here?

Avoiding security incidents such as this one requires people to follow the rules designed to keep everyone secure. This is especially true when handling classified information, even if doing so takes more work than setting up shadow IT workarounds.

In the meantime, we can avoid the need for people to work around the rules by focusing more research on how to make systems both secure and usable.

The Conversation

Toby Murray receives funding from the Department of Defence. He is Director of the Defence Science Institute, which is funded by the Victorian, Tasmanian and Commonwealth Governments. He previously worked for the Department of Defence.
