June 2023 Security Topic - Moving Cybersecurity Beyond the System Administrator

Historically in information security we have asked: what can a system administrator do to get employee buy-in for security practices? The more realistic questions are: can a system administrator get that buy-in at all? Are they the best role within the organization to encourage safe practices? Do they have the right skills? Should organizations depend so heavily on system administrators to shape security culture?

On May 14, 2021, the Health Service Executive (HSE) in Ireland was hit by a major ransomware attack. The attack caused disruptions across the entire hospital system, forcing most offices to close and causing delays in patient care (Conti Cyber Attack on the HSE, 2021). A third-party review of the incident revealed a long list of contributing factors that led to the successful ransomware attack. Technology professionals had presented many of these issues to the HSE Board prior to the attack. Despite these sessions, the Board chose not to provide the funding and support for cybersecurity efforts that might have spared them. Did the technology professionals fail to provide accurate technical information? What would make a hospital board ignore stark warnings issued by the people most qualified to give them?

It’s likely that the technology professionals were successful in communicating all the obvious flaws in the hospital system. Hospitals were continuing to use vulnerable operating systems like Windows 7. While the entire medical system was connected, individual hospitals made their own varying risk decisions, decisions that could jeopardize the entire connected system without consideration for other hospitals and their patients. No one was in charge of cybersecurity in the HSE, so there was no formal plan in place to protect the information and medical services of Irish citizens. The problems were many and glaring. And yet the Board was resistant to reasoned arguments (Conti Cyber Attack on the HSE, 2021). What could a system administrator have said to tip the balance?

On January 28, 1986, the space shuttle Challenger was set to launch carrying seven crew members. The morning of the launch was much colder than expected. Space shuttle launches are meticulously planned to prevent accidents, so an unexpectedly low temperature was cause for concern. Despite this, the shuttle lifted off toward the stars at 11:38 a.m. And then, a little over a minute later, with millions of people watching, the shuttle broke apart in the sky. An O-ring seal in one of the solid rocket boosters had failed in the cold temperatures, causing a catastrophic explosion that destroyed the ship and killed the crew. This was a known possibility, and yet Challenger was launched anyway (Challenger Explosion, 2020).

One possible reason was pressure from the Reagan administration for NASA to be more cost-effective. In the 1960s, NASA used sequential safety processes in which each safety check was completed before the next one started. This and many other processes were time-consuming and expensive. They could also result in delayed rocket launches and cancelled missions. President Kennedy bragged in his speech at Rice University that his administration had dramatically increased government spending on space exploration. He insisted that paying the high bill was worth it to make sure it was done right (“Why Go to the Moon?”, 2018). In contrast to President Kennedy, President Reagan ran on making dramatic cuts to government spending (“Reaganomics”, 2016). In the 1980s, NASA restructured these safety processes to happen concurrently, saving time and expense. These changes resulted in fewer launch delays but introduced additional risk into launches (Heimann, 1997).

NASA considered the Challenger launch to be an acceptable risk. The agency cannot eliminate all risk from launching rockets and shuttles; space travel is inherently dangerous. If risk mitigation processes are too expensive, they may prevent any shuttles from launching at all. At some point NASA needed to accept some risk to maintain a functioning program. The fate of those seven crew members was tragic, but had NASA maintained its more robust and expensive safety practices, it may ultimately have lost political support for the program, and that crew might never have had the chance to fly at all (Heimann, 1997). Could a NASA technician have stopped the launch of Challenger based on the technical information they knew?

Both the HSE Board and NASA decided to accept a large amount of risk, with catastrophic consequences. In both situations they had all the information they needed to avoid disaster. So why did they proceed? A reasoned argument from technical professionals and experts was not enough to dissuade them. So why do we find ourselves entreating the system administrator to make the case?

To understand this, we need to leave technical expertise behind and depart the realm of the system administrator. They can’t help us here. We need to look at alternative timelines to the ones experienced by the HSE and NASA. If the HSE Board had been right to avoid the tremendous difficulty and cost of addressing its cybersecurity shortfalls, it would have kept more resources to hold patient costs down, build better facilities, and hire and retain better doctors and medical staff. The risk was high, yes, but the reward was also high.

The same paradigm applies to NASA. If the launch had been successful, it would have solidified even more support for the space program and sent the first teacher into space. The mission would have been an inspiring and historic moment for everyone watching across the world. It is hard to exaggerate the admiration and wonder people experience observing these missions; they captivate and amaze audiences across the planet. Again, the risk was very high, but so was the reward.

Cybersecurity itself is a risk management discipline with little to say about reward. Reward is primarily a business concern. From this perspective, the system administrator is often poorly positioned to affect the safety culture of an organization. No matter how well they understand the risks, they are likely to have a poor understanding of the rewards. People will take incredible amounts of risk for the right rewards. The perception of high rewards can render risk conversations irrelevant.

We can learn this from the HSE Board. They claimed none of the evidence provided was sufficient to make them act or raise the priority of cybersecurity efforts. Risk was communicated to them, but it wasn’t presented in comparison to the rewards. An outcome they couldn’t anticipate, based on their understanding, was a nationally televised incident that temporarily halted medical care throughout Ireland and compromised the medical privacy of large numbers of their patients. This outcome was in some respects worse than the non-cybersecurity disaster scenarios they had trained and practiced for (Conti Cyber Attack on the HSE, 2021). Technology risk didn’t concern them until it was clearly and obviously jeopardizing medical service availability and patient privacy.

The desire for more efficient manned space launches also had unintended consequences. President Reagan ordered a special investigation into the Challenger accident, and for more than two years no further manned launches were attempted. Instead of speeding up NASA’s manned space program, the reduced safety procedures ended up stalling it. What NASA didn’t consider was that the stakes of that mission extended beyond the lives of the astronauts and the immediate fate of funded manned space travel. The Challenger disaster shook the nation, and likely the world. To some degree, it stripped away the wonder of space travel and rendered it terrifying. The preventable nature of the accident was also difficult for the public to accept.

We need more than a system administrator to make the case for safe practices. We need individuals who look beyond systems to potential business and mission impacts. Those individuals are likely to operate at a planning and strategic level within an organization, with a clear view of acceptable risks. They need to understand how realized risks affect the goals and objectives of the organization. That person or group of people needs to be on boards, senior staff, and advisory committees. They need to help define the organizational safety culture from the top and establish risk tolerance. To some extent, we need to leave the system administrators alone and encourage organizational leaders to take up this challenge.

Need to report an IT security event or incident?

To report, please submit a ticket here: Report an IT Security Incident, or call the IT Service Desk at (585) 395-5151 Option 1.

References

Conti Cyber Attack on the HSE. (2021). HSE Board. https://www.hse.ie/eng/services/publications/conti-cyber-attack-on-the-hse-full-report.pdf

Challenger Explosion. (2020, September 11). HISTORY. https://www.history.com/topics/1980s/challenger-disaster

Challenger: President Reagan’s Challenger Disaster Speech—1/28/86 [Video]. (2009, April 2). YouTube. https://www.youtube.com/watch?v=Qa7icmqgsow

Heimann, C. F. L. (1997). Acceptable risks: Politics, policy, and risky technologies. University of Michigan Press. https://doi.org/10.3998/mpub.14948

“Reaganomics”: The Economic Recovery Tax Act of 1981 – The Reagan Library Education Blog. (2016, August 15). https://reagan.blogs.archives.gov/2016/08/15/reaganomics-the-economic-recovery-tax-act-of-1981/

“Why go to the moon?”—John F. Kennedy at Rice University [Video]. (2018, September 12). YouTube. https://www.youtube.com/watch?v=QXqlziZV63k
