ProPublica

Journalism in the Public Interest

Remember Stuxnet? Why the U.S. is Still Vulnerable

Years after the world’s scariest computer virus attack, not much has changed.

A high-pressure gas pipeline in Southern California. The Department of Homeland Security revealed a rash of cyber attacks on natural gas pipeline companies last week. (Mike Nelson/AFP/Getty Images)

Last week, the Department of Homeland Security revealed a rash of cyber attacks on natural gas pipeline companies. Just as with previous cyber attacks on infrastructure, there was no known physical damage. But security experts worry it may only be a matter of time.

Efforts to protect pipelines and other critical systems have been halting despite broad agreement that they're vulnerable to viruses like Stuxnet — the mysterious worm that caused havoc to Iran's nuclear program two years ago.

The Frankenstein-like virus infected a type of industrial controller that is ubiquitous — used around the world on everything from pipelines to the electric grid.

Experts say manufacturers haven't fixed security flaws in these essential but obscure devices.

Why hasn't more been done? Here's why Stuxnet remains a top national security risk.

Q. What is Stuxnet, anyway?

Stuxnet first made headlines when it burrowed into computers that controlled uranium centrifuges in Iran's renegade nuclear program. Its self-replicating computer code is usually transmitted on flash drives anyone can stick into a computer. Once activated, the virus made Iran's centrifuges spin out of control while making technicians think everything was working normally — think of a scene in a bank heist movie where the robbers loop old security camera footage while they sneak into the vault.
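The "looped security footage" trick can be sketched in a few lines of code. This is purely an illustration of the concept — nothing here resembles Stuxnet's actual code, and all names are made up: a compromised controller records one healthy sensor reading, keeps showing it to operators, and quietly drives the equipment out of its safe range.

```python
# Toy illustration (not Stuxnet's real code): an attacker sitting between
# a controller and its sensors replays a stale "normal" reading to the
# operator while secretly overdriving the device.

SAFE_RPM = 1000

class Centrifuge:
    """Hypothetical device: just stores and reports its spin speed."""
    def __init__(self):
        self.rpm = SAFE_RPM

    def set_rpm(self, rpm):
        self.rpm = rpm

    def read_rpm(self):
        return self.rpm

class CompromisedController:
    """Records one healthy reading, then replays it forever — the
    looped-security-camera-footage trick — while sabotaging the device."""
    def __init__(self, device):
        self.device = device
        self.recorded_reading = device.read_rpm()  # captured while healthy

    def report_to_operator(self):
        return self.recorded_reading  # stale, but looks normal

    def sabotage(self):
        self.device.set_rpm(SAFE_RPM * 1.4)  # out-of-spec speed

plant = Centrifuge()
ctrl = CompromisedController(plant)
ctrl.sabotage()

print("operator display:", ctrl.report_to_operator())  # still shows 1000
print("actual speed:", plant.read_rpm())               # really 1400.0
```

The point of the sketch is only that the operator's display and the physical state are two separate data paths — once an attacker controls the reporting path, the display proves nothing.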

Q. Who created it?

Whoever knows the answer to this isn't telling — but if cybersecurity researchers, the Iranian government and vocal Internet users are to be believed, the two prime suspects are the U.S. and Israeli governments.

Q. How does it work?

Stuxnet seeks out little gray computers called programmable logic controllers, or PLCs. The size and shape of a carton of cigarettes, PLCs are used in industrial settings from pretzel factories to nuclear power plants. Unfortunately, security researchers say the password requirements for the devices are often weak, creating openings that Stuxnet (or other viruses) can exploit. Siemens made the PLCs that ran Iran's centrifuges; other makers include Modicon and Allen-Bradley. Once introduced via computers running Microsoft Windows, Stuxnet looks for a PLC it can control.

Q. How big is the problem?

Millions of PLCs are in use all over the world, and Siemens is one of the top five vendors.

Q. After Iran, did Siemens fix its devices?

Siemens released a software tool for users to detect and remove the Stuxnet virus, and encourages its customers to install the fixes Microsoft issued for Windows soon after the Iran attack became public (most PLCs are programmed from computers running Windows). It is also planning to release a new piece of hardware for its PLCs, called a communications processor, to make them more secure — though it's unclear whether the new processor will fix the specific problems Stuxnet exploited. Meanwhile, the firm acknowledges its PLCs remain vulnerable — in a statement to ProPublica, Siemens said it was impossible to guard against every possible attack.

Q. Is Siemens alone?

Logic controllers made by other companies also have flaws, as researchers from NSS labs, a security research firm, have pointed out. Researchers at a consulting firm called Digital Bond drew more attention to the problem earlier this year when they released code targeting commonly used PLCs using some of Stuxnet's techniques. A key vulnerability is password strength — PLCs connected to corporate networks or the Internet are frequently left wide open, Digital Bond CEO Dale Peterson says.
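Why weak or default passwords matter is easy to show. The sketch below is hypothetical — the password list and login function are stand-ins, and real PLC protocols differ — but the brute-force logic against a network-reachable device really is this simple.

```python
# Hedged illustration: a device left on a factory-default or common
# password falls to a trivial dictionary sweep. The defaults list and
# try_login() are made up; real attack tools work on the same principle.

COMMON_DEFAULTS = ["", "admin", "password", "1234", "plc"]

def try_login(device_password, guess):
    # Stand-in for a real network login attempt against a controller.
    return guess == device_password

def crack(device_password):
    """Return the first common password that works, or None."""
    for guess in COMMON_DEFAULTS:
        if try_login(device_password, guess):
            return guess
    return None

print(crack("admin"))       # a factory default falls immediately
print(crack("x7Qp2mRz9t"))  # a random passphrase survives this list
```

This is why researchers like Peterson single out password strength: a PLC that is both reachable over a network and protected by a default credential is effectively unprotected.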

Q. What makes these systems so tough to protect?

Like any computer product, industrial control systems have bugs that programmers can't foresee. Government officials and security researchers say critical systems should never be connected to the Internet — though they frequently are. But having Internet access is convenient and saves money for companies that operate water, power, transit and other systems.

Q. Is cost an issue?

System manufacturers are reluctant to patch older versions of their products, government and private sector researchers said. Utility companies and other operators don't want to shell out money to replace systems that seem to be working fine. Dan Auerbach of the Electronic Frontier Foundation, formerly a security engineer at Google, says the pressure on tech companies to quickly release products sometimes trumps security. "There's an incentive problem," he said.

Q. What's the government doing?

The Department of Energy and the Department of Homeland Security's Computer Emergency Readiness Team, or CERT, work with infrastructure owners, operators and vendors to prevent and respond to cyber threats. Researchers at government-funded labs also assess threats and recommend fixes. But government agencies cannot — and do not attempt to — compel systems vendors to fix bugs.

The only national cybersecurity regulation is a set of eight standards approved by the Federal Energy Regulatory Commission — but these only apply to producers of high-voltage electricity. A Department of Energy audit last year concluded the standards were weak and not well implemented.

Q. So is Congress weighing in?

Cybersecurity has been a much-debated issue. Leading bills, including the Cyber Intelligence Sharing and Protection Act, would enable government and the private sector to share more threat information. But while CISPA and other bills give the Department of Homeland Security and other agencies more power to monitor problems, they all take voluntary approaches.

"Some of my colleagues have said nothing will change until something really bad happens," said Peterson, whose consulting firm exposed vulnerabilities. "I'm hoping that's not true."

Q. What does the Obama administration want?

The White House has called for legislation that encourages private companies to notify government agencies after they've faced cyber intrusions, and recommends private companies secure their own systems against hackers. But the White House stops short of calling for mandatory cybersecurity standards for the private sector.

I wouldn’t even mention CISPA (a punchline I was sure was in here) in this context.  It wouldn’t help.

The solution to these problems is in the answer to the fifth question:  Everybody needs to keep their software updated, period.  CISPA is shockingly and dangerously silent on this, the single biggest tool in defending a system.

(Think of software updates like vaccines without the weird animal parts and mysterious preservatives:  When you update your computers, you’re not only reducing the risk of infection, but also reducing the spread of infection.  Anybody remotely concerned about cybersecurity attacks absolutely should keep 100% up to date, no excuses.)

Also, right now, the power to investigate security is in the wrong hands.  Security researchers (the good guys) get hit with DMCA copyright infringement notices to silence them.  Malevolent hackers (the bad guys) are not doing anything illegal when they sell exploits that will become tomorrow’s viruses.  CISPA continues this tradition, rather than reversing it.

(What CISPA does very well is give companies incentives to deny you service and report you as a criminal for being their customer.  I’m sure it totally won’t lead to a list of everybody who reads WikiLeaks.)

And, tinfoil hat firmly on my head, I’m SURE it’s a complete coincidence that these attacks are suddenly revealed at a time when CISPA and other fake cybersecurity bills are going before the Senate.  Those sorts of things are always easy to cover up, right?  It’s not like anybody is watching the energy companies for interesting news…

Crazy suggestion - legislate that government agencies may not create or aid in the creation of viruses, worms etc.

Won’t happen, because the US government likes having the ability.  It just doesn’t realise that by using Stuxnet it has said the use of cyber attacks is a legitimate tool of state.

It’s a bit tough taking the moral high ground after demonstrating that “might makes right”.

Oh… I notice the reference to ‘Iran’s renegade nuclear program’ and damage done there. In whose goddamn opinion is Iran’s program illegitimate?

Here’s hoping America suffers a thousand times over. Go and rot in Hell all of you!

Above “John” isn’t me.  I agree that the Iranian program is legitimate (especially the parts Stuxnet attacked), but happen to think “rot in Hell” is reserved for people who suggest it.

Stephen, it goes deeper than that.  Anybody who’s not a “serious” (for lack of a better term) programmer wouldn’t realize this, but almost any non-trivial programming or security research (the former of which needs to be done, the latter of which should be) aids in the creation of malware.

I do know what you mean, and I’m not trying to dismiss your point, but it’d be like asking biologists and doctors to restrict their research to things that can’t be used for biowarfare.

You’re on the right track, though.  If it was up to me, I’d say that they shouldn’t be allowed to keep private any exploit that falls into their hands.  It must be reported to the target’s supplier (i.e., Microsoft, Facebook, Apple) within a week and to the public within, say, two months.  I’d argue that should be everybody’s responsibility, with criminal negligence charges filed for anybody who sits on or (worse) resells the information if an attack happens.

To the John who did wish Americans to “rot in Hell!” - your ignorance is disgusting and atrocious. The vitriol you spew is laughable at best. I suspect that many of the novelties that you enjoy in your pathetic existence are the direct result of many Americans, our products and our inventions. Keep your hate speech to yourself unless you have some point worth making… one of my favorite quotes for people like you is, “It’s better to keep your mouth shut and be thought a fool than to open your mouth and prove it.”

To Stephen, here is the flaw in your logic… until you can come up with a way to make it so that ALL governments cannot do that AND to lock down the Internet so that no rogue hacking organization can either, preventing our government from being able to create them is also preventing our government from being able to protect against them. The reason that the first nuclear bomb was created was to end a war… the reason the rest of them have been created is to serve as a deterrent.

I’d like to suggest a revision to your style guide: please reserve the use of the word “frankenstein” for describing machinery or software that deceives by giving the illusion of being a living person.  Hyperventilating hyperbole doesn’t help the public understand.

You also could have mentioned “air-gap” security, which means disconnecting the networks that connect PLCs from the public network.  That is a very commonly practiced security measure for the gizmos, like centrifuges and conveyor belts, controlled by PLCs.  It’s inconvenient, but not nearly as inconvenient as ruined equipment.

Fixing this equipment is not as simple as downloading the latest updates to your favorite smart phone app.  Some of this PLC stuff has been deployed for decades. Siemens’ creation of a communication controller probably will help a lot for those who can deploy it, but plenty of people can’t.

Finally, the attack method Stuxnet used—impairing the feedback loop and reporting pathways for centrifuge RPM—was no doubt specifically designed for those Iranian uranium hexafluoride centrifuges.  Would a generic attack work on, say, natural gas pipeline compressor RPM control?  It’s not terribly likely, but it’s possible.

I think it’s vital to do the best job possible of writing these electronic warfare stories. Please keep up the good parts of your work and turn down the “frankenstein” rhetoric.