Hacker Group Unveils Critical Attacks, Accused of Drawing a ‘Road Map for the Bad Guys’

Photography: Ebrahim Norouzi, IIPA/AP Photo

Iranian technicians work at the Bushehr nuclear power plant in 2010. A computer worm known as Stuxnet is believed to have damaged Iran’s nuclear program.

For the past few months, a team of computer hackers has engaged in an aggressive form of cyber subversion. Its work has enraged critics and led to accusations that it is endangering people’s lives.

It’s not Anonymous. The group causing the uproar is a team of professional researchers that has embraced an extreme approach to cyber security.

Dale Peterson, who runs a Sunrise (Fla.)-based security company called Digital Bond, has recruited six elite hackers to help him show the world how to sabotage the computer systems that run nuclear power generators, oil and gas pipelines, and other “critical infrastructure”—in the name of improving their safety.

The volunteers have gone beyond probing networks and working to fix vulnerabilities they find. They are writing programs to attack weaknesses and releasing those programs on the Internet, making them available on Digital Bond’s website.

The public release of the software has led to a fierce debate in the security world between Peterson and other experts critical of exposing techniques that could bring vital industrial systems to a standstill. “It’s never, never been justifiable from a security viewpoint, or a business viewpoint, or any other viewpoint, to give away that exploit code,” says John Pescatore, a security expert and vice president at technology researcher Gartner.

Peterson, who has been securing control systems for 12 years, says he’s not trying to harm people. He argues that releasing attack code, or “exploit” code, is necessary to shock infrastructure operators into upgrading their security. Peterson also says he was alarmed by a computer attack on Iran’s nuclear program and by what he sees as the slow response of utilities and other infrastructure operators to mounting threats.

Photograph: Digital Bond

Dale Peterson has enraged critics by releasing exploit code on the Web.

“We tried to play nice, we worked inside the community, we were very careful about sharing anything that would demonstrate how this could be done,” Peterson says. “But last year it was ten years after 9/11, and we looked around and that approach had made very little progress.”

His group has found weaknesses in top-selling “controllers” that are ubiquitous in industrial settings. The software Peterson’s group has developed uses vulnerabilities in the controllers’ design to shut them down or gain administrative passwords. The programs are small. One that uses a brute force attack to deduce passwords, for instance, fits in 131 short lines of code.
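To illustrate why such a tool can stay that small, here is a minimal, purely illustrative sketch of a brute-force password search in Python. It is not Digital Bond’s program and targets nothing real: submit_guess is a hypothetical stand-in for however a candidate password would be checked against a lab device or test service.

    import itertools
    import string
    from typing import Optional

    def submit_guess(candidate: str) -> bool:
        # Hypothetical stand-in: a real tool would send each guess to a device
        # over its own network protocol. Here it simply compares against a
        # hard-coded value representing a lab test target.
        return candidate == "ab12"

    def brute_force(alphabet: str, max_length: int) -> Optional[str]:
        """Try every combination up to max_length; return the first accepted password."""
        for length in range(1, max_length + 1):
            for combo in itertools.product(alphabet, repeat=length):
                candidate = "".join(combo)
                if submit_guess(candidate):
                    return candidate
        return None

    if __name__ == "__main__":
        found = brute_force(string.ascii_lowercase + string.digits, max_length=4)
        print("recovered:", found)

The search loop itself is just a few lines; presumably most of the remaining length in a working tool goes to handling the device’s network protocol.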

The danger with controllers is they form a bridge between computers and physical machinery, and they are everywhere, from prisons to chemical plants to manufacturing facilities. Just as a clicker tells a garage door to open or close, the controllers keep industrial machinery moving—or not.

The controllers tell prison gates when to open and shut, for example, or how fast assembly-line motors should run, or even whether centrifuges should spin faster or shut down. In 2010 an Iranian nuclear facility was believed damaged by a malware attack that exploited that capability.
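As a rough sketch of that bridge between commands and physical machinery, the toy model below (hypothetical names, in Python) accepts network-style commands and flips simulated outputs; real controllers run vendor firmware and ladder logic, not anything like this.

    class ToyController:
        """Toy model of an industrial controller: commands in, physical actions out."""

        def __init__(self) -> None:
            self.gate_open = False      # stands in for a relay wired to a gate motor
            self.motor_speed_rpm = 0    # stands in for a variable-speed drive output

        def handle_command(self, command: str, value: int = 0) -> None:
            # A real controller receives commands like these over an industrial
            # network protocol, often with weak or no authentication, which is
            # the exposure the researchers are highlighting.
            if command == "OPEN_GATE":
                self.gate_open = True
            elif command == "CLOSE_GATE":
                self.gate_open = False
            elif command == "SET_MOTOR_SPEED":
                self.motor_speed_rpm = value

    plc = ToyController()
    plc.handle_command("SET_MOTOR_SPEED", 1750)
    plc.handle_command("OPEN_GATE")
    print(plc.gate_open, plc.motor_speed_rpm)   # True 1750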

A hacking team based in Virginia reported last year that it was allowed to inspect a correctional institution—it won’t say which one—and found that the same type of vulnerable controller attacked in Iran was used extensively throughout the facility. Guards were spotted using Web-based e-mail on internal networks, creating an opening for attack. According to the group, if hackers broke into the network, they could manipulate the controllers to open and close doors, suppress alarms, and disrupt video surveillance feeds.

Also last year, Mocana, a San Francisco-based security firm, reported that it had found multiple serious vulnerabilities in controllers used throughout the substations of an unspecified Southern California power company, which had commissioned the testing. Mocana said it took just one day to find the weaknesses.

Code from a program that uses brute force to hack controller passwords.

Peterson’s work exposes deep divisions within the security community about how to protect critical infrastructure. It has touched a nerve because the controllers can be used for decades and are not easily updated with software patches, unlike PCs or smartphones. They are typically built with reliability—not security—in mind, and fixing them often means replacing them.

Hackers have long used public pressure to force uncooperative technology companies to listen to their security complaints. But Peterson’s group takes it a step further.

The attack code is being released with the help of Rapid7, a Boston-based firm that makes Metasploit, which is software that companies use to probe their networks for security holes. Because Metasploit is open-source, it accepts additions from the outside, such as those from Peterson’s group. Like some other legitimate security software, Metasploit can serve a dual role, as criminals can also use it to locate attack targets.

HD Moore, chief architect of Metasploit and Rapid7’s chief security officer, says the rationale behind such software is that the bad guys are already ahead.

“We’ve been chasing what the bad guys have been doing and not the other way around for a long time now,” he said in an interview. “At this point we’re only trying to level the playing field.”

Peterson’s group began releasing the code in January. The targeted machines are made by some of the biggest names in the industrial controller industry: General Electric, Rockwell Automation, Schweitzer Engineering Laboratories, Japan’s Koyo Electronics Industries, and France’s Schneider Electric.

No known attacks have used the techniques created by the team, though the code is easy to find with a Google search of the names and models of the affected devices. Still, the work has been harshly criticized.

Even among those who believe it’s important to highlight the vulnerabilities of key systems, Peterson’s approach has elicited a backlash. Gartner’s Pescatore, for instance, says he strongly supports using “shock tactics,” such as publicizing a system’s weakness, to pressure companies that refuse to fix their security holes. Giving out attack code, though, is “over the line,” he says. “It has invariably been ambulance-chasing consultancies or small-time companies that come in and want to put out the fire after they’ve caused the fire.”

Joe Weiss, who runs Applied Control Solutions, a Cupertino (Calif.) security consultancy focused on control systems, says the research highlights an urgent problem but makes it too easy for criminals to exploit the findings.

“Why are you giving the bad guys a road map to what to do? There’s no reason for this,” he says. “The vendors aren’t going to turn around and all of a sudden become good guys. Those vendors that care about security are going to make changes, and those that don’t won’t change anything. So you are putting users in peril, and what have you accomplished?”

Peterson and his team did not contact the targeted companies about their work, he says, because most of the weaknesses were known issues.

“We didn’t see anything in there that was dramatically new or unknown,” Peterson says. “Most of the easily exploited vulnerabilities are simply features of the products or issues that have been known for a long time and have been ignored.”

Implanting malicious software inside such controllers would be just one step in a real-world attack. Detailed knowledge of the physical plants would also be needed, which is typically the realm of insiders and intelligence agents. The Stuxnet malware used against Iran followed extensive rules that prevented it from deploying inside anything other than a specific Iranian nuclear facility, suggesting the authors had deep knowledge of the facility’s computer systems.

Peterson acknowledges that the disclosures might help some low-level criminals. Sophisticated adversaries, however, likely already know the vulnerabilities, he says.

Digital Bond

The exploit code in action, scouring a GE-made controller for passwords.

“It wouldn’t be intellectually honest if we didn’t acknowledge that someone with limited to moderate skills could do something bad with this if they chose to,” Peterson says. “But an organization with resources could figure this out themselves. I don’t think we’re really helping any state-sponsored efforts, or any efforts that are truly motivated to cause trouble, because this stuff is not that hard.”

Ed Schweitzer, founder of Schweitzer Engineering Laboratories, says the weaknesses Peterson’s team found in a product his company makes, the SEL-2032 Communications Processor, were “functionally inconsequential.” Exploiting the nearly $3,000 processor, which is used inside power substations, requires other security mechanisms to fail first, he says.

Schweitzer says a few customers called, but he did not lose business. His company might disclose the issues in the product’s manual in the future but isn’t planning any changes to the product itself.

Schweitzer also accused Peterson’s group of “grandstanding about stuff that doesn’t matter.”

“My dad used to have an expression: ‘It’s like a farmer shaving a pig—there’s a lot of noise and not much wool,’” Schweitzer says. “If you go out to the security perimeter and take care of business, you’re going to be OK.”

Rockwell Automation has started a “detailed analysis” of its products in response to the research and is considering making changes, according to John Bernaden, a company spokesman.

The other targeted companies either declined to comment or didn’t return messages. It’s not known which energy utilities and other firms use the specific controllers Peterson’s group has studied. The Edison Electric Institute, an industry group for electric companies, and the Department of Homeland Security did not respond to messages.

ODVA, an industry group for makers of controllers and other automation equipment, says it is working with its members to evaluate security improvements to the network communication standards that Peterson’s crew exploited.

Peterson rejects the accusation that the project is a public-relations ploy. He says his company has only six employees and is “typically overbooked.”

“It’s really too early to say whether this will have an impact or not,” Peterson says. “We’re still hopeful.”
