When Security Experts Gather to Talk Consensus, Chaos Ensues

Tension between researchers and vendors over the disclosure of software security vulnerabilities has raged for two decades. A meeting meant to ease that friction instead put it on full display.

Security researchers and vendors have long been locked in a debate over how to disclose security vulnerabilities, and there's little on which the two sides agree. Apparently this extends even to the question of whether they should meet to hash out their disagreements.

That's the conclusion after a coalition of security vendors, academics, lawyers, and researchers gathered at UC Berkeley on Tuesday to discuss how to improve the sometimes-hostile system for reporting software vulnerabilities.

But the diverse group of participants had a hard time even agreeing on the purpose of the meeting: Was it to draft a charter for best practices in reporting software vulnerabilities? Was it to reform parts of the Digital Millennium Copyright Act and Computer Fraud and Abuse Act to make them less hostile to researchers? Or was it to develop guidelines for companies interested in launching bug bounty programs?

The participants hit another sticking point when they tried to determine if they should hold a second meeting. "I spent $2,000 [to come to this meeting]," Dave Aitel, CEO and founder of the Florida-based security firm Immunity, told attendees. Whether or not there's a second meeting "should at least be an option" for discussion.

Organized by the National Telecommunications and Information Administration (NTIA), a division of the US Commerce Department, the six-hour meeting marked one of the government's first forays into the controversial world of bug reporting. But not all of the participants entirely welcomed the government's involvement—some pointed out that a government that withholds zero-day vulnerabilities from software vendors in order to exploit them in adversaries' systems is not exactly in a position to tell researchers and vendors how to handle disclosure.

Some participants also privately told WIRED they suspected the meeting might simply be the first step in yet another government attempt to regulate software research.

"The DMCA has already created a chilling effect on some research," one participant, who asked to remain anonymous, said. "The Wassenaar agreement is [also] a problem. This is the Commerce Department. What makes you think they won't take [information gathered from this meeting] to Congress [to get legislation passed]?"

The 1998 Digital Millennium Copyright Act (DMCA) has often been used by companies to threaten researchers who reverse-engineer software and products to find vulnerabilities. The Wassenaar Arrangement, meanwhile, is an international arms-control agreement that calls for export controls on the sale and trafficking of certain types of surveillance software; the Commerce Department has drafted US export rules to comply with it, rules that security professionals say would thwart security research and bug disclosures.

But Allan Friedman, the NTIA’s director of cybersecurity, assured the crowd, which included those watching a livestream over the Internet, that the Commerce Department's role was simply to facilitate a discussion between stakeholders—not to impose solutions.

Many Companies Are New to the Security Disclosure World

Whether or not concerns about government regulation are reasonable, the gathering did achieve one important thing. It brought together traditional companies that have been dealing with disclosure issues for years—such as Google, Microsoft, and Oracle—with representatives from companies like General Motors and Honda that are new to the world of security disclosures. The auto industry has awoken to the vulnerability issue only recently, following several high-profile hacks of automobiles by researchers. And at least one person was there representing the medical industry, which has had its own run-ins with researchers recently.

All of this is a sign that the definition of "software vendor" has expanded in recent years to include the makers of products that previously contained no digital code. It also highlights the need for these new players to learn from the start how to avoid making the same mistakes their predecessors made in dealing with researchers.

"Everyone is a software company [today]," Josh Corman, CTO of Sonatype, a firm that develops enterprise software tools, told the audience. "Everyone is going to grapple with this problem [of vulnerability disclosure at some point]." Unfortunately, he noted, "99 percent of the people tracking with this issue are at day zero" in their understanding of how to deal with researchers and software vulnerabilities.

Many of these new players are in the position that Microsoft and other companies were 15 years ago, when vendors saw researchers as adversaries rather than assets.

"Fifteen years ago, friends got cease-and-desist letters from Microsoft and framed them," Corman said. "Now Microsoft is giving six-figure [bug] bounties. That mean-time to enlightenment took 15 years, so do not expect that these people who are in the 99 percent [will] wake up overnight. They will have a steep learning curve." He's hopeful, however, that companies entering the arena today will learn faster than their predecessors. "We want to compress that mean-time to enlightenment from 15 years to maybe three."

But if comments made at the meeting are any indication, both sides still have to overcome negative perceptions of each other.

Members of the audience snickered, for example, when a representative from the auto industry pleaded that researchers should consider "safety" when testing for vulnerabilities. At a bar after the event, some attendees said automakers are the ones who don't seem concerned about consumer safety when they sell cars that haven't been pen-tested for vulnerabilities or when it takes them five years to fix a known vulnerability.

And when Corman sought community support for new companies entering the bug bounty arena, some attendees responded with derision. He noted that after United Airlines launched its bug bounty program this year—the first for the airline industry—it suffered backlash from the security community instead of support.

"United took a baby step and put their toe in the water, [and] the research community bit it, like a piranha, down to the bone," he said. "It really scared other companies."

But Neal Krawetz, founder of Hacker Factor Solutions, pointed out that "50 percent of the United bug bounty program's announcement was warnings about how they would sue you if you did certain things."

The Rift Between Security Researchers and Vendors Runs Deep

This is not the first time someone has attempted to resolve the issues between researchers and vendors. The rift between them goes back nearly two decades.

In 2000, a prominent hacker and researcher who went by the name Rain Forest Puppy crafted a "full disclosure" policy for publishing information about security holes that hackers and other researchers discovered. Back then, it wasn't unusual for a researcher to disclose a vulnerability to a software maker or web site owner, only to be ignored—or to be served a letter accusing them of illegally hacking or reverse-engineering the software or system. Some researchers fought back by bypassing vendors altogether and simply disclosing information about holes directly to the public, through the media or conference presentations. This kind of approach embarrassed vendors, but it also made them more likely to fix the hole and leave the researcher alone.

Puppy's disclosure policy proposed that researchers should reveal vulnerabilities to vendors before publishing them, but vendors would be required to respond within five business days, or the researcher would go public. The vendor didn’t have to fix the vulnerability within that time—it could negotiate a reasonable timeframe for doing so—but it had to at least acknowledge the bug report and respond politely during that time, or the researcher would be free to disclose the information to the public.
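For readers who want the mechanics spelled out, here is a minimal sketch in Python of the timeline logic the policy describes. The five-business-day window comes from the policy itself; the function names and date arithmetic are illustrative assumptions, not part of the original RFPolicy text.

```python
from datetime import date, timedelta

# Per the policy described above: the vendor has five business days to respond.
BUSINESS_DAYS_TO_ACKNOWLEDGE = 5

def add_business_days(start: date, days: int) -> date:
    """Advance a date by the given number of business days (Mon-Fri)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 are Monday through Friday
            days -= 1
    return current

def may_disclose_publicly(reported_on: date, vendor_acknowledged: bool, today: date) -> bool:
    """Under this sketch of the policy, the researcher is free to publish
    once the vendor has let the five-business-day window lapse without
    acknowledging the report. An acknowledgment starts a negotiation over
    the fix timeline instead."""
    deadline = add_business_days(reported_on, BUSINESS_DAYS_TO_ACKNOWLEDGE)
    return (not vendor_acknowledged) and today > deadline

# Hypothetical example: a report filed on Monday, Sept. 7, with no vendor
# response becomes publishable after the deadline of Monday, Sept. 14.
print(may_disclose_publicly(date(2015, 9, 7), False, date(2015, 9, 15)))  # True
```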

This was in the days before bug bounty programs, when security pros were volunteering their skills for free to improve vendors' products. In exchange, researchers hoped for public acknowledgement and thanks, and a boost to their resumes. But it didn’t work out this way. Instead, the history of computer security became littered with researchers put through the wringer over what they considered to be Good Samaritan acts.

This problem has partly been alleviated by the growth in bug bounty programs offered by vendors like Google and Microsoft to pay researchers for vulnerabilities they uncover in software. Such programs, for the most part, have made it much easier for researchers to report vulnerabilities, get them fixed in a timely manner, and receive fair treatment from vendors.

But this isn't always the case. Some vendors still react to researchers with mistrust and hostility, as evidenced by the recent legal altercation between two security firms after one uncovered vulnerabilities in the other's product. And some researchers push boundaries when they conduct their research, as evidenced by the recent case involving a researcher who told the FBI he had hacked airline networks while inflight.

All of which is to say that many of the original issues around disclosure that plagued the community two decades ago remain the same: Vendors sometimes still take too long to patch vulnerabilities, ignore researchers altogether, or threaten legal action. And there's often still a lot of tension when researchers threaten to publicly disclose vulnerabilities.

Rather than offering solutions to these problems, a number of people in the crowd urged attendees not to try to come up with answers yet, but instead to use this and future meetings to first listen to all sides and develop a broad understanding of researchers' and vendors' concerns.

First, though, they need to agree on whether they'll even have a second meeting.