Security B-Sides is an odd duck of a workshop series. Are you:
- Traveling to attend (or already living near) a major commercial security conference (RSA in San Francisco, Black Hat in Las Vegas, or SOURCE in Boston)?
- Not particularly interested in attending any of the talks in the commercial security conference you’ve already paid hundreds of dollars to attend?
- Unconcerned with any quality control issues that may arise in choosing a conference program via upvotes on Twitter?
Then you should attend B-Sides.
Okay, so it’s not as grim as I lay out above. Earlier this month I attended Security B-Sides Boston (BSidesBOS 2013) on USSJoin’s imprimatur. I felt the B-Sides program itself was weak, the hallway conversations were good, the keynotes were great, and the post-workshop reception was excellent.
But if I were on the B-Sides steering committee, I would have B-Sides take place either immediately before or immediately after its symbiotic commercial conference. Academic conferences often pair a “core” conference with one-day workshops held immediately before, immediately after, or both, so that attendees can participate optionally, without separate travel and without missing any of the conference they’ve already paid hundreds of dollars to attend.
My takeaways from the B-Sides workshop came from the two keynote talks. The talk by Dr. Dan Geer (chief information security officer at In-Q-Tel) was one of the best keynotes I’ve ever seen. Some of his thought-provoking points included:
- It’s far cheaper to keep all your data than to do selective deletion. He implied that there is an economic incentive at work whose implications we need to understand: as long as it’s cheaper to just keep everything (disks are cheap, and now cloud storage is cheap), people are going to just keep everything. I’d thought about the save-everything concept before, but not from an economic perspective (see the back-of-envelope sketch after this list).
- When network intrusions are discovered, the important question is often “how long has this been going on?” instead of “who is doing this?” He implied that recovery is often more important than identifying the adversary (i.e., most people just want to revert affected systems to a known-good state, make sure that known holes are plugged, and move forward). And the time scales can be staggering: he noted a Symantec report finding that the average zero-day exploit is in use for 300 days before it is discovered.
- Could the U.S. corner the vulnerability market? Geer made the fascinating suggestion that the U.S. buy every vulnerability on the market (offering 10 times market rates if needed) and immediately release them publicly. His goal is to collapse the information asymmetry that has built up around the economics of selling zero-day attacks. He pined for the halcyon days when zero-day attacks were discovered by hobbyists and released for fun (leading to “market efficiency,” where everyone was on the same playing field when making technology decisions), rather than today’s market, where they are sold for profit (leading to asymmetry, where known vulnerabilities are no longer public).
- “Security is the absence of unmitigatable surprise. Privacy is where you have the effective capacity to misrepresent yourself. Freedom in the context of the Internet is the ability to reinvent yourself when you want.” He suggested that each of us should have as many distinct, curated online identities as we can manage — definitely an interesting research area. He also made the fascinating suggestion to “try to erase things sometime,” for example by creating a Facebook profile… and then later trying to delete it and all references to it.
- Observability is getting out of control and is not coming back. He commented that facial recognition is viable at 500 meters, and iris identification at 50 meters.
- All security technology is dual use; technology itself is neutral and should be treated as such. During my early days as a government contractor I similarly railed against the automatic (by executive order) Top Secret classification applied to cyber weaponry and payloads — because doing so puts the knowledge out of reach of our network security defenders. As it turns out, One Voice Railing usually isn’t the most effective way to change entrenched bureaucratic thinking. (I haven’t really figured out what is.)
- “Your choice is one big brother or many little brothers. Choose wisely.” This closing line is open to deep debate and interpretation; I’ve already had several interesting conversations about what Geer meant and what he’s implying. My position is that his earlier points (e.g., observability is out of control and is not coming back) demonstrate that we’ve already crossed the Rubicon of “no anonymity, no privacy” — without even realizing it — and that it’s far too late to go back to a time where no brother will watch you. Can anything be done? I’m very interested in continuing to debate this question.
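To make Geer’s keep-everything economics concrete, here is a back-of-envelope sketch in Python. Every number in it is an illustrative assumption of mine; none of the figures came from the talk:

```python
# Back-of-envelope comparison: keep everything vs. delete selectively.
# Every figure below is an illustrative assumption, not data from the talk.

DATA_TB = 100                      # corpus size in terabytes (assumed)
STORAGE_PER_TB_MONTH = 20.0        # $/TB-month, rough disk/cloud figure (assumed)
YEARS = 5

# Option 1: keep everything for the whole period.
keep_cost = DATA_TB * STORAGE_PER_TB_MONTH * 12 * YEARS

# Option 2: selective deletion -- a human has to decide what can go.
REVIEW_HOURS_PER_TB = 10           # analyst hours to classify one TB (assumed)
HOURLY_RATE = 75.0                 # loaded labor cost in $/hour (assumed)
FRACTION_DELETED = 0.5             # share of data actually deleted (assumed)

review_cost = DATA_TB * REVIEW_HOURS_PER_TB * HOURLY_RATE
delete_cost = review_cost + (1 - FRACTION_DELETED) * keep_cost

print(f"Keep everything:    ${keep_cost:,.0f}")    # $120,000
print(f"Selective deletion: ${delete_cost:,.0f}")  # $135,000
```

Under these made-up numbers, the labor of deciding what to delete already swamps the storage savings, which is exactly the incentive Geer was pointing at.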
Mr. Josh Corman (director of security intelligence at Akamai) gave the second keynote. Some of his interesting points included:
- Our dependence on software and IT is growing faster than our ability to secure it. Although this assertion isn’t new, it always raises an interesting debate: if you can’t secure the software, then what can you do instead? (N-way voting? Graceful degradation? Multiple layers of encryption or authentication? Auditing and forensic analyses? Give up? A minimal sketch of the voting idea appears after this list.) A professor I knew gave everybody the root password on his systems, under the theory that, since he knew the machines were insecure, he would only use them as flawed tools rather than as vital pieces of infrastructure. Clearly the professor’s Zen-like approach won’t solve everyone’s security conundrums, but its simplicity and power make me think there are alternative, unexplored, powerful ways to mitigate the imbalance between insecure and increasingly critical computer systems.
- HDMoore’s Law: Casual attacker power grows at the rate of Metasploit. This observation was especially interesting: not only do defenders have to worry about an increase in vulnerabilities, but they also need to worry about a rising baseline of attacker sophistication as open-source security-analysis tools grow in capability and complexity.
- “The bacon principle: Everything’s better with bacon.” His observation here is that it is especially frustrating when designers introduce potential vulnerability vectors into a system for no useful reason. As an example, he asks why an external medical device needs to be configurable over Bluetooth when the device (a) doesn’t need to be reconfigured frequently and (b) could just as easily be configured over a wired [less permissive] connection. The only thing Bluetooth (“bacon”) adds to such a safety-critical device is insecurity.
- Compliance regulations set the bar too low. Corman asserts that the industry’s emphasis on PCI compliance (the Payment Card Industry Data Security Standard) means that we put the most resources towards protecting the least important information (credit card numbers). It’s a double whammy: not only is there an incentive to protect only PCI information and systems, but there is no incentive to do better than the minimal set of legally compliant protections.
- Is it time for the security community to organize and professionalize? Corman railed against “charlatans” who draw attention to themselves (for example, by appearing on television) without having meaningful or true things to say. He implied that the security community should work together to define and promulgate criteria, beyond security certifications, that could provide a quality-control function for people claiming to represent security expertise and best practices. (A controversial proposal!) A decade ago I explored related conversations about the need to create licensed professional software engineers, both to incent members of our community to adhere to well-grounded and ethical principles in their practice and to provide the community and the state with engineers who assume responsibility and risk for critical system designs.
- “Do something!” Corman closed by advocating for the security community to come together to shape the narrative of information security — especially in terms of lobbying to influence governmental oversight and regulation — instead of letting other people do the lobbying and define the narrative. He gave the example of unpopular security legislation like SOPA and PIPA: “you can either DDoS it [after the legislation is proposed] or you can supply draft language [to help make it good to begin with].” I felt this was a great message for a keynote talk, especially in how it echoes the influential message I heard from a professor at Carnegie Mellon (Dr. Philip Koopman), who fought successfully against adoption of the Uniform Computer Information Transactions Act and who exhorted me and my fellow students to be the person who stands up and fights on important issues when others remain silent.
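As a concrete illustration of the N-way-voting idea raised under Corman’s first point (my own sketch, not anything he presented): run the same computation through several independently written implementations and accept only a majority answer, so that one buggy or compromised component cannot silently dictate the result. The checksum functions here are hypothetical stand-ins:

```python
from collections import Counter

def n_way_vote(implementations, *args):
    """Run several independent implementations of the same function and
    return the majority result; a single buggy or compromised
    implementation is outvoted rather than trusted."""
    results = [impl(*args) for impl in implementations]
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError(f"no majority among results: {results}")
    return value

# Hypothetical demo: three independently written checksum routines,
# one of which has been tampered with.
def checksum_a(data):
    return sum(data) % 256

def checksum_b(data):
    return sum(int(b) for b in data) % 256

def checksum_c(data):              # the "compromised" implementation
    return (sum(data) + 1) % 256

print(n_way_vote([checksum_a, checksum_b, checksum_c], b"hello"))  # prints 20
```

The interesting design question is cost: you pay for N implementations in order to tolerate roughly (N-1)/2 faulty ones, a trade-off that becomes attractive precisely when securing any single implementation is off the table.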
All in all, not a bad way to spend $20 and a Saturday’s worth of time.