Jagged Thoughts | Dr. John Linwood Griffin

August 8, 2013

KB1YBA/AE, or How I Spent My Weekend In Vegas

Filed under: Opinions,Reviews — JLG @ 8:15 PM

Last week I attended Black Hat USA 2013, BSidesLV 2013, and DEF CON 21 in fabulous Las Vegas, Nevada, thanks to the generosity of my employer offering to underwrite the trip.  Here were the top six topics of discussion from the weekend:

1. Not surprisingly, PRISM was the main topic of conversation all weekend.  Depending on your perspective, PRISM is either

“an internal government computer system used to facilitate the government’s statutorily authorized collection of foreign intelligence information from electronic communication service providers under court supervision” (Director of National Intelligence statement, June 8, 2013)

or

“a surveillance program under which the National Security Agency [NSA] vacuums up information about every phone call placed within, from, or to the United States [and where] the program violates the First Amendment rights of free speech and association as well as the right of privacy protected by the Fourth Amendment” (American Civil Liberties Union statement, June 11, 2013)

The NSA director, Gen. Keith Alexander, used the opening keynote at Black Hat to explain his agency’s approach to executing the authorities granted by Section 215 of the USA PATRIOT Act and Section 702 of the Foreign Intelligence Surveillance Act.  His key points were:

  • The Foreign Intelligence Surveillance Court (FISC) does not rubber-stamp decisions, but rather is staffed with deeply-experienced federal judges who take their responsibilities seriously and who execute their oversight thoroughly.  Along similar lines, Gen. Alexander stated that he himself has read the Constitution and relevant federal law, that he has given testimony both at FISC hearings and at Congressional oversight hearings, and that he is completely satisfied that the NSA is acting within the spirit and the letter of the law.
  • Members of the U.S. Senate, as well as executive branch agencies, have audited (and will continue to audit) the NSA’s use of data collected under Section 215 and Section 702.  These audits have not found any misuse of the collected data.  He offered that point as a rebuttal to the argument that the Government can abuse its collection capability—i.e., the audits show that the Government is not abusing the capability.
  • Records collected under Section 215 and Section 702 are clearly marked to indicate the statutory authority under which they were collected; this indication is shown on screen (a “source” field for the record) whenever the records are displayed.  Only specially trained and tested operators at the NSA are allowed to see the records, and only a small number of NSA employees are in this category.  The collected data are not shared wholesale with other Government agencies but rather are shared on a case-by-case basis.
  • The NSA has been charged with (a) preventing terrorism and (b) protecting U.S. civil liberties.  If anyone can think of a better way of pursuing these goals, they are encouraged to share their suggestions at ideas@nsa.gov.

In the end I was not convinced by Gen. Alexander’s arguments (nor, anecdotally speaking, was any attendee I met at either Black Hat or DEF CON).  I walked away from the keynote feeling that the NSA’s collection of data is an indiscriminate Government surveillance program, executed under a dangerous and unnecessary veil of secrecy, with dubious controls in place to prevent abuse of the collected data, abuse that would violate the civil rights of U.S. citizens.  In particular, if this program had existed on September 11, 2001, I harbor no doubt that the statutory limits (on the use or visibility of collected data) would have been exceeded in the legislative and executive overreaction to the attacks.  This forbidden fruit is just too ripe and juicy.  As such I believe the Section 215 and Section 702 statutory limits will inexorably be exceeded if these programs—i.e., the regular exercise of the federal Government’s technical capability to indiscriminately collect corporate business records about citizen activities—continue to exist.

I do appreciate how the NSA is soliciting input from the community on how the NSA could better accomplish its antiterrorism directive.  Unfortunately, Pandora’s Box is already open; I can’t help but feel disappointed that my Government chose secretly to “vacuum up information” as its first-stab approach to satisfying the antiterrorism directive.  As I wrote in a comment on the Transportation Security Administration (TSA)’s proposed rule to allow the use of millimeter wave scanning in passenger screening:

I fly monthly.  Every time I fly, I opt out of the [millimeter wave] scanning, and thus I have no choice but to be patted down.  I shouldn’t have to submit to either.  In my opinion and in my experience, the TSA’s intrusive searching of my person without probable cause is unconstitutional, period.

I appreciate that the TSA feels that they’re between a rock and a hard place in responding to their Congressional directive, but eroding civil liberties for U.S. citizens is not the answer to the TSA’s conundrum.  Do better.

Eroding civil liberties for U.S. citizens is not the answer to the NSA’s conundrum.  Do better.

2. Distributed Denial of Service (DDoS) attacks.  There were at least five Black Hat talks principally about DDoS, including one from Matthew Prince on how his company handled a three hundred gigabit per second attack against a customer.  The story at that link is well worth reading.  Prince’s talk was frightening in that he forecast how next year we will be discussing 3 terabit/sec, or perhaps 30 terabit/sec, attacks that the Internet will struggle to counter.  The DDoS attacks his company encountered required both misconfigured DNS servers (open DNS resolvers that can contribute to a DNS amplification attack; there are over 28 million such servers on the Internet) and misconfigured networks (those that do not prevent source address spoofing; Prince reports there are many such networks on the Internet).
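The first ingredient is easy to test for: an open resolver is simply a DNS server that will answer recursive queries from anyone who asks.  Here is a minimal sketch of that check (my illustration, not Prince’s), assuming the dnspython package is installed; the server address below is a placeholder, and you should only probe servers you operate:

```python
# Minimal open-resolver check.  Assumes the dnspython package; the server
# address is a placeholder -- only probe DNS servers you operate.
import dns.exception
import dns.resolver

def is_open_resolver(server_ip: str, test_name: str = "example.com") -> bool:
    """Return True if server_ip recursively resolves a name it is
    (almost certainly) not authoritative for."""
    resolver = dns.resolver.Resolver(configure=False)  # ignore /etc/resolv.conf
    resolver.nameservers = [server_ip]
    resolver.lifetime = 3.0  # give up after three seconds
    try:
        resolver.resolve(test_name, "A")
        return True
    except dns.exception.DNSException:
        return False

if __name__ == "__main__":
    print(is_open_resolver("192.0.2.53"))  # TEST-NET-1 placeholder address
```

An open resolver by itself is only half the recipe; the amplification comes from the attacker spoofing the victim’s address as the query source, which is exactly the traffic a properly configured network (per BCP 38 ingress filtering) refuses to forward.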

Another interesting DDoS talk was Million Browser Botnet by Jeremiah Grossman and Matt Johansen.  The essence is that you write your botnet code in JavaScript, then buy JavaScript ads on websites…resulting in each reader of those websites becoming a node in your personal botnet (as long as they’re displaying the page).  Wow.

3. CreepyDOL.  To quote the Ars Technica article about this work: “You may not know it, but the smartphone in your pocket is spilling some of your deepest secrets to anyone who takes the time to listen. It knows what time you left the bar last night, the number of times per day you take a cappuccino break, and even the dating website you use. And because the information is leaked in dribs and drabs, no one seems to notice. Until now.”  The researcher, Brendan O’Connor, stalked himself electronically to determine how much information an attacker could glean by doing the same.  In his presentations he advanced a twofold point:

(a) it’s easy to track people when they enable wifi or Bluetooth on their phones (since the phone sprays out its MAC address while trying to connect over those protocols), and

(b) many services (dating websites, weather apps, Apple iMessage registration, etc.) leak information in clear text that can be correlated with the MAC address to figure out who’s using a particular device.

O’Connor’s aha! moment was realizing that he could build a $50 device to do this tracking.  You could put 100 of these around a city and do a pretty good job of figuring out where a person of interest is and/or where that person goes, for relatively cheap ($5,000) and with no need to submit official auditable requests through official channels.
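For a sense of how little machinery this takes, here is a sketch of the passive Wi-Fi piece, assuming the scapy package and a wireless interface already placed in monitor mode; the interface name is a placeholder and this is my illustration of the idea, not O’Connor’s code:

```python
# Passive Wi-Fi probe-request logging (illustration only -- not CreepyDOL's code).
# Assumes scapy is installed, root privileges, and a wireless interface that has
# already been put into monitor mode; "wlan0mon" is a placeholder name.
from datetime import datetime

from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq

def log_probe(pkt) -> None:
    if pkt.haslayer(Dot11ProbeReq):
        mac = pkt.addr2                               # MAC address of the probing phone
        ssid = (getattr(pkt, "info", b"") or b"").decode(errors="replace")
        print(f"{datetime.now().isoformat()}  {mac}  probing for {ssid or '<any network>'}")

if __name__ == "__main__":
    # Every probe request leaks the device's MAC address and, often, the names
    # of networks the device has previously joined.
    sniff(iface="wlan0mon", prn=log_probe, store=False)
```

Correlate those MAC addresses with the cleartext application traffic O’Connor describes and you essentially have his $50 sensor.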

The researcher also brought up an excellent point about how the chilling effect of draconian laws like the Computer Fraud and Abuse Act (CFAA) makes it impossible for legitimate computer security researchers to perform their beneficial-to-society function.  If the CFAA had existed in the physiomechanical domain in the 1960s then Ralph Nader could have faced time in federal prison for the research and exposition he presented in Unsafe At Any Speed: The Designed-In Dangers of The American Automobile—and consumers might never have benefitted from the decades of safety improvements illustrated in this video.  Should consumer risks in computer and network security systems be treated any differently than consumer risks in automotive systems?  I’m especially curious to hear arguments from anybody who thinks “yes.”

4. Home automation (in)security.  In a forthcoming blog post I will describe the astonishingly expensive surprise we incurred this summer to replace the air conditioner at our house.  (“What’s this puddle under the furnace?” asked Evelyn.  “Oh, it’ll undoubtedly be a cheap and easy fix,” replied John.)

As part of the work we had a new “smart” thermostat installed to control the A/C and furnace.  The thing is a Linux-based touchscreen, and is amazing—I feel as though I will launch a space shuttle if I press the wrong button—and of course it is Wi-Fi enabled and comes with a service where I can control the temperature from a smartphone or from a web application.

And, of course, with great accessibility come great security concerns.  Once the thermostat was up and running on the home network I did the usual security scans to see what services seemed to be available (short answer: TCP ports 22 and 9999).  Gawking at this new shuttle control panel got me interested in where the flaws might be in all these automation devices, and sure enough at Black Hat there were a variety of presentations on the vulnerabilities that can be introduced by consumer environmental automation systems.
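For the curious, “the usual security scans” means nothing fancier than checking which TCP ports accept a connection.  Here is a minimal sketch of that check using only the Python standard library; the thermostat address and the port list are placeholders:

```python
# Minimal "what's listening?" check using only the standard library.
# The address and port list are placeholders for whatever device you own.
import socket

HOST = "192.168.1.50"          # placeholder address for the thermostat
PORTS = [22, 80, 443, 9999]    # a few ports worth trying

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS:
        state = "open" if port_is_open(HOST, port) else "closed or filtered"
        print(f"{HOST}:{port} is {state}")
```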

Clearly, home automation and/or camera-enabled insecurity was a hot topic this year.  I was glad to see that the installation manual for our new thermostat (not camera-enabled, I think) emphasizes that it should be installed behind a home router’s firewall; it may even have checked during installation that it received an RFC 1918 private address to ensure that it wasn’t directly routable from the greater Internet.
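That last check is easy to reproduce yourself; here is a sketch using the standard-library ipaddress module (the address shown is a placeholder):

```python
# Check whether an address falls in one of the RFC 1918 private ranges.
import ipaddress

RFC1918_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(address: str) -> bool:
    addr = ipaddress.ip_address(address)
    return any(addr in network for network in RFC1918_NETWORKS)

print(is_rfc1918("192.168.1.50"))   # True  -- placeholder thermostat address
print(is_rfc1918("8.8.8.8"))        # False -- publicly routable
```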

5. Femtocells and responsible disclosure.  Two years ago I wrote about research that demonstrated vulnerabilities in femtocells (a.k.a. microcells), the little cellular base stations you can plug into your Internet router to improve your cellular reception in dead zones.  This year, Doug DePerry and Tom Ritter continued the femtocell hacking tradition with a talk on how they got root access on the same device I use at home.  The researchers discovered an HDMI port on the bottom of the device, hidden under a sticker, and sussed out that it was actually an obfuscated USB port that provided console access.  Via this console they were able to modify the Linux kernel running on the device and capture unencrypted voice and SMS traffic.  The researchers demonstrated both capabilities live on stage, causing every attendee to nervously pull out and turn off their phones.  They closed by raising the interesting question of why any traffic exists unencrypted at the femtocell—why doesn’t the cellular device simply create an encrypted tunnel to a piece of trusted back-end infrastructure?  They also asked why deploy femtocells at all, instead of simply piggybacking an encrypted tunnel over ubiquitous Wi-Fi.

Regarding responsible disclosure, the researchers notified Verizon in December 2012 about the vulnerability.  Verizon immediately created a patch and pushed it out to all deployed femtocells, then gave the researchers a green light to give the talk as thanks for their responsible disclosure.  Several other presenters reported good experiences with having responsibly disclosed other vulnerabilities to other vendors, enough so that I felt it was a theme of this year’s conference.

6. Presentation of the Year: “Adventures in Automotive Networks and Control Units” by Charlie Miller and Chris Valasek.  It turns out it’s possible to inject traffic through the OBD-II diagnostic port that can disable a vehicle’s brakes, stick the throttle wide open, and turn the steering wheel without any driver input.  Miller and Valasek showed videos of all of this happening on a Ford Escape and a Toyota Prius that they bought, took apart, and reverse engineered.  It’s only August but their work gets my vote for Security Result of 2013.  Read their 101-page technical report here.
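The delivery mechanism is mundane: the OBD-II port under the dashboard exposes the car’s CAN buses, and an ECU generally has no way to tell a legitimate diagnostic frame from an injected one.  Here is a sketch of putting a single frame on a CAN bus with the python-can package over Linux SocketCAN; the interface name is a placeholder, the frame shown is a benign standard OBD-II “vehicle speed” query rather than anything from Miller and Valasek’s report, and you should not send arbitrary frames to a car you care about:

```python
# Send one frame onto a CAN bus reached through an OBD-II adapter, using
# python-can over Linux SocketCAN.  "can0" is a placeholder interface name;
# the frame is a benign OBD-II request (mode 0x01, PID 0x0D: vehicle speed),
# NOT any of the commands described in Miller and Valasek's report.
import can

def query_vehicle_speed() -> None:
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    request = can.Message(
        arbitration_id=0x7DF,                          # OBD-II functional (broadcast) ID
        data=[0x02, 0x01, 0x0D, 0x00, 0x00, 0x00, 0x00, 0x00],
        is_extended_id=False,
    )
    try:
        bus.send(request)
        reply = bus.recv(timeout=1.0)                  # an ECU answers on 0x7E8-0x7EF
        print("Reply:", reply)
    finally:
        bus.shutdown()

if __name__ == "__main__":
    query_vehicle_speed()
```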

Wow.

I mean, wow.  The Slashdot discussion of their work detailed crafty ways that this attack could literally be used to kill people.  The exposed risks are viscerally serious; Miller showed a picture from testing the brake-disabling command wherein he crashed uncontrolled through his garage, crushing a lawnmower and causing thousands of dollars of damage to the rear wall.  (In a Black Hat talk the day before, Out of Control: Demonstrating SCADA Device Exploitation, researchers Eric Forner and Brian Meixell provided an equally visceral demonstration of the risks of Internet-exposed and non-firewalled SCADA controllers by overflowing a real fluid tank using a real SCADA controller right on stage.  I for one look forward to this new age of security-of-physical-systems research where researchers viscerally demonstrate the insecurity of physical systems.)

Regardless, other than those gems I was unimpressed by this year’s Black Hat (overpriced and undergood) and felt “meh” about DEF CON (overcrowded and undernovel).  Earlier in the year I was on the fence about whether to attend the Vegas conferences, having been underwhelmed last year, which prompted my good friend and research partner Brendan to observe that if I had a specific complaint about these conferences then I should stop whining about it and instead do something to help fix the problem.  In that spirit I volunteered to be on the DEF CON “CFP review team,” in hopes that I could help shape the program and shepherd some of the talks.  Unfortunately I was not selected to participate (not at all surprising, since I work indirectly for The Man).

In my offer to volunteer I included these specific suggestions toward improving DEF CON, many of which are equally relevant to improving Black Hat:

I’d like to see the DEFCON review committee take on more of a “shepherding” role, as is done with some academic security conferences — i.e., providing detailed constructive feedback to the authors, and potentially working with them one-on-one in suggesting edits to presentations or associated whitepapers.

I think there are things the hacker community can learn from the academic community, such as:

* You have to answer the core question of why the audience should care about your work and its implications.

* It’s one thing to present a cool demo; it’s another to organize and convey enough information that others can build upon your work.

* It only strengthens your work if you describe others’ related work and explain the similarities and differences in your approach and results.

Of course there are plenty of things the academic community can learn from the hacker community!  I’m not proposing to swoop in with a big broom and try to change the process that’s been working fine for DEFCON for decades.  In fact I’m curious to experience how the hacker community selects its talks, so I can more effectively share that information with the academic community.  (For example I spoke at last year’s USENIX Association board meeting on the differences between the USENIX annual technical conference and events like DEFCON, Shmoocon, and HOPE, and I commented on lessons USENIX could take away from hacker cons.)

But each year I’ve been disappointed at how little of a “lasting impression” I’ve taken away from almost all of the DEFCON talks.  A “good presentation” makes me think about how *I* should change the way I approach my projects, computer systems, or external advocacy. I wish more DEFCON talks (and, frankly, Black Hat talks) were “good presentations.”  I’m willing to contribute my effort to your committee to help the community get there.

One academic idea you might be able to leverage is that whenever you’re published, you’re expected to serve on program committees (or as a reviewer) in future years.  (The list of PC members is usually released at the same time as the CFP, and the full list of PC members and reviewers is included in the conference program, so it’s both a service activity and resume fodder for those who participate.)  So perhaps you could start promulgating the idea that published authors at BH and DC are expected to do service activities (in the form of CFP review team membership) for future conferences.

Finally, KB1YBA/AE in the title of this post refers to the culmination of a goal I set for myself eighteen years ago.  There are three levels of achievement (three license classes) that you can attain as a U.S. ham radio enthusiast:

  • Technician class.  Imagine, if you will, a much younger John.  In 1995, at the urging of my old friend K4LLA, I earned technician (technically “technician plus”) class radiotelephone privileges by passing both a written exam and a 5-words-per-minute Morse code transcription exam.  [Morse code exams are no longer required to participate in amateur radio at any level in the United States.]  At that time I set a goal for myself that someday I would pass the highest-level (extra class) exam.
  • General class.  In 2011, at the urging of my young friend K3QB, I passed the written exam to earn general class privileges.  With each class upgrade you are allowed to transmit on a broader range of frequencies.  With this upgrade I was assigned the new call sign KB1YBA by the Federal Communications Commission.
  • Amateur Extra class.  At DEF CON this year, with the encouragement of my former colleague NK1B, I passed the final written exam and earned extra class privileges.  It will take a few weeks before the FCC assigns me a new call sign, but in the meantime I am allowed to transmit on extra-class frequencies by appending an /AE suffix onto my general-class call sign when I identify myself on-air.  For example: “CQ, CQ, this is KB1YBA temporary AE calling CQ on [frequency].”

I don’t mean to toot my own horn, but it feels pretty dang good to fulfill a goal I’ve held for half my life.  Only 17% [about 120,000] of U.S. ham radio operators have earned Amateur Extra class privileges.