Jagged Thoughts | Dr. John Linwood Griffin

November 13, 2009

CCS 2009

Filed under: Reviews — JLG @ 9:31 PM

16th ACM Conference on Computer and Communications Security (CCS’09)
Chicago, Illinois
November 9-13, 2009

CCS is one of the top international security conferences (example topics: detecting kernel rootkits, RFID, privacy and anonymization networks, botnets, cryptography).  It is held annually in November.  This year there were 315 submitted papers from 31 countries, of which 18% were accepted after peer review.

I’ve attended CCS twice (2006 and 2009).  It is one of the best conferences I’ve ever attended — I find that the speakers describe practical, cutting edge, informative results; I keep up with old acquaintances and meet new ones; I keep sharp and up-to-date as a research scientist.

Here are some of the major themes from this year:

* ASCII-compliant shellcode:  My favorite paper of the conference is “English Shellcode”, where the authors developed a tool that takes malicious software as input and converts it into REAL ENGLISH PHRASES (taken from Wikipedia and Project Gutenberg) that execute natively on 32-bit x86.  If you read no other paper this year, read this one; it is flat-out incredible.  There was another paper that uses only valid ASCII characters for shellcode on the ARM architecture.  These demonstrations are important because ASCII (and especially English ASCII) is likely to be passed through by network intrusion detection systems.  The favorite paper is here (a toy sketch of why this is even possible appears after the link):

http://www.cs.jhu.edu/~sam/ccs243-mason.pdf
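
As a toy illustration (my own Python sketch, definitely not the authors’ tool): the observation that makes ASCII shellcode feasible at all is that many printable ASCII byte values happen to decode to useful 32-bit x86 instructions, so a payload built entirely from printable characters can slip past a printable-only filter and still run.

    # Toy sketch only -- not the paper's tool.  A few printable ASCII bytes and
    # the 32-bit x86 instructions they decode to:
    PRINTABLE_X86_OPCODES = {
        0x50: "push eax",          # 'P'
        0x58: "pop eax",           # 'X'
        0x40: "inc eax",           # '@'
        0x48: "dec eax",           # 'H'
        0x68: "push imm32",        # 'h' (the next four bytes are the immediate)
        0x6A: "push imm8",         # 'j'
        0x25: "and eax, imm32",    # '%'
        0x2D: "sub eax, imm32",    # '-'
        0x35: "xor eax, imm32",    # '5'
    }

    def passes_printable_filter(payload: bytes) -> bool:
        """Would this payload slip past a filter that only allows printable ASCII?"""
        return all(0x20 <= b <= 0x7E for b in payload)

    # The classic printable sequence "Ph//shh/bin" decodes to
    # push eax; push "//sh"; push "/bin" -- real instructions, all ASCII:
    print(passes_printable_filter(b"Ph//shh/bin"))   # True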

* Cloud computing:  Few authors of cloud-related papers seemed to address the cloudiness of their work, instead (and disappointingly) discussing generic distributed computing principles under a cloud umbrella.  The best cloud talk I saw was by Ian Foster, an invited speaker at the cloud security workshop, who described the transition from grid computing to cloud computing thus: grid was about federation, cloud is about infrastructure and hosting.  He pointed out that the grid folks did a good job of developing (e.g., medical research) applications and executing analyses, but that the advent of data distribution and sharing in the cloud is the real game-changer.

* Anonymous communication:  There were several talks analyzing the efficacy of anonymization networks (mix networks, remailers, Tor, onion routing).  My takeaway is that these techniques work very well for latency-insensitive traffic (such as email), only moderately well for latency-sensitive traffic (such as web browsing), and not very well yet for high-bandwidth traffic (such as VoIP).  My favorite work was a poster on “Preventing SSL Traffic Analysis with Realistic Cover Traffic” (Nabil Schear and Nikita Borisov), where the authors change the statistical profile of your encrypted traffic so that existing analyses (such as measuring keystroke latencies) no longer work; the general cover-traffic idea is sketched below.
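
Their poster shapes traffic to match a realistic profile; the sketch below (my own toy Python, not their system) shows the simpler constant-rate flavor of cover traffic, where every cell on the wire has the same size and spacing, so packet sizes and timing no longer reveal what the user is typing.

    import queue
    import threading
    import time

    CELL_SIZE = 512        # every cell on the wire is the same size
    SEND_INTERVAL = 0.02   # and cells are sent at a fixed rate (50 per second)

    outbound = queue.Queue()

    def send_cell(payload: bytes) -> None:
        # Stand-in for writing one encrypted record to the SSL connection.
        pass

    def cover_traffic_sender() -> None:
        """Emit one fixed-size cell per interval whether or not real data is
        waiting, so an eavesdropper sees the same traffic pattern either way."""
        while True:
            try:
                data = outbound.get_nowait()
            except queue.Empty:
                data = b""                    # nothing real to send: use a dummy cell
            cell = data[:CELL_SIZE].ljust(CELL_SIZE, b"\x00")
            send_cell(cell)
            time.sleep(SEND_INTERVAL)

    threading.Thread(target=cover_traffic_sender, daemon=True).start()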

* Off-client emulation:  Several speakers described techniques for client-server applications (such as game clients running on customers’ home computers) that help to ensure the correctness, robustness, or speed of the client application.  It’s impractical to run a complete copy of the client on the server (because one server handles many clients), so the authors generally create minimalist versions of the client (for example, a game client that contains no rendering code) that are cheap enough to run server-side.  In the game example, the client would send the user’s commands (“turn left, walk forward”) to the server, where the minimalist client would verify that those commands didn’t result in an invalid state (such as walking through a wall) that would indicate cheating by the player; a toy version of that check is sketched below.
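
Here is a deliberately tiny Python sketch of that idea (my own illustration, not any of the papers’ systems): the server keeps a stripped-down copy of the game state, with no rendering or audio, and replays each player’s command stream to confirm it never produces an illegal state.

    # Hypothetical minimalist "client" running on the server: just enough state
    # to replay movement commands and catch impossible positions.
    WALLS = {(2, 3), (2, 4), (2, 5)}        # grid cells the player can never occupy

    MOVES = {"walk forward": (0, 1), "walk back": (0, -1),
             "step left": (-1, 0), "step right": (1, 0)}

    def replay_commands(start, commands):
        """Re-execute the client's command stream; a move into a wall means the
        real client was modified (i.e., the player is cheating)."""
        x, y = start
        for cmd in commands:
            dx, dy = MOVES[cmd]
            x, y = x + dx, y + dy
            if (x, y) in WALLS:
                raise ValueError(f"illegal state after {cmd!r}: position {(x, y)}")
        return (x, y)

    print(replay_commands((2, 0), ["walk forward", "walk forward"]))   # fine: (2, 2)
    # replay_commands((2, 0), ["walk forward"] * 3) would raise: player inside a wall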

* Function-call graphs:  This is a well-known technique for tracing how an application executes by building a graph of its control flow.  The technique kept popping up during the conference: using call graphs to identify when someone has violated your software license by including your source code in their application, or using them inside a hypervisor to identify when a kernel rootkit is present in a virtual machine because the rootkit produces a different pattern of hypercalls.  One attendee I had lunch with was very critical of the function-call graph technique (using an argument I didn’t really follow), but otherwise the technique seems useful.  A minimal example of recording such a graph appears below.
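
Here is a minimal Python sketch of the data structure (the conference papers build these graphs over native binaries and hypercalls, not Python; this is only to show what a dynamically recorded call graph looks like).

    import sys
    from collections import defaultdict

    call_graph = defaultdict(set)        # caller function name -> set of callees

    def tracer(frame, event, arg):
        # Record an edge every time one function calls another.
        if event == "call":
            callee = frame.f_code.co_name
            caller = frame.f_back.f_code.co_name if frame.f_back else "<root>"
            call_graph[caller].add(callee)
        return tracer

    def helper():
        return 42

    def work():
        return helper() + helper()

    sys.settrace(tracer)                  # record edges while tracing is active
    work()
    sys.settrace(None)

    for caller, callees in sorted(call_graph.items()):
        print(caller, "->", sorted(callees))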

* Power grids:  The currently-hot topic in security research is power grids and smart meters.  There are projects at Penn State, Carnegie Mellon, and Johns Hopkins, and I’m certain at many other places.  There was a tutorial, a paper, and several posters all discussing security issues in the power grid.  The most interesting aspect to me was attacks against state estimators: the researchers described techniques to manipulate the system components involved in measuring and predicting the state of generators, transmission lines, etc.  (The flavor of such an attack is sketched below.)  However, the research community still suffers from a dearth of real-world information about how these networks operate and where the real vulnerabilities might be.
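
To give a sense of the flavor, here is a toy numpy sketch of a false-data-injection attack on a linear state estimator (the measurement matrix and numbers are made up, and this is not necessarily the exact construction presented at the conference): if an attacker can perturb the meter readings by a vector lying in the column space of the measurement matrix, the estimate shifts but the bad-data detector’s residual does not change.

    import numpy as np

    # Toy DC state estimation: meter readings z relate to the state x via z = H x + noise.
    H = np.array([[1.0,  0.0],
                  [0.0,  1.0],
                  [1.0, -1.0]])             # hypothetical: 3 meters, 2 state variables
    x_true = np.array([1.00, 0.95])
    z = H @ x_true + np.random.normal(0, 0.01, size=3)

    def estimate(z):
        # Least-squares state estimate (unit measurement weights for simplicity).
        return np.linalg.lstsq(H, z, rcond=None)[0]

    def residual(z):
        # Bad-data detection flags readings whose residual norm is too large.
        return np.linalg.norm(z - H @ estimate(z))

    c = np.array([0.2, 0.0])                # how far the attacker wants to shift the estimate
    a = H @ c                               # perturbation lying in the column space of H
    print(residual(z), residual(z + a))     # essentially identical residuals...
    print(estimate(z + a) - estimate(z))    # ...yet the estimate moved by roughly c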

* RFID:  As we already know, it is possible to do RFID well, but none of the actual deployed RFID implementations do it well.  One classic observation by a speaker concerned the RFID-enabled driver’s licenses issued in Washington State (in advance of the Winter Olympics): the tags support a KILL command whose PIN is supposed to be set to a unique value but in reality is left at the default…meaning that anyone with a transmitter and sufficient power could kill a device.

* Ethical standards for security researchers:  One paper raised an ethical issue in its appendix (how can we do security research inside Amazon’s cloud computing infrastructure in a manner that doesn’t violate their terms of service?), and some researchers from the Stevens Institute have published a report and are organizing a workshop to investigate ethical standards for security researchers.  I didn’t really agree with many of the points made (my ethical line is drawn much further to the left: security researchers should have few constraints), but it was a hotly debated issue during the session breaks.

Wolfram Schulte of Microsoft Research gave an invited workshop talk on their Singularity OS project (reinventing the OS from scratch, using software-enforced isolation instead of relying on hardware memory-management techniques).  It’s an interesting project but impractical, since it would require such a wide-scale shift by developers that very little development would happen for a while.  The work was inspired by his team’s frustration with using best-practice formal verification (and related) techniques for software development; or, put another way, a blue-sky team found it so frustrating to use existing techniques to develop and prove major software projects that they gave up.  That doesn’t bode well for using those techniques extensively in any real-world software development project (although they can still be very useful and insightful…just frustrating).

Also a shout-out to my student Brendan O’Connor for delivering a well-received talk on stock markets for reputation at the digital identity workshop.