Archive for February, 2008

Daily Progress: Security code easy hacking for UVa student

Thursday, February 28th, 2008

The Daily Progress has an article about Karsten Nohl’s work on analyzing RFID tag security: Security code easy hacking for UVa student, 28 February 2008.

… Projects such as hacking the security code of an RFID chip are the “evil twin” of Nohl’s regular research, he said, which focuses on the development of cryptographic algorithms for computer security.

Nohl said that a more secure option for RFID security codes would be to rely on publicly known and time-tested security algorithms. NXP’s secret code, he said, is an example of “security by obscurity,” or the practice of keeping the code private and hoping hackers do not figure it out. Private algorithms, Nohl said, are more likely to have flaws and vulnerabilities.

“We found significant vulnerabilities in their algorithm,” he said. “By keeping it secret, they hurt themselves in the end.”

[Added 1 March] The story also appears in The Danville Register (Hackers claim they broke key security code). Blog reports include PogoWasRight and LiquidMatrix Security Digest.

[Added 2 March]: More reports: Xenophilia, WAVY-TV.

Group Demonstrates Security Hole in World’s Most Popular Smartcard

Tuesday, February 26th, 2008

UVaToday has an article about Karsten Nohl’s work on reverse engineering the cryptographic algorithms on the Mifare Classic RFID tag:

… The idea of keeping secret the design of a security system is known in the trade as “security by obscurity.” It almost never works; the secret invariably leaks out and then the security is gone, Evans and Nohl said.

As a result, most security professionals espouse Kerckhoffs’ Principle — first published by the Dutch cryptographer Auguste Kerckhoffs in 1883 — the idea that the design of all security systems should be fully public, with the security dependent only on a secret key. Public review of security designs also tends to catch flaws during the design process, rather than after the flaws are built into expensive deployed systems, such as the Netherlands transit system, noted Nohl and Evans.

… If more consumers understand the fundamental flaw of “proprietary security algorithms” and other marketing-speak that touts what amounts to security by obscurity, then manufacturers may start opening up more of their security designs to the light of public scrutiny, which will ultimately result in better security in our digital age.

Full article: Group Demonstrates Security Hole in World’s Most Popular Smartcard, UVaToday, February 26, 2008.

Frozen in Memory

Friday, February 22nd, 2008

A group at Princeton has released an interesting paper showing that encryption keys can be read from DRAM even after power is lost: Lest We Remember: Cold Boot Attacks on Encryption Keys

The research team includes Joseph Calandrino, who was a UVa undergraduate student, as well as J. Alex Halderman, Seth Schoen, Nadia Heninger, William Clarkson, William Paul, Ariel Feldman, Jacob Appelbaum, and Edward Felten.

Most encrypted disks (any drive whose key is held in the host’s DRAM) are likely vulnerable to this attack. The work provides further support for moving more processing to the disk itself: if the disk’s own processor performs all encryption and decryption, the key never needs to enter host memory, where this research shows it is difficult to protect.
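The disk-side alternative can be sketched with a toy model. Everything below is invented for illustration: the class name, the sector layout, and the SHA-256-based keystream (standing in for real AES in a drive’s firmware); the point is only that the key lives inside the “drive” object and is never handed to the host.

```python
import hashlib

class SelfEncryptingDisk:
    """Toy model of a self-encrypting drive (illustration only).

    The key is held only inside this object, standing in for the
    drive's own processor; the host calls read/write and never sees
    the key, so a cold-boot dump of host DRAM would not contain it.
    A SHA-256-based keystream stands in for real AES.
    """
    def __init__(self, key: bytes, sectors: int, sector_size: int = 512):
        self._key = key                          # never leaves the "drive"
        self._store = [bytes(sector_size)] * sectors
        self._size = sector_size

    def _keystream(self, sector: int) -> bytes:
        # Derive a per-sector keystream from the key and sector number.
        out, counter = b"", 0
        while len(out) < self._size:
            out += hashlib.sha256(
                self._key
                + sector.to_bytes(8, "big")
                + counter.to_bytes(8, "big")
            ).digest()
            counter += 1
        return out[: self._size]

    def write(self, sector: int, plaintext: bytes) -> None:
        padded = plaintext.ljust(self._size, b"\x00")
        ks = self._keystream(sector)
        self._store[sector] = bytes(a ^ b for a, b in zip(padded, ks))

    def read(self, sector: int) -> bytes:
        ks = self._keystream(sector)
        return bytes(a ^ b for a, b in zip(self._store[sector], ks))

disk = SelfEncryptingDisk(key=b"secret key material", sectors=4)
disk.write(0, b"top secret data")
assert disk.read(0).rstrip(b"\x00") == b"top secret data"
assert b"top secret" not in disk._store[0]   # at rest: ciphertext only
```

The host code here holds only the `disk` handle, not the key, which is exactly the property the cold-boot attack exploits on conventional software disk encryption.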

[Added 23 Feb]: New York Times article

What Every Human Should Know About Security

Friday, February 22nd, 2008

I gave a talk in cs290 (which is our weekly undergraduate seminar) on “What Every Human Should Know About Security”.
My slides are available here: [PPT (warning: 18MB)] [PDF].

CMU: Study shows dangers in Facebook apps

Tuesday, February 12th, 2008

CMU’s newspaper, The Tartan, has an article on Adrienne Felt’s Facebook platform privacy work: Study shows dangers in Facebook apps, The Tartan, 11 February 2008.

Each day, students are bombarded with requests to become a Greek god, a Disney princess, and the biggest brain — on Facebook, that is. Over 15,000 Facebook applications exist today, offering a variety of capabilities to the social networking website. However, according to a new study from the University of Virginia, users risk losing their privacy by simply rating their 10 hottest friends or discovering their ideal desperate housewife.

It even includes a comic featuring nudity!

[Added 23 Feb] Cornell’s The Ithacan also has an article: Facebook applications access personal information, February 21, 2008.

Should Facebook preemptively protect users against rogue apps?

Friday, February 8th, 2008

Jonathan Zittrain, Professor of Internet Governance and Regulation at the Oxford Internet Institute, has an interesting blog post about Adrienne Felt’s work on Facebook platform privacy: Should Facebook preemptively protect users against rogue apps?.

It is worth reading the whole article, but here are a few excerpts:

Enterprising UVa senior Adrienne Felt has developed an intriguing argument about privacy for Web 2.0 apps like those on the Facebook development platform. It will get lots of news coverage, much of it boiling down to reports that don’t capture the richness of the problem.

But there is another difference at work: partly because of technology and partly because of historical inertia, Facebook can more obviously be asked to play a gatekeeper role with its apps than an OS maker can with its desktop apps. Felt’s solution to the problem she identifies is to have Facebook run interference — serve as a proxy — between most apps and the data they presumably don’t really need. The app can say to Facebook, “Display the user’s birthday in the upper right corner of the screen,” without having to know the user’s birthday. Only in a few instances, they say, must an app really access the data in order to work.

Social networks are rightly recognized as powerful, even transformative. The ability for unaccredited third parties to write apps that users can run to access their data and do cool things with it further leverages their power. The wild card of the platform makers’ power over those apps creates a range of options simply not available to the OS makers that preceded Web 2.0 and are being put out of business by it.
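The proxy idea in the excerpt above (an app asking the platform to “display the user’s birthday” without ever learning the birthday) can be sketched as simple server-side placeholder substitution. The placeholder syntax, field names, and user data below are all made up for illustration; Facebook’s actual mechanism differed in its details.

```python
import re

# Private data held by the platform, never sent to the app (toy example).
USER_DATA = {"name": "Alice", "birthday": "April 1"}

def app_render() -> str:
    # The third-party app produces markup that *references* private
    # fields via placeholders but never receives their values.
    return "Happy birthday, {{name}}! Your big day is {{birthday}}."

def platform_render(app_markup: str, user_data: dict) -> str:
    # The platform, acting as a proxy, substitutes the real values
    # just before the page is sent to the user's browser.
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: user_data.get(m.group(1), ""),
        app_markup,
    )

page = platform_render(app_render(), USER_DATA)
assert page == "Happy birthday, Alice! Your big day is April 1."
assert "Alice" not in app_render()  # the app's output holds no private data
```

The user sees a fully personalized page, while the app developer only ever handles opaque placeholders.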

Harvard Crimson: Study Finds Privacy Lapse in Facebook Apps

Friday, February 8th, 2008

From Study Finds Privacy Lapse in Facebook Apps, The Harvard Crimson, 8 February 2008:

Playing Jetman on Facebook.com may cause you to lose more than just the game. Your private information is also at stake.

Facebook application developers—who can be anybody—are unnecessarily given full access to both users’ and their friends’ private information, according to a University of Virginia study.

Slashdotted: Facebook Platform Privacy

Thursday, February 7th, 2008

Slashdot has an article on Adrienne Felt’s Facebook platform privacy work:
Facebook Sharing Too Much Personal Data With Application Developers.

Adrienne Felt interviewed on Utah NPR

Tuesday, February 5th, 2008

Adrienne Felt was interviewed on KCPW Midday Utah:

Users of the popular social networking website called Facebook should be concerned about security, according to Adrienne Felt. As a senior in the School of Engineering and Applied Science at the University of Virginia where she specializes in computer security, her research shows that when users download a Facebook application – a program that allows the user to interact with other users – privacy is compromised.

The KCPW site has audio: http://www.kcpw.org/article/5281. It’s quite an in-depth interview (about 20 minutes long).

Voting Machines and Secret Recounts

Tuesday, February 5th, 2008

I (David Evans) was interviewed for this story on NBC 29 news about a lawsuit challenging the use of DRE (software-only) voting machines as unconstitutional because they do not support non-secret counting:

Should the Voting Machines be Scrapped?, WVIR (NBC 29) TV News, 4 February 2008. (Includes Video)