Dates
Friday, October 07, 2016 - 02:30pm to Friday, October 07, 2016 - 04:30pm
Location
Room 120 (105 Seats)
Event Description

Information Control in the Digital Age
Internet users in many countries around the world are subject to various forms of censorship and information control, and many governments restrict their citizens' access to information. Fostering free and open communication on the Internet first depends on understanding the nature of Internet censorship and control; given a deeper understanding of censorship, we can then design systems to circumvent it. Any measurement exercise involving censorship is more complex than meets the eye: even within a single country, censorship changes over time and varies across regions. To capture these complex dynamics, measurements of Internet censorship must be both continuous and distributed across a large number of vantage points. Gathering such data also requires grappling with many ethical issues.

I will first discuss a system called Encore that performs continuous, large-scale measurements of Web filtering. Encore harnesses cross-origin requests to measure Web filtering from a diverse set of vantage points without requiring users to install custom software, enabling longitudinal measurements. I will explain how Encore induces Web clients to perform cross-origin requests that measure Web filtering, describe the design of a distributed platform for scheduling and collecting these measurements, show the feasibility of a global-scale deployment with a pilot study and an analysis of potentially censored Web content, identify several cases of filtering in six months of measurements, and discuss ethical concerns that would arise with widespread deployment.

I will then discuss how certain countries, such as China, go beyond simple Web filtering to actively search for and block circumvention software such as Tor. The active probing system we discovered monitors the network for suspicious traffic, then actively probes the corresponding servers and blocks any that are determined to be running circumvention services such as Tor. Our study draws on multiple forms of measurement, some spanning years, to illuminate the nature of this probing. We identify the different types of probes, develop fingerprinting techniques to infer the physical structure of the system, localize the sensors that trigger probing (showing that they differ from the Great Firewall infrastructure), and assess probing's efficacy in blocking different versions of Tor.
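
As an illustration of the cross-origin measurement idea, the minimal sketch below shows how a page served by a measurement origin might ask a visitor's browser to load a resource from a potentially filtered site and report the outcome. This is not Encore's actual implementation; the target and collector URLs are hypothetical placeholders.

```typescript
// Minimal sketch of an Encore-style cross-origin measurement (hypothetical URLs).
const TARGET = "https://example-possibly-filtered.org/favicon.ico"; // hypothetical target resource
const COLLECTOR = "https://measurement-collector.example/report";   // hypothetical collection endpoint

function measure(): void {
  const img = new Image();
  const started = Date.now();

  // Loading an <img> triggers a cross-origin request that the browser performs
  // without special permissions; success or failure hints at whether the
  // visitor's network can reach the target site.
  img.onload = () => report("reachable", Date.now() - started);
  img.onerror = () => report("unreachable", Date.now() - started);
  img.src = TARGET + "?cachebust=" + Math.random();
}

function report(outcome: string, elapsedMs: number): void {
  // Send the observation back to the measurement platform.
  navigator.sendBeacon(COLLECTOR, JSON.stringify({ target: TARGET, outcome, elapsedMs }));
}

measure();
```

A single failed load is ambiguous (the target may simply be down, or the browser may block the request for unrelated reasons), which is part of why continuous measurements from many vantage points, and the ethical questions around running them in visitors' browsers, matter.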

Bio:

Nick Feamster is a professor in the Computer Science Department at Princeton University and the Acting Director of the Princeton University Center for Information Technology Policy (CITP). Before joining the faculty at Princeton, he was a professor in the School of Computer Science at Georgia Tech. He received his Ph.D. in Computer Science from MIT in 2005, and his S.B. and M.Eng. degrees in Electrical Engineering and Computer Science from MIT in 2000 and 2001, respectively. His research spans many aspects of computer networking and networked systems, with a focus on network operations, network security, and censorship-resistant communication systems. In December 2008, he received the Presidential Early Career Award for Scientists and Engineers (PECASE) for his contributions to cybersecurity, notably spam filtering. His honors include the Technology Review 35 "Top Young Innovators Under 35" award, the ACM SIGCOMM Rising Star Award, a Sloan Research Fellowship, the NSF CAREER award, the IBM Faculty Fellowship, the IRTF Applied Networking Research Prize, and award papers at the SIGCOMM Internet Measurement Conference (measuring Web performance bottlenecks), SIGCOMM (network-level behavior of spammers), NSDI (fault detection in router configuration), and USENIX Security (circumventing Web censorship using Infranet, and Web cookie analysis).

Event Title
DLS & CSE 600: Nick Feamster from Princeton