Biblio

Filters: Author is Cox, Landon P.
Razeen, Ali, Lebeck, Alvin R., Liu, David H., Meijer, Alexander, Pistol, Valentin, Cox, Landon P. 2018. SandTrap: Tracking Information Flows On Demand with Parallel Permissions. Proceedings of the 16th Annual International Conference on Mobile Systems, Applications, and Services. 230–242.

The most promising way to improve the performance of dynamic information-flow tracking (DIFT) for machine code is to only track instructions when they process tainted data. Unfortunately, prior approaches to on-demand DIFT are a poor match for modern mobile platforms that rely heavily on parallelism to provide good interactivity in the face of computationally intensive tasks like image processing. The main shortcoming of these prior efforts is that they cannot support an arbitrary mix of parallel threads due to the limitations of page protections. In this paper, we identify parallel permissions as a key requirement for multithreaded, on-demand native DIFT, and we describe the design and implementation of a system called SandTrap that embodies this approach. Using our prototype implementation, we demonstrate that SandTrap's native DIFT overhead is proportional to the amount of tainted data that native code processes. For example, in the photo-sharing app Instagram, SandTrap's performance is close to baseline (1x) when the app does not access tainted data. When it does, SandTrap imposes a slowdown comparable to prior DIFT systems (~8x).
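
The on-demand idea can be sketched at the value level. The snippet below is a minimal, hypothetical Python illustration of on-demand taint propagation (the Tainted class and process function are invented for this sketch); SandTrap itself applies the idea to native machine code via page protections rather than wrapped values.

```python
# Minimal, hypothetical sketch of on-demand taint tracking (invented for
# illustration; SandTrap itself instruments native machine code and uses
# page protections, not wrapped Python values). The point it demonstrates:
# untainted computation runs untracked, and instrumentation engages only
# when a tainted operand shows up.

class Tainted:
    """Wraps a value and propagates a set of taint labels through '+'."""
    def __init__(self, value, labels):
        self.value = value
        self.labels = labels

    def __add__(self, other):
        # Instrumented path: executes only when tainted data is involved.
        if isinstance(other, Tainted):
            return Tainted(self.value + other.value, self.labels | other.labels)
        return Tainted(self.value + other, self.labels)

    __radd__ = __add__  # untainted + tainted lands here too

def process(x, y):
    # Ordinary application code, unaware of taint. Plain ints take the
    # fast path; a Tainted operand transparently switches to tracking.
    return x + y

secret = Tainted(42, {"camera"})   # value derived from a sensitive source
print(process(1, 2))               # fast path: plain int, no tracking cost
flow = process(1, secret)          # on-demand path: taint propagates
print(flow.value, flow.labels)     # 43 {'camera'}
```

As in the abstract's Instagram example, the cost scales with how much tainted data the code actually touches; the fast path pays nothing.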

Srivastava, Animesh, Jain, Puneet, Demetriou, Soteris, Cox, Landon P., Kim, Kyu-Han. 2017. CamForensics: Understanding Visual Privacy Leaks in the Wild. Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems. 30:1–30:13.

Many mobile apps, including augmented-reality games, bar-code readers, and document scanners, digitize information from the physical world by applying computer-vision algorithms to live camera data. However, because camera permissions for existing mobile operating systems are coarse (i.e., an app may access a camera's entire view or none of it), users are vulnerable to visual privacy leaks. An app violates visual privacy if it extracts information from camera data in unexpected ways. For example, a user might be surprised to find that an augmented-reality makeup app extracts text from the camera's view in addition to detecting faces. This paper presents results from the first large-scale study of visual privacy leaks in the wild. We build CamForensics to identify the kind of information that apps extract from camera data. Our extensive user surveys determine what kind of information users expected an app to extract. Finally, our results show that camera apps frequently defy users' expectations based on their descriptions.
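
To make the coarse-permission problem concrete, here is a small, hypothetical Python sketch (not part of CamForensics): a single camera grant exposes the full frame, so the same app can run both an expected analysis and an unexpected one. It assumes the opencv-python and pytesseract packages are installed and uses a saved frame.jpg as a stand-in for live camera data.

```python
# Hypothetical sketch (not CamForensics itself) of the coarse-permission
# problem: one camera grant exposes the full frame, so an app can run any
# vision algorithm over it. Assumes opencv-python and pytesseract are
# installed; "frame.jpg" stands in for a live camera frame.
import cv2
import pytesseract

frame = cv2.imread("frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Expected use: face detection, e.g. to anchor an AR makeup overlay.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"faces detected: {len(faces)}")

# Unexpected use: the same permission also allows OCR over the whole
# frame, quietly capturing any documents or screens in view.
print("text in view:", pytesseract.image_to_string(gray).strip())
```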