The Privacy Lab is led by Prof. Apu Kapadia in the School of Informatics and Computing at Indiana University. Our goal is to advance research in online privacy, mobile security, and peer-to-peer systems. For an overview of our research, see our Research Projects and Publications pages. Visit the People page to see the faces behind our research.
The IU Privacy Lab, led by PI Apu Kapadia, has four papers accepted at CHI 2015! The first paper, titled Privacy Concerns and Behaviors of People with Visual Impairments, is a qualitative study that reports on interviews with 14 visually impaired people and suggests new directions for improving their privacy using wearable technologies. The second paper, titled Crowdsourced Exploration of Security Configurations, explores the use of crowdsourcing to efficiently determine restricted sets of permissions that strike reasonable tradeoffs between privacy and usability for smartphone apps.
The third paper (a Note), titled Sensitive Lifelogs: A Privacy Analysis of Photos from Wearable Cameras, is a follow-up to our UbiComp 2014 paper Privacy Behaviors of Lifeloggers using Wearable Cameras. For this Note we analyzed the photos collected in our lifelogging study, seeking to understand what makes a photo private and what we can learn about privacy in this new and very different context, where photos are captured automatically by one’s wearable camera. The fourth paper (also a Note), titled Interrupt Now or Inform Later?: Comparing Immediate and Delayed Privacy Feedback, follows up on our CHI 2014 paper Reflection or Action?: How Feedback and Control Affect Location Sharing Decisions. This Note explored the effect of providing immediate vs. delayed privacy feedback (e.g., for location accesses). We found that the sense of privacy violation was heightened when feedback was immediate but not actionable, a finding with implications for how and when privacy feedback should be provided.
PIs Apu Kapadia and David Crandall at IU, and Denise Anthony at Dartmouth College, have received a $1.2M collaborative NSF award (IU share: $800K) to study privacy in the context of wearable cameras over the next four years. The ubiquity of cameras, both traditional and wearable, will soon create a new era of visual sensing applications, raising significant implications for individuals and society, both beneficial and hazardous. This research couples a sociological understanding of privacy with an investigation of technical mechanisms to address these needs. Issues such as context (e.g., capturing images for public use may be okay at a public event, but not in the home) and content (are individuals recognizable?) will be explored on both technical and sociological fronts: What can we determine about images, what does this mean in terms of privacy risk, and how can systems protect against risk to privacy?
Read more about this grant, and our project. Here is a 90-second video!
Researchers have shown how ‘network alignment’ techniques can be used to map nodes from a reference graph onto an anonymized social-network graph. These algorithms, however, are often sensitive to network size, the number of seeds, and noise, which may be added to preserve privacy. We propose a divide-and-conquer approach to strengthen the power of such algorithms. Our approach partitions the networks into ‘communities’ and performs a two-stage mapping: first at the community level, and then for the entire network. Through extensive simulation on real-world social-network datasets, we show how such community-aware network alignment improves de-anonymization performance under high levels of noise, large network sizes, and small numbers of seeds. Read more in our paper, which will be presented at ACM CCS 2014.
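To make the two-stage idea concrete, here is a minimal, hypothetical Python sketch of community-aware alignment on toy graphs. All function names, the seed-overlap community matching, and the degree-based node matching are illustrative assumptions for exposition; they are not the algorithm from our paper, which handles noise, seeds, and similarity far more carefully.

```python
# Toy sketch: two-stage alignment of a reference graph A onto an
# anonymized graph B. Stage 1 pairs communities using seed overlap;
# stage 2 matches nodes within each paired community by degree.

def degrees(graph):
    """Map each node to its degree in an adjacency-list graph."""
    return {v: len(nbrs) for v, nbrs in graph.items()}

def match_communities(comms_a, comms_b, seeds):
    """Stage 1: pair each community in A with the community in B
    that shares the most seed pairs (sa, sb)."""
    mapping = {}
    for ca, nodes_a in comms_a.items():
        best, best_overlap = None, 0
        for cb, nodes_b in comms_b.items():
            overlap = sum(1 for sa, sb in seeds
                          if sa in nodes_a and sb in nodes_b)
            if overlap > best_overlap:
                best, best_overlap = cb, overlap
        if best is not None:
            mapping[ca] = best
    return mapping

def align_nodes(graph_a, graph_b, comms_a, comms_b, comm_map):
    """Stage 2: within each paired community, greedily match nodes
    in decreasing order of degree."""
    deg_a, deg_b = degrees(graph_a), degrees(graph_b)
    node_map = {}
    for ca, cb in comm_map.items():
        a_sorted = sorted(comms_a[ca], key=lambda v: -deg_a[v])
        b_sorted = sorted(comms_b[cb], key=lambda v: -deg_b[v])
        node_map.update(zip(a_sorted, b_sorted))
    return node_map

# Toy example: two tiny path-like graphs with two communities each,
# and one seed pair per community.
graph_a = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
graph_b = {'a': ['b', 'c'], 'b': ['a'], 'c': ['a', 'd'], 'd': ['c']}
comms_a = {'X': {0, 1}, 'Y': {2, 3}}
comms_b = {'P': {'a', 'b'}, 'Q': {'c', 'd'}}
seeds = [(0, 'a'), (2, 'c')]

comm_map = match_communities(comms_a, comms_b, seeds)
node_map = align_nodes(graph_a, graph_b, comms_a, comms_b, comm_map)
```

The payoff of the divide-and-conquer structure is visible even in this sketch: stage 2 only compares nodes within matched communities, so the per-node search space shrinks from the whole network to a single community.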
PIs Kapadia and Crandall have received a 2014 Google Research Award for their research on privacy in the context of ‘lifelogging’ wearable cameras. We expect that these wearable cameras (see the Narrative Clip and the Autographer in addition to Google Glass) will become commonplace within the next few years, regularly capturing photos to record a first-person perspective of the wearer’s life. The goal of this project is to investigate and build automatic algorithms that organize images from lifelogging cameras, using a combination of computer vision and analysis of sensor data (such as GPS, WiFi, and accelerometers), thus empowering users to efficiently manage and share these images in a way that protects their privacy. As a first step, we proposed PlaceAvoider, an approach for recognizing (and avoiding) sensitive spaces within images. Read an article about this work by the MIT Technology Review. Read more about our project here.
Indiana University hosted the interactive and thought-provoking PETools workshop, chaired by Prof. Apu Kapadia and held in conjunction with PETS 2013. The goal of this workshop was to discuss the design of privacy tools aimed at real-world deployments. It brought together privacy practitioners and researchers with the aim of sparking dialog and collaboration between these communities. We thank the authors and attendees for a successful workshop! Please check out the program (with links to the abstracts)!
Prof. Apu Kapadia’s award from the National Science Foundation (NSF) is titled CAREER: Sensible Privacy: Pragmatic Privacy Controls in an Era of Sensor-Enabled Computing. From the press release: Kapadia will receive $550,887 over the next five years to advance his work in security and privacy in pervasive and mobile computing. Kapadia’s grant will allow him to pursue development of reactive privacy mechanisms that he said could have a profound and positive societal impact by not only helping people control their privacy, but also potentially increasing their participation in sensor-enabled computing. “People need only care about the subset of data and usage scenarios that have the potential to violate their privacy, and this reduces the amount of data to which they must regulate access,” he said. “And people make better decisions concerning such access when these decisions are made in a context where they know how their data is being used.”