Privacy concerns are becoming a major obstacle to using data, and it is often unclear how current regulations should translate into technology. Over the last decade, differential privacy has emerged as the de facto gold standard for privacy-preserving data analysis, enabling the analysis of sensitive data with rigorous privacy guarantees. It is a parameterized privacy notion, and tuning this privacy parameter allows an analyst to smoothly trade off privacy for the individual against accuracy of the analysis. Much of the theoretical work on differential privacy has purposely left this parameter as a free variable, to be chosen based upon the context of data use. Contextual integrity is a framework for formalizing the context of data use; it offers a descriptive categorization of information flows as appropriate or inappropriate based upon context and cultural norms. This workshop will combine tools from differential privacy and contextual integrity to develop context-based prescriptions for the use and implementation of differential privacy. It will bring together a multidisciplinary team of privacy researchers from computer science, information science, statistics, business, law, and public policy to address this multifaceted challenge.
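The privacy/accuracy tradeoff governed by the parameter (conventionally epsilon) can be illustrated with the standard Laplace mechanism. The sketch below is a minimal, self-contained illustration, not part of the workshop materials; the query, sensitivity, and epsilon values are hypothetical.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon gives stronger privacy but a noisier answer;
    larger epsilon gives a more accurate answer with weaker privacy.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse-CDF from a uniform draw.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return true_value - scale * sign * math.log(1 - 2 * abs(u))

# Hypothetical count query (sensitivity 1) over a sensitive dataset:
true_count = 1000
for eps in (0.1, 1.0, 10.0):
    noisy = laplace_mechanism(true_count, sensitivity=1.0, epsilon=eps)
    print(f"epsilon={eps}: released count = {noisy:.1f}")
```

The expected error of the released count is exactly the noise scale, sensitivity/epsilon, which is why theoretical treatments leave epsilon free: the "right" tradeoff depends on the context in which the data are used.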