Individuals generate enormous amounts of personal data that are subsequently collected and stored by organizations and governments. These data power many innovative applications in areas such as web services, health care, and transportation, but they also heighten privacy risks. Differential privacy, a framework for rigorously reasoning about the privacy properties of algorithms, holds tremendous promise for enabling privacy-preserving yet useful data analyses. However, its adoption has so far been limited to entities with massive user bases. This project aims to democratize the ability to deploy differential privacy by making it practical for entities with smaller user bases. Research activities in this project include the formulation of a new, hybrid privacy model that captures both the heterogeneous privacy preferences of individuals and current industry practices; the development of novel algorithms that preserve differential privacy while leveraging the hybrid model to improve utility; and the evaluation of their performance. The work makes progress toward eliminating a significant barrier to data-driven innovation by expanding the applicability of differential privacy to a wider range of entities.
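
The abstract names differential privacy without defining it. As a minimal illustration only (this is the textbook Laplace mechanism, not the project's proposed hybrid-model algorithms, and the function and variable names are ours), a differentially private count can be released by adding Laplace noise scaled to the query's sensitivity:

```python
import numpy as np

def laplace_count(data, predicate, epsilon, rng):
    """Release a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    individual changes the count by at most 1), so Laplace noise
    with scale 1/epsilon is sufficient.
    """
    true_count = sum(1 for record in data if predicate(record))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative query: how many records have age >= 40?
rng = np.random.default_rng(7)
ages = [23, 35, 41, 29, 52, 61, 38]
noisy = laplace_count(ages, lambda a: a >= 40, epsilon=0.5, rng=rng)
```

Smaller epsilon gives stronger privacy but larger noise, which is why small user bases are hard: the same absolute noise is a much larger relative error on a small count.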