Differential privacy is an important advance in the modern toolkit for protecting privacy and confidentiality. It allows organizations such as government agencies and private companies to collect data and publish statistics about it without leaking personal information, no matter how sophisticated an attacker may be. The project's novelty lies in the careful design of new differentially private tools that provide more accurate population statistics while maintaining strong privacy guarantees. Its impact lies in the ability to create datasets for social science and policy research without sacrificing the privacy of individuals. The project includes both graduate and undergraduate students in this research.

The technical ideas behind the project are twofold. First, a careful analysis of the privacy proofs of many differentially private algorithms can identify additional (noisy) information that can be released without weakening the privacy guarantees. Second, noise that is correlated across the different stages of a differentially private algorithm can further reduce the variance of the final result. These techniques will allow smaller organizations to optimize the accuracy of their privacy-preserving algorithms for practical deployment, and to customize existing algorithms by swapping in building blocks that are more accurate or that take better advantage of background knowledge.
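To make the ideas concrete, the sketch below illustrates two standard background notions the abstract relies on, not the project's own algorithms: the Laplace mechanism (a basic differentially private building block), and the fact that intermediate noisy quantities, such as the noisy sum and noisy count inside a private mean, can themselves be published at no additional privacy cost, because the final answer is mere post-processing of them. All function names and parameters here are illustrative assumptions.

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) sampled as the difference of two exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon):
    # A counting query has sensitivity 1 (changing one person's record
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

def dp_mean(values, lower, upper, epsilon):
    # Split the privacy budget between a noisy sum and a noisy count.
    # Values are clamped to [lower, upper]; under the "replace one
    # record" notion of neighboring datasets, the clamped sum then has
    # sensitivity (upper - lower) and the count has sensitivity 1.
    clamped = [min(max(v, lower), upper) for v in values]
    half = epsilon / 2.0
    noisy_sum = sum(clamped) + laplace_noise((upper - lower) / half)
    noisy_count = len(clamped) + laplace_noise(1.0 / half)
    # The mean is post-processing of the two noisy releases, so the
    # intermediates themselves can be released without extra privacy cost.
    return noisy_sum / max(noisy_count, 1.0), noisy_sum, noisy_count
```

The project's insight goes further than this textbook example: privacy proofs often show that still more internal (noisy) quantities can be released for free, and that correlating the noise across stages can shrink the variance of the final statistic.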