The End of White Christian America

Submitted by RWMaster on Fri, 01/13/2017 - 16:31

Robert P. Jones, CEO of the Public Religion Research Institute, spells out the profound political and cultural consequences of a new reality—that America is no longer a majority white Christian nation. "Quite possibly the most illuminating text for this election year" (The New York Times Book Review).