I'm biased. I was trained as a historian (church history), with additional graduate study in American history. I think history is important, and I believe that all of us, certainly all high school graduates, should have at least a basic understanding of American history. A broad understanding of history can help us place religion in its proper context. Unfortunately, as Martin Marty reveals, this is not happening with any regularity, leaving even college-educated Americans woefully ignorant of our own history, let alone the role of religion in it. I invite you to read this essay by Martin Marty, the eminent historian of American Christianity and of the relationship of faith to the public square!