Although American culture has indeed changed, we need to be careful how we describe this change. The illustration on pages 53 and 82 seems to teach that the United States formerly had God’s Word as her foundation, but now the foundation is man’s word. Let’s be clear: the United States has never been a Christian nation. The United States does not teach or enforce Christian doctrine. She does not rule her citizens by the spiritual or even the civil righteousness of the Ten Commandments. She was founded by rationalists, deists, and Christians. I’m not sure that the majority of her inhabitants were ever practicing Christians, though perhaps they were at some time. We can say that the general Biblical knowledge of the population has probably declined. We can say that outwardly immoral behavior (a failure of civil righteousness) has become more acceptable. We can say that many are openly denying natural law and the conclusions of human reason in order to embrace their own sinful desires.
We also need to be concerned when the church is directed to influence “culture.” For example, p. 117 says, “The church is, by and large, no longer influencing the culture in our once very Christianized West, but now the culture is rapidly influencing (infiltrating) the church.” When the foundation and purpose of the right-hand (church) kingdom and the left-hand (state) kingdom are not properly distinguished, there will be much mischief. Next week we will take up the topic of the three estates in order to determine what kind of “influence” is appropriate.