The War of 1812 had a profound effect on the United States, on its society, and on how the rest of the world viewed the developing country. As soon as the war was over, American and British relations returned to the antebellum status quo. The one significant, noticeable change was that the United States came to be seen as a strong and capable nation, having proven itself able to withstand two wars with Great Britain.
American society was transformed as well: renewed faith in the country reigned supreme. The War of 1812 played a major role in changing and developing the United States. The Federalists, who had opposed the war, were widely branded as disloyal, and their party soon collapsed. Abroad, most countries came away with the impression that the United States was not a nation that could easily be conquered.
By standing up to the British Empire, the United States proved itself and gained international respect. American society grew more unified as faith in the nation was renewed. Having survived the war, American citizens felt their morale lifted, and their sense of nationalism intensified.