Americans everywhere, not just those out west, love boots and the American country lifestyle they embody. Slipping on a pair just makes you feel a little more at home and a little more confident.