November 24, 2011

The “Wild West” is an expression used to describe life in the western United States during the late 1800s. For decades, films and books have depicted the Wild West as a place of gunfights, outlaws, and widespread lawlessness. But recent scholarship shows otherwise. It turns out that the Wild West may not have been…