John Wayne

John Wayne said there was nothing wrong with colonists taking land from Native Americans, because the early settlers were "people who needed new land, and the Indians were selfishly trying to keep it for themselves."

Categories: Actors, Tribes
