John Wayne

John Wayne said there was nothing wrong with colonists taking land from Native Americans, because the early settlers were "people who needed new land, and the Indians were selfishly trying to keep it for themselves."
