The Conservative wrote:Native Americans were the original owners of the land... and if you remember anything about the Indian nations at the time, they had fights with neighboring nations all the time, so...
They "Owned" the land?
Not sure property rights were part of Indigenous culture.
Pretty sure the Land was seen as above Ownership in most indigenous cultures...
Although they fought viciously to control access to it... like most of humanity has over the millennia.
It's often claimed that you could just walk into a Native camp and eat well if you were hungry. Supposedly they didn't believe in property and everything was shared.
doc_loliday wrote:MAKE A BETTER LIFE FOR THEMSELVES.
Right... not all of them were there to make a better life. A good portion of them were there for business. Not the imaginary "better life" story that was pushed on you in middle school to make you feel better.
What are you guys smoking, and where can I get some? How does this thread keep getting dumber?
Technically, a lot of them didn't come here seeking a better life. They came here as slaves or indentured servants. We always focus on the free settlers in our propagandized history, but the reality is that many of the earliest settlers came because they had to sell themselves into indenture to pay off impossible debts in England and Ireland.
Actually, America was conquered in the name of Christ to save heathen souls from eternal damnation. It's a myth that people came here for a better life.