How U.S. Companies Profit From War

The United States exports more weapons than any other nation – which means American companies profit from other countries’ wars.