The United States exports more weapons than any other nation – which means American companies profit from other countries’ wars.
It’s been 13 years since the United States invaded Afghanistan with the aim of bringing down the Taliban government and searching for Osama bin Laden. But has the experience made the U.S. any wiser?