Re: maybe sometimes even to promote peace in an area
Since WWII, the USA has never acted primarily to promote peace in an area.
Whatever it has done on the international scene was to promote a US-centric view of the world. If peace came about, it was a side effect.