united states army meaning and definition
Definition of United States Army (noun)
- the army of the United States of America; the agency that organizes and trains soldiers for land warfare
- synonyms: Army, U. S. Army, US Army, USA
Source: Princeton University WordNet