
United States Army: meaning and definition

Definition and meaning of United States Army in the English Dictionary at MeaningMonkey.org.

UNITED STATES ARMY noun

Definition of United States Army (noun)

  1. the army of the United States of America; the agency that organizes and trains soldiers for land warfare
Source: Princeton University WordNet
