English dictionary

U.S. Army: meaning and definition


U.S. ARMY noun

Definition of U.S. Army (noun)

  1. the army of the United States of America; the agency that organizes and trains soldiers for land warfare
Source: Princeton University WordNet
