West Tenn
The western portion of the state of Tennessee; everything west of the Tennessee River.

The realest niggas stay in West Tenn. The farther east you go the lamer the niggas get!
By Audry