RhymeZone
Definitions of dominionism:
  • noun:   A tendency among some conservative Christians, especially in the USA, to seek influence or control over secular civil government through political action.
  • noun:   The belief that human beings should be free to dominate and exploit nature, including plants and animals.

(Definitions from Wiktionary)
