Do you think liberals are trying to destroy the United States?

I hear a lot of talk about how liberals are trying to destroy the United States. Most of it comes from conservative personalities on TV or the internet.

The only conservatives I’ve heard say such a thing in everyday life tend to be grumpy old men who complain about everything.

From my perspective, I really don’t think liberals or conservatives are trying to destroy anything. From what I see, people just have very different value systems, which lead to differing ideas about what it takes to improve things here in the United States.

Aside from the extremists who want to watch the world burn (and they exist on both sides), do you believe that the average liberal wants to destroy the United States?