Hopeless in the US
I'm trying not to spread a doomer mindset around, but I need to vent. I'm getting involved and doing what I can to stand up to what's happening here in the US, but honestly I don't feel optimistic.
It just seems like we're up against something so powerful, and there aren't even enough of us taking a stand. It's business as usual everywhere I go; hardly anyone wants to talk about what's going on, and I see practically everyone burying their heads in the sand because of this "well, things will be okay! They always have been" kind of mindset.
I feel like it's almost written in the stars that we aren't meant to win. This has been planned out for decades, I'm sure, and I fear even the politicians who are supposed to be on our side are complicit in all of this.
Does anyone else feel like we are just fighting destiny at this point?
I just have this heavy feeling that we're not meant to win. Complacency is too deeply woven into our society, and that's by design.
At the same time, we have to try, right?