People love watching and reading about dystopias and rebellions, but when one happens in real life, those same people suddenly demonize the resistance and champion the oppressors as the protectors of society.
It’s only a legitimate rebellion when white people are doing it
Like that post that explained dystopian futures in fiction as “what if all the bad things that already happen ALSO HAPPENED TO WHITE PEOPLE?” and I feel that nailed it
"And like the leaves,
Fall for autumn
I fall for you"
Half of me wants to be a really physically active person, but the other half of me is like “nah son,” and how can I argue with that
SERIOUSLY BE NICE TO YOUR ANIMALS BECAUSE THEY LOVE YOU MORE THAN YOU DESERVE AND MORE THAN ANY HUMAN EVER WILL