I think I've unlocked the secret of what liberals seem to be up to in America. They are trying to take things considered "bad" and make them seem okay.
For example, drugs. I grew up in post-Reagan America, with things like "Just Say No," D.A.R.E., and films and TV shows depicting how bad drugs were and how they could only lead to trouble. Yet liberals often complain that so many people are in prison because of drugs. So instead of trying to stop drug use and enforce the law against it, they want to make the bad thing people are doing legal.
The recent Trayvon Martin shooting and the ongoing Michael Brown situation are more examples. Many people began pointing out how hoodies are associated with "thug" culture. This is true: hoodies, baggy pants, etc. So instead of thinking, "Gee, maybe we shouldn't wear things so associated with negative culture," what do liberals do? They show "solidarity" by wearing the hoodies.
Basically, the bad stuff makes people realize they're misbehaving and becoming socially undesirable, so instead of changing, they embrace the bad. It's the same as saying, "There are so many murderers in jail, so let's make murder no longer a crime, so those people won't be shunned, looked down upon, or made to feel shame, etc."
I think it's a terrifying road to go down.
Your post sounds like it's straight out of 1994, when people would actually buy into this authoritarianism. Sounds like my high school right after the 'Footloose' era.