Why has woke become a bad word in American society?
What started as a term describing awareness of the complicated social issues plaguing American society has morphed into a politically charged phrase, one used by conservative talking heads as a catch-all boogeyman for everything wrong with the modern United States. But how did we get here?
Today the word woke is the main driver of the culture war in America, and no matter which side you're on, its meaning has hardened as politicians and media figures have co-opted the term for their own nefarious purposes.
A worrying trend has developed around the use of the word woke. From politicians who spout conspiracy theories to purple-haired college graduates, it seems everyone has bent the word to fit their own narrative.
The term has become so divisive that it is harming support for the very issues it was originally meant to highlight.
There is no better example than how the American right has used the term woke to help forge lasting political alliances to “save” American values.
Staffers for Florida Governor Ron DeSantis were recently asked to define the meaning of the term woke in a Tallahassee courtroom and they came up with an interesting answer that explains why the word is at the center of America’s culture war.
Ryan Newman, DeSantis’ general counsel, said that woke referred to “the belief there are systemic injustices in American society and the need to address them.”
Taryn Fenske, DeSantis’ communications director, also chimed in, saying that woke was really just slang for “progressive activism,” adding that it could also refer to systemic injustices in the United States.
Both of DeSantis’ aides could be considered correct. But their answers reveal why the term woke has become an insult against the American left: believing that the freest nation on the face of the Earth could also be home to systemic injustice runs counter to the right's “pull yourself up by your bootstraps” philosophy.
For American conservatives, anyone can make it in America with a little hard work. But there’s also a more sinister idea behind how the right is using its marketing machine to demonize the word.
Words like woke and critical race theory are being co-opted by right-wing political thinkers to prompt voters to think about violence rather than social justice.
Right-wing American pundit Christopher Rufo was quite open about this intention in a March 2021 tweet, in which he explained exactly how the American right was turning the word into a taboo for its audience.
“The goal is to have the public read something crazy in the newspaper and immediately think ‘critical race theory,’ ” Rufo wrote on Twitter.
“We have decodified the term and will recodify it to annex the entire range of cultural constructions that are unpopular with Americans,” Rufo added.
This recodifying appears to have worked remarkably well. Red states have been banning books, and in an October 2022 Harvard-Harris poll, researchers found that 64% of Americans blamed woke politics for the country’s spike in crime.
Moreover, the same poll found that a majority of Democrats no longer consider themselves woke and believe the concept is responsible for the country’s current public safety issues, suggesting that the Republican strategy of twisting the word woke has actually worked.
Words like woke and phrases like critical race theory have been turned into insults and catchalls for whatever ridiculous thing is happening in American politics, rather than referring to awareness of the very real systemic inequities and biases that exist in the country.
This isn’t just a problem in the United States, though. Europe has seen its fair share of issues with woke culture being imported to its shores.
In France, Diversity Minister Elizabeth Moreno has warned that “woke culture is something very dangerous, and we shouldn’t bring it to” the country.
The concept of “being woke” has come to be viewed by France’s elite as a destructive political force, since its theories on race, gender, and post-colonialism threaten the foundations of French society and French values, according to Samuel Hayat, a political researcher and fellow at the French National Center for Scientific Research.
“Woke is seen as a threat that comes from a society thought to be multicultural and violent and does not have the same values on secularism that France does,” Hayat noted.
Many other European nations without colonial histories share similar feelings. Conservatives in Central and Eastern Europe have been the most eager to seize on such terms and use them to beat their left-wing political opponents.
Where this will end is still unknown, but the pendulum is certainly swinging, and that may not be for the best. Turning woke into a curse word has only served to divide our nations and undermine the very ideals that allow Western democracy to thrive.