Anonymous asked in Politics & Government › Politics · 1 month ago

Is woke culture a cancer that’s ruining America?

Can’t have black people in advertising, the Redskins name is offensive, respect my 67 genders, trans bathroom issues. At what point is a line drawn in the sand and woke culture stops?

1 Answer

  • Anonymous
    1 month ago

    No wonder the USA is going broke.

    Source(s): "Get woke, go broke"