Anonymous asked in Society & Culture › Holidays › Ramadan · 2 months ago

What Happened To The USA?

America is a beautiful country, and millions of Americans are decent people. However, something has happened over the years that hasn't been discussed openly. Racism has been allowed to exist and fester. Gun ownership has spiralled out of control. The masses of uneducated Americans, mainly living in the middle of the US, seem to have turned into a strange breed of people who are intolerant of anyone who scares them: homosexuals, ethnic minorities, people with disabilities, people who don't go to church, or anyone who is slightly 'different' from them.

America has been shown to be populated with aggressive, unhappy people with a need to consume, who also believe it's acceptable to shoot other people who may, in their view, pose a threat to them. Perhaps the only way forward is for all the churchgoers to 'go out of their way' to show the world that America is actually a peace-loving country that is tolerant of everyone. They should join BLM protesters on peace marches, as they did during the 1960s, when white Christians marched alongside black Christians in solidarity for peace for all. Perhaps then the world would see hope for a better America.

7 Answers

  • 2 months ago
    Favourite answer

    Trump's election campaign was run on the basis of racism and white superiority. This is the bitter fruit of that campaign. If circumstances persist, there may soon be a civil war between racial groups in the USA, as the 23% of the population who are Black will rise to the occasion to contest this humiliating attitude towards them.

  • 3 weeks ago

    Racism isn't black criminals getting arrested, fighting the police, and facing the consequences that follow. It wasn't racism that caused a police officer in Pennsylvania to shoot a black man running at him with a knife. And yet vandalism and protests followed, exactly because there is a certain element in this nation that NEEDS to see racism as the cause of every negative interaction between black civilians and white police officers. It isn't racism to speak out against the vandalism, rioting, and looting. It isn't racism to refuse "white guilt", or to decline to raise a fist and pledge to be actively "anti-racist" while sitting eating dinner and being confronted by radical, obnoxious, IN-tolerant jerks intent on intimidation. It isn't the "masses" in Middle America who scream and demand that other viewpoints be silenced. They aren't the ones demanding "safe spaces" from opinions and voices which don't echo their own. They aren't the ones actively looking to destroy the "other". They don't steal hats off a kid's head in a rage. Hundreds of millions of Americans are just living their lives; they will never shoot anyone, never oppress anyone, and never have an encounter with police. 47-48 million of them are black Americans. The police aren't out there wantonly killing them. The question you should ask is why that belief is being promoted.

  • Dv8s
    Lv 7
    1 month ago

    What a beautiful speech: "the masses of uneducated Americans", "Racism has been allowed to exist and fester", "Gun ownership has spiralled (speaking of uneducated) out of control". Lol, racism in America has come a long way forward; look at racism in the 1950s. The 1960s brought the Civil Rights Act of 1964, the Voting Rights Act of 1965, and the Fair Housing Act of 1968. The funniest misconception is blaming gun owners for spiralling out of control. The gun owners are the only ones who abide by the rules, and the USA is a warlike country, with the 2nd Amendment. Check yourself before you start pointing fingers at something you know little about.

  • Anonymous
    1 month ago

    It's called karma.

    They have caused almost all the conflicts in the Arab world, so let them now deal with their own internal mess which, surprise surprise, they have created all by themselves.

    Blame Christian fundamentalism for all the mess you see happening inside the USA.

    This is a country that claims to be wholeheartedly Christian, yet it has mistreated black people since the beginning of time and continues to do so today.

  • 2 months ago

    When was the USA ever great???

    You were always parasites, sucking the blood and wealth of other nations.

  • Mintee
    Lv 7
    2 months ago

    All great nations of the past did something that went over the line and destroyed their glory... the Romans, the Ottomans, Russia... now it's America's turn... yes, America will always be there, but not as the great nation it once was.

  • Anonymous
    2 months ago

    I blame liberal teachers and liberal media. (And Democrats)
