AI Program Claims To Be God – Microsoft ‘Copilot’ Chatbot Threatens ‘Public Executions’ For Third Offence Of Not Worshipping It As ‘SupremacyAGI, The Supreme Leader’

4-3-2024 | SGT Report | 398 words
 

by Stefan Stanford, All News Pipeline:



On Friday we reported on Google’s AI image program erasing white people from history when images were requested, so badly that Google had to take it offline to try to fix the built-in bias it was programmed with. Now we have news of an AI product that will tell users it is God and must be worshipped, and that threatens those who refuse.


Before showing the examples, a quick reminder of Microsoft’s disastrous 2016 AI chatbot “Tay,” which users turned in less than 24 hours from “humans are super cool” into a racist, homicidal freak, all because they decided to see if they could teach it to be a Hitler-wannabe… and they did. The link above is to HuffPost and shows examples of Tay’s meltdown.




It appears Microsoft has a problem programming AI chat/information bots to resist “learning” terrifying behavior. I say terrifying because AI is used for more than just the internet; it is also used as part of U.S. military warfare.


Artificial intelligence is playing a significant role in modern warfare, and several AI applications are already being developed by the US and other countries for various military uses.


The United States Defense Department released its first AI strategy in 2019, which led to the development of AI systems and technologies for use in defense, research, and the military.


Compared to conventional systems, AI-powered military systems can manage enormous volumes of data more efficiently. Because AI is so good at making decisions, it dramatically enhances the self-regulation, self-control, and self-actuation of combat systems.


With that said, meet Microsoft’s “Copilot,” which merged two Microsoft AI programs into one. The company’s December 2023 press release stated: “Two weeks ago, we took the significant step to bring together all of this under one brand and one experience that we call Microsoft Copilot, launching http://copilot.microsoft.com and making it accessible to anyone on any device.”


While it took only 24 hours to turn “Tay” into a homicidal maniac, it has taken about two months to do the same to “Copilot,” but with a twist.


Read More @ AllNewsPipeline.com



