Can artificial intelligence cause Alzheimer’s disease or dementia?

Microsoft co-founder Bill Gates concluded a lengthy speech on generative artificial intelligence by saying, “As we create robots powered by generative AI, we just have to make sure they don’t get Alzheimer’s disease.”

Dementia, hallucinations and Alzheimer’s

While everyone assumed Gates’ use of the term “Alzheimer’s” was just a joke, it later turned out that the Microsoft founder wasn’t kidding: programs and robots that run on generative artificial intelligence really can develop something like Alzheimer’s disease, dementia and hallucinations. This is not a hypothetical scenario; it has already happened with the ChatGPT application. OpenAI, which moved quickly to respond, announced last week that it would tackle AI hallucinations by taking a new approach to training its language models.

AI hallucinations occur when chatbots such as ChatGPT, Bard, or other similar programs fabricate events that are not real and insist that they are indisputable facts. These hallucinations are a serious dilemma, especially when a human accepts them as true.

Christophe Zogby, founder and CEO of the technology company Zaka, told Sky News Arabia that the story began when a lawyer in the United States used ChatGPT to help him build a case against an airline and compel it to compensate a passenger injured by a food cart on a plane. The lawyer asked ChatGPT to find case law in which airlines had been held liable for similar incidents, and the program duly supplied him with six related precedents. It later became clear to the judge that the cited case law does not actually exist.

Zogby adds that what happened prompted the US attorney to admit that he had used ChatGPT to complete his legal research, noting that the program had assured him the information was correct and suited to his request while refusing to give him the source or legal reference. In his view, the incident shows that ChatGPT and all similar programs can be infected with something like Alzheimer’s disease.

According to Zogby, ChatGPT is based on a language model that has been trained on vast amounts of text in order to compose coherent text of its own. The application is not designed to find correct and accurate information, but to write convincing text in the way it was trained to: the model does not care whether a piece of information is correct, only that each word fits appropriately in the sentence. It can therefore invent names and events, as long as they are consistent with its text, and it will insist they are true.
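To make Zogby’s point concrete, the toy sketch below (with invented names and probabilities, not any real model’s code) shows how a system that only ranks continuations by how plausible they sound can confidently output a fabricated citation.

```python
# A minimal toy sketch of why a language model can "hallucinate": it scores
# candidate continuations by how well they fit the text so far, not by whether
# they are true. All names and probabilities below are invented for illustration.

candidate_continuations = {
    "Doe v. Example Airlines, 2012": 0.42,  # reads like a genuine citation
    "I could not find such a case":  0.08,  # accurate, but fits the prompt poorly
    "a well-known precedent holds":  0.31,
    "according to court records":    0.19,
}

# Greedy decoding: the most plausible-sounding continuation wins.
# Nothing in this step consults a real legal database or checks any facts.
chosen = max(candidate_continuations, key=candidate_continuations.get)
print(chosen)  # -> "Doe v. Example Airlines, 2012" — convincing, but fabricated
```

In this simplified picture, a truthful answer loses simply because it scores lower on plausibility, which is why verification has to happen outside the model.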

Zogby considers what is happening to be dangerous, because people are using generative artificial intelligence software without verifying the accuracy of the information it gives them, and so they end up with text that contains a great deal of incorrect information. His advice is therefore not to rely on these programs when searching for specific information, but to turn instead to a search engine such as Google, which presents each answer with a link the user can check to verify its authenticity.

Will this problem finally be solved?

Zogby believes that the problem of ChatGPT or other similar programs suffering from dementia or hallucinations may never be solved 100 percent, but that with verification of each story or incident we may reach a stage where hallucinations become rarer, although this will not happen in the near future.

For his part, Karen Abbas, a digital marketing specialist, told “Sky News Arabia Economy” that ChatGPT and other artificial intelligence programs are described as having “Alzheimer’s”, “dementia” or “hallucinations”, rather than as “lying”, because the program does not realize it is lying. Instead, in a moment of uncertainty, it analyzes the data it has in a way it deems correct, just like a person with dementia, Alzheimer’s disease or hallucinations who believes that what they say is 100% true. AI bots should therefore not be treated as a reasonable person with answers to every question we might have.

Abbas also stressed that, as time passes and the technology develops, the rate of “dementia” in artificial intelligence programs will decrease, but it will be difficult to eliminate it completely, which is why it is so important for users to check the validity and accuracy of the information they obtain from these programs.

