Artificial intelligence can now read people’s thoughts and turn them into words

(Image: Vitaly Golinok via Getty Images)

Artificial intelligence is not just a tool for performing complex tasks that, until now, only engineers, artists, or programmers could carry out. It is also an advance that could help humans reach levels of development once thought nearly impossible, beyond even science fiction. What is more surprising is that these previously unimaginable advances are right around the corner. One of them? The ability to literally read minds.

The year 2023, now drawing to a close, has brought two discoveries in this area. First, a group of scientists developed a system that can read a person’s mind and reproduce brain activity as a stream of text, based in part on a transformer model similar to the ones that power OpenAI’s ChatGPT and Google Bard. The second, reported the same week in a New Scientist article, comes from other scientists who used a sensor-filled helmet combined with artificial intelligence to turn a person’s thoughts into written words.

The first discovery is an important step on the path toward developing brain-computer interfaces capable of decoding continuous language through non-invasive recordings of thoughts. The results were published last May in the journal Nature Neuroscience, in a study led by Jerry Tang, a doctoral student in computer science, and Alex Huth, an associate professor of neuroscience and computer science at UT Austin.

Non-invasive method

Tang and Huth’s semantic decoder is not implanted directly into the brain, but instead uses functional magnetic resonance imaging to measure brain activity. For the study, experiment participants listened to podcasts while the AI attempted to transcribe their thoughts into text.

“For a non-surgical method, this is a real leap forward compared to what has been done before, which usually consisted of single words or short phrases,” says Alex Huth. “We’ve got a model for decoding continuous language over long periods of time with complex ideas.”


These types of systems can be particularly useful for people who cannot physically speak, such as those who have suffered a stroke, and allow them to communicate more effectively.

According to Tang and Huth, the study results demonstrate the feasibility of non-invasive brain-computer interfaces for language. They claim that the semantic decoder still needs more work and can only provide the basic “gist” of what someone is thinking. The AI decoder produced text that matched the subject’s thinking only half the time.

Artificial intelligence is now able to literally read minds. (Photo: Yuichiro Chino via Getty Images)

Decoder in action

The study provides some examples of the decoder in action. In one case, a test participant heard, and therefore thought about, the sentence “…I didn’t know whether to scream, cry or run away, so instead I said leave me alone, I don’t need your help, Adam disappeared.”

The decoder reproduced this part of the sentence as follows: “…she started screaming and crying and then she just said I asked you to leave me alone and you can’t hurt me anymore, I’m sorry and then he walked away.”

The researchers also added that they were concerned about the mental privacy aspect. “We take concerns that they could be used for bad purposes seriously and have worked to prevent that,” says Jerry Tang in Nature Neuroscience. “We want to make sure people only use these types of technologies when they want to and when it helps them.”

For this reason, they also tested whether successful decoding requires the cooperation of the person whose thoughts are being decoded, and found that cooperation is essential for the decoder to work.


Mind-reading helmet

In the second and more recent discovery, participants in the study read passages of text while wearing a cap that recorded the brain’s electrical activity through the scalp. These EEG recordings are converted into text by an artificial intelligence model called DeWave, which interprets the measurements. In other words, it reads minds.

The technology is non-invasive, relatively cheap, and easily transportable, says Chin-Teng Lin of the University of Technology Sydney (UTS) in Australia. Although the system is far from perfect, with an accuracy of about 40 percent, Lin says newer data currently under review shows accuracy above 60 percent.

The DeWave model was trained by looking at many examples in which brain signals matched certain phrases, explains Charles Chu, a member of the UTS team. For example, when you think about saying “hello,” your brain sends certain signals. DeWave learns how these signals relate to the word “hello” by seeing many examples of those signals for different words or sentences, he explained in remarks reported by New Scientist.
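The idea Chu describes is supervised learning: pair many recorded signals with the words they correspond to, learn a mapping, then classify new signals. The sketch below is a toy illustration of that principle only, not the real DeWave model; the feature vectors are simulated, and a simple nearest-centroid classifier stands in for the actual neural network.

```python
# Toy sketch of the supervised idea behind mapping brain signals to words:
# learn an average "signature" per word from labeled examples, then decode
# a new signal as the word whose signature it most resembles.
# All data is simulated; real EEG decoding uses far richer features and models.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: dict mapping word -> list of feature vectors.
    Returns a model: word -> learned centroid (average signal)."""
    return {word: centroid(vecs) for word, vecs in examples.items()}

def decode(model, signal):
    """Return the word whose learned centroid is closest to the signal."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda word: sq_dist(model[word], signal))

# Simulated training data: noisy "signals" for two imagined words.
examples = {
    "hello": [[1.0, 0.1], [0.9, 0.2], [1.1, 0.0]],
    "stop":  [[0.0, 1.0], [0.1, 0.9], [0.2, 1.1]],
}
model = train(examples)
print(decode(model, [0.95, 0.15]))  # a new signal resembling "hello"
```

The same pattern scales up: replace the hand-made vectors with real EEG features and the centroid classifier with a trained neural network, and you have the outline of systems like the one described above.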

These are undoubtedly two revolutionary advances that herald the coming of a new era: one in which machines can access our minds in a way that no one in history has been able to.

