The human brain's internal semantic code is being deciphered, a step toward reading thoughts.
Is it possible to build a semantic map of the brain that would allow inner thoughts to be read like a book in a library? Apparently, it is. Researchers have created a semantic atlas of sorts that shows how the brain categorizes linguistic cues.
The brain responds to different words in many ways, and the association of words takes place across the brain's neurochemical centers.
The brain's cerebral cortex is involved in language processing as well as higher-order cognition. Notably, even people who are very different from one another tend to share similar language maps.
This consistency is surprising. Such a semantic map of the brain could offer clues about how patients who have suffered a stroke or brain injury think while in their immobile state.
Although true mind-reading technology is far off in the future, early versions of such a system could be pursued even today.
The basic challenge is to work out how the language instinct is subdivided across the pathways of the brain. Humans, after all, are the only creatures with a full-fledged language and the moral and cultural concomitants that go along with it.
The brain's activity could be compared with a subject's thought processes and linguistic behavior, allowing a one-to-one correspondence to be drawn between outward speech and inward thoughts.
A decoder is essential here. The technique could even be applied to people who have difficulty articulating their feelings or thoughts, to elicit their inner brain patterns.
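To make the idea of a decoder concrete, here is a minimal sketch, with entirely synthetic data and illustrative dimensions of my own choosing: given an observed pattern of brain activity and a fitted model mapping word features to activity, the decoder picks the candidate word whose predicted pattern matches best.

```python
# Hedged sketch of a word decoder: pick the candidate word whose
# predicted brain-activity pattern best matches the observed one.
# Weights, features, and responses are all synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_features, n_voxels = 20, 40

# Assume an encoding model (feature -> voxel weights) has already been fit.
W = rng.standard_normal((n_features, n_voxels))

# A tiny vocabulary of candidate words, each with a semantic feature vector.
vocab = {w: rng.standard_normal(n_features) for w in ["dog", "house", "run", "blue"]}

def decode(response, vocab, W):
    """Return the word whose predicted voxel response best matches `response`."""
    def score(feats):
        pred = feats @ W  # predicted activity pattern for this word
        return np.corrcoef(pred, response)[0, 1]
    return max(vocab, key=lambda w: score(vocab[w]))

# Simulate a noisy brain response to the word "house" and decode it back.
observed = vocab["house"] @ W + 0.3 * rng.standard_normal(n_voxels)
print(decode(observed, vocab, W))
```

This template-matching scheme is only one simple way a decoder could work; real decoding would need to handle far larger vocabularies and much noisier measurements.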
Pursued to its conclusion, this would prove a fascinating piece of technology, and it would open new doors. The brain, however, is extremely intricate and complex.
Mapping it is a formidable task, rather like the Human Genome Project. In the study, published in the journal Nature, participants listened to stories through headphones while their brain activity was recorded with functional imaging. The scans revealed the patterns that emerged in response to the stories. After much processing and analysis, rudimentary semantic models were constructed that showed how the meanings the subjects heard were represented across their brains.
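The modeling step described above can be sketched as a voxel-wise "encoding model": a regularized regression that predicts each voxel's response from semantic features of the words being heard. The sketch below uses synthetic data and assumed dimensions; it illustrates the general approach, not the study's actual pipeline.

```python
# Minimal sketch of a voxel-wise encoding model: ridge regression from
# semantic word features to per-voxel fMRI responses.
# All data are synthetic; dimensions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_features, n_voxels = 200, 50, 30

# Semantic features of the heard words at each scan timepoint
# (in practice these might come from word co-occurrence statistics).
X = rng.standard_normal((n_timepoints, n_features))

# Ground-truth feature-to-voxel weights, plus measurement noise.
true_W = rng.standard_normal((n_features, n_voxels))
Y = X @ true_W + 0.1 * rng.standard_normal((n_timepoints, n_voxels))

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'Y."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(k), X.T @ Y)

W = fit_ridge(X, Y)
Y_pred = X @ W

# Per-voxel accuracy: correlation between actual and predicted responses.
r = np.array([np.corrcoef(Y[:, v], Y_pred[:, v])[0, 1] for v in range(n_voxels)])
print(f"mean voxel-wise correlation: {r.mean():.3f}")
```

Fitting one such model per voxel, then inspecting which features drive which voxels, is what yields a "semantic map" of the cortex.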
"Our semantic models are good at predicting responses to language in several big swaths of cortex," Huth said. "But we also get the fine-grained information that tells us what kind of information is represented in each brain area. That's why these maps are so exciting and hold so much potential."
"Although the maps are broadly consistent across individuals, there are also substantial individual differences," said study senior author Jack Gallant, a UC Berkeley neuroscientist. "We will need to conduct further studies across a larger, more diverse sample of people before we will be able to map these individual differences in detail."