Posts tagged information theory

Human consciousness is simply a state of matter, like a solid or liquid – but quantum
Thanks to the work of a small group of neuroscientists and theoretical physicists over the last few years, we may finally have found a way of analyzing the mysterious, metaphysical realm of consciousness in a scientific manner. The latest breakthrough in this new field, published by Max Tegmark of MIT, postulates that consciousness is actually a state of matter. “Just as there are many types of liquids, there are many types of consciousness,” he says. With this new model, Tegmark says that consciousness can be described in terms of quantum mechanics and information theory, allowing us to scientifically tackle murky topics such as self-awareness, and why we perceive the world in classical three-dimensional terms rather than the infinite number of objective realities offered up by the many-worlds interpretation of quantum mechanics.
Explaining the origins of word order using information theory
The majority of languages — roughly 85 percent of them — can be sorted into two categories: those, like English, in which the basic sentence form is subject-verb-object (“the girl kicks the ball”), and those, like Japanese, in which the basic sentence form is subject-object-verb (“the girl the ball kicks”).
The reason for the difference has remained somewhat mysterious, but researchers from MIT’s Department of Brain and Cognitive Sciences now believe they can account for it using concepts borrowed from information theory, the discipline that led to the digital revolution in communications and was invented almost single-handedly by longtime MIT professor Claude Shannon. The researchers will present their hypothesis in an upcoming issue of the journal Psychological Science.
Shannon was largely concerned with faithful communication in the presence of “noise” — any external influence that can corrupt a message on its way from sender to receiver. Ted Gibson, a professor of cognitive sciences at MIT and corresponding author on the new paper, argues that human speech is an example of what Shannon called a “noisy channel.”
“If I’m getting an idea across to you, there’s noise in what I’m saying,” Gibson says. “I may not say what I mean — I pick up the wrong word, or whatever. Even if I say something right, you may hear the wrong thing. And then there’s ambient stuff in between on the signal, which can screw us up. It’s a real problem.” In their paper, the MIT researchers argue that languages develop the word order rules they do in order to minimize the risk of miscommunication across a noisy channel.
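To make the logic concrete, here is a minimal Python sketch of noisy-channel decoding in Shannon's sense. This is an illustration of the general framework, not the model from the Gibson et al. paper: the listener weighs a prior over plausible meanings against a simple noise model, so an implausibly heard sentence gets reinterpreted as a corrupted version of a plausible one. The sentences, probabilities, and the role-swap noise model are all invented for this sketch.

```python
# Toy noisy-channel decoding: the listener picks the intended sentence
# that maximizes P(intended) * P(perceived | intended).
# Sentences and probabilities are invented for illustration; this is not
# the model from Gibson et al. (2012).

# Prior: how likely a speaker is to intend each (subject, verb, object).
PRIOR = {
    ("girl", "kicks", "ball"): 0.70,  # plausible event
    ("ball", "kicks", "girl"): 0.05,  # implausible event
    ("girl", "sees", "ball"): 0.25,
}

def likelihood(perceived, intended, p_role_swap=0.15, p_other=0.01):
    """P(perceived | intended): noise occasionally swaps the perceived
    subject and object roles; other corruptions get a small catch-all."""
    if perceived == intended:
        return 1.0 - p_role_swap - p_other
    subj, verb, obj = intended
    if perceived == (obj, verb, subj):
        return p_role_swap
    return p_other

def decode(perceived):
    """Return the intended sentence with the highest posterior score."""
    return max(PRIOR, key=lambda s: PRIOR[s] * likelihood(perceived, s))

# Hearing the implausible "ball kicks girl", the listener concludes the
# speaker probably meant "girl kicks ball" and the channel swapped the roles.
print(decode(("ball", "kicks", "girl")))  # -> ('girl', 'kicks', 'ball')
```

The design choice doing the work here is Bayes' rule: a strong prior for plausible meanings can outweigh the evidence of the perceived word order, which is exactly the trade-off the researchers argue shapes word order rules.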
[E. Gibson, S.T. Piantadosi, K. Brink, L. Bergen, E. Lim, and R. Saxe. A noisy-channel account of crosslinguistic word order variation. Psychological Science, accepted, 2012]
What number is halfway between 1 and 9? Is it 5 — or 3?
Ask adults from the industrialized world what number is halfway between 1 and 9, and most will say 5. But pose the same question to small children, or people living in some traditional societies, and they’re likely to answer 3.
Cognitive scientists theorize that that's because it's actually more natural for humans to think logarithmically than linearly: 3^0 is 1, and 3^2 is 9, so logarithmically, the number halfway between them is 3^1, or 3. Neural circuits seem to bear out that theory. For instance, psychological experiments suggest that multiplying the intensity of some sensory stimuli causes a linear increase in perceived intensity.
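In other words, the logarithmic halfway point is the geometric mean rather than the arithmetic mean, which a few lines of Python make explicit:

```python
import math

a, b = 1, 9

# Linear (arithmetic) midpoint: halfway along the number line.
print((a + b) / 2)  # 5.0

# Logarithmic midpoint: halfway between log(a) and log(b),
# i.e. the geometric mean sqrt(a * b).
print(math.exp((math.log(a) + math.log(b)) / 2))  # 3.0 (up to float rounding)
print(math.sqrt(a * b))                           # 3.0
```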
In a paper that appeared online last week in the Journal of Mathematical Psychology, researchers from MIT’s Research Laboratory of Electronics (RLE) use the techniques of information theory to demonstrate that, given certain assumptions about the natural environment and the way neural systems work, representing information logarithmically rather than linearly reduces the risk of error.
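The paper's argument is mathematical, but its flavor can be shown with a toy simulation. Under one plausible reading of the assumptions (stimuli spanning several orders of magnitude, additive Gaussian noise on the internal representation, and relative error as the cost), a logarithmic code keeps relative error roughly constant while a linear code performs badly on small stimuli. Everything below, from the noise model to the scaling constants, is an assumption made for illustration, not the RLE authors' actual setup.

```python
# Toy Monte Carlo comparison of linear vs. logarithmic neural codes.
# Assumptions (ours, for illustration): stimuli are log-uniform over three
# orders of magnitude, the internal code is corrupted by additive Gaussian
# noise, and the cost is *relative* error in the decoded value.

import math
import random

random.seed(0)

def mean_relative_error(encode, decode, trials=100_000, noise_sd=0.05):
    total = 0.0
    for _ in range(trials):
        x = 10 ** random.uniform(0, 3)          # stimulus in [1, 1000]
        y = encode(x) + random.gauss(0, noise_sd)  # noisy internal code
        x_hat = decode(y)                        # decoded estimate
        total += abs(x_hat - x) / x
    return total / trials

# Linear code: represent x directly, scaled into roughly the same numeric
# range as the log code so the noise magnitude is comparable.
lin_err = mean_relative_error(lambda x: x / 1000, lambda y: 1000 * y)

# Logarithmic code: represent log10(x).
log_err = mean_relative_error(lambda x: math.log10(x), lambda y: 10 ** y)

print(f"mean relative error, linear code: {lin_err:.3f}")
print(f"mean relative error, log code:    {log_err:.3f}")
# Under these assumptions the log code's relative error stays small and
# roughly constant, while the linear code's blows up for small stimuli.
```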