Perceiving is Believing


How humans perceive the world is a complex dance among the input from our senses, how our brains encode that input, and how both interact with our previous experiences. What we perceive often differs from reality, leading to what is known as perceptual bias.

Now a neuroscientist at The University of Texas at Austin has helped create a unifying theory that explains perceptual biases. The work, built on decades of data, can even predict the biases of individual people.

“Perception is not simply about understanding the environment around us as it is, but about how our brains reconstruct the environment around us,” said Xue-Xin Wei, UT assistant professor of neuroscience. “This theory allows us to understand how humans see the world and predict how they will see it, as well as how they may behave.”

Bias in perception has many sources, including irrelevant sensory information, often called sensory noise. By accounting for a wide variety of inputs, Wei’s theory helps shed light on common phenomena, such as people perceiving a slightly tilted line as more tilted than it really is, judging distant objects to be smaller than they are, or seeing colors differently near certain colored objects.
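To give a rough sense of how sensory noise and prior expectations can push an estimate away from the truth, here is a toy Bayesian-observer sketch in Python. Everything in it is an assumption made for illustration: the Gaussian sensory noise, the prior peaked at a cardinal (0-degree) orientation, and the posterior-mean readout. It shows only that noise plus expectations produce a systematic shift; it is not the study's actual model and does not reproduce the specific biases the theory explains.

```python
import numpy as np

def estimate_orientation(true_deg, noise_sd=4.0, prior_sd=10.0,
                         n_trials=10000, seed=0):
    """Average estimate of a tilted line by a toy Bayesian observer.

    Assumptions (for illustration only): the observer receives the true
    orientation corrupted by Gaussian sensory noise, holds a Gaussian prior
    peaked at 0 degrees (a cardinal orientation), and reports the posterior
    mean on each trial.
    """
    rng = np.random.default_rng(seed)
    grid = np.linspace(-45.0, 45.0, 901)               # candidate orientations (deg)
    prior = np.exp(-0.5 * (grid / prior_sd) ** 2)      # expectation favoring 0 deg
    estimates = []
    for _ in range(n_trials):
        m = true_deg + rng.normal(0.0, noise_sd)       # noisy sensory measurement
        likelihood = np.exp(-0.5 * ((grid - m) / noise_sd) ** 2)
        posterior = prior * likelihood
        posterior /= posterior.sum()
        estimates.append(np.sum(grid * posterior))     # posterior-mean estimate
    return float(np.mean(estimates))

true_tilt = 6.0
print(f"true tilt: {true_tilt:.1f} deg, "
      f"average estimate: {estimate_orientation(true_tilt):.2f} deg")
# Noisier senses lean harder on the prior, so the average estimate drifts
# further from the true tilt -- i.e., the bias grows with sensory noise.
print(f"with more noise: {estimate_orientation(true_tilt, noise_sd=12.0):.2f} deg")
```

In this simplified setup, more sensory noise means the prior dominates and the bias grows, which echoes the article's point that limiting irrelevant noise reduces bias.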

The study’s real-world implications include a better understanding of psychiatric disorders that have been linked to certain perceptual biases. Economic choices and complex human emotions can also be shaped by perceptual biases. If we want more control over our own perceptual biases, a key is to limit irrelevant noise and instead base decisions on solid data.

“More information means less bias,” Wei summarized.
