Scientists suggest that an excessive focus on sight and sound is harming people’s well-being, and argue that our devices should offer a more multisensory experience
“Wait a minute, wait a minute. You ain’t heard nothing yet.” This iconic line from the 1927 film The Jazz Singer marked a pivotal moment in media history, introducing audiences to the combined experience of sight and sound in film. Over the years, we’ve seen advancements from black and white to color, improvements in frame rates, resolutions, and sound quality. However, our media consumption still predominantly focuses on stimulating our eyes and ears.
Today, with the average person spending nearly seven hours a day on screens, much of it indoors, our dependence on sight and sound has grown. Yet, as humans with multiple senses, are we neglecting our other faculties, and what impact is this having on us?
Many psychologists classify our primary senses as either rational or emotional, and there is evidence to support the distinction. According to Charles Spence, a professor of experimental psychology at Oxford University, “Smell [and taste are] directly connected to the emotional processing areas of the brain,” while the rational senses, hearing and vision, are processed in the cortex. Spence adds that more than half of the neocortex, which itself accounts for over half the brain’s volume, is dedicated to processing visual information.
It’s undeniable that humans are primarily visual creatures, which is why our media focus so heavily on audiovisual elements. Meike Scheller, an assistant professor in the department of psychology at Durham University, explains, “I think it’s mostly driven by the fact that a lot of the information that we consider important today can be conveyed via visual or auditory means. But what we consider important doesn’t necessarily mean these are the things that we need.”
While most people would say sight is the sense they could not live without, evidence suggests that what we would truly miss is our sense of smell. Scheller notes, “There’s a much higher rate of suicide and suicidal ideations among people with anosmia, because it’s a sense that’s so strongly linked to our emotions.”
Is the neglect of certain senses in favor of others affecting our emotional lives? Given that our emotional health is closely tied to our social well-being, the answer is likely yes. Scheller notes, “Smell is a really important cue for social communication and this is something that’s not implemented in any technology we’re using today.”
For instance, research has shown that we subconsciously smell our palms after shaking hands with someone. Charles Spence explains, “That gives you hints about all sorts of things, from their health, to their age, even their personality. A fair amount of that’s lost if we’re interacting digitally only.”
Touch also plays a significant role in our emotional lives, but the finger-focused haptic feedback of our digital devices falls short. C-tactile afferents, a type of nerve receptor found abundantly on the hairy skin of our arms (but not the pads of our fingers), have been shown to evoke positive emotions when stimulated. Spence adds, “These receptors like slow, warm, tactile stroking.”
The cool, smooth touchscreen of a smartphone cannot replicate the warm, soft, subtly fragrant skin of another person. While this might result in less fulfilling social interactions for adults, for a generation of children increasingly socialized through technology, the consequences could be significant.
Scheller explains that children learn to interpret their senses in relation to each other. They might learn to associate a particular scent with the sound of someone shouting or the sight of them smiling, using these cues to navigate social interactions in the future. “Those children growing up with less input basically have less training in being able to categorize how certain things smell, or what a certain touch might mean,” says Scheller. “If all of a sudden we take something away that has evolved over millions of years, that will not only be the removal of one sense, but it will affect how all the other senses work.”
Marianna Obrist, a professor of multisensory interfaces at University College London, emphasizes that our everyday life experiences engage all of our senses. She notes that everything we experience is multisensory.
Consider eating, for example. While it’s easy to think of eating as primarily about taste, the shape and color of our food, its smell and sound, temperature, texture, and weight all engage our vision, olfaction, audition, and touch. “All those senses have already started playing before you’re even eating,” Obrist explains. Then there’s mouthfeel, which includes the physical sensations of spiciness or sourness, and of course, the flavor.
Removing just one of these senses can significantly impact the entire experience. For instance, when people eat ice cream in the dark, they are less likely to enjoy it or even to identify its flavor accurately. “Whenever we have multisensory stimulation, we get a much better and richer representation of the environment around us,” says Scheller.