Key Takeaways
- Human communication is a constant blend of meaning-driven (semantic) and emotion-driven (affective) signals.
- Our brains process spoken language primarily in the left hemisphere, but emotional sounds and music engage the right hemisphere more evenly.
- Spoken language may have first evolved from singing, a form of emotional communication, before being adapted for abstract meaning.
- Facial expressions serve as a vital, often unconscious, layer of communication that clarifies vocal ambiguity.
Your Message Isn't Just Logic
Andrew Huberman and Dr. Erich Jarvis reveal a deeper truth about communication: it blends meaning-driven (semantic) and emotion-driven (affective) signals. Dr. Jarvis distinguishes between “semantic communication, communication with meaning, and affective communication, communication that has more of an emotional feeling content to it.” As a founder, you're constantly trying to convey meaning—market opportunity, product vision, team strategy. The core insight from this discussion is that emotional content holds a primary position; it might be the very foundation of communication.
The podcast explores the hypothesis that spoken language didn't emerge fully formed for abstract thought. Instead, it likely evolved from more primal, emotional vocalizations—specifically, singing. “That has led a number of people to hypothesize that the evolution of spoken language of speech evolved first for singing... and then later on it became used for abstract communication like we're doing now,” Jarvis states. This isn't just an evolutionary footnote: it means every time you speak, you're tapping into a system built for conveying emotion first.
Decoding Unspoken Signals
While the left hemisphere handles the heavy lifting for speech—processing words and grammar—the right hemisphere is more engaged by music and emotional sounds. “The left in us humans is more dominant for speech but the right has a more balance for singing or processing musical sounds as opposed to processing speech,” Jarvis explains. This division of labor implies that even as you deliver a logically structured argument, your emotional tone and non-verbal cues are being processed on a parallel, powerful track.
Critically, facial expressions play a silent but active role. They aren't mere accompaniments to speech; they are an intrinsic part of the communication system. “The facial expressions get rid of that ambiguity,” Jarvis notes. If your words say one thing but your face, posture, or tone say another, the emotional signal often wins. This is why a perfectly logical pitch can fall flat if the speaker's affect doesn't align with their message.
This perspective challenges the common founder mindset that clear, concise logic is king. Clarity matters, but ignoring the affective layer is like sending an email without a subject line. Your team, investors, and customers are constantly reading your emotional signals, often unconsciously, and those signals either clarify or contradict your spoken words.
What to Do With This
Next week, before any critical communication—a pitch, a difficult feedback session, or a team meeting—record yourself. Don't just check if your points are clear. Actively review your tone, pacing, and facial expressions. Ask yourself: does the emotional content of my delivery match the meaning I intend to convey? If your words say 'confident' but your face looks 'anxious,' adjust your delivery until both layers align.