Anonymous yet personal, this Blog chronicles
the daily events and musings of Jim.
It provides an easy way for his friends and family to check in on him,
and serves as an online repository for his random
thoughts, kaleidoscopic flashbacks, and writings on an array of diverse topics.
“Deconstructing Jim” is simply here to
entertain you, but not intended for college credit.

A little about me

Chapel Hill, NC, United States



Monday, December 14, 2009

Some thoughts on hearing

Neuroscience may someday develop to the point where it can explain why we listen to music. But for now, the question of how we listen to music is complex enough.

Although there is a theory that plants like to groove to rock music, our fellow mammals don't seem to care for it. A neuroscientist at New York University said, "...if you give monkeys a choice between music and silence, they choose silence pretty strongly."

Yet every human culture ever studied makes music in some form, and it seems to have always been that way. A 43,000- to 82,000-year-old Neanderthal bone flute was recently excavated from a cave of our distant ancestors in Germany. It's tuned for "Do-Re-Mi."

Here are a few curious facts...

Humans come with eyelids, which allow us to voluntarily shut out what we see. Why didn't we evolve ear flaps as well? It would be nice to turn off an annoying sound - or a bad piece of music. (Those of us with hearing aids thankfully have volume controls.)

Of all our senses, hearing seems to be the most supercharged. We hear in the equivalent of a 60" LCD flat-panel HDTV with 1080p resolution in 3D. The ear can distinguish discrete temporal events far more accurately than the eye. A percussionist playing a snare drum can easily tap out a repetitive pattern of twenty beats per second, and we hear each tap on the drum as an individual event in time. But when twenty successive images per second are presented to the eye, we see a movie: the brain connects the images, unifies them, and interprets the string of visual events quite differently.
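You can test the twenty-taps-per-second claim yourself. The short sketch below (my own illustration, not anything from the post - the file name and click parameters are arbitrary choices) uses only the Python standard library to synthesize a 20-clicks-per-second train and write it to a WAV file; play it back and see whether you hear separate taps or a blur.

```python
import struct
import wave

# Synthesize a click train at the rate mentioned in the text:
# twenty taps per second, each tap a brief decaying burst.
RATE = 44100          # samples per second
TAPS_PER_SEC = 20     # the percussionist's rate from the text
DURATION = 3          # seconds of audio
CLICK_LEN = 30        # samples per click (under a millisecond)

interval = RATE // TAPS_PER_SEC   # samples between click onsets
samples = []
for n in range(RATE * DURATION):
    phase = n % interval
    if phase < CLICK_LEN:
        # short linear-decay burst at the start of each interval
        samples.append(int(30000 * (1 - phase / CLICK_LEN)))
    else:
        samples.append(0)   # silence between clicks

with wave.open("click_train.wav", "wb") as w:
    w.setnchannels(1)   # mono
    w.setsampwidth(2)   # 16-bit signed samples
    w.setframerate(RATE)
    w.writeframes(struct.pack("<%dh" % len(samples), *samples))
```

Raise `TAPS_PER_SEC` and the individual clicks eventually fuse into a buzzing pitch - a rough demonstration of where temporal hearing shades into pitch perception.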

This means that music can potentially express ideas and relationships that the visual arts cannot. Works that push the envelope of aural perception - such as Milton Babbitt's 1987 solo snare drum piece "Homily" - explore rhythm to the max. The ear is faster than the eye, and when a message includes stimuli from both senses, the ear usually wins.

But just how does the brain resolve conflicts between the senses? There is at least one example where visual cues override aural input. If you watch a film in which someone on screen mouths the word "bah" while the soundtrack plays the sound of "dah," your brain will perceive the word as "bah" - a visual-dominance illusion related to the McGurk effect. However, when we are confronted with other contradictory information, such as rapid flashes of light paired with quick-fire beeps, our brains usually defer to our ears for accuracy.

The complexity of the ear-eye relationship in music is especially pronounced for the orchestral conductor, who faces an almost impossible task: standing before a hundred or so musicians seated at varying distances from one another and relaying a series of variable beats to them on a rather precise time-grid. Each orchestral musician sees the conductor from a different perspective and hears the flow of the music with a different delay because of the room's acoustics. They must meld visual and aural information in the context of the written music sitting before them on the music stand.

For the conductor, something as simple as communicating a downbeat is fraught with complexity. The exact moment at which a musical event begins, and how it will ultimately sound, depends on many variables. Unlike an oscillator, musical instruments don't have just two states, ON and OFF. The complex and important micro-events that occur within the short attack of a single note exist well within the human perception of time.

Individually and collectively, the complex acoustical soup of these events is channeled through the front-end processing of the listener's middle and inner ear. The information is ultimately parsed, processed, and interpreted by the amazing audio engineering burned into our brains.

Generally speaking, we find pleasure in what we hear. Talented, trained musicians and conductors have honed their visual-aural communication skills to maximize results and minimize collective error. Exactitude and clarity are the name of the game.

Composers often have to delegate the details of implementation of their works to conductors and musicians, but are well aware of the complexities of aural perception and the limits of human performance. It's long been known that we can hear far faster than we can physically perform.

As a consequence, some composers have ventured into the world of electronic music to exploit the full capacity of the impressive engineering of the human hearing mechanism. Computer-generated music can (and often does) move at a dizzying pace, and often outstrips even the most agile of musicians who perform on acoustic instruments. Perceptually, we can process music at a very rapid rate - taking in events far quicker than anyone can physically produce them. Some composers (such as Babbitt) have skillfully explored this rich territory of sound, and pushed both our hearing and what can be considered valid musical content right up to the limits of human perception.