New cochlear implant processor from Advanced Bionics

19 01 2013

AB's next gen BTE

The next generation cochlear implant processor has been rolled out by Advanced Bionics, enhanced with Phonak’s high-performance hearing aid technology – an industry first in bimodal technology. The processor will be launched in the UK within the next few weeks, though it will be some time before it is actually available, as audiology clinicians will require training before they can provide it. This processor has the most exciting list of goodies!

The new processor is 40% smaller than the Harmony, Advanced Bionics’ previous BTE processor – in the photo below, the Harmony is on the left, a hearing aid is on the right, and the new processor is in the middle. It’s thinner and lighter – smaller in size, but larger than life in performance. The processor is available in lots of colourways.

Colour wheel

Size

Features

UltraZoom – The user can focus on a speaker in front of them in a noisy environment.

ClearVoice – Sound is automatically analysed to filter out environmental sounds from the speech signal, improving understanding of speech in noise by up to 55%.

The SNR (Signal to Noise Ratio) is how much louder a voice is in relation to the background noise. Children need a speaker to be 15 decibels louder than background noise in order to be easily understood; adults only need an SNR increase of 4 to 6 decibels. When UltraZoom is used with ClearVoice, a unilateral user gains up to 6.5 dB of SNR (there’s a short sketch of the arithmetic after this feature list). The future capability of these combined features means bilateral users will benefit from up to a 70% improvement in understanding speech in noise.

HiRes Optima – Advanced Bionics’ newest sound processing strategy optimises battery life.

T-Mic – The unique T-mic is a microphone situated at the ear canal, utilizing the ear’s natural capability to gather sound for optimal listening.

HiRes Fidelity 120 – This is the only sound strategy in the cochlear implant industry that uses 120 spectral bands to deliver five times more sound resolution than any other cochlear implant processor. This means sounds are richer, fuller, and more natural.

AutoSound – Automatically adjusts to the widest range of sounds, from softest to loudest, with an IDR (Input Dynamic Range) of up to 80 dB.
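For anyone who likes to see the numbers, here is the arithmetic behind those SNR figures as a minimal Python sketch – my own illustration of the standard RMS formula, not anything from Advanced Bionics:

```python
import numpy as np

def snr_db(speech: np.ndarray, noise: np.ndarray) -> float:
    """Signal-to-noise ratio in dB, from the RMS levels of speech and noise."""
    rms_speech = np.sqrt(np.mean(speech ** 2))
    rms_noise = np.sqrt(np.mean(noise ** 2))
    return 20 * np.log10(rms_speech / rms_noise)

# A voice at twice the RMS level of the background comes out at roughly +6 dB,
# about the margin an adult needs, while a child would want nearer +15 dB
# (about 5.6 times the background level).
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 16_000)   # stand-in background babble
speech = rng.normal(0.0, 2.0, 16_000)  # stand-in voice at twice the noise RMS
print(f"SNR is about {snr_db(speech, noise):.1f} dB")  # prints roughly 6.0
```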

WIRELESS CONNECTIVITY

Matt has a Harmony processor, an Oticon Safari 900 SP hearing aid, and an iConnect. He wants to pair his laptop to his Harmony via Bluetooth – he hates wires everywhere and wants to do his auditory rehabilitation at his desk in a noisy open plan office. The easiest solution for Matt is to use a Phonak Smartlink with an MLXi FM receiver, but this is expensive. To access Bluetooth, the iConnect is needed to attach the processor to cables, a neck loop, or Bluetooth headphones. Advanced Bionics advised Matt that the best solution is to upgrade to a Phonak hearing aid and the new Advanced Bionics processor with all its connectivity options…

Phonak ComPilot – With the ComPilot, Matt can link wirelessly to a wide variety of devices such as mobile phones, computers, Bluetooth devices, media players, TVs, navigation systems, and FM systems.

Phonak RemoteMic – Speech is streamed directly and wirelessly to both ears, making it easier for Matt to listen to speech in noisy places.

Phonak TVLink – Audio is streamed directly to Matt’s cochlear implant processor so he can listen to the TV.

Advanced Bionics myPilot – With this remote control, Matt can change his processor settings.

*** Bimodal Technology ***

For the first time, Matt is able to wirelessly and simultaneously stream sound to a Phonak hearing aid and an Advanced Bionics cochlear implant processor. This is a hugely exciting stride forward in cochlear implant technology, merging the technologies available from Phonak and Advanced Bionics.

Phonak ComPilot

Phonak RemoteMic

Phonak TVLink

AB myPilot

BILATERAL HEARING

The new cochlear implant processor is very exciting for bilateral listeners – such as me! Phonak Binaural VoiceStream Technology™ will allow me to hear speech and phone calls, adjust volume, and change programs – simultaneously.

Features

ZoomControl – I will be able to focus on a speaker situated on either side of me, to help me hear better in noisy places.

DuoPhone – My phone calls can be automatically streamed to both ears so I can hear voices in stereo, with a higher level of speech perception from binaural hearing.

QuickSync – Both processors can be adjusted instantly at the same time.

Future Developments

There is yet more to come from Advanced Bionics – WindBlock, EchoBlock, and SoundRelax. Cochlear implant processors are entering an exciting phase of development, as all manufacturers are expected to come out with new processors this year.

Watch this space!

Advanced Bionics
Brochure: Next Gen Processor
Phonak: Dynamic FM





Getting the eyes checked: Tick

13 09 2010

The eyes are sooo important. I needed a contact lens checkup and popped into my optician this morning. It’s Tricky Dicky Time. Today was a different experience with my new cochlear implant. (It’s still a baby, so it’s still new and exciting.)

I was annoyed with the horrible music they were playing in the shop – I had never noticed this before.

In the examination room, I picked up the loud hum of the air conditioning unit (I had never noticed this before either) and changed my cochlear implant setting to ClearVoice high, which cut out the hum. Now I could more clearly focus on the optician’s voice. She started talking and turning around so I told her I need to lipread. From then on, everything went swimmingly.

The equipment for the checkup is straight out of the control room in Star Trek. A huge pair of ‘glasses’ is swung from the wall to the front of my face and I fit my chin on the chin rest. I peep through the holes and I can see the optician’s face through one tiny round window, then the other. I am asked to say which letters I can see on the wall chart, then whether green and red look the same on the wall, and whether the letters look sharper or not as she switches lenses.

Then we get to the ever-tricky part. She needs to look at the internal health of my eyes. To do this, she needs to switch off the light, and then I’m rendered unconscious – I mean, incommunicado. During this checkup, I always find it tricky as I am unable either to hear the optician tell me what to do or to lipread instructions in the dark – ‘Look up’, ‘Look straight ahead’, etc. I have devised coping strategies for this. I always remind them of my need for visual cues, and they will point at the same time as telling me where to look. Luckily, there is just enough light to see their finger point the way.

Today, however, I discovered I could hear *and understand* what she said in the dark. It made the experience much easier, quicker, and less stressful. Another little step up the cochlear implant staircase of progress! Whoo Hoo!





Please, Audie, can I have some more?

18 08 2010

I had been told I had a 9 month wait until my next mapping session and I wasn’t too happy when I discovered Michele has a mapping session every month at her hospital, over a year after her activation. Apparently my large London hospital is too busy implanting children. In an attempt to jump-start my hearing’s learning curve, I popped in for a new mapping.

My settings were (HiRes-S with Fidelity 120):

Program slot 1: IDR60, ClearVoice medium, 100% T-mic

Program slot 2: IDR70, no ClearVoice, 100% T-mic

Program slot 3: IDR60, no ClearVoice, 50% T-mic

I asked for these to be changed to:

Program slot 1: IDR70, ClearVoice medium, 100% T-mic (my everyday program)

Program slot 2: IDR80, no ClearVoice, 100% T-mic (for music)

Program slot 3: IDR70, ClearVoice high, 50% T-mic (for noisy places)

I wanted an IDR of 80 to improve my enjoyment of music. I like an IDR of 70 as I can hear the lower tones in traffic and people’s voices. The default IDR for Advanced Bionics is 60 but I found this a little flat once hearing through a cochlear implant felt normal (about 3 months). Not everyone likes a high IDR and it can take some getting used to. The IDR is the Input Dynamic Range of the cochlear implant processor. This is not the same as the volume.

Imagine, if you will, you have been locked in a dark room for all of your life. The curtains are opened and bright sunlight floods in, diffused through the partially open window blind. At first, the light blinds you. It’s so strange and so bright that you can’t see. This is what it’s like at the cochlear implant activation. People who have been blind for a shorter time find it easier to cope with than those who have been blind all of their lives. Your eyes gradually adjust and you can make more sense of what you see – colours, outlines, contrast, etc. Then you open the curtains a little more – this is like increasing the volume of the cochlear implant. The window blind, behind the curtains, is pulled up a little so you can see more outside. Opening the blind is like increasing the IDR. You gain a wider range of sight and can see more objects, which get clearer the more you look at them. Or, you can hear a wider range of sounds and the more you listen, the more distinct and individual they become.
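If you put a few made-up numbers on that curtains-and-blind picture, it might look something like the toy Python sketch below. This is only my own illustration (the 95 dB upper limit and the sound levels are invented, and it is nothing like the processor’s real signal path): widening the IDR lowers the floor of the sound window so that softer sounds get captured, whereas turning up the volume just makes the same window louder.

```python
# Toy model of the IDR "sound window" -- purely illustrative, not AB's real DSP.
# Assumption: the window extends IDR decibels downward from a fixed upper
# limit (the 95 dB figure is made up for this example).
SOUNDS = {
    "rustling leaves": 20,
    "whisper": 30,
    "quiet speech": 45,
    "normal speech": 60,
    "busy traffic": 75,
}

def captured(idr_db: int, upper_limit_db: int = 95) -> list[str]:
    """Return the sounds whose level falls inside the IDR window."""
    floor = upper_limit_db - idr_db
    return [name for name, level in SOUNDS.items() if level >= floor]

for idr in (60, 70, 80):
    print(f"IDR {idr} dB -> hears: {', '.join(captured(idr))}")
```

With these made-up levels, an IDR of 60 misses the leaves and the whisper, 70 picks up the whisper, and 80 captures the lot – which is why I find 70 less flat than the default 60.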

ClearVoice is an amazing additional program which automatically softens background noise so I can hear speech. I like using ClearVoice high on the London streets and train stations. Today I used it in the office as we had roadworks outside and the loud drill was horrific – my poor colleagues had headaches but I just flipped a switch. *grins*

I had another hearing test and showed a slight improvement. It’s amazing – and very surreal – to have continually improving hearing!


The lines on the audiogram show my hearing, from the bottom line up:

1: Black line: February 2010, before my cochlear implant

2: Green line: April 2010, 2 weeks after switch-on

3: Blue line: June 2010, 3 months after switch-on

4: Pink line: August 2010, 5 months after switch-on

Although I now have good hearing, my brain cannot process all of this information. It’s too new and too much data. I’ve been very deaf all of my life and this new information is not going to get sorted and filed in a matter of weeks! The likely timeline for optimum performance from a cochlear implant for a born-deaf candidate is 2 years. It is hard to watch other people do so well so quickly, but this is why I celebrate every little milestone – because it IS progress, and a snail can win the race just as well as the hare – it just takes longer.

I’m picking out the odd sentence or even paragraph in my audio book. In real life – much harder – I’m starting to pick out a few words here and there. I went shopping last weekend and asked the assistant if he could unlock the changing room for me. He said ‘Follow me’, unlocked the door, and I thanked him. He replied ‘You’re welcome’. As is often typical for a hearing person, he had not been looking at me when talking. Yet I managed to understand what he said. I was thrilled! I’ve heard people say ‘Excuse me’ when they shove – er, move – past; not something I’ve heard before either.

Last weekend, I took a train and on the hour long journey, I had great fun listening to all the announcements, to see how much I could understand. I could hear about 70% of them. I met a friend in a noisy supermarket cafe and had no problem having a conversation with her (unthinkable in my hearing aid days). I then caught a train back, on a different network, but the announcements were not as loud or clear. Today, I’ve been able to hear quite a few of the announcements on the London underground.

I’ve been practising using the phone for a week or so. The success of this varies according to vocabulary, clarity of the speaker, tone of voice, accent, type of phone being used, background noise … so this is still quite a hill I have to climb. I’m starting off with very simple sentences rather than conversations.

Our cat Hussy miaows all the time. She cries for food, attention, whatever. In 2 years, I have never heard her miaow. A couple of days ago I saw her come up to the doorway and this huge MIAOOOOW smacked me between the eyes. Or should I say, between the ears. I was stunned; the loudness and clarity took my breath away.

I haven’t been able to hear the smoke alarm for a few years and when the placement officer from Hearing Dogs came to visit a couple of weeks ago, she tested it for Smudge. EEEEEeeeeeeeEEEEEEEEeeeeeeeEEEEEEEeeee – it is a totally disgusting sound! Other surprising sounds in the home have been the extraction fan and bolognese sauce bubbling on the cooker.

Last week I went for a drink and sat outside talking to a friend at a pavement café – again, unthinkable with hearing aids. It was on Goodge Street which was very noisy indeed with rush hour traffic. With my cochlear implant, I had very little difficulty following the conversation. I was prompted to put my hearing aid in my other ear – not having looked at it for months – and was assaulted by an indescribable wall of loud meaningless sound. I thought a number of police sirens had suddenly started somewhere but there was just ordinary traffic. Nothing was clear and all the sounds were blended together. After a minute I took it off and I had a thumping headache. My head felt as if it had been kicked really hard on the hearing aid side, it actually throbbed with the pain.

It’s been an interesting 5 months. I’m hearing sounds I never realised existed. I’m enjoying sounds I’ve never heard before. I’m feeling so much less stressed with communication. I do have ‘off’ days when I feel as if life is too loud, or I only have half a head of hearing. It’s early days though and this will go away in time. I’m so glad I took the road less travelled; I’m starting to reap what the medical team and I have sown, and actually, I’m thinking of getting a second cochlear implant. Hell – I’d get a third if they could find somewhere to put it!





Music makes my heart sing

19 06 2010

I’ve just had my 3 month checkup. The first person I saw was my surgeon who is a very happy bunny. All looks good! My internal implant is a little sore along the side of the bump; this turned out to be where it touches the processor. I’ll need to hop along to my optician’s and get the arm of my glasses (and sunnies!) adjusted so it doesn’t press against the area behind my ear and weaken the skin.

I then went to see my audiologist. She was also very happy at my progress. I had all my electrodes set again to maximum comfort levels and was given a slight increase in sound. I asked for ClearVoice (high) to be replaced with a normal program with a wider IDR (Input Dynamic Range) of 70 for music. With a wider IDR (explained here), you gain a wider range of sound. For the last two weeks, music has sounded pretty much perfect. When I listen to my iPod with the 70 IDR, it sounds even better, it’s so beautiful that I don’t want to stop listening. If I close my eyes, I can pretend I am hearing in stereo, as I sit enveloped in this wonderful sound that is in-my-face-listen-to-me, full and rich, swirling around my head and making me feeeeel the emotion. Vocals sound normal and some are so beautiful that they make me want to cry. Isn’t this what music’s all about?

I had a hearing test and have improved in the last 2 months, so this was great news. The decibel range from zero down to -30 dB is considered to be a normal range of hearing for a hearing person (above the red line).

Red dots : My hearing 3 months after activation
Blue dots : My hearing 2 weeks after activation
Black dots : My hearing before the cochlear implant

My speech and language therapist tested me on my language comprehension, in the left ear with cochlear implant only. Here’s an updated progress chart from pre-implant through 2 weeks post-implant, to my current 3 month status. I’m aiming to get all speech comprehension scores close to 100%.

KEY:
Sentences in quiet = Listening to sentences without lipreading
Words in quiet = Listening to single words without lipreading
Lipreading & sound = Lipreading and listening to a speaker’s sentences
Lipreading in quiet = Lipreading a speaker’s sentences with no sound

The biggest change has been my ability to hear sentences in a soundproof booth, which has jumped from 24% with a hearing aid to 43% with a cochlear implant. If I did not have the cochlear implant, this ability would have continued to decline. I have been able to understand some words when listening to my Harry Potter audio book; it’s so exciting when I am able to pick out a bit here and there. It’s hard work – it’s almost like concentrating but trying not to concentrate too hard, like when you look at those magic eye 3D pictures and try to see what’s hidden there. My ability to hear words in quiet hasn’t changed, as this is very difficult to do without context to help.

My lipreading in quiet scores, at 43%, are very high. I spoke to a professor whose area of interest is forensic lipreading, and she said most people would score 5% in lipreading in quiet. Deaf people get to practise lipreading every day of their lives but as there are so many homophenes and unseen phonemes, it is not possible to score 100%. It’s great that I can still lip read well – and thank goodness I can, or I’d be stuffed trying to get through all of this! I’ve been worried that my ability to lipread would decrease as I learn to hear and try to break the habit, but my audie reassures me and academic studies show this is not usually the case. However, some of my implanted friends say they cannot lipread any more, discovered when they run out of battery power and are forced to rely on lipreading. So I don’t really know if I’ll be able to hang onto my lipreading ability.

I have experienced some new sounds in the last month. The beeping as the green man (walk/don’t walk) sign flashes when I cross the road, and I can hear it All The Way Across The Road. Amazing! I went to see a ballet, Swan Lake, at the Royal Albert Hall. This was my first visit to a ballet. I was able to hear the orchestra very well and was surprised to see the ballet dancers enter and exit stage very beautifully and gracefully, but with an incongruously ungraceful THUD THUD THUD THUD THUD THUD at the same time! I peeled a banana this morning and was surprised by the loud SSSSSSSSSSS sound it made. I then popped out to the shops and another new sound had me jumping in fright so much that I almost threw myself into the nearest wall. I heard this very loud, deep roar right behind me – I could almost feel it – it came out of nowhere, I didn’t know what it was, and it frightened the crap out of me. I used to be scared of dogs that jumped and barked at me so maybe this is where that fright came from, apart from it being so loud and unexpected. I then saw a Harley Davidson go past; obviously it had revved just before it reached me. SHEESH!

Although singing voices sound normal, speaking voices don’t sound normal yet (when people talk to me directly) although they are not far off.  I feel as if I am living on Planet Cartoon as people walking past still sound like Minnie Mouse or Donald Duck.

And my shoes squeak all the time! Bah!





Cochlear implant mapping

4 05 2010

I’ve had 5 “mapping” sessions since activation. A mapping is a reprogramming of the cochlear implant, to readjust the electrical stimulation limits of the electrodes as each user’s brain adapts to sound and fibrous tissue grows over the internal implant. Mappings are typically carried out once a week for the first 6 weeks, then every few months, then twice annually.

At each mapping I was given increased volume and it was an opportunity to address any concerns with the audiologist. This was followed by a coffee break in the hospital cafe then a speech therapy session. I have one more mapping session this week, then my next one is in June when I have my 3 month check.

It’s been a rollercoaster ride. I started with beeps and pushed so hard that I got a constant whine when I put the implant on. This set me back and I had to slowly build up my tolerance of high frequency sounds again from zero, bit by bit, and I have successfully avoided a recurrence of the whine. I have not yet reached full volume – there is still some way to go, which is kind of scary. I found last week quite difficult as everything seemed too loud and I started feeling stressed, but I hung in there and carried on wearing the cochlear implant until I got used to the increased sound levels.

Increased sound levels can be problematic for cochlear implant users because they are more sensitive to loudness changes. A person with normal hearing can hear a wide range of sounds, from very soft whispers to loud rock bands; this dynamic range of hearing is about 120 dB (normal speech sits within the 40-60 dB range). However, a cochlear implant can only deliver an electrical dynamic range of about 20 dB, and that 120 dB of sound needs to be compressed into it. This is why a cochlear implant user is more sensitive to changes in loudness than a hearing person.

If the IDR is small, sounds outside the IDR need to be omitted or compressed; sounds that are too quiet will be cut off, and sounds that are too loud will be compressed and will sound distorted. The 3 main brands of cochlear implants have different IDRs: Advanced Bionics has 80 dB, MedEl 55 dB, and Cochlear 45 dB with autoranging. I currently have my IDR set at 60 dB.
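To make the squeeze a little more concrete, here is a rough sketch of the idea in Python – my own simplification, assuming a fixed upper limit and a simple linear mapping (real processors use much cleverer compression curves): sounds below the window are dropped, sounds inside it are scaled into roughly 20 dB of electrical range, and sounds above it are flattened against the ceiling.

```python
from typing import Optional

def to_electrical_db(input_db: float, idr_db: float = 60.0,
                     upper_limit_db: float = 95.0,
                     electrical_range_db: float = 20.0) -> Optional[float]:
    """Map an acoustic level (dB) into the implant's ~20 dB electrical range.

    Rough illustration only: levels below the IDR window are dropped (None),
    levels inside it are squeezed linearly, and levels above it are limited.
    The 95 dB upper limit is an invented figure for this example.
    """
    floor = upper_limit_db - idr_db
    if input_db < floor:
        return None  # too quiet: falls outside the sound window
    clipped = min(input_db, upper_limit_db)
    return (clipped - floor) / idr_db * electrical_range_db

for level in (30, 50, 70, 90, 110):
    out = to_electrical_db(level)
    print(f"{level} dB in ->",
          "dropped" if out is None else f"{out:.1f} dB of electrical range")
```

With a 60 dB IDR, the 30 dB sound is dropped, everything from about 35 dB to 95 dB is squeezed into the 20 dB electrical window, and the 110 dB sound is simply pinned at the top – which is why very loud sounds through a processor can end up distorted.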

What actually happens in a mapping session? I replace my processor’s battery with a direct connect input lead to the audie’s computer and put the processor back on my head. (Yeah, this freaked ME out the first time I did this).

The audie’s software will reprogramme my implant’s Harmony external processor.

My cochlear implant has 16 electrodes and when each one is stimulated, I will sense each stimulation as a beep.
The audie will set the Threshold (T) levels [to access soft speech and environmental sounds] and Comfort (M) levels [the amount of electrical current required to perceive a loud signal] for each electrode by judging the most comfortable and the highest stimulation I can tolerate – the most comfortable and loudest beeps I am happy to listen to.

I use the loudness scaling chart to indicate to the audiologist which level each stimulation correlates to, ranging from ‘Just Audible’ to ‘Too Loud’.

Then the audie ensures the M levels are similar in terms of my perception, so that the volume is the same in each electrode – I was able to tolerate very high levels of high frequency sounds this week but she brought these back down, otherwise everything would have sounded weird and unbalanced.
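For the curious, the map itself boils down to a table of numbers. Here is a hypothetical Python sketch of the kind of data involved – the values are invented and real clinical software holds far more than this, but the shape is the same: one Threshold (T) and one Comfort (M) level per electrode, with incoming loudness steered somewhere between the two.

```python
from dataclasses import dataclass

@dataclass
class Electrode:
    """One electrode's programmed stimulation limits (arbitrary units here)."""
    t_level: float  # Threshold: the softest stimulation I can just detect
    m_level: float  # Comfort: the loudest stimulation I'm happy to listen to

# A hypothetical 16-electrode map with made-up values.
my_map = [Electrode(t_level=20 + i, m_level=120 + 2 * i) for i in range(16)]

def stimulation(e: Electrode, loudness: float) -> float:
    """Place a loudness value (0.0 = just audible, 1.0 = loudest comfortable)
    between that electrode's T and M levels."""
    loudness = max(0.0, min(1.0, loudness))
    return e.t_level + loudness * (e.m_level - e.t_level)

# e.g. a moderately loud sound on electrode 5
print(stimulation(my_map[5], 0.6))  # 88.0 with these made-up numbers
```

Balancing the M levels across electrodes – as my audie did when she brought my high frequencies back down – just means adjusting those m_level numbers so that a loudness of 1.0 feels equally loud on every electrode.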

This mapping method is rather tedious and drawn out over several months. Clarujust is new software (currently in an FDA trial) from Audigence Inc, in which the patient and processor are interfaced to a computer, words are played and repeated as heard, and the software adjusts the map accordingly. Mapping this way reportedly takes 30 minutes. The software can be used by all hearing aid and cochlear implant companies except Advanced Bionics; however, Phonak signed up with Audigence Inc this year, prior to the Advanced Bionics/Sonova acquisition.

When a mapping is new, it tends to sound louder, until I get used to it. It takes 3 days to get used to a new mapping, then I find loud sounds have become softer and more tolerable, and I can hear a wider range of soft sounds. It is uncomfortable turning up the volume of life to the max every few days. I still have to brace myself for the jolt first thing in the morning, making the transition from complete silence to a full technicolour river of loud sounds pouring into my brain. Amanda’s Tip of the Day: If you wake with a hangover, take your time to put on your CI and turn down the volume. It helps. A little.

It’s an amazing learning process as I am also trying to identify sounds as well, discovering amazing new ones, and learning to discriminate between things that sound similar to me. My hearing is like a baby, it needs time to learn and grow, but it can be fun too.

Erber’s model sets forth 4 levels of auditory skill development:

    Awareness of sound (presence v absence)
    Discrimination (same v different)
    Recognition (associating sound/word with meaning)
    Comprehension (using listening for understanding)

I have now reached the second level: I am hearing things but finding it difficult to discriminate between some sounds. Obviously, this means I am still lipreading. In my speech therapy session this week, I discovered I can’t distinguish between Gear v. Dear, or tIme v. tAme. I can listen and fill in a word missing from a written sentence, but listening to a whole sentence and being given one written word is more difficult. With hearing aids, both tasks would have been impossible.

In addition to mapping, my progress is occasionally evaluated with an audiogram and speech perception performance with the cochlear implant in a soundproof booth. These tests assist the mapping process and indicate any further adjustments required. I expect I’ll have this done this week, and hope to have improved upon the 18% I achieved in my last speech perception test.

I was programmed with ClearVoice last week but am still adjusting to my new mapping, so I have just been ‘tasting’ this wonderful addition. I tried it on the train; the roar of the train engine and clashing sounds (brakes or pressure pads? – haven’t worked this sound out yet) dropped away significantly and I could clearly hear voices around me. It was awesome. Yesterday, I was sitting by a window and became conscious of this sound. I realised it was the rain spitting outside. In the garage, I could hear the drumming of the rain on the roof and the traffic outside. With ClearVoice on, the traffic became very quiet and the rain became a very clear high PLINK PLINK PLINK, and a lower PLONK PLONK when it came through a hole in the roof and landed on an object. Again, awesome!

Try out the ClearVoice demo for yourself. Don’t forget to say the mandatory WOW!

Shanti is waiting for her cochlear implant operation date and works as a personal trainer and complementary therapist. She gave me a super aromatherapy massage yesterday and I left feeling very relaxed. As soon as I left, I plugged into my iPod and was amazed to hear that the tinny / Donald Duck tone of vocals had gone from a lot of songs. Perhaps there is a link between relaxation and better hearing. Today, voices sound largely normal and it’s so so so NICE to have some normality again!

Photos courtesy of Byron





The unbearable loudness of hearing

25 04 2010

It has been one month since activation and my world has changed beyond recognition and exploded into a kaleidoscope of sounds. Some are old sounds which sound different, some are completely new. The world sounds different through a cochlear implant and it is starting to sound much better.

Each time I have a mapping, my bionic hearing is adjusted – at the moment we are still focusing on increasing the volume. For the last week I have been listening in awe to the (surprisingly noisy) birds, the crackle and pop of rice krispies, my office clock ticking, the ssss of my perfume atomizer, the jangle of keys and my dog’s clinking collar tag, and all the little sounds my dog makes when he wants something! I am discovering that some things, heretofore silent to me, actually do make a sound. The photocopier touchpad beeps, the door of the room next door squeaks (and now annoys me immensely), my hands rasp when I rub them together and so does moisturiser when rubbed on my skin, running my fingers up my arm makes a soft rasping sound too.

I have been utterly shocked by the cacophonous ssshh of brakes and beeps of doors on public transport, the roar of traffic, people in the street, the sharp crackle of plastic bags and paper, the clatter of crockery, the flushing toilet, the microwave nuking food, and the kill-me-now roar of aeroplanes (unfortunately, I live near Heathrow). Last Saturday was the first day in my life that I was able to hear all the birds so I sat in the garden, in the sunshine, and listened. This also happened to be the first day of the airline stoppages due to the Icelandic volcano eruption and the skies were silent. I only realised how much noise airplanes made this week when the airports re-opened for business. Over the last three days, I have become quite overwhelmed by the loudness of some sounds, now that my implant’s volume is nearing an optimum level.

I went to a social event a few days ago and although noisy, I was able to pick out peoples’ voices more easily which made lipreading easier. I heard this strange sound behind me and turned around to see a woman playing a harp. It sounded totally different from what I expected, like a soft guitar.

The strange thing is that high frequency sounds seem much louder to me than other sounds. A person with a hearing loss cannot screen or ‘filter’ out sounds in the way that hearing people do, so everything seems loud. This is why noisy places are so problematic, as hearing aids and cochlear implants amplify all sounds so that environmental sounds are as loud as voices, and the hearing impaired person is unable to filter out the background noise (the cocktail party effect). Now that the high frequency sounds are so new to my brain, these seem extra loud to me, my brain is going WOW What’s This?, sitting up and taking notice, and is only now listening to low frequency sounds again. The world is starting to sound more normal. Voices still sound tinny so it’s a struggle to understand speech.

I can now hear the dial tone on the phone. I started off by listening to phone sounds (these work on both PC and Mac) and will next try listening to a script I’ll give to a friend.

There are four levels of auditory skill development according to Erber’s model – awareness of sound (presence v absence), discrimination (same v different), recognition (associating sound/word with meaning) and finally, comprehension (using listening for understanding). As I was born deaf and have been deaf for 40 years, I’m going to struggle harder and for longer to climb up this ladder.

It is a common misconception that we hear with our ears. In fact, the auditory system, from the outer ear to the auditory nerve, merely provides the pathway for sound to travel to the brain. It is in the brain that we hear. If a person developed hearing loss after learning language (postlingual hearing loss), the brain will retain an “imprint” of the sounds of spoken language. The longer a person is deaf, the more challenging it is to recall these sounds. In the case of a person who has never heard (hearing loss at birth), or who has had very limited benefit from hearing aids, sound through a cochlear implant will be almost entirely new. That person will need to develop new auditory pathways, along with the memory skills to retain these new sounds. Whatever a person’s history, rehabilitation can be very useful in optimizing experience with a cochlear implant.

Being able to detect sound, even at quiet levels, does not mean that an individual will be able to understand words. Norman Erber developed a simple hierarchy of listening skills that begins with simple detection: being aware that a sound exists. An audiogram indicates detection thresholds. Although thresholds with a cochlear implant may indicate hearing at 20 dB to 40 dB (the range of a mild hearing loss), the ability to understand words can vary greatly. The next level of auditory skill is that of discrimination; that is, being able to determine if two sounds are the same or different. For example, one may detect the vowels oo and ee but not be able to hear a difference between the two sounds. The third level of auditory skill is identification. At this level, one is able to associate a label with the sound or word that is heard. Key words may be clear, such as cloudy or rainy, within the context of listening to a weather report. Erber’s final level of auditory skill is that of comprehension. Words and phrases provide meaningful information rather than just an auditory exercise. At the level of comprehension, a person is able to follow directions, answer questions, and engage in conversation through listening.

(Source: Advanced Bionics)

I’m still at the stage of detecting sounds and trying to move into the next stage of discriminating between sounds.  Two weeks ago, I was unable to tell the difference between PAT and BAT, TIN and DIN, FAN and VAN. With the practice I have done, I am now able to do this with almost no errors. I am now working on listening for the difference between MACE and MICE, and DEAR and GEAR – which is difficult as D and G sound so similar. I don’t know what to listen for so am hoping the brain kicks in at some point!

My speech perception is improving slowly. I have tried to make discrimination practice fun, by listening to Amanda on Skype. She will give me a colour, or a month, or a day of the week, or a number between 1 and 10. Maybe next I will try tube stations or football teams, whatever I think I can cope with, to keep it fun. We decide which closed set we will do – using Mac to Mac, the in-built sound (and video, for lipreading) quality is very good, aided by my use of a direct-connect lead to my processor. I am trying to work towards ‘open sets’ – unknown sentences – by asking people to put a hand over their mouth and give me a sentence. Patrice gave me my first sentence this week: “Bob and Kirby are waiting for me in the car park” and I got it correct except for the word “car”. She gave me a second sentence and I got that spot on. With practice, I will improve. We tried a discrimination exercise – I am now able to hear the roadworks behind the office – they had been working there for a year and I had missed it all (lucky me). So when they hammered, drilled, or dug with a spade, Patrice told me and I listened for the different sounds.
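If anyone fancies copying this kind of closed-set practice with a helper, something as simple as the little Python script below would do the job. It’s just my own sketch (the sets are the ones Amanda and I have been using – swap in tube stations or football teams as you like): the helper runs it, reads each prompt aloud with their mouth covered, and records whether the listener repeated it correctly.

```python
import random

# Closed sets for listening practice -- use whichever feels manageable.
CLOSED_SETS = {
    "colours": ["red", "blue", "green", "yellow", "purple", "orange"],
    "days": ["Monday", "Tuesday", "Wednesday", "Thursday",
             "Friday", "Saturday", "Sunday"],
    "numbers": [str(n) for n in range(1, 11)],
}

def practice(set_name: str, rounds: int = 10) -> None:
    """The helper reads each item aloud (mouth covered!) and records the result."""
    items = CLOSED_SETS[set_name]
    correct = 0
    for _ in range(rounds):
        item = random.choice(items)
        input(f"Say: '{item}'  (press Enter once the listener has answered) ")
        if input("Did they get it right? y/n: ").strip().lower() == "y":
            correct += 1
    print(f"{correct}/{rounds} correct from the '{set_name}' set")

practice("colours")
```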

Music is improving too. I am finding that rock with vocals louder than the music wins hands down. Opera sounds good; piano, flute, and guitar sound quite good. There are musical resources specifically for CI users. Advanced Bionics offer Musical Atmospheres (free for AB users), available online or on CD, which introduces new music through 3 hours of recorded musical examples of increasing complexity, helping to establish a firm foundation for musical memory. They also offer A Musical Journey through the Rainforest, and Music Time for children. Med El offer Noise Carriers, a musical composition available on CD from hearf@aol.com – see Listen, Hear! newsletter no. 20 for further information. Cochlear don’t seem to have any resources but they do offer tips.

I am finding that I am feeling soooo much better than I did with hearing aids. I used to have headaches almost every day, I was always exhausted from the constant effort of lipreading, reading the palantype (CART), concentrating to make sure I didn’t miss anything, and stressed by the thought of any social event. Now, I’m not exhausted every evening, I’ve had one headache since activation, lipreading is somehow easier as I’m getting more audio input even though people still sound like Donald Duck, and I feel much more relaxed overall, and more positive about communicating with ducks – er, people.

I’ve finally discovered the noisy world that everyone lives in. This noisy world should get a bit quieter this week when I get ClearVoice, which will automatically reduce background sounds so I can concentrate on voices. It’s almost a magic wand for hearing loss. All I will then need is to be able to comprehend speech, and I’ll do a convincing fake of a Hearing Person.

I’ve lost the clouds but I’m gaining the sky. And the sun will shine. You’ll find me out there, in the Hearing World, shining brightly with happiness. And as the video below nicely demonstrates, I want to kick your butt!





Dancing into a new life

29 03 2010

It’s been four days since switch-on and my bionic hearing is changing quickly. On Wednesday, I was able to detect a few sounds but they were all beeps. I started to pick out people beeping around me like little birds, items being banged – sorry, set – upon my desk, pages being turned in a series of beeps, the phone ringing in a beep. I had been given three levels of sound on my processor, which was expected to easily last me until my next visit to the audiologist’s, in five days’ time. On the second day, the volume of the beeps was getting quieter and quieter, and I kept increasing the sound. I started to detect my work colleagues’ voices, with an accompanying beep. I was able to detect a glass being filled with beepy water and draining down the sink’s plug hole, a kettle boiling in mini beeps and switching off with a ping, a crisp packet being beepily rustled. I was getting rather beeped off!

By Thursday evening, I didn’t have any more volume to add on my processor and I didn’t want to wait until Monday’s audiology appointment and miss out on any progress. So there I was on Friday morning, banging on my audie’s door, and she gave me a big increase in sound levels on the processor. She can’t believe how fast I am progressing and has told me to slow down, that my brain needs time to take it all in or I might hamper my progress. She thinks it is because I have done so well with my hearing aids that my brain is very well developed at listening so is able to make sense of the new cochlear implant sound more quickly. My audie said a lot of people take a month off after switch-on to relax at home and take in the new sounds, then they have a shock getting used to their usual routine when they go back to work. Considering I am facing a month of crappy sound whilst my brain adjusts, I reckon returning to work is the smartest thing to do. Just before I left the audie’s office, I realised she wasn’t beeping when she was talking to me.

*hurrah!*

It was Friday afternoon and I was back in the office. I was shocked to find that I could make out my colleagues’ voices without beeps over the top; their voices sounded distant, in high tones, but I could make them out. It was happy tears all round and a very emotional day. I was amazed that I was able to hear through a computer in my head.

I received a very kind gift from Patrice, Bob and Kirby – a pretty seashell for my New Ear Day, very appropriately reminiscent of a cochlea – and beautifully polished until it shone. I have spent the three days since activation working as normal and that means listening and taking in sound from clients all day, chatting to my interpreter, colleagues and friends, going to the usual noisy cafes for lunch. I think all this has really helped me to ‘acclimatise’ to listening through a computer. It was wonderful to hear a voice and not a beep, and it really helped with my lipreading – which I found a lot less stressful.

Voices now sound quite weird as my brain adjusts to the new sound, and I am having great difficulty understanding what is being said today. I expect to go backwards sometimes as I adjust, but to carry on moving in the right direction. I can see that there are so many different shades of hearing: moving from silence, to sensations or beeps, to detecting some environmental sounds, detecting voices and life around me, moving on to comprehending sounds and then – finally! – understanding speech. My Holy Grail is to understand speech without lipreading. A bonus gift would be to enjoy music. I’m on Advanced Bionics’ HiRes-S with Fidelity 120 program and will get an additional program in May 2010 called ClearVoice, which is revolutionary in its ability to reduce background noise or the ‘cocktail party effect’. So now I have my goal in sight.

My sound database is now constantly being populated with a drip-feed of familiar, new and sometimes surprising sounds. Familiar sounds I can now hear are the dog barking next door, cars passing me, and sometimes footsteps. When I walk through a busy place such as the cafe in our office building, I don’t experience the usual wall of indecipherable echoey loud white TV noise that hurts my ears and makes me want to scream very loudly, but instead I detect the quiet chirping of people’s voices. (I know from Amanda, the cutlery will become my new sound from hell). At the moment my window of sound is still quite small, because I would not be able to cope if the audie let me have it all at once. It is a mountain that I have to climb slowly, take a rest now and again, acclimatise myself bit by bit. So at the moment, I am only able to hear high frequency sounds that are not too far away from me. I have tried listening to my iPod and music sounds absolutely rubbish, however the volume is set much lower (my hearing aid required the maximum iPod setting and hearing aid setting). I’ve bought my first audio book, Harry Potter’s first book, and find that very hard to listen to as the sound makes no sense – what I am hearing sounds like a long wail with gaps. But I’m holding the faith! Here’s why ….

New sounds from my cochlear implant are the Blackberry / Mac / remote control keys clicking, cutlery on plates, plastic bags rustling, scissors cutting plastic, clicking fingers, bus doors thumping shut, my dog panting and whining (he sure whines a lot!), using an eraser, the bathroom door lock and light clicking from the room next door, the doves and pigeons making a racket in my chimney, my own breathing and sniffing, zips, Amanda’s jaw cracking every time she opens her mouth – all these tell me that the cochlear implant is already outperforming my hearing aid. And it’s only been FOUR DAYS! I am realising that when something moves, it makes a sound. The first sound I could hear clearly, sounding normal and without any pesky beeping, was my shoes scraping on the tarmac when I walked my dog yesterday morning, and I took great pleasure scraping my shoes (and dancing) all around the park. Unfortunately, I now need a new pair of shoes.

I am amazed that I put up with such crappy hearing for so many years.

Bye bye, Crappy Hearing Aid.

Hello, ‘Borg with new shoes.

New, polished, and shining with pride. ~ Come dance with me!