Train to be a velotype captioner

31 10 2015

Are you interested in training as a velotype captioner?

Come to 121 Captions’ assessment day and find out if this is for you!



With the Velotype system, you can write up to 200 words per minute on a specially adapted keyboard, with free annual software upgrades. The software is available for Microsoft Windows, Apple Macintosh and Linux, and the keyboard can be used for over 30 languages.


Velotype Academy

The training is free. The course is interactive and can be downloaded. Learning the basics takes only a few months; training to reach a high speed can take between 7 months and 2 years, depending on how much time you invest.



Assessment Day

Date: 30 November 2015, 9am

Venue: PC Werth, Audiology House, 45 Nightingale Lane, London SW12 8SP

Cost: Free. Donations are welcome.

This event is led by:

  • Wim Gerbecks – Velotype
  • Sander Pasveer – Velotype
  • Tina Lannin – 121 Captions

The event aims to give you an opportunity to:

  • Meet the Velotype training team
  • Try out the Velotype keyboard
  • Assess your ability to be a velotypist
  • Check out remote live captioning platforms

You will be able to find out about:

  • Velotype Academy
  • Working remotely: Q&A session
  • 121 training: Remote working & deaf awareness
  • 1Fuzion remote captioning system

There will also be a short training session on Text on Top, an on-site wireless captioning system.


Booking your place

You will need to book your place for this event. Places are limited – book now! To book, contact us or call 020 8012 8170.

When you book, please confirm whether you already have a Veloboard and whether you are already working as an Electronic Notetaker.


Further information

Velotype Academy

How the veloboard works

Veloboard including costs and languages

VeloNote text editor software

Text on Top

121 Captions training courses




Lipreading the dregs of history

19 07 2015

It is with great disappointment that we have seen a video from the Royal Archives of the Queen and Queen Mother published in the newspapers with an attempted lipreading translation of the footage.

As expert witness forensic lipreaders working with the courts and police in the UK and internationally, we are well qualified to comment on this video. Several of our expert lipreaders have examined the footage, and our professional conclusion is that it is not lipreadable – at all – due to the very grainy resolution and the distance from the camera. It is therefore not possible to have lipread this footage and come up with the comments that were published today.

Lipreading is a difficult skill to learn, and it is also prone to misinterpretation. When lipreading, only up to 30% of speech can actually be seen on the lips; the rest is inferred from the context of what is being said, so an excellent knowledge of the language is required.

Have a look in the mirror and say, without voice, “island view” and “I love you” – it is very common in lipreading to have such homophenes (words that look alike). This makes a lipreader’s job much more difficult, particularly so when you have very few words to work with.

Lipreading is not a reliable form of evidence in court, and great care must be taken when using it. One of our lipreaders was involved in a quality check of the lipreading skills of Jessica Rees. Working independently of two other lipreaders, and with no prior knowledge, all three came to the same conclusion: none of the key words matched the report created by Jessica Rees.

We have been following the reactions in the news and on social media, and it seems this is not a “wave”. However, it must be pointed out that professional forensic lipreaders are not body language experts, and it would be unprofessional for us to comment on this aspect.

The 121 Captions forensic lipreading team

HLAA Convention 2015

28 06 2015

We attended the HLAA convention in St Louis and we had such fun! It was great to see many old friends again and catch up on our amazing cyborg-ness.

The photo shows the Japanese delegation; I was so pleased to be able to practice my Japanese. 今日は! (Hello!)


One of them kept asking why Jacob had to raise money for his cochlear implants when CI recipients have insurance in the US. In Japan, the national health care system fully funds cochlear implants, just as in the UK.

One of the guys in the photo is a jazz musician. He’s looking for any other jazz musician CI recipients to connect with – do leave a comment and contact link if you know of anyone or you’re a jazz musician yourself.

Every workshop at the event was captioned – which is fantastic. In Japan, deaf people are not so fortunate with access. Although Japan has switched to digital broadcasting, closed captions are often missing, depending on the region and on late-night programming, and DVDs, Blu-rays and internet broadcasts of Japanese movies and animation are rarely closed captioned.

There are few places where Japanese films are screened with Japanese closed captions; those screenings usually run for only a couple of days and in many cases happen just once. Film making in Japan often works to a low budget and a tight timeline, so low-budget late-night broadcasts and UHF stations are rarely closed captioned. Since closed captioning costs television stations money, they exploit a legal loophole: by broadcasting late at night or on UHF stations, they escape having to add captions at all. And of course the country is pretending not to see this.

The process of closed captioning has been kept hidden from the country’s inhabitants, and to keep the majority of society out of the know, little effort is put into developing people capable of providing captioning services. Broadcast and cable television stations are more likely to have closed captions, and only a small number of Japanese captioners work for the deaf.

The country, the media, NPOs and even organizations who work with disabled people won’t consider requests for closed captioning and won’t do anything about it. The younger generation in Japan have an openly disablist attitude. However, both disabled and non-disabled people are working to make life for disabled people a little more enjoyable.

HLAA delegates and USA inhabitants, count yourselves very fortunate!

Live captioning comes to South Africa

30 04 2015

Live Captioning at a university in Cape Town, South Africa.

The client connects to the captioning service via a microphone set up in the classroom. The captioner hears what is being said and writes the text back, which appears live on your device within 1 second. Your device can be a laptop, smartphone, Google Glass, Kindle – whatever connects to the internet.

Live captioning is used effectively in classrooms, meetings, conferences and teleconference calls, with the text appearing on a big screen and on participants’ own devices within 1 second – making conversation accessible to everyone, whatever their disability!
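To picture the mechanics, here is a rough, hypothetical sketch – not the actual 1CapApp platform or its API – of how caption text might reach a reader’s device. It assumes a WebSocket feed at a placeholder URL that pushes one caption line per message; the viewer simply prints each line the moment it arrives, so the delay is essentially just network latency.

    import asyncio
    import websockets  # third-party package: pip install websockets

    # Placeholder address – a real caption feed URL would be supplied by the service.
    CAPTION_FEED_URL = "wss://captions.example.com/rooms/lecture-theatre-1"

    async def watch_captions(url: str) -> None:
        # Connect to the caption feed and display each caption line as it arrives.
        async with websockets.connect(url) as feed:
            async for caption in feed:      # one message per caption line
                print(caption, flush=True)  # show it on the viewer's device immediately

    if __name__ == "__main__":
        asyncio.run(watch_captions(CAPTION_FEED_URL))

In practice the display would be a browser page or an app rather than a terminal, but the principle – push each line the moment it is written – is what keeps the text appearing within about a second.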

Live captioning enquiries – contact us
Follow us on Twitter @121captions

Blog post: Live captioning in South Africa

Who’s the best lipreader of all?

28 02 2015


Lipreading has become a rather commercial activity in the last few years. I’ve been asked to lipread celebrities at royal weddings, the Royals at royal weddings, babies and parents at royal christenings, criminals, sports people, and even the unsuspecting public.

I was born deaf and I have always been a lipreader. I am now totally deaf with 2 cochlear implants, yet I retain my lipreading skills. I am able to lipread most people I meet, lipread sideways, and even fool a lot of people into thinking I am a hearing person. I believe lipreading is not a science, it is an art. An art I have honed over many years, in many situations, in many different countries with various accents. My life experience of travelling around the world and “getting on with it” has served to make me a better lipreader. I can even lipread in Spanish, Japanese and Arabic.

There are days when my brain just “won’t compute” – I can be too tired for lipreading and the mental exercise it involves, or my brain might just say “no”, or I may get mentally stuck on a phrase which I know, from the context of the language, is not correct.

Forensic lipreading is even harder, as there is no sound at all, and often the screen view is very small. You watch video footage over and over and over, sometimes a hundred times, for perhaps a five-second segment, just to get that one word so that the whole sentence makes sense. You really can’t do this with a foreign language that you do not know.

Sometimes people tell me they are the best lipreader in the world. How do you think they are able to make that claim? Don’t you think it’s a bit presumptuous?

I’d love to hear what you think, please comment!

Hands off our hearing aids!

28 07 2014


North Staffordshire Clinical Commissioning Group (CCG) has announced proposals to withdraw the provision of NHS-funded hearing aids for adults with mild to moderate age-related hearing loss!

This would be devastating for people with hearing loss, leaving thousands of local residents unable to communicate in their day-to-day lives. If the cuts go ahead in North Staffordshire, who will be next? We could be looking at millions of people who struggle to hear being denied NHS hearing aids.

We’re calling on anyone who values free NHS hearing aids to join us in the fight to stop these changes!

Link to cause – Hands off our hearing aids!

Say hello to real-time captioning on Google Glass

2 02 2014

Did you know? 121 Captions can now stream real-time captions to Google Glass on their caption streaming platform, 1CapApp.

This means you can have a speech-to-text reporter (palantypist or stenographer), a CART writer, or an electronic notetaker listen to your conversation and stream it to your Glass as captions. BRILLIANT for deaf people!

As with all new technology, Google Glass can be rather confusing at first. You are probably wondering: what on earth is Google Glass? You’ll understand what this product is and how it feels to wear one after the jump – perhaps you’re even thinking about the potential uses. Wouldn’t you like to have everything captioned for you?


Marlene’s screenshot of real-time captions on Google Glass

To read more and to find out about user experiences:

121 Captions – Google Glass: Introduction