Building better conversations for people with hearing problems

Handwriting notes to my grandma so that we could have a conversation was not really a conversation at all.

Can you believe it?

Day 1

What sparked this idea?

Being able-bodied, with good hearing, good eyesight, and the rest, I have never dealt with the struggles of disability or the workarounds needed to live in our society. I experience one form of this when I visit my nanna and try to speak with her. She is 96 years old and hates her hearing aid, which has led to disconnection even when we are right there with her.

A quick fix was using a notepad and pen to write down the words she couldn't lipread. Besides being slow, it often resulted in broken conversations, as her short-term memory is also affected.

After multiple visits using this low-tech solution, I tried my Samsung Notes app and its voice-to-text function to live transcribe our conversation. Voila! An immediate improvement. My nanna kept up with the large text displaying our conversation (my sister's, my nanna's, and mine), allowing us to have an interaction vastly different from those visits with handwritten notes. However, limitations still existed. And that is where this project begins.

Day 40

Many heads make light bulbs

EasySpeak, a placeholder name, has sat in the ideation stage for a while now. I spoke to my friend Liam, who gave sound advice about where to start and where a project like this could take me, as long as the execution is there.

I still needed direction about an efficient tech stack for this kind of program.

Just my luck, I ran into an old friend at my cousin's wedding. Jeno works as a programmer at Canva and attends a monthly build club in Melbourne. I joined the following month, gained a mountain of help, and found a welcoming community.

That is where I find myself so far: building with Next.js and using OpenAI's Whisper API.
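For the curious, a live-transcription flow on this stack usually means recording short audio chunks in the browser and posting them to OpenAI's hosted Whisper endpoint. Below is a minimal sketch, not ReVerbal's actual code: the function names are mine, and it assumes an `OPENAI_API_KEY` environment variable and the `whisper-1` model on OpenAI's `/v1/audio/transcriptions` endpoint.

```typescript
// The model name OpenAI exposes for hosted Whisper transcription.
const WHISPER_MODEL = "whisper-1";

// Build the multipart form body the transcriptions endpoint expects:
// the recorded audio blob plus the model name.
function buildTranscriptionForm(audio: Blob, fileName: string): FormData {
  const form = new FormData();
  form.append("file", audio, fileName);
  form.append("model", WHISPER_MODEL);
  return form;
}

// POST one recorded audio chunk to OpenAI and return the transcribed text.
// (Hypothetical helper; in a Next.js app this would live in a server route
// so the API key never reaches the browser.)
async function transcribeChunk(audio: Blob): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/audio/transcriptions", {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
    body: buildTranscriptionForm(audio, "chunk.webm"),
  });
  if (!res.ok) throw new Error(`Whisper request failed: ${res.status}`);
  const data = (await res.json()) as { text: string };
  return data.text;
}
```

The server route would call `transcribeChunk` on each uploaded chunk and stream the text back to the page, where it can be rendered in large type for readers like my nanna.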

Day 100 or so

Going all in

"Give EasySpeak a try" was the placeholder for this entry that I wrote on Day 40.

So much has happened since then. Firstly, I settled on the name ReVerbal instead of EasySpeak, as the latter is already used extensively in other arenas. That makes sense, since its conception was elementary wordplay. The name was always a placeholder, and deciding on ReVerbal did not take long once I began to take this thing seriously.

I landed back in Melbourne from my month-long journey across Japan, had a lot of free time before relief teaching picked up, and made strides to realise ReVerbal.

I started with a survey, conversations, and so much ChatGPT strategising that I would consider it a technical co-founder. All of this cemented in my mind that this thing could be big and help teachers and students across Australia.

Exactly four weeks on from that point, I am preparing to present ReVerbal at an Open Pitch Night held in Melbourne City. This is my first real opportunity to reach early-stage investors and collaborators.

Wish me luck!

Other milestones:

  • A roadmap

  • A software requirements document

  • An MVP

  • A trademark application

  • Two domains

  • 17 teacher survey respondents

See you next time,

JJ