
How Ford and Edison Inspired a Daughter and Father to Build Optisapiens, an AI for the Blind

Almost a year ago, we visited the Ford and Edison Winter Estate Museum in Ft. Myers. Before entering the property, we were given an audio player so that, when we reached a certain part of the estate, we could press the right buttons on the device to hear a narration telling the story of that area. This, of course, became a major challenge for us: most of the time we ended up debating which part of the estate we were in, which was rather stressful. While roaming the grounds and talking about Edison, Ford, and inventions in general, my 9-year-old daughter Paula came up with a brilliant idea: instead of the audio player and all the button-pushing, why couldn't a better device be used in the museum? Her idea was that when you enter a certain area of the estate, the system would automatically detect where you are and play the relevant background story, with no buttons to push. Nowadays we would usually call this "a first world problem". I actually call it the "Alpha Generation's (those born between 2010 and 2024) problem".

That night, we spent a lot of time talking about her idea and dreamt of developing an intelligent device that we agreed to call "Optisapiens": "opti", meaning vision, and "sapiens", meaning smart. Smart vision. I loved the name, so that same night I bought and registered the domain optisapiens.com.


Fast forward one year: in between work, studies, family, and everything else, I now have a working prototype of Optisapiens. Last semester, I took an AI and IoT class at Harvard as an elective for my Master's in Information System Management and decided to build the MVP (minimum viable product) of Optisapiens to help visually impaired people navigate the world through the lens of AI.


Here is a YouTube video of the MVP prototype of Optisapiens SV (Smart Vision) 2020, which includes, among other features:


  • Digital Echolocation Ultrasonic Sensing (DEUS) – detects the distance to obstacles, much as a bat does.

  • RA (Recognition Algorithm) – an edge facial recognition system that achieves 90%+ accuracy from a single reference photo.

  • Ambient sound classification – distinguishes among environmental noise, motor vehicles, and the alert sound from the buzzer. I want to evolve this component to vibrate as well, to help those who are both deaf and blind.

  • NLP (Natural Language Processing) – using the Google Assistant AI.

  • InteliOptics glasses – Bluetooth glasses with a built-in speaker that provide audible information to the blind (e.g., when the system detects somebody, or to hear responses from the Google Assistant).
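To illustrate the core idea behind DEUS, here is a minimal sketch of how an HC-SR04-style ultrasonic sensor's echo round-trip time could be turned into an obstacle distance and a buzzer alert. The sensor model, function names, and the 1 m alert threshold are my assumptions for illustration, not details from the actual prototype.

```python
# Sketch: convert an ultrasonic echo round-trip time into a distance,
# and decide whether to sound the obstacle-alert buzzer.
# (Hypothetical helper names; the real DEUS implementation may differ.)

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 °C

def echo_to_distance_m(round_trip_s: float) -> float:
    """Convert an echo round-trip time (seconds) into distance (meters).

    The pulse travels to the obstacle and back, so we halve the path.
    """
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

def should_alert(round_trip_s: float, threshold_m: float = 1.0) -> bool:
    """Trigger the buzzer when the obstacle is closer than the threshold."""
    return echo_to_distance_m(round_trip_s) < threshold_m

# A ~5.83 ms round trip corresponds to an obstacle about 1.0 m away.
print(round(echo_to_distance_m(0.00583), 2))  # → 1.0
print(should_alert(0.002))                    # → True (~0.34 m away)
```

On a real device, the round-trip time would come from GPIO pulse timing (e.g., via RPi.GPIO on a Raspberry Pi), but the distance math is the same.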

My ultimate goal is to build a hybrid-cloud artificial neural network at the lowest possible cost, using open-source technologies, so that Optisapiens will be accessible to visually impaired people who may not have the economic means to buy commercial solutions, which are often very expensive.

I am now in the process of integrating other features into Optisapiens SV2020. The final version will include the following:


  • OCR (Optical Character Recognition) – to let the blind read printed text such as traffic signs, books, etc.

  • Color detection – to detect traffic light signals.

  • Object detection – to identify the objects around them.

  • Home automation control – so they can operate smart home devices without having to walk over to adjust the AC's temperature, turn the lights on or off, etc.

  • Google Translate integration – so they can not only read regular text but also read and translate text in other scripts.

  • Sentiment analysis – to tell whether the person in front of them is sad, happy, perplexed, etc.

  • Scenario analysis – to find out what is happening around them.

  • Text and voice messaging – to send messages to family members.

  • Online Remote Visual Assistance – using InteliOptics version 2.0, the visually impaired can ask family members or volunteers to activate remote vision assistance and stream near-real-time video of what is in front of them.

  • And seemingly infinite further AI opportunities, both for the visually impaired and for those who are not.
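As a taste of the planned color-detection feature, here is a minimal sketch that classifies a dominant RGB sample as a red, yellow, or green traffic signal. The thresholds and function name are illustrative assumptions; a real implementation would analyze camera frames (e.g., with OpenCV) rather than a single color value.

```python
# Sketch: classify a dominant RGB color as a traffic-light signal.
# (Hypothetical thresholds for illustration only.)

def classify_traffic_light(r: int, g: int, b: int) -> str:
    """Return 'red', 'yellow', 'green', or 'unknown' for an RGB sample."""
    if r > 150 and g > 150 and b < 100:
        return "yellow"  # red and green channels both high → yellow light
    if r > 150 and g < 100:
        return "red"
    if g > 150 and r < 100:
        return "green"
    return "unknown"

print(classify_traffic_light(220, 40, 30))   # → red
print(classify_traffic_light(40, 200, 60))   # → green
print(classify_traffic_light(230, 210, 40))  # → yellow
```

Checking the yellow case first matters, because a yellow light is bright in both the red and green channels and would otherwise be misread as red.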

I hope that my daughter and I will be given the chance to pilot Optisapiens SV2020 at the Ford and Edison Winter Estate Museum. It would be a small win but a big honor for my daughter Paula Siquijor, who inspired me to bring this project to life.


braincubation

A TECH COLLABORATION FORUM

Hosted by: Rom Siquijor 

A practical technologist & braincubator


© 2018 BY BRAINCUBATION