
Since my team won MIT's Hacking Medicine hackathon two years ago with an app that generates structured documentation from an unstructured patient-doctor interaction (by passively listening to and watching the encounter), I've taken on the challenge of natural user interfaces. Recently, APIs such as Google's Web Speech API, which I've used in academic research, have proven able to convert speech to text with enough fidelity to be useful in real-world applications. And with devices like the Amazon Echo, whose far-field microphones can discern speech through ambient noise, we've cleared the second major hurdle in natural user interfaces.

Now it's time to start bidding farewell to keyboards, mice, and ugly software that looks like an Excel spreadsheet. Instead of scrolling through a mind-numbing list of vital signs and lab values in an electronic health record, a provider should simply be able to ask: "How high did John Doe's blood pressure get in the past 4 hours?" or "Trend Jane Doe's creatinine level over the past week." That is what I mean by a natural language interface: software that lets humans interact with it the way humans think, through natural speech.
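To make that concrete, here is a minimal sketch of what such an interface might do behind the scenes: take the transcribed query and map it to a structured request against the health record. Everything here is hypothetical and illustrative (the intent names, patterns, and request fields are my own invention, not part of any real product):

```python
import re
from datetime import timedelta

# Hypothetical intent patterns for the two example queries above.
# A real system would use a proper NLU model, not regexes.
PATTERNS = [
    # "How high did John Doe's blood pressure get in the past 4 hours?"
    (re.compile(r"how high did (?P<patient>[\w ]+)'s (?P<measure>[\w ]+) get"
                r" in the past (?P<n>\d+) (?P<unit>hour|day|week)s?", re.I),
     "max"),
    # "Trend Jane Doe's creatinine level over the past week."
    (re.compile(r"trend (?P<patient>[\w ]+)'s (?P<measure>[\w ]+?)(?: level)?"
                r" over the past (?P<n>\d+ )?(?P<unit>hour|day|week)s?", re.I),
     "trend"),
]

UNIT = {"hour": timedelta(hours=1),
        "day": timedelta(days=1),
        "week": timedelta(weeks=1)}

def parse_query(text):
    """Map a spoken query to a structured EHR request, or None if unrecognized."""
    for pattern, intent in PATTERNS:
        m = pattern.search(text)
        if m:
            n = int((m.group("n") or "1").strip())  # "the past week" implies 1
            return {"intent": intent,
                    "patient": m.group("patient").strip(),
                    "measurement": m.group("measure").strip(),
                    "window": n * UNIT[m.group("unit").lower()]}
    return None
```

The point of the sketch is the shape of the output: once speech-to-text has done its job, the remaining work is translating free-form language into the same structured lookup a clinician would otherwise perform by clicking and scrolling.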

If you have an Amazon Echo, check out my new Alexa Skill, Carly, a voice-activated health coach. The next stop will be introducing doctors to Carly and the joy of natural user interfaces.
