2008-10-21

CODING: Getting computers to understand human speech

One bit of coding I have in mind to develop (under a public license, of course) is a kind of natural-language interpreter.

The premise: why can't we tell our computers what to do instead of moving around a mouse and typing on a keyboard?

As far as simple sentences go, it shouldn't be hard. The only difficult piece will be the code that converts spoken words into text the computer can parse (e.g. ASCII, or more generally Unicode).

After that, it comes down to linguistic rules. Here is an example, using two languages to show versatility.

Rule for identifying the weekday:

English: "Today is {weekday}"
Japanese: "Kyou wa {weekday} desu"


Rule for {weekday} (with a pipe "|" indicating "or"):


"{sunday}|{monday}|..."


Rule for {sunday}:

English: "sunday"
Japanese: "nichiyoubi"

Same for the other days of the week.

To handle possible mis-recognitions by the speech-to-text step, the pattern would be something like this:


"sunday|(sunny day)|..."
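These rules translate fairly directly into pattern matching. Here is a minimal sketch in Python; the table names and the regex approach are my own illustration, since the post doesn't specify an implementation:

```python
import re

# Sentence templates with a {weekday} slot, one per language.
TEMPLATES = {
    "en": "today is {weekday}",
    "ja": "kyou wa {weekday} desu",
}

# Spoken forms for each day; "sunny day" covers a likely
# mis-recognition of "sunday". (Truncated; a real table would
# cover all seven days.)
DAYS = {
    "sunday": {"en": ["sunday", "sunny day"], "ja": ["nichiyoubi"]},
    "monday": {"en": ["monday"], "ja": ["getsuyoubi"]},
    # ... the other weekdays follow the same shape
}

def build_pattern(lang):
    """Expand the {weekday} slot into a regex alternation for one language."""
    alternatives = "|".join(
        re.escape(form)
        for forms_by_lang in DAYS.values()
        for form in forms_by_lang[lang]
    )
    template = TEMPLATES[lang].replace("{weekday}", "(" + alternatives + ")")
    return re.compile(template, re.IGNORECASE)
```

So build_pattern("en") yields a regex that matches "Today is sunday" and also the mis-heard "Today is sunny day", capturing whichever form was spoken.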


When the rule matches something said by the user, you could use logic like the following:


if ({Today rule hit})
    if ({weekday} = {sunday})
        {code for sunday}


The same pattern would follow for the rest of the weekdays.
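A runnable version of that logic might look like the following sketch; the handler functions and the NORMALIZE table are illustrative names, not part of the original design:

```python
import re

# Hypothetical per-weekday handlers; only sunday is filled in here.
def sunday_code():
    return "run the sunday reminders"

HANDLERS = {"sunday": sunday_code}  # ... other weekdays added the same way

# A minimal "Today is {weekday}" rule. "sunny day" is folded back into
# "sunday" before lookup, mirroring the mis-recognition rule above.
RULE = re.compile(r"today is (sunday|sunny day|monday)", re.IGNORECASE)
NORMALIZE = {"sunny day": "sunday"}

def handle(text):
    match = RULE.match(text)
    if match:                              # the "Today" rule hit
        day = match.group(1).lower()
        day = NORMALIZE.get(day, day)      # map mis-recognitions back
        handler = HANDLERS.get(day)        # {weekday} = {sunday}?
        if handler:
            return handler()               # {code for sunday}
    return None
```

Keeping the per-day code in a lookup table rather than a chain of ifs means adding a weekday is just one more entry, which scales better once all seven days (and other sentence rules) are in play.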

This somewhat mimics human memory recall. If a person asked what day it was, upon hearing an answer, their mind would search through their memory for things to remember on that particular day.
