SwiftKey and other predictive typing tools



Post by Erik_Kowal » Thu Jan 28, 2016 7:16 pm

Here is an article from TechRepublic about the conception and development of SwiftKey, a predictive-typing app that runs primarily on Android smartphones but is also about to be released for Apple's new iOS 8 operating system. One of the interesting aspects of its evolution is how the app has been shaped by applying models of natural-language usage to the technology.
"We very quickly realised that people have Qwerty wired into their brains and that layout is almost as familiar to them as speaking. It really is that fundamental an element of people's communications lives," he [i.e. SwiftKey's founder, Jon Reynolds] said.

Instead [of replacing the QWERTY layout with a different one], they focused on the problem of how to capture the way we use language and how to build that into the software that sits behind the keyboard and makes predictions.

This meant approaching the problem from two different directions. The first thing Medlock [i.e. Ben Medlock, co-founder and SwiftKey's CTO] needed was a huge source of information about how people use language, so he used the European Grid - a massively parallel computing network built to analyse data from the Large Hadron Collider - to extract all the publicly available texts off the internet in different languages. This formed the basis of the background model.
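The "background model" described here is essentially a statistical language model built from a large text corpus. As a minimal sketch of the idea, here is a simple bigram count model in Python - the real SwiftKey model is of course far larger and more sophisticated, and all names below are illustrative:

```python
from collections import defaultdict

def build_bigram_model(corpus):
    """Count word-pair frequencies to estimate which word tends to follow which."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(model, prev_word, k=3):
    """Return the k most frequently observed words after prev_word."""
    candidates = model.get(prev_word, {})
    return sorted(candidates, key=candidates.get, reverse=True)[:k]

# A toy stand-in for the web-scale corpus the article mentions.
corpus = [
    "I am going to the shop",
    "I am going home now",
    "I am late for the meeting",
]
model = build_bigram_model(corpus)
print(predict_next(model, "am"))  # ['going', 'late']
```

A production system would use longer contexts, smoothing, and compressed storage, but the prediction step is the same in spirit: rank candidate next words by how often they followed the current context in the corpus.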

The second element was to build the personalised individual-specific element that recorded and understood the foibles of each individual user - how we hit the keys, the way we commonly misspell things, etc.

"That individual bit is both about what kind of word(s) you use and the context you use them in and how to blend that with this background usage, but it's also about how you interact with your phone. When we capture the ways you tap on your own screen, we can model almost an individual fingerprint of your perception of the keyboard to make the experience more accurate for you," Medlock explained.
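The "blending" Medlock mentions can be pictured as interpolating between two probability estimates, one from the user's own history and one from the background model. The article does not say how SwiftKey actually combines them, so the following is only a hedged sketch assuming simple linear interpolation, with all names and the weight value invented for illustration:

```python
def blended_probability(word, context, personal, background, weight=0.3):
    """Linearly interpolate a per-user model with the background model.

    personal and background map (context, word) -> probability;
    weight is how much trust is placed in the user's own history
    (an assumed value, not SwiftKey's).
    """
    p_user = personal.get((context, word), 0.0)
    p_bg = background.get((context, word), 0.0)
    return weight * p_user + (1 - weight) * p_bg

# This user often types "gym" after "going", which the background model rates as rare.
personal = {("going", "gym"): 0.4}
background = {("going", "home"): 0.3, ("going", "gym"): 0.05}
print(blended_probability("gym", "going", personal, background))
# 0.3 * 0.4 + 0.7 * 0.05 ≈ 0.155
```

The effect is that an individual's quirky word choices get boosted above what the general model would predict, without the general model ever being discarded.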

For example, if you regularly tap the "i" a little to the left and it picks up the "u" instead (and you subsequently correct it), then SwiftKey sees that and knows that when you make that tap you mean to get an "i", so it learns that and starts giving you the "i" when you tap there.

"It's actually phenomenal what you can do; every time you select a word on SwiftKey it's updating all of these geometric models about the way you interact with the touchscreen. It's remembering the use of the language and using that to influence the language models. There's a huge amount going on," he explained.
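One way to picture the "geometric models" Medlock describes is a per-key offset learned from taps the user later corrected: each correction reveals where the user was actually aiming, and the key's effective target drifts towards that point. A minimal sketch under that assumption (the class and its running-average update are illustrative, not SwiftKey's actual method):

```python
class KeyTargetModel:
    """Learn a per-key tap offset from taps the user subsequently corrected."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha    # learning rate for the running average (assumed)
        self.offsets = {}     # intended key -> (dx, dy) habitual offset

    def observe_correction(self, intended_key, tap_xy, key_centre_xy):
        """Record a tap that the user corrected to intended_key."""
        dx = tap_xy[0] - key_centre_xy[0]
        dy = tap_xy[1] - key_centre_xy[1]
        ox, oy = self.offsets.get(intended_key, (0.0, 0.0))
        # Exponential moving average of where this user really taps.
        self.offsets[intended_key] = (
            ox + self.alpha * (dx - ox),
            oy + self.alpha * (dy - oy),
        )

    def effective_centre(self, key, key_centre_xy):
        """Shift the key's hit target towards the user's habitual tap point."""
        ox, oy = self.offsets.get(key, (0.0, 0.0))
        return (key_centre_xy[0] + ox, key_centre_xy[1] + oy)

model = KeyTargetModel()
# The user aims at "i" (centre at x=200) but habitually taps slightly left.
for _ in range(10):
    model.observe_correction("i", tap_xy=(195, 50), key_centre_xy=(200, 50))
print(model.effective_centre("i", (200, 50)))
```

After a handful of corrections the effective centre of "i" has moved left towards where this user's finger actually lands - which is exactly the "individual fingerprint of your perception of the keyboard" behaviour described in the quote.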
Commenters to the article have also mentioned a rival app, Swype, which the article itself ignores.

Here are Wikipedia links for SwiftKey and Swype, plus an article about the corresponding Microsoft product, Wordflow (available for Windows Phone 8.1 devices), which can even be used when texting blindfolded:

