Computers extract meaning from static handwritten text by processing an image of it, which includes separating characters from background noise. Recognizing text as it is being written, by contrast, typically takes account of pen movement and relies on special tablets to capture it.
On-line handwriting recognition is unusual among sequence labelling tasks in that the underlying generator of the observed data, i.e. the movement of the pen, is recorded directly. However, the raw data can be difficult to interpret because each letter is spread over many pen locations. As a consequence, sophisticated pre-processing is required to obtain inputs suitable for conventional sequence labelling algorithms, such as HMMs. In this paper we describe a system capable of directly transcribing raw on-line handwriting data. The system consists of a recurrent neural network trained for sequence labelling, combined with a probabilistic language model.
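The idea of labelling raw pen data directly can be sketched as follows. This is a toy illustration, not the paper's actual architecture: a single-layer recurrent network (random, untrained weights; invented names like `rnn_label_sequence`) maps a sequence of raw pen samples (x, y, pen-up flag) to a character distribution at every timestep, which is the form of output a sequence labelling algorithm works with.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def rnn_label_sequence(points, Wxh, Whh, Who, bh, bo):
    """points: (T, 3) array of raw (x, y, pen_up) pen samples."""
    h = np.zeros(Whh.shape[0])
    outputs = []
    for x in points:
        h = np.tanh(Wxh @ x + Whh @ h + bh)    # recurrent state update
        outputs.append(softmax(Who @ h + bo))  # char distribution per step
    return np.array(outputs)

# Illustrative sizes: 3 input features, 16 hidden units,
# 27 output classes (26 letters plus a "no character" label).
n_in, n_hidden, n_chars = 3, 16, 27
Wxh = rng.normal(0, 0.1, (n_hidden, n_in))
Whh = rng.normal(0, 0.1, (n_hidden, n_hidden))
Who = rng.normal(0, 0.1, (n_chars, n_hidden))
bh, bo = np.zeros(n_hidden), np.zeros(n_chars)

trajectory = rng.normal(size=(50, 3))  # fake pen trajectory, 50 samples
probs = rnn_label_sequence(trajectory, Wxh, Whh, Who, bh, bo)
print(probs.shape)  # one character distribution per pen sample
```

Because each letter is spread over many pen samples, the network emits many more timestep distributions than there are characters; collapsing them into a transcription is the job of the training criterion and decoder.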
Offline Handwritten Text Recognition (HTR) systems transcribe text contained in scanned images into digital text; an example is shown in Figure 1. We will build a Neural Network (NN) trained on word-images from the IAM dataset. Because the input layer (and therefore all the other layers) can be kept small for word-images, NN training is feasible on the CPU (of course, a GPU would be faster). This implementation is the bare minimum needed for HTR using TF.
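The final step in such a pipeline is usually decoding the NN's per-timestep character scores into a word. A minimal sketch of best-path (greedy) CTC decoding, the simplest decoder commonly used with this kind of TF-based HTR model, is shown below; the function name, toy alphabet, and scores are illustrative, not taken from the tutorial's code.

```python
import numpy as np

def ctc_best_path(scores, alphabet, blank=0):
    """scores: (T, C) matrix of per-timestep character probabilities."""
    best = scores.argmax(axis=1)  # most likely label at each timestep
    decoded, prev = [], None
    for lbl in best:
        # CTC rule: collapse repeated labels, then drop blanks
        if lbl != prev and lbl != blank:
            decoded.append(alphabet[lbl])
        prev = lbl
    return "".join(decoded)

alphabet = ["<blank>", "a", "c", "t"]
# toy one-hot NN output spelling "cat": blank, c, c, blank, a, t, t
scores = np.eye(4)[[0, 2, 2, 0, 1, 3, 3]]
print(ctc_best_path(scores, alphabet))  # cat
```

Real systems often replace this greedy step with beam search, optionally combined with a language model, to recover from locally wrong argmax choices.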
We learn such a generative model for each digit. Then, when a new input comes along, we check which digit model can best approximate it. This procedure is typically called analysis-by-synthesis, because we analyse the content of the image according to the model that can best synthesise it. That's really the key difference: feedforward networks have no way to check their predictions; you have to trust them. Our analysis-by-synthesis model, on the other hand, checks whether certain image features are really present in the input before jumping to a conclusion.
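The analysis-by-synthesis loop above can be sketched in a few lines. This is a deliberately toy version with invented data and names: the per-class "generative model" is just the class mean template, and a new input is labelled with the class whose model reconstructs (synthesises) it with the smallest error.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_models(images_by_class):
    # one generative "model" per class: here, simply the class mean
    return {d: imgs.mean(axis=0) for d, imgs in images_by_class.items()}

def classify(models, x):
    # analysis-by-synthesis: pick the model that best approximates x
    errors = {d: np.sum((x - m) ** 2) for d, m in models.items()}
    return min(errors, key=errors.get)

# two fake "digit" classes with clearly distinct prototypes
proto = {0: np.zeros(16), 1: np.ones(16)}
data = {d: p + rng.normal(0, 0.1, (20, 16)) for d, p in proto.items()}

models = fit_models(data)
x = proto[1] + rng.normal(0, 0.1, 16)  # noisy sample from class 1
print(classify(models, x))  # 1
```

A real digit model would of course be far richer than a mean template, but the classification rule is the same: analyse the input by asking which model synthesises it best.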
In Italy, 120 high school students helped solve a centuries-old problem: how to give researchers access to the Vatican Secret Archives, a massive collection of documents detailing the Vatican's activities as far back as the eighth century. That should look pretty great on their college applications. The shelves of the Vatican Secret Archives are about 85 kilometers (53 miles) long and house 35,000 volumes of catalogues. But the documents that researchers have scanned and uploaded take up less than an inch. That's because the Vatican seems not to have wanted to share the information.
During its Build conference today, Microsoft introduced Project Ink Analysis, which does exactly what you'd think: make sense of digital writing. The toolkit both understands words and provides features typically found in text editors, like alignment and bulleting. While Project Ink Analysis is still in its experimental stages, it could obviously help anyone who habitually writes with styluses on digital platforms. It might not garner deep insights into your personality like IBM Watson, but its simple beautification tools can clean up chicken scratch, and it supports 67 languages. It could be plenty useful for all the Surface Pen users out there who want their scrawling handwriting to look just a bit more professional (and legible).
Inking and navigating with a digital pen or stylus in Windows 10 will become easier with the Fall Creators Update, for those of you who use a tablet as, you know, a tablet. The improvements include two major elements: navigation, including using the pen or stylus to select and scroll text; and better interpretation of inked words as text, via a more accurate and responsive handwriting panel. Combined, it's a love letter of sorts to Surface and other tablet users who use the pen to input data. It's amazing how well Windows can interpret your chicken-scratch into text that can be edited in Word and elsewhere. General Windows 10 users won't be able to take advantage of the new features until the launch of the Fall Creators Update on Oct. 17.
When I reviewed Nebo, MyScript's award-winning, ink-to-digital-text app, I wrote that it's a "game changer" that "sets new standards for accurate handwriting recognition". Nebo's superior accuracy rests on MyScript's interactive ink technology. Recently I had the opportunity to ask MyScript's CTO, Pierre-Michel Lallican, how interactive ink works. Interactive ink is a complex system with three primary modules that are shown in the image at the top of this article. The Digital Ink Management module is the interface between the screen and interactive ink's text recognition and management functions.