simard
No-code AI: Former Microsoft and Salesforce execs reveal new 'machine teaching' startup Intelus - GeekWire
Machine learning is the common basis for modern artificial intelligence, using large amounts of data to build AI models that recognize patterns and make predictions when presented with new information. A new Seattle startup led by a former Microsoft distinguished engineer uses a different approach: machine teaching. "It's not extracting knowledge from data; it's extracting knowledge from the person," explained Patrice Simard, CEO and co-founder of Intelus, who oversaw Microsoft research groups in areas including machine learning, databases, graphics, vision, and cryptography during more than 20 years at the Redmond company. Intelus emerged from stealth mode Tuesday to launch an open beta of its machine teaching platform, Duet, which offers a graphical user interface for creating AI models from unstructured data without writing code or using advanced data science tools. The models can then be used to classify and extract data from text.
Quebec-based automation supplier Omnirobotic gets $6.5 million financing for AI development - Canadian Plastics
A Quebec-based robotics automation startup has closed a seed round of $6.5 million to further develop and commercialize its artificial intelligence (AI) platform for factory robots. Omnirobotic, founded in 2016 and headquartered in Laval, plans to use the new capital to continue building its autonomous robotic capabilities for production environments. The company intends for its robots to see, plan, and execute processes such as painting, welding, and machining with limited human oversight. Fonds de solidarité FTQ and Export Development Canada (EDC) led the funding round, with participation from Real Ventures and a joint venture including the company's current employees. The Fonds de solidarité FTQ and EDC recently agreed to work more closely together to support the growth of companies.
Never Underestimate the Intelligence of Trees - Issue 77: Underworlds
Consider a forest: One notices the trunks, of course, and the canopy. If a few roots project artfully above the soil and fallen leaves, one notices those too, but with little thought for a matrix that may spread as deep and wide as the branches above. Fungi don't register at all except for a sprinkling of mushrooms; those are regarded in isolation, rather than as the fruiting tips of a vast underground lattice intertwined with those roots. The world beneath the earth is as rich as the one above. For the past two decades, Suzanne Simard, a professor in the Department of Forest & Conservation Sciences at the University of British Columbia, has studied that unappreciated underworld. Her specialty is mycorrhizae: the symbiotic unions of fungi and roots long known to help plants absorb nutrients from soil. Beginning with landmark experiments describing how carbon flowed between paper birch and Douglas fir trees, Simard found that mycorrhizae didn't just connect trees to the earth, but to each other as well. Simard went on to show how mycorrhizae-linked trees form networks, with individuals she dubbed Mother Trees at the center of communities that are in turn linked to one another, exchanging nutrients and water in a literally pulsing web that includes not only trees but all of a forest's life.
- North America > Canada > British Columbia (0.25)
- North America > United States (0.14)
- North America > United States > California > San Mateo County > San Mateo (0.05)
- North America > United States > Pennsylvania > Allegheny County > Pittsburgh (0.04)
- North America > United States > New York (0.04)
- (3 more...)
Transformation Invariant Autoassociation with Application to Handwritten Character Recognition
Schwenk, Holger, Milgram, Maurice
When training neural networks with the classical backpropagation algorithm, the whole problem to be learned must be expressed as a set of inputs and desired outputs. However, we often have high-level knowledge about the learning problem. In optical character recognition (OCR), for instance, we know that the classification should be invariant under a set of transformations such as rotation or translation. We propose a new modular classification system based on several autoassociative multilayer perceptrons which allows the efficient incorporation of such knowledge. Results are reported on the NIST database of upper-case handwritten letters and compared to other approaches to the invariance problem.

1 INCORPORATION OF EXPLICIT KNOWLEDGE

The aim of supervised learning is to learn a mapping between the input and the output space from a set of example pairs (input, desired output). The classical implementation in the domain of neural networks is the backpropagation algorithm. If this learning set is sufficiently representative of the underlying data distributions, one hopes that after learning, the system is able to generalize correctly to other inputs of the same distribution.
- North America > United States (0.14)
- Europe > France (0.04)
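The classification rule the abstract describes — train one autoassociator per class, then label a new input by whichever class reconstructs it best — can be sketched as follows. As an assumption for brevity, linear (PCA-style) autoassociators stand in here for the paper's autoassociative multilayer perceptrons; the decision rule is the same.

```python
import numpy as np

def fit_linear_autoassociator(X, n_components=1):
    """Fit a PCA-style linear autoassociator to class samples X of shape (n, d).
    Reconstruction projects an input onto the class's principal subspace."""
    mean = X.mean(axis=0)
    # Top principal directions of the centered class data
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def reconstruction_error(x, model):
    """Distance between x and its reconstruction by one class's autoassociator."""
    mean, V = model
    centered = x - mean
    recon = mean + V.T @ (V @ centered)
    return np.linalg.norm(x - recon)

def classify(x, models):
    """Label = class whose autoassociator reconstructs x with the least error."""
    errors = {label: reconstruction_error(x, m) for label, m in models.items()}
    return min(errors, key=errors.get)

# Toy data: two "classes" living near different lines in the plane
rng = np.random.default_rng(0)
class_a = rng.normal(0, 0.1, (50, 2)) + np.outer(rng.uniform(-1, 1, 50), [1.0, 0.0])
class_b = rng.normal(0, 0.1, (50, 2)) + np.outer(rng.uniform(-1, 1, 50), [0.0, 1.0])
models = {"A": fit_linear_autoassociator(class_a),
          "B": fit_linear_autoassociator(class_b)}
print(classify(np.array([0.8, 0.05]), models))  # lies along class A's axis
```

Because each class gets its own reconstruction model, transformation knowledge can be built into that model per class, which is what the paper exploits with its modular architecture.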
A Rapid Graph-based Method for Arbitrary Transformation-Invariant Pattern Classification
Sperduti, Alessandro, Stork, David G.
We present a graph-based method for rapid, accurate search through prototypes for transformation-invariant pattern classification. Our method has in theory the same recognition accuracy as other recent methods based on "tangent distance" [Simard et al., 1994], since it uses the same categorization rule. Nevertheless, ours is significantly faster during classification because far fewer tangent distances need be computed. Crucial to the success of our system are 1) a novel graph architecture in which transformation constraints and geometric relationships among prototypes are encoded during learning, and 2) an improved graph search criterion, used during classification. These architectural insights are applicable to a wide range of problem domains. Here we demonstrate that on a handwriting recognition task, a basic implementation of our system requires less than half the computation of the Euclidean sorting method.

1 INTRODUCTION

In recent years, the crucial issue of incorporating invariances into networks for pattern recognition has received increased attention.
- North America > United States > California > San Mateo County > Menlo Park (0.04)
- Europe > Italy > Tuscany > Pisa Province > Pisa (0.04)
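The one-sided tangent distance underlying this categorization rule can be sketched directly; the paper's actual contribution, the graph structure that prunes which tangent distances must be computed, is not reproduced here. Finite-difference translation tangents are a standard simple choice assumed for the example.

```python
import numpy as np

def translation_tangents(p, shape):
    """Finite-difference tangent vectors of a flattened image p under
    x- and y-translation (one simple choice; tangent distance accepts
    any small differentiable transformation)."""
    img = p.reshape(shape)
    dx = np.zeros_like(img); dx[:, 1:] = img[:, 1:] - img[:, :-1]
    dy = np.zeros_like(img); dy[1:, :] = img[1:, :] - img[:-1, :]
    return np.stack([dx.ravel(), dy.ravel()], axis=1)  # shape (d, 2)

def tangent_distance(x, p, T):
    """One-sided tangent distance: distance from x to the plane
    {p + T a} that linearizes p's transformation manifold."""
    a, *_ = np.linalg.lstsq(T, x - p, rcond=None)
    return np.linalg.norm(x - p - T @ a)

# A smooth blob and a copy shifted one pixel to the right
yy, xx = np.mgrid[0:8, 0:8]
blob = np.exp(-((xx - 3.5) ** 2 + (yy - 3.5) ** 2) / 4.0)
p = blob.ravel()
x = np.roll(blob, 1, axis=1).ravel()

T = translation_tangents(p, (8, 8))
euclid = np.linalg.norm(x - p)
td = tangent_distance(x, p, T)
print(td < euclid)  # the tangent plane absorbs much of the shift
```

The least-squares step is what makes each prototype comparison expensive at scale, which is why reducing the number of tangent distances computed, as the graph search does, pays off.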