Amazon shifts some Alexa and Rekognition computing to its own Inferentia chip
Amazon.com on Thursday said it has shifted part of the computing for its Alexa voice assistant to its own custom-designed chips, aiming to make the work faster and cheaper while moving it away from chips supplied by Nvidia.

When users of devices such as Amazon's Echo line of smart speakers ask the voice assistant a question, the query is sent to one of Amazon's data centers for several steps of processing. When Amazon's computers produce an answer, that reply is in a text format that must be translated into audible speech for the voice assistant.

Amazon previously handled that computing using chips from Nvidia, but now the "majority" of it will happen using its own Inferentia computing chip. First announced in 2018, the Amazon chip is custom designed to speed up large volumes of machine learning tasks such as translating text to speech or recognizing images.
Nov-13-2020, 09:10:12 GMT
- Country:
- North America > United States > California > San Francisco County > San Francisco (0.07)
- Industry:
- Information Technology > Hardware (0.82)
- Technology:
- Information Technology > Artificial Intelligence
- Speech (1.00)
- Vision > Face Recognition (0.58)