

What's Old Is New Again

Communications of the ACM

What's old is new again. At least, it is if we are talking about analog computing. The moment you hear the phrase "analog computing," you might be forgiven for thinking we are talking about the hipsters of the technology world. The people who prefer vinyl over Spotify. The ones who want to bring back typewriters to replace word processors, or the folks who prize handwritten notes over those generated by ChatGPT.


Analog A.I.? It sounds crazy, but it might be the future

#artificialintelligence

The future of A.I. is … analog? At least, that's the assertion of Mythic, an A.I. chip company that, in its own words, is taking "a leap forward in performance in power" by going back in time. Before ENIAC, the world's first room-sized programmable, electronic, general-purpose digital computer, buzzed to life in 1945, arguably all computers were analog -- and had been for as long as computers have been around. Analog computers are a bit like stereo amps, using variable range as a way of representing desired values. In an analog computer, numbers are represented by way of currents or voltages, instead of the zeroes and ones that are used in a digital computer.


Analog A.I.? It sounds crazy, but it might be the future

#artificialintelligence

The future of A.I. is ... analog? While ENIAC represented the beginning of the end for analog computers, analog machines in fact stuck around in some form until the 1950s or 1960s, when digital transistors won out.


Edge AI Chips Take to the Field

#artificialintelligence

Airplanes and automobiles, databases and personal computers – all entities with ubiquitous form factors today, but that started out with diverging architectures. So it's not surprising that the shape of edge AI chip technology is similarly diversified. These are nascent days for AI chips. And with numerous designs in the market, there's unlikely to be a common architecture anytime soon. Today, established vendors and startup chip houses alike have jumped into the fray in a bid to complement or displace conventional microprocessors and controllers.


World's First AI Analog Chip

#artificialintelligence

Austin-based Mythic has launched the Mythic Analog Matrix Processor (Mythic AMP) -- a single-chip analog computation device. The M1076 AMP uses the Mythic Analog Compute Engine (ACE) to deliver the compute resources of a GPU at up to a tenth of the power consumption. With a 3-watt power draw, the M1076 can perform up to 25 trillion operations per second (TOPS). The new lineup includes a single chip, a PCIe M.2 card for low-footprint applications, and a PCIe card with up to 16 chips. Now, edge devices can execute complex AI applications at greater resolutions and frame rates, resulting in superior inference results.
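The headline numbers above imply a power-efficiency figure worth making explicit. A quick back-of-the-envelope check (the 25 TOPS and 3 W come from the article; the GPU wattage is a hypothetical baseline chosen only to illustrate the "tenth of the power" claim):

```python
# Sanity-checking the M1076 efficiency figures cited above.
tops = 25.0   # trillion operations per second (from the article)
watts = 3.0   # power envelope (from the article)

efficiency = tops / watts            # TOPS per watt
print(f"{efficiency:.1f} TOPS/W")    # -> 8.3 TOPS/W

# A hypothetical GPU delivering the same 25 TOPS at ~30 W would match
# the article's "up to a tenth of the power consumption" comparison:
gpu_watts = 30.0
print(f"{gpu_watts / watts:.0f}x less power")  # -> 10x less power
```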


Mythic Launches Industry First Analog AI Chip

#artificialintelligence

Please welcome new Cambrian-AI Analyst Gary Fritz, who contributed to this article. Artificial Intelligence applications are starting to show up in everything from cell phones to supertankers. But at the edge, they are running into the same roadblocks that traditional applications have fought for years: they need more speed. What's a burgeoning neural net to do? To make matters worse, machine learning models are growing at an exponential rate, doubling in size every 3.5 months.
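The growth rate cited above compounds quickly; a one-line calculation shows what "doubling every 3.5 months" means over a year (the annualization is our arithmetic, not a figure from the article):

```python
# Annualizing the growth rate cited above: model size doubling
# every 3.5 months compounds to roughly an order of magnitude per year.
doubling_months = 3.5
doublings_per_year = 12 / doubling_months   # ~3.43 doublings
growth_per_year = 2 ** doublings_per_year   # ~10.8x
print(f"{growth_per_year:.1f}x per year")
```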


Mythic launches analog AI processor that consumes 10 times less power

#artificialintelligence

Analog AI processor company Mythic launched its M1076 Analog Matrix Processor today to provide low-power AI processing. The company uses analog circuits rather than digital to create its processor, making it easier to integrate memory into the processor and operate its device with 10 times less power than a typical system-on-chip or graphics processing unit (GPU). The M1076 AMP can support up to 25 trillion operations per second (TOPS) of AI compute in a 3-watt power envelope. It is targeted at edge AI applications, but the company said it can scale from the edge to server applications, addressing multiple vertical markets including smart cities, industrial applications, enterprise applications, and consumer devices. To address a wider range of designs, the M1076 AMP comes in several form factors: a standalone processor, an ultra-compact PCIe M.2 card, and a PCIe card with up to 16 AMPs.


EETimes - Chip Startups for AI in Edge and Endpoint Applications

#artificialintelligence

As the industry grapples with the best way to accelerate AI performance to keep up with requirements from cutting-edge neural networks, there are many startup companies springing up around the world with new ideas about how this is best achieved. This sector is attracting a lot of venture capital funding and the result is a sector rich in not just cash, but in novel ideas for computing architectures. Here at EETimes we are currently tracking around 60 AI chip startups in the US, Europe and Asia, from companies reinventing programmable logic and multi-core designs, to those developing their own entirely new architectures, to those using futuristic technologies such as neuromorphic (brain-inspired) architectures and optical computing. Here is a snapshot of ten we think show promise, or at the very least, have some interesting ideas. We've got them categorized by where in the network their products are targeted: data centers, endpoints, or AIoT devices.


Mythic: Pushing AI into Newer Territories

#artificialintelligence

Mike Henry, CEO

One cannot possibly miss all the hype and recognition around artificial intelligence (AI) these days. AI has percolated deep into our everyday lives in ways we cannot fathom. Even for most of the tech-savvy millennials, the AI experience is not defined by complex algorithms, huge computing power, or advanced analytical methods. Instead, it sounds like the voice of a smart speaker that responds to weather-related queries or tunes into a podcast. The truth is, the potential for a "digital me" resides in every device.


Two Startups Use Processing in Flash Memory for AI at the Edge

IEEE Spectrum Robotics

Irvine, Calif.-based Syntiant thinks it can use embedded flash memory to greatly reduce the amount of power needed to perform deep-learning computations. Austin, Tex.-based Mythic thinks it can use embedded flash memory to greatly reduce the amount of power needed to perform deep-learning computations. They both might be right. A growing crowd of companies is hoping to deliver chips that accelerate otherwise onerous deep learning applications, and to some degree they all have similarities because "these are solutions that are created by the shape of the problem," explains Mythic founder and CTO Dave Fick. When executed in a CPU, that problem is shaped like a traffic jam of data. A neural network is made up of connections and "weights" that denote how strong those connections are, and having to move those weights around so they can be represented digitally in the right place and time is the major energy expenditure in doing deep learning today.
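The weight-movement bottleneck described above is exactly what in-memory analog compute sidesteps. A minimal sketch, assuming an idealized crossbar with no noise or quantization effects: weights sit in place as cell conductances, inputs arrive as voltages, and Ohm's and Kirchhoff's laws perform the multiply-accumulate without the weights ever moving. All values here are illustrative, not modeled on any real chip.

```python
import numpy as np

# Toy model of an analog in-memory multiply-accumulate, the core
# operation flash-based accelerators target. Ohm's law gives each
# cell's current I = G * V; Kirchhoff's current law sums the currents
# on each output line -- together, a matrix-vector product.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 8))   # cell conductances = stored weights
V = rng.uniform(0.0, 1.0, size=8)        # input voltages = activations

I_out = G @ V                            # output currents, one per row

# The same result computed the "digital" way, weight by weight,
# confirming the crossbar performs a matrix-vector multiply:
I_check = np.array([sum(g * v for g, v in zip(row, V)) for row in G])
assert np.allclose(I_out, I_check)
```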