Semiconductors & Electronics


VLSI CAD Part I: Logic Coursera

@machinelearnbot

About this course: A modern VLSI chip has a zillion parts -- logic, control, memory, interconnect, etc. How do we design these complex chips? A modern VLSI chip is a remarkably complex beast: billions of transistors, millions of logic gates deployed for computation and control, big blocks of memory, and embedded blocks of pre-designed functions built by third parties (called "intellectual property" or IP blocks). Topics covered will include: computational Boolean algebra, logic verification, and logic synthesis (2-level and multi-level).
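The "computational Boolean algebra" topic is the algorithmic heart of material like this. Below is a minimal Python sketch -- not course material, and the function names are illustrative -- of one core idea: Shannon cofactors, and a recursive tautology check built from them.

```python
# Minimal sketch of computational Boolean algebra via Shannon cofactors.
# Functions are represented as Python callables over a tuple of bits;
# names and representation here are illustrative, not from the course.

def cofactor(f, i, value):
    """Return the cofactor of f with variable i fixed to value (0 or 1)."""
    def g(bits):
        bits = list(bits)
        bits[i] = value
        return f(tuple(bits))
    return g

def tautology(f, n, i=0):
    """f over n variables is a tautology iff both Shannon cofactors
    (with variable i fixed to 0 and to 1) are tautologies."""
    if i == n:
        return bool(f((0,) * n))  # every variable is fixed; just evaluate
    return (tautology(cofactor(f, i, 0), n, i + 1) and
            tautology(cofactor(f, i, 1), n, i + 1))

# Example: a OR (NOT a) is a tautology; a AND b is not.
print(tautology(lambda v: v[0] | (1 - v[0]), 1))  # True
print(tautology(lambda v: v[0] & v[1], 2))        # False
```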


Qualcomm's new depth-sensing camera is surprisingly effective

Engadget

The additions are an iris-authentication front-facing option, an "Entry-Level Computer Vision" setup and a "Premium Computer Vision" kit. Of the three new modules, the most intriguing is the premium computer vision kit. That option is capable of active depth sensing, using an infrared illuminator, IR camera and a 16-megapixel (or 20-MP, depending on configuration) RGB camera. And according to Qualcomm, its iris authentication module can read your eyes even when you have sunglasses on -- something the company's representatives demonstrated effectively at the briefing.
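Qualcomm hasn't published the internals of its depth pipeline, but active depth sensing of this kind typically recovers distance by triangulating between the IR illuminator and camera. A minimal Python sketch of the standard disparity-to-depth relation, with made-up numbers rather than Qualcomm's specs:

```python
# Hedged sketch: the triangulation relation behind active stereo /
# structured-light depth sensing (depth = focal_length * baseline /
# disparity). All numbers below are illustrative.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Convert a pixel disparity between the projector/camera pair
    into metric depth. Larger disparity means a closer surface."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 5 cm baseline, 35 px disparity -> 1.0 m
print(depth_from_disparity(35.0, 700.0, 0.05))  # 1.0
```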


The Morning After: Monday, August 7th

Engadget

Researchers have learned that it's relatively easy to throw off an autonomous vehicle's image-recognition system by strategically using stickers to alter street signs. Virgin Galactic conducts a 'dry run' for rocket-powered flights: Virgin Galactic's SpaceShipTwo (official name VSS Unity) has just completed its sixth test glide. The next Apple TV will likely support 4K and HDR -- it probably supports more HDR modes than your TV. Developer Guilherme Rambo has sifted through the recently released HomePod firmware to discover references to both 4K and HDR support in an upcoming Apple TV model.
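The sticker attacks on street signs are physical-world cousins of digital adversarial examples, which nudge an input along the sign of the loss gradient. A toy NumPy sketch of that core idea follows; it uses a hand-rolled logistic "classifier", not the researchers' actual models or method.

```python
import numpy as np

# Toy fast-gradient-sign (FGSM-style) perturbation against a fixed
# linear-logistic classifier. Everything here is illustrative.

rng = np.random.default_rng(0)
w = rng.normal(size=64)          # fixed weights of a toy classifier
x = rng.normal(size=64)          # a "clean" input (e.g., flattened patch)
y = 1.0                          # true label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient of the logistic loss -log p(y|x) w.r.t. the input is (p - y) * w.
p_clean = sigmoid(w @ x)
grad_x = (p_clean - y) * w

eps = 0.25                        # perturbation budget
x_adv = x + eps * np.sign(grad_x) # step that maximizes the loss

print("clean  p(y=1):", round(float(sigmoid(w @ x)), 3))
print("attack p(y=1):", round(float(sigmoid(w @ x_adv)), 3))
```

Even this tiny budget collapses the classifier's confidence, which is the same brittleness the sticker attacks exploit in the physical world.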


Moto Z2 Force review: One step forward, another step back

Engadget

Just like last year's model, the Moto Z2 Force packs a 5.5-inch Super AMOLED screen running at Quad HD (2,560 x 1,440) resolution. That means we're working with one of Motorola's ShatterShield displays... which basically just means there's a lot of plastic covering the flexible OLED screen. Motorola's approach to protecting screens worked well in last year's Z Force, but it seems to have been the victim of compromise this time. In fairness, Google Assistant can't always handle those multi-part queries either; if you want an assistant that does, Samsung's Bixby might be your best option.


Computer Design Starts Over

IEEE Computer

To maintain Moore's law, the semiconductor industry decided a decade ago that a new transistor was imperative. That silver bullet has yet to materialize, but computer design innovations are now maintaining or even exceeding expected scaling progress. This theme issue gives a cross-sectional view of these new scaling drivers.


Neural Net Computing Explodes

#artificialintelligence

Kim pointed to convolutional neural networks, recurrent neural networks, and Long Short-Term Memory (LSTM) networks, among others, each of which is designed to solve a specific problem, such as image recognition, speech recognition, or language translation. Norm Jouppi, a Google distinguished hardware engineer, unveiled details of the company's several-year effort, the Tensor Processing Unit (TPU), an ASIC that implements components of a neural network in silicon -- as opposed to layering software on top of raw silicon compute power and memory banks, which is something Google also does. In discussing the TPU, Jouppi highlighted one way that teams of researchers and engineers around the world can benchmark their work and the performance of the hardware and software they are using: ImageNet. Arnold Smeulders and Theo Gevers, the general chairs of ECCV 2016, told Semiconductor Engineering that many ECCV attendees work on the semiconductor technologies (as opposed to the software that runs on silicon) that enable computer vision.
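Part of what lets an ASIC like the TPU beat raw compute-plus-software is doing its matrix math with narrow integer multiply-accumulates rather than floating point. A simplified NumPy sketch of 8-bit quantized matrix multiplication follows; the scales, shapes, and rounding scheme are illustrative, not the TPU's actual design.

```python
import numpy as np

# Toy int8 quantized matmul: quantize activations and weights, multiply
# with integer MACs (accumulating in wider integers), then rescale.

def quantize(x, num_bits=8):
    """Symmetric linear quantization to signed integers."""
    qmax = 2 ** (num_bits - 1) - 1           # 127 for int8
    scale = np.max(np.abs(x)) / qmax
    return np.round(x / scale).astype(np.int32), scale

rng = np.random.default_rng(1)
a = rng.normal(size=(4, 8)).astype(np.float32)   # activations
w = rng.normal(size=(8, 3)).astype(np.float32)   # weights

qa, sa = quantize(a)
qw, sw = quantize(w)

# Integer multiply-accumulate, then rescale back to real values.
y_quant = (qa @ qw).astype(np.float32) * (sa * sw)
y_fp32 = a @ w

print("max abs error:", float(np.max(np.abs(y_quant - y_fp32))))
```

The error stays small while the hardware only ever multiplies small integers, which is far cheaper in silicon area and energy than floating-point units.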


Adapting Data for the Rise of Artificial Intelligence in Business

#artificialintelligence

Surprisingly, many enterprise leaders in the PwC AI survey did not perceive artificial intelligence in business to be as disruptive as IoT, despite its integral role in IoT. Fundamentally, well-managed data (from new and legacy sources) combined with AI is poised to drive product innovation, content creation, and new engagement models that will disrupt industries and profit margins in the future. Talent examples include data scientists and experts in big data analytics who can incorporate artificial intelligence, predictive learning models, machine learning, and deep learning techniques. Paramount to this success is establishing a robust data architecture that accommodates IoT, big data, and evolving regulatory requirements to support a wide range of future business initiatives.


Qualcomm's neural network SDK made free for all comers

#artificialintelligence

Qualcomm has decided to open up its year-old AI platform by making its Neural Processing Engine (NPE) SDK available to all. TensorFlow is also name-checked in the announcement, and since the SDK's page also mentions convolutional neural network support, Vulture South reckons cuda-convnet (part of last year's announcement) is also in there somewhere. Target verticals include "mobile, automotive, IoT, AR, drones, and robotics", the development environment of choice is Ubuntu 14.04, and naturally you'll need a supported Snapdragon device for app tests. The full list of supported devices comprises the Snapdragon 820, 835, 625, 626, 650, 652, 653, 660, and 630, and Adreno GPU support requires libOpenCL.so.
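Since TensorFlow is name-checked and convolutional networks are called out on the SDK's page, a developer's starting point would be a small TensorFlow model to train and later feed to Qualcomm's conversion tooling. The architecture below is purely illustrative, and the conversion step itself is omitted -- consult the NPE SDK documentation for its actual converter and supported ops.

```python
import tensorflow as tf

# Illustrative small convolutional network of the sort one might train
# in TensorFlow and then convert for on-device execution. Layer sizes
# are arbitrary; this is not from Qualcomm's SDK or docs.

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),        # RGB input
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```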


Microsoft touts new HoloLens chip as breakthrough in AI

Daily Mail

Not even Microsoft has been able to tackle one of the biggest challenges preventing tech companies from bringing seamless artificial intelligence experiences to phones and augmented reality goggles without wrecking the user experience -- until now. The pre-production version was released in 2016 and targeted at U.S. developers for $3,000. Harry Shum, Microsoft's executive vice president of the Artificial Intelligence and Research Group, announced the second version of the chip, known as the Holographic Processing Unit, or HPU, during a keynote speech at CVPR 2017. The company says the addition of an extra processor to its current chip design is the answer for improving the AI experience on its HoloLens mixed reality goggles, and that it's the first chip of its kind designed for mobile.


The Six Accelerants of the AI boom

#artificialintelligence

[Graphic: "Transistor Production Has Reached Astronomical Scales" -- a look at Moore's Law in action, by Dan Hutcheson; data source: VLSI Research.] In 2014, semiconductor production facilities made some 250 billion billion (250 x 10^18) transistors. The result has been a steady decrease in manufacturing cost per transistor (transistor price, which is easier to track, is plotted in the graphic). GitHub, arXiv, Kaggle, and even Hacker News have made information more accessible to scientists, researchers, engineers, and hackers. AI systems have become the key to product success in certain classes of products.
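Some back-of-the-envelope arithmetic puts the 250 x 10^18 figure in perspective; the world-population estimate below is an outside assumption, not from the article.

```python
# Rough scale of 2014 transistor production, from the article's
# 250 billion billion figure. The ~7.3 billion world-population
# estimate for 2014 is an assumption, not from the article.

transistors_2014 = 250e18
world_population = 7.3e9          # assumed 2014 estimate
seconds_per_year = 365 * 24 * 3600

per_person = transistors_2014 / world_population
per_second = transistors_2014 / seconds_per_year

print(f"~{per_person:.1e} transistors per person")   # ~3.4e+10
print(f"~{per_second:.1e} transistors per second")   # ~7.9e+12
```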