Semiconductors & Electronics


Automotive Radar 2020-2040: Devices, Materials, Processing, AI, Markets, and Players: IDTechEx

#artificialintelligence

This report investigates the market for radar technology, specifically focusing on automotive applications. It develops a comprehensive technology roadmap, examining the technology at the levels of materials, semiconductor technologies, packaging techniques, antenna arrays, and signal processing. It demonstrates how radar technology can evolve towards becoming a 4D imaging radar capable of providing a dense 4D point cloud that can enable object detection, classification, and tracking. The report examines the latest product innovations and identifies and reviews promising start-ups worldwide.
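
In practice, the "4D" in 4D imaging radar typically refers to range, azimuth, elevation, and Doppler (radial) velocity measured per detection. As a minimal illustration only, not drawn from the report, a single point in such a point cloud might be represented and projected into Cartesian space like this (all names and values are hypothetical):

```python
from dataclasses import dataclass
import math

@dataclass
class RadarPoint4D:
    """One detection in a 4D radar point cloud (illustrative fields only)."""
    range_m: float        # radial distance to the reflector, in metres
    azimuth_rad: float    # horizontal angle of arrival, in radians
    elevation_rad: float  # vertical angle of arrival, in radians
    doppler_mps: float    # radial (Doppler) velocity, in metres per second

    def to_cartesian(self):
        """Project the spherical measurement into Cartesian x, y, z (metres)."""
        x = self.range_m * math.cos(self.elevation_rad) * math.cos(self.azimuth_rad)
        y = self.range_m * math.cos(self.elevation_rad) * math.sin(self.azimuth_rad)
        z = self.range_m * math.sin(self.elevation_rad)
        return x, y, z

# Example: a point roughly 40 m ahead, slightly to the left, closing at 3.5 m/s.
point = RadarPoint4D(range_m=40.0, azimuth_rad=-0.05, elevation_rad=0.02, doppler_mps=-3.5)
print(point.to_cartesian())
```

Object detection, classification, and tracking pipelines of the kind the report describes would then operate on many such points per frame.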


AI Was Everywhere at CES

#artificialintelligence

Artificial intelligence was on the tip of the tongue this week at CES, the annual technology extravaganza formerly known as the Consumer Electronics Show. From Samsung's Neon avatars and LG's smart washing machine, to Intel's Tiger Lake processors and the gun-detecting PATSCAN, AI seemed to be everywhere. Samsung's research subsidiary, STAR Labs, unveiled its latest AI project, called Neon. Similar to a chatbot, Neon generates a photo-realistic digital avatar that interacts with people in real time. The South Korean technology giant plans to weave the Neons into people's day-to-day lives, where the avatars will play the role of doctors, personal trainers, and TV anchors giving you the evening news.


Artificial General Intelligence: An Advancement to Foresee Analytics Insight

#artificialintelligence

At the core of the discipline of artificial intelligence is the possibility that one day we'll be able to build a machine that is as smart as a human. Such a system is often referred to as artificial general intelligence, or AGI, a name that distinguishes the concept from the broader field of study. It also makes clear that true AI would possess intelligence that is both broad and flexible. To date, we've built countless systems that are superhuman at specific tasks, yet none that can match a rat in general mental ability. Yet despite the centrality of this idea to the field of AI, there's little agreement among researchers as to when this feat might actually be achievable.


AI/ML Software Engineer

#artificialintelligence

Qualcomm is a company of inventors that unlocked 5G, ushering in an age of rapid acceleration in connectivity and new possibilities that will transform industries, create jobs, and enrich lives. But this is just the beginning. It takes inventive minds with diverse skills, backgrounds, and cultures to transform 5G's potential into world-changing technologies and products. This is the Invention Age, and this is where you come in. In addition to world-renowned strengths in wireless connectivity solutions, Qualcomm has invested significantly for years in research and development in advanced sensor technology, machine learning (ML), and artificial intelligence (AI) for environment sensing, perception, and cognition.


AI computational power leaves Moore's Law in the dust Newsflash

#artificialintelligence

Moore's Law has held up pretty well over the past few decades. The observational 'law' was based on the rate at which the number of microcomponents in a microchip or integrated circuit increased. Essentially, this predicted that available computational power or processing speeds would double every 18 months. The rate held more or less steady from 1975 until 2012, but now the advent of artificial intelligence (AI) has seen an increasingly rapid acceleration in processing power. According to Stanford University's 2019 AI Index, the speed at which computational power is doubling has increased massively.
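
To put the comparison in numbers, doubling at a fixed interval is simple compound growth. The sketch below is illustrative only: it contrasts an 18-month doubling period with the roughly 3.4-month doubling period for AI training compute since 2012 that the 2019 AI Index cites from OpenAI's analysis; the exact figures are assumptions for the purpose of the example.

```python
def growth_factor(years: float, doubling_period_years: float) -> float:
    """Compound growth factor after `years`, given a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

# Moore's-Law-style doubling every 18 months (1.5 years) over one decade: ~100x.
moores_law = growth_factor(years=10, doubling_period_years=1.5)

# A much faster doubling every ~3.4 months over the same decade: on the order of 10^10.
ai_compute = growth_factor(years=10, doubling_period_years=3.4 / 12)

print(f"18-month doubling over 10 years:   ~{moores_law:,.0f}x")
print(f"3.4-month doubling over 10 years:  ~{ai_compute:.2e}x")
```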


Samsung Electronics Promoting 'Multi-IoT Hub' Strategy – Tech Check News

#artificialintelligence

Samsung Electronics' strategy is to make home appliances an IoT hub by applying its IoT platform SmartThings and voice-recognition AI Bixby to them. Samsung Electronics is promoting a multi-IoT hub strategy to differentiate itself from such companies as Amazon and Google that use their artificial intelligence (AI) speakers as a home IoT hub. Samsung's multi-IoT hub strategy is based on its strength as the global No. 1 supplier of smartphones and consumer appliances such as TVs and refrigerators. The company's strategy is to make these devices home IoT hubs […]


Why 2020 is the year you should finally buy true wireless earbuds

#artificialintelligence

Since the first true wireless earbuds were unveiled in 2015 by Japanese electronics company Onkyo, the fledgling form factor has improved in both audio quality and performance, and CES 2020 showed that true wireless technology might finally be ready for the big time. In the past, true wireless earbuds were riddled with connectivity issues, poor audio quality, and bulky designs; based on what we saw at CES this year, however, the best true wireless earbuds of 2020 will be able to compete with wired headphones on a much more level playing field. We finally saw the kind of specs we can expect from true wireless earbuds in 2020, from noise cancellation to long-lasting battery life, so here are three reasons why, if you've been holding off, you should consider a pair of untethered earbuds to enjoy your tunes every day. For a while now, true wireless earbuds have typically cost more than their wired counterparts, but CES 2020 showed us this form factor doesn't have to come at a premium. The new JLab Go Air True Wireless Earbuds are a great example of the growing accessibility of cord-free listening; at just $29 / £29 (about AU$40), they cost roughly an eighth as much as the current class-leading model, the Sony WF-1000XM3.


Deep Learning Market Garner Growth at CAGR of 51.1% by 2026

#artificialintelligence

The global deep learning market is expected to grow at a CAGR of 51.1% over the forecast period from 2019 to 2026 and to reach a value of around US$ 56,427.2. Deep learning is a subdivision of machine learning within artificial intelligence (AI), concerned with algorithms inspired by the functioning of the human brain, known as artificial neural networks. It is also termed deep neural learning or deep neural networks. Deep learning has evolved with the increasing amount of unstructured data generated by digitalization, and this available data is processed and interpreted to support effective decision-making in industry verticals including healthcare, manufacturing, automotive, agriculture, retail, security, human resources, marketing, law, and fintech.
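
For context, a compound annual growth rate (CAGR) of 51.1% means the market size multiplies by 1.511 each year of the forecast period. A minimal sketch of the arithmetic, using a hypothetical base value rather than any figure from the report:

```python
def project(value: float, cagr: float, years: int) -> float:
    """Project a value forward at a fixed compound annual growth rate."""
    return value * (1 + cagr) ** years

# Hypothetical base-year value, for illustration only; the report's actual
# base figure is not quoted in this snippet.
base_value = 1_000.0  # e.g. US$ millions in the base year
for year in range(0, 8):  # year +0 (2019) through year +7 (2026)
    print(f"year +{year}: {project(base_value, 0.511, year):,.1f}")
```

At that rate, the projected value grows roughly eighteen-fold over the seven-year span, which is what makes the headline CAGR so striking.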


Samsung's artificial Neon humans are "a new kind of life"

#artificialintelligence

In a bid to "make science fiction a reality", Samsung's future factory STAR Labs has developed Neon, AI-powered virtual beings that look and behave like real humans. Unlike artificially intelligent (AI) assistants like Siri or Alexa, STAR Labs' computationally created beings aren't programmed to be "know-it-all bots" or an interface to answer users' questions and demands. Instead, the avatars are designed to converse and sympathise "like real people" in order to act as hyper lifelike companions. "We have always dreamed of such virtual beings in science fictions and movies," said STAR Labs CEO Pranav Mistry. "Neons will integrate with our world and serve as new links to a better future, a world where 'humans are humans' and 'machines are humane'," he continued.


Computational Intelligence in Digital and Network Designs and Applications - Programmer Books

#artificialintelligence

This book explains the application of recent advances in computational intelligence – algorithms, design methodologies, and synthesis techniques – to the design of integrated circuits and systems. It highlights new biasing and sizing approaches and optimization techniques and their application to the design of high-performance digital, VLSI, radio-frequency, and mixed-signal circuits and systems. This second of two related volumes addresses digital and network designs and applications, with 12 chapters grouped into parts on digital circuit design, network optimization, and applications. It will be of interest to practitioners and researchers in computer science and electronics engineering engaged with the design of electronic circuits.