Shop these Prime Day deals while there's still time — sales end tonight

Mashable

BEST AMAZON DEVICE DEAL: Echo Studio with two Philips Hue bulbs -- save $79.97
BEST APPLE DEAL: MacBook Air (13-inch, 8GB RAM, 512GB SSD) -- save $149.01
It's day two of Prime Day 2020, and deals are still in full swing. If you missed your chance to shop yesterday, don't worry -- we're still seeing some amazing sales on top items. Because there are literally thousands and thousands of deals to shop across the web during this two-day span, we've done the heavy lifting for you to find the very best Prime Day deals. Remember that Prime Day ends at 11:59 p.m. PT Oct. 14 (2:59 a.m.


Samsung Galaxy Watch 3: New features to include hand gestures, fall detection to compete with Apple Watch

USATODAY - Tech Top Stories

Samsung's Galaxy Watch 3, which is set to be unveiled on Aug. 5 at its upcoming Galaxy Unpacked event, will include new hand control commands and support fall detection, according to XDA Developers, a mobile software forum. One feature will allow users to control the device by using gestures, such as answering a call by clenching and unclenching a fist, XDA Developers said. The Samsung Galaxy Watch 3 will have a speaker, allowing users to take the call entirely on the watch itself, or they can shake their hand to reject the call. Another new feature adds support for fall detection, similar to Apple Watch devices. If a user falls, the smartwatch will ring for 60 seconds.


Neuromorphic Chipsets - Industry Adoption Analysis

#artificialintelligence

Von Neumann Architecture vs. Neuromorphic Architecture

Neuromorphic architectures address challenges like high power consumption, low speed, and other efficiency-related bottlenecks prevalent in the traditional von Neumann architecture.

Architecture bottleneck: The von Neumann architecture is constrained by the bus connecting the CPU and memory; neuromorphic architectures integrate processing and storage, getting rid of that bottleneck.
Encoding scheme and signals: Unlike the von Neumann architecture's binary encoding, with its sudden highs and lows, neuromorphic chips offer a continuous analog transition in the form of spiking signals.
Devices and components: Von Neumann designs are built from CPUs, memory, and logic gates; neuromorphic designs use artificial neurons and synapses, which are more complex than logic gates.

Neuromorphic Chipsets vs. GPUs

Basic operation: Neuromorphic chips emulate the biological behavior of neurons on a chip; GPUs use parallel processing to perform mathematical operations.
Parallelism: Neuromorphic chips are inherently parallel thanks to their neurons and synapses; GPUs require purpose-built architectures to handle multiple tasks simultaneously.
Data processing: High for both.
Power: Neuromorphic chips are low power; GPUs are power-intensive.
Accuracy: Low for neuromorphic chips; high for GPUs.
Industry adoption: Neuromorphic chips are still in the experimental stage; GPUs are more accessible.
Software: New tools and methodologies need to be developed for programming neuromorphic hardware; GPUs are easier to program.
Memory: Neuromorphic chips integrate memory and neural processing; GPUs use external memory.
Limitations: Neuromorphic chips are not suitable for precise calculations, pose programming-related challenges, and are difficult to build because of the complexity of their interconnections; GPUs are thread-limited and suboptimal for massively parallel structures.

Neuromorphic chipsets are at an early stage of development and would take approximately 20 years to reach the same level as GPUs. The asynchronous operation of neuromorphic chips makes them more efficient than other processing units.
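The spiking, event-driven signalling contrasted with binary encoding above can be sketched with a toy leaky integrate-and-fire neuron, a common abstraction behind neuromorphic hardware. The constants here (threshold, leak rate, input current) are illustrative only, not drawn from any particular chipset.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron: integrate input current
    over time, emit a spike (1) when the membrane potential crosses
    the threshold, then reset. All constants are illustrative."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # spike event
            potential = 0.0    # reset after the spike
        else:
            spikes.append(0)   # silent: no event emitted
    return spikes

# A constant sub-threshold input still produces periodic spikes as
# charge accumulates faster than it leaks away.
print(lif_neuron([0.4] * 10))
```

Because such a neuron only signals when a spike occurs, computation is event-driven rather than clock-driven, which is the asynchrony the report credits for the efficiency of neuromorphic chips.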


Evolving Moore's Law with chiplets and 3D packaging

#artificialintelligence

For more than 50 years, Moore's Law has paced the advance of electronics, from semiconductor chips to laptops and cell phones. Now, the golden rule of technology price and performance from Intel co-founder Gordon Moore is evolving once again. Demanding modern workloads like AI often need specialized, high-powered processors with unique requirements. So Intel and other leading-edge chipmakers are turning to innovative new chip design and packaging techniques – and re-writing the rules of digital innovation for a new era. Ramune Nagisetty is Director of Process and Product Integration for Intel Technology Development.


IFA 2019 highlights: Amazon Fire TV Cube, LG Dual Screen, and Huawei Kirin 990 5G

#artificialintelligence

IFA, or Internationale Funkausstellung Berlin, began humbly in 1924 as an international radio exhibition. Since then, it's grown into one of the largest annual consumer electronics shows anywhere, with more than 200,000 visitors anticipated over the next few days. The Berlin expo is a who's who of the appliance and device makers, with the likes of Microsoft, Bang & Olufsen, Casio, Samsung, LG, Sony, Huawei, ZTE, Asus, Acer, Toshiba, Philips, Panasonic, Amazon, and other industry titans making appearances. As in years past, a slew of announcements marked the start of IFA 2019, and VentureBeat covered the events as they happened from the show floor. Here are a few of the highlights so far.


A New Golden Age for Computer Architecture

Communications of the ACM

We began our Turing Lecture June 4, 2018 with a review of computer architecture since the 1960s. In addition to that review, here we highlight current challenges and identify future opportunities, projecting another golden age for the field of computer architecture in the next decade, much like the 1980s when we did the research that led to our award, delivering gains in cost, energy, and security, as well as performance. "Those who cannot remember the past are condemned to repeat it."--George Santayana

Software talks to hardware through a vocabulary called an instruction set architecture (ISA). By the early 1960s, IBM had four incompatible lines of computers, each with its own ISA, software stack, I/O system, and market niche--targeting small business, large business, scientific, and real time, respectively. IBM engineers, including ACM A.M. Turing Award laureate Fred Brooks, Jr., thought they could create a single ISA that would efficiently unify all four of these ISA bases. They needed a technical solution for how computers as inexpensive as those with 8-bit data paths and as fast as those with 64-bit data paths could share a single ISA. The data paths are the "brawn" of the processor in that they perform the arithmetic but are relatively easy to "widen" or "narrow." The greatest challenge for computer designers then and now is the "brains" of the processor--the control hardware.

Inspired by software programming, computing pioneer and Turing laureate Maurice Wilkes proposed how to simplify control. Control was specified as a two-dimensional array he called a "control store." Each column of the array corresponded to one control line, each row was a microinstruction, and writing microinstructions was called microprogramming. A control store contains an ISA interpreter written using microinstructions, so execution of a conventional instruction takes several microinstructions. The control store was implemented through memory, which was much less costly than logic gates.
The table here lists four models of the new System/360 ISA IBM announced April 7, 1964. The data paths vary by a factor of 8, memory capacity by a factor of 16, clock rate by nearly 4, performance by 50, and cost by nearly 6.
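Wilkes's scheme can be sketched in a few lines: the control store is a two-dimensional array whose rows are microinstructions and whose columns are control lines, and a conventional instruction executes as a sequence of micro-steps. The one-instruction ISA ("INC") and its three control lines below are invented purely for illustration, not taken from System/360.

```python
LOAD, ADD1, STORE = 0, 1, 2   # column indices = control lines

# Control store: each conventional instruction expands into several
# microinstructions; each microinstruction asserts one or more lines.
CONTROL_STORE = {
    "INC": [          # INC addr: memory[addr] += 1, in three micro-steps
        [1, 0, 0],    # assert LOAD:  accumulator <- memory[addr]
        [0, 1, 0],    # assert ADD1:  accumulator <- accumulator + 1
        [0, 0, 1],    # assert STORE: memory[addr] <- accumulator
    ],
}

def execute(instr, addr, memory):
    """Interpret one conventional instruction by stepping through its
    rows in the control store, acting on each asserted control line."""
    acc = 0
    for micro in CONTROL_STORE[instr]:
        if micro[LOAD]:
            acc = memory[addr]
        if micro[ADD1]:
            acc += 1
        if micro[STORE]:
            memory[addr] = acc
    return memory

print(execute("INC", 0, [41]))  # one instruction, three microinstructions
```

Widening or narrowing the data path changes only what each control line drives, not the microprogram itself, which is how models with very different data paths could share one ISA.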


CES Editors' Choice Awards: The best and coolest tech to expect in 2019

USATODAY - Tech Top Stories

If you make a purchase by clicking one of our links, we may earn a small share of the revenue. However, our picks and opinions are independent from USA TODAY's newsroom and any business incentives. CES is the biggest technology show of the year, and every year Reviewed's crack team of product experts spends days sorting through the thousands of new releases that debut in Las Vegas to weed out the pretenders and highlight only the things we think will actually make a splash in 2019. We call them our CES Editors' Choice winners, and once again we've found some truly exceptional products. Though there are plenty of flashy products making big promises, we focus on the stuff you're actually going to buy this year. All 40 of our winners strike a balance in our four key criteria: innovation, technology, design, and value. Congratulations to all of our winners, and be sure to check back as we update this page with coverage of all of our winners as CES 2019 continues. Okay, while a "roll-up" OLED TV did debut last year, the LG R9--technically, the OLED65R9PUA--is a real, flesh-and-blood product.


CES 2019: Moore's Law is dead, says Nvidia's CEO

#artificialintelligence

Nvidia CEO Jensen Huang shows off the new RTX 2060 graphics card at an event at CES 2019. At least that's what Nvidia CEO Jensen Huang believes. The executive, who co-founded graphics-chip maker Nvidia, on Wednesday declared that "Moore's Law isn't possible anymore." A key part of semiconductor manufacturing is shrinking the components called transistors, the extraordinarily tiny electronic switches that process data for everything from the clocks in microwave ovens to the artificial intelligence algorithms running in our phones. Intel co-founder Gordon Moore in 1965 predicted a steady, two-year cadence of chip improvements that would double a processor's performance every couple of years.
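Moore's prediction quoted above compounds quickly: a fixed doubling period implies exponential growth. A quick back-of-the-envelope check of what the two-year cadence would deliver, purely as arithmetic on the stated rule:

```python
def moore_factor(years, doubling_period=2):
    """Growth factor implied by a fixed doubling period (Moore's 1965
    prediction: roughly one doubling every two years)."""
    return 2 ** (years / doubling_period)

# Ten years at a two-year cadence is five doublings: a 32x gain.
print(moore_factor(10))
```

It is the end of this steady cadence, not of improvement altogether, that Huang is declaring.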


Samsung reveals 98-inch TV at the Consumer Electronics Show in Las Vegas

Daily Mail - Science & tech

Samsung has unveiled a mammoth piece of technology in the form of a 98-inch 8K television at CES in Las Vegas. The TV measures nearly two and a half metres from corner to corner and uses a specially designed Quantum Processor 8K chip that relies on AI-based tech. It is the largest model in a range which also includes 65-inch (165cm), 75-inch (190cm), 82-inch (208cm) and 85-inch (215cm) versions. The Korean company promises 'near pristine' 8K quality; however, there is no formatted content available to consumers today. According to Samsung, the televisions will support native 8K content via the HDMI 2.1 spec when it becomes available.


Nubia X avoids a notch by adding a rear display for selfies

Engadget

While local competitors like Vivo, Oppo, Xiaomi and Honor have been trying various sliding mechanisms to achieve all-screen, notch-free smartphone designs, Nubia decided to take the easy approach: getting rid of the front cameras and forcing you to use the rear cameras for selfies. This is why the freshly announced Nubia X is a dual-screen flagship smartphone, with its 6.26-inch FHD LCD covering almost the entire front side, and the back featuring a smaller 5.1-inch 1,520 x 720 OLED panel to go with the dual cameras. Obviously, the main purpose of the secondary OLED screen is to let you take selfies using the device's only two cameras (16MP f/1.8 and 24MP f/1.7), but this also means greater flexibility when using these main cameras -- think awkward angles that would otherwise require crouching down or even lying on the ground. Nubia also claims that this camera's portrait mode uses AI tricks to analyze the subject's ethnicity, facial features, age, skin tone and other characteristics to set the right kind of beautification and bokeh. Similarly, the AI software has also been trained with over 4,000 scenes to automatically optimize images accordingly.