The Sony A8H Bravia 55-inch OLED 4K TV is on sale for $1,298 at Amazon as of June 29. If you somehow slept through two days of Prime Day deals, luck is on your side if you need a new TV. Discounts on some of our top TV picks will appear once again as Fourth of July sales pick up soon. And one popular option, in particular, is available for the taking at one of the lowest prices we've seen. That's 32% off the original price of a 2020 TV model that still packs the latest technology for a prime viewing experience at home. The Sony A8H is perfect for all forms of entertainment whether it's movies, video games, or live sports.
Silicon Valley adaptive computing bellwether Xilinx announced its entrance into the growing system-on-module (SOM) market today, with a portfolio of palm-sized compute modules for embedded applications that accelerate AI, machine learning and vision at the edge. Xilinx Kria will eventually expand into a family of single-board computers based on reconfigurable FPGA (Field Programmable Gate Array) technology, coupled to Arm CPU cores and a full software stack with an app store; the first module is targeted specifically at AI machine vision and inference applications. The Xilinx Kria K26 SOM employs the company's UltraScale multi-processor system-on-chip (MPSoC) architecture, which sports a quad-core Arm Cortex-A53 CPU, over 250,000 logic cells, and an H.264/H.265 video compression/decompression engine (codec). This may sound like alphabet soup as I spit out acronyms, but the underlying solution is a compelling offering for developers and engineers looking to give new intelligent systems (in industries like security, smart cities, retail analytics, autonomous machines and robotics) the ability to see, infer information and adapt to their deployments in the field. Also on board the Kria K26 SOM are 4GB of DDR4 memory and 245 general-purpose I/O pins, along with support for up to 15 cameras, up to 40 Gbps of combined Ethernet throughput, and four USB 2.0/3.0-compatible ports.
Analog AI processor company Mythic launched its M1076 Analog Matrix Processor today to provide low-power AI processing. The company uses analog circuits rather than digital ones in its processor, making it easier to integrate memory into the processor and letting the device operate at roughly one-tenth the power of a typical system-on-chip or graphics processing unit (GPU). The M1076 AMP can deliver up to 25 trillion operations per second (TOPS) of AI compute in a 3-watt power envelope. It is targeted at edge AI applications, but the company said it can scale from the edge to servers, addressing multiple vertical markets including smart cities, industrial applications, enterprise applications, and consumer devices. To address a wider range of designs, the M1076 AMP comes in several form factors: a standalone processor, an ultra-compact PCIe M.2 card, and a PCIe card with up to 16 AMPs.
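Mythic's efficiency claim can be sanity-checked with a little arithmetic. The 25 TOPS and 3 W figures come from the announcement; the 30 W comparison device is an illustrative assumption matching the "10 times less power" framing, not vendor data:

```python
# Rough efficiency comparison for the Mythic M1076 claim.
m1076_tops = 25.0   # peak AI compute, per the announcement
m1076_watts = 3.0   # stated power envelope
m1076_eff = m1076_tops / m1076_watts  # TOPS per watt, ~8.3

# Hypothetical digital SoC/GPU at similar throughput but ~10x the power,
# matching the article's "10 times less power" framing (assumed figure).
gpu_tops = 25.0
gpu_watts = 30.0
gpu_eff = gpu_tops / gpu_watts  # ~0.8 TOPS/W

print(f"M1076: {m1076_eff:.1f} TOPS/W")
print(f"Comparable digital part: {gpu_eff:.1f} TOPS/W")
print(f"Efficiency ratio: {m1076_eff / gpu_eff:.0f}x")
```

At roughly 8.3 TOPS per watt, the claimed ten-fold power advantage over a ~0.8 TOPS/W digital part is internally consistent.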
Lisp machines are general-purpose computers designed to efficiently run Lisp as their main software and programming language, usually via hardware support. They are an example of a high-level language computer architecture, and in a sense, they were the first commercial single-user workstations. Despite being modest in number (perhaps 7,000 units total as of 1988), Lisp machines commercially pioneered many now-commonplace technologies, including effective garbage collection, laser printing, windowing systems, computer mice, high-resolution bit-mapped raster graphics, computer graphic rendering, and networking innovations such as Chaosnet. The operating systems were written in Lisp Machine Lisp, Interlisp (Xerox), and later partly in Common Lisp. Artificial intelligence (AI) computer programs of the 1960s and 1970s intrinsically required what was then considered a huge amount of computer power, as measured in processor time and memory space.
New computing technologies inspired by the brain promise fundamentally different ways to process information with extreme energy efficiency and the ability to handle the avalanche of unstructured and noisy data that we are generating at an ever-increasing rate. To realise this promise requires a brave and coordinated plan to bring together disparate research communities and to provide them with the funding, focus and support needed. We have done this in the past with digital technologies; we are in the process of doing it with quantum technologies; can we now do it for brain-inspired computing?
Moore's Law is dead, right? Although the historical annual improvement of about 40% in central processing unit performance is slowing, the combination of CPUs packaged with alternative processors is improving at a rate of more than 100% per annum. These unprecedented and massive improvements in processing power combined with data and artificial intelligence will completely change the way we think about designing hardware, writing software and applying technology to businesses. Every industry will be disrupted. You hear that all the time. Well, it's absolutely true and we're going to explain why and what it all means. In this Breaking Analysis, we're going to unveil some data that suggests we're entering a new era of innovation where inexpensive processing capabilities will power an explosion of machine intelligence applications.
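The gap between a 40% and a 100% annual improvement rate compounds dramatically. A quick illustration (the growth rates come from the analysis above; the ten-year horizon is an arbitrary choice for the comparison):

```python
# Compound performance growth at the two annual rates cited.
years = 10
cpu_rate = 0.40       # ~40% per year, the historical CPU-only trend
combined_rate = 1.00  # >100% per year for CPUs plus alternative processors

cpu_gain = (1 + cpu_rate) ** years        # ~28.9x over a decade
combined_gain = (1 + combined_rate) ** years  # 1024x over a decade

print(f"CPU-only after {years} years: {cpu_gain:.0f}x")
print(f"Combined after {years} years: {combined_gain:.0f}x")
```

Over a decade, the combined trajectory yields roughly 35 times more total improvement than the CPU-only one, which is the arithmetic behind the "new era of innovation" claim.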
Sony's latest trick for Bravia TVs is something called "cognitive intelligence" that can enhance parts of an image depending on what's going on. The first 4K TVs it released with the tech were high-end OLED models, but those started at $2,999 and went up from there. Now, Sony has unveiled its first LED models with the Cognitive Processor XR, and they're a lot more reasonably priced. The X90J is the top-end LED model and it's available in 50-, 55- and 65-inch sizes at $1,299, $1,499 and $1,799, respectively. These offer the Cognitive Processor XR, though cinephiles will likely want to turn that off to see the content as the creators intended.
Google has hired former Intel executive Uri Frank to lead its custom chip division. Google is not alone: many companies have taken to chipmaking in the last few years to build competitive moats. The Intel veteran will serve as the Vice-President of Engineering for server chip design at Google. Uri Frank has over two decades of custom CPU design and delivery experience, and his design engineering expertise from Intel will come in handy for Google.
Samsung has unveiled a new RAM module that shows the potential of DDR5 memory in terms of speed and capacity. The 512GB DDR5 module is the first to use High-K Metal Gate (HKMG) tech, delivering 7,200 Mbps speeds, over double that of DDR4, Samsung said. Right now, it's aimed at data-hungry supercomputing, AI and machine learning functions, but DDR5 will eventually find its way to regular PCs, boosting gaming and other applications. Samsung first used HKMG tech in 2018 with GDDR6 chips used in GPUs. First commercialized by Intel, HKMG replaces the insulating silicon dioxide layer with a hafnium-based material, with metal gates replacing the normal polysilicon gate electrodes.
Samsung Electronics on Thursday said it has developed a 512GB DDR5 memory module. It is the company's first DRAM made with the latest DDR5 standard that was set by JEDEC in July last year. The hardware, made with high-k metal gate (HKMG) process technology, offers up to 7,200Mbps in data transfer rate, over double that of conventional DDR4, Samsung said. The company stacked eight layers of 16Gb DRAM chips for the module, it said. The memory will be able to handle high-bandwidth workloads in applications such as supercomputing, artificial intelligence, machine learning, and data analytics, the company added.
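The module's headline numbers hang together arithmetically. A quick check, using the per-die density, layer count, and data rate from the articles; the package count is derived (ignoring any extra ECC devices), and the 64-bit DIMM data-bus width is a standard assumption, not a Samsung-stated figure:

```python
# Capacity and bandwidth arithmetic for Samsung's 512GB DDR5 module.
chip_density_gb = 16 / 8   # 16-gigabit dies are 2 gigabytes each
layers = 8                 # eight 16Gb dies stacked per package
stack_gb = chip_density_gb * layers   # 16 GB per stacked package

module_gb = 512
packages = module_gb / stack_gb       # 32 stacked packages needed

per_pin_mbps = 7200        # stated data transfer rate
bus_bytes = 8              # 64-bit DIMM data bus (assumption)
bandwidth_gbs = per_pin_mbps * bus_bytes / 1000  # 57.6 GB/s

print(f"Per-stack capacity: {stack_gb:.0f} GB")
print(f"Stacks for {module_gb} GB: {packages:.0f}")
print(f"Peak module bandwidth: {bandwidth_gbs} GB/s")
```

So a 512GB module works out to 32 eight-high stacks of 16Gb dies, moving a peak of roughly 57.6 GB/s, versus about 25.6 GB/s for a DDR4-3200 module on the same bus-width assumption.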