Broadcom's AI, Cloud, and security solutions add value on new z16 mainframe

#artificialintelligence

Broadcom Inc. has announced expanded opportunities for organizations to gain greater value from the company's advanced AI, security, and hybrid cloud solutions with "Day One" support for IBM's new z16. Broadcom's suite of software solutions, services, and unique "beyond code" programs gives clients an advantage in an increasingly challenging business environment. "Our strategic investments position clients to exploit the z16 along with advances in AI, cybersecurity, cloud integration, and agility," said Greg Lotko, senior VP and GM, Mainframe Software Division, Broadcom. "What distinguishes Broadcom is our deep investment in technology and how we work side-by-side in partnership with our clients to overcome their unique challenges and create new opportunities." As a member of the z16 Early Ship Program, Broadcom collaborated with IBM to ensure clients can capitalize on the full range of its mainframe software solutions on the new platform to drive progress toward their innovation and business goals. "Nothing can match the transaction performance of a mainframe, and the way that we manage the platform using Broadcom technology is a real differentiator for us," said Johan Bosch, executive director for iOCO Infrastructure Services. "We can deliver our services at 25 percent of the cost when measured against standalone banking environments."


Machine Learning Engineer Careers at Intel in Phoenix, AZ

#artificialintelligence

The mission of Intel's Incubation Disruptive Innovation (IDI) team is to create an environment that identifies new opportunities for innovation and disruptive technologies, as a path to creating new markets and new organizational capabilities that leverage Intel's competitive advantages.


Qualcomm plunges into the robotics market with new platform

ZDNet

Qualcomm is taking a big dive into robotics. At its 5G Summit event, the company announced a new robotics platform that serves as an off-the-shelf developer kit for creating autonomous mobile robots (AMRs) and drones, using 5G and edge AI for next-generation autonomy. In practical terms, this could set off huge changes in the expanding AMR market and the upstart enterprise drone market. Currently, the space is dominated by a handful of robotics firms that build AMRs or drone-in-a-box solutions and lease them on an as-a-service model.


How AI/ML Improves Fab Operations

#artificialintelligence

Chip shortages are forcing fabs and OSATs to maximize capacity and assess how much benefit AI and machine learning can provide. This is particularly important in light of growth projections from market analysts. The chip manufacturing industry is expected to double in size over the next five years, and collective improvements in factories, AI databases, and tools will be essential to delivering that growth. "We're not going to fail on this digital transformation, because there's no option," said John Behnke, general manager in charge of smart manufacturing at Inficon. "All the fabs are collectively going to make 20% to 40% more product, but they can't get a new tool right now for 18 to 36 months. To leverage all this potential, we're going to overcome the historical human fear of change."


Stocks To Watch in 5G Wireless Growth Wave: Jeff Kagan

#artificialintelligence

The wireless industry has been one of the fastest-growing spaces for several decades. That does not mean, however, that it is always on fire. Every growth wave has ebbs and flows; it all depends on the period of time on which you focus. The good news is that the wireless industry has entered its next growth wave with 5G, AI, IoT, AR, VR, cloud, and more.


Google faces internal battle over research on AI to speed chip design

#artificialintelligence

OAKLAND, Calif., May 2 (Reuters) - Alphabet Inc's (GOOGL.O) Google said on Monday it had recently fired a senior engineering manager after colleagues, whose landmark research on artificial intelligence software he had been trying to discredit, accused him of harassing behavior. The dispute, which stems from efforts to automate chip design, threatens to undermine the reputation of Google's research in the academic community. It also could disrupt the flow of millions of dollars in government grants for research into AI and chips. Google's research unit has faced scrutiny since late 2020 after workers lodged open critiques about its handling of personnel complaints and publication practices. The new episode emerged after the scientific journal Nature in June published "A graph placement methodology for fast chip design," led by Google scientists Azalia Mirhoseini and Anna Goldie.


AMD teases CPUs with Xilinx AI engines for 2023

#artificialintelligence

AMD plans to introduce processors next year that integrate AI engines from the company's recently acquired Xilinx FPGA business, which, along with AMD's traditional PC and server businesses, helped the chip designer deliver high sales growth in the first quarter. CEO Lisa Su disclosed the plans for the new AI-equipped CPUs during the company's first-quarter earnings call Tuesday, where she said the resulting microprocessors will "enable industry-leading inference capabilities" as part of broader plans to capitalize on AMD's $49 billion Xilinx acquisition. The AI engines are already used in Xilinx's FPGA-based products for embedded and edge applications, including image recognition for cars, according to Victor Peng, Xilinx's former CEO, who now leads AMD's Adaptive and Embedded Computing Group. Peng said AMD is developing "unified" software that will help developers take advantage of the new AI capabilities for both inference and training, in data centers and at the edge. Overall, Su said, Xilinx will give AMD a "much broader set of offerings" in AI hardware, going beyond the company's current capabilities with CPUs and GPUs.


Intel CEO expects chip shortage to last until at least 2024

ZDNet

Intel chief Pat Gelsinger has predicted that the global chip shortage will remain a challenge for the industry until at least 2024, particularly in areas such as foundry capacity and tool availability. Despite this forecast, Gelsinger said Intel is in a "good position" to manage the constraints arising from the supply chain shortage. "In fact, Intel is rising to meet this challenge," he told investors on Thursday during a first-quarter earnings call.


Modern Computing: A Short History, 1945-2022

#artificialintelligence

Inspired by A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi, though the selection of key events in the journey from ENIAC to Tesla, from Data Processing to Big Data, is mine. [Image caption: The Apple I, the first computer made by Apple Computer Inc., which became one of the fastest-growing companies in history, launching a number of innovative and influential computer hardware and software products. Most home computer users in the 1970s were hobbyists who designed and assembled their own machines; the Apple I, devised in a bedroom by Steve Wozniak, Steven Jobs, and Ron Wayne, was a basic circuit board to which enthusiasts would add display units and keyboards.] April 1945: John von Neumann's "First Draft of a Report on the EDVAC," often called the founding document of modern computing, defines the stored-program concept. July 1945: Vannevar Bush publishes "As We May Think," in which he envisions the "Memex," a memory extension device serving as a large personal repository of information that could be instantly retrieved through associative links.


The Tricky Aftermath of Source Code Leaks

WIRED

The Lapsus$ digital extortion group is the latest to mount a high-profile data-stealing rampage against major tech companies. Among other things, the group is known for grabbing and leaking source code at every opportunity, including from Samsung, Qualcomm, and Nvidia. At the end of March, alongside revelations that they had breached an Okta subprocessor, the hackers also dropped a trove of data containing portions of the source code for Microsoft's Bing, Bing Maps, and its Cortana virtual assistant. Businesses, governments, and other institutions have been plagued by ransomware attacks, business email compromise, and an array of other breaches in recent years. Researchers say, though, that while source code leaks may seem catastrophic, and certainly aren't good, they typically aren't the worst-case scenario of a criminal data breach.