mainframe


Applying Ontologies and Knowledge Augmented Large Language Models to Industrial Automation: A Decision-Making Guidance for Achieving Human-Robot Collaboration in Industry 5.0

Oyekan, John, Turner, Christopher, Bax, Michael, Graf, Erich

arXiv.org Artificial Intelligence

The rapid advancement of Large Language Models (LLMs) has generated interest in their potential applications within manufacturing systems, particularly in the context of Industry 5.0. However, determining when to implement LLMs rather than other knowledge-representation and Natural Language Processing (NLP) techniques, such as ontologies or knowledge graphs, remains an open question. This paper offers decision-making guidance for selecting the most suitable technique in various industrial contexts, emphasizing human-robot collaboration and resilience in manufacturing. We examine the origins and unique strengths of LLMs, ontologies, and knowledge graphs, assessing their effectiveness across different industrial scenarios based on the number of domains or disciplines required to bring a product from design to manufacture. Through this comparative framework, we explore specific use cases where LLMs could enhance robotics for human-robot collaboration, while underscoring the continued relevance of ontologies and knowledge graphs in low-dependency or resource-constrained sectors. Additionally, we address the practical challenges of deploying these technologies, such as computational cost and interpretability, providing a roadmap for manufacturers to navigate the evolving landscape of language-based AI tools in Industry 5.0. Our findings offer a foundation for informed decision-making, helping industry professionals optimize the use of language-based models for sustainable, resilient, and human-centric manufacturing. We also propose a Large Knowledge Language Model architecture that offers the potential for transparency and configuration based on the complexity of the task and the computing resources available.


SpaceWar is back! Rebuilding the world's first gaming computer

The Guardian

On my desk right now, sitting beside my ultra-modern gaming PC, there is a strange device resembling the spaceship control panel from a 1970s sci-fi movie. It has no keyboard and no monitor, just several neat lines of coloured switches below a cascade of flashing lights. If you thought the recent spate of retro video game consoles such as the Mini SNES and the Mega Drive Mini was a surprising development in tech nostalgia, meet the PiDP-10, a 2:3-scale replica of the PDP-10 mainframe computer first launched by the Digital Equipment Corporation (DEC) in 1966. Designed and built by an international group of computer enthusiasts known as Obsolescence Guaranteed, it is a thing of beauty. Oscar Vermeulen, a Dutch economist and lifelong computer collector, wanted to build a single replica of a PDP-8 minicomputer, a machine he had been obsessed with since childhood.


From Banks to Bananas: The Future of AI for IT Operations

#artificialintelligence

The concept of artificial intelligence (AI) has far-reaching promise and applicability across the business and technology landscape. Thanks to the growth of big data, machine learning, analytics, and blazing computational speeds, the use of artificial intelligence has matured and is now playing a critical role in every major vertical, from banking to retail to logistics. This opens opportunities for organizations to combine the power of AI and the mainframe to drive higher levels of operational resilience across their IT environment. To compete, enterprises need to ensure that their business processes can scale and perform flawlessly to delight their customers. Whether purchasing groceries, booking a flight, or trading stocks, consumers expect sub-second responses, 24/7.


Broadcom's AI, Cloud, and security solutions add value on new z16 mainframe

#artificialintelligence

Broadcom Inc. has announced "Day One" support for IBM's new z16, expanding opportunities for organizations to gain greater value from the company's advanced AI, security, and hybrid cloud solutions. Broadcom's suite of software solutions, services, and unique "beyond code" programs gives clients an advantage in an increasingly challenging business environment. "Our strategic investments position clients to exploit the z16 along with advances in AI, cybersecurity, cloud integration, and agility," said Greg Lotko, senior VP and GM, Mainframe Software Division, Broadcom. "What distinguishes Broadcom is our deep investment in technology and how we work side-by-side in partnership with our clients to overcome their unique challenges and create new opportunities." As a member of the z16 Early Ship Program, Broadcom collaborated with IBM to ensure clients can capitalize on the full range of its mainframe software solutions on the new platform to drive progress toward their innovation and business goals. "Nothing can match the transaction performance of a mainframe, and the way that we manage the platform using Broadcom technology is a real differentiator for us," said Johan Bosch, executive director for iOCO Infrastructure Services. "We can deliver our services at 25 percent of the cost when measured against standalone banking environments."


IBM z16: A mainframe designed for AI, hybrid cloud, security and open source

#artificialintelligence

Today's announcement of IBM's new z16 mainframes promises a system that caters to enterprise needs including support for AI, security, hybrid cloud, and open source well into the future. The new, more powerful and feature-rich Big Iron boasts an AI accelerator built onto its core Telum processor that can perform 300 billion deep-learning inferences per day with one-millisecond latency, and it includes what IBM calls a quantum-safe system to protect organizations from anticipated quantum-based security threats. The system's IBM Telum dual-processor chip has 16 cores and runs at 5.2 GHz. IBM says that the z16 comes with up to 200 configurable cores in a single model--the Model A01--and includes 40TB of redundant array of independent memory (RAIM) per system. But while the z16 family, available May 31, is more powerful, the system also promises to accelerate other core IBM strategies of growing hybrid computing and open source-based enterprise systems.
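
Some quick arithmetic, using only the figures quoted above, makes the scale of that inference claim concrete (the calculation is illustrative, not an IBM benchmark):

```python
# Back-of-envelope arithmetic on the z16 inference figures quoted above.
inferences_per_day = 300e9          # IBM's stated figure
seconds_per_day = 24 * 60 * 60      # 86,400

rate_per_second = inferences_per_day / seconds_per_day
print(f"{rate_per_second:,.0f} inferences/second")

# With ~1 ms latency per inference, sustaining that rate implies
# thousands of inferences in flight at any instant (Little's law: L = lambda * W).
latency_s = 1e-3
concurrent = rate_per_second * latency_s
print(f"~{concurrent:,.0f} concurrent inferences in flight")
```

That works out to roughly 3.5 million inferences per second, which is why the accelerator sits on the Telum chip itself rather than on a separate device.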


AWS unveils latest innovations in cloud, machine learning, 5G, and IoT

#artificialintelligence

Amazon Web Services has this week detailed its latest innovations in cloud, machine learning, data analytics, 5G, and Internet of Things technology at its 10th annual re:Invent conference. In his first keynote as the company's new chief executive, Adam Selipsky reinforced how AWS "continues to be the most comprehensive and broadly adopted cloud offering in the world by further democratising the use of cloud technology". Selipsky also revealed six major service announcements to help businesses and government organisations reduce their IT spend, improve their customer experience, and better harness the power of data to make better decisions. Phil Davis, regional managing director for AWS in Asia Pacific and Japan, says the launches are the next iteration of its continued innovation to help solve problems on behalf of its customers. "We are focused on helping customers power their digital transformation across Asia Pacific and Japan so that they can build innovative solutions that improve lives and protect our planet," he says.


How AI-assisted software testing makes DevOps work – QA Valley

#artificialintelligence

Nearly two-thirds of large enterprises are running mainframe-based apps dating back two decades, according to the recent Mainframe Modernization Business Barometer Report from Advanced. Over a quarter of businesses run production applications that are as much as 30 years old, and some even go back to the 1960s. In other words, as much as we like to tout the cool, new tech, many enterprises are mired in not-so-cool, old tech. For example, a friend at a U.S. public pension fund with nearly $100 billion under management told me they had decided to take action and migrate most of their remaining mainframe applications from COBOL to Java. Why? For one thing, it was hard to find developers who knew the language, or wanted to, with COBOL ranking #1 as the "most dreaded" programming language in Stack Overflow's annual survey. But there were more reasons for embracing Java, starting with a desire to make better use of DevOps to improve software delivery. When migrating from COBOL (or any language) to Java (or any other), it's smart to start with the testing requirements.
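
"Start with the testing requirements" is often implemented as characterization (golden-master) testing: record what the legacy system actually does, then hold the rewrite to that recorded behavior. The sketch below illustrates the idea in Python with made-up routines and numbers; in a real COBOL-to-Java migration, the expected outputs would be captured from the legacy system itself.

```python
# Characterization-test sketch for a legacy migration. All names and
# numbers here are illustrative, not from any real system.

def legacy_interest(principal_cents: int, annual_rate_bp: int, days: int) -> int:
    """Stand-in for the legacy routine (in practice, recorded COBOL output)."""
    # Integer cents and basis points over a 360-day year -- common in old
    # financial code, and easy to get subtly wrong in a rewrite.
    return principal_cents * annual_rate_bp * days // (10_000 * 360)

def migrated_interest(principal_cents: int, annual_rate_bp: int, days: int) -> int:
    """The rewritten routine under test."""
    return principal_cents * annual_rate_bp * days // (10_000 * 360)

# Golden-master cases: inputs paired with the output recorded from legacy.
golden_cases = [
    ((1_000_00, 525, 30), legacy_interest(1_000_00, 525, 30)),
    ((250_000_00, 412, 90), legacy_interest(250_000_00, 412, 90)),
]

for args, expected in golden_cases:
    assert migrated_interest(*args) == expected, f"behavior drift on {args}"
print("migration preserves recorded behavior on all golden cases")
```

The point of the technique is that the test suite pins down existing behavior, rounding quirks included, before any rewrite begins, so "same answers as the old system" becomes an automated check rather than a manual comparison.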


Mainframes: The Missing Link To AI (Artificial Intelligence)?

#artificialintelligence

Data is certainly the fuel for AI. Yet there is one source of valuable data that usually does not garner much attention: mainframe systems. They hold enormous amounts of data--going back decades--for mission-critical operations. Then again, there are real difficulties in bringing mainframes and AI together.


Real-Time Machine Learning: Why It's Vital and How to Do It « Machine Learning Times

#artificialintelligence

This article is sponsored by IBM. SUMMARY: Organizations often miss the greatest opportunities that machine learning has to offer because tapping them requires real-time predictive scoring. In order to optimize the very largest-scale processes – which is a vital endeavor for your business – predictive scoring must take place right at the moment of each and every interaction. The good news is that you probably already have the hardware to handle this endeavor: the same system currently running your high-volume transactions – oftentimes a mainframe. But getting this done requires a specialized leadership practice and strong-willed change management. Heed this warning: The greatest opportunities with machine learning are exactly the ones that your business is most likely to miss. To be specific, there's massive potential for real-time predictive scoring to optimize your largest-scale operations. But with these particularly high stakes comes a tragic case of analysis paralysis.
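
To make the contrast with batch scoring concrete, here is a minimal sketch of in-line predictive scoring in the transaction path. The model, weights, and feature names are invented for illustration; the point is only that the score is computed before the transaction completes, not in an overnight batch job.

```python
# Illustrative sketch of scoring "at the moment of each interaction":
# a lightweight model evaluated inline in the transaction path.
import math

# Pretend these weights came from an offline-trained fraud model.
WEIGHTS = {"amount_zscore": 1.8, "foreign_merchant": 0.9, "night_time": 0.4}
BIAS = -3.0

def score(features: dict) -> float:
    """Logistic score in (0, 1) -- cheap enough to run per transaction."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def handle_transaction(txn: dict) -> str:
    """Decision made in-line, before the transaction completes."""
    risk = score(txn["features"])
    return "hold_for_review" if risk > 0.5 else "approve"

print(handle_transaction({"features": {"amount_zscore": 3.0,
                                       "foreign_merchant": 1.0,
                                       "night_time": 1.0}}))
print(handle_transaction({"features": {"amount_zscore": 0.2}}))
```

Training stays offline; only the cheap forward pass runs per interaction, which is what makes it feasible to co-locate scoring with the transaction system itself.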