Microsoft's Code-Writing AI Points to the Future of Computers


Microsoft just showed how artificial intelligence could find its way into many software applications--by writing code on the fly. At the Microsoft Build developer conference today, the company's chief technology officer, Kevin Scott, demonstrated an AI helper for the game Minecraft. The non-player character within the game is powered by the same machine learning technology Microsoft has been testing for auto-generating software code. The feat hints at how recent advances in AI could change personal computing in years to come by replacing interfaces that you tap, type, and click to navigate with interfaces that you simply have a conversation with. The Minecraft agent responds appropriately to typed commands by converting them into working code behind the scenes using the software API for the game.
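The core idea -- typed commands translated into API calls behind the scenes -- can be sketched with a toy dispatcher. In the demo, a large language model generates the code; here a simple keyword matcher stands in for it, and all of the game API function names below are invented for illustration:

```python
# Hypothetical stand-ins for a game's scripting API (invented names).
def build_tower(height: int) -> str:
    return f"placed {height} blocks"

def move_to(x: int, z: int) -> str:
    return f"moved to ({x}, {z})"

def handle_command(text: str) -> str:
    """Translate a typed command into an API call behind the scenes.

    A real agent would prompt a code-generating model; this keyword
    dispatcher only illustrates the command -> code -> API-call flow.
    """
    words = text.lower().split()
    if "tower" in words:
        return build_tower(height=5)
    if "go" in words or "move" in words:
        return move_to(10, -3)
    return "command not understood"

print(handle_command("build me a tower"))
```

The conversational interface the article describes is exactly this flow, with the hand-written `if` chain replaced by generated code.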

Self-Supervised Learning and Its Applications


In the past decade, research and development in AI have skyrocketed, especially after the results of the ImageNet competition in 2012. The focus was largely on supervised learning methods, which require huge amounts of labeled data to train systems for specific use cases. In this article, we will explore self-supervised learning (SSL), a hot research topic in the machine learning community. SSL is an evolving machine learning technique poised to solve the challenges posed by over-dependence on labeled data. For many years, building intelligent systems with machine learning methods has depended largely on good-quality labeled data, and the cost of high-quality annotated data remains a major bottleneck in the overall training process.
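The defining trick of SSL is that the supervision signal is manufactured from the unlabeled data itself via a "pretext task". A minimal sketch of one such task, assuming only NumPy: mask one feature of each sample and learn to reconstruct it from the remaining features, with no human labels involved:

```python
import numpy as np

# Unlabeled data: 200 samples, 4 correlated features driven by one
# shared latent factor plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
X = latent @ rng.normal(size=(1, 4)) + 0.1 * rng.normal(size=(200, 4))

# Pretext task: hide column 0 and reconstruct it from columns 1-3.
# The "label" is taken from the data itself -- the essence of SSL.
target = X[:, 0]      # pseudo-label created by masking
inputs = X[:, 1:]     # visible (unmasked) features

# Closed-form least squares acts as a tiny linear "encoder".
w, *_ = np.linalg.lstsq(inputs, target, rcond=None)
pred = inputs @ w

# Reconstruction error is low because the features share structure.
mse = float(np.mean((target - pred) ** 2))
print(round(mse, 4))
```

Large-scale SSL systems (masked language models, contrastive image encoders) apply the same principle with deep networks: solve a pretext task on raw data, then reuse the learned representation downstream where labels are scarce.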

GitLab 15 provides replacement for do-it-yourself DevOps with The One DevOps Platform


GitLab Inc., provider of The One DevOps Platform, announced the launch of its next major iteration, GitLab 15, starting with its first release version, 15.0, which brings new cutting-edge DevOps capabilities together in one platform. GitLab 15 helps companies develop and collaborate around business-critical code to deliver software securely and achieve desired business results through its comprehensive DevOps capabilities. Upcoming releases will enhance the platform's capabilities in solution areas including visibility and observability, continuous security and compliance, enterprise agile planning, workflow automation, and support for data science workloads. Customers using The DevOps Platform, such as Airbus, have noted tremendous improvements in efficiency. After adopting GitLab, the Airbus DevOps team was able to release feature updates in just 10 minutes, down from the full 24 hours previously required to set up for production and run manual tests.

How to Achieve Digital Transformation Goals with Hyperautomation


Are you an IT leader feeling stuck in your digital transformation goals? One of the most challenging questions in digital transformation is how to go from vision to execution. You may not be as far behind as you think. You simply need to adopt a better approach. One approach that can make the whole process easier and help you achieve your digital transformation goals is hyperautomation.

Meet 'Slai', An AI Startup That Is Trying To Help Developers In Selecting Their Ideal Machine Learning Setup For Getting The Fastest Way to Add Production-Ready ML Into An App


You wouldn't conceive of setting up your own SMS messaging stack across 193 countries and god knows how many telecom carriers in a world where Twilio exists. Machine learning (ML) is in a similar situation: why would you waste time assembling a whole infrastructure unless machine learning is key to your program -- which it probably isn't? Slai claims to have laid the foundation for a developer-first machine learning platform to address this specific challenge. It gives developers the tools they need to release machine learning apps swiftly. The company's offering claims to let developers focus on the machine learning models rather than all of the other nonsense that wastes time but doesn't directly add to the application.

Artificial Intelligence Developments Financial Institutions Should Expect in 2022


Throughout 2021 many banks and credit unions implemented AI and virtual agents for the first time, and many more plan to follow suit this year. While sometimes slow to adopt new technology like this, financial institutions needed to be more rigorous in their approach to problem-solving in a socially distanced world. While AI started to permeate member-serving businesses even before COVID, its use in the financial sector is reorienting the digital trajectory of the industry as a whole. AI has allowed financial institutions to remain competitive and provide high-quality customer experiences throughout the disruption of the last two years. It is clearer than ever that member bases will continue to seek the digital-first experiences they've come to enjoy.

Bank IT compliance: how financial services can stay compliant


Financial services compliance is a big area. Prajit Nanu, CEO of B2B payments platform Nium, says it's in everybody's interest that payment transactions are as frictionless as possible, but many commonly used payment systems carry unnecessary layers of complexity, including when it comes to ensuring regulatory compliance. He says automation can help to resolve lags arising from risk and compliance checks, which can be time-consuming and labour-intensive, particularly for those dealing with cross-region, cross-country checks. An automated payment platform appropriately integrated with other business software can perform these checks much more seamlessly. Nanu says: "Digital tools, such as individualised transaction profiles, coupled with the output of machine learning processes, will be able to offer real-time solutions which significantly reduce the time required for risk and compliance checks, while still allowing effective identity verification and fraud detection checks."

Neuromorphic chips more energy efficient for deep learning


Research has shown that neuromorphic chips are much more energy efficient at running large deep learning networks than non-neuromorphic hardware. This may become important as AI adoption increases. The study was carried out by the Institute of Theoretical Computer Science at the Graz University of Technology (TU Graz) in Austria using Intel's Loihi 2 silicon, a second-generation experimental neuromorphic chip announced by Intel Labs last year that has about a million artificial neurons. Their research paper, "A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware," published in Nature Machine Intelligence, claims that the Intel chips are up to 16 times more energy efficient in deep learning tasks than non-neuromorphic hardware performing the same task. The hardware tested consisted of 32 Loihi chips.

Global Big Data Conference


Automation has reshaped data science. All the phases of a data science project -- data cleaning, model development, model comparison, model validation, and deployment -- can now be fully automated and executed in minutes, where they once took months. Machine learning (ML) continuously works to tweak the model to improve predictions. It's extremely critical to set up the right data pipeline to have a continuous flow of new data for all your data science, artificial intelligence (AI), ML, and decision intelligence projects. Decision intelligence (DI) is the next major data-driven decision-making technique for disruptive innovation after data science. At its most ambitious, DI models ML outcomes to predict social, environmental, and business impact.
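The automated phases listed above -- cleaning, model development, comparison, and validation -- can be sketched in a few lines with scikit-learn; this is a minimal illustration on synthetic data, not any specific vendor's AutoML product:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for project data.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One pipeline covers cleaning (scaling) and model development;
# grid search with cross-validation automates model comparison.
pipe = Pipeline([("scale", StandardScaler()),
                 ("clf", LogisticRegression(max_iter=1000))])
search = GridSearchCV(pipe, {"clf__C": [0.1, 1.0, 10.0]}, cv=5)
search.fit(X_tr, y_tr)

# Model validation on held-out data before deployment.
acc = search.score(X_te, y_te)
print(round(acc, 3))
```

Feeding such a pipeline from a continuously refreshed data source is what turns this one-off script into the automated flow the article describes.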

Microsoft expands its AI partnership with Meta


Microsoft and Meta are extending their ongoing AI partnership, with Meta selecting Azure as "a strategic cloud provider" to accelerate its own AI research and development. Microsoft officials shared more details about the latest on the Microsoft-Meta partnership on Day 2 of the Microsoft Build 2022 developers conference. Microsoft and Meta -- back when it was still known as Facebook -- announced the ONNX (Open Neural Network Exchange) format in 2017 to enable developers to move deep-learning models between different AI frameworks. Microsoft open sourced the ONNX Runtime, which is the inference engine for models in the ONNX format, in 2018. Today, Meta officials said they'll be using Azure to accelerate research and development across the Meta AI group.