Quantifying Process Quality: The Role of Effective Organizational Learning in Software Evolution

Hönel, Sebastian

arXiv.org Machine Learning

Real-world software applications must constantly evolve to remain relevant. This evolution occurs when developing new applications or adapting existing ones to meet new requirements, make corrections, or incorporate future functionality. Traditional methods of software quality control involve software quality models and continuous code inspection tools. These measures focus on directly assessing the quality of the software. However, there is a strong correlation and causation between the quality of the development process and the resulting software product. Therefore, improving the development process indirectly improves the software product, too. To achieve this, effective learning from past processes is necessary, often embraced through post-mortem organizational learning. While qualitative evaluation of large artifacts is common, smaller quantitative changes captured by application lifecycle management are often overlooked. In addition to software metrics, these smaller changes can reveal complex phenomena related to project culture and management. Leveraging these changes can help detect and address such complex issues. Software evolution was previously measured by the size of changes, but the lack of consensus on a reliable and versatile quantification method prevents its use as a dependable metric. Different size classifications fail to reliably describe the nature of evolution. While application lifecycle management data is rich, identifying which artifacts can model detrimental managerial practices remains uncertain. Approaches such as simulation modeling, discrete-event simulation, or Bayesian networks have only limited ability to exploit continuous-time process models of such phenomena. Even worse, the accessibility and mechanistic insight into such gray- or black-box models are typically very low. To address these challenges, we suggest leveraging objectively [...]


A Survey on Machine Learning Techniques for Source Code Analysis

Sharma, Tushar, Kechagia, Maria, Georgiou, Stefanos, Tiwari, Rohit, Vats, Indira, Moazen, Hadi, Sarro, Federica

arXiv.org Artificial Intelligence

The advancements in machine learning techniques have encouraged researchers to apply these techniques to a myriad of software engineering tasks that use source code analysis, such as testing and vulnerability detection. Such a large number of studies hinders the community from understanding the current research landscape. This paper aims to summarize the current knowledge in applied machine learning for source code analysis. We review studies belonging to twelve categories of software engineering tasks and corresponding machine learning techniques, tools, and datasets that have been applied to solve them. To do so, we conducted an extensive literature search and identified 479 primary studies published between 2011 and 2021. We summarize our observations and findings with the help of the identified studies. Our findings suggest that the use of machine learning techniques for source code analysis tasks is consistently increasing. We synthesize commonly used steps and the overall workflow for each task and summarize the machine learning techniques employed. We identify a comprehensive list of available datasets and tools usable in this context. Finally, the paper discusses perceived challenges in this area, including the availability of standard datasets, reproducibility and replicability, and hardware resources.


Artificial intelligence drives the way to net-zero emissions

#artificialintelligence

The fourth industrial revolution (Industry 4.0) is already happening, and it's transforming the way manufacturing operations are carried out. Industry 4.0 is a product of the digital era: automation and data exchange in manufacturing technologies shift the central industrial control system to a smart setup that bridges the physical and digital worlds, connected via the Internet of Things (IoT). Industry 4.0 is creating cyber-physical systems that can network a production process, enabling value creation and real-time optimisation. The main factor driving the revolution is the advances in artificial intelligence (AI) and machine learning. The complex algorithms involved in AI use the data collected from cyber-physical systems, resulting in "smart manufacturing".


A Beginner's Guide to The Internet of Things (IoT) 2022

#artificialintelligence

We are able to turn on the lights in our homes from a desk in an office miles away. The built-in cameras and sensors embedded in our refrigerator let us easily keep tabs on what is present on the shelves, and when an item is close to expiration. When we get home, the thermostat has already adjusted the temperature so that it's lukewarm or brisk, depending on our preference. These are not examples from a futuristic science fiction story. These are only a few of the millions of connected systems that make up the Internet of Things (IoT) being deployed today.


A Column Generation based Heuristic for the Tail Assignment Problem

Sambrekar, Akash, Raqabi, El Mehdi Er

arXiv.org Artificial Intelligence

This article proposes an efficient heuristic that accelerates column generation through parallel resolution of the pricing problems for individual aircraft in the tail assignment problem (TAP). The approach achieves considerable improvements in resolution time for real-life test instances from two major Indian air carriers. This paper also considers the restrictions that aviation regulatory bodies place on individual aircraft for maintenance routing. We additionally present a variable-fixing heuristic to improve the integrality of the solution. The hybridization of constraint programming and column generation proved substantial in accelerating the resolution process.
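The parallel-pricing idea in this abstract can be sketched in a few lines; the following is a minimal illustration, not the authors' implementation. The route enumeration, the cost function, and the dual values are all hypothetical placeholders, and a real TAP pricing problem would typically solve a resource-constrained shortest path per aircraft rather than brute-force enumeration:

```python
# Sketch: one column-generation iteration with pricing problems solved in
# parallel, one per aircraft. All names and structures here are illustrative.
from concurrent.futures import ThreadPoolExecutor
from itertools import permutations

def price_aircraft(aircraft, flights, duals, cost):
    """Pricing problem for one aircraft: enumerate feasible flight
    sequences and return the route with the most negative reduced cost,
    or None if no improving column exists for this aircraft."""
    best_route, best_rc = None, -1e-9
    for r in range(1, len(flights) + 1):
        for route in permutations(flights, r):
            # Reduced cost = route cost minus the dual prices of the
            # flight-covering constraints in the restricted master problem.
            rc = cost(aircraft, route) - sum(duals[f] for f in route)
            if rc < best_rc:
                best_route, best_rc = route, rc
    return aircraft, best_route, best_rc

def parallel_pricing(aircraft_list, flights, duals, cost, workers=4):
    """Solve all per-aircraft pricing problems concurrently and collect
    the improving columns (those with negative reduced cost)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(
            lambda a: price_aircraft(a, flights, duals, cost),
            aircraft_list))
    return [(a, route, rc) for a, route, rc in results if route is not None]
```

In a full column-generation loop, the columns returned here would be added to the restricted master problem, which is re-solved to produce new dual values before the next pricing round.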


Tapping the IoT Potential in the Post COVID-19 World

#artificialintelligence

Massive data is generated by sensors placed in billions of connected devices around the world. IoT is everywhere: from smart watches that allow people to track their fitness, monitor their sleeping patterns, and measure their heart rate, to smart sensors that go beyond human reach in industrial maintenance activities. Think of a future where self-driving cars will collect, process, and store driving data at the edge to make road travel safer and more enjoyable. IoT technology is already deployed by a handful of companies to keep track of pest populations. For instance, Semios uses sensors and machine vision technology to monitor pest populations in vineyards, orchards, and other agricultural settings.


Predictive Maintenance with Machine Learning on Oracle Database 20c

#artificialintelligence

According to McKinsey's 2018 study "Visualizing the uses and potential impact of AI and other analytics", the estimated impact of artificial intelligence and other analytics on all industries with respect to anomaly detection is between $1.0T and $1.4T. Anomaly detection is the critical success factor in predictive maintenance, which tries to anticipate when maintenance is required. This differs from the classical preventive approach, in which activities are planned on a regular schedule, and from condition-based maintenance, in which assets are monitored through IoT sensors. By applying anomaly detection algorithms based on machine learning, it is possible to perform prognostics that estimate the condition of a system or a component and its remaining useful life (RUL), in order to predict an incoming failure. One of the most famous algorithms is MSET-SPRT, well described with a use case in the blog post "Machine Learning Use Case: Real-Time Support for Engineered Systems."
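As a rough illustration of the SPRT half of such an approach, the sequential test below flags a sustained shift in model residuals. This is a generic sketch of Wald's sequential probability ratio test, not Oracle's MSET-SPRT implementation: the MSET stage, which produces residuals by comparing sensor readings against estimated values, is omitted, and the parameters (`mu1`, `sigma`, error rates) are hypothetical.

```python
# Sketch: Wald's sequential probability ratio test (SPRT) on precomputed
# residuals, deciding between H0 (mean 0, normal) and H1 (mean mu1, anomaly)
# for Gaussian noise with known standard deviation sigma.
import math

def sprt_anomaly(residuals, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Return 'anomaly', 'normal', or 'undecided' after sequentially
    accumulating the log-likelihood ratio over the residual stream.
    alpha/beta are the targeted false-positive and miss rates."""
    upper = math.log((1 - beta) / alpha)   # accept H1 when LLR crosses this
    lower = math.log(beta / (1 - alpha))   # accept H0 when LLR drops below
    llr = 0.0
    for x in residuals:
        # LLR increment for a Gaussian mean-shift alternative.
        llr += (mu1 / sigma ** 2) * (x - mu1 / 2.0)
        if llr >= upper:
            return "anomaly"
        if llr <= lower:
            return "normal"
    return "undecided"
```

In a predictive-maintenance pipeline, a stream of "anomaly" decisions on sensor residuals would feed the downstream prognostic model that estimates remaining useful life.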


Predictive Maintenance

#artificialintelligence

Running high-volume manufacturing smoothly requires the perfect functioning of all the machines, resulting in efficient production. The objective of any manufacturing unit is to keep operations at optimum speed with minimal downtime. However, every piece of equipment undergoes wear and tear and needs servicing and maintenance periodically. The critical question here is: when is the best time to conduct equipment maintenance? Is scheduled maintenance effective, or is it better to resort to damage control in case of machine failure?


Into the Dataverse!

#artificialintelligence

Industry 4.0 - the fourth industrial revolution – is upon us. Artificial intelligence (AI) is forever changing the way information is used across all business lines of the Government and private sector alike. An important DoD priority is to use AI to improve maintenance activities. However, AI depends on the quality of data, so the DoD must first be able to capture data on maintenance activities that is complete, structured, and readily accessible. DoD maintenance faces growing challenges that threaten the strategic advantage the United States Military has long held in both combat and humanitarian missions.


How to use game-changing AI to boost decision quality

#artificialintelligence

This article shows why game AI for business decisions makes so much sense. In the debate about AI for business, there is a lot of focus on the data, but data by itself doesn't generate value. Businesses generate value by turning data into decisions.