Materwala, Huned
Secure and Privacy-Preserving Automated Machine Learning Operations into End-to-End Integrated IoT-Edge-Artificial Intelligence-Blockchain Monitoring System for Diabetes Mellitus Prediction
Hennebelle, Alain, Ismail, Leila, Materwala, Huned, Kaabi, Juma Al, Ranjan, Priya, Janardhanan, Rajiv
Diabetes Mellitus, one of the leading causes of death worldwide, has no cure to date and, if left untreated, can lead to severe health complications such as retinopathy, limb amputation, cardiovascular diseases, and neuronal disorders. Consequently, it is crucial to take precautionary measures to avoid or predict the onset of diabetes. Machine learning approaches have been proposed and evaluated in the literature for diabetes prediction. This paper proposes an IoT-edge-Artificial Intelligence (AI)-blockchain system for diabetes prediction based on risk factors. The proposed system is underpinned by blockchain to obtain a cohesive view of the risk-factor data from patients across different hospitals and to ensure the security and privacy of users' data. Furthermore, we provide a comparative analysis of different medical sensors, devices, and methods to measure and collect the risk-factor values in the system. Numerical experiments and a comparative analysis were carried out between our proposed system, using the most accurate random forest (RF) model, and the two most used state-of-the-art machine learning approaches, Logistic Regression (LR) and Support Vector Machine (SVM), on three real-life diabetes datasets. The results show that the proposed system using RF predicts diabetes with 4.57% greater accuracy on average compared to LR and SVM, at the cost of 2.87 times longer execution time. Data balancing without feature selection does not show significant improvement. After feature selection, performance improves by 1.14% and 0.02% for the PIMA Indian and Sylhet datasets respectively, while it decreases by 0.89% for MIMIC III.
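The comparative evaluation described above (train each model, then score it on held-out data) can be sketched with a minimal, self-contained harness. The toy risk-factor dataset and the stub `MajorityClassifier` below are illustrative assumptions, not the study's data or models; in the actual system RF, LR, and SVM would be slotted in where the stub is.

```python
import random

def accuracy(y_true, y_pred):
    """Fraction of correct predictions."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def train_test_split(X, y, test_ratio=0.3, seed=0):
    """Shuffle indices, then split features/labels into train and test sets."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * (1 - test_ratio))
    tr, te = idx[:cut], idx[cut:]
    return ([X[i] for i in tr], [y[i] for i in tr],
            [X[i] for i in te], [y[i] for i in te])

class MajorityClassifier:
    """Stand-in for a real model (RF/LR/SVM): predicts the majority class."""
    def fit(self, X, y):
        self.label = max(set(y), key=y.count)
        return self
    def predict(self, X):
        return [self.label] * len(X)

# Hypothetical risk-factor rows: (glucose, BMI) -> diabetic (1) or not (0).
X = [(110, 24), (160, 31), (95, 22), (150, 35), (100, 26), (170, 33)]
y = [0, 1, 0, 1, 0, 1]

Xtr, ytr, Xte, yte = train_test_split(X, y)
model = MajorityClassifier().fit(Xtr, ytr)
acc = accuracy(yte, model.predict(Xte))
print(f"accuracy = {acc:.2f}")
```

The same split and metric are reused for every candidate model, which is what makes the reported accuracy differences comparable.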
From Conception to Deployment: Intelligent Stroke Prediction Framework using Machine Learning and Performance Evaluation
Ismail, Leila, Materwala, Huned
Stroke is the second leading cause of death worldwide. Machine learning classification algorithms have been widely adopted for stroke prediction, but they have been evaluated using different datasets and evaluation metrics. Moreover, there is no comprehensive framework for stroke data analytics. This paper proposes an intelligent stroke prediction framework based on a critical examination of machine learning prediction algorithms in the literature. The five most used machine learning algorithms for stroke prediction are evaluated in a unified setup for an objective comparison. Comparative analysis and numerical results reveal that the Random Forest algorithm is best suited for stroke prediction.
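A "unified setup" means every algorithm is scored with the same metrics on the same data. As a minimal sketch, the common binary-classification metrics can all be derived from one confusion matrix; the prediction vectors below are hypothetical, not results from the paper.

```python
def confusion_counts(y_true, y_pred, positive=1):
    """Count true/false positives and negatives for a binary task."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 under one unified setup."""
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return {"accuracy": acc, "precision": prec, "recall": rec, "f1": f1}

# Hypothetical predictions from one of the five compared algorithms.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
scores = metrics(y_true, y_pred)
print(scores)  # accuracy, precision, recall, f1 all 0.75 here
```

Applying `metrics` to each algorithm's predictions on a shared test split removes the incomparability that arises when studies use different datasets and metrics.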
Forecasting COVID-19 Infections in Gulf Cooperation Council (GCC) Countries using Machine Learning
Ismail, Leila, Materwala, Huned, Hennebelle, Alain
The novel coronavirus (COVID-19) was declared a global pandemic by the World Health Organization (WHO) after it was first discovered in Wuhan, China [1]. Within one year, the virus infected more than 68 million people worldwide [2]. The virus can be fatal for elderly people and those with chronic diseases [3]. Countries across the globe have imposed several social practices and strategies to reduce the spread of the infection and to ensure the well-being of their residents. These practices and strategies include, but are not limited to, social distancing, restricted and authorized travel, remote work and education, reduced working staff in organizations, and frequent COVID-19 testing. Such measures proved effective in reducing disease spread and deaths in previous pandemics [3], [4]. Several studies have focused on machine learning time-series models to forecast the number of COVID-19 infections in different countries [5, 6, 7, 8, 9, 10, 11, 12, 13, 14]. The aim is to aid governments in designing and regulating efficient virus spread-mitigation strategies and to enable healthcare organizations to plan health personnel and facility resources effectively. Based on the forecasted infections, a government can either tighten confinement laws or ease them.
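As a minimal illustration of time-series forecasting on infection counts, the sketch below uses simple exponential smoothing, one of the most basic forecasting models; the daily case numbers are invented for the example, and the cited studies use more elaborate models.

```python
def ses_forecast(series, alpha=0.5):
    """Simple exponential smoothing, one-step-ahead forecast:
    level_t = alpha * y_t + (1 - alpha) * level_{t-1}.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Hypothetical daily new-infection counts for one country.
daily_cases = [100, 120, 130, 125, 140, 150]
forecast = ses_forecast(daily_cases, alpha=0.5)
print(f"next-day forecast: {forecast:.1f}")  # 140.6
```

The smoothing factor `alpha` controls how strongly the forecast tracks recent observations versus the longer history, the kind of parameter a forecasting study tunes per country.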
HealthEdge: A Machine Learning-Based Smart Healthcare Framework for Prediction of Type 2 Diabetes in an Integrated IoT, Edge, and Cloud Computing System
Hennebelle, Alain, Materwala, Huned, Ismail, Leila
According to a report by the International Diabetes Federation, in 2021, 537 million adults worldwide were suffering from diabetes, which caused 6.7 million deaths [3]. Furthermore, the number of diabetics is projected to reach 643 million by 2030 and 783 million by 2045 [3]. Diabetes develops in an individual through a dynamic interaction between different risk factors such as sleep duration, alcohol consumption, dyslipidemia, physical inactivity, serum uric acid, obesity, hypertension, cardiovascular disease, family history of diabetes, ethnicity, depression, age, and gender [4]. If not treated at an early stage, diabetes can lead to severe complications [5]. The use of machine learning has thus gained wide attention for the prediction of diabetes based on risk-factor data [6-13] in the context of smart healthcare [14,15].
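Prediction from risk factors reduces to learning a mapping from a risk-factor vector to a diabetic/non-diabetic label. As a minimal sketch (not the HealthEdge model), the logistic regression below is trained by batch gradient descent on two invented, normalised risk-factor features.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Batch gradient descent for logistic regression (bias folded in)."""
    w = [0.0] * (len(X[0]) + 1)  # last weight is the bias term
    for _ in range(epochs):
        grads = [0.0] * len(w)
        for xi, yi in zip(X, y):
            xi = list(xi) + [1.0]
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) - yi
            for j, xj in enumerate(xi):
                grads[j] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grads)]
    return w

def predict(w, xi):
    xi = list(xi) + [1.0]
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) >= 0.5 else 0

# Hypothetical normalised risk factors: (BMI score, age score) -> label.
X = [(0.2, 0.1), (0.9, 0.8), (0.3, 0.2), (0.8, 0.9), (0.1, 0.3), (0.7, 0.7)]
y = [0, 1, 0, 1, 0, 1]
w = train_logistic(X, y)
preds = [predict(w, xi) for xi in X]
```

In an IoT-edge deployment, a lightweight trained model like this would run at the edge, close to where the sensors collect the risk-factor values.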
QoS-SLA-Aware Artificial Intelligence Adaptive Genetic Algorithm for Multi-Request Offloading in Integrated Edge-Cloud Computing System for the Internet of Vehicles
Ismail, Leila, Materwala, Huned, Hassanein, Hossam S.
Internet of Vehicles (IoV) over Vehicular Ad-hoc Networks (VANETs) is an emerging technology enabling the development of smart-city applications for safer, more efficient, and more pleasant travel. These applications have stringent requirements expressed in Service Level Agreements (SLAs). Given vehicles' limited computational and storage capabilities, application requests are offloaded to an integrated edge-cloud computing system. Existing offloading solutions focus on optimizing the applications' Quality of Service (QoS) while respecting a single SLA constraint. They do not consider the impact of processing overlapped requests, and very few contemplate the varying speed of a vehicle. This paper proposes a novel Artificial Intelligence (AI) QoS-SLA-aware genetic algorithm (GA) for multi-request offloading in a heterogeneous edge-cloud computing system, considering the impact of overlapping request processing and dynamic vehicle speed. The objective of the optimization algorithm is to improve the applications' QoS by minimizing the total execution time. The proposed algorithm integrates an adaptive penalty function to assimilate the SLA constraints in terms of latency, processing time, deadline, CPU, and memory requirements. Numerical experiments and a comparative analysis were carried out between our proposed QoS-SLA-aware GA and the random and GA baseline approaches. The results show that the proposed algorithm executes the requests 1.22 times faster on average than the random approach, with 59.9% fewer SLA violations. While the GA baseline approach improves request performance by 1.14 times, it incurs 19.8% more SLA violations than our approach.
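The core mechanism, a GA whose fitness combines total execution time with an adaptive penalty for SLA violations, can be sketched as follows. The node speeds, request workloads, single latency SLA, and penalty schedule are all simplifying assumptions for illustration; the paper's algorithm handles several SLA constraint types and vehicle dynamics.

```python
import random

# Hypothetical setup: 6 requests offloaded to 3 nodes (0, 1 = edge, 2 = cloud).
REQ_WORK = [4, 2, 6, 3, 5, 1]        # work units per request
NODE_SPEED = [2.0, 2.0, 4.0]         # work units processed per second
SLA_LATENCY = 3.0                    # per-request deadline in seconds

def exec_times(assign):
    """Per-request completion time; requests on one node queue serially."""
    load = [0.0] * len(NODE_SPEED)
    times = []
    for req, node in enumerate(assign):
        load[node] += REQ_WORK[req] / NODE_SPEED[node]
        times.append(load[node])
    return times

def fitness(assign, penalty_weight):
    """Total execution time plus an adaptive penalty per SLA violation."""
    times = exec_times(assign)
    violations = sum(t > SLA_LATENCY for t in times)
    return sum(times) + penalty_weight * violations

def evolve(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    n = len(REQ_WORK)
    pop = [[rng.randrange(len(NODE_SPEED)) for _ in range(n)]
           for _ in range(pop_size)]
    for gen in range(gens):
        weight = 1.0 + gen            # penalty tightens as evolution proceeds
        pop.sort(key=lambda a: fitness(a, weight))
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.2:             # mutation
                child[rng.randrange(n)] = rng.randrange(len(NODE_SPEED))
            children.append(child)
        pop = parents + children
    best = min(pop, key=lambda a: fitness(a, 1000.0))
    return best, exec_times(best)

best, times = evolve()
print(best, max(times))
```

Growing the penalty weight over generations lets early populations explore infeasible assignments while later ones are steered toward SLA-compliant solutions, the intuition behind an adaptive penalty function.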
Performance and Energy-Aware Bi-objective Tasks Scheduling for Cloud Data Centers
Materwala, Huned, Ismail, Leila
Cloud computing enables remote execution of users' tasks. The pervasive adoption of cloud computing in smart-city services and applications requires timely execution of tasks that adheres to Quality of Service (QoS) requirements. However, the increasing use of computing servers exacerbates the issues of high energy consumption, operating costs, and environmental pollution. Maximizing performance while minimizing energy in a cloud data center is challenging. In this paper, we propose a bi-objective algorithm to trade off the contradicting performance and energy objectives. An evolutionary multi-objective optimization using system performance counters is proposed for the first time. The performance of the proposed model is evaluated using a realistic cloud dataset in a cloud computing environment. Our experimental results show higher performance and lower energy consumption compared to a state-of-the-art algorithm.
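Because execution time and energy are contradicting objectives, a bi-objective optimizer does not return a single best schedule but a set of non-dominated trade-offs. The sketch below shows the Pareto-dominance filtering at the heart of any such method; the candidate (time, energy) points are invented for the example.

```python
def dominates(a, b):
    """True if a is no worse than b in both objectives and strictly
    better in at least one (minimising both time and energy)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep only the non-dominated (time, energy) points."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (execution time in s, energy in J) of candidate schedules.
candidates = [(10, 50), (12, 40), (9, 70), (11, 45), (15, 35), (14, 60)]
front = pareto_front(candidates)
print(sorted(front))
```

Here `(14, 60)` is discarded because `(12, 40)` is faster and uses less energy; every surviving point represents a genuine trade-off a data-center operator could choose.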