AI can impact and transform business IT operations. The technology has never been more useful as IT teams look to enable mass remote working during the Covid-19 pandemic. Nick McQuire, SVP, Enterprise Research at CCS Insight, explains that "the area of IT Helpdesk support has become a big use case for AI especially in the context of remote IT operations during the pandemic and we have seen the domain become a big focus for the likes of IBM recently with the launch of Watson AIOps, for example." More generally, he points out that AI can help business IT operations quickly diagnose problems and handle support tickets through greater automation. In some cases, "IT is also able to proactively fix problems based on predicting when an issue will arise," McQuire adds. Dr Iain Brown, head of data science at SAS UK & Ireland, says it is "no understatement to say that AI can revolutionise business IT operations."
Why kids need special protection from AI's influence. Algorithms can change the course of children's lives. Kids are interacting with Alexas that can record their voice data and influence their speech and social development. They're bingeing videos on TikTok and YouTube pushed to them by recommendation systems that end up shaping their worldviews. Algorithms are also increasingly used to determine what their education is like, whether they'll receive health care, and even whether their parents are deemed fit to care for them. Sometimes this can have devastating effects: this past summer, for example, thousands of students lost their university admissions after algorithms, used in lieu of standardized tests canceled by the pandemic, inaccurately predicted their academic performance.
Mashable's series Algorithms explores the mysterious lines of code that increasingly control our lives, and our futures. In hospitals and health systems across the country, physicians sometimes use algorithms to help them decide what type of treatment or care their patients receive. These algorithms vary from basic computations using several factors to sophisticated formulas driven by artificial intelligence that incorporate hundreds of variables. They can play a role in influencing how a doctor assesses kidney function, whether a mother should give birth vaginally once she's had a Cesarean section, and which patients could benefit from certain interventions. In a perfect world, the computer science that powers these algorithms would give clinicians unparalleled clarity about their patients' needs.
Psychiatrists typically diagnose autism spectrum disorders (ASD) by observing a person's behavior and by leaning on the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), widely considered the 'bible' of mental health diagnosis. However, there are substantial differences amongst individuals on the spectrum, and a great deal remains unknown to science about the causes of autism, or even what autism is. As a result, accurately diagnosing ASD and predicting a prognosis for patients can be extremely difficult. But what if artificial intelligence (AI) could help? Deep learning, a type of AI, deploys artificial neural networks loosely modeled on the human brain to recognize patterns in a way that is akin to, and in some cases can surpass, human ability.
Artificial intelligence (AI) can detect loneliness with 94 per cent accuracy from a person's speech, a new scientific paper reports. Researchers in the US used several AI tools, including IBM Watson, to analyse transcripts of older adults interviewed about feelings of loneliness. By analysing words, phrases, and gaps of silence during the interviews, the AI assessed loneliness symptoms nearly as accurately as loneliness questionnaires completed by the participants themselves, which can be biased. It revealed that lonely individuals tend to give longer responses to direct questions about loneliness, and express more sadness in their answers. 'Most studies use a direct question of "how often do you feel lonely?", which can lead to biased responses due to the stigma associated with loneliness,' said senior author Ellen Lee at the UC San Diego (UCSD) School of Medicine.
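The article describes the signals the researchers mined from transcripts: response length, gaps of silence, and expressed sadness. As a minimal illustration of that kind of feature extraction, here is a toy sketch; the function name, the pause representation, and the word list are assumptions for the example, not the study's actual pipeline (which used NLP tools including IBM Watson).

```python
# Toy sketch of transcript features like those the article describes:
# response length, pauses, and sadness vocabulary. The word list and
# data shapes are illustrative assumptions, not the study's method.

SAD_WORDS = {"sad", "alone", "empty", "miss", "lonely", "lost"}

def transcript_features(responses):
    """responses: list of (answer_text, pause_seconds) per interview answer."""
    words = [w.strip(".,!?").lower()
             for text, _ in responses
             for w in text.split()]
    return {
        "mean_response_len": sum(len(t.split()) for t, _ in responses) / len(responses),
        "total_pause_s": sum(p for _, p in responses),
        "sadness_rate": sum(w in SAD_WORDS for w in words) / max(len(words), 1),
    }

answers = [
    ("I feel alone most evenings and I miss having people around.", 2.5),
    ("Yes.", 0.4),
]
feats = transcript_features(answers)
print(feats)
```

A real system would feed features like these (or learned embeddings) into a classifier trained against questionnaire scores; the sketch only shows what "analysing words, phrases, and gaps of silence" means in concrete terms.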
Safety is the central focus of driverless vehicle systems development. Artificial intelligence (AI) is coming at us fast. It's being used in the apps and services we plug into daily without us really noticing, whether it's a personalized ad on Facebook, or Google recommending how you sign off your email. If these applications fail, the worst case is usually some irritation to the user. But we are increasingly entrusting AI and machine learning to safety-critical applications, where system failure results in a lot more than a slight UX issue.
The government aims to put a facial recognition system into practical use to prevent new coronavirus infections at large-scale events including the Tokyo Olympics and Paralympics, it was learned Friday. The government also hopes to improve the national capacity to conduct saliva-based polymerase chain reaction tests to simultaneously detect cases of influenza and novel coronavirus infection, informed sources said. The proposals are included in a draft program for developing new technologies for preventing coronavirus infection. The government will unveil the program shortly and carry out demonstration tests at relevant ministries and agencies. According to the draft, the government is looking at using security cameras equipped with a facial recognition system to record the movements of visitors to the Tokyo Games, which were postponed to 2021, and other large-scale events, the sources said.
The damage from pandemic-induced lockdowns, office and school closures and consumer retrenchment continues to reverberate through the economy. As the crisis drags into its seventh month, it has left businesses facing hard choices in adjusting to what now seem like many permanent changes. Required actions to address the COVID-19 crisis can be divided into three major stages: Respond, Recover and Thrive. These three stages are interspersed with two additional interim stages, and culminate in a long-term operating environment we call the 'next normal'. The early months were focused on business survival through a series of reactionary changes, followed by mid-term operational stabilization in a world with diminished demand, continued socio-political restrictions and unpredictable events.
A new study by Monash University, together with Alfred Health and The Royal Melbourne Hospital, has uncovered how machine learning technology could be used to automate epilepsy diagnosis. As part of the study, Monash University researchers fed more than 400 electroencephalogram (EEG) recordings of patients with and without epilepsy from Alfred Health and The Royal Melbourne Hospital into a machine learning model. Training the model with the various datasets enabled it to automatically detect signs of epilepsy, or abnormal activities known as "spikes", in EEG recordings. "The objective of the first stage is to evaluate existing patterns involved in the detection of abnormal electrical recordings among neurons in the brain, called epileptiform activity. These abnormalities are often sharp spikes which stand out from the rhythmic patterns of a patient's EEG scan," explained Levin Kuhlmann, a senior lecturer in the Department of Data Science and AI at Monash University's Faculty of IT.
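Kuhlmann's description of spikes that "stand out from the rhythmic patterns" of an EEG can be made concrete with a toy example. The sketch below injects sharp transients into a synthetic rhythmic signal and flags samples far above the background; the sampling rate, threshold, and signal parameters are assumptions for illustration, and this amplitude-threshold approach is not the trained model the Monash study used.

```python
import numpy as np

# Toy illustration of epileptiform "spike" detection in an EEG trace.
# NOT the Monash model: a simple amplitude-threshold sketch on
# synthetic data, showing what a spike "standing out from rhythmic
# patterns" means in signal terms.

rng = np.random.default_rng(0)
fs = 256                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)               # 10 seconds of signal
background = 20 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 5, t.size)

signal = background.copy()
spike_times = [2.0, 5.5, 8.25]             # seconds at which spikes are injected
for st in spike_times:
    signal[int(st * fs)] += 150            # sharp transient far above background

def detect_spikes(x, n_std=6.0):
    """Flag samples exceeding n_std standard deviations of the trace."""
    threshold = x.mean() + n_std * x.std()
    return np.flatnonzero(x > threshold)

detected = detect_spikes(signal)
print(np.round(detected / fs, 2))          # times (s) of flagged samples
```

A trained model, by contrast, learns what spikes look like from labeled recordings rather than relying on a fixed threshold, which is what lets it generalize across patients with very different background rhythms.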
Even prior to the pandemic, improving customer experience was becoming a major priority for IT. COVID-19 and the resulting shift in business models have only accelerated that strategic directive. Delivering a positive customer experience is even more important now, as businesses prepare for a post-pandemic world that will still involve lots of home-based workers, rising e-commerce transactions, and an unprecedented number of digital interactions between companies and their clients. From healthcare to retail, artificial intelligence is rising to the challenge.