"Thought I'd Share First": An Analysis of COVID-19 Conspiracy Theories and Misinformation Spread on Twitter

arXiv.org Machine Learning

Background: Misinformation spread through social media is a growing problem, and the emergence of COVID-19 has caused an explosion in new activity and renewed focus on the resulting threat to public health. Given this increased visibility, in-depth analysis of COVID-19 misinformation spread is critical to understanding the evolution of ideas with potential negative public health impact. Methods: Using a curated data set of COVID-19 tweets (N ~120 million tweets) spanning late January to early May 2020, we applied methods including regular expression filtering, supervised machine learning, sentiment analysis, geospatial analysis, and dynamic topic modeling to trace the spread of misinformation and to characterize novel features of COVID-19 conspiracy theories. Results: Random forest models for four major misinformation topics provided mixed results, with narrowly defined conspiracy theories achieving F1 scores of 0.804 and 0.857, while broader theories performed measurably worse, with scores of 0.654 and 0.347. Despite this, analysis using model-labeled data was beneficial for increasing the proportion of data matching misinformation indicators. We were able to identify distinct increases in negative sentiment, theory-specific trends in geospatial spread, and the evolution of conspiracy theory topics and subtopics over time. Conclusions: COVID-19 related conspiracy theories show that history frequently repeats itself, with the same conspiracy theories being recycled for new situations. We use a combination of supervised learning, unsupervised learning, and natural language processing techniques to examine the evolution of theories over the first four months of the COVID-19 outbreak and how these theories intertwine, and to hypothesize about more effective public health messaging to combat misinformation in online spaces.
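The abstract reports F1 scores from random forest models trained to label tweets by misinformation topic. As a rough, self-contained sketch of that kind of supervised step (the paper's actual features, labels and hyperparameters are not given here, so the toy corpus and settings below are illustrative assumptions): TF-IDF features feed a random forest classifier and performance is scored with F1.

```python
# Minimal sketch of a supervised misinformation classifier of the kind described
# above: TF-IDF features feeding a random forest, evaluated with F1.
# The tiny placeholder corpus and all hyperparameters are illustrative, not the paper's.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Placeholder labeled tweets: 1 = matches a conspiracy-theory topic, 0 = does not.
tweets = [
    "the virus was engineered in a lab and released on purpose",
    "5g towers are what is really making people sick",
    "they are hiding the true numbers from us, wake up",
    "this was all planned years ago by a shadowy elite",
    "wash your hands and avoid large gatherings",
    "new testing site opens downtown on monday",
    "stay home if you feel unwell and call your doctor",
    "health officials update guidance on mask use",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

X_train, X_test, y_train, y_test = train_test_split(
    tweets, labels, test_size=0.25, stratify=labels, random_state=42
)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),                  # unigram + bigram features
    RandomForestClassifier(n_estimators=200, random_state=42),
)
model.fit(X_train, y_train)
print("F1:", f1_score(y_test, model.predict(X_test)))
```

With a real hand-labeled corpus in place of the toy lists, the same pipeline scales directly; the sentiment, geospatial and dynamic topic modeling analyses the paper also describes are separate steps not shown here.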


How 2020 transformed big tech: the story of Facebook, QAnon and the world's slackening grip on reality

The Guardian

As with many others in Britain, lockdown hit Rachel and her husband, Philip, hard. Almost overnight, the couple, both in their early 50s, found themselves cut off from friends, family and colleagues. Before the Covid-19 outbreak, they had both been working every day; now Philip found himself furloughed, while Rachel was put on rotation with other essential staff, working fewer shifts at odd hours. They were unable to meet up with their four adult sons and daughters. They had to attend a family funeral while remaining socially distanced. Initially, Rachel coped in the way many others did. She played more video games than normal, and felt stressed at work, but as far as possible she managed. Philip reacted differently: for him, it seemed there must be more to it than the authorities struggling to cope with a novel virus and evolving expert advice. "The regularly changing and conflicting information that was coming from the government added to the feeling in him that they were making things up or covering something up," Rachel says now. Initially, Philip and Rachel (their names have been changed for this article) discussed his fears, but as lockdown went on, their conversations stopped. Philip was frustrated that Rachel wasn't taking his concerns seriously: someone had to be benefiting from the situation, he insisted, and events such as Dominic Cummings' Barnard Castle "eye test" only increased his belief that "they" knew the pandemic was fake, and the nation was being kept indoors for a more sinister purpose.


The Turing legacy: How Facebook's developers still tap learnings from wartime code-breaking

ZDNet

Bletchley Park was the home of British wartime codebreaking, but it's not just a historical curiosity; it still has relevance today. Even some of Facebook's engineering breakthroughs can be traced all the way back to the super-secret birthplace of the Bombe machine. Located in Milton Keynes, about 50 miles north of London, Bletchley Park was home to thousands of women and men who were part of the Government Code and Cypher School (GC&CS) during World War Two. It was there that British mathematician Alan Turing and his team cracked the German Enigma code using the world's first special-purpose computing device. Called the Bombe, Turing's electro-mechanical machine was capable of imitating Enigma devices, and carried out sophisticated cryptanalysis of the cipher to eventually read out the encoded messages that the Germans were exchanging. The technology effectively let the Allies listen in to their enemies' secret communications, and was instrumental in determining the outcome of the war.


Facebook donates £1 million to WWII code-breaking site Bletchley Park

Engadget

Bletchley Park was the site where Alan Turing and a team of World War II code-breakers cracked Germany's Enigma machine and helped save the world from Nazi tyranny. The site is now a popular museum, but it's facing a £2 million ($2.6 million) revenue shortfall due to the loss of tourism caused by the COVID-19 pandemic. Now, Facebook has announced that it will donate £1 million to the Bletchley Park Trust charity that runs the site. In a blog post, Facebook CTO Mike Schroepfer wrote that Facebook felt "lucky" to be involved with the site, and that the company "simply would not exist today" without its achievements. "The work of its most brilliant scientist, Alan Turing, still inspires our tens of thousands of engineers and research scientists today," he added.


From viral conspiracies to exam fiascos, algorithms come with serious side effects

The Guardian

Will Thursday 13 August 2020 be remembered as a pivotal moment in democracy's relationship with digital technology? Because of the coronavirus outbreak, A-level and GCSE examinations had to be cancelled, leaving education authorities with a choice: give the kids the grades that had been predicted by their teachers, or use an algorithm. They went with the latter. The outcome was that more than one-third of results in England (35.6%) were downgraded by one grade from the mark issued by teachers. This meant that a lot of pupils didn't get the grades they needed to get to their university of choice.


Tinder launches apocalyptic Swipe Night experience in the UK and around the world

Mashable

Trying to find love as the world ends? That premise is central to Tinder's interactive Swipe Night event, which launches in the UK and around the world on Sept. 12 at 10am. If you're unfamiliar with Swipe Night, then here's a lil catch up: Swipe Night is a first-person choose-your-own-adventure style event where Tinder users can swipe at key moments to determine the direction of the story within the app. Swiping doesn't just affect how the story ends -- it also has a bearing on who users match with and what they end up chatting about. As for the storyline, well, it couldn't be more pertinent to the times we're living in.


EliteSingles vs. Match: How do the dating sites compare in the UK?

Mashable

TV makes meeting people look much too easy. No one expects to live across the hall from their soulmate like Monica and Chandler or to find love at their small-town office job like Jim and Pam. Between success stories from Love Island and going on first dates via video calls to get around a pandemic, the rules for finding love have officially gone right out the window. Online dating is hardly a novel way to meet people and is an increasingly popular topic of study. If you're still doubting the possibility of finding love online, consider this study cited in the MIT Technology Review that found that compatibility was greater in partners who had met online.


Government paid Vote Leave AI firm to analyse UK citizens' tweets

The Guardian

Privacy campaigners have expressed alarm after the government revealed it had hired an artificial intelligence firm to collect and analyse the tweets of UK citizens as part of a coronavirus-related contract. Faculty, which was hired by Dominic Cummings to work for the Vote Leave campaign and counts two current and former Conservative ministers among its shareholders, was paid £400,000 by the Ministry of Housing, Communities and Local Government for the work, according to a copy of the contract published online. In June the Guardian reported Faculty had been awarded the contract, but that key passages in the published version of the document describing the work that the company would carry out had been redacted. In response to questions about the contract in the House of Lords, the government published an unredacted version of the contract, which describes the company's work as "topic analysis of social media to understand public perception and emerging issues of concern to HMG arising from the Covid-19 crisis". A further paragraph describes how machine learning will be applied to social media data.
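The contract describes the work only as "topic analysis of social media" with machine learning applied to the data; it gives no detail of the method. Purely as an illustration of how such topic analysis is commonly done, and not as a description of Faculty's actual pipeline, the sketch below runs LDA topic modeling over a bag-of-words representation of placeholder tweets.

```python
# A common approach to "topic analysis" of tweets: LDA over bag-of-words counts.
# This illustrates the general technique only; it is not Faculty's method,
# which the contract does not describe in detail.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Placeholder tweets; a real analysis would use a large collected corpus.
tweets = [
    "worried about schools reopening too early",
    "testing centre queues are getting longer every day",
    "furlough scheme has been a lifeline for our family",
    "why are test results taking over a week to come back",
    "schools need clearer guidance before pupils return",
    "small businesses still waiting on support payments",
]

counts = CountVectorizer(stop_words="english")
X = counts.fit_transform(tweets)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(X)

# Print the top words per topic as a rough summary of emerging themes.
terms = counts.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"topic {i}: {', '.join(top)}")
```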


Match vs. eharmony: Both are for serious relationships, but how do the dating sites compare in the UK?

Mashable

Though society has outgrown most of the cliché tropes that surrounded online dating in its early years, it can still be hard to believe that meeting someone online can grow into a genuine connection. If any dating sites can rekindle your hope that there's someone out there who wants the same thing you do, Match and eharmony are it. Technically speaking, online dating expands your selection of potential love interests to people you would never have stumbled upon IRL. It's the obvious next step after you've exhausted the qualified singles in your local dating pool, and the pandemic has made online dating an even more ubiquitous way to meet people than it already was. Since social distancing has essentially made hookups with strangers a non-issue, weeding out people who aren't taking dating seriously is easier than ever.


On the Nature and Types of Anomalies: A Review

arXiv.org Artificial Intelligence

Anomalies are occurrences in a dataset that are in some way unusual and do not fit the general patterns. The concept of the anomaly is generally ill-defined and perceived as vague and domain-dependent. Moreover, no comprehensive and concrete overviews of the different types of anomalies have hitherto been published. By means of an extensive literature review, this study therefore offers the first theoretically principled and domain-independent typology of data anomalies, and presents a full overview of anomaly types and subtypes. To concretely define the concept of the anomaly and its different manifestations, the typology employs four dimensions: data type, cardinality of relationship, data structure and data distribution. These fundamental and data-centric dimensions naturally yield 3 broad groups, 9 basic types and 61 subtypes of anomalies. The typology facilitates the evaluation of the functional capabilities of anomaly detection algorithms, contributes to explainable data science, and provides insights into relevant topics such as local versus global anomalies.
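The typology itself is conceptual and prescribes no detection algorithm, but the local-versus-global distinction it highlights can be made concrete with a small sketch: a global z-score test misses a point that only looks unusual relative to its immediate neighbourhood, while a local-density method such as Local Outlier Factor flags it. The data and thresholds below are made up for demonstration.

```python
# A small illustration of the local-vs-global anomaly distinction mentioned above.
# The typology does not prescribe these algorithms; data and thresholds are made up.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# A tight cluster around 0, a looser cluster around 50, and one point (8.0) that
# is unremarkable globally but unusual relative to its local neighbourhood.
tight = rng.normal(0, 0.5, size=100)
loose = rng.normal(50, 10.0, size=100)
data = np.concatenate([tight, loose, [8.0]]).reshape(-1, 1)

# Global view: z-score against the whole dataset. The point at 8.0 looks normal.
z = np.abs((data - data.mean()) / data.std())
print("points flagged by global z-score:", int((z > 3).sum()))

# Local view: LocalOutlierFactor compares each point's density with that of its
# neighbours, so the point at 8.0 stands out against the tight cluster near 0.
lof = LocalOutlierFactor(n_neighbors=20)
flags = lof.fit_predict(data)          # -1 marks local outliers
print("points flagged as local outliers:", int((flags == -1).sum()))
```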