An international scam ring is targeting dating app users in a romance scam that deprives victims not only of their cryptocurrency but also of control of their handsets. On Wednesday, Sophos cybersecurity researchers named the gang "CryptoRom" and said it has recently expanded its operations from Asia into both the United States and Europe. Romance scams are an insidious and constant problem, and thanks to the rising popularity of dating apps, they are no longer limited to phishing emails. Instead, fraudsters will 'match' with their victims, feign interest until they build a foundation of trust, and then ask for money, only to vanish soon after. In recent years, romance scams have become more sophisticated, with some cybercriminals offering their victims 'exclusive' trading deals or cryptocurrency investments, using the lure of easy profit as well as a potential love match. Interpol warned of an uptick in investment-based romance fraud across dating apps in January this year.
Sophos released a new report this week about a dating app scam that led to the theft of millions of dollars from people on Tinder, Bumble, Grindr, Facebook Dating and similar apps. After gaining victims' trust on these dating apps, scammers convinced them to download fake crypto apps, where they duped them into investing money before freezing the accounts. The scammers abused Apple's Developer Enterprise program -- and its Enterprise/Corporate Signature -- to distribute these fraudulent crypto apps, which masqueraded as Binance and other legitimate brands. Sophos said its threat hunters also observed the scammers abusing the Enterprise Signature to manage victims' devices remotely. Apple did not respond to requests for comment, including one from Sophos about the issue.
Customer experience will drive your bottom line, and AI will help you win that race. Meanwhile, privacy is becoming the key to keeping those wins. In today's era of digital business, customer experience is the sole factor that distinguishes your service from your competitors'. While McKinsey forecast that the era of hyper-personalisation was dawning upon us, Gartner published multiple reports within the span of a year remarking on the rising value of AI: after all, great, personalised experiences inevitably leverage AI across multiple functions, achieving astonishing results in the process. At the same time, legislation was setting new benchmarks for the cost of privacy breaches. In 2021, Amazon was hit with a €746m fine for non-compliance with the General Data Protection Regulation (GDPR).
There's increasing momentum to optimize operations with digital processes and to meet customers where they are: online. But how can enterprises manage this change securely? Over the past 18 months, enterprises have drastically accelerated this shift. On top of that, customers are becoming increasingly digital, and this behavior is predicted to continue even as the world opens up further and customers have more options. Enterprises face some tough challenges as they optimize their infrastructures to better meet changing customer demands and advance their key business priorities.
The trick of Refik Anadol's Machine Hallucinations, a three-day public art installation at The Shed in New York City, is to transform the processing of data into surreal hypnosis. The immersive audiovisual exhibit towers over a cavernous 17,000 sq ft gallery in Hudson Yards, where an outer ring of screens features a shimmering, chameleonic display of what looks like pixelated sand. But each square is a narrative of data: a familiar image (a tree, a building, a lamppost) drawn from over 130m publicly available images of New York City searched and collected by Anadol and his team's algorithms, morphed into a single-colored square and then silenced by a single question: what would you do if you owned your data? The free exhibit, part of Project Liberty, a $250m project to shift data ownership from private mega-corporations to individual users, makes a tactile, sensory, emotional argument for data dignity and the decentralization of internet power, concepts often so bogged down in technicality, abstraction and vagueness as to be inaccessible. The overarching aim of Project Liberty is to imagine an internet future not governed by tech CEOs, the forfeit of your data for participation, surveillance capitalism and the whims of social media companies aiming for infinite scale.
A key part of the NLP ethics movement is responsible use of data, but exactly what that means, or how it can best be achieved, remains unclear. This position paper discusses the core legal and ethical principles for the collection and sharing of textual data, and the tensions between them. We propose a potential checklist for responsible data (re-)use that could both standardise the peer review of conference submissions and enable a more in-depth view of published research across the community. Our proposal aims to contribute to the development of a consistent standard for data (re-)use, embraced across NLP conferences.
Highly realistic deepfake videos didn't quite make the splash some feared they would during the 2020 presidential election. Nevertheless, deepfakes are causing trouble for regular people. In March, the Federal Bureau of Investigation warned that it expected fraudsters to leverage "synthetic content for cyber … operations in the next 12-18 months." In deepfake videos, which first appeared in 2017, a computer-generated face (often of a real person) is superimposed on someone else's body. After the swap, fraudsters can make the target appear to say or do just about anything.
The US Federal Trade Commission (FTC) warns of extortion scammers targeting the LGBTQ community via online dating apps such as Grindr and Feeld. As the FTC revealed, the fraudsters pose as potential romantic partners on LGBTQ dating apps, sending explicit photos and asking their targets to reciprocate. Victims who fall for the trick are then blackmailed into paying a ransom, usually in gift cards, under the threat of having the shared sexual imagery leaked to their family, friends, or employers. "To make their threats more credible, these scammers will tell you the names of exactly who they plan to contact if you don't pay up. This is information scammers can find online by using your phone number or your social media profile," the FTC said.
The Financial Times reports that the Irish Data Protection Commission has fined WhatsApp €225 million ($266.8 million) for not sharing enough detail about how it shares European Union users' data with Facebook. The messaging service allegedly failed to live up to its General Data Protection Regulation (GDPR) transparency obligations. The Commission also said the data sharing itself violated the GDPR. WhatsApp was merely storing "pseudonymous" phone number data, for instance, rather than truly anonymizing it. While the numbers were stored as lossy hashes, WhatsApp could still reproduce the mapping between a hash and a number -- it could tie that number to a specific person if it wanted.
The ruling asked WhatsApp both to improve its transparency and to bring the data sharing in line with the GDPR.
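Hashing phone numbers is weak pseudonymisation because the input space is tiny: anyone who knows the hashing scheme can enumerate every plausible number and build a reverse lookup table. A minimal sketch of the idea (the hash function, truncation, and the phone-number prefix here are illustrative assumptions, not WhatsApp's actual scheme):

```python
import hashlib

def lossy_hash(phone: str) -> str:
    # Illustrative "lossy" hash: SHA-256 truncated to 6 hex characters.
    # WhatsApp's real scheme is not public; this is only an assumption.
    return hashlib.sha256(phone.encode()).hexdigest()[:6]

# Whoever knows the scheme can enumerate the (small) phone-number space
# and invert the table, defeating the pseudonymisation. Here we sweep a
# hypothetical 10,000-number block behind one prefix.
numbers = [f"+3531234{i:04d}" for i in range(10_000)]
reverse: dict[str, list[str]] = {}
for n in numbers:
    reverse.setdefault(lossy_hash(n), []).append(n)

stored = lossy_hash("+35312340042")   # the "pseudonymous" record on file
candidates = reverse[stored]          # ...maps straight back to the number
```

Because the hash is lossy (truncated), a few numbers may collide in one bucket, but the candidate list is still small enough to identify a person, which is why regulators treat this as pseudonymisation rather than anonymisation.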
The graph represents a network of 1,365 Twitter users whose tweets in the requested range contained "#iiot", or who were replied to or mentioned in those tweets. The network was obtained from the NodeXL Graph Server on Saturday, 21 August 2021 at 20:59 UTC. The requested start date was Tuesday, 17 August 2021 at 00:01 UTC and the maximum number of tweets (going backward in time) was 7,500. The tweets in the network were tweeted over the 3-day, 6-hour, 15-minute period from Friday, 13 August 2021 at 17:43 UTC to Monday, 16 August 2021 at 23:59 UTC. Additional tweets that were mentioned in this data set were also collected from prior time periods.
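A "replied to or mentioned" network like this one is simply a directed graph with an edge from each tweet's author to every user that tweet mentions or replies to. A minimal sketch using hypothetical tweet records (the field names and usernames are assumptions for illustration; NodeXL's actual schema differs):

```python
# Hypothetical tweet records; a real collection would come from the
# Twitter API or the NodeXL Graph Server, whose schemas differ.
tweets = [
    {"author": "alice", "mentions": ["bob"], "reply_to": None},
    {"author": "bob",   "mentions": [],      "reply_to": "carol"},
    {"author": "carol", "mentions": ["alice", "bob"], "reply_to": None},
]

# Directed edges author -> user for every mention or reply: the same
# "replied to or mentioned" relation the NodeXL network is built on.
edges = set()
for t in tweets:
    for target in t["mentions"]:
        edges.add((t["author"], target))
    if t["reply_to"]:
        edges.add((t["author"], t["reply_to"]))

# The user set includes authors plus anyone mentioned or replied to,
# mirroring how users enter the graph without tweeting the hashtag.
users = {u for edge in edges for u in edge} | {t["author"] for t in tweets}
print(len(users), len(edges))
```

On these toy records the graph has 3 users and 4 edges; the real network above is the same construction scaled to 1,365 users.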