UK regulator to write to WhatsApp over Facebook data sharing

The Guardian

The UK's data regulator is writing to WhatsApp to demand that the chat app not hand user data to Facebook, as millions worldwide continue to sign up for alternatives such as Signal and Telegram to avoid forthcoming changes to its terms of service. Elizabeth Denham, the information commissioner, told a parliamentary committee that in 2017, WhatsApp had committed not to hand any user information over to Facebook until it could prove that doing so respected the GDPR. But, she said, that agreement was enforced by the Irish data protection authority only until the Brexit transition period ended on 1 January. Now that Britain is fully outside the EU, ensuring that those promises are being kept falls to the Information Commissioner's Office. "The change in the terms of service, and the requirement of users to share information with Facebook, does not apply to UK users or to users in the EU," Denham told the digital, culture, media and sport sub-committee on online harms and disinformation, "and that's because in 2017 my office negotiated with WhatsApp so that they agreed not to share user information and contact information until they could show that they complied with the GDPR."

UK police warn of sextortion attempts in intimate online dating chats


As politicians play whack-a-mole with COVID-19 infection rates and try to balance the economic damage caused by lockdowns, stay-at-home orders have also impacted those out there in the dating scene. No longer able to meet up for a drink, a coffee, or now even a walk in the park, organizing an encounter with anyone other than your household or support bubble is banned and can result in a fine in the United Kingdom -- and this includes both dates and overnight stays. Therefore, the only feasible option available is online connections, by way of social networks or dating apps. Dating is hard enough at the best of times but sexual desire doesn't disappear just because you are cooped up at home. Realizing this, a number of healthcare organizations worldwide have urged us not to contribute to the spread of COVID-19 by meeting up with others for discreet sex outside of our social bubbles, bringing new meaning to the phrase, "You are your safest sex partner."

To make AI a success, every child needs to understand the risk and potential of algorithms


With artificial intelligence estimated to have the potential to deliver as much as a 10% increase to the UK's GDP before 2030, the challenge remains to unlock the technology's potential – and to do so, a panel of AI experts recommends placing a bet on young brains. A new report from the AI Council, an independent committee that provides advice to the UK government on all algorithmic matters, finds that steps need to be taken from the very start of children's education for artificial intelligence to flourish across the country. The goal, for the next ten years, should be no less ambitious than to ensure that every child leaves school with a basic sense of how AI works. This is not only about understanding the basics of coding and ethics, but about knowing enough to be a confident user of AI products, to look out for potential risks and to engage with the opportunities that the technology presents. "Without basic literacy in AI specifically, the UK will miss out on opportunities created by AI applications, and will be vulnerable to poor consumer and public decision-making, and the dangers of over-persuasive hype or misplaced fear," argues the report.

Raising standards for global data-sharing


In their Policy Forum “How to fix the GDPR's frustration of global biomedical research” (2 October 2020, p. 40), J. Bovenberg et al. argue that the biomedical research community has struggled to share data outside the European Union as a result of the EU's General Data Protection Regulation (GDPR), which strictly limits the international transfer of personal data. However, they do not acknowledge the law's flexibility, and their solutions fail to recognize the importance of multilateral efforts to raise standards for global data-sharing. Bovenberg et al. express concern about the thwarting of “critical data flows” in biomedical research. However, the limited number of critical commentaries (1, 2) and registered complaints (3) indicates that hindered data exchange may not be a substantial global problem. Moreover, the authors concede that during the COVID-19 pandemic, data transfers remain ongoing because transfers “necessary for important reasons of public interest” are already provided for in the law (4, Article 49(1)(d)). The European Data Protection Board (EDPB) has cautioned that transfers under this derogation must not become the rule in practice (5), but this conditional support for international COVID-19 data sharing shows that the law already provides suitable flexibility. This flexibility also shows the EDPB's recognition of the pressing social need that biomedical research represents for the global research community during the COVID-19 pandemic, while also seeking to ensure that this remains the exception and not the beginning of a normalized practice. Bovenberg et al. contend that pseudonymized data should not be considered personal data in the hands of an entity that does not possess the key needed for re-identification.
This proposal runs against well-established guidance in EU member states such as Ireland (6) and Germany (7), and it does not take into account the cases in which identifiers remain attached to transferred biomedical data or in which data could be identified without a key. Bovenberg et al. also neglect to state that the GDPR has special principles and safeguards for particularly sensitive re-identifiable data, not just for the protection of privacy but also for the security and integrity of health research data—aims that align with all high-quality scientific research. Respecting these standards (both technical and organizational) is fundamental to ensuring better data security and accuracy in the transferring of huge datasets of sensitive health data that are essential to global collaboration (4, Articles 5 and 9, Recitals 53 and 54; 8). Thus, these rules should not be subject to exemptions, which would result from not classifying pseudonymized data as personal data. The purpose of the GDPR's strict rules is to ensure that when personal data are transferred to non-EU countries, the level of protection ensured in the European Union is not undermined. The EU's Court of Justice decisions (9, 10) make it clear that ensuring an adequate level of protection in non-EU countries, especially independent oversight and judicial remedies—which the Court found lacking in the United States—is a matter of fundamental rights. This discrepancy is an opportunity for non-EU countries, including the United States, to raise their data protection standards to the level of the European Union's, not for the European Union to decrease its own standards in a regulatory race to the bottom.
We encourage research organizations and country delegations to work with the European Commission, national data protection authorities, and the EDPB to craft interoperable rules on data sharing applicable to biomedical research in ways that do not undermine fundamental rights owed to data subjects.

1. R. Eiss, Nature 584, 498 (2020).
2. R. Becker et al., J. Med. Internet Res. 22, e19799 (2020).
3. A. Jelinek, EDPB response letter to Mark W. Libby, Chargé d'Affaires, United States Mission to the European Union (2020).
4. GDPR (2016).
5. EDPB, “Guidelines 03/2020 on the processing of data concerning health for the purpose of scientific research in the context of the COVID-19 outbreak” (2020).
6. Data Protection Commission, “Guidance on Anonymisation and Pseudonymisation” (2019).
7. German Federal Ministry of the Interior, Building and Community, “Draft for a Code of Conduct on the use of GDPR compliant pseudonymisation” (2019).
8. D. Anderson et al., Int. Data Privacy L. 10, 180 (2020).
9. Case C-362/14, Maximilian Schrems v. Data Protection Commissioner (Court of Justice of the EU, 2015).
10. Case C-311/18, Data Protection Commissioner v. Facebook Ireland Limited and Maximillian Schrems (Court of Justice of the EU, 2020).

One in six children steal money to pay for video game loot boxes

Daily Mail - Science & tech

Around one in six children steal money from their parents to pay for video game loot boxes – in-game 'treasure chests' that award players random virtual prizes. In a survey of British teen and young adult gamers, the Gambling Health Alliance (GHA) found 15 per cent had taken money from parents without permission to buy loot boxes. Overall, one in ten – 11 per cent – had used their parents' credit or debit cards to fund their loot box purchases, while 9 per cent had borrowed money they couldn't repay for the addictive in-game feature. Three young gamers' loot box buying habits resulted in their families having to re-mortgage their homes to cover the costs, according to the study. The GHA is currently putting pressure on the UK government to class loot boxes in video games as a form of gambling.

AI will take away lots of jobs. And we are nowhere near ready to replace them


The scale of the challenge that automation poses to the jobs market needs to be met with much stronger action to up-skill the workforce, finds a new report published by a committee in the UK Parliament. The House of Lords' select committee on artificial intelligence raised concerns at the "inertia" that is slowing down the country when it comes to digital skills, and urged the government to take steps to make sure that people have the opportunity to re-skill and re-train, to be able to adapt to the changing labor market that AI is bringing about. Citing research carried out by Microsoft, the committee stressed that only 17% of UK employees say that they have been part of re-skilling efforts, which sits well below the global average of 38%. Microsoft also recently reported that almost 70% of business leaders in the UK believe that their organization currently has a digital skills gap, and that two-thirds of employees feel that they do not have the appropriate digital skills to fulfil new and emerging roles in their industry. Even basic digital skills are lacking: a recent Lloyds Bank survey found that 19% of individuals in the UK couldn't complete tasks such as using a web browser.

The Queen's Christmas message will be available on Alexa for the first time


You won't have to go out of your way to catch Queen Elizabeth II's annual Christmas Day message if you have an Echo (or a similar device) on hand. The Guardian reports that the Queen's message will be available on smart speakers for the first time through Amazon's Alexa. So long as you live in an English-speaking country, you can ask Alexa to "play the Queen's Christmas Day message" after 3PM GMT (10AM ET) and get the inspiring speech while you're finishing a holiday meal. Google Assistant and HomePod users are out of luck for the on-demand message, but you can always stream BBC Radio 4 on your speaker to get the live broadcast. It's a relatively late move, given that smart speakers have been around for several years.

The algorithms are watching us, but who is watching the algorithms?


Empowering algorithms to make potentially life-changing decisions about citizens still comes with significant risk of unfair discrimination, according to a new report published by the UK's Center for Data Ethics and Innovation (CDEI). In some sectors, the need to provide adequate resources to make sure that AI systems are unbiased is becoming particularly pressing – namely, the public sector, and specifically, policing. The CDEI spent two years investigating the use of algorithms in both the private and the public sector, and was faced with many different levels of maturity in dealing with the risks posed by algorithms. In the financial sector, for example, there seems to be much closer regulation of the use of data for decision-making, while local government is still in the early days of managing the issue. Although awareness of the threats that AI might pose is growing across all industries, the report found that there is no particular example of good practice when it comes to building responsible algorithms.

Achieving Security and Privacy in Federated Learning Systems: Survey, Research Challenges and Future Directions

Federated learning (FL) allows a server to learn a machine learning (ML) model across multiple decentralized clients that privately store their own training data. In contrast with centralized ML approaches, FL saves the server computation and does not require the clients to outsource their private data to the server. However, FL is not free of issues. On the one hand, the model updates sent by the clients at each training epoch might leak information on the clients' private data. On the other hand, the model learnt by the server may be subjected to attacks by malicious clients; these security attacks might poison the model or prevent it from converging. In this paper, we first examine security and privacy attacks on FL and critically survey the solutions proposed in the literature to mitigate each attack. Afterwards, we discuss the difficulty of simultaneously achieving security and privacy protection. Finally, we sketch ways to tackle this open problem and attain both security and privacy.
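The update exchange the abstract describes can be sketched as a minimal federated-averaging round (in the style of FedAvg) for linear regression: each client trains locally on data it never shares, and the server only ever sees the resulting weights, which it averages. The function names (`local_update`, `fedavg_round`) and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's training pass: plain gradient descent for linear
    regression on its private (X, y). Only the updated weights leave
    the client; the raw data never do."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """Server-side step: collect each client's locally trained weights
    and average them, weighted by local dataset size."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(np.stack(updates), axis=0, weights=np.array(sizes, float))

# Toy setup: three clients whose samples all follow y = 2*x0 + 1*x1.
rng = np.random.default_rng(0)
true_w = np.array([2.0, 1.0])
clients = []
for n in (40, 60, 50):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(30):          # communication rounds
    w = fedavg_round(w, clients)
print(np.round(w, 2))        # converges toward the true weights [2, 1]
```

The sketch also makes the paper's threat model concrete: the per-client `updates` are exactly the vectors that can leak private data to an honest-but-curious server, and a single malicious client returning arbitrary weights from `local_update` would skew the unweighted-trust average, which is why poisoning defenses replace `np.average` with robust aggregators.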

Former NHS surgeon creates AI 'virtual patient' for remote training


A former NHS surgeon has created an AI-powered "virtual patient" which helps to keep skills sharp during a time when most in-person training is on hold. Dr Alex Young is a trained orthopaedic and trauma surgeon who founded Virti and set out to use emerging technologies to provide immersive training for both new healthcare professionals and experienced ones looking to hone their skills. COVID-19 has put most in-person training on hold to minimise transmission risks. Hospitals and universities across the UK and US are now using the virtual patient as a replacement, including our fantastic local medics and surgeons at the Bristol NHS Foundation Trust. The virtual patient uses Natural Language Processing (NLP) and 'narrative branching' to allow medics to roleplay lifelike clinical scenarios.