This translation by Jeffrey Ding, edited by Paul Triolo, covers some of the most interesting parts of the Standards Administration of China's 2018 White Paper on Artificial Intelligence Standardization, a joint effort by more than 30 academic and industry organizations overseen by the Chinese Electronics Standards Institute. Ding, Triolo, and Samm Sacks describe the importance of this white paper and other Chinese government efforts to influence global AI development and policy formulation in their companion piece, "Chinese Interests Take a Big Seat at the AI Governance Table." Historical experience demonstrates that new technologies can often improve productivity and promote societal progress. At the same time, because artificial intelligence (AI) is still in the early phase of development, the policies, laws, and standards for safety, ethics, and privacy in this area are worthy of attention. In the case of AI technology, issues of safety, ethics, and privacy have a direct impact on people's trust in AI as they interact with AI tools.
With the proliferation of AI principles worldwide, industry faces a new challenge: how to implement these principles? Since 2017, the international committee responsible for the standardization of AI (SC 42) has been tackling this challenge: it is developing standards covering both technical and organisational specifications to enable responsible and trustworthy AI. Forty-four countries are currently involved in the work of SC 42, and Australia plays an active role in the development of international AI standards, having formed standards committee IT-043 to be Australia's voice at SC 42. When it comes to AI, it is essential to provide for interoperability and global governance, which is why international AI standards have buy-in from key governments (such as China, the US, and the EU). Australia has also identified AI standards as an important national priority.
The role of religion in an increasingly secular world is a hotly debated topic. But a new scientific study indicates that children raised in religious societies perform worse in maths and science at school than their atheist or agnostic counterparts. The team behind the research suggests that standards in these subjects could be raised by keeping religion out of educational institutions. Of the 82 countries analysed, the five least religious were the Czech Republic, Japan, Estonia, Sweden, and Norway.
The UK's hopes of retaining an influential role for its data protection agency in shaping European Union regulations post-Brexit -- including helping to set any new Europe-wide rules around artificial intelligence -- look well and truly dashed. In a speech at the weekend in front of the International Federation for European Law, the EU's chief Brexit negotiator, Michel Barnier, shot down the notion of anything other than a so-called 'adequacy decision' being on the table for the UK after it exits the bloc. If granted, an adequacy decision is an EU mechanism for enabling citizens' personal data to flow more easily from the bloc to third countries -- as the UK will be after Brexit. Such decisions are only granted by the European Commission after a review of a third country's privacy standards, intended to determine that they offer protections essentially equivalent to EU rules. But the mechanism does not allow the third country to be involved, in any shape or form, in discussions around forming and shaping the EU's rules themselves.