Jeffrey Epstein told a journalist he funded Sophia the robot, who he claimed would have 'more empathy than a woman'


Jeffrey Epstein's tangled web leads down some surprising paths, including, he claimed, to Sophia the robot. The female robot, styled after Audrey Hepburn, made headlines in recent years for her eerily lifelike skin and appearance, complete with a diverse set of facial expressions, and for the artificial intelligence she uses to spout off quotes like "OK. I will destroy humans." She also got into a Twitter fight with Chrissy Teigen.

In a new essay detailing a journalist's friendship with Jeffrey Epstein over the past three decades, Edward Jay Epstein (the two are not related) says the wealthy financier told him in April 2013 that he was funding a Hong Kong group to build "the world's smartest robot," named Sophia.

Sophia was built by Hanson Robotics, a Hong Kong company created and led by David Hanson. In a statement shared with Business Insider, Hanson denied that Epstein ever directly contributed funding to either Sophia or Hanson Robotics: "With all of our software efforts, both inside Hanson Robotics, and via collaboration with universities and other institutions, we seek to further our mission to empower socially intelligent AI and robots that enrich the quality of human lives."

AI helps you move in faster - What is technology and what is it good for?


The last time I applied for a mortgage, I had forgotten how many different pieces of paper and documents I needed to find and hand over to the bank. At the time, I couldn't help but think how irritating this was, and that surely they already had all this data to hand – especially as I was an existing customer of the bank. What I hadn't really thought about was the process and technology behind the application and the reams of paper I was relinquishing – not until I recently met a company based in the Moorgate WeWork in London. Having written a few of these blogs already, I realise that I use terms like "the company" or "that company" a lot, so from now on I will simplify this by using "TechCo" (shorthand for "Tech Company").

AI and the Social Sciences Used to Talk More. Now They've Drifted Apart.


So, in light of these developments, how should social scientists think differently about people, the economy, and society? And how should the engineers who write these algorithms handle the social and ethical dilemmas their creations pose? "These are the kinds of questions you can't answer with just the technical solutions," says Dashun Wang, an associate professor of management and organizations at Kellogg. "These are fundamentally interdisciplinary issues." Indeed, economists seeking to predict how automation will impact the labor market need to understand which tasks machines are best suited to perform.

OCBC ramps up banking app with voice-based helper Frontier Enterprise


The OCBC Mobile Banking app now offers an artificial intelligence-powered, voice-based virtual assistant, which has already helped with over 20,000 requests made via voice since the feature's launch last August. In half of these requests, the customer was seeking information about spending categories and budgets; another 30% concerned past banking transactions. Other mobile banking services that were performed using voice included locating ATMs, paying bills and changing banking PINs. The new voice-activated banking service, which was developed and trained over 13 months and is called the OCBC Banking Assistant, 'lives' in the OCBC Mobile Banking app. The customer speaks to the assistant as if conversing with a human assistant – and the requested banking task gets done.

AI and ethics: The debate that needs to be had ZDNet


Whether we know it or not, artificial intelligence (AI) is already woven into everyday life. It's present in the way social media feeds are organised; the way predictive searches show up on Google; and how music services such as Spotify make song suggestions. The technology is also helping transform the way enterprises do business. Commonwealth Bank of Australia, for instance, has applied AI to analyse 200 billion data points to free up time so its customer service officers can focus on doing exactly what their title suggests: servicing customers. As a result, the bank has seen a 400% uplift in customer engagement.

Will God or Humans Decide if Artificial Intelligent Robots Deserve a Soul?


Artificial Intelligence (AI) is here today; it's not just the future of technology. It is also not just found in toy robots or Hollywood sci-fi movies. It's embedded in the fabric of your everyday life. Despite AI's promise, certain thinkers are deeply concerned about a time when machines might become fully sentient, rational agents--beings with emotions, consciousness, and self-awareness. "The development of full artificial intelligence could spell the end of the human race," Stephen Hawking told the BBC in 2014.

The Real Reason You Should Fear the Future of Artificial Intelligence.


Forget killer robots -- bias is the real danger of artificial intelligence. Oscar Wilde once argued that life imitates art more than art imitates life. Strangely, that's proving to be the case when it comes to AI development – but not in the way some had hoped. AI programs are made up of algorithms, or sets of rules that help them identify patterns so they can make decisions with little intervention from humans. Machine learning bias, also known as algorithm bias or AI bias, occurs when an algorithm produces results that are systematically prejudiced due to erroneous assumptions in the machine learning process.
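A minimal sketch can make the definition concrete. The data, names, and "model" below are entirely hypothetical (not from the article): a naive pattern-finder fitted to historically prejudiced hiring decisions simply reproduces that prejudice for new applicants, even though the groups' qualifications are identical.

```python
# Hypothetical sketch of algorithm bias: biased labels in, biased predictions out.
from statistics import mean

# Historical decisions: identical scores, but group "b" was systematically
# rejected -- the "erroneous assumption" baked into the training data.
history = [
    {"group": "a", "score": 7, "hired": 1},
    {"group": "a", "score": 7, "hired": 1},
    {"group": "b", "score": 7, "hired": 0},
    {"group": "b", "score": 7, "hired": 0},
]

def fit_group_rates(rows):
    """'Learn' each group's historical hire rate -- a stand-in for
    the pattern-finding step of a real machine-learning model."""
    groups = sorted({r["group"] for r in rows})
    return {g: mean(r["hired"] for r in rows if r["group"] == g) for g in groups}

model = fit_group_rates(history)
print(model)  # {'a': 1, 'b': 0}: equal qualifications, unequal predictions
```

The point of the toy is that no rule in the code mentions group membership with malicious intent; the disparity comes entirely from the data the model was fitted to.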

High-speed 5G network seen as ready to give big boost to online gaming

The Japan Times

CHIBA – At this year's Tokyo Game Show, the big draw was next-generation 5G networking -- setting pulses racing with the prospect of a radically more immersive gaming experience. Offering data transmission speeds around 100 times faster than 4G, 5G is expected to enable smoother imagery -- with lower latency, more vivid images and sharper motion. Industry experts say it will dramatically improve the quality of augmented and virtual reality games. "It was very smooth, responsive and consistent," said Omar Alshiji, a 23-year-old game designer from Bahrain, after trying out the fighting game Tekken at the NTT Docomo Inc. booth. The major mobile carrier installed 5G base stations at its booth this year, making the high-speed network available at the show.

Smart Talk


Conversational assistants are here to stay, making everything from boiling an egg to paying a bill that much easier. And consumers expect more of them day by day. If they meet these growing expectations, conversational assistants are in a position to transform the customer experience landscape. But do organizations have the customer centricity and organizational capabilities necessary to deploy these technologies successfully? In the new report from the Capgemini Research Institute, Smart Talk: How organizations and consumers are embracing voice and chat assistants, we talked to over 12,000 consumers who've used and continue to use voice and/or chat assistants and to 1,000 executives from consumer products and retail, financial services, and automotive, including pure-play digital players.

Predictive policing poses discrimination risk, thinktank warns


Predictive policing – the use of machine-learning algorithms to fight crime – risks unfairly discriminating against protected characteristics including race, sexuality and age, a security thinktank has warned. Such algorithms, used to mine insights from data collected by police, are currently deployed for various purposes including facial recognition, mobile phone data extraction, social media analysis, predictive crime mapping and individual risk assessment. Researchers at the Royal United Services Institute (RUSI), commissioned by the government's Centre for Data Ethics and Innovation, focused on predictive crime mapping and individual risk assessment and found algorithms that are trained on police data may replicate – and in some cases amplify – the existing biases inherent in the data set, such as over- or under-policing of certain communities. "The effects of a biased sample could be amplified by algorithmic predictions via a feedback loop, whereby future policing is predicted, not future crime," the authors said. The paper reveals that police officers, who were interviewed for the research, are concerned about the lack of safeguards and oversight regarding the use of predictive policing.
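The feedback loop the authors describe can be illustrated with a deliberately simplified simulation. Everything below is an illustrative assumption of mine, not the RUSI model: two areas with identical true crime rates, where patrols are allocated in proportion to *recorded* crime and new records scale with patrol presence, so an initial recording disparity grows over time.

```python
# Illustrative sketch of a predictive-policing feedback loop (hypothetical
# numbers): future policing is predicted from past records, not future crime.

TRUE_CRIME_PER_100_PATROLS = 10          # identical in both areas by construction
recorded = {"area_a": 12, "area_b": 8}   # area_a starts slightly over-policed

def allocate_patrols(records, total_patrols=100):
    """More recorded crime -> more patrols (the predictive-mapping step)."""
    total = sum(records.values())
    return {area: round(total_patrols * n / total) for area, n in records.items()}

for _ in range(50):
    for area, patrols in allocate_patrols(recorded).items():
        # Detections depend on where officers look, not on true differences.
        recorded[area] += patrols * TRUE_CRIME_PER_100_PATROLS // 100

print(recorded)  # {'area_a': 312, 'area_b': 208}: the gap widens from 4 to 104
```

Even though the underlying crime rates are equal, the area that started with more records keeps attracting more patrols and therefore generates more records -- the amplification-by-feedback effect the paper warns about.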