Developers are building better software, faster, using AI


Without any human guidance, the system learned to detect cats, people, and over 3,000 other objects just by ingesting 10 million images from YouTube videos. It proved that machines could learn without labor-intensive assistance from humans, and reach new levels of accuracy to boot.

Introducing: Unity Machine Learning Agents – Unity Blog


It is critical to our mission to enable machine learning researchers with the most powerful training scenarios, and to give back to the gaming community by enabling it to utilize the latest machine learning technologies. At Unity, we wanted to design a system that provides greater flexibility and ease of use to the growing groups interested in applying machine learning to developing intelligent agents. The ML-Agents SDK allows researchers and developers to transform games and simulations created using the Unity Editor into environments where intelligent agents can be trained using Deep Reinforcement Learning, Evolutionary Strategies, or other machine learning methods through a simple-to-use Python API. As mentioned above, we are excited to be releasing this open beta version of Unity Machine Learning Agents today; it can be downloaded from our GitHub page.
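The Python side of such a system typically exposes an observe-act-reward loop: the training script resets the environment, receives an observation, sends an action, and collects a reward until the episode ends. The sketch below is a self-contained toy illustrating that loop — the environment, its methods, and the reward scheme are all illustrative stand-ins, not the actual ML-Agents SDK API.

```python
import random

class ToyEnvironment:
    """Stand-in for a game environment exposed over a Python API.

    A 1-D corridor: the agent starts at position 0 and is rewarded
    for reaching the goal. All names here are illustrative.
    """
    def __init__(self, goal=4):
        self.goal = goal
        self.position = 0

    def reset(self):
        """Start a new episode and return the initial observation."""
        self.position = 0
        return self.position

    def step(self, action):
        """Apply an action (-1 left, +1 right); return (obs, reward, done)."""
        self.position = max(0, self.position + action)
        done = self.position >= self.goal
        reward = 1.0 if done else -0.01  # small per-step penalty
        return self.position, reward, done

random.seed(0)  # reproducible episode
env = ToyEnvironment()
obs = env.reset()
total_reward, done = 0.0, False
while not done:
    action = random.choice([-1, 1])   # a trained policy would go here
    obs, reward, done = env.step(action)
    total_reward += reward
print(f"episode finished at position {env.position}")
```

A reinforcement learning algorithm would replace the random action choice with a policy that is updated from the observed rewards.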

Flipboard on Flipboard


Today the company announced Unity Machine Learning Agents, open-source software linking its game engine to machine learning frameworks such as Google's TensorFlow. It will allow non-playable characters, through trial and error, to develop better, more creative strategies than a human could program, says Danny Lange, Unity's VP of AI and machine learning, using a branch of machine learning called deep reinforcement learning. Google's DeepMind, for instance, has used deep reinforcement learning to teach AI agents to play 1980s video games like Breakout, and, in part, to master the notoriously challenging ancient Chinese game Go. And Nvidia's new Isaac Lab uses rival Epic Games' Unreal Engine to generate lifelike virtual environments for training the algorithms that control actual robots.


International Business Times

Update: Apple's iOS 11 release is here, and many users have started seeing the update appear. For iPads, iOS 11 is compatible with the 12.9-inch iPad Pro (first and second generation), 10.5-inch iPad Pro, 9.7-inch iPad Pro, iPad Air 2, iPad Air, iPad (fifth generation), iPad mini 4, iPad mini 3, and iPad mini 2. With iOS 11, users will be able to pay friends and family with Apple Pay in iMessage, authenticating with Touch ID, or with Face ID for those who get the iPhone X. Using the information it gathers, Siri will suggest topics in the News app, improve its suggestions in iMessage, and create calendar notifications based on things booked in Safari.

Validation Methods For Trading Strategy Development


In this sense, the test is still useful, but trading strategy developers know that good out-of-sample performance for strategies developed via multiple comparisons is in most cases a random result. As shown by two examples that correspond to two major market regimes, even highly significant strategies, after corrections for bias are applied, can still fail due to changing markets. Conclusion: robustness tests and stochastic modeling in general can assess over-fitting, but the Type-I error rate (false discoveries) is high, especially in the case of multiple comparisons, even when they are applied to an out-of-sample period. Most validation tests done by practitioners, and also by academics, either suffer from multiple-comparisons bias or fail under changing market conditions.
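The multiple-comparisons effect the author describes can be demonstrated in a few lines: generate many purely random "strategies," select the best one in-sample, and observe that its apparent edge does not carry into the out-of-sample period. A minimal simulation, with all parameter values chosen for illustration:

```python
import random

random.seed(42)
n_strategies, n_days = 1000, 500
split = n_days // 2  # first half in-sample, second half out-of-sample

# Each "strategy" is pure noise: daily returns drawn from N(0, 1%).
strategies = [[random.gauss(0, 0.01) for _ in range(n_days)]
              for _ in range(n_strategies)]

def mean(xs):
    return sum(xs) / len(xs)

# Multiple comparisons: pick the best in-sample performer of the 1000.
best = max(strategies, key=lambda s: mean(s[:split]))

in_sample = mean(best[:split])
out_sample = mean(best[split:])
print(f"in-sample daily mean:     {in_sample:+.5f}")
print(f"out-of-sample daily mean: {out_sample:+.5f}")
```

The selected strategy shows a strongly positive in-sample mean purely by selection, while its out-of-sample mean is statistically indistinguishable from zero — exactly the false discovery (Type-I error) pattern the article warns about.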

Machine Learning APIs power a more intelligent Internet of Things at the Edge


Where these IoT devices are already doing some limited analytics at or very near the point of capture (as in true edge-computing systems), there is an opportunity to create a more intelligent, more relevant, and more positive experience or outcome from the Internet of Things by using Haven OnDemand Machine Learning APIs to perform early analytics that enhance or augment the data being acquired and aggregated at the edge. It achieved this by analyzing open crime-statistics data from local law enforcement to detect specific crime trends and anomalies. A more intelligent IoT solution would analyze still images to detect the presence of faces, recognize and extract text via optical character recognition (OCR), identify corporate logos, and even read barcodes. Examples include counting customers, analyzing customer demographics, analyzing customers' personal effects to detect logos and determine brand preferences, analyzing real-time social media check-in mentions for sentiment, and point-of-sale trend analysis.

CCleaner Malware Shows Software's Serious Supply-Chain Security Problem


On Monday, Cisco's Talos security research division revealed that hackers had sabotaged the ultra-popular, free computer-cleanup tool CCleaner for at least the last month, inserting a backdoor into updates to the application that landed on millions of personal computers. Three times in the last three months, hackers have exploited the digital supply chain to plant tainted code that hides in software companies' own systems of installation and updates, hijacking those trusted channels to stealthily spread their malicious code. According to Avast, the tainted version of the CCleaner app had been installed 2.27 million times from when the software was first sabotaged in August until last week, when a beta version of a Cisco network-monitoring tool caught the rogue app acting suspiciously on a customer's network. One month later, researchers at the Russian security firm Kaspersky discovered another supply-chain attack they called "ShadowPad": hackers had smuggled a backdoor capable of downloading malware into hundreds of banks, energy companies, and drug companies via corrupted software distributed by the South Korea-based firm NetSarang, which sells enterprise and network-management tools.

To chat or not to chat? Shakespeare has the answer to your question


Chatbots are vying to become one of the cornerstones of the messaging world: using AI tools like natural language processing and machine learning, developers hope to tap into the popularity of chat apps as a medium of communication and explore new ways to help you get information, buy things, plan your life, and more by letting you converse with intelligent computers instead of humans. Shakespeare, as the bot is called, is a new messaging bot based on the works of the Bard of Avon. By attaching keywords to different kinds of intent (for example, food-lunch-dish or here-place-hell), its creators have turned Shakespeare's lines into potential answers to questions. The idea of creating bots that are reminiscent of other people has been explored before, such as the chatbot a friend made as a memorial and memento of someone she loved who had passed away.
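The keyword-to-intent routing described above can be sketched in a few lines: lines from the corpus are tagged with an intent, an incoming question is matched against each intent's keyword set, and the first hit selects a response. The intents, keywords, and quotes below are illustrative inventions, not the actual bot's data.

```python
# Canned responses, one per intent (quotes chosen for illustration).
RESPONSES = {
    "food": "If music be the food of love, play on.",
    "place": "This is the forest of Arden.",
    "farewell": "Parting is such sweet sorrow.",
}

# Keyword sets that route a question to an intent.
KEYWORDS = {
    "food": {"lunch", "dinner", "eat", "food", "hungry"},
    "place": {"where", "here", "place", "location"},
    "farewell": {"bye", "goodbye", "farewell", "leave"},
}

def reply(question: str) -> str:
    """Return the response for the first intent with a keyword match."""
    words = set(question.lower().replace("?", "").split())
    for intent, vocab in KEYWORDS.items():
        if words & vocab:  # any shared keyword routes to this intent
            return RESPONSES[intent]
    return "The rest is silence."  # fallback when nothing matches

print(reply("What's for lunch?"))
# prints: If music be the food of love, play on.
```

Production bots replace the exact keyword match with a trained intent classifier, but the routing structure — question to intent to canned line — is the same.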

Machine learning at Elasticsearch: In quest of data anomalies - JAXenter


Earlier this year, we formed a new partnership with Google to provide Elastic Cloud on GCP, launched Elastic Cloud Enterprise (ECE) for enterprises to deploy and manage multiple Elastic Stack environments on-premise or in a private cloud, and we just acquired a SaaS APM company based in Copenhagen. A lot of things make it popular with developers: it's easy to get started, and one can download it onto a laptop; it works great for both structured and unstructured data; Elasticsearch scales horizontally; ingesting data into Elasticsearch is easy with 200 connectors; Kibana visualizations are intuitive, powerful, and provide real-time exploration; and everything works on-premise or in the cloud. Shay Banon: Today this only works with time-series data, such as log files, application and performance metrics, network flows, and financial or transactional data, which is a lot. Shay Banon: Earlier this summer, we acquired Opbeat, an application performance management (APM) company based in Copenhagen.
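The core idea behind anomaly detection on time-series data like the metrics Banon lists is to flag points that deviate sharply from a learned baseline. The sketch below uses a simple rolling z-score as an illustration of that idea — it is not Elastic's actual algorithm, which builds statistical baselines that also account for periodicity.

```python
from statistics import mean, stdev

def anomalies(series, window=10, threshold=3.0):
    """Return indices whose value is more than `threshold` standard
    deviations from the mean of the preceding `window` points."""
    flagged = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# A steady request-latency metric (ms) with one spike injected at index 25.
latencies = [100 + (i % 3) for i in range(40)]
latencies[25] = 500
print(anomalies(latencies))
```

The trade-off to note: right after a spike, the spike itself inflates the rolling standard deviation, temporarily desensitizing the detector — one reason production systems use more robust baseline models than a plain rolling window.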

New AI Can Develop Video Games Just by Watching Gameplay


But this new AI, developed by researchers at the Georgia Institute of Technology with the help of the original NES game Super Mario Bros., is capable of building a video game just by watching gameplay. In a new study, "Game Engine Learning from Video," the researchers explained how the AI managed to recreate the engine of Mario's most famous outing. Instead, the team hopes that this AI will be able to aid developers in an effort to "speed up game development and experiment with different styles of play," according to a recent press release. And this AI works quickly: the team reported that the system only had to observe two minutes of Super Mario Bros. gameplay before it was able to build its own version of the game.