The International Telecommunication Union (ITU), the United Nations specialised agency for information and communication technologies (ICTs), has launched a new ITU Focus Group to establish a basis for ITU standardisation of machine learning (ML) techniques that bring more automation and intelligence to ICT network design and management. Machine learning algorithms are helping operators make smarter use of network-generated data, enabling ICT networks and their components to adapt their behaviour autonomously in the interests of efficiency, security and optimal user experience. Fixed and mobile networks generate huge volumes of data at both the network-infrastructure and user/customer levels, containing useful information such as location, mobility and call patterns. New ML methods for big data analytics in communication networks can extract relevant information from this data, then leverage that knowledge for autonomic network control and management as well as service provisioning.
Knowing how to write high-quality software matters: the days of one team writing throwaway models and another team implementing them in production are slowly coming to an end. With programming languages like Python and R and their packages making it easy to work with data and models, it is reasonable to expect a data scientist or machine learning engineer to attain a high level of programming proficiency and understand the basics of system design.
The next level will be using artificial intelligence in election campaigns and political life. Highly sophisticated micro-targeting operations rely on big data and machine learning to influence people's emotions. Typically disguised as ordinary human accounts, bots spread misinformation and contribute to an acrimonious political climate on sites like Twitter and Facebook. For example, if a person is interested in environment policy, an AI targeting tool could be used to help them find out what each party has to say about the environment.
Crowdsourced investment strategies are many and varied, but Numerai crowdsources machine intelligence in a unique way: it supplies its network of data scientists with encrypted data on which to test their machine learning models, removing any bias attached to the application of the algorithms. NMR tokens were not sold like a typical initial coin offering; rather, 1.2 million tokens (a cap of 21m has been stated) were distributed via smart contracts on Ethereum, only to participating data scientists. All this value pumped into the tokens at once presented a number of immediate risks to the ecosystem, offering a huge bounty to hackers and threatening its carefully aligned goals; a worry aired on discussion forums was that speculation around the coins could ultimately detract from their primary function: to create good models and build network effects. When a data scientist accesses the test set multiple times and uses that score as feedback for model selection, there is a risk of training a model that overfits the test set.
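That test-set overfitting risk can be demonstrated with a minimal sketch (the setup is purely illustrative, not Numerai's actual scoring pipeline): with random labels no model can genuinely beat chance on fresh data, yet repeatedly scoring candidate models on the same test set and keeping the best one yields an inflated score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: 100 test examples with random binary labels, so no
# model can genuinely do better than 50% accuracy on fresh data.
n_test, n_models = 100, 1000
y_test = rng.integers(0, 2, n_test)

# Each "model" is just a random predictor; we score all of them on the
# same test set and keep the best -- mimicking repeated test-set feedback.
scores = [(rng.integers(0, 2, n_test) == y_test).mean() for _ in range(n_models)]
best = max(scores)

# The selected score looks comfortably above chance, but that is pure
# overfitting to the test set: on new data the winner is back near 50%.
print(f"best test-set score after {n_models} peeks: {best:.2f}")
```

This is why Numerai-style platforms limit or stake-weight test-set feedback: the more often a fixed holdout is queried, the less its score means.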
A session on Tuesday featured Christina Qi, the co-founder of a high-frequency trading firm called Domeyard LP; Jonathan Larkin, an executive from Quantopian, a hedge fund taking a data-driven systematic approach; and Andy Weissman of Union Square Ventures, a venture capital firm that has invested in an autonomous hedge fund. Many of the world's largest hedge funds already rely on powerful computing infrastructure and quantitative methods--whether that's high-frequency trading, incorporating machine learning, or applying data science--to make trades. Some have begun to incorporate machine learning into their systems, hand over key management decisions to troves of data scientists, and even crowdsource investment strategies. Domeyard can't incorporate machine learning, Qi says, because machine learning programs are generally optimized for throughput, rather than latency.
The pair joined forces to deliver an in-depth webinar on Machine Learning and business intelligence. The central question, put another way: when does it make sense to invest in Machine Learning projects for my business? One of the most exciting applications, says Boaz, is Natural Language Processing (NLP). For example, Sisense Everywhere uses bots and NLP to deliver data insights outside of the usual dashboard environment.
Big Data and machine learning would seem to be a perfect match, coming together at just the right time. But having vast amounts of data and computing power isn't enough. For machine learning tools to work, they need to be fed high-quality data, and they must also be guided by highly skilled humans. What is clear is that the business of combining Big Data and big computing power for new insight is harder than it looks.
For instance, the technology is expected to assist the bank with its investigations into anti-money laundering, with risk analytics and risk reporting, and with cutting the time it takes to conduct valuation and finance liquidity assessments. The company considered building out its in-house big data analytics capabilities to tackle these five areas, but the process turned out to be harder than expected. Google has a growing array of machine learning tools and technologies in its portfolio, such as the Google Cloud Machine Learning Engine, a managed service that helps organisations create machine learning models for any size or type of dataset. The company is now on the verge of putting the original five pilots into production, after which it will apply what it has learned to a new set of use cases and accelerate the spread of cloud-based data analytics and machine learning tools through the rest of the business.
When I tell people that I work at an AI company, they often follow up with, "So, what kind of machine learning/deep learning do you do?" This isn't surprising, as most of the market attention (and hype) in and around AI has centred on machine learning and its high-profile subset, deep learning, and on natural language processing with the rise of chatbots and virtual assistants. But while machine learning is a core component of artificial intelligence, AI is, in fact, more than just ML. So, what does it really mean for an application to be "intelligent"? What does it take to create a system that is artificially intelligent?
I am spending some cycles on my algorithmic rotoscope work -- which is basically a stationary exercise bicycle for learning what is and what is not Machine Learning. I am using it to help me understand and tell stories about Machine Learning, creating images with Machine Learning that I can use in my Machine Learning storytelling. Picture a bunch of Machine Learning gears all working together to help make sense of what I'm doing and WTF I'm talking about. As I write a story on how image style transfer Machine Learning could be put to use by libraries, museums, and collection curators, I'm reminded of what a con machine learning will be in the future, and how it will be a vehicle for the extraction of value and outright theft. My image style transfer work is just one tiny slice of this pie.
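For readers curious what image style transfer actually computes, its style loss is typically built on Gram matrices of convolutional feature maps (the neural style transfer approach of Gatys et al.). Here is a minimal sketch of that single operation, with a random NumPy array standing in for real network activations (the shapes are illustrative):

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Channel-by-channel correlation of a (channels, height, width) feature map.

    In neural style transfer, Gram matrices of convolutional feature maps
    summarise an image's "style" independently of spatial layout; the style
    loss compares Gram matrices of the generated and style images.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)   # one row per channel
    return flat @ flat.T / (c * h * w)  # normalised C x C matrix

# Toy feature map standing in for a real convolutional activation.
rng = np.random.default_rng(1)
feats = rng.standard_normal((8, 4, 4))
g = gram_matrix(feats)
print(g.shape)  # (8, 8)
```

Because the spatial dimensions are flattened away, two images with the same textures but different compositions produce similar Gram matrices -- which is exactly why the technique transfers "style" rather than content.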