Results


What is hardcore data science – in practice?

@machinelearnbot

For example, for personalized recommendations we have been working with learning-to-rank methods that learn individual rankings over item sets.
Figure 1: Typical data science workflow, starting with raw data that is turned into features and fed into learning algorithms, resulting in a model that is applied to future data.
This pipeline is iterated and improved many times, trying out different features, different forms of preprocessing, different learning methods, or even going back to the source and adding more data sources. Probably the main difference between production systems and data science systems is that production systems are real-time systems that run continuously.
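The workflow in Figure 1 maps naturally onto a pipeline object. The sketch below is a minimal illustration using scikit-learn on synthetic data, not the article's actual recommendation models or features; every variable name in it is invented for the example.

```python
# Minimal sketch of the Figure 1 workflow (illustrative only; the article's
# own ranking models and features are not shown here).
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for "raw data": 1,000 examples with 20 numeric columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_future, y_train, y_future = train_test_split(X, y, random_state=0)

# Features -> learning algorithm, bundled so the same steps run on future data.
pipeline = Pipeline([
    ("features", StandardScaler()),   # swap in different preprocessing here
    ("model", LogisticRegression()),  # or a different learning method here
])
pipeline.fit(X_train, y_train)

# The resulting model is applied to data it has not seen before.
print("accuracy on held-out data:", pipeline.score(X_future, y_future))
```

Iterating the pipeline, as the article describes, then amounts to swapping the steps in this object and re-running the same fit-and-evaluate loop.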


AI in HR: Artificial intelligence to bring out the best in people

#artificialintelligence

Whenever there's a need to draft a job description, Expedia Inc.'s 3,000-plus recruiters and hiring managers have the option to call on a writing coach. The online travel-booking company's writing companion is Textio Inc., an artificial intelligence application that runs in the cloud and analyzes each typewritten word in milliseconds to spot gender bias or other language that might turn off good candidates. The software generates an effectiveness score and suggests alternative phrasing, in effect teaching the recruiter how to write a job description more effectively. We are in the age of "the Facebook generation": millennials, who will soon make up the majority of the workforce.
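Textio's models are proprietary, so nothing below reflects how the product actually works; this is only a toy sketch of the general idea of scanning a draft for gender-coded wording, suggesting neutral alternatives, and producing a rough score. The word list and the review_job_description helper are hypothetical.

```python
# Toy illustration only: Textio's actual models are proprietary. This sketch
# just flags a few commonly cited gender-coded words and suggests neutral swaps.
GENDER_CODED = {
    "ninja": "expert",
    "rockstar": "skilled professional",
    "aggressive": "proactive",
    "dominant": "leading",
}

def review_job_description(text: str):
    """Return flagged words with suggested replacements and a rough score."""
    words = text.lower().split()
    flags = [(w, GENDER_CODED[w]) for w in words if w in GENDER_CODED]
    # Crude "effectiveness" score: fraction of words that were not flagged.
    score = 1.0 - len(flags) / max(len(words), 1)
    return flags, round(score, 2)

flags, score = review_job_description(
    "We need an aggressive coding ninja to join our dominant sales team"
)
print(score, flags)
```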


Gadget : Machine learning will go big and small in 2017

#artificialintelligence

The machine learning sector is really beginning to take form in South Africa, with various start-ups taking off and entering the international scene. At DataProphet, we specialise in the application of machine learning algorithms to provide actionable solutions for a variety of industries. Fortunately, South African companies – not generally known for their customer care – are starting to wake up to the possibilities of efficient customer relationship management (CRM) through bespoke products, targeted marketing and improved customer service. Many of the very best machine learning products are open source and open data, which allows for social-good machine learning applications that many may not have even considered yet.


Is Python Slow As Molasses? (IT Best Kept Secret Is Optimization)

#artificialintelligence

But when you use Python for machine learning you mostly use packages written in compiled languages (C, C++, Cython, even Fortran), and you get good performance. Cython is heavily used in the popular scikit-learn machine learning package, for instance. This should clear any doubt: it is possible to write efficient Python code. The bottom line is that using Python for machine learning yields good performance when you deal with moderate-size datasets.
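A quick way to see the point is to time the same computation in pure Python and in NumPy, whose inner loops run in C. The comparison below is an informal sketch; exact timings depend on the machine, but the compiled path is typically one to two orders of magnitude faster.

```python
# Informal comparison of a pure-Python loop against NumPy, whose core is
# implemented in C: both compute the same dot product.
import timeit
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)
a_list, b_list = a.tolist(), b.tolist()

def dot_pure_python():
    return sum(x * y for x, y in zip(a_list, b_list))

def dot_numpy():
    return np.dot(a, b)

print("pure Python:", timeit.timeit(dot_pure_python, number=10))
print("NumPy (C):  ", timeit.timeit(dot_numpy, number=10))
```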


Learning language by playing games

AITopics Original Links

MIT researchers have designed a computer system that learns how to play a text-based computer game with no prior assumptions about how language works. Although the system can't complete the game as a whole, its ability to complete sections of it suggests that, in some sense, it discovers the meanings of words during its training. In 2011, professor of computer science and engineering Regina Barzilay and her students reported a system that learned to play a computer game called "Civilization" by analyzing the game manual. But in the new work, on which Barzilay is again a co-author, the machine-learning system has no direct access to the underlying "state" of the game program -- the data the program is tracking and how it's being modified. "When you play these games, every interaction is through text," says Karthik Narasimhan, an MIT graduate student in computer science and engineering and one of the new paper's two first authors.
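The MIT system learns a neural representation of the game text and trains it with a form of Q-learning; the sketch below is not that system, just a deliberately tiny illustration of the underlying idea: an agent that observes only text, receives reward from the game, and updates action values from that reward alone. The two-room "game", the commands, and all names are invented for the example.

```python
# Simplified sketch of the general idea (not the MIT system): an agent that
# sees only text and learns action values with tabular Q-learning.
import random
from collections import defaultdict

# A made-up two-room toy "game": state text -> {command: (next text, reward)}.
GAME = {
    "You are in a dark room. There is a door to the east.":
        {"go east": ("You are in a bright hall. A key glints on the floor.", 0.0)},
    "You are in a bright hall. A key glints on the floor.":
        {"take key": ("You picked up the key. You win!", 1.0)},
}
START = "You are in a dark room. There is a door to the east."

Q = defaultdict(float)          # Q[(state_text, command)] -> estimated value
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for episode in range(200):
    text = START
    while text in GAME:
        commands = list(GAME[text])
        # Epsilon-greedy choice over the commands available in this room.
        if random.random() < epsilon:
            cmd = random.choice(commands)
        else:
            cmd = max(commands, key=lambda c: Q[(text, c)])
        next_text, reward = GAME[text][cmd]
        future = max((Q[(next_text, c)] for c in GAME.get(next_text, [])), default=0.0)
        Q[(text, cmd)] += alpha * (reward + gamma * future - Q[(text, cmd)])
        text = next_text

# After training, the highest-valued (text, command) pair leads toward the key.
print(max(Q.items(), key=lambda kv: kv[1]))
```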


Global Bigdata Conference

#artificialintelligence

No longer was it an esoteric discipline commanded by the few, the proud, the data scientists. Now it was, in theory, everyone's business. Machine learning's power and promise, and all that surrounded and supported it, moved more firmly into the enterprise development mainstream.


Meizu Pro 6 Plus Review: iPhone's Body Plus Samsung Galaxy's Chip Equals Powerhouse

Forbes

When it comes to hardware design language, non-Apple/-Samsung phones tend to be all over the place, even within the same product line. Meizu, on the other hand, has stuck with the same design language over at least a half dozen phones released in the past two to three years. That, coupled with the quad HD AMOLED display (another jump, as previous Meizu phones mostly used 1080p LCD panels), gives this phone a decidedly more premium feel than not just other Meizu phones, but most phones at this price point (2,999 yuan, about US$430). The Pro 6 Plus scored 112,795 on Antutu, which is among the highest of all phones released this year; the phone uses USB-C with Meizu's own fast-charge technology that supports up to 24-watt charging; and the device scored 1,469 and 3,471 on Geekbench's single- and multi-core tests.


Machine learning: From science project to business plan

#artificialintelligence

No longer was it an esoteric discipline commanded by the few, the proud, the data scientists. Now it was, in theory, everyone's business. Machine learning's power and promise, and all that surrounded and supported it, moved more firmly into the enterprise development mainstream. That movement revolved around three trends: new and improved tool kits for machine learning, better hardware (and easier access to it), and more cloud-hosted, as-a-service variants of machine learning that provided both open source and proprietary tools. Once upon a time, if you wanted to implement machine learning in an app, you had to roll the algorithms yourself.
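To make the "roll the algorithms yourself" contrast concrete, here is a hedged sketch on invented data that fits the same linear model twice: once with hand-written gradient descent and once with a single scikit-learn call. It illustrates the trend the article describes, not any specific product.

```python
# Sketch of the shift described above: first the hand-rolled version,
# then the toolkit call. Illustrative only, on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

# "Roll it yourself": batch gradient descent for ordinary least squares.
w = np.zeros(3)
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= 0.05 * grad

# The toolkit route: one fit call, same model family.
toolkit = LinearRegression(fit_intercept=False).fit(X, y)

print("hand-rolled weights:", np.round(w, 3))
print("toolkit weights:    ", np.round(toolkit.coef_, 3))
```

Both prints recover roughly the same coefficients; the difference is how much code you had to own, debug, and maintain to get there.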


Understanding 20161128 v8

#artificialintelligence

This presentation provides an initial conceptual framework for "Understanding Cognitive Systems"; it can be downloaded from slideshare.net/spohrer. I'm Jim Spohrer, I work at IBM, and I am the presenter. In today's short talk, I will briefly cover what a cognitive system (entity) is, both biological and digital. Then I will briefly discuss how to build, understand, and work with digital cognitive systems, including types of digital cognitive systems, and how this represents steps toward a next-generation cognitive curriculum. Biological cognitive system entities, and human intelligence: the best explanation of what a biological cognitive system entity is can be found in Terrence Deacon's book The Symbolic Species: The Co-evolution of Language and the Brain. All easily recognizable biological cognitive systems, from ants to wolves to crows to dolphins to monkeys to people, have brains that have co-evolved with symbol systems (chemical, visual, auditory) that individuals of the species use as a type of language for communicating, coordinating reasoning and interactions, and accumulating knowledge for successful multi-generational living in an environment.