Machine Learning



QMF is a fast and scalable C++ library for implicit-feedback matrix factorization models. For evaluation, QMF supports various ranking-based metrics that are computed per-user on test data, in addition to training or test objective values. QMF requires gcc 4.9, as it uses the C++14 standard, and CMake version 2.8. It also depends on the glog, gflags and lapack libraries. For more details on the command-line options, see the definitions in wals.cpp and bpr.cpp.
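The weighted alternating least squares (WALS) model behind wals.cpp can be sketched compactly. The following is an illustrative NumPy sketch of one WALS half-sweep for implicit feedback, not QMF's actual C++ API; the function name, arguments, and default values are assumptions for the example.

```python
import numpy as np

def wals_step(R, X, Y, alpha=40.0, reg=0.1):
    """One half-sweep of weighted ALS for implicit feedback:
    update every user factor x_u with the item factors Y held fixed.
    R: (n_users, n_items) binary interaction matrix."""
    k = Y.shape[1]
    YtY = Y.T @ Y
    for u in range(R.shape[0]):
        c_u = 1.0 + alpha * R[u]            # per-item confidence weights
        p_u = (R[u] > 0).astype(float)      # binary preference vector
        # Solve (Y^T C_u Y + reg*I) x_u = Y^T C_u p_u for this user
        A = YtY + Y.T @ ((c_u - 1.0)[:, None] * Y) + reg * np.eye(k)
        b = Y.T @ (c_u * p_u)
        X[u] = np.linalg.solve(A, b)
    return X
```

A full trainer would alternate this with the symmetric item-factor update until the objective converges.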

Evolutionary Computation - Part 3 - Alan Zucconi


When we are looking at a problem through the lens of evolution, we always have to take into account its two faces: the phenotype and the genotype. The previous post focused on creating the body of the creature, together with its brain. It is now time to focus on the genotype, which is the way such information is represented, transmitted and mutated. The motion is just a normal sine wave with a given period, amplitude range, and shift along the X axis. Learning how to walk is now a problem of finding a point in a space with 8 dimensions (4 for each leg).
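As a sketch of what such a genotype might look like, assume each leg is driven by one sine wave with four parameters (amplitude, period, phase and offset), giving the 8-dimensional search space; the exact parameterization in the original post may differ.

```python
import math

def leg_angle(t, amplitude, period, phase, offset):
    """Joint angle of one leg at time t, driven by a shifted sine wave."""
    return offset + amplitude * math.sin(2 * math.pi * t / period + phase)

def gait(t, genotype):
    """genotype: 8 numbers, 4 per leg -> (left_angle, right_angle)."""
    assert len(genotype) == 8
    left = leg_angle(t, *genotype[:4])
    right = leg_angle(t, *genotype[4:])
    return left, right
```

An evolutionary algorithm would then mutate and select these 8-number vectors, scoring each by how far the resulting gait carries the creature.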

Nicola Mendelsohn and Matt Brittin on VR, AI and why the future is bright for marketing


Nicola Mendelsohn: "[Brands should be] starting to test [virtual reality] but it's very early days and not many people have these devices yet. It depends what your objective is. If your objective is testing innovation and being seen as an innovative company then try it because that would fulfil the objective. We only started shifting [the Samsung VR] in November, the Rift is only going out now, so is it actually in the hands of consumers?"

Matt Brittin: "I encourage people in the UK, which is one of the most creatively advanced countries globally, to be experimenting with new stuff all the time."

IBM Watson can customize your canned granola


Don't worry: IBM's Watson didn't whip up a bunch of needlessly complicated granola recipes for a cookbook that you must make (for science!). No, we're talking about its partnership with Kellogg's subsidiary Bear Naked, which is the first consumer brand to sell Chef Watson-inspired food. The partnership made it possible for Bear Naked to launch a website where granola enthusiasts can make custom blends. After you select a base -- cacao cashew butter, chocolate or honey -- Watson looks through thousands of possible flavors to find ingredients it can suggest. It's a very simple process, and we wish Watson could customize each can of granola even further.

Machine learning offers hope in fight against antibiotic resistance ExtremeTech


Note that this is so potentially powerful because it's such a starkly different approach from the historical experiments that led us to this point. In the past, researchers basically worked in the opposite direction: some observable characteristic of the cell is tracked to the protein causing the observation, to the gene encoding the protein, to the specific pattern of activity that allows that gene to have that effect. In this case, researchers observe the activity patterns without context, then brute-analyze them to find other genes with known effects. This allows them to work forwards toward practical effects on the cell, rather than backward from them.

Machine learning methods applied to big data


There has been an upsurge in machine learning methods in recent years. Growing evidence suggests that machine learning is what a lot of people do with the big data they have accumulated. Like any complex undertaking, it is worthwhile to break it down into component parts. That is the objective of this episode of the Talking Data podcast, in which TechTarget reporters Jack Vaughan and Ed Burns discuss the evolution of machine learning through the lens of technologies employed and end-use applications. Among use cases cited are risk estimation in insurance, credit scoring and digital ad placement.

SAS Viya


Detect, predict, prevent and halt fraudulent activity with greater speed and accuracy. Efficiently conduct thorough alert triage and more productive, directed investigations. SAS Visual Investigator combines easy-to-use features and visualization capabilities with the full power of SAS' advanced analytics and machine learning technology.

This Computer Algorithm Predicted Who Will Die Next on Game of Thrones


Over the course of five seasons, Game of Thrones has killed off over 61 characters, including fan favorites such as Ned Stark, Oberyn Martell and (supposedly) Jon Snow. Now, a computer science class at Germany's Technical University of Munich has created a website dubbed "A Song of Ice and Data" to determine the fate of the HBO drama's remaining key players in the upcoming sixth season. Using a series of machine learning algorithms, the students have figured out the likelihood of each character meeting their end in the next 10 episodes. According to the site, Tommen Baratheon has the worst odds of survival -- with a 97 percent chance of death -- while Sansa Stark is the most likely to make it to next year, with only a 3 percent chance of death. The group also applied the formula to both the show's previous seasons and George R.R. Martin's A Song of Ice and Fire series, and found it accurately predicted 74 percent of deaths.

DIY Recommendation Engines for Mom and Pop Ecommerce Shops


Of course we have all heard about machine learning and recommendation engines in big business ecommerce. For quite some time, massive ecommerce businesses like Netflix, Amazon, and Ebay have been leveraging the power of data science to improve customer service and boost sales. Where once this technology was cost-prohibitive to all but the major players, recently things have changed. Thanks to multi-channel ecommerce platforms like Shopify, and the developers who are building custom machine learning add-ons, now mom and pop online businesses get the chance to infuse their operations with the power of data science. In this article I introduce how machine learning algorithms work to produce recommendation systems for small business ecommerce.
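To make that concrete, here is a minimal sketch of one common recommendation technique, item-item collaborative filtering over a binary purchase matrix. The function and variable names are illustrative and not tied to the article or to any particular Shopify add-on.

```python
import numpy as np

def recommend(R, user, top_n=3):
    """Item-item collaborative filtering on a small purchase matrix.
    R: (n_users, n_items) binary matrix; returns the indices of the
    top_n unpurchased items, ranked by similarity-weighted score."""
    # Cosine similarity between item columns
    norms = np.linalg.norm(R, axis=0)
    norms[norms == 0] = 1.0
    sim = (R.T @ R) / np.outer(norms, norms)
    scores = sim @ R[user]          # aggregate similarity to owned items
    scores[R[user] > 0] = -np.inf   # never re-recommend owned items
    return list(np.argsort(-scores)[:top_n])
```

For example, a user who bought the same two items as another shopper will be recommended whatever else that shopper bought; large platforms use far more sophisticated models, but the core idea is the same.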

Intelligent machines: Making AI work in the real world - BBC News


As part of the BBC's Intelligent Machines season, Google's Eric Schmidt has penned an exclusive article on how he sees artificial intelligence developing, why it is experiencing such a renaissance and where it will go next. Until recently, AI seemed firmly stuck in the realm of science fiction. The term "artificial intelligence" was coined 60 years ago - on August 31, 1955, John McCarthy proposed a "summer research project" to work out how to create thinking machines. It's turned out to take a bit longer than one summer. We're now entering the seventh decade, and just starting to see real progress.