A Reddit study group for the free online version of the Stanford class "Machine Learning", taught by Andrew Ng. The purpose of this subreddit is to help each other understand the course materials, not to share solutions to assignments. Please follow the Stanford Honor Code. New to Reddit and not sure how the site works? Have a question about the class, videos, quizzes, or homework and want to know how to get help?
In the twenty-first century, AI techniques have experienced a massive surge in interest following concurrent advances in computing power, the availability of large amounts of data, and theoretical understanding. Companies are focused on it: Google has rebuilt its software around it, and Mark Zuckerberg personally recruits AI engineers at seven-figure salaries straight out of graduate school. AI is simply the hottest area in technology. But what exactly is AI, and how can it impact your investments? These are the questions we will explore in this week's newsletter.
This is a place to share machine learning research papers, journals, and articles that you're reading this week. If it relates to what you're researching, by all means elaborate and give us your insight; otherwise it can just be an interesting paper you've read. Please try to provide some insight from your own understanding, and please don't post things that are already in the wiki. Preferably, link the arXiv abstract page (not the PDF — you can easily reach the PDF from the abstract page, but not the other way around) or any other pertinent links. Besides that, there are no rules; have fun.
They're probably the person in your life you go to for help with all your technology needs. So how can you give something good to the tech-savvy person in your life? Here are some gift suggestions to delight those who are always looking at the hottest tech trends and products. As with all cutting-edge tech, this isn't for the faint of heart, both in terms of price and in willingness to try something new. You still won't find a headphone jack on the iPhone X.
I have a portfolio of Jupyter notebooks that I wrote myself. Several of them need to be reworked or deleted, but most of them are okay. One of them is similar to work I did while employed at a bank. The first project is my attempt to build a site with a handwritten digit recognition system that supports online training. This portfolio really helped me when I was looking for a job.
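The post doesn't include the project's code, but the "online training" idea it describes can be sketched in a few lines. This is a minimal, hypothetical illustration using scikit-learn's `SGDClassifier.partial_fit` on the built-in digits dataset — the actual site presumably trains on user-submitted drawings instead:

```python
# Hypothetical sketch of online (incremental) digit recognition,
# assuming scikit-learn is installed. Not the poster's actual code.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data / 16.0, digits.target, random_state=0)

clf = SGDClassifier(random_state=0)
classes = np.unique(y_train)

# Feed the model one mini-batch at a time, the way a web app could
# update it each time a user submits a newly labelled drawing.
for start in range(0, len(X_train), 100):
    batch = slice(start, start + 100)
    clf.partial_fit(X_train[batch], y_train[batch], classes=classes)

print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

The key detail is that `partial_fit` must be given the full list of classes on the first call, since later batches may not contain every digit.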
"With cities, it is as with dreams: everything imaginable can be dreamed, but even the most unexpected dream is a rebus that conceals a desire or, its reverse, a fear. Cities, like dreams, are made of desires and fears, even if the thread of their discourse is secret, their rules are absurd, their perspectives deceitful, and everything conceals something else." A project made during the "Machine Learning for Artists" workshop with Gene Kogan @Opendotlab. In this project, we trained a neural network to translate map tiles into generative satellite images. We trained individual models for several cities (Milan, Venice, and Los Angeles), allowing us to perform city map style transfer (example above) by applying the aerial model of one city to the map tiles of another. We can also create imaginary cities by hand-drawing sketches and feeding them to the generative model.
Clarifai has hired four members of Twitter's machine learning team and a former Google Brain engineer. The 40-employee startup creates visual recognition software that can automatically organise and filter images. The New York firm's clients include Buzzfeed, Trivago and Unilever. It was founded in 2013 by computer science PhD Matthew Zeiler after he completed an internship with the Google research team. The startup has raised $41 million in funding.
The rumours are true... and you might not have to wait long to witness it in action. Jonathan Levin has combed through the BridgeOS code that should accompany the iMac Pro, and it looks as if Apple will be using a cut-down version of the iPhone 7's A10 Fusion chip as a co-processor. While its full functionality isn't clear yet, developer Steve Troughton-Smith notes that the A10 appears to handle macOS' boot and security processes, such as passing firmware to the main Xeon processor and managing media copy protection. More importantly, Guilherme Rambo has found references to "hey Siri" support -- as with Cortana on Windows 10, you might not have to click an icon or invoke a keyboard shortcut just to ask about the weather. It's possible that the A10 chip is always running, which would represent a break from the custom T1 chip driving the Touch Bar in some recent MacBook Pro models.
Lab41 is currently in the midst of Project Hermes, an exploration of different recommender systems aimed at building up some intuition (and, of course, hard data) about how these algorithms can be used to solve data, code, and expert discovery problems in a number of large organizations. Anna's post gives a great overview of recommenders, which you should check out if you haven't already. The ideal way to tackle this problem would be to go to each organization, find the data they have, and use it to build a recommender system. But this isn't feasible for multiple reasons: it doesn't scale, because there are far more large organizations than there are members of Lab41, and of course most of these organizations would be hesitant to share their data with outsiders. Instead, we need a more general solution that anyone can apply as a guideline.
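As a concrete anchor for the kind of recommender being compared, here is a minimal item-based collaborative-filtering sketch in NumPy. The post doesn't specify Project Hermes's algorithms; the toy ratings matrix and function names below are hypothetical, chosen only to illustrate the basic idea:

```python
# Minimal item-based collaborative filtering sketch (hypothetical
# data and names; not Project Hermes code).
import numpy as np

# Toy user-item ratings matrix (0 = unrated), 4 users x 5 items.
R = np.array([
    [5, 4, 0, 1, 0],
    [4, 5, 1, 0, 0],
    [0, 1, 5, 4, 0],
    [1, 0, 4, 5, 4],
], dtype=float)

def item_similarity(R):
    """Cosine similarity between item (column) vectors."""
    norms = np.linalg.norm(R, axis=0, keepdims=True)
    norms[norms == 0] = 1.0          # avoid dividing by zero
    Rn = R / norms
    return Rn.T @ Rn

def recommend(R, user, k=2):
    """Rank a user's unrated items by similarity to items they rated."""
    scores = item_similarity(R) @ R[user]
    scores[R[user] > 0] = -np.inf    # never re-recommend rated items
    return np.argsort(scores)[::-1][:k]

print(recommend(R, user=0))
```

Even a toy like this exposes the practical issues the post alludes to: the ratings matrix is the sensitive asset organizations won't share, which is exactly why a general, data-agnostic guideline is needed rather than a one-off system built on each organization's data.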
Not sure if this is the correct subreddit to post to; someone from the AI one referred me here. I am currently enrolled in a Master's-level AI program, and as classes go by, I am starting to have increasingly serious doubts about whether I should go through with it, so I'd love some feedback from people who are already in the industry. I have a previous degree in electrical engineering, but I graduated about 5 years ago, and I was not a top student. My math skills are pretty limited. The math in the program is somehow not that complex, but I really worry that I will need to improve greatly for it to actually be useful in real life.