
49ad23d1ec9fa4bd8d77d02681df5cfa-Supplemental.pdf

Neural Information Processing Systems

Compute is essential to modern machine learning applications, and more compute typically yields better results. It is thus important to compare our method's compute requirements to competing methods. Table 10: Training compute requirements for our diffusion models compared to StyleGAN2 and BigGAN-deep. Under reasonable settings for βt and T, the distribution q(xT) is nearly an isotropic Gaussian distribution, so sampling xT is trivial. In particular, they do not directly parameterize µθ(xt, t) as a neural network, but instead train a model ϵθ(xt, t) to predict ϵ from Equation 3.
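The ϵ-parameterization described in the excerpt can be sketched compactly: rather than predicting the posterior mean µθ(xt, t), the network is trained to predict the noise ϵ that produced xt from x0. Below is a minimal NumPy sketch of the simplified training objective; the names `eps_model` and `alpha_bar` are illustrative stand-ins for the paper's network ϵθ and cumulative noise schedule, not an actual implementation from the paper.

```python
import numpy as np

def ddpm_loss(eps_model, x0, alpha_bar, rng):
    """Sketch of the simplified DDPM objective: eps_model predicts the
    noise eps added to x0, and the loss is the mean squared error
    ||eps - eps_theta(x_t, t)||^2. All names are illustrative."""
    T = len(alpha_bar)
    t = rng.integers(0, T, size=x0.shape[0])          # one random timestep per sample
    eps = rng.standard_normal(x0.shape)               # the noise to be predicted
    a = alpha_bar[t][:, None]                         # cumulative schedule at step t
    x_t = np.sqrt(a) * x0 + np.sqrt(1 - a) * eps      # closed-form forward-diffusion sample
    pred = eps_model(x_t, t)
    return np.mean((pred - eps) ** 2)
```

Because q(xT) is nearly an isotropic Gaussian under reasonable schedules, sampling at generation time starts from plain Gaussian noise and iterates the learned reverse step.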


A Additional Results

Neural Information Processing Systems

FID evaluated over 10k samples instead of 50k for efficiency. It is thus important to compare our method's compute requirements to competing methods. … BigGAN-deep with the same or lower compute budget. We include communication time across two machines whenever our training batch size doesn't … We find that a naive implementation of our models in PyTorch 1.7 is very inefficient, utilizing only … Table 7: Throughput of our ImageNet models, measured in images per V100-sec. In addition, we can train for many fewer iterations while maintaining sample quality superior to BigGAN-deep.


My Imagination Is on Steroids Now

The Atlantic - Technology

What if The Atlantic owned a train car? Amtrak, I had just learned on the internet, allows owners of private railcars to lash onto runs along the Northeast Corridor, among other routes. "We should have a train car," I slacked an editor. Moments later, it appeared on my screen, bright red with our magazine's logo emblazoned in white, just like I'd ordered. It's an old logo, and misspelled, but the effect was the same: A momentary notion--one unworthy of relating to someone in private, let alone executing--had been realized, thanks to DALL-E 3, an artificial-intelligence image generator now built into Microsoft Bing's Image Creator website.


How Do Kids View Smart Speakers and AI in the Classroom? 6 Things to Know

#artificialintelligence

Alexa, how many whiskers does a cat have? Alexa, do my parents still love me? Devices fueled by artificial intelligence--including smart speakers--have been making inroads into classrooms for several years now. But how do students actually perceive these machines? And how are they using them in response to that perception?


Can Computers Learn Common Sense?

The New Yorker

A few years ago, a computer scientist named Yejin Choi gave a presentation at an artificial-intelligence conference in New Orleans. On a screen, she projected a frame from a newscast where two anchors appeared before the headline "CHEESEBURGER STABBING." Choi explained that human beings find it easy to discern the outlines of the story from those two words alone. Had someone stabbed a cheeseburger? Had a cheeseburger been used to stab a person?


A Cartoon Guide to Language Models in NLP (Part 1: Intuition)

#artificialintelligence

(This is a crosspost from the official Surge AI blog. If you need help with data labeling and NLP, say hello!) Language models are a core component of NLP systems, from machine translation to speech…


What makes an image memorable? Ask a computer

#artificialintelligence

From the "Mona Lisa" to the "Girl with a Pearl Earring," some images linger in the mind long after others have faded. Ask an artist why, and you might hear some generally accepted principles for making memorable art. Now there's an easier way to learn: ask an artificial intelligence model to draw an example. A new study using machine learning to generate images ranging from a memorable cheeseburger to a forgettable cup of coffee shows in close detail what makes a portrait or scene stand out. The images that human subjects in the study remembered best featured bright colors, simple backgrounds, and subjects that were centered prominently in the frame.


Artifice No More? 'Intelligence' Revolves with Machine Learning

#artificialintelligence

Repeat after me: Machines are our friends; they're with us till the end! There, now doesn't that feel better? Oh, sure, the "narrative" says that machines will take jobs away from humans, but that's only somewhat true. Mostly, the machines of tomorrow will do what they've always done: streamline and expedite workflow across the spectrum of business processes. Keep in mind, the cotton gin eradicated thousands of jobs all over the Southern United States (and elsewhere), back when it stormed the market in the 1800s.


An Easy Introduction to Machine Learning Recommender Systems

#artificialintelligence

How does YouTube know what videos you'll watch? How does Google always seem to know what news you'll read? They use a machine learning technique called recommender systems. Practically, recommender systems encompass a class of techniques and algorithms that can suggest "relevant" items to users. Ideally, the suggested items are as relevant to the user as possible, so that the user engages with them: YouTube videos, news articles, online products, and so on. Items are ranked by relevance, and the most relevant ones are shown to the user.
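The "rank items by relevance" idea can be made concrete with a tiny item-based collaborative-filtering sketch. This is not any particular library's API, just an illustration under simple assumptions: `ratings` is a user × item matrix of interaction scores, item-item cosine similarity stands in for relevance, and items the user has already engaged with are excluded before ranking.

```python
import numpy as np

def recommend(ratings, user, k=2):
    """Minimal item-based recommender sketch (illustrative names only).
    Scores each unseen item by similarity-weighted sums over the user's
    past interactions, then returns the k most relevant item indices."""
    norms = np.linalg.norm(ratings, axis=0, keepdims=True)
    norms[norms == 0] = 1.0                  # avoid division by zero for unrated items
    unit = ratings / norms
    sim = unit.T @ unit                      # item-item cosine similarity
    scores = sim @ ratings[user]             # relevance of each item to this user
    scores[ratings[user] > 0] = -np.inf      # drop items already engaged with
    return np.argsort(scores)[::-1][:k]      # most relevant first
```

Real systems layer much more on top (implicit feedback, learned embeddings, freshness, diversity), but the core loop is the same: score, filter, rank, show.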


Predictive and Interactive Analytics: A Primer - Artificial Intelligence Online

#artificialintelligence

Imagine the difference between a buffalo stampede and a cheeseburger. Both are tasty sources of protein. The difference lies in their requisite culinary tools. Predictive Analytics (PA) is the buffalo stampede of quantitative research: data is big, fast, and shaggy. Interactive Analytics (IA) is a cheeseburger: structured, convenient, and easy to grill.