Rashida Jones Wonders What Makes Us Human

The New Yorker

For someone who used to ride a school bus with Paris Hilton and Kim Kardashian, Rashida Jones is remarkably earthbound. Growing up in Los Angeles, the daughter of the "Mod Squad" actor Peggy Lipton and the legendary music producer Quincy Jones, she was so ensconced in the world of mega-celebrity that it took a while for her to realize that the people surrounding her--Frank Sinatra, Sidney Poitier, Michael Jackson--were as iconic as they were. That heady milieu would cause most young people (say, her bus-mates) to lose themselves in the fame bubble. Instead, Jones did her homework and got into Harvard, where she studied religion and philosophy, before finding fame on her own, on the sitcoms "The Office" and "Parks and Recreation." In many of her roles, as in her life, she projects a dry, discerning intelligence that cuts through the absurdity surrounding her. She is a very good guide to the world of the famous.


Multi-Modal Few-Shot Temporal Action Detection

Nag, Sauradip, Xu, Mengmeng, Zhu, Xiatian, Perez-Rua, Juan-Manuel, Ghanem, Bernard, Song, Yi-Zhe, Xiang, Tao

arXiv.org Artificial Intelligence

Few-shot (FS) and zero-shot (ZS) learning are two different approaches for scaling temporal action detection (TAD) to new classes. The former adapts a pretrained vision model to a new task represented by as few as a single video per class, whilst the latter requires no training examples by exploiting a semantic description of the new class. In this work, we introduce a new multi-modality few-shot (MMFS) TAD problem, which can be considered as a marriage of FS-TAD and ZS-TAD by leveraging few-shot support videos and new class names jointly. To tackle this problem, we further introduce a novel MUlti-modality PromPt mETa-learning (MUPPET) method. This is enabled by efficiently bridging pretrained vision and language models whilst maximally reusing already learned capacity. Concretely, we construct multi-modal prompts by mapping support videos into the textual token space of a vision-language model using a meta-learned adapter-equipped visual semantics tokenizer. To tackle large intra-class variation, we further design a query feature regulation scheme. Extensive experiments on ActivityNetv1.3 and THUMOS14 demonstrate that our MUPPET outperforms state-of-the-art alternative methods, often by a large margin. We also show that MUPPET can be easily extended to tackle the few-shot object detection problem, again achieving state-of-the-art performance on the MS-COCO dataset. The code will be available at https://github.com/sauradip/MUPPET
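
The core idea of the abstract, mapping support videos into a vision-language model's text token space and prepending them to class-name tokens, can be sketched in a few lines. The PyTorch sketch below is a minimal illustration under assumed dimensions; all class, function, and variable names (VisualSemanticsTokenizer, build_multimodal_prompt, etc.) are illustrative inventions, not MUPPET's actual API. See the linked repository for the authors' implementation.

```python
# Minimal sketch of multi-modal prompt construction: support-video
# features are adapted into the text token space of a frozen
# vision-language model and concatenated with class-name tokens.
# All names and dimensions here are illustrative assumptions.
import torch
import torch.nn as nn

class VisualSemanticsTokenizer(nn.Module):
    """Maps pooled support-video features into the text token space
    of a frozen vision-language model (e.g. a CLIP-like encoder)."""
    def __init__(self, vid_dim: int, tok_dim: int, n_tokens: int = 4):
        super().__init__()
        self.n_tokens = n_tokens
        # Lightweight adapter; in a meta-learning setup this is the
        # part updated across episodes while the VL model stays frozen.
        self.adapter = nn.Sequential(
            nn.Linear(vid_dim, tok_dim),
            nn.ReLU(),
            nn.Linear(tok_dim, n_tokens * tok_dim),
        )

    def forward(self, support_feats: torch.Tensor) -> torch.Tensor:
        # support_feats: (n_shots, vid_dim) features of the support videos
        pooled = support_feats.mean(dim=0)        # average over the shots
        tokens = self.adapter(pooled)             # (n_tokens * tok_dim,)
        return tokens.view(self.n_tokens, -1)     # (n_tokens, tok_dim)

def build_multimodal_prompt(class_name_tokens: torch.Tensor,
                            visual_tokens: torch.Tensor) -> torch.Tensor:
    """Concatenate visual tokens with class-name text embeddings,
    forming one prompt sequence for the text encoder."""
    return torch.cat([visual_tokens, class_name_tokens], dim=0)

# Usage with made-up sizes: a 5-shot support set and 512-d text tokens.
tokenizer = VisualSemanticsTokenizer(vid_dim=768, tok_dim=512)
support_feats = torch.randn(5, 768)
class_name_tokens = torch.randn(7, 512)   # embedded class name, say "high jump"
prompt = build_multimodal_prompt(class_name_tokens, tokenizer(support_feats))
print(prompt.shape)  # torch.Size([11, 512])
```

The design point the abstract emphasizes is that only the small adapter is meta-learned; the pretrained vision and language encoders are reused as-is, which is what keeps the approach efficient in the few-shot regime.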


Using DALL-E to make images of Muppets in Mad Max

#artificialintelligence



Future Tense Newsletter: We Need a Muppet Version of Frankenstein

Slate

Sign up to receive the Future Tense newsletter every other Saturday. On Aug. 30, my heart broke a tiny bit. That day, the Guardian published a remarkable interview with Frank Oz, Jim Henson's longtime collaborator and the puppeteer behind Fozzie Bear, Miss Piggy, and other classic Muppets. Oz hasn't been involved with the Muppets since 2007, three years after Disney purchased the franchise. He tells the Guardian: "I'd love to do the Muppets again but Disney doesn't want me, and Sesame Street hasn't asked me for 10 years. They don't want me because I won't follow orders and I won't do the kind of Muppets they believe in." He added of the post-Disney Muppet movies and TV shows: "The soul's not there."


Adaptive Precision Training (ADEPT): A dynamic fixed point quantized sparsifying training approach for DNNs

Kummer, Lorenz, Sidak, Kevin, Reichmann, Tabea, Gansterer, Wilfried

arXiv.org Artificial Intelligence

Quantization is a technique for reducing deep neural network (DNN) training and inference times, which is crucial for training in resource-constrained environments or time-critical inference applications. State-of-the-art (SOTA) approaches focus on post-training quantization, i.e. quantization of pre-trained DNNs for speeding up inference. Little work exists on quantized training, and existing approaches usually require full-precision refinement afterwards or enforce a global word length across the whole DNN, leading to suboptimal bitwidth-to-layer assignments and resource usage. Recognizing these limits, we introduce ADEPT, a new quantized sparsifying training strategy that uses information theory-based intra-epoch precision switching to find, on a per-layer basis, the lowest precision that causes no quantization-induced information loss while keeping precision high enough that future learning steps do not suffer from vanishing gradients, producing a fully quantized DNN. Based on a bitwidth-weighted MAdds performance model, our approach achieves an average speedup of 1.26 and a model size reduction of 0.53 compared to standard float32 training, with an average accuracy increase of 0.98% on AlexNet/ResNet on CIFAR10/100.
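
To make the mechanism concrete, the sketch below shows per-layer, intra-epoch precision switching in the spirit of the abstract. It is a simplified assumption-laden illustration, not the authors' implementation: it substitutes a plain quantization-error threshold for ADEPT's information-theoretic criterion, and all names (quantize_fixed_point, choose_bitwidth, the candidate bitwidths, the tolerance) are hypothetical.

```python
# Minimal sketch of per-layer dynamic fixed-point precision switching.
# The error-threshold criterion stands in for the paper's
# information-theoretic measure; everything here is illustrative.
import torch

def quantize_fixed_point(w: torch.Tensor, bits: int) -> torch.Tensor:
    """Symmetric dynamic fixed-point quantization of a weight tensor."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    return torch.round(w / scale).clamp(-qmax, qmax) * scale

def choose_bitwidth(w: torch.Tensor,
                    candidates=(4, 8, 12, 16),
                    tol: float = 1e-3) -> int:
    """Pick the lowest bitwidth whose quantization error stays under a
    tolerance, a stand-in for 'no quantization-induced information
    loss' per layer."""
    for bits in candidates:               # try low precision first
        err = (w - quantize_fixed_point(w, bits)).pow(2).mean()
        if err < tol:
            return bits
    return candidates[-1]

# Intra-epoch: revisit each layer's precision every few training steps,
# so bitwidths can rise again if later learning needs more headroom.
model = torch.nn.Sequential(torch.nn.Linear(64, 32), torch.nn.Linear(32, 10))
for name, p in model.named_parameters():
    if p.dim() > 1:                       # quantize weight matrices only
        bits = choose_bitwidth(p.data)
        p.data = quantize_fixed_point(p.data, bits)
        print(f"{name}: {bits}-bit")
```

The point of the per-layer search is visible even in this toy version: layers whose weight distributions tolerate coarse quantization get short word lengths, while sensitive layers keep more bits, avoiding the suboptimal global word length the abstract criticizes.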


Why Amazon Is Naming New Warehouse Robots After Muppets

Slate

Shortly before Prime Day in June, Amazon announced it was developing two robots for its infamously demanding distribution centers. Named "Bert" and "Ernie" after the Sesame Street Muppets, the robots, Amazon claimed, would help relieve the physical burden of its jobs by autonomously carting materials through distribution center floors and lifting heavy totes off shelves. They were not, the company stressed, intended to increase speed or replace workers, but to improve safety and free workers for tasks "that require…critical thinking skills." According to the company, the robots weren't some nefarious plot; instead, they embodied its empathy for workers and commitment to innovations that would help consumers and employees alike. The announcement's timing was convenient.


Why are so many AI systems named after Muppets?

#artificialintelligence

One of the biggest trends in AI recently has been the creation of machine learning models that can generate the written word with unprecedented fluidity. These programs are game-changers, potentially supercharging computers' ability to parse and produce language. But something that's gone largely unnoticed is a secondary trend -- a shadow to the first -- and that is: a surprising number of these tools are named after Muppets. To date, this new breed of language AIs includes an ELMo, a BERT, a Grover, a Big BIRD, a Rosita, a RoBERTa, at least two ERNIEs (three if you include ERNIE 2.0), and a KERMIT. Big tech players like Google, Facebook, and the Allen Institute for AI are all involved, and the craze has global reach, with Chinese search giant Baidu and Beijing's Tsinghua University contributing models.


Notre Dame and the culture it inspired – from Matisse to the Muppets

The Guardian

As Notre Dame Cathedral's majestic spire tumbled into the inferno on Monday night, live newsreaders around the world decried the tragic loss of this 12th-century marvel. The great timber roof – nicknamed "the forest" for the thousands of trees used in its beams – was gone, the rose windows feared melted, the heart of Paris destroyed forever. What few realised in the heat of the shocking footage was that much of what was ablaze was a 19th-century fantasy. Like most buildings of this age, Notre Dame is the sum of centuries of restorations and reinventions, a muddled patchwork of myth and speculation. Standing as a sturdy hulk on the banks of the Seine, the great stone pile has never been the most elegant or commanding of the ancient cathedrals, but it became the most famous. Begun in 1163, it was larger than any gothic church before it, employing some of the first flying buttresses to allow taller, thinner walls and larger expanses of glazing – including the spectacular rose windows that projected great cosmic wheels of colour into the luminous interior. "Where would [one] find … such magnificence and perfection, so high, so large, so strong, clothed round about with such a multiple variety of ornaments?"


Culture crusaders: who's who in Trump's gun violence roundtable

The Guardian

As Donald Trump convenes a meeting on Thursday to address violence in video games, in the wake of last month's Florida school shooting, those in attendance will include a group that argues the Muppets drink too much, and another committed to exposing strident liberal bias on television. The president's roundtable at the White House will be the latest in a series of discussions on school safety after a gunman left 17 dead at Marjory Stoneman Douglas high school in Parkland on 14 February. And although representatives of the mainstream Entertainment Software Association and executives from other gaming parent companies are slated to attend, they will be seated across from a bevy of culture crusaders who have sought to tie mass shootings to violence in video games and movies – despite decades of research failing to produce such a link. In attendance will be retired Lt Col Dave Grossman, the author of Assassination Generation: Video Games, Aggression, and the Psychology of Killing, a book that purports to "reveal how violent video games have ushered in a new era of mass homicide". Grossman characterizes himself as an expert in "killology". Also present will be Melissa Henson, an advocate from the Parents Television Council, a group that has stood in staunch opposition to depictions of or allusions to sex and violence in entertainment.


Chatbots Are Back And They're About to Take Over

AITopics Original Links

Miss Piggy doesn't like it when I ask her about Kermit. We're having a conversation in Facebook's Messenger app, and the famous Muppet is trying to convince me to watch her fictional talk show Up Late With Miss Piggy, whose backstage hijinks provide the plot of ABC's show The Muppets. She gushes about the show's house band, Electric Mayhem--"VERY hairy"--and asks me whether I'd rate the program a 10 out of 10 or a mere 9 out of 10. But when I ask about her green ex-lover, she snaps. "I wish everyone would stop asking me about Kermit...I've moved on."