Thanos: A Block-wise Pruning Algorithm for Efficient Large Language Model Compression
This paper presents Thanos, a novel weight-pruning algorithm designed to reduce the memory footprint and enhance the computational efficiency of large language models (LLMs) by removing redundant weights while maintaining accuracy. Thanos introduces a block-wise pruning strategy with adaptive masks that dynamically adjust to weight importance, enabling flexible sparsity patterns and structured formats, such as $n:m$ sparsity, optimized for hardware acceleration. Experimental evaluations demonstrate that Thanos achieves state-of-the-art performance in structured pruning and outperforms existing methods in unstructured pruning. By providing an efficient and adaptable approach to model compression, Thanos offers a practical solution for deploying large models in resource-constrained environments.
- Europe > Italy > Tuscany > Florence (0.04)
- Asia > Middle East > Saudi Arabia > Mecca Province > Thuwal (0.04)
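The abstract above mentions structured $n:m$ sparsity, where every contiguous group of $m$ weights keeps at most $n$ nonzeros. As a minimal illustration of that pattern (a plain magnitude-based baseline, not the Thanos algorithm itself, with an illustrative 2:4 setting), one could build such a mask like this:

```python
import numpy as np

def nm_sparsity_mask(weights, n=2, m=4):
    """Binary mask keeping the n largest-magnitude weights in every
    contiguous group of m along the last axis (assumes size % m == 0)."""
    w = np.asarray(weights, dtype=float)
    groups = w.reshape(-1, m)                       # one row per group of m
    keep = np.argsort(np.abs(groups), axis=1)[:, -n:]  # top-n |w| per group
    mask = np.zeros_like(groups)
    np.put_along_axis(mask, keep, 1.0, axis=1)
    return mask.reshape(w.shape)

w = np.array([[0.1, -0.9, 0.3, 0.05],
              [0.7,  0.2, -0.6, 0.01]])
mask = nm_sparsity_mask(w, n=2, m=4)
pruned = w * mask   # exactly 2 nonzeros survive in each group of 4
```

Hardware such as sparse tensor cores can accelerate matrices in this 2:4 layout, which is why the paper targets it alongside unstructured sparsity.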
An Integrated System Dynamics and Discrete Event Supply Chain Simulation Framework for Supply Chain Resilience with Non-Stationary Pandemic Demand
Camur, Mustafa Can, Tseng, Chin-Yuan, Thanos, Aristotelis E., White, Chelsea C., Yund, Walter, Iakovou, Eleftherios
COVID-19 resulted in some of the largest supply chain disruptions in recent history. To mitigate the impact of future disruptions, we propose an integrated hybrid simulation framework to couple nonstationary demand signals from an event like COVID-19 with a model of an end-to-end supply chain. We first create a system dynamics susceptible-infected-recovered (SIR) model, augmenting a classic epidemiological model to create a realistic portrayal of demand patterns for oxygen concentrators (OC). Informed by this granular demand signal, we then create a supply chain discrete event simulation model of OC sourcing, manufacturing, and distribution to test production augmentation policies to satisfy this increased demand. This model utilizes publicly available data, engineering teardowns of OCs, and a supply chain illumination to identify suppliers. Our findings indicate that this coupled approach can use realistic demand during a disruptive event to enable rapid recommendations of policies for increased supply chain resilience with controlled cost.
- North America > United States > Texas > Brazos County > College Station (0.14)
- North America > United States > California (0.04)
- North America > United States > Massachusetts (0.04)
- (2 more...)
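The abstract describes coupling a susceptible-infected-recovered (SIR) epidemiological model to a downstream supply chain simulation. A minimal discrete-time SIR sketch (illustrative `beta`/`gamma` values, not the paper's calibrated parameters) shows the kind of nonstationary signal the infected compartment provides; in the paper's framework, demand for oxygen concentrators would be derived from this curve:

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One daily step of a discrete-time SIR model over population fractions."""
    new_infections = beta * s * i   # mass-action transmission
    new_recoveries = gamma * i      # constant recovery rate
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries

def simulate(days=160, i0=1e-3, beta=0.3, gamma=0.1):
    """Run the epidemic and report final state plus peak infected fraction."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return s, i, r, peak
```

The peak of `i` drives the demand surge that the discrete event supply chain model must then absorb.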
The AI tool behind Thanos made facial animation in 'The Quarry' a snap
The Oscar-winning studio has produced visual effects for movies like "Titanic," "The Curious Case of Benjamin Button" and several Marvel films. To create the photorealistic characters seen in "The Quarry," it used the AI facial capture system Masquerade, which was developed to replicate Josh Brolin's likeness for his character Thanos in "Avengers: Infinity War." Masquerade was originally designed to do one thing: take the performance from a head-mounted camera and translate it into a digital mesh that could then be rendered in a movie. For "The Quarry," the VFX team needed something that could track the movement and facial expressions of actors and create digital characters that could be edited in real time. So they built Masquerade 2.0.
- Media > Film (1.00)
- Leisure & Entertainment > Games > Computer Games (0.40)
- Information Technology > Graphics > Animation (0.40)
- Information Technology > Artificial Intelligence > Games (0.40)
How TIME Re-created the 1963 March on Washington in Virtual Reality
Tucked away in an office on a quiet Los Angeles street, past hallways chockablock with miniature props and movie posters, is a cavernous motion-capture studio. And in that studio is the National Mall in Washington, D.C., in 1963, on the day Martin Luther King Jr. delivered his "I Have a Dream" speech. Or rather, it was inside that room that the visual-effects studio Digital Domain captured the expressions, movements and spirit of King, so that he could appear digitally in The March, a virtual reality experience that TIME has produced in partnership with the civil rights leader's estate. The experience, which is executive-produced and narrated by actor Viola Davis, draws on more than a decade of research in machine learning and human anatomy to create a visually striking re-creation of the country's National Mall circa 1963, and of King himself. When work on the project began more than three years ago, a big question needed answering.
- North America > United States > California > Los Angeles County > Los Angeles (0.26)
- North America > United States > District of Columbia > Washington (0.25)
Avengers set to assemble in 'Fortnite' in 'Endgame' movie-video game crossover event
Two pop culture touchstones – the online video game "Fortnite" and Marvel movie franchise The Avengers – have a mash-up on the horizon. Game publisher Epic Games has begun posting vague updates on the official Fortnite Twitter page with the phrasing, "Whatever it takes." Disney marketing president Asad Ayaz confirmed in his own tweet that a partnership "arrives this week" between the game and the film "Avengers: Endgame," which begins hitting theaters Thursday night. The competitive gaming news site Dot Esports noted that an update for Sony PlayStation 4 users showed that the event is expected to start at 4 a.m.
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Games (0.65)
Machine Learning Saves 'Avengers' VFX Artists Time
For visual-effects artists, time is always a struggle. When the call comes in to create something spectacular, artists and supervisors have to calculate how much runway they have to get from the point of the idea for the vfx to the deadline. On "Avengers: Infinity War," the vfx crew found that a new innovation, machine learning, made it possible to create the character Thanos in a way that would have simply been impossible without it. The filmmakers envisioned a version of Thanos, played by Josh Brolin, that would be CG, but also incorporate all the subtle facial expressions and delicate hallmarks of a physical performance that could only be done by an actor. They knew that the facial tracking tech was there, but asking vfx artists to manually adjust every inch of the CG version of the face of Thanos once they had all the tracking and scanning information would have been a disaster.
- Media > Film (0.73)
- Leisure & Entertainment (0.73)
AI gives Thanos a soul in 'Avengers: Infinity War'
Then again, even after 19 films in Disney's superhero universe, it's not as if he's had much strong competition. Aside from the puckish Loki and tragic Killmonger, most Marvel villains have been pretty forgettable. Now, after years of buildup (we first caught a glimpse of Thanos in 2012's The Avengers), he finally took center stage in this summer's Avengers: Infinity War. But what's most intriguing about Thanos isn't that he wants to wipe out half of life across the universe; it's that he's a big purple alien who feels genuine emotion. He cries when he's forced to sacrifice Gamora, his adopted daughter.
- Media > Film (0.74)
- Leisure & Entertainment (0.74)
Would you watch a movie written and animated by artificial intelligence? Genetic Literacy Project
The next time you sit down to watch a movie, the algorithm behind your streaming service might recommend a blockbuster that was written by AI, performed by robots, and animated and rendered by a deep learning algorithm. An AI algorithm may have even read the script and suggested the studio buy the rights. It's easy to think that technology like algorithms and robots will make the film industry go the way of the factory worker and the customer service rep, and argue that artistic filmmaking is in its death throes. Just like computers made it so animators didn't have to draw every frame by hand, advanced algorithms can automatically render advanced visual effects. In both cases, the animator didn't lose their job.
50 vs 50 Is Back, With One Big Change For 'Fortnite: Battle Royale'
There was a surprise drop into Fortnite: Battle Royale this afternoon: the limited time 50 vs 50 V2 mode is back, and playable now. It makes for a single mad rush to a large-scale melee in the middle, and while the games end in blowouts a little more often than I would like, they're always a welcome break from the familiar rhythms of standard play. If you want to play 50 vs. Longtime Fortnite players -- or really just anyone who's been with the game a few weeks now -- will notice a big change for the game as a whole with this addition. I suspected it might happen when Epic shifted the "modes" menu to what it looks like now, but this marks the first time that we have two limited time modes running concurrently.
'Fortnite: Battle Royale' Nerfed Thanos When It Should Have Buffed Him
While it was strange enough that Thanos was arriving in Fortnite for a Marvel crossover event, stranger still was the fact that when he did show up, Epic decided they needed to nerf him immediately, launching a live hotfix to do so. Apparently they thought the Mad Titan was just too powerful in his present form, so they reduced his shield, which recharged on kill, and bumped his health a bit. They also decreased the damage his beam attack does, and all of this left many players scratching their heads. I've played with and against both versions of Thanos, and my initial thought was certainly not that he was in such desperate need of a nerf that he had to be hotfixed. Rather, if anything, it seems like he might need to be buffed a little bit.
- Information Technology > Artificial Intelligence > Games > Computer Games (0.40)
- Information Technology > Communications > Social Media (0.37)