
Learn To be Efficient: Build Structured Sparsity in Large Language Models

Oct-10-2025, 14:30:24 GMT · Neural Information Processing Systems

LTE reduces LLaMA2-7B inference latency by 25% at 50% sparsity.
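To make the headline result concrete, the sketch below illustrates what "structured sparsity" means in a transformer feed-forward block: entire hidden neurons are removed, so the pruned layer runs genuinely smaller matrix multiplies (the source of the latency gain). This is only a minimal illustration with hypothetical dimensions and a simple column-norm importance proxy, not the paper's LTE method.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_hidden = 64, 256          # hypothetical toy dimensions
W1 = rng.standard_normal((d_model, d_hidden))
W2 = rng.standard_normal((d_hidden, d_model))

def ffn(x, W1, W2):
    """Plain ReLU feed-forward block: x -> ReLU(x W1) W2."""
    return np.maximum(x @ W1, 0.0) @ W2

# Structured pruning: score each hidden neuron (here by the column
# norm of W1, a common simple proxy) and keep the top 50%.
keep = np.argsort(np.linalg.norm(W1, axis=0))[d_hidden // 2:]
W1_p, W2_p = W1[:, keep], W2[keep, :]   # drop whole neurons

x = rng.standard_normal((1, d_model))
dense_out = ffn(x, W1, W2)
sparse_out = ffn(x, W1_p, W2_p)

# The pruned matrices are half-width, so the matmuls do half the work.
print(W1_p.shape)  # → (64, 128)
```

Because whole rows/columns are removed (rather than scattering individual zeros), the pruned layer maps directly onto dense hardware kernels at the smaller size, which is why structured sparsity translates into wall-clock latency reductions.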

  large language model, machine learning, sparsity, (19 more...)


  • Country:
    • Asia > Middle East
      • Jordan (0.04)
    • Europe
      • Belgium > Brussels-Capital Region
        • Brussels (0.04)
      • Finland > North Karelia
        • Joensuu (0.04)
      • Germany > Saarland
        • Saarbrücken (0.04)
    • North America > United States
      • Illinois > Champaign County
        • Urbana (0.04)
      • Michigan (0.04)
      • Pennsylvania > Allegheny County
        • Pittsburgh (0.04)
      • Washington > King County
        • Seattle (0.04)
  • Genre:
    • Research Report
      • Experimental Study (0.93)
      • New Finding (1.00)
  • Industry:
    • Information Technology (0.46)
  • Technology:
    • Information Technology > Artificial Intelligence
      • Machine Learning > Neural Networks
        • Deep Learning (1.00)
      • Natural Language > Large Language Model (1.00)


© 2026, i2k Connect Inc · All Rights Reserved.