MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers