Why Meta Took Down Its 'Hallucinating' AI Model Galactica
On Wednesday, Meta AI and Papers with Code announced the release of Galactica, an open-source large language model with 120 billion parameters trained on scientific knowledge. However, just days after its launch, Meta took Galactica down. Notably, every result generated by Galactica came with the warning: "Outputs may be unreliable. Language Models are prone to hallucinate text."

"Galactica is trained on a large and curated corpus of humanity's scientific knowledge. This includes over 48 million papers, textbooks and lecture notes, millions of compounds and proteins, scientific websites, encyclopedias and more," the paper said.
Nov-21-2022, 10:01:22 GMT