We Might See A 100T Language Model In 2022
Looking back at 2021, it can surely be labelled the year of large language models, with all the tech giants releasing models to stay ahead in the innovation game. In December alone, we saw back-to-back releases: DeepMind's 280-billion-parameter transformer language model, Gopher; Google's Generalist Language Model (GLaM), a trillion-weight model that uses sparsity; and LG AI Research's language model "Exaone," capable of tuning 300 billion different parameters, or variables. With innovation in language models accelerating at such a massive pace, could we see a 100T large language model in the very near future? The idea is surely not far-fetched given the progress tech companies have made, bringing out improved versions of today's models in a span of just a few years. After OpenAI released the GPT-3 autoregressive language model with 175 billion machine learning parameters in 2020 (its predecessor, GPT-2, was over 100 times smaller, at 1.5 billion parameters), the tech mammoths have poured major effort into building more such models.
- North America > United States > California > Alameda County > Berkeley (0.05)
- Asia > China > Beijing > Beijing (0.05)
- Information Technology (1.00)
- Energy > Renewable (0.50)
GPT-3 Scared You? Meet Wu Dao 2.0: A Monster of 1.75 Trillion Parameters
Jack Clark, OpenAI's policy director, calls this trend of copying GPT-3 "model diffusion." Yet, among all the copies, Wu Dao 2.0 holds the record as the largest of all, with a striking 1.75 trillion parameters (10x GPT-3). Coco Feng reported for the South China Morning Post that Wu Dao 2.0 was trained on 4.9TB of high-quality text and image data, which makes GPT-3's training dataset (570GB) pale in comparison. It's worth noting, though, that OpenAI researchers curated 45TB of raw data to extract that clean 570GB. Wu Dao 2.0 can learn from both text and images and tackle tasks that involve both types of data (something GPT-3 can't do).
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.50)
We are sleepwalking into AI-augmented work
A recent New York Times article concludes that new AI-powered automation tools such as Codex for software developers will not eliminate jobs but simply be a welcome aid that augments programmer productivity. This is consistent with the argument we're increasingly hearing that people and AI have different strengths and that there will be appropriate roles for each. As discussed in a Harvard Business Review story: "AI-based machines are fast, more accurate, and consistently rational, but they aren't intuitive, emotional, or culturally sensitive." The belief is that "AI plus humans" is something of a centaur, greater than either one operating alone.
Five Key Facts About Wu Dao 2.0: The Largest Transformer Model Ever Built - KDnuggets
I recently started TheSequence, a new newsletter focused on AI education that already has over 50,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes five minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. It seems that every other month brings a new milestone in the race to build massively large transformer models. GPT-2 set new records with a 1.5-billion-parameter model, only to be surpassed by Microsoft's Turing-NLG with 17 billion parameters.
Pinaki Laskar on LinkedIn: #artificialintelligence #machinelearning #deeplearning
AI Researcher, Cognitive Technologist, Inventor - AI Thinking, Think Chain Innovator - AIOT, XAI, Autonomous Cars, IIOT; Founder, Fisheyebox; Spatial Computing Savant, Transformative Leader, Industry X.0 Practitioner
At what stage of development are #artificialintelligence and #machinelearning now? We're living in exciting times, as the narrow AI of statistical ML/DL gives way to causal AI/ML/DL. Are there any new breakthrough results? OpenAI shocked the world a year ago with GPT-3. Google presented LaMDA and MUM, two AIs that will revolutionize chatbots and search engines, respectively.
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.61)
Wu Dao 2.0: Why China is Leading the Artificial Intelligence Race?
Wu Dao 2.0 has surpassed OpenAI's GPT-3 in many ways, and China could grow to monopolise the language modelling world. Artificial intelligence models have become a strong informal indicator of national and continental progress. "Wu Dao" means enlightenment. The model is dubbed China's first homegrown super-scale intelligent model system, and its development was led by BAAI Research Academic Vice President and Tsinghua University Professor Tang Jie.
- Asia > China > Beijing > Beijing (0.08)
- North America > United States (0.05)
- Europe > Sweden (0.05)
- (9 more...)
Wu Dao 2.0 - Bigger, Stronger, Faster AI From China
It is no secret that China has COVID-19 under control. When you travel there you need to go through a 2-week hotel quarantine but once you are in the country, you are safe. Probably even safer than before COVID as wearing a mask is now part of the etiquette, and the many other viral respiratory diseases are likely to be on the decline. Hence, when I got invited to speak at the annual conference of the Beijing Academy of Artificial Intelligence (BAAI) in the AI for healthcare section, I readily accepted. The BAAI is a great platform for showcasing technology and talent across broad categories.
DeepMind AGI paper adds urgency to ethical AI
It has been a great year for artificial intelligence. Companies are spending more on large AI projects, and new investment in AI startups is on pace for a record year. All this investment and spending is yielding results that are moving us all closer to the long-sought holy grail -- artificial general intelligence (AGI).
- North America > Canada > Ontario > Toronto (0.15)
- Oceania > New Zealand (0.05)
- Asia > China > Beijing > Beijing (0.05)
Amazing New Chinese A.I.-Powered Language Model Wu Dao 2.0 Unveiled
Earlier this month, Chinese artificial intelligence (A.I.) researchers at the Beijing Academy of Artificial Intelligence (BAAI) unveiled Wu Dao 2.0, the world's biggest natural language processing (NLP) model. NLP is a branch of A.I. research that aims to give computers the ability to understand text and spoken words and respond to them in much the same way human beings can. Last year, the San Francisco–based nonprofit A.I. research laboratory OpenAI wowed the world when it released its GPT-3 (Generative Pre-trained Transformer 3) language model. GPT-3 is a 175-billion-parameter deep learning model trained on text datasets containing hundreds of billions of words. A parameter is a value in a neural network that is learned during training; by assigning each piece of input a greater or lesser weighting, the parameters give the network its learned perspective on the data.
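To make the parameter counts being compared here concrete, the following is a minimal sketch (with made-up layer sizes) of how parameters are tallied in a plain fully connected network. Models like GPT-3 and Wu Dao 2.0 are transformers, not simple dense networks, but the counting idea is the same: every weight and every bias is one learned parameter.

```python
def dense_layer_params(n_in: int, n_out: int) -> int:
    """Parameters in one fully connected layer: one weight per
    input-output connection, plus one bias per output unit."""
    return n_in * n_out + n_out

# A toy 3-layer network (hypothetical sizes): 512 inputs -> 256 -> 64 -> 10 outputs
layer_sizes = [512, 256, 64, 10]
total = sum(dense_layer_params(a, b) for a, b in zip(layer_sizes, layer_sizes[1:]))
print(total)  # 148,426 parameters -- versus GPT-3's 175,000,000,000
```

Scaling this arithmetic up is exactly why parameter counts explode: widening or deepening the network multiplies the number of connections, and each connection is another value to learn.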
- North America > United States > California > San Francisco County > San Francisco (0.26)
- Asia > China > Beijing > Beijing (0.26)
- Europe (0.06)
- Media (0.52)
- Government (0.32)
Meet Wu Dao 2.0, the Chinese AI model making the West sweat
A new artificial intelligence model developed by Chinese researchers is performing untold feats with image creation and natural language processing -- making rivals in Europe and the U.S. nervous about falling behind. The model, dubbed Wu Dao 2.0, is able to understand everything people say -- grammar included -- but can also recognize images and generate realistic pictures based on descriptions. It can also write essays and poems in traditional Chinese, as well as predict the 3D structures of proteins, POLITICO's AI: Decoded reported. Developed by the government-funded Beijing Academy of Artificial Intelligence and unveiled last week, Wu Dao 2.0 appears to be among the world's most sophisticated AI language models. Its creators say it's 10 times more powerful than its closest rival, GPT-3, developed by the U.S. firm OpenAI.
- North America > United States (0.31)
- Asia > China > Beijing > Beijing (0.25)
- Europe > Germany (0.06)
- (7 more...)