AI winter
Kioxia reportedly pulls IPO as AI winter looms
A significant initial public offering in Japan may have been pulled due to market concerns and fears of a possible chip glut. Kioxia Holdings, formerly Toshiba Memory, was reportedly preparing for an October Tokyo Stock Exchange listing that would have raised $500 million at an estimated $10 billion market capitalization. Citing unnamed sources, Reuters said yesterday that the share sale has been delayed. Kyodo reported that the company was concerned about not hitting its valuation target due to low demand.
There Was No 'First AI Winter'
As I concluded my June Historical Reflections column, artificial intelligence had matured from an intellectual brand invented to win funding for a summer research workshop to one of the most prestigious fields in the emerging discipline of computer science. Four of the first 10 ACM A.M. Turing Award recipients were AI specialists: Marvin Minsky, Herb Simon, Allen Newell, and John McCarthy. These men founded the three leading AI labs and played central roles in building what are still the top three U.S. computer science programs at MIT, Stanford, and Carnegie Mellon. Conceptually AI was about uncovering and duplicating the processes behind human cognition; practically it was about figuring out how to program tasks that people could do but computers could not. Although connectionist approaches based on training networks of simulated neurons had been prominent in the primordial stew of cybernetics and automata research from which AI emerged, all four Turing Award recipients favored the rival symbolic approach, in which computers algorithmically manipulated symbols according to coded rules of logic.
The economy is down, but AI is hot. Where do we go from here?
It was heartbreaking to read over the weekend about how some Googlers in the US found out about the company's abrupt cull. Dan Russell, a research scientist who has worked on Google Search for over 17 years, wrote about how he had gone to the office at 4 a.m. to finish off some work, only to find that his entry badge didn't work. Economists predict the US economy may enter a recession this year amid a highly uncertain global economic outlook. Big tech companies have started to feel the squeeze. In the past, economic downturns have shut off the funding taps for AI research.
- Banking & Finance > Economy (0.99)
- Information Technology (0.60)
AI Today Podcast: AI Glossary Series: AI Winters - Cognilytica
Like all technologies, Artificial Intelligence (AI) is not immune to the waves of obscurity, hyped promotion, plateauing of interest, and decline. In fact, the AI industry has been through two such major waves of interest, hype, plateau, and decline, commonly referred to as the "AI Winters". In this episode of the AI Today podcast hosts Kathleen Walch and Ron Schmelzer define an "AI Winter" at a high level.
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Communications > Mobile (0.74)
What Will The Future Of AI Be Like In 2030?
What does the future of AI hold for us by 2030? Will it be a wasteland populated only by robots, or will humans and machines coexist harmoniously? Will artificial intelligence make us brighter, healthier, and better able to solve global challenges like climate change and poverty? AI has existed for centuries, with early examples dating back to the 18th century. The term "artificial intelligence" was first coined in 1956, but it wasn't until the late 20th century that AI began to be developed in earnest.
- Health & Medicine (0.78)
- Energy (0.50)
8. Are we inviting yet another AI winter?
AI has existed for several decades and has shown increasing levels of maturity from time to time. It has also shown promise and an aptitude for solving the problems of its day. Companies increased their investments in so-called intelligent technologies to gain a competitive edge. But this gradually led to massive disappointment and loss of trust when the technology could not live up to expectations. These periods in history when AI slumped into darkness were termed the "AI winters."
How Companies Can Succeed in AI Winter: Jeff Kagan
Industry observers say we're about to enter the next AI Winter. Artificial Intelligence emerged more than 60 years ago, and in that timeframe we have seen many such seasons. While I don't believe the coming winter will be as severe as the others, by all accounts, it's coming. The first AI Winter happened in the 1970s, when, after more than a decade of heavy funding of academic research by the U.S. Department of Defense, the government pulled back. It became clear that advancing AI would be much more challenging and expensive than originally foreseen.
- North America > United States > Nevada > Clark County > Las Vegas (0.06)
- Asia > Japan (0.06)
Why Do We Keep Repeating The Same Mistakes On AI?
Artificial intelligence has a long and rich history stretching over seven decades. What's interesting is that AI predates even modern computers, with research on intelligent machines being some of the starting points for how we came up with digital computing in the first place. Early computing pioneer Alan Turing was also an early AI pioneer, developing ideas in the late 1940s and 1950s. Norbert Wiener, the creator of cybernetics, developed the first autonomous robots in the 1940s, when even transistors didn't exist, let alone big data or the cloud. Claude Shannon developed hardware mice that could solve mazes without needing any deep-learning neural networks.
- North America > United States > New York > New York County > New York City (0.05)
- Europe > Russia (0.05)
- Asia > Russia (0.05)
- Information Technology > Artificial Intelligence > Robots (1.00)
- Information Technology > Artificial Intelligence > History (1.00)
- Information Technology > Artificial Intelligence > Natural Language (0.99)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.35)
The sustainable approach that will help avoid a third 'AI winter'
The majority of big artificial intelligence companies are pouring huge amounts of energy and resources into AI in the hope of creating a more efficient and automated future. However, throwing large volumes of data at machine-learning algorithms and using vast amounts of processing power is neither efficient nor futureproof. Algorithms were never developed with efficiency in mind, so focusing on this aspect is a vital step towards avoiding another 'AI winter'. The energy consumption required for mining and managing Bitcoin has been in the media spotlight for years now. The energy usage of crypto transactions has even been compared to that of countries the size of Greece, which has a population of over 10 million people.
- Banking & Finance > Trading (0.58)
- Energy (0.53)
What is AI Winter? Definition, History and Timeline
The trajectory of AI has been marked by several winters since its inception in 1955 in a formal proposal made by computer scientists John McCarthy, Marvin Minsky, and several others. Between 1956 and 1974, the U.S. Defense Advanced Research Projects Agency (DARPA) funded AI research with few requirements for developing functional projects. After the initial hype generated by these AI projects, a quiet decade followed in which interest and support gradually tapered off. In 1969, Minsky and another AI researcher, Seymour Papert, published a book called Perceptrons, which pointed out the flaws and limitations of neural networks. This publication influenced DARPA to withdraw its previous funding of AI projects.
- Government > Regional Government > North America Government > United States Government (0.82)
- Government > Military (0.82)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.59)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Rule-Based Reasoning (0.36)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Expert Systems (0.36)