Gordon Moore
Silicon Valley's Favorite Slogan Has Lost All Meaning
In early 2021, long before ChatGPT became a household name, OpenAI CEO Sam Altman self-published a manifesto of sorts, titled "Moore's Law for Everything." The original Moore's Law, formulated in 1965, describes the development of microchips, the tiny slivers of silicon that power your computer. More specifically, it predicted that the number of transistors that engineers could cram onto a chip would roughly double every year. As Altman sees it, something like that astonishing rate of progress will soon apply to housing, food, medicine, education--everything. The vision is nothing short of utopian.
- North America > United States > California (0.52)
- Europe > Netherlands > Limburg > Maastricht (0.05)
- Semiconductors & Electronics (1.00)
- Information Technology (1.00)
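The doubling rule described above is simple compound growth. As a rough illustration (not drawn from either article), here is a short Python sketch projecting transistor counts under a one-year and a two-year doubling period; the baseline count and time span are hypothetical, chosen only for the example.

```python
# Rough illustration only: Moore's observation treated as compound growth.
# The baseline count and time span below are hypothetical, chosen for the example.

def projected_transistors(start_count: float, years: float, doubling_period: float) -> float:
    """Project a transistor count forward, assuming it doubles every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

baseline = 2_300  # roughly the transistor count of an early-1970s microprocessor
for period in (1.0, 2.0):  # the 1965 one-year rule vs. the later two-year revision
    count = projected_transistors(baseline, years=20, doubling_period=period)
    print(f"Doubling every {period:g} years: ~{count:,.0f} transistors after 20 years")
```

Over 20 years, the two formulations differ by a factor of roughly a thousand (2^10), which is why the later revision to a two-year period mattered so much.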
Gordon Moore, Intel co-founder who predicted rise of the PC, dies at 94
Intel Corp co-founder Gordon Moore, a pioneer in the semiconductor industry whose "Moore's Law" predicted a steady rise in computing power for decades, has died at the age of 94, the company announced. Intel and Moore's family philanthropic foundation said he died on Friday surrounded by family at his home in Hawaii. Co-launching Intel in 1968, Moore was the rolled-up-sleeves engineer within a triumvirate of technology luminaries that eventually put "Intel Inside" processors in more than 80% of the world's personal computers. In an article he wrote in 1965, Moore observed that, thanks to improvements in technology, the number of transistors on microchips had roughly doubled every year since integrated circuits were invented a few years before. His prediction that the trend would continue became known as "Moore's Law" and, later amended to every two years, it helped push Intel and rival chipmakers to aggressively target their research and development resources to make sure that rule of thumb came true.
- North America > United States > Hawaii (0.25)
- North America > United States > California > San Francisco County > San Francisco (0.05)
- North America > Canada (0.05)
- Semiconductors & Electronics (1.00)
- Information Technology > Hardware (0.91)
Leaps and Bounds: The Breakneck Progress of Robot Agility
When Charles Rosen, the A.I. pioneer who founded SRI International's Artificial Intelligence Center, was asked to come up with a name for the world's first general-purpose mobile robot, he thought for a moment and then said: "Well, it shakes like hell when it moves. Let's just call it Shakey." Some variation of this idea has persisted through much of the history of modern robotics. Robots, we often assume, are clunky machines with as much grace as an atheist's Sunday lunch. Even science fiction movies have repeatedly imagined robots as ungainly creations that walk with slow, halting steps. Recently, a group of researchers from the Dynamic Robotics Laboratory at Oregon State took one of the university's Cassie robots, a pair of walking robot legs that resembles the lower extremities of an ostrich, to a sports field to try out the lab's latest "bipedal gait" algorithms.
Artificial General Intelligence: An Advancement to Foresee - Analytics Insight
At the core of the discipline of artificial intelligence is the idea that one day we will be able to build a machine as smart as a human. Such a system is often referred to as artificial general intelligence, or AGI, a name that distinguishes the concept from the broader field of study. It also makes clear that true AI possesses intelligence that is both broad and adaptable. To date, we have built countless systems that are superhuman at specific tasks, but none that can match a rat in general intelligence. Yet despite the centrality of this idea to the field of AI, there is little agreement among researchers as to when this feat might actually be achieved.
The Edge, The Center, and Everything In-between: The Internet of Things Gets Local Smarts - Cantina
A few years ago, the TensorFlow project appeared and opened up a whole new range of machine learning capabilities. An open-source platform with a host of tools for training and deploying machine learning models, TensorFlow has been one of the driving forces behind the rapid adoption of ML. It is a great tool for experimentation, and since its initial release it has been ported to the browser and integrated into mobile devices. Hardware designed and dedicated for machine learning, such as Google's Coral boards, has been bringing TensorFlow and other ML systems to the edge; that is, smaller, cheaper devices meant to be deployed out in the world, rather than big servers living in the data centers that back the Internet of Things and the cloud. But things are about to get even more interesting, because TensorFlow is starting to show up on cheap, low-power microcontrollers. This is big news in a small package.
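The workflow the article hints at typically looks something like this: train a model with the regular TensorFlow Python API, convert it to the compact TensorFlow Lite format, and embed the resulting bytes in the microcontroller's firmware. The sketch below is illustrative only; the toy model, file name, and sizes are placeholders, not anything from the article.

```python
# Illustrative sketch: train a tiny Keras model, then convert it to a TensorFlow Lite
# flatbuffer suitable for embedding in microcontroller firmware. The model, data, and
# file names are placeholders, not from the article.
import numpy as np
import tensorflow as tf

# A toy regression problem standing in for a real workload.
x = np.linspace(-1.0, 1.0, 200, dtype=np.float32).reshape(-1, 1)
y = 2.0 * x + 0.5 + np.random.normal(scale=0.05, size=x.shape).astype(np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=20, verbose=0)

# Convert to the TensorFlow Lite format; optimizations shrink the model for small devices.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting bytes are what get baked into the device image (often as a C array).
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"TFLite model size: {len(tflite_model)} bytes")
```

On the device side, the TensorFlow Lite for Microcontrollers runtime (a C++ library) loads those bytes and runs inference out of a small, preallocated memory arena, which is what makes deployment on cheap, low-power chips feasible.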
From Imitation Games To The Real Thing: A Brief History Of Machine Learning
Hephaestus, the Greek god of blacksmiths, metalworking and carpenters, was said to have fashioned artificial beings in the form of golden robots. Myth finally moved toward truth in the 20th century, as AI developed in a series of fits and starts, finally gaining major momentum--and reaching a tipping point--by the turn of the millennium. Here's how the modern history of AI and ML unfolded, starting in the years just following World War II. In 1950, while working at the University of Manchester, legendary code breaker Alan Turing (subject of the 2014 movie The Imitation Game) published a paper titled "Computing Machinery and Intelligence." It became famous for positing what became known as the "Turing test."
Intel AIVoice: Stepping Out Of Science Fiction: A History Of Intel Powering AI
A patent awarded on April 25, 1961, recognizes Robert Noyce as the inventor of the silicon integrated circuit (IC). Integrated circuits forever changed how computers were made while adding power to a process of another kind: the growth of a then-nascent field called artificial intelligence (AI). And the potential of Noyce's invention truly took flight when he and Gordon Moore founded Intel on July 18, 1968. Fifty years later, the "eternal spring" of artificial intelligence is in full swing. To understand how we arrived at this point, here's the truth in a nutshell: The rise of artificial intelligence is intertwined with the history of faster, more robust microprocessors.
- North America > United States > Nevada > Clark County > Las Vegas (0.05)
- North America > United States > California (0.05)
- Europe > Netherlands (0.05)
The Evolution of Computer Science in One Infographic
We take computing power for granted today. That's because computers are literally everywhere around us. And thanks to advances in technology and manufacturing, the cost of producing semiconductors is so low that we've even started turning things like toys and streetlights into computers. But how and where did this familiar new era start? Today's infographic comes to us from Computer Science Zone, and it describes the journey of how we got to today's tech-oriented consumer society.
- North America > United States > Texas (0.06)
- Europe > Germany (0.06)
Google and Amazon are spearheading a quiet gadget revolution, and it's going to put pressure on Apple most of all
The Google Pixel 2 smartphone relies on artificial intelligence, not cutting-edge specs, to make its sales pitch to customers. Way back in 1965, Gordon Moore, who went on to cofound Intel, predicted that the number of transistors on a chip would double roughly every year, a pace he later revised to every two years. That prediction has mostly come true and is now enshrined as "Moore's Law." However, many, including Moore himself, now believe Moore's Law is screeching to a halt. It's going to mean a huge shift for the technology industry. "To be honest, it's going to be tougher and tougher for people to develop new and exciting products every year," said Google hardware boss Rich Osterloh, on stage at the company's recent Pixel 2 phone launch event.