If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Today marks the start of the fall Strata Data Conference in New York City, which has traditionally been the big data community's biggest show of the year. It's been a wild ride for the big data crowd in 2018, one that's brought its share of highs and lows. Now it's worth taking some time to consider how far big data has come, and where it may be headed in the future. Here are five things to keep in mind as the Strata Data Conference kicks off. We've said this before, but it bears repeating: Hadoop is just one of many technologies angling for relevance in today's increasingly heterogeneous at-scale computing environment.
It took me 4 hours and 5 minutes to effectively annihilate the Universe by pretending to be an Artificial Intelligence tasked with making paper-clips. Put another way, it took me 4 hours and 5 minutes to have an existential crisis. This was done by playing the online game "Universal Paperclips", which was released in 2017. Though the clip-making goal of the game is in itself simple, there are so many contemporary lessons to be extracted from the playthrough that a deep dive seems necessary. Indeed, the game explores our past, present and future in the most interesting way, especially when it comes to the technological advances Silicon Valley is currently oh so proud of.
I have been brewing the idea of using machine learning to improve software systems since 2016. It was pretty vague and broad, without an actionable plan. I just had the intuition that software configuration and tuning, especially after the adoption of microservices, was getting too complex. If you have enough experience in the software industry, then it's very likely that you've struggled with either a configuration problem or a tuning problem. Configuration and tuning problems are pretty common and can lead to severe outages.
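As a toy illustration of the intuition above (not from the post itself; the latency model and parameter names are entirely hypothetical), even the simplest search-based tuner can beat hand-picking a configuration value:

```python
import random

def service_latency_ms(cache_size_mb: int) -> float:
    """Hypothetical latency model for illustration only: too little
    cache causes thrashing, too much starves other processes, so the
    (contrived) optimum sits at 256 MB."""
    return (cache_size_mb - 256) ** 2 / 100 + 5

def tune(lo: int, hi: int, iters: int = 200, seed: int = 0) -> int:
    """Random-search tuner: sample candidate settings, keep the best."""
    rng = random.Random(seed)
    best = lo
    best_latency = service_latency_ms(lo)
    for _ in range(iters):
        candidate = rng.randint(lo, hi)
        latency = service_latency_ms(candidate)
        if latency < best_latency:
            best, best_latency = candidate, latency
    return best

print(tune(16, 1024))  # lands near the optimum of 256
```

Real ML-driven tuners replace the random search with smarter strategies (Bayesian optimization, bandits), but the loop of "propose a configuration, measure, keep the best" is the same.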
This ebook, based on the latest ZDNet/TechRepublic special feature, looks at the rise of e-commerce and the digital transformation of retail companies. It takes a lot of machine learning and computer vision to ensure that a pair of high-end sneakers is authentic. GOAT is the largest sneaker marketplace and specializes in selling authentic goods. Specifically, GOAT provides buyers and sellers of sneakers an authenticity guarantee with a "ship to verify" model. GOAT, which has both e-commerce and physical retail locations, has 400 employees, 60 of whom are engineers, including 7 data scientists.
This post is authored by Viral B. Shah, co-creator of the Julia language and co-founder and CEO at Julia Computing, and Avik Sengupta, head of engineering at Julia Computing. The Julia language provides a fresh new approach to numerical computing, where there is no longer a compromise between performance and productivity. A high-level language that makes writing natural mathematical code easy, with runtime speeds approaching raw C, Julia has been used to model economic systems at the Federal Reserve, drive autonomous cars at the University of California, Berkeley, optimize the power grid, calculate solvency requirements for large insurance firms, model the US mortgage markets and map all the stars in the sky. It is no surprise, then, that Julia is a natural fit in many areas of machine learning, and its strengths make it a perfect language for implementing these algorithms.
There are many simulation and optimization problems that are difficult or impossible to solve using your existing computing resources. You do not have a quantum computer, which may be able to solve them, and you do not expect your company to get one soon. You are not alone, but don't worry: IBM will let you use its quantum computing resources to make a start on formulating solutions. For years, quantum computing was little more than an idea that fascinated computer scientists. Now it is offering direct utility for researchers and engineers even before the promise of a large-scale universal quantum computer is fulfilled.
Last year Google partnered with the Raspberry Pi Foundation to survey users on what would be most helpful in bringing Google's artificial intelligence and machine learning tools to the Raspberry Pi. Now those efforts are paying off. Thanks to Colaboratory – a new open-source project from Google – engineers, researchers, and makers can now build and run machine learning applications on a simple single-board computer. Google has officially opened up its machine learning and data science workflow – making learning about machine learning or data analytics as easy as using a notebook and a Raspberry Pi. Google's Colaboratory is a research and education tool that can easily be shared via Google's Chrome web browser.
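In that spirit, a first notebook cell might look like the following (a generic sketch of the kind of thing a beginner would run, not Google's own tutorial code): fitting a straight line by gradient descent, using nothing beyond plain Python so it runs on a Raspberry Pi as easily as in a Colaboratory notebook.

```python
# Fit y = w*x + b to data generated by y = 2x + 1,
# using plain gradient descent on mean squared error.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    # gradients of mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges close to 2.0 and 1.0
```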
A US Army researcher believes that future wars will be fought by human soldiers commanding teams of 'physical and cyber robots', creating an "Internet of Battle Things". "The Internet of Intelligent Battle Things (IOBT) is the emerging reality of warfare," says Alexander Kott, chief of the Network Science Division of the US Army Research Laboratory, as AI and machine learning advance. He envisions a future where physical robots are able to fly, crawl, walk, or ride into battle. Robots as small as insects can be used as sensors, while ones as big as large vehicles can carry troops and supplies. There will also be "cyber robots", essentially autonomous programs, used within computers and networks to protect communications, fact-check, relay information, and protect other electronic devices from enemy malware.
A few weeks back, I wrote about the need for machine learning at the edge and what big chip firms are doing to address the challenge. Even as Intel, ARM, and others invest in new architectures, startups are also attempting to innovate with new platforms. GreenWaves Technologies, based in France, is one such company. It has built a multi-core, low-power machine learning chip for the edge, called the GAP8 application processor.
Autonomous driving is not one single technology but rather a complex system integrating many technologies, which means that teaching autonomous driving is a challenging task. Indeed, most existing autonomous driving classes focus on one of the technologies involved. This not only fails to provide comprehensive coverage, but also sets a high entry barrier for students with different technology backgrounds. In this paper, we present a modular, integrated approach to teaching autonomous driving. Specifically, we organize the technologies used in autonomous driving into modules. These are described in the textbook we have developed, as well as in a series of multimedia online lectures designed to provide a technical overview of each module. Then, once the students have understood these modules, the experimental platforms for integration we have developed allow the students to fully understand how the modules interact with each other. To verify this teaching approach, we present three case studies: an introductory class on autonomous driving for students with only a basic technology background; a new session in an existing embedded systems class to demonstrate how embedded system technologies can be applied to autonomous driving; and an industry professional training session to quickly bring experienced engineers up to speed for work in autonomous driving. The results show that students can maintain a high interest level and make great progress by starting with familiar concepts before moving on to other modules.