If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. The bar has now been set for robot holiday videos, thanks to FZI. Still waiting for a robot with a cookie to show up at my door.
As we reflect on the definition of "work" in the business world, it's clear that most work today is the exception processing of tasks we have not yet been able to fully automate. And with artificial intelligence changing the way we program software, we can now tackle this last mile of enterprise automation that has eluded us for the past thirty years. As AI projects roll out over the next few years, we will need to rethink the definition of the "work" that people will do. In the post-AI era, the future of work will become one of the largest agenda items for policy makers, corporate executives and social economists. Despite the strong and inherently negative narrative around the impact on jobs, the bulk of the impact from the automation of work through AI will result in a "displacement" of work, not a "replacement" of work – it's easy to see how the abacus-to-calculator-to-Excel progression created completely new work around financial planning and reporting, and enterprise performance management.
If you really want to know how to future-proof your career, your best bet is the World Economic Forum (WEF) Future of Jobs Report 2018. The report confirmed most of the things we already knew: that automation and machine learning are set to create as many jobs as they displace, that the gig economy and flexible contract work will become standard, and that knowledge of data science is going to be a key differentiator in the job market over the next few years. With more than two years having passed since the first Future of Jobs report, there have been some new developments. With the mainstreaming of chatbots and other consumer-facing artificial intelligence (AI), there's more of an understanding of how machine learning might integrate into our society. Now that we've had some time to get to know Sophia, Alexa, Pepper and the rest, there are noticeably fewer "Are robots coming to steal your job?" clickbait articles in the media.
We humans have been able to recognize the objects around us from an early age. Interacting with objects in our environment plays a vital role in the emergence of object perception and manipulation capabilities. It also provides a form of self-supervision: we choose an action and learn from its result. While this comes naturally to humans, in robots this capability is still an active area of research. Self-supervision enables robots to learn without the need for large amounts of labeled training data or manual supervision.
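The act-observe-learn loop described above can be sketched in a few lines. This is a minimal toy illustration, not any specific robotics framework: a hypothetical agent repeatedly "pushes" an object with a random force, and the observed outcome (did it move?) serves as the training label, so no human annotation is needed. The task, friction value, and update rule are all invented for demonstration.

```python
import random

random.seed(0)

def simulate_push(force):
    """Hypothetical world model: a push succeeds when force exceeds friction."""
    friction = 0.6
    return 1 if force > friction else 0  # the outcome doubles as the label

# A one-parameter "policy": the force threshold the agent believes it needs.
threshold, lr = 0.0, 0.1

for _ in range(200):
    force = random.random()        # explore with a random action
    moved = simulate_push(force)   # self-supervision: the result labels the action
    predicted = 1 if force > threshold else 0
    # Update the belief only when the prediction disagreed with reality.
    if predicted != moved:
        threshold += lr * (force - threshold)

print(round(threshold, 2))  # drifts toward the true friction of 0.6
```

The point of the sketch is that the supervision signal comes entirely from the environment's response to the agent's own actions, which is what lets robots learn without curated datasets.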
A month ago a group convened in the University Club dining room at Arizona State University to discuss the future of national security research. There were retired Army and Marine generals, agents from the CIA and a bevy of scientists. Two trendlines popped out over the peppered bacon and frittatas: Nation states are vying for technological dominance, and the Holy Grail in that sphere is the successful pairing of humans and artificial intelligence. Creating machines that think and act like us is as much grounded in the humanities as it is in engineering. Talk to engineers about the problem, and they'll discuss things far outside the usual lanes of engineering, things like the nature of self, perception and free will.
It seems we've already seen more than we were ready for: VR in video games, IoT in medicine, and smart cities being brought to life. We are getting close to living in some sort of sci-fi, so it's a good idea to look at the most likely and promising machine learning and AI trends for the upcoming 2018 and ask ourselves whether we are ready for them. Healthcare is one of the biggest and most crucial industries in the world, so it's no wonder it is a heavy user of the latest technologies – it's a matter of life and death. First of all, thanks to artificial intelligence and work with Big Data, scientists may soon get the opportunity to prevent certain diseases, such as cancer. By analyzing a patient's history and all their records, AI could come to understand the mechanisms of a disease, enabling doctors to be proactive instead of reactive.
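To make the "predict risk from patient records" idea above concrete, here is a minimal sketch, with everything synthetic and invented for illustration: a toy logistic-regression risk model trained on made-up "patient records" (a normalized age and a lab value), flagging high-risk patients before symptoms appear. Real clinical models are vastly more complex; this only shows the shape of the approach.

```python
import math
import random

random.seed(1)

def make_record():
    """Synthetic record: (age, biomarker), both scaled to 0..1."""
    age = random.random()
    marker = random.random()
    # Hidden ground-truth rule, used only to generate labels.
    label = 1 if 2.0 * age + 3.0 * marker - 2.5 > 0 else 0
    return (age, marker), label

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Train a logistic-regression model with plain stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
data = [make_record() for _ in range(500)]

for _ in range(300):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y                 # gradient of the log-loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

# Fraction of records where the model's risk call matches the label.
correct = sum(
    (sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1)
    for (x1, x2), y in data
)
print(correct / len(data))  # training accuracy on the synthetic records
```

The design choice worth noting is that the model outputs a probability, not a yes/no answer, which is what lets clinicians set their own threshold for when to intervene proactively.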
Artificial Intelligence (AI) is an intriguing concept that has fascinated experts and laymen alike for years now. Technology in 2018 is moving at breakneck speed, and it is safe to say that a person today carries significantly more computing power in their pocket than they had in their entire home back in the 90s. There have been immense breakthroughs in the fields of machine learning and deep learning. These techniques have allowed machines to process and analyze information themselves in a very sophisticated manner. Thanks to these AI developments, machines can now perform complex functions such as facial recognition.
Creating a virtual environment that looks realistic takes time and skill. The details have to be hand-crafted using a graphics chip that renders 3D shapes, appropriate lighting, and textures. The latest blockbuster video game, Red Dead Redemption 2, for example, took a team of around 1,000 developers more than 8 years to create, at times working 100-hour weeks. That kind of workload might not be required for much longer. A powerful new AI algorithm can dream up the photorealistic details of a scene on the fly.
This week in Las Vegas, Amazon rolled out dozens of new features, upgrades, and products at AWS re:Invent. Here's a quick roundup of news out of the annual conference that may matter to members of the AI community. A disproportionate amount of money is spent on inference versus training when it comes to AI models, AWS CEO Andy Jassy said, and GPUs can be terribly inefficient for it. To address these issues, Amazon custom-designed a chip named Inferentia, due out next year, and created Elastic Inference, a service that identifies parts of a neural network that can benefit from acceleration. To speed up training of AI models, Amazon introduced AWS-Optimized TensorFlow, which can train a model on the ResNet-50 benchmark in 14 minutes.
Gone is the era when competition was only among people; the world has now progressed to the point where the contest is between humans and machines. The digital revolution has escalated to such an extent that development in the field of information technology is happening at an exponential pace. Moore's 1965 prediction about digital development has held up remarkably well in practice rather than remaining a mere hypothesis. With these advances in information technology, the security of data has also become a matter of concern for scientists and organizations. Given the high-level innovations occurring in this field, it is clear that cybersecurity will need to become even tighter in the future.