If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
When Apple unveiled the iPhone X, it catapulted artificial intelligence and machine learning into the limelight. Facial recognition became a mainstream reality for those who could afford it. A few months later, the Vietnamese cyber security firm Bkav claimed it had bypassed the iPhone X's Face ID using a relatively inexpensive $150 mask. The claim remains unverified: while it has not been fully accepted, no one has been able to refute it on scientific grounds. For anyone working in AI and security, it underscored what many of us have held true for a while.
Artificial intelligence (AI), machine learning and cognitive analytics are having a tremendous impact in areas ranging from medical diagnostics to self-driving cars. AI systems are highly dependent on enormous volumes of data, both at rest in repositories and in motion in real time, to learn from experience, make connections and arrive at critical business decisions. Usage of AI is also expected to expand significantly in the not-so-distant future. As a result, having the right storage to support the massive amounts of data required for AI workloads is an important consideration for an increasing number of organizations. Availability: When business leaders use AI for critical tasks such as understanding how best to run their manufacturing process or optimizing their supply chain, they cannot afford to risk any loss of availability in the supporting storage system.
Researchers from our group at QUT and the Australian Centre for Robotic Vision have had six papers accepted to the upcoming Australasian Conference on Robotics and Automation, to be held at The University of Technology Sydney. This year the conference trialed a dual-submission process with the IEEE International Conference on Robotics and Automation, meaning work can be presented at both conferences but published in the proceedings of only one. The papers cover ongoing research in our lab spanning robotics, positioning and AI for applications in mining, construction safety and autonomous vehicles. I'll give an overview here of the research we're doing, with a wrap-up at the end. Despite very high safety standards, work sites of all varieties around Australia still see large numbers of injuries and occasional fatalities.
Fingerspitzengefühl: A German word used to describe the ability to maintain attention to detail in an ever-changing operational and tactical environment by maintaining real-time situational awareness. The term is synonymous with the English expression "keeping one's finger on the pulse". The problem with fingerspitzengefühl traditionally, in addition to pronouncing it, has been that it is hard for an individual to scale up. In a world of sensors, AI and mobile devices, having real-time situational awareness is far easier than ever before. In fact, today the challenge is not how to do it, but what to do with the massive volume of data that can be provided.
Nearly every industry today is swimming in data, and the floodgates are not closing any time soon. Expert projections suggest a 4,300% increase in annual data production, reaching 35 zettabytes by 2020. As the acceleration of data analytics continues, more businesses are realizing the need for greater efficiency through increased automation across their organizations. In fact, nearly three-quarters of business leaders and employees believe at least some part of their job could be automated. Yet there is also an ongoing debate around the purely computational ability of machines, which inherently lacks business logic.
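The projection above can be sanity-checked with quick arithmetic; a minimal sketch (the 44x multiplier is derived here from the quoted 4,300% figure and is not stated in the original projection):

```python
# Back-of-envelope check of the quoted data-growth projection.
# A 4,300% increase means the final volume is 44x the baseline
# (100% of the original plus 4,300% growth).
growth_pct = 4300
final_zettabytes = 35

multiplier = 1 + growth_pct / 100              # 44x
baseline_zb = final_zettabytes / multiplier    # implied starting volume
print(f"Implied baseline: {baseline_zb:.2f} ZB")  # roughly 0.8 ZB
```

In other words, the projection implies annual data production of under one zettabyte at the start of the period, growing to 35 zettabytes by 2020.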
Determining marketing and PR's business impact is a major challenge for both B2C and B2B companies. In B2B, long customer buying journeys and the extended time lag between "marketing or PR cause" and "business effect" have made it all but impossible to judge what the business was getting in return for each dollar of marketing investment. Caught in the annual "planning and budgeting" competition for resources, marketing and PR have often gotten the short end of the stick. The result is that marketing and PR's business impact has gone unrecognized and unrewarded in too many companies. From 2001 until 2016, I served as CEO of BMC Software.
Not long ago, reaching audiences through technologies like email was considered a breakthrough. Fast forward to today: a hyper-accelerated existence in which the average human checks their smartphone over 150 times daily. This phone-fiddling translates into the average enterprise dealing with somewhere between 200 and 300 million digital signals per day, yet most can barely handle about 2 percent of that data. This implies companies attempt to make sense of around four to six million signals every single day, which is still far too many to analyze, interpret, and respond to quickly. As for the 98 percent of the signals organizations can't handle, massive amounts of opportunity and insight are lost.
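The signal figures above follow from simple arithmetic; a minimal sketch, assuming the quoted 2 percent rate applies uniformly to the 200-300 million daily signals:

```python
# Sketch of the signal-volume arithmetic quoted above:
# 2% of 200-300 million daily signals is what enterprises can handle.
signals_low, signals_high = 200_000_000, 300_000_000
handled_fraction = 0.02

handled_low = signals_low * handled_fraction     # 4 million
handled_high = signals_high * handled_fraction   # 6 million
missed_low = signals_low * (1 - handled_fraction)  # 196 million unhandled

print(f"Handled: {handled_low:,.0f} to {handled_high:,.0f} signals/day")
print(f"Unhandled (low end): {missed_low:,.0f} signals/day")
```

Even the handled slice, at four to six million signals per day, works out to dozens of signals per second, which is the sense in which it remains "far too many" for human analysis.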
Today's consumers are pickier than ever. They want customized, personalized, and unique products over standardized ones and prefer local, smaller producers over large-scale global manufacturers. Factories, power plants, and manufacturing centers around the world must rely on automation, machine learning, computer vision, and other fields of AI to meet these rising demands and transform the way we make, move, and market things. Since the industrial revolution, factories have been optimized to mass produce a few products rapidly and cheaply to satisfy global demand. "The largest inefficiency that most manufacturers face is inflexibility," says Jim Lawton, Chief Product & Marketing Officer of Rethink Robotics, maker of collaborative industrial robots.
Films dating back as far as 1927 (and perhaps beyond) predicted machine intelligence would be the future. But for as many who saw the vision, there was an equal number who believed the forecast was unsound: how could a machine possibly replicate, let alone surpass, the intelligence of the human who created it? Fast-forward to 2017, and we find two things: machine intelligence is here and rising, and it's quite capable of learning at a much faster rate than we do. Look no further than search engines, which compile the digital footprints we leave behind, analyze them to better interpret our unique and changing behaviors, then use that information to tailor individual online experiences. With every search we do, the engines grow smarter about their billions of users and how to better accommodate them.
Artificial Intelligence (AI) may be out of the sci-fi closet, but its adoption within the MarTech industry is still very much in its nascent stage. With the advent of wireless and virtual technologies in the last two decades, businesses have evolved at a breakneck pace. And now, the biggest force pushing that momentum is the refined combination of AI and Machine Learning (AI/ML). While the first half of the decade focused on leveraging Big Data, AI technology in business, especially in sales and marketing, has taken a major lead with its growing adoption rate. The promise AI offers, that intelligent software running on AI/ML algorithms will perform tasks better, faster and cheaper, is an optimistic opportunity within MarTech.