If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
SONAL SHAH: It's also about how do we make data more useful for people to use and to solve problems in their communities?

TANYA OTT: Okay, that is a big job. Who is this superhuman who fills it? We'll tell you, in a moment. But first, let me say, you're listening to the Press Room, where we talk about some of the biggest issues facing businesses today. I'm Tanya Ott and joining me today are Bill Eggers …

SONAL SHAH: I am the executive director and a professor of practice at Georgetown University's Beeck Center.

TANYA OTT: Bill and Sonal are coauthors of The CDO Playbook – a guide for chief data officers. For the last decade, government has been focused on making data more open and easily accessible to the public.
When it comes to AI, companies typically test the waters with proofs of concept or small-scale use cases, taking advantage of vendor offerings such as new features in their existing SaaS platforms. If things go well, they pursue another project, then another -- and soon they're relying on a sprawl of incompatible systems and competing data lakes, with cost overruns, duplicated effort, and an inability to scale, not to mention privacy, compliance or ethics problems. At some point, the benefits of AI become obvious enough, and the pain of continuing on their present path so acute, that companies step back to develop a cohesive strategy for an enterprise-wide AI-powered transformation. "The tendency to get overwhelmed in individual technologies is not only drowning organizations in technical debt but discouraging them because they don't see a path forward to sustainable and scalable AI," said Traci Gusher, partner in the data, analytics and artificial intelligence practice at KPMG.
Even as companies invest more in business intelligence and analytical tools, many organisations are unable to derive critical benefits from them. In most cases, this stems from the inflection point created by the new oil: data. Data has transformed how businesses look at their processes and operations. It has changed perspectives and introduced new sources of revenue, insight and competency. With AI on the rise, companies are digitally transforming their organisations at a scale never seen before.
In this special guest feature, Solomon Thimothy, CEO of DMA Digital Marketing Agency, argues that digital marketing advancements in 2018 have set a high bar for customer expectations. Customers now expect, deserve and demand personalized, seamless transactions, and those expectations will only increase in 2019. By focusing on data-based AI solutions, organizations can ensure the customer journey will be more personalized and more profitable in the year to come. Solomon focuses his expertise and passion on helping businesses invest in long-term digital marketing for financial growth. His education from Northeastern Illinois University and North Park University provides him with the tools needed to live up to his digital marketing commitments.
At the 2019 Semicon conference, Applied Materials (AMAT) held a day-long seminar focused on technology, particularly memory, for artificial intelligence (AI) applications. In addition to talks by AI experts, the company also discussed its tools for manufacturing magnetic random access memory (MRAM) as well as resistive random access memory (RRAM) and phase change memory (PCM). A workshop at Stanford in August will explore emerging memories enabling artificial intelligence, especially for embedded products such as IoT devices. Gary Dickerson from Applied Materials gave a kick-off talk at the seminar. He talked about the growth of data and the importance of memory to support data centers as well as the edge.
Fifty years is a long time by human standards, and an eon by technology standards. In 1969, not many organizations even knew what a computer was, let alone used one. Though the exercise is trivial, comparing the compute power of then to what we have now can help us appreciate the effort it took to achieve the moon landing. The scale of our compute and storage capabilities has changed dramatically as Moore's law has been in full effect. Like many "laws," Moore's law is more a rule of thumb, stating that the number of transistors in a dense integrated circuit doubles about every two years.
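The back-of-the-envelope arithmetic behind that rule of thumb is easy to sketch. Assuming a clean doubling every two years (which real hardware only approximates), the 50 years between 1969 and 2019 would give 25 doublings:

```python
def transistor_growth(years, doubling_period=2):
    """Multiplicative growth factor if density doubles every `doubling_period` years.

    This is Moore's law taken literally as exponential growth -- a rough
    rule of thumb, not an exact description of real chips.
    """
    return 2 ** (years / doubling_period)

factor = transistor_growth(2019 - 1969)  # 50 years -> 25 doublings
print(f"{factor:,.0f}x")  # roughly a 33-million-fold increase
```

Even this idealized figure understates the change in overall capability, since clock speeds, memory sizes, and storage grew alongside transistor counts.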
How do you benchmark the "evil" quotient in your AI app? That may sound like a facetious question, but let's ask ourselves what it means to apply such a word as "evil" to this or any other application. And, if "evil AI" is an outcome we should avoid, let's examine how to measure it so that we can certify its absence from our delivered work product. Obviously, this is purely a thought experiment on my part, but it came to mind in a serious context while I was perusing recent artificial intelligence industry news. Specifically, I noticed that MLPerf has recently announced the latest versions of its benchmarking suites for both AI inferencing and training.
AI is finding its way to more places in organizations, including human resources. Human capital management providers are building AI into their solutions, but depending on the details, it may be wiser to build your own application than buy something off-the-shelf. Earlier this year, Gartner issued a research note exploring AI use cases in human capital management (HCM). Its author, VP Analyst Helen Poitevin, concluded that many of these applications were still in the "demo candy" stage, mainly to demonstrate product roadmaps. In other words, AI-related expectations are outpacing reality.
With research suggesting artificial intelligence in manufacturing could become mainstream within 24 months, what can manufacturers gain from taking an early adopter approach? With AI and advanced analytics identifying patterns and trends in the wealth of data generated by the IoT, the barriers between operational technology and information technology are breaking down. Manufacturers can become data-driven in all aspects of business, enabling them to transform operations, restructure supply chains, improve efficiency, address skills shortages and create entirely new revenue streams and business models. Despite the many benefits, the Manufacturing Leadership Council's 'Factories of the Future' survey revealed that fewer than one in 10 manufacturers (8%) are currently using AI – though a further 50% expect to deploy it within two years. AI is still nascent in manufacturing today, yet these results suggest it could become mainstream in under 24 months.
Generic OCR programs are adequate for simple text scooping and editing. For example, with Microsoft OneNote, you can import a PDF or JPEG image and use the OCR tool to output text. But general-purpose OCR lacks the tools your business needs to take full advantage of the tremendous labor-saving and machine learning capabilities that full-powered OCR provides. You need OCR that scales to your needs and is specific to your business context. Think of OCR as a data collection tool.
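The "data collection tool" framing is the key difference: a generic engine hands you raw text, while a business-grade pipeline turns that text into structured records. A minimal sketch of that second step, using a made-up invoice layout and the text a generic OCR engine might produce:

```python
import re

# Hypothetical raw text as a generic OCR engine might emit it;
# the invoice layout and field names here are illustrative only.
raw_text = """
Invoice No: INV-10482
Date: 2019-07-15
Total Due: $1,284.50
"""

def extract_invoice_fields(text):
    """Pull structured fields out of OCR'd invoice text."""
    fields = {}
    number = re.search(r"Invoice No:\s*(\S+)", text)
    date = re.search(r"Date:\s*(\d{4}-\d{2}-\d{2})", text)
    total = re.search(r"Total Due:\s*\$([\d,]+\.\d{2})", text)
    if number:
        fields["invoice_no"] = number.group(1)
    if date:
        fields["date"] = date.group(1)
    if total:
        # Strip thousands separators before converting to a number.
        fields["total"] = float(total.group(1).replace(",", ""))
    return fields

print(extract_invoice_fields(raw_text))
# {'invoice_no': 'INV-10482', 'date': '2019-07-15', 'total': 1284.5}
```

In practice this extraction layer is where business-specific OCR earns its keep: templates, validation rules, and machine learning models replace the brittle regexes shown here.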