In this special guest feature, Solomon Thimothy, CEO of DMA Digital Marketing Agency, argues that digital marketing advancements in 2018 set a high bar for customer expectations, and that customer demand for personalized, seamless transactions will only grow in 2019. By focusing on data-based AI solutions, organizations can ensure the customer journey will be more personalized and more profitable in the year to come. Solomon focuses his expertise and passion on helping businesses invest in long-term digital marketing for financial growth. His education from Northeastern Illinois University and North Park University provides him with the tools needed to live up to his digital marketing commitments.
At the 2019 Semicon Conference, Applied Materials (AMAT) held a day-long seminar focused on technology, particularly memory, for artificial intelligence (AI) applications. In addition to talks by AI experts, the company also discussed its tools for manufacturing magnetic random access memory (MRAM) as well as resistive random access memory (RRAM) and phase change memory (PCM). We will also look at a workshop at Stanford in August that will explore emerging memories enabling artificial intelligence, especially for embedded products such as IoT devices. Gary Dickerson of Applied Materials gave the kick-off talk at the seminar. He spoke about the growth of data and the importance of memory in supporting data centers as well as the edge.
How do you benchmark the "evil" quotient in your AI app? That may sound like a facetious question, but let's ask ourselves what it means to apply such a word as "evil" to this or any other application. And, if "evil AI" is an outcome we should avoid, let's examine how to measure it so that we can certify its absence from our delivered work product. Obviously, this is purely a thought experiment on my part, but it came to mind in a serious context while I was perusing recent artificial intelligence industry news. Specifically, I noticed that MLPerf has recently announced the latest versions of its benchmarking suites for both AI inferencing and training.
AI is finding its way to more places in organizations, including human resources. Human capital management providers are building AI into their solutions, but depending on the details, it may be wiser to build your own application than buy something off-the-shelf. Earlier this year, Gartner issued a research note exploring AI use cases in human capital management (HCM). Its author, VP Analyst Helen Poitevin, concluded that many of these applications were still in the "demo candy" stage, mainly to demonstrate product roadmaps. In other words, AI-related expectations are outpacing reality.
With research suggesting artificial intelligence in manufacturing could become mainstream within 24 months, what can manufacturers gain from taking an early adopter approach? With AI and advanced analytics identifying patterns and trends in the wealth of data generated by the IoT, the barriers between operational technology and information technology are breaking down. Manufacturers can become data-driven in all aspects of business, enabling them to transform operations, restructure supply chains, improve efficiency, address skills shortages and create entirely new revenue streams and business models. Despite the many benefits, the Manufacturing Leadership Council's 'Factories of the Future' survey revealed that fewer than one in 10 manufacturers (8%) are currently using AI – though a further 50% expect to deploy it within two years. AI is still nascent in manufacturing today, yet these results suggest it could become mainstream in under 24 months.
The area burned by wildfires each year across the Western United States has increased by more than 300 percent over the past three decades, and much of this increase is due to human-caused warming. Warmer air holds more moisture, and the thirsty air sucks this from plants, trees, and soil, leaving forest vegetation and ground debris drier and easier to ignite. Future climate change, accompanied by warming temperatures and increased aridity, is expected to continue this trend, and will likely exacerbate and intensify wildfires in areas where fuel is abundant. Park Williams, a Lamont-Doherty Earth Observatory associate research professor and a 2016 Center for Climate and Life Fellow, studies climatology, drought, and wildfires. He has received a $641,000 grant from the Zegar Family Foundation that he'll use to advance understanding of the past and future behavior of wildfires.
As innovative technologies designed to support implementation of value-based care models continue to flood the market and patient demand for transparency of care quality and cost grows, patient-centric, data-driven technologies will rule the roost. These technology platforms address the need for timely, actionable patient-specific insights and leverage data to meaningfully improve quality, drive efficiencies and lower costs. Leading the way in delivering differentiated value, artificial intelligence (AI) technologies, such as machine learning (ML) and natural language processing (NLP), are entering the mainstream with the potential to transform healthcare delivery and better meet patient expectations. ML, for instance, has wide-ranging applications--from helping clinicians refine or customize care plans for individual patients to increasing the speed at which pharma and medical device companies can develop therapies that improve patient outcomes. These technologies can enable a healthcare system to continuously "learn" from the vast amounts of clinical data that have for years remained siloed and untapped, and, in a growing number of use cases, to leverage AI to support clinical decision making at the point of care.
Ready to use statistical and machine-learning techniques across large data sets? This practical guide shows you why the Hadoop ecosystem is perfect for the job. Instead of the deployment, operations, or software development usually associated with distributed computing, you'll focus on the particular analyses you can build, the data warehousing techniques that Hadoop provides, and the higher-order data workflows this framework can produce. Data scientists and analysts will learn how to perform a wide range of techniques, from writing MapReduce and Spark applications with Python to using advanced modeling and data management with Spark MLlib, Hive, and HBase. You'll also learn about the analytical processes and data systems available to build and empower data products that can handle--and actually require--huge amounts of data.
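The MapReduce pattern underlying Hadoop can be sketched in plain Python without a cluster. The toy word count below is an illustration only (not code from the book): the three functions mimic the map, shuffle/sort, and reduce phases a Hadoop job would run in a distributed fashion.

```python
from collections import defaultdict

def map_phase(documents):
    """Mapper: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    """Shuffle/sort: group emitted values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big insights", "data pipelines"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
# counts == {"big": 2, "data": 2, "insights": 1, "pipelines": 1}
```

In a real Hadoop or Spark job the mapper and reducer run in parallel across many nodes and the shuffle moves data over the network, but the logical contract is the same as in this single-process sketch.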
Your study material will be available for an extended duration of 7 months on Imarticus's fully integrated, state-of-the-art Learning Management System. You will need to log in to the learning portal using the credentials provided and navigate through the portal as required.
In December 2014, I asked whether we were at the beginning of "the end of the Hadoop bubble." I kept updating my Hadoop bubble watch (here and here) through the much-hyped IPOs of Hortonworks and Cloudera. The question was whether an open-source distributed storage technology that Google invented (and quickly replaced with better tools) could survive as a business proposition at a time when enterprises were moving rapidly to adopt the cloud and "AI"--advanced machine learning or deep learning. In January 2019, perennially unprofitable Hortonworks closed an all-stock $5.2 billion merger with Cloudera. In May 2019, another Hadoop-based provider, MapR, announced that it would shut down if it were unable to find a buyer or a new source of funding.