Five Things to Consider as Strata Kicks Off

#artificialintelligence

Today marks the start of the fall Strata Data Conference in New York City, which has traditionally been the big data community's biggest show of the year. It's been a wild ride for the big data crowd in 2018, one that's brought its share of highs and lows. Now it's worth taking some time to consider how far big data has come, and where it may be headed. Here are five things to keep in mind as the Strata Data Conference kicks off. We've said this before, but it bears repeating: Hadoop is just one of many technologies angling for relevance in today's increasingly heterogeneous at-scale computing environment.


Cisco Unveils Server for Artificial Intelligence and Machine Learning

#artificialintelligence

SAN JOSE, Calif., September 10, 2018 -- Artificial intelligence (AI) and machine learning (ML) are opening up new ways for enterprises to solve complex problems. But they will also have a profound effect on the underlying infrastructure and processes of IT. According to Gartner, "only 4% of CIOs worldwide report that they have AI projects in production." That share is expected to grow, and when it does, IT will struggle to manage new workloads, new traffic patterns, and new relationships within the business. To help enterprises address these emerging challenges, Cisco is unveiling its first server built from the ground up for AI and ML workloads.


Machine Learning for Edge Devices

#artificialintelligence

Two of the most interesting topics in the tech world today are edge computing and artificial intelligence/machine learning. Separately, each is making a profound impact on both consumers and businesses, in everything from smart speakers to automated factory production lines. Even more intriguing, however, are applications that combine edge computing and machine learning (ML) to enable new kinds of experiences and opportunities in industries ranging from mobile and the connected home to security, surveillance, and automotive. Products that combine these two technologies can leverage the local compute and storage capabilities of new types of edge devices and perform a variety of actions on their own. This can save time, improve privacy, reduce network traffic, and enable applications or devices to be optimized for specific environments.


5 tech trends that blur the lines between human and machine

#artificialintelligence

"CIOs and technology leaders should always be scanning the market along with assessing and piloting emerging technologies to identify new business opportunities with high impact potential and strategic relevance for their business," says Gartner research vice president Mike J. Walker. In Gartner's latest Hype Cycle for Emerging Technologies, Walker reports on these must-watch technologies, listing five that will "blur the lines" between human and machine. They will profoundly create new experiences, with unrivaled intelligence, and offer platforms that allow organisations to connect with new business ecosystems, he states. AI technologies will be virtually everywhere over the next 10 years, reports Gartner. While these technologies enable early adopters to adapt to new situations and solve problems that have not been encountered previously, these technologies will become available to the masses -- democratised.


The Bayesian Probability: Basis and Particular Utility in AI

#artificialintelligence

Probability was, for quite a long time, called the doctrine of chances: the mathematical description of games of chance (dice, cards, and so on), used to describe and quantify randomness, or aleatory uncertainty. Statisticians use it to describe uncertainty. But how can you use probability to describe learning? How can you use it to describe an accumulation of information over time, so that you can revise a probability based on additional knowledge? However, using Bayes' theorem is one thing; being Bayesian is something else.
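To make the idea of probability-as-learning concrete, here is a minimal Python sketch (not from the article; the test accuracy and prior are illustrative assumptions) of a Bayesian update, where a prior belief is revised into a posterior each time new evidence arrives:

def bayes_update(prior, likelihood, false_positive_rate):
    """Return P(hypothesis | positive evidence) via Bayes' theorem."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Start from a 1% prior belief and fold in two independent positive test results.
belief = 0.01
for _ in range(2):
    belief = bayes_update(belief, likelihood=0.95, false_positive_rate=0.05)
    print(round(belief, 4))  # roughly 0.161, then 0.785: belief rises as evidence accumulates

The point of the sketch is the loop: each posterior becomes the prior for the next piece of evidence, which is exactly the "accumulation of information over time" the article asks about.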


Python rival? Programming language 'Julia' is winning over developers.

#artificialintelligence

A young programming language for machine learning is on the rise and could soon be gunning for Python. Python is now one of the most popular programming languages among developers and could soon overtake C. But a much younger language, Julia -- "a possible alternative to Python" -- is catching on quickly. While Python has been available to developers for nearly 30 years and is being spurred on by machine learning and data science, Julia has only been available since 2012, yet it is now showing up in numerous language popularity rankings. Last week, analysts from the TIOBE programming language index noted that Julia had made its top 50 list for the first time.


How the Convergence of AI and IoT is Transforming Careers

#artificialintelligence

This article is the second part (click here for part one) in our series about the role of AI in customer support. The first part explores how digitization, digital self-service, and distributed digital advisors are disrupting a range of industries and creating business opportunities to realize transformative value propositions, business models, services, and new revenue streams. The second part delves further into the impact of AIOps and machine automation on a range of different careers and considers how we can prepare for the occupational changes of the future. Many of the critical building blocks of computing -- microchip density, processing speed, storage capacity, energy efficiency, download speed, etc. -- have been improving at exponential rates over the last decade. According to the authors of The Second Machine Age, Erik Brynjolfsson and Andrew McAfee: "We've also recently seen great progress in natural language processing, machine learning (the ability of a computer to automatically refine its methods and improve its results as it gets more data), computer vision, simultaneous localization and mapping, and many of the other fundamental challenges of the discipline." Administration tasks have largely become self-service over the last few decades, with technology removing the need for typing pools, copyists, and mailroom clerks.


Business Intelligence and Machine Learning: Data Matters, Not Just the User Experience

#artificialintelligence

As artificial intelligence (AI) and machine learning (ML) begin to move out of academia and into the business world, there's been a lot of focus on how they can help business intelligence (BI). There is a lot of potential in systems that use natural language search to help management more quickly investigate corporate information, perform analysis, and define business plans. A previous column discussing "self-service" business intelligence (BI) briefly mentioned two areas of focus where ML can help BI. While the user interface, the user experience (UX), matters, its visibility is only the tip of the iceberg. The data being supplied to the UX is even more important.


Industry 4.0. Time to embrace the inevitable - Intetics

#artificialintelligence

Many changes are occurring in companies under the influence of information technology innovations. Those changes significantly increase the quality of products and services, which in turn raises customer loyalty and satisfaction. Manufacturers are no exception. New approaches and business models born in Industry 4.0 allow them to increase profits and invest more in product enhancement. The term "Industry 4.0" is now used as a synonym for the fourth industrial revolution.


Everything You Need To Know About The Artificial Intelligence Boom

#artificialintelligence

If you've used Google Maps, you've experienced artificial intelligence (AI) firsthand. It's a prime example of how AI technology today enables computers to take on tasks formerly reserved solely for humans -- such as reading a map. In this case, Google uses historical and real-time data to visualize current traffic patterns and then applies AI to predict future traffic flow, with the objective of plotting the quickest route to a destination. Three important trends have made recent advancements in AI possible: big data collection, reduced computing costs, and improvements in algorithms. Data these days are easy to collect and cheap to store, while the advent of cloud computing has made it much more affordable to crunch all that data.
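As a rough illustration of the routing idea (a hedged Python sketch, not Google's actual system): predicted travel times, here a hard-coded dictionary standing in for a learned traffic model, feed a standard shortest-path search so that the "quickest route" reflects forecast congestion rather than raw distance.

import heapq

# Hypothetical predicted minutes per road segment at the planned departure time.
predicted_minutes = {
    ("A", "B"): 10, ("B", "D"): 25,   # fewer segments, but congested
    ("A", "C"): 15, ("C", "D"): 12,   # a detour that is faster overall
}

def quickest_route(start, goal):
    """Dijkstra's algorithm over predicted travel times instead of distances."""
    graph = {}
    for (u, v), minutes in predicted_minutes.items():
        graph.setdefault(u, []).append((v, minutes))
    queue, settled = [(0, start, [start])], {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if settled.get(node, float("inf")) <= cost:
            continue
        settled[node] = cost
        for nxt, minutes in graph.get(node, []):
            heapq.heappush(queue, (cost + minutes, nxt, path + [nxt]))
    return None

print(quickest_route("A", "D"))  # (27, ['A', 'C', 'D']): the detour wins under predicted traffic

Swapping the hard-coded dictionary for a model that forecasts segment times from historical and live data is the step where the machine learning comes in; the route search itself stays the same.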