Big Data


Why Programmers Are Not Data Scientists (and Vice Versa)

#artificialintelligence

Hot jobs go in waves, and not surprisingly, the information technology sector follows fashions as religiously as teenagers do. There is a good reason for this, of course. The hot IT jobs are where the money is, and if you want to play in that market, you need the skills or training to participate. Otherwise, you run the risk of watching your income fall as you're relegated to lesser-paying jobs, or worse, are forced into IT management, doomed never to touch a compiler again while never quite managing to play in the big leagues with the C-suite (I may be exaggerating a bit here, though not necessarily by much). Over the years, the role of programmers as generalists has faded, even as their importance as creators of tools for others has grown dramatically.


Global Big Data Conference

#artificialintelligence

Experts from MIT and IBM held a webinar this week to discuss where AI technologies are today and advances that will help make their usage more practical and widespread. Artificial intelligence has made significant strides in recent years, but modern AI techniques remain limited, a panel of MIT professors and IBM's director of the Watson AI Lab said during a webinar this week. Neural networks can perform specific, well-defined tasks but they struggle in real-world situations that go beyond pattern recognition and present obstacles like limited data, reliance on self-training, and answering questions like "why" and "how" versus "what," the panel said. The future of AI depends on enabling AI systems to do something once considered impossible: Learn by demonstrating flexibility, some semblance of reasoning, and/or by transferring knowledge from one set of tasks to another, the group said. The panel discussion was moderated by David Schubmehl, a research director at IDC, and it began with a question he posed asking about the current limitations of AI and machine learning.


Global Big Data Conference

#artificialintelligence

Every department in a company has its own challenges. In the case of Human Resources, recruitment and onboarding, employee orientations, process paperwork, and background checks are demanding and often painstaking, mostly because of the repetitive and manual nature of the work. The most challenging task of all is engaging with employees on human grounds to understand their needs. As leaders observe the AI revolution sweep across every process, Human Resources is no exception: there has been a visible wave of AI disruption across HR functions. According to a 2017 IBM survey of 6,000 executives, 66% of CEOs believe that cognitive computing can drive compelling value in HR, while half of HR personnel believe it may affect roles in the HR organization.


Top 15 Cheat Sheets for Machine Learning, Data Science & Big Data

#artificialintelligence

Data Science is an ever-growing field with numerous tools and techniques to remember. It is not possible for anyone to memorize all the functions, operations, and formulas of each concept. That's why we have cheat sheets. But with a plethora of cheat sheets available out there, choosing the right one is a tough task. So, I decided to write this article. Enjoy and feel free to share!


Seamlessly Scaling AI for Distributed Big Data

#artificialintelligence

Originally published at LinkedIn Pulse. Early last month, I presented a half-day tutorial at this year's virtual CVPR 2020. It was a unique experience, and I would like to share some highlights. The tutorial focused on a critical problem that arises as AI moves from experimentation to production: how to seamlessly scale AI to distributed Big Data. Today, AI researchers and data scientists must go through a great deal of pain to apply AI models to production datasets stored in distributed Big Data clusters.


Global Big Data Conference

#artificialintelligence

ModiHost is a new platform for hotels that uses artificial intelligence to offer a better hotel management system, centered on personalizing the guest experience. In turn, the company aims to drive increased spending and brand loyalty. It claims to have cracked a code that many hotels haven't, offering a way to remember guest preferences and anticipate guests' needs that most hotels couldn't manage on their own. As the company puts it in its whitepaper: "Hotel management is a complex and convoluted industry. It is also a highly inefficient one. The need to operate multiple systems, integrate different booking systems, and process reservations via mediums ranging from email to fax, have made hotel management hopelessly complicated."


Global Big Data Conference

#artificialintelligence

The first few months of 2020 have radically reshaped the way we work and how the world gets things done. While robotaxis and self-driving freight trucks are not yet in wide use, the Covid-19 pandemic has hastened the introduction of artificial intelligence across all industries. Whether through outbreak tracing or contactless customer payments, the impact has been immediate, and it also provides a window into what's to come. The second annual Forbes AI 50, which highlights the most promising U.S.-based artificial intelligence companies, features a group of founders who are already pondering what their space will look like in the future, though all agree that Covid-19 has permanently accelerated or altered the spread of AI. "We have seen two years of digital transformation in the course of the last two months," Abnormal Security CEO Evan Reiser told Forbes in May. As more parts of a company are forced to move online, Reiser expects to see AI put to use to help businesses analyze the newly available data or to increase efficiency.


Decision points in storage for artificial intelligence, machine learning and big data

#artificialintelligence

Data analytics has rarely been more newsworthy. Throughout the Covid-19 coronavirus pandemic, governments and bodies such as the World Health Organization (WHO) have produced a stream of statistics and mathematical models. Businesses have run models to test post-lockdown scenarios, planners have looked at traffic flows and public transport journeys, and firms use artificial intelligence (AI) to reduce the workload for hard-pressed customer services teams and to handle record demand for e-commerce. Even before Covid-19, industry analysts at Gartner pointed out that expansion of digital business would "result in the unprecedented growth of unstructured data within the enterprise in the next few years". Advanced analytics needs powerful computing to turn data into insights.


IBM Ramps Up AI, Analytics Via New File, Object Storage

#artificialintelligence

IBM Thursday introduced new storage hardware and software aimed at placing its storage at the center of large-scale data requirements for artificial intelligence and analytics workloads. The new offerings are aimed at helping to build the kind of information architecture needed to get the most out of businesses' fast-changing data, said Eric Herzog, IBM's chief marketing officer and vice president of worldwide storage channels. "The new stuff is all about storage solutions for AI, big data and business analytics," Herzog told CRN. "IBM thinks customers need an information architecture to build AI before they can collect and analyze their data and feed it into their AI systems." IBM storage technology has always been an important part of customers' high-performance computing, artificial intelligence and machine-learning infrastructures, said John Zawistowski, global systems solutions executive at Sycomp, a Foster City, Calif.-based solution provider and IBM channel partner. "Why IBM? It's the way they integrated the AI software platform and storage," Zawistowski told CRN. "And the way IBM understands the importance of doing that. And the way IBM technology performs."


Pick a number: big data, artificial intelligence and aviation

#artificialintelligence

Airline transport faces an enviable problem: how does it improve an already impressive safety record? Doing so may be beyond human capability, but well within the potential of two computing concepts--big data and artificial intelligence. Big data is an almost self-defining term. More specifically, as defined by Gartner Group in 2001, it is data that has the three Vs: 'greater variety, arriving in increasing volumes and with ever-higher velocity.' The Airbus A350 is a good example of the three Vs.