Collaborating Authors

Big Data

The AI Lords Of Sports: How The SportsTech Is Changing Business World


It is the time of the fall classic, Major League Baseball's World Series. As the two best teams vie for the championship this year, there are some actors in the game beyond the players, coaches, umpires (or referees), and fans… namely big data, analytics, and artificial intelligence. These new actors are also highly prevalent in football, basketball, and hockey, and they are changing these games forever. Sports' foray into technology and data really got its start in 2002 with the Oakland Athletics. General Manager Billy Beane and Assistant GM Paul DePodesta would popularize sabermetrics, a statistical approach to evaluating players and baseball strategy.

How does Big Data Analytics use Machine Learning?


Big data refers to collections of structured and unstructured data too broad to be managed using conventional methods. Big data research makes sense of this data by uncovering trends and patterns. Machine learning can accelerate the process through decision-making algorithms: it can categorize incoming data, identify trends, and convert raw data into insights that are useful for business operations. Machine learning algorithms help large organizations gather, analyze, and incorporate data at scale.
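The pattern discovery described above can be sketched with a simple clustering algorithm. This is a minimal, pure-Python illustration, not any specific product's method; the data (hypothetical session durations) and the choice of k-means with two clusters are assumptions for the example.

```python
def kmeans_1d(values, k=2, iters=20):
    """Cluster 1-D values into k groups by iterative centroid updates."""
    centroids = [values[0], values[-1]]  # deterministic init for k=2
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest centroid
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            groups[nearest].append(v)
        # recompute each centroid as the mean of its group
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids, groups

# Hypothetical session durations (seconds): two usage patterns emerge
durations = [5, 7, 6, 8, 120, 130, 115, 125]
centroids, groups = kmeans_1d(sorted(durations))
```

Run on the sample data, the algorithm separates short visits from long sessions without being told the categories in advance, which is the sense in which machine learning "uncovers trends" in large datasets.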

Council Post: Lack Of Cybersecurity Consideration Could Upend Industry 4.0


Industry 4.0 signifies a seismic shift in the way modern factories and industrial systems operate. It consists of large-scale integration across an entire ecosystem where data inside and outside the organization converges to create new products, predict market demands and reinvent the value chain. In Industry 4.0, we see the convergence of information technology (IT) and operational technology (OT) at scale. The convergence of IT/OT is pushing the boundaries of conventional corporate security strategies, where the focus has always been placed on protecting networks, systems, applications and processed data involving people and information. In the context of manufacturing industries with smart factories and industrial systems, robotics, sensor technology, 3D printing, augmented reality, artificial intelligence, machine learning and big data platforms work in tandem to deliver breakthrough efficiencies.

The Upside to Deepfake Technology - InformationWeek


Indeed, it's now possible to impersonate practically anybody with astonishing verisimilitude, thanks to the artificial intelligence technology known as generative adversarial networks (GANs). Fortunately, deepfakes have not shown their ugly side in the US presidential election campaign that is now drawing to a close. No one has been able to point to any significant use of GANs to produce deceptive videos and thereby manipulate public opinion. Instead, GANs are increasingly popping up in socially beneficial applications, such as for photorealistic animation and live-action video post-production. As evidenced by several recent industry announcements, next-generation remote collaboration services are using GANs and other AI techniques to improve the quality of rendered streams while improving the productivity of participants on these calls.
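The adversarial mechanism behind the GANs mentioned above can be shown in a few lines. This is a sketch of the original GAN loss formulation only, not of any deepfake or video product; the probability values fed in are hypothetical.

```python
import math

def gan_losses(d_real, d_fake):
    """Binary cross-entropy losses in the original GAN formulation.

    d_real, d_fake: discriminator's probability-of-real for a real
    sample and a generated sample, each in the open interval (0, 1).
    """
    # Discriminator wants d_real -> 1 and d_fake -> 0
    d_loss = -(math.log(d_real) + math.log(1.0 - d_fake))
    # Non-saturating generator loss: generator wants d_fake -> 1
    g_loss = -math.log(d_fake)
    return d_loss, g_loss

# A confident discriminator (real -> 0.9, fake -> 0.1) has low loss,
# while the generator's loss stays high until its fakes fool it.
d_loss, g_loss = gan_losses(0.9, 0.1)
```

Training alternates between minimizing these two losses; the generator improves precisely because the discriminator penalizes unconvincing output, which is what drives the photorealism the article describes.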

RS21 Hires Leader in Artificial Intelligence – IAM Network


Dr. Michelle Archuleta Leads AI Innovations as RS21 Director of Data Science RS21, an industry leader in developing interactive, big data analytics and visualization products and an Inc. 500 fastest-growing company, welcomes Michelle Archuleta, Ph.D. as its Director of Data Science. Dr. Archuleta is an entrepreneur, inventor and practitioner of artificial intelligence (AI) with 15 years of experience. She has six pending patents and has published in top peer-reviewed scientific journals in the fields of systems biology and computational biology, applying mathematics and machine learning. Recommended AI News: Intel Xeon Scalable Platform Built for Most Sensitive Workloads "I'm excited to use AI for good and tackle high impact projects at RS21," said Dr. Archuleta. "I am looking forward to collaborating with the data science team, UX/UI design experts and developers to identify those 'aha' moments that lead to new knowledge, a better path forward, and ultimately, make a difference in people's lives."

Deep Learning for NLP and Speech Recognition: Kamath, Uday, Liu, John, Whitaker, James: 9783030145989: Books


Uday Kamath has more than 20 years of experience architecting and building analytics-based commercial solutions. He currently works as the Chief Analytics Officer at Digital Reasoning, one of the leading companies in AI for NLP and speech recognition, heading the Applied Machine Learning research group. Most recently, Uday served as the Chief Data Scientist at BAE Systems Applied Intelligence, building machine learning products and solutions for the financial industry, focused on fraud, compliance, and cybersecurity. Uday has previously authored several books on machine learning, including Machine Learning: End-to-End guide for Java developers: Data Analysis, Machine Learning, and Neural Networks simplified and Mastering Java Machine Learning: A Java developer's guide to implementing machine learning and big data architectures. He has also published many academic papers in machine learning journals and conferences.

Upgrading Psychiatry Treatment Using AI and Big Data


Artificial Intelligence (AI) entered the healthcare sector long ago and is making a measurable impact on the treatment and monitoring of patients. The psychiatry department, however, stands out when it comes to utilising AI applications. It has taken a long time to reach even the current early stage, where AI is used to analyze patients, and still only by a handful of psychiatrists. Medicine as a whole is already reaping the benefits of artificial intelligence and big data.

Technology is about to destroy millions of jobs. But, if we're lucky, it will create even more


The next five years might see 85 million jobs displaced by new technologies, according to a new report from the World Economic Forum (WEF), although the trend could be balanced out by the creation of 97 million new roles – subject, however, to businesses and governments putting in extra effort to upskill and retrain the workforce. While the adoption of technologies that automate human labor has been long anticipated by analysts, who have predicted the start of the "Fourth Industrial Revolution" for years now, 2020 has come with its share of unexpected events, and they have greatly accelerated changes that could threaten the stability of the labor market sooner than expected. The COVID-19 pandemic has fast-tracked most businesses' digital transformation, bringing remote work into the mainstream but also sparking CIOs' interest in new technologies. Surveying 300 of the world's biggest companies, which together employ eight million people, the WEF found that an overwhelming 80% of decision makers are planning on accelerating the automation of their work processes, while half are set to increase the automation of jobs in their company. Industries like finance, healthcare and transportation are showing renewed interest in artificial intelligence, while the public sector is keen to increase the use of big data, IoT and robotics.

The 2020 data and AI landscape


When COVID hit the world a few months ago, an extended period of gloom seemed all but inevitable. Yet many companies in the data ecosystem have not just survived but in fact thrived. Perhaps most emblematic of this is the blockbuster IPO of data warehouse provider Snowflake that took place a couple of weeks ago and catapulted Snowflake to a $69 billion market cap at the time of writing – the biggest software IPO ever (see the S-1 teardown). And Palantir, an often controversial data analytics platform focused on the financial and government sector, became a public company via direct listing, reaching a market cap of $22 billion at the time of writing (see the S-1 teardown). Meanwhile, other recently IPO'ed data companies are performing very well in public markets. Datadog, for example, went public almost exactly a year ago (an interesting IPO in many ways, see my blog post here).

Create your secondary AI data store using HPE Apollo 4000 systems and Scality RING


Why does a secondary data store matter for AI? In my previous blog in this data store series, I discussed how the real selection criterion for an AI/ML data platform is obtaining the best balance between capacity (cost per GB stored) and performance (cost per GB of throughput). Indeed, to support enterprise AI programs, the data architecture must deliver both high performance (needed for AI training and validation) and high capacity (needed to store the huge amount of data that AI training requires). These two capabilities can be hosted on the same system (an integrated data platform), but in large infrastructures they are typically hosted in two separate specialized systems (a two-tier architecture). This post continues the series of blogs dedicated to data stores for AI and advanced analytics.
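The capacity/performance trade-off behind the two-tier argument can be made concrete with a back-of-the-envelope cost model. All prices and the 10% "hot data" fraction below are hypothetical, and the calculation is an illustration of the reasoning, not vendor pricing for any HPE or Scality product.

```python
def blended_cost_per_gb(total_gb, hot_fraction, fast_price, capacity_price):
    """Average $/GB when hot_fraction of the data lives on the fast tier
    and the remainder sits on the cheaper capacity tier."""
    hot_gb = total_gb * hot_fraction
    cold_gb = total_gb - hot_gb
    return (hot_gb * fast_price + cold_gb * capacity_price) / total_gb

# Hypothetical 1 PB dataset: 10% active training data on a fast tier
# ($0.50/GB), the rest on an object-storage capacity tier ($0.05/GB)
two_tier = blended_cost_per_gb(1_000_000, 0.10,
                               fast_price=0.50, capacity_price=0.05)
all_fast = blended_cost_per_gb(1_000_000, 1.00,
                               fast_price=0.50, capacity_price=0.05)
```

Under these assumed prices the two-tier layout costs roughly a fifth as much per GB as keeping everything on the performance tier, which is why large AI infrastructures separate the training tier from the secondary capacity store.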