Integrate.io Achieves Google Cloud Ready - BigQuery Designation

#artificialintelligence

"Integrate.io is thrilled to achieve the Google Cloud Ready – BigQuery designation! We look forward to continuing our ongoing partnership to drive the evolution of the data stack together and to helping every organization become data driven." Google Cloud Ready – BigQuery is a partner integration validation program intended to increase customer confidence in partner integrations with BigQuery. As part of this initiative, Google engineering teams validate partner integrations in a three-phase process: running a series of data integration tests, comparing the results against benchmarks, and working closely with partners to fill any gaps and refine documentation for mutual customers. The designation assures customers that Integrate.io's integration with BigQuery has been validated. "Digital transformation increasingly requires analysis of, and access to, data across multiple platforms and environments," said Manvinder Singh, Director of Partnerships at Google Cloud.
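The three-phase validation flow described above (run tests, compare against benchmarks, close gaps) can be sketched roughly as follows. Every name here is hypothetical and for illustration only; Google's actual validation harness is not public.

```python
# Illustrative sketch of a test-then-benchmark validation loop.
# All names (run_integration_tests, BENCHMARKS, validate) are made up --
# this is NOT Google's real tooling, just the shape of the process.

BENCHMARKS = {"load_rows_per_sec": 50_000, "query_latency_ms": 1_200}

def run_integration_tests():
    """Phase 1: run the partner's data integration tests (stubbed here)."""
    return {"load_rows_per_sec": 62_000, "query_latency_ms": 950}

def compare_to_benchmarks(results, benchmarks):
    """Phase 2: flag every metric that misses its benchmark."""
    gaps = []
    for metric, target in benchmarks.items():
        value = results[metric]
        # Lower is better for latency; higher is better for throughput.
        ok = value <= target if metric.endswith("_ms") else value >= target
        if not ok:
            gaps.append(metric)
    return gaps

def validate():
    """Phase 3: report gaps so the partner can close them and re-run."""
    gaps = compare_to_benchmarks(run_integration_tests(), BENCHMARKS)
    return "validated" if not gaps else f"gaps to close: {gaps}"

print(validate())  # -> "validated" with the stubbed numbers above
```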


Amazon fixes security flaw in AWS Glue service

ZDNet

Amazon Web Services has fixed two flaws affecting AWS Glue and AWS CloudFormation. According to Orca Security, the bug in AWS Glue could have allowed an attacker using the service to create resources and access the data of other AWS Glue customers. Orca researchers say the vulnerability stemmed from an internal misconfiguration within AWS Glue, which AWS confirmed it has since fixed.


Bootstrap a Modern Data Stack in 5 minutes with Terraform - KDnuggets

#artificialintelligence

A Modern Data Stack (MDS) is a set of technologies that makes a modern data warehouse perform 10–10,000x better than a legacy data warehouse. Ultimately, an MDS saves time, money, and effort. Its four pillars are a data connector, a cloud data warehouse, a data transformer, and a BI & data exploration tool. Managed and open-source tools with hundreds of pre-built, ready-to-use connectors make integration easy. For simple use cases, what once required a team of data engineers to build and maintain can now be handled by a single tool.
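The four pillars can be illustrated with a minimal ELT sketch using only the Python standard library. All names here are illustrative; a real stack would swap in a managed connector (e.g. Airbyte or Fivetran), a cloud warehouse (e.g. BigQuery or Snowflake), a transformation tool (e.g. dbt), and a BI product.

```python
# Toy ELT pipeline mapping each step to one MDS pillar.
# sqlite3 stands in for the cloud warehouse; everything is in-memory.

import sqlite3

def extract():
    """Pillar 1 -- data connector: pull raw rows from a source system."""
    return [("2024-01-01", "widget", 3), ("2024-01-01", "gadget", 5),
            ("2024-01-02", "widget", 2)]

def load(conn, rows):
    """Pillar 2 -- warehouse: land raw data first, transform later (ELT)."""
    conn.execute("CREATE TABLE raw_sales (day TEXT, product TEXT, qty INT)")
    conn.executemany("INSERT INTO raw_sales VALUES (?, ?, ?)", rows)

def transform(conn):
    """Pillar 3 -- transformer: model raw data into an analytics table."""
    conn.execute("""CREATE TABLE sales_by_product AS
                    SELECT product, SUM(qty) AS total
                    FROM raw_sales GROUP BY product""")

def report(conn):
    """Pillar 4 -- BI/exploration: query the modeled table."""
    return dict(conn.execute("SELECT product, total FROM sales_by_product"))

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
print(report(conn))  # totals: widget 5, gadget 5 (key order may vary)
```

The point of the ELT ordering is that raw data lands in the warehouse untouched, so transformations can be re-run or revised without re-extracting from the source.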


Informatica debuts its intelligent data management cloud

ZDNet

Informatica on Tuesday is officially unveiling its intelligent data management cloud (IDMC), an AI-powered platform designed to serve a broad base of users working with data in multi-cloud environments. Along with that, the company is announcing a series of partnerships and integrations with Microsoft Azure, Amazon Web Services and Google Cloud Platform. Informatica has been in the business of data management tools for more than 20 years, and in that span of time, data has become increasingly valuable, Informatica Chief Product Officer Jitesh Ghai told ZDNet. "We recognize our community of data-led practitioners has grown well beyond technical ETL experts, well beyond data engineers and data scientists," he said. Now, it includes "non-technical users who want to operate with facts. Gut-based decision making has been laid bare as insufficient moving forward."


The New Data Engineering Stack

#artificialintelligence

Remember when the software development industry realized that a single person could take on multiple tightly coupled technologies, and coined the notion of a Full Stack Developer -- someone who does data modelling, writes backend code, and also does frontend work? Something similar happened in the data industry with the birth of the Data Engineer about half a decade ago. For many, the Full Stack Developer remains a mythical creature because of the never-ending list of technologies spanning frontend, backend, and data. One reason could be that visualisation (business intelligence) has become a massive field in its own right. A Data Engineer is expected to build systems that make data available, make it usable, move it from one place to another, and so on.


Data Integration: The vital baking ingredient in your AI strategy - Journey to AI Blog

#artificialintelligence

When people dream about becoming a baker or a pastry chef, they often think about the delicious pastries they’ll create, delighting their patrons with towering cakes wrapped in impossibly smooth fondant. But very rarely does anyone start off thinking about the preparation involved in baking… Without freshly milled flour, for example, you would never get a good piece of cake or a crusty loaf of bread. To produce those delicious pastries, a lot of preparation must happen before the actual baking begins. The same parallel can be drawn between AI and data integration. Let me explain. The business challenge: consider a regional U.S. retailer that recently decided to modernize its supply chain management, including supply chain availability, fulfillment, and its online cart. To accomplish this, the retailer decided to implement the Onera Decision Engine on Google Cloud Platform (GCP). The Onera Decision Engine…


Data integration issues still impede digital progress, survey shows

ZDNet

The foundation of a digitally savvy enterprise is data -- lots of it, from all corners -- with the right data arriving at the right time, digested and turned into insights that guide both humans and machines. But organizations are still fumbling in their efforts to bring it all together. Cloud services help, but they only go so far and may even complicate things further. That's the word from a recent survey of 1,400 executives released by Progress, which finds data integration to be the number-one challenge for enterprises seeking to expand their digital repertoire. Close to half of respondents pinpoint ever-increasing disparate data sources as a major pain point.


SnapLogic Adds GitHub, Container Support

#artificialintelligence

More vendors are offering integration tools to meet growing enterprise demand for faster delivery of application software. One approach is to automate key elements of the continuous integration and continuous delivery (CI/CD) pipeline using emerging application container services and cloud-based open-source development platforms. That's the route taken by SnapLogic, the self-service application and data integration specialist, which released an "integration cloud" this week that automates key software development bottlenecks. The Silicon Valley company also announced an update to its AI platform for automating routine development tasks, along with a catalog of data pipeline components. SnapLogic, of San Mateo, Calif., said its integration with GitHub Cloud and support for the Mesosphere container platform would "provide the glue needed to streamline the software development lifecycle."


Managing ETL Vendors

#artificialintelligence

Just a few short years ago, your technology vendor (IBM, Oracle, SAP, Microsoft, etc.) was always looking to enhance your on-premises ETL services so you could stay in "full control" of your database operations locally. You invested hundreds of thousands of dollars (if not much more!) to build out local infrastructure to support your data platform of choice -- not to mention all the other costs for hardware, security, environment, expertise, and so on. Today, one of the top projects for any technology leader or CXO is navigating a cloud strategy. Whether you are cloud mature, running a hybrid, or just exploring your options, there is much activity around the cloud.


Microsoft Releases Azure Data Factory V2 Visual Tools in Public Preview

#artificialintelligence

After releasing Azure Data Factory v2 (ADF) in public preview in September, Microsoft has followed up by announcing a public preview of new visual tooling for the fully managed, cloud-based data integration and ETL service.