Data Story


DATAWEAVER: Authoring Data-Driven Narratives through the Integrated Composition of Visualization and Text

Fu, Yu, Bromley, Dennis, Setlur, Vidya

arXiv.org Artificial Intelligence

Data-driven storytelling has gained prominence in journalism and other data reporting fields. However, the process of creating these stories remains challenging, often requiring the integration of effective visualizations with compelling narratives to form a cohesive, interactive presentation. To help streamline this process, we present an integrated authoring framework and system, DataWeaver, that supports both visualization-to-text and text-to-visualization composition. DataWeaver enables users to create data narratives anchored to data facts derived from "call-out" interactions, i.e., user-initiated highlights of visualization elements that prompt relevant narrative content. In addition to this "vis-to-text" composition, DataWeaver also supports a "text-initiated" approach, generating relevant interactive visualizations from existing narratives. Key findings from an evaluation with 13 participants highlighted the utility and usability of DataWeaver and the effectiveness of its integrated authoring framework. The evaluation also revealed opportunities to enhance the framework by refining its filtering mechanisms and visualization recommendations, and to better support authoring creativity by introducing advanced customization options.


DataNarrative: Automated Data-Driven Storytelling with Visualizations and Texts

Islam, Mohammed Saidul, Laskar, Md Tahmid Rahman, Parvez, Md Rizwan, Hoque, Enamul, Joty, Shafiq

arXiv.org Artificial Intelligence

Data-driven storytelling is a powerful method for conveying insights by combining narrative techniques with visualizations and text. These stories integrate visual aids, such as highlighted bars and lines in charts, along with textual annotations explaining insights. However, creating such stories requires a deep understanding of the data and meticulous narrative planning, often necessitating human intervention, which can be time-consuming and mentally taxing. While Large Language Models (LLMs) excel in various NLP tasks, their ability to generate coherent and comprehensive data stories remains underexplored. In this work, we introduce a novel task for data story generation and a benchmark containing 1,449 stories from diverse sources. To address the challenges of crafting coherent data stories, we propose a multi-agent framework employing two LLM agents designed to replicate the human storytelling process: one agent responsible for understanding and describing the data (Reflection), generating the outline, and narrating the story, and another for verifying each intermediate step. While our agentic framework generally outperforms non-agentic counterparts in both model-based and human evaluations, the results also reveal unique challenges in data story generation.


Salesforce Tableau 2023.1 uses AI to bring data stories to life

#artificialintelligence

Data can be complicated to collect, and it is often even more complex to understand in a way that brings business value. Salesforce's Tableau business unit today announced the 2023.1 release of its enterprise platform, known as Tableau Server, which can run on-premises or in an organization's own virtual private cloud deployment. Tableau is generally used as a data analytics technology that helps users get insights from data. The new 2023.1 update integrates enhanced features to help organizations connect to data, including a data mapping feature designed to make it easier to run analytics on any data source. There is now also deeper integration with Salesforce's Slack messaging application, in a bid to help users benefit from data analytics directly within Slack.


Data Dashboarding Must Evolve or Face Irrelevance

#artificialintelligence

Data visualization has become synonymous with business intelligence (BI) dashboarding. But these dashboards have a weakness: they are only as good as the humans, and AI, that interpret them. For businesses to truly unlock their full operational efficiency potential, they must find a better way to translate data, operationalize metadata, and create more visually intuitive ways to build trust and extract value from the data. One reason for the lack of trust in data is the absence of context around the numbers that would make them useful, especially when the data serves a range of purposes and is viewed by people other than the dashboard creator. And more often than not, when the data isn't our own, we tend to distrust it.


Automated Data Storytelling Is Not the Future of Analytics

#artificialintelligence

Automated data storytelling is the future of analytics. That's the argument put forth by James Richardson during a conference hosted by automated data storytelling vendor Narrative Science (as reported here). I've spoken to Mr. Richardson on a couple of occasions and deeply appreciate his understanding of, and enthusiasm for, data storytelling. He has been a champion for data storytelling at Gartner for years. It is his modifier 'automated' that worked me into a Stephen Few-style lather.


The AI Hierarchy of Needs

#artificialintelligence

As is usually the case with fast-advancing technologies, AI has inspired massive FOMO, FUD and feuds. Some of it is deserved, some of it not -- but the industry is paying attention. From stealth hardware startups to fintech giants to public institutions, teams are feverishly working on their AI strategy. It all comes down to one crucial, high-stakes question: 'How do we use AI and machine learning to get better at what we do?' More often than not, companies are not ready for AI.


Global Big Data Conference

#artificialintelligence

How has the COVID-19 pandemic impacted the world of data and analytics in the enterprise? Here are the trends for 2021. Enterprise organizations have embraced the ideas behind advanced analytics technologies over the past several years, beginning with buzzwords like big data and moving on to topics such as machine learning and artificial intelligence. But the promise of these technologies can sometimes get lost in the reality of implementing them in the real-world enterprise. Depending on what survey you are looking at, how you define the technologies, and what questions you ask, enterprise organizations' adoption of advanced analytics, machine learning, and AI varies quite a bit. But the technologies have captured the attention of both the IT pros in the trenches and the top enterprise executives who recognize their promise for everything from cutting costs, to increasing revenue, to accelerating innovation and improving competitiveness in the market.


The Four Benefits of Data Mining: Google's Side of the Data Story

#artificialintelligence

Whenever you use a free application, website, or service, the companies behind it gain large amounts of information about you and then package you with other users of similar ages and interests to be sold to advertisers. This process, called data mining, is how Google generated a staggering $134.81 billion in advertising revenue in 2019 alone. With advertising accounting for over 70% of Google's revenue, the company has no option but to try to convince us that we should not only tolerate its data collection and mining but accept it, because of its many advantages. Your phone is your personal assistant, and the more information about you it gets fed, the more things it can do for you. Would you care that your data is being collected if Google could use it to make things easier for you?


Gartner Identifies Top 10 Data and Analytics Technology Trends for 2020

#artificialintelligence

Gartner, Inc. identified the top 10 data and analytics (D&A) technology trends for 2020 that can help data and analytics leaders navigate their COVID-19 response and recovery and prepare for a post-pandemic reset. "To innovate their way beyond a post-COVID-19 world, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to succeed in the face of unprecedented market shifts," said Rita Sallam, distinguished research vice president at Gartner. By the end of 2024, 75% of organizations will shift from piloting to operationalizing artificial intelligence (AI), driving a fivefold increase in streaming data and analytics infrastructures. Within the current pandemic context, AI techniques such as machine learning (ML), optimization and natural language processing (NLP) are providing vital insights and predictions about the spread of the virus and the effectiveness and impact of countermeasures. Other smarter AI techniques such as reinforcement learning and distributed learning are creating more adaptable and flexible systems to handle complex business situations; for example, agent-based systems that model and simulate complex systems.


Top 10 Data and Analytics Technology Trends for 2020 - IntelligentHQ

#artificialintelligence

Gartner, Inc. identified the top 10 data and analytics (D&A) technology trends for 2020 that can help data and analytics leaders navigate their COVID-19 response and recovery and prepare for a post-pandemic reset. "To innovate their way beyond a post-COVID-19 world, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to succeed in the face of unprecedented market shifts," said Rita Sallam, distinguished research vice president at Gartner. By the end of 2024, 75% of organizations will shift from piloting to operationalizing artificial intelligence (AI), driving a fivefold increase in streaming data and analytics infrastructures. Within the current pandemic context, AI techniques such as machine learning (ML), optimization and natural language processing (NLP) are providing vital insights and predictions about the spread of the virus and the effectiveness and impact of countermeasures. Other smarter AI techniques such as reinforcement learning and distributed learning are creating more adaptable and flexible systems to handle complex business situations; for example, agent-based systems that model and simulate complex systems. Dynamic data stories with more automated and consumerized experiences will replace visual, point-and-click authoring and exploration. As a result, the amount of time users spend using predefined dashboards will decline.