unemployment


'The search is soul-destroying': Young jobseekers on the struggle to find work

BBC News

Young people are bearing the brunt of the UK's weak labour market, according to new figures from the Office for National Statistics (ONS). Some 16.1% of people aged 16 to 24 are unable to find work, compared with a national unemployment figure of 5.1%. That does not include young people who are out of work but not looking for a job, whether because of ill health or because they are still studying. Businesses, particularly in sectors that traditionally gave young people their first jobs, such as retail and hospitality, say higher costs are leading them to cut staff or hold off on new hires, which often hits young workers the hardest. But graduate-level roles are also proving harder to land.


grangersearch: An R Package for Exhaustive Granger Causality Testing with Tidyverse Integration

Korfiatis, Nikolaos

arXiv.org Machine Learning

Understanding causal relationships between time series variables is a fundamental problem in economics, finance, neuroscience, and many other fields. While true causality is philosophically complex and difficult to establish from observational data alone, Granger (1969) proposed a practical, testable notion of causality based on predictability: a variable X is said to "Granger-cause" another variable Y if past values of X contain information that helps predict Y beyond what is contained in past values of Y alone. Granger causality testing has found applications across diverse domains. In macroeconomics, Sims (1972) famously applied the technique to study money-income relationships, while Kraft and Kraft (1978) pioneered its use in energy economics. Financial market researchers including Hiemstra and Jones (1994) have extended the methodology to study price-volume dynamics, and neuroscientists have adapted Granger causality for brain connectivity analysis (Seth, Barrett, and Barnett 2015). The statistical foundations rest on vector autoregressive (VAR) models (Sims 1980), with comprehensive treatments available in Lütkepohl (2005) and discussions of causal interpretation in Peters, Janzing, and Schölkopf (2017). Despite its popularity, implementing Granger causality tests in R (R Core Team 2024) remains cumbersome for applied researchers.


Left Leaning Models: How AI Evaluates Economic Policy?

Chupilkin, Maxim

arXiv.org Artificial Intelligence

Would artificial intelligence (AI) cut interest rates or adopt conservative monetary policy? Would it deregulate or opt for a more controlled economy? As AI use by economic policymakers, academics, and market participants grows exponentially, it is becoming critical to understand AI preferences over economic policy. However, these preferences are not yet systematically evaluated and remain a black box. This paper runs a conjoint experiment on leading large language models (LLMs) from OpenAI, Anthropic, and Google, asking them to evaluate economic policy under multi-factor constraints. The results are remarkably consistent across models: most LLMs exhibit a strong preference for high growth, low unemployment, and low inequality over traditional macroeconomic concerns such as low inflation and low public debt. Scenario-specific experiments show that LLMs are sensitive to context but still display strong preferences for low unemployment and low inequality even in monetary-policy settings. Numerical sensitivity tests reveal intuitive responses to quantitative changes but also uncover non-linear patterns such as loss aversion.


Exclusive: AI Could Double U.S. Labor Productivity Growth, Anthropic Study Finds

TIME - Tech

By how much, if at all, will AI boost the U.S. economy? New research by Anthropic, seen exclusively by TIME in advance of its release today, offers at least a partial answer to that question. By studying aggregated data about how people use Claude in the course of their work, Anthropic researchers came up with an estimate for how much AI could contribute to annual labor productivity growth--an important contributor to the total level of growth in the overall economy--as the technology becomes more widely used. Their answer: current-generation AI models could increase the U.S. annual labor productivity growth rate by 1.8%--doubling the average rate of growth since 2019. Assuming that labor makes up 60% of total productivity in the economy, and that AI reaches full diffusion in a decade's time, "this implies an overall total factor productivity increase of 1.1% per year," the researchers write.
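The researchers' back-of-the-envelope step from labor productivity to total factor productivity can be checked directly (the 60% labor share is the article's stated assumption):

```python
labor_productivity_gain = 1.8   # % per year, Anthropic's estimated boost
labor_share = 0.60              # assumed share of labor in total productivity

# Scale the labor productivity gain by labor's share of the economy
tfp_gain = labor_share * labor_productivity_gain
print(round(tfp_gain, 2))  # 1.08, which rounds to the ~1.1% per year quoted
```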


The AI job cuts are here - or are they?

BBC News

The AI job cuts are here - or are they? Amazon's move this week to slash thousands of corporate jobs fed into a longstanding anxiety: that artificial intelligence is starting to replace workers. The tech giant joined a growing list of companies in the US that have pointed to AI technology as a reason behind layoffs. But some question whether AI is fully to blame - and have voiced scepticism that recent high-profile layoffs are a telling sign of the technology's effect on employment. Chegg, the online education firm, cited the new realities of AI as it announced a 45% reduction in its workforce on Monday.


Methodological Insights into Structural Causal Modelling and Uncertainty-Aware Forecasting for Economic Indicators

Cerutti, Federico

arXiv.org Artificial Intelligence

This paper presents a methodological approach to financial time series analysis by combining causal discovery and uncertainty-aware forecasting. As a case study, we focus on four key U.S. macroeconomic indicators -- GDP, economic growth, inflation, and unemployment -- and we apply the LPCMCI framework with Gaussian Process Distance Correlation (GPDC) to uncover dynamic causal relationships in quarterly data from 1970 to 2021. Our results reveal a robust unidirectional causal link from economic growth to GDP and highlight the limited connectivity of inflation, suggesting the influence of latent factors. Unemployment exhibits strong autoregressive dependence, motivating its use as a case study for probabilistic forecasting. Leveraging the Chronos framework, a large language model trained for time series, we perform zero-shot predictions on unemployment. This approach delivers accurate forecasts one and two quarters ahead, without requiring task-specific training. Crucially, the model's uncertainty-aware predictions yield 90% confidence intervals, enabling effective anomaly detection through statistically principled deviation analysis. This study demonstrates the value of combining causal structure learning with probabilistic language models to inform economic policy and enhance forecasting robustness.
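The interval-based anomaly rule the abstract describes can be sketched in a few lines: take quantiles of probabilistic forecast draws and flag observations outside the 90% band. The samples here are synthetic stand-ins for Chronos sample paths, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for probabilistic forecast draws (e.g. sample paths from a
# Chronos-style model forecasting an unemployment rate near 4%)
forecast_samples = rng.normal(loc=4.0, scale=0.3, size=1000)

# 90% interval from the 5th and 95th percentiles of the draws
lo, hi = np.quantile(forecast_samples, [0.05, 0.95])

def is_anomaly(observed, lo=lo, hi=hi):
    # Flag observations falling outside the forecast's 90% interval
    return observed < lo or observed > hi
```

A realized value near the forecast center (say 4.1) passes, while a large deviation (say 6.5) is flagged -- the "statistically principled deviation analysis" in the abstract.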


Geometric Dynamics of Consumer Credit Cycles: A Multivector-based Linear-Attention Framework for Explanatory Economic Analysis

Sudjianto, Agus, Setiawan, Sandi

arXiv.org Artificial Intelligence

Understanding the dynamics of consumer credit cycles requires analyzing a complex web of interconnected economic relationships. When unemployment rises, consumers typically reduce spending and increase precautionary savings, while simultaneously facing greater difficulty servicing existing debt obligations. This creates pressure on revolving credit balances as households may increase borrowing to maintain consumption levels, ultimately leading to higher default rates. However, the timing, magnitude, and interaction patterns of these relationships vary dramatically across different economic cycles, creating fundamentally different crisis mechanisms that correlation-based analysis cannot distinguish. Consider the 2008 financial crisis versus the 1990-91 recession.


Investigating the importance of social vulnerability in opioid-related mortality across the United States

Deas, Andrew, Spannaus, Adam, Maguire, Dakotah D., Trafton, Jodie, Kapadia, Anuj J., Maroulas, Vasileios

arXiv.org Artificial Intelligence

The opioid crisis remains a critical public health challenge in the United States. Despite national efforts which reduced opioid prescribing rates by nearly 45% between 2011 and 2021, opioid overdose deaths more than tripled during this same period. Such alarming trends raise important questions about what underlying social factors may be driving opioid misuse. Using county-level data across the United States, this study begins with a preliminary data analysis of how the rates of thirteen social vulnerability index (SVI) variables manifest in counties with both anomalously high and low mortality rates, identifying patterns that warrant further investigation. Building on these findings, we further investigate the importance of the thirteen SVI variables within a machine learning framework by employing two predictive models: XGBoost and a modified autoencoder. Both models take the thirteen SVI variables as input and predict county-level opioid-related mortality rates. This allows us to leverage two distinct feature importance metrics: information gain for XGBoost and a Shapley gradient explainer for the autoencoder. These metrics offer two unique insights into the most important SVI factors in relation to opioid-related mortality. By identifying the variables which consistently rank as most important, this study highlights key social vulnerability factors that may play critical roles in the opioid crisis.


AI may displace 3m jobs but long-term losses 'relatively modest', says thinktank

The Guardian

Artificial intelligence could displace between 1m and 3m private sector jobs in the UK, though the ultimate rise in unemployment will be in the low hundreds of thousands as growth in the technology also creates new roles, according to Tony Blair's thinktank. Between 60,000 and 275,000 jobs will be displaced every year over a couple of decades at the peak of the disruption, estimates from the Tony Blair Institute (TBI) suggest. It described the figure as "relatively modest" given the average number of job losses in the UK has run at about 450,000 a year over the past decade. More than 33 million people are employed in the UK. AI, a technology that can be loosely defined as computer systems performing tasks that typically require human intelligence, has shot up the political agenda after the emergence of the ChatGPT chatbot and other breakthroughs in the field.


The Download: direct-air-capture plants, and measuring body fat

MIT Technology Review

It was 1938, and the pain of the Great Depression was still very real. Unemployment in the US was around 20%. New machinery was transforming factories and farms, and everyone was worried about jobs. Were the impressive technological achievements that were making life easier for many also destroying jobs and wreaking havoc on the economy? To make sense of it all, Karl T. Compton, the president of MIT from 1930 to 1948 and one of the leading scientists of the day, wrote in the December 1938 issue of this publication about the "Bogey of Technological Unemployment." His essay concisely framed the debate over jobs and technical progress in a way that remains relevant, especially given today's fears over the impact of artificial intelligence.