peirce
Moving Pictures of Thought: Extracting Visual Knowledge in Charles S. Peirce's Manuscripts with Vision-Language Models
Pedretti, Carlo Teo, Picca, Davide, Rodighiero, Dario
Diagrams are crucial yet underexplored tools in many disciplines, demonstrating the close connection between visual representation and scholarly reasoning. However, their iconic form poses obstacles to visual studies, intermedial analysis, and text-based digital workflows. In particular, Charles S. Peirce consistently advocated the use of diagrams as essential for reasoning and explanation. His manuscripts, often combining textual content with complex visual artifacts, provide a challenging case for studying documents involving heterogeneous materials. In this preliminary study, we investigate whether Vision-Language Models (VLMs) can effectively help us identify and interpret such hybrid pages in context. First, we propose a workflow that (i) segments manuscript page layouts, (ii) reconnects each segment to IIIF-compliant annotations, and (iii) submits fragments containing diagrams to a VLM. In addition, by adopting Peirce's semiotic framework, we design prompts to extract key knowledge about diagrams and produce concise captions. Finally, we integrate these captions into knowledge graphs, enabling structured representations of diagrammatic content within composite sources.
- North America > United States > Illinois > Cook County > Chicago (0.05)
- Europe > Switzerland > Vaud > Lausanne (0.04)
- Europe > Italy > Lazio > Rome (0.04)
- (6 more...)
- Workflow (0.88)
- Research Report (0.85)
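The three workflow steps in the abstract above (segment, annotate via IIIF, caption and graph) can be sketched in a few lines. Everything below is illustrative: the function bodies are stand-ins, and the IIIF target URL, page identifier, and predicate names are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the segment -> IIIF annotation -> VLM caption -> KG pipeline.

def segment_page(page_id):
    """Stand-in for a layout-analysis step: returns bounding-box regions."""
    return [{"region": (120, 340, 400, 260), "kind": "diagram"}]

def to_iiif_annotation(page_id, seg):
    """Reconnect a segment to an IIIF-style annotation via an xywh fragment."""
    x, y, w, h = seg["region"]
    return {
        "type": "Annotation",
        "target": f"https://example.org/iiif/{page_id}/canvas#xywh={x},{y},{w},{h}",
        "motivation": "describing",
    }

def caption_with_vlm(annotation):
    """Stand-in for the VLM call that produces a concise caption."""
    return "Existential graph sketch with two nested cuts."

def to_triples(page_id, annotation, caption):
    """Integrate the caption into a knowledge graph as simple triples."""
    frag = annotation["target"]
    return [
        (frag, "partOf", page_id),
        (frag, "depicts", "diagram"),
        (frag, "caption", caption),
    ]

page = "MS-514-p3"  # hypothetical manuscript page identifier
annos = [to_iiif_annotation(page, s) for s in segment_page(page)]
graph = [t for a in annos for t in to_triples(page, a, caption_with_vlm(a))]
```

The xywh media fragment in the annotation target is what lets each caption stay anchored to its exact region on the IIIF canvas.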
The meaning of prompts and the prompts of meaning: Semiotic reflections and modelling
Thellefsen, Martin, Dewi, Amalia Nurma, Sorensen, Bent
This paper explores prompts and prompting in large language models (LLMs) as dynamic semiotic phenomena, drawing on Peirce's triadic model of signs, his nine sign types, and the Dynacom model of communication. The aim is to reconceptualize prompting not as a technical input mechanism but as a communicative and epistemic act involving an iterative process of sign formation, interpretation, and refinement. The theoretical foundation rests on Peirce's semiotics, particularly the interplay between representamen, object, and interpretant, and the typological richness of signs: qualisign, sinsign, legisign; icon, index, symbol; rheme, dicent, argument - alongside the interpretant triad captured in the Dynacom model. Analytically, the paper positions the LLM as a semiotic resource that generates interpretants in response to user prompts, thereby participating in meaning-making within shared universes of discourse. The findings suggest that prompting is a semiotic and communicative process that redefines how knowledge is organized, searched, interpreted, and co-constructed in digital environments. This perspective invites a reimagining of the theoretical and methodological foundations of knowledge organization and information seeking in the age of computational semiosis.
- North America > United States > Indiana (0.04)
- Europe > Denmark > Capital Region > Copenhagen (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- (2 more...)
Convergence to the Truth
The epistemology of scientific inference has a rich history. According to the explanationist tradition, theory choice should be guided by a theory's overall balance of explanatory virtues, such as simplicity, fit with data, and/or unification (Russell 1912). The instrumentalist tradition urges, instead, that scientific inference should be driven by the goal of obtaining useful models, rather than true theories or even approximately true ones (Duhem 1906). A third tradition is Bayesianism, which features a shift of focus from all-or-nothing beliefs to degrees of belief (Bayes 1763). It may be fair to say that these traditions are the big three in contemporary epistemology of scientific inference. There is, in fact, a fourth tradition.
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.05)
- North America > United States > Illinois > Cook County > Chicago (0.04)
- North America > United States > California > Yolo County > Davis (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
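The Bayesian shift to "degrees of belief" mentioned in the abstract above, and its connection to convergence to the truth, can be made concrete with a toy example. This is purely illustrative (a conjugate Beta-Bernoulli update on a biased coin, not anything from the paper): as observations accumulate, the posterior mean converges toward the true parameter.

```python
# Toy illustration of degrees of belief converging to the truth:
# Beta-Bernoulli updating on coin flips with an unknown bias.
import random

random.seed(0)
true_p = 0.7            # true bias of the coin, unknown to the agent
alpha, beta = 1.0, 1.0  # Beta(1, 1): a uniform prior over the bias

for _ in range(2000):   # observe flips and update the degrees of belief
    heads = random.random() < true_p
    alpha += heads       # True counts as 1, False as 0
    beta += not heads

posterior_mean = alpha / (alpha + beta)  # converges toward true_p
```

With 2000 observations the posterior mean lands close to the true bias of 0.7, which is the convergence behavior the Bayesian tradition appeals to.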
Peirce in the Machine: How Mixture of Experts Models Perform Hypothesis Construction
Mixture of experts is a prediction aggregation method in machine learning that combines the predictions of specialized experts. This method often outperforms Bayesian methods despite the Bayesian having stronger inductive guarantees. We argue that this is due to the greater functional capacity of mixture of experts. We prove that, in a limiting case, mixture of experts will have greater capacity than equivalent Bayesian methods, which we confirm through experiments on non-limiting cases. Finally, we conclude that mixture of experts is a type of abductive reasoning in the Peircean sense of hypothesis construction.
- Asia > Middle East > Jordan (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (1.00)
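The aggregation scheme named in the abstract above can be sketched in miniature: a gate assigns input-dependent weights to specialist predictors, and the mixture is their weighted sum. The experts and gate below are hand-written toys chosen for illustration, not the paper's learned models.

```python
# Minimal mixture-of-experts sketch: a softmax gate weights two toy experts.
import math

def expert_low(x):   # specialist intended for small inputs
    return 2.0 * x

def expert_high(x):  # specialist intended for large inputs
    return x ** 2

def gate(x):
    """Softmax over hand-picked scores: confidence in each expert given x."""
    scores = [-x, x]                        # favors expert_low when x is small
    m = max(scores)                         # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def mixture(x):
    """Aggregate the experts' predictions with the gate's weights."""
    weights = gate(x)
    preds = [expert_low(x), expert_high(x)]
    return sum(w * p for w, p in zip(weights, preds))
```

At x = 0 the gate is indifferent and both experts agree; for large x the gate routes nearly all weight to the quadratic expert, so the mixture tracks it almost exactly.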
A novel framework for systematic propositional formula simplification based on existential graphs
de Mas, Jordina Francès, Bowles, Juliana
This paper presents a novel simplification calculus for propositional logic derived from Peirce's existential graphs' rules of inference and implication graphs. Our rules can be applied to propositional logic formulae in nested form, are equivalence-preserving, guarantee a monotonically decreasing number of variables, clauses and literals, and maximise the preservation of structural problem information. Our techniques can also be seen as higher-level SAT preprocessing, and we show how one of our rules (TWSR) generalises and streamlines most of the known equivalence-preserving SAT preprocessing methods. In addition, we propose a simplification procedure based on the systematic application of two of our rules (EPR and TWSR) which is solver-agnostic and can be used to simplify large Boolean satisfiability problems and propositional formulae in arbitrary form, and we provide a formal analysis of its algorithmic complexity in terms of space and time. Finally, we show how our rules can be further extended with a novel n-ary implication graph to capture all known equivalence-preserving preprocessing procedures.
- Europe > France (0.05)
- Europe > United Kingdom > Scotland > Fife > St. Andrews (0.04)
- Europe > Portugal > Lisbon > Lisbon (0.04)
- (16 more...)
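One of the known equivalence-preserving SAT preprocessing methods that the abstract above says TWSR generalises is subsumption elimination. The sketch below shows that standard method, not TWSR itself: if clause C is a subset of clause D, then C ∧ D ≡ C, so D can be dropped while monotonically decreasing the number of clauses and literals.

```python
# Subsumption elimination, a standard equivalence-preserving SAT
# preprocessing step (illustrative; this is not the paper's TWSR rule).
# Clauses are lists of signed integers in the DIMACS convention,
# e.g. [1, -2] means (x1 OR NOT x2).

def subsumption_eliminate(cnf):
    """Drop every clause that is a proper superset of (or duplicates)
    another clause. Sorting by length first guarantees a potential
    subsumer is examined before any clause it subsumes."""
    clauses = sorted({frozenset(c) for c in cnf}, key=len)  # dedupe, short first
    kept = []
    for c in clauses:
        if not any(k <= c for k in kept):  # c is subsumed by a kept clause
            kept.append(c)
    return [sorted(c, key=abs) for c in kept]
```

For example, in (x1 ∨ x2) ∧ (x1 ∨ x2 ∨ x3) ∧ ¬x3 ∧ (¬x3 ∨ x4), the second and fourth clauses are subsumed and removed, leaving an equivalent formula with fewer clauses and literals.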
Revisiting C. S. Peirce's Experiment: 150 Years Later
An iconoclastic philosopher and polymath, Charles Sanders Peirce (1839-1914) is among the greatest of American minds. In 1872, Peirce conducted a series of experiments to determine the distribution of response times to an auditory stimulus, which is widely regarded as one of the most significant statistical investigations in the history of nineteenth-century American mathematical research (Stigler, 1978). On the 150th anniversary of this historic experiment, we look back at Peirce's view on empirical modeling through a modern statistical lens.
- North America > United States > New York > Nassau County > Garden City (0.04)
- North America > United States > Massachusetts > Middlesex County > Reading (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- (3 more...)
The Arc of the Data Scientific Universe
In this paper I explore the scaffolding of normative assumptions that supports Sabina Leonelli's implicit appeal to the values of epistemic integrity and the global public good that conjointly animate the ethos of responsible and sustainable data work in the context of COVID-19. Drawing primarily on the writings of sociologist Robert K. Merton, the thinkers of the Vienna Circle, and Charles Sanders Peirce, I make some of these assumptions explicit by telling a longer story about the evolution of social thinking about the normative structure of science from Merton's articulation of his well-known norms (those of universalism, communism, organized skepticism, and disinterestedness) to the present. I show that while Merton's norms and his intertwinement of these with the underlying mechanisms of democratic order provide us with an especially good starting point to explore and clarify the commitments and values of science, Leonelli's broader, more context-responsive, and more holistic vision of the epistemic integrity of data scientific understanding, and her discernment of the global and biospheric scope of its moral-practical reach, move beyond Merton's schema in ways that effectively draw upon important critiques. Stepping past Merton, I argue that a combination of situated universalism, methodological pluralism, strong objectivity, and unbounded communalism must guide the responsible and sustainable data work of the future.
- Europe > Austria > Vienna (0.26)
- North America > United States > Minnesota (0.04)
- North America > United States > Illinois > Cook County > Chicago (0.04)
- (10 more...)
- Social Sector (1.00)
- Law (1.00)
- Health & Medicine > Therapeutic Area > Infections and Infectious Diseases (0.66)
- Health & Medicine > Therapeutic Area > Immunology (0.66)
- Information Technology > Artificial Intelligence > Representation & Reasoning (1.00)
- Information Technology > Artificial Intelligence > Natural Language (0.93)
- Information Technology > Artificial Intelligence > Machine Learning (0.93)
- Information Technology > Artificial Intelligence > Issues > Social & Ethical Issues (0.46)
The Meaning of Causality
We use the word causality as a means of understanding cognition, but we don't really understand its distinctions. Let's look at what C. S. Peirce had to say about causality. What @yudapearl says is that to understand a system one needs to hypothesize a model of the system and then see how well that model is in agreement with observation. Statistics is just one of the methods of testing. But it's not how one formulates the original model.
Peirce's Semiotics and General Intelligence
There is a natural evolution from the ideas that deep learning has empirically revealed to a theory of general intelligence. A common criticism of deep learning is its lack of good theory. Deep learning is like the supercolliders of high-energy physics: it reveals the inner behavior of an artificial intuitive process, showing us patterns of what does work. To build up that theory we must walk back into the ideas of past thinkers, thinkers who never saw this empirical evidence. What would they have concluded about their ideas had they been exposed to the evidence from deep learning?
Why AI Geniuses Haven't Created True Thinking Machines
As we saw yesterday, artificial intelligence (AI) has enjoyed a string of unbroken successes against humans. But these are successes in games where the map is the territory. That fact hints at the problem tech philosopher and futurist George Gilder raises in Gaming AI (free download here). Whether all human activities can be treated that way successfully is an entirely different question. As Gilder puts it, "AI is a system built on the foundations of computer logic, and when Silicon Valley's AI theorists push the logic of their case to a 'singularity,' they defy the most crucial findings of twentieth-century mathematics and computer science." Here is one of the crucial findings they defy (or ignore): Philosopher Charles Sanders Peirce (1839–1914) pointed out that, generally, mental activity comes in threes, not twos (so he called it triadic).
- Information Technology (0.50)
- Banking & Finance > Trading (0.30)
- Information Technology > Artificial Intelligence > Issues > Turing's Test (0.42)
- Information Technology > Artificial Intelligence > Issues > Philosophy (0.42)