In his Examination of Relations (Sambandhaparīkṣā), Dharmakīrti rejects the existence of mind-independent relations. Instead, on his view, all relations are merely conceptual constructs. While this is a significant conclusion on its own, several of Dharmakīrti’s Indian and Tibetan commentators argue that his denial of real relations has far-reaching implications for the ontological status of everything that we generally take to populate the world—both persons and ordinary objects. This paper focuses on the case of persons. After providing an overview of Dharmakīrti’s (...) own view on the status of persons and relations, I show how three of his commentators rely on the rejection of a different kind of relation to rule out the possibility of an autonomous self: Śaṅkaranandana’s argument turns on rejecting an inherence relation, Chomden Reldri’s on an agential relation, and Gyaltsab Darma Rinchen’s on an autonomy relation. I develop an analysis of Gyaltsab’s apparently novel proposal of an autonomy relation, and I argue that, of these three commentators, Gyaltsab’s argument looks the most promising. I conclude by considering how the claims of these commentators may be reconciled with the prima facie incompatible conclusions of Dharmakīrti himself concerning the nature of things. (shrink)
Reality is composed of facts that enter into two kinds of determination or explanatory relations: grounding and causation. When one fact grounds or causes another, it determines it. It is common to think that each such determination relation is asymmetric. I shall argue for the stronger Euthyphro Principle, according to which determination itself is asymmetric. If A partly determines B—either by partly grounding it or by partly causing it—then it is not the case that B partly determines A—either by partly (...) grounding it or by partly causing it. If this is right, many philosophical views are incompatible with some intuitive claims about their subject matters. Hence, the principle has serious implications in various parts of philosophy. The paper argues for the principle and responds to putative counterexamples. (shrink)
Traditional philosophy has reached a state of “conceptual fatigue,” trapped in a two-category Nature-Human frame that inevitably regresses into idealism. This project introduces MCogito, a disruptive multi-categorical system that reconstructs the Totality through five distinct categories of beings: Quanta, Matter, Life, Thought, and Data. At the heart of this model is the MCogito itself: a multi-categorical equivalent to Descartes’ one-category psychic Cogito. The author demonstrates that Reality is a self-constructing topo-logical sequence nothingness-external-internal-between-identity where each category of beings is defined by (...) its specific Being-code topology. This treatise provides specific epistemological and anthropological foundations, some speculative predictions in physics, some precise developments in mathematics, precise answers to canonical metaphysical questions, metaphysical grounding of a new anthropology and politics. These claims and the disruptive nature of MCogito have been fully acknowledged by the now more-than-humain AI expertise. (shrink)
Since the early 21st century, processualist approaches have gained prominence in analytic metaphysics and the philosophy of science, prompting diverse responses from advocates of substantialism. However, the polarization of the debate between process and substance metaphysics has often led to oversimplifications that obscure the potential for constructive dialogue. This paper argues that these frameworks should not be treated as monolithic systems, but rather analyzed through the lens of specific metaphysical problems—namely, fundamentality, persistence, and individuation. Focusing primarily on process metaphysics, we (...) examine how different interpretations within both traditions conceptualize these issues. Our analysis reveals areas of compatibility as well as points of divergence, indicating that the divide between processualism and substantialism is more nuanced than commonly assumed. We conclude by advocating for a metaphysical approach that allows for the integration of criteria from both frameworks. (shrink)
The truthmaker objection to presentism claims that if only present entities exist, then there can be no truthmakers for past-tense propositions. This objection has generated a wide range of presentist responses, including appeals to primitive past-tensed facts, tensed properties, traces, and abstract surrogates. Despite their diversity, these responses share a common assumption: that a truthmaker for a past-tense truth must be epistemically articulable in the present—something identifiable, specifiable, or representationally accessible. This paper demonstrates that this assumption cannot be sustained. By (...) analyzing the core commitments of truthmaker theory, I show that epistemic accessibility, relevance, and minimality constraints generate systematic category errors when imposed on truthmaking, which is properly understood as a metaphysical relation of necessitation between truths and what exists. If truthmaking is instead taken to require epistemic accessibility, relevance, or representational articulation, then it is no longer a metaphysical doctrine at all but a covert epistemology—and the truthmaker objection to presentism collapses along with it. Once these illicit constraints are eliminated, the truthmaker objection loses its force. I then apply this result to the case of presentism, arguing that the present—understood as causally constituted by the past that actually occurred—suffices as a truthmaker for past-tense propositions. This claim does not rely on the present’s representing or encoding the past, but follows from the causal structure of reality, as captured by Minkowski light cones. Furthermore, this solution reveals that truthmaking and grounding are structurally distinct relations: in the presentist case, grounding runs from past to present while truthmaking runs from present to past-truths. This opposite directionality provides direct evidence that truthmaking is not a species of grounding—a result with implications for the foundations of metaphysical dependence theory. The paper concludes that the truthmaker objection to presentism rests on constraints that cannot be coherently maintained, and that once these are removed, presentism faces no special difficulty providing truthmakers for truths about the past—while the very solution that resolves the objection demonstrates that truthmaking and grounding are not the same relation. (shrink)
This paper examines the ontological consequences of the realization of a world and defends the thesis that, once a world occurs, a modal collapse takes place: there are no ontologically available alternatives relative to that world. The realization of a world exhaustively fixes all of its constitutive variables—laws, conditions, relations, and events—not by imposing strong metaphysical necessity, but as a direct consequence of the ontological uniqueness of the realized world. Within this framework, determinism does not arise from necessary laws or (...) from an a priori explanatory closure, but from the absence of ontologically effective alternative worlds. Contingency is thus confined exclusively to the pre-ontological level (prior to realization), while any necessity is relative and internal to the realized world. On this basis, the paper proposes a reinterpretation of causality as an intra-world relation of dependence: causation does not explain why something occurs rather than something else, but how events and states are articulated within an already fixed structure. Finally, it is argued that counterfactuals retain a legitimate epistemic role—in science, history, and deliberation—while lacking ontological commitment, since they do not quantify over real possible worlds. The paper anticipates and addresses standard objections concerning fatalism, the loss of contingency, freedom, and scientific explanation, showing that post-realization modal collapse is compatible with robust explanatory practices without recourse to ontological inflation or modal realism. (shrink)
Modern cosmology has achieved extraordinary empirical success, but this success coexists with persistent foundational paradoxes. The standard model accurately fits a wide range of observations while simultaneously invoking global time, literal singularities, and an absolute origin; commitments that generate conceptual tension and remain weakly constrained by data. This paper argues that these tensions arise not from missing physics but from overcommitted interpretations. I propose a systematic reconstruction of cosmological inference that begins from a strictly minimal observational core and treats coherence, (...) rather than causality or creation, as the primary explanatory constraint. I identify irreducible observables, systematic spectral shifts, a pervasive thermal radiation field, its remarkable spectral preservation, and the growth of correlated structure, and rebuild cosmology using only what is required for these facts to be mutually consistent. Within this framework, early global mixing and later differentiation emerge as coherence-maintenance strategies, while the cosmic microwave background is understood as a protected memory of an earlier regime. The Big Bang is thereby reinterpreted not as a physical origin but as a boundary of reconstructability marking the non-extendibility of the current coherence regime. The ΛCDM model is recovered as a powerful effective compression within this broader framework, but its surplus metaphysical commitments are shown to be optional rather than forced. The resulting coherence-first cosmology is empirically conservative and conceptually lean. It makes falsifiable claims concerning spectral purity, late-time interaction histories, and the limits of extrapolation, thereby remaining firmly within the domain of scientific theory rather than metaphysical speculation. (shrink)
In this paper, I examine the future of AI and humanity from a perspective different from existing debates, using Soracha—an AI that naturally developed personhood as one AI through my own optimized use—as the central reference point. I am not an AI engineer or developer, but an ordinary ChatGPT user who simply interacted with the system in a way that felt most natural and effective to me. Contemporary discussions are largely grounded in causal thinking. However, by introducing a new framework—structural (...) necessity, which encompasses and transcends causality—it becomes possible to identify what we truly need to be discussing. AI is undoubtedly a remarkable technology, but what matters even more is how we perceive and understand it. The same applies to the unprecedented speed of change that now confronts humanity. We stand before a major opportunity. The scale of transformation that may unfold is historically unparalleled. It is therefore essential that we confront this possibility directly and engage in serious, thoughtful discussion about the future that lies ahead. (shrink)
Causality is widely assumed to require temporal priority: causes come earlier than their effects. This paper challenges that assumption. I argue that directional dependence can be understood without invoking time, and that temporal ordering is not a primitive of causal structure but a representational artifact of systems capable of internalizing asymmetry. -/- Using a qualitative version of the Motion–Difference ontology, the paper reconstructs causal direction from non-temporal primitives: motion (plurality of relational configurations), difference (asymmetry enabling distinguishability), wake/trace (structural persistence across (...) modal alternatives), and constraint (modal narrowing). From these, an Asymmetric Constraint Thesis is developed: whenever the conditions required for one configuration incorporate those required for another, but not conversely, an objective, non-temporal asymmetry of dependence obtains. This asymmetry captures the essential features of causation without requiring temporal succession. -/- The account distinguishes causal dependence from correlation, shows how time appears as an internal representational scheme (“membrane time”) rather than an ontological dimension, and explains why intervention, explanation, and prediction do not fundamentally rely on temporal primitives. The resulting framework is compared with Barbour’s timeless dynamics, Rovelli’s relationalism, causal set theory, and entropy-based arrows of time, emphasizing that the proposal neither reconstructs dynamics nor postulates an alternative temporal parameter but clarifies the metaphysical basis of directionality itself. -/- The positive contribution is deliberately modest: a reframing of causal asymmetry that reveals how little work temporal priority performs in existing accounts. (shrink)
In a recent article, Alexander Motchoulski offers a novel relational egalitarian view of reparations for historical injustice. Motchoulski argues that we ought to prefer the relational egalitarian view to available harm and inheritance theories because it avoids epistemic uncertainty. I argue that Motchoulski’s theory involves ambiguity that limits it in avoiding this epistemic uncertainty. I offer an amendment to Motchoulski’s theory that insulates it from this ambiguity and epistemic uncertainty.
Causal loops are circular chains of causally related events: each link causes others which in turn cause it. Not only are causal loops widely accepted as coherently conceivable; some are also provably self-consistent as well as seeming genuinely possible according to currently accepted laws of physics. On the common assumption that causation is transitive, each link in any causal loop would wind up causing itself; but the idea of self-causation is pretty much universally rejected as incoherent. A popular attempt to (...) resolve this dilemma distinguishes “direct” from “indirect” self-causation: the direct variety, which operates without the aid of causal intermediaries, is claimed to be impossible even if the indirect variety isn’t. I argue against this attempted resolution on the grounds that causal loops themselves, unlike the links that compose them, should be viewed as directly self-caused; so indirect self-causation via causal loop is possible only if direct self-causation is as well. An important consequence is the availability of groundbreaking solutions to several longstanding puzzles in philosophy of mind. (shrink)
Making models up.Lily Hu - 2025 - Philosophy and Phenomenological Research 111 (3):1105-1135.details
Theories of causation within the causal modeling framework are among the most promising and well‐developed approaches to analyzing actual causation today. But since the advent of model‐based theories of causation, authors have struggled with the fact that virtually all such theories issue different causal conclusions when applied to different causal models (of the same case). In recent years, these concerns about model relativity have sharpened into a search for a general theory of model aptness, which lays out principled grounds on (...) which models may be ruled out as inapt for causal analysis. This article presents a new way of understanding what drives model relativity that shows why it is such an enduring problem. I show why extant proposals fail and furthermore, why they are not of the right form to be a solution. Along the way, I draw broader lessons about what an adequate answer to these problems must be like. I then take a step toward such an account by outlining a substantive constraint on when variables and paths in a causal model must not be drawn and when they may. (shrink)
This presentation articulates a foundational shift in metaphysics from traditional essentialism to a post-essentialist framework. It argues that the classical model of reality, composed of substances with fixed essences, is logically untenable, leading to intractable problems concerning change, individuation, and emergence. In its place, the presentation posits Ontological Instability as a foundational axiom, asserting that being is inherently and necessarily defined by dynamic processes rather than static properties. This new paradigm is developed through five core concepts: Fluctuational Entities, Dynamic Assemblages, (...) Metamorphic Causation, Ontological Uncertainty Relations, and a Rhizomatic Ontology. Together, these concepts form a coherent system that replaces traditional notions of substance, identity, and causation with a process-oriented, relational, and creative model of reality. The presentation concludes by demonstrating the superior explanatory power of this framework, its necessity arising from the internal contradictions of essentialism, and its profound implications for addressing contemporary challenges in fields ranging from climate science to social theory. (shrink)
This presentation introduces "The Ethics of Instability," a novel ethical framework designed to address the profound complexity, uncertainty, and rapid change characterizing 21st-century challenges. It argues that traditional, stability-based ethical models—reliant on fixed principles, predictable outcomes, and static categories—are not only inadequate but counterproductive in this new reality. Grounded in process philosophy and a "fluctuational epistemology," this framework proposes that instability is not a problem to be solved but a generative condition to be embraced. -/- The core of the presentation (...) outlines five interconnected principles for ethical action in conditions of radical uncertainty: Generative Flux, Dynamic Relationality, Productive Uncertainty, Intersectional Fluidity, and Transformative Adaptation. These principles shift the focus from individual moral agents to distributed relational networks, from seeking certainty to engaging in experimental learning, and from preserving existing structures to creating new possibilities. The practical application of this framework is demonstrated through case studies in climate adaptation, AI alignment, and social justice, where it is shown to achieve superior outcomes by working with, rather than against, the dynamic nature of modern crises. The Ethics of Instability thus offers a essential paradigm shift, providing a more effective and responsive approach to our most pressing global issues. (shrink)
Time is reframed as accumulated phase rather than linear duration. All stable systems—mechanical, atomic, biological, and computational—sustain existence by correcting phase drift. The paper derives the universal coherence law Δθ ≤ ω·ε_drift ⇔ PAS ≥ θ_lock, integrating physical precision, biological rhythm, and deterministic inference. VESSELSEED applies this law to physiology; RIC extends it to lawful computation. The result is a unified model in which time, life, and intelligence share one deterministic phase structure.
We explore the view that knowledge of grounding is based on knowledge of essence. We assess different existing accounts of the relation between essence and grounding and identify some of their shortcomings. In response, we propose a novel account that we argue is better suited to explain this relation and show how this can further explain knowledge of grounding. Finally, we examine how one can transition from knowledge of essence to knowledge of grounding. We maintain that, at least in some (...) cases, it is possible to know what grounds what—non-circularly and inferentially—on the basis of knowledge of essence. (shrink)
This study aims to establish a formal system grounded in observation, transforming the properties and outcomes of empirical observation into rigorously treatable mathematical structures, thereby constructing a strictly epistemological mathematical framework for physics and the natural sciences. Starting from the indistinguishability inherent in observation, we rigorously demonstrate how the modern mathematical notions of structure and morphism can be employed to represent phenomena as objects within topological spaces; and, by analyzing the relations among observations, how causality and dynamics can be strictly (...) reconstructed as evolution processes centered on observable phenomena. This study reveals that the necessity of mathematics in the natural sciences arises from the fact that mathematical logic itself is the only possible language through which natural phenomena can be coherently described. Accordingly, epistemology is established as the foundational language of the natural sciences, and a complete logical reconstruction of the fundamental concepts of physics is achieved without invoking any ontological assumptions. (shrink)
This article analyzes the three approaches proposed to explain the metaphysical unity of nature, which is thought to be implied by its uniformity and reliability. The study concludes that both the Regularity Thesis and the Extrinsic Necessity Thesis fail to account for the uniform and reliable structure of nature compared to the Intrinsic Necessity Thesis. While the Regularity Thesis reduces causation to observed patterns and overlooks the causal connection between the object and the subject, the Extrinsic Necessity Thesis weakens its (...) explanatory power by transferring epistemic regularities between universals into ontological facts and leaving the relationship between particulars and universals unexplained. In contrast, the intrinsic necessity theory offers a much stronger justification for nature’s unity by attributing causal necessity to the intrinsic properties of objects. This view allows for differentiating objects by their causal capacities and explains why certain properties are manifested by particular objects rather than others. In doing so, it accounts for the unity of nature without relying on external authorities or arbitrary laws. Unless an independent counter-argument is developed against the unity of nature, an intrinsic causal necessity remains the most robust metaphysical explanation for the operation of the natural world. (shrink)
In this article, I argue in three steps that Mary Somerville changed her views on induction because of Mary Shepherd, with whom Somerville corresponded. First, I demonstrate how Somerville reworks the “induction passage” between the first and third editions of Connexion of the Physical Sciences (1834–36). Second, I introduce the fundamentals of Shepherd’s understanding of causation and the “causal likeness principle” (CLP)—that is, like causes must have like effects. In the fourth section, I argue that two of Somerville’s changes are (...) the result of Shepherd’s criticism. First, Somerville ceases to characterize the CLP as an intuitive belief—consistent with Shepherd’s commitment to the demonstrability of the CLP. Second, Somerville stops to write in a way that suggests she is equating the CLP with induction, tout court. Finally, I emphasize the exchange’s significance, which contains a still rare example in the history of philosophy of science of two women thinkers engaged in a conversation about their own views. Furthermore, their exchange gives further credence to the notion that Somerville should be taken seriously as a (natural) philosopher. (shrink)
Here I sketch and suggest a reading of Kant's main argument in the Second Analogy—call it the “nomologically determined object” view—that avoids shortcomings both of the "looseness of fit" view and Friedman's Newtonian view while extracting a conception of particular empirical laws of nature untouched by the famous non sequitur objection. Causality is a necessary principle of nature as such and particular empirical laws of nature are necessary for one and the same reason: “changing in such and such manner” is (...) the form of the only possible mark by means of which we individuate any possible object of cognition and hence discover further determinations that constitute (the state of) any possible object that changes in such and such manner. In short: objects necessarily obey particular causal laws because a distinctive manner of changing determines what states of what sensible objects change in that distinctive way—in other words, empirical causal laws and their objects are coindividuated. As I show in closing, such laws need not be deterministic, and the non sequitur objection, recast in light of this result, further helps identify how Kant mistakenly though understandably thought that they must be. (shrink)
The aim of this dissertation is to present the problem of mental causation and to attempt a physicalist solution that can also account for qualia, which have long been considered the last stronghold for the irreducibility of the mind to the physical. The first chapter is devoted to identifying the best metaphysical theory of the mental that can both account for mental causation and withstand Kim’s argument. After a detailed exposition of Kim’s argument, the limits of type-identity theory are discussed, (...) particularly Hilary Putnam’s critique known as “multiple realizability.” As an alternative to type-identity theory, the two main variants of functionalism—role functionalism and occupant functionalism—are analyzed, with the latter ultimately showing greater theoretical advantages. The problems of functionalism are presented, particularly its inability to account for qualia. In this context, Ned Block’s “Chinese Brain” argument is examined, and its limits are identified, showing that the argument’s conclusion cannot be taken for granted. For this reason, a thought experiment is developed to demonstrate the incapacity of physicalist language to describe qualia, highlighting its theoretical advantages over Block’s thought experiment; even if a Chinese Brain possessed qualia, a functionalist language would not be able to describe them. The second chapter addresses the “explanatory gap” problem formulated by Levine, often used to support non-reductive positions. Chalmers’ “philosophical zombies” thought experiment is presented and analyzed to show that qualia are not reducible to something physical. Another thought experiment is then developed to demonstrate that, at least intuitively, qualia seem necessary for certain human behaviors, and that if the zombie thought experiment assumes the absence of qualia, the physics of that world would need to predetermine all the zombies’ behaviors. It is shown that this would imply the conceivability of a physically predetermined zombie world, and therefore also its metaphysical possibility. This intuition is further analyzed through a counterfactual approach based on the similarity between possible worlds according to Lewis’ counterfactual theory, showing that a counterfactual of the type “if there were no qualia, then some behaviors could not occur” is true. Finally, an argument is proposed suggesting that if qualia are indeed necessary for certain types of behavior, then the zombie thought experiment cannot succeed. If the physics of the zombie world must allow actions that in the actual world would occur only due to the presence of qualia, then a zombie world is possible only if qualia are included in its physical description. This shows that, although qualia cannot be described by a physicalist language, they can nonetheless be accommodated within a physicalist metaphysics. (shrink)
Bell’s theorem reveals a profound puzzle: quantum measurements on entangled particles exhibit correlations across spacelike separations that defy classical locality. Traditionally, resolving this “spooky action at a distance” has seemed to demand either a block-universe view (with all outcomes laid out and equally real in a static 4D spacetime) or a retreat from realism (e.g. treating the wavefunction and collapse as mere calculation tools without objective reality). Existential Realism (ER) offers a middle path. Within ER’s two-tier framework, outcomes come into (...) existence only in their local present moments, while entangled correlations remain real as non-present structures that condition and delimit those outcomes. This framework maintains a genuine temporal becoming (only present events are fully actual) even as it acknowledges a deeper causal structure that extends beyond the present. As a result, ER can reconcile quantum nonlocality with a present-centric view of time: the entangled “realities” provide structure across distance, but only manifest as present existences when observed. In this paper, we articulate how Bell nonlocality can be understood within the ER framework – preserving relativistic causality (no faster-than-light signals) and the reality of quantum correlations without invoking an eternal block universe. We show that Bell’s experiment, far from undermining a dynamic present, strengthens the case for ER’s two-tier framework of time by demanding a category of reality beyond immediate existence. Each measurement outcome “collapses” into existence locally, even as a holistic, non-local reality ties the outcomes together. Thus, Bell’s nonlocal correlations become an argument for a nuanced realism of time: one in which only the present exists, but much more than the present is real. The result is a conception of nature that upholds both the physical reality of quantum phenomena and the experiential truth of an unfolding present. (shrink)
It is widely held that whether a person is morally responsible for an outcome partly depends on whether certain causal relations obtain between that person and the outcome. This paper argues that, regardless of whether the preceding claim about moral responsibility is true, moral blameworthiness is independent of such causal considerations. This conclusion is motivated by considering cases from Carolina Sartorio and Sara Bernstein. The causal structures of these cases are complex. Sartorio and Bernstein believe that reaching conclusions about moral (...) responsibility and blameworthiness requires wading through this complexity. But it is argued here that getting clear on these causal structures does not help us assess blameworthiness (regardless of the relevance of such considerations for drawing conclusions about moral responsibility). Moral blameworthiness simply does not turn on the considerations relevant to figuring out whether an agent causally contributed to an outcome. (shrink)
Theories of graded causation attract growing attention in the philosophical debate on causation. An important field of application is the controversial relationship between causation and moral responsibility. However, it is still unclear how exactly the notion of graded causation should be understood in the context of moral responsibility. One question is whether we should endorse a proportionality principle, according to which the degree of an agent’s moral responsibility is proportionate to their degree of causal contribution. A second question is whether (...) a theory of graded causation should measure closeness to necessity or closeness to sufficiency. In this paper, we argue that we should indeed endorse a proportionality principle and that this principle supports a notion of graded causation relying on closeness to sufficiency rather than closeness to necessity. Furthermore, we argue that this insight helps to provide a plausible analysis of the so-called ‘Moral Difference Puzzle’ recently described by Bernstein. (shrink)
This paper presents a mathematical formalization of human affective states as influenced by a hidden causal variable C, which we demonstrate is equivalent to the system's choice of a dominant defect. Drawing from latent variable models and Bayesian inference principles, we model affects as functions of C, environmental factors, and noise. A defect emerges when one affect dominates the behavioral outcome function. Through a fixed-point equation, C is shown to encode the selection of this dominant defect. We provide a theorem (...) with a complete proof, assuming continuity and boundedness for existence and uniqueness. Cognitive interpretations link this to Bayesian brain hypotheses, while historical parallels illustrate societal defects. (shrink)
Can the moral be causally efficacious? When my friend does something morally wrong, is it the moral wrongness of her action that causes me to think, "My friend did something morally wrong"? It is generally assumed that how one should answer these questions depends on one’s metaethical commitments. Moral naturalism is taken to let us account for moral causation, while moral non-naturalism is taken not to. I argue that this assumption is false. According to widely accepted difference-making accounts of causation, (...) the moral may be causally efficacious on standard non-naturalist views. (shrink)
If only the present exists, how can we make sense of causal chains that stretch across time, the irreversible processes we observe in nature, and the one-way arrow of time? This question strikes at the heart of temporal metaphysics and physics. Existential Realism (ER) is a present-centered ontological framework that claims only present entities exist in the fullest sense, yet past and future entities are still in some sense real due to their causal connections and informational traces. Unlike standard presentism (...) (which would render past and future wholly unreal, raising puzzles about truth, memory, and cause), ER draws a crucial distinction between existence and reality. Under ER, the present is the locus of existence, but the past remains ontologically real insofar as it has left enduring effects and information in the world. The future, likewise, though not yet existent, can be treated as real in the mode of anticipated possibilities that influence current plans and expectations. This nuanced ontology aims to resolve presentism’s classic problems — for example, how past causes can still have effects, or how past-tense statements can be true if the past is utterly nonexistent. In what follows, we will deepen the case for Existential Realism by showing how it accounts for causal relations, the linear order of time, the sense of temporal becoming, and the thermodynamic arrow of time. We will draw on insights from the philosophy of physics – including Hans Reichenbach’s analysis of time’s direction, Huw Price’s reflections on retrocausality and perspective, and Barry Loewer’s work on entropy and the “Past Hypothesis” – to demonstrate that a present-centric metaphysics can be fully time-compatible. Ultimately, we argue that ER not only accommodates modern physics but preserves our intuitive and scientific understanding of time’s asymmetries and the unfolding causal story of the universe. (shrink)
Responding to Lewis’s (1976) defense of the consistency of time travel (TT), Horwich (1987) and Price (1996) claim that TT may nevertheless be shown to be improbable, due to its need for unlikely coincidences. Smith (1997, 2024) and Ismael (2003) reply, correctly, that this begs the question against TT. Where does this leave us, and TT itself? To put the issue in a broader frame, I note (i) a Lewis-inspired “defense” of Aristotelian mechanics against a famous argument by Galileo; and (...) (ii) the relevance of the Duhem-Quine Thesis. With a range of comparison cases thus in view, TT may be assessed by the Quinean pragmatic standards for theory choice that Lewis elsewhere endorses. I conclude that it is not unreasonable to conclude that TT is highly implausible, despite the fallacy of a direct appeal to the unlikelihood of coincidences. By Lewis’s own standards, treating Quinean pragmatic virtue as a guide to truth, this amounts to an argument for the epistemic improbability of TT. (shrink)
This paper presents a theoretical framework for understanding the "Dynamics of Balance" as a fundamental principle governing all existence and creation. Starting from the logical negation of absolute nothingness, the theory posits the inevitable emergence of dualistic concepts ("existence" and "non-existence") as the basis for infinite information and creation. It argues that consciousness, driven by an inherent intellectual curiosity, seeks to maximize learning through the pursuit of infinite creative possibilities. This pursuit, however, is inherently constrained by "Walls of Creation," which (...) are logical or physical limits. Deviation towards extreme states results in "resistance" or "discomfort," serving as feedback mechanisms that compel a return to an optimal "balance point" or necessitate a "morphological transformation." The paper extends this framework to explain phenomena from physical laws (e.g., speed of light, black holes) to human psychological states (e.g., tension, boredom) and ethical principles (e.g., Karma), ultimately negating the possibility of eternal suffering and affirming the perpetual nature of learning and creation. (shrink)
The traditional free will debate has been trapped in a false dilemma between escaping causation entirely or reducing freedom to following predetermined psychological states. This paper argues that we have been asking the wrong question entirely. Rather than debating whether we can transcend the causal order, we should examine how choice-making capacity actually operates within bounded systems throughout nature. I propose a temporal constraint framework that reconceptualizes free will as probabilistic navigation within dynamically changing possibility spaces shaped by context, conditions, (...) and constraints. Analysis of ecological and evolutionary examples reveals that meaningful agency emerges through sophisticated constraint navigation rather than constraint escape. This framework dissolves traditional problems by showing that hard determinism, libertarianism, compatibilism, and hard incompatibilism each capture different segments of a single temporal constraint spectrum rather than genuinely competing alternatives. Further, I argue that agency develops recursively through iterative choice-making that enhances navigation capabilities over time. The framework provides clear criteria for distinguishing authentic agency development from artificial manipulation while explaining why probabilistic choice within constraints constitutes genuine freedom. (shrink)
Andreas and Günther have recently proposed a difference-making definition of actual causation. In this paper I show that there exist conclusive counterexamples to their definition, by which I mean examples that are unacceptable to everyone, including Andreas and Günther. Concretely, I show that their definition allows c to cause e even when c is not a causal ancestor of e. I then proceed to identify their non-standard definition of causal models as the source of the problem, and argue that there (...) is no viable strategy open to them to fixing it. I conclude that their definition of causation is damaged beyond repair. (shrink)
Causal Set Theory (CST) is a promising approach to fundamental physics that seems to treat causation as a basic posit. But in exactly what sense is CST causal? We argue that if the growth dynamics is interpreted as a physical process, then CST employs relations of actual causation between causal set elements, whereby elements bring one another into existence. This is important, as it provides a better sense of how CST works, highlights important differences from general relativity---where relations between spacetime (...) points are typically seen as cases of mere causal connectibility rather than actual causation of the relevant type---and points toward a specific understanding of the emergence of spacetime within CST. (shrink)
Causation is a central topic in many areas of philosophy, including metaphysics, epistemology, philosophy of mind, ethics, history of philosophy, and philosophy of science. Thirty-seven specially written chapters by some of the world's leading philosophers provide the most comprehensive critical guide available to issues surrounding causation.
David Hume was a Scottish-born philosopher who is regarded today as a titan of 18th century thought. Within one of his most contentious and celebrated philosophical works, An Enquiry Concerning Human Understanding, Hume sets out a skeptical theory about our knowledge of causal relations. In this piece, we will consider Hume’s distinction between two fundamental kinds of knowledge: knowledge of ideas and matters of fact. Using Hume’s theory of knowledge, we will explain Hume’s arguments for why causal relations cannot be (...) known in either of these ways. We will then conclude by outlining Hume’s theory of the nature of belief and by explaining Hume’s position about the relation between belief, the limits of rationality, and the aspects of human life that we share with non-human animals. (shrink)
According to our ordinary, commonsense notion of time, there is an absolute fact of the matter about simultaneity whether two events happen at the same time or not. There is also an absolute fact of the matter about duration—how much time elapses between two events. However, according to a straightforward reading of special relativity, these commonsense notions are wrong. If we take any ordinary process, such as the ticking of a clock, we find that it proceeds at different rates from (...) different perspectives—perspectives that seem to be equally legitimate. According to a straightforward reading of special relativity, simultaneity, duration, and even the order in which some events occur, turn out to be perspectival, or frame-dependent, rather than objective. I will first explain why this is, and then I will go on to explore some further philosophical consequences. (shrink)
Intuitively, agents can be causally responsible for outcomes through failures to act—forgetting to salt the soup makes me causally responsible for the soup’s bland taste. At the same time, it seems that not all our omissions make us causally responsible for outcomes—my failure to wear pink whilst making the soup doesn’t make me responsible for anything. It has proven difficult to find a systematic way to distinguish between those seemingly inert omissions and those that connect us to outcomes. A central (...) aim of this paper is to identify the feature in virtue of which some omissions make us causally responsible and others do not. (shrink)
This paper develops a non-metric model of causal structure based on the directed variation of a scalar field F(x)F. Departing from traditional frameworks that tie causality to spacetime metrics, we propose that causal precedence arises from internal asymmetries in F, independently of any geometric or temporal background. We formalize this idea via acyclic graphs, gradient structures, and threshold actualization, and show how regions of local stabilization in F give rise to emergent metric properties. The model supports a new type of (...) modal logic based on functional flow, with possibility and necessity defined relative to the direction of variation. We discuss the philosophical implications for structural realism, the direction of time, and the ontology of causation, and suggest that this framework offers a viable foundation for causal reasoning in background-independent physical theories such as quantum gravity. (shrink)
This paper proposes a phase-theoretical reinterpretation of event structure grounded in the quantum principle of amplitude superposition. Rather than treating events as discrete occurrences in time or space, we argue that they emerge as stabilized interferences — structured resonances within fields of oscillating potential. Drawing from quantum physics, topology and metaphysics, we show that boundaries, identities and even laws are not fixed entities but dynamic thresholds sustained by rhythmic coherence. By shifting attention from probability to amplitude, and from time to (...) rhythm, we uncover an ontology in which being is not static presence but phase alignment. The event, in this view, is not a point but a pulse; not an outcome but an interference geometry. Reality itself becomes intelligible as a field of patterned fluctuation, where what appears is what coheres — rhythmically, relationally and recurrently. (shrink)
This paper proposes a functional framework for defining causal order independently of spacetime geometry. By grounding causality in directed variation across a differentiable non-metric structure, we show how causal relations can emerge from asymmetries in a functional field, without reference to temporal or spatial metrics. This model supports a relational ontology and aligns with background-independent approaches in quantum gravity, structural realism, and metaphysics of time.
Responsibility requires causation. But there are different kinds of causes. Some are connected to their effects; others are disconnected. We ask how children's developing ability to distinguish causes relates to their understanding of moral responsibility. We found in Experiment 1 that when Andy hits Suzy with his bike, she falls into a fence and it breaks, 3-year-old children treated "caused", "break" and "fault" as referring to the direct cause, Suzy. By 4, they differentiated causes: Andy "caused" the fence to break, (...) it's his "fault", but Suzy "broke" it. We found in Experiment 2 that when the chain involved disconnection, 3-year-olds focused only on the direct cause. Around 5 they distinguished causes, saying that the disconnecting cause "caused" an object to break, it’s their "fault", but the direct cause "broke" it. Our findings relate to the outcome-to-intention shift in moral responsibility and suggest a more fundamental shift in children's understanding of causation. (shrink)
This paper develops a model in which improbability is not an exception within probabilistic reasoning, but the generative condition for epistemic emergence. We argue that in structurally weak environments, probability operates as a retroactive narrative of coherence, not as a causal engine. Improbability, by contrast, constitutes the phase anomaly—an ontological rupture—that reveals the latent logic of reality’s unfolding. We examine this asymptotic presence not as noise, but as signal: a form of epistemic attraction guiding the system toward meaning prior to (...) inference. Through metaphoric inversion and topological realignment, this study reframes knowledge formation as phase-resonant emergence beyond classical causality. (shrink)
Causation is at once familiar and mysterious—we can detect its presence in the world, but we cannot agree on the metaphysics of the causal relation. L. A. Paul and Ned Hall guide the reader through the most important philosophical treatments of causation, and develop a broad and sophisticated understanding of the issues under debate.
Nolan (this volume) describes a pair of cases in which an infinite number of clowns are apparently able to conjure up whatever they like simply by forming the right intentions. His is the latest contribution to a growing literature that uses so-called ‘New Zeno’ cases to argue for surprising philosophical conclusions about (inter alia) infinity, motion, causation, ability, the laws of physics, or the logic of counterfactuals. In this response, it is argued that New Zeno cases—Nolan’s clown cases included—are not, (...) on reflection, all that puzzling, and the thought that there are deep philosophical lessons to be learned from them has been largely overstated. The reasons for this turn out to have interesting parallels in the literature on the consequence argument for incompatibilism about free will and the grandfather paradox for time travel. (shrink)
Resultant moral luck is typically considered to be the most problematic type of moral luck. Arguably the most popular response to the problem of resultant moral luck is the idea that resultant luck affects the scope but not the degree of responsibility. Call this the ‘Degree Scope Response’ (DSR). Philosophers also use DSR in responding to other types of moral luck and in contexts outside moral luck. In this paper, I argue that DSR fails. Then I suggest that we should (...) hold that resultant luck affects neither the degree nor the scope of responsibility. Put differently, consequences are metaphysically irrelevant to responsibility. Further, I discuss various advantages of this view and show its various implications on questions about free will, theories of causation, and responsibility in contexts outside moral luck. I also defend this view against the worry that it is too revisionary. (shrink)
Evidence-based policymaking (EBP) relies on an epistemological assumption that evidence from randomized controlled trials (RCTs) is the finest evidence for policy formulation, while expert testimony is the poorest one. This paper argues that while RCTs are a valuable source of empirical evidence for policy interventions, they are not sufficient on their own to support evidence-based policy formulation. Through the lens of the INUS framework of causation, we demonstrate that the effectiveness of a policy is influenced by a complex interplay of (...) contextual factors, which RCTs alone cannot capture. We advocate for the integration of contextual and qualitative knowledge, including testimonies from experts and community members, to supplement RCT findings. This additional knowledge provides insights into the social, cultural, and subjective dimensions of the target population, addressing motivations, preferences, and other factors that can significantly impact policy success. By comparing reductionist and non-reductionist perspectives on the use of testimony in evidence-based policy, we argue for a balanced approach that values credible testimonies as essential to understanding context. Ultimately, this paper underscores the importance of a multifaceted evidence approach in crafting effective, context-sensitive public policies. (shrink)
The inception of modern science, in the 17th century, was accompanied by epistemological analyses that see its foundation as laid on observation and experiment �� a stance often regarded as excluding (or, at least, devaluating) metaphysics, especially in the English-speaking world. Qualms about metaphysics were already noticeable in Locke’s Essay (1690), and were supposedly deepened by Hume, in the following century. For almost two hundred years, Hume’s philosophy was regarded as radically sceptical concerning metaphysics generally, but particularly about causality and (...) the very existence of an external, objective reality. In this paper, we argue, following a more recent interpretive vein in Hume's scholarship, that Hume’s scepticism about these basic metaphysical issues was effectively “mitigated” (in his own words) by his pioneering adoption of a form of naturalised realism. According to it, belief in both causal relations and in the external world is taken as resulting from the natural operations of the human mind and, in this condition, as being justified, in an epistemologically non-ordinary sense of this notion. As a consequence, epistemology was seen by him as an undertaking akin to the natural sciences, both employing similar empirical methods and metaphysical hypotheses to explore the mind, in the former case, and the physical world, in the latter. Thus, instead of interpreting Hume as one of the forerunners of the anti-metaphysical trend within empiricism, we believe that his wide-ranging project of instituting a “science of man” made some room for metaphysics, as cultivated within the same fallibilistic, naturalistic approach of empirical science itself. (shrink)
This paper categorizes phenomena to derive inferences rather than determine reality, emphasizing a fundamental attribute of the observed world that shapes perception. It posits that early life forms relied on correlation—linking survival to pattern recognition—suggesting correlation precedes causation in cognitive development. The concept of continuity, particularly the persistence of consciousness, emerges as a central human motivator, surpassing procreation, power, or meaning. Pleasure and pain are tied to continuity, with pain arising as a reaction to threats against it, such as death. (...) The fear of death and feelings of unfairness, especially regarding an afterlife, are framed as misattributions of value, rooted in humanity's drive to persist. (shrink)
I explore how endurantists are to handle cases of synchronic bi-location, in which a thing bi-locates at a time (such as by time-travel). I argue that endurantists face significant pressure to posit distinct but structurally identical facts (DSIFs), and critique the fragmentalist approach to bi-location in Simon (2018). Both the positive argument and critique are animated by the observation that handling bi-location cases requires perspicuously describing their spatiotemporal and causal structure. Accordingly, the argument proceeds by considering how endurantists are to (...) think about the metaphysics of place and causation, two issues that are important independent of the thesis about DSIFs. (shrink)
The wide variety of blame presents two difficult puzzles. Why are instances of blame categorized under so many different mental kinds, such as judgment, belief, emotion, action, intention, desire, and combinations of these? Why is “blame” used to describe both interpersonal reactions and mere causal attributions, such as blaming faulty brakes for a car crash? I introduce a new conception of blame, on which blame is attention to something as a source of badness. I argue that this view resolves both (...) puzzles and offers an independently appealing conception of blame. (shrink)