Let us criticize contemporary analytic metaphysics’ tacit reliance upon the coarse categories of Theory T thinking, an approximate picture of scientific theorization that has directed philosophical attention away from the puzzles of applied mathematical technique that originally concerned Leibniz.[1] ‘Theory T thinking’ is a term borrowed from philosopher of science Mark Wilson’s critiques; it covers a multitude of philosophical-scientific doctrines that draw from logic-centric conceptions of scientific organization, a mode of thinking canonized by the logical empiricists of the mid-twentieth century. For Theory T thinkers, physical theories are understood to be axiomatized structures, taken to successfully capture the autonomous behavior of nature within an arachnean cast of mathematical netting. Such Theory T thinking includes explanation as deduction from laws, wherein ‘autonomous behavior’ indexes the commonly held presumption that a physical system’s evolution can be understood from its initial conditions via internally determined dynamics, paying no heed to interference from extraneous factors. In turn, Theory T thinking presents an idealized picture of empirical correlations as deducible from laws. With this in mind, let us turn to a particular instantiation of Theory T thinking: the word ‘cause’.
The term ‘cause’ serves, at least in part, as a central instrument of linguistic management insofar as we utilize it to arrange component strategies within an extended reasoning process. Such applications endow ‘cause’ with robust physical content (e.g., the ‘causal processes’ involved in wave motion), but we can consider, with equal prudence, scenarios in which the majority of customary physical referents are relinquished through mechanical constraints, leaving behind a pure exemplar of procedural significance.[2]
Analytic metaphysics has, following David Lewis, been prodded into considerations regarding the early a priori: a quasi-Kantian expectation that there exist metaphysical categories which serve as fundamental prerequisites of descriptive thought. The first of these considerations is genetic, emphasizing how, via linguistic training, we learn to reason about ‘parts’ and ‘wholes’, ‘causes’ and ‘effects’. Relying upon such categories, we scaffold inferential skills. Thus, in parsing the concept of mereology—the branch of classificatory doctrine apparently demanded by our conceptual rendering of ‘parts’ and ‘wholes’—philosophers such as L.A. Paul pose a necessitarian position concerning ‘cause’ and ‘effect’ wherein:
“the metaphysics tells us what it is to be a sum or physical object composed of these structured arrangements of parts, and thus tells us how the physical object is metaphysically constructed (composed) from its parts. In contrast, chemistry tells us what some of the parts and the arrangements of the parts are for different kinds of molecules, and it also tells us how to causally manipulate the world in order to bring such arrangements into existence”.[3]
Other analytic metaphysicians appeal to future scientific development rather than to the prerequisites of knowledge formation, conceding that metaphysics is “speculative, and rarely if ever results in certainty” while nonetheless claiming “continuity with science”.[4] Philosopher of science Mark Wilson’s position is closer to Paul’s: the vocabulary we use to explicitly distinguish cases and reorient behavior requires that we possess a vocabulary of language management. On the other hand, Wilson identifies Sider’s position with Theory T thinking, as it suppresses the contextual complexities of these linguistic tools of management and the architectural subtleties therein. The terminologies with which we articulate novel reasoning architectures, to ourselves and to others who may benefit from learning these routines, are necessarily entangled in functions concerning multiscalar architecture qua the fulcrum of change (as with questions like “when the small parts of granite recrystallize, what changes do these cause on a macroscopic level?”). Accordingly, we require linguistic tools in order to manage the architectural subtleties of the languages we speak. Here, Wilson’s claim concerning the pragmatic factors driving language in progressive fashion is in keeping with Quine’s derision of the “analytic and synthetic”,[5] although novel in its structural conception.
The word ‘cause’ adjusts its semantic bearings as we move from one explanatory architecture to the next, contingent upon descriptive architectural adjustments. Recall that the differential equation models that capture genuine causal processes generally possess a distinctive formal feature, a hyperbolic signature, which is absent from those equational sets that capture non-evolutionary physical circumstances (such as equilibrium conditions, generally of elliptic signature). Thus, physics has need of considerations regarding ‘cause’ when working with equations that seek to accurately capture evolutionary developments as they unfold in causal processes. To do this, we characterize causal processes according to ‘finite difference’, recasting the infinitesimal relationships within differential equations in terms of finitary spatiotemporal ‘steps’.
As concerns these ‘early a priori’ childhood acquisitions, we begin to speak of nature’s causal processes in terms of finite difference. Rendering such finite differences into language means turning them into the Humean propositions of ‘cause now, effect later’. Wilson characterizes those co-opting the Humean framework of inference as “calculus-avoiding philosophical descendants, who pressure […] that the ‘laws of nature’ invariably take the form ‘for all x and all t, if F(x) holds at t, then G(x) will hold at t + Δt’”.[6] True causal processes are, however, captured in terms of differential equation relationships that we can only approximate if we do not possess the appropriate calculus. This does not mean, however, that we agree with Ernst Mach’s and Bertrand Russell’s critiques of ‘cause’, as both philosophers fail to contend with Jacques Hadamard’s distinction between elliptic and hyperbolic signatures.
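The finite-difference rendering of a hyperbolic equation can be made concrete with a minimal sketch, assuming the standard leapfrog discretization of the one-dimensional wave equation (the grid sizes, the plucked initial shape, and the step count below are illustrative choices, not drawn from Wilson):

```python
import numpy as np

# Finite-difference sketch of the 1D wave equation u_tt = c^2 u_xx on a
# fixed string: the state at t + dt is computed from the states at t and
# t - dt, a literal 'cause now, effect later' marching scheme.
# (Illustrative parameters; dt satisfies the CFL condition c*dt/dx <= 1.)
c, L, nx = 1.0, 1.0, 101
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.5 * dx / c                      # CFL-stable time step
r2 = (c * dt / dx) ** 2

u_prev = np.sin(np.pi * x)             # initial pluck (fundamental shape)
u_curr = u_prev.copy()                 # approximates zero initial velocity

def step(u_prev, u_curr):
    """Advance one finitary spatiotemporal 'step' of the evolution."""
    u_next = np.empty_like(u_curr)
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_next[0] = u_next[-1] = 0.0       # fixed endpoints
    return u_curr, u_next

for _ in range(200):
    u_prev, u_curr = step(u_prev, u_curr)
```

The update rule is exactly the Humean schema lampooned above: given F(x) at t, it mechanically produces G(x) at t + Δt, while the differential equation itself is only approximated by these finite steps.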
When, for instance, the energy input into a violin string remains trapped within travelling wave-front packets, ‘cause’ is shaped along time’s progressive forwards-arrow, allowing us to adopt a straightforward evolutionary modeling. But on real strings such energetic confinement does not persist; the motion smooths out into standing wave patterns. “Smeared out pulses continue to travel back and forth across the string for appreciable periods”, where “the applied work across the entire string” is redistributed in a manner that, after a short relaxation time, sees the input energy resettle such that we can characterize the string’s movement as a “superposition of standing wave patterns that retain their individual energies for significant periods of time”.[7] This altered representation returns us to Fourier analysis, wherein there is a shift of basis vectors within a common descriptive arena—moving from a position representation to an energy representation—with the change in descriptive basis representing coeval adjustments in reasoning architecture. Fourier factoring demonstrates how the real string, in the absence of energy dissipation, cycles through a number of simple processes independently of one another.
In order to consider the varied vibratory modes of the string at ‘turnaround points’—those points at which the stored energies are expressed completely in potential form—we need to relocate our string calculations within potential energy vis-à-vis an eigenfunction problem. This opens a different explanatory landscape wherein we are no longer concerned with ‘time’ at all but merely with configurations of maximal potential energy (contingent not upon time but upon energetic storage capacity). To do so, we must recast our computational strategy as a control problem rather than as an initial value problem, applying standard separation of variables techniques. This kind of adjustment is accompanied by a shift in how the word ‘cause’ operates, as it is no longer attached to any evident ‘causal process’ in the evolutionary modeling manner.
Specifically, when we speak of the least potential energy calculated according to curvature, invoking a description of the string in a Fourier-like way, our causal attention shifts to features of the central control variables involved, where questions of cause turn upon changes in variables such as angles. The relevant counterfactuals are manipulationist: the outcomes of various potential manipulations are centered upon target variables that proceed processually.[8] A change in strategic focus, or instruction, is accompanied by an adjustment in which questions are appropriate to ask. This admixture is characteristic of guiding terms such as ‘cause’, and philosophers such as James Woodward have examined the manipulationist concerns involved.
The Fourier paradigm is particularly seductive because of its “strong physics avoidance virtues—the invariant nature of our string’s modes allows us to push most issues of temporal development off the table and to concentrate instead upon the time-removed question of what the system’s eigenfunction modes will look like when frozen into their positions of pure potential energy”.[9] However, causal process, as it applies to the continuous wave progression we originally examined (the wave motions which carry a violin string forward in time from one state to another), inherently deals with temporal developmental processes. This is a product of our ‘early a priori’ training concerning the developmental etiology of ‘cause’—we progressively deal with target variables via counterfactual construction, causation, and manipulations, which are refined over time through mixing linguistic instruction with factual report qua reasoning judgment (e.g., we can solve for the maximal potential energy shapes that allow the string to preserve its energy via shooting method trials, locating distinct eigenfunctions according to the number of times they cross the x-axis). As in Woodward’s oft-overlooked analyses of manipulationist conditionals, effective techniques concerning counterfactuals depend upon the inferential methods appropriate to specialized search spaces.[10]
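The shooting-method remark can be made concrete with a short sketch, assuming the standard separated string eigenproblem y″ = −λy with y(0) = y(1) = 0 (the integrator step count and eigenvalue brackets below are illustrative choices):

```python
import numpy as np

# Shooting-method sketch for the string eigenproblem y'' = -lam * y with
# y(0) = y(1) = 0: integrate from the left with a trial lam, then bisect
# on the boundary mismatch y(1). Distinct eigenfunctions are told apart
# by how many times they cross the x-axis.

def shoot(lam, n_steps=2000):
    """Integrate from x=0 with y=0, y'=1; return y(1) and interior crossings."""
    h = 1.0 / n_steps
    y, v = 0.0, 1.0
    crossings = 0
    for _ in range(n_steps):
        # Classical RK4 step for the first-order system (y, v).
        k1y, k1v = v, -lam * y
        k2y, k2v = v + 0.5*h*k1v, -lam * (y + 0.5*h*k1y)
        k3y, k3v = v + 0.5*h*k2v, -lam * (y + 0.5*h*k2y)
        k4y, k4v = v + h*k3v, -lam * (y + h*k3y)
        y_new = y + h/6 * (k1y + 2*k2y + 2*k3y + k4y)
        v = v + h/6 * (k1v + 2*k2v + 2*k3v + k4v)
        if y != 0.0 and y_new != 0.0 and np.sign(y_new) != np.sign(y):
            crossings += 1
        y = y_new
    return y, crossings

def find_eigenvalue(lo, hi, tol=1e-8):
    """Bisect on y(1) between brackets lo < hi that straddle a sign change."""
    f_lo = shoot(lo)[0]
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        f_mid = shoot(mid)[0]
        if np.sign(f_mid) == np.sign(f_lo):
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam1 = find_eigenvalue(5.0, 15.0)    # converges near pi^2  (nodeless mode)
lam2 = find_eigenvalue(30.0, 45.0)   # converges near 4*pi^2 (one-node mode)
```

Here the first trial converges near π² (the nodeless fundamental) and the second near 4π²; the crossing counter is what distinguishes one eigenfunction from another, exactly the node-counting heuristic mentioned above.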
‘Cause’ performs a different function with respect to linguistic instruction than with factual reporting. Temporally deracinated, the word ‘cause’ no longer attaches itself in the temporal manner of a wave traveling along a violin string. Instead, its focus becomes at least partially architectural, concerning how alterations arising within a modeling format λ_lower can be matched to alterations that appear within modeling format λ_higher. Such is also the case with the use of ‘cause’ in multiscalar modelings (e.g., adjustment in the stresses around a mineral grain within granite can ‘cause’ that portion to shear elastically or recrystallize). Such multi-layered reasoning architectures mimic reasoning policies found in nature, with collections of linked sub-models centered around various scale lengths and communicating with one another via homogenization techniques, rather than through the straightforward amalgamation of data. Enforced changes on scale level λ_lower affect behaviors on scale level λ_higher, as when adjustments in stress produce adjustments in elasticity. As concerns distinguishing granite from pumice on the macroscopic scale, lower-scale adjustments manifest themselves by shearing elastically (behaving like standard granite) or transmuting into gneiss (recrystallization); pumice, which lacks significant lower-scale grain, has an architecture that requires counterfactuals of a similar type but also contains many trapped gas bubbles that can erupt when their obsidian walls melt under higher temperatures.
As in the example of thermodynamic effort, which turns upon a crucial distinction between coherent and incoherent effort, the notion of controlled manipulation is central to conceptual endeavors such as how a spring’s original coherence responds to a macroscopically manipulated push or pull with increased internal pressure—etiological questions arise, such as “how much of a specified manipulation will cause an increase in pressure and how much will cause a rise in temperature?”[11] Through the collection of cross-scalar counterfactuals of the same general type—that is, a compilation of reliable Woodward-style counterfactuals adequate to the computational architecture we should employ—we are able to address questions concerning the principles that dictate ‘causal’ change by grounding our laws in counterfactual claims.
Autonomous causal processes of the sort typified by wave motion do not invoke control variable considerations. Thus, we are not claiming that ‘cause’ necessitates an induced manipulated change at a lower or higher size scale. Rather, the specializations with which we are concerned relate to reasoning stratagems that are applicationally and circumstantially ‘mixed and matched’ according to early a priori training modules. The rigid requirements pertaining to how ‘causes’ relate to ‘effects’ depend on local architectures and distinctive bonds between word and world, just as “the strength of the thread does not reside in the fact that some one fiber runs through its whole length, but in the overlapping of many fibers”.[12]
According to Paul and fellow analytic metaphysicians—whose position exemplifies the rigidified semantics within contemporary philosophy of language—during our early a priori training we attach a central ‘meaning’ to ‘cause’, with firm extensions in all ‘possible worlds’. The essential pattern of word/world(s) attachments takes such shape; we, in agreement with Quine’s naturalist portraiture of linguistic development, reject the necessitarian assumptions of standard semantic essentialisms. This does not mean, however, that we ought not also be critical of how Quine is privy to Theory T thinking at times, grounding counterfactual claims in scientific laws. Wilson, by contrast, is partial to the Gaussian patchwork wherein early a priori verities can collapse under inferentially patterned adaptations:
“Operating as a term of mixed descriptive and management import, the word ‘cause’ tags along with the architectural decisions we make in adapting established strands of parent reasoning into strategically modified sons and daughters. Borrowing terminology from the mathematicians, we can say that the newer employments represent natural continuations or prolongations…”[13]
Wilson gives us good reason not to reduce every explanatory setting to evolutionary modeling circumstances, as those analytic metaphysicians who overlook the fact that counterfactual claims “make perfectly good sense within explanatory circumstances that are equilibrium-centered or which eschew direct consideration of temporal development through other means” are wont to do.[14] Insofar as Paul’s emphasis on early a priori considerations is concerned, Wilson’s objection to her defense of analytic metaphysical doctrine is rooted in the fact that her means of improving descriptive practice rests upon tearing our inferential doctrines away from the simpler demands upon which they were originally formed.[15] Rather than appeal to a Theory T “fundamental theory” of futurist appeal, as philosophers such as Ted Sider do, Wilson denies distinctions that rely upon ‘perfectly natural properties’, ‘internal versus relational properties’, and ‘counterfactuals sustained by explicitly articulated laws’. This is precisely why Wilson takes such pains to make the case that the history of classical mechanics is characterized by a dependable resistance to suggesting plausible ‘laws’ regarding the basic cohesion of solid matter, preferring to co-opt the more metaphysically reliable “dodge of relying upon constraints and allied evasive crutches”.[16]
What do the differential equations within our science teach us about classificatory concepts? First and foremost, and in agreement with the inferentialist Hegelianism of Robert Brandom, statements of scientific law should be understood as making explicit something that is already implicit in ordinary empirical descriptions of how things are. Such equations are not directly anticipated within the subject’s pre-assigned syntax (hence the necessary task of ‘making explicit’)—the law is present in appearance, but it is not the entire presence of appearance. Under unique circumstances, the scientific law of nature has an ever different actuality; the laws of nature determine how things actually interact only when supplemented by actual boundary conditions, or applications. In fixing which antecedents are factual, lawful necessity is expressed under actual conditions, which single out some of those hypotheticals as worthy of detaching conclusions from.[17] Considering the Fourier-style characteristics of an uneven string, the decompositional traits capture basic behaviors of the string in a direct manner, even though the ‘recursive orbit’ of differential equation vocabularies rarely captures such terminological specificities; these are, instead, born from the “fixed point limits of holistic approximations, whose existences must be established by set theoretic means”.[18] Although, as made clear by the Greediness of Scales problem, we do not regard differential equations themselves as positing directly descriptive accounts of physical behavior below a cut-off level where scaling assumptions fail (regarding the formulas instead as convenient bottlenecks of descriptive overextension as we search for tractable conclusions), Fourier-like models obtained from such infinitesimal seeds demonstrate target systems’ reliable characteristics, as these are captured upon a macroscopic, dominant-behavior basis.
Our string’s modal properties do not obtain their status as ‘important traits’ from laws or differential equations in and of themselves but, instead, from the means through which such interior considerations provide bridges to boundary conditions, interfaces, basic modeling assumptions, and, more importantly, unformalized appeals to dissipation, relaxation times, and steady-state conditions. That is, Fourier string modes obtain descriptive centrality by way of the vibrating string’s endpoints—continually redirecting traveling wave energy back towards the interior—which couple with relaxation time dispersion. Such boundary condition behaviors designate unique physical factors beyond those registered within the interior string equation. Without taking account of this operative cooperative partnership, the string loses its capacity for storing energy in standing wave containers.
Leibniz’ metaphysics is similarly concerned with descriptive overreach in differential equations. For Leibniz, differential equations do not naively reflect physical reality but require being parsed “in a manner that accurately recognizes the expressive limitations of the tools of applied mathematics. The metaphysical entities he [Leibniz] reaches via these reflections are his strange monads”.[19] While we do not endorse such monads, Leibniz’ interpretative problem is still with us, as we cannot extract a plausible ontology from physical doctrine in the straightforward syntactic manner that Quine[20] and others posit (in their Theory T modalities). Appealing to the terms of Sider’s ‘internal’ and ‘relational’ property distinction elides the critical importance that homogenization and allied techniques play in supplying mathematical surrogates for the environmental and interscalar relationships that determine how natural behaviors on varied length scales unify. What considerations, then, bear upon this cooperation of linguistic labor?
The syntactic labels that we assign to significant physical behaviors “often derive from the mathematical terrain in which the advantages of strategies like factoring are clearly registered […] rather than first appearing within the grammatical orbits directly spawned by the original differential equation modelings”.[21] In Galileo’s description of triangles and rectangles,[22] boundary region ascriptions do not accord with the differential calculus tools used to describe their interiors—the corners concentrate stresses, undercutting the validity of the interior equation. Nonetheless, the idealized notion of perfect triangles facilitates many helpful reasoning practices from which we are otherwise barred. To manage this, scientists since the 1950s have invoked functional analysis corrections to continually confront such discrepancies; Theory T thinking is oblivious to such cooperative repair owing to its static conception of ‘ontology’.
As Wilson notes, “Paul’s emphasis on early a priori learning properly directs our attention to the linguistic question of how we competently manage a wide variety of differently strategized explanatory schemes”.[23] Our disagreement with Paul lies at the level of her rigidified semantic assumptions, which presume that a word like ‘cause’ retains a constant and metaphysically analyzable ‘meaning’ throughout all of its helpful ministrations—following Wilson and other critics of permanent necessity (Quine included), we reject this premise. Sider does not repeat Paul’s mistake; instead, Sider makes a range of Theory T assumptions concerning how the predicates posited within science are stratified hierarchically with respect to ‘importance’. Such a static position “locks science within a conceptual straitjacket that fails to account for the subtle adjustments at the core of its improving practices”.[24] We can remedy Sider’s conception of metaphysics as a pre-scientific enterprise—a necessitarian doctrine resting upon syntactic convictions and a hypothetical ‘final physics’—by affirming Woodward’s manipulationist conditionals. In unison with Wilson’s robustly adaptive semantic pragmatism, we pronounce that referential ties to the natural world are ultimately rooted in language’s practical entanglements with it, in action-enjoining contextual manners which frequently employ complex modes of data registration.
[1] Mark Wilson, Physics Avoidance (Oxford: Oxford University Press, 2017), xv.
[2] Wilson delineates a sewing machine with complex parts where the circular motion on the right-hand crank is converted into back-and-forth motion at the highest extremity suitable for sewing machine stitching; in this example, there are two mechanical pathways that lead to a triangular piece, making it difficult to visually extrapolate whether this piece will turn clockwise or counter-clockwise (98).
[3] L. A. Paul, “The Handmaiden’s Tale,” Philosophical Studies 160 (2012), 3.
[4] Theodore Sider, “Introduction” in T. Sider, J. Hawthorne, and D. Zimmerman, eds., Contemporary Debates in Metaphysics (Hoboken: Wiley-Blackwell, 2007), 18.
[5] W. V. O. Quine, “Two Dogmas of Empiricism” in From a Logical Point of View (Cambridge, MA: Harvard University Press, 1980).
[6] Mark Wilson, Physics Avoidance, 246.
[7] Ibid., 248.
[8] See James Woodward, Making Things Happen (Oxford: Oxford University Press, 2003) for a discussion of how “cause” operates in accordance with manipulationist conditionals that arise from a wider set of operative circumstances.
[9] Mark Wilson, Physics Avoidance, 252.
[10] In contrast to the Theory T thinking of Nelson Goodman and Robert Stalnaker, wherein significantly distinct forms of explanatory architecture are collapsed into the format of an initial value problem sans reliability-enhancing homogenization (the helpful kind of ‘physics avoidance’) and the salient modeling equations are endowed with an evolutionary character of hyperbolic signature. Nelson Goodman, Fact, Fiction and Forecast (Cambridge, MA: Harvard University Press, 1983); Robert Stalnaker, “A Theory of Conditionals” in Nicholas Rescher, ed., Studies in Logical Theory (Oxford: Blackwell, 1968).
[11] Mark Wilson, Physics Avoidance, 258 (emphasis added).
[12] Ludwig Wittgenstein, Philosophical Investigations (Oxford: Blackwell, 2001), §67.
[13] Mark Wilson, Physics Avoidance, 260.
[14] Mark Wilson, Physics Avoidance, 265.
[15] Ibid., 267-268.
[16] Ibid., 268.
[17] Robert Brandom, A Spirit of Trust (Cambridge, MA: Harvard University Press, 2019), 189-190.
[18] Mark Wilson, Physics Avoidance, 270.
[19] Ibid., 272.
[20] W. V. O. Quine, From a Logical Point of View (Cambridge, MA: Harvard University Press, 1980).
[21] Mark Wilson, Physics Avoidance, 270.
[22] Galileo, “Selections from ‘The Assayer’” in Maurice A. Finocchiaro, ed. The Essential Galileo (Indianapolis: Hackett, 2008).
[23] Mark Wilson, Physics Avoidance, 278.
[24] Ibid.