May 12, 2020
Katharina Grosse’s “Rockaway!” at Fort Tilden’s decaying aquatics building in Queens, 2016

On ‘Cause’ and Laws: Grounding Laws in Counterfactuals

Let us begin by criticizing contemporary analytic metaphysics’ tacit reliance upon the coarse categories of Theory T thinking, a reliance that treats scientific theorization as a matter of idealized approximation and that has directed philosophical attention away from the puzzles of applied mathematical technique that originally concerned Leibniz.[1] ‘Theory T thinking’ is a term borrowed from the philosopher of science Mark Wilson’s critiques; it names a multitude of philosophical-scientific doctrines that draw upon logic-centric conceptions of scientific organization, a mode of thinking canonized by the logical empiricists of the mid-twentieth century. For such Theory T thinkers, physical theories are understood as axiomatized structures, taken to successfully capture the autonomous behavior of nature within an arachnean cast of mathematical netting. Such Theory T thinking includes explanation-as-deduction from laws, wherein ‘autonomous behavior’ indexes the commonly held presumption that a physical system’s evolution can be understood from its initial conditions via internally determined dynamics, paying no heed to interference from extraneous factors. Theory T thinking thereby presents an idealized picture in which empirical correlations are explained by being deduced from laws. With this in mind, let us turn to a particular instantiation of Theory T thinking: the word ‘cause’.

The term ‘cause’ serves, at least partially, as a central instrument of linguistic management insofar as we utilize it to arrange component strategies within an extended reasoning process. Such applications endow ‘cause’ with robust physical content (e.g., the ‘causal processes’ involved in wave motion) but we can consider, with equal prudence, a scenario where the majority of customary physical referents are relinquished through mechanical constraints, leaving behind a pure exemplar of procedural significance.[2]

Analytic metaphysics has, following David Lewis, been prodded into considerations regarding the early a priori, wherein there is a quasi-Kantian expectation that metaphysical categories exist which serve as fundamental prerequisites of descriptive thought. The first of these considerations is genetic: it emphasizes how, via linguistic training, we learn to reason about ‘parts’ and ‘wholes’, ‘causes’ and ‘effects’. Relying upon such categories, we scaffold our inferential skills. Thus, in parsing the concept of mereology—the branch of classificatory doctrine apparently demanded by our conceptual rendering of ‘parts’ and ‘wholes’—philosophers such as L. A. Paul advance a necessitarian position concerning ‘cause’ and ‘effect’ wherein:

“the metaphysics tells us what it is to be a sum or physical object composed of these structured arrangements of parts, and thus tells us how the physical object is metaphysically constructed (composed) from its parts. In contrast, chemistry tells us what some of the parts and the arrangements of the parts are for different kinds of molecules, and it also tells us how to causally manipulate the world in order to bring such arrangements into existence”.[3]

Other analytic metaphysicians appeal to future scientific development rather than to the prerequisites of knowledge formation, holding that metaphysics is purely “speculative, and rarely if ever results in certainty” while nonetheless claiming “continuity with science”.[4] The philosopher of science Mark Wilson’s position is closer to Paul’s: the vocabulary we use to explicitly distinguish cases and reorient behavior requires a vocabulary of language management. On the other hand, Wilson identifies Sider’s position with Theory T thinking, since it suppresses the importance of contextual complexities in the linguistic tools of management and the architectural subtleties therein. Accordingly, the terminologies with which we articulate novel reasoning architectures, to ourselves and to others who may benefit from learning these routines, are necessarily entangled in functions concerning multiscalar architecture qua the fulcrum of change (as with questions like “when the small parts of granite recrystallize, what changes do these cause on a macroscopic level?”). That is, we require linguistic tools in order to manage the architectural subtleties of the languages we speak. Here, Wilson’s claim concerning the pragmatic factors that progressively drive language is in keeping with Quine’s derision of the “analytic and synthetic” distinction,[5] although novel in its structural conception.

The word ‘cause’ adjusts its semantic bearings as we move from one explanatory architecture to the next, contingent upon adjustments in descriptive architecture. Recall that the differential equation models that capture genuine causal processes generally possess a formal feature, a hyperbolic signature, which is absent from those equational sets that capture non-evolutionary physical circumstances (such as equilibrium conditions, generally of elliptic signature). Thus, physics has need of considerations regarding ‘cause’ when modeling equations that seek to accurately capture evolutionary developments as they unfold in causal processes. In order to do this, we characterize causal processes according to ‘finite difference’, recasting the infinitesimal relationships within differential equations in terms of finitary spatiotemporal ‘steps’.
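
To make the distinction in signature concrete, here is a minimal sketch of Hadamard’s classification (our own illustration, not a passage from Wilson): a second-order equation is sorted as hyperbolic, elliptic, or parabolic by the sign of the discriminant of its principal part, which is what separates evolutionary wave equations from equilibrium equations of Laplace type.

```python
# A minimal sketch (not drawn from Wilson's text) of Hadamard's classification of
# second-order PDEs a*u_xx + 2b*u_xy + c*u_yy + ... = 0 by the discriminant
# b^2 - a*c: hyperbolic (wave-like, evolutionary) versus elliptic (equilibrium-like).

def classify(a: float, b: float, c: float) -> str:
    """Classify the principal part a*u_xx + 2*b*u_xy + c*u_yy."""
    disc = b * b - a * c
    if disc > 0:
        return "hyperbolic"   # e.g. the wave equation: evolutionary, causal propagation
    if disc < 0:
        return "elliptic"     # e.g. the Laplace equation: equilibrium, no time arrow
    return "parabolic"        # e.g. the heat equation: diffusive

# Wave equation u_xx - u_tt = 0 (equivalently u_tt = u_xx): a = 1, b = 0, c = -1
print(classify(1.0, 0.0, -1.0))   # hyperbolic
# Laplace equation u_xx + u_yy = 0: a = 1, b = 0, c = 1
print(classify(1.0, 0.0, 1.0))    # elliptic
```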

As it concerns these ‘early a priori’ childhood acquisitions, we begin to speak of nature’s causal processes in terms of finite difference. Rendering such finite differences into language means turning them into the Humean propositions of ‘cause now, effect later’. Wilson characterizes those co-opting the Humean framework of inference as “calculus-avoiding philosophical descendants, who pressure […] that the ‘laws of nature’ invariably take the form ‘for all x and all t, if F(x) holds at t, then G(x) will hold at t + Δt’”.[6] True causal processes are, however, captured in terms of differential equation relationships, which we can only approximate if we do not possess the appropriate calculus. However, this does not mean that we agree with Ernst Mach’s and Bertrand Russell’s critiques of ‘cause’, as both philosophers fail to contend with Jacques Hadamard’s distinction between elliptic and hyperbolic signatures.
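
As a worked illustration of that finite-difference rendering, the following sketch steps the hyperbolic wave equation forward so that the configuration at t + Δt is computed from the configurations at t and t − Δt, the numerical analogue of ‘cause now, effect later’. The grid sizes and initial profile are our own illustrative assumptions, not details taken from Wilson’s text.

```python
import numpy as np

# Finite-difference (leapfrog) stepping of the hyperbolic wave equation
# u_tt = c^2 * u_xx on a fixed-end string: the state at t + dt follows from
# the states at t and t - dt ('cause now, effect later').

c, L, nx = 1.0, 1.0, 101
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.5 * dx / c                       # satisfies the CFL stability condition
r2 = (c * dt / dx) ** 2

u_prev = np.sin(np.pi * x)              # initial displacement (one arch)
u_curr = u_prev.copy()                  # crude first step: zero initial velocity
for _ in range(200):
    u_next = np.zeros_like(u_curr)      # fixed endpoints stay at zero
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next

print(u_curr[nx // 2])                  # midpoint displacement after 200 steps
```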

When, for instance, the energy input into a violin string remains trapped within travelling wave-front packets, ‘cause’ is shaped along time’s progressive forwards-arrow, allowing us to co-opt a straightforward evolutionary modeling. But in real strings this energetic confinement does not persist. Rather, “smeared out pulses continue to travel back and forth across the string for appreciable periods” where “the applied work across the entire string” is redistributed such that, after a short relaxation time, the input energy resettles and we can characterize the string’s movement as a “superposition of standing wave patterns that retain their individual energies for significant periods of time”.[7] This altered representation returns us to Fourier analysis, where there is a shift of basis vectors within a common descriptive arena—moving from a position representation to an energy representation, with the change in descriptive basis representing coeval adjustments in reasoning architecture. Fourier factoring demonstrates how the real string, in the absence of energy dissipation, cycles through a number of simple processes independently of one another.
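
To illustrate the change of basis at issue, the following minimal sketch (again our own illustration rather than anything in Wilson) re-expresses a plucked position profile as a superposition of standing-wave modes; in the absence of dissipation, each coefficient labels a mode that retains its individual energy.

```python
import numpy as np

# Shift from the position representation to the mode (energy) representation:
# project a plucked shape onto the standing-wave basis sin(n*pi*x/L).

L, nx = 1.0, 1001
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
pluck = np.where(x < 0.3, x / 0.3, (L - x) / (L - 0.3))   # triangular pluck at x = 0.3

def mode_coefficient(f, n):
    """Project f onto the n-th standing-wave basis function (simple Riemann sum)."""
    return 2.0 / L * np.sum(f * np.sin(n * np.pi * x / L)) * dx

coeffs = [mode_coefficient(pluck, n) for n in range(1, 6)]
print(np.round(coeffs, 4))          # amplitudes of the first five standing waves

# Reconstruction from the first five modes approximates the original shape:
recon = sum(c * np.sin(n * np.pi * x / L) for n, c in enumerate(coeffs, start=1))
print(np.max(np.abs(recon - pluck)))   # residual of the truncated modal description
```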

In order to consider the varied vibratory modes of the string at ‘turnaround points’, those moments when the stored energy is expressed entirely in potential form, we need to relocate our string calculations to potential energy vis-à-vis an eigenfunction problem. This opens a different explanatory landscape wherein we are no longer concerned with ‘time’ at all but merely with configurations of maximal potential energy (which are contingent not upon time but upon energetic storage capacity). To do so, we must recast our computational strategy as a control problem rather than an initial value problem, applying standard separation-of-variables techniques. This adjustment is accompanied by a shift in how the word ‘cause’ functions, as it is no longer attached to any evident ‘causal process’ in the evolutionary modeling manner.
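
A hedged sketch of that reformulation: rather than marching an initial value problem forward in time, we pose the eigenfunction problem whose solutions are the mode shapes frozen at their positions of pure potential energy. The discretization below is an illustrative assumption of ours, not a detail of Wilson’s discussion.

```python
import numpy as np

# Eigenfunction problem for the fixed-end string: -u'' = lambda * u with
# u(0) = u(L) = 0. No time variable appears anywhere; the unknowns are the
# turnaround mode shapes and their eigenvalues.

n = 200                                   # interior grid points
L = 1.0
h = L / (n + 1)
# Discrete second-derivative operator with Dirichlet (fixed-end) boundary conditions:
A = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

eigvals, eigvecs = np.linalg.eigh(A)      # mode shapes are the columns of eigvecs
print(np.sqrt(eigvals[:3]))               # ~ pi, 2*pi, 3*pi: the standing-wave wavenumbers
```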

Specifically, when we speak of the least potential energy calculated according to curvature, invoking a Fourier-like description of the string, our causal attention shifts to features of the central control variable involved, where questions of cause become contingent upon changes in controlling variables such as angles. These are manipulationist counterfactuals, in which the outcomes of various potential manipulations are centered upon target variables that proceed processually.[8] A change in strategic focus, or instruction, is accompanied by an adjustment in the questions appropriately associated with it. This admixture is characteristic of guiding terms such as ‘cause’, and philosophers such as James Woodward have examined the manipulationist concerns involved.
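
As a toy illustration of a manipulationist counterfactual, ‘cause’ is here read off from what a controlled intervention on an upstream control variable would do to a downstream quantity. The structural equations and variable names are hypothetical placeholders, drawn neither from Woodward nor from Wilson.

```python
# A toy Woodward-style intervention: override a control variable, hold everything
# else fixed, and compare the downstream outcome with the un-manipulated baseline.
# The two structural equations below are hypothetical placeholders.

def model(angle, tension, intervene_angle=None):
    """Two-equation structural model; an intervention overrides the angle."""
    a = intervene_angle if intervene_angle is not None else angle
    curvature = 2.0 * a                    # upstream structural equation
    energy = 5.0 * curvature + tension     # downstream structural equation
    return energy

baseline = model(angle=0.1, tension=1.0)
manipulated = model(angle=0.1, tension=1.0, intervene_angle=0.2)
print(manipulated - baseline)      # the counterfactual difference the intervention 'causes'
```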

The Fourier paradigm is particularly seductive because of its “strong physics avoidance virtues—the invariant nature of our string’s modes allows us to push most issues of temporal development off the table and to concentrate instead upon the time-removed question of what the system’s eigenfunction modes will look like when frozen into their positions of pure potential energy”.[9] However, causal process, as it applies to the continuous wave progression we originally examined (the wave motions which carry a violin string forward in time from one state to another), inherently deals with temporal developmental processes. This is a product of our ‘early a priori’ training concerning the developmental etiology of ‘cause’—we progressively deal with target variables through counterfactual construction, causation, and manipulation, refined over time by mixing linguistic instruction with factual report qua reasoning judgment (e.g., we can solve for the maximal potential energy shapes that ‘cause’ the string to preserve its energy via shooting method trials, locating distinct eigenfunctions according to the number of times they cross the x-axis). As in Woodward’s oft-overlooked analyses of manipulationist conditionals, effective techniques concerning counterfactuals depend upon the inferential methods appropriate to specialized search spaces.[10]
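
The shooting method trials mentioned parenthetically above can be sketched as follows (step sizes and eigenvalue brackets are our own assumptions): we integrate the mode equation outward from one endpoint, adjust the eigenvalue until the solution returns to zero at the other endpoint, and label the resulting eigenfunction by the number of times it crosses the x-axis.

```python
import numpy as np

def shoot(lam, L=1.0, n=2000):
    """Integrate u'' = -lam*u from u(0)=0, u'(0)=1; return u(L) and interior crossings."""
    h = L / n
    u, v = 0.0, 1.0
    crossings = 0
    for i in range(n):
        v += h * (-lam * u)              # semi-implicit (symplectic) Euler step
        u_new = u + h * v
        if i < int(0.99 * n) and u * u_new < 0.0:
            crossings += 1               # trajectory crossed the x-axis in the interior
        u = u_new
    return u, crossings

def find_eigenvalue(lo, hi, tol=1e-6):
    """Bisect on lambda until the shot lands on u(L) = 0 (a turnaround mode shape)."""
    f_lo, _ = shoot(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        f_mid, _ = shoot(mid)
        if np.sign(f_mid) == np.sign(f_lo):
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam1 = find_eigenvalue(5.0, 15.0)        # bracket assumed around the first eigenvalue (~pi^2)
lam2 = find_eigenvalue(30.0, 50.0)       # bracket assumed around the second (~(2*pi)^2)
print(round(lam1, 2), shoot(lam1)[1])    # ~9.87 with 0 interior crossings (first mode)
print(round(lam2, 2), shoot(lam2)[1])    # ~39.48 with 1 interior crossing (second mode)
```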

‘Cause’ performs a different function with respect to linguistic instruction than with factual reporting. Temporally deracinated, the word ‘cause’ no longer attaches itself in the temporal manner of a wave traveling along a violin string. Instead, its focus becomes at least partially architectural, concerning how alterations arising within a modeling format pitched at a lower scale can be matched to alterations that appear within a modeling format pitched at a higher scale. Such is also the case with the use of ‘cause’ in multiscalar modelings (e.g., an adjustment in the stresses around a mineral grain within granite can ‘cause’ that portion to shear elastically or recrystallize). Such multi-layered reasoning architectures mimic reasoning policies found in nature, with collections of linked sub-models centered around various scale lengths and communicating with one another via homogenization techniques rather than through the straightforward amalgamation of data. Enforced changes at the lower scale level affect behaviors at the higher scale level, as when adjustments in stress induce adjustments in elasticity. As it concerns distinguishing granite from pumice on the macroscopic scale, granite’s lower-scale adjustments manifest themselves either by shearing elastically (behaving like standard granite) or by recrystallizing into gneiss; pumice, which lacks significant lower-scale grain, has an architecture that requires counterfactuals of a similar type but also contains many trapped gas bubbles that can erupt when their obsidian walls melt under higher temperatures. As in the example of thermodynamic effort, which turns upon a crucial distinction between coherent and incoherent effort, the notion of controlled manipulation is central to conceptual endeavors such as determining how a spring’s original coherence responds to a macroscopically manipulated push or pull with increased internal pressure—etiological questions arise, such as “how much of a specified manipulation will cause an increase in pressure and how much will cause a rise in temperature?”[11] Through the collection of cross-scalar counterfactuals of the same general type—that is, a compilation of reliable Woodward-style counterfactuals adequate to the computational architecture we should employ—we are able to address questions concerning the principles that dictate ‘causal’ change by grounding our laws in counterfactual claims.
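
A minimal, hypothetical sketch of the homogenization step invoked here: the lower-scale description (per-grain stiffnesses along a rod) is not passed upward datum by datum but summarized by an effective macroscopic modulus, so that a counterfactual change enforced at the lower scale shows up as a changed macroscopic response. The numbers are illustrative, not material data.

```python
import numpy as np

# Homogenization of a layered (series) microstructure: the macroscopic scale sees
# only an effective modulus, computed here as the harmonic mean of grain moduli.
# A counterfactual lower-scale change then registers as a changed upper-scale response.

def effective_modulus(grain_moduli):
    """Layers in series respond with the harmonic mean of their moduli."""
    moduli = np.asarray(grain_moduli, dtype=float)
    return len(moduli) / np.sum(1.0 / moduli)

grains = np.full(100, 50.0)          # 100 grains, each with modulus 50 (arbitrary units)
softened = grains.copy()
softened[:10] = 10.0                 # lower-scale intervention: ten grains recrystallize softer

print(effective_modulus(grains))     # macroscopic stiffness before the change
print(effective_modulus(softened))   # what that lower-scale change 'causes' at the upper scale
```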

The autonomous causal processes typified by wave motion do not invoke control variable considerations. Thus, we are not claiming that ‘cause’ necessitates an induced, manipulated change at a lower or higher size scale. Rather, the specializations that concern us are related to reasoning stratagems that are applicationally and circumstantially ‘mixed and matched’ according to early a priori training modules. The rigid requirements pertaining to how ‘causes’ relate to ‘effects’ depend on local architectures and distinctive bonds between word and world, just as “the strength of the thread does not reside in the fact that some one fiber runs through its whole length, but in the overlapping of many fibers”.[12]

According to Paul and fellow analytic metaphysicians, whose position rests upon the rigidified semantics prevalent within contemporary philosophy of language, during our early a priori training we attach a central ‘meaning’ to ‘cause’, with firm extensions across all ‘possible worlds’. The essential pattern of word/world(s) attachments is taken to assume such a shape; in agreement with Quine’s naturalist portrait of linguistic development, we reject the necessitarian assumptions of standard semantic essentialisms. This does not mean, however, that we ought not also be critical of the ways in which Quine himself lapses into Theory T thinking, grounding counterfactual claims in scientific laws. Wilson, by contrast, is partial to a Gaussian patchwork in which early a priori verities can collapse under inferentially patterned adaptations:

“Operating as a term of mixed descriptive and management import, the word ‘cause’ tags along with the architectural decisions we make in adapting established strands of parent reasoning into strategically modified sons and daughters. Borrowing terminology from the mathematicians, we can say that the newer employments represent natural continuations or prolongations…”[13]

Wilson gives us good reason not to reduce every explanatory setting to evolutionary modeling circumstances, as those analytic metaphysicians who overlook the fact that counterfactual claims “make perfectly good sense within explanatory circumstances that are equilibrium-center or which eschew direct consideration of temporal consideration through other means” are wont to do.[14] Insofar as Paul’s emphasis on early a priori considerations is concerned, Wilson’s objection to her defense of analytic metaphysical doctrine is rooted in the fact that her means of improving descriptive practice rests upon tearing our inferential doctrines away from the simpler demands upon which they were originally formed.[15] Rather than appeal to a Theory T “fundamental theory” of futurist appeal, as philosophers such as Ted Sider do, Wilson denies distinctions that rely upon ‘perfectly natural properties’, ‘internal versus relational properties’, and ‘counterfactuals sustained by explicitly articulated laws’. This is precisely why Wilson takes such pains to make the case that the history of classical mechanics is characterized by a dependable resistance to proposing plausible ‘laws’ regarding the basic cohesion of solid matter, preferring to co-opt the more metaphysically reliable “dodge of relying upon constraints and allied evasive crutches”.[16]

What do the differential equations within our science teach us about classificatory concepts? First and foremost, and in agreement with the inferentialist Hegelianism of Robert Brandom, statements of scientific law should be understood as making explicit something that is already implicit in ordinary empirical descriptions of how things are. Such equations are not directly anticipated within the subject’s pre-assigned syntax (hence the necessary task of ‘making explicit’)—the law is present in appearance, but it is not the entire presence of appearance. Under unique circumstances, the scientific law of nature has an ever different actuality; the laws of nature determine how things actually interact only when supplemented by actual boundary conditions, or applications. In fixing which antecedents are factual, lawful necessity is expressed under actual conditions, which single out some of those hypotheticals as worthy of detaching conclusions from.[17] Considering the Fourier-style characteristics of an uneven string, the decompositional traits capture basic behaviors of the string in a direct manner, even though the ‘recursive orbit’ of differential equation vocabularies rarely captures such terminological specificities; these are, instead, born from the “fixed point limits of holistic approximations, whose existences must be established by set theoretic means”.[18] Although, as the Greediness of Scales problem makes clear, we do not regard differential equations themselves as positing directly descriptive accounts of physical behavior below a cut-off level where scaling assumptions fail (regarding the formulas instead as convenient bottlenecks of descriptive overextension in our search for tractable conclusions), Fourier-like models obtained from such infinitesimal seeds capture target systems’ reliable characteristics on a macroscopic, dominant-behavior basis. Our string’s modal properties do not obtain their status as ‘important traits’ from laws or differential equations in and of themselves but from the means through which such interior considerations build bridges to boundary conditions, interfaces, basic modeling assumptions, and, more importantly, unformalized appeals to dissipation, relaxation times, and steady-state conditions. That is, the Fourier string modes obtain descriptive centrality by way of the vibrating string’s endpoints—continually redirecting traveling wave energy back towards the interior—coupled with relaxation-time dispersion. Such boundary condition behaviors designate unique physical factors, which are registered within the interior string equation. Without taking account of this operative cooperative partnership, the string loses its capacity for storing energy in standing wave containers.

Leibniz’ metaphysics is similarly concerned with descriptive overreach in differential equations. For Leibniz, differential equations do not naively and directly reflect physical reality but require being parsed “in a manner that accurately recognizes the expressive limitations of the tools of applied mathematics. The metaphysical entities he [Leibniz] reaches via these reflections are his strange monads”.[19] While we do not endorse such monads, Leibniz’ interpretative problem is still with us, for we cannot extract a plausible ontology from physical doctrine in the straightforward syntactic manner that Quine[20] and others posit (in their Theory T modalities). Appealing to the terms of Sider’s ‘internal’ versus ‘relational’ property distinction elides the critical importance that homogenization and allied techniques play in supplying mathematical surrogates for the environmental and interscalar relationships that determine how natural behaviors on varied length scales unify. What considerations must we attend to concerning the cooperation of linguistic labor?

The syntactic labels that we assign to significant physical behaviors “often derive from the mathematical terrain in which the advantages of strategies like factoring are clearly registered […] rather than first appearing within the grammatical orbits directly spawned by the original differential equation modelings”.[21] In Galileo’s description of triangles and rectangles,[22] boundary region ascriptions do not accord with the differential calculus tools used to describe their interiors—the corners concentrate stresses, undercutting the validity of the interior equation. Nonetheless, the idealized notion of perfect triangles also facilitates many helpful reasoning practices that are otherwise barred to us. To manage this, scientists since the 1950s have invoked functional analysis corrections to continually confront such discrepancies; Theory T thinking is oblivious to such cooperative repair because of its static conception of ‘ontology’.

As Wilson notes, “Paul’s emphasis on early a priori learning properly directs our attention to the linguistic question of how we competently manage a wide variety of differently strategized explanatory schemes”.[23] Our disagreements with Paul lie at the level of her rigidified semantic assumptions, which presume that words like ‘cause’ retain a constant and metaphysically analyzable ‘meaning’ throughout all of their helpful ministrations—following Wilson and other critics of permanent necessity (Quine included), we reject this premise. Sider does not repeat Paul’s mistake; instead, Sider makes a range of Theory T assumptions concerning how the predicates posited within science are stratified hierarchically with respect to ‘importance’. Such a static position “locks science with a conceptual straitjacket that fails to account for the subtle adjustments at the core of its improving practices”.[24] We can remedy Sider’s conception of metaphysics as a pre-scientific enterprise and a necessitarian doctrine resting upon syntactic convictions and a hypothetical ‘final physics’ by affirming Woodward’s manipulationist conditionals. In unison with Wilson’s robustly adaptive semantic pragmatism, we hold that referential ties to the natural world are ultimately rooted in language’s practical entanglements with it, in action-enjoining contextual manners which frequently employ complex modes of data registration.

 

[1] Mark Wilson, Physics Avoidance (Oxford: Oxford University Press, 2017), xv.

[2] Wilson describes a sewing machine with complex parts in which the circular motion of the right-hand crank is converted into the back-and-forth motion at the highest extremity suitable for sewing machine stitching; in this example, two mechanical pathways lead to a triangular piece, making it difficult to extrapolate visually whether this piece will turn clockwise or counter-clockwise (98).

[3] L. A. Paul, “The Handmaiden’s Tale,” Philosophical Studies 160 (2012), 3.

[4] Theodore Sider, “Introduction,” in T. Sider, J. Hawthorne, and D. Zimmerman, eds., Contemporary Debates in Metaphysics (Hoboken: Wiley-Blackwell, 2007), 18.

[5] W. V. O. Quine, “Two Dogmas of Empiricism,” in From a Logical Point of View (Cambridge, MA: Harvard University Press, 1980).

[6] Mark Wilson, Physics Avoidance, 246.

[7] Ibid., 248.

[8] See James Woodward, Making Things Happen (Oxford: Oxford University Press, 2003) for a discussion of how “cause” operates in accordance with manipulationist conditionals that arise from a wider set of operative circumstances.

[9] Mark Wilson, Physics Avoidance, 252.

[10] As opposed to the Theory T thinking of Nelson Goodman and Robert Stalnaker, in which significantly distinct forms of explanatory architecture are collapsed into the format of an initial value problem sans reliability-enhancing homogenization (the helpful kind of ‘physics avoidance’) and salient modeling equations are endowed with an evolutionary character of hyperbolic signature. Nelson Goodman, Fact, Fiction and Forecast (Cambridge, MA: Harvard University Press, 1983); Robert Stalnaker, “A Theory of Conditionals,” in Nicholas Rescher, ed., Studies in Logical Theory (Oxford: Blackwell, 1968).

[11] Mark Wilson, Physics Avoidance, 258 (emphasis added).

[12] Ludwig Wittgenstein, Philosophical Investigations (Oxford: Blackwell, 2001), §67.

[13] Mark Wilson, Physics Avoidance, 260.

[14] Mark Wilson, Physics Avoidance, 265.

[15] Ibid., 267-268.

[16] Ibid., 268.

[17] Robert Brandom, A Spirit of Trust (Cambridge, MA: Harvard University Press, 2019), 189-190.

[18] Mark Wilson, Physics Avoidance, 270.

[19] Ibid., 272.

[20] W. V. O. Quine, From a Logical Point of View (Cambridge, MA: Harvard University Press, 1980).

[21] Mark Wilson, Physics Avoidance, 270.

[22] Galileo, “Selections from ‘The Assayer’,” in Maurice A. Finocchiaro, ed., The Essential Galileo (Indianapolis: Hackett, 2008).

[23] Mark Wilson, Physics Avoidance, 278.

[24] Ibid.
