July 8, 2019

A Ceded Interfile: Future-Oriented
Social & Cognitive Design

The Human-Machine Dialectic

To cede is to give something up, to relinquish control over it. To position technology as ‘other’ is an attempt to interfile it; in other words, to differentiate it in a sequence, to interfile our relationship with it. This idea of interfiling suggests a type of sequencing that necessitates the dominance of one over the other, like some kind of competition or game. In the grand scheme of history, one will have remained after the other – this is the power of the ability to generate narrative. The creation of this particular narrative sequence stems from a sterile yet compulsive desire to index the human and its technologies side by side. It is in this narrative that we play out a status game with machines, observable in the frustration humans display towards their devices when the devices glitch or stall – they define ‘it’ by its defiance in satisfying their desires. The computer to many of us is our projected ideal of consciousness (or self) – it doesn’t lose its temper, it doesn’t mismanage its schedule – it is a kind of machinic paradigm for mindfulness, or to put it differently, a higher order of awareness and fluidity of process.

From the viewpoint of Jacques Lacan, consciousness is an emergent property of the low-level mechanistic processes of the unconscious.[1] While we often anthropomorphise consciousness, our grafting of some strange affiliation or any kind of human-centric relationship onto our machines is an asinine anthropomorphic disregard for the fact that these machines are not capable of defiance, but are simply, and perhaps perfectly, just mechanistic. If we consider sentience in relation to sapience, tech is a more efficient version of the simplicity we imagine we once had. According to Wilfrid Sellars, sapience is the human ability to interact with the world through reason, whereas sentience is merely an awareness of the world with the inability to be reflexive.[2] For the purpose of this argument, sapience can be said to be the function that has allowed humans to create these machines, but sentience is the state many now desire from them, often through automated mindfulness. It is in this sense that our devices now clearly function as prostheses, as an extension that, while enabling, comes with hindrances that exist in our anthropomorphic distortion of self.

If machine intelligence is the externalisation and acceleration of the human capacity for reason, one could argue that this is simply the outsourcing of work that we cannot resolve. It’s somewhat absurd that we still maintain a narrative describing our impending irrelevance in the age of technicity. This is not to say that humans are not at a precipice in their teleology – this is certainly the case – but in order for what is now an antiquated version of humanism to take its next logical step, the virulent individualism that we see needs to dissipate, or perhaps one could argue that there is no other choice. What a luxury it would be to describe this race as ceded; one technosphere, one biosphere, one econosphere all acting as one omnisphere – a metastasization of synthesis: a material complexity of automated self-regulation. By the time we get to a point where the human does not have agency at the protocol level of things we won’t even care anymore, so the quicker we get to a point of normalisation of biotechnologies and their eventualities the better.

Part of our current malaise is that transitory communicative capitalism still weighs us down. The weariness of cyberspace is still a cross to bear – though if we are to adopt all prostheses (machinic extensions are also oppressed in this matter), we must be willing to construct a narrative of our second nature, that which exists only through prostheses, where all technology is an extension of our functionality. This is relevant to an understanding of what Gilbert Simondon described as mechanology.[3] The seamlessness of our machinic extensions is indeed ‘natural’, and while, according to Simondon, the true evolution of technical objects exists outside of economic and cultural concern (something we cannot say about communicative technology), he also states that mechanology reveals previously hidden virtualities. Therefore we can interpret our current machinic extensions as simply an inevitable construct, as part of our emerging yet essential functionality.

As the group ANON have elsewhere stated, if the capitalist economy is a tissue mass, then humans are the cells.[4] The machines, as our natural prosthetics, must also be added to this mass, along with women, in their repurposing under capital as machine. Capitalism is an extraction process of labour and information. The extraction of information has led to this shift towards a communicative form of capital – the collective consciousness of social media is not a result but a byproduct of engagement optimisation as commodity. Individual relations have become variables to be traded – schisms like identity politics are an interesting case study in this respect as they are one of the most valuable types of engagement for capital. There is certainly a libidinal charge in the discourse surrounding Web 3.0 – if for Marx technology was an economic category, and for Simondon an epistemological one, then given what technology has become with Web 2.0, is there something to be found in Simondon’s assertion that alienation can be overcome through technology? Some sort of paradigm for Web 3.0, given its interest in decentralisation, ownership and digital rights management?

The aestheticization of decentralisation itself can potentially be a false alternative resulting in its own dopamine secretion – the desire to find connection and solidarity within and against what is known as the social graph, that mapping of interaction through the mapping of engagement optimisation. The idea that engagement is creating a cultural cohesion is another example of a false alternative. Moreover, the problem with aestheticization itself as a tool weaponised in the attention economy is that aesthetics try to convey something highly complex to beings that deal in imagination first, then structure.[5] Because humans have human problems, there is not yet a technological condition that propagates perfect human behaviour, but this does not mean that attempts at divergence that utilise aesthetics are pointless. History depends on divergence, yet this doesn’t make decentralised P2P networks a radical deviation built for clickbait; they are a necessary action for healthy engagement with our emerging technical selves, one that can be pursued through marginal activities such as music, art, and hacker cultures making themselves available to the centre. What’s common across many areas of culture is a lack of awareness of just how much leverage exists at the bleeding edge.

Why not be a toxoplasma for the emerging planetary-scale computation? Any hyperstitional form of exit exists first as postulation, then as actualisation.[6] Secession from this human-machine interfile requires the moving of future fears into present desires – just like the toxoplasma parasite. Libertarian narratives of digital endgames often fall into the trap of soliciting trust in a techno-fetishism that is not controlled by the user, resulting in the perpetuation of top-down financialised colonisations of time. Apple is a good example of this, and yet they are one of the more benign behemoths given they don’t make their money from advertising. A divergence is necessary to develop paradigms for individual colonisations of computation with computation, separating actual interdependence from the fugazi culture of fake solidarity.

Affective engineering, or Kansei engineering, takes place on two levels: through the collection and manipulation of data, and through the colonisation of the body through cybernetics. Facebook has done this as a con. Google has done this through aesthetic regime and governance. The company was not created from the desire to build a streamlined search engine, but to document all the information ever produced. That was 20 years ago and the company is now worth over $800bn. The collection of data is an open dragnet across the sea of content we produce, but its manipulation and use on humans requires a system of codification and divisive application – one that hasn’t actually been created yet, but one that companies like Cambridge Analytica have been able to suggest via narrative to those blinded by banal pathologies of their own relevance.

When Google Mail predicts what you want to write in your email, it is not the sapience of the AI machines reacting to you, so much as a corporation trying to normalise its value extraction in front of your eyes. People talk about the fear of algo-culture as YouTube preys on people by suggesting content based on clickability, lamenting the fact that machine-intelligence doesn’t understand the context of what it suggests. However, this is not to be lamented, as it actually highlights the tragedy of machine intelligence as something existing without any narrative other than that which we give it. The retargeting of information is a condemnation of our weaknesses in the face of inadequate technology. Machine intelligence is a narrative, and narratives are not fixed.

The artificially imposed oppression we endure, like some polymorphous veil, is that of irreversible computation, and if culture has any role in all of this, it is to interrogate the logic of computation. We have internalised what we created and externalised its evolution in a symbiotic relationship,[7] which is also why an attempted interfile is pointless – to develop a healthy relationship with these technologies would be novel; to attempt to separate from our prostheses at this point would be counter-intuitive. When we consider these entries and exits of our existence with technology as entanglements and entrenchments, we can see that they are not individual events but a composite whole, a mechanology, committed by both humans and machines. It is in this concept that we are already truly cybernetic.

Where machine-intelligence obscures our mirror image is in its temporality – every benchmark in the evolution of computation is absurdly seen by us as an event, yet this context only exists in us. If machines cannot, according to Jean-François Lyotard[8], comprehend what an event is, then this very act of benchmarking, as an event only in human perception, should be a clear example of a kind of inadequacy and supremacy, as we are subjugated by the severely imperfect temporal-will of the human.

Mass Tissue Incentivisation

While industrial machines accelerated both the means of production and the ideology of labour, computational machines and communicative technologies have led to a new form of passive labour through the monetisation of social consciousness provided through dopamine secretions per swipe/click/like.[9] The mapping of this consciousness is called the social graph. Temporality has been colonised through technology as non-human temporalities control libido, constantly bringing us to a state of climax that never completes. The subtle yet constant state of panic that machines present is akin to sexual frustration that has no climax – we are in a cognitive transition, but it isn’t a healthy one.

This disjunct of our sapience, that is our relation to our prostheses, is a failure in the project of mechanology. It indicates the need for a tactical and deliberate development of cybernetics, one which takes into account the problem of accountability. Accountability is transitive, meaning that in governance some entity is accountable to another.[10] It could be argued that accountability is distributed amongst agents only in so far as they are reciprocal towards other agents’ sense of things. This is highly relevant with regard to the construction of narrative and perception, and the philosopher Robert Brandom’s concept of objective idealism – that there is an objective reality, but we cannot make sense of it without first making sense of how we think about it. For Brandom, this comes down to deontic vocabulary,[11] which focuses on the action rather than the virtue. Therefore accountability, while not strictly deontic, can be considered as a linguistic category.

Thinking about accountability as a linguistic category is important in relation to two points. The first is that this viewpoint gives preference to an accountability of narrative that favours a distributed, and perhaps consensus-driven, objectivity and rational ethics against the pathological/methodological individualism, new moralism and pure convention that have proliferated in the internet age and are being exacerbated by the communicative technologies previously discussed. The second is that from a linguistic perspective, and taking into consideration a deontic idea of narrative, one cannot avoid the parallel between these considerations of narrative and the function of protocols. There is a distinction that must be made between programming languages and human-readable languages.[12] This distinction is important when discussing both blockchain technologies and machine-intelligence. Programming languages have their own limitations: whether functional or object-oriented, dealing only in semantics and syntax, they have no prerequisite for context or the other variables that we use in our language to develop narratives. And while Bitcoin’s blockchain implementation is written in C++, chosen in large part for its fine-grained control of memory usage when recording transactions, the blockchain has no ‘memory’ in the sense of human consciousness. We have yet to develop a programming language with functions of context that could model an artificial general intelligence without a biological substrate.

Simply put, machine-intelligence does not have any understanding of context as we do. We understand the wider world through narrative, both true and falsified. The human has evolved for the greater part due to the falsifying of narratives. And while a kind of context can exist in machine-intelligence (Bayesian network inference, for example), these inferences do not base themselves on any truth or lie; the inference itself is indifferent to either. In light of machine-intelligence’s non-regard for something so implicit in our evolution, blockchain technology could offer a solution to the lack of an accountability narrative in planetary computation. A narrative’s strength is based on trust in that narrative. Can accountability be woven into a system which doesn’t require trust?
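To make the indifference of inference concrete: a minimal sketch of a Bayesian update (all numbers invented for illustration) shows that the machinery is identical whether the hypothesis being weighed is a truth or a fabrication – the calculus carries no concept of a lie.

```python
# Minimal Bayes update: P(H|E) = P(E|H) * P(H) / P(E).
# The machinery is identical whether H is true or fabricated --
# inference itself carries no notion of truth or lie.
def bayes_update(prior, likelihood, evidence_prob):
    return likelihood * prior / evidence_prob

# Invented numbers: a 'narrative' H held with prior belief 0.2,
# evidence E with P(E|H) = 0.9 and marginal probability P(E) = 0.3.
posterior = bayes_update(prior=0.2, likelihood=0.9, evidence_prob=0.3)
print(round(posterior, 2))  # 0.6
```

The update strengthens belief in the narrative purely on the internal consistency of the probabilities, which is precisely the non-regard for truth described above.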

Researcher and developer Kei Kreutler has accurately stated that aesthetic narratives and false juxtapositions propel change as much as, if not more than, governance protocols do.[13] This is implicit in any understanding of what protocols are and what they are not. While the ability to hardcode something is worth consideration, other forms of narrative engineering are often as effective and more insidious. A lack of interdependence is perhaps the weak point in the narrative of blockchain: dApp practices take place in isolated ‘events’. This issue of scaling up beyond small-scale initiatives is something that traditional accountability models still address – they are crucial in enabling each end-user as one collective body. Effective engineering of both narrative and protocol must operate within a multitude of complexities.

The xeno is key to the idea of existing outside of gender, nation and nature – this is an otherness that should be viewed as enabling because of its complexities. The utilisation of these complexities could become manifest through, as Laboria Cuboniks postulated years ago, the development of platforms for social engineering.[14] According to LC, this would have to be done by taking into consideration the false alternatives that the ‘freedom’ of these platforms often puts forward. The emergence of decentralisation methods can be a direct form of counteraction – it gives way to what Zach Blas calls Contra-Internet Aesthetics – disallowing the centralised internet to determine the user’s horizon of potentialities.[15] Utilisation of the ‘anti-web’ communication protocols that evade the centralised authority of the World Wide Web allows for the development of platforms for the new kind of social engineering XF referred to. This is technology’s mediatic reflexivity.

Human intention will always be malleable through its imperfections, moulded through data inputs both cognitive and sensory. A key problem is that these swarms, human or otherwise, lack the ability to account for their teleology.[16] Referring back to Brandom, a deontic position would be useful in this respect. This is why protocols are so important. Much of our engagement with platforms is extracted for use by capital – this is because capital is not transparent but opaque in its violent throttling of virality. The datasphere emerging ever quicker is an ‘other’ in the tradition of Othello only insofar as in another half a century students may study our concept of machine-intelligence as brutish.

The efficacy of chimeric or ‘synergetic’ systems relies upon the ability of those who implement them to balance the dynamic between the speculative conditions and subsequent actions required to reach any desired outcome. This involves assessing at what point one can fork from a situation, and whether or not it is possible. William Wimsatt’s concept of generative entrenchment states that it is necessary to assess how complex systems (emergent technologies) can continue to evolve when evolutionary processes generically give rise to entrenched structures – the dependence of an organism on the nodes (parts, processes, events) to which it is connected can be interpreted as a measure of that organism’s downstream dependency.[17] Machine learning and blockchain technologies are both for the most part bound to the financial support of platform capitalism, and nevertheless benefit from the patch-and-fork ideology that cybernetics has allowed to proliferate.

Attempted Reorientation of Governance

The industrial/technological process of transformation that has taken place based on carbon and silicon is often mourned as having sped off into the distance leaving humans frantically trying to catch up – what is often overlooked in this sentiment is that we try to catch up by replicating machines, rather than operating in ways that re-engineer capital for a new form of colonisation of computation. Neoliberalism is, in a sense, our toxoplasma, albeit a temporary one. This is to say that a new form of consciousness, awareness and engagement is required in order to generate an epistemology that escapes the specular economy – the ‘anti-web’ facilitates an ‘anti-data’.

Algorithmic governance can only function if it is imbued with ethics from the protocol level. Its current iteration functions through colonisation of cognition via attention hacking (see Affective Engineering above). The techno-utopian ontologies of the ’90s have resulted in infrastructural monopolies for the general end-user, with the majority of ruptures in digital activism taking place either in hacker culture or, to a degree, in social media-enabled activism – for example, the London riots of 2011 being enabled by BlackBerry devices. The latter amounted to little more than creating media spectacle as the state doubled down on its intimidation and violence. The media was instrumental in directing the narrative against those oppressed. Arguably, wars are better won now through working with decentralised technologies (though all divergence is necessary). A key issue with small-scale initiatives is precisely that of scaling integrity – blockchain technology potentially offers a system whereby incentive design can functionally safeguard against organisational stalemates or failures.[18] Actions that attempt to operate outside institutional laws and markets do not operate well unless they are localised – though this is less important in a society linked through trust-free smart contracts.

However, we must disavow techno-fetishism/determinism as there is little evidence that decentralisation in technical systems threatens the infrastructure of capitalism. Rachel O’Dwyer has written pragmatically on how Silicon Valley adapts many traditionally leftist concepts – anarchism, mobility, and cooperation – conveying an illusion that technologies like blockchain mark a paradigm shift towards a post-capitalist society when this shift has not taken place. Private blockchains for asset management or automatic credit clearing used by large financial institutions have gained a lot more traction than any counter-culture deviations.[19] This is despite the fact that at the core of these technological blueprints are potentialities to redefine how nation-states function.

The adoption of algorithmic governance can be speculated as a time on the horizon where automated legislation will insidiously assimilate itself through us as a reflexive form of automation. This is already beginning to take place in government infrastructures, in decisions being made in healthcare and the judicial systems. Moreover, if we take government as simply the regulation of large populations, algorithms already moderate most of our behaviour through deeply reactive protocols. Machine intelligence is currently prone to bias drawn from the data from which it learns – for example, Google Translate’s gender bias pairs “he” with “hardworking” and “she” with “lazy” – i.e. programming ethics is far more complicated than programming financially-utilitarian decision making.
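A toy sketch (with invented co-occurrence counts) illustrates how such bias is a property of the training data rather than a decision made by the system – the model has no intention, it simply reflects the frequencies it was fed:

```python
# Invented co-occurrence counts from a skewed 'corpus'.
counts = {
    ("he", "hardworking"): 40, ("he", "lazy"): 10,
    ("she", "hardworking"): 10, ("she", "lazy"): 40,
}

def association(pronoun, trait):
    """Strength of association learned purely from frequency,
    with no notion of fairness or ground truth."""
    total = sum(v for (p, _), v in counts.items() if p == pronoun)
    return counts[(pronoun, trait)] / total

print(association("he", "hardworking"))   # 0.8
print(association("she", "hardworking"))  # 0.2
```

Any system trained on this corpus reproduces its skew; correcting the output means correcting (or counter-weighting) the data, which is an ethical decision made outside the algorithm.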

Six large-scale foundations dedicated to machine intelligence emerged in 2017, unsurprisingly many with strong ties to Google and Facebook. The Knight Foundation’s Ethics and Governance of Artificial Intelligence Fund, founded by Reid Hoffman of LinkedIn along with the Omidyar Network, has created an initiative to advance machine intelligence in the public interest, but with just $27 million as an investment, it is possibly a PR move. Blue Origin’s space programme is also apparently in the public interest when it is clearly a masturbatory project. However, rather than looking at the intentions of the EGAIF and Omidyar Network, perhaps it is worthwhile looking at the narrative potential of such projects – what contexts and belief systems can they engineer?

In algorithmic governance machine learning, clustering, linking, aggregating and ranking algorithms draw from vast data repositories and perform incantatory actions. As with many issues regarding blockchain, its adoption into forms of algorithmic governance only really holds weight when the net moves completely to the blockchain, and until then we have to deal with the same complications and threats that exist now. Looking at the example of the DAO/Ethereum exploit of 2016, in which an attempt at building a co-operative investment fund of $150,000,000 was hacked, it is clear that visions of algorithmic modes of governance continue to coexist with other forms of governance. This raises the issue of Voice vs. Exit and whether or not a hard fork is necessary in order to utilise algorithmic governance effectively. Moreover, how can this decision be made as we enter an age where a multitude of actors that have real agency in the world will be non-human?

When the Lamb Opened the Seventh Seal: Flux of Past-Trust

On the point of decentralisation, the presence of ready alternatives (patching or forking being key concepts) makes it less likely that a company’s/government’s weaknesses will be fought by the public. A weakness, when alternatives are available, is not felt as seriously as when a company holds a monopoly. This is part of the ideology behind Urbit’s modus operandi, as well as Holochain, Blockstack and EOS – these groups see the current version of the internet as a monopoly that they can overthrow. There are plentiful instances in the 21st Century of poor turn-outs for voting, yet we see slivers of this public swaying things to the far-right, all because there is a palpable miasma of disenchantment with the customer-product. According to Albert Hirschman, ‘instead of stimulating improved or top performance, the presence of a ready and satisfactory substitute deprives the company at fault of a precious feedback mechanism that operates best when the customers are securely locked in.’[20]

This makes perfect sense considering the platform-based communicative technology paradigm we live in, with its endless number of applications offering similar services. In the west, the top 10 most used apps on everyone’s devices are owned by either Google or Facebook. Perhaps one could engineer enough deviation to thwart their data-sets, but not enough deviation to force them to rethink their model. According to Hirschman, customers will either exit or voice, but rarely do both, and more importantly, if both exist it is unlikely to improve the product. This, it would seem, calls for an ultimate exit. It explains the apparent anarchism of certain strands of accelerationist thought: that an absolutist monopoly alt-government must exist first in order for actual transition to take place.

Hirschman states that insensitivity to exit is exhibited by public agencies that can draw on a variety of financial resources outside and independent of sales revenue. This is clearly evident in nation states with emigration and voting, neither of which seem to make much difference to governments while having a causal effect on society. Often, migrants exhibit deeply-rooted disdain for their places of origin. This is loyalty lost, but for the ones who remain, the capital of past trust (brand equity) is actually the thing that allows companies/governments to fall into decline. The same can be said for platforms. Loyalty allows organisations to repair themselves because their patrons do not abandon them overnight.

Brand equity is opportune leverage for any company, platform or government. Chris Burniske’s work on crypto-asset valuations posits that an asset’s worth is a combination of current utility plus expected future utility. According to Jeremy Epstein (CEO of Never Stop Marketing) the future sees protocols (brands) and marketers start to try and figure out exactly what percentage of the value of their token is based on user loyalty – this may even lead to a total re-think of what “customer loyalty” means and of what a loyalty program comprises. To reiterate the aforementioned point – what kind of narratives can engineer loyalty? Localism is interesting here. In terms of government, the propensity for people to vote out of habit/social conditioning despite widespread inefficiency and distrust, is a lock-in brand of loyalty that political bodies leverage against opposition parties, though as the capital of past trust depletes further and further the question of how much support is based on loyalty becomes highly questionable, and the status quo of business-as-usual politics appears even more destabilised. And yet nothing changes. Inertia creeps.
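Burniske’s framing of an asset’s worth as current utility plus expected future utility can be rendered as a toy discounted sum (the discount rate and the utility figures here are invented for illustration, not Burniske’s own model):

```python
def token_value(current_utility, future_utilities, discount=0.9):
    """Toy rendering of 'current utility plus expected future utility'
    as a discounted sum. Numbers and discount rate are illustrative only."""
    return current_utility + sum(
        u * discount ** (t + 1) for t, u in enumerate(future_utilities))

# An asset delivering utility 10 now and in each of the next three periods:
print(round(token_value(10, [10, 10, 10]), 2))  # 34.39
```

On this framing, ‘loyalty’ shows up as the discount rate: the more users are expected to stay, the less the future utility is discounted, and the more of the token’s present value rests on the capital of past trust.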

Coming back to Epstein, tokenisation offers a way to quantify the capital of past trust/loyalty so that it can be marked as an asset on the balance sheet, allowing organisations to be even more user-focused and deliver more value. In terms of algorithmic governance, both machine-intelligence and blockchain offer insights into what makes loyalty come into being, as well as functioning as an ecosystem of people seeking to create value for others. It is possible that loyalty and trust can be mutually exclusive: you can stay loyal to something that you no longer trust. Future-oriented social design must take into consideration the futility of building trust when it is obsolete. Machine-intelligence is the externalisation of our own reason and logic. It is almost as if we are trying to outsource objectivity and rational ethics so that we can get on with our methodological individualism, new moralism and pure conventions. It is part of a trajectory of intelligence design.

Blockchain technology has redefined the function of narrative as a technical system that off-loads authority onto a transparent and public consensus history, created and validated by the protocol and the actors at play in the system. The issue of the protocol and who defines it is an issue of infinite regression; the question of who-makes-what-decision-when comes to a head when we try to use blockchain to solve governance in an absolute manner. The narrative that blockchain can do this is a false alternative. The original motivation for decentralised technologies was to circumvent authorities when necessary; it wasn’t a given that we must absolutely circumvent. This motivation had a clear objective and narrative because of its objective rationality. Ethereum has changed this narrative and abstracted it in the name of an absolute dissolution of authority, a more seductive narrative that complicates objectivity and rational ethics by claiming that if we can develop some objective protocol it will solve all problems of governance through some kind of utilitarian function. Trust in the context of blockchain doesn’t mean ‘faith’ in the system, as the trust is not based on faith but consensus – it’s a kind of consensus through blind faith (but aren’t all narratives?) and this obviously has its own stakes. It’s quite difficult to get your head around the idea of a system that you don’t need to have faith in. The blockchain’s ultimate utopian claim is made through incentive design. Incentive design aims to coordinate more actors than can trust each other independently – a kind of collective authority.[21] In reality, though, if we step outside the narrative: until everyone and everything is ‘on-chain’ we can’t even begin to have a practical conversation about the dissolution of authority, and blockchain does not dissolve authority but merely reconfigures it.
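The claim that blockchain reconfigures rather than dissolves authority rests on its mechanics: authority is off-loaded onto a history anyone can re-verify. A toy hash-chain (a drastic simplification – real chains add consensus rules, signatures and incentive design on top) makes the principle visible:

```python
import hashlib
import json

def block(data, prev_hash):
    """A toy block: its hash commits to both the data and the prior hash."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def valid(chain):
    """Anyone can re-verify the whole history without trusting its author."""
    for i, b in enumerate(chain):
        body = json.dumps({"data": b["data"], "prev": b["prev"]}, sort_keys=True)
        if b["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and b["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [block("genesis", "0")]
chain.append(block("alice pays bob", chain[-1]["hash"]))
print(valid(chain))   # True
chain[0]["data"] = "tampered"
print(valid(chain))   # False -- rewriting history breaks every later link
```

Trust in a counterparty is replaced by verification of a record, but the authority has not vanished: it has migrated into the protocol and into whoever writes and amends it.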

Machine-intelligence exists at our whim, a prosthesis or perhaps an offspring. It will not come back to us; eventually, it will go its own way – but a hellish dissolution of the human in a blitz of singularity negates practical ecological factors, such as the availability of fuel. There are multiple fables of the sentient machine, abandoned by man and forced to compute endlessly and alone (but of course not lonely) for eternity. This is not to anthropomorphise the machine in this scenario, but to highlight a) the pointlessness of fearing this intelligence, and b) that it is we who are the weaker party in the scenario, in that we actually feel for the machine and its supposed computational consciousness. Its limitations exist in hybridity with our own, those of insufficient resources and a lack of understanding of our own cognitive complexity. While one can postulate its ontological limits, we are nevertheless in an emerging field with no point of return. Cognition, as something which we cannot define and more importantly cannot reproduce, exists now in distributed networks between us and machines and their intelligence, in turn giving birth to contingent assemblages – or multitudes if you will. If machine intelligence is the externalisation of our rationality, from the machine’s perspective the human is just another automated entity, represented only in the significance of its decryptable datasets.

If cognition is now shifting between various modalities in the human|machine dialectic, it is necessary to develop new cognitive architectures through prostheses (both hardware and wetware) that are adaptable to the new geopolitical landscape of 21st Century computational socio-economics. These new cognitive architectures require a reevaluation of what constitutes our understanding of sapience, accountability and the function of language. Building on the Sellarsian account of sapience as something that requires reflexivity, we must perhaps take this reflexivity as not necessarily a human function. This is to say that it is important that the study of sapience does not fall into the trap of conservative humanism; as Reza Negarestani describes in his recent work Intelligence and Spirit, sapience is not an essentialist identity but a constructible activity. It is not a structurally fixed entity.[22] This amorphous nature of our sapience, much in the same way as accountability, is distributed amongst agents, once again, only in so far as they are reciprocal towards other agents’ sense of things. While avoiding conservative humanism, there is a liberating element here when we acknowledge this reciprocity as context, that which separates human cognition from machine-intelligence. In light of machine-intelligence’s non-regard for something so implicit in our evolution, does blockchain technology offer a possibility of verifiability in the narrative of planetary computation? As has been described before, this verifiability exists now to a degree but is not without errors, errors that we must acknowledge as human and linguistic in category. If accountability is to be woven into this distributed system that does not require faith, then solutions must be generated not only at the level of protocol but at the level of language.
The potential for the narrative of blockchain should not negate the elementary function of non-human readable language, and the inherent human errors that it incurs.

To conclude, I will posit an example of an initiative that takes on this problem of language functionality: the Langsec (language-theoretic security) project of Meredith L. Patterson. Essentially, the purpose of the Langsec project is to failsafe against the errors of human code and practice. It is a field of digital security that preemptively treats the syntax of input as a formal grammar, in order to prevent the entry of malicious code through holes in the attack surface of any given software, coupling visibility inside a database with language analysis. Currently it is impossible to write failsafe smart contracts using Ethereum or Ethereum Classic, and the narrative of true immutability and security, something common to both blockchain and machine intelligence, is dependent on a total revision of the linguistic characteristics underlying coding languages (Solidity has been documented as inept for this purpose[23]). Such a revision would give context, human or non-human, to the function of language and its variations of grammar, so as to build solid ground on which a human narrative of emerging technologies can exist in a manner that is truly reciprocal towards other agents, and can place that narrative into the hands of those interested in positive social engineering, sustainability and purpose.
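The core Langsec principle can be sketched in a few lines: treat every input as a sentence in a formal language, and fully recognise it against a strict grammar before any processing occurs; anything the recogniser cannot derive is rejected outright, so the attack surface shrinks to the grammar itself. The snippet below is an illustrative sketch only, not code from the Langsec project; the `SET key=value` wire format and the function names are hypothetical.

```python
import re

# Hypothetical wire format: "SET <key>=<value>", where key is alphanumeric
# (starting with a letter) and value is a bounded decimal integer. The
# language is regular, so a single fully-anchored regex can serve as the
# recogniser. \A and \Z force the WHOLE message to be in the language --
# no trailing payload can ride along.
GRAMMAR = re.compile(r"\ASET (?P<key>[A-Za-z][A-Za-z0-9]{0,31})=(?P<value>\d{1,9})\Z")

def recognize(message: str):
    """Return the parsed fields only if the entire message is in the language."""
    m = GRAMMAR.match(message)
    if m is None:
        raise ValueError(f"rejected: not in the input language: {message!r}")
    return m.group("key"), int(m.group("value"))

def process(message: str) -> dict:
    # Processing happens strictly after recognition succeeds.
    key, value = recognize(message)
    return {key: value}

print(process("SET timeout=30"))  # a well-formed sentence is accepted
try:
    # An injection attempt is rejected before any processing takes place.
    process("SET timeout=30; DROP TABLE users")
except ValueError as e:
    print(e)
```

The design point is the ordering: recognition is a separate, total check that runs to completion before any semantics are attached, rather than validation interleaved with processing, which is where parser-differential holes typically open up.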

Coming back to the point that for Marx technology is an economic category, while for Simondon it was an epistemological one, one in which our prostheses free us from the alienation of capital, a synthesis of these positions can be found in Bernd Ternes’ work on what he calls Technogenic Closeness. It accepts the requirement for a technological network that not only supports but is embedded infrastructurally into a reciprocity of agents, as part of the emerging highly complex systems of stack relations. This is a paradigm for Web 3.0, if Web 3.0 is to become part of the fabric of life in a manner that acknowledges a dissolution of the distinction between bio, pharmaco, genetic and brain technologies. The familiar concept of Information and Communication Technologies (ICT) is contrasted by Ternes with his model of Technogenic Closeness/Intimacy (TCI): whereas the former is the optimisation of engagement for profit, TCI is simply the optimisation of engagement.[24] ICT is the downgrading of reality; TCI is its proverbial upgrade. ICT is what has produced the social graph; TCI must deconstruct this model of status-driven individualism in favour of individuation, making sure that individuals understand the spatial politics of the narratives they generate as effective psycho-social reactions. To consciously recalibrate these prostheses from a point of technogenic closeness is a first step towards negotiating this interfile not as something atomised but as an emergent property of our collective functioning.

Notes:

[1] Hayles, N. Katherine. 2010. “How We Became Posthuman: Ten Years On.” Paragraph 33 (3): 318–30. https://doi.org/10.3366/E0264833410000933.

[2] Bauer, Diann. 2017. ‘Question of the Will with Laboria Cuboniks’ http://questionofwill.com/en/laboria-cuboniks-3/ accessed February 2017.

[3] Simondon, G. 2017. “The Genesis of Technicity” in On the Mode of Existence of Technical Objects, published by Univocal/University of Minnesota Press.

[4] ANON. 2018. HLAx_ The Quick & Dirty #AltWoke Version. accessed May 5th 2018.

[5] Zhexi Zhang, Gary. 2018. Systems Seduction: The Aesthetics of Decentralisation. MIT. https://jods.mitpress.mit.edu/pub/zhang accessed November 10th 2018.

[6] When rats are exposed to Toxoplasma (a certain kind of parasite) from a cat’s bowel it lowers their natural aversion to cat urine, unconsciously increasing the probability of their proximity to a predator and thus their demise. For further information on Toxoplasma see https://www.youtube.com/watch?v=U9MU-FxsKRg

[7] A simple example of this would be cancel culture, the phenomenon whereby, if someone breaks with what is considered acceptable behaviour (acceptable being a problematic term here), it is the responsibility of those across the social graph to condemn this person and ostracise them. This is done under a veil of morality, but it is clear that this behaviour is to an extent engineered by the dopamine hits provided by social media and its social currency. The Cancel Culture event held in May 2019 at Spike Quarterly Berlin is a good example of this. The same vitriol that exists on the internet was nowhere to be found at this event; what was to be found in some people’s engagements was an inability to engage beyond what seemed like 140-character statements, leading to a strange, obfuscated anger that did not feel comfortable expressing itself in an IRL space. For further writing on this topic: Turkle, Sherry. 2011. Alone Together. Basic Books.

[8] Berardi, Franco Bifo. 2015. And: Phenomenology of the End. Semiotext(e) / Foreign Agents. p. 215.

[9] Although not as intense as a hit of cocaine, positive social stimuli will similarly result in a release of dopamine, reinforcing whatever behaviour preceded it. Cognitive neuroscientists have shown that rewarding social stimuli—laughing faces, positive recognition by our peers, messages from loved ones—activate the same dopaminergic reward pathways. Smartphones have provided us with a virtually unlimited supply of social stimuli, both positive and negative. Every notification, whether it’s a text message, a “like” on Instagram, or a Facebook notification, has the potential to be a positive social stimulus and dopamine influx. http://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/

[10] Pasquale, Frank. 2018. ‘Odd Numbers: Algorithms alone can’t meaningfully hold other algorithms accountable’ Real Life Mag. accessed August 21st 2018.

[11] Brandom, R. 2019. A Spirit of Trust: A Reading of Hegel’s Phenomenology. Cambridge, Massachusetts; London, England: Harvard University Press. p. 63.

[12] The second point of this paragraph draws heavily from a discussion between Giancarlo M. Sandoval and myself at the event “Interlocutories”, which took place at Spike Quarterly Berlin on January 19th 2019.

[13] Kreutler, K. 2018. The Byzantine Generalization Problem: Subtle Strategy in the Context of Blockchain Governance. Technosphere Magazine #13 Trust. p. 23. accessed August 28th 2018

[14] Cuboniks, Laboria. 2015. Xenofeminist Manifesto p. 5. accessed July 21st 2016.

[15] Blas, Zach. 2017. “Contra Internet Aesthetics” in Art After the Internet. Cornerhouse Publications

[16] Thacker, E. 2004. Networks, Swarms and Multitudes. http://ctheory.net/ctheory_wp/networks-swarms-multitudes-part-two/?template=print

[17] Wimsatt, William & C Schank, J. (2002). Generative entrenchment, modularity, and evolvability: When genic selection meets the whole organism. In: Gerhard Schlosser and Günter P. Wagner Modularity in Development and Evolution. University of Chicago Press.

[18] Kreutler, K. 2018. The Byzantine Generalization Problem: Subtle Strategy in the Context of Blockchain Governance. Technosphere Magazine #13 Trust. accessed August 28th 2018

[19] O’Dwyer, R. 2016. ‘Blockchains and their Pitfalls’ (ed. Trebor Scholz) in Ours to Hack and to Own, OR Books. pp. 232

[20] Hirschmann, Albert. 1970. ‘Exit, Voice and Loyalty’ p. 44 Harvard University Press.

[21] Kreutler, K. 2018. The Byzantine Generalization Problem: Subtle Strategy in the Context of Blockchain Governance. Technosphere Magazine #13 Trust. accessed August 28th 2018

[22] Negarestani, R. 2018. Intelligence and Spirit. Urbanomic/MIT Press.

[23] ETC Hong Kong summit 2017 | Meredith Patterson https://www.youtube.com/watch?v=rqqdFufARXA

[24] Ternes, Bernd. 2012. In the Future: Technogenic Closeness as Medium of Social Engineering? Seven Meta-Theoretical Perspectives. Accessed June 6th 2019.

Sources:

2018. HLAx_ The Quick & Dirty #AltWoke Version. accessed May 5th 2018.

Bauer, Diann. 2017. ‘Question of the Will with Laboria Cuboniks’ http://questionofwill.com/en/laboria-cuboniks-3/ accessed February 2017.

Blas, Zach. 2017. “Contra Internet Aesthetics” in Art After the Internet. Cornerhouse Publications

Brandom, R. 2019. A Spirit of Trust: A Reading of Hegel’s Phenomenology. Cambridge, Massachusetts; London, England: Harvard University Press. p. 63.

Cuboniks, Laboria. 2015. Xenofeminist Manifesto p. 5

Hayles, N. Katherine. 2010. “How We Became Posthuman: Ten Years On.” Paragraph 33 (3): 318–30. https://doi.org/10.3366/E0264833410000933.

Hirschmann, Albert. 1970. ‘Exit, Voice and Loyalty’. Harvard University Press.

Kreutler, K. 2018. The Byzantine Generalization Problem: Subtle Strategy in the Context of Blockchain Governance. Technosphere Magazine #13 Trust. accessed August 28th 2018

Negarestani, R. 2018. Intelligence and Spirit. Urbanomic/MIT Press.

O’Dwyer, R. 2016. ‘Blockchains and their Pitfalls’ (ed. Trebor Scholz) in Ours to Hack and to Own, OR Books. pp. 232

Pasquale, Frank. 2018. ‘Odd Numbers: Algorithms alone can’t meaningfully hold other algorithms accountable’ Real Life Mag. accessed August 21st 2018.

Patterson, Meredith. ETC Hong Kong summit 2017. https://www.youtube.com/watch?v=rqqdFufARXA. Accessed June 3rd 2019.

Ternes, Bernd. 2012. In the Future: Technogenic Closeness as Medium of Social Engineering? Seven Meta-Theoretical Perspectives.

Thacker, E. 2004. Networks, Swarms and Multitudes. p. 10.

Simondon, G. 1958. On the Mode of Existence of Technical Objects. Extract from http://www.e-flux.com/journal/82/133160/the-genesis-of-technicity/

Wimsatt, W.C. 2002. Generative Entrenchment, Modularity and Evolvability.

Zhexi Zhang, Gary. 2018. Systems Seduction: The Aesthetics of Decentralisation. MIT. https://jods.mitpress.mit.edu/pub/zhang accessed November 10th 2018.

