16 September 2014

On Shareable Media: What is the Apple Watch and what is it doing on your wrist?


Xerox Palo Alto Researchers using Tabs, Pads and Boards (Weiser 1991)
Three years ago I wrote a post about Apple's strategy towards digital devices entitled Tabs, Pads and Boards: Why Apple et al will make a HDTV, which attempted to understand the way in which certain material forms and sizes were beginning to sediment in relation to media production and consumption. The aim of the article was to try to divine Apple's strategy in relation to dividing up screens by using the concepts of tabs, pads and boards, drawn from the work of Xerox Palo Alto researchers in the late 1980s and early 1990s. This was not just to provide an analysis of Apple but also of general trends in the technology industry, using Apple as an exemplar. One of the key considerations was how screens are entangled with space and how the norms and values of usage correspond to shared media practices. As such I originally thought that this framework might play out in the following fashion,

  • PERSONAL: 
    • Tabs: iPhone, iPod
  • SEMI-SHARED: 
    • Pads: iPad, iPad Mini, MacBook
  • SHARED: 
    • Boards: AppleTV? 

As I stated in the original article, "the success of the iPad, and other new tablet-like devices, shows that what people want to be able to do with their media will become increasing important in both differentiating computational products, but also in structuring the technology and media industries", and I think that this still holds true in relation to trying to understand how computation is driving consumption habits and medial change. Indeed I argued,
Through a number of refinements and empirical experiments [Xerox] settled on range of device categories that seemed to be needed to negotiate a computational media landscape, dividing them into three classes: tabs, pads, and boards: tabs are 'inch-scale machines that approximate active Post-It notes', pads are 'foot-scale ones that behave something like a sheet of paper (or a book or a magazine)', and boards are 'yard-scale displays that are the equivalent of a blackboard or bulletin board' (Weiser 1991: 80). It does not take much imagination to see that Apple's strategy has followed the Xerox research to a remarkable degree, except for one glaring exception [of the TV screen] (Berry 2011).
I think it is useful to revisit some of the arguments I made in light of an extremely interesting new addition to the line-up (and, to some extent, the loss of the original iPod). This is exemplified by the Apple Watch, a personal, intimate technology that is part of the space called wearables, and which has analysts, like Benedict Evans, trying to understand how such devices fit into everyday life practices (Evans 2014). Again, I argued, "Xerox team saw computation as a distributed system, not a self-contained device. That is, that they understood the importance of the network for computational media. This immediately transformed the kinds of information that each of these classes of technical device was able to use and transmit to others, and most importantly these devices were programmed to understand the importance of the real-time stream, above and beyond that of historical data and media. Indeed, they even referred to 'liveboards'".

I think that this framework becomes even more relevant to the Apple line-up in light of the revision of their technologies, both because the original framework of tabs, pads and boards still seems to be a useful heuristic for thinking about this, and because Apple has responded to user feedback with what I think is an intensification of the divisions that Xerox had developed. So in my earlier formulation I thought that the tab corresponded to the iPhone and iPod, but it seems that this technology was not intimate enough and actually is not as private/personal as originally envisioned. In fact, in reworking these categories I think that the structure of experience is now spread across the devices in the following way,
  • PERSONAL: 
    • Tabs: Apple Watch
  • SEMI-SHARED: 
    • Pads: iPhone 6, iPhone 6+
  • SHARED: 
    • Boards: iPad Air/AppleTV/? 
Here I am also connecting these types to normative practices (personal, semi-shared, shared) in relation to usage of the devices. I think that this is important because the iPhone and iPod, which seemed like extremely personal devices in their early iterations, have in fact become increasingly public and shareable, albeit not as public as a TV screen. This has been magnified by the increase in display size of both new models of the iPhone 6, now sized at 4.7" and 5.5" (up from the iPhone 5's 4" display and the iPhone 4's 3.5" display). The iPhone 6/6 Plus is now also a wallet, which needs to be "displayed" to purchase goods, but the larger screen is also more amenable to sharing information (who hasn't taken a photo and then passed their phone around a group, for example?).

The way in which the Apple Watch has pushed all the devices up this framework points back to the original formulation of the tab at Xerox as an inch-scale machine, one which can transmit extremely personal and even intimate information to the user without others being aware. Here, I am thinking of the new "taptic engine", which can transmit discreet vibrations and "taps" through haptic technology to the wrist, together with nice touches like social media sharing of pictures and messages, not to mention the ability to send your heartbeat to a friend.

The Apple Watch functions as a sophisticated personal GPS, giving directions and routes through haptic feedback as discreet taps for turning left or right. Here, there are links to notions of a transitional object that mediates movement between different kinds of spaces: home to public space, place to place, and around an unfamiliar location or city. Of course, the Apple Watch still enables looking at photos, listening to music (via Bluetooth headphones), voice-messaging, and voice-operated commands using Siri, which makes it potentially a very intimate repository of identity and memories. But the Apple Watch is also very much a fashion device, and again will be strongly linked to personal self-identity and public signalling of status and what Bourdieu (1986) called distinction.

This analysis still leaves the question of boards somewhat hanging. Should we expect the iPad to become increasingly board-like, very much a shared consumption and display device, or will Apple finally produce a form of television, perhaps a 4K or 5K version, that completes the spread of categories? I think that the work done at Xerox provides a powerful way of understanding the way in which our current devices are morphing in size and capability, and continues to give us at least a basic map of the future trajectories of shareable and shared media. For that reason the next couple of years will be interesting in relation to computational media, communications technologies and social networks and their continued penetration of everyday life.




Bibliography

Berry, D. M. (2011) Tabs, Pads and Boards: Why Apple et al will make a HDTV, Stunlaw, accessed 16/09/2014, http://stunlaw.blogspot.co.uk/2011/04/tabs-pads-and-boards-why-apple-et-al.html

Bourdieu, P. (1986) Distinction. London: Routledge.

Evans, B. (2014) Ways to think about watches, accessed 16/09/2014, http://ben-evans.com/benedictevans/2014/9/15/ways-to-think-about-watches

Weiser, M. (1991) The Computer for the 21st Century, Scientific American, accessed 18/04/2011, http://nano.xerox.com/hypertext/weiser/SciAmDraft3.html

28 August 2014

On Latour's Notion of the Digital


Bruno Latour at Digital Humanities 2014
Bruno Latour, professor at Sciences Po and director of the TARDE program (Theory of Actor-network and Research in Digital Environments), recently outlined his understanding of the digital in an interesting part of his plenary lecture at the Digital Humanities 2014 conference. He was honest in accepting that his understanding may itself be a product of his own individuation and pre-digital training as a scholar, which emphasised close-reading techniques and agonistic engagement around a shared text (Latour 2014). Nonetheless, in presenting his attempt to produce a system of what we might call augmented close-reading in the AIME system, he was also revealing about how the digital was being deployed methodologically and about his notion of the digital's ontological constitution.[1]

Unsurprisingly, Latour's first move was to deny the specificity of the digital as a separate domain as such, highlighting both the materiality of the digital and its complex relationship with the analogue. He described both the analogue structures that underpin the digital processing that makes the digital possible at all (the materials, the specific electrical voltage structures and signalling mechanisms, the sheer matter of it all), and the digital's relationship to a socio-technical environment. In other words, he swiftly moved away from what we might call the abstract materiality of the digital, its complex layering over an analogue carrier, and instead reiterated the conditions under which the existing methodological approach of actor-network theory was justified – i.e. the digital forms part of a network, is "physical" and material, requires a socio-technical environment to function, is a "complex function", and so on.

Slide drawn from Latour (2014)
It would be too strong, perhaps, to state that Latour denied the specificity of the digital as such; rather, through what we might unkindly call a sophisticated technique of bait and switch, and the use of a convincingly deployed visualisation of what the digital "really" is, courtesy of an image drawn from Cantwell Smith (2003), the digital as not-physical was considered to have been refuted. Indeed, this approach echoes his earlier statements about the digital from 1997, where Latour argues,[2]
I do not believe that computers are abstract... there is (either) 0 and (or) 1 has absolutely no connection with the abstractness. It is actually very concrete, never 0 and 1 (at the same time)... There is only transformation. Information as something which will be carried through space and time, without  deformation, is a complete myth. People who deal with the technology will actually use the practical notion of transformation. From the same bytes, in terms of 'abstract encoding', the output you get is entirely different, depending on  the medium  you use. Down with information (Lovink and Schultz 1997).
This is not a new position for Latour; indeed, in earlier work he has stated "actually there is nothing entirely digital in digital computers either!" (original emphasis, Latour 2010a). Whilst this may well be Latour's polemical style getting rather out of hand, it does raise the question of what it is that is "digital" for Latour and therefore how this definition enables him to make such strong claims. One is tempted to suppose that it is the materiality of the 0s and 1s that Cantwell Smith's diagram points towards that enables Latour to dismiss out of hand the complex abstract digitality of the computer as an environment, which, although not immaterial, is still constituted through a complex series of abstraction layers that actually do enable programmers to work and code in an abstract machine disconnected in a logical sense from the materiality of the underlying silicon. Indeed, without this abstraction within the space of digital computers there could be none of the complex computational systems and applications that are built today on abstraction layers. Here space is deployed both in a material sense, as the shared memory abstracted across both memory chips and the hard disk (which itself may be memory chips), and as a metaphor for the way in which the space of computation is produced through complex system structures that enable programmers to work within a notionally two-dimensional address space that is abstracted onto a multidimensional physical structure.
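To make the point about abstraction layers concrete, consider a minimal sketch (the byte values are arbitrary and purely illustrative, not drawn from Latour or Cantwell Smith) of how the very same material bit-pattern is read quite differently depending on the software layer that interprets it – the sense in which, as Latour himself concedes, from the same bytes the output you get is entirely different depending on the medium you use:

```python
import struct

# Four bytes, fixed at the "material" level of the machine.
# (Arbitrary illustrative values.)
raw = bytes([0x42, 0x48, 0x65, 0x79])

# The same bytes read through different abstraction layers:
as_int = struct.unpack("<I", raw)[0]    # as a little-endian unsigned integer
as_float = struct.unpack("<f", raw)[0]  # as an IEEE 754 single-precision float
as_text = raw.decode("ascii")           # as ASCII characters

print(as_int, as_float, as_text)
```

Nothing in the silicon decides between these readings; it is the stack of abstractions above it that does, and it is precisely this abstract digitality that a flat appeal to the materiality of 0s and 1s passes over.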

The Digital Iceberg (Berry 2014)
In any case, whilst our attention is distracted by his assertion, Latour moves to cement his switch by making the entirely reasonable claim that the digital lies within a socio-technical environment, and that the way to study the digital is therefore to identify what is observable of the digital. These observables, he claims, are "segments of trajectories through distributed sets of material practice only some of which are made visible through digital traces"; thus, for Latour, the digital figures less as a domain and more as a set of practices. This approach to studying the digital is, of course, completely acceptable, providing one is cognisant of the way in which the digital in our post-digital world resembles the structure of an iceberg, with only a small part ever visible to everyday life – even to empirical researchers (see diagram above). Otherwise, ethnographic approaches which a priori declare the abstractness of the digital as a research environment illegitimate lose the very specificity of the digital that their well-meaning attempt to capture its materiality calls for. Indeed, the way in which the digital, through complex processes of abstraction, is then able to provide mediators to and interfaces over the material is one of the key research questions to be unpacked when attempting to get a handle on the increasing proliferation of the digital into "real" spaces. As such, ethnographic approaches will only ever be part of a set of research approaches for the study of the digital, rather than, as Latour claims, the only, or certainly the most important, research methodology.

This is significant because the research agenda of the digital is being heightened, in part due to financial pressures and research grants deployed to engage with digital systems, but also due to the now manifest presence of the digital in all aspects of life, and hence the question of which methodological and theoretical positions should be deployed to study such phenomena. Should one undertake digital humanities or computational social science? Digital sociology or some other approach such as actor-network theory? Latour's claim that "the more thinking and interpreting becomes traceable, the more humanities could merge with other disciplines" reveals the normative line of reasoning that the (digital) humanities' specificity as a research field could be usurped or supplemented by approaches that Latour himself thinks are better at capturing the digital (Latour 2014). Indeed, Latour claims in his book, Modes of Existence, that his project, AIME, "is part of the development of something known by the still-vague term 'digital humanities,' whose evolving style is beginning to supplement the more conventional styles of the social sciences and philosophy" (Latour 2013: xx).

To legitimate the claim of Latour's flavour of actor-network theory as a research approach to the digital, he refers to Boullier's (2014) work, Pour des sciences sociales de 3ème génération, which argues that there have been three ages of social context, with the latest emerging from the rise of digital technologies and the capture of the digital traces they make possible. They are,
Age 1: Statistics and the idea of society 
Age 2: Polls and the idea of opinion 
Age 3: Digital traces and the idea of vibrations (quoted in Latour 2014).
Here, vibration follows from the work of Gabriel Tarde in 1903 who referred to the notion of "vibration" in connection to an empirical social science of data collection, arguing that,
If Statistics continues to progress as it has done for several years, if the information which it gives us continues to gain in accuracy, in dispatch, in bulk, and in regularity, a time may come when upon the accomplishment of every social event a figure will at once issue forth automatically, so to speak, to take its place on the statistical registers that will be continuously communicated to the public and spread abroad pictorially by the daily press. Then, at every step, at every glance cast upon poster or newspaper, we shall be assailed, as it were, with statistical facts, with precise and condensed knowledge of all the peculiarities of actual social conditions, of commercial gains or losses, of the rise or falling off of certain political parties, of the progress or decay of a certain doctrine, etc., in exactly the same way as we are assailed when we open our eyes by the vibrations of the ether which tell us of the approach or withdrawal of such and such a so-called body and of many other things of a similar nature (Tarde 1962: 167–8).
This is the notion of vibration Latour deploys, although he prefers the notion of sublata (similar to capta, or captured data) rather than vibration. For Latour, the datascape is that which is captured by the digital and this digitality allows us to view a few segments, thus partially making visible the connections and communications of the social, understood as an actor-network. It is key here to note the focus on the visibility of the representation made possible by the digital, which becomes not a processual computational infrastructure but rather a set of inscriptions which can be collected by the keen-eyed ethnographer to help reassemble the complex socio-technical environments that the digital forms a part of. The digital is, then, a text within which are written the traces of complex social interactions between actants in a network, but only ever a repository of some of these traces.

Latour finishes his talk by reminding us that the "digital is not a domain, but a single entry into the materiality of interpreting complex data (sublata) within a collective of fellow co-inquirers". This reiterates his point about the downgraded status of the digital as a problematic within social research and its pacification through its articulation as an inscription technology (similar to books) rather than as a machinery in and of itself, and shows us again, I think, that Latour's understanding of the digital is correspondingly weak.

The use of the digital in such a desiccated form points to the limitations of Latour's ability to engage with the research programme of investigating the digital, but also to the way in which a theologically derived close-reading method, drawn from bookish practice, may not be entirely appropriate for unpacking and "reading" computational media and software structures.[3] It is not that the digital does not leave traces, as patently it does; rather, these traces are encoded in such a form, at such quantities and such high resolutions of data compression, that in many cases human attempts to read these inscriptions directly are fruitless, and instead require the mediation of software, and hence a double hermeneutic which places human researchers twice (or more) removed from the inscriptions they wish to examine and read. This is not to deny the materiality of the digital, or of computation itself, but it certainly makes the study of such matter and practices much more difficult than the claims to visibility that Latour presents. It also suggests that Latour's rejection of the abstraction in and of computation that electronic circuitry makes possible is highly problematic and ultimately flawed.
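As a minimal illustration of this double hermeneutic (the event log, field names and values below are entirely hypothetical), consider how even a trivially small digital trace only becomes legible through layers of software mediation:

```python
import gzip, json

# A hypothetical "trace": a small log of interactions, serialised and compressed.
events = [
    {"actor": "user-17", "action": "share", "target": "photo-203"},
    {"actor": "user-42", "action": "like",  "target": "photo-203"},
]
encoded = gzip.compress(json.dumps(events).encode("utf-8"))

# Read "directly", the inscription is opaque to a human reader:
print(encoded[:20])   # compressed bytes, illegible as they stand

# Only via layers of software mediation does it become legible again:
decoded = json.loads(gzip.decompress(encoded).decode("utf-8"))
for event in decoded:
    print(event["actor"], event["action"], event["target"])
```

The researcher never confronts the inscription "itself" but always a software-mediated rendering of it, and each layer of decoding embodies interpretive choices made elsewhere.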



Notes

[1] Accepting the well-designed look of the website that contains the AIME project, there can be no disputing the fact that the user experience is shockingly bad. Not only is the layout of the web version of the book completely unintuitive, but the process of finding information is clumsy and annoying to use. One can detect the faint glimmer of a network ontology guiding the design of the website, an ontology that has been forced onto the usage of the text rather than organically emerging from use; indeed, the philosophical inquiry appears to have influenced the design in unproductive ways. Latour himself notes: "although I have learned from studying technological projects that innovating on all fronts at once is a recipe for failure, here we are determined to explore innovations in method, concept, style, and content simultaneously" (Latour 2013: xx). I have to say that unfortunately I do think that there is something rather odd about the interface which means that the recipe has been unsuccessful. In any case, it is faster and easier to negotiate the book via a PDF file than through the web interface, or certainly it is better to keep the PDF or the paper copy ready to hand when waiting for the website to slowly grind back into life.
[2] See also, Latour stating: "the digital only adds a little speed to [connectivity]. But that is small compared to talks, prints or writing. The difficulty with computer development is to respect the little innovation there is, without making too much out of it. We add a little spirit to this thing when we use words like universal, unmediated or global. But if way say that, in order to make visible a collective of 5 to 10 billion people, in the long history of immutable mobiles, the byte conversion is adding a little speed, which favours certain connections more than others, than this seems a reasonable statement" (Lovink and Schultz 1997).
[3] The irony of Latour (2014) presenting the close reading practices of actor-network theory as a replacement for the close reading practices of the humanities/digital humanities is interesting (see Berry 2011), particularly in relation to his continual reference to the question of distant reading within the digital humanities and his admission that actor-network theory offers little by way of distant reading methods. Latour (2010b) explains "under André Malet’s guidance, I discovered biblical exegesis, which had the effect of forcing me to renew my Catholic training, but, more importantly, which put me for the first time in contact with what came to be called a network of translations – something that was to have decisive influence on my thinking... Hence, my fascination for the literary aspects of science, for the visualizing tools, for the collective work of interpretation around barely distinguishable traces, for what I called inscriptions. Here too, exactly as in the work of biblical exegesis, truth could be obtained not by decreasing the number of intermediary steps, but by increasing the number of mediations" (Latour 2010b: 600-601, emphasis removed).



Bibliography

Berry, D. M. (2011) Understanding Digital Humanities, Basingstoke: Palgrave Macmillan.

Cantwell Smith, B. (2003). Digital Abstraction and Concrete Reality. In Impressiones, Calcografia Nacional, Madrid.

Latour, B. (2010a) The migration of the aura or how to explore the original through its fac similes, in Bartscherer, T. (ed.) Switching Codes, University of Chicago Press.

Latour, B. (2010b) Coming out as a philosopher, Social Studies of Science, 40(4) 599–608.

Latour, B. (2013) An Inquiry into Modes of Existence: An Anthropology of the Moderns, Harvard University Press.

Latour, B. (2014) Opening Plenary, Digital Humanities 2014 (DH2014), available from http://dh2014.org/videos/opening-night-bruno-latour/

Lovink, G. and Schultz, P. (1997) There is no information, only transformation: An Interview with Bruno Latour, available from http://thing.desk.nl/bilwet/Geert/Workspace/LATOUR.INT

Tarde, G. (1903/1962) The Laws of Imitation, New York: Henry Holt and Company.

06 June 2014

The Post-Digital Ornament


This post is part of a presentation I gave at the Matter – Materials – Materiality – Materialism  (4M) conference organised by Dr Iris van der Tuin and Dr Ann-Sophie Lehmann, Utrecht University, 5th June 2014. 


The Tiller Girls
I now want to turn to think about the notion of the “post-digital ornament”. This is part of a project that looks to interrogate the original theoretical work of the early critical theory literature, but also to explore its concepts and ideas in light of computation and the post-digital condition. There is a need to critically think through the implications of computational imaginaries, particularly hegemonic representations of the digital – “post-digital aesthetics”, “new aesthetic”, “pixels”, “sound waves”, “interfaces”, “surface”, and so forth. This is new work I am exploring through a re-reading of first-generation critical theorists, and it draws on Siegfried Kracauer’s writing on The Mass Ornament (1927) and related writings (Kracauer 1995).

As the historical distinction between the digital and the non-digital becomes increasingly blurred, the idea that the digital presupposes an experiential and technical disjuncture makes less and less sense. Developing a critical approach to the digital therefore, by definition, required the explicit recognition that the digital itself needed to be historicised. Perhaps today it is better to talk about the need to think in terms of “post-digital objects”. Thus computation becomes spatial in its implementation, embedded within the environment, in the body, and in society. Computation is part of the texture of life itself, which can be walked around, touched, manipulated and interacted with in a number of ways. So "being online" or "being offline" is now anachronistic, with our always-on smart devices, tablets and hyper-connectivity, as indeed is the notion that we have "digital" and "analogue" worlds that are disconnected. Today the digital is hegemonic, and as such is entangled with everyday life and experience in a highly complex, messy and difficult-to-untangle way that is different from previous instantiations of the digital. The notion of the "post-digital" helps to give us a critical purchase on this moment, pointing towards the differences in this new post-digital condition, but also providing a critical way into thinking the new hegemonic form of the digital. Critical theory can contribute to this critique of the digital, and this post is an exploration of the critical project of the twentieth century, in order to orient and inform a critical purchase on the computational.

Kracauer wrote that we must rid ourselves of the delusion that it is the major events which have the most decisive influence on us. We are much more deeply and continuously influenced by what he called "the tiny catastrophes that make up daily life". What is needed, then, is a consistent, interdisciplinary attempt to articulate the material construction of a historically specific social reality: a focus on the impoverished but potentially revelatory landscape of everyday life. He argues that the position that an epoch occupies in the historical process can be determined more strikingly from an analysis of its inconspicuous surface-level expressions than from that epoch’s judgments about itself. These surface-level expressions provide access to the state of things because, through their computational and aesthetic organisation, elements that were “strewn helter-skelter” suddenly become meaningfully related.

For Kracauer the ornamental patterns produced by groups of dancers, for example, are the aesthetic reflex of the rationality to which the prevailing economic system aspires (see above for an example of the Tiller Girls). The mass ornament is not, though, simply a superstructural reflection of the prevailing mode of production. Rather Kracauer reads the geometry of human limbs as an ambivalent historico-philosophical allegory, insisting they are also a mise-en-scene of disenchantment. Thus, the mass ornament manifests progressive potential as the representation of a new type of collectivity organised not according to bonds of a community but as a social mass of functionally linked individuals.

The landscape from above
The post-digital ornament similarly resembles aerial photography of landscapes and cities in that it does not emerge out of the interior of the given conditions, but rather appears above them – granting a distant reading of culture, society and everyday life. In the midst of a world which has become blurred and ungraspable, the post-digital ornament becomes a primary element, a cultural analytics that provides connection and a sense of cohesion in a fragmentary digital experience. The relation to the post-digital ornament is an aesthetic mode, and the ornament becomes an end in itself – via data visualisations, interfaces, surfaces.


So the post-digital ornament consists of lines and circles, like in Euclidean geometry, but also waves and spirals. These formations are still in some sense opaque, composed as they are according to the dictates of a rationality that sacrifices meaning for the sake of an abstract unity of reified elements. The post-digital ornament suspends the opposition of the merely decorative applied ornament and the functional structure.

3D Alignment Forms. Animation of dancer's traceforms
in One Flat Thing, reproduced mapped to 3D space.
This produces both an ornamentation of function and a functionalization of ornament. Thus, by critically examining the very superficiality of the post-digital ornament as a surface, one can further explore the computational practices that underwrite and mediate this affinity with the surface. That is, to look at how a spatial continuum devoid of both time and meaning is produced, reading algorithms, for example, as material expressions of a particular historical condition. This has been explored by the Synchronous Objects project, a collaboration between The Ohio State University and The Forsythe Company, which aims to create a large set of data visualization tools for understanding and analyzing the interlocking systems of organization in the choreography of William Forsythe's "One Flat Thing, reproduced".[1] These dances were quantified through the collection of data and transformed into a series of objects – which they call "synchronous objects" and which we might think of as an example of the post-digital ornament – that work in harmony to explore those choreographic structures, reveal their patterns, and re-imagine them through data visualisation techniques. In some senses this is the de-temporalisation of movement, creating a spatial map formed by the aggregate of dancers' movements (a minimal sketch of this kind of aggregation follows below). The post-digital ornament is also gestured towards by the artist Natalie Bookchin in her installation and video, Mass Ornament, of which she writes,
In Mass Ornament a mass dance is constructed from hundreds of clips from YouTube of people dancing alone in their rooms... Today, YouTube dancers, alone in their rooms performing a routine that is both extremely private and extraordinarily public, reflect a post-Fordist era. Millions of isolated spectator/workers in front of their screens move in formation and watch dancers moving in formation alone in their rooms, also in front of their screens (Bookchin 2009).
We might say that the algorithm that instantiates the post-digital ornament captures the remnants that history has left behind; the same mere nature that appears in the algorithm is thriving in the reality of the society created by capitalist rationality – for example, in new social obsessions with consumption and conspicuous compensatory leisure, in sedimented issues of gender, or in politics and norms. The post-digital ornament serves to train people in those forms of perception and reaction which are necessary for any interaction with apparatuses. Indeed, the representational practices of the post-digital ornament display an elective affinity with the surface, not the knowledge of an original but the spatial configuration of an instant. In some sense, the post-digital ornament stages nature and everyday life as the negativity of history.
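As a minimal sketch of the de-temporalisation described above (the dancer positions here are randomly generated stand-ins, not data from the Synchronous Objects project), movement traces indexed by time can be collapsed into a purely spatial occupancy grid:

```python
import numpy as np

# Hypothetical traces: (x, y) positions over 500 time steps for three dancers,
# with coordinates normalised to the unit square [0, 1).
rng = np.random.default_rng(0)
dancers = [rng.random((500, 2)) for _ in range(3)]

# De-temporalise: fold every time step into a single 20 x 20 occupancy grid.
grid = np.zeros((20, 20), dtype=int)
for positions in dancers:
    cells = (positions * 20).astype(int)   # map coordinates to grid cells
    for x, y in cells:
        grid[y, x] += 1                    # count visits, discarding order and time

# 'grid' is now a spatial map of aggregate movement: the ornament without the dance.
print(grid)
```

Whatever visualisation is then laid over such a grid, the temporal unfolding of the choreography has already been traded for a spatial configuration, which is precisely the surface-character of the ornament at issue here.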

It is clear that we have an urgent task to mobilise critical philosophy towards contestations and interpretation of our present historical condition as manifested in computation. As computation penetrates more deeply into our everyday life and increasingly mediates our social and personal lives, the task becomes ever more urgent. For me this involves a critical re-reading of key theoretical and philosophical work in relation to developing concepts in a theoretical register but also to provide a means for developing empirical work in relation to computational society.

This leads to a theoretical and sociological challenge in terms of how critical theory can be deployed to think through this historical constellation. Questions of aesthetics, politics, economics, society and the everyday need to be reflected on in relation to computation precisely because of the penetration of computation into all aspects of human life. This is a call to more rigorous scholarship in relation to the post-digital, but also towards a praxis linked to critical practice and a critical approach to the aesthetics of computation and its mediating role both in and through computation.


Notes

[1] I would like to thank Maaike Bleeker for introducing me to these works at the 4M conference in Utrecht, 5th June 2014. 

Bibliography

Bookchin, N. (2009) Mass Ornament, accessed 6 June 2014, http://bookchin.net/projects/massornament.html

Kracauer, S. (1995) The Mass Ornament, Harvard University Press. 



04 April 2014

The Antinomies of Computation

AntiSurveillance Feminist Poet Hair & Makeup Party
In this post I explore what I want to call the antinomies of computation.[1] This is part of a larger project to map out these contradictions but here I will only talk about one of the antinomies that I think is interesting, namely visibility/opacity. In subsequent posts I hope to explore multiple strata to map out different moments in these antinomies. This is an attempt to contribute to a critique of an increasingly softwarized society and economy that requires analysis and contestation (see Berry 2011, 2014).

Computation makes the collection of data relatively easy. This increases visibility through what Rey Chow (2012) calls “Capture”. Software enables more effective systems of surveillance and hence new capture systems. As Foucault argues, “full lighting and the eyes of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap” (Foucault 1991: 200). The question is also linked to who is made visible in these kinds of systems, especially where, as feminist theorists have shown, visibility itself can be a gendered concept and practice, as demonstrated in the historical invisibility of women in the public sphere, for example. Here we might also reflect on the way in which the practice of making-visible also entails making-invisible – computation involves making choices about what is to be captured. For example, Zach Blas's work is helpful in showing the various forms of race, gender and class-based exclusion in computational and biometric systems (Magdaleno 2014).

The question then becomes how to “darken” visibility in order to prevent the totalising full top-view made possible in computational society. Using the metaphor of “black boxes” – the technical notion of objects which have opaque or impossible-to-read internal states but readable surfaces – how can we think about spaces that paradoxically enable democracy and the political, whilst limiting the reading of the internal processes of political experimentation and formation? Thus, how to create the conditions of possibility for an “opaque presence” working on the edges or at the limits of legibility? These spaces we might call opaque temporary autonomous zones, which seek to enable democratic deliberation and debate. These should be fully political spaces, open and inclusive, but nonetheless opaque to the kinds of transparency that computation makes possible. As Rossiter and Zehle (2014) argue, we need to move towards a "politics of anonymity", part of which is an acknowledgement of the way in which the mediation of algorithms could operate as a plane of opacity for various actors.

It is important to note that this is not to create paranoid spaces or secret societies, but conditional and temporary moments – glitches in the regime of computational visibility. The idea is not to recreate notions of individual privacy as such, but rather collective spaces of critical reflection for practices of creating a political response. That is, to draw on theory and "un-theory" as a way of proceeding theoretically as "an open source theory [and practice] in constant reformulation from multiple re-visions and remixings" (Goldberg 2014), what CTI (2008) calls "poor theory". Indeed, crypto practices can create shadows in plain sight thus tipping the balance away from systems of surveillance and control. Of course, paradoxically these opaque spaces themselves may draw attention to state authorities and the intelligence community who monitor the use of encryption and cryptography – demonstrating again the paradox of opacity and visibility.

CV Dazzle Project by Adam Harvey
By crypto practices, or crypto-activism, I mean the notion of “hiding in plain sight”, a kind of steganography of political practice. This is not merely a technical practice but a political and social one too. Here I am thinking of the counter-surveillance art of Adam Harvey, such as "CV Dazzle", which seeks to design make-up that prevents facial recognition software from identifying faces, or "Stealth Wear", which creates the "potential for fashion to challenge authoritarian surveillance" (Harvey 2014). Some examples in political practice can also be seen at the AntiSurveillance Feminist Poet Hair and Makeup Party. Additionally, Julian Oliver's work has been exemplary in exploring the ideas of visibility and opacity. Here I am thinking in particular of Oliver's works that paradoxically embed code executables in images of the software objects themselves, such as "Number was the substance of all things" (2012), but also "PRISM: The Beacon Frame" (2013), which makes visible the phone radio networks, and hence the possibility of real-time surveillance of networks and data channels (Oliver 2014).
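To give a flavour of what "hiding in plain sight" means at the technical level (this is a minimal, generic sketch of least-significant-bit steganography, not the method of any of the artworks mentioned above), a short message can be written into the lowest bits of a carrier such as pixel data, leaving its visible surface essentially unchanged:

```python
import os

def embed(carrier: bytes, message: bytes) -> bytearray:
    """Hide the message, bit by bit, in the least significant bit of each carrier byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit      # overwrite only the lowest bit
    return out

def extract(carrier: bytes, length: int) -> bytes:
    """Recover 'length' bytes from the carrier's least significant bits."""
    out = bytearray()
    for j in range(length):
        byte = 0
        for i in range(8):
            byte |= (carrier[j * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

# Usage: random bytes stand in for the pixel data of an image.
pixels = os.urandom(256)
stego = embed(pixels, b"meet at 6")
print(extract(stego, 9))    # b'meet at 6'
```

The point of the sketch is simply that the altered carrier remains, to casual inspection, indistinguishable from the original – a technical counterpart to the play with the limits of legibility in the artworks above.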

These artworks point towards the notion of "opaque presence" explored by Broeckmann (2010) who argues that "the society of late capitalism – whether we understand it as a society of consumption, of control, or as a cybernetic society – visibility and transparency are no longer signs of democratic openness, but rather of administrative availability" (Broeckmann 2010). It also is suggestively explored by the poet Edouard Glissant, who believes that we should "agree not merely to the right to difference but, carrying this further, agree also to the right to opacity that is not enclosure within an irreducible singularity. Opacities can coexist and converge, weaving fabrics" (Glissant 1997: 190).

So this is not just a technical (e.g. cryptographic) practice. Indeed crypto practices have to be rethought to operate on the terrain of the political and technical simultaneously. Political activity, for example, is needed to legitimate these cryptographically enabled “dark places”. Both with the system (to avoid paranoia and attack), with the public (to educate and inform about them), and with activists and others.

That is, we could think about these crypto-practices as (re)creating the possibility of being a crowd, both in the terms of creating a sense of solidarity around the ends of a political/technical endeavour and the means which act as a condition of possibility for it. Thus we could say in a real sense that computer code can act to create “crowd source”, as it were, both in the technical sense of the computer source code, and in the practices of coming together to empower actors within a crowd, to connect to notions of the public and the common. But these crypto-practices could also help individuals to "look to comprehend how things fit together, how structural conditions and cultural conceptions are mutually generative, reinforcing, and sustaining, or delimiting, contradictory, and constraining. [They] would strive to say difficult things overlooked or purposely ignored by conventional thinking, to speak critically about challenging matters, to identify critical and counter-interests" (Goldberg 2014).

In contrast, to think for a moment about the other side of the antinomy, liberal societies have a notion of a common good of access to information to inform democratic citizens, whilst also seeking to valorise it. That is, the principle of visibility is connected not only to the notion of seeing one's representatives and the mechanisms of politics themselves but also to the knowledge that makes the condition of acting as a citizen possible.

Meanwhile, with the exploding quantity of information in society and the moves towards a digital economy, information is increasingly seen as a source of profit for capitalism if captured in an appropriate way. Indeed, data and information are said to be the new ‘oil’ of the digital age (e.g. Alan Greenspan 1971) (Berry 2008: 41, 56). This highlights both the political and the economic desire for data. At the same time, the digital enables exploding quantities of data that are increasingly hard to contain within organisational boundaries.

One response to computational changes in politics and the economy has been the kinds of digital activism connected with whistleblowing and megaleaks, that is, the release of massive amounts of data into the public sphere and the use of social media and the internet to distribute it. These practices tend to take information out of the "black boxes" of corporations, governments and security services and to place information in the public domain about their mechanisms, practices and machinations. They seek, then, to counter the opaqueness of the organisational form, making use of the copyable nature of digital materials.

However, as megaleaks place raw data into the public sphere – usually as files and spreadsheets of data – there is a growing problem of being able to read and comprehend it, hence the growing need for journalists to become data journalists. Ironically then, “opening the databanks” (Berry 2014: 178, Lyotard 1984: 67) creates a new form of opaqueness. Computational strategies are needed to read these new materials (e.g. algorithmic distant readings, sketched below). Attached to the problem of information overload is that this mechanism can also be harnessed by states seeking to attack megaleaks by counter-leaking and so delegitimating them. Additionally, in some senses the practices of Wikileaks are connected to creating an informational overload within organisations, both in terms of their inability to cope with the release of their data, but also in the requirement to close communicational channels within the organisation. So information overload can become a political tactic both for control and for resistance.
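As a minimal sketch of what such an algorithmic distant reading might look like in practice (the folder name and file layout are hypothetical), one can aggregate term frequencies across an entire dump of documents rather than close-reading any single one:

```python
from collections import Counter
from pathlib import Path
import re

def distant_read(folder, top_n=20):
    """Aggregate word counts across every .txt file in a folder of released documents."""
    counts = Counter()
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        counts.update(re.findall(r"[a-z]{4,}", text))   # crude tokeniser: words of 4+ letters
    return counts.most_common(top_n)

# Hypothetical usage over a directory of leaked files:
# for term, count in distant_read("leak_dump/"):
#     print(term, count)
```

Even this crude aggregation illustrates the point: the leaked material only becomes readable through a further layer of computational mediation, which reintroduces interpretive and political choices of its own.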

But what is at stake here is not just the relationship between visibility and incarceration, nor the deterritorialisation and becoming-mobile made possible by computation. Rather, it is the collapse of the “time lag between the world and its capture” (Chow 2012), when capture becomes real-time through softwarized monitoring technologies, and the mediation of “police” functions and control that this implies.

The question then becomes what social force is able to realise the critique of computational society but also to block the real-time nature of computational monitoring. What practices become relevant when monitoring and capture become not only prevalent but actively engaged in? Tentatively, I would like to suggest embedding critical cryptographic practices within what Lovink and Rossiter (2013) call OrgNets (organised networks).

Antisurveillence Feminist Party
But I would also point to what we might call crypto-activism, the creation of systems of inscription that enable the writing of opaque codes and the creation of "opaque places". This is not just about making possible spaces of collectivity (“crowd source”) but also about hacking and jamming the real-time mediation of politics, dissent and everyday life (Deleuze 1992). As Glissant argues, "We clamour for the right to opacity for everyone" (Glissant 1997: 194). This, I think, calls both for a cartography of the hybridity of digital media (its post-digital materiality) and, importantly, for the possible translation of crypto, as a concept and as a technical practice, into digital activism tactics.


Notes

[1] This post is drawn from a talk given at Digital Activism #Now: Information Politics, Digital Culture and Global Protest Movements, at Kings College, London (KCL), 04/04/14. See http://www.kcl.ac.uk/aboutkings/worldwide/initiatives/global/nas/news-and-events/events/eventrecords/Digital-Activism-Now-Information-Politics,-Digital-Culture-and-Global-Protest-Movements.aspx


Bibliography

Berry, D. M. (2008) Copy, Rip, Burn: The Politics of Copyleft and Open Source, London: Pluto Press.

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Broeckmann, A. (2010) Opaque Presence / Manual of Latent Invisibilities, Berlin: Diaphanes Verlag.

Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.

CTI (2008) Poor Theory Notes: Toward a Manifesto, Critical Theory Institute, accessed 14/4/2014, https://www.humanities.uci.edu/critical/poortheory.pdf

Deleuze, G. (1992) Postscript on the Societies of Control, October, vol. 59, pp. 3-7. Available at https://files.nyu.edu/dnm232/public/deleuze_postcript.pdf

Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.

Glissant, E. (1997) The Poetics of Relation, Michigan: The University of Michigan Press.

Goldberg, D. T. (2014) Afterlife of the Humanities, accessed 14/04/2014, http://humafterlife.uchri.org

Harvey, A. (2014) Stealth Wear, accessed 04/04/2014, http://ahprojects.com/projects/stealth-wear/

Lovink, G. and Rossiter, N (2013) Organised Networks: Weak Ties to Strong Links, Occupy Times, accessed 04/04/2014, http://theoccupiedtimes.org/?p=12358

Lyotard, J. F. (1984) The Postmodern Condition: A Report on Knowledge, Manchester: Manchester University Press.

Magdaleno, J. (2014) Is Facial Recognition Technology Racist?, The Creators Project, accessed 05/04/2014, http://thecreatorsproject.vice.com/blog/is-facial-recognition-technology-racist

Oliver, J. (2014) Julian Oliver, accessed 05/04/2014, http://julianoliver.com/output/

Rossiter, N. and Zehle, S. (2014) Toward a Politics of Anonymity: Algorithmic Actors in the Constitution of Collective Agency and the implications for Global Justice Movements, in Parker, M., Cheney, G., Fournier, V. and Land, C. (eds.) The Routledge Companion to Alternative Organization, London: Routledge.

01 April 2014

On Capture

In thinking about the conditions of possibility for the mediated landscape of the post-digital (Berry 2014), it is useful to explore concepts around capture and captivation, particularly as articulated by Rey Chow (2012). Chow argues that being "captivated" is
the sense of being lured and held by an unusual person, event, or spectacle. To be captivated is to be captured by means other than the purely physical, with an effect that is, nonetheless, lived and felt as embodied captivity. The French word captation, referring to a process of deception and inveiglement [or persuade (someone) to do something by means of deception or flattery] by artful means, is suggestive insofar as it pinpoints the elusive yet vital connection between art and the state of being captivated. But the English word "captivation" seems more felicitous, not least because it is semantically suspended between an aggressive move and an affective state, and carries within it the force of the trap in both active and reactive senses, without their being organised necessarily in a hierarchical fashion and collapsed into a single discursive plane (Chow 2012: 48). 
To think about capture, then, is to think about the mediatized image in relation to reflexivity. For Chow, Walter Benjamin inaugurated a major change in the conventional logic of capture: from a notion of reality being caught or contained in the copy-image, as in a repository, the copy-image becomes mobile, and this mobility adds to its versatility. The copy-image then supersedes or replaces the original as the main focus, and as such this logic of the mechanical reproduction of images undermines hierarchy and introduces a notion of the image as infinitely replicable and extendable. Thus the "machinic act or event of capture" creates the possibility for further dividing and partitioning, that is, for the generation of copies and images, and sets in motion the conditions of possibility of a reality that is structured around the copy.

Chow contrasts capture with the modern notion of "visibility", such that, as Foucault argues, "full lighting and the eyes of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap" (Foucault 1991: 200). Thus in what might be thought of as the post-digital – a term that Chow doesn't use but which I think is helpful in thinking about this contrast – what is at stake is no longer this link between visibility and surveillance, nor indeed the link between becoming-mobile and the technology of images, but rather the collapse of the "time lag" between the world and its capture.

This is when time loses its potential to "become fugitive" or "fossilised" and hence to be anachronistic. The key point is that the very possibility of memory is disrupted when images become instantaneous and therefore synonymous with an actual happening. This is the condition of the post-digital, whereby digital technologies make possible not only the instant capture and replication of an event, but also the very definition of the experience through its mediation, both at the moment of capture – such as with the waving smartphones at a music concert or event – and in the subsequent recollection and reflection on that experience.

Thus the moment of capture or "arrest" is an event of enclosure, locating and making possible the sharing and distribution of a moment through infinite reproduction and dissemination. So capture represents a techno-social moment but is also discursive, in that it is a type of discourse that is derived from the imposition of power on bodies and the attachment of bodies to power. This Chow calls a heteronomy or heteropoiesis: a system or artefact designed by humans with some purpose, not able to self-reproduce, but nonetheless able to exert agency, in the form of prescription, often back onto its designers – essentially producing an externality in relation to the application of certain "laws" or regulations.

Nonetheless, capture and captivation also constitute a critical response through the possibility of a disconnecting logic and the dynamics of mimesis. This possibility, reflected through the notion of entanglements, refers to the "derangements in the organisation of knowledge caused by unprecedented adjacency and comparability or parity". This is, of course, definitional in relation to the notion of computation, which itself works through a logic of formatting, configuration, structuring and the application of computational ontologies (Berry 2011, 2014).

Here capture offers the possibility of a form of practice in relation to alienation by making the inquirer adopt a position of criticism, the art of making strange. Chow here is making links to Brecht and Shklovsky, and in particular their respective predilection for estrangement in artistic practice, such as in Brecht's notion of verfremdung, and thus to show how things work, whilst they are being shown (Chow 2012: 26-28). In this moment of alienation the possibility is thus raised of things being otherwise. This is the art of making strange as a means to disrupt the everyday conventionalism and refresh the perception of the world – art as device. The connections between techniques of capture and critical practice as advocated by Chow, and reading or writing the digital are suggestive in relation to computation more generally, not only in artistic practice but also in terms of critical theory. Indeed, capture could be a useful hinge around which to subject the softwarization practices, infrastructures and experiences of computation to critical thought both in terms of their technical and social operations but also to the extent to which they generate a coercive imperative for humans to live and stay alive under the conditions of a biocomputational regime.



Bibliography

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.

Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.



12 February 2014

Digital/Post-digital

I want to take up the question of the definition of the "post-digital" again because I think that what the post-digital points towards as a concept is the multiple moments in which the digital was operative in various ways (see Berry 2014a, 2014b, 2014c). Indeed, historicising the “digital" can be a useful, if not crucial, step in understanding the transformation(s) of digital technologies. That is, we are at a moment whereby we are able to survey the various constellations of factors that made up a particular historical configuration around the digital, in which the “digital” formed an “imagined" medium to which existing analogue mediums were often compared, and in relation to which the digital tended to be seen as suffering from a lack, e.g. not a medium for “real” news, for film, etc. The digital was another medium to place at the end (of the list) after all the other mediums were counted – and not a very good one. It was where the digital was understood, if it were understood at all, as a complement to other media forms, somewhat lacking, geeky, glitchy, poor quality and generally suited for toys, like games or the web, or for “boring” activities like accountancy or infrastructure. The reality is that in many ways the digital was merely a staging post, whilst computing capacity, memory, storage and display resolutions could fall in price and rise in power enough to enable a truly “post-digital” environment that could produce new mediated experiences. That is, it appears that the digital was “complementary” whereas the post-digital is zero-sum. Here is my attempt to sum up some of the moments that I think might serve as a provocation to debate the post-digital.


DIGITAL | POST-DIGITAL

Non-zero sum | Zero-sum
Objects | Streams
Files | Clouds
Programs | Apps
SQL databases | NoSQL storage
HTML | node.js/APIs
Disciplinary | Control
Administration | Logistics
Connect | Always-on
Copy/Paste | Intermediate
Digital | Computal
Hybrid | Unified
Interface | Surface
BitTorrent | Scraping
Participation | Sharing/Making
Metadata | Metacontent
Web 2.0 | Stacks
Medium | Platform
Games | World
Software agents | Compactants
Experience | Engagement
Syndication | Push notification
GPS | Beacons (IoT)
Art | Aesthetics
Privacy | Personal Cloud
Plaintext | Cryptography
Responsive | Anticipatory
Tracing | Tracking
Surfing | Reading

figure 1: Digital to Post-Digital Shifts 

This table offers constellations or moments within a “digital” as opposed to a “post-digital” ecology, as it were, and, of course, a provocation to thought. But they can also be thought of as ideal types that can provide some conceptual stability for thinking in an environment of accelerating technical change and dramatic and unpredictable social tensions in response to this. The question then becomes: to what extent can the post-digital counteract the tendencies towards domination of specific modes of thought in relation to instrumentality, particularly as manifested in computational devices and systems? For example, the contrast between the moments represented by Web 2.0 / Stacks provides an opportunity for thinking about how new platforms have been built on the older Web 2.0 systems, in some cases replacing them, and in others opening up new possibilities, which Tiziana Terranova (2014) has pointed to in her intriguing notion of “Red Stacks”, for example (and in contrast to Bruce Sterling's notion of “The Stacks”, e.g. Google, Facebook, etc.). Here I have been thinking of the notion of the digital as representing a form of “weak computation/computationality”, versus the post-digital as “strong computation/computationality”, and asking what the consequences would be for a society that increasingly finds that the weak computational forms (CDs, DVDs, laptops, desktops, blogs, RSS, Android Open Source Platform [AOSP], open platforms and systems, etc.) are replaced by stronger, encrypted and/or locked-in versions (FairPlay DRM, Advanced Access Content System [AACS], iPads, Twitter, push notification, Google Mobile Services [GMS], trackers, sensors, ANTICRISIS GIRL, etc.).

These are not just meant to be thought of in a technical register; rather, the notion of "weak computation" points towards a "weak computational sociality" and "strong computation" towards a "strong computational sociality", highlighting the deeper penetration of computational forms into everyday life within social media and push notification, for example. Even as the post-digital opens up new possibilities for contestation (e.g. megaleaks, data journalism, hacks, cryptography, dark nets, torrents, piratization, sub rosa sharing networks such as the Alexandria Project) and new opportunities for creating, sharing and reading knowledges, the "strong computation" of the post-digital always already suggests the shadow of computation reflected in the heightened tracking, surveillance and monitoring of a control society. The post-digital points towards a reconfiguration of publishing away from the (barely) digital techniques of the older book publishing industry, and towards the post-digital singularity of Amazonized publishing with its accelerated, instrumentalised forms of softwarized logistics, whilst also simultaneously supporting new forms of post-digital craft production of books and journals and providing globalised distribution. How then can we think about these contradictions in the unfolding of the post-digital and its tendencies towards what I am calling here "strong computation", and in what way, even counter-intuitively, does the digital (weak computation) offer alternatives, even as marginal critical practice, and the post-digital (strong computation) create new critical practices (e.g. critical engineering), against the increasing interconnection, intermediation and seamless operation of the post-digital as pure instrumentality, horizon, and/or imaginary?



Bibliography

Berry, D. M. (2014a) The Post-Digital, Stunlaw, accessed 14/1/2014, http://stunlaw.blogspot.co.uk/2014/01/the-post-digital.html

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury.

Berry, D. M. (2014c) On Compute, Stunlaw, accessed 14/1/2014,  http://stunlaw.blogspot.co.uk/2014/01/on-compute.html

Terranova, T. (2014) Red stack attack! Algorithms, capital and the automation of the common, EuroNomade, accessed 20/2/2014,  http://www.euronomade.info/?p=1708


23 January 2014

Marcuse and Objects

Herbert Marcuse
For Marcuse, the a priori concept of the object precedes and makes possible its appropriation by rational theory and practice. That is, the links between science, technology and society are shared in the form of experience created through the technological a priori, which creates a quantifiable reality of science and hence an instrumentalizable reality for society (Feenberg 2013) – objects as such. That is, "when technics becomes the universal form of material production, it circumscribes an entire culture; it projects a historical totality – a 'world'" (Marcuse 1999: 154). In other words, "technology has become the great vehicle for reification – reification in its most mature and effective form" (Marcuse 1999: 168). As such, "the world tends to become the stuff of total administration, which absorbs even the administrators" (Marcuse 1999: 169). Thus, Marcuse argues that,
The science of nature develops under the technological a priori which projects nature as potential instrumentality, stuff of control and organisation. And the apprehension of nature as (hypothetical) instrumentality precedes the development of all particular technical organisation (Marcuse 1999: 153). 
Even experience itself becomes "corrupted" because of the way in which experience is mediated through technologies and scientific methods, resulting in abstract labour and the fetishism of commodities (Feenberg 2013: 609). The measure of society is then, in this account, eliminated, depriving society and individuals of a means to critique or provide justifications against the prevailing a priori of technological rationality. This,
technological reality, the object world, (including the subjects) is experienced as a world of instrumentalities. The technological context predefines the form in which the objects appear... The object world is thus the world of a specific historical project, and is never accessible outside the historical project which organises matter, and the organisation of matter is at one and the same time a theoretical and a practical enterprise (Marcuse 1999: 219). 
As such, there are two moments that Marcuse identifies in relation to this, namely quantification and instrumentalization. He writes, firstly regarding quantification that,
The quantification of nature, which led to its explication in terms of mathematical structures, separated reality from all inherent ends and, consequently, separated the true from the good, science from ethics... And no matter how constitutive may be the role of the subject as point of observation, measurement, and calculation, this subject cannot play its scientific role as ethical or aesthetic or political agent (Marcuse 1999: 146-7).
Secondly, he explains that it is claimed that,
Theoretically, the transformation of man and nature has no other objective limits than those offered by the brute factuality of matter, its still unmastered resistance to knowledge and control. To the degree which this conception becomes applicable and effective in reality, the latter is approached as a (hypothetical) system of instrumentalities; the metaphysical "being-as-such" gives way to "being-instrument." Moreover, proved in its effectiveness, this conception works as an a priori – it predetermines experience, it projects the direction of the transformation of nature, it organizes the whole (Marcuse 1999: 152). 
This creates a way of being, an experience of, and a set of practices towards everyday life that embody and realise this a priori in a number of moments across a life experience. Indeed, it develops an attitude or a towards-which that is infused with an instrumentality towards the world, conceiving of it as a world of entities which can be known, controlled, manipulated and, if required, transformed. Consequently,
the "correct" attitude towards instrumentality is the technical approach, the correct logos is techno-logy, which projects and responds to a technological reality. In this reality, matter as well as science is "neutral"; objectivity has neither a telos in itself nor is it structured towards a telos. But it is precisely its neutral character which relates objectivity to a specific historical Subject – namely, to the consciousness that prevails in the society by which and for which this neutrality is established. It operates in the very abstractions which constitute the new rationality – as an internal rather than external factor... the reduction of secondary to primary qualities, quantification and abstraction from "particular sorts of entities" (Marcuse 1999: 156). 
The question then becomes the extent to which this totalising system overwhelms the capacity for agency, and as such for a critical consciousness. Related to this is the important question of the relationship between science and technology itself, inasmuch as the question to be addressed is: is science prior to technology, and therefore a condition of possibility for it? Or has science become technologised to the extent that science is now itself subjected to a technological a priori? The latter is a position held by Heidegger, for example. In other words, is science "complicit with the system of domination that prevails under capitalism" (Feenberg 2013: 609)? Indeed, Marcuse agreed that,
Critical analysis must dissociate itself from that which it strives to comprehend; the philosophic terms must be other than the ordinary ones in order to elucidate the full meaning of the latter. For the established universe of discourse bears throughout the marks of the specific modes of domination, organisation, and manipulation to which the members of a society are subjects (Marcuse 1999: 193). 
The danger of the "one-dimensionality" implied by a lack of critical thought is that it creates a form of modern reason that has domination built into its structure. Indeed, Horkheimer and Adorno argue,
The thing-like quality of the means, which makes the means universally available, its “objective validity” for everyone, itself implies a criticism of the domination from which thought has arisen as its means. On the way from mythology to logistics, thought has lost the element of reflection on itself, and machinery mutilates people today, even if it also feeds them. In the form of machines, however, alienated reason is moving toward a society which reconciles thought, in its solidification as an apparatus both material and intellectual, with a liberated living element, and relates it to society itself as its true subject. The particularist origin and the universal perspective of thought have always been inseparable. Today, with the transformation of the world into industry, the perspective of the universal, the social realization of thought, is so fully open to view that thought is repudiated by the rulers themselves as mere ideology (Horkheimer and Adorno 1999: 37; quoted in Feenberg 2013: 609).
How, then, to recover the capacity for reflection and thought, and thus to move to a new mode of experience, a "two dimensional experience responsive to the potentialities of people and things" (Feenberg 2013: 610)? This would require a new orientation towards potentiality, or what I call elsewhere possibility (Berry 2014), that would enable this new spirit of criticality, critical reason as such. In other words, it requires the reconfiguring of quantification practices and instrumental processes away from domination (Adorno, Horkheimer, Marcuse) and control (Habermas), and instead towards reflexivity, critique and democratic practices.

For Feenberg this requires "counter-acting the tendencies towards domination in the technological a priori" through the "materialization of values" (Feenberg 2013: 613). This, he argues, can be found at specific intervention points within the materialisation of this a priori, such as in design processes. Feenberg argues that "design is the mediation through which the potential for domination contained in scientific-technical rationality enters the social world as a civilisational project" (Feenberg 2013: 613). Instead, Feenberg argues that the "socialist a priori" should inform the processes of technical implementation and technical practice. However, it seems to me that this misses the instrumentality implicit in design and design practices more generally, which often tend to maximise instrumental values in their application of concepts of efficiency and organisation. This, in some senses, requires a call for a radical politicisation of design, or a new form of critical design that is different from, and more revolutionary than, the form outlined by Dunne & Raby (2013). Here we might start making connections to new forms of rationality that offer possibilities to augment or perhaps replace instrumental rationalities, for example in the potentialities of critical computational rationalities, iteracies, and other computational competences whose performance and practice are not necessarily tied to instrumental notions of efficiency and order, nor to capitalist forms of reification (Berry 2014).




Bibliography

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Dunne, A. and Raby, F. (2013) Critical Design FAQ, accessed 23/1/2013, http://www.dunneandraby.co.uk/content/bydandr/13/0

Feenberg, A. (2013) Marcuse’s Phenomenology: Reading Chapter Six of One-Dimensional Man, Constellations, Volume 20, Number 4, pp. 604-614.

Horkheimer, M. and Adorno, T. W. (1999) The Dialectic of Enlightenment, London: Verso.

Marcuse, H. (1999) One-dimensional Man, London: Routledge.


