Playing videogames in the countryside of Rio Grande do Sul in the late 1990s had more epistemological significance than one might initially expect. As a subtropical region of Brazil, Rio Grande do Sul is known for its temperate climate with well-defined seasons: a mildly cold winter and a very hot and dry (and nowadays longer) summer. In the north-western parts of the state, such as my hometown, summer nights can be exhausting. Lower-middle-class teenagers (as I was at the time) could not expect to spend much time in air conditioning, as energy bills could account for a large share of a household's budget -- often people would gather in one air-conditioned room and perform the same activity together for the couple of hours the machine was turned on. Playing videogames was rarely one of those activities, since it was (at the time) largely dismissed as something for teenagers. Playing was thus restricted to the late hours of the night, when the temperature was milder and the A/C no longer needed to be on. I am relying here on a personal recollection to emphasise that the relationship between gaming and heat can already be noticed at the very small scale of personal, individual perception: the old PlayStation 1 console would often simply shut down and stop working for hours due to overheating. This was a condition inflicted by the environment, since the console would rarely shut down when we played with the A/C turned on. Although I do not know exactly how the PlayStation's overheating protection worked at the time, this technical feature has a simple explanation in the case of computers: to avoid compromising or damaging internal components, motherboards ship with basic firmware, the BIOS, which will shut a computer down if the CPU temperature surpasses a certain level. The exact shutdown temperature varies with the BIOS settings, but it generally ranges from 70 to 100 degrees Celsius.
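To make that mechanism a little more concrete, the short sketch below reproduces in Python the kind of logic such firmware applies: poll a temperature sensor and force a shutdown once a critical threshold is crossed. It is only an illustrative approximation, assuming a Linux machine that exposes its sensors under /sys/class/thermal; the 90-degree threshold is my own placeholder, not the setting of the old PlayStation or of any particular BIOS.

```python
# Illustrative sketch of thermal-shutdown logic: read the CPU temperature,
# compare it to a critical threshold, and force a shutdown if it is exceeded.
# The path and the 90 degree C limit are assumptions for the example, not the
# settings of any particular firmware or console.

import os
import time

CRITICAL_TEMP_C = 90.0  # assumed threshold; real BIOS limits vary (~70-100 degrees C)
SENSOR_PATH = "/sys/class/thermal/thermal_zone0/temp"  # Linux sysfs, in millidegrees

def read_cpu_temp_celsius(path: str = SENSOR_PATH) -> float:
    """Read the CPU temperature from sysfs (reported in millidegrees Celsius)."""
    with open(path) as f:
        return int(f.read().strip()) / 1000.0

def supervise(poll_seconds: float = 5.0) -> None:
    """Poll the sensor and trigger an emergency shutdown above the threshold."""
    while True:
        temp = read_cpu_temp_celsius()
        if temp >= CRITICAL_TEMP_C:
            print(f"CPU at {temp:.1f} degrees C -- shutting down to protect hardware")
            os.system("shutdown -h now")  # roughly what firmware does, abruptly
            break
        time.sleep(poll_seconds)

if __name__ == "__main__":
    supervise()
```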
This is just a localised, small-scale example of how gaming computers require constant cooling to perform adequately. But what can we say when game processing and storage are not performed at one's home, but in purpose-built facilities like Data Centres and Server Farms, as in the case of Cloud Gaming? As we know, Cloud Gaming allows users to store and run games on a remote graphics processing unit (GPU) housed in a Data Centre. The video output of these games is then streamed to any personal computer, with the inputs of a multitude of players sent back to the cloud server over the internet (Willett 2019). Strategies to cool down Data Centres vary, of course, as they are designed and developed in many different contexts, and companies often adopt solutions suited to regional specificities such as geographic location, the efficiency of local energy grids, and the size and scale of operations, to mention just a few among a myriad of aspects. But we can observe how the scale of these resource- and energy-intensive activities sometimes requires rather eccentric approaches from system designers and geoengineers alike in order to cool these facilities down. Server halls have been built in underground geological repositories; bunkers from the Cold War era have been repurposed with the goal of maintaining a fire-isolated (and also privately secured) position; Scandinavian nation-states have been advertising cold weather as a resource for building "more sustainable" data centres in snowy regions (Vonderau, 2019), which require less electrical energy to be drawn for artificial cooling; and servers have also been installed underwater for the same reason. The more traditional "solution" besides A/C, though, is simply using water from nearby hydrographical sources to cool down server rooms (liquid cooling), leveraging the higher thermal transfer properties of water and other fluids. This is much more energy-efficient than air conditioning, although it of course also impacts local water supplies.

As Star and Ruhleder (1996, p.113) famously put it, while infrastructures are often concealed from sight due to their endemic role of performing background functions in modern societies, they are brought to the surface of public attention mostly when they fail. This can be said of the recent succession of fires involving Data Centre facilities in several locations around the world. Although companies tend to be very discreet (if not plainly opaque) about the conditions that allowed fire to break out in their facilities, it is hardly questionable that the role of Data Centres as implicit infrastructures for the functioning of contemporary societies becomes highly noticeable when they are set ablaze. Not only is stored data lost, but all kinds of services are suddenly paralysed or shut down. In the case of gaming, we can observe what happened when the OVHcloud SBG2 Data Centre in Strasbourg burnt down. Twenty-five of the European servers of the sandbox survival game Rust were destroyed by the fire, and developers Facepunch Studios were not able to restore the lost data. While the physical assemblages of infrastructure burnt in Strasbourg, secret items, collectibles and other sorts of features disappeared along with the players' monthly saved progress in the virtual worlds of the game. Chess players took it in their stride when servers of the free Lichess service were lost.
"Several of the Lichess servers burnt down in a data centre fire tonight", the service tweeted. "No-one was harmed. Thanks to our zealous sysadmin and his backups, all we lost was 24h of puzzle history". The somewhat frugal data that was lost due to the fire contrasts with what these calamitous situations highlight in terms of their elemental entanglements: in the ground, the cloud is actually pretty dense. It took six hours for 100 firefighters to bring the fire in the 500 square metres data centre under control, using 44 fire fighting vehicles, including a pump ship on the Rhine (Judge 2022). Moreover, as revealed by Data Centre Dynamics (DCD), an organisation which promotes best practices in data centre engineering, most Data Centres are designed to be difficult to switch off, as accidental shut offs would bring services down. Since power could not be cut off through a single switch, firefighters took nearly three hours just to get the facility's power cut off, making the electrical fire spread quicker through the location. The buildings had wooden ceilings built with materials designed to resist fire for around one hour, but since the energy could not be switched off quickly, the power room kept burning while the fire spread. Hereby the fire reveals an important principle of regularly working cloud infrastructures: the need and the designing for constant availability -- the reason first and foremost why power could not be switched off quickly in the OVHcloud SBG2 episode. How such an imperative gets articulated with the principles of environmental sustainability can only be, nonetheless, a question for much heated discussion. A less noticeable issue though (perhaps because less advertised by the very cloud services) refers to the question of scale. Not only is the fire scaled up by the massive amount of servers performing highly intensive processing under respectively high temperatures in the same space, but also at the same time. This growth in simultaneous energy demand is perhaps even less visible to players, but gets comparatively shocking in scale (a big fire, for instance), especially if we take into consideration that players like 12-year old me would have a very direct experience of the thermal nature of gaming through local shut down, but rarely would experience such a drastic episode as a half mile-long fire. We can see that, although not experienced by the immediate perception of players, the dimension of incidental environmental breakdowns unsurprisingly increases when games become large-scale media infrastructures. Together with the outsourcing of computing power and data processing, cloud gaming outsources the thermal properties of the medium. The computer becomes much less apparent to players, while it may be increasingly exchanging waste heat somewhere else in a different site, together with a large number of machines, in a large number of rooms, which may have a large-scale interaction with other non-computational systems such as the metabolism of local water supplies. In a relatively small 1 megawatt data centre (that uses enough electricity to power 1,000 houses), these traditional types of cooling would use 26 million litres of water per year (Ashtine & Mitton, 2021). This is why the transition of gaming from a local-based activity to a cloud-based service should also be analysed through a material lens considering the thermal transfers it encompasses. This is why observing the heat of gameplay modes matters. 
Digital media have long been described as ephemeral, immaterial, and cold – adjectives given even more impetus by the metaphors often used to explain outsourced, synchronous networked computing – the cloud. A more careful observer would perceive how the regulation of temperature sets conditions for how the matter of media technologies takes shape and circulates through the world (Starosielski, 2016, p.305), conditions in which "non-mediatic" or "premediatic" materialities can be transformed into elements of mediation (Parikka, 2013, p.70; 2015, p.4). Whether these elements result in the slow, cumulative increase of waste heat or the wild and radical expenditure of a bursting fire is not only a matter of contingency, but also of design. Nonetheless, if current practices in infrastructure design are not making the thermal conditions of organic environments milder, but only making the relationship itself more opaque, then, as Peter Krapp (2015) notes in his article on polar media, it falls to the realm of fiction to expose our senses to the paradoxes and risks. In this case, the risk to consider is that of ignoring media's endemic relationship with temperature. Krapp reminds us of a passage in Thomas Pynchon's 2013 novel Bleeding Edge, in which one character states that "more and more servers are put together in the same place putting out levels of heat that quickly become problematic unless you spend the budget on A/C". The solution his partner suggests in the novel is "to go north, set up server farms where heat dissipation won't be so much of a problem, take (...) power from renewables like hydro or sunlight, use surplus heat to help sustain whatever communities grow up around the data centers. Domed communities across the Arctic tundra". To which he is answered: "My geek brothers! the tropics may be OK for cheap labor and sex tours, but the future is out there on the permafrost, a new geopolitical imperative—gain control of the supply of cold as a natural resource of incomputable worth, with global warming, even more crucial". The imagery evoked by Pynchon's character is of course deliberately cynical, but it may also strike a sensitive nerve. Mäntsälä, a Finnish town of 20,000 inhabitants located 65 kilometres north of Helsinki, was proclaimed decarbonised by its local district heating and energy monopoly, Nivos (Muukka, 2018). It did so with the help of a data centre launched in 2016 by Yandex, the Russian platform for internet search, transportation and geolocation services (commonly referred to as 'the Russian Google'). The waste heat produced by the Yandex data centre was channelled into Mäntsälä's household heating system, reducing the municipality's use of natural gas negotiated with Russia. The electricity that powers Yandex data centres is not intrinsically environmentally friendly, though, as the energy it turns into heat is sourced from other places in Europe, where it creates carbon emissions and other forms of pollution. Mäntsälä and its energy monopoly picture themselves as clean and carbon-free by means of the infrastructural outsourcing of emissions and other forms of pollution to other places – akin to the displacement of energy consumption from personal home computers to centralised cloud computing facilities.
Therefore, as cloud gaming emerges as thermal media, the energy demands of data centres require transparency, debate, and regulation in the context of policy development, since, as large-scale infrastructural systems, their implications extend far beyond the local contexts in which they are experienced as a service. Once again, the material properties of infrastructure may be best disclosed when they fail, when the conveniences they provide to our daily experience are interrupted. It seems, after all, that infrastructural failure, just like the climate crisis with its heat waves and water shortages, cannot be completely outsourced.
Some notes I took while reviewing Susan Leigh Star's seminal article 'The Ethnography of Infrastructure', published in American Behavioral Scientist, Vol. 43, No. 3, November/December 1999, pp. 377-391.
In her work describing the methodological aspects of infrastructure studies or, as she prefers to name it, the 'study of boring things', Leigh Star highlighted the difficulty and strangeness of studying infrastructures: "struggles with infrastructure are built into the very fabric of technical work". According to her, such studies do not involve only the usual strangeness habitual to anthropological work. Rather, it is a second-order strangeness, of an embedded sort – "that [strangeness] of the forgotten, the background, the frozen in place". She was referring to those things we often do not consider important: "as well as the important studies of body snatching, identity tourism, and transglobal knowledge networks, let us also attend ethnographically to the plugs, settings, sizes, and other profoundly mundane aspects of cyberspace, in some of the same ways we might parse a telephone book". We can see just how the technical media of each era participate in the perception of the mundanity of each of these artifacts, such as computer settings or telephone books. So "the ecology of the distributed high-tech spaces is profoundly impacted by the relatively understudied infrastructure that permeates all its functions. Study a city and neglect its sewers and power supplies (as many have), and you will miss essential aspects of distributional justice and planning power" (Latour & Hermant, 1998). And she remarks: "Perhaps if we stopped thinking of computers as information highways and began to think of them more modestly as symbolic sewers, this realm would open up a bit".

The article also highlights some important methodological tools. Infrastructural inversion (Bowker, 1994): "foregrounding the truly backstage events of work practice to help describe the history of large-scale systems" (Bowker, G. 1994. Information mythology and infrastructure). According to Leigh Star & Ruhleder (1996), in their work on worm communities, infrastructures as technical systems have the following dimensions: a) embeddedness; b) transparency; c) reach or scope; d) learned as part of membership; e) links with conventions of practice; f) embodiment of standards; g) built on an installed base; h) becomes visible upon breakdown; i) is fixed in modular increments, not all at once or globally (p.381-382). See more in: Leigh Star & Ruhleder (1996). Steps toward an ecology of infrastructure.

Some particular notes on these dimensions: d) The taken-for-grantedness of artefacts and organisational arrangements is a sine qua non of membership in a community of practice (p.381). e) Infrastructure both shapes and is shaped by the conventions of a community of practice (e.g., the ways that cycles of day-night work are affected by and affect electrical power rates and needs). Generations of typists have learned the QWERTY keyboard; its limitations are inherited by the computer keyboards and thence by the design of today's computer furniture (Becker 1982) (p.381). g) Infrastructure does not grow out of nothing. It wrestles with the inertia of the installed base and inherits strengths and limitations from that base. Optical fibers run along old railroad lines; new systems are designed for backward compatibility, and failing to account for these constraints may be fatal or distorting to new development processes (Hanseth & Monteiro, 1996) (p.382). h) The normally invisible quality of working infrastructure becomes visible when it breaks: the server is down, the bridge washes out, there is a power blackout (p.382).
Infrastructure and Methods: The article also highlights that the methodological implications of a relational approach to infrastructure are considerable: Fieldwork (...) transmogrifies to a combination of historical and literary analysis, traditional tools like interviews and observations, systems analysis, and usability studies (p.382). People make meanings based on their circumstances, and (...) these meanings would be inscribed into their judgements about the built information environment (p.383). The labor-intensive and analysis-intensive craft of qualitative research, combined with a historical emphasis on single investigator studies, has never lent itself to ethnography of thousands. (...) The scale question remains a pressing and open one for methodological concerns in the study of infrastructure. (...) Yet, I know of no one who has analysed transaction logs to their own satisfaction, never mind to a standard of ethnographic veridicality (p.383-384).

Tricks of the Trade: In this section, Leigh Star examines tricks she developed in her studies which can be helpful for analysing infrastructure and unravelling some of its features.
The thorny problem of indicators: In Leigh Star's view, one can read information infrastructure as: a) a material artifact constructed by people (physical and pragmatic properties); b) a trace or record of activities (transaction logs, email records, classification systems as evidence of cultural decisions, conflicts and values); or c) a veridical representation of the world (here the information system is tacitly taken as if it were a complete enough record of actions). These are three different orders of information that researchers can access. It is easy to elide these functions of indicators, so it is very important to cultivate an awareness of these differences and to disentangle them. For example: "Films about rape may say a great deal about a given culture's acceptance of sexual violence, but they are not the same thing as police statistics about rape, nor the same as phenomenological investigations of the experience of being raped" (p.388). One should not confuse precision with validity in the creation of a system of indicators and categories; they are two different criteria.

Bridges and barriers: "At least since Winner's (1986) classic chapter, 'Do Artifacts Have Politics?' the question of whether and how values are inscribed in technical systems has been a live one in the communities studying technology and its design" (p.388). Winner observed how a behind-the-scenes policy decision was made to build the automobile bridges over the parkways of New York low in height. As a result, public transport such as buses could not pass underneath, only small private vehicles: poorer people were barred from the richer suburbs of Long Island, not by policy, but by design. This is just one example; matters of accessibility, for instance, depend on the same logic. The same goes for computers and IT infrastructure in poor countries or regions: the infrastructure may be funded, but the electricity to run it may remain expensive. The same now happens in relation to sustainable systems for media infrastructure and global injustice: building the infrastructure without fixing the energy systems will only widen economic differences. She concluded, already in the 1990s, that "applying the insights, methods, and perspectives of ethnography to this class of issues is a terrifying and delightful challenge for what some would call the information age".
Author: This blog is meant to provide a space for discussing the geophysical as well as the imaginary entanglements between media infrastructures and organic environments. In the coming months, it will be dedicated to my current project, Cloud Gaming Atlas, which is particularly interested in observing and interrogating the infrastructures developed for cloud gaming initiatives with regard to their environmental implications. Additionally, it should also gather information about events and publications related to my project at the Zukunftskolleg and the Department of Literature, Art and Media of the University of Konstanz.