#BDCH23 ABSTRACTS
—————————————–

 

Abstracts appear in the order in which they are presented at the event.

Research Consortia in Digital Heritage Research: The ‘Archiving Australian Media Arts’ Project and Beyond

Melanie Swalwell

Drawing on more than fifteen years’ experience leading multi-disciplinary digital heritage research projects, in this presentation I will briefly detail the progress and achievements of the “Archiving Australian Media Arts: Towards a method and a national collection” Linkage Project against its Aims (and pandemic-necessitated adjustments!), before reflecting on the quite remarkable consortium that has undertaken this work. My aim is to offer an assessment of the utility of a ‘consortial approach’ more generally in born digital cultural heritage research.

I speak of ‘digital heritage’ research projects rather than simply ‘digital preservation’, first because most of us in the university system are researchers rather than digital preservation practitioners, and second, because our team is concerned with bringing humanities scholarship to bear on the born digital artefacts we are helping to gather, document and preserve, whether this is through curation or ethnographically-inspired research with communities or evaluating artefacts’ cultural significance, to cite just a few examples.

Collaboration is key, for many reasons. At the most basic level, different knowledge and skillsets are required from quite radically different disciplines, which often do not have a history of much previous interaction (meaning there is much potential). The collaboration between university researchers and GLAM professionals brings yet more diverse expertise as well as (of course) differences in norms, workflows, and expectations. The Merriam-Webster dictionary defines ‘consortium’ as a “group…formed to undertake an enterprise beyond the resources of any one member”, which speaks to the efficacy of multiple cultural institutions with similar needs teaming up with us: cross-institutional collaboration is advantageous in developing answers to challenging questions. Advantages can be leveraged in all sorts of ways in the consortium, and “resources” should be understood here not only in monetary terms, but also in terms of skillsets, capabilities, and even policy and strategic direction. Appropriate scaling is important, but when it works, the whole is so much more than the sum of the parts.

Kooks of Usenet Past: Hauntings in the Born-Digital Archive

Avery Dame-Griff

For those active in transgender-focused Usenet groups during the 1990s up to the mid-2000s, few regular posters were as infamous as Laura Blake. Best known for her prolific and pugnacious posting style, Blake engaged in frequent, extended flame wars with her ideological opponents. Some dismissed her as a kook, a term in Usenet lingo designating “a regular poster who continually posts messages with no apparent grounding in reality” (Raymond).

What made Blake notable beyond her voluminous posting habits, however, was one of her main linguistic weapons in these battles: “cisgender.” During this period, the term was entirely absent from English language trans print media, instead exclusively used by Blake and her allies on Usenet. As I’ve argued, Blake plays an essential role in the term’s contemporary popularization (Dame-Griff 2023). Yet Blake’s use of “cisgender” is deeply embedded in her wider separatist ideology, including her violent opposition to transsexuality, particularly gender confirmation surgery. Her posting habits earned her several persistent enemies, who would regularly deadname and dox her, including posting her hometown. Blake wasn’t the only target of such tactics, however, and the Usenet archive is full of such sensitive information.

This paper, then, explores the challenges of born-digital archives and “difficult” historical figures. Drawing inspiration from Abram J. Lewis’s work on Angela Douglas and the transgender archive (2014), I consider what it means for a born-digital archive to be “haunted” by the kooks of Usenet. Given the sheer volume and searchability of digital archives, what are the risks to posters? What is the historian’s duty to explore the accuracy of posts, and how can they account for the risk of experiencing “archival whiplash” in the process (Shaw 2019)? And lastly, how can preservationists and researchers balance their desire to archive the past with protecting sensitive information?

Inventing the Archived Web: Lessons from a Prehistory of Australia’s Web Archiving Program

 

Kieran Hegarty

Which born-digital material is designated the status of heritage? How does this process of “heritagization” occur? This paper explores these questions by focusing on the period between the installation of the first web server in Australia (late 1992) and the commencement of Australia’s collaborative web archiving program, PANDORA (October 1996). Drawing on oral history interviews with current and former library staff and organisational records from the period, I illustrate how staff at the National Library of Australia responded to an emerging networked digital information environment.

During this period, staff debated and defined which material from networked information environments (including Gopher, the World Wide Web, bulletin board systems, and Listservs) could and should be collected by the national institution. By conceptualising publicly available websites as a kind of “evolving” serial publication—the site as the “title”, its pages “volumes”, and its updates “issues”—a group of existing library staff well-versed in processing serial publications could adapt and extend existing standards and practices to (some) born-digital content. This framing has had a profound influence on what is collected, as well as how it is made available, with a “single, linear, document-centric access method” typical of published records still prevalent in web archives.

In exploring this history, this paper shows how including websites in the national collection was not just a technical feat, but a conceptual, social, and cultural one as well. The aim of the paper is to present lessons as to why some ways of preserving digital artefacts become routine ways of acting and some artefacts become worthy of preservation, while others are disregarded. It encourages attention to the broader organisational contexts and institutional standards that underpin collecting and argues for the need to leverage, and steadily reorient, established practices and techniques to emerging media forms.

 

Preservation of Digital Culture and Commentary: What We Miss When We Fail to Preserve Local Digital Innovators like Mess + Noise, the Oz Music Project and other Local Arts Communities

Liz Guiffre

At a time when wide participation in and support for Australian popular culture appear to be particularly under threat (Briggs, 2023; McHenry, 2023), it is even more vital that we unearth and re-celebrate past achievements, allowing contemporary Australians to ‘be what they can see’ (and hear). Popular cultural histories of Australia are still dominated by a few narratives—they overrepresent male, white and relatively privileged artforms and their audiences, leaving out artists, audiences and perspectives that aren’t part of these.

In reality, Australian popular culture participation is much broader than these accounts – with many more women, CALD and gender-diverse participants, as well as diasporic and blended forms of art being performed locally at community outlets, underground venues, and in ways that are not yet able to draw the attention of mainstream media. Printed street press in the 1980s and 90s captured much of this, and into the 2000s this was continued by local digital arts press, online-only zines, message boards and newspapers that provided room for artists, but also journalists, photographers and audiences to come together and have space in a way that mainstream press did not yet allow.

Iconic titles like Mess + Noise, The Oz Music Project, In The Mix and Junkee were all hyperlocal and made room for Australian experiences. As these have been taken over or taken down, the histories these outlets captured have also been lost, and importantly so too has the groundbreaking work of the scene. Finding a way to capture these histories will provide a richer and more accurate account of Australian popular culture over time, but also provide a way to inspire new artists and audiences.

The Game and Interactive Software Scholarship Toolkit (GISST): Enabling Interactive Software Reference

Eric Kaltman & Joseph C. Osborn

The Game and Interactive Software Scholarship Toolkit (GISST) is a US National Endowment for the Humanities (NEH) funded project to develop a framework for the citation and sharing of emulated states and input streams. GISST allows a user to load files and games into an emulator (either locally or in-browser), record their interactions with it, and then share linkable citations to “replays” of user input as well as the emulator’s underlying memory states. This enables embedding of curated emulator states and replays into online articles, as well as canonical URI references to specific states running on specific architectures.

Currently, the system supports many common computer game systems through the Retroarch framework, as well as 32-bit x86 systems. The project is also developing a set of targeted case studies into computer games, architectural design software, e-literature, and other software studies domains, in addition to exploring GISST’s potential for automating digital preservation workflows.

Since GISST will allow for quick emulated access to legacy software files, it could be used for validation of ingested digital objects, recording of data-access workflows, and comparisons of digital objects running on different, yet related, architectures, or different versions of objects running on similar architectures. The system includes a web-based and native application interface, as well as a server-side repository to store saved input streams and states. Our goal is to create a new ecosystem for sharing and leveraging software-based records for collaborative study and institutional use.

Comparing the Archival Rate, Frequency, and Quality of Chinese-language and English-language Web Pages from the late 1990s and early 2000s on the Wayback Machine

Richard Lewei Huang

The importance of web archives is gaining recognition both within and outside of academia. However, most existing works in web archiving focus on English-language web content. While Chinese-speaking users account for more than 20% of the world’s online population, few studies have been conducted on the status of web archiving for Chinese-language web content. Understanding how well the Chinese-language web is archived is crucial not only for researchers investigating the histories of computing and online communication in Chinese-speaking countries, but also for those studying contemporary society and culture in general in these countries.

This paper investigates how well Chinese-language web pages from the late 1990s and early 2000s are archived by the Wayback Machine. We compiled a dataset of 77,777 unique historical URIs from six “Internet directory” books published in mainland China and the United States between 1999 and 2001. We categorised the URIs into three subsets by language: simplified Chinese, traditional Chinese, and English. We then calculated the archival rate (how many URIs have at least one archived copy on the Wayback Machine) and archival frequency (how often the Wayback Machine archives URIs in the dataset) for URIs in each subset. We also gauged the archival quality of URIs in each subset by analysing the amount of resources missing from the archived pages. We show that, overall, Chinese-language URIs have a slightly lower archival rate than English-language URIs (92.8% vs 95.2%), and simplified Chinese URIs have a slightly lower archival rate than traditional Chinese URIs (85.8% vs 93.6%). This contradicts earlier research by Thelwall and Vaughan (2004), which reported a much larger gap in archival rate between Chinese and English URIs, and between simplified and traditional Chinese URIs. However, English-language URIs are much more frequently archived than their Chinese-language counterparts by the Wayback Machine, and they also fare better in terms of archival quality.

We believe our approach of using historical URIs enables us to better understand how well early Chinese-language web content is archived, compared to earlier attempts to measure archival coverage of language-specific web pages such as Alkwai et al. (2017) and Thelwall and Vaughan (2004), which used datasets composed largely of URIs that were accessible online at the time of their writing. Our findings also demonstrate how language bias on the Wayback Machine may manifest in differences in archival quality in addition to metrics used in prior research such as archival rate and frequency. [Data are tentative and subject to future revisions]

Evaluation of Emulation-as-a-Service for 1990s Videogames

Denise De Vries

Play It Again: Preserving Australian Videogame History of the 1990s (LP180100104) is employing Emulation-as-a-Service and Emulation-as-a-Service Infrastructure to render 1990s video games playable. These platforms are being evaluated against criteria that measure the quality of user interactions with the software. While measurements of “quality” are subjective and differ greatly depending on the player’s knowledge of the game, in this evaluation all responses have been made by players who are new to the games and their interfaces.

Three games have been selected as test cases from the early, mid and late 1990s. These have differing levels of complexity and require progressively more resources. The games are Gumboots Australia, an educational DOS game released in 1990 by Reckon Software; The Dame Was Loaded, a full motion video adventure DOS game released in 1996 by Beam Software; and Krush Kill ‘N Destroy Xtreme, a real-time strategy game for Windows 95 released in 1997 by Beam Software.

The criteria for evaluation include the quality and synchronisation of audio (music and sound effects) as well as the control response of the mouse and keyboard. Computer usage metrics are also captured to record both the pre-emulation use of resources and their use while the emulation is executing.

The Evolution of Architectural Documentation: A Case Study on Software Adoption and Challenges in Contemporary Practice

Kirsten Day

The advent of computer-aided design (CAD) in the 1980s marked a significant transformation in architectural documentation. Over the ensuing decades, architectural practices gradually transitioned from traditional analogue methods to fully embracing digital documentation, relying exclusively on hardware and software for their design work. While preservation efforts have predominantly focused on CAD and word processing programs, the architectural industry’s relentless pursuit of innovation in documentation, rendering, animation, and the integration of virtual reality (VR) and augmented reality (AR) has led to the widespread adoption of various software tools.

A significant challenge faced by architectural offices is the frequent need for software upgrades during the course of a design project. Given the complex nature of large-scale projects, such as airport centres, which can span up to a decade from inception to completion, extended timeframes are often required for document access due to legal and archival considerations. This necessitates the ability to modify, copy, or access information over time, while the constant threat of software obsolescence adds further complexity to the process of transitioning to newer software versions while ensuring project continuity and data integrity. The standard timeframe for migration between software upgrades, as set by software vendors, currently stands at three years.

This conference paper presents a comprehensive case study conducted at a medium-sized multinational architectural firm located in Melbourne. The primary objective of the study is to thoroughly examine the diverse range of software employed in their practice, analyse their application methodologies, and investigate the strategies employed for software updates throughout the lifecycle of a project. This study offers valuable insights into the software landscape prevalent in contemporary architectural practice and explores the dynamic nature of software updates. It serves as a vital resource for architectural practitioners and researchers, while also highlighting concerns about the prioritization of innovation by software developers, often overlooking the significance of preservation aspects in the architectural process.

Dynamic Objects, Evolving Collections: A New Approach to Changeability at the National Museum of Australia

Asti Sherring

The digital revolution has disrupted the archetype of the 20th-century museum. We are moving from a place of tradition and contemplation into an active space, driven by experience and both in-person and virtual connectivity. This transition is driven by cultural changes in the present day, which are increasingly mediated by technologies with the ability to engage human senses in different ways, therefore creating new connections (Sherring 2020). In response to these cultural and societal shifts, the future impact and relevance of the 21st-century museum will be played out across both the physical and digital landscape. Our crucial role as cultural stewards “to define, describe and prolong the existence of cultural material” (Wain and Sherring 2021) therefore also needs to change—in response to and anticipation of—these societal shifts.

Since 2022, the National Museum of Australia (NMA) has been undertaking the Changeable and Digital Collections Project, which identifies both the thinking and activities required to acquire, manage, and make accessible objects that are dynamic, variable, and relational, where change is inherent to their ongoing meaning, value and significance. As proof of concept, the NMA team are applying new philosophical, collection management, and conservation approaches to one of the most significant Changeable objects in Australian social history, an operational version of the CSIRO Wireless LAN Testbed, which was built in 1992–1994 to provide a proving ground for CSIRO’s solutions to high-speed data transmission in an indoor radio environment – i.e. the beginning of WiFi.

Recognising the changeable nature of digital cultural heritage is crucial for ongoing preservation and activation within a museum environment. This approach acknowledges that culture is not something static and unchanging, but rather a vibrant and adaptive force that is reflected in the continual evolution and expression of the cultural heritage objects in our care. By viewing objects through the lens of change, we can conclude that authenticity can be seen to lie not in keeping things the same, but in understanding and accepting how and why things have changed.

Understanding Digital Architecture

Ania Molenda

As the National Collection for Dutch Architecture and Urban Planning, Nieuwe Instituut (NI) manages one of the largest architecture collections in the world and the largest in the Netherlands. Since 2015, it has been developing the capacity to accommodate its growing born-digital collection. The early use of Computer-Aided Design in architecture is mirrored in the complexity of its archives, comprising an array of 2D drawings, 3D models, graphic design, collages, renderings, animations, and computer code. Like many other collecting institutions, NI is developing ways of storing, cataloguing, preserving, and presenting those collections. Changing notions of the traditional understanding of the object, authorship, and originality, triggered by the characteristics of born-digital archives, simultaneously shake up institutional frameworks and provide new perspectives on the history of architecture and design practice. To make such archives more usable and accessible for professionals and broader audiences, NI recognises the need to build a better understanding of those collections, including the opportunities and challenges they pose for collection management.

On the one hand, born-digital architecture archives present unique opportunities to study and interpret the intertwined relationship between the built environment, digital cultures, and social change. They allow us to rethink the design practice and its archive as a more multivocal and dynamic collaboration between humans and machines. On the other, the combination of technical challenges related to the utilization of these files and the high level of digital literacy required to navigate them poses limits to both research and discovery.

In this paper, I will present the findings of my research titled Understanding Digital Architecture: Stories Born in the Digital Archive, conducted between 2021 and 2022. I will elaborate on the abovementioned opportunities and challenges by bringing together a reflection on state-of-the-art approaches to born-digital architecture collections with a selection of speculative stories emerging from the archives.

 

Fair Play: Legal Strategies for Ongoing Access to Complex Digital Content

Robin Wright

The Digital Preservation Coalition (DPC) has established a task force with participants drawn from its member organisations around the world to develop evidence, reasoning, and global support for a practical approach to the preservation of complex digital objects in a challenging global legal environment. Legal barriers are one of the biggest obstacles to the effective preservation of complex and interactive digital content, including video games. And the different legal exceptions available for preservation in different jurisdictions often add to the confusion. In addition, the rapid development of digital technologies often far outpaces law reform.

The DPC task force is developing resources to provide case study examples of the approach taken to preserving video games by organisations in different legal jurisdictions. This will allow for the comparison of the reasoning behind legislative decisions and provide practical examples of the different outcomes experienced by organisations around the world and the impact this has on the preservation of our born digital culture. The DPC is aiming to create a package of materials, quotes, evidence and arguments that can support ongoing action and advocacy by preservation organisations, industry bodies, producers and governments to help ensure ongoing access to one of the most critical digital artforms of the early digital age.

This presentation will release the initial findings of the task force, demonstrate the first case study data collected and explain the data collection processes being followed to develop new DPC resources to assist organisations with ensuring long-term access to video games. It will focus on the work already being undertaken by DPC members and others in Australia, and consider the different constraints faced by those in the US, where there are fair use exceptions and the possibility of amending legislation to address technological change, and in the UK.

History without an Archive? The Challenges of Collecting, Preserving and Researching Bulletin Board Systems in Turkey

Arda Erdikmen & Ivo Furman

From 1990 to 1997, Turkey boasted a small yet vibrant community of Bulletin Board Systems (BBS). The first BBS in Turkey (SoftCom) was established in 1990 by three undergraduate students who were members of the Bosphorus University Engineering Society club. Allegedly, the software needed to set up SoftCom was downloaded from BBS operators in Sweden and Greece. Soon afterwards, the Turkish national broadcaster (TRT) interviewed the three students, presenting SoftCom as a national technological breakthrough. The interview created public interest in the subject, and soon local BBSs were cropping up throughout the metropolitan areas of Turkey. These local BBSs were connected through an array of national networks, the most popular being HitNet (Hi! Türkiye Network).

Founded in 1992 by a group of computer enthusiasts in Ankara, HitNet used Fidonet networking protocols and an echo-mail system, allowing information to be exchanged between local BBS networks. At its peak, HitNet connected around 300-400 local nodes nationwide. Preliminary research exists on the community culture and user practices of HitNet (Furman 2015, 2017). Although there is scholarly consensus on how and when BBS technologies became popular in Turkey, we have little idea about what was discussed on BBS networks. In fact, HitNet is the only Turkish BBS to have a publicly accessible archive. For the most part, local BBS archives are either inaccessible or lost to the march of technology. On the other hand, Turkey’s position on the periphery of transnational BBS networks such as PeaceNet or FidoNet means that it is difficult to find anything relevant about Turkish BBS users in international archives such as Textfiles.

Although this facet of pre-Internet digital culture is an indisputable part of Turkey’s born digital cultural heritage, the lack of archives from 1990 to 1997 means it is at risk of being entirely forgotten by future generations. Using the HitNet archive as a case study, our presentation looks at the challenges associated with collecting, preserving and researching BBS archives in Turkey.

Saving Stan: Preserving the Digital Artwork of Joseph Stanislaus Ostoja-Kotkowski

Taryn Ellis

The archive of Polish-Australian artist Joseph Stanislaus Ostoja-Kotkowski is inscribed in the Australian Register of the UNESCO Memory of the World. This collection includes more than 900 3.5-inch floppy disks containing digital artworks created in the late 1980s and early 1990s.

The State Library of South Australia (SLSA), in conjunction with the Archiving Australian Media Arts project (AAMA), embarked on a project to image these disks and use emulation to access the contents. But digital preservation is rarely a simple, linear endeavour. The artwork file formats proved varied, with a significant subset unable to be rendered by standard emulation software. Some of the disks contained RISC OS applications for the Acorn Archimedes series of personal computers, raising questions around access and copyright and challenging standard emulation configurations. 

This presentation describes how the workflow grew and evolved to encompass automation, obscure open-source and community-created software, a lot of Python, a hex editor, and even sticky tape. While a strategy for onsite access to the RISC OS software is still in development, the art is now available to view online. This is the first time that Ostoja-Kotkowski’s pixel art has been shown in digital form.

Preserving and Emulating Australian Made Videogames of the 1990s

Helen Stuckey

This presentation reflects on our experience disk imaging and emulating a selection of Australian videogames in the Play It Again: Preserving Australian Video Game History of the 1990s project. This three-year project was funded by the Australian Research Council as a Linkage Project. The project is a collaboration with two Partner Organisations, ACMI (the Australian Centre for the Moving Image) and AARNet (Australia’s Academic and Research Network).

In the project, we have sought to document, preserve, and exhibit the history of Australian-made videogames of the 1990s. The preservation aspect of the project has two major parts. The first part is to create digital images from the physical media carrying the game software. The second part is using Emulation-as-a-Service Infrastructure (EaaSI) and other open-source emulators to make the games playable again.

In this presentation, we discuss successes, difficulties, and limitations that we have encountered in preserving and emulating the curated selection of 50 game titles and the implications for access and exhibition.

Emulation and Infrastructure

Sean Cubitt

The rule of thumb that the older the medium, the more persistent it is holds well in general, but needs more specification when it comes to the transition to digital media. Paper artefacts are reasonably discrete and stable objects to preserve or, at least, to document, because they don’t need a power supply. Electrical media are complex artefacts in the sense that they depend on infrastructures that also have to be preserved. Mass media production and circulation tended toward mechanical standardisation, for example of film and recorded music, with large numbers of reliably similar machines and parts, reducing the infrastructure problem to manageable proportions. The early years of computer arts are more complex still, with a proliferation not only of computers, operating systems and software applications but of peripherals, connectors, displays and an evolving terrain of standards and protocols. At the brink of quantum computing, digital culture begins to look like a definable period, almost coherent. Yet the heterogeneous infrastructures these media depended on, depend on now and will depend on in future raise new challenges for preserving artworks and artefacts, even for creating documentation.

The proliferation of new media arts since the late 1970s is very much in living memory. Accessing old drives and software, and ensuring they can function adequately to play often demanding artworks, involves hard work but also a good deal of nostalgia. Alongside the ethics of emulation and the challenges of intellectual property rights, thinking about archiving as nostalgia, from the Greek words for pain and home, is necessarily also thinking about what kind of home we have lost in obsolete technologies. Hearing modem chimes and boot-up alerts, handling old external storage media and their connectors, reviving once-familiar application interfaces and coding with now obsolete protocols still draws on muscle-memories and half-remembered workarounds. Giovanna Fossati wrote brilliantly on the archival life of film – how archived filmstrips continue to evolve or decay chemically and physically, a trait we often find in electronic storage media too. Film archives often deal with materials predating the birth of their handlers. But spare a thought for the life of archivists of the near-contemporary, face-to-face not with the anonymous dead but with their own past.

Working with digital archives means not only caring for and restoring artefacts, but modelling and rebuilding the platforms they were built on: systems and system architectures, peripherals and ports, clocks and configurations that also have to be unearthed, often from disparate collections (or forgotten boxes of old cables), re-constructed or tinkered together with the aid of manuals, printouts and memories. Confronting yet another new practice of computing in AI, another revolutionary infrastructure in the form of quantum computing, and the very possible end of the era of online platform dominance (with everything that implies for archiving), the mutual dependence of software and hardware has never been clearer or more urgent, or raised so many questions about the emotional and ethical demands of archival practice.

Software Is Stuff Unlike Any Other

Dragan Espenschied

In the early 2000s, the Guggenheim Museum in New York hosted the Variable Media Initiative, Andrew Wilson from the National Archives of Australia published A Performance Model and Process for Preserving Digital Records for Long-term Access, and the Matters in Media Art collaboration between MoMA, SFMOMA, and Tate was launched.

Foundational ideas put forward about 20 years ago remain highly influential today and have been successfully applied to new preservation challenges in museums and archives. This presentation will review some of these ideas in the light of more recent developments in software preservation and digital art conservation.

Preserving Complex Born-digital/Software-based Artworks and the PREMIS Metadata Framework

Rebecca Barnott-Clement

Born-digital and software-based artworks are often incredibly complex and idiosyncratic in nature, requiring equally idiosyncratic approaches to preservation activities and methodologies of care. Given the normalised nature of digital asset ingestion into digital preservation systems, what does it mean to ‘preserve’ a software-based artwork within such structures?

At the Art Gallery of New South Wales (AGNSW), Digital Preservation staff are currently evaluating how best to document, structure and map fifteen case study artworks (including all their associated digital files, metadata, and the proprietary and customised software environments they operate within) using a PREMIS metadata framework.

It is hoped that the interrogation of these case study works will provide a blueprint for ingesting other complex, software-dependent digital objects into the Gallery’s digital preservation system, along with guidance on how best to represent this information in the Collection Management System (CMS).

Bit by Bit: Preserving Collections on Digital Carriers

Matthew Burgess & Roxi Ruuska

Born-digital collections stored on original physical digital carriers are at high risk of loss due to hardware and software obsolescence, and degradation of physical media. Although some of this material may be replicated in printed form, the digital original has value not only as the primary archival source but in the additional research potential offered by technical metadata and its usefulness for access. Unlike physical material that can be stored in the right environment for a long period of time, born-digital collections have a short window of opportunity for preservation actions to ensure continued access over time. Transferring born-digital collections from original carriers such as floppy disks and CDs is the first step for preservation and access. It requires specialist hardware, software, operating systems, and skilled people who understand how to use them.

The Digital Curation team at the State Library of New South Wales has been focusing on the transfer of material from original carriers since April 2022. This paper provides a practical case study, outlining the experience of transferring the contents from over 1,000 carriers to network storage, lessons learned, and next steps required for preservation and access.

Building an Environment Creation Workflow for AusEaaSI Using the ACMS Software Accession

Cynde Moya

Our Australian Research Council (ARC) Linkage, Infrastructure, Equipment and Facilities (LIEF) project is entitled The Australian Emulation Network: Born Digital Cultural Collections Access. This project is led by Prof Melanie Swalwell of the Centre for Transformative Media Technologies at Swinburne University of Technology. It aims to conserve and render high-value born digital artefacts from university archives and the GLAM (Galleries, Libraries, Archives and Museums) sector that require legacy computer environments.

This project uses a tool in development called Emulation-as-a-Service Infrastructure, or EaaSI. The EaaSI program of work, sponsored by the Alfred P. Sloan foundation and the Mellon Foundation, gained momentum in the United States around 2018. Our Australian project chose EaaSI as the best tool for managing and sharing configured software environments in a controlled system. Now, as AusEaaSI, it is the backbone of this ARC LIEF emulation network.

ACMS, the Australian Computer Museum Society, has agreed to loan its significant collection of commercial software from the 1980s, 1990s, and 2000s to this project. This will create a foundation of configured software environments to share across our partners in the AusEaaSI network. We will digitise the software, create catalogue records in the ACMS Catalog-IT system, and then configure environments to be shared across the AusEaaSI network. This presentation reports on the early decision making, resources used, and workflow strategies chosen in this ongoing project. It will include live demos of the AusEaaSI environments.

Balancing Bulk Processing with Responsive Access Methods

Candice Cranmer

How do you preserve archives of material collaboratively, simultaneously and at scale, while also honouring artists’ intent in reimagined exhibition displays? Case studies of ACMI’s videogame preservation and interactive artwork re-display will offer insights into our workflows and collaborative preservation strategies.

ACMI’s collaboration in the ARC-funded Play It Again projects to preserve 1980s and 1990s videogames, in Archiving Australian Media Arts: Towards a Method and National Collection, and in the LIEF AusEaaSI Community of Practice with Swinburne University of Technology and RMIT University will provide a framework to explore the receipt and ingest of Experimenta’s legacy collection at ACMI. Discussing the development and implementation of a tripartite agreement with the Powerhouse Museum and the National Film and Sound Archive, I will trace our collaborative acquisition and preservation workflow for contemporary Australian videogames, with reference to select examples such as the videogame Untitled Goose Game by House House.

This discussion provides context for the select games and artworks that will be made available to view/play within ACMI’s Wi-Fi footprint for those attending the conference (see the conference about page for further details).

Radical Uncertainties: Collecting Digital Objects at the V&A, Histories and New Acquisitions

Corinna Gardner & Anna Kallen Talley

The Victoria and Albert Museum holds more than 3,000 objects, dating from 1969 to today, that are considered ‘digital’. However, the mediums and concepts of what we think of as ‘digital’ objects have changed over time, as shifting technologies offer new capabilities for producing digital material culture. The rapid evolution of digital objects poses a particular problem for museums, which often do not have established acquisitions protocols for collecting and preserving digital material. Further, the documentation surrounding the acquisition of contemporary born-digital objects, and best practices for their display and preservation, must be carefully considered given that the technology hosting digital objects will inevitably fail.

Addressing the conference’s sub-themes of collecting and preserving born-digital cultural heritage, this paper discusses a three-month long research project that took place in the V&A’s Design and Digital department between May and August 2023, which explored intersecting concerns between curatorial aims and conservation needs of digital objects at the point of acquisition. This was done through desk-based research on a selection of case studies in the Design and Digital collection alongside semi-structured interviews with museum stakeholders in various departments.

The outcomes of the project included a report that contained a history of collecting digital objects at the V&A, concerns about digital acquisitions and potential solutions. This report provided the theoretical basis for the project’s applied output, a “how-to guide” and questionnaire for the acquisition of digital objects, now under trial use in the museum. More broadly, this project sits alongside several recent initiatives within the V&A to address digital collecting strategies and the stewardship of digital objects.

Beyond File Borders: Digital Preservation of Time-based Art

Joanna Fleming & Lisa Mansfield

When addressing the display and preservation of time-based art, the file is but one part of how people experience the artwork in a museum. How do we best capture the technical information—maps, schematics, colour profiles, pointer files—to ensure the future display of an artwork is considered authentic? When working with external technicians to install and troubleshoot next-generation display technology, what is needed to enable conservation and digital preservation staff to bridge the gap between technical infrastructure, artist’s intent, and preservation needs? The authors will examine these questions in relation to artworks recently displayed in the newly built North Building at the Art Gallery of New South Wales: Retainers of Anarchy, 2017, an algorithmic animation by Howie Tsui; Lineament, 2012, an installation combining software and analogue equipment by Hiraki Sawa; and Groundloop, 2022, a monumental moving-image commission for AGNSW’s 19m x 4.7m LED art wall by Lisa Reihana (Ngā Puhi, Ngāti Hine, Ngāi Tūteauru, Ngāi Tūpoto).

From Centre to Museum: Revisiting CD-ROM Works in Griffith University’s Collection

Angela Goddard & Patrick Lester

Australia has a proud history of innovation in media arts. Several community-based, often artist-run organisations were pioneers of the media arts scene. In Queensland, the Film and Drama Centre’s artist residency program supported the development of many seminal video art works that were subsequently acquired for the collection. The Centre evolved to be Griffith Artworks and is now, over 50 years later, the Griffith University Art Museum (GUAM). This collection and associated archive present an invaluable and extremely rich record of practice. As well as conserving and emulating Griffith University Art Museum’s collection of interactive CD-ROMs, the ARC Linkage Project Archiving Australian Media Arts: Towards a Method and a National Collection has enabled us to reconnect and interview selected artists, inviting them to reflect on the creation of their works and their ongoing preservation.

This paper examines this process through two case studies: works by Linda Dement (Cyberflesh GirlMonster, 1996; Tales of Typhoid Mary, 1996; and In my Gash, 1999) and Planet of Noise, 1997, by Brad Miller and McKenzie Wark. Dement’s works were purchased in 1996 and 2021, and Miller and Wark’s work was purchased in 1998.

Through examining the initial acquisition, ongoing preservation and curatorial history of these works, as well as interviews with the artists and collaborators decades later, this paper discusses the insights into the significance of the works, the artists and the contexts in which they were made, and the wider historical and social value of these works.

Technonecromancy: The Afterlives of New Media Art

René G. Cepeda & Constanza Salazar

New media artifacts live a double life: as actively changing and evolving media while they remain activated, and as a memory of their former selves once faced with deactivation. This deactivation can be a mundane physical one, due to technological obsolescence, in which the work becomes an artifact of its previous life. It can also result from its creator’s desire for impermanence and immediacy of experience. Finally, a work can become obsolete through institutional fossilization.

This paper departs from strictly materialist or ecological understandings of media and their assemblage relations, such as those of Friedrich Kittler (1997), Matthew Fuller (2005), Jane Bennett (2009), and Jussi Parikka (2015). Grounded in texts by Georgia Smithson (2019), Richard Rinehart and Jon Ippolito (2014), Julia Noordegraaf et al. (2013), and Rhizome’s Net Art Archive, it focuses on the preservation and restoration of new media artifacts and their afterlives, and on the importance for artists and institutions of coming to terms with the realities of conservation work.

This paper approaches artifact deactivation from an ontological and material perspective. What does it mean for an artwork to become deactivated? Has it reached the end of its life and moved into an afterlife where only its memory and physical form remain? Or can it be given, metaphorically, new life through conservation methods? What are the ethics of such technonecromancy? Can an artwork be separated from its materiality, a ghost without a shell, and preserved in this way? Or should we allow artworks to pass on, remembering them through the memories created during their lives?

Interaction Interrupted: Writing Failure into Born-digital Art and Design Histories

Katherine Mitchell

In a conference dedicated to born-digital heritage, there is a need to consider the realities of technological malfunction, obsolescence and failure in shaping the interpretation of this period. This paper tests such an interpretation. For this, I draw on my experiences as a PhD researcher at the V&A Museum and share my methodological challenges of studying broken or inaccessible software-based artworks in the V&A collection, now 10-15 years old. I discuss how these uncomfortable (non-)encounters have steered my research towards interrogating thresholds of failure in collections care and display, and to questions of what happens after failure. But circumventing object denial has also led me to find other ways to see through “artefacts of failure”: maintenance reports and other administrative records that are produced in moments of breakdown, and serve as evidential traces that remain afterwards.

In this paper, I spotlight a key research object–the exhibition incident report–as a critical device for writing histories of digital exhibitions. My approach responds both to the proliferation of exhibition documentation produced because of failure and to recent scholarship that alludes to a productive history of failure, for example in Jack Burnham’s 1970 exhibition Software. After testing this approach for later born-digital exhibition histories, and extending failure beyond equipment breakdown to failures of relations, logistics and reception, I conclude by signposting some applications of such research: figuring the thresholds of acceptable failure in exhibition planning today.

Sitting with the inevitability of failure here is not to alleviate institutions of their duties of care. Rather, it is to take failure beyond being a problematic event and a technical conservation problem to be repeatedly deferred, and to make visible the complexity of relations, practices and understandings that constitute an interpretation of “failure” in the context of what memory institutions aim to do.

In the Cloud: Effects of Platformisation in the Preservation Ecosystem of Video Games and Software-based Art

Patricia Falcao

Platformisation, the centralisation of production, distribution and communication in online platforms, is probably the most impactful shift in how cultural artefacts are created, distributed and preserved in the 21st century. There are many platforms with different functions and characteristics, and they have different impacts on the production and preservation ecosystem of an artwork. Examples include social media platforms such as Instagram, Facebook and Twitter/X, which are both a subject and a medium of art; platforms such as Steam, the App Store or GitHub, which are at the core of software distribution; and Discord, Slack and Stack Overflow, where discussions about games and art happen. Even though these platforms are very different, they raise some common concerns for preservation, namely around dependency on an online resource over whose evolution and longevity users have little to no control.

This is true in both the video game and software-based art preservation contexts, and this paper aims to map and contextualise the impacts of platformisation, both negative and positive. Using the examples of the game TerraTech by Payload Studios and the artwork Go Rando by artist Ben Grosser, the paper illustrates the types of platformisation at work at different moments in the life of these objects, from production and distribution through dissemination to preservation, and analyses their effects on the ecosystems surrounding these objects.

The presentation will illustrate how platformisation, and the changes it drives in the ecosystems of artworks, affects preservation, and asks how practice is changing, or how it should be changing, to address the new ecosystems.

Documenting Software. A Case Study from the Singapore Art Museum

Melanie Barrett & Fabiola Rocco

In 2023, the Singapore Art Museum (SAM) acquired three software-based artworks by multifaceted Singaporean artist Ho Tzu Nyen, including his decade-long meta-project Critical Dictionary of South East Asia (CDOSEA). Triggered by these acquisitions, and in view of the upcoming Ho Tzu Nyen solo show at SAM, the Conservation Team embarked on a study focused on unpacking the challenges posed by the unfamiliar medium of software.

Software can play different roles in defining the identity of an artwork, acting as a mere vehicle to achieve a specific effect, or becoming an irreplaceable component of the piece. Understanding the significance of a specific use of software and what the software is designed to “do” in real time becomes critical for evaluating if the software can be modified without compromising the artwork’s identity.

Focusing on two works by Ho Tzu Nyen that employ software in their realization, this presentation aims to identify and compare their significant properties. In the first case, we will propose that when the software acts as a “tool” its absolute preservation is not imperative. Conversely, we will argue that in Critical Dictionary of South East Asia a treatment intervention at the level of the software needs to be weighed carefully and relies equally on the artist’s inputs as well as on the historical context.

Software-based Art Preservation: Embracing the Age of Technological Discontinuities

Morgan Stricot & Matthieu Vlaminck

ZKM, Center for Art and Media, started collecting software-based artworks in 1989, at a time when standardized approaches to managing media and digital art collections did not yet exist. More than thirty years and 200 software-based artworks later, our current preservation workflows reflect what we have learned over the past years, as well as the challenges we are retroactively facing now. Software-based artworks require both software and hardware to be visible or audible, playable, or interactive. The materials, frameworks, tools and libraries on which a piece of hardware and software is built can be defined as its ecosystem, which also defines the inter-dependencies that exist between all these components.

Today, the media-technical ecosystem with which we started the collection is dying. It takes only one component to disappear for an ecosystem to collapse, and the subsequent escalation often results in technological discontinuities. These incompatibilities between two ecosystems often force conservation professionals to imitate the behaviour of the artwork with contemporary technologies, and thus to rewrite it.

It can seem that the preservation of software-based art is an endless process moving inexorably towards the complete rewriting of our media art history. We have come to believe that, instead of fighting a phenomenon beyond our control, we should embrace this new age as an opportunity to learn and to preserve not only artworks but also the cultural context of the history of technology. Following the journey of three artworks facing technological discontinuities, we will show how we used different strategies, from commented code to data extraction and hardware emulation, to perpetuate these artworks and the related knowledge.