Audiovisual Performance
"Into Our Sun"

In cooperation with ONB Labs, a feedback system was developed that translates a set of metadata from scans of historical newspapers (over 1 million IDs) into the audio-visual domain. Only the parameters 'width' and 'height' of the scans were used; all other metadata was omitted. This focus on the quality and exactness of an item's representation (its number of pixels), while abandoning the actual representation, can be read as a reflection on how one perceives, and thereby interprets and creates, history.
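The actual mapping lives in the published Reaktor, Ableton Live, and Processing patches; as a purely hypothetical illustration of translating the two parameters into sound, a minimal sketch might look like this (the scaling ranges, function names, and the width-to-pitch / height-to-duration assignment are assumptions, not the artist's patch):

```python
# Hypothetical sketch: map scan width/height metadata to audio parameters.
# Ranges and mapping choices are illustrative assumptions only.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max]."""
    value = max(in_min, min(in_max, value))  # clamp to the input range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def scan_to_sound(width_px, height_px):
    """Translate a scan's pixel dimensions into pitch and duration.

    width  -> frequency in Hz (wider scans sound higher here)
    height -> note duration in seconds
    """
    freq_hz = scale(width_px, 1000, 8000, 110.0, 880.0)
    dur_s = scale(height_px, 1000, 10000, 0.1, 2.0)
    return freq_hz, dur_s

# Example: one page's metadata record (width, height in pixels)
freq, dur = scan_to_sound(4500, 5500)
```

Feeding the 1M-record stream through such a mapping, one record per event, would yield a sonification driven solely by scan resolution, in the spirit of the piece.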

"Into our Sun" by artist Thomas Wagensommerer was initially shown as an audio-visual performance at the launch event of ONB Labs. Translating metadata from more than 1 million historical newspaper scans, he focused on the parameters of resolution. The documentation as video and soundfiles, the processing patch and the patches created in Reaktor and Ableton Live are published under copy left and are a courtesy by the artist.


Tools used to generate sound and visuals



Processing

Open source programming language and integrated development environment

Native Instruments Reaktor

Closed source audio software

Ableton Live

Closed source audio software

Data sets

The subset of the historical newspapers' metadata used in the AV performance


Metadata Historical Newspapers

Subset of page-based metadata (height and width, dataset 1 to 1M)



The data used is in the public domain. The generated code is MIT licensed.


Complete Audio Project

Reaktor ensemble, used audio samples, Ableton project


Reaktor Ensemble


Complete Video Project

Processing patch, used metadata


Processing Patch