Ascend Spring Studios and move through the movements of Johann Sebastian Bach's “Unaccompanied Cello Suite No. 2 in D Minor,” performed by the legendary Yo-Yo Ma, with augmented reality artwork by Sougwen Chung.
Premiered as an official selection of the Tribeca Film Festival 2019, part of the Immersive ‘Virtual Arcade.’
Press: Wired, Filmmaker
Project Creators: Jessica Brillhart, Igal Nassima
Key Collaborators: Yo-Yo Ma, Sougwen Chung, (Johann Sebastian Bach)
Producers: Max Lauter, Erica Newman
Associate Producer: Corinne Brenner
Sound & Engineering Design: Q Department
Installation Support: Phi Centre
Production Company: Vrai Pictures
Creative Development: Superbright
Lead Engineer: Nate Turley
Special thanks to: Bose AR, Nokia Bell Labs, Sound Postings, Sony Classical
Sound: Steven Epstein
Cast: Yo-Yo Ma
Vessel Orchestra was exhibited at the Met Breuer in the summer of 2019.
“Vessel Orchestra is the first sound-based installation commissioned by The Met. Hybrid by design, it is a musical instrument, a series of live performances, and an installation composed of thirty-two sculptures, utilitarian vessels, and decorative objects from the Museum collection. Selected for their natural pitches, which range from low C to high G on the chromatic musical scale, they form an arresting and unexpectedly versatile instrument, similar to an organ with multiple pipes.
The Vessel Orchestra has two distinct modes. During Museum hours, a pre-programmed audio interface will play a new composition written by Beer, activating the vessels in real time—"player piano-style." On Friday evenings, the exhibition will feature a diverse group of guest artists who will perform new compositions and improvisations on this radical musical instrument in a series of intimate concerts. All performances on Friday evenings.” - The Met Breuer
Work by Oliver Beer
Oliver Beer Studio, Cleo Roberts
System Design: Dave Meckin
Technical Design Lead: Daniel Perlin, Make Good
Design + Production: Max Lauter
Design + Fabrication: Michael Hernandez-Stern
Software: Tommy Martinez
The Met team: Lauren Rosati, Katy Uravitch, Patrick Paine, Brian Butterfield, Xiaoxi Chen, Limor Tomer, Kwabena Slaughter
Photo Credit: Successió Miró/Artists Rights Society (ARS), New York/ADAGP, Paris; Mark Sommerfeld for The New York Times
Watch the film here.
The Material Provenance Project emerged from the Biodesign Challenge (BDC) x Google Biodesign Sprint in 2021, where it was a semifinalist project. Created by Global Listening, the film was produced over 10 days as the culmination of a four-week research and design sprint exploring how biodesign and synthetic biology can help navigate the complexities of global e-waste in support of circular economies.
The Material Provenance Project is an imagined nonprofit organization that serves a global catalog of material data along with an eco-label specification for reuse. Its work is powered by a DNA-based data-encoding system that imbues materials with metadata used to track substance flows worldwide. The organization supports sustainable practices by providing insight into the provenance, properties, and journeys of materials.
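As a rough illustration of the underlying idea—not the project's actual encoding scheme—the sketch below maps arbitrary metadata bytes onto a nucleotide sequence using the common two-bits-per-base convention, so a record could in principle be synthesized into DNA, sequenced, and decoded back to its original form. The sample record and function names are hypothetical.

```python
# Minimal sketch: encode/decode material metadata as a DNA base sequence.
# Illustrative only -- the two-bits-per-base mapping is a textbook convention,
# not the Material Provenance Project's actual encoding scheme.

BASES = "ACGT"  # 2 bits per base: 00->A, 01->C, 10->G, 11->T

def encode_to_dna(data: bytes) -> str:
    """Convert raw bytes to a string of nucleotide bases."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # read the byte two bits at a time
            bases.append(BASES[(byte >> shift) & 0b11])
    return "".join(bases)

def decode_from_dna(sequence: str) -> bytes:
    """Convert a base string (length divisible by 4) back to bytes."""
    out = bytearray()
    for i in range(0, len(sequence), 4):
        byte = 0
        for base in sequence[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    metadata = b'{"material": "ABS plastic", "origin": "hypothetical"}'
    strand = encode_to_dna(metadata)
    assert decode_from_dna(strand) == metadata
    print(strand[:40], "...")
```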
The Team
Max Lauter, Sara Nejad, Chris Lunney, Celeste Rose, Wiena Lin, Tess Adams, Michael Lee
More info here.
antib0dy.club
Check your body at the door.
Antibody Club is a virtual place designed to connect people in a socially-distant world. As a venue and public space, Antibody Club presents curated interactive art and music through a unique platform designed for intimate conversation.
See the full case study here: Antibody Club Case Study
Role: Executive Producer & Artistic Director
Agency: The Umbrella
Winner of the SXSW 2019 Award: Special Jury Recognition – The Future of Experience
TRAVERSE is a platform for spatial audio experiences that makes listening physical. Pairing smartphones with listening devices enhanced with augmented reality (AR) technology, Traverse enables listeners to move through sound. Be on stage, walk up to the members of the band, or explore another world's landscape—from wherever you are.
Traverse brings Elvis’ music to life with “From Elvis in Memphis,” released on the heels of the 50th anniversary of the album of the same name, allowing listeners to engage with these legacy rock ’n’ roll recordings like never before. The platform uses spatial audio technology to create immersive experiences from the multitrack recordings of “Suspicious Minds” and “Power of My Love.” Audiences can move through each of these tracks and feel as if they’re in the presence of the King of Rock ’n’ Roll himself. They can walk toward or away from the singer, the drummer, or the audio engineer, and witness the mix morph and change as they move through it.
Traverse also features “The Arm of InSight,” an experience which launches the listener into NASA’s InSight mission on Mars using publicly-available audio and imagery captured by the lander. Become the Martian lander and assist NASA by deploying the real sensors that InSight uses to gather information on the red planet. Mission Control and audio signals are your guide.
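As a rough sketch of the distance-based mixing described above—not Traverse's actual audio engine—the snippet below attenuates each multitrack stem according to the listener's distance from its virtual source, so walking toward the drummer brings the drums forward in the mix. The stem positions and rolloff curve are illustrative assumptions.

```python
import math

# Illustrative sketch of distance-based stem mixing (not Traverse's engine).
# Each stem has a position in the virtual room; gain falls off with distance.

STEM_POSITIONS = {            # hypothetical source positions in metres
    "vocals": (0.0, 2.0),
    "drums": (3.0, 4.0),
    "engineer": (-4.0, 1.0),
}

def stem_gains(listener_xy, reference=1.0, rolloff=1.0):
    """Return a gain (0..1] per stem using a simple inverse-distance rolloff."""
    gains = {}
    for name, (sx, sy) in STEM_POSITIONS.items():
        d = math.hypot(listener_xy[0] - sx, listener_xy[1] - sy)
        gains[name] = min(1.0, reference / (reference + rolloff * d))
    return gains

if __name__ == "__main__":
    # Listener standing right next to the drum kit: drums dominate the mix.
    print(stem_gains((3.0, 3.5)))
```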
Traverse is available on the Apple App Store; download here.*
*Note: As of March 2019, Traverse works with Bose AR devices (Bose Frames or QC35 II headphones). Compatibility with all headphones and AR-enabled devices is coming soon.
Press: Engadget, Forbes (1), Forbes (2), Variety, SXSW Film Awards
Credits:
Partner - Vrai Pictures
Partner - Superbright
Director - Jessica Brillhart
Sound Design - Antfood
Creative Developer - Superbright
Technical Director - Igal Nassima
Producer - Erica Newman
Lead Engineer - Nate Turley
Engineer - Prashast Thapan
UX Director - Daniel Perlin
Sr. UX Designer - Kaori Ogawa
Project Manager - Max Lauter
Art Direction - The Combination Rule
“Designs for Different Futures brings together some 80 works that address the challenges and opportunities that humans may encounter in the years, decades, and centuries ahead. Organized by and on view at the Philadelphia Museum of Art, the Walker Art Center, and the Art Institute of Chicago.”
Photography courtesy of the Philadelphia Museum of Art.
Produced virtual experience with 360 video and 3D data scanned in Seoul, built and installed system for dome projection and surround sound.
Venue: Philadelphia Museum of Art
Driver Less Vision examines the tension and reality of AI and humans merging and diverging as they negotiate Seoul's unique urban landscape—challenging us to consider how we can design cities for the future of ‘intelligent vehicles.’
Driver Less Vision is the immersive experience of becoming an autonomous, self-driving vehicle. It explores the untapped conflicts and disruptive effects on the built environment caused by the deployment of technologies for autonomous mobility. Currently, the visual stimuli that organize traffic are designed for human perception. The arrival of driverless cars entails the emergence of a new type of gaze that is required to negotiate existing visual codes—omnidirectional yet untrained. To assume that driverless cars will fully adapt to future conditions of the city neglects the history of transformation of urban streetscapes associated with changes in vehicular technologies. Driver Less Vision is an attempt to understand how driverless cars will change the city by immersing the audience in an urban journey through the car’s point of view, seeing the streets of Seoul through overlapping and dissonant perceptions.
The project was originally produced for the Seoul Biennale of Architecture and Urbanism in 2017, utilizing an eight meter diameter dome with 360 visuals developed with the generous support of Ocular Robotics, University of Technology Sydney, and Rice University.
Concept and Design
Urtzi Grau, Guillermo Fernández-Abascal, Daniel Perlin, Max Lauter
Visual / Sound Design + Production
Perlin Studios: Daniel Perlin, Principal and Creative Director; Max Lauter, Creative Producer and Designer; Robert Crabtree, 3D Design; Dan Taeyoung, Code and 3D Design; Gary Breslin, Motion Graphics and Animation
Premiered at Sonar+D 2019. Barcelona, Spain.
Photos by Gadi Sassoon.
Chaos & Order is a live, immersive multichannel experience for Traverse created in collaboration with composer Gadi Sassoon. Audience members participate in the composition by physically moving through the space and interacting with sculptural instruments embedded with sensors measuring distance, touch, and capacitance. These interactions dynamically change the sound and spatial arrangement of the piece, with each person experiencing their own unique physical space.
The work is an interactive, non-linear composition based on a work from Gadi Sassoon’s album ‘Multiverse.’ It explores the space between real and fake, abstract and concrete, with sounds that were created using experimental physical models on a supercomputer, analogue synths, and live strings. The sculptures designed by the composer represent three ideas and sonic qualities: synthetic (the geometric solid), bionic (the robotic hand) and organic (the violin). Multiverse was created in collaboration with the mathematicians of the Next Generation Sound Synthesis project (NESS) and the supercomputing facility EPCC.
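A minimal sketch of the sensor-to-sound mapping described above: raw distance, touch, and capacitance readings from a sculpture are smoothed and normalized into control values that could drive spatial spread, intensity, and timbre. The ranges and parameter names are assumptions for illustration, not the actual Chaos & Order patch.

```python
# Illustrative sketch: normalize sculpture sensor readings into control values.
# Sensor ranges and parameter names are assumptions, not the piece's patch.

def smooth(previous, sample, alpha=0.2):
    """One-pole low-pass filter to tame jittery sensor readings."""
    return previous + alpha * (sample - previous)

def normalize(value, low, high):
    """Clamp and scale a raw reading into the 0..1 range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

class SculptureInput:
    def __init__(self):
        self.distance = 0.0     # smoothed proximity, 0 = far, 1 = near
        self.touch = 0.0        # smoothed touch pressure
        self.capacitance = 0.0  # smoothed body capacitance

    def update(self, distance_cm, touch_raw, cap_raw):
        self.distance = smooth(self.distance, 1.0 - normalize(distance_cm, 10, 200))
        self.touch = smooth(self.touch, normalize(touch_raw, 0, 1023))
        self.capacitance = smooth(self.capacitance, normalize(cap_raw, 0, 4095))
        # Hypothetical mapping: proximity widens the spatial spread,
        # touch raises intensity, capacitance shifts timbre.
        return {
            "spread": self.distance,
            "intensity": self.touch,
            "timbre": self.capacitance,
        }

if __name__ == "__main__":
    violin = SculptureInput()
    print(violin.update(distance_cm=40, touch_raw=600, cap_raw=2100))
```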
TRAVERSE
Chaos & Order
Producers: Vrai Pictures and Superbright
Creators: Jessica Brillhart, Igal Nassima, Gadi Sassoon
Producer: Max Lauter
Thanks to curators Jose Luis de Vicente, Jeremy Boxer, and the Sonar team.
AR effect and animated 360 background for Facebook.
To promote World Mental Health Day, BUCK created a meditative 360 background and AR effect for Facebook Messenger. Anyone around the world could chat with their friends in a colorful, psychedelic dreamscape with characters doing all sorts of healthy activities like reading, yoga, sports, and more.
The AR effect features a “breathing buddy” that invites you and your friends to a guided breathing meditation. If you breathe in sync with the character as it inhales and exhales, an aura of trippy colors will appear around you, signaling your mindful state.
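A hedged sketch of how a breath-sync check like this could work: compare a user breath signal against the character's inhale/exhale cycle and trigger the aura once they stay correlated for a few seconds. This is purely illustrative and not the effect's actual implementation; the breath signal, thresholds, and timing are assumptions.

```python
import math

# Illustrative sketch: detect whether a user's breathing stays in phase with
# the on-screen character's inhale/exhale cycle. Thresholds and the breath
# signal itself are hypothetical, not the actual Facebook AR effect.

CYCLE_SECONDS = 8.0        # character inhales for 4 s, exhales for 4 s
SYNC_THRESHOLD = 0.8       # correlation needed to count as "in sync"
HOLD_SECONDS = 6.0         # stay in sync this long to trigger the aura

def character_phase(t):
    """Character breath as a value in [-1, 1]: +1 fully inhaled."""
    return math.sin(2 * math.pi * t / CYCLE_SECONDS)

def in_sync(user_samples, times):
    """Normalized correlation between user breath samples and the character."""
    ref = [character_phase(t) for t in times]
    num = sum(u * r for u, r in zip(user_samples, ref))
    den = math.sqrt(sum(u * u for u in user_samples) * sum(r * r for r in ref))
    return den > 0 and num / den > SYNC_THRESHOLD

if __name__ == "__main__":
    times = [i * 0.25 for i in range(int(HOLD_SECONDS / 0.25))]
    user = [math.sin(2 * math.pi * t / CYCLE_SECONDS + 0.1) for t in times]
    print("aura on!" if in_sync(user, times) else "keep breathing...")
```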
Verizon souped up an immersive theater aboard a bus to help tell the story of how 5G is enabling the next generation of technologies. Envisioned as a business-to-business experience, the mobile theater is equipped with five 100” 4K screens that create a 270-degree interactive video wall. Each vignette documents how various industries will push the limits with 5G, from first responders and firefighters to television broadcasters at sporting events.
Film and animation by BUCK
Tech by Alexander Rea, Alex Nguyen [Oddleg]
Fabrication by A Standard Transmission
Sound design by Antfood
Agency: Civic
The Los Angeles River, designed and built by the Army Corps of Engineers in the 1930s to protect the city against destructive floods, spans 51 miles. Frank Gehry’s architecture firm has taken on the massive urban development project of the ‘river master plan,’ focusing on revitalization and naturalization of the riverfront.
“Rio de Los Angeles” is an augmented reality app that tells the river’s story from the prehistoric era to the present, focusing on the river’s development through urban sprawl and the complexities of its design. The app leverages AR to show animated visualizations of GIS data on population, pollution, transportation, demographics, environmental conditions, and more, contextualizing the importance of the river to the city of LA and its local communities.
Data was drawn from a variety of sources, including historical archives and the LA River Index, a research database assembled by River LA, one of the many groups involved in the planned redevelopment of the river.
Created by:
Vrai Pictures
Superbright
RYOT
The DRONE RACING LEAGUE premiered ‘DRL Drone Duels’ at its World Championship event at Chase Field. The mobile application engaged over 4,000 attendees in a live interactive game, inviting visitors to compete to guess the winning pilots during the race.
Design / Development by Superbright
Igal Nassima, Max Lauter, Mark Fingerhut, Alex Olmstead, Kaori Ogawa
Designed, programmed, and built playback system for "5D cinema" exhibition and film shown at the Museum of Modern Art (MoMA) and LUMA Foundation in Arles and Zurich. Cinema included synchronized multichannel holographic video, surround sound playback, lighting fx, scent, and wind. Engineered spatial sound design with Sonic Platforms.
Currently on view at the Museum of Modern Art (MoMA); click here for info.
MoMA, Oursler Studio (US), LUMA Foundation (FR, ZH).
Created with Sonic Platforms [Partners: Max Lauter, Michael Christopher, Melodie Yashar]
Measure, an exhibition at the Storefront for Art and Architecture, invited 32 architects and 5 artist groups to produce an artwork that challenges methods of architectural representation, data visualization, and quantification to trace, map, and react to the role of information in public and private space.
Installations featured work by Ekene Ijeoma, Giorgia Lupi + Stefanie Posavec, + POOL, Citygram (Tae Hong Park, Evan Kent, Sean Lee, Min Joon Yoo), and Landscapes of Profit (Dan Taeyoung, Caroline Woolard, Chris Henrick, John Krauss, Ingrid Burrington).
Featured projects:
InSeE’
InSeE’ (Interactive Soundscape Environment) focuses on the sonification and visualization of soundscape data captured by an urban sensor network. The project aims to create real-time, dynamic “soundmaps” that augment existing digital cartographic technologies. In this piece, InSeE’ zooms into Storefront for Art and Architecture’s walking area (interior and exterior) to capture soundscape information—noise and spatial-acoustic energies—through immediate, short-term, and long-term dynamic mapping strategies. By “sonifying” real-time noise data into harmonic spectrums that trail the sounds of the streetscape, the installation aims to bring awareness to spatio-temporal and non-ocular measurements through artistic media enabled by a series of sensors located in the gallery.
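A minimal sketch of the kind of mapping described above—illustrative only, not the installation's actual sonification—maps a measured noise level onto the amplitudes of a harmonic spectrum, so louder streetscapes ring with more and brighter partials. The fundamental frequency, partial count, and scaling below are assumptions.

```python
import math

# Illustrative sketch: map a measured noise level (dBA) to the amplitudes of
# a harmonic spectrum. Fundamental, partial count, and scaling are assumed.

SAMPLE_RATE = 44100
FUNDAMENTAL_HZ = 110.0
NUM_PARTIALS = 8

def harmonic_amplitudes(noise_db, quiet_db=35.0, loud_db=85.0):
    """Louder soundscapes excite more (and brighter) partials."""
    level = max(0.0, min(1.0, (noise_db - quiet_db) / (loud_db - quiet_db)))
    # Partial n gets weaker with order, stronger with the measured level.
    return [level ** (1.0 + 0.3 * n) / (n + 1) for n in range(NUM_PARTIALS)]

def render(noise_db, seconds=1.0):
    """Render a mono buffer of the harmonic spectrum for one noise reading."""
    amps = harmonic_amplitudes(noise_db)
    samples = []
    for i in range(int(SAMPLE_RATE * seconds)):
        t = i / SAMPLE_RATE
        samples.append(sum(a * math.sin(2 * math.pi * FUNDAMENTAL_HZ * (n + 1) * t)
                           for n, a in enumerate(amps)))
    return samples

if __name__ == "__main__":
    quiet, rush_hour = harmonic_amplitudes(40), harmonic_amplitudes(80)
    print([round(a, 3) for a in quiet])
    print([round(a, 3) for a in rush_hour])
```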
Dear Data
Dear Data is a year-long analog data drawing project by Giorgia Lupi and Stefanie Posavec, an Italian and an American who switched continents to live as expats in New York and London, respectively. Stefanie and Giorgia met only twice before beginning this project, and it became a way for them to get to know each other. Every week, they each collect and measure a particular type of personal data. They then each use this data to make a drawing on a postcard and drop it into an English “postbox” (Stefanie) or an American “mailbox” (Giorgia). Eventually, each postcard arrives at the other person’s address featuring the scuff marks of its journey over the ocean: a type of “slow data” transmission. In contrast to mechanical and impersonal gathering of data, Dear Data proposes a slow, manual, deliberately limited, and analog approach.
Drawings by:
The Architecture Lobby, Barozzi / Veiga, Víctor Enrich, Fake Industries Architectural Agonism (Urtzi Grau, Cristina Goberna) and Georgia Jamieson, FIG Projects, FleaFollyArchitects, Formlessfinder, Michelle Fornabai, Grimshaw Architects, Steven Holl, Bernard Khoury, Kohn Pedersen Fox Assoc., KUTONOTUK (Matthew Jull + Leena Cho), Erika Loana, Jon Lott / PARA Project, MAIO, m-a-u-s-e-r (Mona Mahall + Asli Serbest), MILLIØNS (John May + Zeina Koreitem), Nicholas de Monchaux, Anna Neimark and Andrew Atwood / First Office, pneumastudio (Cathryn Dwyre + Chris Perry), + POOL, James Ramsey, RAAD Studio, Reiser + Umemoto, Mark Robbins, Selldorf Architects, Malkit Shoshan, Nader Tehrani / NADAAA, Urban-Think Tank, Anthony Titus, Ross Wimer, James Wines
Produced virtual experience with 360 video and 3D data scanned in Seoul, built and installed system for 24 foot diameter dome projection and surround sound.
Press: Arch Daily, designboom, Urban Next, psfk, Trend Hunter
Venue: Seoul Biennale of Architecture and Urbanism
Driver Less Vision examines the tension and reality of AI and humans merging and diverging as they negotiate Seoul's unique urban landscape—challenging us to consider how we can design cities for the future of ‘intelligent vehicles.’
Driver Less Vision is the immersive experience of becoming an autonomous, self-driving vehicle. It explores the untapped conflicts and disruptive effects on the built environment caused by the deployment of technologies for autonomous mobility. Currently, the visual stimuli that organize traffic are designed for human perception. The arrival of driverless cars entails the emergence of a new type of gaze that is required to negotiate existing visual codes—omnidirectional yet untrained. To assume that driverless cars will fully adapt to future conditions of the city neglects the history of transformation of urban streetscapes associated with changes in vehicular technologies. Driver Less Vision is an attempt to understand how driverless cars will change the city by immersing the audience in an urban journey through the car’s point of view, seeing the streets of Seoul through overlapping and dissonant perceptions.
The project was produced for the Seoul Biennale of Architecture and Urbanism in 2017, utilizing an eight meter diameter dome with 360 visuals developed with the generous support of Ocular Robotics, University of Technology Sydney, and Rice University.
Concept and Design
Urtzi Grau, Guillermo Fernández-Abascal, Daniel Perlin
Visual / Sound Design + Production
Perlin Studios: Daniel Perlin, Principal and Creative Director; Max Lauter, Creative Producer and Designer; Robert Crabtree, 3D Design; Dan Taeyoung, Code and 3D Design; Gary Breslin, Motion Graphics and Animation.
JB1.0: Jamming Bodies is an immersive installation that transforms Storefront’s gallery space into a laboratory. The installation, a collaboration between science fiction artist Lucy McRae and architect and computational designer Skylar Tibbits with MIT’s Self-Assembly Lab, explores the relationship between human bodies and the matter that surrounds them.
JB1.0: Jamming Bodies collapses architecture, technology, and art into a single object. While skin usually demarcates the transition between exterior and interior, this experimental installation transforms skin into a membrane that operates as both. A threshold toward a space of total interiority or total exteriority, JB1.0 is an animate continuum that simultaneously embraces and modifies human bodies and space. Combining the plasticity of mutable organisms with the rigidity of architectural forms, JB1.0 brings architecture and its subject into a single space. A breathing, morphable wall, JB1.0 animates the building enclosure by absorbing and expelling the atmosphere around it while compressing the bodies with which it interacts.
Exhibition production and original sound design. Exhibited in 2016 at Het Nieuwe Instituut (Rotterdam, Netherlands) and in 2017 at Storefront for Art and Architecture (New York, USA).
Curators: Farzin Lotfi-Jam, Mark Wasiuta. Exhibition Design: Sharif Anous, Farzin Lotfi-Jam, Mark Wasiuta. Graphic Design: MTWTF. Sound Design: Sonic Platforms (Michael Christopher, Max Lauter)
"Rio de Janeiro is one of the most visible sites of “smart city” experimentation. In response to catastrophic natural disasters, calamitous traffic congestion, and urban health epidemics, the Centro de Operações Rio (COR) was designed as a corrective tool and as a new command and control hub that would allow the city to prepare for the 2016 Olympic Games. Launched in 2010, COR now monitors its urban camera network and information sensors, gauges optimal traffic patterns, determines landslide risk zones, predicts weather disruptions, and maps disease paths." - Storefront for Art and Architecture
Closed Worlds, curated by Storefront for Art and Architecture and Lydia Kallipoliti, exhibits an archive of 41 historical living prototypes built over the last century that present an unexplored genealogy of closed resource regeneration systems.
The exhibition features Some World Games, a virtual reality ecosystem by Farzin Farzin that presents a contemporary 42nd prototype. Some World Games, the winning installation of the Closed Worlds Design Competition, is an immersive environment that urges visitors to explore and experiment with virtual prototypes generated from the archive of 41 closed systems exhibited as part of the larger Closed Worlds exhibition. Participants are guided through the installation on a looped track that channels their kinetic motion through an orbiting virtual environment.
"Lake Gilmore"
Virtual Reality music video with live and rendered 360-video production, original video, 3D content, and spatial audio mix. Released in 2017. Filmed at National Sawdust.
A reference to Lake Gilmore in Wisconsin, where No Regular Play’s Greg Paulus spent time with his family, the track evokes nostalgia for both a place of comfort and the experiences shared with loved ones. In producing a narrative for the original video content, we wanted to blend a sense of personal memory with the out-of-body experience that accompanies joy, remembrance, and loss. For the live show at National Sawdust, a visual set was prepared that blends Rorschach-like introspection with a bird’s-eye topology—primarily original video shot above New York City and found drone footage of aerial perspectives of Lake Gilmore—to give this simultaneous inward and outward perspective. We recorded 360-degree video of the performance, along with binaural audio, to blend these spaces.
Created with Sonic Platforms [Partners: Michael Christopher, Max Lauter, Melodie Yashar, + Spencer Kohn.]
Exhibition Design: Inaba Williams, MTWTF. Fabrication: Kin and Company. Video System Design: Sonic Platforms. Exhibited at A/D/O (Amalgamated Drawing Office) in Greenpoint, NY.
Cover Image/GIF: Kin and Company.
"Inaba Williams and MTWTF have produced ‘Utopia–Dystopia,’ an exhibition highlighting the vital role design plays now that we are entirely supported by technology. The multi-channel video installation presents snapshots of our technology-centered existence, acting as a prismatic interface with the spectrum of utopian and dystopian images we encounter at every moment in every place. The spaces, products, identities, and experiences offered up to visitors by these two New York-based firms are meant to ask, ‘What possible futures should designers propose at a time when to be human, to be true to ourselves, is to be completely one with technology?"" - Inaba Williams
Created with Sonic Platforms [Partners: Max Lauter, Melodie Yashar, Michael Christopher]
Brain Dead x Sonic Platforms Drop 2 LookBook
Editorial Photography, 3D capture, animation, GIF w/ Spencer Kohn (Direction/Actor/Model)
View Brain Dead '15 Spring/Summer "Drop 2" Lookbook
"psychedelic 3D rendered lookbook reminiscent of a fever dream shot solely on a 3D scanner. " - HypeBeast
Created with Sonic Platforms [Partners: Max Lauter, Melodie Yashar, Michael Christopher]
An exhibition by Max Lauter and Jonathan Peck at ShapeShifter Lab in Brooklyn, NY. A live performance and installation featuring original real-time video processing, sculpture, painting, improvisational dance, and musical performance.
Music by Matt Garrison. Dance by Natalie Walters, Sylvana Tapia, and Tommaso Petrolo.
Curatorial Statement
Through a playful exchange of characteristics between subjects and objects, this exhibition explores how decisions that are made in interactive moments allow for new creative modes and aestheticized communication.
Who is the object and what is subject? What is the object and who is the subject? How might a machine see, or interpret motion, sound, light, and in turn draw, paint, or build for the aesthetic of other machines rather than human counterparts?
These objects, as sculptures, are understood only through the consequence of our interaction with them. The three scenes, represented by primary colors, are the product of a complex and empathetic engagement with material: a physical manifestation of geometry; a scalable reflection of human proportion and action; an aesthetic partitioning of embodied subjects into objects.
As a scenario for interaction, participants experience this new system. However, the participants create the objects’ history as much as the objects create a history for the persons involved – each leaves a trace in the system, to be presented within its components after they are gone.
The present affords every entity a concurrent reality. An experience of consciousness, organic or digital, offers a sensibility that is inimitable, potentially incommunicable, yet unique in its existence.
Interferometry is the experimental measurement of displacement through the constructive and destructive interference of waveforms. Indices of refraction can be gathered from light, as in holography, as well as from sound. For this project, the aim was to develop a system that would visualize the interfering trajectories of a grid of moving speakers, with the resulting vectors and paths produced by the frequency assigned to each source.
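A minimal sketch of this visualization idea, assuming a small grid of sources each emitting its own frequency: the interference field at a point is the sum of the sources' waves, and a finite-difference gradient gives a rough "resulting vector" at that point. The source layout, frequencies, and wave model are illustrative assumptions, not the original system.

```python
import math

# Illustrative sketch: sample the interference field produced by a grid of
# speakers, each assigned its own frequency. Source layout, frequencies, and
# the simple 1/r wave model are assumptions for illustration only.

SPEED_OF_SOUND = 343.0

SOURCES = [                      # (x, y, frequency in Hz)
    (-0.5, -0.5, 220.0),
    ( 0.5, -0.5, 275.0),
    (-0.5,  0.5, 330.0),
    ( 0.5,  0.5, 440.0),
]

def field(x, y, t):
    """Sum of waves (1/r falloff) from every source at point (x, y), time t."""
    total = 0.0
    for sx, sy, freq in SOURCES:
        r = max(math.hypot(x - sx, y - sy), 1e-3)
        k = 2 * math.pi * freq / SPEED_OF_SOUND
        total += math.sin(2 * math.pi * freq * t - k * r) / r
    return total

def gradient(x, y, t, eps=1e-3):
    """Finite-difference gradient: a rough 'resulting vector' at (x, y)."""
    gx = (field(x + eps, y, t) - field(x - eps, y, t)) / (2 * eps)
    gy = (field(x, y + eps, t) - field(x, y - eps, t)) / (2 * eps)
    return gx, gy

if __name__ == "__main__":
    # Sample a coarse 5x5 grid at t = 0 to see constructive/destructive zones.
    for j in range(5):
        row = [field(-1 + i * 0.5, -1 + j * 0.5, 0.0) for i in range(5)]
        print(" ".join(f"{v:6.2f}" for v in row))
    print("gradient at origin:", gradient(0.0, 0.0, 0.0))
```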
Created with Sonic Platforms [Partners: Max Lauter, Melodie Yashar, Michael Christopher]
Commissioned to design, produce, and install a real-time virtual reality environment and original 3D video for the Urban Outfitters and Champion launch of a collaborative clothing line with WOODWOOD, Craig Green, and Timo Weiland.
Using a mobile 3D scanner and a custom Unity engine built with Arcadia, event guests were scanned and imported in real time into a shared virtual environment.
Fashion Times / PAPER Mag / Space 98
Created with Sonic Platforms [Partners: Max Lauter, Melodie Yashar, Michael Christopher]
Meditation Technology. [Computer, sheet metal, wood, cables, piezo microphones, Pure Data software, speakers, audio.]
This interactive installation creates an environment of simultaneous relaxation and heightened awareness. Slight physical motion creates change in the ebb and flow of spatialized sonic harmonies, providing participants audible feedback on their current state of focus. The space is filled with the sound of crystal crucibles being struck or rubbed, inviting participants to experience the interplay of frequencies and beat patterns in the air as the sound slowly pans across four speakers in the corners of the room. Participants are invited to sit on four metal mats positioned in the four directions on the floor, each with a piezo microphone sending an audio signal to a computer for processing. Through a custom software patch, the computer measures in real time the amount of movement and restlessness of the participant on each mat. Based on this varying level of energy, the sound spreads further across the four speakers or, in cases of more disruptive movement, begins to distort. Participants quickly become aware of their own movement and the movement of their three fellow co-meditators.
Through this sonic feedback, participants ascertain the subtleties of both their personal and shared space. Focused, still meditation provides tonal clarity, while disruption increases noise in the environment. Meditation Technology enables collective awareness through a technologically mediated system.
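The installation's processing ran as a custom software patch (Pure Data, per the materials list above); the Python sketch below illustrates the mapping logic under stated assumptions: the short-term energy of a mat's piezo signal drives spatial spread, and only energy past a threshold introduces distortion. Thresholds, smoothing, and parameter names are hypothetical.

```python
# Illustrative sketch of the Meditation Technology mapping: a piezo signal's
# short-term energy drives spatial spread and, past a threshold, distortion.
# Thresholds and curves are assumptions; the installation used Pure Data.

def rms(block):
    """Root-mean-square energy of one block of piezo samples."""
    return (sum(s * s for s in block) / len(block)) ** 0.5

class MatProcessor:
    def __init__(self, spread_scale=4.0, distortion_threshold=0.5):
        self.energy = 0.0
        self.spread_scale = spread_scale
        self.distortion_threshold = distortion_threshold

    def process(self, block):
        # Smooth the per-block energy so brief taps don't jolt the sound field.
        self.energy = 0.9 * self.energy + 0.1 * rms(block)
        spread = min(1.0, self.energy * self.spread_scale)   # 0 = one corner, 1 = all four
        over = max(0.0, self.energy - self.distortion_threshold)
        distortion = min(1.0, over * 2.0)                     # only restless movement distorts
        return {"spread": spread, "distortion": distortion}

if __name__ == "__main__":
    mat = MatProcessor()
    still = [0.01, -0.02, 0.01, 0.0] * 64
    restless = [0.6, -0.7, 0.8, -0.5] * 64
    print("still   :", mat.process(still))
    for _ in range(30):                      # sustained fidgeting raises the energy
        state = mat.process(restless)
    print("restless:", state)
```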
The sounds featured in this piece are recorded from a by-product of technological innovation in the silicon industry: the ringing crucibles heard here, similar to crystal bowls, were used to forge microprocessors in California over 20 years ago.
Special thanks to Dan Lauter, Liz Phillips, Pall Thayer, and Seth Powsner.
ABSTRACT
Auralization connotes the imagining of an aural event, distinct from sonification, which is a method of representing information by mapping data to a composition of audible signifiers. Aural systems, built upon structures in psychoacoustics, provide a set of practical and conceptual tools that inform our sonic imaginary. New notions of transmission, translation, and fidelity are found at the intersection of computer music and information display. These artistic processes offer hybrid communicative capacities through their interfaces, which span built and virtual environments. Operating on the outer thresholds of perception and calculation, the efficacy of these intermodal strategies is contextualized by concepts of noise found in the aesthetic discourse of ‘glitch.’ Glitch refers to an unpredictable error, but it has become increasingly unclear whether it occurs due to external systemic breakdown or internal sensory capacities. A perceptual hiccup may just as easily be the artifact of computational error or compression as a demarcation of individual thresholds for detecting difference. White noise is seemingly the most unique sound in its complete variety, but to human ears its nuance is imperceptible. By making noise legible, this thesis constructs an aural ecology for expanding semantic and aesthetic discourse within the International Community for Auditory Display. A critical archeology of methods of transmission, sonification, and audiovisual art provides a historical framework expanding both design and curatorial practice.
Hyperminimalist work by composer Ryoji Ikeda provides a case study for art practices that disseminate popular notions of glitch through audiovisual information display. Ikeda constructs immersive environments of auditory display within museum space and the urban stage of the city. In this context, noise is the materia prima of theories of information and performance practices, inscribed with the effects of entropy and carrying embedded, masked meaning. This project seeks a critical language which expands notions of auditory display in order to examine the productivity of noise-based methodologies—artifacts of glitch being the referent and object for the perception of difference. How are we to understand the cross-disciplinary influence of auralization relative to the social aspects of perceptual capital and cultural capital? What are the functional implications of interfaces for auditory display in institutionalized art spaces and public settings? Striving for a lossless experience is impossible due to an aural architecture modulated ontologically by filtering and error. Steganography is the art of concealing, embedding one sound, image, or file within another. Communication failure can occur due to intended encryption or unexpected interference in exchange. This loss—a system’s lossiness—can be a source of production; one always mediated by unavoidable characteristics of globalized information flow, cultural politics, and sonic perception. Application of these notions to scientific and artistic practice is paramount if we are to decode the future city.
M.S. Thesis presented to the Faculty of the Graduate School of Architecture, Planning, and Preservation, Columbia University, in candidacy for the degree of Master of Science in Critical, Curatorial, and Conceptual Practices in Architecture
TRANS, an event celebrating transcultural practices in art and architecture. Produced a 360-degree timelapse video from the top of 432 Park, designed by architect Rafael Viñoly, and video animations for the event identity.
Video production: Pomp & Clout. Sound Design: Daniel Perlin, Gabe Liberti, Dave Rife. Graphic Identity: Pentagram (Natasha Jen, Jang Hyun Han. Animated TRANS Graphics: Max Lauter.
A mobile laboratory traveling around the world to inspire innovative ideas for urban design and urban life. A co-initiative of the Solomon R. Guggenheim Foundation and the BMW Group.
Curated by David van der Leer, Maria Nicanor, Amara Antilla, and Stephanie Kwai.
http://www.bmwguggenheimlab.org/
Role: Lead Theater AV Specialist