
Inter(Face)

Inter(Face) is intended to be a ludic approach to thinking through our relationships with devices, networks, and each other. Based on the writings of New Media thinkers Alexander Galloway and Wendy Hui Kyong Chun, Inter(Face) highlights our partial perspectives within networked technologies, the work of translation we do with our machines to become legible in those networks, and the vulnerable position we place ourselves in by participating in network protocols.

 

Inter(Face) launched as an installation on December 7th, 2022 to a select audience.

Skills Utilized:

Processing

3D Modeling

Form Analysis

Arduino

Electronics System Design

Narrative Development/Storytelling

Background

New Media scholars observe that "good HCI principles" can hide the inner workings of device functionality, a phenomenon Alexander Galloway calls "The Interface Effect." Galloway asserts that what we, as users, interact with is not truly an interface but an "intraface" – a way for computers to communicate internally, rather than any direct relationship with the user. Users, likewise, intraface their will through clicks, swipes, and key presses, and it is when these translations prove insufficient that we notice them most acutely.


According to Wendy Hui Kyong Chun in "Control and Freedom," our devices are "promiscuous," sending our data through network nodes we may never notice. Digital networks are inherently vulnerable, for one cannot peer into the network without the network peering back. Rather than advocating for tighter control of information, however, Chun argues that a universal sense of vulnerability can create a space for politics and social connection.

Aesthetic experiences offer different methods of apprehending and interfacing with networks in both their form and their function. It is in that spirit that this project emerges: an attempt to provide an aesthetic experience that allows users to think through Galloway's Interface Effect and the "wonderful creepiness" described by Chun, building toward a sense of empathetic vulnerability.


Installation

Inter(Face) features two black-box booths that encourage play and experimentation with a variety of physical interaction artifacts. Users have a short window in which to create an image, contorting their bodies to cope with the unorthodox placement of inputs and the need for simultaneous interactions. The lack of information and the time limit produce a panicked experimentation that aligns with "The Interface Effect," where machine and human thought must intraface in order to interface.


After the time limit, users are shown snapshots taken by a camera system while they were experimenting in the booths. Each user is shown an image from the other's booth, creating a shared moment of being "exposed" by the machine. This moment is intended to facilitate positive social relations between the users through a mechanical compromise of information that they had no part in, as described in Chun's work. It is an opportunity to stand with each other, forming a social relation against the baffling incomprehensibility of machine and network.
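
As a rough illustration of this snapshot mechanic, the Processing sketch below captures candid frames at random moments during a timed session and reveals an image from the paired booth once time expires. The filenames, session length, and timing here are assumptions for illustration, not our production code.

    // Hypothetical sketch of the snapshot mechanic (illustrative, not exact):
    // capture candid frames during the timed session, then reveal the
    // saved image from the other booth once the session ends.
    import processing.video.*;

    Capture cam;
    int sessionEnd;                  // when the timed session expires (ms)
    int nextSnapshot;                // when the next candid snapshot fires (ms)
    boolean revealed = false;
    PImage otherBoothImage;          // snapshot loaded from the paired booth

    void setup() {
      size(640, 480);
      cam = new Capture(this, 640, 480);
      cam.start();
      sessionEnd = millis() + 60 * 1000;                  // assumed 60 s session
      nextSnapshot = millis() + int(random(5000, 15000)); // fire at a random moment
    }

    void draw() {
      if (cam.available()) cam.read();

      // While the session runs, quietly save snapshots at random intervals.
      if (!revealed && millis() < sessionEnd && millis() > nextSnapshot) {
        cam.save("booth_A_snapshot.png");                 // illustrative filename
        nextSnapshot = millis() + int(random(5000, 15000));
      }

      // When time is up, display the snapshot taken in the other booth.
      if (!revealed && millis() > sessionEnd) {
        otherBoothImage = loadImage("booth_B_snapshot.png");
        revealed = true;
      }

      if (revealed && otherBoothImage != null) {
        image(otherBoothImage, 0, 0, width, height);
      } else {
        image(cam, 0, 0);                                 // live feed during the session
      }
    }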

After exiting the booths, users view a projection of their combined artwork displayed in a public gallery space, which invites onlookers to become part of the network. Chun notes that network perspectives are always partial, and as such only the users can parse out their individual contributions to the images. For onlookers, the image is the materialization of a social relation – a lone digital artifact of a mechanically-built relationship, one built out of partiality, (il)legibility, and vulnerability. 


Early Concept Sketches


Interaction Booth/Exhibition Space Ideation


Interactive Artifacts/Interaction Method Ideation

Reflective Design Considerations

Through a series of reflective design activities, such as bodystorming, we critically examined the parameters of Inter(Face). These exercises surfaced the critical assumptions behind Inter(Face)'s foundation, such as the degree of mobility expected from users, the size constraints of such an exhibit, and the types of interactions that would be appropriate from an ergonomics perspective.


Bodystorming exercise focused on determining the size constraints of the exhibit and the form factors of the interaction components


CAD Mockup

We designed the entire booth in Fusion 360 with accurate dimensions and materials prior to construction. Doing so gave us a sense of relative scale and of the positive/negative space within the booths, ensuring the layout was neither too sparse nor too cluttered. It also gave us room to design and experiment with our interaction artifacts, which we later 3D printed and laser cut from these CAD files.


Booth Fabrication

Constructing the booths involved cutting 2x4 and 2x2 scrap lumber to planned dimensions. We assembled two 6 ft x 4 ft bases, then attached 4 ft members to each corner of both bases, creating the lower and upper halves of the booths. Afterwards, we joined the two halves.

All of the members were joined with 3D-printed brackets, which provide greater rigidity at a lower cost than metal brackets.

Cutting and Prepping Lumber

Assembling Lower and Upper Halves of Booth

Joining Lower and Upper Halves of Booth

Interaction Artifact Development

The interaction artifacts were designed in Adobe Illustrator and Fusion 360, then prototyped through a combination of 3D printing and laser cutting. They were mounted to a series of wooden panels that were then screwed onto the booth frames. Each artifact transformed mechanical input into digital output through a variety of sensors, such as potentiometers, force sensors, and push buttons; a sketch of how those readings reached the software follows below.
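
The snippet below is a minimal Processing/Firmata read loop showing how such sensor values can reach the software. The pin assignments and normalization are illustrative assumptions, not our actual wiring.

    // Minimal Firmata read loop; pin numbers are assumptions, not our wiring.
    import processing.serial.*;
    import cc.arduino.*;

    Arduino arduino;
    final int POT_PIN = 0;       // potentiometer on analog pin A0
    final int FORCE_PIN = 1;     // force sensor on analog pin A1
    final int BUTTON_PIN = 2;    // push button on digital pin 2

    void setup() {
      // Connect to the first serial port; StandardFirmata talks at 57600 baud.
      arduino = new Arduino(this, Arduino.list()[0], 57600);
      arduino.pinMode(BUTTON_PIN, Arduino.INPUT);
    }

    void draw() {
      // Analog inputs arrive as 0-1023; normalize to 0.0-1.0 for the visuals.
      float pot     = arduino.analogRead(POT_PIN) / 1023.0;
      float force   = arduino.analogRead(FORCE_PIN) / 1023.0;
      boolean press = arduino.digitalRead(BUTTON_PIN) == Arduino.HIGH;
      // ...these values then drive the visualization's parameters...
    }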

3D Printing Interaction Components

Laser Cutting Button Caps and Step Pads

Prepping Components for Spray Painting

Painting and Adhering 3D Printed Components


Connecting and Testing Electronic Components to Physical Artifacts

Prepping Booth Walls for Component Mounts

Booth Wall Setups

Attaching Walls to Booth Frame

Creative Coding

The visualization's early stages centered on brainstorming simple visualization techniques so as not to overload the user with feedback. We worked through several hand-drawn visualizations of curve forms, polygonal compositions, and both two-dimensional and three-dimensional output systems to evaluate which would give users the clearest balance of control and confusion. Ultimately, we chose a visualization in which users control a growing series of connected curve forms whose parameters can be altered through input; a rough sketch of the idea follows below.
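
As a playable approximation of that idea, the Processing sketch below grows a connected Catmull-Rom curve from a noise-driven wandering cursor. The cursor stands in for user input, and the parameter names (stroke size, hue, mirroring) are illustrative rather than our production code.

    // Illustrative approximation of the curve-form visualization.
    ArrayList<PVector> points = new ArrayList<PVector>();
    PVector cursor;
    float strokeSize = 2;      // altered by a potentiometer in the installation
    int hueValue = 200;        // altered by another physical input
    boolean mirrorX = false;   // reflection along the vertical axis

    void setup() {
      size(800, 800);
      colorMode(HSB, 360, 100, 100);
      noFill();
      cursor = new PVector(width / 2, height / 2);
    }

    void draw() {
      background(0);

      // The form grows over time: a wandering cursor appends a control point
      // every few frames, producing a continuous connected curve.
      cursor.x = constrain(cursor.x + map(noise(frameCount * 0.01), 0, 1, -4, 4), 0, width);
      cursor.y = constrain(cursor.y + map(noise(1000 + frameCount * 0.01), 0, 1, -4, 4), 0, height);
      if (frameCount % 5 == 0) points.add(cursor.copy());

      stroke(hueValue, 80, 100);
      strokeWeight(strokeSize);
      drawCurve(points, false);
      if (mirrorX) drawCurve(points, true);
    }

    // Draws one Catmull-Rom curve through all accumulated points,
    // optionally reflected across the vertical axis.
    void drawCurve(ArrayList<PVector> pts, boolean mirrored) {
      if (pts.size() < 4) return;   // curveVertex needs at least 4 points
      beginShape();
      for (PVector p : pts) {
        curveVertex(mirrored ? width - p.x : p.x, p.y);
      }
      endShape();
    }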

The coding process started by using a computer keyboard as input to a Processing script, simulating the various potentiometers, buttons, and sensors we would have in the final installation. Once a variety of the visualization's characteristics were controllable (such as stroke color and size, reflection along axes, curve shape, and confirmation to move on to the next curve form), we moved the Processing script into the real world by connecting it to three Arduinos. We used the Firmata protocol (running the StandardFirmata sketch on each board) to communicate between the Arduinos and the Processing script over serial, allowing Processing to update the visualization whenever an Arduino registered a physical input.
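
The development pattern looked roughly like the snippet below: keyboard handlers stood in for the physical artifacts, updating the same parameters that Firmata reads would later drive. The key bindings and parameter names are illustrative.

    // Keyboard stand-ins used during development (bindings are illustrative);
    // these handlers were later replaced by Firmata reads of the real sensors.
    float strokeSize = 2;
    int hueValue = 200;
    boolean mirrorX = false;

    void keyPressed() {
      if (key == 'w') strokeSize += 0.5;                   // simulates a potentiometer turn
      if (key == 's') strokeSize = max(0.5, strokeSize - 0.5);
      if (key == 'h') hueValue = (hueValue + 10) % 360;    // cycles stroke color
      if (key == 'm') mirrorX = !mirrorX;                  // simulates a toggle button
      if (key == ENTER) commitCurve();                     // confirm and begin the next form
    }

    void commitCurve() {
      // Freeze the current curve form and start accumulating a new one.
    }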


Writing and testing physical computing and visualization code through Processing and Arduino

Future Work

We are currently planning to relaunch Inter(Face) at full scale in 2023. We plan to incorporate a wider array of sensors for a greater degree of autonomy within the booths, as well as more sensitive sensors to improve user interaction.

We plan to submit our work to the ACM Symposium on User Interface Software and Technology (UIST) and the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT).

Some of the visualizations created at the exhibition

Want to Know More?

Inter(Face) Full Academic Writing - Available Here
Project Source Code - Available Here

GitHub Link
