LUME Interactive Media Display

What are new and exciting ways to visualize and access relevant data within an environment?

The Comparative Media Studies/Writing (CMS|W) department at MIT asked us to consider new and exciting ways to visualize and access relevant data within an environment. The objective was to address how best to engage people in the department’s physical space. We explored multiple scales of tangible interactions with digital content through spatially embedded computation in CMS|W’s public space. LUME now includes several interactive touch points. Interactive signage responds to the movement of passers-by through the use of PING ultrasonic range finder sensors and brings attention to otherwise static signage. A wayfinding media wall displays information on three large TV screens; bands dedicated to different labs display curated content such as lab news, the latest research, and researcher bios. An upload app allows users to “flick” with a simple finger gesture to post on a large departmental screen; lab researchers use it to share photos, research papers, or other pixel data. Finally, the Media/Twitter Shelf provides a tangible media interface that invites users to engage with objects on display using near-field communication (NFC) tags. All this increases the awareness and visibility of the different research labs within CMS|W, displaying a variety of content, acting as a place finder, and engaging all potential users.

Interactive signage

The signage wall serves as the main identifying signage of the Department of CMS|W. It is also interactive, responding to the movement of passersby through the use of PING ultrasonic range finder sensors. A series of sensors placed along the bottom strip of the signage wall detects the presence of a nearby person and, through Arduino microcontrollers, turns the LED pixels on and off in a designed sequence. The etched acrylic panel remains fully lit at all times and begins to respond dynamically to people's movement, bringing attention to the otherwise static signage.
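The sensor-to-LED mapping can be sketched as a small piece of control logic. The real firmware runs on the Arduino microcontrollers; the function below is only an illustrative model, and all constants (sensor count, pixels per band, trigger distance) are hypothetical values, not the installed configuration.

```javascript
// Sketch of the signage control logic: each ultrasonic sensor covers a
// band of LED pixels, and a distance reading below the threshold lights
// that band. All constants here are hypothetical.
const SENSOR_COUNT = 8;          // sensors along the bottom strip
const PIXELS_PER_BAND = 12;      // LED pixels driven by each sensor
const TRIGGER_DISTANCE_CM = 100; // "person nearby" threshold

// Given one distance reading (cm) per sensor, return the on/off state
// of every LED pixel as a flat boolean array.
function ledStates(distancesCm) {
  const pixels = new Array(SENSOR_COUNT * PIXELS_PER_BAND).fill(false);
  distancesCm.forEach((distance, sensorIndex) => {
    if (distance < TRIGGER_DISTANCE_CM) {
      for (let p = 0; p < PIXELS_PER_BAND; p++) {
        pixels[sensorIndex * PIXELS_PER_BAND + p] = true;
      }
    }
  });
  return pixels;
}
```

In practice the Arduino sketch would also sequence the transitions over time rather than switching a whole band at once, which is what produces the animated response described above.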

Wayfinding media wall + "Flick" photo upload app

The Media Wall displays the highest-resolution information on three large TV screens, encased in linear strips of acrylic sheet that visually divide the large screen into multiple sections. Each band on the screen is dedicated to a different lab and displays curated content such as lab news, the latest research, and researcher bios. Content is served from a main server and displayed at three slightly different web addresses, each assigned to the web browser on its respective smart TV. All three browsers share the same backend; when content needs to update, the server pushes it to them over persistent connections using Node.js and Socket.IO.
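The fan-out from one backend to three screens can be sketched without the Socket.IO dependency. In production the "callbacks" below are socket push messages to the TV browsers; here each screen is simply a listener registered under its web address, and all names are hypothetical.

```javascript
// Dependency-free sketch of the Media Wall's push architecture: in the
// real system this is Node.js + Socket.IO pushing to three smart-TV
// browsers; here each "screen" is a callback keyed by its web address.
const screens = new Map(); // address -> callback simulating a connected browser

// A TV's browser "connects" by registering its address and update handler.
function connect(address, onUpdate) {
  screens.set(address, onUpdate);
}

// Push new curated content to one screen, or to every screen if no
// address is given -- the browsers never have to poll.
function pushUpdate(content, address) {
  if (address) {
    const callback = screens.get(address);
    if (callback) callback(content);
  } else {
    screens.forEach((callback) => callback(content));
  }
}
```

Keying screens by address mirrors the described setup, where each smart TV loads a slightly different URL against the same backend.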

Users interact with the Media Wall through an accompanying mobile app that lets them flick (a simple finger gesture of “flicking” the mobile image toward the screen) their pictures to the screen whenever they are close to it. Lab researchers can use this function to share photos, research papers, or other pixel data on a large departmental screen.
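A flick gesture of this kind is typically recognized from touch events as a fast upward swipe. The thresholds below are hypothetical, not the app's actual tuning; it is a minimal sketch of the recognition step only, assuming screen coordinates where y decreases toward the top.

```javascript
// Sketch of flick detection: a swipe counts as a "flick" when it moves
// upward fast enough within a short window. Thresholds are hypothetical.
const MIN_VELOCITY_PX_PER_MS = 0.5; // upward speed required
const MAX_DURATION_MS = 300;        // flicks are quick gestures

// start / end: { x, y, t } captured from touchstart and touchend events.
function isFlick(start, end) {
  const dt = end.t - start.t;
  const dy = start.y - end.y; // positive when the finger moves up
  if (dt <= 0 || dt > MAX_DURATION_MS) return false;
  return dy / dt >= MIN_VELOCITY_PX_PER_MS;
}
```

Once a flick is recognized, the app would upload the image to the server, which pushes it to the departmental screen.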

Tangible media interface: Media/Twitter Shelf

Nine palm-sized acrylic cubes placed on a shelf invite users to engage with the objects. Each cube is laser-etched with the logo of a different research lab and identified by a near-field communication (NFC) tag. When a lab sends a tweet, the LED light under the respective lab object (acrylic cube) turns on, indicating new activity. Lifting or placing the object turns the LED off and, via the NFC readers, sends a request to the server to update the content of the web browser displayed on the main screen.
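The shelf's behavior amounts to a small state machine per cube: a tweet lights the LED, and an NFC event clears it and triggers a screen update. The sketch below models only that logic; the function and field names are hypothetical, not the project's actual API.

```javascript
// Sketch of the Media/Twitter Shelf logic. One LED state per lab cube;
// names and the request shape are hypothetical.
const ledByLab = new Map(); // lab id -> is the cube's LED on?

// A lab tweets: light that cube's LED to signal new activity.
function onTweet(labId) {
  ledByLab.set(labId, true);
}

// An NFC reader detects the cube being lifted or placed back:
// turn the LED off and return the request sent to the server,
// which updates the web browser on the main screen.
function onCubeEvent(labId) {
  ledByLab.set(labId, false);
  return { action: "showContent", lab: labId };
}
```

Driving the screen from physical pick-up events is what makes the shelf a tangible interface: the cube itself acknowledges the tweet and selects the content.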
