Rethinking the Online Exhibit (2016)
This research project investigates the space of online exhibitions, with a focus on creating new types of interactions for exhibiting 3-dimensional objects.
Most existing online exhibitions simply act as an index of their physical counterparts, skipping interactivity altogether. This is a missed opportunity, as the Web is a powerful, playful medium capable of enhancing the way a user interacts with art remotely. Throughout this project, we distilled a few qualities of a physical exhibition and brought them into the browser: it is an experience shared with other people; it has a limited lifespan; it requires moving through a space.
The outcome consisted of a few web experiments showcasing a collection of glass objects. They make heavy use of the webcam: for navigating the website, for rotating objects, and for visualizing the presence of other visitors.
Developed in partial fulfillment of the Yale Computer Science degree under the supervision of Holly Rushmeier. Built using Node.js, React, WebSockets, and a face-tracking library.
Website navigation using the position of one's face in front of the camera. Each 3-dimensional object is photographed from 18 different angles, and the displayed angle changes as the viewer's face moves.
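The rotation trick reduces to picking one of the 18 photographs based on where the face is. A minimal sketch, assuming the face tracker reports a normalized horizontal position in [0, 1] (the function and constant names are illustrative, not from the project's code):

```javascript
// Number of angles each object was photographed from.
const FRAME_COUNT = 18;

// Map a normalized horizontal face position (0 = far left, 1 = far
// right) to the index of the photograph to display.
function frameForFacePosition(faceX) {
  // Clamp to the valid range, then quantize into FRAME_COUNT buckets.
  const clamped = Math.min(Math.max(faceX, 0), 1);
  return Math.min(Math.floor(clamped * FRAME_COUNT), FRAME_COUNT - 1);
}
```

Swapping a preloaded image per frame index keeps the interaction smooth without requiring WebGL or real 3D models.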
Visualizing two users visiting the website at the same time, by mapping their face positions onto the screen and streaming the data in real time.
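The presence experiment boils down to each client broadcasting its face position over a socket while keeping a map of where everyone else is. A hedged sketch of that client-side state; the message shape ({ userId, x, y }) is an assumption, not the project's actual protocol:

```javascript
// Tracks the face positions of remote visitors, updated from
// incoming WebSocket messages and read by the renderer.
function createPresence() {
  const positions = new Map();
  return {
    // Called for every position message received over the socket.
    handleMessage(message) {
      const { userId, x, y } = JSON.parse(message);
      positions.set(userId, { x, y });
    },
    // Remove a visitor when their connection closes.
    handleDisconnect(userId) {
      positions.delete(userId);
    },
    // Snapshot of [userId, { x, y }] pairs for drawing markers.
    all() {
      return Array.from(positions.entries());
    },
  };
}
```

The renderer can then map each remote { x, y } onto screen coordinates, so two simultaneous visitors see each other move in real time.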
Each time someone visits the website, one pixel on the screen turns completely dark. Once enough visitors have passed through (around a million), the exhibition renders itself unusable.
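The decay mechanic only needs a mapping from the running visitor count to a pixel coordinate. A minimal sketch: the 1280x800 virtual canvas (1,024,000 pixels, roughly the "million" mentioned above) and the scan-order darkening are assumptions for illustration, not the project's actual choices:

```javascript
// Dimensions of the virtual canvas whose pixels go dark.
const WIDTH = 1280;
const HEIGHT = 800;

// The Nth visitor darkens the Nth pixel in scan order, wrapping
// around once every pixel has been used.
function pixelForVisitor(visitorCount) {
  const index = (visitorCount - 1) % (WIDTH * HEIGHT);
  return { x: index % WIDTH, y: Math.floor(index / WIDTH) };
}
```

Because the mapping is deterministic, the server only needs to persist a single visitor counter to reconstruct the full pattern of dead pixels.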