One of my longest-running bodies of work is a series of formal abstract pieces that use form and depth, color, positive and negative space, light and shadow, and a variety of materials to address concepts of consumerism such as product design and obsolescence.
16 Step Social
16 Step Social is a video beat sequencer composed
of a custom controller, a Max/MSP/Jitter interface, and the free Instagram API.
Users can download Instagram videos through the interface and edit the clips,
build rhythms, add effects, and layer beats indefinitely. The Arduino-based
controller was inspired by the Roland TR-808 and works in tandem with the
computer interface. Users can search for specific hashtags or upload their own
Instagram videos from their cell phones. Audience members can participate in a
performance by posting videos with a predetermined hashtag.
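For readers curious about the underlying logic, the piece itself is a Max/MSP/Jitter patch, but the core of a 16-step sequencer can be sketched in a few lines of Python. The step count matches the title; the clip names and data structure here are purely illustrative, not taken from the actual patch.

```python
# Illustrative sketch of 16-step sequencer logic. The real piece is a
# Max/MSP/Jitter patch; clip names and structure here are hypothetical.

STEPS = 16

def make_pattern(active_steps):
    """Return a 16-slot pattern with video clips placed on the given steps."""
    pattern = [None] * STEPS
    for step, clip in active_steps.items():
        pattern[step] = clip
    return pattern

def play_bar(pattern):
    """Walk one bar of the pattern, collecting (step, clip) triggers in order."""
    triggered = []
    for step in range(STEPS):
        if pattern[step] is not None:
            triggered.append((step, pattern[step]))
    return triggered

# Example: one clip on every fourth step, a second clip on step 10.
pattern = make_pattern({0: "clip_a", 4: "clip_a", 8: "clip_a",
                        12: "clip_a", 10: "clip_b"})
print(play_bar(pattern))
```

Layering beats "indefinitely," as described above, amounts to running several such patterns in parallel, one per video channel.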
Augmented Carousel and Augmented Asbury Park
Augmented Asbury Park is Asbury Park's first augmented reality tour of the boardwalk. Augmented reality allows people to see a three-dimensional image on the screens of their mobile devices as an overlay on top of their real-time experience. The 3D content can either interact with a flat image called a marker or exist out in the open at a specific geographic location; Augmented Asbury Park uses both methods. Seeing the project onsite is the best way to experience it; however, the project website www.augmentedasburypark.com provides free downloadable markers that serve as portable sites for the 3D models. This enables the content to be distributed to the local community and schools.
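The second method mentioned above, placing content at a specific geographic location, boils down to a proximity check between the device and a geo-anchor. The sketch below is a generic illustration of that idea, not the app's actual implementation; the coordinates and trigger radius are assumptions.

```python
import math

# Illustrative location-based AR trigger: show the 3D model only when the
# device is within a radius of a geo-anchor. Not the app's actual code;
# coordinates and radius below are hypothetical.

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def should_show_model(device, anchor, radius_m=30.0):
    """True when the device is close enough to the anchor to show the model."""
    return haversine_m(*device, *anchor) <= radius_m

# Example: approximate points near the Asbury Park boardwalk.
anchor = (40.2204, -74.0021)
nearby = (40.2205, -74.0021)   # roughly 11 m away
far = (40.2300, -74.0021)      # roughly 1 km away
print(should_show_model(nearby, anchor), should_show_model(far, anchor))
```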
We launched on June 29, 2014, with a tour of the boardwalk. To answer questions and help people through the experience, we conducted guided tours every Sunday throughout July. In addition to the walking tours over the summer, we presented Augmented Asbury Park at TEDxNavesink, MakerFaire NYC, The Glory Days Conference at Monmouth University, and the Asbury Park Historical Society. Posters from this project have been the focus of three exhibitions: one at CoWerks in Asbury Park, another at Red Bank FrameWorks in Red Bank, and the last at Eastern Michigan University in Ypsilanti, MI. We also plan to collaborate with other organizations to spread this project as an educational resource.
Utilizing footage from the 2012 presidential campaign as raw material, this performance samples words, phrases, breaths, pauses, and other sounds and silences in order to build percussion tracks, melodies, and solos.
Screenshots from the performances at Artist Outpost Resource (Ridgewood, NY) and 3 Legged Dog (New York, NY)
This piece utilizes the Microsoft Kinect and Max/MSP/Jitter to track hand movements and trigger video collages of collapsing buildings. By moving their hands in front of the Kinect camera, audience members control the placement and height of imaginary buildings. The structures, which resemble abstracted houses and apartment complexes, appear to be "rebuilding" themselves until they crumble and fall off the screen. The title is a play on the name of the German industrial group Einstürzende Neubauten, which translates to "collapsing new buildings."
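The mapping at the heart of this interaction, hand position in camera space to building placement and height on screen, can be sketched in Python. The actual piece runs on a Kinect feeding Max/MSP/Jitter; the coordinate ranges and screen dimensions below are assumptions for illustration.

```python
# Illustrative hand-to-building mapping. The real piece is a Kinect +
# Max/MSP/Jitter patch; ranges and dimensions here are hypothetical.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def hand_to_building(hand_x, hand_y, screen_w=1280, max_height=720):
    """Horizontal hand position picks where the building appears; raising
    the hand (smaller y in normalized camera space) makes it taller."""
    x = scale(hand_x, 0.0, 1.0, 0, screen_w)
    height = scale(hand_y, 1.0, 0.0, 0, max_height)  # inverted: up = taller
    return round(x), round(height)

print(hand_to_building(0.5, 0.25))  # hand centered and raised
```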
Wobble Tumble Slide also combines video, performance, and sculpture. This piece, however, relies entirely on audience interaction. Rather than involving one performer and one controller, this new installation consists of three controllers, three video channels, and multiple performers. When viewers enter the installation, the video screens show a silent instructional loop. By picking up the sculptures and manipulating them by shaking, rocking, and otherwise interacting with the moving parts, participants alter and edit the sound and appearance of the projected video clips. Like Simulsuck, Wobble Tumble Slide is a performance, but one that requires the participation of the viewer. The audience member is simultaneously viewer and performer. This is documentation from an interactive video piece that was installed at the carousel house in Asbury Park, NJ at the end of September 2010.
This piece utilizes a custom video controller built from discarded vacuum cleaners. The controller houses interactive electronic meters and dials that feed information such as volume and rate into the computer program Max/MSP/Jitter. The program then outputs the video while altering it according to the incoming data. For the video component, I gathered television commercials for cleaning products such as mops, sprays, sponges, and, of course, vacuum cleaners. The result is a rhythm-based improvisational musical performance. This clip (7 min.) is from a performance at Grizzly Grizzly Gallery, Philadelphia in August 2011.
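The data flow described above, raw sensor readings from the controller mapped onto video parameters, can be illustrated with a small Python sketch. In the piece itself this mapping lives inside Max/MSP/Jitter; the ADC resolution and parameter ranges below are assumptions chosen for the example.

```python
# Illustrative sensor-to-video mapping. The real mapping is done in
# Max/MSP/Jitter; the 10-bit ADC range and rate/volume bounds are hypothetical.

def map_controls(dial_raw, meter_raw, adc_max=1023):
    """Map raw dial/meter readings (0..adc_max) onto video parameters:
    the dial sets playback rate (0.25x to 4x), the meter sets volume (0..1)."""
    rate = 0.25 + (dial_raw / adc_max) * (4.0 - 0.25)
    volume = meter_raw / adc_max
    return round(rate, 3), round(volume, 3)

# Example: dial turned fully up, meter at roughly half.
print(map_controls(1023, 512))
```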