
Adaptive Art at the Duderstadt

Art & Design and Engineering students team up to create interactive art

Andre Grewe makes websites for the School of Art & Design.

Opening this afternoon at the Duderstadt Gallery, the /bin/art exhibition showcases student projects developed in Adaptive Art: an interdisciplinary class offered by the School of Art & Design and the College of Engineering.

Co-taught by Satinder Baveja (Computer Science & Engineering), director of the University of Michigan's AI laboratory and an expert in machine learning, and Osman Khan (School of Art & Design), an artist who uses technology to create interactive installations, Adaptive Art focused on using computation, algorithms and machine learning as media for aesthetic expression. In this class, computers aren't just something you do your work on - they're a vital part of the finished work, and might even have created it on their own.

Working in teams that mixed Engineering and Art & Design students, the class used Arduino microcontrollers and Processing, an open source programming language, to create a wide variety of works that interface with people, whether in the same room or on another continent.

I stopped by while the show was being set up and had the chance to get a preview of some of the installations:

The Beckoner
Picture this: You're rushing through the Duderstadt on your way to class. As you hurry past the gallery, you're startled by a sudden tapping sound. As you turn toward the source of the tapping, you realize it's being made by an articulated wooden hand - and now it's beckoning, drawing you closer... This isn't the start of a horror story, but an interactive installation: The Beckoner is a wooden hand mannequin controlled by software programmed to recognize human figures and faces. When its camera senses a human-sized shape walking by, it taps; if that passerby stops and appears to be looking toward the camera, it beckons them in. "The Beckoner is a study of the dynamics of human-computer interaction in a public space – can a wooden hand really engage the attention of the busy students and faculty walking by the gallery?"
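For readers curious how the sensing side of something like this might be wired up, here's a rough sketch (not the students' actual code) using OpenCV's stock Haar-cascade detectors in Python. The camera index, serial port, and the "TAP"/"BECKON" commands are hypothetical stand-ins for whatever actually drives the hand's servos.

    # Illustrative sketch only: passer-by vs. person facing the camera.
    import cv2
    import serial  # pyserial, assuming an Arduino drives the hand

    cap = cv2.VideoCapture(0)                    # gallery webcam (assumed index)
    hand = serial.Serial("/dev/ttyUSB0", 9600)   # hypothetical link to the hand

    body_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_fullbody.xml")
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # A frontal face means someone has stopped and is looking this way.
        if len(face_cascade.detectMultiScale(gray, 1.1, 5)) > 0:
            hand.write(b"BECKON\n")
        # Otherwise, a full-body silhouette means someone is walking past.
        elif len(body_cascade.detectMultiScale(gray, 1.1, 3)) > 0:
            hand.write(b"TAP\n")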


Rehaiku
Rehaiku plays with conventions, filtering messages about disposable pop culture, relayed over a very modern medium, through a centuries-old formal Japanese structure. The software retrieves tweets from Twitter in real time and uses machine learning techniques to combine pieces of different tweets into correctly formed haiku: groupings of 5, then 7, then 5 syllables about Kanye, Justin Bieber and more. The program then sends its creations back out into the world, tweeting them based on crowd response at http://twitter.com/rehaiku
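To give a feel for the 5-7-5 mechanics, here's a toy Python sketch. The naive vowel-group syllable counter and the hard-coded stand-in phrases are my own simplifications; the actual project pulled its fragments from live tweets and assembled them with machine learning.

    # Toy haiku assembler: not the project's method, just the 5-7-5 constraint.
    import random
    import re

    def syllables(word):
        """Rough estimate: count groups of consecutive vowels."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def phrase_syllables(phrase):
        return sum(syllables(w) for w in phrase.split())

    def pick_line(phrases, target):
        """Pick a random phrase whose estimated syllable count hits the target."""
        matches = [p for p in phrases if phrase_syllables(p) == target]
        return random.choice(matches) if matches else None

    # Stand-in fragments; the installation harvested these from Twitter instead.
    fragments = [
        "Kanye at the show",
        "Justin Bieber on my screen",
        "waiting for the bus",
        "snow is falling on the Diag",
        "my phone is dying",
    ]

    haiku = [pick_line(fragments, n) for n in (5, 7, 5)]
    if all(haiku):
        print("\n".join(haiku))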

Monitor
In today's world of TSA body scanners and cameras at every store entrance and traffic light, it's hard to know when you're being watched. Monitor plays into this modern paranoia: a viewer stepping into the installation is surrounded by three pillars, each topped with a TV monitor. The center TV displays a security camera feed of the subject, and the other two display close-up footage of human eyes. As the subject turns toward any of the monitors, facial recognition software notices - and switches that monitor to static. The result? "The subject will only be able to see video in his periphery, adding to the sensation that he is being watched. The room reacts to the subject's actions in a way that is meant to maintain his ignorance of the content being displayed."
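Here's a minimal sketch of the look-at-it-and-it-cuts-to-static behavior in Python with OpenCV, assuming one camera per pillar. Frontal-face detection stands in for whatever gaze tracking the team actually used, and the eye and security-camera footage is omitted.

    # Illustrative only: each feed goes to static when a face points at it.
    import cv2
    import numpy as np

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cameras = [cv2.VideoCapture(i) for i in range(3)]   # one per pillar (assumed)

    while True:
        for i, cam in enumerate(cameras):
            ok, frame = cam.read()
            if not ok:
                continue
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if len(face_cascade.detectMultiScale(gray, 1.1, 5)) > 0:
                # The subject is facing this pillar: replace its feed with static.
                frame = np.random.randint(0, 256, frame.shape, dtype=np.uint8)
            cv2.imshow("monitor-" + str(i), frame)
        if cv2.waitKey(30) & 0xFF == ord("q"):
            break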


Pouring Sound
This installation takes a new approach to the traditional sampler, "focusing on transforming how we treat audio and sound, from being audible and intangible, and translating it into a physical object that can be moved, contained, and mixed together." Users can speak into the red/blue pitcher, then play back the sound by pouring it into the green/yellow one. Sound can be sloshed back and forth between the vessels to mix it, or dumped on the floor to erase it.
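The post doesn't say how the pitchers sense speaking and pouring, so the following is only a toy Python model of the interaction logic, treating each recorded clip as liquid that can be poured between vessels or dumped out. The class name and clip strings are illustrative.

    # Conceptual model only: sound as a pourable substance.
    class Pitcher:
        def __init__(self, name):
            self.name = name
            self.clips = []          # recorded audio held by this vessel

        def record(self, clip):
            """Speak into the pitcher: add a clip to its contents."""
            self.clips.append(clip)

        def pour_into(self, other):
            """Tip the pitcher over another vessel: its sound transfers (and plays)."""
            other.clips.extend(self.clips)
            poured, self.clips = self.clips, []
            return poured            # what the receiving vessel plays back

        def dump(self):
            """Tip the pitcher over the floor: erase its contents."""
            self.clips = []

    red_blue = Pitcher("red/blue")
    green_yellow = Pitcher("green/yellow")
    red_blue.record("hello Duderstadt")         # stand-in for a captured waveform
    print(green_yellow.name, "plays:", red_blue.pour_into(green_yellow))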


Other projects featured in the show include:

  • Audio Wall: a Microsoft Kinect-powered interactive space that lets users produce and play with music using only their bodies
  • Digital Genesis: viewers are invited to place and move physical blocks onto a digital environment; the blocks provide light and water sources, nests and environmental effects
  • Hands Free Super Three: three classic video games played hands-free (sans controller)!
  • Inside Out: a garment that displays its wearer's heartbeat
  • Music Sequencer: a self-contained, 8-bit, Arduino-powered music sequencer. Sounds are selected and sequenced using an array of buttons on the main panel.

Read more about all of the projects at the Adaptive Art blog.

To get the full impact of these interactive installations, you really should see them in person, but there's only a limited time to check them out: /bin/art opens with a reception at the Duderstadt Gallery from 4-6 pm on Thursday, December 16, and closes on Friday, December 17.



COMMENTS

The Duderstadt Center is a real paradise for students who want to be involved in engineering, computer science, music, art (studio art) and architecture. You'll see with your own eyes how it synthesizes traditional and new sources of information.

Posted by Mason on February 22, 2012

