
Robots Not Uprising, but Sitting


Zack Jacobson-Weaver is the Materials Fabrication Studio Coordinator at A&D.

Hey y'all.  Just came across this crazy article about Rapid Prototyping with much less size limitation.  There is a question, though.  How do people feel about robots taking over making stuff?  What is the artist's role?  The designer's?  Do you harbor irrational fears of robots?  Some say they cannot feel (Andre!).  Is a robot-made object devoid of emotion?  Do robots get paid to do this?  If so, are we talking, like, 30k, 50k, what?  You tell me.  Let's use this blog... like a blog.

 

Endless from Dirk Vander Kooij on Vimeo.

 

Intuition?  This program simulates "phonesthesia", converting sounds into images.
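I don't know how that particular program works under the hood, but as a toy illustration of the general sound-to-image idea, here's a hedged Processing sketch (assuming the Minim audio library, not the program in the video): frequency becomes hue, loudness becomes bar height.

// Toy sound-to-image sketch (assumption: Processing + the Minim library),
// not the phonesthesia program shown above. Frequency -> hue, loudness -> bar height.
import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioInput mic;
FFT fft;

void setup() {
  size(512, 200);
  minim = new Minim(this);
  mic = minim.getLineIn();
  fft = new FFT(mic.bufferSize(), mic.sampleRate());
  colorMode(HSB, 255);
  noStroke();
}

void draw() {
  background(0);
  fft.forward(mic.mix);                       // analyze the current audio buffer
  float barW = width / (float) fft.specSize();
  for (int i = 0; i < fft.specSize(); i++) {
    float barH = fft.getBand(i) * 4;          // louder band -> taller bar
    fill(map(i, 0, fft.specSize(), 0, 255), 255, 255);   // pitch -> hue
    rect(i * barW, height - barH, barW, barH);
  }
}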


Adaptive Art at the Duderstadt

Art & Design and Engineering students team up to create interactive art

Andre Grewe makes websites for the School of Art & Design.

Opening this afternoon at the Duderstadt Gallery, the /bin/art exhibition showcases student projects developed in Adaptive Art: an interdisciplinary class offered by the School of Art & Design and the College of Engineering.

Co-taught by Satinder Baveja (Computer Science & Engineering), director of the University of Michigan's AI laboratory with expertise in machine learning, and Osman Khan (School of Art & Design), an artist who uses technology to create interactive installations, Adaptive Art focused on using computation, algorithms and machine learning as mediums for aesthetic expression.  In this class, computers aren't something that you do your work on - instead, they're a vital part of the finished work, and might even have created it on their own.

Working in teams that mixed Engineering and Art & Design students, the class used Arduino microcontrollers and Processing, an open source programming language, to create a wide variety of works that interface with people, whether in the same room or on another continent.
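For a rough sense of what that Arduino-plus-Processing pairing looks like (this is a generic sketch, not code from any of the class projects), a Processing program can listen to whatever an Arduino prints over its serial port and react on screen; the port index and baud rate below are assumptions.

// Minimal Processing sketch: read a value sent by an Arduino over serial
// and map it to the background brightness. Port index and baud rate are assumptions.
import processing.serial.*;

Serial arduino;        // serial connection to the Arduino
float brightness = 0;

void setup() {
  size(400, 400);
  // Serial.list()[0] assumes the Arduino is the first serial device found
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');   // fire serialEvent() once per line
}

void draw() {
  background(brightness);
}

void serialEvent(Serial port) {
  String line = trim(port.readStringUntil('\n'));
  if (line != null) {
    // the Arduino is assumed to print values 0-1023 (e.g. from analogRead)
    brightness = map(float(line), 0, 1023, 0, 255);
  }
}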

I stopped by while the show was being set up and had the chance to get a preview of some of the installations:

The Beckoner
Picture this: You're rushing through the Duderstadt on your way to class. As you hurry past the gallery, you're startled by a sudden tapping sound. As you turn toward the source of the tapping, you realize it's being made by an articulated wooden hand - and now it's beckoning, drawing you closer... This isn't the start of a horror story, but an interactive installation: The Beckoner is a wooden hand mannequin that's controlled by software programmed to recognize human figures and faces - when its camera senses a human-sized shape walking by, it taps. If that shape stops and appears to be looking toward the camera, it beckons them in. "The Beckoner is a study into the dynamics of human-computer interaction in a public space – can a wooden hand really engage the attention of the busy students and faculty walking by the gallery?"
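The post doesn't say which vision tools the team used, so take this as a hedged sketch of the tap-or-beckon decision only: it assumes the OpenCV for Processing library for face detection, a crude frame-difference test for "a human-sized shape walking by," and an Arduino on serial that understands two made-up one-character commands.

// Hedged sketch of The Beckoner's logic, not the team's actual code.
// Assumptions: OpenCV for Processing library, and an Arduino driving the hand
// that listens for invented serial commands 'T' (tap) and 'B' (beckon).
import gab.opencv.*;
import processing.video.*;
import processing.serial.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;
Serial hand;          // link to the Arduino moving the wooden hand
PImage prevFrame;     // previous camera frame, for crude motion detection

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  hand = new Serial(this, Serial.list()[0], 9600);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);

  opencv.loadImage(cam);
  Rectangle[] faces = opencv.detect();

  if (faces.length > 0) {
    hand.write('B');               // someone seems to be looking: beckon them in
  } else if (motion(cam)) {
    hand.write('T');               // a figure is passing by: tap for attention
  }
}

// Very rough stand-in for "a human-sized shape walking by": count how many
// sampled pixels changed a lot since the previous frame.
boolean motion(PImage frame) {
  if (prevFrame == null) { prevFrame = frame.get(); return false; }
  frame.loadPixels();
  prevFrame.loadPixels();
  int changed = 0;
  for (int i = 0; i < frame.pixels.length; i += 20) {
    if (abs(brightness(frame.pixels[i]) - brightness(prevFrame.pixels[i])) > 40) changed++;
  }
  prevFrame = frame.get();
  return changed > frame.pixels.length / 200;   // a big blob of change
}

A real version would debounce the serial commands so the hand isn't told to beckon on every single frame; this version just shows the decision structure.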

 

Rehaiku
Rehaiku plays with conventions, filtering messages about disposable pop culture relayed on a very modern form of communication through a centuries-old formal Japanese structure. The software retrieves tweets from Twitter in real time and uses machine learning techniques to combine pieces of different tweets into correctly formed haiku: groupings of 5, then 7, then 5 syllables about Kanye, Justin Bieber and more. The program then sends its creations back out into the world, tweeting them based on crowd response at: http://twitter.com/rehaiku
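The post doesn't spell out the team's algorithm, but the 5-7-5 constraint itself is easy to picture. Here's a hedged Processing sketch of just that assembly step: a crude vowel-group syllable counter picks phrases that fit each line. The real project's tweet harvesting and machine-learning scoring are left out, and the sample phrases below simply stand in for fragments pulled from live tweets.

// Hedged sketch of Rehaiku's 5-7-5 assembly step only (not the team's code).
// Tweet retrieval and machine-learning scoring are omitted; "phrases" stands in
// for fragments harvested from live tweets.
import java.util.ArrayList;

ArrayList<String> phrases = new ArrayList<String>();

void setup() {
  phrases.add("Kanye at the show");
  phrases.add("another Bieber song plays");
  phrases.add("the feed scrolls away");

  println(pickLine(5));
  println(pickLine(7));
  println(pickLine(5));
}

// Return (and use up) the first stored phrase with the requested syllable count.
String pickLine(int target) {
  for (String p : phrases) {
    if (countSyllables(p) == target) {
      phrases.remove(p);
      return p;
    }
  }
  return "...";   // no candidate of that length yet
}

// Crude syllable estimate: count groups of consecutive vowels.
int countSyllables(String phrase) {
  int count = 0;
  boolean inVowelGroup = false;
  for (char c : phrase.toLowerCase().toCharArray()) {
    boolean vowel = "aeiouy".indexOf(c) >= 0;
    if (vowel && !inVowelGroup) count++;
    inVowelGroup = vowel;
  }
  return count;
}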

Monitor
In today's world of TSA body scanners and cameras at every store entrance and traffic light, it's hard to know when you're being watched. Monitor plays into this modern paranoia: As a viewer steps into the installation, he's surrounded by three pillars, each topped with a TV monitor. The center TV displays a security camera feed of the subject, and the other two display closeup footage of human eyes. As the subject turns toward any of the monitors, facial recognition software notices - and switches that monitor to static. The result? "The subject will only be able to see video in his periphery, adding to the sensation that he is being watched. The room reacts to the subject's actions in a way that is meant to maintain his ignorance of the content being displayed."
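Here's a hedged Processing sketch of what one pillar's screen-switching might look like (not the team's code): a flag that face detection would set decides whether the monitor shows its feed or dissolves into static.

// Hedged sketch of Monitor's per-screen switching (one of the three pillars).
// Face detection (as in The Beckoner sketch above) would set "facingThisScreen";
// when it is true, the feed is replaced with static so the viewer never
// catches the screen showing anything.
import processing.video.*;

Capture feed;
boolean facingThisScreen = false;   // would be set by the face detector

void setup() {
  size(640, 480);
  feed = new Capture(this, 640, 480);
  feed.start();
}

void draw() {
  if (feed.available()) feed.read();

  if (facingThisScreen) {
    drawStatic();          // the viewer is looking: show only noise
  } else {
    image(feed, 0, 0);     // only peripheral vision ever sees the feed
  }
}

// Fill the screen with random grey pixels, like analog TV static.
void drawStatic() {
  loadPixels();
  for (int i = 0; i < pixels.length; i++) {
    pixels[i] = color(random(255));
  }
  updatePixels();
}

void mousePressed() {
  facingThisScreen = !facingThisScreen;   // stand-in for the face detector while testing
}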

 

Pouring Sound
This installation takes a new approach to the traditional sampler, "focusing on transforming how we treat audio and sound, from being audible and intangible, and translating it into a physical object that can be moved, contained, and mixed together." Users can speak into the red/blue pitcher, then play back the sound by pouring it into the green/yellow one. Sound can be sloshed back and forth between the vessels to mix, and dumped on the floor to erase.
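As a hedged sketch of that record-then-pour cycle (not the team's implementation): in Processing with the Minim library, one recorder "fills" the first pitcher and a player "pours" it back out. The real pitchers presumably report tilt through sensors on the Arduino; keyboard keys stand in for those gestures here.

// Hedged sketch of Pouring Sound's record/playback cycle (not the team's code).
// Keyboard keys stand in for the physical gestures: 'r' = speak into the first
// pitcher (start/stop recording), 'p' = pour it into the second one (play back).
import ddf.minim.*;

Minim minim;
AudioInput mic;
AudioRecorder pitcher;       // "holds" whatever was spoken into it
AudioPlayer pouredSound;
boolean pitcherFull = false;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  mic = minim.getLineIn();
  pitcher = minim.createRecorder(mic, "pitcher.wav");
}

void draw() {
  background(pitcher.isRecording() ? color(200, 0, 0) : color(0, 0, 200));
}

void keyPressed() {
  if (key == 'r') {                      // start/stop filling the pitcher
    if (pitcher.isRecording()) {
      pitcher.endRecord();
      pitcher.save();                    // keep the captured sound on disk
      pitcherFull = true;
    } else {
      pitcher.beginRecord();
    }
  }
  if (key == 'p' && pitcherFull) {       // "pour": play back what the pitcher holds
    pouredSound = minim.loadFile("pitcher.wav");
    pouredSound.play();
  }
}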

 

Other projects featured in the show include:

  • Audio Wall: a Microsoft Kinect powered interactive space that allows the user to produce and play with music using only their body
  • Digital Genesis: viewers are invited to place and move physical blocks onto the digital environment, which provide light and water sources, nests and environmental effects
  • Hands Free Super Three: presenting three classic video games played hands free (sans controller)!
  • Inside Out: a garment that displays its wearer's heartbeat
  • Music Sequencer: a self-contained, 8-bit, Arduino-powered music sequencer. Sounds are selected and sequenced using an array of buttons on the main panel (a rough sketch of the step-sequencing idea follows this list).
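The hardware details aren't in the post, so purely as a hedged illustration of the step-sequencing idea, here's a Processing mock-up in which mouse clicks stand in for the panel buttons and a println stands in for the Arduino firing a sound.

// Hedged mock-up of the step-sequencing idea behind Music Sequencer (the real
// piece runs on an Arduino with physical buttons; here mouse clicks toggle
// steps and the window plays the role of the main panel).
int steps = 8;
boolean[] pattern = new boolean[steps];
int current = 0;
int lastTick = 0;
int stepMillis = 250;        // tempo: one step every 250 ms

void setup() {
  size(400, 100);
}

void draw() {
  // advance the sequence on a fixed clock
  if (millis() - lastTick > stepMillis) {
    lastTick = millis();
    current = (current + 1) % steps;
    if (pattern[current]) {
      println("trigger sound on step " + current);   // an Arduino would fire a tone here
    }
  }
  // draw the 8 steps; bright = active, red outline = the step currently playing
  background(0);
  for (int i = 0; i < steps; i++) {
    fill(pattern[i] ? 255 : 60);
    stroke(i == current ? color(255, 0, 0) : color(120));
    rect(i * 50, 25, 50, 50);
  }
}

void mousePressed() {
  int i = mouseX / 50;                 // which column was clicked
  if (i >= 0 && i < steps) pattern[i] = !pattern[i];
}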

Read more about all of the projects at the Adaptive Art blog.

To get the full impact of these interactive installations, you should really see them in person, but there's a limited time to check them out: /bin/art opens with a reception at the Duderstadt Gallery from 4-6 pm on Thursday, December 16, and closes on Friday, December 17.


 

Digital Ceramics!

Zack Jacobson-Weaver is the Materials Fabrication Studio Coordinator at A&D.

We've recently been experimenting with how to combine our RP and SRP technologies with ceramics.  What's the draw? For starters, the amazing material properties of ceramics combined with freeform and rapid fabrication tech have potential too grand and varied to enumerate.  How about a custom suit of fireproof dragon scales?  Rock!  Here's some of what we're up to now...

These lovely nugs were printed with the help of Shawn at the UM 3D Lab, right across the street from A&D.  This is a mixture of terra cotta, PVA and sugar; a recipe we got through a virtual collaboration with ME researchers at U-Dub, the University of Washington, Seattle. You can read about their work and get the recipe for yourself here: http://ceramicartsdaily.org/methods-techniques/the-printed-pot/?floater=99.  

The pieces were printed in a Zcorp 3D printer with Zb60 binder.  The future? Firing tests this spring break: tank tops and Mojitos in the kiln room, anyone?

We're gonna need a bigger kiln!  This lovely catch was made by taking a 3D scan of a fish, again credit to the UM 3D Lab.  The .stl file was then prepared for CNC milling on A&D's Roland MDX-540 "desktop" wizard using RhinoCam software, a CAM plug-in for the ubiquitous Rhino CAD package.

This piece is cut into molding plaster and is half of a slip cast mold.  We didn't think it was a really big deal until American Craft Magazine wrote an article talking about the "new craft".  See page 4 of this article for some insight on the role of these hot technologies in the future of ceramics.

For an amazing example of the potential, check out artist Geoffrey Mann's "Crossfire" series.  The teapot and flatware here are slip cast from digitally designed molds.  This is awesome!