(YouTube Link)
Gil Weinberg and Scott Driscoll of Georgia Tech developed a robot that can improvise rhythms in real time as it listens to live music:
Haile is a robotic percussionist that can listen to live players, analyze their music in real time, and use the product of this analysis to play back in an improvisational manner. It is designed to combine the benefits of computational power and algorithmic music with the richness, visual interactivity, and expression of acoustic playing. We believe that when collaborating with live players, Haile can facilitate a musical experience that is not possible by any other means, inspiring players to interact with it in novel expressive ways, leading to novel musical outcomes.
http://www.cc.gatech.edu/~gilwein/Haile.htm via The Presurfer
A fascinating book that I highly recommend.
Thanks for posting the video (stanky or not).
@otterly, the robot can 'improvise' by taking the human's beats and manipulating them; it can sense the density of the player's input and decide on its own density in response. The player can play whatever they want and Haile will 'improvise' along with it, not the other way around.
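As a rough illustration of that density-matching idea (this is a hypothetical sketch, not Haile's actual algorithm), a system could count the player's recent note onsets per second and then choose a reply with a proportional number of notes:

```python
import random

def onset_density(onset_times, window=4.0):
    """Onsets per second over the most recent `window` seconds of playing."""
    if not onset_times:
        return 0.0
    now = max(onset_times)
    recent = [t for t in onset_times if now - window <= t <= now]
    return len(recent) / window

def respond(onset_times, beats_per_window=8, window=4.0, seed=0):
    """Pick response onsets whose density mirrors the player's:
    sparse input produces a sparse reply, dense input a dense one."""
    rng = random.Random(seed)
    density = onset_density(onset_times, window)
    # Scale the number of reply notes with the heard density (capped at the grid size).
    n_notes = min(beats_per_window, max(1, round(density * window)))
    # Place the reply notes randomly on an even time grid within the window.
    grid = [i * (window / beats_per_window) for i in range(beats_per_window)]
    return sorted(rng.sample(grid, n_notes))

sparse = [0.5, 2.5]                      # 2 hits in 4 seconds
dense = [0.1 * i for i in range(40)]     # 40 hits in 4 seconds
print(len(respond(sparse)), len(respond(dense)))
```

Here the `window`, `beats_per_window`, and the random grid placement are all invented parameters for the sketch; the real system analyzes live audio and makes far richer decisions than note count alone.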