T-Dawg's response here: http://bbs.archlinux.org/viewtopic.php?id=29852 for some reason made me think of another interesting question phrakture (and maybe others) might be interested in answering. I saw 'Happy Feet' a couple of weeks ago, and had this idea that it would be really cute if I could make an animated character (say tux...) dance in time to music. I know common media players offer various 'visualizer' plugins. Does anybody know anything about the signal processing that would be required to extract 'dance moves' from the audio and somehow apply them to animations; specifically, how would it be coded? I have a pretty good idea of how to make a bunch of simple animations in blender and then access them in code. The question is how to 'bind' certain changes in the music to switching animations, and how to handle the delay so the animation appears 'in time'.
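To make the question concrete, here's a very rough sketch of the kind of signal processing I mean, in Python (assuming NumPy and a mono 16-bit WAV; the file name, frame size and threshold are just placeholders to tune). A frame whose energy jumps well above the recent average gets flagged as a 'beat':

import wave
import numpy as np

def detect_beats(path, frame_size=1024, history=43, threshold=1.4):
    # Load the whole file as 16-bit samples (assumes a mono WAV).
    wav = wave.open(path, 'rb')
    rate = wav.getframerate()
    samples = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)
    wav.close()
    samples = samples.astype(np.float64)

    beats = []     # beat times in seconds
    energies = []  # recent frame energies (~1 second's worth at 44.1 kHz)
    for i in range(0, len(samples) - frame_size, frame_size):
        frame = samples[i:i + frame_size]
        energy = float(np.dot(frame, frame)) / frame_size
        if len(energies) == history:
            avg = sum(energies) / history
            # Flag a beat when this frame is much louder than the recent average.
            if avg > 0 and energy > threshold * avg:
                beats.append(i / float(rate))
            energies.pop(0)
        energies.append(energy)
    return beats

print(detect_beats("song.wav"))  # "song.wav" is a placeholder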
Google provided me with this paper: http://www.cs.ualberta.ca/~sauer/ACMPaper.pdf I haven't read it yet, but I will, along with some of the references. That said, I've noticed research papers tend not to be very useful for actual real-world implementation.
Has anything like this been done before? I mean, besides Animusic, which rocks.
Dusty
haha, that's almost perfect, the GDancer one is awesome (and it's in AUR). I still want to see something with 3D characters, but these might give me something to work with in the signal processing department.
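The other half, actually switching animation clips in time with the beats, seems like it would mostly be scheduling. A rough sketch, assuming beat times from a detector like the one above; play_clip() and the clip names are placeholders for whatever the Blender side would expose, and the latency offset would need hand tuning:

import itertools
import time

def dance(beat_times, clips, latency=0.05):
    # Cycle through clips, firing each switch slightly early so the visible
    # motion lines up with what you hear despite render/audio latency.
    start = time.time()
    next_clip = itertools.cycle(clips)
    for beat in beat_times:
        wait = (beat - latency) - (time.time() - start)
        if wait > 0:
            time.sleep(wait)
        play_clip(next(next_clip))

def play_clip(name):
    # Hypothetical hook into the renderer; here it just prints.
    print("switching to animation:", name)

dance([0.5, 1.0, 1.5, 2.0], ["step_left", "step_right", "spin"])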
Thanks!
Dusty
Hey Dusty, would you mind reporting back when you find a solution?
YAY! Now thanks to GDancer I have Steve Ballmer dancing on my desktop!
AND I now have Clinton headbanging..
There should be a Bill Gates one.. xD
Last edited by majikstreet (2007-02-14 13:55:05)
syd wrote: Here in NZ we cant spell words with more than 5 letters. So color will have to do.
You must be very special then because "letters" has 7