First things first: there's no library out there for what you want. It's just a bit too specific, I think.
The good news is this isn't terribly hard to do - it's just about getting your head in the right place.
Instead of thinking about sound, let's think about something else, like the accelerometer. If I want a UIView to move around in response to the accelerometer, I could simply take the reported Z-axis value (between -1 and 1) and convert it to a coordinate on the screen (0 through 480, for example).
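That conversion is just a linear mapping. Something like this (a minimal Swift sketch; the function name and the 480-point height are only illustrative):

    // Map a Z-axis reading in [-1, 1] to a vertical screen
    // coordinate in [0, screenHeight].
    func screenY(fromZ z: Double, screenHeight: Double = 480) -> Double {
        let clamped = max(-1.0, min(1.0, z))           // keep input in range
        return (clamped + 1.0) / 2.0 * screenHeight    // [-1,1] -> [0,1] -> [0,480]
    }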
If I just plug the accelerometer measurements directly into my conversion formula, the motion will probably be jerky and the UIView might bounce all over the place, because consecutive accelerometer readings can vary a lot. So maybe I add some kind of simple filter to make the changes between measurements more gradual.
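A classic choice is a low-pass filter that blends each new reading with the previous filtered value (a sketch; the class name and the 0.1 smoothing factor are arbitrary):

    // Simple exponential low-pass filter: each new reading is blended
    // with the previous filtered value, so sudden spikes are smoothed out.
    final class LowPassFilter {
        private var value: Double = 0
        let alpha: Double   // 0...1; smaller = smoother but laggier

        init(alpha: Double = 0.1) { self.alpha = alpha }

        func filter(_ input: Double) -> Double {
            value = alpha * input + (1 - alpha) * value
            return value
        }
    }

Run each raw Z reading through filter(_:) before passing it to the conversion above, and the view's motion smooths out at the cost of a little lag.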
Now, what has this got to do with audio? Actually, a fair bit. For example, if you substitute amplitude for the accelerometer, you could have a UIView that moves up and down in response to the loudness of the audio. All you'd need is something that continuously feeds amplitude values into the view you want to animate.
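One convenient source of amplitude values on iOS is AVAudioRecorder's metering API. A sketch, assuming you already have a recorder with isMeteringEnabled set and you call this periodically (from a CADisplayLink or a timer):

    import AVFoundation
    import UIKit

    // Move a view vertically with the current loudness. averagePower is
    // in decibels (roughly -160...0), so convert it to a rough 0...1
    // linear level before mapping it to a y position.
    func updateView(_ view: UIView, with recorder: AVAudioRecorder) {
        recorder.updateMeters()
        let power = recorder.averagePower(forChannel: 0)   // dB, -160...0
        let level = pow(10, power / 20)                    // dB -> ~0...1
        let maxY = view.superview?.bounds.height ?? 480
        view.center.y = maxY * (1 - CGFloat(level))        // louder = higher
    }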
You can then get a little more complicated, with a view that moves or scales in response to certain frequency ranges (bass, perhaps). So I think if you take a step back and consider exactly what you're trying to animate, and in response to which parameters, this may become a little easier for you.
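Extracting frequency content takes an FFT (Accelerate's vDSP can handle that part), but once you have, say, a normalized bass magnitude, driving the view is the same idea as before. A hypothetical sketch; bassLevel (0...1, e.g. averaged from the lowest FFT bins) and the 1.5x maximum scale are assumptions:

    import UIKit

    // Pulse a view between 1x and 1.5x scale with the bass level.
    func pulse(_ view: UIView, bassLevel: CGFloat) {
        let scale = 1.0 + 0.5 * max(0, min(1, bassLevel))
        view.transform = CGAffineTransform(scaleX: scale, y: scale)
    }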