Better audio analysis
funwithtriangles opened this issue · 6 comments
Currently it is quite primitive. Would be good to have some "decay" options and more. Not totally clued up on this one!
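Something like an envelope follower is roughly what I'm imagining for "decay" — the value jumps up with the signal but falls back at an adjustable rate instead of flickering with the raw analyser output. Very rough sketch, purely illustrative (names made up):

```js
let smoothedLevel = 0

// decay close to 1 = slow fall-off, close to 0 = barely any smoothing
function followLevel (rawLevel, decay = 0.9) {
  // rise instantly with the signal, fall back gradually
  smoothedLevel = Math.max(rawLevel, smoothedLevel * decay)
  return smoothedLevel
}
```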
got a start over here, going to dive in with some more improvements once I understand the UI side a bit more :) cale-bradbury@e2b213b
Awesome, this is exactly the sort of stuff we need.
@netgrind do you think it would be useful to have MIDI control over these new values? If so, this is the approach:
- Values for each of these new settings need to be stored in the state. The best place for these values to be stored is probably under `nodes`.
- Nodes are wired up in a way that they can be a value from 0 to 1 and have the ability to be controlled by inputs (e.g. MIDI). `nodes` are also used for any sketch param, macro, modifier, etc.
- On the start of the project, these new audio analysis nodes need to be created. They can be created with the action `uNodeCreate`. They will need to be given a unique id, using `uid`. This will put them in the state. This will have to happen on initiation of the project.
- As well as creating the nodes, the unique IDs for each of these nodes need to be kept somewhere too. Possibly under `project`. Maybe in a new place in the state.
- Once that's all done, it's then a case of creating components that are linked to these parts of the state. The IDs stored under `project` will get you the right values for the items under `nodes`. These values can then be given to slider components, etc.
- Will also need logic to prevent these audio analysis nodes being created on a project if they already exist (see the rough sketch after this list).
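To make that concrete, here's a very rough sketch of the project-initiation step. It's only illustrative — the exact signatures of `uNodeCreate` and `uid`, the import paths, and the action for remembering the ids under `project` are all assumptions, not the real API:

```js
import uid from 'uid' // assumed id helper
import { uNodeCreate } from './store/nodes/actions' // path/signature assumed
import { projectAudioNodesAdd } from './store/project/actions' // hypothetical action

export const createAudioAnalysisNodes = (dispatch, getState) => {
  const { project } = getState()

  // Don't recreate the nodes if a saved project already has them
  if (project.audioAnalysisNodeIds && project.audioAnalysisNodeIds.length) return

  // Hypothetical new settings, e.g. the "decay" idea from above
  const titles = ['audio falloff', 'audio smoothing']

  const nodeIds = titles.map(title => {
    const id = uid()
    // Creates a 0..1 node in state.nodes, so it can be MIDI-assigned
    // like any sketch param, macro or modifier
    dispatch(uNodeCreate(id, { title, value: 0.5 }))
    return id
  })

  // Keep the generated ids somewhere findable (e.g. under project)
  // so slider components can look the nodes up later
  dispatch(projectAudioNodesAdd(nodeIds))
}
```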
Yeah, getting a texture of audio data that can be passed to any THREE.js scenes that have the proper markup/metadata would be very handy for passing through to vert/frag shaders for reactive effects.
I personally like having it so X is the frequency and Y is the history over time. Might be a good idea to have an option to send one that is similar to Shadertoy's (I think theirs is currently just a 2 or 5 pixel high left/right channel..?) so people can just pop their shaders right into a template.
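Not tied to any particular implementation, but the X = frequency, Y = history idea could be as simple as a `THREE.DataTexture` fed from a Web Audio `AnalyserNode`. Sizes, formats and names below are arbitrary choices, just to show the shape of it:

```js
import * as THREE from 'three'

const HISTORY_ROWS = 64

// Assumes some audio source gets connected to the analyser elsewhere
const audioContext = new AudioContext()
const analyser = audioContext.createAnalyser()
analyser.fftSize = 512
const bins = analyser.frequencyBinCount // 256

// One byte per texel: x = frequency bin, y = history (row 0 = newest frame)
const data = new Uint8Array(bins * HISTORY_ROWS)
const audioTexture = new THREE.DataTexture(
  data, bins, HISTORY_ROWS, THREE.RedFormat, THREE.UnsignedByteType
)

function updateAudioTexture () {
  // Push the existing rows one step back in time
  data.copyWithin(bins, 0, bins * (HISTORY_ROWS - 1))
  // Write the newest frequency snapshot into row 0
  analyser.getByteFrequencyData(data.subarray(0, bins))
  audioTexture.needsUpdate = true
}
```

A Shadertoy-style option would just be the same thing with one or two rows and no history shift.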
So my first thought on this was that it should be some kind of new input that is assignable to any parameter. However, I don't know if that really makes sense, as it's quite a specific data type that's only useful in certain contexts.
One very simple approach to doing this would be to pass the audio analyser object to each sketch class upon construction. Inside the analyser object could be useful pre-calculated things such as the texture you're talking about or anything else. So you'd create sketches like this:
```js
class MySketch {
  constructor (scene, meta, extras) {
    const analyser = extras.audioAnalyser
    this.texture = analyser.dataTexture
    this.frequencies = analyser.frequencies
    this.someOtherUsefulThing = analyser.fooThing
  }

  // ...

  update () {
    // can use this.texture here
  }
}
```
Does that approach make sense? Inside the `extras` parameter there could also be a webcam feed or whatever other inputs that don't quite fit with how the rest of the params work.
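And just to picture the engine side of that — all placeholder names here (`audioAnalyser` and `webcam` don't exist yet), assuming the engine assembles `extras` and hands it to each sketch on construction:

```js
// Illustrative only — how the engine might assemble extras and pass it in
const extras = {
  audioAnalyser: {
    dataTexture: audioTexture,        // e.g. the DataTexture sketched further up
    frequencies: new Uint8Array(256)  // latest raw frequency bins
  },
  webcam: null // room for other inputs that don't fit the param system
}

// Each sketch receives it alongside scene and meta
const sketch = new MySketch(scene, meta, extras)
```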