petersalomonsen/javascriptmusic

Questions! (Btw, this project is awesome!!)


Hi, this project really looks amazing! After I played with the web version for a while and looked through the code here in the repo, my head started spinning and I got very curious about the possibilities of making music with JS the way you do it with node.js and wasm. Now I have tons of questions, and I really hope you will have time to answer all of them, thank you! :)

  1. I would love to try this out and make some more music with node.js. I suppose I would have to install ZynAddSubFX or Yoshimi first (I saw Yoshimi mentioned in the code, so I assume it works too). Can I also use other VST instruments following your examples? I'm on Ubuntu/Linux, if that matters, but am I right to guess that it would eventually be possible even to use "premium" Windows VSTs via Wine or something?
  2. How can we adjust the knobs and parameters of a VST plugin in node.js without its GUI, turning it into a custom instrument or something we can use in code? Is that even possible via MIDI? Or maybe every VST synth has its own project files that can be parsed/changed?
  3. How can I render the audio data to .wav when using node.js? I saw that it is pretty easy with SoX when working with 4klang or WebAssembly, but here we have the Recorder class; does that mean it records in real time only? How could I get around this and just render the audio data without waiting for a play-through?
  4. Could you advise or share links/info/resources on where I could learn more about how to manipulate VST and other audio plugins with just code, no GUI, DAW etc.? I am really fascinated by the idea of making high quality music with code and no GUI, but I cannot wrap my head around how it could actually work; for example, how can we actually adjust all the knobs and parameters of a plugin to make a custom instrument/sound from it? I tried to search for more info about such things, but currently there is not much out there. Are there similar projects/packages that work with VSTs (or similarly high quality plugins) that you know of?
  5. I may have misunderstood something, but I noticed that for the WebAssembly version you are writing custom instruments in TypeScript. Could we actually use VST instruments compiled to WASM? I know there are WAMs (https://www.webaudiomodules.org/wamsynths/); could we use them without their GUIs?

Thanks again for reading through all of this! :)

Thanks for your feedback and interest in my project, I'll try to answer as best I can :-)

  1. I would love to try this out and make some more music with node.js. I suppose I would have to install ZynAddSubFX or Yoshimi first (I saw Yoshimi mentioned in the code, so I assume it works too). Can I also use other VST instruments following your examples? I'm on Ubuntu/Linux, if that matters, but am I right to guess that it would eventually be possible even to use "premium" Windows VSTs via Wine or something?

Yes, I would assume you could use VSTi plugins with Wine if they also expose MIDI ports via ALSA or JACK (so that they can be controlled via node.js MIDI).
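
For example, with the midi package from npm you could list the synth's MIDI ports and send notes to it directly. A quick sketch (the port number will of course differ on your system):

```javascript
// Sketch: send a note to a softsynth's ALSA/JACK MIDI input from node.js,
// using the "midi" npm package (npm install midi).
const midi = require('midi');

const output = new midi.Output();
// List the available MIDI output ports to find your synth (e.g. ZynAddSubFX)
for (let i = 0; i < output.getPortCount(); i++) {
  console.log(i, output.getPortName(i));
}
output.openPort(0); // pick the port where the synth is listening

output.sendMessage([0x90, 60, 100]); // note on, channel 1, middle C, velocity 100
setTimeout(() => {
  output.sendMessage([0x80, 60, 0]); // note off
  output.closePort();
}, 500);
```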

  2. How can we adjust the knobs and parameters of a VST plugin in node.js without its GUI, turning it into a custom instrument or something we can use in code? Is that even possible via MIDI? Or maybe every VST synth has its own project files that can be parsed/changed?

Most VSTi automation should be possible via MIDI (control change or sysex), but that depends on the actual synth.
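
From node.js that would look something like the sketch below. The controller number and sysex bytes are made-up examples, not real mappings; check the synth's MIDI implementation chart for the actual ones:

```javascript
// Sketch: automate synth parameters via MIDI control change and sysex,
// using the "midi" npm package. The CC number and sysex payload are
// placeholder assumptions; every synth maps these differently.
const midi = require('midi');

const output = new midi.Output();
output.openPort(0);

// Control change on channel 1: status 0xB0, controller number, value 0-127.
// CC 74 is conventionally "brightness"/filter cutoff, but verify per synth.
output.sendMessage([0xB0, 74, 100]);

// Some synths also accept sysex for parameters CC can't reach.
// This is a made-up message just to show the 0xF0 ... 0xF7 shape:
output.sendMessage([0xF0, 0x7d, 0x01, 0x02, 0x03, 0xF7]);

output.closePort();
```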

  3. How can I render the audio data to .wav when using node.js? I saw that it is pretty easy with SoX when working with 4klang or WebAssembly, but here we have the Recorder class; does that mean it records in real time only? How could I get around this and just render the audio data without waiting for a play-through?

When I rendered audio from ZynAddSubFX, I connected the audio output through JACK into Audacity and recorded in real time. If using a VSTi, you can of course call the audio render method (processReplacing()) of the VSTi and render directly to either an audio device or a file. I have an example of this in Java here: https://github.com/petersalomonsen/frinika/blob/master/src/com/frinika/jvstsynth/FrinikaJVSTSynth.java#L305

I would assume something similar could be done from node.js.
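
Writing the rendered samples out as a .wav is not much code in node.js either. A minimal sketch, with a sine wave standing in for whatever the synth actually renders:

```javascript
// Sketch: render two seconds of audio and write it as a 16-bit mono .wav.
// The sine loop is a stand-in for the synth's render call.
const fs = require('fs');

const sampleRate = 44100;
const numSamples = sampleRate * 2;

// Replace this loop with the actual render output
const samples = new Float32Array(numSamples);
for (let n = 0; n < numSamples; n++) {
  samples[n] = 0.5 * Math.sin(2 * Math.PI * 440 * n / sampleRate);
}

// Convert floats in [-1, 1] to 16-bit signed PCM
const pcm = Buffer.alloc(numSamples * 2);
for (let n = 0; n < numSamples; n++) {
  const s = Math.max(-1, Math.min(1, samples[n]));
  pcm.writeInt16LE(Math.round(s * 32767), n * 2);
}

// Minimal RIFF/WAVE header for mono 16-bit PCM
const header = Buffer.alloc(44);
header.write('RIFF', 0);
header.writeUInt32LE(36 + pcm.length, 4);
header.write('WAVE', 8);
header.write('fmt ', 12);
header.writeUInt32LE(16, 16);             // fmt chunk size
header.writeUInt16LE(1, 20);              // audio format: PCM
header.writeUInt16LE(1, 22);              // channels: mono
header.writeUInt32LE(sampleRate, 24);     // sample rate
header.writeUInt32LE(sampleRate * 2, 28); // byte rate
header.writeUInt16LE(2, 32);              // block align
header.writeUInt16LE(16, 34);             // bits per sample
header.write('data', 36);
header.writeUInt32LE(pcm.length, 40);

fs.writeFileSync('out.wav', Buffer.concat([header, pcm]));
```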

  4. Could you advise or share links/info/resources on where I could learn more about how to manipulate VST and other audio plugins with just code, no GUI, DAW etc.? I am really fascinated by the idea of making high quality music with code and no GUI, but I cannot wrap my head around how it could actually work; for example, how can we actually adjust all the knobs and parameters of a plugin to make a custom instrument/sound from it? I tried to search for more info about such things, but currently there is not much out there. Are there similar projects/packages that work with VSTs (or similarly high quality plugins) that you know of?

The experience I have with VSTs is primarily from the time I worked on Frinika, as mentioned in the answer to the previous question. I used JVSTHost for controlling the VST, and it's possible to control both the MIDI and the rendering of audio that way.

  5. I may have misunderstood something, but I noticed that for the WebAssembly version you are writing custom instruments in TypeScript. Could we actually use VST instruments compiled to WASM? I know there are WAMs (https://www.webaudiomodules.org/wamsynths/); could we use them without their GUIs?

Yes, I'm writing my synths in AssemblyScript. I guess some VSTs could be compiled to WASM using Emscripten, but it's always a bit of an effort, especially for the UI. The audio part should be possible to connect to an AudioWorklet, just as I do with my WebAssembly synths.
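
On the AudioWorklet side that wiring boils down to something like this sketch. The export names (fillSampleBuffer, getSampleBufferPtr) are hypothetical stand-ins; a real module (mine, or a VST compiled with Emscripten) will have its own:

```javascript
// synth-processor.js — sketch of wiring a WASM synth into an AudioWorklet.
// Assumes a small module that needs no imports and exports hypothetical
// fillSampleBuffer() / getSampleBufferPtr() functions.
class WasmSynthProcessor extends AudioWorkletProcessor {
  constructor(options) {
    super();
    // wasm bytes are passed from the main thread via processorOptions
    const module = new WebAssembly.Module(options.processorOptions.wasmBytes);
    this.instance = new WebAssembly.Instance(module, {});
  }

  process(inputs, outputs) {
    const exports = this.instance.exports;
    exports.fillSampleBuffer(); // render the next 128 frames into WASM memory
    const rendered = new Float32Array(
        exports.memory.buffer, exports.getSampleBufferPtr(), 128);
    outputs[0][0].set(rendered); // copy into the worklet's output channel
    return true;
  }
}
registerProcessor('wasm-synth', WasmSynthProcessor);
```

On the main thread you'd then call audioContext.audioWorklet.addModule('synth-processor.js') and create the node with new AudioWorkletNode(audioContext, 'wasm-synth', { processorOptions: { wasmBytes } }).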

Thanks again for reading through all of this! :)

You're welcome! Hope that you get something out of my answers :-)

Thank you for getting back to me! Thanks for pointing me to Frinika, it actually looks very cool and has all the features I would ever need for making my ambient/electronic type of music. But again, I don't know Java, and separating out the GUI to turn it into a coding music environment would probably be a paaain :) Too bad node.js doesn't have a mature VST host library similar to JVSTHost.
In any case, to get a better understanding of things I should just dive into trying and doing what you started with node.js. I have already forked this repo, so if you are interested I could eventually push my updates if I ever manage to hack something together. I suppose you are now fully concentrated on the wasm version, so I guess the node.js version of the project could only benefit from this. I'm not after the software/live coding environment itself but rather its results (making music), so I would be happy to contribute to the community :) I believe you've already put together quite a mature sequencer and way of turning code into music logic, but I would still be happy to get some guidance/help along the way if I get stuck! Cheers!

Closing this for now, but always happy to hear more additions/comments/directions! :)

Yeah, I'm mostly into the WASM version now and have everything in the browser. It's of course always interesting to see what you have going on in your fork, even if it's not merged back into my repo. I'm also most interested in the result (producing music); creating the live coding environment is just a way of getting there.

Currently I have a side branch for making old-school Amiga ProTracker music. Not sure if this will be merged either, but it's an interesting attempt at writing mods and instruments in JavaScript/AssemblyScript and using xmp (compiled to WebAssembly with Emscripten) for playback.

But the main thing will be the WASM music environment in the browser, and the next thing I want to add is support for using audio samples in the music.

cheers :-)