moinejf/abc2svg

Identifying music note types

Opened this issue · 9 comments

Hi,

Is the note-type data for the music rendered in SVG computed and available?

I mean, if the first note is a crotchet and the second a minim, would your library indicate or compute that? Is that data accessible?

Secondly, if there is a chord on the staff, like the notes C, E, G on the same vertical line, does abc2svg know that, and is that data also available?

Currently I have access to the play data (an array of floats), but that doesn't seem to carry enough information. However, I see that there are two other data structures, "tsfirst" and "voice_tb"; perhaps one of them has the information I'm looking for?

I'm using code like this to get the array of floats:

function gotABCModel(tsfirst, voice_tb, anno_type, info) {
    console.log("ABC model");

    var to_audio = new play1.ToAudio();

    to_audio.add(tsfirst, voice_tb);
    play1.player_model = to_audio.clear();
}

The to_audio.clear() function returns the play data. I see that the clear() function has been deprecated in newer versions?

bwl21 commented

Hi,

there is a class AbcJSON which provides a method to get the music model as JSON. It is maintained by @moinejf and is a more stable access path. You can save this JSON and inspect it to see which information is provided.

Why do you think that clear() is deprecated? I could not find a hint of this.

Here is the Opal code I am using to extract the model; you can infer how it would look in plain JavaScript. %x{ } contains JavaScript code.

    #
    # we use gen_json to prepare the JSON model
    #
    def _callback_get_abcmodel(tsfirst, voice_tb, music_types, info)

      json_model = ""
      %x{
          var abcmidi = new AbcMIDI();
          abcmidi.add(#{tsfirst}, #{voice_tb});
          var to_json = new AbcJSON();
          #{json_model} =  to_json.gen_json(#{tsfirst}, #{voice_tb}, #{music_types}, #{info});

          var to_audio = new ToAudio()
          to_audio.add(#{tsfirst}, #{voice_tb})
          #{@player_model} = to_audio.clear()
      }

      @abc_model = JSON.parse(json_model)

      if $log.loglevel == "debug"
        $log.debug(@abc_model.to_json)
      end
      @abc_model
    end

Hi @GitterHubber and @bwl21,
The music symbols may be accessed by two means: anno_start()/anno_stop(), and get_abcmodel().

The first two functions are called when the symbols are generated, just before or after creating their SVG representation.
The last parameter of these functions is a pointer to the music symbol (an object). This object contains all the information about the symbol: type (note, rest, key signature...), duration, array of notes (pitch, accidental), staff reference, voice reference, voice links, time links... (For reference, a chord is a symbol of type NOTE with several notes in its array of notes.)
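As a small sketch of using that last parameter (the parameter names other than the final symbol pointer are assumptions, not the documented signature), an anno_stop() handler could inspect each symbol like this:

```javascript
// Sketch of an anno_stop() handler. All parameter names are
// assumptions except the last one, which (as described above) is
// the music symbol object itself.
function describeSymbol(s) {
    // a chord is a NOTE symbol whose 'notes' array has several entries
    var kind = s.notes && s.notes.length > 1 ? "chord" : "note"
    return {
        kind: kind,
        dur: s.dur,                          // duration in abc2svg units
        pits: (s.notes || []).map(function (n) { return n.pit })
    }
}

// hypothetical user hook object
var user = {
    anno_stop: function (type, istart, iend, x, y, w, h, s) {
        if (type == "note")
            console.log(describeSymbol(s))
    }
}
```

The handler only reads the fields listed above (dur, notes, pit), so the same helper works on symbols obtained from get_abcmodel() as well.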

The second function (get_abcmodel) is called just before the SVG generation of a tune.
The first parameter, tsfirst, is the first music symbol in time. The music symbols are linked in time by the two pointers 'ts_next' and 'ts_prev'.
The second parameter, voice_tb, is the voice table (array of objects). Among other attributes, 'sym' is the first symbol of the voice. The symbols are linked in their voice by the two pointers 'next' and 'prev'.
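To make the two linkages concrete, here is a plain JavaScript sketch that walks the time-ordered list from tsfirst and collects every note or chord (the NOTE type value 8 and the field names are taken from the JSON excerpt and attribute descriptions in this thread):

```javascript
// Walk the time-ordered symbol list via ts_next and collect notes.
// NOTE == 8 matches the "type": 8 seen in the JSON excerpt below.
var NOTE = 8

function collectNotes(tsfirst) {
    var out = []
    for (var s = tsfirst; s; s = s.ts_next) {
        if (s.type != NOTE)
            continue
        out.push({
            kind: s.nhd > 0 ? "chord" : "note",   // nhd = notes.length - 1
            time: s.time,
            dur: s.dur,
            pits: s.notes.map(function (n) { return n.pit })
        })
    }
    return out
}
```

The per-voice list would be walked the same way, starting from voice_tb[v].sym and following 'next' instead of 'ts_next'.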

On the other side, there are two mechanisms for playing.

The old mechanism, which is still used (for some time) in abcemb{,1,2}.js, does two generations: a first one to create the SVG images, and a second one for playing. This second generation uses to_audio.add() in get_abcmodel() to create the 'play data' (array of floats). This array is returned by to_audio.clear().

The new mechanism is currently implemented in the simple editor only. There is only one generation. On anno_stop(), the symbol and its location in the source file are memorized. The source-file location is stored in the class of a rectangle drawn around the music symbol. This makes it possible to identify the symbols in the page frame from mouse events. Then, for playing, the function abcplay.play() is called with pointers to the start and stop symbols. Internally, on its first call, this function generates the MIDI pitches and other play attributes (ties, repeat sequences...) before following the time links and doing the sound generation.

@bwl21

why do you think that clear() is deprecated? I could not find a hint for this.

I say this because I think snd-1.js, which is the new sound API, no longer uses clear(); at least that's the indication I got.
In the ToAudio() function of snd-1.js I see:

   clear: function() {		// useless - for compatibility
   },

@moinejf and @bwl21
Thanks for your feedback, it's really appreciated.
Let me go through your answers in detail; I will get back soon.

Hi,
So I tried the JSON method first to see what I get, using "json-1.js".

I'm pasting an excerpt of the JSON string; I get symbol objects like so:


        {
          "type": 8,
          "fname": "HB",
          "stem": 0,
          "multi": 0,
          "nhd": 0,
          "xmx": 0,
          "istart": 113,
          "notes": [
            {
              "pit": 16,
              "shhd": 0,
              "shac": 0,
              "dur": 192
            }
          ],
          "dur_orig": 192,
          "dur": 192,
          "v": 0,
          "st": 0,
          "time": 768,
          "pos": {
            "dyn": 0,
            "gch": 0,
            "gst": 0,
            "orn": 0,
            "stm": 0,
            "voc": 0,
            "vol": 0
          },
          "iend": 115,
          "seqst": true
        }

I see that type 8 indicates that it is a "note".
However, I do not see note-value information, i.e. whether it is a quaver, minim, crotchet, etc.
Is this information available in the library, or is there an enum in the library that identifies the music symbols?

The duration will probably give that information; I mean, we can compute the note value from the duration and tempo, but I was just wondering if you have already pre-computed it and kept it available for use.

Most of the constants are defined in the file core/abc2svg.js.
The constant BLEN (base length = 1536) is the duration of a whole note. In the note/rest symbols, 'dur' is the duration of the symbol. For instance, 192 is BLEN/8, i.e. an eighth note or a quaver.

The conversion between the note duration and its head type and dots is done in the function 'identify_note()' in core/music.js.
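As an illustration only (this is a sketch, not the library's identify_note(), and the name table is an assumption built from BLEN = 1536), the note value and dot count can be recovered from 'dur' like this:

```javascript
// Sketch: recover the note value and dot count from a duration,
// assuming BLEN = 1536 (whole note) as stated above.
var BLEN = 1536
var NAMES = {
    1536: "whole (semibreve)",
    768: "half (minim)",
    384: "quarter (crotchet)",
    192: "eighth (quaver)",
    96: "sixteenth (semiquaver)",
    48: "thirty-second (demisemiquaver)"
}

function noteFromDur(dur) {
    // a note with d dots lasts base * (2^(d+1) - 1) / 2^d,
    // so try 0, 1 and 2 dots and see which base value matches
    for (var dots = 0; dots <= 2; dots++) {
        var base = dur * Math.pow(2, dots) / (Math.pow(2, dots + 1) - 1)
        if (NAMES[base])
            return { name: NAMES[base], dots: dots }
    }
    return null                      // tuplet or unusual duration
}
```

For example, dur 192 resolves to a plain quaver, while 288 (= 192 * 3/2) resolves to a dotted quaver.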

More information about the attributes of the symbol above:

  • type 8 is a NOTE
  • stem is the stem direction (-1 down, 1 up, 0 unknown)
  • multi is 1 for the top voice in a staff, -1 for a secondary voice
  • nhd is the number of note heads (= length of the array 'notes' - 1)
  • istart and iend are the start and end indexes of the note in the ABC source
  • v is the voice number
  • st is the staff number
  • time is the time of the note in the tune
  • pos gives the position of extra elements (dynamics, chord symbols, ornaments..) in the staff (above, below, invisible)
  • seqst is true when the symbol is the first at its time in the tune

In the note array:
  • pit is the pitch of the note head in staff position (with the treble clef, 'C' is 16, 'D' is 17,.. 'c' is 23...)
  • acc is the accidental when present (1 sharp, -1 flat, 3 natural)
  • shhd and shac are the horizontal shifts of the head and the accidental
  • dur is the duration of the head (may be different per head)

Thanks for the extended reply

  • pit is the pitch of the note head in staff position (with the treble clef, 'C' is 16, 'D' is 17,.. 'c' is 23...)

I see that there are two different pitch values. One pitch value is in the MIDI play data, which seems to use the MIDI pitch numbers where middle C (C4) is 60;
and then there is the pitch value in the JSON data, where C4 has some other value.
Is C4 (middle C) 16 in the JSON data?
How was this value derived?

Secondly,
Is there an ID in the JSON array that links it to the MIDI play data array?

The MIDI pitch is computed from the ABC pitch after handling the accidentals and the play transposition.
The staff pitch (you are right, C4 is 16) ignores the accidentals, but not the display transposition.
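A sketch of the relation between the two pitch values (assuming C4 = 16 as confirmed above, and deliberately ignoring accidentals and transposition, so this is not the library's actual conversion):

```javascript
// Convert an abc2svg staff pitch to a MIDI pitch, ignoring
// accidentals and transposition.
// Staff pitch: C4 = 16, one unit per diatonic step.
var C_OFFSETS = [0, 2, 4, 5, 7, 9, 11]      // semitones of C,D,E,F,G,A,B

function staffToMidi(pit) {
    var fromC4 = pit - 16                   // diatonic steps from middle C
    var octave = Math.floor(fromC4 / 7)     // octaves above/below octave 4
    var step = ((fromC4 % 7) + 7) % 7       // 0..6 = C..B
    return 60 + 12 * octave + C_OFFSETS[step]   // MIDI: C4 = 60
}
```

So staff pitch 16 (C4) maps to MIDI 60, and staff pitch 18 (E4) maps to MIDI 64; any sharp or flat would then shift the result by a semitone.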

I don't see what you mean by 'MIDI play data array'. The JSON representation lacks a lot of information (pointers). For correct rendering (display or play), it is better to use the raw structures.

The staff pitch (you are right, C4 is 16) ignores the accidentals, but not the display transposition.

Is the staff pitch number a universal concept, or is it only used in abc2svg?
I mean, I cannot find more information about staff pitch numbers on the web.
So if C4 is 16, I guess D4, E4, F4... are 17, 18 and 19, and so on.
But what about C0, C1, C2: are those numbered negatively?

I'd never imagined that the staff pitch number would be searched for on the web!
It is just a number that identifies the notes.

Some countries use C,D,E,F.., some others use Do,Re,Mi,Fa.. These names are not easy to use in computers. Instead, music software uses numbers.
But a note is fully identified only when you also know its octave, as in 'C4'. In computers, it is simpler to have a single number. The origin of the numbering is not important. In abc2ps (the base of abcm2ps and abc2svg), the origin was chosen so that the numbers of the most common notes could be stored in a byte, i.e. have a value between 0 and 127. For some more reasons, the value 18 was taken for the note E4.
This number may then be mapped directly onto the staff: with the common treble clef, 18 is on the bottom line, 20 on the 2nd line, and so on. The note number, now known as the staff pitch, is adjusted when some other clef is used.
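So, as a sketch (assuming C4 = 16 and one number per diatonic step, as stated above), very low pitches do simply go negative:

```javascript
// Map a staff pitch back to a scientific note name, assuming
// C4 = 16; accidentals are ignored, since the staff pitch
// itself ignores them.
var LETTERS = ["C", "D", "E", "F", "G", "A", "B"]

function staffPitchName(pit) {
    var fromC4 = pit - 16                    // diatonic steps from middle C
    var octave = 4 + Math.floor(fromC4 / 7)
    var letter = LETTERS[((fromC4 % 7) + 7) % 7]
    return letter + octave
}

console.log(staffPitchName(16))    // C4
console.log(staffPitchName(18))    // E4, bottom line of the treble staff
console.log(staffPitchName(-12))   // C0: low pitches are negative numbers
```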