Last week (1/31 - 2/7):
- madmom library: audio => chroma => chord symbol & timestamp => root note & timestamp => MIDI for bass (last step ~90% done)
- Install Open-Unmix: 0% done (skipped)
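A minimal Python sketch of the chord-to-root step, assuming labels in madmom's chord-output style ("C:maj", "A#:min", "N" for no chord); the bass octave and the segment times are made up for illustration:

```python
# Hypothetical sketch: turn madmom-style chord labels into bass root
# MIDI notes. The label format ("C:maj", "A#:min", "N") follows madmom's
# chord recognition output; octave 2 is an arbitrary bass register.
PITCH_CLASSES = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                 "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def chord_root_to_midi(label, octave=2):
    """Return the chord root as a MIDI note number, or None for no-chord."""
    if label == "N":
        return None
    root = label.split(":")[0]
    # Convention where C4 = 60, so C2 = 36
    return 12 * (octave + 1) + PITCH_CLASSES[root]

# Segments as (start_sec, end_sec, label), e.g. from the chord detector
segments = [(0.0, 1.9, "C:maj"), (1.9, 3.8, "A:min")]
roots = [(s, e, chord_root_to_midi(lab)) for s, e, lab in segments]
```

Each root note then keeps the chord segment's start and end times, which is what the MIDI-for-bass step consumes.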
Meeting discussion:
- What kind of audio as input? with or without drums?
- Genre specific or not? (Metal?)
- If this is user-driven and uses a summed track, is it really easy to use outside a Python environment? (genre identification)
- Is source separation necessary?
- Tempo?
Next Week (2/8 - 2/14):
- Check out Studio One
- Finish MIDI for bass
- Come up with an alternative plan if one of the current libraries doesn't work (individual tracks?)
- Look at many genres, find music textbook
- Start with onset/drum/tempo...
Last week (2/8 - 2/14):
- Studio One has very good chord detection and basic bass line generation. It doesn't consider rhythmic information unless the user writes it out.
- Found the book The Bassist's Bible. It has fairly detailed coverage of bass patterns in many popular music genres. Metal and Punk have shorter chapters than the other genres, so these two would be easier to implement. Some electronic genres also have simple bass patterns; I can go in that direction if sound design is not part of the project.
Last week (2/15 - 2/21):
- Tried out onset, beat, and downbeat detection.
- Onset detection seems to work nicely. For analyzing a full track, a threshold around 7 is good; for content below 150 Hz, a threshold below 1.
- Converted onset locations to MIDI.
- Manually quantized onsets to 16th notes.
- Manually joined onsets and root notes (a few bars) together. It sounds pretty good.
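The manual 16th-note quantization can be sketched as follows, assuming a constant tempo and a known first-beat offset (both assumptions, since tempo tracking is still an open item):

```python
# Sketch of the 16th-note quantization step: snap onset times (seconds)
# to the nearest 16th note. Constant tempo and first-beat offset are
# assumed; in practice these would come from tempo/beat tracking.
def quantize_onsets(onsets, bpm, first_beat=0.0):
    sixteenth = 60.0 / bpm / 4.0          # duration of one 16th note
    return [first_beat + round((t - first_beat) / sixteenth) * sixteenth
            for t in onsets]

# At 120 BPM one 16th note lasts 0.125 s
grid = quantize_onsets([0.0, 0.13, 0.26, 0.49], bpm=120)
# -> [0.0, 0.125, 0.25, 0.5]
```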
Next Week (2/22 - 2/28):
Possible directions:
- Drum separation -> drum transcription, replace onset detection
- Beat detection -> downbeat detection -> bass pattern based on style
- Tempo tracking -> auto quantization
Last week (2/22 - 2/28):
- Worked on beat detection. The result is not as accurate as expected.
Meeting
- A plan is needed
- Test on the beat detection
- Evaluation method for this project
Next Week (2/28 - 3/7):
- A plan
- Test on the beat detection
- Evaluation method
Last week (2/28 - 3/7):
- Tested and evaluated different beat detection algorithms
- In this project, beat detection only works when a constant tempo is assumed and the audio is edited accordingly
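A sketch of what "assuming a constant tempo" means for the evaluation: build an ideal beat grid from a BPM and a first-beat time, then measure how far each detected beat drifts from the grid. The 70 ms tolerance and all names are illustrative, not the project's actual evaluation code:

```python
# Hypothetical constant-tempo evaluation: deviation of detected beats
# from an ideal grid defined by (first_beat, bpm).
def grid_deviation(detected_beats, bpm, first_beat=0.0):
    period = 60.0 / bpm
    devs = []
    for t in detected_beats:
        n = round((t - first_beat) / period)   # nearest grid index
        devs.append(abs(t - (first_beat + n * period)))
    return devs

# At 120 BPM the grid is 0.0, 0.5, 1.0, ...
devs = grid_deviation([0.01, 0.52, 0.98], bpm=120)
hit_rate = sum(d < 0.07 for d in devs) / len(devs)   # 70 ms tolerance
```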
Meeting
- Structure identification combined: not a good idea
- The note density idea is good; is it more than a note/tempo ratio? More evidence is needed to support it
Next Week (3/7 - 3/14):
- Build a baseline this week, root + onset
- Export 10 generated pieces, several genres
- Think of a bigger plan that is not limited by the current technology
- Find a path to the plan
Last week (3/7 - 3/14):
- Built the baseline system
- Tested it on ten songs
- Good: Blues, Country, Hiphop, Electronic, Pop
- Bad: Metal and Rock due to the small dynamic range; Reggae due to the short chord length
Meeting
- Variation reduction needs more explanation
- Based on what evidence do you derive your definition of complexity? Why is your definition not arbitrary, and why is it more fitting than possible alternative definitions?
- How do you actually plan to model/approximate this complexity algorithmically, and how do you verify that your model fits the definition?
- Melody??
Next Week (3/15 - 3/21):
- Melody ideas
- Evidence of complexity mapping
- Plan for complexity approximation
- Source separation
Last week (3/15 - 3/21):
- Found a MIDI dataset
- Genres labelled on the webpage: Rock, Pop, Hip-Hop, R&B/Soul, Classical, Country, Folk, Jazz, Blues, Dance/Electronic, Punk, New Age
- Many files have constant velocity
- Tempo and time signature are embedded in the MIDI files
- Installed Open-Unmix
- Found some literature about note density and velocity
Meeting
- Randomness is no good
- Key detection needed for passing notes
- What is the distance between root notes?
- Stylistic selection?
- A list of priority
- A schedule
Next Week (3/21 - 3/28):
- A list of priority
- A schedule
Last week (3/21 - 3/28):
- Implemented the basic note density search algorithm
- Improved the note duration so it matches the next note
- Came up with a schedule
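A sketch of the two steps above, under stated assumptions: the density search is approximated here as keeping the strongest onsets until a target notes-per-second count is reached, and duration matching extends each note to the next note's start. The onset strengths and the density target are invented for illustration:

```python
# Hypothetical density search: keep the strongest onsets up to a
# target note density (notes per second), preserving time order.
def select_by_density(onsets, strengths, target_density, length_sec):
    max_notes = max(1, int(target_density * length_sec))
    by_strength = sorted(range(len(onsets)), key=lambda i: -strengths[i])
    keep = sorted(by_strength[:max_notes])
    return [onsets[i] for i in keep]

# Duration matching: each note lasts until the next note starts.
def match_durations(note_times, last_end):
    ends = note_times[1:] + [last_end]
    return [(t, e - t) for t, e in zip(note_times, ends)]

times = select_by_density([0.0, 0.25, 0.5, 1.0], [0.9, 0.2, 0.8, 0.7],
                          target_density=1.5, length_sec=2.0)
notes = match_durations(times, last_end=2.0)
```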
Meeting
- Make sure the algorithm converges
- Understand loudness metrics
Next Week (3/28 - 4/4):
- Implement structure identification
- Implement loudness analysis
Last week (3/28 - 4/4):
- Integrated structure identification (no proper evaluation yet)
- Implemented LUFS measurement for each section
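For context, a simplified per-section loudness sketch. The actual measurement was LUFS, which requires K-weighting and gating (typically via a dedicated library); plain RMS in dB is used here only to show the per-section split:

```python
import math

# Simplified stand-in for per-section LUFS: RMS level in dBFS per
# structural section. Real LUFS adds K-weighting and gating.
def rms_db(samples):
    rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-12))

def section_loudness(samples, rate, sections):
    """sections: list of (start_sec, end_sec); returns one dB value each."""
    return [rms_db(samples[int(s * rate):int(e * rate)])
            for s, e in sections]

# Toy example: a quiet then a loud constant-amplitude section at 4 Hz
audio = [0.1] * 4 + [0.5] * 4
levels = section_loudness(audio, rate=4, sections=[(0, 1), (1, 2)])
```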
Meeting
- no evaluation is fine
- dBA???
- Is the library reliable?
Next Week (4/4 - 4/11):
- GUI
- Real-time playing
Last week (4/4 - 4/11):
- Playback (need: reload of midi, a slider for playback position?)
- Display of music structure
- Audio mix control
- One slider for density
- Create all sliders (don't do anything yet)
Meeting
- MIDI/audio sync?
- Label time axis
- Label gain knob
- A normal gain knob is log scale
- number the sliders with each section!
- Overall slider?
- playback control?
- Stop
- Quantized slider
Next Week (4/11 - 4/17):
- Make all sliders work!
- Clearer explanation
- Figure out the mapping
Last week (4/11 - 4/17):
- Failed to improve the playback (unable to do anything to the MIDI playback)
- All sliders work now
- Provided suggested slider positions based on loudness
- slider^3 gain control
- Labelled the time axis
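The slider^3 mapping can be sketched as below: cubing the normalized slider position cheaply approximates a logarithmic gain response, giving finer control at low levels (halfway on the slider lands near -18 dB rather than the -6 dB of a linear mapping):

```python
import math

# slider^3 gain control: map a normalized slider position (0..1) to a
# linear amplitude gain whose curve roughly mimics a log-taper knob.
def slider_to_gain(pos):
    return pos ** 3

def gain_db(pos):
    """Equivalent gain in dB (illustrative helper, -inf at zero)."""
    g = slider_to_gain(pos)
    return 20 * math.log10(g) if g > 0 else float("-inf")
```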
Meeting
Next Week (4/17 - 4/):
- Splice the midi files
- New mapping of the sliders
- try on 30 songs, pick the good and bad ones
- Slider labels
- passing notes
Last week (4/17 - 4/25):
- Tested on 30 songs
- New slider mapping
- Section playback
- passing notes
Meeting
- Number the visual segments
- DONE: gain knob initial position
- Without bass, loud bass, normal bass
- 60s talk: motivation, outcome, methods
- Poster: all 3rd-party libraries; audio and corresponding settings