
Python audio and music signal processing library. This is a fork of CPJKU/madmom adding support for synchronous tracking of vocal note onsets and metrical position within the bar. The model used is a dynamic Bayesian network.

madmom (with synchronous note onsets and beat position)

This fork adds a hidden state for a musical note (in short, a note state). The design is, however, generic enough to be used with any other musical concept as the hidden state. For this purpose the NoteTransitionModel and GMMNotePatternTrackingObservationModel classes are deliberately kept as simple dummy implementations.

The joint bar-note state space is the Cartesian product of the existing BarStateSpace and the new NoteStateSpace.

NOTE: There is no separate class for the joint BarNoteStateSpace; the Cartesian product is handled inside BarNoteTransitionModel and GMMNotePatternTrackingObservationModel.

NOTE: For the sake of simplicity, the model is implemented to work with only one rhythmic pattern.
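To make the Cartesian-product idea concrete, here is a minimal, self-contained sketch (not the fork's actual code): it flattens a bar state index and a note state index into one joint index and builds a joint transition matrix from two marginal ones. The state counts, the flattening scheme, and the independence between bar and note transitions are illustrative assumptions only; in the actual model the note transitions may well depend on the position in the bar.

```python
import numpy as np

# Hypothetical sizes; these stand in for the number of states in the
# bar state space and the note state space of the fork.
num_bar_states = 6    # e.g. positions within the bar (all tempi flattened)
num_note_states = 2   # e.g. "no note" / "note sounding"

num_joint_states = num_bar_states * num_note_states

# Flattening scheme (an assumption): joint = bar * num_note_states + note.
def to_joint(bar_state, note_state):
    return bar_state * num_note_states + note_state

def from_joint(joint_state):
    return divmod(joint_state, num_note_states)

# Assuming (purely for illustration) that bar and note transitions are
# independent, the joint transition probability factorises into the
# product of the two marginal transition probabilities.
bar_trans = np.full((num_bar_states, num_bar_states), 1.0 / num_bar_states)
note_trans = np.array([[0.9, 0.1],
                       [0.3, 0.7]])

joint_trans = np.zeros((num_joint_states, num_joint_states))
for b_from in range(num_bar_states):
    for n_from in range(num_note_states):
        for b_to in range(num_bar_states):
            for n_to in range(num_note_states):
                joint_trans[to_joint(b_from, n_from), to_joint(b_to, n_to)] = (
                    bar_trans[b_from, b_to] * note_trans[n_from, n_to])

# Each row of the joint matrix is still a valid probability distribution.
assert np.allclose(joint_trans.sum(axis=1), 1.0)
```

The sketch only shows how two small state spaces combine into one joint index and transition matrix; the fork keeps the individual state spaces and performs this combination inside its transition and observation models instead of materialising a joint state-space class.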

Modifications to the original code:

Usage

bin/GMMNotePatternTracker
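A minimal sketch of how the script might be invoked, assuming it follows the single-file interface of other madmom bin scripts; the "single" mode and the input path are assumptions for illustration only:

```python
# Hypothetical call of the fork's command-line tracker via subprocess.
# Assumption: the script follows the usual madmom bin-script interface
# ("single" mode, one audio file in, detections printed to stdout);
# "input.wav" is a placeholder path.
import subprocess

subprocess.run(
    ["python", "bin/GMMNotePatternTracker", "single", "input.wav"],
    check=True,
)
```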


Citation

Georgi Dzhambazov, André Holzapfel, Ajay Srinivasamurthy, Xavier Serra: "Metrical-Accent Aware Vocal Onset Detection in Polyphonic Audio", in Proceedings of the 18th International Society for Music Information Retrieval Conference (ISMIR), 2017.

NOTE: This repository works together with a companion repository based on pypYIN for pitch tracking.


The rest of the documentation describes madmom in general.
