Revision as of 13:04, 6 December 2020 by Nlfang


Applications of Hidden Markov Chains

Algorithmic composition is an emerging interdisciplinary research area that attracts many researchers and composers.

Recall the three fundamental problems mentioned above: the prediction problem, the decoding problem, and the learning problem. Music generation is normally treated as a learning problem: we know only the observation sequence (existing music in the world) and the set of states in the hidden Markov model (the information describing that music, including pitches, note values, loudness, etc.). Under these conditions, the question becomes: how do we construct a training procedure that can evaluate the fidelity of the model's automatic compositions?

One approach is a generative adversarial network (GAN), introduced by Ian Goodfellow in 2014. The basic idea resembles running a Turing test over and over, except that both the test taker and the grader are machines. The test-taker machine tries to generate music that fools the grader into believing it was written by a human; the grader machine, in turn, tries to distinguish machine-generated music from human-written music. During this process, the test-taker machine adjusts the parameters of its hidden Markov model to produce more authentic results, while the grader machine adjusts its parameters to become more sensitive to computer-generated music. Eventually the process reaches a point where the grader can no longer tell whether a piece is human-written or machine-generated.
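As a rough illustration (not part of the original article), the adversarial loop described above can be sketched with a toy one-parameter "test taker" and a logistic-regression "grader". All names, learning rates, and the scalar-pitch setup here are assumptions made for the sketch; a real system would adjust the parameters of a full generative model, such as an HMM over note sequences, rather than a single mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for "human music": MIDI-like pitches clustered near 60.
REAL_MEAN, REAL_STD = 60.0, 2.0
def real_batch(n):
    return rng.normal(REAL_MEAN, REAL_STD, size=n)

# Test-taker (generator): a single learnable mean, a hypothetical
# placeholder for the parameters of a real generative model.
gen_mu = 40.0
def fake_batch(n):
    return gen_mu + REAL_STD * rng.normal(size=n)

# Grader (discriminator): logistic regression on a normalized pitch.
w, b = 0.0, 0.0
def feature(x):
    return (x - 50.0) / 10.0
def d_prob(x):
    return 1.0 / (1.0 + np.exp(-(w * feature(x) + b)))

LR_D, LR_G, BATCH = 0.1, 1.0, 32
for step in range(2000):
    # Grader update: real samples labeled 1, generated samples labeled 0.
    xr, xf = real_batch(BATCH), fake_batch(BATCH)
    pr, pf = d_prob(xr), d_prob(xf)
    grad_w = np.mean((pr - 1.0) * feature(xr)) + np.mean(pf * feature(xf))
    grad_b = np.mean(pr - 1.0) + np.mean(pf)
    w -= LR_D * grad_w
    b -= LR_D * grad_b

    # Test-taker update: push generated samples toward D(x) = "real",
    # using the non-saturating loss -log D(G(z)); the 1/10 factor is the
    # chain rule through feature(). Gradient clipping keeps the loop stable.
    xf = fake_batch(BATCH)
    pf = d_prob(xf)
    grad_mu = np.mean(-(1.0 - pf) * w) / 10.0
    gen_mu -= LR_G * np.clip(grad_mu, -0.05, 0.05)

# After training, the generator's mean has drifted toward REAL_MEAN,
# and the grader can no longer separate the two distributions well.
```

The two alternating updates mirror the description above: the grader is trained to separate real from generated samples, and the test taker is trained against the grader's current judgment, so each side's improvement drives the other's.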

[Figure: Diagram of a GAN, courtesy of Pathmind]

To learn more about GANs, readers can consult the original 2014 paper by Ian Goodfellow.


Back to Markov Chains
