The Full Story
FRAGMENTS OF LYDIAN CLUSTERS - an excerpt for 7-string electric violin, delay, and AI
My exploration of music and machine learning has been a long one. For this piece, I recorded 127 musical sentences and cataloged them by scale, articulation, dynamics, and rhythm. I then trained an LSTM-based polyphonic model on them (using MIDI and Python) and asked it to take sentences from the Lydian folder in an order of its own choosing. What you hear in this piece is a minute-and-a-half excerpt of a multi-hour output generated by my computer, derived from my musical sentences. The piece's cover art was also generated by AI.
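To give a sense of the kind of model described above, here is a minimal sketch of an LSTM cell stepping over a short MIDI pitch sequence. This is a hypothetical illustration, not the author's actual training code: the pitch sequence, sizes, and class names are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Toy LSTM cell (hypothetical sketch, not the piece's real model)."""

    def __init__(self, input_size, hidden_size):
        self.hidden_size = hidden_size
        # One stacked weight matrix for the four gates:
        # input, forget, candidate cell, and output.
        self.W = rng.normal(0.0, 0.1, (4 * hidden_size, input_size + hidden_size))
        self.b = np.zeros(4 * hidden_size)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_size
        i = sigmoid(z[0:H])        # input gate
        f = sigmoid(z[H:2 * H])    # forget gate
        g = np.tanh(z[2 * H:3 * H])  # candidate cell state
        o = sigmoid(z[3 * H:4 * H])  # output gate
        c = f * c + i * g          # update cell state
        h = o * np.tanh(c)         # new hidden state
        return h, c

# Encode a toy "musical sentence" as one-hot MIDI pitch vectors
# (hypothetical data: a D Lydian fragment, D E F# G# A).
pitches = [62, 64, 66, 68, 69]
X = np.zeros((len(pitches), 128))
for t, p in enumerate(pitches):
    X[t, p] = 1.0

cell = LSTMCell(input_size=128, hidden_size=32)
h = np.zeros(32)
c = np.zeros(32)
for x in X:
    h, c = cell.step(x, h, c)

# h now summarizes the sentence; a full model would feed such states
# into a decoder that selects or continues sentences.
print(h.shape)
```

In practice, a polyphonic MIDI model would use a framework such as PyTorch and a richer encoding (chords, durations, dynamics), but the recurrence above is the core mechanism.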
As an active explorer of the field, I was invited to lecture on music and AI in the course "MAS.S66 F'22 | Computer Visions: Generative Methods for Creative Applications" at the MIT Media Lab, taught by Dr. Roy Shilkrot.