Each time there is a major shift in technology representing increases in productivity or serious streamlining of the work required to complete tasks, the conversation inevitably swings to how this technology will replace workers, eliminate jobs and basically end civilization as we know it. I will note that while technology has changed the way we live and work in myriad and impactful ways, the apple cart is still standing and the sun still rises in the East.
The replacement conversation in the music creation space is real and has tangible examples of a whole chain of jobs being replaced. Enter Suno, a web-based artificial intelligence tool for songwriters and composers that quickly and effortlessly creates finished, polished songs, which can then be adjusted and regenerated to meet whatever needs producers have. It represents a prima facie example of musician replacement.
There may also be a whole range of applications where the output from this and other AI programs is acceptable for use. For example, for movies, television, video games, elevator music, etc., perhaps AI-generated songs produced by a Suno-like engine can provide a very cost-effective alternative to producing the music with other, more labor-intensive options.
But that is not the side of Artificial Intelligence that I’m interested in. I’m looking to use technology to enhance what I do. Make me more effective. Give me superpowers I wouldn’t otherwise have. Improve the end result of my musical adventures. I’ve spent the past year exploring these objectives.
What did I find, you ask? A few key things. First, the average Digital Audio Workstation (DAW) in 2025 is a well-developed, mature piece of software that allows musicians to create excellent music wherever the computer is set up. Second, the quality of today’s sound libraries and plug-ins is outstanding and often hard to distinguish from the original instruments. Third, the addition of AI-based session musicians who can successfully shoulder the responsibility of the drums or piano sets up the DAW as a complete, always-available studio with competent session players waiting for your direction on the next take.
Once armed with these capabilities, I was able to go on a writing and song development tear. After I realized what great-sounding versions of my songs were emerging, I couldn’t stop making them, anxious to hear how the next one would come out. The software allowed me to experiment with the very specific instruments that I love as a fan of music, for example, the Steinway grand piano or the Hammond B3 organ.
One of the biggest surprises of this album development project was the Mellotron. For Father’s Day last year, my family took me to the Museum of Making Music, created by the folks who run the NAMM trade show every year. The museum houses a number of unique, often one-of-a-kind musical instruments. They had a Mellotron, a big white keyboard console that played actual recorded magnetic tapes of strings and horns. Popular in the late ’60s and ’70s, the Mellotron pretty much disappeared when electronic synthesizers came along. The instrument produced a uniquely compressed version of flutes or violins; I called it the “strings in a coffee can” sound. The first few notes of The Beatles’ “Strawberry Fields Forever” are played on a Mellotron.
The museum had an original Mellotron M400 on display. It was disassembled so you could see the clunky mechanism that played the tapes in response to pressing the keys. Seeing it renewed my curiosity about the instrument. I had always loved the Mellotron sound, and I indulged that obsession by going through the complete Wikipedia list of songs that use the instrument and listening to a bunch of them, largely by the 1970s prog bands that I loved. For example, David Bowie’s “Space Oddity” uses the 3 Violins version prominently.
I had a song, titled “Starburst Tragedy,” that was designed to be in the style of the band Yes, and it had a spot for a spacy keyboard or synth. I first imagined that spot as a string-sounding synth pad. Logic Pro has a selection of Mellotron sounds, pretty much covering all the versions of the instrument that were offered, and the one I liked most was the “3 Violins” version. I created a Mellotron track in the song and worked with the session player to add that unique sound and build some dynamics with it throughout the song. I love the results, with that much Mellotron layered on top of a guitar-driven song.
When I look back at the process and results of this particular song, it really exceeded my expectations for the creativity I could employ in making my songs come to life. I had not only a competent drummer able to provide a foundation for my songs, but also the ability to add very high-quality virtual instruments, very true to their originals; the Mellotron was a great example of this. The ease with which I was able to make my riffs and songs come alive as completed musical pieces stunned me. It fueled my creativity even more, opening up new avenues and allowing me to use professional versions of my favorite instruments, familiar, close-to-my-soul sounds, and their associated feelings, even if those sounds emanate from a 50-year-old source. Thanks to AI, I was able to create all of this in a band situation with other players. In my case, not only has AI enhanced my creativity, but I’m having a blast doing it.
By myself.
Alone in my little studio.
With talented AI musicians.
Still miss hanging out with the guys after playing, however.
Eric Lundbohm serves as musical director for “Fret Salad,” an innovative project blending human artistry with AI musicianship. The debut album arrives March 5, 2026, on all major streaming services. Visit www.fretsalad.com for more info.