Ideas and new technology are causing seismic shifts in the media industry. Where are we headed? What does it mean? Gabriella Mirabelli talks with the brightest minds in entertainment and in business. She meets the innovators, the risk-takers and the disruptors on the front lines of change from Hollywood, Wall Street, Silicon Valley and beyond. The future is coming to a screen near you.
Gabriella Mirabelli: Welcome to Up Next. I'm your host, Gabriella Mirabelli. My guest today is Inderjit Birdee, co-founder and Head of Strategy at AI Music. He's been pivotal throughout the company's early growth as it has revolutionised the way music is both produced and consumed. Indi has a background in finance and venture capital investing. His work at AI Music centres on executing the company's goals in a rapidly evolving media ecosystem.
Prior to AI Music, Indi led investment rounds and advised other early-stage companies within AI, media and music. His other passions include hip hop and sports, as well as spearheading growth and innovation across platforms. Thank you so much for joining us today.
Inderjit Birdee: Thank you for having me. It's a great experience to be here.
Gabriella Mirabelli: Your company is called AI Music, which sounds very smart, but also a bit nebulous, as AI can mean a lot of different things. So, what does AI mean in the context of what you're doing?
Inderjit Birdee: Well, I think going to the name is actually quite interesting because, at the company's inception, we were really toying with what do we call this company? So we set up a website, and we just called ourselves AI Music. And then we went to a conference and started to get a lot of interest surrounding that just because of the name, so we thought let's just run with it. And that's still the case today.
So what do we do? We have a set of technologies, all centred around machine learning and artificial intelligence tools, that we apply to music. Specifically, we have tools which can understand the DNA of a song – your sections, your BPMs, your tempos and much more granular information that we can extract from a song. I mean, if you can understand the DNA of something, you can then reverse engineer it.
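The "DNA" Indi describes is a set of measurable features extracted from audio. As a rough, hypothetical illustration – not AI Music's actual pipeline, which uses machine learning – here is a minimal Python sketch of one such feature: estimating tempo in BPM from a list of detected beat timestamps.

```python
# Hypothetical sketch of one tiny piece of song "DNA": tempo.
# Assumes beat onsets have already been detected (in seconds);
# real systems infer these from the audio itself with ML models.

def estimate_bpm(beat_times):
    """Estimate beats-per-minute from a sorted list of beat timestamps (seconds)."""
    if len(beat_times) < 2:
        raise ValueError("need at least two beats")
    # Average gap between consecutive beats, in seconds per beat
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    # Convert seconds-per-beat to beats-per-minute
    return 60.0 / avg_interval

# Beats exactly 0.5 s apart correspond to 120 BPM
print(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))  # 120.0
```

Once features like tempo, key and section boundaries are numbers rather than sound, they become inputs that software can search over, generate from, or reshape.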
In practical terms, it's meant that we've been using this technology to develop a range of modules. "Modules" is an internal word we like to use a lot; it means that we actually generate our music ourselves. And that process is quite interesting because, if we look back over the last four or five years, the approaches that generative-music companies take are quite nuanced. From our point of view, we've come up with a unique way of scaling not just the quantity of music but also its quality.
Our final area of focus when applying AI to music is the adaptation of music itself. It's all part of a journey: you can understand this DNA, you can generate something, but you can also tailor it, because it's now malleable. You can effectively reshape a song or a set of songs. Historically, music was something quite static; now, we've changed it into something much more dynamic.
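As a hedged illustration of what "malleable" music could mean once a song's structure is data, here is a hypothetical sketch (invented names and a deliberately simple rule, not AI Music's actual method) that reshapes a list of labelled song sections to fit a target duration, the way an editor might cut a track down for a short ad.

```python
# Hypothetical illustration of "adaptive" music: once a song's
# structure is data (labelled sections with durations), it can be
# reshaped to fit a new context, e.g. a 15-second ad slot.

def fit_to_duration(sections, target):
    """Greedily keep whole sections, in order, until the target duration is filled."""
    kept, total = [], 0.0
    for name, duration in sections:
        if total + duration <= target:
            kept.append((name, duration))
            total += duration
    return kept, total

song = [("intro", 4.0), ("verse", 12.0), ("chorus", 8.0), ("outro", 4.0)]
edit, length = fit_to_duration(song, 15.0)
print(edit, length)
```

A real adaptive system would do far more (crossfades, beat-aligned cuts, looping); the point is only that a song described as data can be programmatically tailored rather than used one-size-fits-all.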
Gabriella Mirabelli: I have so many questions from all of that. I've been scribbling furiously. First, though, why don't we back up: who is your buyer, and how are you addressing their need? What is it about you that makes them say, "Oh my God, you've solved this problem!"? I mean, it's fascinating that you're able to generate music. It's fascinating that you have adaptive music. I guess I'm trying to understand who buys it, and for what purpose.
Inderjit Birdee: Yeah, of course. Over the last couple of years, we've worked quite technically within the AI space to understand the range of tools we have. Music is, of course, a creative form, but it's also quite mathematical in the way that it is structured. Having said that, the music sync space and the usage of music can be quite difficult to navigate. For example, whether you're an agency or a brand, or even a regular YouTube blogger who wants to find and use music, it can be challenging sometimes.
Let's say you want to find a song. There are thousands of songs out there in the world and navigating and understanding specifically what you want can be challenging.
Gabriella Mirabelli: So is it a discovery engine, then? That's a phrase that would make sense to me: if I'm an agency and I'm looking for music, this would help me discover the track I need.
Inderjit Birdee: Yeah, exactly. So at the very first stage, we enable you, through the use of genetic algorithms, to source and find a song from our library. Even then, you would ordinarily have to try and find the licences for that, and within the music production space there's often a territory restriction or a ring-fence for a certain time or place.
Once you've jumped through those first two hoops, you have to edit the song: you have to find the most relevant seconds of a song to fit what you want to use it for. What's left is this outdated, one-size-fits-all usage of music, with a workflow that can take weeks.
The music industry has difficult workflows, and trying to find and navigate all of that is tricky – especially if you're a brand or agency looking to personalise your audio advertising on a mass scale.
Through our service-led model approach, we have the tools to help you navigate almost instantly to the song you want. There are very simple licensing structures in place, because it's all about scale.
Ultimately, we want our clients, such as brands and agencies, to have the freedom to operate and use music however they want. As a final stage, we have the post-production technology, which can literally fit the music to listener context. It's this whole idea of hyper-personalisation. If you completely democratise music – how it's developed, even how it's edited – you can achieve that. For us, the notions of hyper-personalisation and adaptive music are the real drivers behind our technology.