Catamount Veritas, April 2026

Volume 2, Issue 4

I have been deep in the world of guitar pedals recently, and that has led me to some really amazing voices. One of my recent favorites is Emily Hopkins. She plays the harp and has extensive experience running it through all kinds of pedals. And while not an engineer, she has worked with pedals long enough to understand the implications of the circuits that show up again and again. Recently, she covered AI guitar pedals.

It’s probably not a surprise that she is not a big fan. I have my own reservations about AI pedals, but I think I will reserve them until I have a chance to acquire one. Seems like a good opportunity for a livestream too, so stay tuned. Anyway, she argues, for example, that the relationship you have with a piece of gear over time is important to the creative process. That is both because you become comfortable with it and because you learn how to stretch its capabilities.

Here’s my own example: the Pultec EQP-1A. It is a rack-mounted, tube-based equalizer (EQ), and it has become known for a very cool hack. OK, technically not a hack because it’s an EQ trick in an EQ box, but it’s pretty close. On the low-frequency controls, there are boost and attenuate knobs that cover the same frequency bands. So you can boost and cut, say, 100 Hz at the same time. Why would you ever want to do that? Because the boost and attenuation curves have different shapes, turning up both controls doesn’t cancel out. Instead, it creates a low-end contour that can sound tighter and punchier, which is why it’s often used on bass or drums.
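
The curve-shape point is easy to demonstrate numerically. Below is a minimal sketch that models the boost and the cut as first-order shelving filters with different corner frequencies and gains. The specific numbers are invented for illustration, not measurements of a real EQP-1A; the point is only that two differently shaped shelves in series don’t null out.

```python
import numpy as np

def shelf_db(f, corner_hz, gain_lin):
    """Magnitude response in dB of a first-order analog low shelf
    H(s) = (s + gain*wc) / (s + wc): 'gain' at DC, unity at high frequency."""
    return 10 * np.log10((f**2 + (gain_lin * corner_hz)**2)
                         / (f**2 + corner_hz**2))

f = np.logspace(np.log10(20), np.log10(20000), 400)  # 20 Hz .. 20 kHz

# Hypothetical Pultec-style curves: both nominally "at 100 Hz", but the
# boost and the cut have different corners and amounts -- the shapes differ.
boost_db = shelf_db(f, corner_hz=60.0,  gain_lin=10**(8 / 20))   # +8 dB shelf
cut_db   = shelf_db(f, corner_hz=200.0, gain_lin=10**(-6 / 20))  # -6 dB shelf

net_db = boost_db + cut_db  # EQ stages in series: dB responses add

low  = net_db[f < 40].max()                  # net lift left in the deep lows
dip  = net_db[(f > 150) & (f < 400)].min()   # dip just above the boost band
high = np.abs(net_db[f > 5000]).max()        # back to flat up top
print(f"low ~{low:+.1f} dB, dip ~{dip:+.1f} dB, |high| <= {high:.3f} dB")
```

With these made-up settings the net curve keeps a couple of dB of lift in the deep lows, carves a small dip in the low mids, and returns to flat above that: the boost-plus-dip contour the trick is known for.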

Bringing this back to our narrative: we don’t get to interact with an AI pedal in the same way, because AI pedals typically generate their effects in software rather than through an analog circuit you can push past its limits. So the idea of pedals having character, of learning their idiosyncrasies and forming a relationship with a piece of hardware: all of that feels lost. Emily also, and more brutally, compares AI pedals to VR (virtual reality) headsets. We all see how that’s working out.

Still, from the production music standpoint, I can think of some potential use cases. You can use it to create tones for a variety of different sync genres. These compositions typically require some live guitar, and setting the tone right for the genre is important. The thing is, it’s not clear to me that AI tools are any better than Guitar Rig, which is already remarkably good at this.

Upcoming Conferences

I won’t be at Music Biz Atlanta in early May, unfortunately. But I am scheduled to be at Indie Week in New York on June 8-11, so if you are in town then and want to catch up, please reach out.

From © to ™️

Taylor Swift is building on Matthew McConaughey’s momentum and trademarking her name, voice, and image. These trademarks include her stage image (pink guitar and bodysuit) as well as phrases like “Hey, it’s Taylor” and so forth. I am not a lawyer, but it is reported that these trademarks will allow her to pursue people who put up things that are similar to, as opposed to exact copies of, her trademarks. I mention McConaughey because he trademarked his name, image, brand, and even famous lines like “Alright, alright, alright” in order to fight this growing problem.

The top content producers, celebrities, and personalities are all starting to see the issues with deepfakes, lack of consent, and malicious use of their own images. In some cases, people’s likeness is being used to promote things they vehemently oppose. In other cases, because of poor legal language, an actor signs a contract and finds out the company can use their likeness to do whatever they want, and there is not much they can do about it. This has already happened and continues to occur. There is at least one light on the horizon, though: AB-853, the California AI Transparency Act.

AB-853 will take effect on August 2, 2026, and when it does, it will provide enforceable controls: content must be labeled so you know whether AI was involved, artists must have consented, and provenance data must be able to be captured and travel with the asset. As it relates to C2PA, this is really important, because C2PA meets many of the needs of this program. In addition, AB-853 mandates that when provenance data is attached to a file, it cannot be stripped off. That is a huge win. It is also possible that other states and even countries will look to this legislation as a model, so it is worth looking closely at what compliance requires. If you do any business in California, you will have to be compliant.
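
If “provenance data that travels with the asset” sounds abstract, the core mechanism is straightforward: sign a manifest that is cryptographically bound to the file’s contents, so any edit to either the file or the label breaks verification. Here is a toy sketch of that idea. To be clear, this is not the C2PA format, which uses certificate-based signatures and a much richer manifest structure; the key, field names, and HMAC scheme here are all stand-ins for illustration.

```python
import hashlib
import hmac
import json

SECRET = b"demo-signing-key"  # stand-in for a real signing key/certificate

def make_manifest(asset_bytes, creator, ai_used):
    """Toy provenance manifest bound to the asset by its hash --
    the core idea behind C2PA-style credentials, not the real format."""
    claim = {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "creator": creator,
        "ai_used": ai_used,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": sig}

def verify(asset_bytes, manifest):
    """True only if the claim is unmodified AND matches this exact asset."""
    payload = json.dumps(manifest["claim"], sort_keys=True).encode()
    ok_sig = hmac.compare_digest(
        manifest["signature"],
        hmac.new(SECRET, payload, hashlib.sha256).hexdigest())
    ok_hash = (manifest["claim"]["asset_sha256"]
               == hashlib.sha256(asset_bytes).hexdigest())
    return ok_sig and ok_hash

song = b"...audio bytes..."
m = make_manifest(song, creator="Jane Artist", ai_used=False)
print(verify(song, m))               # True: asset and label intact
print(verify(b"other audio", m))     # False: asset was swapped
m["claim"]["ai_used"] = True
print(verify(song, m))               # False: the AI label was edited
```

The design point is that the label is worthless unless it is tamper-evident, which is exactly why the no-stripping mandate matters: the binding only protects you if it is still attached.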

Books from People I Follow

Drew Thurlow has a new book out entitled “Machine Music: How AI is Transforming Music’s Next Act.” Given his background, this looks like an interesting read.

Final Thoughts

Yes, I know you are tired of hearing about Suno and Udio. But we have TopMusicAttorney keeping us honest, with a 6-minute recap that explains that UMG and Sony have had a falling out and that Suno’s lawyers are pursuing a bizarre tactic. Apparently, they are complaining that UMG is gaming the system by producing AI outputs with Suno that sound like their music. But in TopMusicAttorney’s indie lawsuit, they are complaining about the opposite. Oh, it’s a confusing world, so it’s easier to just find six minutes and watch the sausage being made.

The biggest news, though, is that the judge in the Udio case rejected Udio’s motion to dismiss a claim related to scraping music from YouTube. Apparently, they argued that their method of access did not violate the DMCA. From there, it gets quite technical and centers on whether YouTube’s encryption is an access control or a copy control (both are terms of art). But, in the end, the claims stand, and the lawsuit moves on. I’m sure we’ll still be talking about it next year at this time.

Happy May!
