Quote:
Originally Posted by
6strings
Atmos Music has a super-limited bandwidth and, as a result, is limited to 16 objects (including the 10 bed channels which get converted to objects). When mixing for Apple Music, Apple even suggests sticking with beds and limiting objects to a "few featured sounds".
Can you share where you heard this? I think you're talking about the encoding process for Apple Music, correct? Because technically there is no separate process for Atmos "music." The renderer processes everything you feed it exactly the same, whether it's music, film, theatrical, etc. I'm confused about this "limited bandwidth" for music you mention, since I've never heard of such a thing.
Not a single person I know is sticking to a max of 16 objects for the sake of Apple Music. I know some who use mainly beds with the occasional object, but not for any technical reason other than it's an easy workflow and simple to jump between projects without an elaborate I/O template. But I also know a good number of people using an "object bed," and a few people using only objects and nothing else. So I'm genuinely curious where you got your information about Apple recommending we stick mainly to the bed (not being shady, I love to learn new things and share that info).