After watching the latest Apple WWDC, where the new design system was introduced, it was surprising to see how the redesigned ‘Liquid Glass’ interface shown in the live stream matched their developer build exactly. It wasn’t just a screen recording either; the presentation demonstrated individual components and multiple screens with morphing and other custom animations that couldn’t reasonably be simulated in an app. On the other hand, animating the interface with normal video editing software would be equally impractical, since the footage has to replicate the behaviour of the actual software when showing example apps.
Is it just a matter of impeccable QA in producing renders like this, or do they have some specialised software for the purpose?
Apple makes photorealistic renders of all of its products to use in commercials and reveal trailers, and has for a long time now.
Everything Apple does, hardware and software, is visually realized in a computer first. They take those concepts and CAD drawings and polish them into fully realized CG models that they can do anything with.
That’s how you get footage of the new iPhone flying through a fountain of liquid metal with realistic collision and particle effects. It would be harder to film a real phone and composite CG onto it in post.