Hi Firecore team,
Infuse on Vision Pro is one of a kind, the best experience I've had with any video player on any XR headset. But one piece is still missing: playback of stereoscopic and spatial content.
Since Apple introduced the Apple Projected Media Profile (APMP) at WWDC 2025, this has become much easier. I took some time to implement my own little project (linked below): it uses FFmpeg to demux the video container with hardware decoding, then sends the frames to VideoPlayerComponent to achieve near-native rendering efficiency. It's really not that hard, provided the existing codebase is already as capable as yours.
The key part is in NutshellPlayer/Views/Entities/APMPPlayerEntity.swift: it receives decoded CMReadySampleBuffer&lt;CMSampleBuffer.DynamicContent&gt; buffers, injects APMP tags into them, and then hands them to Apple's native VideoPlayerComponent via AVSampleBufferVideoRenderer.
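For anyone curious what that pipeline looks like, here is a minimal sketch of the enqueue path. This is not the actual code from APMPPlayerEntity.swift; the class and method names are illustrative, and it assumes a VideoPlayerComponent driven by an AVSampleBufferVideoRenderer (the combination described above) with APMP metadata already attached to each buffer's format description:

```swift
import AVFoundation
import RealityKit

// Hypothetical feeder type, for illustration only.
final class SampleBufferFeeder {
    let renderer = AVSampleBufferVideoRenderer()

    // Build a RealityKit entity whose VideoPlayerComponent is backed
    // by the sample-buffer renderer instead of an AVPlayer.
    func makeEntity() -> Entity {
        let entity = Entity()
        entity.components.set(VideoPlayerComponent(videoRenderer: renderer))
        return entity
    }

    // Called once per decoded frame. The buffer is assumed to already
    // carry the APMP projection/stereo tags on its format description.
    func enqueue(_ taggedBuffer: CMSampleBuffer) {
        if renderer.isReadyForMoreMediaData {
            renderer.enqueue(taggedBuffer)
        }
    }
}
```

In practice the enqueue would be driven by `requestMediaDataWhenReady(on:using:)` rather than polled, but the shape of the data flow is the same.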
Figuring out how to make FFmpeg work took me 5 days, but the APMP injection and rendering in an immersive space took only 2. With so many ready-to-use tools provided by Apple, it really shouldn't take much of your developers' effort to implement this feature. I hope this post reaches your dev team and bumps up any related ticket in your backlog.
I'm really looking forward to watching all my spatial content in Infuse with an upcoming update.