In the making of synk, I developed a real-time volumetric video effects pipeline that manipulates both geometry and textures using two Azure Kinect sensors. The system captures point clouds from both devices, aligns and fuses them, and triangulates the resulting data into a mesh. The mesh and texture are then passed to fragment shaders, where they can be manipulated live. This setup was used to explore dynamic, expressive spatial visuals in real time with high fidelity.
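As a rough illustration of the align-fuse-triangulate step, the sketch below uses Open3D on two saved frames; it is not the production code, which runs on live Azure Kinect captures, and the file names, ICP threshold, voxel size, and ball-pivoting radii are placeholder assumptions.

```python
# Hedged sketch: align, fuse, and triangulate two point clouds with Open3D.
# In the live pipeline these frames come from the two Azure Kinect sensors;
# here they are loaded from placeholder files for illustration only.
import numpy as np
import open3d as o3d

pcd_a = o3d.io.read_point_cloud("kinect_a_frame.ply")  # placeholder path
pcd_b = o3d.io.read_point_cloud("kinect_b_frame.ply")  # placeholder path

# Start from a rough extrinsic guess between the sensors (identity here),
# then refine the alignment with point-to-plane ICP.
pcd_a.estimate_normals()
pcd_b.estimate_normals()
reg = o3d.pipelines.registration.registration_icp(
    pcd_b, pcd_a,
    max_correspondence_distance=0.02,          # assumed threshold in metres
    init=np.identity(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())
pcd_b.transform(reg.transformation)

# Fuse the aligned clouds, downsample, and triangulate a mesh
# (ball pivoting shown as one possible surface reconstruction).
fused = (pcd_a + pcd_b).voxel_down_sample(voxel_size=0.005)
fused.estimate_normals()
radii = o3d.utility.DoubleVector([0.01, 0.02, 0.04])
mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(fused, radii)
```

In a real-time setting this refinement would typically run once during calibration, with the resulting transform reused per frame so that only fusion and meshing happen in the render loop.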
All the content on this page was generated live.