![[shotgun-wall-destruction.gif]]

### Summary

I built **Crunch Element** around one core idea: deliver jaw-dropping destruction in VR that never drops below 90 FPS, even when four players are blasting, breaching, and flying through the air on opposite ends of the world.

![](https://x.com/blakezonca/status/1920637881728536924)

### Key Highlights

- **Full Coop Sync**: Every piece of debris is synced so that **what you see is exactly what your teammate sees**. No ghost pieces, no rubber-banding, just pure, shared destruction. Only the important "debris state" of what is destroyed or pristine is synced, while all the effects are reproduced locally on each client.
- **100% Precomputed Lighting**: I reworked **every shader** and **every particle system** to sample from baked directional lightmaps and spherical-harmonic volume textures: no real-time or mixed lights anywhere. Explosions, smoke, even shiny gun surfaces all pick up environment color and direction from **precomputed 3D textures**, giving accurate shading and simulated reflections at VR framerates.
- **GPU Instancing & Minimal CPU Overhead**: Debris meshes and particle systems are instanced entirely on the GPU. That means the CPU barely notices when dozens of chunks fly out of a breach, and the VRAM tradeoff (for loading volume textures) is managed asynchronously as you move through the level.

### The Approach

**How do you make explosions, dust, and shiny guns react to lighting in VR… without a single realtime light?**

One of the things I'm most proud of in the last few months is reworking all the shaders and lighting so that dynamic objects and VFX use 100% precomputed lighting and reflections. No real-time lights. No mixed lights. This makes everything blazing fast in VR and, as you can see below, it looks much better too. It was a huge improvement and optimization for all the new features coming to Crunch Element.

![[light-showcase.gif]]

A typical setup for something like this is mixed lighting: baked lighting for the environment and realtime lights only for dynamic objects. However, even that was too slow for the level of VR performance I wanted, and certainly too slow for standalone VR.

...and the standard approach of baked lighting plus light probes that many VR games use flattened the lighting detail I wanted on guns and destruction. That's when I took the dive into directional lightmaps and baking spherical harmonics.

Now all debris, smoke, and explosions sample precomputed lighting for both color and direction. This lets debris not only pick up environment lighting but also show simulated reflections, at a framerate VR can handle with ease.

![[explosion-lights.mp4]]

There are also some cool side effects, like dust looking almost volumetric because of the data being sampled. The same hit/dust effect will be brighter the closer it is to lights, and darker further away.

![[light-showcase-screenshot.jpg]]

Putting it all together in a real combat sequence, you can really see how these effects complement the gameplay as objects and VFX match their environment.

![[shotgun-breach-nice-lighting.mp4]]

Because all of this is baked, the CPU overhead is negligible and it's extremely GPU friendly. There's also the massive benefit of not needing to care about light count. The only performance tradeoff is the increased VRAM that all the volume textures need. Thankfully, this is a much easier tradeoff to work with because they can be loaded and unloaded asynchronously at runtime based on where the player is.
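To make that streaming idea concrete, here's a rough sketch of distance-based residency for the lighting volumes. This is not the game's actual code: `LightVolume`, `async_load_volume`, and `async_unload_volume` are placeholders for whatever the engine's real async texture API looks like.

```c
#include <math.h>
#include <stdbool.h>

typedef struct { float x, y, z; } Vec3;

typedef struct {
    Vec3  center;      /* world-space center of this lighting volume    */
    float radius;      /* rough extent of the region it covers          */
    bool  resident;    /* is the 3D texture currently in VRAM?          */
    int   texture_id;  /* handle passed to the (hypothetical) async API */
} LightVolume;

/* Stand-ins for the engine's real async texture API (names are assumptions). */
static void async_load_volume(int texture_id)   { (void)texture_id; /* enqueue load   */ }
static void async_unload_volume(int texture_id) { (void)texture_id; /* enqueue unload */ }

/* Called occasionally (not every frame): keep volumes near the player resident,
 * release the rest so VRAM stays bounded as the player moves through the level. */
void update_volume_streaming(LightVolume *volumes, int count,
                             Vec3 player, float keep_distance)
{
    for (int i = 0; i < count; ++i) {
        LightVolume *v = &volumes[i];
        float dx = v->center.x - player.x;
        float dy = v->center.y - player.y;
        float dz = v->center.z - player.z;
        float dist = sqrtf(dx * dx + dy * dy + dz * dz) - v->radius;

        bool want_resident = dist < keep_distance;
        if (want_resident && !v->resident) {
            async_load_volume(v->texture_id);
            v->resident = true;
        } else if (!want_resident && v->resident) {
            async_unload_volume(v->texture_id);
            v->resident = false;
        }
    }
}
```

In practice you would want some hysteresis between the load and unload distances so volumes near the boundary don't thrash in and out of VRAM.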
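And for anyone wondering what "sampling precomputed lighting for both color and direction" boils down to, here is a minimal sketch of evaluating baked L1 spherical-harmonic coefficients. The coefficient layout and function names are illustrative rather than Crunch Element's actual shader code; in the game this happens in the shaders against the baked volume textures.

```c
#include <math.h>

typedef struct { float x, y, z; } Vec3;   /* used both as a direction and an RGB triple */

/* L1 spherical-harmonic lighting: one RGB triple for the constant band and one
 * per linear (x, y, z) band. In the game these live in baked 3D textures that
 * shaders sample at the object's position; this layout is just for illustration. */
typedef struct {
    Vec3 band0;                   /* constant (ambient) term, RGB         */
    Vec3 band1x, band1y, band1z;  /* linear terms along x, y, z, RGB each */
} SHL1;

/* Reconstruct incoming light from direction n (normalized) using the standard
 * real SH basis constants: 0.282095 for L0 and 0.488603 * (x, y, z) for L1. */
Vec3 sh_eval(const SHL1 *sh, Vec3 n)
{
    float b0 = 0.282095f;
    float bx = 0.488603f * n.x;
    float by = 0.488603f * n.y;
    float bz = 0.488603f * n.z;
    Vec3 c = {
        sh->band0.x * b0 + sh->band1x.x * bx + sh->band1y.x * by + sh->band1z.x * bz,
        sh->band0.y * b0 + sh->band1x.y * bx + sh->band1y.y * by + sh->band1z.y * bz,
        sh->band0.z * b0 + sh->band1x.z * bx + sh->band1y.z * by + sh->band1z.z * bz,
    };
    return c;
}

/* Approximate dominant light direction: the luminance-weighted L1 vector.
 * A debris or gun shader can treat this as a "virtual" directional light for
 * specular highlights and simulated reflections, with no realtime light at all. */
Vec3 sh_dominant_direction(const SHL1 *sh)
{
    Vec3 d = {
        0.2126f * sh->band1x.x + 0.7152f * sh->band1x.y + 0.0722f * sh->band1x.z,
        0.2126f * sh->band1y.x + 0.7152f * sh->band1y.y + 0.0722f * sh->band1y.z,
        0.2126f * sh->band1z.x + 0.7152f * sh->band1z.y + 0.0722f * sh->band1z.z,
    };
    float len = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
    if (len > 1e-6f) { d.x /= len; d.y /= len; d.z /= len; }
    return d;
}
```

The color term gives the "dust gets brighter near lights" effect, and the dominant direction is what lets shiny gun surfaces and debris fake reflections without a single realtime light.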
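Finally, on the coop side, the "sync only the debris state" idea from the highlights can be pictured as one bit per destructible chunk: the host replicates the bitfield, and each client spawns its own debris, dust, and sound when a bit flips. A hypothetical sketch, not the game's networking code:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define MAX_CHUNKS 1024   /* destructible chunks per map (placeholder size) */

/* One bit per chunk: 0 = pristine, 1 = destroyed. 1024 chunks = 128 bytes,
 * which is all that actually needs to cross the network. */
typedef struct {
    uint8_t destroyed[MAX_CHUNKS / 8];
} DebrisState;

static int chunk_is_destroyed(const DebrisState *s, int i)
{
    return (s->destroyed[i / 8] >> (i % 8)) & 1;
}

/* Stand-in for the local-only effect path: spawn GPU-instanced chunks, dust,
 * and sound on this client. Nothing here is replicated. */
static void play_destruction_effects_locally(int chunk_index)
{
    printf("chunk %d destroyed: spawn local debris + particles\n", chunk_index);
}

/* Apply a snapshot received from the host: for every chunk that just flipped
 * to "destroyed", trigger the effects locally, then adopt the new state. */
void apply_debris_snapshot(DebrisState *local, const DebrisState *remote)
{
    for (int i = 0; i < MAX_CHUNKS; ++i) {
        if (chunk_is_destroyed(remote, i) && !chunk_is_destroyed(local, i)) {
            play_destruction_effects_locally(i);
        }
    }
    memcpy(local->destroyed, remote->destroyed, sizeof(local->destroyed));
}
```

Because only the pristine/destroyed bits travel over the wire, every client sees the same holes in the same walls while the heavy visual work stays local and GPU-instanced.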