The making of Inside Kristallnacht

Inside Kristallnacht is an interactive experience that tells the story of November 9, 1938. Kristallnacht survivor Dr Charlotte Knobloch guides users through an immersive reconstruction of the space and story. The experience is available on desktop, mobile, and WebXR.

We wanted to ensure harmony between the creative vision and the technical execution. The project required a large volume of assets balanced against intricate design, so we created an integrated team from the outset, bringing developers into the creative process from day one.
Working this way allowed us to rapidly prototype design ideas from the start and to ensure everything would scale and perform well.

Rendering the Shards

The main challenge for the homepage and intro was rendering the glass shards in WebGL. They represent the broken windows from that night as well as the shattered destinies of the people affected. We also wanted to use them to present the videos of Charlotte in different forms.

In real-time rendering, it's often best to "fake it" when possible. For the shards, we replaced the translucent parts with a simple blurred image of the title behind them, then calculated its position based on the camera's perspective. We applied a subtle parallax effect to the main reflected image. To enhance the realism, we used image-based lighting and incorporated a normal map of old glass to simulate its roughness and reflective properties.

You can use this code to compute the UVs for the blurred text texture from the current eye vector:

// ===== BLUR TEXTURE MAPPING
vec3 intersectionPoint = rayPlaneIntersection(vWorldPosition, vEyeVector, normalVector, uTextPlanePosition());
vec2 uvOffset = calculateUVOffset(intersectionPoint, textAspectRatio, vec2(1.0, 1.0));
// Remap the offset from [-1, 1] to [0, 1] texture space
float blurTextureX = (uvOffset.x + 1.0) / 2.0;
float blurTextureY = (-uvOffset.y + 1.0) / 2.0;
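
rayPlaneIntersection and calculateUVOffset are custom helpers not shown in the snippet. Here is a minimal sketch of what they might look like, assuming the text plane is centered at its position, faces the camera, and has its half-size passed as the last argument:

// Hypothetical helper: intersect a ray (origin + direction) with the plane
// defined by a point on it and its normal
vec3 rayPlaneIntersection(vec3 rayOrigin, vec3 rayDir, vec3 planeNormal, vec3 planePoint) {
  float denom = dot(planeNormal, rayDir);
  // Guard against rays running parallel to the plane
  float t = dot(planePoint - rayOrigin, planeNormal) / (abs(denom) > 1e-6 ? denom : 1e-6);
  return rayOrigin + rayDir * t;
}

// Hypothetical helper: map the intersection point to a [-1, 1] offset,
// compensating for the text texture's aspect ratio
vec2 calculateUVOffset(vec3 p, float aspectRatio, vec2 planeHalfSize) {
  return vec2(p.x / (planeHalfSize.x * aspectRatio), p.y / planeHalfSize.y);
}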

Careful layering was essential to the final render. This code composites everything together (the lighting is calculated prior to this step):

// Get the parallax layer color with parallaxed uvs
vec4 parallaxColor = texture2D( tParallaxTexture, correctedArchiveParallaxUV );
// Get the text layer color with a bit of chromashift
vec4 textROffset = texture2D( tTextTexture, vec2(blurTextureX, blurTextureY) + (chromaShiftDirection * vec2(rOffset)) );
vec4 textGOffset = texture2D( tTextTexture, vec2(blurTextureX, blurTextureY) + (chromaShiftDirection * vec2(gOffset)) );
vec4 textBOffset = texture2D( tTextTexture, vec2(blurTextureX, blurTextureY) + (chromaShiftDirection * vec2(bOffset)) );

vec3 color = vec3(0.16);
// Put everything together with the ibl lighting value and some visibility settings
color.x = ( lightingData.lightingColor.r * (0.85 * uVisibility()) + 0.01 ) + ( textROffset.r * textROffset.a * uTextOpacity() * effectDiscard ) + ( parallaxColor.r * parallaxColor.a * uParallaxOpacity() );
color.y = ( lightingData.lightingColor.g * (0.85 * uVisibility()) + 0.01 ) + ( textGOffset.g * textGOffset.a * uTextOpacity() * effectDiscard ) + ( parallaxColor.g * parallaxColor.a * uParallaxOpacity() );
color.z = ( lightingData.lightingColor.b * (0.85 * uVisibility()) + 0.01 ) + ( textBOffset.b * textBOffset.a * uTextOpacity() * effectDiscard ) + ( parallaxColor.b * parallaxColor.a * uParallaxOpacity() );
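
The lightingData used above comes from an image-based lighting pass computed earlier in the shader. A simplified sketch of that idea, with hypothetical tEnvMap and tGlassNormalMap uniforms (the real pass is more involved):

// Perturb the surface normal with the old-glass normal map
vec3 mapN = texture2D(tGlassNormalMap, vUv).xyz * 2.0 - 1.0;
vec3 surfaceNormal = normalize(vNormal + mapN * uNormalStrength());
// Reflect the eye vector off the perturbed normal and sample the environment
vec3 reflected = reflect(normalize(vEyeVector), surfaceNormal);
vec3 iblColor = textureCube(tEnvMap, reflected).rgb;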

Finally, here's a breakdown of the layers that make this effect work: the image-based lighting on the glass, the blurred title text with a slight chromatic shift, and the parallaxed archive image composited on top.

Story Scenes: Style and Rendering

The core of the site is rendering the visuals that complement the story. We had to load and render 25 distinct scenes to illustrate Charlotte's journey, which required careful planning and prototyping.

We started by experimenting with the film noir effect we wanted for the scenes, implementing it in two stages:

  1. A basic cel shading shader with a color ramp transitioning from black to dark gray.
  2. In the same pass, we added a noise effect for texture.

We also introduced custom settings to fine-tune the cel shading in each scene.

Fragment shader code:

// Get the grain texture with screen coords and animated
float grainTex = texture2D(uNoise, gl_FragCoord.xy * uNoiseScale() + floor(uTime() * uNoiseSpeed()) / uNoiseSpeed()).r;
float grain = smoothstep(0.35, 1., grainTex) * uNoiseAmount();

// Get the basic lighting from the baked texture with a factor
float lightAmount = (FragColor.r + bakedLight.r) * uLightFactor();

// Get the cel shading ramp from the custom steps
float colorMix = mix(0., uStep1().y, step(uStep1().x, lightAmount));
colorMix = mix(colorMix, uStep2().y, step(uStep2().x, lightAmount));

// Get the final color from the 2 colors passed and all above vars
FragColor.rgb = mix(uColor1(), uColor2(), max(colorMix - grain, 0.));

For the 3D assets, we baked most of the lighting directly into the textures to improve quality and save memory. We also developed a custom system to embed up to two dynamic lights per scene, ensuring that dynamic lighting was seamlessly integrated.
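
The embedding system itself isn't shown here, but the shading side can be sketched in a few lines: baked lighting comes from the texture, and up to two point lights, passed as uniform arrays (hypothetical names), are added on top.

uniform vec3 uLightPositions[2]; // hypothetical: world positions of the scene's dynamic lights
uniform vec3 uLightColors[2];    // hypothetical: color * intensity, vec3(0.0) when unused

vec3 lighting = texture2D(tBakedLight, vUv2).rgb;
for (int i = 0; i < 2; i++) {
  vec3 toLight = uLightPositions[i] - vWorldPosition;
  // Simple quadratic distance falloff
  float attenuation = 1.0 / (1.0 + dot(toLight, toLight));
  lighting += uLightColors[i] * max(dot(normalize(vNormal), normalize(toLight)), 0.0) * attenuation;
}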

A key aspect of our storytelling involved animations that trigger at crucial moments (e.g., portal openings, doors closing). In Blender, we created these as nonlinear animation (NLA) strips and synced them with the sound using a storyboard file.

The camera movement between scenes was handled by a simple NLA animation. During the transition we rendered both scenes simultaneously, then rendered only the active scene once it finished, optimizing performance.

The outline effect on hotspot objects was achieved by extruding the object's mesh along the normal vectors and flipping the normals.
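
In the vertex shader, that extrusion is just a push along the normal. A minimal sketch, assuming three.js-style built-in attributes and a hypothetical uOutlineWidth uniform (the hull is drawn with front-face culling so only the flipped, extruded shell stays visible):

uniform float uOutlineWidth; // hypothetical: outline thickness in world units

void main() {
  // Push every vertex outward along its normal to form the hull
  vec3 extruded = position + normal * uOutlineWidth;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(extruded, 1.0);
}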

For the sprite sheets, we wrote a script to merge the images for each sheet. Since the images are monochrome, we packed them into the colour channels, grouping up to four images per 4096x4096 texture, then converted the result to WebP to reduce file size (we experimented with Basis Universal, but it degraded the images too much). This approach reduced the total size of each sprite sheet by a factor of roughly 20 on average.
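
On the GPU side, reading an image back out of a channel-packed texture just means selecting the right channel. A sketch, with uImageIndex as a hypothetical uniform choosing one of the four packed monochrome images:

uniform sampler2D tPackedSheet;
uniform float uImageIndex; // hypothetical: 0.0 to 3.0, which packed image to read

float samplePackedImage(vec2 uv) {
  vec4 texel = texture2D(tPackedSheet, uv);
  // Pick the channel that holds the requested monochrome image
  if (uImageIndex < 0.5) return texel.r;
  if (uImageIndex < 1.5) return texel.g;
  if (uImageIndex < 2.5) return texel.b;
  return texel.a;
}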

To allow us to easily scale our prototype as scenes were added, we implemented a clear naming convention for everything needed from the 3D scene, including dynamic lights, scene parents, and the TP button. 

For smooth collaboration, we compiled a comprehensive document with guidelines for the team, particularly the 3D artists. This document included polycount limits and other critical information, which they then followed when creating the scenes.

For exports, we incorporated gltf-transform into our pipeline. We typically used the default settings and adjusted texture export options as needed, which significantly reduced file sizes. We also converted textures to Basis when possible.

WebXR Adaptation

Integrating WebXR support into this kind of experience takes more than adding the WebXR session request script: it requires content, animation, and navigation adjustments to make the experience truly immersive.

First, we cannot emphasize enough how important a comfortable testing setup is when developing for WebXR. The Immersive Web Emulator extension for Chrome, made by Meta, is brilliant: it lets you test directly in your browser and only put the headset on when actually needed.

‘Inside Kristallnacht’ is content-heavy, with a lot of text and interactive hotspots. As a result we had to rethink these elements for WebXR, especially since HTML layers are not yet supported in most VR browsers. Implementing UI in WebGL is always a challenge, but we managed to work through it.

To render the text, we used signed distance fields (SDF). We had already developed a shader and text renderer for another project, Magicals Reflections. We generated the SDF fonts directly in Unity, which even lets us specify the character set to be used.
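
The core of SDF text rendering fits in a few fragment shader lines. A minimal sketch, assuming the distance field sits in the atlas's alpha channel (fwidth needs the standard-derivatives extension in WebGL 1):

uniform sampler2D tFontAtlas; // SDF glyph atlas generated offline
varying vec2 vUv;

void main() {
  float dist = texture2D(tFontAtlas, vUv).a;
  // 0.5 marks the glyph edge; fwidth keeps it about one pixel wide at any scale
  float edge = fwidth(dist);
  float alpha = smoothstep(0.5 - edge, 0.5 + edge, dist);
  gl_FragColor = vec4(vec3(1.0), alpha);
}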

For the WebXR UI, we relied on a combination of custom shaders and Blender for positioning the 3D elements. We also needed some design adjustments, as the effective resolution is much lower and aliasing more noticeable than on desktop.

We had to reinvent the chapter menu in XR, so we came up with a system hooked to the gamepad and toggled with a button. It's easy to access but doesn't obstruct the view.

Performance is critical in WebXR, as the frame rate needs to hold a constant 90fps. We did a complete overhaul of the site's optimisation to make sure everything was rendered only once and only when needed. We also reviewed all assets and tailored them exactly to our needs, reducing textures where we could and merging meshes when possible. SpectorJS is still the best tool to properly audit a WebGL frame.

To minimize motion sickness, we also removed all camera animations between scenes in VR, improving comfort during the transitions.

For us, Inside Kristallnacht not only offers a cutting-edge technological experience but also serves as a poignant reminder of the past, honoring the memory of those affected by the events of Kristallnacht. By leveraging technology in service of storytelling, this project provides an opportunity for reflection, education, and remembrance in a way that is both innovative and impactful.