Notes on Physically Based Rendering In Minecraft
Mar 18, 2025
When I first started out doing shader development in Minecraft, it took me a very long time to figure out how to use the data the game provides to do rendering that was somewhat "physically based". Of course, Minecraft is not a realistic game, but through the use of LabPBR materials, the idea is that surfaces should look tangible. Ideally, if you're implementing support for PBR resource packs, you want to be doing PBR.
What We Want, Ideally
The idea behind physically based rendering is that for every light source in the scene, you apply a BRDF to it (see the sketch after this list). This assumes that we know, for each light:
- Whether the current fragment is occluded from the light source
- The direction from the fragment to the light source
- The irradiance received from the light source
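To make that concrete, here is a rough sketch of the general pattern. The helpers lightDirection, irradiance, visibility, and brdf (and lightCount) are hypothetical stand-ins for whatever your pipeline actually provides:
vec3 radiance = vec3(0.0);
for (int i = 0; i < lightCount; i++) {
    vec3 lightDir = lightDirection(i); // direction from the fragment to the light
    float visible = visibility(i);     // 0.0 if occluded, 1.0 if not
    // surface response * incoming light * cosine term * shadowing
    radiance += brdf(normal, viewDir, lightDir) * irradiance(i) * max(dot(normal, lightDir), 0.0) * visible;
}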
What Iris Gives Us
In Iris, there are two primary sources of information for lighting: the vanilla lightmap and the shadow map. Together, these represent the light contribution from the sky, block light sources, and the sun.
For the sun, we can very easily do things properly. We know exactly where it is through the sunPosition uniform. It's important to note that we don't actually use the position of the sun the game gives us (which is always 100 blocks away from the player). Given the scale of the sun and its distance from the Earth, we instead treat it as a directional light source, and so we normalize the position to get the direction. You can either find irradiance values online or just pick a number, depending on how physically based you want to be.
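In code, this is just:
vec3 lightDir = normalize(sunPosition); // sunPosition is in view space, so this is the view-space direction to the sun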
Unfortunately, we don't have any positional information for block light sources, and skylight comes from every direction unless it's blocked by something, which we have no way of detecting either. Minecraft's lightmap is not good for PBR, but it's what we have, and without highly complex systems involving voxelisation and ray tracing, it's not feasible to get anything much better.
So How Do I Do PBR?
Essentially, you end up with two light sources to add together - direct and indirect light. Direct light comes from the sun or moon - you know exactly where it is, and can just evaluate a BRDF for it. Here, 'indirect' light refers to anything whose origin we don't explicitly know - in this case, the lightmap.
So, for direct light, just apply a BRDF to the sun or moon.
For indirect light, it's important to remember we have two components for lighting - specular and diffuse. The lightmap is the diffuse component. Therefore, if you want a specular component for indirect light, you will need to do something like screen-space reflections. At this point we are getting pretty far from physically based, but this should still look alright if done correctly.
Your diffuse component is the lightmap lighting, and your specular component is your reflections. You then combine these using the fresnel value, which you can get with the Schlick approximation. For perfectly smooth surfaces, you can calculate this fresnel value from the normal alone. For rough surfaces, you will want to do hemisphere sampling (ideally cosine-weighted, or using a VNDF sampler or something similar), and you should average the fresnel value across all samples. You then use this average fresnel value to blend between the diffuse and the specular.
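For reference, here is the Schlick approximation as a function, which the example below also assumes exists (cosTheta is the cosine of the angle between the relevant vectors, e.g. the view direction and the halfway vector):
vec3 schlick(vec3 f0, float cosTheta) {
    return f0 + (1.0 - f0) * pow(1.0 - cosTheta, 5.0);
}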
Some Example Code
Let's define some variables to illustrate things, and then show how they go together.
vec3 lightDir = normalize(sunPosition);
vec3 viewPos; // fragment view space position
vec3 albedo;
vec3 normal; // the surface normal
vec2 lightmap; // 0-1 values derived from the coordinates in the lightmap texture - x = blocklight, y = skylight
float f0; // comes from the specular map green channel
float shininess; // you will need to derive this from the perceptual smoothness stored in the specular map red channel
// these are constants we define
vec3 sunIrradiance;
vec3 blocklightColor;
vec3 skylightColor;
// these are things we need to compute
vec3 shadow = getShadowing(...);
vec3 indirectFresnel; // this is set by the function below as the average fresnel value across all reflection samples
vec3 reflections = getScreenSpaceReflections(..., indirectFresnel);
Now we'll do the lighting. We'll use Blinn-Phong for the direct lighting, just for simplicity's sake.
vec3 viewDir = normalize(-viewPos); // direction from the fragment to the camera
vec3 directDiffuse = sunIrradiance * max(dot(normal, lightDir), 0.0) * albedo * shadow;
// blinn-phong specular, also shadowed
vec3 halfway = normalize(lightDir + viewDir);
vec3 directSpecular = sunIrradiance * pow(max(dot(normal, halfway), 0.0), shininess) * shadow;
vec3 fresnel = schlick(vec3(f0), max(dot(viewDir, halfway), 0.0));
vec3 direct = mix(directDiffuse, directSpecular, fresnel);
vec3 indirectDiffuse = albedo * (lightmap.x * blocklightColor + lightmap.y * skylightColor);
vec3 indirectSpecular = reflections;
vec3 indirect = mix(indirectDiffuse, indirectSpecular, indirectFresnel);
vec3 color = direct + indirect;
I have not included a method for determining the Blinn-Phong "shininess" exponent from the perceptual smoothness LabPBR provides, but one possible mapping is sketched below. If you are writing your own BRDF code, I would highly recommend using a Cook-Torrance BRDF instead, as the Blinn-Phong model is very old.
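If you do want a quick stand-in, one commonly used mapping (an assumption on my part, not something LabPBR mandates) converts perceptual smoothness to roughness as the LabPBR docs describe, and from there to a Blinn-Phong exponent:
float roughness = pow(1.0 - perceptualSmoothness, 2.0); // per the LabPBR documentation
float shininess = 2.0 / max(roughness * roughness, 1e-4) - 2.0; // common roughness-to-exponent conversion, clamped to avoid dividing by zero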
Other Notes
Water
Water does not have a diffuse component, because it is transparent. The reason water is blue (even without reflecting the sky) is because it absorbs other colours of light as they pass through it. Therefore, rendering water should work something like the following:
color = texture(colortex0, texcoord).rgb; // get the base colour of the terrain behind the water
if (waterMask) {
    color = applyWaterFog(color, ...); // apply absorption and scattering, this will tint the colour blue/green
    vec3 fresnel; // average fresnel value across reflection samples, set by the function below
    vec3 reflections = getScreenSpaceReflections(..., fresnel);
    color = mix(color, reflections, fresnel);
}
The snippet above assumes water is rendered in a deferred manner, but it is also possible to do things this way with forward rendering. Both methods require that translucents are not written to the same buffer as opaques, and are blended later on.
A correct solution that allows rendering of other translucents behind water without deferred rendering is not currently possible, as it requires separate blending for the R, G, and B channels. This requires something called "dual source blending", which neither Iris nor OptiFine supports. It may become possible with Iris' upcoming Aperture pipeline.
Emission and Sub-Surface Scattering
Sub-surface scattering can essentially be treated as emitted light. Emitted light is not part of your main BRDF, so you can simply add it on separately.
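Continuing the earlier example, that might look like the following. Tinting the emission by the albedo is a common choice rather than a requirement:
float emission; // from the specular map alpha channel per LabPBR (0-254 remapped to 0-1; 255 means none)
color = direct + indirect + albedo * emission; // emission sits outside the BRDF, so it is just added on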
Global Illumination
Global illumination treats every element in the scene as a potential light source. There are a variety of techniques for it - a popular one in the context of Minecraft is Reflective Shadow Mapping. The resultant colour should be included in the indirect diffuse, multiplied by the albedo like the other components are.
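In terms of the earlier example, that might look like this, where getGlobalIllumination is a hypothetical stand-in for whatever RSM or similar technique you implement:
vec3 gi = getGlobalIllumination(...);
vec3 indirectDiffuse = albedo * (lightmap.x * blocklightColor + lightmap.y * skylightColor + gi);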
'Metalness', and Why It Doesn’t Exist in LabPBR
If you've ever looked at the code for an older pack like SEUS, you may notice that instead of the f0 value the LabPBR standard mentions, there is a 'metalness' value. These are not equivalent. In reality, something is either a metal or it is not, and the two types of material behave differently. To prevent aliasing, some material formats like oldPBR (what SEUS uses) and the glTF material format use this 'metalness' value to blend between a dielectric BRDF and a metallic BRDF. In order to do this, the pack has to assume an f0 value, as the resource pack does not provide one. LabPBR instead opts to provide the f0 value directly, or information about which metal the surface is (in which case the pack should hardcode the f0 value, or just use the albedo).