https://catlikecoding.com/unity/tutorials/custom-srp/baked-light/
1 Baked Static Light
So far we have rendered all lighting in real time. Lighting can also be computed ahead of time and stored in lightmaps and probes. This reduces the work done while rendering and adds indirect lighting that cannot be computed in real time, i.e. global illumination. The trade-off is that baked lighting increases memory usage.
1.1 Lighting Settings
In Unity's Window / Rendering / Lighting Settings window, baked lighting is enabled by turning on Baked Global Illumination under Mixed Lighting. For now, set the Lighting Mode to Baked Indirect.
The lightmaps themselves are configured under Lightmapping Settings.
1.2 Static Objects
To use baked lighting, set the Mode of the Light component to Mixed.
In the Lighting section of an object's Mesh Renderer component, enable Contribute Global Illumination and set the mode to Lightmaps. When indirect light reaches these surfaces it gets baked into the lightmap.
The resulting lightmap can be inspected under Baked Lightmap in the Lightmapping section.
1.3 Fully Baked Light
To bake a light completely, set its Mode to Baked. It then no longer provides any real-time lighting.
2 Sampling Baked Light
2.1 Global Illumination
Create GI.hlsl and define a GI struct in it, for now containing only a diffuse component, along with a GetGI function that retrieves the global illumination data:
#ifndef CUSTOM_GI_INCLUDED
#define CUSTOM_GI_INCLUDED

struct GI
{
    float3 diffuse;
};

GI GetGI (float2 lightMapUV)
{
    GI gi;
    gi.diffuse = float3(lightMapUV, 0.0);
    return gi;
}

#endif
In GetLighting, initialize the color with the GI data:
float3 GetLighting (Surface surfaceWS, BRDF brdf, GI gi)
{
    ShadowData shadowData = GetShadowData(surfaceWS);
    float3 color = gi.diffuse;
    …
    return color;
}
Apply it in the fragment shader:
GI gi = GetGI(0.0);
float3 color = GetLighting(surface, brdf, gi);
2.2 Light Map Coordinates
To retrieve the lightmap UV coordinates, Unity has to send them to the shader. We must instruct the pipeline to do this for every object that uses a lightmap, by setting the perObjectData property of drawingSettings in CameraRenderer.DrawVisibleGeometry to PerObjectData.Lightmaps:
var drawingSettings = new DrawingSettings(
    unlitShaderTagId, sortingSettings
)
{
    enableDynamicBatching = useDynamicBatching,
    enableInstancing = useGPUInstancing,
    perObjectData = PerObjectData.Lightmaps
};
Unity now renders lightmapped objects with a shader variant for which the LIGHTMAP_ON keyword is enabled:
#pragma multi_compile _ LIGHTMAP_ON
The lightmap UV coordinates are part of the vertex attributes. Declare them in Attributes via a GI_ATTRIBUTE_DATA macro and in Varyings via GI_VARYINGS_DATA, and copy them in the vertex shader with TRANSFER_GI_DATA:
struct Attributes
{
    …
    GI_ATTRIBUTE_DATA
    UNITY_VERTEX_INPUT_INSTANCE_ID
};

struct Varyings
{
    …
    GI_VARYINGS_DATA
    UNITY_VERTEX_INPUT_INSTANCE_ID
};

Varyings LitPassVertex (Attributes input)
{
    Varyings output;
    UNITY_SETUP_INSTANCE_ID(input);
    UNITY_TRANSFER_INSTANCE_ID(input, output);
    TRANSFER_GI_DATA(input, output);
    …
}
In the fragment shader, retrieve the UV with the GI_FRAGMENT_DATA macro:
GI gi = GetGI(GI_FRAGMENT_DATA(input));
These macros don't exist yet; we have to define them ourselves in GI.hlsl. They only do real work when LIGHTMAP_ON is defined:
#if defined(LIGHTMAP_ON)
    #define GI_ATTRIBUTE_DATA float2 lightMapUV : TEXCOORD1;
    #define GI_VARYINGS_DATA float2 lightMapUV : VAR_LIGHT_MAP_UV;
    #define TRANSFER_GI_DATA(input, output) output.lightMapUV = input.lightMapUV;
    #define GI_FRAGMENT_DATA(input) input.lightMapUV
#else
    #define GI_ATTRIBUTE_DATA
    #define GI_VARYINGS_DATA
    #define TRANSFER_GI_DATA(input, output)
    #define GI_FRAGMENT_DATA(input) 0.0
#endif
2.3 Transformed Light Map Coordinates
Lightmap coordinates are usually generated automatically by Unity per mesh, or are part of the imported mesh data. They define a texture unwrap of the mesh so that it can be mapped onto texture coordinates. The unwrap is scaled and positioned per object in the lightmap, so each instance gets its own region.
The lightmap UV transformation is part of the UnityPerDraw buffer, so add the following two values to it. Although we don't use unity_DynamicLightmapST, it has to be declared as well, otherwise SRP batcher compatibility can break:
CBUFFER_START(UnityPerDraw)
    …
    float4 unity_LightmapST;
    float4 unity_DynamicLightmapST;
CBUFFER_END
Then adjust the TRANSFER_GI_DATA macro so it applies the transformation:
#define TRANSFER_GI_DATA(input, output) \
    output.lightMapUV = input.lightMapUV * \
    unity_LightmapST.xy + unity_LightmapST.zw;
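The macro applies a plain scale-and-offset (ST) transform. As a quick numeric sanity check, here is a Python sketch of the same math (the ST values below are made up for illustration):

```python
def transform_lightmap_uv(uv, st):
    """Apply a Unity-style ST transform: uv * st.xy + st.zw."""
    sx, sy, ox, oy = st
    return (uv[0] * sx + ox, uv[1] * sy + oy)

# A hypothetical instance mapped to the top-right quarter of the lightmap atlas:
st = (0.5, 0.5, 0.5, 0.5)  # scale in x and y, then offset in z and w
print(transform_lightmap_uv((0.0, 0.0), st))  # → (0.5, 0.5)
print(transform_lightmap_uv((1.0, 1.0), st))  # → (1.0, 1.0)
```

So the mesh's [0, 1] unwrap ends up covering only that instance's region of the shared lightmap.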
2.4 Sampling the Light Map
Sampling the lightmap relies on EntityLighting.hlsl from the Core RP library. Include it and define the lightmap texture along with its sampler state:
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/EntityLighting.hlsl"

TEXTURE2D(unity_Lightmap);
SAMPLER(samplerunity_Lightmap);
Define a SampleLightMap function that invokes SampleSingleLightmap, and use it in GetGI:
float3 SampleLightMap (float2 lightMapUV) {
    #if defined(LIGHTMAP_ON)
        return SampleSingleLightmap(lightMapUV);
    #else
        return 0.0;
    #endif
}

GI GetGI (float2 lightMapUV) {
    GI gi;
    gi.diffuse = SampleLightMap(lightMapUV);
    return gi;
}
SampleSingleLightmap requires more arguments than just the UV. First, the texture and sampler state have to be passed, which we can do via the TEXTURE2D_ARGS macro:
return SampleSingleLightmap(
    TEXTURE2D_ARGS(unity_Lightmap, samplerunity_Lightmap), lightMapUV
);
After the UV comes a scale-and-offset transformation. As we already applied that earlier, an identity transform suffices here:
return SampleSingleLightmap(
    TEXTURE2D_ARGS(unity_Lightmap, samplerunity_Lightmap), lightMapUV,
    float4(1.0, 1.0, 0.0, 0.0)
);
Next is a boolean indicating whether the lightmap is compressed, which is the case when UNITY_LIGHTMAP_FULL_HDR is not defined. The final argument contains the decode instructions:
return SampleSingleLightmap(
    TEXTURE2D_ARGS(unity_Lightmap, samplerunity_Lightmap), lightMapUV,
    float4(1.0, 1.0, 0.0, 0.0),
    #if defined(UNITY_LIGHTMAP_FULL_HDR)
        false,
    #else
        true,
    #endif
    float4(LIGHTMAP_HDR_MULTIPLIER, LIGHTMAP_HDR_EXPONENT, 0.0, 0.0)
);
For reference, SampleSingleLightmap:
real3 SampleSingleLightmap(TEXTURE2D_PARAM(lightmapTex, lightmapSampler), float2 uv, float4 transform, bool encodedLightmap, real4 decodeInstructions)
{
    // transform is scale and bias
    uv = uv * transform.xy + transform.zw;
    real3 illuminance = real3(0.0, 0.0, 0.0);
    // Remark: baked lightmap is RGBM for now, dynamic lightmap is RGB9E5
    if (encodedLightmap)
    {
        real4 encodedIlluminance = SAMPLE_TEXTURE2D(lightmapTex, lightmapSampler, uv).rgba;
        illuminance = DecodeLightmap(encodedIlluminance, decodeInstructions);
    }
    else
    {
        illuminance = SAMPLE_TEXTURE2D(lightmapTex, lightmapSampler, uv).rgb;
    }
    return illuminance;
}
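When the lightmap is compressed, DecodeLightmap converts the RGBM-encoded sample back to an HDR value using the decode instructions: the RGB channels are scaled by the multiplier times the alpha channel raised to the exponent. A Python sketch of that decoding step, with made-up multiplier and exponent values:

```python
def decode_rgbm(rgbm, multiplier, exponent):
    """Decode an RGBM texel: rgb scaled by multiplier * alpha**exponent."""
    r, g, b, m = rgbm
    scale = multiplier * (m ** exponent)
    return (r * scale, g * scale, b * scale)

# A texel at half of the encodable range, with hypothetical decode instructions:
print(decode_rgbm((0.2, 0.4, 0.6, 0.5), 4.61, 1.0))
```

With the full-HDR lightmap format the RGB channels are used directly instead, which is why the boolean argument exists.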
3 Light Probes
Dynamic objects don't affect baked global illumination, but they can be affected by it via light probes. A light probe is a point in the scene that approximates all incoming baked light at that point, via a third-order polynomial, specifically L2 spherical harmonics. Light probes are placed around the scene, and Unity interpolates between them per object to arrive at an approximation of the lighting at the object's position.
3.1 Light Probe Group
Light probes are added by creating a light probe group, via GameObject / Light / Light Probe Group. The default group consists of eight probes forming a cube.
A scene can contain multiple probe groups. Unity combines all their probes and then creates a tetrahedral volume that connects them. Each dynamic object ends up inside one tetrahedron; the probes at its four vertices are interpolated to arrive at the final lighting applied to the object. If an object ends up outside the region covered by the probes, the nearest triangle is used instead, so the lighting can look odd there.
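The per-object blend across a tetrahedron's four probes is a barycentric interpolation: a weighted sum where the weights add up to one. A Python sketch of the idea, with hypothetical probe colors and weights (Unity computes the actual weights internally from the object's position):

```python
def blend_probes(probe_values, weights):
    """Barycentric blend: weighted sum of four vertex probe colors."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to one"
    return tuple(
        sum(w * p[i] for w, p in zip(weights, probe_values))
        for i in range(3)
    )

# Four probes at a tetrahedron's corners; the object sits closest to the first:
probes = [(1.0, 0.9, 0.8), (0.2, 0.2, 0.2), (0.5, 0.4, 0.3), (0.1, 0.1, 0.1)]
weights = [0.55, 0.2, 0.15, 0.1]
print(blend_probes(probes, weights))
```

The result is dominated by the nearest probe but smoothly shifts as the object moves through the tetrahedron.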
By default, when a dynamic object is selected, gizmos show which probes affect it, along with the interpolated result at its position.
Where to place probes depends on the scene. First, they are only needed where dynamic objects go. Second, place them where the lighting changes: each probe is an interpolation endpoint, so position them around lighting transitions. Third, don't place them inside baked geometry, as those probes end up black. Finally, interpolation goes right through objects, so when opposite sides of an object are lit differently, place probes close to both sides.
3.2 Sampling Probes
As before, the interpolated probe data has to be sent to the GPU; it is another kind of per-object data:
perObjectData = PerObjectData.Lightmaps | PerObjectData.LightProbe
The data our shader needs is:
CBUFFER_START(UnityPerDraw)
    …
    float4 unity_SHAr;
    float4 unity_SHAg;
    float4 unity_SHAb;
    float4 unity_SHBr;
    float4 unity_SHBg;
    float4 unity_SHBb;
    float4 unity_SHC;
CBUFFER_END
Sample the light probes in GI, via a new SampleLightProbe function:
float3 SampleLightProbe (Surface surfaceWS) {
    #if defined(LIGHTMAP_ON)
        return 0.0;
    #else
        float4 coefficients[7];
        coefficients[0] = unity_SHAr;
        coefficients[1] = unity_SHAg;
        coefficients[2] = unity_SHAb;
        coefficients[3] = unity_SHBr;
        coefficients[4] = unity_SHBg;
        coefficients[5] = unity_SHBb;
        coefficients[6] = unity_SHC;
        return max(0.0, SampleSH9(coefficients, surfaceWS.normal));
    #endif
}
SampleSH9 needs the probe data and a normal vector. For reference:
float3 SampleSH9(float4 SHCoefficients[7], float3 N)
{
    float4 shAr = SHCoefficients[0];
    float4 shAg = SHCoefficients[1];
    float4 shAb = SHCoefficients[2];
    float4 shBr = SHCoefficients[3];
    float4 shBg = SHCoefficients[4];
    float4 shBb = SHCoefficients[5];
    float4 shCr = SHCoefficients[6];

    // Linear + constant polynomial terms
    float3 res = SHEvalLinearL0L1(N, shAr, shAg, shAb);

    // Quadratic polynomials
    res += SHEvalLinearL2(N, shBr, shBg, shBb, shCr);

    return res;
}

// Ref: "Efficient Evaluation of Irradiance Environment Maps" from ShaderX 2
real3 SHEvalLinearL0L1(real3 N, real4 shAr, real4 shAg, real4 shAb)
{
    real4 vA = real4(N, 1.0);

    real3 x1;
    // Linear (L1) + constant (L0) polynomial terms
    x1.r = dot(shAr, vA);
    x1.g = dot(shAg, vA);
    x1.b = dot(shAb, vA);

    return x1;
}

real3 SHEvalLinearL2(real3 N, real4 shBr, real4 shBg, real4 shBb, real4 shC)
{
    real3 x2;
    // 4 of the quadratic (L2) polynomials
    real4 vB = N.xyzz * N.yzzx;
    x2.r = dot(shBr, vB);
    x2.g = dot(shBg, vB);
    x2.b = dot(shBb, vB);

    // Final (5th) quadratic (L2) polynomial
    real vC = N.x * N.x - N.y * N.y;
    real3 x3 = shC.rgb * vC;

    return x2 + x3;
}
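The evaluation above is just dot products of the packed coefficients with polynomials of the normal. A Python sketch mirroring the same computation (the coefficient values in the example are made up for illustration):

```python
def dot4(a, b):
    return sum(x * y for x, y in zip(a, b))

def sample_sh9(coeffs, n):
    """Evaluate L0+L1+L2 spherical harmonics packed Unity-style into 7 float4s."""
    nx, ny, nz = n
    vA = (nx, ny, nz, 1.0)                     # linear (L1) + constant (L0) terms
    vB = (nx * ny, ny * nz, nz * nz, nz * nx)  # N.xyzz * N.yzzx
    vC = nx * nx - ny * ny                     # final quadratic (L2) polynomial
    return tuple(
        dot4(coeffs[ch], vA) + dot4(coeffs[3 + ch], vB) + coeffs[6][ch] * vC
        for ch in range(3)
    )

# Constant ambient light only: each channel's L0 term sits in the float4's w slot.
coeffs = [(0, 0, 0, 0.5)] * 3 + [(0, 0, 0, 0)] * 4
print(sample_sh9(coeffs, (0.0, 1.0, 0.0)))  # same result for any unit normal
```

With nonzero L1 and L2 coefficients the result varies with the normal, which is what gives probe-lit objects directional shading.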
An introduction to spherical harmonics is beyond the scope of these notes.
Apply the probe sample in GetGI:
GI GetGI (float2 lightMapUV, Surface surfaceWS) {
    GI gi;
    gi.diffuse = SampleLightMap(lightMapUV) + SampleLightProbe(surfaceWS);
    return gi;
}
3.3 Light Probe Proxy Volumes
Light probes work well for small dynamic objects, but because the lighting is based on a single point they don't work well for larger ones.
For those we can add a LightProbeProxyVolume component, LPPV for short, so the probe lighting can vary across the volume of a large object.
3.4 Sampling LPPVs
Again, configure the per-object data:
perObjectData =
    PerObjectData.Lightmaps | PerObjectData.LightProbe |
    PerObjectData.LightProbeProxyVolume
The shader needs the following data:
CBUFFER_START(UnityPerDraw)
    …
    float4 unity_ProbeVolumeParams;
    float4x4 unity_ProbeVolumeWorldToObject;
    float4 unity_ProbeVolumeSizeInv;
    float4 unity_ProbeVolumeMin;
CBUFFER_END
The volume data is stored in a 3D texture, unity_ProbeVolumeSH. Define it in GI:
TEXTURE3D_FLOAT(unity_ProbeVolumeSH);
SAMPLER(samplerunity_ProbeVolumeSH);
Whether an LPPV or interpolated light probes are used is indicated by the first component of unity_ProbeVolumeParams. To sample the proxy volume we use SampleProbeVolumeSH4, whose arguments are, in order: the texture and sampler, the world-space position and normal, the unity_ProbeVolumeWorldToObject matrix, the Y and Z components of unity_ProbeVolumeParams, and finally the XYZ components of the min and inverse-size vectors:
if (unity_ProbeVolumeParams.x) {
    return SampleProbeVolumeSH4(
        TEXTURE3D_ARGS(unity_ProbeVolumeSH, samplerunity_ProbeVolumeSH),
        surfaceWS.position, surfaceWS.normal,
        unity_ProbeVolumeWorldToObject,
        unity_ProbeVolumeParams.y, unity_ProbeVolumeParams.z,
        unity_ProbeVolumeMin.xyz, unity_ProbeVolumeSizeInv.xyz
    );
}
else {
    float4 coefficients[7];
    coefficients[0] = unity_SHAr;
    coefficients[1] = unity_SHAg;
    coefficients[2] = unity_SHAb;
    coefficients[3] = unity_SHBr;
    coefficients[4] = unity_SHBg;
    coefficients[5] = unity_SHBb;
    coefficients[6] = unity_SHC;
    return max(0.0, SampleSH9(coefficients, surfaceWS.normal));
}
For reference, SampleProbeVolumeSH4:
// This sample a 3D volume storing SH
// Volume is store as 3D texture with 4 R, G, B, Occ set of 4 coefficient store atlas in same 3D texture. Occ is use for occlusion.
// TODO: the packing here is inefficient as we will fetch values far away from each other and they may not fit into the cache - Suggest we pack RGB continuously
// TODO: The calcul of texcoord could be perform with a single matrix multicplication calcualted on C++ side that will fold probeVolumeMin and probeVolumeSizeInv into it and handle the identity case, no reasons to do it in C++ (ask Ionut about it)
// It should also handle the camera relative path (if the render pipeline use it)
float3 SampleProbeVolumeSH4(TEXTURE3D_PARAM(SHVolumeTexture, SHVolumeSampler), float3 positionWS, float3 normalWS, float4x4 WorldToTexture,
    float transformToLocal, float texelSizeX, float3 probeVolumeMin, float3 probeVolumeSizeInv)
{
    float3 position = (transformToLocal == 1.0) ? mul(WorldToTexture, float4(positionWS, 1.0)).xyz : positionWS;
    float3 texCoord = (position - probeVolumeMin) * probeVolumeSizeInv.xyz;
    // Each component is store in the same texture 3D. Each use one quater on the x axis
    // Here we get R component then increase by step size (0.25) to get other component. This assume 4 component
    // but last one is not used.
    // Clamp to edge of the "internal" texture, as R is from half texel to size of R texture minus half texel.
    // This avoid leaking
    texCoord.x = clamp(texCoord.x * 0.25, 0.5 * texelSizeX, 0.25 - 0.5 * texelSizeX);

    float4 shAr = SAMPLE_TEXTURE3D_LOD(SHVolumeTexture, SHVolumeSampler, texCoord, 0);
    texCoord.x += 0.25;
    float4 shAg = SAMPLE_TEXTURE3D_LOD(SHVolumeTexture, SHVolumeSampler, texCoord, 0);
    texCoord.x += 0.25;
    float4 shAb = SAMPLE_TEXTURE3D_LOD(SHVolumeTexture, SHVolumeSampler, texCoord, 0);

    return SHEvalLinearL0L1(normalWS, shAr, shAg, shAb);
}
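The coefficient fetches above rely on the four component sets being packed side by side along the texture's X axis, each occupying a quarter, with a half-texel clamp at the quarter's edges to avoid leaking between sets. A Python sketch of that addressing scheme (the texel size value is hypothetical; coordinates are normalized):

```python
def quarter_atlas_x(x, texel_size_x):
    """Map a [0, 1] volume X coordinate into the first quarter of the atlas,
    clamped half a texel in from the quarter's edges to avoid leaking."""
    x = x * 0.25
    lo = 0.5 * texel_size_x
    hi = 0.25 - 0.5 * texel_size_x
    return min(max(x, lo), hi)

# With a 1/32 texel size, the first fetch lands in the first quarter; the
# other component sets are reached by stepping X in increments of 0.25:
x0 = quarter_atlas_x(0.5, 1.0 / 32.0)
print([x0, x0 + 0.25, x0 + 0.5])
```

This matches the `texCoord.x += 0.25` steps in the reference code: one clamp, then fixed offsets for the remaining sets.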
4 Meta Pass
Indirect diffuse light bounces off surfaces, and while doing so it should be affected by those surfaces' diffuse reflectivity. Unity uses a special meta pass to determine the reflected light while baking.
4.1 Unified Input
Create a LitInput.hlsl file and move the shader's input declarations into it, to avoid duplicating them. Also add getter functions for the properties, hiding Unity's instancing macros:
#ifndef CUSTOM_LIT_INPUT_INCLUDED
#define CUSTOM_LIT_INPUT_INCLUDED

TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);

UNITY_INSTANCING_BUFFER_START(UnityPerMaterial)
    UNITY_DEFINE_INSTANCED_PROP(float4, _BaseMap_ST)
    UNITY_DEFINE_INSTANCED_PROP(float4, _BaseColor)
    UNITY_DEFINE_INSTANCED_PROP(float, _Cutoff)
    UNITY_DEFINE_INSTANCED_PROP(float, _Metallic)
    UNITY_DEFINE_INSTANCED_PROP(float, _Smoothness)
UNITY_INSTANCING_BUFFER_END(UnityPerMaterial)

float2 TransformBaseUV (float2 baseUV)
{
    float4 baseST = UNITY_ACCESS_INSTANCED_PROP(UnityPerMaterial, _BaseMap_ST);
    return baseUV * baseST.xy + baseST.zw;
}

float4 GetBase (float2 baseUV)
{
    float4 map = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, baseUV);
    float4 color = UNITY_ACCESS_INSTANCED_PROP(UnityPerMaterial, _BaseColor);
    return map * color;
}

float GetCutoff (float2 baseUV)
{
    return UNITY_ACCESS_INSTANCED_PROP(UnityPerMaterial, _Cutoff);
}

float GetMetallic (float2 baseUV)
{
    return UNITY_ACCESS_INSTANCED_PROP(UnityPerMaterial, _Metallic);
}

float GetSmoothness (float2 baseUV)
{
    return UNITY_ACCESS_INSTANCED_PROP(UnityPerMaterial, _Smoothness);
}

#endif
Include the shared files at the start of the shader, in a HLSLINCLUDE block:
SubShader
{
    HLSLINCLUDE
    #include "../ShaderLibrary/Common.hlsl"
    #include "LitInput.hlsl"
    ENDHLSL

    …
}
4.2 Meta Light Mode
Give the meta pass the Meta light mode:
Pass
{
    Tags
    {
        "LightMode" = "Meta"
    }

    Cull Off

    HLSLPROGRAM
    #pragma target 3.5
    #pragma vertex MetaPassVertex
    #pragma fragment MetaPassFragment
    #include "MetaPass.hlsl"
    ENDHLSL
}
Next, build a minimal pair of vertex and fragment functions. ZERO_INITIALIZE can be used to initialize a struct variable:
#ifndef CUSTOM_META_PASS_INCLUDED
#define CUSTOM_META_PASS_INCLUDED

#include "../ShaderLibrary/Surface.hlsl"
#include "../ShaderLibrary/Shadows.hlsl"
#include "../ShaderLibrary/Light.hlsl"
#include "../ShaderLibrary/BRDF.hlsl"

struct Attributes
{
    float3 positionOS : POSITION;
    float2 baseUV : TEXCOORD0;
};

struct Varyings
{
    float4 positionCS : SV_POSITION;
    float2 baseUV : VAR_BASE_UV;
};

Varyings MetaPassVertex (Attributes input)
{
    Varyings output;
    output.positionCS = 0.0;
    output.baseUV = TransformBaseUV(input.baseUV);
    return output;
}

float4 MetaPassFragment (Varyings input) : SV_TARGET
{
    float4 base = GetBase(input.baseUV);
    Surface surface;
    ZERO_INITIALIZE(Surface, surface);
    surface.color = base.rgb;
    surface.metallic = GetMetallic(input.baseUV);
    surface.smoothness = GetSmoothness(input.baseUV);
    BRDF brdf = GetBRDF(surface);
    float4 meta = 0.0;
    return meta;
}

#endif
4.3 Light Map Coordinates
Add the lightmap UV to the attributes and use its transformed value as the XY components of the object-space position:
struct Attributes
{
    float3 positionOS : POSITION;
    float2 baseUV : TEXCOORD0;
    float2 lightMapUV : TEXCOORD1;
};

…

Varyings MetaPassVertex (Attributes input)
{
    Varyings output;
    input.positionOS.xy =
        input.lightMapUV * unity_LightmapST.xy + unity_LightmapST.zw;
    output.positionCS = TransformWorldToHClip(input.positionOS);
    output.baseUV = TransformBaseUV(input.baseUV);
    return output;
}
This doesn't work correctly for OpenGL unless we also explicitly set the Z component of the object-space position:
input.positionOS.xy =
    input.lightMapUV * unity_LightmapST.xy + unity_LightmapST.zw;
input.positionOS.z = input.positionOS.z > 0.0 ? FLT_MIN : 0.0;
4.4 Diffuse Reflectivity
The meta pass can be used to generate different kinds of data; what is requested is communicated via unity_MetaFragmentControl:
bool4 unity_MetaFragmentControl;
The X component flags a request for the diffuse reflectivity:
float4 meta = 0.0;
if (unity_MetaFragmentControl.x)
{
    meta = float4(brdf.diffuse, 1.0);
}
return meta;
Unity's own meta pass also boosts the result: it adds half of the specular reflectivity scaled by roughness. The idea is that highly specular but rough materials also pass along some indirect light:
meta = float4(brdf.diffuse, 1.0);
meta.rgb += brdf.specular * brdf.roughness * 0.5;
After that, the result is adjusted by raising it to the unity_OneOverOutputBoost power via PositivePow, limited by unity_MaxOutputValue:
meta.rgb += brdf.specular * brdf.roughness * 0.5;
meta.rgb = min(
    PositivePow(meta.rgb, unity_OneOverOutputBoost), unity_MaxOutputValue
);
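Numerically, this boost stage is a power curve with a ceiling. A Python sketch of the whole meta-pass diffuse computation, using made-up input values (PositivePow is modeled as pow with the base guarded against non-positive values):

```python
def positive_pow(base, power):
    """pow with the base clamped to a tiny positive value, like the shader helper."""
    return max(base, 1e-6) ** power

def boost_meta(diffuse, specular, roughness, one_over_boost, max_value):
    """Meta-pass diffuse output: diffuse plus half the roughness-scaled specular,
    raised to a power and clamped to a maximum."""
    return tuple(
        min(positive_pow(d + s * roughness * 0.5, one_over_boost), max_value)
        for d, s in zip(diffuse, specular)
    )

# A gray, fairly smooth surface with hypothetical boost settings:
print(boost_meta((0.25, 0.25, 0.25), (0.5, 0.5, 0.5), 0.4, 0.5, 0.97))
```

An exponent below one lifts dark values, so dim surfaces still contribute some bounced light to the bake.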
Finally, factor the surface's diffuse reflectivity into the indirect light in GetLighting:
float3 color = gi.diffuse * brdf.diffuse;
5 Emissive Surfaces
5.1 Emission
Add two new properties to the shader:
[NoScaleOffset] _EmissionMap("Emission", 2D) = "white" {}
[HDR] _EmissionColor("Emission", Color) = (0.0, 0.0, 0.0, 0.0)
Declare the matching properties and a getter function in LitInput:
TEXTURE2D(_BaseMap);
TEXTURE2D(_EmissionMap);
SAMPLER(sampler_BaseMap);

UNITY_INSTANCING_BUFFER_START(UnityPerMaterial)
    UNITY_DEFINE_INSTANCED_PROP(float4, _BaseMap_ST)
    UNITY_DEFINE_INSTANCED_PROP(float4, _BaseColor)
    UNITY_DEFINE_INSTANCED_PROP(float4, _EmissionColor)
    …
UNITY_INSTANCING_BUFFER_END(UnityPerMaterial)

…

float3 GetEmission (float2 baseUV)
{
    float4 map = SAMPLE_TEXTURE2D(_EmissionMap, sampler_BaseMap, baseUV);
    float4 color = UNITY_ACCESS_INSTANCED_PROP(UnityPerMaterial, _EmissionColor);
    return map.rgb * color.rgb;
}
Add it at the end of the fragment shader:
float3 color = GetLighting(surface, brdf, gi);
color += GetEmission(input.baseUV);
return float4(color, surface.alpha);
5.2 Baked Emission
In the meta pass, emission is requested via the Y component of unity_MetaFragmentControl:
if (unity_MetaFragmentControl.x)
{
    …
}
else if (unity_MetaFragmentControl.y)
{
    meta = float4(GetEmission(input.baseUV), 1.0);
}
In CustomShaderGUI, we have to expose the option manually:
public override void OnGUI (
    MaterialEditor materialEditor, MaterialProperty[] properties
)
{
    EditorGUI.BeginChangeCheck();
    base.OnGUI(materialEditor, properties);
    editor = materialEditor;
    materials = materialEditor.targets;
    this.properties = properties;

    BakedEmission();
    …
}

void BakedEmission ()
{
    editor.LightmapEmissionProperty();
}
This makes a Global Illumination dropdown appear; setting it to Baked tells the lightmapper to bake the emission.
However, Unity aggressively tries to avoid separate emission passes while baking: if a material's emission is set to zero it gets ignored. This is indicated by the MaterialGlobalIlluminationFlags.EmissiveIsBlack flag of the material's globalIlluminationFlags property, so we clear that flag whenever the emission mode has been changed. That way the extra pass only runs when it should be baked:
void BakedEmission ()
{
    EditorGUI.BeginChangeCheck();
    editor.LightmapEmissionProperty();
    if (EditorGUI.EndChangeCheck())
    {
        foreach (Material m in editor.targets)
        {
            m.globalIlluminationFlags &=
                ~MaterialGlobalIlluminationFlags.EmissiveIsBlack;
        }
    }
}
6 Baked Transparency
6.1 Hard-Coded Properties
Unity's lightmapper has a hard-coded approach to transparency: it looks at a material's _MainTex, _Color, and _Cutoff properties. We only support _Cutoff so far, so add the other two, tagged with [HideInInspector]:
[HideInInspector] _MainTex("Texture for Lightmap", 2D) = "white" {}
[HideInInspector] _Color("Color for Lightmap", Color) = (0.5, 0.5, 0.5, 1.0)
6.2 Copying Properties
We have to make sure that _MainTex stays in sync with _BaseMap, and _Color with _BaseColor. Define a CopyLightMappingProperties method that copies them, invoked when the GUI has changed:
public override void OnGUI (
    MaterialEditor materialEditor, MaterialProperty[] properties
)
{
    …
    if (EditorGUI.EndChangeCheck())
    {
        SetShadowCasterPass();
        CopyLightMappingProperties();
    }
}

void CopyLightMappingProperties ()
{
    MaterialProperty mainTex = FindProperty("_MainTex", properties, false);
    MaterialProperty baseMap = FindProperty("_BaseMap", properties, false);
    if (mainTex != null && baseMap != null)
    {
        mainTex.textureValue = baseMap.textureValue;
        mainTex.textureScaleAndOffset = baseMap.textureScaleAndOffset;
    }
    MaterialProperty color = FindProperty("_Color", properties, false);
    MaterialProperty baseColor =
        FindProperty("_BaseColor", properties, false);
    if (color != null && baseColor != null)
    {
        color.colorValue = baseColor.colorValue;
    }
}
This does mean that baked transparency can only rely on a single texture, color, and cutoff property. It also means the lightmapper only considers material properties; per-instance properties are ignored.