Directional Lights
Directional lights are lights that shine in a single, uniform direction. That is, all rays of light are parallel to each other. Pure directional lights do not exist (except maybe lasers?), but they are often used in computer graphics to imitate strong light sources that are very far away, such as the Sun. The Sun radiates light in all directions, like a point light. Over an enormous distance, however, the tiny fraction of rays that make it to Earth are almost parallel.
Unlike point lights, directional lights do not need a position coordinate. A directional light only needs a single 3D vector that represents the direction of all the rays of light. However, the GLSL lighting code in our shader expects every light to have a position. Luckily, homogeneous coordinates give us a way out: a 4D vector with W = 0 represents a direction rather than a position, so we can store the light's direction in the position variable and set W = 0.
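The fragment shader can then check W to tell the two kinds of light apart. Here is a minimal sketch, assuming the surrounding lighting code provides light.position as a vec4, the surface position as surfacePos, and an attenuation factor:

vec3 surfaceToLight;
if(light.position.w == 0.0){
    // directional light: position.xyz is treated as the direction
    // from the surface toward the light
    surfaceToLight = normalize(light.position.xyz);
    attenuation = 1.0; // directional lights are not attenuated by distance
} else {
    // point light: position.xyz is an actual position in space
    surfaceToLight = normalize(light.position.xyz - surfacePos);
    // distance-based attenuation would be calculated here
}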
Spotlights

A spotlight is a point light that emits light only within a cone of directions, rather than in every direction. In the fragment shader, we can compare the angle between the spotlight's cone direction and the direction from the light to the surface; if that angle is greater than the cone angle, the surface is outside the cone and receives no light:
// angle between the cone's axis and the ray from the light to this fragment
float lightToSurfaceAngle = degrees(acos(dot(-surfaceToLight, normalize(light.coneDirection))));
if(lightToSurfaceAngle > light.coneAngle){
    attenuation = 0.0; // the surface is outside the cone, so it gets no light
}
Notice how the colors from all the lights are added together in our fragment shader. The RGBA values are supposed to be within the 0.0 to 1.0 range, so what happens if there are lots of lights, and the sum ends up being greater than 1.0? The colors would get clamped, and look weird. Also, if the lights are too dim, the whole scene could look basically black, with no detail. High-dynamic-range (HDR) rendering can help to fix these problems.
The human eye adjusts depending on the brightness of what it's looking at. If you are in a dark room for long enough, your pupils dilate to allow more light to reach your retina, which makes the room seem brighter. If you walk outside into bright sunlight, the opposite happens, so that you don't go blind from the intense light. HDR rendering sort of imitates how your eye works, in order to keep the details visible in very dark and very bright scenes. RGB values are allowed to go above 1.0 during lighting calculations, then the values are later rescaled so that they fit nicely within the 0.0–1.0 range.
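This rescaling step is called tone mapping. As an illustration, here is a minimal sketch using the Reinhard operator, one common tone mapping curve (the article does not specify which curve to use; linearColor stands for the assumed sum of all the lights' contributions):

// linearColor is the sum of every light's contribution, and may exceed 1.0
vec3 mapped = linearColor / (linearColor + vec3(1.0)); // Reinhard tone mapping
finalColor = vec4(mapped, 1.0); // mapped is now guaranteed to be within 0.0 to 1.0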
Subsurface Scattering
Light doesn't just reflect off of surfaces; it can travel through them too. When light penetrates a surface, it scatters around beneath the surface and changes the surface's apparent color. Rendering this effect is called subsurface scattering.
Even though skin looks fairly opaque, light does penetrate its outer layers, which is why subsurface scattering is generally used for realistic rendering of human skin. Without it, skin tends to look like painted plastic.
Emissive Surfaces
In our implementation, only lights can illuminate a surface. Some surfaces, however, provide their own illumination, which makes it look like they are glowing. Think of a firefly, glow-in-the-dark stickers, or those weird glowing mushrooms.
Emissive surfaces are pretty easy to implement in OpenGL. Send an extra color uniform to the shaders, along with the material's texture and shininess, and add that color to the final color. Alternatively, you can send an extra texture instead of a single color, so that the glow can vary across the surface.
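A minimal sketch of the single-color approach, assuming a hypothetical uniform named materialEmissiveColor and that the ambient, diffuse, and specular terms have already been calculated:

uniform vec3 materialEmissiveColor; // hypothetical uniform, set by the application like the texture and shininess

// later, in the lighting calculation:
vec3 linearColor = materialEmissiveColor + ambient + attenuation * (diffuse + specular);

With the texture-based alternative, you would sample the extra texture and add the sampled color instead.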
Normal Mapping
Due to performance constraints, 3D meshes are limited in the number of vertices they can contain. Making a surface look rough or bumpy with actual geometry can take a lot of vertices, so normal maps are often used instead. Normal maps can make an angular-looking 3D model look less angular, and more realistic.
A normal map is a texture that affects the surface normal. It is like the surface texture that we have implemented, except that it contains XYZ vectors instead of RGB colors. The surface normal is an important part of the lighting calculations, so changing it per pixel changes the brightness of each pixel, which creates the illusion of bumpy geometry.
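A minimal sketch of sampling one in the fragment shader, assuming hypothetical names normalMap and fragTexCoord. Texture channels are stored in the 0.0 to 1.0 range, so each component has to be remapped to the -1.0 to 1.0 range before it can be used as a vector:

uniform sampler2D normalMap; // hypothetical uniform holding the normal map

// in the fragment shader, instead of using the interpolated vertex normal:
vec3 normal = texture(normalMap, fragTexCoord).rgb;
normal = normalize(normal * 2.0 - 1.0); // remap each channel from [0,1] to [-1,1]

Note that the vectors in a normal map are usually stored in tangent space, so a full implementation also has to transform them into the same coordinate space as the rest of the lighting calculation.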