In this second part of our deferred rendering tutorial we will modify the previous example so that the lights' parameters are read from a texture. With this solution we can load any number of lights into our scene. Moreover, I've changed the scene itself, defining a more articulated room with more elements. You can work directly on the finished version of this example: it can be found in the folder that contains all the main classes (the Psyche engine is required).

________________________________________

Since we've changed the scene, the initialization function changes accordingly; the main modification concerns the object-loading calls.

```cpp
try{
    CheckAssignment(milosStatue = DataObjectManager::getInstance()->
        createGLSLRenderObjectFromFile("./models/milosStatue.exo",
            "shaders/DeferredRendering/Milo/deferredShading.vert",
            "shaders/DeferredRendering/Milo/deferredShading.frag") );

    CheckAssignment(milosColumns = DataObjectManager::getInstance()->
        createGLSLRenderObjectFromFile("./models/milosColumns.exo",
            "shaders/DeferredRendering/Milo/deferredShading.vert",
            "shaders/DeferredRendering/Milo/deferredShading_bump_specular.frag") );

    CheckAssignment(milosRoom = DataObjectManager::getInstance()->
        createGLSLRenderObjectFromFile("./models/milosRoom.exo",
            "shaders/DeferredRendering/Milo/deferredShading.vert",
            "shaders/DeferredRendering/Milo/deferredShading_bump_specular.frag") );
```

Once we have created the objects, we initialize the FBO buffer as we did in the previous tutorial, and then we initialize a new texture backed by a PBO (Pixel Buffer Object). This texture comes from a specific class that Psyche provides, and it allows us to read/write every single pixel on the texture's surface.
```cpp
    lightsTexture = new PBOFloatTexture(2, LIGHTS_COUNT, GL_RGB, GL_RGB32F_ARB);
    defBuffer = new FBODeferredShadingBuffer(shWIDTH, shHEIGHT);

    for(int i=0; i<LIGHTS_COUNT; i++){
        // Impostor creation: the loop body was lost in the original page.
    }

    // The element type of this container was lost in the original page;
    // it holds both the deferred-shading buffer and the lights texture.
    vector<Texture*> filterTextures;
    filterTextures.push_back(defBuffer);
    filterTextures.push_back(lightsTexture);

    ImageFilterBuilder builder;
    builder.addAcquireFromTextureOperator(filterTextures);
    builder.addDeferredShadingFilter(
        "./shaders/DeferredRendering/Milo/deferredRendering.vert",
        "./shaders/DeferredRendering/Milo/deferredRendering.frag");
    builder.addRenderToScreenOperator();
    screenFilter = builder.getImageFilter();

    agConsole.hide();
}
catch(PException* e){
    LogMex("Error", e->sException);
    return P_FAIL;
}
```

The other function we will analyze is doRender(). In this function we upload the lights' data to the texture and perform the rendering itself. First of all, let's look at the first rows, where we build the array that will fill the lights texture:

```cpp
void DeferredRenderingMilo::doRender(){
    Camera::setCurrentCamera(&camera);
    glDisable(GL_LIGHTING);

    float pixels[2*LIGHTS_COUNT*3];
    int index = 0;
    // The loop bounds and part of the body were lost in the original page
    // and have been reconstructed from the surrounding text.
    for (int row=0; row<LIGHTS_COUNT; row++){
        for (int col=0; col<2; col++){
            // Fill pixels[] for this light: left pixel = position,
            // right pixel = colour, as functions of row, col and time
            // (the original assignments were lost).
        }
        lightImpostor[row]->setPosition(pixels[index + 0],
                                        pixels[index + 1],
                                        pixels[index + 2]);
        index += 6;
    }
    lightsTexture->updatePixels(pixels);
```

The pixels array is laid out 2 pixels wide and LIGHTS_COUNT pixels high, counting every channel: the first three array elements are the RGB values of the top-left pixel of the texture. The for loop simply assigns position and colour values as a function of row, column and time. Notice that we also update every impostor's position to match the lights' motion. The remaining rows of this function are analogous to the same portion of code in the first part of this tutorial; the main difference is that we render the impostors too.
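To make the layout concrete, here is a minimal stand-alone sketch, not Psyche code, of how a 2 x LIGHTS_COUNT RGB float array maps onto the lights texture. The helper names and the sine-based animation are my own illustrative assumptions:

```cpp
#include <cmath>

const int LIGHTS_COUNT = 30;

// Each texture row holds one light: the left pixel (3 floats) stores its
// position, the right pixel (3 floats) stores its colour.
// Returns the index of the first channel of the given pixel.
int channelIndex(int row, int col) {
    return row * 2 * 3 + col * 3;   // 2 pixels per row, 3 channels per pixel
}

void fillLights(float* pixels, float time) {
    for (int row = 0; row < LIGHTS_COUNT; ++row) {
        int pos = channelIndex(row, 0);          // left pixel: position
        pixels[pos + 0] = std::sin(time + row);  // x (illustrative motion)
        pixels[pos + 1] = 1.0f;                  // y
        pixels[pos + 2] = std::cos(time + row);  // z

        int col = channelIndex(row, 1);          // right pixel: colour
        pixels[col + 0] = 1.0f;                  // r
        pixels[col + 1] = 0.5f;                  // g
        pixels[col + 2] = 0.25f;                 // b
    }
}
```

With this layout, index += 6 in the tutorial's loop simply jumps to the next row, i.e. to the next light.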
```cpp
    defBuffer->start();

    milosStatue->render();
    milosColumns->render();
    milosRoom->render();

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
    // Loop condition reconstructed; it was lost in the original page.
    for(int i=0; i<LIGHTS_COUNT; i++)
        lightImpostor[i]->render();
    glDisable(GL_BLEND);

    defBuffer->stop();
    screenFilter->executeFiltering();
};
```

Anyway, the most interesting part is the deferred rendering shader, updated to receive the lights texture and to use it. Let's analyze its code and comment it.

```glsl
uniform sampler2D tImage0;
uniform sampler2D tImage1;
uniform sampler2D tImage2;
uniform sampler2D lightTexture;
uniform vec3 cameraPosition;

void main( void )
{
    vec3 diffuse  = vec3(0,0,0);
    vec3 specular = vec3(0,0,0);
    vec3 light    = vec3(0,0,0);
    vec3 lightDir = vec3(0,0,0);
    vec3 vHalfVector = vec3(0,0,0);

    vec4 image0   = texture2D( tImage0, gl_TexCoord[0].xy );
    vec4 position = texture2D( tImage1, gl_TexCoord[0].xy );
    vec4 normal   = texture2D( tImage2, gl_TexCoord[0].xy );

    vec3 eyeDir = normalize(cameraPosition - position.xyz);
    float lightIntensity = 0;
    float specularIntensity = image0.a;
    float selfLighting = position.a;

    normal.w = 0;
    normal = normalize(normal);
```

The code so far just declares all the useful variables. The diffuse and specular vectors store the RGB values of the respective lighting components, while light, lightDir and vHalfVector will be used in the lighting loop. The three vectors image0, position and normal store the samples from the three textures, with the corresponding data as previously described. The view vector eyeDir is the unit vector from the current point's position towards the camera. The lightIntensity scalar is used to compute the lighting intensity for the current light. The two following scalars are new compared to the previous tutorials: we use the diffuse texture's alpha channel to store the specular intensity value (moved into specularIntensity), while selfLighting reads from the position texture's alpha channel how much self-illumination the pixel has. The very last rows are just a normalization safety pass.
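The direction of eyeDir matters for the specular term that follows: it points from the reconstructed world position towards the camera. A tiny CPU sketch of the same computation, using my own Vec3 helper rather than Psyche's or GLSL's math types:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Scale a vector to unit length (mirrors GLSL's normalize()).
Vec3 normalizeVec(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// eyeDir as computed in the shader: normalize(cameraPosition - position),
// i.e. the unit vector from the shaded point towards the camera.
Vec3 eyeDirection(Vec3 cameraPosition, Vec3 position) {
    return normalizeVec({ cameraPosition.x - position.x,
                          cameraPosition.y - position.y,
                          cameraPosition.z - position.z });
}
```

For a camera sitting on the +z axis looking at the origin, a point at the origin yields eyeDir = (0, 0, 1), as expected.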
Let's analyze the lighting section of the shader.

```glsl
    // Diffuse
    float lightCount = 30;
    for(int i=0; i<lightCount; i++){
        // Parts of this loop were lost in the original page and have been
        // reconstructed from the surrounding text: the light's position is
        // read from the left pixel of the lights texture, its colour from
        // the right pixel.
        light = texture2D(lightTexture, vec2(0.1, i/lightCount)).xyz;
        vec3 lightColor = texture2D(lightTexture, vec2(0.99, i/lightCount)).xyz;

        lightDir = light - position.xyz;
        // ... attenuation, lightIntensity and diffuse accumulation (lost) ...
        vHalfVector = normalize(normalize(lightDir) + eyeDir);

        if(lightIntensity > 0)
            specular += lightColor *
                pow(max(dot(normal.xyz, vHalfVector), 0.0), 10) *
                lightIntensity * 2;
    }

    gl_FragColor = vec4(diffuse, 1) * image0 +
                   vec4(specular * specularIntensity * 20, 1) +
                   selfLighting * (image0);
}
```

The variable lightCount is manually set to 30 since we have no particular need to manage the number of lights dynamically; we could obviously have encoded it in one of the lights texture's pixels, so that the light count could be read at run time. Similarly, any fixed parameter, such as the light's radius, could be read from the lights texture by adding some pixels to it. The variable light is loaded by reading the position of the current light from the lights texture; the computation I've performed to read the first pixel is not very accurate: I just sample at x = 0.1, where I know the left pixel is. Vertical addressing on the texture works similarly. The variable lightColor is set by reading from the lights texture again, but this time from the right pixel, so I sample at x = 0.99. Anyway, if one wants to use more parameters, it would undoubtedly be more elegant to implement a specific function for this task. The remaining computations are the same as in the previous tutorial, and they are widely documented anyway (the Blinn-Phong lighting model).

The last shader worth analyzing is the impostor's one, even if it is very simple.

```glsl
varying vec4 position;
varying vec4 normals;
varying mat4 TBN;

uniform sampler2D tDiffuse;

void main( void )
{
    gl_FragData[0] = texture2D(tDiffuse, gl_TexCoord[0].st);
    gl_FragData[1] = vec4(0,0,0,gl_FragData[0].a);
    gl_FragData[2] = vec4(0,0,0,0);
}
```

The interesting thing shown in this shader is that the position texture's alpha channel (gl_FragData[1]) contains the self-illumination data.
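The sampling coordinates above rely on how a normalized texture coordinate maps to a texel: on a texture that is width texels wide, the coordinate u falls into texel floor(u * width), assuming the lights texture is sampled without interpolation across pixels (an assumption of mine; the tutorial does not state the filtering mode). A stand-alone sketch of that addressing math, with hypothetical helper names:

```cpp
// Maps a normalized coordinate in [0,1) to a texel index on an axis that is
// `width` texels long: texel = floor(u * width). On the 2-pixel-wide lights
// texture, u = 0.1 lands in the left pixel and u = 0.99 in the right one.
int texelIndex(float u, int width) {
    return static_cast<int>(u * width);
}

// A candidate vertical coordinate for light i out of `count` lights,
// sampling the centre of row i (the exact v used in the shader was lost).
float rowCenter(int i, int count) {
    return (i + 0.5f) / count;
}
```

This is why the tutorial can get away with the "not very accurate" constants 0.1 and 0.99: any u inside the correct half of the texture selects the right pixel.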
In practice, if it is set to 1 the pixel is fully self-illuminated, if it is set to 0 the pixel just receives the normal shading, and for any value in between we get an interpolation of these two behaviours. Notice how this parameter is read from the alpha channel of the diffuse texture: this allows the impostor to self-illuminate where the texture's alpha is 1, and to be transparent where the alpha is null.

________________________________________

I hope this tutorial helped you! I wrote it assuming that the reader is at least an intermediate graphics programmer, so I did not describe the code row by row. There is no doubt that the Psyche engine simplifies the code a lot, so I really encourage you to download the code and get your hands into it to understand how it works. As always, for anything you may need (comments, error reports, requests, etc.), don't hesitate to get in touch with me by writing at