Directly Mapping Texels to Pixels

http://blogs.msdn.com/jsteed/articles/209220.aspx

When rendering 2D output using pre-transformed vertices, care must be taken to ensure that each texel area correctly corresponds to a single pixel area; otherwise, texture distortion can occur. This article describes the process that Direct3D follows when rasterizing and texturing triangles. By understanding the basics of this process you can ensure your Direct3D application correctly renders 2D output.

Let's assume we have a 6x6 display, as shown in Figure 1:

In this diagram, the pixels are modeled as squares, but this is somewhat misleading. Pixels usually aren't squares; they're little dots. Each square indicates the area lit by a pixel, but the pixel itself is just a dot at the center of its square. The distinction may seem small, but it's important. A better illustration of the same display is shown in Figure 2:

This diagram correctly shows each physical pixel as a point in the center of a grid cell. The screen space coordinate (0, 0) is located directly at the top-left pixel, and therefore at the center of the top-left cell. The top-left corner of the display is therefore at (-0.5, -0.5) since it's .5 cells to the left and .5 cells up from the top-left pixel. Now, let's ask Direct3D to render a quad with corners at (0, 0) and (4, 4):
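
As a concrete sketch of what that request might look like in code, here is one way a Direct3D 9 application could submit such a quad using pre-transformed (D3DFVF_XYZRHW) vertices. The vertex structure, the per-vertex color, and the pDevice pointer are illustrative assumptions, not part of the original article:

#include <d3d9.h>

// Pre-transformed vertex: x/y are screen-space pixel coordinates.
struct SCREENVERTEX
{
    float x, y, z, rhw;
    DWORD color;
};
#define SCREENVERTEX_FVF ( D3DFVF_XYZRHW | D3DFVF_DIFFUSE )

// Solid blue quad with corners at (0, 0) and (4, 4), drawn as a
// two-triangle strip. pDevice is assumed to be an initialized
// IDirect3DDevice9*.
void DrawBlueQuad( IDirect3DDevice9* pDevice )
{
    const DWORD blue = D3DCOLOR_XRGB( 0, 0, 255 );
    SCREENVERTEX quad[4] =
    {
        { 0.0f, 0.0f, 0.0f, 1.0f, blue },   // top-left
        { 4.0f, 0.0f, 0.0f, 1.0f, blue },   // top-right
        { 0.0f, 4.0f, 0.0f, 1.0f, blue },   // bottom-left
        { 4.0f, 4.0f, 0.0f, 1.0f, blue },   // bottom-right
    };
    pDevice->SetFVF( SCREENVERTEX_FVF );
    pDevice->DrawPrimitiveUP( D3DPT_TRIANGLESTRIP, 2, quad, sizeof( SCREENVERTEX ) );
}
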
Figure 3 shows where our mathematical quad is in relation to the display, but doesn't show what the quad will look like once Direct3D rasterizes it and outputs it to the display. In fact, before moving on, make sure you agree that it's impossible for a raster display to fill the quad exactly as shown, because the edges of the quad don't coincide with the boundaries between pixel cells.

Stated another way: each pixel can only output a single color, so each pixel cell is filled with a single color. If the display were to render the quad exactly as shown, the pixel cells along the quad's edge would need to show two distinct colors: blue where covered by the quad and white where only the background is visible.

Instead, the graphics hardware is tasked with determining which pixels should be filled to approximate the quad. This process is called Rasterization, and is detailed in the DirectX SDK documentation under the heading Rasterization Rules. For this particular case, the rasterized quad is shown in Figure 4:

Jump back to Figure 3 to compare the quad we told Direct3D to draw with what was finally rendered. You can see that what the display actually outputs is the correct size, but has been shifted by -0.5 cells in the x and y directions. It is, however, the best possible approximation to the quad. (Multi-sampling techniques compute each pixel's output by averaging color calculations taken at multiple points within the pixel cell, which makes for a better approximation but has the effect of blurring the edges. Multi-sampling is an advanced topic and is outside the scope of this article; see the AntiAlias sample included with the DirectX SDK for thorough coverage.) Realize that if the rasterizer filled every cell the quad crossed, the resulting area would be 5x5 instead of the desired 4x4.
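
To make that shift concrete, here is a small sketch (plain C++, not Direct3D code) of the coverage decision for this axis-aligned case: a pixel is filled when its center lies inside the quad, and centers that land exactly on the right or bottom edge are excluded, in the spirit of the top-left rasterization rule. Running it for the quad (0, 0)-(4, 4) fills exactly the 16 pixels with centers at x, y in 0..3:

#include <cstdio>

// Simplified coverage test for an axis-aligned quad in screen space,
// where pixel centers sit at integer coordinates. Left/top edges are
// inclusive, right/bottom edges exclusive (top-left rule).
bool PixelCovered( int px, int py, float left, float top, float right, float bottom )
{
    return px >= left && px < right && py >= top && py < bottom;
}

int main()
{
    for ( int y = 0; y < 6; ++y )
    {
        for ( int x = 0; x < 6; ++x )
            std::printf( PixelCovered( x, y, 0.0f, 0.0f, 4.0f, 4.0f ) ? "X " : ". " );
        std::printf( "\n" );
    }
    return 0;   // prints a 4x4 block of X's covering pixels (0,0) through (3,3)
}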

To the unsuspecting individual who assumes that screen coordinates originate at the top-left corner of the display grid instead of the top-left pixel, the quad appears exactly as expected, but the difference becomes clear when the quad is given a texture. Here's the 4x4 texture we'll map directly onto the quad:

Since the texture is 4x4 texels and the quad is 4x4 pixels, we might expect the textured quad to appear exactly like the texture regardless of where on the screen the quad is drawn. However, this is not the case; even slight changes in position will influence how the texture is displayed. Figure 6 shows how a quad between (0, 0) and (4, 4) is displayed after being rasterized and textured:
The rest of this article explains exactly why the output looks the way it does instead of looking like the texture, but for those who are already bored and want the solution, here it is: the edges of the input quad need to lie along the boundary lines between pixel cells. By simply shifting the quad's x and y coordinates by -0.5 units, texel cells will perfectly cover pixel cells and the quad can be perfectly recreated on the screen. Skip ahead to Figure 9 to see the quad at the corrected coordinates.

The details of why the rasterized output bears only a slight resemblance to the input texture are directly related to the way Direct3D addresses and samples textures. What follows assumes you have a good understanding of texture coordinate space and bilinear filtering. See the companion article "Bilinear Texture Filtering and You" if you'd like a zesty review.

Getting back to our investigation of the strange pixel output, it makes sense to trace the output color back to the pixel shader: The pixel shader is called for each pixel selected to be part of the rasterized shape. The solid blue quad we rendered in the beginning could have a particularly simple shader:

// Pixel shader for the solid blue quad: every rasterized pixel is blue.
float4 SolidBluePS() : COLOR
{
    return float4( 0, 0, 1, 1 );
}

For the textured quad, the pixel shader has to be changed slightly:

// Texture and sampler used by the textured quad. Bilinear (Linear)
// filtering is enabled for both minification and magnification, and
// texture addressing is clamped in U and V.
texture MyTexture;
sampler MySampler = sampler_state
{
    Texture   = <MyTexture>;
    MinFilter = Linear;
    MagFilter = Linear;
    AddressU  = Clamp;
    AddressV  = Clamp;
};

// Pixel shader for the textured quad: return the filtered texture color
// at the interpolated texture coordinates.
float4 TextureLookupPS( float2 vTexCoord : TEXCOORD0 ) : COLOR
{
    return tex2D( MySampler, vTexCoord );
}

That code assumes the 4x4 texture of Figure 5 is stored in the MyTexture parameter. As shown, the MySampler texture sampler is set to perform bilinear filtering on MyTexture. The pixel shader is called once for each rasterized pixel, and each time it's called the vTexCoord argument holds the texture coordinates interpolated at that pixel. The shader simply returns the filtered texture color sampled at vTexCoord, which means it's asking the texture sampler for the filtered color at the exact location of the pixel, as detailed in Figure 8:
In that diagram, the black dots show where the rasterized pixels are. The texture coordinates at each pixel are easily determined by interpolating the coordinates stored at each vertex: the pixel at (0, 0) coincides with the vertex at (0, 0), so the texture coordinates at that pixel are simply the texture coordinates stored at that vertex, UV (0.0, 0.0). For the pixel at (3, 1), the interpolated coordinates are UV (0.75, 0.25) because that pixel lies 3/4 of the way across the quad and 1/4 of the way down it. These interpolated coordinates are what get passed to the pixel shader.
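
As a sketch of that interpolation (plain C++, with assumed names), the UVs for any pixel inside the quad can be computed by measuring how far across and down the quad the pixel sits:

struct UV { float u, v; };

// Interpolate the texture coordinates at a pixel inside a screen-space quad.
// (x0, y0)-(x1, y1) are the quad's corners, which carry UVs (0, 0) and (1, 1).
UV InterpolateUV( float px, float py, float x0, float y0, float x1, float y1 )
{
    UV result;
    result.u = ( px - x0 ) / ( x1 - x0 );
    result.v = ( py - y0 ) / ( y1 - y0 );
    return result;
}

// For the quad (0, 0)-(4, 4):
// InterpolateUV( 0, 0, 0, 0, 4, 4 ) -> UV (0.00, 0.00)
// InterpolateUV( 3, 1, 0, 0, 4, 4 ) -> UV (0.75, 0.25)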

As you can see, in this example the texels don't line up with the pixels; each pixel (and therefore each sampling point) is positioned at the corner of four texels. Since the filtering mode is set to Linear, the sampler averages the colors of the four texels sharing that corner. This explains why the pixel we expected to be red is actually 3/4 gray + 1/4 red, the pixel expected to be green is 1/2 gray + 1/4 red + 1/4 green, and so on.
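
The sketch below illustrates why that averaging happens. It is a generic software model of bilinear filtering with Clamp addressing, not Direct3D's actual implementation: texel centers sit at (i + 0.5) / width, so a sample taken exactly on a texel corner lands halfway between texel centers on both axes and the four surrounding texels each contribute a weight of 1/4. The Color and Lerp helpers are assumptions made for this sketch:

#include <algorithm>
#include <cmath>

struct Color { float r, g, b, a; };

static Color Lerp( const Color& a, const Color& b, float t )
{
    return { a.r + ( b.r - a.r ) * t, a.g + ( b.g - a.g ) * t,
             a.b + ( b.b - a.b ) * t, a.a + ( b.a - a.a ) * t };
}

// Software model of a bilinear (Linear/Linear) sample with Clamp addressing
// from a width x height texture stored row by row in 'texels'.
Color SampleBilinear( const Color* texels, int width, int height, float u, float v )
{
    // Convert UV to texel space and shift by 0.5 so 0 lines up with the
    // center of the first texel.
    float x = u * width  - 0.5f;
    float y = v * height - 0.5f;
    int   x0 = (int)std::floor( x );
    int   y0 = (int)std::floor( y );
    float fx = x - x0;   // blend weight between horizontal neighbors
    float fy = y - y0;   // blend weight between vertical neighbors

    auto fetch = [&]( int tx, int ty )
    {
        tx = std::clamp( tx, 0, width  - 1 );   // Clamp addressing
        ty = std::clamp( ty, 0, height - 1 );
        return texels[ty * width + tx];
    };

    Color top    = Lerp( fetch( x0, y0     ), fetch( x0 + 1, y0     ), fx );
    Color bottom = Lerp( fetch( x0, y0 + 1 ), fetch( x0 + 1, y0 + 1 ), fx );
    return Lerp( top, bottom, fy );
}

// For a 4x4 texture, UV (0.75, 0.25) is the shared corner of four texels,
// so fx = fy = 0.5 and each of the four texels contributes exactly 1/4.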

To fix this problem, all we need to do is correctly map the quad to the pixels it will be rasterized to, and thereby correctly map the texels to pixels. Figure 9 shows the results of drawing the same quad between (-0.5, -0.5) and (3.5, 3.5), which is the quad we wanted all along:
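
In application code, the fix amounts to nothing more than subtracting 0.5 from the x and y of each pre-transformed vertex. Here is a sketch of the corrected, textured quad; as before, the vertex structure and pDevice are assumptions used for illustration:

#include <d3d9.h>

// Pre-transformed, textured vertex.
struct TEXVERTEX
{
    float x, y, z, rhw;
    float tu, tv;
};
#define TEXVERTEX_FVF ( D3DFVF_XYZRHW | D3DFVF_TEX1 )

// Quad shifted by -0.5 in x and y so texel cells line up with pixel cells.
void DrawTexturedQuad( IDirect3DDevice9* pDevice )
{
    TEXVERTEX quad[4] =
    {
        { -0.5f, -0.5f, 0.0f, 1.0f, 0.0f, 0.0f },   // top-left
        {  3.5f, -0.5f, 0.0f, 1.0f, 1.0f, 0.0f },   // top-right
        { -0.5f,  3.5f, 0.0f, 1.0f, 0.0f, 1.0f },   // bottom-left
        {  3.5f,  3.5f, 0.0f, 1.0f, 1.0f, 1.0f },   // bottom-right
    };
    pDevice->SetFVF( TEXVERTEX_FVF );
    pDevice->DrawPrimitiveUP( D3DPT_TRIANGLESTRIP, 2, quad, sizeof( TEXVERTEX ) );
}

With the quad at these coordinates, the interpolated texture coordinate at each rasterized pixel lands exactly on a texel center, so the bilinear filter returns the unmodified texel color.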

In summary, pixels and texels are actually points, not solid blocks. Screen space originates at the top-left pixel, but texture coordinates originate at the top-left corner of the texture's grid. Most importantly, remember to subtract 0.5 units from the x and y components of your vertex positions when working in transformed screen space in order to correctly align texels with pixels.
