5.1 - Unsigned float textures
Among the improvements brought by the GeForce 8800, a whole group concerns textures.
The EXT_packed_float extension allows an RGB color to be stored in 32 bits instead of 96 bits with single-precision floats or 48
bits with half floats. To do so, the red and green components each get a 6-bit mantissa while the blue component only gets a
5-bit one, for a total of 11 bits for the red and the green but only 10 bits for the blue. The components cannot be signed;
only positive numbers are representable. We can consider that this extension brings the unsigned float to OpenGL.
EXT_texture_shared_exponent has a similar goal, also with 32-bit storage. In that case, however, each component uses 9 bits
for its mantissa and the three components share a single 5-bit exponent. Here again we have an unsigned float.
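To make this concrete, here is a minimal sketch, not taken from the article, of how such textures could be allocated through the OpenGL API, assuming a context that exposes EXT_packed_float and EXT_texture_shared_exponent; the 256x256 size and the NULL data pointer are placeholders.

/* Allocating textures with the two packed unsigned-float formats. */
#include <GL/gl.h>
#include <GL/glext.h>

void create_packed_float_textures(void)
{
    GLuint tex[2];
    glGenTextures(2, tex);

    /* 11/11/10-bit unsigned floats, 32 bits per texel (EXT_packed_float). */
    glBindTexture(GL_TEXTURE_2D, tex[0]);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R11F_G11F_B10F_EXT,
                 256, 256, 0, GL_RGB, GL_FLOAT, NULL);

    /* Three 9-bit mantissas sharing one 5-bit exponent
     * (EXT_texture_shared_exponent). */
    glBindTexture(GL_TEXTURE_2D, tex[1]);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB9_E5_EXT,
                 256, 256, 0, GL_RGB, GL_FLOAT, NULL);
}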
5.2 - Texture buffers
Here is another type of texture: buffer textures, brought by the EXT_texture_buffer_object extension.
Vertex Buffer Object, Pixel Buffer Object, Bindable Uniform Buffer Object and now Texture Buffer Object? Yes, and all of that
for the GeForce 8800. This new object is intended to store data that is accessed afterwards through non-normalized
texture coordinates. Mipmaps and filtering are not available. It is also possible to capture the varying variables output
by the vertex and geometry shaders into this buffer by combining it with the NV_transform_feedback extension. This output
stream from the vertex or geometry shader can of course also be written into other buffer objects, such as a vertex buffer
object, which introduces the notion of iteration at the vertex and geometry shader level. This is, incidentally, how
geometry subdivision over N iterations is made possible.
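As an illustration, here is a hedged sketch of how a texture buffer object could be set up, assuming EXT_texture_buffer_object is exposed and that the extension entry points have already been loaded; the GL_RGBA32F_ARB internal format and the buffer size are arbitrary choices.

/* Filling a buffer object and exposing it to the shaders as a buffer texture. */
#include <GL/gl.h>
#include <GL/glext.h>

void create_texture_buffer(const float *data, int count_vec4)
{
    GLuint buffer, texture;

    /* Ordinary buffer object holding the raw data. */
    glGenBuffers(1, &buffer);
    glBindBuffer(GL_TEXTURE_BUFFER_EXT, buffer);
    glBufferData(GL_TEXTURE_BUFFER_EXT, count_vec4 * 4 * sizeof(float),
                 data, GL_STATIC_DRAW);

    /* Texture object of the buffer texture type; the shader then reads it with
     * non-normalized integer coordinates (texelFetchBuffer in EXT_gpu_shader4). */
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_BUFFER_EXT, texture);
    glTexBufferEXT(GL_TEXTURE_BUFFER_EXT, GL_RGBA32F_ARB, buffer);
}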
5.3 - Integer representation
Related to the EXT_texture_buffer_object extension, but even closer to the usual OpenGL textures, the EXT_texture_integer extension
makes it possible to use any type of texture whose representation in the shaders is an integer one.
Although a 2D texture is most of the time stored in memory as a set of integers, each component appears in the shaders
as a floating-point number between 0 and 1. With this extension this is no longer the case, and the data can be accessed
as integers, signed or not. Everything works through new internal texture storage formats on the OpenGL API side,
GL_RGB8UI_EXT for example, and through new texture lookup functions in GLSL.
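A small sketch of what this looks like on the API side, assuming EXT_texture_integer is exposed; the data layout (8-bit unsigned RGB) is just one possible choice.

/* Creating an 8-bit unsigned integer RGB texture. GL_RGB_INTEGER_EXT tells
 * OpenGL that the source data must not be normalized to [0, 1]. */
#include <GL/gl.h>
#include <GL/glext.h>

void create_integer_texture(const unsigned char *texels, int w, int h)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Integer textures cannot be filtered, so nearest sampling is used. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8UI_EXT,
                 w, h, 0, GL_RGB_INTEGER_EXT, GL_UNSIGNED_BYTE, texels);
    /* In GLSL (EXT_gpu_shader4) the texture is then declared as a usampler2D. */
}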
5.4 - Texture arrays
As its name shows, EXT_texture_array brings the notion of texture arrays. A one-dimensional texture array is a
collection of 1D textures and a two-dimensional texture array is a collection of 2D textures, every texture in a given
array having the same size. To access an entry of the array, we use the texture coordinates, the coordinate that selects
the entry being passed to the shader as a non-normalized float number.
Apart from that, a texture array behaves like a 2D or 3D texture, except that there is no filtering between two
entries of the array; that operation can however be done in the shader.
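As an illustration, here is a minimal sketch of how a two-dimensional texture array could be allocated, assuming EXT_texture_array is exposed; formats and sizes are placeholders.

/* Allocating a 2D texture array of "layers" images of identical size. */
#include <GL/gl.h>
#include <GL/glext.h>

void create_texture_array(int width, int height, int layers)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D_ARRAY_EXT, tex);

    /* The array is allocated like a 3D texture: the "depth" is the number of
     * layers, and every layer has the same width and height. */
    glTexImage3D(GL_TEXTURE_2D_ARRAY_EXT, 0, GL_RGBA8,
                 width, height, layers, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Filtering applies inside a layer only; there is no filtering between two
     * layers. In GLSL (EXT_gpu_shader4) the sampler type is sampler2DArray and
     * the third, non-normalized coordinate selects the layer. */
    glTexParameteri(GL_TEXTURE_2D_ARRAY_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D_ARRAY_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}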
5.5 - Texture compression
Two new extensions bring two brand new compression formats… or just one. On one side, we have EXT_texture_compression_rgtc
(Red-Green Texture Compression), which allows storing a texture with one or two color components in a block-compressed
form (like S3TC). On the other side, we have EXT_texture_compression_latc (Luminance-Alpha Texture Compression),
which does exactly the same thing with the same compression ratio (2:1). Frankly speaking, I suppose that
the reason for having those two extensions is semantics, or compatibility with Direct3D 10. These formats are
well suited to storing normal maps: since a normal map contains normalized vectors, the third component can easily be
reconstructed from the other two, B = sqrt(1 - R² - G²), once the components are mapped back to the [-1, 1] range.
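For illustration, here is a small sketch of that reconstruction; the function name is hypothetical, and the same computation is normally done in the pixel shader after sampling the two-channel texture.

/* Reconstructing the third component of a unit normal stored in a two-channel
 * (RGTC/LATC) texture. r and g are assumed to be the sampled values already
 * mapped back from [0, 1] to [-1, 1]. */
#include <math.h>

float reconstruct_normal_z(float r, float g)
{
    /* For a normalized vector, r*r + g*g + b*b = 1, hence: */
    float t = 1.0f - r * r - g * g;
    return t > 0.0f ? sqrtf(t) : 0.0f;  /* clamp against rounding errors */
}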