#include <OgreD3D9DepthBuffer.h>
◆ DepthFormatsMask
Enumerator | Description
---|---
DFM_D32 | Bit for 32-bit depth formats
DFM_D24 | Bit for 24-bit depth formats
DFM_D16 | Bit for 16-bit depth formats
DFM_S8 | Bit for 8-bit stencil formats
◆ PoolId
Enumerator | Description
---|---
POOL_NO_DEPTH | Targets in this pool never get a depth buffer attached
POOL_MANUAL_USAGE | Depth buffers are attached manually rather than by the pool system
POOL_DEFAULT | Default pool; compatible depth buffers are shared automatically
POOL_NON_SHAREABLE | The target gets its own depth buffer, never shared with other targets
POOL_INVALID | Sentinel value for an invalid pool id
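For illustration, a minimal sketch of routing a render target to one of these pools. It assumes a valid `Ogre::RenderTarget` pointer and uses `RenderTarget::setDepthBufferPool`; the function name `configureDepthPool` is a placeholder:

```cpp
#include <Ogre.h>

// Sketch: assign a render target to a depth buffer pool. 'target' is
// assumed to be a valid RenderWindow or render-to-texture target.
void configureDepthPool(Ogre::RenderTarget* target)
{
    // No depth buffer will ever be attached to this target.
    target->setDepthBufferPool(Ogre::DepthBuffer::POOL_NO_DEPTH);

    // Alternatively, give the target its own depth buffer that is
    // never shared with other render targets:
    // target->setDepthBufferPool(Ogre::DepthBuffer::POOL_NON_SHAREABLE);
}
```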
◆ D3D9DepthBuffer()
Ogre::D3D9DepthBuffer::D3D9DepthBuffer( uint16 poolId,
                                        D3D9RenderSystem * renderSystem,
                                        IDirect3DDevice9 * creator,
                                        IDirect3DSurface9 * depthBufferSurf,
                                        D3DFORMAT fmt,
                                        uint32 width,
                                        uint32 height,
                                        uint32 fsaa,
                                        uint32 multiSampleQuality,
                                        bool isManual )
◆ ~D3D9DepthBuffer()
Ogre::D3D9DepthBuffer::~D3D9DepthBuffer( )
◆ getDepthBufferSurface()
IDirect3DSurface9* Ogre::D3D9DepthBuffer::getDepthBufferSurface( ) const
◆ getDeviceCreator()
IDirect3DDevice9* Ogre::D3D9DepthBuffer::getDeviceCreator( ) const
◆ isCompatible()
virtual bool Ogre::D3D9DepthBuffer::isCompatible( RenderTarget * renderTarget ) const
◆ AvailableDepthFormats
uint8 Ogre::DepthBuffer::AvailableDepthFormats
static, inherited
During initialization, DefaultDepthBufferFormat is overridden with a supported format.
This can be troublesome when creating the first render window, as you cannot tell Ogre which format you wish to use for that window.
That's where AvailableDepthFormats comes in:
Before initialization, the user can set this mask to indicate which formats they are willing to use. Ogre will try formats from best to worst until a supported one is found. The default value has all bits set.
Set this to 0 to never use depth buffers. If you only wish render windows not to use depth buffers, create the window with miscParam["depth_buffer"] = "no"; a sketch of both usages follows below.
After initialization, the mask is left untouched.
See DepthBuffer::DepthFormatsMask.
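As a minimal sketch of both usages, assuming a standard Ogre setup (the plugin config path and window name below are placeholders, not part of this API):

```cpp
#include <Ogre.h>

int main()
{
    // Before any initialization: restrict Ogre to 24-bit depth and
    // 8-bit stencil formats (any combination of DepthFormatsMask bits).
    Ogre::DepthBuffer::AvailableDepthFormats =
        Ogre::DepthBuffer::DFM_D24 | Ogre::DepthBuffer::DFM_S8;

    Ogre::Root root("plugins.cfg"); // placeholder plugin config path
    root.restoreConfig();           // assumes a previously saved ogre.cfg

    // Initialise without an auto-created window so misc params can be passed.
    root.initialise(false);

    // A render window that opts out of depth buffers entirely,
    // as described above.
    Ogre::NameValuePairList miscParams;
    miscParams["depth_buffer"] = "no";
    Ogre::RenderWindow* window =
        root.createRenderWindow("Main", 1280, 720, false, &miscParams);
    (void)window;

    return 0;
}
```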
◆ DefaultDepthBufferFormat
The documentation for this class was generated from the following file: OgreD3D9DepthBuffer.h