ID3D10Device::IASetIndexBuffer method (d3d10.h)
Bind an index buffer to the input-assembler stage.
Syntax
void IASetIndexBuffer(
  [in] ID3D10Buffer *pIndexBuffer,
  [in] DXGI_FORMAT  Format,
  [in] UINT         Offset
);
Parameters
[in] pIndexBuffer
Type: ID3D10Buffer*
A pointer to a buffer (see ID3D10Buffer) that contains indices. The index buffer must have been created with the D3D10_BIND_INDEX_BUFFER flag.
[in] Format
Type: DXGI_FORMAT
Specifies the format of the data in the index buffer. The only formats allowed for index buffer data are 16-bit (DXGI_FORMAT_R16_UINT) and 32-bit (DXGI_FORMAT_R32_UINT) unsigned integers.
[in] Offset
Type: UINT
Offset (in bytes) from the start of the index buffer to the first index to use.
Return value
None
Remarks
For information about creating index buffers, see Create an Index Buffer.
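The following sketch shows one way to create a small index buffer and bind it to the input-assembler stage. The device pointer, the index data, and the helper name BindExampleIndexBuffer are illustrative assumptions rather than part of this reference; error handling is reduced to a single HRESULT check.

```cpp
#include <d3d10.h>

HRESULT BindExampleIndexBuffer(ID3D10Device *pDevice, ID3D10Buffer **ppIndexBuffer)
{
    // 16-bit indices for two triangles forming a quad.
    const WORD indices[] = { 0, 1, 2, 2, 1, 3 };

    D3D10_BUFFER_DESC desc = {};
    desc.Usage          = D3D10_USAGE_DEFAULT;
    desc.ByteWidth      = sizeof(indices);
    desc.BindFlags      = D3D10_BIND_INDEX_BUFFER;   // required for use with IASetIndexBuffer
    desc.CPUAccessFlags = 0;
    desc.MiscFlags      = 0;

    D3D10_SUBRESOURCE_DATA initData = {};
    initData.pSysMem = indices;

    HRESULT hr = pDevice->CreateBuffer(&desc, &initData, ppIndexBuffer);
    if (FAILED(hr))
        return hr;

    // Bind the buffer: the format matches the 16-bit indices, and offset 0
    // starts reading at the first index in the buffer.
    pDevice->IASetIndexBuffer(*ppIndexBuffer, DXGI_FORMAT_R16_UINT, 0);
    return S_OK;
}
```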
Calling this method using a buffer that is currently bound for writing (that is, bound to the stream-output pipeline stage) effectively binds NULL instead, because a buffer cannot be bound as both an input and an output at the same time.
The Debug Layer will generate a warning whenever a resource is prevented from being bound simultaneously as an input and an output, but this will not prevent invalid data from being used by the runtime.
The method will not hold a reference to the interfaces passed in. For that reason, applications should be careful not to release an interface currently in use by the device.
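As a minimal illustration of that rule, the sketch below unbinds the index buffer before releasing the application's last reference to it. The helper name ReleaseIndexBuffer is hypothetical, and passing DXGI_FORMAT_UNKNOWN assumes the format argument is not meaningful when the buffer pointer is NULL.

```cpp
#include <d3d10.h>

void ReleaseIndexBuffer(ID3D10Device *pDevice, ID3D10Buffer **ppIndexBuffer)
{
    if (*ppIndexBuffer)
    {
        // Unbind first: the device does not hold its own reference, so releasing
        // a buffer that is still bound would leave the pipeline pointing at a
        // destroyed resource. Passing NULL clears the index-buffer binding.
        pDevice->IASetIndexBuffer(NULL, DXGI_FORMAT_UNKNOWN, 0);

        (*ppIndexBuffer)->Release();
        *ppIndexBuffer = NULL;
    }
}
```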
Requirements
| Requirement | Value |
|---|---|
| Target Platform | Windows |
| Header | d3d10.h |
| Library | D3D10.lib |