DirectX Part 2: Displaying, Like, Anything

So you just read the previous tutorial, you initialized Direct3D on your video card, and now you have pointers to these three items — a Swap Chain, a D3D Device, and a D3D Device Context. Well, in this tutorial, we’re going to display a color on screen! And it’s going to require using all three items.

Yes, that’s a lot of work for drawing a color to the screen — but it’s the groundwork for vertices and shaders and all that sexy stuff in the future. So let’s review.

D3D Device: This is an interface to the physical resources on your video card (i.e. GPU memory and processors). You’ll only have this one D3D Device to work with. Use it to allocate memory for textures, shaders, and the like.

D3D Device Context: This is an interface to the commands you want to give to the video card. As an analogy, the D3D Device is the company that builds the guitar, and the D3D Device Context is the musician that uses the guitar to play songs. You’ll only have this one context to work with (although pros use multiple contexts, known as “deferred” contexts). Things like passing in triangles, textures, and shader commands to generate a pretty picture are done through the D3D Device Context.

Swap Chain: This is an interface to the image you display on the monitor. So, it’ll contain a 1920×1080 texture that you draw on, to display on your 1920×1080 monitor. It’s called a swap chain because you’re actually drawing into one of two 1920×1080 textures (you draw on one while the other is being displayed by the monitor), and then you swap those images when you’re done drawing. Since the swap chain directly controls what image is on the monitor, any time you want to see anything, you’ll use it.

Anyhow, let’s see how to draw a color to screen!

First, there’s some setup code you need to run once, after you initialize Direct3D:


// Grab the swap chain's back buffer texture...
ID3D11Texture2D* pBuffer = NULL;
m_pSwapChain->GetBuffer( 0, __uuidof( ID3D11Texture2D ), ( LPVOID* )&pBuffer );

// ...create a render target view over it, and release our raw handle...
m_pd3dDevice->CreateRenderTargetView( pBuffer, NULL, &m_pRenderTargetView );
pBuffer->Release();

// ...then tell the pipeline to render into that view.
m_pDeviceContext->OMSetRenderTargets( 1, &m_pRenderTargetView, NULL );

Again, let’s take it step by step.


ID3D11Texture2D* pBuffer = NULL;
m_pSwapChain->GetBuffer( 0, __uuidof( ID3D11Texture2D ), ( LPVOID* )&pBuffer );

This is simple enough: you’re making pBuffer point to the buffer (texture) at index 0 of the swap chain. You can’t grab the texture that’s currently displayed to screen, so this is the only texture you have to deal with.

That __uuidof( ID3D11Texture2D ) bit looks confusing, but it’s a fairly common pattern around here, so try to get comfortable with it! In order to future-proof the D3D APIs, rather than have GetBuffer(...) return an ID3D11Texture2D* directly (a signature that would become obsolete come D3D 12), GetBuffer() writes the result out through a void* pointer, and it guarantees that you can cast that pointer to whatever interface type you name in argument 2.
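That’s also why argument 2 exists at all: the very same call can hand you a different interface onto the very same buffer. For instance, here’s a sketch that asks for the back buffer as a DXGI surface instead of a D3D texture (IDXGISurface is a real DXGI interface; whether you actually need one is another story):

// Ask for the same back buffer, but as a DXGI surface this time.
IDXGISurface* pSurface = NULL;
m_pSwapChain->GetBuffer( 0, __uuidof( IDXGISurface ), ( LPVOID* )&pSurface );
// ...do DXGI-level things with it...
pSurface->Release();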


m_pd3dDevice->CreateRenderTargetView( pBuffer, NULL, &m_pRenderTargetView );
pBuffer->Release();

This code makes our D3D Device aware of the texture in our swap chain (which in turn will become the texture displayed on our monitor). Now, pBuffer and m_pRenderTargetView are two different interfaces to the same memory. They both modify the exact same image (which will eventually go to screen), but they expose different ways to modify it. Like an X-ray versus an infrared image of a human body: both offer different information about the same subject.

Calling Release() says that we don’t need to look at our frame texture through any of the methods exposed by pBuffer anymore. Our frame texture still exists in memory (the render target view we just created keeps it alive), but we can only view it through m_pRenderTargetView now. It’s a very good idea to Release() any handles you don’t need anymore.
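Since every D3D interface follows this same COM release pattern, lots of codebases define a tiny helper for it. This one is my own convenience function, not part of the D3D API:

// Release a COM interface if we hold one, then null the pointer
// so nothing can accidentally use the stale handle afterwards.
template< typename T >
void SafeRelease( T*& pInterface )
{
   if ( pInterface != NULL )
   {
      pInterface->Release();
      pInterface = NULL;
   }
}

With that in hand, the cleanup above is just SafeRelease( pBuffer );.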


m_pDeviceContext->OMSetRenderTargets( 1, &m_pRenderTargetView, NULL );

This says “Hey GPU! Render to this buffer”. Because we’re clever, we just made sure the buffer we’re rendering to is the one that gets displayed to screen — but it’s still not on screen yet! WE ARE SO CLOSE.

Fun fact: The “OM” stands for “Output Merger”, the final pipeline stage that merges everything the video card produced for a pixel (shader output, the render target’s existing contents, depth information) into the final output image.
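By the way, if you’d rather see all of today’s setup in one place, here’s the whole thing sketched as a single function, with the error checking real code would want (the function name, and the m_ pointers living as members of some renderer class, are my assumptions):

// One-time setup: wire the swap chain's back buffer up as our render target.
HRESULT InitRenderTarget()
{
   ID3D11Texture2D* pBuffer = NULL;
   HRESULT hr = m_pSwapChain->GetBuffer( 0, __uuidof( ID3D11Texture2D ), ( LPVOID* )&pBuffer );
   if ( FAILED( hr ) )
      return hr;

   // Create the render target view, then drop our raw handle;
   // the view holds its own reference and keeps the texture alive.
   hr = m_pd3dDevice->CreateRenderTargetView( pBuffer, NULL, &m_pRenderTargetView );
   pBuffer->Release();
   if ( FAILED( hr ) )
      return hr;

   // Point the pipeline at the view (no depth buffer yet, hence the NULL).
   m_pDeviceContext->OMSetRenderTargets( 1, &m_pRenderTargetView, NULL );
   return S_OK;
}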

AND NOW WE GET TO ACTUALLY DRAW. TO THE SCREEN.
In your update loop, or some loop that gets called every time you want to draw a new frame, include this code:


float ClearColor[4] = {
   (float)rand() / RAND_MAX, // red
   (float)rand() / RAND_MAX, // green
   (float)rand() / RAND_MAX, // blue
   1.0f                      // alpha
};
m_pDeviceContext->ClearRenderTargetView( m_pRenderTargetView, ClearColor );
m_pSwapChain->Present( 0, 0 );

Hey, simple!

Your ClearColor is a standard RGBA color: 0 is the minimum for each channel, 1 is the maximum. Since we pointed our DeviceContext at the swap chain’s back buffer, that clear command on m_pRenderTargetView paints our back buffer with ClearColor, and then we just tell the SwapChain to present it. (Present’s first argument is the sync interval: 0 presents immediately, 1 waits for vertical sync. The second argument is a set of presentation flags, and 0 is fine here.)
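In case you’re wondering where that per-frame code actually lives: one common shape is a bare-bones Win32 message loop that draws whenever there are no window messages to handle. A sketch (RenderFrame is my made-up name for a function wrapping the clear-and-present code above):

// The classic realtime loop: drain pending window messages, then draw a frame.
MSG msg = {};
while ( msg.message != WM_QUIT )
{
   if ( PeekMessage( &msg, NULL, 0, 0, PM_REMOVE ) )
   {
      // Let Windows handle window events first...
      TranslateMessage( &msg );
      DispatchMessage( &msg );
   }
   else
   {
      // ...and render whenever the queue is empty.
      RenderFrame();
   }
}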

THAT WAS EASY

GRAPHICS CARDS ARE SO FRIENDLY

DirectX Part 1: The “Initialize” Function

GUYS. I hate to break it to you, but normal programming on CPUs is for wimps. GPU programming is where you have to go to find fast cars / hot ladies. BUT THERE’S A PROBLEM: it’s hella hard to program for GPUs! Well. Until now, when I explain it all to you.

Since the dawn of computing, every line of code has run on your CPU, by default. Graphics cards only became a thing in the mid-80s, and even to this day, they aren’t really “standard” parts of a computer. What this means is that everything you want to run on a GPU has to be wrapped in APIs that very specifically say, “I want this to run on a GPU and it will be my fault if this computer has no GPU to run on”.

The two most common APIs to allow people to run code on graphics cards for the purpose of rendering pretty 3D scenes are DirectX and OpenGL. This is the first of many articles focusing on DirectX, although many of the concepts apply to OpenGL. Using the GPU to do non-rendering stuff, like cracking passwords, isn’t really DirectX’s strength and we aren’t gonna focus on it.

So, the entire scope of this article is the DirectX11 Initialize function. That’s a pretty small scope, but it’s a dense function, and it provides a great overview of what the API designers think you should care about re: your video card.

Anyhow, the Initialize function is called D3D11CreateDeviceAndSwapChain. More specifically, it’s:

HRESULT D3D11CreateDeviceAndSwapChain(
   _In_ IDXGIAdapter *pAdapter,
   _In_ D3D_DRIVER_TYPE DriverType,
   _In_ HMODULE Software,
   _In_ UINT Flags,
   _In_ const D3D_FEATURE_LEVEL *pFeatureLevels,
   _In_ UINT FeatureLevels,
   _In_ UINT SDKVersion,
   _In_ const DXGI_SWAP_CHAIN_DESC *pSwapChainDesc,
   _Out_ IDXGISwapChain **ppSwapChain,
   _Out_ ID3D11Device **ppDevice,
   _Out_ D3D_FEATURE_LEVEL *pFeatureLevel,
   _Out_ ID3D11DeviceContext **ppImmediateContext
);

( _In_ and _Out_ are SAL annotations, hints for the compiler and static analysis tools indicating whether a function parameter is input or output. _In_ parameters are only input (read from but not written to), and _Out_ parameters are only output (written to but not read from) ).
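Here’s a toy function annotated the same way, just to make the idea concrete (the annotations come from <sal.h>; the function itself is made up):

#include <sal.h>

// Analysis tools can now warn if a caller hands in an uninitialized
// pSource, or reads *pResult before this function has written it.
void DoubleIt( _In_ const int *pSource, _Out_ int *pResult )
{
   *pResult = *pSource * 2;
}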

ANYHOW WOW THAT’S A LOT OF PARAMETERS

The number of parameters is representative of DirectX’s overall API design, a design that assumes GPU programming is only for super-hardcore programmers. This is a self-fulfilling prophecy: the scariness of the DirectX API keeps away everyone who isn’t hardcore enough to handle reams of documentation. And it sucks, because GPUs are pretty mainstream now, and plenty of programmers could stand to add GPU skills to their arsenal. But that’s a topic for another time! Let’s cover these parameters one by one.

_In_ IDXGIAdapter *pAdapter: “DXGI” stands for “DirectX Graphics Infrastructure” (the leading “I” is the usual COM prefix for “interface”). Basically, a DXGI adapter is anything that can handle displaying graphics. This includes video cards, integrated graphics chips on CPUs, or a CPU software renderer: if it can output images to screen, it’s a DXGI adapter. My dual-nVidia GTX560 machine has 3 adapters: one for each of my GTX560 cards, and one for the Microsoft Basic Render Driver that Microsoft falls back on if there are no video cards available. The value you pass in here will be the video card that DirectX gives you access to; pass in nullptr for the default adapter (usually a pretty good guess).
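If you’re curious which adapters your own machine reports, you can walk them with DXGI before ever creating a device. A sketch (CreateDXGIFactory and EnumAdapters are real DXGI calls; link against dxgi.lib):

#include <dxgi.h>
#include <stdio.h>

// Print every DXGI adapter the system knows about.
void ListAdapters()
{
   IDXGIFactory* pFactory = NULL;
   if ( FAILED( CreateDXGIFactory( __uuidof( IDXGIFactory ), ( void** )&pFactory ) ) )
      return;

   IDXGIAdapter* pAdapter = NULL;
   for ( UINT i = 0; pFactory->EnumAdapters( i, &pAdapter ) != DXGI_ERROR_NOT_FOUND; ++i )
   {
      DXGI_ADAPTER_DESC desc;
      pAdapter->GetDesc( &desc );
      wprintf( L"Adapter %u: %s\n", i, desc.Description );
      pAdapter->Release();
   }
   pFactory->Release();
}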

_In_ D3D_DRIVER_TYPE DriverType: This is one of a pre-defined list of values, which lets you specify whether you want any commands passed through DirectX to go to the GPU hardware, or if you want to actually do a fake-out and emulate your commands in software. Chances are, if you’re using the graphics card, it’s because you want to make things go fast. So you want to use D3D_DRIVER_TYPE_HARDWARE. If you’re doing tricky enough stuff to warrant using another driver type, chances are, you’ll know about it.

_In_ HMODULE Software: This is only used if your D3D_DRIVER_TYPE above is “Software”, because this is a pointer to your software implementation of Direct3D that you want to use to debug stuff. Otherwise, it can be NULL. Again, if you’re doing tricky enough stuff that you need to pass a non-NULL value here, you’ll know about it.

_In_ UINT Flags: This is where you specify a bunch of low-level flags to modify behavior. If you’re only rendering from a single thread, or if you’re using the GPU for tasks that take lots of time to execute (i.e. cracking passwords instead of playing video games), or you want to debug stuff, there’s flags here that you may want to play with.

_In_ const D3D_FEATURE_LEVEL *pFeatureLevels: This is a pointer to the first element of an array of D3D_FEATURE_LEVEL items. “Feature Level” means “version of DirectX”, such as 9.1, 10.0, or 11.0. In general, you want the highest feature level you can get, and newer video cards offer support for newer versions of DirectX. You’ll probably pass in an array like D3D_FEATURE_LEVEL featureLevels[] = { D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1 }; which means “Try for DX11.0 if possible, but if that fails, give me 10.1”. Alternatively, you can just pass in nullptr, and it’ll set up the highest DirectX feature level your adapter supports.
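Spelled out in code, with one extra fallback thrown in for older cards (the 10.0 entry is my choice, not a requirement):

// Request the newest feature level first; DirectX tries the array in order.
D3D_FEATURE_LEVEL featureLevels[] = {
   D3D_FEATURE_LEVEL_11_0,   // try for DX11.0...
   D3D_FEATURE_LEVEL_10_1,   // ...else fall back to 10.1...
   D3D_FEATURE_LEVEL_10_0,   // ...else settle for 10.0
};
UINT numFeatureLevels = ARRAYSIZE( featureLevels );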

_In_ UINT FeatureLevels: The number of feature levels in the pFeatureLevels array above (numFeatureLevels in the sketch). lol at the old-school C in this API: instead of just passing a vector or something, you pass in a pointer to the array’s start and the array’s length. If you pass nullptr for pFeatureLevels, just set this to 0.

_In_ UINT SDKVersion: Just pass in D3D11_SDK_VERSION. That’s seriously what the documentation tells you to do. No choice, no explanation. Thanks, API designers.

_In_ const DXGI_SWAP_CHAIN_DESC *pSwapChainDesc: Okay, so what is a swap chain? Well, it’s a collection of frame buffers (a frame buffer being, say, the 1920×1080 RGB image that gets presented to your 1920×1080 monitor: the buffer containing the entire frame to display on-screen). Anything you want displayed gets drawn into one of the frame buffers in this collection; once you’re done drawing that frame, it gets displayed on-screen and you start drawing into the next frame buffer in the collection. This structure tells DirectX things like how many frame buffers to create, how big they should be, and how they’ll be used.
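For a taste, here’s a sketch of filling one out for a simple windowed app. The resolution, format, and refresh rate are just reasonable defaults of mine, and hWnd is assumed to be a window you’ve already created:

DXGI_SWAP_CHAIN_DESC swapChainDesc = {};                      // zero everything first
swapChainDesc.BufferCount = 1;                                // one back buffer
swapChainDesc.BufferDesc.Width = 1920;                        // frame buffer size
swapChainDesc.BufferDesc.Height = 1080;
swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM; // 8 bits per channel
swapChainDesc.BufferDesc.RefreshRate.Numerator = 60;          // 60 Hz refresh
swapChainDesc.BufferDesc.RefreshRate.Denominator = 1;
swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;  // we'll render into it
swapChainDesc.OutputWindow = hWnd;                            // the window to present into
swapChainDesc.SampleDesc.Count = 1;                           // no multisampling
swapChainDesc.Windowed = TRUE;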

_Out_ IDXGISwapChain **ppSwapChain: This is a pointer to the output swap chain generated by DirectX. Every time you want to display your drawn image onto the monitor, you’ll have to call pSwapChain->Present(...). So hold on to this!

_Out_ ID3D11Device **ppDevice: This is a representation of DirectX running on your graphics card. Things like memory allocation and status checks relating to the whole GPU are done through the D3DDevice. Hold on to this, too!

_Out_ D3D_FEATURE_LEVEL *pFeatureLevel: This just confirms the version of DirectX that your graphics card is able to run. Quit out if it’s lower than you want it to be.

_Out_ ID3D11DeviceContext **ppImmediateContext: This is the thing you actually care about — the “context” to the D3DDevice (as in, this is how you use the DirectX wrapper that is now lying on top of your video card). The “Immediate” refers to the fact that any command you send through here gets executed on the graphics card immediately. You use the device context to set shaders and draw 3D geometry, which is pretty much the meat of rendering.
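Putting it all together, with featureLevels and swapChainDesc from the sketches above, the call might look like this (still a sketch; the m_ pointers are assumed to be members of whatever class owns your renderer):

D3D_FEATURE_LEVEL featureLevelInUse;

HRESULT hr = D3D11CreateDeviceAndSwapChain(
   NULL,                        // pAdapter: just use the default adapter
   D3D_DRIVER_TYPE_HARDWARE,    // DriverType: real GPU, full speed
   NULL,                        // Software: unused with a hardware driver
   0,                           // Flags: nothing fancy
   featureLevels,               // pFeatureLevels: our wish list from above
   numFeatureLevels,            // FeatureLevels: its length
   D3D11_SDK_VERSION,           // SDKVersion: the one allowed value
   &swapChainDesc,              // pSwapChainDesc: our description from above
   &m_pSwapChain,               // out: the swap chain
   &m_pd3dDevice,               // out: the device
   &featureLevelInUse,          // out: the feature level we actually got
   &m_pDeviceContext );         // out: the immediate context

if ( FAILED( hr ) )
{
   // No Direct3D 11 for us today; fail gracefully.
}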

So you came out of this with your swap chain, your D3DDevice, and your D3DDeviceContext. Cool! Next time, we’ll look at how to start using these items to — god forbid — draw a purple box on screen.