directx-11

How to create 2D texture using DXGI format DXGI_FORMAT_R1_UNORM?

Submitted by 前提是你 on 2020-02-16 10:42:31
Question: I want to create a 1-bit-per-pixel monochrome 2D texture in DirectX 11 using the DXGI format DXGI_FORMAT_R1_UNORM. I have tried the following, but it produces these errors: D3D11 ERROR: ID3D11Device::CreateTexture2D: Device does not support the format R1_UNORM. [ STATE_CREATION ERROR #92: CREATETEXTURE2D_UNSUPPORTEDFORMAT] D3D11: BREAK enabled for the previous message, which was: [ ERROR STATE_CREATION #92: CREATETEXTURE2D_UNSUPPORTEDFORMAT ] I have tried to create a texture for …
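
A minimal sketch of the usual workaround, assuming an existing ID3D11Device* named device: query CheckFormatSupport first and fall back to DXGI_FORMAT_R8_UNORM (one byte per pixel) when the device rejects R1_UNORM, as the error above says this one does.

```cpp
#include <d3d11.h>

// Sketch: prefer R1_UNORM if the device supports it, otherwise fall back
// to the widely supported one-byte-per-pixel R8_UNORM.
HRESULT CreateMonoTexture(ID3D11Device* device, UINT width, UINT height,
                          ID3D11Texture2D** outTex)
{
    UINT support = 0;
    HRESULT hr = device->CheckFormatSupport(DXGI_FORMAT_R1_UNORM, &support);
    DXGI_FORMAT fmt =
        (SUCCEEDED(hr) && (support & D3D11_FORMAT_SUPPORT_TEXTURE2D))
            ? DXGI_FORMAT_R1_UNORM
            : DXGI_FORMAT_R8_UNORM;

    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = fmt;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    return device->CreateTexture2D(&desc, nullptr, outTex);
}
```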

Color conversion from DXGI_FORMAT_B8G8R8A8_UNORM to NV12 in GPU using DirectX11 pixel shaders

Submitted by 陌路散爱 on 2020-01-29 04:51:06
Question: I'm working on code to capture the desktop using Desktop Duplication and encode it to H.264 using the Intel hardware MFT. The encoder only accepts NV12 format as input. I have a DXGI_FORMAT_B8G8R8A8_UNORM-to-NV12 converter (https://github.com/NVIDIA/video-sdk-samples/blob/master/nvEncDXGIOutputDuplicationSample/Preproc.cpp) that works fine and is based on the DirectX VideoProcessor. The problem is that the VideoProcessor on certain Intel graphics hardware supports conversions only from DXGI …
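
Where the VideoProcessor path is unavailable, the pixel-shader route comes down to the RGB-to-YUV matrix plus NV12's plane layout. A CPU reference of the BT.601 integer approximation (a sketch; a conversion shader applies the same math per pixel):

```cpp
#include <cstdint>

struct YUV { uint8_t y, u, v; };

// BT.601 integer approximation; r, g, b are 8-bit channel values.
static YUV RgbToYuv601(int r, int g, int b)
{
    YUV out;
    out.y = static_cast<uint8_t>((( 66 * r + 129 * g +  25 * b + 128) >> 8) +  16);
    out.u = static_cast<uint8_t>(((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128);
    out.v = static_cast<uint8_t>(((112 * r -  94 * g -  18 * b + 128) >> 8) + 128);
    return out;
}
```

For NV12 output, Y is written per pixel into the full-resolution plane, while U and V are averaged over each 2x2 block and stored interleaved in the half-resolution plane.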

Check which version of DirectX is installed

Submitted by 断了今生、忘了曾经 on 2020-01-21 05:16:28
Question: As per the title, how can I check which version of DirectX a user has installed? Checking the FeatureLevel isn't enough, as my application can run on feature level 10.0 but requires that DirectX 11.1 be installed. Why this is not a duplicate of "How to code to get direct X version on my machine in C#?": the first answer to that question says "If Windows 7, DirectX = 11, if Windows Vista, DirectX = 10". This is wrong, as Vista supports both DirectX 10 and 11, and Windows 7 supports DirectX 11 and …
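
One common approach (a sketch, not necessarily this question's accepted answer) is to ask the runtime itself: the ID3D11Device1 interface is only available where the DirectX 11.1 runtime is installed, so a QueryInterface test answers the question independently of the feature level.

```cpp
#include <d3d11_1.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Returns true when the 11.1 runtime is present, regardless of the
// feature level the hardware supports.
bool IsDirectX11_1RuntimePresent()
{
    ComPtr<ID3D11Device> device;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_NULL, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &device, nullptr, nullptr);
    if (FAILED(hr)) return false;

    ComPtr<ID3D11Device1> device1;
    return SUCCEEDED(device.As(&device1));  // implemented only by the 11.1 runtime
}
```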

D3D11CreateDeviceAndSwapChain(…) was not declared in this scope

Submitted by 时光怂恿深爱的人放手 on 2020-01-17 14:42:49
Question: I am trying to make a simple engine using C++ and DirectX 11. I am using Geany, which makes things harder for me because linking the d3d11 library was difficult, but I have to use it for two reasons: I want to build without any IDE help, staying as close to the roots as possible, and because we use it at school. While following some tutorials and books I got stuck because my compiler doesn't recognize the function. How can I solve that? Update: I found out that D3D10CreateDeviceAndSwapChain(..) works, but I don't know what …
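
A "not declared in this scope" error typically means d3d11.h isn't included or the MinGW header set is too old to declare the function; the link step separately needs -ld3d11. A minimal sketch, assuming an existing window handle named hwnd:

```cpp
// Compile with MinGW/Geany roughly as:  g++ main.cpp -ld3d11
#include <windows.h>
#include <d3d11.h>   // declares D3D11CreateDeviceAndSwapChain

HRESULT InitD3D(HWND hwnd, ID3D11Device** device,
                ID3D11DeviceContext** context, IDXGISwapChain** swapChain)
{
    DXGI_SWAP_CHAIN_DESC scd = {};
    scd.BufferCount = 1;
    scd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    scd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    scd.OutputWindow = hwnd;
    scd.SampleDesc.Count = 1;
    scd.Windowed = TRUE;

    return D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0,                       // default feature levels
        D3D11_SDK_VERSION, &scd,
        swapChain, device, nullptr, context);
}
```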

DirectX11 Render CameraStream to Texture - Device::CreateShaderResourceView fails

Submitted by 眉间皱痕 on 2020-01-15 23:35:55
Question: [Prologue] I am going to render a video stream from IDS uEye cameras into a DirectX scene. The camera's color mode is currently set to IS_CM_RGBA8_PACKED. For now, I just capture a single picture from the cam, freeze it into a char* cameraBufferLeft_, and try to use this as a ShaderResourceView (which should replace a previously used .dds file). Previously, I had problems with the CreateTexture2D function as well, because the camera mode was set to IS_CM_RGB8_PACKED, which means there was no alpha …
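
A sketch of creating the texture directly from the raw RGBA8 camera buffer, assuming device, width, and height are available; a wrong SysMemPitch is a frequent cause of failures at this stage with packed camera buffers.

```cpp
#include <d3d11.h>

HRESULT CreateCameraSRV(ID3D11Device* device, const char* cameraBufferLeft_,
                        UINT width, UINT height,
                        ID3D11ShaderResourceView** outSrv)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;  // matches IS_CM_RGBA8_PACKED
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = cameraBufferLeft_;
    init.SysMemPitch = width * 4;  // 4 bytes per RGBA pixel

    ID3D11Texture2D* tex = nullptr;
    HRESULT hr = device->CreateTexture2D(&desc, &init, &tex);
    if (FAILED(hr)) return hr;

    hr = device->CreateShaderResourceView(tex, nullptr, outSrv);
    tex->Release();  // the SRV keeps its own reference
    return hr;
}
```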

How to get an IMFTransform that is d3d_aware (To encode input from Windows Duplication API to H264)?

Submitted by 会有一股神秘感。 on 2020-01-14 05:34:33
Question: The following code should give the inputInfo and outputInfo configuration to get an IMFTransform back, and this IMFTransform will be used to encode an ID3D11Texture2D (with format DXGI_FORMAT_B8G8R8A8_UNORM) to H264. I understand the format DXGI_FORMAT_B8G8R8A8_UNORM can be taken as MFVideoFormat_NV12 on an IMFTransform that is D3D_AWARE, but I'm having problems getting an IMFTransform that is D3D_AWARE. MFT_REGISTER_TYPE_INFO inputInfo = { MFMediaType_Video, MFVideoFormat_NV12 }; MFT_REGISTER_TYPE_INFO outputInfo …
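
A sketch of the usual enumeration path, assuming Media Foundation has been initialized with MFStartup: MFT_ENUM_FLAG_HARDWARE generally returns the hardware MFTs, and their D3D awareness can be confirmed through the MF_SA_D3D11_AWARE attribute.

```cpp
#include <mfapi.h>
#include <mfidl.h>
#include <mftransform.h>

// Returns the first hardware H.264 encoder MFT, or nullptr; hardware MFTs
// are normally the D3D-aware ones.
IMFTransform* FindHardwareH264Encoder()
{
    MFT_REGISTER_TYPE_INFO inputInfo  = { MFMediaType_Video, MFVideoFormat_NV12 };
    MFT_REGISTER_TYPE_INFO outputInfo = { MFMediaType_Video, MFVideoFormat_H264 };

    IMFActivate** activates = nullptr;
    UINT32 count = 0;
    HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER,
                           MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER,
                           &inputInfo, &outputInfo, &activates, &count);
    if (FAILED(hr) || count == 0) return nullptr;

    IMFTransform* mft = nullptr;
    activates[0]->ActivateObject(IID_PPV_ARGS(&mft));

    if (mft) {
        IMFAttributes* attrs = nullptr;
        if (SUCCEEDED(mft->GetAttributes(&attrs))) {
            // TRUE confirms the MFT accepts ID3D11Texture2D input samples.
            UINT32 d3dAware = MFGetAttributeUINT32(attrs, MF_SA_D3D11_AWARE, FALSE);
            (void)d3dAware;
            attrs->Release();
        }
    }
    for (UINT32 i = 0; i < count; ++i) activates[i]->Release();
    CoTaskMemFree(activates);
    return mft;
}
```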

Access violation and strange behavior of Visual Studio

Submitted by 若如初见. on 2020-01-14 03:41:12
Question: I'm writing a test application on DirectX 11, and I have two classes, "Box" and "camera". "Box" is a cube which is to be drawn on the screen, and this is "camera": class camera { public: const camera operator=(const camera& fv) const { return fv; } XMVECTOR eye; XMVECTOR at; XMVECTOR up; XMVECTOR right; XMVECTOR left; XMVECTOR down; float pitch; //x float roll; //z float yaw; //y XMMATRIX View; XMMATRIX Projection; camera(); camera(XMVECTOR eye, XMVECTOR at, XMVECTOR up, float movingOffset, …
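
One common cause of access violations with XMVECTOR/XMMATRIX class members (not necessarily this asker's, but worth checking) is alignment: these SIMD types require 16-byte alignment, which operator new does not guarantee on 32-bit builds. A sketch of the usual fix, storing the unaligned XMFLOAT variants and converting on use:

```cpp
#include <DirectXMath.h>
using namespace DirectX;

class camera {
public:
    XMFLOAT3 eye, at, up;   // plain storage, no alignment requirement
    XMFLOAT4X4 View;

    void UpdateView() {
        // Load into aligned SIMD registers only inside the function.
        XMMATRIX v = XMMatrixLookAtLH(XMLoadFloat3(&eye),
                                      XMLoadFloat3(&at),
                                      XMLoadFloat3(&up));
        XMStoreFloat4x4(&View, v);  // store back to the unaligned member
    }
};
```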

DirectX 11 - Compute shader: Writing to an output resource

Submitted by 十年热恋 on 2020-01-12 02:21:13
Question: I've just started using the compute shader stage in DirectX 11 and encountered some unwanted behaviour when writing to an output resource in the compute shader. I get only zeroes as output, which, to my understanding, means that out-of-bounds reads have been performed in the compute shader. (Out-of-bounds writes result in no-ops.) Creating the compute shader components. Input resources: first I create an ID3D11Buffer* for input data. This is passed as a resource when creating the SRV used …
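
A minimal sketch of an output setup that avoids the common out-of-bounds pitfalls, assuming device and context pointers: the UAV's NumElements must match the buffer, and the dispatch size must cover the data.

```cpp
#include <d3d11.h>

// Structured-buffer UAV for compute shader output.
HRESULT CreateOutputUAV(ID3D11Device* device, UINT count,
                        ID3D11Buffer** outBuf,
                        ID3D11UnorderedAccessView** outUav)
{
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth = count * sizeof(float);
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    desc.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    desc.StructureByteStride = sizeof(float);

    HRESULT hr = device->CreateBuffer(&desc, nullptr, outBuf);
    if (FAILED(hr)) return hr;

    D3D11_UNORDERED_ACCESS_VIEW_DESC uavDesc = {};
    uavDesc.Format = DXGI_FORMAT_UNKNOWN;           // required for structured buffers
    uavDesc.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
    uavDesc.Buffer.NumElements = count;             // a mismatch here makes accesses out of bounds
    return device->CreateUnorderedAccessView(*outBuf, &uavDesc, outUav);
}

// Usage: context->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
//        context->Dispatch(count / 64, 1, 1);  // pairs with [numthreads(64,1,1)] in HLSL
```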

DirectX11 Shader Compilation Issue

Submitted by 孤人 on 2020-01-07 09:21:13
Question: I'm working on a simple DirectX application to display a couple of triangles together as a tetrahedron, which keeps crashing at start. I checked with the VS2012 debugger: the error occurs at the stage where the shader is supposed to be compiled from a .fx file, so I assume it has something to do with the shader. I have no idea what I did wrong. Below is the code of the shader I'm using. Assistance required. struct Light { float3 pos; float4 ambient; float4 diffuse; }; cbuffer cbPerFrame { Light light; }; …
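
A sketch of a compile step that surfaces the real problem, using the D3DCompile API; the file name "Effects.fx" and the "VS"/"vs_4_0" entry point and target are assumptions. Crashes at this stage often come from passing a null bytecode blob along after a failed compile.

```cpp
#include <windows.h>
#include <d3dcompiler.h>   // link with d3dcompiler.lib

ID3DBlob* CompileShader()
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    HRESULT hr = D3DCompileFromFile(L"Effects.fx", nullptr, nullptr,
                                    "VS", "vs_4_0", 0, 0, &bytecode, &errors);
    if (FAILED(hr)) {
        // A missing file or an HLSL syntax error lands here; print the
        // compiler's message instead of crashing on a null pointer later.
        if (errors) {
            OutputDebugStringA(static_cast<const char*>(errors->GetBufferPointer()));
            errors->Release();
        }
        return nullptr;
    }
    return bytecode;
}
```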