rendertarget

QOpenGLWidget with custom framebuffer and multiple render targets

Submitted by 人走茶凉 on 2020-01-14 19:20:39
Question: Related to my other question, I'm trying to render a segmentation mask to enable object picking, but I am not able to achieve the desired result. Option 1 did not work at all: I was not able to retrieve the content of color attachment 1, or even check whether it existed at all (I created the attachment using only native OpenGL calls). Using this post, I was able to reproduce the green.png and red.png images by creating a custom framebuffer with a second color attachment which is then bound and drawn
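
A minimal sketch of the second-attachment setup and readback, assuming a custom FBO built with native GL calls (the names fbo, maskTex and the values w, h, x, y are placeholders, not the asker's code):

    // Custom FBO with two color attachments; attachment 1 holds the
    // segmentation mask and is read back with glReadPixels.
    GLuint fbo = 0, colorTex = 0, maskTex = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    auto makeAttachment = [&](GLuint& tex, GLenum attachment) {
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
        glFramebufferTexture2D(GL_FRAMEBUFFER, attachment, GL_TEXTURE_2D, tex, 0);
    };
    makeAttachment(colorTex, GL_COLOR_ATTACHMENT0);
    makeAttachment(maskTex,  GL_COLOR_ATTACHMENT1);

    // Route gl_FragData[0] and gl_FragData[1] to the two attachments.
    const GLenum bufs[2] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
    glDrawBuffers(2, bufs);

    // ... render the scene ...

    // Read one mask pixel back, e.g. under the mouse cursor:
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
    glReadBuffer(GL_COLOR_ATTACHMENT1);
    unsigned char pixel[4];
    glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);

The glReadBuffer call is the piece that selects color attachment 1; without it, glReadPixels reads attachment 0 and the mask appears empty.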

How to get pixels from a window using Direct2D

Submitted by 人走茶凉 on 2019-12-24 16:26:10
Question: My problem is getting the pixels in a window; I can't find a way to do this. I'm using standard Windows functions and Direct2D (not DirectDraw), with the standard initialization of a new window:

    WNDCLASS wc;
    wc.style = CS_OWNDC;
    wc.lpfnWndProc = WndProc;
    wc.cbClsExtra = 0;
    wc.cbWndExtra = 0;
    wc.hInstance = hInstance;
    wc.hIcon = LoadIcon(NULL, IDI_APPLICATION);
    wc.hCursor = LoadCursor(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(6);
    wc.lpszMenuName = 0;
    wc.lpszClassName = L"WINDOW";
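
Direct2D's HWND render target has no direct pixel-readback call, so one common workaround is a GDI capture of the window into a 32-bit DIB. This is a hedged sketch of that approach, not the asker's code; CaptureWindowPixels is a made-up helper name:

    // Copy a window's client area into a 32-bit top-down DIB so the
    // pixels can be inspected on the CPU (BGRA byte order).
    #include <windows.h>
    #include <cstring>
    #include <vector>

    std::vector<BYTE> CaptureWindowPixels(HWND hwnd, int width, int height)
    {
        HDC windowDC = GetDC(hwnd);
        HDC memDC = CreateCompatibleDC(windowDC);

        BITMAPINFO bmi = {};
        bmi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth = width;
        bmi.bmiHeader.biHeight = -height;   // negative height = top-down rows
        bmi.bmiHeader.biPlanes = 1;
        bmi.bmiHeader.biBitCount = 32;
        bmi.bmiHeader.biCompression = BI_RGB;

        void* bits = nullptr;
        HBITMAP dib = CreateDIBSection(memDC, &bmi, DIB_RGB_COLORS, &bits, nullptr, 0);
        HGDIOBJ old = SelectObject(memDC, dib);

        // Blit the client area; the pixels land in 'bits'.
        BitBlt(memDC, 0, 0, width, height, windowDC, 0, 0, SRCCOPY);

        std::vector<BYTE> pixels(static_cast<size_t>(width) * height * 4);
        std::memcpy(pixels.data(), bits, pixels.size());

        SelectObject(memDC, old);
        DeleteObject(dib);
        DeleteDC(memDC);
        ReleaseDC(hwnd, windowDC);
        return pixels;
    }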

Add render target to default framebuffer of QOpenGLWidget

Submitted by 我的未来我决定 on 2019-12-22 12:52:46
Question: I'd like to add a second render target to the default framebuffer of a QOpenGLWidget. The reason is that I'd like to implement object picking and check whether the user hit an object by rendering a segmentation mask into gl_FragData[1]. Unfortunately, you can only retrieve the GLuint handle from the widget; there is no constructor of QOpenGLFramebufferObject that takes in the handle, and there is no other option to retrieve the framebuffer. Is there any possibility to attach another
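
Since the widget does expose the raw handle, one direction worth sketching (an untested assumption on my part, not a confirmed answer) is to bind that handle with plain GL calls and attach a second color texture to it, e.g. inside paintGL():

    // Attach a second color texture to the widget's default FBO via its
    // raw GLuint handle; maskTex is a placeholder name.
    GLuint fbo = defaultFramebufferObject();   // QOpenGLWidget member function
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    GLuint maskTex = 0;
    glGenTextures(1, &maskTex);
    glBindTexture(GL_TEXTURE_2D, maskTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width(), height(), 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                           GL_TEXTURE_2D, maskTex, 0);

    // Make gl_FragData[1] go to the new attachment.
    const GLenum bufs[2] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
    glDrawBuffers(2, bufs);

One caveat: if the widget's default framebuffer is multisampled, a plain non-multisample texture cannot be mixed in as a second attachment, so this sketch assumes multisampling is disabled.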

Rendering to a full 3D Render Target in one pass

Submitted by 寵の児 on 2019-12-21 02:26:26
Question: Using DirectX 11, I created a 3D volume texture that can be bound as a render target:

    D3D11_TEXTURE3D_DESC texDesc3d;
    // ...
    texDesc3d.Usage = D3D11_USAGE_DEFAULT;
    texDesc3d.BindFlags = D3D11_BIND_RENDER_TARGET;

    // Create volume texture and views
    m_dxDevice->CreateTexture3D(&texDesc3d, nullptr, &m_tex3d);
    m_dxDevice->CreateRenderTargetView(m_tex3d, nullptr, &m_tex3dRTView);

I would now like to update the whole render target and fill it with procedural data generated in a pixel shader, similar
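
One pattern that fits this (a sketch under assumptions, not the asker's code; m_dxContext and depth are placeholder names) is a render-target view that spans every W slice, combined with one instanced draw whose geometry shader routes each triangle to a slice through SV_RenderTargetArrayIndex:

    // RTV covering all depth slices of the volume, so a single instanced
    // draw can write every slice in one pass.
    D3D11_RENDER_TARGET_VIEW_DESC rtvDesc = {};
    rtvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;     // assumed texture format
    rtvDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE3D;
    rtvDesc.Texture3D.MipSlice = 0;
    rtvDesc.Texture3D.FirstWSlice = 0;
    rtvDesc.Texture3D.WSize = depth;                 // number of slices

    m_dxDevice->CreateRenderTargetView(m_tex3d, &rtvDesc, &m_tex3dRTView);

    // One fullscreen triangle per slice; the geometry shader copies the
    // instance id into SV_RenderTargetArrayIndex to pick the slice.
    m_dxContext->OMSetRenderTargets(1, &m_tex3dRTView, nullptr);
    m_dxContext->DrawInstanced(3, depth, 0, 0);

The pixel shader can then derive the voxel coordinate from SV_Position plus the slice index passed down from the geometry shader, and output the procedural value per voxel.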

How can I create a random number of D2D shapes (rectangles and ellipses) and refer to them as an array while drawing?

Submitted by 余生颓废 on 2019-12-13 04:25:47
Question: Let me elaborate. I define a D2D rectangle like so:

    D2D1_RECT_F rect1 = D2D1::RectF(5, 0, 150, 150);

and an ellipse as:

    D2D1_ELLIPSE ellipse1 = D2D1::Ellipse(D2D1::Point2F(75.f, 75.f), 75.f, 75.f);

To draw these shapes, I first transform them and pass them to the render target:

    m_pRenderTarget->SetTransform(D2D1::Matrix3x2F::Translation(D2D1::SizeF(200, 50)));
    m_pRenderTarget->FillRectangle(&rect1, m_pLinearGradientBrush);

I'd like a way to create a random number of rectangles and ellipses,
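
A hedged sketch of one way to do this: keep a single std::vector of tagged shapes, fill it with a random count, and branch on the tag while drawing. Shape, MakeRandomShapes and m_pBrush are made-up names; only m_pRenderTarget comes from the question:

    // A random mix of rectangles and ellipses stored in one vector.
    #include <d2d1.h>
    #include <random>
    #include <vector>

    struct Shape {
        enum Kind { Rect, Ellipse } kind;
        D2D1_RECT_F  rect;      // valid when kind == Rect
        D2D1_ELLIPSE ellipse;   // valid when kind == Ellipse
    };

    std::vector<Shape> MakeRandomShapes()
    {
        std::mt19937 rng{ std::random_device{}() };
        std::uniform_int_distribution<int>    count(5, 20);
        std::uniform_real_distribution<float> pos(0.f, 400.f);

        std::vector<Shape> shapes;
        for (int i = 0, n = count(rng); i < n; ++i) {
            Shape s{};
            const float x = pos(rng), y = pos(rng);
            if (i % 2 == 0) {
                s.kind = Shape::Rect;
                s.rect = D2D1::RectF(x, y, x + 100.f, y + 100.f);
            } else {
                s.kind = Shape::Ellipse;
                s.ellipse = D2D1::Ellipse(D2D1::Point2F(x, y), 50.f, 50.f);
            }
            shapes.push_back(s);
        }
        return shapes;
    }

    // In the render loop:
    // for (const Shape& s : shapes) {
    //     if (s.kind == Shape::Rect)
    //         m_pRenderTarget->FillRectangle(&s.rect, m_pBrush);
    //     else
    //         m_pRenderTarget->FillEllipse(&s.ellipse, m_pBrush);
    // }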

Three.js Updating DataTextures with a renderTarget's image data

Submitted by 久未见 on 2019-12-11 23:49:46
Question: I am working on a motion detection program in Three.js which uses the difference between the current and previous frame. For now, before the subtraction, the current and the previous frame are each blurred using a Three.EffectComposer. The main problem is: instead of having to blur the previous frame again, I want to use the previously blurred "current" frame as the texture in the subtraction process. The closest I have come is using the function below to update the image

Rendering a line into a 1x1 RenderTarget2D does not change the target's pixel color

Submitted by 人盡茶涼 on 2019-12-11 14:58:53
Question: What I am trying to achieve is the following: my pass will return a huge array with several unique numbers repeating over and over, which I need to retrieve and process on the CPU. I tried rendering into a 2000x1 texture and then sampling it on the CPU with RenderTarget2d.GetData<>() and a foreach loop. It was awfully slow :). So I sidestepped my problem; the idea now is to render to a 1x1 texture multiple times. In between passes I will extend a parameter array in my shader to include the numbers
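
The question is about XNA, but for comparison here is the same small-target readback pattern sketched in Direct3D 11 (my own illustration, not the asker's code; device, context and renderTargetTex are placeholders): copy the 1x1 render target into a CPU-readable staging texture and map it. The Map call is the CPU/GPU synchronization point that makes any per-frame readback slow, regardless of API:

    // Read a 1x1 render target back on the CPU via a staging texture.
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = 1;
    desc.Height = 1;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;   // assumed target format
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_STAGING;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;

    ID3D11Texture2D* staging = nullptr;
    device->CreateTexture2D(&desc, nullptr, &staging);
    context->CopyResource(staging, renderTargetTex);   // GPU-side copy

    D3D11_MAPPED_SUBRESOURCE mapped = {};
    context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped);  // stalls until ready
    const uint32_t pixel = *static_cast<const uint32_t*>(mapped.pData);
    context->Unmap(staging, 0);
    staging->Release();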

THREE.js Render target texture will not draw in a different scene

Submitted by 你说的曾经没有我的故事 on 2019-12-08 05:01:28
Question: If any THREE.js pros can understand why I can't get the WebGLRenderTarget to be used as a material for a plane in another scene, I'd be pretty happy. How it works right now: I create a scene with a perspective camera which renders a simple plane; this happens in the Application object. I also have a WaveMap object that uses another scene and an orthographic camera and, using a fragment shader, draws the cos(x) * sin(y) function on another quad that takes up the entire screen. I render this to a texture and then I create a material that uses this texture. I then pass that Material to be used