DirectX 11: models with even indices not rendered

I have a problem with DirectX 11 rendering: if I try to render more than one model, I only see the models with an odd index. All models rendered with an even index are not visible.
My code is based on the RasterTek tutorials:
m_dx->BeginScene(0.0f, 0.0f, 0.0f, 1.0f);
{
    m_camera->Render();

    XMMATRIX view;
    m_camera->GetViewMatrix(view);
    XMMATRIX world;
    m_dx->GetWorldMatrix(world);
    XMMATRIX projection;
    m_dx->GetProjectionMatrix(projection);
    XMMATRIX ortho;
    m_dx->GetOrthoMatrix(ortho);

    world = XMMatrixTranslation(-2, 0, -4);
    m_model->Render(m_dx->GetDeviceContext());
    m_texture_shader->Render(m_dx->GetDeviceContext(), m_model->GetIndicesCount(), world, view, projection,
                             m_model->GetTexture());

    world = XMMatrixTranslation(2, 0, -2);
    m_model->Render(m_dx->GetDeviceContext());
    m_texture_shader->Render(m_dx->GetDeviceContext(), m_model->GetIndicesCount(), world, view, projection,
                             m_model->GetTexture());

    world = XMMatrixTranslation(0, 0, -3);
    m_model->Render(m_dx->GetDeviceContext());
    m_texture_shader->Render(m_dx->GetDeviceContext(), m_model->GetIndicesCount(), world, view, projection,
                             m_model->GetTexture());
}
m_dx->EndScene();
Model render method:
UINT stride, offset;
stride = sizeof(VertexPosTextureNormal);
offset = 0;
device_context->IASetVertexBuffers(0, 1, &m_vertex_buffer, &stride, &offset);
device_context->IASetIndexBuffer(m_index_buffer, DXGI_FORMAT_R32_UINT, 0);
device_context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
Shader render method:
world = XMMatrixTranspose(world);
view = XMMatrixTranspose(view);
projection = XMMatrixTranspose(projection);
D3D11_MAPPED_SUBRESOURCE mapped_subres;
RETURN_FALSE_IF_FAILED(context->Map(m_matrix_buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped_subres));
MatrixBuffer* data = (MatrixBuffer*)mapped_subres.pData;
data->world = world;
data->view = view;
data->projection = projection;
context->Unmap(m_matrix_buffer, 0);
context->VSSetConstantBuffers(0, 1, &m_matrix_buffer);
context->PSSetShaderResources(0, 1, &texture);
// render
context->IASetInputLayout(m_layout);
context->VSSetShader(m_vertex_shader, NULL, 0);
context->PSSetShader(m_pixel_shader, NULL, 0);
context->PSSetSamplers(0, 1, &m_sampler_state);
context->DrawIndexed(indices, 0, 0);
What can be the reason for this?
Thank you.

This code -
world = XMMatrixTranspose(world);
view = XMMatrixTranspose(view);
projection = XMMatrixTranspose(projection);
is transposing the same matrices each time you call it, so they only hold the correct values on alternate calls. The world matrix is reset before each draw in the calling code, but the view and projection matrices are fetched once and then transposed in place on every call, so they are wrong on alternate calls.
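A minimal sketch of one way to fix it, assuming a RasterTek-style setup where the matrices reach the shader class by reference (the function and struct names here are placeholders, not the asker's actual code): transpose local copies so the caller's matrices are never modified.

#include <d3d11.h>
#include <DirectXMath.h>
using namespace DirectX;

struct MatrixBuffer
{
    XMMATRIX world;
    XMMATRIX view;
    XMMATRIX projection;
};

bool SetShaderMatrices(ID3D11DeviceContext* context, ID3D11Buffer* matrixBuffer,
                       const XMMATRIX& world, const XMMATRIX& view, const XMMATRIX& projection)
{
    // Transpose into locals: HLSL expects column-major by default, but the
    // caller's matrices must stay untouched so the next call starts clean.
    const XMMATRIX w = XMMatrixTranspose(world);
    const XMMATRIX v = XMMatrixTranspose(view);
    const XMMATRIX p = XMMatrixTranspose(projection);

    D3D11_MAPPED_SUBRESOURCE mapped;
    if (FAILED(context->Map(matrixBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
        return false;

    MatrixBuffer* data = static_cast<MatrixBuffer*>(mapped.pData);
    data->world = w;
    data->view = v;
    data->projection = p;
    context->Unmap(matrixBuffer, 0);

    context->VSSetConstantBuffers(0, 1, &matrixBuffer);
    return true;
}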

Related

Unable to render square in directx9

// this is the function used to render a single frame
void render_frame(void)
{
    init_graphics();
    d3ddev->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(255, 0, 0), 1.0f, 0);
    d3ddev->BeginScene();
    // select which vertex format we are using
    d3ddev->SetFVF(CUSTOMFVF);
    // select the vertex buffer to display
    d3ddev->SetStreamSource(0, v_buffer, 0, sizeof(CUSTOMVERTEX));
    // copy the vertex buffer to the back buffer
    d3ddev->DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 2);
    // d3ddev->DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 1);
    d3ddev->EndScene();
    d3ddev->Present(NULL, NULL, NULL, NULL);
}

// this is the function that puts the 3D models into video RAM
void init_graphics(void)
{
    // create the vertices using the CUSTOMVERTEX struct
    CUSTOMVERTEX vertices[] =
    {
        { 100.f, 0.f, 0.f, D3DCOLOR_XRGB(0, 0, 255), },
        { 300.f, 0.f, 0.f, D3DCOLOR_XRGB(0, 0, 255), },
        { 300.f, 80.f, 0.f, D3DCOLOR_XRGB(0, 0, 255), },
        { 100.f, 80.f, 0.f, D3DCOLOR_XRGB(0, 0, 255), },
    };
    // create a vertex buffer interface called v_buffer
    d3ddev->CreateVertexBuffer(6 * sizeof(CUSTOMVERTEX),
                               0,
                               CUSTOMFVF,
                               D3DPOOL_MANAGED,
                               &v_buffer,
                               NULL);
    VOID* pVoid; // a void pointer
    // lock v_buffer and load the vertices into it
    v_buffer->Lock(0, 0, (void**)&pVoid, 0);
    memcpy(pVoid, vertices, sizeof(vertices));
    v_buffer->Unlock();
}
I can't render a square for some reason. I've searched for an hour but can't find the answer.
https://i.imgur.com/KCKZSrJ.jpg
Does anybody know how to render it? I'm using DirectX 9.
I tried using DrawIndexedPrimitive, but it has the same result.
There are likely a few things going on here:
You do not set a vertex or pixel shader, so you are using the legacy fixed-function render pipeline. This pipeline requires that you set the view/projection matrices with SetTransform. Since you haven't done that, the vertex positions you provide in 'screen space' don't mean what you think they mean. See The Direct3D Transformation Pipeline.
You are not setting the backface culling mode via SetRenderState, so it defaults to D3DCULL_CCW (i.e. cull counter-clockwise winding triangles). As such, your vertex positions result in one of the triangles being rejected. You may want to call SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE); while getting started.
You are using TRIANGLESTRIP with only 4 points. You may find it easier to get it correct initially by using TRIANGLELIST and 6 points, as in the sketch below.
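For example, a minimal sketch of that last suggestion (my code, not the asker's; it assumes you switch CUSTOMVERTEX/CUSTOMFVF to pre-transformed D3DFVF_XYZRHW vertices so the screen-space coordinates are used as-is and SetTransform is not needed):

// Pre-transformed vertices: x/y are in pixels, rhw is typically 1.0f.
struct CUSTOMVERTEX { float x, y, z, rhw; DWORD color; };
#define CUSTOMFVF (D3DFVF_XYZRHW | D3DFVF_DIFFUSE)

// Six vertices forming two triangles (the vertex buffer must be created
// and filled with these, as in init_graphics above).
CUSTOMVERTEX vertices[] =
{
    // first triangle
    { 100.f,  0.f, 0.5f, 1.f, D3DCOLOR_XRGB(0, 0, 255) },
    { 300.f,  0.f, 0.5f, 1.f, D3DCOLOR_XRGB(0, 0, 255) },
    { 300.f, 80.f, 0.5f, 1.f, D3DCOLOR_XRGB(0, 0, 255) },
    // second triangle
    { 100.f,  0.f, 0.5f, 1.f, D3DCOLOR_XRGB(0, 0, 255) },
    { 300.f, 80.f, 0.5f, 1.f, D3DCOLOR_XRGB(0, 0, 255) },
    { 100.f, 80.f, 0.5f, 1.f, D3DCOLOR_XRGB(0, 0, 255) },
};

// While getting started, disable culling so winding order can't hide a
// triangle, then draw the two triangles as a list:
d3ddev->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE);
d3ddev->SetFVF(CUSTOMFVF);
d3ddev->SetStreamSource(0, v_buffer, 0, sizeof(CUSTOMVERTEX));
d3ddev->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 2);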

Direct2D draw on D3D11 texture does not blend

I have a Direct2D setup that interops with Direct3D.
Direct2D renders to a Direct3D texture just fine. I'm then trying to render that texture on top of my Direct3D scene; the texture itself shows up, but it covers the rest of the scene in black instead of blending.
D2D1_BITMAP_PROPERTIES1 properties = D2D1::BitmapProperties1(
    D2D1_BITMAP_OPTIONS_TARGET | D2D1_BITMAP_OPTIONS_CANNOT_DRAW,
    D2D1::PixelFormat(
        DXGI_FORMAT_B8G8R8A8_UNORM,
        D2D1_ALPHA_MODE_PREMULTIPLIED
    )
);
pd2dDeviceCtx->CreateBitmapFromDxgiSurface(surface.Get(), &properties, &p2dBitmap);
pd2dDeviceCtx->SetTarget(p2dBitmap.Get());
// BlendState
D3D11_BLEND_DESC blendStateDescription = {};
blendStateDescription.RenderTarget[0].BlendEnable = TRUE;
blendStateDescription.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA;
blendStateDescription.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
blendStateDescription.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD;
blendStateDescription.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
blendStateDescription.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO;
blendStateDescription.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD;
blendStateDescription.RenderTarget[0].RenderTargetWriteMask = 0x0f;
pDevice->CreateBlendState(&blendStateDescription, &blendState);
pContext->OMSetBlendState(blendState.Get(), NULL, sampleMask);
//D3D draws....
pd3dContext->DrawIndexed(6, 0, 0);
//D2D draws
pd2dDeviceCtx->BeginDraw();
pd2dDeviceCtx->Clear(D2D1::ColorF(0, 0, 0, 0));
// .. more D2D draws
pd2dDeviceCtx->EndDraw();
// ... Draw texture that d2d used to direct3d
pContext->VSSetShader(vertextShader.Get(), nullptr, 0);
pContext->PSSetShader(pixelShader.Get(), nullptr, 0);
pContext->PSSetShaderResources(0, 1, d2dTextShaderResource.GetAddressOf());
pContext->PSSetSamplers(0, 1, samplers.GetAddressOf());
pContext->DrawIndexed(6, 0, 0);
My texture renders correctly, but where it should be transparent it's all black, hiding what used to be there.
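For reference (my observation, not an answer from the thread): the bitmap is created with D2D1_ALPHA_MODE_PREMULTIPLIED, and a blend state matched to premultiplied-alpha content uses D3D11_BLEND_ONE for SrcBlend rather than D3D11_BLEND_SRC_ALPHA, since the color channels have already been multiplied by alpha. The blend state also has to still be bound when the final textured quad is drawn:

// Sketch: blend state for premultiplied-alpha content (what Direct2D produces).
D3D11_BLEND_DESC desc = {};
desc.RenderTarget[0].BlendEnable = TRUE;
desc.RenderTarget[0].SrcBlend = D3D11_BLEND_ONE;            // color already multiplied by alpha
desc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
pDevice->CreateBlendState(&desc, &blendState);

// Re-bind immediately before the textured-quad draw, in case intervening
// calls have changed the output-merger state.
pContext->OMSetBlendState(blendState.Get(), nullptr, 0xffffffff);
pContext->DrawIndexed(6, 0, 0);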

DirectX 3D: the farther object covers the nearer object

I'm new to DirectX in C#, and there is a question that has confused me a lot. Basically, I want to render two cubes on the screen, one near the camera and the other far from it. What I expected is that the nearer one is always in front of the farther one, but in fact it depends on the rendering sequence: the last one rendered is always in front of the other. I've tried to clear the z-buffer, but that does not work at all, so I'm wondering if there is something I'm doing wrong?
Here is my code snippet:
private void Form1_Load(object sender, EventArgs e)
{
    PresentParameters presentParams = new PresentParameters();
    presentParams.Windowed = true;
    presentParams.SwapEffect = SwapEffect.Discard;
    presentParams.EnableAutoDepthStencil = true;
    presentParams.AutoDepthStencilFormat = DepthFormat.D16;
    device = new Device(0, DeviceType.Hardware, this, CreateFlags.MixedVertexProcessing, presentParams);
    device.VertexFormat = CustomVertex.PositionColored.Format;
    device.RenderState.CullMode = Cull.CounterClockwise;
    device.RenderState.Lighting = false;
    Matrix projection = Matrix.PerspectiveFovLH((float)Math.PI / 4, this.Width / this.Height, 0f, 10000.0f);
    device.Transform.Projection = projection;
}

protected override void OnPaint(PaintEventArgs e)
{
    Cube a = new Cube(new Vector3(0, 0, 0), 5);
    Cube b = new Cube(new Vector3(0, 0, 15), 5);
    device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, Color.DarkGray, 1, 0);
    device.BeginScene();
    Matrix viewMatrix = Matrix.LookAtLH(cameraPosition, targetPosition, up);
    device.Transform.View = viewMatrix;
    device.DrawIndexedUserPrimitives(PrimitiveType.TriangleList, 0, 8, 12, a.IndexData, false, a.GetVertices());
    device.DrawIndexedUserPrimitives(PrimitiveType.TriangleList, 0, 8, 12, b.IndexData, false, b.GetVertices());
    device.EndScene();
    device.Present();
}
Alright, I finally fixed the problem by changing
Matrix projection = Matrix.PerspectiveFovLH((float)Math.PI / 4, this.Width / this.Height, 0f, 10000.0f);
to
Matrix projection = Matrix.PerspectiveFovLH((float)Math.PI / 4, this.Width / this.Height, 1f, 10000.0f);
But I don't know the reason why it happens. Does anyone know?
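A note on why (my reasoning from the documented PerspectiveFovLH matrix, not from the thread): the projection maps view-space depth z to clip space as

    z_clip = z * zf / (zf - zn) - zn * zf / (zf - zn)
    w_clip = z
    depth  = z_clip / w_clip

With the near plane zn = 0 this degenerates to z_clip = z and w_clip = z, so depth = z_clip / w_clip = 1.0 for every vertex: the depth buffer stores the same value for everything, every fragment ties at the cleared depth of 1.0, and under the default less-equal test the last cube drawn always wins. With zn = 1, depth varies with distance again and the z-buffer works as expected. (Incidentally, this.Width / this.Height is integer division in C#, so the aspect ratio is truncated; cast one operand to float.)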

Why does my triangle refuse to rotate?

I have a triangle created in DirectX 11. I now want to play around with viewport and world matrices to help my understanding of them, so I'd like to simply rotate the triangle around the Z axis. My code for attempting to do that is below.
void Render(void)
{
    if (d3dContext_ == 0)
        return;

    XMMATRIX view = XMMatrixIdentity();
    XMMATRIX projection = XMMatrixOrthographicOffCenterLH(0.0f, 800.0f, 0.0f, 600.0f, 0.1f, 100.0f);
    XMMATRIX vpMatrix_ = XMMatrixMultiply(view, projection);

    XMMATRIX translation = XMMatrixTranslation(0.0f, 0.0f, 0.0f);
    XMMATRIX rotationZ = XMMatrixRotationZ(30.0f);
    XMMATRIX TriangleWorld = translation * rotationZ;
    XMMATRIX mvp = TriangleWorld * vpMatrix_;
    mvp = XMMatrixTranspose(mvp);

    float clearColor[4] = { 0.0f, 0.0f, 0.25f, 1.0f };
    d3dContext_->ClearRenderTargetView(backBufferTarget_, clearColor);

    unsigned int stride = sizeof(VertexPos);
    unsigned int offset = 0;
    d3dContext_->IASetInputLayout(inputLayout_);
    d3dContext_->IASetVertexBuffers(0, 1, &vertexBuffer_, &stride, &offset);
    d3dContext_->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    d3dContext_->VSSetShader(solidColorVS_, 0, 0);
    d3dContext_->PSSetShader(solidColorPS_, 0, 0);
    d3dContext_->UpdateSubresource(mvpCB_, 0, 0, &mvp, 0, 0);
    d3dContext_->VSSetConstantBuffers(0, 1, &mvpCB_);
    d3dContext_->Draw(3, 0);

    swapChain_->Present(0, 0);
}
It just displays the standard triangle; it's as if it takes no notice of the mvp.
My desired effect is the rotation as controlled by XMMATRIX rotationZ = XMMatrixRotationZ(30);.
Thanks
XMMatrixRotationZ takes an angle in radians as its parameter, not degrees (see the MSDN description).
To convert degrees to radians, multiply by M_PI / 180.0f:
XMMATRIX rotationZ = XMMatrixRotationZ(30 * M_PI / 180.0);
As far as I know from OpenGL, for an animated rotation you must increase the XMMatrixRotationZ angle a little each tick, because otherwise you only draw the triangle once at that specific angle.
So (if you haven't already) call your render function in a loop and increase the angle value each iteration, as in the sketch below.
Hope I could help.
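A minimal sketch of that idea, reusing the question's Render function and member variables (the static angle accumulator and the 0.5-degree step per frame are my assumptions):

#include <DirectXMath.h>
using namespace DirectX;

void Render(void)
{
    if (d3dContext_ == 0)
        return;

    // Advance the angle a little every frame so the rotation animates.
    static float angleDegrees = 0.0f;
    angleDegrees += 0.5f;

    XMMATRIX view = XMMatrixIdentity();
    XMMATRIX projection = XMMatrixOrthographicOffCenterLH(0.0f, 800.0f, 0.0f, 600.0f, 0.1f, 100.0f);

    // XMConvertToRadians performs the degrees-to-radians conversion from above.
    XMMATRIX rotationZ = XMMatrixRotationZ(XMConvertToRadians(angleDegrees));
    XMMATRIX mvp = XMMatrixTranspose(rotationZ * view * projection);

    d3dContext_->UpdateSubresource(mvpCB_, 0, 0, &mvp, 0, 0);
    // ...bind the layout, buffers, and shaders and call Draw(3, 0) exactly as in the question...
    swapChain_->Present(0, 0);
}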

GLKBaseEffect not loading texture (texture appears black on object)

I'm using GLKit in an OpenGL project. Everything is based on GLKView and GLKBaseEffect (no custom shaders). In my project I have several views that use GLKViews for showing 3D objects, and occasionally several of those views can be "open" at once (i.e. are in the modal view stack).
While everything has worked great until now, in a new view I was creating I needed a rectangle with a texture, to simulate a measuring tape for the 3D world of my app. For some unknown reason, in that view only, the texture isn't loaded correctly into the OpenGL context: GLKTextureLoader loads the texture fine, but when drawing, the rectangle is black, and looking at the OpenGL frame in the debugger I can see that an empty texture is bound (there's a reference to a texture, but it's all zeroed out or null).
The shape I'm drawing is defined by the following (it was originally a triangle strip, but I switched to triangles to make sure that's not the issue):
static const GLfloat initTape[] = {
    -TAPE_WIDTH / 2.0f, 0, 0,
    TAPE_WIDTH / 2.0f, 0, 0,
    -TAPE_WIDTH / 2.0f, TAPE_INIT_LENGTH, 0,
    TAPE_WIDTH / 2.0f, 0, 0,
    TAPE_WIDTH / 2.0f, TAPE_INIT_LENGTH, 0,
    -TAPE_WIDTH / 2.0f, TAPE_INIT_LENGTH, 0,
};

static const GLfloat initTapeTex[] = {
    0, 0,
    1, 0,
    0, 1.0,
    1, 0,
    1, 1,
    0, 1,
};
I set the effect variables as:
effect.transform.modelviewMatrix = modelview;
effect.light0.enabled = GL_FALSE;
// Projection setup
GLfloat ratio = self.view.bounds.size.width/self.view.bounds.size.height;
GLKMatrix4 projection = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(self.fov), ratio, 0.1f, 1000.0f);
effect.transform.projectionMatrix = projection;
// Load the tape texture (once).
if (tapeTex == nil) {
    NSError* error;
    tapeTex = [GLKTextureLoader textureWithContentsOfFile:[[[NSBundle mainBundle] URLForResource:@"ruler_texture" withExtension:@"png"] path] options:nil error:&error];
}
effect.texture2d0.enabled = GL_TRUE;
effect.texture2d0.target = GLKTextureTarget2D;
effect.texture2d0.envMode = GLKTextureEnvModeReplace;
effect.texture2d0.name = tapeTex.name;
And the rendering loop is:
[effect prepareToDraw];
glDisable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribPosition, COORDS, GL_FLOAT, GL_FALSE, 0, tapeVerts);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 0, tapeTexCoord);
glDrawArrays(GL_TRIANGLES, 0, TAPE_VERTS);
glDisableVertexAttribArray(GLKVertexAttribPosition);
glDisableVertexAttribArray(GLKVertexAttribTexCoord0);
I've also tested the texture itself in another view with other objects, and it works fine, so the texture file is not at fault.
Any help would be greatly appreciated, as I've been stuck on this issue for over 3 days.
Update: there are also no GL errors during the rendering loop.
After many, many days I've finally found my mistake: when using multiple OpenGL contexts, it's important to create the GLKTextureLoader with a sharegroup, or else the textures aren't necessarily loaded into the right context.
Instead of using the class method textureWithContentsOfFile:, every context needs its own GLKTextureLoader initialized with that context's sharegroup (initWithSharegroup:), and only that texture loader should be used for that view. (Actually, textures can be shared between different contexts, but I didn't need that feature of sharegroups.)
Here's an easy tutorial: http://games.ianterrell.com/how-to-texturize-objects-with-glkit/
I think it will help you.
