How to get the render mode of any iOS application at runtime - ios

Is there any way of knowing the render mode of an application at runtime on an iOS device?
I need to get the render mode of my running application and branch to different logic depending on which render mode (CPU, GPU, or Direct) I get at runtime, but I am struggling to find any API or method that serves this purpose.
Any suggestions ?
Thanks,
Ken

From a pure AS3 tack, you'd be limited to wmodeGPU (which still doesn't give you quite what you want); with AIR, however, you have access to the NativeWindow classes. That said, everything I've read on it seems to indicate this is an init-time-only property, and not something you can read back out of your NativeWindow.
Try stage.nativeWindow.renderMode?

Related

Direct2D Direct3d11 interop - can device be lost?

In my app I'm using Direct2D to write to a shared (D3D11/D3D10) texture. This is the only kind of render target used in my app. Since devices can be lost in Direct2D (D2DERR_RECREATE_TARGET), a lot of code exists to abstract and/or recreate device-dependent resources. However, I have yet to see this situation actually occur, and I am curious whether I am wasting the effort. Can the render target actually be lost in this scenario, or am I protected because the texture is created via D3D11 (though shared with D3D10)? If so, does anyone know a reproducible, simple way to cause the render target to be lost, so that I can at least test the code that handles this condition?
It’s not a wasted effort. Many scenarios may cause device loss to occur. A simple way to induce this for testing purposes is to update your graphics driver. Your application should handle this gracefully. It can also happen behind the scenes if your graphics driver crashes or Windows Update installs a new version in the background. There are other cases but those are probably the most common.
You can use the Device Manager to roll back and update your driver quickly.
A D2D window render target will always be lost when another program uses any version of the D3D API to go fullscreen and back (exclusive mode, not the new windowed mode supported since D3D10/11). In D3D11, I think you have to cause a resolution change for the D2D render target to be lost.
So if you do not get the D2DERR_RECREATE_TARGET HRESULT in this case when presenting to your texture render target, then maybe you do not need to re-create the render target, but I would still handle D2DERR_RECREATE_TARGET. To test it, you could simply replace the texture render target with a window render target during development.
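If it helps to see the pattern in front of you, here is a minimal sketch of the usual EndDraw/D2DERR_RECREATE_TARGET handling. CreateDeviceResources and DiscardDeviceResources are hypothetical helpers that (re)build and release every device-dependent object; they are not part of the Direct2D API.

    // Minimal sketch of the usual Direct2D device-loss handling pattern.
    // CreateDeviceResources() and DiscardDeviceResources() are hypothetical
    // helpers that (re)build and release every device-dependent object
    // (render target, brushes, bitmaps, ...).
    #include <windows.h>
    #include <d2d1.h>
    #pragma comment(lib, "d2d1.lib")

    extern ID2D1RenderTarget* g_renderTarget;   // assumed to be created elsewhere
    HRESULT CreateDeviceResources();
    void    DiscardDeviceResources();

    HRESULT Render()
    {
        HRESULT hr = CreateDeviceResources();   // no-op if already created
        if (FAILED(hr))
            return hr;

        g_renderTarget->BeginDraw();
        g_renderTarget->Clear(D2D1::ColorF(D2D1::ColorF::CornflowerBlue));
        // ... draw calls ...
        hr = g_renderTarget->EndDraw();

        if (hr == D2DERR_RECREATE_TARGET)
        {
            // The target and everything created from it are now invalid:
            // discard the device-dependent resources and rebuild them on
            // the next frame.
            DiscardDeviceResources();
            hr = S_OK;
        }
        return hr;
    }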

how to use windbg to break into directx application process?

I run the DirectX SDK sample, BasicHLSL10 and use a Windbg to attach to the process.
After breaking into the process, I use 'x d3d*!Render' and 'x dxgi!' to list the functions I am interested in.
I then set breakpoints on 'dxgi!D3DKMTPresent' as well as all the 'd3d*!D3DKMTRender'.
After that I hit 'g' for the process to continue.
However, BasicHLSL10 seems to continue running without stepping on those breakpoints.
I'm thinking this could be one of those COM interface things, but I'm not very familiar with them. How can I break into the process when it is making some DirectX call?
Will this technique also work on media players that use DXVA?
It looks like it could be because the D3DKMT* functions are only used by the OpenGL Installable Client Driver (at least, that is what I think I read in the WDK), so maybe that is why BasicHLSL10 never uses them...

DirectX 11: simultaneous use of multiple adaptors

We need to drive 8 to 12 monitors from one PC, all rendering different views of a single 3D scene graph, so we have to use several graphics cards. We're currently running on DX9, and are looking to move to DX11 in the hope that it makes this easier.
Initial investigations seem to suggest that the obvious approach doesn't work - performance is lousy unless we drive each card from a separate process. Web searches are turning up nothing. Can anybody suggest the best way to go about utilising several cards simultaneously from a single process with DX11?
I see that you've already come to a solution, but I thought it'd be good to throw in my own recent experiences for anyone else who comes onto this question...
Yes, you can drive any number of adapters and outputs from a single process. Here's some information that might be helpful:
In DXGI and DX11:
Each graphics card is an "Adapter". Each monitor is an "Output". See here for more information about enumerating through these.
Once you have pointers to the adapters that you want to use, create a device (ID3D11Device) using D3D11CreateDevice for each of the adapters. Maybe you want a different thread for interacting with each of your devices. This thread may have a specific processor affinity if that helps speed things up for you.
Once each adapter has its own device, create a swap chain and render target for each output. You can also create your depth stencil view for each output as well while you're at it.
The process of creating a swap chain will require your windows to be set up: one window per output. I don't think there is much benefit in driving your rendering from the window that contains the swap chain. You can just create the windows as hosts for your swap chain and then forget about them entirely afterwards.
For rendering, you will need to iterate through each Output of each Device. For each output change the render target of the device to the render target that you created for the current output using OMSetRenderTargets. Again, you can be running each device on a different thread if you'd like, so each thread/device pair will have its own iteration through outputs for rendering.
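As a rough, untested sketch of the enumeration and creation steps above (error handling, swap-chain details, and cleanup are trimmed, and the HWND for each output is assumed to come from your own windowing code):

    #include <windows.h>
    #include <d3d11.h>
    #include <dxgi.h>
    #pragma comment(lib, "d3d11.lib")
    #pragma comment(lib, "dxgi.lib")

    void CreateDevicesAndSwapChains()
    {
        IDXGIFactory* factory = nullptr;
        CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);

        IDXGIAdapter* adapter = nullptr;
        for (UINT a = 0; factory->EnumAdapters(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a)
        {
            // One device (and possibly one worker thread) per adapter.
            ID3D11Device* device = nullptr;
            ID3D11DeviceContext* context = nullptr;
            D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                              nullptr, 0, D3D11_SDK_VERSION,
                              &device, nullptr, &context);

            IDXGIOutput* output = nullptr;
            for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o)
            {
                // One window and one swap chain per output (monitor).
                DXGI_SWAP_CHAIN_DESC desc = {};
                desc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
                desc.SampleDesc.Count  = 1;
                desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
                desc.BufferCount       = 1;
                desc.OutputWindow      = nullptr; // the HWND you created for this output
                desc.Windowed          = TRUE;

                IDXGISwapChain* swapChain = nullptr;
                factory->CreateSwapChain(device, &desc, &swapChain);

                // Create the render target view (and depth-stencil view) for this
                // output from swapChain's back buffer here, and remember the
                // (device, swapChain, views) triple for the render loop.
                output->Release();
            }
            adapter->Release();
        }
        factory->Release();
    }

The render loop then walks each device's outputs, binds that output's views with OMSetRenderTargets, draws its view of the scene, and calls Present on that output's swap chain.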
Here are a bunch of links that might be of help when going through this process:
Display Different images per monitor directX 10
DXGI and 2+ full screen displays on Windows 7
http://msdn.microsoft.com/en-us/library/windows/desktop/ee417025%28v=vs.85%29.aspx#multiple_monitors
Good luck!
Maybe you don't need to upgrade DirectX.
See this article.
Enumerate the available devices with IDXGIFactory, create an ID3D11Device for each, and then feed them from different threads. It should work fine.

What approaches are available to revert an IDirect3DDevice9 instance to its default render state?

Given an instance of IDirect3DDevice9, what approaches are available to put it in its original render state (i.e. the state it was in when the device was initially created)?
The cleanest way that I've come across is to create a state block via IDirect3DDevice9::CreateStateBlock just after the device has been created so that it can be applied later. Unfortunately, I'm operating under the constraints of an existing project such that I can't modify the device creation code; by the time my component gets the device, its default state has been modified. As a result, I'm looking for alternative approaches.
Thx!
~Raf
Well, there is no way to be 100% sure. The driver often fails to put things into a default state, and most software will set up its own default state to avoid suffering such problems from the driver.
You "could", however, rely on the driver doing what it is supposed to. You could then read through the docs and set all the render states back to their supposed default values.
There is no other way to do this.
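For anyone who does control device creation, the state-block approach described in the question looks roughly like this (a minimal sketch; the global is only for illustration):

    #include <windows.h>
    #include <d3d9.h>
    #pragma comment(lib, "d3d9.lib")

    IDirect3DStateBlock9* g_defaultState = nullptr;

    // Call immediately after creating the device, while it still holds its
    // default state. D3DSBT_ALL records render states, sampler and texture
    // stage states, transforms, and so on.
    void CaptureDefaultState(IDirect3DDevice9* device)
    {
        device->CreateStateBlock(D3DSBT_ALL, &g_defaultState);
    }

    // Call whenever you want to return the device to that captured baseline.
    void RestoreDefaultState()
    {
        if (g_defaultState)
            g_defaultState->Apply();
    }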

Viewing DirectX application remotely

We're working on an application that displays information through a Direct3D visualisation. A late client request is the ability to view this application via some Remote Desktop solution.
Has anyone done anything similar? What options are available / unavailable? I'm thinking RDC, VNC, Citrix...
Any advice?
I think you can still use all of the normal D3D tools, but you won't be able to render to a surface associated with the screen. You'll have to render to a DIB (or some such) and Blt it with GDI to a normal window HDC. RDC/VNC/Citrix should all work with this technique.
Performance will definitely suffer - but that's going to be the case over remote desktop anyway. In fact, if I were you, I would mock up a VERY simple prototype and demonstrate the performance before committing to it.
Good luck!
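One way to realize the DIB/GDI suggestion above with D3D9 (a rough sketch; error handling and format checks are omitted, and it assumes an X8R8G8B8 back buffer the same size as the target window) is to copy the back buffer into a system-memory surface and push it through GDI:

    #include <windows.h>
    #include <d3d9.h>
    #pragma comment(lib, "d3d9.lib")
    #pragma comment(lib, "user32.lib")
    #pragma comment(lib, "gdi32.lib")

    void PresentViaGDI(IDirect3DDevice9* device, HWND hwnd, UINT width, UINT height)
    {
        IDirect3DSurface9* backBuffer = nullptr;
        IDirect3DSurface9* sysMem = nullptr;

        device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer);
        device->CreateOffscreenPlainSurface(width, height, D3DFMT_X8R8G8B8,
                                            D3DPOOL_SYSTEMMEM, &sysMem, nullptr);

        // Copy the GPU render target into CPU-accessible memory.
        device->GetRenderTargetData(backBuffer, sysMem);

        // Blit it to the window through GDI, which remote-desktop solutions
        // can capture.
        HDC surfaceDC = nullptr;
        sysMem->GetDC(&surfaceDC);
        HDC windowDC = GetDC(hwnd);
        BitBlt(windowDC, 0, 0, (int)width, (int)height, surfaceDC, 0, 0, SRCCOPY);
        ReleaseDC(hwnd, windowDC);
        sysMem->ReleaseDC(surfaceDC);

        sysMem->Release();
        backBuffer->Release();
    }

Expect the GetRenderTargetData readback to be the slow part; the simple prototype suggested above should tell you quickly whether the resulting frame rate is acceptable.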
I think Windows 7 has D3D remoting stuff - probably requires both client and server to be W7 though.
The built-in Remote Desktop works (you don't have to do anything special).
But it is extremely slow, because when in doubt it just sends the contents of the window as a bitmap.
