In 11.1 and later, Microsoft removed a lot of the helpers for loading textures (FromFile, FromStream, etc.).
I'm trying to port my code over to 11.2 and everything works fine except for this line:
var texture = Texture2D.FromFile<Texture2D>(device, @"C:\Texture.png");
Now, all I could find was guidance telling me to use WIC instead, but I can't seem to find anything that provides a Texture2D nearly as versatile (all the samples I found require passing in the pixel format, among other things).
I'm looking for a solution that would let me load files (without knowing their format or layout beforehand) and get a Texture2D out of it, just like FromFile allowed. Is there anything like that, or even close? I'm assuming there has to be "something", as they wouldn't just deprecate such a feature if it weren't superfluous.
Any help is most appreciated.
Edit: I'd like to avoid using the SharpDX.Toolkit too; I'm looking for a raw DirectX / WIC solution, as I don't want to add a dependency on the Toolkit. I'm perfectly fine, however, with adding any .NET Framework 4.0 or 4.5 assembly as a dependency.
There is no easy solution apart from writing the whole WIC interop yourself. Even if you don't want to use the Toolkit, the source code is available and the class WicHelper.cs responsible for decoding images is fairly easy to adapt.
If you want a very simple solution that doesn't handle all WIC cases (format mappings, etc.), you can have a look at TextureLoader.cs.
Provided you have both a Texture2D and a WPF BitmapImage lying around, something like this will help:
// Note: for MapMode.WriteDiscard the texture must have been created with ResourceUsage.Dynamic and CpuAccessFlags.Write.
var map = device.ImmediateContext.MapSubresource(texture, 0, MapMode.WriteDiscard, MapFlags.None);
// Assumes the mapped row pitch equals PixelWidth * 4; otherwise copy row by row.
bitmap.CopyPixels(Int32Rect.Empty, map.DataPointer, bitmap.PixelWidth * bitmap.PixelHeight * 4, bitmap.PixelWidth * 4);
device.ImmediateContext.UnmapSubresource(texture, 0);
The function you mention was probably something fairly similar under the hood.
If you're not sure about the bitmap being Bgra, you have to convert it first.
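Since CopyPixels writes 4 bytes per pixel in B, G, R, A order only for a Bgra32 source, the conversion can be done in WPF itself. A minimal sketch using the standard FormatConvertedBitmap class (the `path` variable is a placeholder):

```csharp
// Convert any WPF BitmapSource to Bgra32 before copying into the texture.
BitmapSource source = new BitmapImage(new Uri(path));
if (source.Format != PixelFormats.Bgra32)
{
    source = new FormatConvertedBitmap(source, PixelFormats.Bgra32, null, 0);
}
// source.CopyPixels(...) now produces exactly the layout the mapped texture expects.
```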
I am familiar with C and C++.
However, I am quite a newbie at WebGL and GLSL.
This might be too basic or silly a question, but I'd appreciate any hints.
I found this source code, which makes great CG graphics.
I want to compile this myself; however, honestly, I am not sure what it is or what compiler I can use to make it work.
#define R rotate2D
for(float i,e,g,s;i++<1e2;g+=e*.2)
{
vec3 n,p=vec3((FC.xy-.5*r)/r.y*g,g-4.);
p.zy*=R(.4);
p.z+=t;
e=p.y;
for(s=1.;s<6e3;s+=s)p.xz*=R(s),n.xz*=R(s),n+=cos(p*s),e+=dot(sin(p.xz*s),r/r/s/3.);
n.y=1.;
n=normalize(n);
e-=snoise2D(p.xz*40.)*n.y*n.y*.4;
o-=exp(-e*9.-5.);
}
o++;
So at first, I started to learn WebGL, and I finished learning basic WebGL.
However, I can't make this work with WebGL.
Next, I started to learn SLGL in Unity, but that didn't hit the spot either.
What should I learn to compile this?
This looks like a piece of a GLSL fragment shader, and you would compile it with gl.compileShader().
But it's incomplete. First of all, to be compilable it needs a main function that sets gl_FragColor. I also see references to snoise2D and rotate2D, which should be functions but are missing, plus at least a few undefined variables like FC, r, t, and o. So without those pieces, this won't compile.
If you can solve those issues, you should be able to plug it into something like Shadertoy without having to build a whole WebGL application from scratch. But if this shader is meant to interact with specific geometry, that might not work at all.
It's hard to advise more without more information about what this is, how it's intended to be used, and what all those single-letter variable names are supposed to mean.
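To make the fragment compilable, a wrapper along these lines is needed. This is only a sketch: the meanings of FC, r, t, and o are guesses (they look like the fragment coordinate, resolution, time, and output colour conventions used by golfed-shader playgrounds), and both stub functions must be replaced with real implementations:

```glsl
precision highp float;

uniform vec2  r;  // assumed: viewport resolution
uniform float t;  // assumed: elapsed time in seconds

// Stubs for the two missing functions:
mat2 rotate2D(float a) { float c = cos(a), s = sin(a); return mat2(c, -s, s, c); }
float snoise2D(vec2 p) { return 0.0; }  // replace with a real 2D simplex noise

void main() {
    vec2 FC = gl_FragCoord.xy;  // assumed: "FC" is the fragment coordinate
    vec4 o  = vec4(0.0);        // assumed: "o" is the accumulated output colour
    // ... paste the snippet from the question here ...
    gl_FragColor = o;
}
```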
I have a handle to a dynamic library (from using dlopen()). Regardless of why, I don't have access to what the path supplied to dlopen() was, but need the path for another function. Thus, I need to be able to acquire the path to the library using its handle.
I've tried using dladdr(), as I have in other parts of my app, but on macOS / iOS you aren't able to use it to find the path of a library from its handle; it only works with the address of a symbol in the library. I could try adding a "locator symbol" to the library and accomplish things that way, but I'd prefer not to.
I also tried dlinfo() with RTLD_DI_LINKMAP, but this is apparently not available on macOS / iOS.
I'm surprised at how little information there is out there for this. Many of the solutions out there were not available on macOS / iOS. Others still, were only about getting the path of the current executable, and had nothing to do with the handle.
After a TON of searching, I finally came across some resources saying to iterate through all the loaded images using _dyld_image_count() and _dyld_get_image_name(). I initially decided against this, as it seemed like an unreasonably slow way of doing things.
Eventually, I decided to go with iterating over all the loaded images, as it was the only actual solution I had come across. I googled for examples, and couldn't find any tutorials on the topic. However, I did come across an open source C++ library that implemented the functionality (found here).
I translated it to plain C and got rid of some excess things (such as stripping the handle). During testing, I noticed that the library I wanted was always last in the list (my best guess is that the list is in load order, and since mine isn't a system library, it's one of the last ones loaded). This guaranteed slow performance (for the computer - to a human it'd still be nearly instantaneous). So, I made a simple optimization that starts the search at the end of the list rather than the beginning.
This is the final code for my solution:
// NOT a thread safe way of doing things
NSString *pathFromHandle(void* handle)
{
// Since we know the image we want will always be near the end of the list, start there and go backwards.
// (i is unsigned, so the loop tests i-- > 0; a condition like i >= 0 would always be true and underflow.)
for (uint32_t i = _dyld_image_count(); i-- > 0; )
{
const char* image_name = _dyld_get_image_name(i);
// Why dlopen doesn't affect the _dyld list: if an image is already loaded, dlopen returns the existing handle.
void* probe_handle = dlopen(image_name, RTLD_LAZY);
dlclose(probe_handle);
if (handle == probe_handle)
{
return [NSString stringWithUTF8String:image_name];
}
}
return NULL;
}
It's important to note that this solution is not thread safe, as _dyld_image_count() and _dyld_get_image_name() are inherently not thread safe. This means that any other thread could load / unload an image and have a negative impact on our search.
Additionally, the resource I used questioned why dlopen didn't have an effect on _dyld_image_count(). This is because if an image is already loaded, dlopen does not load a new instance of the image, but rather returns the existing handle for it.
So I have recently started using SharpDX and have stumbled into a problem: I have no idea how to get SharpDX to multisample. I have found two related things; you can specify a SampleDescription when creating the SwapChainDescription, but any input other than (1, 0) throws a "parameter is incorrect" exception.
The other thing I found was SamplerState, which I put on my pixel shader; it didn't do anything. I played around a lot with the parameters, but there was no visible change whatsoever.
I am sure I am missing something, but without any previous DirectX knowledge I have no idea really what exactly to look for.
This will come in handy in your case:
int maxsamples = Device.MultisampleCountMaximum;
int res = device.CheckMultisampleQualityLevels(SharpDX.DXGI.Format.R8G8B8A8_UNorm, samplecount); // samplecount = the sample count you want to test
If res is 0, that sample count is not supported; otherwise res is the number of quality levels, so valid quality values are 0 to res - 1.
Also please note that some options are not compatible, so if you create your SwapChain with:
sd.Usage = (other usages) | Usage.UnorderedAccess;
You are not allowed to use multisampling.
Another very useful technique for spotting the cause of these errors:
Create your device with DeviceCreationFlags.Debug
In your startup project properties (debug section), tick "Enable native code debugging".
Any API call that fails will give you an error description in the debug output window.
I had the same problem and could not get multisampling to work until I enabled the debugging and got a good hint (I really wish I had done this hours earlier and saved a whole lot of testing!).
Initially I read somewhere that the depth-stencil buffer must have the same SampleDescription as the render texture - but I'm not so sure, as a quick test just showed that it appears to work without this.
The thing for me was to create the DepthStencilView with a DepthStencilViewDescription that has "Dimension = DepthStencilViewDimension.Texture2DMultisampled".
Just a heads up on when you are doing multisampling.
When you set your render target, if passing both a render target and a depth stencil, you need to ensure they both have the same multisampling level.
So, for rendering to the back buffer you have defined with MSAA, you will need to create a depth buffer with the same MSAA level.
BUT, if you have a render target that will be a texture fed back into the pipeline, you can define a non-MSAA texture and a non-MSAA depth buffer, which is handy, as you can then use a sampler on the texture (you can't use a normal sampler on an MSAA resource texture).
Most of this info may not be new to you.
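Putting the previous answers together, a SharpDX sketch of an MSAA depth buffer plus the matching multisampled view might look like this (width, height, device, and the 4x sample count are assumptions to adapt):

```csharp
// Depth buffer whose SampleDescription matches the MSAA render target.
var depthDesc = new Texture2DDescription
{
    Width = width,
    Height = height,
    MipLevels = 1,
    ArraySize = 1,
    Format = Format.D24_UNorm_S8_UInt,
    SampleDescription = new SampleDescription(4, 0), // must match the render target
    Usage = ResourceUsage.Default,
    BindFlags = BindFlags.DepthStencil
};
var depthBuffer = new Texture2D(device, depthDesc);

// The view must be declared multisampled as well.
var dsvDesc = new DepthStencilViewDescription
{
    Format = depthDesc.Format,
    Dimension = DepthStencilViewDimension.Texture2DMultisampled
};
var depthView = new DepthStencilView(device, depthBuffer, dsvDesc);
```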
I have been looking through the threads on the Qualcomm Forums, but no luck, since I don't know exactly what to search for.
I'm working with the ImageTargets sample for iOS, and I want to change the teapot to another model (a text, rather) that I have.
I already have the render and I got the .h using an OpenGL library, but I can't figure out what I need to change to make this work. Since this is the very basics and I haven't been able to make it work, I really haven't ventured to try anything else.
Could anyone please help me out?
I would paste code here, but it's a whole project, so I don't know exactly what to include - if needed, please let me know.
If the case is still valid, here's what you have to do:
get header file for 3D object
get texture image for this object
in EAGLView.mm make these changes:
#import "yourobject3d.h"
add your texture to the textureFilenames array (this should be near the beginning of EAGLView)
eventually take care of kObjectScale (by default it was about 3.0f; for one object I had to change it even up to 120.0f)
in the setup3dObjects method, assign the proper arrays of vertices/normals/texture coords (check "yourobject3d.h" for the proper arrays and naming) to the Object3D *object
make this change in renderFrameQCAR:
//glDrawElements(GL_TRIANGLES, obj3D.numIndices, GL_UNSIGNED_SHORT, (const GLvoid*)obj3D.indices);
glDrawArrays(GL_TRIANGLES, 0, obj3D.numVertices);
I believe that is all... if something is missing, take a look at Vuforia's forum, e.g. here: https://developer.vuforia.com/node/2047669
NOTE: the default teapot.h does (!) have indices, which are not present in banana.h (from the comment below), so take care of that too
Have a look at the EAGLView.mm file. There you'll have to load the textures (images) and 3d objects (you'll need to import your .h instead of teapot.h and modify setup3dObjects accordingly).
They are finally rendered by calling the renderFrameQCAR function.
Actually, the teapot is not an image. It's a 3D model stored in .h format, which includes vertices, normals, and texture coordinates. You need a good knowledge of OpenGL ES to understand the code in the sample app.
An easier way to change the 3D model to whatever you want is to use a rendering engine, which facilitates the drawing and rendering work so you don't need to bother with the OpenGL APIs directly. I've done it with jPCT-AE for the Android platform, but for iOS there is a counterpart called openFrameworks. It has some plugins to load 3DS or MD2 files, and since it's written in C++ you can easily integrate it with QCAR.
This is a short video of my result with jPCT and QCAR:
Qualcomm Vuforia + jPCT-AE test video
In XNA, how do I load in a texture or mesh from a file without using the content pipeline?
The .FromFile method will not work on Xbox or Zune. You have two choices:
Just use the content pipeline... on Xbox or Zune (if you care about them) you can't have user-supplied content anyway, so it doesn't matter if you only use the content pipeline.
Write code to load the texture yourself (using .SetData), or, for a model, parse the model file and load the appropriate vertex buffers, etc.
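For the texture half, the SetData route can be as simple as this sketch (the image-decoding step is left out; width, height, and graphicsDevice are assumptions):

```csharp
// Build a Texture2D from raw pixel data (XNA 4.0-style API).
Texture2D texture = new Texture2D(graphicsDevice, width, height);
Color[] pixels = new Color[width * height];
// ... fill "pixels" from your own image decoder ...
texture.SetData(pixels);
```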
For anyone interested in loading a model from a file check out this tutorial:
http://creators.xna.com/en-us/sample/winforms_series2
This is a Windows-only way to load a texture without loading it through the pipeline. As Cory stated above, all content must be compiled before loading it on the Xbox and Zune.
Texture2D texture = Texture2D.FromFile(GraphicsDeviceManager.GraphicsDevice, @"Location of your texture here.png");
I believe Texture2D.FromFile() is what you are looking for.
It does not look like you can do this with a Model though.
http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.graphics.texture2d.fromfile.aspx
If you really want to load a Microsoft.Xna.Framework.Graphics.Model on PC without the content pipeline (e.g. for user-generated content), there is a way. I used SlimDX to load an X file and avoid the parsing code, then some reflection tricks to instantiate the Model (it is sealed and has a private constructor, so it wasn't meant to be extended or customised). See here: http://contenttracker.codeplex.com/SourceControl/changeset/view/20704#346981