Is it possible to add a textured material to an object with a custom mesh in Three.js?
Whenever I try exporting an object with a texture on it from Blender to Three.js, the object disappears. Looking through the three.js examples, it seems they've carefully avoided putting textures on anything other than the built-in geometries, and forcing a texture onto such a mesh causes it to disappear again.
For example, if I edit scene_test.js, the scene file loaded from webgl_scene_test.html, and apply the "textured_bg" material to the "walt" head, it disappears.
It looks like the missing piece of the puzzle is that you have to assign a set of UV coordinates to the mesh of the object in question.
First, select your texture, and under "Mapping" make sure the "Coordinates" dropdown is set to "UV".
Then, click on the "object data" button, and in the UV Texture list, click the plus icon. This seems to automatically add the UV data to the mesh.
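On the Three.js side, once the exported geometry actually carries UV data, a texture can be applied just as with the built-in geometries. A minimal sketch, assuming a placeholder texture file name and a mesh variable holding the loaded object:

    const loader = new THREE.TextureLoader();
    loader.load('textured_bg.jpg', (texture) => {
        // Without UV data on the geometry, a mapped material renders black
        // or the mesh appears to vanish entirely.
        mesh.material = new THREE.MeshBasicMaterial({ map: texture });
    });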
This would work perfectly if it weren't for one problem. When I add the texture to the object (drag and drop it onto the object from the menu in the bottom right), it is applied exactly where I want, but a copy of it, far too large for the object and all blurred, is also wrapped around the rest of the object, and I want that area to just stay white:
Before:
After:
I want it to stay in the middle where it is, and have the surrounding area be the original color of the shirt.
UPDATE
I created a static physics body, and that allowed me to get to the materials of the Plane object. But I can't remove the coloring around the image.
UPDATE
Also, I exported the texture from Blender with Save Image As in Blender's UV/Image Editor; I'm not sure if that is the correct way to export a texture for it to be used correctly in Xcode.
UPDATE
I used the .obj file instead and got this far, but when it renders in the app there is no image, only the gray t-shirt.
To export a .dae including textures from Blender, make sure to select UV Textures and the Copy checkbox under Texture Options in the export options. Then, in Xcode, your texture will be available as a material under the Materials tab, and you just have to drag the image you were using for the texture onto the material (make sure you have the correct material selected) in the Xcode Scene Editor.
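Alternatively, the same assignment can be done in code after import. A minimal sketch, with placeholder asset and node names:

    import SceneKit
    import UIKit

    // Load the imported Collada scene (path and node name are placeholders).
    let scene = SCNScene(named: "art.scnassets/tshirt.dae")!
    if let node = scene.rootNode.childNode(withName: "tshirt", recursively: true) {
        // Assign the image saved from Blender's UV/Image Editor as the diffuse map.
        node.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "tshirt_texture.png")
    }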
Having a problem with a SceneKit object. I'm trying to get it to look like the first image below, but in the application it looks like plastic bronze.
Object in Xcode / SceneKit
Object in AR app
Can't really figure out what I am doing wrong, but I have narrowed it down to my material settings. Current settings:
Materials settings in Xcode
You're assigning the specular map to the diffuse, specular and metalness slots, which is wrong.
PBR materials have four important components:
Diffuse, Roughness, Metalness and Normal
Assign the right map to each of these properties and you will get the expected result.
For more details on PBR, check this link
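In code, that wiring looks like this. A minimal sketch, assuming the four maps exist as separate image files (the file names are placeholders):

    import SceneKit
    import UIKit

    let material = SCNMaterial()
    material.lightingModel = .physicallyBased
    material.diffuse.contents   = UIImage(named: "albedo.png")    // base color, not the specular map
    material.roughness.contents = UIImage(named: "roughness.png")
    material.metalness.contents = UIImage(named: "metalness.png")
    material.normal.contents    = UIImage(named: "normal.png")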
My goal is to take an existing SKScene and stretch it according to a polynomial function, for example one that stretches everything toward or away from the center. The stretched form will be continuously rendered and presented to the user; it may be a new scene/image/view or whatever is necessary. The underlying model will simply keep performing its functions over time in ordinary Euclidean coordinates.
My project content is little more than the iOS SpriteKit starter project in Xcode.
I know of the functions in SKScene:
convertPointToView(), and
convertPointFromView()
However, I don't understand how these will be of much use here, since the scene only offers aspect-fill, aspect-fit, and resize scale modes.
I attempted to make a fragment shader to do the actual stretching, however, I could not figure out how to get existing color and position information to draw the new color according to the transformation.
I am using SpriteKit, and SKShader only gives me access to fragment shaders; I do not know how to access vertex shaders from this context. Otherwise, I would have tried to use a vertex shader.
You could go with SceneKit:
Create SCNView with SCNScene
Create SCNNode with SCNPlane geometry (or create custom SCNGeometry)
Create SCNMaterial with your SKScene attached to the material's diffuse.contents property
Attach material to geometry
Attach node to scene
Then you have multiple choices:
Use SCNShadable - either attach a shader modifier to the geometry or material, or use a custom SCNProgram.
Use SCNTechnique on SCNView.
This way you will have your SKScene as texture on a 3D object (plane or something) and have full control of vertex and fragment shaders.
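A minimal sketch of that setup, with placeholder sizes and a hypothetical radial-stretch shader modifier (SceneKit accepts an SKScene directly as diffuse.contents):

    import SceneKit
    import SpriteKit

    let scnView = SCNView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
    scnView.scene = SCNScene()

    let skScene = SKScene(size: CGSize(width: 512, height: 512)) // your existing SKScene

    // Enough segments that a per-vertex distortion is actually visible.
    let plane = SCNPlane(width: 1, height: 1)
    plane.widthSegmentCount = 50
    plane.heightSegmentCount = 50
    plane.firstMaterial?.diffuse.contents = skScene

    // Hypothetical polynomial stretch away from the center, applied per vertex.
    plane.firstMaterial?.shaderModifiers = [
        .geometry: """
            float2 p = _geometry.position.xy;
            float r = length(p);
            _geometry.position.xy = p * (1.0 + 0.5 * r * r);
            """
    ]

    let node = SCNNode(geometry: plane)
    scnView.scene?.rootNode.addChildNode(node)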
I am building a 3D image viewer which uses Three.js plane geometries as placeholders with the images as their textures.
Now I want to add a black border around each image. The only way I have found so far is to add a new black plane geometry behind the image being displayed, but this required wholesale changes to my framework, which I want to avoid.
WebGL's texture loading function gl.texImage2D has a border parameter. But I couldn't find it exposed anywhere through Three.js, and in WebGL that parameter must in fact always be 0, so it wouldn't help anyway.
Is there an easier way to add borders around textures?
You can use a temporary regular 2D canvas to render your image and apply any kind of editing/effects there, like paint borders and such. Then use that canvas image as a texture. Might be a bit of work, but you will gain a lot of flexibility styling your borders and other stuff.
I'm not near my dev machine and won't be for a couple of days, so I can't look up an example of my own. This issue contains some code to get you started: https://github.com/mrdoob/three.js/issues/868
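For reference, a minimal sketch of the canvas approach (the image URL, border size, and plane mesh are placeholders; newer Three.js versions have THREE.CanvasTexture, older ones use new THREE.Texture(canvas) with needsUpdate = true):

    const image = new Image();
    image.onload = () => {
        const border = 16; // border width in pixels
        const canvas = document.createElement('canvas');
        canvas.width = image.width + border * 2;
        canvas.height = image.height + border * 2;

        const ctx = canvas.getContext('2d');
        ctx.fillStyle = 'black';                        // border color
        ctx.fillRect(0, 0, canvas.width, canvas.height);
        ctx.drawImage(image, border, border);           // image inset by the border

        plane.material.map = new THREE.CanvasTexture(canvas);
        plane.material.needsUpdate = true;
    };
    image.src = 'photo.jpg';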
Morning all (if it's morning where you are)
I have been looking around and have not seen a satisfactory method for doing this, so I thought I would ask around...
In an ideal world, I would like to be able to generate a transparent Texture2D object. Having drawn it to the screen, I would like to be able to "paint" on it, i.e. while the left mouse button is down, whatever pixel the cursor is over should be set to black. I would then need to be able to use this texture.
Using the texture is the easy part; we can simply give a "painting" object a Texture2D attribute and use that in the SpriteBatch.Draw method. The two tricky parts are
Generating a Texture2D object of a specified size, filled with transparency, in code.
Editing that Texture2D on the fly (i.e. being able to alter pixel colours).
If anyone has any experience of these, your input would be very much appreciated.
You can either use a RenderTarget2D (MSDN), which is itself a Texture2D (so you can use it in SpriteBatch.Draw). This allows you to render onto a texture in the same way you render onto the screen. You need to use GraphicsDevice.SetRenderTarget (MSDN) to set this up.
Or you can use Texture2D.SetData (MSDN) to manipulate pixels directly. You can construct a transparent Texture2D directly (MSDN). Don't forget to Dispose of any textures or other resources you create yourself!
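A minimal sketch of the SetData route (the texture size, draw position, and variable names are assumptions; for heavier painting, the RenderTarget2D route scales better):

    // Inside your Game subclass, with the usual usings:
    // Microsoft.Xna.Framework, .Graphics, and .Input.

    // In LoadContent(): create a transparent texture of the desired size.
    int size = 256;
    Texture2D canvas = new Texture2D(GraphicsDevice, size, size);
    Color[] pixels = new Color[size * size];
    for (int i = 0; i < pixels.Length; i++) pixels[i] = Color.Transparent;
    canvas.SetData(pixels);

    // In Update(): while the left button is down, paint the pixel under the cursor.
    MouseState mouse = Mouse.GetState();
    if (mouse.LeftButton == ButtonState.Pressed)
    {
        int x = mouse.X, y = mouse.Y;           // assumes the texture is drawn at (0, 0)
        if (x >= 0 && x < size && y >= 0 && y < size)
        {
            pixels[y * size + x] = Color.Black;
            canvas.SetData(pixels);             // re-upload the modified pixel data
        }
    }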