I am working with Google ARCore. When I open my app, the camera feed is blurry everywhere. I have tried many times, but the camera is still blurred. Does anyone have a solution to fix it?
Related
I'm working on creating a marker-based AR game using A-Frame 1.2.0 and ar.js 3.3.3. The display shows 2D images of animals that the user has to "find". The whole game functions well now, but I was running into an issue of photos appearing distorted or warped. I figured out that the issue is that the marker's plane is not being read correctly by mobile devices. The pictures below include a red cube to show the issue more clearly. The top one, from a PC's webcam, correctly shows the box mounted on the marker. The bottom one shows the box not mounted on the marker.
I figure the issue is either the mobile device's gyroscope or that the screen dimensions are affecting the aspect ratio of the display.
I've tried a few properties on A-Frame's a-entity, such as look-controls="enabled: false" and look-controls="magicWindowTrackingEnabled: false". Neither of those made a difference. I haven't found any properties within ar.js to use. Just wondering if anyone has come across this issue and found a fix.
image showing the box aligned correctly with the marker's plane (PC webcam)
image showing the box not aligned correctly (mobile device)
AR.js comes in two different, mutually exclusive builds - image + location-based tracking, and marker tracking (link).
Importing the wrong one can (and often will) cause incorrect behavior like the one you are experiencing.
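For reference, a minimal sketch of what importing the marker-tracking build looks like with A-Frame; the raw.githack paths are the ones commonly shown in the AR.js docs and should be checked against the documentation for your exact version:

```html
<!-- Marker-based build: this is the one a marker game needs -->
<script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
<script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>

<!-- The image/location-based build is aframe-ar-nft.js; do NOT load it
     in place of (or together with) the marker build above -->
```

If the page currently pulls in aframe-ar-nft.js, swapping it for aframe-ar.js is the first thing to try.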
I have Augmented Reality functionality made using Unity + the Vuforia plugin, which I integrated into an iOS application. The app uses the camera as the background, and when you point the camera at a marker, a 3D object appears on it.
My task is to add buttons that start and stop capturing video (or an image) from the camera. The output should be a video of the camera scene + the 3D object.
I did some investigation, but the only solution I found is to convert the AVCaptureVideoPreviewLayer view, on which the camera preview is shown, into a video (or image). But in my opinion this solution is inefficient and not flexible.
Is there any way to get the current instance of the AVCaptureSession from Unity (or maybe from the Vuforia plugin)? Or maybe there is another way to solve my problem?
Any advice or guides would be very helpful.
I don't think you should use AVCaptureSession to get the preview, or even do the capture operation in Cocoa Touch; instead you should capture the image in Unity and pass the data to the Cocoa Touch native API.
Here is the link on how to capture a screenshot in Unity.
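A minimal sketch of that approach in Unity C#; the SaveScreenshotToPhotos bridge is a hypothetical name for a native function you would implement yourself in an Objective-C plugin:

```csharp
using System.Collections;
using System.Runtime.InteropServices;
using UnityEngine;

public class ARScreenshot : MonoBehaviour
{
    // Assumed native function exposed by your own Objective-C plugin; adjust to your bridge.
    [DllImport("__Internal")]
    private static extern void SaveScreenshotToPhotos(string path);

    public IEnumerator CaptureFrame()
    {
        // Wait until both the camera background and the 3D overlay have been rendered this frame.
        yield return new WaitForEndOfFrame();

        // Read the rendered frame back from the screen into a texture.
        Texture2D tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();

        // Encode to PNG and hand the data (here via a temp file path) to the native side.
        byte[] png = tex.EncodeToPNG();
        string path = System.IO.Path.Combine(Application.temporaryCachePath, "capture.png");
        System.IO.File.WriteAllBytes(path, png);
        Destroy(tex);

#if UNITY_IOS && !UNITY_EDITOR
        SaveScreenshotToPhotos(path);
#endif
    }
}
```

You would kick this off from your button handler with StartCoroutine(CaptureFrame()). For video rather than stills, the same idea applies frame by frame, though encoding on the native side is usually more practical.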
I'm working on a video app for iOS, and everything seems to work fine, except that if we go into the stock iPhone camera app, its field of view seems wider. I've tried playing with the different AVCaptureVideoPreviewLayer gravity settings (AVLayerVideoGravityResize, AVLayerVideoGravityResizeAspect and AVLayerVideoGravityResizeAspectFill) to no avail.
In fact, other camera apps (like the photo capture in Facebook) produce output much more similar to ours than to the stock camera app's.
Does anyone have any information on why something like this might be? I've been scouring the internet and haven't been able to find anything.
It turns out the resolution of the still-image camera is different from the resolution of the video camera; see for instance https://discussions.apple.com/thread/3490649?tstart=0
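A small sketch of how that plays out in AVFoundation: the session preset (or activeFormat) you pick decides which sensor mode is used, and photo formats typically cover a wider field of view than video formats, though the exact behavior varies by device:

```swift
import AVFoundation

// Sketch: the preset selects the capture format, and the format determines the field of view.
let session = AVCaptureSession()

// A still-photo preset generally uses the full 4:3 sensor area (wider FOV, closer to the
// stock Camera app's photo mode); video presets such as .hd1920x1080 use a tighter 16:9 crop.
session.sessionPreset = .photo

if let device = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}
```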
I am trying to integrate GPUImage into my iPhone app to take photos with real-time filters. But I have found an annoying problem: when I use the front camera to take photos, there are always some vertical black bars in the resulting image.
The rear camera never has this problem;
It has nothing to do with filters; I have tried without any filter.
At first I thought it had something to do with my iPhone hardware, but then I searched the internet and found that other developers have run into the same problem (with no solution).
see image here:
http://i.stack.imgur.com/rPAJ9.jpg
Maybe this can help you out:
I had the same exact problem. In GPUImageStillCamera.m, I commented out "[self captureOutput:photoOutput didOutputSampleBuffer:imageSampleBuffer fromConnection:[[photoOutput connections] objectAtIndex:0]];" and now the glitches are gone. That's the only way I was able to fix the problem without forcing GPUImageOpenGLESContext's +supportsFastTextureUpload to return NO.
For more, see: here
I am a complete beginner in Objective-C and iOS programming. I spent a month finding out how to show a 3D model using OpenGL ES (version 1.1) on top of the live camera preview using AVFoundation. I am doing a kind of augmented reality application on iPad. I process the input frames and show the 3D object overlaid on the camera preview in real time. That part went fine because there are so many sites and tutorials about these things (thanks to this website as well).
Now I want to capture the whole screen (the model with the camera preview as the background) as an image and show it on the next screen. I found a really good demonstration here: http://cocoacoderblog.com/2011/03/30/screenshots-a-legal-way-to-get-screenshots/. He does everything I want to do. But, as I said before, I am such a beginner that I don't understand the whole project without a detailed explanation. So I've been stuck for a while because I don't know how to implement this.
Does anybody know of a good tutorial or any other kind of resource on this topic, or have a suggestion for what I should learn in order to do this screen capture? That would help me a lot in moving on.
Thank you in advance.
I'm currently attempting to solve this same problem to allow a user to take a screenshot of an Augmented Reality app. (We use Qualcomm's AR SDK plugged into Unity 3D to make our AR apps, which saved me from ever having to learn how to programmatically render OpenGL models)
For my solution I am first looking at implementing the second answer found here: How to take a screenshot programmatically
Barring that I will have to re-engineer the "Combined Screenshots" method found in CocoaCoder's Screenshots app.
I'll check back in when I figure out which one works better.
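In the meantime, a minimal sketch of the plain UIKit screenshot route on modern iOS (UIGraphicsImageRenderer is iOS 10+). One caveat: GPU-rendered content such as an OpenGL/Unity view or a live camera layer may come out black with this method, which is exactly why the "Combined Screenshots" approach composites the GL framebuffer and the camera frame separately:

```swift
import UIKit

// Sketch: snapshot a view hierarchy into a UIImage.
func captureScreenshot(of view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { _ in
        // afterScreenUpdates: true forces a fresh render pass before capturing.
        _ = view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    }
}
```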
Here are 3 very helpful links for capturing screenshots:
OpenGL ES View Snapshot
How to capture video frames from the camera as images using AV Foundation
How do I take a screenshot of my app that contains both UIKit and Camera elements
Enjoy