In my iOS project I'd like to add an image overlay (watermark) to a local video using GPUImage3 and store the result in the Documents directory. How can I do it? I can't find any GPUImage3 documentation, so I have to write the code blindly.
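Since GPUImage3 is sparsely documented, here is a minimal sketch that assumes it mirrors the GPUImage2 Swift API: MovieInput, PictureInput, MovieOutput, a blend operation such as AlphaBlend, and the --> chaining operator. Check the framework source for the names that actually ship; movie writing in GPUImage3 has historically lagged behind GPUImage2.

```swift
import UIKit
import GPUImage   // GPUImage3 module; names below assume it follows the GPUImage2 API

// Sketch only: blends a still watermark over every frame of a local video
// and writes the result into the Documents directory.
func addWatermark(to videoURL: URL, watermark: UIImage,
                  completion: @escaping (URL) -> Void) throws {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let outputURL = documents.appendingPathComponent("watermarked.mp4")
    try? FileManager.default.removeItem(at: outputURL)

    // Pipeline: video frames + still image --> blend --> movie writer
    let movie = try MovieInput(url: videoURL, playAtActualSpeed: false)
    let overlay = PictureInput(image: watermark)
    let blend = AlphaBlend()          // NormalBlend / SourceOverBlend are alternatives
    let output = try MovieOutput(URL: outputURL, size: Size(width: 1280, height: 720))

    movie --> blend
    overlay --> blend
    blend --> output

    // GPUImage2 exposes a completion callback on MovieInput; if GPUImage3 doesn't,
    // you'll need to detect the end of the source video yourself.
    movie.completion = {
        output.finishRecording {
            completion(outputURL)
        }
    }

    output.startRecording()
    movie.start()
    overlay.processImage()
}
```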
Related
I have Augmented Reality functionality built with Unity + the Vuforia plugin, which I integrated into an iOS application. The app uses the camera as the background, and when you point the camera at a marker, a 3D object appears on it.
My task is to add buttons that start and stop capturing video (or an image) from the camera. The output should be a video of the camera scene plus the 3D object.
I did some investigation, but the only solution I found was to convert the view of the AVCaptureVideoPreviewLayer showing the camera preview into a video (or image). In my opinion, this solution is inefficient and inflexible.
Is there any way to get the current AVCaptureSession instance from Unity (or maybe from the Vuforia plugin)? Or is there another way to solve my problem?
Any advice or guides would be very helpful.
I don't think you should use AVCaptureSession to get the preview or do the capture operation in Cocoa Touch. Instead, you should capture the image in Unity and pass the data to the native Cocoa Touch API.
Here is the link describing how to capture a screenshot in Unity.
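For the native side, something like the sketch below could pick up the screenshot Unity wrote to disk (e.g. via ScreenCapture.CaptureScreenshot, which saves under Application.persistentDataPath, i.e. the app's Documents directory on iOS) and move it into the photo library. The function name and the idea of passing the file path across the Unity/native bridge are assumptions for illustration, not Unity or Vuforia API.

```swift
import Photos

// Hypothetical native-side helper for a Unity iOS plugin: Unity captures the frame,
// and the resulting file path is handed across the bridge to this function.
func saveCapturedScreenshot(atPath path: String) {
    let fileURL = URL(fileURLWithPath: path)
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            // Create a photo library asset directly from the PNG Unity wrote to disk
            PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: fileURL)
        }, completionHandler: { success, error in
            print("Saved screenshot: \(success), error: \(String(describing: error))")
        })
    }
}
```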
I am using AVFoundation to create a video and have added an effect that clips the video so there is a clear (transparent) background. What file format should I save it as to preserve the transparency in my iOS app?
AVAnimator is a library that lets you display video with an alpha channel on iOS; however, it is not free to use for commercial products.
I don't think it's natively possible.
I want to be able to take an image uploaded in the app and convert it to map tiles to be overlaid on a MapView. This could be a photo downloaded from the web or simply a photo taken with the device's camera.
So far I have come across GDAL2Tiles and MapTiler, but these are command-line and desktop applications that do the conversion, and I want to do it in the app instead of preparing the tiles beforehand. This is all I can find so far.
Is there a built-in feature for iOS that allows me to do this? If not, is there a third-party library that does, or is it just not possible?
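As far as I know there is no built-in tiling API on iOS, but the core of what GDAL2Tiles does (minus the georeferencing and reprojection) is slicing the image into 256×256 PNGs laid out as zoom/x/y. A rough, illustrative on-device sketch with CoreGraphics, for a single zoom level:

```swift
import UIKit

// Illustrative only: slices a UIImage into 256x256 PNG tiles on disk using the
// zoom/x/y layout most tile consumers expect. No georeferencing, no reprojection,
// and only one zoom level; edge tiles may be smaller than 256x256.
func generateTiles(from image: UIImage, zoom: Int, into directory: URL) throws {
    let tileSize = 256
    guard let cgImage = image.cgImage else { return }
    let columns = Int(ceil(Double(cgImage.width) / Double(tileSize)))
    let rows = Int(ceil(Double(cgImage.height) / Double(tileSize)))

    for x in 0..<columns {
        for y in 0..<rows {
            let rect = CGRect(x: x * tileSize, y: y * tileSize, width: tileSize, height: tileSize)
            guard let tile = cgImage.cropping(to: rect),
                  let data = UIImage(cgImage: tile).pngData() else { continue }

            let tileDir = directory.appendingPathComponent("\(zoom)/\(x)", isDirectory: true)
            try FileManager.default.createDirectory(at: tileDir, withIntermediateDirectories: true)
            try data.write(to: tileDir.appendingPathComponent("\(y).png"))
        }
    }
}
```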
I'm currently working on an augmented reality app, which is why I would like to add an image on top of the live video feed (GPUImageVideoCamera) AND be able to record the whole thing to an output file.
Getting the live video preview and recording it works fine so far, but I can't manage to add the image on screen in a way that also gets recorded to the output file.
What's the best way to achieve this (I mean, a GPUImage-compliant way)?
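The usual GPUImage pattern here is to blend the camera feed with the overlay and send the blend output to both the on-screen view and a GPUImageMovieWriter, so the recorded file contains exactly what is shown. A sketch (the asset name and sizes are placeholders; if a static GPUImagePicture doesn't refresh in the blend, GPUImageUIElement is the usual alternative):

```swift
import UIKit
import AVFoundation
import GPUImage   // original Objective-C GPUImage, used from Swift

class OverlayRecorder {
    let camera: GPUImageVideoCamera = GPUImageVideoCamera(
        sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue, cameraPosition: .back)
    let overlay: GPUImagePicture = GPUImagePicture(image: UIImage(named: "overlay.png")!) // placeholder asset
    let blend: GPUImageAlphaBlendFilter = GPUImageAlphaBlendFilter()
    var writer: GPUImageMovieWriter!

    func start(previewView: GPUImageView, outputURL: URL) {
        camera.outputImageOrientation = .portrait

        // Camera + still image feed the blend filter
        camera.addTarget(blend)
        overlay.addTarget(blend)

        // The blend output goes to the screen AND to the movie writer
        blend.addTarget(previewView)
        writer = GPUImageMovieWriter(movieURL: outputURL, size: CGSize(width: 720, height: 1280))
        blend.addTarget(writer)
        camera.audioEncodingTarget = writer

        camera.startCameraCapture()
        overlay.processImage()
        writer.startRecording()
    }

    func stop() {
        writer.finishRecording()
        camera.stopCameraCapture()
    }
}
```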
I'm looking to make an iPhone map application for indoor navigation in our office using Mapbox.
I'm not talking about a custom icon image on a marker or caching the map for offline use.
Is it possible to create a simple iPhone app with MapBox that uses my own building map as the image source?
Yes, this is possible. You want to use TileMill to create a map from the image, either hosting it online or exporting it to an MBTiles file (essentially a SQLite file full of tiles) that you can read directly in the app using RMMBTilesSource.
Here is a guide on making the map: http://mapbox.github.io/tilemill/docs/guides/reprojecting-geotiff/
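For reading the exported MBTiles file in the app, here is a minimal sketch assuming the classic MapBox iOS SDK (the Route-Me based one that ships RMMapView and RMMBTilesSource); "office.mbtiles" is a hypothetical bundled tile set, and initializer names may vary between SDK versions.

```swift
import UIKit
// RMMapView / RMMBTilesSource come from the classic MapBox iOS SDK,
// typically pulled in through a bridging header (#import "Mapbox.h").

class IndoorMapViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // "office.mbtiles" is a placeholder for the tile set exported from TileMill
        // and bundled with the app.
        let source: RMMBTilesSource = RMMBTilesSource(tileSetResource: "office", ofType: "mbtiles")
        let mapView: RMMapView = RMMapView(frame: view.bounds, andTilesource: source)
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(mapView)
    }
}
```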