Record video with custom camera UI but prevent save - iOS

I'm using AVCaptureFileOutputRecordingDelegate's didFinishRecordingToOutputFileAt inside my custom camera UI, but I don't want to go through this method, because the video has already been saved to a file by the time recording finishes.
For legacy reasons I can't keep the video saved locally, so I have to load it into a static instance and then delete the local file.
How can I do that?

The AVFoundation framework has only the following outputs for a capture session:
AVCaptureMovieFileOutput - to record and output a movie file
AVCaptureVideoDataOutput - to process frames from the video being captured
AVCaptureAudioDataOutput - to process the audio data being captured
AVCaptureStillImageOutput - to capture a still image
Since you don't want to save the recorded video to a file, the next best option is AVCaptureVideoDataOutput: get each frame of the continuously recorded video and create a video from the image buffers. Note that you will not have audio output in this case. You can also add AVCaptureAudioDataOutput and embed the audio separately into the recorded video, but this workaround will not work at higher frame rates. So the best suggestion is to save the video into a temp folder and delete it later; a sketch of that approach follows.
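A minimal sketch of the temp-folder approach, assuming an AVCaptureMovieFileOutput called movieOutput that is already attached to a running AVCaptureSession; the in-memory holder for the finished clip is hypothetical:

```swift
import AVFoundation

final class TempRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    // Assumed to be added to an already-running AVCaptureSession.
    let movieOutput = AVCaptureMovieFileOutput()

    func startRecording() {
        // Record into the temporary directory so nothing lands in permanent storage.
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        guard error == nil else { return }
        // Load the clip into memory, then remove the temporary file
        // so it never persists on disk.
        let data = try? Data(contentsOf: outputFileURL)
        // VideoStore.shared.latestClip = data   // hypothetical static/shared holder
        _ = data
        try? FileManager.default.removeItem(at: outputFileURL)
    }
}
```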

Related

Is there any way to record video with AVCaptureMovieFileOutput and change cameras?

I want to record a video with AVFoundation and AVCaptureSession using AVCaptureMovieFileOutput.
I want to record a video like Instagram Stories, so I want to change the camera while the video is recording. The problem with AVCaptureMovieFileOutput is that when I change the captureDevice in the captureSession, the AVCaptureFileOutputRecordingDelegate method fires with the error "Recording stopped" and everything stops.
I've thought about stopping the front-camera video before I change the camera, saving it, then saving the back-camera video and joining them into the same video (a sketch of that merge step is below). Is this a good option? Or should I use AVCaptureVideoDataOutput to achieve this?
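As an illustration of the "join them into the same video" option, here is a minimal sketch using AVMutableComposition and AVAssetExportSession; the function name and URLs are placeholders, and audio tracks are omitted for brevity:

```swift
import AVFoundation

// Joins two recorded clips (e.g. a front-camera segment and a back-camera segment)
// into a single movie file. Audio handling is omitted.
func mergeClips(_ firstURL: URL, _ secondURL: URL, into outputURL: URL,
                completion: @escaping (Bool) -> Void) {
    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) else {
        completion(false); return
    }

    var cursor = CMTime.zero
    for url in [firstURL, secondURL] {
        let asset = AVAsset(url: url)
        guard let source = asset.tracks(withMediaType: .video).first else { continue }
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try? videoTrack.insertTimeRange(range, of: source, at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration)
    }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality) else {
        completion(false); return
    }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```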

iOS : How to apply audio effect on recorded video

I am developing an application which requires applying an audio effect to a recorded video.
I am recording the video using the GPUImage library, and that part works fine. Now I need to apply an audio effect like Chipmunk, Gorilla, Large Room, etc.
I looked into Apple's documentation, and it says that AVAudioEngine can't apply AVAudioUnitTimePitch to an input node (i.e. the microphone).
To solve this problem, I use the following mechanism (a sketch of step 2 appears after these steps):
Record video & audio at the same time.
Play the video. While the video is playing, start AVAudioEngine on the audio file and apply AVAudioUnitTimePitch to it.
[playerNode play]; // Start playing audio file with video preview
Merge the video and the newly effected audio file.
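A minimal sketch of step 2's playback chain, assuming the recorded audio has already been extracted to an audioFileURL (a hypothetical URL):

```swift
import AVFoundation

// Plays an audio file through an AVAudioUnitTimePitch effect
// while the video is previewed separately.
// The caller must retain the returned engine for playback to continue.
func playEffectedAudio(from audioFileURL: URL) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.pitch = 1200   // pitch is in cents; +1200 = one octave up ("chipmunk"-style)

    let file = try AVAudioFile(forReading: audioFileURL)
    engine.attach(playerNode)
    engine.attach(timePitch)
    engine.connect(playerNode, to: timePitch, format: file.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)

    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    try engine.start()
    playerNode.play()   // corresponds to "[playerNode play]" above
    return engine
}
```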
Problem:
The user has to preview the full video for the audio-effect merge, which is not a good solution.
If I set the volume of playerNode to 0 (zero), then it records a mute video.
Please suggest a better way to do this. Thanks in advance.

Using AVPlayer as input for AVCaptureSession?

Is it possible to capture the output of the AVPlayer using AVCaptureSession? I believe it's possible but can't figure out how to go about using the AVPlayer as an input.
You cannot plug an AVPlayer into an AVCaptureSession, although you can get access to the player's video and audio in the form of CVPixelBuffers and AudioBufferLists.
This is achieved via two APIs: AVPlayerItemVideoOutput for video and MTAudioProcessingTap for audio.
Despite being a C API, MTAudioProcessingTap is easier to integrate because, just like AVCaptureSession, it pushes samples to you via a callback, while with AVPlayerItemVideoOutput you pull frames for a given time.
For this reason, if you want an AVCaptureSession-like experience (real-time, push), you should probably let the audio tap drive your frame-pulling.
There is some AVPlayerItemVideoOutput sample code in Objective-C here and in Swift here, and an example of using an MTAudioProcessingTap in Swift here.
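For the pull side, here is a minimal sketch of reading pixel buffers from a playing AVPlayerItem with AVPlayerItemVideoOutput, driven here by a CADisplayLink rather than an audio tap; the class and method names are illustrative:

```swift
import AVFoundation
import UIKit

// Pulls decoded video frames out of an AVPlayer via AVPlayerItemVideoOutput.
// The display link drives the pull on every screen refresh.
final class PlayerFrameGrabber {
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ])
    private var displayLink: CADisplayLink?

    func attach(to item: AVPlayerItem) {
        item.add(output)
        displayLink = CADisplayLink(target: self, selector: #selector(pullFrame))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func pullFrame() {
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime,
                                                       itemTimeForDisplay: nil) else { return }
        // Hand the CVPixelBuffer to your processing pipeline here.
        _ = pixelBuffer
    }
}
```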

iOS Capture Video from Camera and Mixing with audio file in real time

I am trying to capture video from the iPhone camera and save the movie mixed with an audio file.
I can capture video with audio (from the mic) with no problems. What I want to do is capture the video but, instead of the mic audio, use a music track (a .caf file).
I am capturing the video with AVAssetWriter. I've tried to set up an AVAssetReader to read the audio file, but I couldn't make it work with the AVAssetWriter (maybe because the decoding of the audio happens very fast).
Also, I don't want to save the movie without audio and mix it afterwards with an AVAssetExportSession. It would be too slow for my purpose.
Any ideas? Thanks in advance.
Capture the video using AVAssetWriter, capture the audio with, say, AVAudioRecorder, then mix the audio and video using AVAssetExportSession; there are lots of posts on this topic.
If you do it afterwards (with AVAssetExportSession), you will have problems with sync and a short delay between the video and the audio... I didn't find a better solution.
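A minimal sketch of that after-the-fact mix, assuming the video and the music track already exist on disk (the function name and URLs are placeholders):

```swift
import AVFoundation

// Mixes a recorded video file with a separate music track (e.g. a .caf file)
// using AVMutableComposition + AVAssetExportSession.
func mix(videoURL: URL, musicURL: URL, outputURL: URL,
         completion: @escaping (Bool) -> Void) {
    let video = AVAsset(url: videoURL)
    let music = AVAsset(url: musicURL)
    let composition = AVMutableComposition()

    guard let videoSource = video.tracks(withMediaType: .video).first,
          let audioSource = music.tracks(withMediaType: .audio).first,
          let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid),
          let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) else {
        completion(false); return
    }

    let range = CMTimeRange(start: .zero, duration: video.duration)
    do {
        try videoTrack.insertTimeRange(range, of: videoSource, at: .zero)
        // Assumes the music is at least as long as the video; trimming/looping is omitted.
        try audioTrack.insertTimeRange(range, of: audioSource, at: .zero)
    } catch {
        completion(false); return
    }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality) else {
        completion(false); return
    }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously { completion(export.status == .completed) }
}
```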

Capture video from cam + custom view into single video file

I wonder if it's possible in iOS 4 or 5 to save into a single video file not just the stream from the camera, but the camera stream WITH custom view(s) overlaid. The custom view will contain a few labels with a transparent background. Those labels will show additional info: the current time and GPS coordinates. And every video player must be able to play back that additional info.
I think you can use AVCaptureVideoDataOutput to process each frame and AVAssetWriter to record the processed frames. You can refer to this answer:
https://stackoverflow.com/a/4944594/379941
You can process the CVImageBufferRef and then use AVAssetWriterInputPixelBufferAdaptor's appendPixelBuffer:withPresentationTime: method to export; a sketch of the writer side follows.
I also strongly suggest using OpenCV to process the frames; this is a nice tutorial: http://aptogo.co.uk/2011/09/opencv-framework-for-ios/. The OpenCV library is great.
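A minimal sketch of that writer side, assuming the frames have already been processed (the overlay drawn into the pixel buffer); the class name, dimensions, and session start time are assumptions:

```swift
import AVFoundation
import CoreVideo

// Writes processed frames (e.g. frames with the overlay already composited
// into the pixel buffer) into a movie file. Capture-side setup is omitted.
final class OverlayMovieWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        // In a real capture pipeline you would start the session at the
        // first frame's timestamp rather than .zero.
        writer.startSession(atSourceTime: .zero)
    }

    // Call this for every processed frame, with its presentation timestamp.
    func append(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
        guard input.isReadyForMoreMediaData else { return }
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```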
