Is it possible to change the preview surface while recording with MediaRecorder?

Can anyone help me with a solution or a lead? I am recording video using MediaRecorder, with a SurfaceView attached to the recorder for preview. I have different layouts for portrait and landscape. Now the problem is that when I change the orientation, the surface is created again (of course it is, as it is different for landscape and portrait). In this case, in order to get the preview back, I have to stop the recorder, prepare it with the new surface, and start it again. Is it possible to change the preview surface while recording?
In surfaceCreated(SurfaceHolder holder) I am doing the following:
if (recording) {
    recorder.stop();
    recorder.reset();
    initialiseFileFormats(recorder); // helper: re-apply A/V sources, formats and codecs
    recorder.setPreviewDisplay(holder.getSurface());
    recorder.prepare();
    recorder.start();
}
Any help is really appreciated. Thanks.

Related

iOS: How to apply an audio effect on recorded video

I am developing an application which requires applying audio effects to recorded video.
I am recording video using the GPUImage library, and I have that working successfully. Now I need to apply audio effects like Chipmunk, Gorilla, Large Room, etc.
I looked into Apple's documentation, and it says that AVAudioEngine can't apply AVAudioUnitTimePitch to an input node (such as the microphone).
To work around this problem, I use the following mechanism:
Record video and audio at the same time.
Play the video. While the video is playing, start an AVAudioEngine on the audio file and apply AVAudioUnitTimePitch to it.
[playerNode play]; // start playing the audio file alongside the video preview
Merge the video with the new, effected audio file.
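The playback step above can be sketched as follows (Objective-C; `audioURL`, `engine`, and `playerNode` are illustrative names, not from the question, and error handling is omitted):

```objc
// Sketch: play the recorded audio file through AVAudioUnitTimePitch
// while the video plays separately.
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *playerNode = [[AVAudioPlayerNode alloc] init];
AVAudioUnitTimePitch *timePitch = [[AVAudioUnitTimePitch alloc] init];
timePitch.pitch = 1000; // pitch shift in cents; positive for a "chipmunk" effect

AVAudioFile *file = [[AVAudioFile alloc] initForReading:audioURL error:nil];
[engine attachNode:playerNode];
[engine attachNode:timePitch];
[engine connect:playerNode to:timePitch format:file.processingFormat];
[engine connect:timePitch to:engine.mainMixerNode format:file.processingFormat];

[playerNode scheduleFile:file atTime:nil completionHandler:nil];
[engine startAndReturnError:nil];
[playerNode play];
```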
Problems:
The user has to preview the full video before the effected audio can be merged. This is not a good solution.
If I set the volume of playerNode to 0 (zero), it records a muted video.
Please suggest a better way to do this. Thanks in advance.

How to capture video in a specific part of the screen rather than full screen in iOS

I am capturing video in my iOS app using AVFoundation. I am able to record the video and play it back.
But my problem is that I am showing the camera preview in a view that is around 200 points in height, so I expected the video to be recorded with the same dimensions. When I play the video back, however, it shows that the whole screen has been recorded.
So I want to know whether there is any way to record only the part of the camera preview that was visible to the user. Any help would be appreciated.
You cannot think of video resolution in terms of UIView dimensions (or even screen size, for that matter). The camera is going to record at a certain resolution depending on how you set up the AVCaptureSession. For instance, you can change the video quality by setting the session preset via:
[self.captureSession setSessionPreset:AVCaptureSessionPreset640x480];
(It is already set by default to the highest setting.)
Now, when you play the video back, you do have a bit of control over how it is presented. For instance, if you want to play it in a smaller view (whose layer is of type AVPlayerLayer), you can set the video gravity via:
AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer*)self.previewView.layer;
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
And depending on what you pass for the gravity parameter, you will get different results.
Hopefully this helps. Your question is a little unclear, as it seems like you want the camera to record only a certain amount of its input, but you'd have to put your hand over part of the lens to accomplish that ;)

iOS - Audio level metering of app's audio output

I want to show audio meter levels in my app for the audio that my app sends to the speaker. I can get the audio level from AVAudioPlayer, but that only works for audio files.
I have tried to achieve this using The Amazing Audio Engine, as described in its documentation here, but I was unable to find out how to do it.
Is it possible to achieve this in iOS? Can anyone suggest a library, audio engine, or method?
Thanks in advance.
If you are using a "remote I/O audio unit for handling audio input and output", this is a possibility:
https://developer.apple.com/library/ios/samplecode/aurioTouch2/Listings/ReadMe_txt.html
"… , and a sonogram view (a view displaying the frequency content of a signal over time, with the color signaling relative power, the y axis being frequency and the x as time). Tap the sonogram button to switch to a sonogram view, tap anywhere on the screen to return to the oscilloscope. Tap the FFT button to perform and display the input data after an FFT transform. Pinch in the oscilloscope view to expand and contract the scale for the x axis." …

iPhone: Toggling front/back AVCaptureDeviceInput camera when processing individual frames via setSampleBufferDelegate

I've run into an interesting issue when I attempt to switch from using the front camera to using the back camera while processing individual frames via the AVCaptureVideoDataOutput:setSampleBufferDelegate selector. The camera swap works and the preview screen that I'm displaying looks great, it's just that the resulting frames that I capture are no longer in portrait mode, they are in landscape. Also, swapping from the front then back to the back camera will result in the back camera capturing landscape frames. I suspect that since this is the case something is getting screwed up when I swap out the input - it's not the input that's incorrect. I verified this theory by starting the AVCaptureSession with the front facing camera - the frames passed to the buffer delegate are correctly in portrait mode. I've also played with explicitly stopping the AVCaptureSession while the device input is being swapped with no difference in results.
I cribbed from the AVCam demo for inspiration. The suspicious difference between that code and mine is that it records to an AVCaptureMovieFileOutput - it's not processing individual frames.
Any ideas? Why would the orientation of the frames sent to my processor change when I swap out the device input?
Thanks for any response!
Ah ha! I figured it out. For some reason after switching the device input my video output's AVCaptureConnection was getting its orientation reset to landscape-right. To solve the problem, after I swap the input I explicitly ask the video output's AVCaptureConnection to set its orientation to portrait.
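The fix described above looks roughly like this (Objective-C; `videoDataOutput` is an illustrative name for the AVCaptureVideoDataOutput):

```objc
// After swapping the AVCaptureDeviceInput, re-assert the orientation
// on the video data output's connection.
AVCaptureConnection *connection =
    [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}
```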

How do I prevent the video stream rotating in landscape - AVCaptureSession and OpenGL

I have created an application that captures a live video stream, does some processing on it using OpenGL ES, and displays it on the screen in a UIView.
Essentially the data flow is:
AVCaptureSession -> AVCaptureDeviceInput -> AVCaptureVideoDataOutput -> gl buffers -> draws to a UIView
I am not using AVCaptureVideoPreviewLayer for displaying the content.
When I display video using this system in portrait it works as expected, the video is in the same orientation as the display.
However, my app is intended to be used in landscape. In my shouldAutorotateToInterfaceOrientation method in the superview I only allow UIInterfaceOrientationLandscapeLeft, and UIInterfaceOrientationLandscapeRight.
My issue is that the video stream is rotated. The phone is orientated to landscape, and the video stream is rotated by 90 degrees. I would like the video stream to be in the same orientation.
I do not understand why the video stream is rotated. I guess it is trying to compensate for landscape mode, but I have no idea how to prevent this rotation. Any ideas?
Thanks
Mike
Thanks for the response, VinceBurn.
Ultimately I tried out the methods in this post and settled on the example code for willRotateToInterfaceOrientation:duration:.
I don't know if this would work; I'm just guessing here.
If no simple solution comes, maybe try to apply a 90° rotation transform on your UIView layer.
(I don't like this idea, but could work)
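As a rough illustration of that 90° transform idea (Objective-C; `videoView` is an illustrative name for the UIView that draws the GL output):

```objc
// Counter the stream rotation by rotating the view's layer 90 degrees.
videoView.layer.transform = CATransform3DMakeRotation(M_PI_2, 0.0, 0.0, 1.0);
```

Depending on which way the stream is rotated, -M_PI_2 may be needed instead.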