I was having a look at the RosyWriter sample code provided by Apple as a starting point, and I'd like to find a way to crop a video.
So I have the full-resolution video from the iPhone's camera, but I just want to use a cropped part of it (and also rotate that subpart).
I figured that in captureOutput:didOutputSampleBuffer:fromConnection: I can modify each frame by modifying the CMSampleBufferRef that gets passed in.
So my questions now are:
Is this the right place to crop my video?
Where do I specify that the final video (the one that gets saved to disk) has a smaller resolution than the full video captured by the AVCaptureSession? Setting the AVVideoWidthKey and AVVideoHeightKey has no effect.
How can I crop the video and still have good performance?
Any help is appreciated!
Thanks a lot!
EDIT:
Maybe I just need to know how I can turn a video that was shot in portrait into a landscape one, by rotating the frames 90 degrees and then zooming in to fit the width again?
In AVVideoSettings.h there is the AVVideoScalingModeKey. This key, combined with the defined values, controls how the video is scaled/cropped when encoding the images into the video container. For example, if you specify a value of AVVideoScalingModeFit, then cropping is used. Check out the header for how the other values affect the video images.
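For illustration, here is a minimal sketch of how these settings fit together in an AVAssetWriter pipeline like RosyWriter's (the codec and the 640×480 output size are assumptions, not values from the question):

    // Output settings for the asset writer input. The scaling mode tells the
    // encoder how to map the full-resolution camera frames into the smaller
    // output dimensions (ResizeAspectFill scales to fill, cropping the overflow).
    NSDictionary *outputSettings = @{
        AVVideoCodecKey       : AVVideoCodecH264,
        AVVideoWidthKey       : @640,
        AVVideoHeightKey      : @480,
        AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill
    };
    AVAssetWriterInput *videoInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:outputSettings];
    videoInput.expectsMediaDataInRealTime = YES; // required for live capture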
I am using AVFoundation for my camera app, grabbing pictures from live video frames via AVCaptureVideoDataOutput.
Right now I filter all the supported formats for the device and then pick the format with the highest supported resolution.
For example, on iPhone 8/7 I get a format with a supported resolution of 4032×3024, which I use as the active format; I filter using videoSupportedFrameRateRanges.
I would like to know how I can render the video at a 4:3 aspect ratio so that I see black bars on the sides of the camera view.
I played with the AVCaptureVideoPreviewLayer properties and found that if I use resizeAspectFill it shows the camera full screen, but if I just don't assign anything it adds black bars on the sides. (I assume .resizeAspect is the default.) I am not sure whether this is the right approach to show/hide the black bars.
I suspect this only works because I am currently picking the highest supported resolution format, which in this case is 4032×3024 and therefore already a 4:3 aspect ratio.
Basically, I want to make sure I show the video in a 4:3 view, with black bars in some scenarios and full screen in others, which I am already doing.
Open questions
Is there any way I can make sure I always show a 4:3 aspect ratio video feed?
How can I show black bars with 4:3?
Also, after showing the black bars, how can I get the camera area without the black bars? (See the sketch below.)
Thanks!
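For reference, a minimal sketch of the letterboxed setup described above (previewLayer is an assumed AVCaptureVideoPreviewLayer already attached to the session):

    // Aspect-fit the 4:3 feed inside the layer; the unused area shows as black bars.
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    // To get the region actually covered by video (i.e. excluding the bars),
    // convert the full metadata rect (0,0)-(1,1) into layer coordinates:
    CGRect videoRect =
        [previewLayer rectForMetadataOutputRectOfInterest:CGRectMake(0, 0, 1, 1)];

    // Switching to full screen later is just a gravity change:
    // previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;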
I am creating a feature where a user can record a video of themselves, and superimposed on this video is a view that displays an image and some text. When they are done recording, I use AVFoundation's composition classes to composite the video and the view (as an image) into one video file, and output this in the next scene in a custom video player.
The problem is that while the view's resolution is crystal clear in the record scene, after the composition (and after the AVAssetExportSession completes) the resulting video's overlaid view quality is not close to the actual view quality. I am converting the view to an image and then setting the contents of an overlay layer to this image's CGImage, which, as I have checked, still has the same quality as the original view. The problem occurs when I apply the composition, and the image becomes blurry. Does anyone have any idea why this might be happening?
If you need to see the code, please feel free to ask! I can also provide screenshots.
Thank you!
This can happen when initializing your UIImage: iOS automatically picks the @2x or @3x image source for you, corresponding to your device.
Say you read the image size using the size property (image.size): it gives you the @1x size, in points. If you then reduce your @2x or @3x image to that @1x size, you get a bad-quality image output because of the JPEG or PNG resize algorithms.
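A sketch of the workaround, assuming the overlay view is snapshotted to an image before the composition (the overlayView/overlayLayer names are placeholders): render at the screen's native scale and keep the layer's contentsScale in sync so Core Animation doesn't downsample to @1x.

    // Snapshot the view at the screen's native scale, not the default 1.0:
    UIGraphicsBeginImageContextWithOptions(overlayView.bounds.size, NO,
                                           [UIScreen mainScreen].scale);
    [overlayView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *overlayImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // image.size is in points; the pixel dimensions are size * scale.
    CALayer *overlayLayer = [CALayer layer];
    overlayLayer.contents      = (__bridge id)overlayImage.CGImage;
    overlayLayer.contentsScale = overlayImage.scale; // avoid implicit downscaling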
I am developing a custom camera in which the camera is set to image capture mode. I need to adjust the zoom level of the camera preview according to the app requirements. The preview currently being displayed is fine; I just need to zoom the current preview out. I searched the internet but didn't find any solution. Please tell me how I can do this.
I am attaching example images for better understanding: the first image is from my camera app, and the second is from the Scanner Pro app, which shows a view with more covered area when I point both apps at the same object from the same distance. My camera view has no space around the page, but the Scanner Pro camera has spacing all around the image. Both cameras are at the same distance from the paper.
I don't know whether you still need this answer. Probably not, but still, for you and everyone else looking:
When you set the session preset, try using AVCaptureSessionPresetPhoto on the capture session. This should resolve the weird zoom issue.
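Something along these lines (a minimal sketch, assuming a standard AVCaptureSession setup):

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        // The photo preset uses the full 4:3 sensor area, so the preview
        // typically shows a wider field of view than the 16:9 video presets.
        session.sessionPreset = AVCaptureSessionPresetPhoto;
    }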
Your preview view is probably spilling over the edge of the screen. Make sure it is a 4:3 aspect ratio and that it doesn’t overflow your screen edges. With that you should see more of your image.
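A sketch of fitting the layer to 4:3, assuming the preview layer is named previewLayer; AVMakeRectWithAspectRatioInsideRect does the aspect-ratio math:

    // Fit a portrait 4:3 rect (width:height = 3:4) inside the screen bounds
    // so the preview doesn't spill over the edges:
    CGRect fitted = AVMakeRectWithAspectRatioInsideRect(CGSizeMake(3, 4),
                                                        self.view.bounds);
    previewLayer.frame = fitted;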
I'm using UIImagePickerController to snap an image and upload it to a server.
When taking a photo with the front camera, the height/width get reversed somewhere.
The image is displayed correctly later, but the height and width are reversed (and I'm using them for the UIImageView Auto Layout constraint).
The thing is, when looking at the UIImagePickerControllerMediaMetadata of front- and back-camera images, the EXIF and the rest of the metadata are the same (the resolution is smaller, but the height/width ratio is the same).
Any ideas what is the difference?
Apple images are always captured landscape-left, and the actual orientation is specified in the EXIF metadata.
OK, so @zaph's comment is correct; apparently back-camera images are "reversed" as well. The upload code on the server (CodeIgniter PHP) ignored the EXIF.
The problem surfaced only with the front camera due to its lower resolution...
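If the server can't be changed to honor EXIF, a common client-side workaround (a sketch, not from the question) is to redraw the image before upload, which bakes the orientation into the pixel data:

    // drawInRect: applies the imageOrientation, so the result is always "up".
    - (UIImage *)normalizedImage:(UIImage *)image {
        if (image.imageOrientation == UIImageOrientationUp) {
            return image; // nothing to do
        }
        UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
        [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
        UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return normalized;
    }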
I have an app that I would like to have video capture for the front-facing camera only. That's no problem. But I would like the video capture to always be in landscape, even when the phone is being held in portrait.
I have a working implementation based on the AVCamDemo code that Apple published, and borrowing from the information in this tech note, I am able to specify the orientation. There's just one trick: while the video frame is oriented correctly, the contents still appear as though shot in portrait.
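For context, the orientation is specified on the capture connection, roughly like this (a sketch based on the tech note's approach; movieFileOutput is an assumed AVCaptureMovieFileOutput):

    AVCaptureConnection *connection =
        [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoOrientationSupported]) {
        connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    }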
I'm wondering if I'm just getting boned by the physical constraints of the hardware: is the image sensor just oriented this way? The referenced tech note above makes this note:
Important: Setting the orientation on a still image output and movie file output doesn't physically rotate the buffers. For the movie file output, it applies a track transform (matrix) to the video track so that the movie is rotated on playback, and for the still image output it inserts EXIF metadata that image viewers use to rotate the image properly when viewing later.
But my playback of that video suggests otherwise. Any insight or suggestions would be appreciated!
Thanks,
Aaron.
To answer your question, yes, the image sensor is just oriented that way. The video camera is an approx 1-megapixel "1080p" camera that has a fixed orientation. The 5MP (or 8MP for 4S, etc) still camera also has a fixed orientation. The lenses themselves don't rotate nor do any of the other camera bits, and hence the feed itself has a fixed orientation.
"But wait!", you say, "pictures I take with the camera app (or API) get rotated correctly. Why is that?" That's cuz iOS takes a look at the orientation of the phone when a picture is taken and stores that information with the picture (as an Exif attachment). Yet video isn't so flagged -- and each frame would have to be individually flagged, and then there's issues about what to do when the user rotates the phone during video....
So, no, you can't ask a video stream or a still image what orientation the phone was in when the video was captured. You can, however, directly ask the phone what orientation it is in now:
UIDeviceOrientation currentOrientation = [UIDevice currentDevice].orientation;
If you do that at the start of video capture (or when you grab a still image from a video feed) you can then use that information to do your own rotation of playback.
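A sketch of that idea (the angle mapping is an assumption, and playerLayer is an assumed AVPlayerLayer; adjust for your camera and mirroring):

    // Record the device orientation once, at the start of capture:
    UIDeviceOrientation capturedOrientation = [UIDevice currentDevice].orientation;

    // Later, at playback time, rotate the player layer to compensate:
    CGFloat angle = 0.0;
    switch (capturedOrientation) {
        case UIDeviceOrientationPortrait:           angle = M_PI_2;  break;
        case UIDeviceOrientationPortraitUpsideDown: angle = -M_PI_2; break;
        case UIDeviceOrientationLandscapeRight:     angle = M_PI;    break;
        default:                                    break; // landscape-left: as shot
    }
    [playerLayer setAffineTransform:CGAffineTransformMakeRotation(angle)];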