Prevent recording vertical videos in app - iOS

I am creating an iOS app that allows users to take photos and record videos. I would like to block the recording of "vertical" videos, i.e. video recorded in portrait orientation. I couldn't find any software libraries that implement this functionality, so I guess I will have to implement it myself.
I am using UIImagePickerController and I tried to achieve that using cameraOverlayView, but I don't believe it can be done that way.
So is there any way to solve this?
Thanks

Actually, videos are always recorded in landscape-right regardless of the device orientation, because that's how the sensor is oriented in the hardware (although you can request rotated buffers in AVFoundation). However, a flag stored in the video's metadata describes the device's orientation during recording, and this is used during playback to rotate the content. See AVAssetTrack's preferredTransform property.
If you don't want your video to appear rotated, just ignore this transform during playback.
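For example, here is a minimal Swift sketch (the function name is just for illustration) that inspects the first video track's preferredTransform to tell whether a clip was recorded in portrait:

    import AVFoundation

    // Portrait recordings are stored as landscape frames plus a 90°/270°
    // rotation in the track's preferredTransform, which swaps the axes.
    func isPortrait(videoAt url: URL) -> Bool {
        let asset = AVURLAsset(url: url)
        guard let track = asset.tracks(withMediaType: .video).first else {
            return false
        }
        let t = track.preferredTransform
        return t.a == 0 && t.d == 0 && abs(t.b) == 1 && abs(t.c) == 1
    }

You could run the same check on a freshly recorded file to reject portrait clips before saving them.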

Fast video stream start

I am building an app that streams video content, something like TikTok. You can swipe videos in a table, and when a new cell becomes visible its video starts playing. It works great, except when you compare it to TikTok or Instagram etc. My video starts streaming pretty fast, but not always; it is very sensitive to network quality, and sometimes even when the network is great it still buffers too long. Compared to TikTok or Instagram under the same conditions, they don't seem to have that problem. I am using JWPlayer as the video hosting service and AVPlayer as the player. I am also doing async preloading of assets before assigning them to the AVPlayerItem. So my question is: what else can I do to speed up video start? Do I need to do some special video preparation before uploading to the streaming service? (I also stream .m3u8 files.) Is there some set of presets that enables optimal streaming quality and start speed? Thanks in advance.
So there are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so use it wherever possible for iOS devices.
The best practice for mobile streaming is to offer multiple resolutions. The trick is to start with the lowest resolution available to get the video playing, then switch to a higher resolution once the connection proves capable of it. Generally this happens so quickly that the user doesn't really notice. YouTube is the best example of this tactic. HLS does this automatically through variant playlists; note that .m3u8 is simply the playlist format HLS uses, not a separate protocol.
Assuming you are using a UICollectionView or UITableView, try starting low-resolution streams of every video on screen in the background whenever scrolling stops. Not only does this let you do some nice preview work from the buffer, but when the user taps a video the stream is already established. If that's too slow, try just the middle video.
Edit the video in the background before upload so it is no larger than the maximum resolution you expect it to be played at. No iOS device has a 4K screen and probably never will, so cut down the amount of data.
Without more specifics, this is all I have for now. Hope I understood your question correctly. Good luck!
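As a concrete starting point, here is a minimal AVFoundation sketch of the "start low, ramp up" idea (the function name and the numeric values are assumptions to tune, not recommendations):

    import AVFoundation

    func makeFastStartPlayer(for url: URL) -> AVPlayer {
        let item = AVPlayerItem(url: url)
        // Keep the forward buffer small so playback can begin sooner.
        item.preferredForwardBufferDuration = 2
        // Cap the peak bitrate so the player picks a low variant first;
        // reset it to 0 (no cap) after playback starts so ABR can ramp up.
        item.preferredPeakBitRate = 1_000_000

        let player = AVPlayer(playerItem: item)
        // Begin as soon as enough is buffered instead of waiting for a
        // stall-free guarantee.
        player.automaticallyWaitsToMinimizeStalling = false
        return player
    }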

Can iPhone 5, 6 or 6+ take a PICTURE with both cameras at the same time?

I found some answers about using the front and back cameras at the same time for audio/video recording, which is impossible.
In detail here:
Can the iPhone4 record from both front and rear-facing camera at the same time?
However, is it possible to use both cameras at the same time to take pictures on iOS?
No, this is definitely not possible, I'm afraid.
Only one camera session can be used at a time when using AVCaptureSession (the lower-level API for camera interaction on iOS).
If you try to run multiple sessions (one per camera), then as soon as one session begins, the other will stop.
You could quickly alternate between sessions, as sketched below, but the images will not be taken simultaneously.
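A minimal Swift sketch of that alternation idea, using modern AVFoundation names and simplified error handling (the function name is illustrative):

    import AVFoundation

    func switchCamera(on session: AVCaptureSession,
                      to position: AVCaptureDevice.Position) {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Drop the current inputs (a real app would leave audio inputs alone).
        for input in session.inputs {
            session.removeInput(input)
        }

        // Attach the requested camera, if the device has one.
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)
    }

Capturing a still, switching, and capturing again will always leave a noticeable gap between the two images.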

YouTube API and Samsung SmartTV app resolutions

I'm working on a Smart TV app for Samsung which should use the YouTube API to play videos. Embedded videos only work when the app resolution and YouTube player size are 960x540 or below;
if I set a higher resolution (1280x720 or 1920x1080), the player gets stuck, behaves really slowly, and videos buffer indefinitely.
Has anyone succeeded in embedding YouTube videos with a higher-resolution player?
Thanks in advance.
The video player works at Full HD resolution in fullscreen regardless of the widget resolution.
If you have trouble with buffering, check your connection speed. Try playing a file from the local network to verify that the selected resolution and codecs are handled well by the TV.
I recently ran into this case. The YouTube app works great at 720p resolution if the video is shorter than 10 minutes, but for anything longer, say 30 minutes, the player gets stuck just as you said.
When I change the app resolution to 540p, the YouTube player works great again for all videos. I suppose YouTube uses progressive download in its player, and the Smart TV's own storage is not large enough to buffer a long video rendered at 720p.
The conclusion is that when using the Flash player/YouTube in apps, it is best to use a 540p app resolution.
Thanks all for answering.
In the end I used a different approach, which turned out to be the best solution.
I used 720p resolution together with YouTube's cue-video functionality.
Basically, I cued the video, and on the "videoCued" event I called the "playVideo" method.
This allowed the player to get ready and initialize before playing the video.

Change iPhone Camera Shutter Sound in App

How can I change the sound that plays when an image is captured on an iPhone? I am using AVCapture, and I want to capture still images (rather than grabbing frames from video) for the sake of image quality.
Thanks in advance!
After researching, it appears that the standard camera shutter sound is next to impossible to change via the SDK.
This answer supports that: you can replace the sound file on the device itself, but not from within an app.
As for AVCapture, it also appears that you can't change the shutter sound, since capturing images covertly is against App Store policy. See this answer.
The only way to take a silent picture is to grab a frame from the video stream, which you said you don't want to do for image quality reasons.
So AFAIK, there is no way to change the shutter sound without resorting to video frame grabs (sketched below).
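For reference, a minimal sketch of that frame-grab workaround (the class name is illustrative; the AVCaptureSession and AVCaptureVideoDataOutput wiring is omitted). Set an instance as the output's sample buffer delegate and read latestImage when the user taps your shutter button:

    import AVFoundation
    import CoreImage
    import UIKit

    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private let context = CIContext()
        var latestImage: UIImage?

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // Convert the current video frame to a UIImage; no shutter
            // sound plays because no photo capture API is involved.
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
            latestImage = UIImage(cgImage: cgImage)
        }
    }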

AV Foundation camera preview layer gets zoomed in, how to zoom out?

The application I am currently working on has, as its main functionality, continuous scanning of QR/bar codes using the ZXing library (http://code.google.com/p/zxing/). For continuous frame capture I initialize the AVCaptureSession, AVCaptureVideoDataOutput, and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html.
My problem is that when I run the camera preview, the image I see through the video device is much larger (1.5x) than the image seen through the still camera of the iPhone. Our customer needs to hold the iPhone about 5 cm from the barcode when scanning, but at that distance the whole QR code isn't visible and decoding fails.
Why does the video camera on the iPhone 4 enlarge the image (as seen through the AVCaptureVideoPreviewLayer)?
This is a function of the AVCaptureSession video preset, accessible by using the .sessionPreset property. For example, after configuring your captureSession, but before starting it, you would add
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
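In Swift the equivalent would be something like this (guarded with canSetSessionPreset, since not every device supports every preset):

    if captureSession.canSetSessionPreset(.photo) {
        captureSession.sessionPreset = .photo
    }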
See the documentation here:
iOS Reference Document
The default preset for video is 1280x720 (I think) which is a lower resolution than the max supported by the camera. By using the "Photo" preset, you're getting the raw camera data.
You see the same behaviour with the built-in iPhone Camera app. Switch between still and video capture modes and you'll notice that the default zoom level changes. You see a wider view in still mode, whereas video mode zooms in a bit.
My guess is that continuous video capture needs to use a smaller area of the camera sensor to work optimally. If it used the whole sensor perhaps the system couldn't sustain 30 fps. Using a smaller area of the sensor gives the effect of "zooming in" to the scene.
I am answering my own question again. This was not answered even on the Apple Developer forums, so I filed a technical support request with Apple directly, and they replied that this is a known issue that will be fixed in a future release. So there is nothing we can do but wait and see.
