So, I have a custom picker controller in which I can select videos and play them. Previously, I was struggling to play slow-motion videos (only non-slow-motion videos would play), but after searching I found the solution here:
How to access NSData/NSURL of slow motion videos using PhotoKit
Now slow-motion videos play, but at normal speed instead of in slow motion. This question relates to the problem:
https://devforums.apple.com/message/903937#903937
I've seen a lot of comments saying they solved the problem by using the Photos framework, but I have no idea how to achieve this, and they didn't explain either. It might have something to do with PHAssetMediaSubtypeVideoHighFrameRate. So, how would it be possible to play slow-motion videos? Do I need to change the fps somehow?
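For what it's worth, the closest I can piece together from those comments is something like the sketch below (untested on my side, and possibly not what they meant; the presenting view controller is just a placeholder):

```swift
import Photos
import AVKit
import UIKit

// Untested sketch: ask PHImageManager for a player item so the slow-motion
// rendering (an AVComposition) is applied, instead of playing the raw
// high-frame-rate file from its URL at normal speed.
func playSlowMotion(asset: PHAsset, from presenter: UIViewController) {
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true
    options.version = .current   // .current keeps the slow-motion edit; .original would drop it

    PHImageManager.default().requestPlayerItem(forVideo: asset, options: options) { item, _ in
        DispatchQueue.main.async {
            guard let item = item else { return }
            let playerController = AVPlayerViewController()
            playerController.player = AVPlayer(playerItem: item)
            presenter.present(playerController, animated: true) {
                playerController.player?.play()
            }
        }
    }
}
```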
I'm currently working on an app that records video of the user and binds particle emitters to hand landmarks on the preview. As a result, the effects are shown on the camera preview, but they aren't captured in the video.
I saw a great tutorial on AVVideoCompositionCoreAnimationTool (https://www.raywenderlich.com/6236502-avfoundation-tutorial-adding-overlays-and-animations-to-videos), but that way it is only possible to render animations onto an already recorded video (roughly the approach sketched below).
I wonder if there is any way to use AVVideoCompositionCoreAnimationTool to record video from the camera together with the animations in real time, or if you know of another method to do it without diving deep into Metal and so on.
Thanks in advance!
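For reference, this is roughly the post-export approach from that tutorial, which is what I'd like to avoid (a simplified sketch; overlayLayer stands in for whatever CALayer holds the emitters, and the video's preferredTransform is ignored here):

```swift
import AVFoundation
import UIKit

// Simplified sketch of the post-recording approach: the overlay layer is
// composited onto the finished video during export, not while the camera runs.
func export(asset: AVAsset, overlay overlayLayer: CALayer,
            to outputURL: URL, completion: @escaping () -> Void) {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let size = track.naturalSize

    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    overlayLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else { return }
    session.videoComposition = composition
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously(completionHandler: completion)
}
```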
I am building an app that streams video content, something like TikTok: you can swipe through videos in a table, and when a new cell becomes visible its video starts playing. It works great, except when you compare it to TikTok, Instagram, etc. My video starts streaming pretty fast, but not always; it is very sensitive to network quality, and sometimes, even when the network is great, it still buffers for too long. In the same conditions, TikTok, Instagram, and the rest don't seem to have that problem. I am using JWPlayer as the video hosting service and AVPlayer as the player, and I am also doing an async preload of assets before assigning them to the player item (a simplified sketch is below).

So my question is: what else can I do to speed up video start? Do I need to do some special video preparation before uploading it to the streaming service? (I also stream .m3u8 files.) Is there some set of presets that enables optimum streaming quality and start speed? Thanks in advance.
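The preload, simplified (not my exact code):

```swift
import AVFoundation

// Load the keys the player item needs before handing the asset over,
// so playback can start sooner once the item is assigned.
func preload(url: URL, completion: @escaping (AVPlayerItem) -> Void) {
    let asset = AVURLAsset(url: url)
    let keys = ["playable", "duration"]
    asset.loadValuesAsynchronously(forKeys: keys) {
        DispatchQueue.main.async {
            completion(AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: keys))
        }
    }
}
```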
So there are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so try to use it as much as possible for iOS devices.
The best practice when it comes to mobile streaming is offering multiple resolutions. The trick is to start with the lowest resolution available to get the video started, then switch to a higher resolution once the connection is determined to be capable of it. Generally this can be done quickly enough that the user doesn't really notice. YouTube is the best example of this tactic. HLS does this automatically; I'm not sure about plain .m3u8 setups.
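If you're driving playback with AVPlayer, one way to approximate that "start low, then climb" behaviour yourself is to cap the bitrate at startup and lift the cap once playback is running. A rough sketch (the 500 kbps cap and the 3-second delay are just illustrative values):

```swift
import AVFoundation

// Rough sketch: cap the bitrate HLS is allowed to pick at startup,
// then remove the cap so the ABR ladder can climb normally.
func startCapped(player: AVPlayer, url: URL) {
    let item = AVPlayerItem(url: url)
    item.preferredPeakBitRate = 500_000   // illustrative low cap for a fast first frame
    player.replaceCurrentItem(with: item)
    player.play()

    DispatchQueue.main.asyncAfter(deadline: .now() + 3) {
        item.preferredPeakBitRate = 0      // 0 means "no artificial limit"
    }
}
```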
Assuming you are using a UICollectionView or UITableView, try to start low-resolution streams of every video on screen in the background every time scrolling stops. Not only does this allow you to do some cool preview stuff based off the buffer, but when the user taps a video it is already established. If that's too slow, try just the middle video.
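A sketch of what that warm-up could look like, called from scrollViewDidEndDecelerating(_:); playerProvider is a hypothetical lookup into whatever player cache you keep:

```swift
import UIKit
import AVFoundation

// Sketch: warm up the players for the visible rows so the buffer is already
// established before a cell autoplays or the user taps it.
func warmUpVisibleVideos(in tableView: UITableView,
                         playerProvider: (IndexPath) -> AVPlayer) {
    for indexPath in tableView.indexPathsForVisibleRows ?? [] {
        let player = playerProvider(indexPath)   // hypothetical cache/lookup
        player.isMuted = true
        // preroll buffers media ahead of time; a later play() starts almost immediately.
        player.preroll(atRate: 1.0, completionHandler: nil)
    }
}
```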
Edit the video in the background before upload so that it is only at the maximum resolution you expect it to be played at. No iOS device has a 4K screen resolution, and probably never will, so cut down the amount of data.
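A simple way to do that on-device is an export session with a fixed-size preset, roughly like this sketch (the 720p preset and .mp4 container are just examples):

```swift
import AVFoundation

// Sketch: re-export a clip at 720p before uploading, so the streaming service
// never has to serve more pixels than the device screens need.
func downscale(asset: AVAsset, to outputURL: URL, completion: @escaping (Bool) -> Void) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPreset1280x720) else {
        completion(false)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.shouldOptimizeForNetworkUse = true   // moves the moov atom up front for a faster streaming start
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}
```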
Without more specifics, this is all I've got for now. I hope I understood your question correctly. Good luck!
I am trying to detect 30 targets and play a different video for each target. The problem I am facing is that after about 20 images, the videos somehow don't play (I get a cross sign).
Also, is there some way to load only the video that is required to play and unload the others, so that the videos don't hang after some time?
Any help or pointers in the right direction are highly appreciated.
I am recording videos and playing them back using AVFoundation. Everything is perfect except for the hissing that is present throughout the video. You can hear this hissing in every video captured from any iPad. Even videos captured with Apple's built-in camera app have it.
To hear it clearly, record a video in a place as quiet as possible without saying anything. It can be detected very easily through headphones with the volume at maximum.
After researching, I found out that this hissing is produced by the device's preamplifier and cannot be avoided while recording.
The only possible solution is to remove it during post-processing of the audio. This kind of high-frequency noise can be reduced with a low-pass filter and noise gates. There are applications like Adobe Audition which can perform this operation. This video shows how it is achieved using Adobe Audition.
I have searched the Apple docs and found nothing that can achieve this directly. So I want to know if there is any library, API, or open source project that can perform this operation. If not, how can I start going in the right direction, because it does look like a complex task?
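The closest building block I can see in the system frameworks is AVAudioUnitEQ, which can be configured as a low-pass filter, but I'm not sure it's the right direction. A rough, untested sketch (the 8 kHz cutoff is arbitrary, and there is no noise gate here):

```swift
import AVFoundation

// Rough, untested sketch: play a recorded file through a single low-pass EQ band.
func playFiltered(fileURL: URL) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()

    let eq = AVAudioUnitEQ(numberOfBands: 1)
    let band = eq.bands[0]
    band.filterType = .lowPass
    band.frequency = 8_000       // attenuate content above ~8 kHz, where hiss sits
    band.bypass = false

    let file = try AVAudioFile(forReading: fileURL)
    engine.attach(player)
    engine.attach(eq)
    engine.connect(player, to: eq, format: file.processingFormat)
    engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil)
    try engine.start()
    player.play()
    return engine                // caller must keep a strong reference while playing
}
```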
I'd like to set a video as the background of a view. I found multiple solutions, each one with significant drawbacks, like:
Using MPMoviePlayerController works OK. The video has the best quality; however, it uses a lot of CPU (~50% on my Mac). I didn't dare test it on my phone.
Converting the video to a .gif and displaying it with a UIWebView. This would be a great solution, but I couldn't make a high-quality gif. The video has lots of movement, which is not suited to gif files.
Is there a better solution? I'd appreciate any help. Thanks.
I did that before using AVPlayer, as far back as iOS 4 (a long time ago), and it didn't cause any noticeable effect on performance.
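A minimal sketch of that approach, in case it helps (looping is handled with a did-play-to-end observer, and the video URL is whatever clip you bundle):

```swift
import UIKit
import AVFoundation

// Sketch: an AVPlayerLayer pinned behind the view's content, looping a muted clip.
final class VideoBackgroundView: UIView {
    private let player: AVPlayer
    private let playerLayer: AVPlayerLayer

    init(frame: CGRect, videoURL: URL) {
        player = AVPlayer(url: videoURL)
        player.isMuted = true
        playerLayer = AVPlayerLayer(player: player)
        super.init(frame: frame)

        playerLayer.frame = bounds
        playerLayer.videoGravity = .resizeAspectFill
        layer.insertSublayer(playerLayer, at: 0)   // keep it behind any subviews

        // Loop by seeking back to the start whenever the item finishes.
        NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                               object: player.currentItem,
                                               queue: .main) { [weak self] _ in
            self?.player.seek(to: .zero)
            self?.player.play()
        }
        player.play()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
    }
}
```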