Vuforia video playback video detection issue - ios

I am trying to detect 30 targets and play a different video for each target. The problem I am facing is that after about 20 targets, the videos don't play (I get a cross sign instead).
Also, is there a way to load only the video that currently needs to play and unload the others, so that playback doesn't hang after some time?
Any help or pointers in the right direction are highly appreciated.

Related

Fast video stream start

I am building an app that streams video content, something like TikTok. You can swipe through videos in a table, and when a new cell becomes visible its video starts playing. It works great, except when you compare it to TikTok, Instagram, etc. My video starts streaming pretty fast, but not always; it is very sensitive to network quality, and sometimes even when the network is great it still buffers too long. Under the same conditions, TikTok and Instagram don't seem to have that problem. I am using JWPlayer as the video hosting service and AVPlayer as the player. I am also doing an async preload of assets before assigning them to the AVPlayerItem. So my question is: what else can I do to speed up video start? Do I need to do some special video preparation before uploading to the streaming service? (I stream .m3u8 files.) Is there a set of presets that enables optimal streaming quality and start speed? Thanks in advance.
So there are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so use it wherever possible for iOS devices.
The best practice in mobile streaming is to offer multiple resolutions. The trick is to start with the lowest resolution available to get the video started, then switch to a higher resolution once the connection is determined to be capable of it. Generally this happens quickly enough that the user doesn't really notice; YouTube is the best example of this tactic. HLS does this automatically (an .m3u8 playlist is simply the HLS manifest format).
Assuming you are using a UICollectionView or UITableView, try starting low-resolution streams of every video on the screen in the background each time scrolling stops. Not only does this let you do some cool preview features based off the buffer, but when the user taps a video the stream is already established. If that's too slow, try just the middle video.
Edit the video in the background before upload so it is no larger than the maximum resolution you expect it to be played at. No iOS device has a 4K screen, so cut down the amount of data accordingly.
Without more specifics, this is all I've got for now. Hope I understood your question correctly. Good luck!
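The "start low, switch up" tactic above can be sketched in code. This is a minimal illustration, not a production implementation: the bitrate tiers and the half-throughput rule are assumptions chosen for the example, and in real player code you would assign the result to `AVPlayerItem.preferredPeakBitRate` as noted in the comments.

```swift
import Foundation

// Hypothetical helper: pick a conservative startup bitrate cap (bits/sec)
// from a measured throughput estimate, so playback can begin on a low
// rendition and ramp up. Tier values here are illustrative, not a real ladder.
func startupBitrateCap(measuredBitsPerSecond: Double) -> Double {
    let tiers: [Double] = [400_000, 800_000, 1_500_000, 3_000_000]
    // Aim at roughly half the measured throughput so the first segments
    // download faster than real time; adaptive switching can raise it later.
    let target = measuredBitsPerSecond * 0.5
    return tiers.last(where: { $0 <= target }) ?? tiers[0]
}

// In the player code you would then cap the first item, e.g.:
//   let item = AVPlayerItem(asset: asset)
//   item.preferredPeakBitRate = startupBitrateCap(measuredBitsPerSecond: est)
//   item.preferredForwardBufferDuration = 2   // start once ~2 s is buffered
```

Lowering `preferredForwardBufferDuration` trades rebuffering risk for a faster start, so tune both numbers against your own network measurements.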

Playing slow motion videos from camera make it loses slow motion effect

So, I have a custom picker controller in which I can select videos and play them. Previously, I was struggling to play slow-motion videos at all (only non-slow-motion videos would play), but after searching I found the solution here:
How to access NSData/NSURL of slow motion videos using PhotoKit
Now slow-motion videos play, but at normal speed instead of in slow motion. This question relates to the same problem:
https://devforums.apple.com/message/903937#903937
I've seen a lot of comments saying they solved the problem by using the Photos framework, but I have no idea how to achieve this, and they didn't explain either. It might have something to do with PHAssetMediaSubtypeVideoHighFrameRate. So, how is it possible to play slow-motion videos correctly? Do I need to change the fps somehow?
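A sketch of the Photos-framework approach the comments refer to, under the assumption that the slow-motion timing is stored as an edit on the asset: requesting a player item with `version = .current` keeps that edit applied (you get an AVComposition with the time-mapped slow section), whereas extracting the raw file URL gives you the plain high-frame-rate video played at normal speed.

```swift
import Photos
import AVFoundation

// Request a playable item for a PHAsset while preserving the
// slow-motion effect. The function name is just for illustration.
func slowMotionPlayerItem(for asset: PHAsset,
                          completion: @escaping (AVPlayerItem?) -> Void) {
    let options = PHVideoRequestOptions()
    options.version = .current          // keep the slow-motion edit applied
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true  // allow iCloud fetch if needed

    PHImageManager.default().requestPlayerItem(forVideo: asset,
                                               options: options) { item, _ in
        DispatchQueue.main.async { completion(item) }
    }
}
```

You would then hand the resulting AVPlayerItem to an AVPlayer as usual; no manual fps change should be needed.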

Removing low frequency (hiss) noise from video in iOS

I am recording videos and playing them back using AVFoundation. Everything is perfect except for the hissing present throughout the whole video. You can hear this hissing in every video captured from any iPad; even videos captured with Apple's built-in camera app have it.
To hear it clearly, record a video in a place as quiet as possible without saying anything. It can easily be detected through headphones with the volume at maximum.
After researching, I found out that this hissing is produced by the device's preamplifier and cannot be avoided while recording.
The only possible solution is to remove it during post-processing of the audio. This kind of noise can be reduced with filtering and noise gates: low-frequency rumble is removed with a high-pass filter, while broadband hiss is usually handled with a noise gate or spectral noise reduction. There are applications and software like Adobe Audition which can perform this operation; this video shows how it is achieved using Adobe Audition.
I have searched the Apple docs and found nothing which can achieve this directly. So I want to know if there exists any library, API, or open-source project which can perform this operation. If not, how can I start going in the right direction? It does look like a complex task.
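As a starting point, the noise-gate idea mentioned above is simple enough to sketch in plain Swift: windows of samples whose level falls below a threshold are muted, which silences constant low-level hiss between louder passages. The threshold and window size here are illustrative values, and in a real app you would run something like this on buffers obtained from AVAudioEngine or an offline AVAudioFile read rather than on a plain array.

```swift
import Foundation

// Minimal noise-gate sketch: mute any window whose RMS level is
// below the threshold. Parameters are illustrative, not tuned.
func noiseGate(_ samples: [Float],
               threshold: Float = 0.02,
               windowSize: Int = 256) -> [Float] {
    var output = samples
    var start = 0
    while start < samples.count {
        let end = min(start + windowSize, samples.count)
        let window = samples[start..<end]
        // Root-mean-square level of this window
        let rms = (window.map { $0 * $0 }.reduce(0, +) / Float(window.count))
            .squareRoot()
        if rms < threshold {
            for i in start..<end { output[i] = 0 }   // gate closed: mute
        }
        start = end
    }
    return output
}
```

A hard gate like this clicks at window boundaries; production code would ramp the gain smoothly (attack/release) or use spectral subtraction instead.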

How to screen record the iOS simulator at 60 fps?

It turns out that capturing video from the screen is a hard task on the Mac. I have a small game running in the simulator and want to make a screencast of the gameplay for YouTube. Since it's a fast-paced scroller game, the video must be recorded at 60 fps to look good.
I know the final video on YouTube, for example, is only 24 to 30 fps, but each of those slower frames is blended with the next.
When capturing the simulator at a frame rate lower than 60 fps, the result looks very jerky, since every frame is razor sharp with no blending.
I tried a couple of Mac screen recorders, but none of them were able to capture 60 fps video from the simulator, and the frames in the resulting video looked as if the app had taken a pile of screenshots and stitched them together into a video container.
But since there are great demo videos on YouTube showing fast-paced gameplay of iOS apps that clearly weren't filmed with a video camera, I wonder what kind of application is used to get such a smooth screen capture.
Hopefully someone who has already been through this problem can point out some solutions.
I've had good results screen recording from the simulator using Snapz Pro X from Ambrosia Software:
http://www.ambrosiasw.com/utilities/snapzprox/
One problem that you're likely to have is that the simulator only simulates iOS's OpenGL graphics in software, so unless you have a really powerful Mac, it's likely that the simulator won't be able to run your game at 60fps anyway.
It's possible that the videos you've seen used the HDMI video out on the iPhone to mirror the screen from the device into a video capture card on the computer. That would likely perform much better because the Mac wouldn't have to both generate and record the graphics simultaneously.
I remember watching a video of the Aquaria guys talking about how they recorded their gameplay videos. Essentially, the game recorded the input from the controller/keyboard while it was played normally. Then they could play the same game back one frame at a time, with each frame rendered out to a file as it went. All those frames are then composited together and bam: a full 60 fps video with perfectly rendered graphics. A bit overkill, but it's a nice solution.
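The record-and-replay idea can be sketched as follows: log timestamped inputs during live play, then re-run the simulation with a fixed 1/60 s step, feeding each recorded input back at its frame, so every frame can be rendered to disk at its own pace. The types and the toy "jump/scroll" physics below are hypothetical; the point is only that a fixed-timestep replay is deterministic.

```swift
import Foundation

// One logged input event, keyed by the fixed-timestep frame it occurred on.
struct RecordedInput {
    let frame: Int        // frame index at which the input occurred
    let action: String    // e.g. "jump", "left"
}

struct ReplaySimulation {
    private(set) var position = 0
    private var inputsByFrame: [Int: [String]]

    init(inputs: [RecordedInput]) {
        inputsByFrame = Dictionary(grouping: inputs, by: { $0.frame })
            .mapValues { $0.map { $0.action } }
    }

    // Advance exactly one fixed 1/60 s step. Given the same input log,
    // every replay produces identical frames, so each frame can be
    // rendered to a file before stepping again.
    mutating func step(frame: Int) {
        for action in inputsByFrame[frame] ?? [] {
            if action == "jump" { position += 10 }
        }
        position += 1   // toy physics: constant scroll per frame
    }
}
```

After the replay finishes, the per-frame images are assembled into a 60 fps video with any encoder, independent of how fast the Mac could render them live.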
A program that is able to record at 60 fps is Screenflick.

Is there any lib that recognize object from camera?

I'm currently working on a project, part of which needs some level of AI. The end user points the iOS camera at a pre-recorded video; the screen is very large, so they can only frame part of the entire video. They can move their iPhone to shoot any part of the video, and the app should automatically recognize what they are aiming at and fire a match event.
To sum up:
Is there a library that can recognize a predefined object in a video source?
I've heard of people getting OpenCV to compile on the iPhone. I'm not quite clear on what you will be able to do with it, but I would certainly check it out.
Yes, OpenCV:
http://www.eosgarden.com/en/opensource/opencv-ios/overview/
http://www.youtube.com/watch?v=xzVXyrIRm30&feature=player_detailpage
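To make the OpenCV suggestion concrete: one of its basic tools for "find this predefined object in a frame" is template matching (`cv::matchTemplate`), which slides a small template image over the frame and scores each position. The following is a naive pure-Swift sketch of that idea using sum-of-squared-differences on grayscale pixel grids; it is an illustration of the technique, not a substitute for OpenCV's optimized (and more robust) matchers.

```swift
import Foundation

// Slide `template` over `frame` (both grayscale pixel grids) and return
// the position with the lowest sum-of-squared-differences score.
// Lower score = better match; an exact match scores 0.
func bestMatch(frame: [[Double]],
               template: [[Double]]) -> (row: Int, col: Int, score: Double)? {
    let fh = frame.count, fw = frame.first?.count ?? 0
    let th = template.count, tw = template.first?.count ?? 0
    guard th > 0, tw > 0, th <= fh, tw <= fw else { return nil }

    var best: (Int, Int, Double)? = nil
    for r in 0...(fh - th) {
        for c in 0...(fw - tw) {
            var ssd = 0.0
            for i in 0..<th {
                for j in 0..<tw {
                    let d = frame[r + i][c + j] - template[i][j]
                    ssd += d * d
                }
            }
            if best == nil || ssd < best!.2 { best = (r, c, ssd) }
        }
    }
    return best.map { (row: $0.0, col: $0.1, score: $0.2) }
}
```

For the "aim at part of a large screen" use case, plain template matching is brittle under scale and perspective changes; OpenCV's feature-based matchers (e.g. ORB keypoints with a homography check) handle that much better.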