Playing an AVAsset while simultaneously adding other assets to the end of the video - iOS

I'm aware of how to merge videos together using an AVMutableComposition (lots of good posts on this. See https://github.com/ChrisGrant/AVFoundation-Video-Stitching for example)
But is there any way to play an asset that is still being merged? Something like one thread playing the video while another actively appends new assets to the end?
I want to be able to pull multiple assets from a server and merge them into an uninterrupted viewing experience (a feed? Similar to Snapchat, I guess) without the user having to wait for all of them to load or having to worry about potential memory issues.
I've been looking at the documentation but can't seem to find a solution. I'd appreciate any input, thanks.
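One way to approximate play-while-appending without mutating a composition mid-playback is to serve the clips as an HLS "event" playlist: the player keeps re-fetching the playlist while it plays, so segments appended to the end are picked up seamlessly. A minimal server-side sketch in Node; `buildEventPlaylist` is a hypothetical helper and the segment names and durations are made up:

```javascript
// Build an HLS "event" playlist that can grow while it is being played.
// A player re-polls an EVENT playlist until it sees #EXT-X-ENDLIST.
function buildEventPlaylist(segments, targetDuration, finished) {
  const lines = [
    '#EXTM3U',
    '#EXT-X-VERSION:3',
    '#EXT-X-PLAYLIST-TYPE:EVENT',
    `#EXT-X-TARGETDURATION:${targetDuration}`,
    '#EXT-X-MEDIA-SEQUENCE:0',
  ];
  for (const seg of segments) {
    lines.push(`#EXTINF:${seg.duration.toFixed(3)},`);
    lines.push(seg.uri);
  }
  if (finished) lines.push('#EXT-X-ENDLIST'); // signals no more segments
  return lines.join('\n') + '\n';
}

// As new clips arrive from the server, regenerate the playlist with the new
// segments appended; the player picks them up on its next poll.
const playlist = buildEventPlaylist(
  [{ uri: 'clip0.ts', duration: 6.0 }, { uri: 'clip1.ts', duration: 5.2 }],
  7,
  false
);
```

On the client side an ordinary AVPlayer pointed at the .m3u8 URL handles the refreshing itself, so no merging thread is needed in the app at all.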

Related

Play streamed video in an Ionic 3/Cordova iOS app from Node.js, without long loading times

First, sorry because this is not a specific code issue (I can play videos on the iPhone), so I'm not attaching any of my code; I'm asking for a technical solution.
I'm developing a mobile application (and also a web app) that plays videos served from a Node.js server. At first I noticed that in Safari you can only play videos via streaming (which is also the best practice in the rest of the browsers), but it was very slow (the videos took a long time to load).
I came across this piece of code, and the author's post, which helped me improve my server-side streaming code:
https://github.com/davidgatti/How-to-Stream-Movies-using-NodeJS/blob/master/routes/video.js
I didn't need to change anything in the web app, and now I can play videos much faster in Mac/Safari (in HTML5 I just use simple tags).
But nothing changed in the Ionic app... and I don't know how to proceed or where the problem could be (Ionic/Cordova or Node.js).
What could I be missing? Any link, known bug in Ionic, or trick would help a lot.
UPDATE:
I'm testing with .mov and .mp4 video files. What's the ideal format (or compression) for iPhone?
UPDATE 2:
Is it a good choice to handle videos with a cloud video solution like Ustream and embed them as an iframe (the solution Ustream provides)? Nothing else seems to improve loading times while managing videos on my own server and Ionic client.
Thanks so much
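For reference, the core of the linked routes/video.js approach is honoring the HTTP `Range` header so Safari/iOS can request the file in chunks; a server that ignores `Range` tends to stall in Safari. A minimal sketch of just the byte-range math, with hypothetical names:

```javascript
// Parse a "Range: bytes=start-end" header against a file of `size` bytes
// and return the byte slice to serve plus the headers for a 206 response.
function rangeResponse(rangeHeader, size) {
  const m = /bytes=(\d*)-(\d*)/.exec(rangeHeader || '');
  if (!m) return null; // no Range header: serve the whole file with 200
  const start = m[1] ? parseInt(m[1], 10) : 0;
  const end = m[2] ? parseInt(m[2], 10) : size - 1; // open-ended: to EOF
  return {
    start,
    end,
    headers: {
      'Content-Range': `bytes ${start}-${end}/${size}`,
      'Accept-Ranges': 'bytes',
      'Content-Length': end - start + 1,
      'Content-Type': 'video/mp4',
    },
  };
}
```

In an Express handler you would then pipe `fs.createReadStream(path, { start, end })` with status 206; iOS players typically probe first with a tiny range request (e.g. `bytes=0-1`) before asking for larger slices.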

Stream multiple media sources from a single software/hardware encoder?

It's been a while since I first started looking into this and I still haven't found any feasible solutions; here's hoping someone might have some suggestions/ideas...
The situation: We currently have a couple of live streams carrying mixed-source content (some of the streams are file playlists that get modified to change the files in them, while others are live video streamed directly from an input). For each new live stream we usually just end up setting up a new streamer... it feels rather counterproductive and wasteful.
The question: Does a hardware or software solution exist (Linux or Windows) that would allow live streaming of multiple independent file playlists (say, two) and optionally one or two live A/V inputs from the same encoder?
According to my findings, with the help of the FFmpeg library it is possible to stream multiple live A/V inputs and even stream file playlists... but it requires too much hacking to get working, and the playlists have to be redone by hand and restarted every time changes are made. This might work for me personally, but it won't do for less tech-savvy people...
I'm basically looking for a way to reduce the computer hardware needed, instead of letting it grow with each addition of a new live streaming source/destination.
Thank you for all your input and all the posted solutions. By sheer luck I found the solution I was originally looking for.
For anyone else looking for this or a similar solution, the combination of systems that can meet our unusual requirements (and that can be integrated into our existing workflow by adjusting the hardware/software to our needs, instead of us adjusting to the hardware/software's requirements and limitations) is: Sorenson Squeeze Server 3.0, MediaExcel Hero Live, and MediaExcel File.
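For anyone who does want to try the FFmpeg route mentioned in the question, the playlist hack usually means the concat demuxer: a text file listing clips, streamed to an RTMP (or other) output, one ffmpeg process per playlist/output pair. A hedged sketch of building that invocation (the file names and RTMP URL are placeholders); note it has exactly the limitation described above, in that editing the playlist still requires a restart:

```javascript
// Contents of the concat-demuxer list file: one "file '<path>'" line per clip.
function concatPlaylist(files) {
  return files.map((f) => `file '${f}'`).join('\n') + '\n';
}

// Argument list for an ffmpeg process that streams the playlist to one output.
function ffmpegStreamArgs(playlistPath, rtmpUrl) {
  return [
    '-re',            // read input at its native frame rate (live pacing)
    '-f', 'concat',
    '-safe', '0',     // allow absolute paths in the list file
    '-i', playlistPath,
    '-c', 'copy',     // no re-encode; all clips must share codecs/parameters
    '-f', 'flv', rtmpUrl,
  ];
}
// e.g. spawn('ffmpeg', ffmpegStreamArgs('playlist.txt', 'rtmp://server/live/key'))
```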

Creating Videos in a Ruby On Rails app?

I'm looking for a way of taking an audio asset and a JPEG from my app, both stored on Amazon S3, and merging them into a video in a format that YouTube will accept. Basically, the song with static album artwork for its duration.
I've seen this post that refers to Ming-Ruby (Create videos programatically?), but I'm wondering whether the gem is active and maintained, or whether anyone else has experience doing something similar and can offer some tips or alternative approaches.
You could look into using ffmpeg. Here are some examples of creating a video slideshow from images: http://ffmpeg.org/trac/ffmpeg/wiki/Create%20a%20video%20slideshow%20from%20images I'm sure you can find info out there on how to add in audio as well. At that point you should just need to find an ffmpeg gem to wrap the calls you need to make, or work with it directly via system calls (be careful with that, though).
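The still-image-plus-audio case is simpler than a full slideshow. A hedged sketch of the ffmpeg argument list, shown here in Node for concreteness (the same arguments work from a Ruby system call); the file names are placeholders:

```javascript
// Argument list for ffmpeg to turn one still image plus an audio track into
// an H.264/AAC video of the kind YouTube accepts.
function stillImageVideoArgs(image, audio, out) {
  return [
    '-loop', '1',          // repeat the single image for every frame
    '-i', image,
    '-i', audio,
    '-c:v', 'libx264',
    '-tune', 'stillimage', // x264 preset tuned for static pictures
    '-c:a', 'aac',
    '-pix_fmt', 'yuv420p', // widest player compatibility
    '-shortest',           // stop encoding when the audio ends
    out,
  ];
}
// e.g. spawn('ffmpeg', stillImageVideoArgs('cover.jpg', 'song.mp3', 'out.mp4'))
```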

What's the easiest way to merge video, audio and image files?

We are planning a Web App for a Hackathon that's happening in about 2 weeks.
The app basic functions are:
The users are guided step-by-step to upload a video, audio and image.
The image is used as a cover for the audio, turning it into a video file.
The two video files are merged thus creating a single video from the initial three files.
So, my problems are:
How do you create a video from an audio track with an image as its "cover"?
How do you merge these two videos?
We are thinking of using Heroku for deployment. Is there a way to do it using something like stremio?
What would be the best approach? A VPS running a C++ script? What's the easiest way to do it?
FFmpeg would be a good start, as seen here:
https://stackoverflow.com/a/6087453/1258001
FFmpeg can be found at http://ffmpeg.org/
Another option, which may be overkill, would be Blender 3D: it could also produce similar results, can be controlled via shell commands, and may be more flexible for complex asset-composition needs.
In any case you're going to want a server that can run heavy rendering processes, which will require a large amount of RAM and CPU. It may be a good choice to go with a render farm that uses the GPU as the main rendering processor, as that will give you more bang for your buck, but it can be very difficult to set up and keep running correctly. I would also say a VPS would not be a good choice for this. Note that the kind of resources you're going to need also happen to be the most expensive in terms of web server costs. Best of luck, and please update with your results.
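For the merge step specifically, one ffmpeg approach is the concat filter, which re-encodes, so the two inputs (the user's upload and the generated cover video) don't need matching codecs. A hedged sketch of the argument list (file names are placeholders; it assumes both inputs already share the same resolution, otherwise a scale filter is needed first):

```javascript
// ffmpeg arguments to merge two videos back to back using the concat filter.
// [0:v][0:a] and [1:v][1:a] are the video/audio streams of the two inputs;
// concat joins them into one output video [v] and one audio [a] stream.
function mergeArgs(a, b, out) {
  return [
    '-i', a,
    '-i', b,
    '-filter_complex',
    '[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]',
    '-map', '[v]', '-map', '[a]',
    out,
  ];
}
// e.g. spawn('ffmpeg', mergeArgs('cover.mp4', 'upload.mp4', 'final.mp4'))
```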

Progressive Video Download on iOS

I am trying to implement progressive downloading of a video in my iOS application so that it can be played through AVPlayer. I have already implemented a downloader module that can download the files to the iPad. However, I have discovered that I cannot play a file that is still being written to.
So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through each file as it becomes ready (i.e. downloaded), probably using HLS.
Searching around, I have come across this question, which implements progressive download through HLS, but other than that I can find no other way.
However, I keep coming across search results explaining how to configure web servers to leverage iOS support for HTTP progressive downloading, with no mention of how to do it from the iOS side.
So, does anyone have any ideas and/or experience with this?
EDIT: I have also found there could be a way of doing it the other way around (i.e. streaming, then writing the streamed data to disk), as suggested by this question, but I still cannot get it to work, as it seems it does not work with non-local assets!
From what you say, you might want to change your approach and attempt to stream the file. Downloading and playing at the same time is, I would say, the definition of streaming. I hate when people post links to the Apple documentation, but in this instance reading a tiny bit of this documentation will help you more than I ever can. It should all make sense if you are already working with connections and video; you just need to change your approach.
The link: https://developer.apple.com/library/ios/documentation/networkinginternet/conceptual/streamingmediaguide/Introduction/Introduction.html
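On the server side, the usual way to produce the chunk list for streaming is ffmpeg's HLS muxer, which emits an .m3u8 playlist plus MPEG-TS segments that AVPlayer can start playing before the whole file is available. A hedged sketch of the invocation (names are placeholders):

```javascript
// ffmpeg arguments to split one file into ~10 s MPEG-TS segments plus an
// .m3u8 playlist, e.g. 'video.m3u8' alongside video0.ts, video1.ts, ...
function hlsSegmentArgs(input, playlist) {
  return [
    '-i', input,
    '-c', 'copy',          // keep the original codecs; no re-encode
    '-f', 'hls',
    '-hls_time', '10',     // target segment duration in seconds
    '-hls_list_size', '0', // keep every segment in the playlist (VOD-style)
    playlist,
  ];
}
// e.g. spawn('ffmpeg', hlsSegmentArgs('movie.mp4', 'video.m3u8'))
```

The app then just points AVPlayer at the playlist URL and lets the system handle fetching each chunk, instead of managing a download-then-play pipeline itself.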
