I want to stream video from a local server to the iPhone.
I only get a link from the web service, and the video can have any extension.
The video can be in any format,
e.g. @"avi", @"wmv", @"rmvb", @"flv", @"f4v", @"swf", @"mkv", @"dat", @"vob", @"mts", @"ogg", @"mpg", @"wma".
So which player is better for my app:
1) MPMoviePlayerController or
2) AVPlayer?
Please help me.
From the MPMoviePlayerController docs:
Supported Formats
This class supports any movie or audio files that already play correctly on an iPod or iPhone. This includes both streamed content and fixed-length files. For movie files, this typically means files with the extensions .mov, .mp4, .mpv, and .3gp and using one of the following compression standards:
H.264 Baseline Profile Level 3.0 video, up to 640 x 480 at 30 fps. (The Baseline profile does not support B frames.)
MPEG-4 Part 2 video (Simple Profile)
If you use this class to play audio files, it displays a white screen with a QuickTime logo while the audio plays. For audio files, this class supports AAC-LC audio at up to 48 kHz, and MP3 (MPEG-1 Audio Layer 3) up to 48 kHz, stereo audio.
You will have to use a third-party library for the formats you mention.
The built-in media player won't support any of those formats.
Your only real option is a third-party library like VLCKit. I've never used it myself, but it likely supports the formats you require:
https://wiki.videolan.org/VLCKit/
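For what it's worth, playback with MobileVLCKit usually only takes a few lines. This is an untested sketch (it assumes the MobileVLCKit pod and a plain UIView to render into; check the current VLCKit headers, as the API changes between releases):

    #import <MobileVLCKit/MobileVLCKit.h>

    // Keep a strong reference to the player, e.g. in a property.
    VLCMediaPlayer *player = [[VLCMediaPlayer alloc] init];

    // Render the video into an existing view.
    player.drawable = self.videoView;   // assumption: a UIView in your controller

    // VLC decodes the container/codec itself, so .avi, .mkv, .flv etc. should work.
    player.media = [VLCMedia mediaWithURL:
                       [NSURL URLWithString:@"http://example.com/movie.mkv"]];
    [player play];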
Though I've never tried it myself, I am sure you will get help from this Apple documentation.
There is a nice discussion here about your problem. Sorry for not giving a direct answer. Hope this helps. :)
The AVPlayer SDK documentation may be helpful for you, but for your requirements you will have to go with a third-party library or your own custom implementation.
Related
My app needs to play some music files, like .mp3. I would like to use MPMoviePlayerController because it has all the UI implemented for me, i.e. I do not want to bother implementing a progress slider and things like that.
I tested it with an .mp3 file and it worked fine, but I do not know if it is appropriate to use it for this, because its name says "movie player" and it seems to be intended for playing movies. Would Apple reject this? Thank you.
For playing audio from a file or memory, AVAudioPlayer is your best option, but unfortunately it doesn't support network streams, while MPMoviePlayerController does.
From the documentation:
An instance of the AVAudioPlayer class, called an audio player,
provides playback of audio data from a file or memory.
Apple recommends that you use this class for audio playback unless you
are playing audio captured from a network stream or require very low
I/O latency.
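If the file is local, the AVAudioPlayer set-up is only a few lines. A rough sketch, assuming a track.mp3 bundled with the app:

    #import <AVFoundation/AVFoundation.h>

    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"track" withExtension:@"mp3"];

    NSError *error = nil;
    // Keep a strong reference (e.g. a property), otherwise playback stops immediately.
    AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL
                                                                        error:&error];
    [audioPlayer prepareToPlay];
    [audioPlayer play];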
For the Apple validation, I don't think your application can be rejected for using the Media Player framework to play an audio file. In fact, here they explicitly say that you can do just that:
Choose the right technology for your needs:
To play the audio items in a user’s iPod library, or to play local or
streamed movies, use the Media Player framework. Classes in this
framework automatically support sending audio and video to AirPlay
devices such as Apple TV.
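If you stay with the Media Player framework for its built-in controls, the usual pattern looks roughly like this (a sketch assuming the same bundled track.mp3; I have not tested it with audio specifically):

    #import <MediaPlayer/MediaPlayer.h>

    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"track" withExtension:@"mp3"];

    // Keep a strong reference to the controller, e.g. in a property.
    MPMoviePlayerController *moviePlayer =
        [[MPMoviePlayerController alloc] initWithContentURL:fileURL];
    moviePlayer.movieSourceType = MPMovieSourceTypeFile;
    moviePlayer.view.frame = self.view.bounds;
    [self.view addSubview:moviePlayer.view];   // shows the standard transport controls
    [moviePlayer prepareToPlay];
    [moviePlayer play];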
Not sure about performance and memory issues though!
Best of luck.
Could you help us with the following problem, related to playback of a DRM (Widevine) encrypted video stream over AirPlay?
When we tried to play the video from an iPhone to an Apple TV via AirPlay, a "failed to load content" error was shown on the TV screen. We are not sure if that is the correct behaviour. We think it is, because for encrypted video playback we cannot use AirPlay, as it would transport the raw unencrypted stream, right?
So far we have found that the only possible solution is showing the video on the iPhone while playing the audio on the Apple TV; it seems the DRM restriction does not apply to audio.
Could you confirm the above description? Could you give us some advice?
We also found the following information on the Internet (note that we are not using Brightcove, but the principle should be the same): http://support.brightcove.com/en/video-cloud/docs/widevine-plugin-brightcove-video-cloud-player-sdk-ios
Try WVUseEncryptedLoopback (set it to @1). It enables AirPlay support by securing the AirPlay stream. 1 enables encrypted loopback; 0 is the default.
Also, enable WVPlayerDrivenAdaptationKey (set it to @1) to switch between Apple native player adaptation (1) and Widevine adaptation (0).
Widevine version: 6.0.0.12792
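In case it helps, this is roughly how we understand those keys would be applied: they go into the settings dictionary handed to the Widevine SDK at initialization. Key names are taken verbatim from the Brightcove document above, and the exact initialization call depends on the SDK version, so treat this only as a sketch:

    // Sketch only: settings dictionary passed to the Widevine SDK at start-up.
    // Key names are the ones quoted from the Brightcove documentation above;
    // keep the rest of your existing Widevine settings unchanged.
    NSMutableDictionary *settings = [NSMutableDictionary dictionary];
    settings[@"WVUseEncryptedLoopback"]      = @1;  // secure the AirPlay loopback stream
    settings[@"WVPlayerDrivenAdaptationKey"] = @1;  // Apple native player adaptation
    // ... pass `settings` to your Widevine / plugin initialization call ...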
I have an AVSpeechSynthesizer which converts text to speech, but I've encountered a problem.
I don't know how to save the audio it generates to a music file, which I would quite like to be able to do!
So here's my question: how do you save the AVSpeechSynthesizer output? And if that isn't possible, can I use AVFoundation, Core Media or another public API to capture the speaker output before it is actually played?
Thanks!
Unfortunately no, there is no public API available to capture the speaker output, and looking over the docs for AVSpeechSynthesizer and related classes, I don't see a way to capture any audio from it. You may want to look at third-party libraries to help with this.
Related questions:
Recording audio output only from speaker of iphone excluding microphone
Text-to-speech libraries for iPhone
I'm writing an iOS app which should record video (using the front camera) and audio of the user working with the app. Later I want to analyse the user's behaviour offline. This app should run on an iPad 3.
Remark: the observed users will be people from my office. Code and data are only needed for the development process and won't be included in the final app.
My requirements: video and audio should be uncompressed; at least the audio must be uncompressed. I think uncompressed video recording without skipping frames is not possible on an iPad (see: where can i find an uncompressed video recording from iPhone 3G/3GS/4), but uncompressed audio is possible.
Here are my questions:
Is it possible to record video (compressed) and audio (uncompressed / kAudioFormatLinearPCM) simultaneously?
Is it possible to save the video and audio in separate files?
If the answer to either question is yes, what should I change in the AVCam example (http://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html) to solve my problem? :-)
Thank you all in advance!
The AVCam sample code isn't flexible enough to do what you want. You need to use AVAssetWriter to write out the media. I'm not 100% sure about the uncompressed audio bit, but the VideoSnake sample code from WWDC 2012 session 520 is a great place to start with AVAssetWriter. I can't speak to performance, but you could have two AVAssetWriters, one for video and one for audio; just modify that code to vend the sample buffers to the appropriate writer.
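Something along these lines, though untested: one writer per file, with LPCM output settings on the audio side (file names, dimensions and sample rate are just placeholders):

    #import <AVFoundation/AVFoundation.h>

    NSURL *documentsURL =
        [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                inDomains:NSUserDomainMask] lastObject];
    NSURL *videoURL = [documentsURL URLByAppendingPathComponent:@"capture.mov"];
    NSURL *audioURL = [documentsURL URLByAppendingPathComponent:@"capture.caf"];

    NSError *error = nil;

    // Writer #1: compressed (H.264) video into a QuickTime movie.
    AVAssetWriter *videoWriter =
        [AVAssetWriter assetWriterWithURL:videoURL
                                 fileType:AVFileTypeQuickTimeMovie
                                    error:&error];
    AVAssetWriterInput *videoInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:@{
            AVVideoCodecKey  : AVVideoCodecH264,
            AVVideoWidthKey  : @640,
            AVVideoHeightKey : @480 }];
    videoInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoInput];

    // Writer #2: uncompressed (linear PCM) audio into a .caf file.
    AVAssetWriter *audioWriter =
        [AVAssetWriter assetWriterWithURL:audioURL
                                 fileType:AVFileTypeCoreAudioFormat
                                    error:&error];
    AVAssetWriterInput *audioInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:@{
            AVFormatIDKey               : @(kAudioFormatLinearPCM),
            AVSampleRateKey             : @44100,
            AVNumberOfChannelsKey       : @1,
            AVLinearPCMBitDepthKey      : @16,
            AVLinearPCMIsFloatKey       : @NO,
            AVLinearPCMIsBigEndianKey   : @NO,
            AVLinearPCMIsNonInterleaved : @NO }];
    audioInput.expectsMediaDataInRealTime = YES;
    [audioWriter addInput:audioInput];

    // In captureOutput:didOutputSampleBuffer:fromConnection: (as in VideoSnake),
    // call -startWriting / -startSessionAtSourceTime: on each writer with the
    // first buffer's timestamp, then append buffers to the matching input:
    //   if (videoInput.isReadyForMoreMediaData) [videoInput appendSampleBuffer:sampleBuffer];
    //   if (audioInput.isReadyForMoreMediaData) [audioInput appendSampleBuffer:sampleBuffer];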
I am trying to play a video file with the .mov extension, but it does not play in my MPMoviePlayerController. I need to play this type of file in my application. Is it possible to play a video file without using MPMoviePlayerController or UIWebView?
Can anyone please suggest a solution?
Thanks in advance.
I've never used it directly before, but take a look at the AVPlayer Class Reference.
Under the "Overview", you'll see:
You use an AVPlayer object to implement controllers and user
interfaces for single- or multiple-item playback. The multiple-item
case supports advanced behaviors.
If you look at the AV Foundation Programming Guide (linked here), you'll see that Apple builds the Media Player framework on top of AV Foundation.
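A bare-bones sketch of that approach (assuming movieURL points at your .mov file and the code runs inside a view controller):

    #import <AVFoundation/AVFoundation.h>

    // Keep strong references to the player (and layer) in properties.
    AVPlayer *player = [AVPlayer playerWithURL:movieURL];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer:playerLayer];

    [player play];

Note that AVPlayer gives you no built-in controls; you draw your own UI on top of the AVPlayerLayer.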