I'm trying to play a video inside Unity and stream it to Google Cast.
Google provides a plugin that enables the connection to a Cast device, and it works fine once a correct Cast App ID is given.
Unity recently added a VideoPlayer component that enables video playback on mobile devices, and I tried to use both of them to stream video content to the Cast device. But when I play the video, the app crashes with a SIGABRT at
reinterpret_cast<PInvokeFunc>(_native_GCKUnityRenderRemoteDisplay)();
I also tried playing the video with the AVPro plugin, but the same issue appeared.
The Cast plugin works just fine when no video is playing, and its last update was in April 2016, so I think the plugin has a compatibility issue with Unity's recent VideoPlayer component.
Is there something I can do about it?
There are currently no plans to update the Google Cast Unity plugin.
I have set up a video-call app using the Vonage Video API and Ionic. I am using the web SDK. Everything works perfectly in the browser and on Android devices, but I have one problem on iOS devices: after creating a publisher and calling session.publish, my app reloads instantly. The callback from session.publish does not get called; the reload happens before that. All I see is a pending "ClientEvent" XHR call that never gets resolved.
Before you ask, the user has camera and microphone permissions.
I solved this by using the VP8 video codec instead of the H264 one. It's an ongoing issue in iOS 15.1.
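If you want to detect this condition at runtime rather than only switching the project codec, opentok.js exposes OT.getSupportedCodecs(); below is a minimal diagnostic sketch (the warning text is illustrative, not part of the original fix):

declare const OT: any; // global provided by opentok.js

// Check what the current browser/WebView can encode before publishing.
async function checkCodecs(): Promise<void> {
  const { videoEncoders, videoDecoders } = await OT.getSupportedCodecs();
  if (!videoEncoders.includes('H264')) {
    console.warn('No H.264 encoder available here; use a VP8 project.');
  }
  console.log('encoders:', videoEncoders, 'decoders:', videoDecoders);
}

The codec itself is switched per project in the Vonage account dashboard rather than in client code.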
I'm trying to gain access to a live stream through the RTSP protocol on iOS. I'm trying to run the example from this website: http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html and it's advertised that you can just take the URL (rtsp://) and paste it into QuickTime Player, VLC or some other player, but whenever I try it fails. When I try in QuickTime Player it gives me this error: The document “Macintosh HD” could not be opened. The file may be damaged or may not be a movie file that is compatible with QuickTime Player.
What am I doing wrong? Is the example broken, or do I need to update something in the code? I'm running iOS 9.3, and it's said to work on iOS 7.0 and later.
I was able to play this back on VLC when compiling and running on my iOS device. You need to ensure that you are on WiFi (vs LTE or 3G). I'm on iOS 9.2.1 and played back with VLC version 2.2.2.
You can then take it a step further: I was successful in ingesting it into Wowza via a Stream file with the following configuration:
{
    "uri": "rtsp://[rtsp-address-as-published-on-the-app]",
    "streamTimeout": 12000,
    "reconnectWaitTime": 12000,
    "rtpTransportMode": "udp",
    "rtspValidationFrequency": 15000,
    "rtspFilterUnknownTracks": true,
    "rtspStreamAudioTrack": false,
    "rtspStreamVideoTrack": true,
    "rtpDebugSession": true,
    "rtspSessionTimeout": 12000,
    "rtspConnectionTimeout": 12000
}
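For reference, a configuration like this normally lives in a .stream file under Wowza's content directory; that is what the Stream file workflow reads (adjust paths to your install).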
I would suggest reviewing what the console logs say for your iOS application in Xcode, and also taking a look at your VLC error messages/logs, to see what the exact issue is when you try to play it back.
I'm developing an application that keeps receiving YouTube video and audio in the background (like a browser does), using the YouTube Player API for iframe Embeds in a WebView.
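The embed itself is the standard iframe API bootstrap, roughly like this (the element id and videoId are placeholders):

declare const YT: any; // global provided by the iframe API script

// Load the iframe API, then attach a player to a placeholder <div id="player">.
const tag = document.createElement('script');
tag.src = 'https://www.youtube.com/iframe_api';
document.head.appendChild(tag);

let player: any;
function onYouTubeIframeAPIReady(): void { // invoked by the API once loaded
  player = new YT.Player('player', {
    videoId: 'VIDEO_ID', // placeholder
    events: { onReady: (e: any) => e.target.playVideo() },
  });
}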
When you're watching a video and turn off the screen with the power button, everything is still fine, but when I press the home button to leave the application, the audio plays perfectly for a few seconds and then begins stuttering.
I tried implementing the WebView in a service, even with a floating overlay window, and everything works, but the same thing happens when I press the home button: the audio and video start stuttering (in this case you can see the video too). If I then return to the application, everything starts working well again. It seems the app needs to stay in the foreground to work properly.
I've been trying different ideas and searching Google for a week, but I don't know whether this has a solution. The error occurs on Android 6.0; Android 5.0.2 works properly.
Here is the logcat for Android 5.0.2 when I press the home button:
W/cr_media: calling MediaCodec.release()
W/cr_media: calling MediaCodec.release()
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
W/linker: libaricentomxplugin.so has text relocations. This is wasting memory and prevents security hardening. Please fix.
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/OMXNodeInstance: OMX_GetExtensionIndex OMX.google.android.index.storeMetaDataInBuffers failed
E/ACodec: [OMX.google.vp9.decoder] storeMetaDataInBuffers failed w/ err -2147483648
E/OMXNodeInstance: getParameter(1868562439) ERROR: 0x8000101a
And the logcat for Android 6.0:
E/Surface: getSlotFromBufferLocked: unknown buffer: 0xb8b429e0
D/AudioManager: AudioManager dispatching onAudioFocusChange(1) for android.media.AudioManager#7f0c8a8com.mzrsoftware.android.youparrot.WebViewerYoutube$3#898cfc1
W/cr_media: calling MediaCodec.release()
W/cr_media: calling MediaCodec.release()
W/OpenGLRenderer: Fail to change FontRenderer cache size, it already initialized
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/OMXMaster: Failed to get omx plugin handle
D/AudioManager: AudioManager dispatching onAudioFocusChange(-1) for android.media.AudioManager#7f0c8a8com.mzrsoftware.android.youparrot.WebViewerYoutube$3#898cfc1
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/OMXMaster: Failed to get omx plugin handle
W/OMXNodeInstance: [1:google.vp9.decoder] component does not support metadata mode; using fallback
E/ACodec: [OMX.google.vp9.decoder] storeMetaDataInBuffers failed w/ err -1010 / is-streaming 1
E/OMXNodeInstance: getParameter(1:google.vp9.decoder, ParamVideoAndroidVp8Encoder(0x6f600007)) ERROR: UnsupportedIndex(0x8000101a)
E/Surface: getSlotFromBufferLocked: unknown buffer: 0xb8e1a248
-----(Here audio, video or both start stuttering)-----
W/OpenGLRenderer: Fail to change FontRenderer cache size, it already initialized
E/Surface: getSlotFromBufferLocked: unknown buffer: 0xb8ed9ff0
Looking at the logcat and at what I found online, it seems to be a buffer problem (buffer underrun) specific to Android 6.0, which may have been fixed in Android 6.0.1, but I cannot verify that.
Is there any solution for this? Thanks!
I have observed a similar issue with YouTube embedded videos on Android L (5.1.1), but in my case the issue was with video only in Chrome/the native browser; audio was working fine. When I played YouTube embedded videos on some websites, the video screen remained "green". After investigating, I found that the issue was an incorrect sanity check on buffers in libstagefright (see https://android-review.googlesource.com/#/c/178013/). I am developing a custom ROM, so I fixed it in the Android OS code, and now audio and video work without issue.
As a workaround, in Chrome, if I enable "Media Source API" from chrome://flags, it works without making changes to the OS.
I'm developing a Cordova application that uses TokBox for conference calls, via the following Cordova plugin: https://github.com/songz/cordova-plugin-opentok.
With a little head-scratching here and there, everything works fine now, except for one thing:
I can't change the audio volume on iPad and iPhone.
On Android, using the volume keys works; the volume slider on the iPad and the volume buttons on the iPhone do nothing.
Does anyone know how to get it working, or does it work for you without extra effort?
What I've tried so far:
I looked through the opentok.js library and noticed that setting the audio volume programmatically is not implemented.
I checked the issue tracker for the OpenTok plugin to see whether this is a known issue.
I tried several Cordova libraries that promise to allow changing the volume of the HTML5 audio/video tag (see the sketch below).
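For clarity, that HTML5-level attempt boils down to something like this (the selector is hypothetical; on iOS the volume property of media elements stays under hardware control, so script changes are ignored):

// Illustrative attempt: adjust the <video> element the OpenTok
// subscriber renders into. Works on Android; silently ignored on iOS.
const video = document.querySelector('#subscriber video') as HTMLVideoElement | null;
if (video) {
  video.volume = 0.5; // no effect on iOS
}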
Thanks,
Lif
The key functionality of the app would be 1) recording short videos (typically 20-30 sec), 2) playing the videos 1-5 times right after shooting them (slow motion and pausing are a must) and 3) drawing over the videos, i.e. I'd need an additional data layer on top of the raw video.
I've been studying the HTML5 app platforms (PhoneGap, Titanium) because I'd like to minimize writing native iOS code, but it seems that neither recording nor showing embedded video works on these platforms. The record-play-edit process is pretty simple, but it needs to be super smooth and fast.
If you want to use JS/HTML5 and then generate the app with e.g. PhoneGap, one option could be a custom PhoneGap plugin built for media capture, with HTML5 used for the app logic.
Objective-C Media Capture:
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5
Example Phonegap plugin for Audio Capture:
https://github.com/purplecabbage/phonegap-plugins/tree/master/iPhone/AudioRecord
More info about PhoneGap plugin creation for iOS can be found in the PhoneGap wiki.
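For the JavaScript side, later Cordova releases ship a stock Media Capture plugin; below is a minimal sketch of its API (playRecordedClip is a placeholder for your playback/drawing logic; the 30-second limit mirrors the clip length described above):

declare function playRecordedClip(path: string): void; // placeholder for the playback layer

// Record one clip of up to 30 seconds with the Cordova Media Capture
// plugin (navigator.device.capture is provided by the plugin at runtime).
document.addEventListener('deviceready', () => {
  const capture = (navigator as any).device.capture;
  capture.captureVideo(
    (files: any[]) => playRecordedClip(files[0].fullPath),
    (err: any) => console.error('capture failed, code:', err.code),
    { limit: 1, duration: 30 } // duration is in seconds
  );
});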