YouTube API not loading YouTubePlayerView on newer Android device - youtube-api

The YouTube API works fine when I test it on an older Android device, but on a newer device it won't load.
Yes, the device does have the YouTube app updated.
I'm getting this error:
Caused by: java.lang.IllegalArgumentException: The concrete class implementing IObjectWrapper must have exactly *one* declared private field for the wrapped object. Preferably, this is an instance of the ObjectWrapper<T> class.
at aolk.a(SourceFile:13)
at com.google.android.youtube.api.jar.client.RemoteEmbeddedPlayer.<init>(SourceFile:42)
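For reference, I initialize the player in the standard way for the (now deprecated) YouTube Android Player API. The sketch below is a simplified version of that setup, not my exact code; the activity name, layout IDs, API key, and video ID are placeholders:
```java
import android.os.Bundle;
import com.google.android.youtube.player.YouTubeBaseActivity;
import com.google.android.youtube.player.YouTubeInitializationResult;
import com.google.android.youtube.player.YouTubePlayer;
import com.google.android.youtube.player.YouTubePlayerView;

public class PlayerActivity extends YouTubeBaseActivity
        implements YouTubePlayer.OnInitializedListener {

    // Placeholder developer key; the real one comes from the Google API Console.
    private static final String API_KEY = "YOUR_API_KEY";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // placeholder layout containing a YouTubePlayerView

        // initialize() is the call that appears to trigger the crash above on the newer device.
        YouTubePlayerView playerView =
                (YouTubePlayerView) findViewById(R.id.youtube_player_view);
        playerView.initialize(API_KEY, this);
    }

    @Override
    public void onInitializationSuccess(YouTubePlayer.Provider provider,
                                        YouTubePlayer player, boolean wasRestored) {
        if (!wasRestored) {
            player.cueVideo("VIDEO_ID"); // placeholder video ID
        }
    }

    @Override
    public void onInitializationFailure(YouTubePlayer.Provider provider,
                                        YouTubeInitializationResult errorReason) {
        // Handle the failure, e.g. log errorReason or show its error dialog.
    }
}
```
On the older device this initializes and plays normally; on the newer device the exception above is thrown before the player appears.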

Related

Integrate MobilePay in Xamarin app (iOS version)

I am creating a Xamarin project in which I want to offer MobilePay as a payment option. I have checked many references and videos about the implementation, and I have completed the Android version (using a deep link), but I can't manage to do the same for the iOS version.
Can you give me some idea of how to pass data to and receive data from the app delegate (that is, between a controller and the AppDelegate file)?
I have implemented it for Android and iOS using platform-specific code.
The image below shows the response I get in the Android version after completing the payment process in MobilePay.
For the iOS implementation I use UIApplication (from the UIKit package), and it also works, but when App Center generates the Android APK build (after I commit the code) I get the error below, so the change affects the Android build.
I don't know how to pass data from a controller to the AppDelegate file and how to use the response data coming back from the payment response.

OpenTok Web SDK on iOS: session.publish reloads page

I have set up a video-call app using the Vonage Video API and Ionic. I am using the Web SDK. Everything works perfectly in the browser and on Android devices, but I have one problem on iOS devices: after creating a publisher and calling session.publish, my app reloads instantly. The callback from session.publish does not get called; the reload happens before that. All I see is a pending "ClientEvent" XHR call that never gets resolved.
Before you ask, the user has camera and microphone permissions.
I solved this by using the VP8 video codec instead of H264. It's an ongoing issue in iOS 15.1.

Google Cast plugin for Unity can't stream Video clip

I'm trying to play a video inside Unity and stream it to a Google Cast device.
Google provides a plugin that enables the connection to a Cast device, and it works fine once a correct Cast App ID is given.
Unity recently added a VideoPlayer component that enables video playback on mobile devices. I tried to use the two together to stream video content to the Cast device, but when I play the video, the app stops responding with a SIGABRT signal at
reinterpret_cast<PInvokeFunc>(_native_GCKUnityRenderRemoteDisplay)();
I also tried playing the video with the AVPro plugin, but the same issue appeared.
The Cast plugin works just fine without a video, and its last update was in April 2016, so I think it has an issue with Unity's newer VideoPlayer component.
Is there something I can do about it?
There are currently no plans to update the Google Cast Unity plugin.

Android Marshmallow stuttering on background streaming (Buffers?)

I'm developing an application that lets you keep receiving YouTube video and audio in the background (like a browser does), using the YouTube Player API for iframe Embeds inside a WebView.
When you're watching a video and turn the screen off with the power button, everything is still fine, but when I press the home button from inside the application, the audio plays correctly for a few seconds and then begins to stutter.
I tried hosting the WebView in a service, even with a floating overlay window, and everything works, but the same thing happens when I press the home button: audio and video start stuttering (in this case you can see the video too). If I later return to the application, everything starts working well again. It seems the app needs to stay in the foreground to work properly.
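For reference, the service variant hosts the WebView in an overlay window roughly like this (a simplified sketch, not my exact code; the class name, embed page, and permission handling are placeholders):
```java
import android.app.Service;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.os.IBinder;
import android.view.WindowManager;
import android.webkit.WebChromeClient;
import android.webkit.WebSettings;
import android.webkit.WebView;

public class FloatingPlayerService extends Service {

    private WindowManager windowManager;
    private WebView webView;

    @Override
    public void onCreate() {
        super.onCreate();

        webView = new WebView(this);
        WebSettings settings = webView.getSettings();
        settings.setJavaScriptEnabled(true);                 // required by the IFrame Player API
        settings.setMediaPlaybackRequiresUserGesture(false); // let the embed start playback from JS
        webView.setWebChromeClient(new WebChromeClient());

        // Placeholder local page that contains the IFrame player embed.
        webView.loadUrl("file:///android_asset/player.html");

        // Requires SYSTEM_ALERT_WINDOW; on 6.0+ the user must also grant the
        // "draw over other apps" permission in Settings.
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_PHONE,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);

        windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
        windowManager.addView(webView, params);
    }

    @Override
    public void onDestroy() {
        if (webView != null) {
            windowManager.removeView(webView);
            webView.destroy();
        }
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started service, not bound
    }
}
```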
I've been trying different ideas and searching Google for a week, but I don't know whether this has a solution. The problem occurs on Android 6.0; Android 5.0.2 works properly.
Here is the logcat for Android 5.0.2 when I press the home button:
W/cr_media: calling MediaCodec.release()
W/cr_media: calling MediaCodec.release()
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
W/linker: libaricentomxplugin.so has text relocations. This is wasting memory and prevents security hardening. Please fix.
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/OMXNodeInstance: OMX_GetExtensionIndex OMX.google.android.index.storeMetaDataInBuffers failed
E/ACodec: [OMX.google.vp9.decoder] storeMetaDataInBuffers failed w/ err -2147483648
E/OMXNodeInstance: getParameter(1868562439) ERROR: 0x8000101a
And the logcat for Android 6.0:
E/Surface: getSlotFromBufferLocked: unknown buffer: 0xb8b429e0
D/AudioManager: AudioManager dispatching onAudioFocusChange(1) for android.media.AudioManager#7f0c8a8com.mzrsoftware.android.youparrot.WebViewerYoutube$3#898cfc1
W/cr_media: calling MediaCodec.release()
W/cr_media: calling MediaCodec.release()
W/OpenGLRenderer: Fail to change FontRenderer cache size, it already initialized
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/OMXMaster: Failed to get omx plugin handle
D/AudioManager: AudioManager dispatching onAudioFocusChange(-1) for android.media.AudioManager#7f0c8a8com.mzrsoftware.android.youparrot.WebViewerYoutube$3#898cfc1
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/OMXMaster: Failed to get omx plugin handle
W/OMXNodeInstance: [1:google.vp9.decoder] component does not support metadata mode; using fallback
E/ACodec: [OMX.google.vp9.decoder] storeMetaDataInBuffers failed w/ err -1010 / is-streaming 1
E/OMXNodeInstance: getParameter(1:google.vp9.decoder, ParamVideoAndroidVp8Encoder(0x6f600007)) ERROR: UnsupportedIndex(0x8000101a)
E/Surface: getSlotFromBufferLocked: unknown buffer: 0xb8e1a248
-----(Here audio, video or both start stuttering)-----
W/OpenGLRenderer: Fail to change FontRenderer cache size, it already initialized
E/Surface: getSlotFromBufferLocked: unknown buffer: 0xb8ed9ff0
Looking at the logcat and at what I found online, it seems to be a problem specific to Android 6.0 involving buffers (a buffer underrun), which may have been fixed in Android 6.0.1, but I cannot verify that.
Is there any solution for this? Thanks!
I have observed a similar issue with YouTube embedded videos on Android L (5.1.1), but in my case the problem was with video only (in Chrome and the native browser); audio worked fine. When I played YouTube embedded videos on some websites, the video area stayed green. After investigating, I found the issue was an incorrect sanity check on buffers in libstagefright (see https://android-review.googlesource.com/#/c/178013/). I am developing a custom ROM, so I fixed it in the Android OS code, and now both audio and video work without issue.
As a workaround, enabling "Media Source API" in Chrome via chrome://flags makes it work without changing the OS.

Is NSPortMessage in the iOS API?

I am trying to write a demo about run loops, following the Threading Programming Guide.
When I implement NSPortDelegate's - (void)handlePortMessage:(NSPortMessage *)portMessage; method, I get this error:
Receiver type 'NSPortMessage' for instance message is a forward declaration
So I tried importing "Foundation/NSPortMessage.h", after which it says:
Foundation/NSPortMessage.h file not found.
So I wonder: can we use NSPortMessage on iOS?
NSPortMessage doesn't seem to be in the iOS documentation, so it is presumably a private API. Xcode does code-complete NSPortMessage for me when I try to use it, presumably because of the forward declaration. However, if I ask Xcode to show me the definition of NSPortMessage, it says the symbol is not found, which would confirm that it is a private API.
The class exists in the Objective-C runtime on my iPhone 4S, so it is on the device. However, NSPortMessage allows inter-process communication, which I assume would conflict with iOS sandboxing. Perhaps it will work for inter-thread communication, though.
I certainly wouldn't try to use it in an app intended for the App Store.
Since the iOS version of the Distributed Objects programming guide and certain related APIs are also deprecated, it seems that everything related to port-based input sources is discouraged for iOS development. However, setting up a port-based channel to communicate between threads using Core Foundation functions is still an available option.
