I have a LiveBroadcast and have set a scheduledStartTime in the future. As far as I can tell from testing, this time has no effect on the overall state of the LiveBroadcast, i.e. if the broadcast has a lifeCycleState of ready/testing, it does not transition to live at the time set as the scheduledStartTime.
Can anyone tell me what effect scheduledStartTime has on the LiveBroadcast (or any other entity)?
Cheers
Deepak
Here's an answer from the official docs, in the liveBroadcasts resource documentation:
snippet.scheduledStartTime datetime The date and time that the
broadcast is scheduled to start. The value is specified in ISO 8601
(YYYY-MM-DDThh:mm:ss.sZ) format.
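For reference, a timestamp in that exact format can be produced like this (a minimal Python sketch; the field name and format come from the docs quoted above, and note that setting scheduledStartTime by itself does not transition the broadcast, as the question observed):

```python
from datetime import datetime, timezone

def to_youtube_datetime(dt):
    """Format an aware datetime as ISO 8601 with milliseconds and a Z suffix,
    i.e. the YYYY-MM-DDThh:mm:ss.sZ form the LiveBroadcast resource expects."""
    utc = dt.astimezone(timezone.utc)
    # %f gives microseconds; trim to milliseconds and append Z.
    return utc.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"

# Example: a start time 30 minutes in the future would be formatted like
# to_youtube_datetime(datetime.now(timezone.utc) + timedelta(minutes=30))
```

The resulting string goes into snippet.scheduledStartTime on the LiveBroadcast insert or update request body.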
It seems to be a specific number of seconds of silence, but I haven't found a definitive answer on this.
This detail is not publicly available in the documentation. Since interim_results is used on streaming requests to get partial results during the stream, my guess is that it is not a straightforward rule but is most likely tied to the ML model used, so it would be hard to identify the exact triggers for every situation.
Nonetheless, I'm wondering why you would need to know the relation between what's occurring in the audio and the interim results during a streaming request. For example, if you want to split the requests and translations every time the speaker pauses, you could use single_utterance=true for every request.
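To make the suggestion concrete, here is a sketch of the streaming options discussed above, written as a plain dict that mirrors the documented fields of Google Cloud Speech's StreamingRecognitionConfig (with the real client library you would construct speech.StreamingRecognitionConfig instead; the encoding and sample rate below are illustrative assumptions):

```python
# Fields mirror StreamingRecognitionConfig from the Cloud Speech streaming API.
streaming_config = {
    "config": {
        "encoding": "LINEAR16",        # assumed raw 16-bit PCM input
        "sample_rate_hertz": 16000,    # assumed sample rate
        "language_code": "en-US",
    },
    # interim_results: stream back partial hypotheses while audio is arriving.
    "interim_results": True,
    # single_utterance: the service ends the stream at the first detected
    # pause in speech, so each request covers exactly one utterance.
    "single_utterance": True,
}
```

With single_utterance enabled, you get one final result per utterance instead of having to guess where the model decided a pause occurred.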
I'm trying to get a livestream working on youtube. I want to stream 360° content with H264 video and AAC audio. The stream is started with the youtube live api from my mobile app and librtmp is used to deliver video and audio packets. I easily get to the point where the livestream health is good and my broadcast and stream are bound successfully.
However, when I try to transition to "testing" like this:
YoutubeManager.this.youtube.liveBroadcasts().transition("testing", liveBroadcast.getId(), "status").execute();
I get stuck in the "startTesting" status every time (100% reproducible), while I expect it to change to "testing" after a few seconds so that I can then change it to "live".
I don't know what's going on: in the YouTube Live control room everything seems fine, but the encoder won't start.
Is this a common issue? Is there a way to access the encoder logs? If you need more information, feel free to ask.
Regards.
I found a temporary fix!
I noticed two things:
When the autostart option was on, the stream changed its state to startLive as soon as I stopped sending data. This suggested the encoder was trying to start but was too slow to do so before the next data packet was received (I guess).
When I tried to stream to the "Stream now" URL, as #noogui suggested, it worked! So I checked what differed between the Stream now and event configurations.
It turned out I just had to activate the low-latency option, as it's enabled by default in the Stream now configuration.
I consider this a temporary fix because I don't really know why the encoder isn't starting otherwise, and because it doesn't work with the autostart option... So I hope it won't break again if YouTube makes another change to their encoder.
So, if you have to work with the YouTube API, good luck!
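For anyone applying the same fix through the API rather than the control room UI: the low-latency option corresponds to the contentDetails.enableLowLatency flag on the LiveBroadcast resource. A sketch of the update request body (the broadcast ID is a placeholder; you would pass this to youtube.liveBroadcasts().update with part="id,contentDetails"):

```python
# Hypothetical broadcast ID for illustration only.
BROADCAST_ID = "abc123"

# Request body enabling the low-latency option on an existing broadcast.
# contentDetails.enableLowLatency is the documented field behind the
# "low latency" checkbox in the live control room.
update_body = {
    "id": BROADCAST_ID,
    "contentDetails": {
        "enableLowLatency": True,
    },
}
```

Setting this at broadcast-creation time (in the insert body) avoids having to toggle it afterwards.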
I'm trying to use the YouTube player API in my app, but I don't know how to determine whether a video is a live stream or not. And does anybody know how to get the real duration of the video?
Update:
I figured out a way to determine whether content is live (I used my backend server to fetch the data), but I still can't get the exact duration of the live video.
If you are using the youtube-ios-player-helper YTPlayerView, the playerView:didChangeToQuality: delegate method will return kYTPlaybackQualityAuto for Live Events.
See my pull request on the repo here as well as related discussion in this issue.
The duration of the video should be returned from the duration method on the player, but I've found this to be rather unreliable, with some Live Events returning a duration of 0. Further discussion can be found in this Stack Overflow question.
This is old, but you can get liveStreamingDetails.actualStartTime through the YouTube API.
With the actualStartTime in hand, you can calculate how much time has elapsed.
There is also an end time (actualEndTime) in liveStreamingDetails.
"https://www.googleapis.com/youtube/v3/videos?part=liveStreamingDetails&id=$id&key=$_key"
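Given the actualStartTime string that call returns, the elapsed time can be computed like this (a Python sketch; the timestamp format is the ISO 8601 form the API returns, with a trailing Z for UTC):

```python
from datetime import datetime, timezone

def elapsed_seconds(actual_start_time, now=None):
    """Seconds elapsed since liveStreamingDetails.actualStartTime.

    actual_start_time is the ISO 8601 string from the API,
    e.g. "2024-05-01T12:00:00Z". A fixed `now` can be passed for testing.
    """
    # fromisoformat on older Pythons does not accept "Z", so normalize it.
    start = datetime.fromisoformat(actual_start_time.replace("Z", "+00:00"))
    if now is None:
        now = datetime.now(timezone.utc)
    return (now - start).total_seconds()
```

This gives the live video's running duration, which is the closest substitute for the unreliable player-side duration mentioned earlier.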
I want my app to have a subscription service, and the way I see it is to store timeIntervalSince1970 as an "until" date. But that is easily circumvented if the user changes the system's current time. Is there any better way to track this in offline mode?
Take a look at this post1; it explains how to measure elapsed time independent of clock and time-zone changes.
Also take a look at this post2; it explains how to detect a device time change only when it is changed manually.
Let me know if this helps you :)
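The core idea behind both linked posts is to compare a wall-clock reading against a clock the user cannot set (on iOS that would be something derived from boot time, e.g. via mach_absolute_time). A language-neutral sketch of the detection logic in Python, using time.monotonic as the stand-in for the tamper-proof clock:

```python
import time

def snapshot():
    """Record paired wall-clock and monotonic readings at one instant."""
    return {"wall": time.time(), "mono": time.monotonic()}

def clock_was_changed(earlier, later, tolerance=2.0):
    """True if the wall clock advanced by a different amount than the
    monotonic clock between two snapshots, i.e. someone adjusted the
    system time. `tolerance` (seconds) absorbs normal clock drift."""
    wall_delta = later["wall"] - earlier["wall"]
    mono_delta = later["mono"] - earlier["mono"]
    return abs(wall_delta - mono_delta) > tolerance
```

Persist the earlier snapshot alongside the "until" date; on launch, a mismatch between the two deltas means the stored expiry can no longer be trusted.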
I'm a newbie to multimedia work. I want to capture audio sample by sample and transfer it to another iOS device over the network. How should I start? I have gone through Apple's multimedia guide and the SpeakHere example, but it is full of C++ code and it writes to a file before starting services, whereas I need a buffer. Please help me start my work in the right direction.
Thanks in advance
I just spent a bunch of time working on real-time audio. You can use AudioQueue, but it has latency issues of around 100-200 ms.
If you want to do something like the T-Pain app, you have to use one of:
RemoteIO API
Audio Unit API
They are equally difficult to implement, so I would just pick the RemoteIO path.
Source can be found here:
http://atastypixel.com/blog/using-remoteio-audio-unit/
I have upvoted the answer above, but I wanted to add a piece of information that took me a while to figure out. When using AudioQueue for recording, the intuitive notion is that the callback fires at regular intervals matching whatever duration the sample count represents. That notion is incorrect: AudioQueue seems to gather samples over a long period of time, then deliver them in very fast iterations of the callback.
In my case, I was using 20 ms buffers and receiving 320 samples per callback. When printing the timestamps of the calls, I noticed a pattern: one call every 2 ms, then after a while one gap of ~180 ms. Since I was doing VoIP, this showed up as an increasing delay on the receiving end. Switching to Remote I/O seems to have solved the issue.
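The burst pattern described above (many ~2 ms gaps punctuated by one ~180 ms gap, instead of a steady 20 ms cadence) is easy to confirm by logging callback timestamps and inspecting the deltas. A small Python sketch of that analysis, with the thresholds chosen as illustrative assumptions:

```python
def interarrival_ms(timestamps_ms):
    """Deltas between consecutive callback timestamps, in milliseconds."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def looks_bursty(timestamps_ms, expected_ms=20.0, factor=4.0):
    """True if callbacks arrive in bursts rather than at the expected
    interval: some gaps far above the nominal buffer duration and others
    far below it, as described for AudioQueue above."""
    deltas = interarrival_ms(timestamps_ms)
    return any(d > expected_ms * factor for d in deltas) and any(
        d < expected_ms / factor for d in deltas
    )
```

Running this over logged timestamps from an AudioQueue callback versus a Remote I/O render callback makes the scheduling difference visible immediately.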