Realtime data capture from music streaming services - firebase-realtime-database

Is it possible to capture, in real time, data about what a user is currently listening to, along with the playlists they like/follow/created?
For instance, User A is listening to a song on Spotify and User N is listening to the same song on the same platform. Is there a way to capture the real-time data of what users are listening to and build a community that matches those users?

Related

Spotify's Web Playback SDK - Can users other than the user who created the player listen through the SDK?

I want to create a web app where I use Spotify's Web Playback SDK to create a player and play music in the browser from another Spotify client (e.g. my Spotify app on my phone or Spotify app on my computer) and also enable other users who are on my website to listen to what's being played live.
Is this possible to do through the Web Playback SDK? If not, how would I go about implementing this?
If the playlist is public, you could let your users sign into the website to get a Spotify token, which you can then use with the Web Playback SDK. Since they log into their own Spotify account (assuming they have one), it will allow them to listen to your playlist.
However, you would need additional code / events to make sure they're on the correct track and position. For example, something on your machine could publish the track and position you're at every so often, so you can then set the current playback track and position in their web player to match. That's something you'd have to verify is possible, but at minimum, listening to the same playlist would work.
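The "publish track and position every so often" idea above can be sketched as a small drift-correction check: the host periodically publishes its current track and position, and each listener decides whether to seek. Everything below is illustrative — the field names and `resyncTarget` helper are assumptions, not part of the Web Playback SDK.

```typescript
// What the host publishes every few seconds.
interface HostState {
  trackUri: string;   // e.g. "spotify:track:..."
  positionMs: number; // host playback position when reported
  reportedAt: number; // Date.now() on the host at report time
}

// What the listener's web player currently reports.
interface ListenerState {
  trackUri: string;
  positionMs: number;
}

const DRIFT_TOLERANCE_MS = 2000; // how far out of sync we tolerate

// Returns the position (ms) the listener should seek to, or null if
// they are already close enough to the host's estimated position.
function resyncTarget(
  host: HostState,
  listener: ListenerState,
  now: number
): number | null {
  // Estimate where the host is *now*, assuming playback continued.
  const estimatedHostPos = host.positionMs + (now - host.reportedAt);
  if (listener.trackUri !== host.trackUri) {
    return estimatedHostPos; // wrong track: always resync
  }
  const drift = Math.abs(listener.positionMs - estimatedHostPos);
  return drift > DRIFT_TOLERANCE_MS ? estimatedHostPos : null;
}
```

When `resyncTarget` returns a number, the listener's page would call the SDK player's seek (and, if on the wrong track, start the right one); when it returns `null`, playback is left alone to avoid constant jitter.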

WebRTC videochat p2p: Switch from local stream to p2p stream

I want to establish a p2p video chat using WebRTC.
This is meant for a "doctor-patient" 1-on-1 video chat.
The video conference should start at a certain date/time.
However, both parties should be able to already join the room, but not see each other. They should be able to adjust their camera in private.
How could this be achieved?
I'm absolutely not sure which way to go here.
Could I perhaps switch from local stream to p2p stream at that certain start date / time of the appointment?
Thank you.
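One way to frame the question's "switch at the start time" idea: both parties get a local camera preview immediately (via `getUserMedia`), but the peer connection is only created once the appointment's start time is reached. The helpers below are a sketch of the pure scheduling part; the names are assumptions, and the actual WebRTC calls would happen around them.

```typescript
type RoomPhase = "preview" | "live";

// Decide which phase the room is in for a given wall-clock time (ms).
// During "preview" each party only sees their own local camera stream;
// during "live" the RTCPeerConnection exchanges media.
function roomPhase(appointmentStart: number, now: number): RoomPhase {
  return now >= appointmentStart ? "live" : "preview";
}

// Milliseconds to wait before creating the offer / attaching remote tracks.
function msUntilLive(appointmentStart: number, now: number): number {
  return Math.max(0, appointmentStart - now);
}
```

In the browser, you would show the `getUserMedia` stream in a `<video>` element while the phase is `"preview"`, and use something like `setTimeout(connectPeer, msUntilLive(start, Date.now()))` to create the `RTCPeerConnection`, add the already-acquired tracks, and run the offer/answer exchange when the appointment begins — no need to re-acquire the camera.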

How does Twilio's "Programmable Video" work?

I'm building a streaming iOS app in Swift. Looking at the docs https://www.twilio.com/docs/api/video I understand that you can create live video chat rooms on the fly.
My use case is a bit different:
User A accesses a room, hits 'record' and starts streaming a video of himself to Twilio storage, which creates a thumbnail in the UI. User B enters the same room and clicks the video thumbnail - that video should be streamed down to User B.
If User A is talking (streaming up) and User B is in the room at the same time, it should be possible to 'go live', which would start a live video chat room that other users can join too.
Main question: Does Twilio Programmable Video allow streaming up and down using their storage?
Secondary question: Would you say Twilio Programmable Video is the right choice for this use case or would you recommend another service?
Twilio developer evangelist here.
I'll answer your questions in the reverse of the order you asked them, if that's OK.
If User A is currently streaming to a room and recording it (having created the room in group mode with RecordParticipantsOnConnect set to true) and another user wants to join the room, then they can. They just need an access token that gives them access to the room. They will then be able to join the room and chat and be recorded too.
Once a recording is complete, you will receive a webhook to the statusCallback URL that was set for the room. The callback for the recording will have the recording-completed event and will include a media URL for the recording, as well as the Uri and Sid for the recording resource.
You can use the media URL or the recording resource to get the binary data, which for videos will be in .mkv format. If you want to stream this video to your users, you may want to download the video and convert to a playable format. Or upload it to a streaming service.
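A minimal sketch of handling that statusCallback, following the answer's description: when the webhook reports a completed recording, pull out the media URL for download. The field names (`StatusCallbackEvent`, `MediaUrl`) are taken from the answer and are assumptions here — verify them against Twilio's documentation for your API version.

```typescript
// Twilio posts the callback as key/value parameters.
interface StatusCallbackParams {
  [key: string]: string | undefined;
}

// Returns the recording's media URL if this callback is a
// "recording-completed" event, otherwise null (other room events
// such as participant-connected arrive on the same URL).
function recordingMediaUrl(params: StatusCallbackParams): string | null {
  if (params["StatusCallbackEvent"] !== "recording-completed") return null;
  return params["MediaUrl"] ?? null;
}
```

Your webhook endpoint would call this on each POST and, on a non-null result, fetch the `.mkv` binary from that URL for conversion or upload to a streaming service, as described above.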
Let me know if that helps at all.

Is it possible to use the YouTube Live Stream API to broadcast through my phone camera?

I want to create a basic app that allows users to start broadcasting video through their phone camera (front and back) just by pressing a button.
Does the YouTube live stream API allow me to handle the video streaming process?
If so, is the YouTube Live Stream API totally free of charge, and will it never ask me to pay if I reach a certain amount of usage?
Creating a Live Event and Live broadcast is language and hardware agnostic, just use YouTube's Live Streaming HTTP API. Read through the Core Concepts and Life of a Broadcast guides.
Your flow might look something like this:
Authenticate the user.
Set up and schedule your Live Broadcast object.
Start your video encoder and create a Live Stream Object.
Bind your Live Stream to your Live Broadcast.
Test to verify your video is going through.
Set your Live Broadcast to Live.
At the conclusion of your event, set your Live Broadcast to Ended.
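The steps above map onto a small set of HTTP calls against YouTube's Live Streaming API. The URL builders below are a sketch of that sequence — the endpoints are the documented ones, but actually sending these requests requires an OAuth bearer token and request bodies (titles, scheduled times, CDN settings), which are omitted here.

```typescript
const API = "https://www.googleapis.com/youtube/v3";

// Step: schedule the Live Broadcast object (POST with a JSON body).
function insertBroadcastUrl(): string {
  return `${API}/liveBroadcasts?part=snippet,status,contentDetails`;
}

// Step: create the Live Stream object (returns your RTMP ingestion info).
function insertStreamUrl(): string {
  return `${API}/liveStreams?part=snippet,cdn`;
}

// Step: bind the stream to the broadcast.
function bindUrl(broadcastId: string, streamId: string): string {
  return `${API}/liveBroadcasts/bind?id=${broadcastId}&streamId=${streamId}&part=id,snippet`;
}

// Steps: move the broadcast through testing -> live -> complete.
function transitionUrl(
  broadcastId: string,
  status: "testing" | "live" | "complete"
): string {
  return `${API}/liveBroadcasts/transition?id=${broadcastId}&broadcastStatus=${status}&part=status`;
}
```

Each is an authenticated POST; your encoder pushes RTMP to the ingestion address returned by the liveStreams insert, and the transition calls correspond to the "Test", "Live", and "Ended" steps in the flow.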
Note that setting up your encoder is on you. Asking "How do I create an RTMP or DASH video encoder for [hardware or software]" is too broad of a question for Stack Overflow.
The YouTube API is free to use within a specific quota. If you hit that quota limit, there are ways to request additional quota from Google (potentially for a fee).
I answered a similar question about integrating with YouTube's Live Streaming API on iOS here: YouTube live on iOS?

Any way to Keep LiveStream active on Youtube Livestream?

I'd like to livestream my lectures to YouTube with the youtube-livestreaming-api.
I create a new live event when a lecture starts and transition my LiveEvent to complete when the lecture ends.
The problem is break time. When the break is short, I don't need to change or insert a LiveStream, because the LiveStream stays active even though the LiveEvent has stopped. But when the break is long, the LiveStream turns inactive, so I can't transition to testing or live. Is there any way to keep the LiveStream active?
Any suggestions or ideas?
You may use LiveBroadcasts:transition, and you should first confirm that the value of the status.streamStatus property for the stream bound to your broadcast is active.
A liveStream resource contains information about the video stream that you are transmitting to YouTube. The stream provides the content that will be broadcast to YouTube users.
HTTP request
POST https://www.googleapis.com/youtube/v3/liveBroadcasts/transition
Note: This request requires OAuth authorization with an appropriate scope.
The broadcastStatus parameter identifies the state to which the broadcast is changing. Note that to transition a broadcast to either the testing or live state, the stream bound to the broadcast must be active. In the live state, the broadcast is visible to its audience, and YouTube transmits video to the broadcast's monitor stream and its broadcast stream.
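The precondition the answer describes can be expressed as a simple guard before calling liveBroadcasts/transition. The interface below mirrors the relevant fragment of the liveStreams resource; fetching the resource itself (liveStreams.list on the bound stream's ID) is assumed to happen elsewhere.

```typescript
// Fragment of the liveStreams resource we care about.
interface LiveStreamResource {
  status?: {
    streamStatus?: string; // "active", "inactive", "ready", "created", "error"
  };
}

// Returns true only when it is safe to transition the bound broadcast
// to "testing" or "live": the stream must be actively receiving data.
function canTransition(stream: LiveStreamResource): boolean {
  return stream.status?.streamStatus === "active";
}
```

In the lecture/break scenario above, polling the stream with liveStreams.list and checking `canTransition` before each transition call would tell you whether the encoder has been idle too long and the stream has gone inactive, in which case the encoder must resume sending data before the broadcast can go to testing or live.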
