How to stream the screen without a Broadcast Extension in iOS

I want to stream my app to Twitch, YouTube, or a similar streaming service without going through another application like Mobcrush.
According to Apple, I can stream my application's screen by using a Broadcast Extension.
The Broadcast Extension delivers video data as CMSampleBuffer. I should then send that data to an RTMP server such as YouTube or Twitch.
I figured that if I can get the video data myself, I can stream without using a Broadcast Extension in my app. So I tried sending RPScreenRecorder data to an RTMP server, but it doesn't work.
Here is the code I wrote.
I use the HaishinKit open source framework for the RTMP communication.
(https://github.com/shogo4405/HaishinKit.swift/tree/master/Examples/iOS/Screencast)
let rpScreenRecorder: RPScreenRecorder = RPScreenRecorder.shared()
private var broadcaster: RTMPBroadcaster = RTMPBroadcaster()

rpScreenRecorder.startCapture(handler: { (cmSampleBuffer, rpSampleBufferType, error) in
    if error != nil {
        print("Error occurred \(error.debugDescription)")
    } else {
        if let description: CMVideoFormatDescription = CMSampleBufferGetFormatDescription(cmSampleBuffer) {
            let dimensions: CMVideoDimensions = CMVideoFormatDescriptionGetDimensions(description)
            self.broadcaster.stream.videoSettings = [
                "width": dimensions.width,
                "height": dimensions.height,
                "profileLevel": kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        }
        self.broadcaster.appendSampleBuffer(cmSampleBuffer, withType: .video)
    }
}) { (error) in
    if error != nil {
        print("Error occurred \(error.debugDescription)")
    } else {
        print("Success")
    }
}
If you have any solution, please answer me :)

I've tried a similar setup and it is possible to achieve what you'd like; you just need to adjust it a little.
I don't see it in your example, but make sure that the broadcaster's endpoint is set up correctly. For example:
let endpointURL: String = "rtmps://live-api-s.facebook.com:443/rtmp/"
let streamName: String = "..."
self.broadcaster.streamName = streamName
self.broadcaster.connect(endpointURL, arguments: nil)
Then, in startCapture's handler block, you need to filter by the buffer type so that the correct data is sent to the stream. In this case you're only sending video, so we can ignore audio. (You can send audio with HaishinKit too; see the sketch at the end of this answer.) For example:
RPScreenRecorder.shared().startCapture(handler: { (sampleBuffer, type, error) in
    if type == .video, broadcaster.connected {
        if let description: CMVideoFormatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) {
            let dimensions: CMVideoDimensions = CMVideoFormatDescriptionGetDimensions(description)
            broadcaster.stream.videoSettings = [
                .width: dimensions.width,
                .height: dimensions.height,
                .profileLevel: kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        }
        broadcaster.appendSampleBuffer(sampleBuffer, withType: .video)
    }
}) { (error) in }
Also make sure that the screen is updated during streaming. I've noticed that if you're recording a static window with RPScreenRecorder, it will only call the handler when there's actually new video data to send. For testing I added a simple UISlider, which updates the feed when you move it around.
I've tested it with Facebook Live and I think it should work with other RTMP services too.
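If you also want to forward audio, the same handler can route the app and microphone audio buffers to the stream. This is a rough sketch under the assumption that the RTMPBroadcaster from the Screencast example forwards appendSampleBuffer(_:withType:) to the underlying stream and that the HaishinKit version in use accepts an .audio buffer type; sample rates may also need matching, so treat it as a starting point rather than a drop-in solution:

// Inside the same startCapture handler, after checking broadcaster.connected:
switch type {
case .video:
    broadcaster.appendSampleBuffer(sampleBuffer, withType: .video)
case .audioApp, .audioMic:
    // Forward audio buffers as well (assumes the stream's audio settings match this format).
    broadcaster.appendSampleBuffer(sampleBuffer, withType: .audio)
default:
    break
}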


How do you allow very large files to have time to upload to Firebase before iOS terminates the task?

I have a video sharing app, and when you save a video to Firebase Storage it works perfectly for videos that are roughly 1 minute or shorter.
The problem I am having is that when I try to post a longer video (1 minute or more), it never saves to Firebase.
The only thing I can think of is this error, which only shows up about 30 seconds after I tap the save button:
[BackgroundTask] Background Task 101 ("GTMSessionFetcher-firebasestorage.googleapis.com"), was created over 30 seconds ago. In applications running in the background, this creates a risk of termination. Remember to call UIApplication.endBackgroundTask(_:) for your task in a timely manner to avoid this.
Here is my code to save the video to Firebase.
func saveMovie(path: String, file: String, url: URL) {
    var backgroundTaskID: UIBackgroundTaskIdentifier?

    // Perform the task on a background queue.
    DispatchQueue.global().async {
        // Request the task assertion and save the ID.
        backgroundTaskID = UIApplication.shared.beginBackgroundTask(withName: "Finish doing this task", expirationHandler: {
            // End the task if time expires.
            UIApplication.shared.endBackgroundTask(backgroundTaskID!)
            backgroundTaskID = UIBackgroundTaskIdentifier.invalid
        })

        // Send the data synchronously.
        do {
            let movieData = try Data(contentsOf: url)
            self.storage.child(path).child("\(file).m4v").putData(movieData)
        } catch let error {
            fatalError("Error saving movie in saveMovie func. \(error.localizedDescription)")
        }

        // End the task assertion.
        UIApplication.shared.endBackgroundTask(backgroundTaskID!)
        backgroundTaskID = UIBackgroundTaskIdentifier.invalid
    }
}
Any suggestions on how I can allow my video time to upload?
Finally figured this out after a long time...
All you have to do is use .putFile(fileURL) instead of .putData(data). The Firebase documentation says you should use putFile() instead of putData() when uploading large files.
But the hard part is that, for some reason, you can't directly upload the movie URL that you get from the didFinishPickingMediaWithInfo function; Firebase will just give you an error. So what I did instead was get the data of the movie, write the movie data to a path in the file manager, and use that file manager URL to upload directly to Firebase, which worked for me.
// Save movie to Firebase Storage.
do {
    // Convert movie to Data.
    let movieData = try Data(contentsOf: movie)

    // Get a file manager path so we can write movieData to disk and upload from there,
    // because the movie URL from the picker does not work, but the file manager URL does.
    guard let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first?.appendingPathComponent(postId!) else {
        print("Error saving to file manager in addPost func")
        return
    }
    do {
        try movieData.write(to: path)

        // Upload the file at the file manager URL to Firebase Storage.
        Storage.storage().reference().child("Videos").child("\(postId!).m4v").putFile(from: path, metadata: nil) { metadata, error in
            if let error = error {
                print("There was an error \(error.localizedDescription)")
            } else {
                print("Video successfully uploaded.")
            }
            // Delete the video from the file manager because keeping every video would take up too much space.
            do {
                try FileManager.default.removeItem(atPath: path.path)
            } catch let error {
                print("Error deleting from file manager in addPost func \(error.localizedDescription)")
            }
        }
    } catch let error {
        print("Error writing movieData to file manager \(error.localizedDescription)")
    }
} catch let error {
    print("There was an error adding video in addPost func \(error.localizedDescription)")
}
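If the upload also needs to survive the app being sent to the background (the warning in the original question), one option is to keep a background task alive until putFile's completion handler fires, instead of ending it right after the upload starts. A minimal sketch with a hypothetical uploadMovie helper, assuming the same Firebase Storage reference and file manager URL as above:

import FirebaseStorage
import UIKit

func uploadMovie(from fileURL: URL, to reference: StorageReference) {
    var backgroundTaskID: UIBackgroundTaskIdentifier = .invalid

    // Begin a background task so iOS gives the upload extra time if the app moves to the background.
    backgroundTaskID = UIApplication.shared.beginBackgroundTask(withName: "UploadMovie") {
        // Expiration handler: time is up, release the assertion.
        UIApplication.shared.endBackgroundTask(backgroundTaskID)
        backgroundTaskID = .invalid
    }

    // putFile uploads straight from disk, so large files are not loaded into memory.
    reference.putFile(from: fileURL, metadata: nil) { metadata, error in
        if let error = error {
            print("Upload failed: \(error.localizedDescription)")
        } else {
            print("Video successfully uploaded.")
        }
        // End the background task only once the upload has finished (or failed).
        if backgroundTaskID != .invalid {
            UIApplication.shared.endBackgroundTask(backgroundTaskID)
            backgroundTaskID = .invalid
        }
    }
}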

How to get current user's playlist from Spotify

I am trying to implement Spotify integration and display the current user's playlists in my table view. I have integrated login and the access token, and everything works fine. I have gone through this Stack Overflow link: How to get the list of songs using Spotify in Swift3 iOS? but it didn't work for me.
Then, when I try to print canonicalUsername as below, it shows a nil value:
SPTUser.requestCurrentUser(withAccessToken: (SPTAuth.defaultInstance().session.accessToken)!) { (error, data) in
    guard let user = data as? SPTUser else { print("Couldn't cast as SPTUser"); return }
    let userId = user.canonicalUsername
}
I have also tried this link: Spotify iOS SDK Swift display all (!) playlists (20+), but, maybe because I'm a beginner, it didn't work for me either. Is there any way to get Spotify's current user ID? How can I show the current user's playlists in my table view?
Just go through the online tutorial on YouTube, https://www.youtube.com/watch?v=KLsP7oThgHU&t=1s, for the latest version as of 2019.
You can download the full source code with the Spotify integration, search options, a default Spotify URL, and fetching and playing the current user's playlists in a native iOS app.
Source: https://github.com/azeemohd786/Spotify-Demo
Based on your question, the solution to print the canonical username (the current user's ID) is as below:
SPTUser.requestCurrentUser(withAccessToken: session.accessToken) { (error, data) in
    guard let user = data as? SPTUser else { print("Couldn't cast as SPTUser"); return }
    let userID = user.canonicalUserName
    print(userID!)
}
Then, to get the current user's playlists and play them on your device, first adopt the SPT streaming delegates in your view controller and then make the function call:
class PlayVC: UIViewController, SPTAudioStreamingDelegate, SPTAudioStreamingPlaybackDelegate {

    func audioStreamingDidLogin(_ audioStreaming: SPTAudioStreamingController) {
        let playListRequest = try! SPTPlaylistList.createRequestForGettingPlaylists(forUser: session.canonicalUsername, withAccessToken: session.accessToken)
        Alamofire.request(playListRequest)
            .response { response in
                let list = try! SPTPlaylistList(from: response.data, with: response.response)
                for playList in list.items {
                    if let playlist = playList as? SPTPartialPlaylist {
                        print(playlist.name!) // playlist name
                        print(playlist.uri!)  // playlist uri
                        // self.tableView.reloadData() // if you want to display the playlist name and other info
                        SPTAudioStreamingController.sharedInstance().playSpotifyURI("\(playlist.uri!)", startingWith: 0, startingWithPosition: 10) { error in
                            if error != nil {
                                print("*** failed to play: \(error)")
                                return
                            }
                        }
                    }
                }
            }
    }
}

Swift - How can I convert saved audio file conversations to text?

I am working on speech recognition. I have solved text-to-speech and speech-to-text with the iOS frameworks. But now I want to convert saved audio file conversations to text. How can I solve this? Thank you for all replies.
I have worked on the same thing and it is working for me.
I have an audio file in my project bundle, so I have written the following code to convert the audio to text.
import Speech

let audioURL = Bundle.main.url(forResource: "Song", withExtension: "mov")
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechURLRecognitionRequest(url: audioURL!)
request.shouldReportPartialResults = true

if (recognizer?.isAvailable)! {
    recognizer?.recognitionTask(with: request) { result, error in
        guard error == nil else { print("Error: \(error!)"); return }
        guard let result = result else { print("No result!"); return }
        print(result.bestTranscription.formattedString)
    }
} else {
    print("Device doesn't support speech recognition")
}
First, get the audio URL from wherever you have stored the audio file.
Then create an instance of SFSpeechRecognizer with the locale that you want.
Create an instance of SFSpeechURLRecognitionRequest, which is used to request the recognitionTask.
recognitionTask gives you a result and an error. The result contains bestTranscription.formattedString; formattedString is the text result of your audio file.
If you set request.shouldReportPartialResults = true, it will give you a partial result for every line spoken in the audio.
I hope this will help you.
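One more thing to keep in mind (not shown above): speech recognition has to be authorized by the user before recognitionTask will return results, and the app needs an NSSpeechRecognitionUsageDescription entry in its Info.plist. A minimal sketch of the authorization request:

import Speech

// Ask the user for permission before starting any recognition task.
SFSpeechRecognizer.requestAuthorization { authStatus in
    DispatchQueue.main.async {
        switch authStatus {
        case .authorized:
            print("Speech recognition authorized, safe to start the recognitionTask")
        case .denied, .restricted, .notDetermined:
            print("Speech recognition not available: \(authStatus)")
        @unknown default:
            break
        }
    }
}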

AVAudioEngine warning: "deprecated Carbon Component Manager for hosting Audio Units"

I'm writing my first audio app for Mac, which loads an external audio unit and uses it to play sound through an instance of AVAudioEngine, and I've been seeing this warning:
WARNING: 140: This application, or a library it uses, is using the
deprecated Carbon Component Manager for hosting Audio Units. Support
for this will be removed in a future release. Also, this makes the
host incompatible with version 3 audio units. Please transition to the
API's in AudioComponent.h.
I've already transitioned from using AVAudioUnitComponents to AudioComponents (now accessed via this api) which I expected would solve this issue, but I'm still seeing this warning when I call start() on my engine.
Any ideas what's going wrong here? As far as I can tell, I'm no longer using deprecated APIs. Is it possible that AVAudioEngine is using deprecated APIs under the hood?
Here's a snippet from the code I'm working with. I'm calling selectInstrument with a description I've retrieved using the AudioComponents API.
public func selectInstrument(withDescription description: AudioComponentDescription, callback: @escaping SelectInstrumentCallback) {
    AVAudioUnit.instantiate(with: description, options: []) { avAudioUnit, error in
        guard let unit = avAudioUnit else {
            callback(nil)
            return
        }

        self.disconnectCurrent()
        self.connect(unit: unit)
        unit.auAudioUnit.requestViewController { viewController in
            callback(viewController)
        }
    }
}

private func disconnectCurrent() {
    guard let current = currentInstrument else { return }
    self.engine.disconnectNodeInput(engine.mainMixerNode)
    self.engine.detach(current)
    self.currentInstrument = nil
    self.engine.stop()
}

private func connect(unit: AVAudioUnit) {
    let hardwareFormat = self.engine.outputNode.outputFormat(forBus: 0)
    self.engine.connect(self.engine.mainMixerNode, to: self.engine.outputNode, format: hardwareFormat)
    self.engine.attach(unit)

    do {
        try ExceptionCatcher.catchException {
            let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: 2)
            self.engine.connect(unit, to: self.engine.mainMixerNode, format: stereoFormat)
        }
    } catch {
        let monoFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: 1)
        self.engine.connect(unit, to: self.engine.mainMixerNode, format: monoFormat)
    }

    unit.auAudioUnit.contextName = "Running in AU host demo app"
    self.currentInstrument = unit

    do {
        // The Carbon Component Manager warning is issued here:
        try self.engine.start()
    } catch {
        print("Failed to start engine")
    }
}
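For reference, here is a minimal sketch of the kind of AudioComponent-based lookup I mean (the music-device filter is only an example; any componentType can be used):

import AudioToolbox

// Enumerate installed Audio Units via the AudioComponent API (no Carbon Component Manager involved).
var searchDescription = AudioComponentDescription(
    componentType: kAudioUnitType_MusicDevice,
    componentSubType: 0,
    componentManufacturer: 0,
    componentFlags: 0,
    componentFlagsMask: 0
)

var descriptions: [AudioComponentDescription] = []
var component: AudioComponent? = nil
while let found = AudioComponentFindNext(component, &searchDescription) {
    var foundDescription = AudioComponentDescription()
    if AudioComponentGetDescription(found, &foundDescription) == noErr {
        descriptions.append(foundDescription)
    }
    component = found
}
// One of these descriptions is then passed to selectInstrument(withDescription:callback:).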
Thanks for your help!

Swift: Web Service API call returns error on certain networks

I am creating an iOS app using Swift that uses some web services to get information. Specifically, I am using the food2fork API to get some recipes. The problem I'm having is that if I am connected to the internet at my university, the web calls always return errors, even though I know the phone is connected to the internet. I believe the error has something to do with the network only allowing secure websites, but I'm not sure.
Am I not using NSURL correctly? Is there a better way I should be doing this to ensure that my web calls always return the data the app needs? Here is the function:
func getRecipeByID(recipeId: String, sendTo: RecipeInfoViewController)
{
    let theURLAsString = "http://food2fork.com/api/get?key=[MY KEY]&rId=" + recipeId
    let theURL = NSURL(string: theURLAsString)
    let theURLSession = NSURLSession.sharedSession()

    let theJSONQuery = theURLSession.dataTaskWithURL(theURL!, completionHandler: {data, response, error -> Void in
        if(error != nil)
        {
            print(error!)
        }
        do
        {
            let theJSONResult = try NSJSONSerialization.JSONObjectWithData(data!, options: NSJSONReadingOptions.MutableContainers) as! NSDictionary
            if theJSONResult.count > 0
            {
                let theRecipeDictionary = theJSONResult["recipe"] as? NSDictionary
                sendTo.setRecipeInfo(theRecipeDictionary!)
            }
        } catch let error as NSError {
            print(error) // The function always gets here on certain networks
        }
    })
    theJSONQuery.resume()
}
The error that is output at the print(error) line is:
Error Domain=NSCocoaErrorDomain Code=3840 "JSON text did not start
with array or object and option to allow fragments not set."
UserInfo={NSDebugDescription=JSON text did not start with array or
object and option to allow fragments not set.}
If you're running iOS 9, you'll need to disable App Transport Security for that domain by adding keys to your Info.plist. Otherwise, you won't be able to make non-HTTPS connections.
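A minimal sketch of the Info.plist entry, assuming you only want to allow plain-HTTP requests to food2fork.com rather than disabling ATS globally:

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>food2fork.com</key>
        <dict>
            <!-- Allow plain HTTP for this domain only -->
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
            <key>NSIncludesSubdomains</key>
            <true/>
        </dict>
    </dict>
</dict>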
