Getting Red5Pro Live Streaming to function properly on iOS

So I'm working on allowing users to begin a livestream (visible to those subscribed to them) from our application. We are using a Red5Pro server. I have followed the instructions from Red5's iOS page, and when it runs on the phone the camera screen comes up, our really nice looking UI comes up, everything looks great.
But when I push the button to begin recording a livestream, the app either
1) crashes abruptly, or
2) claims it is streaming, but nothing shows up on Red5's "Check if your server has a stream being broadcasted currently" page.
Anyone with Red5Pro experience wanna glance over my code and possibly point to something wrong? We are using Swift 2 still (not my choice) at the moment, and there are no error messages on Xcode's side of things. Thanks!
import UIKit
import AVFoundation
import R5Streaming

class PublishViewController : R5VideoViewController, R5StreamDelegate {

    var config : R5Configuration!
    var stream : R5Stream!

    override func viewDidLoad() {
        super.viewDidLoad()
        config = R5Configuration()
        config.host = Defaults.sharedDefaults.localHost
        config.port = Int32(Defaults.sharedDefaults.hostPort)
        config.contextName = "live"
    }

    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)
        self.stop()
    }

    // Attaches the selected camera and the default microphone to a new stream and shows the preview.
    func preview(isBackCamera: Bool) {
        let cameraDevice: AVCaptureDevice = isBackCamera ? AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo).first as! AVCaptureDevice : AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo).last as! AVCaptureDevice
        let camera = R5Camera(device: cameraDevice, andBitRate: 512)
        camera?.orientation = (camera?.orientation)! + 90

        let audioDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
        let microphone = R5Microphone(device: audioDevice)

        let connection = R5Connection(config: config)
        stream = R5Stream.init(connection: connection)
        stream.attachVideo(camera)
        stream.attachAudio(microphone)
        stream.delegate = self

        self.attachStream(stream)
        self.showPreview(true)
    }

    // Hides the preview and starts publishing the stream to the server.
    func start() {
        self.showPreview(false)
        stream.publish("red5prostream", type: R5RecordTypeLive)
    }

    func stop() {
        stream.stop()
        stream.delegate = nil
    }

    func onR5StreamStatus(stream: R5Stream!, withStatus statusCode: Int32, withMessage msg: String!) {
        print("Stream: \(r5_string_for_status(statusCode)) - \(msg!)")
    }
}

First, make sure you download the latest iOS SDK and Red5 Pro server.
Red5 Pro Accounts Site
Your code looks good except that you have the iOS code pointing at "localhost" for your config.
config.host = Defaults.sharedDefaults.localHost
That line tells your iOS device to connect to itself. You need to point it at your Red5 Pro server instead. On the machine where the server is running, run ifconfig to find the server's local IP address (or use the WAN IP address where you deployed the server), then use that value for the host property in your iOS config.
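For example, a minimal sketch of the change (the IP address below is only a placeholder for your own server's address, and 8554 is the usual default port for the Red5 Pro mobile SDK; check your server's configuration):

// In viewDidLoad(), point the config at the Red5 Pro server, not at the device itself.
config = R5Configuration()
config.host = "192.168.1.50"   // placeholder: your server's LAN or WAN IP
config.port = 8554             // commonly the mobile SDK default; adjust to your setup
config.contextName = "live"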
You can additionally check out the "Getting Started with iOS" section of our developer series in order to get a feel for how we set up a similar application. https://www.red5pro.com/docs/developerseries/03/gsios.html
You can also join our slack channel from the accounts page as well as submit tickets for any issues you observe.
Hope this helps!

Related

Getting VPN Connection Status on Xamarin.ios

We are converting a mobile application from iOS native (Swift) to Xamarin.iOS (so it can eventually be deployed to multiple operating systems).
I am trying to research how to do the following in Xamarin.iOS:
1 - Check if a VPN connection is active/enabled on the iOS device
2 - Bring up the VPN settings screen (or better, enable a specific VPN automatically)
For:
1 - Is this https://learn.microsoft.com/en-us/dotnet/api/networkextension.nevpnstatus applicable?
2 - Same for https://learn.microsoft.com/en-us/dotnet/api/networkextension.netunnelprovidermanager?
Snippets from existing Swift code:
func connectToVpn() {
    //mention the connection name instead of exposing the server
    //let connectUrl = URL(string: "mobileconnect://connect?name=*********")
    let connectUrl = URL(string: "mobileconnect://connect?")
    if UIApplication.shared.canOpenURL(connectUrl!) == true {
        UIApplication.shared.openURL(connectUrl!)
    }
}

var isVpnConnected : Bool {
    let dict = CFNetworkCopySystemProxySettings()?.takeUnretainedValue() as? [String: AnyObject]
    guard let keys = dict?["__SCOPED__"]?.allKeys as? [String] else {
        return false
    }
    for key in keys {
        if key.contains("tap") || key.contains("tun") || key.contains("ppp") {
            return true
        }
    }
    return false
}
Any comments/suggestions/youtube video/blog entry links would be greatly appreciated.
Edit:
For 1 - Realized I don't need to check the VPN status, will just ping an internal server/host to see if it responds.
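If it helps, a hedged Swift sketch of that kind of check; the URL is a placeholder for your internal host, and a plain HTTPS request stands in for a "ping" here:

import Foundation

/// Checks whether an internal host responds. If the host is only reachable
/// over the VPN, a successful response implies the VPN is up.
/// "internal.example.com" is a placeholder.
func checkInternalHost(completion: @escaping (Bool) -> Void) {
    guard let url = URL(string: "https://internal.example.com/health") else {
        completion(false)
        return
    }
    var request = URLRequest(url: url)
    request.httpMethod = "HEAD"
    request.timeoutInterval = 5
    URLSession.shared.dataTask(with: request) { _, response, error in
        // Any HTTP response without an error means the internal host answered.
        completion(error == nil && response is HTTPURLResponse)
    }.resume()
}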
iOS only allows you to open your own app's settings page from within your app. If your app does not implement its own settings page, the main Settings page will open instead; you cannot open a specific system settings page. See this discussion elsewhere on SO: How to open Settings programmatically like in Facebook app?
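In Swift, for illustration, a minimal sketch of the only settings navigation that is allowed (opening your own app's settings page):

import UIKit

func openAppSettings() {
    // Opens this app's page in the Settings app; deep-linking to arbitrary
    // system settings (e.g. the VPN screen) is not supported.
    if let url = URL(string: UIApplication.openSettingsURLString),
       UIApplication.shared.canOpenURL(url) {
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }
}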
Your approach is the way to go; that way it doesn't matter what interface type is used. I'll note that there are other interface names, such as "utun2". I'm not sure where to pull all the interface names from, but here's an updated Xamarin version in case it helps someone out.
public bool isVPNConnected()
{
    var settings = CoreFoundation.CFNetwork.GetSystemProxySettings();
    var keys = settings.Dictionary.ValueForKey(new NSString("__SCOPED__")) as NSDictionary;
    return keys.Any(k => new string[] { "tap", "tun", "utun2", "ppp", "ipsec", "ipsec0" }.Contains(k.Key.ToString()));
}

Why doesn't my iOS (Swift) app properly recognize some external display devices?

So I have an odd issue and my google-fu utterly fails to even provide me the basis of where to start investigating, so even useful keywords to search on may be of use.
I have an iOS application written in Swift. I have a model hooked up to receive notifications about external displays. With some adaptors, I'm able to properly detect and respond to the presence of an external display and programmatically switch it out to be something other than a mirror (see the code block below). But with another adaptor, instead of just 'magically' becoming a second screen, I'm asked to 'trust' the external device, and it simply mirrors the device screen. Not the intended design at all.
func addSecondScreen(screen: UIScreen) {
    // Create a window on the external screen and drive it with a separate view controller.
    self.externalWindow = UIWindow.init(frame: screen.bounds)
    self.externalWindow!.screen = screen
    self.externalWindow!.rootViewController = self.externalVC
    self.externalWindow!.isHidden = false
}

@objc func handleScreenDidConnectNotification(_ notification: NSNotification) {
    let newScreen = notification.object as! UIScreen
    if self.externalWindow == nil {
        addSecondScreen(screen: newScreen)
    }
}

@objc func handleScreenDidDisconnectNotification(_ notification: NSNotification) {
    if let externalWindow = self.externalWindow {
        externalWindow.isHidden = true
        self.externalWindow = nil
    }
}
The worst issue here is that because I'm connecting to an external display to do this, I can't even run this code through the debugger to find out what is going on. I don't know where to even begin.
Any ideas?
Edit:
Thanks to someone pointing out Wi-Fi debugging, I can tell you my notifications are firing, but both of them fire back to back when the external adaptor is disconnected.
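For context, a hedged sketch of how these handlers are typically registered; the original post doesn't show this part, so the observer setup below is an assumption:

// Somewhere during setup (e.g. the model's init or viewDidLoad),
// assuming the handlers shown above live on `self`.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleScreenDidConnectNotification(_:)),
                                       name: UIScreen.didConnectNotification,
                                       object: nil)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleScreenDidDisconnectNotification(_:)),
                                       name: UIScreen.didDisconnectNotification,
                                       object: nil)

// A screen that is already attached at launch won't post a connect
// notification, so also check UIScreen.screens up front.
if UIScreen.screens.count > 1 {
    addSecondScreen(screen: UIScreen.screens[1])
}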

Is it okay to download data directly to watchOS from a server?

So I'm trying to make a watchOS app for a music streaming app, and I found an example pretty close to what I want to build.
(https://github.com/belm/BaiduFM-Swift)
But it seems like the project is somewhat outdated. According to the code below, the watch extension fetches the data it needs, such as audio and images, via HTTP requests. From what I read, watchOS 3 supports background connectivity (which lets the app transfer data more efficiently), and Apple encourages developers to process data in, and fetch it from, the companion iOS app.
What is the right way to do it? Is there any good example to see?
// play song method in interface controller
HttpRequest.getSongLink(info.id, callback: { (link: SongLink?) -> Void in
    if let songLink = link {
        DataManager.shareDataManager.curSongLink = songLink
        DataManager.shareDataManager.mp.stop()
        var songUrl = Common.getCanPlaySongUrl(songLink.songLink)
        DataManager.shareDataManager.mp.contentURL = NSURL(string: songUrl)
        DataManager.shareDataManager.mp.prepareToPlay()
        DataManager.shareDataManager.mp.play()
        DataManager.shareDataManager.curPlayStatus = 1
        Async.main {
            self.songTimeLabel.setText(Common.getMinuteDisplay(songLink.time))
        }
        HttpRequest.getLrc(songLink.lrcLink, callback: { lrc -> Void in
            if let songLrc = lrc {
                DataManager.shareDataManager.curLrcInfo = Common.praseSongLrc(songLrc)
                //println(songLrc)
            }
        })
    }
})
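For comparison, a minimal sketch of the phone-assisted route the question alludes to, using WatchConnectivity; the message keys and the SongRequester type are illustrative assumptions, not part of the linked project:

import WatchConnectivity

// Watch side: ask the companion iOS app for a song URL instead of
// hitting the server directly from the watch extension.
class SongRequester: NSObject, WCSessionDelegate {

    func activate() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()
    }

    func requestSongLink(songID: String, completion: @escaping (URL?) -> Void) {
        // "songID" / "songLink" are illustrative message keys handled by the iOS app.
        WCSession.default.sendMessage(["songID": songID], replyHandler: { reply in
            completion((reply["songLink"] as? String).flatMap(URL.init(string:)))
        }, errorHandler: { _ in
            completion(nil)
        })
    }

    // Required delegate callback on watchOS.
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) { }
}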

OBEXFileTransferServices doesn't connect

I'm trying to write a macOS app that connects to an already paired Bluetooth phone and retrieves the list of address book entries and call records. This information should be available via the standard OBEX interface. I'm relatively new to macOS development (although I have enough experience with iOS development) and I have a feeling that I'm doing something wrong on a very basic level.
Here are snippets of my code:
First, I find the particular paired Bluetooth device by its address:
let paired = IOBluetoothDevice.pairedDevices()
let device = paired?.first(where: { (device) -> Bool in
    return (device as? IOBluetoothDevice)?.addressString == "some_address"
}) as? IOBluetoothDevice
This works fine and I get back a valid object. Next, I pick up the address book service and create an IOBluetoothOBEXSession for it:
let service = device!.getServiceRecord(for: IOBluetoothSDPUUID(uuid32: kBluetoothSDPUUID16ServiceClassPhonebookAccess.rawValue))
let obexSession = IOBluetoothOBEXSession(sdpServiceRecord: service!)
This also works fine; I get a proper service object and the session is created.
The next step (I would assume) is to create an OBEXFileTransferServices session and do something with it, like checking the current directory or retrieving the contents of telecom/cch, which is supposed to hold the combined list of outgoing and incoming calls:
let ftp = OBEXFileTransferServices(obexSession: obexSession!)
ftp!.delegate = self
if ftp!.connectToFTPService() == 0 {
    NSLog("\(ftp!.currentPath())") // -- empty
    ftp!.changeCurrentFolderForward(toPath: "telecom/cch")
    NSLog("\(ftp!.currentPath())") // -- empty
    ftp!.retrieveFolderListing()
}
I have added the following delegate methods to my view controller (to receive callbacks from OBEXFileTransferServices), but they never get called:
override func fileTransferServicesRetrieveFolderListingComplete(_ inServices: OBEXFileTransferServices!, error inError: OBEXError, listing inListing: [Any]!) {
    NSLog("Listing complete...")
}

override func fileTransferServicesConnectionComplete(_ inServices: OBEXFileTransferServices!, error inError: OBEXError) {
    NSLog("Connection complete...")
}

override func fileTransferServicesDisconnectionComplete(_ inServices: OBEXFileTransferServices!, error inError: OBEXError) {
    NSLog("Disconnect complete...")
}

override func fileTransferServicesAbortComplete(_ inServices: OBEXFileTransferServices!, error inError: OBEXError) {
    NSLog("Abort complete...")
}
What am I doing wrong here?
I also couldn't find any good Bluetooth examples for macOS, so if somebody has good links, please do share.

Is there a way to tell if a MIDI-Device is connected via USB on iOS?

I'm using CoreMIDI to receive messages from a MIDI keyboard via the Camera Connection Kit on iOS devices. My app is about pitch recognition. I want the following behavior to be automatic:
By default use the microphone (already implemented); if a MIDI keyboard is connected, use that instead.
I could figure out how to tell if it is a USB keyboard using the default driver: just ask for the device called "USB-MIDI":
private func getUSBDeviceReference() -> MIDIDeviceRef? {
    for index in 0..<MIDIGetNumberOfDevices() {
        let device = MIDIGetDevice(index)
        var name: Unmanaged<CFString>?
        MIDIObjectGetStringProperty(device, kMIDIPropertyName, &name)
        if name!.takeRetainedValue() as String == "USB-MIDI" {
            return device
        }
    }
    return nil
}
But unfortunately there are USB keyboards that use a custom driver. How can I tell if I'm looking at one of these? Standard Bluetooth and network devices seem to always be online, even if Wi-Fi and Bluetooth are turned off on the device (strange?).
I ended up using the USBLocationID. It worked with every device I tested so far, and none of the users complained. But I don't expect many users to use the MIDI features of my app.
/// Filters all `MIDIDeviceRef`'s for USB devices
private func getUSBDeviceReferences() -> [MIDIDeviceRef] {
    var devices = [MIDIDeviceRef]()
    for index in 0..<MIDIGetNumberOfDevices() {
        let device = MIDIGetDevice(index)
        var list: Unmanaged<CFPropertyList>?
        MIDIObjectGetProperties(device, &list, true)
        if let list = list {
            let dict = list.takeRetainedValue() as! NSDictionary
            if dict["USBLocationID"] != nil {
                devices.append(device)
            }
        }
    }
    return devices
}
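A hedged usage sketch of how that filter might drive the input choice; switchToMIDIInput() and switchToMicrophone() are hypothetical helpers, not part of the code above:

// Hypothetical call site: prefer a USB MIDI keyboard when one is attached,
// otherwise fall back to microphone-based pitch recognition.
func selectInputSource() {
    if getUSBDeviceReferences().isEmpty {
        switchToMicrophone()   // hypothetical helper
    } else {
        switchToMIDIInput()    // hypothetical helper
    }
}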
