HTTP/2 protocol with URLSessionStreamTask - iOS

I am trying to build the iOS client side of Alexa Voice Services, and I am stuck at the networking layer.
Interaction with the Alexa server requires mainly two streams over a single connection. After creating the connection with the server, you open a downchannel stream, which stays in a half-closed state. The downchannel is used by the server to send directives such as notifications and alarms (for example, if you ask Alexa to set an alarm for 5 minutes from now, you will get the directive to play the alarm on this channel after 5 minutes). The downchannel stays open as long as the session with Alexa is active. Another stream is started whenever the user begins an audio session with Alexa; it is used to send audio chunks until an end directive is received, and then it is closed. More details here.
I am trying to implement this using streamTask(withHostName:port:) of URLSession. For a simple HTTP request I can write the request, read the response, and then just parse the header and body per the standard.
let request = "GET / HTTP/1.1 \r\nHost: localhost\r\n\r\n"
streamTask = session.streamTask(withHostName: "localhost", port: 8080)
let getRequest = request.data(using: .utf8)!
streamTask.write(getRequest, timeout: 60) {
error in
debugPrint("Error is \(error?.localizedDescription) )")
if error == nil {
self.streamTask.readData(ofMinLength: 4096, maxLength: 4096, timeout: 20) {
data, bool, error in
if let extractedData = data {
let dataString = String(data: extractedData, encoding: .utf8)
debugPrint("Data received is \(dataString!)")
}
debugPrint("Bool = \(bool) error = \(error?.localizedDescription)")
}
}
}
streamTask.resume()
The data I am reading is:
Data received is HTTP/1.1 200 \r\nDate: Tue, 26 Mar 2019 06:08:33 GMT\r\nServer: Apache/2.4.38 (Unix)\r\nContent-Length: 226\r\nConnection: close\r\nContent-Type: text/html; charset=iso-8859-1\r\n\r\n<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\n<html><head>\n<title>it works</title>\n</head><body>\n<h1>it workst</h1>\n<p>it works.<br />\n</p>\n</body></html>\n"
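Since the idea above is to parse the header and body per the standard, here is a minimal sketch of that split (my own illustration, assuming the whole response arrived in a single read):
func parseHTTP1Response(_ raw: Data) -> (headers: [String: String], body: Data)? {
    // The blank line (CRLF CRLF) separates the header block from the body.
    guard let separator = raw.range(of: Data("\r\n\r\n".utf8)) else { return nil }
    let headerText = String(data: raw[..<separator.lowerBound], encoding: .utf8) ?? ""
    var headers: [String: String] = [:]
    for line in headerText.components(separatedBy: "\r\n").dropFirst() { // drop the status line
        let parts = line.split(separator: ":", maxSplits: 1).map { $0.trimmingCharacters(in: .whitespaces) }
        if parts.count == 2 { headers[parts[0]] = parts[1] }
    }
    return (headers, raw[separator.upperBound...])
}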
But when I try to create the HTTP/2 stream task for the downchannel request below, I can read nothing.
:method = GET
:scheme = https
:path = /{{API version}}/directives
authorization = Bearer {{YOUR_ACCESS_TOKEN}}
1. Is it because HTTP/2 is a binary protocol instead of plain text, with headers compressed using HPACK?
2. If yes, will I need to implement header compression, frame creation, and writing to the stream myself as per the specification, or is there some configuration in URLSessionConfiguration that will do this for me even if I specify plain headers as in a URLRequest?
3. Can you suggest some libraries that would help me achieve this?
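For what it's worth, a URLSessionStreamTask only gives you a raw TCP/TLS byte stream, so speaking HTTP/2 over it would mean writing the connection preface, frames, and HPACK yourself. A plain data task, by contrast, negotiates HTTP/2 on its own via ALPN when the server supports it, and a long-lived data task with a delegate can receive bytes as they arrive. A minimal sketch of that shape (the endpoint URL, API version, and token are placeholders, not confirmed AVS values):
import Foundation

final class DownchannelClient: NSObject, URLSessionDataDelegate {
    private lazy var session = URLSession(configuration: .default, delegate: self, delegateQueue: nil)

    func openDownchannel(accessToken: String) {
        // Placeholder endpoint; URLSession negotiates HTTP/2 itself, so no HPACK or framing is done here.
        var request = URLRequest(url: URL(string: "https://avs.example.com/v20160207/directives")!)
        request.httpMethod = "GET"
        request.timeoutInterval = 60 * 60   // keep the half-closed downchannel from timing out quickly
        request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
        session.dataTask(with: request).resume()
    }

    // Called repeatedly as directive bytes arrive on the downchannel.
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        print("Received \(data.count) bytes of directive data")
    }
}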

Related

Partial downloads not working correctly with MS Graph and Adobe Embed API

I am trying to display linearized PDFs using the Adobe Embed API with the MS Graph downloadUrl.
Adobe Embed API - https://developer.adobe.com/document-services/docs/overview/pdf-embed-api/howtos/#pdf-linearization
Partial range download - https://learn.microsoft.com/en-us/graph/api/driveitem-get-content?view=graph-rest-beta&tabs=javascript#partial-range-downloads
const previewFilePromise = this.adobeDCView.previewFile(
  {
    content: {
      location: {
        url: downloadUrl
      }
    },
    metaData: {
      fileName,
    },
  },
  {
    enableLinearization: true
  }
);
return previewFilePromise;
This will automatically make three requests:
1. A HEAD call to get the file content length.
2. A partial range request that returns the first bytes correctly, with request header range: bytes=0-1024 and response headers Content-Length: 1025 and etag: "{08F2D5AD-CA54-470B-BAFD-FA594C1F9DE4},3".
3. Another request identical to the second one, with an additional request header of if-none-match: "{08F2D5AD-CA54-470B-BAFD-FA594C1F9DE4},3".
As soon as you add the if-none-match header to the Graph downloadUrl request, it returns a 200 with all the data instead of a 304 Not Modified response. This causes a long delay when opening large PDFs with the API.
Am I doing something wrong, or could it be an issue on their side sending back the incorrect response?
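For reference, here is a minimal Swift sketch of the conditional ranged request described above (the URL is a placeholder); a server honoring the validator should answer 304 Not Modified while the ETag still matches, whereas the behaviour reported here is a full 200:
import Foundation

var request = URLRequest(url: URL(string: "https://example.com/downloadUrl")!)
// Ask for the first bytes only, and make the request conditional on the previously seen ETag.
request.setValue("bytes=0-1024", forHTTPHeaderField: "Range")
request.setValue("\"{08F2D5AD-CA54-470B-BAFD-FA594C1F9DE4},3\"", forHTTPHeaderField: "If-None-Match")

URLSession.shared.dataTask(with: request) { data, response, _ in
    guard let http = response as? HTTPURLResponse else { return }
    switch http.statusCode {
    case 206: print("partial content, \(data?.count ?? 0) bytes")   // ranged body
    case 304: print("not modified, nothing re-downloaded")          // expected while the ETag matches
    default:  print("status \(http.statusCode), \(data?.count ?? 0) bytes")
    }
}.resume()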

iOS Alamofire - Streaming JSON lines first response issue

Using Alamofire 4.9.0.
I am trying to handle streaming APIs that use the JSON lines format. Here's how:
stream = Alamofire.request(url, method: HTTPMethod.get, headers: TTSessionManager.headers)
    .validate()
    .stream(closure: { (data) in
        // parsing JSON lines ...
    })
    .response(completionHandler: { (response) in
        // error handling ...
    })
Now the issue is that the first response takes some time to arrive, and when it does I get a couple of JSON lines in one big batch. After that, the stream normally delivers one new JSON line per callback.
Has anyone encountered this behaviour? I'm wondering whether some additional session or request setup is needed for this to work normally (one line per callback) from the start. When inspecting response.metrics after cancelling the request, a lot of the fields are null, so I can't say for sure whether some of the initial connection steps are the issue:
(Domain Lookup Start) (null)
(Domain Lookup End) (null)
(Connect Start) (null)
(Secure Connection Start) (null)
(Secure Connection End) (null)
(Connect End) (null)
So the problem here was that the response headers didn't have Content-Type set to application/json. When this header is not set properly, the URLSession data task buffers the first 512 bytes of the response.
More info can be found here: https://developer.apple.com/forums/thread/64875
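Given that buffering, a small line-oriented parser that tolerates several JSON lines arriving in one chunk can look like this (a sketch; the type and field handling are illustrative only):
import Foundation

final class JSONLinesParser {
    private var buffer = Data()

    // Feed each streamed chunk in; returns any complete JSON objects found so far.
    func consume(_ chunk: Data) -> [[String: Any]] {
        buffer.append(chunk)
        var objects: [[String: Any]] = []
        // Split on LF; anything after the last newline stays buffered until the next chunk.
        while let newline = buffer.firstIndex(of: 0x0A) {
            let lineData = Data(buffer[buffer.startIndex..<newline])
            buffer.removeSubrange(buffer.startIndex...newline)
            if let object = (try? JSONSerialization.jsonObject(with: lineData)) as? [String: Any] {
                objects.append(object)
            }
        }
        return objects
    }
}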

Swift Network.framework (websocket): read response headers set by the server

I'm trying to figure out how a WebSocket client can read additional headers set by the WebSocket server during the handshake.
WebSocket server (built using NWProtocolWebSocket)
let wsOptions = NWProtocolWebSocket.Options()
wsOptions.setClientRequestHandler(serverQueue) { (_, headers) -> NWProtocolWebSocket.Response in
    let additionalHeaders = [("custom-header", "hi there")]
    return .init(status: .accept, subprotocol: nil, additionalHeaders: additionalHeaders)
}
WebSocket client (built using NWProtocolWebSocket)
I'm aware that NWProtocolWebSocket.Metadata has an additionalServerHeaders property, but I don't know how to access it.
Any help? Thanks
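Since the client code isn't shown above, here is a minimal client-side sketch for orientation (host and port are placeholders). It shows where WebSocket metadata normally surfaces on the client, via the receive context; whether the handshake's additional server headers can be reached from there is exactly what this question is asking:
import Network

let parameters = NWParameters.tls
let wsOptions = NWProtocolWebSocket.Options()
parameters.defaultProtocolStack.applicationProtocols.insert(wsOptions, at: 0)

let connection = NWConnection(host: "example.com", port: 443, using: parameters)
connection.stateUpdateHandler = { state in print("state: \(state)") }
connection.start(queue: .main)

connection.receiveMessage { data, context, _, error in
    // Per-message WebSocket metadata (opcode, close code, ...) lives on the content context.
    if let metadata = context?.protocolMetadata(definition: NWProtocolWebSocket.definition) as? NWProtocolWebSocket.Metadata {
        print("opcode: \(metadata.opcode)")
    }
    if let data = data {
        print("received \(data.count) bytes")
    }
}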

WebSocket: Starscream "masked and rsv data is not currently supported"

I am developing an iOS app which needs to connect to a WebSocket server.
I can successfully connect to the server, but when I send a request to it, the connection drops.
I am using the Starscream library for WebSockets.
As per the server support team:
it does not support protocol compression, but in the headers below they're requesting "permessage-deflate" from us. We'll accept uncompressed messages just fine (it's just a flag on the packet) but due to the extension they asked for, messages we send out will be compressed with that flag set.
I send the request as follows using Swift:
let dict = ["Parameter1": "value1", "Parameter2": "value2"]
do {
    let data = try NSJSONSerialization.dataWithJSONObject(dict, options: NSJSONWritingOptions(rawValue: 0))
    let jsonString = String(data: data, encoding: NSUTF8StringEncoding)!
    self.socket.writeString(jsonString)
} catch {
    print(error)
}
It disconnects from the server and prints the following message:
"websocket is disconnected: Optional("masked and rsv data is not currently supported")"
What the server support team meant is that the request from your WebSocket client application contained an HTTP header like the one below.
Sec-WebSocket-Extensions: permessage-deflate
So, if your application has a line like below, remove it.
socket.headers["Sec-WebSocket-Extensions"] = "permessage-deflate"
This error might also be thrown if the server doesn't accept the incoming connection (regardless of the reason), or if the server crashed.
Basically, when this message shows up, the best course of action is to check what is going on on the server, as you might be wasting time trying to improve the client code (it happened to me :)
For those facing this issue when trying to connect to the backend WebSocket, make sure the front-end and back-end versions of socket.io are compatible. Running the following command fixed the issue for me:
pod update
Updating both to the latest versions solved the issue.
This will fix your issue, I believe. Just add "wamp" to the header, like this:
var request = URLRequest(url: URL(string: urlString)!)
request.setValue(["wamp"].joined(separator: ","), forHTTPHeaderField: "Sec-WebSocket-Protocol")
socket = WebSocket(request: request)
socket.delegate = self
socket.connect()

How to serve a remote file (taken from a remote SMB server) to certain requests

I'm trying to serve a video file as the response when a certain request hits the server (a server running inside a mobile app), e.g. someone goes to http://mylocalip:8080/video and I want to serve them a video.
This video file can be stored locally or can be external. I started by trying to serve a file located on an SMB server, so I tried using this code to fetch the file from the server and return it (I know it waits for the whole file to be read instead of reading and sending chunks, but that should still work, right?):
webServer.addDefaultHandlerForMethod("GET", requestClass: GCDWebServerRequest.self, processBlock: { request in
    print("########\n########Request: \(request)")
    let webrequested: GCDWebServerRequest = request
    let urlcita: String = String(webrequested.URL)
    print("URL of request: \(urlcita)")
    if urlcita.rangeOfString("videosmb") != nil {
        print("It's a video SMB request")
        // Test: predefined fake SMB file. Read the file using KxSMB.
        let route: String = "smb://192.168.1.205/ServidorAH/video.avi"
        let extensFile: NSString = (route as NSString).pathExtension
        let contentype = GCDWebServerGetMimeTypeForExtension(extensFile as String)
        print("Content type MIME \(contentype)")
        // Open the SMB file using KxSMB
        let authcredentials = KxSMBAuth.smbAuthWorkgroup("", username: "Guest", password: "")
        let provider = KxSMBProvider.sharedSmbProvider()!
        let archivoes = provider.fetchAtPath(route, auth: authcredentials)
        // Switch to manually end the reading when set to true. In the future this will send
        // chunks of data until the end, instead of reading the whole file first.
        var interruptor: Bool = false
        // Response stream block
        let responseStream: GCDWebServerStreamedResponse = GCDWebServerStreamedResponse(contentType: contentype, asyncStreamBlock: { completionBlock in
            if interruptor == true {
                print("Test: End of reading")
                completionBlock(NSData(), nil)
                return
            }
            if archivoes!.isKindOfClass(NSError) {
                // The file could not be found (SMB error), so pass empty NSData to the completion block
                let errorcito: NSError = archivoes as! NSError
                print("Error fetching SMB file: \(errorcito.localizedDescription)")
                completionBlock(NSData(), nil)
            } else {
                print("Suitable SMB file: \(archivoes)")
                let datos: NSData = archivoes!.readDataToEndOfFile()
                // Print the length of the data (to check the size of the file)
                print("Data length \(datos.length)")
                // Set the switch to true, so the next call will send an empty data completion block
                interruptor = true
                // Send data chunk (in this case everything)
                completionBlock(datos, nil)
            }
        })
        return responseStream
    } else {
        // Default response
        return GCDWebServerDataResponse(HTML: "<html><body><p>Hello World<br></p></body></html>")
    }
})
However, I can't make it work. I always get a broken pipe error, and the web player accessing the web server (browsing from a Mac, and from iOS too) doesn't play anything. I've also tried using an embedded iOS player (KxMovie) to log the response. I get something like this:
[DEBUG] Did open connection on socket 19
[DEBUG] Connection received 177 bytes on socket 19
[DEBUG] Connection on socket 19 preflighting request "GET /videosmb" with 177 bytes body
[DEBUG] Connection on socket 19 processing request "GET /videosmb" with 177 bytes body
[DEBUG] Did connect
[DEBUG] Did start background task
[DEBUG] Connection sent 175 bytes on socket 19
...
Using a local player (KxMovie) inside the app, it appears to read the file headers and gets the right file size and dimensions of the video. However, it doesn't play, and it ends up saying it reached the end of the video (without playing it). Just after that, the web server shows an error:
...
[ERROR] Error while writing to socket 19: Broken pipe (32)
[DEBUG] Did close connection on socket 19
[VERBOSE] [fe80::cd0:28cd:3a37:b871%en1:8080] fe80::cd0:28cd:3a37:b871%en1:50109 200 "GET /videosmb" (177 | 175)
[DEBUG] Did disconnect
[DEBUG] Did end background task
Given that this is the first time I'm dealing with SMB servers, I thought that maybe I was doing something wrong with the SMB part, so I decided to go for a simplified approach just for testing.
This time I tried to serve a simple mp4 file stored on a remote web server (not SMB). It didn't work either.
Finally, I tried to serve a local file included in the main bundle of the app, and the same thing happened: nothing. Here is the code:
webServer.addDefaultHandlerForMethod("GET", requestClass: GCDWebServerRequest.self, processBlock: { request in
    print("########\n########Request: \(request)")
    let webrequested: GCDWebServerRequest = request
    let url: String = String(webrequested.URL)
    print("URL of request: \(url)")
    if url.rangeOfString("video") != nil {
        print("It's a video request")
        let rutalocalita = (NSBundle.mainBundle()).pathForResource("video", ofType: "avi")
        let datos = NSData(contentsOfFile: rutalocalita!)!
        print("video size: \(datos.length)")
        return GCDWebServerDataResponse(data: datos, contentType: "video/avi")
    } else {
        // Default response: simple web page
        return GCDWebServerDataResponse(HTML: "<html><body><p>Hello World<br></p></body></html>")
    }
})
This is what the log looks like:
[DEBUG] Did open connection on socket 19
[DEBUG] Connection received 177 bytes on socket 19
[DEBUG] Connection on socket 19 preflighting request "GET /video" with 177 bytes body
[DEBUG] Connection on socket 19 processing request "GET /video" with 177 bytes body
[DEBUG] Did connect
[DEBUG] Did start background task
[myCUSTOMDebug] Read 13584902 b. I'm going to send the response back to the request.
[DEBUG] Connection sent 173 bytes on socket 19
...
Here, the local player I'm using inside the app to track the response is able to read things like:
header='HTTP/1.1 200 OK'
2015-10-17 17:51:41.571 videotvtest[262:14252] http_code=200
2015-10-17 17:51:41.571 videotvtest[262:14252] header='Cache-Control: no-cache'
2015-10-17 17:51:41.571 videotvtest[262:14252] header='Content-Length: 13584902'
2015-10-17 17:51:41.572 videotvtest[262:14252] header='Content-Type: video/avi'
2015-10-17 17:51:41.572 videotvtest[262:14252] header='Connection: Close'
2015-10-17 17:51:41.573 videotvtest[262:14252] header='Server: GCDWebServer'
...
[ERROR] Error while writing to socket 19: Broken pipe (32)
[DEBUG] Did close connection on socket 19
[VERBOSE] [fe80::cd0:28cd:3a37:b871%en1:8080] fe80::cd0:28cd:3a37:b871%en1:50155 200 "GET /video" (177 | 173)
netbios_ns_send_name_query, name query sent for '*'.
[DEBUG] Did disconnect
[DEBUG] Did end background task
I'm playing around with tvOS and Xcode 7 for this, but I guess that should be fine since I'm able to serve a regular HTML response... so I'm sure I'm missing something, or maybe I missed some framework when installing GCDWebServer (I'm not using pods)?
Thanks in advance.
If you want to serve a video file that can be played in the browser using the video tag, at least on Chrome and Safari, you need to implement HTTP range requests.
GCDWebServer automatically implements range support if you use GCDWebServerFileResponse. If you use another type of response, you need to implement it yourself based on the byteRange property of the incoming GCDWebServerRequest. You can copy the logic from GCDWebServerFileResponse.m.
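A minimal sketch of that approach, written in the same style as the handlers above and assuming a bundled video.mp4 (the file name and path check are my placeholders):
webServer.addDefaultHandlerForMethod("GET", requestClass: GCDWebServerRequest.self, processBlock: { request in
    if String(request.URL).rangeOfString("video") != nil {
        // GCDWebServerFileResponse looks at request.byteRange itself and answers range requests with 206.
        let path = NSBundle.mainBundle().pathForResource("video", ofType: "mp4")!
        return GCDWebServerFileResponse(file: path, byteRange: request.byteRange)
    }
    return GCDWebServerDataResponse(HTML: "<html><body><p>Hello World</p></body></html>")
})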
