How can I achieve a socket call like this in iOS (Swift or Objective-C)?

This is the code in Java that makes the socket call, but I want to know how I can replicate this, or something similar, in iOS (Swift or Objective-C):
public String MakeSocketRequest() {
    DataInputStream inputSt;
    DataOutputStream outputSt;
    Socket socket = new Socket(InetAddress.getByName("socketurl.io"), 40008);
    String jsonStr = "{\"id\":1,\"method\":\"themethod\"}";
    inputSt = new DataInputStream(socket.getInputStream());
    outputSt = new DataOutputStream(socket.getOutputStream());
    PrintWriter pw = new PrintWriter(outputSt);
    pw.println(jsonStr);
    Log.d("PrintWriter", jsonStr);
    pw.flush();
    BufferedReader bfr = new BufferedReader(new InputStreamReader(inputSt));
    JSONObject json = new JSONObject(bfr.readLine());
    Log.d("Json", json.toString());
    inputSt.close();
    outputSt.close();
    return json.toString();
}

If you want to do it natively without third-party libraries, you can use the CFStreamCreatePairWithSocketToHost function to create input and output streams (no socket object is needed). A sketch of the setup is shown below.
On iOS you can't write to or read from the streams immediately; you have to wait until the socket is connected and you are given permission to read/write. This is done by implementing NSStreamDelegate.
If you get an NSStreamEventHasSpaceAvailable event there, you can write your string to the output stream. You don't need a PrintWriter just to write a string: it is easy to convert an NSString to NSData and write the NSData.
If you get an NSStreamEventHasBytesAvailable event, it means you can try to read data from the input stream into some buffer (such as NSMutableData). There's no built-in BufferedReader with a readLine method, so you will have to buffer the data yourself and detect when a newline character appears. After that you can cut off the part of the buffer up to the newline and convert the NSData to an NSString (or to a JSON object using NSJSONSerialization).
Note: the scheduleInRunLoop calls might look confusing, but they are required to start receiving events via the delegate; they essentially tell the system which thread you want to receive them on.
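Putting it together, here is a minimal sketch, assuming the host, port, and JSON payload from the question (not production code: error handling, partial writes, and cleanup are glossed over):
#import <Foundation/Foundation.h>

@interface SocketClient : NSObject <NSStreamDelegate>
@property (nonatomic, strong) NSInputStream *inputStream;
@property (nonatomic, strong) NSOutputStream *outputStream;
@property (nonatomic, strong) NSMutableData *buffer;
@property (nonatomic) BOOL requestSent;
@end

@implementation SocketClient

- (void)connect {
    CFReadStreamRef readStream;
    CFWriteStreamRef writeStream;
    CFStreamCreatePairWithSocketToHost(NULL, CFSTR("socketurl.io"), 40008,
                                       &readStream, &writeStream);
    self.inputStream = (__bridge_transfer NSInputStream *)readStream;
    self.outputStream = (__bridge_transfer NSOutputStream *)writeStream;
    self.buffer = [NSMutableData data];
    self.inputStream.delegate = self;
    self.outputStream.delegate = self;
    // Required to start receiving delegate events on this thread.
    [self.inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [self.outputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [self.inputStream open];
    [self.outputStream open];
}

- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)event {
    if (stream == self.outputStream && event == NSStreamEventHasSpaceAvailable && !self.requestSent) {
        // Connected and writable: send the JSON line (println in Java appends "\n").
        NSData *data = [@"{\"id\":1,\"method\":\"themethod\"}\n" dataUsingEncoding:NSUTF8StringEncoding];
        [self.outputStream write:(const uint8_t *)data.bytes maxLength:data.length];
        self.requestSent = YES;
    }
    if (stream == self.inputStream && event == NSStreamEventHasBytesAvailable) {
        uint8_t chunk[1024];
        NSInteger n = [self.inputStream read:chunk maxLength:sizeof(chunk)];
        if (n > 0) [self.buffer appendBytes:chunk length:(NSUInteger)n];
        // Poor man's readLine(): look for '\n' in the accumulated buffer.
        NSRange nl = [self.buffer rangeOfData:[NSData dataWithBytes:"\n" length:1]
                                      options:0
                                        range:NSMakeRange(0, self.buffer.length)];
        if (nl.location != NSNotFound) {
            NSData *line = [self.buffer subdataWithRange:NSMakeRange(0, nl.location)];
            id json = [NSJSONSerialization JSONObjectWithData:line options:0 error:NULL];
            NSLog(@"Json: %@", json);
        }
    }
}

@end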
P.S. I agree with the commenters that if you have control over the server code, it's better to use a standard protocol like Socket.IO or MessagePack instead of inventing your own, because they have better libraries and wider community support.

Related

How to play an RTSP URL from within an app in iOS

I have found many suggestions on Stack Overflow regarding the use of FFmpeg, and a link to the GitHub project DFURTSPPlayer, but it does not compile. After integrating FFmpeg, what do I have to write? Suppose I have an HTTP URL; then I write:
moviePath = "http://path.mp4"
movieURL = NSURL(string: moviePath!)
moviePlayer = MPMoviePlayerController(contentURL: movieURL)
moviePlayer!.play()
So what kind of code should I write to play RTSP URLs?
Here is another post with example FFmpeg code that receives an RTSP stream (it also decodes the stream to YUV420, stores it in pic, then converts the frame to RGB24, stores it in picrgb, and writes it to a file). To achieve something similar to what you have for HTTP you should:
1) Write a wrapper Objective-C class for the FFmpeg C code, or just wrap the code in functions that you call directly from Objective-C. You should have a way to pass the RTSP URL to the class or function, and provide a callback for a new frame. In the class/function, start a new thread that executes something similar to the code in the example and calls the callback for each new decoded frame (a sketch follows after this list). NOTE: FFmpeg has a way to perform asynchronous I/O by using your own custom I/O context, which would let you avoid creating the thread, but if you are new to FFmpeg, start with the basics and improve your code later.
2) In the callback, update the view (or whatever you are using for display) with the decoded frame data.
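For illustration, a rough sketch of such a wrapper (the class name RTSPDecoder and the frame callback are hypothetical, and the FFmpeg calls follow the older avcodec_decode_video2-era API; adjust to your FFmpeg version):
#import <Foundation/Foundation.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

// Hypothetical wrapper: one method that takes the RTSP URL and a block
// called on each decoded frame. Error handling is omitted for brevity.
typedef void (^FrameCallback)(AVFrame *frame);

@interface RTSPDecoder : NSObject
- (void)startWithURL:(NSString *)url onFrame:(FrameCallback)callback;
@end

@implementation RTSPDecoder

- (void)startWithURL:(NSString *)url onFrame:(FrameCallback)callback {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        av_register_all();
        avformat_network_init();

        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, url.UTF8String, NULL, NULL) != 0) return;
        avformat_find_stream_info(fmt, NULL);

        int videoIdx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        AVCodecContext *codecCtx = fmt->streams[videoIdx]->codec;
        avcodec_open2(codecCtx, avcodec_find_decoder(codecCtx->codec_id), NULL);

        AVFrame *frame = av_frame_alloc();
        AVPacket pkt;
        // Classic demux/decode loop: read packets, decode video packets,
        // and hand each finished frame to the callback.
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == videoIdx) {
                int gotFrame = 0;
                avcodec_decode_video2(codecCtx, frame, &gotFrame, &pkt);
                if (gotFrame) callback(frame); // e.g. convert to RGB and update the view
            }
            av_free_packet(&pkt);
        }
        av_frame_free(&frame);
        avformat_close_input(&fmt);
    });
}

@end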

Enqueueing into NSInputStream?

I would like to add three "parts" to an NSInputStream: an NSString, an output from another stream and then another NSString. The idea is the following:
The first and last NSStrings represent the beginning and end of a SOAP request while the output from the stream is a result of loading a very large file and encoding it as Base64 string. So, in the end I would have the final NSInputStream hold the whole SOAP request like this:
< soap beginning > < Base64 encoded data > < soap ending >
The reason I want the whole request to be held in an NSInputStream is two-fold:
I don't want to load the very large data file into memory
I think that this is the only way to force the final request to be sent in HTTP 1.1 chunks (which I need because otherwise, if the request becomes too big, the server won't accept it). So, I know that doing this:
NSInputStream *dataStream = ....;
[request setHTTPBodyStream:dataStream];
ensures that the request will be sent as HTTP 1.1 chunks and not as one huge raw SOAP request.
So I wonder how this can be achieved: how do I "enqueue" things into an NSInputStream? Can it even be done? Is there an alternative way?
Just for reference, in Java this can be done as follows:
Vector<InputStream> streamVec = new Vector<InputStream>();
BufferedInputStream fStream = new BufferedInputStream(fileData.getInputStream());
Base64InputStream b64stream = new Base64InputStream(fStream, true);
String[] SOAPBody = GenerateSOAPBody(fileInfo).split("CUT_HERE");
streamVec.add(new ByteArrayInputStream(SOAPBody[0].getBytes()));
streamVec.add(b64stream);
streamVec.add(new ByteArrayInputStream(SOAPBody[1].getBytes()));
SequenceInputStream seqStream = new SequenceInputStream(streamVec.elements());
because Java has these objects available, but NSStream in Objective-C is a much lower-level object and is harder to work with.
Note: I completely rewrote the original question I asked 2 days ago, since I think the new edit explains the problem more clearly. I hope this makes it easier to comprehend, and maybe answer.
UPDATE 2
Here is what I've been able to achieve so far. Instead of trying to enqueue into a stream, I use a temp file: first I write the < soap beginning >; then I set up an input stream to read the large file in chunks, encode each chunk as a Base64 string, and append it to the same temp file; finally, when my stream closes, I write the < soap ending > to the temp file. Then I set up another input stream with the contents of this file, which I pass to the NSMutableURLRequest:
NSMutableURLRequest* request = [NSMutableURLRequest requestWithURL:url];
...
NSInputStream *dataStream = [NSInputStream inputStreamWithFileAtPath:_tempFilePath];
[request setHTTPBodyStream:dataStream];
This ensures HTTP 1.1 chunked transfer of the contents of the file. After the connection finishes, I delete the temp file.
This seems to work fine, but of course it is an annoying workaround. I don't want to write to a temp file when it could all (ideally) have been handled by streams. If anybody has better suggestions, let me know :)
UPDATE 3
OK, another update is in order. While writing to a file seems to work, I am now hitting an unexpected issue: some of my requests fail to upload to the server. Specifically, everything goes according to plan: I read the contents of the temp file into a stream, set the HTTP body of my request to be this stream, and it starts transmitting the HTTP 1.1 chunks as I want it to. But for some reason some packets get dropped, and the final request (this is my guess) gets malformed and thus fails. The dropped packets seem random: I observe them on larger requests, where the issue simply has more chances to show up, while my smaller requests usually go through just fine. This is of course a separate issue from the original one in this question. If anybody has a good idea what might be causing it, I asked about the problem here: Packets dropped during chunked HTTP 1.1 request sent by NSURLConnection
Your solution is an ok option, but you can do it with a stream. It means subclassing NSInputStream, and that isn't trivial because there are a bunch of methods you need to implement.
Basically your subclass would initially return the header bytes, then it would return bytes from the 'internal' stream to the file content, then when that's used up it returns the footer bytes. It means maintaining a record of how big the header and footer are and how much has been processed so far, but that isn't a big issue.
There's an example of creating such a subclass, which shows the tricky hidden methods you need to implement to get the stream subclass to work properly without throwing exceptions. A condensed sketch of the read logic is below.
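For illustration, here is what that read logic could look like (the class name is hypothetical, and the open/close/delegate/run-loop plumbing that a real NSInputStream subclass needs is omitted):
#import <Foundation/Foundation.h>

// Sketch of a concatenating stream: header bytes, then an inner file
// stream, then footer bytes. Not a complete NSInputStream subclass --
// you also have to override open/close, the delegate and property
// accessors, and the hidden run-loop scheduling methods.
@interface ConcatInputStream : NSInputStream
@end

@implementation ConcatInputStream {
    NSData *_header, *_footer;
    NSInputStream *_inner;      // the Base64-encoded file stream
    NSUInteger _headerPos, _footerPos;
    BOOL _innerDone;
}

- (NSInteger)read:(uint8_t *)buffer maxLength:(NSUInteger)len {
    // 1. Serve header bytes until they run out.
    if (_headerPos < _header.length) {
        NSUInteger n = MIN(len, _header.length - _headerPos);
        memcpy(buffer, (const uint8_t *)_header.bytes + _headerPos, n);
        _headerPos += n;
        return (NSInteger)n;
    }
    // 2. Then drain the inner stream.
    if (!_innerDone) {
        NSInteger n = [_inner read:buffer maxLength:len];
        if (n > 0) return n;
        _innerDone = YES;
    }
    // 3. Finally serve the footer.
    if (_footerPos < _footer.length) {
        NSUInteger n = MIN(len, _footer.length - _footerPos);
        memcpy(buffer, (const uint8_t *)_footer.bytes + _footerPos, n);
        _footerPos += n;
        return (NSInteger)n;
    }
    return 0; // end of stream
}

- (BOOL)hasBytesAvailable {
    return _headerPos < _header.length || !_innerDone || _footerPos < _footer.length;
}

@end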

BlackBerry Java radio streaming

I'm developing a radio app for BB 5.0 in Java. I can't find a way to play the radio from the stream URL that I have. I've tried multiple formats (.pls, .aac, .m3u), but nothing works: I get a RuntimeException every time I try to play the stream. The content is OK, I've checked it.
InputStream stream = Connector.openInputStream(urlPlay);
StreamConnection streamConnection = (StreamConnection) Connector.open(urlPlay, Connector.READ);
InputStream readAhead = streamConnection.openDataInputStream();
byte[] audioData = new byte[500];
readAhead.read(audioData,0,audioData.length);
ByteArrayInputStream in2 = new ByteArrayInputStream(audioData);
player = javax.microedition.media.Manager.createPlayer(in2, "audio/aac");
System.out.println("REALIZE");
player.realize();
System.out.println("PREFETCH");
player.prefetch();
System.out.println("START");
player.start();
Edit:
When I use a URL from my .pls file, I hear a little bit of my stream but it stops immediately.
I suspect the problem is that you are trying to play playlist files instead of an actual stream. Generally, you need to parse those files yourself to get the real stream URLs.
If you open up that .m3u file, you will see that it is just a list of URLs. Take one of those URLs and try it. Also, be sure you are setting the right content type; you can determine what it is with cURL or VLC. Parsing the playlist is trivial, as sketched below.
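For illustration, an .m3u file is just a newline-separated list where lines starting with '#' are comments/directives (sketched here in Objective-C to match the iOS answers in this thread; the same few lines of string handling apply in BlackBerry Java):
// Hypothetical sketch: extract stream URLs from an .m3u playlist.
NSURL *playlistURL = [NSURL URLWithString:@"http://example.com/stream.m3u"]; // placeholder
NSString *playlist = [NSString stringWithContentsOfURL:playlistURL
                                              encoding:NSUTF8StringEncoding
                                                 error:NULL];
NSMutableArray *urls = [NSMutableArray array];
for (NSString *line in [playlist componentsSeparatedByString:@"\n"]) {
    NSString *trimmed = [line stringByTrimmingCharactersInSet:
                         [NSCharacterSet whitespaceAndNewlineCharacterSet]];
    if (trimmed.length > 0 && ![trimmed hasPrefix:@"#"]) {
        [urls addObject:trimmed]; // each remaining line is a stream URL
    }
}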

How to check for slow/low network in ios app

Can anyone suggest how to handle a slow network when streaming video in a web view?
When the network strength is poor, a blank screen appears or the video doesn't stream.
Is there a way to detect this condition so that we can alert the user? (Apart from using private APIs.)
Perhaps the ifi_baudrate member of the if_data structure (declared in <net/if.h>) is what you need. If the baudrate is less than some threshold value, you can show an alert.
Please see the following answer for how to obtain the if_data structure for a particular network interface:
https://stackoverflow.com/a/8014012/1310204
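For reference, a sketch of that approach using getifaddrs (the interface name "en0" for Wi-Fi is an assumption, and the exact headers can vary between SDK versions):
#include <string.h>
#include <sys/socket.h>
#include <ifaddrs.h>
#include <net/if.h>
#include <net/if_var.h>   // struct if_data

// Hypothetical helper: returns ifi_baudrate for the named interface
// (e.g. "en0" for Wi-Fi), or 0 if it cannot be determined.
static unsigned long baudrateForInterface(const char *name) {
    struct ifaddrs *addrs = NULL;
    unsigned long baudrate = 0;
    if (getifaddrs(&addrs) != 0) return 0;
    for (struct ifaddrs *cur = addrs; cur != NULL; cur = cur->ifa_next) {
        // Link-level entries carry per-interface statistics in ifa_data.
        if (cur->ifa_addr && cur->ifa_addr->sa_family == AF_LINK &&
            cur->ifa_data && strcmp(cur->ifa_name, name) == 0) {
            const struct if_data *stats = (const struct if_data *)cur->ifa_data;
            baudrate = stats->ifi_baudrate;
            break;
        }
    }
    freeifaddrs(addrs);
    return baudrate;
}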
You can easily detect the state of the network connection via the HTML5 networking API:
http://www.html5rocks.com/en/mobile/optimization-and-performance/#toc-network-detection
Also, if you want to test the network speed, set up some files of a known size on your server and make an AJAX request for one of them while timing how long the download takes.
You can use something simple:
var start = new Date();
$.get("someFile.jpg")
    .done(function() {
        var elapsed = (new Date() - start); // milliseconds; speed ≈ file size / elapsed
    });
Or dig into the HTML5 performance API:
http://www.html5rocks.com/en/tutorials/webperformance/basics/
...if you're not using JavaScript, the same applies: open a network connection with whatever is at your disposal, download a small file, and do the math ;-)
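A native equivalent of the same idea in Objective-C (the test file URL is a placeholder you would host yourself):
NSDate *start = [NSDate date];
NSURL *url = [NSURL URLWithString:@"https://example.com/test.bin"]; // placeholder test file
[[[NSURLSession sharedSession] dataTaskWithURL:url
        completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error) return;
    NSTimeInterval elapsed = -[start timeIntervalSinceNow];
    // bytes per second ≈ downloaded size / elapsed time
    double bytesPerSecond = data.length / elapsed;
    NSLog(@"Approx. throughput: %.0f bytes/s", bytesPerSecond);
}] resume];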

Better way to encrypt & decrypt audio in BlackBerry?

I use this code to play decrypted audio on BlackBerry on the fly (for the sake of simplicity, I use TEA):
public void play(String path) {
    try {
        FileConnection fc = (FileConnection) Connector.open(path, Connector.READ);
        InputStream is = fc.openInputStream();
        byte[] rawData = IOUtilities.streamToBytes(is);
        processEncryptedAudio(rawData);
        is.close();
        fc.close();
    }
    catch (IOException ioex) {
    }
}

// TEA code is taken from http://www.winterwell.com/software/TEA.php
private void processEncryptedAudio(byte[] data) throws IOException {
    TEA tea = new TEA("ABCDE ABCDE ABC A ABCDEF".getBytes());
    byte[] decrypted_data = tea.decrypt(data);
    ByteArrayInputStream stream = new ByteArrayInputStream(decrypted_data);
    ByteArrayInputStreamDataSource source = new ByteArrayInputStreamDataSource(stream, "audio/mpeg");
    try {
        player = Manager.createPlayer(source);
        player.start();
    }
    catch (MediaException me) {
        Dialog.alert("MediaException: " + me.getMessage());
    }
}
The problem is that decryption takes quite a long time to finish. For example, on the simulator, decrypting a 9 MB audio file takes around 5 seconds, but on a BlackBerry Torch 9860 it takes more than 20 seconds.
Is there any way to improve this? Actually, the whole file doesn't need to be encrypted, as long as it is obscured and cannot be played directly.
You could try switching from TEA to RC4, which is also very simple to implement and quite possibly faster.
Also, it looks like you're doing some unnecessary data copying: it would be slightly more efficient to make your decrypt() method modify the input byte array directly. This may require changing the calling code to skip some number of bytes at the beginning and/or end of the decrypted data, but that shouldn't be too hard. (The ByteArrayInputStream constructor can take optional offset and length arguments.)
If you want to get really fancy, you could try writing your own custom InputStream subclass that does the decryption "on the fly" while the audio is playing. If you use a block cipher in CTR, CFB or CBC mode (or ECB, but that's not secure), you can even make the stream seekable. If you want to be even fancier, make it a wrapper around the original InputStream so that you can do the loading, decryption and playing all at the same time.
Another option might be to use the RIM Crypto API, whose cipher implementations might be more efficient (possibly implemented in optimized native code) than your own. The Crypto API also already provides the DecryptorInputStream class which works in the manner I described above.
One possible downside is that the Crypto API seems to be available only to signed apps.
