iOS - Issue receiving data from NSStream

I am making an application that senses iBeacons. When you get within immediate range of an iBeacon, the application sends the beacon's major and minor numbers to a server, and the server sends back an image stored in a MySQL database; different images are sent back depending on the major and minor numbers.
The application sends the major and minor numbers to a Python (Twisted sockets) script via an NSStream; the script uses these numbers to fetch an image from the database and send it back to the application.
This setup works great when I use it to get simple text messages back from the database, but I am running into problems when trying to receive and display images inside the application.
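For context, the query itself is just a plain "major:minor" string, which the Twisted server below splits on ':'. The sending side might look something like this sketch (variable names are illustrative, not from the original post):

// Sketch: send the beacon's major/minor pair as the "major:minor"
// string the server expects. 'beacon' is a CLBeacon; major and minor
// are NSNumbers, so %@ formats them directly.
NSString *query = [NSString stringWithFormat:@"%@:%@", beacon.major, beacon.minor];
NSData *queryData = [query dataUsingEncoding:NSASCIIStringEncoding];
[_outputStream write:(const uint8_t *)queryData.bytes maxLength:queryData.length];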
First I will post the code of the stream:handleEvent: method that receives the data from the input stream.
The code is only a slight modification of this tutorial: http://www.raywenderlich.com/3932/networking-tutorial-for-ios-how-to-create-a-socket-based-iphone-app-and-server
// input stream event handler that receives the data from the server
//
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode)
    {
        case NSStreamEventOpenCompleted:
            NSLog(@"stream opened");
            break;
        case NSStreamEventHasBytesAvailable: // event for receiving data
            NSLog(@"Received Data");
            if (aStream == _inputStream)
            {
                uint8_t buffer[500000];
                int len;
                // loop gets bytes from the input stream
                //
                while ([_inputStream hasBytesAvailable])
                {
                    len = [_inputStream read:buffer maxLength:sizeof(buffer)];
                    if (len > 0)
                    {
                        NSString *str = @"data:image/jpg;base64,";
                        NSString *img = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                        str = [str stringByAppendingString:img];
                        NSData *ImgOut = [NSData dataWithContentsOfURL:[NSURL URLWithString:str]];
                        if (nil != ImgOut)
                        {
                            self.ImageView.image = [UIImage imageWithData:ImgOut];
                            NSLog(@"show image");
                        }
                    }
                }
            }
            break;
        case NSStreamEventErrorOccurred:
            NSLog(@"can not connect to host");
            [self initNetworkComms];
            break;
        case NSStreamEventEndEncountered:
            NSLog(@"Connection Lost");
            [_outputStream close];
            [_inputStream close];
            [self initNetworkComms];
            break;
        default:
            NSLog(@"unknown event");
            break;
    }
}
Just for good measure, I will post the code of the Python script:
from twisted.internet.protocol import Protocol, Factory
from twisted.internet import reactor
import mysql.connector

db = mysql.connector.connect(user='NotImportant', password='WouldntYouLikeToKnow', host='localhost', database='retailbeacons')
cursor = db.cursor()

class MessageServer(Protocol):
    def connectionMade(self):
        self.factory.clients.append(self)
        print "clients are ", self.factory.clients

    def connectionLost(self, reason):
        self.factory.clients.remove(self)
        print "client has disconnected"

    def dataReceived(self, data):
        a = data.split(':')
        if len(a) > 1:
            Major = a[0]
            Minor = a[1]
            msg = ""
            print "Received query " + Major + ":" + Minor
            # parameterised query avoids SQL injection from socket data
            sql = "SELECT Picture FROM beaconinfo WHERE major=%s AND minor=%s"
            cursor.execute(sql, (Major, Minor))
            for row in cursor.fetchall():
                mess = row[0]
                msg = mess.encode('utf-8')
            self.message(msg)

    def message(self, message):
        self.transport.write(message + '\n')

factory = Factory()
factory.protocol = MessageServer
factory.clients = []
reactor.listenTCP(8080, factory)
print "Python message test server started"
reactor.run()
What happens with this code is that when the app queries the server, the server sends back the image data (in base64 format), the application receives this data, and the EventHasBytesAvailable case of the switch statement is triggered. But only a small portion of the image is displayed, and I get an error log saying:
<Error>: ImageIO: JPEG Corrupt JPEG data: premature end of data segment
This led me to believe that not all the data came across the stream. You'll see in the code that I have an NSLog that says 'Received Data' every time the EventHasBytesAvailable case is called, and 'show image' when the UIImageView is set with the image data.
The thing I find odd, and what I feel is the source of this problem, is that when EventHasBytesAvailable is called, the 'Received Data' message is logged, then the 'show image' message is logged, then once again the 'Received Data' message is logged, and the error listed above is then logged.
So it looks like a small portion of the data comes in through the stream, the loop gathers up those bytes and sticks them in the UIImageView, then more bytes come in through the stream, and an attempt to put them into the UIImageView is made but the 'premature end of data segment' error occurs.
I am very confused as to why this is happening. Shouldn't the whole data of the image be sent through the stream with one call of the EventHasBytesAvailable case? Possibly I have overlooked the buffer in my code? Can my buffer take an image of 60 kB? That is the only thing I can think of that might be wrong with the application code; otherwise, all I can think of is that maybe the Python script is sending the data in two chunks instead of one.
Thank you for your time. I am an intern that has hit a bit of a wall with this one! Any help will be greatly appreciated!

Fixed this problem. The stream was sending the data over in more than one call of the 'HasBytes' case, so I created a string property that gets appended with each chunk of the data whenever 'HasBytes' gets called. I also used a different method for converting the image data string to an NSData object.
NSString *ImgStr = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
// string property for appending
//
_str = [_str stringByAppendingString:ImgStr];
NSData *ImgData = [[NSData alloc] initWithBase64EncodedString:_str options:NSDataBase64DecodingIgnoreUnknownCharacters];
if (nil != ImgData)
{
    self.ImageView.image = [UIImage imageWithData:ImgData];
}
Thanks very much!
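For what it's worth, a slightly more robust variant of this fix is sketched below, assuming the Twisted script above still terminates each payload with '\n'. It accumulates raw bytes in an NSMutableData (a hypothetical _recvData ivar, reset when the stream opens) and only base64-decodes once the delimiter has arrived:

case NSStreamEventHasBytesAvailable: {
    uint8_t buffer[4096];
    while ([_inputStream hasBytesAvailable]) {
        NSInteger len = [_inputStream read:buffer maxLength:sizeof(buffer)];
        if (len > 0) {
            [_recvData appendBytes:buffer length:len];
        }
    }
    // The Twisted server appends '\n' after the base64 payload,
    // so only decode once that trailing byte has arrived.
    if (_recvData.length > 0 &&
        ((const uint8_t *)_recvData.bytes)[_recvData.length - 1] == '\n') {
        NSString *b64 = [[NSString alloc] initWithData:_recvData encoding:NSASCIIStringEncoding];
        NSData *imgData = [[NSData alloc] initWithBase64EncodedString:b64
                                                              options:NSDataBase64DecodingIgnoreUnknownCharacters];
        if (imgData) {
            self.ImageView.image = [UIImage imageWithData:imgData];
        }
        [_recvData setLength:0];
    }
    break;
}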

Related

Read "structured/serialized data" from NSStream

I am developing a game app that takes structured data from a server and responds based on that data.
I have connected the app to the internet through an NSStream. Specifically, I followed Apple's guide, so my code looks something like:
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch(eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            if(!_data) {
                _data = [[NSMutableData data] retain];
            }
            uint8_t buf[1024];
            NSInteger len = 0;
            len = [(NSInputStream *)stream read:buf maxLength:1024];
            if(len) {
                [_data appendBytes:(const void *)buf length:len];
                // bytesRead is an instance variable of type NSNumber.
                [bytesRead setIntValue:[bytesRead intValue]+len];
            } else {
                NSLog(@"no buffer!");
            }
            break;
        }
        // continued
My problem is: how can I convert the _data to the format I want when it is possible? For example, my server will send two types of data. For each datum, the first byte is an indicator of the data type (e.g., 1 means data type 1, 2 means data type 2). For data type 1, an int (4 bytes) is sent as the data itself. For data type 2, an int (4 bytes) is sent as the size of the following string bytes. For example, if this int is 10, then there will be 10 more bytes sent from the server to form a string for the client.
The code in my Android (Java) app looks like:
// dataIn = new DataInputStream(socket.getInputStream());
private void keepPacketRecving(){ // this will be executed in a separate thread
    while(keepRecvThreadRunning) {
        try {
            byte type;
            type = dataIn.readByte();
            if(type == 1){
                int data = dataIn.readInt();
                getType1Data(data); // callback function for receiving type 1 data (int)
            } else if (type == 2) {
                int dataSize = dataIn.readInt();
                byte[] buf = new byte[dataSize];
                // ... loop to read enough bytes into buf ...
                String data = new String(buf);
                getType2Data(data); // callback function for receiving type 2 data (String)
            }
        }
    }
}
I also notice I can't have a while-loop inside the switch statement asking the input stream to read more data until there is enough, because it will hang the thread.
I guess the best way would be to always append all the bytes into _data (like the example) and have another function to parse _data? Then the problem is how I can peek at bytes in the data and fetch just part of it out (e.g., 10 bytes out of a 20-byte _data).
Is there any wrapper in Objective-C like the DataInputStream in Java?
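There is no built-in Objective-C wrapper quite like Java's DataInputStream, but the "append everything, parse separately" approach described above might look like the following sketch. The getType1Data:/getType2Data: callbacks mirror the Java code and are assumed to exist; everything else is illustrative.

// Parse as many complete messages as _data currently holds.
- (void)parseReceivedData {
    while (YES) {
        const uint8_t *bytes = _data.bytes;
        NSUInteger available = _data.length;
        if (available < 5) return;              // need type byte + 4-byte int
        uint8_t type = bytes[0];
        // DataInputStream.readInt is big-endian, so decode accordingly.
        uint32_t value = ((uint32_t)bytes[1] << 24) | ((uint32_t)bytes[2] << 16)
                       | ((uint32_t)bytes[3] << 8)  |  (uint32_t)bytes[4];
        if (type == 1) {
            [self getType1Data:(int)value];
            [_data replaceBytesInRange:NSMakeRange(0, 5) withBytes:NULL length:0];
        } else if (type == 2) {
            if (available < 5 + value) return;  // wait for the whole string
            NSString *str = [[NSString alloc] initWithBytes:bytes + 5
                                                     length:value
                                                   encoding:NSUTF8StringEncoding];
            [self getType2Data:str];
            [_data replaceBytesInRange:NSMakeRange(0, 5 + value) withBytes:NULL length:0];
        } else {
            return;                             // unknown type; bail out
        }
    }
}

Calling this at the end of every NSStreamEventHasBytesAvailable event keeps _data trimmed to whatever partial message is still in flight.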

Video streaming via NSInputStream and NSOutputStream

Right now I'm investigating the possibility of implementing video streaming through the MultipeerConnectivity framework. For that purpose I'm using NSInputStream and NSOutputStream.
The problem is: I can't receive any picture so far. Right now I'm trying to pass a simple picture and show it on the receiver. Here's a little snippet of my code:
Sending picture via NSOutputStream:
- (void)sendMessageToStream
{
    NSData *imgData = UIImagePNGRepresentation(_testImage);
    int img_length = (int)[imgData length];
    NSMutableData *msgData = [[NSMutableData alloc] initWithBytes:&img_length length:sizeof(img_length)];
    [msgData appendData:imgData];
    int msg_length = (int)[msgData length];
    uint8_t *readBytes = (uint8_t *)[msgData bytes];
    uint8_t buf[msg_length];
    (void)memcpy(buf, readBytes, msg_length);
    int stream_len = [_stream write:(uint8_t *)buf maxLength:msg_length];
    //int stream_len = [_stream write:(uint8_t *)buf maxLength:data_length];
    //NSLog(@"stream_len = %d", stream_len);
    _tmpCounter++;
    dispatch_async(dispatch_get_main_queue(), ^{
        _lblOperationsCounter.text = [NSString stringWithFormat:@"Sent: %ld", (long)_tmpCounter];
    });
}
The code above works totally fine. stream_len after writing equals 29627 bytes, which is the expected value because the image's size is around 25-26 kB.
Receiving the picture via NSInputStream:
- (void)readDataFromStream
{
    UInt32 length;
    if (_currentFrameSize == 0) {
        uint8_t frameSize[4];
        length = [_stream read:frameSize maxLength:sizeof(int)];
        unsigned int b = frameSize[3];
        b <<= 8;
        b |= frameSize[2];
        b <<= 8;
        b |= frameSize[1];
        b <<= 8;
        b |= frameSize[0];
        _currentFrameSize = b;
    }
    uint8_t bytes[1024];
    length = [_stream read:bytes maxLength:1024];
    [_frameData appendBytes:bytes length:length];
    if ([_frameData length] >= _currentFrameSize) {
        UIImage *img = [UIImage imageWithData:_frameData];
        NSLog(@"SETUP IMAGE!");
        _imgView.image = img;
        _currentFrameSize = 0;
        [_frameData setLength:0];
    }
    _tmpCounter++;
    dispatch_async(dispatch_get_main_queue(), ^{
        _lblOperationsCounter.text = [NSString stringWithFormat:@"Received: %ld", (long)_tmpCounter];
    });
}
As you can see, I'm trying to receive the picture in several steps, and here's why. When I'm trying to read data from the stream, it always reads a maximum of 1095 bytes no matter what number I put in the maxLength: parameter. But when I send the picture in the first snippet of code, it sends absolutely fine (29627 bytes; by the way, the image's size is around 29 kB).
That's where my question comes up: why is that? Why does sending 29 kB via NSOutputStream work totally fine when receiving causes problems? And is there a solid way to make video streaming work through NSInputStream and NSOutputStream? I just didn't find much information about this technology; all I found were some simple things that I knew already.
Here's an app I wrote that shows you how:
https://app.box.com/s/94dcm9qjk8giuar08305qspdbe0pc784
Build the project with Xcode 9 and run the app on two iOS 11 devices.
To stream live video, touch the Camera icon on one of two devices.
If you don't have two devices, you can run one app in the Simulator; however, you can only use the camera on the real device (the Simulator will display the video being broadcast).
Just so you know: this is not the ideal way to stream real-time video between devices (it should probably be your last choice). Data packets (versus streaming) are way more efficient and faster.
Regardless, I'm really confused by your NSInputStream-related code. Here's something that makes a little more sense, I think:
case NSStreamEventHasBytesAvailable: {
    // len is a global variable set to a non-zero value;
    // mdata is an NSMutableData object that is reset when a new input
    // stream is created.
    // displayImage is a block that accepts the image data and a reference
    // to the layer on which the image will be rendered
    uint8_t buf[len];
    len = [(NSInputStream *)aStream read:buf maxLength:len];
    if (len > 0) {
        [mdata appendBytes:(const void *)buf length:len];
    } else {
        displayImage(mdata, wLayer);
    }
    break;
}
The output stream code should look something like this:
// data is an NSData object that contains the image data from the video
// camera;
// len is a global variable set to a non-zero value
// byteIndex is a global variable set to zero each time a new output
// stream is created
if (data.length > 0 && len >= 0 && (byteIndex <= data.length)) {
    len = (data.length - byteIndex) < DATA_LENGTH ? (data.length - byteIndex) : DATA_LENGTH;
    uint8_t bytes[len];
    [data getBytes:bytes range:NSMakeRange(byteIndex, len)];
    byteIndex += [oStream write:(const uint8_t *)bytes maxLength:len];
}
There's a lot more to streaming video than setting up the NSStream classes correctly—a lot more. You'll notice in my app, I created a cache for the input and output streams. This solved a myriad of issues that you would likely encounter if you don't do the same.
I have never seen anyone successfully use NSStreams for video streaming...ever. It's highly complex, for one reason.
There are many different (and better) ways to stream video; I wouldn't go this route. I just took it on because no one else has been able to do it successfully.
I think the problem is in your assumption that all data will be available in the NSInputStream the whole time you are reading it. An NSInputStream made from an NSURL object has an asynchronous nature, and it should be accessed accordingly using an NSStreamDelegate. You can look at the example in the README of POSInputStreamLibrary.
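For illustration, a minimal sketch of that delegate-driven pattern (names like _frameData are placeholders; the parsing step is whatever framing your protocol uses):

// Schedule the stream on a run loop and let the delegate deliver bytes
// as they arrive, instead of polling read: in a loop.
- (void)startReading:(NSInputStream *)stream {
    stream.delegate = self; // self conforms to NSStreamDelegate
    [stream scheduleInRunLoop:[NSRunLoop currentRunLoop]
                      forMode:NSDefaultRunLoopMode];
    [stream open];
}

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    if (eventCode == NSStreamEventHasBytesAvailable) {
        uint8_t buf[1024];
        NSInteger len = [(NSInputStream *)aStream read:buf maxLength:sizeof(buf)];
        if (len > 0) {
            [_frameData appendBytes:buf length:len];
            // parse _frameData here once a complete frame has accumulated
        }
    }
}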

make didUpdateValueForCharacteristic asynchronous

Currently I am working on a project that uploads a file through BLE (iOS).
I have a loop to send all the data in the file, and I close the loop once my pointer in the file has reached the end.
On the firmware side, the characteristic I am writing to updates from 0 to 100 as a progress indication of how many bytes the firmware has received.
My didUpdateValueForCharacteristic is only being called after my loop ends, because I believe they are synchronous and both working on the main thread.
I am trying to make my loop asynchronous so I can still listen for didUpdateValueForCharacteristic while I am sending:
dispatch_async(_sendFileQueue, ^{
    while (_isFileSending) {
        // File is still sending
        // Upload file from offset
        [self uploadFile:_offset];
    }
});
I tried this, but I still only get the update at the end of the loop, and I don't know if it is because of the connection interval that I don't get a response on the iOS end. I know for certain the firmware is updating the characteristic as I am sending the file.
Source for uploadFile:
- (void)uploadFile:(int)from {
    // Look at file from index 'from'
    [_filePath seekToFileOffset:from];
    NSData *soundData = [_filePath readDataOfLength:_packetSize];
    // data should be greater than 0 bytes
    if (soundData.length > 0) {
        [self connectToPeripherals:soundData];
        // Find new pointer
        _offset += _packetSize;
    }
}

- (void)connectToPeripherals:(NSData *)data {
    // Using NSNumber since I want to check if I get a nil returned
    NSNumber *writeWithResponse = nil;
    CBPeripheral *firstPeripheral = self.peripherals[0];
    CBCharacteristic *firstCharacteristics = self.characteristics[0];
    // Check if properties have a response, if so write with response
    if ((self.characteristicManager.properties & CBCharacteristicPropertyWrite)
        == CBCharacteristicPropertyWrite) {
        DLog(@"Write with response");
        writeWithResponse = [NSNumber numberWithBool:YES];
    } else if ((self.characteristicManager.properties &
                CBCharacteristicPropertyWriteWithoutResponse) ==
               CBCharacteristicPropertyWriteWithoutResponse) {
        DLog(@"Write without response");
        writeWithResponse = [NSNumber numberWithBool:NO];
    }
    // There is no write property
    if (!writeWithResponse) {
        DLog(@"ERROR! Could not write to characteristics");
        return;
    }
    // Write to the first peripheral
    [firstPeripheral writeValue:data forCharacteristic:firstCharacteristics
                           type:[writeWithResponse boolValue] ?
                                CBCharacteristicWriteWithResponse : CBCharacteristicWriteWithoutResponse];
}
tl;dr: make didUpdateValueForCharacteristic get called while the file is still sending.
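Since no answer is recorded here, one common pattern is sketched below: rather than a tight while-loop, send one chunk and let a delegate callback trigger the next, so CoreBluetooth delegate methods keep getting serviced. This is illustrative only; _offset, _packetSize, _filePath and _isFileSending are the asker's variables, and the peripheral/characteristic lookups are assumptions.

// Send a single chunk; the next one is triggered from the delegate.
- (void)sendNextChunk {
    [_filePath seekToFileOffset:_offset];
    NSData *chunk = [_filePath readDataOfLength:_packetSize];
    if (chunk.length == 0) {          // reached end of file
        _isFileSending = NO;
        return;
    }
    CBPeripheral *peripheral = self.peripherals[0];
    CBCharacteristic *characteristic = self.characteristics[0];
    [peripheral writeValue:chunk
         forCharacteristic:characteristic
                      type:CBCharacteristicWriteWithResponse];
    _offset += chunk.length;
}

// CBPeripheralDelegate: fires when a with-response write is acknowledged.
- (void)peripheral:(CBPeripheral *)peripheral
didWriteValueForCharacteristic:(CBCharacteristic *)characteristic
             error:(NSError *)error {
    if (!error && _isFileSending) {
        [self sendNextChunk];
    }
}

Because the run loop is never blocked between chunks, didUpdateValueForCharacteristic notifications from the progress characteristic can be delivered while the transfer is in flight.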

NSMutableData encryption in place using NSInputStream

I am trying to use CommonCrypto to encrypt an NSMutableData object in place (copying the resulting bytes over itself, without duplicating it). Previously, I was using the CCCrypt() "one-shot" method, mainly because it seemed simple. I noticed that my data object got duplicated in memory.
To avoid this, I tried using an NSInputStream object with a buffer size of 2048 bytes. I am reading my NSMutableData object and continuously calling CCCryptorUpdate() to handle the encryption. The problem is that it still seems to be duplicated. Here's my current code (please note that it's a category on NSMutableData, mainly for historical reasons, thus the "self" references):
- (BOOL)encryptWithKey:(NSString *)key
{
    // Key creation - not relevant to the described problem
    char *keyPtr = calloc(1, kCCKeySizeAES256+1);
    [key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];
    // Create cryptographic context for encryption
    CCCryptorRef cryptor;
    CCCryptorStatus status = CCCryptorCreate(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode, keyPtr, kCCKeySizeAES256, NULL, &cryptor);
    if (status != kCCSuccess)
    {
        MCLog(@"Failed to create a cryptographic context (%d CCCryptorStatus status).", status);
    }
    // Initialize the input stream
    NSInputStream *inStream = [[NSInputStream alloc] initWithData:self];
    [inStream open];
    NSInteger result;
    // BUFFER_LEN is a define of 2048
    uint8_t buffer[BUFFER_LEN];
    size_t bytesWritten;
    while ([inStream hasBytesAvailable])
    {
        result = [inStream read:buffer maxLength:BUFFER_LEN];
        if (result > 0)
        {
            // Encryption goes here
            status = CCCryptorUpdate(
                cryptor,             // Previously created cryptographic context
                &result,             // Input data
                BUFFER_LEN,          // Length of the input data
                [self mutableBytes], // Result is written here
                [self length],       // Size of result
                &bytesWritten        // Number of bytes written
            );
            if (status != kCCSuccess)
            {
                MCLog(@"Error during data encryption (%d CCCryptorStatus status)", status);
            }
        }
        else
        {
            // Error
        }
    }
    // Cleanup
    [inStream close];
    CCCryptorRelease(cryptor);
    free(keyPtr);
    return (status == kCCSuccess);
}
I am definitely missing something obvious here; encryption, and even using input streams, is a bit new to me.
As long as you only call CCUpdate() one time, you can encrypt into the same buffer you read from without using a stream. See RNCryptManager.m for an example. Study applyOperation:fromStream:toStream:password:error:. I did use streams here, but there's no requirement that you do that if you already have an NSData.
You must ensure that CCUpdate() is only called one time, however. If you call it multiple times it will corrupt its own buffer. This is an open bug in CommonCryptor (radar://9930555).
As a side note: your key generation is extremely insecure, and use of ECB mode for this kind of data barely qualifies as encryption. It leaves patterns in the ciphertext which can be used to decrypt the data, in some cases just by looking at it. I do not recommend this approach if you actually intend to secure this data. If you want to study how to use these tools well, see Properly Encrypting With AES With CommonCrypto. If you want a prepackaged solution, see RNCryptor. (RNCryptor does not currently have a convenient method for encrypting in-place, however.)
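As a concrete contrast to the ECB call in the question, a CBC-mode context with a random IV might be created as in the sketch below (key derivation is still out of scope here, and keyPtr is the question's variable):

// #import <Security/Security.h> for SecRandomCopyBytes
uint8_t iv[kCCBlockSizeAES128];
SecRandomCopyBytes(kSecRandomDefault, sizeof(iv), iv); // random IV, stored/sent alongside the ciphertext
CCCryptorRef cryptor;
CCCryptorStatus status = CCCryptorCreate(kCCEncrypt,
                                         kCCAlgorithmAES128,
                                         kCCOptionPKCS7Padding, // CBC is the default mode
                                         keyPtr, kCCKeySizeAES256,
                                         iv,                    // non-NULL IV enables real CBC
                                         &cryptor);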
In the line:
result = [inStream read:buffer maxLength:BUFFER_LEN];
the data is read into buffer, and result is set to the outcome of the read.
In the lines:
status = CCCryptorUpdate(cryptor, &result, ...
you should be using buffer for the input data, not the read status:
status = CCCryptorUpdate(cryptor, buffer, ...
Using better names would help eliminate this simple error. If the variable had been named readStatus instead of result, the error would most likely not have occurred. Likewise, if the data variable had been named streamData instead of buffer, things would also have been clearer. Poor naming really can cause errors.
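Putting that fix together, the inner loop might look like the following sketch. The outOffset variable is illustrative (not in the original code); it advances the in-place output pointer, and the actual byte count from read: is passed instead of BUFFER_LEN:

size_t outOffset = 0;
while ([inStream hasBytesAvailable])
{
    NSInteger bytesRead = [inStream read:buffer maxLength:BUFFER_LEN];
    if (bytesRead > 0)
    {
        size_t bytesWritten = 0;
        status = CCCryptorUpdate(
            cryptor,
            buffer,                                     // the data that was read
            (size_t)bytesRead,                          // the actual number of bytes read
            (uint8_t *)[self mutableBytes] + outOffset, // write each chunk after the last
            [self length] - outOffset,                  // space remaining in the output
            &bytesWritten
        );
        if (status != kCCSuccess) break;
        outOffset += bytesWritten;
    }
}

Note that the caveat above still applies: if the input and output buffers overlap within a single CCCryptorUpdate() call, the in-place bug the answer mentions can corrupt the data.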

Can't read custom file from NSInputStream

I am trying to read a custom file from my documents directory via NSInputStream to upload it to an FTP server. I'm using the exact same code as demonstrated in the SimpleFTPSample provided by Apple. It seems to work fine as long as the file is empty, but as soon as it contains data it fails.
Here's my code. This is how I create the input stream, I tried it both ways:
//Approach one: Init with path
self.fileStream = [NSInputStream inputStreamWithFileAtPath:filePath];
//Approach two: Init with data
NSData* fileData = [NSData dataWithContentsOfFile:filePath];
self.fileStream = [NSInputStream inputStreamWithData:fileData];
If I init the stream with data, I get EXC_BAD_ACCESS (code=1, address=0x0) when invoking read on the stream (code snippet below); if I use the path, it jumps right to "File read error".
filePath is @"/var/mobile/Applications/94292A2A-37FC-4D8E-BDA6-A26963932AE6/Documents/1395576645.cpn", and the NSData returns properly with 806 bytes.
That's part of the stream:handleEvent: delegate method:
if (self.bufferOffset == self.bufferLimit) {
    NSInteger bytesRead;
    bytesRead = [self.fileStream read:self.buffer maxLength:kSendBufferSize];
    if (bytesRead == -1) {
        if (kPrintsToConsole) NSLog(@"File read error");
    } else if (bytesRead == 0) {
        [self stopSending];
    } else {
        self.bufferOffset = 0;
        self.bufferLimit = bytesRead;
    }
}
I'm kinda stuck. Hope you guys can help me out. Running iOS 7.1 & Xcode 5.1
You need to call [self.fileStream open] before reading, for both the file and data approaches.
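A minimal sketch of the fix, using the asker's property names:

self.fileStream = [NSInputStream inputStreamWithFileAtPath:filePath];
[self.fileStream open]; // without this, read:maxLength: fails immediately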
