In an iPhone app, I create a CFSocket object from an existing native UDP socket and register a data callback that fires whenever the socket receives data. I then add it as a source on my main run loop:
//Set socket descriptor field
cbData.s = udpSocket.getSocketDescriptor();
CFSocketContext udpSocketContext;
memset(&udpSocketContext, 0, sizeof(udpSocketContext));
udpSocketContext.info = &cbData;
cbData.socketRef = CFSocketCreateWithNative(NULL, cbData.s, kCFSocketDataCallBack, &getSocketDataCallBack, &udpSocketContext);
cbData.runLoopSourceRef = CFSocketCreateRunLoopSource( NULL, cbData.socketRef, 0);
CFRunLoopAddSource(CFRunLoopGetMain(), cbData.runLoopSourceRef, kCFRunLoopCommonModes);
I send 1024-byte datagrams over WiFi from a separate Mac server app every 5 ms and receive them on my iPhone in my getSocketDataCallBack routine.
I expect getSocketDataCallBack to be called every 5 ms (to match the period of the datagrams sent from the Mac), and most of the time it is. BUT, the calls often get delayed by tens or hundreds of ms. Afterwards I get a rapid sequence of callbacks (fractions of a ms apart) to retrieve the multiple datagrams that piled up during the delay.
As iOS obviously keeps the delayed datagrams around, is there any way to grab all the delayed datagrams from the system at once, instead of getSocketDataCallBack being called over and over in quick succession?
[I do query how many bytes are available in the callback, like so:
CFDataRef dataRef = (CFDataRef)data;
numBytesReceived = CFDataGetLength(dataRef);
but 'numBytesReceived' is always reported as 1024, i.e. exactly one datagram at a time.]
Alternatively, is there any way to improve/lessen the socket callback timing variability through other means?
I'm using a socket callback for inter-process communication (actually inter-thread communication) over a UNIX-domain socket. The way the socket is used is identical to TCP/UDP.
The code below is written in C/Objective-C and uses POSIX threads; translating it to Swift/NSThread should not be difficult.
Note that the program below works as the server side, meaning it creates the socket that clients connect to. Once a client connects, the system automatically accepts the connection and allocates another file descriptor for reading/writing. The socket callbacks reflect this two-stage operation. Initially we create the socket and add it as a run-loop source so the system can invoke the callback when a client attempts to connect. The system accepts the connection, allocates a file descriptor for reading/writing with the client, and passes it to the callback. We then create another run-loop source from that read/write fd and add it to the run loop. This second callback is called when rx/tx data is ready.
MAIN THREAD:
The main thread creates the UNIX socket and the worker thread. The socket fd is passed as the argument of the worker thread.
#import <stdio.h>
#import <string.h>
#import <stdlib.h>
#import <unistd.h>
#import <pthread.h>
#import <sys/socket.h>
#import <sys/un.h>
#import <sys/stat.h>
#import <sys/types.h>
#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
void *workerThread(void *tprm); // forward declaration, defined below

int setup(const char *ipcNode) {
int sockfd = socket(AF_UNIX, SOCK_STREAM, 0);
if (sockfd == -1) {
return -1;
}
struct sockaddr_un sa = {0};
sa.sun_len = sizeof(sa);
sa.sun_family = AF_UNIX;
strlcpy(sa.sun_path, ipcNode, sizeof(sa.sun_path)); // bounded copy into the fixed-size sun_path
remove(sa.sun_path);
if (bind(sockfd, (struct sockaddr*)&sa, sizeof(struct sockaddr_un)) == -1) {
close(sockfd);
return -1;
}
// start up worker thread
pthread_attr_t at;
pthread_attr_init(&at);
pthread_attr_setdetachstate(&at, PTHREAD_CREATE_DETACHED);
pthread_t th;
pthread_create(&th, &at, workerThread, (void *)(long)(sockfd));
return 1;
}
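A hypothetical call site (the socket path is just an example, not from the original):

// setup() returns 1 on success, -1 on failure
if (setup("/tmp/com.example.ipc.sock") == -1) {
    // handle setup failure
}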
WORKER THREAD:
The program works as a server, so it waits for a client to connect (via connect()). Once the client connects, the system automatically calls accept() and allocates a read/write fd for communicating with the client. This fd is passed to the accept callback socketDataCallback(), which then registers another callback, clientDataCallback(), for the read/write fd.
// worker thread
//
void *workerThread(void *tprm) {
int sockfd = (int)(long)tprm; // cast via long to avoid truncating a 64-bit pointer
int retval = listen(sockfd, 1); // mark as "server" side; here, accept only 1 connection request at a time
if (retval != 0) {
return NULL;
}
// create CFSocket and register it as data source.
CFSocketRef socket = CFSocketCreateWithNative(kCFAllocatorDefault, sockfd, kCFSocketAcceptCallBack, socketDataCallback, nil);
// don't close native fd on CFSocketInvalidate
CFSocketSetSocketFlags(socket, CFSocketGetSocketFlags(socket) & ~kCFSocketCloseOnInvalidate);
// create run loop source
CFRunLoopSourceRef socketRunLoop = CFSocketCreateRunLoopSource(kCFAllocatorDefault, socket, 0);
// add to run loop
CFRunLoopAddSource(CFRunLoopGetCurrent(), socketRunLoop, kCFRunLoopCommonModes);
CFRelease(socketRunLoop);
CFRelease(socket);
CFRunLoopRun();
// does not return here until the run loop stops
close(sockfd);
return NULL;
}
// socket connection w/ client side. create another data source and add to run-loop
//
void socketDataCallback(CFSocketRef s, CFSocketCallBackType callbackType, CFDataRef address, const void *data, void *info) {
CFSocketContext socketContext;
memset(&socketContext, 0, sizeof(CFSocketContext));
int clientfd = *((int *)data); // get file descriptor (fd)
socketContext.info = (void *)((long)clientfd); // set fd at info of socketContext
// create CFSocket for tx/rx w/ connected client
CFSocketRef socket = CFSocketCreateWithNative(kCFAllocatorDefault, clientfd, kCFSocketReadCallBack | kCFSocketWriteCallBack, clientDataCallback, &socketContext);
CFSocketDisableCallBacks(socket, kCFSocketWriteCallBack);
CFRunLoopSourceRef socketRunLoop = CFSocketCreateRunLoopSource(kCFAllocatorDefault, socket, 0);
CFRunLoopAddSource(CFRunLoopGetCurrent(), socketRunLoop, kCFRunLoopCommonModes);
CFRelease(socket);
CFRelease(socketRunLoop);
}
// data to/from client
//
void clientDataCallback(CFSocketRef s, CFSocketCallBackType callbackType, CFDataRef address, const void *data, void *info) {
if (callbackType & kCFSocketWriteCallBack) {
// your own tx data process here
// txDataCallback(s, callbackType, address, data, info);
}
if (!(callbackType & kCFSocketReadCallBack)) return;
// extract fd
int fd = (int)((long)info);
// read data, and do some work
uint8_t rxdata[1024];
ssize_t nr = read(fd, rxdata, sizeof(rxdata));
if (nr <= 0) {
// socket closed (0) or read error (-1)
handleSocketClosed(s);
return;
}
// your own rx process here
}
// socket closed
//
void handleSocketClosed(CFSocketRef s) {
// any clean up process here, then
CFSocketInvalidate(s);
// stop run loop if necessary
// CFRunLoopStop(CFRunLoopGetCurrent());
}
If you are working on the client side, things get a bit easier. You get a read/write fd from the connect() call; you then create a CFSocketRef from that fd and add it to the run loop.
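A minimal sketch of that client side (assuming the same clientDataCallback shown above; error handling trimmed):

int connectToServer(const char *ipcNode) {
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if (fd == -1) return -1;
    struct sockaddr_un sa = {0};
    sa.sun_len = sizeof(sa);
    sa.sun_family = AF_UNIX;
    strlcpy(sa.sun_path, ipcNode, sizeof(sa.sun_path));
    if (connect(fd, (struct sockaddr *)&sa, sizeof(sa)) == -1) {
        close(fd);
        return -1;
    }
    // wrap the connected fd exactly like the accepted fd on the server side
    CFSocketContext ctx = { 0, (void *)(long)fd, NULL, NULL, NULL };
    CFSocketRef sock = CFSocketCreateWithNative(kCFAllocatorDefault, fd,
                                                kCFSocketReadCallBack,
                                                clientDataCallback, &ctx);
    CFRunLoopSourceRef src = CFSocketCreateRunLoopSource(kCFAllocatorDefault, sock, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), src, kCFRunLoopCommonModes);
    CFRelease(src);
    CFRelease(sock);
    return fd;
}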
Hope this helps.
EDIT: how to wait with POSIX select() instead. Waiting with select() on the worker thread is simpler than a socket callback. If you are on the client side, then:
int sockfd = socket(...);
bind(sockfd, ...);
connect(sockfd, ...);
while (1) {
int nfds = sockfd+1;
fd_set rfds;
FD_ZERO(&rfds);
FD_SET(sockfd, &rfds);
int retval = select(nfds, &rfds, NULL, NULL, NULL);
if (retval == -1) break;
if (retval > 0) {
uint8_t rxdata[1024];
ssize_t nr = read(sockfd, rxdata, sizeof(rxdata));
if (nr <= 0) {
// socket closed or read error
break;
}
// do your rx process here
}
}
Run the code above on your worker thread.
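On the server side the same idea needs one extra step: accept() the connection first, then select() on the read/write fd. A sketch, assuming a single client at a time:

int sockfd = socket(AF_UNIX, SOCK_STREAM, 0);
// bind() as in setup() above, then:
listen(sockfd, 1);
int clientfd = accept(sockfd, NULL, NULL); // blocks until a client connects
while (1) {
    fd_set rfds;
    FD_ZERO(&rfds);
    FD_SET(clientfd, &rfds);
    if (select(clientfd + 1, &rfds, NULL, NULL, NULL) <= 0) break;
    uint8_t rxdata[1024];
    ssize_t nr = read(clientfd, rxdata, sizeof(rxdata));
    if (nr <= 0) break; // socket closed or read error
    // do your rx process here
}
close(clientfd);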
I have an encrypted stream of data and I have implemented a function more or less similar to:
NSInteger DecryptContent(NSInputStream *inputStream,
NSOutputStream *outputStream,
NSData *key)
{
NSInteger totalNumberOfWrittenBytes = 0;
uint32_t recordSequenceNumber = 0;
NSMutableData *ciphertextInput = [NSMutableData dataWithLength:recordSize]; // recordSize: maximum record size, defined elsewhere
NSData *plaintextOutput = nil;
NSInteger recordDelimiterIndex = -1;
do {
CodingHeader *codingHeader = ReadCodingHeaderFromInputStream(inputStream);
NSInteger numberOfReadBytes = [inputStream read:ciphertextInput.mutableBytes maxLength:codingHeader.recordSize];
if (numberOfReadBytes <= 0) {
LogError(#"Error: Stream should not have ended");
return -1;
}
NSData *actualCiphertextInput = ciphertextInput;
// Last chunk
if (numberOfReadBytes != ciphertextInput.length) {
actualCiphertextInput = [ciphertextInput subdataWithRange:NSMakeRange(0, numberOfReadBytes)];
}
NSData *scrambledKey = ScrambleKeyWithRecordSequenceNumberAndSalt(recordSequenceNumber, codingHeader.salt);
plaintextOutput = Decrypt(actualCiphertextInput, scrambledKey);
recordDelimiterIndex = FindRecordDelimiterIndex(plaintextOutput);
if (recordDelimiterIndex < 0) {
LogError(#"Error: Delimiter not found");
return -2;
}
NSInteger numberOfWrittenBytes = [outputStream write:plaintextOutput.bytes maxLength:recordDelimiterIndex];
if (numberOfWrittenBytes == -1) {
LogError(#"Error writing bytes: %#", outputStream.streamError);
return -3;
}
totalNumberOfWrittenBytes += numberOfWrittenBytes;
recordSequenceNumber++;
} while (((uint8_t *)plaintextOutput.bytes)[recordDelimiterIndex] != LastRecordDelimiterByte);
return totalNumberOfWrittenBytes;
}
This is not ideal because it's a blocking function that polls the streams. What's a good approach for adapting this code into an NSOutputStream subclass that transparently decrypts data on the fly? Are there other async alternatives?
Do I have to override - (NSInteger)write:(const uint8_t *)buffer maxLength:(NSUInteger)length and just manage the decryption using my own intermediary buffer, or is there a better/simpler approach?
If I have to manage my own buffer, not being able to use NSInputStream to conveniently read data (and having to use buffer offsets, concatenate several reads into one encrypted record, etc.) seems like a huge pain.
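For what it's worth, here's a skeleton of the buffering approach described above. It is only a sketch: NSOutputStream is a class cluster, so a real subclass would also have to implement open, close, streamStatus and the other stream primitives, and every class name and helper below is hypothetical.

// Hypothetical sketch: buffer incoming ciphertext, decrypt whole records,
// forward the plaintext to a wrapped target stream.
@interface DecryptingOutputStream : NSOutputStream
@property (nonatomic) NSOutputStream *targetStream; // receives plaintext
@property (nonatomic) NSMutableData *pending;       // buffered ciphertext
- (BOOL)hasCompleteRecord;      // hypothetical: full CodingHeader + record buffered?
- (NSData *)decryptNextRecord;  // hypothetical: consume one record, return plaintext
@end

@implementation DecryptingOutputStream
- (NSInteger)write:(const uint8_t *)buffer maxLength:(NSUInteger)length {
    [self.pending appendBytes:buffer length:length];
    // drain as many complete records as the buffer currently holds
    while ([self hasCompleteRecord]) {
        NSData *plaintext = [self decryptNextRecord];
        [self.targetStream write:plaintext.bytes maxLength:plaintext.length];
    }
    return (NSInteger)length; // all ciphertext bytes were accepted
}
@end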
Right now I'm investigating the possibility of implementing video streaming through the MultipeerConnectivity framework. For that purpose I'm using NSInputStream and NSOutputStream.
The problem is that I can't receive any picture so far. Right now I'm trying to pass a simple picture and show it on the receiver. Here's a little snippet of my code:
Sending picture via NSOutputStream:
- (void)sendMessageToStream
{
NSData *imgData = UIImagePNGRepresentation(_testImage);
int img_length = (int)[imgData length];
NSMutableData *msgData = [[NSMutableData alloc] initWithBytes:&img_length length:sizeof(img_length)];
[msgData appendData:imgData];
int msg_length = (int)[msgData length];
uint8_t *readBytes = (uint8_t *)[msgData bytes];
uint8_t buf[msg_length];
(void)memcpy(buf, readBytes, msg_length);
int stream_len = (int)[_stream write:(uint8_t *)buf maxLength:msg_length];
//NSLog(@"stream_len = %d", stream_len);
_tmpCounter++;
dispatch_async(dispatch_get_main_queue(), ^{
_lblOperationsCounter.text = [NSString stringWithFormat:@"Sent: %ld", (long)_tmpCounter];
});
}
The code above works totally fine. stream_len after writing equals 29627 bytes, which is the expected value because the image's size is around 29 KB.
Receiving the picture via NSInputStream:
- (void)readDataFromStream
{
UInt32 length;
if (_currentFrameSize == 0) {
uint8_t frameSize[4];
length = [_stream read:frameSize maxLength:sizeof(frameSize)];
unsigned int b = frameSize[3];
b <<= 8;
b |= frameSize[2];
b <<= 8;
b |= frameSize[1];
b <<= 8;
b |= frameSize[0];
_currentFrameSize = b;
}
uint8_t bytes[1024];
length = [_stream read:bytes maxLength:sizeof(bytes)];
[_frameData appendBytes:bytes length:length];
if ([_frameData length] >= _currentFrameSize) {
UIImage *img = [UIImage imageWithData:_frameData];
NSLog(@"SETUP IMAGE!");
dispatch_async(dispatch_get_main_queue(), ^{
_imgView.image = img; // UI updates belong on the main queue
});
_currentFrameSize = 0;
[_frameData setLength:0];
}
_tmpCounter++;
dispatch_async(dispatch_get_main_queue(), ^{
_lblOperationsCounter.text = [NSString stringWithFormat:@"Received: %ld", (long)_tmpCounter];
});
}
As you can see, I'm trying to receive the picture in several steps, and here's why: when I try to read data from the stream, it always reads at most 1095 bytes no matter what number I put in the maxLength: parameter. But when I send the picture in the first snippet of code, it sends absolutely fine (all 29627 bytes; the image's size is around 29 KB).
That's where my question comes up: why is that? Why does sending 29 KB via NSOutputStream work totally fine while receiving causes problems? And is there a solid way to make video streaming work through NSInputStream and NSOutputStream? I just didn't find much information about this technology; all I found were some simple things which I knew already.
Here's an app I wrote that shows you how:
https://app.box.com/s/94dcm9qjk8giuar08305qspdbe0pc784
Build the project with Xcode 9 and run the app on two iOS 11 devices.
To stream live video, touch the Camera icon on one of two devices.
If you don't have two devices, you can run one app in the Simulator; however, you can only use the camera on the real device (the Simulator will display the video broadcasted).
Just so you know: this is not the ideal way to stream real-time video between devices (it should probably be your last choice). Data packets (versus streaming) are way more efficient and faster.
Regardless, I'm really confused by your NSInputStream-related code. Here's something that makes a little more sense, I think:
case NSStreamEventHasBytesAvailable: {
// len is a global variable set to a non-zero value;
// mdata is a NSMutableData object that is reset when a new input
// stream is created.
// displayImage is a block that accepts the image data and a reference
// to the layer on which the image will be rendered
uint8_t buf[len]; // a byte buffer, not an array of pointers
len = [(NSInputStream *)aStream read:buf maxLength:len];
if (len > 0) {
[mdata appendBytes:(const void *)buf length:len];
} else {
displayImage(mdata, wLayer);
}
break;
}
The output stream code should look something like this:
// data is an NSData object that contains the image data from the video
// camera;
// len is a global variable set to a non-zero value
// byteIndex is a global variable set to zero each time a new output
// stream is created
if (data.length > 0 && len >= 0 && (byteIndex <= data.length)) {
len = (data.length - byteIndex) < DATA_LENGTH ? (data.length - byteIndex) : DATA_LENGTH;
uint8_t bytes[len]; // a byte buffer, not an array of pointers
[data getBytes:bytes range:NSMakeRange(byteIndex, len)];
byteIndex += [oStream write:(const uint8_t *)bytes maxLength:len];
}
There's a lot more to streaming video than setting up the NSStream classes correctly—a lot more. You'll notice in my app, I created a cache for the input and output streams. This solved a myriad of issues that you would likely encounter if you don't do the same.
I have never seen anyone successfully use NSStreams for video streaming... ever. For one thing, it's highly complex.
There are many different (and better) ways to stream video; I wouldn't go this route. I only took it on because no one else had been able to do it successfully.
I think the problem is in your assumption that all the data will be available in the NSInputStream the whole time you are reading it. An NSInputStream made from an NSURL object has an asynchronous nature, and it should be accessed accordingly using an NSStreamDelegate. You can look at the example in the README of POSInputStreamLibrary.
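A minimal sketch of that delegate-driven pattern (the receivedData property is an assumption, not from the original code):

// schedule the stream on a run loop, then let the delegate drive the reads
- (void)startReading:(NSInputStream *)stream {
    stream.delegate = self;
    [stream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [stream open];
}

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable: {
            uint8_t buf[1024];
            NSInteger n = [(NSInputStream *)aStream read:buf maxLength:sizeof(buf)];
            if (n > 0) {
                [self.receivedData appendBytes:buf length:n]; // assumed NSMutableData property
            }
            break;
        }
        case NSStreamEventEndEncountered:
        case NSStreamEventErrorOccurred:
            [aStream close];
            [aStream removeFromRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
            break;
        default:
            break;
    }
}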
I'm fairly new to iOS programming and Objective-C. I have an embedded system that runs a program written in C that sends UDP packets to the iPhone app I am working on.
I am able to read the packet data (NSData) if it only contains a string, but cannot if the data is structured with additional markup.
Here is the C code that sends the packet.
typedef struct s_msg_temp_report {
uint8_t id0;
uint8_t id1;
uint8_t name[9];
uint8_t led;
uint32_t temp;
} t_msg_temp_report;
static t_msg_temp_report msg_temp_report =
{
.id0 = 0,
.id1 = 2,
.name = DEMO_PRODUCT_NAME,
.led = 0,
.temp = 0,
};
/* Send client report. */
msg_temp_report.temp = (uint32_t)(at30tse_read_temperature() * 100);
msg_temp_report.led = !port_pin_get_output_level(LED_0_PIN);
ret = sendto(tx_socket, &msg_temp_report, sizeof(t_msg_temp_report),
0, (struct sockaddr *)&addr, sizeof(addr));
if (ret == M2M_SUCCESS) {
puts("Assignment 3.3: sensor report sent");
} else {
puts("Assignment 3.3: failed to send status report !");
}
What is the best way to process the (NSData) object into a usable object for string conversion?
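One straightforward approach is a sketch like the following. It assumes the receiver mirrors the sender's struct layout and that both ends use little-endian byte order (which holds for the usual ARM/Intel combinations); handleReportData: is a hypothetical method name.

// Mirror of the sender's struct; the packed attribute makes the wire layout
// explicit (2 + 9 + 1 = 12 bytes, then the 4-byte temp at offset 12).
typedef struct __attribute__((packed)) {
    uint8_t  id0;
    uint8_t  id1;
    uint8_t  name[9];
    uint8_t  led;
    uint32_t temp;
} t_msg_temp_report;

- (void)handleReportData:(NSData *)data {
    if (data.length < sizeof(t_msg_temp_report)) return; // truncated packet
    t_msg_temp_report report;
    [data getBytes:&report length:sizeof(report)];
    // name may not be NUL-terminated, so bound the length explicitly
    NSString *name = [[NSString alloc] initWithBytes:report.name
                                              length:strnlen((const char *)report.name, sizeof(report.name))
                                            encoding:NSASCIIStringEncoding];
    NSLog(@"%@: led=%d temp=%.2f", name, report.led, report.temp / 100.0); // temp was scaled by 100 on the sender
}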
I am trying to learn how to get some sensors plugged into an Arduino board to talk to an iPhone over Bluetooth with a Red Bear Labs mini board but have hit a brick wall.
The sensors get a reading and this is sent to the phone over BLE. So far I've connected to the device and I get back what appears to be data but I can't make sense of it.
I've written a little sketch that looks like this, to simulate the sensor data.
#include <SoftwareSerial.h>
SoftwareSerial bluetooth(5, 6);
void setup() {
bluetooth.begin(57600);
}
void loop() {
//int reading = analogRead(2);
int reading = 123; // fake reading
byte lowerByte = (byte)(reading & 0xFF);
byte upperByte = (byte)((reading >> 8) & 0xFF);
bluetooth.write(reading);   // truncated to its low byte by SoftwareSerial
bluetooth.write(upperByte);
bluetooth.write(lowerByte);
delay(1000);
}
In iOS I send a call to read the data and then the data is received by a piece of code that looks something like:
- (void)peripheral:(CBPeripheral *)peripheral
didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic
error:(NSError *)error
{
Byte data[20];
static unsigned char buf[512];
static int len = 0;
NSInteger data_len;
if (!error && [characteristic.UUID isEqual:[CBUUID UUIDWithString:@RBL_CHAR_TX_UUID]]){
data_len = characteristic.value.length;
[characteristic.value getBytes:data length:data_len];
if (data_len == 20){
memcpy(&buf[len], data, 20);
len += data_len;
if (len >= 64){
[[self delegate] bleDidReceiveData:buf length:len];
len = 0;
}
} else if (data_len < 20) {
memcpy(&buf[len], data, data_len);
len += data_len;
[[self delegate] bleDidReceiveData:buf length:len];
len = 0;
}
}
}
But when I look at the data that comes back, it just makes no sense to me at all (I'll dig out an example as soon as I can).
Does anyone know a simple step I'm missing or a good example I could look at to try and better understand this?
I finally realised that the data was correct; I had to 'pull out' the values by bit-shifting.
UInt16 value;
UInt16 pin;
for (int i = 0; i < length; i += 3) { // each reading arrives as 3 bytes
pin = data[i]; // first byte: identifier
value = (data[i+1] << 8) | data[i+2]; // high byte, then low byte
NSLog(@"Pin: %d", pin);
NSLog(@"Value %d", value);
}