Make didUpdateValueForCharacteristic asynchronous - iOS

I am currently working on a project that uploads a file over BLE (iOS).
I have a loop that sends all the data in the file, and I exit the loop once my pointer has reached the end of the file.
On the firmware side, the characteristic I am writing to updates from 0 to 100 as a progress indication of how many bytes the firmware has received.
My didUpdateValueForCharacteristic is only being called after my loop ends, because I believe both are synchronous and running on the main thread.
I am trying to make my loop asynchronous so I can still receive didUpdateValueForCharacteristic callbacks while I am sending:
dispatch_async(_sendFileQueue, ^{
    while (_isFileSending) {
        // File is still sending
        // Upload file from offset
        [self uploadFile:_offset];
    }
});
So I tried this, but I still only get the update at the end of the loop. I don't know whether it is the connection interval that keeps me from getting a response on the iOS end. I know for certain the firmware is updating the characteristic value while I am sending the file.
Source for uploadFile
- (void)uploadFile:(int)from {
    // Seek to the given offset in the file (_filePath is an NSFileHandle)
    [_filePath seekToFileOffset:from];
    NSData *soundData = [_filePath readDataOfLength:_packetSize];
    // Data should be greater than 0 bytes
    if (soundData.length > 0) {
        [self connectToPeripherals:soundData];
        // Advance the pointer
        _offset += _packetSize;
    }
}
- (void)connectToPeripherals:(NSData *)data {
    // Using NSNumber since I want to check if I get nil returned
    NSNumber *writeWithResponse = nil;
    CBPeripheral *firstPeripheral = self.peripherals[0];
    CBCharacteristic *firstCharacteristic = self.characteristics[0];
    // Check if the properties include write with response; if so, write with response
    if ((self.characteristicManager.properties & CBCharacteristicPropertyWrite)
        == CBCharacteristicPropertyWrite) {
        DLog(@"Write with response");
        writeWithResponse = [NSNumber numberWithBool:YES];
    } else if ((self.characteristicManager.properties &
                CBCharacteristicPropertyWriteWithoutResponse) ==
               CBCharacteristicPropertyWriteWithoutResponse) {
        DLog(@"Write without response");
        writeWithResponse = [NSNumber numberWithBool:NO];
    }
    // There is no write property
    if (!writeWithResponse) {
        DLog(@"ERROR! Could not write to characteristic");
        return;
    }
    // Write to the first peripheral
    [firstPeripheral writeValue:data
              forCharacteristic:firstCharacteristic
                           type:[writeWithResponse boolValue] ?
                                CBCharacteristicWriteWithResponse : CBCharacteristicWriteWithoutResponse];
}
tl;dr: I want didUpdateValueForCharacteristic to fire while my send loop is still running.
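One pattern worth considering (a sketch, not from the original post) is to drop the while loop entirely and drive the transfer from the CBPeripheralDelegate callbacks: send one packet, then send the next from peripheral:didWriteValueForCharacteristic:error: (which fires only for writes with response). That leaves the run loop free to deliver didUpdateValueForCharacteristic: between packets. This assumes the _offset, _isFileSending, and uploadFile: ivars/methods above; startFileTransfer is a made-up entry point:

// Kick off the transfer by sending the first packet.
- (void)startFileTransfer {
    _offset = 0;
    _isFileSending = YES;
    [self uploadFile:_offset];
}

// Fires after each write-with-response completes; send the next packet here.
- (void)peripheral:(CBPeripheral *)peripheral
didWriteValueForCharacteristic:(CBCharacteristic *)characteristic
             error:(NSError *)error {
    if (error) {
        DLog(@"Write failed: %@", error);
        _isFileSending = NO;
        return;
    }
    if (_isFileSending) {
        [self uploadFile:_offset]; // uploadFile: advances _offset itself
    }
}

// Progress notifications can now arrive between packets.
- (void)peripheral:(CBPeripheral *)peripheral
didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic
             error:(NSError *)error {
    if (characteristic.value.length > 0) {
        // Assumes the firmware sends a one-byte 0-100 progress value.
        uint8_t progress = ((const uint8_t *)characteristic.value.bytes)[0];
        DLog(@"Firmware progress: %d%%", progress);
    }
}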

Related

Read "structured/serialized data" from NSStream

I am developing a game app that takes structured data from a server and responds based on that data.
I have connected the app to the internet through NSStream. Specifically, I followed Apple's stream programming guide, so my code looks something like:
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            if (!_data) {
                _data = [[NSMutableData data] retain];
            }
            uint8_t buf[1024];
            NSInteger len = 0;
            len = [(NSInputStream *)stream read:buf maxLength:1024];
            if (len > 0) {
                [_data appendBytes:(const void *)buf length:len];
                // bytesRead is an integer instance variable tracking the total.
                bytesRead += len;
            } else {
                NSLog(@"no buffer!");
            }
            break;
        }
        // continued
My problem is: how can I convert _data to the format I want once enough bytes are available? For example, my server sends two types of data. For each message, the first byte is an indicator of the data type (e.g., 1 means type 1, 2 means type 2). For type 1, an int (4 bytes) is sent as the data itself. For type 2, an int (4 bytes) is sent as the size of the string bytes that follow. For example, if this int is 10, the server sends 10 more bytes that form a string for the client.
The code in my Android (Java) app looks like:
// dataIn = new DataInputStream(socket.getInputStream());
private void keepPacketRecving() { // this runs on a separate thread
    while (keepRecvThreadRunning) {
        try {
            byte type = dataIn.readByte();
            if (type == 1) {
                int data = dataIn.readInt();
                getType1Data(data); // callback for receiving type 1 data (int)
            } else if (type == 2) {
                int dataSize = dataIn.readInt();
                byte[] buf = new byte[dataSize];
                // ... loop to read enough bytes into buf ...
                String data = new String(buf);
                getType2Data(data); // callback for receiving type 2 data (String)
            }
        } catch (IOException e) {
            keepRecvThreadRunning = false; // stop the loop on stream errors
        }
    }
}
I also notice I can't have a while loop inside the switch statement asking the input stream to read more data until there is enough, because that would hang the thread.
I guess the best way is to keep appending all incoming bytes to _data (as in the example) and have another function parse _data? Then the problem is how to peek at bytes in the data and fetch only part of it (e.g., 10 bytes out of a 20-byte _data).
Is there any wrapper in Objective-C like the DataInputStream in Java?
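There is no direct Foundation equivalent of DataInputStream, but the append-then-parse approach sketched above works well. Here is a minimal sketch under the stated message format (1 type byte, then a 4-byte int, assumed big-endian as Java's DataOutputStream writes it); parseReceivedData, getType1Data:, and getType2Data: are illustrative names, and it would be called right after appending to _data in the HasBytesAvailable case:

- (void)parseReceivedData {
    // Keep extracting messages while a complete one is buffered.
    while (_data.length >= 5) { // 1 type byte + 4-byte int
        const uint8_t *bytes = _data.bytes;
        uint8_t type = bytes[0];
        uint32_t value;
        memcpy(&value, bytes + 1, sizeof(value));
        value = CFSwapInt32BigToHost(value); // network byte order, like DataInputStream
        if (type == 1) {
            [_data replaceBytesInRange:NSMakeRange(0, 5) withBytes:NULL length:0];
            [self getType1Data:(int)value];
        } else if (type == 2) {
            if (_data.length < 5 + value) {
                return; // string bytes not fully arrived yet; wait for more
            }
            NSData *strData = [_data subdataWithRange:NSMakeRange(5, value)];
            NSString *str = [[NSString alloc] initWithData:strData
                                                  encoding:NSUTF8StringEncoding];
            [_data replaceBytesInRange:NSMakeRange(0, 5 + value) withBytes:NULL length:0];
            [self getType2Data:str];
        } else {
            return; // unknown type; bail out rather than loop forever
        }
    }
}

Because the parser only consumes a message once all of its bytes are present, partial messages simply stay in _data until the next HasBytesAvailable event fills them in.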

Deep copy of CMSampleBufferRef

I'm trying to perform a deep copy of a CMSampleBufferRef for an audio and video connection. I need to use this buffer for delayed processing. Can somebody help here by pointing to sample code?
Thanks
I solved this problem.
I needed access to the sample data for a long period of time, and tried many ways:
CVPixelBufferRetain -----> program crashed
CVPixelBufferPool -----> program crashed
CVPixelBufferCreateWithBytes ----> it can solve the problem, but reduces performance; Apple does not recommend doing it this way
CMSampleBufferCreateCopy ---> it works, and Apple recommends it.
From the documentation: To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped. If your application is causing samples to be dropped by retaining the provided CMSampleBuffer objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then calling CFRelease on the sample buffer (if it was previously retained) so that the memory it references can be reused.
Reference: https://developer.apple.com/reference/avfoundation/avcapturefileoutputdelegate/1390096-captureoutput
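For reference, the core of the recommended approach is a single call; a minimal sketch (variable names illustrative):

// Deep-copy the delegate-provided buffer so the capture pool can be recycled.
CMSampleBufferRef bufferCopy = NULL;
OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &bufferCopy);
if (status == noErr) {
    // ... hold on to bufferCopy as long as needed ...
    CFRelease(bufferCopy); // release when done with it
}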
The following might be what you need:
#pragma mark - captureOutput
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (connection == m_videoConnection) {
        /* If you have not consumed m_sampleBuffer yet, you must CFRelease it here,
           otherwise it causes samples to be dropped. */
        if (m_sampleBuffer) {
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = nil;
        }
        OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &m_sampleBuffer);
        if (noErr != status) {
            m_sampleBuffer = nil;
        }
        NSLog(@"m_sampleBuffer = %p sampleBuffer = %p", m_sampleBuffer, sampleBuffer);
    }
}
#pragma mark - get CVPixelBufferRef to use for a long period of time
- (ACResult)readVideoFrame:(CVPixelBufferRef *)pixelBuffer {
    while (1) {
        dispatch_sync(m_readVideoData, ^{
            if (!m_sampleBuffer) {
                _readDataSuccess = NO;
                return;
            }
            CMSampleBufferRef sampleBufferCopy = nil;
            OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, m_sampleBuffer, &sampleBufferCopy);
            if (noErr == status)
            {
                CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBufferCopy);
                *pixelBuffer = buffer;
                _readDataSuccess = YES;
                NSLog(@"m_sampleBuffer = %p", m_sampleBuffer);
                CFRelease(m_sampleBuffer);
                m_sampleBuffer = nil;
            }
            else {
                _readDataSuccess = NO;
                CFRelease(m_sampleBuffer);
                m_sampleBuffer = nil;
            }
        });
        if (_readDataSuccess) {
            _readDataSuccess = NO;
            return ACResultNoErr;
        }
        else {
            usleep(15 * 1000);
            continue;
        }
    }
}
Then you can use it like this:
- (void)getCaptureVideoDataToEncode {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        while (1) {
            CVPixelBufferRef buffer = NULL;
            ACResult result = [videoCapture readVideoFrame:&buffer];
            if (ACResultNoErr == result) {
                ACResult error = [videoEncode encoder:buffer outputPacket:&streamPacket];
                if (buffer) {
                    CVPixelBufferRelease(buffer);
                    buffer = NULL;
                }
                if (ACResultNoErr == error) {
                    NSLog(@"encode success");
                }
            }
        }
    });
}
I did this. CMSampleBufferCreateCopy can indeed deep copy,
but a new problem appeared:
the captureOutput delegate stopped being called.

iOS - Issue receiving data from NSStream

I am making an application that senses iBeacons. When you get within immediate range of an iBeacon, the application sends the major and minor numbers of the beacon to a server, and the server sends back an image that is stored in a MySQL database; different images are sent back based on the major and minor numbers.
The application sends the major and minor numbers to a Python (Twisted sockets) script via an NSStream; the script uses these numbers to get an image from the database and send it back to the application.
This setup works great when I use it to get simple text messages back from the database, but I am running into problems when trying to receive and display images inside the application.
First I will post the code of the stream:handleEvent: method that receives the data from the input stream.
The code is only a slight modification of this tutorial: http://www.raywenderlich.com/3932/networking-tutorial-for-ios-how-to-create-a-socket-based-iphone-app-and-server
// input stream event that receives the data from the server
//
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode)
    {
        case NSStreamEventOpenCompleted:
            NSLog(@"stream opened");
            break;
        case NSStreamEventHasBytesAvailable: // event for receiving data
            NSLog(@"Received Data");
            if (aStream == _inputStream)
            {
                uint8_t buffer[500000];
                NSInteger len;
                // loop gets bytes from the input stream
                //
                while ([_inputStream hasBytesAvailable])
                {
                    len = [_inputStream read:buffer maxLength:sizeof(buffer)];
                    if (len > 0)
                    {
                        NSString *str = @"data:image/jpg;base64,";
                        NSString *img = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                        str = [str stringByAppendingString:img];
                        NSData *ImgOut = [NSData dataWithContentsOfURL:[NSURL URLWithString:str]];
                        if (nil != ImgOut)
                        {
                            self.ImageView.image = [UIImage imageWithData:ImgOut];
                            NSLog(@"show image");
                        }
                    }
                }
            }
            break;
        case NSStreamEventErrorOccurred:
            NSLog(@"can not connect to host");
            [self initNetworkComms];
            break;
        case NSStreamEventEndEncountered:
            NSLog(@"Connection Lost");
            [_outputStream close];
            [_inputStream close];
            [self initNetworkComms];
            break;
        default:
            NSLog(@"unknown event");
            break;
    }
}
Just for good measure, I will post the code of the Python script:
from twisted.internet.protocol import Protocol, Factory
from twisted.internet import reactor
import mysql.connector

db = mysql.connector.connect(user='NotImportant', password='WouldntYouLikeToKnow', host='localhost', database='retailbeacons')
cursor = db.cursor()

class MessageServer(Protocol):
    def connectionMade(self):
        self.factory.clients.append(self)
        print "clients are ", self.factory.clients

    def connectionLost(self, reason):
        self.factory.clients.remove(self)
        print "client has disconnected"

    def dataReceived(self, data):
        a = data.split(':')
        if len(a) > 1:
            Major = a[0]
            Minor = a[1]
            msg = ""
            print "Received query " + Major + ":" + Minor
            sql = "SELECT Picture FROM beaconinfo WHERE major=" + Major + " AND minor=" + Minor + ";"
            cursor.execute(sql)
            for row in cursor.fetchall():
                mess = row[0]
                msg = mess.encode('utf-8')
            self.message(msg)

    def message(self, message):
        self.transport.write(message + '\n')

factory = Factory()
factory.protocol = MessageServer
factory.clients = []
reactor.listenTCP(8080, factory)
print "Python message test server started"
reactor.run()
What happens with this code: when the app queries the server, the server sends back the image data (in base64 format), the application receives this data, and the HasBytesAvailable case of the switch statement is triggered. But only a small portion of the image is displayed, and I get an error log saying:
<Error>: ImageIO: JPEG Corrupt JPEG data: premature end of data segment
This led me to believe that not all the data came across the stream. You'll see in the code that I log 'Received Data' every time the HasBytesAvailable case is called and 'show image' when the UIImageView is set with the image data.
The thing I find odd, and what I feel is the source of this problem, is that when HasBytesAvailable is called, the 'Received Data' message is logged, then the 'show image' message is logged, then once again the 'Received Data' message is logged, and the error listed above follows.
So it looks like a small portion of the data comes in through the stream, the loop gathers up those bytes and sticks them in the UIImageView, then more bytes come in through the stream, another attempt to put them into the UIImageView is made, and the 'premature end of data segment' error occurs.
I am very confused as to why this is happening. Shouldn't the whole image be sent through the stream in one firing of the HasBytesAvailable case? Possibly I have overlooked something about the buffer in my code? Can my buffer take an image of 60 kB? That is the only thing I can think of that might be wrong with the application code; otherwise, maybe the Python script is sending the data in two chunks instead of one.
Thank you for your time. I am an intern who has hit a bit of a wall with this one. Any help will be greatly appreciated!
Fixed this problem. The stream was sending the data over in more than one call of the HasBytes case, so I created a string that gets appended with each chunk of the data when HasBytes gets called. I also used a different method for converting the image data string to an NSData object:
NSString *ImgStr = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
// string property for appending
//
_str = [_str stringByAppendingString:ImgStr];
NSData *ImgData = [[NSData alloc] initWithBase64EncodedString:_str options:1];
if (nil != ImgData)
{
    self.ImageView.image = [UIImage imageWithData:ImgData];
}
Thanks very much!
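Since the Python script above terminates each message with '\n' (see self.transport.write(message + '\n')), a slightly more robust variant is to accumulate raw bytes and only decode once that delimiter has arrived. A sketch of that idea, assuming a hypothetical NSMutableData ivar named _imgBuffer:

// In the NSStreamEventHasBytesAvailable case: accumulate first, decode later.
[_imgBuffer appendBytes:buffer length:len];

// The server appends '\n' to each message; decode only when it has arrived.
if (_imgBuffer.length > 0 &&
    ((const uint8_t *)_imgBuffer.bytes)[_imgBuffer.length - 1] == '\n') {
    NSString *base64 = [[NSString alloc] initWithData:_imgBuffer
                                             encoding:NSASCIIStringEncoding];
    NSData *imgData = [[NSData alloc] initWithBase64EncodedString:base64
        options:NSDataBase64DecodingIgnoreUnknownCharacters]; // skips the trailing newline
    if (imgData) {
        self.ImageView.image = [UIImage imageWithData:imgData];
    }
    [_imgBuffer setLength:0]; // reset for the next message
}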

Can't read custom file from NSInputStream

I am trying to read a custom file from my documents directory via NSInputStream to upload it to an FTP server. I'm using the exact same code as demonstrated in the SimpleFTPSample provided by Apple. It seems to work fine as long as the file is empty, but as soon as it contains data it fails.
Here's my code. This is how I create the input stream; I tried it both ways:
//Approach one: Init with path
self.fileStream = [NSInputStream inputStreamWithFileAtPath:filePath];
//Approach two: Init with data
NSData* fileData = [NSData dataWithContentsOfFile:filePath];
self.fileStream = [NSInputStream inputStreamWithData:fileData];
If I init the stream with data, I get EXC_BAD_ACCESS (code=1, address=0x0) when invoking read on the stream (code snippet below); if I use the path, it jumps right to 'File read error'.
filePath is @"/var/mobile/Applications/94292A2A-37FC-4D8E-BDA6-A26963932AE6/Documents/1395576645.cpn", and the NSData returns properly with 806 bytes.
That's part of the stream:handleEvent: delegate method:
if (self.bufferOffset == self.bufferLimit) {
    NSInteger bytesRead;
    bytesRead = [self.fileStream read:self.buffer maxLength:kSendBufferSize];
    if (bytesRead == -1) {
        if (kPrintsToConsole) NSLog(@"File read error");
    } else if (bytesRead == 0) {
        [self stopSending];
    } else {
        self.bufferOffset = 0;
        self.bufferLimit = bytesRead;
    }
}
I'm kinda stuck. Hope you guys can help me out. Running iOS 7.1 & Xcode 5.1
You need to call [self.fileStream open] before reading. This applies to both the file and data approaches.
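A minimal sketch of the open-before-read sequence (the delegate and run-loop scheduling are only needed if you rely on stream events, as SimpleFTPSample does):

self.fileStream = [NSInputStream inputStreamWithFileAtPath:filePath];
self.fileStream.delegate = self;
[self.fileStream scheduleInRunLoop:[NSRunLoop currentRunLoop]
                           forMode:NSDefaultRunLoopMode];
[self.fileStream open]; // without open, read: fails or crashes

// Later, e.g. in stream:handleEvent:
uint8_t buf[1024];
NSInteger n = [self.fileStream read:buf maxLength:sizeof(buf)];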

How to lock an array from audio processing for analysis

In my app I'm doing some audio processing.
In the for loop over the audio buffer, there is an NSMutableArray. The loop is called a huge number of times every second (depending on the buffer size).
As an example:
@autoreleasepool
{
    for (int i = 0; i < tempBuffer.mDataByteSize / 2; ++i)
    {
        if (samples[i] > trig)
        {
            [self.k_Array addObject:[NSNumber numberWithInt:k]];
            // other stuff
        }
    }
}
Then, every second, I'm calling a function for other processing.
- (void)realtimeUpdate:(NSTimer *)theTimer
{
    // Create a copy of the array
    NSMutableArray *k_ArrayCopy = [NSMutableArray arrayWithArray:k_Array]; // CRASH with EXC_BAD_ACCESS code 1 error
    // do some stuff with k_ArrayCopy
}
I sometimes receive an EXC_BAD_ACCESS error because of, I think, a locking problem with the array.
I spent a lot of time trying to get information on queues, locking, working copies, etc., but I'm lost on this specific case.
My questions:
do I have to use atomic or nonatomic for k_Array?
do I have to use a dispatch_sync function? If so, where exactly?
should the realtimeUpdate function be called in the background?
Thanks in advance!
Use a dispatch queue; that will solve the problem.
// create queue instance variable
dispatch_queue_t q = dispatch_queue_create("com.safearrayaccess.samplequeue", NULL);

// 1.
@autoreleasepool
{
    for (int i = 0; i < tempBuffer.mDataByteSize / 2; ++i)
    {
        if (samples[i] > trig)
        {
            dispatch_async(q, ^{
                // queue block
                [self.k_Array addObject:[NSNumber numberWithInt:k]];
            });
            // other stuff NOTE: any operation on the array must happen in a queue block
        }
    }
}

// 2.
- (void)realtimeUpdate:(NSTimer *)theTimer
{
    // Create a copy of the array
    __block NSMutableArray *k_ArrayCopy; // add __block to any variable assigned inside the block
    dispatch_sync(q, ^{ // sync, so the copy is complete before it is used below
        // queue block
        k_ArrayCopy = [NSMutableArray arrayWithArray:k_Array];
    });
    // do some stuff with k_ArrayCopy
}
Now your array add and read operations are on the same queue, and they will not conflict.
For more details on using dispatch queues, go through Apple's Grand Central Dispatch documentation.
Another way of doing this is to use NSConditionLock.
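For completeness, a sketch of the locking approach with a plain NSLock (NSConditionLock works the same way when you also need to wait on a condition value); _arrayLock is an assumed ivar initialized once with [[NSLock alloc] init]:

// In the audio loop: hold the lock only for the mutation itself.
[_arrayLock lock];
[self.k_Array addObject:[NSNumber numberWithInt:k]];
[_arrayLock unlock];

// In realtimeUpdate: copy under the same lock, then work on the copy.
- (void)realtimeUpdate:(NSTimer *)theTimer
{
    [_arrayLock lock];
    NSMutableArray *k_ArrayCopy = [NSMutableArray arrayWithArray:self.k_Array];
    [_arrayLock unlock];
    // do some stuff with k_ArrayCopy
}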
