Bluetooth peripheral doesn't answer - iOS

I am in big trouble with my exams.
I have to write an iOS app that uses an external sensor made by Texas Instruments, the TI SensorTag.
TI's documentation, in my humble opinion, is really poor and complicated to understand for an entry-level programmer.
I tried to ask in the E2E forum but they weren't able to help me; their answer was something like "Um, well, we don't know, go away and ask someone else"...
I added the CoreBluetooth framework to my project and created a central manager. I am able to find my device, connect, and get its name and (sometimes) its RSSI.
Now what I'm trying to do is ask my CBPeripheral object whether it has some services for me, or something like that. I've found the Complete Attribute Table but I have no idea how to use it...
I know I have to activate some services or something like that, but I really don't know how to do it. I googled a lot but haven't found anything helpful...
I'm trying to enable my sensor with this method, but I'm doing something wrong.
- (void)configureSensorTag
{
    uint8_t myData = 0x01;
    NSData *data = [[NSData alloc] initWithBytes:&myData length:1];
    [BLEUtility writeCharacteristic:myPer sUUID:@"F000AA00-0451-4000-B000-000000000000" cUUID:@"F000AA02-0451-4000-B000-000000000000" data:data];
    [BLEUtility setNotificationForCharacteristic:myPer sUUID:@"F000AA00-0451-4000-B000-000000000000" cUUID:@"F000AA01-0451-4000-B000-000000000000" enable:YES];
    NSLog(@"Configured TI SensorTag IR Thermometer Service profile");
}
Moreover, I'm trying to retrieve the SensorTag's services with this method
[peripheral discoverServices:nil];
and its delegate method
- (void)peripheral:(CBPeripheral *)peripheral didDiscoverServices:(NSError *)error
{
    NSLog(@"Found service");
    if (error) {
        NSLog(@"Error: %@", error);
    }
}
but it is never called.
Does anyone have any idea?
Thank you very much!

Unfortunately I can't help you with the details of the iOS side, but I can help you with understanding the SensorTag. If you look at the attribute PDF you linked, you'll find entries marked "GATT_CLIENT_CHAR_CFG_UUID". It's 16 bits of flags, of which only the 2 least significant bits are used. It even says in there 'Write "01:00" to enable notifications, "00:00" to disable'. (That's the least significant bit, because it's encoded in little-endian format.)
So, you're sending a 0x01 to turn on the IR temperature sensor, but you haven't turned on notifications. Enabling notifications will then cause the device to stream data back to the client. The accelerometer doesn't require turning on, so maybe you should try that first.
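For what it's worth, here is a minimal sketch of what those two steps could look like with plain CoreBluetooth rather than the BLEUtility wrapper. It assumes the F000AA02 (config) and F000AA01 (data) characteristics have already been discovered and stored in placeholder properties of my own naming; setNotifyValue:forCharacteristic: writes the "01:00" client characteristic configuration value for you.
// Minimal sketch, plain CoreBluetooth (requires <CoreBluetooth/CoreBluetooth.h>).
// Assumes the config (F000AA02) and data (F000AA01) characteristics were saved
// in peripheral:didDiscoverCharacteristicsForService:error:; the property names
// below are placeholders, not part of the question's code.
- (void)enableIRTemperatureOn:(CBPeripheral *)peripheral
{
    // 1. Write 0x01 to the configuration characteristic to switch the sensor on.
    uint8_t enable = 0x01;
    NSData *enableData = [NSData dataWithBytes:&enable length:sizeof(enable)];
    [peripheral writeValue:enableData
         forCharacteristic:self.irConfigCharacteristic   // F000AA02
                      type:CBCharacteristicWriteWithResponse];

    // 2. Subscribe to notifications; CoreBluetooth writes the "01:00"
    //    client characteristic configuration value on your behalf.
    [peripheral setNotifyValue:YES forCharacteristic:self.irDataCharacteristic];   // F000AA01
}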
I have no idea what that second chunk of code is supposed to be doing... sorry.

OK, I got it:
there wasn't any software problem, I mean, nothing on the iOS side.
The SensorTag had faulty firmware and so it didn't work.
I've replaced the SensorTag and now everything works fine.
Thank you anyway!

Related

Split data from Arduino in Objective-C

Below is code to receive data from a Bluno Beetle BLE:
/* Data received */
else if ([characteristic.UUID isEqual:[CBUUID UUIDWithString:BLECharacteristic]]) {
    NSString *data = [[NSString alloc] initWithData:characteristic.value encoding:NSUTF8StringEncoding];
    NSLog(@"Received Data = %@", data);
    [_receiveText setText:data];
}
However, if I want to display multiple data values, is there a way for me to split the received text/data?
For example, I want to display a number and some text, and the Arduino sends over a single string. I'm new to coding, so your help and patience will be appreciated!
is there a way for me to split the received text/data?
Yes, of course. You can do whatever you like with the data once you've got it. Take a look at the NSString documentation and you'll find plenty of methods for splitting and otherwise extracting data from strings. Some examples: -componentsSeparatedByString:, -componentsSeparatedByCharactersInSet:, -stringByTrimmingCharactersInSet:, -substringWithRange:, etc. There are also other Foundation classes that can help, like NSScanner and NSRegularExpression.
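For instance, a quick sketch assuming the Arduino sends a comma-separated string such as "Temperature,23" (the payload format is an assumption for illustration, not something stated in the question):
// Sketch: split a comma-separated payload such as @"Temperature,23".
// The label/value layout is an assumption for illustration.
NSString *received = [[NSString alloc] initWithData:characteristic.value
                                           encoding:NSUTF8StringEncoding];
NSArray *parts = [received componentsSeparatedByString:@","];
if (parts.count >= 2) {
    NSString *label = parts[0];                   // @"Temperature"
    NSInteger number = [parts[1] integerValue];   // 23
    NSLog(@"label = %@, number = %ld", label, (long)number);
}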
New to coding, so your help and patience will be appreciated!
Reading the fine manual should be your first move no matter what your experience level is. The documentation for Apple's frameworks is generally excellent, and it includes many guides and introductory "getting started" documents that make it easy to get up to speed.

React native: Real time camera data without image save and preview

I started working on my first non-demo React Native app. I hope it will be an iOS/Android app, but for now I'm focused on iOS only.
I have one problem at the moment: how can I get data (base64, an array of pixels, ...) in real time from the camera without saving to the camera roll?
There is this module: https://github.com/lwansbrough/react-native-camera, but base64 is deprecated and useless for me, because I want to render a processed image to the user (e.g. change the picture's colors), not the raw picture from the camera, as the react-native-camera module does.
(I know how to communicate with Swift code, but I don't know what the options are in native code; I come from web development.)
Thanks a lot.
This may not be optimal, but it is what I have been using. If anyone can give a better solution, I would appreciate your help, too!
My basic idea is simply to loop (but not a simple for-loop, see below), taking still pictures in YUV/RGB format at max resolution, which is reasonably fast (~x0ms with a normal exposure duration), and processing them. Basically you set up an AVCaptureStillImageOutput linked to your camera (following the tutorials that are everywhere), then set the format to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (if you want YUV) or kCVPixelFormatType_32BGRA (if you prefer RGBA), like:
bool usingYUVFormat = true;
NSDictionary *outputFormat = [NSDictionary dictionaryWithObject:
                              [NSNumber numberWithInt:usingYUVFormat ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA]
                                                         forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[yourAVCaptureStillImageOutput setOutputSettings:outputFormat];
When you are ready, you can start calling
AVCaptureConnection *captureConnection = [yourAVCaptureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[yourAVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        // do your magic with the data buffer imageBuffer
        // use CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0/1/2); to get each plane
        // use CVPixelBufferGetWidth/CVPixelBufferGetHeight to get dimensions
        // if you want more, please google
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0); // balance the lock when you're done
    }
}];
Additionally, use NSNotificationCenter to register your photo-taking action and post a notification after you have processed each frame (with some delay, perhaps, to cap your throughput and reduce power consumption) so the loop will keep going.
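A rough sketch of that notification-driven loop follows; the notification name and the method names below are placeholders of mine, not AVFoundation API.
// Rough sketch of the capture loop described above. kFrameDoneNotification,
// -startCaptureLoop, -captureNextFrame and -frameDone are placeholder names.
static NSString * const kFrameDoneNotification = @"FrameDoneNotification";

- (void)startCaptureLoop
{
    // Every time a frame has been processed, capture the next one.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(captureNextFrame)
                                                 name:kFrameDoneNotification
                                               object:nil];
    [self captureNextFrame];   // kick off the first frame
}

- (void)captureNextFrame
{
    // ...call captureStillImageAsynchronouslyFromConnection: as shown above;
    // in its completion handler, after processing the pixel buffer, call:
    // [self frameDone];
}

- (void)frameDone
{
    // A small delay here caps throughput and reduces power consumption.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.05 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [[NSNotificationCenter defaultCenter] postNotificationName:kFrameDoneNotification
                                                            object:nil];
    });
}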
A quick precaution: the Android counterpart is a much worse headache. Few hardware manufacturers implement an API for max-resolution uncompressed photos; most only offer 1080p for preview/video, as I have raised in my question. I am still looking for solutions but have given up most hope. JPEG images are just too slow.

Progressbar on file-upload to Amazon S3 for iOS?

I was using the services from Parse a while back, and they had implemented an amazing feature for uploading data, with a method something like this:
PFFile *objectToSave...; // An image or whatever, wrapped in a Parse file
[objectToSave saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
    // Do stuff after upload is complete
} progressBlock:^(int percentDone) {
    [someLabel setText:[NSString stringWithFormat:@"%i%@", percentDone, @"%"]];
}];
This let me keep track of the file upload. Since Parse only let me upload files of at most 10 MB, I chose to explore other cloud options. I've been testing with Amazon's S3 service now, but the only way I can find to upload data is by calling [s3 putObject:request];. This will occupy the main thread until it's done, unless I run it on another thread. Either way, I have no way of letting my users know how far the upload has come. Is there seriously no way of doing this? I read that some browser-API version of S3's service had to use Flash, or route all uploads through another server and keep track on that server, but I won't do either of those. Anyone? Thanks.
My users are supposed to upload videos with sizes up to 15 MB; do I have to let them stare at a spinning wheel for an unknown amount of time? With a bad connection, they might have to wait 15 minutes, staring at the screen in hope the entire time.
Seems like I didn't quite do my homework before posting this question in the first place. I found this great tutorial doing exactly what I was asking for. I would delete my question, but I'll let it stay just in case it might help other helpless people like myself.
Basically, it has a delegate method for this. Do something like this:
S3PutObjectRequest *por = /* your request/file */;
S3TransferManager *tm = /* your transfer manager */;
por.delegate = self;
tm.delegate = self;
[tm upload: por];
Then use this appropriately named delegate method:
- (void)request:(AmazonServiceRequest *)request
    didSendData:(long long)bytesWritten
totalBytesWritten:(long long)totalBytesWritten
totalBytesExpectedToWrite:(long long)totalBytesExpectedToWrite
{
    CGFloat progress = ((CGFloat)totalBytesWritten / (CGFloat)totalBytesExpectedToWrite);
}
It will be called for every packet it uploads, or something like that. Just be sure to set the delegates.
(Not sure if you need both delegates to be set, though.)
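To actually surface that value in the UI, one option is to update a progress view from inside that callback; here is a sketch, assuming a UIProgressView outlet named progressView (not part of the original answer) and dispatching to the main queue since the callback is not guaranteed to arrive there:
// Inside the delegate method above. progressView is a hypothetical outlet.
CGFloat progress = (CGFloat)totalBytesWritten / (CGFloat)totalBytesExpectedToWrite;
dispatch_async(dispatch_get_main_queue(), ^{
    // Touch UIKit only on the main thread.
    [self.progressView setProgress:progress animated:YES];
});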

Saving Roster for Offline/Disconnects proposes

I downloaded the sample project from XMPPFramework for iOS and have already connected it to my Jabber server. Everything is OK.
But I would like my buddies to stay in the buddies overview even after I have been disconnected (connection lost). Is that possible? As I understood it, XMPPCoreStorage saves them in Core Data. Why does it get cleared on a disconnect? Is it possible to disable the clearing on disconnect? Unfortunately, I didn't find the method.
It would be really annoying & a lot of traffic if it isn't persistent, right?
Any help would be great!
So I came up with a solution:
in XMPPRoster.m, in xmppStreamDidDisconnect:,
I commented out 4 lines. It now looks like this:
- (void)xmppStreamDidDisconnect:(XMPPStream *)sender withError:(NSError *)error
{
    // This method is invoked on the moduleQueue.
    XMPPLogTrace();
    // [xmppRosterStorage clearAllUsersAndResourcesForXMPPStream:xmppStream];
    //
    // [self _setRequestedRoster:NO];
    // [self _setHasRoster:NO];
    //
    // [earlyPresenceElements removeAllObjects];
}
So it will not clear the storage on a disconnect. On a reconnect it will clear and refill the storage with updated information. Hope it helps other people. It's not a perfect solution; a preferred one would be to build your own Core Data model & fetch the XMPP data into it.

Updating TXTRecordDictionary doesn't always notify monitoring services

I'm using Bonjour to find other devices. Each device uses TXTRecordData to share its name:
NSDictionary *dictionary = @{ @"name": @"Goose" };
[service setTXTRecordData:[NSNetService dataFromTXTRecordDictionary:dictionary]];
Each device listens for changes:
- (void) netService:(NSNetService *)sender didUpdateTXTRecordData:(NSData *)data
{
...
}
About 80% of the time it works - didUpdateTXTRecordData is called when a name is changed.
Sometimes the other devices are never notified.
I've checked, and setTXTRecordData returns YES even when the data is lost.
How can I make sure updates to the TXTRecordData make it to the other devices?
Someone posted a gist demonstrating what is possibly the above bug.
For you Apple people out there, the gist says the relevant rdar is rdar://11018654
