I'm using the CMMotionManager to gather accelerometer data. I am trying to set the update interval to every half second with the following:
[_motionManager setDeviceMotionUpdateInterval:.5];
[_motionManager startAccelerometerUpdatesToQueue:[[NSOperationQueue alloc] init]
                                     withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    dispatch_async(dispatch_get_main_queue(), ^{
        [self performSelectorOnMainThread:@selector(update:) withObject:accelerometerData waitUntilDone:NO];
    });
}];
Yet I receive updates far more frequently than every half second. Any idea why?
The problem was that I wasn't setting the update interval for the accelerometer itself:
[_motionManager setAccelerometerUpdateInterval:.5];
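For reference, a minimal sketch of the corrected setup, keeping the original update: selector and assuming _motionManager stays retained:
// Set the accelerometer interval, since accelerometer updates are what get started below;
// deviceMotionUpdateInterval only affects startDeviceMotionUpdates... calls.
[_motionManager setAccelerometerUpdateInterval:0.5];

[_motionManager startAccelerometerUpdatesToQueue:[[NSOperationQueue alloc] init]
                                     withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    // One hop to the main queue is enough; performSelectorOnMainThread is redundant here.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self update:accelerometerData];
    });
}];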
Related
After seeing this question, I tried to code up a quick program that would save the watch's accelerometer and gyroscope data to a file.
@implementation InterfaceController {
    NSMutableArray *accData;
    bool recording;
}
- (void)awakeWithContext:(id)context {
    [super awakeWithContext:context];
    // Configure interface objects here.
    self.motionManager = [[CMMotionManager alloc] init];
    [self.motionManager setAccelerometerUpdateInterval:.01];
}
- (IBAction)startStopRecording {
    if (!recording) { // We are starting to record.
        recording = YES;
        accData = [[NSMutableArray alloc] init];
        [self.startRecording setTitle:@"Stop Recording"];
        [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
                                                 withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
            [accData addObject:[NSString stringWithFormat:@"%f, %f, %f",
                                accelerometerData.acceleration.x,
                                accelerometerData.acceleration.y,
                                accelerometerData.acceleration.z]];
        }];
    } else {
        recording = NO; // We are stopping the recording.
        [self.motionManager stopAccelerometerUpdates];
        [self.startRecording setTitle:@"Start Recording"];
        // This method saves the array to a CSV file.
        [InterfaceController openParentApplication:@{ @"accData": accData }
                                             reply:^(NSDictionary *replyInfo, NSError *error) {
            NSLog(@"Data has been saved.");
        }];
    }
}
I plotted this data and, for the life of me, no matter how hard I shook the watch, every plot looked the same: essentially no movement registered.
Eight hours later, I started to suspect that I wasn't grabbing the acceleration data from the watch, but rather from the phone (sitting still on the table next to me). I ran some tests and confirmed that this is exactly what was happening.
Which leads me to the original question: how do I pull acceleration/gyro data from the watch and not from the iPhone?
The problem was that I wasn't running watchOS 2. I assumed I was, but it was still in beta and I hadn't installed it. The data I was getting was accelerometer data from the phone. Also, at the moment you can only get accelerometer data from the watch using watchOS 2, not gyro data.
You can use the CoreMotion framework to get activity data.
While I can only get accelerometer data; the gyro often returns false (unavailable).
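As a minimal sketch (assuming watchOS 2 and a CMMotionManager held by the WatchKit extension), you can guard on availability to confirm the samples come from the watch hardware rather than the paired phone:
// accelerometerAvailable reflects the watch hardware on watchOS 2;
// on watchOS 1 the extension code ran on the iPhone, so CMMotionManager
// reported the phone's sensors instead.
if (self.motionManager.accelerometerAvailable) {
    [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMAccelerometerData *data, NSError *error) {
        NSLog(@"watch acc: %f %f %f",
              data.acceleration.x, data.acceleration.y, data.acceleration.z);
    }];
}
if (!self.motionManager.gyroAvailable) {
    NSLog(@"No gyroscope data available on this watch.");
}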
Currently I'm trying to use CMAcceleration to get accelerometer values for the x, y & z axes. Is it possible to get the acceleration time and acceleration magnitude from these x, y & z values? When I checked the developer forum I couldn't find any delegates/methods for these two. Guide me to find the values.
Using the following code:
int updatesensorFrequencyInterval = [sensorFrequencySliderTxt.text intValue];
self.motionManager.accelerometerUpdateInterval = updatesensorFrequencyInterval;
self.motionManager.gyroUpdateInterval = updatesensorFrequencyInterval;

[self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
                                         withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    [self outputAccelertionData:accelerometerData.acceleration];
    if (error) {
        NSLog(@"%@", error);
    }
}];
The x, y, z values of the CMAccelerometerData acceleration struct are expressed in units of g (1 g ≈ 9.81 m/s²); the magnitude of the acceleration vector is sqrt(x² + y² + z²).
CMAccelerometerData is a subclass of CMLogItem, so it inherits the timestamp property which provides the NSTimeInterval since the phone was booted. This can be used to measure elapsed time between any pair of samples.
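As a small sketch inside the update handler (previousTimestamp here is a hypothetical property you keep between callbacks):
// Inside the accelerometer update handler:
CMAcceleration a = accelerometerData.acceleration;

// Magnitude of the acceleration vector, in g (multiply by 9.81 for m/s^2).
double magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);

// Elapsed time since the previous sample, via the inherited CMLogItem timestamp.
NSTimeInterval dt = accelerometerData.timestamp - self.previousTimestamp; // previousTimestamp: hypothetical property
self.previousTimestamp = accelerometerData.timestamp;

NSLog(@"|a| = %f g over %f s", magnitude, dt);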
In my app the iPhone is moved in two cases:
Moved in a linear (straight) direction
Moved along an arc
How can I detect which of these two movements is happening, i.e. whether the direction of the movement is changing?
You'll have to pull in the CoreMotion framework and start the device accelerometer and/or gyroscope.
CoreMotion Reference
What you're looking for is CMAccelerometerData. This is done by instantiating a CMMotionManager object and calling startAccelerometerUpdatesToQueue:withHandler:.
Something like this:
CMMotionManager *manager = [[CMMotionManager alloc] init];
manager.accelerometerUpdateInterval = 0.5; // half a second

NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[manager startAccelerometerUpdatesToQueue:queue withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    double x = accelerometerData.acceleration.x;
    double y = accelerometerData.acceleration.y;
    double z = accelerometerData.acceleration.z;

    // post back to main queue to update UI
    dispatch_async(dispatch_get_main_queue(), ^{
    });
}];
You'll need to use some good old-fashioned geometry to detect arcs vs. lines.
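One rough heuristic, purely as a sketch (the thresholds are made up and would need tuning): use device motion instead of raw accelerometer data and compare userAcceleration with rotationRate. Translation accompanied by sustained rotation suggests an arc; translation with near-zero rotation suggests a straight line.
#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];
manager.deviceMotionUpdateInterval = 1.0 / 60.0;

[manager startDeviceMotionUpdatesToQueue:[[NSOperationQueue alloc] init]
                             withHandler:^(CMDeviceMotion *motion, NSError *error) {
    CMAcceleration ua = motion.userAcceleration; // gravity removed, in g
    CMRotationRate rr = motion.rotationRate;     // in rad/s

    double accel = sqrt(ua.x * ua.x + ua.y * ua.y + ua.z * ua.z);
    double spin  = sqrt(rr.x * rr.x + rr.y * rr.y + rr.z * rr.z);

    // Made-up thresholds; tune for your motion amplitude and sample rate.
    if (accel > 0.05 && spin > 0.5) {
        // device is translating while rotating -> likely an arc
    } else if (accel > 0.05) {
        // translation with little rotation -> likely a straight line
    }
}];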
I am using the EKEventEditViewController to allow adding events from my application to the iPhone calendar. This is currently the code that I am using:
[self.store requestAccessToEntityType:EKEntityTypeEvent completion:^(BOOL granted, NSError *error) {
    if (!granted) { return; }

    EKEvent *storedEvent = [EKEvent eventWithEventStore:self.store];
    storedEvent.title = self.selectedEvent.title;
    storedEvent.startDate = self.selectedEvent.date;
    storedEvent.endDate = [NSDate dateWithTimeInterval:60 * 60 sinceDate:self.selectedEvent.date];
    storedEvent.notes = self.selectedEvent.comments;
    [storedEvent setCalendar:[self.store defaultCalendarForNewEvents]];

    self.eventController.event = storedEvent;
    self.eventController.eventStore = self.store;
    self.eventController.editViewDelegate = self;
    [self presentViewController:self.eventController animated:YES completion:nil];
}];
This code is taking upwards of 10 seconds to produce the view, even though I have pre-initialized both the view controller and the event store. Is there a way to make this faster, or do I just need to put up a spinner and tell the users to wait?
The completion handler will be called on an arbitrary queue. Inside the completion block, dispatch the UIKit-related work (including presenting the view controller) to the main thread:
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    self.eventController.event = storedEvent;
    self.eventController.eventStore = self.store;
    self.eventController.editViewDelegate = self;
    [self presentViewController:self.eventController animated:YES completion:nil];
}];
Good day, here's what I am trying to do:
I have a photo-processing app that takes images using AVFoundation.
I have a DeviceMotion queue that processes the device position at 60 Hz.
When an image is taken, it needs to be cropped and saved; DeviceMotion needs to keep running and the interface updated without delays.
What I am seeing: updates to the interface from the DeviceMotion queue are frozen for the duration of the image crop.
This is how I start updates for DeviceMotion:
self.motionManager.deviceMotionUpdateInterval = 1.0f / 60.0f;
gyroQueue = [[NSOperationQueue alloc] init];
[self.motionManager startDeviceMotionUpdatesToQueue:gyroQueue withHandler:^(CMDeviceMotion *motion, NSError *error) {
    [NSThread setThreadPriority:1.0];
    [self processMotion:motion withError:error];
}];
When an image is returned from AVFoundation, it is added to a queue for processing:
imageProcessingQueue = [[NSOperationQueue alloc] init];
[imageProcessingQueue setName:@"ImageProcessingQueue"];
[imageProcessingQueue setMaxConcurrentOperationCount:1];

//[imageProcessingQueue addOperationWithBlock:^{
//    [self processImage:[UIImage imageWithData:imageData]];
//}];

NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithTarget:self
                                                                        selector:@selector(processImage:)
                                                                          object:[UIImage imageWithData:imageData]];
[operation setThreadPriority:0.0];
[operation setQueuePriority:NSOperationQueuePriorityVeryLow];
[imageProcessingQueue addOperation:operation];
And the method for processing the image:
- (void)processImage:(UIImage *)image {
    CGSize cropImageSize = CGSizeMake(640, 960);
    UIImage *croppedImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                        bounds:cropImageSize
                                          interpolationQuality:kImageCropInterpolationQuality];
    NSData *compressedImageData = UIImageJPEGRepresentation(croppedImage, kJpegCompression);
    [self.doc addPhoto:compressedImageData];
}
The issue is:
DeviceMotion updates are blocked for the duration of the image crop when the image is processed via the NSOperationQueue.
If I process the image using performSelectorInBackground instead, it works as desired (no delays to the DeviceMotion queue):
[self performSelectorInBackground:@selector(processImage:) withObject:[UIImage imageWithData:imageData]];
Any ideas on where my understanding of background threading needs an update? :)
PS. I asked this question earlier but it got nowhere, so this is a re-post.
I have found a solution (or a solid workaround) for this issue:
Instead of routing deviceMotion updates to a queue using startDeviceMotionUpdatesToQueue, I created a CADisplayLink timer, which does not interfere with the other background queues: since it matches the screen refresh rate, it is given the highest priority by its nature:
[self.motionManager startDeviceMotionUpdates];
gyroTimer = [CADisplayLink displayLinkWithTarget:self selector:@selector(processMotion)];
[gyroTimer addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
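For completeness, a minimal sketch of what processMotion might do under that setup (updateInterfaceWithAttitude: is a placeholder for your own UI code):
- (void)processMotion {
    // Pull the most recent sample; deviceMotion is nil until the first update arrives.
    CMDeviceMotion *motion = self.motionManager.deviceMotion;
    if (motion == nil) {
        return;
    }

    // CADisplayLink fires on the run loop it was added to (here, the main run loop),
    // so it is safe to update the interface directly.
    [self updateInterfaceWithAttitude:motion.attitude]; // placeholder for your UI update
}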