I'm looking to use CMMotionActivity on the iPhone 5s, but I also want similar functionality on older iPhones. Is this possible?
Could I create a less accurate alternative, perhaps by tracking GPS instead of using the M7 chip? Any advice/tutorials/sample code?
You can create your own algorithm that uses accelerometer data to estimate the number of steps taken. It's not as accurate, and it's not a good idea to have two separate pieces of logic in the same app.
In case you want to give it a try, check this answer: How to count steps using an Accelerometer?
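As a rough illustration of what such an algorithm looks like, here is a minimal, hypothetical sketch: it counts peaks in the accelerometer magnitude signal that cross a threshold. The struct name, threshold value, and simulated data are all assumptions for illustration, not anything from Apple's APIs.

```swift
import Foundation

// Hypothetical sketch: count steps by detecting peaks in the
// accelerometer magnitude signal. Names and thresholds are
// illustrative, not from any Apple API.
struct AccelSample {
    let x: Double, y: Double, z: Double
    var magnitude: Double { (x * x + y * y + z * z).squareRoot() }
}

func countSteps(samples: [AccelSample], threshold: Double = 1.2) -> Int {
    var steps = 0
    var aboveThreshold = false
    for sample in samples {
        if sample.magnitude > threshold {
            // Count a step only on the rising edge of the peak.
            if !aboveThreshold { steps += 1 }
            aboveThreshold = true
        } else {
            aboveThreshold = false
        }
    }
    return steps
}

// Simulated signal with three peaks above the 1.2 g threshold.
let walk = [1.0, 1.4, 1.0, 0.9, 1.5, 1.0, 1.3, 1.0].map {
    AccelSample(x: 0, y: 0, z: $0)
}
print(countSteps(samples: walk)) // 3
```

A real implementation would also low-pass-filter the signal and enforce a minimum interval between steps, which is where most of the accuracy difference against the M7 comes from.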
I would like to implement a fitness app for learning purposes. I would like to detect the user's activity, such as running or walking. According to my research, I am now able to detect the user's activity using CMMotionActivityManager from the Core Motion framework. Now I would like to detect the user's speed, for example, how many kilometers per hour the user is running.
Can anyone give any suggestions on how to achieve this? Thanks.
I would check out the documentation on CLLocation in Core Location here: https://developer.apple.com/documentation/corelocation/cllocation
You can get the device's speed and heading using the speed and course properties. The speed value is in meters per second, so you can easily convert to km/h from there.
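For example, a small helper for the unit conversion might look like this. Note that CLLocation reports a negative speed when no valid reading is available; treating that as nil is our own convention here, not part of Core Location.

```swift
import Foundation

// CLLocation.speed is in meters per second, or negative when invalid.
// Convert to km/h, returning nil for an unavailable reading.
func kilometersPerHour(fromMetersPerSecond speed: Double) -> Double? {
    guard speed >= 0 else { return nil }   // negative means "speed unavailable"
    return speed * 3.6                     // 1 m/s = 3.6 km/h
}

print(kilometersPerHour(fromMetersPerSecond: 2.5) ?? -1)  // a slow run, 9 km/h
print(kilometersPerHour(fromMetersPerSecond: -1) ?? -1)   // invalid reading
```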
I'm researching AR frameworks in order to select the best option for developing a conference-call/meeting application for ODG glasses.
I got only a few directions for selecting a framework:
Performance of video streaming (capturing and encoding) must be watched closely to avoid overheating and excessive power consumption,
Should support extended tracking and
Video capturing should not be frame by frame.
I have no experience with the AR field in general, and I would really appreciate it if you could share your opinion or give me some guidance on how to choose the best-fitted framework.
For ODG, you should use Vuforia, according to the software details:
Qualcomm Technologies Inc.'s Vuforia™ SDK for Digital Eyewear
Vuforia supports extended tracking. Given what you are asking, you'll need more than just an AR SDK. You'll need to identify exactly what you want. Do you want an application that lets the user see who they're talking to, or do you want some holographic stuff? Depending on what you want, maybe smart glasses aren't what you need, and at this point you should try to learn more about the different SDKs out there. I suggest you look at this and that.
I would like to develop a mobile app for iPhones that calculates the time needed to reach a given velocity. For example: I'm in my car, I open the app, choose 100 km/h, and when I accelerate the app should start to count time, stopping at the moment I reach 100 km/h. It should be very accurate.
I have heard about two solutions. The first is to use the accelerometer/gyroscope, but some people told me it's a bad idea, because I won't be able to calculate time over longer distances. The second option is to use GPS, but on the other hand it may not be as accurate as I want it to be.
So I need suggestions on which option is better and why.
My targets are the iPhone 4s and newer.
If you want to be more precise than the GPS you will need to have some sort of sensor. Most similar apps and concepts will create a receiver that plugs into the car that the iPhone can connect to. This has the benefit of making all of the sensors in the car available to you. This is an example: https://www.automatic.com/how-automatic-works/
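If you do stay with GPS, one practical approach is to interpolate between the two speed fixes that bracket the target speed, since GPS delivers samples only about once per second. A minimal sketch, with struct names and sample data of our own invention:

```swift
import Foundation

// Sketch: estimate the time to reach a target speed from timestamped
// speed samples (as CLLocation would deliver them), interpolating
// linearly between the two samples that bracket the target.
struct SpeedSample {
    let time: Double    // seconds since the start of the run
    let speed: Double   // km/h
}

func timeToReach(target: Double, samples: [SpeedSample]) -> Double? {
    for (prev, next) in zip(samples, samples.dropFirst()) where
        prev.speed < target && next.speed >= target {
        let fraction = (target - prev.speed) / (next.speed - prev.speed)
        return prev.time + fraction * (next.time - prev.time)
    }
    return nil  // target speed never reached
}

// 100 km/h is crossed halfway between the fixes at t = 7 s and t = 8 s.
let run = [SpeedSample(time: 0, speed: 0),
           SpeedSample(time: 7, speed: 95),
           SpeedSample(time: 8, speed: 105)]
print(timeToReach(target: 100, samples: run) ?? -1)  // 7.5
```

Linear interpolation bounds the timing error by the GPS sample interval, which is why a dedicated in-car receiver with a faster sensor feed can still beat it.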
I've been researching the new M7 chip's CMMotionActivityManager for determining whether the user of the device is walking, running, in a car, etc. (see the Apple documentation). This seemed like a great step forward over trying to determine this previously using only CLLocationManager and accelerometer data.
I notice however that CMMotionActivityManager does not have a cycling activity, which is disappointing, and almost a deal-breaker for complete usage as a new activity manager. Has anyone else found a convenient way to use CMMotionActivityManager with cycling also without having to reincorporate CMLocationManager + accelerometer just to try to test for cycling too?
Note, this also does not include general transport options for things like a Train. For instance, I commute an hour a day on the train. Automotive could be made more generic at least, similar to how Moves uses Transport.
CMMotionActivity has these defined motion types only:
stationary
walking
running
automotive
unknown
Useful notes from Apple's code that don't necessarily solve the issue, but are helpful:
CMMotionActivity
An estimate of the user's activity based on the motion of the device.
The activity is exposed as a set of properties, the properties are not
mutually exclusive.
For example, if you're in a car stopped at a stop sign the state might
look like:
stationary = YES, walking = NO, running = NO, automotive = YES
Or a moving vehicle, stationary = NO, walking = NO, running = NO,
automotive = YES
Or the device could be in motion but not walking or in a vehicle.
stationary = NO, walking = NO, running = NO, automotive = NO. Note in this case all of the properties are NO.
[Direct Source: Apple iOS Framework, CoreMotion/CMMotionActivity.h #interface CMMotionActivity, inline code comments]
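To make the non-exclusive-flags behavior concrete, here is a small sketch with a plain struct mirroring CMMotionActivity's boolean properties. The mapping from flag combinations to a single label is our own convention, not part of Core Motion.

```swift
import Foundation

// A plain struct mirroring CMMotionActivity's boolean properties,
// to illustrate that the flags are not mutually exclusive.
struct ActivityFlags {
    var stationary = false, walking = false, running = false, automotive = false
}

// Our own (assumed) priority order for collapsing flags into one label.
func describe(_ a: ActivityFlags) -> String {
    if a.automotive { return a.stationary ? "in a vehicle, stopped" : "in a vehicle, moving" }
    if a.running { return "running" }
    if a.walking { return "walking" }
    if a.stationary { return "stationary" }
    return "in motion but unclassified"
}

// Stopped at a stop sign: stationary AND automotive are both YES.
print(describe(ActivityFlags(stationary: true, automotive: true)))
// in a vehicle, stopped
```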
First of all, is this a question or more of an informative post on the M7?
Has anyone else found a convenient way to use CMMotionActivityManager
with cycling also without having to reincorporate LocationManager +
accelerometer just to try to test for cycling too?
There is a lot of confusion if you want to check whether the activity is cycling, because the classification depends only on the accelerometer.
An accelerometer contains microscopic crystal structures that get stressed by accelerative forces, which causes a voltage to be generated, and the result is parsed from that voltage. As far as I know, it simply classifies your motion and tells you whether it is running, walking, or automotive. Cycling can be very fast, very slow, or medium-paced, so it may sometimes be classified as walking, running, or automotive; the M7 cannot distinguish cycling from automotive or running because there is not much speed variance while you are cycling.
Even for running and walking it sometimes gives wrong results, so there is a chance your app will give wrong information too.
One more thing you asked is
Note, this also does not include general transport options for things
like a Train. For instance, I commute an hour a day on the train.
Automotive could be made more generic at least, similar to how Moves
uses Transport.
Apple is also working on other mapping features. Apple is said to be planning notable updates to its Maps app in iOS 8, and the company is currently working on implementing both public-transit directions and indoor mapping features (which Google already offers on iOS).
http://www.macrumors.com/2013/09/12/apple-working-to-leverage-new-m7-motion-sensing-chip-for-mapping-improvements/ (Useful Link)
So, not sure if you still need an answer to that, but here is the latest from the iOS 8 SDK:
@property(readonly, nonatomic) BOOL cycling NS_AVAILABLE(NA, 8_0);
In session 612 at WWDC 2014, the two presenting Apple engineers provided some information: In the slides they stated:
Performance is very sensitive to location
Works best if device is worn on upper arm
Longest latency
Best for retrospective use cases
In the video they explain on the audio track (starting at about 11:00) that
Cycling is new, something we introduced in iOS 8.
Cycling is very challenging, and again you need the dynamics and so
it's going to be very sensitive to location.
If it was mounted on the upper arm the latency is going to be fairly
reasonable.
And if it's anywhere else, it's going to take a lot longer. So definitely I would not suggest using cycling activity classification as a hint for the context here and now. It's really something that you'll want to use in a retrospective manner for a journaling app, for example.
I made a simple test setup with iOS 8 and 9 on an iPhone 5s and an iPhone 6, and cycling was not detected, not a single time in over 1.5 h of cycling. Whether the new iPhone 6s makes good on this major deficit in motion-activity detection is unclear; Phil Schiller announced it in September 2015.
tl;dr
Currently, cycling detection in Core Motion does not work the way it does for stationary, walking, running, and automotive. Cycling will not be detected live and can be used retrospectively only.
I am working on an augmented reality app. I have augmented a 3D model using OpenGL ES 2.0. Now, my problem is that when I move the device, the 3D model should move according to the device's movement, just like this app does: https://itunes.apple.com/us/app/augment/id506463171?l=en&ls=1&mt=8. I have used UIAccelerometer to try to achieve this, but I am not able to make it work.
Should I use UIAccelerometer to achieve it, or some other framework?
It is a complicated algorithm rather than just the accelerometer. You'd better use a third-party framework, such as Vuforia or Metaio. That would save a lot of time.
Download and check a few sample apps. That is exactly what you want.
https://developer.vuforia.com/resources/sample-apps
You could use Unity3D to load your 3D model and export an Xcode project, or you could use OpenGL ES.
From your comment, am I to understand that you want to have the model anchored at a real-world location? If so, then the easiest way to do it is by giving your model a GPS location and reading the device's GPS location. There is actually a lot of research going into the subject of positional tracking, but for now GPS is your best (and likely only) option without going into advanced positional-tracking solutions.
Seeing as I can't add comments due to my account being too new, I'll also add a warning not to try to position the device using the accelerometer data. You'll get far too much error due to the double integration of acceleration to position (see "Indoor Positioning System based on Gyroscope and Accelerometer").
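A quick numerical sketch shows why the double integration fails: even a tiny constant sensor bias, well within typical MEMS accelerometer noise, grows quadratically once integrated twice. The bias and sample-rate values below are illustrative assumptions.

```swift
import Foundation

// Why double-integrating accelerometer data drifts: a small constant
// bias (0.05 m/s^2) on an otherwise-stationary device, integrated
// twice at 100 Hz over 60 seconds.
let bias = 0.05          // m/s^2, assumed sensor bias
let dt = 0.01            // 100 Hz sampling
var velocity = 0.0, position = 0.0
for _ in 0..<6000 {      // 60 seconds of samples
    velocity += bias * dt          // first integration: acceleration -> velocity
    position += velocity * dt      // second integration: velocity -> position
}
// Position error grows as roughly bias * t^2 / 2 = 0.05 * 3600 / 2 = 90 m.
print(String(format: "%.1f m of drift after 60 s", position))
```

Ninety meters of drift in a minute from a 0.005 g bias is why GPS, despite its coarse accuracy, still wins for absolute positioning.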
I would definitely use Vuforia for this task.
Regarding your comment:
I am using Vuforia framework to augment 3d model in native iOS. It's okay. But, I want to
move 3d model when I move device. It is not provided in any sample code.
Well, it's not provided in any sample code, but that doesn't necessarily mean it's impossible or too difficult.
I would do it like this (working on Android, C++, but it must be very similar on iOS anyway):
locate your renderFrame function
simply do your translation before actual DrawElements:
QCARUtils::translatePoseMatrix(xMOV, yMOV, zMOV, &modelViewProjectionScaled.data[0]);
Where the data for the movement would be prepared by a function that reads time and acceleration from the accelerometer...
What I actually find challenging is finding just the right calibration for a proper adjustment of the sensor API's output, which is a completely different, AR/Vuforia-unrelated question. Here I guess you've got a huge advantage over Android devs regarding the variety of devices...
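For reference, what a translatePoseMatrix-style helper does under the hood can be sketched in plain code: it multiplies a column-major 4×4 pose matrix by a translation, shifting the model by (x, y, z) in its own coordinate frame. This is a pure-logic illustration of the math, not Vuforia's actual implementation.

```swift
import Foundation

// Apply a translation to a column-major 4x4 pose matrix (the OpenGL
// convention used by Vuforia pose matrices): new last column = M * (x, y, z, 1).
func translate(_ m: inout [Float], x: Float, y: Float, z: Float) {
    for row in 0..<4 {
        m[12 + row] += m[row] * x + m[4 + row] * y + m[8 + row] * z
    }
}

// Identity pose translated by a device-motion offset of (1, 2, 3).
var pose: [Float] = [1, 0, 0, 0,
                     0, 1, 0, 0,
                     0, 0, 1, 0,
                     0, 0, 0, 1]
translate(&pose, x: 1, y: 2, z: 3)
print(pose[12], pose[13], pose[14])  // 1.0 2.0 3.0
```

Because the translation is composed with the current pose, the offset is applied in the model's frame, which is exactly what you want when nudging the model by accelerometer-derived movement.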