Is there a way to find out the maximum number of simultaneous touches an iOS device (iPhone, iPod Touch, iPad) can handle? I've read here and there that the iPhone can handle 5 while the iPad can handle 11, but I haven't found an official way (through a function call, say) to confirm this.
By testing it! See here for videos and source: http://mattgemmell.com/2010/05/09/ipad-multi-touch
There's no public API to request that information from the hardware.
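If you want to reproduce that kind of test yourself, here is a minimal sketch of a touch-counting view (the class name is my own; this is an illustration, not the code from the linked article):

    import UIKit

    // A view that records the largest number of simultaneous touches it has seen.
    class TouchCountingView: UIView {

        private(set) var maxSimultaneousTouches = 0

        override init(frame: CGRect) {
            super.init(frame: frame)
            isMultipleTouchEnabled = true   // off by default; required to receive more than one touch
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
            isMultipleTouchEnabled = true
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            // allTouches contains every touch currently on the screen, not just the new ones.
            let current = event?.allTouches?.count ?? touches.count
            maxSimultaneousTouches = max(maxSimultaneousTouches, current)
            print("Currently \(current) touches, max so far: \(maxSimultaneousTouches)")
        }
    }

Put all ten (or eleven) fingers down and watch the console; the counter stops growing at the hardware's limit.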
The iPhone can register 5 touch points on its small display; I don't know about the iPad.
Having said that, I wouldn't count on numbers you find empirically, because this information is not documented, nor is there an API for it.
One good reason I can think of is reduced touch-sensor accuracy as the number of touch points increases.
Is it practically possible for a developer right now (or at least theoretically in the future) to develop an app that can measure, via UWB, the distance to other iPhones?
UWB technology can take different forms in terms of ranging techniques. How does (or will) ranging work on these iPhones?
I have seen the iPhone 11 schematic and found that the UWB chip (U1) supports 5 antennas: three of them are used for AoA positioning, and the other two support data transmission on UWB channels 5 and 9. Positioning between an iPhone and an AirTag should currently be carried out through the 3 AoA antennas; the other 2 should be used to let the iPhone act as a tag (positioning with other iPhones, or as an anchor/node).
Yes. This might be a late answer, but iOS provides the Nearby Interaction framework to get distance and direction information.
https://developer.apple.com/documentation/nearbyinteraction
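A minimal sketch of how ranging with that framework looks, assuming the two iPhones have already exchanged their NIDiscoveryTokens over a side channel of your choosing (e.g. MultipeerConnectivity):

    import NearbyInteraction

    // Runs a ranging session against one peer iPhone.
    class RangingController: NSObject, NISessionDelegate {

        let session = NISession()

        func startRanging(with peerToken: NIDiscoveryToken) {
            session.delegate = self
            // The peer's token must be obtained out of band (your own networking code).
            let config = NINearbyPeerConfiguration(peerToken: peerToken)
            session.run(config)
        }

        // Called repeatedly as the U1 chip updates its measurements.
        func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
            for object in nearbyObjects {
                if let distance = object.distance {
                    print("Distance to peer: \(distance) m")
                }
                if let direction = object.direction {
                    print("Direction to peer: \(direction)")  // unit vector, simd_float3
                }
            }
        }
    }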
I am writing an iOS app and don't want to start detecting shake events until my device has reached a certain speed. I know how to use CMMotionManager to get CMAccelerationData to detect shake events, but does anyone know what I should use to detect how fast my device is moving? CMDeviceMotion's userAcceleration? GPS? I just cannot find what I should use. I am writing my app in Swift, but answers in Objective-C will suffice as well.
Thank you for your help
You could use Core Location and read a CLLocation's speed property. This requires the device's GPS and will only work for reasonably high speeds.
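A minimal sketch, assuming a hypothetical threshold of 5 m/s (pick whatever suits your app):

    import CoreLocation

    // Gates shake detection on the device's current GPS speed.
    class SpeedGate: NSObject, CLLocationManagerDelegate {

        private let manager = CLLocationManager()
        var shakeDetectionEnabled = false

        override init() {
            super.init()
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyBest
            manager.requestWhenInUseAuthorization()  // requires a usage description in Info.plist
            manager.startUpdatingLocation()
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            // speed is in m/s; a negative value means the reading is invalid.
            guard let speed = locations.last?.speed, speed >= 0 else { return }
            shakeDetectionEnabled = speed > 5.0  // hypothetical threshold
        }
    }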
I'm working on an app that tries to keep two MKMapViews synchronized with respect to scale. I spent a few days debugging on the iOS simulator and was getting increasingly frustrated that attempts to set a map view's scale, whether by setting the region or the MKMapRect, yielded results wildly different from what I expected.
When I tried the app on the most convenient iOS device at hand (an iPad mini), MapKit was working mostly as expected and I was able to resolve the remaining nuances quickly. At this point, I can get both maps on the device to display identical areas (down to 10 m or less in each dimension); on the simulator, setting a map's scale yields a result sometimes off by as much as 2x the expected scale.
Has anybody else experienced this disparity between the simulator and the device? If so, any explanation?
Thanks in advance.
The scale of MapKit cannot be set accurately, neither for one view nor for both.
At least this is valid before iOS 6.
The reason is that MapKit snaps to the next suitable Google tile resolution: if you want, say, a scale 5% larger than the nearest Google zoom level, it will still snap to that level.
So up to and including iOS 5 it is not possible to programmatically zoom to an exact value. (I have seen no post that mentions the behavior of the iOS 6 Apple maps.)
So in your case, one view could match one of the 16 Google zoom levels, while the other view falls into another zoom level.
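One way to see the snapping for yourself is to compute the effective zoom level from the region the view actually ends up with. A minimal sketch, assuming the usual 256-point Web Mercator tiles:

    import Foundation
    import MapKit

    // Effective zoom level of a map view, derived from the longitude span it
    // is actually displaying. If MapKit snaps to discrete tile resolutions,
    // the values you get back will cluster near whole numbers.
    func effectiveZoomLevel(of mapView: MKMapView) -> Double {
        let longitudeDelta = mapView.region.span.longitudeDelta
        let viewWidth = Double(mapView.bounds.width)  // in points
        return log2(360.0 * viewWidth / (longitudeDelta * 256.0))
    }

Log this for both views after setting their regions; on affected iOS versions you should see the two views land on different discrete levels.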
I've been working with the iOS sensors a bit of late, and I wanted to write an app that would accurately track the motion of the phone in space. I wanted to know if it's possible to track the motion of the device and detect gestures, such as drawing a circle with your phone or even moving it in a straight line.
I've been searching online about this, and I wanted to know two things:
1. Is it possible to do this with the Core Motion framework?
2. If yes, what is the alternative for older devices that do not support Core Motion, without the double-integration method using the accelerometer?
This would really help!
Any other alternative ideas are most welcome!
Thanks in advance!
As you write, you cannot do the double integration.
For gesture recognition, I would try dynamic time warping. See my earlier answer here.
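To give an idea of the approach, here is a minimal sketch of dynamic time warping on two 1-D sequences (e.g. acceleration magnitudes recorded while performing a gesture); this is an illustration, not the code from the linked answer:

    import Foundation

    // Classic O(n*m) dynamic time warping distance between two 1-D sequences.
    func dtwDistance(_ a: [Double], _ b: [Double]) -> Double {
        let n = a.count, m = b.count
        guard n > 0, m > 0 else { return .infinity }

        var cost = Array(repeating: Array(repeating: Double.infinity, count: m + 1),
                         count: n + 1)
        cost[0][0] = 0

        for i in 1...n {
            for j in 1...m {
                let d = abs(a[i - 1] - b[j - 1])
                // Extend the cheapest of the three allowed warping steps.
                cost[i][j] = d + min(cost[i - 1][j],      // insertion
                                     cost[i][j - 1],      // deletion
                                     cost[i - 1][j - 1])  // match
            }
        }
        return cost[n][m]
    }

To recognize a gesture, compare the live recording against a few stored templates and pick the template with the smallest DTW distance.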
How much control do I have over the vibration unit on the iPhone?
I am simulating a plucked string; it would be fantastic if I could get the iPhone to vibrate in accord with the sound.
So:
a) Is it possible to modulate the intensity of the vibration?
b) Is it possible to change the duration?
Would it be possible to mimic a sharp attack that falls off gradually over a period of a few seconds?
I am aware that in older versions of iOS it was only possible to activate it; both duration and intensity were fixed: taking control on vibration
Is this still the case, or has it become more flexible?
iOS 5 comes with an app that lets you design a custom vibration. The vibration is "designed" by the user tapping on the phone.
That means it's technically feasible to create custom vibrations, but I don't know if there's an API for it. If there were one in the newest SDK, it'd still be under NDA anyway.
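For what it's worth, the only public API I'm aware of still just triggers the fixed vibration, with no control over duration or intensity (a minimal sketch):

    import AudioToolbox

    // Triggers the standard system vibration; both duration and intensity are fixed.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)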