Fine-grained control of vibration unit - iOS

How much control do I have over the vibration unit on the iPhone?
I am simulating a plucked string; it would be fantastic if I could get the iPhone to vibrate in accord with the sound.
So:
a) Is it possible to modulate the intensity of the vibration?
b) Is it possible to change the duration?
Would it be possible to mimic a sharp attack that falls off gradually over a period of a few seconds?
I am aware that in older versions of iOS it was only possible to activate the vibration; both duration and intensity were fixed (see taking control on vibration).
Is this still the case, or has it become more flexible?

iOS 5 comes with an app that lets you design a custom vibration. The vibration is "designed" based on the user tapping on the phone.
That means it's technically feasible to create custom vibrations, but I don't know if there's an API for it. If there were one in the newest SDK, it would still be under NDA anyway.
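For reference, a minimal sketch of the only public trigger I'm aware of: it plays the fixed vibration the question describes, with no control over intensity, duration, or envelope.

    import AudioToolbox

    // Fires the built-in vibration: fixed duration and intensity,
    // no way to shape a sharp attack that decays over a few seconds.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)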

Related

Is there any way to detect whether a device has a case and screen protector on it?

I know this might be a stupid question, but can we detect whether an iPhone device has a case and screen protector on it in code?
Enable the microphone and then vibrate the device. If you have a baseline recording of how the vibration sounds without a bumper, you can diff that against what you record.
Typically, cases are made out of materials like silicone that reduce the intensity of the vibration sound.
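A rough sketch of that idea, assuming AVAudioRecorder metering; the one-second delay and the baseline you compare against are things you'd have to calibrate yourself.

    import AVFoundation
    import AudioToolbox

    // Sketch: record for a second while the device vibrates, then report the
    // average input power so it can be compared with a "no case" baseline.
    func measureVibrationLoudness(completion: @escaping (Float) -> Void) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)

        let url = FileManager.default.temporaryDirectory.appendingPathComponent("vibration.caf")
        let recorder = try AVAudioRecorder(url: url, settings: [
            AVFormatIDKey: kAudioFormatAppleIMA4,
            AVSampleRateKey: 44_100.0,
            AVNumberOfChannelsKey: 1
        ])
        recorder.isMeteringEnabled = true
        recorder.record()

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
            recorder.updateMeters()
            let power = recorder.averagePower(forChannel: 0) // dBFS; closer to 0 is louder
            recorder.stop()
            completion(power) // compare against a baseline captured without a case
        }
    }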
I don't think so. Screen protectors and cases are made with the sensors in mind, so they don't interfere with the proximity sensor, cameras, or microphone, which I think are the only ways we have to know if there's something on the phone.
Battery cases or Lightning accessories are something else, but I don't think you are talking about those.

Is it possible to simulate motion/collisions in GameScene for iOS?

I have an idea for a game in iOS but I would need to be able to simulate a wide range of physicsWorld possibilities for the game to work.
I can probably do this manually, but I'm not sure that it would be realistic to do this in real time (I would need to perform the calculations on the fly prior to new game setups). I'm wondering if there's anything built in for iOS.
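SpriteKit does ship with a real-time physics engine, so depending on what range of physicsWorld behaviour you need, you may not have to roll the simulation yourself. A minimal sketch (the gravity value and node sizes are arbitrary):

    import SpriteKit

    // Minimal scene: the built-in physicsWorld simulates gravity and collisions
    // in real time once nodes are given physics bodies.
    class GameScene: SKScene {
        override func didMove(to view: SKView) {
            physicsWorld.gravity = CGVector(dx: 0, dy: -9.8)
            // Static body around the scene edges so nodes stay on screen.
            physicsBody = SKPhysicsBody(edgeLoopFrom: frame)

            let box = SKSpriteNode(color: .red, size: CGSize(width: 40, height: 40))
            box.position = CGPoint(x: frame.midX, y: frame.midY)
            box.physicsBody = SKPhysicsBody(rectangleOf: box.size)
            addChild(box)
        }
    }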

Form Slide Transition Not Smooth

I used to test my apps using an iPhone 5S - now I have switched to an iPhone SE.
Now I am asking myself why the default form slide transitions still stutter on such a fast device - the animation should always appear smooth if the calculated locations are correct relative to the timeline.
Looking into CommonTransitions I saw that there is a CommonTransitions.TYPE_FAST_SLIDE and wondered whether this is the key to smooth transitions - is it?
In Theme Constants in the Codename One Designer, however, there is no fastSlide option under FormTransitionOut - why is that?
I've wondered the same, although I have only a cheap device to test on.
This may only add fuel to the fire, but have you tried Display.setFramerate(int rate)? The docs say the default is an (attempted) rate of 10 redraws per second.
Maybe it would look smoother with 20?

How to use CMDeviceMotion to detect how fast my device is traveling?

I am writing an iOS app and don't want to start detecting shake events until my device has reached a certain speed. I know how to use CMMotionManager to get CMAccelerometerData to detect shake events, but does anyone know what I should use to detect how fast my device is moving? CMDeviceMotion's userAcceleration, GPS... I just cannot find what I should use. I am writing my app in Swift, but answers in Objective-C will suffice as well.
Thank you for your help.
You could use Core Location and read a CLLocation's speed property. This requires the device's GPS and will only work reliably at somewhat large speeds.
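A minimal sketch of that, assuming a CLLocationManager delegate; the 5 m/s cutoff is a placeholder you'd tune, and speed comes back in metres per second (negative when unavailable).

    import CoreLocation

    // Gate shake detection on GPS-reported speed.
    // Requires NSLocationWhenInUseUsageDescription in Info.plist.
    class SpeedGate: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()
        private(set) var isMovingFastEnough = false
        var threshold: CLLocationSpeed = 5.0 // m/s; placeholder cutoff

        override init() {
            super.init()
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
            manager.requestWhenInUseAuthorization()
            manager.startUpdatingLocation()
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            guard let speed = locations.last?.speed, speed >= 0 else { return }
            isMovingFastEnough = speed >= threshold
        }
    }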

Play sound when silence in the room; stop sound when voices heard

I need some guidance as I may have to shelve development until a later time.
I want to play a sound once the lights are switched off and the room goes dark, then stop the sound once the light is switched back on. I've discovered that Apple doesn't currently provide a way to access the ambient light sensor (not in any way that will get App Store approval).
The alternative I've been working on is to try to detect sound levels (using AVAudioPlayer/AVAudioRecorder and the example code from http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/). That is, when I detect that the voices of people in the room have dropped to a specific level (i.e. silence, trying to compensate for background noise), I play my sounds.
However, if the people in the room start talking again and I detect the voices, I need to stop playing the sounds.
Q: Is this self-defeating, i.e., will the sound generated by the iPhone essentially be picked up by the iPhone's microphone and be indistinguishable from any voices in the room? Methinks yes, and unless there's an alternative approach to this, I'm at an impasse until the light sensor API is opened up by Apple.
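For concreteness, this is roughly the metering loop I have in mind, based on AVAudioRecorder metering; the -35 dB threshold and half-second polling interval are guesses I'd still have to tune.

    import AVFoundation

    // Sketch: meter the mic and toggle looping playback around a silence threshold.
    class SilencePlayer {
        private let recorder: AVAudioRecorder
        private let player: AVAudioPlayer
        private var timer: Timer?
        private let silenceThreshold: Float = -35.0 // dBFS; needs per-room tuning

        init(soundURL: URL) throws {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playAndRecord, mode: .default, options: [])
            try session.setActive(true)

            let scratch = FileManager.default.temporaryDirectory.appendingPathComponent("meter.caf")
            recorder = try AVAudioRecorder(url: scratch, settings: [
                AVFormatIDKey: kAudioFormatAppleIMA4,
                AVSampleRateKey: 44_100.0,
                AVNumberOfChannelsKey: 1
            ])
            recorder.isMeteringEnabled = true

            player = try AVAudioPlayer(contentsOf: soundURL)
            player.numberOfLoops = -1 // loop until voices return
        }

        func start() {
            recorder.record()
            timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
                guard let self = self else { return }
                self.recorder.updateMeters()
                let level = self.recorder.averagePower(forChannel: 0)
                if level < self.silenceThreshold {
                    if !self.player.isPlaying { self.player.play() }
                } else if self.player.isPlaying {
                    self.player.stop()
                }
            }
        }
    }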
I don't think the noise made by the iPhone speaker will be picked up by the mic. The phone cancels sounds generated by the speaker. I read this once, and if I find the source I'll post it. Empirically, though, you can tell this is the case when you use speakerphone. If the mic picked up sound from the speaker that's an inch away from it, the feedback would be terrible.
Having said that, the only sure way to see if it will work for your situation is to try it out.
I agree with woz: the phone should cancel the sound it's emitting. About the ambient light sensor, the only alternative I see is using the camera, but it would be very energy inefficient, and would require the app to be launched.
