Testing iPad Touch Events on Windows

I have a website that I would like to be iPad compatible. However, I use Windows, so I can't install the official emulator.
I tried iBBDemo2 (http://code.google.com/p/ibbdemo2/), which looks good, but it uses regular mouse events instead of touch events (so I can double-click with my mouse and fire a dblclick event on an element, which on a real iPad would just cause a zoom).
Is there any way at all to test this?
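One way to make desktop testing at least partly representative is to route mouse and touch input through a single code path, so the same handlers can be exercised with a mouse on Windows and with real touches on the iPad. A minimal sketch, with names of my own choosing (not from any particular library):

```javascript
// Normalize mouse and touch events to a single {x, y} point.
// Touch events carry coordinates in a touches list; mouse events
// carry them directly on the event object.
function getEventPoint(e) {
  var source = (e.touches && e.touches.length) ? e.touches[0] : e;
  return { x: source.clientX, y: source.clientY };
}

function onPress(e) {
  var p = getEventPoint(e);
  console.log('pressed at', p.x, p.y);
}

// Register both event families on the same element.
// (Guarded so the snippet also runs outside a browser.)
if (typeof document !== 'undefined') {
  document.body.addEventListener('mousedown', onPress);
  document.body.addEventListener('touchstart', onPress);
}
```

This does not reproduce iPad-specific behavior like double-tap zoom, but it means the logic you test with a mouse is the same logic that will run on the device.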

Have you tried the Ripple emulator? It is a very useful Chrome extension, with accelerometer support and other features.
I don't know if it really solves your problem, but you can try it for free.

Related

Detecting Hand touch event on iPad screen

As part of a children's application, I am trying to detect a hand print on an iPad screen. I can detect touches with the touch events, but ultimately the hand is going to register as one large touch event. I am trying to detect the size and x,y position of the hand. Any ideas on whether this is even possible?
The other part of this is that I am using the Corona SDK/Lua to develop. Is the solution (if there is one) still going to be feasible, or should I look at changing to native code?
I don't know how practical this will be in native code. It sounds like you could enable multitouch, and on touch events you would get multiple locations. See the Multitouch sample app.
If you're on a Mac, open Finder and go to /Applications/CoronaSDK/SampleCode/Interface/ and there are three sample projects in there that involve multitouch. If you're on Windows it's C:\Program Files (x86)\CoronaSDK\SampleCode (I think).

What changes are needed to make BlackBerry application code work on a BlackBerry touch device?

Hi, I have been working on a BlackBerry app and have developed about 90% of it. But my client wants the app to work on BlackBerry touch devices as well. So I just wanted to know what changes I will need to make for the same code to work on BlackBerry touch. Please help me; I am new to BlackBerry app development.
Just to confirm, we are talking about a Java application running on BB7 or earlier OS.
The short answer is, it depends.
If you have used standard RIM controls (buttons, ObjectChoiceField etc.) then these will work on the touch screen with no change. If you have used your own controls, for example an image button, then these might not work very well - for example the button might not be big enough to be hit easily with a fat finger.
The other problem is the virtual keyboard: it may appear at points when you do not want it to, and not appear when you do. This is not a problem on a lot of phones, but remember there are at least two non-keyboard phones out there, so the virtual keyboard is the only option for typing.
The best approach is to work through each of your screens and try them in the simulator. Zoom the simulator so that it looks like the real device, and pretend you are using a finger; don't rely on the mouse, because you can position the mouse very accurately.
I would raise new questions about specific Fields that you have problems with, rather than continuing this one with any issues you find.

Show taps in iOS App Demo Video

I am making demo videos of my iOS apps, some of which I made with Xcode and others in Unity3D. I plan to use the Elgato Game Capture HD to capture the videos, but I am not sure how to show the "taps."
I found Touchpose, but when I changed main.m as suggested by the instructions on the GitHub page, I got error messages saying that QAppDelegate and QTouchposeApplication were undefined. I added import "QTouchposeApplication" but got an error message suggesting I change QAppDelegate to AppDelegate. When I did this, the build failed. When I left it as QAppDelegate, added QAppDelegate to the project, and imported it into main.m, the error messages went away but the build still failed. Am I doing something incorrectly? I found no tutorials on Touchpose online and am confused.
Alternatively, is there some other easy solution, for example using the Elgato software or some other software or framework? I also tried Reflector, but my Wi-Fi is not fast enough to support good frame rates with it. I am aware that I could use Unity and "build for OS X," or in the case of Xcode apps just screen-record the simulator, but I would prefer a single, foolproof solution for all of my apps. Thanks!
One way is to use AssistiveTouch to create your own touch. Go to Settings > General > Accessibility > AssistiveTouch, turn it on, tap the button that pops up, choose Custom, and create your own touch. Then use that touch to show taps in a screen recording.
It's a bad solution, but I'll post it in case it fits your needs.
You would have to superimpose multiple videos to make it so that only the tap shows and not the AssistiveTouch icon. Also, you can't scroll while using it.
It's also explained here: https://www.youtube.com/watch?v=4JqjU0-4Cek
Are you able to demo on a simulator? If so, Giphy Capture is a good solution. It has an option for showing taps in the settings.
If you can make your screen recording in the iOS Simulator, you can enable the display of taps with the following command. Open your terminal, run it, and restart your simulator.
defaults write com.apple.iphonesimulator ShowSingleTouches 1
I had trouble with Touchpose initially but figured out how to use it: you have to go to Build Phases and add the .m files. For screen recording I am using the Elgato Game Capture HD. My only remaining problem is showing the dots in Unity3D applications, which I am still working on.

Touch/Drag Heuristics have changed in iOS 6. Any way to get the old behavior back?

Our iPad app uses a WebKit UI for a lot of the user interaction, and we are now fielding complaints from users that in iOS 6 the UI is ignoring their touches. We've done side-by-side comparisons, and are now quite certain that whereas a touch-small-drag-release gesture in iOS 5 would trigger an onclick event, a touch-small-drag-release gesture in iOS 6 does not. Thus, in iOS 6, you need to be very careful never to move your finger while pressing a button in the UI. (Or perhaps they just changed the definition of "small" in small-drag.)
We believe that disabling multi-touch gestures in the Settings > General page improves things somewhat, although we're not convinced this isn't a placebo effect.
As a test, I tried removing the scroll-preventing:
document.body.addEventListener('touchmove', function(e){ e.preventDefault(); });
from our code, but it made no difference (other than making it really obvious that the drag events are dragging).
My next idea is to go through and change everywhere that we rely on onclick to instead rely on ontouchstart, but, well, yuck. (Particularly, yuck, in cases where we also need the same code to work in desktop browsers.)
Are we alone here? I'm not finding any complaints about this in my searches. Any clever ideas?
You are not alone!
We hit a similar problem in our new game Blue Pilot while extending support to the iPhone 5. We are now handling both events.
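A sketch of what handling both can look like: detect a tap yourself from touchstart/touchend with an explicit movement tolerance, instead of relying on the browser's click heuristics, and keep a click listener for desktop. The 10px slop value and the `handleTap` name are my own assumptions, not from the original app.

```javascript
// Treat a touch-small-drag-release as a tap, independent of the
// browser's own heuristics. The slop threshold is an assumption;
// tune it for your UI.
var TAP_SLOP = 10;

function makeTapTracker(slop) {
  var start = null;
  return {
    // call from touchstart
    down: function (x, y) { start = { x: x, y: y }; },
    // call from touchend; returns true if the gesture counts as a tap
    up: function (x, y) {
      if (!start) return false;
      var dx = x - start.x, dy = y - start.y;
      start = null;
      return (dx * dx + dy * dy) <= slop * slop;
    }
  };
}

// Wiring it up (browser only); a plain click listener still covers
// desktop browsers. handleTap stands in for your existing onclick logic.
if (typeof document !== 'undefined') {
  var tracker = makeTapTracker(TAP_SLOP);
  document.body.addEventListener('touchstart', function (e) {
    var t = e.touches[0];
    tracker.down(t.clientX, t.clientY);
  });
  document.body.addEventListener('touchend', function (e) {
    var t = e.changedTouches[0];
    if (tracker.up(t.clientX, t.clientY)) {
      e.preventDefault();   // suppress the synthesized click
      handleTap(e.target);  // your existing onclick logic
    }
  });
}
```

Calling preventDefault in touchend keeps the browser from also synthesizing a click, so the handler does not fire twice on devices that deliver both.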

HTML 5 Canvas Whiteboard for iOS

I have been working with the code here: http://code.google.com/p/html-5-canvas-whiteboard/
Everything works great in the browser, but not on iOS devices.
Can anyone point me in the right direction to make this iOS compatible? More specifically, I would like it to work in Safari for my iPad.
Here is a test version of my code: http://www.coderedsupport.com/whiteboard
Any advice would be great.
You have mousedown, mouseup, and mousemove events; you need to add touchstart, touchend, and touchmove events to match them.
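One way to do that without rewriting the drawing code is to forward each touch event to the matching mouse handler by re-dispatching a synthetic mouse event. A rough sketch, assuming the whiteboard already listens for mousedown/mousemove/mouseup on the canvas; the element id 'canvas' is a guess, so adjust it to your page:

```javascript
// Copy the coordinates a mouse handler expects from a Touch object.
function touchToMouseInit(touch) {
  return { bubbles: true, clientX: touch.clientX, clientY: touch.clientY };
}

// Build a touch listener that re-dispatches a synthetic mouse event
// of the given type at the same coordinates.
function forwardTouch(type) {
  return function (e) {
    var t = e.changedTouches[0];
    t.target.dispatchEvent(new MouseEvent(type, touchToMouseInit(t)));
    e.preventDefault(); // keep the page from scrolling while drawing
  };
}

if (typeof document !== 'undefined') {
  var canvas = document.getElementById('canvas'); // adjust the id
  canvas.addEventListener('touchstart', forwardTouch('mousedown'));
  canvas.addEventListener('touchmove', forwardTouch('mousemove'));
  canvas.addEventListener('touchend', forwardTouch('mouseup'));
}
```

This keeps all drawing logic in the existing mouse handlers; the touch listeners only translate input.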
You need to map the mouse events used for drawing on the whiteboard to the corresponding touch events. Have a look in the Apple Developer Library; I used this for a little proof of concept and it worked just fine on iOS and Android.
UPDATE: Regarding your comment, you should have a look at this blog.
