As part of a children's application, I am trying to detect a hand print on an iPad screen. I can detect touches with the touch events, but ultimately the hand is going to be one large touch event. I am trying to detect the size and x,y position of the hand. Any ideas on whether this is even possible?
The other part of this is that I am using Corona SDK/Lua to develop. Is the solution (if there is one) still going to be feasible, or should I look at changing to native code?
I don't know how practical this will be in native code. It sounds like you could enable multitouch, and on touch events you would get multiple locations. See the Multitouch sample app.
If you're on a Mac, open Finder and go to /Applications/CoronaSDK/SampleCode/Interface/ and there are three sample projects in there that involve multitouch. If you're on Windows it's C:\Program Files (x86)\CoronaSDK\SampleCode (I think).
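If you do end up going the native route the question mentions, here is a rough sketch of what that could look like in UIKit. The HandprintView name and the centroid/bounding-box heuristic are my own assumptions, not something taken from the Corona samples: enable multitouch on the view, collect the active touches, and treat their centre and padded bounding box as the position and size of the hand.

```swift
import UIKit

// Rough sketch (assumed names): treat all simultaneous touches as one "hand"
// and derive a centre point and an approximate size from them.
class HandprintView: UIView {

    private var activeTouches = Set<UITouch>()

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true   // otherwise UIKit delivers only one touch
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isMultipleTouchEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.formUnion(touches)
        reportHand()
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        reportHand()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.subtract(touches)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.subtract(touches)
    }

    private func reportHand() {
        guard !activeTouches.isEmpty else { return }
        let points = activeTouches.map { $0.location(in: self) }

        // Centre of the "hand": the centroid of all current touch points.
        let centre = CGPoint(
            x: points.map(\.x).reduce(0, +) / CGFloat(points.count),
            y: points.map(\.y).reduce(0, +) / CGFloat(points.count)
        )

        // Approximate size: bounding box of the points, padded by the largest
        // touch radius the hardware reports.
        let pad = activeTouches.map(\.majorRadius).max() ?? 0
        let width  = points.map(\.x).max()! - points.map(\.x).min()! + 2 * pad
        let height = points.map(\.y).max()! - points.map(\.y).min()! + 2 * pad

        print("Hand at \(centre), approx size \(width) x \(height)")
    }
}
```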
I am making an application that works essentially like a simple Drag-and-Drop Playground with the command blocks on the left and a droppable area on the right. I want to make it fully compatible with VoiceOver and I'm running into trouble with some of the accessibility aspects since this is my first Swift application.
This is what the playground currently looks like: (App Screenshot)
My goal is to provide the users with audio cues/feedback while they are dragging the elements to help them figure out what part of the screen they are currently at. The ideal functionality would be exactly like what one uses when editing an iOS device's Home screen (the arrangement layout of the apps).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a row/column alert when you are dragging an app over an open area. I want a similar type of feedback that says "Droppable Area" when you are over the correct area (see scenario 1).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a sound when you tap on an area that has no app icon. (This also happens when you are not editing the layout and simply tap on an open area with no app.) I want that noise to be what you hear when you drag a command over an area that is not droppable (see scenario 2).
Any ideas on how this might be possible or good references to look at?
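One approach that might give this kind of spoken feedback is posting VoiceOver announcements as the drag moves between regions. Below is a minimal sketch under that assumption; the DragFeedback helper, the dropArea view, and the announcement strings are placeholders of mine, not anything from UIKit's Home screen implementation.

```swift
import UIKit

// Hypothetical helper: announce which region the dragged block is currently over.
final class DragFeedback {

    private var lastAnnouncement: String?

    /// Call this from whatever pan gesture / drag callback moves the block.
    /// `point` is expected in `container`'s coordinate space.
    func dragMoved(to point: CGPoint, in container: UIView, dropArea: UIView) {
        guard UIAccessibility.isVoiceOverRunning else { return }

        let dropFrame = dropArea.convert(dropArea.bounds, to: container)
        let message = dropFrame.contains(point) ? "Droppable area" : "Not a droppable area"

        // Only announce when the region changes, so VoiceOver is not flooded.
        if message != lastAnnouncement {
            lastAnnouncement = message
            UIAccessibility.post(notification: .announcement, argument: message)
        }
    }
}
```

For the second scenario you could also play a short error sound at the same point instead of, or in addition to, the announcement.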
Hi, I have been working on a BlackBerry app and have developed about 90% of it, but my client wants the app to work on BlackBerry touch devices as well. So I just wanted to know what changes I will need to make for the same code to work on a touch-screen BlackBerry. Please help me, I am new to BlackBerry app development.
Just to confirm, we are talking about a Java application running on BB7 or earlier OS.
The short answer is, it depends.
If you have used standard RIM controls (buttons, ObjectChoiceField etc.) then these will work on the touch screen with no change. If you have used your own controls, for example an image button, then these might not work very well - for example the button might not be big enough to be hit easily with a fat finger.
The other problem is the virtual keyboard: it may appear at points when you do not want it to, and not appear when you do. This is not a problem on a lot of phones, but remember there are at least two phones out there with no physical keyboard, so the virtual keyboard is their only option for typing.
The best approach is to work through each of your screens and try them in the Simulator. Zoom the Simulator so that it looks like the real device, and pretend you are using a finger; don't rely on the mouse, because you can position the mouse very accurately.
I would raise new questions about the specific Fields that you have problems with, rather than continuing this one with any issues you find.
I want to get the touch points in an iOS app under Delphi (preferably as an event).
Is there an interface for this that is exposed?
I found this:
Get touch points in UIScrollView through UITapGestureRecognizer.
But I don't know if it's possible to convert this code.
I am trying to implement a number of on-screen sliders (audio faders) so that the user can slide a finger to move them up and down. I plan to put a transparent rectangle control on top of the sliders so that I can capture multiple touch points and move more than one slider simultaneously.
After going down this rabbit hole about as far as possible, I am sad to say that it is not possible to get multiple touch points in Delphi. This is very shortsighted and appears to be a trivial oversight by Embarcadero.
Essentially, the Delphi UIView implementation does not expose the "multipleTouchEnabled" property, so the touch handling inside FMX.Platform.iOS.pas will never receive more than one touch event.
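For reference, the UIKit-level opt-in that the FMX wrapper never performs is just a property on the view; in a native app it looks roughly like this (the MultiTouchView subclass is an arbitrary example of mine):

```swift
import UIKit

// Sketch of the native opt-in missing from the Delphi/FMX UIView wrapper.
// Without isMultipleTouchEnabled = true, UIKit delivers only the first touch.
final class MultiTouchView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isMultipleTouchEnabled = true
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // With multitouch enabled, every active finger is reported here,
        // which is what a bank of on-screen faders would need.
        for touch in event?.allTouches ?? touches {
            print("finger at \(touch.location(in: self))")
        }
    }
}
```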
I am posting this to save the next person the time that I put into finding this disappointing answer.
I have a website that I would like to be iPad compatible. However, I use Windows, so I can't install the official emulator.
I tried iBBDemo2 (http://code.google.com/p/ibbdemo2/), which looks good, but it uses regular mouse events instead of touch events (so I can double-click with my mouse and fire a dblclick event on an element... which in this case would just cause a zoom to occur on a real iPad).
Is there any way at all to test this?
Have you tried the Ripple emulator? It is a very useful Chrome extension. It has accelerometer support and other features.
The link is here.
I don't know if it really solves your problem, but you can try it for free.
I am implementing a camera application using the "CameraDemo" example that comes with the BlackBerry plug-in for Eclipse. The problem is that when the screen loses focus, it does not display the camera view; instead it shows like this:
Has anybody faced such a problem? What's the solution?
This way of taking a picture (using the Player and VideoControl.getSnapshot()) does not work well on all BB models. I'd even say it works well only on a narrow set of BB models. So if you are going to use your app on a wide range of BB models, then this is not the right way to go.
Instead, to take a picture, use the built-in Camera app. Here is a starting point on how to do that.
Basically you invoke the built-in Camera app and listen for the file-system changes to detect a new image file path. Then you need to close the built-in Camera app somehow - it's possible to do that by simulating two 'Esc' button presses.
Yes, this sounds a bit hacky/over-complicated, but that's how the BB engineers arranged it for us. :) BTW, this is actually not so bad compared with Android, where different device manufacturers violate the common rules and implement the Camera app in their own way, so you are not able to write the code once and cover all Android devices.