TouchEvent.supported attempts to create a TouchEvent to determine touch support. This works for actual touch devices; however, it does not help when using Chromium DevTools' "Emulate Touch Screen". Chromium does expose the functions Touch() and TouchList() on the window object. I added a check for context['Touch'] that now reports supportsTouchEvent: true, but that is still not an indicator of whether "Emulate Touch Screen" is active. Any suggestions appreciated!
//bool get supportsTouchEvents => TouchEvent.supported;
bool get supportsTouchEvents {
  // Requires 'dart:html' (TouchEvent) and 'dart:js' (context).
  bool isTouchSupported = TouchEvent.supported;
  if (!isTouchSupported) {
    // Fall back to the browser's native window object: Chromium exposes a
    // Touch() constructor even when TouchEvent.supported is false.
    // Note: this does NOT indicate that "Emulate Touch Screen" is active.
    JsObject nativeTouch = context['Touch'];
    isTouchSupported = nativeTouch != null;
  }
  return isTouchSupported;
}
Update 1: The additional check for the context['Touch'] function does confirm that touch handling is possible, but it yields a false positive because emulation is not active until the DevTools window is opened. As a bloated alternative: if nativeTouch is present, install both mouse and touch stream handlers, as sketched below.
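For illustration, a minimal Dart sketch of that alternative: install both input families and let whichever streams the browser actually delivers win (the element `target` and the two handler functions are placeholders, not part of the original code):

import 'dart:html';

// Install both mouse and touch handlers; only the streams the browser
// actually delivers will ever fire.
void installPointerHandlers(Element target) {
  // Mouse streams: always available.
  target.onMouseDown.listen((e) => handlePress(e.client.x, e.client.y));
  target.onMouseUp.listen((e) => handleRelease(e.client.x, e.client.y));

  // Touch streams: only fire when the browser delivers touch events,
  // e.g. on a real touch device or with DevTools emulation active.
  target.onTouchStart.listen(
      (e) => handlePress(e.touches.first.client.x, e.touches.first.client.y));
  target.onTouchEnd.listen((e) => handleRelease(
      e.changedTouches.first.client.x, e.changedTouches.first.client.y));
}

void handlePress(num x, num y) {/* ... */}
void handleRelease(num x, num y) {/* ... */}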
This is a known bug, tracked here:
https://code.google.com/p/dart/issues/detail?id=16669
I want to automate our iOS app using Xamarin.UITest. To access the app, the user has to provide his/her Touch ID. But both the Xamarin Test Recorder and Repl() fail to identify the Touch ID system dialog (please see the attached screenshot).
This dialog is not developed by our developers; it is a system dialog.
How do I perform the action(s) associated with this Touch ID dialog?
Thanks
System dialogs are not something you can query with Calabash or Xamarin.UITest, which is super annoying.
You would need to use a backdoor to simulate a success or fail state for this. You can read more about these here:
https://developer.xamarin.com/guides/testcloud/uitest/working-with/backdoors/
Basically you define a method in your AppDelegate:
[Export("touchIdBackdoor:")] // notice the colon at the end of the method name
public NSString TouchIdBackdoor(NSString value)
{
if (value == "true")
{
//simulate ok finger press
}
else
{
//simulate failed finger
}
}
Then in your test, when you expect the Touch ID prompt to appear, you invoke the backdoor:
app.Invoke("touchIdBackdoor:", "true");
This dismisses the Touch ID prompt and sets whatever state you need to continue.
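For context, a hedged sketch of how that call might sit inside a Xamarin.UITest test (the [Test] wrapper and the Marked queries are illustrative placeholders; only the app.Invoke line comes from the answer above):

[Test]
public void CanPassTouchId()
{
    // App-specific step that triggers the Touch ID prompt (placeholder query).
    app.Tap(c => c.Marked("LoginButton"));

    // Bypass the system dialog via the backdoor instead of interacting with it.
    app.Invoke("touchIdBackdoor:", "true");

    // Continue asserting on the post-login state (placeholder query).
    app.WaitForElement(c => c.Marked("HomeScreen"));
}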
Is it possible to detect when a user focuses on the program, for example by alt-tabbing to it?
I do get window.onfocus when the window is selected, but when the window is minimized I don't receive anything.
You can try the NW.js Window API:
// Load the library
var gui = require('nw.gui');
// Reference to the current window
var win = gui.Window.get();
win.on('focus', function() {
    console.log('Window is focused');
});
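For the minimized case, NW.js also documents a 'restore' event on the Window object, which may be exactly what plain window.onfocus misses (a small sketch building on win above):

// 'restore' fires when a minimized window is brought back
win.on('restore', function() {
    console.log('Window restored from minimized state');
});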
I have a really weird issue with cocos2d-x v3: the first 15 touches or so are not registered on my iOS device (tried an iPad 2 and an iPad Air). As soon as a touch is finally registered, everything works fine (i.e., all touches after that trigger the onTouch functions).
The touch events work perfectly fine in the simulator.
Also, the same code works perfectly fine in my Windows and Android builds.
Has anyone had this happen, or does anyone know what could be causing it?
I'm using the listener, and I debugged up to the spot where touchesBegan forwards the input events to the listener, but even there the events don't come in until after the 15th tap or so.
It's really weird... I figured I'd give it a shot here, as someone might have encountered this as well, before I start stripping the code down as far as possible and then trying to work my way back from there...
Kind regards,
Michaël
EDIT: As requested, here is some code. The desired behaviour is that it works on iOS devices like it should: the first touch triggers onTouchBegan.
I didn't add it initially as I didn't think it would matter, since the code works fine on Android.
But I appreciate that you'd like to see it, just in case I might have missed something.
GameLayer is a cocos2d::Layer.
void GameLayer::onEnter()
{
    cocos2d::Layer::onEnter(); // v3 API; CCLayer is the deprecated v2 name

    // Register the touch listener
    auto pEventDispatcher = cocos2d::Director::getInstance()->getEventDispatcher();
    if (pEventDispatcher)
    {
        auto pTouchListener = cocos2d::EventListenerTouchOneByOne::create();
        if (pTouchListener)
        {
            pTouchListener->setSwallowTouches( true );
            pTouchListener->onTouchBegan     = CC_CALLBACK_2( GameLayer::onTouchBegan, this );
            pTouchListener->onTouchMoved     = CC_CALLBACK_2( GameLayer::onTouchMoved, this );
            pTouchListener->onTouchEnded     = CC_CALLBACK_2( GameLayer::onTouchEnded, this );
            pTouchListener->onTouchCancelled = CC_CALLBACK_2( GameLayer::onTouchCancelled, this );
            pEventDispatcher->addEventListenerWithSceneGraphPriority( pTouchListener, this );
        }
    }
}
bool GameLayer::onTouchBegan( cocos2d::Touch* pTouch, cocos2d::Event* /*pEvent*/ )
{
    // A breakpoint here triggers fine on the first touch for Android/Windows/
    // iOS Simulator, but not on an iOS device (iPad/iPhone).
    bool breakHere = true;
    <<snip actual code>>
}
EDIT: The problem was an std::ofstream trying to open() a file on the iOS device (most likely in a folder it didn't have access to).
I have lots of layers in my game and I don't do it the way you do. Fetching the EventDispatcher locally and creating the touch listener in that many steps seems odd to me; I've never seen it done that way.
I do:
auto listener = cocos2d::EventListenerTouchOneByOne::create();
listener->setSwallowTouches(true);
listener->onTouchBegan = [=](cocos2d::Touch* touch, cocos2d::Event* event)
{
    return true;
};
listener->onTouchEnded = [=](cocos2d::Touch* touch, cocos2d::Event* event)
{
    // ... do something
};
cocos2d::Director::getInstance()->getEventDispatcher()->addEventListenerWithFixedPriority(listener, 31);
I got it fixed.
The problem was seemingly totally unrelated: I was trying to open an std::ofstream file (my log file), most likely in a folder the app didn't have (any, or at least write) access to.
The log file is neither required nor wanted on the iOS device.
Once I added iOS to the exclusion list (just like Android and some other targets), everything started to work perfectly; see the sketch below.
I don't know what goes wrong exactly, or why it starts working after a few touch inputs, but my guess is that it was waiting on or retrying something in the background.
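For reference, a sketch of the kind of platform guard involved (the log-file name and the logLine helper are placeholders; CC_TARGET_PLATFORM and the CC_PLATFORM_* values are the standard cocos2d-x platform macros):

#include "cocos2d.h"
#include <fstream>
#include <string>

// Only open the debug log on targets where it is wanted; iOS and Android
// are excluded. "debug.log" is a placeholder path.
#if (CC_TARGET_PLATFORM != CC_PLATFORM_IOS) && (CC_TARGET_PLATFORM != CC_PLATFORM_ANDROID)
static std::ofstream s_log("debug.log");
#endif

void logLine(const std::string& line)
{
#if (CC_TARGET_PLATFORM != CC_PLATFORM_IOS) && (CC_TARGET_PLATFORM != CC_PLATFORM_ANDROID)
    if (s_log.is_open())
        s_log << line << '\n';
#endif
}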
I found the issue while debugging another one :)
Hopefully this helps anyone else who might stumble onto the same or a related issue.
Kind regards,
Michaël
I would like to check the user's idle time since the last touch and return the app to the home page after some period of time. I want this done using PhoneGap.
I googled and did find a few solutions, but I specifically want to detect the idle time and then return the app to the home page.
Thanks.
Using jQuery you could bind touchstart and touchend events, then use a timer to execute a function:
var myTimer;
$('body').bind('touchstart', function() {
    clearTimeout(myTimer); // the user is active again, cancel the pending redirect
});
$('body').bind('touchend', function() {
    myTimer = setTimeout(function() {
        /* return user to homepage */
    }, 30000); // 30 seconds after the last touch ended
});
Touch events are a little buggy on mobile devices, but you can set a timeout to fire a set amount of time after the last touch is detected, remembering to clear it on the next touchstart event. It's a bit messy but should work (I haven't tested it, by the way).
I got this working with setTimeout(Redirect, 10000); (passing the function reference rather than a code string), where the Redirect function is function Redirect() { window.location.href = "mylink.html"; }
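Putting the two answers together, an untested minimal sketch ('home.html' is a placeholder for the home page):

var idleTimer;

function goHome() {
    window.location.href = 'home.html'; // placeholder home page
}

function resetIdleTimer() {
    clearTimeout(idleTimer);
    idleTimer = setTimeout(goHome, 30000); // 30 seconds with no touches
}

// Any touch restarts the countdown; start it once on load as well.
document.addEventListener('touchstart', resetIdleTimer, false);
resetIdleTimer();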
I want to be able to detect physical keyboard (e.g. Bluetooth) events in an iOS app.
I came across this post: iOS: how to respond to up/down/left/right arrow event from physical keyboard? which suggests overriding UIApplication's sendEvent:.
After overriding UIApplication, I can tell that keypresses are "UIInternalEvent"s and do not fit into any of the 3 documented event types. Using the debugger, I'm unable to differentiate when a user presses "a" from when they press "left arrow". Any suggestions?
My code:
- (void)sendEvent:(UIEvent *)event {
    UIEventType u = [event type];
    if (u != UIEventTypeTouches) {
        // Non-touch events (including physical key presses) land here.
    }
    [super sendEvent:event];
}
The debugger breakpoint is inside the if statement.
EDIT: Apparently UIInternalEvent is a wrapper for GSEvent. GSEvent info can be found here: https://github.com/kennytm/iphone-private-frameworks/blob/master/GraphicsServices/GSEvent.h, but I still don't know how to make use of it.
Using the private framework GraphicsServices, you can get the GSEventRecord from the UIEvent object in the sendEvent: of the UIApplication. More or less like this:
GSEventRef e = [event _gsEvent]; // `event` is the UIEvent passed to sendEvent:
const GSEventRecord *record = GSEventGetGSEventRecord(e);
Then try looking at the flags value of the record structure; it changes when you press Shift, Control, etc.
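For illustration, a hedged sketch of how this might fit together inside a UIApplication subclass (everything here is private API: the _gsEvent selector and the GSEventRecord layout, including its flags field, are assumptions taken from the reverse-engineered GSEvent.h linked above, and MyApplication is a placeholder class name):

#import <UIKit/UIKit.h>
#import "GSEvent.h" // reverse-engineered private header linked above

// Expose the private accessor (assumption based on the header).
@interface UIEvent (Private)
- (GSEventRef)_gsEvent;
@end

@implementation MyApplication // your UIApplication subclass

- (void)sendEvent:(UIEvent *)event {
    if ([event type] != UIEventTypeTouches) {
        GSEventRef gsEvent = [event _gsEvent];
        const GSEventRecord *record = GSEventGetGSEventRecord(gsEvent);
        if (record) {
            // flags changes while modifier keys (Shift, Control, ...) are held.
            NSLog(@"GSEvent flags: 0x%lx", (unsigned long)record->flags);
        }
    }
    [super sendEvent:event];
}

@end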