I have modified the Google Cardboard DemoScene with my own UI text buttons. By default, these buttons respond to OnClick() or tap events to trigger actions. I would like to trigger these actions when the user sets their gaze on the object for 2 seconds.
I suspect I need to add some kind of conditional statement in the GazeInputModule but I can't figure out how to measure the time. Can anyone point me in the right direction? Is this the right approach or should I try something else? This is all still pretty new to me so even basic tips are very helpful!
Here's how I enabled a button click on a timed gaze. In my case I created a button to load the next scene.
I created a scene in my Unity game, so I didn't start off with modifying the Google Cardboard DemoScene, but the principles are similar.
Create a script named "LoadSceneButton.cs" (or any suitable name, such as "TimedGazeButton.cs") and attach it to the button you want to enable the timed gaze on. See the sample script at http://pastebin.com/CXd6HA3C
On the button, add the "Event Trigger" component and set two triggers (see the "Timed gaze button Event Triggers" screenshot):
Pointer Enter calls the button's LoadSceneButton.SetGazedAt, with the box checked to pass in the TRUE value. This indicates that the user has started gazing at the object.
Pointer Exit calls the button's LoadSceneButton.SetGazedAt, with the box unchecked to pass in the FALSE value. This indicates that the user has stopped gazing at the object and has moved the reticle somewhere else.
The "LoadSceneButton.cs" will start timing the gaze when user's reticle moves into the button. Once the gaze time has reached a particular duration, the button's OnClick event is invoked. If the user moves the reticle away before that time, the timer is reset.
The version of Unity I used was Google Daydream Technical Preview v5.4.2f2-GVR12 dated 10th November 2016, with Google GVR SDK 1.0.3.
I want to know which event in the Blocks section can be used to obtain the entered text value without submitting it through a button.
Let's say the user types text on the phone keyboard and presses Enter. In that case I want some event to trigger so I can get the value the user entered.
There are two events available, LostFocus and GotFocus.
Will these work? Or is there another good approach for getting the text value when Enter is pressed?
Unfortunately there is no event like OnEnterPressed available in MIT App Inventor, and the LostFocus and GotFocus events will not work in this case.
What you can currently do is:
use a button and use the Button.Click event, or
create your own custom keyboard, see also this example
Currently there is a limitation for App Inventor extensions, which can only be used for non-visible components. Later, as soon as visible components are also doable, you could write your own textbox extension and add such an event yourself.
Edit concerning the new question in the comments about different screens:
Use different screens wisely
Before starting to create another screen, you should first think about whether it is really necessary. See also Building apps with many screens and SteveJG's post about the advantages/disadvantages, because within only one screen you can also use vertical arrangements to simulate different screens; just set the arrangements to Visible = true/false as needed...
You can insert a Clock component that monitors TextBox1.Text. When the Clock triggers, it saves the current TextBox1.Text to a variable; when it triggers again, it compares that variable with TextBox1.Text. Once the user has finished typing, the variable and TextBox1.Text will be equal, and at that point you can trigger the action you wanted to run when the user pressed Enter.
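App Inventor is a blocks environment, so there is no text code to show, but the logic behind those blocks is roughly the following sketch (written in C# purely as an illustration; TextSettledWatcher, readText and onSettled are placeholder names, not App Inventor identifiers):

using System;
using System.Timers;

// Illustrates the Clock-polling idea: every tick, compare the current text
// with the value saved on the previous tick; when the two are equal and
// non-empty, the user has finished typing.
class TextSettledWatcher
{
    private readonly Func<string> readText;    // stands in for TextBox1.Text
    private readonly Action<string> onSettled; // your "Enter pressed" action
    private readonly Timer clock;              // plays the role of the Clock component
    private string lastText = "";

    public TextSettledWatcher(Func<string> readText, Action<string> onSettled)
    {
        this.readText = readText;
        this.onSettled = onSettled;

        clock = new Timer(500);                // like a TimerInterval of 500 ms
        clock.Elapsed += OnTick;
        clock.Start();
    }

    private void OnTick(object sender, ElapsedEventArgs e)
    {
        string current = readText();
        if (current != "" && current == lastText)
        {
            clock.Stop();                      // fire only once
            onSettled(current);
        }
        lastText = current;
    }
}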
Hope this helps!
I've created a TCustomControl derived class for a VCL application running on a Windows 8.1 tablet.
I'm using the OnMouseDown / OnMouseUp events even though this is obviously touch based.
What I'd like to do is detect a long press - i.e. touch down and hold for 1 second. So in the OnMouseDown event I record the down timestamp, set a flag to indicate the mouse is down and create an anonymous thread which sleeps for 1 second, and then checks the flag.
In OnMouseUp I set the flag to false.
This works as long as you wiggle your finger on the control. Otherwise, if you just touch and hold, the mouse-down event is not called until you release your finger.
I've looked at gestures, but that looks like complete overkill and, from what I understand, doesn't support long press anyway.
Thanks for any suggestions.
Richard
What you're facing is "normal" yet ridiculously stupid behaviour of Windows with touch input devices. We've been facing the same issue for a while now and have been actively trying to solve it for the last couple of weeks.
The trick is that Windows handles a touch input device as "a mouse that can be fully controlled with a single finger". Therefore, it has several states, more states than events:
The events are: touch begin (down), move, up, right click, and, if you really want to register for a callback, stationary (not moved since the last report but still down).
Meanwhile, there are more states. First, when you press your finger down, Windows internally detects a "touch down" event. It then waits a certain amount of time (which may or may not be available to hack-change, depending on drivers etc.) to determine whether what you wanted was a right click (after X time, upon release before Y time, it will fire the right-click event) or a left button down (triggering the TouchBegin event: Y time is LONGER than X time, and once Y expires, it will fire the left-down event, not a click!).
Meanwhile, if the initial touch down is interrupted, while waiting for X time to expire, by either a move or a release of the finger before X time has run out, it will automatically trigger a left down (on move) or a left click (on release). The same goes for any time after Y time has expired, or once a move is triggered.
To put it more simply and understandably, I'll try to give an example:
Let's say that you have a touch device and the timings are as follows:
Point B is the beginning, positioned at 0 seconds of our timeline.
Point X is the first trigger breakpoint, positioned at 5 seconds of our timeline.
Point Y is the second trigger breakpoint, positioned at 10 seconds of our timeline.
You put your finger on the touch surface, and that starts the timeline.
Possible scenarios:
- You release it any time before point X -> a left click is triggered (touch down and up events, in order, immediately one after another).
- You release it after point X but before point Y -> a right click is triggered (touch down and up events, with the right-click flag).
- You reach point Y -> a left down is triggered!
- You release it any time after point Y, no matter whether a move is triggered before the release or not (but no move was triggered before reaching this point in the timeline) -> a left up is triggered.
- You trigger (do) a move event before reaching point X -> a left down is triggered, followed by as many move events as it detects, until you release it -> then a left up is triggered.
- You trigger (do) a move event after X but before Y -> same as above!
- You reach point Y and move afterwards -> see the fourth scenario above.
Hope this makes some sense to you.
Now to jump to the possible solutions: a custom driver would be one option, but we're not there yet, so we're still trying other options. The most promising for now seems to be the use of RAW_INPUT and touch hooks; it looks like we'll have to combine them to get what we want, though.
The bright side is that Windows itself DOES detect when a finger touches the device, regardless of the timings and events it wants to detect, determine and then forward to apps, but for some reason they made that hard to use directly. As proof, you can simply watch for the transparent dot that appears underneath your finger the very first moment you touch the screen.
Meanwhile, Android handles these things way better...
Hope it helps, and I'll be happy to come back with the complete solution once we get through this and have something good enough for use.
M.
If the MouseDown event is not triggered and you want to make your delay counter dependent on it then you are basically screwed. That said, I assume you are open to slight changes of concept. :)
I think it can still be done if you change your button into something like a flip-switch. The user has to "grab" (touch) it, "pull it up" (drag, should trigger StartDrag event) and hold it there for the desired amount of time. Upon release of the switch (MouseDown event), it snaps back down.
To design such a button, you could use a tiny TPanel for the knob and a slightly larger one to allow limited vertical movement within it. Set the larger panel's BevelInner property to bvLowered and BevelOuter to bvRaised to make it look like a frame, with the knob panel as its child. Or use a TImage to display the knob in its up/down positions. In any case, set the knob element's DragKind property to dkDrag or the StartDrag event won't occur. If still no event is fired, please try to detect touch input capabilities and report your findings.
Not sure this qualifies for an answer but maybe it's worth a swing.
I am developing for Smart Devices in Genexus. I am using the Load event to load several hundred records (returned by a third-party web service) into the grid (some rows might have different layouts).
When the user hits the search button, a ProgressIndicator is immediately shown (during the procedure execution). When the procedure ends (data retrieved), the ProgressIndicator disappears but it may take an additional 4-5 seconds for the Grid to show the fresh data.
This causes the user to think that there was a problem with the search. Then, unexpectedly, the grid refreshes.
Is it possible to somehow show a ProgressIndicator during the Load or Refresh events?
Or do you have any suggestions to prevent this behavior?
The main issue is that the Refresh and Load events are "server" events in the SD architecture, so you don't have access to the device's APIs or resources, such as the Progress Indicator.
We had the same requirement on iOS, and what we did was use the gxrefresh event.
Event 'gxrefresh'
    Composite
        // Your code. Example: ProgressIndicator.Hide()
    EndComposite
EndEvent
gxrefresh is a local event that is executed after the Refresh and Load events. It is a hidden event that helped us accomplish this. (This is not an official event, and it could be removed in any version of GeneXus.)
So the solution is:
Start a Progress Indicator on the ClientStart event of that Panel.
Hide the Progress Indicator on the 'gxrefresh' event of that panel.
Note: remember that in order to use the gxrefresh event you will need to add a hidden button named 'gxrefresh'. You can hide that button since you will not need it in the UI (we set it to Visible=false on the application bar).
If that solution is not possible for any reason (for example, the gxrefresh event is deprecated or you are developing for Android), I can think of a second workaround that is not elegant at all but should work.
Start the Progress Indicator in the ClientStart event of the panel.
Add a hidden variable with control type SD Chronometer.
Set the timer to 6 seconds.
Stop the Progress Indicator on the Tick event of the SD Chronometer, and stop the Chronometer so the Tick event is not executed any more.
These are the two options I can think of.
Maybe there is an easier way, but I haven't heard of it. A Grid.DidLoad event would be great for this scenario. Surely we will get this, or some other solution to this problem, soon.
Links:
SD Chronometer: http://wiki.genexus.com/commwiki/servlet/hwikibypageid?25058
SD Events: http://wiki.genexus.com/commwiki/servlet/hwikibypageid?17042
Server Side Events: http://wiki.genexus.com/commwiki/servlet/hwikibypageid?24234
I have a Ranorex project that automates a use case by clicking 5 buttons in an application.
To set up this project I used the record function. I defined some sleep times between the clicks.
The run with the clicks is in a loop whose loop count is defined dynamically. One of the five buttons, and always the same button, is sometimes "ignored" by Ranorex. According to the log file, the button is visible and enabled, and Ranorex also runs the code with the Click(), but the application doesn't receive the click. Before the click is called in the code, I check with an "if" whether the button is visible and enabled, and I log to the Ranorex log whether Click() is called or not. Sometimes the run goes through and all the clicks work, and sometimes this button click is missing one or more times. It doesn't depend on the sleep time between the clicks, because I tried both long and short pauses. It also doesn't depend on the focus of the application, because I switched the focus several times.
Does anybody know this problem, any workarounds, or what I'm doing wrong?
The obvious workaround is to use Mouse.Click("{Button}") with a 0 duration rather than invoking the click event, or, if you are already invoking the mouse click, to invoke the button's click action instead. (Sorry, since they are both named Click() I don't know which one you are using.)
As to what is happening, from your description it sounds like your script is finding more than one element with the same xpath. This would cause issues with the click event because it may try to click the element, but it clicks the first xpath match rather than the second. Are there any optional elements in this sequence of button clicks that appear intermittently? For instance, a busy spinner with an inner text of "OK" that is hidden from view while you are clicking a button on a form with an inner text of "OK" would cause two elements to be found if the xpath was looking for an inner text of "OK". The element causing this may not have been caught by the recorder because it might not have been present at the time.
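One way to check for this is to count how many elements your path actually matches before clicking. A rough user-code sketch (ClickIfUnique and the rxPath parameter are placeholder names; Host.Local.Find, Report and the Button adapter are standard Ranorex API, though the exact overloads may differ in your version):

using System.Collections.Generic;
using Ranorex;

// Counts the matches for a given RanoreXPath before clicking, so a duplicate
// match (e.g. a hidden busy spinner) shows up in the report.
public class ClickDiagnostics
{
    public static void ClickIfUnique(string rxPath)
    {
        // 2000 ms search timeout; Find<T> returns every element matching the path.
        IList<Ranorex.Button> matches = Host.Local.Find<Ranorex.Button>(rxPath, 2000);

        Report.Info("Path '" + rxPath + "' matched " + matches.Count + " element(s).");

        if (matches.Count != 1)
        {
            // Zero or multiple matches: the click may be going to a different
            // element than the one you expect.
            Report.Failure("Path is not unique: " + rxPath);
            return;
        }

        matches[0].Click();
    }
}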
Another possibility, if you are using Mouse.Click() (particularly if this is a website), is that your button is not actually on the screen. I know that should be taken care of by the visible check, but an element that is visible in the DOM yet not visible on the screen can cause issues.
I highly recommend not using the recorder to create test suites that need to be re-runnable. It would actually be better to use the Spy tool to create a repository (or several repositories) for the product you are testing; this way you can be sure the xpaths are all unique, which the recorder does not guarantee. You can actually record with the created repository, and Ranorex will attempt to find an item in the repository before creating a new item, so the recorder will use this repository when creating recordings.
In Android, there is a function called Toast, and it shows instantly without any delay involved. I tried to use Status in RIM, but it must run in invokeLater and the time cannot be set to less than 1 second, so it cannot display instantly.
Is there any other built-in that works the same way as Toast or Status?
No there is not afaik. Toast was "invented" in a way by Android.
Previous OS's have used popup boxes with confirmation buttons. A Toast is almost like a popup box with a timer attached to it.
Of course, #Signare correctly gives the common replacement for what you would "normally" do on BlackBerry: Dialog.alert(String).
If you want something more "Androidy", this is something we want to implement at Cobi, but have not gotten around to yet due to time constraints working on client work.
There are 2 unique aspects to a Toast compared to the "old" way of doing things:
the popup only shows for a short time
the popup does not block the user from interacting with the background screen at all
To create the popup screen, look at the PopupScreen class - and you pass in a layout manager of your own that will be displayed.
You could start a timer when the screen is shown (we have not implemented this yet) and that could close the screen for you.
As for not blocking the user: this is the major difference, and I do not know whether it can be done if you use the PopupScreen class. Perhaps if your PopupScreen passes all keypresses through to the underlying screen, it may be possible.
In some of our apps, we have a custom field, defined in our base MainScreen subclass, that can be positioned over the rest of the fields on the screen. This allows the user to continue interacting with the screen while the field is displayed. I cannot share that code at the moment here.