How to get the mouse button state in Silverlight outside of button press events? - silverlight-3.0

I have the following situation: I handle the left mouse button press in my Silverlight app and do some work while the button is held down and the mouse moves. When the left button is released, I clear the flag that enables that work, and the mouse move handler stops doing it.
The problem is: if the user is in the control area, pushes the left button down and moves out of the control area, then releases the button and reenters, the MouseLeftButtonUp event never fires and the processing continues until the user clicks the mouse.
My temporary fix was to turn the mouse flag off in MouseLeave, but that's not really what I'm going for. I'd like to check the left mouse button's state in the MouseEnter event, but I don't know of a way to do that.
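For reference, the pattern I'm using looks roughly like this (a sketch; handler names are illustrative):

```csharp
// Sketch of the current flag-based approach (handler names are illustrative).
private bool isDragging;

private void Canvas_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    isDragging = true;
}

private void Canvas_MouseMove(object sender, MouseEventArgs e)
{
    if (isDragging)
    {
        // ... do the work while the button is held down ...
    }
}

private void Canvas_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    // Never fires if the button was released outside the control area.
    isDragging = false;
}

private void Canvas_MouseLeave(object sender, MouseEventArgs e)
{
    isDragging = false; // temporary fix
}
```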
Does anyone know of a way I can access the mouse button state outside of the press events in Silverlight 3? Thanks,
Update
After thorough research, it doesn't look like this is possible in Silverlight 2 (and probably not in 3 either). I found this link. If anyone knows of a workaround, please let me know.

What you need to do can be accomplished with the UIElement.CaptureMouse method:
http://msdn.microsoft.com/en-us/library/system.windows.uielement.capturemouse%28VS.95%29.aspx
When a UIElement has captured the mouse, it will continue to receive mouse events even if the mouse leaves the Silverlight control.
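A minimal sketch of the change, assuming the flag-and-handlers setup described in the question (names are illustrative):

```csharp
private void Canvas_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    isDragging = true;
    // Capture the mouse so the Up/Move events keep arriving
    // even if the pointer leaves the control.
    ((UIElement)sender).CaptureMouse();
}

private void Canvas_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    isDragging = false;
    ((UIElement)sender).ReleaseMouseCapture();
}
```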

Related

Button retains focus when mousedown and mouseout, accessibility implications

It seems to be consistent browser behaviour that when the user drags out of an a element, to avoid clicking on release of the mouse button, that element retains focus until the user clicks somewhere else.
Is there an accessibility related reason for this?
It seems to be a common user behaviour (interaction) to negate a click 'halfway through' and, in macOS at least, it is the default menu bar mouse behaviour.
I guess I'm looking for a way to force a drop of focus on mouseout but I don't know whether this is bad for accessibility.
Incidentally, this is visually ugly whether a button element is used as a UI object or an a element is styled as a button.
Whether the focus remains on the button/link or you force it away will not cause an accessibility issue with regard to WCAG conformance. It does not fail any guidelines.
It comes close to failing WCAG 3.2.1 On Focus, but I think the timing of your events technically makes it pass. The mousedown event causes the focus event, and you are not doing anything with the focus event. It's when the mouseup event happens that you're trying to move focus, so in theory you are not changing the context on the focus event.
I'm not sure that means you should do what you have proposed from a UX perspective, but there probably aren't a lot of people who rely on the default behavior.

Long Touch with Delphi VCL on Windows Tablet

I've created a TCustomControl derived class for a VCL application running on a Windows 8.1 tablet.
I'm using the OnMouseDown / OnMouseUp events even though this is obviously touch based.
What I'd like to do is detect a long press - i.e. touch down and hold for 1 second. So in the OnMouseDown event I record the down timestamp, set a flag to indicate the mouse is down and create an anonymous thread which sleeps for 1 second, and then checks the flag.
In OnMouseUp I set the flag to false.
This works as long as you wiggle your finger on the control. Otherwise, if you just touch and hold, the mouse down event is not called until you release your finger.
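For reference, the pattern I described looks essentially like this (sketched in C# purely to illustrate the flag-and-delay idea; my actual code does the same thing with an anonymous thread in Delphi):

```csharp
private volatile bool pointerDown;

private void Control_MouseDown(object sender, MouseEventArgs e)
{
    pointerDown = true;
    // Wait one second in the background, then check whether the press is still held.
    Task.Run(async () =>
    {
        await Task.Delay(TimeSpan.FromSeconds(1));
        if (pointerDown)
        {
            // ... treat this as a long press ...
        }
    });
}

private void Control_MouseUp(object sender, MouseEventArgs e)
{
    pointerDown = false;
}
```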
I've looked at gestures, but that looks like complete overkill and, from what I understand, doesn't support long press anyway.
Thanks for any suggestions.
Richard
What you're facing is "normal" yet ridiculously stupid behaviour of Windows with touch input devices. We've been facing the same issue for a while now, and have been actively trying to solve it for the last couple of weeks.
The trick is that Windows handles a touch input device as a "mouse that can be fully controlled with a single finger". Therefore, it has several states, more states than events:
The events are: touch begin (down), move, up, right click and, if you really want to register for a callback, stationary (not moved since the last report but still down).
Meanwhile, there are more states. First, when you press your finger down, Windows internally detects a "touch down" event. It then waits a certain amount of time (which may or may not be available to hack-change, depending on drivers etc.) to determine whether what you wanted was a right click (after X time, upon release before Y time, it will fire a right click event) or a left button down (after Y time, which is LONGER than X time; once Y expires it will fire a left down event, not a click).
Meanwhile, if the first touch down is interrupted, either by a move or by release of the finger BEFORE X time has run out, it will automatically trigger a left down (on move) or a left click (on release). The same goes for any time after Y time expires as well, or once a move is triggered.
To put it more simply and understandably, I'll try to give an example:
Let's say that you have a touch device, and the timings are as follows:
Point B is the beginning, positioned at 0 seconds of our timeline.
Point X is the first trigger breakpoint, positioned at 5 seconds of our timeline.
Point Y is the second trigger breakpoint, positioned at 10 seconds of our timeline.
You put your finger on touch, and that starts the timeline.
Possible scenarios:
1. You release it any time before point X -> a left click is triggered (touch down and up events, in order, immediately one after another).
2. You release it after point X but before point Y -> a right click is triggered (touch down and up events, with the right click flag).
3. You reach point Y -> a left down is triggered!
4. You release it any time after point Y, whether or not a move triggers before the release (but a move was NOT triggered before reaching this point in the timeline) -> a left up is triggered.
5. You trigger (do) a move event before reaching point X -> a left down is triggered, followed by as many move events as it detects, until you release -> then a left up is triggered.
6. You trigger (do) a move event after X but before Y -> same as above!
7. You reach point Y and move afterwards -> see scenario 4 above.
Hope this makes some sense to you.
Now to jump to possible solutions: a custom driver would be one option, but we're not there yet, so we're still trying other options. The most promising for now seems to be the use of RAW_INPUT and touch hooks; it looks like we'll have to combine them to get what we want, though.
The bright side is that Windows itself DOES detect the moment a finger touches the device, regardless of the timings and events that it wants to detect, determine and then forward to apps, but for some reason this is made hard to use directly. As proof, you can simply watch for the transparent dot that appears underneath your finger the very first moment you touch the screen.
Meanwhile, Android handles these things way better...
Hope this helps, and I'm happy to come back with the complete solution once we get something good enough for use.
M.
If the MouseDown event is not triggered and you want to make your delay counter dependent on it then you are basically screwed. That said, I assume you are open to slight changes of concept. :)
I think it can still be done if you change your button into something like a flip switch. The user has to "grab" (touch) it, "pull it up" (drag, which should trigger the StartDrag event) and hold it there for the desired amount of time. Upon release of the switch (MouseUp event), it snaps back down.
To design such a button, you could use a tiny TPanel for the knob and a slightly larger one allowing limited vertical movement within it. Set the BevelInner property to bvLowered and BevelOuter to bvRaised to make it look like a frame, with the knob panel as its child. Or use a TImage to display the knob in its up/down positions. In any case, set the knob element's DragKind property to dkDrag or the StartDrag event won't occur. If still no event is fired, please try to detect touch input capabilities and report your findings.
Not sure this qualifies for an answer but maybe it's worth a swing.

How to trigger GoBack() in WinRT XAML 8.1 WebView control with mouse back button?

I have a page in a WinRT XAML 8.1 app that is basically a full-page display made up of a WebView control, in order to display a web page.
For mouse users, it would be handy to be able to use the hardware mouse button that is included on many mice to go back a page in the WebView (not to the previous app page -- to the previous web page).
WinRT 8.1 is nicely programmed out of the box to respond to the hardware mouse back button and go back a page in an app. But when the mouse is hovering over the WebView control, the app does not respond to the hardware mouse back button at all. Move the mouse off the WebView, and pressing the hardware mouse back button moves to the previous app page.
So the question is: how do I detect the mouse back button when the mouse is hovering over the WebView control, so that I can issue a GoBack() command to the WebView control?
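For reference, away from the WebView I'd expect something along these lines to pick it up (a sketch; MyWebView and the handler name are placeholders):

```csharp
private void Page_PointerPressed(object sender, PointerRoutedEventArgs e)
{
    var props = e.GetCurrentPoint(this).Properties;
    if (props.IsXButton1Pressed) // XButton1 is the "back" button on most mice
    {
        if (MyWebView.CanGoBack)
        {
            MyWebView.GoBack();
        }
        e.Handled = true;
    }
}
```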
Manipulation events and certain mouse events (like these, it appears) are not forwarded through the WebView. There is no satisfactory workaround.

wpf - transparent MainWindow and issues with DragMove "Can only call DragMove when primary mouse button is down."

I have a wpf project which uses transparent windows and I share this transparent window styling for my dialog windows and my mainwindow.
I am getting an error from the DragMove() call in my MainWindow AFTER I close a dialog window that uses the same window style. To make this even stranger, the exception only occurs when I handle a MouseLeftButtonDown event on a label in my status bar on the MainWindow. IF I swap out the label for a button and replace MouseLeftButtonDown with a Click event, I do not get the error.
The strange thing is that the dialog window that pops up does not implement DragMove, and I'm not dragging my MainWindow around either. Somehow DragMove gets called after code execution returns to the MainWindow from the ShowDialog() call.
An easy fix for me currently is to swap my label for a button and wire up the click event instead.
However, I'm more interested in hearing about what causes this issue and why a click event works but the mouse one fails miserably.
My "StatusBar" is simply a stackpanel with labels and other stackpanels (which contain more labels).
Has anyone else fought this issue before? Would I need to implement some sort of mouseclick event handler override so that I can capture and cancel this exception from happening?
Repro code can be provided if needed. I got enough hits on dragmove here so I am hoping this is an easy one for somebody out there.
Thanks in advance for any help!
My brain isn't working properly today. I forgot about the routing of events in this scenario. I simply needed to set the Handled property on the routed event that fired when the mouse button went down. Somehow I missed that in the debugger before posting the thread.
The 'correct' way to make a borderless window movable --> https://stackoverflow.com/a/3275712/146032
Be sure to only call DragMove when it is triggered by the MouseLeftButtonDown event, and don't forget to mark the event as handled using e.Handled = true;
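In code, that amounts to something like the following (handler names are illustrative):

```csharp
private void MainWindow_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    // Only attempt the drag while the primary button is actually down.
    if (e.ButtonState == MouseButtonState.Pressed)
    {
        DragMove();
    }
}

private void StatusLabel_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    // ... label-specific work ...
    e.Handled = true; // stop the event bubbling up to the window's DragMove handler
}
```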
If you receive this exception after a message box has been shown and closed, place the DragMove() call inside a try block with an empty catch.

What is the best method for implementing mouse wheel activity in Delphi VCL forms?

As a long time user of Delphi 7, I've rolled my own mouse wheel handling in a few controls but lately I've noticed that some recent applications only need the mouse cursor to be placed over a control (e.g. a list box or tree view) for the mouse wheel activity to cause that control to scroll.
This feels nice (as opposed to having to click focus a control before it responds to the wheel).
Now that I've moved to Delphi 2010, I'm wondering: what is the 'correct' behavior?
And what can I use in Delphi so that I don't have to bodge this with my own solutions?
Thanks.
I don't know if there's an official 'correct' behavior, but I personally find it most intuitively correct when the mouse wheel goes to the window that the mouse pointer is currently hovering over without having to explicitly give it focus. That's not the default behavior, however, and it seems that about half the applications I normally use do it one way and the other half the other way.
To get mouse wheel messages without having focus, you need to implement a mouse hook.
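As an illustration of the redirection idea, here is a sketch in C#/WinForms terms (using an application-level message filter rather than a low-level hook; the Delphi equivalent would do the same forwarding from Application.OnMessage or TApplicationEvents.OnMessage):

```csharp
using System;
using System.Drawing;
using System.Runtime.InteropServices;
using System.Windows.Forms;

// Forwards WM_MOUSEWHEEL to whichever window is under the cursor,
// instead of the window that currently has keyboard focus.
class WheelRedirector : IMessageFilter
{
    private const int WM_MOUSEWHEEL = 0x020A;

    [DllImport("user32.dll")]
    private static extern IntPtr WindowFromPoint(Point pt);

    [DllImport("user32.dll")]
    private static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    public bool PreFilterMessage(ref Message m)
    {
        if (m.Msg == WM_MOUSEWHEEL)
        {
            IntPtr underCursor = WindowFromPoint(Cursor.Position);
            if (underCursor != IntPtr.Zero && underCursor != m.HWnd)
            {
                SendMessage(underCursor, m.Msg, m.WParam, m.LParam);
                return true; // swallow the original; it has been forwarded
            }
        }
        return false;
    }
}

// Usage, e.g. at application start-up:
//     Application.AddMessageFilter(new WheelRedirector());
```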
