Force use of mouse in xterm.js (after reattaching)

My xterm setup uses a websocket to attach to a remote curses-like terminal application.
Upon start, that application will send the required control sequences to enable the mouse.
But when I reconnect to the application (after reloading the page, for example), that state is lost.
xterm.js then adds scrollbars again, and the mouse no longer works.
In older versions of xterm.js (v3) I used this to force the mouse mode:
term.normalMouse = true;
term.mouseEvents = true;
But this no longer works, as the mouse system has been totally rewritten.
So how can I fix this?
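One workaround that survives the rewrite: xterm.js still interprets escape sequences written to it with term.write(), so after reattaching you can replay the private-mode (DECSET) sequences the application originally sent. A sketch, assuming a curses-style application that uses the alternate screen buffer, button-event mouse tracking, and SGR encoding (check which modes your application actually enables; these are just common defaults):
// Replay the terminal modes the remote application enabled before the reload.
term.write('\x1b[?1049h'); // switch to the alternate screen buffer (removes the scrollbar)
term.write('\x1b[?1002h'); // enable button-event mouse tracking
term.write('\x1b[?1006h'); // use SGR mouse reporting encoding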

Related

RemoteApp and Transparency

I use a custom application as a Windows RemoteApp. The application's window has no frame, and it is transparent in two ways: it has a transparent background and it is transparent to mouse input (click-through). The window sits above all other windows and displays some changing information.
The problem is that when the application runs as a RemoteApp, the window is created without the WS_EX_TRANSPARENT flag. As a result, the window is not transparent to mouse input. There is no such problem when the application runs locally.
Below are screenshots of the application's window flags:
Window flags of application running locally
Window flags of application running as a RemoteApp
I tried configuring my .rdp file with different settings, but I didn't find anything useful.
My .rdp file currently looks like this:
redirectclipboard:i:1
redirectprinters:i:1
redirectcomports:i:0
redirectsmartcards:i:1
devicestoredirect:s:*
drivestoredirect:s:*
redirectdrives:i:1
session bpp:i:32
prompt for credentials on client:i:1
span monitors:i:1
use multimon:i:1
remoteapplicationmode:i:1
server port:i:3389
allow font smoothing:i:1
promptcredentialonce:i:0
videoplaybackmode:i:1
audiocapturemode:i:1
gatewayusagemethod:i:0
gatewayprofileusagemethod:i:1
gatewaycredentialssource:i:0
full address:s:XXX.XXX.XXX.XXX
alternate shell:s:||MyApp
remoteapplicationprogram:s:||MyApp
remoteapplicationname:s:MyApp
remoteapplicationcmdline:s:
workspace id:s:XXX.XXX.XXX.XXX
use redirection server name:i:1
loadbalanceinfo:s:tsv://MS Terminal Services Plugin.1.QuickSessionCollection
Any idea what can be done to make the window transparent to mouse input when the application runs as a RemoteApp?

Can Chromium features that are normally turned on via switches be turned on programmatically after starting up?

Is there a way to use Chromium switches after starting it up, or must they be enabled at startup? For example, can switches be passed to the renderer processes, or only to the main process?
I want to turn on paint flashing in Chromium (actually in my Electron app which runs on top of Chromium).
Devtools has a checkbox that turns this on, but I want to be able to turn this on without devtools, and after the app has started.
I know there's a show-paint-rects flag I can use:
chrome.exe --show-paint-rects
In my Electron app, I would need to use app.commandLine.appendSwitch:
app.commandLine.appendSwitch("show-paint-rects");
I also found a ui-show-paint-rects flag that lives in something called the "compositor," but that may be irrelevant for me.
Both of these work, but since Chromium uses a multi-process architecture, I'm hoping there's a way to specify that switch (or otherwise turn on a feature) in a single process without having to specify it at startup.
Is there a way to do this in Chromium? (Would be ideal to do this in Electron, but I'm not counting on it)
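One runtime alternative for paint flashing specifically: the DevTools Protocol exposes it as Overlay.setShowPaintRects, and Electron can drive that through webContents.debugger without opening DevTools and without any startup switch. A minimal sketch, assuming a recent Electron where debugger.sendCommand returns a promise (the Overlay domain is marked experimental, so treat this as unstable):
const { app, BrowserWindow } = require('electron');
app.whenReady().then(async () => {
  const win = new BrowserWindow();
  await win.loadURL('https://example.com');
  win.webContents.debugger.attach('1.3'); // attach to this window's DevTools Protocol endpoint
  await win.webContents.debugger.sendCommand('DOM.enable');     // Overlay depends on the DOM domain
  await win.webContents.debugger.sendCommand('Overlay.enable');
  // Toggle paint flashing at runtime, per window.
  await win.webContents.debugger.sendCommand('Overlay.setShowPaintRects', { result: true });
});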

React Native events do not work

I have a trivial react component that only shows a button:
<Button onPress={() => console.log("test")} title="Button"/>
When I put this button into a project I created with react-native init, it works as expected.
However, I have an existing project into which I integrated React Native (0.51.0) manually (because it doesn't use CocoaPods; I followed this guide: https://medium.com/@joshyhargreaves/adding-react-native-to-existing-ios-project-without-cocoapods-6f1ee9106009).
The project seems to work fine: the UI loads, and the button gives visual feedback when I tap it. But the button's onPress event is not fired, so it does not log anything.
There are no errors or warnings (except Class RCTCxxModule was not exported, but it seems to be safe to ignore this).
I'm now out of ideas for what to try or how to debug this issue, short of diving into React's touch-handling code. Here's what I tried:
Made sure to only have one RCTRootView, and that it is created in the main thread.
Checked for any suspicious things happening in the remote debugger; everything looks normal (no exceptions thrown or warnings logged).
Tested a few other components that should fire events; for example, TouchableOpacity does not work either.
Logged something after a timeout, which works, so nothing seems to be deallocated prematurely.
Checked for errors reported by the Metro bundler: it doesn't print anything.
Any ideas on what I need to do to get my button to print "test" when I tap it?
I think you don't have the Debug JS Remotely option enabled. If not, open the React Native debug menu by pressing (Command/Ctrl) + D, or shake your device if you are debugging on a real device. Then press Debug JS Remotely and the debugger should appear in Google Chrome. Inspect and open the console. There it is!
This might be caused by a date difference between the host (your computer) and the client (the mobile device).
You can check this by running adb shell "date" && date
to see if there is a difference.
If there is one, go to your mobile device and toggle automatic date & time off and back on.
Then test the time difference again as described above; if there is no longer a difference, tap events should work in debug mode.
More details in the original GitHub issue answer by Alex Ciarlillo.

Keyboard events in atom-shell

I am trying to use atom-shell, but I don't understand how its DOM events differ from a normal browser's. I can't find any documentation about it either.
If I add:
document.onkeydown = function() { alert('...'); }
to index.html, it seems that atom-shell doesn't capture the event; instead the keystroke goes to the terminal in which I ran atom-shell myapp.
Is there any documentation somewhere about how the DOM events work in atom-shell?
Edit: This only happens when I run atom-shell from the terminal (on OS X). If I run the app by first starting the Atom application and dragging my app into the drag area, then it works fine...
It's a bug: here is the GitHub issue

Is it possible to let capybara/webdriver start firefox minimized/unfocused?

Is it possible to stop Firefox from popping up when Cucumber hits a @javascript tag? After a while it gets frustrating to constantly be disrupted while running tests.
It would be nice if WebDriver started Firefox minimized, or at least without focusing it.
There's no API in WebDriver to do this, and it's not really advisable, since Firefox won't trigger certain events (like focus and blur) when the window is in the background. I usually work around this by using a virtual machine (e.g. in VirtualBox) or a separate X display.
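If you're on Linux, the separate-display route is easy to script with Xvfb; for example (assuming xvfb-run is installed), run the suite under a virtual display that never takes focus:
xvfb-run -a bundle exec cucumber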
You can add the lines below to the Info.plist of Firefox.app. After that, Firefox will start without grabbing focus.
<key>LSBackgroundOnly</key>
<string>1</string>
