How to set gnome-terminal's character set encoding according to the system one calls?

The employees of our company use gnome-terminal run from Debian workstations to access a variety of systems running different O/Ss on our local network. Everything works very well except that the host systems and their applications use different character sets, either ISO-8859-1 ("Latin 1") or UTF-8, and the server applications notably do NOT adapt to the locale of the user. This requires the user to manually set gnome-terminal's character set encoding each time one starts a new session!
(In case that's not clear, we always want to log into system X using ISO-8859-1, and always log into system Y using UTF-8. This has to do with the relative antiquity of the O/S of each system, the older ones having little or no accommodation of UTF-8 while the newer ones deal rather grumpily with ISO-8859-1.)
It seems to me that gnome-terminal's character set encoding should be associated with the system one's logging into instead of the system one's calling from. And that therefore, the character set should be one of the parameters that can be pre-set in the profile. This is the way other terminal emulators behave, notably the Windows and Mac emulators that we use outside the office.
But in lieu of configuring it in the profile (which is not possible), does anyone know a way of setting the character set encoding as part of a command line invocation of gnome-terminal?
I've been trying to solve this annoyance off-and-on for years... any solution would receive our eternal gratitude. :)

In the good old times, gnome-terminal supported --disable-factory; you could set it up for local editing of files:
#!/bin/sh
export GDM_LANG="de_DE@euro"
export LANG="de_DE@euro"
export RC_LANG="de_DE@euro"
export LC_ALL="de_DE@euro"
gnome-terminal --disable-factory
or for remote access to a Linux box:
#!/bin/sh
export GDM_LANG="de_DE@euro"
export LANG="de_DE@euro"
export RC_LANG="de_DE@euro"
export LC_ALL="de_DE@euro"
gnome-terminal --disable-factory --tab --title="Server1 DE" --command "ssh user@Server1"
Now, with GNOME 3.10, I get
... Option "--disable-factory" is no longer supported ...
So, I am with you and will keep looking ...
Mario

This worked for me.
LANG=en_US.iso885915 /usr/bin/gnome-terminal
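Building on that, a per-host wrapper script gives you something close to a per-profile encoding. This is only a sketch with placeholder host, user and title names, and it reuses the --command syntax from the scripts above (newer gnome-terminal releases may expect -- instead):
#!/bin/sh
# Hypothetical launcher for a Latin-1 host: the LANG override makes
# gnome-terminal start in ISO-8859-1 and drop straight into the ssh session.
export LANG=en_US.iso885915
exec gnome-terminal --title="legacy-host (Latin-1)" --command "ssh user@legacy-host"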

Related

Parametrize Connections from Database parametrization table

We are using Pentaho Data Integration V7, working with multiple data origins and an Oracle DWH as the destination.
We have stored all the connection access data in a parametrization table, let's call it D_PARAM. All the connections are configured using parameters (${database_name} ... etc.).
We have, at the beginning of every job, a transformation with a "set variables" step which reads the right parameters from D_PARAM.
This all works fine; my problem is:
Every time we want to edit a single transformation, or during the development of a new one, we can't use the parametrized connections because the parameters haven't been set. We then need to use "hardcoded" connections during the development process.
Is there a better way to manage this situation? The idea of having the connections parametrized is to avoid errors and simplify connection management, but if in the end we need both kinds of connections, I don't see them as being so useful.
There's not a simple answer. You could rotate your kettle.properties file to change the default values, keeping all the values in the file:
D_PARAM = DBN
D_PARAM_DB1 = DB1
D_PARAM_DB2 = DB2
...
And just update D_PARAM with the one you need from the different D_PARAM_DBN entries before starting PDI. It's a hassle to be constantly updating the kettle.properties file, but it works out of the box.
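If you go that route, the swap itself can be scripted instead of edited by hand. A minimal sketch, assuming kettle.properties sits in the default ~/.kettle directory and uses the layout shown above (the script name and argument are hypothetical):
#!/bin/sh
# Usage: ./use-db.sh DB1    (or DB2, ...)
# Rewrites the D_PARAM default in kettle.properties to the chosen value;
# start Spoon/PDI afterwards as usual.
TARGET="$1"
PROPS="$HOME/.kettle/kettle.properties"
sed -i "s/^D_PARAM = .*/D_PARAM = ${TARGET}/" "$PROPS"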
You could also try working with environments. For this you would have to install a plugin available on GitHub: https://github.com/mattcasters/kettle-environment. It was created by a former PDI developer, and I don't know if it works with the v7 version; it was updated to work with 8.2, but it would probably work with v7. To test it, you can install your PDI version in another directory on your PC and install the plugin there (along with the other additional plugins you have in your current installation), so you don't break your setup. This blog entry gives you details on how to use the environments: http://diethardsteiner.github.io/pdi/2018/12/16/Kettle-Environment.html
I don't know if the environments plugin would solve your problem, because you can't change the environment in the middle of a job, but for me, using the maitre script to pick the environment when I schedule a job or transformation has made it easier to work with different projects/paths in my setup.
In Spoon you can click on the "Edit" menu and "Set environment variables". It'll list all the variables currently in use, and you can set their values. The transformation will then use those values when you run it.
This also works in Preview, but it's somewhat buggy; it doesn't always pick up updated values.

certutil.exe is returning localized output

We have a PowerShell automation script that uses certutil.exe to list CAs, issued certificates, etc. on a given Windows Server.
We wrapped some functions around an invocation of certutil.exe and we grep its output for given patterns. However, on a Windows server installed in French, German (and surely other languages), our script does not work at all, because certutil returns localized output; it's impossible to predict that, and impossible to support all the languages of the world. Is there any way to force certutil.exe to print its output in English instead of the current machine language?
I know that in a Linux environment we can do this:
LANG=en_EN.UTF-8 ls /tmp/toto
It will force ls to answer in English
Thanks for your help

Storage Spaces Direct

Some background:
I'm trying to set up Azure Pack in a test environment, and am currently working on setting up the servers that are going to host it all.
To do this I have two virtual Windows Server 2016 TP4 servers hosted on an ESXi host, so I need to set up Storage Spaces Direct.
(iSCSI targets and Storage Spaces (WS 2012) have been ruled out, since the first is a nightmare to set up and the internet told me the second one comes with low R/W speed.)
I've been following this guide: https://technet.microsoft.com/en-us/library/mt126109.aspx
Problem:
When I run this cmdlet: Enable-ClusterStorageSpacesDirect, I get this warning: No elegible DAS disk found.
Both servers have 3 disks each. They are initialized and 100% unallocated, and I have tried with them being both offline and online.
If I try running this cmdlet: (Get-Cluster).DasModeEnabled=1
I get the following error: The property 'DasModeEnabled' cannot be found on this object. Verify that the property exists and can be set.
Any and all help is greatly appreciated!
Storage Spaces Direct doesn't support FC & RAID-controlled LUNs.
The key is to force S2D to accept RAID BusType:
(Get-Cluster).S2DBusTypes=256
Here's a good article about it: https://www.starwindsoftware.com/blog/resolving-enable-clusters2d-bus-type-support-issue-on-some-storage-controllers.
Another option is to reflash the controller's firmware to IT mode.
There are also other solutions, like that StarWind one, which I suggest you test.

securely run linux command line app from asp.net mvc app under mono

We have an internal and external facing asp.net mvc app running under mono on ubuntu 10.04 LTS. There is also a complicated (native, not mono) command line app that users use on the same server. They log on via ssh to do this. We have the security for the ssh users pretty locked down, so they can't do very much other than run the command line app.
The users of these apps have to:
1. Log in via ssh to the server, then run the command line app with whatever command line switches are required, which does some long running processing and puts a report in the db of the web app.
2. Log in to the web app, then set some options for publishing a report via the web app.
The users of the apps want to skip step 1 and do it all in the web app. I am thinking of creating a service that regularly polls the db for command line app jobs to run. The jobs would be created by the users as desired in the web app.
The problem is, the users want a box in the web app where they can just fill in any command line options. But I don't want them to do something like this:
-a dothis -b dothis & rm importantfile.txt
...in case the user's credentials to the web app are somehow compromised. I want to make sure that only that command line app can be used and nothing else. I am thinking of preventing the characters ! | < > & / \ $ ( ) from being allowed, which it looks like are not required by the command line app.
Is that good enough? Are there any other shell tricks I should know about? Should I take a different approach?
I really don't want to have to write some sort of parser for the arguments that the users supply, because there are a ton of them that the users like to use.
Instead of running the command line as a shell command (launching the shell to launch the program), can you launch the program itself as a new process? I believe that's what the answer here is doing: Execute a command line utility in ASP.NET. If the actual program is launched as a process, rather than through a shell, then things like & or rm will just be arguments to the command line utility, which should be fine if the command line utility checks for bad inputs.
If that's not feasible (although it's probably the better option), replacing all single quotes with single quote escape sequences, then placing single quotes around each of the arguments (split the string with a space as the delimiter), could provide a similar effect. Instead of making sure you avoid all possible bad characters (; can be used similarly to & in many shells), you only need to make sure that the provided arguments can't escape out of the single quotes. (You might also want to check for single-quote-surrounded arguments beforehand to avoid double quoting them, and not count escaped spaces when splitting up arguments, etc., so that the users can provide arguments that need spaces.)
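To illustrate the quoting idea in shell terms (a minimal sketch with hypothetical values; in practice the escaping would be done in the web app before the string ever reaches a shell):
#!/bin/sh
# A user-supplied argument containing shell metacharacters.
user_arg="-a dothis & rm importantfile.txt"
# Replace every single quote with the '\'' escape sequence ...
escaped=$(printf '%s' "$user_arg" | sed "s/'/'\\\\''/g")
# ... then wrap the whole argument in single quotes so that & ; | $ etc.
# reach the program as literal text. /bin/echo stands in for the real app.
eval "/bin/echo '$escaped'"
# Prints: -a dothis & rm importantfile.txt  -- the shell runs nothing extra.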

Different paths on different computers

I use 3 computers regularly and a fourth one occasionally. I have used Dropbox to synchronize my .ahk script to all computers. However, the path names are different on the different computers. For instance, at home it is C:\Users\Farrel\Documents\SyRRuP, whereas at work it is something such as C:\Users\fbuchins\Documents\SyRRuP, and on a Windows XP computer it is something else. Consequently, a particular sequence of code that runs a particular file only works on one computer and bombs out on the others. What is the most elegant way to overcome the problem?
I'm not sure about Dropbox, but I have used Windows environment variables to set things like this before. Something like PROGPATH="C:\thispath\", and then read the PROGPATH variable from the app or script.
