Different paths on different computers - path

I use three computers regularly and a fourth one occasionally. I have used Dropbox to synchronize my .ahk script to all of them. However, the path names differ between the computers. For instance, at home it is C:\Users\Farrel\Documents\SyRRuP, whereas at work it is something such as C:\Users\fbuchins\Documents\SyRRuP, and on a Windows XP computer it is something else again. Consequently, a particular sequence of code that runs a particular file only works on one computer and bombs out on the others. What is the most elegant way to overcome the problem?

I'm not sure about Dropbox, but I have used Windows environment variables for things like this before: set something like PROGPATH="C:\thispath\" on each machine, then read the PROGPATH variable from the app or script.
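For example, a minimal AutoHotkey sketch of that idea (the variable name PROGPATH follows the suggestion above; the file being opened is only a hypothetical placeholder):

; Set once per machine, e.g. from a command prompt:
;   setx PROGPATH "C:\Users\Farrel\Documents\SyRRuP"
EnvGet, ProgPath, PROGPATH           ; read the environment variable
Run, %ProgPath%\project.xlsx         ; hypothetical file inside the synchronized folder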


Parametrize Connections from Database parametrization table

We are using Pentaho Data Integration V7, working with multiple data sources and an Oracle DWH destination.
We have stored all the connection access data in a parametrization table, let's call it D_PARAM. All the connections are configured using parameters (${database_name}, etc.).
At the beginning of every job we have a transformation with a "Set variables" step which reads the right parameters from D_PARAM.
This all works fine; my problem is:
Every time we want to edit a single transformation, or while developing a new one, we can't use the parametrized connections because the parameters haven't been set. We then have to use "hardcoded" connections during development.
Is there a better way to manage this situation? The idea of parametrizing the connections is to avoid errors and simplify connection management, but if in the end we need both kinds of connections, I don't see much benefit in them.
There's no simple answer. You could rotate your kettle.properties file to change the default values, keeping all the values in the file:
D_PARAM = DBN
D_PARAM_DB1 = DB1
D_PARAM_DB2 = DB2
...
Then just update D_PARAM with the value you need from the different D_PARAM_DBN entries before starting PDI. It's a hassle to be constantly updating the kettle.properties file, but it works out of the box.
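For instance, to develop against the second database you would flip the first line before launching PDI (placeholder values as above):
D_PARAM = DB2
D_PARAM_DB1 = DB1
D_PARAM_DB2 = DB2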
You could also try working with environments. For this you would have to install a plugin available on GitHub: https://github.com/mattcasters/kettle-environment. It was created by a former PDI developer; it was updated to work with 8.2 and I don't know for certain that it works with v7, but it probably would. To test it, you can install your PDI version in another directory on your PC and install the plugin there (along with the other plugins from your current installation) so you don't break your setup. This blog entry gives you details on how to use environments: http://diethardsteiner.github.io/pdi/2018/12/16/Kettle-Environment.html
I don't know whether the environments plugin would solve your problem, because you can't change the environment in the middle of a job, but for me, using the maitre script with environments when I run a job or transformation has made it easier to work with different projects/paths in my setup.
In Spoon you can click on the “Edit” menu and “Set environment variables”. It’ll list all variables currently in use and you can set their values. Then the transformation will use those values when you run.
This also works in Preview, but it's somewhat buggy; it doesn't always pick up updated values.

Electron does not run on shared folder

C:\share is a shared folder.
C:\share\electron-v13.0.1-win32-x64, \\192.168.1.10\share\electron-v13.0.1-win32-x64 and Z:\electron-v13.0.1-win32-x64 are the same folder.
The Electron app launches correctly when I run C:\share\electron-v13.0.1-win32-x64\electron.exe.
However, it does not launch correctly when I run Z:\electron-v13.0.1-win32-x64\electron.exe.
According to Task Manager, the Electron processes are running.
However, Electron's window is not shown.
Can Electron run correctly from a shared folder?
It should be safer to use it locally (from C:\share). Mapped drives behave very differently from the local filesystem, and their implementations can differ in their settings as well:
https://wiki.samba.org/index.php/Time_Synchronisation
https://www.truenas.com/community/threads/issue-with-modified-timestamps-on-windows-file-copy.82649/
https://help.2brightsparks.com/support/solutions/articles/43000335953-the-last-modification-date-and-time-are-wrong
If I understand correctly, you are just mapping back your own shared folder. Overall, Windows server configurations have felt more consistent to me, but the protocol has changed over time as well:
https://en.wikipedia.org/wiki/Server_Message_Block
I do not understand the network sharing protocols well enough to give you an exact answer as to why you have this problem, but I know enough to tell you that mounted shared folders are not like your own local filesystem. In many cases the differences do not matter and the user experience is great, but in some cases these minute differences break things in mysterious ways, even though the shares are mapped/mounted almost like a regular/local drive. This is not a problem exclusive to Electron.
That is a problem with a lot of things run through SMB (mainly binaries/tools): the shared folder might sit on a different filesystem, with different permissions and privileges (or a completely different permission structure underneath if it is a completely different filesystem). Remote folders might have issues with inotify receiving events on file updates, or might miss a changed file (touch on Linux is meant to update a file's date, but through a shared folder the date updates might be delayed/rounded). I think at one point even Makefiles misbehaved because they depend on file dates working the way they do locally.
Another problem with tools is shareability: can the tool handle multiple instances running from the same location? Is it saving something into ./tmp or some other file which could conflict with another user running it at the same time?
Overall, I tend to use shares for data (and have had issues with them a few times as well), and only run applications from a remote share if they are known not to cause trouble.
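For what it's worth, if the app has to live on the share, one workaround is to copy it to a local folder before launching it. A sketch using the paths from the question (the local target directory is just an example):
robocopy "\\192.168.1.10\share\electron-v13.0.1-win32-x64" "%LOCALAPPDATA%\electron-v13.0.1-win32-x64" /MIR
"%LOCALAPPDATA%\electron-v13.0.1-win32-x64\electron.exe"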

How do I make a simple public read-only WebDAV server with SabreDAV?

I recently began looking into WebDAV, as I found it to be an option for letting me play a Blu-ray folder remotely - i.e. without requiring the viewer to download the whole 24 GB ISO first.
Add a WebDAV source in Kodi v18 to a Blu-ray folder - and it actually plays! Very awesome.
The server can also be mounted on Windows with
net use m: http://example.com/webdavfolder/
or in Linux with
sudo mount -t davfs http://example.com/webdavfolder/ /mnt/mywebdav
and it should then (in theory) play with any software media player that supports Blu-ray Disc Java (BD-J), such as PowerDVD and VLC:
vlc bluray:///mnt/mywebdav --bluray-menu
PowerDVD.exe AUTOPLAY BD m:
(Unless of course the time-out values have been set too low, which seems to be the case for VLC at the moment.)
Anyway, all this is great, except I can't figure out how to make my WebDAV server read-only. Currently anyone can delete files as they wish, and that's of course not optimal.
So far I've only experimented with SabreDAV, because as far as I know that's the only option I have if I want to keep using my existing web host. I've been trying very minimal setups, because I've read that minimal setups should default to a read-only solution. That just doesn't seem to happen.
I initially used the setup from http://sabre.io/dav/gettingstarted/ and tried removing some lines. I also tried calling chmod 0444 MainFolder -R on the web server, and I can see that everything does get a read-only attribute, but it changes nothing. It's still possible to delete whatever I want. :-(
What am I missing?
Maybe I'm using the wrong technology for what I want to do? Is there some other/better way of offering a Blu-ray folder for remote viewing? (One that includes the whole experience - i.e. full Java menus etc).
I should probably mention that all of this is of course perfectly legal. It is my own Blu-ray project - not copyright material.
Also: it was difficult to decide whether this belongs on Stack Overflow or Super User. I ended up posting it on Stack Overflow because SabreDAV is about coding, and because there's no sabredav tag on Super User.
You have two options:
Create your own file/directory classes for sabre/dav that simply throw an error when trying to delete. You can basically start with a copy of Sabre\DAV\FS\Directory and Sabre\DAV\FS\File and change the methods that do writing (see the sketch after these two options).
Since you're considering just using Linux file permissions, the key thing you are missing is that 'deleting' is not controlled on the file or directory you're trying to delete. To delete a file or directory in Unix, all you need is write permission on the parent directory. However, I wouldn't recommend going this route, as it will just cause a weird error in sabre/dav, which might leave clients in a confused state: it would result in a 500 error, not the expected 403 error.
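For the first option, a minimal sketch (untested; it assumes a recent sabre/dav, the MainFolder directory and the /webdavfolder/ base URI from above) could look like this:

<?php
// readonly-server.php - wrap the stock FS classes and reject every write operation
require 'vendor/autoload.php';

use Sabre\DAV;

class ReadOnlyDirectory extends DAV\FS\Directory {
    function createFile($name, $data = null) {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
    function createDirectory($name) {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
    function delete() {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
    function setName($name) {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
    // Hand out read-only children so the whole tree stays write-protected
    function getChild($name) {
        $path = $this->path . '/' . $name;
        if (!file_exists($path)) throw new DAV\Exception\NotFound('File not found');
        return is_dir($path) ? new ReadOnlyDirectory($path) : new ReadOnlyFile($path);
    }
}

class ReadOnlyFile extends DAV\FS\File {
    function put($data) {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
    function delete() {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
    function setName($name) {
        throw new DAV\Exception\Forbidden('This server is read-only');
    }
}

$server = new DAV\Server(new ReadOnlyDirectory('MainFolder'));
$server->setBaseUri('/webdavfolder/');
$server->exec();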

Umbraco Bi-directional Deployment

I'm using Umbraco 7.4.x. I've been trying to figure out the best way to do bi-directional deployments.
As in, we have more than one dev working locally, and we have a dev server and a live server. We have single-click deploys from local to dev, but that's only code. We were copying the databases up to dev, but now we also have people who need to enter content on dev. This leads us to making changes on the dev database as well and copying the database back down. We do all this with version control of course, but it is still very inconvenient.
Is there a better approach to this that I'm missing? I tried using uSync a few months ago but we'd often run into crashes.
I have heard of Courier; it seems like it would be good for deploying from dev/stage to production, but would that also work for pushing content/doc type changes to our local machines? I wasn't sure, as they're not web servers on the internet, just local IIS Express instances running through Visual Studio.
Thanks in advance!
We use uSync (uSync + uSync.ContentEdition - https://our.umbraco.org/projects/developer-tools/usync/) for moving everything between instances. Give it another shot, as it has changed since you explored it in the past. It's worth mentioning that it requires careful configuration on the different environments to avoid conflicts, etc.
You can also use Courier; its latest version is used by Umbraco Cloud (http://umbraco.io/), which may also interest you, as it gives you full control over deployment processes between multiple Umbraco instances.
One option is to have all of your developers set up to work off of the same dev database. On occasion, your developers might have to "Republish the entire site" or rebuild the Examine indexes to make sure all their cache and TEMP files are up to date. Otherwise, this has worked well for us for many years. One frustrating part is that media files uploaded by dev A won't immediately be on the file system for dev B. You should be able to move your media to Azure Blob Storage to work around this problem; there is a package that should help set this up.
I wouldn't recommend uSync.ContentEdition. I haven't tried it personally, but I have yet to hear a good report about it. uSync, on the other hand, has been a lifesaver for us, even if it isn't perfect. At this point we install uSync on every site, even if we never configure it to read in changes. We like that we can record our changes to document types and datatypes in source control. Working with the shared database setup means that we don't need uSync to be reading changes in on our dev and local environments. However, you will need to make sure that your devs all understand uSync: if dev A adds a doc type, the uSync .def file for that doc type could show up on the file system for dev B, and dev B should not commit that uSync file in that situation.
Courier has been working a lot better recently. I wouldn't recommend it unless you are running Umbraco 7 and can get the latest version of Courier. Courier is very useful, but you should do a lot of testing with it before you hand it over to a client, because Courier gives you the ability to shoot yourself in the foot in a big way. It has definitely improved: in Courier for Umbraco 6 I used to have to try really hard to deploy without breaking my site; now, in Courier for Umbraco 7, I have to try really hard to break it. This is now a viable option for deploying content changes to production. Just make sure you test it heavily before you use it in a production environment.

Unable to understand the basic PATHs at root

I am trying to put my Mac's data in order.
I have many rc-files at my root, such as .vimrc, .screenrc and .bashrc.
I would like to move these files into specific folders, for example .vimrc and .screenrc to ~/bin/coding, and .bashrc to ~/bin/shells.
How do you determine where these rc-files must be?
Seriously, you should leave them where they are. Applications will be looking for them in specific locations (probably your $HOME directory, which is not root, by the way, or at least shouldn't be). This is a very old UNIX convention that you should attempt to change only if you fully understand the consequences.
Not meaning to sound condescending, but your describing your home directory as your root directory suggests that your knowledge of how it all works is less than it should be to understand those consequences (apologies if that offends you; I agonized over the best way to say it - what I mean is that you should tread carefully).
If you move them, you will have to ensure you run the applications that use them with their paths fully specified, and some applications may not let you do that.
They all start with "." so that they're hidden from the normal ls command, and if you're using a graphical file browser there should be a way to show or hide them there as well (such as Ctrl-H in the GNOME file manager).
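For example, to see them from a shell:
ls -a ~    # list everything in your home directory, including the dot-files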
A program's configuration is defined both at the system level and at the user level; you can tweak the user-level one, which resides in your home directory, to get what you need.
There's no need to group them in subfolders as you suggested: leaving them in your home directory (not root) follows the convention everybody uses. rc-files usually stay there after a program has been uninstalled, so if some day you do a fresh install you'll find the application configured as you left it.
Also, by leaving them in your home, you can bring your own home folder to another system and have the environment set as you like it.
