I use electron-builder to create distribution packages of my Electron app. My app offers users the option to disable (and enable) the auto-update that's included with electron-builder. I don't want to offer this choice in cases where the auto-update doesn't work in the first place (e.g. Windows Store, Linux Snaps, Linux deb packages, etc.).
I've seen that Electron has the property process.windowsStore, which at least helps me find out about one of the build targets. How can I find out which build target the Electron app is running from on Linux, so that I can hide the option to disable auto-updates?
TL;DR: Other than checking for AppImages, the Windows Store and the Mac App Store, you cannot detect where the binary you're currently executing came from. On Linux, the executable is the same for all packages; the package is only its container.
Besides process.windowsStore, there's also process.mas for the Mac App Store. On Linux, however, the information about where the executable came from (i.e. how it was installed) is lost. This is because all package formats (Debian packages, RedHat packages, Snaps, etc.) essentially boil down to an archive which the installer (dpkg, rpm, snap, etc.) extracts to a certain location. Only those programs keep track of which files belong to which package.
If you provide your application as installable packages and not only as simple tarballs, you will probably have to disable auto-update for all Linux builds. However, it may be worth checking at runtime whether the executable is currently running from a restricted directory (such as /bin, /usr, /lib, /lib64, etc.) -- this may be an indication that the user installed the app using a package:
// in the main process
if (process.platform === "linux") {
  const restrictedDirs = ["/bin", "/usr", "/lib", "/lib64" /* ... others to your liking ... */];
  const disableAutoUpdate = restrictedDirs.some((dir) => __dirname.startsWith(dir));
  if (disableAutoUpdate) {
    // your logic
  }
}
However, this is no guarantee -- the OS, the desktop environment, etc. may do some tricks you cannot possibly detect (e.g. extracting AppImages to such a destination). Also, don't test for /home (or, for that matter, /root) alone, because tarballs can be extracted anywhere. This includes user-owned directories which are not protected, are not installation destinations for package installers, and are not beneath /home but rather in /mnt, /run, etc.
There's another way (which possibly works) of testing for AppImages. According to the AppImage documentation, the environment variable APPIMAGE (amongst others) will be set to the full path of the executable. Thus, you could rewrite your auto-update check to something along the lines of:
if (process.platform === "linux") {
  let disableAutoUpdate = true;
  // maybe do some other checks...
  if (process.env.APPIMAGE) disableAutoUpdate = false;
  if (disableAutoUpdate) {
    // your logic
  }
}
Note, however, that this too is no guarantee, because any user may set environment variables before running any executable. This test just makes false positives far less likely.
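Putting the pieces together, here is a minimal sketch of a single eligibility check. The function name and its parameters are hypothetical, introduced only so the heuristics above can be combined and tested; it assumes the process flags and the APPIMAGE variable discussed in this answer:

```javascript
// Heuristic: returns true when it makes sense to offer the user an
// auto-update on/off toggle; store builds and Linux builds that are
// not AppImages are excluded.
// In the main process, call it as: canToggleAutoUpdate(process, process.env, __dirname)
function canToggleAutoUpdate(proc = process, env = process.env, appDir = __dirname) {
  if (proc.windowsStore || proc.mas) return false; // Windows Store / Mac App Store
  if (proc.platform !== "linux") return true;      // regular Windows/macOS builds
  // On Linux, assume a package-manager install when running from a
  // restricted directory -- see the caveats above.
  const restrictedDirs = ["/bin", "/usr", "/lib", "/lib64"];
  if (restrictedDirs.some((dir) => appDir.startsWith(dir))) return false;
  return Boolean(env.APPIMAGE); // set by the AppImage runtime
}
```

As with the individual checks, this is a best-effort guess, not a guarantee.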
Related
I'm writing a dart package in which I require the path to a file within the package to access the file for the package to work. The package is not written for the web platform.
My understanding of where packages are stored is limited; however, I would assume that there won't be a common directory for each platform, and even within platforms, I suppose it would vary based on how Dart was installed on the specific machine.
It is rather obvious that Dart, being an AOT-compiled language, means the file being executed is just one snapshot. Still, I would like to know if there is a way to access the directory structure of the package without the end-user having to pass path values to me.
To give you some context, I want to load a dynamic library on runtime using dart:ffi, and do so within a package which will be published to pub.dev with the libraries. Do let me know if you have any ideas.
What I've tried so far:
Directory.current.path: This is obviously not going to work.
File(Platform.resolvedExecutable).parent.path: This seems to be a workaround on Windows machines; I don't know how it would be useful on Linux, macOS, or even Android and iOS for that matter.
Directory.fromUri(Platform.script): This leads me to the snapshot created by the compiler on Linux; nevertheless, it is of no use to me.
It definitely won't work with ahead-of-time compilation, because then the compiled code is nowhere near the source code.
If your program is being run on the stand-alone VM and has direct access to the source code, you can potentially use Isolate.resolvePackageUri from dart:isolate to convert a known package: URI to a file: URI, which can then be used with dart:io to load the file.
Future<File?> fileFromPackageUri(Uri packageUri) async {
  var fileUri = await Isolate.resolvePackageUri(packageUri);
  if (fileUri == null) return null; // No such package.
  return File.fromUri(fileUri);
}
Again, this only works when running from source. Otherwise you need to find a way to deploy your native library along with the Dart program and know where to find it.
I was making a Delphi application, and wanted to test it on another PC to see if everything was working properly. I compiled and built the executable file, of course, and I transferred all of the files from the project folder to the other PC. When I launched the .exe file on that PC, nothing would happen. I then ticked the "Build with runtime packages" option in Project Options:
This made the .exe go from around 300 KB to around 30 KB, but now, instead of being able to launch the application on another (non-Delphi) PC, that PC got an error saying it was missing various files required to open the .exe.
I sent the same thing to various friends and all reported the same problem.
My application is a rather simple lottery prototype application, so I don't understand why I'm having trouble opening it on other PCs. Are there other special options I need to enable for this to work?
When you use runtime packages, you need to distribute those packages. These are the .bpl files that your program links to. It will be a subset of the packages listed in the runtime packages edit box in your screenshot. You should just list the packages that you use.
The net result of doing this is that the total amount you have to distribute is much greater than a single monolithic executable, because in a monolithic executable the unused code can be stripped. If you want to minimize the size of your program, and make life simple, do not use runtime packages.
It would be worthwhile reading Embarcadero's documentation:
Working with Packages and Components
Solve the first problem.
Using Runtime Packages will not solve the problem of your EXE not running on certain PCs. All it does is increase the complexity of deploying your application (as you have found).
Unless you need Runtime Packages for other, specific reasons, then you are far, far better off NOT using them, especially if you do not understand them (which based on the way you describe having discovered them does appear to be the case, if we're being honest).
Concentrate on finding out why your application does not run as a single, stand-alone EXE.
With all of the problems involving runtime packages, your EXE is currently not even reaching the point of running your application code, and this may be where your original problem lies. This means that once you have solved all the issues created by Runtime Packages, you will still be left with an EXE which does not run, i.e. your original problem.
What does your application do when it starts? Does it attempt to load files from any specific locations? What are those locations? What are the files? Are you using any third party libraries which may expect DLLs to be present or other external files? Are you trying to read or write settings to the registry or any external files (INI files, etc.)?
What OS are you trying to run on? This can be a very significant question for applications compiled with older Delphi versions. Have you tried configuring the EXE to run in Compatibility Mode for older versions of Windows? (Something you do in Windows itself, not when compiling the EXE.)
These are the questions you should be focussing on. Not runtime packages.
Gday,
A small tool that's been around for a while to help you with this is Dependency Walker. You can find it at http://www.dependencywalker.com. It's helped me out on more than one occasion. This will tell you what files (usually BPLs as stated in the other responses) need to be sent with your EXE.
Also look at NSIS to create a simple installer, and put your EXE and supporting BPLs and any other files in the same directory.
I'm using Windows, Mac and Linux machines in my daily duties. On all machines, I program in C++ and various shells scripts. So far I've adopted the various "main" IDEs on each platform, but the diversity is irritating. I'm therefore looking into the possibility of using Sublilme Text on all platforms.
I have a setup of Sublime Text on Windows that works perfectly and would like to use the same on the other platforms also, so that when I change something in my Sublime setup on, say, my Mac, I can easily pick up the latest setup on my Windows machine the next time I'm there.
Is this possible on the 3 mentioned platforms, without getting (more) grey hair? If so, any suggestions or experiences thereof?
Many folks upload the "Packages/User" folder to GitHub (or your VCS service of choice). Then, they use Package Control to install their packages. Package Control, through a settings file, will install any missing packages on a particular machine. I wrote a bit more about it here. You would then clone the git repo onto each machine, pulling updates when you decide to change something.
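For example, Package Control records the packages it has installed in Packages/User/Package Control.sublime-settings; keeping this file in the repo is what lets it reinstall anything missing on a new machine. A minimal sketch (the package names below are just examples):

```json
{
    "installed_packages":
    [
        "SideBarEnhancements",
        "SublimeLinter"
    ]
}
```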
Alternatively, you could probably use a cloud service + symlinks to keep things auto synced, but I've personally never used it that way.
There are some plugins that are platform specific, so keep an eye out for those.
There's also the package Package Syncing, which syncs installed packages and settings via some cloud service.
Works quite nicely, and automatically.
Has the advantage that you don't have to push/pull some dotfiles repo all the time.
No idea though whether this will work seamlessly across platforms (meaning whether all the settings will be platform-independent).
I have made a simple installer application in Delphi, nothing fancy. Basically I include files into the Exe, and then extract them to a user specified path.
I stumbled across a problem, however, and I have noticed this happens with ANY Windows executable; it does not matter if it is an installer or not.
If an EXE is named, or contains, the following words in the filename -- "Setup", "Build", "Install" and maybe others -- then whenever the application is run and closed, Windows pops up a Product Compatibility Assistant dialog, saying the application may not have installed correctly.
This is a problem, as even though the Files from my installer have actually extracted, and in my eyes the installer has done its job, Windows is complaining about it.
The only idea I have regarding this is that Windows must check the filenames of applications when they are executed, and in this case has identified mine as an installer. Windows must then have set a flag or something on the system; my installer must then update this flag to say that the installation was a success?
Windows does not complain about this when debugging from the IDE, so it cannot be code related, it must be the OS - this only happens when launching the Application from Windows, not Delphi.
You can try this easily: either create an application or rename one as Setup.exe, run it and then close it -- wait a few seconds and the Product Compatibility Assistant dialog will show.
I don't know where to start investigating how to stop this dialog, or where a setting may be to tell Windows the Installer was completed correctly.
Appreciate your thoughts and solutions thanks.
If I recall correctly, this happens when your install app does not include an application manifest. When UAC was introduced, MS introduced a heuristic detection for installers and shows the UAC elevation dialog. The heuristic checks for names like setup.exe, install.exe. The simple solution is to include an application manifest. If it is an installer you probably want to use the requireAdministrator setting.
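A minimal manifest along these lines might look like the following (this uses the standard UAC manifest schema; embed it as a resource or ship it next to the EXE as YourApp.exe.manifest -- that filename is just an example):

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- requireAdministrator for installers; asInvoker for normal apps -->
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```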
The feature is known as Installer Detection and is discussed here.
For what it is worth, I would always build an installer with a dedicated install tool like InnoSetup for example.
As David pointed out, MS uses some fuzzy logic to try to guess if the program is an installer. I wouldn't rely on this, as this is only for supporting legacy installer applications.
All new applications should have a manifest file, specifying whether it requires elevated privileges.
If an application has a manifest file that includes the requestedExecutionLevel directive, then Windows does not attempt Installer Detection.
Any program that is detected as an installer program but does not add a registry entry to the Add Remove Programs section of the registry (HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall) will get the message "This program might not have installed correctly".
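To illustrate, this is the kind of uninstall entry an installer can write (the key name and values here are hypothetical examples; real installers usually also add DisplayVersion, Publisher, and so on):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\MySimpleInstaller]
"DisplayName"="My Simple Installer Demo"
"UninstallString"="\"C:\\Program Files\\MyApp\\uninstall.exe\""
```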
With the desire to be able to reproduce a given revision of a project that is utilizing 3rd party visual component packages, what goes in SVN and what's the best way to implement/structure the SVN repos?
For non-visual components, the rule seems simple to ensure no reliance on outside repos - "no svn-externals reference to any outside repo allowed". I have a shared repo that I control, which is the only 'svn-externals' reference allowed. This makes it easy to implement and share these types of runtime items with source code in different SVN projects. Any reference to this internal shared repo is via 'svn-externals' using a specific revision number.
Visual packages seem to go counter to being able to be version controlled easily, as they may have to be reinstalled at each revision. What is the best way to create an SVN project that can be recreated later at a specific revision number... is there a recommended solution?
Previously we didn't worry about 3rd party components, as they don't change often and we never had a real good solution. I was wondering if others have figured out the best way to handle this problem, as I'm doing a spring cleaning/internal reorganization and wanted to do it 'better' than before.
Technically, the RTL/VCL source should also be in the SVN repo as well (if there's a Delphi hotfix/service pack released.)
My solution will likely be to create a virtual machine with a particular release of the Delphi environment with all visual controls installed. As we add/update visual controls, or update Delphi with hotfixes/service packs then we create a new version of the virtual machine. We then store an image of this VM revision on a shelf somewhere. Is this what you do? Does the Delphi activation/licensing work well (or at all) in this scenario?
Thanks,
Darian
You can prepare "start IDE" (and possibly "build") scripts for your projects and maintain them in the repository as the project evolves.
Regardless of your decision about keeping components in separate repositories and using externals, or including them in a single repository with possible branching, you should also include compiled bpl files for every component build and for every branch prepared for a specific Delphi version.
You should definitely try to keep most (if not all) paths relative; in the worst case, use environment variables to point to your root project dir.
The start-IDE script allows you to keep each project and Delphi version environment separately configured on a single Windows installation.
It should include necessary registry keys for your project and Delphi:
Windows Registry Editor Version 5.00
[-${DelphiRegKey}\Disabled Packages]
[-${DelphiRegKey}\Known Packages]
[-${DelphiRegKey}\Library]
[${DelphiRegKey}\Known Packages]
"$(BDS)\\Bin\\dclstd${CompilerVersion}.bpl"="Borland Standard Components"
"$(BDS)\\Bin\\dclie${CompilerVersion}.bpl"="Internet Explorer Components"
"$(BDS)\\Bin\\dcldb${CompilerVersion}.bpl"="Borland Database Components"
(...)
"${CustomComponentPack}"="Custom Components"
[${DelphiRegKey}\Library]
"Search Path"="${YourLibrarySourceFolder1};${YourLibrarySourceFolder2}"
(...)
You can then prepare batch file:
regedit /s project.reg
%DelphiPath%\bin\bds -rProjectRegKey Project.dpr
Where ${DelphiRegKey} is HKEY_CURRENT_USER\Software\Borland (or CodeGear in newer versions)\ProjectRegKey.
Basically, it is easiest to dump your current working configuration from the registry, strip it of unnecessary keys, change paths to relative, and then adapt it to work with your project.
In such configuration, switching between projects and their branches which have different sets of components (and/or possibly using different Delphi version) is a matter of checking out a repository only and running the script.
Fortunately for us, we don't have to worry about a hotfix/service pack; we're still on Delphi 5. :D
Sigh, there was a time when an entire application (settings and all) would exist within a single directory - making this a non-issue. But, the world has moved on, and we have various parts of an application scattered all over the place:
registry
Windows\System
Program Files
Sometimes even User folders in "Application Data" or "Local Settings"
You are quite right to consider the impact of hotfixes/service packs. It's not only the RTL/VCL that could be affected; the compiler itself could have been slightly changed. Note also that, along the same line of thought, even when you upgrade Delphi versions, you need to build using the correct version. Admittedly this is a little easier because you can run different Delphi versions alongside each other.
However, I'm going to advise that it's probably not worth going to too much effort. Remember, working on old versions is always more expensive than working on the current version.
Ideally you want all your dev to be on main branch code; you want to minimise patch-work on older versions.
So strive to keep the majority of your users on the latest version as much as possible.
Admittedly this isn't always possible.
You wouldn't want to jump over to the 'new version' without some testing first in any case.
Certain agile processes do tend to make this easier.
By using a separate build machine or VM, you already have a measure of control.
TIP: I would also suggest that the build process automatically copy build output to a different machine, or at least a different hard-drive.
Once you're satisfied with the service pack, you can plan when you want to roll it to your build machine.
It is extremely important to keep record of the label at which the build configuration changed. (Just in case.)
If your build scripts are also kept in source control, this happens implicitly.
When you've rolled out the hotfix/service pack, fixes to older versions should be actively discouraged.
Of course, they probably can't be eliminated, but if it's rare enough, then even manual reconfiguration could be feasible.
Instead of a VM option to keep your old configuration, you can also consider drive-imaging.
To save on the $$$ of VMWare LabManager, look for a command-line driven VM Player.
You might have to keep 2 "live" machines/VMs, but should never need more than that.
It's okay for an automatic build script to fail because the desired configuration isn't available. This will remind you to set it up manually.
Remember, working on old versions is always more expensive than working on the current version.
Third Party Packages
We went to a little bit more effort here. One of our main motivations, though, was the fact that we use about 8 third party packages. So doing something to standardise this in itself made sense. We also decided running 8 installation programs was a PITA, so we devised an easy way to manually install all required packages from source-control.
Key Considerations
The build environment doesn't need any packages installed, provided the object and/or source files are accessible.
It would help if developers could fairly easily ensure they're building with the same version of third party libraries when necessary.
However, dev environments usually must install packages into the IDE.
This can sometimes cause problems with source compatibility.
For example new properties that get written to IDE maintained files.
Which of course brings us back to the second point.
Since Third Party packages are infrequently updated, they are placed within a slightly different area of source-control.
But note: they must still be referenced via relative paths.
We created the following folder structure:
...\ThirdParty\_DesignTimePackages //The actual package files only are copied here
...\ThirdParty\_RunTimePackages //As above, for any packages "required" by those above
...\ThirdParty\Suite1
...\ThirdParty\Suite2
...\ThirdParty\Suite3
As a result of this it's quite easy to configure a new environment:
Get latest version of all ThirdParty files.
Add _DesignTimePackages and _RunTimePackages to Windows Path
Open Delphi
Select Install Components
Select all packages from _DesignTimePackages.
Done!
Edit: Darian was concerned about the possibility of errors when switching versions of Design Packages. However, this approach avoids those kinds of problems.
By adding _DesignTimePackages and _RunTimePackages to the Windows Path, Delphi will always find required packages in the same place.
As a result, you're less likely to encounter the 'package nightmare' of incompatible versions.
Of course, if you do something silly like rebuild some of your packages and check-in the new version, you can expect problems - no matter what approach you follow.
I usually structure my repository in SVN like this:
/trunk/app1
/trunk/comp/thirdparty1
/trunk/comp/thirdparty2
/trunk/comp/thirdparty3...
I have, right in the root folder (trunk), a project group (.groupproj, or .bpg on older Delphi versions) that contains all my components (allcomponents.groupproj).
Installing on a new machine means opening that group project and installing the designtime components. That's a drag on all versions of Delphi older than 2010, but 2010 and XE have a lovely feature so you can see at a glance which components are designtime components.
I also, sometimes, will save myself the trouble of installing those components by hand, by making a build.bat file and a regcomponents.bat file. The regcomponents file just runs regedit and imports the keys needed to register all those components, after build.bat has built them and everything else.
When you move up from one Delphi version to another, it's sure good to have both a batch and reg file, and a group project, to help you. Especially if you have to go through and do a lot of opening of projects/packages and saving them as MyComponent3.dpk instead of MyComponent2.dpk, or updating the package extension from 150 to 160, or whatever your packages do.