I'm writing a script to install postgresql modules. How can I find the "contrib" directory?

I'm writing a script that will install some of the optional postgresql modules. Is there a way to programmatically figure out where the contrib directory is, or do I have to prompt for the path? I've looked at a few examples and it seems inconsistent; it doesn't appear in pg_config's output.
(The script might be run on OS X or linux, and I can't make assumptions about how postgresql was installed).

There is no easy, works-everywhere answer to this, so it's a matter of how much intelligence you want to try to put into your installer. For some people it will be impossible to do what you're hoping for; the best you'll be able to do is offer guidance about what's missing. There is a lot of packaging variation here between Linux distributions, versions of PostgreSQL, and the various ways you can install PostgreSQL on OS X (MacPorts, Homebrew, etc.).
First off, only source-code installs will have a contrib directory containing the source needed to build the optional modules. In the packaged builds for Linux, the contrib binaries may only be available via an optional package, called something like postgresql-contrib. Installing the package the binaries are included in is the only way to make the optional modules that come with the database available. You may see some variation in the OS X builds here too.
If you want to install extensions (what these are officially called as of PostgreSQL 9.1, rather than 'modules') using binaries you provide instead, what you then need to know is where to put the resulting shared libraries and the matching SQL files that reference them. What pg_config returns for pkglibdir tells you where the binaries go, while sharedir points toward the default place to put the SQL. Providing binaries is a losing game though; the job of keeping in sync with every platform to build them is a huge one.
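For example, assuming pg_config is available on the PATH, a script can ask for those two locations directly:
pg_config --pkglibdir
pg_config --sharedir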
And here are the sort of additional complications you'll run into in this area, if you wanted to ship source code and try to build things yourself in an automatic way:
PostgreSQL 9.1 installs these using the new CREATE EXTENSION mechanism, so you'll need to handle both the pre-9.1 method and the new one introduced there.
Not all PostgreSQL installations will have pg_config. It's considered a development tool, and which package it gets bundled with (and whether that package is mandatory or not) varies. Debian/Ubuntu put it into the optional libpq-dev package, while RedHat-derived RPMs have it in postgresql-devel or postgresql-[version]-devel. (A minimal check for its presence is sketched after these notes.)
Because pg_config is necessary for compiling the new 9.1 extensions, packagers have started reconsidering where it goes; it's considered a lot more important now than it used to be. Packages for 9.1 or later might change which package it's contained in. That doesn't really change what you can and can't do, though; it just affects what advice you might offer for correcting situations your program can't deal with.
I've been describing the standard Linux packaging when I talk about that OS. There are also installers for both Linux and OS X from EnterpriseDB, what they call their "one-click installers". These use a different standard altogether for what people do and don't get installed in this area. I don't follow the commercial packaging closely enough to know what is actually different, but it's another variable you can expect people to encounter. Recent OS X versions may have some system PostgreSQL components floating around too; no idea yet how those handle extensions.
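As a minimal sketch of that pg_config presence check, with guidance text based on the package names mentioned above:
if ! command -v pg_config >/dev/null 2>&1; then
    echo "pg_config not found; install libpq-dev (Debian/Ubuntu) or postgresql-devel / postgresql-[version]-devel (RedHat-derived)" >&2
    exit 1
fi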
Basically, any of version, packager, and platform can change how this works, and the idea that you'll find a solution that handles even the majority of those permutations is optimistic. Installing extensions is known to be difficult in PostgreSQL, which is one thing that motivated the 9.1 changes turning it into a simple CREATE EXTENSION for many of them. For now, though, those changes have just added another whole set of variation into the mix, actually making this harder during the transition period. It will be a while until PostgreSQL versions supporting CREATE EXTENSION are the only ones in popular use.
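To illustrate the two mechanisms, here is a hedged sketch of how an installer might branch on the server version; the extension name (hstore), database name, and version test are illustrative only:
PGVER=$(pg_config --version | awk '{print $2}')
SHAREDIR=$(pg_config --sharedir)
case "$PGVER" in
    9.[1-9]*|1[0-9].*)
        # 9.1 or later: use the extension mechanism
        psql -d mydb -c "CREATE EXTENSION hstore;"
        ;;
    *)
        # pre-9.1: run the module's SQL file shipped under sharedir
        psql -d mydb -f "$SHAREDIR/contrib/hstore.sql"
        ;;
esac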

Related

Install same software/version with different compilers side-by-side

For the development/testing of our CFD code I like to frequently switch between Clang (for its strictness/warnings) and GCC (for performance), but this requires some of its dependencies (e.g. NetCDF) to be compiled with the same compiler.
I know that Homebrew has the option to install multiple versions of software side-by-side and switch between them, but is it possible to do something similar using the same software version, but compiled with different compilers (by setting HOMEBREW_CC and HOMEBREW_CXX)?
Something like (wishful thinking, after somehow installing NetCDF with both Clang and GCC):
brew switch netcdf 4.3.3-gcc
brew switch netcdf 4.3.3-clang
I think this is possible only if you explicitly use different version strings, as in your example "4.3.3-gcc" and "4.3.3-clang".
If the version number is identical, brew has no way to tell the builds apart.
Also I wouldn't do this.
By compiling the same library multiple different ways, you start heading into a dependency nightmare.
Dependency conflicts: even if you swap out "netcdf", how do you know you swapped out all of its dependencies too? If they are not compiled with the same compiler, bad things can happen; for example, conflicts can arise from differences in how calls are made, or from options enabled in one compiler for both the app and its dependencies that weren't enabled in another build.
I don't recommend doing this, it's too much of a hassle.
But, if you really need two builds (such as for testing) then I would build them to isolated folder trees outside of your system path and do any testing against them there. Brew is not the best way to address this issue, as this is a non-standard use-case.
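For instance, a hedged sketch of building the same NetCDF source tree into two isolated prefixes (the paths, version, and compiler names are illustrative):
CC=clang CXX=clang++ ./configure --prefix="$HOME/opt/netcdf-4.3.3-clang" && make && make install
make distclean
CC=gcc-5 CXX=g++-5 ./configure --prefix="$HOME/opt/netcdf-4.3.3-gcc" && make && make install
Your CFD builds can then point at whichever prefix they need via include and library paths.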

What is a good Sublime Text workflow for sharing a common setup between several platforms

I'm using Windows, Mac and Linux machines in my daily duties. On all machines I program in C++ and various shell scripts. So far I've adopted the various "main" IDEs on each platform, but the diversity is irritating. I'm therefore looking into the possibility of using Sublime Text on all platforms.
I have a setup of Sublime Text on Windows that works perfectly and would like to use the same on the other platforms also, so that when I change something in my Sublime setup on, say, my Mac, I can easily pick up the latest setup on my Windows machine the next time I'm there.
Is this possible on the 3 mentioned platforms, without getting (more) grey hair? If so, any suggestions or experiences thereof?
Many folks upload the "Packages/User" folder to GitHub (or your VCS service of choice). Then they use Package Control to install their packages. Package Control, through a settings file, will install any missing packages on a particular machine. I wrote a bit more about it here. You would then clone the git repo onto each machine, pulling updates whenever you decide to change something.
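A hedged sketch of that workflow (the path shown is the OS X default for Sublime Text 3, and the repository URL is a placeholder):
cd "$HOME/Library/Application Support/Sublime Text 3/Packages"
git clone https://github.com/yourname/sublime-user-settings.git User    # move any existing User folder aside first
On Windows the Packages folder lives under %APPDATA%\Sublime Text 3, and on Linux under ~/.config/sublime-text-3.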
Alternatively, you could probably use a cloud service + symlinks to keep things auto synced, but I've personally never used it that way.
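For example, a rough sketch on OS X using a synced folder (the Dropbox path is illustrative):
mv "$HOME/Library/Application Support/Sublime Text 3/Packages/User" "$HOME/Dropbox/sublime/User"
ln -s "$HOME/Dropbox/sublime/User" "$HOME/Library/Application Support/Sublime Text 3/Packages/User"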
There are some plugins that are platform specific, so keep an eye out for those.
There's also the Package Syncing package, which syncs installed packages and settings via some cloud service.
Works quite nicely, and automatically.
Has the advantage that you don't have to push/pull some dotfiles repo all the time.
No idea though whether this will work seamlessly across platforms (meaning whether all the settings will be platform-independent).

Command line access to iOS app directory (sandbox) from Mac

I need to access the sandbox directory for an application installed on an iOS device, using the command line (non-gui) from a Mac or Linux. This is to help with development and testing automation. Dropping a json file into the sandbox lets me set parameters like extra debug messages and smaller refresh intervals.
A tool like iFunBox works perfectly but is graphical only, requiring numerous clicks to do this. Emails to the developers were unanswered. It also does not support AppleScript. I did find another app that provided a Fuse module, but it turned out buggy especially if the app was uninstalled and then reinstalled (in order to reset back to first time user experience). I reported the problems to the developer but there is no fix on the horizon.
The things I need to do are:
Test if an app with a specific bundle id is installed
Create Library/Caches/MYLIBNAME directory if it doesn't exist
Copy a ~100 byte json file from the Mac to that directory
Get a copy of that file
A solution that only works from Linux is acceptable too
Devices are not jailbroken and I would prefer not to need that as a requirement
In some cases I do not have the source code to the app since it is a third party using my library, so compiling different versions of the app isn't practical.
Answer is below in many comments thanks to lxt. Summary is:
Various libraries and programs associated with libimobiledevice can solve the problems
Use patched iFuse to mount an application sandbox
Use idevicesyslog to see the console log
Use ideviceinstaller to install/uninstall apps
The various libraries and programs associated with libimobiledevice are incredibly difficult if not impossible to compile as is on Linux or Mac, and there is no unified distribution of the source or binaries
For Ubuntu try libimobiledevice (may have 3 suffix), ideviceinstaller and libimobiledevice-utils packages
For Mac a search for libimobiledevice-macosx may get you some of the way there
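For illustration, a hedged sketch of the kind of session these tools enable (the bundle id and mount point are placeholders, and the exact ifuse flags vary by version):
ideviceinstaller -l | grep com.example.myapp        # is the app installed?
mkdir -p /tmp/sandbox
ifuse /tmp/sandbox --container com.example.myapp    # older ifuse builds need the patch described below
mkdir -p /tmp/sandbox/Library/Caches/MYLIBNAME
cp settings.json /tmp/sandbox/Library/Caches/MYLIBNAME/
fusermount -u /tmp/sandbox                          # on Linux; use umount on a Mac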
This is going to be a little tricky, because as I think you've found out the application name is randomly generated on every install. I don't think there is a way past that, certainly that I know of. This explains the problems you're running into when simulating a new install (...the app directory name changes to a new, random hash, and then you're stuck).
Although my preference would be to access this config file in some other way (perhaps over a network, and have some code that only executes on debug/test builds check for it), if you did want to do this then I'd suggest trying something like writing a script that when you want to simulate a new install chooses the app directory that's most recently modified. But this is very hacky.
If you're not able to insert conditional code that only executes on debug/ test builds then I think the random app naming schema that iOS uses at a file system level is going to be problematic for you whatever approach you take.
Update: Regarding iFuse and libimobiledevice - out of the box it limits you to the Documents directory. This is because the authors of iFuse don't want entry-level users to be confused, and also because the structure is a little different depending on the iOS version. You can comment out the line in the iFuse source - fuse_opt_add_arg(&args, "-osubdir=Documents"); - to get access to the Library directory through the mount. You will obviously need to re-compile iFuse yourself if doing this.
You can make use of the MobileDevice Library.
I know this is an old question and I doubt anyone is looking here anymore, but I thought I'd mention that you can use 'brew install libimobiledevice' to compile it on the Mac. There are a lot of dependencies, and Homebrew really helps make it an easy process by installing them for you.

Best practices for using SVN with Delphi Visual Component packages?

With the desire to be able to reproduce a given revision of a project that is utilizing 3rd party visual component packages, what goes in SVN and what's the best way to implement/structure the SVN repos?
For non-visual components, the rule to ensure no reliance on outside repos seems simple: "no svn-externals reference to any outside repo allowed". I have a shared repo that I control, which is the only 'svn-externals' reference allowed. This makes it easy to implement and share these types of runtime items with source code in different SVN projects. Any reference to this internal shared repo is by 'svn-externals' using a specific revision number.
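For example, a hedged sketch of pinning such an external to a fixed revision (the URL and paths are illustrative):
svn propset svn:externals "-r 1234 https://svn.mycompany.example/shared/comp/thirdparty1 comp/thirdparty1" .
svn update
svn commit -m "Pin shared thirdparty1 external at r1234"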
Visual packages seem to go counter to being able to be version controlled easily as they may have to be reinstalled at each revision. How to best create a SVN project which is able to be recreated later at a specific revision number...is there a recommended solution?
Previously we didn't worry about 3rd party components, as they don't change often and we never had a really good solution. I was wondering if others have figured out the best way to handle this problem, as I'm doing a spring cleaning/internal reorganization and wanted to do it 'better' than before.
Technically, the RTL/VCL source should also be in the SVN repo (in case a Delphi hotfix/service pack is released).
My solution will likely be to create a virtual machine with a particular release of the Delphi environment with all visual controls installed. As we add/update visual controls, or update Delphi with hotfixes/service packs then we create a new version of the virtual machine. We then store an image of this VM revision on a shelf somewhere. Is this what you do? Does the Delphi activation/licensing work well (or at all) in this scenario?
Thanks,
Darian
You can prepare "start IDE" (and possibly "build") scripts for your projects and maintain them as project evolves in repository.
Regardless of your decision about keeping components in separate repositories and using externals, or including them in a single repository with possible branching, you should also include compiled bpl files for every component build and for every branch prepared for a specific Delphi version.
You should definitely try to keep most (if not all) paths relative; in the worst case, use environment variables to point to your root project dir.
The "start IDE" script allows you to keep each project and Delphi version environment separately configured on a single Windows installation.
It should include necessary registry keys for your project and Delphi:
Windows Registry Editor Version 5.00
[-${DelphiRegKey}\Disabled Packages]
[-${DelphiRegKey}\Known Packages]
[-${DelphiRegKey}\Library]
[${DelphiRegKey}\Known Packages]
"$(BDS)\\Bin\\dclstd${CompilerVersion}.bpl"="Borland Standard Components"
"$(BDS)\\Bin\\dclie${CompilerVersion}.bpl"="Internet Explorer Components"
"$(BDS)\\Bin\\dcldb${CompilerVersion}.bpl"="Borland Database Components"
(...)
"${CustomComponentPack}"="Custom Components"
[${DelphiRegKey}\Library]
"Search Path"="${YourLibrarySourceFolder1};${YourLibrarySourceFolder2}"
(...)
You can then prepare batch file:
regedit /s project.reg
%DelphiPath%\bin\bds -rProjectRegKey Project.dpr
Where ${DelphiRegKey} is HKEY_CURRENT_USER\Software\Borland(or CodeGear in newer versions)\ProjectRegKey.
Basically, it is easiest to dump your current working configuration from the registry, strip it of unnecessary keys, change the paths to relative ones, and then adapt it to work with your project.
In such a configuration, switching between projects and their branches, which may have different sets of components (and possibly use different Delphi versions), is just a matter of checking out the repository and running the script.
Fortunately for us, we don't have to worry about a hotfix/service pack; we're still on Delphi 5. :D
Sigh, there was a time when an entire application (settings and all) would exist within a single directory - making this a non-issue. But, the world has moved on, and we have various parts of an application scattered all over the place:
registry
Windows\System
Program Files
Sometimes even User folders in "Application Data" or "Local Settings"
You are quite right to consider the impact of hotfixes/service packs. It's not only RTL/VCL that could be affected, but the compiler itself could have been slightly changed. Note also that running on the same line of thought, even when you upgrade Delphi versions, you need to build using the correct version. Admittedly this is a little easier because you can run different Delphi versions alongside each other.
However, I'm going to advise that it's probably not worth going to too much effort. Remember, working on old versions is always more expensive than working on the current version.
Ideally you want all your dev to be on main-branch code; you want to minimise patch-work on older versions.
So strive to keep the majority of your users on the latest version as much as possible.
Admittedly this isn't always possible.
You wouldn't want to jump over to the 'new version' without some testing first in any case.
Certain agile processes do tend to make this easier.
By using a separate build machine or VM, you already have a measure of control.
TIP: I would also suggest that the build process automatically copy build output to a different machine, or at least a different hard drive.
Once you're satisfied with the service pack, you can plan when you want to roll it to your build machine.
It is extremely important to keep a record of the label at which the build configuration changed. (Just in case.)
If your build scripts are also kept in source control, this happens implicitly.
When you've rolled out the hotfix/service pack, fixes to older versions should be actively discouraged.
Of course, they probably can't be eliminated, but if it's rare enough, then even manual reconfiguration could be feasible.
Instead of a VM option to keep your old configuration, you can also consider drive-imaging.
To save on the $$$ of VMWare LabManager, look for a command-line driven VM Player.
You might have to keep 2 "live" machines/VMs, but should never need more than that.
It's okay for an automatic build script to fail because the desired configuration isn't available. This will remind you to set it up manually.
Remember, working on old versions is always more expensive than working on the current version.
Third Party Packages
We went to a little bit more effort here. One of our main motivations was the fact that we use about 8 third-party packages, so doing something to standardise this in itself made sense. We also decided running 8 installation programs was a PITA, so we devised an easy way to manually install all required packages from source control.
Key Considerations
The build environment doesn't need any packages installed, provided the object and/or source files are accessible.
It would help if developers could fairly easily ensure they're building with the same version of third party libraries when necessary.
However, dev environments usually must install packages into the IDE.
This can sometimes cause problems with source compatibility.
For example new properties that get written to IDE maintained files.
Which of course brings us back to the second point.
Since Third Party packages are infrequently updated, they are placed within a slightly different area of source-control.
But, NB: they must still be referenced via relative paths.
We created the following folder structure:
...\ThirdParty\_DesignTimePackages //The actual package files only are copied here
...\ThirdParty\_RunTimePackages //As above, for any packages "required" by those above
...\ThirdParty\Suite1
...\ThirdParty\Suite2
...\ThirdParty\Suite3
As a result of this it's quite easy to configure a new environment:
Get latest version of all ThirdParty files.
Add _DesignTimePackages and _RunTimePackages to Windows Path
Open Delphi
Select Install Components
Select all packages from _DesignTimePackages.
Done!
Edit: Darian was concerned about the possibility of errors when switching versions of design packages. However, this approach avoids those kinds of problems.
By adding _DesignTimePackages and _RunTimePackages to the Windows Path, Delphi will always find required packages in the same place.
As a result, you're less likely to encounter the 'package nightmare' of incompatible versions.
Of course, if you do something silly like rebuild some of your packages and check-in the new version, you can expect problems - no matter what approach you follow.
I usually structure my repository in SVN like this:
/trunk/app1
/trunk/comp/thirdparty1
/trunk/comp/thirdparty2
/trunk/comp/thirdparty3...
I have, right in the root folder (trunk), a project group (.groupproj, or .bpg on older Delphi versions) that contains all my components (allcomponents.groupproj).
Installing on a new machine means opening that project group and installing the design-time components. That's a drag on all versions of Delphi older than 2010, but 2010 and XE have a lovely feature that lets you see at a glance which components are design-time components.
I also sometimes save myself the trouble of installing those components by hand by making a build.bat file and a regcomponents.bat file. The regcomponents one just runs regedit and imports the keys needed to register all those components, after build.bat has built them and everything else.
When you move up from one Delphi version to another, it's good to have both the batch and reg files, and a group project, to help you. Especially if you have to go through a lot of opening projects/packages and saving them as MyComponent3.dpk instead of MyComponent2.dpk, or updating the package suffix from 150 to 160, or whatever your packages need.

Why does Mac OS X come with ruby/rails?

Why does Mac OS X come with ruby and ruby on rails pre-installed? Does the OS actually use it at all? Can I update my Ruby, Rails or Gem versions safely without something spitting the dummy?
As others have noted, OS X comes with various open source packages pre-installed. While this can be a nice convenience, the packages often are only updated to new versions as part of a major OS X release (like 10.5 to 10.6). Also, some packages are used elsewhere by other parts of OS X and there is no easy way to know which. In general, Apple assumes (and you should, too) that everything under /System/Library and /usr/, except for /usr/local/, is part of OS X and is administered by Apple. You should not attempt to remove or modify files in those hierarchies. That includes just about all of the open source packages, including Ruby.
Instead, to upgrade an existing package, the right approach is to install a new version in a separate location (say, /usr/local/) and invoke the new version by an absolute path reference (/usr/local/bin/ruby) or manipulating the shell PATH environment variable, if necessary. /usr/local/ is often used if installing directly from source. Many people prefer to use one of the 3rd-party open source package distributors, such as MacPorts, Fink, or Homebrew, each of which has its own package manager and installation locations.
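For example, a hedged sketch of a from-source install into /usr/local, run from an unpacked Ruby source tree (the paths are illustrative):
./configure --prefix=/usr/local
make && sudo make install
export PATH=/usr/local/bin:$PATH    # or call /usr/local/bin/ruby by absolute path
/usr/local/bin/ruby --version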
No, the OS itself doesn't use it; Apple just wants to make its products a bit more appealing to developers. (Python is also preinstalled, along with some other packages.)
You can safely update your Ruby, Rails and gems, but the default Ruby version is a bit outdated. Check out RVM so that you can install different Rubies on your system.
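For example, a hedged sketch using RVM (the install command comes from RVM's documentation and may change; the Ruby version is illustrative):
\curl -sSL https://get.rvm.io | bash -s stable
rvm install 1.9.3
rvm use 1.9.3 --default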
Consider RubyStack if you want to play with more up-to-date environments without interfering with the existing versions. Disclaimer: I am one of the developers of RubyStack. It is freely available under the open-source Apache 2.0 License.
