Delphi app calling COBOL app -> error

We need to get data out of an older accounting system. We have received a dll that gives us access to the data we need. It includes a type library that we have imported.
If we run our test application from the same directory as the accounting system, everything works fine. If we try to run our application from a different directory, we get the following error:
Dynamically Bound RTS
Runtime DLL 'OOPS', version 3.1, entry point oops
not recorded in registry, not found or incompatible with requirements
of dynamically bound COBOL program. Dynamic binding of RTS requires:
Runtime DLL 'OOLSM', at least Version 3.1
Can anybody provide some helpful information on this?
Are we supposed to have some kind of COBOL runtime in our directory? Or in the path? Or registered in the registry?
Thanks,
-Vegar
Updates:
Setting the system %PATH% to include the path to the accounting system seems to do the trick. Including it as a user variable did not have the same effect, for some reason.

What COBOL are you using?
I have done this for years with Micro Focus Net Express 3.1, and it all works just fine.
I write COBOL DLLs to access COBOL data files, and also write Delphi DLLs to add new features to old COBOL systems.
And yes, I do set the runtime path; that is an environment variable called COBDIR. There are others, but usually %PATH% and %COBDIR% are enough.
If you give more details about which COBOL compiler you are using and what the DLL interface you are calling looks like, it will be easier to help you.
And maybe "Dependency Walker" can help you identify which runtime files are missing, if any are:
http://www.dependencywalker.com/
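For example, a hedged sketch of a small launcher script (the folder name and exe name are placeholders, not from the question):

    rem Point the Micro Focus runtime variables at the accounting system's
    rem folder, then start the test app from the same console so it inherits them.
    set COBDIR=C:\Accounting
    set PATH=%PATH%;C:\Accounting
    TestApp.exe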

If it works from the accounting app's directory, but not a different one, the first thing I'd try is adding that directory to your path.

Unless it is already loaded into memory, Windows looks for a DLL that a program requests in the directory the application itself is located in, and in every location listed in the PATH environment variable.
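If editing the machine-wide PATH is undesirable, here is a minimal Delphi sketch of doing the same fix from inside the test application; the folder name is a placeholder, and this assumes the COBOL runtime DLLs are loaded dynamically at the first call into the vendor DLL, as the "Dynamically Bound RTS" message suggests:

    program CobolBridgeTest;

    {$APPTYPE CONSOLE}

    uses
      Windows, SysUtils;

    const
      AccountingDir = 'C:\Accounting'; // placeholder path

    begin
      // Prepend the accounting folder to this process's PATH so the dynamically
      // bound COBOL runtime DLLs (OOPS, OOLSM, ...) can be found no matter where
      // our own exe is started from. Must run before the first call into the DLL.
      SetEnvironmentVariable('PATH',
        PChar(AccountingDir + ';' + GetEnvironmentVariable('PATH')));

      // ... now create the imported type-library objects / call the vendor DLL ...
    end.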

Related

rtl90.bpl was not found, how do I include it in my Delphi 2005 win 32bit app?

The software that I have found myself supporting fails, from time to time, to run on certain PCs. Generally they are fresh Windows 7 installs.
The error message is "this application failed to start because rtl90.bpl was not found..."
To rectify the problem I have our PC Support copy the rtl90.bpl file to the user's system32 directory; however, I would like to ensure this error no longer occurs.
I have googled and found the following link: rtl90.bpl problem
My question is this:
The option to "Build with runtime packages" is already selected under the Project options for this program, and does not appear to make any difference to the users getting the problem.
Do I have to specifically Add the missing rtl90.bpl file to the project?
Please note that I know very little about delphi programming.
Since you have marked the option Build with runtime packages in your project, the final exe has to be deployed together with some additional BPL files. To avoid those dependencies you must uncheck that option and build your project. Your exe will then be bigger, but without dependencies.
That package is a runtime package containing the RTL. You presumably also need to deploy vcl90.bpl for the VCL and possibly some others. By enabling runtime packages you are promising to deploy those packages where the executable can find them.
You have 3 main options:
1. Deploy the packages to a location that is contained in the PATH variable. Usually this means modifying PATH. You should never write to the system directory. It is owned by the system and you should respect that.
2. Deploy the packages to the same directory as the executable file.
3. Disable runtime packages and therefore build a single self-contained executable. The RTL/VCL code will be statically linked into your executable.
Option 1 is poor in my view. Relying on the PATH variable and the ability to modify it is fragile. Option 2 works but seems rather pointless in comparison with option 3. You deploy more files, and larger files, when you choose option 2, so why choose it?
In summary I recommend option 3. Statically link all RTL/VCL code into your executable.
The only situation where option 2 wins, in my view, is when you have multiple related executables that are all deployed to the same directory. In that situation sharing the RTL/VCL code can make sense.

Delphi XE2 host application from output directory

Is there a way to tell a Delphi project that builds a DLL to use, as its host application, an executable that lives in the output directory of the DLL being built?
One thing is, I'm using option sets with Delphi XE2, so in the dproj for the DLL I'm building I don't even have a DCC_ExeOutput directory, not sure if that matters.
Allowing this would greatly simplify some issues we've run into trying to migrate from VSS to SVN.
Also, what do you call the $(thing)s?
The $(name) things are environment variables. I tried setting the host application to .\$(Platform)\$(Config)\Test.exe and received this error message:
Could not find program, '.\%Platform%\%Config%\Test.exe'.
Note how the $(...) was turned into environment variable syntax.
I also tried with $(systemdrive)\Test.exe and received this error message:
Could not find program, 'C:\Test.exe'.
So environment variables are substituted with their values, if they exist. I think it is reasonable to conclude that the environment used to start the host application does not define the special Delphi-specific variables like $(Platform) and $(Config).
So I think the answer to your question is that you cannot use indirection like this for the host application setting.
On the other hand, environment variables are substituted so perhaps you could use that to make things easier. In other words you could define some environment variables of your own. I've no idea whether that may be of help to you since I don't know the precise details of your problem.
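A hedged sketch of what that could look like (the variable name, the test exe path and the default XE2 install location are assumptions, not from the question):

    rem Define a variable of our own, then start the IDE from the same console
    rem so the Host Application field can reference it as $(TESTHOST).
    set TESTHOST=C:\Build\Win32\Debug\Test.exe
    start "" "%ProgramFiles(x86)%\Embarcadero\RAD Studio\9.0\bin\bds.exe"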

windbg: version of loaded assemblies

Does anybody know how to figure out the assembly versions (not the file versions) of loaded assemblies if I have a full memory dump?
Suppose I have a full dump of the .net process and I found two assemblies with the same name loaded in one AppDomain. I need to know what versions those assemblies have.
The SOS commands !dumpmodule, !dumpassembly and !dumpdomain do not provide that kind of information or I just missed something.
Thank you in advance.
You could try the !SaveModule SOS command. This takes the start address of an assembly and creates a new file (the name of which is given by you) to save the contents of the assembly. You could then use something like .NET Reflector to open the file, and it might give you the .NET version somewhere in there. This SO question has some details on that:
How to find out which version of the .NET Framework an executable needs to run?
As for the !SaveModule command, here's a blog article that describes how to use it:
http://blogs.msdn.com/b/tess/archive/2006/05/18/601002.aspx?PageIndex=2
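A hedged sketch of such a session (the module address and output path are placeholders; on .NET 2.x load SOS with "mscorwks" instead of "clr"):

    0:000> .loadby sos clr
    0:000> !DumpDomain
    0:000> lm
    0:000> !SaveModule 0x06ea0000 c:\dumps\Some.dll

!DumpDomain shows which assemblies each AppDomain has loaded, lm gives you the start (base) address of the module image, and !SaveModule writes that image to disk. Open the saved file in Reflector or ILSpy and read the assembly version from its manifest.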

Why does my program say "folder does not exist" when run on Windows 2008?

We have a Delphi program whose job is like that of a service program. It watches a particular folder for a certain period, and it works great on Windows XP and 2003, but on Windows Server 2008 R2 64-bit, when it tries to create a folder automatically, it shows this message:
The ... folder does not exist. The file may have been moved or deleted.
This message causes the program to halt, which is not good; it should not be interrupted.
What can I do about this?
P.S.: I really don't have any idea whether to post my problem in Stack Overflow or Server Fault, so I've guessed it should be here.
It's likely the VirtualStore, if you're trying to store beneath Program Files (either one). See my writeup:
http://www.clipboardextender.com/off-topic/vista-program-files-hide-and-seek
You've left out the ... folder name. While that's understandable, it wouldn't happen to have anything to do with Program Files (which on x64 is split into 2 directories), would it?
Windows Server 2008 is able to use 'virtual' file paths. That means: 'what you see is not what you get'. Windows Explorer just shows you the 'display' name. Check with cmd.exe whether the path you are trying to use really exists.
The reason is of course File Virtualization (see for example http://msdn.microsoft.com/en-us/library/bb756960.aspx and http://technet.microsoft.com/en-us/magazine/2007.06.uac.aspx).
Because we are on stackoverflow.com and not on serverfault.com, I want to add to all the other answers that you can use the Wow64DisableWow64FsRedirection, Wow64RevertWow64FsRedirection and Wow64EnableWow64FsRedirection functions (see http://msdn.microsoft.com/en-us/library/aa365743.aspx) to control the WOW64 file system redirection in your program. You can find an example of the use of these functions in C# here: http://www.pinvoke.net/default.aspx/kernel32.wow64disablewow64fsredirection.
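For the Delphi side, a hedged sketch of the same idea (the declarations are written by hand because older Delphi versions do not include them in Windows.pas, and the folder in the comment is a placeholder):

    // Requires Windows in the uses clause. These exports only exist in the
    // WOW64 kernel32, so a production version should resolve them with
    // GetProcAddress instead of static imports.
    function Wow64DisableWow64FsRedirection(var OldValue: Pointer): BOOL; stdcall;
      external kernel32 name 'Wow64DisableWow64FsRedirection';
    function Wow64RevertWow64FsRedirection(OldValue: Pointer): BOOL; stdcall;
      external kernel32 name 'Wow64RevertWow64FsRedirection';

    procedure DoWorkWithoutRedirection;
    var
      OldValue: Pointer;
    begin
      if Wow64DisableWow64FsRedirection(OldValue) then
      try
        // File operations here see the real System32, not SysWOW64,
        // e.g. ForceDirectories('C:\Windows\System32\WatchedFolder');
      finally
        Wow64RevertWow64FsRedirection(OldValue);
      end;
    end;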
You'll need to tell us the exact path and how you go about constructing it. It can be as simple as the app not using environment variable expansion but assuming that the user's folders are where they used to be.
Path virtualization (there are 2 kinds, actually) that people mentioned will hit you only if your app is trying to mess with system folders.
A more puzzling problem will hit you if you are not expanding environment variables like APPDATA, LOCALAPPDATA, etc., and are not expecting that there are more of them on Win7 and 2k8. Not only have the default paths of users' files changed, but some of them can also be on network shares - for the same user. So if you were running on the expectation that all the user's stuff will be at fixed paths under, say, %USERPROFILE%, you can get hit by several surprises. Also note %ProgramData%.
The fastest way to find out: open cmd.exe, run set, and if you see some paths that you are constructing in other ways, take note that you need to start expanding environment variables for them. Then open cmd.exe as a 32-bit app and check set again. You can also pick them up via Process Explorer from some running 32-bit or 64-bit app.
Switching your app to a 64-bit build will resolve most of the virtualization issues, but not the env var expansion. Also, if your app touches system folders you need to request elevated execution from the code, or even better create a manifest and declare it there. Then the OS will prompt the user up front if UAC is on, and your app will avoid that second kind of virtualization. BTW, virtualization is controllable via group policy, so it might be present on some boxes and missing on others.
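A minimal Delphi sketch of the env var expansion point (requires Windows and SysUtils in the uses clause; the example path is a placeholder):

    function ExpandEnvVars(const S: string): string;
    var
      Needed: DWORD;
    begin
      // First call reports the required buffer size in characters, incl. #0.
      Needed := ExpandEnvironmentStrings(PChar(S), nil, 0);
      SetLength(Result, Needed);
      ExpandEnvironmentStrings(PChar(S), PChar(Result), Needed);
      SetLength(Result, StrLen(PChar(Result))); // drop the trailing #0
    end;

    // Usage:
    //   WatchedDir := ExpandEnvVars('%LOCALAPPDATA%\MyApp\Spool');
    // instead of assuming a fixed path under C:\Users or C:\Documents and Settings.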

Updating UMDF drivers during development

I am having some trouble updating UMDF drivers using "devcon" during a standard code-deploy-debug cycle. The problem is that "devcon update" isn't really updating anything unless the version number or the date of the DLL file and the INF file has changed from what is stored in the system's driver cache folder. After a maddening series of experiments I've discovered that one way to force the thing to use the latest files is by doing the following:
1. Change the parameters passed to "stampinf.exe" in "makefile.inc" by explicitly setting a version with the "-v" option.
2. Modify the resource script file ("DRIVER_NAME.rc") to first define VER_USE_OTHER_MAJOR_MINOR_VER before including "ntverp.h" and then explicitly define VER_PRODUCTMAJORVERSION and VER_PRODUCTMINORVERSION. You'll note that this system does not allow us to change the build and the revision numbers. On Win7 this seems to be fixed at 7600 and 16385 in "ntverp.h". Is this by design?
So, I first modify "makefile.inc" and set the "-v" option to something like "1.1.7600.16385", manually incrementing the minor version for every single build, and then modify the RC file and update VER_PRODUCTMINORVERSION with the same number.
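For reference, a hedged sketch of the kind of stampinf invocation meant here (the INF file name and the UMDF version value are placeholders):

    stampinf -f myumdfdriver.inf -d * -v 1.1.7600.16386 -u 1.9.0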
Alternatively, if I run a command prompt under the SYSTEM account and go and delete the driver cache folder in "C:\windows\system32\DriverStore\FileRepository\DRIVER FOLDER" before running "devcon", then that works too.
Now, I am thinking I am missing something fairly basic here, as this seems to be a rather painful way of doing it. Please help! Thanks!
Why can't you just unplug the device and replace the unloaded DLL? You shouldn't need to reinstall the driver, just replace the module. Note that you shouldn't do this during production or anything that has to do with customers, but if you're writing a driver, just slam in the new module with the same version number.
On Win7 this seems to be fixed at 7600 and 16385 in "ntverp.h". Is this by design?
Yep, at least until the next service pack
As Paul Betts has suggested above, the way to go seems to be to simply replace the UMDF DLL directly in the driver folder (e.g. c:\windows\system32\drivers\umdf\) after disabling the device, either in Device Manager or using "devcon". I'd asked this question on Microsoft's device drivers newsgroup before posting here but hadn't got a satisfactory response - but some folks ended up responding there after I posted here! So I'll put up a link to that post as well:
http://bit.ly/6PDxKT
