SSIS throwing an error in a script task before the first line of code is reached - ssis-2012

SSIS throws an error in a script task before the first line of code is reached, similar to "How can I reference a dll in the GAC from Visual Studio?". I did not write this package; I am just looking at it to see how it works. Is there a rule of thumb for things to look for? It uses a third-party library that the original code referenced from the GAC, but I am not sure I installed this third-party library into the GAC correctly, so as a first attempt I referenced the DLL directly from the path it was installed to. Like other posts I have read, the code works fine if I isolate it into a console app and run it.
Things I have tried so far that work:
No reference no code other than return success
With the local reference, no code other than return success
Things I have tried that do not work before I GAC'd:
Kept the try/catch but with one line of code in the try: var session = new Session(); possibly the 3rd-party library is not the same framework version?
Code base: it does an SFTP call to send a file to a folder on the destination server. It never reaches the breakpoint if I don't install the library in the GAC.
Message: Exception has been thrown by the target of an invocation.
Error: at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams)
at Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTATaskScriptingEngine.ExecuteScript()
Things I have tried that do not work after I GAC'd:
I navigated to the v8.0A bin folder and installed my library: opened cmd as administrator, cd'd to the folder containing the assembly, and ran gacutil /i "C:\Program Files (x86)\WinSCP\WinSCPNet.dll"; it reported the assembly was installed successfully. I then tried referencing the library both by that same path and by the assembly path C:\Windows\Microsoft.Net\assembly\WinSCP\WinSCPNet.dll, with CopyLocal = False. It works fine when debugging locally, but once I deploy the SSIS solution I get the same annoying error: "... Target of an invocation ...".
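For reference, here is a minimal sketch of the kind of script task body described above. It assumes the WinSCP .NET assembly (WinSCPnet) is the 3rd-party library in question and that this sits inside the generated ScriptMain class; the host name, credentials and paths are placeholders, not values from the original package:

using System;
using WinSCP;   // 3rd-party WinSCPnet assembly; must be resolvable (GAC) at run time

public void Main()
{
    try
    {
        // This constructor is the "one line of code in the try" mentioned above;
        // the "target of an invocation" error fires before it runs if the
        // assembly cannot be resolved.
        using (var session = new Session())
        {
            var options = new SessionOptions
            {
                Protocol = Protocol.Sftp,
                HostName = "destination.example.com",            // placeholder
                UserName = "user",                               // placeholder
                Password = "password",                           // placeholder
                SshHostKeyFingerprint = "ssh-rsa 2048 xx:xx:..." // placeholder
            };
            session.Open(options);
            session.PutFiles(@"C:\outbox\file.txt", "/inbox/").Check();
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception)
    {
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}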

If it helps anyone with a similar problem when using a 3rd-party library in an SSIS package, the lessons learnt are:
SSIS requires the library to be GAC'd.
I can only speak for the modest .NET developers amongst us; even I was a bit rusty on the basic concepts of how to reference a library and use early binding.
Use CopyLocal = False in SSIS.
Make sure the version of the library you reference is the GAC'd version; if it is not in the main GAC, reference it by path, looking under C:\Windows\Microsoft.Net\... if you cannot find it anywhere else.
Note that "same version" covers both the product version and the file version; I have not experimented with every combination. I initially thought matching the product version would be good enough, then I thought it would just know to choose the latest version, which is not the case either. I only recall getting it to work when both product and file versions matched and, as I said, I chose the GAC'd copy.
Note that when you upgrade, or copy a library from another machine and install it into the GAC yourself (i.e. it was not GAC'd as part of the main product install), be aware that .NET allows multiple versions of the same library to exist side by side, each referenced by different applications using their own version. So the version you reference must exist on the machine you deploy your SSIS solution to. (NB: it may even stop working when running locally in VS after a change, because the library needs another supporting file in the same directory that is only present on the server.) Always read the documentation, if there is any, whenever there are discrepancies between what is installed on the server, on your developer machine, and on an ordinary user's machine that knows nothing about automation; libraries may not be automatically GAC'd during the product installation, in which case you have to handle it yourself.
It is not a bad idea to reference the library in, say, a console application so you can experiment with it; you don't need it GAC'd to play around with it there.
Never assume that a 3rd-party library was written by a technical person or seasoned IT professional, or that the person writing the documentation is a practised consumer of their own (or their employer's) library.
Do not assume the person in your company who purchased it knows everything or has read up on all the features and common issues.
Make sure you are aware of the frequency of updates, and keep one eye open for upgrading when practical.
Make sure the developer consuming the library has the nous to try things: if it doesn't work one way, try another; read the documentation even if it is not straightforward at first; persevere, and get to know what the SDK contains in terms of classes/methods/properties etc. For example, the most basic things I would expect when doing SFTP are handling multiple files, subdirectories and file masks.
As an aside, I tried at least six or more approaches until I got it to work how I wanted: copy multiple files using a file mask from the specified folder only, excluding sub-folders. The default behaviour of the only code example I could initially see in the documentation (I later discovered a forum where similar issues had been logged, but found it easier to just experiment myself) was to automatically create sub-folders, i.e. a copy of the source layout on the destination, even if no file was applicable to transfer from them. After further reading and experimentation there was no simple flag to set; the trick turned out to be the choice of method and the format of the file mask, as opposed to the wrong method, the wrong place to set the file mask on a parameter object, and the wrong mask. A rough sketch follows below.
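Here is a rough sketch of the kind of call that implements the behaviour just described, assuming the WinSCP .NET assembly and an already-opened Session; this is a reconstruction rather than the original code, and the paths and mask are illustrative only:

// Transfer only files matching a mask from the source folder itself;
// the "| */" part of the mask excludes all subdirectories, so the
// source folder layout is not recreated on the destination.
var transferOptions = new TransferOptions
{
    TransferMode = TransferMode.Binary,
    FileMask = "*.csv | */"
};

var result = session.PutFiles(@"C:\outbox\*", "/inbox/", false, transferOptions);
result.Check();   // throws if any individual transfer failed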

Related

Delphi consistently building DLLs to the wrong directory

I've got a project group built in Delphi XE2 with 3 projects that always build to the wrong folder for one option set. (I've got 4 configurations under Release and Debug, one for our software configuration and one for FastMM, and it's only the debug one that I want to use for debugging that always goes into the wrong folder.) Compiling the project even says it's building to the correct folder, but the DLL always winds up in a different one, which I only used once when I was unit testing the code outside of the main project.
I've deleted every associated file (.identcache, .res, .tvsproj, whatever that was) and nothing changed. One very strange thing I noticed: I copied one of the projects to configure the second one, and it mimics the behaviour of the one it was copied from, even though I never unit tested that one, so it never had that output path configured for it.
Obviously this makes it pretty annoying to debug; I have to copy files into the correct folder just to do that. (I was kind of astonished when it actually worked, because I thought Delphi might expect to find the files in its output path, but oh well, those things are magic.)
Let me know if I can post anything to help; I don't really know what's necessary. I checked the registry for the output path it is getting built to and found nothing I thought was of any consequence (nothing related to these projects).
One thing I did notice: because I copied the original project into another project (they're plugins to the same part of the main program), it has the same GUID, and when I try using it in the "Build Group" it automatically selects both projects. That's one mystery solved, but it is probably a red herring?
OK, so as usually happens, after 3 years of suffering with this, when I finally ask the question I'm led straight to the answer: it appears as if RAD Studio is lying to us. The configuration dialog showed one output path, but the dproj had something different in it.
There were two conditions for Cfg_3 and only the last one showed up in RAD Studio, yet for some odd reason the build path was taken from the first one (even though it is specified in both). So removing the wrong one (the first one) fixed the problem, and things are now building to the correct folder.
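To illustrate (a hypothetical reconstruction, not the actual project file; the output paths are made up), the dproj ends up with two property groups keyed on the same Cfg_3 condition, each with its own output directory, and only one of them is shown by the IDE:

<!-- Hypothetical illustration of the duplicated Cfg_3 condition in the .dproj -->
<PropertyGroup Condition="'$(Cfg_3)'!=''">
    <!-- stale entry left over from the imported option set; this path "won" -->
    <DCC_ExeOutput>..\UnitTests\Output</DCC_ExeOutput>
</PropertyGroup>
<PropertyGroup Condition="'$(Cfg_3)'!=''">
    <!-- the entry RAD Studio actually displays in Project Options -->
    <DCC_ExeOutput>..\MainProgram\Plugins</DCC_ExeOutput>
</PropertyGroup>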
I had imported the Utils option set when I was testing the library, but when I incorporated the project into the main program I removed it. Somehow it didn't find its way completely out of the dproj, and I guess (not sure why) the other library got messed up because it shared a GUID.

TFS Build custom activity requiring more assemblies than needed

I've just written the first version of a workflow activity that will run Resharper's Code Issues on the projects and parse the output to display the issues as build warnings and errors.
At first, I was going to just call ReSharper's command line and parse the resulting XML manually. After fiddling with the DLLs in ReSharper's SDK (through disassembly mostly), I found a way to parse the results using its own public classes, which I figured was a much more elegant and safe way to do this.
The first problem I have is that the NuGet package is absolutely huge. There are 140 MB of files in there, which to me is absurd for a single, unpartitioned package. There seems to be such heavy coupling between them that, by using just a few model classes and the parser class, I have to drag a dozen or so of those DLLs along, some of which seemingly have nothing to do with the main DLLs I need. This is not a show stopper though; I'm struggling with something else now:
In the end, I managed to track down the dependencies I needed to 41 assemblies (which is, again, insane, but alas). Initially, I tried removing everything and adding the missing references one by one, but this turned out to be unreliable, still missing some indirect references, even after compiling successfully. Then, I decided to code a small console application to find all referenced assemblies in the main Resharper assemblies I used, which gave me the 41 references I mentioned. This is the code I used to find every dependency.
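That snippet isn't included in the post; a minimal sketch of that kind of dependency walker (my reconstruction, with a placeholder path for the root ReSharper assembly) could look like this:

// Recursively collect every assembly referenced by a set of root assemblies.
using System;
using System.Collections.Generic;
using System.Reflection;

class FindDependencies
{
    static void Main()
    {
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var pending = new Queue<Assembly>();

        // Placeholder path: the main ReSharper SDK assembly the activity uses.
        pending.Enqueue(Assembly.LoadFrom(@"C:\ReSharperSDK\SomeRootAssembly.dll"));

        while (pending.Count > 0)
        {
            foreach (AssemblyName reference in pending.Dequeue().GetReferencedAssemblies())
            {
                if (!seen.Add(reference.FullName))
                    continue;   // already visited
                try
                {
                    // Load the reference so its own references can be walked too.
                    pending.Enqueue(Assembly.Load(reference));
                }
                catch (Exception)
                {
                    // Unresolvable here (e.g. framework-only): keep the name, don't recurse.
                }
            }
        }

        foreach (string name in seen)
            Console.WriteLine(name);
    }
}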
Since these are custom activities we are talking about, I decided to create a unit test project to validate them. Using these 41 references only, everything works correctly.
When I added the activity to the build workflow though, and pointed the build controller to the source control folder containing the required assemblies, every time I schedule a build, the process fails stating that I need one extra dll from Resharper's SDK. For example, this is the first one it asks:
Could not load file or assembly 'AsyncBridge.Net35, PublicKeyToken=b3b1c0202c0d6a87' or one of its dependencies. The system cannot find the file specified. (type FileNotFoundException)
When I add this specific assembly to the TFS folder, I get another similar error for another dll, and this keeps going on and on.
What I wanted to know is: how can I know exactly which assemblies a workflow XAML will need in order to run correctly? My custom activity DLL has two specific CodeActivities and a XAML-only activity that uses these two. This XAML activity is what I'm using directly in the modified workflow template.
I see that besides the references in my project, the XAML activity also contains a TextExpression.ReferencesForImplementation section, with some assembly names. I've run my dependency finder program on those dependencies too, and the results are the same 41 assemblies already at the TFS folder.
Meanwhile I'll go with having the whole SDK in the custom assemblies folder, but I would really like to avoid this in the future since it contains such an enormous amount of unneeded and large DLLs.
First, we had a request for our command-line tool to support a workflow activity, and we decided to implement just a plain MSBuild task, which is universal and works in TFS too. Task and targets files are included in ReSharper CLT 8.2.
Second, if you still want to implement a workflow activity, it's pretty easy to do with the new API in the CLT, designed specifically for custom processing of found issues: http://confluence.jetbrains.com/display/NETCOM/Custom+InspectCode+Issue+Logger.
And last, but not least, you do not need to put the binaries of the ReSharper SDK package in VCS.
Use NuGet's package restore functionality.
If you have any other questions I'll be glad to answer them.
A custom activity is loaded and run by the .NET CLR like any other .NET program. If the stack trace reports a missing file, then it's required by the CLR, and you can't change that fact without refactoring your code.
Having an entire SDK's worth of references in the custom assembly folder doesn't make sense. I would prefer GAC deployment over a huge binaries folder in source control. Or maybe consider running these activities as pre/post-build scripts in MSBuild or PowerShell.

Dynamically resolving assemblies without the file name

Yes, I've read the warning label, and I know that dynamically loading assemblies is somewhat discouraged. That said, I have an application that loads assemblies - that's just how it works. It works fine on Windows. Works fine on Windows CE. I need it to "work fine" on Android, even if it takes some massaging.
Basically the app is an engine that loads plug-in DLLs (we'll call them Adapters) that meet specific interfaces at run time. Under Windows, it even detects the appearance of a DLL at any point and goes and loads it; I'm fine if that's not going to work under Android.
What I'm having trouble getting working is having the Engine load an Adapter that it knew about at design/compile time but without hard coding the name of that Adapter into the Engine code. I'm fine with adding a reference to the Adapter to get it to not get linked out, but I really, really don't want to have to add in the DLL name every time, as the DLLs change with different deployments, and that would lead to a huge headache.
So I figured that if it's referenced, it would get into the APK, and I could use reflection to load it like this:
var asm = Assembly.Load("TheAdapterName.dll");
Initial tests show that this works for the Adapter if I just hard code in the name, but again, I really, really want to avoid that.
So I thought that maybe I could reflect through the references and extract the name, but oddly, not all references actually show up when I do that. So I do this:
var refs = asm.GetReferencedAssemblies().Select(a => a.Name).ToArray();
And I get back an array of 14 assembly names. But the assembly (asm) has 16 references, one of which is the Adapter plug-in I need to load. The Adapter is definitely there - heck I used Assembly.Load with the full name two lines above and it resolved.
I thought, ok, maybe I can figure out the "path" to the folder from which I'm running, and then look for DLLs there and load that way. Ha. After several hours of trying to figure out a way to get the path that would work under Debug and Release, I came up with nothing but more grey hair.
Sooooo... any thoughts on how I might get the name of a DLL that I know is in my APK, but that I don't "know" the name of at build time? (I'm loading them and looking for interfaces via reflection to detect their "Adapterness".)
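For context, the "Adapterness" check described above is just a reflection scan over the loaded assembly; a minimal sketch, where IAdapter stands in for whatever the real plug-in interface is called:

using System;
using System.Linq;
using System.Reflection;

// Same call as above; works once the assembly name is known.
var asm = Assembly.Load("TheAdapterName.dll");

// Find concrete types implementing the plug-in interface (IAdapter is a placeholder).
var adapterTypes = asm.GetTypes()
    .Where(t => t.IsClass && !t.IsAbstract && typeof(IAdapter).IsAssignableFrom(t));

foreach (var t in adapterTypes)
{
    var adapter = (IAdapter)Activator.CreateInstance(t);
    // hand the adapter over to the engine here
}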
If those methods aren't working for you, then the only suggestion I can think of is to add a prebuild step which updates either a C# or an Assets file in order to provide the list you need.
Obviously this is extra work, but should be fully automated and is guaranteed to work no matter what platform changes get thrown at you.
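A sketch of what that prebuild-generated file might look like; the class layout and assembly names are entirely hypothetical, and the prebuild step would rewrite the array whenever the set of deployed Adapters changes:

// KnownAdapters.cs: hypothetical file (re)generated by the prebuild step.
static class KnownAdapters
{
    // One entry per Adapter assembly bundled into the APK.
    public static readonly string[] AssemblyNames =
    {
        "Acme.Adapter.Sample1",   // hypothetical names
        "Acme.Adapter.Sample2",
    };
}

// In the engine: load each known assembly by name and scan it as usual.
foreach (var name in KnownAdapters.AssemblyNames)
{
    var asm = System.Reflection.Assembly.Load(name);
    // ...reflect over asm for the adapter interface, as in the question...
}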
As an aside, I also just looked at one of my mvx projects using Reflector: it shows the same asm.GetReferencedAssemblies() list as your investigation reports; runtime-loaded plugins are not listed. I guess the GetReferencedAssemblies method reports only assemblies actually used to import type references at the IL level, so if you reference an assembly in the csproj but don't import any of its types, it doesn't get listed as a reference in the compiled code.

microsoft teamfoundation GAC assembly, same version/key but different content

I ran into a situation where two machines both had "microsoft.teamfoundation.testmanagement.client.dll" in the GAC with the same version and public key. They differed in the content they contained though. The newer one had additional classes (e.g. BuildCoverage). Why would the content of the dll change while the version and public key stay the same? Is this common practice?
I don't know if it's common practice, but there are times when the AssemblyVersion (used for strong naming) isn't incremented during an "in-place" minor update to a GAC'd assembly to hot-fix a bug. Check the actual file version by navigating to the file via the command prompt and then checking its properties. See if there's a difference between the files there; that should indicate whether the actual build number differs between them. I bet a small VS hotfix has been applied to one and not the other.
It's very normal to update the AssemblyFileVersion but not the AssemblyVersion attribute. This is how hotfixes for .NET get shipped, for example. The key is to test the hell out of the assembly to make sure it's completely backwards compatible, to prevent DLL hell.
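One way to see the difference both answers describe: the strong-name AssemblyVersion is part of the assembly identity, while the file version lives in the Win32 version resource. A small hedged sketch (the two paths are placeholders for wherever the copies sit on each machine):

using System;
using System.Diagnostics;
using System.Reflection;

class CompareVersions
{
    static void Main()
    {
        var paths = new[]
        {
            @"C:\machineA\microsoft.teamfoundation.testmanagement.client.dll",  // placeholder
            @"C:\machineB\microsoft.teamfoundation.testmanagement.client.dll",  // placeholder
        };

        foreach (var path in paths)
        {
            // Reads the assembly identity (including AssemblyVersion) without executing it.
            Version assemblyVersion = AssemblyName.GetAssemblyName(path).Version;
            // Reads the Win32 file version resource (AssemblyFileVersion).
            string fileVersion = FileVersionInfo.GetVersionInfo(path).FileVersion;

            Console.WriteLine("{0}: AssemblyVersion={1}, FileVersion={2}",
                              path, assemblyVersion, fileVersion);
        }
    }
}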

Copy Delphi Profile

My computer crashed recently. We have a Delphi app that takes a lot of work to get running.
One of my co-workers has it all installed still. Is there a way to copy the stuff stored in the palette? And the library paths?
I am using Delphi 5 (I know it is very very very old)
That information is stored in the Registry. I don't know exactly how Delphi 5 does it, but try looking for a key called HKEY_CURRENT_USER\Software\Borland\Delphi\5 or something like that. You'll find all the registration information under that key, including a list of installed packages. You can export the keys to a registry file, copy it to the new computer and import it.
Standard disclaimer: Mucking around in the registry manually can be risky if you don't know what you're doing. Be very careful, and if this solution causes your computer to crash, your house to burn down, or demons to come flying out your nose, it's not my fault.
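For example, assuming the key name the later answers mention (adjust it to whatever your installation actually uses):

rem On the machine that still has Delphi configured (run as that user):
reg export "HKCU\Software\Borland\Delphi\5.0" delphi5-settings.reg

rem On the new machine, as the target user, after a standard Delphi 5 install:
reg import delphi5-settings.reg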
Try CNWizards, which has export functionality for your IDE settings. You can use the same tool to restore them on the new machine. We use it to get the same settings on every development machine; that way we can ensure that all builds are the same, regardless of who built them.
Based on my experience of having done this a few times(!), the most important registry keys are:
HKEY_CURRENT_USER\Software\Borland\Delphi\5.0\Known Packages
HKEY_CURRENT_USER\Software\Borland\Delphi\5.0\Library
and possibly
HKEY_CURRENT_USER\Software\Borland\Delphi\5.0\Known IDE Packages
and maybe
HKEY_CURRENT_USER\Software\Borland\Delphi\5.0\Palette
HKEY_CURRENT_USER\Software\Borland\Delphi\5.0\Palette Defaults
So long as you have done a standard D5 installation first.
It's easier and more reliable to let the IDE fill in the other bits as you start using it and change options as appropriate. Some component packages, e.g. madExcept, DevExpress, etc., are often best re-installed using their own installers anyway.
Unless you're going to have multiple users using Delphi on the same machine, the HKLM stuff isn't really all that important, I don't think.
As a related aside, I have learned that a good way to handle this is to build a FinalBuilder script (or similar) to set up my Delphi environment each time I decide to use a new machine/installation. I copy/download/checkout all package source (which can be done in FB too), then use FB to compile it, copy it, create directories, and fill in the appropriate registry keys, etc. I always get a consistent environment, and it makes it much easier to rebuild individual components or packages as and when they get upgraded. The items can also be put into the script in 'dependency order' so that you know to re-compile a dependent package if something else changes. I now have a single FB script that builds D5, D2007, D2009 and D2010 environments and packages for all my main components, depending on which compiler(s) I'm interested in, which I indicate with a simple variable. Well worth it.
Seems to have just worked for me on Win 7 SP1 and Delphi 5:
Logged in as the user with Delphi and 3rd-party components installed.
Did a registry export of HKEY_CURRENT_USER\Software\Borland (no other Borland products installed, so I selected Borland rather than Borland\Delphi\5.0).
Logged into the PC as the new user.
Did not start Delphi 5 (i.e. it had never been started for this user).
In Regedit: File, Import.
Started Delphi: all components, including lots of 3rd-party ones, present.
Project compiled as expected under the new user.
