Why is the WScript object not known to my script that's controlled by a custom IScriptControl? - wsh

I am using someone else's library that provides its own scripting host instance, it appears.
This lib provides functions to choose the scripting language, such as "jscript" or "vbscript", and I can supply it with script code and have that executed, passing arguments in and getting results back. So, basically, it works.
However, when I try to access the "WScript" object, I get an exception saying that this keyword is undefined.
The developer, not knowing much about this either (he only made this lib for me because I do not want to deal with Windows SDKs right now), told me that he is using "IScriptControl" for this.
Oh, and the lib also provides flags to allow "only safe subset" and "allow UI", which I set to false and true, respectively.
Does that ring a bell with anyone? Does a user of IScriptControl have to take extra steps in order to make a WScript object available? Or can he use IScriptControl in a way that it is supplied automatically, just as when running the same script from wscript.exe?
Basically, all I need is the WScript.CreateObject function in order to access another app's API via COM.

I don't know the library, but I suspect WScript is not known because the script host doesn't provide it: the WScript object is supplied by the Windows Script Host executables (wscript.exe/cscript.exe), not by the script engine itself, so a host built on the Script Control won't have it.
If you are using JScript, you can create a COM object with new ActiveXObject(). If you are using VBScript, you can just use CreateObject.
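For example, a minimal sketch to run through the script control ("Some.Application" is a placeholder ProgID; substitute the ProgID of the app whose COM API you need):
// JScript: create the COM object directly, without WScript.CreateObject
var app = new ActiveXObject("Some.Application");
// VBScript equivalent:  Set app = CreateObject("Some.Application")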
See this article for some background.

Generate dependencies for Lua?

I have a lua project with lua files specified in multiple directories all under the same root folder with some dependencies.
Occasionally I run into issues where, when a table is being constructed at load time, I get a nil exception because it references a not-yet-initialised table, like:
Customer =
{
Type = CustomerTypes.Friendly
}
Which causes a nil exception for CustomerTypes as CustomerTypes.lua has not yet loaded.
My current solution is to simply have a global function call in these lua files to load the dependency scripts.
What I would like to do is pre-process my lua files to find all dependencies and at run time load them in that order without needing function calls or special syntax in my lua files to determine this (i.e. the pre-processor will procedurally work out dependencies).
Is this something which can be realistically achieved? Are there other solutions out there? (I've come across some but not sure if they're worth pursuing).
As usual with lua there are about 230891239122 ways to solve this. I'll name 3 off the top of my head but I bet I could illustrate at least 101 of them and publish a coffee table book.
First of all, it must be said that the notion of 'dependencies' here is strictly up to your application. Lua has no sense of it. So this isn't anything like overcoming a deficiency in lua, it's simply you creating a scripting environment in your application that makes you comfortable, and that's what lua's all about.
Now, it seems to me you've jumped to the conclusion that preprocessing is required to solve the given problem. I don't think that's warranted. I feel somewhat comfortable saying a more conventional approach would be to set an __index metamethod on the global table which handles the "CustomerTypes doesn't exist yet" situation by consulting a list of scripts scanned out of the filesystem initially, finding one called CustomerTypes.lua, and running it.
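A minimal sketch of that idea, assuming all scripts live in a single scripts/ directory and each global table is defined in a file of the same name (both assumptions for illustration):
-- Resolve a missing global by loading scripts/<name>.lua on demand.
local scripts_dir = "scripts/"
setmetatable(_G, {
    __index = function(t, name)
        local chunk = loadfile(scripts_dir .. name .. ".lua")
        if chunk then
            chunk()                 -- the script is expected to define the global
            return rawget(t, name)
        end
    end
})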
But maybe you have some good reason for wanting it done strictly as preprocessing. In your case, I would start by considering a 'dependency' to be any name that matches a script found in your scripts filesystem. Then scan each script for the names of dependencies using the list you just built, and prepend a load(dependency) command to each of those scripts.
Since the concept of "runtime" or "preprocessing" is somewhat ambiguous in this context, you might mean at script-compile-time. You could use the LuaMacros token filters system to effect a macro which replaces CustomerTypes with require("CustomerTypes.lua") or something to that effect after having discovered that CustomerTypes is a legal dependency name.

Progress ABL: How to Test for WEBSPEED in the PRE-PROCESSOR

I want to conditionally compile some blocks of code depending on the type of client I'm running in. This is fine for batch and TTY, as I can use {&BATCH-MODE}, but how do I test for when the code is being compiled in a WebSpeed agent? E.g.:
{&IF} not {&SOMETHING} EQ "YES" {&THEN}
{&ANALYSE-SUSPEND}
foo
bar
{&ANALYSE-RESUME}
{&ENDIF}
It would be helpful if this did not rely on defines auto-generated by the architect in .w files etc., but that is a nice-to-have, not essential.
Compile time isn't run time. If the program can be run in different ways (as part of a webpage using WebSpeed, as part of a batch job, as part of some other kind of client, etc.), you're most likely better off evaluating this at run time instead.
You can identify in what environment you're running:
SESSION:CLIENT-TYPE
This will identify your type of client.
DISPLAY SESSION:CLIENT-TYPE.
Type of client Attribute value
-------------------------------- -----------------------
ProVision standard ABL client 4GLCLIENT
WebClient WEBCLIENT
AppServer agent APPSERVER
WebSpeed agent WEBSPEED
Pacific Application Server agent MULTI-SESSION-AGENT
Other special-purpose clients Unknown value (?)
Documentation
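For example, a minimal run-time branch (the procedure names are placeholders, not from the original post):
IF SESSION:CLIENT-TYPE = "WEBSPEED" THEN
    RUN output-web.p.    /* WebSpeed-only behaviour */
ELSE
    RUN output-tty.p.    /* everything else */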
Using VST
If you have at least one database connected
_Connect-ClientType tells you what kind of client this particular connection is:
Value Client
-------- ---------------------
ABL ABL client
SQLC SQL client
WTA Webspeed agent
APSV AppServer agent
SQFC SQL Federated client
Example:
FIND FIRST _myconnection NO-LOCK.
FIND FIRST _connect NO-LOCK WHERE _connect._connect-usr = _myconnection._MyConn-userid.
DISPLAY _connect._Connect-ClientType.
Based on OS
Perhaps you run different OS:es?
DISPLAY OPSYS.
Other ways
There's a number of other ways of doing this, including perhaps looking at PROPATH, Working directory etc.
Try to stick with a solution that won't change over the course of time because of Progress upgrades, new OS:es, new directory structures etc.
IMHO there is no such preprocessor variable out of the box.
But you could create your own include file and include that in the code that's relevant. You need two versions of that file, one says
&GLOBAL-DEFINE WebSpeed WebSpeed
and the other
&GLOBAL-DEFINE NoWebSpeed NoWebSpeed
And then configure your compile sessions so that they find exactly one of the files in propath.
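A sketch of how that would be used (the include file name webspeed.i is an assumption for illustration):
/* webspeed.i - the copy found by WebSpeed compile sessions */
&GLOBAL-DEFINE WebSpeed WebSpeed

/* in the relevant source file */
{webspeed.i}
&IF DEFINED(WebSpeed) <> 0 &THEN
    /* WebSpeed-only code */
&ELSE
    /* batch / TTY / other clients */
&ENDIF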
But as you will agree, this is probably dangerous as the result will heavily rely on the proper PROPATH used during compilation. I'd rather attempt to use a runtime condition instead.
What are you trying to achieve in detail?
Finally figured it out this morning: {&webstream} and {&out} are not defined in normal sessions, so I can just test for that. Runtime is not an issue in my case; I just want the code to compile in all cases. In this shop (don't ask me why) every single piece of code is session-compiled. Poor CPU, but there you go. I could be defensive and add some logic with SESSION:CLIENT-TYPE for bells and whistles, you're right. If not can-do then boogie :)
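A sketch of that test (assuming {&webstream} is defined by the WebSpeed wrapper includes, as described above):
&IF DEFINED(WEBSTREAM) <> 0 &THEN
    /* compiled by a WebSpeed agent */
&ELSE
    /* batch, TTY or other clients */
&ENDIF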

Adobe DTM header for a library download

Background: My company is starting a proof of concept for Adobe DTM and I am starting to familiarize myself with it. We have many different domains and many different internal sections that all may want to use different libraries on different pages. We are using the library download setup in DTM. I've watched a lot of the tutorials on the tool and have read all of the documentation that I could find on the headers.
Issue: I believe that the libraries are all created with the DTM tool. For instance, if we create a rule, or add 3rd party javascript to DTM, then that would be placed in the library. Because of the way that the team has generally thought about js libraries before, where we upload them ourselves, most of the team believes that we can physically place the 3rd party js libraries in the location designated by the header and that we can reference them with an include() call in the Javascript/Third Party Tags section of a rule. I don't believe this is possible. Is there anyone who can shed some light on this?
Thanks for your time,
Mike
(I already answered this at the Adobe Forums, but I thought I would include the reply here for others looking at stack exchange)
I could be wrong in my assumptions, but I have always understood this method as a way to simply host DTM functionality on your own servers for downtime/uptime/SLA reasons. :) Meaning, you would want to go with this option simply because you need to ensure that DTM embed urls/scripts never ever go down and that they are lightning fast and never give you issues. :) You would then use the script loading capabilities by configuring the DTM UI to load the 3rd party scripts or custom built scripts through rules. You would load them either on pageLoad top or bottom, domReady, or onLoad. There is more documentation on this option here and some reasons why you would use that option:
http://microsite.omniture.com/t2/help/en_US/dtm/hosting.html
http://microsite.omniture.com/t2/help/en_US/dtm/deployment_download.html
However, you can also include these scripts just like you would with any other javascript reference, as you mentioned above. The trick would be just figuring out the url to include as your src attribute. DTM itself has an API that you can use to load scripts, and it also includes a "settings" property and "configurationSettings" property that you can use to find a lot of those scripts that you are interested in loading. See all _satellite object documentation here:
http://microsite.omniture.com/t2/help/en_US/dtm/object_reference.pdf
In more detail, you could do something like this to get your script path dynamically after DTM embed scripts have loaded:
var scriptSrc = "//domainOfHost.com/" + _satellite.settings.scriptDir + "scriptSrc.js";
Then you could use this function on the _satellite object to load the script you are interested in:
_satellite.loadScript: function (url, callback)
PARAMETERS:
url: the URL of the script
callback (optional): the function to be called after the script has loaded.
DESCRIPTION: Load an external script.
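Putting the two pieces together, a minimal sketch (the domain and script name are the placeholders from the snippet above):
// Build the script URL from DTM's settings, then load it via _satellite.
var scriptSrc = "//domainOfHost.com/" + _satellite.settings.scriptDir + "scriptSrc.js";
_satellite.loadScript(scriptSrc, function () {
    // runs after the script has loaded
});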
Thanks,
Ben

Auto-Detect Application "Type"

I build four different "types" of applications with my framework:
1) Windows Services
2) Normal Applications
3) Service Applications (a normal application with the functionality of a Windows Service but with a local GUI console and an ability to auto-upgrade)
4) Remote GUI Consoles
Now I can detect, through code, if the application is a Windows Service. But currently, to distinguish between the others, I use DEFINES that need to be added to the project file. I would like to find an alternative way that does not rely on DEFINES if possible. My initial thought is to use the Comments field of the project's version info.
Any ideas?
Edit: I am after a general technique that works regardless of how I "type" my applications. At the moment I use DEFINES from the project configuration, which works, but it makes the code slightly messier than plain "if" switches in code, and because it is stored in the .dproj file it can be hidden from view.
Solution: Following David's suggestion, I initially used the conditional defines (plus any other information, such as whether the application was running as a Windows Service) to map every application to one of the 4 application types, stored in a globally accessible object. Except where conditional compilation was needed to avoid linking files that made no sense for a particular application type, I replaced almost all of my conditional compilation flags with code, which significantly improved the readability of the code. There are a few other "tweaks" I implemented, but that was the basic implementation.
Depending on how you are using the Application global variable, you can detect whether your application is a service, a VCL application or a console app by checking the type of this global variable. For console apps you can use the System.IsConsole variable.
function ApplicationIsService(Component: TComponent): Boolean;
begin
  Result := Component.ClassName = 'TServiceApplication';
end;

function ApplicationIsVcl(Component: TComponent): Boolean;
begin
  Result := Component.ClassName = 'TApplication';
end;
and you can use them like this:
if ApplicationIsVcl(Application) then
  // do something
else if ApplicationIsService(Application) then
  // do something else
else if IsConsole then
  // do another thing
It sounds like each project has a single app type so it seems logical to differentiate in either the .dpr file or the .dproj files.
Call a function to set a private global variable from the .dpr file.
Or use a conditional define in the .dproj as you do now.
If it were me, I'd stick with a conditional define, but use the trick of converting it into a Delphi enum via a shared helper function to make the code read better.
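A minimal sketch of that trick; the define names (SERVICE_APP, REMOTE_CONSOLE) and the enum are assumptions for illustration, not from the original project:
type
  TAppType = (atWindowsService, atNormalApp, atServiceApp, atRemoteConsole);

function GetAppType: TAppType;
begin
  // Map the per-project conditional defines onto one enum in a shared unit.
  {$IFDEF SERVICE_APP}
  Result := atServiceApp;
  {$ELSE}
    {$IFDEF REMOTE_CONSOLE}
    Result := atRemoteConsole;
    {$ELSE}
    Result := atNormalApp;  // Windows Service detection can stay code-based
    {$ENDIF}
  {$ENDIF}
end;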

Delphi LoadLibrary failing to find a DLL in another directory - any good options?

Two Delphi programs need to load foo.dll, which contains some code that injects a client-auth certificate into a SOAP request. foo.dll resides in c:\fooapp\foo.dll and is normally loaded by c:\fooapp\foo.exe. That works fine. The other program needs the same functionality, but it resides in c:\program files\unwantedstepchild\sadapp.exe. Both apps load the DLL with this code:
FOOLib := LoadLibrary('foo.dll');
...
if FOOLib <> 0 then
begin
  FOOProc := GetProcAddress(FOOLib, 'xInjectCert');
  FOOProc(myHttpRequest, Data, CertName);
end;
It works great for foo.exe, as the DLL is right there. sadapp.exe fails to load the library, so FOOLib is 0, and the rest never gets called. The sadapp.exe program therefore silently fails to inject the cert, and when we test against production the cert is missing, so the connection fails. Obviously, we should have fully qualified the path to the DLL. Without going into a lot of detail, there were aspects of the testing that masked this problem until recently, and now it's basically too late to fix in code, as that would require a full regression test, and there isn't time for that.
Since we've painted ourselves into a corner, I need to know if there are any options that I've overlooked. While we can't change the code (for this release), we CAN tweak the installer. I've found that placing c:\fooapp into the path works. As does adding a second copy of foo.dll directly into c:\program files\unwantedstepchild.
c:\fooapp\foo.exe will always be running while sadapp.exe is running, so I was hoping that Windows would find it that way, but apparently not. Is there a way to tell Windows that I really want that same DLL? Maybe a manifest or something? This is the sort of "magic bullet" that I'm looking for.
I know I can:
Modify the Windows path, probably in the installer. That's ugly.
Add a second copy of the DLL, directly into the unwantedstepchild folder. Also ugly.
Delay the project while we code and test a proper fix. Unacceptable.
Other?
Thanks for any guidance, especially with "Other". I understand that this issue is not necessarily specific to Delphi. Thanks!
The MSDN documentation for LoadLibrary tells you exactly where Windows will search for the DLLs. You either have to hard-code the path to the DLL, put it in the same folder as your app, or put it in one of those default search locations from the LoadLibrary docs.
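For instance, the hard-coded-path variant, using the layout from the question, would just be:
// Fully qualified path to the DLL, as suggested above
FOOLib := LoadLibrary('c:\fooapp\foo.dll');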
This is not exactly a solution for the question asked, but it would have helped me when I stumbled upon this question:
You can extend the search path for LoadLibrary via SetDllDirectory.
From MSDN-Doku:
The search path can be altered using the SetDllDirectory function.
This solution is recommended instead of using SetCurrentDirectory or
hard-coding the full path to the DLL.
You would have needed to add one line before your LoadLibrary call(s):
SetDllDirectory(PChar('c:\fooapp'));
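For example, dropped in front of the existing code from the question:
// Add c:\fooapp to the DLL search path, then load as before
// (SetDllDirectory is declared in the Windows / Winapi.Windows unit).
SetDllDirectory(PChar('c:\fooapp'));
FOOLib := LoadLibrary('foo.dll');
if FOOLib <> 0 then
begin
  FOOProc := GetProcAddress(FOOLib, 'xInjectCert');
  FOOProc(myHttpRequest, Data, CertName);
end;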
Or you can simply edit the environment variable "path" and put the path to the DLL in there. In this case, adding ;c:\fooapp to the path should be sufficient. Since the environment changes of a parent affect its children, you can also create a loader application which adjusts its environment variable and then spawns your application.
