Copybooks, .cpy files, WTX Design Studio - COBOL

I am working in WTX Design Studio with copybooks. I have a copybook, but I don't have any corresponding sample input for that .cpy file.
Is there any way to generate a sample text file from the copybook, rather than writing the text document manually?

As Bill said, there are lots of ways, depending on the tools available:
Ask for a sample file.
There are packages that can generate data. As I do not work at your site, I do not know whether any have been installed; they tend to be expensive, though.
Write a Cobol program. I presume the source is a mainframe, so it would have to be done on the mainframe.
Presuming the source is a mainframe, get on the mainframe and use FileAid (or FileMaster or whatever they have) to set up the file. FileAid and its ilk let you edit files with a Cobol copybook.
Use the RecordEditor to create the file. You can import Cobol copybooks into the RecordEditor and then use them to edit Cobol data files.
Use a RecordEditor macro to generate a file (see the example below).
Write a Java / Jython / JRuby program with a Cobol interface package (have a look on SourceForge).
There are a lot of other possibilities; as I do not know what software or skills you have, I cannot really advise.
RecordEditor Macro to generate some numeric data:
/******************************************************************
 * Purpose: RecordEditor example macro to generate numeric data for a file.
 *
 * It is best to run this script from a Single Record screen rather than
 * a Table screen.
 *******************************************************************/
var rec = layout.getRecord(0);
var lines = RecordEditorData.view.createLines(20);

for (var lineNo = 0; lineNo < 20; lineNo++) {
    print(lineNo);
    for (var i = 0; i < rec.getFieldCount(); i++) {
        try {
            // Numeric fields: derive a value from the record and field position.
            lines[lineNo].getFieldValue(0, i).set(lineNo * 100 + i);
        } catch (err) {
            // Non-numeric fields reject that value; fall back to a single digit.
            lines[lineNo].getFieldValue(0, i).set(i % 10);
        }
    }
}

// Append the generated records to the current view.
RecordEditorData.view.addLines(-1, 1, lines);
Output from the macro: 20 new records, with every field filled with a generated numeric value.

Related

Reading characters within a file (the fseek command) not working in nxc (not-exactly-c) programming

Basically I'm trying to write code that reads a specific value from a file. For example, if the file contains 12345 and the code reads a specific position (say the 3rd place), the output would be 3.
I already know how to write numbers to a file, but I am stuck on reading a single character because the fseek command won't work (the compiler doesn't recognize it). I downloaded the include folder with all the extra commands, but it still couldn't find the "stdio.h" file referenced in the code.
I don't completely understand how it all works anyway. I am a complete noob at programming and only know the very basic stuff, so I am 90% sure the problem is my fault.
#include "cstdio.h." //gets error (doesn't recognize command/file)
task main ()
{
byte handle;
int fsize = 100;
int in;
int sus = 12345;
DeleteFile("int.txt");
CreateFile("int.txt",100,handle);
Write(handle, sus);
CloseFile(handle);
OpenFileRead("int.txt",fsize,handle);
fseek(handle, 10, 3); //gets error (doesn't recognize command)
in = fgetc(handle);
ClearScreen();
NumOut(30,LCD_LINE5,in);
Wait(100000);
CloseFile(handle);
}

Parsing LLVM IR code (with debug symbols) to map it back to the original source

I'm thinking about building a tool to help me visualise the generated LLVM IR code for each instruction/function in my original source file.
Something like this, but for LLVM IR.
The steps to build such a tool so far seem to be:
Start with an LLVM IR AST builder.
Parse the generated IR code.
At the caret position, get the AST element.
Read the element's scope, line, column and file, and highlight it in the original source file.
Is this the correct way to approach it? Am I trivialising it too much?
I think your approach is quite correct. The UI part will probably take quite a while to implement, so I'll focus on the LLVM part.
Let's say you start from an input file containing your LLVM IR.
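Note that this only works if the IR actually carries debug metadata. If you generate the IR yourself, compile with debug info enabled, for example (file names are only placeholders):
clang -g -S -emit-llvm input.c -o input.ll
This produces a .ll file whose instructions carry the !dbg attachments that the steps below rely on.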
Step 1: process the module.
Read the file content into a string, build a module from it, and process the module to collect the debug info:
// Needs llvm/IRReader/IRReader.h, llvm/IR/DebugInfo.h, llvm/Support/MemoryBuffer.h
// and llvm/Support/SourceMgr.h; `context` is an llvm::LLVMContext that must outlive the module.
llvm::MemoryBuffer* buf = llvm::MemoryBuffer::getMemBuffer(llvm::StringRef(fileContent)).release();
llvm::SMDiagnostic diag;
llvm::Module* module = llvm::parseIR(buf->getMemBufferRef(), diag, *context).release();
llvm::DebugInfoFinder* dif = new llvm::DebugInfoFinder();
dif->processModule(*module);
Step 2: iterate over the instructions.
Once that is done, you can simply iterate over the functions, basic blocks and instructions:
// The loops written out as real C++ (range-based iteration over the module):
for (llvm::Function &f : *module)
{
    for (llvm::BasicBlock &b : f)
    {
        for (llvm::Instruction &inst : b)
        {
            const llvm::DebugLoc &dl = inst.getDebugLoc();
            if (!dl)
                continue; // this instruction has no source location attached
            unsigned line = dl.getLine();
            // accordingly populate some dictionary between your instructions and source code
        }
    }
}
Step 3: update your UI.
This is another story...

Lua emulating the require function

The embedded Lua environment (World of Warcraft - WoW) is missing the require function.
I want to port an existing Lua library (a great OO library) so I can use it in WoW. The library itself is relatively small (approx. 8 small files), but of course it heavily uses require.
World of Warcraft loads files and libraries by listing them in an XML file, like:
<Ui xsi:schemaLocation="http://www.blizzard.com/wow/ui/">
<Script file="LibOne.lua"/>
<Script file="LibTwo.lua"/>
</Ui>
but I don't know how the low-level library handling is done in WoW.
AFAIK WoW is missing even the package table. :(
So the question(s): for me, the streamlined way would be to write a function which emulates the require function using the interface available in WoW. The question is how. Could someone give me some directions?
Or, as an alternative for porting the mentioned source to WoW, I need to replace the require "Some.Other.Module" lines in the Lua sources with something WoW will understand. What is the equivalent/replacement for require "Some.Module" in WoW?
How does WoW handle modules/libraries at a low level?
You could merge all files into one using one of the various amalgamation scripts, e.g. amalg. Then you can load this file and a stub that implements the require function using the usual WoW way:
<Ui xsi:schemaLocation="http://www.blizzard.com/wow/ui/">
<Script file="RequireStub.lua"/>
<Script file="AllModules.lua"/><!-- amalgamated Lua modules -->
<Script file="YourCode.lua"/>
</Ui>
The file RequireStub.lua could look like:
package = {}
local preload, loaded = {}, {
  string = string,
  debug = debug,
  package = package,
  _G = _G,
  io = io,
  os = os,
  table = table,
  math = math,
  coroutine = coroutine,
}
package.preload, package.loaded = preload, loaded

function require( mod )
  if not loaded[ mod ] then
    local f = preload[ mod ]
    if f == nil then
      error( "module '"..mod..[[' not found:
no field package.preload[']]..mod.."']", 1 )
    end
    local v = f( mod )
    if v ~= nil then
      loaded[ mod ] = v
    elseif loaded[ mod ] == nil then
      loaded[ mod ] = true
    end
  end
  return loaded[ mod ]
end
This should emulate enough of the package library to get you a working require that loads modules in the amalgamated file. Different amalgamation scripts might need different bits from package, though, so you probably will have to take a look at the generated Lua source code.
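For reference, the file produced by such an amalgamation script typically wraps every module in a function and registers it under its module name, roughly like this (module name and contents are made up):

package.preload["Some.Module"] = function( ... )
  local M = {}
  function M.greet()
    return "hello from Some.Module"
  end
  return M
end

With the stub above in place, a later require("Some.Module") finds that entry in package.preload, runs it once, and caches the result in package.loaded.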
And in the specific case of Coat you might need to implement stubs for other Lua functions as well. E.g. I've seen that Coat uses the debug library ...
The WoW environment doesn't have dofile or any other means of reading external files at all. You need to explicitly list every file that must be loaded in the .toc file, or in an .xml file referenced from the .toc.
You can then write your own implementation of require to maintain compatibility with your library. That is quite trivial, since it only needs to parse the module name and retrieve its content from a table of loaded modules; but you'd still have to alter the original source so that each file registers itself in that table, and you'll need to arrange all files into the correct loading order by hand (a minimal sketch follows).
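A rough sketch of that approach, with every name invented for the example (MyAddonModules is not a WoW API, just a shared global table):

-- RequireShim.lua: listed first in the .toc
local modules = {}
_G.MyAddonModules = modules   -- shared registry of loaded modules

function require( name )
  local m = modules[ name ]
  if m == nil then
    error( "module '" .. name .. "' has not been loaded yet", 2 )
  end
  return m
end

-- SomeModule.lua: listed after the shim; registers itself instead of being require'd
local M = {}
function M.greet() return "hello from Some.Module" end
MyAddonModules["Some.Module"] = M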
Alternatively, you can rearrange the files into separate WoW addons and use WoW's own built-in Dependencies/OptionalDeps facilities, or the popular LibStub framework, to handle the loading order automatically (sketch below).
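A LibStub-based registration looks roughly like this; the library name and version are invented, and LibStub.lua itself must be loaded before this file:

local MAJOR, MINOR = "SomeOOLib-1.0", 1
local lib = LibStub:NewLibrary( MAJOR, MINOR )
if not lib then return end   -- an equal or newer version is already loaded

function lib.greet()
  return "hello from " .. MAJOR
end

-- any addon that declares the library as a dependency can then fetch it with:
-- local SomeOOLib = LibStub( "SomeOOLib-1.0" )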

How to get Doxygen to recognize custom latex command

Is there a way to use extra LaTeX packages and/or extra LaTeX commands with the Doxygen code documentation system? For example, I define this shortcut in a custom .sty file:
\newcommand{\tf}{\Theta_f}
Then I use it about 300 times in the code, spread across about a dozen files.
/*! Stochastic approximation of the latent response*/
void dual_bc_genw(
//...
double const * const psi, ///< \f$ \psi = B\tf \f$
//...
){/* lots of brilliant code */}
But how do I get the system to recognize the extra package?
Name your style file in the EXTRA_PACKAGES tag in your configuration file.
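For example, if the \newcommand definitions live in mydefs.sty (the file name here is only an example) in the same directory as the Doxyfile, the entry would be:

EXTRA_PACKAGES = mydefs

The package is named without the .sty extension. Doxygen then adds the corresponding \usepackage line to the LaTeX it generates, so \tf should be usable inside the \f$ ... \f$ formulas.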

Load or Stress Testing Tool with URL Import Functionality

Can someone recommend a load testing tool which allows you to either:
a. replay IIS (7) logs to simulate a real live site's daily run;
b. import a CSV or equivalent list of URLs, so we can achieve a similar thing as above but at a URL level;
c. use a .NET API, so I can easily create simple tests from my list of URLs.
I do not really want to record my tests.
I think I can do (b) with WAPT, but I would need to create an XML file manually. Not too much grief, but I am wondering if any tools cover these scenarios out of the box.
Visual Studio Test Edition would require some code to parse the file into a suitable test run, but it is a great load testing solution.
Our load testing service lets you write a very simple script using JavaScript to pull data out of a CSV file and then fetch those URLs. For example, the following code would pluck 10 random URLs from the CSV file and fetch them as part of a single session:
var c = browserMob.openHttpClient();
var csv = browserMob.getCSV("urls.csv");

browserMob.beginTransaction();
for (var i = 0; i < 10; i++) {
    browserMob.beginStep("Step 1");
    var url = csv.random().get("url");
    c.get(url);
    browserMob.endStep();
}
browserMob.endTransaction();
The CSV file itself needs to be a normal CSV file with the first row containing a header named "url". This script would be run repeatedly for each virtual user participating in a load test.
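For illustration, a matching urls.csv (contents invented) would simply be:

url
http://www.example.com/
http://www.example.com/products
http://www.example.com/checkout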
We support a so-called "URI format" in our open-source tool, Yandex.Tank. You simply put all your URIs in a file, one URI per line, then specify the headers in your load.ini like this:
[phantom]
address=example.org
rps_schedule=line(1, 1600, 2m)
headers = [Host: mts-maps.yandex.ru]
[Connection: close] [Bloody: yes]
ammo_file = ammo.uri
ammo.uri:
/
/index.html
/1/example.html
/2/example.html
