I need to set up some SET options in the Oracle SQL*Plus command-line program each time I use it, such as SET HEADING OFF and the like, to beautify my results.
I found that I always have to input each line separately to set the different options, and this is becoming annoying since I need to access it many times a day.
I found that there's no way to separate different SET commands with semicolons, because SQL*Plus doesn't accept them:
SET HEADING OFF; SET LINESIZE 100;
returns an error
A solution could be adding them to a control script and creating a shell alias, but as far as I know control scripts execute and then exit, without returning you control over the command line.
So, anybody knows another solution? Or am I missing something?
Ok, answering my own question: apparently it is possible to do this:
SET HEADING OFF LINESIZE 100 PAGESIZE 0 xxx xxx
And go on adding rules as one comes up with them.
It is a simple and effective solution for now.
Put all your commands in a ".sql" file (for example "format.sql"), then execute them with the "@" command in SQL*Plus (for example "@format").
Note that it defaults to the ".sql" suffix when looking for the command file.
For example, if "format.sql" contains the commands "set linesize 100" and "set pagesize 0":
% sqlplus
SQL*Plus: Release 10.2.0.1.0 - Production on Thu Mar 18 08:39:03 2010
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
SQL> show linesize
linesize 80
SQL> @format
SQL> show linesize
linesize 100
SQL> select 1+1 from dual;
2
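A quick way to set this up from the shell; the file name and the particular options here are just examples:

```shell
# Create a settings file once; SQL*Plus finds it by its ".sql" suffix
cat > format.sql <<'EOF'
set heading off
set linesize 100
set pagesize 0
EOF

# Inside SQL*Plus, run it with:  @format
# Or apply it on startup (requires an Oracle client installed):
#   sqlplus user/password@db @format
cat format.sql   # show what was written
```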
Sample Data:
ID DATE READING FLAG
1 2012-06-01 10 0
2 2012-06-01 10.5 0
3 2012-06-01 8.5 0
I have a regex interceptor to get the timestamp and send the data to directories based on their date:
agent1.sources.source1.interceptors = interceptor1
agent1.sources.source1.interceptors.interceptor1.type = regex_extractor
agent1.sources.source1.interceptors.interceptor1.regex = ([0-9][0-9][0-9][0-9]-[0-9][0-2]-[0-3][0-9])
I then use serializers on the timestamp so I can set the correct directory: /yyyy/mm/dd
Now I also want to use a second regex interceptor to only show the data in the 3rd column (Reading) of the tab-separated file. I can do this in the terminal with grep running a regex command using -o, but cannot seem to figure it out in Flume.
The desired output of the second regex command would be
10
10.5
8.5
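For reference, here is roughly what that terminal extraction looks like; the file name `sample.tsv` is just an illustration holding the sample rows above:

```shell
# Recreate the tab-separated sample data
printf '1\t2012-06-01\t10\t0\n2\t2012-06-01\t10.5\t0\n3\t2012-06-01\t8.5\t0\n' > sample.tsv

# Print only the 3rd (Reading) column
awk -F'\t' '{print $3}' sample.tsv
# -> 10
#    10.5
#    8.5

# A grep -o equivalent (GNU grep with -P) would be something like:
# grep -oP '^[0-9]+\t[0-9-]+\t\K[0-9.]+' sample.tsv
```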
Also, I am not sure if regex is the best tool for this, but to my knowledge Flume does not have a header filter.
Any help or suggestions are much appreciated.
I'm learning COBOL programming and using GNUCobol (on Linux) to compile and test some simple programs. In one of those programs I have found an unexpected behavior that I don't understand: when reading a sequential file of records, I'm always getting one extra record and, when writing these records to a report, the last record is duplicated.
I have made a very simple program to reproduce this behavior. In this case, I have a text file with a single line of text: "0123456789". The program should count the characters in the file (or 1-character-long records) and I expect it to display "10" as a result, but instead I get "11".
Also, when displaying the records, as they are read, I get the following output:
0
1
2
3
4
5
6
7
8
9
11
(There are two blank lines between 9 and 11.)
This is the relevant part of this program:
FD SIMPLE.
01 SIMPLE-RECORD.
05 SMP-NUMBER PIC 9(1).
[...]
PROCEDURE DIVISION.
000-COUNT-RECORDS.
OPEN INPUT SIMPLE.
PERFORM UNTIL SIMPLE-EOF
READ SIMPLE
AT END
SET SIMPLE-EOF TO TRUE
NOT AT END
DISPLAY SMP-NUMBER
ADD 1 TO RECORD-COUNT
END-READ
END-PERFORM
DISPLAY RECORD-COUNT.
CLOSE SIMPLE.
STOP RUN.
I'm using the default options for the compiler, and I have tried using 'WITH TEST {BEFORE|AFTER}' but the result is the same. What can be the cause of this behavior or how can I get the expected result?
Edit: I tried using an "empty" file as data source, expecting a 0 record count, using two different methods to empty the file:
$ echo "" > SIMPLE
This way the record count is 1 (ls -l gives a size of 1 byte for the file).
$ rm SIMPLE
$ touch SIMPLE
This way the record count is 0 (ls -l gives a size of 0 bytes for the file). So I guess that somehow the compiled program is detecting an extra character, but I don't know how to avoid this.
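The difference between those two "empty" files shows up directly in the byte counts:

```shell
# echo "" still writes a newline, so the file is 1 byte long
echo "" > SIMPLE
wc -c < SIMPLE    # -> 1

# touch creates a genuinely empty, 0-byte file
rm SIMPLE
touch SIMPLE
wc -c < SIMPLE    # -> 0

# printf without a trailing \n writes exactly the 10 data bytes
printf '0123456789' > SIMPLE
wc -c < SIMPLE    # -> 10
```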
I found out that the cause of this behavior is the automatic newline character that vim seems to append when saving the data file.
After disabling this in vim this way
:set binary
:set noeol
the program works as expected.
Edit: A more elegant way to prevent this problem, when working with data files created from a text editor, is using ORGANIZATION IS LINE SEQUENTIAL in the SELECT clause.
Since the problem was caused by the data format, should I delete this question?
Please help me with the concern described below.
I have some existing shell scripts that run daily at regular intervals to spool some data into a text file and send it to another system.
Now I have made some changes to those scripts, and the spooling that used to take 6 hours now takes more than 8 hours.
I have read that "/" in a script usually executes the previous SQL statement.
So, with the code below, is the SQL query being called twice?
I am new to this, so sorry if I am being naive; any help related to this is appreciated.
Thanks in advance.
#!/bin/ksh
ORACLE_HOME=/pprodi1/oracle/9.2.0; export ORACLE_HOME;
Script_Path=<path>
dt=`date '+%y%m%d%H%M'`
find $Script_Path/testing_spool* -mtime +3 | xargs rm -f { }
cd $Script_Path
sqlplus -s uname/pwd@db_name <<EOF1 >/dev/null
set echo off
set head off
set pages 0
set feedback off
set pause off
set colsep " "
set verify off
set termout off
set linesize 3000
set trimspool on
spool $Script_Path/testing_spool.dat
SELECT column_name
FROM table_name
WHERE created_date > SYSDATE - 1
AND col1 = '126'
AND col2 = 'N'
AND col3 = 6;
spool off;
/
EOF1
cat testing_spool.dat > testing_spool_$dt.txt
Yes, your query will be executed twice, once while the spool is active and then again after it has been turned off.
As you mentioned, the / executes whatever is currently in the SQL buffer, which will still contain your query. The spool off is a client command and does not affect the SQL buffer.
If you run your script without the output redirect >/dev/null on line 10, or redirect to a file instead if you expect a lot of output, then you will see the query results repeated.
Incidentally, set termout off isn't doing anything in your script because you're running it as a heredoc. If you had the statements in a script file and ran that with start or @ then it would suppress the output, but as the documentation says:
TERMOUT OFF does not affect output from commands you enter interactively or redirect to SQL*Plus from the operating system.
You could potentially create a .sql file, run that, and then delete it - all within your shell script. You may not see much benefit, but it would mean you didn't need to hide all output with that redirect, which would make it harder to diagnose failures.
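A sketch of that approach, carrying over the placeholder names from the question (the query shown is the question's placeholder, not a real one):

```shell
#!/bin/ksh
# Write the statements to a temporary script file instead of a heredoc;
# run it with @ so "set termout off" takes effect, and note there is
# no trailing / to re-execute the statement left in the SQL buffer.
cat > /tmp/testing_spool.sql <<'EOF'
set echo off
set head off
set pages 0
set feedback off
set termout off
set trimspool on
spool testing_spool.dat
SELECT column_name
FROM table_name
WHERE created_date > SYSDATE - 1;
spool off
exit
EOF

# sqlplus -s uname/pwd@db_name @/tmp/testing_spool.sql   # requires an Oracle client
# rm -f /tmp/testing_spool.sql                           # clean up afterwards
```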
I have a stored procedure (zhm.GetBatchCmdList_sp) with two optional input parameters (@freq and @env). The stored procedure returns a single-column list of SSIS files and stored procedures that I want to execute, one by one.
I've created some test code that simply echoes the stored procedure's result set to console:
set freq=D
set env=DEV
set cmd=sqlcmd -S LDNDSM05243\TDS_MAIN2_DEV -d CPRM_3DSTRESS -Q "exec zhm.GetBatchCmdList_sp @freq=$(freq), @env=$(env)" -h -1
set cmdList=
setlocal enabledelayedexpansion
for /f "usebackq" %%i in (`"%cmd%"`) do (set cmdList=!cmdList! %%i)
for %%v in (%cmdList%) do echo %%v
endlocal
but I am having problems with it.
If I remove the input parameters from the above 'set cmd....' line by replacing it with:
set cmd=sqlcmd -S LDNDSM05243\TDS_MAIN2_DEV -d CPRM_3DSTRESS -Q "exec zhm.GetBatchCmdList_sp" -h -1
then the script runs. However, I want the caller to be able to pass input variables (ie. %1 and %2) to the script (which will pass these to the stored procedure) so that different values for freq and env can be set at runtime.
I don't know what I'm doing wrong but I suspect it may have something to do with the combination of stored proc input parameters and enabling delayed expansion. Of course, I could be wrong, but I've been googling and I can't find any help specific to this combo/question.
Can anyone help me please?
Edit - This is a completely rewritten answer
Your problem is the ) in your definition of cmd. You have two sets of quotes, one in your original definition, and then another set in your FOR IN() clause. The net effect is your ) characters are not quoted, so it terminates the IN() clause prematurely and the statement fails.
As you stated in your comment, changing to "!cmd!" solves your problem. But I think you could also have solved it simply by removing the quotes from your IN() clause.
for /f "usebackq" %%i in (`%cmd%`) do...
Does anybody know of a script, PowerShell utility, or third-party tool (CodePlex or PoshCode script, app, whatever, etc.) that can read and scan a PowerShell script and detail what snap-ins, providers, assemblies, cmdlets, functions, and so on the script needs to execute?
Thanks.
Bob.
First things first: No, I don't know of such a script/utility.
I suspect, though, that you can get pretty far with Powershell's capabilities of parsing itself.
For example, the following script:
function test {}
test
$content = gc $args[0]
[System.Management.Automation.PsParser]::Tokenize($content, [ref] $null) |
? { $_.Type -eq "Command" } | ft
given itself as argument yields the following output:
Content Type Start Length StartLine StartColumn EndLine EndColumn
------- ---- ----- ------ --------- ----------- ------- ---------
test Command 20 4 3 1 3 5
gc Command 39 2 5 12 5 14
? Command 127 1 6 76 6 77
ft Command 157 2 6 106 6 108
So, the "Command" type includes at least functions and cmdlets. You can further dissect this by un-aliasing those tokens.
This probably can tell you a little already but comes nowhere close to your pretty exhaustive list of what Powershell scripts could require.
But at least in the case of snap-ins or modules you probably need some magic anyway to know precisely what's missing.
Actually, we make several of these utilities:
http://scriptcop.start-automating.com - Free online / downloadable static analysis of PowerShell scripts
http://scriptcoverage.startautomating.com/ - Code Coverage Tool for Powershell Scripts
There's also a tool in the module ShowUI (Get-ReferencedCommand) that does exactly what you'd want. It looks at every command used within a command, and every command they use, to produce a list of all commands you must include.
You use it like: Get-ReferencedCommand -ScriptBlock { Start-MyCommand }
Hope this helps.