Error while executing Lua script for Redis server

I was following this simple tutorial to try out a basic Lua script:
http://www.redisgreen.net/blog/2013/03/18/intro-to-lua-for-redis-programmers/
I created a simple hello.lua file with these lines:
local msg = "Hello, world!"
return msg
Then I tried running this simple command:
EVAL "$(cat /Users/rsingh/Downloads/hello.lua)" 0
And I am getting this error:
(error) ERR Error compiling script (new function): user_script:1: unexpected symbol near '$'
I can't figure out what is wrong here, and I haven't been able to find anyone else who has run into this.
Any help would be deeply appreciated.

Your problem comes from the fact that you are executing this command from an interactive Redis session:
$ redis-cli
127.0.0.1:6379> EVAL "$(cat /path/to/hello.lua)" 0
(error) ERR Error compiling script (new function): user_script:1: unexpected symbol near '$'
Within such a session you cannot use shell constructs or command-line tools like cat (here cat is just a convenient way to inline the content of your script). In other words, you send the literal string "$(cat /path/to/hello.lua)" to Redis, which is of course not Lua code, and Redis complains.
To execute this sample you must stay in the shell:
$ redis-cli EVAL "$(cat /path/to/hello.lua)" 0
"Hello, world!"

If you are coming from Windows and trying to run a Lua script, you should use this format:
redis-cli --eval script.lua
Run this from the folder where your script is located; it will load a multi-line file and execute it.
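For example, assuming the two-line hello.lua from the question above sits in the current folder and a local Redis is listening on the default port, the call and its reply would look like this:
redis-cli --eval hello.lua
"Hello, world!"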

On the off chance that anyone's come to this from Windows instead, I found I had to do a lot of juggling to achieve the same effect. I had to do this:
echo "local msg = 'Hello, world!'; return msg" > hello.lua
for /F "delims=" %i in ('type hello.lua') do #set cmd=%i
redis-cli eval "%cmd%" 0
... that's if you want it saved as a file, although you'll have to have all the content on one line. If you don't, just roll the content into a set command:
set cmd="local msg = 'Hello, world!'; return msg"
redis-cli eval "%cmd%" 0

Related

Why is QProcess not showing stdout from a bash script executed on a remote server?

I made a script (findx.sh) that runs without any problem when I run it on the Solaris server via the console (bash-3.2$ ./findx.sh).
The problem appears when I try to run it from a Windows Qt app using QProcess (code below): it doesn't display the output of the command.
I tried small variations, and output does appear when I use just one pipe instead of two. But I need both: grep and ggrep.
# findx.sh on Solaris
# What works:
#!/bin/bash
echo pass | sudo -S /usr/sbin/snoop -x0 -ta HSM1000 port 1000
# What I want:
#!/bin/bash
echo pass | sudo -S /usr/sbin/snoop -x0 -ta HSM1000 port 1000 | /usr/sfw/bin/ggrep -A 2 KR01
// Qt on Windows
QString commands = "(";
commands += "source setpath.sh";
commands += ";/path/to/script/findx.sh";
commands += ")";
this->logged = false;
QString program = "plink.exe";
QStringList arguments;
arguments << "-ssh"
          << ip
          << "-l"
          << user
          << "-pw"
          << pass
          << commands;
this->myProcess = new QProcess(this);
connect(this->myProcess, SIGNAL(started()),
        this, SLOT(onprocess_started()));
connect(this->myProcess, SIGNAL(errorOccurred(QProcess::ProcessError)),
        this, SLOT(onprocess_errorOcurred(QProcess::ProcessError)));
connect(this->myProcess, SIGNAL(finished(int, QProcess::ExitStatus)),
        this, SLOT(onprocess_finished(int, QProcess::ExitStatus)));
connect(this->myProcess, SIGNAL(readyReadStandardError()),
        this, SLOT(onprocess_readyReadStandardError()));
connect(this->myProcess, SIGNAL(readyReadStandardOutput()),
        this, SLOT(onprocess_readyReadStandardOutput()));
connect(this->myProcess, SIGNAL(stateChanged(QProcess::ProcessState)),
        this, SLOT(onprocess_stateChanged(QProcess::ProcessState)));
this->myProcess->start(program, arguments);
this->ui->labStatus->setText("Starting");
return 0;
// How I read the output; I do the same for stderr and also append it to plainOutput
QByteArray err = this->myProcess->readAllStandardOutput();
QString m = "Standard output:" + QString(err.data());
this->ui->plainOutput->appendPlainText(m);
Please, any advice would be useful.
Thanks in advance.

Lua: forward output from Minicom

I have a Lua script, and in it I open a minicom session which executes a script (via the -S parameter).
local myFile = assert(io.popen('minicom -S myScript.sh ' .. myDevice .. ' -C myLogFile.log'))
local myFileOutput = myFile:read('*all')
myFile:close()
This works really fine.
But I would like to get the same output as if I execute the minicom command itself:
minicom -S myScript.sh myDevice -C myLogFile.log
Right now I don't get any output at all (I know that's somewhat to be expected).
I would also like the output to appear at (at least nearly) the same time as with the minicom command itself, not as one big chunk of data at the end.
Does anyone know how to achieve that?
If I understand you correctly, you need something like
local myFile = assert(io.popen('minicom ...'))
for line in myFile:lines('l') do
    print(line)
end
myFile:close()

Error while running Lua script from redis client

I have the "Hello World" program in Lua script.
I am trying to call the script from (Chocolatey) Redis client.
I keep getting this error
(error) ERR Error compiling script (new function): user_script:1: function argument expected near '.'
Redis Script: "hello.lua"
local msg = "Hello, world!"
return msg
Chocolatey Redis Client:
127.0.0.1:6379> EVAL "D:\hello.lua" 0
Error Message
(error) ERR Error compiling script (new function): user_script:1: function argument expected near '.'
EVAL accepts the script itself, not a filename.
Try this:
EVAL 'local msg = "Hello, world!" return msg' 0
EDIT: to execute a script in a file, redis-cli provides the --eval switch that you can use as follows:
redis-cli --eval <path-to-script-file> [key1 [key2] ...] , [arg1 [arg2] ...]
I'm not familiar with the Windows fork, but in all likelihood it supports this as well.
In *nix, you can also use the shell to provide the contents of the script to the cli, for example:
redis-cli SCRIPT LOAD "$(cat path-to-script-file)"
will load the contents of the file into Redis. There should be a similar way to achieve this in Windows, but that's outside my current scope ;)
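Once loaded this way, SCRIPT LOAD replies with the SHA1 digest of the script, which you can then pass to EVALSHA so the script body does not have to be resent each time. A rough sketch using the hello.lua from the earlier question (the digest shown is only a placeholder, not the script's real SHA1):
redis-cli SCRIPT LOAD "$(cat hello.lua)"
"<sha1-of-script>"
redis-cli EVALSHA <sha1-of-script> 0
"Hello, world!"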

Run a loop in bash with timeout in a single line

I need to have a single line command that does something in a loop with a timeout.
Something like
export TIMEOUT=60
export BLOCK_SIZE=65536
COMMAND="timeout TIMEOUT while true; do dd if=/tmp/nfsometer_trace/mnt/TESTFILE of=/dev/null bs=BLOCK_SIZE; done"
The command will be executed in 'sh' by doing the following:
echo $COMMAND > COMMAND_FILE
sh COMMAND_FILE
But this gives me a syntax error:
syntax error near unexpected token `do'
Is there any way to have a single line command to timeout an infinite loop?
A command like
while true ; do echo $(( i++ )) ; sleep 5 ; done
will produce output like
0
1
2
3
But if I try to use that command directly in timeout, I get similar error messages, i.e.
timeout 20 while true ; do echo $(( i++ )) ; sleep 5 ; done
bash: syntax error near unexpected token do
Some programs (ssh, for example) will process long strings of logic if they are quoted so that the argument is all one string.
timeout 20 "while true ; do echo $(( i++ )) ; sleep 5 ; done"
timeout: failed to run command 'while true ; do ...'
The same error message appears whether I use single or double quotes to pass in the string.
Reading info timeout, we see that:
COMMAND must not be a special built-in utility (*note Special built in utilities::).
Maybe while loops qualify as a special built-in utility? (More precisely, while is shell syntax rather than an executable utility at all, so timeout has nothing it can exec.)
Finally, note that I have ruled out the "wrapper" setup you are using, i.e.
export TIMEOUT=60
export BLOCK_SIZE=65536
COMMAND="timeout TIMEOUT while ....
as the cause of your problem.
When you have a problem like this, it is better to prove to yourself that your command works in a simpler case, and then get it to work inside a more complex usage.
Given this evidence, I don't think timeout is designed to do what you want.
As a last resort, I recommend that you try converting that while loop into a script and calling just
timeout 20 myLooperScript
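For instance, a minimal sketch of such a looper script (myLooperScript is just the placeholder name used above; the dd line and block size are taken from your COMMAND string; make the file executable and give timeout its path):
#!/bin/bash
# Loop forever; run it as: timeout 20 ./myLooperScript
# timeout kills the loop when the time is up.
while true ; do
    dd if=/tmp/nfsometer_trace/mnt/TESTFILE of=/dev/null bs=65536
done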
IHTH

How can I tell from a within a shell script if the shell that invoked it is an interactive shell?

I'm trying to set up a shell script that will start a screen session (or rejoin an existing one) only if it is invoked from an interactive shell. The solution I have seen is to check if $- contains the letter "i":
#!/bin/sh -e
echo "Testing interactivity..."
echo 'Current value of $- = '"$-"
if [ `echo \$- | grep -qs i` ]; then
echo interactive;
else
echo noninteractive;
fi
However, this fails, because the script is run by a new noninteractive shell, invoked as a result of the #!/bin/sh at the top. If I source the script instead of running it, it works as desired, but that's an ugly hack. I'd rather have it work when I run it.
So how can I test for interactivity within a script?
Give this a try and see if it does what you're looking for:
#!/bin/sh
if [ $_ != $0 ]
then
    echo interactive;
else
    echo noninteractive;
fi
The underscore ($_) expands to the absolute pathname used to invoke the script. The zero ($0) expands to the name of the script. If they're different then the script was invoked from an interactive shell. In Bash, subsequent expansion of $_ gives the expanded argument to the previous command (it might be a good idea to save the value of $_ in another variable in order to preserve it).
From man bash:
0    Expands to the name of the shell or shell script. This is set at shell initialization. If bash is invoked with a file of commands, $0 is set to the name of that file. If bash is started with the -c option, then $0 is set to the first argument after the string to be executed, if one is present. Otherwise, it is set to the file name used to invoke bash, as given by argument zero.
_    At shell startup, set to the absolute pathname used to invoke the shell or shell script being executed as passed in the environment or argument list. Subsequently, expands to the last argument to the previous command, after expansion. Also set to the full pathname used to invoke each command executed and placed in the environment exported to that command. When checking mail, this parameter holds the name of the mail file currently being checked.
$_ may not work in every POSIX-compatible sh, although it probably works in most.
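Putting the earlier suggestion into practice (saving $_ before any other command can overwrite it), a minimal sketch of that approach; the variable name invoked_as is arbitrary:
#!/bin/sh
# Capture $_ on the very first line; any later command would replace it.
invoked_as="$_"
if [ "$invoked_as" != "$0" ]
then
    echo interactive
else
    echo noninteractive
fi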
$PS1 will only be set if the shell is interactive. So this should work:
if [ -z "$PS1" ]; then
echo noninteractive
else
echo interactive
fi
Try tty:
if tty 2>&1 |grep not ; then echo "Not a tty"; else echo "a tty"; fi
From man tty:
The tty utility writes the name of the terminal attached to standard
input to standard output. The name that is written is the string
returned by ttyname(3). If the standard input is not a terminal, the
message ``not a tty'' is written.
You could try using something like...
if [[ -t 0 ]]
then
    echo "Interactive...say something!"
    read line
    echo $line
else
    echo "Not Interactive"
fi
The "-t" switch in the test field checks if the file descriptor given matches a terminal (you could also do this to stop the program if the output was going to be printed to a terminal, for example). Here it checks if the standard in of the program matches a terminal.
Simple answer: don't run those commands inside ` ` or [ ].
There is no need for either of those constructs here.
Obviously I can't be sure what you expected
[ `echo \$- | grep -qs i` ]
to be testing, but I don't think it's testing what you think it's testing.
That code will do the following:
Run echo \$- | grep -qs i inside a subshell (due to the ` `).
Capture the subshell's standard output.
Replace the original ` ` expression with a string containing that output.
Pass that string as an argument to the [ command or built-in (depending on your shell).
Produce a successful return code from [ only if that string was nonempty (assuming the string didn't look like an option to [).
Some possible problems:
The -qs options to grep should cause it to produce no output, so I'd expect [ to be testing an empty string regardless of what $- looks like.
It's also possible that the backslash is escaping the dollar sign and causing a literal 'dollar minus' (rather than the contents of a variable) to be sent to grep.
On the other hand, if you removed the [ and backticks and instead said
if echo "$-" | grep -qs i ; then
then:
your current shell would expand "$-" with the value you want to test,
echo ... | would send that to grep on its standard input,
grep would return a successful return code when that input contained the letter i,
grep would print no output, due to the -qs flags, and
the if statement would use grep's return code to decide which branch to take.
Also:
no backticks would replace any commands with the output produced when they were run, and
no [ command would try to replace the return code of grep with some return code that it had tried to reconstruct by itself from the output produced by grep.
For more on how to use the if command, see this section of the excellent BashGuide.
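Putting that corrected test back into the original script's if/else, a minimal sketch would be (note this only fixes the test syntax; as the question itself points out, $- inside the script still describes the new non-interactive shell that runs it):
#!/bin/sh -e
echo "Testing interactivity..."
echo 'Current value of $- = '"$-"
if echo "$-" | grep -qs i ; then
    echo interactive
else
    echo noninteractive
fi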
If you want to test the value of $- without forking an external process (e.g. grep) then you can use the following technique:
if [ "${-%i*}" != "$-" ]
then
echo Interactive shell
else
echo Not an interactive shell
fi
This deletes the shortest suffix matching i* from the value of $-, then checks whether that made any difference.
(The ${parameter/from/to} construct (e.g. [ "${-//[!i]/}" = "i" ] is true iff interactive) can be used in Bash scripts but is not present in Dash, which is /bin/sh on Debian and Ubuntu systems.)
