How to save a variable in an rspec expect? - ruby-on-rails

I have something along the following lines in one of my spec files:
expect(my_instance).to receive(:my_function).with(arg: instance_of(String))
I want to be able to capture the actual value of arg in a variable I can use in the spec. Is there a way to do that? I checked the rspec docs but didn't find anything like that.

You could declare the variable, say captured_arg, before the expect (or allow, if you don't want the spec to fail when my_instance does not receive my_function). Then you can collect the arguments in a block and set captured_arg within that block.
captured_arg = nil
expect(my_instance).to receive(:my_function) { |arg| captured_arg = arg }
Edit: (Keyword Arguments)
If you are using keyword arguments, just modify the script above slightly, using arg as the keyword argument you'd like to capture:
captured_arg = nil
expect(my_instance).to receive(:my_function) { |args| captured_arg = args[:arg] }
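You can then assert on the captured value after exercising the code under test. A minimal sketch using the names from the question (the string literal is made up for illustration):
captured_arg = nil
expect(my_instance).to receive(:my_function) { |args| captured_arg = args[:arg] }

my_instance.my_function(arg: "some value")  # stands in for exercising the code under test

expect(captured_arg).to eq("some value")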

Related

Struggling with calling lua functions from the command line

I have the following code:
dofile(arg[1])
function1 = arg[2]
argument = arg[3]
returned = _G[function1](argument)
print(returned)
It is designed to take three command-line arguments and run a function from a file.
So, I run the command lua libs.lua "printStuff.lua" "printStuff" "\"Hello, World\"", and I always end up with this:
"Hello, World"
nil
I don't understand why I always get "nil".
Here are the contents of printstuff.lua:
function printStuff(stuff)
print(stuff)
end
That is to be expected. Here's what's going on:
You're executing the file specified by the first argument, printStuff.lua, which will leave a function printStuff in the global table _G.
You're indexing the global table with the second argument, printStuff, obtaining that function.
You're calling the function you just obtained with the third command-line argument, "Hello, World", as parameter, which prints it, and storing the result of that in the global variable returned. The function printStuff doesn't return anything (there's no return in there, and even if there were, print doesn't return anything either), so you're assigning nil to returned (see the sketch below).
You're printing returned, which is nil.
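For example, if printStuff returned what it prints, returned would no longer be nil; a minimal sketch of that change:
function printStuff(stuff)
    print(stuff)
    return stuff  -- now the caller's `returned` variable receives this value
end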
Side note: I'd use the vararg ... instead of the arg table for improved readability:
local file, func, param = ...
dofile(file); print(_G[func](param))
Why not simply...
-- xlua.lua
-- Example: lua xlua.lua os date "%H:%M:%S"
-- Or: lua xlua.lua _G print "Hello World"
-- Or: lua xlua.lua dofile /path/to/someusefull.lua "Arg for someusefull.lua"
local result = _G[arg[1]][arg[2]](arg[3])
-- Only print the result if it is not nil
if (result ~= nil) then
print(result)
end

How to execute a Groovy shared library with optional parameters

I'm developing a shared library to call from my Jenkinsfile. This library has a function with optional parameters, and I want to be able to call it with any number of parameters, specifying only the values I need. I've been googling a lot but couldn't find a good answer, so maybe somebody here could help me.
Example:
The function looks this way:
def doRequest(def moduleName=env.MODULE_NAME, def environment=env.ENVIRONMENT, def repoName=env.REPO_NAME) {
<some code goes here>
}
If I execute it from my Jenkinsfile this way:
script {
sendDeploymentStatistics.doRequest service_name
}
the function puts the service_name value into moduleName, but how do I specify the repoName parameter?
In Python you would do it somehow like:
function_name(moduleName=service_name, repoName=repo_name)
but in Groovy + Jenkinsfile I can't find the right way.
Can anybody please help me to find out the right syntax?
Thank you!
Groovy has the concept of Default Parameters. If you change the order of the parameters, such that the environment comes last:
def doRequest(def moduleName=env.MODULE_NAME, def repoName=env.REPO_NAME, def environment=env.ENVIRONMENT) {
<some code goes here>
}
Then a call that passes only the first two arguments positionally will use the default value for environment:
doRequest(service_name, repo_name)
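Here is a self-contained sketch of that behaviour; the literal default values are made up for illustration and are not the Jenkins env values:
def doRequest(def moduleName = 'defaultModule', def repoName = 'defaultRepo', def environment = 'defaultEnv') {
    return "${moduleName} ${repoName} ${environment}"
}

// environment falls back to its default when only the first two arguments are passed
assert doRequest('my-service', 'my-repo') == 'my-service my-repo defaultEnv'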
Groovy, however, also has some support for Named Parameters. It is not as nice as Python's, but you can get it to work as follows:
env = [MODULE_NAME: 'foo', ENVIRONMENT: 'bar', REPO_NAME: 'baz']
def doRequest(Map args = [:]) {
defaultMap = [moduleName: env.MODULE_NAME, environment: env.ENVIRONMENT, repoName: env.REPO_NAME]
args = defaultMap << args
return "${args.moduleName} ${args.environment} ${args.repoName}"
}
assert 'foo bar baz' == doRequest()
assert 'foo bar qux' == doRequest(repoName: 'qux')
assert '1 2 3' == doRequest(repoName: '3', moduleName: '1', environment: '2')
For Named Parameters you need a parameter of type Map (with a default value of the empty map). Groovy will then map the arguments upon calling the function to entries in a Map.
To use default values you need to create a map with the default values, and merge that defaultMap with the passed-in arguments.
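Translated back to the original Jenkinsfile snippet, the call could then look roughly like this (sendDeploymentStatistics, service_name and repo_name are the names from the question; the exact wiring depends on your shared library):
script {
    // override moduleName and repoName, let environment fall back to env.ENVIRONMENT
    sendDeploymentStatistics.doRequest(moduleName: service_name, repoName: repo_name)
}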

Rspec: Checking the content of a system call

I have a Rake task in my Rails project which executes openssl through a system call.
The code looks like this:
system('bash', '-c', 'openssl cms -verify...')
I need to run the command in bash rather than dash (which is the default on Ubuntu) in order to use process substitution in the command.
I need to create a test with RSpec which checks that, in this case, the verify argument was passed as expected.
I have tried the following:
expect(Kernel).to receive(:system) do |args|
expect(args[2]).to match(/verify/)
end
However, this only gives me the third letter in the first string sent to system - i.e. the letter s from bash - rather than the third argument sent in the system call.
What am I doing wrong here? Any suggestions would be much appreciated.
Args are being passed to the block as sequential arguments, so if you want to treat them as an array, you need a splat operator in do |*args|:
expect(Kernel).to receive(:system) do |*args|
expect(args[2]).to match(/verify/)
end
Just to take a step back, it's important to understand how block arguments work, since they are different from methods. For example:
def my_fn(*args)
yield(*args)
end
my_fn(1,2,3) { |args| print args }
# => 1
my_fn(1,2,3) { |a, b, c| print [a,b,c] }
# => [1,2,3]
my_fn(1,2,3) { |*args| print args }
# => [1,2,3]
So if you did do |args| (without the splat), you are assigning the args variable to the first argument passed to the block ("bash") and ignoring the other arguments.
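Putting it together for the openssl example, the expectation could look something like this (run_rake_task is a placeholder for however the spec actually triggers the code that makes the system call):
expect(Kernel).to receive(:system) do |*args|
  expect(args[0]).to eq('bash')
  expect(args[2]).to match(/verify/)
end

run_rake_task  # placeholder: invoke the code under test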

Lua require invokes an unintended print of the required file name

When require is called in testt.lua, which is one of my two files, movee and movee.lua get printed.
movee is, for the most part, a class to be required, but it should also be able to be called directly with parameters.
movee.lua
local lib = {} --this is class array
function lib.moveAround( ... )
for i,direction in ipairs(arg) do
print(direction)
end
end
function lib.hello()
print("Hello water jump")
end
lib.moveAround(...)
return lib
testt.lua
local move = require("movee")
The expected result is that lib.moveAround is not called, and the file name is not printed, when require is called.
Your expectations are incorrect. Lua, and most scripting languages for that matter, does not recognize much of a distinction between including a module and executing the Lua file which provides that module. Every function statement is a statement whose execution creates a function object. Until those statements are executed, those functions don't exist. Same goes for your local lib = {}. And so on.
Now, if you want to make a distinction between when a user tries to require your script as a module and when a user tries to execute your script on the command line (or via just loadfile or similar), then I would suggest doing the following.
Check the number of arguments the script was given. If no arguments were given, then your script was probably required, so don't do the stuff you don't want to do when the user requires your script:
local nargs = select("#", ...)
if(nargs > 0) then
lib.moveAround(...)
end
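If you are unsure what your chunk actually receives in the two cases, a throwaway debug line at the top of movee.lua will show it (sketch only):
-- temporary: show how many values the chunk received and what they are
print(select("#", ...), ...)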
Solved by replacing
lib.moveAround(...)
with
local argument = {...}
if argument[1] ~= "movee" and argument[2] ~= "movee" then
lib.moveAround(...)
end
require("movee")
will execute the code within movee.lua
lib.moveAround(...)
is part of that code. Hence if you require "movee" you call lib.moveAround
If the expected result is not to call it, remove that line from your code or don't require that file.

How to handle nonexistent variables passed to a proc

I would like to create a procedure like this simple example:
proc name {args} {
foreach val $args {
puts $val
}
}
But I would like the procedure to handle variables that don't exist, something like the code shown below:
proc name {args} {
foreach val $args {
if [info exists $val] {
puts $val
}
}
}
The problem is that the code is not executed: as soon as I call the procedure with a nonexistent variable, it immediately fails, before even entering the body, saying that the variable doesn't exist. Is that because the procedure evaluates its arguments before entering the body?
I can make it work by replacing args with several optional variables with predefined values, but that limits the procedure and makes it look bad.
Can I make a proc able to handle nonexistent variables?
You can't pass a variable as an argument: arguments have to be values. You can pass a variable name as an argument and use that as a reference to the variable inside the procedure. Example:
proc name args {
foreach varname $args {
upvar 1 $varname var
if {[info exists var]} {
puts $var
}
}
}
(The call to upvar creates a link between the variable whose name is the value of the variable varname outside the procedure and the variable called var inside the procedure. This is one way to "pass a variable to a procedure".)
Then you can do this:
% set foo 1 ; set baz 3
% name foo bar baz
1
3
Note that if you try to invoke the procedure as
% name $bar
where bar is undefined, the interpreter tries (and fails) to evaluate it before calling the procedure. That might be what you are seeing.
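For example (interactive session, assuming bar was never set):
% name $bar
can't read "bar": no such variable
% name bar
%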
Documentation:
upvar
If we look at the point where you are calling the command (procedures are commands; they're a subclass really) you'll see something like this in your code:
name $a $b $c
That's fine if all those variables exist, but if one doesn't, it will blow up even before name is called. In Tcl, $a means exactly “read the variable a and use its contents here”, unlike in some other languages where $ means “look out language, here comes a variable name!”
Because of this, we need to change the calling convention to be one that works with this:
name a b c
That's going to require the use of upvar. Like this:
proc name {args} {
foreach varName $args {
# Map the caller's named variable to the local name “v”
upvar 1 $varName v
# Now we can work with v in a simple way
if {[info exists v]} {
puts $v
}
}
}
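With that version, an interactive session (variable names purely illustrative) looks much like the earlier example:
% set a 1
1
% set c 3
3
% name a b c
1
3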
You made a mistake here
if [info exists $val]
When info exists is used, it should be given the variable name, not the variable's value.
Let's come to your actual question.
You can pass the arguments to the procedure as a key-value pair, then it is pretty simple.
proc user_info {args} {
#Converting the arguments into array
if {[catch {array set aArgs $args}]} {
puts "Please pass the arguments as key-value pair"
return 1
}
#Assume we need to ensure these 3 arguments are passed for sure.
set mandatoryArgs "-name -age -country"
foreach mArg $mandatoryArgs {
if {![info exists aArgs($mArg)]} {
puts "Missing mandatory argument '$mArg'"
return 1
}
}
}
user_info -name Dinesh
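With the mandatory-argument check above, the sample call reports the first missing key and returns; for instance:
user_info -name Dinesh
# prints: Missing mandatory argument '-age'
user_info -name Dinesh -age 30 -country India
# prints nothing; all mandatory keys are present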
