Closed 3 years ago: this question needs details or clarity and is not accepting answers.
function maximum (a)
  local mi = 1      -- maximum index
  local m = a[mi]   -- maximum value
  for i, val in ipairs(a) do
    if val > m then
      mi = i
      m = val
    end
  end
  return m, mi
end
print(maximum({8,10,23,12,5}))  --> 23 3
I can't understand this. Would someone explain this example? It's confusing to me; Programming in Lua (first edition) often uses examples I find hard to follow.
The function takes one argument, which is a table that is stored in the variable a.
The function iterates (loops) over each value in the table a, using the ipairs function to return the index and value from the table (temporarily stored in i and val).
Inside the loop the value from the table is compared against m, and if val is larger than m then m is assigned the value of val and mi is assigned the value of i.
Then the function returns the two values m and mi.
In short, the function finds the maximum value and its index in the table passed as its argument.
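For example, here is a minimal sketch that captures the two return values separately instead of passing them straight to print:

local value, index = maximum({8, 10, 23, 12, 5})
print(value)  --> 23
print(index)  --> 3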
Closed 5 years ago: this question needs details or clarity and is not accepting answers.
I've been struggling with the very first exercise question from opendatastructures.org regarding data structures. The question goes like this:
A Dyck word is a sequence of +1's and -1's with the property that the
sum of any prefix of the sequence is never negative. For example,
+1,−1,+1,−1 is a Dyck word, but +1,−1,−1,+1 is not a Dyck word since the prefix +1 − 1 − 1 < 0. Describe any relationship between Dyck
words and Stack push(x) and pop() operations.
How does one find the relationship between Dyck words and these operations?
One way to check whether a word is a Dyck word is to use a stack: push every time you encounter a +1 and pop every time you encounter a -1. If you ever try to pop from an empty stack, it's not a Dyck word.
Consider the following pseudocode (assume that a word is represented as an array of integers, since the question isn't really about parsing):
boolean isDyck(int[] word) {
    Object dummy = new Object();          // just so you have something to push
    Stack<Object> stack = new Stack<>();  // java.util.Stack
    for (int item : word) {
        if (item > 0) {
            stack.push(dummy);            // +1 -> push
        } else {
            if (stack.isEmpty()) {        // -1 with nothing to pop: some prefix sum went negative
                return false;
            }
            stack.pop();                  // -1 -> pop
        }
    }
    return true;
}
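Used on the two sequences from the question (a small sketch; the int[] encoding of +1/-1 follows the pseudocode above):

System.out.println(isDyck(new int[]{+1, -1, +1, -1})); // true  -> a Dyck word
System.out.println(isDyck(new int[]{+1, -1, -1, +1})); // false -> the prefix +1, -1, -1 dips below zero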
Closed 5 years ago: this question needs details or clarity and is not accepting answers.
Here is the code:
var n int
a, _ := fmt.Scanf("%d", &n)
Then a == 1 and n has been changed by the input. Why does this use of := with fmt.Scanf in Go always return 1?
fmt.Scanf() returns the number of successfully scanned items:
Scanf scans text read from standard input, storing successive space-separated values into successive arguments as determined by the format. It returns the number of items successfully scanned. If that is less than the number of arguments, err will report why.
So if your input is a valid integer number fitting into an int, fmt.Scanf() will succeed to parse it and store it in n, and so it will return 1.
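For example, a minimal sketch with a valid input string, using fmt.Sscanf so the input is visible in the code (analogous to the failing example below):

var n int
a, err := fmt.Sscanf("42", "%d", &n)
fmt.Println(a, n, err) // 1 42 <nil>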
Should you input an invalid number (e.g. the string value "a"), scanning would not succeed, so 0 would be returned along with a non-nil error, like in this example:
var n int
a, err := fmt.Sscanf("a", "%d", &n)
fmt.Println(a, err)
Which outputs (try it on the Go Playground):
0 expected integer
Closed 6 years ago: this question was caused by a typo or a problem that can no longer be reproduced.
Is there a way to index a table defined by a variable in Lua? For example:
function checkTable(t, k)
  return t[k]
end
checkTable(coffee, 1)
to return the value of any key of any table.
However, this would return item 1 of the table "t", and not the table "coffee"; the function is not recognizing "t" as a variable and is instead looking for the literal table "t". How should this be done?
See §2.2 of the 5.3 reference about environments (which were introduced in 5.2):
As will be discussed in §3.2 and §3.3.3, any reference to a free name (that is, a name not bound to any declaration) var is syntactically translated to _ENV.var. Moreover, every chunk is compiled in the scope of an external local variable named _ENV (see §3.3.2), so _ENV itself is never a free name in a chunk.
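For example, a minimal sketch (Lua 5.2+) of what that translation means:

x = 10         -- a free name: compiled as _ENV.x = 10
print(_ENV.x)  --> 10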
So a proper implementation of your function would look like:
function checkTable(t, k)
  local tbl = _ENV[t]
  if tbl ~= nil then
    return tbl[k]
  else
    return nil
  end
end
However, this function won't have access to the environment of its callers unless you either pass that environment to it, or you define it within another function (a closure) so that it accesses _ENV as an upvalue.
There is also the built-in global variable _G, which is the global environment table containing all globals.
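A minimal usage sketch, assuming the table you want to look up is a global and you pass its name as a string (the coffee table and its contents are made up for illustration):

coffee = { "espresso", "latte", "mocha" }  -- a global, reachable through _ENV / _G

print(checkTable("coffee", 1))  --> espresso
print(checkTable("coffee", 3))  --> mocha
print(checkTable("tea", 2))     --> nil (no global named "tea")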
Closed 7 years ago: this question is opinion-based and is not accepting answers.
I am struggling to see whether there is an obvious advantage to one method over the other when passing values into a function. My code below may not be the best example to explain the decision I'm trying to make, but it is, in my opinion, the easiest to understand.
Variadic Parameter Approach
func arithmeticMean(numbers: Double...) -> Double {
    var total: Double = 0
    for value in numbers {
        total += value
    }
    return total / Double(numbers.count)
}
arithmeticMean(5, 10, 15)
Array Parameter Approach
func arithmeticMean(numbers: [Double]) -> Double {
    var total: Double = 0
    for value in numbers {
        total += value
    }
    return total / Double(numbers.count)
}
arithmeticMean([5, 10, 15])
Is either of the two techniques preferred? If so, why (speed, reliability or just ease of reading)? Thanks.
I think there is no speed difference, because inside the function you use the variadic parameter just like an array.
If the parameter count is small (say, fewer than 5), a variadic parameter may be the better choice because it is easier to read.
If the parameter count is large, an array is the better choice.
Also be aware that variadic parameters have some limitations (see the sketch after these points):
A function may have at most one variadic parameter, and it must always appear last in the parameter list, to avoid ambiguity when calling the function with multiple parameters.
If your function has one or more parameters with a default value, and also has a variadic parameter, place the variadic parameter after all the defaulted parameters at the very end of the list.
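For example, a minimal sketch of that arrangement (the offset parameter is made up for illustration; call-site labels follow current Swift conventions):

// One defaulted parameter, then the single variadic parameter at the very end.
func arithmeticMean(offset: Double = 0, numbers: Double...) -> Double {
    var total: Double = 0
    for value in numbers {
        total += value
    }
    return offset + total / Double(numbers.count)
}

arithmeticMean(numbers: 5, 10, 15)             // offset falls back to its default of 0
arithmeticMean(offset: 1, numbers: 5, 10, 15)  // the defaulted parameter supplied explicitly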
This is just my opinion; I hope it helps.
Closed 8 years ago: this question needs details or clarity and is not accepting answers.
Hi, this is my first Lua code, but I get an error. Please help me fix it; thanks in advance if you get this working. I have a feeling it's a small thing I'm missing.
class 'Autochat'
TalkTimer = Timer()
local TalkDelay = 1 -- in minutes
local active = 1
function
  if active = 0 then
    return
  end
  if active ~= "0" then
    if (TalkTimer:GetSeconds() > (60 * timeDelay)) then
      Chat:Broadcast("Hi the admin is offline.", Colors(0, 255, 0))
      TalkTimer:Restart()
    end
  end
end
Autochat = Autochat()
The function is missing a name. Lua reads to the next line looking for the function's name and gets confused when it finds an if statement.
Also, the first if statement should be if active == 0 then because == is the comparison operator.
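A minimal corrected sketch of the function itself, applying those two fixes (the method name OnTick is made up, and it assumes timeDelay was meant to be the TalkDelay variable declared earlier; the surrounding framework calls are left as in the question):

function Autochat:OnTick()   -- the function now has a name (hypothetical)
  if active == 0 then        -- == is the comparison operator, = is assignment
    return
  end
  if active ~= "0" then
    if TalkTimer:GetSeconds() > (60 * TalkDelay) then  -- TalkDelay, as declared above
      Chat:Broadcast("Hi the admin is offline.", Colors(0, 255, 0))
      TalkTimer:Restart()
    end
  end
end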