Write Recurrence for Given Function

I am trying to write the recurrence relation for the running time of the following function:
function G(n):
    if n > 0 then:
        x = 0
        for i = 1 to n:
            x = x + 1
        G(n-1)
    end if
What I came up with was:
T(n) = 1 if n <= 0
T(n) = T(n-1) + 1 if n>0
However I was told that this was incorrect and I don't know why or what the correct solution would be. Any help is greatly appreciated!

T(n) = 1 if n <= 0
T(n) = T(n-1) + O(n) if n>0
Instead of O(1), it should be O(n), because you are looping from 1 to n
If you solve the recurrence, the overall complexity will be O(n²).
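To see why, unroll the recurrence: T(n) = T(n-1) + cn = c(n + (n-1) + ... + 1) + T(0) = c·n(n+1)/2 + T(0), which is Θ(n²). Below is a small sketch of my own (not part of the original answer) that counts how many times the inner statement x = x + 1 runs in G(n) and checks the count against n(n+1)/2:

# Count the inner-loop steps of G(n); the loop contributes n steps at the top
# level and G then recurses on n-1, matching T(n) = T(n-1) + n.
def count_ops(n):
    if n <= 0:
        return 0
    return n + count_ops(n - 1)

for n in (5, 10, 100):
    assert count_ops(n) == n * (n + 1) // 2  # closed form of the sum
    print(n, count_ops(n))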


Solve Recurrence for T(n) = 7T(n/7) + n

I'm trying to solve the recurrence for T(n) = 7T(n/7) + n.
I know using the Master Theorem it's O(n log₇ n), but I want to solve it by substitution.
At level i, I get: 7^i T(n/7^i) + (n + 7n + 7^2 n + ... + 7^i n)
By setting i = log₇ n, the above becomes: 7^(log₇ n) * T(1) + (n + 7n + 7^2 n + ... + 7^(log₇ n) n)
Since 7^(log₇ n) = n, the above finally becomes n + (n + 7n + 7^2 n + ... + n*n)
This solves to O(n^2) to me not O(nlog7n), any idea what's wrong?
T(n) = 7T(n/7) + n = 7[7T(n/7^2) + n/7] + n = 7^2 T(n/7^2) + 2n = ... = 7^k T(n/7^k) + kn
n/7^k = c ⇒ k = O(log n)
⇒ T(n) = O(n log n)
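A quick numeric check of my own (not part of the original answer): at depth i there are 7^i subproblems, each of size n/7^i, so every level contributes exactly n work; this is where the n + 7n + 7^2 n + ... terms in the question go wrong. Evaluating T(n) = 7T(n/7) + n exactly and dividing by n·log₇ n shows the ratio settling at a constant:

import math

def T(n):
    # assume T(n) = 0 for n < 7; the exact base-case constant does not matter
    return 0 if n < 7 else 7 * T(n // 7) + n

for k in range(2, 8):
    n = 7 ** k
    print(n, T(n), round(T(n) / (n * math.log(n, 7)), 3))  # ratio stays at 1.0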

Asymptotic growth relation between 2^n and 2^(n/2)

There's this question I'm solving where
f(n) = 2^n
g(n) = 2^(n/2)
f(n) = ?(g(n))
I've found many answers giving Ω and ω.
But shouldn't it be f(n) = Θ(g(n)), since a constant factor shouldn't affect the growth of the function?
f(n) = Θ(g(n)) if and only if f(n) = Ω(g(n)) and f(n) = O(g(n)), but we can see that f(n) = O(g(n)) simply doesn't hold, as explained below.
O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c g(n) for all n >= n0}
2^n <= c 2^(n/2) would require 2^(n/2) <= c, and since 2^(n/2) grows without bound, no constants c and n0 can make the definition hold.
import numpy as np
import matplotlib.pyplot as plt

# Plot both functions to see how much faster 2^n grows than 2^(n/2)
n = np.linspace(1, 10)
plt.plot(n, 2 ** n, label="2 ** n")
plt.plot(n, 2 ** (n/2), label="2 ** (n/2)")
plt.legend(loc='upper left')
plt.show()
Reference: CLRS
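A small sanity check of my own (not from CLRS): the ratio f(n)/g(n) = 2^n / 2^(n/2) = 2^(n/2) grows without bound, so no constant c can satisfy 2^n <= c·2^(n/2) for all large n; hence f(n) = ω(g(n)) rather than Θ(g(n)).

# The ratio doubles every time n grows by 2, so it cannot be bounded by any constant.
for n in (10, 20, 40, 80):
    print(n, 2**n / 2**(n/2))  # equals 2**(n/2): 32.0, 1024.0, 1048576.0, ...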

Recurrence T(n) = 3T(n/3) + Θ(log₃ n) [closed]

Sorry, I have tried a lot to solve this recurrence equation
T(n) = 3T(n/3) + Θ(log₃ n)
with the substitution method, but I cannot get the required result:
1) T(n) = O(n log n)
2) Induction
Base: for n = 1 → 1·log 1 + 1 = 1 = T(1)
Inductive step: assume T(k) = k log k + k for every k < n
Use k = n/3:
T(n) = 3T(n/3) + Θ(log₃ n)
     = 3[(n/3) log(n/3) + n/3] + Θ(log₃ n)
     = n log(n/3) + n + Θ(log₃ n)
     = n(log n - log 3) + n + Θ(log₃ n)
     = n log n - n log 3 + n + Θ(log₃ n)
Firstly, we can (eventually) ignore the base 3 in the Theta-notation, as it amounts to a multiplicative factor and is therefore irrelevant. Then we can try the following method:
1. Hypothesis by inspection:
If we re-substitute T into itself multiple times, we get:
T(n) = 3T(n/3) + log n
     = 3^m T(n/3^m) + [3^0 log n + 3^1 log(n/3) + ... + 3^(m-1) log(n/3^(m-1))]
What is the upper limit m? We need to assume that T(n) has a stopping condition, i.e. some value of n where it stops recursing. Assuming that it is n = 1 (it really doesn't matter, as long as it's a constant much smaller than n), we can take m = log₃ n. Continuing (and briefly restoring the base 3), each term in the bracket becomes 3^i (log₃ n - i), and summing the series gives:
T(n) = n·T(1) + (3n - 2 log₃ n - 3) / 4
Surprisingly, the answer is not Θ(n log n) but Θ(n): most of the work is done near the leaves of the recursion, where the log term is small but the number of subproblems is large.
2. Induction base case
We don't use induction to prove the final result, but the series result we deduced by inspecting the behaviour of the expansion.
For the base case n/3 = 1, i.e. n = 3, the recurrence gives T(3) = 3T(1) + log₃ 3 = 3T(1) + 1, and the summation result gives 3·T(1) + (3·3 - 2·1 - 3)/4 = 3T(1) + 1.
Which is consistent.
3. Induction recurrence
Assume the summation result holds for n/3 and substitute it into the recurrence:
T(n) = 3[(n/3)·T(1) + (3(n/3) - 2 log₃(n/3) - 3)/4] + log₃ n
     = n·T(1) + (3n - 6 log₃ n + 6 - 9)/4 + log₃ n
     = n·T(1) + (3n - 2 log₃ n - 3)/4
Again, consistent. Thus by induction the summation result is correct, and T(n) is indeed Θ(n).
4. Numerical tests:
Just in case you still cannot believe that it is Ө(n), here is a numerical test to prove the result.
Javascript code:
function T(n) {
  // Base case: T(n) = 0 for n <= 1. Math.log is the natural log, which only
  // changes the constant factor, not the growth rate.
  return n <= 1 ? 0 : 3 * T(Math.floor(n / 3)) + Math.log(n);
}
Results:
n            T(n)
---------------------------
10           5.598421959
100          66.33828212
1000         702.3597066
10000        6450.185742
100000       63745.45154
1000000      580674.1886
10000000     8924162.276
100000000    81068207.64
Graph: plotting T(n) against n makes the linear relationship clear.
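For completeness, here is a sketch of my own (not part of the original answer) that checks the exact closed form used above, T(3^m) = (3·3^m - 2m - 3)/4 with T(1) = 0, using exact arithmetic on powers of 3:

from fractions import Fraction

def T(m):
    # T evaluated at n = 3^m, with log3(3^m) = m exactly and T(3^0) = T(1) = 0
    return Fraction(0) if m == 0 else 3 * T(m - 1) + m

for m in range(1, 12):
    n = 3 ** m
    assert T(m) == Fraction(3 * n - 2 * m - 3, 4)
print("closed form holds: T(n) = (3n - 2*log3(n) - 3)/4, i.e. Theta(n)")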

Triangular Vertices - Lua calculation?

I am currently determined to complete a problem in Lua and have no idea where to begin. I was thinking about beginning with a modulus operator. I am looking for advice from experienced Lua programmers on how to program this, and mainly how I can work out the mathematical side of the problem.
Source of the question... (http://www.eecs.qmul.ac.uk/~pbo/ACM/archive/00209.html)
Gratitude will be shown to anyone who answers correctly.
Thank-you.
function get_left(max)
  -- Collect the leftmost vertex number of each row of the triangular grid:
  -- 1, 2, 4, 7, 11, ... (row r starts at 1 + r*(r-1)/2).
  local i = 0
  local j = 1
  local ls = {}
  repeat
    i = i + 1
    j = j + i - 1
    ls[i] = j
    print(ls[i], " ls")
  until j >= max
  return ls
end

a = get_left(27)
if point1 is between a[i] and a[i+1]
and point2 is also between a[i] and a[i+1], then it is on the same line
else if point2 is between a[i + n] and a[i + n + 1]
then if point2 is at a[i + n] + (a[i] - point1) + n + 1, it is in a straight line right above it
else if point2 is at a[i + n] + (a[i] - point1) + n - 1, it is in a straight line left above it
If n (or -n) is the same for all points, the distance between the points is equal.
That is, if I didn't make any logical errors; you will probably have to check more cases for it to work properly.
This is more a mathematical question than a Lua one, so I recommend adding a math tag to it.
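To make the row-boundary idea concrete, here is a small sketch of my own (not from the answer; written in Python for brevity, with a function name of my choosing, but it translates directly to Lua) that maps a vertex number to its row and offset using the same 1, 2, 4, 7, 11, ... boundaries that get_left computes:

def row_and_offset(p):
    # Row r starts at vertex 1 + r*(r-1)/2 and contains r vertices.
    r, start = 1, 1
    while start + r <= p:   # advance while p lies beyond the current row
        start += r
        r += 1
    return r, p - start     # offset 0 .. r-1 within row r

print(row_and_offset(1))   # (1, 0)
print(row_and_offset(5))   # (3, 1): second vertex of row 3
print(row_and_offset(27))  # (7, 5)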

Runtime of while loop pseudocode

I have some pseudocode for which I'm trying to do a detailed analysis: determine the runtime of each line and the overall asymptotic complexity:
sum = 0
i = 1
while (i ≤ n) {
    sum = sum + i
    i = 2i
}
return sum
My assignment requires that I write the cost/runtime for every line, add these together, and find a Big-Oh notation for the runtime. My analysis looks like this for the moment:
sum = 0              1
long i = 1           1
while (i ≤ n) {      log n + 1
    sum = sum + i    n log n
    i = 2i           n log n
}
return sum           1

=> 2 n log n + log n + 4, i.e. O(n log n)
Is this correct? Also, should I use n^2 for the while loop instead?
Because i doubles on every iteration, the loop runs floor(log₂(n)) + 1 times, so the runtime is
O(floor(log₂(n)) + 1) = O(log n).
Let's step through your pseudocode. Consider the case that n = 5.
iteration#   i   log₂(i)   n
-----------------------------
    1        1      0      5
    2        2      1      5
    3        4      2      5
By inspection we see that
iteration# = log₂(i) + 1
So in summary:
sum = 0              // O(1)
i = 1                // O(1)
while (i ≤ n) {      // executes floor(log₂(n)) + 1 times
    sum = sum + i    // 1 flop + 1 mem op = O(1)
    i = 2i           // 1 flop + 1 mem op = O(1)
}
return sum           // 1 mem op = O(1)
Adding these up gives O(log n) in total, not O(n log n).
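As a quick check of my own (not part of the original answer), we can count the iterations of the doubling loop and compare against floor(log₂(n)) + 1:

import math

def loop_iterations(n):
    # Mirrors the pseudocode: i starts at 1 and doubles until it exceeds n.
    i, count = 1, 0
    while i <= n:
        count += 1
        i *= 2
    return count

for n in (1, 5, 16, 1000, 10**6):
    assert loop_iterations(n) == math.floor(math.log2(n)) + 1
    print(n, loop_iterations(n))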
