Question
1. Give the computational complexity of the following piece of code in Big-Oh notation and explain how you arrived at your result:
for (i=1; i<=n; i=i+2){
// constant number of operations
}
2. Give the computational complexity of the following piece of code in Big-Oh notation and explain how you arrived at your result:
for (i=1; i<=n; i++){
for (j=i; j<=n; j=j+2){
// constant number of operations
}
}
3. Give the computational complexity of the following piece of code in Big-Oh notation and explain how you arrived at your result:
for (j=n; j>1; j=j/2){
// constant number of operations
}
4. Give the computational complexity of the following piece of code in Big-Oh notation and explain how you arrived at your result:
for (i=1; i<=n; i++){
for (j=n; j>1; j=j/2){
// constant number of operations
}
}
5. Is n^2 + 100 of complexity O(n^2)? Use the formal definition of Big-Oh to answer this question.
6. Is n^2 + 100 of complexity O(n^3)? Use the formal definition of Big-Oh to answer this question.
7. Is 2^(n+1) of complexity O(2^n)? Use the formal definition of Big-Oh to answer this question.
8. Is 2^(2n) of complexity O(2^n)? Use the formal definition of Big-Oh to answer this question.
9. Explain why the statement “The running time of the algorithm is at least O(n^2)” does not provide any information.
Explanation / Answer
ANS(1) Big-Oh gives an asymptotic upper bound on a function.
For the given loop
for (i=1; i<=n; i=i+2){ // constant number of operations }
i is incremented by 2 on every pass, so the loop body runs f(n) = ceil(n/2) times.
By the definition of Big-Oh, f(n) = O(g(n)) if there exist a positive constant c and a threshold n0 such that 0 <= f(n) <= c*g(n) for all n >= n0.
Taking g(n) = n and c = 1 gives n/2 <= 1*n for all n >= 1.
Computational complexity = O(n)
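As a sanity check, a minimal C counting sketch, assuming an example input size n = 1000 (the value is arbitrary), confirms that the body runs about n/2 times:

#include <stdio.h>

int main(void) {
    int n = 1000;                        /* assumed example input size */
    long count = 0;
    for (int i = 1; i <= n; i = i + 2) {
        count++;                         /* one pass through the loop body */
    }
    printf("n = %d, iterations = %ld, n/2 = %d\n", n, count, n / 2);
    return 0;
}

For n = 1000 this prints 500 iterations, matching f(n) = n/2.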
ANS(2)
loop 1: for (i=1; i<=n; i++){
loop 2: for (j=i; j<=n; j=j+2){ // constant number of operations }}
Let's see how many times loop 2 runs for each value of i. The exact count is floor((n-i)/2)+1, but approximating it by (n-i+1)/2 changes each row by at most a constant and does not affect the order of growth:

value of i        approximate iterations of loop 2
1                 n/2
2                 (n-1)/2
3                 (n-2)/2
...               ...
n                 1/2

SUM = n/2 + (n-1)/2 + ... + 1/2 = (1/2)[n + (n-1) + ... + 1] = n(n+1)/4
So the total work is f(n) = (n^2 + n)/4.
By the definition of Big-Oh, choosing c = 1 and g(n) = n^2 gives (n^2 + n)/4 <= n^2 for all n >= 1; n^2 is the dominating term.
Computational complexity = O(n^2)
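The same kind of counting sketch for the nested loops, again with an assumed n = 1000, shows the total is close to n^2/4:

#include <stdio.h>

int main(void) {
    int n = 1000;                        /* assumed example input size */
    long count = 0;
    for (int i = 1; i <= n; i++) {
        for (int j = i; j <= n; j = j + 2) {
            count++;                     /* one pass through the inner body */
        }
    }
    printf("iterations = %ld, n*n/4 = %ld\n", count, (long)n * n / 4);
    return 0;
}

For n = 1000 this reports 250500 iterations against n^2/4 = 250000; the gap is a lower-order term that Big-Oh ignores.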
ANS(3) for (j=n; j>1; j=j/2){ // constant number of operations }
Let's see how the loop runs: every time the loop executes, the value of j is halved.
After k iterations j is about n/2^k, and the loop stops once this reaches 1; solving n/2^k = 1 gives k = log2 n.
This k is the number of times the loop executes.
Computational complexity = O(log2 n) = O(log n)
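A counting sketch for the halving loop, with an assumed n = 1000:

#include <stdio.h>
#include <math.h>

int main(void) {
    int n = 1000;                        /* assumed example input size */
    long count = 0;
    for (int j = n; j > 1; j = j / 2) {
        count++;                         /* one halving step */
    }
    printf("iterations = %ld, log2(n) = %.2f\n", count, log2((double)n));
    return 0;
}

For n = 1000 this counts 9 iterations versus log2(1000) ≈ 9.97, i.e. roughly log2 n halving steps.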
ANS(4)
loop 1: for (i=1; i<=n; i++){
loop 2: for (j=n; j>1; j=j/2){ // constant number of operations }}
Let's see how the loops run.
As in ANS(3), every time loop 2 executes the value of j is halved, so loop 2 runs about k = log2 n times, and this count is the same for every value of i:

value of i        iterations of loop 2
1                 log2 n
2                 log2 n
3                 log2 n
...               ...
n                 log2 n

SUM = n * log2 n
This SUM is the total number of times the inner loop body executes.
Computational complexity = O(n log2 n) = O(n log n)
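The combined counting sketch, with an assumed n = 1000:

#include <stdio.h>
#include <math.h>

int main(void) {
    int n = 1000;                        /* assumed example input size */
    long count = 0;
    for (int i = 1; i <= n; i++) {
        for (int j = n; j > 1; j = j / 2) {
            count++;                     /* one pass through the inner body */
        }
    }
    printf("iterations = %ld, n*log2(n) = %.0f\n", count, n * log2((double)n));
    return 0;
}

For n = 1000 this counts 9000 iterations versus n*log2(n) ≈ 9966, consistent with O(n log n).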
ANS(5) f(n) = n^2 + 100; is it of complexity O(n^2)?
By the definition of Big-Oh, f(n) = O(g(n)) if f(n) <= c*g(n) for some constant c > 0 and all n >= n0.
Choosing c = 2, g(n) = n^2 and n0 = 10 satisfies the condition, since n^2 + 100 <= 2n^2 exactly when n^2 >= 100, i.e. for all n >= 10.
So f(n) = n^2 + 100 is of complexity O(n^2).
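The same check written out as a worked inequality, in LaTeX:

\[ n^2 + 100 \le 2n^2 \iff 100 \le n^2 \iff n \ge 10, \qquad \text{so } c = 2,\ n_0 = 10 \text{ witness } n^2 + 100 = O(n^2). \]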
ANS(6) f(n) = n^2 + 100; is it of complexity O(n^3)?
By the definition of Big-Oh, f(n) <= c*g(n) for some c > 0 and all n >= n0; here g(n) = n^3 eventually exceeds f(n) on its own.
Choosing c = 1, g(n) = n^3 and n0 = 5 satisfies the condition: at n = 5 both sides equal 125, and n^3 grows faster from there on.
So f(n) = n^2 + 100 is of complexity O(n^3).
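Worked out as an inequality, in LaTeX:

\[ n^2 + 100 \le n^3 \quad \text{for all } n \ge 5 \ (\text{at } n = 5 \text{ both sides equal } 125), \qquad \text{so } c = 1,\ n_0 = 5 \text{ witness } n^2 + 100 = O(n^3). \]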
ANS(7) f(n) = 2^(n+1); is it of complexity O(2^n)?
By the definition of Big-Oh, f(n) <= c*g(n) for some c > 0 and all n >= n0.
Since 2^(n+1) = 2 * 2^n, choosing c = 2 and g(n) = 2^n satisfies the condition for every n >= 1.
So f(n) = 2^(n+1) is of complexity O(2^n).
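The one-line calculation behind the choice of c, in LaTeX:

\[ 2^{n+1} = 2 \cdot 2^n \le c \cdot 2^n \quad \text{with } c = 2 \text{ for all } n \ge 1, \qquad \text{hence } 2^{n+1} = O(2^n). \]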
ANS(8) f(n) = 2^(2n); is it of complexity O(2^n)?
By the definition of Big-Oh we would need a constant c > 0 and an n0 with 2^(2n) <= c * 2^n for all n >= n0.
Dividing both sides by 2^n gives 2^n <= c, which fails as soon as n > log2 c, so no constant c can work.
So f(n) = 2^(2n) = 4^n is NOT of complexity O(2^n).
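The failure of the definition written out, in LaTeX:

\[ \frac{2^{2n}}{2^n} = 2^n \to \infty, \qquad \text{so no constant } c \text{ satisfies } 2^{2n} \le c \cdot 2^n \text{ for all large } n, \text{ and } 2^{2n} \ne O(2^n). \]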
ANS(9) “The running time of the algorithm is at least O(n^2)” does not provide any information.
Let T(n) be the running time of the algorithm A. What we care about are upper and lower bounds on T(n).
UPPER BOUND: Big-Oh itself expresses an upper bound, so a statement of the form T(n) >= O(n^2) says nothing about how large T(n) can get.
LOWER BOUND: formally the statement only says T(n) >= f(n) for some function f(n) that is O(n^2); but such an f(n) can be anything that grows no faster than n^2, for example a constant, or n, or even 0, so no lower bound on T(n) follows either.
Thus it can be concluded that "The running time of the algorithm is at least O(n^2)" is meaningless.
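A short formal way to see this, in LaTeX (the zero function is just one convenient witness):

\[ g(n) = 0 \in O(n^2) \quad \text{and} \quad T(n) \ge g(n) \text{ for every running time } T(n), \qquad \text{so "at least } O(n^2)\text{" excludes nothing.} \]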