Added on 2019-09-21


/* Author: Yaman Kumar. For any further assistance contact: mr.yamankumar@gmail.com */

Ans. 1) Ryan is not right: an O(n log n)-time solution need not always be faster than Brandon's O(n^2) solution. Big-Oh notation hides constant factors; it only gives an upper bound on the growth of the number of operations as a function of n. The actual number of operations includes a constant factor on top of this bound. That constant factor becomes negligible once n gets large, but it plays a significant role when n is small. This is what Brandon observed: his inputs must have been small, so the overall running time depended largely on the constant factors.

Ans. 2)
a. 200n^3 > 5n log n + 4n > 10n + 2n > 4 log n
b. 60000n^6 > 7n log n > 8n + 9 = 6n (equal asymptotically, not as functions)
c. 3^(n-1) > n^90 > n^3 - 2000 > 2n^2 + 200n^2 log n > 2^100
d. 2^(n+2) > 10^(log n) > 4n > 3 log n > 63

Ans. 3)
a) n^4 + 3n <= 4n^4 for all n >= 1. Thus n^4 + 3n is O(n^4).
b) 15n^16 + 3n log n + 2n <= 17n^16 for all n >= 1. Thus 15n^16 + 3n log n + 2n is O(n^16).
c) 3n log n + 2 log n <= 4n log n for all n >= 1. Thus 3n log n + 2 log n is O(n log n).
d) 12n*3^n - 50n <= 12n*3^n for all n >= 1. Thus 12n*3^n - 50n is O(n*3^n).

Ans. 4)
a) 11n^2 <= 5 log n + 12n^2 <= 13n^2 for all n >= 3. (At n = 2 the upper bound fails: 5 log 2 + 48 = 53 > 52, so n0 = 1 does not work.) Here k0 = 11, k1 = 13, n0 = 3. Thus 5 log n + 12n^2 is Θ(n^2).
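The bounds claimed in Ans. 3 and Ans. 4(a) can be spot-checked numerically over a finite range (a sketch only; a finite check does not prove the bound for all n, and log is taken base 2):

```python
import math

def holds_upper(f, g, k, n0, n_max=2000):
    """Check that f(n) <= k*g(n) for every integer n in [n0, n_max]."""
    return all(f(n) <= k * g(n) for n in range(n0, n_max + 1))

# Ans. 3(a): n^4 + 3n <= 4*n^4 for all n >= 1
print(holds_upper(lambda n: n**4 + 3*n, lambda n: n**4, 4, 1))  # True

# Ans. 4(a): 11n^2 <= 5*log n + 12n^2 <= 13n^2, checked from n0 = 3
lower = holds_upper(lambda n: 11 * n * n,
                    lambda n: 5 * math.log2(n) + 12 * n * n, 1, 3)
upper = holds_upper(lambda n: 5 * math.log2(n) + 12 * n * n,
                    lambda n: n * n, 13, 3)
print(lower, upper)  # True True

# ...and the upper bound indeed fails at n = 2:
print(5 * math.log2(2) + 12 * 4 <= 13 * 4)  # False (53 > 52)
```

This is why n0 = 3 (rather than 1) is needed for part (a) with k1 = 13.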

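The constant-factor point from Ans. 1 can be made concrete with a small sketch. The operation counts below (100*n*log2(n) versus n*n) are hypothetical, chosen only to illustrate how an O(n^2) algorithm can beat an O(n log n) one for small n:

```python
import math

def cost_nlogn(n):
    # hypothetical O(n log n) algorithm with a large constant factor
    return 100 * n * math.log2(n)

def cost_quadratic(n):
    # hypothetical O(n^2) algorithm with constant factor 1
    return n * n

# For small n, the "asymptotically slower" quadratic algorithm is cheaper:
print(cost_quadratic(100) < cost_nlogn(100))      # True
# Eventually the asymptotics win out:
print(cost_quadratic(10000) < cost_nlogn(10000))  # False
```

With these particular constants the crossover happens near n = 1000; different constants move the crossover, which is exactly the information big-Oh discards.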
b) 6n log n <= 6n log n + 4n <= 20n log n for all n >= 2. Here k0 = 6, k1 = 20, n0 = 2. Thus 6n log n + 4n is Θ(n log n).

Ans. 5)
a) 2^(n+5) is O(2^n). If f(n) is O(2^n), then 0 <= f(n) <= k*2^n for all n >= n0. This holds for 2^(n+5), since 0 <= 2^(n+5) = 32*2^n for all n >= 1. Thus n0 = 1 and k = 32.
b) n^2 is Ω(n log n). If f(n) is Ω(n log n), then 0 <= k*(n log n) <= f(n) for all n >= n0. This holds for n^2, since 0 <= n log n <= n^2 for all n >= 1. Thus n0 = 1 and k = 1.

Ans. 6)
a) This has order O(n^2). Justification: let n = arr.length. In the worst case, i.e. when arr[i] + arr[j] is never zero for any (i, j), the following takes place:
For i = 0, the inner loop runs from j = 0 to j = n, executing n operations.
For i = 1, the inner loop runs from j = 1 to j = n, executing n - 1 operations.
For i = 2, the inner loop runs from j = 2 to j = n, executing n - 2 operations.
...

For i = n - 1, the inner loop runs from j = n - 1 to j = n, executing 1 operation.
Summing over all iterations of the outer loop:
Total operations = n + (n - 1) + (n - 2) + ... + 1 = n(n + 1)/2 = n^2/2 + n/2.
n^2/2 + n/2 is O(n^2): if f(n) is O(n^2), then 0 <= f(n) <= k*n^2 for all n >= n0, and here 0 <= n^2/2 + n/2 <= n^2 for all n >= 1. Thus n0 = 1 and k = 1.
b) The running time of this is O(1). For i = 1 the loop body executes; at i = 2 the condition i % 2 == 0 becomes true, the loop breaks, and 1 is returned. The number of operations carried out is therefore a small constant (about 2 iterations) independent of n, so the complexity is O(1).
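The code analyzed in Ans. 6(a) is not reproduced in this document; the sketch below is a plausible reconstruction of such a pair-scanning loop (the early return on arr[i] + arr[j] == 0 is an assumption), instrumented to confirm the worst-case count of n + (n-1) + ... + 1 = n(n+1)/2 inner iterations:

```python
def count_inner_iterations(arr):
    """Nested pair scan; returns the number of inner-loop iterations performed."""
    n = len(arr)
    ops = 0
    for i in range(n):
        for j in range(i, n):        # j starts at i, matching the analysis above
            ops += 1
            if arr[i] + arr[j] == 0:
                return ops           # early exit (never taken in the worst case)
    return ops

# An all-positive input means arr[i] + arr[j] is never zero: the worst case.
n = 10
print(count_inner_iterations([1] * n))  # 55, i.e. n*(n+1)//2
```

The instrumented count matches n(n+1)/2 for every n, which is the n^2/2 + n/2 total derived above.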
