Performance - base case and time complexity in recursive algorithms
I have a clarification question regarding O(n) functions. I am using SICP.
Consider the factorial function from the book, which generates a recursive process. In pseudocode:
function factorial1(n) {
    if (n == 1) { return 1; }
    return n * factorial1(n - 1);
}
I have no idea how to measure the number of steps. That is, I don't know how a "step" is defined, so I used this statement from the book to define a step:
Thus, we can compute n! by computing (n-1)! and multiplying the result by n.
I took that to be what is meant by a step. As a concrete example, if I trace factorial(5):
- factorial(1) = 1 = 1 step (base case - constant time)
- factorial(2) = 2*factorial(1) = 2 steps
- factorial(3) = 3*factorial(2) = 3 steps
- factorial(4) = 4*factorial(3) = 4 steps
- factorial(5) = 5*factorial(4) = 5 steps
So I think it is indeed linear (the number of steps is proportional to n).
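To check this, here is a small sketch of my own (not from the book; factorial1Counted is just an illustrative name) that counts one step per call:

// Instrument factorial1 with a call counter, treating each call as one "step".
function factorial1Counted(n) {
    let steps = 0;
    function go(k) {
        steps += 1;                 // one step per call
        if (k === 1) { return 1; }  // base case
        return k * go(k - 1);       // reduce to (k-1)! and multiply by k
    }
    const result = go(n);
    return { result, steps };
}

console.log(factorial1Counted(5)); // { result: 120, steps: 5 }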
On the other hand, the factorial function I keep seeing elsewhere has a different base case:
function factorial2(n) {
    if (n == 0) { return 1; }
    return n * factorial2(n - 1);
}
This is the same as the first one, except that one extra computation (step) is added:
- factorial(0) = 1 = 1 step (base case - constant time)
- factorial(1) = 1*factorial(0) = 2 steps
- ...
Now I believe this is still O(n), but am I correct in saying that factorial2 is more like O(n + 1) (where the 1 is the base case), as opposed to factorial1, which is O(n) (including the base case)?
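For comparison, the same kind of counting sketch (again my own; factorial2Counted is an illustrative name) reports n + 1 calls for factorial2:

function factorial2Counted(n) {
    let steps = 0;
    function go(k) {
        steps += 1;                 // one step per call
        if (k === 0) { return 1; }  // base case moved to 0
        return k * go(k - 1);
    }
    const result = go(n);
    return { result, steps };
}

console.log(factorial2Counted(5)); // { result: 120, steps: 6 }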
One thing to note is that factorial1 is incorrect for n = 0: the argument underflows past zero, the base case is never reached, and the result is a stack overflow in typical implementations. factorial2 is correct for n = 0.
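As a minimal sketch (not part of either function above; factorial1Guarded is an illustrative name), one way to make that failure mode explicit is to reject inputs below 1 up front:

function factorial1Guarded(n) {
    if (!Number.isInteger(n) || n < 1) {
        throw new RangeError("factorial1 is only defined here for integers n >= 1");
    }
    if (n === 1) { return 1; }
    return n * factorial1Guarded(n - 1);
}

// factorial1Guarded(0) now throws a RangeError immediately, instead of
// recursing through -1, -2, ... until the call stack overflows.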
Setting that aside, your intuition is correct: factorial1 is O(n) and factorial2 is O(n + 1). However, since the effect of n dominates the constant term (the + 1), it is typical to simplify by saying it is O(n). The Wikipedia article on Big O notation describes this:
...the function g(x) appearing within the O(...) is typically chosen to be as simple as possible, omitting constant factors and lower order terms.
From that perspective, though, it is more accurate to say that these functions execute in pseudo-polynomial time. That means the running time is polynomial with respect to the numeric value of n, but exponential with respect to the number of bits required to represent that value. There is an excellent prior answer that describes the distinction.
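As a rough, assumed illustration of that distinction (not taken from the linked answer): if n is stored in b bits, its value can be as large as 2^b - 1, so the number of calls, which is proportional to n, can roughly double each time a bit is added.

// One step per call, as in factorial2 (n, n-1, ..., 0 is n + 1 calls), but
// counted iteratively here so large n doesn't overflow the stack.
function stepsForFactorial2(n) {
    let steps = 0;
    for (let k = n; k >= 0; k -= 1) {
        steps += 1;
    }
    return steps;
}

for (let bits = 1; bits <= 20; bits += 1) {
    const n = 2 ** bits - 1;   // largest value representable in `bits` bits
    console.log(bits, "bits ->", stepsForFactorial2(n), "steps");
}
// The step count roughly doubles with each additional bit: linear in the
// value of n, but exponential in the length of n's representation.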