Introduction to Algorithms (2nd edition). Thomas Cormen, Charles Leiserson, Ronald Rivest
It is not contradictory, however, to say that the worst-case running time of insertion sort is Ω(n²), since there exists an input that causes the algorithm to take Ω(n²) time. When we say that the running time (no modifier) of an algorithm is Ω(g(n)), we mean that no matter what particular input of size n is chosen for each value of n, the running time on that input is at least a constant times g(n), for sufficiently large n.

Asymptotic notation in equations and inequalities

We have already seen how asymptotic notation can be used within mathematical formulas. For example, in introducing O-notation, we wrote "n = O(n²)." We might also write 2n² + 3n + 1 = 2n² + Θ(n). How do we interpret such formulas?

When the asymptotic notation stands alone on the right-hand side of an equation (or inequality), as in n = O(n²), we have already defined the equal sign to mean set membership: n ∈ O(n²).
In general, however, when asymptotic notation appears in a formula, we interpret it as standing for some anonymous function that we do not care to name. For example, the formula 2n² + 3n + 1 = 2n² + Θ(n) means that 2n² + 3n + 1 = 2n² + f(n), where f(n) is some function in the set Θ(n). In this case, f(n) = 3n + 1, which indeed is in Θ(n).

Using asymptotic notation in this manner can help eliminate inessential detail and clutter in an equation. For example, in Chapter 2 we expressed the worst-case running time of merge sort as the recurrence

T(n) = 2T(n/2) + Θ(n).

If we are interested only in the asymptotic behavior of T(n), there is no point in specifying all the lower-order terms exactly; they are all understood to be included in the anonymous function denoted by the term Θ(n).

The number of anonymous functions in an expression is understood to be equal to the number of times the asymptotic notation appears.
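Returning to the merge-sort recurrence above: as a small numeric sketch (not part of the text), suppose the anonymous Θ(n) term were exactly n and T(1) = 1. For n a power of 2, the recurrence then solves to T(n) = n lg n + n, which is indeed Θ(n lg n); the lower-order +n term is exactly the kind of detail the Θ(n) in the recurrence lets us leave unspecified.

```python
import math

def T(n):
    # Merge-sort recurrence with a hypothetical concrete driving term n:
    # T(n) = 2 T(n/2) + n, base case T(1) = 1; n must be a power of 2.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for n in [4, 64, 1024]:
    closed_form = n * int(math.log2(n)) + n   # n lg n + n
    print(n, T(n), closed_form)               # the two columns agree
```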
For example, in the expression

∑_{i=1}^{n} O(i)

there is only a single anonymous function (a function of i). This expression is thus not the same as O(1) + O(2) + ··· + O(n), which doesn't really have a clean interpretation.

In some cases, asymptotic notation appears on the left-hand side of an equation, as in

2n² + Θ(n) = Θ(n²).

We interpret such equations using the following rule: No matter how the anonymous functions are chosen on the left of the equal sign, there is a way to choose the anonymous functions on the right of the equal sign to make the equation valid.
Thus, the meaning of our example is that for any function f(n) ∈ Θ(n), there is some function g(n) ∈ Θ(n²) such that 2n² + f(n) = g(n) for all n. In other words, the right-hand side of an equation provides a coarser level of detail than the left-hand side.

A number of such relationships can be chained together, as in

2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²).

We can interpret each equation separately by the rule above. The first equation says that there is some function f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n) for all n. The second equation says that for any function g(n) ∈ Θ(n) (such as the f(n) just mentioned), there is some function h(n) ∈ Θ(n²) such that 2n² + g(n) = h(n) for all n. Note that this interpretation implies that 2n² + 3n + 1 = Θ(n²), which is what the chaining of equations intuitively gives us.

o-notation

The asymptotic upper bound provided by O-notation may or may not be asymptotically tight. The bound 2n² = O(n²) is asymptotically tight, but the bound 2n = O(n²) is not.
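A quick numeric sketch (not part of the text) of what "tight" means here: the ratio f(n)/g(n) stays near a positive constant when the bound is tight, but drains toward zero when it is not.

```python
def ratio(f, g, n):
    return f(n) / g(n)

for n in [10, 1000, 100000]:
    tight = ratio(lambda n: 2 * n * n, lambda n: n * n, n)  # 2n^2 vs n^2
    loose = ratio(lambda n: 2 * n, lambda n: n * n, n)      # 2n  vs n^2
    print(n, tight, loose)  # tight stays at 2.0; loose shrinks toward 0
```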
We use o-notation to denote an upper bound that is not asymptotically tight. We formally define o(g(n)) ("little-oh of g of n") as the set

o(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n₀ > 0 such that 0 ≤ f(n) < cg(n) for all n ≥ n₀}.

For example, 2n = o(n²), but 2n² ≠ o(n²).

The definitions of O-notation and o-notation are similar. The main difference is that in f(n) = O(g(n)), the bound 0 ≤ f(n) ≤ cg(n) holds for some constant c > 0, but in f(n) = o(g(n)), the bound 0 ≤ f(n) < cg(n) holds for all constants c > 0. Intuitively, in the o-notation, the function f(n) becomes insignificant relative to g(n) as n approaches infinity; that is,

lim_{n→∞} f(n)/g(n) = 0.     (3.1)

Some authors use this limit as a definition of the o-notation; the definition in this book also restricts the anonymous functions to be asymptotically nonnegative.

ω-notation

By analogy, ω-notation is to Ω-notation as o-notation is to O-notation.
We use ω-notation to denote a lower bound that is not asymptotically tight. One way to define it is by

f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).

Formally, however, we define ω(g(n)) ("little-omega of g of n") as the set

ω(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n₀ > 0 such that 0 ≤ cg(n) < f(n) for all n ≥ n₀}.

For example, n²/2 = ω(n), but n²/2 ≠ ω(n²). The relation f(n) = ω(g(n)) implies that

lim_{n→∞} f(n)/g(n) = ∞,

if the limit exists. That is, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity.

Comparison of functions

Many of the relational properties of real numbers apply to asymptotic comparisons as well. For the following, assume that f(n) and g(n) are asymptotically positive.

Transitivity:

• f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)),
• f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n)),
• f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) = Ω(h(n)),
• f(n) = o(g(n)) and g(n) = o(h(n)) imply f(n) = o(h(n)),
• f(n) = ω(g(n)) and g(n) = ω(h(n)) imply f(n) = ω(h(n)).

Reflexivity:

• f(n) = Θ(f(n)),
• f(n) = O(f(n)),
• f(n) = Ω(f(n)).

Symmetry:

• f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).

Transpose symmetry:

• f(n) = O(g(n)) if and only if g(n) = Ω(f(n)),
• f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Because these properties hold for asymptotic notations, one can draw an analogy between the asymptotic comparison of two functions f and g and the comparison of two real numbers a and b:

f(n) = O(g(n)) ≈ a ≤ b,
f(n) = Ω(g(n)) ≈ a ≥ b,
f(n) = Θ(g(n)) ≈ a = b,
f(n) = o(g(n)) ≈ a < b,
f(n) = ω(g(n)) ≈ a > b.

We say that f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)), and f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)).

One property of real numbers, however, does not carry over to asymptotic notation:

• Trichotomy: For any two real numbers a and b, exactly one of the following must hold: a < b, a = b, or a > b.

Although any two real numbers can be compared, not all functions are asymptotically comparable.
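A numeric sketch (not part of the text) of how this failure of trichotomy can look: for f(n) = n^(1+sin n) against g(n) = n, the ratio f(n)/g(n) = n^(sin n) is alternately very large and very small as n grows, so neither function eventually dominates the other.

```python
import math

def ratio(n):
    # f(n)/g(n) = n^(1+sin n) / n = n^(sin n)
    return n ** math.sin(n)

samples = range(1000, 2000)
print(max(ratio(n) for n in samples))  # huge where sin n is near +1
print(min(ratio(n) for n in samples))  # tiny where sin n is near -1
```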
That is, for two functions f(n) and g(n), it may be the case that neither f(n) = O(g(n)) nor f(n) = Ω(g(n)) holds. For example, the functions n and n^(1+sin n) cannot be compared using asymptotic notation, since the value of the exponent in n^(1+sin n) oscillates between 0 and 2, taking on all values in between.

Exercises 3.1-1

Let f(n) and g(n) be asymptotically nonnegative functions. Using the basic definition of Θ-notation, prove that max(f(n), g(n)) = Θ(f(n) + g(n)).

Exercises 3.1-2

Show that for any real constants a and b, where b > 0,

(n + a)^b = Θ(n^b).     (3.2)

Exercises 3.1-3

Explain why the statement, "The running time of algorithm A is at least O(n²)," is meaningless.

Exercises 3.1-4

Is 2^(n+1) = O(2^n)? Is 2^(2n) = O(2^n)?

Exercises 3.1-5

Prove Theorem 3.1.

Exercises 3.1-6

Prove that the running time of an algorithm is Θ(g(n)) if and only if its worst-case running time is O(g(n)) and its best-case running time is Ω(g(n)).

Exercises 3.1-7

Prove that o(g(n)) ∩ ω(g(n)) is the empty set.

Exercises 3.1-8

We can extend our notation to the case of two parameters n and m that can go to infinity independently at different rates.
For a given function g(n, m), we denote by O(g(n, m)) the set of functions

O(g(n, m)) = {f(n, m) : there exist positive constants c, n₀, and m₀ such that 0 ≤ f(n, m) ≤ cg(n, m) for all n ≥ n₀ and m ≥ m₀}.

Give corresponding definitions for Ω(g(n, m)) and Θ(g(n, m)).

[1] Within set notation, a colon should be read as "such that."

[2] The real problem is that our ordinary notation for functions does not distinguish functions from values. In λ-calculus, the parameters to a function are clearly specified: the function n² could be written as λn.n², or even λr.r². Adopting a more rigorous notation, however, would complicate algebraic manipulations, and so we choose to tolerate the abuse.

3.2 Standard notations and common functions

This section reviews some standard mathematical functions and notations and explores the relationships among them.
It also illustrates the use of the asymptotic notations.

Monotonicity

A function f(n) is monotonically increasing if m ≤ n implies f(m) ≤ f(n). Similarly, it is monotonically decreasing if m ≤ n implies f(m) ≥ f(n). A function f(n) is strictly increasing if m < n implies f(m) < f(n) and strictly decreasing if m < n implies f(m) > f(n).

Floors and ceilings

For any real number x, we denote the greatest integer less than or equal to x by ⌊x⌋ (read "the floor of x") and the least integer greater than or equal to x by ⌈x⌉ (read "the ceiling of x").
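In code, these correspond directly to the standard floor and ceiling functions; a small sketch (not part of the text), with the negative case included because it is a common source of confusion:

```python
import math

for x in [2.7, -2.7, 5.0]:
    print(x, math.floor(x), math.ceil(x))
# floor(2.7) = 2 and ceil(2.7) = 3
# floor(-2.7) = -3 and ceil(-2.7) = -2: floor rounds toward -infinity
# and ceil toward +infinity, not toward zero
# an integer is its own floor and ceiling
```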