Computer Science 112:

Introduction to Computer Science II

Gregory M. Kapfhammer

flickr photo shared by Billboard Art Project under a Creative Commons ( BY-NC-ND ) license

Color Scheme

Key Concept

Corresponding Diagram

In-Class Discussion

In-Class Activity

Details in the Textbook

Data Structure

Systematic way of organizing and accessing data


Algorithm

A step-by-step procedure for performing a task in a finite amount of time

Running time of algorithms as they process data structures

Larger inputs lead to longer running times

What about the space overhead of an algorithm?

Trade-off time and space overhead

See Code Fragment 4.1 for a timing method

We have done this already in several assignments!
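Code Fragment 4.1 in the textbook gives a timing method; a minimal sketch of the same idea using System.nanoTime (the class and workload here are illustrative, not the textbook's code):

```java
// Minimal timing sketch (illustrative; not the textbook's Code Fragment 4.1).
public class TimingSketch {
    // Hypothetical workload: sum the integers from 1 to n.
    static long sumTo(int n) {
        long total = 0;
        for (int i = 1; i <= n; i++) {
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        long start = System.nanoTime();           // record the start time
        long result = sumTo(n);
        long elapsed = System.nanoTime() - start; // elapsed wall-clock time
        System.out.println("sumTo(" + n + ") = " + result + " in " + elapsed + " ns");
    }
}
```

Timing the same workload with different input sizes shows the running time growing with n.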


Results are comparable only when collected in the same hardware and software environment

A limited set of inputs may lead to incorrect conclusions

You must fully implement and test the algorithms

Is there an alternative?


Our Goals

Works independently of hardware and software

Takes into account all possible inputs

Does not require algorithm implementation

Count primitive operations

Primitive Operations

Assign a value to a variable

Follow an object reference

Compare two numbers

Return from a method

Call a method

Any questions about these types of operations?

Focus on the worst-case behavior

Look for iteration constructs

Nesting of iteration constructs
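Nested iteration constructs multiply: if an inner statement runs n times for each of n outer iterations, it executes n × n times in total. A small sketch (my illustration, not from the textbook) that counts executions of one stand-in "primitive operation":

```java
// Sketch: count how often the innermost statement of nested loops runs.
public class OperationCount {
    // Returns the number of times the inner statement executes
    // for doubly nested loops over an input of size n.
    static int countNested(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {       // outer loop: n iterations
            for (int j = 0; j < n; j++) {   // inner loop: n iterations each
                count++;                    // stands in for one primitive operation
            }
        }
        return count;                       // n * n in total
    }

    public static void main(String[] args) {
        System.out.println(countNested(10)); // prints 100
    }
}
```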


Set your CLASSPATH to contain profiler.jar

Study the source code of this program!

Compile the ExampleTimings program

Run: java ExampleTimings 10000 quadratic

Perform a wide variety of experiments

Any questions about these results?

Other types of algorithm performance behavior?




See Figure 4.2 for an example

Characterize algorithm performance with a function

Number of operations as a function of the input size

The Seven Functions

Growth Rates

Constant: 1

Logarithm: log n

Linear: n

n-log-n: n log n

Quadratic: n^2

Cubic: n^3

Exponential: 2^n
Fast-growing function, slow algorithm

Slow-growing function, fast algorithm

See Figure 4.4 to compare the growth rates!

Any questions about these functions?

Focusing on worst-case behavior

The "Big-Oh" Notation

f(n) is O(g(n))

f(n) is order g(n)

f(n) is bounded above by a constant multiple of g(n)

Asymptotic sense as n grows toward infinity
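This asymptotic statement can be made precise with the standard definition:

```latex
f(n) \text{ is } O(g(n)) \iff
\exists\, c > 0,\ n_0 \ge 1 \text{ such that }
f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
```

The constant c absorbs hardware and implementation differences, which is why the analysis is environment-independent.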

Let's visualize this with a diagram!

Algorithm Growth Rates

log n: 3, 4, 5, 6, 7, 8, 9

n: 8, 16, 32, 64, 128, 256, 512

n^2: 64, 256, 1024, 4096, 16384, 65536, 262144

Cubic and exponential grow much faster!
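The rows above can be reproduced programmatically; a small sketch (my own, assuming n is a power of two) that prints one row per doubling of n:

```java
// Reproduce the growth-rate rows above for n = 8, 16, ..., 512.
public class GrowthRates {
    // Base-2 logarithm of n, assuming n is an exact power of two.
    static int log2(int n) {
        int log = 0;
        while (n > 1) {
            n /= 2;   // halve until we reach 1
            log++;
        }
        return log;
    }

    public static void main(String[] args) {
        for (int n = 8; n <= 512; n *= 2) {
            System.out.println("n=" + n
                    + "  log n=" + log2(n)
                    + "  n^2=" + (long) n * n);
        }
    }
}
```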

Any questions about these growth rates?

Organize into teams of two or three

Which algorithm would you pick?

Algorithm Choices

Fast algorithm running on a slow computer

Slow algorithm running on a fast computer


Good algorithm design is very important

Ensure that your algorithm has a slow growth rate

Dramatic speedups in hardware cannot overcome asymptotically slow algorithms

See Tables 4.4 and 4.5 for more details!

Examples of algorithm analysis

arrayMax on page 171

This algorithm is O(n)!

Can you clearly explain why?
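The arrayMax algorithm makes a single pass over the array, performing a constant number of primitive operations per element. A sketch in the same spirit (my reconstruction, not the book's exact code):

```java
public class ArrayMaxDemo {
    // Return the maximum element of a non-empty array.
    // One pass with a constant amount of work per element, so O(n).
    static int arrayMax(int[] data) {
        int currentMax = data[0];            // assume the first element is largest
        for (int i = 1; i < data.length; i++) {
            if (data[i] > currentMax) {      // one comparison per element
                currentMax = data[i];
            }
        }
        return currentMax;
    }

    public static void main(String[] args) {
        System.out.println(arrayMax(new int[]{7, 2, 9, 4})); // prints 9
    }
}
```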

repeat1 on page 172

This algorithm is O(n^2)!

Can you clearly explain why?

Why is repeat1 slower than repeat2?
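In the textbook, repeat1 builds a string of n copies of a character by repeated String concatenation, while repeat2 uses a StringBuilder. A sketch of the contrast (reconstructed, not the book's exact code):

```java
public class RepeatDemo {
    // Quadratic: Java Strings are immutable, so each += copies the entire
    // string built so far; total copying work is 1 + 2 + ... + n, i.e. O(n^2).
    static String repeat1(char c, int n) {
        String answer = "";
        for (int j = 0; j < n; j++) {
            answer += c;
        }
        return answer;
    }

    // Much faster: StringBuilder appends each character in amortized O(1).
    static String repeat2(char c, int n) {
        StringBuilder sb = new StringBuilder();
        for (int j = 0; j < n; j++) {
            sb.append(c);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(repeat1('*', 5).equals(repeat2('*', 5))); // prints true
    }
}
```

Both produce the same string; only the amount of copying differs.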

unique1 on page 174

This algorithm is O(n^2)!

Can you clearly explain why?

unique2 on page 175

This algorithm is O(n log n)!

Can you clearly explain why?
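The key contrast: unique1 compares every pair of elements, while unique2 sorts a copy first so any duplicates become neighbors, and the sort dominates the cost. A sketch of both (my reconstruction, not the book's exact code):

```java
import java.util.Arrays;

public class UniqueDemo {
    // Compare every pair: n(n-1)/2 comparisons in the worst case, so O(n^2).
    static boolean unique1(int[] data) {
        for (int i = 0; i < data.length - 1; i++) {
            for (int j = i + 1; j < data.length; j++) {
                if (data[i] == data[j]) {
                    return false;
                }
            }
        }
        return true;
    }

    // Sort a copy in O(n log n); duplicates must then be adjacent,
    // so one O(n) scan of neighbors finishes the job.
    static boolean unique2(int[] data) {
        int[] temp = Arrays.copyOf(data, data.length);
        Arrays.sort(temp);
        for (int i = 0; i < temp.length - 1; i++) {
            if (temp[i] == temp[i + 1]) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        int[] sample = {3, 1, 4, 1, 5};
        System.out.println(unique1(sample) + " " + unique2(sample)); // prints false false
    }
}
```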

Interplay between theory and experiment

Questions about algorithm analysis?