Time Complexity
A measure of the amount of time an algorithm takes to complete as a function of the length of the input.
Time complexity measures the amount of time an algorithm takes to complete as a function of the length of its input. It provides a way to express how an algorithm's runtime grows as the size of the input increases.
When analyzing time complexity, we're usually concerned with the worst-case scenario: the maximum amount of time an algorithm could take on any valid input of size n. However, we sometimes also consider:
- Best-case complexity: The minimum time an algorithm requires (rarely informative on its own; contrasted with the worst case in the sketch after this list)
- Average-case complexity: The expected time for a typical or random input
- Amortized complexity: The time per operation, averaged over a sequence of operations
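As a rough illustration, the Python sketch below (a hypothetical linear_search helper written for this example, not taken from the glossary) contrasts the best and worst case of searching an unsorted list:

```python
def linear_search(values, target):
    """Return the index of target in values, or -1 if it is absent."""
    for i, value in enumerate(values):
        if value == target:
            return i          # found: stop early
    return -1

# Best case: the target is the first element, so one comparison suffices -> O(1).
print(linear_search([7, 3, 9, 1], 7))   # 0

# Worst case: the target is absent, so every element is compared -> O(n).
print(linear_search([7, 3, 9, 1], 42))  # -1
```

The same algorithm thus has an O(1) best case and an O(n) worst case, which is why the worst case is usually the figure quoted.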
Time complexity is commonly expressed using Big O notation, which gives us an asymptotic upper bound on the growth rate of the algorithm's runtime. This means we focus on how the algorithm scales with large inputs, rather than the exact number of operations for a specific input size.
When calculating time complexity, we generally follow these principles (illustrated in the sketch after this list):
- Ignore constants: O(2n) is simplified to O(n)
- Keep only the highest-order term: O(n² + n) is simplified to O(n²)
- Focus on the dominant operations (usually comparisons or iterations)
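For example, the minimal Python sketch below (a hypothetical function written purely for illustration) performs a nested loop followed by a single pass; the exact work is roughly n² + n plus constants, which simplifies to O(n²):

```python
def quadratic_then_linear(values):
    """Hypothetical example: a nested loop followed by a single linear pass."""
    n = len(values)

    pair_count = 0
    for i in range(n):            # runs n times
        for j in range(n):        # runs n times for each i -> n * n iterations
            pair_count += 1       # ~n^2 dominant operations

    total = 0
    for value in values:          # runs n times -> lower-order term
        total += value

    return pair_count, total

# Dropping constants and the lower-order n term leaves O(n^2).
```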
Examples
- O(1) - Constant time: Array access, basic arithmetic operations
- O(log n) - Logarithmic time: Binary search (sketched after this list), operations in balanced binary search trees
- O(n) - Linear time: Linear search, traversing an array or linked list
- O(n log n) - Linearithmic time: Efficient sorting algorithms such as Merge Sort, and Quick Sort in the average case
- O(n²) - Quadratic time: Simple sorting algorithms like Bubble Sort, or nested loops that process every pair of elements
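The following Python sketch (hypothetical helper functions; binary_search assumes its input is already sorted) illustrates two of these classes, the O(log n) binary search and the O(n²) Bubble Sort:

```python
def binary_search(sorted_values, target):
    """O(log n): halve the search range on every comparison (requires sorted input)."""
    lo, hi = 0, len(sorted_values) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_values[mid] == target:
            return mid
        if sorted_values[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def bubble_sort(values):
    """O(n^2): nested loops compare neighbouring elements over repeated passes."""
    items = list(values)
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

print(binary_search([1, 3, 7, 9, 12], 9))   # 3
print(bubble_sort([5, 2, 8, 1]))            # [1, 2, 5, 8]
```

Doubling the input adds only one extra step for binary search, but roughly quadruples the work for Bubble Sort.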