Sorting Algorithm
Algorithms that arrange elements in a specific order, typically ascending or descending.
Sorting algorithms are procedures that arrange the elements of a collection in a specific order, typically ascending or descending, based on a comparison operator. They are among the most fundamental algorithms studied in computer science and are essential building blocks in many applications.
Sorting algorithms can be categorized in several ways:
- By stability: Stable sorts preserve the relative order of equal elements, while unstable sorts may reorder equal elements (see the sketch after this list)
- By adaptivity: Adaptive sorts perform better on partially sorted data
- By method: Comparison-based sorts (like Quick Sort) vs. non-comparison sorts (like Radix Sort)
- By memory usage: In-place sorts (minimal extra memory) vs. out-of-place sorts (require significant extra memory)
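To make stability concrete, here is a minimal Python sketch using the built-in sorted(), which Python guarantees to be stable; the records and the key field are illustrative.

```python
# Illustrative records: (name, priority). Names and values are made up.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

# Sort by priority only. Because sorted() is stable, records that share
# a priority keep their original relative order: bob before dave,
# alice before carol.
by_priority = sorted(records, key=lambda r: r[1])
print(by_priority)
# [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]
```

An unstable sort given the same input could legally return dave before bob, since both compare equal on the key.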
The efficiency of sorting algorithms is typically measured by their time complexity (how running time grows with input size) and space complexity (how much extra memory they use). The best possible worst-case time for comparison-based sorting is O(n log n): a comparison sort must distinguish among the n! possible orderings of its input, and each comparison yields at most one bit of information, so at least log₂(n!) ≈ n log₂ n comparisons are needed. Specialized non-comparison algorithms can do better in specific scenarios.
Popular sorting algorithms include:
- Simple sorts: Bubble Sort, Insertion Sort, Selection Sort
- Efficient comparison sorts: Merge Sort, Quick Sort, Heap Sort
- Distribution sorts: Counting Sort, Bucket Sort, Radix Sort (see the Counting Sort sketch after this list)
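As an example of how a distribution sort sidesteps the comparison bound, here is a minimal sketch of Counting Sort for small non-negative integer keys; the function name and interface are for illustration only.

```python
def counting_sort(values, max_value):
    """Sketch of Counting Sort for non-negative integers <= max_value.

    Runs in O(n + k) time with k = max_value + 1, beating the
    O(n log n) comparison bound when the key range is small.
    """
    counts = [0] * (max_value + 1)
    for v in values:                       # tally occurrences of each key
        counts[v] += 1
    result = []
    for key, count in enumerate(counts):  # emit each key in sorted order
        result.extend([key] * count)
    return result

print(counting_sort([4, 2, 2, 8, 3, 3, 1], max_value=8))
# [1, 2, 2, 3, 3, 4, 8]
```

The trade-off is memory: the counts array grows with the key range k, so the technique only pays off when k is comparable to, or smaller than, n.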
Examples
- Bubble Sort: Repeatedly steps through the list, compares adjacent elements, and swaps them if they're in the wrong order (O(n²) time complexity)
- Merge Sort: Divides the array into halves, recursively sorts them, then merges them back together (O(n log n) time complexity)
- Quick Sort: Selects a 'pivot' element and partitions the array around it, then recursively sorts the sub-arrays (average O(n log n) time complexity)
- Heap Sort: Builds a max heap from the array and repeatedly extracts the maximum element (O(n log n) time complexity)
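As a rough illustration, the following Python sketches implement the four algorithms above. They favor clarity over performance: the Quick Sort allocates new lists rather than partitioning in place, and the Heap Sort leans on Python's heapq min-heap instead of the in-place max heap described above. All function names are illustrative.

```python
import heapq


def bubble_sort(a):
    """O(n^2): repeatedly swap adjacent out-of-order pairs."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:              # no swaps: already sorted, exit early
            break
    return a


def merge_sort(a):
    """O(n log n): split in half, sort each half, merge."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # <= keeps the merge stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]


def quick_sort(a):
    """Average O(n log n): partition around a pivot, recurse."""
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)


def heap_sort(a):
    """O(n log n): heapify, then repeatedly extract the smallest element."""
    heap = list(a)
    heapq.heapify(heap)              # O(n) heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]


data = [5, 2, 9, 1, 5, 6]
for f in (bubble_sort, merge_sort, quick_sort, heap_sort):
    assert f(data) == sorted(data)
```

Note the early-exit flag in bubble_sort: it makes this version adaptive, finishing in O(n) on already-sorted input.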