Data Structures and Algorithms in C (DSA)

📘 Sorting Basics: Bubble, Selection, Insertion

Organizing data in a specific order, known as sorting, is a fundamental operation in computer science. It's often a prerequisite for efficient searching (as we saw with Binary Search) and plays a crucial role in many algorithms and database operations. This chapter introduces three foundational, yet less efficient, sorting algorithms: Bubble Sort, Selection Sort, and Insertion Sort. Understanding these basic algorithms provides a strong foundation for grasping more advanced sorting techniques. We'll also visualize their step-by-step execution to make the concepts clearer.

🎈 Bubble Sort: The "Sinking" Elements

Bubble Sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. The pass through the list is repeated until no swaps are needed, which indicates that the list is sorted. It gets its name because, on each pass, the largest remaining element "bubbles" to its correct position at the end of the array.

// Bubble Sort in C
void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        // Flag to optimize: if no swaps occur in a pass, array is sorted
        int swapped = 0;
        // Last i elements are already in place
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                // Swap arr[j] and arr[j+1]
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = 1;
            }
        }
        // If no two elements were swapped by inner loop, then break
        if (swapped == 0)
            break;
    }
}
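
To see the function in action, here is a minimal driver: a sketch that assumes the `bubbleSort` definition above is in the same file, sorts the example array used in the visualization below, and prints the result.

// Minimal driver for bubbleSort (illustrative sketch)
#include <stdio.h>

void bubbleSort(int arr[], int n); // definition above

int main(void) {
    int arr[] = {5, 3, 8, 4, 6};
    int n = sizeof(arr) / sizeof(arr[0]);

    bubbleSort(arr, n);

    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);   // prints: 3 4 5 6 8
    printf("\n");
    return 0;
}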
  

Time Complexity: O(n²) in the worst and average cases. In the best case (an already-sorted array), it's O(n) thanks to the `swapped` optimization, which ends the sort after a single pass with no swaps.

Space Complexity: O(1), as it only uses a constant amount of extra space for a temporary variable during swaps.

Bubble Sort Example Visualization: `[5, 3, 8, 4, 6]`

  • Initial unsorted array: `[5, 3, 8, 4, 6]`
  • 1st pass: compare 5 and 3 (5 > 3, swap): `[3, 5, 8, 4, 6]`
  • Compare 5 and 8 (no swap): `[3, 5, 8, 4, 6]`
  • Compare 8 and 4 (8 > 4, swap): `[3, 5, 4, 8, 6]`
  • Compare 8 and 6 (8 > 6, swap): `[3, 5, 4, 6, 8]`
  • End of 1st pass: the largest element (8) is now at the end.
  • 2nd pass: compare 3 and 5 (no swap), then 5 and 4 (5 > 4, swap): `[3, 4, 5, 6, 8]`
  • End of 2nd pass. The remaining pass makes no swaps, so the `swapped` flag ends the sort early.
  • Final sorted array: `[3, 4, 5, 6, 8]`

🎯 Selection Sort: Finding the Extremes

Selection Sort works by repeatedly finding the minimum element (or maximum) in the unsorted part and placing it at the boundary of the sorted part. It divides the array into two parts: a sorted part that grows from the left and an unsorted part that shrinks with every pass.

// Selection Sort in C
void selectionSort(int arr[], int n) {
    int i, j, min_idx;

    // One by one move boundary of unsorted subarray
    for (i = 0; i < n - 1; i++) {
        // Find the minimum element in unsorted array
        min_idx = i;
        for (j = i + 1; j < n; j++) {
            if (arr[j] < arr[min_idx])
                min_idx = j;
        }

        // Swap the found minimum element with the first element
        // of the unsorted part (arr[i])
        int temp = arr[min_idx];
        arr[min_idx] = arr[i];
        arr[i] = temp;
    }
}
  

Time Complexity: O(n²) in all cases (worst, average, and best). Even if the array is already sorted, it still performs the same n(n-1)/2 comparisons.

Space Complexity: O(1) as it performs sorting in-place.
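
To make the "same amount of work on every input" point concrete, the sketch below is an instrumented variant of the function above (illustrative only, not part of the original listing) that counts comparisons and swaps. For n = 5 it always reports 10 comparisons, i.e. n(n-1)/2, whether the input is sorted or not, while the number of swaps never exceeds n-1.

// Instrumented Selection Sort (illustrative variant)
#include <stdio.h>

void selectionSortCounted(int arr[], int n, int *comparisons, int *swaps) {
    *comparisons = 0;
    *swaps = 0;
    for (int i = 0; i < n - 1; i++) {
        int min_idx = i;
        for (int j = i + 1; j < n; j++) {
            (*comparisons)++;              // every pair is compared, sorted or not
            if (arr[j] < arr[min_idx])
                min_idx = j;
        }
        if (min_idx != i) {                // swap only when a smaller element was found
            int temp = arr[min_idx];
            arr[min_idx] = arr[i];
            arr[i] = temp;
            (*swaps)++;
        }
    }
}

int main(void) {
    int sorted[]   = {3, 4, 5, 6, 8};
    int unsorted[] = {5, 3, 8, 4, 6};
    int c, s;

    selectionSortCounted(sorted, 5, &c, &s);
    printf("sorted input:   %d comparisons, %d swaps\n", c, s);  // 10 comparisons, 0 swaps

    selectionSortCounted(unsorted, 5, &c, &s);
    printf("unsorted input: %d comparisons, %d swaps\n", c, s);  // 10 comparisons, 4 swaps
    return 0;
}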

Selection Sort Example Visualization: `[5, 3, 8, 4, 6]`

  • Initial unsorted array: `[5, 3, 8, 4, 6]`
  • Find the minimum in `[5, 3, 8, 4, 6]`: 3 at index 1. Swap it with 5 at index 0: `[3, 5, 8, 4, 6]`. Element 3 is now sorted.
  • Find the minimum in the unsorted part `[5, 8, 4, 6]`: 4 at index 3. Swap it with 5 at index 1: `[3, 4, 8, 5, 6]`. Element 4 is now sorted.
  • Find the minimum in the unsorted part `[8, 5, 6]`: 5 at index 3. Swap it with 8 at index 2: `[3, 4, 5, 8, 6]`. Element 5 is now sorted.
  • Find the minimum in the unsorted part `[8, 6]`: 6 at index 4. Swap it with 8 at index 3: `[3, 4, 5, 6, 8]`. Element 6 is now sorted.
  • Final sorted array: `[3, 4, 5, 6, 8]`

➕ Insertion Sort: Building a Sorted Subarray

Insertion Sort builds the final sorted array (or list) one item at a time. It iterates through the input elements and, for each element, it finds its correct position in the already sorted part of the array and inserts it there. This process is similar to how one might sort a hand of playing cards.

// Insertion Sort in C
void insertionSort(int arr[], int n) {
    int i, key, j;
    for (i = 1; i < n; i++) {
        key = arr[i]; // Current element to be inserted
        j = i - 1;

        // Move elements of arr[0..i-1], that are greater than key,
        // to one position ahead of their current position
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j = j - 1;
        }
        arr[j + 1] = key; // Place key at its correct position
    }
}
  

Time Complexity: O(n²) in the worst and average cases. However, in the best case (an already-sorted array), it's O(n) because each element is compared only once with its left neighbor and no shifts are needed.

Space Complexity: O(1) as it sorts in-place.
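
The best-case claim is easy to check empirically. The sketch below (illustrative only, a lightly modified copy of the function above) counts how many element shifts the inner `while` loop performs: 0 for an already-sorted array and n(n-1)/2 = 10 for a reverse-sorted array of 5 elements.

// Counting shifts in Insertion Sort (illustrative sketch)
#include <stdio.h>

int insertionSortShifts(int arr[], int n) {
    int shifts = 0;
    for (int i = 1; i < n; i++) {
        int key = arr[i];
        int j = i - 1;
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];   // one shift of a larger element
            shifts++;
            j--;
        }
        arr[j + 1] = key;
    }
    return shifts;
}

int main(void) {
    int sorted[]   = {3, 4, 5, 6, 8};
    int reversed[] = {8, 6, 5, 4, 3};

    printf("sorted input:   %d shifts\n", insertionSortShifts(sorted, 5));    // 0
    printf("reversed input: %d shifts\n", insertionSortShifts(reversed, 5));  // 10
    return 0;
}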

Insertion Sort Example Visualization: `[5, 3, 8, 4, 6]`

  • Initial unsorted array: `[5, 3, 8, 4, 6]`
  • Take 3 as the `key`. 3 < 5, so shift 5 and insert 3: `[3, 5, 8, 4, 6]`. Sorted part: `[3, 5]`
  • Take 8 as the `key`. 8 > 5, so it is already in place and no further comparisons are needed: `[3, 5, 8, 4, 6]`. Sorted part: `[3, 5, 8]`
  • Take 4 as the `key`. 4 < 8, shift 8; 4 < 5, shift 5; 4 > 3, stop and insert 4: `[3, 4, 5, 8, 6]`. Sorted part: `[3, 4, 5, 8]`
  • Take 6 as the `key`. 6 < 8, shift 8; 6 > 5, stop and insert 6: `[3, 4, 5, 6, 8]`. Sorted part: `[3, 4, 5, 6, 8]`
  • Final sorted array: `[3, 4, 5, 6, 8]`

🆚 Basic Sorting Algorithms: A Comparative Overview

| Aspect | Bubble Sort | Selection Sort | Insertion Sort |
| --- | --- | --- | --- |
| Time Complexity (Worst/Average) | O(n²) | O(n²) | O(n²) |
| Time Complexity (Best) | O(n) (with the `swapped` optimization) | O(n²) | O(n) |
| Space Complexity | O(1) | O(1) | O(1) |
| Stability | Yes | No | Yes |
| Use Cases | Educational, very small arrays | Educational, very small arrays | Small arrays, nearly sorted arrays, online algorithms |

Stability: A sorting algorithm is stable if it preserves the relative order of equal elements. For example, if two elements `A` and `B` are equal and `A` appears before `B` in the original array, then in a stable sort, `A` will still appear before `B` in the sorted array.
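
To see why Selection Sort is marked "No" in the table, here is a small sketch (not from the chapter) that adapts the selection-sort code above to key/tag records. The long-distance swap in the first pass carries `{2, 'a'}` past its equal partner `{2, 'b'}`, reversing their original order.

// Why Selection Sort is not stable (illustrative sketch)
#include <stdio.h>

typedef struct {
    int key;    // value used for sorting
    char tag;   // distinguishes elements with equal keys
} Item;

void selectionSortItems(Item arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min_idx = i;
        for (int j = i + 1; j < n; j++)
            if (arr[j].key < arr[min_idx].key)
                min_idx = j;
        // The long-distance swap can jump an element over its equal partner
        Item temp = arr[min_idx];
        arr[min_idx] = arr[i];
        arr[i] = temp;
    }
}

int main(void) {
    // {2, 'a'} appears before {2, 'b'} in the input
    Item items[] = { {2, 'a'}, {2, 'b'}, {1, 'c'} };

    selectionSortItems(items, 3);

    for (int i = 0; i < 3; i++)
        printf("%d%c ", items[i].key, items[i].tag);
    printf("\n");   // prints: 1c 2b 2a  (the equal keys changed relative order)
    return 0;
}

Insertion Sort, by contrast, only shifts elements that are strictly greater than the `key`, so equal elements never move past each other, which is why it is stable.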

📌 Real-Life Example (UdaanPath)

While these basic sorting algorithms aren't typically used for large-scale data on platforms like UdaanPath because of their quadratic time complexity, they are excellent for understanding the foundational concepts of sorting. Imagine a small internal list on UdaanPath, say the 5-10 courses a user viewed most recently, that needs to be displayed in alphabetical order for that session. With such a tiny dataset the performance impact is negligible, so one of these simple sorts could reasonably be used for quick organization (see the sketch below), although optimized built-in sorts are always preferred in production for larger sets. These algorithms are a crucial stepping stone to understanding the inefficiencies that the more advanced algorithms covered in later chapters aim to overcome, and they are usually taught first because their logic is intuitive, making them ideal for beginners to grasp the core idea of arranging data.
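
As a concrete (and purely hypothetical) illustration of that scenario, the sketch below applies Insertion Sort to a handful of made-up course-title strings, using `strcmp` for the comparisons. The titles and list size are placeholders, not real UdaanPath data.

// Sorting a small list of course titles alphabetically (illustrative sketch;
// the titles below are made-up placeholders)
#include <stdio.h>
#include <string.h>

// Insertion Sort on an array of string pointers, ordered by strcmp
void sortTitles(const char *titles[], int n) {
    for (int i = 1; i < n; i++) {
        const char *key = titles[i];
        int j = i - 1;
        while (j >= 0 && strcmp(titles[j], key) > 0) {
            titles[j + 1] = titles[j];
            j--;
        }
        titles[j + 1] = key;
    }
}

int main(void) {
    const char *recent[] = { "DSA in C", "C Basics", "SQL Fundamentals", "HTML & CSS", "Python Intro" };
    int n = sizeof(recent) / sizeof(recent[0]);

    sortTitles(recent, n);

    for (int i = 0; i < n; i++)
        printf("%s\n", recent[i]);   // prints the titles in alphabetical order
    return 0;
}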

✅ Summary: The Building Blocks of Sorting

  • Bubble Sort is simple but very inefficient for most practical purposes, repeatedly swapping adjacent elements. It can be optimized for already sorted arrays.
  • Selection Sort is also O(n²) in all cases, consistently finding the minimum of the unsorted part and placing it at the boundary of the sorted part. It performs minimal swaps (at most n-1).
  • Insertion Sort is efficient for small arrays and nearly sorted arrays, as it builds the sorted list incrementally by inserting elements into their correct positions within a growing sorted subarray.
  • All three algorithms have O(1) space complexity, meaning they sort in-place without needing significant extra memory.
  • Understanding these basic sorts is crucial for appreciating the optimizations and design principles behind more advanced and efficient sorting algorithms.

📤 Coming Next: Divide and Conquer Sorting

Now that we've grasped the basics, we'll move on to more powerful sorting paradigms that overcome the limitations of O(n²) complexity. In the next chapter, we'll explore Merge Sort, the first of our "divide and conquer" sorting algorithms, which offers significantly better performance for large datasets.
