Sorting is fundamental to many operations in computer science, and there are many sorting algorithms to choose from. While Selection Sort may not boast top-tier efficiency, performance isn’t always the top priority. So why should you learn how to implement Selection Sort in Go?

  • Sometimes you just need a solution that is simple, elegant, and easy to implement.
  • Selection sort has its own place in the world of algorithms.
  • Selection sort is also a good starting point for understanding other sorting algorithms.

This blog post is part of the series on Algorithms in Go. If you haven’t read the previous post, you can find it here.

The way I approach learning algorithms is by first implementing the simplest version of the algorithm in Go. Then I try to study the different variations of the algorithm. That way I’m able to understand the tradeoffs between the different approaches and solidify my understanding of the algorithm.

What Is Selection Sort and How Does It Work?

Selection Sort is a simple sorting algorithm that works by selecting the smallest element in the array and swapping it with the first element. Then it selects the second smallest element and swaps it with the second element. It continues to do this until it reaches the end of the array.

Another way to think about this is by imagining that you have a box of candy. You want to sort the candy by size. You start by selecting the smallest candy and putting it in the first slot. Then you select the second smallest candy and put it in the second slot. You continue to do this until you reach the end of the box.
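
To make this concrete, here’s a pass-by-pass trace on a small sample array, where each pass swaps the smallest remaining element into the next slot:

3 1 4 2   → smallest is 1, swap it into slot 0 → 1 3 4 2
1 3 4 2   → smallest of the rest is 2, swap it into slot 1 → 1 2 4 3
1 2 4 3   → smallest of the rest is 3, swap it into slot 2 → 1 2 3 4
1 2 3 4   → the last element is already in place, done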

Implementing Selection Sort in Go

In the classic version of Selection Sort, you iterate through the array and, at each position, swap the current element with the smallest element in the remaining unsorted portion.

Here’s an example of how you might implement this in Go:

func SelectionSort(slice []int) []int {
	n := len(slice)

	for i := 0; i < n; i++ {
		// Find the index of the smallest element in slice[i:].
		min := i
		for j := i + 1; j < n; j++ {
			if slice[j] < slice[min] {
				min = j
			}
		}
		// Swap it into position i, extending the sorted prefix.
		slice[i], slice[min] = slice[min], slice[i]
	}

	return slice
}
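
To see it in action, here’s a minimal usage sketch (it assumes SelectionSort lives in the same main package):

package main

import "fmt"

func main() {
	nums := []int{3, 1, 4, 2}
	// SelectionSort sorts the slice in place and returns it.
	fmt.Println(SelectionSort(nums)) // Output: [1 2 3 4]
}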

Explanation:

  • Initialization: We start by determining the length of the input slice to be sorted, denoted by n.
  • Outer Loop: The outer loop runs n times, iterating through each element of the slice.
  • Finding Minimum: Within each iteration of the outer loop, we initialize a variable min to the current index i. This min represents the index of the minimum element found so far in the unsorted portion of the slice.
  • Inner Loop: The inner loop starts from the element after the current index i and iterates to the end of the slice, searching for the minimum element in the unsorted portion.
  • Updating Minimum: If the element at index j is less than the current minimum element at index min, we update min to j.
  • Swapping Minimum: Once the inner loop completes, we swap the element at index i with the minimum element found (slice[min]). This places the minimum element in its correct sorted position.
  • Completing Iteration: The outer loop continues its iterations, each time finding and placing the next smallest element in its sorted position.
  • Returning Sorted Slice: Once the outer loop completes, the entire slice is sorted, and the function returns it (a quick test sketch follows this list).
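
To double-check the behavior described above, here’s a minimal table-driven test sketch; the file layout, package name, and test cases are my own assumptions, not part of the original code:

package main

import (
	"sort"
	"testing"
)

func TestSelectionSort(t *testing.T) {
	cases := [][]int{
		{},           // empty slice
		{1},          // single element
		{3, 1, 4, 2}, // unsorted
		{1, 2, 3, 4}, // already sorted
	}
	for _, c := range cases {
		// Sort a copy so the original case stays intact.
		got := SelectionSort(append([]int(nil), c...))
		if !sort.IntsAreSorted(got) {
			t.Errorf("SelectionSort(%v) = %v; want sorted output", c, got)
		}
	}
}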

Time Complexity:

  • Outer Loop: The outer loop iterates n times, where n is the number of elements in the input slice.
  • Inner Loop: Within each iteration of the outer loop, the inner loop iterates (n - i - 1) times, where i is the current index of the outer loop.
  • Comparisons and Swaps: Within the inner loop, there are comparisons to find the minimum element, and potentially swaps to place it in its correct position. The number of comparisons is (n - 1) + (n - 2) + … + 1 = n * (n - 1) / 2, which is asymptotically bounded by O(n^2). The number of swaps is at most n - 1, which is bounded by O(n).
  • Overall Time Complexity: Considering both the outer and inner loops, and the comparisons/swaps within them, the overall time complexity of Selection Sort is O(n^2).
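
As a quick sanity check of that formula, take n = 4: the inner loop performs 3 + 2 + 1 = 6 comparisons, and 4 * 3 / 2 = 6 as well.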

The time complexity of Selection Sort is O(n^2), which makes it inefficient for large inputs. It remains a reasonable choice for small input sizes, though, and because it sorts in place it requires only a constant amount of extra memory.
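
If you want to observe the quadratic growth yourself, a benchmark sketch like the one below can be dropped into the same _test.go file (the helper name and sizes are assumptions; it also needs the math/rand and testing imports):

func benchmarkSelectionSort(b *testing.B, size int) {
	for i := 0; i < b.N; i++ {
		b.StopTimer()
		data := rand.Perm(size) // fresh random input for every iteration
		b.StartTimer()
		SelectionSort(data)
	}
}

func BenchmarkSelectionSort1k(b *testing.B) { benchmarkSelectionSort(b, 1000) }
func BenchmarkSelectionSort2k(b *testing.B) { benchmarkSelectionSort(b, 2000) }

Since the algorithm is quadratic, doubling the input size should roughly quadruple the reported ns/op.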

Implementing Bidirectional Selection Sort in Go

In Bidirectional Selection Sort (sometimes called double selection sort), you work inward from both ends of the array: each pass finds both the smallest and the largest element of the unsorted portion, moving the minimum to the left end and the maximum to the right end. This is one way to improve the performance of selection sort.

func BidirectionalSelection(slice []int) []int {
	left, right := 0, len(slice)-1

	for left < right {
		// Find the smallest and largest elements in slice[left:right+1].
		minIdx, maxIdx := left, left
		for i := left + 1; i <= right; i++ {
			if slice[i] < slice[minIdx] {
				minIdx = i
			}
			if slice[i] > slice[maxIdx] {
				maxIdx = i
			}
		}

		// Move the minimum to the left end of the unsorted portion.
		slice[left], slice[minIdx] = slice[minIdx], slice[left]

		// If the maximum was at index left, the swap above just moved
		// it to minIdx, so follow it before the second swap.
		if maxIdx == left {
			maxIdx = minIdx
		}
		// Move the maximum to the right end of the unsorted portion.
		slice[right], slice[maxIdx] = slice[maxIdx], slice[right]

		left++
		right--
	}

	return slice
}

Explanation:

  • Initialization: We start by initializing two pointers, left and right, to the beginning and end of the slice, respectively.
  • Outer Loop: The outer loop runs while left is less than right, that is, while the unsorted portion between the two pointers contains more than one element.
  • Finding Minimum and Maximum: During each iteration, a single scan of the unsorted portion tracks both minIdx, the index of the smallest element found so far, and maxIdx, the index of the largest.
  • Placing the Minimum: Once the scan completes, we swap the element at left with the element at minIdx, putting the minimum in its final sorted position.
  • Fixing Up maxIdx: If the maximum happened to sit at index left, the previous swap moved it to minIdx, so we update maxIdx to follow it before the second swap.
  • Placing the Maximum: We swap the element at right with the element at maxIdx, putting the maximum in its final sorted position.
  • Updating Pointers: We increment left and decrement right, shrinking the unsorted portion by two elements per pass.
  • Returning Sorted Slice: Once the pointers meet or cross, the entire slice is sorted, and the function returns it.
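
To see how the pointers converge, here’s a quick pass-by-pass trace on the sample input [3, 1, 4, 2]:

3 1 4 2   → min 1 goes to the left end, max 4 to the right end → 1 3 2 4
1 3 2 4   → in the middle, min 2 and max 3 trade places → 1 2 3 4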

The bidirectional approach places two elements per pass (one at each end), so it needs roughly half as many passes as traditional selection sort while performing a similar number of comparisons overall.

Time Complexity:

  • Outer Loop: The outer loop runs until left meets right, which takes at most n/2 iterations, where n is the number of elements in the slice.
  • Comparisons: Each pass scans the current unsorted portion once, making up to two comparisons per element. The portion shrinks by two elements each pass, so the total number of comparisons is roughly 2 * ((n - 1) + (n - 3) + …) ≈ n^2/2, which is O(n^2).
  • Swaps: Each pass performs at most two swaps (plus a constant-time fix-up of maxIdx), so there are at most n swaps in total, which is O(n).
  • Bookkeeping: Initializing the pointers, updating them after each pass, and returning the slice all take constant time, O(1).
  • Overall Time Complexity: The comparisons dominate, so the overall time complexity is still O(n^2), the same as classic Selection Sort. The gain is a constant factor: because each pass places two elements, the algorithm makes roughly half as many passes over the data.

In this article, we’ve discussed Selection Sort and its implementation in Go. We’ve also discussed the time complexity of this algorithm and its bidirectional variant.

There are other variants of this algorithm that morph into different sorting algorithms. We will come across such variants in the future as we continue our algorithms series.

While we’ve seen that the time complexity of Selection Sort isn’t the best, it still has its use cases: it is a solid choice for smaller input sizes, where its simplicity and constant memory usage matter more than raw speed.

Stay tuned for the coming blog posts on more efficient algorithms. And you can check out the full source code on GitHub.