This is the fourth post in a sorting algorithms series I'm doing to up my CompSci 101 knowledge! Merge sort, while not adaptive to its data, is a highly predictable and stable sorting algorithm, with a time complexity of O(n*lg(n)) regardless of the input.

This sort was a difficult one for me to grasp. I understood that the array first breaks down into single-item slices, and that neighboring slices are then grouped in pairs, with the two items swapping positions if the second is smaller than the first. But that is where my understanding stopped. I had to watch this video on merge sorting to understand that when the program merges two groups of items, it only ever looks at the first element of each group. The smaller of the two is pushed onto a sorted list; then the program compares the new pair of first elements, and so on, until both groups are empty. At first that didn't click because I assumed the groups were simply combined in place. Once I finally understood it, though, it immediately became clear how to design my algorithm.
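To make that concrete, here is a rough Go sketch of the split-then-merge flow (a simplified illustration rather than the exact code from my project):

```go
package main

import "fmt"

// mergeSort splits the slice down to single elements, then merges
// neighboring pieces back together by repeatedly comparing the first
// element of each piece and taking the smaller one.
func mergeSort(items []int) []int {
	if len(items) <= 1 {
		return items // a single element is already sorted
	}
	mid := len(items) / 2
	left := mergeSort(items[:mid])
	right := mergeSort(items[mid:])

	// Merge step: look only at the first element of each sorted half.
	merged := make([]int, 0, len(items))
	for len(left) > 0 && len(right) > 0 {
		if left[0] <= right[0] { // <= keeps the sort stable
			merged = append(merged, left[0])
			left = left[1:]
		} else {
			merged = append(merged, right[0])
			right = right[1:]
		}
	}
	// Whichever half still has elements is already sorted,
	// so it can be appended wholesale.
	merged = append(merged, left...)
	merged = append(merged, right...)
	return merged
}

func main() {
	fmt.Println(mergeSort([]int{5, 2, 8, 1, 9, 3})) // [1 2 3 5 8 9]
}
```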

The only bug I needed to understand and fix after that showed up while merging two sub-slices, after the initial comparison of their first items. Partway through the merge, one of the slices ran out of elements, and the program broke trying to compare nil to the first element of the other slice. As with designing the algorithm itself, it was all a matter of understanding what was actually happening; once I did, the fix was easy.
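The fix amounts to never indexing into a slice that has already been emptied: before comparing, check whether either side has run out, and if it has, just keep taking from the other side. Here is a rough sketch of a merge loop with that guard made explicit (again in Go, as an illustration rather than the exact code):

```go
package main

import "fmt"

// merge combines two already-sorted slices. The explicit emptiness
// checks are the guard against the bug above: we never try to read
// the first element of a slice that has already been drained.
func merge(left, right []int) []int {
	result := make([]int, 0, len(left)+len(right))
	for len(left) > 0 || len(right) > 0 {
		switch {
		case len(left) == 0:
			// Left side is exhausted; take from the right.
			result = append(result, right[0])
			right = right[1:]
		case len(right) == 0:
			// Right side is exhausted; take from the left.
			result = append(result, left[0])
			left = left[1:]
		case left[0] <= right[0]:
			result = append(result, left[0])
			left = left[1:]
		default:
			result = append(result, right[0])
			right = right[1:]
		}
	}
	return result
}

func main() {
	fmt.Println(merge([]int{1, 4, 9}, []int{2, 3})) // [1 2 3 4 9]
}
```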