Recursion is when a function calls itself until a stop condition (the base case) is reached. In computer science, recursion is not always the most efficient way to solve a problem; however, it lets you express some algorithms far more naturally than you could with plain loops. One such algorithm is quick sort, which runs in O(n log n) time on average.
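As a minimal sketch of the idea, here is the classic factorial function (an example of my own choosing, not one of the sorts discussed below): it calls itself on a smaller input until it hits its stop condition.

```python
def factorial(n):
    # Stop condition (base case): the recursion ends at 0.
    if n == 0:
        return 1
    # Recursive case: the function calls itself on a smaller input.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```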
In quick sort, you pick one element of the array to be the pivot. Then you go through all the other elements and sort them into two arrays: one with all the elements smaller than the pivot, the other with all the larger elements. After doing this, you return a new array built in the following order: a call of quick sort on the smaller array, then the pivot, then a call of quick sort on the larger array. If an array ever has one element or fewer, quick sort just returns it as is.
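The steps above can be sketched in Python like so. This is a simple illustrative version (it builds new lists rather than sorting in place, and it takes the first element as the pivot):

```python
def quick_sort(arr):
    # Base case: arrays with zero or one element are already sorted.
    if len(arr) <= 1:
        return arr
    pivot = arr[0]
    # Sort the remaining elements into two arrays around the pivot.
    smaller = [x for x in arr[1:] if x <= pivot]
    larger = [x for x in arr[1:] if x > pivot]
    # Recurse on each side and stitch the results together.
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print(quick_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```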
Another O(n log n) sorting algorithm is called merge sort. In merge sort, you take your array and break it down into two smaller halves. After that, you call merge sort on both of these halves. Once the halves have been merge sorted, you know that each of them is in order, and now you just need to merge the two sub-arrays. To do this, you keep a counter for each sub-array: at each step you compare the elements at the two counters' indexes, append the smaller one to your output array, and add one to that counter. If one counter moves past the final element of its sub-array, you just append the rest of the other sub-array. Then you return the newly sorted array. Again, this algorithm just returns the array if it contains one element or fewer.
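Here is a sketch of that merge step in Python, with `i` and `j` playing the role of the two counters:

```python
def merge_sort(arr):
    # Base case: arrays with zero or one element are already sorted.
    if len(arr) <= 1:
        return arr
    # Break the array into two halves and merge sort each one.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge: i and j are the counters into each sorted sub-array.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One counter ran past the end of its sub-array:
    # append whatever remains of the other one.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```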
Both of these recursive algorithms are called divide and conquer algorithms because they take the larger problem (sorting an array) and break it up into manageable pieces (one-element arrays) to solve the initial problem. Such algorithms are very fast, despite the overhead of recursion. In general, quick sort and merge sort are two of the fastest sorting algorithms.