Summary
| Sort type | Time complexity | Space complexity | Stable? |
| --- | --- | --- | --- |
| bubble sort | $O(n^2)$ | $O(1)$ | Yes (do not swap on equal values) |
| selection sort | $O(n^2)$ | $O(1)$ | No (the swap can move an element past equal values; a stable variant inserts instead of swapping) |
| insertion sort | $O(n^2)$ | $O(1)$ | Yes (stop shifting on equal value) |
| radix/bucket sort | $O(nd)$, $d$ = number of digits of the max value | $O(n)$ | Yes |
| heap sort | $O(n \log n)$ | $O(1)$ in-place | No |
| merge sort | $O(n \log n)$ | $O(n)$ | Yes |
| quick sort | $O(n \log n)$ average, $O(n^2)$ worst | $O(\log n)$ call stack w/ tail recursion | No |
bubble sort
algorithm
- scan from left to right: inspect each pair of adjacent elements and swap them if the first is larger than the second
- continue with the next pair until reaching the end of the array
- the largest element is now at the end of the array
- repeat the above steps on the remaining unsorted portion of the array until the entire array is sorted
complexity
- Time: $O(n^2)$
- Space: $O(1)$
code
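A minimal runnable sketch of the steps above (class name and demo values are illustrative):

```java
import java.util.Arrays;

public class BubbleSort {
    // Repeatedly bubble the largest remaining element to the end.
    public static void sort(int[] a) {
        for (int end = a.length - 1; end > 0; end--) {
            boolean swapped = false;
            for (int i = 0; i < end; i++) {
                // Strict '>' keeps the sort stable: equal values are not swapped.
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break; // no swaps in a pass means already sorted
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 1, 4, 2, 8};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 4, 5, 8]
    }
}
```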
selection sort
algorithm
- find the smallest element using a linear scan and move it to the front (via swapping)
- find the second smallest with the same process and move it to the next position
- repeat the above steps until the array is sorted
complexity
- Time: $O(n^2)$
- Space: $O(1)$, no extra space required
code
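A runnable sketch of the steps above; `selSort` and the demo values are illustrative:

```java
import java.util.Arrays;

public class SelectionSort {
    public static void selSort(int[] array) {
        for (int i = 0; i < array.length - 1; i++) {
            // Linear scan for the smallest element of the unsorted suffix.
            int minIdx = i;
            for (int j = i + 1; j < array.length; j++) {
                // Strict '<' keeps the first occurrence of the minimum.
                if (array[j] < array[minIdx]) minIdx = j;
            }
            // Move it to the front of the unsorted portion via swapping.
            int tmp = array[i];
            array[i] = array[minIdx];
            array[minIdx] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {64, 25, 12, 22, 11};
        selSort(a);
        System.out.println(Arrays.toString(a)); // [11, 12, 22, 25, 64]
    }
}
```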
merge sort
Algorithm
- recursively split the array until each piece has a single element
- compare and merge the elements in the same group (activation record)
- after the last merge, the whole array is sorted
complexity
- Time: $O(n \log n)$ in the worst case
- Space: $O(n)$ auxiliary
Code
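A runnable sketch of the scheme above, using an $O(n)$ auxiliary buffer; names and demo values are illustrative:

```java
import java.util.Arrays;

public class MergeSort {
    public static void sort(int[] a) {
        if (a.length < 2) return;
        int[] aux = new int[a.length]; // O(n) auxiliary space
        sort(a, aux, 0, a.length - 1);
    }

    private static void sort(int[] a, int[] aux, int lo, int hi) {
        if (lo >= hi) return;
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);      // break down the left half
        sort(a, aux, mid + 1, hi);  // break down the right half
        merge(a, aux, lo, mid, hi); // merge the two sorted halves
    }

    private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
        System.arraycopy(a, lo, aux, lo, hi - lo + 1);
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid) a[k] = aux[j++];
            else if (j > hi) a[k] = aux[i++];
            // '<=' keeps the sort stable: take from the left half on ties.
            else if (aux[i] <= aux[j]) a[k] = aux[i++];
            else a[k] = aux[j++];
        }
    }

    public static void main(String[] args) {
        int[] a = {38, 27, 43, 3, 9, 82, 10};
        sort(a);
        System.out.println(Arrays.toString(a)); // [3, 9, 10, 27, 38, 43, 82]
    }
}
```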
quick sort
update: 05/12/2019
Just for fun: the blog post 《被忽视的 partition 算法》 ("The Overlooked Partition Algorithm") studies the partition algorithm in depth. The partition it provides is 1. more efficient (it drops the tmp variable from the swap routine, saving one copy per swap), 2. more concise (less code), and 3. easier to explain step by step (it resolves the ambiguity of the right, left, and pivot positions in v2, which v1 explains clearly). TODO: study the advanced partition section in detail when time permits.
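A sketch of a `quickSort` built on the copy-into-the-hole partition the post describes: saving the pivot leaves a "hole", and each exchange copies one element into the current hole instead of doing a three-assignment swap. This is my reconstruction, not the post's exact code:

```java
import java.util.Arrays;

public class QuickSort {
    public static void quickSort(int[] arr, int l, int r) {
        if (l >= r) return;
        int pivot = arr[l]; // arr[l] becomes the initial "hole"
        int left = l, right = r;
        while (left < right) {
            // Scan from the right for an element smaller than the pivot,
            // then copy it into the hole (no tmp variable needed).
            while (left < right && arr[right] >= pivot) right--;
            arr[left] = arr[right];
            // Scan from the left for an element larger than the pivot.
            while (left < right && arr[left] <= pivot) left++;
            arr[right] = arr[left];
        }
        arr[left] = pivot; // fill the final hole with the pivot
        quickSort(arr, l, left - 1);
        quickSort(arr, left + 1, r);
    }

    public static void main(String[] args) {
        int[] a = {3, 7, 1, 8, 2, 9, 4};
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 7, 8, 9]
    }
}
```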
key idea
- each partition pass takes $O(n)$ in the worst case (at the first step, every value is compared to the pivot)
- the height of the recursion tree is $O(n)$ in the worst case (the pivot is always the smallest/largest value) and $O(\log n)$ in the best case
- hence, the total time complexity is $O(n^2)$ in the worst case and $O(n \log n)$ in the best case
How to choose the piviot element?
- why not choose the pivot randomly?
Choosing the pivot randomly works; however, the worst-case time complexity of the algorithm is still $O(n^2)$.
Time Complexity Analysis
- Worst case $O(n^2)$: every pivot picked is either the smallest or the largest element of the array, so the subproblem sizes go $n > n-1 > n-2 > \dots > 1$.
- Average/best case $O(n \log n)$: the pivot splits the array evenly, so fewer levels are needed: $n > 2 \cdot n/2 > 4 \cdot n/4 > \dots$. The height of the tree is $\log n$; hence $O(n \log n)$.
Algorithm
- select the middle value as the `pivot`
- maintain two pointers, `left` and `right`: the `left` pointer traverses from left to right until it finds an element larger than the `pivot`; the `right` pointer traverses from right to left until it finds an element smaller than the `pivot`
- swap the elements at index `left` and `right`
- continue until `left` > `right`
- recursively `quickSort` the part smaller than the `pivot` and the part larger than the `pivot`
Code Short Version
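A compact sketch following the steps above (middle-element pivot, two converging pointers); this is an illustrative reconstruction, not the original snippet:

```java
import java.util.Arrays;

public class QuickSortShort {
    public static void quickSort(int[] arr, int l, int r) {
        if (l >= r) return;
        int pivot = arr[l + (r - l) / 2]; // middle value as pivot
        int left = l, right = r;
        while (left <= right) {
            // Advance left past elements smaller than the pivot,
            // and right past elements larger than the pivot.
            while (arr[left] < pivot) left++;
            while (arr[right] > pivot) right--;
            if (left <= right) {
                int tmp = arr[left];
                arr[left] = arr[right];
                arr[right] = tmp;
                left++;
                right--;
            }
        }
        // Pointers have crossed: recurse on the two parts.
        quickSort(arr, l, right);
        quickSort(arr, left, r);
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 8, 1, 9, 3};
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 5, 8, 9]
    }
}
```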
Algorithm
- similar process to the version above
- instead of choosing the `pivot` from the middle value (which could be the largest/smallest, leading to worst-case time performance, since `quickSort` then cannot break the work into even loads to exploit parallelism), we look at three values: `array[low]`, `array[mid]`, `array[high]`
- take the median of the three as the `pivot`
- reorder so the smallest of the three is at `array[low]` and the largest at `array[high]`, then park the `pivot` at `array[high - 1]` (swapping it out of `array[mid]`)
Code Faster Version
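A sketch of the median-of-three variant described above, assuming the common layout where the median is parked at `array[high - 1]` before partitioning; this is a reconstruction, not the original code:

```java
import java.util.Arrays;

public class QuickSortMedian3 {
    public static void sort(int[] a) { quickSort(a, 0, a.length - 1); }

    private static void quickSort(int[] arr, int low, int high) {
        if (high - low < 2) { // ranges of size <= 2: sort directly
            if (low < high && arr[low] > arr[high]) swap(arr, low, high);
            return;
        }
        int pivot = medianOf3(arr, low, high); // median parked at arr[high - 1]
        int left = low, right = high - 1;
        while (true) {
            while (arr[++left] < pivot) ;  // arr[high - 1] == pivot bounds this scan
            while (arr[--right] > pivot) ; // arr[low] <= pivot bounds this scan
            if (left >= right) break;
            swap(arr, left, right);
        }
        swap(arr, left, high - 1); // restore the pivot to its final position
        quickSort(arr, low, left - 1);
        quickSort(arr, left + 1, high);
    }

    // Sort arr[low], arr[mid], arr[high] in place; park the median at arr[high - 1].
    private static int medianOf3(int[] arr, int low, int high) {
        int mid = low + (high - low) / 2;
        if (arr[mid] < arr[low]) swap(arr, low, mid);
        if (arr[high] < arr[low]) swap(arr, low, high);
        if (arr[high] < arr[mid]) swap(arr, mid, high);
        swap(arr, mid, high - 1);
        return arr[high - 1];
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
    }

    public static void main(String[] args) {
        int[] a = {6, 1, 8, 3, 9, 2, 7, 5};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 5, 6, 7, 8, 9]
    }
}
```

Parking the median at `array[high - 1]` gives both inner scans a natural sentinel (`array[low]` bounds the right scan, the pivot itself bounds the left scan), so no explicit index checks are needed inside the loops.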

radix/bucket sort
https://www.youtube.com/watch?v=Uey0GOMtT8
key idea
- can only be used for sorting integers; applicable when the range of numbers is known
- takes advantage of the fact that an integer has a finite number of digits
- key: during each pass, the order of lower digits is preserved (by the queues) while the pass over the next higher digit is performed
- it has to go from lower to higher digits (if we processed the higher digit first, the lower-digit passes would mess up its order; since the most significant digits matter most for the final order, we process them last)
- the array is in sorted order once every digit has been passed over once
algorithms
- find the largest element, e.g. 100
- the number of passes `k` is determined by the number of digits of the largest element; in this example, `k = 3`
- zero-pad all smaller values that have fewer than `k` digits
- prepare 10 buckets (queues, for example, since values need to be retrieved from the bottom after each pass)
- in each pass, put each value into the bucket matching its current digit; for example, with 023, 012, 100: after the first pass, 023 goes into bucket 3, 012 goes into bucket 2, and 100 goes into bucket 0
- repeating the above process for all `k` passes results in sorted order
https://www.youtube.com/watch?v=YXFI4osELGU
complexity
- time: $O(kn)$, where `k` is the number of digits of the max number
- space: $O(n)$
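A runnable LSD radix-sort sketch using 10 queue buckets, as described above (assumes non-negative integers; names and demo values are illustrative):

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Queue;

public class RadixSort {
    public static void sort(int[] a) {
        if (a.length == 0) return;
        int max = Arrays.stream(a).max().getAsInt();
        @SuppressWarnings("unchecked")
        Queue<Integer>[] buckets = new Queue[10];
        for (int i = 0; i < 10; i++) buckets[i] = new ArrayDeque<>();
        // One pass per digit, least significant first; FIFO queues preserve
        // the order established by the previous (lower-digit) passes.
        for (long exp = 1; exp <= max; exp *= 10) {
            for (int v : a) buckets[(int) (v / exp % 10)].add(v);
            int idx = 0;
            for (Queue<Integer> q : buckets)
                while (!q.isEmpty()) a[idx++] = q.poll();
        }
    }

    public static void main(String[] args) {
        int[] a = {100, 23, 12, 5, 77};
        sort(a);
        System.out.println(Arrays.toString(a)); // [5, 12, 23, 77, 100]
    }
}
```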
Insertion Sort
Algorithm
- start from $i = 1$, taking `currKey` $= A[i]$
- move every element larger than `currKey` one position to its right until the position to insert `currKey` is found
- insert `currKey` at index $j+1$, where $j$ is the last position checked that fails the shift condition (either $j < 0$ or `currKey` $\geq A[j]$)
Analysis
 worst case complexity $O(n^2)$
Code
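A runnable sketch of the algorithm above; `currKey` follows the text, everything else is illustrative:

```java
import java.util.Arrays;

public class InsertionSort {
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int currKey = a[i];
            int j = i - 1;
            // Shift elements strictly larger than currKey one slot right;
            // stopping on equal values keeps the sort stable.
            while (j >= 0 && a[j] > currKey) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = currKey; // insert at index j + 1
        }
    }

    public static void main(String[] args) {
        int[] a = {12, 11, 13, 5, 6};
        sort(a);
        System.out.println(Arrays.toString(a)); // [5, 6, 11, 12, 13]
    }
}
```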

Heap Sort
- sort algorithm based on the heap data structure
- heapsort is an in-place algorithm, but it is NOT a stable sort
Algorithm
- repeatedly swap the first value (the max) of the list with the last value
- decrease the range of values considered in the heap operation by one (so that the previous max will not be re-heapified to the top)
complexity
- time: $O(n \log n)$ for both best & worst case
- space: $O(1)$, in-place sort
ref: https://en.wikipedia.org/wiki/Heapsort#Algorithm
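The swap-and-shrink loop above can be sketched as an in-place heapsort (an illustrative version, not taken from the reference):

```java
import java.util.Arrays;

public class HeapSort {
    public static void sort(int[] a) {
        int n = a.length;
        // Build a max-heap bottom-up.
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(a, i, n);
        // Repeatedly swap the max (root) with the last value, then shrink
        // the heap range so the placed max is never re-heapified.
        for (int end = n - 1; end > 0; end--) {
            int tmp = a[0]; a[0] = a[end]; a[end] = tmp;
            siftDown(a, 0, end);
        }
    }

    // Restore the max-heap property for the subtree rooted at i.
    private static void siftDown(int[] a, int i, int size) {
        while (2 * i + 1 < size) {
            int child = 2 * i + 1;
            if (child + 1 < size && a[child + 1] > a[child]) child++;
            if (a[i] >= a[child]) return;
            int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
            i = child;
        }
    }

    public static void main(String[] args) {
        int[] a = {4, 10, 3, 5, 1};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 3, 4, 5, 10]
    }
}
```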
Lexicographical/Dictionary Sort
- sorting the integers from 1 to 13 gives `[1,10,11,12,13,2,3,4,5,6,7,8,9]`
- treating the integers as strings and using `Arrays.sort()` or `Collections.sort()` works, but has $O(n \log n)$ time complexity
- below is an $O(n)$ algorithm, where `n` is the number: it constructs a tree and does a preorder traversal, so each node (i.e. each number) is visited exactly once
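A runnable sketch with the `lexicalOrder` signature, doing the preorder walk iteratively: go one level deeper by multiplying by 10, otherwise advance to the next sibling, climbing back up past any trailing 9s (an illustrative reconstruction):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class LexicalOrder {
    // Preorder traversal of the implicit 10-ary tree: 1 -> 10..19, 2 -> 20..29, ...
    public static List<Integer> lexicalOrder(int n) {
        List<Integer> result = new ArrayList<>();
        int curr = 1;
        for (int i = 0; i < n; i++) {
            result.add(curr);
            if (curr * 10 <= n) {
                curr *= 10; // go deeper: 1 -> 10
            } else {
                // Climb up while the current node has no next sibling.
                while (curr % 10 == 9 || curr + 1 > n) curr /= 10;
                curr++; // move to the next sibling
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(lexicalOrder(13));
        // [1, 10, 11, 12, 13, 2, 3, 4, 5, 6, 7, 8, 9]
    }
}
```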
logs
 created on 08/30/2016
 updated quick sort 01/02/2017
 added selection sort
 01/23/18 add bubble sort
 01/23/18 add radix/bucket sort
 01/23/18 add “key idea” for quicksort
 02/08/18 refactored code for quicksort short version
 03/18/18 update quicksort key idea
 05/05/18 add heap sort idea
 05/05/18 add stability sort chart
 05/05/18 add shorter quicksort version
 09/03/19 add Lexicographical sort