Sorts
Bubble Sort
Bubble sort repeatedly compares each pair of adjacent elements and swaps them if they are out of order. In the worst case it has to pass over the array roughly as many times as there are elements.
After each pass the largest remaining element has bubbled to the end, so bubble sort does not need to touch the sorted tail of the array again.
https://leetcode.com/problems/minimum-swaps-to-arrange-a-binary-grid/description/
The worst-case time complexity of Bubble Sort is O(n^2) where n is the number of elements in the array.
In the given code, the outer loop runs a fixed number of passes, at most n-1, where n is the length of the array. (An optimized version would stop at the first pass in which no swaps happen, since that means the array is already sorted.)
Within each pass, the inner for loop walks over the unsorted portion of the array, which shrinks by one element after every pass, comparing adjacent elements and swapping them if they are in the wrong order.
Big O
The worst-case scenario occurs when the array is sorted in reverse order, so every comparison leads to a swap. The number of comparisons (and swaps) is then the sum of the first n-1 positive integers, which is n(n-1)/2. Therefore, the total number of comparisons and swaps is on the order of n^2, which is O(n^2).
The time complexity of Bubble sort is O(n^2) in the worst-case scenario, where n is the number of elements in the list. This means that the time taken by the algorithm to sort a list grows quadratically with the size of the list.
In the best-case scenario, when the list is already sorted, the time complexity of Bubble sort is O(n). However, in practice, it is unlikely to encounter a completely sorted list, so the worst-case scenario is a more realistic representation of its performance.
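The O(n) best case assumes the early-exit optimization: if a whole pass makes no swaps, the array is already sorted and the algorithm can stop. The timed code below always runs its full set of passes, so here is a minimal sketch of that variant (the class and method names are just for illustration, not part of the measured code):

public class BubbleSortEarlyExit {
    // Sketch of bubble sort with an early exit: if a full pass makes no swaps,
    // the array is already sorted and we can stop, giving O(n) comparisons in the best case.
    public static void sort(int[] arr) {
        boolean swapped = true;
        for (int pass = 1; pass < arr.length && swapped; pass++) {
            swapped = false;
            for (int i = 0; i < arr.length - pass; i++) {
                if (arr[i] > arr[i + 1]) {
                    int temp = arr[i];
                    arr[i] = arr[i + 1];
                    arr[i + 1] = temp;
                    swapped = true;
                }
            }
        }
    }
}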
public class BubbleSort {
    public static int sortAlgo(int[] bubble_arr) {
        int comps = 0;
        int swaps = 0;
        // int[] bubble_arr = {23,56,21,34,678,2,4,7,2235,996,446,999,6574,87,67,902,8754};
        /* Run passes 1 through length-1. After each pass the largest remaining element is in
           its final position, so the inner loop shrinks by one element every pass.
           (An optimized version would stop early at the first pass with no swaps.) */
        for (int passes = 1; passes < bubble_arr.length; passes++) {
            for (int i = 0; i < bubble_arr.length - passes; i++) {
                comps++;
                if (bubble_arr[i] > bubble_arr[i + 1]) {
                    int temp = bubble_arr[i];
                    bubble_arr[i] = bubble_arr[i + 1];
                    bubble_arr[i + 1] = temp;
                    swaps++;
                }
            }
        }
        System.out.println("Comparison: " + comps);
        System.out.println("Swaps: " + swaps);
        return swaps;
    }

    public static void main(String[] args) {
        double time = 0;
        int swaps_avg = 0;
        for (int i = 0; i < 12; i++) {
            int[] data = new int[5000];
            for (int j = 0; j < data.length; j++) {
                data[j] = (int) (Math.random() * 5000);
            }
            long startTime = System.nanoTime();
            swaps_avg = swaps_avg + sortAlgo(data);
            long endTime = System.nanoTime();
            time += endTime - startTime;
        }
        System.out.println("Total Average Time in Seconds: " + time / 1000000000.0);
        System.out.println("Average number of comparisons " + 12497500);
        System.out.println("Average number of swaps " + swaps_avg / 12);
    }
}
BubbleSort.main(null);
Insertion Sort
- Insertion sort maintains two sections of the array: a sorted one at the front and an unsorted one behind it.
- It starts with the second element (index 1) and decides where that element belongs within the sorted section.
- Each element is inserted in order by moving it to the left past any larger elements.
- Big O notation is O(n) in the best case.
- In the worst case, the notation is O(n^2).
The given code implements the Insertion Sort algorithm for sorting an array of integers. The worst-case time complexity of Insertion Sort is O(n^2) where n is the number of elements in the array.
The outer for loop iterates over the array n-1 times, where n is the length of the array. For each iteration, the inner loop walks from the current index toward the beginning of the array, checking whether the current element is smaller than the previous element and swapping the two if so. The inner loop may iterate up to i times, where i is the current index in the outer loop.
Big O
The worst-case scenario occurs when the array is sorted in reverse order, and every element must be moved all the way to the beginning of the array. In this case, the total number of comparisons and swaps is equal to the sum of the first n-1 positive integers, which is n(n-1)/2. Therefore, the total time complexity of the algorithm is O(n^2).
The time complexity of Insertion sort is O(n^2) in the worst-case scenario, where n is the number of elements in the list. This means that the time taken by the algorithm to sort a list grows quadratically with the size of the list.
In the best-case scenario, when the list is already sorted, the time complexity of Insertion sort is O(n). This is because each element is compared to the previous element and inserted in the correct position in the sorted array, without the need for additional comparisons.
Therefore, the time complexity of the Insertion Sort algorithm implemented in the given code is O(n^2).
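The O(n) best case relies on stopping the inner scan as soon as the current element is no smaller than its left neighbour. The measured code below always scans back to index 0, so its comparison count is the same for every input ordering; a minimal sketch of the early-stopping variant (class and method names are illustrative only) looks like this:

public class InsertionSortEarlyStop {
    // Sketch of insertion sort that stops the inner loop early: once the element
    // is no smaller than its left neighbour, the sorted prefix is intact and the
    // scan can end. On an already sorted array this does only n-1 comparisons.
    public static void sort(int[] array) {
        for (int index = 1; index < array.length; index++) {
            for (int sorted_index = index; sorted_index > 0; sorted_index--) {
                if (array[sorted_index] < array[sorted_index - 1]) {
                    int temp = array[sorted_index];
                    array[sorted_index] = array[sorted_index - 1];
                    array[sorted_index - 1] = temp;
                } else {
                    break; // element is in place; no need to scan further left
                }
            }
        }
    }
}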
public class InsertionSort {
    public static int sortAlgo(int[] array) {
        int comps = 0;
        int swaps = 0;
        /* Treat the element at index 0 as an already-sorted section of length one,
         * then insert each following element into the sorted section.
         * Note: this inner loop always scans back to index 0, so the comparison
         * count is the same for every input ordering. */
        for (int index = 1; index < array.length; index++) {
            for (int sorted_index = index; sorted_index > 0; sorted_index--) {
                comps++;
                if (array[sorted_index] < array[sorted_index - 1]) {
                    int temp = array[sorted_index];
                    array[sorted_index] = array[sorted_index - 1];
                    array[sorted_index - 1] = temp;
                    swaps++;
                }
            }
        }
        System.out.println("Comparison: " + comps);
        System.out.println("Swaps: " + swaps);
        return swaps;
    }

    public static void main(String[] args) {
        double time = 0;
        int swaps_avg = 0;
        for (int i = 0; i < 12; i++) {
            int[] data = new int[5000];
            for (int j = 0; j < data.length; j++) {
                data[j] = (int) (Math.random() * 5000);
            }
            long startTime = System.nanoTime();
            swaps_avg = swaps_avg + sortAlgo(data);
            long endTime = System.nanoTime();
            time += endTime - startTime;
        }
        System.out.println("Total Average Time in Seconds: " + time / 1000000000.0);
        System.out.println("Average number of comparisons " + 12497500);
        System.out.println("Average number of swaps " + swaps_avg / 12);
    }
}
InsertionSort.main(null);
Selection Sort
- Given an unsorted set of numbers, we repeatedly find the lowest number in the unsorted section and move it to the end of the array.
- We keep taking the lowest number from the unsorted section and adding it to the right of the sorted section.
- Selection sort is an in-place sorting algorithm: it needs only a constant amount of extra memory.
The given code implements the Selection Sort algorithm for sorting an array of integers. The worst-case time complexity of Selection Sort is O(n^2) where n is the number of elements in the array.
The outer for loop iterates over the array n times, where n is the length of the array. For each iteration, it finds the minimum element in the remaining unsorted portion of the array and moves it into the sorted section built up at the end of the array by shifting the intervening elements left. (A textbook implementation would instead swap the minimum with the first element of the unsorted portion; see the sketch after the Big O discussion below.) The inner loop scans the remaining unsorted portion, which can contain at most n elements.
Big O
The time complexity of Selection sort is O(n^2) in the worst-case scenario, where n is the number of elements in the list. This means that the time taken by the algorithm to sort a list grows quadratically with the size of the list.
In the best-case scenario, when the list is already sorted, the time complexity of Selection sort is still O(n^2). This is because the algorithm still needs to iterate through the entire list to find the minimum element in each iteration, even if it is already in the correct position.
Unlike bubble sort and insertion sort, selection sort performs the same number of comparisons for every input ordering: the inner scans sum to roughly n(n-1)/2 comparisons in total, which is why both the best and worst cases are O(n^2). The textbook version performs at most n-1 swaps.
Therefore, the time complexity of the Selection Sort algorithm implemented in the given code is O(n^2).
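For reference, the textbook form of selection sort swaps the minimum of the unsorted portion with the first unsorted element rather than shifting; the measured code below uses a variant that removes each minimum by shifting and appends it to the sorted section at the end of the array. A minimal sketch of the swap-based form (class and method names are illustrative only):

public class SelectionSortSwap {
    // Sketch of the classic selection sort: find the minimum of the unsorted
    // suffix and swap it into position. It performs at most n-1 swaps and always
    // n(n-1)/2 comparisons, regardless of the initial order.
    public static void sort(int[] array) {
        for (int start = 0; start < array.length - 1; start++) {
            int minIdx = start;
            for (int i = start + 1; i < array.length; i++) {
                if (array[i] < array[minIdx]) {
                    minIdx = i;
                }
            }
            int temp = array[start];
            array[start] = array[minIdx];
            array[minIdx] = temp;
        }
    }
}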
public class SelectionSort {
    public static int sortAlgo(int[] array) {
        int comps = 0;
        int swaps = 0;
        // int[] array = {23,12,57,335,5433,24567,64,25,66};
        int times = array.length;
        for (int outer = times; outer > 0; outer--) {
            int min = Integer.MAX_VALUE; // sentinel: any element is smaller than this
            int min_idx = 0;
            // Scan the unsorted portion (the first 'outer' elements) for its minimum.
            for (int i = 0; i < outer; i++) {
                comps++;
                if (array[i] < min) {
                    min = array[i];
                    min_idx = i;
                    swaps++; // counts how many times a new minimum is found
                }
            }
            // Remove the minimum by shifting everything after it one slot to the left,
            // then append it to the sorted section at the end of the array.
            for (int j = min_idx + 1; j < array.length; j++) {
                array[j - 1] = array[j];
            }
            array[array.length - 1] = min;
        }
        System.out.println("Comparison: " + comps);
        System.out.println("Swaps: " + swaps);
        return swaps;
    }

    public static void main(String[] args) {
        double time = 0;
        double swaps_avg = 0;
        for (int i = 0; i < 12; i++) {
            int[] data = new int[5000];
            for (int j = 0; j < data.length; j++) {
                data[j] = (int) (Math.random() * 5000);
            }
            long startTime = System.nanoTime();
            swaps_avg = swaps_avg + sortAlgo(data);
            long endTime = System.nanoTime();
            time += endTime - startTime;
        }
        System.out.println("Total Average Time in Seconds: " + time / 1000000000.0);
        System.out.println("Average number of comparisons " + 12502500);
        System.out.println("Average number of swaps " + swaps_avg / 12);
    }
}
SelectionSort.main(null);
Merge Sort
Merge sort is a divide-and-conquer sorting algorithm that works by dividing the input array into two halves, sorting each half recursively, and then merging the sorted halves back together to form the final sorted array.
Big O
The time complexity of Merge sort is O(n log n) in the worst-case scenario, where n is the number of elements in the list. The time taken grows in proportion to n log n: the recursion is about log2(n) levels deep, and each level does O(n) work merging.
In the best-case scenario, when the list is already sorted, the time complexity of Merge sort is still O(n log n). This is because the algorithm still needs to divide and merge the sorted halves, even if they are already in the correct order.
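One standard way to see where the n log n comes from (a textbook recurrence argument, not something the code below measures): if T(n) is the running time on n elements and c is a constant covering the linear merge work, then

$$T(n) = 2\,T\!\left(\frac{n}{2}\right) + cn \;\Longrightarrow\; T(n) \approx cn\log_2 n + cn = O(n \log n),$$

since each of the roughly log2(n) levels of recursion does about cn total work merging.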
public class MergeSort {
    public static int sortAlgo(int[] inputArray) {
        int inputLength = inputArray.length;
        if (inputLength < 2) {
            return 0; // an array of 0 or 1 elements is already sorted
        }
        int midIndex = inputLength / 2;
        int[] leftHalf = new int[midIndex];
        int[] rightHalf = new int[inputLength - midIndex];
        for (int i = 0; i < midIndex; i++) {
            leftHalf[i] = inputArray[i];
        }
        for (int i = midIndex; i < inputLength; i++) {
            rightHalf[i - midIndex] = inputArray[i];
        }
        sortAlgo(leftHalf);
        sortAlgo(rightHalf); // recursively keep breaking the halves into halves into halves...
        int x = 0;
        // Note: only the counter from this final merge is returned; the counts from the
        // recursive calls above are not accumulated.
        x = x + merge(inputArray, leftHalf, rightHalf);
        return x;
    }

    private static int merge(int[] inputArray, int[] leftHalf, int[] rightHalf) {
        int comps = 0;
        int leftSize = leftHalf.length;
        int rightSize = rightHalf.length;
        int i = 0;
        int j = 0;
        int k = 0;
        // Repeatedly copy the smaller front element of the two halves back into inputArray.
        // comps is incremented once per element placed, so a full merge reports exactly
        // inputArray.length.
        while (i < leftSize && j < rightSize) {
            if (leftHalf[i] <= rightHalf[j]) {
                inputArray[k] = leftHalf[i];
                i++;
                comps++;
            } else {
                inputArray[k] = rightHalf[j];
                j++;
                comps++;
            }
            k++;
        }
        // Copy over whatever is left in either half.
        while (i < leftSize) {
            inputArray[k] = leftHalf[i];
            i++;
            k++;
            comps++;
        }
        while (j < rightSize) {
            inputArray[k] = rightHalf[j];
            j++;
            k++;
            comps++;
        }
        return comps;
    }

    public static void main(String[] args) {
        double time = 0;
        double comps_avg = 0;
        for (int i = 0; i < 12; i++) {
            int[] data = new int[5000];
            for (int j = 0; j < data.length; j++) {
                data[j] = (int) (Math.random() * 5000);
            }
            long startTime = System.nanoTime();
            comps_avg = comps_avg + sortAlgo(data);
            long endTime = System.nanoTime();
            System.out.println("Comparison: " + comps_avg);
            System.out.println("Swaps: " + 0);
            time += endTime - startTime;
        }
        System.out.println("Total Average Time in Seconds: " + time / 1000000000.0);
        System.out.println("Average number of comparisons " + comps_avg / 12);
        System.out.println("Average number of swaps " + 0);
        System.out.println("Important thing to note: Merge sort does not need any swapping because it uses a temporary buffer array for processing the merge operation.");
    }
}
MergeSort.main(null);
For large data sets, merge sort is typically the best choice among these four sorting algorithms. This is because merge sort has a time complexity of O(n log n), which grows much slower than the O(n^2) time complexity of bubble sort, insertion sort, and selection sort.
While merge sort requires additional memory to store the temporary arrays used during the merging process, it is still efficient for sorting large data sets as long as there is sufficient memory available. Merge sort is also a stable sorting algorithm, meaning that it maintains the relative order of equal elements in the input array.
In summary, if you are dealing with a large data set and you want a sorting algorithm that is efficient and stable, merge sort is a good choice. However, if memory usage is a concern or the data set is small, then bubble sort, insertion sort, or selection sort may be appropriate. This is reflected in the table below, which compares the average times, comparisons, and swaps measured above, along with the Big O for each sort.
Sort | Avg Time (s) | Avg Comparisons | Avg Swaps | Big O |
---|---|---|---|---|
Bubble Sort | 0.469380389 | 12497500 | 6238572 | O(n^2) |
Insertion Sort | 0.250415816 | 12497500 | 6216991 | O(n^2) |
Selection Sort | 0.192432211 | 12502500 | 39650.5 | O(n^2) |
Merge Sort | 0.018134474 | 5000.0 | 0.0 | O(n log n) |
Hashmaps
Hash maps are data structures that store data in key-value pairs. They allow for fast retrieval of data based on a given key. In a hash map, the key is run through a hash function that converts it into an index within an array. The value associated with that key is then stored in the array at the corresponding index.
The use of hash functions allows for constant-time access to the value associated with a given key, making hash maps very efficient for data retrieval. Hash maps are widely used in computer science and programming for tasks such as caching, database indexing, and implementing associative arrays.
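As a rough illustration of the key-to-index step described above (a hand-rolled sketch of the idea, not the actual internals of java.util.HashMap; the class name HashIndexDemo is just for this example):

import java.util.HashMap;

public class HashIndexDemo {
    public static void main(String[] args) {
        int capacity = 16; // number of buckets in a toy hash table
        String key = "Aadya";
        // Reduce the key's hash code to a bucket index in the range [0, capacity).
        int index = Math.floorMod(key.hashCode(), capacity);
        System.out.println("Key '" + key + "' maps to bucket " + index);

        // java.util.HashMap does this bookkeeping internally and gives
        // average constant-time put() and get() by key.
        HashMap<String, Integer> map = new HashMap<String, Integer>();
        map.put(key, 16);
        System.out.println("Value for '" + key + "': " + map.get(key));
    }
}
HashIndexDemo.main(null);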
To compare lookup speed, the code below times two approaches on the same data: checking for a key with the HashMap's containsKey() method, and finding the value with a binary search on the array (which first has to be sorted). Each helper measures the elapsed time in nanoseconds and returns it.
import java.util.HashMap;
import java.lang.Integer;
import java.util.Scanner;

public class HashMapSortByValue {
    public static void main(String[] args) {
        int[] list = new int[5000];
        HashMap<Integer, Integer> hashmap = new HashMap<Integer, Integer>();
        for (int i = 0; i < list.length; i++) {
            Integer rand = (int) (Math.random() * 5000);
            list[i] = rand;
            hashmap.put(rand, rand);
        }
        double BS_time_avg = 0;
        double HM_time_avg = 0;
        for (int times = 0; times < 12; times++) {
            long HM_time = (hash(hashmap, 12));
            HM_time_avg = HM_time_avg + HM_time;
            System.out.println("Time using sort with Hashmap " + HM_time + " nanoseconds");
            System.out.println();
            long BS_time = (binarySearchTime(list, 12));
            BS_time_avg = BS_time_avg + BS_time;
            System.out.println("Time using binary search " + BS_time + " nanoseconds");
        }
        System.out.println("AVERAGE TIME using hashmap search " + HM_time_avg / 12 + " nanoseconds");
        System.out.println("AVERAGE TIME using binary search " + BS_time_avg / 12 + " nanoseconds");
    }

    public static long hash(HashMap<Integer, Integer> hashmap, Integer find) {
        long start = System.nanoTime();
        hashmap.containsKey(find); // built-in HashMap lookup
        long end = System.nanoTime();
        return (end - start);
    }

    public static long binarySearchTime(int[] list, Integer find) {
        long start = System.nanoTime();
        /*
         * Binary search repeatedly halves the portion of the list that could contain the item
         * until the possible locations are narrowed down to just one.
         * HOWEVER: binary search needs a sorted list, so we first sort it using insertion sort
         * (any sorting algorithm would do). Note that the sort happens inside the timed region,
         * which is the main reason this method reports much larger times than the hashmap lookup.
         */
        for (int index = 1; index < list.length; index++) {
            for (int sorted_index = index; sorted_index > 0; sorted_index--) {
                if (list[sorted_index] < list[sorted_index - 1]) {
                    int temp = list[sorted_index];
                    list[sorted_index] = list[sorted_index - 1];
                    list[sorted_index - 1] = temp;
                }
            }
        }
        int min = 0;
        int max = list.length - 1;
        int middle = (min + max) / 2;
        while (min <= max) {
            if (list[middle] < find) {
                min = middle + 1;
            } else if (list[middle] == find) {
                break;
            } else {
                max = middle - 1;
            }
            middle = (min + max) / 2;
        }
        long end = System.nanoTime();
        return (end - start);
    }
}
HashMapSortByValue.main(null);
// The binary search time comes out significantly higher than the HashMap lookup, largely because the timed region also includes sorting the array.
import java.util.*;

public class Hashmap_explore {
    public static void main(String[] args) {
        HashMap<Integer, Integer> hashmap = new HashMap<Integer, Integer>();
        int[] list = new int[5000];
        for (int i = 0; i < list.length; i++) {
            Integer value = (int) (Math.random() * 5000);
            hashmap.put(value, value);
            list[i] = value;
        }
        // Learning a different way to do the sorting: copy the map entries into a list
        // and override the compare method so the entries come back in ascending order by value.
        // The time complexity of this sort is O(n log n) as well.
        List<Map.Entry<Integer, Integer>> entries = new ArrayList<>(hashmap.entrySet());
        Collections.sort(entries, new Comparator<Map.Entry<Integer, Integer>>() {
            public int compare(Map.Entry<Integer, Integer> int1, Map.Entry<Integer, Integer> int2) {
                return int1.getValue() - int2.getValue();
            }
        });
    }
}
import java.util.*;

class Iterate_Hash {
    public static void main(String[] args) {
        // Creating a HashMap of string-integer pairs
        // It contains student names and their marks
        HashMap<String, Integer> hm = new HashMap<String, Integer>();
        // Adding mappings to the above HashMap using put()
        hm.put("Aadya", 16);
        hm.put("Avanthika", 14);
        hm.put("Pranavi", 16);
        // Printing all elements of the HashMap
        System.out.println("Created hashmap is " + hm);
        Iterator<Map.Entry<String, Integer>> hmIterator = hm.entrySet().iterator();
        System.out.println("HashMap after adding bonus marks:");
        // Iterating through the HashMap and adding 10 bonus marks to each entry
        while (hmIterator.hasNext()) {
            Map.Entry<String, Integer> mapElement = hmIterator.next();
            int marks = mapElement.getValue() + 10;
            // Printing the updated entry
            System.out.println(mapElement.getKey() + " : " + marks);
        }
    }
}
Iterate_Hash.main(null);
import java.util.*;

public class Delete_hashmap {
    public static void main(String[] args) {
        // Creating an empty HashMap
        HashMap<Integer, String> hash_map = new HashMap<Integer, String>();
        // Adding values to the hash map
        hash_map.put(1, "English");
        hash_map.put(2, "CSA");
        hash_map.put(3, "Engineering");
        hash_map.put(4, "Physics");
        hash_map.put(5, "Calc");
        // Displaying the HashMap
        System.out.println("Initial Mappings are: " + hash_map);
        // Removing an existing key mapping; remove() returns the value that was mapped to it
        String returned_value = hash_map.remove(1);
        System.out.println("Returned value is: " + returned_value);
        // Printing the new map
        System.out.println("New map is: " + hash_map);
    }
}
Delete_hashmap.main(null);