Showing posts from February, 2025

CST 370 - Design and Analysis of Algorithms - Week 7

  This week in CST-370, I learned about non-comparison-based sorting algorithms such as Counting Sort and Radix Sort. Unlike traditional sorting algorithms that rely on comparisons, Counting Sort uses frequency counting and distribution calculations to sort integers efficiently when the range of numbers is relatively small. The time complexity of Counting Sort is O(n + k), where n is the number of elements and k is the range of values. Additionally, I learned about Radix Sort, which processes numbers digit by digit, starting from the least significant digit (LSD) and applying Counting Sort iteratively. Radix Sort is particularly useful for sorting numbers with multiple digits, and its efficiency is O(d * (n + k)), where d represents the number of digits in the largest number. Counting Sort maintains the stability of elements, preserving their relative order in the output. I also learned more about Dynamic Programming (DP) and its applications in optimization problems. The concept of over...
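The ideas above can be sketched in a short example: a stable Counting Sort, then an LSD Radix Sort that calls it once per digit. This is a minimal illustration, not code from the course; the function names and the `key` parameter are my own.

```python
def counting_sort(arr, key=lambda x: x):
    """Stable counting sort for non-negative integer keys, O(n + k)."""
    if not arr:
        return []
    k = max(key(x) for x in arr)
    count = [0] * (k + 1)
    for x in arr:                      # frequency counting
        count[key(x)] += 1
    for i in range(1, k + 1):          # prefix sums = distribution positions
        count[i] += count[i - 1]
    out = [None] * len(arr)
    for x in reversed(arr):            # right-to-left preserves stability
        count[key(x)] -= 1
        out[count[key(x)]] = x
    return out

def radix_sort(arr, base=10):
    """LSD radix sort: counting-sort by each digit, least significant first."""
    if not arr:
        return []
    result = list(arr)
    digits = len(str(max(result)))     # d = digits in the largest number
    for d in range(digits):
        result = counting_sort(result, key=lambda x: (x // base**d) % base)
    return result
```

Because Counting Sort is stable, earlier digit passes are never undone by later ones, which is exactly why LSD Radix Sort works.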

CST 370 - Design and Analysis of Algorithms - Week 6

  This week in CST-370, we learned about more advanced data structures such as AVL trees, heaps, and hashing. AVL trees solve the issue of unbalanced binary search trees by maintaining a balance factor between -1 and 1, ensuring logarithmic time complexity for operations like search, insert, and delete. I learned how rotations (left, right, and combinations of both) restore balance when inserting or deleting nodes. The concept of balancing height was interesting because it highlights the trade-off between implementation complexity and performance. We also learned about heaps, focusing on their role as priority queues. A heap is a complete binary tree where each parent node adheres to a specific order property, being either greater than (max-heap) or less than (min-heap) its children. I learned the importance of heapify operations and how heaps support efficient insertion and deletion in logarithmic time. Also, heapsort was introduced as a sorting algorithm that leverages the heap’s structure to achie...
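To make the heap ideas concrete, here is a small sketch of a max-heap sift-down (one form of the heapify operation) and a heapsort built on it. This is my own illustrative version using an array-based heap, not code from the course.

```python
def sift_down(a, i, n):
    """Restore the max-heap property at index i within a[:n]."""
    while True:
        largest = i
        left, right = 2 * i + 1, 2 * i + 2   # children in the array layout
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]  # swap down and continue
        i = largest

def heapsort(a):
    """Sort a in place: build a max-heap, then repeatedly extract the max."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):      # bottom-up heap construction, O(n)
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):          # move current max to the end
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)                 # re-heapify the shrunken prefix
    return a
```

The complete-binary-tree shape is what lets the heap live in a plain array, with the children of index i at 2i+1 and 2i+2.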

CST 370 - Design and Analysis of Algorithms - Week 5

  This week in CST-370, I learned more about QuickSort and its importance in algorithm design. QuickSort is a divide-and-conquer algorithm that stands out for being efficient and "in place," unlike Merge Sort. I learned how the partitioning step affects its performance, with the best case being a balanced split and the worst case occurring when the partition divides unevenly, such as in already sorted or reverse-sorted arrays. Empirical analysis highlights the practical value of efficient algorithms over brute computational power, which made me prioritize learning effective techniques. Additionally, I learned about binary tree traversals, specifically the preorder, inorder, and postorder traversal methods. Each traversal visits every node, and I practiced some examples to better understand the order in which nodes are visited. Also, I was introduced to the "Decrease and Conquer" approach, an algorithm design technique that simplifies a problem by solv...
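A short sketch of QuickSort with a Lomuto-style partition shows how the in-place rearrangement works. This is my own illustration, not the course's exact version; note that always choosing the last element as the pivot is exactly what produces the worst-case behavior on already sorted input mentioned above.

```python
def partition(a, lo, hi):
    """Lomuto partition: put pivot a[hi] in its final spot, return its index."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:                 # smaller elements go left of i
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]            # drop the pivot between the halves
    return i

def quicksort(a, lo=0, hi=None):
    """In-place QuickSort: partition, then recurse on both sides."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)
    return a
```

A balanced partition gives the O(n log n) best case; when every partition strips off only one element, the recursion depth becomes n and the running time degrades to O(n²).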

CST 370 - Design and Analysis of Algorithms - Week 4

  This week in CST-370, we learned about the merge sort algorithm. Merge sort follows the divide-and-conquer technique to efficiently sort an input array, with a time complexity of Θ(n log n). I found this algorithm relatively easy to understand. The process begins by dividing an unsorted array into two halves, and this division continues recursively until we are left with individual elements. Once the base case is reached, the merging phase begins, where sorted halves are combined by repeatedly taking the smaller of their front elements. This merging process continues until a fully sorted array is obtained. Besides learning about merge sort, we spent the rest of the week reviewing for the midterm. While we did not cover much new material, I found merge sort to be an interesting and useful algorithm.
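The recursive split-then-merge process can be sketched briefly; this is a minimal illustrative version of my own, not the course implementation.

```python
def merge_sort(a):
    """Recursively split the array, then merge the sorted halves."""
    if len(a) <= 1:                      # base case: single element is sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # merge: repeatedly take the smaller front element of the two halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                 # one of these is already empty
    out.extend(right[j:])
    return out
```

Using `<=` when the fronts are equal keeps the merge stable, so equal elements retain their original relative order.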