Posts

Showing posts from February, 2026

CST370: Preparing for the Final: Algorithms in Practice

This week, I learned how several algorithms solve optimization and graph problems, including dynamic programming for the coin-collecting and coin-row problems, Floyd's and Warshall's algorithms for all-pairs shortest paths and transitive closure, and Prim's algorithm for minimum spanning trees. I practiced tracing tables step by step and watching how intermediate states evolve, which helped me connect these concepts to earlier topics like sorting and greedy methods. I also started reviewing for the final exam by working through the review materials and key topics such as algorithm analysis, sorting, graph algorithms, and problem-solving strategies to reinforce my understanding and identify areas that need more practice. Additionally, I watched a video review of Dijkstra's algorithm (https://www.youtube.com/watch?v=Gd92jSu_cZk), which helped reinforce how to trace the algorithm step by step and understand how shortest paths are computed in practice.
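To make the coin-row idea concrete, here is a minimal sketch of the dynamic-programming solution I traced; the coin values are just an illustrative example, not a specific homework instance. The recurrence is F(n) = max(c_n + F(n-2), F(n-1)): either take the current coin plus the best two back, or skip it.

```python
def coin_row(coins):
    """Max total value from a row of coins with no two adjacent picks.

    Uses the recurrence F(n) = max(coins[n] + F(n-2), F(n-1)),
    keeping only the last two table entries instead of the full table.
    """
    prev, curr = 0, 0  # F(n-2) and F(n-1)
    for c in coins:
        prev, curr = curr, max(c + prev, curr)
    return curr

print(coin_row([5, 1, 2, 10, 6, 2]))  # -> 17 (picking 5, 10, and 2)
```

Tracing (prev, curr) after each coin reproduces the one-row DP table from the lecture.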

CST370: Exploring Efficient Data Structures This Week

This week I focused on understanding different data structures that improve search efficiency, including AVL trees, 2-3 trees, heaps, and hashing. I learned how AVL trees maintain balance using balance factors and rotations, and how to identify when rotations like left, right, or double rotations are needed after insertion. For 2-3 trees, I studied how nodes can store one or two keys and how promotions work to keep the tree balanced with all leaves at the same level.

I also learned about heaps and how they are implemented as complete binary trees. I practiced operations such as inserting elements, deleting the maximum, and building a heap using both top-down and bottom-up approaches. Understanding how heaps are represented using arrays helped me see how parent and child relationships are calculated using indices. I explored heap sort and how it repeatedly removes the maximum element to produce a sorted sequence with O(n log n) time complexity. For hashing, I learned how hash fun...
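The array-based heap ideas above can be sketched as follows; this is a minimal max-heap illustration, not code from the course. In the 0-based array layout, the children of index i live at 2i+1 and 2i+2 and the parent at (i-1)//2, which is what makes bottom-up construction and heap sort work on a plain list.

```python
def sift_down(heap, i, n):
    """Restore the max-heap property at index i within heap[0:n].
    Children of i are at 2*i+1 and 2*i+2; parent is at (i-1)//2."""
    while 2 * i + 1 < n:
        child = 2 * i + 1
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1  # pick the larger child
        if heap[i] >= heap[child]:
            break
        heap[i], heap[child] = heap[child], heap[i]
        i = child

def build_heap(arr):
    """Bottom-up construction: sift down every internal node, last first."""
    for i in range(len(arr) // 2 - 1, -1, -1):
        sift_down(arr, i, len(arr))
    return arr

def heap_sort(arr):
    """Repeatedly swap the max (root) to the end and shrink the heap."""
    build_heap(arr)
    for end in range(len(arr) - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]  # delete-max into final slot
        sift_down(arr, 0, end)
    return arr

print(heap_sort([4, 1, 3, 9, 7]))  # -> [1, 3, 4, 7, 9]
```

Deleting the maximum n times, each at O(log n) cost, gives heap sort its O(n log n) bound.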

CST370: Sorting Smarter, Not Harder

This week’s module focused on algorithm design techniques and graph algorithms. I learned how divide-and-conquer is applied in QuickSort, including how pivot selection affects performance and how the Median-of-Three strategy helps avoid worst-case scenarios. I also studied decrease-and-conquer through binary search and understood why sorting first can improve efficiency. In addition, I learned about Directed Acyclic Graphs (DAGs) and topological sorting. Using Kahn’s algorithm, I practiced calculating in-degrees, identifying source vertices, and detecting cycles. The quizzes and HW4_2 helped reinforce how algorithm steps and data structures like queues affect the final output order.
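The Kahn's-algorithm steps described above can be sketched like this; the vertex names and edges are a made-up example, and the queue is exactly the data structure whose ordering affects the final output.

```python
from collections import deque

def kahn_topological_sort(vertices, edges):
    """Kahn's algorithm: repeatedly dequeue a source (in-degree 0) vertex.
    Returns a topological order, or None if a cycle prevents one."""
    in_degree = {v: 0 for v in vertices}
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        in_degree[v] += 1
    queue = deque(v for v in vertices if in_degree[v] == 0)  # initial sources
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:          # removing u lowers its neighbors' in-degrees
            in_degree[v] -= 1
            if in_degree[v] == 0:
                queue.append(v)   # v has become a source
    # If some vertices were never dequeued, a cycle kept their in-degree > 0.
    return order if len(order) == len(vertices) else None

print(kahn_topological_sort("abcd", [("a","b"), ("a","c"), ("b","d"), ("c","d")]))
# -> ['a', 'b', 'c', 'd']
```

Swapping the deque for a stack would still yield a valid topological order, just a different one, which is why the choice of data structure changes the output.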

CST370: Divide and Conquer & Merge Sort

This week, I focused on understanding the Merge Sort algorithm and how the divide and conquer strategy works in practice. From the lecture video, I learned that Merge Sort repeatedly splits an array into two halves until each subarray has only one element, which is already sorted. The key step is the merge operation, where two sorted subarrays are combined by comparing elements and placing them in order. Using the recurrence relation T(n) = 2T(n/2) + Θ(n) and applying the Master Theorem, I learned that the overall time complexity of Merge Sort is Θ(n log n). This helped me connect the algorithm steps with formal time-complexity analysis.
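The split-then-merge process above can be sketched in a few lines; this is a minimal illustration rather than the lecture's exact code. Each level of recursion halves the input (the 2T(n/2) term), and each merge does linear work (the Θ(n) term).

```python
def merge(left, right):
    """Combine two sorted lists by repeatedly taking the smaller front element."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of these
    out.extend(right[j:])  # still has elements left
    return out

def merge_sort(a):
    """Divide in half, sort each half recursively, then merge.
    T(n) = 2T(n/2) + Theta(n)  =>  Theta(n log n) by the Master Theorem."""
    if len(a) <= 1:
        return a  # a single element is already sorted
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([5, 2, 8, 1, 9, 3]))  # -> [1, 2, 3, 5, 8, 9]
```

Counting comparisons in the merge step is a good way to see where the Θ(n) per-level cost in the recurrence comes from.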