
CST370: Preparing for the Final: Algorithms in Practice

This week, I learned how several algorithms solve optimization and graph problems, including dynamic programming for the coin-collecting and coin-row problems, Floyd’s algorithm for all-pairs shortest paths, Warshall’s algorithm for transitive closure, and Prim’s algorithm for minimum spanning trees. I practiced tracing tables step by step and following how intermediate states evolve, which helped me connect the concepts across topics like sorting and greedy methods. I also started reviewing for the final exam by going through the review materials and key topics such as algorithm analysis, sorting, graph algorithms, and problem-solving strategies to reinforce my understanding and identify areas that need more practice. Additionally, I watched a video review of Dijkstra’s algorithm (https://www.youtube.com/watch?v=Gd92jSu_cZk), which helped reinforce how to trace the algorithm step by step and understand how shortest paths are computed in practice.
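To make the dynamic-programming idea concrete, here is a minimal sketch of the coin-row recurrence F(i) = max(c_i + F(i-2), F(i-1)) in Python. The function name and the sample coin values are my own illustration, not taken from the homework.

```python
def coin_row_max(coins):
    """Maximum value pickable from a row of coins with no two adjacent coins taken."""
    if not coins:
        return 0
    prev2, prev1 = 0, coins[0]          # F(0) = 0, F(1) = c_1
    for c in coins[1:]:
        # F(i) = max(c_i + F(i-2), F(i-1))
        prev2, prev1 = prev1, max(c + prev2, prev1)
    return prev1

# Example trace: coins 5, 1, 2, 10, 6, 2 -> best total is 17 (take 5, 10, and 2)
print(coin_row_max([5, 1, 2, 10, 6, 2]))
```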
Recent posts

CST370: Exploring Efficient Data Structures This Week

This week I focused on understanding different data structures that improve search efficiency, including AVL trees, 2-3 trees, heaps, and hashing. I learned how AVL trees maintain balance using balance factors and rotations, and how to identify when rotations like left, right, or double rotations are needed after insertion. For 2-3 trees, I studied how nodes can store one or two keys and how promotions work to keep the tree balanced with all leaves at the same level. I also learned about heaps and how they are implemented as complete binary trees. I practiced operations such as inserting elements, deleting the maximum, and building a heap using both top-down and bottom-up approaches. Understanding how heaps are represented using arrays helped me see how parent and child relationships are calculated using indices. I explored heap sort and how it repeatedly removes the maximum element to produce a sorted sequence with O(n log n) time complexity. For hashing, I learned how hash fun...
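As a quick reminder of how the array representation works, here is a small sketch of the parent/child index arithmetic and a bottom-up heap construction, assuming 0-based indexing (the lectures may use 1-based indexing, in which case the formulas shift by one).

```python
def parent(i):      # index of the parent of node i (0-based array)
    return (i - 1) // 2

def children(i):    # indices of the left and right children of node i
    return 2 * i + 1, 2 * i + 2

def heapify(a):
    """Bottom-up construction of a max-heap in place."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):          # start from the last internal node
        j = i
        while True:
            left, right = children(j)
            largest = j
            if left < n and a[left] > a[largest]:
                largest = left
            if right < n and a[right] > a[largest]:
                largest = right
            if largest == j:
                break
            a[j], a[largest] = a[largest], a[j]  # sift the smaller value down
            j = largest
    return a

print(heapify([2, 9, 7, 6, 5, 8]))   # -> [9, 6, 8, 2, 5, 7]
```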

CST370: Sorting Smarter, Not Harder

This week’s module focused on algorithm design techniques and graph algorithms. I learned how divide-and-conquer is applied in QuickSort, including how pivot selection affects performance and how the Median-of-Three strategy helps avoid worst-case scenarios. I also studied decrease-and-conquer through binary search and understood why sorting first can improve efficiency. In addition, I learned about Directed Acyclic Graphs (DAGs) and topological sorting. Using Kahn’s algorithm, I practiced calculating in-degrees, identifying source vertices, and detecting cycles. The quizzes and HW4_2 helped reinforce how algorithm steps and data structures like queues affect the final output order.
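For reference, here is a minimal sketch of Kahn’s algorithm using the in-degree counting and queue of source vertices described above; the adjacency-list format and sample graph are illustrative assumptions, not the HW4_2 input.

```python
from collections import deque

def kahn_topological_sort(graph):
    """Topological order of a DAG given as {vertex: [neighbors]}; returns None if a cycle exists."""
    # Count incoming edges for every vertex (some vertices may only appear as neighbors).
    indegree = {v: 0 for v in graph}
    for v in graph:
        for w in graph[v]:
            indegree[w] = indegree.get(w, 0) + 1

    # Source vertices are those with in-degree 0.
    queue = deque(v for v, d in indegree.items() if d == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph.get(v, []):
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)

    # If some vertices were never removed, the graph contains a cycle.
    return order if len(order) == len(indegree) else None

# Example DAG; one valid order is a, b, c, d
print(kahn_topological_sort({'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}))
```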

CST370: Divide and Conquer & Merge Sort

This week, I focused on understanding the Merge Sort algorithm and how the divide-and-conquer strategy works in practice. From the lecture video, I learned that Merge Sort repeatedly splits an array into two halves until each subarray has only one element, which is already sorted. The key step is the merge operation, where two sorted subarrays are combined by comparing elements and placing them in order. Using the recurrence relation T(n) = 2T(n/2) + Θ(n) and applying the Master Theorem, I learned that the overall time complexity of Merge Sort is Θ(n log n). This helped me connect the algorithm steps with formal time-complexity analysis.
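Here is a short merge sort sketch that mirrors the split-then-merge description above; the function names are my own, and the recursion directly reflects T(n) = 2T(n/2) + Θ(n).

```python
def merge_sort(a):
    """Sort a list by splitting it in half, sorting each half, then merging."""
    if len(a) <= 1:                      # a single element is already sorted
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))   # two subproblems of size n/2

def merge(left, right):
    """Combine two sorted lists by repeatedly taking the smaller front element (Θ(n) work)."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                 # at most one of these still has elements
    out.extend(right[j:])
    return out

print(merge_sort([8, 3, 2, 9, 7, 1, 5, 4]))   # -> [1, 2, 3, 4, 5, 7, 8, 9]
```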

Learning Merge Sort and Graph Traversals

This week’s module focused on algorithm design and analysis, with a strong emphasis on divide-and-conquer techniques, especially Merge Sort. From the lecture video captions, I learned how Merge Sort works by repeatedly dividing an array into two halves until each subarray contains only one element. Since a single element is already sorted, the algorithm then focuses on the merge step, where two sorted arrays are combined by comparing elements one at a time. This helped me better understand why Merge Sort is efficient and reliable. I also learned how to analyze Merge Sort using a recurrence relation and apply the Master Theorem to determine its time complexity. By identifying the values of a, b, and f(n), I was able to see why Merge Sort runs in Θ(n log n). In addition, the homework and quiz strengthened my understanding of DFS and BFS traversals, including how traversal order and marking rules affect the final result. Topics like the Traveling Salesman Problem, Knapsack Prob...
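Since a merge sort sketch already appears above, here is a small sketch of the DFS and BFS traversals mentioned in this post, using an adjacency-list dictionary and visiting neighbors in the order listed; these conventions are my assumptions and may differ from the quiz's marking rules.

```python
from collections import deque

def dfs(graph, start):
    """Depth-first traversal order using an explicit stack."""
    visited, order, stack = set(), [], [start]
    while stack:
        v = stack.pop()
        if v in visited:
            continue
        visited.add(v)
        order.append(v)
        # Push neighbors in reverse so the first-listed neighbor is explored first.
        stack.extend(reversed(graph.get(v, [])))
    return order

def bfs(graph, start):
    """Breadth-first traversal order using a queue."""
    visited, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph.get(v, []):
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

g = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
print(dfs(g, 'a'))   # -> ['a', 'b', 'd', 'c']
print(bfs(g, 'a'))   # -> ['a', 'b', 'c', 'd']
```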

CST 370 - From Loops to Logarithms

This week’s lessons strengthened my understanding of how algorithm efficiency is measured using Big-O, Big-Theta, and Big-Omega notations. The lecture notes emphasized identifying the basic operation and using the dominant term to classify an algorithm’s growth order. Through quizzes, I learned why Big-Theta can only be used when an algorithm has the same time complexity in all cases, while Big-O represents an upper bound. The homework project applied these ideas in practice by showing how sorting often dominates overall runtime, leading to O(n log n) complexity. Analyzing recursive algorithms using recurrence relations and backward substitution also helped clarify how time complexity evolves across recursive calls.
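As a worked example of backward substitution, here is the merge-sort-style recurrence from the other posts expanded step by step (an illustrative choice, assuming n is a power of 2, not necessarily the homework's recurrence):

```latex
\begin{aligned}
T(n) &= 2T(n/2) + n \\
     &= 4T(n/4) + 2n \\
     &= 8T(n/8) + 3n \\
     &\;\;\vdots \\
     &= 2^{k}\,T(n/2^{k}) + kn \\
     &= n\,T(1) + n\log_2 n \qquad (\text{when } 2^{k} = n,\ \text{i.e. } k = \log_2 n) \\
     &= \Theta(n \log n).
\end{aligned}
```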

Thinking Like an Algorithm in CST 370 This Week

This week, I learned about several important concepts in algorithms and data structures. From the recorded videos, I learned how pseudocode is used to describe algorithms in a clear and language-independent way. I also learned about different problem types, such as sorting, searching, graph problems, and tree data structures. Topics like graphs with vertices and edges, the Traveling Salesman Problem, and trees such as binary trees and binary search trees helped me understand how real-life problems can be modeled using data structures. The quizzes helped reinforce ideas like basic operations, time complexity, stable sorting, and graph properties. One aha moment for me was understanding how binary search trees make searching efficient. By comparing values and choosing to go left or right, the search space becomes smaller very quickly. Another important realization was how sorting can improve searching performance, which explains why these two topics are often discussed together. I al...
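To illustrate the "go left or right" idea, here is a minimal binary search tree lookup; the Node class and the sample tree are my own illustrative assumptions.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def bst_search(node, target):
    """Search a BST: each comparison discards one whole subtree, shrinking the search space."""
    while node is not None:
        if target == node.key:
            return True
        node = node.left if target < node.key else node.right
    return False

# A small BST:        8
#                   /   \
#                  3     10
#                 / \      \
#                1   6      14
root = Node(8, Node(3, Node(1), Node(6)), Node(10, None, Node(14)))
print(bst_search(root, 6))    # True
print(bst_search(root, 7))    # False
```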