Algorithms

Fibonacci Numbers, and some more of the Euclidean Algorithm and RSA.

Mike
We define the Fibonacci Sequence and develop a closed-form formula for its entries. We then use that formula to establish bounds on the growth of the sequence, which in turn gives a bound on the number of division operations the Euclidean Algorithm requires. Finally, we continue our discussion of the RSA algorithm and introduce the Golden Mean.
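The Fibonacci–Euclid connection can be made concrete in a few lines of Python: consecutive Fibonacci numbers are the classic worst case for the Euclidean Algorithm, so counting division steps on them illustrates the logarithmic bound. This is a sketch of mine, not code from the post.

```python
def euclid_steps(a, b):
    """Return gcd(a, b) and the number of division (mod) steps performed."""
    steps = 0
    while b:
        a, b = b, a % b
        steps += 1
    return a, steps

def fib(n):
    """n-th Fibonacci number, with F(1) = F(2) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Consecutive Fibonacci numbers are the worst case: gcd(F(n+1), F(n))
# takes n - 1 division steps, so the step count grows only logarithmically
# in the size of the inputs (with the Golden Mean as the base).
# euclid_steps(fib(8), fib(7)) -> (1, 6): six division steps for 21 and 13.
```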
Factorization, Ideals and More Euclid!

Mike
In this entry we begin by proving the correctness of the Euclidean Algorithm, then discuss factorization in rings, giving an example of an element that factors into irreducibles in two distinct ways. We define Unique Factorization Domains (UFDs), Principal Ideal Domains (PIDs) and Euclidean Domains (EDs), and prove that every ED is a PID and every PID is a UFD. We finish by explaining the RSA encryption algorithm.
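To make the RSA discussion concrete, here is a toy run with small primes. The numbers below are my own illustrative choices, not the post's; real RSA uses primes hundreds of digits long, and the three-argument `pow(e, -1, phi)` modular inverse needs Python 3.8+.

```python
# Toy RSA sketch (illustrative numbers only, not from the post).
p, q = 61, 53
n = p * q                 # 3233: the public modulus
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, chosen coprime to phi
d = pow(e, -1, phi)       # private exponent: e*d = 1 (mod phi)

m = 65                    # a "message", any integer with 0 <= m < n
c = pow(m, e, n)          # encrypt: c = m^e mod n
recovered = pow(c, d, n)  # decrypt: c^d mod n recovers m
```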
The Euclidean Algorithm, and More

Mike
We continue our investigation into rings and fields, and finish by explaining the Euclidean Algorithm. We also give a Python implementation which, for any two positive integers a and b, returns gcd(a,b) together with a pair of integers s and t such that as + bt = gcd(a,b).
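The post's own implementation isn't reproduced here, but an iterative extended Euclidean Algorithm with that signature might look like the following sketch:

```python
def extended_gcd(a, b):
    """Return (g, s, t) with g = gcd(a, b) and a*s + b*t == g."""
    s0, t0, s1, t1 = 1, 0, 0, 1
    while b:
        q = a // b
        a, b = b, a - q * b          # one division step of Euclid
        s0, s1 = s1, s0 - q * s1     # carry the coefficients of a along
        t0, t1 = t1, t0 - q * t1     # carry the coefficients of b along
    return a, s0, t0

# Example: gcd(240, 46) = 2 = 240*(-9) + 46*47.
```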
Parallel Sorting in Theory and in Practice III

Mike
As promised, this is the last of a three-part series of entries on sorting in parallel. Here we present a parallel implementation of merge_sort that runs in O(n log_2(n)) time and achieves fairly good parallelism.
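The post's implementation isn't shown here; as a rough sketch of the chunk-sort-then-merge idea it could look like the code below. I use threads for simplicity — note that CPython's GIL limits the speedup for pure-Python work, so a real implementation would more likely use processes.

```python
from concurrent.futures import ThreadPoolExecutor
from heapq import merge

def merge_sort(a):
    """Ordinary top-down merge sort returning a new sorted list."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return list(merge(merge_sort(a[:mid]), merge_sort(a[mid:])))

def parallel_merge_sort(a, workers=4):
    """Split into `workers` chunks, sort each concurrently, then merge."""
    if len(a) < workers:
        return merge_sort(a)
    size = -(-len(a) // workers)  # ceiling division
    chunks = [a[i:i + size] for i in range(0, len(a), size)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        sorted_chunks = list(ex.map(merge_sort, chunks))
    return list(merge(*sorted_chunks))
```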
Parallel Sorting in Theory and in Practice II

Mike
As promised, this is the second entry on parallel sorting. In this entry, we’ll implement merge_sort, and then give two different ways to make it run in parallel. The first one will be a bit simpler than the second one.
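A minimal sequential merge_sort of the kind the entry builds on might be sketched like this (my sketch, not the post's code):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of these two
    out.extend(right[j:])  # extends is non-empty
    return out

def merge_sort(a):
    """Split in half, sort each half recursively, merge the results."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

The two recursive calls are independent, which is exactly what makes merge_sort a natural candidate for the parallel versions discussed in the entry.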
Parallel Sorting in Theory and in Practice I

Mike
We’re going to begin our discussion of parallel algorithms by giving a parallel version of quick_sort_plus. We finish this entry by extending the “Almost the Master Theorem” to include cases where f(n) = c·n^alpha·log_2(n). In our next entry, we’ll introduce merge_sort, along with a couple of different parallel versions of it, and discuss both the theoretical and practical run times of these functions.
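quick_sort_plus is the post's own function and isn't reproduced here; a generic sketch of the parallelization idea — recursing on the two partitions concurrently down to a fixed depth — might look like this (threads for simplicity; the GIL caveat from the later entries applies):

```python
from concurrent.futures import ThreadPoolExecutor

def quick_sort(a):
    """Plain recursive quicksort on a copy (middle element as pivot)."""
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

def parallel_quick_sort(a, depth=2):
    """Run the two recursive calls concurrently for the top `depth` levels."""
    if depth == 0 or len(a) <= 1:
        return quick_sort(a)
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    with ThreadPoolExecutor(max_workers=2) as ex:
        lo = ex.submit(parallel_quick_sort, less, depth - 1)
        hi = ex.submit(parallel_quick_sort, greater, depth - 1)
        return lo.result() + equal + hi.result()
```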
Finding the Median and an Application to Sorting

Mike
In the previous few entries we’ve been discussing quick_sort and analyzing the run-time complexity of recursive algorithms. We’re going to apply what we’ve learned so far to finding the median of an array in O(n) time. Then we’ll see how that procedure can be plugged into quick_sort to guarantee that it finishes in O(n log_2(n)) time.
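The standard O(n) selection routine (median of medians) can be sketched as follows; this is my sketch of the classical algorithm, not necessarily the post's presentation:

```python
def select(a, k):
    """Return the k-th smallest element (0-indexed) of a in O(n) time."""
    if len(a) <= 5:
        return sorted(a)[k]
    # Take the median of each group of 5, then recursively take the
    # median of those medians as the pivot; this guarantees the pivot
    # splits the array well enough for a linear-time recurrence.
    groups = [a[i:i + 5] for i in range(0, len(a), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]
    pivot = select(medians, len(medians) // 2)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    if k < len(less):
        return select(less, k)
    if k < len(less) + len(equal):
        return pivot
    return select(greater, k - len(less) - len(equal))

def median(a):
    """Lower median of a non-empty list."""
    return select(a, (len(a) - 1) // 2)
```

Using `select` to pick the true median as quicksort's pivot is what removes the O(n^2) worst case.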
More on Sorting and Analyzing Run Time Complexity

Mike
In the previous entry (Sorting, Random Permutations, and little bit of Probability), we introduced quick_sort, gave a version of it in C++ and started to analyze how many steps it takes to sort a vector of floating-point numbers. In this entry, we continue that analysis and prove some results that will help us get a feel for other recursive algorithms.
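The post's version is in C++; a Python sketch of the same idea, instrumented to report the comparison count that drives the average-case analysis, might look like this (my sketch, not the post's code):

```python
import random

def quick_sort(v):
    """Return (sorted copy of v, approximate comparison count).

    We count len(v) comparisons per partitioning pass — one pivot
    comparison per element, up to a constant factor from the three
    list comprehensions below.
    """
    if len(v) <= 1:
        return list(v), 0
    pivot = random.choice(v)  # random pivot, as in the probabilistic analysis
    less = [x for x in v if x < pivot]
    equal = [x for x in v if x == pivot]
    greater = [x for x in v if x > pivot]
    sorted_less, c_less = quick_sort(less)
    sorted_greater, c_greater = quick_sort(greater)
    return sorted_less + equal + sorted_greater, c_less + c_greater + len(v)
```

Summing len(v) over all recursive calls is exactly the quantity whose expectation the analysis bounds by O(n log_2(n)).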