The best algorithm gives us an edge, but is there any way to improve even on the best algorithm available? These are some of the issues I will discuss here.

**What is an Algorithm?**

In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems. In other words, it is a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.

**Evolution of Algorithms**

The first algorithms were invented many years before the invention of computers as we know them today.

Algorithms were used in ancient Greece. Two examples are the sieve of Eratosthenes, which was described in Introduction to Arithmetic by Nicomachus, and the Euclidean algorithm, which was first described in Euclid’s Elements (c. 300 BC). The sieve of Eratosthenes is a simple, ancient algorithm for finding all prime numbers up to any given limit. It does so by iteratively marking as composite (i.e., not prime) the multiples of each prime, starting with the first prime number, 2. The Euclidean algorithm, or Euclid’s algorithm, is an efficient method for computing the greatest common divisor (GCD) of two numbers, the largest number that divides both of them without leaving a remainder.

One common application of algorithms is to sort given data optimally according to some criterion. Among the authors of early sorting algorithms around 1951 was Betty Holberton (née Snyder), who worked on ENIAC and UNIVAC. Bubble sort, also known as sinking sort, was analyzed as early as 1956.

Wikipedia provides a summary of the most popular sorting algorithms:

**Bubble sort :** Exchange two adjacent elements if they are out of order. Repeat until the array is sorted.

**Insertion sort :** Scan successive elements for an out-of-order item, then insert the item in the proper place.

**Selection sort :** Find the smallest element in the unsorted part of the array and swap it with the first unsorted position. Repeat until the array is sorted.

**Quick sort :** Partition the array into two segments. In the first segment, all elements are less than or equal to the pivot value. In the second segment, all elements are greater than or equal to the pivot value. Finally, sort the two segments recursively.

**Merge sort :** Divide the list of elements into two parts, sort the two parts individually, and then merge them.

**Heap sort :** Divides its input into a sorted and an unsorted region, and iteratively shrinks the unsorted region by extracting the largest element and moving it to the sorted region. It can be regarded as an improved selection sort.

Quicksort and heapsort are among the fastest comparison-based sorting algorithms. Which of the two is faster is debatable, but quicksort usually wins in practice. This has been attributed to its more sequential memory access pattern and better cache efficiency; some benchmarks have shown it to be roughly twice as fast as heapsort. Are there proprietary sorting algorithms?

**Are all Algorithms in the Public Domain?**

It is easy to fall into the trap of believing that all possible algorithms are in the public domain. There is a great deal of information out there, and many algorithms are published on the internet alongside their pseudocode. But not all algorithms are available publicly. In competing for market share, many companies try to gain an edge over their competitors by providing superior products. Clearly it shouldn’t be difficult for anybody to see how a better algorithm would win more customers. Going by this alone, it is safe to conclude that not all algorithms are in the public domain.

Data compression and decompression are important for many reasons. Many algorithms have been developed to solve the problem of compression and decompression. These algorithms vary in the time they take to compress data, in memory consumption, and in the efficiency with which they compress data — some produce smaller output than others. For some time, Apple kept its compression algorithm secret. In July 2016, Apple open-sourced its new lossless compression algorithm, LZFSE, making it publicly available. This is further evidence that companies keep their algorithms secret. Many are proprietary.


**Algorithms Reduce Overhead**

Overhead refers to performance loss in terms of speed, memory consumption, or both. Using the right algorithm can reduce the overhead in software.

Whenever we develop a product or try to solve a problem, we often have to write our own code to do it. Sometimes we solve the problem heuristically, but that satisfies us only for lack of a better alternative. Many spend sleepless nights trying to find the solution to a problem. However, many recurring problems have already been solved, not once but many times; in other words, algorithms are available. In many cases it would be difficult for anyone to beat the best algorithms out there. The best approach, therefore, is to search for and select the best available algorithm. With this in mind, I think we need something close to an algorithm wiki or journal.

**The Role of Assembly Language in Algorithm Implementation**

Regardless of the algorithm selected to solve a problem, the choice of programming language has an impact on performance. Low-level programming languages such as C and assembly language tend to be faster. We can often gain a significant edge by implementing an algorithm in assembly language. In a previous article on this blog, listed below, we provided an in-depth analysis of the performance improvement that can be gained simply by implementing an algorithm in assembly.

Assembly language allows us to optimize every line of code at the lowest level possible. We also have the ability to control hardware directly. These are luxuries unavailable in high-level programming languages. Compilers often do a good job optimizing for speed or for smaller memory consumption, yet experienced assembly language programmers can often find several lines of compiler output that can be optimized further.

With the above in mind, it does make sense to implement every algorithm in assembly. However, sometimes this is not possible. We may be writing software that will run on CPUs with different instruction sets; in that case our best bet may be the C programming language. This is not the only such case. On other occasions we may be designing software intended to run on every major operating system; Java will then be our best bet.

**Conclusion**

In conclusion, I would say that every attempt must be made to create an algorithm wiki or an online algorithm journal. Whenever possible, algorithms should be implemented in assembly. Lastly, we must always search for the best possible algorithm when developing software.

Boachsoft Lowrider 2016 is an excellent repair shop management software as well as an excellent work order management software. Boachsoft also makes an excellent landlord software (property management software).

Credit: Yaw Boakye-Yiadom , Boachsoft Founder and CEO. All rights reserved!

Yaw Boakye-Yiadom is pronounced [Yiaw Bwachi – Yiadom (chi as in chill)]
