Algorithms have been used since ancient times. The earliest recorded algorithms date back to around 1800 BCE and described step-by-step procedures for computing values such as square roots. Euclid formulated the algorithm for finding the greatest common divisor around 300 BCE. We have put together a list of seven algorithms that run the world today. Check out the list below and let us know what you think!
PageRank
PageRank is the algorithm that helped Google become what it is today. Sergey Brin and Larry Page developed it in the late 1990s to index and rank web pages on the Internet, and it went on to power the new Google search engine.
The essential aim of PageRank is to score how authoritative a page is. It does so by taking into account the authority scores of the pages that link to it: the more authoritative the pages linking to a page, the higher its own score. PageRank is now only one of the roughly 200 signals Google uses to rank web pages, but it remains an essential driving force.
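Here is a minimal sketch of that idea in Python, written as a simple power iteration over a tiny made-up link graph. The four-page "web", the damping factor, and the fixed iteration count are illustrative assumptions, not Google's actual implementation.

```python
# A minimal PageRank sketch (power iteration) on a tiny hypothetical link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # a page with no links shares its rank with everyone
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:             # otherwise it passes its rank to the pages it links to
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Hypothetical four-page web: more pages link to "home", so it tends to get the highest score.
web = {"home": ["about"], "about": ["home"], "blog": ["home", "about"], "shop": ["home"]}
print(pagerank(web))
```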
Key Exchange Encryption
How can you secure information that is being broadcast over a loudspeaker on a street corner, audible to everyone? That is roughly the challenge faced by network communication traffic: it can be intercepted and read by anyone along the way. This is where key exchange encryption comes in. Here is how it works:
- Both parties choose a number and don't share it with anyone (these are the private keys).
- One party announces another number over a public channel (the public key).
- Both parties apply their private number as an exponent to the public number and obtain a result.
- The two parties then swap their results.
- Each party then applies their private number as an exponent to the result they received.
- The value both parties end up with is the same, and it can be used to encrypt their communications.
Since neither private key is ever shared on the public channel, it is next to impossible for anyone listening in to work out the value being used to encrypt the communication. The trick is that (A^n)^m and (A^m)^n always give exactly the same answer, so both parties arrive at the same secret: the private keys are n and m, while the public key is A. This structure of public and private values is a basic building block of more advanced public-key cryptography such as RSA.
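Here is a minimal sketch of the exchange in Python. Real key exchanges of this kind (Diffie-Hellman) perform the exponentiation modulo a very large prime so the private exponents cannot be recovered; the tiny numbers below are purely illustrative.

```python
# Minimal sketch of the key exchange described above (the Diffie-Hellman idea).
# Real systems use a very large prime modulus and large random private keys.

p = 23            # public modulus (agreed in the open)
A = 5             # public base, the "A" in the text (agreed in the open)

n = 6             # party 1's private key (never shared)
m = 15            # party 2's private key (never shared)

# Each party raises the public base to their private exponent and shares the result.
result_1 = pow(A, n, p)        # sent to party 2 over the public channel
result_2 = pow(A, m, p)        # sent to party 1 over the public channel

# Each party raises the value they received to their own private exponent.
shared_1 = pow(result_2, n, p)   # (A^m)^n mod p
shared_2 = pow(result_1, m, p)   # (A^n)^m mod p

assert shared_1 == shared_2      # both arrive at the same secret
print("shared secret:", shared_1)
```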
Backpropagation
One of the most important algorithms developed in the last 50 years is backpropagation through a neural network. Without diving into the details of how a neural network operates: say you feed it an image of a dog; the network weighs options such as dog, cat, mouse, and human baby, assigns each a probability, and picks the option with the highest probability as its answer. Backpropagation is the propagation of the error back through the neural network, over the connections that led to the incorrect answer, making adjustments that lower the probability assigned to wrong answers. Over time, a neural network can learn what something is by learning what it is not! Without backpropagation, deep-learning neural networks would not exist.
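Here is a minimal sketch of backpropagation on a tiny two-layer network, using plain NumPy. The toy task (learning XOR), the layer sizes, and the learning rate are illustrative assumptions; real deep-learning frameworks do this bookkeeping for you automatically.

```python
# A minimal backpropagation sketch: a tiny two-layer network learning XOR.
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # four inputs
y = np.array([[0], [1], [1], [0]], dtype=float)               # the correct answers (XOR)

# Weights and biases for the input->hidden and hidden->output layers.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0                                                      # illustrative learning rate

for step in range(5001):
    # Forward pass: the network assigns a probability to each input.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: propagate the error back over the connections that produced it,
    # then nudge every weight to lower the probability assigned to wrong answers.
    output_delta = (output - y) * output * (1 - output)
    hidden_delta = (output_delta @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ output_delta
    b2 -= lr * output_delta.sum(axis=0)
    W1 -= lr * X.T @ hidden_delta
    b1 -= lr * hidden_delta.sum(axis=0)

    if step % 1000 == 0:   # watch the error shrink as training proceeds
        print("step", step, "mean squared error", round(float(np.mean((output - y) ** 2)), 4))
```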
Compression
Ah, the algorithm of compression. The basic idea of compression is to represent data with references and counts rather than the raw data itself, so that it takes up less space. Say you have a string of characters to compress: ABCCABCCABACABACABACDDDBDB, which is 26 characters long. Written as ABCC2ABAC3D2DB2, it is only 15 characters, where each number tells you how many times to repeat the group of letters that precedes it.
While this may not sound like much, we have reduced the memory required by the string by about 40%. For files that are gigabytes in size, a 40% reduction is a huge saving. It is compression that allows us to transmit and store information efficiently.
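To make the notation concrete, here is a tiny decoder in Python for the compressed form used above. The letters-followed-by-a-count format is this article's illustrative scheme, not a real compression standard such as ZIP or LZ77.

```python
import re

def decompress(encoded: str) -> str:
    """Expand a string where a trailing digit repeats the preceding group of letters."""
    out = []
    for group, count in re.findall(r"([A-Z]+)(\d*)", encoded):
        out.append(group * (int(count) if count else 1))
    return "".join(out)

original = "ABCCABCCABACABACABACDDDBDB"
compressed = "ABCC2ABAC3D2DB2"
assert decompress(compressed) == original
print(len(original), "->", len(compressed))   # 26 -> 15, roughly a 40% saving
```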
Searching and Sorting Algorithms
Searches and sorts are a special kind of algorithm because there are many different techniques for sorting a data set or searching for a particular value within one. Which one to use depends on the kind of data set you are dealing with and how it has been organized. For example, binary search is advisable for sorted, dictionary-like data, whereas a heap (the structure behind heapsort) is advisable if you are looking for the highest or the lowest value. These algorithms turn up in virtually every program that handles data, and programmers use them constantly.
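As a small taste, here is a minimal binary search in Python, plus a one-liner showing how a heap makes the lowest value cheap to find. The word list is, of course, just an example.

```python
import heapq

def binary_search(sorted_items, target):
    """Repeatedly halve the search range of a sorted list, like narrowing down a word in a dictionary."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid                 # found it: return its position
        if sorted_items[mid] < target:
            low = mid + 1              # the target can only be in the upper half
        else:
            high = mid - 1             # the target can only be in the lower half
    return -1                          # not present

words = ["apple", "banana", "cherry", "grape", "mango", "peach"]
print(binary_search(words, "mango"))        # 4
print(heapq.nsmallest(1, [42, 7, 19])[0])   # 7: a heap makes the lowest value cheap to find
```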
Dijkstra’s Shortest Path
This particular algorithm is a search algorithm for graphs, but it holds a special place because of what it finds: the shortest path between two nodes in a weighted graph. As Edsger Dijkstra told it, he was sitting at a café in the Netherlands with his fiancée in 1956 (he published the result in 1959) when he came up with an algorithm that could demonstrate the power of the computer he was working on to a non-computing audience. He plotted 64 cities on a graph, with each city represented by a node, and drew the paths between them, known as edges. He labelled one node Rotterdam and another Groningen, and devised an algorithm that found the shortest path between the two.
It is now one of the most widely used algorithms in the world, enabling GPS routing, signal routing, shipping a package across a country, and much more.
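Here is a minimal sketch of Dijkstra's shortest path in Python, using a priority queue. The toy road map of Dutch cities and its distances are made up purely for illustration.

```python
import heapq

def dijkstra(graph, source):
    """graph maps each node to a list of (neighbour, distance) pairs."""
    distances = {node: float("inf") for node in graph}
    distances[source] = 0
    queue = [(0, source)]
    while queue:
        dist, node = heapq.heappop(queue)
        if dist > distances[node]:
            continue                    # stale entry: a shorter path was already found
        for neighbour, weight in graph[node]:
            candidate = dist + weight
            if candidate < distances[neighbour]:
                distances[neighbour] = candidate
                heapq.heappush(queue, (candidate, neighbour))
    return distances

roads = {   # made-up distances between a few Dutch cities
    "Rotterdam": [("Utrecht", 60), ("Amsterdam", 80)],
    "Utrecht":   [("Rotterdam", 60), ("Zwolle", 90)],
    "Amsterdam": [("Rotterdam", 80), ("Zwolle", 110)],
    "Zwolle":    [("Utrecht", 90), ("Amsterdam", 110), ("Groningen", 100)],
    "Groningen": [("Zwolle", 100)],
}
print(dijkstra(roads, "Rotterdam")["Groningen"])   # 250, via Utrecht and Zwolle
```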
TCP/IP Routing Protocol Algorithms
This is how the Internet sees itself! When the transmission control protocol/Internet protocol (TCP/IP) standards were first developed, they were new and, despite being mathematically sound, were not designed with today's volumes of traffic in mind. Luckily, the Internet did not buckle and was able to expand into our lives. The initial decisions that shaped TCP/IP turned out to be crucial to the operation of the whole network once traffic skyrocketed. Among the most critical was the choice of algorithm used for routing data packets. Two algorithms are used for this purpose: the Distance-Vector Routing Protocol Algorithm (DVRPA) and the Link-State Routing Protocol Algorithm (LSRPA).
DVRPA finds the shortest path between the source and destination networks. It can use any number of metrics for the calculation, but typically relies on something simple, such as the number of router and server "hops" a packet must make along the way. LSRPA operates almost identically, but routers running it maintain a map of the portion of the Internet they can connect to, testing their various connections and analyzing them to determine a more realistic cost for each one in terms of computation, time, and so on.
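Here is a minimal sketch of the distance-vector idea in Python (essentially the Bellman-Ford relaxation that DVRPA builds on): each router knows only the cost to its direct neighbours plus whatever those neighbours advertise, and the routing tables converge after a few exchanges. The four-router topology and hop costs below are made up for illustration, not a real protocol implementation.

```python
# A tiny distance-vector routing sketch with made-up hop costs.
links = {   # cost (hops) between directly connected routers
    ("A", "B"): 1, ("B", "C"): 1, ("C", "D"): 1, ("A", "D"): 5,
}
routers = {"A", "B", "C", "D"}

def neighbours(r):
    """Yield (neighbour, cost) pairs for the routers directly connected to r."""
    for (x, y), cost in links.items():
        if r == x:
            yield y, cost
        elif r == y:
            yield x, cost

# Each router starts out knowing only the distance to itself.
tables = {r: {r: 0} for r in routers}

# Repeatedly let every router learn from its neighbours' tables until nothing improves.
for _ in range(len(routers)):
    for r in routers:
        for n, cost in neighbours(r):
            for dest, dist in tables[n].items():
                if cost + dist < tables[r].get(dest, float("inf")):
                    tables[r][dest] = cost + dist

print(tables["A"])   # A reaches D in 3 hops via B and C, rather than 5 via the direct link
```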