# Algorithms Techniques - Part 1

A quick overview of common algorithm design techniques.

An algorithm is a sequence of well-defined steps to solve a problem or accomplish a specific task. When writing a program, you are very likely creating or reusing an algorithm. When going from home to school, you are also following an algorithm (get out of the house, walk to the bus stop, get on the bus, … , you get the idea).

When writing an algorithm, you usually start by understanding the problem you are trying to solve and then working out the steps needed to solve it. At this point, everything is abstract and there are no specifics about programming languages, for example. Let’s say that the problem you are trying to solve is “given a list of numbers, find the largest number.” The possible steps to solve it could be:

- Check the list; if it is empty, then there is no largest number.
- If the list is not empty, start at the first element and consider it to be the largest number so far.
- Iterate over the rest of the list, comparing each element with the largest number so far. If the current element is larger, then it is the new largest number.
- When there are no more elements in the list, the largest number found so far is the largest number in that list.
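As a sketch, the steps above could be written in Python like this (the function name `find_largest` is just an illustration, not part of any library):

```python
def find_largest(numbers):
    # Empty list: there is no largest number.
    if not numbers:
        return None
    # Start with the first element as the largest so far.
    largest = numbers[0]
    # Compare each remaining element with the largest so far.
    for current in numbers[1:]:
        if current > largest:
            largest = current
    return largest
```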

At some point, you will realize that some problems are very similar and that the steps to solve one problem are almost the same as the steps to solve another. These problem-solving techniques can be grouped and categorized, and are known as *Algorithm Design Techniques*. They are generic models for designing algorithms: an abstraction one level higher than the algorithm, which is itself already an abstraction.

Some of the most common techniques are:

**Brute Force**

Brute force is a simple and very general technique that enumerates all possible candidate solutions, testing each one until it finds one that solves the problem.
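For instance, a brute-force way to find two numbers in a list that add up to a target is to simply try every pair (this `pair_with_sum` function is a minimal sketch of the idea):

```python
def pair_with_sum(numbers, target):
    # Try every pair of indices; the first pair that
    # sums to the target is a solution.
    for i in range(len(numbers)):
        for j in range(i + 1, len(numbers)):
            if numbers[i] + numbers[j] == target:
                return numbers[i], numbers[j]
    # No candidate worked: there is no solution.
    return None
```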

**Recursion**

Recursion is a general technique where the algorithm calls itself on progressively smaller instances of the problem until it reaches a base case that can be solved directly.
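The classic illustration is the factorial function, where each call reduces the problem until the base case is reached:

```python
def factorial(n):
    # Base case: 0! is 1, solved directly.
    if n == 0:
        return 1
    # Recursive case: n! = n * (n - 1)!
    return n * factorial(n - 1)
```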

**Divide and Conquer**

Divide and conquer algorithms decompose a problem into smaller sub-problems, solve them recursively, and then merge the partial solutions into the final solution to the original problem.
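Merge sort is a textbook example: the list is split in half (divide), each half is sorted recursively, and the two sorted halves are merged back together. A minimal sketch:

```python
def merge_sort(numbers):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(numbers) <= 1:
        return numbers
    # Divide: split the list in half and sort each half recursively.
    mid = len(numbers) // 2
    left = merge_sort(numbers[:mid])
    right = merge_sort(numbers[mid:])
    # Merge: combine the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```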

**Dynamic Programming**

Dynamic programming is a technique where a problem is divided into smaller, overlapping sub-problems. The algorithm stores the solution to each sub-problem and, before solving the next one, checks whether a solution already exists and reuses it instead of calculating it again.
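A common example is computing Fibonacci numbers with memoization, where each sub-problem is solved only once and its result reused (a sketch, using a plain dictionary as the store):

```python
def fibonacci(n, memo=None):
    if memo is None:
        memo = {}
    # Reuse a stored solution if this sub-problem was solved before.
    if n in memo:
        return memo[n]
    # Base cases.
    if n < 2:
        return n
    # Solve the overlapping sub-problems and store the result.
    memo[n] = fibonacci(n - 1, memo) + fibonacci(n - 2, memo)
    return memo[n]
```

Without the memo, the naive recursion recomputes the same sub-problems exponentially many times; storing them brings the cost down to linear.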

**Greedy**

Greedy algorithms work by making the locally optimal choice at each step. Depending on the problem, the solution built this way may or may not be globally optimal.
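Making change is a common illustration: at each step, greedily take the largest coin that still fits. A sketch, assuming US coin denominations:

```python
def make_change(amount, coins=(25, 10, 5, 1)):
    # At each step, take the largest coin that does not
    # exceed the remaining amount (the locally optimal choice).
    result = []
    for coin in coins:
        while amount >= coin:
            result.append(coin)
            amount -= coin
    return result
```

With these denominations the greedy choice happens to be optimal, but with a coin set like `(25, 10, 1)` and an amount of 30 it would return six coins (25 + 1 + 1 + 1 + 1 + 1) when three dimes would do, illustrating that greedy solutions are not always optimal.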

**Backtracking**

Backtracking is a technique that builds a solution incrementally and abandons any partial solution as soon as it fails to satisfy the constraints of the problem, backing up to try a different choice.
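For example, finding a subset of positive numbers that adds up to a target: extend the partial solution one element at a time, prune any choice that overshoots, and undo (backtrack) when a branch leads nowhere. A minimal sketch:

```python
def subset_with_sum(numbers, target, chosen=None, start=0):
    # Assumes all numbers are positive.
    if chosen is None:
        chosen = []
    # A partial solution that hits the target exactly is complete.
    if target == 0:
        return list(chosen)
    for i in range(start, len(numbers)):
        n = numbers[i]
        # Prune: this choice would violate the constraint.
        if n > target:
            continue
        chosen.append(n)  # extend the partial solution
        result = subset_with_sum(numbers, target - n, chosen, i + 1)
        if result is not None:
            return result
        chosen.pop()  # backtrack: undo the choice, try the next one
    return None
```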

In the next parts of the series, I will go over these techniques and how they can be implemented to solve problems. You can use the links below (this page will be updated as the articles are published):

- Brute Force
- Recursion
- Divide and Conquer
- Dynamic Programming
- Greedy
- Backtracking

*Cover picture by Artem Labunsky via Unsplash*