### While the problems you need to solve will always be different, the approaches to solving them can be divided into a few categories, the so-called "problem-solving paradigms".

Let's go into more detail about the key ones.

###### 1 _ complete search

#### The simplest problem-solving paradigm

Also known as "brute force", this very generic approach simply lists all the possible solutions, among which we then search for the correct one.

To apply a complete search to a problem, we need:

To generate every possible solution, using an iterative or recursive method.

To know how to check whether a solution is valid.

###### A complete search always gives you the correct answer, but when is it practical?

The brute force approach works well when the input is small; otherwise the number of possible solutions can be extremely high.
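As a minimal sketch of the paradigm in Python (subset sum is my own illustrative problem, not one from the text): generate every candidate solution, then check each one for validity.

```python
from itertools import combinations

def subset_sum(numbers, target):
    """Brute force: enumerate every subset of `numbers` (the candidate
    solutions) and check each one for validity (does it sum to `target`?)."""
    for size in range(len(numbers) + 1):
        for subset in combinations(numbers, size):
            if sum(subset) == target:
                return subset  # first valid solution found
    return None  # no subset works
```

With *n* numbers there are 2^*n* subsets to try, which is exactly why brute force is only viable on small inputs.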

###### 2 _ search algorithms

#### Two main approaches to look for a value in a list

The simplest way to search for something is to look at everything.

With a **linear search**, we examine each element in turn until we find the one we're looking for.

This approach works on both sorted and unsorted data, but it can be slow on large data sets.
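A minimal linear search sketch in Python (the function name and the return-`-1`-when-absent convention are my own choices):

```python
def linear_search(items, target):
    """Check every element in turn; works on sorted and unsorted data.
    Time grows linearly with the size of the list."""
    for index, value in enumerate(items):
        if value == target:
            return index  # position of the first match
    return -1  # target is not in the list
```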

If the numbers in the list are sorted, things get easier.

With the **binary search algorithm**, we take the middle element and compare it with the number we're searching for. By repeating this process we can eliminate half of the remaining numbers at each step, without even looking at them, until we've found our solution.

###### Which outcomes can a single binary search step produce?

The result can be one of the following:

If the middle element is the one we’re looking for, we have our solution.

If the middle element is greater than the searched element, the solution is in the first half of the list.

If the middle element is less than the searched element, the solution is in the second half of the list.

*NOTE: Search algorithms have many other applications besides looking for elements in an array. You can also use them to speed up a complete search.*
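Putting the three outcomes together, a possible iterative sketch in Python (assuming the list is sorted in ascending order):

```python
def binary_search(sorted_items, target):
    """Repeatedly compare the middle element with the target and
    discard the half of the list that cannot contain it."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid        # outcome 1: found it
        elif sorted_items[mid] > target:
            hi = mid - 1      # outcome 2: keep the first half
        else:
            lo = mid + 1      # outcome 3: keep the second half
    return -1  # target is not in the list
```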

###### 3 _ greedy algorithm

#### Make the best choices at each step

When facing a sequence of decisions, you make the locally best choice at each step (that's the "greedy" part) and hope that these choices add up to the best overall solution.

Unfortunately, the locally best choice isn't always the right one. A choice that looks best at the moment can have an even more negative impact later on.

So pay attention when you have to commit to decisions in advance.

###### exercise 1

###### Try writing an algorithm that solves the problem of giving change with the minimum number of coins

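One possible greedy sketch in Python (the function name and the default denominations are my own choices): at each step, take the largest coin that still fits in the remaining amount.

```python
def min_coins(amount, denominations=(50, 20, 10, 5, 2, 1)):
    """Greedy change-making: always pick the largest coin that fits.
    `denominations` must be sorted in decreasing order."""
    coins = []
    for coin in denominations:
        while amount >= coin:
            coins.append(coin)  # commit to this coin now...
            amount -= coin      # ...and never reconsider it
    return coins
```

Note that this greedy strategy is optimal for "canonical" coin systems (like euro coins), but not for every set of denominations: with coins `(4, 3, 1)` and amount 6 it gives three coins (4 + 1 + 1) where two (3 + 3) suffice.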

###### exercise 2

###### Try writing an algorithm that maximises the total profit of a given list of tasks, given their deadlines and values

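A possible greedy sketch in Python, under assumptions of my own: each task takes one unit of time, and `jobs` is a non-empty list of `(deadline, profit)` pairs. Consider tasks by decreasing profit and schedule each one in the latest free slot before its deadline.

```python
def max_profit(jobs):
    """Greedy job scheduling: jobs is a list of (deadline, profit),
    each job taking one unit of time."""
    jobs = sorted(jobs, key=lambda j: j[1], reverse=True)  # richest first
    max_deadline = max(deadline for deadline, _ in jobs)
    slot = [None] * (max_deadline + 1)  # slot[t] = job run at time t
    total = 0
    for deadline, profit in jobs:
        # place the job as late as possible, keeping earlier slots free
        for t in range(deadline, 0, -1):
            if slot[t] is None:
                slot[t] = (deadline, profit)
                total += profit
                break  # job scheduled; otherwise it is dropped
    return total
```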

###### 4 _ divide & conquer

#### When problems are getting complicated... divide them

In the "divide and conquer" approach, we try to simplify a problem by splitting it into smaller subproblems, to the point where solving them becomes trivial.

The approach follows 3 steps, usually implemented with recursion:

1. **Divide** the problem into smaller subproblems.
2. **Conquer** each subproblem by solving it, recursively if it's still too big.
3. **Combine** the partial solutions into the solution of the original problem.

But be careful: not every problem can be split this way, and a wrong combine step won't give the correct solution.

###### exercise 3

###### Consider an unordered array of numbers. How can you use a divide and conquer approach to sort the numbers?

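One classic answer is merge sort; a possible sketch in Python: divide the array in half, conquer each half recursively, then combine the two sorted halves by merging them.

```python
def merge_sort(a):
    """Divide and conquer sort: returns a new sorted list."""
    if len(a) <= 1:                 # trivial subproblem: already sorted
        return list(a)
    mid = len(a) // 2               # divide
    left = merge_sort(a[:mid])      # conquer the first half
    right = merge_sort(a[mid:])     # conquer the second half
    merged, i, j = [], 0, 0         # combine: merge the sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```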

###### exercise 4

###### Imagine you have two different sorted arrays of numbers. How can you merge them into a single sorted array?

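A possible two-pointer sketch in Python: repeatedly take the smaller of the two front elements. This is exactly the "combine" step of merge sort.

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    # one of the lists is exhausted; append the leftovers of the other
    return result + left[i:] + right[j:]
```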

###### 5 _ dynamic programming

#### When things are getting harder, solve the problem as a master

Of all the problem-solving paradigms, "dynamic programming" is probably the most challenging to master. The idea behind dynamic programming is the same as in the divide and conquer approach: solve the bigger problem by solving smaller problems. But while in divide and conquer the smaller problems are independent of each other, in dynamic programming overlapping smaller **problems are used to construct the bigger ones**.

The simplest way to approach problems with this technique is to assume you've already solved the smaller problems, and think about how to use their solutions to solve the bigger problem.

A technique often used with the dynamic programming approach is **memoization** (not to be confused with memorization): when you need the solutions of the same subproblems several times, you can store their values instead of re-evaluating them.

This simple idea can save you a lot of time.
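A classic illustration (Fibonacci numbers are my own example, not from the text above): the naive recursion recomputes the same subproblems an exponential number of times, while caching each value makes the whole computation linear.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Each fib(n) is built from the solutions of two smaller problems;
    lru_cache memoizes them so every value is computed only once."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```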

###### exercise 5

###### Consider the same problem of sorting an array of numbers, but now all the elements are sorted except the last one. How can you use the solution of the smaller problem to solve the bigger one?

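A possible sketch in Python: assume the smaller problem (sorting everything but the last element) is already solved, then shift the last element left until it lands in its place. Applying this idea for every prefix gives insertion sort.

```python
def insert_last(a):
    """a[:-1] is assumed sorted; move a[-1] to its correct position."""
    a = list(a)                       # work on a copy
    i = len(a) - 1
    while i > 0 and a[i - 1] > a[i]:  # while the last element is out of place
        a[i - 1], a[i] = a[i], a[i - 1]
        i -= 1
    return a
```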

###### exercise 6

###### Go back to the greedy approach problems and try to solve them now.

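For the change-making problem, a possible dynamic programming sketch in Python (names are my own choices): `best[x]` holds the fewest coins needed to make amount `x`, built from the solutions for smaller amounts. Unlike the greedy approach, this works for any set of denominations.

```python
def min_coins_dp(amount, denominations):
    """DP change-making: best[x] = fewest coins summing to x."""
    INF = float('inf')
    best = [0] + [INF] * amount       # best[0] = 0 coins for amount 0
    for x in range(1, amount + 1):
        for coin in denominations:
            # use the already-solved smaller problem best[x - coin]
            if coin <= x and best[x - coin] + 1 < best[x]:
                best[x] = best[x - coin] + 1
    return best[amount]
```

On the coin system `(4, 3, 1)` and amount 6 this correctly finds two coins (3 + 3), where the greedy approach would use three.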