First, let's get familiar with the word greedy: it means wanting more and more of something.

What Are Greedy Algorithms?

A greedy algorithm is an approach (a technique or strategy) where, at each step, you choose the best option available at the moment without worrying about the future consequences of that choice. The goal is to reach a global optimum by making a sequence of locally optimal choices. This method is used to solve optimization problems. What is an optimization problem? A problem that asks for either a minimum or a maximum result.

It operates on the principle of “taking the best option now” without considering the long-term consequences; in other words, it makes the “best decision for the moment”.

In essence, the algorithm follows the motto: "Make the best choice at each step, and hope that it leads to the global optimum."

For a problem to be solvable using a greedy approach, it must exhibit two important properties:

  1. Greedy Choice Property: A globally optimal solution can be reached by making a locally optimal (greedy) choice at each step.
  2. Optimal Substructure: An optimal solution to the problem contains optimal solutions to its subproblems, so the solution can be built up from them.

If both properties are satisfied, then a greedy algorithm can guarantee an optimal solution.
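
For instance, here is a minimal sketch of the greedy-choice idea (the coin denominations are assumed purely for illustration): making change with coins {25, 10, 5, 1} by repeatedly taking the largest coin that still fits. This works only because this particular coin system satisfies the greedy choice property; for arbitrary denominations the same strategy can fail.

#include <iostream>
#include <vector>

using namespace std;

// Greedy coin change: repeatedly take the largest coin that still fits.
// Optimal for canonical coin systems such as {25, 10, 5, 1}, but NOT for
// arbitrary denominations (illustrative sketch only).
int greedyCoinCount(int amount, const vector<int> &coins /* sorted descending */) {
    int count = 0;
    for (int coin : coins) {
        count += amount / coin;  // take as many of this coin as possible
        amount %= coin;          // remaining amount still to be covered
    }
    return count;
}

int main() {
    vector<int> coins = {25, 10, 5, 1};
    cout << greedyCoinCount(63, coins) << endl;  // 63 = 25 + 25 + 10 + 1 + 1 + 1 -> 6 coins
    return 0;
}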

Example:

Suppose you want to travel from location A to location B.

For this problem, there can be more than one solution.

  • Solution 1: I can travel on foot.
  • Solution 2: I can take a bike.
  • Solution 3: I can take a car.
  • Solution 4: I can go by train.
  • Solution 5: I can go by flight.
  • and more.

However, suppose the problem has a constraint: you have to cover this journey within two hours. Now some of the solutions are no longer feasible.

  • Solution 1: I can travel on foot. ❌
  • Solution 2: I can take a bike. ❌
  • Solution 3: I can take a car. ❌
  • Solution 4: I can go by train. ✅ (feasible solution: it satisfies the constraint given in the problem)
  • Solution 5: I can go by flight. ✅ (feasible solution: it satisfies the constraint given in the problem)

Now, what if you also have to cover this journey at minimum cost (as little as possible)? This makes it a minimization problem (a problem that demands the result be as small as possible).

From the above feasible solutions, we now pick the one with the minimum cost.

  • Solution 4: I can go by train. ✅ (optimal solution: a feasible solution that also gives the minimum cost)
  • Solution 5: I can go by flight. ❌ (feasible, but not the minimum-cost solution)

For a given optimization problem, there is only one optimal value, even if more than one solution happens to achieve it.

This problem asks for a minimum result; other problems may ask for a maximum result.

Thus, if a problem requires either a minimum or a maximum result, it is known as an optimization problem.
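
To make this concrete, here is a minimal sketch (the travel times and costs are made-up, purely illustrative) that first filters out the options violating the two-hour constraint and then picks the cheapest feasible one.

#include <iostream>
#include <string>
#include <vector>

using namespace std;

struct Option {
    string mode;
    double hours;  // travel time (illustrative numbers)
    int cost;      // ticket/fuel cost (illustrative numbers)
};

int main() {
    // Hypothetical data for the A-to-B journey.
    vector<Option> options = {
        {"walk", 30.0, 0}, {"bike", 8.0, 50}, {"car", 5.0, 300},
        {"train", 2.0, 120}, {"flight", 1.0, 500}
    };

    const Option *best = nullptr;
    for (const auto &o : options) {
        if (o.hours > 2.0) continue;                 // violates the constraint: not feasible
        if (!best || o.cost < best->cost) best = &o; // keep the cheapest feasible option
    }

    if (best) cout << "Optimal choice: " << best->mode << " (cost " << best->cost << ")" << endl;
    return 0;
}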

Terms

  • Feasible Solution:
    • A feasible solution is one that satisfies all the constraints of the problem. In other words, it is a valid solution, even if it may not be the best one.
    • For example, in the knapsack problem, a feasible solution would be a set of items whose total weight does not exceed the knapsack's capacity.
  • Optimal Solution:
    • An optimal solution is the best possible solution to the problem. It is the one that maximizes or minimizes the objective function (depending on whether it's a maximization or minimization problem).
    • In the knapsack problem, the optimal solution is the one that provides the maximum total value while staying within the weight limit.
  • Optimization Problem:
    • An optimization problem is a type of problem where the goal is to find the best (optimal) solution from a set of feasible solutions. Typically, the problem involves:
    • Objective Function: This is the function that you want to maximize or minimize. It represents the measure of performance, cost, profit, or efficiency that you are optimizing. For example, in a transportation problem, the objective could be to minimize the total shipping cost. (See the sketch just after this list.)
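
As referenced above, here is a small sketch tying these terms to the knapsack example (the item values, weights, and capacity are assumed for illustration): a chosen set of items is feasible if its total weight fits within the capacity, and the objective function is the total value we want to maximize.

#include <iostream>
#include <vector>

using namespace std;

struct KnapsackItem { int value, weight; };

// Feasible: the chosen items respect the capacity constraint.
bool isFeasible(const vector<KnapsackItem> &chosen, int capacity) {
    int totalWeight = 0;
    for (const auto &it : chosen) totalWeight += it.weight;
    return totalWeight <= capacity;
}

// Objective function: total value of the chosen items (to be maximized).
int objective(const vector<KnapsackItem> &chosen) {
    int totalValue = 0;
    for (const auto &it : chosen) totalValue += it.value;
    return totalValue;
}

int main() {
    vector<KnapsackItem> chosen = { {60, 10}, {100, 20} };  // illustrative values/weights
    int capacity = 50;
    cout << boolalpha << isFeasible(chosen, capacity) << " "
         << objective(chosen) << endl;  // prints: true 160
    return 0;
}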

Approaches to Optimization Problems

Optimization problems can be solved using one of the following approaches:

1. Greedy Method

  • Approach: The greedy method builds a solution step-by-step by making a series of locally optimal (greedy) choices, with the hope that these local choices will lead to a global optimal solution.
  • When It's Used: The greedy method works well for optimization problems that exhibit the greedy choice property (choosing the locally optimal solution leads to a globally optimal solution) and optimal substructure (the global optimal solution can be composed of optimal solutions to subproblems).
  • Advantages:
    • Simple and easy to implement.
    • Can be efficient with respect to time and space complexity.
  • Disadvantages:
    • It doesn’t always guarantee an optimal solution unless the problem satisfies specific properties.
  • Examples:
    • Fractional Knapsack Problem: Greedy method works because picking the item with the highest value-to-weight ratio always leads to the optimal solution (a sketch follows this list).
    • Prim's and Kruskal's Algorithms for finding the Minimum Spanning Tree.
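
As referenced above, here is a minimal sketch of the fractional knapsack greedy (the item values, weights, and capacity are assumed for illustration): sort by value-to-weight ratio, take whole items while they fit, then take a fraction of the next one.

#include <algorithm>
#include <iostream>
#include <vector>

using namespace std;

struct FracItem { double value, weight; };

// Fractional knapsack: greedily take items by highest value-to-weight ratio.
double fractionalKnapsack(vector<FracItem> items, double capacity) {
    sort(items.begin(), items.end(), [](const FracItem &a, const FracItem &b) {
        return a.value / a.weight > b.value / b.weight;  // greedy criterion
    });

    double total = 0.0;
    for (const auto &it : items) {
        if (capacity <= 0) break;
        double take = min(it.weight, capacity);   // whole item, or the fraction that fits
        total += it.value * (take / it.weight);
        capacity -= take;
    }
    return total;
}

int main() {
    // Illustrative data: same numbers as the template's example input.
    vector<FracItem> items = { {60, 10}, {100, 20}, {120, 30} };
    cout << fractionalKnapsack(items, 50) << endl;  // prints 240
    return 0;
}

With capacity 50 and these items, the result is 240: the weight-10 and weight-20 items in full, plus two-thirds of the weight-30 item. Sorting dominates the running time, so the sketch runs in O(n log n).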

2. Dynamic Programming (DP)

  • Approach: Dynamic programming solves optimization problems by breaking them down into overlapping subproblems. It stores the solutions to subproblems and reuses these solutions to avoid redundant computations. This ensures optimality by systematically exploring all possible combinations of subproblems.
  • When It's Used: DP works well when the problem has optimal substructure (the optimal solution to the problem can be constructed from the optimal solutions to its subproblems) and overlapping subproblems (the same subproblems are solved multiple times during recursion).
  • Advantages:
    • Guarantees an optimal solution for problems with optimal substructure.
    • Avoids recomputation through memoization or tabulation, making it efficient in terms of time.
  • Disadvantages:
    • Can have high space complexity due to the need to store solutions to subproblems.
    • May be overkill for simple problems where greedy methods work.
  • Examples:
    • 0/1 Knapsack Problem: Unlike the fractional version, dynamic programming is used because the greedy method fails to give the optimal solution.
    • Fibonacci Sequence: DP stores previously calculated Fibonacci numbers to avoid redundant calculations (a sketch follows this list).
    • Shortest Path Algorithms (e.g., Bellman-Ford for graphs with negative weights).
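
To illustrate the memoization idea mentioned above, here is a minimal sketch (just the classic Fibonacci example, nothing problem-specific): each subproblem is computed once and cached, so the exponential recursion collapses to linear time.

#include <iostream>
#include <vector>

using namespace std;

// Top-down DP (memoization): each Fibonacci number is computed once and cached.
long long fib(int n, vector<long long> &memo) {
    if (n <= 1) return n;
    if (memo[n] != -1) return memo[n];          // reuse a previously solved subproblem
    return memo[n] = fib(n - 1, memo) + fib(n - 2, memo);
}

int main() {
    int n = 50;
    vector<long long> memo(n + 1, -1);
    cout << fib(n, memo) << endl;  // 12586269025, computed without redundant recursion
    return 0;
}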

3. Branch and Bound (B&B)

  • Approach: Branch and bound is an exact algorithm used to solve combinatorial and discrete optimization problems. It works by systematically exploring branches of a decision tree, where each branch represents a subset of solutions. It bounds the potential solutions in each branch to prune sections of the search space that cannot lead to an optimal solution.
  • When It's Used: This method is used for problems where an exhaustive search of all possible solutions is impractical, but you need to ensure optimality. Branch and bound is often used for integer programming problems, where the solution must be an integer.
  • Advantages:
    • Guarantees finding the optimal solution.
    • It can significantly reduce the search space through bounding, making the algorithm more efficient.
  • Disadvantages:
    • Computationally expensive for large problems.
    • The efficiency of the algorithm can depend on how well the bounds are calculated.
  • Examples:
    • Traveling Salesman Problem (TSP): Branch and bound is used to find the shortest route by systematically branching and bounding subproblems.
    • Integer Linear Programming (ILP): B&B is commonly used when the decision variables must take on integer values.
    • 0/1 Knapsack Problem: This method can be used to solve the problem exactly by branching on whether an item is included or not and bounding based on value-to-weight ratios (a sketch follows this list).
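
As referenced above, here is a compact sketch of branch and bound on the 0/1 knapsack (the item values, weights, and capacity are assumed for illustration): each node branches on including or excluding an item, and the fractional (greedy) relaxation provides an upper bound used for pruning.

#include <algorithm>
#include <iostream>
#include <vector>

using namespace std;

struct BBItem { double value, weight; };

// Upper bound for the current node: take remaining items greedily,
// allowing a fraction of the last one (relaxation of the 0/1 constraint).
double bound(const vector<BBItem> &items, int idx, double cap, double value) {
    for (int i = idx; i < (int)items.size() && cap > 0; ++i) {
        double take = min(items[i].weight, cap);
        value += items[i].value * (take / items[i].weight);
        cap -= take;
    }
    return value;
}

// Branch on including/excluding each item; prune when the bound
// cannot beat the best complete solution found so far.
void branchAndBound(const vector<BBItem> &items, int idx, double cap,
                    double value, double &best) {
    if (cap < 0) return;                                // infeasible branch
    best = max(best, value);                            // feasible prefix is a candidate
    if (idx == (int)items.size()) return;
    if (bound(items, idx, cap, value) <= best) return;  // prune this subtree

    // Branch 1: include item idx; Branch 2: exclude it.
    branchAndBound(items, idx + 1, cap - items[idx].weight,
                   value + items[idx].value, best);
    branchAndBound(items, idx + 1, cap, value, best);
}

int main() {
    // Illustrative data; items sorted by value-to-weight ratio for a tighter bound.
    vector<BBItem> items = { {60, 10}, {100, 20}, {120, 30} };
    sort(items.begin(), items.end(), [](const BBItem &a, const BBItem &b) {
        return a.value / a.weight > b.value / b.weight;
    });

    double best = 0;
    branchAndBound(items, 0, 50, 0, best);
    cout << best << endl;  // prints 220 (the items with values 100 and 120)
}

Note how the bound reuses the fractional-knapsack greedy: a relaxed version of the problem often gives a cheap, valid upper bound for pruning.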

Comparison of the Three Approaches:

  • Greedy Method:
    • Works well for problems with the greedy choice property.
    • Fast and easy to implement.
    • May not always give the optimal solution.
  • Dynamic Programming:
    • Guarantees optimality for problems with optimal substructure.
    • Handles overlapping subproblems effectively.
    • Requires extra space for storing solutions to subproblems.
  • Branch and Bound:
    • Guarantees optimality by systematically exploring and pruning the solution space.
    • Can handle combinatorial and discrete optimization problems, but may be computationally expensive for large problems.

Greedy Approach Template

#include <iostream>
#include <vector>
#include <algorithm>

using namespace std;

// Define a struct for the problem input (generic item)
struct Item {
    // Attributes specific to the problem
    int attribute1; // Example: value, cost, etc.
    int attribute2; // Example: weight, time, etc.

    // Constructor for the item
    Item(int attr1, int attr2) : attribute1(attr1), attribute2(attr2) {}
};

// Greedy criteria comparator (modify based on the problem)
bool greedyCriteria(const Item &a, const Item &b) {
    // Example: Replace this logic with the appropriate greedy criteria for the problem
    // Sort based on a custom criterion (e.g., value-to-weight ratio)
    return (double)a.attribute1 / a.attribute2 > (double)b.attribute1 / b.attribute2;
}

// Function to check if adding an item is feasible (optional, problem-dependent)
bool isFeasible(const Item &item, int currentState) {
    // Example: Feasibility check based on the current state and constraints
    // Modify this logic based on the problem's feasibility condition
    return true;  // Default: always feasible
}

// Greedy algorithm function (generic template)
void greedyAlgorithm(vector<Item> &items) {
    // Step 1: Sort items based on the greedy criteria
    sort(items.begin(), items.end(), greedyCriteria);

    // Step 2: Initialize the solution and current state
    vector<Item> solution; // Store the solution
    int currentState = 0;  // Modify this to track the state (e.g., current weight, capacity, time)

    // Step 3: Iterate over the sorted items
    for (const auto &item : items) {
        if (isFeasible(item, currentState)) {
            // Step 4: If the item is feasible, add it to the solution
            solution.push_back(item);

            // Update the current state based on the item
            currentState += item.attribute2;  // Example: update state based on the problem
        }
    }

    // Step 5: Output the solution or process further
    cout << "Solution contains " << solution.size() << " items." << endl;
    // Further process the solution as needed (e.g., print items, calculate total value, etc.)
}

int main() {
    // Example input: List of items (customize based on the problem)
    vector<Item> items = { {60, 10}, {100, 20}, {120, 30} };

    // Run the greedy algorithm
    greedyAlgorithm(items);

    return 0;
}

Explanation of the Template:

  1. Problem Input Representation:
    • Item struct represents an abstract problem input with two attributes (attribute1, attribute2), which you can modify based on the actual problem. For example, attribute1 could represent the value, and attribute2 could represent the weight for the knapsack problem.
  2. Greedy Criteria Comparator:
    • The function greedyCriteria defines how the items should be sorted. In most greedy algorithms, sorting or prioritization is the key step. The comparator function can be customized depending on the problem, e.g., sorting by value-to-weight ratio, start times, or costs.
  3. Feasibility Check:
    • The isFeasible function checks if the current item can be added to the solution without violating the problem's constraints. Not all greedy algorithms require this, but it's useful for problems where you need to track constraints (e.g., knapsack capacity, job scheduling deadlines).
  4. Main Greedy Algorithm:
    • The main greedyAlgorithm function:
      • Sorts the items based on the greedy criteria.
      • Iterates over the sorted items, checking feasibility.
      • If an item is feasible, it's added to the solution, and the state (e.g., total weight or time) is updated.
      • The solution is stored in a vector for further use (e.g., printing or calculating the total objective function value).

Example of Customization:

Here’s how you can adapt this template to the Activity Selection Problem (where you select the maximum number of non-overlapping activities):

#include <iostream>
#include <vector>
#include <algorithm>

using namespace std;

struct Activity {
    int start, end;
    Activity(int s, int e) : start(s), end(e) {}
};

bool greedyCriteria(const Activity &a, const Activity &b) {
    return a.end < b.end;  // Sort by finish time
}

bool isFeasible(const Activity &activity, int currentTime) {
    return activity.start >= currentTime;  // Select the activity only if it starts at or after the finish time of the last selected activity
}

void greedyActivitySelection(vector<Activity> &activities) {
    sort(activities.begin(), activities.end(), greedyCriteria);

    vector<Activity> solution;
    int currentTime = 0;

    for (const auto &activity : activities) {
        if (isFeasible(activity, currentTime)) {
            solution.push_back(activity);
            currentTime = activity.end;
        }
    }

    cout << "Selected " << solution.size() << " activities." << endl;
}

int main() {
    vector<Activity> activities = { {1, 3}, {2, 5}, {3, 4}, {5, 7}, {6, 8} };
    greedyActivitySelection(activities);

    return 0;
}