
Algorithm Design Techniques
 Instructor: Faisal Anwer, DCS, AMU
 SOURCE:
Introduction to Algorithms (3rd Edition) by Thomas H. Cormen, Charles E.
Leiserson, Ronald L. Rivest, Clifford Stein + Freely Accessible Web Resources
ALGORITHM DESIGN TECHNIQUES

 Techniques that guide the construction of efficient solutions to problems.
 They provide templates suited to solving a broad range of diverse problems:
 Brute Force
 Divide-and-conquer
 Dynamic Programming
 Greedy Techniques
 Brute force
 Selection sort, Brute-force string matching, Convex hull problem
 Exhaustive search: Traveling salesman, Knapsack, and
Assignment problems
 Divide-and-conquer
 Mergesort, Quicksort, Binary Search, Strassen’s Matrix
Multiplication.
 Dynamic programming
 Warshall’s algorithm for transitive closure
 Floyd’s algorithm for all-pairs shortest paths
 Greedy techniques
 MST problem: Prim’s algorithm, Kruskal’s algorithm (Sets and set
operations)
 Dijkstra’s algorithm for the single-source shortest-path problem
 Huffman trees and codes
 More on algorithms.
Brute Force
 Brute force is a straightforward approach to solving a problem,
based directly on the problem’s statement and the definitions of the
concepts involved.
 It is considered one of the easiest approaches to apply and is
useful for solving small-size instances of a problem.
 Brute force is important due to its wide applicability and
simplicity.
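 A minimal sketch (my own, in Python) of the brute-force string matching
item listed earlier; the function and variable names are assumptions, not
from the slides:

    def brute_force_match(text, pattern):
        """Return the index of the first occurrence of pattern in text, or -1.

        Brute force: try every starting position and compare character by
        character, directly following the problem statement, with no
        preprocessing or cleverness.
        """
        n, m = len(text), len(pattern)
        for i in range(n - m + 1):          # every candidate starting position
            if text[i:i + m] == pattern:    # direct comparison
                return i
        return -1

    # brute_force_match("abracadabra", "cad") returns 4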
Divide-and-conquer
 Given an instance of the problem to be solved, split it into
several smaller sub-instances (of the same problem),
 independently solve each of the sub-instances,
 and then combine the sub-instance solutions to yield a
solution for the original instance.
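 As one concrete instance (binary search, from the list on the earlier
slide), here is a rough Python sketch of my own; here the split yields a
single half-size sub-instance and the combine step is trivial:

    def binary_search(arr, target, lo=0, hi=None):
        """Recursive binary search on a sorted list; returns an index or -1."""
        if hi is None:
            hi = len(arr)
        if lo >= hi:                  # empty sub-instance: target is absent
            return -1
        mid = (lo + hi) // 2          # split point
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            return binary_search(arr, target, mid + 1, hi)   # right half
        else:
            return binary_search(arr, target, lo, mid)       # left half

    # binary_search([1, 3, 5, 7, 9], 7) returns 3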
Dynamic programming
 Dynamic programming, like the divide-and-conquer method,
solves problems by combining the solutions to subproblems.
 A dynamic-programming algorithm solves each subproblem
just once and then saves its answer in a table, thereby avoiding
the work of recomputing the answer every time the subproblem
is encountered again.
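 As a toy illustration (Fibonacci is not on the slides, but it shows the
table idea in a few lines of Python; the names are my own):

    def fib(n, memo=None):
        """nth Fibonacci number via dynamic programming (memoization).

        Each subproblem fib(k) is solved once; its answer is stored in the
        memo table so later requests look it up instead of recomputing it.
        """
        if memo is None:
            memo = {}
        if n < 2:
            return n
        if n not in memo:
            memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
        return memo[n]

    # Without the table the recursion takes exponential time; with it, O(n).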
Greedy Algorithm
 A greedy algorithm is an algorithmic paradigm that follows the
problem-solving heuristic of making the locally optimal choice at
each stage with the hope of finding a global optimum.
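 A small sketch of the greedy idea (activity selection, a standard example
that is not on the slide’s list; the code and names are my own):

    def select_activities(activities):
        """Pick a maximum-size set of non-overlapping (start, finish) activities.

        Greedy choice: always take the compatible activity that finishes
        earliest; the locally optimal pick is globally optimal for this problem.
        """
        chosen = []
        last_finish = float('-inf')
        for start, finish in sorted(activities, key=lambda a: a[1]):
            if start >= last_finish:          # compatible with choices so far
                chosen.append((start, finish))
                last_finish = finish
        return chosen

    # select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9)])
    # returns [(1, 4), (5, 7)]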
DIVIDE-AND-CONQUER

 In this paradigm a problem is solved recursively, applying three
steps at each level of the recursion (a sketch follows the list):
 Divide: The problem is divided into a number of subproblems
that are smaller instances of the same problem.
 Conquer: The subproblems are conquered by solving them
recursively.
 Combine: The solutions to the subproblems are combined
into the solution for the original problem.
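 A Python sketch of mergesort (from the divide-and-conquer list earlier)
makes the three steps concrete; the helper names are my own:

    def merge_sort(a):
        """Sort a list with divide-and-conquer."""
        if len(a) <= 1:                   # base case: already sorted
            return a
        mid = len(a) // 2                 # Divide: split into two halves
        left = merge_sort(a[:mid])        # Conquer: sort each half recursively
        right = merge_sort(a[mid:])
        return merge(left, right)         # Combine: merge the sorted halves

    def merge(left, right):
        """Merge two sorted lists into one sorted list."""
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        out.extend(left[i:])
        out.extend(right[j:])
        return out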

ANALYSIS OF DIVIDE-AND-CONQUER
 The running time of a divide-and-conquer algorithm is
derived from the three steps of the basic paradigm.
 Let T(n) be the running time on a problem of size n. If the
problem size is small enough, say n ≤ c for some constant c, the
straightforward solution takes constant time, which we write
as Θ(1).
 Suppose that our division of the problem yields a
subproblems, each of which is 1/b the size of the original, so it
takes aT(n/b) time to solve the a subproblems.
 If we take D(n) time to divide the problem into subproblems
and C(n) time to combine the solutions to the subproblems
into the solution to the original problem, we get the total time:

T(n) = Θ(1)                        if n ≤ c
T(n) = aT(n/b) + D(n) + C(n)       otherwise
For merge sort on an n-element subarray:
• Divide: The divide step just computes the middle of the
subarray, which takes constant time. Thus, D(n) = Θ(1).
• Conquer: We recursively solve two subproblems, each of
size n/2, which contributes 2T(n/2) to the running time.
• Combine: The MERGE procedure on an n-element subarray
takes Θ(n) time, and so C(n) = Θ(n).
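Putting the three costs together (a standard step not spelled out on this
slide): T(n) = 2T(n/2) + Θ(1) + Θ(n) = 2T(n/2) + Θ(n), which solves to
T(n) = Θ(n lg n) for merge sort.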
