Analysis and Design of Algorithms
Deepak John
Department Of Computer Applications , SJCET-Pala
Analysis of searching and sorting. Insertion sort, Quick sort, Merge sort and Heap sort. Binomial Heaps and Fibonacci Heaps. Lower bounds for sorting by comparison of keys. Comparison of sorting algorithms. Amortized Time Analysis. Red-Black Trees – Insertion and Deletion.
Approach
Step I:
Choose the criteria (for natural numbers, the criteria can be ascending or descending order).
Step II:
Decide how to put the data in order using the criterion selected.
Analysis
• Final ordering of data can be obtained in a variety of ways.
• Some are meaningful and efficient.
• Meaningful and efficient ways depend on many aspects of an application: type of data, randomness of data, run time constraints, size of the data, nature of criteria, etc.
• To make comparisons, certain properties of sorting algorithms should be defined.
• Properties which are used to compare algorithms without depending on the type and speed of the machines are:
  – number of comparisons
  – number of data movements
  – use of auxiliary storage
Sorting
Sorting is the process of arranging a group of items into a defined order based on particular criteria.
There are many different types of sorting algorithms, but the primary ones are:
1. Insertion sort
2. Quick sort
3. Merge sort
4. Heap sort
Insertion sort
• Insertion sort is an efficient algorithm for sorting a small number of elements. The proper position to insert the current element is found by comparing the current element with the elements in the sorted sub-array.
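The CLRS-style pseudocode that the line numbers in the analysis below refer to did not survive the slide extraction. As a stand-in, here is a minimal Python sketch of the same procedure (the function name is my own, not from the slides):

    def insertion_sort(a):
        """Sort the list a in place in ascending order."""
        for j in range(1, len(a)):              # a[0..j-1] is already sorted
            key = a[j]                          # current element to insert
            i = j - 1
            while i >= 0 and a[i] > key:        # shift larger elements one slot right
                a[i + 1] = a[i]
                i -= 1
            a[i + 1] = key                      # drop key into its proper position
        return a

    # insertion_sort([5, 2, 4, 6, 1, 3]) -> [1, 2, 3, 4, 5, 6]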
Analyzing the Algorithm
The outer loop (lines 1–8 of the pseudocode) runs exactly n − 1 times (with n = length(A)).
Best case
• The best case for insertion sort is when the input array is already sorted, in which case the while loop never executes (but its condition must be checked once).
• tj = 1, and lines 6 and 7 will be executed 0 times.
• T(n) = c1·n + c2(n − 1) + c4(n − 1) + c5(n − 1) + c8(n − 1)
       = (c1 + c2 + c4 + c5 + c8)n − (c2 + c4 + c5 + c8)
       = an + b
       = Θ(n)
Worst case
• The worst case for insertion sort is when the input array is in reverse sorted order, in which case the while loop executes the maximum number of times.
• The inner loop is executed exactly j − 1 times for every iteration of the outer loop.
  T(n) = an² + bn − c
(Consider only the leading term of the formula, since lower-order terms are insignificant for large n. Ignore the leading term's constant coefficient, since constant factors are less significant than the rate of growth in determining computational efficiency for large inputs.)
  = Θ(n²)
Average case
• When random data is sorted, insertion sort is usually closer to the worst case.
• On average, tj = j/2. T(n) will still be in the order of n², the same as the worst case.
Order of Growth
The order of a running-time function is the fastest growing term, discarding constant factors.
Best case: an + b → Θ(n)
Worst case: an² + bn − c → Θ(n²)
Average case: Θ(n²)
• Advantage
The advantage of insertion sort is that it is relatively simple and easy to implement.
• Disadvantage
The disadvantage of insertion sort is that it is not efficient for a large list or input size.
Quick-Sort
• Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
  – Divide: pick a random element x (called the pivot) and partition S into
    • L, the elements less than x
    • E, the elements equal to x
    • G, the elements greater than x
  – Recur: sort L and G
  – Conquer: join L, E and G
Choice of Pivot
Three ways to choose the pivot:
• Median-of-three – from the leftmost, middle, and rightmost elements of the list to be sorted, select the one with the median key as the pivot
• Pivot is the rightmost element in the list that is to be sorted
  – When sorting A[6:20], use A[20] as the pivot
• Randomly select one of the elements to be sorted as the pivot
  – When sorting A[6:20], generate a random number r in the range [6, 20]
  – Use A[r] as the pivot
Algorithm
Given an array of n elements (e.g., integers):
• If the array only contains one element, return
• Else
  – Pick one element to use as the pivot.
  – Partition the elements into two sub-arrays:
    • Elements less than or equal to the pivot
    • Elements greater than the pivot
  – Quick sort the two sub-arrays
  – Return the results
The Pseudo-Code
Partitioning
• The key to the algorithm is the PARTITION procedure, which rearranges the sub-array in place.
• Given a pivot, partition the elements of the array such that the resulting array consists of:
  1. One sub-array that contains elements >= pivot
  2. Another sub-array that contains elements < pivot
• The sub-arrays are stored in the original data array.
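A minimal Python sketch of this scheme, assuming the rightmost element is used as the pivot and the <= / > split from the algorithm outline above (a Lomuto-style partition; the function names are my own):

    def partition(a, lo, hi):
        """Rearrange a[lo..hi] around the pivot a[hi]; return the pivot's final index."""
        pivot = a[hi]
        i = lo - 1                           # end of the "<= pivot" region
        for j in range(lo, hi):
            if a[j] <= pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[i + 1], a[hi] = a[hi], a[i + 1]    # place the pivot between the two regions
        return i + 1

    def quick_sort(a, lo=0, hi=None):
        """Sort a[lo..hi] in place by recursive partitioning."""
        if hi is None:
            hi = len(a) - 1
        if lo < hi:                          # sub-arrays of size 0 or 1 are already sorted
            p = partition(a, lo, hi)
            quick_sort(a, lo, p - 1)
            quick_sort(a, p + 1, hi)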
Analysis
The running time of quick sort depends on whether the partitioning is balanced or not. Assume that keys are random and uniformly distributed.
Best case
Recursion:
1. Partition splits the array into two sub-arrays of size n/2
2. Quicksort each sub-array
The depth of the recursion is log2 n.
At each level of the recursion, the work done in all the partitions at that level is O(n).
O(log2 n) * O(n) = O(n log2 n)
Best case running time: O(n log2 n)
Worst case
• Data is sorted already
  – Recursion:
    1. Partition splits the array into two sub-arrays:
       • one sub-array of size 0
       • the other sub-array of size n-1
    2. Quick sort each sub-array
Recurring on the length n-1 part requires recurring to depth n-1.
• The recursion is O(n) levels deep (for an array of size n).
• The partitioning work done at each level is O(n).
• O(n) * O(n) = O(n²)
Worst case running time: O(n²)
Average case
• If the pivot element is randomly chosen, we expect the split of the input array to be reasonably well balanced on average.
• Assuming random input, the average-case running time is much closer to Θ(n lg n) than Θ(n²).
• T(n) = O(n lg n)
Improved Pivot Selection
Pick the median value of three elements from the data array: data[0], data[n/2], and data[n-1]. Use this median value as the pivot.
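A small Python sketch of this selection rule (the helper name is my own); the chosen index would typically be swapped into the pivot position before calling the partition routine above:

    def median_of_three(a, lo, hi):
        """Return the index of the median of a[lo], a[mid] and a[hi]."""
        mid = (lo + hi) // 2
        candidates = [(a[lo], lo), (a[mid], mid), (a[hi], hi)]
        candidates.sort()                    # sort the three (value, index) pairs
        return candidates[1][1]              # index of the median value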
Merge sort
Merge-sort on an input sequence S with n elements consists of three steps:
Divide: partition S into two sequences S1 and S2 of about n/2 elements each
Recur: recursively sort S1 and S2
Conquer: merge S1 and S2 into a unique sorted sequence
Example:  A L G O R I T H M S
  divide: A L G O R  |  I T H M S
  sort:   A G L O R  |  H I M S T
  merge:  A G H I L M O R S T
Algorithm
MERGE-SORT(A, p, r)
1. IF p < r                      // Check for base case
2.   THEN q = ⌊(p + r)/2⌋        // Divide step
3.     MERGE-SORT(A, p, q)       // Conquer step
4.     MERGE-SORT(A, q + 1, r)   // Conquer step
5.     MERGE(A, p, q, r)         // Combine step

MERGE(A, p, q, r)
1. Compute n1 and n2
2. Copy the first n1 elements into L[1 . . n1 + 1] and the next n2 elements into R[1 . . n2 + 1]
3. L[n1 + 1] ← ∞ ; R[n2 + 1] ← ∞
4. i ← 1; j ← 1
5. for k ← p to r
6.   do if L[i] ≤ R[j]
7.     then A[k] ← L[i]
8.          i ← i + 1
9.     else A[k] ← R[j]
10.         j ← j + 1
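A runnable Python counterpart of this pseudocode, using float('inf') as the sentinel (so it assumes numeric keys; indices are 0-based and inclusive; the function names are my own):

    def merge(a, p, q, r):
        """Merge the sorted runs a[p..q] and a[q+1..r] in place."""
        left = a[p:q + 1] + [float('inf')]       # sentinel marks the end of each run
        right = a[q + 1:r + 1] + [float('inf')]
        i = j = 0
        for k in range(p, r + 1):
            if left[i] <= right[j]:
                a[k] = left[i]
                i += 1
            else:
                a[k] = right[j]
                j += 1

    def merge_sort(a, p=0, r=None):
        """Sort a[p..r]: divide (find q), conquer (recurse), combine (merge)."""
        if r is None:
            r = len(a) - 1
        if p < r:
            q = (p + r) // 2
            merge_sort(a, p, q)
            merge_sort(a, q + 1, r)
            merge(a, p, q, r)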
Analysis
• For simplicity assume that n is a power of 2 so that each divide step• For simplicity, assume that n is a power of 2 so that each divide step
yields two subproblems, both of size exactly n/2.
• The base case occurs when n = 1.When n > 1, time for merge sort
steps:
Divide: Just compute q as the average of p and r, which takes constant
time i e Θ(1)time i.e. Θ(1).
Conquer: Recursively solve 2 sub problems, each of size n/2, which is
2T(n/2).
Combine: MERGE on an n-element sub array takes Θ(n) time.
• Summed together they give a function ,the recurrence for merge sort
i i irunning time is
T(n) = Θ(1) if n=1
= 2T(n/2)+ Θ (n)+Θ(1) If n>1= 2T(n/2)+ Θ (n)+Θ(1). If n>1
T(n)=Θ(n lg2n)
Analysis of MergeSort
― O(n log n) best-, average-, and worst-case complexity, because the merging is always linear
― Extra O(n) temporary array for merging data
― Extra copying to the temporary array and back
― Particularly useful for external sorting
Heaps
Definition of heap:
1. A balanced, left-justified binary tree in which no node has a value greater than the value in its parent.
Example: min heap, where each child Y, Z of a node X satisfies Y >= X and Z >= X.
Heap
• The binary heap data structure is an array that can be viewed as a complete binary tree. Each node of the binary tree corresponds to an element of the array. The array is completely filled on all levels except possibly the lowest.
[Figure: a binary heap stored in array A; a max-heap example and a min-heap example with their array representations]
Heap Property
Heap Procedures
• Algorithm
1. Add the new element to the next available position at the lowest level
2. Restore the max-heap property if violated
• General strategy is percolate up (or bubble up): if the parent of the element is smaller than the element, then interchange the parent and child.
OR
Restore the min-heap property if violated
• General strategy is percolate up (or bubble up): if the parent of the element is larger than the element, then interchange the parent and child.
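A minimal Python sketch of the max-heap version of this insert, using an array-backed heap with 0-based indexing (names are my own, not from the slides):

    def heap_insert(heap, key):
        """Append key at the next available position and percolate it up (max-heap)."""
        heap.append(key)
        i = len(heap) - 1
        while i > 0:
            parent = (i - 1) // 2                     # 0-based parent index
            if heap[parent] < heap[i]:                # parent smaller than child: swap
                heap[parent], heap[i] = heap[i], heap[parent]
                i = parent
            else:
                break                                 # heap property restored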
[Figure: Insert 17 — the new key 17 is added at the next available position and swapped with its parent 16, giving root 19 with children 12 and 17]
Percolate up to maintain the heap property.
• Delete max
  – Copy the last number to the root (overwrite the maximum element stored there).
  – Restore the max-heap property by percolating down.
• Delete min
  – Copy the last number to the root (overwrite the minimum element stored there).
  – Restore the min-heap property by percolating down.
Maintaining the Heap Property
• Suppose a node is smaller than a child
  – Left and Right subtrees of i are max-heaps
• To eliminate the violation:
  – Exchange with the larger child
  – Move down the tree
  – Continue until the node is not smaller than its children
Maintaining the Heap Property
• Assumptions:
  – Left and Right subtrees of i are max-heaps
  – A[i] may be smaller than its children

Alg: MAX-HEAPIFY(A, i, n)
1. l ← LEFT(i)
2. r ← RIGHT(i)
3. if l ≤ n and A[l] > A[i]
4.   then largest ← l
5.   else largest ← i
6. if r ≤ n and A[r] > A[largest]
7.   then largest ← r
8. if largest ≠ i
9.   then exchange A[i] ↔ A[largest]
10.       MAX-HEAPIFY(A, largest, n)
Example: MAX-HEAPIFY(A, 2, 10)
A[2] violates the heap property → exchange A[2] with A[4]; now A[4] violates the heap property → exchange A[4] with A[9]; heap property restored.
T(n) = O(lg n)
• Best case occurs when no swap is performed, T(n) = O(1)
• Worst case occurs when the element is swapped all the way down to a leaf
BUILD-MAX-HEAP
Produces a max-heap from an unordered input array.
• O(n) calls to MAX-HEAPIFY
• Each call takes O(lg n) time, so O(n lg n) is an upper bound on the total time (a tighter analysis, used below, gives O(n)).
Heap sort
The heapsort algorithm consists of two phases:
- build a heap from an arbitrary array
- use the heap to sort the data
• To sort the elements in decreasing order, use a min-heap
• To sort the elements in increasing order, use a max-heap
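A compact Python sketch of both phases (0-based indexing, so the children of i are 2i+1 and 2i+2 rather than the 1-based 2i and 2i+1 of the pseudocode; function names are my own):

    def max_heapify(a, i, n):
        """Sift a[i] down so the subtree rooted at i is a max-heap within a[0..n-1]."""
        while True:
            l, r = 2 * i + 1, 2 * i + 2
            largest = i
            if l < n and a[l] > a[largest]:
                largest = l
            if r < n and a[r] > a[largest]:
                largest = r
            if largest == i:
                return
            a[i], a[largest] = a[largest], a[i]
            i = largest

    def heap_sort(a):
        n = len(a)
        for i in range(n // 2 - 1, -1, -1):   # phase 1: build a max-heap
            max_heapify(a, i, n)
        for end in range(n - 1, 0, -1):       # phase 2: move the max to the back
            a[0], a[end] = a[end], a[0]
            max_heapify(a, 0, end)            # re-heapify the shrunken prefix
        return a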
Example: Heap Sort
Let us look at this example: we must convert an unordered array with n = 10 elements into a max-heap.
We start with position 10/2 = 5.
– We compare 3 with its child and swap them.
– We compare 17 with its two children and swap it with the maximum child (70).
– We compare 28 with its two children, 63 and 34, and swap it with the largest child.
– We compare 52 with its children and swap it with the largest.
– Recursing, no further swaps are needed.
– Finally, we swap the root with its largest child, and recurse, swapping 46 again with 81, and then again with 70.
We have now converted the unsorted array into a max-heap.
Suppose we pop the maximum element of this heap. This leaves a gap at the back of the array. This is the last entry in the array, so why not fill it with the largest element?
Repeat this process: pop the maximum element, and then insert it at the end of the array:
– Pop and append 70
– Pop and append 63
We have the 4 largest elements in order.
– Pop and append 52
– Pop and append 46
Continuing...
– Pop and append 34
– Pop and append 28
Finally, we can pop 17, insert it into the 2nd location, and the resulting array is sorted.
Analysis
• The call to BuildHeap() takes O(n) time
• Each of the n - 1 calls to Heapify() takes O(lg n) time
• Thus the total time taken by HeapSort()
  = O(n) + (n - 1) O(lg n)
  = O(n) + O(n lg n)
  = O(n lg n)
There are no separate best-case and worst-case scenarios for heap sort.
Binomial trees
• A binomial tree is an ordered tree defined recursively: B0 is a single node, and Bk is formed by linking two Bk-1 trees so that the root of one becomes the leftmost child of the root of the other.
• Binomial tree properties: Bk has 2^k nodes, height k, and its root has degree k (its children are roots of Bk-1, Bk-2, ..., B1, B0).
[Figure: examples B0, B1, B2 and the recursive construction of Bk]
Binomial heaps
A binomial heap is a linked list of binomial trees with the following properties:
1. The binomial trees are linked in increasing order of size.
2. There is at most one binomial tree of each size.
3. Each binomial tree has the heap structure: the value in each node is ≤ the values in its children.
[Figure: an example binomial heap accessed through head[H]]
Binomial Heap Implementation
• Each node has the following fields:
  p: parent
  child: leftmost child
  sibling
  degree
  key
• Roots of the trees are connected using a linked list.
• Each node x also contains the field degree[x], which is the number of children of x.
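A minimal Python sketch of this node layout (the class name is my own scaffolding):

    class BinomialNode:
        """A binomial heap node with the fields listed above."""
        def __init__(self, key):
            self.key = key
            self.degree = 0        # number of children
            self.p = None          # parent
            self.child = None      # leftmost child
            self.sibling = None    # next sibling, or next root in the root list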
Binomial Heap Implementation
[Figure: (a)–(c) pointer representation of a binomial heap — each node stores key, degree, p, child and sibling (NIL where absent); the roots are connected through their sibling fields starting at head[H]]
Binomial Heap Operations
1. Make-Heap()
2. Insert(H, x), where x is a node
3. Minimum(H)
4. Extract-Min(H)
5. Union(H1, H2): merge H1 and H2, creating a new heap
6. Decrease-Key(H, x, k): decrease x.key (x is a node in H) to k. (It is assumed that k ≤ x.key.)
Make-Heap():
• Make an empty binomial heap. Creating all of the pointers can be done in O(1) time.
• The operation simply creates a new pointer and sets it to NIL.
Binomial-Heap-Create()
1 head[H] <- NIL
2 return head[H]
Minimum(H):
• To do this we find the smallest key among those stored at the roots connected to the head of H.
• The minimum must be in some root in the top list.
• If there are n nodes in the heap, there are at most ⌊lg n⌋ + 1 roots at the top, at most one each of degree 0, 1, 2, . . . , ⌊lg n⌋, so this can be found in O(lg n) time.
Binomial-Heap-Minimum(H)
1 y <- NIL
2 x <- head[H]
3 min <- ∞
4 while x is not NIL
5   do if key[x] < min then
6        min <- key[x]
7        y <- x
8      x <- sibling[x]
9 return y
Find Minimum Key Example
[Figure: (a)–(d) scanning the root list of H with x to find the root with the minimum key]
Binomial-Link(y, z)
Link binomial trees with the same degree. Note that z, the second argument, becomes the parent, and y becomes the child.
Link(y, z)
  p[y] := z;
  sibling[y] := child[z];
  child[z] := y;
  degree[z] := degree[z] + 1
[Figure: linking two Bk-1 trees rooted at y and z — y becomes the leftmost child of z, producing a Bk]
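The same linking step in Python, assuming nodes with the p/child/sibling/degree fields sketched above:

    def binomial_link(y, z):
        """Make root y the leftmost child of root z (both roots have equal degree)."""
        y.p = z
        y.sibling = z.child
        z.child = y
        z.degree += 1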
Union(H1, H2)
• Union is the most sophisticated of the binomial heap operations.
• It is used in many other operations.
• The running time will be O(log n).
Union traverses the new root list using three pointers: prev-x, x and next-x.
Starting with the following two binomial heaps:
[Figure: two binomial heaps to be united]
Merge the root lists, but now we have two trees of the same degree.
Combine trees of the same degree using a binomial link, making the smaller key the root of the combined tree.
Cases
Case 1: occurs when degree[x] ≠ degree[next-x], that is, when x is the root of a Bk-tree and next-x is the root of a Bl-tree for some l > k.
Case 2: occurs when x is the first of three roots of equal degree, that is, when degree[x] = degree[next-x] = degree[sibling[next-x]].
Cases 3 and 4: occur when x is the first of two roots of equal degree, that is, when degree[x] = degree[next-x] ≠ degree[sibling[next-x]].
  Case 3: key[x] ≤ key[next-x] — next-x is linked under x.
  Case 4: key[x] > key[next-x] — x is linked under next-x.
[Figure: the four cases, showing prev-x, x, next-x and sibling[next-x] before and after each step]
Union(H1, H2)
  H := new heap;
  head[H] := merge(H1, H2);  /* simple merge of root lists */
  if head[H] = NIL then return H fi;
  prev-x := NIL;
  x := head[H];
  next-x := sibling[x];
  while next-x ≠ NIL do
    if (degree[x] ≠ degree[next-x]) or
       (sibling[next-x] ≠ NIL and degree[sibling[next-x]] = degree[x]) then
      prev-x := x;                                         /* Cases 1, 2 */
      x := next-x
    else
      if key[x] ≤ key[next-x] then
        sibling[x] := sibling[next-x];                     /* Case 3 */
        Link(next-x, x)
      else
        if prev-x = NIL then head[H] := next-x
                        else sibling[prev-x] := next-x fi; /* Case 4 */
        Link(x, next-x);
        x := next-x
      fi
    fi;
    next-x := sibling[x]
  od;
  return H
Union Example
[Figure: step-by-step union of two binomial heaps — the root lists are merged, then the merged list is traversed with prev-x, x and next-x, applying Case 3, Case 2, Case 4, Case 3 and Case 1 in turn until next-x = NIL and the traversal terminates]
Note: Union is O(lg n).
Insert
Insert(H, x)
  H' := Make-B-H();
  p[x] := NIL;
  child[x] := NIL;
  sibling[x] := NIL;
  degree[x] := 0;
  head[H'] := x;
  H := Union(H, H')
Extract Node With Minimum Key
This operation is started by finding and removing the node x with minimum key from the binomial heap H. Create a new binomial heap H' and set its root list to the list of x's children in reverse order. Unite H and H' to get the resulting binomial heap.
Pseudocode
Binomial-Heap-Extract-Min(H)
1 find the root x with the minimum key in the root list of H,
  and remove x from the root list of H
2 H' <- Make-Binomial-Heap()
3 reverse the order of the linked list of x's children, and set
  head[H'] to point to the head of the resulting list
4 H <- Binomial-Heap-Union(H, H')
5 return x
Run time: O(log n)
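Step 3 above reverses x's child list to form the root list of H'. A small Python sketch of that step, assuming the sibling-linked node layout described earlier:

    def reversed_children(x):
        """Return the head of x's child list, reversed, with parent pointers cleared."""
        head = None
        child = x.child
        while child is not None:
            nxt = child.sibling
            child.sibling = head        # push this child onto the front of the new list
            child.p = None              # former children become roots of H'
            head = child
            child = nxt
        return head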
Extract Minimum Key Example
[Figure: the root with minimum key is removed from H, its child list is reversed to form H', and H and H' are then united]
Decreasing a key
The current key is replaced with a new key. To maintain the min-heap property, it is then compared to the key of the parent. If its parent's key is greater, then the key and data will be exchanged. This process continues until the new key is greater than the parent's key or the new key is in the root.
Pseudocode:
Binomial-Heap-Decrease-Key(H, x, k)
1 if k > key[x]
2   then error "new key is greater than current key"
3 key[x] <- k
4 y <- x
5 z <- p[y]
6 while z not NIL and key[y] < key[z]
7   do exchange key[y] <-> key[z]
8      if y and z have satellite fields, exchange them, too
9      y <- z
10     z <- p[y]
Decreasing a key
Execution time: This procedure takes O(log n) since the maximum depth of x is log n.
[Figure: example — a key is decreased and exchanged with its ancestors until the min-heap property is restored]
Delete a Node
With the assumption that no node in H has a key of -∞:
The key of the node to be deleted is first decreased to -∞.
This node is then deleted using the extract-min procedure.
Pseudocode:
Binomial-Heap-Delete(H, x)
1 Binomial-Heap-Decrease-Key(H, x, -∞)
2 Binomial-Heap-Extract-Min(H)
Run time: O(log n), since the run times of both Binomial-Heap-Decrease-Key and Binomial-Heap-Extract-Min are of order O(log n).
Delete a Node Example
[Figure: (a)–(g) the key of the node to be deleted is decreased to -∞, floats to the root, and the root is then removed by extract-min]
Fibonacci heap
• A Fibonacci heap is a set of min-heap ordered trees.
• Each node x has a pointer p[x] to its parent and child[x] to one of its children.
• Trees are represented using left-child, right-sibling pointers and circular, doubly linked lists.
• Children are linked together in a doubly-linked circular list.
• The entire heap is accessed by a pointer min[H] which points to the minimum-key root.
[Figure: (a), (b) a Fibonacci heap with min[H] pointing to the minimum-key root; roots and sibling lists are circular, doubly linked]
Fibonacci heap
• Potential function:
  Φ(H) = t(H) + 2·m(H)
  where t(H) is the number of trees in the root list of H and m(H) is the number of marked nodes in H.
Example: trees(H) = 5 and marks(H) = 3 give Φ(H) = 5 + 2·3 = 11.
[Figure: a min-heap H with 5 root-list trees and 3 marked nodes]
Fibonacci Heap Operations
  Create
  Insert
  Find-Min
  Union
  Delete
  Delete-Min
• Make-Fib-Heap(H):
  Allocate and return the Fibonacci heap object H with n[H] = 0 and min[H] = nil; t(H) = 0 and m(H) = 0, so Φ(H) = 0.
  The cost of Make-Fib-Heap is O(1).
Fibonacci Heaps: Insert
Insert:
  Create a new singleton tree.
  Add it to the left of the min pointer.
  Update the min pointer.
[Figure: insert 21 — the singleton node 21 is spliced into the root list of heap H next to min]
Insert Analysis
Actual cost: O(1)
Change in potential: +1
Amortized cost: O(1)

Fib-Heap-Insert(H, x)
{ degree[x] ← 0
  p[x] ← NIL
  child[x] ← NIL
  left[x] ← x ; right[x] ← x
  mark[x] ← FALSE
  concatenate the root list containing x with the root list of H
  if min[H] = NIL or key[x] < key[min[H]]
    then min[H] ← x
  n[H] ← n[H] + 1
}
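A minimal Python sketch of the root-list splice this pseudocode performs. The field names (left, right, degree, mark, p, child) and heap attributes (min, n) follow the slides; the class itself and the function name are my own scaffolding:

    class FibNode:
        def __init__(self, key):
            self.key = key
            self.degree = 0
            self.mark = False
            self.p = self.child = None
            self.left = self.right = self   # circular doubly linked list of one node

    def fib_heap_insert(H, x):
        """Splice singleton node x into H's circular root list and update min."""
        if H.min is None:
            H.min = x                       # x becomes the only root
        else:
            x.right = H.min                 # insert x just to the left of min
            x.left = H.min.left
            H.min.left.right = x
            H.min.left = x
            if x.key < H.min.key:
                H.min = x
        H.n += 1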
Fibonacci Heaps: Union
Union:
  Concatenate two Fibonacci heaps.
  Root lists are circular, doubly linked lists.
[Figure: the root lists of Heap H' and Heap H'' are spliced into one root list for Heap H; min points to the smaller of the two minima]
Actual cost: O(1)
Change in potential: 0
Amortized cost: O(1)

Fib-Heap-Union(H1, H2)
{ H ← Make-Fib-Heap()
  min[H] ← min[H1]
  concatenate the root list of H2 with the root list of H
  if (min[H1] = NIL) or (min[H2] ≠ NIL and min[H2] < min[H1])
    then min[H] ← min[H2]
  n[H] ← n[H1] + n[H2]
  free the objects H1 and H2
  return H
}
Extract min()
Fib-Heap-Extract-Min(H)
{ z ← min[H]
  if z ≠ NIL
    then { for each child x of z
             do { add x to the root list of H
                  p[x] ← NIL }
           remove z from the root list of H
           if z = right[z]
             then min[H] ← NIL
             else min[H] ← right[z]
                  Consolidate(H)
           n[H] ← n[H] – 1
         }
  return z
}
Fib-Heap-Link(H, y, x)
{ remove y from the root list of H;
  make y a child of x;
  degree[x] ← degree[x] + 1;
  mark[y] ← FALSE;
}

Consolidate(H)
{ for i ← 0 to D(n[H]) do A[i] ← NIL
  for each node w in the root list of H
    do { x ← w ; d ← degree[x]
         while A[d] ≠ NIL
           do { y ← A[d]
                if key[x] > key[y] then exchange x ↔ y
                Fib-Heap-Link(H, y, x)
                A[d] ← NIL ; d ← d + 1 }
         A[d] ← x }
  min[H] ← NIL
  for i ← 0 to D(n[H]) do
    if A[i] ≠ NIL then { add A[i] to the root list of H ;
                         if min[H] = NIL or key[A[i]] < key[min[H]]
                           then min[H] ← A[i] }
}
Fibonacci Heaps: Delete Min
• Delete min:
  – Delete min; meld its children into the root list; update min.
  – Consolidate trees so that no two roots have the same rank.
[Figure: step-by-step consolidation after deleting the minimum — the current pointer scans the root list against a rank array (ranks 0–3), linking 23 into 17, 17 into 7, 24 into 7 and 41 into 18, and stopping when every root has a distinct rank]
Fibonacci Heaps: Decrease Key
Decrease the key of element x to k.
Case 0: min-heap property not violated.
• decrease the key of x to k
• change the heap min pointer if necessary
[Figure: Decrease 46 to 45 — the min-heap property still holds, so no structural change is needed]
Case 1: parent of x is unmarked.
• decrease the key of x to k
• cut off the link between x and its parent
• mark the parent
• add the tree rooted at x to the root list, updating the heap min pointer
[Figure: Decrease 45 to 15 — the node is cut from its parent 24, 24 is marked, and the tree rooted at 15 joins the root list]
Case 2: parent of x is marked.
• decrease the key of x to k
• cut off the link between x and its parent p[x], and add x to the root list
• cut off the link between p[x] and p[p[x]], and add p[x] to the root list
  – If p[p[x]] is unmarked, then mark it.
  – If p[p[x]] is marked, cut off p[p[x]], unmark it, and repeat.
[Figure: Decrease 35 to 5 — 5 is cut and added to the root list; its parent 26 is marked, so 26 is also cut, and the cascading cut continues with the marked node 24, which is cut as well]
Amortized Analysis techniques
• In amortized analysis we average the time required for a sequence of operations over all the operations performed.
• Amortized analysis guarantees an average worst case for each operation.
  – No involvement of probability
• The amortized cost per operation is therefore T(n)/n.
Three techniques:
  - The aggregate method
  - The accounting method
  - The potential method
Aggregate analysis
  – The total amount of time needed for the n operations is computed and divided by n.
  – Treat all operations equally.
  – Compute the worst-case running time of a sequence of n operations.
  – Divide by n to get an amortized running time.
  – We aggregate the cost of a series of n operations to T(n); then each operation has the same amortized cost of T(n)/n.
The Accounting method
• Principles of the accounting method
  1. Associate credit accounts with different parts of the structure
  2. Associate amortized costs with operations and show how they credit or debit accounts
• Different costs may be assigned to different operations. Operations are assigned an amortized cost; objects of the data structure are assigned a credit.
Accounting Method vs. Aggregate Method
• Aggregate method:
  – first analyze the entire sequence
  – then calculate the amortized cost per operation
• Accounting method:
  – first assign an amortized cost per operation
  – check that they are valid (never go into the red)
  – then compute the cost of the entire sequence of operations
The Potential method
• Similar to the accounting method
• Amortized costs are assigned in a more complicated way
  – based on a potential function
  – and the current state of the data structure
• Must ensure that the sum of the amortized costs of all operations in the sequence is at least the sum of the actual costs of all operations in the sequence.
• Define a potential function Φ which maps any state of the data structure to a real number
• Notation:
  – D0 – initial state of the data structure
  – Di – state of the data structure after the i-th operation
  – ci – actual cost of the i-th operation
  – mi – amortized cost of the i-th operation
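To make the notation concrete, the standard potential-method relation between these quantities (stated here in LaTeX for reference; this is the usual definition, not taken from the slides) is:

    m_i = c_i + \Phi(D_i) - \Phi(D_{i-1}),
    \qquad
    \sum_{i=1}^{n} m_i \;=\; \sum_{i=1}^{n} c_i + \Phi(D_n) - \Phi(D_0)
    \;\ge\; \sum_{i=1}^{n} c_i
    \quad \text{whenever } \Phi(D_n) \ge \Phi(D_0).

So the total amortized cost upper-bounds the total actual cost as long as the potential never drops below its initial value.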
Red-Black Trees
A red-black tree can also be defined as a binary search tree that satisfies the following properties:
1. A node is either red or black.
2. The root is ALWAYS black.
3. All leaves are black.
4. Both children of a node that is red are black (no red node can have a red child).
5. Every path from a given node down to any descendant leaf contains the same number of black nodes. The number of black nodes on such a path (not including the initial node but including leaves) is called the black-height, bh, of the node.
The red-black tree has O(lg n) height.
Red-Black Tree
■ Root Property: the root is black
■ External Property: every leaf is black
■ Internal Property: the children of a red node are black
■ Depth Property: all the leaves have the same black depth
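As a quick illustration of these properties, here is a small Python sketch of a checker (the function name and the node fields color/left/right are my own assumptions, not from the slides); the root-is-black property would be checked separately by the caller:

    def check_rb(node, parent_is_red=False):
        """Return the black-height of the subtree rooted at node,
        raising ValueError if a red-black property is violated.
        A None child is treated as a black nil leaf."""
        if node is None:
            return 1                                  # nil leaves count as black
        is_red = (node.color == 'RED')
        if is_red and parent_is_red:
            raise ValueError("property 4 violated: red node has a red parent")
        left_bh = check_rb(node.left, is_red)
        right_bh = check_rb(node.right, is_red)
        if left_bh != right_bh:
            raise ValueError("property 5 violated: unequal black-heights")
        return left_bh + (0 if is_red else 1)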
Rotations
• Rotations are the basic tree-restructuring operation for almost all balanced search trees.
• A rotation takes a red-black tree and a node,
• changes pointers to change the local structure, and won't violate the binary-search-tree property.
• Left rotation and right rotation are inverses.
[Figure: Left-Rotate(T, x) turns x with right child y into y with left child x; Right-Rotate(T, y) is the inverse. An example of LEFT-ROTATE(T, x).]
Left and Right Rotation
Left-Rotate(T, x)
1. y ← right[x]           // Set y.
2. right[x] ← left[y]     // Turn y's left subtree into x's right subtree.
3. if left[y] ≠ nil[T]
4.   then p[left[y]] ← x
5. p[y] ← p[x]            // Link x's parent to y.
6. if p[x] = nil[T]
7.   then root[T] ← y
8.   else if x = left[p[x]]
9.     then left[p[x]] ← y
10.    else right[p[x]] ← y
11. left[y] ← x           // Put x on y's left.
12. p[x] ← y
• The code for RIGHT-ROTATE is symmetric.
• Both LEFT-ROTATE and RIGHT-ROTATE run in O(1) time.
Right rotation:
1. x = left[y];
2. left[y] = right[x];
3. if (right[x] != nil)
4.   then p[right[x]] = y;
5. p[x] = p[y];
6. if (p[y] == nil)
7.   then root = x;
8. else if (left[p[y]] = y)
9.   then left[p[y]] = x;
10.  else right[p[y]] = x;
11. right[x] = y;
12. p[y] = x;
Rotation
• The pseudo-code for Left-Rotate assumes that
  – right[x] ≠ nil[T], and
  – the root's parent is nil[T].
• Left rotation on x makes x the left child of y, and the left subtree of y becomes the right subtree of x.
• Pseudocode for Right-Rotate is symmetric: exchange left and right everywhere.
• Time: O(1) for both Left-Rotate and Right-Rotate, since a constant number of pointers are modified.
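The same operation in Python, following the pseudocode above. It assumes node objects with left/right/p fields and a tree object with a root field, with None playing the role of nil[T]; right_rotate is the mirror image (swap left and right):

    def left_rotate(T, x):
        """Left-rotate around x: x's right child y takes x's place. O(1)."""
        y = x.right
        x.right = y.left            # turn y's left subtree into x's right subtree
        if y.left is not None:
            y.left.p = x
        y.p = x.p                   # link x's parent to y
        if x.p is None:
            T.root = y
        elif x is x.p.left:
            x.p.left = y
        else:
            x.p.right = y
        y.left = x                  # put x on y's left
        x.p = y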
Operations on RB Trees
• All operations can be performed in O(lg n) time.
• Insertion and Deletion are not straightforward.
When Inserting a Node
Remember:
1. Insert nodes one at a time, and after every insertion balance the tree.
2. Every node inserted starts as a red node.
3. Consult the cases for rebalancing the tree.
• Basic steps:
1. Use Tree-Insert from BST (slightly modified) to insert a node x into T.
   - Procedure RB-Insert(x).
   - Color the node x red.
2. Fix the modified tree by re-coloring nodes and performing rotations to preserve the RB tree property.
   - Procedure RB-Insert-Fixup.
Red-Black fixup
• y = z's "uncle"
• Three cases:
  – y is red
  – y is black and z is a right child
  – y is black and z is a left child
Case 1 – z's uncle y is red
[Figure: recolor p[z] and y black and p[p[z]] red, then continue with z = p[p[z]]]
z is a right child here. Similar steps if z is a left child.
• y.Color = black
• z.Parent.Color = black
• z.Parent.Parent.Color = red
• z = z.Parent.Parent
• Repeat fixup
[Example: the new node 4 is inserted (red) below 5; its uncle is red, so Case 1 recolors the parent and uncle black and the grandparent red, moves z up to the grandparent, and repeats the fixup at the next level]
Case 2 – y is black, z is a right child
[Figure: left-rotate at z's parent so that z becomes a left child]
• z = z.Parent
• Left-Rotate(T, z)
• Do Case 3
• Note that Case 2 is a subset of Case 3
[Example: z's uncle 14 is black and z = 7 is a right child; after setting z to its parent 2 and left-rotating, 7 becomes the parent of 2 and the fixup continues with Case 3]
Case 3 – y is black, z is a left child
[Figure: recolor and right-rotate at z's grandparent]
• z.Parent.Color = black
• z.Parent.Parent.Color = red
• Right-Rotate(T, z.Parent.Parent)
[Example: z's parent 7 is colored black and grandparent 11 red, then a right rotation at 11 makes 7 the root with children 2 and 11; all red-black properties now hold]
RB-Insert(T, z)
1. y ← nil[T]
2. x ← root[T]
3. while x ≠ nil[T]
4.   do y ← x
5.      if key[z] < key[x]
6.        then x ← left[x]
7.        else x ← right[x]
8. p[z] ← y
9. if y = nil[T]
10.  then root[T] ← z
11.  else if key[z] < key[y]
12.    then left[y] ← z
13.    else right[y] ← z
14. left[z] ← nil[T]
15. right[z] ← nil[T]
16. color[z] ← RED
17. RB-Insert-Fixup(T, z)
RB-Insert-Fixup(T, z)
1. while color[p[z]] = RED
2.   do if p[z] = left[p[p[z]]]
3.     then y ← right[p[p[z]]]
4.       if color[y] = RED
5.         then color[p[z]] ← BLACK           // Case 1
6.           color[y] ← BLACK                 // Case 1
7.           color[p[p[z]]] ← RED             // Case 1
8.           z ← p[p[z]]                      // Case 1
9.         else if z = right[p[z]]            // color[y] ≠ RED
10.          then z ← p[z]                    // Case 2
11.            LEFT-ROTATE(T, z)              // Case 2
12.          color[p[z]] ← BLACK              // Case 3
13.          color[p[p[z]]] ← RED             // Case 3
14.          RIGHT-ROTATE(T, p[p[z]])         // Case 3
15.    else (if p[z] = right[p[p[z]]]) (same as lines 3–14
16.          with "right" and "left" exchanged)
17. color[root[T]] ← BLACK
Correctness
Loop invariant:
• At the start of each iteration of the while loop,
  – z is red.
  – If p[z] is the root, then p[z] is black.
  – There is at most one red-black violation:
    • Property 2: z is a red root, or
    • Property 4: z and p[z] are both red.
• Termination: The loop terminates only if p[z] is black. Hence, property 4 is OK. The last line ensures property 2 always holds.
• Maintenance: We drop out when z is the root (since then p[z] is the sentinel nil[T], which is black). When we start the loop body, the only violation is of property 4.
  – There are 6 cases, 3 of which are symmetric to the other 3. We consider the cases in which p[z] is a left child.
  – Let y be z's uncle (p[z]'s sibling).
Algorithm Analysis
• O(lg n) time to get through RB-Insert up to the call of RB-Insert-Fixup.
• Within RB-Insert-Fixup:
  – Each iteration takes O(1) time.
  – Each iteration but the last moves z up 2 levels.
  – O(lg n) levels ⇒ O(lg n) time.
  – Thus, insertion in a red-black tree takes O(lg n) time.
  – Note: there are at most 2 rotations overall.
Deletion
• Find
• Swap
  – Moves the entry to a node with one external node (left)
• Remove the entry
• Reattach the right child
Deletion
Deletion from a red-black tree is similar to deletion from a binary search tree, with a few exceptions:
• Always set the parent of a deleted node to be the parent of one of the deleted node's children.
• The red-black fix-up method is called if the removed node is black.
After a deletion of a red node (no violations occur):
• No black-heights have been affected.
• No red nodes have been made adjacent (parent and child both red).
• The deleted node is not the root, since the root is black.
• After deletion of a black node, a restore function must be called to fix red-black properties that might be violated. There are 3 possible initial violations:
  – If the deleted node was the root, a red child might now be the root, a violation of property 2.
  – If both the parent of the removed node and a child of the removed node are red, we have a violation of property 4.
  – The removal of a black node can cause the black-height of one path to be shorter (by 1), violating property 5.
  – We correct the problem of rule 5 by adding an extra "black" to the node passed into the fix-up procedure. This leads to a violation of rule 1, since this node is now neither red nor black.
Delete Possibilities
1: Delete a red node
• No problem
2: Delete a black node with a red child
• Color the red child black
3: Delete a black node with a black child
• Color the child "Double Black"
• 3 possibilities depending on neighboring nodes:
  – x's sibling is black with at least one red child
  – x's sibling is black with no red children
  – x's sibling is red
Deletion – Fixup
• Idea: Move the extra black up the tree until x points to a red & black node ⇒ turn it into a black node,
• x points to the root ⇒ just remove the extra black, or
• we can do certain rotations and recolorings and finish.
• Within the while loop:
  – x always points to a nonroot doubly black node.
  – w is x's sibling.
  – w cannot be nil[T], since that would violate property 5 at p[x].
Case 1 – w is red
[Figure: make w black and p[x] red, then left-rotate on p[x]]
• w must have black children.
• Make w black and p[x] red.
• Then left rotate on p[x].
• The new sibling of x was a child of w before the rotation ⇒ it must be black.
• Go immediately to case 2, 3, or 4.
Case 2 – w is black, both w's children are black
[Figure: remove one black from x and from w and push it up to p[x]]
• Take 1 black off x (⇒ singly black) and off w (⇒ red).
• Move that black to p[x].
• Do the next iteration with p[x] as the new x.
• If we entered this case from case 1, then p[x] was red ⇒ the new x is red & black ⇒ the color attribute of the new x is RED ⇒ the loop terminates. Then the new x is made black in the last line.
Case 3 – w is black, w's left child is red, w's right child is black
[Figure: recolor and right-rotate on w so that x's new sibling is black with a red right child]
• Make w red and w's left child black.
• Then right rotate on w.
• The new sibling w of x is black with a red right child ⇒ case 4.
Case 4 – w is black, w's right child is red
[Figure: recolor and left-rotate on p[x]; the extra black on x is absorbed]
• Make w be p[x]'s color (c).
• Make p[x] black and w's right child black.
• Then left rotate on p[x].
• Remove the extra black on x (⇒ x is now singly black) without violating any red-black properties.
• All done. Setting x to the root causes the loop to terminate.
RB-Delete(T, z)
1. if left[z] = nil[T] or right[z] = nil[T]
2.   then y ← z
3.   else y ← TREE-SUCCESSOR(z)
4. if left[y] ≠ nil[T]
5.   then x ← left[y]
6.   else x ← right[y]
7. p[x] ← p[y]           // Do this, even if x is nil[T]
8. if p[y] = nil[T]
9.   then root[T] ← x
10.  else if y = left[p[y]]
11.    then left[p[y]] ← x
12.    else right[p[y]] ← x
13. if y ≠ z
14.   then key[z] ← key[y]
15.     copy y's satellite data into z
16. if color[y] = BLACK
17.   then RB-Delete-Fixup(T, x)
18. return y
RB-Delete-Fixup(T, x)
1. while x ≠ root[T] and color[x] = BLACK
2.   do if x = left[p[x]]
3.     then w ← right[p[x]]
4.       if color[w] = RED
5.         then color[w] ← BLACK               // Case 1
6.           color[p[x]] ← RED                 // Case 1
7.           LEFT-ROTATE(T, p[x])              // Case 1
8.           w ← right[p[x]]                   // Case 1
       /* x is still left[p[x]] */
9.       if color[left[w]] = BLACK and color[right[w]] = BLACK
10.        then color[w] ← RED                 // Case 2
11.          x ← p[x]                          // Case 2
12.        else if color[right[w]] = BLACK
13.          then color[left[w]] ← BLACK       // Case 3
14.            color[w] ← RED                  // Case 3
15.            RIGHT-ROTATE(T, w)              // Case 3
16.            w ← right[p[x]]                 // Case 3
17.          color[w] ← color[p[x]]            // Case 4
18.          color[p[x]] ← BLACK               // Case 4
19.          color[right[w]] ← BLACK           // Case 4
20.          LEFT-ROTATE(T, p[x])              // Case 4
21.          x ← root[T]                       // Case 4
22.    else (same as then clause with "right" and "left" exchanged)
23. color[x] ← BLACK
Delete Analysis
O(lg n) time to get through RB-Delete up to the call of RB-Delete-Fixup.
Within RB-Delete-Fixup:
  Case 2 is the only case in which more iterations occur.
    x moves up 1 level.
    Hence, O(lg n) iterations.
  Each of cases 1, 3, and 4 has 1 rotation ⇒ at most 3 rotations in all.
Hence, O(lg n) time.
Module 1 8086
 
Module 2 instruction set
Module 2 instruction set Module 2 instruction set
Module 2 instruction set
 
introduction to computers
 introduction to computers introduction to computers
introduction to computers
 
Registers and counters
Registers and counters Registers and counters
Registers and counters
 
Computer security module 4
Computer security module 4Computer security module 4
Computer security module 4
 
Module 4 network and computer security
Module  4 network and computer securityModule  4 network and computer security
Module 4 network and computer security
 
Network and computer security-
Network and computer security-Network and computer security-
Network and computer security-
 
Computer security module 3
Computer security module 3Computer security module 3
Computer security module 3
 
Module 4 registers and counters
Module 4 registers and counters Module 4 registers and counters
Module 4 registers and counters
 
Module 2 network and computer security
Module 2 network and computer securityModule 2 network and computer security
Module 2 network and computer security
 
Computer security module 2
Computer security module 2Computer security module 2
Computer security module 2
 
Computer security module 1
Computer security module 1Computer security module 1
Computer security module 1
 
Network and Computer security
Network and Computer securityNetwork and Computer security
Network and Computer security
 
Combinational and sequential logic
Combinational and sequential logicCombinational and sequential logic
Combinational and sequential logic
 
Module 2 logic gates
Module 2  logic gatesModule 2  logic gates
Module 2 logic gates
 

Dernier

GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSJoshuaGantuangco2
 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4JOYLYNSAMANIEGO
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptxmary850239
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptxmary850239
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfJemuel Francisco
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxlancelewisportillo
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designMIPLM
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management SystemChristalin Nelson
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Seán Kennedy
 

Dernier (20)

GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-design
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management System
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...
 

Analysis and design of algorithms part2

easy to implement.
• Disadvantage
The disadvantage of Insertion Sort is that it is not efficient when operating on a large list or input size.
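As a concrete companion to the insertion sort discussion above, here is a minimal Python sketch; the function and variable names are illustrative, not taken from the slides.

```python
def insertion_sort(a):
    """Sort list a in place in ascending order (best case Θ(n), worst case Θ(n^2))."""
    for j in range(1, len(a)):          # a[0..j-1] is already sorted
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:    # shift larger elements one slot to the right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                  # insert key at its proper position

data = [5, 2, 4, 6, 1, 3]
insertion_sort(data)
print(data)                             # [1, 2, 3, 4, 5, 6]
```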
Quick-Sort
• Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
– Divide: pick a random element x (called the pivot) and partition S into
• L: elements less than x
• E: elements equal to x
• G: elements greater than x
– Recur: sort L and G
– Conquer: join L, E and G

Choice of Pivot
Three common ways to choose the pivot:
• Median-of-three: from the leftmost, middle, and rightmost elements of the list to be sorted, select the one with the median key as the pivot.
• Rightmost element of the list to be sorted: when sorting A[6:20], use A[20] as the pivot.
• Random element: when sorting A[6:20], generate a random number r in the range [6, 20] and use A[r] as the pivot.

Algorithm
Given an array of n elements (e.g., integers):
• If the array contains only one element, return.
• Else
– Pick one element to use as the pivot.
– Partition the elements into two sub-arrays:
• elements less than or equal to the pivot
• elements greater than the pivot
– Quick sort the two sub-arrays.
– Return the results.

Partitioning
• The key to the algorithm is the PARTITION procedure, which rearranges the sub-array in place (see the sketch after this section).
• Given a pivot, partition the elements of the array such that the resulting array consists of:
1. one sub-array that contains elements >= pivot
2. another sub-array that contains elements < pivot
• The sub-arrays are stored in the original data array.

Analysis
The running time of quick sort depends on whether the partitioning is balanced or not. Assume that the keys are random and uniformly distributed.
Best case
Recursion:
1. Partition splits the array into two sub-arrays of size n/2.
2. Quicksort each sub-array.
The depth of the recursion is log2 n. At each level of the recursion, the work done in all the partitions at that level is O(n).
O(log2 n) * O(n) = O(n log2 n)
Best-case running time: O(n log2 n)

Worst case
• The data is already sorted.
– Recursion:
1. Partition splits the array into two sub-arrays:
• one sub-array of size 0
• the other sub-array of size n-1
2. Quick sort each sub-array.
Recurring on the length n-1 part requires recursing to depth n-1.
• The recursion is O(n) levels deep (for an array of size n).
• The partitioning work done at each level is O(n).
• O(n) * O(n) = O(n^2)
Worst-case running time: O(n^2)

Average case
• If the pivot element is randomly chosen, we expect the split of the input array to be reasonably well balanced on average.
• Assuming random input, the average-case running time is much closer to Θ(n lg n) than Θ(n^2).
• T(n) = O(n lg n)
Improved pivot selection: pick the median value of three elements from the data array, data[0], data[n/2], and data[n-1], and use this median value as the pivot.
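To make the pivot-and-partition scheme above concrete, here is a minimal Python sketch using the rightmost element as the pivot (a Lomuto-style partition); the function names are illustrative and not the slides' exact procedures.

```python
def partition(a, p, r):
    """Rearrange a[p..r] in place around pivot a[r]; return the pivot's final index."""
    pivot = a[r]
    i = p - 1                            # boundary of the "<= pivot" region
    for j in range(p, r):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[r] = a[r], a[i + 1]      # place the pivot between the two regions
    return i + 1

def quicksort(a, p=0, r=None):
    """Sort a[p..r] in place: O(n lg n) on average, O(n^2) on already-sorted input."""
    if r is None:
        r = len(a) - 1
    if p < r:
        q = partition(a, p, r)
        quicksort(a, p, q - 1)
        quicksort(a, q + 1, r)

data = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(data)
print(data)                              # [1, 2, 3, 4, 5, 6, 7, 8]
```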
Merge sort
Merge-sort on an input sequence S with n elements consists of three steps:
Divide: partition S into two sequences S1 and S2 of about n/2 elements each.
Recur: recursively sort S1 and S2.
Conquer: merge S1 and S2 into a unique sorted sequence.
Example: A L G O R I T H M S is divided into two halves, each half is sorted (A G L O R and H I M S T), and the halves are merged into A G H I L M O R S T.

Algorithm
MERGE-SORT (A, p, r)
1. IF p < r                     // Check for base case
2.   THEN q = (p + r)/2         // Divide step
3.     MERGE-SORT (A, p, q)     // Conquer step
4.     MERGE-SORT (A, q + 1, r) // Conquer step
5.     MERGE (A, p, q, r)       // Combine step

MERGE(A, p, q, r)
1. Compute n1 and n2
2. Copy the first n1 elements into L[1 . . n1 + 1] and the next n2 elements into R[1 . . n2 + 1]
3. L[n1 + 1] ← ∞; R[n2 + 1] ← ∞
4. i ← 1; j ← 1
5. for k ← p to r
6.   do if L[i] ≤ R[j]
7.     then A[k] ← L[i]
8.       i ← i + 1
9.     else A[k] ← R[j]
10.      j ← j + 1

Analysis
• For simplicity, assume that n is a power of 2 so that each divide step yields two subproblems, both of size exactly n/2.
• The base case occurs when n = 1. When n > 1, the time for the merge sort steps is:
Divide: just compute q as the average of p and r, which takes constant time, i.e. Θ(1).
Conquer: recursively solve 2 subproblems, each of size n/2, which is 2T(n/2).
Combine: MERGE on an n-element subarray takes Θ(n) time.
• Summed together they give the recurrence for the merge sort running time:
T(n) = Θ(1) if n = 1
T(n) = 2T(n/2) + Θ(n) + Θ(1) if n > 1
T(n) = Θ(n lg n)

Analysis of MergeSort
O(n log n) best-, average-, and worst-case complexity, because the merging is always linear.
― Extra O(n) temporary array for merging data.
― Extra copying to the temporary array and back.
Well suited to external sorting.
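The following is a minimal Python rendering of the MERGE-SORT/MERGE idea above, using slices and a sentinel-free merge; it is a sketch under those assumptions, not the slides' exact pseudocode.

```python
def merge_sort(a):
    """Return a sorted copy of a; Θ(n lg n) time, Θ(n) extra space."""
    if len(a) <= 1:                     # base case: already sorted
        return a[:]
    q = len(a) // 2                     # divide
    left = merge_sort(a[:q])            # conquer
    right = merge_sort(a[q:])
    return merge(left, right)           # combine

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                # at most one of these is non-empty
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # [1, 2, 2, 3, 4, 5, 6, 7]
```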
Heaps
Definition of a heap: a balanced, left-justified (complete) binary tree in which no node has a value greater than the value in its parent; this is the max-heap ordering. In a min-heap the ordering is reversed: for a node with value X and children Y and Z, Y >= X and Z >= X.

Heap
• The binary heap data structure is an array that can be viewed as a complete binary tree. Each node of the binary tree corresponds to an element of the array. The array is completely filled on all levels except possibly the lowest.
• Max-heap example: the tree with 19 at the root, children 12 and 16, and leaves 1, 4 and 7 is stored as the array A = [19, 12, 16, 1, 4, 7].
• Min-heap example: the same keys arranged with 1 at the root so that every parent is no larger than its children.
• Algorithm (insertion)
1. Add the new element to the next available position at the lowest level.
2. Restore the max-heap property if it is violated.
• The general strategy is percolate up (or bubble up): if the parent of the element is smaller than the element, interchange the parent and child.
OR
Restore the min-heap property if it is violated.
• The general strategy is percolate up (or bubble up): if the parent of the element is larger than the element, interchange the parent and child.

Example: inserting 17 into the max-heap with root 19 and children 12 and 16 places 17 below 16; since 17 > 16, the two are swapped. Percolate up to maintain the heap property.

• Delete max
– Copy the last number to the root (overwriting the maximum element stored there).
– Restore the max-heap property by percolating down.
• Delete min
– Copy the last number to the root (overwriting the minimum element stored there).
– Restore the min-heap property by percolating down.
Maintaining the Heap Property
• Suppose a node is smaller than one of its children, while the left and right subtrees of i are max-heaps.
• To eliminate the violation:
– Exchange the node with its larger child.
– Move down the tree.
– Continue until the node is not smaller than its children.

• Assumptions: the left and right subtrees of i are max-heaps; A[i] may be smaller than its children.
Alg: MAX-HEAPIFY(A, i, n)
1. l ← LEFT(i)
2. r ← RIGHT(i)
3. if l ≤ n and A[l] > A[i]
4.   then largest ← l
5.   else largest ← i
6. if r ≤ n and A[r] > A[largest]
7.   then largest ← r
8. if largest ≠ i
9.   then exchange A[i] ↔ A[largest]
10.    MAX-HEAPIFY(A, largest, n)

Example: MAX-HEAPIFY(A, 2, 10). A[2] violates the heap property and is exchanged with A[4]; then A[4] violates the property and is exchanged with A[9], after which the heap property is restored.

T(n) = O(lg n)
• Best case: no swap is performed, so T(n) = O(1).
• Worst case: the element is swapped all the way down to a leaf.

BUILD-MAX-HEAP
Produces a max-heap from an unordered input array.
• O(n) calls to MAX-HEAPIFY.
• Each call takes O(lg n) time, so O(n lg n) bounds the total time (a tighter analysis shows that building the heap is in fact O(n), which is the bound used in the heapsort analysis below).
Heap sort
The heapsort algorithm consists of two phases:
- build a heap from an arbitrary array
- use the heap to sort the data
• To sort the elements in decreasing order, use a min-heap.
• To sort the elements in increasing order, use a max-heap.

Example
Let us look at this example: we must convert an unordered array with n = 10 elements into a max-heap. We start with position 10/2 = 5.
We compare 3 with its child and swap them. We compare 17 with its two children and swap it with the maximum child (70).
We compare 28 with its two children, 63 and 34, and swap it with the largest child. We compare 52 with its children and swap it with the largest; recursing, no further swaps are needed.
Finally, we swap the root with its largest child and recurse, swapping 46 again with 81 and then again with 70.
We have now converted the unsorted array into a max-heap.
Suppose we pop the maximum element of this heap. This leaves a gap at the back of the array. This is the last entry in the array, so why not fill it with the largest element?
Repeat this process: pop the maximum element and insert it at the end of the array. Pop and append 70, then 63; we now have the 4 largest elements in order. Pop and append 52, then 46; continuing, pop and append 34, then 28. Finally, we pop 17, insert it into the 2nd location, and the resulting array is sorted.

Analysis
• The call to BuildHeap() takes O(n) time.
• Each of the n - 1 calls to Heapify() takes O(lg n) time.
• Thus the total time taken by HeapSort() = O(n) + (n - 1) O(lg n) = O(n) + O(n lg n) = O(n lg n).
• There are no separate best-case and worst-case scenarios for heap sort: the running time is O(n lg n) regardless of the input order.
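To tie the two phases together, here is a minimal in-place heapsort sketch in Python following the build-then-pop scheme described above (0-based indexing; the names max_heapify and heapsort are illustrative).

```python
def max_heapify(a, i, n):
    """Sift a[i] down within a[0:n], assuming both subtrees of i already satisfy the max-heap property."""
    while True:
        l, r = 2 * i + 1, 2 * i + 2      # children in 0-based indexing
        largest = i
        if l < n and a[l] > a[largest]:
            largest = l
        if r < n and a[r] > a[largest]:
            largest = r
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest                      # continue percolating down

def heapsort(a):
    """Sort a in place in increasing order: build a max-heap, then repeatedly
    swap the root (maximum) into the shrinking tail and restore the heap."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):  # build phase, O(n)
        max_heapify(a, i, n)
    for end in range(n - 1, 0, -1):      # n - 1 pops, O(lg n) each
        a[0], a[end] = a[end], a[0]
        max_heapify(a, 0, end)

data = [5, 13, 2, 25, 7, 17, 20, 8, 4]
heapsort(data)
print(data)                              # [2, 4, 5, 7, 8, 13, 17, 20, 25]
```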
Binomial trees
• A binomial tree is an ordered tree defined recursively: B0 is a single node, and for k ≥ 1, Bk consists of two Bk-1 trees linked so that the root of one becomes the leftmost child of the root of the other.
• Binomial tree properties: Bk has 2^k nodes, height k, and a root of degree k.

Binomial heaps
A binomial heap is a linked list of binomial trees with the following properties:
1. The binomial trees are linked in increasing order of size.
2. There is at most one binomial tree of each size.
3. Each binomial tree has the heap structure: the value in each node is ≤ the values in its children.
(The slide shows an example heap whose root list, reached through head[H], contains trees of strictly increasing size.)

Binomial Heap Implementation
• Each node has the following fields:
p: parent
child: leftmost child
sibling
degree
key
• The roots of the trees are connected using a linked list.
• Each node x also contains the field degree[x], which is the number of children of x.
(The slide redraws the example heap with these pointer and degree fields filled in.)

Binomial Heap Operations
1. Make-Heap()
2. Insert(H, x), where x is a node
3. Minimum(H)
4. Extract-Min(H)
5. Union(H1, H2): merge H1 and H2, creating a new heap
6. Decrease-Key(H, x, k): decrease x.key (x is a node in H) to k (it is assumed that k ≤ x.key)
Make-Heap()
• Make an empty binomial heap. Creating the pointer can be done in O(1) time; the operation simply creates a new head pointer and sets it to NIL.
Binomial-Heap-Create()
1 head[H] <- NIL
2 return H

Minimum(H)
• To find the minimum we scan the keys stored at the roots connected to the head of H.
• The minimum must be in some root in the top list.
• If there are n nodes in the heap there are at most lg n + 1 roots in the top list, at most one each of degree 0, 1, 2, . . . , lg n, so the minimum can be found in O(lg n) time.
Binomial-Heap-Minimum(H)
1 y <- NIL
2 x <- head[H]
3 min <- ∞
4 while x is not NIL
5   do if key[x] < min then
6     min <- key[x]
7     y <- x
8   x <- sibling[x]
9 return y
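A minimal Python sketch of the root-list scan above; SimpleNamespace stands in for the node structure, so the field names key and sibling are assumptions mirroring the slides.

```python
from types import SimpleNamespace as N

def binomial_minimum(head):
    """Walk the root list (linked via .sibling) and return the root with minimum key."""
    y, x, best = None, head, float("inf")
    while x is not None:
        if x.key < best:
            best, y = x.key, x
        x = x.sibling
    return y

# Root list 5 -> 2 -> 1, each entry standing in for the root of one binomial tree.
r3 = N(key=1, sibling=None)
r2 = N(key=2, sibling=r3)
r1 = N(key=5, sibling=r2)
print(binomial_minimum(r1).key)   # 1
```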
Find Minimum Key Example: the slide steps the pointer x along the root list of the example heap, updating the running minimum y, until the root holding the smallest key is returned.
Binomial-Link(y, z)
Links two binomial trees with the same degree. Note that z, the second argument to Binomial-Link, becomes the parent, and y becomes the child.
Link(y, z)
  p[y] := z
  sibling[y] := child[z]
  child[z] := y
  degree[z] := degree[z] + 1
(The figure shows two Bk-1 trees rooted at y and z being linked into a single Bk rooted at z.)
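A minimal Python sketch of the node layout and the link step; the class BinomialNode and its fields mirror the slides' p/child/sibling/degree/key layout but are otherwise illustrative.

```python
class BinomialNode:
    """One node of a binomial heap (parent / leftmost-child / sibling representation)."""
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.child = None      # leftmost child
        self.sibling = None    # next sibling to the right, or next root
        self.degree = 0

def binomial_link(y, z):
    """Make root y the leftmost child of root z (both trees must have equal degree)."""
    y.parent = z
    y.sibling = z.child
    z.child = y
    z.degree += 1

# Link two B0 trees; the caller passes the root that should become the parent
# as z (here the smaller key, 3), so the heap order is preserved.
a, b = BinomialNode(7), BinomialNode(3)
binomial_link(a, b)
print(b.degree, b.child.key)   # 1 7
```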
Union(H1, H2)
• This is the most sophisticated of the binomial heap operations and is used by many other operations. The running time is O(lg n).
• Union first performs a simple merge of the two root lists by degree, then traverses the new root list with three pointers, prev-x, x and next-x, linking roots of equal degree as it goes.

Example: starting with two binomial heaps, merging the root lists can leave two trees of the same degree; such trees are combined using a binomial link, making the smaller key the root of the combined tree.

Cases encountered during the traversal:
• Case 1 occurs when degree[x] ≠ degree[next-x], that is, when x is the root of a Bk-tree and next-x is the root of a Bl-tree for some l > k; the pointers simply advance.
• Case 2 occurs when x is the first of three roots of equal degree, that is, when degree[x] = degree[next-x] = degree[sibling[next-x]]; the pointers advance and the linking is handled on a later iteration.
• Cases 3 and 4 occur when x is the first of exactly two roots of equal degree, that is, when degree[x] = degree[next-x] ≠ degree[sibling[next-x]]. If key[x] ≤ key[next-x] (Case 3), next-x is linked under x; otherwise (Case 4) x is linked under next-x.

Union(H1, H2)
H := new heap;
head[H] := merge(H1, H2);   /* simple merge of root lists */
if head[H] = NIL then return H fi;
prev-x := NIL;
x := head[H];
next-x := sibling[x];
while next-x ≠ NIL do
  if (degree[x] ≠ degree[next-x]) or
     (sibling[next-x] ≠ NIL and degree[sibling[next-x]] = degree[x])
  then prev-x := x; x := next-x                           /* Cases 1 and 2 */
  else
    if key[x] ≤ key[next-x]
    then sibling[x] := sibling[next-x]; Link(next-x, x)   /* Case 3 */
    else
      if prev-x = NIL then head[H] := next-x
      else sibling[prev-x] := next-x fi;
      Link(x, next-x); x := next-x                        /* Case 4 */
    fi
  fi;
  next-x := sibling[x]
od;
return H
Union Example: the slides trace Binomial-Heap-Union on two concrete heaps, showing the merged root list and how Cases 1–4 fire in turn (advancing the pointers or linking equal-degree roots) until next-x becomes NIL and the traversal terminates. Note: Union is O(lg n).
Insert
Insert(H, x)
  H' := Make-Binomial-Heap();
  p[x] := NIL;
  child[x] := NIL;
  sibling[x] := NIL;
  degree[x] := 0;
  head[H'] := x;
  H := Union(H, H')
Since the work besides Union is constant, Insert also runs in O(lg n) time.
Extract Node With Minimum Key
This operation starts by finding and removing the node x with minimum key from the binomial heap H. A new binomial heap H' is created from the list of x's children, taken in reverse order, and H and H' are united to obtain the resulting binomial heap.
Pseudocode
Binomial-Heap-Extract-Min(H)
1 find the root x with the minimum key in the root list of H, and remove x from the root list of H
2 H' <- Make-Binomial-Heap()
3 reverse the order of the linked list of x's children, and set head[H'] to point to the head of the resulting list
4 H <- Binomial-Heap-Union(H, H')
5 return x
Run time: O(lg n)

Extract Minimum Key Example: the slides remove the minimum root from the example heap, turn its children (in reverse order) into a new heap H', and unite the two heaps to produce the result.
Decreasing a key
The current key is replaced with a new key. To maintain the min-heap property, the new key is then compared with the key of the parent; if the parent's key is greater, the keys (and any satellite data) are exchanged. This process continues until the new key is greater than the parent's key or the new key reaches the root.
Pseudocode:
Binomial-Heap-Decrease-Key(H, x, k)
1 if k > key[x]
2   then error "new key is greater than current key"
3 key[x] <- k
4 y <- x
5 z <- p[y]
6 while z is not NIL and key[y] < key[z]
7   do exchange key[y] <-> key[z]
8     if y and z have satellite fields, exchange them, too
9   y <- z
10  z <- p[y]
Execution time: this procedure takes O(lg n), since the maximum depth of x is lg n.
Example: the slides decrease a key deep in the example heap and bubble it up past its parents until the min-heap order is restored.
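A sketch of the bubbling-up loop in Python (satellite data omitted); the node objects are simulated with SimpleNamespace, so the key and parent field names are assumptions.

```python
from types import SimpleNamespace

def decrease_key(x, k):
    """Decrease x.key to k, then swap keys upward until min-heap order holds."""
    if k > x.key:
        raise ValueError("new key is greater than current key")
    x.key = k
    y, z = x, x.parent
    while z is not None and y.key < z.key:
        y.key, z.key = z.key, y.key      # exchange keys (and satellite data, if any)
        y, z = z, z.parent

# Tiny demo: a two-node chain 2 -> 10 (parent -> child); decreasing 10 to 1
# bubbles the new key up to the root.
root = SimpleNamespace(key=2, parent=None)
child = SimpleNamespace(key=10, parent=root)
decrease_key(child, 1)
print(root.key, child.key)               # 1 2
```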
Delete a Node
Assuming that no node in H has a key of -∞, the key of the node to be deleted is first decreased to -∞. The node is then deleted using the extract-min procedure.
Pseudocode:
Binomial-Heap-Delete(H, x)
1 Binomial-Heap-Decrease-Key(H, x, -∞)
2 Binomial-Heap-Extract-Min(H)
Run time: O(lg n), since the run time of both Binomial-Heap-Decrease-Key and Binomial-Heap-Extract-Min is O(lg n).
Delete a Node Example: the slides decrease the chosen key to -∞, bubble it up to a root, and then extract it as the minimum.
Fibonacci heap
• A Fibonacci heap is a set of min-heap-ordered trees.
• Each node x has a pointer p[x] to its parent and child[x] to one of its children.
• Trees are represented using left-child, right-sibling pointers and circular, doubly linked lists: the children of a node are linked together in a doubly linked circular list.
• The entire heap is accessed through a pointer min[H], which points to the minimum-key root.
(The slide shows an example Fibonacci heap whose root list contains the keys 23, 7, 24, 17 and 3, with min[H] pointing at the root 3.)

• Potential function: Φ(H) = t(H) + 2 m(H), where t(H) is the number of trees in the root list of H and m(H) is the number of marked nodes in H.
For the example heap shown, trees(H) = 5 and marks(H) = 3, so Φ(H) = 5 + 2·3 = 11.

Fibonacci Heap Operations
Create, Insert, Find-Min, Union, Delete, Delete-Min
• Make-Fib-Heap(H): allocate and return the Fibonacci heap object H with n[H] = 0 and min[H] = nil; t(H) = 0 and m(H) = 0, so Φ(H) = 0.
The cost of Make-Fib-Heap is O(1).
Fibonacci Heaps: Insert
Insert: create a new singleton tree, add it to the left of the min pointer, and update the min pointer.
(Example: inserting 21 simply places the one-node tree 21 in the root list next to min[H].)
Insert Analysis
Actual cost: O(1). Change in potential: +1. Amortized cost: O(1).

Fib-Heap-Insert(H, x)
{
  degree[x] ← 0
  p[x] ← NIL
  child[x] ← NIL
  left[x] ← x; right[x] ← x
  mark[x] ← FALSE
  concatenate the root list containing x with the root list of H
  if min[H] = NIL or key[x] < key[min[H]]
    then min[H] ← x
  n[H] ← n[H] + 1
}
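A minimal Python sketch of the lazy insert above, keeping roots in a circular, doubly linked list; the class and field names (FibNode, FibHeap, left, right) are illustrative rather than the slides' exact layout.

```python
class FibNode:
    """Node kept in a circular, doubly linked root list of a Fibonacci heap."""
    def __init__(self, key):
        self.key = key
        self.degree = 0
        self.mark = False
        self.parent = None
        self.child = None
        self.left = self.right = self       # singleton circular list

class FibHeap:
    def __init__(self):
        self.min = None
        self.n = 0

    def insert(self, key):
        """O(1) amortized: splice a singleton into the root list and fix min."""
        x = FibNode(key)
        if self.min is None:
            self.min = x
        else:                                # splice x just to the left of min
            x.right = self.min
            x.left = self.min.left
            self.min.left.right = x
            self.min.left = x
            if x.key < self.min.key:
                self.min = x
        self.n += 1
        return x

h = FibHeap()
for k in (23, 7, 21, 3, 17):
    h.insert(k)
print(h.min.key, h.n)                        # 3 5
```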
Fibonacci Heaps: Union
Union: concatenate the root lists of two Fibonacci heaps (the root lists are circular, doubly linked lists) and keep the smaller of the two min pointers.
Actual cost: O(1). Change in potential: 0. Amortized cost: O(1).

Fib-Heap-Union(H1, H2)
{
  H ← Make-Fib-Heap()
  min[H] ← min[H1]
  concatenate the root list of H2 with the root list of H
  if (min[H1] = NIL) or (min[H2] ≠ NIL and min[H2] < min[H1])
    then min[H] ← min[H2]
  n[H] ← n[H1] + n[H2]
  free the objects H1 and H2
  return H
}
Extract-Min
Fib-Heap-Extract-Min(H)
{
  z ← min[H]
  if z ≠ NIL
    then { for each child x of z
             do { add x to the root list of H
                  p[x] ← NIL }
           remove z from the root list of H
           if z = right[z]
             then min[H] ← NIL
             else min[H] ← right[z]
                  Consolidate(H)
           n[H] ← n[H] - 1 }
  return z
}

Fib-Heap-Link(H, y, x)
{
  remove y from the root list of H
  make y a child of x; degree[x] ← degree[x] + 1
  mark[y] ← FALSE
}

Consolidate(H)
{
  for i ← 0 to D(n[H]) do A[i] ← NIL
  for each node w in the root list of H
    do { x ← w; d ← degree[x]
         while A[d] ≠ NIL
           do { y ← A[d]
                if key[x] > key[y] then exchange x ↔ y
                Fib-Heap-Link(H, y, x)
                A[d] ← NIL; d ← d + 1 }
         A[d] ← x }
  min[H] ← NIL
  for i ← 0 to D(n[H])
    do if A[i] ≠ NIL
         then { add A[i] to the root list of H
                if min[H] = NIL or key[A[i]] < key[min[H]]
                  then min[H] ← A[i] }
}

Fibonacci Heaps: Delete-Min
• Delete min: delete the minimum node, meld its children into the root list, and update min.
• Consolidate the trees so that no two roots have the same rank (degree).
(The slides then trace Consolidate on a concrete heap after Delete-Min: a rank array indexed 0..3 is maintained while the pointer "current" walks the root list; equal-rank roots are repeatedly linked, e.g. 23 is linked into 17, 17 into 7, 24 into 7 and 41 into 18, until every root has a distinct rank, after which the min pointer is recomputed.)
Fibonacci Heaps: Decrease Key
Decrease the key of element x to k.
Case 0: the min-heap property is not violated.
• Decrease the key of x to k.
• Change the heap min pointer if necessary.
(Example: decreasing 46 to 45 leaves 45 larger than its parent 24, so nothing else changes.)

Case 1: the parent of x is unmarked.
• Decrease the key of x to k.
• Cut the link between x and its parent.
• Mark the parent.
• Add the tree rooted at x to the root list, updating the heap min pointer.
(Example: decreasing 45 to 15 cuts 15 away from its parent 24, marks 24, and places 15 in the root list.)

Case 2: the parent of x is marked.
• Decrease the key of x to k.
• Cut the link between x and its parent p[x], and add x to the root list.
• Cut the link between p[x] and p[p[x]], and add p[x] to the root list. If p[p[x]] is unmarked, mark it. If p[p[x]] is marked, cut off p[p[x]], unmark it, and repeat (cascading cuts).
(Example: decreasing 35 to 5 cuts 5 away from its marked parent 26; since 26 and then 24 are marked, they too are cut to the root list, and the cascading cut stops at the first unmarked ancestor.)
Amortized Analysis Techniques
• In amortized analysis we average the time required for a sequence of operations over all the operations performed.
• Amortized analysis guarantees the average performance of each operation in the worst case.
– No probability is involved.
• The amortized cost per operation is therefore T(n)/n.
Three techniques:
• The aggregate method
• The accounting method
• The potential method

Aggregate analysis
– The total amount of time needed for the n operations is computed and divided by n.
– Treat all operations equally.
– Compute the worst-case running time of a sequence of n operations.
– Divide by n to get an amortized running time.
– We aggregate the cost of a series of n operations to T(n); then each operation has the same amortized cost of T(n)/n.

The Accounting Method
• Principles of the accounting method:
1. Associate credit accounts with different parts of the structure.
2. Associate amortized costs with operations and show how they credit or debit the accounts.
• Different costs may be assigned to different operations: operations are assigned an amortized cost, and objects of the data structure are assigned a credit.

Accounting Method vs. Aggregate Method
• Aggregate method: first analyze the entire sequence, then calculate the amortized cost per operation.
• Accounting method: first assign an amortized cost per operation, check that the assignment is valid (the credit never goes negative), then compute the cost of the entire sequence of operations.

The Potential Method
• Similar to the accounting method.
• Amortized costs are assigned in a more complicated way, based on a potential function and the current state of the data structure.
• Must ensure that the sum of the amortized costs of all operations in the sequence is at least the sum of the actual costs of all operations in the sequence.
• Define a potential function Φ which maps any state of the data structure to a real number.
• Notation:
– D0: initial state of the data structure
– Di: state of the data structure after the i-th operation
– ci: actual cost of the i-th operation
– mi: amortized cost of the i-th operation, defined as mi = ci + Φ(Di) - Φ(Di-1)
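As a hedged illustration of the potential method (this example is not from the slides), here is the classic stack-with-multipop analysis worked in Python: taking Φ(D) = number of elements on the stack makes every operation O(1) amortized.

```python
class MultipopStack:
    """Stack with push, pop-many; potential function Φ(D) = len(items)."""
    def __init__(self):
        self.items = []

    def push(self, x):
        # actual cost 1, change in potential +1, so amortized cost 2
        self.items.append(x)

    def multipop(self, k):
        # actual cost min(k, size), change in potential -min(k, size),
        # so amortized cost 0 even though a single call may cost Θ(n)
        popped = 0
        while self.items and popped < k:
            self.items.pop()
            popped += 1
        return popped

# Any sequence of n operations therefore costs O(n) in total, i.e. O(1) amortized each.
s = MultipopStack()
for i in range(10):
    s.push(i)
print(s.multipop(7), len(s.items))   # 7 3
```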
Red-Black Trees
A red-black tree is a binary search tree that satisfies the following properties:
1. A node is either red or black.
2. The root is ALWAYS black.
3. All leaves are black.
4. Both children of a red node are black (no red node can have a red child).
5. Every path from a given node down to any descendant leaf contains the same number of black nodes. The number of black nodes on such a path (not including the initial node but including the leaves) is called the black-height, bh, of the node.
A red-black tree has O(lg n) height.

Equivalently:
• Root property: the root is black.
• External property: every leaf is black.
• Internal property: the children of a red node are black.
• Depth property: all the leaves have the same black depth.
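As a small illustration (not from the slides), the five properties above translate directly into a checker; the RBNode class and function names below are hypothetical.

```python
class RBNode:
    def __init__(self, key, color, left=None, right=None):
        self.key, self.color, self.left, self.right = key, color, left, right

def check_rb_tree(root):
    """Verify the red-black properties on a node-based tree; return its black-height."""
    if root is not None and root.color != "BLACK":
        raise ValueError("property 2: the root must be black")
    return _black_height(root, parent_is_red=False)

def _black_height(node, parent_is_red):
    if node is None:                                  # NIL leaves count as black (property 3)
        return 1
    if node.color not in ("RED", "BLACK"):            # property 1
        raise ValueError("property 1: every node is red or black")
    if node.color == "RED" and parent_is_red:         # property 4
        raise ValueError("property 4: a red node has a red child")
    lh = _black_height(node.left, node.color == "RED")
    rh = _black_height(node.right, node.color == "RED")
    if lh != rh:                                      # property 5
        raise ValueError("property 5: unequal black-heights")
    return lh + (1 if node.color == "BLACK" else 0)

# A black root 2 with red children 1 and 3 satisfies all five properties.
print(check_rb_tree(RBNode(2, "BLACK", RBNode(1, "RED"), RBNode(3, "RED"))))   # 2
```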
Rotations
• Rotations are the basic tree-restructuring operation for almost all balanced search trees.
• A rotation takes a red-black tree and a node, changes pointers to change the local structure, and does not violate the binary-search-tree property.
• Left rotation and right rotation are inverses: Left-Rotate(T, x) turns the edge between x and its right child y so that y becomes the parent of x, and Right-Rotate(T, y) undoes it.
(The slide shows an example of LEFT-ROTATE(T, x) on a small tree.)

Left-Rotate (T, x)
1. y ← right[x]                   // Set y.
2. right[x] ← left[y]             // Turn y's left subtree into x's right subtree.
3. if left[y] ≠ nil[T]
4.   then p[left[y]] ← x
5. p[y] ← p[x]                    // Link x's parent to y.
6. if p[x] = nil[T]
7.   then root[T] ← y
8.   else if x = left[p[x]]
9.     then left[p[x]] ← y
10.    else right[p[x]] ← y
11. left[y] ← x                   // Put x on y's left.
12. p[x] ← y
• The code for RIGHT-ROTATE is symmetric.
• Both LEFT-ROTATE and RIGHT-ROTATE run in O(1) time.

Right rotation:
1. x ← left[y]
2. left[y] ← right[x]
3. if right[x] ≠ nil[T]
4.   then p[right[x]] ← y
5. p[x] ← p[y]
6. if p[y] = nil[T]
7.   then root[T] ← x
8.   else if y = left[p[y]]
9.     then left[p[y]] ← x
10.    else right[p[y]] ← x
11. right[x] ← y
12. p[y] ← x

Rotation
• The pseudocode for Left-Rotate assumes that right[x] ≠ nil[T] and that the root's parent is nil[T].
• A left rotation on x makes x the left child of y and turns the left subtree of y into the right subtree of x.
• The pseudocode for Right-Rotate is symmetric: exchange left and right everywhere.
• Time: O(1) for both Left-Rotate and Right-Rotate, since only a constant number of pointers are modified.
Operations on RB Trees
• All operations can be performed in O(lg n) time.
• Insertion and deletion are not straightforward.
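A minimal Python sketch of LEFT-ROTATE on a pointer-based tree, with None standing in for the sentinel nil[T]; the Node and Tree classes are illustrative.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

class Tree:
    def __init__(self):
        self.root = None

def left_rotate(t, x):
    """Rotate the edge between x and its right child y; only O(1) pointers change."""
    y = x.right
    x.right = y.left                    # turn y's left subtree into x's right subtree
    if y.left is not None:
        y.left.parent = x
    y.parent = x.parent                 # link x's parent to y
    if x.parent is None:
        t.root = y
    elif x is x.parent.left:
        x.parent.left = y
    else:
        x.parent.right = y
    y.left = x                          # put x on y's left
    x.parent = y

t = Tree()
t.root = Node(10)
t.root.right = Node(20)
t.root.right.parent = t.root
left_rotate(t, t.root)
print(t.root.key, t.root.left.key)      # 20 10
```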
When Inserting a Node
Remember:
1. Insert nodes one at a time, and after every insertion rebalance the tree.
2. Every node inserted starts as a red node.
3. Consult the cases below for rebalancing the tree.
• Basic steps:
1. Use Tree-Insert from BST (slightly modified) to insert a node z into T (procedure RB-Insert(T, z)), and color the node z red.
2. Fix the modified tree by recoloring nodes and performing rotations to preserve the red-black properties (procedure RB-Insert-Fixup).

Red-Black Fixup
• y is z's "uncle" (the sibling of z's parent).
• Three cases:
– y is red.
– y is black and z is a right child.
– y is black and z is a left child.
  • 136. Case 1 – Z’s uncle y is red C C new z p[p[z]] A D y C A D p[z] B    z A D B    B   B   z is a right child here. Similar steps if z is a left child. • y.Color = black • z.Parent.Color = black • z.Parent.Parent.Color = red • z = z.Parent.Parent R fi• Repeat fixup
  • 137. 11 2 14 71 15 5 8 4 y 4z y.Color = black z.Parent.Color = blackNew . a e .Co o b ac z.Parent.Parent.Color = red z = z.Parent.Parent New Node repeat fixup
  • 138. 1111 2 14 71 15 5 8 y 4z y.Color = black z.Parent.Color = black z.Parent.Parent.Color = red z = z.Parent.Parent fi New Node repeat fixup
  • 139. 11 2 142 14 71 15 5 8 y C l bl k 4z y.Color = black z.Parent.Color = black z Parent Parent Color = redNew z.Parent.Parent.Color red z = z.Parent.Parent repeat fixup New Node p p
  • 140. 1111 2 14 71 15 5 8 y 4z y.Color = black z.Parent.Color = black z Parent Parent Color redz.Parent.Parent.Color = red z = z.Parent.Parent repeat fixup New Node repeat fixup
  • 141. 1111 2 14 y 71 15z 5 8 y.Color = black 4 z.Parent.Color = black z.Parent.Parent.Color = red P P New z = z.Parent.Parent repeat fixup New Node
Case 2: y is black and z is a right child
• z = z.Parent
• Left-Rotate(T, z)
• Proceed to Case 3.
• Note that Case 2 is a subset of Case 3: the rotation turns it into a Case 3 configuration.
(In the example, z moves up to its parent and a left rotation lines z and its parent up on the same side, setting up Case 3.)
Case 3: y is black and z is a left child
• z.Parent.Color = black
• z.Parent.Parent.Color = red
• Right-Rotate(T, z.Parent.Parent)
(The example finishes with this recoloring and a right rotation about the grandparent, producing a valid red-black tree rooted at 7.)
RB-Insert(T, z)
1. y ← nil[T]
2. x ← root[T]
3. while x ≠ nil[T]
4.   do y ← x
5.     if key[z] < key[x]
6.       then x ← left[x]
7.       else x ← right[x]
8. p[z] ← y
9. if y = nil[T]
10.  then root[T] ← z
11.  else if key[z] < key[y]
12.    then left[y] ← z
13.    else right[y] ← z
14. left[z] ← nil[T]
15. right[z] ← nil[T]
16. color[z] ← RED
17. RB-Insert-Fixup(T, z)
RB-Insert-Fixup(T, z)
1. while color[p[z]] = RED
2.   do if p[z] = left[p[p[z]]]
3.     then y ← right[p[p[z]]]
4.       if color[y] = RED
5.         then color[p[z]] ← BLACK              // Case 1
6.           color[y] ← BLACK                    // Case 1
7.           color[p[p[z]]] ← RED                // Case 1
8.           z ← p[p[z]]                         // Case 1
9.         else if z = right[p[z]]               // color[y] is BLACK
10.          then z ← p[z]                       // Case 2
11.            LEFT-ROTATE(T, z)                 // Case 2
12.          color[p[z]] ← BLACK                 // Case 3
13.          color[p[p[z]]] ← RED                // Case 3
14.          RIGHT-ROTATE(T, p[p[z]])            // Case 3
15.    else (if p[z] = right[p[p[z]]]) (same as lines 3-14,
16.      with "right" and "left" exchanged)
17. color[root[T]] ← BLACK
Correctness
Loop invariant: at the start of each iteration of the while loop,
– z is red.
– If p[z] is the root, then p[z] is black.
– There is at most one red-black violation:
• Property 2: z is a red root, or
• Property 4: z and p[z] are both red.

• Termination: the loop terminates only if p[z] is black; hence property 4 holds. The last line ensures property 2 always holds.
• Maintenance: we drop out when z is the root (since then p[z] is the sentinel nil[T], which is black). When we start the loop body, the only violation is of property 4.
– There are 6 cases, 3 of which are symmetric to the other 3. We consider the cases in which p[z] is a left child.
– Let y be z's uncle (p[z]'s sibling).

Algorithm Analysis
• O(lg n) time to get through RB-Insert up to the call of RB-Insert-Fixup.
• Within RB-Insert-Fixup:
– Each iteration takes O(1) time.
– Each iteration but the last moves z up 2 levels.
– O(lg n) levels, so O(lg n) time.
– Thus, insertion into a red-black tree takes O(lg n) time.
– Note: there are at most 2 rotations overall.
• 156. Deletion
• Find
• Swap – moves the entry to a node with one external (left) child
• Remove entry
• Reattach right child
• 157. Deletion
Deletion from a red-black tree is similar to deletion from a binary search tree, with a few exceptions:
• Always set the parent of the deleted node’s child to be the parent of the deleted node (the child is spliced into the deleted node’s place).
• The red-black fix-up method is called if the removed node was black.
After deletion of a red node, no violations occur:
• No black-heights have been affected.
• No red nodes have been made adjacent (parent and child both red).
• The deleted node is not the root, since the root is black.
• 158.
• After deletion of a black node, a restore function must be called to fix the red-black properties that might be violated. There are 3 possible initial violations:
– If the deleted node was the root, a red child might now be the root, violating property 2.
– If both the parent of the removed node and a child of the removed node are red, we have a violation of property 4.
– The removal of a black node can cause the black-height of one path to be shorter (by 1), violating property 5.
• We correct the problem with property 5 by adding an extra “black” to the node passed into the fix-up procedure. This leads to a violation of property 1, since this node is now neither red nor black.
• 159. Delete Possibilities
1: Delete a red node
• No problem
2: Delete a black node with a red child
• Color the red child black
3: Delete a black node with a black child
• Color the child “Double Black”
• 3 possibilities depending on neighbouring nodes:
– x’s sibling is black with at least one red child
– x’s sibling is black with no red children
– x’s sibling is red
• 160. Deletion – Fixup
• Idea: Move the extra black up the tree until
– x points to a red & black node ⇒ turn it into a black node,
– x points to the root ⇒ just remove the extra black, or
– we can do certain rotations and recolorings and finish.
• Within the while loop:
– x always points to a nonroot doubly black node.
– w is x’s sibling.
– w cannot be nil[T], since that would violate property 5 at p[x].
• 161. Case 1 – w is red
[diagram: recolour w black and p[x] red, then left rotate on p[x]; x gets a new black sibling]
• w must have black children.
• Make w black and p[x] red.
• Then left rotate on p[x].
• The new sibling of x was a child of w before the rotation ⇒ it must be black.
• Go immediately to case 2, 3, or 4.
• 162. Case 2 – w is black, both of w’s children are black
[diagram: x and w each give up one black, which moves to p[x]; p[x] becomes the new x]
• Take 1 black off x (x becomes singly black) and off w (w becomes red).
• Move that black to p[x].
• Do the next iteration with p[x] as the new x.
• If we entered this case from case 1, then p[x] was red ⇒ the new x is red & black ⇒ the color attribute of the new x is RED ⇒ the loop terminates. The new x is then made black in the last line.
• 163. Case 3 – w is black, w’s left child is red, w’s right child is black
[diagram: recolour w red and its left child black, then right rotate on w]
• Make w red and w’s left child black.
• Then right rotate on w.
• The new sibling w of x is black with a red right child ⇒ case 4.
• 164. Case 4 – w is black, w’s right child is red
[diagram: w takes p[x]’s color c, p[x] and w’s right child become black, then a left rotation on p[x] absorbs the extra black]
• Make w be p[x]’s color (c).
• Make p[x] black and w’s right child black.
• Then left rotate on p[x].
• Remove the extra black on x (x is now singly black) without violating any red-black properties.
• All done. Setting x to the root causes the loop to terminate.
• 165. RB-Delete(T, z)
1.  if left[z] = nil[T] or right[z] = nil[T]
2.     then y ← z
3.     else y ← TREE-SUCCESSOR(z)
4.  if left[y] ≠ nil[T]
5.     then x ← left[y]
6.     else x ← right[y]
7.  p[x] ← p[y]                      // do this, even if x is nil[T]
8.  if p[y] = nil[T]
9.     then root[T] ← x
10.    else if y = left[p[y]]
11.            then left[p[y]] ← x
12.            else right[p[y]] ← x
13. if y ≠ z
14.    then key[z] ← key[y]
15.         copy y’s satellite data into z
16. if color[y] = BLACK
17.    then RB-Delete-Fixup(T, x)
18. return y
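A Python rendering of RB-Delete in the same style, again as free functions over the sketched RBTree (rb_delete and tree_minimum are illustrative names). tree_minimum stands in for TREE-SUCCESSOR, which is valid here because the successor is only needed when z has two children, and is then the minimum of z’s right subtree.

def tree_minimum(tree, x):
    # Leftmost node of the subtree rooted at x; equals TREE-SUCCESSOR(z) when z has two children.
    while x.left is not tree.nil:
        x = x.left
    return x

def rb_delete(tree, z):                         # RB-Delete(T, z)
    if z.left is tree.nil or z.right is tree.nil:
        y = z                                   # z has at most one child, so splice z itself
    else:
        y = tree_minimum(tree, z.right)         # successor of z; it has no left child
    x = y.left if y.left is not tree.nil else y.right
    x.parent = y.parent                         # line 7: do this even when x is the sentinel
    if y.parent is tree.nil:
        tree.root = x
    elif y is y.parent.left:
        y.parent.left = x
    else:
        y.parent.right = x
    if y is not z:
        z.key = y.key                           # lines 13-15: copy y's key (and satellite data) into z
    if y.color == BLACK:
        rb_delete_fixup(tree, x)                # x carries the "extra black"
    return y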
• 166. RB-Delete-Fixup(T, x)
1.  while x ≠ root[T] and color[x] = BLACK
2.      do if x = left[p[x]]
3.             then w ← right[p[x]]
4.                  if color[w] = RED
5.                     then color[w] ← BLACK             // Case 1
6.                          color[p[x]] ← RED            // Case 1
7.                          LEFT-ROTATE(T, p[x])          // Case 1
8.                          w ← right[p[x]]              // Case 1
• 167. /* x is still left[p[x]] */
9.                  if color[left[w]] = BLACK and color[right[w]] = BLACK
10.                    then color[w] ← RED               // Case 2
11.                         x ← p[x]                     // Case 2
12.                    else if color[right[w]] = BLACK
13.                            then color[left[w]] ← BLACK   // Case 3
14.                                 color[w] ← RED            // Case 3
15.                                 RIGHT-ROTATE(T, w)         // Case 3
16.                                 w ← right[p[x]]            // Case 3
17.                         color[w] ← color[p[x]]       // Case 4
18.                         color[p[x]] ← BLACK          // Case 4
19.                         color[right[w]] ← BLACK      // Case 4
20.                         LEFT-ROTATE(T, p[x])          // Case 4
21.                         x ← root[T]                  // Case 4
22.         else (same as the then clause with “right” and “left” exchanged)
23. color[x] ← BLACK
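The matching fix-up in Python; the then branch follows lines 3–21 and the else branch swaps “left” and “right” as line 22 directs. Note how cases 1 and 3 fall through towards case 4, which absorbs the extra black and ends the loop.

def rb_delete_fixup(tree, x):                   # RB-Delete-Fixup(T, x)
    while x is not tree.root and x.color == BLACK:
        if x is x.parent.left:
            w = x.parent.right                  # w is x's sibling
            if w.color == RED:                  # Case 1: give x a black sibling
                w.color = BLACK
                x.parent.color = RED
                tree.left_rotate(x.parent)
                w = x.parent.right
            if w.left.color == BLACK and w.right.color == BLACK:
                w.color = RED                   # Case 2: push the extra black up to p[x]
                x = x.parent
            else:
                if w.right.color == BLACK:      # Case 3: rotate so that w's right child is red
                    w.left.color = BLACK
                    w.color = RED
                    tree.right_rotate(w)
                    w = x.parent.right
                w.color = x.parent.color        # Case 4: absorb the extra black and stop
                x.parent.color = BLACK
                w.right.color = BLACK
                tree.left_rotate(x.parent)
                x = tree.root
        else:                                   # mirror image with "left" and "right" exchanged
            w = x.parent.left
            if w.color == RED:
                w.color = BLACK
                x.parent.color = RED
                tree.right_rotate(x.parent)
                w = x.parent.left
            if w.right.color == BLACK and w.left.color == BLACK:
                w.color = RED
                x = x.parent
            else:
                if w.left.color == BLACK:
                    w.right.color = BLACK
                    w.color = RED
                    tree.left_rotate(w)
                    w = x.parent.left
                w.color = x.parent.color
                x.parent.color = BLACK
                w.left.color = BLACK
                tree.right_rotate(x.parent)
                x = tree.root
    x.color = BLACK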
• 168. Delete Analysis
• O(lg n) time to get through RB-Delete up to the call of RB-Delete-Fixup.
• Within RB-Delete-Fixup:
– Case 2 is the only case in which more iterations occur, and x moves up 1 level.
– Hence, O(lg n) iterations.
– Each of cases 1, 3, and 4 has 1 rotation ⇒ at most 3 rotations in all.
• Hence, O(lg n) time.
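A short, hypothetical driver tying the sketches above together (it assumes the earlier blocks are collected in one file); bst_search and inorder are small helpers added here for illustration only.

def bst_search(tree, key):
    # Plain BST search; returns the node holding key, or the sentinel if absent.
    x = tree.root
    while x is not tree.nil and key != x.key:
        x = x.left if key < x.key else x.right
    return x

def inorder(tree, x, out):
    # Collect (key, color) pairs in sorted key order.
    if x is not tree.nil:
        inorder(tree, x.left, out)
        out.append((x.key, x.color))
        inorder(tree, x.right, out)

t = RBTree()
for k in [11, 2, 14, 1, 7, 15, 5, 8, 4]:        # keys from the worked insertion example
    rb_insert(t, k)

z = bst_search(t, 7)
if z is not t.nil:
    rb_delete(t, z)

walk = []
inorder(t, t.root, walk)
print(walk)                                     # keys in sorted order, each with its color
print(t.root.color)                             # BLACK: property 2 holds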