DSA Trees + Heap
- Trees are a special form of graph
- If n nodes => n-1 edges (and any two nodes are joined by exactly one path)
- They are acyclic => from node A to node B there is only one way to reach
- Whenever you see keywords like connected, unique path -> think tree
- Types of traversal in trees (see the sketch after this list)
- inorder traversal - left -> root -> right
- preorder traversal - root -> left -> right
- postorder traversal - left -> right -> root
- If a node is visited the first time (tracing the tree anti-clockwise) -> it must be the root of its subtree (because it is the parent) => PRE-ORDER
- If visited the second time -> its left subtree has been fully analyzed, because moving anti-clockwise we return only after checking the left side => IN-ORDER
- If visited the third time -> both left and right subtrees have been fully analyzed => POST-ORDER
- Post-order is the reverse of pre-order done clockwise (root -> right -> left)
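A minimal sketch of the three traversals, assuming a simple hand-built node type (the `Node` struct and function names here are illustrative, not from the notes):

```cpp
#include <iostream>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int v) : data(v) {}
};

// root first
void preorder(Node* n)  { if (!n) return; std::cout << n->data << ' '; preorder(n->left);  preorder(n->right); }
// root in the middle
void inorder(Node* n)   { if (!n) return; inorder(n->left);  std::cout << n->data << ' '; inorder(n->right); }
// root last
void postorder(Node* n) { if (!n) return; postorder(n->left); postorder(n->right); std::cout << n->data << ' '; }

int main() {
    //     2
    //    / \
    //   1   3
    Node* root = new Node(2);
    root->left  = new Node(1);
    root->right = new Node(3);
    preorder(root);  std::cout << '\n';   // 2 1 3
    inorder(root);   std::cout << '\n';   // 1 2 3
    postorder(root); std::cout << '\n';   // 1 3 2
}
```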
- Binary search tree (BST) = Binary tree + Search property
- every subtree follows: left subtree < root value < right subtree
- Inorder traversal of a BST gives a sorted array (see the sketch below)
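A small sketch of that property, assuming a plain unbalanced BST with an illustrative `insert` helper; an in-order walk then prints the keys in sorted order:

```cpp
#include <iostream>

struct Node {
    int data;
    Node *left = nullptr, *right = nullptr;
    Node(int v) : data(v) {}
};

// Keep the search property: smaller keys go left, larger go right
Node* insert(Node* root, int v) {
    if (!root) return new Node(v);
    if (v < root->data)      root->left  = insert(root->left, v);
    else if (v > root->data) root->right = insert(root->right, v);
    return root;                         // duplicates ignored here
}

void inorder(Node* n) {
    if (!n) return;
    inorder(n->left);
    std::cout << n->data << ' ';
    inorder(n->right);
}

int main() {
    Node* root = nullptr;
    for (int v : {5, 2, 8, 1, 3}) root = insert(root, v);
    inorder(root);                       // prints: 1 2 3 5 8
    std::cout << '\n';
}
```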
- Ternary tree - each node can have 0/1/2/3 children (pointers)
- Full binary tree - every node has either 2 children or is a leaf [NO node has exactly 1 child]
- TIP - "Full" has two l's → 2 children
- Complete binary tree - every level is completely filled (level i has 2^i nodes), except possibly the last, which is filled from left to right
- Height of tree - longest distance from the root down to any leaf node
- Depth of node - distance of that node from the root node
- For a binary tree - struct node => has parent, left child, right child and data
- For any general tree - struct node => has parent, first child, next sibling and data (sketch below)
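Roughly how those two node layouts could look in C++ (the field names are assumptions for illustration):

```cpp
// Binary tree node: fixed slots for the two children
struct BinaryNode {
    int data;
    BinaryNode *parent, *left, *right;
};

// General (n-ary) tree node via first-child / next-sibling:
// firstChild points to the leftmost child, nextSibling to the next child
// of the same parent, so any branching factor fits in constant space per node.
struct GeneralNode {
    int data;
    GeneralNode *parent, *firstChild, *nextSibling;
};
```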
- Heap - implemented using an array => for node at index i --> parent = i/2, left child = 2i, right child = 2i+1
- above formulas hold only for a 1-based indexed array
- Below is for a max heap --> a min heap works the same way with comparisons flipped
- Insertion - add at the end of the array and keep swapping with the parent as long as its value > parent's value
- Deletion - swap that element (the root) with the last element of the array => then delete the last element (size--) => keep swapping with the larger child as long as its value < max of left/right child (sketch below)
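A rough sketch of such an array-backed max heap (1-based, index 0 unused so the parent/child formulas hold); the struct and method names are made up for illustration:

```cpp
#include <vector>
#include <utility>

struct MaxHeap {
    std::vector<int> a{0};                    // a[0] unused -> 1-based formulas work
    int size() const { return (int)a.size() - 1; }

    void insert(int v) {                      // append, then sift up
        a.push_back(v);
        int i = size();
        while (i > 1 && a[i] > a[i / 2]) {    // swap with parent while larger
            std::swap(a[i], a[i / 2]);
            i /= 2;
        }
    }

    int extractMax() {                        // swap root with last, shrink, sift down
        int mx = a[1];
        a[1] = a.back();
        a.pop_back();
        int i = 1;
        while (2 * i <= size()) {
            int c = 2 * i;                    // pick the larger child
            if (c + 1 <= size() && a[c + 1] > a[c]) c++;
            if (a[i] >= a[c]) break;
            std::swap(a[i], a[c]);
            i = c;
        }
        return mx;
    }
};
```

Inserting a few values and calling extractMax() repeatedly returns them in decreasing order.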
- Heapify - given any random array => construct max heap from it
- 2 options => shift up (swap with parent) or shift down (swap with largest child)
- shift down - expensive for nodes at the top of the tree => used in deletion
- shift up - expensive for nodes at the bottom of the tree => used in insertion
- If we use shift down => work required = (0*n/2) + (1*n/4) + (2*n/8) + ... + (h*1)
- h = log(n) always, because the heap is a complete binary tree
- Mathematically => above work = n * Σ i/2^(i+1) ≤ n (the series Σ i/2^i converges to 2) => O(n)
- If we use shift up => work required = (h*n/2) + ((h-1)*n/4) + ((h-2)*n/8) + ... + (0*1)
- since n/2 elements are at the leaf level --> the first term alone is h*n/2 ≈ (n/2)*log(n) => O(n*log(n))
- Better option => USE shift down (see the heapify sketch below)
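A sketch of that build-heap step, assuming the same 1-based layout as above; only the internal nodes n/2 .. 1 are sifted down, which is where the O(n) bound comes from:

```cpp
#include <vector>
#include <utility>

// Sift a[i] down inside the 1-based max heap a[1..n]
void siftDown(std::vector<int>& a, int n, int i) {
    while (2 * i <= n) {
        int c = 2 * i;                               // larger child
        if (c + 1 <= n && a[c + 1] > a[c]) c++;
        if (a[i] >= a[c]) break;
        std::swap(a[i], a[c]);
        i = c;
    }
}

// Build a max heap from a random array in O(n):
// leaves (indices > n/2) are already one-element heaps,
// so only the internal nodes need a sift-down, deepest first.
void buildMaxHeap(std::vector<int>& a) {
    int n = (int)a.size() - 1;                       // a[0] unused
    for (int i = n / 2; i >= 1; --i)
        siftDown(a, n, i);
}
```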
- Heap Sort - first heapify => then keep deleting the root element and storing it at the end of the array
- Example: [55, 54, 53] --> delete 55 => swap it with the last element & do size--
- [53, 54 || 55] --> the new heap = [53, 54] --> sift 53 down, then delete the root again
- [54, 53 || 55] --> delete 54 => [53 || 54, 55] => only 1 element left
- final sorted array = [53, 54, 55]
- every time we delete the root element (n deletions, O(log n) each) --> O(n*log(n)) (sketch below)
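A self-contained heap sort sketch following exactly those steps (heapify, then repeatedly swap the root with the last in-heap element and sift down); names are illustrative:

```cpp
#include <vector>
#include <utility>
#include <iostream>

// Sift a[i] down inside the 1-based max heap a[1..n]
void siftDown(std::vector<int>& a, int n, int i) {
    while (2 * i <= n) {
        int c = 2 * i;                            // larger child
        if (c + 1 <= n && a[c + 1] > a[c]) c++;
        if (a[i] >= a[c]) break;
        std::swap(a[i], a[c]);
        i = c;
    }
}

void heapSort(std::vector<int>& a) {              // sorts a[1..n] ascending
    int n = (int)a.size() - 1;
    for (int i = n / 2; i >= 1; --i)              // heapify: O(n)
        siftDown(a, n, i);
    for (int sz = n; sz > 1; --sz) {
        std::swap(a[1], a[sz]);                   // "delete" the max: park it at the end
        siftDown(a, sz - 1, 1);                   // restore heap on a[1..sz-1]
    }
}

int main() {
    std::vector<int> a{0, 55, 54, 53};            // a[0] unused (1-based)
    heapSort(a);
    for (int i = 1; i <= 3; ++i) std::cout << a[i] << ' ';   // 53 54 55
    std::cout << '\n';
}
```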
- Note - a heap is not like a BST --> only the root element is the max or min (depending on the heap type); the remaining elements are not in sorted order
- Time complexity - O(log(n)) for insertion and deletion
- O(n) for building the heap (heapify) and O(n*log(n)) for heap sort
- Demerits - not a stable sort + poor locality of reference because parent and child can be far apart in memory while comparing
- heapsort does more comparisons & swaps than quicksort → quicksort almost always outperforms it on average datasets → standard library sort functions use Introsort, which starts with quicksort but switches to heapsort if it detects quicksort heading towards O(n²) → speed of quicksort with the safety of heapsort
- AVL Trees - self-balancing BST where |height_left - height_right| ≤ 1 for every node
- IDEA - write down the order (since the BST property must hold) --> for a Right-Left rotation --> A < B < C
- Now try to put the middle element as the root node
- A left / right rotation only affects that node + its child --> a rotation about A => affects only B
- (NOTE - C will not be affected because it is not a child of A; see the rotation sketch below)
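A sketch of the single rotations plus the Right-Left combination (rotate the right child right, then the node left, which lifts the middle value B to the top); the node layout and helper names are assumptions:

```cpp
#include <algorithm>

struct Node {
    int data, height = 1;
    Node *left = nullptr, *right = nullptr;
    Node(int v) : data(v) {}
};

int h(Node* n) { return n ? n->height : 0; }
void update(Node* n) { n->height = 1 + std::max(h(n->left), h(n->right)); }

// Right rotation about y: y's left child x becomes the subtree root;
// only y and x change their children (x's old right subtree moves under y).
Node* rotateRight(Node* y) {
    Node* x = y->left;
    y->left = x->right;
    x->right = y;
    update(y);
    update(x);
    return x;
}

// Mirror image of rotateRight
Node* rotateLeft(Node* x) {
    Node* y = x->right;
    x->right = y->left;
    y->left = x;
    update(x);
    update(y);
    return y;
}

// Right-Left case with A < B < C (C is A's right child, B is C's left child):
// rotate C right, then A left -> the middle element B ends up as the root.
Node* rotateRightLeft(Node* a) {
    a->right = rotateRight(a->right);
    return rotateLeft(a);
}
```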
