📚 Introduction to AVL Trees
What is an AVL Tree?
An AVL tree is a self-balancing binary search tree named after its inventors Adelson-Velsky and Landis. It maintains a balanced structure by ensuring that for every internal node, the heights of its two children differ by at most 1.
Mathematically:
|height(left) - height(right)| ≤ 1
Why AVL Trees?
- Guaranteed Performance: Operations are always O(log n)
- No Linear Worst Case: Unlike a regular BST, which can degrade to O(n) (e.g., when keys arrive in sorted order)
- Automatic Balancing: Self-adjusts after insertions and deletions
- Efficient Searching: Maintains logarithmic height
🎯 AVL Tree Properties
Balance Factor
The balance factor of a node is calculated as:
balance_factor(node) = height(left subtree) - height(right subtree)
Valid balance factors: -1, 0, +1
- +1: Left subtree is one level taller
- 0: Both subtrees have equal height
- -1: Right subtree is one level taller
- +2 or -2: ⚠️ Tree is unbalanced! (needs rotation)
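These rules can be checked with a small Python sketch (the `Node` class and function names here are illustrative, not part of any standard library):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def height(node):
    """Height of a (sub)tree; an empty tree has height 0."""
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    """Balance factor = height(left) - height(right)."""
    return height(node.left) - height(node.right)

# A right-heavy chain 10 -> 20 -> 30: balance factor -2 means
# the AVL property is violated and a rotation is needed.
unbalanced = Node(10, right=Node(20, right=Node(30)))
print(balance_factor(unbalanced))  # -2
```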
Height of AVL Tree
Let n(h) = minimum number of nodes in an AVL tree of height h
• n(1) = 1
• n(2) = 2
• n(h) = 1 + n(h-1) + n(h-2)
Since n(h-1) > n(h-2), we get n(h) > 2n(h-2)
By induction: n(h) > 2^(h/2 - 1)
Taking logarithms: h < 2 log n(h) + 2
Therefore: h = O(log n)
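The recurrence above is easy to verify numerically; this short sketch (function name illustrative) checks the bound n(h) > 2^(h/2 - 1) for small heights:

```python
def min_nodes(h):
    """Minimum node count n(h) of an AVL tree of height h,
    using the recurrence n(h) = 1 + n(h-1) + n(h-2)."""
    if h == 1:
        return 1
    if h == 2:
        return 2
    return 1 + min_nodes(h - 1) + min_nodes(h - 2)

# The bound n(h) > 2**(h/2 - 1) from the derivation above
# holds for every height we check.
for h in range(1, 15):
    assert min_nodes(h) > 2 ** (h / 2 - 1)

print(min_nodes(10))  # 143
```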
🔄 Tree Rotations
When to Use Rotations?
- When balance factor becomes +2 or -2
- After insertion or deletion operations
- To maintain the AVL property
1️⃣ Single Right Rotation (LL Case)
When: Left-Left imbalance (same sign)
Pattern: Left subtree is taller, and imbalance is in left child's left subtree
      z
     ↙
    y
   ↙
  x

⬇ Right Rotation

    y
   ↙ ↘
  x   z
2️⃣ Single Left Rotation (RR Case)
When: Right-Right imbalance (same sign)
Pattern: Right subtree is taller, and imbalance is in right child's right subtree
  z
   ↘
    y
     ↘
      x

⬇ Left Rotation

    y
   ↙ ↘
  z   x
3️⃣ Left-Right Rotation (LR Case)
When: Left-Right imbalance (opposite signs)
Pattern: Left subtree is taller, but imbalance is in left child's right subtree
      z
     ↙
    y
     ↘
      x

⬇ Left Rotation at y, then Right Rotation at z

    x
   ↙ ↘
  y   z
4️⃣ Right-Left Rotation (RL Case)
When: Right-Left imbalance (opposite signs)
Pattern: Right subtree is taller, but imbalance is in right child's left subtree
  z
   ↘
    y
   ↙
  x

⬇ Right Rotation at y, then Left Rotation at z

    x
   ↙ ↘
  z   y
- Same Sign (both left or both right) → Single Rotation
- Opposite Signs (left-right or right-left) → Double Rotation
Trinode Restructuring
A general approach to handle all rotation cases:
- Let (a, b, c) be the inorder listing of nodes x, y, z
- Perform rotations to make b the topmost node
- a becomes the left child, c becomes the right child
- Subtrees T₀, T₁, T₂, T₃ are distributed appropriately
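A minimal sketch of the two primitive rotations in Python (names are illustrative); the double rotations (LR, RL) are just these two primitives composed:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def rotate_right(z):
    """Single right rotation: z's left child y becomes the subtree root."""
    y = z.left
    z.left = y.right   # y's right subtree is re-attached under z
    y.right = z
    return y

def rotate_left(z):
    """Single left rotation: z's right child y becomes the subtree root."""
    y = z.right
    z.right = y.left   # y's left subtree is re-attached under z
    y.left = z
    return y

# LL case: chain 30 -> 20 -> 10; a right rotation makes 20 the root.
root = rotate_right(Node(30, left=Node(20, left=Node(10))))
print(root.key, root.left.key, root.right.key)  # 20 10 30

# LR case: left rotation at the left child, then right rotation at z.
z = Node(30, left=Node(10, right=Node(20)))
z.left = rotate_left(z.left)
root = rotate_right(z)
print(root.key, root.left.key, root.right.key)  # 20 10 30
```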
➕ Insertion in AVL Trees
Algorithm Steps
- Insert as in BST: Find the correct position and insert the new node
- Update Heights: Travel up from the inserted node, updating heights
- Check Balance: At each ancestor node, check the balance factor
- Perform Rotation: If unbalanced (|BF| = 2), perform appropriate rotation
- Continue Upward: Continue checking until root is reached
📝 Example: Inserting 54
Consider an AVL tree with nodes: 44, 17, 78, 32, 50, 88, 48, 62
After inserting 54 (as the left child of 62):
- Node 78 becomes unbalanced (balance factor = +2)
- Pattern: Left-Right (LR case) — the imbalance is in 78's left child's right subtree
- Solution: Left rotation at 50, then right rotation at 78 (trinode restructuring with a = 50, b = 62, c = 78)
- Result: Balanced tree with 62 replacing 78 as the subtree root
Complexity Analysis
- Finding insertion point: O(log n)
- Updating heights: O(log n)
- Single rotation: O(1)
- Total: O(log n)
➖ Deletion in AVL Trees
Algorithm Steps
- Delete as in BST: Remove the node using standard BST deletion
- Start from Parent: Begin at the parent of the deleted node
- Update Heights: Recalculate heights going upward
- Check Balance: Check balance factor at each node
- Perform Rotations: If unbalanced, perform appropriate rotation
- Continue to Root: Must check ALL the way to the root
Deletion may require MULTIPLE rotations as you travel up to the root! Each rotation might create a new imbalance higher up.
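A hedged sketch of deletion with rebalancing in Python (all names illustrative); the key point is that `rebalance` runs at every node on the path back to the root, which is where the additional rotations can occur:

```python
class N:
    def __init__(self, key):
        self.key, self.left, self.right, self.height = key, None, None, 1

def h(n): return n.height if n else 0
def upd(n): n.height = 1 + max(h(n.left), h(n.right))
def bf(n): return h(n.left) - h(n.right)

def rot_r(z):
    y = z.left; z.left, y.right = y.right, z; upd(z); upd(y); return y

def rot_l(z):
    y = z.right; z.right, y.left = y.left, z; upd(z); upd(y); return y

def rebalance(n):
    upd(n)
    if bf(n) == 2:                       # left-heavy
        if bf(n.left) < 0: n.left = rot_l(n.left)   # LR -> double
        return rot_r(n)
    if bf(n) == -2:                      # right-heavy
        if bf(n.right) > 0: n.right = rot_r(n.right)  # RL -> double
        return rot_l(n)
    return n

def insert(n, key):
    if n is None: return N(key)
    if key < n.key: n.left = insert(n.left, key)
    else: n.right = insert(n.right, key)
    return rebalance(n)

def delete(n, key):
    if n is None: return None
    if key < n.key:
        n.left = delete(n.left, key)
    elif key > n.key:
        n.right = delete(n.right, key)
    else:
        if n.left is None: return n.right    # cases 1 and 2
        if n.right is None: return n.left
        s = n.right                          # case 3: inorder successor
        while s.left: s = s.left
        n.key = s.key
        n.right = delete(n.right, s.key)
    return rebalance(n)  # every ancestor is rechecked on the way up

# Build the example tree, then delete 32: node 44 becomes
# unbalanced and a rotation promotes 62 to the root.
root = None
for k in [44, 17, 62, 32, 50, 78, 48, 54, 88]:
    root = insert(root, k)
root = delete(root, 32)
print(root.key, h(root))  # 62 4
```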
📝 Example: Deleting 32
From a tree with nodes: 44, 17, 62, 32, 50, 78, 48, 54, 88
- Delete node 32 (a leaf); its parent 17 becomes a leaf and stays balanced
- Node 44 becomes unbalanced (balance factor = -2)
- Perform a left rotation (trinode restructuring) at 44; 62 becomes the new root
- Continue checking parent nodes up to the root
- Perform additional rotations if needed
BST Deletion Cases
- Case 1: Node has no children (leaf) → Simply remove
- Case 2: Node has one child → Replace node with its child
- Case 3: Node has two children → Replace with inorder successor/predecessor
Complexity Analysis
- Finding node: O(log n)
- Deleting node: O(1)
- Rebalancing (multiple rotations): O(log n)
- Total: O(log n)
⚡ Performance Analysis
| Operation | AVL Tree (Average) | AVL Tree (Worst) | BST (Average) | BST (Worst) |
|---|---|---|---|---|
| Search | O(log n) | O(log n) | O(log n) | O(n) |
| Insertion | O(log n) | O(log n) | O(log n) | O(n) |
| Deletion | O(log n) | O(log n) | O(log n) | O(n) |
| Traversal | O(n) | O(n) | O(n) | O(n) |
Space Complexity
O(n) - AVL trees store n nodes plus height information at each node
- Predictable: No worst-case O(n) operations
- Balanced: Always maintains logarithmic height
- Efficient: All operations guaranteed O(log n)
- Reliable: Performance doesn't depend on insertion order
💻 Key Implementation Concepts
Node Structure
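A typical AVL node caches its own height so the balance factor can be read in O(1) instead of being recomputed recursively. A minimal Python sketch (names illustrative):

```python
class AVLNode:
    """AVL tree node; `height` is cached so the balance factor
    can be computed in O(1) (an empty subtree has height 0)."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1  # a new leaf has height 1

def height(node):
    return node.height if node else 0

def balance(node):
    return height(node.left) - height(node.right)

leaf = AVLNode(42)
print(height(leaf), balance(leaf))  # 1 0
```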
Essential Functions
1. Height Calculation
2. Balance Factor Check
3. Taller Child Selection
4. Rebalancing After Insertion
5. Rebalancing After Deletion
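The functions listed above can be sketched together in Python as follows (all names illustrative); inserting the keys from the earlier example reproduces the LR restructuring that promotes 62:

```python
class AVLNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

def h(n):                         # 1. height (0 for an empty subtree)
    return n.height if n else 0

def update(n):                    # recompute cached height from children
    n.height = 1 + max(h(n.left), h(n.right))

def bf(n):                        # 2. balance factor = left - right
    return h(n.left) - h(n.right)

def rotate_right(z):
    y = z.left
    z.left, y.right = y.right, z
    update(z); update(y)
    return y

def rotate_left(z):
    y = z.right
    z.right, y.left = y.left, z
    update(z); update(y)
    return y

def rebalance(n):                 # 3. pick rotation by sign pattern
    if bf(n) == 2:                          # left-heavy
        if bf(n.left) < 0:                  # opposite signs -> LR double
            n.left = rotate_left(n.left)
        return rotate_right(n)              # LL: single right rotation
    if bf(n) == -2:                         # right-heavy
        if bf(n.right) > 0:                 # RL double
            n.right = rotate_right(n.right)
        return rotate_left(n)               # RR: single left rotation
    return n

def insert(n, key):               # 4. BST insert + rebalance going up
    if n is None:
        return AVLNode(key)
    if key < n.key:
        n.left = insert(n.left, key)
    else:
        n.right = insert(n.right, key)
    update(n)
    return rebalance(n)

root = None
for k in [44, 17, 78, 32, 50, 88, 48, 62, 54]:
    root = insert(root, k)
print(root.key, root.height)  # 44 4
```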
📖 Study Tips & Common Mistakes
✅ Key Concepts to Master
- Understanding balance factor calculation
- Identifying which rotation to use (same vs opposite signs)
- Recognizing that insertion needs only ONE rotation
- Remembering that deletion may need MULTIPLE rotations
- Height calculation after rotations
- Trinode restructuring process
⚠️ Common Mistakes to Avoid
- Forgetting to update heights after rotations
- Stopping at first rotation during deletion (must go to root!)
- Wrong rotation selection - remember same sign = single, opposite = double
- Not maintaining BST property during rotations
- Calculating balance factor incorrectly (it's left - right, not right - left)
🎯 Practice Problems
- Insert nodes 45, 70, 35, 3, 74, 25, 81, 4 into an empty AVL tree. Show all rotations.
- Delete node 25 from the tree created above. Show all rebalancing steps.
- Given a tree of height h, prove the minimum number of nodes is approximately Fibonacci(h+2).
- Compare AVL trees with Red-Black trees - when would you use each?
🔍 Quick Reference
Is node unbalanced (|BF| = 2)?
├─ NO → Continue checking parent
└─ YES → Which subtree is taller?
├─ Left (BF = +2)
│ ├─ Left child's left subtree taller → LL: Single Right Rotation
│ └─ Left child's right subtree taller → LR: Double Rotation (Left-Right)
└─ Right (BF = -2)
├─ Right child's right subtree taller → RR: Single Left Rotation
└─ Right child's left subtree taller → RL: Double Rotation (Right-Left)
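The decision tree above can be expressed as a small lookup function (a sketch; the function name and return strings are illustrative):

```python
def classify(bf_node, bf_child):
    """Map (balance factor of the unbalanced node, balance factor of
    its taller child) to the required rotation, following the
    decision tree above.  bf = height(left) - height(right)."""
    if bf_node == 2:    # left subtree taller
        if bf_child >= 0:
            return "LL: single right rotation"
        return "LR: double rotation (left-right)"
    if bf_node == -2:   # right subtree taller
        if bf_child <= 0:
            return "RR: single left rotation"
        return "RL: double rotation (right-left)"
    return "balanced: no rotation needed"

print(classify(2, 1))    # LL: single right rotation
print(classify(-2, 1))   # RL: double rotation (right-left)
```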
📋 Summary
AVL Tree Advantages
- ✅ Guaranteed O(log n) for search, insert, delete
- ✅ More strictly balanced than Red-Black trees
- ✅ Better search performance (fewer comparisons)
- ✅ Predictable performance regardless of input order
AVL Tree Disadvantages
- ❌ More rotations during insertion/deletion than Red-Black trees
- ❌ Extra space for height storage at each node
- ❌ More complex implementation than simple BST
- ❌ Slightly slower insertions/deletions compared to Red-Black trees
🎓 Remember for Exams
- Height of AVL tree with n nodes: h = O(log n)
- Balance factor range: -1, 0, +1
- All operations: O(log n)
- Space complexity: O(n)
- Insertion: at most 1 restructuring (a single or double rotation)
- Deletion: O(log n) rotations possible
- Rotation time: O(1)