Data Structures : Time Complexity

Time Complexity

Time complexity is a way to describe how the execution time of an algorithm increases as the size of the input grows. It is a measure of algorithm efficiency. Time complexity is usually expressed using Big O notation.
✅ Key Concepts -----------------------
- Input Size (n): Represents the amount of data the algorithm is processing.
- Number of Operations: Refers to the basic steps the algorithm takes (like comparisons, assignments, arithmetic operations, etc.).
- Big O Notation: A mathematical notation used to classify algorithms based on their time complexity.
✅ Common Time Complexities ----------
O(1): Constant Time:
The algorithm takes the same amount of time regardless of the input size. Example: Accessing an element in an array by its index.
O(log n): Logarithmic Time:
The runtime increases logarithmically with the input size. Example: Binary search.
O(n): Linear Time:
The runtime increases linearly with the input size. Example: Iterating through an array once.
O(n log n): Linearithmic Time:
The runtime increases with both linear and logarithmic factors. Example: Efficient sorting algorithms like merge sort.
O(n²): Quadratic Time:
The runtime increases quadratically with the input size. Example: Nested loops iterating through all pairs of elements in an array.
O(2ⁿ): Exponential Time:
The runtime doubles with each unit increase in the input size. Example: Recursive algorithms that explore all possible subsets of a set.
Why is Time Complexity Important?
✅ Examples --------------------------
- Linear Search (O(n)): Goes through each element until the target is found.
- Binary Search (O(log n)): Repeatedly divides the sorted list in half.
- Bubble Sort (O(n²)): Repeatedly compares adjacent elements and swaps them.
- Merge Sort (O(n log n)): Divides the list into halves, sorts them, then merges.
📌 In summary, time complexity is a fundamental concept in computer science that helps in understanding and comparing the efficiency of algorithms. By understanding time complexity, you can make informed decisions about which algorithms to use and how to optimize your code for better performance.
📌 General Rule of Thumb

Structure          | Time Complexity
-------------------|----------------
Single loop        | O(n)
Nested loops       | O(n²), O(n³)
Logarithmic growth | O(log n)
Divide and conquer | O(n log n)
✅ Calculating Time Complexity -------
To calculate time complexity, you analyze how many times each line or block of code executes in relation to the input size (usually denoted as n).

for (int i = 1; i <= n; i++)
{
    printf("%d", i);
}
📌 Step-by-Step Analysis:
· for (int i = 1; i <= n; i++): this loop runs n times.
· printf("%d", i): this line runs once per iteration, so also n times.
📌 Time Complexity: O(n)
· Because the loop runs linearly with input size n.
for (int i = 1; i <= n; i++)
{
    for (int j = 1; j <= n; j++)
    {
        printf("%d", i);
    }
}
📌 Step-by-Step Analysis:
- Outer loop: runs n times.
- Inner loop: for every iteration of the outer loop, runs n times.
- Total operations = n × n = n².
📌 Time Complexity: O(n²)
· Because two loops are nested, and each runs n times.