Dynamic arrays are a much more common alternative to static arrays. They’re useful because they can grow as elements are added. In JavaScript and Python, these are the default arrays.
Unlike static arrays, with dynamic arrays we don’t have to specify a size when initializing them.
In different languages, dynamic arrays may have a default capacity: in Java it's often 10 and in C# it's 4. Regardless, they automatically resize at runtime as elements are added.
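Concretely, the state a dynamic array tracks can be sketched as a small class. This is only an illustrative sketch; the field names (arr, length, capacity) match the ones the snippets in this lesson assume:

```python
class DynamicArray:
    def __init__(self, capacity=4):
        # fixed-size backing buffer plus how much of it is in use
        self.capacity = capacity   # total allocated slots
        self.length = 0            # slots currently holding elements
        self.arr = [0] * capacity  # the underlying static buffer
```

The distinction between length (how many elements are stored) and capacity (how many slots are allocated) is what drives the resizing logic below.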
Dynamic Array Insertion
When inserting at the end of a dynamic array, the next empty slot is found and the element is inserted there. Consider an array of size 3 where we insert elements until we run out of space.
// Insert n in the last position of the array
void pushback(int n) {
    if (length == capacity) {
        resize();
    }
    // insert at next empty position
    arr[length++] = n;
}
// Insert n in the last position of the array
void pushback(int n) {
    if (length == capacity) {
        resize();
    }
    // insert at next empty position
    arr[length++] = n;
}
// Insert n in the last position of the array
fun pushback(n: Int) {
    if (length == capacity) {
        resize()
    }
    // insert at next empty position
    arr[length++] = n
}
# Insert n in the last position of the array
def pushback(n):
    global length
    if length == capacity:
        resize()
    # insert at next empty position
    arr[length] = n
    length += 1
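In Python, the built-in list already behaves this way: CPython over-allocates on append, so the underlying buffer is reallocated only occasionally rather than on every insert. A quick (implementation-dependent) way to observe this:

```python
import sys

xs = []
allocations = set()
for i in range(1000):
    xs.append(i)
    # getsizeof reflects the allocated buffer, not just the length,
    # so it only changes when the list actually resizes
    allocations.add(sys.getsizeof(xs))

# Far fewer distinct allocation sizes than appends
print(len(allocations), "resizes for", len(xs), "appends")
```

The exact growth pattern is a CPython implementation detail, but the number of distinct allocation sizes is far smaller than the number of appends.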
Resize
Since the array is dynamic in size, we can keep adding elements. But it's not magic: this is achieved by copying the values into a new static array that's double the size of the original. For our array of size 3, the resulting array will be size 6, with newly allocated space in memory. The following code demonstrates this resizing operation.
void resize() {
    // Create new array of double capacity
    capacity = 2 * capacity;
    int *newArr = (int*)malloc(capacity * sizeof(int));

    // Copy elements to newArr
    for (int i = 0; i < length; i++) {
        newArr[i] = arr[i];
    }
    free(arr);
    arr = newArr;
}
void resize() {
    // Create new array of double capacity
    capacity = 2 * capacity;
    int[] newArr = new int[capacity];

    // Copy elements to newArr
    for (int i = 0; i < length; i++) {
        newArr[i] = arr[i];
    }
    arr = newArr;
}
fun resize() {
    // Create new array of double capacity
    capacity = 2 * capacity
    val newArr = IntArray(capacity)

    // Copy elements to newArr
    for (i in 0 until length) {
        newArr[i] = arr[i]
    }
    arr = newArr
}
def resize():
    global capacity, arr
    # Create new array of double capacity
    capacity = 2 * capacity
    new_arr = [0] * capacity

    # Copy elements to new_arr
    for i in range(length):
        new_arr[i] = arr[i]
    arr = new_arr
When all elements from the first array have been copied, the first static array is deallocated (or garbage collected, depending on the language).
Adding elements to a dynamic array runs in amortized O(1) time.
Amortized time complexity is the average time per operation over a sequence of operations. The resize operation itself is O(n), but since it isn’t performed every time we add an element, the average time per operation is O(1). This holds if we double the array size when we run out of space.
Why double the capacity?
Imagine we want to fill an array dynamically, starting with a capacity of 1. We'd push the value 5, double the capacity to push 6, double again to push 7 and 8, and double once more to push 9, 10, 11 and 12.
The array size above went from 1 → 2 → 4 → 8.
To analyze the time complexity we have to account for the sum of all operations that occurred before the last one, since we wouldn’t have reached the resulting array without them. To arrive at an array of size 8, we’d need to perform 1 + 2 + 4 + 8 = 15 operations, which includes the resizing operations.
The pattern here is that the last term (the dominant term) is always greater than or equal to the sum of all previous terms. In this case, 1 + 2 + 4 = 7, and 7 < 8. Adding 8 to 7 yields a total of 15 operations to create the resulting array of size 8.
Generally, the formula is that building an array of size n takes roughly 2n operations at most, which belongs to O(n).
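This argument can be checked with a small counter. The sketch below tallies one unit of work per element written, including the elements copied during each resize:

```python
def count_operations(n):
    """Count writes (inserts + resize copies) when pushing n elements
    into a dynamic array that doubles its capacity when full."""
    capacity, length, ops = 1, 0, 0
    for _ in range(n):
        if length == capacity:
            ops += length      # copy every existing element
            capacity *= 2
        ops += 1               # write the new element
        length += 1
    return ops

# For n = 8: 8 inserts + (1 + 2 + 4) copies = 15 operations
```

For any power-of-two n, the counter reports exactly 2n - 1 operations, matching the 1 + 2 + 4 + 8 = 15 total from the example above.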
Since inserting n elements into a dynamic array is O(n), the amortized time complexity of inserting a single element is O(1).
With time complexity we care about asymptotic analysis. This means we care about how fast the runtime grows as the input size grows. We don’t distinguish between O(2n) and O(n) because the runtime grows linearly with the input size, even if the constant doubles.
In time complexity analysis, we typically drop constant terms and coefficients.
Other Operations
Inserting or deleting from the middle of a dynamic array would be similar to a static array. We’d need to shift elements to the right or left to make room for the new element or to fill the gap left by the removed element. This runs in O(n) time.
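As an illustration of that shifting (the helper name and buffer layout here are made up for the sketch), inserting into the middle moves a suffix of elements one slot to the right before writing the new value:

```python
def insert_at(arr, length, i, n):
    """Insert n at index i in a buffer holding `length` elements.
    Assumes the buffer has spare capacity (len(arr) > length)."""
    # shift elements [i, length) one position to the right -- O(n)
    for j in range(length, i, -1):
        arr[j] = arr[j - 1]
    arr[i] = n
    return length + 1

buf = [1, 2, 4, 0]          # length 3, capacity 4
new_len = insert_at(buf, 3, 2, 3)
# buf is now [1, 2, 3, 4]
```

Deletion is the mirror image: shift the suffix one slot to the left to close the gap, then decrement the length.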
Time Complexity
| Operation | Big-O Time | Notes |
| --- | --- | --- |
| Access | O(1) | - |
| Insertion | O(1)* | O(n) if insertion is in the middle since shifting is required |
| Deletion | O(1)* | O(n) if deletion is in the middle since shifting is required |

*Amortized, assuming the capacity doubles on resize.