In computer science, a dynamic array, growable array, resizable array, dynamic table, mutable array, or array list is a random-access, variable-size list data structure that allows elements to be added or removed. It is supplied with standard libraries in many modern mainstream programming languages. Dynamic arrays overcome a limit of static arrays, which have a fixed capacity that must be specified at allocation. A dynamic array is not the same thing as a dynamically allocated array, which is an array whose size is fixed when the array is allocated, although a dynamic array may use such a fixed-size array as a back end.[1]

Bounded-size dynamic arrays and capacity

A simple dynamic array can be constructed by allocating a fixed-size array, typically larger than the number of elements immediately required. The elements of the dynamic array are stored contiguously at the start of the underlying array, and the remaining positions towards the end of the underlying array are reserved, or unused. Elements can be added at the end of a dynamic array in constant time by using the reserved space, until this space is completely consumed. When all space is consumed and an additional element is to be added, the underlying fixed-size array must be increased in size. Resizing is typically expensive because it involves allocating a new underlying array and copying each element from the original array. Elements can be removed from the end of a dynamic array in constant time, as no resizing is required.
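A minimal sketch of this bounded-size scheme in Python (hypothetical class and names; a Python list stands in for the fixed-size underlying array, and no resizing is attempted):

```python
class BoundedDynamicArray:
    """Dynamic array over a fixed-capacity backing store (no resizing)."""

    def __init__(self, capacity):
        self._backing = [None] * capacity  # underlying fixed-size array
        self.capacity = capacity           # physical size
        self.size = 0                      # logical size

    def append(self, element):
        # Constant time: write into the reserved space at the end.
        if self.size == self.capacity:
            raise OverflowError("capacity exhausted; a resize would be needed")
        self._backing[self.size] = element
        self.size += 1

    def pop(self):
        # Constant time: removal from the end never requires resizing.
        if self.size == 0:
            raise IndexError("pop from empty array")
        self.size -= 1
        element = self._backing[self.size]
        self._backing[self.size] = None  # release the slot
        return element

    def __getitem__(self, i):
        if not 0 <= i < self.size:
            raise IndexError(i)
        return self._backing[i]
```

Note how the logical size and the capacity are tracked separately: only the first `size` slots of the backing array hold live elements.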
The number of elements used by the dynamic array contents is its logical size or size, while the size of the underlying array is called the dynamic array's capacity or physical size, which is the maximum possible size without relocating data.[2]

A fixed-size array will suffice in applications where the maximum logical size is fixed (e.g. by specification), or can be calculated before the array is allocated. A dynamic array might be preferred if:

the maximum logical size is unknown, or difficult to calculate, before the array is allocated
a maximum logical size given by a specification is considered likely to change
the amortized cost of resizing a dynamic array does not significantly affect performance or responsiveness

Geometric expansion and amortized cost

To avoid incurring the cost of resizing many times, dynamic arrays resize by a large amount, such as doubling in size, and use the reserved space for future expansion. The operation of adding an element to the end might work as follows:

function insertEnd(dynarray a, element e)
    if (a.size == a.capacity)
        // resize a to twice its current capacity:
        a.capacity ← a.capacity * 2
        // (copy the contents to the new memory location here)
    a[a.size] ← e
    a.size ← a.size + 1

As n elements are inserted, the capacities form a geometric progression. Expanding the array by any constant proportion a ensures that inserting n elements takes O(n) time overall, meaning that each insertion takes amortized constant time. Many dynamic arrays also deallocate some of the underlying storage if its size drops below a certain threshold, such as 30% of the capacity. This threshold must be strictly smaller than 1/a in order to provide hysteresis (a stable band that avoids repeatedly growing and shrinking) and support mixed sequences of insertions and removals with amortized constant cost.
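The insertEnd pseudocode translates directly into runnable Python (a hypothetical sketch, with names of my choosing); counting element copies makes the amortized bound visible, since with doubling the total copy work for n insertions stays below 2n:

```python
class DynamicArray:
    """Dynamic array with geometric (doubling) expansion.

    Tracks how many element copies resizing has performed in total."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self._backing = [None] * self.capacity
        self.copies = 0  # elements copied across all resizes so far

    def insert_end(self, element):
        if self.size == self.capacity:
            # Resize to twice the current capacity and copy the contents over.
            self.capacity *= 2
            new_backing = [None] * self.capacity
            for i in range(self.size):
                new_backing[i] = self._backing[i]
            self.copies += self.size
            self._backing = new_backing
        self._backing[self.size] = element
        self.size += 1

a = DynamicArray()
n = 1000
for i in range(n):
    a.insert_end(i)

# Capacities doubled through 1, 2, 4, ..., 1024: a geometric progression.
# The copy work is 1 + 2 + 4 + ... + 512 = 1023 < 2n, so insertion is
# amortized constant time.
assert a.copies < 2 * n
```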
Dynamic arrays are a common example when teaching amortized analysis.[3][4]

Growth factor

The growth factor for the dynamic array depends on several factors, including a space-time trade-off and the algorithms used in the memory allocator itself. For growth factor a, the average time per insertion operation is about a/(a−1), while the number of wasted cells is bounded above by (a−1)n[citation needed]. If the memory allocator uses a first-fit allocation algorithm, then growth factor values such as a=2 can cause dynamic array expansion to run out of memory even though a significant amount of memory may still be available.[5] There have been various discussions on ideal growth factor values, including proposals for the golden ratio as well as the value 1.5.[6] Many textbooks, however, use a = 2 for simplicity and analysis purposes.[3][4]

Below are growth factors used by several popular implementations:

Implementation — Growth factor (a)
Java ArrayList[1] — 1.5 (3/2)
Python PyListObject[7] — ~1.125 (n + (n >> 3))
Facebook folly/FBVector[9] — 1.5 (3/2)

Performance

Comparison of list data structures (costs given as: linked list / array / dynamic array / balanced tree / random-access list / hashed array tree):

Indexing: Θ(n) / Θ(1) / Θ(1) / Θ(log n) / Θ(log n)[10] / Θ(1)
Insert/delete at beginning: Θ(1) / N/A / Θ(n) / Θ(log n) / Θ(1) / Θ(n)
Insert/delete at end: Θ(1) when last element is known, Θ(n) when last element is unknown / N/A / Θ(1) amortized / Θ(log n) / Θ(log n) updating / Θ(1) amortized
Insert/delete in middle: search time + Θ(1)[11][12][13] / N/A / Θ(n) / Θ(log n) / Θ(log n) updating / Θ(n)
Wasted space (average): Θ(n) / 0 / Θ(n)[14] / Θ(n) / Θ(n) / Θ(√n)

The dynamic array has performance similar to an array, with the addition of new operations to add and remove elements:

Getting or setting the value at a particular index (constant time)
Iterating over the elements in order (linear time, good cache performance)
Inserting or deleting an element in the middle of the array (linear time)
Inserting or deleting an element at the end of the array (constant amortized time)

Dynamic arrays benefit from many of the advantages of arrays, including good locality of reference and data cache utilization, compactness (low memory use), and random access. They usually have only a small fixed additional overhead for storing information about the size and capacity. This makes dynamic arrays an attractive tool for building cache-friendly data structures. However, in languages like Python or Java that enforce reference semantics, the dynamic array generally will not store the actual data, but rather references to data that resides in other areas of memory. In this case, accessing items in the array sequentially will actually involve accessing multiple non-contiguous areas of memory, so many of the cache-friendliness advantages of this data structure are lost.
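Python's built-in list is itself a dynamic array, so the operations listed above can be exercised directly (costs noted in comments apply to array-backed list implementations generally):

```python
a = list(range(10))          # [0, 1, 2, ..., 9]

# Get or set by index: constant time.
assert a[3] == 3
a[3] = 42

# Iterate in order: linear time; contiguous storage is cache-friendly.
total = sum(a)

# Insert or delete in the middle: linear time, later elements are shifted.
a.insert(5, 99)              # shifts a[5:] right by one slot
del a[5]                     # shifts a[6:] left by one slot

# Insert or delete at the end: amortized constant time.
a.append(10)
a.pop()
```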
Compared to linked lists, dynamic arrays have faster indexing
(constant time versus linear time) and typically faster iteration due
to improved locality of reference; however, dynamic arrays require
linear time to insert or delete at an arbitrary location, since all
following elements must be moved, while linked lists can do this in
constant time. This disadvantage is mitigated by the gap buffer and
tiered vector variants discussed under Variants below. Also, in a
highly fragmented memory region, it may be expensive or impossible to
find contiguous space for a large dynamic array, whereas linked lists
do not require the whole data structure to be stored contiguously.
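The linear-time cost of editing at an arbitrary location can be made concrete: on a plain backing array, inserting at index i shifts every later element one slot to the right. A hypothetical helper (a sketch, assuming reserved space is available):

```python
def insert_at(backing, size, index, element):
    """Insert element at index within backing[0:size], shifting later
    elements right. Cost is O(size - index): every following element moves.

    Assumes size < len(backing), i.e. reserved space exists at the end."""
    assert size < len(backing) and 0 <= index <= size
    for i in range(size, index, -1):   # shift right, working backwards
        backing[i] = backing[i - 1]
    backing[index] = element
    return size + 1                    # new logical size

buf = [10, 20, 30, None]               # logical size 3, capacity 4
new_size = insert_at(buf, 3, 1, 15)    # buf becomes [10, 15, 20, 30]
```

A linked list would instead splice in a node in constant time once the position is found, at the cost of linear-time indexing.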
A balanced tree can store a list while providing all operations of
both dynamic arrays and linked lists reasonably efficiently, but both
insertion at the end and iteration over the list are slower than for a
dynamic array, in theory and in practice, due to non-contiguous
storage and tree traversal/manipulation overhead.
Variants
Gap buffers are similar to dynamic arrays but allow efficient
insertion and deletion operations clustered near the same arbitrary
location. Some deque implementations use array deques, which allow
amortized constant time insertion/removal at both ends, instead of
just one end.
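A minimal gap buffer can be sketched as follows (hypothetical class; growing the backing store when the gap empties is omitted). Once the gap sits at the edit position, each insertion is constant time, and moving the gap costs time proportional to the distance moved, which is why clustered edits are cheap:

```python
class GapBuffer:
    """Array with a movable gap of free slots at the edit position."""

    def __init__(self, capacity=8):
        self._buf = [None] * capacity
        self._gap_start = 0        # first free slot
        self._gap_end = capacity   # one past the last free slot

    def __len__(self):
        return len(self._buf) - (self._gap_end - self._gap_start)

    def move_gap(self, index):
        # Costs O(|index - gap_start|): elements hop across the gap one
        # at a time, so edits clustered near one spot move it barely at all.
        while self._gap_start > index:   # move gap left
            self._gap_start -= 1
            self._gap_end -= 1
            self._buf[self._gap_end] = self._buf[self._gap_start]
        while self._gap_start < index:   # move gap right
            self._buf[self._gap_start] = self._buf[self._gap_end]
            self._gap_start += 1
            self._gap_end += 1

    def insert(self, index, element):
        assert self._gap_start < self._gap_end, \
            "gap exhausted (resizing omitted in this sketch)"
        self.move_gap(index)
        self._buf[self._gap_start] = element   # O(1) once the gap is placed
        self._gap_start += 1

    def to_list(self):
        return self._buf[:self._gap_start] + self._buf[self._gap_end:]
```

Text editors commonly use this layout so that a burst of typing or deleting at the cursor touches only the gap boundaries.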
Goodrich[15] presented a dynamic array algorithm called tiered vectors that provides O(n^(1/2)) performance for order-preserving insertions or deletions in the middle of the array.
References

^ a b See, for example, the source code of the java.util.ArrayList class
from OpenJDK 6.
^ Lambert, Kenneth Alfred (2009), "Physical size and logical size",
Fundamentals of Python: From First Programs Through Data Structures,
Cengage Learning, p. 510, ISBN 1423902181
^ a b Goodrich, Michael T.; Tamassia, Roberto (2002), "1.5.2 Analyzing
an Extendable Array Implementation", Algorithm Design: Foundations,
Analysis and Internet Examples, Wiley, pp. 39–41.
^ a b Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L.;
Stein, Clifford (2001) [1990]. "17.4 Dynamic tables". Introduction to
Algorithms (2nd ed.). MIT Press and McGraw-Hill. pp. 416–424.
ISBN 0-262-03293-7.
^ a b c "
External links

NIST Dictionary of Algorithms and Data Structures: Dynamic array
VPOOL - C language implementation of dynamic array.
CollectionSpy — A Java profiler with explicit support for debugging ArrayList- and Vector-related issues.
Open Data Structures - Chapter 2 - Array-Based Lists
