Python has two similar sequence types: tuples and lists. The most well-known difference between them is that tuples are immutable; you cannot change their size or the objects they contain.
You can't change items in a tuple:
>>> a = (1,2,3)
>>> a[0] = 10
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'tuple' object does not support item assignment
But you can change mutable objects stored inside a tuple:
>>> b = (1,[1,2,3],3)
>>> b[1]
[1, 2, 3]
>>> b[1].append(4)
>>> b
(1, [1, 2, 3, 4], 3)
Internally, both lists and tuples are implemented as an array of pointers to Python objects (the items). When you remove an item from a list, the reference to that item is destroyed. Keep in mind that the removed object can stay alive if there are other references to it elsewhere in your program.
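For example, an object removed from a list stays alive as long as another name still refers to it:
>>> items = [[1, 2], [3, 4]]
>>> first = items[0]    # a second reference to the same inner list
>>> del items[0]        # the list drops its pointer to that object
>>> first               # the object itself is still alive
[1, 2]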
Tuples
Although tuples are less popular than lists, they are a fundamental data type that is used a lot internally.
You may not notice, but you are using tuples when:
- working with arguments and parameters
- returning 2 or more items from a function
- iterating over dictionary's key-value pairs
- using string formatting
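For instance, returning several values from a function or iterating over a dictionary's items produces tuples:
>>> def min_max(values):
...     return min(values), max(values)   # the results are packed into a tuple
...
>>> result = min_max([3, 1, 2])
>>> result
(1, 3)
>>> type(result)
<class 'tuple'>
>>> list({'a': 1, 'b': 2}.items())        # each key-value pair is a tuple
[('a', 1), ('b', 2)]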
Typically, a running program has thousands of allocated tuples.
>>> import gc
>>> def type_stats(type_obj):
...     count = 0
...     for obj in gc.get_objects():
...         if type(obj) == type_obj:
...             count += 1
...     return count
...
>>> type_stats(tuple)
3136
>>> type_stats(list)
659
>>> import pandas
>>> type_stats(tuple)
6953
>>> type_stats(list)
2455
Empty lists vs. empty tuples
An empty tuple acts as a singleton, that is, there is always only one tuple with a length of zero. When you create an empty tuple, Python points to the already preallocated one, so every empty tuple has the same address in memory. This is possible because tuples are immutable, and it sometimes saves a lot of memory.
>>> a = ()
>>> b = ()
>>> a is b
True
>>> id(a)
4409020488
>>> id(b)
4409020488
But this doesn't apply to lists since they can be modified.
>>> a = []
>>> b = []
>>> a is b
False
>>> id(a)
4465566920
>>> id(b)
4465370632
Allocation optimization for small tuples
To reduce memory fragmentation and speed up allocations, Python reuses old tuples. If a tuple is no longer needed and has fewer than 20 items, instead of deleting it permanently Python moves it to a free list.
The free list is divided into 20 groups, where group n holds freed tuples of length n, for n from 0 to 19. Each group can store up to 2,000 tuples. The first (zero) group contains only one element and represents the empty tuple.
>>> a = (1,2,3)
>>> id(a)
4427578104
>>> del a
>>> b = (1,2,4)
>>> id(b)
4427578104
In the example above, we can see that a and b have the same id. That is because we immediately occupied the destroyed tuple, which was sitting on the free list.
Allocation optimization for lists
Since lists can be modified, Python does not use the same optimization as for tuples. However, Python lists also have a free list, but it is used only for empty objects. If an empty list is deleted or collected by the GC, it can be reused later.
>>> a = []
>>> id(a)
4465566792
>>> del a
>>> b = []
>>> id(b)
4465566792
List resizing
To avoid the cost of resizing, Python does not resize a list every time you add or remove an item. Instead, every list has a number of empty slots which are hidden from the user but can be used for new items. If the slots are completely consumed, Python over-allocates additional space for them. The number of additional slots is chosen based on the current size of the list.
Developer documentation describes it as follows:
This over-allocates proportional to the list size, making room for additional growth. The over-allocation is mild but is enough to give linear-time amortized behavior over a long sequence of appends() in the presence of a poorly-performing system realloc().
The growth pattern is: 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
Note: new_allocated won't overflow because the largest possible value is PY_SSIZE_T_MAX * (9 / 8) + 6 which always fits in a size_t.
For example, if you want to append an item to a list of length 8, Python will resize it to 16 slots and add the 9th item. The rest of the slots will be hidden and reserved for new items.
The growth factor looks as follows:
>>> def get_new_size(n_items):
...     new_size = n_items + (n_items // 2 ** 3)
...     if n_items < 9:
...         new_size += 3
...     else:
...         new_size += 6
...     return new_size
...
>>> get_new_size(9)
16
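You can observe the over-allocation with sys.getsizeof, which reports the size of the list object including its hidden slots. The byte counts below come from one 64-bit CPython 3 build and will differ across versions and platforms:
>>> import sys
>>> items = []
>>> sys.getsizeof(items)
56
>>> items.append(1)           # triggers an allocation of 4 slots
>>> sys.getsizeof(items)
88
>>> items.extend([2, 3, 4])   # fills the remaining preallocated slots
>>> sys.getsizeof(items)
88
>>> items.append(5)           # no free slots left, the list gets resized
>>> sys.getsizeof(items)
120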
Performance
If you are interested in a speed comparison, there is a good summary of the overall performance by Raymond Hettinger.
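As a rough illustration (the timings below are only illustrative and depend on your machine and Python version), creating a tuple of constants is cheaper than creating an equivalent list, because the tuple is stored as a compiled constant and simply loaded, while the list has to be rebuilt on every execution:
>>> from timeit import timeit
>>> timeit("(1, 2, 3)", number=10**6)   # the tuple is a compiled constant
0.0129
>>> timeit("[1, 2, 3]", number=10**6)   # the list is rebuilt on every call
0.0562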