updated June 29, 2024

Popular Python job interview questions

Let's dive into what every Python developer is expected to know.

1. Useful classes from the collections module

  • defaultdict is a dictionary that, when a missing key is looked up, creates a default value with the supplied factory, stores it under the key and returns it:
    >>> from collections import defaultdict
    >>> d = defaultdict(int)
    >>> d[2]
    0
    >>> d
    defaultdict(<class 'int'>, {2: 0})
    >>> d[3] += 5
    >>> d
    defaultdict(<class 'int'>, {2: 0, 3: 5})
    
  • deque is a double-ended queue with fast appends and pops from both ends. With append() and pop() it works as a stack: the last element added is removed first (LIFO):
    >>> from collections import deque
    >>> s = deque()
    >>> s.append(2)
    >>> s.append(5)
    >>> s
    deque([2, 5])
    >>> s.pop()
    5
    >>> s
    deque([2])
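
    Since both ends are fast, a deque can also serve as a FIFO queue with append() and popleft(), a quick sketch:
    >>> q = deque([1, 2, 3])
    >>> q.popleft()
    1
    >>> q
    deque([2, 3])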
    
  • ChainMap(*maps) groups several mappings into a single view: a key lookup searches the underlying maps in order until the key is found:

    >>> from collections import ChainMap
    >>> m1 = {1: 5, 2: 7}
    >>> m2 = {11: 4, 14: 2}
    >>> m3 = {23: 8, 29: 1}
    >>> m = ChainMap(m1, m2, m3)
    >>> m[23]
    8
    >>> m[2]
    7
    >>> m[14]
    2
    

2. Useful functions from the itertools module

This module provides functions that create iterators for efficient looping.
  • itertools.chain() returns an iterator that yields the elements of several iterables one after another:
    >>> from itertools import chain
    >>> it = chain([1, 2, 3], [7, 8, 9]) 
    >>> it
    <itertools.chain object at 0x...>
    >>> list(it)
    [1, 2, 3, 7, 8, 9]
    
  • itertools.batched(iterable, n) groups the elements of an iterable into tuples of length n (the last tuple may be shorter):
    >>> from itertools import batched
    >>> arr = [1, 2, 3, 4, 5]
    >>> list(batched(arr, 2))
    [(1, 2), (3, 4), (5,)]
    

    Added in Python 3.12

  • itertools.filterfalse(predicate, iterable) works like the built-in filter() function except that it keeps the elements for which the predicate returns False and discards those for which it returns True:
    >>> from itertools import filterfalse
    >>> numbers = [1, 2, 3, 4, 5]
    >>> res = filter(lambda n: n % 2 == 0, numbers)
    >>> list(res)
    [2, 4]
    >>> res = filterfalse(lambda n: n % 2 == 0, numbers)
    >>> list(res)
    [1, 3, 5]
    

3. Useful functions from the functools module

  • functools.reduce(function, iterable[, initial]) reduces a sequence to a single value using a two-argument function: the function is applied to the 1st and 2nd elements, then to that result and the 3rd element, then to that result and the 4th element, and so on up to the last element:
    >>> from functools import reduce
    >>> reduce(lambda x, y: x + y, [1, 2, 3, 4])
    10
    >>> import operator
    >>> reduce(operator.add, [1, 2, 3, 4])
    10
    

    Often it is used together with the operator module, which provides function equivalents of the standard built-in operators.
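
    If the optional initial value is passed, it is placed before the sequence items in the calculation (and returned as-is when the sequence is empty), a quick sketch:
    >>> reduce(operator.add, [1, 2, 3, 4], 100)
    110
    >>> reduce(operator.add, [], 100)
    100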

  • functools.partial(function, *args, **keywords) returns a callable object that behaves like function called with the stored args and keywords. Positional arguments passed to the partial object are appended to args; keyword arguments extend and override keywords:
    >>> from functools import partial
    >>> def func(a, b, x, y):
    ...   print(a, b, x, y)
    ... 
    >>> p = partial(func, 1, x=11, y=12)
    >>> p(2, y=15)
    1 2 11 15
    
  • the @functools.cache decorator caches the function's return value keyed by its arguments, so for arguments that were used before the cached value is returned without invoking the function again:
    from functools import cache

    @cache
    def my_func(n):
       ...
       
       
    my_func(3)  # my_func is called
    my_func(3)  # my_func isn't called, the previous result is returned from the cache
    

    The cache size is unlimited and values are never evicted from it. functools.lru_cache(maxsize) provides a size-limited cache that evicts the least recently used entries.
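
    A minimal sketch of the size-limited variant (maxsize=2 here is just for illustration):
    from functools import lru_cache

    @lru_cache(maxsize=2)
    def my_func(n):
       ...

    my_func(1)  # called and cached
    my_func(2)  # called and cached
    my_func(3)  # cache is full: the least recently used entry (for 1) is evicted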

  • @functools.cached_property computes a property value on the first access, caches it on the instance and returns the stored value on subsequent accesses:
    from functools import cached_property

    class Square:

      def __init__(self, size):
        self.size = size

      @cached_property
      def area(self):
        return self.size ** 2
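
    A quick usage sketch: the value is computed on the first access and reused afterwards:
    >>> sq = Square(4)
    >>> sq.area  # computed and cached
    16
    >>> sq.area  # returned from the cache without calling area() again
    16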
    

4. What does functools.wraps do?

It copies metadata such as the function name and docstring from the wrapped function to the wrapper function returned by the decorator:
import functools

def my_decorator(func):
  @functools.wraps(func)
  def wrapper(*args, **kwargs):
    print("My decorator")
    return func(*args, **kwargs)
  return wrapper

@my_decorator
def my_func():
  """My func docstring"""
  print("My func")
  

my_func()
my_func.__name__  # "my_func"
my_func.__doc__  # "My func docstring"
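
If the @functools.wraps(func) line were removed from the decorator above, the wrapper's own metadata would show through instead:

my_func.__name__  # "wrapper"
my_func.__doc__  # None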

5. What's a generator?

A Python generator function works as an iterator (it implements the iterator protocol): it produces values with the yield keyword:
def my_generator():
  for i in range(5):
    yield i  # return next value

list(my_generator())  # can convert into list

for val in my_generator():  # can loop
  print(val)
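
Since a generator implements the iterator protocol, it can also be advanced manually with next(); a StopIteration exception is raised when it is exhausted:

gen = my_generator()
next(gen)  # 0
next(gen)  # 1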

6. Python variable scopes, LEGB

There are 4 types of variable scope in Python:
  • local: variables defined inside the current function. The object references in this scope are released when execution leaves the function body
  • enclosing: visible only to an inner (nested) function; it contains variables declared in the outer function and captured by the inner one, so the references stay valid as long as the inner function exists:
        def outer():
          v = 1
          def inner():
            print(v)  # v from enclosed scope
          return inner
          
        f = outer()
        f()  # v is still valid after outer() call
        
  • global: variables declared at module level
  • built-in: names preloaded by the Python interpreter, such as built-in functions and exceptions

The first letters of these scopes form the LEGB acronym; a name is looked up in these scopes in that order.
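
A small sketch showing all four scopes together (len is resolved from the built-in scope):

x = "global"  # module (global) scope

def outer():
  x = "enclosing"  # enclosing scope for inner()
  def inner():
    x = "local"  # local scope
    print(x, len(x))  # len comes from the built-in scope
  inner()

outer()  # prints "local 5"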

7. What's the class search order for attribute lookup in case of multiple inheritance?

The mro() class method returns this search order, computed by Python's method resolution order (MRO) algorithm (C3 linearization):
>>> class A:
...   def f(self):
...     print("A.f")
... 
>>> class B:
...   def f(self):
...     print("B.f")
... 
>>> class C(A, B):
...   pass
... 
>>> c = C()
>>> c.f()
A.f
>>> C.mro()
[<class '__main__.C'>, <class '__main__.A'>, <class '__main__.B'>, <class 'object'>]

So the f() method definition is looked up in class C first, then in A, B and object. That's why it's found in A and A.f is called.

8. How to call a method of a superclass? How to do this in case of multiple inheritance?

Superclass methods are called via the proxy object returned by the built-in super(). In Python 3 the zero-argument super() and the two-argument form in the example below do the same thing:
class Radio(Device):
  def volume(self):
    return super(Radio, self).volume()  # the same as super().volume()

The method is searched along the MRO starting from the class that follows the current one, i.e. from the 2nd MRO element; this works the same way for multiple inheritance:

>>> class A:
...   def f(self):
...     print("A.f")
... 
>>> class B:
...   def f(self):
...     print("B.f")
... 
>>> class C(A, B):
...   def f(self):
...     super().f()
... 
>>> c = C()
>>> c.f()  # MRO: C - A - B - object
A.f

But it's possible to start the search further along the MRO by passing a class to super() explicitly: the lookup then begins at the class that follows it in the MRO instead of the 2nd element:

>>> class C(A, B):
...   def f(self):
...     super(A, self).f()
... 
>>> c = C()
>>> c.f()  # MRO: C - A - B - object, the search starts after A, i.e. at B
B.f

9. What are / and * function parameters?

These are delimiters that mark which function parameters are positional-only and which are keyword-only:
def my_func(p, /, pk, *, k):
  pass
  
my_func(1, 2, k=3)  # k is a keyword-only parameter
my_func(1, pk=2, k=3)  # pk can be passed positionally or by keyword

The delimiters themselves are not parameters, so the function above has 3 parameters. Parameters after the * delimiter can be passed as keyword arguments only. Parameters before / are positional-only and can't be passed as keyword arguments. The calls below are not valid and raise TypeError:

my_func(p=1, pk=2, k=3)  # p is positional only
my_func(1, 2, 3)  # k is keyword only

10. How does the Python interpreter free unused memory?

Reference counting is used: the number of references to each object is stored in the interpreter's internal structures for that object and updated on every reference assignment:
>>> import sys
>>> c = object()
>>> sys.getrefcount(c)  # 2 references: the call parameter and c
2
>>> b = c
>>> sys.getrefcount(c)  # 3 references: b reference added
3
>>> b = 1
>>> sys.getrefcount(c)  # 2 references: b reference removed
2

When a reference is removed (reassigned to another object, goes out of scope), the reference count is decreased. When the count reaches 0, the object's memory is released.

But cyclic references are possible: a group of objects that reference only each other and are not referenced from anywhere else (local or module scope etc.), so they are unreachable garbage. Such objects always have a non-zero reference count, so their memory can't be released by the method above. They are found and freed by the cyclic garbage collector, exposed through the gc module.
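
A quick sketch of the cycle collector using a throwaway Node class (gc.collect() runs a collection and returns the number of unreachable objects found):

>>> import gc
>>> class Node:
...   pass
... 
>>> a = Node()
>>> b = Node()
>>> a.other = b  # a references b
>>> b.other = a  # b references a: a reference cycle
>>> del a, b  # no outside references remain, but the reference counts are still non-zero
>>> gc.collect() > 0  # the collector finds and frees the cycle
True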