I have thousands of custom modules (compiled to '.so' files) that I'd like to use in Python at the same time. Each such module is about 100 kB in size on average.
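To make the setup concrete, here is a minimal sketch of what I intend to do; the mod_<n> names are just placeholders for my real modules, which I assume are importable .so files somewhere on sys.path:

    # Sketch only: the mod_<n> naming scheme is a placeholder for the
    # actual compiled modules, assumed to be importable from sys.path.
    import importlib

    loaded = {}
    for i in range(10_000):
        name = f"mod_{i:04d}"                 # e.g. mod_0000.so ... mod_9999.so
        loaded[name] = importlib.import_module(name)

    # later: call into each compiled module through the objects in `loaded`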
CPython has no limit on the number of imports. However, each .so file will be opened using dlopen(), which is outside of Python's control, as is the dynamic linker's symbol table, which has to keep growing to hold information about your extension modules. Whether those have a practical limit is also outside of Python's purview. CPython itself merely uses some memory per module you import, so as long as you have enough memory you should be fine.
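If you want to see the dlopen() side of this for yourself, here is a rough, Linux-only sketch that lists the shared objects mapped into the current process before and after importing an extension module. It assumes _ssl is built as a separate .so, which it is on most Linux CPython builds; substitute one of your own modules otherwise:

    # Linux-only sketch: each imported extension module is loaded via dlopen(),
    # so its .so file shows up among the process's memory mappings.
    def mapped_shared_objects():
        """Paths of shared objects currently mapped into this process."""
        paths = set()
        with open("/proc/self/maps") as maps:
            for line in maps:
                parts = line.split()
                if len(parts) >= 6 and ".so" in parts[-1]:
                    paths.add(parts[-1])
        return paths

    before = mapped_shared_objects()
    import _ssl                    # any extension module built as a shared .so
    print(mapped_shared_objects() - before)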
The amount of memory consumed by a single imported module is going to be at least as big as the size of the module on disk. The additional overhead comes from both the OS (loading a dynamic module) and Python's own import machinery.
So if your modules are on average 100 kB in size, then importing 10,000 of them will take up at least 1 GB of address space, and importing 50,000 of them will run over 5 GB. You'd better be using an operating system with a 64-bit address space.
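If you want a real number for your own system rather than a back-of-the-envelope estimate, a rough sketch like the following measures how much the peak resident set grows per import. The mod_<n> names are hypothetical, and note that ru_maxrss is reported in kilobytes on Linux but in bytes on macOS:

    # Rough sketch for measuring the actual per-module memory cost,
    # assuming a hypothetical mod_<n> naming scheme for your modules.
    import importlib
    import resource

    def peak_rss_kb():
        # ru_maxrss is in kilobytes on Linux (bytes on macOS)
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

    start = peak_rss_kb()
    for i in range(1000):
        importlib.import_module(f"mod_{i:04d}")   # hypothetical module names
    grew = peak_rss_kb() - start
    print(f"~{grew / 1000:.1f} kB per module on average")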
There's no Python limit on the number of imports in a module. If there's a limit in any particular implementation, it's probably due to resource limits outside the Python interpreter.