I have a question regarding a single module that is distributed over multiple directories.
Let's say I have these two files and directories:
~/lib/
Python does indeed remember that there was an xxx package. This is pretty much necessary to achieve acceptable performance: once modules and packages are loaded, they are cached. You can see which modules are loaded by looking at the dictionary sys.modules.
sys.modules is a normal dictionary, so you can remove a package from it to force it to be re-imported the next time you import it, like below:
import sys

print(sys.modules)        # 'xml' is not in the cache yet
import xml
print(sys.modules)        # now 'xml' appears in the cache
del sys.modules['xml']    # remove the cached entry
print(sys.modules)        # 'xml' is gone; the next "import xml" loads it from disk again
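A related note: deleting the entry only affects the next import statement. If you already hold a reference to the module object, the standard-library way to refresh it in place is importlib.reload; a minimal sketch:

import importlib
import xml

# Re-executes the package's __init__ and updates the existing module object.
xml = importlib.reload(xml)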
Notice that after importing the xml package it is in the dictionary, and that it is possible to remove it from that dictionary too. I make this point for pedagogical purposes only; I would not recommend this approach in a real application. Also, if you need to use your misc and util packages together, this would not work so well. If at all possible, rearrange your source code structure to better fit the normal Python module loading mechanism.
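For completeness: if the goal really is one xxx package whose subpackages live in two directories (say ~/lib/xxx/misc and, as a placeholder of my own since only ~/lib appears above, ~/dev/xxx/util), the standard library provides pkgutil.extend_path for exactly this. A minimal sketch, assuming both parent directories are on sys.path and each copy of xxx (and each subpackage) has its own __init__.py:

# Put this in xxx/__init__.py in BOTH directories.
# extend_path searches sys.path for other directories named 'xxx' and appends
# them to this package's __path__, so all subpackages resolve through one xxx.
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)

Usage, with the parent directories on sys.path (the second path is a placeholder):

import os
import sys

sys.path[:0] = [os.path.expanduser('~/lib'), os.path.expanduser('~/dev')]
import xxx.misc
import xxx.util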