After a very thorough read of the Python decimal module documentation, I still find myself puzzled by what happens when I divide a decimal.
In Python 2.4.6 (makes
From your MacPorts bug, you have installed Xcode 4, and your Python 2.7.2 was built with the clang C compiler rather than gcc-4.2. There is at least one known problem with building Python with clang on OS X that was fixed subsequent to the 2.7.2 release. Either apply the patch or, better, ensure the build uses gcc-4.2. Something like (untested!):
sudo bash
export CC=/usr/bin/gcc-4.2
port clean python27
port upgrade --force python27
prior to the build might work if MacPorts doesn't override it.
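Before rebuilding, you can check from within Python itself which compiler your interpreter was built with, and whether the division bug is present. A minimal sketch (the exact compiler string varies by build):

```python
import platform
import decimal

# Report which C compiler built this interpreter; the bug described
# above only affects clang-built 2.7.2 interpreters on OS X.
print(platform.python_compiler())

# Sanity check for the bug: on a healthy build, dividing a Decimal
# by an int returns the exact Decimal result.
result = decimal.Decimal(1000) / 10
print(result)  # prints 100 on a correctly built Python
```

If the second line prints anything other than `100`, your build is affected and rebuilding with gcc-4.2 (or a patched source tree) should fix it.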
UPDATE: The required patch has now been applied to the MacPorts port files for Python 2. See https://trac.macports.org/changeset/87442
Just for the record: for Python 2.7.3 compiled with clang (via Homebrew on OS X), this seems to be fixed.
Python 2.7.3 (default, Oct 10 2012, 13:00:00)
[GCC 4.2.1 Compatible Apple Clang 4.1 ((tags/Apple/clang-421.11.66))] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import decimal
>>> decimal.Decimal(1000) / 10
Decimal('100')
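To answer the original question about what division does on a working build: mixing a `Decimal` with an `int` coerces the `int` to `Decimal`, exact quotients stay exact, and non-terminating quotients are rounded to the current context precision (28 significant digits by default). A minimal sketch:

```python
from decimal import Decimal, getcontext

# An int operand is coerced to Decimal; exact division stays exact.
q = Decimal(1000) / 10
print(q)                      # 100
print(type(q).__name__)       # Decimal

# Non-terminating division is rounded to the context precision.
print(getcontext().prec)      # 28 (the default)
print(Decimal(1) / Decimal(3))  # 0.3333333333333333333333333333
```

Note that `Decimal` cannot be mixed with `float` in arithmetic; only `int` operands are coerced implicitly.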