Capturing library f2py call stdout from ipython


Question


I'm using Jupyter notebook with a Python 3 kernel.

If I run:

import scipy.optimize
scipy.optimize.minimize(
    lambda _: 1,
    0,
    method='COBYLA',
    options={'iprint': 1, 'disp': True, 'maxiter': 2})

I expect to get diagnostic optimization info printed to the IPython notebook. However, this prints to the console instead. I suspect this is because the optimization routine is implemented in Fortran and is interfaced in SciPy through f2py; the COBYLA Fortran file does the actual printing.

How can I pipe the Fortran subroutine's output to the IPython notebook? As I understand it, it should be the same as calling a compiled C function, so why isn't the stdout shared?


Answer 1:


The short answer is that you can't, at least not easily. This is one of the use cases that could be covered by the "kernel nanny" enhancement proposal to the IPython/Jupyter protocol, though it has not been accepted yet, nor is it going to happen quickly.

The (hand-wavy) reason is that in Python you can monkeypatch sys.stdin/sys.stdout/sys.stderr and write to a file-like interface that redirects things to do "The Right Thing"™, but Fortran/C/... functions often open the file handles corresponding to the raw streams directly, and you can't change that after the fact.
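
To see the difference, here is a minimal sketch (contextlib.redirect_stdout stands in for the monkeypatching, and a raw os.write to file descriptor 1 stands in for what the Fortran code does; the names are illustrative, not from the original answer, and it is meant to be run from a plain Python shell):

import contextlib
import io
import os

buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    # Captured: print() goes through sys.stdout, which now points at the StringIO.
    print("from Python")
    # Not captured: this writes straight to file descriptor 1, bypassing
    # sys.stdout entirely, much like the PRINT statements inside COBYLA.
    os.write(1, b"from the raw stream\n")

print("captured:", repr(buffer.getvalue()))   # only "from Python\n"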

The only solution is to control how the process gets launched and change the file descriptors ahead of time, hence the proposal for the "kernel nanny".
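
As a rough illustration of what redirecting at the file-descriptor level looks like, here is a sketch (my own, not part of the kernel nanny proposal; it ignores pipe-buffer limits, C/Fortran-level buffering, and threading issues that a real implementation would have to handle):

import os
import sys

def call_with_fd_capture(func):
    # Redirect file descriptor 1 into a pipe, call func(), restore stdout,
    # then return whatever was written to the pipe.
    sys.stdout.flush()
    saved_fd = os.dup(1)            # keep a copy of the real stdout
    read_end, write_end = os.pipe()
    os.dup2(write_end, 1)           # fd 1 now points at the pipe
    os.close(write_end)
    try:
        func()
    finally:
        sys.stdout.flush()
        os.dup2(saved_fd, 1)        # restore the original stdout
        os.close(saved_fd)
    with os.fdopen(read_end) as pipe:
        return pipe.read()

With that in place, something like call_with_fd_capture(lambda: scipy.optimize.minimize(...)) would return the Fortran diagnostics as a string instead of letting them land on the notebook server's terminal, provided the output fits in the pipe buffer.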


Let's expand on this (in response to the OP's follow-up question).

Python's print is a function that does not write directly to standard out; it writes to sys.stdout unless told otherwise. If you check in a normal Python shell:

>>> import sys
>>> sys.stdout
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'>

You can see that it is a direct wrapper around a filehandle.

If you do the same in a notebook (not in an IPython terminal, that's another story), you will see <ipykernel.iostream.OutStream at 0x104602be0>, which is a proxy object around the ZMQ protocol. In an IPython kernel the previous stream is stored in sys.__stdout__, so you can play around and try

sys.__stdout__.write('Hello StackOverflow\n')

which will print "Hello StackOverflow" in your notebook server's terminal. Do not forget the \n, which triggers the stream to be flushed.

Note that this is not Jupyter behavior, it is IPython behavior. The Jupyter side does not care how you do it, as long as you send the stdout over ZMQ. The Haskell kernel likely does the same by providing its own IO module.

Capturing the process's stdout is one solution (the one the kernel nanny proposal covers), but it has its own drawbacks. It is simpler to redirect at the Python level, as sys.stdout is made for that.

This behavior is neither a bug nor a "feature". One could argue that subprocess/f2py/etc. should be able to take a non-standard stdout/stderr as an argument, and that the kernel nanny is a workaround to help with the cases where they don't.



Source: https://stackoverflow.com/questions/41350116/capturing-library-f2py-call-stdout-from-ipython
