[python] Redirect stdout to a file in Python?

How do I redirect stdout to an arbitrary file in Python?

When a long-running Python script (e.g., a web application) is started from within an ssh session and backgrounded, and the ssh session is then closed, the application raises IOError and fails the moment it tries to write to stdout. I needed to find a way to make the application and its modules write to a file rather than stdout, to prevent failure due to IOError. Currently, I use nohup to redirect output to a file, and that gets the job done, but I was wondering if there was a way to do it without nohup, out of curiosity.

I have already tried sys.stdout = open('somefile', 'w'), but this does not seem to prevent some external modules from still writing to the terminal (or maybe the sys.stdout = ... line did not fire at all). I know it works in simpler scripts I've tested, but I haven't had time to test it on a web application yet.

This question is related to python stdout

The answer is


If you want to do the redirection within the Python script, setting sys.stdout to a file object does the trick:

import sys
sys.stdout = open('file', 'w')
print('test')
sys.stdout.close()
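
If you need the real stdout back afterwards, keep a reference to it first (a small sketch; sys.__stdout__ also holds the original stream):

import sys

stdout_backup = sys.stdout        # keep the original stream
sys.stdout = open('file', 'w')
print('this goes to the file')
sys.stdout.close()
sys.stdout = stdout_backup        # printing to the terminal works again
print('back on the terminal')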

A far more common method is to use shell redirection when executing (same on Windows and Linux):

$ python foo.py > file
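
If you launch the script from another Python process rather than a shell, a rough equivalent (a sketch using the same foo.py and file names) is to hand subprocess a file object:

import subprocess

with open('file', 'w') as f:
    subprocess.run(['python', 'foo.py'], stdout=f)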

There is the contextlib.redirect_stdout() function in Python 3.4+:

from contextlib import redirect_stdout

with open('help.txt', 'w') as f:
    with redirect_stdout(f):
        print('it now prints to `help.txt`')

It is similar to:

import sys
from contextlib import contextmanager

@contextmanager
def redirect_stdout(new_target):
    old_target, sys.stdout = sys.stdout, new_target # replace sys.stdout
    try:
        yield new_target # run some code with the replaced stdout
    finally:
        sys.stdout = old_target # restore to the previous value

that can be used on earlier Python versions. The latter version is not reusable; it can be made reusable if desired.
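
For instance, a minimal reusable (still not thread-safe) sketch, which keeps a stack of saved targets the way contextlib.redirect_stdout() does internally:

import sys

class redirect_stdout_reusable:
    """A reusable stdout redirector; a sketch, not the stdlib implementation."""
    def __init__(self, new_target):
        self._new_target = new_target
        self._old_targets = []

    def __enter__(self):
        self._old_targets.append(sys.stdout)   # save the current stdout
        sys.stdout = self._new_target
        return self._new_target

    def __exit__(self, exc_type, exc_value, traceback):
        sys.stdout = self._old_targets.pop()   # restore the saved stdout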

Note that none of these redirect stdout at the file descriptor level, e.g.:

import os
import sys
from contextlib import redirect_stdout

stdout_fd = sys.stdout.fileno()
with open('output.txt', 'w') as f, redirect_stdout(f):
    print('redirected to a file')
    os.write(stdout_fd, b'not redirected')
    os.system('echo this also is not redirected')

b'not redirected' and 'echo this also is not redirected' do not end up in the output.txt file.

To redirect at the file descriptor level, os.dup2() could be used:

import os
import sys
from contextlib import contextmanager

def fileno(file_or_fd):
    fd = getattr(file_or_fd, 'fileno', lambda: file_or_fd)()
    if not isinstance(fd, int):
        raise ValueError("Expected a file (`.fileno()`) or a file descriptor")
    return fd

@contextmanager
def stdout_redirected(to=os.devnull, stdout=None):
    if stdout is None:
        stdout = sys.stdout

    stdout_fd = fileno(stdout)
    # copy stdout_fd before it is overwritten
    #NOTE: `copied` is inheritable on Windows when duplicating a standard stream
    with os.fdopen(os.dup(stdout_fd), 'wb') as copied: 
        stdout.flush()  # flush library buffers that dup2 knows nothing about
        try:
            os.dup2(fileno(to), stdout_fd)  # $ exec >&to
        except ValueError:  # filename
            with open(to, 'wb') as to_file:
                os.dup2(to_file.fileno(), stdout_fd)  # $ exec > to
        try:
            yield stdout # allow code to be run with the redirected stdout
        finally:
            # restore stdout to its previous value
            #NOTE: dup2 makes stdout_fd inheritable unconditionally
            stdout.flush()
            os.dup2(copied.fileno(), stdout_fd)  # $ exec >&copied

The same example works now if stdout_redirected() is used instead of redirect_stdout():

import os
import sys

stdout_fd = sys.stdout.fileno()
with open('output.txt', 'w') as f, stdout_redirected(f):
    print('redirected to a file')
    os.write(stdout_fd, b'it is redirected now\n')
    os.system('echo this is also redirected')
print('this goes back to stdout')

The output that previously went to stdout now goes to output.txt for as long as the stdout_redirected() context manager is active.
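
As a side note, passing to=os.devnull (the default) silences stdout entirely, including output from child processes; a quick sketch reusing stdout_redirected() from above:

import os

with stdout_redirected(to=os.devnull):
    print('this disappears')
    os.system('echo and so does this')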

Note: stdout.flush() does not flush C stdio buffers on Python 3 where I/O is implemented directly on read()/write() system calls. To flush all open C stdio output streams, you could call libc.fflush(None) explicitly if some C extension uses stdio-based I/O:

try:
    import ctypes
    from ctypes.util import find_library
except ImportError:
    libc = None
else:
    try:
        libc = ctypes.cdll.msvcrt # Windows
    except OSError:
        libc = ctypes.cdll.LoadLibrary(find_library('c'))

def flush(stream):
    try:
        libc.fflush(None)
        stream.flush()
    except (AttributeError, ValueError, IOError):
        pass # unsupported

You could use the stdout parameter to redirect other streams, not only sys.stdout, e.g., to merge sys.stderr and sys.stdout:

def merged_stderr_stdout():  # $ exec 2>&1
    return stdout_redirected(to=sys.stdout, stdout=sys.stderr)

Example:

from __future__ import print_function
import sys

with merged_stderr_stdout():
    print('this is printed on stdout')
    print('this is also printed on stdout', file=sys.stderr)

Note: stdout_redirected() mixes buffered I/O (sys.stdout usually) and unbuffered I/O (operations on file descriptors directly). Beware, there could be buffering issues.

To answer your edit: you could use python-daemon to daemonize your script and use the logging module (as @erikb85 suggested) instead of print statements and merely redirecting stdout for the long-running Python script that you currently run with nohup.
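
A minimal sketch of that suggestion, assuming the third-party python-daemon package (pip install python-daemon) and an illustrative /tmp/app.log path:

import daemon  # third-party: pip install python-daemon
import logging

with daemon.DaemonContext():
    # configure logging inside the context: DaemonContext closes open
    # files on entry, and it also chdirs to '/', hence the absolute path
    logging.basicConfig(filename='/tmp/app.log', level=logging.INFO)
    logging.info('running detached from the terminal')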


Programs written in other languages (e.g. C) have to do special magic (called double-forking) expressly to detach from the terminal (and to prevent zombie processes). So, I think the best solution is to emulate them.

A plus of re-executing your program is that you can choose redirections on the command line, e.g. /usr/bin/python mycoolscript.py 2>&1 1>/dev/null

See this post for more info: What is the reason for performing a double fork when creating a daemon?
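
For reference, a minimal sketch of that double-fork recipe in Python (POSIX only; error handling omitted):

import os
import sys

def daemonize():
    if os.fork() > 0:
        sys.exit(0)        # first parent exits
    os.setsid()            # become session leader, detach from the controlling tty
    if os.fork() > 0:
        sys.exit(0)        # second parent exits, so we can never reacquire a tty
    sys.stdout.flush()
    sys.stderr.flush()
    devnull = os.open(os.devnull, os.O_RDWR)
    os.dup2(devnull, 0)    # stdin
    os.dup2(devnull, 1)    # stdout (point these at a log file instead if needed)
    os.dup2(devnull, 2)    # stderr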


import sys
sys.stdout = open('stdout.txt', 'w')

You can also try this, which is much better:

import sys

class Logger(object):
    def __init__(self, filename="Default.log"):
        self.terminal = sys.stdout
        self.log = open(filename, "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

sys.stdout = Logger("yourlogfilename.txt")
print "Hello world !" # this is should be saved in yourlogfilename.txt

The other answers didn't cover the case where you want forked processes to share your new stdout.

To do that:

from os import open, close, dup, O_WRONLY, O_CREAT

old = dup(1)
close(1)
open("file", O_WRONLY | O_CREAT)  # lands on fd 1, the lowest free descriptor

... do stuff, and then restore:

close(1)
dup(old)    # duplicates old back onto fd 1, the lowest free descriptor
close(old)  # get rid of the leftover copy
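
For completeness, a self-contained sketch of the same dance showing that a child process spawned while fd 1 is replaced writes to the file too:

import os

old = os.dup(1)
os.close(1)
os.open('file', os.O_WRONLY | os.O_CREAT)             # lands on fd 1
os.system('echo the child inherits fd 1 and writes to file')
os.close(1)
os.dup(old)     # back onto fd 1
os.close(old)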

You need a terminal multiplexer like tmux or GNU screen.

I'm surprised that a small comment by Ryan Amos on the original question is the only mention of a solution far preferable to all the others on offer, no matter how clever the Python trickery may be or how many upvotes it has received. Further to Ryan's comment, tmux is a nice alternative to GNU screen.

But the principle is the same: if you ever find yourself wanting to leave a terminal job running while you log out, head to the cafe for a sandwich, pop to the bathroom, go home (etc.) and then later reconnect to your terminal session from anywhere or any computer as though you'd never been away, terminal multiplexers are the answer. Think of them as VNC or remote desktop for terminal sessions. Anything else is a workaround. As a bonus, when the boss and/or partner comes in and you inadvertently ctrl-w / cmd-w your terminal window instead of your browser window with its dodgy content, you won't have lost the last 18 hours' worth of processing!


Quoted from PEP 343 -- The "with" Statement (added import statement):

Redirect stdout temporarily:

import sys
from contextlib import contextmanager
@contextmanager
def stdout_redirected(new_stdout):
    save_stdout = sys.stdout
    sys.stdout = new_stdout
    try:
        yield None
    finally:
        sys.stdout = save_stdout

Used as follows:

with open(filename, "w") as f:
    with stdout_redirected(f):
        print "Hello world"

This isn't thread-safe, of course, but neither is doing this same dance manually. In single-threaded programs (for example in scripts) it is a popular way of doing things.


Here is a variation of Yuda Prawira's answer:

  • implement flush() and all the file attributes
  • write it as a contextmanager
  • capture stderr also


import contextlib, sys

@contextlib.contextmanager
def log_print(file):
    # capture all outputs to a log file while still printing it
    class Logger:
        def __init__(self, file):
            self.terminal = sys.stdout
            self.log = file

        def write(self, message):
            self.terminal.write(message)
            self.log.write(message)

        def __getattr__(self, attr):
            return getattr(self.terminal, attr)

    logger = Logger(file)

    _stdout = sys.stdout
    _stderr = sys.stderr
    sys.stdout = logger
    sys.stderr = logger
    try:
        yield logger.log
    finally:
        sys.stdout = _stdout
        sys.stderr = _stderr


with log_print(open('mylogfile.log', 'w')):
    print('hello world')
    print('hello world on stderr', file=sys.stderr)

# you can capture the output to a string with:
# with log_print(io.StringIO()) as log:
#   ....
#   print('[captured output]', log.getvalue())

Based on this answer: https://stackoverflow.com/a/5916874/1060344, here is another way I figured out, which I use in one of my projects. Whatever you replace sys.stderr or sys.stdout with, you have to make sure the replacement complies with the file interface, especially if you are doing this because stderr/stdout are used in some other library that is not under your control. That library may be using other methods of the file object.

Check out this way, where I still let everything go to stderr/stdout (or any file for that matter) and also send the message to a log file using Python's logging facility (but you can really do anything with this):

import logging

class FileToLogInterface(file):  # Python 2 only: `file` is the built-in file type
    '''
    Interface to make sure that everytime anything is written to stderr, it is
    also forwarded to a file.
    '''

    def __init__(self, *args, **kwargs):
        if 'cfg' not in kwargs:
            raise TypeError('argument cfg is required.')
        else:
            if not isinstance(kwargs['cfg'], config.Config):
                raise TypeError(
                    'argument cfg should be a valid '
                    'PostSegmentation configuration object i.e. '
                    'postsegmentation.config.Config')
        self._cfg = kwargs['cfg']
        kwargs.pop('cfg')

        self._logger = logging.getLogger('access_log')

        super(FileToLogInterface, self).__init__(*args, **kwargs)

    def write(self, msg):
        super(FileToLogInterface, self).write(msg)
        self._logger.info(msg)
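
On Python 3, where the file builtin no longer exists, a hedged sketch of the same idea wraps the stream instead of subclassing it (the access_log logger name is carried over from above; the class name is illustrative):

import logging
import sys

class StreamToLog:
    """Forward writes to the wrapped stream and to a logger; a sketch."""
    def __init__(self, stream, logger_name='access_log'):
        self._stream = stream
        self._logger = logging.getLogger(logger_name)

    def write(self, msg):
        self._stream.write(msg)
        if msg.strip():                       # skip bare newlines
            self._logger.info(msg.rstrip('\n'))

    def __getattr__(self, attr):
        # delegate flush(), fileno(), etc. to the wrapped stream
        return getattr(self._stream, attr)

sys.stderr = StreamToLog(sys.stderr)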