[java] Java Process with Input/Output Stream

I have the code example below, where you can enter a command for the bash shell (e.g. echo test) and have the result echoed back. However, after the first read, subsequent writes and reads stop working.

Why is this, or am I doing something wrong? My end goal is to create a threaded scheduled task that periodically executes a command against /bin/bash, so the OutputStream and InputStream would have to work in tandem and not stop working. I have also been seeing the error java.io.IOException: Broken pipe. Any ideas?

Thanks.

String line;
Scanner scan = new Scanner(System.in);

Process process = Runtime.getRuntime ().exec ("/bin/bash");
OutputStream stdin = process.getOutputStream ();
InputStream stderr = process.getErrorStream ();
InputStream stdout = process.getInputStream ();

BufferedReader reader = new BufferedReader (new InputStreamReader(stdout));
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(stdin));

String input = scan.nextLine();
input += "\n";
writer.write(input);
writer.flush();

input = scan.nextLine();
input += "\n";
writer.write(input);
writer.flush();

while ((line = reader.readLine ()) != null) {
System.out.println ("Stdout: " + line);
}

input = scan.nextLine();
input += "\n";
writer.write(input);
writer.close();

while ((line = reader.readLine ()) != null) {
System.out.println ("Stdout: " + line);
}

The answer is


Firstly, I would recommend replacing the line

Process process = Runtime.getRuntime ().exec ("/bin/bash");

with the lines

ProcessBuilder builder = new ProcessBuilder("/bin/bash");
builder.redirectErrorStream(true);
Process process = builder.start();

ProcessBuilder is new in Java 5 and makes running external processes easier. In my opinion, its most significant improvement over Runtime.getRuntime().exec() is that it allows you to redirect the standard error of the child process into its standard output. This means you only have one InputStream to read from. Before this, you needed to have two separate Threads, one reading from stdout and one reading from stderr, to avoid the standard error buffer filling while the standard output buffer was empty (causing the child process to hang), or vice versa.
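For reference, if you do not (or cannot) redirect the error stream, the two-thread arrangement described above might be sketched roughly like this, with stderr drained on a background thread while the main thread keeps reading stdout (the thread and variable names here are illustrative):

// Illustrative sketch: drain stderr on a background thread so the child
// process never blocks on a full stderr buffer while we read stdout.
final InputStream stderr = process.getErrorStream();
Thread stderrDrainer = new Thread(new Runnable() {
    @Override
    public void run() {
        BufferedReader errReader = new BufferedReader(new InputStreamReader(stderr));
        String errLine;
        try {
            while ((errLine = errReader.readLine()) != null) {
                System.err.println("Stderr: " + errLine);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
stderrDrainer.setDaemon(true);
stderrDrainer.start();
// ... meanwhile the main thread keeps reading from process.getInputStream()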

Next, the loops (of which you have two)

while ((line = reader.readLine ()) != null) {
    System.out.println ("Stdout: " + line);
}

only exit when the reader, which reads from the process's standard output, returns end-of-file. This only happens when the bash process exits. It will not return end-of-file if there happens at present to be no more output from the process. Instead, it will wait for the next line of output from the process and not return until it has this next line.

Since you're sending two lines of input to the process before reaching this loop, the first of these two loops will hang if the process hasn't exited after these two lines of input. It will sit there waiting for another line to be read, but there will never be another line for it to read.
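In other words, a drain-until-end-of-file loop like that is only safe once you know the shell is about to exit, for example after sending it exit and closing its stdin. A rough sketch of that ordering, reusing the reader, writer and process from the question's code:

// Illustrative ordering: tell bash to exit, close its stdin, then drain
// stdout until readLine() returns null (which only happens at end-of-file).
writer.write("exit\n");
writer.flush();
writer.close();                       // bash sees EOF on stdin and exits

String outLine;
while ((outLine = reader.readLine()) != null) {
    System.out.println("Stdout: " + outLine);
}
int exitCode = process.waitFor();     // reap the child once its output is drained
System.out.println("bash exited with code " + exitCode);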

I compiled your source code (I'm on Windows at the moment, so I replaced /bin/bash with cmd.exe, but the principles should be the same), and I found that:

  • after typing in two lines, the output from the first two commands appears, but then the program hangs,
  • if I type in, say, echo test, and then exit, the program makes it out of the first loop since the cmd.exe process has exited. The program then asks for another line of input (which gets ignored), skips straight over the second loop since the child process has already exited, and then exits itself.
  • if I type in exit and then echo test, I get an IOException complaining about a pipe being closed. This is to be expected - the first line of input caused the process to exit, and there's nowhere to send the second line.

I have seen a trick that does something similar to what you seem to want, in a program I used to work on. This program kept around a number of shells, ran commands in them and read the output from these commands. The trick used was to always write out a 'magic' line that marks the end of the shell command's output, and use that to determine when the output from the command sent to the shell had finished.

I took your code and I replaced everything after the line that assigns to writer with the following loop:

while (scan.hasNext()) {
    String input = scan.nextLine();
    if (input.trim().equals("exit")) {
        // Putting 'exit' amongst the echo --EOF--s below doesn't work.
        writer.write("exit\n");
    } else {
        writer.write("((" + input + ") && echo --EOF--) || echo --EOF--\n");
    }
    writer.flush();

    line = reader.readLine();
    while (line != null && ! line.trim().equals("--EOF--")) {
        System.out.println ("Stdout: " + line);
        line = reader.readLine();
    }
    if (line == null) {
        break;
    }
}

After doing this, I could reliably run a few commands and have the output from each come back to me individually.

The two echo --EOF-- commands in the line sent to the shell are there to ensure that output from the command is terminated with --EOF-- even in the event of an error from the command.
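For completeness, here is one way the pieces above might fit together into a self-contained program (this is a sketch assuming a Unix-like system with /bin/bash; the class name BashRunner is illustrative):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.util.Scanner;

public class BashRunner {
    public static void main(String[] args) throws IOException {
        ProcessBuilder builder = new ProcessBuilder("/bin/bash");
        builder.redirectErrorStream(true);          // merge stderr into stdout
        Process process = builder.start();

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()));
        BufferedWriter writer = new BufferedWriter(
                new OutputStreamWriter(process.getOutputStream()));
        Scanner scan = new Scanner(System.in);

        String line;
        while (scan.hasNext()) {
            String input = scan.nextLine();
            if (input.trim().equals("exit")) {
                writer.write("exit\n");
            } else {
                // Echo a marker after the command so we know where its output ends.
                writer.write("((" + input + ") && echo --EOF--) || echo --EOF--\n");
            }
            writer.flush();

            line = reader.readLine();
            while (line != null && !line.trim().equals("--EOF--")) {
                System.out.println("Stdout: " + line);
                line = reader.readLine();
            }
            if (line == null) {
                break;      // the shell has exited
            }
        }
    }
}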

Of course, this approach has its limitations. These limitations include:

  • if I enter a command that waits for user input (e.g. another shell), the program appears to hang,
  • it assumes that each process run by the shell ends its output with a newline,
  • it gets a bit confused if the command being run by the shell happens to write out a line --EOF-- (see the sketch below for one way to reduce that risk).
  • bash reports a syntax error and exits if you enter some text with an unmatched ).

These points might not matter to you if whatever it is you're thinking of running as a scheduled task is going to be restricted to a command or a small set of commands which will never behave in such pathological ways.
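If the --EOF-- collision mentioned above is a concern, one possible variation (not part of the original approach, just a sketch) is to generate a marker per command that is unlikely to appear in real output, for example using java.util.UUID:

// Illustrative variation: use a per-command random marker instead of --EOF--.
String marker = "--" + java.util.UUID.randomUUID() + "--";
writer.write("((" + input + ") && echo " + marker + ") || echo " + marker + "\n");
writer.flush();

line = reader.readLine();
while (line != null && !line.trim().equals(marker)) {
    System.out.println("Stdout: " + line);
    line = reader.readLine();
}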

EDIT: improved exit handling and made other minor changes after running this on Linux.


I think you can use a daemon thread for reading your input, while your output reader stays in its while loop on the main thread, so you can read and write at the same time. You can modify your program like this:

Thread T = new Thread(new Runnable() {

    @Override
    public void run() {
        // Read commands from the console and forward them to the process's stdin.
        while (true) {
            String input = scan.nextLine();
            input += "\n";
            try {
                writer.write(input);
                writer.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
});
T.start();

and your reader loop can stay the same as above, i.e.

while ((line = reader.readLine ()) != null) {
    System.out.println ("Stdout: " + line);
}

Make your writer final, otherwise it won't be accessible from the inner class.
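For example (a small sketch, reusing the process from the question's code):

// Declared final so the anonymous Runnable above can capture it.
final BufferedWriter writer = new BufferedWriter(
        new OutputStreamWriter(process.getOutputStream()));
final Scanner scan = new Scanner(System.in);   // captured by the Runnable as well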


You have writer.close(); in your code, so bash receives EOF on its stdin and exits. Then you get Broken pipe when trying to write to the stdin of the defunct bash.