I've been writing some bash scripts recently and it took a while to get the output redirection working correctly. All I wanted to do was log script output to a file, but stderr kept appearing on the terminal instead. After a bit of research, I think I now have a decent understanding of how this works and why.
Here's what I was doing.
./myscript.sh 2>&1 >> results.log # wrong
So I'm redirecting stderr to stdout, then redirecting stdout to a file. Except that it didn't work: stderr was still coming to the terminal, not going to the log file.
It turns out what is actually happening is a bit different than I expected. 2>&1 doesn't tie stderr to stdout permanently; it redirects stderr to wherever stdout is pointing at that moment, as the shell processes the redirections from left to right. When the shell reaches the 2>&1 directive, stdout is still pointing at the terminal, so that's where stderr output goes. Only afterwards does the >> directive send stdout to the file.
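A quick way to see this for yourself (the script name and contents here are just a stand-in that writes one line to each stream):

```shell
# Create a stand-in script that writes one line to stdout and one to stderr.
printf '#!/bin/sh\necho "to stdout"\necho "to stderr" >&2\n' > myscript.sh
chmod +x myscript.sh

# Wrong order: 2>&1 copies stderr to stdout's *current* target (the
# terminal), then >> moves only stdout to the file.
./myscript.sh 2>&1 >> results.log
# Terminal shows: to stderr
# results.log contains only: to stdout
```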
So to get this working properly you need to do this:
./myscript.sh >> results.log 2>&1 # correct
So stdout is first sent to the file, then stderr is sent to wherever stdout is going, which is the file. Hurrah!
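Running the same kind of stand-in script (again, the name and contents are just for illustration) with the redirections swapped shows the difference:

```shell
# A stand-in script that writes one line to stdout and one to stderr.
printf '#!/bin/sh\necho "to stdout"\necho "to stderr" >&2\n' > myscript.sh
chmod +x myscript.sh

# Correct order: >> points stdout at the file first, then 2>&1 copies
# stderr to stdout's current target, which is now the file.
./myscript.sh >> results.log 2>&1
# Terminal shows nothing; results.log contains both lines.
```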
However, before declaring victory and moving on, suppose you want to send both stderr and stdout to a file and to the terminal at the same time. Check this out.
./myscript.sh 2>&1 | tee results.log # correct
Wait... what? How can that be correct? Well, it turns out the shell sets up pipelines before it processes each command's own redirections. By the time the 2>&1 is handled, stdout has already been connected to the pipe into the tee command, so stderr gets duplicated onto that pipe as well.
This works out quite nicely; after all, you couldn't put the 2>&1 after the pipe, because then it would apply to the tee command's file descriptors, not your script's.
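Here's a sketch of the tee version, once more with a throwaway stand-in script:

```shell
# A stand-in script that writes one line to stdout and one to stderr.
printf '#!/bin/sh\necho "to stdout"\necho "to stderr" >&2\n' > myscript.sh
chmod +x myscript.sh

# The pipe is set up first, so stdout already points into tee when
# 2>&1 runs; both streams flow into tee, which writes them to the
# file and echoes them to the terminal.
./myscript.sh 2>&1 | tee results.log
# Both lines appear on the terminal AND in results.log.
```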