TerminalVEX
One of the most powerful aspects of the Unix philosophy is connecting small programs that each do one thing well to accomplish complex tasks. The mechanisms that create these connections are pipes and redirection operators. Commands that are simple on their own become incredibly powerful workflows when combined through these operators.
To understand these concepts, we first need to know about standard streams.
In Linux, every program works with three standard streams: stdin (standard input, file descriptor 0), stdout (standard output, file descriptor 1), and stderr (standard error, file descriptor 2):
# stdout example: ls output is printed to terminal
$ ls
file1.txt file2.txt folder/
# stderr example: non-existent file produces error
$ cat nonexistent-file.txt
cat: nonexistent-file.txt: No such file or directory
# stdin example: cat waits for keyboard input
$ cat
Hello (user types)
Hello (cat echoes)
(Ctrl+D to end)
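A minimal sketch of the same idea in script form: a hypothetical `emit` function writes one line to each output stream, and we capture them separately to show that stdout and stderr really are distinct channels.

```shell
# emit is an illustrative function: one line to each stream.
emit() {
  echo "normal output"               # goes to stdout (fd 1)
  echo "something went wrong" >&2    # >&2 sends this line to stderr (fd 2)
}

out=$(emit 2>/dev/null)        # keep stdout, discard stderr
err=$(emit 2>&1 >/dev/null)    # keep stderr, discard stdout
echo "captured stdout: $out"
echo "captured stderr: $err"
```

Note the order in `2>&1 >/dev/null`: redirections are processed left to right, so stderr is first pointed at the capture pipe, and only then is stdout sent to /dev/null.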
Output redirection operators send a command's stdout to a file instead of printing it to the terminal.
The > operator writes output to the specified file. If the file already exists, its contents are overwritten; if it does not exist, a new file is created.
# Write ls output to a file
$ ls -l > file-list.txt
# Write date information to a file
$ date > date.txt
$ cat date.txt
Tue Jan 20 14:30:00 +03 2026
# Create an empty file (clear a file's contents)
$ > old-log.txt
Caution: The > operator completely erases and rewrites the file. Use >> if you want to preserve existing content.
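If you want a safety net against accidental overwrites, bash (and zsh) offer the noclobber option: with it enabled, `>` refuses to touch an existing file, and `>|` explicitly forces the overwrite. A minimal sketch:

```shell
set -o noclobber               # make > refuse to overwrite existing files
tmp=$(mktemp)                  # mktemp creates the file, so it already exists
if ( echo "first" > "$tmp" ) 2>/dev/null; then
  clobbered=yes                # would mean > overwrote the file
else
  clobbered=no                 # noclobber blocked the redirection
fi
echo "forced" >| "$tmp"        # >| overrides noclobber on purpose
result=$(cat "$tmp")
echo "overwrite blocked: $clobbered, file now contains: $result"
set +o noclobber
rm -f "$tmp"
```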
The >> operator appends output to the end of the file. Existing content is preserved.
# Add new entry to log file
$ echo "$(date): Application started" >> app.log
$ echo "$(date): User logged in" >> app.log
$ cat app.log
Tue Jan 20 14:30:00 +03 2026: Application started
Tue Jan 20 14:31:00 +03 2026: User logged in
# Merge multiple files
$ cat chapter1.txt >> book.txt
$ cat chapter2.txt >> book.txt
The < operator redirects the contents of a file to a command's stdin stream. Instead of receiving input from the keyboard, the command reads from the file.
# Count lines from a file
$ wc -l < access.log
1542
# Sort using file input
$ sort < names.txt
# Send email (if mail command is available)
$ mail user@company.com < report.txt
# Run MySQL query from file
$ mysql database < query.sql
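Two close relatives of < are worth knowing: a here-document (<<) feeds inline text to a command's stdin without needing a file at all, and a here-string (<<<, a bash/zsh feature) feeds a single string. A short sketch:

```shell
# Here-document: everything up to the EOF marker becomes stdin for wc -l.
line_count=$(wc -l << 'EOF'
alice
bob
carol
EOF
)
echo "here-document fed $line_count lines"

# Here-string: the string after <<< becomes stdin for sort.
sorted=$(sort <<< "banana
apple")
echo "$sorted"
```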
To redirect the standard error stream (stderr), we use the file descriptor number 2.
# Redirect error messages to a file
$ find / -name "*.conf" 2> errors.txt
# Append error messages to existing file
$ command 2>> error-log.txt
# Redirect stdout and stderr to separate files
$ command > output.txt 2> errors.txt
# Redirect stdout and stderr to the same file
$ command > all-output.txt 2>&1
# or with modern bash syntax
$ command &> all-output.txt
# Ignore errors (redirect to /dev/null)
$ find / -name "*.log" 2>/dev/null
/dev/null is a special device file; everything written to it disappears. It is used to suppress unwanted output.
# Suppress error messages
$ find / -type f -name "*.tmp" 2>/dev/null
# Suppress all output (silent execution)
$ command > /dev/null 2>&1
# Show only error output (suppress stdout)
$ command > /dev/null
# Check if a command succeeds
$ grep -q "error" log.txt 2>/dev/null && echo "Error found" || echo "No errors"
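The && and || in the last example branch on a command's exit status: 0 means success, anything else means failure, and $? holds the status of the most recent command. A minimal sketch:

```shell
# grep exits with status 1 when the pattern is not found.
grep -q "needle" /dev/null
status=$?
echo "grep status with no match: $status"

true  && success=yes    # && runs only after a success (status 0)
false || failure=yes    # || runs only after a failure (non-zero status)
echo "true succeeded: $success, false failed: $failure"
```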
The pipe (|) operator connects the stdout of one command to the stdin of the next. This is the heart of the Unix philosophy: it lets you join simple commands together like links in a chain.
# Sort file listing
$ ls -l | sort -k5 -n
# Search for "node" in running processes
$ ps aux | grep node
# View large output page by page
$ cat large-log.txt | less
# Count unique lines
$ cat access.log | sort | uniq -c | sort -rn
# Count files in a directory
$ ls | wc -l
# List the 10 largest files
$ du -sh * | sort -rh | head -10
# Extract a specific column
$ cat /etc/passwd | cut -d: -f1 | sort
You can use pipe operators multiple times to create complex data processing pipelines:
# Find the top 5 IP addresses making the most requests in web server logs
$ cat access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head -5
1523 192.168.1.100
842 10.0.0.50
631 172.16.0.25
419 192.168.1.200
287 10.0.0.75
# Count total lines of code in a project directory
$ find . -name "*.js" -print0 | xargs -0 wc -l | tail -1
# Sort running processes by memory usage
$ ps aux | sort -k4 -rn | head -10
# Count unique words in a file
$ cat text.txt | tr ' ' '\n' | sort | uniq -c | sort -rn | head -20
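One subtlety of long pipelines: by default, a pipeline's exit status is that of the last command, so a failure earlier in the chain can be silently masked. bash's pipefail option makes the pipeline fail if any stage fails. A minimal sketch:

```shell
false | true
default_status=$?              # 0 -- false's failure is hidden by true

set -o pipefail                # bash: fail if ANY stage of the pipeline fails
false | true
pipefail_status=$?             # now the pipeline reports the failure
set +o pipefail
echo "default: $default_status, with pipefail: $pipefail_status"
```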
The tee command writes the data it reads from stdin both to its stdout (usually the terminal) and to the specified file(s). Its name comes from the T-shaped pipe fitting.
# Display output on terminal and save to file
$ ls -l | tee file-list.txt
# Write to multiple files
$ echo "Important info" | tee file1.txt file2.txt
# Append to file (without overwriting)
$ date | tee -a log.txt
# Use in the middle of a pipe chain
$ cat access.log | tee raw-log.txt | grep "ERROR" | tee error-log.txt | wc -l
In the last example, the process is as follows: access.log is read, the full content is saved to raw-log.txt, then only lines containing ERROR are filtered, these are saved to error-log.txt, and finally the error count is printed to the screen.
The following examples demonstrate how pipes and redirection are used in the real world:
# Find errors in the most recent log entries
$ tail -n 1000 /var/log/syslog | grep -i error
# Count 404 errors for a specific date
$ grep "20/Jan/2026" access.log | grep " 404 " | wc -l
# List most frequently recurring error messages
$ grep "ERROR" app.log | awk -F'ERROR' '{print $2}' | sort | uniq -c | sort -rn | head -10
# Show disk usage sorted
$ df -h | sort -k5 -rn
# Processes using the most CPU
$ ps aux --sort=-%cpu | head -10
# Count open network connections
$ ss -tunap | grep ESTAB | wc -l
# Search for a specific function in a project
$ grep -r "functionName" src/ | grep -v node_modules
# Lint changed JavaScript files with git
$ git diff --name-only | grep '\.js$' | xargs eslint
# Check package counts
$ npm ls --depth=0 2>/dev/null | wc -l
Besides pipes, there are also operators for running commands sequentially:
# Sequential execution with ; (continues even if previous fails)
$ mkdir project; cd project; touch README.md
# Conditional execution with && (continues only if previous succeeds)
$ mkdir project && cd project && npm init -y
# Alternative execution with || (runs if previous fails)
$ cd project || echo "Directory not found"
# Combined usage
$ npm test && echo "Tests passed" || echo "Tests failed"
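These operators pair naturally with command grouping: braces group several commands so that a single redirection applies to all of them, instead of repeating >> on every line. A minimal sketch:

```shell
tmp=$(mktemp)
{
  echo "step 1 done"
  echo "step 2 done"
  date
} > "$tmp"                     # one redirection for the whole group
line_count=$(wc -l < "$tmp")
echo "group wrote $line_count lines"
rm -f "$tmp"
```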
Pipes and redirection are among the most powerful features of terminal usage. Understanding standard streams (stdin, stdout, stderr) is the key to using these operators effectively. You can redirect output to files with > and >>, chain commands with |, control the error stream with 2>, and duplicate output with tee. As you integrate these techniques into your daily workflow, you will discover the true potential of the terminal.