My CLI Toolkit & How I Back Everything Up with Restic
There's something deeply satisfying about a well-tuned terminal. I spend most of my working hours inside one — training models, wrangling data pipelines, SSHing into GPU boxes, writing code. Over the years I've assembled a set of CLI tools that have become second nature. I wanted to share what my setup looks like today and, importantly, how I back it all up with Restic.
The Shell
I run fish as my daily shell. I know, it's not POSIX — and I don't care. The autosuggestions, syntax highlighting, and sane defaults out of the box are worth the occasional compatibility hiccup. For the rare script that needs bash, I just shebang it.
My prompt is Starship — fast, minimal, and configured in a single TOML file. It shows me the git branch, Python virtualenv, exit codes, and command duration. Nothing more.
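To give a flavor of that single TOML file, here are a couple of fragments of the kind of starship.toml I mean. The option names are real Starship settings; the values are illustrative, not my exact config:

```toml
# starship.toml fragments: values are illustrative
[status]
disabled = false      # show non-zero exit codes in the prompt

[cmd_duration]
min_time = 2000       # only report commands that took longer than 2s

[python]
python_binary = "python3"
```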
Terminal Multiplexing with tmux
I'm a heavy tmux user. Every project gets its own session with a standard window layout: editor, shell, and logs. I use tmux-resurrect and tmux-continuum to persist sessions across reboots — losing your window layout after a restart is unacceptable.
Key bindings I can't live without:
- `prefix + z` to zoom a pane full-screen when I need focus
- `prefix + |` and `prefix + -` for intuitive splits
- `prefix + L` to jump to the last session
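The zoom and last-session bindings are tmux defaults, but the split bindings aren't, so they live in my config. A minimal sketch (the `-c` flag keeps the new pane in the current pane's directory, a personal preference):

```tmux
# ~/.tmux.conf: rebind splits to | and -, opening in the current pane's path
bind | split-window -h -c "#{pane_current_path}"
bind - split-window -v -c "#{pane_current_path}"
```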
I pair this with tmuxinator for project-specific layouts. One command and I'm in my full dev environment for a given repo.
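A tmuxinator layout for the editor/shell/logs setup described above might look like this. The project name and log path are hypothetical examples, not my actual config:

```yaml
# ~/.config/tmuxinator/ml-pipeline.yml: names and paths are examples
name: ml-pipeline
root: ~/projects/ml-pipeline
windows:
  - editor: nvim
  - shell:  # empty pane, just a prompt
  - logs: tail -f logs/train.log
```

After that, `tmuxinator start ml-pipeline` brings up the whole environment.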
The Modern CLI Replacements
I've swapped out most of the classic Unix tools for faster, friendlier alternatives:
- eza instead of `ls` — tree views, git status integration, icons. I alias `ls` to `eza --icons --group-directories-first` and `ll` to `eza -la --git`.
- bat instead of `cat` — syntax highlighting, line numbers, git diff markers. It's what `cat` should have been.
- fd instead of `find` — intuitive syntax, respects `.gitignore`, blazing fast. `fd -e py src/` just works.
- ripgrep (`rg`) instead of `grep` — fast enough to search entire monorepos interactively. I use it dozens of times a day.
- zoxide instead of `cd` — it learns your most-visited directories. `z ml-pipeline` jumps me straight to `~/projects/ml-pipeline` no matter where I am.
- fzf — the universal fuzzy finder. I pipe everything through it: git branches, docker containers, command history, kill signals. `Ctrl+R` for history search through fzf is life-changing.
- delta for git diffs — side-by-side, syntax-highlighted diffs in the terminal. Once you see it, plain `git diff` feels broken.
- jq for JSON wrangling — parsing API responses, config files, ML experiment logs. Indispensable.
- htop / btop for process monitoring — I keep btop in a tmux pane when running long training jobs.
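To make the experiment-log case concrete, here's a toy sketch. The file name and the `acc` field are made up for illustration, not a real schema:

```shell
# A toy JSON-lines experiment log; jq -s slurps the lines into one array
printf '%s\n' '{"step":1,"acc":0.81}' '{"step":2,"acc":0.91}' > /tmp/exp.jsonl
jq -s 'map(.acc) | max' /tmp/exp.jsonl   # prints 0.91
```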
Neovim as the Editor
My editor is Neovim with a fairly minimal config built on lazy.nvim. I use Telescope for fuzzy file finding, Treesitter for syntax highlighting, and a handful of LSP integrations. The key is keeping it fast — I want my editor to open instantly, not spin up for 10 seconds like an IDE.
For quick edits I'll still reach for nvim in the terminal. For larger projects, I sometimes open VS Code with the Neovim extension — best of both worlds.
Dotfiles Management
All of this config lives in a bare git repo that I manage with a dotfiles alias:
alias dotfiles='git --git-dir=$HOME/.dotfiles --work-tree=$HOME'

New machine setup is just a clone and checkout away. I've been doing this for years and it's the simplest dotfiles approach I've found — no symlink managers, no extra tools.
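As a sketch of the mechanics, here's the same workflow exercised against a throwaway scratch directory standing in for `$HOME` (the tracked file and identity are illustrative):

```shell
# Simulate the bare-repo dotfiles workflow in a scratch directory
DEMO="$(mktemp -d)"
git init --bare "$DEMO/.dotfiles"
dotfiles() { git --git-dir="$DEMO/.dotfiles" --work-tree="$DEMO" "$@"; }

# Hide untracked files so `dotfiles status` isn't a wall of noise
dotfiles config status.showUntrackedFiles no

# Track a config file exactly like any other git repo
echo 'set -g mouse on' > "$DEMO/.tmux.conf"
dotfiles add "$DEMO/.tmux.conf"
dotfiles -c user.email=demo@example.com -c user.name=Demo commit -m "track tmux config"
dotfiles ls-files   # the tracked file, relative to the work tree
```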
Backing It All Up with Restic
Now for the part that most people skip until they lose something: backups.
I use Restic for all my backups. It's fast, encrypted, deduplicated, and works with basically any storage backend. Here's my setup.
What I Back Up
- Home directory essentials — dotfiles, SSH keys, GPG keys, `~/.config`, shell history
- Project directories — active repos, datasets, experiment configs
- Documents and notes — personal docs, research papers, Obsidian vault
- Machine-specific configs — `/etc` files, cron jobs, systemd units
Where It Goes
I back up to two destinations:
- Local NAS via SFTP — fast, always available on the home network
- Backblaze B2 — offsite, cheap, reliable cloud storage
Restic makes multi-destination backups trivial. Same repo format, same commands, different backend URI.
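For instance, listing snapshots against either destination is the same subcommand with a different `-r` URI (the repository URIs below match my setup; the B2 credential variables are the standard restic ones):

```shell
# Same commands, different backend: only the repository URI changes
restic -r sftp:nas:/backups/ramin-workstation snapshots
restic -r b2:ramin-backups:/ snapshots   # expects B2_ACCOUNT_ID / B2_ACCOUNT_KEY in the env
```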
The Backup Script
I have a simple wrapper script that runs nightly via a systemd timer:
#!/usr/bin/env bash
set -euo pipefail
export RESTIC_REPOSITORY="sftp:nas:/backups/ramin-workstation"
export RESTIC_PASSWORD_FILE="$HOME/.config/restic/password"
# Backup with exclusions
restic backup \
"$HOME" \
--exclude-file="$HOME/.config/restic/excludes.txt" \
--tag "nightly" \
--tag "$(hostname)"
# Prune old snapshots: keep 7 daily, 4 weekly, 6 monthly
restic forget \
--keep-daily 7 \
--keep-weekly 4 \
--keep-monthly 6 \
--prune
# Also push to B2
export RESTIC_REPOSITORY="b2:ramin-backups:/"
restic backup \
"$HOME" \
--exclude-file="$HOME/.config/restic/excludes.txt" \
--tag "nightly" \
--tag "offsite"

The Excludes File
This is crucial — you don't want to back up node_modules, virtual environments, or Docker layers:
node_modules
.venv
__pycache__
.cache
.docker
Downloads
.local/share/Trash
*.pyc
.tox
.nox

Verification
I run restic check weekly and do a test restore every few months. A backup you've never restored from is not a backup — it's a hope.
# Quick integrity check
restic check
# Test restore a specific directory
restic restore latest --target /tmp/restore-test --include "$HOME/.config"

Why Restic Over Alternatives?
I've tried them all — rsync scripts, Duplicity, Borg. Restic wins for me because:
- Deduplication means incremental backups are tiny and fast
- Encryption by default — I don't have to think about it
- Multiple backends — local, SFTP, S3, B2, Azure, GCS all work the same way
- Fast restores — I can mount a snapshot as a FUSE filesystem and browse it
- No special server — the backup destination is just a directory
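The FUSE point deserves a concrete example. A sketch, assuming an initialized repo with `RESTIC_REPOSITORY` and the password file exported, FUSE available, and an arbitrary mount point:

```shell
# Terminal 1: mount the repository (blocks while mounted)
mkdir -p /tmp/restic-mnt
restic mount /tmp/restic-mnt

# Terminal 2: snapshots appear as ordinary directories
ls /tmp/restic-mnt/snapshots/latest/
```

Unmount with Ctrl+C in the first terminal when you're done browsing.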
Wrapping Up
The tools you reach for every day compound. A few seconds saved per command, multiplied across thousands of invocations, adds up to hours. More importantly, a comfortable terminal makes you want to live in it — and that's where the real productivity lives.
And please, set up your backups. It takes an afternoon, and the peace of mind is worth it. Your future self, staring at a dead SSD, will thank you.
If you have CLI tools you swear by that I haven't mentioned, I'd love to hear about them — reach out on GitHub or LinkedIn.