Backing up stuff to the cloud is all the rage, and I wanted to give it a try. Since I don’t really want to spend any money on it, I decided to go with the best free provider I could find. I haven’t done much research, but MEGA’s 50 GB of free storage seems to be the best offer around. They also have a simple CLI tool called megatools (manual).
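A few of the bundled commands, for reference (credentials are read from a ~/.megarc file, described further down):

megadf                      # show total, used and free space
megals /Root                # list the files under the account root
megaput --path /Root file   # upload a single file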

Since I couldn’t find megatools in the official Debian repo, I decided to use it from Docker. There is no official Docker container either, so I just grabbed the one at the top (tomzo/megatools-docker). The author has added two scripts of his own, which may be helpful if you want to see how megatools works (github), but I feel they are too clunky to use.
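To run a one-off megatools command through the container, mount a megarc file (described below) and pass the command. This assumes the image keeps the megatools binaries on its PATH, which the sync script below relies on as well:

docker run --rm -v "${PWD}/megarc:/.megarc" tomzo/megatools-docker megals /Root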

Instead, I created the script below. The goal is to back up all files and folders in /media/stor/backup. Folders are tarred and gzipped first. Since the data should under no circumstances be readable by others, I also decided to encrypt all files with my GPG public key. Just in case. To set it up, you need to create a directory which looks like this:

.
├── megarc
├── subfolders
│   ├── 1
│   ├── 2
│   └── 3
└── sync
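Something like this sets it up (a sketch using the MEGA_DIR path from the script below):

mkdir -p /media/stor/backup-mega/subfolders
cd /media/stor/backup-mega
touch subfolders/1 subfolders/2 subfolders/3
touch megarc sync
chmod 600 megarc   # it will contain your password in plain text
chmod +x sync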

The files in subfolders can have any name and should be empty. They are used for rotating between different subfolders for the backups. In my case, I have the files 1, 2 and 3, so the first backup will be saved to MEGA_FOLDER_NAME/1/, the next to MEGA_FOLDER_NAME/2/, then MEGA_FOLDER_NAME/3/, and after that it is back to overwriting MEGA_FOLDER_NAME/1/. In case I mess up and accidentally delete one of them, I can use the backup from the day before.
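You can peek at which subfolder is next in line with the same pipeline the script uses; the marker file with the oldest modification time wins:

find /media/stor/backup-mega/subfolders -type f -printf '%T+ %p\n' | sort | head -n 1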

The megarc file should have the following data:

[Login]
Username = your@email
Password = yourpassword

The sync file should have the following data. Change your@email (the recipient of the GPG encryption), SOURCE_DIR (the directory to back up from), MEGA_DIR (the path to the directory above) and MEGA_FOLDER_NAME (the folder name on your MEGA drive):

#! /usr/bin/env bash
set -euo pipefail

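# Enable command tracing when the script is called with -v.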
for arg in "${@}"; do
  case "${arg}" in
    "-v") set -x ;;
  esac
done

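# Wipe and recreate the local staging directory.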
clear_sync_dir() {
  if [[ -e "${DEST_DIR}" ]]; then
    rm -r "${DEST_DIR}"
  fi
  mkdir "${DEST_DIR}"
}

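# Tar and gzip a single folder from SOURCE_DIR into the staging directory.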
tar_and_copy() {
  local file="${1}"

  local filename="$(basename "${file}")"
  tar czf "${DEST_DIR}/${filename}.tar.gz" --warning='no-file-ignored' -C "${SOURCE_DIR}" "${filename}"
}

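# Stage everything in SOURCE_DIR: tar folders, copy regular files as-is.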
copy_to_sync_dir() {
  for file in "${SOURCE_DIR}"/*; do
    if [[ -d "${file}" ]]; then
      tar_and_copy "${file}"
    else
      cp "${file}" "${DEST_DIR}"
    fi
  done
}

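# Encrypt every staged file with the public key, then delete the plaintext.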
encrypt_sync_dir() {
  for file in "${DEST_DIR}"/*; do
    gpg -r 'your@email' -e "${file}"
    rm "${file}"
  done
}

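# What to back up, where this setup lives, and where files are staged before upload.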
SOURCE_DIR="/media/stor/backup"
MEGA_DIR="/media/stor/backup-mega"
DEST_DIR="${MEGA_DIR}/data"

clear_sync_dir
copy_to_sync_dir
encrypt_sync_dir

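# Pick the subfolder marker with the oldest modification time to rotate into.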
MEGA_SUBFOLDER="$(find "${MEGA_DIR}/subfolders" -type f -printf '%T+ %p\n' | sort | awk '{print $2; exit}')"
MEGA_FOLDER_NAME="david2-backup/$(basename "${MEGA_SUBFOLDER}")"

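# Update the marker's mtime so the next run rotates to the next subfolder.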
touch "${MEGA_SUBFOLDER}"
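# Delete the old remote subfolder, recreate it, and upload the staged files.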
docker run --rm \
  -v "${MEGA_DIR}/megarc:/.megarc" \
  -v "${DEST_DIR}:/Root/${MEGA_FOLDER_NAME}" \
  tomzo/megatools-docker \
  /bin/bash -c "megarm /Root/${MEGA_FOLDER_NAME}; megamkdir /Root/${MEGA_FOLDER_NAME}; megacopy --no-progress --local=/Root/${MEGA_FOLDER_NAME} --remote=/Root/${MEGA_FOLDER_NAME}"

To make the backup run every night, I added the following to my crontab:

# m h  dom mon dow   command
  0  2   *   *   *    /media/stor/backup-mega/sync > /media/stor/backup-mega/stdout.log

If you don’t redirect the output, you will receive an email telling you which files were uploaded. You might only want that information while trying things out. Note that only stdout is redirected, so any errors on stderr will still reach you by email.
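Restoring is the reverse: download a file, decrypt it and unpack it. A sketch, assuming a backup named somefolder.tar.gz.gpg sitting in the first subfolder and your GPG private key available locally:

docker run --rm \
  -v "/media/stor/backup-mega/megarc:/.megarc" \
  -v "${PWD}:/work" \
  tomzo/megatools-docker \
  megaget --path /work /Root/david2-backup/1/somefolder.tar.gz.gpg
gpg -d somefolder.tar.gz.gpg > somefolder.tar.gz
tar xzf somefolder.tar.gz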