Revamp Infrastructure

13/04/2020 17:11 BST

I decided to revamp my existing infrastructure into something I can easily manage, with proper backups and checkpoints. I was running the now discontinued Antergos, which is based on Arch Linux. Arch is a brilliant operating system, but it’s not very suitable for the enterprise, which tends to prefer mature, stable applications; with Arch Linux you always get the latest version of everything, which is too bleeding-edge for enterprise use, especially for databases. So I thought it would be better to set up new infrastructure.

I decided to replace Antergos with Microsoft Hyper-V Server 2019, a cut-down version of Windows Server that ships with Hyper-V and nothing else. Once I got it up and running, managing the server was a walk in the park.

[Screenshot: Hyper-V Manager]

As you can see in the screenshot, I have set up four virtual machines. ArchApp runs on Arch Linux and serves as the user-facing production application server; it’s where the code base for this website is hosted. ArchPortal is the backdoor that lets the administrator (that’s me) get in behind the firewall from outside the premises and manage the other servers on the network. The last two, UbuntuDev and UbuntuProd, are the data servers, one for development and one for production. They both run Ubuntu 18.04 LTS (long-term support) with Mongo, Postgres and Redis installed, all locked down to a specific version; I’ll only upgrade them when I’m ready to do so.
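
Getting from outside the premises to one of the data servers means hopping through ArchPortal first. As a rough sketch of what that looks like (the hostname and user names below are placeholders, not my actual setup):

#!/bin/bash
# Hop through ArchPortal (the only machine reachable from outside) to reach an
# internal server. portal.example.com, admin and ubuntuprod are placeholders.
ssh -J [email protected] admin@ubuntuprod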

I did try to use Docker and Podman, but they kept breaking my development server whenever I ran a backup. They ran fine on my production server, but I decided to stop using them anyway, as I find them very difficult to monitor and they probably wouldn’t get used by the enterprise, especially Docker. The enterprise just prefers something that is easy to monitor and does not break down too easily.

I was able to run a backup on both UbuntuProd and UbuntuDev without any issues. As Mongo and Postgres run directly on the virtual machines, running the backup was easy: all I had to do was create two scripts, one on the server and one on the client.

Server-Side Script

#!/bin/bash
# Work from the directory this script lives in.
cd "$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null && pwd )"
# Dump MongoDB into ./dump and every Postgres database into postgres.sql.
mongodump
sudo su - postgres -c "pg_dumpall" > postgres.sql
# Bundle both dumps into a timestamped archive one level up, then clean up.
tar -zcvf "../backup-$(date '+%s').tar.gz" ./
rm -rf dump
rm postgres.sql
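
Restoring isn’t covered by these scripts, but as a rough sketch (assuming a backup archive has already been extracted into the current directory, so ./dump and ./postgres.sql exist), it would look something like this:

#!/bin/bash
# Hypothetical restore sketch, run from the directory the archive was extracted into.
# Restore every MongoDB database from the ./dump directory created by mongodump.
mongorestore dump
# Replay the full Postgres dump (roles, databases and data); postgres.sql must be
# readable by the postgres user.
sudo -u postgres psql -f postgres.sql postgres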

Client-Side Script

#!/bin/bash
# Work from the directory this script lives in.
cd "$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null && pwd )"
# Run the server-side backup script, copy the resulting archive down,
# then remove the archive from the server.
ssh [email protected] "~/backup/script"
scp "[email protected]:~/backup-*.tar.gz" .
ssh [email protected] "rm -rf ~/backup-*.tar.gz"

It just backs up everything in Postgres and Mongo, including the files stored in GridFS, just like that. 🙂
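
The client-side script could also be scheduled with cron instead of being run by hand. A minimal sketch, assuming the script were saved at /home/admin/backup/client-backup.sh (a placeholder path), added via crontab -e:

# Hypothetical crontab entry: pull a fresh backup every night at 02:00.
0 2 * * * /home/admin/backup/client-backup.sh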
