Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: backup

  • cPanel ❤️ DreamObjects


    This is something I’ve wanted for a long time. I opened a ticket with cPanel about it yonks ago as Ceph storage is offered by more than just Amazon, and yet cPanel was making it super hard to use for backups.

    Well, in the next release of cPanel, this will no longer be the case! If you’re on version 74 (which is in the release tier, but not current, so most people don’t have it yet), you can do this.

    Add A New Backup Option

    Go to Home > Backups and open up the settings.

    In there, you can add a new Backup option. Pick S3 Compatible:

    Backing up from cPanel to DreamObjects will soon be a reality.

    Configure for DreamObjects

    Now just throw in the right data:

    You’ll want to use objects-us-east1.dream.io for the endpoint, and then your bucket and keys.
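
    If you want to sanity-check the endpoint, bucket, and keys before handing them to cPanel, the AWS CLI works against any S3-compatible storage. A rough sketch (the bucket name and keys below are placeholders, not real values):

    # Hypothetical check that the keys and endpoint are valid before cPanel uses them.
    export AWS_ACCESS_KEY_ID="YOUR_DREAMOBJECTS_KEY"
    export AWS_SECRET_ACCESS_KEY="YOUR_DREAMOBJECTS_SECRET"
    aws s3 ls s3://your-backup-bucket --endpoint-url https://objects-us-east1.dream.io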

    Back it Up

    And with that you’re done. Thank you, cPanel!

  • Local Backups


    You heard about CodeSpaces, didn’t you?

    On June 17th they got hit with a DDoS. It happens. On June 18th, the attacker deleted their data. And the backups. Because the backups were on the same server… You can read the story here and make up your own mind.

    But that brings us to this. Are you making your own, personal, backups?

    My server makes entire-server backups every day and colocates them, but I also keep my own backups of my own, personal, data. Not my email. If that blew up today I would lose nothing I can’t get back. That’s right, I don’t keep much email. If it’s important, I store it on my laptop, on iCloud or Dropbox, and I back up my laptop to Time Machine. Oh, and I check that backup regularly.

    So how do I back up my sites? It’s threefold.

    Clouds and Fences

    On the Server

    I use a DB backup script from Daniel D Vork. He backs up his files to Dropbox, which is cool, but in my case the script lives on my server and stores its dumps in a non-web-accessible folder called ‘backups’:

    #!/bin/bash

    USER="your_user"
    PASSWORD="your_password"
    OUTPUT="/Users/YOURUSERNAME/backups"

    # Clear out the previous day's dumps (the glob has to sit outside the quotes to expand).
    rm -f "$OUTPUT"/*.gz > /dev/null 2>&1

    databases=`mysql --user=$USER --password=$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

    # Dump and gzip every database except information_schema and anything starting with _.
    for db in $databases; do
        if [[ "$db" != "information_schema" ]] && [[ "$db" != _* ]] ; then
            echo "Dumping database: $db"
            mysqldump --force --opt --user=$USER --password=$PASSWORD --databases "$db" > "$OUTPUT"/`date +%Y%m%d`."$db".sql
            gzip "$OUTPUT"/`date +%Y%m%d`."$db".sql -f
        fi
    done
    

    That script is called every day at midnight via a cron job.
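
    For reference, the cron job is nothing fancy. Something like this in the crontab does it (the path is an example; point it at wherever your copy of the script actually lives):

    # Run the database dump script at midnight every day.
    0 0 * * * /home/ipstenu/backups/db-backup.sh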

    Bring it local

    On my laptop, under the ~/Sites/ folder, I have a folder for each domain. So there’s one for ipstenu.org (which is where this site lives), and in there are the following:

    backup-exclude.txt       backup.sh          log.txt
    public_html/
    

    The public_html folder is a full backup of my site files. It’s not that crazy, don’t panic.

    The backup.sh file does an rsync:

    #!/bin/sh
    
    cd $(dirname $0)
    
    TODAY=$(date)
    echo "
    -----------------------------------------------------
    Date: $TODAY
    Host: ipstenu.org hosted sites
    -----------------------------------------------------\n" > log.txt
    
    echo "Backup files..." >> log.txt
    rsync -aCv --delete --exclude-from 'backup-exclude.txt' -e ssh backups@ipstenu.org:/home/ipstenu/public_html/ public_html >> log.txt
    
    echo "\nBackup databases..." >> log.txt
    rsync -aCv --delete --exclude-from 'backup-exclude.txt' -e ssh backups@ipstenu.org:/home/ipstenu/backups/ databases >> log.txt
    
    echo "\nEnd Backup. Have a nice day." >> log.txt
    

    ‘Backups’ is not actually the name of the account, but I do have a backup-only account for this. The backup-exclude.txt file it calls lists folders like ‘cache’ or ‘mutex’ so I don’t accidentally back them up! It’s simply each file or folder name that I don’t want to back up, one per line. And yes, I like pretty output in my logs so I can read them when I’m having a brainless moment.
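
    For the curious, the exclude file is nothing more than something like this (cache and mutex are the real entries; the rest are examples):

    # rsync ignores blank lines and lines starting with # or ;
    cache
    mutex
    tmp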

    The cd $(dirname $0) at the beginning is so that I can call this from other folders. Remember! If your script uses relative paths to access local resources, it will break if you call it from another folder. The reason this matters will become clear in the next section.

    Automate that shit!

    I’m on a Mac. I decided I wanted that backup to run every time I logged in to my computer. Not rebooted, logged in. And waking from sleep. That became problematic, but let’s get into this code.

    Writing the scripts

    First I made a new folder called ~/Development/backups, and that’s where I stash this code. In there I have a couple of files. The first is website-backup.sh:

    #!/bin/sh
    
    /Users/ipstenu/Sites/ipstenu.org/backup.sh
    /Users/ipstenu/Sites/othersite.net/backup.sh
    

    Basically, every site I want to back up gets a line in there. This is why I have the change-directory command at the top of each backup script.

    The other file is my launchd file, called com.ipstenu.website-backups.plist, and I got this code from Stack Exchange:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
       <key>Label</key>
       <string>com.ipstenu.website-backups</string>
       <key>Program</key>
       <string>/Users/ipstenu/Development/backups/website-backup.sh</string>
       <key>RunAtLoad</key>
       <true/>
    </dict>
    </plist>
    

    Instead of copying the file, though, I did a symlink:

    ln -sfv /Users/ipstenu/Development/backups/com.ipstenu.website-backups.plist ~/Library/LaunchAgents
    

    This lets me change it if I need to, which I doubt I will. I’ll just edit that .sh script. The filename of the plist is intentional to tell me what the heck it is.

    But wait, what about waking from sleep? Logging in from a sleeping computer is not the same as a login on a Mac, and for some reason there’s no built-in tool to monitor sleep and wake. There are apps that can do it, but there’s also SleepWatcher, which can be installed via Homebrew! Since I’m running an rsync, it’s not a big deal to run multiple times a day. Heck, it may actually be faster.

    First we install SleepWatcher:

    brew install sleepwatcher
    

    Now SleepWatcher looks for user scripts named ~/.sleep and ~/.wakeup, which sure makes my life easier. My ~/.wakeup file calls website-backup.sh, and while I could have it repeat the code, I chose not to for a reason. I know my backup scripts will live in ~/Development/backups/, so I can add a new one for something else without messing around with more than one file.
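
    The ~/.wakeup file itself is tiny. Mine boils down to something like this (a sketch; yours just needs to call whatever you want run on wake, and it has to be executable, so chmod +x it):

    #!/bin/sh

    # Called by SleepWatcher on wake; hand everything off to the real backup runner.
    /Users/ipstenu/Development/backups/website-backup.sh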

    Do you remember launchd a moment ago? We want to use that again to tell SleepWatcher it’s okay to run on startup or login. This time, since we’re only using SleepWatcher for sleep and wake, we can symlink the sample files to the proper LaunchAgents directories. In my case, it’s only running for me, so it’s all local:

    ln -sfv /usr/local/Cellar/sleepwatcher/2.2/de.bernhard-baehr.sleepwatcher-20compatibility-localuser.plist ~/Library/LaunchAgents
    

    If you’re interested in doing more with sleepwatcher, read Mac OS X: Automating Tasks on Sleep by Kodiak.

    Finally we’re going to load both of these commands into launchctl:

    launchctl load ~/Library/LaunchAgents/com.ipstenu.website-backups.plist
    launchctl load ~/Library/LaunchAgents/de.bernhard-baehr.sleepwatcher-20compatibility-localuser.plist
    

    Now every time I log in on my laptop, it runs a backup, be that a real login, or a wake-from-sleep one.
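
    If you want to confirm both agents actually loaded, launchctl can list them (the grep just filters for the labels used above):

    launchctl list | grep -e website-backups -e sleepwatcher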

    And remember, this is on top of my full server backups and my personal git repository for my code, so I have my data backed up in the important places. Everything on my laptop is backed up to Time Machine, so really I can just look back a year or three and find that HTML file I used once.

    The other thing I do is check these backups pretty regularly. I’ve scheduled a day every month to check that everything’s working right, that the files are restorable, and that I feel secure. Thus far, the most I’ve lost has been 16 hours of work on a wiki.
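
    That monthly check doesn’t have to be fancy, either. Something along these lines covers the basics (a sketch; the paths match the layout above, so adjust to taste):

    # Confirm every gzipped dump is at least readable (gzip -t only tests integrity).
    gzip -t ~/Sites/ipstenu.org/databases/*.gz && echo "Database dumps look OK"

    # Dry-run the rsync with checksums to spot any drift between server and local copy.
    rsync -aCvn --checksum -e ssh backups@ipstenu.org:/home/ipstenu/public_html/ ~/Sites/ipstenu.org/public_html/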

  • Sometimes The Answer Sucks


    I’m good at what I do. I’m really good. I’m an expert, and I’ve rarely run into a WordPress site I couldn’t fix, or at least get back to usable. This doesn’t mean I can code everything, but it means I can take a broken site and get your content back. I can’t, however, perform miracles all the time. You saw how I said ‘rarely’ right?

    The real issue here is that sometimes the answer I give people is a horrible, terrible, sucky answer.

    While Montgomery Scott always saved the day by giving the engines more power, and skipping through the Jefferies Tube like the most bad-ass red shirt in existence, the sad reality of life is that sometimes we can’t save your website. If we can’t figure out why it broke, we may not be able to fix it.

    For example, a site suddenly lost all the plugin settings. They were just gone. Poof. No one had done anything, so the obvious cause is the database having a snafu, right? Well no. The DB was checked, everything seemed in order. We tried a restore, no-go. At that point, the only thing I could tell the person was to re-apply all the changes again, manually. The user was pissed off and it’s totally understandable why! I was pissed off. I couldn’t solve a problem and yes, when I can’t solve things, I get very upset with myself. And I was upset that the answer was so sucky! Redo your hard work? What a crock! But no matter what I did, no matter how I tried to pull the settings back, I was just getting further and further down that rabbit hole, and I knew I absolutely had to cut my losses.

    In all likelihood, someone did something without checking it was right and without making a backup first. This happens. We know we shouldn’t mess with ‘production’ but we all do it. So that means sometimes we’re really reckless and we shoot ourselves in the foot without protection. While we can, and do, try really hard not to be stupid anymore, accepting that you (or perhaps your captain) have made a boneheaded mistake is really important. Equally so? Accepting that cleaning up that mistake may not be the answer we wanted to hear.

    No one wants to hear ‘Start over.’ That’s pretty much a given. And yet we’ve all done it before. When I studied music, the number of times I had to start over because I’d made a mistake is uncountable. When I was learning to connect pipes in plumbing? Oh I ripped things out a hundred times before getting it right. I even restarted this entire blog post a couple times. And that’s not the only time the answer sucks. You changed user roles and capabilities and now you can’t log in? Congratulations, you get to reset them and start over.

    I could go on with example after example of things we do, without realizing how dangerous they are, and how much trouble they get us in, but I suspect the point is made. We do amazing things to ourselves and can’t always fix them. Should you be upset when it happens to you? Of course. And should you be annoyed when you didn’t do anything and they break? You bet! But ….

    Your website is like a car that’s always running. Eventually something is going to break, and when it does, the only hope you have of salvation are your backups. Everything really comes back to that, doesn’t it? I deleted the wrong table in a DB and had to restore the whole thing from the day before. Lost a day’s work. Nothing to be done to fix it but that. I had a file, for no reason I could see, go corrupt and refuse to let me edit it. Thankfully the backup was the version I wanted to edit, so I deleted and re-uploaded and moved on.

    These things will happen.

    The answer will suck.

    Decide if you’d rather spend your time complaining about how it’s sucky, or if you want to knuckle down and get to work.

    Scotty and Scotch

    Or drink scotch.

  • Backup Where You Belong

    I’ll make it quick: At the end of the day, there’s only one person who’s responsible for your backups, and that’s you.

    Here’s the deal. WordPress does not have a 100% backup-everything tool. Neither does Drupal nor Joomla. (All three of the big guys have plugins that can do this, don’t worry, I’ll get to that in a minute.) In fact, I don’t know of any app on the web that does. Even though Google says “You own your data!”, if you use their tool to download everything, it’s not in a form you can just slap back on the web. Their backups remain tied to themselves. You get the data, but then you have to parse it.

    This brings up a bigger question, though. What is the point of a backup? In a worst-case world, your backup is to save your bacon for when you screw something up. It’s to restore from a crisis or to roll back a bad change. So why aren’t these sorts of things built into applications?

    When you think about it, they’re not built into any application. From Microsoft Word to your favorite Twitter app, if an upgrade breaks something, there’s no ‘roll back’ option. You can uninstall and reinstall, but most of the time that means you have to reconfigure all your favorite settings. (This is actually why I try to make as few special config changes as possible.) Yes, in Microsoft Office you can save your ‘document’ in total, but that isn’t a direct analog to web publishing, because there’s far more than ‘just’ your book; there are all those settings and preferences. If you’ve ever tried to copy someone’s preferences and settings from one computer to another (and you’re not on a Mac, which makes that shockingly easy), you know what I mean.

    The best backup tools are things like Microsoft’s cloud backup, or Apple’s Time Machine. Both make a massive copy and then incremental updates to your entire computer. They are, as we say, OS (Operating System) backups. All your documents, all your applications, all your settings, are backed up. No individual application has ever bothered with this so why should a web app?

    The argument goes that you should be able to pick up your web app and put it down on another server via exporting. I can think of one app off the top of my head that can do that: cPanel. I’ve never tried it myself, but I’ve been told it works pretty well. Still, cPanel actually falls under the weird realm of operating systems, as it’s really a server management tool. It’s where you logically expect to see things like your backup tools, DB access, etc etc and so on and so forth.

    In short, it’s the right place to see your backups made.

    How do you back up a web app?

    Step 1) Back up ALL the files on your server.
    Step 2) Back up all your databases.

    That’s kind of it. For most well written apps (WordPress, Drupal, etc) to ‘restore’ these backups, you just copy them back up. For the database stuff, you may need to make a second database and edit your files to point to that DB instead of the original, but it’s pretty fast. Professionally, we have one-click rollbacks installed on databases, but even then, we tend to go ‘Oh, that didn’t work.’ and rename the NEW DB (databasename_date_BAD) and re-upload the old one. Why? Because it works. When the DBs are too big, we have incremental backups and rollbacks set up. Files ditto (actually for files, we ALWAYS have step one ‘make a copy of the old folder structure…’ and the rollback is just renaming things).
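
    As a concrete sketch of that second-database trick for a WordPress site (the database names, user, and dump filename here are made up; DB_NAME is the standard wp-config.php constant):

    # Load the old dump into a brand-new database instead of over the live one.
    mysql -u admin -p -e "CREATE DATABASE wpsite_restore;"
    gunzip < 20140618.wpsite.sql.gz | mysql -u admin -p wpsite_restore

    # Point WordPress at the restored copy; the broken DB stays put in case you need it.
    sed -i.bak "s/define( *'DB_NAME'.*/define('DB_NAME', 'wpsite_restore');/" wp-config.php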

    We rarely rely on the applications themselves to perform these tasks for one simple reason: They’re BAD at it.

    I’ve always been an advocate of the right tool for the right job. A web app is good at its job. A backup tool is good at its job. The two may cross, but there’s nothing wrong with using a backup tool to make backups and a writing tool to write. I don’t use any plugins on my apps to make backups, I do it via the tools built into my server that are, expressly, for making backups. Ditto my computers at home. I know what the right tool is, and I use it.