Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: unix

  • stdin: is not a tty

    stdin: is not a tty

    That was the error.

    stdin: is not a tty

    I use rsync to make a backup of my files via this shell script that runs every time I log in to my laptop:

    #!/bin/sh
    
    cd "$(dirname "$0")"
    
    TODAY=$(date)
    echo "
    -----------------------------------------------------
    Date: $TODAY
    Host: example.com
    -----------------------------------------------------\n" > log.txt
    
    echo "Backup files..." >> log.txt
    rsync -aCv --delete --exclude-from 'backup-exclude.txt' -e ssh me@example.com:/home/me/public_html/ public_html >> log.txt
    
    echo "\nEnd Backup. Have a nice day." >> log.txt
    

    It’s a nice little script. It downloads everything into a folder called example.dev which I then use with DesktopServer to have a copy of my site. The database? That’s handled by another file which pulls down the DB backups from Amazon S3 (something built in to cPanel) which I may cover at a later point.

    Today though, let’s talk about what that error is, what it means, how we fix it, and why that fix works.

    The error is caused by having mesg at the top of a .bashrc file on my server. In my case, the line isn’t in my user’s file, but in root’s. The message, on login, tells you when your last login was, where it was from, and what the IP was. It also tells you how many failed logins happened since your last login, a report that amuses me when I sudo into root now and then.

    I get the error because when rsync logs in, the message tries to print to the rsync session, which has no terminal to show it on, and so it errors out. The fix is to tell the shell not to produce that output when the session isn’t interactive. To do that, we put this at the top of the .bashrc file:

    [ -z "$PS1" ] && return
    

    Another option would be this:

    if tty -s; then
       mesg n
    fi
    

    It depends on your flavor of Linux, of course.

    The final question we have is: why does this work?

    The second fix is simple. It checks for a tty, which is short for Teletype. If you’ve ever wondered how deaf people use the phone, it’s via a teletype machine. For the purposes of computers, it just means “This is text and we are going to talk in text interactively.” The tty itself is provided by your terminal of choice. tty -s silently checks whether there’s a terminal attached; if there isn’t one, we skip the mesg call, and the server just doesn’t try to show the message.

    The first fix is a little more weird. PS1 stands for Prompt String 1, the primary prompt your shell shows when it’s waiting for a command (usually your username, host, and current directory). An interactive shell sets PS1; a non-interactive one, like the shell rsync logs in with, doesn’t. Using -z checks whether $PS1 is empty, which is how we tell whether the shell is interactive or not. If it’s empty, return (aka exit out and do nothing else).
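
    If you want to see the two cases side by side, here’s a quick check (it borrows the same me@example.com account from the backup script; nothing about it is load-bearing):

    # In your interactive terminal, PS1 is set:
    echo "PS1 is: ${PS1:-empty}"
    # In the non-interactive shell that rsync/ssh spawns, it's empty, and there's no tty either:
    ssh me@example.com 'echo "PS1 is: ${PS1:-empty}"; tty -s && echo "have a tty" || echo "not a tty"'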

  • Command Line Cleaning WP

    Command Line Cleaning WP

    I’m a huge fan of the scorched-earth cleanup for WordPress. By which I mean when I clean up WP, I rip it out, scrub it, and reinstall. This scares the heck out of people sometimes, and if you’re doing it in a GUI, yeah, it can be sucky and time consuming. Me? I do it in 5-10 minutes, depending on whether my cat wants to be petted.

    I’ve been asked ‘How do you do it that fast?’ so here are my steps for cleaning up WP, with the following assumptions:

    1. I’m working in the folder where WP is installed
    2. wp-config.php is in this folder
    3. WP is in ‘root’ (i.e. I’m not giving WP its own folder)

    If any of those aren’t true for you, adjust the folder locations in the commands:

    Download WP: wget -P ../ http://wordpress.org/latest.zip

    Unzip it: unzip -qq -d ../ ../latest.zip

    Backup DB: wp db export

    Pause. Here I’m using WP CLI, which makes my life way easier. If you’re not, you’ll need something like this: mysqldump --opt --user=username --password=password --host=yourMySQLHostname dbname > domain_com.sql

    Zip up the files I want to backup: zip -r ../domain.zip *.sql wp-config.php .htaccess wp-content/

    Set extended globbing. extglob is scary, I know, but read about it before you dismiss it (if you’re on the Korn shell, you can usually skip this): shopt -s extglob

    Delete files: rm -rf !(wp-config.php|wp-content)
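
    One sanity check I’d suggest before running that (it’s not one of my original steps): preview what the pattern matches, so you can see exactly what rm is about to eat:

    ls -d !(wp-config.php|wp-content)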

    Pause. At this point, it’s probably wise to consider that my hack may be in my theme and/or a plugin. If so, I want to nuke them and JUST keep my uploaded files, so I use this instead…

    Delete files: rm -rf !(wp-config.php|wp-content) wp-content/!(uploads|blogs.dir)

    Pause again. No matter what, I want to scan for evil files, but this way I’m doing it over a much smaller group of them, which matters because leaving hacks behind in themes and plugins is really common. It’s also a good idea to delete every plugin and theme you don’t use. Since you really can’t delete all themes but one on a Multisite, this gets harder. Generally I don’t delete the themes automatically, but instead go in and nuke them one at a time, so I run this…

    Delete files: rm -rf !(wp-config.php|wp-content) wp-content/!(uploads|blogs.dir|themes|mu-plugins)

    Now we can move on, knowing our personal files are clean.

    Copy it back: cp -r ../wordpress/* .

    Clean it up: rm -rf ../wordpress ../latest.zip

    And now you’re done! When it’s time to reinstall plugins and themes, I do it via WP-CLI because it’s faster: wp plugin install NAME and wp theme install NAME

    Then I activate as needed and I’m off to the races. If I deleted my mu-plugins, I copy those back from my backup zip, one at a time, checking each file for hacks.
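
    For the record, here’s the whole flow stitched together as one script. Treat it as a sketch assembled from the steps above (it assumes bash, WP-CLI, and that you’re standing in the WP root), not a battle-tested tool:

    #!/bin/bash
    # Scorched-earth WP cleanup, same steps as above, in one go.
    set -e

    shopt -s extglob                                # needed for the !() patterns below

    wget -P ../ http://wordpress.org/latest.zip     # fresh copy of WP
    unzip -qq -d ../ ../latest.zip

    wp db export                                    # DB backup (swap in mysqldump if you must)
    zip -r ../domain.zip *.sql wp-config.php .htaccess wp-content/

    # Keep the config and content, nuke everything else.
    rm -rf !(wp-config.php|wp-content)
    # Harsher variant from above, if the themes/plugins are suspect:
    # rm -rf wp-content/!(uploads|blogs.dir|themes|mu-plugins)

    cp -r ../wordpress/* .                          # fresh core back in place
    rm -rf ../wordpress ../latest.zip               # tidy up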

    The best thing about this is you can apply the logic to any CMS out there. Just know what you have to delete and keep. The downside? It doesn’t touch your database. Rarely is this an issue for me, except in the case of the Pharma hack. I’ve not had a DB infected yet.

    Do you have a solid methodology for cleaning it up?

  • MySQL – my.cnf

    MySQL – my.cnf

    This is a fairly rare file, and one I never would have found had I not needed to run a standard SQL process via cron.

    Names have been changed to protect the innocent.

    As the story goes, no matter what I did, I could not get this one app to stop spewing out ‘smart’ quotes. You know the fancy apostrophes and quotes that curl? Well, that’s not normally a problem (in WordPress I’d just filter them out), but in this locked-down system I didn’t have that option. I called the vendor, and they said “Make sure you don’t paste in smart quotes.”

    That was all fine and dandy for me, but I’m not the master of the universe like that. Well, not all the time. I had people to input data for me! They were going to have to manually take the forms (Word Docs), filled in by non-techs, and copy the data into the right places in the app. And you want me to tell them they have to fix this for the non-techs? I thought about how much time that would take, and decided the best fix was to change the forms! Right?

    If you’ve ever worked for a major company, you know why this was about as effective as aspirin for a root canal. No deal. So I decided to get inventive.

    The only time these ugly quotes were a problem was when we ran our weekly reports. That was how I found out about it: a manager complained that there was garbage instead of quotes in the form titles. Ergo: All I need to do is script something to clean them out!

    Enter SQL!

    # REPLACE SMART QUOTES WITH STUPID ONES
    # FIRST, REPLACE UTF-8 characters.
    UPDATE `secretapp_table` SET `formtitle` = REPLACE(`formtitle`, 0xE2809C, '"');
    UPDATE `secretapp_table` SET `formtitle` = REPLACE(`formtitle`, 0xE2809D, '"');
    # NEXT, REPLACE their Windows-1252 equivalents.
    UPDATE `secretapp_table` SET `formtitle` = REPLACE(`formtitle`, CHAR(147), '"');
    UPDATE `secretapp_table` SET `formtitle` = REPLACE(`formtitle`, CHAR(148), '"');
    

    In my testing, if I ran that on formtitle, it cleaned it up for the report. This was a default report in the app, by the way, not something I had any control to change. And you wonder why I love open source? Anyhow, once I knew how this would work, I set about scripting it. I couldn’t hook into any triggers on the app, though, because they don’t like to make it easy.

    Fine, I decided. A crontab time it is! I made this simple script to run at midnight, every night, and clean up the DB:

    #! /bin/bash
    
    mysql -h "dbname-secretapp" "secretapp_db" < "quotecleaner.sql"
    

    It worked when I ran it by hand, but it failed when cron’d. This took me some headbanging, but after reading up on how MySQL permissions work, I realized it worked when I ran it as me because I’m me! But cron is not me. I have permission to run whatever I want in my database. Cron does not. Nor should it! So how do I script it? I don’t want the passwords sitting in that file, where anyone with access to the CMS that updates it could read them.

    I went around the corner to my buddy who was a DB expert, and after explaining my situation (and him agreeing that the cron/sql mashup was the best), he asked a simple question. “Who has access to log in as you?” The answer? Just me and the admins. The updating tool for our scripts was all stuff we ran on our PCs that pushed out to the servers, so no one but an admin (me) ever logged in directly.

    He grinned and wrote this down on a sticky note: “.my.cnf”

    Google and a Drupal site told me that it was a file that was used to give the mysql command line tools extra information. You shove it in the home directory of the account, and, well, here’s ours:

    # Secret App user and password
    # options have to live under a group header; [client] covers the mysql command line tools
    [client]
    user=secretapp_user
    password=secretapp_password
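
    Two housekeeping notes that weren’t on the sticky: the file holds a password, so lock it down to your own account, and the crontab line for “midnight, every night” looks something like this (the paths and script name here are placeholders, not the real ones):

    chmod 600 ~/.my.cnf    # only the account owner gets to read the password

    # m h dom mon dow  command
    0 0 * * * /home/secretapp/quotecleaner.sh >> /home/secretapp/quotecleaner.log 2>&1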
    

    The only reason I even remembered all this was because an ex-coworker said he ran into the documentation I left explaining all of this, and was thankful. He had to have it scan the body of the form now, because the managers wanted that in the report too!

  • grep vs ack

    grep vs ack

    I do a lot of things by command line. Still. It’s faster, it’s easier, and in many cases, it gives me more control. And as I always mention, people who use the command line are people who are really lazy. We don’t like sixteen clicks. If we can copy/paste and change one thing, we’re happy.

    For ages, when I wanted to search my local repository of plugins, I’d whip out something like this:

    grep -R "base64" ~/Development/WP-Plugin-Dir/* > ~/Development/WP-Plugin-Greps/base64-grep.txt

    This works, but it’s slow and it’s not very pretty. The file output is a mess and it’s painstaking to sort through and understand.

    /home/me/Development/WP-Plugin-Dir/jetpack/class.jetpack-post-images.php:                ob_start(); // The slideshow shortcode handler calls wp_print_scripts and wp_print_styles... not too happy about that
    /home/me/Development/WP-Plugin-Dir/jetpack/modules/comments/comments.php:                ob_start();
    /home/me/Development/WP-Plugin-Dir/jetpack/modules/contact-form/grunion-contact-form.php:                ob_start();
    /home/me/Development/WP-Plugin-Dir/jetpack/modules/custom-css/custom-css.php:            ob_start('safecss_buffer');
    /home/me/Development/WP-Plugin-Dir/jetpack/jetpack.php:                  ob_start();
    /home/me/Development/WP-Plugin-Dir/jetpack/jetpack.php:          ob_start();
    

    On the other hand, there’s this:

    ack --php 'ob_start' ~/Development/WP-Plugin-Dir/ > ~/obstart.txt

    That actually gives a rather similar output:

    /home/me/Development/WP-Plugin-Dir/jetpack/class.jetpack-post-images.php:36:               ob_start(); // The slideshow shortcode handler calls wp_print_scripts and wp_print_styles... not too happy about that
    /home/me/Development/WP-Plugin-Dir/jetpack/modules/comments/comments.php:138:              ob_start();
    /home/me/Development/WP-Plugin-Dir/jetpack/modules/contact-form/grunion-contact-form.php:264:              ob_start();
    /home/me/Development/WP-Plugin-Dir/jetpack/modules/custom-css/custom-css.php:350:          ob_start('safecss_buffer');
    /home/me/Development/WP-Plugin-Dir/jetpack/jetpack.php:872:                        ob_start();
    /home/me/Development/WP-Plugin-Dir/jetpack/jetpack.php:928:                ob_start();
    

    That’s ack, which claims to be better than grep, and I’m kind of agreeing. Let’s look at the small differences.

    • Line numbers, which will help me find the code later.
    • Only searches PHP files.
    • Recursive by default.
    • Ignores SVN and other similar folders.

    How do you search only PHP files in grep?

    grep pattern $(find . -name '*.php' -or  -name '*.phpt' -or  -name '*.php3' -or  -name '*.php4' -or  -name '*.php5' -or  -name '*.phtml' )
    

    Right. Like I’m going to remember that.
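
    (To be fair, newer versions of GNU grep can get most of the way there with --include, which is far easier to remember, though ack’s defaults still win for me:)

    grep -rn --include='*.php' 'ob_start' ~/Development/WP-Plugin-Dir/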

    And we can make ack better. Let’s ignore a folder:

    ack --ignore-dir=akismet 'string'
    

    How about customizing my output so I can check how often a plugin is doing_it_wrong()?

    ack --php --group 'ob_start' ~/Development/WP-Plugin-Dir/ > ~/obstart.txt
    

    That’s a little easier to read.

    /home/me/Development/WP-Plugin-Dir/jetpack/modules/custom-css/custom-css.php
    350:            ob_start('safecss_buffer');
    
    /home/me/Development/WP-Plugin-Dir/jetpack/jetpack.php
    872:                    ob_start();
    928:            ob_start();
    

    Just want a list of the filenames?

    ack --php -l 'ob_start' ~/Development/WP-Plugin-Dir/ > ~/obstart.txt
    

    Or what if I want to search all instances of ob_start() in jetpack/jetpack.php? You can make ack sit up and beg.
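
    For that, you can just point it at the file directly, no extra flags needed (one way to do it, anyway):

    ack 'ob_start' ~/Development/WP-Plugin-Dir/jetpack/jetpack.php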

    You can see right away that ack is a lot more powerful when it comes to quickly using the data without a lot of parsing. There are some catches with ack, though. It has a whitelist of file types that it will search, so if you don’t tell it to search .html files, it won’t. That’s a small price to pay for me.

    The documentation is written in nerd, so I generally find looking at concrete examples is more helpful. Do you have tricks with ack (or even grep) that save you time and money?

  • Command Line Mac Trash Tricks

    Command Line Mac Trash Tricks

    Warning! I’m going to talk about the ‘rm’ command, which is a super-deadly command in the Linux world. No matter what, never ever ever consider running it unless you’re certain you know what it does!

    I review a lot of plugins, which means I download them all to my laptop, review all the code, possibly install them, and then delete. This means, on any given week, I have 5000 items in my trash. And this is without unzipping! (Yes, we get a lot of plugins, and TextWrangler lets me review most of them in their zips.)

    When I forget to empty my trash every day, I end up waiting hours for the GUI empty to run unless I use rm -rf from inside the ~/.Trash/ folder. The real command is this:

    $ rm -rf ~/.Trash/*
    

    I like this because it’s crazy fast compared to the GUI.

    But sometimes I actually just want to commandline my trash. I’ll be banging on things in Terminal and a very simple ’empty trash’ command would be nice, right? OSX Trash lets me type trash -l to see what’s in my trash, and trash -e to run the normal empty command. It’s better than a lot of other scripts, because if I type trash filename and there’s already a file with that name in the trash, it behaves the way the Mac normally does. That is, it’ll rename my second file ‘filename date’ and I won’t have file conflicts!
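
    Day to day, that looks like this (the filename is just an example):

    $ trash old-plugin.zip    # move a file to the Trash, Finder-style
    $ trash -l                # list what's in the Trash
    $ trash -e                # run the normal empty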

    The only thing it’s missing is a ‘trash -p’ command, which would let me run the force rm and just dump it all. Yes, I know rm works, but if you’ve ever typed it in the wrong window, you know why it’s a terrifying command. Still, back to the age-old rm commands: what happens when you get that annoying locked-file error? Like me, you probably kvetch about having to quit everything just to delete.

    More command line magic!

    $ cd ~/.Trash
    $ chflags -R nouchg *
    $ rm -rf *
    

    Finally, to bring this full circle, I made a dead simple alias to prevent me from fat-fingering the rm too much:

    alias trashdump='rm -rf ~/.Trash/*'
    

    Fast, efficient, and potentially deadly, but less so than manually typing it in all the time. It deleted 2000 files in seconds, versus minutes.

  • Passwordless SSH

    Passwordless SSH

    I’m incurably lazy, and as we all know, lazy techs like to automate (ltla?).

    I ssh a lot into my personal servers, and I get tired of having to type ssh account@server.com, and then enter my password. So I got smart.

    Since I’m on a Mac, the first thing I did was grab iTerm2. This lets me create simple profiles so that, with a click, I can log in to any of my servers. When I was using Windows, I used PuTTY and the Connection Manager add-on. (The real PuTTY CM site is gone, and binarysludge just keeps a copy on hand for the same reasons I do. You never know when you need it. Mine’s in my Dropbox storage.)

    What I really loved about PuTTY CM was that I could fill the pref file with my accounts and passwords, and then one-click connect to any of my servers. This was at the bank job, where I had a couple hundred servers to do this with, and when I had to change my password, I could search/replace that file. I know, it’s not secure. At DreamHost, I had the same, but they scripted it so I could sudo in with a handy call that I’m in love with. As long as I remember my password, I’m fine. But see, I told you, I’m horribly lazy and I hate having to log in with my password, then sudo again with my password.

    The first step for this is to make an RSA key pair. This is a fancy way of telling both computers to trust each other, so on your personal computer (we’re assuming Linux here), go to your home folder and type this:

    [Laptop] $ ssh-keygen -t rsa

    You’ll be presented with a series of informative notes and questions. Accept all the defaults, and keep your passphrase empty.

    Generating public/private rsa key pair.
    Enter file in which to save the key (/home/ipstenu/.ssh/id_rsa): 
    Created directory '/home/ipstenu/.ssh'.
    Enter passphrase (empty for no passphrase): 
    Enter same passphrase again: 
    Your identification has been saved in /home/ipstenu/.ssh/id_rsa.
    Your public key has been saved in /home/ipstenu/.ssh/id_rsa.pub.
    The key fingerprint is:
    3e:4f:05:79:3a:9f:96:7c:3b:ad:e9:58:37:bc:37:e4 ipstenu@[Laptop]
    

    This saves your public ‘key’ in the .ssh folder (yes, it’s a folder).

    Now we have to set up the server (halfelf.org, for example):

    [Laptop] $ ssh myaccount@halfelf.org mkdir -p .ssh
    myaccount@halfelf.org's password: 
    

    This will SSH into halfelf as ‘myaccount’ and create a folder called .ssh. You only need to do this once, so after you set up the key for one computer, you can skip this the next time.

    Finally we’re going to append the public key from my laptop over to HalfElf, so it trusts me:

    [Laptop] $ cat .ssh/id_rsa.pub | ssh myaccount@halfelf.org 'cat >> .ssh/authorized_keys'
    myaccount@halfelf.org's password: 
    

    The reason we’re appending is so that if I decide I want to add my Work Laptop, I can just make the key, and then repeat that last command and it will add it to the bottom, trusting both.
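
    Two asides that aren’t part of my setup above but can save you some head-scratching: most Linux boxes ship ssh-copy-id, which does that key-append dance in one step, and sshd will quietly ignore authorized_keys if the permissions on .ssh are too loose:

    [Laptop] $ ssh-copy-id myaccount@halfelf.org
    [Laptop] $ ssh myaccount@halfelf.org 'chmod 700 .ssh; chmod 600 .ssh/authorized_keys'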

    There’s a caveat here, which caught me last week. I set everything up for my new server, ElfTest, and then moved the server to a VPS. The IP changed, so the stored host key was no longer valid. You see, every time you connect to a server for the first time, it asks you to trust it. If anything about that fingerprint changes, you have to re-trust it. This is annoying:

    The authenticity of host 'elftest.net (111.222.333.444)' can't be established.
    RSA key fingerprint is f3:cf:58:ae:71:0b:c8:04:6f:34:a3:b2:e4:1e:0c:8b.
    Are you sure you want to continue connecting (yes/no)? 
    

    After you respond “yes” the host gets stored in ~/.ssh/known_hosts and you won’t get prompted the next time you connect. When it became invalid, I had to go edit that file and delete the entry for elftest (it’s partly human readable, so it wasn’t too bad).
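
    (If editing known_hosts by hand sounds tedious, ssh-keygen will do it for you; this removes every stored key for that host:)

    ssh-keygen -R elftest.net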

    If you hate this as much as I do, and you feel you’re immune to man-in-the-middle attacks, there’s a nifty command:

    ssh -o "StrictHostKeyChecking no" user@host

    This turns off the key check. Generally speaking? Don’t do this. I’ve actually only done it once. (This was at the bank, where I was behind so many firewalls that if you’d gotten to my computer, I was in trouble anyway.)