Half-Elf on Tech

Thoughts From a Professional Lesbian

Author: Ipstenu (Mika Epstein)

  • Git Subtrees

    Git Subtrees

    I have a project in Hugo where I wanted the content to be editable by anyone but the theme and config to remain mine. In this way, anyone could add an article to the new site, but only I could publish. Sounds smart, right? The basic concept would be this:

    • A private repository, on my own server, where I maintained the source code (themes etc)
    • A public repository, on GitHub or GitLab, where I maintained the content

    Taking into consideration how Hugo stores data, I had to rethink how I set up the code. By default, Hugo has two main folders for your content: content and data. Those folders are at the main (root) level of a Hugo install. This is normally fine, since I deploy by having a post-deploy hook that pushes whatever I check in on master out to a temp folder and then runs a Hugo build on it. I’m still using this deploy method because it lets me push a commit without having to build locally first. Obviously there are pros and cons, but what I like is being able to edit my content, push, and have it all work from my iPad.
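    For the curious, here’s a minimal sketch of what a hook like that can look like, written as a standard git post-receive hook on the server. The paths are placeholders, not my actual setup:

    #!/bin/sh
    # Runs on the server after a push; all paths here are examples only.
    GIT_DIR_PATH="$HOME/repos/hugo-library.git"
    BUILD_DIR="$HOME/tmp/hugo-build"
    PUBLIC_DIR="$HOME/public_html"

    # Check out the latest master into a temp folder...
    mkdir -p "$BUILD_DIR"
    git --git-dir="$GIT_DIR_PATH" --work-tree="$BUILD_DIR" checkout -f master

    # ...then run a Hugo build straight into the web root.
    hugo --source "$BUILD_DIR" --destination "$PUBLIC_DIR"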

    Now, keeping this setup, in order to split my repository I need to solve a few problems.

    Contain Content Collectively

    No matter what, I need to have one and only one location for my content. Two folders inside it are fine, but everything has to live within a single folder. Doing that is fairly straightforward.

    In the config.toml file, I set two defines:

    contentdir = "content/posts"
    datadir = "content/data"
    

    Then I moved the files in content to content/posts and moved data to content/data. I ran a quick local test to make sure it worked and, since it did, pushed that change live. Everything was fine. Perfect.

    Putting Posts Publicly

    The second step was making a public repository ‘somewhere.’ The question of ‘where’ was fairly simple. You have a lot of options, but for me it boils down to GitLab or GitHub. While GitHub is the flavor du jour, GitLab lets you make a private repository for free, but both require users to log in with an account to edit or make issues. Pick whichever one you want. It doesn’t matter.

    What does matter is that I set it up with two folders: posts and data

    That’s right. I’m replicating the inside of my content folder. Why? Well, that’s because of the next step.

    Serving Subs Simply

    This is actually the hardest part, and it led me to complain that every time I use Submodules in Git, I remember why I hate them. I really want to love Submodules. The idea is that you check out a module of a specific version of another repository and now you have it. The problem is that updates are complicated. You have to update the Submodule separately, and if you work with a team and one person doesn’t, there’s a good chance you’ll end up pushing the old version of the Submodule, because your repository only stores a pointer to a specific commit and not the files themselves.
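    For context, the basic Submodule flow looks something like this (the repository URL is the same content repository I use later; the sync step is the one people forget):

    $ git submodule add git@github.com:ipstenu/hugo-content.git content
    $ git commit -m "Add content submodule"

    # Every collaborator has to remember this to pick up new commits:
    $ git submodule update --init --remote

    Everyone on the team has to remember that last command, every time, or they keep building against the old pointer.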

    It gets worse if you have to solve merge conflicts. Just run away.

    On the other hand, there’s a tool called Subtree, which two of my Twitter friends introduced me to after I tweeted my Submodule complaint. Subtree uses a merge trick to get the same result as a Submodule, only it actually stores the files in the main repository and then merges your changes back up to its own. Subtrees are not a silver bullet, but in this case it was what I needed.

    Checking out the subtree is easy enough. You tell it where you want to store the repository (a folder named content) and you give it the location of your remote, the branch name, and voila:

    $ git subtree add --prefix content git@github.com:ipstenu/hugo-content.git master --squash
    git fetch git@github.com:ipstenu/hugo-content.git master
    From github.com:ipstenu/hugo-content
     * branch            master     -> FETCH_HEAD
    Added dir 'content'
    

    Since typing in the full path can get pretty annoying, it’s savvy to add the subtree as a remote:

    $ git remote add -f hugo-content git@github.com:ipstenu/hugo-content.git
    

    Which means the add command would be this:

    $ git subtree add --prefix content hugo-content master --squash
    

    Maintaining Merge Maneuverability

    Once we have all this in, we hit a new problem. The subtree is not synced by default.

    When a subproject is added, it is not automatically kept in sync with upstream changes, so you have to pull them in like this:

    $ git subtree pull --prefix content hugo-content master --squash
    

    When you have new code to add, run this:

    $ git subtree push --prefix content hugo-content master --squash
    

    That makes the process for a new article a little extra weird, but it does work.

    Documenting Data Distribution

    Here’s how I update in the real world:

    1. Edit my local copy of the content folder in the hugo-library repository
    2. Add and commit the changed content with a useful message
    3. Push the subtree
    4. Push the main repository

    Done.
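    In command-line terms, and assuming the hugo-content remote from earlier plus an origin remote for the main repository, that boils down to something like this (the file name is just an example):

    $ git add content/posts/my-new-article.md
    $ git commit -m "Add my new article"
    $ git subtree push --prefix content hugo-content master --squash
    $ git push origin master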

    If someone else has a pull request, I would need to merge it (probably directly on GitHub) and then do the following:

    1. Pull from the subtree
    2. Push to the main repository
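    With the same remote assumptions as above, that’s roughly:

    $ git subtree pull --prefix content hugo-content master --squash
    $ git push origin master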

    My weird caveat is that Coda can get confused when updating, as it doesn’t always remember which repository I want to be on, but since I do all of my pushes from the command line, that really doesn’t bother me much.

  • Gmail: Handling Bad Emails

    Gmail: Handling Bad Emails

    No, not bad emails as in the ones that you consider saving and posting for someone’s everlasting internet shame. Bad emails are the ones that go to the wrong place through no fault of your own. We’re talking about the people using an email you’ve not used in a decade, or someone who can’t remember your name is spelled with an A and not an E, and so on. You know, typos.

    One of the things I did on my old email was set up a trash email box. That is, people could email not-me@domain.com or me@olddomain.com and they’d get an auto-reply telling them the email was no longer in service. It was more important for an old domain I owned but didn’t use, which people I needed to talk to still thought was real. I could have forwarded it to myself, but after 10 years, I upgraded to the “Folks, seriously!” alert.

    Doing this on cPanel was pretty easy: make a custom alias that /dev/null’d the mail and sent a reply. Doing it on Gmail was a little weirder and made me think about the situation.

    Canned Replies

    First you have to set up Canned Responses, which is a Lab (go to Gmail -> Settings -> Labs). You make a response like you make an email, only instead of sending it you save it by clicking on the down arrow and saving it as a Canned Response:

    Canned Response Save

    Once you have it saved, set up a filter so any email to @domain.com gets a reply of that Canned.

    Don’t Be Sneaky

    If you’re thinking “Aha! I can use this to be sneaky!” with the intent of sending canned replies while pretending you really read the email, there is a problem with that. The reply comes back from YOU+canned.response@example.com and no, there’s no really easy way around that. Someone did come up with a Google Script for it, but it’s not for the faint of heart.

    Now the question is, is that a bad thing? Is it bad for people to know they got a canned reply? No, not really. The +canned.response makes it obvious that it’s canned, but it also makes it obvious to you, and you can filter the emails however you want. People who reply to a canned response? Auto-trash ’em. Or block them.

    Filters

    Instead of the canned reply, though, you can also just discard the email. Either don’t bother to set up the email (or its alias) at all, or if you do, filter it out and dump it. The only reason I could see for bothering to make an alias for email you don’t want is if you either plan to review it later, or you have a catch-all email address. If you do make an alias, make sure you filter the emails and mark them read so you don’t get distracted by them.

    Catch All

    There’s a slightly different approach to all this, though: the idea of a catch-all email. By default, G Suite sends all your misdirected emails to the trash. Accidentally mailed bob@example.com instead of b0b@example.com because the numbers and letters look the same? Tough luck. Unless Bob was smart enough to set that up as an alias (which I tend to do), your email was lost. The alternative is to designate a user as a ‘catch-all’ account that gets everything that doesn’t belong to an existing user.

    That catch-all can auto-reply to all emails, forward the ones that are important, and everything else. If you’re a business, you should do this so you don’t lose any misdirected emails from customers (they can’t spell, after all), but remember to check that email often as it will also collect all the spam for all your accounts.

  • Review: Spark Love for Your Gmail

    Review: Spark Love for Your Gmail

    Moving my email to Google Apps has, thus far, been interesting. I don’t regret it, and consolidating multiple emails down to three was a good choice. The learning curve of adding in email aliases so I can mail from all the accounts I use, and the limits of Gmail’s shitty filters so everything is funneled to the right place, have been tricky.

    As I mentioned before, I have a ton of aliases. Adding them in on the Google Admin back end (just renamed G Suite) is weird but easy enough. To be able to email from them, you have to also add them in via the normal Gmail web app. It’s tucked under Settings > Accounts, and under “Send mail as”, click Add another email address.

    But if you don’t want to use the web app (and I don’t), Gmail can be a bit of a turd. It doesn’t work great with the desktop Mail.app, and it works terribly with iOS’s mail. Gmail and Apple are just at odds with how email works. They both want to control your experience and redefine email in different ways. Frankly I prefer the Mac way, but that’s personal preference.

    What is a universal problem is that I needed a way to email from my aliases, and if you set up email as Google Mail in the iOS mail app … you can’t.

    Yes, you read that right. It is flat out impossible to set up email aliases for a Google mail account. If you want to use the iOS mail app and Google email and aliases, you have to set up Gmail as an IMAP account, and that’s sort of a shit show in the making. Gmail’s IMAP implementation is non-standard, to put it simply. Among other things, you can only use 15 IMAP connections per account. If I had the desktop app open along with my iPhone and iPad, weird shit happened.

    Now, there are solutions. You could use the Gmail app, but it sucks and doesn’t have an Apple Watch component. Also it’s ugly. Excuse me. It’s basic. You could also use Google’s Inbox app, but you have to use Inbox and the email filters aren’t as robust.

    This leads us to our final solution. Spark.

    This app was something I’d played with before, as it had email alerts on the Apple Watch, and I wanted to get pinged for some work emails while updating all DreamPress installs over at DreamHost. Sadly, the fault for the app not meeting that need lies with Gmail, again, which has no way to filter properly and send an alert only when an email meets specific criteria.

    What Spark does do is everything else. It has a Watch component, it syncs between my iPad and iPhone, it looks like an iOS app, it acts like a Google app, it pulls in the features people rave about in Inbox, and it has email aliases that are simple to set up. Whew. The only thing it doesn’t do is show me a count of unread messages in my folders.

    I can live with that.

  • Optimizing Images

    Optimizing Images

    After running a regeneration of my images (due to changing things in my theme), my GTmetrix score dropped from an A to a D! In looking at why, I saw it was telling me my images should be optimized.

    Best to get on that, eh?

    The easiest way is to install jpegoptim on the server:

    $ yum install jpegoptim
    

    And then to run a compression on my images:

    $ jpegoptim IMAGE.jpg
    

    Anyone fancy running that on a few thousand images? Hell no. We have a couple of options here. One is to go into each folder and run this:

    $ jpegoptim *.jpeg *.jpg
    

    The other is to sit in the image root folder and run this:

    $ find . -type f \( -name '*.jpg' -o -name '*.jpeg' \) -exec jpegoptim --strip-all {} \;
    

    I picked --strip-all since that removes all the image metadata. While I would never want to consider that on a photoblog, I needed to compress everything and, for some reason, unless I stripped that data I didn’t get smaller sizes. For this case, it wasn’t an issue.
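    For what it’s worth, jpegoptim is lossless by default, which is probably why stripping the metadata was the only thing saving me space. If that still isn’t enough, you can cap the quality and make it lossy; the 85 here is an arbitrary example, and I wouldn’t do this on a photoblog either:

    $ find . -type f \( -name '*.jpg' -o -name '*.jpeg' \) -exec jpegoptim --strip-all --max=85 {} \;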

    What about PNGs? Use optipng ($ yum install optipng) and run this:

    $ find . -type f -name '*.png' -exec optipng {} \;
    

    Going forward, I used Homebrew to install those tools locally and compress my images better, though I’m usually pretty good about remembering that part. Well. I thought I was.
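    For the record, the local install is the same pair of tools, assuming you already have Homebrew:

    $ brew install jpegoptim optipng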

  • Yoast SEO: Selective Stopwords

    Yoast SEO: Selective Stopwords

    Stopwords are those small words that should be removed from URLs. You know, like ‘a’ and ‘and’ or ‘or’ and so on and so forth. They make for ungainly, long URLs and really you should remove them.

    If you happen to use Yoast SEO for WordPress and you want to disable stopword removal, there’s a simple way to go about it. Go to SEO -> Advanced and disable the feature for stopwords.

    Disable stop-word cleanup, but why?

    If you want to kill it with fire and prevent everyone on your site from being able to activate them ever, you can toss this into an MU plugin.

    add_filter( 'wpseo_stopwords', '__return_empty_array' );
    remove_action( 'get_sample_permalink', 'wpseo_remove_stopwords_sample_permalink' );
    

    The first filter makes the stopwords kick back nothing, and the remove action stops the process from running. You probably only need the second one, but better safe than sorry, I always say.

    But … what if you want stop words removed in general, but you don’t want them removed on certain custom post types? Welcome to my world! I wanted to keep the stopwords on two post types only.

    Enter my frankencode:

    <?php
    
    /*
    Plugin Name: Yoast SEO Customizations
    Description: Some tweaks I have for Yoast SEO
    Version: 2.0
    */
    
    // Unless we're on a post or a post editing related page, shut up
    global $pagenow;
    
    $pagenow_array = array( 'post.php', 'edit.php', 'post-new.php' );
    if ( !in_array( $pagenow , $pagenow_array ) ) {
    	return;
    }
    
    // Since we are, we need to know exactly what we're on and this is a hassle.
    global $typenow;
    
    // when editing pages, $typenow isn't set until later!
    if ( empty($typenow) ) {
        // try to pick it up from the query string
        if (!empty($_GET['post'])) {
            $post = get_post($_GET['post']);
            $typenow = $post->post_type;
        }
        // try to pick it up from the query string
        elseif ( !empty($_GET['post_type']) ) {
    	    $typenow = $_GET['post_type'];
        }
        // try to pick it up from the quick edit AJAX post
        elseif (!empty($_POST['post_ID'])) {
            $post = get_post($_POST['post_ID']);
            $typenow = $post->post_type;
        }
        else {
    	    $typenow = 'nopostfound';
        }
    }
    
    $typenow_array = array( 'post_type_shows', 'post_type_characters' );
    if ( !in_array( $typenow , $typenow_array ) ) {
    	return;
    }
    
    add_filter( 'wpseo_stopwords', '__return_empty_array' );
    remove_action( 'get_sample_permalink', 'wpseo_remove_stopwords_sample_permalink', 10 );
    

    There was something funny to this, by the way. Originally I didn’t have the $pagenow code. Didn’t need it. But when I left it out, Yoast SEO broke with a weird error. It refused to load any of the sub-screens for the admin settings!

    Cannot Load Yoast SEO Admin Pages

    After some backtracking of “Okay, was it working before…?” I determined it was the call for global $typenow; – a global that isn’t used at all in the Yoast SEO source code, as far as I could find. Still, by making my code bail early if it’s not even on a page it should be on, I made the rest of the WP Admin faster, and that’s a win for everyone.

  • Two Forks In The Road

    Two Forks In The Road

    I believe in healthy competition.

    Rivals, professionally and personally, have the ability to inspire us to reach great heights. They also have the ability to be terrible, but when a true rival, who respects you and your work, arrives, they should be embraced.

    The other day I said that I would love to see a W3TC killer. Killer was the wrong word, as what I mean is that I would love to see something as amazing as W3TC that reaches out and tackles caching in a new and inventive way. I’d also love to see a WordPress killer, an iPhone killer, and a Linux killer. And a Hybrid Car killer.

    I don’t mean I want any of those things to fail, I mean I want to see them have a challenger who does what they do, differently, in a way that inspires them to do more and more and better.

    Growth stagnates without good rivalry. When you have a rival who does what you do, and they succeed, you want to succeed. When you’re both healthy rivals, you can carry it even further. Reaching out to your rivals and telling them “I am impressed with how you did X! Nice job!” is the greatest gift. With WordPress code, taking a leaf from their book and forking some of their code (with credit) is another way to hat-tip them.

    In truth, W3TC and WP Super Cache never really competed. They can’t. They have wildly different approaches to just about everything, and they’re not even ‘after’ the same customer base. WP Super Cache appeals to people with its simplicity and directness. It works and you can (mostly) ignore it. W3TC has an insanely deep and complex set of tools that works closer to the base level of a server. W3TC has options, oh my god it has options, and they can overwhelm.

    But the real crux of all this, besides the takeaway that caching is hella hard, is that there is always more than one way to solve a problem. And there is always room for multiple solutions in any ecosystem. It comes down to needs, wants, and user preferences. Both plugins I’ve named here do a great job at meeting the needs of their audiences. And both plugins grew out of someone’s need. Donncha and Frederick both created something to solve their own problems. They shared these solutions with the world and became unintentional rivals and kings of caching.

    Okay so back to what I said.

    Should there be a ‘killer’ caching plugin? Will there be one?

    Maybe.

    There should never be one killer app, no matter what it is. There should never be one perfect solution. Mostly because I don’t believe there’s such a thing. There’s nothing we can create that will suit everyone’s needs and wants. It’s statistically impossible. So when we talk about a ‘killer’ anything, we never mean that. We mean “There should be options, and the options’ creators should be in healthy competition with each other to create some awesome things.”

    And I really, truly think we should do that. I would love to see someone tackle WordPress with a serious self-hosted alternative. Something easier to install on my own than Ghost, but as easy as Hugo or Jekyll to write a post in. Something extendable like Drupal, but with better backwards compatibility. Something next. And I want to see WordPress take what it learns from those other tools to become even more.

    Because healthy rivalry between friends and equals is a good thing.