Half-Elf on Tech

Thoughts From a Professional Lesbian

Author: Ipstenu (Mika Epstein)

  • Static Content Subdomain

    Static Content Subdomain

    I use a lot of different tools to run my websites, and over time I’ve learned that what I want is my static content, the uploaded files like images and videos, stored separately from my apps. So while I have the basic folders on my domain (wordpress, wiki, gallery), I have a special subdomain called static.example.com for all those images and videos.

    There are a few reasons I do this. First, I like having my images separate. Second, it allows me to establish a cookie-free subdomain for images and that shuts up YSlow’s check.

    Create The Subdomain

    Do this however your host allows. Keep in mind that some don’t allow you to traverse domain folders. If your host creates your domain as /home/user/example.com and subdomains as /home/user/static.example.com you may have to fight a little more with things depending on your setup. If possible, I prefer to put the subdomain folder inside the main web root.

    If you’re using cPanel, by default you get your static subdomain installed at /home/user/public_html/static which is how I like it. This is perfectly accessible by all things but it’s also browsable at example.com/static/ and we don’t want that. Applying a little .htaccess magic will solve this.

    # CDN
    <If "%{HTTP_HOST} == 'example.com' ">
            RedirectMatch ^/static/(.*)$ http://static.example.com/$1
    </If>
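    The <If> directive needs Apache 2.4 or later. On an Apache 2.2 host, a mod_rewrite rule does the same job; this is a sketch, assuming mod_rewrite is enabled and the same example.com names:

```apache
# CDN redirect for Apache 2.2, where <If> is unavailable
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^static/(.*)$ http://static.example.com/$1 [R=301,L]
```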
    

    Now we’re ready to go!

    Move WordPress Uploads

    This used to be really easy: go to Settings -> Media and change things. But that was removed to stop people from blowing themselves up. Now there are a couple of ways to go about it. I jumped right over to editing the options directly by going to wp-admin/options.php and looking for upload_path and upload_url_path.

    Setting image location options

    I change upload_path to /home/example/public_html/static/wordpress which is where I’ve moved all my images. Then upload_url_path becomes http://static.example.com/wordpress and I’m done except for fixing my old posts. It’s actually pretty neat that once I put those paths in, the Media Settings page lists them as editable.
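    If you have shell access, wp-cli can set the same two options without touching wp-admin/options.php. A sketch, using the paths above:

```shell
wp option update upload_path /home/example/public_html/static/wordpress
wp option update upload_url_path http://static.example.com/wordpress
```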

    Fixing the old posts takes a little trick though, and you’ll have to search/replace your posts via the database:

    UPDATE wp_posts SET post_content = REPLACE(post_content,'http://example.com/wp-content/uploads/','http://static.example.com/wordpress/');
    

    Or in wp-cli (add --dry-run first if you want to preview the changes):

    wp search-replace http://example.com/wp-content/uploads http://static.example.com/wordpress
    

    The gotcha here is that since I use SSL for my administration, I had to set up a new certificate for the static domain. Not a big deal right now, since I can set up a self-signed cert, or use StartSSL until Let’s Encrypt is off the ground. It is something to consider, though.

    Move ZenPhoto Uploads

    I have to start by warning you that ZenPhoto doesn’t like this. When you install it, it puts your images in an albums folder inside the ZenPhoto gallery install. This isn’t so bad, but you actually can move it around. You have to look in your zenphoto.cfg.php file (found in zp-data). The default location for your albums is defined by this:

    $conf['album_folder'] = '/albums/';
    $conf['album_folder_class'] = 'std';
    

    Since I want it in the static location, I tell it my folder path based on the web root and that it’s ‘in_webpath’ (which tells ZenPhoto to look in the root, not relative to its install), by changing that section to this:

    $conf['album_folder'] = '/static/gallery/albums/';
    $conf['album_folder_class'] = 'in_webpath';
    

    But that means my URLs for images become http://example.com/static/gallery/albums... and I wanted http://static.example.com/gallery/albums... instead. Thankfully the .htaccess rule I used at the beginning of all this covers me there. Looking into this, I understand this is the case because unlike MediaWiki or WordPress, ZenPhoto only has one ‘location’ setting. The other two have path and URL.

    MediaWiki

    This was … weird. Technically all you have to do is set up the folders and change the following values in LocalSettings.php:

    $wgUploadPath       = "/static/wiki";
    $wgUploadDirectory  = "/home/example/public_html/static/wiki/";
    

    The thing that’s weird is that the documentation says you can do this:

    $wgUploadPath       = "http://static.example.com/wiki";
    

    And when you do, the image URLs properly point at the static domain. They just won’t load. When you dig deeper, it turns out it’s caused by the settings for responsive images; the way MediaWiki builds the srcset attribute doesn’t seem to like this. So for now I’ve disabled responsive images, and my setup is this:

    $wgUploadPath       = "http://static.example.com/wiki";
    $wgUploadDirectory  = "/home/example/public_html/static/wiki/";
    $wgResponsiveImages = false;
    

    End Result?

    All my uploaded content is on my ‘static’ subdomain, separate from everything else, which makes version control even easier. Also now if I ever decide to move things off to a CDN, I’m pretty well set up.

    The real reason I do this is that while some of my content is uploaded via the content management systems I use (WordPress, ZenPhoto, etc.), the majority is not. With ZenPhoto, for example, it’s faster to FTP up a gig of images than to use a PHP upload tool. Ditto videos. And because of them, it’s nice to have a separate location I can give access to without granting someone full rights on all my tools.

  • Image Compression

    Image Compression

    If you’ve ever tested your site on Google PageSpeed Insights, GTmetrix, Yahoo! YSlow, or any of those tools, you may have been told about the value found in compressing images.

    Google Pagespeed has some images for me to optimize

    There are some images you can fix; the first one in that screenshot is an image from Twitter that I downloaded. There are some you can’t fix; the last three are all telling me the default smilies for WordPress need some shrinking.

    Either way, the short and skinny of it is that if you make your images smaller then your webpage loads faster. I know, it’s shocking. Many long-term webheads know that you can compress images best on your own computer, using ‘Save For Web’ or command-line compression tools. That’s great, but it doesn’t work for everyone. Sometimes I’m blogging from my phone or my iPad and I don’t have access to my tools. What then?

    Before You Upload

    I know I just said that, but what about when you can’t resize before you upload? There are things you can do for this. Photo Compress for iOS will let you make your images smaller. So can Simple Resize. There are a lot of similar apps for Android as well.

    For the desktop, I use Homebrew, so I installed ImageMagick (which is also on my server for WordPress to use) and toss in a command-line call for it. Sometimes I’ll use Grunt (yes, Grunt, the same thing I use for Bower and coding) and ImageOptim to compress things en masse.
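    A typical ImageMagick invocation looks like the following; it’s a sketch, assuming ImageMagick is installed, with illustrative filenames and quality setting:

```shell
# Shrink to fit within 1200x1200 (only if larger), drop metadata, recompress
convert photo.jpg -resize '1200x1200>' -strip -quality 82 photo-web.jpg

# Same treatment for every JPEG in the current folder, in place
mogrify -resize '1200x1200>' -strip -quality 82 *.jpg
```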

    Of course, if I only have one or two images, I just use Preview which does the same thing more or less. Photoshop, which is still stupid expensive, also lets you do this, but for the layman, I suggest Preview for the Mac. I haven’t the foggiest what you can use on Windows.

    If you’re not uploading images via WordPress (and very often I’m not), you pretty much have to do it old-school.

    While You Upload

    Okay great, but what about WordPress?

    The best way to go about it is to have WordPress magically compress images while you upload them. And it actually does this out of the box. My cohort in crime at DreamHost, Mike Schroder, was part of the brain trust behind making ImageMagick a part of core WordPress. This was a massive undertaking, but it allowed WordPress to compress images better, faster, and more reliably than GD. I hesitate to call it ‘safer’, but the image loss (that weird fuzzing you sometimes get) is less.
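    You can also tune how hard WordPress compresses the intermediate image sizes it generates, using the core jpeg_quality filter. A sketch; the value of 75 is illustrative (lower means smaller files and more artifacts):

```php
// Drop the JPEG quality WordPress uses for generated image sizes.
add_filter( 'jpeg_quality', function () {
    return 75; // 0-100 scale
} );
```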

    If you want to make it even better, you need to use plugins. I used to tell people to use Smush.it, which was a Yahoo!-run service. But then they shut it down and we were all very sad. WPMU DEV, who owns a SmushIt plugin, installed the smushing on their own servers, and you can now use WP Smush. I’m a bit of a fan of TinyPNG, which gives you 500 free compressions a month, and that’s enough for me. I like TinyPNG because there are no options: I install it, it runs when I upload images, and done. That doesn’t mean I don’t see value in things like ShortPixel, just that it’s not really what I want. Finally there’s Kraken IO, which I love just for the name, but it makes people balk because it’s not free.

    If you don’t want to use an external service, and I totally get why you wouldn’t, there’s EWWW Image Optimizer.

    I personally use either EWWW or TinyPNG, depending on the site.

    Can a CDN Help?

    Maybe. I know a lot of people love offloading their images to CDNs. I currently don’t, for no reason other than that I’m a bit lazy and I hate trusting someone else to host my images. That said, you actually can (rather easily) use a CDN if you have Jetpack installed: their service, Photon, does exactly that. Now, I don’t use Photon myself because my users in Turkey and China, where wordpress.com can be blocked, like to visit my site, but there’s nothing at all wrong with those services.

  • Mailbag: Multisite or Not?

    Mailbag: Multisite or Not?

    From Ian:

    Hi, I wanted to comment on one of your articles and see if any one would provide feedback, but the comments are closed. https://halfelf.org/2011/dont-use-wordpress-multisite/ I have been avoiding using Multisite as you recommend. Several have tried to get me to go that way, but it didn’t seem right. So in the cases of when a business wants to have independent sites that share the main sites content, what would you recommend?

    There was more to the email but it boils down to this. They want a site that has all the local content and all the main brand’s content.

    When you get to the point of ‘sharing’ content, Multisite is less and less of a winner. We’re still in a pre-JSON-API world (an API would make this a bit easier), so I lean towards the basic simplicity of categories.

    I’d have a main category for ‘news’ and another for each independent site (say ‘locale1’). Then I’d craft my theme to pull in posts from ‘news’ and ‘locale1’ and style it per the brand designation of each locale.
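    As a sketch, the loop in one locale’s theme might look like this; the category slugs ‘news’ and ‘locale1’ and the post count are illustrative:

```php
// Pull posts that are in either the shared 'news' category
// or this site's own 'locale1' category.
$locale_query = new WP_Query( array(
    'category_name'  => 'news,locale1', // comma means OR
    'posts_per_page' => 10,
) );

while ( $locale_query->have_posts() ) {
    $locale_query->the_post();
    the_title( '<h2>', '</h2>' );
    the_excerpt();
}
wp_reset_postdata();
```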

    The plugin Groups may be what you want to control access for things.

    One site. Easy to cross-relate content. Done.

    No multisite.

  • Encrypt My Site, Please!

    Encrypt My Site, Please!

    By now everyone running a website has heard about how Google gives sites running SSL a bit of a bump with search rankings. It’s been a year since they started doing that. The bump is not significant enough to impact most people, but making all the things HTTPS is still a good idea.

    It’s really all about security. I personally use HTTPS for the backend of my WordPress sites. Logins and wp-admin are all secure. The same is true for my MediaWiki site and my ZenPhoto gallery. I access things securely because the data I transmit could be important. Sure, it’s just passwords and such, but then you look at my site where I sell eBooks.

    That site is on the same server, the same account, and the same WordPress install as this one. You bet your ass I’m making it all secure. But this comes at a cost.

    The invention of SPDY aside, HTTPS is rarely as fast as HTTP. It can be difficult to set up caching, or to implement SSL termination via Pound or Nginx to cover for the fact that Varnish doesn’t cache SSL. These things aren’t impossible to do. They’re just harder. And they’re generally going to be slower than plain old HTTP.
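    For the curious, SSL termination in front of Varnish can be sketched like this in nginx; the certificate paths and the Varnish port (6081) are assumptions for illustration:

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    location / {
        # Hand the decrypted request to Varnish, which caches plain HTTP
        proxy_pass http://127.0.0.1:6081;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```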

    The question has never been if they can or cannot be done, but if they can by entry-level people. Certainly we can say “If they’re not using SSL for all the things, they’re not ready for a great volume of business” and use it as a demarcation for things. And when we’re communicating our private lives, we should certainly consider this. But then, this site, where only I log in?

    Do you need SSL? Would it make you feel better or more secure? All you can do is ask yourself. Do you need the feel-good? Do I need the extra security? If I decide yes, then I have to consider the weight this puts on my site. I have to consider how best to cache the content.

    I also have to think about how to get a certificate, and SSL certs can be the biggest Internet scam out there.

    How much is a multi-domain SSL cert, including subdomains, per year? A few hundred? It’s a bigger hassle than the EU VAT drama. It’s expensive, and it’s a pain to set up and install. Thankfully, next month we’re expecting Let’s Encrypt to show up, and they’ll make the cost less prohibitive. It doesn’t make the drama of installing the certs any better, but it’ll lower the bar for people who are trying to make things secure.

    Yes, you can get StartSSL for free, but it’s not as simple as all that. When all you need is one certificate, it’s only about $10 a year and that’s fine. When you start getting into the need to secure all the things, it’s a mess.

    What has to happen next, though, is for server software to step up. Apache and nginx are both far faster now than they have been, but they’re our ultimate bottleneck. PHP has to push itself to handle things better and faster, lest we all run over to HHVM. We are getting better, of course, but if we want everyone to be on HTTPS, we have to make it easy.

    Not easier.

    Easy.

    The bar is still too high for the majority of people, and that’s a problem. Either we start offering hosting services to handle this or we start making the software easier. But we can’t just say “Oh, it’s simple to make your site HTTPS and fast.” because it’s not.

  • Everything Is Vulnerable

    Everything Is Vulnerable

    Every other day we hear about a tool that has a vulnerability. It’s been the servers we use, Flash, or Silverlight, or the Jeep that was hacked.

    This Is Not New

    The idea that hacking like this is new or novel is, let’s be honest, naive. In the 1800s, people used to hack into the newly born telephone system. Before that, we didn’t call it hacking, we called it conning. Yes, the confidence games people played, getting others to trust them and then rip them off, are the same idea as a hack.

    A hacker is someone who finds a weakness in a computer system and exploits it to some benefit. Early bank penetration tests, the ones to see if they could get at your money, were as much social engineering as technical skill. A ‘hack’ is simply something taking advantage of an exploitable weakness. This is not new to anyone or anything.

    The Scale Has Changed

    The primary difference between the hacks of old and the ones today is the scale of those hacks. Hacks used to be very personal for a reason: there was no worldwide network. Your hacks had to be local and careful, because no one trusted the stranger. You had to build up credibility before taking your win. Of course, now we have near-instant communication with the entire world. That means it takes milliseconds to access the server of someone in Africa, all from your happy NYC Starbucks.

    The difference is that now, when someone says “And Flash has a security vulnerability” the number of people impacted is in the millions. And the number of people who can be hurt by it is, similarly, high.

    We’ve spent years trying to create a global internet, and in doing so we’ve quickly shared communicable internet diseases with each other.

    Nothing Is Unhackable

    My boss and I were chatting about the ways one might hack the stock exchange, and he pointed out that one of the ways they slowed down trades was by having a really long cable.

    38-mile coil of fiber-optic cable
    Credit Stefan Ruiz for The New York Times

    This cable, and yes it’s real, is literally used to create a small delay in the processing of orders, to level the playing field between traders. In short, it makes sure that trades from across the ocean run at the same speed as the ones from the people in the room at the New York Stock Exchange. Each additional mile of fiber-optic cable adds 8 microseconds to a transaction, which over the 38-mile coil adds up to 304 microseconds. Among other things, this is hard to hack: you can’t send a software signal faster than the cable carries it (physics being what it is).
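    The arithmetic checks out; a quick back-of-the-envelope in Python:

```python
# 8 microseconds of delay per mile of fiber, over a 38-mile coil.
MILES = 38
US_PER_MILE = 8

total_delay_us = MILES * US_PER_MILE
print(total_delay_us)  # 304
```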

    The next Mission Impossible movie will involve Tom Cruise being slowly lowered into the box with that cable in order to shorten it invisibly. Only Cruise can do it because only he is small enough.

    That was my joke. But it’s actually rather demonstrative of the point: you can physically hack things as well.

    Analyze The Risk

    To quote my father, “What can go wrong? How likely is it? What are the consequences?”

    That’s why I don’t own a WiFi-connected garage door or thermostat. Do I think they’re cool? Yes. Do I think they could make much of my life easier? Yes! But they’re new and they’re toys, which means people spend a lot of time poking at them and digging into the underlayer to see how and why they work. Which means people are finding hacks daily.

    That means the likelihood of someone figuring out how to use my thermostat to drive my heating bill through the roof is pretty high. Someone already did that to his ex-wife, if that review is to be believed. Of course, he had the access in the first place, but it proves one point: if you get access, you can do things.

    Change it to my garage door? Or my front door? Say goodbye to my things. I know I’d be a target because I’m using the pricey toys to start with.

    Educate Yourself

    If you can avoid doing stupid things, the odds of you being hacked are low.

    By stupid things, I mean using insecure passwords. I mean logging in on public WiFi to do your banking. I mean installing any old plugin on a WordPress site running a store.

    The things you know are dangerous.

    Don’t be stupid. Make backups. Be prepared for disaster.

  • Mailbag: Debugging .Com

    Mailbag: Debugging .Com

    From Wesley:

    It’s a one-word question though so I hope it won’t take you too long: my blog is free, which means it’s a .com, not .org [redacted] so I cannot install plugins, right? And an even quicker follow up, there’s no hope for me to erase post revisions so to free space and upload a page that’s over 200K words and refuses to do so, is there? Thank you so much for your attention. Cheers

    Your site is hosted on wordpress.com, so you can’t install plugins (or themes). Even on VIP you can’t, though they may do it for you. It’s a Multisite thing.

    Also, you can’t erase post revisions, but you don’t need to worry about that. It’s not why you can’t upload a page of over 200k words. That’s just WP timing out. I’m assuming it’s hanging and timing out… it’s hard to know for sure without a description of the error.

    Sadly, since it’s on .com and I don’t work for them, you will have to ask how to handle that here: https://en.forums.wordpress.com/

    My suggestion would be to split the post into multiple posts. I’ve found people rarely read a post longer than 1500 words. If it were a self-hosted WordPress, I’d tell you to check the PHP error logs and see if it’s a PHP timeout (which would be my guess) or something else, like a mod_security error. If it’s PHP, there’s not a lot you can do unless you’re on a VPS or better, and really, ‘adding more PHP memory’ is not the best move.
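    On a self-hosted install you could also cap revisions and raise the memory ceiling in wp-config.php; this is a sketch, and both values are illustrative:

```php
// Keep at most 5 revisions per post (use false to disable them entirely).
define( 'WP_POST_REVISIONS', 5 );

// Give PHP more headroom for large posts.
define( 'WP_MEMORY_LIMIT', '256M' );
```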

    Storing massive amounts of data is slow. 200k words in a post is huge. If you were to space it out, it would make a 580-page book, give or take. (I may have done NaNoWriMo before.) Anyway, the point is that’s why we invented chapters in the first place. I have a mental image of a guy on an old manual page press, being handed the 580 pages to typeset, and breaking down. He’s sitting in a corner, crying, rocking back and forth.

    Today he’s your database. Your database is sitting in a corner with PHP, crying. “Why are you giving me so much to do at once, PHP?” And poor PHP is sobbing. “I don’t know, I can’t even handle it!”

    Anyway. Smaller posts. People won’t read 200k in one go. Not even speed readers.