Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: security

  • Multiple Domains, Multiple Logins

    Every month or so, someone asks me why they have to log in again on multiple domains on WordPress. That is to say, they’re using Multisite and they log in to example.com and then they have to log in again on sub.example.com and this is weird.

    The answer comes down to cross-domain browser protection. This is not to say you can’t do it! If you’re just using subdomains, this is really easy:

    define( 'COOKIE_DOMAIN', 'example.com' );
    define( 'ADMIN_COOKIE_PATH', '/' );
    define( 'COOKIEPATH', '/' );
    define( 'SITECOOKIEPATH', '/' );
    define( 'COOKIEHASH', md5('http://example.com') );
    

    The last one is just to prevent conflicts with other sites you may have on example.com that aren’t WordPress related. Or maybe they are, but they’re a separate install for whatever reason.

    But if you’ve read my older posts, you know my COOKIE_DOMAIN is set like this:

    define( 'COOKIE_DOMAIN', $_SERVER[ 'HTTP_HOST' ] );
    

    That’s because I’m mapping domains without a plugin to handle that for me. And that means I have to log in separately to halfelf.org and ipstenu.org and it sucks.

    Like I said before, this is called cross-domain browser protection. You can’t use a cookie across multiple sites with different domains, even with integrated logins.

    Case in point: the exact same user ID/password I use on wordpress.org is used on buddypress.org and bbpress.org, and I have to log in to each site separately.

    Why? To stop evil people from being evil. Can you imagine what would happen if someone sorted out your cookie hash and was able to let your login work on their sites? That would introduce new levels of phishing scam hells because you would be able to go to fake-paypal.com and your paypal.com login would just magically log you in.

    So at this point it looks like you can’t have your cookies magically work for multiple domains and automagically log you in to them without interaction. You’re safer this way. But what if you could?

    $cookiehash = md5("http://www.example.com/");
    define('COOKIE_DOMAIN', false);
    define('COOKIEPATH', '/');
    define('SITECOOKIEPATH', '/');
    define('ADMIN_COOKIE_PATH', '/');
    define('COOKIEHASH',  $cookiehash );
    

    Notice how I changed COOKIE_DOMAIN? Setting it to false (the same as not defining it at all) means WordPress doesn’t lock the cookie to one specific domain. The HASH will protect you ‘enough’ and you should be able to log in on all the domains on your network.

    Mind, I don’t do that. It doesn’t work reliably in my experience, which makes sense. It’s just not as safe.

  • SSL Intermediary Certificates

    Every now and then, my Android friends tell me my store won’t work on their phones.

    Android warning: Your connection is not private

    Now my store works on Chrome, Firefox, Safari, and IE. I get a green lock, which is what you’re looking for on Chrome, and SSL Labs comes back … with varying results of stupidity. I tend to get this:

    Unexpected failure – our tests are designed to fail when unusual results are observed. This usually happens when there are multiple TLS servers behind the same IP address. In such cases we can’t provide accurate results, which is why we fail.

    Now this is a ‘valid’ failure. I have one IP and a multi-domain certificate (ipstenu.org, mothra.ipstenu.org, store.halfelf.org). It’s stupid, mind you, since sometimes it works and sometimes it doesn’t and it gives me a headache. If you look on DigiCert or SSLShopper, they both come back just fine. I’ve started to think that the SSL Labs cache is drunk. I’m going to assume I’m okay based on sslcheck, which gives me a B because it can’t tell if I patched for BEAST (I did).

    That said, I did some research and determined I was not the only person having this issue, specifically with a Comodo cert! As it happens, the issue was in part due to a missing intermediate certificate in my file. If someone has already visited another website that uses the same certificate vendor, the intermediate certificate is remembered in the browser. Sounds great, right? The site loads faster! But if the visitor hasn’t picked up that intermediate somewhere else first, then they don’t have it and the connection to my store would fail.

    But why does this only happen on an Android phone? Your browser on your big computer has a whole mess of certificates it saves for you, to make things faster for everyone. Your phones don’t.

    To solve a missing intermediate certificate in the SSL connection, you have to add the intermediate certificate to your own certificate file. This is a little annoying with cPanel/WHM, because I can only do it as root. I’d previously added everything via cPanel as my ipstenu.org login because it was per domain, right? The trick here is that I have to not just add the certificate by pasting that in, but also grab the other two certs that came with it:

    Two More Certificates!

    Notice how there are four? The first one is my certificate, the one I pasted in. The second is my Root certificate; leave it alone. The bottom two I had to add at the bottom of the cert page, where it said “Certificate Authority Bundle (optional)”. I pasted in the content of those, one after the other, and saved. In my case, I was so annoyed I deleted them all and re-added everything, pasting in the main cert and using auto-fill, and then manually adding in the bundle.
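    If you’re building the file by hand instead of pasting into cPanel boxes, the end result is just PEM blocks stacked in order: your own certificate first, then each intermediate, with the root sometimes included as well (which is how you end up with four). A rough sketch of the layout; the parenthetical labels are mine, not part of the format, and the MII… lines stand in for the real encoded certificates:

    -----BEGIN CERTIFICATE-----
    MIIF...   (your own domain certificate)
    -----END CERTIFICATE-----
    -----BEGIN CERTIFICATE-----
    MIIE...   (intermediate that signed your certificate)
    -----END CERTIFICATE-----
    -----BEGIN CERTIFICATE-----
    MIIE...   (intermediate signed by the root CA)
    -----END CERTIFICATE-----

    In cPanel terms, the first block is the Certificate box and the rest go into the “Certificate Authority Bundle (optional)” box.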

    I do find it interesting to note that this only failed on Android phones, though.

  • Two Factor Apps

    Hat tip to Kat for cluing me into this!

    Two Factor Authentication is a wonderful thing between two places. Between ten it’s a hassle.

    I got a new phone and was going through the process of re-entering all my codes on the official Google Authenticator which, once you install it and add a couple codes, looks pretty basic and utilitarian. It lists all your codes and what site they’re for.

    My issue with the app is pretty basic. First of all, it’s Google’s and I’m not a huge fan. Second, the app hasn’t been updated in a year and it shows. Third, I have to pull my phone out when I want to log in (which is the point, I know). Fourth, I got a new phone and had to manually move everything over.

    The last two items are actually the biggest hassle.

    Enter Authy. It hits all four points. It’s not Google, it’s updated to look right on an iPhone 6, it has a desktop app that syncs with your phone, and it’s got backups.

    My fear right away was “Where is my backup?” and this is all they say:

    For your convenience Authy can store an encrypted copy of your Authenticator accounts in the cloud. The account is encrypted/decrypted inside your phone, so neither Authy or anyone affiliated with Authy have access to your accounts.

    I’m not super happy that I don’t know what cloud it’s in, or whose (Amazon probably), and I dislike that unlike 1Password I can’t pick where I put the backups. What if I want to sync to Dropbox? Or iCloud? That would be a great improvement. That said, they’re upfront about their backups and how they work and, unlike Google, appear to have people who are willing to talk to you about things.

    But.

    The only issue I see with Authy’s layout is that if I have more than 12 items, it’s a little weird to scroll around the tiny boxes.

    Now if only Twitter and PayPal would offer real 2FA and not ‘SMS’, which doesn’t help me at all outside my home country.

  • Whose Responsibility Is It?

    When WordPress 4.0.1 came out, a small number of sites broke.

    For a while, we’ve been touting that minor releases to WordPress core, the ones we auto-upgrade for you, are very safe, very tested, and very important. While all that is true, it has brought a few people to complain to me that obviously I was wrong.

    It’s true that the 4.0.1 release broke people. It was an object lesson in why I tell people not to reinvent the wheel. But this upgrade situation does not mean the upgrades aren’t safe, secure or smart. It does bring thoughts to mind, like what my friend David talks about when he considers WordPress at the enterprise level. I know people who are using this failed upgrade scenario as a reason to tout that WordPress isn’t ready for big business, but I think they’re looking at it from the wrong perspective.

    Caution Minefield sign

    Let’s step back.

    I used to work for The Man. I’m well aware of the machinations you go through to upgrade anything at a massive enterprise. One of the things they do is a code review. Every single upgrade is checked and tested, and a dry run of the upgrade is performed to ensure everything works the way they think it should. By allowing WordPress auto-upgrades, you remove that ability. For a massive corporation? I would turn off the auto-update.
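    For that corporation, turning it off is a one-line change in wp-config.php. Either of these constants will do it; the first stops automatic core updates only, the second shuts off the automatic updater entirely:

    // Stop automatic core updates only.
    define( 'WP_AUTO_UPDATE_CORE', false );

    // Or go further and disable the automatic updater completely (core, plugins, themes, translations).
    define( 'AUTOMATIC_UPDATER_DISABLED', true );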

    But at the same time, this mythical major company running WordPress would have at least one person who knew WordPress. They would have someone whose job it was to review every single bit of code that went into their WordPress site. Each plugin would be checked, tested, evaluated for security, and only installed if that WordPress Checker said it was good. Because that’s exactly what you must do in any and all enterprise situations.

    David’s viewpoint is that the vetting of a site should be delegated.

    My gut reaction to say that they should know better has to be tempered with the fact that no, they should not have to. It’s the job of every site owner to vet their system, but to make a platform that is truly global, that vetting should be delegated. Web hosts and security analysts should vet code for collisions and bugs. Theme and plugin shops should ensure that their products adhere to best practices. Putting accountability for the full stack on each site owner is not only inefficient, but impractical. Inherent trust should exist that code in the official repository maintains a baseline level of code, trust that is eroded when the problems that occurred with a subset of sites on this update occur.

    And here, he and I disagree somewhat.

    It’s the job of everyone who uses software to be aware of what they’re doing. Vetting the software before it goes into your system has to be someone’s job. WordPress core does an amazing job of this for you. WordPress core is safe. The 45k plugins and themes in the world don’t always meet the same level of robust checking. Which means when you introduce WordPress to your environment, you absolutely have to seriously review those third-party odds and sods you want to use because they’re so shiny and cool.

    Web hosts vet code for collisions, sure. We do at DreamHost. That’s part of my and Mike’s jobs! We know what’s going into WordPress and if it’s going to blow things up at DreamHost for our customers. But like the site owner who found her site down one morning because we’d upgraded her from PHP 5.2 to 5.4 and it broke her WordPress 2.5 site, we cannot account for everything.

    I think there’s a need for security specialists to review plugins, in a public forum, and point out who’s not doing things in the best way. I also think that there’s a need for developers to remember there’s a reason why we do things a certain way, and while it’s fine not to, you have to keep in mind that it’s now your responsibility to keep a close eye on anything that changes in core that might cause your code not to work as well.

    For example, if I wrote a plugin that worked around the shortcode API for whatever reason, I would have a custom query on trac for any ticket related to shortcodes and follow it as an RSS feed to monitor. Or I might even subscribe to the trac firehose and use a filter to pull out anything that so much as mentioned the word. Because I’ve now made a change that I know might be a problem someday.

    Every business owner should know the risks of all the software they use, be it website or desktop. This responsibility is the cost of doing business. The size of the business and the importance of the software will change what resources you can afford to allocate to that part of your business, but you absolutely cannot ignore it.

    While I really want to say that because WordPress core does due diligence you don’t have to, I would be a lying liar who lies. Even if we do as David suggests and have everyone in the world making sure things are vetted and checked and stamped, it still requires that the owners of a site listen to that information and not use the code that’s less optimal. Enforcing that would be impossible unless you wanted to suggest that WordPress outright deactivate code that doesn’t use the proper APIs. That would put a lot of weight on WordPress, slow it down, and be pretty annoying for people who are legitimately using non-standard methods of development and implementation.

    No matter what, at the end of the day, the person who is responsible for the code quality is the person who wrote and maintains it. But the person responsible for their site is the person using the code. You have to know what code you’re putting into the site and be aware of the risks you’re introducing to your environment by doing so. If your website is your entire business, you cannot afford to be cavalier about these things.

    Disasters happen. Understanding the risks will prepare you for dealing with them when they do.

  • Mailbag: Site Security Plugins

    This one comes from Gabriel, whom I met at WordCamp LAX:

    Hello, I met you at WordCamp LA 2014. Thank you so much for speaking there and giving me great advice. I am now in a pickle again though, I wanted to ask you as an expert. What premium/pro version of site security protection would you find to be the best for a WP site? I am now using the free version of iThemes but I want to start buying pro version of iThemes, which would be $40 a year for a client.

    I don’t use any security plugins on this site. I use Mod Security, some complex .htaccess rules, and a firewall app on my server. None of the weight of the security is on my WordPress install for a few reasons.

    This site may be a nice massive Multisite, but on this server I have a dozen other WordPress sites and not all are my own. I also have a gallery and a wiki, a forum, and a few other non-WordPress things. Using just a WordPress plugin would leave about a third of what’s on my server unprotected. Worse, it means I have to be sure all my ‘customers’ are equally protected all the time and upgraded and configured right. I opted to take that out of their hands.

    Most major hosts (DreamHost, BlueHost, GoDaddy, LiquidWeb, etc.) have Mod Security and a firewall, or some equivalent. Some of them have fail2ban and others have CSF, but they all have server-level protections that frankly do a better job of protecting you against a brute force attack than a plugin ever can. I’ve said this before in many different ways, but I’ll spell it out again: I don’t believe a plugin is ever the best choice to protect you from a DDoS. That does not mean a plugin doesn’t help, but it does mean I would never use it as my first and only defense against attacks and hacks. The practical reason is that having a plugin recursively check everything makes a site slower.
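    To give you an idea of what a server-level rule looks like, this is the sort of thing I mean by .htaccess rules: lock wp-login.php down to your own IP (the address below is a placeholder, and this is Apache 2.4 syntax), so the request is refused before PHP, and therefore WordPress, ever loads.

    <Files "wp-login.php">
        # Only my own IP gets in; everyone else is refused at the webserver.
        Require ip 203.0.113.42
    </Files>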

    With that said, there is a different sort of ‘protection’ to be gained from a security plugin, and that is notifying me as to what files have changed. If you’re using cPanel, WHM has a ‘Recently Uploaded Cgi Scripts’ feature that emails me when certain core files on my server change, but also when a plugin upgrade messes with code that sends email:

    /home/ipstenu/public_html/wp-content/plugins/contact-form-7/includes/submission.php:240:
    /home/ipstenu/public_html/wp-content/plugins/contact-form-7/includes/submission.php:241:       private function mail() {
    /home/ipstenu/public_html/wp-content/plugins/contact-form-7/includes/submission.php:242:               $contact_form = $this->contact_form;
    

    That’s one of my favorite things, by the way. It’s a rare email to get, but I love getting it, because it tells me when code that can send email has changed. There’s also an add-on feature of CSF called ConfigServer eXploit Scanner which can be used to send emails when any file is changed. This is awesome for scanning PHP changes and is even aware of WordPress, though it’s probably going to have a lot of false positives given the nature of WordPress upgrades.

    And this does get us to where I do use security plugins. Rarely, yes, but when I do use them I use products like a malware scanner to make sure my files aren’t changed without me knowing. You hear that called “Security File Integrity Monitoring” sometimes, and the idea is that I want to know when any files on the server are changed. But since Gabriel mentioned ‘for a client’ I can guess that he doesn’t have admin access to the server, which makes the whole thing a lot messier.
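    For what it’s worth, the concept behind that kind of monitoring is simple enough to sketch in a few lines of PHP. This is only the idea, with made-up paths, not how any particular plugin does it: hash every file, compare against the hashes you saved last time, and yell about anything that changed.

    <?php
    // Conceptual file integrity check (paths are hypothetical).
    $manifest = json_decode( file_get_contents( '/home/example/file-hashes.json' ), true );

    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator( '/home/example/public_html', FilesystemIterator::SKIP_DOTS )
    );

    foreach ( $files as $file ) {
        $path = $file->getPathname();
        $hash = md5_file( $path );

        if ( ! isset( $manifest[ $path ] ) || $manifest[ $path ] !== $hash ) {
            echo "Changed or new file: $path\n"; // a real tool would email you instead
        }
    }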

    The weakest leg in the security tripod is users. Sorry. Users are people. We make mistakes, we eat gas station sushi (hush, Otto, you get the point), and we don’t think about our actions.

    With that in mind, which plugin would I use? It depends on the client, how much help I think they’ll need cleaning up, and how much help I’m going to be expected to provide. If I’m worried about that, I’d be inclined to hook them up with a service that can help unhack them; if I know they can follow directions well, then a simple scanning plugin is fine.

    It’s really not a simple answer, though.

  • Why You Can’t (Always) Catch Cache

    Or rather, why you don’t want to cache.

    When speeding up a site, the first thing we always talk about is streamlining the site. Ditch the redundant plugins, clean up the code, get a lightweight theme, dump the stuff you don’t use, and upgrade. Once you’ve done that, however, we talk about caching, and that is problematic. Caching is tricky to get right, because you can’t cache everything.

    Actually you can, but you don’t want to cache everything, and you really shouldn’t.

    The basic concept of a cache is ‘take a picture of my website in this moment.’ We think of it as a static picture, versus the dynamic one we use for WordPress. That static picture is not so static, though: you still call CSS and JS, and you call images. But the idea is not to call PHP or the database, and thus your site will be faster, as those are the slowest things.
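    To make that concrete, the ‘picture’ is usually just a saved HTML file, and the whole trick is handing it back before WordPress, PHP templating, and MySQL ever get involved. A bare-bones sketch of the idea, with a hypothetical cache directory and no expiry logic at all:

    <?php
    // Bare-bones page cache concept (hypothetical cache directory).
    $cache_file = __DIR__ . '/cache/' . md5( $_SERVER['REQUEST_URI'] ) . '.html';

    if ( is_readable( $cache_file ) ) {
        readfile( $cache_file ); // the static picture: no theme, no plugins, no database
        exit;
    }

    // Otherwise fall through, let WordPress build the page, and save the output for next time.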

    Great, so why don’t I want you to cache everything?

    Missed Catch

    The obvious answer is latency. If you’re designing a site or making a lot of style changes, you need to disable caching, because otherwise you get a lag between making edits and seeing them displayed. But there’s also server latency.

    When we talk about things like PageSpeed (I use mod_pagespeed on my server) to improve web page latency and bandwidth usage, we’re talking about actually changing the resources on that web page to follow best practices. This sounds great, but we have to remember that by asking the webserver to do the work before WordPress (such as having PageSpeed minify my CSS and JS), we’re still making the server do the work. Certainly it’s faster than having WordPress do it, but there will still be a delay in service.
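    For reference, ‘asking the webserver to do the work’ in my case is a handful of mod_pagespeed directives in the Apache config, roughly like these (the standard PageSpeed filters for images, URL trimming, and minification):

    ModPagespeed on
    # Recompress and optimize images
    ModPagespeedEnableFilters rewrite_images
    # Trim URLs so they're relative to the page
    ModPagespeedEnableFilters trim_urls
    # Minify HTML, CSS, and JS
    ModPagespeedEnableFilters collapse_whitespace,rewrite_css,rewrite_javascript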

    The next obvious answer is security. There’s some data you flat-out don’t want to cache, and it’s pretty much everything in wp-admin (on WordPress). Why? If you have multiple users, you don’t want them getting each other’s content. Some are admins, some aren’t, and I know I don’t need my guest editor seeing the post about where I’m firing her and why.

    Actually, we’ll extend this. Very rarely do I want my logged-in users to see a cache at all. They’re special; they’re communicating and making edits on the fly. Having them see cached content, and constantly refresh it, will be more draining on my server than just loading the content in the first place. Remember the extra load caused by PageSpeed (or even your plugins) generating the cache? That would be constantly in progress when a logged-in user made a change, so let’s skip it altogether.

    Tagging on to that, you also don’t want your admins and logged-in users to generate a cached page. This isn’t the same as seeing a cached page: I don’t want non-logged-in users to see the version of the site a logged-in one gets. A logged-in user on WordPress gets the toolbar, for example. I don’t want the non-logged-in ones to see it.
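    In WordPress terms, this is why caching plugins carry some version of the guard below: never serve a cached page to, or save one generated by, a logged-in user or anything in wp-admin. The surrounding helper is hypothetical; the two checks are real WordPress functions.

    // Inside a hypothetical should_use_cache() helper:
    if ( is_admin() || is_user_logged_in() ) {
        // is_admin() is true for wp-admin requests; is_user_logged_in() covers anyone with the toolbar.
        return false; // neither serve a cached page nor store this response as one
    }
    return true;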

    Finally, we have to round back to security. If I have SSL on my box and I’m using HTTPS to serve up a page, no way no how do I want to cache anything related to users. In fact, I may not even try to cache JS/CSS either. The basic presumption of HTTPS is that I need security. And if I need security, I need to keep users out of each other’s pockets. The best example of this is a store. I don’t want users to see each other’s shopping carts, do I? And your store is going to be HTTPS for security, so this is just one more feature of it.

    Of course, there are still things to cache. I set up PageSpeed on my HTTPS site so it will compress images, make my URLs root-relative, and compress and minify HTML/CSS/JS. But I don’t have a traditional cache with it at all. This does mean, as we start to look towards an HTTPS-only world (thank you, Google), we’re going to run into a lot of caching issues. All the quick ways we have to speed up our sites may go away, to be replaced by the next new thing.

    I wonder what that will be.