Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: htaccess

  • Vulnerability Reports Miss The Mark

    Lately I’ve been getting a lot of ‘vulnerability’ reports.

    I use the term loosely because the reality is these are not actually serious vulnerabilities. A couple months ago I started getting a lot of weird reports like this:

    A FLAW FOUND ON YOUR WEBSITE!

    Website Security Vulnerability Notification

    Hello, a security researcher reported a security vulnerability affecting [your] website via [company] coordinated and responsible disclosure program:

    Those can be super scary! Is there really a massive issue?

    No. But I know why it feels that way. And frankly I think a lot of these people are targeting the wrong group. Let’s get into it.

    Scare Tactics

    Of all the reports I got, only one felt like an actual vulnerability. But first, here’s what people reported:

    • The PHPInfo Page was public
    • Directory indexing
    • People can list users (aka User Name disclosures) via the REST API
    • Your xmlrpc is showing
    • Incomplete SSL Protection
    • Your email records allow spoofing (no DMARC compliance)

    The last one? Absolutely an issue. I thanked that person and kicked them some money. But the others? They’re issues, but they’re also incredibly minor! Heck, this user name listing ‘vulnerability’ does not take the following into consideration:

    1. It’s on a site where every author has a page
    2. We have an ‘about us’ page that lists everyone anyway
    3. Strong passwords are enforced
    4. We have a firewall

    The only way I could really improve that would be to enforce 2FA, which I’m contemplating for admins. But that begs the question… is this a vulnerability?

    Okay, let’s ask: why does this work? It’s known that WordPress has a REST API, and that API can be used to list public information about registered users. The API does ‘expose’ user data, but only for people who’ve authored a public post of a type that’s shown in the REST API (posts, pages, and some custom post types included). If a user hasn’t authored anything public, you won’t have permission to see them. So again, we’re only able to list public authors. Okay.
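    Want to see it? Hit the endpoint yourself. A minimal sketch with a hypothetical example.com (the endpoint itself is bog-standard WordPress):

    # Ask the REST API for the list of public authors
    curl -s https://example.com/wp-json/wp/v2/users

    # It returns only users with public content, something like:
    # [{"id":1,"name":"Ipstenu","slug":"ipstenu","link":"https://example.com/author/ipstenu/"}]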

    Could that be bad? Sure. In the same way having a front door could be bad if someone kicked it in. But ‘security’ isn’t why I would ever consider blocking that. We literally list all the authors publicly already. If someone wants to use wp-json to grab them, cool. It only shows public information we displayed already, after all.

    Why would I consider blocking? To ensure stability. That is, people hammering my site to find out that I’m not a user on HalfElf (surprise!) makes my site slower. But… I have a firewall, plus ModSecurity and iptables, which means if you hit my site hard enough, you’ll get blocked. Also a lot of stuff is cached, like it should be. Which means this is not a ‘vulnerability’ but more of a ‘best practice notice,’ in my opinion.

    And finally … FFS why are you telling individual site owners this!? If you really think it’s a security issue, take it up with WordPress!

    How Do You Stop Them?

    Well, generally you fix the ‘issues.’ Even if you think it’s full of shit, you fix it. So okay, what do we do?

    PHPInfo? Locked it down. I use it for regular checks of other things. If you’re not using yours, just delete it.
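    If you keep the file around like I do, lock it down to your own IP in .htaccess. A minimal sketch in Apache 2.4 syntax; the filename and the IP here are placeholders for whatever yours actually are:

    ### Lock phpinfo down to a single trusted IP
    <Files "phpinfo.php">
    	Require ip 203.0.113.42
    </Files>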

    Directory Indexing? I put this at the top of my .htaccess (and yes, you should, I’d removed it for some tests):

    ### Prevent Directory Browsing ###
    Options All -Indexes

    XMLRPC? I said “Nope, not gonna change.” Because I use the WordPress iOS App.
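    If you don’t use the apps (or Jetpack, which also talks over XML-RPC), blocking it is short. A sketch, again in Apache 2.4 syntax:

    ### Deny all access to xmlrpc.php (only if nothing you use needs it!)
    <Files "xmlrpc.php">
    	Require all denied
    </Files>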

    SSL? You’ll want to check your setup on things like SSL Checker or Immuniweb or SSL Labs. I found SerpWorx’s tool to be invaluable for spelling out what was missing. The easiest by far was SecurityHeaders.com. For that, I ended up adding this to my .htaccess:

    ### Extra Security
    <IfModule mod_headers.c>
    	Header set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
    	Header set X-XSS-Protection "1; mode=block"
    	Header always append X-Frame-Options SAMEORIGIN
    	Header set X-Content-Type-Options nosniff
    	Header always set Expect-CT "max-age=7776000, enforce"
    	Header set Referrer-Policy "same-origin"
    	# Note: Permissions-Policy directives are comma-separated (the older Feature-Policy used semicolons)
    	Header always set Permissions-Policy "geolocation=(), midi=(), notifications=(), push=(), sync-xhr=(), accelerometer=(), gyroscope=(), magnetometer=(), payment=(), camera=(), microphone=(), usb=(), xr=(), speaker=(self), vibrate=(), fullscreen=(self)"
    </IfModule>

    The one thing I left out was Content-Security-Policy because that one is crazy complex and needs a lot of testing since a lot of content on the site is remote and needs special rules.
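    If you want to experiment with CSP without breaking the site, there’s a Report-Only flavor of the header that logs violations instead of enforcing anything. A minimal sketch with the simplest possible policy:

    ### Test a CSP without enforcing it (violations show up in the browser console)
    <IfModule mod_headers.c>
    	Header set Content-Security-Policy-Report-Only "default-src 'self'"
    </IfModule>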

    Email/DMARC? That took a lot longer, and I had to talk to my email provider to sort it out. But you can run your domain through the MXToolBox checker and see what you’re missing. It’s going to make you cry. Email sucks.
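    For context, DMARC itself is just a TXT record on the _dmarc subdomain. A sketch with hypothetical values; the policy and the reporting address are yours to pick, and SPF and DKIM need to be sorted first:

    ; Quarantine mail that fails SPF/DKIM alignment, send aggregate reports to the listed address
    _dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"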

    Okay but I wanna hide users!

    I hear you. You can do this in .htaccess:

    ### Block User ID Phishing Requests
    <IfModule mod_rewrite.c>
    	# Bounce the REST API users endpoint to the about page (mod_alias)
    	RedirectMatch 301 ^/wp-json/wp/v2/users(.*) /about-us/

    	# Bounce ?author=N scans to the about page too (but leave wp-admin alone)
    	RewriteCond %{REQUEST_URI} !^/wp-admin [NC]
    	RewriteCond %{QUERY_STRING} author=\d
    	RewriteRule ^ /about-us/ [L,R=301]

    	# Normalize ?rest_route=/... requests to /wp-json/... so the RedirectMatch above catches them
    	RewriteCond %{QUERY_STRING} rest_route=/(.*) [NC]
    	RewriteRule (.*) /wp-json/%1 [L,R=301,QSD]
    </IfModule>

    Now. This means on that site if you go to example.com/?author=1 you will not go to someone’s page. But if you go to example.com/author/ipstenu/ you still would. Which IMO points out how stupid that ‘vulnerability’ is. Yes, I am aware you can see the authors. Oooooh. You’re supposed to!

    Conclusion?

    A lot of those vulnerability emails are bullshit. I politely reply “Thank you for your concern however we are not blocking access to that because the API is used by other things. It’s considered to be public knowledge anyway.” I may end up writing a form letter.

    And the sucky thing is that one of the sites that collects all that stuff relies only on the reporter to determine if it’s resolved. Both issues they have for the domain in question? 100% resolved. But they say ‘unpatched’ … probably because I told both reporters I’m not paying them.

    I added this to my profile:

    We do not accept reports of basic WordPress functionality, such as the REST API being active, the use of xmlrpc.php, the enumeration of users, etc. Those are an acceptable risk. Please don’t bother reporting them; they should be addressed with WordPress directly, not end users.

    By the way. The bug bounty program that keeps emailing me? Uses WordPress. And guess whose site has /wp-json/wp/v2/users available to list all their public authors? Yeah. Because it’s not a goddamn major issue.

    I know someone’s gonna point out it could be a major issue. Sure. Like having a window means your house or car could get broken into. That doesn’t mean you remove all the windows!

  • Debugging cPanel’s Default Webpage

    It started with a weird email from someone complaining that a five-year-old link was broken. They were trying to go to tech.ipstenu.org, which I haven’t used since maybe 2011 or so. That was when I bought halfelf.org, you see. I knew the domain should be forwarding (I set that up a million years ago), but for some reason it wasn’t. I told him the right URL and went back to puttering around.

    But it bugged me, you know?

    And later that day, half my domains started spazzing. It turned out they were still pointing to the ‘temporary’ name servers, ns3 and ns4. I cleaned up my DNS zones and rebuilt them (thank you Dan E. from Liquidweb) but for some reason it was still derping.

    Now… as you know, I set up AutoSSL and Let’s Encrypt, like a good internet monkey.

    In the middle of all this shit, I thought to myself ‘Self, I should fix having a subdomain as an add-on which I don’t need anymore now that we have this set up!’ I deleted store.halfelf.org as an add-on and put it back properly as a named subdomain.

    Then I went and properly re-ran the AutoSSL check…

    Errors:

    3:43:30 AM WARN The domain “store.halfelf.org” has failed domain control validation (The system failed to fetch the DCV file at “http://store.halfelf.org/3712.BIN_AUTOSSL_CHECK_PL__.MREaLFbJJfusZuQX.tmp” because of an error: The system failed to send an HTTP “GET” request to “http://store.halfelf.org/3712.BIN_AUTOSSL_CHECK_PL__.MREaLFbJJfusZuQX.tmp” because of an error: SSL connection failed for store.halfelf.org: hostname verification failed.). at bin/autossl_check.pl line 449.
    

    I read down and saw I had this error for ALL the bad domains. Coincidence? I think not. And neither do you, right? Right.

    I did what you do and Googled and Googled and came across people saying that it was Sucuri (nope) or some other CloudFlare type firewall (nope), and then I thought about the crux of the error. “SSL connection failed” is a pretty distinct error, I felt. And of course the SSL connection failed, there wasn’t a certificate yet! So why was it trying to get to SSL right away?

    And then I remembered … I have this in my .htaccess

    # Force non WWW and SSL for everyone.
    <IfModule mod_rewrite.c>
    	RewriteEngine On

    	# Strip the www (%1 is the hostname captured without it)
    	RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC]
    	RewriteRule ^(.*)$ https://%1%{REQUEST_URI} [R=301,L]

    	# Force everything else over to HTTPS
    	RewriteCond %{HTTPS} off
    	RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
    </IfModule>
    

    Which MEANS when the AutoSSL checker fetches http://store.halfelf.org, it gets bounced to https, and since there’s no valid certificate yet, the SSL connection fails and it lands on that bad default page cPanel always serves.

    Oh yes.

    Deleted those lines, re-ran AutoSSL, and it works.

    Picard, Riker, and Worf facepalm.

    Okay, smarty, what’s the real fix? Because as much as I want to leave this code in place, I’d have to remember to turn it off every time I add a new domain or subdomain to the system. And while that’s rare, it’s the rare cases that cause the most problems (thank you, Herbert Hecht).

    I looked back at the error and recognized the pattern being repeated: .BIN_AUTOSSL_CHECK_PL__. I saw it all over the place. I also knew that the folder AutoSSL puts down for Let’s Encrypt is .well-known/acme-challenge (it’s in your web root). And I knew one more thing… I know .htaccess.

    My new rule:

    # Force non WWW and SSL for everyone.
    <IfModule mod_rewrite.c>
    	RewriteEngine On

    	# Strip the www, but let AutoSSL DCV files and Let's Encrypt challenges through
    	RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC]
    	RewriteCond %{REQUEST_URI} !^/\d+\.BIN_AUTOSSL_CHECK_PL__\.\w+\.tmp$
    	RewriteCond %{REQUEST_URI} !^/\.well-known/acme-challenge/
    	RewriteRule ^(.*)$ https://%1%{REQUEST_URI} [R=301,L]

    	# Force HTTPS, with the same exceptions
    	RewriteCond %{HTTPS} off
    	RewriteCond %{REQUEST_URI} !^/\d+\.BIN_AUTOSSL_CHECK_PL__\.\w+\.tmp$
    	RewriteCond %{REQUEST_URI} !^/\.well-known/acme-challenge/
    	RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
    </IfModule>
    

    Ironically, once I sorted all that out and understood I needed to whitelist things for AutoSSL and LE, I was able to Google and find an answer. cPanel knows about the issue and has a case open to fix it for everyone.

    Still, I’m leaving that code in place for the one account that tends to add subdomains often enough that I would need this, and not-often enough that I’d remember.

  • HTTPS and HSTS

    HTTP Strict Transport Security (HSTS) is a standard to protect visitors by ensuring that their browsers always connect to a website over HTTPS.

    Basically you have your server say “Everyone who accesses my server should use secure connections.” This matters because it prevents man-in-the-middle attacks that downgrade HTTPS to HTTP and steal your credentials. Bad days. If you are using HTTPS/SSL on all your domains, you should totally enable HSTS.

    Okay, great. How?

    Well, if you have one domain, this is as easy as tossing this into your .htaccess:

    <IfModule mod_headers.c>
    	Header set Strict-Transport-Security "max-age=16070400"
    </IfModule>
    

    But … I have 20+ domains on this server. That would suck to edit one by one! In fact, this is closely related to my issues combating referrer spam server-wide. This stuff isn’t always obvious. For cPanel, I just added that code to my pre_virtualhost_global.conf file, same as I did for a certain referrer spam company.
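    For reference, on a cPanel box that include file usually lives at /usr/local/apache/conf/includes/pre_virtualhost_global.conf (the exact path can vary by EasyApache version), and you restart Apache after editing it. The contents are the same header block, now applied server-wide:

    # pre_virtualhost_global.conf: global config, inherited by every VirtualHost
    <IfModule mod_headers.c>
    	Header set Strict-Transport-Security "max-age=16070400"
    </IfModule>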

    If you’re using NGINX, you should read their blog post on the subject for full details but the basic code is this:

    server {
        listen 443 ssl;
    
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
    
        # This 'location' block inherits the STS header
        location / {
            root /usr/share/nginx/html;
        }
    
        # Because this 'location' block contains another 'add_header' directive,
        # we must redeclare the STS header
        location /servlet {
            add_header X-Served-By "My Servlet Handler";
            add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
            proxy_pass http://localhost:8080;
        }
    }
    

    And if all else fails and you can’t set this on the server, you can always edit your .htaccess or nginx.conf file locally.

  • One Direction: Sanely

    In my ongoing saga of moving from MediaWiki to Jekyll, I decided to be smart and rename all my URLs.

    Stop.

    I know that’s the worst thing you can do. Please put down the pitchforks. I didn’t do this without redirections and I didn’t do it without deep thought. You see, the issue with MediaWiki is that it was a pretty flat file representation of the site. This doesn’t have to be the case. You can choose to put pages as sub-pages on MediaWiki very easily, but few people do. I didn’t. That means I had around 1000 pages that, in Jekyll land, would all be in one folder. And that sucked. Like I mentioned before, I made Collections.

    This meant I was moving from example.com/PAGE to example.com/folder/PAGE/ (and yes, the trailing slash matters; MediaWiki doesn’t do that). Now, from a structural standpoint, this makes a lot more sense. You want to have a collection of interviews, you have example.com/interviews/year/interviewname/ and that’s ridiculously easy to understand. But when you’re moving from no structure to some structure, you have to accept a bit of a loss.

    So here’s how to redirect sanely:

    1. Make a list of what has a direct relation to a new page
    2. Look at your page visits to see what’s hit the most and make sure that’s redirected
    3. Have a catch-all at the end

    That’s it! Three simple steps to do it. But how did I do it?

    On the Wiki, I had a whole subsection called “Encyclopedia” which had all my internal documents like the about page, the policy pages, and so on and so forth. Those were all moved to the WordPress blog. Most had a new page, but the category itself did not, so I added this to the .htaccess in root:

    Redirect 301 /wiki/Category:Wiki /
    Redirect 301 /wiki/Encyclopedia:About /about/
    Redirect 301 /wiki/Encyclopedia:Policy /policy/
    RewriteRule ^wiki/Encyclopedia:Copy(.*)$  /copyrights/   [L,R=301]
    RewriteRule ^wiki/Encyclopedia:Terms(.*)$ /terms-of-use/ [L,R=301]
    RewriteRule ^wiki/Encyclopedia:(.*)isclaimer /disclaimer/ [L,R=301]
    

    That’s all incredibly straightforward for .htaccess rules. The first three are one-to-one direct links, and the last three use patterns to handle URL strings like “Terms_Of_Use” and “Terms_of_use”, both of which were Wiki-forwarded to the correct “Terms of Use.” Similarly, I wanted to redirect “General_Disclaimer” and “Disclaimer” to one page, simplifying things.

    This pattern of one-to-ones continued on through my .htaccess. Pages that I could link like that I did. But then I hit on a couple big sections, the Interviews and News Articles. Those I had, for years, broken apart into pages by year. So I cleverly used tricks remembered from changing my WordPress date permalinks to do this:

    Redirect 301 /wiki/Interviews /library/transcript/
    RewriteRule ^wiki/Interviews_\((.*)\)$ /library/transcript/$1 [L,R=301]
    RedirectMatch 301 /wiki/(News|News_Articles) /library/news/
    RewriteRule ^wiki/News_Articles_\((.*)\)$ /library/news/$1 [L,R=301]
    

    The main URLs were directed to the new main locations, but the per-year ones were sent, logically, to their new year locations. But that was the easy part. When I moved things over, I got rid of ‘character’ pages (which I had only sporadically updated anyway) and I wanted to combine a lot of the redundant pages:

    RedirectMatch 301 /wiki/(The_West_Wing|West_Wing|west_wing|ww|Jed_Bartlett) /library/show/west-wing/
    RewriteRule ^wiki/The_West_Wing_\((.*)\)$ /library/show/west-wing-episodes/ [L,R=301]
    

    That’s starting to look a lot better. I did a lot of those, and I tried to make them as dynamic as possible, but there were limits. In the end, I had about a dozen popular links I had to do manually. I don’t like that, but that’s the world I got myself into. I wanted to redirect news and interviews to at least their year pages, but as it turned out, I’d used the same naming conventions for both, so you can’t tell from an old page name alone where it should land:

    Redirect 301 /wiki/Yahoo_News_(01_January_2000) /library/news/2000/yahoo-1/
    Redirect 301 /wiki/EOnline_(01_January_2011) /library/transcript/2011/eonline-1/
    

    See the problem? There’s no way to really pattern-match those to the right section. I played with a lot of options before I ran a search on my recent traffic to see which pages in the /wiki/ folder people were still visiting, and redirected those as best I could.

    Finally I had my fall back:

    RewriteRule ^wiki/(.*)$ /where-has-the-wiki-gone/ [L,R=301]
    

    This is the last rule. If anything gets through the others and down to this one, it sends them to a new page that explains where the wiki went and links to the popular pages.

    The most important thing to remember in all this is to put things in order. Your .htaccess is a top-down file. It starts at the top, it processes rules in order, until it’s done.
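    To make that concrete, here’s the shape you want, using the rules from above:

    # Specific rules first...
    RewriteRule ^wiki/Interviews_\((.*)\)$ /library/transcript/$1 [L,R=301]

    # ...and the catch-all dead last. If it came first, nothing below it
    # would ever run, and every wiki URL would land on the explainer page.
    RewriteRule ^wiki/(.*)$ /where-has-the-wiki-gone/ [L,R=301]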

  • Mailbag: .htaccess Magic with Subdomains and Subfolders

    From Ginger:

    Hiya, I enjoy your blog and humor within

    In case you still do posts from the “mail bag” — I have seen mention of .htaccess on your site and I’m curious how it relates to the subject of redirects. For example, in my day job we have a site that has two ways to access an external support site.

    So, we want to redirect www.example.com/support and support.example.com to the same external support site. In the past, I would go to our host and edit the DNS to redirect these URLs to the external site. Is that the best way, or should I be handling this in the .htaccess file now that we’re on WordPress?

    Depends on how lazy I am.

    DNS is great because I can just send support.example.com over, but this only works if the external site lets me do that. I can point my domain at tumblr.com all I want, yet until I add my custom domain to their settings, the URL won’t work. This is why your domain mappings in Multisite don’t always work, folks. You have to tell WordPress, the name servers, and the server that the domain lives there. If you can? Super simple for the subdomain, but not for the subfolder.
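    If the external site does let you map your domain, the DNS half really is one record (hypothetical hosts here; the target is whatever your provider tells you to use):

    ; Send the support subdomain to the external provider
    support.example.com.  IN  CNAME  portal.supporturl.com.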

    Which brings us to .htaccess:

    Apache 2.2

    RewriteCond %{HTTP_HOST} ^example\.com [NC]
    RewriteRule ^support(.*)$ http://supporturl.com/ [L,R=301]
    
    RewriteCond %{HTTP_HOST} ^support\.example\.com [NC]
    RewriteRule ^(.*) http://supporturl.com/ [L,R=301]
    

    Apache 2.4

    <If "%{HTTP_HOST} == 'example.com' ">
    	RedirectMatch ^support(.*) http://supporturl.com/
    </If>
    
    <If "%{HTTP_HOST} == 'support.example.com'">
    	RedirectMatch (.*) http://supporturl.com/
    </If>
    

    Now someone will note that specifying the HTTP_HOST for example.com is silly, but I disagree. This lets me use http://foo.example.com/support/ to redirect somewhere else. I don’t worry about the www part because I always force no-www on all my domains. Saves me steps later.
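    For the curious, ‘force no-www’ in Apache 2.4 terms is its own little If block. A sketch, assuming a hypothetical example.com:

    # Strip the www so the host checks above only ever see one hostname
    <If "%{HTTP_HOST} =~ /^www\./">
    	RedirectMatch (.*) https://example.com$1
    </If>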

    Oh and I’m glad Ginger likes my humor. I don’t know how to turn it off!

  • The Revolution of .htaccess and Multiple Domains

    I mentioned this in a post about jiggering Google and Multisite WordPress, and my buddy Jan went “WOW!” So I’ll start with the best thing ever in Apache, to me at least. Apache 2.4 allows for real if/else statements in .htaccess.

    Benjamin Franklin's glasses - because revolution jokes are funny

    In 2012, I wrote about how I did a lot of request header detection in order to make myriad blocks of checks to sort out my .htaccess anarchy. As I explained here, I have to do a check for the domain for each and every rewrite rule:

    Why did I duplicate the RewriteCond? Because you cannot use multiple RewriteRule statements after a single RewriteCond; the condition only applies to the rule immediately following it. That means for every call I make to a domain, I can use but one rewrite rule. There are ways around that, but none of them worked well for me.

    It’s ugly on Apache 2.2. Since that time, however, I’ve moved to Apache 2.4, and the world is vastly different thanks to If/Else calls!

    Let’s take this old section I had from my move (about 6 years ago) from blog.ipstenu.org to ipstenu.org, as well as some permalink changes and a change to my uploads folder:

    # Ipstenu Moves
    RewriteCond %{HTTP_HOST} ^blog\.ipstenu\.org
    RewriteRule ^(.*) https://ipstenu.org/$1 [L,R=301]
    RewriteCond %{HTTP_HOST} ^ipstenu\.org
    RewriteRule ^blog/([0-9]{4})/([0-9]{2})/(.*)$ https://ipstenu.org/$1/$3 [L,R=301]
    RewriteCond %{HTTP_HOST} ^ipstenu\.org
    RewriteRule ^blog/(.*)$ https://ipstenu.org/$1 [L,R=301]
    

    Taking each block at a time, the first was easy:

    <If "%{HTTP_HOST} == 'blog.ipstenu.org'">
        RewriteRule ^(.*) https://ipstenu.org/$1 [L,R=301]
    </If>
    

    Pretty simple. If the host is blog.ipstenu.org, redirect. And it mostly worked. Except where http://blog.ipstenu.org/dsfasfsdf sent me to https://ipstenu.org/home/ipstenu/public_html/dsfasfsdf … Which isn’t good! The fix here is that you use RedirectMatch instead of RewriteRule, which gives us this!

    <If "%{HTTP_HOST} == 'blog.ipstenu.org'">
    	RedirectMatch (.*) https://ipstenu.org$1
    </If>
    

    Excellent. Then on to the second one, which becomes this:

    <If "%{HTTP_HOST} == 'ipstenu.org' ">
    	RedirectMatch ^/blog/([0-9]{4})/([0-9]{2})/(.*) https://ipstenu.org/$1/$3
    	RedirectMatch ^/blog/(.*) https://ipstenu.org/$1
    </If>
    

    What’s the difference here? Well, RewriteRule is handled by Apache’s mod_rewrite, while Redirect is handled by mod_alias. I know I have mod_rewrite on, but I don’t know exactly why it insists on tossing in the path statement. My best guess: inside an <If> block, mod_rewrite acts like it’s in a per-directory context and matches against the filesystem path instead of the URL, which is how the document root ends up in the capture.

    However I can assure you that these ifs work perfectly. I’m using them right here on this site. My whole .htaccess is wrapped with them.