A while back I talked about Referrer Spam in Google Adsense, and I mentioned how you could block referrer spam with some .htaccess rules. That’s cool, but when you have 12 sites on a server, managing one more thing per site is a pain in the ass. Well okay, what can we do constructively? Sadly, the answer is “Not much.”
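For context, the per-site version looked roughly like this in each site’s .htaccess. This is a paraphrase of the idea, not the exact rules from that older post:
<IfModule mod_rewrite.c>
RewriteEngine on
# Per-site version: refuse any request that arrives with the spammy referrer
RewriteCond %{HTTP_REFERER} spammerseocompany\.com [NC]
RewriteRule .* - [F,L]
</IfModule>
Multiply that by a dozen sites and you can see why I wanted it in one place.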
First of all, forget the idea of using a robots.txt file. If these were real SEO crawlers, they would honor it. They don’t, and that’s how I know they’re evil.
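To be fair to the robots.txt idea, a crawler that actually behaves would back off with nothing more than this. The user-agent name here is made up; you’d need whatever the bot really announces itself as:
# robots.txt - only works if the crawler chooses to obey it
User-agent: SpammerSEOBot
Disallow: /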
Secondly, this will only work if you have server-wide access. That should be obvious: server-wide settings need server-wide access, and that’s just how it is. The part that sucks is that it can be a little complicated and messy to figure out where everything goes.
If you have your own server, like I do, then you can make a custom VirtualHost template.
Since I’m using Apache 2.4, I made local templates:
$ cd /var/cpanel/templates/apache2_4/
$ cp ssl_vhost.default ssl_vhost.local
$ cp vhost.default vhost.local
If you’re using 2.2, the files are in /var/cpanel/templates/apache2_2/ instead. In each file, I added this to the top of the VirtualHost settings:
RewriteEngine On
RewriteOptions Inherit
That tells Apache that each virtual host should inherit the rewrite rules from the main server configuration. In other words, every VirtualHost (i.e. every website) on the box will abide by whatever rewrite rules you define globally.
Where exactly you put this in the template can be a little weird. I ended up looking for this section and putting my two lines right below it:
[% IF !vhost.hascgi -%]
    Options -ExecCGI -Includes
    RemoveHandler cgi-script .cgi .pl .plx .ppl .perl
[% END -%]
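After the edit, that part of vhost.local looks roughly like this. This is a sketch from memory, so your template may differ a little:
[% IF !vhost.hascgi -%]
    Options -ExecCGI -Includes
    RemoveHandler cgi-script .cgi .pl .plx .ppl .perl
[% END -%]
# Pick up any mod_rewrite rules defined at the server level
RewriteEngine On
RewriteOptions Inherit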
Put that in both files, because you use HTTPS, right? Then you need to rebuild httpd.conf:
/scripts/rebuildhttpdconf
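I follow the rebuild with a restart so the regenerated httpd.conf actually gets loaded. On a stock cPanel box that’s the script below; adjust if your setup differs:
# Restart Apache so it picks up the regenerated httpd.conf
/scripts/restartsrv_httpd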
Since I’m using WHM, the next step is to go into the Apache Configuration section and open the Include Editor. Then you add your blocking directives under ‘Pre-Virtual Host Include’ for All Versions. If you don’t use WHM, you can edit /usr/local/apache/conf/includes/pre_virtualhost_global.conf directly and bounce Apache afterward.
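The non-WHM route looks something like this, assuming the standard EasyApache paths; check the syntax before you reload:
# Edit the global pre-VirtualHost include directly
nano /usr/local/apache/conf/includes/pre_virtualhost_global.conf
# Sanity-check the config, then reload Apache gracefully
/usr/local/apache/bin/apachectl configtest
/usr/local/apache/bin/apachectl graceful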
I already have some content in there from before.

I added this below:
<IfModule mod_rewrite.c>
RewriteEngine on
# Block any request whose Referer matches either spam domain (case-insensitive)
RewriteCond %{HTTP_REFERER} spammerseocompany\.com [NC,OR]
RewriteCond %{HTTP_REFERER} keywords-monitoring-your-success\.com [NC]
# [F] answers with 403 Forbidden; [L] stops processing further rules
RewriteRule .* - [F,L]
</IfModule>
Does it work? Yes. It blocks ‘spammerseocompany’ from all the domains on my server. I put in the other URL since that’s the one of theirs that’s currently spamming the heck out of my stuff. There are other options with Apache 2.4 too: the [F] flag above sends a 403, but you could send a 410 Gone with [G], redirect them elsewhere, and so on. You should read up on using mod_rewrite to control access and pick the method you find most sustainable. For example, you could combine the conditions into a single line:
RewriteCond %{HTTP_REFERER} (spammerseocompany|keywords-monitoring-your-success)\.com [NC]
I find that a bit clunky.
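If you’d rather not lean on mod_rewrite at all, Apache 2.4 can do the same job with mod_setenvif and the newer authorization directives. A sketch of that approach, not what I’m actually running:
<IfModule mod_setenvif.c>
# Tag any request whose Referer matches either spam domain
SetEnvIfNoCase Referer "(spammerseocompany|keywords-monitoring-your-success)\.com" spam_referer
<Location "/">
    <RequireAll>
        Require all granted
        Require not env spam_referer
    </RequireAll>
</Location>
</IfModule>
Be careful with that one: a Location "/" block sets access policy for everything under it, so it can step on other Require rules. Test it before you trust it.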
If you’re using nginx, you’ll want something like this, I believe:
if ($http_referer ~* "keywords-monitoring-your-success\.com|spammerseocompany\.com") {
    return 403;
}
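That if block belongs inside the server block of each site you want covered, something like this (the server_name and root are placeholders):
server {
    listen 80;
    server_name example.com;
    root /var/www/example.com;

    # Refuse requests carrying a spammy Referer header
    if ($http_referer ~* "keywords-monitoring-your-success\.com|spammerseocompany\.com") {
        return 403;
    }
}
nginx also has its non-standard 444 code, which simply drops the connection without sending a response, if you’d rather not waste any bytes on them.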
A big note of caution here: if your list gets too long, you’ll end up slowing your server down. A lot. So keep it as simple as you can. I find that CSF does a dandy job of blocking most of my troublemakers, and I only need this for the unnamed spammerseocompany because they don’t abide by the common rules of robots.
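CSF’s command line makes the IP-based blocks painless; a quick example with a made-up address:
# Deny one offending IP, with a comment for future reference
csf -d 203.0.113.45 "referrer spam bot"
# See what the firewall currently has for that address
csf -g 203.0.113.45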
If, one day, they do, I will stop blocking them and allow their robots. As it stands, they’re idiots and need to go away.