How To

Restrict Site Access Filters

Handling restricted access and caching together, without going crazy.

I have a demo site I use for development. One of the things I want is to be able to lock the site down to logged-in users only, which I can do via Restricted Site Access by 10up.

One of the things the plugin also allows is to open up access to an IP, so someone who doesn't have an account can check the site before you go live. The problem with this feature is caching.

Caching Restricted Pages

It doesn't really matter what kind of caching system you use, the point is all the same. People who aren't logged in should get a cached version of the content. People who are logged in, or whom you've determined need a unique experience, don't get cached content. That's the barebones of caching.

The problem I ran into with restricted site access is that if I whitelisted an IP range, and someone from that range visited the site, they generated a page which my cache system … cached. That meant the next person got to see the cached content.

[Image: Worf from Star Trek face-palming]

Now this may not actually be a problem in all cache systems, but I happened to be using Varnish, which is fairly straightforward about how it works. And, sadly, the plugin I'm using doesn't have a way around this. Yet.


Filters and Hooks

Like any enterprising plugin hoyden, I popped open the code and determined I needed to address the issue here:

// check if the masked versions match
if ( ( inet_pton( $ip ) & $mask ) == ( $remote_ip & $mask ) ) {

This section of code is checking "If the IP matches the IP we have on our list, stop processing the block. It's okay to show them the content." What I needed was to add something just above the return to tell it "And if it's Varnish, don't cache!"

At first my idea was to just toss a session_start() in there, which does work. For me. Adam Silverstein was leery of that having unintended consequences for others, and wouldn't it be better to make it hookable? After all, then any caching plugin could hook in! He was right, so I changed my pull request to this:

do_action( 'restrict_site_access_ip_match', $remote_ip, $ip, $mask ); // allow users to hook ip match

The next release of the plugin will include that code.


In The Field

Now, assuming you've slipped that code into your plugin, how do you actually use it?

Since I only need this on my 'dev' site, and I'm incredibly lazy (read: efficient), I decided to put this code into the MU plugins I use for the site:

if ( DB_HOST == '' ) {
	add_action( 'restrict_site_access_ip_match', 'mydevsite_restrict_site_access_ip_match' );
}

function mydevsite_restrict_site_access_ip_match() {
	session_start(); // a PHP session signals Varnish that this visitor needs a unique, uncached experience
}
This is not the only way to do it. I also happen to have define( 'MYSITE_DEV', true ); in my wp-config.php file, so I could have checked that instead:

if ( defined( 'MYSITE_DEV' ) && MYSITE_DEV ) { ... }
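Wired up, that alternative gate might look like this (MYSITE_DEV and the callback name are from my own setup; adjust to taste):

```php
// Alternative gate: key off a constant defined in wp-config.php
// instead of comparing DB_HOST.
if ( defined( 'MYSITE_DEV' ) && MYSITE_DEV ) {
	add_action( 'restrict_site_access_ip_match', 'mydevsite_restrict_site_access_ip_match' );
}
```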

Now, you'll notice I'm using sessions, even after Adam and I determined this could be bad for some people. It can. And in my case, in this specific situation, it's not dangerous. It's a quick and dirty way to tell Varnish not to cache (because PHP sessions indicate a unique experience is needed).
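The mechanism, roughly: session_start() makes PHP set a PHPSESSID cookie, and Varnish's default configuration passes requests carrying cookies straight to the backend instead of serving them from cache. A minimal sketch of the callback, with a guard so it's safe if a session already exists:

```php
// Sketch: starting a session attaches a PHPSESSID cookie to the response.
// Varnish, in its default configuration, won't cache responses that set
// cookies, and passes later cookie-carrying requests to the backend.
function mydevsite_restrict_site_access_ip_match() {
	if ( PHP_SESSION_NONE === session_status() ) {
		session_start();
	}
}
```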

The downside is that not caching means more load on my server for the non-logged-in visitors who are legitimately supposed to be viewing the site. Since this is a development site, I'm okay with that. I would never run this in production on a live site.

4 replies on “Restrict Site Access Filters”

I write membership software and deal with this all the time. Why not emit HTTP headers to instruct Varnish (and other reverse caching proxies) to not cache the page?

WordPress has a function to do this: nocache_headers().


However, I’ve found it incomplete, so I wrote a wrapper that emits the following headers after whatever nocache_headers() sends out. The wrapper also checks that it’s safe to emit new headers using the headers_sent() function. Here are the additional headers I emit:

header( 'Cache-Control: no-cache, max-age=0, must-revalidate, no-store' );
header( 'Pragma: no-cache' );
header( 'Expires: 0' );
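Put together, the wrapper described above might look something like this (the function name is hypothetical; nocache_headers() and headers_sent() are real WordPress and PHP functions):

```php
// Hypothetical wrapper: only emit headers if none have been sent yet,
// then layer the stricter directives on top of WordPress's defaults.
function mysite_nocache_headers() {
	if ( headers_sent() ) {
		return; // too late; emitting now would trigger a PHP warning
	}
	nocache_headers();
	header( 'Cache-Control: no-cache, max-age=0, must-revalidate, no-store' );
	header( 'Pragma: no-cache' );
	header( 'Expires: 0' );
}
```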

There is still a chance that the cache is poorly behaved and doesn’t respect the headers.
Another tactic I’ve used with WordPress-aware caches is to emit a faux WordPress cookie. The caches can’t and don’t validate the cookie, so it doesn’t log the user in when they visit the page. They just check whether the cookie exists and bypass the cache if it does.
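A faux cookie along those lines might be set like this (a sketch; the cookie name mimics WordPress's wordpress_logged_in_{COOKIEHASH} pattern, and the value is deliberately junk since the cache never validates it):

```php
// Sketch: emit a cookie shaped like a WordPress login cookie. WP-aware
// caches only check that it exists, so they bypass the cache, while
// WordPress itself rejects the bogus value and the visitor stays logged out.
function mysite_emit_faux_login_cookie() {
	if ( ! headers_sent() && ! isset( $_COOKIE[ 'wordpress_logged_in_' . COOKIEHASH ] ) ) {
		setcookie( 'wordpress_logged_in_' . COOKIEHASH, 'faux', 0, COOKIEPATH, COOKIE_DOMAIN );
	}
}
```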


So… you asked me why I didn’t do it your way while at the same time saying there was still a chance it wouldn’t work.

The cache I’m using won’t cache if sessions are true. I know this to be a fact because that’s how I wrote the cache tool. So I use what I know, absolutely, will work.

The wp cookie method always works because it uses the same mechanism the proxy uses to not cache logged-in users.

The other headers should work, and will still be honored downstream even if the reverse proxy is poorly behaved. They also help improve the behavior of pages revisited via the back button, which can otherwise break form nonces.


The cookie method actually won’t always work if your proxy cache is configured to ignore certain cookies, which is a pretty common configuration. Many Varnish-style hosts will delete cookies they don’t recognize, which is why you have to whitelist shopping cart cookies.
