Half-Elf on Tech

Thoughts From a Professional Lesbian

Author: Ipstenu (Mika Epstein)

  • Genericons: Plugin’d

    Genericons: Plugin’d

    The thing about all this is that I really like Font Awesome. The licensing drives me to drink. The WordPress Repository has an extra rule, saying everything there has to be GPLv2 or later, for reasons that aren’t the point. What is the point is that the moment Genericons came out, I knew that it should be a plugin, because a totally GPL-compatible version of a font like this was what people wanted.

    Since I also knew Rachel Baker had made a killer Font Awesome Plugin (and yes, that’s the one I use), I quickly stripmined its code and made Genericon’d. (At this point it’s pretty much a re-write, but I always credit where I started!)

    The name is not Genericons because it’s not official, and they may want that name later. With that in mind, I thought “Well I totally Genericon’d them all!” because sometimes I talk like Zaboo from “The Guild.” I think of him as the Patron Avatar of this Plugin (though he’d probably ask why there wasn’t a Genericon for his staff, or Codex’s).

    So what are these ‘font icon’ things anyway and how do they work?

    Normally if you want to insert a Twitter image, let’s say, you would have to go find the image, download it, edit it to the right size, upload it, and embed it. On the other hand, with a font you can do this: [genericon icon=twitter] and that little shortcode renders as the Twitter icon. Isn’t that cool? All you have to do is include the font and the CSS in your site and you’re good to go. Those files are smaller than most images, load faster, and best of all, they scale better.

    [genericon icon=twitter size=4x] Same font, bigger size. Isn’t that cool? Since they’re pure CSS, you can do whatever you want, from changing colors and size to inserting into menus, like I did on another site. When you add in their relatively small file size and scalability, you gain an added level of awesome because your little icons always look amazing on retina displays too! After all, they’re just fonts.

    The alternative to something like this would be to use sprites, which is actually what WordPress uses today on your dashboard, and they look like this:
    WordPress's Menu

    If you go look at your WordPress dashboard, you’ll notice that hovering over these images makes them change between the dull grey and the cool colorized version. In order to do that, you have two images. Not so with Genericons! .genericon-twitter:hover {background-color:pink;color:purple;} would do the same thing (in pretty garish colors…). Just as an example of how it works, here’s a link with a Genericon in it: [genericon icon=twitter] @ipstenu. It’s actually kind of nice how it automatically adapts to the CSS I have in place for hovering over links.
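
    Under the hood there’s no real magic to a shortcode like that: it just spits out a span with the right Genericons CSS classes and lets the font do the rest. Here’s a rough sketch of the idea (my own illustration, not Genericon’d’s actual code; the function name, size handling, and attribute defaults are all made up):

    add_shortcode( 'genericon_demo', 'halfelf_genericon_demo' );
    function halfelf_genericon_demo( $atts ) {
        // Illustrative defaults only; the real plugin sanitizes more and
        // also enqueues the Genericons stylesheet.
        $atts = shortcode_atts( array(
            'icon'  => 'standard',
            'size'  => '1x',
            'color' => '',
        ), $atts );

        $style = 'font-size:' . intval( $atts['size'] ) . 'em;'; // '4x' becomes '4em'
        if ( $atts['color'] ) {
            $style .= 'color:' . esc_attr( $atts['color'] ) . ';';
        }

        return '<span class="genericon genericon-' . esc_attr( $atts['icon'] ) . '" style="' . $style . '"></span>';
    }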

    Basically the reasons to use icon fonts instead of images are that you can style them with CSS, they look good on all displays at any resolution, they easily adapt to fit your site when you change themes and colors, there’s only one HTTP call for the icons, and they’re open source.

    Here are some features in Genericon’d (as of version 1.2) that I think are kinda awesome:

    On the fly color changing.

    You can make a Twitter Blue icon: [genericon icon=twitter color=#4099FF] renders the Twitter icon in that shade of blue.

    On the fly resize.

    You can make a Facebook icon bigger: [genericon icon=facebook size=4x] renders the Facebook icon at four times the normal size.

    And it all pretty much works the way I want it to. I did tweak the CSS a little to use em instead of px, which isn’t perfect. Genericons works best when your base font size is a multiple of 16px, and for some reason, people still default to 12px. Protip: Ask someone with imperfect vision to look at your site. If they squint, your font is too small.

    Genericons, and any font-icon add-on, aren’t perfect for everyone or every site, but they’re here if you need ’em.

  • IE 8 and SVG Smilies

    IE 8 and SVG Smilies

    I don’t like the default smilies in WP. There, I said it. They’re old and busted, so I use the smilies_src filter to replace them with nicer ones. Recently it came to my attention how old and busted my cute PNGs looked on my iPad and any retina capable computer, so I fiddled around and decided SVG graphics were the way to go. They scale well, and they work on all modern browsers, yay!

    Oh, wait, IE 8 is not a modern browser and it’s pretty common out there… In fact my old job still uses it on a lot of PCs. And so does at least one user on this site (actually 5, and one more uses IE 6, for crying out loud!) so I came up with this:

    // Move smilies: serve SVG to modern browsers and fall back to PNG for old IE.
    add_filter('smilies_src','my_smilies_src', 1, 10);
    function my_smilies_src($img_src, $img, $siteurl) {
        $img = rtrim($img, "gif"); // strip the trailing "gif", leaving "icon_smile."

        $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';

        if ( strpos( $ua, 'MSIE 8' ) || strpos( $ua, 'MSIE 7' ) ) {
            $type='png'; // IE 7 and 8 can't render SVG
        }
        else {
            $type='svg';
        }
        return 'http://domain.com/images/smilies/'.$img.$type;
    }
    

    That said, I wish we had more modern smilies available for WP. Finding a set that looks okay (like the ones I have here) and is also retina capable is not easy. I could use user agents to go the other way, checking if the visitor was on a ‘new’ iPad or iPhone and showing them retina images that way, but to the best of my knowledge, there’s no way (yet) to do it so that a retina MacBook also gets the nicer view. With that in mind, I went with SVG, which scales naturally and meets all of my needs.

    By the way, thank Otto for the smilies filter. You can use it for normal filtering too:

    add_filter('smilies_src','ipstenu_smilies_src', 1, 10);
    function ipstenu_smilies_src($img_src, $img, $siteurl){
        $img = rtrim($img, "gif"); // "icon_smile.gif" becomes "icon_smile."
        return $siteurl.'/images/smilies/'.$img.'png';
    }
    

    That’s what I use here.

  • Cacheless (or not)

    Cacheless (or not)

    ETA: As of a month later, I’ve actually switched from APC to Zend Optimizer+

    Don’t get me wrong, I love caching. I love W3 Total Cache (I’m willing to spend my ‘free time’ testing it, after all), and WP Super Cache saved my life once. So why, on a day where I got a 400-600% uptick in traffic (not a joke), did I turn all my caching off? I’m daring, and a little crazy, but I wanted to see if it could be done.

    I would not have tried this if I was on a smaller server: if you’re getting as much traffic as I am, and you’re on shared hosting, you really need to move to a VPS or Dedicated Server if you want to turn off caching via plugins. It’s not to say that caching is better or worse than not-caching, or vice versa, or that one is a rich-man/poor-man equivalent of the other. Caching plugins are an inexpensive way to speed up your site, and if you can’t afford a bigger server they will buy you the time you need to figure out a better solution. Even with a good plugin and setup, if you get hammered with a lot of traffic, you will crash your site unless the server’s optimized too. Again, what I’ve done is not something I’d try on a low-end server with high traffic.

    When I started measuring the effectiveness of all this, I used:

    To understand what caching is and why we use it, it helps to start with two basic concepts, and then look at what caching plugins are, how they work, and where their pitfalls are.

    • There are parts of your website that don’t change often (images, javascript, CSS, etc).
    • You want the user to only download what’s changed.

    That sounds easy, but WordPress isn’t static HTML, it’s PHP, and that means every time you visit the page, it runs various processes to give you the latest and greatest files. The problem with this dynamic code is where content changes rapidly (think ‘comments’ or ‘forums’ or ‘BuddyPress Groups’). Suddenly caching ‘pages’ as wholesale chunks of HTML doesn’t help if you have to re-cache when someone leaves a remark. Add in the possibility of 4 or 5 people commenting at once, for 12 hours, and now you’re risking a thrashing situation where you keep trying to cache, but it keeps flushing. This is why most people use plugins that handle things elegantly, or try to, where the ‘static’ part of the page (sidebar, etc) is HTMLized, but the dynamic part is left alone. This helps when a portion of every page is dynamic, like a shopping cart with a ‘Your order…’ box.
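
    If you’re curious what that ‘fancy code’ looks like in miniature, here’s a rough sketch of fragment caching with transients (my own example, not from any particular plugin; the widget area and key names are made up). The mostly-static sidebar gets cached and reused, while comments, carts, and the rest of the page keep rendering fresh:

    function halfelf_cached_sidebar() {
        $html = get_transient( 'halfelf_sidebar_fragment' );
        if ( false === $html ) {
            ob_start();
            dynamic_sidebar( 'primary' ); // the expensive widget rendering only runs on a cache miss
            $html = ob_get_clean();
            set_transient( 'halfelf_sidebar_fragment', $html, 600 ); // keep it for ten minutes
        }
        echo $html;
    }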

    But the downside is that you have to write fancy code that keeps the dynamic portions dynamic, and while it certainly can be done, it’s not fun, and let’s be honest, a theme developer doesn’t know which cache you’re going to use, so how can they write the right way for it? The only way to make a truly dynamic and cachable site is to do it from day one, with your theme, server, and plugins all crafted to provide the best experience. And then we have reality, which is we start with something simple, wake up to something large, and experience growing pains.

    Accepting the fact that we’re not starting from nothing, that we have an existing site with content and activity, the first thing most people do is install a plugin. Now, back to what I said before, this isn’t a bad thing. It’s a good first step and will buy you time. It’ll also show you where you need to go. If you don’t have server root access, this may be your limit, too, as some of the other things I like to do to speed things up without a cache will require it (or you’ll have to ask your hosts and they may tell you to upgrade).

    If you’re going to use a plugin, WP Super Cache (WPSC) and W3 Total Cache (W3TC) are the best two. W3TC is way more advanced, and has a lot of extra bells and whistles, but personally I find that once you can master it, you’re well on your way. Remember though, you’re sacrificing a lot of control here by using a plugin. They’re going to, by their nature, cache everything they can, and we’re back to where we were with the dynamic site generation issue. W3TC has a bunch of extra .htaccess/nginx rules which parse data before you hit WordPress. WPSC can do that, or use PHP (which is slower).

    The dynamic nature of my site is what drove me away from caching plugins. I use other CMS tools, and for my infrequently updated Wiki and ZenPhoto Gallery, where content is very much static, caching makes perfect sense. But when I want to run a simple community site with WordPress, I have to consider all aspects of user experience. Speed is hugely important, but so is the user getting the content they want. Stale content is a killer.

    The reason I decided to see if my site ran slower without caching was that I was reinstalling caching and I thought “This is a perfect time to benchmark.” When I did, I was astounded. There was very little difference in a benchmark test. Really, no difference at all, since the results fell within each other’s margins, but I neglected to save the results at the time. I did however snap a picture of my server load (the unrelated part is where I was uploading 10 megs of media. Unrelated.):

    Server load graph

    Browser Caching is the first thing to tweak, as that tells browsers to cache content. The way this works is your .htaccess tacks on extra information while content like images and CSS is being downloaded, to say “This content is good for X days.” With WordPress, you don’t have to worry about changing the CSS, as most themes and plugins are extra smart, in that they append a version to the end of your CSS like this: style.css?ver=1.9.1 That 1.9.1 is the version of Genesis I’m running, so when that changes, the version changes, and browsers see it as a new file and re-download. That’s pretty cool. (I do wish that child themes pulled in their version, so you could increment that way.) We still have to tell the browsers to cache, and for how long, so near the top of my .htaccess (just below my hotlink protection) I have this:

    ## BEGIN EXPIRES ##
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpg "access 1 year"
        ExpiresByType image/jpeg "access 1 year"
        ExpiresByType image/gif "access 1 year"
        ExpiresByType image/png "access 1 year"
        ExpiresByType image/x-icon "access 1 year"

        ExpiresByType text/css "access 1 month"
        ExpiresByType text/html "access 1 hour"

        ExpiresByType application/pdf "access 1 month"
        ExpiresByType application/x-javascript "access 1 month"
        ExpiresByType application/javascript "access 1 month"
        ExpiresByType text/javascript "access 1 month"
        ExpiresByType text/x-js "access 1 month"

        ExpiresByType application/x-shockwave-flash "access 1 month"

        ExpiresByType video/quicktime "access 1 month"
        ExpiresByType audio/mpeg "access 1 month"
        ExpiresByType video/mp4 "access 1 month"
        ExpiresByType video/mpeg "access 1 month"
        ExpiresByType audio/ogg  "access 1 month"
        ExpiresByType video/ogg  "access 1 month"

        ExpiresDefault "access 2 days"
    </IfModule>
    ## END EXPIRES ##
    

    I’ve added in only the types used by my site. I used to use Pragma caching headers as well, but I noticed that Google PageSpeed Insights and YSlow ignore them. Turns out that Pragma headers aren’t honored all the time, in fact, they aren’t honored often, so I just removed them. I don’t think it slowed my site down to have them, but the less to maintain, the better. This had an immediate positive impact, so it was time to look at the server.
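
    Speaking of that wish about child themes and their versions, you can get the same effect yourself when you enqueue the child stylesheet. A rough sketch (the handle name is made up): it reads the Version header from the child theme’s own style.css, so the ?ver= string changes whenever you bump that number.

    add_action( 'wp_enqueue_scripts', 'halfelf_child_theme_style' );
    function halfelf_child_theme_style() {
        // wp_get_theme() with no arguments returns the active (child) theme,
        // so get( 'Version' ) reads the Version: header from the child's style.css.
        $child_version = wp_get_theme()->get( 'Version' );
        wp_enqueue_style( 'child-style', get_stylesheet_uri(), array(), $child_version );
    }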

    Over the years, I’ve tuned httpd.conf so it doesn’t crash, I’ve got CSF locked down to prevent people from DoS’ing me over TimThumb, and I of course have APC turned on. Recently I broke down and installed mod_pagespeed when I upgraded to PHP 5.4. Just those things have done a lot to make my site run faster. I intentionally skipped things like Varnish or TrafficServer, as well as a CDN or Google’s PageSpeed Service. I (still) don’t need them.

    Since I’m new to Page Speed, I decided to look deep into the filters and enabled the following for the whole server: rewrite_javascript, rewrite_css, collapse_whitespace, elide_attributes. This had an immediate impact of what I jokingly called ‘Effective Minification.’ These filters are new to me, so I spent a lot of time reading up on all the filters, and I find them highly interesting. By having PageSpeed handle things like offloading jQuery, I take the load off of WordPress and other CMS tools, and don’t have to use a plugin. (Don’t get the wrong idea. There are uses for plugins! But I’m all about using the right tool for the right job. I don’t have plugins handle my WordPress database, because I feel it’s like using a screwdriver to hammer in a nail. You can….)

    I added in a couple more to my standard: remove_comments and rewrite_images. Then I went back to my site’s .htaccess and started turning on the things I wanted per-site.

    The ones I picked are move_css_to_head, defer_javascript, and insert_ga.

    Putting those in my .htaccess looks like this (note: no spaces between the filter names, or it all blows up with an error 500):

    
    <IfModule pagespeed_module>
        ModPagespeedEnableFilters move_css_to_head,defer_javascript,insert_ga
        ModPagespeedAnalyticsID UA-MYCOOLID-1
    </IfModule>
    
    

    That also means I don’t have to use a plugin to use Google Analytics for my whole site! This may not mean a lot to you, but I have multiple ‘apps’ on my site (four now) and when I edit themes, if I don’t have to do anything, it’s easier. Google will tell you not to do this, but unless they have a way for me to set pagespeed.conf in the /home/user/ folder of my server, I don’t know another per-user way to do this.

    Finally I went back on my word, and I installed a plugin. APC by Mark Jaquith. This isn’t a full reversal on my ‘No Plugins!’ stance before, though. All APC is, you see, is but one file that sits in wp-content and kicks things over to APC. Doing this alone moved my TTFB from an F to a B. Which is pretty impressive. Giving it a little time to bake, this worked out okay.
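
    For context, that one file is an object cache drop-in: it backs WordPress’s wp_cache_get() and wp_cache_set() functions with APC, so cached data survives between requests instead of being rebuilt on every page load. A tiny sketch of the API it speeds up (the key and group names here are mine, just for illustration):

    $posts = wp_cache_get( 'recent_posts', 'halfelf_demo' );
    if ( false === $posts ) {
        $posts = get_posts( array( 'numberposts' => 5 ) );
        wp_cache_set( 'recent_posts', $posts, 'halfelf_demo', 300 ); // held in APC for five minutes
    }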

  • Let Your Content Be Copied

    Let Your Content Be Copied

    Recently I undertook a personal project to convert a website from Flash to WordPress. I didn’t do this for any reason other than I wanted to do something nice for someone who has, in a very strange way, been the reason I am who I am within WordPress. She’s an artist, which means her website was very media heavy, and back in the early 2000s, the way to do this was Flash.

    I hated it.

    Oh I loved how it looked, but really that was it. It made her content slow, and it made it impossible for me to say “Hey, check out the new content!” without also saying “To get there, click on Sputnik, then on the fourth star, then the fifth box…” It’s just really bad UI, and no matter how pretty it is, the barrier between reader and content was nigh insurmountable. Also it doesn’t work on iOS these days.

    My father, similarly, had his old site as all PDFs, so when I redid his site for his birthday (which he loves, and yes, WordPress), I copied his PDFs to text, with a lot of LaTeX in there for the math, and he complained. “People will steal my content.” I pointed out they could do that anyway. In fact, I had, in essence, done what they would, downloading the PDF and copying out the text and images. He grumbled, but as soon as his peers remarked that they could finally read his work, he calmed down.

    I understand the fear of theft. You want to show your work to draw people in and then sell yourself. My father is a consultant and speaker, and his fear is that people will take his work and plagiarize him, or worse, make it seem like he endorsed them. If you think libel is a rough road on the Internet, try the endorsement shenanigans. Some people will do anything to make themselves seem more appealing.

    At the same time, I agree with Cory Doctorow that giving your books away isn’t bad. The reason my ebooks are pay what you will is that I want people to find me, and find value in me. You can argue I wouldn’t have my job if I didn’t do that. My ebook profits paid for a brake repair and help keep my webhosting fees under control, but I sure don’t make a living off two ebooks. But again, the point is not that the website, directly, makes me money, but that it allows me to make money.

    It seems counterintuitive. How can I make money giving things away? A website is like advertising. You don’t make money directly on an ad. You pay around $3.5 million to have an ad in the Super Bowl not because you think someone will drop what they’re doing, run out and buy Doritos, but because you are trying to make an impact. The best Super Bowl ads, the best ads in general, are the ones we watch and want to share with our friends. We talk about them, and when we’re at the store and spot Doritos, we have a positive association with them, and are inclined to buy them. Sneaky ain’t it?

    The website is the same thing. You read a lot about WordPress here, so at a certain point you start to associate ‘Ipstenu’ and ‘Half-Elf’ with WordPress. You see me on the forums, posting and helping people, and you get positive reinforcement of that association. Then you see I have an ebook about Multisite and you buy it. So why are the ebooks also pay what you want? Because people come to these things the other way, too. They find the ebooks, wonder about my qualifications and merits, and later come back and pay. And yes, I’ve gotten money that way too. After a while, when you build up your cred, you don’t have to mess with that and you can just sell, but at the time of my ebooks, I wasn’t someone who could say “My job is WordPress” so I couldn’t afford to just sell. I probably could today.

    But why keep giving things away? WordPress (the code) is free, and my content is technically free here. You’re not paying me to read this, after all. It goes back to positive associations. If you get a good association with something, you keep using it. Newspapers, back in the day, were the only way to get news. You paid for two things: the information and the reliability. The radio came and changed the game, letting you listen to news, but the papers stuck around because unlike TV and radio news, you didn’t have to wait for your segment to come up, you could flip the pages and read sports.

    The value of straight news didn’t really change until the Internet, where we started offering you the information at no immediate cost. Most of the time, the Internet sites can’t compete with reliability, but people became increasingly annoyed with having to pay for content. Buying the paper, sure, but I’m already paying for the Internet (access to it). Shouldn’t that be funneling back like TV fees do? Alas, they don’t, which means news media goes through hoops and ladders to try to lock their content down so you can’t copy it, or you have to pay to get at it. In return people like me find ways around paying to read content.

    It’s not that we don’t think we should pay for things. I do pay for books, music, movies, media, news, etc, and I encourage people to do so. It’s not the money at all, it’s the barrier between me and the content. There’s obvious value in reading the news, but the value is diminished by proliferation and frankly by their own quality. It’s true that if you build it they will come, but if you don’t let people share what they found, you won’t get more readers.

    That’s why the walls between your reader and your content need to go. That’s why you need to allow a direct link to your content, so I can say “Hey, I read this awesome article, go here!” You want me to tweet, text, link, post, tumble, and share your content so you get more readers, and more to the point, you get happy readers. The happier your readers, the more they feel like they should share. They’re getting a psychological kick-back from sharing, and we’re back to the positive association reinforcement we want.

    I’m certainly not going to say that giving away all your content is going to make you money, but I will say that giving away some of your content will do so. There’s no magic formula to say where the breakpoint is for your product, but there’s no way to do that for anything. You have to determine where you’re going to make your money. My father makes money with his work and lectures. By posting smaller excerpts of his essays and papers online for free, people can find value in his work and hire him. An artist can post lower resolution/quality versions of their art for free, and let the reader find merit in the product. A writer can put disparate thoughts that don’t really combine themselves well into one work up on their blog, and let people see the value in their books. And by letting people copy your content, by letting them quote in part or in whole, you make them happy.

    Do I worry about plagiarism and content theft? Funny thing, no. By having my SEO ranking high, based on Google and all being able to read my content, if someone searches for phrases found in my articles, they’ll find my site before the sploggers and thieves. By making it easier for people to link to me, I increase my SEO. The same goes for my quality of content. I make it high, people will link to me, and we get a happy circle of reciprocity. I never fear content theft, and because of that, I let my content be copied.

    It’s served me well.

  • wp-cron it up

    wp-cron it up

    If you’ve spent much time mastering *nix servers, you’ve run into cron. I love cron. Cron is a special command that lets you schedule jobs. I use it all the time for various commands I want to run regularly, like the hourly check for new posts in RSS on TinyRSS or RSS2Email, or the nightly backups, or any other thing I want to happen repeatedly. Cron is perfect for this, it runs in the background and it’s always running.

    My first thought, when I heard about wp-cron, was that it clearly tapped into cron! If I scheduled a post, it must write a one-time cron job for that. I was wrong. Among other reasons, not everyone has cron on their servers (Windows, I’m looking at you), but more to the point, cron is a bit overkill here. Instead WordPress has a file called wp-cron.php which it calls under special circumstances. Every time you load a page, WordPress checks if there’s anything for WP-Cron to run and, if so, calls wp-cron.php.
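
    For the curious, scheduling a post boils down to WordPress registering a one-time WP-Cron event; roughly (and I do mean roughly, this is a simplified sketch with a made-up post ID and date, not core’s exact code) it looks like this:

    // Register a one-time event on the 'publish_future_post' hook; a later page load
    // fires wp-cron.php, the event runs, and the post goes live.
    $post_id = 123;
    wp_schedule_single_event( strtotime( '2013-06-01 09:00:00' ), 'publish_future_post', array( $post_id ) );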

    Some hosts get a little snippy about that, claiming it overuses resources, and ask that you disable wp-cron. One might argue that checking on every pageload is overkill, but realistically, with all the other checks WordPress puts in for comments and changes, that seems a bit off. In the old WordPress days, this actually was a little true. If you had a server getting a lot of traffic, it was possible to accidentally loop multiple calls at the same time. This was fixed quite a bit in WordPress 3.3 with better locking (so things only run once). Multisite was a hassle for a lot of people, but seems to have been sorted out.

    We already know that WordPress only calls wp-cron if it needs to, so if you disable it, you have to run that manually, which ironically you could do via a real cron job. (Read How to Disable WordPress WP-Cron for directions on how to disable it and then use real cron.) To disable wp-cron, just toss this in your wp-config.php:

    define('DISABLE_WP_CRON', true);

    Now, remember, you still want to run a ‘cron’ job to fire off anything scheduled, so you have to add in your own cron job for this, which I actually do! I have wp-cron turned off, and a ‘real’ cron job runs at 5 and 35 past the hour to call wp-cron.php with a simple command:

    curl http://example.com/wp-cron.php
    

    Now since I have multiple domains on this install, I actually have that for every site on my Multisite. After all, wp-cron.php on Ipstenu.org doesn’t run for photos.ipstenu.org and so on. Doing this sped up my site, and put the strain back where I felt it should be: the server.

    By the way, wp-cron does more than schedule posts. It’s what checks for plugin, theme and core updates, and you can write code to hook into it to schedule things like backups. Remember before how I said that WP checks to see if there’s anything scheduled? This is a good thing, since if you have your server run a long job (like backing up a 100meg site), you don’t want to wait for that to be done before your page loads, right?
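
    Hooking into it yourself is pretty painless, too. A quick sketch (all the names here are made up, and this isn’t from any particular backup plugin): register a recurring event once, then hang a normal action on it.

    if ( ! wp_next_scheduled( 'halfelf_nightly_backup' ) ) {
        wp_schedule_event( time(), 'daily', 'halfelf_nightly_backup' );
    }
    add_action( 'halfelf_nightly_backup', 'halfelf_run_backup' );
    function halfelf_run_backup() {
        // Kick off the actual backup job here.
    }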

    For most people, on most sites, this is all fine to leave alone and let it run as is. For the rest, if you choose to use real cron instead, keep in mind how things are scheduled. I set my cron to run at 5 and 35, so I tend to schedule posts for 30 and the hour. That means my posts will go up no later than 5 minutes after. That’s good, but if I use my DreamObjects plugin, the ‘Backup ASAP’ button schedules a backup to run in 60 seconds (so as not to stop you from doing anything else, eh?), which means I’d have to manually go to example.com/wp-cron.php in order to kick it off. Clearly there are issues doing this for a client.

  • Hotlinking is Evil (And So Is Google)

    Hotlinking is Evil (And So Is Google)

    We all know hotlinking is a bad thing. Hotlinking uses up someone else’s bandwidth, which costs them money. It takes away from any profit they might make on ads, because you’re not going to their site. It removes their credit from images. So why did Google decide to hotlink when they made their faster image search?

    This is what the new image search looks like:

    Faster Image Search

    I’ll admit, that looks pretty nifty. It’s a fast way to see images. But it’s also a fast way to lose attribution. Here’s what just the new image box looks like.

    Close Up

    This image now loads ‘seemingly’ locally. It’s totally a part of Google, though; there’s no reference back to how the site looks (it used to be an overlay). In fact, most people will just see the image, copy it if they want, and move on to the next site. No one has any reason to dig deeper and to visit the image’s page.

    By contrast, the thumbnail images you see on Google, if you viewed source, look like this: https://encrypted-tbn3.gstatic.com/images?q=tbn:ANd9GcQPNtMLkk8rwj3lLv6a2kEQ8_eo6BuiUZYn3N5z3cbMu6rVPo3Xkw If you go to gstatic.com though, all you get is a 404 error page, but it’s pretty easy to find out this is where Google saves all their static content. Including images. These thumbnails are moderate to low quality, and if that was all Google did, show small, iffy thumbnails and redirect people to the real site, that would be great. Instead, now they actively hotlink from you. Oh yes, that full image you saw in my screenshot was directly linked from the owner’s media file.

    The first thing I did after noticing this was to add the following to my robots.txt:

    User-agent: Googlebot-Image
    Disallow: /
    

    Those directions are right from Google, who doesn’t even pitch you any reason why you wouldn’t want to do this. Normally they’ll tell you ‘You can, we’d rather you didn’t because of XYZ, but here it is anyway.’ This time, it’s a straight up ‘Here’s how.’ I find that rather telling.

    Naturally I went on to read why Google thought this was a good idea.

    The following points are all reasons Google thinks this is better.

    We now display detailed information about the image (the metadata) right underneath the image in the search results, instead of redirecting users to a separate landing page.

    The first part about this, the detailed information, is great. Having the metadata right there, without redirecting to the separate page like they used to, with the data on the side that no one read, is an improvement. Thank you for that.

    We’re featuring some key information much more prominently next to the image: the title of the page hosting the image, the domain name it comes from, and the image size.

    Again, this is great. I think that the data should be more visible than it is, especially the ‘This image may be copyright protected’ stuff. Considering Google won’t allow you to use ads if you use copyright protected material (which they claim I do here, by the way), they really have a higher standard to live up to when it comes to informing people of the stick by which they are measured.

    The domain name is now clickable, and we also added a new button to visit the page the image is hosted on. This means that there are now four clickable targets to the source page instead of just two. In our tests, we’ve seen a net increase in the average click-through rate to the hosting website.

    I can see this being true. Again, the links should be more obvious, and really they should link not to the image directly, but the contextual page in all cases. Traffic is important, and if you send people to the image page where they don’t see the ads, you’re causing them to lose money. So the idea behind this part is really nice, and I’m for it, it just needs some kick-back improvements. Google should give people a good reason to go to the parent site. And this next item is where they fail…

    The source page will no longer load up in an iframe in the background of the image detail view. This speeds up the experience for users, reduces the load on the source website’s servers, and improves the accuracy of webmaster metrics such as pageviews. As usual, image search query data is available in Top Search Queries in Webmaster Tools.

    And now we hit the problem. While this is true (it will be both faster and use less of my bandwidth while decreasing load), it’s still showing my image off my servers! Worse? It’s got the full sized image from my server, which means if I have a 4 meg photo (and I do), they’ll be pulling all 4 megs down, and the reader can just right-click and save. They never need to touch my site.

    As Bill and Ted would say, Bogus.

    Go back to how Google shows thumbnails. They have their own, lower-rez version. I regularly post other people’s images on a site, and when I do, I purposefully keep a lower resolution version on my site, and link to them for the best version. Why? Because it’s their image. They did the work, they made it, I should honor them and respect them, and be a good net-denizen. Google’s failing on that.

    For me their search has always been a little questionable for images. Now it’s outright evil.