Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: heo

  • Penguins Just Gotta Be Me

    Penguins Just Gotta Be Me

    One penguin in a crowd singing
    Credit – Gary Larson

    Google’s Penguin came out and a lot of people got hammered, hard, by the changes. Penguin is the name of their new/updated algorithm, and it seems to have a lot to do with backlinks.

    Backlinks are when other people link to you. Pretty straightforward, but now it appears that people are being penalized for backlinks. Is this true? Yes and no. Google used to credit you for all sites that linked back to you, so the more popular you were, the more referral credit you got, the higher you were ranked, and that seems fair. Now Google’s no longer passing through backlinks from spammers, so your referrals are dropping and people are being ‘penalized.’ But not really. That’s almost like saying you’re getting fewer phone calls after all the telemarketers stopped calling. Yes, you are, but you’re no longer getting junk calls, and the ones you are getting are higher quality. The theory here is that everyone is now being judged fairly again: by stopping the bad actors from passing credit, Google has leveled the playing field. Nice theory. It still feels pretty horrible to find your rankings have dropped.

    How do you find out what happened to your rankings? Search Engine Journal has a lengthy explanation, but it boils down to looking at your Google organic traffic to see if you have noticeable drops on April 19th, 25th, and 27th. That covers both Panda and Penguin.

    But what caused it? Are the drops legit or unfair? That’s really something easily argued in multiple directions. The basic reason is that something in your site, or in your site’s backlinks, has been determined to be spam. It sure feels unfair, because how can you be expected to do anything about what other people are doing? They’re the spammers, not you, so why are you punished? Again, ’tis the great equalizer. If you remove all the bad links, what you’re left with may be a lower ranking, but it’s possibly a more honest and accurate one. I say possibly because I’m not able to see behind the Google curtain.

    Few of my sites were impacted, though that’s partly because I generally get traffic from Twitter and Google Plus, where I advertise. Once in a while, a post gets picked up by another WordPress blog, an email list like WP Mail, or Matt Mullenweg, and I get a 600% traffic spike. But most of the time I’m pretty steady, increasing slowly and naturally. In part this is because this is my hobby. Certainly I take pride in what I do, but it’s not going to make or break me. That’s lent itself to a very odd thing: I’ve managed to follow every single one of Google’s ‘do this!’ suggestions without ever thinking about it.

    What are these rules? They’re obvious and I’ve touted them many times before.

    1. Write good content.
    2. Don’t spam.
    3. Link naturally.

    The first two are easy, the last one is a bit weird.

    Natural linking is like what I did when linking to Search Engine Journal above. I saw a URL, I copied it in, and I wrote my own description. In addition, I don’t have a massive collection of links anywhere. I link to people and posts in-line, when they come up, and send you around to them the way I would if we were talking. In that way, I’m always making backlinks that are valuable for the next guy.

    But like I mentioned before, you can’t control other people’s backlinks to you. If you write WordPress themes and plugins, you may be getting hit by this, and there is something you can do. It’s just that you won’t like it. See, one of the things spammers do is use the same link, with the same URL and anchor text, over and over. What happens when you have an attribution link in your theme or plugin? It’s the same link. Over and over. At first glance, that seems horrible, because a theme would be penalized for having a credit link (like I have here) back to its author’s site. Some people seem to feel this is exactly what’s happening, and the current feeling is that marking the link as nofollow would be a solution.
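
    If you want to try that, the change is tiny. Here’s a minimal sketch of what a nofollow’d theme credit might look like; the function name and URL are placeholders, not any real theme’s code:

    <?php
    // Minimal sketch, hypothetical function name and URL: a theme footer
    // credit marked rel="nofollow" so it passes no link juice.
    function myexample_theme_credit() {
        echo '<a href="https://example.com/" rel="nofollow">Theme by Example</a>';
    }
    add_action( 'wp_footer', 'myexample_theme_credit' );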

    Sidebar: Yes, I’m aware of the debacle with WPMUDev getting hammered by Google Penguin. Of interest to me was that once they removed EduBlogs (a site they run) from having links back to them, the issue seemed to be resolved. A working theory of mine is that Google saw the hundreds of thousands of ‘self’ backlinks from these sites to the parent and felt it was gaming the system. This would explain why WordPress.com, which runs a gazillion sites, didn’t get hit, and why not all themes are getting slaughtered. Personally, I think a better move would have been for Google to just throw those results out the window, but…

    Plugins, on the other hand, run by different rules. One of the plugin guidelines is no powered-by links at all unless the user actively opts in. (Themes are permitted one in the footer, or an optional one. In part this is because you only ever have one theme at a time, but you can have multiple plugins.) Having too many links out to the same place would be a problem for your SEO, and a plugin that linked multiple times would hurt you. We already know that Google knows how to check your JavaScript for hidden links. Back in 2007/2008 they added the ability to parse onClick events, and it’s only improved since then. So while in 2008 Matt Cutts said it was safe to put a link in your JavaScript if you didn’t want it searched, that seems to no longer be the case. I’ve spot-checked a couple of sites, comparing them before and after, and studying their configurations, and many that have JS-controlled ‘powered by’ links are being hurt.
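
    If you want to follow the spirit of that guideline, the opt-in pattern is straightforward. A minimal sketch, with a hypothetical plugin name and option, that only prints the credit when the site owner has actively switched it on:

    <?php
    // Minimal sketch of an opt-in 'powered by' credit. The option name,
    // function name, and URL are hypothetical, not any real plugin's API.
    function myplugin_powered_by() {
        // Off by default; the site owner has to actively opt in.
        if ( get_option( 'myplugin_show_credit', false ) ) {
            echo '<a href="https://example.com/" rel="nofollow">Powered by MyPlugin</a>';
        }
    }
    add_action( 'wp_footer', 'myplugin_powered_by' );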

    One major takeaway here is that Google screwed some things up, big time. A day-zero search on Viagra was buck wild and all wrong. It’s fine now, but there’s no way a spammer should have been ranked first on a Viagra search. I’ve complained about how Google prioritizes before, and back in 2009 I declared that Google’s Blog Search was Irrelevant. You couldn’t get a single decent result on anything. With Penguin and Panda, they’ve decided to treat everyone the same, and if a lot of terrible people are using your products, and those products backlink to you, you’ll get dinged.

    What does all this mean? Well, go Google ‘panda google wordpress’ and you’ll see a lot of people shouting that they’re being penalized, and that the ‘nofollow’ fix is hardly a fix at all. More are shouting that those ‘share this’ plugins, which show up multiple times on one page, are causing rankings to drop because the exact same link shows up multiple times. And right now, we don’t know. Like the Viagra problem, Google is fixing a lot of this on the fly. Google says ‘No algorithm is perfect!’ and that is certainly true, but if Google really is just equalizing things, then why were these bad sites so highly ranked to begin with?

    If you’re a plugin or theme designer, I’d put nofollow on my credit links for now. First, the link juice didn’t matter anyway if the link was in JavaScript, and second, what you want is people coming to your site and downloading the perfect theme or plugin. They’re going to shop around, and that will, eventually, lead to more sales. Pushing people is a sales technique that falls flat. There are so many options for themes and plugins that a hard sell will lose you people. So will stuffing your plugin with every SEO trick you know.

    There’s no great answer, and screaming at Google (or WordPress) isn’t going to help. They’re going to do what they want. The best you can do right now is weigh your options between attribution and abuse. Are you really making things better for the users, or are you just doing this for yourself?

  • Permalink Elephants

    Permalink Elephants

    The best permalink format for your site is so simple, you’re going to wonder why you never thought of it before. It’s obvious, intuitive, and perfect. In fact, I dare say it’s genius. Want to know what it is?

    The best permalink is the one your visitors can remember.

    I told you it was obvious.

    Look, you can waste immeasurable hours and days trying to determine if /postname/ is better than /2012/postname, or you can sit down and remember you’re not making a site for search engines, but for your visitors.

    SEO does have a point, don’t get me wrong. If it’s easy for people to find your site, you get more traffic. One of my sites, following the recent Panda and Penguin updates on Google, jumped from 5th place to 3rd on a search for the major keyword. Another went from 12th to 9th (we’re working on that). None of that has to do with me changing anything, or even picking the best SEO plugin. It was done the traditional way.

    1. I wrote good copy
    2. I networked with related sites for links
    3. I advertised
    4. I was memorable

    Those four things, when done correctly, are how you get your site to rank high. And it’s that last item, being memorable, that should drive your URL choices.

    A URL like http://example.com/12897342134/jkahsdu.aspx isn’t memorable. It tells me nothing of what your site’s about, what the topics are, what the subject is.

    On the other hand, a URL like http://example.com/2011/how-to-save-elephants tells me quite a bit. I know when the post was written, so if there was a big to-do about elephants in 2011, it’s probably related. But it’s not always easy to tell someone that URL, nor is it a given I’ll remember it tomorrow. I may remember that example.com had a cool post about saving elephants, however. It’s certainly more likely I’ll remember it than the other link!

    This is where WordPress does something cool, though. See, I can tell someone to go to http://example.com/how-to-save-elephants/ and that will redirect them to the right URL! You can do this on Drupal as well with a module called Global Redirect (Drupal folks, correct me if I’m wrong/there’s a better one).
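
    For the curious, here’s a rough sketch of how you could wire that up yourself. WordPress’s built-in canonical redirect already does the equivalent out of the box, so treat this as illustration, not something you need to install:

    <?php
    // Illustrative sketch only: send /some-slug/ requests to the post's
    // full, dated permalink. WordPress core's canonical redirect already
    // behaves like this for you.
    add_action( 'template_redirect', function () {
        if ( ! is_404() ) {
            return; // The URL already resolved; nothing to do.
        }
        // Treat the request path as a post slug and look it up.
        $slug = trim( parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ), '/' );
        $post = get_page_by_path( $slug, OBJECT, 'post' );
        if ( $post ) {
            wp_redirect( get_permalink( $post ), 301 );
            exit;
        }
    } );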

    To me, that says the issue isn’t what permalink pattern you pick, but what permalink slug you use! On that train of thought, what if I made my URL http://example.com/2011/save-elephants instead? Naturally then http://example.com/save-elephants would also work.

    Now we can clearly see that the ultimate issue is not the permalink structure. The only thing I don’t like about how WordPress defaults URLs is that I have to tell people ‘it’s example dot com slash save dash elephants,’ and that’s not as easy as ‘example dot com slash elephants.’ Or even ‘saveelephants, all one word’ (I don’t know why that’s easier, but people tell me it is).

    The whole reason people like short URLs is that they’re short and easy to remember. If I told you to get to a site using http://bit.ly/elephant, you’d have a much higher likelihood of remembering it. Invariably, however, we look at branding and think “I don’t want bit.ly to have my name.” That’s a case for Yourls, and as long as you customize all your Yourls keywords, you’re in it to win it. I know most people use short URLs for Twitter and such, but I find that making a handy short URL so you can tell someone ‘go to foo dot in slash facebook’ works astonishingly well. Of course Facebook knows that too, and lets you use http://fb.com/username to find people. (I don’t have a Yourls setup here because I’m incapable of finding a short URL I like.)
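
    If you run Yourls, minting those memorable keywords can even be scripted. A minimal sketch using the YOURLS HTTP API, assuming a hypothetical install at sho.rt and a made-up signature token:

    <?php
    // Minimal sketch: ask a (hypothetical) YOURLS install to create a
    // custom-keyword short URL. The domain and signature are placeholders.
    $api_url = add_query_arg( array(
        'signature' => 'abc123',   // your API signature token (placeholder)
        'action'    => 'shorturl',
        'url'       => 'http://example.com/2011/how-to-save-elephants',
        'keyword'   => 'elephant', // the memorable bit you say out loud
        'format'    => 'json',
    ), 'http://sho.rt/yourls-api.php' );
    $response = wp_remote_get( $api_url );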

    Sadly, there is one problem with this, and it’s that you can only use each ‘slug’ once, so once you’ve used ‘elephant’ you’re never able to use it again.

    Name your slugs wisely and plan, as best you can, for the future.

  • Duplication Dilution

    Duplication Dilution

    A common enough request in the WordPress forums is people wanting two sites on a MultiSite network to have the same content. Usually I take the time to ask why they’re doing this, and see if there are better ways around it. Like if you want to have a collection of ‘all’ posts, then you want WordPress MU Sitewide Tags Pages (weirdest name ever) or Diamond MultiSite Widgets, both of which help you do that admirably. And if you want to have the same ‘about’ page on multiple sites so they can keep their per-site branding, I point out that the best companies don’t do that; they link home to mama. Amazon.com’s subsites all link back to amazon.com’s about page, after all. That’s good business practice!

    Now, before someone gets all het up with an angry comment: there are good reasons to duplicate content, but it’s never 100% in the same language. If you have a site that needs to be bilingual, chances are you’re going to have a copy in English and (let’s say) French that, if translated, would be pretty much the same. Not 100%, because idioms don’t translate word for word, but basically the same. That’s a great reason for a duplicated site. Regional stores, where most of the content is the same but some is different, is another good one. And those really aren’t ‘duplicated’ content.

    But imagine the guy who wants to have 50 sites with different domains (no problem), and they’re all the ‘same.’ Well. What? You mean I can go to example.com and example2.com and example3.com and they’re 100% the same? Same theme, same posts, same content? That seems silly, doesn’t it?

    So why do they persist in doing this? Some people cite SEO reasons (“My site ranks better for this domain…”) and others say it’s regional (“I need a separate domain for Canada!”), and they’re pretty much all wrong. Unless your domains are offering up different content, you are going to lose SEO ranking and readers by having multiple, identical domains.

    In D&D (yes, I play it), we always say ‘Don’t split the party!’ For the gamers, this is good because everyone gets to play together. The person running the game (GM or DM, depending on your flavor of game) can talk to everyone at once. Splitting the party means half the time some of your players have to leave the room and kill time while the other people get to do fun things. And then the game runner has to repeat things for the next group when you switch places! It’s annoying and boring 50% of the time, and it causes duplication of effort.

    Splitting up your visitors means you have to figure out how to push identical content to multiple places. This is not difficult, but it can cause problems. Every time you edit a post, PHP calls your database and overwrites things. Multiply that by however many places you’re overwriting, and that could slow down posting. Then you think about using something like Content Mirror, which pulls post data in from across sites. Sounds great, until you remember that the blog-switching code isn’t efficient (i.e. it slows things down), and that all the smart people tell you switch_to_blog() is rough on caching.
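
    To make the cost concrete, here’s a minimal sketch of the mirroring approach, with made-up site and post IDs. Every call like this swaps the database table prefix and works around the object cache, which is exactly why doing it in a loop gets slow:

    <?php
    // Minimal sketch, hypothetical IDs: pull one post from another site
    // on the network. switch_to_blog() swaps the table prefix, and
    // restore_current_blog() swaps it back; neither plays nicely with
    // persistent object caches, so keep calls like this rare.
    switch_to_blog( 2 );         // hypothetical source site ID
    $mirrored = get_post( 123 ); // hypothetical post ID on that site
    $title    = $mirrored ? $mirrored->post_title : '';
    restore_current_blog();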

    All that aside, there are major reasons you don’t want to duplicate content. The foremost is that Google hates it. So do you, by the way. Duplicating content is what spammers and splogs do. (See The Illustrated Guide to Duplicate Content in the Search Engines.) For those who go “Hey, but I have the same content on my archive pages for dates and categories!” you should read Ozh on the ‘Wrong SEO Plugins.’ The tl;dr takeaway is that good themes already do this for you!

    Second only to SEO: now you’ve given your users multiple places to have the same conversation. Cross-posting is never a good idea. You dilute your content by having multiple conversations about the same thing. Should I post on foobar.com or foobaz.com to talk about my foo? The more time your readers spend thinking about where to comment, the less time they’re engaging with your site. This is, by the way, one of my nagging dislikes about BuddyPress. With groups and forums and blogs, you can dilute your message. Thankfully, you can use the group page to pull in all those conversations to one place where people will see them, which helps a lot.

    I put the question to my friends: why would you want 100% content duplication on multiple sites, in the same language, just with different URLs? Here are their answers:

    http://twitter.com/jan_dembowski/status/175590010555342848

    http://twitter.com/bluelimemedia/status/175606857967214593

    What’s yours?

  • I Don’t Write For Search Engines (And Neither Should You)

    I Don’t Write For Search Engines (And Neither Should You)

    I see a lot of posts where people talk about how to make your site better for search engines, and how to write a post for a search engine. I can honestly tell you that I have never sat down with that as my goal for anything I’ve written. Just like I don’t advocate designing your site for search engines, I would never suggest you customize your content for them. The web is for humans. (At least until our robot overlords take over.)

    At the risk of being repetitive, I will reiterate that you are not making your site in order to be ranked number one in search engines. You are making your site for people to read. If you’re making a site just to be number one, you’re doing it wrong.

    No matter what your topic, no matter your product, your goal is to make something people value. So why does a search for “how to write for search engines” turn up so many hits? A large number of those hits are spam sites, which over-sell advertising and promise you hundreds of hits a day. Others, however, offer the same advice I’m telling you: don’t write for search engines.

    Yes, if you get highly ranked on search engines, you’ll attract more people, but it’s not all about getting them to your site. Once you get someone in the door, you have to keep them. If you’ve ever been to a store where you know you need a salesman and they all ignore you, then you know exactly what it’s like to go to a website that’s all SEO and no content of merit.

    The part that confounds me is that all the SEO advice is drivel anyway, as it’s stuff you’re already doing. Also, it confuses writing for SEO benefit with writing SEO-friendly content. There are tips and tricks for making your post layout friendlier to search engines while simultaneously making your posts easier for people to read.

    WCSF Shirt

    Coincidentally enough, Jane Wells (aka JaneForShort; if you don’t know who she is, you probably aren’t a WordPress fan) came up with the above comic (with permission from Randall Munroe of XKCD), and I felt it clearly and hilariously made my points for me. (True confession: I actually wrote this post in early July, but not until Jane’s comic did I finish it. Yes, I’m taking advantage of the timing.) In both sides of the argument, the panelists are ignorant of the real truth: together, with a good tool and good writing, you become king.

    Just recently Andy Stratton spoke at WordCamp Chicago (you can see a copy of his presentation, which he also used at WC Raleigh, at DIET PILLS, SEO, THEME FRAMEWORKS – There are no magic bullets) and said “If content is king: context is queen […] Content is king, Backlinks are the Emperor.” For years I’ve espoused ‘contextual links.’ I will, rarely, put up a list of links, but when I do, it’s to organize them contextually. A link on its own is meaningless, both for the user who reads it and for the site you’re linking to. If no one follows that link, it doesn’t matter how much ‘link juice’ you’re sending them, because no one’s clicking it.

    Don’t write for SEO, and don’t make links for links’ sake. Listen to what your teachers said: write clearly, write well. Link with context, and people will see the effects of your work and link back. Write for the humans. We’re the ones reading.

  • Introducing HEO

    Introducing HEO

    We all know that SEO is ‘Search Engine Optimization.’ I humbly suggest we pay better attention to HEO – Human Experience Optimization.

    After you spend hours and hours optimizing your site for search engines, you should sit back and think about the humans who are reading your site. This should be blindingly obvious to everyone, but more and more we hear about how you should make your URLs SEO-friendly, or your post excerpts/slugs/format/meta-data the best to get highly ranked in Google. At a certain point, you’re missing the goal of a website.

    A website is not for search engines, a website is for humans.

    Humans like to be able to find what they want relatively painlessly. They like to know when something was written (or when whatever it’s about took place). They like to be able to search, sort, surf and select. They like to know weird things. It’s your job to make sure that when a user hits your site, they stay.

    Fonts

    I’ve mentioned before that font choices matter on your site. Perhaps the most important thing to remember about fonts is that people have to be able to read them. A lot of sites make their fonts very small, which forces viewers to hit Ctrl-+. This is one of Jakob Nielsen’s pet peeves. Users should be able to control their font size, but you should also set your starting font size to something legible.

    Imagine my surprise when I went to a site and saw this:
    Example of a site with teeny tiny text

    I had to zoom in to read it. That font is set to font: 11px/13px "Lucida Grande". Just by changing it to 12px/20px it was easier to read, but to make it a perfect starting point, it should really be 14px/20px. You’ll need to balance your font choice with the size, though, as too-thick and too-thin fonts are equally painful to read.

    Colors

    I’m in my mid-thirties with the best worst vision you’ll find before someone gets classified legally blind (that said, I have fantastic night vision). I cannot read black backgrounds with white text for more than a few seconds without getting after-images. I’m not in the minority of the world. There’s a reason books, eReaders, newspapers and magazines tend to print dark text on light backgrounds, and it’s not just the cost. More people can read that setup. On top of that, don’t use background images. The busier the background, the more difficult it will be to read and you’ll draw the attention away from the text.

    The colors on your site need to be easy to read, and not strain the eyes.

    Layout

    Did you know that users tend to read to the left? This sort of flow makes sense when you consider that most languages are read left to right. Jakob Nielsen points out that people spend “more than twice as much time looking at the left side of the page as they did the right.” (Jakob Nielsen’s Alertbox, April 6, 2010: Horizontal Attention Leans Left) Not only that, but people actually tend to read pages in a pretty distinct F-shaped pattern. (Jakob Nielsen’s Alertbox, April 17, 2006: F-Shaped Pattern For Reading Web Content)

    So how do you best lay out your website? I tend to think people read content better if it’s on the left, so I put the body of my text on the left and the sidebars on the right. I also take into account that newspapers and magazines break text into columns for readability reasons, and set a fixed width for my site. That choice is somewhat controversial among my friends, but I like to look at the iPad and Kindle for examples of why you don’t want to allow forever-width pages. Monitors are big, browser windows can be huge, but in the human head, eyes are spaced a certain way. Making your page’s content too wide is a drain.

    Page Length

    There used to be a concept of ‘the fold’: people didn’t scroll down on webpages in the early days of the web, so if they didn’t see your important content on the top half of your page (i.e. above the fold), they weren’t going to see it at all. It’s 2011. People know to scroll down a page. (Jakob Nielsen’s Alertbox, March 22, 2010: Scrolling and Attention) But you still need to make sure your site has the most important content ‘above’ the fold.

    Where’s the fold these days, though? Monitor size is a lot more variable today than it was in 1995, and the break point on a page is getting pretty difficult to figure out. Unlike a newspaper, where the ‘fold’ is pretty obvious (unless you’re the Chicago Sun-Times), you have to take a pretty good guess at where the ‘top’ of your site ends. Oddly, this is a lot easier with the iPad, which is currently my benchmark for ‘the fold.’

    Keeping that in mind, page length matters! I try to keep each post to no more than 1,200 words, because of human attention spans. If I happen to run longer, I’ll consider breaking the post into multiples.

    Permalinks/URLS

    Samuel Wood (aka Otto) said it simply:

    Humans care about dates. Leaving a date identifier (like the year) out of the URL is actually de-optimizing the site for humans.

    Not everything should have a date, mind you. Resources like Wikipedia, or other sites that act as repositories for static, timeless material (like a book), certainly don’t need date stamps. Deciding whether your site needs to include the year in the URL (like I do here) or not at all (like I do elsewhere) is something you need to think long and hard about. If you’re making a ‘traditional’ blog, or a newspaper, or some site that acts as a repository for time-based information, the answer is simple: yes, you do.

    In addition to sorting out whether you need dates on your site, you have to think about the post format. I’m a huge proponent of pretty URLs, so I tend to lean toward custom-crafted URLs. On WordPress, I always review the permalink and, if I think it could be better shorter, I shorten it. MediaWiki defaults to whatever you name the page and puts that in as your page title (oddly, you can only override this with {{DISPLAYTITLE:Custom title}}, which has weird results in searches), but WordPress uses the ‘title’ of your post and makes that your URL slug.

    Permalink Example

    This is pretty easy to change, though. Just click Edit next to the permalink and make it shorter (which I strongly suggest you do in most cases).
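
    If you’d rather do it in code (say, while cleaning up old posts in bulk), the same edit is one function call. A minimal sketch, with a hypothetical post ID and slug:

    <?php
    // Minimal sketch, hypothetical ID and slug: programmatically shorten
    // a post's permalink slug. Equivalent to clicking Edit next to the
    // permalink in the editor.
    wp_update_post( array(
        'ID'        => 123,              // hypothetical post ID
        'post_name' => 'save-elephants', // the new, shorter slug
    ) );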

    What else?

    I could go on and on. Like how you shouldn’t use too many ads (and whatever ads you use, they shouldn’t be bigger than your post content!), don’t use flashing images or text, and keep your audience in mind! What are your hot-button topics for making your site human-friendly?