Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: seo

  • Permalink Elephants

    The best permalink format for your site is so simple, you’re going to wonder why you never thought of it before. It’s obvious, intuitive, and perfect. In fact, I dare say it’s genius. Want to know what it is?

    The best permalink is the one your visitors can remember.

    I told you it was obvious.

    Look, you can waste immeasurable hours and days trying to determine if /postname/ is better than /2012/postname, or you can sit down and remember you’re not making a site for search engines, but for your visitors.

    SEO does have a point, don’t get me wrong. If it’s easy for people to find your site, you get more traffic. One of my sites, following the recent Panda and Penguin updates on Google, jumped from 5th place to 3rd on a search for the major keyword. Another went from 12th to 9th (we’re working on that). None of that has to do with me changing anything, or even picking the best SEO plugin. It was done the traditional way.

    1. I wrote good copy
    2. I networked with related sites for links
    3. I advertised
    4. I was memorable

    Those four things, when done correctly, are how you get your site to rank high. And it’s that last item, being memorable, that should drive your URL choices.

    A URL like http://example.com/12897342134/jkahsdu.aspx isn’t memorable. It tells me nothing of what your site’s about, what the topics are, what the subject is.

    On the other hand, a URL like http://example.com/2011/how-to-save-elephants tells me quite a bit. I know when the post was written, so if there was a big to-do about elephants in 2011, it probably is related. But it’s not always easy to tell someone that URL, nor is it a given I’ll remember it tomorrow. I may remember that example.com had a cool post about saving elephants, however. It’s certainly more likely I’ll remember it than the other link!

    This is where WordPress does something cool, though. See, I can tell someone to go to http://example.com/how-to-save-elephants/ and that will redirect them to the right URL! You can do this on Drupal as well with a module called Global Redirect (Drupal folks, correct me if I’m wrong/there’s a better one).
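
    WordPress handles this natively (when a URL 404s, core tries to guess the permalink from the slug), so you don’t have to write anything. Purely as an illustration, and definitely not core’s actual implementation, a minimal sketch of the same idea as a tiny plugin might look like this; the hooks and functions are all real WordPress APIs:

    ```php
    <?php
    /*
     * Illustrative sketch only, not WordPress core's real code.
     * On a 404, find a published post whose slug matches the first
     * path segment, and 301-redirect to its canonical permalink.
     */
    add_action( 'template_redirect', function () {
        if ( ! is_404() ) {
            return;
        }

        // A request for /save-elephants/ gives us the slug 'save-elephants'.
        $path = trim( parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ), '/' );
        $slug = sanitize_title( strtok( $path, '/' ) );

        $matches = get_posts( array( 'name' => $slug, 'post_status' => 'publish' ) );

        if ( $matches ) {
            wp_redirect( get_permalink( $matches[0] ), 301 );
            exit;
        }
    } );
    ```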

    To me, that says the issue isn’t what permalink pattern you pick, but what permalink slug you use! On that train of thought, what if I made my URL http://example.com/2011/save-elephants instead? Naturally then http://example.com/save-elephants would also work.
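
    (For the curious, those patterns map to WordPress’s stock permalink structure tags. You’d normally just pick one under Settings → Permalinks, but a quick sketch using the real set_permalink_structure() API shows the two options side by side:)

    ```php
    <?php
    // The two structures being compared; %year% and %postname% are
    // WordPress's built-in permalink tags.
    global $wp_rewrite;

    $wp_rewrite->set_permalink_structure( '/%year%/%postname%/' ); // /2011/save-elephants/
    // ...or the date-free version:
    $wp_rewrite->set_permalink_structure( '/%postname%/' );        // /save-elephants/

    $wp_rewrite->flush_rules(); // rebuild rewrite rules after changing the structure
    ```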

    Now we can clearly see that the ultimate issue is not the permalink structure. The only thing I don’t like about how WordPress defaults URLs is that I have to tell people ‘it’s example dot com slash save dash elephants’ and that’s not as easy as ‘example dot com slash elephants.’ Or even ‘saveelephants, all one word’ (I don’t know why that’s easier, but people tell me it is).

    The whole reason people like short URLs is that they’re short and easier to remember. If I told you to get to a site you used http://bit.ly/elephant, you’d have a much higher likelihood of remembering. Invariably, however, we look at branding and think “I don’t want bit.ly to have my name.” That’s the case for YOURLS, and as long as you customize all your YOURLS slugs, you’re in it to win it. I know most people use short URLs for Twitter and such, but I find that making a handy short URL to tell someone ‘go to foo dot in slash facebook’ works astonishingly well. Of course Facebook knows that too, and lets you use http://fb.com/username to find people. (I don’t have a YOURLS setup here because I’m incapable of finding a short URL I like.)

    Sadly, there is one problem with this, and it’s that you can only use each ‘slug’ once, so once you’ve used ‘elephant’ you’re never able to use it again.

    Name your slugs wisely and plan, as best you can, for the future.

  • Duplication Dilution

    A common enough request in the WordPress forums is people wanting two sites on a MultiSite network to have the same content. Usually I take the time to ask why they’re doing this, and see if there are better ways around it. If you want a collection of ‘all’ posts, then you want WordPress MU Sitewide Tags Pages (weirdest name ever) or Diamond MultiSite Widgets, both of which help you do that admirably. And if you want to have the same ‘about’ page on multiple sites so they can keep their per-site branding, I point out that the best companies don’t do that; they link home to mama. Amazon.com’s subsites all link back to amazon.com’s about page, after all. That’s good business practice!

    Now, before someone gets all het up with an angry comment, there are good reasons to duplicate content, but it’s never 100% in the same language. If you have a site that needs to be bilingual, chances are you’re going to have a copy in English and (let’s say) French that, if translated, would be pretty much the same. Not 100%, since idioms rarely translate directly, but basically the same. That’s a great reason for a duplicated site. Regional stores, where most of the content is the same but some of it differs, are another good one. And those really aren’t ‘duplicated’ content.

    But imagine the guy who wants to have 50 sites with different domains (no problem), and they’re all the ‘same.’ Well. What? You mean I can go to example.com and example2.com and example3.com and they’re 100% the same? Same theme, same posts, same content? That seems silly, doesn’t it?

    So why do they persist in doing this? Some people cite SEO reasons (“My site ranks better for this domain…”) and others say it’s regional (“I need a separate domain for Canada!”) and they’re pretty much all wrong. Unless your domains are offering up different content, you are going to lose SEO ranking and readers by having multiple, identical, domains.

    In D&D (yes, I play it), we always say ‘Don’t split the party!’ For the gamers, this is good because everyone gets to play with everyone together. For the person running the game (GM or DM, depending on your flavor of game), they can talk to everyone at once. Splitting the party means half the time, some of your players have to leave the room and kill time while the other people get to do fun things. And then the game runner has to repeat things for the next group when you switch places! It’s annoying and boring 50% of the time, and it causes duplication of effort.

    Splitting up your visitors means you have to figure out how to push content that is identical. This is not difficult, but it can cause problems. Every time you edit a post, the PHP calls your database and overwrites things. Multiply that by however many places you’re overwriting, and that could slow down posting. But then you think about using something like Content Mirror, which pulls post data in from across sites. Sounds great, until you remember that the blog switching code isn’t efficient (i.e. slows things down), and that all the smart people tell you switch_to_blog() is rough on caching.
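
    To make that concrete, here’s a rough sketch (assuming today’s multisite APIs, and not Content Mirror’s actual code) of what a cross-site content pull looks like. Every mirrored site means another switch_to_blog() round trip:

    ```php
    <?php
    /*
     * Rough sketch of mirroring content across a multisite network.
     * Each switch_to_blog() swaps $wpdb's table prefix and parts of
     * the object cache, which is exactly the overhead people warn about.
     */
    $mirrored = array();

    foreach ( get_sites( array( 'number' => 50 ) ) as $site ) {
        switch_to_blog( $site->blog_id ); // swap database context to this site

        $mirrored[ $site->blog_id ] = get_posts( array(
            'numberposts' => 5,
            'post_status' => 'publish',
        ) );

        restore_current_blog(); // and swap everything back again
    }
    ```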

    All that aside, there are major reasons you don’t want to duplicate content. The foremost is that Google hates it. So do you, by the way. Duplicating content is what spammers and splogs do. (The Illustrated Guide to Duplicate Content in the Search Engines.) For those who go “Hey, but I have the same content on my archive pages for dates and categories!” you should read Ozh on the ‘Wrong SEO Plugins.’ The tl;dr takeaway is that good themes already do this for you!

    Second only to the SEO problem: now you’ve given your users multiple places to have the same conversation. Cross-posting is never a good idea. You dilute your content by having multiple conversations about the same thing. Should I post on foobar.com or foobaz.com to talk about my foo? The more time your readers spend thinking about where to comment, the less time they’re engaging with your site. This is, by the way, one of my nagging dislikes about BuddyPress. With groups and forums and blogs, you can dilute your message. Thankfully, you can use the group page to pull in all those conversations to one place where people will see them, which helps a lot.

    I put the question to my friends. Why would you want 100% content duplication on multiple sites, in the same language, just with different URLs? Here are their answers:

    http://twitter.com/jan_dembowski/status/175590010555342848

    http://twitter.com/bluelimemedia/status/175606857967214593

    What’s yours?

  • Pretty URLs Matter

    Just over a year ago, Lifehacker changed their site and we all learned about the hashbang. I believed that pretty URLs mattered:

    I would posit that, since the web is based on look and feel, the design of your site still relies, in part, on the ease of someone in understanding the URL.

    Well, Dan Webb agrees with me (It’s About The Hashbangs):

    After quite a lot of thought and some attention to some of the issues that surround web apps that use hashbang URLs I’ve come to the conclusion that it most definitely is about the hashbangs. This technique, on its own, is destructive to the web. The implementation is inappropriate, even as a temporary measure or as a downgrade experience.

    This Dan guy happens to be in charge of switching Twitter from the #! back into nice, normal, readable URLs.

    You can read the whole Twitter convo on Storify. But the take-away is, I admit, a little dance from me of ‘I told you so.’

    In the comments of my previous post, someone pointed out that Google uses Hashbangs in a search. I have not seen them in my searches (here’s an example of what I do see on page 3 of a search for ‘sadasdas’):

    https://www.google.com/search?sourceid=chrome&ie=UTF-8&
    q=sadasdas#q=sadasdas&hl=en&prmd=imvns&ei=8CthT86EEoWDgwerpqDwDA&
    start=20&sa=N&bav=on.2,or.r_gc.r_pw.r_cp.r_qf.,cf.osb&fp=6b8f6c2ef22f8dde&
    biw=1324&bih=879

    The thing is, Google doesn’t matter in this instance. Actually, none of Google’s URLs ever really matter when it comes to the reasons we want Pretty URLs. Remember our credos?

    URLs are important. Yes, they are, but they’re important when you’re looking for something. They should be important on, say, YouTube, and they certainly are important on Google Code pages. But for a normal Google search? You don’t type in google.com/ipstenu or even google.com/search/ipstenu to find pages about ipstenu (though, really, it would be really awesome if you could! At the very least, the 404 should have a ‘Do you want to search for that?’ option. Get on it, Google Monkeys!) Which is the point I was trying to make. When you go to a site for specific content, you want people to say ‘Go to ipstenu.org/about’ and be done with it.

    URLs are forever. Again, yes they are, except when they’re not. For Google? This doesn’t matter. They don’t search themselves, and rarely do I know people who bookmark a Google search. But… Did you know Google’s search URL formats are backwards compatible? That’s right. If you had an old URL saved, it would still be usable.

    Cool URLs don’t change. Except for search URLs. But Google knows if you’ve gotta change ’em, make ’em backwards compatible. Ben Cherry has a nice article on why it’s good, too, which I found enlightening. I still don’t like them on the front end of websites, mind you, for a reason Ben points out:

    The hashbang also brings a tangible benefit. Someday, we will hopefully be rid of the thing, when HTML5 History (or something like it) is standard in all browsers. The Web application is not going anywhere, and the hash is a fact of life.

    That’s the problem right there. We’re building something we know is going to go away! We just broke a credo!

    Are there ever reasons for using Hashbangs? Sure, but most of us will never need them. They’re great for loading content via AJAX, which is awesome because you can load new parts of your site faster. Heck, WordPress has some AJAXification, but it’s primarily on the backend. They’re fantastic for web apps and they have some attractive functionality. But they don’t work for everyone, and they never will.

    The ‘replacement’ is HTML5, which introduces pushState. You use a JavaScript library with HTML5 history.pushState and, well, basically you write an app with that instead of AJAX and the Hashbang. Your URLs stay pretty, and your page can still have that nice ‘click to add the new tweets’ functionality that we’ve all seen on the new-new-new-Twitter. And Facebook.

    See, pushState is giving us these three things (among others):

    • Better-looking URLs: a ‘real’, bookmarkable, ‘server-reachable’ URL.
    • Graceful degradation: render correct pages when you don’t have JS enabled.
    • Faster loading: by only loading parts of the page, slow-net users can get the content faster.

    The downside? It doesn’t work on IE right now. That’s okay, though: since it’s got graceful degradation, IE users will just have to manually refresh things. Which sucks for IE users, but perhaps it will be the fire under Microsoft’s feet to get on that one.

    Keep an eye out in your webapps. They’ll be moving over to this sooner than you think.

  • SEO Doesn’t Auto Post Anymore

    I don’t auto post to Twitter, Tumblr, Facebook, Google Plus or LiveJournal. I stopped about a year ago, and since then, I’ve stopped crossposting everywhere except the places I actually frequent. That’s not to say I don’t skim Tumblfeeds (the spam monsters that they are) or check in on my LJ communities. It means that I no longer have any code that automatically posts to those places when I make a new blog post. Any time you see a link to my posts on another site, made by me, that means I took the time to log in and fill in the data by hand.

    Why? Well, I’ll jump around chronologically and tell you that a pair of articles hit my feed recently. First, one about how “3rd party APIs […] are punished in Facebook’s EdgeRank algorithm” (Source: EdgeRank Checker — Does Using a 3rd Party API Decrease Your Engagement Per Post?) and the second, which linked back, said that pages that auto publish lose 70% of ‘likes’ and comments. (Source: Inside Facebook — Study: Auto-Posting to Facebook Decreases Likes and Comments by 70%)

    Both of those back up what I’ve always said about SEO and HEO. If you want people to come to your site, you have to engage with them. That means you need to interact and converse, not spam. Find out what they like and how they like it. They hammer home a point that us old hats know, but many young bucks ignore: you have to be in touch with your readers, and no automated system in the world can do that for you.

    Now, certainly, I use tools like Google Analytics’ Campaign feature, and Crowdrise, to help me determine what posts of mine are popular, when and where, and attempt to comprehend the why of it all. It’s a very fuzzy science. I know I hit paydirt when my @-replies on Twitter are coming so fast I can’t keep up, and my comment-feed is burning a hole in my screen. But until we perfect an AI that knows, before I do, what I want, we’ll never have one that can predict accurately what we need to do to make our sites popular. And we all know that popularity is the end game.
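
    (Campaign tagging, for what it’s worth, is nothing magical; it’s just query strings on the links you share. A quick sketch using WordPress’s add_query_arg() helper, where the utm_* parameter names are Google Analytics’ own and the values here are made up for illustration:)

    ```php
    <?php
    // Build a campaign-tagged link for sharing; the utm_* parameters
    // are what Google Analytics' Campaign reports key off of.
    $url = add_query_arg( array(
        'utm_source'   => 'twitter',
        'utm_medium'   => 'social',
        'utm_campaign' => 'elephants',
    ), 'http://example.com/2011/save-elephants' );

    // http://example.com/2011/save-elephants?utm_source=twitter&utm_medium=social&utm_campaign=elephants
    ```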

    Popularity has a strange converse, though. For example, you may think that auto-tweeting your blog posts is a great idea to get the content out there and read. This is true, but I found that the more I auto-tweeted, the more splogs came to my site! That’s right, I was attracting more content scrapers and spamming tweet bots, which in turn decreased any SEO benefit I might have acquired. Sucks, doesn’t it? Thankfully, manually crafting a quick tweet, and taking the time to phrase it right, got me more traction than anything else.

    The other massive downside to auto-posting on social networks is that you rarely get to make the post look the way you want to. I want to pick which image I’ve attached to be the thumbnail, and I want to make sure my custom excerpt (which I always write) is picked up, and I want to maybe put in an extra explanation on G+, but not on Tumblr or Facebook, and … you know, I want people to know I’m thinking about them.

    There is a huge desire to share everything with everyone. To tell your friends in one social network the same things you tell them in others. And for self-promotion, this is big as well. But as the media is learning, a blanket advertisement like you see on TV doesn’t work so well anymore. How many commercials can you remember well? I can remember the Old Spice guy and the Most Interesting Man in the World (also a couple weird phone commercials), but we don’t always remember the products, nor do we actually always buy the product being advertised. I don’t use Old Spice or Dos Equis. Still, blanket ads are hard to land, since you don’t know who’s going to read your site. Similarly, blanket ‘Hey look at me!’ is hard to make efficient, because you’re not reaching out to your audience and making them a part of your process. You’re shouting at them, not talking with them.

    What about those of us who aren’t advertising? If you’re crosslinking just to share with friends, an automated system seems fine, except these are your friends, and don’t you want to be personal with your friends? Don’t you think they deserve the time and effort of a real ‘Hey, this is what I’m up to!’ instead of a blanket letter? Wouldn’t it be nice to say “Bob, I thought of you when I wrote this because of that conversation we had about …” Or if you send it to a group of your friends and Bob’s, then Bob feels great because you’ve brought him into the conversation and your friends know you think about them.

    The funny thing about this is I also stopped scheduling posts. I used to set posts up to run once a week, minimum, even if I was going to be offline. Now, because I want to interact with people, I post them only when I know I’ll be around.

    The next time you see a social media post of mine, linking back to a blog post, know that I took the time and effort to link it. If it’s styled pretty, I did that on purpose. I try to make it personal and not just slap a link up, and I think that effort shows, and comes back to me in pageviews, comments and likes.

  • All Subs Are Equal

    For years we’ve told people “The only difference between subdomains and subfolders is that search engines see subfolders as being all a part of the same site, and subdomains as separate sites.”

    That may not be true for very much longer.

    As August ended, Google dropped this little bomb casually:

    Most people think of example.com and www.example.com as the same site these days, so we’re changing it such that now, if you add either example.com or www.example.com as a site, links from both the www and non-www versions of the domain will be categorized as internal links. We’ve also extended this idea to include other subdomains, since many people who own a domain also own its subdomains—so links from cats.example.com or pets.example.com will also be categorized as internal links for www.example.com.

    via Official Google Webmaster Central Blog: Reorganizing internal vs. external backlinks.

    This is, I think, a good thing. They make sure you understand that ipstenu.wordpress.com won’t be the same as jane.wordpress.com, which makes sense since their own Blogger platform runs off subdomains as well. Somewhere in their logic they know ‘Aha! Ipstenu doesn’t own wordpress.com! But she does own Ipstenu.org.’

    To reiterate, this only affects things in Google’s Webmaster tools:

    This update only changes how links are displayed in Webmaster Tools. It doesn’t affect how links are valued in relation to the search algorithm or ranking. It has nothing to do with Panda, nothing to do with keywords, nothing to do with PageRank.

    I fully expect that it’s going to change and expand. Since most of us end up registering our ‘master’ domain (i.e. ipstenu.org) with Google Webmaster, they can easily leverage the data they already have in order to tweak search engine results. Another tactic would be to start using new meta-type tags. After all, the big guys already organized schema.org (which isn’t as widely adopted as it might be yet).

    Sidebar: Schema.org’s problem is how complicated it is to fold into CMS tools. If they’d started by releasing this with a couple of plugins/extensions for the popular open source tools like Drupal, Joomla, MediaWiki, MovableType and WordPress (which, as they’re open source, they could have), we’d be way ahead of the game. As it stands, the one WordPress plugin I’ve seen requires you to import SQL tables! If they get schema ironed out so it’s simple and easy to use, then we can add ‘parent’ or ‘main’ site parameters to our code and every search engine can know ‘halfelf.org is internal to ipstenu.org, but taffys.ipstenu.org is not.’ And wouldn’t that be cool!
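
    Just to show what ‘simple’ could look like, here’s a hypothetical sketch (no such plugin is implied) of wrapping single posts in schema.org/Article microdata with one WordPress filter:

    ```php
    <?php
    /*
     * Hypothetical sketch: schema.org/Article microdata via a filter.
     * The itemprop names come from schema.org; the filter and template
     * functions are standard WordPress.
     */
    add_filter( 'the_content', function ( $content ) {
        if ( ! is_singular( 'post' ) ) {
            return $content;
        }

        return sprintf(
            '<div itemscope itemtype="http://schema.org/Article">'
            . '<meta itemprop="headline" content="%s">'
            . '<div itemprop="articleBody">%s</div>'
            . '</div>',
            esc_attr( get_the_title() ),
            $content
        );
    } );
    ```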

    Personally, I have ipstenu.org, halfelf.org and so on all listed separately in Google Webmaster right now. I could do that with subfolders as well, to track each subdomain or subfolder’s data internally to itself, and still track them at the master level with the main domain.

    So no effect on your SEO or ranking or any of that stuff, but a nice change for Webmaster Tools.

    Where do you see this going?

  • I Don’t Write For Search Engines (And Neither Should You)

    I see a lot of posts where people talk about how to make your site better for search engines, and how to write a post for a search engine. I can honestly tell you that I have never sat down with that as my goal for anything I’ve written. Just like I don’t advocate designing your site for search engines, I would never suggest you customize your content for them. The web is for humans. (At least until our robot overlords take over.)

    At the risk of being repetitive, I will reiterate that you are not making your site in order to be ranked number one in search engines. You are making your site for people to read. If you’re making a site just to be number one, you’re doing it wrong.

    No matter what your topic, no matter your product, your goal is to make it something people value. So why is it that a search for “how to write for search engines” has so many hits? A large number of those hits are spam sites, which oversell advertising and promise you hundreds of hits a day. Others, however, offer the same advice I’m telling you: don’t write for search engines.

    Yes, if you get highly ranked on search engines, you’ll attract more people, but it’s not all about getting them to your site. Once you get someone in the door, you have to keep them. If you’ve ever been to a store where you know you need a salesman and they all ignore you, then you know exactly what it’s like to go to a website that’s all SEO and no content of merit.

    The part that confounds me is that all the SEO advice is drivel anyway, as it’s stuff you’re already doing. Also, they confuse the idea of writing for SEO benefit with writing SEO-friendly content. There are tips and tricks to make your post layout friendlier to search engines while simultaneously making it easier for people to read.

    [Comic: the WCSF shirt]

    Coincidentally enough, Jane Wells (aka JaneForShort; if you don’t know who she is, you probably aren’t a WordPress fan) came up with the above comic (with permission from Randall Munroe of XKCD) and I felt it clearly and hilariously made my points for me. (True confession: I actually wrote this post in early July, but not until Jane’s comic did I finish it. Yes, I’m taking advantage of the timing.) On both sides of the argument, the panelists are ignorant of the real truth: together, with a good tool and good writing, you become king.

    Just recently Andy Stratton spoke at WordCamp Chicago (you can see a copy of his presentation, which he also used at WC Raleigh, at DIET PILLS, SEO, THEME FRAMEWORKS – There are no magic bullets) and said “If content is king: context is queen […] Content is king, Backlinks are the Emperor.” For years I’ve espoused ‘contextual links.’ I will, rarely, put up a list of links, but when I do, it’s to organize them contextually. A link on its own is meaningless for the user who reads it and the site you’re linking to. If no one follows that link, it doesn’t matter how much ‘link juice’ you’re sending them, because no one’s clicking it.

    Don’t write for SEO, and don’t make links for links’ sake. Listen to what your teachers said: write clearly, write well. Link with context, and people will see the effects of your work and link back. Write for the humans. We’re the ones reading.