Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: seo

  • SEO Advice I Ignore

    SEO Advice I Ignore

    I watch a lot of WordCamp presentations, and I pick up on a lot of ‘advice’ people give. Some of it is, I feel, useless. Today I want to tackle all the SEO advice I’ve seen and read lately that doesn’t matter as much as it might, or at least, not enough to make me change.

    Fisherman and Pelican ignoring each other on a beach

    Before that, though, I want to stress the one part of SEO that will matter, now and forever, no matter what Google does, and that is to have Good Content. Second to that is to have a good network of people who link to you, share your posts, and retweet them. Human interaction is the best measure of your SEO. If people are sticking around, you’re going to be okay.

    Don’t Use Dates in Your URLs

My SEO hero, Yoast, isn’t a fan of dates in URLs. Here’s what he says:

    “Putting the date in the URL has very few benefits, if any. I’m not a fan because it “dates” your older results, possibly getting a lower click through over time.”

So I use them here, and on my personal blog, as example.com/%year%/%postname%/ for a couple of reasons, but it boils down to the fact that dates matter for the content I provide. The first thing I look at when I see a post about a technical subject is when it was written. Then I read the post to see if it mentions a specific version of the product. If there’s a notice at the top like “Read the updated version…” then I’ll open both and read the older and the newer.

    The point is, as Jen Mylo might say, technology changes. And because of that, it should be obvious when a post took place. So I firmly think dates matter for the humans. And since they don’t matter for SEO, use what makes you feel good. Of course… shorter URLs are better. I’d suggest more people use categories if they could manage to only post in one category at a time. Still, go to https://halfelf.org/recovering-your-cape/ and you’ll be redirected because WordPress is really good.
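For anyone who wants the same structure, it’s just Settings → Permalinks in the dashboard, or a couple of lines of WP-CLI, assuming you have WP-CLI installed on your server:

```shell
# Set the date-based permalink structure used on this site.
wp rewrite structure '/%year%/%postname%/'
# Regenerate the rewrite rules so the new URLs resolve.
wp rewrite flush
```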

    Use Related Posts

I can see why people think this is important. Establishing cross-links between your old posts can pull new readers in, but I find this matters more for getting search engines to scan your older content. I cross-link between posts manually all the time, not to get better SEO, but to help my readers see where I came from before, so I don’t need to have links automagically made for me. This comes back to good content. My good content relates posts in the best way possible, and makes sure they’re the best links to relate.

    Always Use Images

Images are pretty, I agree, but I think it’s more important to use images that matter. You don’t have to use images all the time. Certainly people like them, and they break up the monotony of a post, but for SEO you should worry more about your alt/description fields than about having an image at all. Also remember to compress your images, please. It’ll make your site run faster, which will help your SERP. When in doubt about an image, leave it out. Or use something silly.

Suomi: Jymyjussien pelaajat vaihtokopilla (Finnish: Jymyjussit players at the dugout)
Source: WikiCommons

    Never Change Your URLs

This is good advice, but as long as you have good redirects pointing your old links to the new ones, you won’t lose your Google Juice. So yes, you can change your URLs. I do it all the time, and I’m proud to say that links from fifteen years ago still work. They redirect to the new URLs, of course, but they work because of those redirects, and my Google Juice is amazeballs.
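If you’re changing URLs by hand rather than letting WordPress handle it, a one-line 301 in .htaccess is all a redirect needs. The old dated path below is a made-up example; only the destination is real:

```apache
# Permanently redirect an old dated permalink to its new home.
Redirect 301 /2009/05/recovering-your-cape/ https://halfelf.org/recovering-your-cape/
```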

    Keep Posts under 600 (or 750) Words

I laughed a lot here. While people like Otto marvel at my verbosity, and other people tell me “Your posts are too long!”, when I cut a post back from 2,000 words to 1,000 it’s not because of SEO, it’s readability. If a 4,000-word post is the most popular on your site and CNN links to it, I’m pretty sure your SEO won’t get hurt. Your post should be as long as it needs to be to clearly and accurately communicate what you’re trying to communicate. That’s your rule. Stick with it. Now, if your human readers tell you that you’re too verbose, that’s something else.

    Use Subdomains

    I’m going to quote Matt Cutts (aka Mr. Google) here, on the subject of sub-domain vs sub-folder:

    “They’re roughly equivalent. I would basically go with whichever one is easier for you in terms of configuration, your CMSs, [and] all that sort of stuff.”

    Can we move on now, please?

    Use Menus

500 Menu Items? Ain't nobody got time fo that!

Okay, yes, you should use menus, but I’ve seen people stuff every possible link into a menu, and then be upset no one sees the menu item that’s four tiers down. I barely use menu tiers, or if I do, I limit them to one, and only one, sublevel. So the SEO advice of jamming everything into a menu is just useless, and given how people are using it like keyword stuffing, I bet Google’s next release (Penguin, Panda… Pterodactyl?) will check whether the CSS or HTML5 code indicates a menu and, if so, ignore it. I know I would.

    Anything Else?

    What do you think is just plain ol’ outdated or wrong SEO advice?

  • Is SEO Best Handled by a Plugin or Theme?

    Is SEO Best Handled by a Plugin or Theme?

    I’m not an SEO expert, but I know a heck of a lot more than many people who claim they are. For the record, I’ve been messing with SEO since it was ‘correct’ to put hidden text in the source code of your site. I used to spend time getting sites to rank well on Lycos and Altavista, back when I was but a wee intern for my friends. It’s fair to say I’ve been around the block with SEO.

    I don’t consider myself an expert because of skill, though in the last couple years, I’ve decided not to keep up as closely with things like schema, mostly because I don’t have to. I still retain a solid grounding in what does and does not make for good SEO (content!), and I understand that part of good SEO isn’t just content, it’s how the content is displayed for the reader, but also how the information is sorted for the computers at search engine companies.

Credit: Plymouth UK
    About every couple months, someone asks me if I prefer using a theme or a plugin to manage my SEO, and I have been giving the same answer for a couple years now. I don’t use either.

    This does not mean that the themes I use aren’t ‘SEO’ optimized, of course. It means that I don’t use their ‘extra’ features. I use, primarily, StudioPress’ Genesis Framework right now, and that comes with an SEO settings page which I never use. Ever. In fact, I turn it off in any child theme I make. This is not because I don’t think that it’s useful, but that what I do ‘use’ for SEO is already included.

    My SEO consists of making my content fantastic, using a theme that includes schema headers (or adding them myself if not), and following the guidelines Yoast outlines in his article WordPress SEO Tutorial. I don’t do everything he says (he likes ‘category/postname’ for permalinks, I like ‘year/postname’ but if date doesn’t really matter, I use category instead), but I do read and think about what it means.

    That’s the crux isn’t it? I don’t blindly follow advice, or use a plugin or theme because people say I have to. I read, I think, and I come to logical conclusions, and I apply them after I write my post.

    For example, Yoast says not to use ‘stopwords’ in titles and make them SEO friendly. I take this to mean your human readable title should be gripping, but the title slug should be short, to the point, and descriptive. So I customize every single title. I come up with four or five before I post, and then when I have one with a good grab, I tweak the title slug to be as short as possible, while still being descriptive. Sometimes I’m better at this than others, but I keep working it.

Next I customize my ‘publicize’ lede. This has to be good and it has to be short. I know I’m using my helf.us YOURLS shortener, so the URL itself will be tiny, but that doesn’t mean I should use just my title for Twitter. I customize it, trying to make it a little more witty and pithy, to reflect me and my readers. Finally I customize my excerpt. Oh yes, my excerpts are all custom written, and they are intended to grab you hard. Like Yoast, I feel the only well written description is a hand written one, and I do it. For everything.

    This puts me at a funny disadvantage. Most plugins and themes I’ve seen tend to want you to make a custom meta description. There are plugins (like the one I do use, listed further down in this post) that allow you to use your excerpt as descriptions, but I’ve never quite understood why themes make this so hard. In Genesis, I have a field for “Custom Post/Page Meta Description” in every post, which if I use it, will change the meta value for description.

    When I dug into the code, I saw that it was pulling this:

    genesis_get_custom_field( '_genesis_description' );
    

    Clearly all I need to do is make that default to what I want. And when I figure that out, I’ll let you know. Right now, all I could do was remove Genesis’ function and replace it with my own. Not elegant at all.
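If you’d rather not replace Genesis’ function wholesale, one less invasive sketch (untested, and the function name helf_default_description is a hypothetical one of mine) is to hook WordPress core’s get_post_metadata filter in the child theme’s functions.php, falling back to the hand-written excerpt whenever _genesis_description is empty:

```php
add_filter( 'get_post_metadata', 'helf_default_description', 10, 4 );
function helf_default_description( $value, $object_id, $meta_key, $single ) {
	if ( '_genesis_description' !== $meta_key ) {
		return $value;
	}
	// Unhook ourselves so the get_post_meta() call below doesn't recurse.
	remove_filter( 'get_post_metadata', 'helf_default_description', 10 );
	$stored = get_post_meta( $object_id, '_genesis_description', true );
	add_filter( 'get_post_metadata', 'helf_default_description', 10, 4 );

	if ( '' !== $stored ) {
		return $value; // A hand-entered description always wins.
	}
	// Default to the custom excerpt.
	$excerpt = get_post_field( 'post_excerpt', $object_id );
	return $single ? $excerpt : array( $excerpt );
}
```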

Now, all that said, there are times when I have to ‘improve’ upon the SEO I’ve been given, because someone else is handling the content with far less care than I give. When that happens, I grab Yoast’s WordPress SEO plugin. But for the most part, I don’t do anything on a regular basis that involves having to ‘customize’ my SEO, so it’s infinitely portable to any theme I want.

  • Collecting Conflicting Stats

    Collecting Conflicting Stats

While, like many people, I use Google Analytics, I don’t really trust its parsing. I do use mod_pagespeed, which lets me auto-embed my GA code in every page without plugins or extra work on my part, which is great, but the results are questionable and often wildly disparate and conflicting.
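For the curious, that auto-embedding is mod_pagespeed’s insert_ga filter; the two Apache directives below are all it takes (the UA- ID is a placeholder for your own Analytics property):

```apache
# Have mod_pagespeed inject the Google Analytics snippet into every page.
ModPagespeedEnableFilters insert_ga
ModPagespeedAnalyticsID UA-XXXXXX-Y
```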

    Let me demonstrate:

                  Google    AWStats    Webalizer
    Page views     2,607     10,354        8,502
    Hits               —     49,830       59,542
    Visits           888      1,274        2,255

    First of all, I can’t find ‘hits’ anywhere on Google. Their layout is different and changes regularly. Secondly, and I’m sure this jumps out at you, according to AWStats and Webalizer, I’m getting 4 to 5 times the pageviews compared to Google. I previously configured AWStats and Webalizer to exclude wp-admin and other ‘back end’ pages by editing the configuration files. I did the same in my .htaccess for PageSpeed, so I know no one is tracking admin pages.
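For reference, the exclusions look something like this in each tool’s config file (the regexes are illustrative; check them against your own setup):

```
# AWStats (awstats.example.com.conf): skip the back end entirely.
SkipFiles="REGEX[^\/wp-admin] REGEX[^\/wp-login]"

# Webalizer (webalizer.conf): same idea, different syntax.
IgnoreURL /wp-admin*
IgnoreURL /wp-login*
```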

I already know that AWStats errs on the side of users: if it can’t tell something is a bot, it assumes it’s a user. I also know it tends to overcount, since it bases its counts on traffic in a way that is a little generous (a 60 minute window for a visit). Not a huge deal, but enough to say that yes, the 10k pageviews are probably closer to Webalizer’s 8 or 9 thousand. Speaking of Webalizer, it uses a 30 minute window, so it skews high too. Fine, let’s be harsh and halve them both.

    That gives me 4000-ish pageviews. Google gave it 2600-ish.

Interestingly, Google uses a 30 minute visit window too, but it also relies on cookies and JavaScript, which, while fairly safe, doesn’t run in everyone’s browser. As an amusing side-bar, when I switched from using a plugin or manually injecting Google Analytics into my sites and started using mod_pagespeed’s insertion, my results went up. Noticeably. In part this is because my site is getting higher traffic than normal, but when I compared it to WordPress Stats, it was a bigger than expected jump.(I’m not using WordPress’s Stats à la Jetpack in this experiment because it only counts WordPress pages, and the site I’m using is not just WP. However, on a pure WP site, WP’s stats tend to skew higher than GA.)

    Which one is right? Most people will say Google is ‘closer to the truth’ but I don’t know how much I can rely on that. Certainly it’s more true for how many actual people are visiting my site, and when I’m judging metrics for marketing, I’m a little more inclined to use Google. That said, if I’m trying to understand why my page speed is slow, or where I’m getting hammered with traffic, AWStats and Webalizer are far more accurate, since they’re counting everything.

    Data that can, and cannot, be measured
    From “Manga Guide to Statistics,” Shin Takahashi, 2008
Right now, I’m keeping Google Analytics on my sites. I don’t really need the measurements for marketing (that would involve doing marketing), but the social engagement stats it provides make it helpful. For example, of all the social media sites, Facebook and Twitter are tied for traffic, and Google Plus only scores high on my tech blog. I think that if Google let us auto-publish to Google+, those stats would change, but for now, it’s all manual.

This is not to say that I think auto-posting is great for social engagement, but I find I actually pay attention more to the social aspect of the media if I don’t have to remember to post all over the place. This is a massive shift since October 2011, when I stopped auto-posting for SEO reasons. Why did I change my stance? Well, it became easier to autopost and keep that personal touch with Jetpack’s Publicize feature. Now I can easily insert a custom message, and I know it’s going to (mostly) use my excerpt.(For some reason Tumblr is a moron about this) That saves me effort and allows me to spend more time actually interacting!

    Auto-generating my stats with little effort, and being able to easily read them without needing a degree in SEO (no they don’t exist) is also hugely important. Google Analytics is easy to read, but curiously I find it overly complicated to understand. The different pages and layouts make it surprisingly hard to find ‘What were my stats for yesterday?’ Sometimes I have a boom in traffic on one day (like the day I had a 600% increase) and I want to see what went on and why. Where was this traffic coming from? WordPress’s stats do this amazingly well, just as an example.

No one tool provides all the data I need to measure every aspect of my site, nor does any one tool collect all the data. Google tells me more about browser size, screen resolution, and everything it can grab about the user, while AWStats and Webalizer give me more information about traffic by showing me everything, bots and humans. Basically, server tools are great for collecting server stats, and webpage tools are great for user stats. But you need both.

    So in the end, I have at least four different statistic programs I check on, regularly, to try and understand my traffic and measure success.

  • Social Throttling

    Social Throttling

    Lately we’ve all seen the ads on Facebook ‘Promote this post and have it seen by a wider audience!’ And many of us pish-posh it, because the people who liked us will see our posts, and who needs it. Right? Wrong. Not even 25% of the people who like you, or your page, will ever see your posts if you don’t promote them. What they call ‘organic’ reach is highly limited, especially with their new timelines. The way in which they filter the new timeline is going to make this even harder.

A lot of people I know had no idea that Facebook limits who can see your posts. When you combine this with Google’s encrypting of search terms, the easy valuation of your SEO is swiftly becoming overly complicated. A lot of A/B testing relies on this information, and now that it’s being taken away, we’re back to the age-old metrics of grabbing people off the street and asking them which sock is whiter.(Of course, with on-line prompts to fill out those Q&As, we’ve already hurt ourselves. We’ve all trained ourselves to ignore those ad-like things that get between us and content. But that’s another post.)

Facebook Ad Example
Twitter has had ‘promoted’ tweets for a while, which is probably why they get so tetchy about the other Twitter apps. The concept is that you pay and more people see your tweets. Not a bad model, really, though most of us just roll our eyes and ignore them. Still, all your tweets are shared equally with the people who follow you.

    Not so, Facebook. For a long time, Facebook did something kind of similar. Your sidebar has “sponsored” posts, which are just plain ads. If you have a Page, you can pay to sponsor your posts, similar to Twitter, and push your brand. But here’s where Facebook’s a shit-bird: Edgerank.

The concept is like PageRank from Google. The more popular, and active, your page and posts are, the more they’re worth. But that doesn’t make sense for people, since my brother’s Edgerank may be low, but I still want to see all his posts. Supposedly Edgerank doesn’t affect this, but I’m not so sure, given how many ‘important’ Facebook messages I miss. This probably stands out more to me since I don’t visit the site with any regularity. If I have a blog post (like this one) that I push to Facebook via WordPress, I may go back to see if people commented there. That’s really the only time I notice what’s in my timeline, and while I do quick-scan it, it’s filled with cruft.

What happens on the Death Star stays on the Death Star.
    If you’re on Facebook all the time, you’d never notice. If you got Facebook emails, you’d never notice. I don’t meet either of those requirements. If I use Facebook, I use it. If I don’t, I don’t want a hundred emails cluttering up my space, and this is a problem. See, if I don’t participate actively, by clicking like (something I rarely do), then I cause your edgerank to drop and fewer people to see your posts, so fewer of us click like and thus it sucks. They aren’t wrong with their algorithm, as Facebook is a work based on connections. A likes B likes C who shares A with D. That’s how things get around.(If you’re interested in the math, Dan Zarrella did the math and Harvard reblogged it.)

    With that in mind, if you’re trying to improve your ranking for your product, Ari Herzog has a suggestion: Concentrate on interactions. If you have some regular people who leave comments, talk with them. They’ll be more inclined to share and retweet your posts, which gets you better ranking. Remember, we’re working under the assumption that everyone wants to get noticed more for business, and while SEO is not a zero-sum game, there are winners and losers. Concentrating too much on the media aspect of social media is a quick way to lose.

    For the rest of us who just want to communicate with our friends and family, you’re better off getting a blog that emails them directly. At least then the only fear you have is the spam filters. But of course, that falls into the argument of why you should own your own content anyway, and is a post for another day.

  • SEO: Impossible

    SEO: Impossible

For someone who thinks SEO is crap, I sure talk about it a lot. Google’s got a new toy: Disavow links.

In the wake of Panda, a lot of sites got hit with bad SEO rankings from having crappy backlinks. Specifically, I know many WordPress theme developers were hurt, including WPMUDev, because spammers and scammers used their themes. Basically their own popularity bit them in the ass, through no fault of their own save their success. After all, a pretty common question people have is “Do those crappy, low-quality inbound links hurt me?” And most of the time, the answer was no. Except when it did with Panda. At the time, it didn’t seem fair to anyone that your popularity would be detrimental to your SEO, and thus we have Disavow. (Amusingly enough, Bing got there first.)

But what does it do? Matt Cutts has explained it at length.

    For the rest of us, it lets you say ‘These links are crap and they’re not related to me, so please don’t let them impact my search ranking.’ Many of you are looking confused here, and wondering why they impacted you in the first place. After all, it’s not your responsibility to monitor the quality of sites on the Internet, is it? That’s why Google and Bing make the big bucks. And yet we all know how terrible search results can be, and frankly Google’s blog search is horrible. I have to hand it to Google, though. Search is hard, and crowdsourcing the work of teaching a computer what is and is not spam is actually a good idea.

Google’s (and Bing’s) methodology rubs me the wrong way. Now that Google has us doing the work for them, by picking out spammy sites and effectively reporting them, you’d think all is good for the theme world. Alas, not so. I’ve heard rumblings that Google is now asking theme developers to remove backlinks!

While I don’t feel a theme developer will be broken by this, it will make it much harder for them to promote their works. On the plugin end of things, I’ve had people ask me to remove their plugins because we don’t permit WordPress plugins to show backlinks unless they’re opt-in, and this means the dev can’t make money. Part of the reason is that you can have hundreds of plugins, but only one active theme. The other part is we feel it looks spammy. Now, so does Google.

    But all that aside, if you want to disavow your backlinks, you can now do it, and the directions aren’t complicated. Click on the disavow link, upload a text file formatted in a certain way, reap benefits. Sounds great, right? What if I told you that Google sends you no confirmation at all? There’s no confirmation, no way to see if what you did worked or not, and worst of all, this could take weeks, if not months, for them to crawl, sort, and re-crawl your sites. During that time, you hear nothing. When it’s done, you hear nothing.
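That ‘certain way’ is a plain text file, one entry per line, something like this (the domains and URLs here are invented examples):

```text
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-link-farm.example
# Or disavow one specific page:
http://some-scraper.example/stolen-post.html
```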

    You do all this work and end up in a vacuous hole of ‘well, there’s that then’ with no assurance of anything at all being done. That caught my attention in a bad way. How can I tell I’ve done the right thing? We’re already being killed by not being able to track encrypted search terms, and now we’re not going to be able to tell if removing the links from the bad people is going to help our SERP?

This is why I think SEO is full of it. To one degree or another, it’s always been about gaming the system and tricking search engines into letting you rise to the top. Meta tags trumped quality, and then it was links (because obviously if people link to you, you’re valuable). Now we know people game links, so we remove that, which actually doesn’t hurt as much as you’d think. See, a lot of your search engine ranking came from the quality of sites that linked back to you. But the most valuable sites (like Wikipedia) have stringent policies and rules about not linking, or linking with nofollow, to prevent you from getting link-juice. In the case of Wikipedia, it makes sense, since anyone can edit it.

    But…

That just went to prove the system was broken. Blogs (WordPress included) nofollow comment links for the same reason. If the door were open, the spammers would use it and make themselves look more important. And as the tools got smarter and started making those links worthless, the spammers started scraping your quality content, which Google et al. had to learn to filter. We’re at the point where links are valueless. It doesn’t matter who links to you anymore, because none of the good sites will give you much value, since they’re trying to get rid of the spammers. So why is Google giving any weight to these spammer links?

    If the state of link-relativity is so poor that search engines are asking us to remove backlinks from themes, and also to tell them which links to us are worthless, then all links are more trouble than they’re worth and we need to figure out a better way to measure the usefulness of our sites. What measuring sticks do you use?

  • Penguins Just Gotta Be Me

    Penguins Just Gotta Be Me

    One penguin in a crowd singing
    Credit – Gary Larson

    Google’s Penguin came out and a lot of people got hammered, hard, by the changes. Penguin is the name of their new/updated algorithm, and it seems to have a lot to do with backlinks.

    Backlinks are when other people link to you. Pretty straightforward, but now it appears that people are being penalized for backlinks. Is this true? Yes and no. Google used to credit you for all sites that linked back to you, so the more popular you were, the more referral credit you got, the higher you were ranked, and that seems fair. Now, Google’s no longer passing through backlinks from spammers, so your referrals are dropping and people are being ‘penalized.’ But not really. That’s almost like saying you’re getting fewer phone calls after all the telemarketers stopped calling. Yes, you are, but you’re now not getting junk calls, and the ones you are getting are higher quality. The theory here is that everyone is now being judged fairly again, and by omitting the bad people from giving credit, you’ve leveled the playing field. Nice theory. It still feels pretty horrible to find your rankings dropped.

How do you find what happened to your rankings? Search Engine Journal has a lengthy explanation, but it boils down to looking at your Google organic traffic and seeing if you have noticeable drops on April 19th, 25th, and 27th. That covers both Panda and Penguin.

But what caused it? Are the drops legit or unfair? That’s really something easily argued in multiple directions. The basic reason is that something in your site, or in your site’s backlinks, has been determined to be spam. It sure feels unfair, because how can you be expected to do anything about what other people are doing? They’re the spammers, not you, so why are you punished? Again, ’tis the great equalizer. If you remove all the bad links, what you’re left with may be a lower ranking, but it’s possibly a more honest and accurate one. I say possibly because I’m not able to see behind the Google curtain.

    Few of my sites were impacted, though I generally get traffic from Twitter and Google Plus, because that’s where I advertise. Once in a while, a post gets picked up by another WordPress blog or email list like WP Mail or Matt Mullenweg, and I get 600% traffic. But most of the time I’m pretty steady, increasing slowly and naturally. In part this is because this is my hobby. Certainly I take pride in what I do, but this is not going to make or break me. That’s lent itself to a very odd thing. I’ve managed to follow every single one of Google’s ‘do this!’ suggestions, without ever thinking about it.

    What are these rules? They’re obvious and I’ve touted them many times before.

    1. Write good content.
    2. Don’t spam.
    3. Link naturally.

    The first two are easy, the last one is a bit weird.

    Natural linking is like what I did when linking to Search Engine Journal. I saw a URL, I copied it in, and I put my own description. In addition, I don’t have a massive collection of links anywhere. I link to people and posts in-line, when they come up, and send you around to them in a way like I would if we were talking. In that way, I’m always making backlinks that are valuable for the next guy.

But like I mentioned before, you can’t control other people’s backlinks to you. If you write WordPress themes and plugins, you may be getting hit by this, and there is something you can do. It’s just that you won’t like it. See, one of the things spammers do is use the same link, with the same URL and href attributes, over and over. What happens when you have an attribution link in your theme or plugin? It’s the same link. Over and over. At first glance, that seems horrible, because a theme would be penalized for having a link credit (like I have here) back to its site. Some people seem to feel this is exactly what’s happening, and the current feeling is that putting in the link as nofollow would be a solution.
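For anyone unfamiliar, the nofollow ‘solution’ is a one-attribute change to the credit link (the URL and shop name here are placeholders):

```html
<!-- Attribution link that passes no link-juice to the theme shop. -->
<a href="https://example.com/" rel="nofollow">Theme by Example Themes</a>
```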

Sidebar: Yes, I’m aware of the debacle with WPMUDev getting hammered by Google Penguin. Of interest to me was that once they removed EduBlogs (a site they run) from having links back to them, the issue seemed to be resolved. A working theory of mine is that Google saw the hundreds of thousands of ‘self’ backlinks from those sites to the parent and felt it was gaming the system. This would explain why WordPress, who runs a gazillion sites, didn’t get hit, and why not all themes are getting slaughtered. Personally, I think a better move would have been for Google to just throw those results out the window, but…

Emperor Penguin

Plugins, on the other hand, run by different rules. One of the plugin guidelines is no powered-by links at all unless the user actively opts in.(Themes are permitted one in the footer, or an optional one. In part this is because you only ever have one theme at a time, but you can have multiple plugins.) Having too many links out to the same place would be a problem for your SEO, and a plugin that linked multiple times would hurt you. We already know that Google knows how to check your JS for hidden links. Back in 2007/2008 they added the ability to parse onClick events, and it’s only improved since then. So while in 2008 Matt Cutts said it was safe to put a link in your JavaScript if you didn’t want it searched, that seems to no longer be the case. I’ve spot-checked a couple of sites, comparing them before and after, and studying their configurations, and many that have JS-controlled ‘powered by’ links are being hurt.

    One major takeaway here is that Google screwed some things up, big time. A day-zero search on Viagra was buck wild and all wrong. It’s fine now, but there’s no way a spammer should have been ranked first on a Viagra search. I’ve complained about how Google prioritizes before, and back in 2009 I declared that Google’s Blog Search was Irrelevant. You couldn’t get a single decent result on anything. With Penguin and Panda, they’ve decided to treat everyone the same, and if a lot of terrible people are using your products, and you have a backlink, you’ll get dinged.

    What does all this mean? Well go Google for ‘panda google wordpress’ and you’ll see a lot of people shouting that they’re being penalized, and the ‘nofollow’ fix is hardly a fix at all. More are shouting that those ‘share this’ plugins, which show up multiple times on one page, are causing rankings to drop because the exact same link shows up multiple times. And right now, we don’t know. Like the Viagra problem, Google is fixing a lot of this on the fly. Google says ‘No algorithm is perfect!’ and that is certainly true, but if Google really is just equalizing things, then why were these bad sites so highly ranked to begin with?

If you’re a plugin or theme designer, I’d put nofollow on your attribution links for now. First, the link-juice didn’t matter anyway if it was in JavaScript, and second, what you want is people coming to your site and downloading the perfect theme or plugin. They’re going to shop around, and that will, eventually, lead to more sales. Pushing people is a sales technique that falls flat. There are so many options for themes and plugins that a hard sell will lose you people. So will stuffing your plugin with every SEO trick you know.

    There’s no great answer, and screaming at Google (or WordPress) isn’t going to help. They’re going to do what they want. The best you can do right now is weigh your options between attribution and abuse. Are you really making things better for the users, or are you just doing this for yourself?