Half-Elf on Tech

Thoughts From a Professional Lesbian

Category: How It Works

  • Custom Post Types Are Not Posts

This confuses the heck out of a lot of people. Custom Post Types aren’t posts; they are post types.

Otto wrote a very nice primer on WordPress 3.0 and CPTs which points this out. Nearly a year later, people are still getting it wrong because they refuse to let go of the word ‘post.’ It was, in retrospect, probably a poor choice of names. Ditto Post Formats, in my opinion, but there we are.

I blame the naming, really. “Custom Post Types” implies that these are “Posts”. They’re not. “Post Type” really refers to the internal structure of WordPress. See, all the main content in WordPress is stored in a table called “wp_posts”. It has a post_type column, and until now, that column has had two main values in it: “post” and “page”.

    So now, with CPTs, we can make new ‘post types’ and name them whatever we want. It’s very important to note that the column name of post_type is why we call these Custom “Post Types.” If you can let go of the (very logical) connection of ‘Custom Post Type is a type of post’ and start thinking of it as ‘Custom Post Type is a new Post Type’ then you’re halfway to victory.
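If you want to see this for yourself, here’s a minimal sketch (meant for a throwaway mu-plugin; the file name and the error_log() output are just my example) that lists every post_type value sitting in your own wp_posts table:

// Minimal sketch: list the post_type values WordPress is actually storing.
// Drop into wp-content/mu-plugins/list-types.php; $wpdb knows your table prefix.
global $wpdb;
$counts = $wpdb->get_results(
    "SELECT post_type, COUNT(*) AS total FROM {$wpdb->posts} GROUP BY post_type"
);
foreach ( $counts as $row ) {
    error_log( $row->post_type . ': ' . $row->total );
}

On a stock install you’ll see values like attachment, revision and nav_menu_item alongside post and page, which drives the point home: a ‘post type’ is just a value in that column.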

If you’ve ever used a wiki, you know there’s no real post hierarchy like there is in a default WordPress installation. In WordPress, you always have the ability to frame your post URL slugs with the date, or even the category. (As of WordPress 3.3, it’s not as disastrous as it once was to use /%category%/%postname%/ in your URLs.) Look at MediaWiki. Everything is pretty much top-level. You don’t sort by subfolders, or categories, or anything. All the URLs are domain.com/post-name.

What about SEO? I’ve said it before, and I’ll say it again: SEO doesn’t care. Google doesn’t care if your URL is domain.com/foobar or domain.com/2001/foobar – your readers might care (which is why I advocate using at least the year in your URLs for HEO), but Google, not so much. If Google did care, why would MediaWiki rank so high in most searches? No, what SEO cares about is your content, your context, and your relationships.

That raises the question of why anyone would use CPTs at all. Last year, Otto advocated that you not use them if you’re just blogging. He’s right. You shouldn’t. But I use them here to make custom pages for my plugins, and I use them on another site to record all the questions people send me. They’re unorganized compared to posts, but I can, and have, added taxonomy support to sort them. Thanks to people like Justin Tadlock, there are tutorials on how to correctly make your Custom Post Type, and I know to just add 'taxonomies' => array( 'post_tag', 'category' ) to let my CPT use tags and categories. Want to limit it even more? How about linking specific post types and taxonomies!
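To make this concrete, here’s a minimal sketch of a CPT registration (the ‘question’ name is my hypothetical; register_post_type() and its arguments are the real WordPress API):

// Minimal sketch: a 'question' post type that borrows the built-in tags and categories.
add_action( 'init', function () {
    register_post_type( 'question', array(
        'label'       => 'Questions',
        'public'      => true,
        'has_archive' => true,
        'supports'    => array( 'title', 'editor' ),
        'taxonomies'  => array( 'post_tag', 'category' ),
    ) );
} );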

Some great examples of CPTs are things like bbPress 2.0, the new forum plugin from WordPress, but this is also the future of BuddyPress! People use them to create movie databases, actor pages, a FAQ, or pretty much anything that needs its own structure. What shouldn’t you use CPTs for? Basically, if you want something to act like a blog, don’t use CPTs. If you want something to live on its own, like a forum, a wiki, a Facebook page, then you want a CPT. If you want multiple blogs, with unrelated, unconnected content that just happens to have the same author, we call that MultiSite. (See? There are great reasons to use MultiSite!)

    But they’re not for everything, and never will be, any more than WordPress is right for everyone. So let go of the ‘But they’re posts!’ argument, because you are flat out wrong. They’re post types. Not posts.

  • Don’t Use WordPress MultiSite

    Edit: It’s May 2015 and this post is still relevant.

    I talked about this at WordCamp SF 2013. Check out my slides or watch the video.

    I love MultiSite. I think it’s awesome and very helpful when you want to make a network of sites. But more and more I see people doing things where I just tilt my head and wonder why they’re using MultiSite for that particular use-case.  People seem to think that simply because they can use MultiSite that they should use it, and this simply is not the case!

MultiSite, either by intention or effect, works best when you think of it as running your very own version of WordPress.com. You have a network of sites that are disconnected from each other, data-wise, but share the same available user base. That means the only ‘information’ shared between two sites is your user ID, and even then, unless you’re explicitly granted access to a site, you’re nothing more than a subscriber. Which is to say you can read the site, and comment. (You could get nitpicky here and point out that there are a lot more things one can do as a subscriber on a site, but you understand the gist.) That means that while there are many perfectly valid reasons for having a MultiSite, it will never be a perfect solution for all people.

One of the best alternatives to MultiSite is Custom Post Types. They let you make ‘subfolder’ additions to your site and format them as you want. There is a drawback, though, in that you cannot use YYYY/MM/DD in your permalinks for them (Otto on Custom Post Types – wp-testers email list), but I wonder why people still use that these days anyway. The only reason I use YYYY in my URLs is that I believe there’s a shelf life on the usefulness of these posts, and if you come back in five years, you should know how old the information is.

    Another alternative is good planning.  If you sit down and define your needs for your site before you build it out, and plan for the growth you desire, a lot of things become clear.  Think about how many different places you’d want to go to maintain your site.

    Here are some examples of sites that should not be built out as MultiSites:

    To Categorize Posts

This one comes from my girl, Andrea, who reminded me of a fellow we ran into who wanted to have one site to post from, with each post going to a special site based on the category. WordPress already has that built in! It’s called, get this, ‘categories.’ Now the user in question said he didn’t want categories, because your URL shows up as /category/permalink, and that wasn’t his desire. So I suggested Custom Post Types. /posttype/name was much better, and he could add in tags as he wanted.
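If the /posttype/name URL is the whole point, the rewrite argument is what controls that slug; a hedged sketch (the ‘recipe’ name is made up):

// Sketch: posts of this type get URLs like /recipes/post-name instead of /category/...
add_action( 'init', function () {
    register_post_type( 'recipe', array(
        'label'   => 'Recipes',
        'public'  => true,
        'rewrite' => array( 'slug' => 'recipes' ),
    ) );
} );
// Visit Settings > Permalinks once afterwards so WordPress flushes its rewrite rules.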

When Your Site is Homogeneous

Do you want your whole network to look and feel 100% the same?  Don’t use MultiSite.  If every single subsite is going to be exactly the same, except for content, but the content is all written the same way, you don’t need MultiSite.  Replicating the theme and settings on every subsite is a pain, and you can achieve the same result with categories, tags and CPTs.  You can even use a membership plugin to control who sees, and has access to, each CPT! (Role Scoper claims to do this, in fact.)

Now someone will point out that this site fails that check!  If you notice, three (four, kind of) of the sites look very similar: same general layout, same links and sidebars, but different headers.  This site could all have been done as categories and CPTs, without needing MultiSite, until I hit the children sites like the one for my grandmother.  But.  When I built it out, I decided to put my tech posts on their own site to separate the writing.  They are separate sites.  What I write here is vastly different from my blog, and that’s important to me.  The site has the same ‘feel’ in look alone: the context is what separates us. (And I have a plan for the photo blog.)

    For One Special ‘Thing’

I’m guilty of this one.  I had a site that was a blog, and I wanted to make a ‘video’ section.  So I made a MultiSite!  Boy, was that dumb.  Two admin areas, two sections for layout, and I wanted the site to still look like ‘itself.’  I caught a clue later on and converted the whole thing to Custom Post Types!  Much easier to maintain!  Now I have a smaller, faster site.

    Users Shouldn’t Know About Each Other (AKA Separate User Databases)

Andrew Norcross pointed this out.  If you need users to be on different sites, but not aware that they’re on a network, don’t use MultiSite!  Now, yes, there are ways around this; however, it’s an auditing nightmare for any large company, and a security risk you should be aware of before you start.

Curtiss Grymala points out that if you need totally separate user databases, this is a strong case against MultiSite.  Be it for security or just obscurity, if the users need to be separated, don’t do it.  There are workarounds, but you’ll spend more time on those than on updating your sites.

    Hosting Small Client Sites

I don’t host my Dad’s site, Woody.com, even though I maintain it.  Why?  Because, as Cristian Antohe said, he just needs a standalone WP install.  Would it be easier for me to have one place to go to upgrade him?  Yes and no.  He’s small, he doesn’t need a lot, and he now owns his domain, his site and his email, all in one place.  It costs him $7 a month, plus the number of meals he buys me when we’re in town together, and he’s master of his own domain.  This is great for him, because if he fires me, he still has everything.  Also, if he does something weird that spikes his traffic 500% (like last month), it doesn’t affect the rest of my sites.  Factor that into your budget.  Make your client own their own data.

    Users Need To Embed JS Into Posts

This is not a bug, people.  Only the Super Admin on a MultiSite install has the access to include iframes, JavaScript, and other non-oEmbed’d data in posts!  You don’t want anyone else to!  If you’re running a MultiSite, you’re the big dog, and you’re responsible for limiting your users’ actions to things that won’t take down everyone, because they don’t understand what an insecure iframe hack is.  Yes, there’s a plugin that will let you allow this.  No, I won’t tell you what it is, because unless you’re running a 100% locked-down site, where you approve only users you know and trust with your car, you do not want to open this door.

If you can’t give them the access they need via shortcodes, then they need to host themselves, or you host them separately.  Protect everyone on your network, and don’t give them unregulated access.
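As a sketch of what ‘access via shortcodes’ can look like in practice: a whitelisted embed shortcode that accepts nothing but a video ID, so users never touch raw iframe markup (the [yt] tag and the embed URL are my own hypothetical example):

// Sketch: users type [yt id="abc123"]; we build the only iframe we allow.
add_shortcode( 'yt', function ( $atts ) {
    $atts = shortcode_atts( array( 'id' => '' ), $atts );
    // Strip anything that isn't a letter, number, dash, or underscore.
    $id = preg_replace( '/[^A-Za-z0-9_-]/', '', $atts['id'] );
    if ( '' === $id ) {
        return '';
    }
    return '<iframe width="560" height="315" src="https://www.youtube.com/embed/'
        . esc_attr( $id ) . '" frameborder="0" allowfullscreen></iframe>';
} );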

    Users Need To Install Themes/Plugins

Curtiss again reminded me that MultiSite doesn’t let you let your users install themes and plugins as they want.  You can give them more customization via clever themes that save settings per site (like TwentyEleven) and plugins that let you tweak CSS (like WordPress.com Custom CSS), but you cannot give them access to install plugins and themes.  Why?  Because those things will be available to everyone on the whole Network. (There are plugins to manage plugins more granularly, and only permit some sites to use certain plugins, but again, this isn’t something everyone on your network should have access to do.)  Remember, we’re sharing here!

    Same Post, Every Site

I keep running into this one.  “I want to have the same post pushed to every single site on my network!”  I understand why people do this; I just think they’re doing it wrong.  It’s not just that MultiSite is meant to be separate (aka individual) sites, it’s that you’re diluting your content.  The more different places someone can go to get the information you’re providing, the less impact you have, because you’ve given them too many options.  Decisions.  Make one.  Also, as Andrea reminded me, identical content in multiple places is something spammers do.  Google will downgrade your site ranking if you do this. (This doesn’t impact categories, tags and archives, because of the use of canonical links.)

Now, one user said he needed to do this as a business decision, because each of his (mapped) domains was a separate brand.  But the separate brands had shared data.  So … they’re not actually separate, but children.  Me?  I’d have everything link the shared data back to the master brand.  McDonald’s may sub-brand out happymeal.com (they did!) and make a whole separate site, but if you click on their ‘Privacy’ link, you go back to mcdonalds.com!  Why?  Because the parent brand is where that stuff belongs.

    BuddyPress Separation

    This comes from Andrea again.  If you need to have totally separate BuddyPress installs, use separate installs entirely.  Just … y’know, you can do it other ways, but it’s not worth it.

    What else?

    This list could go on and on, so jump in and tell me your reasons why you’d never use MultiSite!

  • All Subs Are Equal

For years we’ve told people, “The only difference between subdomains and subfolders is that search engines see subfolders as all part of the same site, and subdomains as separate sites.”

    That may not be true for very much longer.

    As August ended, Google dropped this little bomb casually:

    Most people think of example.com and www.example.com as the same site these days, so we’re changing it such that now, if you add either example.com or www.example.com as a site, links from both the www and non-www versions of the domain will be categorized as internal links. We’ve also extended this idea to include other subdomains, since many people who own a domain also own its subdomains—so links from cats.example.com or pets.example.com will also be categorized as internal links for www.example.com.

    via Official Google Webmaster Central Blog: Reorganizing internal vs. external backlinks.

This is, I think, a good thing. They make sure you understand that ipstenu.wordpress.com won’t be the same as jane.wordpress.com, which makes sense, since their own Blogger platform runs off subdomains as well. Somewhere in their logic they know ‘Aha! Ipstenu doesn’t own wordpress.com! But she does own Ipstenu.org.’

    To reiterate,  this only affects things in Google’s Webmaster tools:

    This update only changes how links are displayed in Webmaster Tools. It doesn’t affect how links are valued in relation to the search algorithm or ranking. It has nothing to do with Panda, nothing to do with keywords, nothing to do with PageRank.

I fully expect that it’s going to change and expand.  Since most of us end up registering our ‘master’ domain (i.e. ipstenu.org) on Google Webmaster, they can easily leverage the data they already have in order to tweak search engine results.  Another tactic would be to start using new meta-type tags.  After all, the big guys already organized schema.org (which hasn’t been picked up as much as it might be yet).

Sidebar: Schema.org’s problem is how complicated it is to fold into CMS tools.  If they’d started by releasing this with a couple of plugins/extensions for the popular open source tools like Drupal, Joomla, MediaWiki, MovableType and WordPress (which, as they’re open source, they could have), we’d be way ahead of the game.  As it stands, the one WordPress plugin I’ve seen requires you to import SQL tables!  If they get schema ironed out so it’s simple and easy to use, then we can add ‘parent’ or ‘main’ site parameters to our code, and every search engine can know ‘halfelf.org is internal to ipstenu.org, but taffys.ipstenu.org is not.’  And wouldn’t that be cool!

Personally, I have ipstenu.org, halfelf.org and so on all listed separately in Google Webmaster right now.  I could do that with subfolders as well, to track data internally to itself (i.e. each subdomain/subfolder), and still track them at the master level with the main domain.

    So no effect on your SEO or ranking or any of that stuff, but a nice change for Webmaster Tools.

    Where do you see this going?

  • TimThumb and the Pseudo (D)DoS Effect

Over the course of a day, my server rebooted httpd twice. That’s not a common thing for me; after days and hours of work, I had managed to finagle my server into a semblance of stability so it could handle massive loads. At first I thought it was due to my traffic spiking about 1500% (no, I did not screw up a decimal place there; my traffic went from a couple hundred visitors to nearly 4000 in one day). Then I thought that couldn’t be right, because my traffic had actually mellowed out by the time I was crashing.

    So I went to my emails and pulled up the Apache Status log and noticed that 70% of the calls to my site were GETs to pages like this:

    /wp-content/themes/newsworld-1.0.0/thumb.php?src=/g0../0d1.
    /wp-content/themes/DeepFocus/tools/timthumb.php?src=/g0../0
    /wp-content/themes/TheProfessional/tools/timthumb.php?src=/

And thanks to that massive spike in traffic, my server was slowing down to the point that HTTP was becoming unresponsive and it had to reboot itself. In short, the TimThumb exploit was causing my server to behave like it was under a Denial of Service attack, even though I don’t use TimThumb! My server was able to handle this, but if I’d been back on my old shared server, I’d probably not have gotten a text from the server at 11pm saying “Man, we had to reboot, but it’s okay now. Can I have a beer?”, but instead woken up to ‘Dude, where’s my website!?’ And this is with having a fantastic web host who cares and takes the time to help me out.

Normally this is where I’d tell you what to do if you’ve been infected via the TimThumb exploit, but Cleaning Up the TimThumb Hack covered it pretty well. Just remember this: if you have been infected, you must reset all your passwords. This is true of any and all hacks. As soon as someone has access to meddle with files on your server, you could be hurt worse than you know. At the very least, you need to read the post “Technical details and scripts of the WordPress Timthumb.php hack” by the guy who ‘fixed’ TimThumb.

What I wanted to do here was sort out how to block people who were looking for timthumb.php files (I can’t block thumb.php, as I use that filename elsewhere). Reading up on Perishable Press’s Stupid .htaccess Tricks, it’s clear we can do this:

# prevent viewing of a specific file
<Files timthumb.php>
 order allow,deny
 deny from all
</Files>

    That should simply block access. An .htaccess block is a pretty simple way to reduce your server load, because people are getting punted before they get very far into things. Still, it’s something I have to install on each account on my server. Right now they’re just hammering ipstenu.org, and this is not the only domain on my server. This is, by the way, the same problem with using a plugin like WordPress Firewall. It’s a fantastic idea, if all you have is one account on a server. Protect yourself.
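If you have root and don’t mind touching the global Apache config, one hedged alternative (the file path varies by distro; /etc/httpd/conf/httpd.conf is my cPanel-ish assumption) is to declare the block once so every vhost inherits it:

# In httpd.conf (or an included .conf file) - applies to every vhost on the box
<Files timthumb.php>
 order allow,deny
 deny from all
</Files>

That only returns a 403, though; the scanner still gets to hammer Apache with requests, which is why I’d rather block the offenders outright at the firewall.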

I don’t have just one account, though. I run a VPS, and I have four domains here which I should be protecting. It’s easy enough to make that change on all four, plus two other external servers, but is that the best use of my time? I say no. I think I should automate as much of this as I can. What I really want is to say ‘If you’re looking for timthumb.php, then the odds are you’re looking for security vulnerabilities, and you need to just die in a fire.’ Or at least firewall-block them. Seeing as I already have CSF on my server, it was logical for me to start there.

Blocking an IP is easy, and I can do it via command line or GUI. Auto-detecting a URL, though, is more complicated. Part of me thinks that, much like I can auto-block someone who tries to log in with any ID 10 times in a row, I should be able to add a file pattern somewhere. Turns out you can’t, at least not the way I wanted. Instead, you have to do it differently.

TimThumb’s exploit scanner isn’t actually a DDoS attack, but it acts like one. A denial-of-service attack (DoS attack) or distributed denial-of-service attack (DDoS attack) is an attempt to make your site unavailable. Basically, they hit your site with so much traffic that it’s incapable of doing its job: serving up webpages to happy readers! That’s why I call this a pseudo (D)DoS attack. The goal of the scanner is to see if you’re using TimThumb and, if so, to be evil. It’s not really distributed (i.e. multiple computers at once), though because of the number of people running the exploit scanner, it can seem that way. The side effect is that your site is hammered and can’t do what it wants. Which leads us to Connection Tracking.

    CSF has a tool called ‘Connection Tracking’ which lets you limit how many times a specific IP can hit your site at once before they get tossed to a shit-list. I set this to 300, and told it to only scan ports 80 and 443 (because I need to have unlimited FTP, and sometimes I log in from different IPs – yes, my home IP is on the whitelist).

Connection Tracking. This option enables tracking of all connections from IP addresses to the server. If the total number of connections is greater than this value then the offending IP address is blocked. This can be used to help prevent some types of DOS attack.

    Care should be taken with this option. It’s entirely possible that you will see false-positives. Some protocols can be connection hungry, e.g. FTP, IMAPD and HTTP so it could be quite easy to trigger, especially with a lot of closed connections in TIME_WAIT. However, for a server that is prone to DOS attacks this may be very useful. A reasonable setting for this option might be around 300.

    Setting this up is a little less obvious for the new person. Go to WHM > Plugins > ConfigServer Security & Firewall and look for “Firewall Configuration”
    CSF Firewall Configuration

    Click on the button, look for CT_LIMIT and change it to 300.
    CT_LIMIT settings

    Scroll down, click ‘Change’ and then restart CSF.
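If you prefer the command line, the same knobs live in /etc/csf/csf.conf; a minimal sketch (CT_PORTS and CT_BLOCK_TIME are my reading of the options that match the ports and the 30-minute block mentioned here – check the comments in your own csf.conf):

# /etc/csf/csf.conf - connection tracking
CT_LIMIT = "300"        # max simultaneous connections per IP before blocking
CT_PORTS = "80,443"     # only track the web ports; leave FTP and friends alone
CT_BLOCK_TIME = "1800"  # how long a block lasts, in seconds (30 minutes)

# Then reload the firewall so the settings take effect:
# csf -r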

Now, you could put this as low as 100, or as high as you want, but I did some reading and 300 seems like something not too likely to trip up innocent people, but just enough to catch the bad guys. I may want to lower this to 200 or so, but I know that a lot of people come to one part of my server for the image gallery, and they tend to open lots of pages at once. I don’t want to hurt them. The other thing to keep in mind is how short the block time is. The IP block is set for 30 minutes, which isn’t much at all, but it could be just enough to make the transient DDoSers go away. ETA: As of February 2012, I’ve actually lowered this to 50, and it’s made a world of difference! My day-to-day CPU is a little higher, but the number of spikes that caused outages has dropped.

    I’m not doing this to stop the people who want to bring my server to its knees. I’m doing it to stop the people who are ‘scanning’ for exploits. A true DDoS is hard to block because as soon as I block it, I have to defend against it again and again. CSF would be like a sump pump in your flooded basement, always running until it burns out the motor. It comes from too many sources, and for the little guy (i.e. me), I may just have to shut things down for an hour and wait it out. But these scanners, well, I can block them with this trick, and not hurt my server doing so!

  • Understanding Zero-Day

    If you run a website or work with computers much at all, you’ve heard the term ‘Zero-Day Exploit’ and you probably had no idea what that meant.

At its heart, a “zero day” attack or exploit is one that happens before any of the developers are aware of it. It’s pretty straightforward, in that the attacks take place in the window of opportunity between code release and code patch. Logically, you’d think that all exploits are ‘zero day’, because a programmer would never release a product with known vulnerabilities. Right?

    Wrong.

We already accept the fact that human beings are not perfect and thus, by extension, neither is our code. We cannot make every product work on every setup; otherwise there wouldn’t be browser and OS wars. Keeping that in mind, we have to accept the fact that there will always be security holes in code. And sometimes we developers are well aware of them, but consider them acceptable risks. This means that when a vulnerability is plastered as a zero day, the question becomes ‘By whose calendar is this a zero day exploit?’

    If you found a zero-day flaw in a product, the ethical thing to do is privately communicate with the product developers ‘Hey, if I do this, I can get access to THAT.’ At that point, the product developers should take the time to craft a fix and quietly push it out to everyone. The public often isn’t told about this until the patch is written and available, and even then, details are withheld a few days so that, during the critical time it takes everyone to upgrade, people aren’t exploited further. This also allows people to apply one patch instead of 17, as multiple fixes can be wrapped up into one install.

Of course, that’s a perfect-world scenario. There are multiple cases of exploits being announced in the wild before a fix has been made. Sometimes it’s a case of an overenthusiastic reporter, but sometimes the people who report the bug get mad at how long it takes to fix, and release the information in order to speed up the process. There are unprintable words for those fools, and the fact that they can’t understand how they’re making the situation worse is sad.

By its nature, the exploit no one knows about is the one you can’t protect yourself from. That’s why vulnerability disclosure is such a touchy subject. Sometimes the fixes are really easy, but more often they’re not. Think of your car: the gas tank is a standing vulnerability. Anyone can walk up, unscrew your fill cap, and pour in anything they want. That they don’t has more to do with the fear of retribution than anything else, but they certainly could. Also vulnerable? Your mail. I can’t tell you how many times I’ve seen the mail carrier leave the cart on the sidewalk while she goes in to deliver our mail. Someone could steal the mail, but rarely does that happen.

In 2008, a DNS cache poisoning vulnerability was discovered. (ZDNet – Has Halvar figured out super-secret DNS vulnerability? by Ryan Naraine | July 21, 2008, 2:12pm PDT) The details of the exploit itself are inconsequential to this story. When the vulnerability was discovered, the folks ‘in charge’ asked for a thirty-day embargo where no one would ask about it or talk about it, to allow the code to be patched and deployed. This radio silence would end with a news release and explanation. This did not work as well as one might have hoped. (ZDNet – Vulnerability disclosure gone awry: Understanding the DNS debacle by Ryan Naraine | July 22, 2008, 7:09am PDT) People accused the organizers of performing a bit of media hacking (i.e. social hacking) and spinning the news to make a bigger impact for themselves. Essentially, they claimed there were no altruistic reasons to keep a lid on the issue.

When you see a report of a zero-day exploit, the important thing is not to panic. Firstly, check to see if there’s already a patch. Secondly, remember that just because you’re vulnerable does not mean someone’s spiked your gas tank. Thirdly, accept reality for what it is and know that you’ll be impacted at least once in your life, and that’s okay.

    If you know how to recover from this, you’re better off. But that’s another topic.

  • What’s Your Net Worth?

    I get a lot of requests from people to link to their sites.  Back in the day, we all used to have massive link pages where we just listed all the cool sites we knew about.  On a fansite, I actually still have one where I list all the related sites, organized by how they’re related, separated by language, etc etc.  Here, though, you see a list on the right of links, broken down into “Websites” and “WordPress” and that’s pretty much it.

    The reason is that I subscribe to the belief of contextual links.  If a link, by itself, has no context, my reader cannot determine the inherent value of the link.  When I write a blog post, I try to put links that make sense inside my post.  On my fansite, where I have a moderately sized wiki, I link from the related page to the related site.

Still, when people ask me to link to their site (or to friend them on Twitter/Facebook or whatever), my knee-jerk reaction is “Why?” and it should be yours too!  You should always ask that when someone wants to network.  What’s in it for me?  What good will this bring me?  Do you write good content?  If you’re asking someone to link to you, you had better be bringing something good to the table; otherwise you’re an unsolicited request, and no one likes those.

Perhaps this flies in the face of my SEO advice (which is to network), but networking doesn’t mean you should cold-call everyone with a related site and ask for attention.  Sometimes networking is linking to people, but it’s also tweeting and working the community.  If you have a site about dog biscuits, hang out on the Milk Bone forum and talk to people.  If someone has a question about the best biscuits for an old dog missing teeth, and you know you wrote a great post about it, you link to it.  “Hi, Bob.  My dog is 16 and he’s got no teeth on the right side, I know your pain!  I spent a lot of time researching this problem, and hopefully this will help you. Link.”

Look at that!  You were nice, polite, and helpful!  It’s even better if you stick around and talk to Bob some more, if he needs it.  You’re building your reputation in a productive and constructive way. (Yes, it’s a lot of work.  If you haven’t caught on to that yet, I also have a bridge for sale …)  The most important part is that you told Bob why your link was going to help him.  You put up some cred and you didn’t make it too long.

When you think about it, the best way to get people to link to you is to get them interested in your site. The best way to get them interested in your site is to make content of value. Part of having a site with perceived value is having a site that attracts myriad walks of life. It’s a vicious circle. You have to get that foot in the door for people to notice you, and that’s what makes you popular.

    How do you get the foot in the door if you don’t want to spend all your time on related sites?

    You don’t.

    Look. If this was a brick and mortar company, you’d be advertising, wouldn’t you? You’d know you had to network your vegan dog biscuits to all the hippies and dog lovers out there, and you wouldn’t think twice about it. You’d hire that idiot kid to stand on the corner in a gorilla outfit handing out coupons, or spin a sign while dressed as a sandwich. You would spend money and time to introduce the world to your brand.

The Internet is the exact same way. So when you cold-email someone and say ‘Hi, I really like your stuff! Will you link to my site?’ you need to bring your A game. You need to sell your work, explain to me why you’re worth space on my site, and why I should read your blog. Just saying ‘I, too, am a blog about vegan dog food!’ doesn’t cut it for the bigger sites. You can’t expect people to spend all their time checking out people they should link to, especially if you’re not already linking to them. Think of it like coming up with a good cover letter for your resume. You want people to read that page and go “Yeah, this cat is cool!”

    Your links make or break you, but more important than who links to you is who, and how, you link to others. If you link to every dog site in the world, links from you are worthless. If you’re discerning and link only to the ones that mean the most to you, or are the most like your own site, then you’ve shown the ability to tell the difference between any old site and one of value. You’ve made yourself worth something.

And when you’re there, you won’t need to ask people to link to you any more. That’s when you’ve made it.

    Just don’t think it’ll happen all in one day.