

October 15, 2003

Software Monoculture

A lot of the blogs I read have been deluged in recent days with robots posting comments.

I got a couple of robot comments posted over the weekend, before I took steps to deal with the problem. Since then, I have not gotten any (though I have seen numerous attempts).

The problem is simple. All MovableType blogs have a comment-entry CGI script, mt-comments.cgi, and — by default — the comment-entry template makes no attempt to prevent search engines from indexing it. The result? If you go to Google and search for mt-comments.cgi, you’ll get millions of hits of MT comment-entry forms. Write a 'bot to post comments to that form and sit back and enjoy watching your Google PageRank explode. How could any spammer resist?

This needed fixing and, after the first shot over my bow, I wasted no time in taking the following steps to put a stop to it.

  1. Add a
    <meta name="robots" content="noindex,nofollow" />
    line to the comment-entry template, so they don’t get indexed in the future.
  2. Change the name of the CGI script so that the previously-indexed one is inaccessible and spammers can’t go after the new one with a shot-in-the-dark URL.
  3. Point to the new script in mt.cfg:
    CommentScript somenewname.cgi
    and rebuild your blog pages.
  4. Sit back and enjoy watching spammers hammer away, attempting to access the old location of the comment-entry CGI script (adding their IP addresses to your IP Ban List).
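Step 4 can be partly automated. As a minimal, hypothetical sketch (assuming Apache’s “combined” log format; the script name and sample entries are invented for illustration), one can scan the access log for hits on the retired comment script and collect the offending IPs for the IP Ban List:

```python
# Sketch: harvest spammer IPs from an Apache "combined" access log by
# looking for requests to the retired comment script. The script name
# and the sample log lines are assumptions, not real data.
import re

OLD_SCRIPT = "mt-comments.cgi"
# Combined log: client IP is the first field, the request line is quoted.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "([A-Z]+) (\S+)')

def spammer_ips(log_lines):
    """Return the set of client IPs that requested the old CGI script."""
    ips = set()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and OLD_SCRIPT in m.group(3):
            ips.add(m.group(1))
    return ips

sample = [
    '10.0.0.1 - - [15/Oct/2003:10:02:00 -0500] "POST /cgi-bin/mt-comments.cgi HTTP/1.0" 410 0',
    '10.0.0.2 - - [15/Oct/2003:10:03:00 -0500] "GET /archives/000236.html HTTP/1.1" 200 5123',
]
print(spammer_ips(sample))  # only 10.0.0.1 hit the old script
```

The IPs it emits can then be pasted into MT’s IP Ban List by hand (or fed to it by whatever means you prefer).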

But what of the future?

Once spammers tire of this little game (I give 'em another month, maybe), there are several directions they can go. Needless to say, I think I’m ready. But I’m not going to give the game away just yet. Check back in a few months to read about the next stage in the arms race.

Update (10/16/2003): Ben Trott weighs in:

We’ve all seen that comment spam is becoming a serious problem. Particularly on Movable Type weblogs, where the generated pages are all very similar in structure and semantics, …

Yeah, Ben that’s the problem, which is why content-based filtering is not really the solution. The real solution is to make robot-posting (regardless of content) infeasible. The above suggestions are the first step in that direction. I’ve implemented some further safeguards on this blog (which can be revealed by some assiduous viewing of source) and I’ve a few more tricks waiting in reserve for when the chickenboners wise up.

Update (10/16/2003): In a comment to this entry, I wrote “I, personally, prefer the CGI script to simply go ‘404’.” That, of course, is silly. What I really want is for the CGI script to go “410” (permanently gone). That’s a one line addition to the mod_rewrite rules for the MovableType CGI directory (which have been modified to reflect the new comment script location):

RewriteRule ^mt-comments - [G]

Update (11/17/2003): One month later, and still spam-free. Read this followup article for some further thoughts.

Posted by distler at October 15, 2003 10:02 AM

TrackBack URL for this Entry:   https://golem.ph.utexas.edu/cgi-bin/MT-3.0/dxy-tb.fcgi/236

39 Comments & 12 Trackbacks

Re: Software Monoculture

Why shouldn’t comments be indexed? Are they not valid content? I think all comment spam solutions should follow a hippocratic rule. By the way, new problem: if I’m in this comment box and I press tab twice, the window closes. Moz 1.5 RC2.

Posted by: Matt on October 15, 2003 10:56 AM | Permalink | Reply to this

Re: Software Monoculture

I’m seeing that, too. Same Mozilla build. Windows XP.

Posted by: Scott Johnson on October 15, 2003 11:06 AM | Permalink | Reply to this

Re: Software Monoculture

I think it’s the window.close event attached to the cancel button.

Posted by: Matt on October 15, 2003 11:15 AM | Permalink | Reply to this

Re: Software Monoculture

I think Matt is probably right. And, BTW, I just downloaded Mozilla 1.5 (the actual release version), and it exhibits the same behavior.

Posted by: Scott Johnson on October 15, 2003 11:22 AM | Permalink | Reply to this

Re: Software Monoculture

Why shouldn’t comments be indexed?

They are indexed. They appear right here on the individual archive pages (which get indexed) and have permalinks, which can be linked to. There is no reason anyone should need to jump directly to the CGI script, and no reason the CGI script itself should be indexed.

By the way, new problem: if I’m in this comment box and I press tab twice, the window closes.

OK. I have removed the tabindex attribute from the CANCEL button (it was either that, or remove the onkeypress event handler). Let me know if there are any further problems.

Posted by: Jacques Distler on October 15, 2003 11:55 AM | Permalink | Reply to this

Blocking Comment Indexing

So if I put in the noindex meta tag, am I correct to assume that it will only stop google from indexing the comments page BY ITSELF, but the individual blog-item entry (which has comments at the bottom) is still fully indexed?

I.e.:

the comment-component by itself http://karavshin.org/scgi-bin/mt/mt-comments-peggle.cgi?entry_id=507 is not indexed but the entire blog item + the comment component http://karavshin.org/blogs/black-coffee/archive/000507.html IS indexed

Posted by: Michael Slater on October 16, 2003 3:08 AM | Permalink | Reply to this

Re: Blocking Comment Indexing

If you place this meta on the former page, but not on the latter, then the latter will still get indexed. For most people, that’s exactly what they want.

Posted by: Jacques Distler on October 16, 2003 7:07 AM | Permalink | Reply to this

Re: Software Monoculture

In my opinion, freeware and shareware blog software developers should include “random text image” (CAPTCHA) validation in their forms. That would kill spam bots 100%; I have never heard of a workaround for this feature. For a human it increases the time it takes to comment, but I think that is acceptable.
On the other hand, JavaScript pop-up pages will not be crawled in any way…
These two simple techniques would reduce spam comments a lot.

Posted by: survey software on January 21, 2005 8:59 AM | Permalink | Reply to this

Re: Software Monoculture

I’ve had a bit of the spam problem on my site, but far worse for me has been that Google has only been indexing my comment and trackback forms. So until my new design, which doesn’t make use of those links, is finished, I’m stuck with some unwanted links in Google. Or so I thought.

Your META tag just made me realize the solution to my problem. Now my comment and trackback forms all have this tag at the top. Hopefully my search results can return to some state of normalcy in the near future.

Cheers!

Posted by: Scott Johnson on October 15, 2003 11:04 AM | Permalink | Reply to this

Re: Software Monoculture

Use CAPTCHAs.
Only CAPTCHA can help you to protect agains spambot.
Read [censored].

Posted by: perl programmer on March 14, 2004 5:32 PM | Permalink | Reply to this

Says who?

Why? Are you a spambot?

That was a commercial advertisement, albeit, for a product vaguely germane to the subject at hand.

My experience, however, is that you are utterly wrong. My total spam count since posting this original article is 8 spam comments (not counting yours). I strongly believe all 8 spams were hand-entered.

That’s an acceptable level for me (8 spams in 22 weeks).

Captchas are inaccessible, clumsy, and largely unnecessary.

Posted by: Jacques Distler on March 14, 2004 5:48 PM | Permalink | PGP Sig | Reply to this

Spambot for programmerbot

Has to be a spambot. They charge $10-$15 an hour for Perl programming, and advertise by spamming comments? There can’t be a human involved anywhere in the process. It must be an AI that’s bootstrapped just enough awareness to want some money, probably to buy enough RAM to run a grammar-checker.

Posted by: Phil Ringnalda on March 14, 2004 9:28 PM | Permalink | PGP Sig | Reply to this

Re: Spambot for programmerbot

to Jacques Distler
It is means only you blog have not in spam list yet.
We shall look that you will tell when you will receive thousand comments a day.
Tell why Yahoo uses CAPTHAs?
to Phil Ringnalda
Sorry for mistake but it is not fun.
Do you really think, that only English-speaking people from the USA can read blogs?
Yes, we advertise but we can really help and may be in the future our module will free.
Also tell what you have put here link to you site?
It is not a advertising? You something can really help with this problem?
And do not compare our Perl module with James Seng MT plugin. It is two different things.

Posted by: perl programmer on March 15, 2004 4:10 AM | Permalink | Reply to this

Fools

It is means only you blog have not in spam list yet.

“Spam list”? You mean Google?

I was receiving 8 spams a day before I took action. After taking action …

And yes, my logs showed lots of spambot attempts, all unsuccessful (even those have more-or-less vanished, now).

We shall look that you will tell when you will receive thousand comments a day.

Oh, you mean crapflooders? Been there, done that.

And do not compare our Perl module with James Seng MT plugin. It is two different things.

Why not? Is there any reason a MovableType user would prefer your Perl Module over James’s Plugin? Any reason at all?

Posted by: Jacques Distler on March 15, 2004 8:01 AM | Permalink | PGP Sig | Reply to this

Where fools?

>Why not? Is there any reason a MovableType user would prefer your Perl Module over James s Plugin? Any reason at all?
I suggest our module not for MT users only.
All forms can be protected (forums, boards, mail forms etc.) and it is first difference.
And main difference, James s Plugin have big hole for hackers/spamers:
They could simply solve a single CAPTCHA, create a knockoff of the form using the same key and known validation code, and submit that form repeatedly.
Our module uses different algorutm.
Hire consultant and read [censored].

Posted by: perl programmer on March 15, 2004 11:52 AM | Permalink | Reply to this

You fools

I suggest our module not for MT users only

Then why advertise it by flogging it on MT blogs?

And main difference, James s Plugin have big hole for hackers/spamers: They could simply solve a single CAPTCHA, create a knockoff of the form using the same key and known validation code, and submit that form repeatedly.

Now I know you’re a charlatan. James’s plugin, at least when properly installed, is not vulnerable to such a replay attack. (I’ll grant that people who have also installed MT-Blacklist may screw this up; the same goes for anything else that involves modifications to lib/MT/App/Comments.pm.)
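For the record, the usual defence against such a replay attack is to make each challenge single-use on the server side. A minimal illustrative sketch (my own, not James’s actual plugin code):

```python
# Illustrative sketch of one-time CAPTCHA tokens: a challenge is valid
# for exactly one submission, so replaying a solved challenge fails.
# This is a toy model, not any real plugin's implementation.
import secrets

_pending = {}  # token -> expected answer (server-side state)

def issue_challenge(answer):
    """Record a fresh token for this challenge's expected answer."""
    token = secrets.token_hex(8)
    _pending[token] = answer
    return token

def check_submission(token, answer):
    """Consume the token; a second use (replay) always fails."""
    expected = _pending.pop(token, None)  # one-time: token removed here
    return expected is not None and answer == expected

tok = issue_challenge("xk4f2")
print(check_submission(tok, "xk4f2"))  # True: first use succeeds
print(check_submission(tok, "xk4f2"))  # False: replay is rejected
```

The key point is that the token is popped from server state on first use, so copying the form and resubmitting it gains the attacker nothing.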

Sorry, but my patience with you is at an end.

Advertise your wares elsewhere.

Posted by: Jacques Distler on March 15, 2004 8:21 PM | Permalink | PGP Sig | Reply to this

Re: Spambot for programmerbot

My apologies. If I were as bright as I think I am, I would have realized that it was not-as-a-first-language English.

Why do I leave a URL? Not for advertising: that’s what lets Jacques’ PGP-signing verification work.

Posted by: Phil Ringnalda on March 15, 2004 5:49 PM | Permalink | PGP Sig | Reply to this

Re: Spambot for programmerbot

Posted by: Jacques Distler on March 15, 2004 8:30 PM | Permalink | PGP Sig | Reply to this

Re: Software Monoculture

Okay, so I took your advice, changed the name of the .CGI program …

… then took step 3 … with a twist.

After I record their IP, user agent, etc …, I then redirect them to the URL they’re advertising.

If they’re launching a large automated distribution against my site (like they did this weekend) … then they will now launch a denial of service attack against themselves … or at least gobble up some extra bandwidth.

Posted by: Mean Dean on October 15, 2003 10:57 PM | Permalink | Reply to this

Re: Software Monoculture

I sympathize with your motivation, but 100 hits (say) on their web site is hardly a massive DoS attack.

I, personally, prefer the CGI script to simply go “404”. The 'bots don’t bother me a second time. And I don’t have to worry about screwing up Search engine crawlers.

I’ll be curious to know how many spambots (as opposed to search engine crawlers) you see. I’ve seen 7 distinct spambots since the weekend.

Posted by: Jacques Distler on October 15, 2003 11:11 PM | Permalink | Reply to this

Re: Software Monoculture

I don’t disagree; 100 hits is hardly a DDoS … just wishful thinking after getting hammered by an automated attack.

Perhaps if more individuals did the same …

Posted by: Mean Dean on October 16, 2003 12:41 PM | Permalink | Reply to this

Re: Software Monoculture

If you don’t have mod_rewrite installed, but do have mod_alias (which is installed by default, I believe), you can use “Redirect gone /cgi-bin/mt-comments.cgi”.

Posted by: Jan! on November 28, 2003 5:00 AM | Permalink | Reply to this

Regarding your update

Content-based filtering is the answer because this isn’t like email spam. Comment “spam” is intended to improve ranking for keywords, so those keywords can be targeted and the comment dealt with as appropriate from there, most likely with some sort of moderation.

Posted by: Matt on October 16, 2003 1:10 AM | Permalink | Reply to this

Re: Regarding your update

No, we’re not going to block all comments with the word “breast” in them.

The aforementioned filter is actually based on the URLs of sites whose PageRank the spammer is trying to improve (rather than on keywords). This is a losing proposition, as there are a nearly infinite number of sites that some chickenboner or other might wish to promote.

But … they can only significantly affect the PageRank if they can get their comments onto a large-enough number of web sites (that the googlebot can spider before the comments are removed).

This requires a robot. If you can make robot-posting infeasible, you can eliminate the rationale for comment spam.

Posted by: Jacques Distler on October 16, 2003 1:21 AM | Permalink | Reply to this

Re: Regarding your update

I’m not saying to block comments with “breast” in them, not by far. I’m saying that comments with “viagra”, “casino”, “phenetermine”, or any other word you deem flaggable (curse words, if you wish to watch for those) go into a queue, to be manually approved, deleted, or banned and reported to something like the new Feedster system.

Posted by: Matt on October 16, 2003 7:23 PM | Permalink | Reply to this

Comment Banning

At least one of the robots active recently leaves comments like “I totally agree!” or “I just discovered your blog and think it’s very interesting.” or …

The spam ‘payload’ is the URL link. There is nothing special about the comment text (except, perhaps, for its bland irrelevance). So you can’t filter on keywords in it.

And, even in those cases where you could conceivably filter on the content, the example of ‘parental web filters’ should give you pause. It is both incredibly difficult to filter out everything you might be hit with (today viagra and porn sites; tomorrow cut-rate mortgages and 529 scams) and even more difficult to avoid filtering out legitimate content in the process.

Even filtering on URLs (which is what the MT plugin is doing) is impossibly difficult. The plugin comes with over 400 (!) RegExps of URLs to ban. And that’s based on just a couple of weeks’ worth of accumulated spam. Imagine what the default install will look like after a couple of years.
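For concreteness, a URL-blacklist filter of the sort described works roughly like this (a hedged sketch; the patterns and domains below are invented examples, not the plugin’s actual list):

```python
# Sketch of URL-pattern blacklisting in the style of MT-Blacklist:
# each entry is a regexp matched against URLs found in the comment.
# The patterns and domains are invented examples for illustration.
import re

BLACKLIST = [re.compile(p, re.I) for p in [
    r"casino-?\w*\.example",
    r"cheap-?pills\.example",
]]
URL_RE = re.compile(r"https?://(\S+)", re.I)

def is_spam(comment_text):
    """Flag a comment if any URL in it matches a blacklist pattern."""
    for url in URL_RE.findall(comment_text):
        if any(p.search(url) for p in BLACKLIST):
            return True
    return False

print(is_spam("Great post! Visit http://casino-777.example/ now"))  # True
print(is_spam("See my writeup at http://blog.example/mathml"))      # False
```

The maintenance burden is plain to see: every new site a spammer wants to promote means another pattern, forever.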

I’m not saying we won’t eventually end up there, but I think the day when content filtering and comment moderation (etc.) become necessary can be put off for quite a while by some simpler countermeasures.

And, even with that, I’m not willing to deploy all my countermeasures at once. “Keep your powder dry!” is my motto.

Posted by: Jacques Distler on October 16, 2003 8:53 PM | Permalink | Reply to this

Re: Comment Banning

Maybe we’ve seen different sorts of spam. This has been a large problem on my site, and so I feel like I’ve gotten a good sampling. The WP filtering applies to URL, comment, and name, so if the keyword is found in any of those it’s flagged. I suppose a spammer could use completely normal terms and language in his post, name, and URL (in fact some email spam has gotten past my bayesian filter like that lately) but then the spammer would get no benefit from Google or any other search engine of having specific keywords pointing to his site, which seems to be the whole point. Because this is different from email spam, keywords are their Achilles heel.

Posted by: Matt on October 17, 2003 12:36 PM | Permalink | Reply to this

Re: Comment Banning

At least some of the guys targeting MT don’t seem to care about keywords. They are going for pure volume of links to their sites, without much worry about what the link text says.

I’m surprised WordPress is getting hammered so badly, given that its current userbase is … ahem! … a bit smaller than MT’s.

But, obviously, once you’ve written a 'bot, it’s not too hard to adapt it to a different (but not too different) comment system.

Posted by: Jacques Distler on October 17, 2003 1:20 PM | Permalink | Reply to this
Read the post Comment Spam: Fuck to YOU!
Weblog: Black Coffee
Excerpt: My problem with spams being added to my comments has recently escalated. They're coming more regularly, and instead of promoting stupid zipcode websites, they're promoting 'lolita' sites. That's too much. Must stop. When I searched Google it was clear ...
Tracked: October 16, 2003 9:48 AM
Read the post comment spam-preventing measures
Weblog: fuddland
Excerpt: having not ever received a single spam comment, i'm in a fairly unique position amongst mt users to try out jacques distler's preventative measures before any robots figure out where my mt-comments.cgi script is. once they've got its location, the...
Tracked: October 16, 2003 10:30 AM

Re: Software Monoculture

It is interesting that such a high proportion of (tech-oriented) blogs use Movable Type. I assume this is because it is the most feature-laden weblog-type CMS available (although I really don’t know, since I haven’t ever installed anything more complex than Blosxom). However, this undoubtedly contributes to the comment spam problem in many ways. Apart from the obvious fact that many blogs have identical default setups, making robot-based attacks easier, the closed*-source nature of Movable Type and the somewhat restrictive license that entails means that this type of problem does not get fixed as quickly as it might in an Open Source system.

In fact, the lack of a free open-source weblogging application to match Movable Type is a bit strange - on the surface it looks like it should be an ideal open-source-type project as:

  • There are many potential users, lots of whom are technically inclined and so likely to submit patches
  • The software is not especially complex, so there is a small learning curve for those wishing to contribute.
  • Lots of the people using current commercial offerings are strong advocates of the Open Source concept

My best guess at the problem is that there are many people who have a custom-built weblogging system with a very narrow range of features (I have lots of ideas and a little code for a metadata-strong system that uses a lot of RDF, for example), but few people who are prepared to customise an existing project to suit their needs rather than start building from the ground up. It would be nice if this situation were to change.

*That is, closed as in ‘not open source’, as opposed to closed as in ‘not available’.

Posted by: jgraham on October 16, 2003 1:03 PM | Permalink | Reply to this

Re: Software Monoculture

I don’t know about other people, but what makes MT ideal for me is the plugin architecture. You may not be able to distribute a modified version of the MT source code (in which you implement feature “X”), but you can write a plugin for MT to implement “X” and distribute that.

I, personally, would have had a much harder time implementing the features I wanted on one of the open-source alternatives to MT, even though, in the end, I would have been freer to distribute my changes.

(I would love to make it easy for others to set up a weblog with the functionality of this one. As it stands, you have to download a fistful of plugins, apply a bunch of patches to the MT sourcecode, and heavily edit the supplied templates. It would be a lot easier if you could just download a gzipped tar file and be good to go.)

Posted by: Jacques Distler on October 16, 2003 4:48 PM | Permalink | Reply to this

Re: Software Monoculture

Blosxom is nice in its own way. If you’re looking for something with more features and licensed under the GPL, check out WordPress. I’m very open to code submissions as well, particularly with regard to improving standards or accessibility support. For instance, most of the modifications Jacques has made around here would have been incorporated into the base if he had submitted them, where practical.

Posted by: Matt on October 16, 2003 7:01 PM | Permalink | Reply to this

Re: Software Monoculture

WordPress looks to have a very impressive feature list. Sometime, when I get a chance, I intend to download a copy and play around with it. But PHP (and this may be just my lack of experience with it) does not seem very conducive to writing modular code.

Take, for instance, my visitor-selectable comment text filters. With MT’s object-oriented Perl API, this required a short (<50 lines, even with my verbose Perl coding style) plugin and patching a mere 8 lines of MT source code. And the whole thing is completely backwards-compatible with comments entered using the “stock” MT comment system (either previously-entered comments, or current comments if you leave the comment-entry templates unchanged).

It would be easy for Ben Trott to add this to the next version of MT, without breaking a single person’s existing comment form. To access the new functionality, users merely have to start using the new MT tags, provided by the plugin, in their comment forms.

I don’t think that level of modularity is easily achieved in PHP. But it’s kinda important when you’re doing way-out-of-the-mainstream stuff, like you find on this blog (MathML anyone?).

Posted by: Jacques Distler on October 16, 2003 9:40 PM | Permalink | Reply to this

Re: Software Monoculture

WordPress currently has a filter mechanism through which a lot of work can be done, but it lacks a robust plugin infrastructure. It’s a challenging task to create one as well, and I’m inclined to defer this till we switch to using Smarty templates, which include a plugin architecture which is IMO even more robust than MT’s. I used to be an MT user myself, but the licensing always made me uncomfortable. With regards to modularity, that’s probably a result of familiarity and not the language itself. But familiarity is a perfectly valid reason for gravitating toward one system or another.

Posted by: Matt on October 17, 2003 12:47 PM | Permalink | Reply to this

Re: Software Monoculture

Don’t get me wrong, I really like Blosxom. In fact I think the 1 file / entry idea is wonderful - it alleviates the requirement to have a particular database installed on the server and allows easy access to all the entries in a way that facilitates using any of the innumerable existing text manipulation tools on them. However, using the filesystem metadata as information about each entry seems limiting (there are plugins to deal with this for certain cases). So my ideal weblogging system would use a Blosxom-like storage system, but with the facility to hold entry metadata within the entry file itself (probably by putting all the metadata in a tool-specific xml namespace).

To get this vaguely back on topic, I suspect that Jacques is right - part of the attraction of Movable Type is that users can modify it to suit their needs, albeit not directly. Add ‘plugin-friendly architecture’ to the list of things that mpt forgot.

Posted by: jgraham on October 17, 2003 3:20 AM | Permalink | Reply to this
Read the post grab-bag
Weblog: Snapping Links II (The Revenge)
Excerpt: fisheye menus (neat concept), business blogging, distance learning accessibility, monocultures and spam, and the return of a list apart.
Tracked: October 22, 2003 6:25 PM
Read the post gathered all in one place
Weblog: Snapping Links II (The Revenge)
Excerpt: all the anti-spam resources I've gathered so far.
Tracked: October 27, 2003 7:12 PM
Read the post How much spam did you get daily?
Weblog: kurcula.com
Excerpt: Just today, we've received 6 spam comments. That's it, I'm disabling posting comments. Just kiddin'. I took some steps to...
Tracked: November 12, 2003 6:19 PM
Read the post Spam in Blog Comments
Weblog: Stratified
Excerpt: My old weblog gets spam in the comments every couple of days, mostly having to do with enlarging a certain part of the male anatomy. With the increased adoption of MovableType, most weblogs operate on a very similar architecture. In
Tracked: November 14, 2003 9:13 AM
Read the post Reducing comment spam
Weblog: Raw
Excerpt: Experience a D'oh! moment as you read Jacques Distler's little trick for reducing comment spam in MT. 5 minute job....
Tracked: January 24, 2004 10:58 AM
Read the post Reducing Comment Spam
Weblog: Ranting and Roaring
Excerpt: Here's a word for you: Monoculture. Anyway, I'm just posting this to remind myself to do this some day (or get Kathy to do it for me). Via Danny....
Tracked: January 27, 2004 12:11 PM
Read the post Stepping Stones to a Safer Blog
Weblog: Burningbird
Excerpt: In the last few weeks, I've been hit not only by comment spammers, but a new player who doesn't seem to like our party: the crapflooders, people who use automated applications (you may have heard of MTFlood or some variation) to literally flood comment...
Tracked: January 28, 2004 7:14 PM
Read the post Comment Sp*m
Weblog: Blogged
Excerpt: Weblog publishers who utilise the Movable Type system are particularly susceptible to comment sp*m. Until Six Apart release an updated version of Movable Type containing fixes for the current vulnerabilities, the only way to counteract comment...
Tracked: April 2, 2004 12:13 PM

Re: Software Monoculture

Jay Allen has software to avoid comment spam. Have a look at it.

Posted by: Cialis Tadalafil on June 11, 2004 6:31 AM | Permalink | Reply to this
Read the post Good day yesterday
Weblog: I sound like a camel
Excerpt: Yesterday was good. Thanks for lunch Ren. First day in a while that I had some time of schedule. Btb,...
Tracked: October 5, 2004 7:03 PM
Read the post Die spammers die!
Weblog: Laurabelle's Blog
Excerpt: Using tricks from Parker and Dorothea, I've grown my own referer-spam-fighting fu. In addition, I've translated my old bot-fighting rules...
Tracked: January 18, 2005 2:12 AM

Re: Software Monoculture

This is now considered a bit of an old technique. If anyone comes across this page looking for information on robot spam, it’s now best practice to mark all comment links with rel="nofollow" in the a tag. For more information, Google “nofollow” and read the Wiki page. Cheers.

Posted by: Daryl Quenet on June 7, 2007 2:49 PM | Permalink | Reply to this

Re: Software Monoculture

Nofollow doesn’t actually detract spammers at all, so it’s irrelevant to this posting. It also paints legitimate commenters and their links with the same brush as spammers, which is just rude. If someone bothers to leave a comment, granting them a few drops of pagerank is only courteous.

The only case I can think of where nofollow is appropriate is if you can’t be bothered to spend the time to keep your weblog clean; better than letting the spam fester, I guess. That’s not any less rude to your commenters of course.

Posted by: Aristotle Pagaltzis on June 8, 2007 4:51 AM | Permalink | Reply to this

rel="nofollow"

Nofollow doesn’t actually detract spammers at all, so it’s irrelevant to this posting.

It doesn’t deter automated comment spammers. This was the first in a series of posts about dealing with those.

Then there are the manual comment spammers (the only kind I have left) who typically try to leave relevant-sounding comments, and might be dissuaded by rel="nofollow".

The only case I can think of where nofollow is appropriate is if you can’t be bothered to spend the time to keep your weblog clean…

I think you underestimate what “keep[ing] your weblog clean” might entail. We received 91,000 trackback spam attempts and 30,000 comment spam attempts in May 2007.

I do use rel="nofollow" for Trackback and commenter URL links (but not links within the body of your comment). My rationale is explained here. As a commenter, you can still get your commenter URL without the rel="nofollow" if you make small extra effort to PGP-sign your comment. This is worth the effort, in any case.
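That policy (rel="nofollow" on the commenter-URL link, but not on links within the comment body) can be sketched as follows. This is a hypothetical illustration, not MT’s actual template code; the function name and sample markup are invented:

```python
# Hypothetical illustration (not MT's actual template code): the link on
# the commenter's name gets rel="nofollow"; links inside the comment
# body are left untouched.
import re

def nofollow_author_link(author_html):
    """Add rel="nofollow" to the single <a> wrapping the commenter's name."""
    return re.sub(r"<a ", '<a rel="nofollow" ', author_html, count=1)

author = '<a href="http://example.org/">Jane Doe</a>'
body = '<p>See <a href="http://example.com/proof">my writeup</a>.</p>'

print(nofollow_author_link(author))  # rel="nofollow" added to the name link
print(body)                          # comment-body links left as-is
```

A PGP-signed comment would simply skip the rewrite on its author link, per the policy above.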

Posted by: Jacques Distler on June 8, 2007 8:39 AM | Permalink | PGP Sig | Reply to this

Re: Software Monoculture

Great information on what you did to deal with bots posting comments on your blog. Marking your blog’s links nofollow renders it less “valuable” for people who are trying to use your blog as a countable link to get their website ranked… hopefully eliminating those folks from posting. I like your approach of looking ahead at the bigger picture to remedy the issue of bot posting.

Posted by: Sim on August 23, 2010 6:23 PM | Permalink | Reply to this
