Wikipedia:Requests for adminship/RedirectCleanupBot: Difference between revisions

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.
Diff of Wikipedia:Requests for adminship/RedirectCleanupBot — previous revision: 21:51, 5 October 2007 by John Reaves (edit summary: "RedirectCleanupBot: cmt"); this revision: 21:55, 5 October 2007 by AS 001 (edit summary: "Discussion: oppose, against policy").
#::::Checkuser confirms this account was made by {{user|Dereks1x}} while banned. As such, this vote is discounted as banned editors are not allowed to edit while banned. --] <small>]</small> 16:51, 5 October 2007 (UTC)
#'''Oppose''' I'm just not crazy about deletions by bot. An admin should at least spend the time it takes to read the redirect to decide if maybe there's some other action to take...maybe the redirect should be retargeted, or maybe there should be an article there...or maybe in some cases the target does need an article after all (depending on why the target was deleted, if that's why the redirect is broken)...whatever, a final pair of eyes is needed. I'm also worried about a general increase in acceptance of bot deletions for the same reasons; I don't want to see this become a precedent for other requests of this kind. I don't want this to go through just because of a general enthusiasm for technology. Unless there's a compelling reason to keep the broken redirect backlog clear I don't see why we need this. And just a quick comment, I certainly trust the developer(s) and operators....it's not an issue of the bot going sideways or anything. ] 16:37, 5 October 2007 (UTC)
#'''Oppose''' WJBcribe is already an administrator, and granting a person access to 2 accounts with administrator access is prohibited according to Wikipedia rules. The operator of the bot could run the bot without sysop tools and simply monitor its findings. He does not need 2 sysop privileges, which is against Wikipedia policy. ] 21:55, 5 October 2007 (UTC)

#::Above user blocked by {{user|^demon}} as a ] used for disruption. - ] ] 20:25, 5 October 2007 (UTC)
'''Neutral'''



Revision as of 21:55, 5 October 2007

RedirectCleanupBot

This is a request for a fully automated adminbot.

Voice your opinion (talk page) (95/6/2); Scheduled to end 20:55, 11 October 2007 (UTC)

RedirectCleanupBot (talk · contribs) - This is a very different RfA from the type I am used to writing nominations for. In fact, in many ways the title is wrong: I am not proposing that a new administrator be created, but that a Bot account be given a +sysop flag. It is incapable of the judgment we require an administrator to show. I will, however, outline why giving it the ability to use a sysop tool will be beneficial to the encyclopedia.


The task

Special:BrokenRedirects lists pages that are redirects to deleted or non-existent pages. They meet speedy deletion criterion CSD R1. When reviewing that list, the only human action necessary is to ensure that each page does not contain useful history; otherwise it is deleted. The page is updated every 2-3 days and I delete well over 100 redirects each time. It occurs to me, however, that this trivial task could be done just as well automatically.

Of course, a Bot cannot discern useful page history, so this Bot will only delete pages that have no history.
It will work off Special:BrokenRedirects. If an entry there:

  1. Is a redirect to a deleted or non-existent page, and
  2. Has only one entry in its history,

it will delete that redirect.
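The two criteria above amount to a simple predicate. A minimal sketch in Python (illustrative only — the actual bot is a Perl script, and the function and field names here are hypothetical):

```python
def should_delete(redirect):
    """Return True only if this broken redirect is safe to delete.

    `redirect` is assumed to be a dict with:
      - 'target_exists': whether the redirect's target page exists
      - 'history_length': number of revisions in the page history
    """
    # Criterion 1: the target must be deleted or never created.
    if redirect['target_exists']:
        return False
    # Criterion 2: exactly one revision, so no useful history can be lost.
    if redirect['history_length'] != 1:
        return False
    return True
```

Any second edit — a retargeting, a tag, a protection log entry — pushes the history length past 1 and makes the bot skip the page.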


Technical details


Conclusions

This task is largely uncontroversial other than the fact that the Bot requires the ability to delete pages. If the Bot deletes a page other than a broken redirect with one revision, it will be blocked on sight. Blocked accounts cannot delete pages. This Bot will not be taking on any future tasks - this will be its sole function and its scope will not be expanded. Bots are designed to perform repetitive tasks that can be performed just as well automatically as manually. I believe this is such a task. WjBscribe 20:52, 4 October 2007 (UTC)



Questions about the Bot

Ask any questions below:

Questions from AA

1. What will be in the edit summary?
A: It is in the source code (line 41); currently it is BOT: Deleting dead redirect per CSD R1. Changing this is not hard to do. Suggestions are welcome :). —— Eagle101 21:30, 4 October 2007 (UTC)
What about including the target in there (e.g. BOT: Deleting dead redirect to @@@@ per CSD R1)?
That has been suggested, and I will include it shortly. I must get some food now, but I'll have code updates within 8 hours or so. As it is open source, if someone wants to give me a unix patch file they are more than welcome to e-mail that to me ;) —— Eagle101 21:44, 4 October 2007 (UTC)
patch sent. --uǝʌǝsʎʇɹnoɟʇs 23:12, 4 October 2007 (UTC)
Oy! the wonders of open source! I'll have a look and put it in. —— Eagle101 00:21, 5 October 2007 (UTC)
Could you make it include the target of the redirect it's deleting in the summary? That way if a redirect ever needed to be recreated it could easily be done by looking at the bot logs or page logs. Just a suggestion, possibly non-useful. I've already added my support. --JayHenry 03:06, 5 October 2007 (UTC)
Oh wait, nevermind, that's what was meant by the @@@@ above. I am teh illiterate. You guys are one step ahead! --JayHenry 03:08, 5 October 2007 (UTC)
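With the @@@@ placeholder above filled in, the deletion summary might be built like this (an illustrative Python sketch — the real bot is Perl, and the function name is hypothetical):

```python
def deletion_summary(target):
    # Record the redirect's old target in the log entry so the
    # redirect can be recreated from the deletion log if needed.
    return f"BOT: Deleting dead redirect to [[{target}]] per CSD R1"
```

For example, `deletion_summary("Duke of Buckingham")` yields `BOT: Deleting dead redirect to [[Duke of Buckingham]] per CSD R1`.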
2. Are there instances when a broken redirect should not be deleted from user/user talk space?
A: I can't immediately think of any. If there are some, we can restrict the Bot further. In my experience such redirects arise either (1) where an article was drafted in userspace, moved to the mainspace and subsequently deleted, or (2) where someone is renamed and deletes unwanted subpages, forgetting the redirects left over from the rename. It seems appropriate to delete these redirects. As with other namespaces, the Bot won't delete if the page has more than 1 edit in the history. WjBscribe 21:30, 4 October 2007 (UTC)
True, but if I have myself renamed and later decide to leave Misplaced Pages and delete my user page (not because of m:RTV reasons, but just because I left), it can actually be useful to have a redirect from User:Melsaran to User:Mynewname so that people who click a link from Melsaran's signature know Melsaran's current name and how to access his talk page/contribs. I think it'd be a good idea to disable deleting redirects from userpages (not subpages) to other userpages. Melsaran (talk) 12:45, 5 October 2007 (UTC)
I'm not sure I've entirely followed you here. In this scenario you were renamed before you left the project? It seems to me that redirects are trivial to recreate if the user wants them (or if a third party thinks one would be useful and the user has no objection). I'm not sure we should keep redlinked redirects from old usernames of people who have left indefinitely in case they return. Certainly we are not doing so at the moment - admins are deleting such redirects... WjBscribe 12:51, 5 October 2007 (UTC)

Questions from F Mita

3. Will the bot also delete the redirect page's talk page?
A: Yes, as the talk page will also show up in special:BrokenRedirects. —— Eagle101 22:46, 4 October 2007 (UTC)
A: But only if the talkpage is also a redlink redirect with no page history (which they usually are if there's a talkpage at all). WjBscribe 22:51, 4 October 2007 (UTC)
OK, so if it is not a redlink redirect, or has a page history it will be kept. My understanding is that talk pages are usually deleted if the main page is deleted, but I am not sure if this would be considered a problem. F Mita 23:07, 4 October 2007 (UTC)
4. Should the presence of a talk page perhaps be a non-delete criteria for the bot?
A: Should it be? I can't think of any reason why it should be. If you can think of a reason why, please let me know, and we shall code it in. —— Eagle101 22:46, 4 October 2007 (UTC)
The only situation that occurred to me was that there was some discussion on the talk page that might impact whether the redirect should be deleted. In other words there could be a discussion that might make the deletion controversial (for want of a better word) and which should therefore be reviewed by a human. If there are not many instances of talk pages that aren't redirects, then maybe it would be better to err on the side of caution when there is one. Just a thought. F Mita 23:07, 4 October 2007 (UTC)
From experience, those talkpages are usually a Wikiproject tag or {{Talkheader}}. I still can't think of a talkpage discussion that would make the redirect controversial and need to be kept where the page has only one edit in its history and points to a deleted page. I don't have a problem with narrowing the Bot's scope to avoid potential problems, but I haven't yet imagined even one scenario where the presence of a talkpage would be a relevant (let alone determining) factor... WjBscribe 00:36, 5 October 2007 (UTC)
5. Are there any tags that could be placed on the redirect page that should be a non-delete criteria?
A: No, and frankly it's not needed, as a second edit to a redirect page will cause the bot to skip it. (The bot only deletes pages with one edit, so any edit to add a tag, or to point it at a new target, would cause the bot to skip.) —— Eagle101 22:46, 4 October 2007 (UTC)
A: Not that I can think of - usually it's categories (like {{R from alternative spelling}}), but those shouldn't affect the need to delete. If a template is added after the redirect is created, it will have more than 1 edit in the history and not be deleted anyway. WjBscribe 22:51, 4 October 2007 (UTC)
6. Does the bot check that the redirect target page has not been created since the redirect page was listed on Special:BrokenRedirects? (I am assuming here that Special:BrokenRedirects is not recreated on the fly each time it is viewed)
A: I will check the regexes that I use in a few hours. I am not ignoring this question, and yes, the bot should ignore any redirects that get re-created. —— Eagle101 22:46, 4 October 2007 (UTC)
OK. Just to be clear, I didn't mean where the actual redirect page had been recreated (I guess that would be covered already because there would be more than one edit), rather where the target page (i.e. the page that the redirect points to) had been recreated since the redirect page had been added to the list. F Mita 23:07, 4 October 2007 (UTC)
To add - I expressed this to Eagle earlier, but was not too concerned about it, but since we are loading the actual redirect page now, it should be easy to load the target as well - it's up to eagle as to efficiency, however. --uǝʌǝsʎʇɹnoɟʇs 23:16, 4 October 2007 (UTC)
I do think that this should be a definite requirement for the bot, as it should not be deleting pages that actually redirect to a page. This situation would probably happen very infrequently, but should be catered for (no doubt a human admin would spot the fact that the target page did actually exist). F Mita 23:34, 4 October 2007 (UTC)
It will be able to tell if a target exists by seeing if the target is a blue link in the special:BrokenRedirects, I have not yet had a chance to test this, but before it is live, I will be sure that anything that points to an existing page is not deleted. In addition we will include a pointer in the deletion summary of where the redirect pointed. —— Eagle101 00:15, 5 October 2007 (UTC)
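Re-checking the target at deletion time, rather than trusting a possibly stale listing, can be done the way Eagle describes: by looking at how the target link is rendered. MediaWiki marks links to nonexistent pages with class="new" (red links), so a sketch of the check might look like this (illustrative Python; the real bot is Perl):

```python
import re

def target_still_missing(entry_html):
    """Return True if the entry's target is rendered as a red link.

    MediaWiki adds class="new" to links pointing at pages that do not
    exist. An entry whose target has since been recreated renders as
    an ordinary blue link, and the redirect must then be skipped.
    """
    return re.search(r'class="new"', entry_html) is not None
```

A stale Special:BrokenRedirects entry whose target page has been recreated fails this check and is left alone.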
7. What proportion of pages are estimated fall under the "single edit" criteria?
A: You are free to see for yourself in the test results at User:Eagle_101/RedirectCleanupBot. Anything with a -- 1 following the name would have been deleted had the bot actually had the delete function enabled. —— Eagle101 22:46, 4 October 2007 (UTC)
A: Proportion of pages is probably the wrong way to think of this - they mostly result from admins not looking for redirects when they delete pages, and so are being created all the time. They're also deleted all the time. I would estimate that about 150 broken redirects are listed every few days at Special:BrokenRedirects, and that 80-90% of those contain only one edit. WjBscribe 22:51, 4 October 2007 (UTC)
Question from Ronnotel
8. What controls will prevent the bot from being compromised and allowing a non-admin to gain admin powers? Ronnotel 22:40, 4 October 2007 (UTC)
A: A good password, which I hope the bot operator has implemented. Besides that, nothing. A bot is just like any other account, and logs in via the same methods any other normal user can log in with. If you trust the admin that is running the bot (WJBscribe) not to hand out the password to the account, then there should be no problems. —— Eagle101 22:48, 4 October 2007 (UTC)
A:I tend to go a little overboard on passwords. The password for the Bot is 20+ characters - including letters, numbers and characters that are neither. WjBscribe 22:53, 4 October 2007 (UTC)
Access to the account isn't what I'm worried about. How will you know that the code itself hasn't been compromised? Is there some sort of code migration mechanism that requires admin rights to place code into production? I would think this would be a requirement before we can consider granting a bot admin rights. Ronnotel 23:03, 4 October 2007 (UTC)
Well, unless someone manages to break my ssh private/public key combo to toolserver, I think this is highly doubtful. As long as WJScribe always gets the code from my toolserver account there should not be any problems at all. —— Eagle101 23:09, 4 October 2007 (UTC)
Question from CO
9. Why is WJBscribe running it, seeing as he has never run a bot before? A bot is one thing to run, but one with the sysop bit? Is WJB knowledgeable with bots and programming to run it?
He should be fine; all it is, is telling it to run... it's not rocket science ;). It's about as much work as clicking on an icon on your screen. I don't have the time to run and monitor a bot like this, which is why I'm letting someone knowledgeable with the task operate it. I will be on hand to fix any bugs, and anyone else is free to submit unix patches to me. —— Eagle101 23:42, 4 October 2007 (UTC)
The idea was to combine knowledge of the task with someone with good coding experience. I will be overseeing the Bot as someone who does a lot of redirect-related work (both speedies and closing most WP:RFDs), so I can spot if things were going wrong or if the task needed further restriction. It was I who investigated the possibility of this Bot in the first place. If anything beyond my technical abilities occurs, Eagle says he'll hold my hand, but he doesn't want the day-to-day supervision of the Bot (which I'm happy to do). Also there may be newbie questions to field (someone may assume the Bot deleted their article rather than a redirect if it was moved after they created it), which I am happy to do... WjBscribe 23:54, 4 October 2007 (UTC)
Question from dgies
10. I'm uncomfortable with the idea of a bot having the +sysop flag. Is there some reason why this bot can't simply scan for these pages and send you an email with links? I realize it's slightly more work but presuming you can trust the email (cryptographic signing by your bot?) then you should be able to tear through the list pretty quickly. —dgiesc 23:43, 4 October 2007 (UTC)
The point of the bot is to do it automatically. The end result is the same whether it's an e-mail with someone clicking links, or the bot just doing it. Which one is easier, and in some ways safer? (Do realize that a bot will never accidentally delete something. Bots are by their very nature more accurate than us humans.) —— Eagle101 23:47, 4 October 2007 (UTC)
Yes I realize computers don't make mistakes, but programmers do. A human is never vulnerable to escape characters, to name one problem. —dgiesc 23:50, 4 October 2007 (UTC)
Yes, but once the code is right, it never makes a mistake, or if it does make a mistake, it makes the same mistake predictably. That's why we debug code :) —— Eagle101 00:13, 5 October 2007 (UTC)
Additionally, remember that many programmers have reviewed and tested the bot without any complaints, any issue would have to be very minor. --uǝʌǝsʎʇɹnoɟʇs 00:26, 5 October 2007 (UTC)
Questions from SQL
11. I do this a lot myself, by hand, and, bot assisted. Something I come across frequently, is user or user talk pages that are protected, and, redirected to the userpage (which, does not always exist), due to abuse from a blocked user. It is not generally a good idea to delete these. How would this bot handle this situation?
It's not all that hard; in fact it's already done :) - anything that is protected will by its very nature have more than 1 event in its history, preventing the bot from deleting it at all. (The protection shows up in the history log.) —— Eagle101 23:50, 4 October 2007 (UTC)
I hadn't noticed the 1 edit thing, that pretty much covers most of my concerns. SQL 23:53, 4 October 2007 (UTC)
12. It is not always appropriate to nuke broken redirects. Sometimes it's bad pagemoves, sometimes it's vandalism, sometimes it's other things. How would this bot account for these variables?
In practice the only time these shouldn't be deleted is when they have good history. The Bot will only delete redirects with no history, so nothing useful could be lost. Worst case scenario, a redirect is easily recreated - it's not like losing an article. But I don't envisage that being necessary given the Bot's limited scope. WjBscribe 23:56, 4 October 2007 (UTC)
Yep, not a big deal, and the 1-edit-in-history safeguard prevents this sort of issue... I meant to strike this question, as the answer was right there... Thanks! SQL 23:59, 4 October 2007 (UTC)
Hmm, I'm not so sure... I have created redirects; they have no history. Now the page being pointed to gets vandal-moved. Then what? The redirect is lost. Is it possible for the bot to check whether the page itself has moved, and if so, simply change the redirect? — Edokter Talk 21:21, 5 October 2007 (UTC)
Bear in mind that this Bot won't delete redirects to redirects (double redirects) which are what result from vandal page moves. It's only if the target page is actually deleted that the redirect is redlinked and meets the Bot's criteria. For example if you redirect Duke of buckingham to Duke of Buckingham and a vandal moves the latter to Who ate all the massive pies, the page Duke of Buckingham still exists (though now as a redirect) so the redirect from Duke of buckingham would still be a bluelink and wouldn't be deleted by the Bot. WjBscribe 21:42, 5 October 2007 (UTC)
Thanks for clearing that up. — Edokter Talk 21:48, 5 October 2007 (UTC)
Question from Mercury
13. Can the bot be coded to check for talk page messages, and shutdown once a message is posted?
A: The bot will stop running if it is blocked. Stopping on talk page messages often can cause issues, as it's too easy to disable a useful bot - it's up to the operators on this, but as a BAG member, there are not many bots that do this. --uǝʌǝsʎʇɹnoɟʇs 00:26, 5 October 2007 (UTC)
A: No doubt it could be - but I worry about it being interrupted too often. When I delete redirects, newbies sometimes come to my page asking why I deleted their article. Usually it's because the article was moved before it was speedied or AfDed, and the creator expected to find it where they created it and finds my name in the deletion log. I expect the Bot will get the same kind of inquiries, which will need to be answered or forwarded to the admin that deleted the actual content. Those could be directed to my talkpage, but I suspect (judging from other Bots) some will still post to its talkpage. So my instinct is not for it to stop when it gets a talkpage message... WjBscribe 00:28, 5 October 2007 (UTC)
A: If people want it I can implement it... but not many bots run with that on, as far as I know. It's just as easy to post to WP:ANI and get an admin to block it - this an admin should do instantly. We can always restart the bot ;). —— Eagle101 00:37, 5 October 2007 (UTC)
A2: After thinking some... there will be a page at User:RedirectCleanupBot/Shutoff. I would hope that this will not be needed, but it's there anyway. I'm coding the changes in as we speak. Please do note that the page is semi-protected. —— Eagle101 01:12, 5 October 2007 (UTC)
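An on-wiki kill switch of this kind is typically polled before every action. A minimal sketch (illustrative Python, not the bot's Perl source; the page-fetching callable is hypothetical):

```python
def may_run(fetch_shutoff_page):
    """Poll the shutoff page before each deletion.

    `fetch_shutoff_page` is a hypothetical callable returning the
    current wikitext of User:RedirectCleanupBot/Shutoff. Editing the
    page to anything other than 'true' halts the bot on its next check.
    """
    return fetch_shutoff_page().strip().lower() == "true"
```

Because the check runs between deletions, an editor can stop the bot without waiting for an admin to block it.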
Questions from EdJohnston
14. Is it possible that the decision process for the administrators who normally process Special:Brokenredirects is more subtle than what this bot would do? There is a remark in the edit history of the Talk page suggesting that an admin could take any of several repair actions, depending on what is found.
A: Yes; however, non-delete actions are in my experience confined to when the page has history. E.g. a vandal changes an article into a broken redirect - the Bot won't delete it because it has more than 1 edit. The type of broken redirect being described that results from multiple page moves is a double redirect - another Bot fixes those, and this Bot won't delete those as there is a target. Note the editor's conclusion: "If there is no edit history and target page/history is empty, I add {{db|this page is a redirect to an empty page}}" (a tag for speedy deletion), which is what a non-admin would do - an admin would delete. That quoted sentence is exactly the set of circumstances under which the Bot will be deleting. Also, I confess that I don't check how recently the target was deleted (in case it was accidental and will be reverted) - the Bot will do that, so in this way it's going to be more subtle than me... WjBscribe 03:11, 5 October 2007 (UTC)
15. Could a new set of test results be generated, where we can still see the redirects? The ones listed in User:Eagle 101/RedirectCleanupBot have mostly been cleared out already as part of normal maintenance. Without being able to see the now-deleted redirect, it's hard to know if the bot diagnosed the redirect's history correctly.
A: If one of the pages on there has been deleted, that's confirmation the bot was correct in its actions (or what it would have done). Next time Special:BrokenRedirects regenerates, either I or WJBscribe will give an updated list. —— Eagle101 02:54, 5 October 2007 (UTC)
A: Yeah - it's a bit difficult to ask admins not to do cleanup work for the 7 days this RfA will run. The Bot can have a new run next time Special:BrokenRedirects updates, and obviously admins can check the deleted pages, but I can't see a way for more output to be generated now... WjBscribe 02:59, 5 October 2007 (UTC)
16. What is the rule that determines whether redirects show up as red or blue in Special:Brokenredirects? How are the strikethroughs made? (The page has no history tab).
A: The page is a little odd and special pages never have histories. Redirects that are blue with an arrow towards another page that is red are broken redirects. When they are deleted, they usually turn red and are struck through. If instead of being deleted, the target is changed to a bluelink (or the page is replaced with content) the entry is also struck out, but remains blue. Some of those struck blue entries are interwiki redirects, which are also picked up by the special page. WjBscribe 02:59, 5 October 2007 (UTC)
Question from Chaser
17. There's no edit or deletion rate in the BRFA. Is it possible to throttle the deletions to 2-3 per minute to allay the inevitable concern, however well founded, that the bot will go berserk? --chaser - t
A: In the start, when it does its testing for WP:BAG, it will probably make one or two deletes per minute. When BAG approves of it, it will probably increase speed to 4 deletes per minute. It won't be doing anything insane. —— Eagle101 04:36, 5 October 2007 (UTC)
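A rate limit like the 4-deletes-per-minute figure mentioned above usually comes down to a fixed pause between actions. A sketch (illustrative Python; the `sleep` parameter is injectable only so the pacing can be exercised without waiting):

```python
import time

def run_throttled(actions, per_minute=4, sleep=time.sleep):
    """Run callables no faster than `per_minute`, pausing in between.

    At 4 per minute this sleeps 15 seconds between deletions, so a
    misbehaving run can be spotted and blocked long before it does
    significant damage.
    """
    interval = 60.0 / per_minute
    for i, action in enumerate(actions):
        if i:  # no pause before the first action
            sleep(interval)
        action()
```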
Question from NE2
18. Does the bot perform any other cleanup tasks, such as removing links to the redirect?
A: No it doesn't: (1) to keep the code simple, and (2) because realistically working out which redlinks should stay because there will one day be content, and which should go because the page should never be created, requires human judgment. Admins reviewing broken redirects don't tend to tidy up the links either - maybe we should, but it's more work, and often editors of the articles are better placed to make that judgment anyway. WjBscribe 07:04, 5 October 2007 (UTC)
Thank you; I was making sure you understood that sometimes you shouldn't remove backlinks. --NE2 07:07, 5 October 2007 (UTC)
A: If people really want detailed logs about things in the backlinks etc., so that they may remove them as they wish... the bot can generate such reports and post them within its userspace, but it will not do any more deleting than what is specified here. The bot may end up posting to its space a list of things it did not know what to do with, etc. —— Eagle101 07:14, 5 October 2007 (UTC)
Question from Ronnotel
19. Will you consider signing the perl script with your private key and publishing your public key? If you have already done so, can you include that information on this page?
A: Well... I consider this whole exercise pointless, but I'll look into it. I frankly find that publishing the code on toolserver under my account is foolproof enough. Unless someone has my private ssh key (good luck in breaking that), getting into that account is virtually impossible. —— Eagle101 13:43, 5 October 2007 (UTC)

General comments


Please keep criticism constructive and polite. Remain civil at all times.

Having reviewed the code for the bot, I can pretty confidently say that the bot is unlikely to misbehave and will perform as specified. My reservation, however, is that I'm not sure the scope of the bot's function is really one necessary for an adminbot. The task it is performing -- deleting approx. 300 pages per week -- is one that could be accomplished quite simply by a single admin, or small group of admins, in under an hour per week. What concerns me is that granting this bot adminship stands to set an, as of yet unestablished, precedent that adminbots are generally accepted. This is a precedent I would be fine with having set, but would see it more appropriately established through the promotion of Curpsbot or similar currently active, though "illegitimate," adminbots. I also worry that the scope of this bot will be expanded to include other admin tasks following its promotion and with only the consent of the bot approvals group, a group that generally shares my view and readily approves most adminbot functions. My support for the promotion of this bot would be strictly on the condition that the bot's scope not be expanded to include any other tasks without bringing the bot back to RfA for reconfirmation by the community. Should the bot undertake any additional tasks, with or without the approval of the BAG, the bot should be stripped of its admin status and deactivated on sight. If the bot's operator and writer can agree to these terms, and if the BAG agrees not to authorize any further tasks for this bot without a further RfA for the explicit tasks, then the bot has my support. AmiDaniel (talk) 21:34, 4 October 2007 (UTC)

I will say right now: this bot will not do any further tasks. I have always advocated that admin bots should have one task per approved account. That allows the community to make its own choices on a task-by-task basis in a more public forum than WP:BRFA. Any new tasks warrant a new bot account, and a new RFA. —— Eagle101 21:39, 4 October 2007 (UTC)
This Bot will perform no other tasks. May a steward take both mine and the Bot's flag if I lie. As Eagle says, a new task means a new Bot and a new RfA. WjBscribe 21:49, 4 October 2007 (UTC)
I would add that if the bot operator began using the admin account for anything outside the scope under which the administrator flag was granted, I would personally request a steward desysop the account, as well as block the account. So, I don't think there's any danger. That plus the fact that I trust WJBscribe implicitly, meaning that me thinking hypothetically about what I'd do is totally unnecessary :-) --Deskana (talk) 23:14, 4 October 2007 (UTC)
Thank you all for your reassurance -- glad that we're clear on that point :) AmiDaniel (talk) 00:12, 5 October 2007 (UTC)

My concern, somewhat alleviated by the answer to Q.8, is the possibly of inappropriate rights promotion. Grabbing admin rights through poor code management is, quite literally, the oldest trick in the hacker's book. Provided WJBscribe and Eagle 101 understand that losing control of their bot's code will be no different than losing control of their admin account, with sanctions that might entail, then I'm perfectly satisfied to support. Ronnotel 23:25, 4 October 2007 (UTC)

I do want to make it clear that if I wanted to go rogue and run an unauthorized bot on my account, I could do so just as easily under my main admin account as under any special "bot" account. It's technically the same thing; the bot account just separates WJBscribe's deletions and the bot's deletions. That trick you speak of applies to every one of our 1,200+ admins. —— Eagle101 23:44, 4 October 2007 (UTC)
I just wanted to make sure that the chain of admin rights is unbroken. From what you're telling me, all involved in this project are already admins so this shouldn't be a problem. Thanks. Ronnotel 00:01, 5 October 2007 (UTC)
Along the lines of this inquiry, I found one design decision with this bot rather strange. Typically, or at least in my opinion, it is very unsafe practice to store passwords inside the source code, especially when the source code that contains this password is on a (even quite secure) shared server. I would definitely recommend that Eagle revise the code to 1) prompt the operator for a password each time the bot is run, 2) receive the input of this password with echoing disabled, 3) log out at the end of the session, and 4) nullify the memory used to store this password immediately after the log-in has been completed successfully. (1) is obvious -- it ensures that the password is not stored in an actual file anywhere on the server, and so cannot be accessed even by server admins. (2) prevents shoulder-surfing and accidentally copying the password to somewhere it shouldn't be. (3) prevents against session hijacking by ensuring that sessions only last for a brief amount of time. (4) guarantees that the password can't be retrieved from the toolserver's RAM or other shared memory (naturally, this can, theoretically, only be done by server admins, but it's possible nonetheless). I also might recommend that the bot operate off the secure server instead of the http server, which would make it nearly impossible to fish out the bot's password. This may be a bit of overkill, but I think (1) and (4) should be absolutely mandatory for any bot, regardless of whether it has an admin flag or not. AmiDaniel (talk) 00:11, 5 October 2007 (UTC)
Number 1 at least will be implemented; I wrote the bot up in about 2 hours and did not think about that. The bot will most likely *not* be run on the toolserver, but on a computer resource of WJBscribe's. I will of course, by logical extension, nullify the memory. I'll look into doing the others. :) —— Eagle101 00:19, 5 October 2007 (UTC)
Great to hear :) Thanks for the speedy response. AmiDaniel (talk) 00:23, 5 October 2007 (UTC)

I am worried about this bot for a couple of reasons. First, the bot can be compromised and block everyone (admins included) at a time when no stewards are available (i.e. late at night). Second, I am wondering where this TestWiki is so that we all can see what the bot actually does. Miranda 01:19, 5 October 2007 (UTC)

Do you think it any more likely to be compromised than a human admin account? It would certainly be easier to tell if it had been, as unusual behaviour would be easily recognisable. Remember that any admin blocked by the Bot can unblock themselves. If you want to see what it does, look at User:Eagle 101/RedirectCleanupBot. The entries on that list with 1 by them are the ones the Bot would have deleted - all others would be left alone. WjBscribe 01:24, 5 October 2007 (UTC)
The bot has no ability to block: zero, zip, nada, zilch, 0. It's not in its code. I offer up the code for your review at http://tools.wikimedia.de/~eagle/rdbot.pl; mind you, it's slightly out of date, as I'm working on adding a few new safeguards/features, such as checking a page to make sure it is allowed to run. This page will be able to be modified by any registered user. The absolute worst that can happen here is the bot deletes a redirect it's not supposed to, which hopefully someone would report to me so I could fix it ;). Please do scour the test results and let me know if something is marked for deletion that should not be deleted. —— Eagle101 01:29, 5 October 2007 (UTC)
The testwikis in question are wiki.xyrael.net (eagle's official test) and st47.dyndns.org/index.php (my initial tests of the code and also some development work). They may be a little hard to follow, however. --uǝʌǝsʎʇɹnoɟʇs 01:41, 5 October 2007 (UTC)
Thanks for your responses. Miranda 01:55, 5 October 2007 (UTC)
Coding comments

(I couldn't find anywhere else to put this, so I made a new section just here.)

There's a possible slight bug in the code, although it's unlikely to come up. If a broken redirect is deleted, a new page containing no wikilinks is then created at the same title, the page isn't edited between then and when the bot runs, and the broken redirect is still cached in Special:Brokenredirects, then the bot will delete the new page due to the else block of the if($targetname) check. Why is that fallback there? (I'd prefer it if the bot only deleted redirects that contain at least one link; I'm not sure if it's possible to create a page which qualifies as a redirect without it containing a link, but if it is, such pages should be sufficiently infrequent that the bot wouldn't be needed to delete/correct them, so I'd prefer it if the fallback were removed.) I'd also prefer it if $targetname were explicitly set to undef as it was created, and if the if() two lines down explicitly checked defined($targetname) (otherwise there will be weirdness in such cases as a broken redirect to 0, due to the way Perl treats boolean values, and this will also avoid some potential interpreter warnings about undefined variables; you are running this with warnings on, presumably?). --ais523 11:53, 5 October 2007 (UTC)

Oh, and I'd like to see the new source code (after the changes you mention above and the ones I've just requested have been made) before I express an opinion on this RfA. (I have nothing against adminbots, but I don't like supporting RfAs without good evidence, usually in the form of contributions (but in this case in the form of source code), of how the new admin is going to behave.) --ais523 11:55, 5 October 2007 (UTC)

Actually, there's possibly a slightly worse problem along the same lines as the old one; if the first link in the new article is a redlink, then that will also cause the new page to be deleted. The bot absolutely needs to check that the page in question is, in fact, a redirect, before trying to delete it. (A case-insensitive check for #REDIRECT at the start of the article's content should rule out false positives completely (which is important for an adminbot) without producing too many false negatives (adminbots should err on the side of caution, and a false negative is no big deal because it will just lead to the adminbot not deleting a broken redirect when it could have done)). --ais523 12:01, 5 October 2007 (UTC)
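The safety check ais523 proposes can be sketched in a few lines: treat a page as a deletion candidate only if its wikitext actually begins with #REDIRECT, case-insensitively. Python is used here purely for illustration (the bot itself is Perl), and the names are hypothetical:

```python
import re

# Matches pages whose wikitext starts with "#REDIRECT" in any case,
# optionally preceded by whitespace; anything else is left alone.
REDIRECT_RE = re.compile(r'\s*#REDIRECT\b', re.IGNORECASE)

def is_redirect(wikitext):
    # Err on the side of caution: a false negative merely leaves a
    # broken redirect for a human admin, while a false positive
    # would delete a real page.
    return REDIRECT_RE.match(wikitext) is not None
```

So a page starting `#redirect [[Foo]]` passes, while a newly created article whose first link happens to be red is rejected and never deleted.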

Ok, checking over this now. —— Eagle101 12:30, 5 October 2007 (UTC)
Ok, first off I should thank you for actually checking the code :). That bit was introduced by ST47 in his patch above; I merely checked that patch to make sure it was not doing anything untoward, and added it to the code base as it was given to me by a BAG member. I appreciate your having found that rather than my having to debug it at a later point. That said, I have put up a new version of the code at the same location as before; the script now checks User:RedirectCleanupBot/Shutoff for the string {allow}, and if that is not there, it shuts off. Please do review the source code, and if you have any further additions or changes, I would appreciate a unix diff patch, or you can just describe them ;). —— Eagle101 13:11, 5 October 2007 (UTC)
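The shutoff mechanism Eagle 101 describes boils down to a one-line check before each run. A hedged Python sketch of that logic (the real implementation lives in the Perl source linked earlier; the function name here is illustrative):

```python
SHUTOFF_TOKEN = '{allow}'

def may_run(shutoff_page_text):
    # The bot fetches User:RedirectCleanupBot/Shutoff before running
    # and proceeds only if the page still contains the literal token
    # "{allow}"; any registered editor can remove it to halt the bot.
    return SHUTOFF_TOKEN in shutoff_page_text
```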
There's a slight error in the uninitialisation of $targetname; "" is a blank string, and therefore is defined (you want to write my $targetname = undef; rather than my $targetname = "";; sorry for not sending a patch, last time I tried this happened); likewise, the second check for defined (the one containing next REDIRECT) should just be a test for nonemptiness ('' ne $target->{'content'} is one way to do this, I think, but there are many others). One other error (this one won't cause the bot to do anything crazy, though, just causes a few false negatives): $pageTitle will come out HTML-encoded from the regex you use (I can't think of an easy way to avoid this), and as a result an attempt to check "casting directors society of canada" (to take an example from the current special page) will instead end up checking ], which is a different page. (Hmm... the deletion log from the correct page shows up there, but it's definitely a different page... maybe this is a MediaWiki bug?) However, the sanity check that checks whether the page is a redirect and whether it is broken will catch this, meaning that this is a false negative; a fix for this would be nice but complicated, I think (unless Perl has an XML-unescape function somewhere in its libraries, which it probably does, for this purpose; if it does, use it to unescape $pageTitle just inside loop REDIRECT). --ais523 17:35, 5 October 2007 (UTC)
Correct, I know about the quote problem, and frankly, as long as the bot does not take that error and delete something else, I'm ok with it for now. It means it won't touch anything with a " in it. As for $targetname, I was having a minor problem debugging something and forgot to fix that; it would have shown up rather quickly in any serious debugging, but thanks for pointing it out. I'll change the test back to a test for non-emptiness (that is what it was before I tried to get cute >.>). Thanks again. —— Eagle101 17:44, 5 October 2007 (UTC)
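On the HTML-encoding issue raised above: Perl does have such a function (decode_entities in the HTML::Entities module). The equivalent step, sketched with Python's standard library for illustration:

```python
import html

def unescape_title(page_title):
    # Titles scraped from Special:BrokenRedirects arrive HTML-encoded
    # (e.g. &quot; in place of a double quote); decode the entities
    # before using the title in further requests, so that pages with
    # quotes in their names are checked correctly instead of being
    # silently skipped as false negatives.
    return html.unescape(page_title)
```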

Discussion

Support

  1. Per general support for adminbots. Is this a move towards sanity? Moreschi 20:54, 4 October 2007 (UTC)
  2. As the programmer, I support my code. —— Eagle101 20:55, 4 October 2007 (UTC)
  3. Support as bot operator who has reviewed and tested the code - it works as expected, there are no security issues or potentially bad situations that could cause the bot to perform unexpected operations. --uǝʌǝsʎʇɹnoɟʇs 20:58, 4 October 2007 (UTC)
  4. Strong support This will be very useful to have, and of course I trust the owner and coder :) Majorly (talk) 20:59, 4 October 2007 (UTC)
  5. Strong support (note: Bot Approvals Group member) - code is public, it's hard to screw up, no judgment involved whatsoever on the bot's part. It's a no-brainer support -- Tawker 21:01, 4 October 2007 (UTC)
  6. Support - (many edit conflicts) The concept is good and I trust that the code is sound. I was initially concerned on understanding that this was to be an admin bot, and no less a deletion bot, but since it will only delete under such strict circumstances, I cannot imagine it being a problem. Nihiltres 21:02, 4 October 2007 (UTC)
  7. SupportHiDrNick! 21:03, 4 October 2007 (UTC)
  8. Support — if it won't make decisions, why not? --Agüeybaná 21:06, 4 October 2007 (UTC)
  9. Obvious support. A job that a bot can easily perform and a bot with thoroughly reviewed code. Will (aka Wimt) 21:07, 4 October 2007 (UTC)
  10. Support Job is uncontroversial and tedious, perfect application for a bot. I have read, and understood the source code, and it will function as described. ((1 == 2) ? (('Stop') : ('Go')) 21:08, 4 October 2007 (UTC)
  11. Support - code makes sense, and is a good opportunity for us to take the step of allowing admin bots, especially for menial chores such as this (where the human admin is effectively taking on the role of robot when deleting the redirects). Important point is that bot only deletes pages with one edit, so most applications will be to pseudonyms created by the author of a bio article since deleted under CSD G7. Effectively, most applications of this bot will simply extend the human speedy deletion procedure. Martinp23 21:13, 4 October 2007 (UTC)
  12. Definite support! We definitely need a bot like this. I'm pretty sure that anything coded by Eagle101 would handle just fine. :) *Cremepuff222* 21:14, 4 October 2007 (UTC)
  13. Support As someone who's done this tedious task, it will be a relief for a bot to take over! I'm happy that with the restrictions specified, there is little chance of any problems. I trust the author and code reviewers judgements that it's bug free. → AA21:21, 4 October 2007 (UTC)
  14. Strong support, and I also recuse myself from closing this RFA. Andre (talk) 21:21, 4 October 2007 (UTC)
  15. Support - This looks to me like a bot task, and the bot will be operated by an admin. Od Mishehu 21:25, 4 October 2007 (UTC)
  16. Support as BAG member. Simple bot task, a matter of giving a +sysop flag to a +sysop user already. An incredibly simple task, no major requirements. Code is available which is a big plus. Matt/TheFearow (Talk) (Contribs) (Bot) 21:29, 4 October 2007 (UTC)
  17. I am in general support of this bot getting an admin bit, but I have to wonder if there are more pressing tasks that deserve admin bots. The reasoning behind the nom is that the task is fairly tedious and it is a "trivial task could be done just as well automatically." That describes many admin tasks... (CSD backlogs, anyone?) This task does not seem so pressing, but if it can allow a kind-of back door allowance for more admin bots with solid code and trusted operators, why not? --Jeffrey O. Gustafson - Shazaam! - <*> 21:33, 4 October 2007 (UTC)
  18. Hmm, let's see. No answers to questions, no indication that it will communicate well with users, not enough prior experience in the area where it plans to use its admin powers, and an edit count of 0 is a bit low. Yes, I'm joking.
    I've just reviewed the code. I can only see one way it would go wrong, and it's in an extremely improbable situation: an announcement is added to the Misplaced Pages interface, containing a bulleted list, and one bullet point begins with a link to a page that's only been edited once. The bot would then delete that announcement page, thinking it was a dead redirect. I can't see this happening in reality, and it wouldn't be a huge problem even if it did, since it would only affect one page and would be easily fixed. Human admins are much more likely to make much bigger mistakes.
    In short, I trust this bot to do what it's meant to do, and I support. rspeer / ɹəədsɹ 21:34, 4 October 2007 (UTC)
  19. Support, meets all of my criteria in User:Gracenotes/Admin bot. Two not-so-essential requests, though: 1. Could the bot recognize the {{bots}} templates for exceptions? 2. Could the bot please include the target page in the edit summary? The former will allow for (as stated) exceptions, in case one is needed, and the latter can allow for human review, and help in situations where retargeting is more appropriate than deletion. Granted, these changes will make the code less pithy, but in my opinion they're worth it. Gracenotes § 21:35, 4 October 2007 (UTC)
    We can include the target page. —— Eagle101 21:37, 4 October 2007 (UTC)
  20. Support - the operator is an admin and the coder is an admin. The bot has already been approved by WP:BRFA. I've read the code and it seems fine assuming the format of our history pages doesn't radically change. ~a (usertalkcontribs) 21:38, 4 October 2007 (UTC)
    The bot has not yet been approved, approval of the bot is pending the result of the RFA, at least that is how I understand it. —— Eagle101 21:41, 4 October 2007 (UTC)
  21. Do I trust WJBscribe? Absolutely. Do I trust Eagle 101? Absolutely. By extension, can I support this bot? Absolutely. Wizardman' 21:53, 4 October 2007 (UTC)
  22. Support This is a beneficial and no-risk task, perfectly suited for a bot. —Ruud 22:00, 4 October 2007 (UTC)
  23. Support --while it may set precedent, its advantages overweigh its disadvantages. Danny 22:07, 4 October 2007 (UTC)
  24. Support. Would this be the first admin bot to pass an RfA? Anyways, I trust the owners and the users who have reviewed the code. RyanGerbil10(C-Town) 22:10, 4 October 2007 (UTC)
  25. Support based on assurance by WjBscribe and Eagle101 that the task scope will never be expanded. Upon further rumination, rather than oppose or remain neutral on this particular RfA, it makes much more sense for me to say that I will strongly oppose any future bot RfA where the operator does not explicitly make this same assurance. I’ll be striking out my neutral below in a second. --barneca (talk) 22:15, 4 October 2007 (UTC)
  26. Very Weak Support - I've always hated the idea of having a bot become an admin, but reassurance counts. I'll give it a go for now. --Hirohisat 21:44, 4 October 2007 (UTC)
  27. I strongly support this nomination. It doesn't bother me that this nominee isn't a person. All bots I know about are extremely useful and needed. Acalamari 22:28, 4 October 2007 (UTC)
  28. Support this thoughtful and reasonable request.--chaser - t 22:42, 4 October 2007 (UTC)
  29. Support Shouldn't even need to go through this process. east.718 at 22:53, October 4, 2007
  30. Support. The code looks correct, and this looks like a useful task for a bot. --Carnildo 23:00, 4 October 2007 (UTC)
  31. Support even though the bot has unacceptably low template and portal talk counts.  :-) - Philippe | Talk 23:18, 4 October 2007 (UTC)
  32. OH NOES! ADMIN BOTS WILL BLOCK US ALL!!!!! Err ... I mean ... strong support. Seriously, I've always thought opposition to adminbots was silly. I trust a bot more than I do a human. Humans can go rogue. Humans can push a POV. Bots just do what you tell them to. --B 23:30, 4 October 2007 (UTC)
  33. Support there is nothing wrong, very useful bot. Carlosguitar 23:34, 4 October 2007 (UTC)
  34. Support - trust the bot operator and the coder. Have overcome fear that bots will block us all. If the bot needs admin powers to accomplish this one tedious task, it seems reasonable to allow that. Operator and coder have allowed that it be immediately blocked if they/it make(s) a mistake, so I see no reason to disallow this experiment in cleanup. - Kathryn NicDhàna 23:36, 4 October 2007 (UTC)
  35. Support I just tried going through broken redirects myself, and even with Twinkle deletion to speed it up, it's still a boring, menial task. The faster we can clear such uncontroversial backlogs the better. Mr.Z-man 23:41, 4 October 2007 (UTC)
  36. Strongest possible support. Administrators shouldn't be stuck doing work an automated process can manage with a 0% error rate. We choose administrators here because they write content, or are good at dealing with people, or know their way around the more complicated sections of this website. To drag people away from doing much more meaningful work here is ludicrous. I've had a hell of a lot of experience with Eagle's coding with the old IRC Spotbot stuff and I've never, at any time, had any suspicions he doesn't know exactly what he's doing. I've also rustled up some half baked utilities for Misplaced Pages myself, so I believe I'm qualified to repeat "a program can only do what it's designed to do" idiom. Nick 23:45, 4 October 2007 (UTC)
  37. Support given the strong promises made to never expand the bot's approved task. --ElKevbo 23:53, 4 October 2007 (UTC)
  38. Support No reason not to, plus, the bot saves me the annoyance! :) SQL 23:54, 4 October 2007 (UTC)
  39. I trust WJBscribe and Eagle 101 for the code so the bot should be fine. A bit irregular to trust a bot, but the source code has been reviewed by AmiDaniel so everything seems good. --DarkFalls 23:59, 4 October 2007 (UTC)
  40. Support. I have no problem supporting this endeavor provided my concern (noted above) about maintaining an unbroken chain of admin rights in the management of the code base is met. Ronnotel 00:03, 5 October 2007 (UTC)
  41. Support I can definitely see why this bot should have adminship. Captain panda 00:09, 5 October 2007 (UTC)
  42. Support. I have no problem with a tightly-restricted adminbot. I'm a little concerned about unanticipated corner cases, so I hope there will be a log created that everyone can check, before too many redirects are processed. Regarding Question 4, from F Mita, I argue that the Talk pages of dead redirects should be kept for human inspection. (There shouldn't be very many of them). I also support the two features requested by Gracenotes, above. EdJohnston 00:10, 5 October 2007 (UTC)
    To clarify re: Q.4 - they will be. They will only be deleted if the talkpage is also itself a dead redirect - otherwise the Bot will leave them alone. WjBscribe 00:14, 5 October 2007 (UTC)
    We already have a log. If you would like to check the bot's activities, please see User:Eagle 101/RedirectCleanupBot. Anything with a -- 1 next to it would have been deleted by the bot. —— Eagle101 00:30, 5 October 2007 (UTC)
  43. Support with the hope that over time we may see more tasks given to bots with the sysop bit turned on. ++Lar: t/c 00:12, 5 October 2007 (UTC)
  44. support As long as bots with admin tools are coded properly, they should not do anything like delete the main page as they are not self aware... yet --Hdt83 00:33, 5 October 2007 (UTC)
  45. support as per B.Pharaoh of the Wizards 00:38, 5 October 2007 (UTC)
  46. Don't tell Aphaia. – Steel 00:43, 5 October 2007 (UTC)
  47. Strong support. Daniel 00:56, 5 October 2007 (UTC)
  48. Support Gizza 00:57, 5 October 2007 (UTC)
  49. Support Okay. Jmlk17 01:03, 5 October 2007 (UTC)
  50. Support Bot will do a menial yet necessary task. Most concerns are addressed, and I'm highly confident in Will's and other reviewers' judgment about the bot. However, a bot is only as good as its curator, so I suggest constant supervision by both Eagle and other admins. - Mtmelendez 01:40, 5 October 2007 (UTC)
  51. Support. Seems an appropriate task, with little downside should any unanticipated cases arise. Gimmetrow 01:53, 5 October 2007 (UTC)
  52. Support, as there's no way that (as it stands) this bot will prove any harm with having a delete button.—Ryūlóng (竜龍) 01:58, 5 October 2007 (UTC)
  53. Support Pile-on support from this admin ELIMINATORJR 02:04, 5 October 2007 (UTC)
  54. Support - My questions are well answered. My suspicions are proven false, since this bot does not have an admin flag. Miranda 02:05, 5 October 2007 (UTC)
  55. It's the distant future, the year 2000. The humans are dead. -- John Reaves 02:08, 5 October 2007 (UTC)
  56. Support When a task needs to be done, and that task requires a certain access level, it should be granted. This bot, like all bots, can and will be shut down if it ever begins rampaging; nothing to fear. - auburnpilot talk 02:14, 5 October 2007 (UTC)
  57. Support - reasonably limited scope (to prevent false positives being fucked up by the bot), operator is not a dimwit, apparently there will be a publically usable shutoff function, and hey, if it goes berserk there's always undelete+block, no? MessedRocker (talk) 02:19, 5 October 2007 (UTC)
  58. Ok. My concerns have been addressed, and I trust the operator. I'll go along with this. Mercury 02:22, 5 October 2007 (UTC)
  59. Support We definitely need a bot like this. No major concerns here. --Siva1979 02:48, 5 October 2007 (UTC)
  60. Support Per the commitment that the bot never adds any other tasks. --JayHenry 03:06, 5 October 2007 (UTC)
  61. Support - I doubt the bot will go rogue, and if it does, it can be fixed, right? ;) Also, I think the user is trusted enough to maintain this bot. Regards, Ανέκδοτο 03:08, 5 October 2007 (UTC)
  62. Support. Is this the first Adminbot? J-ſtanContribs 03:22, 5 October 2007 (UTC)
    Yes, well... first official adminbot, there have been and probably are some bots running that are not publicly known. This will, if approved, be the first adminbot to use the functions with the full knowledge and will of the community... at least the portion that watches RFA. —— Eagle101 03:49, 5 October 2007 (UTC)
  63. Support. Looks good to me. I'm glad to see it'll have an emergency shutoff. Folic 03:24, 5 October 2007 (UTC)
  64. Code looks fine to me. MER-C 03:57, 5 October 2007 (UTC)
  65. Support While this is the first I know of such things, I too have visions of rouge FrankenBots. However, all seems well. —Ignatzmicecontribs 04:02, 5 October 2007 (UTC)
  66. Support Nothing controversial, so I'm fine with it. Nishkid64 (talk) 04:12, 5 October 2007 (UTC)
  67. Support - adequate assurances from the nomination and the management of two very well trusted editors earns my full support. Sephiroth BCR 04:56, 5 October 2007 (UTC)
  68. Support The point that some opposers seem to miss is that these redirects are currently being deleted by hand by admins which, of course, have a brain but choose not to use it too much when doing these repetitive, beyond-boring deletions. Are they being lazy? No, they're being efficient. If anything, this bot may help in decreasing the mistakes made when deleting broken redirects. It would be trivial for the bot to be modified so that it tags broken redirects that have a history so that they can be considered with more care by admins. Pascal.Tesson 05:24, 5 October 2007 (UTC)
    True, though in practice just looking through the remaining entries in Special:BrokenRedirects when the Bot has finished its run will be just as effective as having the Bot tag stuff. And it seemed like a good idea to keep this Bot's code as simple as possible... WjBscribe 05:28, 5 October 2007 (UTC)
  69. Support with complete trust and a slap in the face with a big wet trout to opposers :) --Ben 05:33, 5 October 2007 (UTC)
  70. + This is a surprise, I really didn't think I could ever support a bot with a flag. However, this is an absolutely non-controversial, menial task and I do on a personal level trust the coding knowledge of WJBScribe, AmiDaniel, and Eagle 101. So this is a rare trust support from myself. Keegan 05:49, 5 October 2007 (UTC)
  71. support. I don't program, so I haven't reviewed the code. But I trust WJBscribe and Eagle_101, as well as the others who have reviewed the code. This type of task is exactly what an AdminBot would be useful for. -- Flyguy649 contribs 06:12, 5 October 2007 (UTC)
  72. Support Will and Eagle have my full trust and support and I have no qualms about their proposal. Sarah 06:23, 5 October 2007 (UTC)
  73. Support. This is a perfectly sensible task for a bot to do, the source code is completely open, and the developers and nominator are all trusted members of the community. --krimpet 06:39, 5 October 2007 (UTC)
  74. Absolutely. ~ Riana 06:44, 5 October 2007 (UTC)
  75. Support; the operator knows what he's doing. --NE2 07:09, 5 October 2007 (UTC)
  76. Support - no issues at all here. And sources too? Wow! - Alison 07:36, 5 October 2007 (UTC)
  77. Support, sounds like a good task for a bot. Kusma (talk) 08:09, 5 October 2007 (UTC)
  78. Support Good task for a bot and two trustworthy admins in charge. пﮟოьεԻ 57 08:30, 5 October 2007 (UTC)
  79. Weak support. I know absolutely nothing about programming or how bots work. However, as I understand it, the bot can be blocked if it does anything wrong, and its admin rights will then be revoked by a steward. I think we may as well give this a chance for now. As pointed out above, humans are more likely to go insane than bots are, and are much harder to stop. Besides, this may be a good precedent for the future; one day I would like to see a bureaucrat-bot closing RfAs, counting the votes and promoting accordingly, thus eliminating the need for a small élite group who have far too much power for their own good. Walton 10:09, 5 October 2007 (UTC)
  80. Support one must think in technical terms. This bot is not becoming an admin. Not really. An administrator is one who manages and who helps out. This is a request for the +sysop flag, rather than a request to take on that responsibility. -- Anonymous Dissident 10:23, 5 October 2007 (UTC)
  81. Support per the code being publically available (from henceforth, this will be my criteria for any and all adminbot RFAs). Neil  10:51, 5 October 2007 (UTC)
  82. Support. This bot can accomplish a great deal and I see very little that could go wrong. If the bot malfunctions it can be quickly blocked. As others have pointed out, this isn't going to be an AI admin, just a bot with an extra flag to help it get its work done. Chaz 11:59, 5 October 2007 (UTC)
  83. Strongest Possible Support. I've said before that there are many CSD that could be automated, and this is one of them. CSD R1 is one of the simplest to determine in terms of delete/keep. If the redirect is red, nuke it. Human administrators delete many, many daily. Why shouldn't such a mundane task be put to a bot, which (when programmed, and I fully trust Eagle's abilities) can do it faster and without taking the time? To those who wish WJBscribe to be more bot-experienced, I must say that I do trust his judgement, and all bot operators were new at some point. Plus, let's assume it went crazy and deleted stuff it shouldn't...wouldn't it be easy enough to block the bot and restore the pages? Finally, as WJBscribe pointed out, admins cannot delete when they are blocked, so we have well over 1000 administrators who can stop it if it goes awry, not having to wait for bot operator intervention. ^demon 12:34, 5 October 2007 (UTC)
  84. Of course. What could go wrong with deleting redirects to nonexistent pages? Unlike with vandalism reversions or image taggings, the potential for errors here is extremely small. Whether a redirect points to a nonexistent page is an objective matter. Automation is good, when it's handled responsibly. It takes away the workload from our "real" admins, and gives them more time to do the more complicated deletions. Melsaran (talk) 12:50, 5 October 2007 (UTC)
  85. Support - As many others have said, this will give our admins more time to do less repetitive tasks, thereby helping the encyclopedia. Neranei (talk) 13:31, 5 October 2007 (UTC)
  86. Support we already tolerate several adminbots making deletions (some of them, tens of thousands) and of content a lot more controversial than redirects. So... here's an uncontroversial cleanup task, it would be silly to not have a bot doing it. Opposers should take a few admins to ArbCom if they feel so strongly about adminbots. --W.marsh 14:39, 5 October 2007 (UTC)
  87. Support - the bot's operator and programmer have my fullest confidence and trust, therefore I have no reason not to support their effort to implement this bot with the sysop bit. ɑʀкʏɑɴ 15:05, 5 October 2007 (UTC)
  88. Function, check. Transparency, check. Source code, check. Operator, check. Checklist complete, cleared for mop. - Mailer Diablo 15:13, 5 October 2007 (UTC)
  89. Support I have no objection to adminbots with trustworthy operators running on boring tasks like this. --Hut 8.5 15:39, 5 October 2007 (UTC)
  90. Support as long as this bot is operated by WjBscribe.--Duk 15:43, 5 October 2007 (UTC)
  91. Per betacommand. — Nearly Headless Nick {C} 15:58, 5 October 2007 (UTC)
    Betacommand has expressed that he "Very strong oppose" this RfA. Did you mean to place your !vote in the oppose section? --ElKevbo 18:16, 5 October 2007 (UTC)
  92. Support useful, and seems like it will have lots of eyes on it to prevent/troubleshoot problems. Rigadoun (talk) 16:01, 5 October 2007 (UTC)
  93. Support. I have inspected the code. It's very simple and thus easy to test. This provides confidence that the bot will work as planned. - Jehochman 16:14, 5 October 2007 (UTC)
  94. Support simple ˉˉ╦╩ 18:59, 5 October 2007 (UTC)
  95. One of the reasons I don't generally like the idea of an admin bot is the number of judgement calls that the average administrator must make. This bot will be involving itself in none of those areas. EVula // talk // // 19:49, 5 October 2007 (UTC)
  96. Support I am all for careful, thoughtful automation of tedious or repetitive tasks. κaτaʟavenoC 20:10, 5 October 2007 (UTC)
  97. Technical and moral support; I have reviewed the code, and it is sufficiently straightforward that any putative damage caused by an unseen bug would be minor, trivially detected and simple to undo. Morally, I have no opposition to an admin bot iff it has exactly one well-defined task, strict criteria for the task, and the operator has the bit himself (to avoid privilege escalation). This is currently the case. — Coren  20:13, 5 October 2007 (UTC)

Oppose

  1. Strongest possible oppose. Bots should not have admin bits. Corvus cornix 20:54, 4 October 2007 (UTC)
    Why? Moreschi 20:55, 4 October 2007 (UTC)
    Bots are not people. Bots are left to run on their own with no checks and balances. Bots have no edit histories. Corvus cornix 20:56, 4 October 2007 (UTC)
    Bots are not people - well, duh. Bots do have human operators, however, and we can certainly trust these operators. They serve as check and balance, and bot will only do what they tell it to. If it ballses up, we can block it and if necessary desysop it in around 30 seconds. Your paranoia is unjustified. Moreschi 21:01, 4 October 2007 (UTC)
    I should let you know though, the source code is public, thats probably better then an edit history, its like being in the "admin's" mind. ;) —— Eagle101 20:58, 4 October 2007 (UTC)
    The code is public, and has been reviewed by - so far - three bot operators. Based on my review, I'm not saying that there is little chance the user will misuse the tools, I am saying that there is no chance whatsoever that we will have problems. --uǝʌǝsʎʇɹnoɟʇs 21:01, 4 October 2007 (UTC)
    Checks and balances are exactly what bots have. It will not do anything except what the code tells it to do. ((1 == 2) ? (('Stop') : ('Go')) 21:16, 4 October 2007 (UTC)
    Bots are nothing except checks and balances. That's how they work: they make simple checks, and simple decisions. If it's not simple, it lets a human decide. Matt/TheFearow (Talk) (Contribs) (Bot) 21:29, 4 October 2007 (UTC)
    I assume the comment above that bots have no edit history is incorrect. If that were true then I'd be more worried. EdJohnston 02:01, 5 October 2007 (UTC)
    What do you mean? Of course this bot has no history, in that it was just created, and was not yet approved, so has not begun to operate. --uǝʌǝsʎʇɹnoɟʇs 02:04, 5 October 2007 (UTC)
    I really wish RFA discussions were threaded without this support/oppose/neutral nonsense, but I digress... I'll oppose this request until Q13 is addressed. With sysop abilities, I really think it's a good idea to add a capability to stop the bot without having to be an admin. Mercury 00:37, 5 October 2007 (UTC)
    This? Daniel 00:56, 5 October 2007 (UTC)
    There will be a shutoff function, but it will not be the talk page. Talk pages tend to get messages that don't always result in the need to shut the bot off. (Newbies asking why a page was deleted, not realizing that the bot deleted the redirect etc.). These are not something that should shut the bot off. —— Eagle101 01:18, 5 October 2007 (UTC)
  2. Strong oppose A bot is not a human. Has anyone here ever seen I, Robot? Basically, that film sums up my point. Cheers,JetLover (Report a mistake) 03:50, 5 October 2007 (UTC)
    I don't suppose you've read the book I, Robot, the original Asimov? No? In that, the robots prove to be the protectors of mankind from self-inflicted destruction. No one reads the books these days, but if you haven't, do so. Our Frankenstein complexes are fundamentally irrational. Moreschi 17:53, 5 October 2007 (UTC)
    In case you have not yet noticed, this whole wiki is nothing but a jumbled mess of code... what shall we do when it gains the capacity of humans? —— Eagle101 03:53, 5 October 2007 (UTC)
    My point is that it won't gain the capacity of humans. It's a robot. Cheers,JetLover (Report a mistake) 04:08, 5 October 2007 (UTC)
    So you're opposing it not for its usefulness, but just because it is an automated process? --DarkFalls 04:14, 5 October 2007 (UTC)
    No. Twinkle is an automated process; I'm opposing this because it doesn't have human judgment. Have you ever seen how many times MartinBot, ClueBot, and BetacommandBot have screwed up? It could go berserk and delete good articles. Cheers,JetLover (Report a mistake) 04:19, 5 October 2007 (UTC)
    Even in the rare case that the bot goes berserk, admins can patch up the problems easily (it's article restoration...not difficult at all). Overall, I think it's fair to believe that the positives strongly outweigh the negatives. Nishkid64 (talk) 04:22, 5 October 2007 (UTC)
    But still, why take that chance? The bot could go nuts, sure it could be fixed, but why take the chance? Cheers,JetLover (Report a mistake) 04:25, 5 October 2007 (UTC)
    If you care to look at the bot's code or the nomination, the bot will only delete redirects, and only when the redirects have one edit in the history. Therefore, the bot cannot, and will not be able to, delete "good articles". And on another point, the bot cannot do anything out of the ordinary until its source code is changed. It all comes down to the trust you have in the nominators and bot operators, not the bot itself. --DarkFalls 04:28, 5 October 2007 (UTC)
    I really don't see how it could go nuts. It only has a very few lines of code. I suppose I could alter it so it goes nuts - but why would I? I already have an admin account I'm trusted not to go nuts with. Really, human editors are much more likely to go nuts than Bots whose scripts are available for all to see. It's a bit like if you could read my mind. This Bot is nowhere near as complicated as MartinBot or ClueBot, which actually have to make judgment calls - weighing up factors using a scoring system based on their programmes. This Bot can't do that. It simply checks whether a page meets predetermined criteria and can only delete if it does, and it doesn't even go looking for redirects; it works from a list like Special:BrokenRedirects which is generated independently... WjBscribe 04:32, 5 October 2007 (UTC)
    JetLover, consider the positives to the negatives. There is a minute chance that the bot will go screwy and delete maybe 10-20 articles, but from the other perspective, you can see that the bot can accurately delete thousands of redirects. Mathematically, I think trusting the admin bot is a safe bet. Nishkid64 (talk) 04:34, 5 October 2007 (UTC)
    C'mon, what harm will it do? It does not need cognitive thought to do what it is proposed to do. Mercury 04:38, 5 October 2007 (UTC)
    A minute chance? It's still a chance. 10-20 pages could be deleted...not good. Would you vote in an admin that might forget to take his medication and go crazy and delete 10-20 articles? I doubt that. Cheers,JetLover (Report a mistake) 04:40, 5 October 2007 (UTC)
    Nishkid, there is no chance it will delete any articles ;). The worst that can happen is that the bot deletes a few redirects it should not... in which case they are reported to me and I fix the problem. But I highly doubt that will happen... this task is just that simple. —— Eagle101 04:43, 5 October 2007 (UTC)
    (ec) I disagree with Nishkid, there is no such chance. The Bot cannot delete articles because (1) articles are not listed at Special:BrokenRedirects, (2) articles are not redirects to deleted pages and (3) articles contain more than one revision. WjBscribe 04:45, 5 October 2007 (UTC)
    But the thing is, it could happen, why let it? Would you vote in a candidate who had a mental illness, and he could forget to take his medication and delete 10-20 redirects? No. Why do it with a bot? Cheers,JetLover (Report a mistake) 04:48, 5 October 2007 (UTC)
    It can't happen - the code does not allow for it. To take the unfortunate example of a mental illness, we cannot tell whether a person has one or not - you can tell whether this Bot does or doesn't because every line of code is available. It is not capable of doing anything not in that code. WjBscribe 04:57, 5 October 2007 (UTC)
    Perhaps, but I still, and will never, entrust the sysop tools with a machine. Cheers,JetLover (Report a mistake) 05:00, 5 October 2007 (UTC)
    Oh please, this bot is bound to be smarter than many of our current sysops. -- John Reaves 08:12, 5 October 2007 (UTC)
    Do you have any programming experience, JetLover? Nick 09:12, 5 October 2007 (UTC)
    Generalising that comment to be "Misplaced Pages generally" would probably achieve the same response. Daniel 11:21, 5 October 2007 (UTC)
    This bot is far less likely to screw up than any other admin - in fact, if you trust eagle and wjb, there is no chance whatsoever. This isn't some super-intelligent evolving robot that has the capacity to choose to make decisions, and it's simple enough that I am sure that there is no way it can 'screw up' - it simply is not possible unless the operator tells it to - and we've already trusted him with the flag. --uǝʌǝsʎʇɹnoɟʇs 10:32, 5 October 2007 (UTC)
    I realize you've got me...but I just can't support. I will not entrust the admin tools to a robot. I am sorry, but I just can't. Cheers,JetLover (Report a mistake) 21:30, 5 October 2007 (UTC)
    It's not like it matters anyway, because I doubt the bureaucrats are really going to place much value in an oppose based on something that happened in a movie. -- John Reaves 21:51, 5 October 2007 (UTC)
  3. Strong Oppose. Not nearly enough editing experience, especially in project space. Someguy1221 06:36, 5 October 2007 (UTC)
    All the editing experience in the world couldn't help it... Please look above, it's a bot. --DarkFalls 06:40, 5 October 2007 (UTC)
    I'll do you one better! Here is the source a.k.a. the bot's brain. Enjoy... it can't do anything not specified there. —— Eagle101 06:53, 5 October 2007 (UTC)
    I'd understand if you didn't trust a bot with the bit or something, but did you actually look at the contributions you linked? Surely you would notice the bot has no contributions at all, let alone any in mainspace, and that this is a special case... --krimpet 08:00, 5 October 2007 (UTC)
    a.) You are insane. b.) Haha, good joke. Please choose one. -- John Reaves 08:07, 5 October 2007 (UTC)
  4. Oppose - I'm really really sorry to do this, but I really don't think this is a good idea. I'm more than happy with us having admin bots - properly sourced bots could take away quite a few tedious admin tasks. However, for me to support a user running an admin bot, I would have to see a lot of previous experience running bots. I trust Eagle (and everyone else who understands the code) that it would work and do a good job; however, they still have the potential to go wrong. WJB, I fully trust your judgement, but in areas where you have expertise (which is just about everywhere else). I just don't feel comfortable giving the most powerful bot to someone with very little bot experience. Ryan Postlethwaite 09:47, 5 October 2007 (UTC)
    I don't see a huge problem with that - if there is an issue, Eagle can just as easily fix the code and send it to WJB, and WJB is surely able to stop the bot in the meantime. --uǝʌǝsʎʇɹnoɟʇs 10:34, 5 October 2007 (UTC)
    Ryan, Eagle has a lot of programming experience and has run bots on Misplaced Pages before, as well as coding and running a number of IRC "RC" type bots. If you care to visit #wikipedia-en-image-uploads, you'll see one of his bots in action right now, I should think (I'm hoping I've got the channel correct). Nick 10:51, 5 October 2007 (UTC)
    It's not that I don't trust the bot per se; I trust Eagle's bot-writing abilities fully and am sure this is well-worked code. I just don't believe it should be operated by a rookie bot operator. If Eagle were running this, I would support, but he's not. For most other bots, a new bot owner would have my full support - I mean, we learn by experience - but not an admin bot, when there are plenty of other extremely qualified users to run it. Ryan Postlethwaite 11:10, 5 October 2007 (UTC)
    Ryan, with respect, the "most powerful" Bot argument is rubbish. It's 40 lines of code. It's probably the most simple Bot that is going to be running on Misplaced Pages. Yes, its final action will be to delete content - I have quite a lot of experience of deleting redirects (both speedily and after discussion) - but as far as its programming goes it is extremely simple. The specifics of the task are laid out and the code that allows it to be done is public. I have difficulty in seeing what else you might want from us... WjBscribe 11:04, 5 October 2007 (UTC)
    It has got more tools at its disposal than any other bot on the project, so it is more powerful, and I strongly disagree that the argument for that is rubbish. I trust the code, I trust your experience in deleting redirects; I just don't see any experience of you running a bot, which I believe would be essential for someone running an admin bot. Ryan Postlethwaite 11:10, 5 October 2007 (UTC)
    How would whether or not I have operated a Bot before make a difference? Do you think I will use it at a greater speed than the agreed rate? Or modify its code so that it does something it's not approved to do? In either case this would be spotted pretty easily, and I suspect rightly the Community would have serious issues with that. I can't see how, by accident or design, I am any more likely to misuse this Bot simply because I have not previously operated one... WjBscribe 11:14, 5 October 2007 (UTC)
    It's just about knowing how to use the bot, sorting problems with it if it goes wrong, always having to go and hunt someone to fix it, making changes by accident that cause problems. I know you wouldn't purposely abuse the bot, I know you too well to suspect any differently - I just think there's potential here to go wrong if things screw up with the bot or settings get changed. For me anyway, experience is key if someone is going to be allowed to run an admin bot. Ryan Postlethwaite 11:21, 5 October 2007 (UTC)
    From what I can tell from the above discussion, WJBscribe has no intention of changing the code himself or doing anything besides turning it on/off and entering the login password. While I'm not familiar enough with Perl to know for sure, there do not seem to be any settings to change that could affect the operation of the bot as designed. Mr.Z-man 17:02, 5 October 2007 (UTC)
  5. Very strong oppose This program has a Pre-alpha AI, cannot help in CAT:UNBLOCK, and it cannot block stupid vandals. β 13:08, 5 October 2007 (UTC)
    I'm having a difficult time understanding your oppose reasons. The bot is programmed to do a specific and simple task, why would it need AI? Also, how is it relevant that the bot won't block or unblock anyone? Chaz 14:06, 5 October 2007 (UTC)
    Strong oppose / Illegal RFA. Current wikipedia rules forbid non-humans from being admins. See the top of the RFA page. "Requests for adminship (RfA) is the process by which the Misplaced Pages community decides who will become administrators" Note the "who". Who is a person. Also note the WP:administrator which states "Misplaced Pages practice is to grant administrator status to anyone who has been an active and regular Misplaced Pages contributor for at least a few months, is familiar with and respects Misplaced Pages policy, and who has gained the trust of the community." This bot has made no mainstream edits and has no brain so cannot respect WP policy. Furthermore, this bot has not answered any of the questions although some proxy has done the job for it. EIJ initials 16:16, 5 October 2007 (UTC)
    You registered an account just to Wikilawyer over this RfA? Why? Chaz 16:24, 5 October 2007 (UTC)
    User blocked indefinitely as a troll. - auburnpilot talk 16:36, 5 October 2007 (UTC)
    Checkuser confirms this account was made by Dereks1x (talk · contribs) while banned. As such, this vote is discounted as banned editors are not allowed to edit while banned. --Deskana (talk) 16:51, 5 October 2007 (UTC)
  6. Oppose I'm just not crazy about deletions by bot. An admin should at least spend the time it takes to read the redirect to decide if maybe there's some other action to take...maybe the redirect should be retargeted, or maybe there should be an article there...or maybe in some cases the target does need an article after all (depending on why the target was deleted if that's why the redirect is broken)...whatever, a final pair of eyes is needed. I'm also worried about a general increase in acceptance of bot deletions for the same reasons, don't want to see this become a precedent for other requests of this kind. I don't want this to go through just because of a general enthusiasm for technology. Unless there's a compelling reason to keep the broken redirect backlog clear I don't see why we need this. And just a quick comment, I certainly trust the developer(s) and operators....it's not an issue of the bot going sideways or anything. RxS 16:37, 5 October 2007 (UTC)
  7. Oppose WJBscribe is already an administrator, and granting a person access to 2 accounts with administrator access is prohibited according to Misplaced Pages rules. The operator of the bot could run the bot without sysop tools and simply monitor its findings. He does not need 2 sysop privileges, which is against Misplaced Pages policy. AS 001 21:55, 5 October 2007 (UTC)
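For readers following the technical back-and-forth above: the safeguards WjBscribe, DarkFalls, and Eagle describe (the bot works only from the Special:BrokenRedirects list, and deletes a page only if it is a redirect, has exactly one revision, and points at a target that has already been deleted, with Eagle additionally proposing a one-day delay after the target's deletion) amount to a short checklist. The actual bot is described in this discussion as roughly 40 lines of Perl; the Python below is a hypothetical sketch only, and the `Page` record and `should_delete` function are invented names for illustration, not the bot's real interface.

```python
from datetime import datetime, timedelta, timezone

class Page:
    """Hypothetical stand-in for one entry from Special:BrokenRedirects."""
    def __init__(self, is_redirect, revision_count, target_exists, target_deleted_at):
        self.is_redirect = is_redirect
        self.revision_count = revision_count
        self.target_exists = target_exists
        self.target_deleted_at = target_deleted_at  # None if unknown/unverifiable

def should_delete(page, now=None):
    """Return True only if every safeguard passes; anything else is skipped."""
    now = now or datetime.now(timezone.utc)
    if not page.is_redirect:            # articles are not redirects
        return False
    if page.revision_count != 1:        # any page history means a human decides
        return False
    if page.target_exists:              # not actually a broken redirect
        return False
    if page.target_deleted_at is None:  # cannot verify the target's deletion
        return False
    # Eagle's proposed extra check: target deleted more than a day ago.
    return now - page.target_deleted_at > timedelta(days=1)
```

Note the failure mode: a page failing any check is simply left alone for a human admin, which is the property supporters rely on when they argue an article cannot be deleted; an article fails the first two checks before deletion is even considered.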

Neutral

Neutral pending reassurance switched to support. As a non-botwise person, I have an irrational phobia of rampaging admin Franken-bots that I’m really trying to overcome. Do I understand right that, if bots start getting sysop bits, a two-step process will be required? First, it goes through WP:RFA, with a detailed summary of exactly what tasks it will perform. If that is successful, it goes through WP:BRFA, which will verify that it does what it claims to, and only what it claims to. If it passes both, a 'crat gives it a sysop flag and a bot flag. Every time a new task that requires sysop permission is thought up, either a new bot is nominated for RFA, or the old bot goes through RFA again. Right? We won’t be in a position where, as time goes by, new tasks for a previously-approved bot only go through BRFA?
The reason I mention this here, on this relatively uncontroversial task, is that this is obviously the thin end of the wedge. I don’t share Corvus cornix's complete rejection of admin bots, and I think it would be good in some cases for the community to approve bots with sysop permission. But I wouldn’t want to go down a slippery slope where soon, admin bots are given new tasks without the entire community (i.e. RFA) being able to approve.
If it is already specifically enshrined somewhere that any new admin-related task has to be approved by the entire community, rather than just at BRFA, I’d support (and please point me in the right general direction of where that policy is). If not, I’d oppose until such an assurance was made. --barneca (talk) 21:31, 4 October 2007 (UTC)
If you read the statement, this bot will accept no further tasks. The code will only change to make the bot better at what it does. I will probably add in a mechanism by which the bot will confirm that the page that was redirected to was deleted over a day ago before deleting the redirect. —— Eagle101 21:35, 4 October 2007 (UTC)
I read the statement, and certainly believe WJBscribe that the scope of this particular bot will never change. Again, I'm just concerned that if we're going to start giving sysop bits to a bot, that there should be some mechanism to ensure that it will always be like this, and I'd prefer that mechanism be in place first. If this were not the thin end of the wedge, I would support. But it is, hence my question. --barneca (talk) 21:42, 4 October 2007 (UTC)
What mechanism would you like? It seems to me that if this Bot doesn't do what it says here, any of our 1000+ admins may block it. Admins are amazingly trigger happy in blocking Bots that do something they're not authorised to do (or do it too fast). Warnings are not given, the Bot operator is not consulted. Because the task is so simple, it will be very easy to show it is doing something wrong - whereas admin actions by humans are open to more interpretation :-) ... WjBscribe 21:46, 4 October 2007 (UTC)
All bots need approval by the bot approval group which understands how bots work. ((1 == 2) ? (('Stop') : ('Go')) 21:46, 4 October 2007 (UTC)
No problem, please do remember that this is setting a precedent and a good method for approving admin bots. There is no policy on it yet, but policy can always follow the practice. As it is now, the practice is following what you consider to be the ideal case, and this is what I consider to be the ideal case: one task per admin bot account. Each new one goes under a new account, and is approved at RFA rather than BRFA (which is not all that public, in terms of the people who see those pages). —— Eagle101 21:48, 4 October 2007 (UTC)
If that were policy, instead of good practice, my concerns would be addressed. Since policy can change overnight anyway, I’m not sure I actually need to see the policy written in stone first, although that’s what I would prefer. I’ll sleep on it. But I’d just feel better about this whole thing knowing that current consensus is that each new bot task requiring a sysop bit will trigger a new RFA (or some other, less-formal community-wide process that hasn’t been thought up yet), and that there wouldn’t be much support for a bold bot operator sometime in the future adding a task they considered uncontroversial. As for minor upgrades to the bot code, I’m perfectly happy to have BAG approve new versions of the bot, and maybe not even that is necessary (not sure what current practice is) as long as there was no task creep. --barneca (talk) 22:02, 4 October 2007 (UTC)
There will not be task creep. I am sure the code will be changed at some point, but if it is, the changes will be open for all to see. —— Eagle101 22:09, 4 October 2007 (UTC)

Very Weak Support - I've always hated the idea of having a bot become an admin, but reassurance counts. I'll give it a go for now. --Hirohisat 21:44, 4 October 2007 (UTC)

Is this in the wrong section? --barneca (talk) 22:02, 4 October 2007 (UTC)
Oops --Hirohisat 22:22, 4 October 2007 (UTC)
  1. Neutral leaning towards support - I'm not an expert with bots, and my little experience (aka run-ins) with bots has been rather... odd. I do have faith in the nominators, but I only hope this little Bot doesn't decide to go rogue... -WarthogDemon 22:20, 4 October 2007 (UTC)
    Yeah, I don't know much about bots and stuff, but I don't see the bot going nuts and blocking folks or deleting the main page... Or, um, killing people and stealing our spaceship. And there will be a shutoff function. So... --Jeffrey O. Gustafson - Shazaam! - <*> 01:36, 5 October 2007 (UTC)
  2. I'm paranoid too... so I can't support, basically as per Warthog Demon. Not that it'll matter.  — Dihydrogen Monoxide (H2O) 01:13, 5 October 2007 (UTC)