
Topic "Bot attacks" on Project:Support desk/Flow

Mikebisson (talkcontribs)

My wiki - www.jerripedia.org - has been plagued for some weeks by what appear to be bot attacks. At one point the problem was so severe that the hosting company - Krystal - shut the site down. They were less than helpful to me and, despite my lack of technical knowledge, insisted that I had to employ a consultant to resolve the problem. They did turn the site back on.

Finding someone to help proved extremely difficult, with the wiki support companies I contacted not taking on new customers. I was advised to contact WikiWorks, and Yaron Koren took up the challenge. However, during a couple of weeks' exchange of emails, with the site sometimes running normally and at other times very slow, our discussions concentrated on an upgrade to the MediaWiki software, which is several years out of date but has worked very well for years.

I allowed myself to believe that this would solve the bot attack problem, but when I posed the question directly, somewhat belatedly, I was assured that the two were not connected. I asked Yaron to concentrate on trying to solve the bot problem.

He made some changes to /public_html/.htaccess, which did not improve things, and suggested that I should 'find somebody who specialised in this sort of thing'. A day later the site was working normally, and did so for the best part of a week. But the problem has now returned, and is probably as bad as it has ever been.

I searched online for potential solutions and ended up editing robots.txt, also with no effect.

I am convinced that the problem is bots, although Krystal were ambivalent about this. An image-holding page which I have been working on daily, but only opening three or four times a day, has had some 3,000 visits in the past week, despite not being part of the menu structure.

If anyone reading this believes that they can be of help to me, could they please contact me through [email protected] so that we can take things further. Depending on what is involved I am prepared to pay for assistance, although I stress that this is a non-commercial site funded by user donations.

Mike Bisson

Leaderboard (talkcontribs)

Firstly: you absolutely MUST upgrade. You're on MediaWiki 1.17, which is ancient and exposes you to all kinds of serious security issues. You don't even have basic anti-spam tools such as Extension:AbuseFilter.

But I don't see evidence of new accounts after 8 July 2023, which makes me think the problem is something else. Can you describe the problems you're having in more detail?

Bawolff (talkcontribs)

There are a lot of different types of "bot" problems.

First of all, you have to figure out whether these are intentionally malicious bots or just normal web scrapers. For normal web scrapers, blocking by user agent or IP address in .htaccess is often sufficient. Better caching can also help in that case (the parser cache or a Varnish cache).
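For example, here is a minimal .htaccess sketch, assuming Apache with mod_rewrite available. The user agent names are only placeholders; substitute whatever actually shows up in your access log:

<pre>
# Refuse requests from scraper user agents (example names only;
# replace them with the agents you actually see in your access log)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12bot|PetalBot) [NC]
RewriteRule .* - [F,L]
</pre>

Individual IP ranges can be denied from .htaccess too, though for large ranges a firewall rule (as Jonathan3 suggests below) is usually cleaner.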

If it is malicious, then you have to distinguish between spammers and DDoS. For spammers who want to maliciously edit your site, see the documentation on combating spam.

If it is an intentional attempt to take down your site (DDoS), you probably need different steps. The easiest approach is probably Cloudflare. They provide a product to filter out non-human requests to your site, and I think they have a free offering for situations like yours.

Jonathan3 (talkcontribs)

If you don't care about your website being available in foreign countries, and the bots are ignoring robots.txt, you could look at the server access log, find out where most of the requests come from, and block those IP ranges, e.g. <code>ufw deny from 129.0.0.0/8</code>. That always seems to calm things down...
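A quick way to see where the traffic is coming from, assuming you have shell access and a standard Apache log location (the path below is a guess; Krystal may keep logs elsewhere):

<pre>
# Count requests per client IP and show the 20 busiest
# (adjust the log path for your host)
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20

# Then deny the worst offenders' ranges with the firewall, e.g.
sudo ufw deny from 203.0.113.0/24
</pre>

Note that ufw needs root access to the server; on shared hosting you may only be able to block in .htaccess instead.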
