User Details
- User Since
- Oct 13 2014, 4:56 PM (528 w, 1 d)
- Availability
- Available
- LDAP User
- XZise
- MediaWiki User
- XZise [ Global Accounts ]
Nov 5 2020
Sep 29 2018
Jul 22 2016
I think it is fixed by my patch, which is merged as 0c8150f, but I don't know why it isn't closed. Maybe it needs to be verified (and for me it has been too long to remember how much I actually tested it).
Jan 13 2016
You linked to the merge commit and b96e4c8 is the actual commit which changed it. Anyway I think the only option (without undoing it) might be to detect that the site is using SUL and maybe ([[https://test.wikidata.org/w/api.php?action=help&modules=query+globalallusers|query globalallusers]]?) there is a way to query if there is a SUL user with that name.
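A minimal sketch of that second idea, assuming a plain request-parameter builder rather than pywikibot's own request machinery; `agufrom`/`aguto` are the documented parameters of the `globalallusers` list module, while the helper name itself is made up for illustration:

```python
# Hypothetical helper: build an API query that checks whether a global (SUL)
# account with exactly this name exists, by listing from that name to that name.
def globalallusers_params(username):
    return {
        'action': 'query',
        'list': 'globalallusers',
        'agufrom': username,   # start listing at this name...
        'aguto': username,     # ...and stop there, so only an exact match shows up
        'format': 'json',
    }
```

If the resulting `query.globalallusers` list is non-empty, a SUL user with that name exists.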
Jan 7 2016
Dec 16 2015
There are two things to point out though: a test may not be labeled 'net' when we screw up, and there is actually a plugin for nose which prevents at least certain connections.
After some time it deletes the log (afaik 1 month, ping @hashar) unfortunately.
It seems that at some point in Python 3.5, Formatter._vformat changed what it returns, and since we had to override it, this is broken. I have 3.5 installed locally and there _vformat returns only one value, but the current version in CPython's repository returns two and unpacks them.
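A sketch of how an override could cope with both return conventions; `CompatFormatter` is a hypothetical name, the base class is the stdlib `string.Formatter`, and this is not the actual pywikibot code:

```python
from string import Formatter


class CompatFormatter(Formatter):
    """Formatter whose vformat tolerates both _vformat return conventions."""

    def vformat(self, format_string, args, kwargs):
        used_args = set()
        result = self._vformat(format_string, args, kwargs, used_args, 2)
        # Newer CPython returns a (result, auto_arg_index) tuple from _vformat;
        # older 3.5 builds returned just the formatted string.
        if isinstance(result, tuple):
            result = result[0]
        self.check_unused_args(used_args, args, kwargs)
        return result
```

An override written this way keeps working on both sides of the CPython change.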
Dec 12 2015
Dec 11 2015
Shouldn't it also be possible to have substitution parameters which won't be substituted, by doubling the number of brackets (so {{version}} will result in {version})?
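For reference, this is exactly how plain `str.format` already behaves — doubled braces are escapes and survive substitution as literal braces:

```python
# A doubled brace pair is an escape in str.format: it is emitted as a single
# literal brace instead of triggering substitution.
substituted = 'v{version}'.format(version='2.0')
escaped = '{{version}}'.format(version='2.0')
```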
Dec 8 2015
Alternatively, could you maybe replace the {1} with {1!r} and check if that traceback disappears? It'll probably still output the error, but it would be interesting to see what the actual error message is.
No, the RCs never had that bit of code which causes the specific error mentioned in the OP. It was added in master some time back but not backported. Regarding the nightlies page, I don't know who can edit that, but @Ladsgroup usually does.
Well 2.0rc3 does not have that error message and thus won't cause that failure.
I'm not really sure what I can do here from the bot's perspective. There is T113316, which talks about associating card pages which haven't been moved with the actual project name, but that won't fix the button target as you mentioned.
Dec 7 2015
Just as a general note, these 6 lines mentioned in the OP could easily be shortened into one: `return len(page.text) > 50`
Dec 4 2015
Well if you look in my comment (T102315#1371485) you'll see that for most purposes we are not doing HTTP requests but HTTPS requests.
Could you see the password? Otherwise it might be mistyped.
Do both accounts use the same password? And if not, which account's password are you using?
Dec 3 2015
Maybe it has a different account set up as sysop? Can you check in your user-config.py which username it contains for the wiki you are trying to delete stuff on? Additionally the output of `python pwb.py version`, together with the wiki you edit, can be helpful.
Nov 20 2015
Is this actually resolved? As @Ladsgroup mentioned the parts are still in pywikibot.
Nov 3 2015
The duplicates are because we have flake8 for Python 2 (flake8) and Python 3 (flake8-py3), and a set of stricter rules (flake8-docstrings-mandatory). For example, pywikibot/__init__ is tested by all of them, while tests/wikibase_tests is not tested by flake8-docstrings-mandatory (which only works on specific files, thus the different output in line 5).
Oct 26 2015
[[https://stedolan.github.io/jq/|jq]] is a script to parse JSON in bash. Has nothing to do with pywikibot 😉
Oops sorry, I introduced it in 4e393d63.
Oct 23 2015
The wiki supports defining the timestamp of the latest revision in the edit so it can detect conflicts. In theory we apply that (see APISite.editpage), but only if the actual base revision is cached. Otherwise it might load the latest revision directly before the save, which of course defeats the purpose, as there won't be a conflict with the latest revision if it's loaded almost immediately before saving.
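As a rough sketch of the mechanism (not the actual APISite.editpage code): `basetimestamp` is the documented `action=edit` parameter MediaWiki uses for conflict detection, while the helper itself is hypothetical.

```python
# Hypothetical sketch: the edit only carries basetimestamp when the base
# revision was already cached before editing; only then can the server
# compare it against the actual latest revision and report a conflict.
def edit_params(title, text, base_timestamp=None):
    params = {'action': 'edit', 'title': title, 'text': text}
    if base_timestamp is not None:
        params['basetimestamp'] = base_timestamp
    return params
```

Without the cached timestamp, fetching the latest revision right before saving makes a conflict with that revision practically impossible to detect.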
The wiki should actually detect there is an editing conflict and report that accordingly. It should also print a warning (at least in that script) that an editing conflict occurred.
Oct 21 2015
PAH! Anyway ;) The tasks to watch are probably https://gerrit.wikimedia.org/r/243487, https://gerrit.wikimedia.org/r/246791 or https://gerrit.wikimedia.org/r/247041.
Oct 20 2015
The site is now available again.
Primarily a response from @jayvdb to the comments in https://gerrit.wikimedia.org/r/#/c/237977/. And as soon as that gets merged we are going to get more information from Travis. Not sure who can then help with that information though.
Oct 18 2015
You -1'ed a patch, which caused it to stall. And to preempt anyone opening a bug report because of the bug the patch is fixing, John opened this one here. It's not about preventing you from -1'ing anything but about the bug in our library. And it mentions that there is already an unmerged patch, which was -1'ed by you at that time.
Oct 17 2015
This does actually prevent T85786 at the moment.
By the way, the cache time doesn't matter that much, as the cached item does not store the time of expiration but the time of creation. So we could reduce the time, and for anyone who uses a version after that change the entry won't stay cached for about 100 years.
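A toy model of that caching behaviour, assuming entries store their creation time (all names here are made up, not pywikibot's actual cache classes):

```python
import time


class CacheEntry:
    """Toy cache entry: stores when it was created, not when it expires."""

    def __init__(self, data, max_age):
        self.data = data
        self.created = time.time()  # creation time is persisted, not an expiry time
        self.max_age = max_age

    def expired(self):
        # Because only the creation time is stored, lowering max_age in a
        # newer code version immediately invalidates old entries as well.
        return time.time() - self.created > self.max_age
```

Had the expiry time been stored instead, entries written with a huge lifetime would stay valid no matter what later versions configure.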
Oct 16 2015
Those then can't use socketio_client.
Oct 15 2015
There are two different components:
I'm a bit confused about what you actually want. We already check for both the “Bots” and “Nobots” templates.
Oct 14 2015
I'm wondering how we could add support for that on older wikis. In theory, if we query revisions in order and have already queried the next (or, if reversed, the previous) revision, we could use that id, but it might cause problems because of T91883: prop=revisions sorts by rev_id, not by rev_timestamp.
Oct 13 2015
Isn't it already possible to also get subcategories using -subcats/-subcatsr? Now with regard to the talk pages, @jayvdb afaik suggested making pywikibot.pagegenerators.PageWithTalkPageGenerator visible via the command line so that you can use its generator. But I don't think that'll be useful for the formatter, as it only works on one page at a time.
Oct 12 2015
Okay, I can verify that the size is 3135 bytes with @Mpaa's change and 2936 bytes without. But did you also use the -botflag option?
How did you use it and have you verified that you use @Mpaa's change?
Afaik this bug is invalid, as listpages already supports redirects?
Oct 11 2015
Oh sorry, it seems I overlooked the last part of your opening post. For the moment I'm a bit worried that the package is a bit overkill when we need just one dict, and that it is a string and not a function. And that string is not even valid Python code:
I think it should be possible to either have a script deriving the plural rules using Unicode's CLDR definition or to introduce/use that XML directly. For the second part the main problem is probably the copyright (although it seems you are able to publish and distribute the data files, which would include plurals.xml).
@Ricordisamoa: I guess your dislike has the same reason as T101807: Run Pywikibot tests against Win32 using Appveyor? In which case I strongly agree with @jayvdb's response (T101807#1355749).
Oct 9 2015
Hmm we still get 403s: https://travis-ci.org/wikimedia/pywikibot-core/jobs/84420867#L1253
Oct 8 2015
Oct 7 2015
@jayvdb already suggested making CategoryListifyRobot obsolete via the listpages script. And in theory you could also simply pipe the result, I guess?
Okay I looked through my IRC logs and found something from June. The pywikibot.data.api.PageGenerator always adds iiprop=metadata which includes a lot of data in case of djvu files like in File:Alberti - De re aedificatoria, 1541.djvu.
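A hedged sketch of requesting image info without that blob; the `iiprop` values are documented MediaWiki API properties, while the helper itself is hypothetical (pywikibot's actual generator always adds `metadata`):

```python
# Hypothetical helper: only include the (potentially huge) metadata property
# in iiprop when it is actually wanted, e.g. for djvu files it can be enormous.
def imageinfo_params(title, want_metadata=False):
    props = ['timestamp', 'user', 'url', 'size', 'sha1']
    if want_metadata:
        props.append('metadata')
    return {
        'action': 'query',
        'titles': title,
        'prop': 'imageinfo',
        'iiprop': '|'.join(props),
    }
```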
If I remember correctly the issue is that the metadata is transferred unsolicited. But I can't find the task for it.
Oct 5 2015
Okay because I thought this site might actually block Travis somehow I just ran that test with just that URL and included a bit of debug output. This is the result:
That issue seems similar to the fact that deleting latest_revision_id also doesn't properly reset everything. So maybe there should be one “clear_cache” which is called by both latest_revision_id.deleter and purge?
Oct 4 2015
Okay, I guess it happens because the prefix fr is defined via the interwiki map and is thus considered an interwiki link. Now I guess the correct fix would be to compare the sites (i.e. whether fr redirects to itself) and then not consider it an interwiki link, instead of using the language code.
Anyway I don't like how we aren't improving the tests. It might be that issues are covered by our master branch tests, but those don't test the 2.0 branch and we might very well introduce a bug due to incorrect cherry picking.
What is the advantage of introducing new unnecessary unicode_literals liabilities?
Oct 2 2015
Off topic: As a tip for Phabricator, you don't need to actually link tasks but can just write their ID (T114487), and the same happens (as you saw in my comment) with the git commit hash (if it contains at least 7 hex digits).
Ehm, you didn't link rPWBC1ff1cec7a33d but 44898b7. So with the 2.0 branch you are fine? But yes, could you please open a separate bug for this? Thanks 😄
Okay just checked, the pypi version hasn't been updated though.
Ugh 😕 it should've been updated to 2.0rc3 which has that issue fixed, but it seems nobody has uploaded it to pypi.
What do you mean with 5 tries?
But at the moment it's mostly dependencies, afaik. In my patches it's not like we are unable to improve pywikibot without breaking 2.6 support. The only exception might be the argparse patch, but there is a backport specifically for Python 2.6, so it's not like that dependency will drop support for Python 2.6.
This looks similar to T103080: Link should recognise {{ns:Project}} in text (@jayvdb).
Afaik it supports 3.5 even without fixing getargspec. It'll just report deprecation warnings (which are disabled by default).
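For reference, the non-deprecated stdlib replacement for `getargspec` is `inspect.getfullargspec`:

```python
import inspect


def example(a, b=1):
    return a + b


# inspect.getargspec has been deprecated since Python 3.0 and emits a
# DeprecationWarning on 3.5; getfullargspec is the supported replacement
# and additionally reports keyword-only arguments and annotations.
spec = inspect.getfullargspec(example)
```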
The mentioned django bug report does not mention https://www.python.org/dev/peps/pep-3138/#motivation:
I merged T72976 into this one as the patch fixing both bugs primarily mentions this bug.
Afaik this has been solved with 38589d30. And if we do want to solve it properly (see T66958) this problem cannot appear anyway.
Please note that my paste hasn't been implemented and I haven't got around to port my code to the implemented variant.
Oct 1 2015
In this specific case upgrading to 2.0rc3 should suffice.
Sep 30 2015
This should be solved on both branches.