User talk:Magnus Manske/Archive 3
This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.
Widar edits made by a bot can't be marked as bot edits
see Special:Contributions/RobotGMwikt. By the way, on my computer, if I try to authorize Widar by clicking "allow" in //www.mediawiki.org/w/index.php?title=Special:OAuth/authorize&oauth_token=xxx&oauth_consumer_key=xxx (both http and https), Widar is not really authorized (the page still asks me to authorize).--GZWDer (talk) 10:50, 2 January 2014 (UTC)
WikiDataStats
Hello,
Is it planned to move WikiDataStats to WMF Labs? This tool was interesting, but it doesn't work anymore. I can try to find a developer if you don't have time to maintain this tool.
Happy new year, Pyb (talk) 16:26, 5 January 2014 (UTC)
- I think that WikidataQuery can handle this more generally, as the number of results is always output. --Izno (talk) 16:44, 5 January 2014 (UTC)
WikidataQuery results not being updated
The data doesn't seem to have updated in a week and change. Is that normal? I'm getting the same number of results when I performed a query like http://208.80.153.172/wdq/?q=claim[107:215627] as I did on the 28th. --Izno (talk) 16:52, 5 January 2014 (UTC)
- Bug report filed. Response lacking. --Magnus Manske (talk) 10:46, 7 January 2014 (UTC)
- Thanks for poking the bug. It looks like the results have updated. --Izno (talk) 22:53, 7 January 2014 (UTC)
Person/Expression
Hi, thanks for adding authority control IDs. Please check item type more accurate: [1], [2]. — Ivan A. Krestinin (talk) 19:35, 12 January 2014 (UTC)
WiDaR
Hello!
I tried to use Widar and authorized it with OAuth, but when I go to http://tools.wmflabs.org/widar/, the page says "Error retrieving token: mwoauthdatastore-request-token-not-found". Did I do something wrong?
Thank you very much. Kvardek du (talk) 09:59, 8 January 2014 (UTC)
- Not sure about this. Do you have cookies enabled? --Magnus Manske (talk) 13:20, 8 January 2014 (UTC)
- Yes, and I have the same problem in another browser. Kvardek du (talk) 01:19, 9 January 2014 (UTC)
- It works now. Thank you. Kvardek du (talk) 11:40, 10 January 2014 (UTC)
- I've been having this same problem on Fx 26.0 on Windows 7. Cookies are allowed. Steps that I used:
- Go to http://tools.wmflabs.org/widar/, click on authorize link
- Get sent to log in page at MediaWiki.org (is MediaWiki.org not on SUL?...)
- Input password, get returned the "Widar would like to perform actions for you" message.
- Click Allow, get sent back to tools.wmflabs.org/widar with the oauth_verifier and oauth_token set in the URL
- Nothing has changed about the page otherwise; when I hit F5, I get the message that the other user posted
- Any help for me? :) --Izno (talk) 02:49, 13 January 2014 (UTC)
- Same for me — Felix Reimann (talk) 09:25, 16 January 2014 (UTC)
I'm not sure what's happening here; sounds like a problem with OAuth. Just for clarification, once you have authorized WiDaR, and you go to AutoLists, and try to add some claims, what error do you get? In case it's not clear, WiDaR itself has no interface other than the auth link; it is to be used by other tools, currently only AutoLists. --Magnus Manske (talk) 09:35, 16 January 2014 (UTC)
- Oh. I'll see if it works later. Could you make that clearer on the WiDaR page? --Izno (talk) 14:53, 16 January 2014 (UTC)
- omg! same for me. Sorry for bothering you. I got to WiDaR because of a posting which said that it works great - and I did not see any functionality. But now: yes, auto list works great. A small hint at WiDaR that the authorization is successful and nothing more will happen could help. — Felix Reimann (talk) 16:17, 16 January 2014 (UTC)
missing_props.js
Hello, thanks for this very useful gadget! Unfortunately, it seems that there is a problem: it doesn't work anymore for me, even if I deactivate all other js files. Do you know if the problem is global or just on my side? Or where in the code should an eventual bug be fixed? --Shonagon (talk) 13:28, 18 January 2014 (UTC)
- I made a small fix to missing_props.js since it was opening the source input box of the last claim instead of adding a new claim/property. (The fix is minor and consists in changing the selector names in the
addProp
method.) See the code here. -- CristianCantoro (talk) 10:09, 23 January 2014 (UTC)
- Thanks, done. Feel free to "commit" your future improvements directly on the source page! --Magnus Manske (talk) 14:57, 23 January 2014 (UTC)
- Thanks a lot, both of you! For information, a fork has been made for visual artworks: WikiProject_Visual_arts/Tools#missing_props.js (it is not meant to be included in the main script, because it's too complicated and needs to be improved) Shonagon (talk) 19:38, 23 January 2014 (UTC)
Helping out
Hi Magnus,
As a newbie I'm keen to push Wikidata forward, and am surprised that more of the infobox data hasn't been imported from the wikipedias, or more of dbpedia. Having read a lot of discussion, I've found that I agree with pretty much everything you've posted on the subject, so thanks for the work there! What is the current status of it all? Is it a question of getting data into the right form and handing to a bot, because I'm willing to help there and have experience with scripts to manipulate the wikidumps. Let me know if I can help, anyway. Smb1001 (talk) 00:20, 8 February 2014 (UTC)
useful.js / gender
Thanks for updating the gender options to reflect the contents of sex or gender (P21). Can we change "man" and "woman" to "male" and "female"? We have a lot of people still using the wrong values, and while your script adds the correct values, the strings used in your interface don't match the actual values used.
Had a moment to do a quick sketch and figured I'd share https://www.dropbox.com/s/c0sjm4ib2axlktf/useful%20sketch.png feel free to use it if you find it helpful.
Jared Zimmerman (talk) 02:36, 8 February 2014 (UTC)
Can WDQ link command check interwiki to Wikisource ?
Hi,
With this new project landing, can we include criteria on links to it in WDQ ? If we can, with which syntax ?
Thanks,
Luc
- There is actually code in WDQ to query for links to specific projects, but the updater is broken; you can ask, but there's no data. I'll have to fix it, hopefully on the weekend. --Magnus Manske (talk) 09:17, 16 January 2014 (UTC)
- Hi Luc, it finally seems to work. Try this for all books that have a link to en.wikisource. You can get all the items not linking to en.wikisource using nolink instead of link, but it's awfully slow (minutes!) for the moment. --Magnus Manske (talk) 00:33, 8 February 2014 (UTC)
- That's great, thanks! I've started playing around with this. One remark: yesterday some changes in the sitelinks were not reflected in the queries, whereas changes in properties were. This morning, it's OK. Is that normal (different replication timing), or was there a bug that was fixed overnight?
- Also, the wikilinks to Wikisource in the right column are not active.
- Thanks again, --LBE (talk) 09:17, 9 February 2014 (UTC)
How do you search for items with only a Japanese wiki article that have a certain claim
I am trying to improve Japanese articles in Wikidata. I was wondering, how do you search for items that only have a Japanese wiki article using the auto list? --Napoleon.tan (talk) 07:15, 13 February 2014 (UTC)
- Add "and link[jawiki]" to the query. See WDQ docs for full capabilities. --Magnus Manske (talk) 14:42, 13 February 2014 (UTC)
- Thanks. Just what i was looking for --Napoleon.tan (talk) 01:36, 14 February 2014 (UTC)
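The query grammar behind AutoList composes by plain string concatenation, so the pattern above generalizes. A minimal sketch (the helper name is hypothetical; AutoList and the WDQ API simply take the raw query string):

```javascript
// Build a WDQ query string that restricts a claim search to items
// with a sitelink on a given wiki. Hypothetical helper for illustration;
// the tools accept the raw string directly.
function claimOnWiki(property, value, wiki) {
  return 'claim[' + property + ':' + value + '] and link[' + wiki + ']';
}

// All humans (P31 = Q5) that have a Japanese Wikipedia article:
console.log(claimOnWiki(31, 5, 'jawiki'));
// claim[31:5] and link[jawiki]
```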
Autolist or Widar
Is it possible to remove claims using Widar? It'll be useful to get rid of obsolete/deprecated/redundant claims.--GZWDer (talk) 11:13, 13 February 2014 (UTC)
- I am not certain :-) They are fiddling with some broken permissions on Wikidata/OAuth this very day. Removing claims should be possible. I'll let you know when I have the bandwidth to check. --Magnus Manske (talk) 14:43, 13 February 2014 (UTC)
- Another feature request (it's getting complicated): it would be great if you could also add claims with sources. Currently, adding sources is a tedious task (a lot of clicking) if you do not own a bot. If you could enhance the "Set claims" feature to also take a source (i.e. a list of property/value pairs from which a single source is added to the claim), it would be great and definitely ease the sourcing of statements. Nonetheless, thank you for your tools! — Felix Reimann (talk) 11:47, 14 February 2014 (UTC)
Feature request: Locator map image in Reasonator for geographical data
I think you should add property 242 (locator map image) to Reasonator for geographical data. --Napoleon.tan (talk) 01:33, 14 February 2014 (UTC)
- It should already do that. Is there an example where it doesn't? --Magnus Manske (talk) 10:34, 14 February 2014 (UTC)
- For example, http://tools.wmflabs.org/reasonator/?&q=1903520 does not show the locator map. --Napoleon.tan (talk) 03:55, 16 February 2014 (UTC)
Tiny bug in reasonator
Hello, there is a tiny bug when you mouse over an image in the reasonator. At the bottom http://tools.wmflabs.org/reasonator/?lang=fr&q=Q15273842 when you hover over the image of the naked sitting girl sleeping in midair, the tooltip reads 'L' while it should be the French caption of Q15730739, which is "L'Enlèvement de Psyché". Something wrong with the quotation mark, I suppose. --Zolo (talk) 17:26, 15 February 2014 (UTC)
- Thanks, fixed. --Magnus Manske (talk) 18:57, 15 February 2014 (UTC)
Trees
Hi, I see that your tree tool is based on Wikidataquery, so I am wondering if there are ways to give them the full power of the tool. More concretely: Solimões (Q2844812) is part of (P361) Amazon (Q3783). Is there a way I can get a tree of the Amazon tributaries like this, but where tributaries of Solimões (Q2844812) would be handled as tributaries of Amazon (Q3783) ? --Zolo (talk) 22:01, 20 February 2014 (UTC)
- Like this? --Magnus Manske (talk) 22:45, 20 February 2014 (UTC)
- That's better than nothing, but ideally "Solimões" should not be shown at all, as this is really just a part of the Amazon. The Purus River should be shown as discharging directly into the Amazon, just like the Jari River and others. --
- Wikidataquery itself does not allow such fudging. Could be done with post-processing. Maybe later. --Magnus Manske (talk) 00:02, 21 February 2014 (UTC)
- Ok, I wasn't sure actually. If you could provide a link to some quick doc about how to use the tool, that may avoid silly questions like mine ;). --Zolo (talk) 14:02, 21 February 2014 (UTC)
Bug with enormous queries ?
{{instance list|Q5}} is supposed to link to a list of all people in Wikidata, but it returns "no result", perhaps because there are too many? If that is so, could you make it a more informative message like "too many results"? I am using this link in a template used in Talk:Q5, hoping to make Wikidata's structure more understandable, but if the link doesn't work, that may actually be counter-productive :|. --Zolo (talk) 09:08, 22 February 2014 (UTC)
Hi Magnus! I created the page for localization of the Q / P pages (title translation; the description should / can be added as well).
It looks to me that the order of the added missing properties is reversed (the most relevant should show first). Can this be confirmed / changed? Thanks in advance! לערי ריינהארט (talk) 10:18, 23 February 2014 (UTC)
- a) Please add
'edition' : [ [31,3331189] ] ,
- below the line:
'book' : [ [31,571] ] ,
- b) Please add
'edition' : [629,50,98,110,407,655,393,291,123,577,179,212,243] ,
- below the line:
'book' : [357,392,364,50,98,136,135,393,291,123,577,179,110,212,243,166,155,156,144] ,
- Thanks in advance! לערי ריינהארט (talk) 11:22, 23 February 2014 (UTC), Thanks @Kolja21 !
- @GZWDer Can you please activate these changes? User:Magnus Manske/missing props.js/help is translatable. Can you add the zh translation please? Please do not hesitate to comment on / add to / modify the help page's talk page. Thanks in advance! לערי ריינהארט (talk) 10:43, 5 March 2014 (UTC)
Hi, I've added the 'edition' lines. Not sure what you mean by "zh translation", I don't speak Chinese. --Magnus Manske (talk) 12:32, 5 March 2014 (UTC)
- a) The help page can be translated line by line. Depending on the language you are using you may be asked to add a translation.
- b) I was not aware that besides 'edition' one should also support (using an OR statement) edition or translation of (P629): I did not pay enough attention because this proposal was made together with @Kolja21,
- c) At the top of this section I made a note about the display order. The most important properties are now at the bottom. Can you reverse the order please? Thanks in advance! לערי ריינהארט (talk) 20:02, 7 March 2014 (UTC)
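For context, the requested lines above fit a two-map pattern: one map recognizes an item's type from its instance of (P31) claims, the other lists the properties expected for that type. A rough, self-contained sketch of how such maps could drive a missing-properties check (the function names and the shortened property lists are illustrative, not the actual missing_props.js internals):

```javascript
// Illustrative sketch only; the real missing_props.js internals differ.
var typeDetectors = {
  'book':    [[31, 571]],      // instance of (P31) = book (Q571)
  'edition': [[31, 3331189]]   // instance of (P31) = version/edition (Q3331189)
};
var expectedProps = {              // truncated lists, for illustration
  'book':    [357, 392, 364, 50, 98],
  'edition': [629, 50, 98, 110, 407]
};

// Given the P31 values present on an item, return the expected
// properties that the item does not have yet.
function missingProps(p31Values, presentProps) {
  for (var type in typeDetectors) {
    var match = typeDetectors[type].some(function (pair) {
      return pair[0] === 31 && p31Values.indexOf(pair[1]) !== -1;
    });
    if (match) {
      return expectedProps[type].filter(function (p) {
        return presentProps.indexOf(p) === -1;
      });
    }
  }
  return [];
}

console.log(missingProps([571], [357, 50])); // [392, 364, 98]
```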
Scopus ID
I have found a second Scopus ID for Magnus Manske (Q13520818). You can request the second one be merged via the form at http://www.scopus.com/search/form/authorFreeLookup.url John Vandenberg (talk) 00:37, 2 March 2014 (UTC)
Merging items
Hallo Magnus Manske,
For merging items, you may want to use the merge.js gadget from help page about merging. It has an option "Request deletion for extra items on RfD" to automatically place a request to delete the emptied page. This way of nominating makes it a lot easier for the admins to process the requests.
With regards, - - (Cycn/talk) 13:08, 6 March 2014 (UTC)
- Thanks, though I saw somewhere this week that the merging script caused a lot of broken statements? --Magnus Manske (talk) 20:15, 6 March 2014 (UTC)
real time queries
Hi Magnus! I understand that the query for occupation (occupation (P106)) with value chess composer (chess composer (Q2627699)) is CLAIM[106:2627699].
Results are:
- {"status":{"error":"OK","items":6,"querytime":"64.656ms","parsed_query":"CLAIM[106:2627699]"},"items":[3608137,3619294,3766775,3886391,3887263,3935913]}
Reasonator shows a higher number of pages (37 or more). Is a) the query running on an image / dump? b) are some parameters not reset? c) another explanation? Regards לערי ריינהארט (talk) 18:38, 12 March 2014 (UTC)
http://208.80.153.172/api?q=claim[106:14467526] is a query about linguists (linguist (Q14467526)).
- a) Can you please add a counter (how many are found)?
- b) Can you support a parameter which generates a clickable output like https://toolserver.org/~magnus/ts2/php/wd_query.php?q=["Q14467526"] ?
- c) Can you limit the output with two parameters, generally used at MediaWiki, with the significance of "start from" and "how many"?
Best regards לערי ריינהארט (talk) 20:29, 12 March 2014 (UTC)
First, at the moment WDQ is stuck at about 2014-02-27, because of the confluence of some major code changes and Labs no longer supporting daily diff dumps. That should change with the next bi-weekly data dump, after which updates should be every 10 minutes again.
Second, I can add these parameters (weekend, maybe), but won't do wd_query.php links, as 1) JSON and 2) that tool is outdated. For a nicer display, paste the query into Autolist. --Magnus Manske (talk) 12:50, 13 March 2014 (UTC)
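Until such parameters exist, the count and a "start from" / "how many" window can be taken from the JSON client-side, since status.items already carries the total. A minimal sketch (field names taken from the response shown above; the helper name is hypothetical):

```javascript
// Parse a WDQ API response and emulate "start from" / "how many"
// paging on the client side. Field names match the JSON shown above.
function pageResults(responseText, start, count) {
  var data = JSON.parse(responseText);
  return {
    total: data.status.items,
    items: data.items.slice(start, start + count)
  };
}

var sample = '{"status":{"error":"OK","items":6,"parsed_query":"CLAIM[106:2627699]"},' +
             '"items":[3608137,3619294,3766775,3886391,3887263,3935913]}';
console.log(pageResults(sample, 2, 3));
// { total: 6, items: [ 3766775, 3886391, 3887263 ] }
```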
autodesc.js
Hi Magnus! I've seen that your autodesc.js
tool was translated into German, so I've done the same for French on one of my subpages. The script works well (from a technical point of view), but there is an issue due to one of the specificities of French grammar/orthography: nationality should come after occupation. Could you make this change and merge the translation into your script please? Thanks a lot. — Ayack (talk) 16:21, 18 March 2014 (UTC)
- Thanks! I have added your changes to this version, which should be the "official" one - more people can edit, and it scales better than Labs. --Magnus Manske (talk) 10:00, 19 March 2014 (UTC)
- Thanks! — Ayack (talk) 14:54, 19 March 2014 (UTC)
Tree data not refreshing when deleting claim value
Hi Magnus, I like your tree application; it helps me check which articles are already completed. I use it in tandem with the map ("around" app). I am mapping administrative regions to hierarchically check items. I notice that the tree immediately updates if you add or edit an is in administrative region property, but it does not delete the link if I remove the property.
Please see the link: "Trilla in Manila" is an event, so I removed the is in administrative region claim and changed it to location of the event. But it remained in the tree.
--Napoleon.tan (talk) 03:35, 18 March 2014 (UTC)
- When I switch to the list view, I can't find "Trilla" in there. Is it still in there? If so, which Q number? Note that there is always a ~10min delay in updating the dataset. --Magnus Manske (talk) 08:53, 18 March 2014 (UTC)
- Yeah, it is in both the list and the tree view. I am sorry, I misspelled it before; that may be the reason why you did not find it. The correct spelling is "Thrilla in Manila" with an "h". By the way, I really appreciate you creating these tools. The tree view is really interesting. I can clearly see that articles about my home country are lagging behind others, just comparing Philippines (Q928) and Japan (Q17). The one for the United States won't even load because of too many nested items. I can see that even the California tree is denser than the whole Japan tree.
--Napoleon.tan (talk) 06:10, 20 March 2014 (UTC)
Order of the "position held" list in Reasonator
Hi magnus,
Just want to inform you that the "position held" list in Reasonator may need additional sorting. If a person held a position twice but not consecutively, the list groups all of that person's terms together as if chronological, while the timeline tool does not.
For example, for "vice president of the Philippines", Fernando Lopez served thrice, but the third time is not consecutive with the second term. So, I think it would be better if you sort by start date and not by person only.
--Napoleon.tan (talk) 01:38, 1 April 2014 (UTC)
- Thanks, that should be fixed now. --Magnus Manske (talk) 09:43, 1 April 2014 (UTC)
Hi! re: User:Magnus Manske/authority control.js and NUKAT ID (P1207)
Please add
NUKAT:{key:'NUKAT',p:1207} , /* "NUKAT Center (Poland)" ; Ludwik Lejzer Zamenhof ([[Q11758]]) [[viaf:73885295]] */
as new AC identifier. Thanks in advance! לערי ריינהארט (talk) 09:57, 17 March 2014 (UTC)
- BTW: Reasonator should handle NUKAT ID (P1207) as an authority control properly; example [3] . לערי ריינהארט (talk) 11:37, 20 March 2014 (UTC)
- Done. For future reference, you can edit the Reasonator sidebar properties here. --Magnus Manske (talk) 12:36, 20 March 2014 (UTC)
- P.S.: Also found an Esperanto article for your example :-)
- I wondered what the Esperanto article is ... לערי ריינהארט (talk) 12:46, 2 April 2014 (UTC)
semantic ambiguities, Reasonator and the Sandbox-CommonsMediaFile (P368)
Hi! The Wikidata Wikidata Sandbox (Q4115189) http://tools.wmflabs.org/reasonator/?lang=eo&q=Q4115189&live handles the Sandbox-CommonsMediaFile (P368) as a normal textual property.
I encountered dozens of Wikidata objects which need semantic maintenance; mainly I worked on professions / occupations and the related fields of activity. These cannot be fixed in five minutes because many languages with many scripts are involved.
Basically the following information should be displayed:
- a) Q-id for suggested profession / occupation
- b) Q-id for suggested fields of activity
- c) WMF language codes (a list) which link to neither the semantic meaning of a) nor b)
- d) list of WMF language codes that should be confirmed
Example illustrating a), b), c), and d): ro:Înot, ro:Înot sportiv and ro:Natație, which relate to swimming as a general term, the sport competition term, and the relation between these.
Whatever implementation would be helpful. It should be possible to generate a list of affected WD objects. Best regards לערי ריינהארט (talk) 13:13, 2 April 2014 (UTC)
- Sandbox-CommonsMediaFile (P368) is now displayed in the "related media" section. Since that property should be used on the Sandbox item only, it didn't seem to be a high-priority task.
- As for the rest of your message, I have no idea what you are talking about. You seem to want to get some kind of item suggestions, but how there are to be generated, and where these should show up, is not exactly clear. Are you still talking about Reasonator? --Magnus Manske (talk) 13:37, 2 April 2014 (UTC)
- more examples:
- 1) construction (Q385378) is a "process that consists of the building or assembling of infrastructure"; however, ro:Edificiu is a (type of) building
- 2) an example with many impacts relates to craft (Q2207288) (in German "Handwerk") and artisan (Q1294787) (in German "Handwerker"), whose field of this occupation (P425) is handicraft (Q4869079). There are hundreds of professions / occupations which depend on this. I collected hundreds of Firefox bookmarks on professions / occupations and similar conflicts (semantic ambiguities). Regards לערי ריינהארט (talk) 14:25, 2 April 2014 (UTC)
GeneaWiki
Hi Magnus, GeneaWiki is a really great tool except that when there are too many people in the tree, it stops working showing nothing (ex: http://tools.wmflabs.org/reasonator/geneawiki2/?q=Q509362, http://tools.wmflabs.org/reasonator/geneawiki2/?q=Q346, ...). Could you add a limit to it, in order to display only the first levels of the tree please (rather than nothing)? Thanks in advance. — Ayack (talk) 18:05, 1 March 2014 (UTC)
- Genealogy trees are tricky. I am aware that this is not working properly for large datasets, but there is no quick fix. I'll eventually rewrite it "properly". --Magnus Manske (talk) 08:36, 8 April 2014 (UTC)
GeneaWiki
This looks like a great tool. I'm sorry if I did something wrong; I loaded one object and then realized that it's maybe not a test page. Is there any documentation on how to use it? I was thinking it could be useful to show family relations between the members of the opposition party Folkungar in Sweden during 1210-1280. I have done a graphical table at sv:Folkungar#Sl.C3.A4kttavla and was curious to see how it would look with GeneaWiki. Sorry if I did something wrong. Dan Koehl (talk) 08:23, 8 April 2014 (UTC)
- The family relations need to be defined in the respective Wikidata items. Once that is done, just point the tool to one of the items, like so (this is the item for "Folke den tjocke"). This currently shows only that one item, because that item does not have any family relations, e.g. children. Add those, reload, and it should show. --Magnus Manske (talk) 08:40, 8 April 2014 (UTC)
Broken tool?
It seems that the Not-in-the-other-language tool has not been functioning for a couple of days. Can you repair it? I like the tool a lot :) Kind regards, Lymantria (talk) 12:51, 22 April 2014 (UTC)
- Done.--Magnus Manske (talk) 17:09, 22 April 2014 (UTC)
- Thanks :) Lymantria (talk) 17:19, 22 April 2014 (UTC)
Wikidata item creator doesn't work in zh-classicalwiki, zh-yuewiki
please fix it.--GZWDer (talk) 12:32, 4 April 2014 (UTC)
- Still not able to create items on these wikis, please fix it.--GZWDer (talk) 14:21, 24 April 2014 (UTC)
JIRA bug for the tree tool
Hi Magnus! I could find the tree product neither on your JIRA page nor elsewhere on JIRA with a general search.
http://tools.wmflabs.org/wikidata-todo/tree.html?lang=eo&q=2419397&rp=279&method=d3&live for therapist (Q2419397) has only one "child" via subclass of (P279). In such cases the text for the parent and the child are written one on top of the other. Please confirm and let me know the JIRA ID. Thanks in advance! לערי ריינהארט (talk) 13:22, 24 April 2014 (UTC)
- Hi, JIRA is dead/dying; the place would be WMF bugzilla (ugh), or for some tools Bitbucket. The error seems to be in the D3 layout algorithm anyway, won't fix that... --Magnus Manske (talk) 13:35, 24 April 2014 (UTC)
Some suggestions
- It should be possible to add string/number/coordinate/file values via toolscript.
- It should be possible to remove claims via toolscript.
- It should be possible to get/set labels/sitelinks/... via toolscript.
- It should be possible to search the values of a string property by regex.
--GZWDer (talk) 14:20, 24 April 2014 (UTC)
Widar error
When I grant OAuth permission to Widar, the following error message appears at the top of the page:
Warning: session_write_close(): write failed: No space left on device (28) in /data/project/magnustools/public_html/php/oauth.php on line 123
Warning: session_write_close(): Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php5) in /data/project/magnustools/public_html/php/oauth.php on line 123
What happened?--Micru (talk) 20:17, 29 April 2014 (UTC)
- Never mind, it works anyway. Perhaps the only thing missing is that Autolist only allows me to "select all" on the page that I'm seeing, so I can only claim 50 items each time.--Micru (talk) 21:00, 29 April 2014 (UTC)
- I'll try to find that bug, but if it works anyway... the 50 limitation is by design; I've had complaints about people editing too fast through this tool already. --Magnus Manske (talk) 21:34, 29 April 2014 (UTC)
- In any case, many, many thanks for this, it saves so much work. Btw, how do I query the items of a category that don't have a property? I looked for Chopin compositions without a composer; it showed no results, but it should have shown list of compositions by Frédéric Chopin by genre (Q1785783). I guess it is querying first and then looking for the category condition; however, in this case it should be the other way round. --Micru (talk) 08:44, 30 April 2014 (UTC)
- NOCLAIM cannot be the first command, it can only be used to restrict a previous one, otherwise the list gets too gigantic (all items that are not a composition? ALL items? Really?). Check the docs. --Magnus Manske (talk) 09:52, 30 April 2014 (UTC)
- I imagined so, but how can I make the category the first parameter of the query? (Or display the items contained in a category and then perform a query on them.)--Micru (talk) 11:00, 30 April 2014 (UTC)
- You can't. Find another claim to put as first parameter, or use toolscript. --Magnus Manske (talk) 11:41, 30 April 2014 (UTC)
- Exactly what I needed! With list.hasProperty I managed to filter out the items in the category that had the property already. Later on I did another experiment to get the wd item of the category first to extract all the site links, and then check all categories in all languages, but I didn't manage to get that far. Apparently list.getSitelinks doesn't work with category items... it might be because of line 442 of toolscript.js ("if ( v.page_namespace != 0 ) return ;"). --Micru (talk) 13:57, 1 May 2014 (UTC)
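The ordering rule above (NOCLAIM only restricts a previous result set, so it cannot open a query) can be checked mechanically before submitting. A small hypothetical client-side guard; WDQ's real parser is server-side and this is only a sketch of the rule:

```javascript
// Check whether a WDQ query starts with the set-restricting "noclaim"
// command, which WDQ rejects as the first command.
// Hypothetical guard for illustration; WDQ validates server-side.
function startsWithRestriction(query) {
  var first = query.trim().toLowerCase().split(/[\s\[]/)[0];
  return first === 'noclaim';
}

console.log(startsWithRestriction('noclaim[86]'));                      // true
console.log(startsWithRestriction('claim[31:207628] and noclaim[86]')); // false
```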
Magnus
I saw you added a statement that Lila Tretikov went to Moscow State University. Is that right? I thought she moved to the USA when she was 16? Filceolaire (talk) 19:50, 2 May 2014 (UTC)
- Just adding what English Wikipedia tells me ;-) --Magnus Manske (talk) 20:25, 2 May 2014 (UTC)
Autolists
Has anything changed with WDQ? I think queries like those in Talk:Q811979/class used to work, but now it just seems to run forever, even for simple queries. --Zolo (talk) 08:55, 26 April 2014 (UTC)
- Replace the _ with spaces and it works. --Magnus Manske (talk) 18:57, 26 April 2014 (UTC)
- Thanks, I should have found that myself :|. There also seems to be a problem with the in-Wikipedia-category option. It doesn't give anything, and the toolscript examples using category trees that you provide give 0 results, with error messages in the output (I copied one below for faster checking). --Zolo (talk) 08:44, 3 May 2014 (UTC)
p3 = ts.categorytree ( {
  project: "wikipedia",
  lang: "en",
  cats: [ // Single string, or array of strings
    "American scientists",
    "20th-century births"
  ],
  depth: 12,          // The category search depth
  redirects: "none",  // No redirects please!
  wikidata: "noitem"  // Only results without a Wikidata item
} ) ;
ts.show ( p3 ) ;
Help
(Translated from Spanish) Hello Magnus Manske, how do you do this with the Widar application [1.2]? Can you teach me? I appreciate your response. Warm regards, Leitoxx (talk) 19:32, 3 May 2014 (UTC)
- I am using this tool at the moment. There are also other tools that use Widar for editing. --Magnus Manske (talk) 20:16, 3 May 2014 (UTC)
Unauthorized bot
You seem to be operating an unauthorized bot. Please pause, I don't want to block you. See Wikidata:Administrators' noticeboard#Flooding of Special:RecentChanges. Multichill (talk) 12:34, 4 May 2014 (UTC)
- Blocked for 10 minutes to stop the bot. I hope the bot doesn't automatically resume after these 10 minutes. Multichill (talk) 13:20, 4 May 2014 (UTC)
- I guess your tool doesn't check for blocks. Started editing right after the 10 minutes block. Added an indefinite block now. Can be lifted by anyone once the tool has been stopped. Multichill (talk) 13:32, 4 May 2014 (UTC)
- pfff it's a silly decision. Pyb (talk) 14:51, 4 May 2014 (UTC)
- Thanks. Tool has stopped, editing is throttled. I'm curious how long this will take... --Magnus Manske (talk) 14:57, 4 May 2014 (UTC)
- The unblocking or the editing? Multichill (talk) 15:00, 4 May 2014 (UTC)
- Well, I throttled tool editing to 1/5th, and you just unblocked me, so wondering no more :-) I had the exact same tool running all day yesterday, and no one complained, so I figured it's OK. Can't get a bot flag for OAuth AFAIK, so now trying this. Maybe OAuth edits should be bot-flagged by default anyway? --Magnus Manske (talk) 15:20, 4 May 2014 (UTC)
- Probably different things to hash in:
- How many edits in one run?
- Does the user have the option to use a bot flag (the user is in group bot or flooders)
- Do you want to offer the user the option to enable/disable "bot"?
- Enable it by default or not?
- Throttle based on flag or no flag and the maxlag
- Are you at the hackathon next week? Would be a good topic to discuss and implement. Multichill (talk) 15:25, 4 May 2014 (UTC)
- Not at the hackathon, pesky work getting in the way :-( Maybe you could still bring it up though? My OAuth tool ("WiDaR") actually has high-frequency editing enabled; not sure what that does though. The various tools that use WiDaR would actually know the editing volume; Reasonator does single edits only, AutoList 1 and 2 do mass edits, etc. I'd be happy to pass the appropriate flag, if the API allows it and it's documented somewhere. --Magnus Manske (talk) 15:36, 4 May 2014 (UTC)
- You're still going rather fast. I think 4 edits per minute would be a good compromise for now. Multichill (talk) 15:30, 4 May 2014 (UTC)
- I'll try to limit it further. Pity, this is the last large category (en:"living people") that is clean enough to do this, and it will take forever now, instead of being done tonight. --Magnus Manske (talk) 15:36, 4 May 2014 (UTC)
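The 4-edits-per-minute compromise discussed above amounts to one edit every 15 seconds. A minimal sketch of such a throttle, assuming nothing about how the tools actually implement their rate limiting:

```javascript
// Illustrative throttle (not the tool's real code): returns how long
// the caller should wait before performing the next edit, so that
// edits are spaced at least 60000/editsPerMinute milliseconds apart.
function makeThrottle(editsPerMinute) {
  var interval = 60000 / editsPerMinute;
  var nextSlot = 0; // earliest permitted time for the next edit
  return function delayForNextEdit(now) {
    var wait = Math.max(0, nextSlot - now);
    nextSlot = Math.max(now, nextSlot) + interval;
    return wait; // caller would setTimeout(doEdit, wait)
  };
}
```

With `makeThrottle(4)`, successive calls at the same instant are spaced 15 seconds apart.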
TypeErrors in your scripts
Sorry for spamming your talk pages. These posts are meant as reminders. Here is an overview:
- User talk:Magnus Manske/authority control.js#addOnloadHook is deprecated
- User talk:Magnus Manske/consistency check.js#addOnloadHook is deprecated
- User talk:Magnus Manske/missing props.js#TypeError
- User talk:Magnus Manske/missing props.js#addOnloadHook is deprecated
- User talk:Magnus Manske/wikidata useful.js#TypeError
- User talk:Magnus Manske/wikidata useful.js#addOnloadHook is deprecated
I will try to debug the type errors a bit more. Please tell me how I can help you the most. --Thiemo Mättig (WMDE) 09:04, 6 May 2014 (UTC)
- Hi Thiemo, I have replaced addOnloadHook everywhere, and (hopefully) fixed one TypeError. The other one is a race condition; only a force-reload helps there... --Magnus Manske (talk) 10:18, 6 May 2014 (UTC)
- In my test, the messages have now shrunk down to that one. Thank you very much. I have left a suggestion there for the remaining one. --Thiemo Mättig (WMDE) 16:05, 6 May 2014 (UTC)
Improvement suggestions for Autolists 2
Hi, and thanks again for this tool. The throttling is really frustrating, I hope it can be removed soon ...
A few suggestions after transitioning from v1:
- As a French user, I really miss the ability to specify the default language used to retrieve labels and descriptions.
- I understand the upside of removing the 50-items-per-page pagination. But on large outputs, it's not always practical. I usually scan the list quickly to check for obvious errors. That's OK for a list of a few hundred items, but at a thousand it's not feasible. And huge results (all people without a nationality) crash my browser. Maybe there could be a parameter to specify the number of results to be displayed.
- The added statements could be sourced with an "imported from" property for category queries, like most bots do.
- Ideally, the edit summary for the modified item should even mention the category name used to infer the property. I don't know if that's feasible, but it's something I've often wished bots would do when I tried to understand strange statements.
While I am at it, a suggestion for WDQ: would it be possible to have commands like nolabel(langcode) and nodesc(langcode) ?
Thanks again,
--LBE (talk) 13:02, 10 May 2014 (UTC)
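The suggested nolabel(langcode)/nodesc(langcode) commands could also be approximated client-side. A sketch under assumed data shapes (a wbgetentities-style result object; the function is purely illustrative, not an existing WDQ feature):

```javascript
// Hypothetical nolabel(langcode) filter: given entities keyed by item
// ID, keep only those with no label in the requested language.
function noLabel(entities, langcode) {
  return Object.keys(entities).filter(function (qid) {
    var labels = entities[qid].labels || {};
    return !(langcode in labels);
  });
}
```

The same pattern would work for descriptions (a hypothetical nodesc) by checking a `descriptions` map instead.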
- Thanks for adding the chunks parameter. Just a bug report: in my Firefox, the labels and descriptions are only retrieved for the first 8-9 items. That's fewer than the list initially visible. On a big list, labels are retrieved as soon as you start scrolling, so no issue there. But if the list is small enough that you cannot scroll, there is no way to get the labels displayed.
- Thanks, --LBE (talk) 08:15, 13 May 2014 (UTC)
queries for datatype string etc.
Hi! The DDC value 400 is used via Dewey Decimal Classification (P1036) at language (Q315). http://dewey.info/class/400/about is the start of Dewey's language classification. The list of values I evaluated reads as:
- http://dewey.info/class/4/about Language ... http://dewey.info/class/419.415/about Irish Sign Language ... http://dewey.info/class/449.709449/about Provençal dialect ... http://dewey.info/class/496.3452/about Bambara ... http://dewey.info/class/499.992/about Esperanto language . The actual list is commented out here.
- a) How can I query for a specific value such as DDC 499.15 for http://dewey.info/class/499.15/about Aboriginal Australian languages?
- b) Is it possible to query for 499.1* or *92 (biographies)?
- c1) Is it possible to query for a range such as 5* TO 6* ( http://dewey.info/class/5/about Science TO http://dewey.info/class/6/about Technology )?
- c2) Is it possible to exclude values from a range?
- d) Is it possible to query for the presence of Dewey Decimal Classification (P1036) AND the absence of BNCF Thesaurus ID (P508)?
- e) Is it possible to query all persons with the date of birth (P569) or date of death (P570) for the 17th of May?
Thanks for any help! gangLeri לערי ריינהארט (talk) 12:49, 13 May 2014 (UTC)
- Sorry, I don't know much about DDC, or the correct usage here. Maybe try Property talk:P1036? --Magnus Manske (talk) 13:54, 13 May 2014 (UTC)
- Question a) is the simplest.
- example: The GND ID (P227) value 4065367-5 is used at Interlingua (Q35934) .
- Assume someone wants to identify the WD item with the GND value 4065367-5.
- What would that query be? Is there a URL? Or is there a tool where the answer is available via a dialog? לערי ריינהארט (talk) 14:11, 13 May 2014 (UTC)
- You can't do that on Wikidata. You can do it on WDQ, but only for specific values. This is the query for the GND example. Check the WDQ API for more options. --Magnus Manske (talk) 14:34, 13 May 2014 (UTC)
- http://tools.wmflabs.org/wikidata-todo/autolist.html?q=string[1036:"499.15"] is exactly what I need. Thanks a lot! לערי ריינהארט (talk) 15:33, 13 May 2014 (UTC)
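The query strings used in this thread can be assembled programmatically. The WDQ operators string[...], claim[...] and noclaim[...] are documented WDQ syntax; the helper functions below are my own illustration, not part of any tool:

```javascript
// Build a WDQ string-value query, e.g. string[1036:"499.15"].
function wdqString(prop, value) {
  return 'string[' + prop + ':"' + value + '"]';
}

// Build a "has property A but not property B" query, answering
// question d) above: items with P1036 but without P508.
function wdqHasWithout(propPresent, propAbsent) {
  return 'claim[' + propPresent + '] AND noclaim[' + propAbsent + ']';
}
```

The resulting string would then be passed as the q parameter to wdq.wmflabs.org/api or to autolist.html, as in the URL above.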
documentation
The new tools.wmflabs.org/wikidata-todo/autolist.html URL links to http://wdq.wmflabs.org/api_documentation.html. There, some links, such as the one labeled Items with VIAF string "64192849", point to wdq.wmflabs.org/api. That link still works today, but one could also add an alternate link, such as this one:
Regards gangLeri לערי ריינהארט (talk) 16:52, 13 May 2014 (UTC)
"Contains label" in autolist
Can't imagine what happens:
--Zolo (talk) 19:58, 13 May 2014 (UTC)
Adding items from Wikipedia without checking and creating duplicates
Q15995165 is a widar-enabled dup of IEEE Transactions on Signal Processing (Q15765422), with a bit of IEEE Transactions on Image Processing (Q15756967) thrown in[7]. Please do not create items for periodicals without first checking the authority control data in Wikidata and in whatever source you are importing from, as any automated edit is otherwise very likely to be erroneous. This is why people should only do automated edits after getting community approval for their bot. John Vandenberg (talk) 00:44, 12 May 2014 (UTC)
- I auto-add only items for pages for which a Wikipedia article exists, at least at the time; see the actual creation. It's always possible that this changes later, by no fault of mine. Also, if you bother to check the history, you'll see I did not add any statements to the item, that was someone else. --Magnus Manske (talk) 12:33, 12 May 2014 (UTC)
- I don't see how your response is appropriate given my message to you. Major disconnect happening? Yes, the Wikipedia article existed; it was created after the items were created in Wikidata. The items in Wikidata had authority control information which you should be using to prevent creating duplicates. That is, if you are not a bot. FWIW, today I found three more dups of journals created by you in the same batch.
- P.S. If you had bothered to check the history, you would have seen it was my bot that added the statements on your dup, instead of my bot informing me that there is a Wikipedia article about a journal that is not connected, so that I could smartly determine whether the item already exists in Wikidata. We used to have wonderful bots and smart humans who carefully joined wiki articles together; now we have Widar-enabled people creating duplicates all over the place, making it harder for interwiki bots to link unconnected Wikipedia articles to existing Wikipedia articles in other languages. John Vandenberg (talk) 13:46, 12 May 2014 (UTC)
- Major disconnect? Probably. My OAuth-based tools can not create Wikidata items unless there is a Wikipedia article without an item, which has to be supplied. By technical limitation, WiDaR can not create "blank" items, that is, without a Wikipedia page. Maybe that page should have been connected to an existing item instead; but we'll always have those, and then we have to merge them. OAuth-based item creation may have this problem slightly more emphasized, but it's neither new nor unique to this method. --Magnus Manske (talk) 16:26, 12 May 2014 (UTC)
Magnus, do you remember the original goal of this project? It was intended to centralise interlanguage links. I have the impression bot-flagged flooders like you, GerardM and GZWDer are not aware of this goal anymore. Creating "empty items" (e.g. without a single property) without checking them against existing ones is not really helpful. --Succu (talk) 20:02, 12 May 2014 (UTC)
- Succu, I do very well remember the initial goal of Wikidata. I would, however, appreciate it for you not trying to tell me what I'm "aware of"; it comes across as arrogant and condescending.
- As Zolo has pointed out below, in an ideal world, Wikidata would be kept up-to-date perfectly by volunteers, hand-checking each article against the existing data. In the real world, this is lacking; even the highly-watched "Living people" category on en.wp has, at the moment of my writing this, ~100 articles without an item. Other areas of en.wp are paid even less attention, not to mention other Wikipedias with fewer volunteers. I, for one, prefer articles in Wikipedias to have an item in Wikidata rather than not. Wikidata actually makes it easier to find duplicates, though I don't think we have fully explored the infrastructure to do this at maximal efficiency.--Magnus Manske (talk) 07:34, 13 May 2014 (UTC)
- False dichotomy Magnus. Wikidata also makes it simpler to detect and prevent duplicates, yet we have dumber bots than we had without Wikidata.
- There are people who care about specific domains of knowledge, or even narrower sets of topics. When Wikidata has authority control information and other important attributes like date of birth and death, taxon name, etc., these should be consulted before creating an item. Otherwise the bot operators (and widar tool users) are overruling the people who curate their area. This means the bots/tools have to get smarter as the quality of Wikidata improves.
- Having people who don't care about a domain creating duplicate items in that domain prevents others who are working on domain-specific areas from detecting new Wikipedia pages (e.g. using newitem.py on a relevant category, or your item creator) that need to be investigated before being dumped into Wikidata.
- Allowing people to indiscriminately create duplicates because a new page happens to appear on another Wikipedia is going to be the cause of a never-ending mess, as we want existing Wikidata items to be converted into Wikipedia pages in a constantly growing number of languages. People who care about data quality will get pissed off and leave. John Vandenberg (talk) 10:48, 13 May 2014 (UTC)
Certainly, directly linking articles to the right item is best, but from my experience, more often than not, the alternative to Widar (or bot) automated item creation has been leaving Wikipedia articles unconnected to Wikidata. That is not really better. It is easier to find duplicates directly in Wikidata than in 200 different versions of Wikipedia. That said, couldn't we find better ways to create items? In particular, I'd think the creator tool would benefit from allowing statements to be added to the items it creates, so that we have at least a basic idea of what they are about. --Zolo (talk) 20:49, 12 May 2014 (UTC)
- Thanks Zolo. I'll have a look at adding statements during creation, or at least creating an item list to feed into AutoList. Also, the tool could check for perfect title matches on Wikidata before creation; that might help avoid creating some duplicates. --Magnus Manske (talk) 07:34, 13 May 2014 (UTC)
- Update: By default, Creator will now skip the creation of items for pages where items with the exact page title as label or alias (in any language) exist. Also, created items are added to a list, which can be opened with a single click in AutoList2, to add statements. --Magnus Manske (talk) 09:34, 13 May 2014 (UTC)
- I think, if possible, a list of items which have the same label/alias as the page I am about to create an item for should be displayed. It would make it easy to link unlinked pages to suitable items using this list. And an option "still create such items" should be provided, which can be used after all linkable pages are linked.--GZWDer (talk) 09:49, 13 May 2014 (UTC)
- That list should be displayed already. Tell me if that's not the case, please, with an example. To create items anyway, re-do the same run with the checkbox unchecked. --Magnus Manske (talk) 10:45, 13 May 2014 (UTC)
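The skip-on-matching-label behaviour described above can be sketched as a pure function. This is my assumption of the logic, not the actual Creator code, and the data shape (labels as language-keyed strings, aliases as language-keyed arrays) is assumed for illustration:

```javascript
// Skip creating an item when any existing item has a label or alias,
// in any language, exactly matching the page title.
function shouldSkipCreation(pageTitle, existingItems) {
  return existingItems.some(function (item) {
    var names = [];
    Object.keys(item.labels || {}).forEach(function (lang) {
      names.push(item.labels[lang]);
    });
    Object.keys(item.aliases || {}).forEach(function (lang) {
      names = names.concat(item.aliases[lang]);
    });
    return names.indexOf(pageTitle) !== -1;
  });
}
```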
- Thank you Magnus. Much appreciated. Zolo, while it is easier to find duplicates directly in Wikidata than in the search interfaces of 200-plus Wikipedias, that isn't a valid comparison. The bots used to access 200-plus Wikipedias with ease. And there are very few people on Wikidata searching for and removing the duplicates - least of all the people who are creating them, and definitely not the people who are using Widar-enabled tools to mass create/edit items. See e.g. Herman Bieling (Q5629532), a dup of Herman Bieling (Q2277308). Both were updated by the same person, using widar, and existed for over a year. That used to be a simple interwiki case - exact title matches on the primary-language wikis. Instead we have lots of busy work by bots, and the cleanup is left to others. John Vandenberg (talk) 10:35, 13 May 2014 (UTC)
- John, there will always be cleanup to do; we have learned that much from Wikipedia :-) I'm all for minimizing the need for cleanup where possible, but any kind of editing (manual or tool/bot-based) can never be error-free. Bot edits are, by nature, more error-prone than manual edits, but they also save tremendous amounts of manual work when they get it right. There is a cost for clean-up, but there is also a cost for doing everything manually (and then cleaning that up). For example, I have added sex or gender (P21) for ~200K items in the last 10 days or so, using one of my tools. I'm sure I got some of them wrong. But, will the "cost" of cleaning the few errors over time really exceed the saved work of adding all of those 200K statements manually? I doubt that. Meanwhile, more (reasonably correct) information in Wikidata will make the dataset more valuable, attracting more interest, people, checker bots, etc. --Magnus Manske (talk) 10:45, 13 May 2014 (UTC)
@Magnus Manske:, above you said that this was not possible with your updated tool because of David Malet Armstrong (Q1173590). A bug perhaps? John Vandenberg (talk) 05:14, 14 May 2014 (UTC)
- It's a deleted page. I'm not an admin here, so that's all it's telling me. Check the code here (function createItemFromPage); it's the only function to create items, and it forces a site/page. I don't see how it could work otherwise. Bug in MediaWiki? --Magnus Manske (talk) 07:23, 14 May 2014 (UTC)
- You can determine the relevant parts of the deleted page from this edit to David Malet Armstrong (Q1173590). I looked at that code and didn't see the block which prevents "creation of items for pages where items with the exact page title as label or alias (in any language) exist." John Vandenberg (talk) 07:34, 14 May 2014 (UTC)
- That only shows that no page link was merged, not that it was created without one. The function I pointed to passes a site/page structure to the item creation; if Wikidata creates the item even though the page is already linked elsewhere, it's a bug in MediaWiki. --Magnus Manske (talk) 07:40, 14 May 2014 (UTC)
- The link shows an Italian page link and Italian label were merged. The item Q16872877 was created with a label (in Italian) identical to a label of another item (Q1173590) in a different language. I thought you said you had updated your code to prevent that happening. John Vandenberg (talk) 07:50, 14 May 2014 (UTC)
- As of yesterday, I prevent same-label-creation by default (through Creator, though not toolscript); but since you may actually want to create such items, one can turn the checkbox off. Forcibly preventing the creation of an item because an identical label exists elsewhere would be quite silly; Wikidata is using Q IDs to allow multiple items having the same label. --Magnus Manske (talk) 07:54, 14 May 2014 (UTC)
- Could you add case-insensitivity to that, at least for the first letter? I suspect Les Sœurs Boulay (Q16885778) ('Les sœurs Boulay'), a dup of Les Sœurs Boulay (Q16390125), slipped through as a result. Many items on Wikidata have a different capitalisation of the first letter, e.g. because the word is not a proper noun in English, while other languages have other rules causing the first letter to be lower case. John Vandenberg (talk) 07:52, 16 May 2014 (UTC)
- Also, I think it should prevent same-sitelink creation by default (i.e. creating an item for a page in one language when a page with the same name already exists as a sitelink elsewhere on Wikidata). John Vandenberg (talk) 07:52, 16 May 2014 (UTC)
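A sketch of the two possible levels of case-insensitive matching being discussed (my own illustration, not the tool's code). Note that in the example above the difference ('Sœurs' vs 'sœurs') is not in the first character of the title, so only a fully case-insensitive comparison would catch it:

```javascript
// Normalise only the first character before comparing.
function firstLetterInsensitive(a, b) {
  var norm = function (s) { return s.charAt(0).toLowerCase() + s.slice(1); };
  return norm(a) === norm(b);
}

// Compare titles fully case-insensitively; JavaScript's toLowerCase
// handles non-ASCII letters such as the ligature Œ.
function fullyCaseInsensitive(a, b) {
  return a.toLowerCase() === b.toLowerCase();
}
```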
Magnus, could you add a mandatory checkbox to the Creator tool asking "Have you checked this list for potential duplicates?" or something similar, to allow its good uses to continue, and advise people that bad uses are the user's own responsibility. John Vandenberg (talk) 05:31, 16 May 2014 (UTC)
- I could, but if they uncheck the current checkbox to create dupes, a second checkbox won't stop them. Note that there is another tool of mine which uses the same OAuth (WiDaR). That one is basically free-form JavaScript, written/pasted by the user, with some convenience functions thrown in. It would be extremely hard to even replicate the "warning mode" from Creator there. In the end, it's not the tools that are wrong, it's people using them in the wrong way. --Magnus Manske (talk) 07:55, 16 May 2014 (UTC)
- A lot of people are using it wrong; at some point, that problem becomes the tool's responsibility. E.g. here we see another user, user:Ahoerstemeier, creating items without checking for duplicates. Even yourself, creating ~2000 species items without checking. (I spot-checked; every one of the 20 I checked already existed.)
- I am aware of the toolscript; I haven't played with it myself, but it isn't my concern as, by its very nature, it has barriers to entry. John Vandenberg (talk) 08:36, 16 May 2014 (UTC)
- The item dupes I created became real dupes - in the example I mentioned in project chat, widar showed me that no item existed for that article in Thai (which I can read a bit), but in fact there already was one with the same sitelink. I thought that widar only lists those articles which have no wikidata item yet, but that functionality is/was partially broken. That widar cannot know there are items which may be the same in another language is another problem. Ahoerstemeier (talk) 09:02, 16 May 2014 (UTC)
- The last sentence shows that you haven't really figured out what's going on. WiDaR is the "OAuth conduit" between several tools and Wikidata. WiDaR doesn't show anything - you are talking about one of the tools using WiDaR. I assume you mean Creator. If you try to create an item for article X, it searches for labels and aliases identical to "X". If it finds any, and the checkbox is checked, it does not create a new item, but alerts you that an item with that label/alias exists, and tells you which one. And yes, pages that Creator suggests should not have an item; if one does, maybe it's a sync lag of the Labs databases? --Magnus Manske (talk) 09:15, 16 May 2014 (UTC)
- So now I know the difference between Widar and the item creator: there was/is a problem with the item creator. Nongsuawittayakhom School (Q13027327) was created in May 2013, and when I used the item creator in April 2014 to add the schools in Thailand not yet on Wikidata, it created Nongsuawittayakhom School (Q16307415) with the same sitelink. The way I understood the item creator, this article shouldn't have appeared in the list of wikipedia pages after the category scan. Sync lag can be ruled out, since almost a year passed between the two. IIRC that checkbox wasn't there yet, so the check for identical labels wasn't done. But maybe the item creator isn't the culprit in this case, but rather some wrongly indexed database entries - I was also able to manually create a real dupe with the same sitelink for Sanom (Q15257719) even though Sanom (Q13025275) already existed, something the UI should have stopped me from doing. Ahoerstemeier (talk) 09:58, 16 May 2014 (UTC)
I assume it's a Wikidata/MediaWiki bug. Checking the sitelinks for both of your items (both of which display the sitelink on-wiki):
MariaDB [wikidatawiki_p]> select * from wb_items_per_site WHERE ips_item_id IN (13027327,16307415);
+------------+-------------+-------------+-------------------------+
| ips_row_id | ips_item_id | ips_site_id | ips_site_page           |
+------------+-------------+-------------+-------------------------+
|  544166052 |    16307415 | thwiki      | โรงเรียนหนองเสือวิทยาคม |
+------------+-------------+-------------+-------------------------+
1 row in set (0.00 sec)
Only Nongsuawittayakhom School (Q16307415) has the sitelink (in the database; both have them in the JSON text object). If the sitelink of Nongsuawittayakhom School (Q13027327) was never registered, my tool would have been unable to see it. That would also explain why Wikidata allowed this duplicate sitelink to be created, and why Creator doesn't suggest it to be created yet again (now the sitelink exists). --Magnus Manske (talk) 10:10, 16 May 2014 (UTC)
Suggestion: Create a "Widar user" list
This would be like Wikipedia:AutoWikiBrowser/CheckPage (Q11214943): users on this list could edit faster via widar (other users would be limited to 1-4 edits/min), and could use high-risk tools like toolscript. The page should be located at Wikidata:Widar/CheckPage.--GZWDer (talk) 15:09, 16 May 2014 (UTC)
Non-English labels in autolist
Hello Magnus, for some reason I get fallback to non-English labels here, but not here or there? --Zolo (talk) 19:36, 9 May 2014 (UTC)
- You'd always get English if available; in your first example, there were just a few English labels. I have now changed it to prefer the Wikipedia language, if available. --Magnus Manske (talk) 10:56, 10 May 2014 (UTC)
- Thanks, the strange cases were those where I did not get any label, just Q numbers. Things appear to be fixed for my second example, but in [8] there are still two items for which I do not get any label even though there is a French label in Wikidata. --Zolo (talk) 11:10, 10 May 2014 (UTC)
- Stale cache? I get four items, all with French labels. --Magnus Manske (talk) 11:27, 10 May 2014 (UTC)
- I retried and now get 0 labels (Firefox). I have just tried on IE and got 0 labels too. --Zolo (talk) 11:40, 10 May 2014 (UTC)
- [9] still gets me the four French labels. --Magnus Manske (talk) 12:24, 10 May 2014 (UTC)
- I still don't. As LBE says, it seems to be restricted to items lists that are too short to be scrolled down. --Zolo (talk) 18:01, 16 May 2014 (UTC)
- Try it now. Even if it doesn't show right away, it should update when you move the mouse in/out of the table. --Magnus Manske (talk) 11:20, 17 May 2014 (UTC)
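The bug pattern here (labels only loading on scroll, so an unscrollable list never triggers a load) suggests the fix is to compute the visible rows up front rather than waiting for a scroll event. A sketch of that computation, assuming fixed-height rows; this is my guess at the approach, not the actual autolist code:

```javascript
// Return [firstIndex, lastIndex] of the rows inside the visible
// window, so labels can be fetched for them immediately on render.
// A list too short to scroll reports all of its rows as visible.
function visibleRowRange(scrollTop, viewHeight, rowHeight, rowCount) {
  var first = Math.floor(scrollTop / rowHeight);
  var last = Math.min(
    rowCount - 1,
    Math.ceil((scrollTop + viewHeight) / rowHeight) - 1
  );
  return [first, last];
}
```

Calling this once at render time (and again on scroll or mouse events) would cover both the short-list and long-list cases.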