Validate External Links

Developed by Iritscen, Validate External Links ("ValExtLinks" for short, or "Val" for even shorter) is a script that helps fight link rot on OniGalore.

Background

While MediaWiki makes it easy to find bad links to pages on our own wiki, marking them in red and providing tools like Wantedpages, it performs no automatic check of external links. MediaWiki compiles external links into a table, but it never contacts the URLs to see whether they still respond. The most you can do is search through that table, and even that isn't implemented well. Over the years, many links on our wiki have gone dead as the Web has changed and various file hosts have gone out of business.
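
For reference, the external links that MediaWiki records can be read back out through its API. Here is a minimal sketch using curl and jq, not part of Val itself; the api.php endpoint below is a placeholder, not necessarily OniGalore's actual address:

  # List pages and the external URLs they contain, via MediaWiki's
  # "exturlusage" API query. The endpoint URL is a placeholder.
  curl -s 'https://example.org/w/api.php?action=query&list=exturlusage&eulimit=50&format=json' \
    | jq -r '.query.exturlusage[] | "\(.title)\t\(.url)"'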

Here's how ValExtLinks helps: at 6:20am and 2:20pm (GMT) each day, a script written by Alloc dumps the wiki's external links table to this location. Val, which runs on Iritscen's computer at 3:00pm (GMT) each day, then walks through the table and looks for URLs that return problematic codes such as 404. It also detects lesser problems with links, described below. Val then suggests fixes for these links.
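
In outline, the core check resembles the following Bash sketch. The one-URL-per-line reading of extlinks.csv is an assumption, and the real script's parsing, retries, and reporting are more involved:

  #!/bin/bash
  # Rough sketch only: fetch each URL's HTTP status code and flag
  # error responses and redirects. Not the actual ValExtLinks logic.
  while read -r url; do
    code=$(curl --silent --head --max-time 10 \
                --output /dev/null --write-out '%{http_code}' "$url")
    rc=$?
    if [ "$rc" -ne 0 ]; then
      echo "$url (000-$rc)"     # curl got no HTTP response at all
    elif [ "$code" -ge 400 ]; then
      echo "NG $url ($code)"    # error response, e.g. 404
    elif [ "$code" -ge 300 ]; then
      echo "RD $url ($code)"    # redirect; needs evaluation
    fi
  done < extlinks.csv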

How to fix link issues

  • NG: In most cases, fixing an "NG" link means finding the desired web page in the Internet Archive's Wayback Machine and linking to that archived page instead. In some cases, an NG link is not recoverable and should either be removed from the page or, if the link was part of a conversation and its absence would be confusing, wrapped in nowiki tags like this to prevent it from showing up in future reports.
    • Val automatically queries the Archive for the latest snapshot of each NG page and puts the returned snapshot URL in its report (a sketch of this query appears after this list). Note that you still have to verify this link by clicking on it, as it may not have the correct content. You may have to go further back in the Wayback Machine to find the proper snapshot to use. Sometimes the Archive simply never got around to archiving a given site; in that case, follow the advice above about deleting the link or marking it with nowiki tags.
  • RD: The site is redirecting the browser to a new page. The new page should be evaluated, and if it has the content we intended to link to, the link should be updated to point to the new location. However, many redirects are actually "soft 404s" that simply send the browser to the site's main page. In that case, an RD link needs to be treated like an NG link (see above). The second sketch after this list shows how the redirect's target can be recovered.
  • EI: An external link (bare URL) to a page on our own wiki that could simply be an intrawiki link. Sometimes an "external internal" may seem necessary, but there's a special wiki feature that lets you avoid it:
    • Linking to a specific revision of a page used to require the full URL, like this. In fact, there's no need to link to the page at all, as the "ID" of an edit, like the one you see in that sample URL, is unique wiki-wide. All you need to do is supply the revision ID to the Special:Permalink page, like this: Special:Permalink/7685.
    • If you need to link to a diff between two revisions of a page, or between two different pages, plug the old and new revision numbers into the Special:Diff page like this: Special:Diff/21491/21492 (no need for page names, as explained above).
    • If there's no provision like this for replacing a bare URL with a smarter link, see "Exceptions" below to remove the link from the report.
  • IW: An external link (bare URL) that could be an interwiki link. Interwiki links are shorter and more resistant to rot. The suggested interwiki link markup will be given in the report. For foreign-language Wikipedia pages, you can add a language code, e.g. [[wp:de:Test]] for the German version of the page.
  • (xxx): The HTTP response code (see reference HERE).
  • (000-xx): The Unix tool 'curl' did not get an HTTP response code, but instead returned this exit code (see HERE). The most common by far is "000-28", a timeout. The second sketch below shows how these two kinds of codes arise.
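
As mentioned under "NG" above, Val asks the Internet Archive for its newest snapshot of each dead URL. The Archive offers a public "availability" API for exactly this; here is a minimal sketch of such a query (the dead URL is hypothetical):

  # Ask the Wayback Machine for its newest snapshot of a dead URL.
  dead_url='http://example.com/lost-page'
  curl -s "https://archive.org/wayback/available?url=${dead_url}" \
    | jq -r '.archived_snapshots.closest.url // "no snapshot archived"'

As noted above, a returned snapshot still has to be verified by hand, since the closest capture may not contain the content the original link pointed to.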
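
And here is a sketch of how the report's two kinds of codes arise: an HTTP response code comes from the server, read out via curl's --write-out option, while a 000-xx code is curl's own exit status when no HTTP response arrived at all (exit code 28 is curl's timeout). The %{redirect_url} variable also recovers the target of an RD link:

  url='http://example.com/maybe-moved'
  out=$(curl --silent --max-time 10 --output /dev/null \
             --write-out '%{http_code} %{redirect_url}' "$url")
  rc=$?
  if [ "$rc" -eq 0 ]; then
    echo "HTTP: $out"   # e.g. "301 http://example.com/new-home" for an RD link
  else
    echo "000-$rc"      # e.g. "000-28" when the request timed out
  fi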

Exceptions

Some links simply have to be presented the way they are, and some links return error codes but actually work fine. These links can be added to the exceptions list in order to hide them from future reports.
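
One plausible way such a list could be applied, shown only as a sketch (the file names and the one-URL-per-line format are assumptions, not necessarily the script's actual mechanism):

  # Drop any report line that mentions a URL from the exceptions list.
  # -F: patterns are fixed strings; -f: read them from a file;
  # -v: keep only the lines that do not match.
  grep -vFf exceptions.txt raw_report.txt > report.txt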

Coming features

  • It seems that many external links no longer display the content they were intended to display. In many cases, web sites silently redirect the user to their main page without returning the response code that indicates the content was not found. Only visual inspection of the pages can catch these issues. Once we have dealt with the low-hanging fruit of pages that return "NG" codes, the screenshot feature in the script will be activated, which will let us easily confirm whether the "OK" links (and snapshots from the Internet Archive) are actually loading the intended pages.
  • Over time, the exceptions list should be audited so it doesn't acquire cruft. Val needs to mention URLs in the list that are no longer in extlinks.csv, as well as URLs that returned a different code from the expected one.
  • Val will eventually expand and test the interwiki links; currently there is no way to know whether they've gone bad.
  • Test the section anchors found in interwiki and intrawiki links. Even an intrawiki link will only show up as a redlink if the target page itself does not exist; MediaWiki does not check whether the section anchor exists.
  • If possible, Val will eventually start warning about externally-hosted images, as many of these have gone down with their file hosts over the years.

Source code

The Bash script and related files are found here.