Moodle plugins directory: Link crawler robot | Moodle.org
Link crawler robot
Administration tool ::: tool_crawler
Maintained by Catalyst IT, Brendan Heywood, Daniel Thee Roperto
An admin tool robot crawler which scans your Moodle site for broken, large or slow links.
Latest release:
162 sites
73 downloads
36 fans
Current versions available: 2
It is an admin tool with a Moodle cron task, but it reaches into your Moodle via curl, effectively from outside Moodle, and scrapes each page, parses it and follows links. Because of this architecture it will only find broken links that actually matter to students. Because it comes in from outside it needs to authenticate, so it has a dependency on the moodle-auth_basic plugin. It is recommended that you set up a dedicated 'robot' user who has read-only access to all the site pages you wish to crawl. You should give the robot similar capabilities to those that real students will have.
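The crawl loop described above (authenticate from outside via HTTP Basic auth, fetch a page, extract the links to follow) can be sketched as follows. This is illustrative Python, not the plugin's actual PHP code; the URL and credentials are placeholders.

```python
# Minimal sketch of an external crawler: fetch a page as the robot user
# via HTTP Basic auth (what auth_basic enables), then collect its links.
import urllib.request
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags; all other content is ignored."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fetch_links(url, username, password):
    """Fetch one page as the robot user and return the links found on it."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, username, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    with opener.open(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# The parsing step works on any HTML fragment, e.g.:
sample = '<p>Hello <a href="/course/view.php?id=2">Course</a></p>'
p = LinkExtractor()
p.feed(sample)
print(p.links)  # ['/course/view.php?id=2']
```

A real crawler would additionally de-duplicate visited URLs, stay within the site, and record response codes, sizes and timings, which is the data the plugin's reports are built from.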
Contributors
Catalyst IT (Lead maintainer)
Brendan Heywood
Daniel Thee Roperto: Coder at Catalyst IT Australia
Running Moodle 3.1+ (Build: 20160616) on php5/apache2 with mysql (5.5.47-0+deb6u1)
Thank you for reporting this, we updated the auth_basic plugin.
You can find it here:
https://moodle.org/plugins/auth_basic
Cheers,
Daniel
Cool, thank you! But there is another problem, in the function crawler->reset_for_recrawl. I created a new issue on GitHub. Thank you!
-Max
The report is currently displayed on screen. Any plan to add an export option so users could download a copy of the report in CSV/Excel format for offline analysis and data manipulation? This would greatly help to search for and identify broken links in a particular course efficiently, without scrolling through it page by page. Also, any part of the report could then be distributed to the respective course developer(s) who do not have administrator access, so they can review the broken links offline.
I've logged that new feature idea here:
https://github.com/central-queensland-uni/moodle-tool_crawler/issues/23
Any privacy or security issues using this?
There should be no privacy issues, as the crawler results are only visible to admins and course managers by default. Also, what the robot can see is completely configurable via Moodle's capabilities, so if anything sensitive is being scraped you can turn it off. The robot is only interested in links; it ignores all other content. So the only real privacy issue could be the visibility of an external link, but either way all content that is scraped is visible to normal course admins / students anyway.
The main security issue is making sure the robot's credentials aren't leaked, as then someone could gain access to whatever the robot can see. But this is exactly the same as managing credentials for any other user. The robot user should not have any write permissions to anything at all, and if you need to roll the robot's password, that is trivial.
If you, or anyone else, identify any privacy or security issues I've missed, please ensure they are logged here: https://github.com/central-queensland-uni/moodle-tool_crawler/issues
thanks!
Great idea, and something I assume many people were waiting for. Thanks for providing the plugin!
Your solution technically seems to work from the comments here, but I think at least for some it answers the wrong (business) question. In our university, responsibility for maintaining course content such as external links does not lie with the (poor) administrator or cron job running the plugin, but with the trainer of the respective course.
For full business utility, the plugin should sort the dead links by trainer and course and send an email to every trainer with his or her broken links, ordered by course and, if possible, giving the link's title within the course along with the link target.
I assume that, due to the limited capabilities of individual Moodle plugins, the email functionality would require an additional plugin. In any case, an individual email per trainer avoids the data privacy issues that might arise from sending the full report to all trainers, even in Germany with its strict privacy rules.
Please let me know what you think of this idea.
Thank you very much, best regards,
Gero
This plugin currently provides reports at both the site level and, filtered, at the course level, so that each course coordinator can see just what affects their courses. See Course > Course administration > Reports > Link crawler robot (4 new reports).
If they find issues they can fix them on the spot, and then from those course-level reports flag a URL for recrawling; flagged URLs get a higher priority than the background en-masse crawling. We could very easily add an email report within this plugin if we wanted to, but for the client that sponsored this plugin that was actually undesirable. Broadly, I would like to evolve this plugin towards a similar set of features and reports to the Google webmaster tool, aka Search Console.
I'd be happy to implement some sort of email-based reporting, as long as it was opt-in and configurable at the site level. If you want to sponsor this new feature please contact us:
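The recrawl ordering described here (flagged URLs jump ahead of the background en-masse crawl) can be sketched as a simple priority queue. This is hypothetical Python for illustration, not the plugin's actual PHP/database-backed queue.

```python
# Hypothetical sketch: URLs flagged from a course report are crawled
# before URLs queued by the normal background crawl.
import heapq
import itertools

PRIORITY_FLAGGED = 0     # flagged for recrawl from a course-level report
PRIORITY_BACKGROUND = 1  # normal background en-masse crawl

class CrawlQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break within a priority

    def push(self, url, priority=PRIORITY_BACKGROUND):
        heapq.heappush(self._heap, (priority, next(self._counter), url))

    def pop(self):
        return heapq.heappop(self._heap)[2]

q = CrawlQueue()
q.push("https://example.moodle/course/view.php?id=1")
q.push("https://example.moodle/mod/page/view.php?id=9")
q.push("https://example.moodle/broken-link", priority=PRIORITY_FLAGGED)
print(q.pop())  # the flagged URL comes out first
```

The counter keeps ordering stable within each priority band, so background URLs are still crawled in the order they were queued.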
https://www.catalyst-au.net/contact-us
Brendan
Any update for Moodle 3.4?
Ricardo