This was an interesting request we got last week. We were notified that one of the links an instructor placed in a course was bad. Bad, in the sense that it was distributing malware. That's bad.
We were asked if we could somehow check for that in Moodle.
I had never thought about it before. Now, this seems kind of like content filtering in schools: just about impossible to do, but you gotta start somewhere. A little research shows Google's Safe Browsing service might suit. I am not a programmer in the slightest, but I'm going to try, because it would be a terribly nice feature to have long term. (Short term, I'm going to export the database of URLs and munge them into a form I can query with the (vastly easier) Lookup API.)
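For the short-term approach, a minimal Python sketch of a batch check against the Safe Browsing v4 Lookup API (`threatMatches:find`) might look like this. The client name is a made-up placeholder, and you'd substitute your own API key:

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: obtain one from the Google Cloud console
LOOKUP_URL = "https://safebrowsing.googleapis.com/v4/threatMatches:find?key="

def build_payload(urls):
    """Build the JSON body for a v4 threatMatches:find request."""
    return {
        "client": {"clientId": "moodle-link-check", "clientVersion": "0.1"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }

def check_urls(urls, api_key=API_KEY):
    """POST a batch of URLs and return any threat matches Google reports."""
    req = urllib.request.Request(
        LOOKUP_URL + api_key,
        data=json.dumps(build_payload(urls)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    # An empty JSON object in the response means no matches for this batch.
    return result.get("matches", [])
```

The Lookup API takes batches of URLs per request, so even a large exported list only needs a handful of calls.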
As I said, I'm not a programmer, and the full Safe Browsing (Update) API, while much better suited for ongoing use, is much harder to implement. And I'm going to have to wade through learning how to do all this git stuff and whatnot. So anybody who's interested, please speak up!
What about a script that gets all the external URLs, runs them through Safe Browsing via the API, and gives you a report? Maybe one in siteroot/admin/.
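Gathering the URLs could start with something like this; the SQL assumes a standard Moodle install with the default `mdl_` table prefix (the `mdl_url` table holds URL resources), and the regex is only a rough way to pull extra links out of HTML blobs like section summaries:

```python
import re

# URL resources plus the course they live in, for the report.
# Adjust the table prefix if your site doesn't use "mdl_".
URL_QUERY = """
SELECT u.externalurl, c.fullname, c.id
  FROM mdl_url u
  JOIN mdl_course c ON c.id = u.course
"""

HREF_RE = re.compile(r'href=["\'](https?://[^"\']+)["\']', re.IGNORECASE)

def extract_links(html):
    """Pull external http(s) links out of a blob of HTML (labels, summaries, etc.)."""
    return HREF_RE.findall(html or "")
```

Links also hide in labels, pages, and forum posts, so a thorough version would run `extract_links` over those tables' intro/content fields too.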
Personally, I think we only need details of which links are bad: the total number checked, then a list of the bad ones and which page and topic number they're on.
I think this needs to be a full plugin, because the Google API needs a key; otherwise it blocks you after a certain number of requests. But yes, it's basically a script to check and report.
I don't know where it needs to live. Ideally, I'd like to check new links when cron runs, but I don't know how to make sure that gets reported. Then I'd like the option of a scheduled run and report; weekly should be fine. It needs to throw away duplicate links, flag the bad ones, and then list them as you said, along with course name and number.
I think I had about 30% dupes in the URL list, and probably about 50% near-dupes, FYI.
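Collapsing those dupes and near-dupes probably needs a normalization pass first. A rough sketch, where the normalization rules (lowercase host, drop fragments, strip a trailing slash) are just guesses at what makes two links "the same":

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Canonical form for duplicate detection: lowercase scheme and host,
    drop the #fragment, strip a trailing slash from the path."""
    parts = urlsplit(url.strip())
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       path, parts.query, ""))

def dedupe(urls):
    """Return unique URLs, keeping the first-seen spelling and order."""
    seen = set()
    unique = []
    for u in urls:
        key = normalize(u)
        if key not in seen:
            seen.add(key)
            unique.append(u)
    return unique
```

Querying once per unique URL instead of per link should roughly halve the number of API calls, given those dupe rates.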
Off to get an API key, for today's work.