select * from `mdl_files` where `contenthash` = '0d74d5947507bca548de06af166f8513fa3a0048';
What does that show/list?
The search pulls up 29 rows, all with the same filename. By filearea, one row is 'private', one is 'content', and the rest are 'draft'. All have the same contextid = 102 except the filearea = 'content' row, which has contextid = 4231 and component = mod_page.
Workaround? We'll give Moodle what it's looking for ... but not really.
Do it manually, with whatever tool you have for browsing and working with files ...
First, browse to moodledata and see whether there is a '0d' directory:

1. If there is NOT, create one. Set permissions and ownership like the other folders you see in that area.
2. If '0d' does exist, see whether there is a '74' directory inside it. If not, create one (same permissions/ownership as the folders/files above it).
3. Inside '74', is there a file named '0d74d5947507bca548de06af166f8513fa3a0048'? If not, create a text file with that as the filename, with the same ownership/permissions as the files/folders above it.
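The steps above can be sketched as a shell script. This is a minimal sketch, not a definitive procedure: the moodledata path, the filedir subdirectory (where Moodle 2.x keeps its file pool), and the www-data owner are assumptions — adjust them to your install. The demo defaults to a throwaway /tmp path so it is safe to run as-is.

```shell
#!/bin/sh
# Recreate the placeholder that Moodle expects for a given contenthash.
# MOODLEDATA defaults to a hypothetical demo path -- point it at your real moodledata.
set -eu

MOODLEDATA="${MOODLEDATA:-/tmp/moodledata-demo}"
HASH="0d74d5947507bca548de06af166f8513fa3a0048"
D1=$(printf '%s' "$HASH" | cut -c1-2)   # first-level dir:  '0d'
D2=$(printf '%s' "$HASH" | cut -c3-4)   # second-level dir: '74'

mkdir -p "$MOODLEDATA/filedir/$D1/$D2"       # create '0d/74' if missing
touch "$MOODLEDATA/filedir/$D1/$D2/$HASH"    # placeholder file named after the hash
chmod 755 "$MOODLEDATA/filedir/$D1" "$MOODLEDATA/filedir/$D1/$D2"
chmod 644 "$MOODLEDATA/filedir/$D1/$D2/$HASH"
# On a real install, also match the web-server owner, e.g.:
# chown -R www-data:www-data "$MOODLEDATA/filedir/$D1"
echo "created $MOODLEDATA/filedir/$D1/$D2/$HASH"
```

Note the two directory levels are just the first two byte-pairs of the hash, which is why the missing pieces were '0d' and '74'.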
There was a folder called '0d'. However, there was no '74' within it, so I created one, changed the permissions to match the top folder, and then inside '74' created a text file with the filename as specified.
From the trace, it looks like Moodle was trying to update a file that was still in draft.
When you run cron jobs, there are routines that deal with the file system and move files from draft to a component area (nothing really 'moves' physically; rather, the reference in the DB is updated). Are there any errors when running cron?
Tried running a cron job - we shall see.
Since you've been running for a long time, is it the same database? Do you have any tools to check the status of tables - especially the mdl_files table?
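If you can reach the database with the mysql client (or phpMyAdmin's SQL tab), the server's own statements will report table health and size without extra tooling. A sketch, assuming MySQL/MariaDB and the default mdl_ table prefix:

```sql
-- Check the files table for corruption
CHECK TABLE mdl_files;

-- Row count, storage engine, and on-disk size for the files table
SHOW TABLE STATUS LIKE 'mdl_files';

-- Approximate size of every table in the current database, largest first
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.tables
WHERE table_schema = DATABASE()
ORDER BY size_mb DESC;
```

The last query is handy when a host imposes a size cap, since it shows whether mdl_files really is the offender.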
I have been having trouble recently with my mdl_files table, as draft rows are repeated for several of the same files. Every few weeks my server provider tells me to reduce the database to under 500 MB. I manually search for all draft files and delete them from the database. It is a pain, but it seems to work. I still need to set up a cron job that will take care of this, but I'm not sure how.
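Scheduling Moodle's cron regularly is usually enough to keep stale draft files pruned, since (as noted earlier in the thread) cron includes the cleanup routines for the files table. On Moodle 2.x and later a typical crontab entry looks like the fragment below; the PHP binary path and the /var/www/moodle install path are examples you will need to adjust (older Moodle versions instead fetched admin/cron.php over HTTP, e.g. with wget).

```
# Run Moodle's cron every 15 minutes as the web-server user.
# Paths are examples -- adjust to your PHP binary and Moodle install directory.
*/15 * * * * /usr/bin/php /var/www/moodle/admin/cli/cron.php >/dev/null 2>&1
```

Add it with `crontab -e` as the web-server user so the cron-created files get the right ownership.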