Hi everyone
After some advice on the best approach for cross-database support of my requirement: executing a SQL query to update one table using data from a joined table.
Specifics
I have a data set, imported from another system, that contains information I want to import into a Moodle custom module - approximately 200,000 rows.
This external data, for simplicity of explanation, contains two pieces of info - USERNAME and SCORE - and has been imported into a new custom table in Moodle. This table also contains a column USERID, which is NULL by default. The imported USERNAME will match the Moodle username, BUT there will be records here without an equivalent Moodle user account.
I want to update the USERID column in this table with the Moodle user id from the user table before I continue processing the imported data.
Issue with my current solution
For maximum cross-DB support I currently loop through each record in this external data, use $DB->get_record() to fetch the user account from mdl_user where the username matches, then do a $DB->update_record() on my data.
Whilst this method works and is cross-DB compatible, performance is atrocious, as I am doing potentially 400,000 DB queries (one user lookup and one update per row).
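For reference, the per-row version looks roughly like this (a sketch using Moodle's $DB data manipulation API; the table and column names match the description above, but the exact code is simplified):

```php
// Sketch of the current slow per-row approach. Assumes Moodle's global $DB
// and a custom table report_results with columns loginname and userid.
$rs = $DB->get_recordset('report_results', ['userid' => null]);
foreach ($rs as $row) {
    // One SELECT per imported row...
    $user = $DB->get_record('user', ['username' => $row->loginname], 'id');
    if ($user) {
        $row->userid = $user->id;
        // ...and one UPDATE per matched row: ~2 queries x 200,000 rows.
        $DB->update_record('report_results', $row);
    }
}
$rs->close();
```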
Better Solution and where my issue begins
A better solution is to do a SQL UPDATE of the external data with a JOIN to the user table - fast and efficient. The issue is the differing syntax between databases, for example:
MySQL
UPDATE mdl_report_results
JOIN mdl_user ON (mdl_report_results.loginname = mdl_user.username)
SET mdl_report_results.userid = mdl_user.id
Microsoft SQL Server
UPDATE mdl_report_results
SET mdl_report_results.userid = mdl_user.id
FROM mdl_report_results
JOIN mdl_user ON (mdl_report_results.loginname = mdl_user.username)
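If I did end up branching per database, I assume it would look something like the following (a sketch only, covering just the two variants above; other DB families would need their own cases, and I've used Moodle's {table} placeholders rather than the hard-coded mdl_ prefix):

```php
// Sketch: choose the UPDATE...JOIN flavour once, based on the DB family,
// then run it with $DB->execute(). Only mysql and mssql are handled here.
switch ($DB->get_dbfamily()) {
    case 'mysql':
        $sql = "UPDATE {report_results}
                  JOIN {user} ON ({report_results}.loginname = {user}.username)
                   SET {report_results}.userid = {user}.id";
        break;
    case 'mssql':
        $sql = "UPDATE {report_results}
                   SET userid = u.id
                  FROM {report_results} r
                  JOIN {user} u ON (r.loginname = u.username)";
        break;
    default:
        // postgres, oracle, etc. would need their own syntax.
        throw new coding_exception('DB family not handled yet');
}
$DB->execute($sql);
```

This is exactly the duplication I'd rather avoid, hence the question.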
I don't really want to add different SQL code for each DB, so does anyone have any thoughts on a better approach or best practice?
Thanks
Martin