At the moment, you have to upload a CSV file. (You could create that file by downloading data from the responses report, and then editing the CSV a bit to get what you want.)
Improving the workflow there, so you don't have to do it manually, is tentatively pencilled in for the next phase of development, but when that will happen has not been decided yet. We have built the most important features, and now we want to see how people use them, and gather feedback, before we build anything else.
The Amati tool ... I am not sure I would really call it an NLP tool. A better description is that it uses machine-learning techniques to deduce a set of possible matching rules from a set of graded responses. It is the work of an OU researcher called Alistair Willis; he has a paper about it here: http://www.aclweb.org/anthology/W15-0628. We basically took his ideas (with permission!) and built them into Moodle.
Alistair's machine learning code has not been shared publicly. Moodle talks to it over a simple web service protocol. The idea was that other researchers might like to try coding their own AI to auto-create matching rules. I don't think we have properly documented the protocol, so you would have to look at the code: https://github.com/moodleou/moodle-qtype_pmatch/blob/master/classes/amati_rule_suggestion.php#L72. Basically, Moodle makes an HTTP POST request with the graded responses JSON-encoded in the body. The other end does its work and sends back a JSON-encoded set of matching rules.
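To make the shape of that exchange concrete, here is a minimal Python sketch of both ends of such a protocol: a stub "rule suggestion" service that accepts JSON-encoded graded responses over HTTP POST and replies with a JSON set of rules, plus the client call Moodle would make. Since the real protocol is undocumented, the endpoint path, JSON field names, and the example rule are all assumptions for illustration, not the actual Amati API.

```python
# Hypothetical sketch of a pmatch/Amati-style web service protocol.
# The field names ("response", "mark", "rules") and the endpoint are
# invented for illustration; check amati_rule_suggestion.php for the
# real payload shape.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class StubRuleService(BaseHTTPRequestHandler):
    """Stand-in for the rule-suggestion service. A real service would
    run the machine-learning step here; this one returns a canned rule."""

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        graded = json.loads(self.rfile.read(length))
        rules = {"rules": ["match_w(sunrise)"], "received": len(graded)}
        body = json.dumps(rules).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet


def suggest_rules(url, graded_responses):
    """Client side: POST JSON-encoded graded responses, decode JSON rules."""
    data = json.dumps(graded_responses).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Run the stub service on an ephemeral local port and call it once.
server = HTTPServer(("127.0.0.1", 0), StubRuleService)
threading.Thread(target=server.serve_forever, daemon=True).start()

graded = [
    {"response": "the sun rises in the east", "mark": 1},
    {"response": "no idea", "mark": 0},
]
result = suggest_rules(f"http://127.0.0.1:{server.server_port}/suggest", graded)
server.shutdown()
```

The point is just the round trip: one POST with the graded responses in the body, one JSON document with suggested rules coming back, which keeps the ML side completely decoupled from Moodle.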
Note, I can take almost no credit for pmatch. It was initially created by Jamie Pratt (freelance Moodle developer) working under the direction of my former colleague Phil Butcher. Then all the Amati stuff was done by my colleagues Colin Chambers and John Beedell. My only real contribution was some code review.