We use the Feedback activity extensively for standardised evaluations across roughly 1,500 courses per semester, and we feed the data out to an analytics tool.
I don't think this is really feasible through the web interface; we do it using the raw data. There's a complex set of relationships intended to obfuscate who the respondent actually is, for anonymity, but you can indeed map each response back to the specific user if you need to by linking the various tables. (We have done this because one project wanted to compare evaluation feedback with course outcomes for each student, so we had to "unanonymise" the responses, combine them with completion and gradebook statistics, and then reanonymise the whole lot again.)
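A minimal sketch of that "unanonymise, combine, reanonymise" workflow. This is not a supported Moodle API: the table and column names are based on a standard Moodle schema (`mdl_feedback_completed`, `mdl_feedback_value`), but the in-memory tables, sample rows, grade table, and the salt are simplified stand-ins for illustration only.

```python
import hashlib
import sqlite3

# Simplified stand-ins for the real Moodle tables (illustration only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mdl_feedback_completed (id INTEGER, feedback INTEGER, userid INTEGER);
CREATE TABLE mdl_feedback_value (id INTEGER, item INTEGER, completed INTEGER, value TEXT);
CREATE TABLE grades (userid INTEGER, finalgrade REAL);  -- hypothetical grade extract
INSERT INTO mdl_feedback_completed VALUES (1, 10, 501), (2, 10, 502);
INSERT INTO mdl_feedback_value VALUES (1, 7, 1, '4'), (2, 7, 2, '5');
INSERT INTO grades VALUES (501, 62.0), (502, 71.0);
""")

# "Unanonymise": link each response back to its user via the completed record,
# then join in the outcome data for that user.
rows = conn.execute("""
    SELECT c.userid, v.value, g.finalgrade
    FROM mdl_feedback_value v
    JOIN mdl_feedback_completed c ON c.id = v.completed
    JOIN grades g ON g.userid = c.userid
    ORDER BY c.userid
""").fetchall()

# Reanonymise: replace the userid with a salted one-way hash so the combined
# dataset can no longer be trivially mapped back to individuals.
def pseudonym(userid: int, salt: str = "project-salt") -> str:  # salt is a placeholder
    return hashlib.sha256(f"{salt}:{userid}".encode()).hexdigest()[:12]

anonymised = [(pseudonym(u), value, grade) for u, value, grade in rows]
```

The hash step matters: carrying the raw `userid` through to the analytics export would defeat the point of the exercise, while a consistent pseudonym still lets the downstream stats package group rows by respondent.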
But to be honest, since Feedback activities can be used for lots of different things, it is tricky to capture all the relevant data unless you hold, elsewhere, a list of the course module IDs that identify the specific feedback activities we deploy centrally as evaluation forms.
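In sketch form, that "list held elsewhere" is just a curated set of course module IDs checked at extraction time. The IDs and names below are hypothetical; the point is only that anything not on the list is skipped.

```python
# Hypothetical curated list, maintained outside Moodle, identifying which
# Feedback instances are the centrally deployed evaluation forms.
CENTRAL_EVALUATION_CMIDS = {20431, 20577, 21002}

def is_central_evaluation(cmid: int) -> bool:
    """True only for Feedback activities on the central evaluation list."""
    return cmid in CENTRAL_EVALUATION_CMIDS

# At extraction time, filter all Feedback course modules down to ours.
all_feedback_cmids = [20431, 19888, 21002, 22104]
to_extract = [c for c in all_feedback_cmids if is_central_evaluation(c)]
```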
I was considering forking the Feedback activity to create something specifically called "evaluation feedback", so that people like us, who want answers to the same set of questions in order to measure student experience across the whole institution, could be sure nobody else had tinkered with them.
But to answer some of the OP's original questions:
No, we could not find a way of "branching" questions.
Yes, it can be locked at deployment time, but no, that means teachers can't edit it at all. We created a separate "course evaluation extra" feedback in case they wanted to ask additional questions, on the understanding that responses to those questions would not be extracted to the learning analytics system.
But as I say, most of what we do is done a) by using data from the Moodle database rather than the pre-built reports within the Feedback module itself, and b) by transferring that data to a back-end stats package, where the analysis happens.