I have thought about this in the past (off and on since the idea of using questions from the question bank in Lesson was first suggested, more than 10 years ago).
How the logic should work
I think there are basically three approaches to consider:
A) For some question types, branch using the values returned by the methods $question->classify_response and $qtype->get_possible_responses.
If you want to understand these methods, they are currently only used by the quiz statistics report, to do the response analysis. However, they would also be useful here, for branching by choice in multiple-choice or true/false questions. This approach could also be used for short-answer questions that have been set up to recognise various different right or wrong answers.
It will not work for questions with multiple parts (e.g. a matching question), because there is no single value on which to base the branch.
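To make approach A concrete, here is a minimal sketch of the branching logic in Python (Moodle itself is PHP, and the real API is $question->classify_response and $qtype->get_possible_responses; the ClassifiedResponse type, choose_jump helper and page numbers below are invented purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class ClassifiedResponse:
    """One sub-part of a response, roughly as classify_response reports it."""
    subpart: str         # identifier of the sub-part of the question
    response_class: str  # which of the possible responses it matched
    fraction: float      # grade for that choice, 0.0 .. 1.0

def choose_jump(classified: dict, jumps: dict, default_page: int) -> int:
    """Pick the next lesson page based on the classified response.

    Only works when the question has exactly one sub-part; with several
    sub-parts (e.g. matching) there is no single value to branch on.
    """
    if len(classified) != 1:
        raise ValueError("Approach A needs a single-part question")
    (response,) = classified.values()
    return jumps.get(response.response_class, default_page)

# Example: a true/false question with a different jump for each choice.
classified = {"1": ClassifiedResponse("1", "False", 0.0)}
jumps = {"True": 5, "False": 7}
next_page = choose_jump(classified, jumps, default_page=2)
```

The point of the sketch is that the lesson author defines one jump per possible response class, and anything unanticipated falls through to a default page.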
B) Another option, which would work for any question type, is simply to offer a three-way branch based on whether the response is correct, partially correct or wrong.
C) The third option is to decide that neither of these options is good enough, and instead add a new API to the question system.
As question bank component maintainer, I think supporting the question bank in Lesson is sufficiently important that it is worth changing the core question system if necessary to make it work well. However, at the moment I think options A) and B) are all we need. I am just putting this on record because, as we analyse the design further, we may find that A) and B) are not enough for some reason. I just want to make it clear that we should not give up at that point.
How the UI should look
The only point I want to make here is that I don't think you should add lesson-specific settings to the question editing forms. First, those forms are complicated enough already. Second, if you did, it would not be possible to move a question from a quiz to a lesson, because it would be missing the extra settings that the lesson expects.
Instead, I think you should separate the step where you create the question (which should not change at all from how you currently create questions in a quiz or in the question bank) from the business of linking questions together.
Then, the page where you assemble the lesson needs to show the possible outcomes for each question (as determined by one of A-C above) and let you control which way to branch for each possible outcome.
What about question behaviours?
Have you thought about question behaviours? I would say that the lesson should take note of them, and should offer the user a choice of any behaviour for which can_questions_finish_during_the_attempt returns true.
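In other words, the lesson settings form would filter the installed behaviours before offering them. A sketch in Python (can_questions_finish_during_the_attempt is the real Moodle method on behaviour types; the table of behaviours below is hypothetical example data, not a definitive list of how each behaviour answers):

```python
# Hypothetical table: behaviour name -> can questions finish during the attempt?
# In Moodle this answer would come from each behaviour type's
# can_questions_finish_during_the_attempt() method.
BEHAVIOURS = {
    "deferredfeedback": False,  # graded only when the whole attempt is submitted
    "adaptive": True,
    "interactive": True,
    "immediatefeedback": True,
}

def behaviours_for_lesson(behaviours: dict) -> list:
    """Return the behaviours a lesson should offer the user, sorted by name."""
    return sorted(name for name, can_finish in behaviours.items() if can_finish)

offered = behaviours_for_lesson(BEHAVIOURS)
```

Deferred-feedback style behaviours are excluded because a lesson needs the question finished (and graded) immediately, so that it knows which way to branch.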
Random questions?
I think randomisation in the Lesson is already served by clusters. Any further consideration of randomisation at this time would get in the way of the key focus here, which is to let Lesson use any question from the question bank. Therefore, I would rule that clearly out of scope for this project.
I hope that helps. I am happy to discuss this more, either here, or in something like a Google hangout. However, I am quite busy at the moment, and will have more time in a couple of weeks.