AI questions generator

Question bank plugins ::: qbank_questiongen
Maintained by ByCS Lernplattform, Philipp Memmel

AI question generator

This plugin allows you to automatically create questions using a large language model. It currently requires the local_ai_manager plugin (https://moodle.org/plugins/local_ai_manager, https://github.com/bycs-lp/moodle-local_ai_manager) that manages the connection to the external AI system. It's a fork of the "AI Text to questions generator" created by Yedidia Klein and Ruthy Salomon.

For a description of the features, please have a look at the README on GitHub: https://github.com/bycs-lp/moodle-qbank_questiongen


Contributors

ByCS Lernplattform (Lead maintainer)

Comments

  • Plugins bot
    Wed, 2 July 2025, 3:30 AM
    Approval issue created: CONTRIB-9966
  • Çağrı Akkaya
    Thu, 7 Aug 2025, 7:20 PM
    Will you release a version for moodle 4.5?
  • Philipp Memmel
    Fri, 8 Aug 2025, 5:01 AM
Hi, thanks for reaching out! Unfortunately, we currently do not plan to make this plugin compatible with Moodle 4.5. In Moodle 5.0 a lot has changed (especially around question banks), and the plugin is at a pretty early stage, so developing for both branches would be an investment we currently cannot make. Additionally, further development of the dependent plugin local_ai_manager will also only continue for Moodle 5 and later.
  • Przemek Kaszubski
    Mon, 2 Mar 2026, 1:54 AM
Hi, I'm testing this plugin under Moodle 5.0. I don't know if I'm doing something wrong, but the plugin keeps generating almost the same question and MC options every time (I'm using multiple choice, with text or course resources passed); the output questions differ minimally in wording but are otherwise duplicates of one another. I had not encountered this issue with the original https://github.com/yedidiaklein/moodle-local_aiquestions , which gave me a nice set of, say, ten different questions when I ran it with a submitted story.
I have tried generating only 2 questions per run instead of 5 or 10, and sending or not sending the existing questions from the current category as context, but no luck. And I've been using the standard templates provided.
Is this an issue we should file on GitHub? Am I doing something wrong?
Thanks for any help.
  • Philipp Memmel
    Mon, 2 Mar 2026, 4:04 AM
This is basically a prompting or LLM issue; I don't think the plugin can do much about it. You might want to make sure that the LLM has enough content. In the mode where the plugin uses course content, the LLM is instructed to only use the content from the course, so you will have to make sure the content you are providing is large and varied enough for the LLM to generate different questions. If your content provides too little information, you might want to try the topic mode instead (the LLM will then use its own training data).
  • Przemek Kaszubski
    Mon, 2 Mar 2026, 8:09 AM
Thanks for this reply. I was trying to see how the original plugin sent its requests, but so far I have not been able to dig this out. I would be happy to adjust my prompting accordingly if that were to help. As for the amount of context I provide, I use exactly the same text input with the old plugin and this one, and the difference in the variety of the returned questions between the two tools is massive. And I had already tried the topic mode, too.
  • Heikki Wilenius
    Thu, 5 Mar 2026, 12:36 AM
    I just tried to reproduce this, but I couldn't. So, as Philipp writes, I suspect this is a prompting issue. But can you share more details: what model are you using, and what prompt were you using? My profile has my email address, if you'd rather share them privately.
  • Przemek Kaszubski
    Sat, 7 Mar 2026, 3:27 AM
Hello. Thanks for your kind offer to help me out, Heikki. I have been using an admin account (thus with the "unlimited" role), with a default single tenant, with the question generation purpose assigned to a tool using the gpt-4o model, configured like this: Endpoint: api.openai.com/v1/chat/completions; Temperature default: Balanced.

Limits configuration:
Time window for maximum number of requests: 1 day
Purpose: Question generation
Maximum number of requests per time window (base role): 10
Maximum number of requests per time window (extended role): 50
(though probably not relevant - using an admin account with the unlimited role)

I'll use email to also supply you with the short text file I was passing to the question generator, for which I wanted to generate some 4-5 comprehension multiple choice questions (later on I also tested True/False, but my need is the MC type).
Here I can just say that my texts are short, 400-700 words each.

    On the AI questions generator page I tested:

Number of questions to generate: from 2 to 10
Mode: tried all three: Topic, Provide contents, Course contents (with that last one I used Moodle pages in the course)
I tried both checking and unchecking the "Send existing questions as context" option

    As for the prompts I used the defaults:

    Primer: "You are a helpful teacher's assistant that creates moodle multiple choice questions based on the topics given by the user."

    Instructions: "Write a multiple choice question in {{currentlang}} language in moodle XML format on a topic I will specify to you separately. Do not include the question title in the beginning of the question text. Return the answer as plain text without any formatting or wrapping like for example using markdown backticks. Only return the generated XML without any additional text before and after. For maths formulas use latex notation, wrap the formulas in \( ... \) for inline style and $$ ... $$ for display style."
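For readers who haven't seen the target format: a Moodle XML multiple choice question of the kind this prompt requests looks roughly like the following (a minimal hand-written sketch with illustrative content, not actual plugin output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<quiz>
  <!-- One multiple choice question; fraction="100" marks the correct answer -->
  <question type="multichoice">
    <name>
      <text>AI generated - Photosynthesis basics</text>
    </name>
    <questiontext format="html">
      <text><![CDATA[<p>Which gas do plants absorb during photosynthesis?</p>]]></text>
    </questiontext>
    <single>true</single>
    <shuffleanswers>true</shuffleanswers>
    <answer fraction="100" format="html">
      <text><![CDATA[<p>Carbon dioxide</p>]]></text>
    </answer>
    <answer fraction="0" format="html">
      <text><![CDATA[<p>Oxygen</p>]]></text>
    </answer>
    <answer fraction="0" format="html">
      <text><![CDATA[<p>Nitrogen</p>]]></text>
    </answer>
  </question>
</quiz>
```

The prompt's "Do not include the question title in the beginning of the question text" refers to keeping the name element's content out of the questiontext body.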

(BTW, a nice feature of the original plugin is that it takes the first line of the output question and makes it the title of the question. I tried to add this request to the Primer text once, but it did not seem to work. Maybe we need a separate option alongside "Add a preconfigured prefix ("AI generated - ") to the question name"?)

I did not change anything in the Example text setting. I'm not quoting it here.

I suppose that's that. I will now write that email - I will be sending it from a Gmail account I use for my profile here, though I keep it hidden.

Thanks!
  • Przemek Kaszubski
    Sat, 7 Mar 2026, 3:38 AM
PS. Some of that instruction text got cut off from my comment after saving, but I DID use the full detailed version of it in the question generation queries on my test site (which BTW currently stands at Version 5.0.4+ (Build: 20251219), due for an update to a later 5.0.* soon).