AI Manager

Local plugins ::: local_ai_manager
Maintained by Peter Mayer, Philipp Memmel, ByCS Lernplattform
The local_ai_manager is a powerful Moodle plugin that enables the integration of AI functionalities for different tenants. Tenants are separated by specific user fields such as institution and department. The plugin has a modular structure, supports a variety of language models, and can easily be extended to support further models.
405 sites
443 downloads
18 fans
Current versions available: 2

The local_ai_manager acts as a central interface for connecting and managing different language models within a Moodle system. Tenant separation is realized through user fields such as institution and department, which allows AI resources to be clearly demarcated and managed per tenant.
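
As a rough illustration of this idea, the following minimal sketch (plain Python pseudologic rather than the plugin's PHP code; all names are hypothetical) derives a tenant key from a configurable user profile field, with a default tenant for users who have no value set:

    # Conceptual sketch only: not the plugin's actual code, all names hypothetical.
    from dataclasses import dataclass

    @dataclass
    class User:
        username: str
        institution: str
        department: str

    def tenant_key(user: User, tenant_field: str = "institution") -> str:
        """Return the tenant identifier based on the configured user field."""
        value = getattr(user, tenant_field, "") or ""
        # Users without a value in the tenant field fall back to a default tenant.
        return value.strip().lower() or "default"

    alice = User("alice", institution="School A", department="Maths")
    bob = User("bob", institution="", department="Physics")
    print(tenant_key(alice))                             # school a
    print(tenant_key(bob))                               # default
    print(tenant_key(bob, tenant_field="department"))    # physics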

Main functions:
  1. Modular architecture: The plugin is designed to support different language models (e.g. ChatGPT, Ollama, Gemini) and, thanks to its subplugin structure, can easily be extended to support further models.
  2. Define purposes: Administrators can define specific deployment scenarios (purposes) for the language models, so that different use cases can be given different configurations.
  3. Tenant administrators: Each tenant administrator controls whether and which AI functionalities are activated for the users of their tenant.
  4. Credit management: Each tenant can independently procure credit and make it available to its teachers and students. This enables flexible and needs-based use of the AI tools.
  5. Detailed statistics: The tenant admin can view detailed statistics on usage per user and per language model. Additional statistics beyond the defaults can be enabled via capabilities.
  6. User control: The tenant admin can enable and disable each user individually.
  7. Role control: Each user is assigned a role to act as, so the tenant admin can configure different language models for different roles, e.g. gpt-4o-mini for students and gpt-4o for teachers (see the sketch after this list).
  8. Integration of self-hosted AI tools: In addition to external language models, AI tools hosted by your own organization (e.g. Ollama) can also be integrated seamlessly.
  9. Extensibility: The plugin is designed to support future extensions and the integration of new AI tools.
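
To make the interplay of purposes, roles and credit more concrete, here is a minimal conceptual sketch in Python (the plugin itself is written in PHP; the data structures and function names below are invented for illustration and do not mirror its internal API). The idea: per tenant, a model is assigned to each (purpose, role) pair, and every request draws on the tenant's credit.

    # Conceptual sketch only: names and structure are illustrative, not the plugin's API.
    from dataclasses import dataclass, field

    @dataclass
    class TenantConfig:
        enabled: bool = True    # tenant admin can switch AI tools on or off
        credit: float = 0.0     # credit procured by the tenant
        # (purpose, role) -> model, e.g. ("chat", "student") -> "gpt-4o-mini"
        models: dict[tuple[str, str], str] = field(default_factory=dict)

    def pick_model(cfg: TenantConfig, purpose: str, role: str) -> str:
        """Select the model the tenant admin configured for this purpose and role."""
        if not cfg.enabled:
            raise PermissionError("AI tools are disabled for this tenant")
        try:
            return cfg.models[(purpose, role)]
        except KeyError:
            raise LookupError(f"No model configured for {purpose!r}/{role!r}")

    def charge(cfg: TenantConfig, cost: float) -> None:
        """Deduct the cost of a request from the tenant's credit."""
        if cfg.credit < cost:
            raise RuntimeError("Tenant credit exhausted")
        cfg.credit -= cost

    school_a = TenantConfig(
        credit=100.0,
        models={
            ("chat", "student"): "gpt-4o-mini",
            ("chat", "teacher"): "gpt-4o",
        },
    )
    print(pick_model(school_a, "chat", "student"))   # gpt-4o-mini
    charge(school_a, 0.5)
    print(school_a.credit)                           # 99.5

In the actual plugin these settings are of course managed by the tenant administrator through the Moodle interface; the sketch only shows the shape of the configuration, not its implementation.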

The local_ai_manager provides a flexible and scalable solution that enables educational institutions to efficiently use and manage state-of-the-art AI technologies.

You need other plugins to work with the ai_manager.

Screenshots

Six screenshots of the plugin are available on the plugin directory page.

Contributors

Peter Mayer (Lead maintainer)
ByCS Lernplattform

Comments

  • Plugins bot
    Tue, 24 Sep 2024, 5:00 PM
    Approval issue created: CONTRIB-9696
  • Aaron Tian
    Mon, 22 Sep 2025, 11:05 AM
    This plugin gives extensive support for AI integration. Thank you so much, and if a later version could provide an OpenAI-API-compatible option for alternative providers, that would be amazing!
  • Philipp Memmel
    Mon, 22 Sep 2025, 12:45 PM
    Hi Aaron, thank you for your reply. Basically this already exists, but we were hesitant to expose the possibility, because there is no such thing as a truly OpenAI-compatible API: they all differ at least to some minor extent, for example in how they return errors. So we did not want users to believe that our plugin is buggy because they are trying to use "OpenAI-compatible APIs" :) But we are likely to add a switch soon to allow this, as we of course see the necessity for it. Thanks for your response!
  • Muhamad Oka Augusta
    Mon, 29 Sep 2025, 8:46 PM
    Is this plugin working? I keep getting error 404 in AI chat and Tiny AI, and the other AI tools don't work either. I tried Gemini AI first, which didn't work, then Vertex AI, which still didn't work.
  • Muhamad Oka Augusta
    Mon, 29 Sep 2025, 9:34 PM
    The Gemini 1.5 Flash model has been retired, that's why I kept getting error 404. Just in case someone else has the same issue and is as foolish as I am.

    It would be nice if we could type in our own model instead of choosing from a dropdown, so that if another model is retired we can simply enter a replacement ourselves.