Ragable: Multi-turn Agent Chatbots simplified [Open-Source]

I have been building multi-turn chatbots and AI applications for a while now. There are great libraries out there for this purpose, but sometimes they are overkill.

If you are new to machine learning or simply want to build a multi-turn chatbot that can route between different functions to fetch data, then Ragable is for you!

What is Ragable?

Ragable is an ML library that makes building Agent-based multi-turn chatbots much easier.

It comes with most of the essentials you'll ever need such as:

  • Vector store integration: ingest data from multiple sources with just a few lines of code and perform RAG-type searches.

  • Agent router: the agent analyses the user's input and intelligently figures out which function in your code to execute.

  • Pure Python functions: no fanciness, just simple Python functions that can be aware of user data such as sessions, request objects, and just about anything else in your codebase. It is also 100% safe, because Ragable does not use OpenAI function calling; only the output of your function is sent to the LLM.
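The third bullet is worth a quick illustration. The sketch below is not Ragable's actual code; it is a minimal, self-contained version of the idea, where a simple keyword check stands in for the LLM-based routing. The point is that the router runs a plain Python function locally and only its returned text ever reaches the LLM prompt:

```python
# Minimal sketch of instruction-based routing (NOT Ragable's implementation;
# the keyword match below stands in for the LLM deciding which task matches).

def php_strings(params):
    # A plain Python function; it can read sessions, request objects, etc.
    return "str_replace('x', 'y', $z) replaces characters in a string."

def legendary_pokemon(params):
    return "Legendary Pokemon are rare, powerful Pokemon such as Mewtwo."

# Each task pairs a routing instruction with a function.
TASKS = [
    ("php", php_strings),
    ("legendary pokemon", legendary_pokemon),
]

def route(question):
    for keyword, func in TASKS:
        if keyword in question.lower():
            return func({"question": question})
    return ""

def build_prompt(question):
    # Only the function's OUTPUT is placed in the prompt; the function
    # itself is never exposed to the model.
    context = route(question)
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_prompt("How to perform a string replace in PHP?"))
```

In a real agent, the routing decision is made by the LLM from each task's Instruction prompt, but the data flow is the same: local function out, text in.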

Here's a code example:

from ragable.agent import get_openai_agent
from ragable.runnable import Runnable, runnable_from_func
from ragable.adapters.qdrant import QdrantAdapter
from ragable.embedders import StandardEmbedder

@runnable_from_func(
    Name="All about php strings",
    Instruction="When the human asks about php"
)
def php_strings(params):
    response = """
        str_replace('x', 'y', $z)
        stripos($the_big_blob_of_text, $the_thing_to_search_for)
    """
    return response

@runnable_from_func(
    Name="All about legendary pokemon",
    Instruction="When the human asks about legendary pokemon"
)
def legendary_pokemon(params):
    with open("./testdata/legendary_pokemon.txt", "r") as f:
        context_data = f.read()
    return context_data

if __name__ == "__main__":
    # Sets up an OpenAI powered agent.
    # Agents can register multiple tasks and will intelligently route the LLM
    # - to tasks based on the Runnable "Instruction" prompt.

    agent = get_openai_agent()

    # Easy integration with the Qdrant vector store (you will need Qdrant running locally).
    # For any other setup, pass in "dsn" and "api_key".

    qdrant = QdrantAdapter("ragable_documents")

    # The embedder allows you to feed most common document types into the RAG system.
    # Each document is split into LLM-friendly chunks and vector-embedded.
    embedder = StandardEmbedder(qdrant)

    # Path to your document. Optionally, you can also pass in a "doc_id".
    # The doc_id can be an integer or uuid.
    # Formats supported: txt, pdf, docx, odt, pptx, odp

    # You can also embed and index regular strings.
    # doc_id is required.
    # embedder.train_from_text("some text", 1234)

    # A non-decorator version of a Runnable.
    # NOTE: the "Func" parameter and "add_tasks" method below are taken from the
    # repo's example; check the Ragable README if the names have changed.
    bulbasaur_knowledge = Runnable(
        Name="Information about bulbasaur",
        Instruction="When the human asks about bulbasaur",
        Func=qdrant
    )

    # Tell the agent which Runnable functions it's allowed to execute.
    agent.add_tasks([php_strings, legendary_pokemon, bulbasaur_knowledge])

    questions = [
        "What is a legendary pokemon?",
        "How to perform a string replace in PHP?",
        "How to find a string in another string in PHP?",
        "Which Pokemon are the evolved forms of bulbasaur?"
    ]

    # Here you can feed the Agent any additional prompts as needed.
    # For example, you can store the chat history in Redis or a local session and
    # - then add each of the historical messages using this function.
    # Supported message types: system, user, ai, assistant
    agent.add_message("You are a useful informational bot.", "system")

    for q in questions:
        response = agent.invoke(q)
        print(response)
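To try the Qdrant integration, you will need a local Qdrant instance first. The quickest way is Qdrant's own documented Docker command (this is standard Qdrant setup, not part of Ragable):

```shell
# Start a local Qdrant instance on its default port (6333)
docker run -p 6333:6333 qdrant/qdrant
```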

You can get your copy of Ragable here: https://github.com/plexcorp-pty-ltd/ragable

Ragable is still in beta and is not available as a pip package yet, so use it with caution. The stable version will be released soon!