Ollama Agent Worker

AgentWorker(inventory, broker: str, worker_name: str, exit_event=None, init_done_event=None, log_level: str = 'WARNING', log_queue: object = None)

Bases: NFPWorker

A worker that uses a language model to handle tasks such as chatting with users, retrieving the agent's inventory, and producing version reports of Python packages.

Parameters:

    inventory: The inventory object to be used by the worker. Required.
    broker (str): The broker URL to connect to. Required.
    worker_name (str): The name of this worker. Required.
    exit_event: An event that, if set, indicates the worker needs to stop/exit. Defaults to None.
    init_done_event: An event to set when the worker has finished initializing. Defaults to None.
    log_level (str): The logging level of this worker. Defaults to "WARNING".
    log_queue (object): The logging queue object. Defaults to None.

Attributes:

    agent_inventory: The inventory loaded from the broker.
    llm_model (str): The language model to be used. Defaults to "llama3.1:8b".
    llm_temperature (float): The temperature setting for the language model. Defaults to 0.5.
    llm_base_url (str): The base URL for the language model API. Defaults to "http://127.0.0.1:11434".
    llm_flavour (str): The flavour of the language model. Defaults to "ollama".
    llm: The language model instance.

Methods:

    worker_exit: Placeholder method for worker exit logic.
    get_version: Produces a report of the versions of Python packages.
    get_inventory: Returns the agent's inventory.
    get_status: Returns the status of the worker.
    _chat_ollama: Handles the chat interaction with the Ollama LLM.
    chat: Handles the chat interaction with the user by processing the input through a language model.

Source code in norfab\workers\agent_worker.py
def __init__(
    self,
    inventory,
    broker: str,
    worker_name: str,
    exit_event=None,
    init_done_event=None,
    log_level: str = "WARNING",
    log_queue: object = None,
):
    super().__init__(
        inventory, broker, SERVICE, worker_name, exit_event, log_level, log_queue
    )
    self.init_done_event = init_done_event

    # get inventory from broker
    self.agent_inventory = self.load_inventory()
    self.llm_model = self.agent_inventory.get("llm_model", "llama3.1:8b")
    self.llm_temperature = self.agent_inventory.get("llm_temperature", 0.5)
    self.llm_base_url = self.agent_inventory.get(
        "llm_base_url", "http://127.0.0.1:11434"
    )
    self.llm_flavour = self.agent_inventory.get("llm_flavour", "ollama")

    if self.llm_flavour == "ollama":
        self.llm = OllamaLLM(
            model=self.llm_model,
            temperature=self.llm_temperature,
            base_url=self.llm_base_url,
        )

    self.init_done_event.set()
    log.info(f"{self.name} - Started")
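
A minimal construction sketch (hypothetical broker endpoint, worker name, and inventory object; the import path follows the source file location shown above):

import threading

from norfab.workers.agent_worker import AgentWorker

exit_event = threading.Event()
init_done_event = threading.Event()

worker = AgentWorker(
    inventory=my_inventory,  # hypothetical NorFab inventory object
    broker="tcp://127.0.0.1:5555",  # hypothetical broker endpoint
    worker_name="agent-worker-1",
    exit_event=exit_event,
    init_done_event=init_done_event,
    log_level="INFO",
)
init_done_event.wait()  # set by __init__ once the worker is ready

The broker-served inventory may supply llm_model, llm_temperature, llm_base_url, and llm_flavour; any key that is missing falls back to the defaults shown in __init__ above.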

get_version()

Generate a report of the versions of specific Python packages and system information. This method collects the version information of several Python packages and system details, including the Python version, platform, and a specified language model.

Returns:

    Result: An object containing a dictionary with the package names as keys and their respective version numbers as values. If a package is not found, its version will be an empty string.

Source code in norfab\workers\agent_worker.py
def get_version(self):
    """
    Generate a report of the versions of specific Python packages and system information.
    This method collects the version information of several Python packages and system details,
    including the Python version, platform, and a specified language model.

    Returns:
        Result: An object containing a dictionary with the package names as keys and their
                respective version numbers as values. If a package is not found, its version
                will be an empty string.
    """
    libs = {
        "norfab": "",
        "langchain": "",
        "langchain-community": "",
        "langchain-core": "",
        "langchain-ollama": "",
        "ollama": "",
        "python": sys.version.split(" ")[0],
        "platform": sys.platform,
        "llm_model": self.llm_model,
    }
    # get version of packages installed
    for pkg in libs.keys():
        try:
            libs[pkg] = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            pass

    return Result(result=libs)
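
The version probe relies only on the standard library; a standalone sketch of the same pattern, with package names taken from the method above:

import importlib.metadata
import sys

libs = {
    "langchain": "",
    "ollama": "",
    "python": sys.version.split(" ")[0],
    "platform": sys.platform,
}
# probe installed distributions; missing packages keep an empty string
for pkg in ("langchain", "ollama"):
    try:
        libs[pkg] = importlib.metadata.version(pkg)
    except importlib.metadata.PackageNotFoundError:
        pass
print(libs)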

get_inventory()

NorFab task to retrieve the agent's inventory.

Returns:

    Result: An instance of the Result class containing the agent's inventory.

Source code in norfab\workers\agent_worker.py
def get_inventory(self):
    """
    NorFab task to retrieve the agent's inventory.

    Returns:
        Result: An instance of the Result class containing the agent's inventory.
    """
    return Result(result=self.agent_inventory)

get_status()

NorFab Task that retrieves the status of the agent worker.

Returns:

    Result: An object containing the status result with a value of "OK".

Source code in norfab\workers\agent_worker.py
def get_status(self):
    """
    NorFab Task that retrieves the status of the agent worker.

    Returns:
        Result: An object containing the status result with a value of "OK".
    """
    return Result(result="OK")
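
Both getters wrap their payload in a Result object; an illustrative direct call, assuming the worker instance from the construction sketch above:

print(worker.get_status().result)     # "OK"
print(worker.get_inventory().result)  # the inventory dict loaded from the broker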

_chat_ollama(user_input, template=None) -> Result

NorFab Task that handles the chat interaction with Ollama LLM.

Parameters:

    user_input (str): The input provided by the user. Required.
    template (str): The template for generating the prompt. Defaults to None, in which case a predefined template is used.

Returns:

    Result: A Result object whose result attribute holds the language model's response.

Source code in norfab\workers\agent_worker.py
def _chat_ollama(self, user_input, template=None) -> Result:
    """
    NorFab Task that handles the chat interaction with Ollama LLM.

    Args:
        user_input (str): The input provided by the user.
        template (str, optional): The template for generating the prompt. Defaults to a predefined template.

    Returns:
        Result: A Result object whose result attribute holds the language model's response.
    """
    self.event(f"Received user input '{user_input[:50]}..'")
    ret = Result(task=f"{self.name}:chat")
    template = (
        template
        or """Question: {user_input}; Answer: Let's think step by step. Provide answer in markdown format."""
    )
    prompt = ChatPromptTemplate.from_template(template)
    chain = prompt | self.llm

    self.event("Thinking...")
    ret.result = chain.invoke({"user_input": user_input})

    self.event("Done thinking, sending result back to user")

    return ret
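
The chain is a standard LangChain (LCEL) prompt-to-model pipe. A standalone sketch under the worker's defaults, assuming a running Ollama server and the import paths of the langchain-core and langchain-ollama packages listed in get_version:

from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import OllamaLLM

llm = OllamaLLM(
    model="llama3.1:8b",  # the worker's default llm_model
    temperature=0.5,
    base_url="http://127.0.0.1:11434",
)
template = (
    "Question: {user_input}; Answer: Let's think step by step. "
    "Provide answer in markdown format."
)
prompt = ChatPromptTemplate.from_template(template)
chain = prompt | llm  # LCEL pipe: the rendered prompt feeds the model

print(chain.invoke({"user_input": "What is 2 + 2?"}))  # OllamaLLM returns a plain string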

chat(user_input, template=None) -> Result

NorFab Task that handles the chat interaction with the user by processing the input through a language model.

Parameters:

    user_input (str): The input provided by the user. Required.
    template (str): A template string for formatting the prompt. Defaults to None, in which case a predefined template is used.

Returns:

    Result: A Result object carrying the language model's response.

Raises:

    Exception: If the llm_flavour is unsupported.

Source code in norfab\workers\agent_worker.py
def chat(self, user_input, template=None) -> Result:
    """
    NorFab Task that handles the chat interaction with the user by processing the input through a language model.

    Args:
        user_input (str): The input provided by the user.
        template (str, optional): A template string for formatting the prompt. Defaults to None, in which case a predefined template is used.

    Returns:
        Result: A Result object carrying the language model's response.

    Raises:
        Exception: If the llm_flavour is unsupported.
    """
    if self.llm_flavour == "ollama":
        return self._chat_ollama(user_input, template)
    else:
        raise Exception(f"Unsupported llm flavour {self.llm_flavour}")
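
An illustrative direct call, assuming the worker instance from the construction sketch above; in a deployment the task would normally be dispatched through the NorFab broker rather than called directly:

reply = worker.chat("Summarise what BGP does.")
print(reply.result)  # markdown-formatted answer from the model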