Creating handlers
Handlers are the classes that handle some of Newelle's features, letting the user pick the one that best suits their needs. Extensions can also add handlers to Newelle.
Supported handlers:
- LLMHandler: handles the chatbot responses
- TTSHandler: handles the Text to Speech
- STTHandler: handles the Speech Recognition

This is a UML diagram of the structure:
classDiagram
Handler <|-- LLMHandler
Handler <|-- TTSHandler
Handler <|-- STTHandler
Handler <|-- NewelleExtension
class Handler{
+str key
+requires_sandbox_escape()
+get_extra_settings()
+get_extra_requirements()
+install()
+is_installed()
+get_setting(key)
+set_setting(key, value)
+get_default_setting(key)
}
class LLMHandler{
+stream_enabled()
+load_model()
+generate_text(prompt, history, system_prompt)
+generate_text_stream(prompt, history, system_prompt, on_update, extra_args)
+send_message(window, message)
+send_message_stream(window, message, on_update, extra_args)
+get_suggestions(request_prompt, amount)
+generate_chat_name(request_prompt)
}
class TTSHandler{
+ get_voices()
+ voice_available(voice)
+ save_audio(message, file)
+ play_audio(message)
+ connect(signal, callback)
+ playsound(path)
+ stop()
+ get_current_voice()
+ set_voice()
}
class STTHandler{
+ recognize_file(path)
}
class NewelleExtension {
+ name
+ id
+ get_llm_handlers()
+ get_tts_handlers()
+ get_stt_handlers()
+ get_additional_prompts()
+ get_replace_codeblocks()
+ get_gtk_widget()
+ get_answer()
}
The Handler base class provides the default implementation for some useful features common to all handlers. Usually you only need to override get_extra_settings, get_extra_requirements and requires_sandbox_escape.
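Before diving into each method, here is a rough skeleton of a custom handler; the MyHandler name, the "language" setting and the "requests" requirement are illustrative and not part of Newelle:

# Rough skeleton of a custom handler. MyHandler, the "language" setting and
# the "requests" requirement are illustrative; Handler is Newelle's base class
# shown in the diagram above.
class MyHandler(Handler):

    def get_extra_settings(self) -> list:
        # Settings shown in Newelle's preferences page (format described below)
        return [
            {
                "key": "language",
                "title": "Language",
                "description": "Language of the output",
                "type": "entry",
                "default": "en"
            }
        ]

    @staticmethod
    def get_extra_requirements() -> list:
        # pip packages installed locally when the user clicks the install button
        return ["requests"]

    @staticmethod
    def requires_sandbox_escape() -> bool:
        # True only if the handler must run commands on the host system
        return False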
First of all, you need to specify all the settings that the user might need to edit (for example API keys, language, preferences...). To do this, you must override get_extra_settings in the following format:
def get_extra_settings(self) -> list:
    """
    Extra settings format:
        Required parameters:
        - title: small title for the setting
        - description: description for the setting
        - default: default value for the setting
        - type: what type of row to create, possible rows:
            - entry: input text
            - toggle: bool
            - combo: for multiple choice
                - values: list of tuples of possible values (display_value, actual_value)
            - range: for number input with a slider
                - min: minimum value
                - max: maximum value
                - round: how many digits to round to
        Optional parameters:
        - folder: add a button that opens a folder at the specified path
        - website: add a button that opens a website at the specified URL
        - update_settings (bool): whether to reload the settings page for this handler after the setting changes
    """
    return []
The OpenAIHandler LLMHandler is a good example of how to define these settings.
def get_extra_settings(self) -> list:
    return [
        {
            "key": "api",
            "title": _("API Key"),
            "description": _("API Key for OpenAI"),
            "type": "entry",
            "default": ""
        },
        {
            "key": "endpoint",
            "title": _("API Endpoint"),
            "description": _("API base url, change this to use interference APIs"),
            "type": "entry",
            "default": "https://api.openai.com/v1/"
        },
        {
            "key": "model",
            "title": _("OpenAI Model"),
            "description": _("Name of the OpenAI Model"),
            "type": "entry",
            "default": "gpt-3.5-turbo"
        },
        {
            "key": "streaming",
            "title": _("Message Streaming"),
            "description": _("Gradually stream message output"),
            "type": "toggle",
            "default": True
        },
        {
            "key": "advanced_params",
            "title": _("Advanced Parameters"),
            "description": _("Include parameters like Max Tokens, Top-P, Temperature, etc."),
            "type": "toggle",
            "default": True
        },
        {
            "key": "max-tokens",
            "title": _("Max Tokens"),
            "description": _("Max tokens of the generated text"),
            "website": "https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them",
            "type": "range",
            "min": 3,
            "max": 400,
            "default": 150,
            "round-digits": 0
        },
        {
            "key": "top-p",
            "title": _("Top-P"),
            "description": _("An alternative to sampling with temperature, called nucleus sampling"),
            "website": "https://platform.openai.com/docs/api-reference/completions/create#completions/create-top_p",
            "type": "range",
            "min": 0,
            "max": 1,
            "default": 1,
            "round-digits": 2,
        },
        {
            "key": "temperature",
            "title": _("Temperature"),
            "description": _("What sampling temperature to use. Higher values will make the output more random"),
            "website": "https://platform.openai.com/docs/api-reference/completions/create#completions/create-temperature",
            "type": "range",
            "min": 0,
            "max": 2,
            "default": 1,
            "round-digits": 2,
        },
        # Some parameters omitted for simplicity
    ]
This will allow the handler to display these extra settings in the preferences page.
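The example above only uses entry, toggle and range rows. A combo row with its values list might look like the following sketch (the "voice" key and its values are illustrative, not taken from any Newelle handler):

{
    "key": "voice",
    "title": _("Voice"),
    "description": _("Voice used for the generated audio"),
    "type": "combo",
    # each tuple is (display_value, actual_value)
    "values": [("Alloy", "alloy"), ("Echo", "echo")],
    "default": "alloy"
}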
After you have specified these settings, you are free to use self.get_setting(key) and self.set_setting(key, value) anywhere in the class.
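For example, a hypothetical helper inside the handler could combine both calls; the _build_request_parameters name is made up, while the setting keys are the ones declared in the OpenAI example above:

# Illustrative helper: reading and writing handler settings from inside the class.
def _build_request_parameters(self) -> dict:
    # Read values the user configured in the preferences page
    params = {
        "model": self.get_setting("model"),
        "stream": self.get_setting("streaming"),
    }
    # Settings can also be written back programmatically,
    # for example to restore the declared default for an empty endpoint
    if self.get_setting("endpoint") == "":
        self.set_setting("endpoint", self.get_default_setting("endpoint"))
    return params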
If you need extra pip libraries for the handler, you can specify them by overriding the static method get_extra_requirements. The dependencies will be installed in a local pip path.
For example:
@staticmethod
def get_extra_requirements() -> list:
    return ["google-generativeai"]
The default implementations of install() and is_installed() will install/check the dependencies specified there. You can also override the install and is_installed methods if you need something else:
@staticmethod
def get_extra_requirements() -> list:
    return ["openai-whisper"]

def is_installed(self) -> bool:
    # find_module checks whether the Python module can be imported
    return find_module("whisper") is not None

def install(self):
    print("Installing whisper...")
    super().install()
    import whisper
    print("Whisper installed, installing tiny model...")
    whisper.load_model("tiny")
Note: here I had to override the is_installed method because the pip package name (openai-whisper) and the module name (whisper) are different.
The install method will run when the button near the handler is clicked.
If is_installed returns True, the button disappears and the handler can be selected.
If you are building an extension, you can override the get_llm_handlers, get_tts_handlers or get_stt_handlers methods.
For example:
class MyCustomExtension(NewelleExtension):
    ...
    def get_llm_handlers(self) -> list[dict]:
        """
        Returns the list of LLM handlers

        Returns:
            list: list of LLM handlers in this format
            {
                "key": "key of the handler",
                "title": "title of the handler",
                "description": "description of the handler",
                "class": LLMHandler - the class of the handler,
            }
        """
        return [{
            "key": "hyperbolic",
            "title": _("Hyperbolic API"),
            "description": _("Hyperbolic API"),
            "class": HyperbolicHandler,
        }]
If you are contributing to Newelle, then you can specify the handlers in constants.py in the same way.
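The HyperbolicHandler referenced in the example above would itself be an LLMHandler subclass. A minimal sketch under that assumption follows; the endpoint URL, model name, parameter types and the use of the requests library are illustrative, not Newelle's or Hyperbolic's actual code:

# Purely illustrative LLMHandler subclass registered by the extension above.
import requests

class HyperbolicHandler(LLMHandler):

    def get_extra_settings(self) -> list:
        return [
            {
                "key": "api",
                "title": _("API Key"),
                "description": _("API Key for Hyperbolic"),
                "type": "entry",
                "default": ""
            }
        ]

    def generate_text(self, prompt: str, history: list = [], system_prompt: list = []) -> str:
        # Hypothetical OpenAI-compatible chat endpoint and model name
        messages = []
        if system_prompt:
            messages.append({"role": "system", "content": "\n".join(system_prompt)})
        messages.append({"role": "user", "content": prompt})
        response = requests.post(
            "https://api.hyperbolic.xyz/v1/chat/completions",
            headers={"Authorization": "Bearer " + self.get_setting("api")},
            json={"model": "meta-llama/Llama-3.1-8B", "messages": messages},
        )
        return response.json()["choices"][0]["message"]["content"]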
Some handlers might need permission to run commands on the user's PC, escaping the Flatpak sandbox. In order to display a warning if the user does not have the necessary permissions, you can override the requires_sandbox_escape function:
@staticmethod
def requires_sandbox_escape() -> bool:
    """If the handler requires to run commands on the user host system"""
    return True
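For context, a handler that escapes the sandbox typically runs its host commands through flatpak-spawn --host; the sketch below is an assumption about how such a call might look, not a Newelle helper:

# Illustrative only: running a command on the host from inside the Flatpak
# sandbox, which is what requires_sandbox_escape() warns the user about.
import subprocess

def run_on_host(command: list[str]) -> str:
    # flatpak-spawn --host forwards the command to the host system; it only
    # works if the sandbox can talk to the org.freedesktop.Flatpak service
    return subprocess.check_output(["flatpak-spawn", "--host"] + command, text=True)

# Example: check whether espeak is available on the host
print(run_on_host(["which", "espeak"]))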