docs/contribute.md (+1-2)

The following public APIs are available for developers. The maintainers aspire to change these as infrequently as possible, and doing so will require an update to the package's major version number.

- From the top-level `__init__.py`:
    - `turnkeyml.version`: The package version number
    - `State` class and `load_state`: structure that holds build state between Tools; function to load `State` from disk.
- From the `common.filesystem` module:
    - `get_available_builds()`: list the builds in a turnkey cache
docs/lemonade/getting_started.md (+1-1)

## From Lemonade_Server_Installer.exe

The Lemonade Server is available as a standalone tool with a one-click Windows installer `.exe`. Check out the [Lemonade_Server_Installer.exe guide](lemonade_server_exe.md) for installation instructions and the [server spec](https://github.com/onnx/turnkeyml/blob/main/docs/lemonade/server_spec.md) to learn more about the functionality.
docs/lemonade/lemonade_server_exe.md

The Lemonade Server is available as a standalone tool with a one-click Windows installer `.exe`. Check out the [server spec](https://github.com/onnx/turnkeyml/blob/main/docs/lemonade/server_spec.md) to learn more about the functionality.

## GUI Installation

> *Note:* you may need to give your browser or OS permission to download or install the .exe.

1. Navigate to the [latest release](https://github.com/onnx/turnkeyml/releases/latest).
1. Scroll to the bottom and click `Lemonade_Server_Installer.exe` to download.
1. Double-click the `Lemonade_Server_Installer.exe` and follow the instructions.

## Usage

Now that you have the server installed, you can double-click the desktop shortcut to run the server process.

From there, you can connect it to applications that are compatible with the OpenAI completions API. The Lemonade Server [examples folder](https://github.com/onnx/turnkeyml/tree/main/examples/lemonade/server) has guides for how to use Lemonade Server with a collection of applications that we have tested.

## Developing with Lemonade Server

Interested in integrating Lemonade Server into an application you are developing? Check out the [Lemonade Server integration guide](server_integration.md) to learn more.
docs/lemonade/server_integration.md

This guide provides instructions on how to integrate Lemonade Server into your application.

There are two main ways in which Lemonade Server might integrate into apps:

* User-Managed Server: the user is responsible for installing and managing Lemonade Server.
* App-Managed Server: the app is responsible for installing and managing Lemonade Server on behalf of the user.

The first part of this guide contains instructions that are common to both integration approaches. The second part provides advanced instructions only needed for app-managed server integrations.

## General Instructions

### Identifying Compatible Devices

AMD Ryzen™ AI `Hybrid` models are available on Windows 11 on all AMD Ryzen™ AI 300 Series Processors. To programmatically identify supported devices, we recommend using a regular expression that checks whether the CPU name contains "Ryzen AI" and a 3-digit number starting with 3, as shown below.

```
Ryzen AI.*\b3\d{2}\b
```

Explanation:

- `Ryzen AI`: matches the literal phrase "Ryzen AI".
- `.*`: allows any characters (including spaces) to appear after "Ryzen AI".
- `\b3\d{2}\b`: matches a three-digit number starting with 3, ensuring it is a standalone number.

There are several ways to check the CPU name on a Windows computer. A reliable way of doing so is through cmd's `reg query` command.
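The device check above can be sketched in Python. This is a minimal example of applying the recommended regular expression; the helper name is hypothetical, and the `reg query` invocation in the comment is the standard Windows registry location for the CPU name.

```python
import re

# Regex from the guide: "Ryzen AI" followed somewhere by a standalone
# 3-digit number that starts with 3 (i.e. a 300-series part).
RYZEN_AI_300 = re.compile(r"Ryzen AI.*\b3\d{2}\b")

def is_hybrid_capable(cpu_name: str) -> bool:
    """Return True if the CPU name indicates a Ryzen AI 300-series processor."""
    return bool(RYZEN_AI_300.search(cpu_name))

# On Windows, the CPU name can be read with:
#   reg query "HKLM\HARDWARE\DESCRIPTION\System\CentralProcessor\0" /v ProcessorNameString
# and the ProcessorNameString value passed to is_hybrid_capable().

print(is_hybrid_capable("AMD Ryzen AI 9 HX 370 w/ Radeon 890M"))  # True
print(is_hybrid_capable("AMD Ryzen 9 7940HS"))                    # False
```

Note that the `\b` word boundaries keep longer model numbers (for example `8840`) from matching, so only standalone 3xx tokens qualify.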
The recommended way of directing users to the server installer is pointing them to our releases page at [`https://github.com/onnx/turnkeyml/releases`](https://github.com/onnx/turnkeyml/releases). Alternatively, you may also provide the direct path to the installer itself or download the installer programmatically.

Please note that the Server Installer is only available on Windows. Apps that integrate with our server on a Linux machine must install Lemonade from source as described [here](https://github.com/onnx/turnkeyml/blob/main/docs/lemonade/getting_started.md#from-source-code).

## Stand-Alone Server Integration

Some apps might prefer to be responsible for installing and managing Lemonade Server on behalf of the user. This part of the guide includes steps for installing and running Lemonade Server so that your users don't have to install Lemonade Server separately.

Definitions:

- "Silent installation" refers to an automatic command for installing Lemonade Server without running any GUI or prompting the user with any questions. It does assume that the end user fully accepts the license terms, so be sure that your own application makes this clear to the user.
- Command line usage allows the server process to be launched programmatically, so that your application can manage starting and stopping the server process on the user's behalf.

### Silent Installation

Silent installation runs `Lemonade_Server_Installer.exe` without a GUI and automatically accepts all prompts.

In a `cmd.exe` terminal:

Install *with* Ryzen AI hybrid support:

```bash
Lemonade_Server_Installer.exe /S /Extras=hybrid
```

Install *without* Ryzen AI hybrid support:

```bash
Lemonade_Server_Installer.exe /S
```

The install directory can also be changed from the default by using `/D` as the last argument.

Only `Qwen2.5-0.5B-Instruct-CPU` is installed by default in silent mode. If you wish to select additional models to download in silent mode, you may use the `/Models` argument.
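For an app-managed install, the download and silent install steps above can be automated. The sketch below is a minimal example under stated assumptions: the asset URL follows GitHub's `releases/latest/download/<asset>` convention (confirm the exact asset name on the releases page), and the helper name is hypothetical. Only the `/S`, `/Extras=hybrid`, and `/D` switches from this guide are used.

```python
import subprocess
import sys
from pathlib import Path
from typing import List, Optional
from urllib.request import urlretrieve

# Assumed asset URL using GitHub's latest-release download convention;
# verify against the actual releases page before shipping.
INSTALLER_URL = (
    "https://github.com/onnx/turnkeyml/releases/latest/download/"
    "Lemonade_Server_Installer.exe"
)

def silent_install_command(installer: Path, hybrid: bool,
                           install_dir: Optional[str] = None) -> List[str]:
    """Build the silent-install command; /D, if used, must be the last argument."""
    cmd = [str(installer), "/S"]
    if hybrid:
        cmd.append("/Extras=hybrid")
    if install_dir is not None:
        cmd.append(f"/D={install_dir}")
    return cmd

if __name__ == "__main__" and sys.platform == "win32":
    exe = Path("Lemonade_Server_Installer.exe")
    urlretrieve(INSTALLER_URL, exe)  # download the installer
    subprocess.run(silent_install_command(exe, hybrid=True), check=True)
```

The installer only runs on Windows, so the actual download-and-run step is guarded by a platform check.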
Command line invocation starts the Lemonade Server process so that your application can connect to it via REST API endpoints.

#### Foreground Process

These steps will open Lemonade Server in a terminal window that is visible to users. The user can exit the server by closing the window.

In a `cmd.exe` terminal:

```bash
conda run --no-capture-output -p INSTALL_DIR\lemonade_server\lemon_env lemonade serve
```

where `INSTALL_DIR` is the installation path of `lemonade_server`.

For example, if you used the default installation directory and your username is USERNAME:

```bash
C:\Windows\System32\cmd.exe /C conda run --no-capture-output -p C:\Users\USERNAME\AppData\Local\lemonade_server\lemon_env lemonade serve
```

#### Background Process

This command will open Lemonade Server without opening a window. Your application needs to manage terminating the process and any child processes it creates.

Returns a list of key models available on the server in an OpenAI-compatible format. This list is curated based on what works best for Ryzen AI Hybrid. Only models available locally are shown.
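A background launch like the one described above can be sketched in Python. The `conda run` command line is taken from the foreground example; the helper names are hypothetical, and `CREATE_NO_WINDOW` is the standard Windows flag for suppressing the console window.

```python
import subprocess
import sys
from typing import List

def serve_command(install_dir: str) -> List[str]:
    """conda-run invocation from the guide; install_dir is the Windows-style
    installation path of lemonade_server."""
    return [
        "conda", "run", "--no-capture-output",
        "-p", install_dir + r"\lemonade_server\lemon_env",
        "lemonade", "serve",
    ]

def start_background(install_dir: str) -> subprocess.Popen:
    """Launch Lemonade Server with no console window (flag is Windows-only).

    The caller owns the returned process and must terminate it, plus any
    child processes it spawns, when the application exits.
    """
    flags = subprocess.CREATE_NO_WINDOW if sys.platform == "win32" else 0
    return subprocess.Popen(serve_command(install_dir), creationflags=flags)
```

On shutdown, terminating only the `Popen` handle may leave child processes behind, which is why the guide puts process-tree cleanup on the integrating application.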
examples/lemonade/README.md (+9-1)

# Lemonade Examples

This folder contains examples of how to deploy `lemonade` into applications.

## Server Examples

The `server/` folder contains examples of how to use Lemonade Server with existing applications that support server interfaces. Learn more in `server/README.md`.

## API Examples

This folder has examples of using the Lemonade API to integrate LLMs into Python applications. These APIs make it easy to load a model, generate responses, and also show how to stream those responses.

The `demos/` folder also contains some higher-level application demos of the APIs. Learn more in `demos/README.md`.
|[Open WebUI](https://github.com/open-webui/open-webui)|[How to chat with lemonade LLMs in Open WebUI](https://ryzenai.docs.amd.com/en/latest/llm/server_interface.html#open-webui-demo)|
|[Continue](https://www.continue.dev/)|[How to use lemonade LLMs as a coding assistant in Continue](continue.md)|
[Continue](https://www.continue.dev/) is a coding assistant that lives inside of a VS Code extension. It supports chatting with your codebase, making edits, and a lot more.

## Expectations

We have found that the `Qwen-1.5-7B-Chat-Hybrid` model is the best Hybrid model available for coding. It is good at chatting with a few files at a time in your codebase to learn more about them. It can also make simple code-editing suggestions pertaining to a few lines of code at a time.

However, we do not recommend using this model for analyzing large codebases at once or making large or complex file edits.

## Setup

### Prerequisites

1. Install Lemonade Server using the [installer .exe](https://github.com/onnx/turnkeyml/blob/main/docs/lemonade/lemonade_server_exe.md#lemonade-server-installer).

### Install Continue

> Note: they provide their own instructions [here](https://marketplace.visualstudio.com/items?itemName=Continue.continue)

1. Open the Extensions tab in your VS Code Activity Bar.
1. Search "Continue - Codestral, Claude, and more" in the Extensions Marketplace search bar.
1. Select the Continue extension and click Install.

This will add a Continue tab to your VS Code Activity Bar.

### Add Lemonade Server to Continue

> Note: the following instructions are based on instructions from Continue found [here](https://docs.continue.dev/customize/model-providers/openai#openai-compatible-servers--apis)

1. Open the Continue tab in your VS Code Activity Bar.
1. Click the gear icon at the top to open Settings.
1. Under "Configuration", click "Open Config File".
1. Replace the "models" key in the `config.json` with the following and save:

```json
"models": [
    {
        "title": "Lemonade",
        "provider": "openai",
        "model": "Qwen-1.5-7B-Chat-Hybrid",
        "apiKey": "-",
        "apiBase": "http://localhost:8000/api/v0"
    }
],
```

## Usage

> Note: see the Continue [user guide](https://docs.continue.dev/) to learn about all of their features.

To try out Continue:

- Open the Continue tab in your VS Code Activity Bar, and in the "Ask anything" box, type a question about your code. Use the `@` symbol to specify a file or two.
    - Example: "What's the fastest way to install lemonade in @getting_started.md?"
- Open a file, select some code, and press Ctrl+I to start a chat about editing that code.
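Outside of Continue, any OpenAI-compatible client can talk to the same endpoint. Below is a minimal stdlib sketch, assuming the server exposes an OpenAI-style `chat/completions` route under the `apiBase` used in the Continue config; the helper name is hypothetical.

```python
import json
from urllib.request import Request, urlopen

BASE_URL = "http://localhost:8000/api/v0"  # apiBase from the Continue config

def chat_request(prompt: str, model: str = "Qwen-1.5-7B-Chat-Hybrid") -> Request:
    """Build an OpenAI-style chat-completions request for Lemonade Server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# With the server running:
#   reply = json.load(urlopen(chat_request("Hello!")))
#   print(reply["choices"][0]["message"]["content"])
```

This mirrors what the Continue extension sends on your behalf, which is useful when debugging whether an issue is in the server or in the editor integration.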
; If hybrid is enabled, check if at least one hybrid model is selected
SectionGetFlags ${Llama1BSec} $1
IntOp $1 $1 & ${SF_SELECTED}
${If} $1 == ${SF_SELECTED}
  Goto end
${EndIf}

SectionGetFlags ${Llama3BSec} $1
IntOp $1 $1 & ${SF_SELECTED}
${If} $1 == ${SF_SELECTED}
  Goto end
${EndIf}

SectionGetFlags ${PhiSec} $1
IntOp $1 $1 & ${SF_SELECTED}
${If} $1 == ${SF_SELECTED}
  Goto end
${EndIf}

SectionGetFlags ${Qwen7BSec} $1
IntOp $1 $1 & ${SF_SELECTED}
${If} $1 == ${SF_SELECTED}
  Goto end
${EndIf}

; If no hybrid model is selected, select Llama-1B by default
SectionGetFlags ${Llama1BSec} $1
IntOp $1 $1 | ${SF_SELECTED}
SectionSetFlags ${Llama1BSec} $1
MessageBox MB_OK "At least one hybrid model must be selected when hybrid execution is enabled. Llama-3.2-1B-Instruct-Hybrid has been automatically selected."
Goto end

hybrid_disabled:
; When hybrid is disabled, select Qwen2.5-0.5B-Instruct-CPU and disable all other hybrid model selections
LangString MUI_TEXT_LICENSE_SUBTITLE ${LANG_ENGLISH} "Please review the license terms before installing AMD Ryzen AI Hybrid Execution Mode."
LangString DESC_SEC01 ${LANG_ENGLISH} "The minimum set of dependencies for a lemonade server that runs LLMs on CPU."
LangString DESC_HybridSec ${LANG_ENGLISH} "Add support for running LLMs on Ryzen AI hybrid execution mode. Only available on Ryzen AI 300-series processors."
LangString DESC_ModelsSec ${LANG_ENGLISH} "Select which models to install"