Commit 0.1.0 (#8)

# Changelog

## [0.0.2a10](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a10) (2025-03-03)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a9...0.0.2a10)

**Merged pull requests:**

- Update to stable dependencies [\#7](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/7) ([NeonDaniel](https://github.com/NeonDaniel))

## [0.0.2a9](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a9) (2025-01-17)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a8...0.0.2a9)

**Merged pull requests:**

- Update Docker configuration and async consumer init [\#6](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/6) ([NeonDaniel](https://github.com/NeonDaniel))

## [0.0.2a8](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a8) (2025-01-17)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a7...0.0.2a8)

**Merged pull requests:**

- Add Docker container update automation [\#5](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/5) ([NeonDaniel](https://github.com/NeonDaniel))

## [0.0.2a7](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a7) (2024-04-04)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a6...0.0.2a7)

**Merged pull requests:**

- neon-llm-core version bump [\#4](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/4) ([NeonBohdan](https://github.com/NeonBohdan))

## [0.0.2a6](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a6) (2024-03-25)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a5...0.0.2a6)

**Merged pull requests:**

- neon-llm-core version bump [\#3](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/3) ([NeonBohdan](https://github.com/NeonBohdan))

## [0.0.2a5](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a5) (2024-01-17)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a3...0.0.2a5)

**Merged pull requests:**

- Use Gemini [\#1](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/1) ([NeonBohdan](https://github.com/NeonBohdan))

## [0.0.2a3](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a3) (2023-11-16)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a2...0.0.2a3)

## [0.0.2a2](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a2) (2023-10-23)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a1...0.0.2a2)

## [0.0.2a1](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a1) (2023-10-19)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1...0.0.2a1)

## [0.0.1](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1) (2023-06-30)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1a4...0.0.1)

## [0.0.1a4](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1a4) (2023-06-30)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1a3...0.0.1a4)

## [0.0.1a3](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1a3) (2023-06-06)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1a2...0.0.1a3)

## [0.0.1a2](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1a2) (2023-05-16)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/babd77e0f173fbe3681927677602c72c58774ff0...0.0.1a2)

\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*

NeonDaniel authored Mar 5, 2025
2 parents da7577b + c75f8fe commit 4b6cd71
Showing 18 changed files with 359 additions and 262 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/license_tests.yml
@@ -9,4 +9,4 @@ jobs:
license_tests:
uses: neongeckocom/.github/.github/workflows/license_tests.yml@master
with:
packages-exclude: '^(neon-llm-chatgpt|tqdm).*'
packages-exclude: '^(neon-llm|tqdm|klat-connector|neon-chatbot|dnspython|attrs|RapidFuzz).*'
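The broadened `packages-exclude` pattern above can be sanity-checked outside of CI. The snippet below is a minimal sketch using Python's `re` module; the package names are chosen purely for illustration, and the workflow itself may apply the pattern differently.

```python
import re

# Exclusion pattern from the updated license_tests workflow
PATTERN = re.compile(r"^(neon-llm|tqdm|klat-connector|neon-chatbot|dnspython|attrs|RapidFuzz).*")

# Illustrative package names: the first two are excluded, the last is not
for name in ("neon-llm-gemini", "tqdm", "requests"):
    print(f"{name}: excluded={bool(PATTERN.match(name))}")
```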
11 changes: 11 additions & 0 deletions .github/workflows/update_docker_images.yml
@@ -0,0 +1,11 @@
name: Publish Updated Docker Image
on:
workflow_dispatch:

jobs:

build_and_publish_docker:
uses: neongeckocom/.github/.github/workflows/publish_docker.yml@master
secrets: inherit
with:
include_semver: False
70 changes: 59 additions & 11 deletions CHANGELOG.md
@@ -1,32 +1,80 @@
# Changelog

## [0.0.1a4](https://github.com/NeonGeckoCom/neon-llm-chatgpt/tree/0.0.1a4) (2023-06-30)
## [0.0.2a10](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a10) (2025-03-03)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-chatgpt/compare/0.0.1a3...0.0.1a4)
[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a9...0.0.2a10)

**Merged pull requests:**

- Update config handling to use envvars instead of FS overlay [\#6](https://github.com/NeonGeckoCom/neon-llm-chatgpt/pull/6) ([NeonDaniel](https://github.com/NeonDaniel))
- Update to stable dependencies [\#7](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/7) ([NeonDaniel](https://github.com/NeonDaniel))

## [0.0.1a3](https://github.com/NeonGeckoCom/neon-llm-chatgpt/tree/0.0.1a3) (2023-06-06)
## [0.0.2a9](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a9) (2025-01-17)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-chatgpt/compare/0.0.1a2...0.0.1a3)
[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a8...0.0.2a9)

**Merged pull requests:**

- Add log for handled requests [\#4](https://github.com/NeonGeckoCom/neon-llm-chatgpt/pull/4) ([NeonDaniel](https://github.com/NeonDaniel))
- Update Docker configuration and async consumer init [\#6](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/6) ([NeonDaniel](https://github.com/NeonDaniel))

## [0.0.1a2](https://github.com/NeonGeckoCom/neon-llm-chatgpt/tree/0.0.1a2) (2023-05-16)
## [0.0.2a8](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a8) (2025-01-17)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-chatgpt/compare/babd77e0f173fbe3681927677602c72c58774ff0...0.0.1a2)
[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a7...0.0.2a8)

**Implemented enhancements:**
**Merged pull requests:**

- Add Docker container update automation [\#5](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/5) ([NeonDaniel](https://github.com/NeonDaniel))

## [0.0.2a7](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a7) (2024-04-04)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a6...0.0.2a7)

**Merged pull requests:**

- neon-llm-core version bump [\#4](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/4) ([NeonBohdan](https://github.com/NeonBohdan))

- \[FEAT\] Python Packaging [\#1](https://github.com/NeonGeckoCom/neon-llm-chatgpt/issues/1)
## [0.0.2a6](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a6) (2024-03-25)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a5...0.0.2a6)

**Merged pull requests:**

- Update for packaging and deployment [\#3](https://github.com/NeonGeckoCom/neon-llm-chatgpt/pull/3) ([NeonDaniel](https://github.com/NeonDaniel))
- neon-llm-core version bump [\#3](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/3) ([NeonBohdan](https://github.com/NeonBohdan))

## [0.0.2a5](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a5) (2024-01-17)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a3...0.0.2a5)

**Merged pull requests:**

- Use Gemini [\#1](https://github.com/NeonGeckoCom/neon-llm-gemini/pull/1) ([NeonBohdan](https://github.com/NeonBohdan))

## [0.0.2a3](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a3) (2023-11-16)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a2...0.0.2a3)

## [0.0.2a2](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a2) (2023-10-23)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.2a1...0.0.2a2)

## [0.0.2a1](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.2a1) (2023-10-19)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1...0.0.2a1)

## [0.0.1](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1) (2023-06-30)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1a4...0.0.1)

## [0.0.1a4](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1a4) (2023-06-30)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1a3...0.0.1a4)

## [0.0.1a3](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1a3) (2023-06-06)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/0.0.1a2...0.0.1a3)

## [0.0.1a2](https://github.com/NeonGeckoCom/neon-llm-gemini/tree/0.0.1a2) (2023-05-16)

[Full Changelog](https://github.com/NeonGeckoCom/neon-llm-gemini/compare/babd77e0f173fbe3681927677602c72c58774ff0...0.0.1a2)



13 changes: 8 additions & 5 deletions Dockerfile
@@ -1,15 +1,18 @@
FROM python:3.9-slim

LABEL vendor=neon.ai \
ai.neon.name="neon-llm-chatgpt"
ai.neon.name="neon-llm-gemini"

ENV OVOS_CONFIG_BASE_FOLDER=neon
ENV OVOS_CONFIG_FILENAME=diana.yaml
ENV OVOS_DEFAULT_CONFIG=/opt/neon/diana.yaml
ENV XDG_CONFIG_HOME=/config
ENV CHATBOT_VERSION=v2

ENV OVOS_CONFIG_BASE_FOLDER neon
ENV OVOS_CONFIG_FILENAME diana.yaml
ENV XDG_CONFIG_HOME /config
COPY docker_overlay/ /

WORKDIR /app
COPY . /app
RUN pip install /app

CMD [ "neon-llm-chatgpt" ]
CMD [ "neon-llm-gemini" ]
2 changes: 1 addition & 1 deletion LICENSE.md
@@ -1,6 +1,6 @@
# NEON AI (TM) SOFTWARE, Software Development Kit & Application Development System
# All trademark and other rights reserved by their respective owners
# Copyright 2008-2021 Neongecko.com Inc.
# Copyright 2008-2025 Neongecko.com Inc.
# BSD-3 License

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the
19 changes: 10 additions & 9 deletions README.md
@@ -1,5 +1,5 @@
# NeonAI LLM ChatGPT
Proxies API calls to ChatGPT.
# NeonAI LLM Gemini
Proxies API calls to Google Gemini.

## Request Format
API requests should include `history`, a list of tuples of strings, and the current
@@ -25,20 +25,21 @@ MQ:
port: <MQ Port>
server: <MQ Hostname or IP>
users:
mq-chatgpt-api:
password: <neon_chatgpt user's password>
user: neon_chatgpt
ChatGPT:
key: ""
model: "gpt-3.5-turbo"
neon_llm_gemini:
password: <neon_gemini user's password>
user: neon_gemini
LLM_GEMINI:
model: "gemini-pro"
key_path: ""
role: "You are trying to give a short answer in less than 40 words."
context_depth: 3
max_tokens: 100
num_parallel_processes: 2
```
For example, if your configuration resides in `~/.config`:
```shell
export CONFIG_PATH="/home/${USER}/.config"
docker run -v ${CONFIG_PATH}:/config neon_llm_chatgpt
docker run -v ${CONFIG_PATH}:/config neon_llm_gemini
```
> Note: If connecting to a local MQ server, you may need to specify `--network host`
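The Request Format section of the updated README describes `history` as a list of string tuples accompanying the current query. A minimal sketch of such a payload follows; only the shape of `history` is taken from the README, while the `query` key and the role labels are illustrative assumptions rather than confirmed field names.

```python
# Hypothetical request payload: only the `history` shape (a list of string tuples)
# comes from the README; the other keys and the role labels are illustrative.
example_request = {
    "query": "What is the capital of France?",  # assumed name for the current prompt
    "history": [                                # (speaker, utterance) string tuples
        ("user", "Hello"),
        ("llm", "Hi! How can I help you today?"),
    ],
}

for speaker, utterance in example_request["history"]:
    print(f"{speaker}: {utterance}")
```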
21 changes: 0 additions & 21 deletions docker_overlay/etc/neon/diana.yaml

This file was deleted.

31 changes: 31 additions & 0 deletions docker_overlay/opt/neon/diana.yaml
@@ -0,0 +1,31 @@
log_level: INFO
logs:
level_overrides:
error:
- pika
warning:
- filelock
- watchdog
- httpcore
info:
- openai
- asyncio
- matplotlib
debug: []
MQ:
server: neon-rabbitmq
port: 5672
users:
mq_handler:
user: neon_api_utils
password: Klatchat2021
LLM_GEMINI:
model: "gemini-pro"
role: "You are trying to give a short answer in less than 40 words."
context_depth: 3
max_tokens: 100
num_parallel_processes: 2
llm_bots:
gemini:
- description: You are trying to give a short answer in less than 40 words.
name: assistant
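The new overlay groups the Gemini settings under a single `LLM_GEMINI` key. The snippet below is a minimal sketch that inspects those values with plain PyYAML; the running service loads configuration through its own configuration stack, so this only illustrates the structure of the file, using the container path established above.

```python
# Minimal sketch: inspect the LLM_GEMINI section of the overlay with PyYAML.
# The service loads configuration through its own config stack; this only
# illustrates the structure of the file added in this commit.
import yaml

CONFIG_PATH = "/opt/neon/diana.yaml"  # container path used by the overlay above

with open(CONFIG_PATH) as f:
    config = yaml.safe_load(f)

gemini_cfg = config["LLM_GEMINI"]
print(gemini_cfg["model"])          # "gemini-pro"
print(gemini_cfg["context_depth"])  # 3
print(gemini_cfg["max_tokens"])     # 100
```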
63 changes: 0 additions & 63 deletions neon_llm_chatgpt/chatgpt.py

This file was deleted.

19 changes: 0 additions & 19 deletions neon_llm_chatgpt/default_config.json

This file was deleted.

