Okahu is a team of AI, observability, and cloud engineers working to simplify observability for agentic and other GenAI apps. We serve AI app developers, platform engineers, and engineering leaders who want to build reliable, accurate, and safer AI apps. We believe in community-driven open source software and are a major contributor to Project Monocle, a GenAI-native observability project hosted by the Linux Foundation.
We've curated working examples of different kinds of GenAI apps and ways to instrument them to get meaningful observability insights. GenAI apps are built using a variety of languages, LLM frameworks, models, and cloud services. Instrumentation of these apps is demonstrated using open-source software, including Monocle and OpenTelemetry, or commercially supported products from Okahu or friends.
These examples are aimed at AI app developers, platform engineers, and anyone else who wants to learn to build and operate impactful GenAI apps.
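To give a feel for what "meaningful observability insights" look like, here is a purely illustrative, stdlib-only sketch of the kind of span data that GenAI observability tools such as Monocle emit as OpenTelemetry-style traces. This is not Monocle's actual API (real instrumentation is typically a one-line setup call that hooks into frameworks like Langchain); the span fields below are assumptions chosen for illustration.

```python
import json
import time
from contextlib import contextmanager

# Illustrative sketch only: shows the *kind* of span data that GenAI
# observability tools emit as traces. This is NOT the Monocle API.
spans = []

@contextmanager
def llm_span(name, model):
    """Record a span around an (simulated) LLM inference call."""
    span = {"name": name, "attributes": {"llm.model": model}}
    start = time.monotonic()
    try:
        yield span
    finally:
        elapsed_ms = (time.monotonic() - start) * 1000
        span["attributes"]["latency_ms"] = round(elapsed_ms, 1)
        spans.append(span)

# Simulated inference call wrapped in a span; in a real app these
# attributes would come from the model provider's API response.
with llm_span("chat.completion", model="gpt-3.5-turbo") as span:
    answer = "Arabica beans are prized for their smooth flavor."
    span["attributes"]["llm.completion"] = answer

print(json.dumps(spans, indent=2))
```

A real trace from any of the demos below carries the same shape of information (model, inputs, outputs, latency) per step of the app, which is what makes debugging accuracy and cost issues tractable.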
Repository | Description | GenAI app | Observability
---|---|---|---
chatbot-coffee-lambda | Cloud-hosted interactive chatbot webapp that answers questions about coffee, with traces stored in AWS S3 | Next.js, TypeScript + Python, Langchain, OpenAI, AWS | Monocle
chatbot-coffee-vercel | Vercel-hosted interactive chatbot webapp that answers questions about coffee, with traces stored in AWS S3 | Next.js, TypeScript, Langchain, OpenAI, Vercel | Monocle
okahu-demo-lc-openai | Personal interactive chatbot command-line Python app that answers questions about coffee, with local traces. Runs on your laptop or GitHub Codespaces | Python, Langchain, OpenAI, GitHub Codespaces/laptop | Okahu
The Okahu team also hosts versions of these GenAI apps in our cloud.[^1]
Demo | Instructions | Observability
---|---|---
AI assistant that answers questions about coffee, demonstrating instrumentation of GenAI apps with open-source Monocle | Ask a question and view Monocle-generated traces | Monocle
Interactive chatbot webapp that answers questions about coffee, hosted in AWS with Monocle traces pushed to S3 | | Monocle
REST-based chatbot that answers coffee-related questions. Follows a RAG design pattern with gpt-3.5-turbo hosted in Azure OpenAI; app code implemented in Llamaindex and deployed in an Azure function | 1. Postman client 2. Azure Functions okahu-demo-bot-az-func 3. Azure OpenAI okahu-openai-dev 4. Okahu Cloud Personal | Okahu
REST-based chatbot that answers coffee-related questions. Simple app with an LLM inference call to gpt-3.5-turbo hosted in AWS Bedrock from a Langchain app served in AWS Lambda | 1. Client 2. AWS Lambda function bedrock-kb-sbx-ask-question 3. AWS Bedrock 4. Okahu Cloud Personal | Okahu
REST-based chatbot that answers coffee-related questions. Follows a RAG design pattern with gpt-3.5-turbo hosted in Sagemaker; app code implemented in Langchain and deployed in an AWS Lambda function | 1. Client okahu-sagemaker-rag-chatbot-ui 2. AWS Lambda function sagemaker-okahu-demo-langchain 3. AWS Sagemaker Studio 4. Okahu Cloud Personal | Okahu
Email dx@okahu.ai to contribute your demo apps to this repo.
[^1]: Use of the Okahu hosted demo is covered by Okahu's terms of service for evaluations. Connect with us on LinkedIn, GitHub, or email us at dx@okahu.ai.