Quickstart
Start with Vibranium Dome
Setup
The project has multiple components, the easiest way to start is by using Docker Compose or Helm Charts.
OpenSearch requires the max virtual memory areas setting, vm.max_map_count, to be at least 262144. In some environments it defaults to 65530, so check the current value first and apply the workaround below if needed.
Run first:
docker run -u root -it opensearchproject/opensearch:2.9.0 bash -c "cat /proc/sys/vm/max_map_count"
If that command returns 65530, run the following command to increase it temporarily:
docker run --rm --privileged alpine sysctl -w vm.max_map_count=262144
Then run the previous command again to make sure the new value is set to 262144:
docker run -u root -it opensearchproject/opensearch:2.9.0 bash -c "cat /proc/sys/vm/max_map_count"
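On a Linux host you can also perform the same check without Docker by reading /proc directly. A minimal sketch (the helper name is illustrative, not part of the project):

```python
from pathlib import Path

REQUIRED = 262144

def max_map_count_ok(proc_path="/proc/sys/vm/max_map_count"):
    """Return True when vm.max_map_count meets OpenSearch's requirement.

    Returns None when the file is missing (non-Linux hosts, where the
    Docker-based check above should be used instead).
    """
    p = Path(proc_path)
    if not p.exists():
        return None
    return int(p.read_text().strip()) >= REQUIRED
```

If this returns False, apply the sysctl workaround shown above and re-check.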
On a Linux machine, OpenSearch needs the filesystem volumes to be owned by UID/GID 1000, so run:
mkdir vibraniumdome-opensearch/vibraniumdome-opensearch-data1
mkdir vibraniumdome-opensearch/vibraniumdome-opensearch-data2
sudo chown -R 1000:1000 vibraniumdome-opensearch/vibraniumdome-opensearch-data1
sudo chown -R 1000:1000 vibraniumdome-opensearch/vibraniumdome-opensearch-data2
Set the OPENAI_API_KEY environment variable in the environment file (you can generate an API key in your OpenAI account).
Clone the repository
git clone https://github.com/genia-dev/vibraniumdome
Change directory to vibraniumdome
cd vibraniumdome
Start the project
With docker-compose: docker-compose up
With Helm: helm install test-release helm/
Open The Application
Navigate to Vibranium-App and log in with the default user admin@admin.com and password admin. You can change these credentials later. The docker-compose up command above uses the existing Docker images from Docker Hub; if you would like to modify and build the project from source locally, run docker-compose in build mode:
cp vibraniumdome-app/.env.example .env
docker-compose -f docker-compose.build.yml up --build
Analyze your first LLM interaction
Analyze LLM Interactions by streamlit app
After you have the whole system running per the instructions above, navigate to VibraniumDome-Streamlit-App.
In the left pane of the Streamlit app, configure the OpenAI API Key in the Environment Variables section. By default, the Streamlit app uses http://localhost:5001 as the (optional) Vibranium Dome Base URL; you can change this to point to a different deployment.
Now you can chat in the app, and see the results in Vibranium-App.
Analyze LLM Interactions by code
Install the Vibranium Dome SDK
Create a demo application using the old OpenAI SDK (pre-1.0) in main.py
python3 -m venv venv
source venv/bin/activate
pip3 install vibraniumdome-sdk==0.3.0 openai==0.28.1
import os
import openai

from vibraniumdome_sdk import VibraniumDome

openai.api_key = os.getenv("OPENAI_API_KEY")

VibraniumDome.init(app_name="set_your_agent_name_here")

def main():
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the world series in 2020?"},
            {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
            {"role": "user", "content": "Where was it played?"},
        ],
        temperature=0,
        request_timeout=5,
        user="user-123456",
        headers={"x-session-id": "abcd-1234-cdef"},
    )
    print(response)

if __name__ == "__main__":
    main()
Create a demo application using the new OpenAI SDK (1.x) in main.py
python3 -m venv venv
source venv/bin/activate
pip3 install vibraniumdome-sdk openai
Optionally set the VIBRANIUM_DOME_API_KEY environment variable; if it is not provided, the SDK uses the default key defined in the Vibranium Dome system.

import os

from openai import OpenAI
from vibraniumdome_sdk import VibraniumDome

VibraniumDome.init(app_name="set_your_agent_name_here")

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    user="user-123456",
    extra_headers={"x-session-id": "abcd-1234-cdef"},
)

print(completion.choices[0].message.content)
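Both demos pass an x-session-id header so that related LLM calls can be grouped into one session. A minimal sketch for generating a fresh session id per conversation (the helper name is hypothetical, not part of the SDK):

```python
import uuid

def new_session_headers():
    """Build a unique x-session-id header for one conversation.

    Reuse the same dict for every call in the conversation so all of its
    LLM interactions are grouped together; create a new one per conversation.
    """
    return {"x-session-id": str(uuid.uuid4())}
```

Pass the result as headers= with the old SDK or extra_headers= with the new SDK, in place of the hardcoded "abcd-1234-cdef" value above.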
Run the demo application
python3 main.py
Navigate to Vibranium-App to see the results.
You can also set the VIBRANIUM_DOME_BASE_URL environment variable (default: http://localhost:5001) to point the SDK at a different Vibranium Dome deployment.
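As a sketch (assuming the SDK reads these variables from the process environment, per the notes above), you can set both before calling VibraniumDome.init; the helper function is illustrative only:

```python
import os

def vibranium_env(base_url="http://localhost:5001", api_key=None):
    """Collect the environment variables the SDK reads at init time.

    VIBRANIUM_DOME_BASE_URL defaults to http://localhost:5001;
    VIBRANIUM_DOME_API_KEY is optional (the default key defined in the
    Vibranium Dome system is used when it is unset).
    """
    env = {"VIBRANIUM_DOME_BASE_URL": base_url}
    if api_key:
        env["VIBRANIUM_DOME_API_KEY"] = api_key
    return env

# Apply before VibraniumDome.init(app_name=...) runs in your application.
os.environ.update(vibranium_env())
```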