---
title: langchain-streamlit-demo
emoji: 🦜
colorFrom: green
colorTo: red
sdk: docker
app_port: 7860
pinned: true
tags:
- langchain
- streamlit
- docker
---
# langchain-streamlit-demo

This project shows how to build a simple chatbot UI with Streamlit and LangChain.

This README was written by Claude 2, an LLM from Anthropic.
# Features

- Chat interface for talking to an AI assistant
- Supports models from:
  - OpenAI
    - `gpt-3.5-turbo`
    - `gpt-4`
  - Anthropic
    - `claude-instant-v1`
    - `claude-2`
  - Anyscale Endpoints
    - `meta-llama/Llama-2-7b-chat-hf`
    - `meta-llama/Llama-2-13b-chat-hf`
    - `meta-llama/Llama-2-70b-chat-hf`
- Streaming output of assistant responses
- Leverages LangChain for dialogue management
- Integrates with LangSmith for tracing conversations
- Allows giving feedback on the assistant's responses
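The streaming behavior above can be illustrated with a plain-Python sketch (no Streamlit or LangChain required); `fake_token_stream` below is a hypothetical stand-in for an LLM token stream, not part of this repo:

```python
from typing import Iterator


def fake_token_stream(text: str) -> Iterator[str]:
    """Hypothetical stand-in for an LLM token stream: yields one word at a time."""
    for word in text.split():
        yield word + " "


def render_streaming(tokens: Iterator[str]) -> str:
    """Accumulate tokens as they arrive, the way the app updates the chat display
    incrementally instead of waiting for the full response."""
    rendered = ""
    for token in tokens:
        rendered += token
        # In the real app, a Streamlit placeholder would be redrawn here on each token.
    return rendered


print(render_streaming(fake_token_stream("Hello from the assistant")))
```

In the actual app, the token iterator comes from the LLM client and each partial string is written to a Streamlit container, so the user sees the answer appear word by word.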
# Usage

## Run on HuggingFace Spaces
## With Docker (pull from Docker Hub)

- Run in terminal:

  ```bash
  docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest
  ```

- Open http://localhost:7860 in your browser.
## With Docker Compose

- Clone the repo and navigate to the cloned repo directory.
- Run in terminal:

  ```bash
  docker compose up
  ```

- Then open http://localhost:7860 in your browser.
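For reference, a compose file along these lines would produce the behavior described above. This is a minimal sketch based on the `docker run` command in the previous section, not necessarily the `docker-compose.yml` shipped in the repo:

```yaml
version: "3"
services:
  langchain-streamlit-demo:
    image: joshuasundance/langchain-streamlit-demo:latest
    ports:
      - "7860:7860"
```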
# Configuration
- Select a model from the dropdown
- Enter an API key for the relevant provider
- Optionally enter a LangSmith API key to enable conversation tracing
- Customize the assistant prompt and temperature
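The settings above could be modeled roughly as follows. This is a hypothetical sketch (the names and defaults are assumptions, not taken from `app.py`):

```python
from dataclasses import dataclass


@dataclass
class ChatSettings:
    """Hypothetical container for the sidebar settings described above."""

    model: str = "gpt-3.5-turbo"          # selected from the dropdown
    provider_api_key: str = ""            # key for the chosen provider
    langsmith_api_key: str = ""           # optional; enables conversation tracing
    system_prompt: str = "You are a helpful chatbot."
    temperature: float = 0.7


# Example: pick an Anthropic model with a lower temperature.
settings = ChatSettings(model="claude-2", temperature=0.3)
print(settings.model)
```

In the real app these values are read from Streamlit sidebar widgets and passed to the LangChain helpers when the chain is constructed.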
# Code Overview

- `app.py`: Main Streamlit app definition
- `llm_stuff.py`: LangChain helper functions
# Deployment

The app is packaged as a Docker image for easy deployment. It is published to Docker Hub and Hugging Face Spaces; CI workflows in `.github/workflows` handle building and publishing the image.
# Links
# TODO

- More customization / parameterization in the sidebar