# ChatDatabricks
The Databricks Lakehouse Platform unifies data, analytics, and AI on one platform.
This notebook provides a quick overview for getting started with Databricks chat models. For detailed documentation of all ChatDatabricks features and configurations, head to the API reference.
## Overview
The ChatDatabricks class wraps a chat model endpoint hosted on Databricks Model Serving. This example notebook shows how to wrap your serving endpoint and use it as a chat model in your LangChain application.
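As a quick sketch of that wrapping (assuming `langchain-databricks` is installed and authenticated against a workspace; the endpoint name below is a placeholder — swap in one of your own):

```python
from langchain_databricks import ChatDatabricks

# Hypothetical endpoint name -- replace with a serving endpoint from your workspace.
llm = ChatDatabricks(
    endpoint="databricks-dbrx-instruct",
    temperature=0.1,
    max_tokens=256,
)

# Standard LangChain chat-model usage: pass (role, content) tuples or Message objects.
response = llm.invoke(
    [
        ("system", "You are a helpful assistant."),
        ("human", "What is Databricks Model Serving?"),
    ]
)
print(response.content)
```

Once wrapped, the model can be composed into chains and agents like any other LangChain chat model.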
### Integration details
| Class | Package | Local | Serializable |
|---|---|---|---|
| ChatDatabricks | langchain-databricks | ❌ | beta |
### Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ |
### Supported Methods
ChatDatabricks supports all methods of `ChatModel`, including async APIs.
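Because the async APIs mirror the sync ones, the calling pattern is the same with `a`-prefixed methods. A minimal sketch (the endpoint name is hypothetical, and a live workspace is required to actually run this):

```python
import asyncio

from langchain_databricks import ChatDatabricks

# Hypothetical endpoint name -- replace with one from your workspace.
llm = ChatDatabricks(endpoint="databricks-dbrx-instruct")

async def main() -> None:
    # Async counterpart of llm.invoke().
    reply = await llm.ainvoke("What is MLflow?")
    print(reply.content)

    # Token-level streaming works the same way via astream().
    async for chunk in llm.astream("Name one Databricks product."):
        print(chunk.content, end="")

asyncio.run(main())
```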
### Endpoint Requirement
The serving endpoint that ChatDatabricks wraps must use an OpenAI-compatible chat input/output format (reference). As long as the input format is compatible, ChatDatabricks can be used for any endpoint type hosted on Databricks Model Serving:
- Foundation Models - A curated list of state-of-the-art foundation models, such as DBRX, Llama 3, and Mixtral-8x7B. These endpoints are ready to use in your Databricks workspace without any setup.
- Custom Models - You can also deploy custom models to a serving endpoint via MLflow, with your choice of framework such as LangChain, PyTorch, or Transformers.
- External Models - Databricks endpoints can act as a proxy for models hosted outside Databricks, such as proprietary model services like OpenAI's GPT-4.
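The OpenAI-compatible contract above means the endpoint accepts and returns JSON shaped like the Chat Completions API. A minimal sketch of that shape as plain Python dicts (all values illustrative):

```python
# Request body an OpenAI-compatible chat endpoint expects.
request = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is Databricks?"},
    ],
    "max_tokens": 128,
    "temperature": 0.1,
}

# Response body it returns (values illustrative).
response = {
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Databricks is ..."},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 20, "completion_tokens": 8, "total_tokens": 28},
}

# ChatDatabricks translates LangChain messages to and from this shape,
# so any endpoint speaking this format can back the chat model.
assistant_text = response["choices"][0]["message"]["content"]
print(assistant_text)
```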