Use Azure Prompt Flow to deploy LangChain text-based chat apps

Girish Kurup
2 min read · Sep 1, 2024

--

Yes, it is possible to deploy LangChain-based LLM applications in production using Azure Prompt Flow and LangServe. LangServe is specifically designed to serve LangChain chains and agents as REST APIs built on FastAPI. Here’s a brief overview of the process:

### Steps to Deploy LangChain Applications with LangServe

1. **Installation**

   – Install both client and server components:

   ```bash
   pip install "langserve[all]"
   ```

   – Install the LangChain CLI and Poetry:

   ```bash
   pip install -U langchain-cli poetry
   ```

2. **Create a New App**

   – Use the LangChain CLI to create a new app:

   ```bash
   langchain app new my-app
   ```
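   For orientation, `langchain app new` typically scaffolds a project along these lines (the exact layout may vary by CLI version):

   ```
   my-app/
   ├── app/
   │   └── server.py      # FastAPI entry point edited in step 4
   ├── packages/          # optional LangServe template packages
   ├── Dockerfile
   └── pyproject.toml     # Poetry configuration used in step 3
   ```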

3. **Add Third-Party Packages with Poetry**

   – Add the necessary packages:

   ```bash
   poetry add langchain-openai
   ```

4. **Edit the `server.py` File**

   – Import the necessary modules:

   ```python
   from fastapi import FastAPI
   from langchain.prompts import ChatPromptTemplate
   from langchain_openai import ChatOpenAI
   from langserve import add_routes
   ```

   – Initialize the FastAPI app:

   ```python
   app = FastAPI(title="LangChain Server", version="1.0", description="LangServe API")
   ```

   – Add routes (note that each route needs its own path):

   ```python
   add_routes(app, ChatOpenAI(), path="/openai")

   summarize_prompt = ChatPromptTemplate.from_template("Summarize this text: {text}")
   add_routes(app, summarize_prompt | ChatOpenAI(), path="/summarize")
   ```

   – Run the server:

   ```python
   import uvicorn

   uvicorn.run(app, host="localhost", port=8000)
   ```

5. **Test the Application**

   – Run the server from the app root directory:

   ```bash
   langchain serve
   ```

   – Navigate to `http://127.0.0.1:8000/summarize/playground/` in your browser to test and debug the app.
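   Besides the playground, each route added with `add_routes` exposes an `/invoke` endpoint that accepts `{"input": ...}` and returns `{"output": ...}`. As a sketch, the summarize route above could be called from a client with only the standard library (the URL and field names are assumptions that follow the route and prompt variable defined in step 4):

   ```python
   import json
   import urllib.request


   def build_invoke_payload(text: str) -> dict:
       # LangServe's /invoke endpoint wraps the chain input in an "input" key;
       # the "text" field matches the {text} variable in the summarize prompt.
       return {"input": {"text": text}}


   def invoke_summarize(text: str, base_url: str = "http://127.0.0.1:8000"):
       # POST the payload to the route's /invoke endpoint and return the
       # "output" field (for chat models this is a serialized message object).
       req = urllib.request.Request(
           f"{base_url}/summarize/invoke",
           data=json.dumps(build_invoke_payload(text)).encode("utf-8"),
           headers={"Content-Type": "application/json"},
       )
       with urllib.request.urlopen(req) as resp:
           return json.load(resp)["output"]
   ```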

6. **Implement Monitoring and Health Checks**

   – Add a health check endpoint:

   ```python
   @app.get("/health")
   async def health():
       return {"status": "Healthy"}
   ```
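   A monitoring system can then poll this endpoint. As a minimal stdlib sketch (the URL and the `{"status": "Healthy"}` response shape match the endpoint above; the function name is just illustrative):

   ```python
   import json
   import urllib.request
   import urllib.error


   def check_health(base_url: str, timeout: float = 2.0) -> bool:
       """Return True if the server's /health endpoint reports a healthy status."""
       try:
           with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
               return json.load(resp).get("status") == "Healthy"
       except (urllib.error.URLError, OSError, ValueError):
           # Connection refused, timeout, or malformed JSON all count as unhealthy.
           return False
   ```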

### Monitoring and Deployment

  • **Monitoring**: LangServe lacks built-in monitoring features, so you can use LangSmith tracing or integrate third-party tools such as Prometheus.
  • **Deployment**: You can deploy LangServe apps to cloud platforms such as AWS, Azure, or GCP¹²³.
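For the LangSmith option, tracing is typically enabled through environment variables set before the server starts (the API key and project name below are placeholders):

```shell
# Enable LangSmith tracing for the LangServe process
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY="<your-langsmith-api-key>"   # placeholder
export LANGCHAIN_PROJECT="my-app-production"          # hypothetical project name
```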

LangServe simplifies the process of deploying LangChain applications, making it easier to build, test, and deploy LLM applications in production.


Source: Conversation with Copilot, 01/09/2024

(1) LangServe | LangChain. https://python.langchain.com/v0.1/docs/langserve/.

(2) Introducing LangServe, the best way to deploy your LangChains. https://blog.langchain.dev/introducing-langserve/.

(3) LangServe: Tutorial for Easy LangChain Deployment. https://cheatsheet.md/langchain-tutorials/langserve-for-langchain-deployment.en.

(4) Using LangServe to build REST APIs for LangChain Applications. https://www.koyeb.com/tutorials/using-langserve-to-build-rest-apis-for-langchain-applications.

(5) Gen AI apps: Deploy LangChain on Cloud Run with LangServe – Google Cloud. https://cloud.google.com/blog/products/ai-machine-learning/deploy-langchain-on-cloud-run-with-langserve.

(6) Deploy LangChain applications on AWS with LangServe. https://community.aws/content/2eY9TZJMipfxYOTafH5AdjPDCKE/deploy-langchain-applications-on-aws-with-langserve.


Girish Kurup

Passionate about writing. I am a technology and data science enthusiast. Reach me at girishkurup21@gmail.com.