Quickstart: Conversation
Alpha
The conversation building block is currently in alpha.
Let’s take a look at how the Dapr conversation building block makes interacting with Large Language Models (LLMs) easier. In this quickstart, you use the echo component to communicate with a mock LLM and send it the prompt “What is dapr?”.
You can try out this conversation quickstart by either:
- Running the application in this sample with the Multi-App Run template file, or
- Running the application without the template
Note
Currently, only the HTTP quickstart sample is available in Python and JavaScript.
Run the app with the template file
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/python/http/conversation
Install the dependencies:
pip3 install -r requirements.txt
Step 3: Launch the conversation service
Navigate back to the http directory and start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
What happened?
The dapr.yaml Multi-App Run template file is included in the http directory of this Quickstart.
Running dapr run -f . in this Quickstart started app.py.
dapr.yaml Multi-App Run template file
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appID: conversation
    appDirPath: ./conversation/
    command: ["python3", "app.py"]
Echo mock LLM component
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
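As an illustrative sketch only (the component name, key, and model values below are placeholders; the exact metadata fields are documented in the conversation how-to guide), an OpenAI component file might look like:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
    # Placeholder values; replace with your own key and preferred model.
    - name: key
      value: "YOUR_OPENAI_API_KEY"
    - name: model
      value: gpt-4-turbo
```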
app.py conversation app
In the application code:
- The app sends an input “What is dapr?” to the echo mock LLM component.
- The mock LLM echoes “What is dapr?”.
import logging
import requests
import os

logging.basicConfig(level=logging.INFO)

base_url = os.getenv('BASE_URL', 'http://localhost') + ':' + os.getenv(
    'DAPR_HTTP_PORT', '3500')

CONVERSATION_COMPONENT_NAME = 'echo'

input = {
    'name': 'echo',
    'inputs': [{'message': 'What is dapr?'}],
    'parameters': {},
    'metadata': {}
}

# Send input to conversation endpoint
result = requests.post(
    url='%s/v1.0-alpha1/conversation/%s/converse' % (base_url, CONVERSATION_COMPONENT_NAME),
    json=input
)

logging.info('Input sent: What is dapr?')

# Parse conversation output
data = result.json()
output = data["outputs"][0]["result"]

logging.info('Output response: ' + output)
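The echo component returns each input message unchanged, so the response body mirrors the request. As a minimal sketch (the sample payload below is a hypothetical response shaped like the one the app parses, not captured output), the outputs can be extracted like this:

```python
# Hypothetical response body, shaped like the converse output the app parses.
sample_response = {"outputs": [{"result": "What is dapr?"}]}

def extract_results(data):
    """Return every result string from a converse response body."""
    return [output["result"] for output in data.get("outputs", [])]

print(extract_results(sample_response))  # prints ['What is dapr?']
```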
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/javascript/http/conversation
Install the dependencies:
npm install
Step 3: Launch the conversation service
Navigate back to the http directory and start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
What happened?
The dapr.yaml Multi-App Run template file is included in the http directory of this Quickstart.
Running dapr run -f . in this Quickstart started index.js.
dapr.yaml Multi-App Run template file
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appID: conversation
    appDirPath: ./conversation/
    daprHTTPPort: 3502
    command: ["npm", "run", "start"]
Echo mock LLM component
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
index.js conversation app
In the application code:
- The app sends an input “What is dapr?” to the echo mock LLM component.
- The mock LLM echoes “What is dapr?”.
const conversationComponentName = "echo";

async function main() {
  const daprHost = process.env.DAPR_HOST || "http://localhost";
  const daprHttpPort = process.env.DAPR_HTTP_PORT || "3500";

  const inputBody = {
    name: "echo",
    inputs: [{ message: "What is dapr?" }],
    parameters: {},
    metadata: {},
  };

  const reqURL = `${daprHost}:${daprHttpPort}/v1.0-alpha1/conversation/${conversationComponentName}/converse`;

  try {
    const response = await fetch(reqURL, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify(inputBody),
    });

    console.log("Input sent: What is dapr?");

    const data = await response.json();
    const result = data.outputs[0].result;
    console.log("Output response:", result);
  } catch (error) {
    console.error("Error:", error.message);
    process.exit(1);
  }
}

main().catch((error) => {
  console.error("Unhandled error:", error);
  process.exit(1);
});
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/csharp/sdk
Step 3: Launch the conversation service
Start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
What happened?
The dapr.yaml Multi-App Run template file is included in the sdk directory of this Quickstart.
Running dapr run -f . in this Quickstart started the conversation Program.cs.
dapr.yaml Multi-App Run template file
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appDirPath: ./conversation/
    appID: conversation
    daprHTTPPort: 3500
    command: ["dotnet", "run"]
Echo mock LLM component
In the conversation/components directory, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
Program.cs conversation app
In the application code:
- The app sends an input “What is dapr?” to the echo mock LLM component.
- The mock LLM echoes “What is dapr?”.
using Dapr.AI.Conversation;
using Dapr.AI.Conversation.Extensions;

class Program
{
  private const string ConversationComponentName = "echo";

  static async Task Main(string[] args)
  {
    const string prompt = "What is dapr?";

    var builder = WebApplication.CreateBuilder(args);
    builder.Services.AddDaprConversationClient();
    var app = builder.Build();

    // Instantiate Dapr Conversation Client
    var conversationClient = app.Services.GetRequiredService<DaprConversationClient>();

    try
    {
      // Send a request to the echo mock LLM component
      var response = await conversationClient.ConverseAsync(ConversationComponentName, [new(prompt, DaprConversationRole.Generic)]);
      Console.WriteLine("Input sent: " + prompt);

      if (response != null)
      {
        Console.Write("Output response:");
        foreach (var resp in response.Outputs)
        {
          Console.WriteLine($" {resp.Result}");
        }
      }
    }
    catch (Exception ex)
    {
      Console.WriteLine("Error: " + ex.Message);
    }
  }
}
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/go/sdk
Step 3: Launch the conversation service
Start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
What happened?
The dapr.yaml Multi-App Run template file is included in the sdk directory of this Quickstart.
Running dapr run -f . in this Quickstart started conversation.go.
dapr.yaml Multi-App Run template file
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appDirPath: ./conversation/
    appID: conversation
    daprHTTPPort: 3501
    command: ["go", "run", "."]
Echo mock LLM component
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
conversation.go conversation app
In the application code:
- The app sends an input “What is dapr?” to the echo mock LLM component.
- The mock LLM echoes “What is dapr?”.
package main

import (
	"context"
	"fmt"
	"log"

	dapr "github.com/dapr/go-sdk/client"
)

func main() {
	client, err := dapr.NewClient()
	if err != nil {
		panic(err)
	}

	input := dapr.ConversationInput{
		Message: "What is dapr?",
		// Role:     nil, // Optional
		// ScrubPII: nil, // Optional
	}

	fmt.Println("Input sent:", input.Message)

	var conversationComponent = "echo"

	request := dapr.NewConversationRequest(conversationComponent, []dapr.ConversationInput{input})

	resp, err := client.ConverseAlpha1(context.Background(), request)
	if err != nil {
		log.Fatalf("err: %v", err)
	}

	fmt.Println("Output response:", resp.Outputs[0].Result)
}
Run the app without the template
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/python/http/conversation
Install the dependencies:
pip3 install -r requirements.txt
Step 3: Launch the conversation service
Navigate back to the http directory and start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components -- python3 app.py
Note: Since python3.exe is not defined on Windows, you may need to use python app.py instead of python3 app.py.
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/javascript/http/conversation
Install the dependencies:
npm install
Step 3: Launch the conversation service
Navigate back to the http directory and start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components/ -- npm run start
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/csharp/sdk/conversation
Install the dependencies:
dotnet build
Step 3: Launch the conversation service
Start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components/ -- dotnet run
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
Step 1: Pre-requisites
For this example, you will need:
Step 2: Set up the environment
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/go/sdk/conversation
Install the dependencies:
go build .
Step 3: Launch the conversation service
Start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components/ -- go run .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
Demo
Watch the demo presented during Diagrid’s Dapr v1.15 celebration to see how the conversation API works using the .NET SDK.
Tell us what you think!
We’re continuously working to improve our Quickstart examples and value your feedback. Did you find this Quickstart helpful? Do you have suggestions for improvement?
Join the discussion in our Discord channel.
Next steps
- HTTP samples of this quickstart:
- Learn more about the conversation building block