
I have a chatbot created in C# using SDK v4 which has multiple dialogs; each dialog calls another. In one dialog I render an Adaptive Card in STEP #1 which has two inputs and an OK button: 1. Date 2. Time 3. OK button. In STEP #2 I want to extract/capture the values submitted via the OK button and continue with the process.

Issue: In a waterfall dialog in C#, how do I extract in STEP #2 the values that were submitted in STEP #1?

Language: C#

Bot SDK: V4

Please provide a step-by-step guide, as I am new to bots and coding.

I have already tried a few things: 1. Putting the rendered card in a prompt. 2. Trying to extract/capture the value through stepContext.Context.Activity.Value.

None of this helped.

STEP #1:

    var cardAttachment = CreateAdaptiveCardAttachment(this.cards);
    var reply = stepContext.Context.Activity.CreateReply();
    reply.Attachments = new List<Attachment>() { cardAttachment };
    return await stepContext.Context.SendActivityAsync(reply);
    // or
    return await stepContext.PromptAsync("datetextPrompt", new PromptOptions() { Prompt = reply });

STEP #2: I want to extract or capture the submitted values. How do I do it?

Adaptive Cards are not really supported yet in waterfall dialogs. This is the GitHub issue (github.com/Microsoft/botbuilder-dotnet/issues/614) tracking this request. There are some solutions involving creating your own card prompt, such as the one discussed here (stackoverflow.com/questions/53009106/…). – Ed Boykin
@EdBoykin Clarification: Adaptive Cards are absolutely supported in waterfall dialogs. They just can't currently be used as prompts, since that wasn't really their original intent. They can still be displayed, however. – mdrichardson
I agree. I should have been clearer in my statement. I appreciate the clarification. – Ed Boykin

1 Answer


Using Adaptive Cards with Waterfall Dialogs

Natively, Adaptive Cards don't work like prompts. A prompt displays its message and then waits for user input before the dialog continues. With an Adaptive Card, even one that contains an input box and a submit button, nothing in the card causes a Waterfall Dialog to wait for user input before continuing.

So, if you're using an Adaptive Card that takes user input, you generally want to handle whatever the user submits outside of the context of a Waterfall Dialog.
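For example, a bot could watch for card submissions in OnTurnAsync before any dialog runs. This is a minimal sketch, not part of the original answer: the input ids "date" and "time" are assumptions based on the question's card, and the reply text is illustrative.

    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Bot.Builder;
    using Microsoft.Bot.Schema;
    using Newtonsoft.Json.Linq;

    public override async Task OnTurnAsync(ITurnContext turnContext, CancellationToken cancellationToken = default)
    {
        var activity = turnContext.Activity;

        // Card submissions arrive in Activity.Value, not Activity.Text
        if (activity.Type == ActivityTypes.Message && activity.Value != null)
        {
            var submission = JObject.FromObject(activity.Value);
            var date = (string)submission["date"]; // assumed id of the date input on the card
            var time = (string)submission["time"]; // assumed id of the time input on the card
            await turnContext.SendActivityAsync($"Received {date} {time}", cancellationToken: cancellationToken);
            return; // handled entirely outside the dialog stack
        }

        await base.OnTurnAsync(turnContext, cancellationToken);
    }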

That being said, if you want to use an Adaptive Card as part of a Waterfall Dialog, there is a workaround. Basically, you:

  1. Display the Adaptive Card
  2. Display a Text Prompt
  3. Convert the user's Adaptive Card input into the input of a Text Prompt

In your Waterfall Dialog class (steps 1 and 2):

    private async Task<DialogTurnResult> DisplayCardAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        // Display the Adaptive Card
        var cardPath = Path.Combine(".", "AdaptiveCard.json");
        var cardJson = File.ReadAllText(cardPath);
        var cardAttachment = new Attachment()
        {
            ContentType = "application/vnd.microsoft.card.adaptive",
            Content = JsonConvert.DeserializeObject(cardJson),
        };
        var message = MessageFactory.Text("");
        message.Attachments = new List<Attachment>() { cardAttachment };
        await stepContext.Context.SendActivityAsync(message, cancellationToken);

        // Create the text prompt
        var opts = new PromptOptions
        {
            Prompt = new Activity
            {
                Type = ActivityTypes.Message,
                Text = "waiting for user input...", // You can comment this out if you don't want to display any text. Still works.
            }
        };

        // Display a Text Prompt and wait for input
        return await stepContext.PromptAsync(nameof(TextPrompt), opts, cancellationToken);
    }

    private async Task<DialogTurnResult> HandleResponseAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        // Do something with stepContext.Result
        // Adaptive Card submissions are objects, so you likely need to JObject.Parse((string)stepContext.Result)
        await stepContext.Context.SendActivityAsync($"INPUT: {stepContext.Result}", cancellationToken: cancellationToken);
        return await stepContext.NextAsync(cancellationToken: cancellationToken);
    }
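With the workaround in step 3 (which copies Activity.Value into Activity.Text as a JSON string), stepContext.Result in the handler is that JSON string, so it might be parsed like this. A sketch only; the "userText" id matches the sample card further down, not necessarily your card.

    using Newtonsoft.Json.Linq;

    // stepContext.Result is the serialized Activity.Value from the card
    var submission = JObject.Parse((string)stepContext.Result);
    var userText = (string)submission["userText"]; // the Input.Text id from the card
    await stepContext.Context.SendActivityAsync($"You entered: {userText}", cancellationToken: cancellationToken);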

In your main bot class (<your-bot>.cs), under OnTurnAsync(), near the beginning of the method, somewhere before await dialogContext.ContinueDialogAsync(cancellationToken) is called (step 3):

    var activity = turnContext.Activity;

    if (string.IsNullOrWhiteSpace(activity.Text) && activity.Value != null)
    {
        activity.Text = JsonConvert.SerializeObject(activity.Value);
    }

Additional Context

Adaptive Cards send their Submit results a little differently than regular user text. When a user types in the chat and sends a normal message, it ends up in Context.Activity.Text. When a user fills out an input on an Adaptive Card, it ends up in Context.Activity.Value, which is an object where the key names are the ids in your card and the values are the field values in the Adaptive Card.

For example, the JSON:

{
    "type": "AdaptiveCard",
    "body": [
        {
            "type": "TextBlock",
            "text": "Test Adaptive Card"
        },
        {
            "type": "ColumnSet",
            "columns": [
                {
                    "type": "Column",
                    "items": [
                        {
                            "type": "TextBlock",
                            "text": "Text:"
                        }
                    ],
                    "width": 20
                },
                {
                    "type": "Column",
                    "items": [
                        {
                            "type": "Input.Text",
                            "id": "userText",
                            "placeholder": "Enter Some Text"
                        }
                    ],
                    "width": 80
                }
            ]
        }
    ],
    "actions": [
        {
            "type": "Action.Submit",
            "title": "Submit"
        }
    ],
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.0"
}

...creates a card with a "Test Adaptive Card" title, a "Text:" label beside a text input, and a Submit button (screenshot omitted).

If a user enters "Testing Testing 123" in the text box and hits Submit, Context.Activity will look something like:

{ type: 'message',
  value: { userText: 'Testing Testing 123' },
  from: { id: 'xxxxxxxx-05d4-478a-9daa-9b18c79bb66b', name: 'User' },
  locale: '',
  channelData: { postback: true },
  channelId: 'emulator',
  conversation: { id: 'xxxxxxxx-182b-11e9-be61-091ac0e3a4ac|livechat' },
  id: 'xxxxxxxx-182b-11e9-ad8e-63b45e3ebfa7',
  localTimestamp: 2019-01-14T18:39:21.000Z,
  recipient: { id: '1', name: 'Bot', role: 'bot' },
  timestamp: 2019-01-14T18:39:21.773Z,
  serviceUrl: 'http://localhost:58453' }

The user submission can be seen in Context.Activity.Value.userText.
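In C#, that value can be read by converting Activity.Value to a JObject, since it arrives as an object rather than a string. A minimal sketch, with error handling omitted:

    using Newtonsoft.Json.Linq;

    // turnContext.Activity.Value holds the card submission as an object,
    // so convert it rather than parsing a string
    var value = JObject.FromObject(turnContext.Activity.Value);
    var userText = (string)value["userText"]; // "userText" is the Input.Text id in the card above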

Note that Adaptive Card submissions are sent as a postBack, which means the submission data doesn't appear in the chat window as part of the conversation; it stays on the Adaptive Card.