
I am building a chatbot in C# using:

.NET Core 2.1; Bot Framework SDK v4.0

I started with this tutorial to build my chatbot with QnA Maker and LUIS integrated.

I am confused about how to use QnA and LUIS together effectively. For example, let's say this bot is going to act as an FAQ bot for now, and I have 50 FAQs that I want the bot to answer. In my head, I would create a KB in QnA Maker with these 50 questions plus alternate phrasings.

Then a Dispatch file would create a LUIS app with a single intent that maps the utterances to QnA Maker's KB, and I am done?

I was wondering why I would add intents to LUIS at all. In the tutorial there are two intents, HomeAutomation and Weather, that exist only in LUIS. Unless I map those intents to questions in a QnA Maker KB, do they perform any function? I am confused about why Microsoft deemed it necessary to differentiate whether the reply is coming from QnA Maker or from hitting an intent in LUIS. According to my understanding, having an intent in LUIS without replies coming from QnA is useless?

Secondly, I want to give the client the ability to maintain their KB and intents themselves. If they add a new intent or question, do I have to refresh the Dispatch file every time?


1 Answer


Part of the problem is that if you're using only one QnA KB, and not multiple, then you're following the wrong guide. The one you want to be following is this one:

Use QnA Maker to answer questions

If you add an additional KB, or add in a LUIS model, then you're going to want to add Dispatch. Otherwise, you're only making things more complicated for yourself by adding it in.
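To illustrate the single-KB path that guide describes, here is a minimal sketch of a QnA-only turn handler with no Dispatch involved, assuming the `Microsoft.Bot.Builder.AI.QnA` package; the endpoint values are placeholders you would pull from your own configuration:

```csharp
// Minimal QnA-only sketch (no Dispatch). KB id, key, and host are placeholders.
var qnaMaker = new QnAMaker(new QnAMakerEndpoint
{
    KnowledgeBaseId = "<your-kb-id>",
    EndpointKey = "<your-endpoint-key>",
    Host = "https://<your-resource>.azurewebsites.net/qnamaker",
});

// Send every incoming utterance straight to the KB.
var results = await qnaMaker.GetAnswersAsync(turnContext);
if (results.Any())
{
    await turnContext.SendActivityAsync(MessageFactory.Text(results[0].Answer), cancellationToken);
}
else
{
    await turnContext.SendActivityAsync(MessageFactory.Text("No QnA Maker answers were found."), cancellationToken);
}
```

With this shape, the client can edit and republish the KB in the QnA Maker portal and the bot picks up the changes with no retraining step on the bot side.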

To cover your questions: the NLP-with-Dispatch sample that your tutorial references only shows how to implement Dispatch. There's a section in the dispatch.cs dialog that shows a basic "here's what happens when the intent is returned":

private async Task DispatchToTopIntentAsync(ITurnContext<IMessageActivity> turnContext, string intent, RecognizerResult recognizerResult, CancellationToken cancellationToken)
{
    switch (intent)
    {
        case "l_HomeAutomation":
            await ProcessHomeAutomationAsync(turnContext, recognizerResult.Properties["luisResult"] as LuisResult, cancellationToken);
            break;
        case "l_Weather":
            await ProcessWeatherAsync(turnContext, recognizerResult.Properties["luisResult"] as LuisResult, cancellationToken);
            break;
        case "q_sample-qna":
            await ProcessSampleQnAAsync(turnContext, cancellationToken);
            break;
        default:
            _logger.LogInformation($"Dispatch unrecognized intent: {intent}.");
            await turnContext.SendActivityAsync(MessageFactory.Text($"Dispatch unrecognized intent: {intent}."), cancellationToken);
            break;
    }
}
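For context, the `q_sample-qna` branch in that sample just forwards the utterance to QnA Maker and replies with the top answer, roughly like this (`_botServices.SampleQnA` is that sample's pre-configured `QnAMaker` instance; names may differ in your project):

```csharp
// Sketch of the QnA branch from the NLP-with-Dispatch sample:
// forward the utterance to the KB and reply with the best-scoring answer.
private async Task ProcessSampleQnAAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    var results = await _botServices.SampleQnA.GetAnswersAsync(turnContext);
    if (results.Any())
    {
        await turnContext.SendActivityAsync(MessageFactory.Text(results.First().Answer), cancellationToken);
    }
    else
    {
        // Nothing above the confidence threshold came back from the KB.
        await turnContext.SendActivityAsync(MessageFactory.Text("Sorry, could not find an answer in the Q and A system."), cancellationToken);
    }
}
```

This is why Dispatch differentiates LUIS intents from QnA "intents": the LUIS branches typically start dialogs or run custom logic, while the QnA branches just fetch a stored answer.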

If you do decide to go with the Dispatch model for structuring your NLP, then it's up to you to begin dialogs in those cases. For example:

protected override async Task RouteAsync(DialogContext dc, CancellationToken cancellationToken = default(CancellationToken))
{
    // Get cognitive models for locale
    var locale = CultureInfo.CurrentUICulture.TwoLetterISOLanguageName;
    var cognitiveModels = _services.CognitiveModelSets[locale];

    // Check dispatch result
    var dispatchResult = await cognitiveModels.DispatchService.RecognizeAsync<DispatchLuis>(dc.Context, CancellationToken.None);
    var intent = dispatchResult.TopIntent().intent;

    // Identify whether the dispatch intent matches any Action within a Skill; if so, hand off to the appropriate SkillDialog
    var identifiedSkill = SkillRouter.IsSkill(_settings.Skills, intent.ToString());

    if (identifiedSkill != null)
    {
        // We have identified a skill, so initialize the skill connection with the target skill
        var result = await dc.BeginDialogAsync(identifiedSkill.Id);

        if (result.Status == DialogTurnStatus.Complete)
        {
            await CompleteAsync(dc);
        }
    }
    else if (intent == DispatchLuis.Intent.l_general)
    {
        // If dispatch result is general luis model
        cognitiveModels.LuisServices.TryGetValue("general", out var luisService);

        if (luisService == null)
        {
            throw new Exception("The general LUIS Model could not be found in your Bot Services configuration.");
        }
        else
        {
            var result = await luisService.RecognizeAsync<GeneralLuis>(dc.Context, CancellationToken.None);

            var generalIntent = result?.TopIntent().intent;

            // switch on general intents
            switch (generalIntent)
            {
                case GeneralLuis.Intent.Escalate:
                {
                    // start escalate dialog
                    await dc.BeginDialogAsync(nameof(EscalateDialog));
                    break;
                }

                case GeneralLuis.Intent.None:
                default:
                {
                    // No intent was identified, send confused message
                    await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
                    break;
                }
            }
        }
    }
    else if (intent == DispatchLuis.Intent.q_faq)
    {
        cognitiveModels.QnAServices.TryGetValue("faq", out var qnaService);

        if (qnaService == null)
        {
            throw new Exception("The specified QnA Maker Service could not be found in your Bot Services configuration.");
        }
        else
        {
            var answers = await qnaService.GetAnswersAsync(dc.Context, null, null);

            if (answers != null && answers.Count() > 0)
            {
                await dc.Context.SendActivityAsync(answers[0].Answer, speak: answers[0].Answer);
            }
            else
            {
                await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
            }
        }
    }
    else if (intent == DispatchLuis.Intent.q_chitchat)
    {
        cognitiveModels.QnAServices.TryGetValue("chitchat", out var qnaService);

        if (qnaService == null)
        {
            throw new Exception("The specified QnA Maker Service could not be found in your Bot Services configuration.");
        }
        else
        {
            var answers = await qnaService.GetAnswersAsync(dc.Context, null, null);

            if (answers != null && answers.Count() > 0)
            {
                await dc.Context.SendActivityAsync(answers[0].Answer, speak: answers[0].Answer);
            }
            else
            {
                await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
            }
        }
    }
    else
    {
        // If dispatch intent does not map to configured models, send "confused" response.
        // Alternatively as a form of backup you can try QnAMaker for anything not understood by dispatch.
        await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
    }
}

That is a much more robust example of how to actually use Dispatch, taken from the Virtual Assistant.

If your bot is only going to have one KB, I would avoid using Dispatch, because yes: every time you update the KB or intents, you have to refresh your Dispatch model (update, train, republish, test).
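If you do end up with Dispatch, that refresh step can at least be scripted so the client's KB or LUIS edits are picked up on demand. A rough sketch using the botdispatch CLI; the exact flags depend on your installed CLI version, so check `dispatch -h` before relying on this:

```shell
# One-time: install the Dispatch CLI.
npm install -g botdispatch

# After the client edits the QnA KB or LUIS intents, re-crawl the source
# models and retrain/republish the dispatch LUIS app from the folder
# containing your .dispatch file.
dispatch refresh
```

Running this on a schedule (or from a CI job) keeps the dispatch model in sync without the client ever touching it.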