Part of the problem is that if you're using only one QnA knowledge base (KB), and not multiple, then you're following the wrong guide. The one you want to be following is this one:
Use QnA Maker to answer questions
If you add an additional KB, or add a LUIS model, then you're going to want to add Dispatch. Otherwise, you're only making things more complicated for yourself by adding it in.
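For contrast, with a single KB the bot can call QnA Maker directly and no dispatch layer is needed. A minimal sketch of that pattern (the class name and the configuration keys `QnAKnowledgebaseId`, `QnAEndpointKey`, and `QnAEndpointHostName` are placeholders; substitute your own settings):

```csharp
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.AI.QnA;
using Microsoft.Bot.Schema;
using Microsoft.Extensions.Configuration;

// Minimal single-KB bot: queries QnA Maker directly, no Dispatch layer.
public class SingleKbBot : ActivityHandler
{
    private readonly QnAMaker _qnaMaker;

    public SingleKbBot(IConfiguration configuration)
    {
        // Endpoint values come from configuration; the key names here are
        // illustrative placeholders, not taken from your project.
        _qnaMaker = new QnAMaker(new QnAMakerEndpoint
        {
            KnowledgeBaseId = configuration["QnAKnowledgebaseId"],
            EndpointKey = configuration["QnAEndpointKey"],
            Host = configuration["QnAEndpointHostName"],
        });
    }

    protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
    {
        // Send the user's utterance straight to the single KB.
        var results = await _qnaMaker.GetAnswersAsync(turnContext);
        if (results.Any())
        {
            await turnContext.SendActivityAsync(MessageFactory.Text(results.First().Answer), cancellationToken);
        }
        else
        {
            await turnContext.SendActivityAsync(MessageFactory.Text("No QnA Maker answers were found."), cancellationToken);
        }
    }
}
```

This is the shape the "Use QnA Maker to answer questions" guide walks through; updating the KB then only requires republishing the KB itself.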
To cover your questions: the NLP-with-Dispatch sample that the tutorial you referenced covers only shows how to implement Dispatch. There's a section in the dispatch.cs dialog that shows a basic "here's what happens when the intent is returned":
private async Task DispatchToTopIntentAsync(ITurnContext<IMessageActivity> turnContext, string intent, RecognizerResult recognizerResult, CancellationToken cancellationToken)
{
    switch (intent)
    {
        case "l_HomeAutomation":
            await ProcessHomeAutomationAsync(turnContext, recognizerResult.Properties["luisResult"] as LuisResult, cancellationToken);
            break;
        case "l_Weather":
            await ProcessWeatherAsync(turnContext, recognizerResult.Properties["luisResult"] as LuisResult, cancellationToken);
            break;
        case "q_sample-qna":
            await ProcessSampleQnAAsync(turnContext, cancellationToken);
            break;
        default:
            _logger.LogInformation($"Dispatch unrecognized intent: {intent}.");
            await turnContext.SendActivityAsync(MessageFactory.Text($"Dispatch unrecognized intent: {intent}."), cancellationToken);
            break;
    }
}
If you do decide to go with the dispatch model for structuring your NLP, then it would be up to you to begin dialogs in those cases. For example:
protected override async Task RouteAsync(DialogContext dc, CancellationToken cancellationToken = default(CancellationToken))
{
    // Get cognitive models for the current locale
    var locale = CultureInfo.CurrentUICulture.TwoLetterISOLanguageName;
    var cognitiveModels = _services.CognitiveModelSets[locale];

    // Check the dispatch result
    var dispatchResult = await cognitiveModels.DispatchService.RecognizeAsync<DispatchLuis>(dc.Context, CancellationToken.None);
    var intent = dispatchResult.TopIntent().intent;

    // Identify whether the dispatch intent matches any Action within a Skill;
    // if so, pass to the appropriate SkillDialog to hand off
    var identifiedSkill = SkillRouter.IsSkill(_settings.Skills, intent.ToString());
    if (identifiedSkill != null)
    {
        // We have identified a skill, so initialize the skill connection with the target skill
        var result = await dc.BeginDialogAsync(identifiedSkill.Id);
        if (result.Status == DialogTurnStatus.Complete)
        {
            await CompleteAsync(dc);
        }
    }
    else if (intent == DispatchLuis.Intent.l_general)
    {
        // The dispatch result is the general LUIS model
        cognitiveModels.LuisServices.TryGetValue("general", out var luisService);
        if (luisService == null)
        {
            throw new Exception("The general LUIS Model could not be found in your Bot Services configuration.");
        }
        else
        {
            var result = await luisService.RecognizeAsync<GeneralLuis>(dc.Context, CancellationToken.None);
            var generalIntent = result?.TopIntent().intent;

            // Switch on general intents
            switch (generalIntent)
            {
                case GeneralLuis.Intent.Escalate:
                {
                    // Start the escalate dialog
                    await dc.BeginDialogAsync(nameof(EscalateDialog));
                    break;
                }

                case GeneralLuis.Intent.None:
                default:
                {
                    // No intent was identified, send a confused message
                    await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
                    break;
                }
            }
        }
    }
    else if (intent == DispatchLuis.Intent.q_faq)
    {
        cognitiveModels.QnAServices.TryGetValue("faq", out var qnaService);
        if (qnaService == null)
        {
            throw new Exception("The specified QnA Maker Service could not be found in your Bot Services configuration.");
        }
        else
        {
            var answers = await qnaService.GetAnswersAsync(dc.Context, null, null);
            if (answers != null && answers.Count() > 0)
            {
                await dc.Context.SendActivityAsync(answers[0].Answer, speak: answers[0].Answer);
            }
            else
            {
                await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
            }
        }
    }
    else if (intent == DispatchLuis.Intent.q_chitchat)
    {
        cognitiveModels.QnAServices.TryGetValue("chitchat", out var qnaService);
        if (qnaService == null)
        {
            throw new Exception("The specified QnA Maker Service could not be found in your Bot Services configuration.");
        }
        else
        {
            var answers = await qnaService.GetAnswersAsync(dc.Context, null, null);
            if (answers != null && answers.Count() > 0)
            {
                await dc.Context.SendActivityAsync(answers[0].Answer, speak: answers[0].Answer);
            }
            else
            {
                await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
            }
        }
    }
    else
    {
        // If the dispatch intent does not map to a configured model, send the "confused" response.
        // Alternatively, as a fallback, you can try QnA Maker for anything not understood by dispatch.
        await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
    }
}
That is a much more robust example of how to actually utilize Dispatch, taken from the Virtual Assistant.
If your bot is only going to have one KB, I would avoid using Dispatch, because yes, every time you update the KB, you would have to refresh your Dispatch model (update, train, republish, test).
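For reference, that refresh loop is driven by the Dispatch CLI (the `botdispatch` npm package). A rough sketch of the workflow, assuming you already have a `.dispatch` file from an earlier `dispatch create` (check `dispatch refresh -h` for the exact options your version supports):

```shell
# One-time setup: install the Dispatch CLI
npm install -g botdispatch

# After updating a KB or LUIS model, re-run refresh from the folder
# containing your .dispatch file to regenerate and retrain the
# dispatch LUIS app; then republish and retest as usual.
dispatch refresh
```

Having to repeat this for every KB edit is exactly the overhead you avoid by skipping Dispatch in a single-KB bot.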