
What is the suggested solution for handling the mapping of parameters extracted from an intent's training phrases to a user-defined value in a database specific to that user?

A practical example, I think, would be a shopping list app.

Through a web UI, the user adds "catchup" to a shopping list, which is stored in the database as an item.

Then, through the agent (e.g. Google Assistant), an utterance results in "ketchup" being extracted as the item parameter. I wouldn't have a way to map the extracted parameter from the utterance to the user-defined value in the database.

So, just to be clear:

// in the database added by the user from a web UI
"catchup"

// extracted from voice utterance
"ketchup"

How should I make sure that the extracted parameters can be matched up with the free-form values users have added to the list?
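For context, here is one stopgap I've considered entirely outside of Dialogflow: fuzzy string matching between the transcribed word and the stored values, using only the Python standard library. This is just a sketch (the list contents and function name are made up for illustration):

```python
# Sketch: bridging a spoken transcription ("ketchup") to the user's
# stored spelling ("catchup") with stdlib fuzzy matching.
import difflib

# Items stored by the user via the web UI (hypothetical data)
shopping_list = ["catchup", "milk", "bread"]

def match_item(spoken, items, cutoff=0.6):
    """Return the stored item closest to the spoken word, or None."""
    matches = difflib.get_close_matches(spoken, items, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(match_item("ketchup", shopping_list))  # -> catchup
```

This only helps after the agent has already extracted *something* as the parameter, though, so it doesn't address recognition quality itself.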

Also, I am inexperienced in this area and have looked through the docs quite a bit, so I may just be missing this. I wasn't sure whether Developer Entities or Session Entities were the solution for this.


1 Answer


Either Developer Entities or Session Entities may be useful here. It depends.

  • If you can enumerate all the possible things a user can say, and possibly create aliases for some of them, then you should use a Developer Entity. This is easiest and works best: the ML system has a better chance of matching words when they are pre-defined as part of the training model.

  • If you can't do that, and it's OK to only match things they have already added to the database, then a Session Entity will work well. This is best for values you already know about the user, or which may change dramatically based on context.
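As a sketch of the Session Entity approach: after the user saves their list, your webhook or backend can push those values into the session, with the stored spelling as the canonical `value` and likely transcriptions as `synonyms`. The field names below follow the Dialogflow ES v2 REST API; the project, session, and entity-type names are hypothetical:

```
POST https://dialogflow.googleapis.com/v2/projects/my-project/agent/sessions/my-session/entityTypes

{
  "name": "projects/my-project/agent/sessions/my-session/entityTypes/item",
  "entityOverrideMode": "ENTITY_OVERRIDE_MODE_OVERRIDE",
  "entities": [
    { "value": "catchup", "synonyms": ["catchup", "ketchup"] }
  ]
}
```

With this in place, an utterance containing "ketchup" should resolve the `item` parameter to `"catchup"`, which matches what is in the database.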

You may even wish to offer a combination: define as many entities as you can (to catch the most common replies), allow free-form replies, and incorporate those free-form replies as Session Entities.
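For the combined approach, one way to assemble the session-entity list is from the user's current items plus whatever aliases you have pre-defined. A minimal Python sketch (the `KNOWN_SYNONYMS` table and function name are made up for illustration; the output shape matches the `entities` field of a SessionEntityType request):

```python
# Sketch: build a session-entity payload from a user's shopping list,
# merging in any pre-defined synonyms. All names here are hypothetical.

KNOWN_SYNONYMS = {
    "catchup": ["ketchup"],  # alias the speech recognizer is likely to produce
}

def build_session_entities(items):
    """Return the `entities` list for a SessionEntityType request body."""
    entities = []
    for item in items:
        synonyms = [item] + KNOWN_SYNONYMS.get(item, [])
        entities.append({"value": item, "synonyms": synonyms})
    return entities

payload = build_session_entities(["catchup", "milk"])
print(payload[0])  # {'value': 'catchup', 'synonyms': ['catchup', 'ketchup']}
```

Free-form items the user adds later simply become new entries with themselves as the only synonym, so they are still matchable next session.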