fill requirement with nlp.source option
tiulimg opened this issue · 5 comments
Sometimes I wish to save the entire answer, for example for "why" questions, or even "name" questions.
I would like to have the option to fill a requirement with nlp.source.
Hi @tiulimg
You can do it in the actions (not in the requirements):
1/ In a first skill, create a variable with the value true (on the Actions tab)
2/ Then ask your "why" question
3/ Then redirect to a second skill (with "wait for user input")
4/ Set the memory field of your variable with {{nlp.source}} (see the sketch below)
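
For step 4, a minimal sketch of what the memory edit could look like, assuming an "Update conversation > Edit memory" action and a hypothetical memory field called "why_answer":

```json
{
  "why_answer": "{{nlp.source}}"
}
```

At runtime the platform should replace {{nlp.source}} with the raw text of the user's last message, so the full answer ends up in memory.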
Hope that helps!
Aurélie
No, that isn't a good solution. I know that I can do it in the actions, but it means I can save the nlp.source of only the last requirement in the skill. So if I have multiple "open" questions, I have to split my skills a lot: instead of 1 skill for 5-6 open questions I have to use 5-6 skills, and more than 5-6 open questions would mean even more skills.
Hi @tiulimg, I made an example bot to show how you can create open-question requirements with a single global skill: https://cai.tools.sap/dominik/open-questions/skills
- In your business skill, create a requirement with an entity that will never be recognized (a fake, never-present entity; I called it "mock-entity").
- In the "when missing" actions of this requirement, set two memory fields: "question_skill" with the slug of the skill you want to be executed after the user responds to the open question, and "answer_field" with the name of the memory field that should be populated with the user's response to the open question.
- Fork the "open-question" skill, and put this condition in your fallback skill:
if _memory.question_skill is-present : goto open-question
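
As a sketch, the two memory fields from the "when missing" actions could be set with an "Edit memory" payload like this (the skill slug and field name below are placeholders, not taken from the example bot):

```json
{
  "question_skill": "my-business-skill",
  "answer_field": "why_answer"
}
```

The forked open-question skill would then presumably store {{nlp.source}} into the field named in "answer_field" and redirect back to the skill referenced by "question_skill".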
I see. Maybe I can even save "{{nlp.source}}" in each "If #entity is complete" condition. Thanks!
Just wanted to update that I've tried using "{{nlp.source}}" in each "If #entity is complete" condition in the requirements, and for some reason it didn't work...