LLM does not do anything. #213
Unanswered
Tastaturpilot asked this question in Q&A
First of all, thank you for your work! It's an amazing idea; sadly, it does not work correctly for me.
I use the oobabooga text-generation-webui and have tried various models: your models, Llama 3, Llama 3.1, Phi-3, and so on. I tried to set up the integration via Generic OpenAI Compatible..., via text-generation-webui, and I even tried llama.cpp. No success. I use output.gbnf, and I even installed a character with the standard system prompt, still no success. 🙁 Here is the problem:
I have set up a helper, a toggle switch (an input_boolean). I wanted the LLM to switch it on or off... it says it has switched the switch, but it doesn't. Below is a minimal YAML sketch of the helper, followed by what the action box prints out.
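(I created the helper through the UI; this is just the equivalent configuration.yaml definition, so you can see that the entity it creates matches the logs below.)

```yaml
# configuration.yaml — equivalent of my UI-created helper switch
# creates the entity input_boolean.s1
input_boolean:
  s1:
    name: S1
```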
With Home Assistant:
```yaml
action: conversation.process
data:
  text: Schalte S1 an.
  language: DE
  conversation_id: my_conversation_1
  agent_id: conversation.home_assistant

response:
  speech:
    plain:
      speech: s1 eingeschaltet
      extra_data: null
  card: {}
  language: DE
  response_type: action_done
  data:
    targets: []
    success:
      - name: S1
        type: entity
        id: input_boolean.s1
    failed: []
conversation_id: null
```
The switch really does get switched; all good.
Now with the LLM:
```yaml
action: conversation.process
data:
  text: Schalte S1 aus.
  language: DE
  conversation_id: my_conversation_1
  agent_id: conversation.llm_model_gpt_3_5_turbo_remote

response:
  speech:
    plain:
      speech: |-
        Ich habe die Aktion "HassTurnOff" ausgeführt, um S1 auszuschalten.
  card: {}
  language: DE
  response_type: action_done
  data:
    targets: []
    success: []
    failed: []
conversation_id: 01J6HF6PN38WC4CA3KX9W8JC8V
```
The switch is still ON!
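As a sanity check, the helper itself reacts fine when the action is called directly from Developer Tools → Actions (just like it does with the built-in agent), so it is only the LLM agent that never executes anything — which matches the empty success list in its response:

```yaml
# Toggles the helper directly, bypassing the conversation agent
action: input_boolean.turn_off
target:
  entity_id: input_boolean.s1
```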
I have tried every way I could find to install and set up the integration, with no success 😕
Please help!
Edit: I suppose it has something to do with the "HassTurnOn" intent that the LLM is trying to use; it should be using "input_boolean.turn_on" instead....
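For comparison — and this is just my own reading of the two logs, not something from the docs — if "HassTurnOff" had actually been executed, I would expect the LLM agent's response to list the entity under success, exactly like the built-in agent's response above:

```yaml
# What I would expect in the LLM agent's response data if the action had really run
# (copied from the built-in agent's response above)
data:
  targets: []
  success:
    - name: S1
      type: entity
      id: input_boolean.s1
  failed: []
```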