Add model name to llm output
All checks were successful
Build and push / changes (push) Successful in 39s
Build and push / Lint-Python (push) Successful in 2s
Build and push / Build-and-Push-Docker (push) Successful in 1m17s
Build and push / post-status-to-discord (push) Successful in 1s
Build and push / sync-argocd-app (push) Successful in 2s
This commit is contained in:
parent 1ce3452f8a
commit 55ab511784
@@ -174,7 +174,10 @@ async def send_to_llm(ctx, message):
         response = requests.post(url, json=payload, headers=headers)
         answer = response.json()["message"]["content"]

-        return remove_between(answer, "<think>", "</think>")
+        return (
+            remove_between(answer, "<think>", "</think>")
+            + f"\n-# Generated using {llm_rules.get('model')}"
+        )
     except Exception as e:
         print(e)
         return "Somethings wrong, maybe the LLM crashed"
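The `remove_between` helper called in the diff is not shown in this commit. A minimal sketch of what it likely does, assuming it strips a model's `<think>...</think>` reasoning block (markers inclusive) before the model-name footer is appended:

```python
import re

def remove_between(text: str, start: str, end: str) -> str:
    # Remove everything from the start marker to the end marker,
    # inclusive, e.g. an LLM's <think>...</think> reasoning block.
    pattern = re.escape(start) + r".*?" + re.escape(end)
    return re.sub(pattern, "", text, flags=re.DOTALL).strip()
```

With this sketch, `remove_between("Hi<think>reasoning</think> there", "<think>", "</think>")` yields `"Hi there"`, to which the commit then appends `f"\n-# Generated using {llm_rules.get('model')}"` (Discord renders `-#` as small subtext).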