When using your own logs in a RAG scenario, the guardrails stop the LLM from answering any questions, citing privacy. Maybe the guardrails are not needed when the information comes from the prompt itself rather than from the training data.
Perhaps the implementation makes this hard to differentiate?
This works in llama3 but not in llama3.1.
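The distinction suggested above, between content the model restates from user-supplied prompt context and content drawn from its training data, could be sketched roughly like this. All names here are hypothetical (`apply_privacy_guardrail`, the `source` tag, the keyword check); this is not the actual guardrail implementation, just an illustration of the proposed rule:

```python
def apply_privacy_guardrail(text: str, source: str) -> bool:
    """Return True if the text should be blocked.

    Hypothetical rule: only flag private-looking content when it does
    NOT come from user-supplied RAG context, since the user already
    owns that data and merely wants it summarized back.
    """
    looks_private = "password" in text.lower() or "ssn" in text.lower()
    return looks_private and source != "rag_context"


def build_prompt(question: str, context: str) -> str:
    # User-supplied log lines are injected into the prompt itself,
    # so the model only restates data the user already has.
    return f"Context (user logs):\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    logs = "2024-05-01 login failed for user alice (bad password)"
    prompt = build_prompt("Why did alice's login fail?", logs)
    # Same text, different source: allowed as RAG context,
    # blocked if treated as model-originated output.
    print(apply_privacy_guardrail(logs, source="rag_context"))
    print(apply_privacy_guardrail(logs, source="model_output"))
```

Under this hypothetical rule, the same log line passes when tagged as prompt-supplied context and is blocked otherwise, which is the differentiation the issue asks about.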