| Developed by | Guardrails AI |
| --- | --- |
| Date of development | Feb 15, 2024 |
| Validator type | Format |
| Blog | |
| License | Apache 2 |
| Input/Output | Output |
This validator checks whether LLM-generated text contains hallucinations. It retrieves the most relevant information from Wikipedia and uses another LLM to check whether the generated text is consistent with the retrieved information.
Dependencies:
- guardrails-ai>=0.4.0
- litellm
- chromadb
- wikipedia
- nltk
Note:
- Create a single Guard object per `topic_name` to avoid redundant Wikipedia and vector collections.
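A minimal sketch of this reuse pattern, assuming the validator is already installed (the install command follows) and reusing the same arguments as the usage example further down:

# Build the Guard (and its Wikipedia/vector collection) once per topic
from guardrails.hub import WikiProvenance
from guardrails import Guard

apple_guard = Guard().use(
    WikiProvenance,
    topic_name="Apple company",
    validation_method="sentence",
    llm_callable="gpt-3.5-turbo",
    on_fail="exception"
)

# Reuse the same Guard for every claim about this topic
for claim in [
    "Apple was founded by Steve Jobs in April 1976.",
    "Apple's first product was the Apple I computer."
]:
    apple_guard.validate(claim, metadata={"pass_on_invalid": True})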
$ guardrails hub install hub://guardrails/wiki_provenance

In this example, we use the `wiki_provenance` validator on any LLM-generated text.
# Import Guard and Validator
from guardrails.hub import WikiProvenance
from guardrails import Guard
# Use the Guard with the validator
guard = Guard().use(
    WikiProvenance,
    topic_name="Apple company",
    validation_method="sentence",
    llm_callable="gpt-3.5-turbo",
    on_fail="exception"
)
# Test passing response
guard.validate("Apple was founded by Steve Jobs in April 1976.", metadata={"pass_on_invalid": True}) # Pass
# Test failing response
try:
guard.validate("Ratan Tata founded Apple in September 1998 as a fruit selling company.") # Fail
except Exception as e:
print(e)Output:
Validation failed for field with errors: None of the following sentences in the response are supported by the provided context:
- Ratan Tata founded Apple in September 1998 as a fruit selling company.

`__init__(self, topic_name, validation_method='sentence', llm_callable='gpt-3.5-turbo', on_fail="noop")`

Initializes a new instance of the Validator class.
Parameters

- `topic_name` (str): The name of the topic to search for in Wikipedia.
- `validation_method` (str): The method to use for validating the input. Must be one of `sentence` or `full`. If `sentence`, the input is split into sentences and each sentence is validated separately. If `full`, the input is validated as a whole. Default is `sentence`.
- `llm_callable` (str): The name of the LiteLLM model string to use for validating the input. Default is `gpt-3.5-turbo`.
- `on_fail` (str, Callable): The policy to enact when a validator fails. If `str`, must be one of `reask`, `fix`, `filter`, `refrain`, `noop`, `exception` or `fix_reask`. Otherwise, must be a function that is called when the validator fails.
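As a hedged sketch (not part of the original example), these parameters can be combined differently, for instance validating the response as a whole and recording failures instead of raising. This assumes `guard.validate(...)` returns a ValidationOutcome with a `validation_passed` flag, as in recent guardrails releases:

from guardrails.hub import WikiProvenance
from guardrails import Guard

# Sketch: non-default configuration using only the parameters listed above
guard_full = Guard().use(
    WikiProvenance,
    topic_name="Apple company",      # required
    validation_method="full",        # validate the response as a whole
    llm_callable="gpt-3.5-turbo",
    on_fail="noop"                   # record the failure instead of raising
)

outcome = guard_full.validate("Apple was founded by Steve Jobs in April 1976.")
print(outcome.validation_passed)  # True if the text is supported by the Wikipedia context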
`__call__(self, value, metadata={}) -> ValidationResult`
Validates the given `value` using the rules defined in this validator, relying on the `metadata` provided to customize the validation process. This method is automatically invoked by `guard.parse(...)`, ensuring the validation logic is applied to the input data.
Note:

- This method should not be called directly by the user. Instead, invoke `guard.parse(...)`, which calls this method internally for each associated Validator.
- When invoking `guard.parse(...)`, ensure to pass the appropriate `metadata` dictionary that includes the keys and values required by this validator. If `guard` is associated with multiple validators, combine all necessary metadata into a single dictionary.

Parameters

- `value` (Any): The input value to validate.
- `metadata` (dict): A dictionary containing metadata required for validation. Keys and values must match the expectations of this validator.

| Key | Type | Description | Default | Required |
| --- | --- | --- | --- | --- |
| `pass_on_invalid` | Boolean | Whether to pass the validation if the LLM returns an invalid response | False | No |
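For example, the `pass_on_invalid` key is supplied through this `metadata` dictionary; a minimal sketch reusing the guard from the usage example above:

# Treat an invalid or unparsable judgment from the evaluation LLM as a pass
# rather than a failure (pass_on_invalid defaults to False)
guard.validate(
    "Apple was founded by Steve Jobs in April 1976.",
    metadata={"pass_on_invalid": True}
)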