This repository implements a generalized approach to Human-IQ, in which users can define their own Langchain pipeline with the flexibility to choose among multiple methods for selecting few-shot examples, including Determinantal Point Processes (DPP) and sentence similarity. The goal of this project is to provide a customizable and extensible framework for building Langchain-based applications.
The project consists of the following files:

- `base_agent.py`: This file contains the abstract base class `BaseAgent` that defines the interface for the agent. It includes abstract methods for planning and tool-run logging (a sketch of such an interface follows this list).
- `custom_agent.py`: This file contains the `CustomAgent` class, which is a concrete implementation of `BaseAgent`. It defines the behavior of the agent, including input handling, planning, and execution.
- `output_parsers.py`: This file contains the abstract base class `BaseOutputParser` and its concrete implementation `CustomOutputParser`. These classes are responsible for parsing the output of the language model and extracting relevant information.
- `prompt_templates.py`: This file contains the `CustomPromptTemplate` class, which is used to generate prompts for the language model based on the provided template and tools.
- `template_construction.py`: This file contains the abstract base class `BaseTemplateConstruction` and its concrete implementation `TemplateConstruction`. These classes are responsible for constructing the prompt templates used by the agent.
- `langchain_impl.py`: This file contains the abstract base class `BaseLangchainImpl` and its concrete implementation `LangchainImpl`. These classes provide the high-level interface for using the Langchain library and executing the agent.
- `tool_provider.py`: This file contains the abstract base class `BaseToolProvider` and its concrete implementation `CustomToolProvider`. These classes allow users to define their own tools and provide them to the agent.
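The abstract/concrete split above is the framework's main extension point. As a rough illustration only, an agent contract with planning and tool-run-logging hooks might be declared like the sketch below; the method names and signatures are assumptions for illustration, not the actual interface in `base_agent.py`.

```python
# Illustrative sketch only: method names and signatures are assumptions based on
# the description of base_agent.py above, not the repository's actual interface.
from abc import ABC, abstractmethod


class AgentInterfaceSketch(ABC):
    """Hypothetical outline of an agent with planning and tool-run logging."""

    @abstractmethod
    def plan(self, question, intermediate_steps, **kwargs):
        """Decide the next tool call (or the final answer) from the steps taken so far."""
        ...

    @abstractmethod
    def log_tool_run(self, tool_name, tool_input, observation):
        """Record a tool invocation and the observation it produced."""
        ...
```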
To use the Human-IQ framework, follow these steps:

1. Clone the repository:

   ```bash
   git clone https://github.com/NIMI-research/Human-IQ.git
   ```

2. Install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Define your custom tools by creating instances of `Tool` and providing them to the `CustomToolProvider`:

   ```python
   from langchain.tools import Tool
   from tool_provider import CustomToolProvider

   def wikipedia_search(query):
       # Custom Wikipedia search implementation
       pass

   def get_wikidata_id(string):
       # Custom implementation
       pass

   custom_tools = [
       Tool(
           name="wikipedia_search",
           func=wikipedia_search,
           description="Useful for performing custom searches over Wikipedia.",
       ),
       Tool(
           name="get_wikidata_id",
           func=get_wikidata_id,
           description="Useful for retrieving a Wikidata entity ID given its label.",
       ),
   ]

   tool_provider = CustomToolProvider(custom_tools)
   ```

4. Create an instance of `LangchainImpl` with the desired parameters and execute the pipeline (a batch-processing sketch follows these steps):

   ```python
   from langchain_impl import LangchainImpl

   dataset = "mintaka"
   model_name = "gpt-4"
   dynamic = True
   DPP = False

   langchain_impl = LangchainImpl(dataset, model_name, tool_provider, dynamic, DPP)

   question = "Who was Kentucky's first governor?"
   out, answer_template_for_inference, completion_tokens = langchain_impl.execute_agent(question)
   ```
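Once the pipeline instance exists, the same call can be reused across a batch of questions. The loop below is a minimal sketch that assumes only the `execute_agent` signature shown in step 4; the question list and printed fields are illustrative.

```python
# Minimal batch sketch: relies only on the execute_agent call shown above.
questions = [
    "Who was Kentucky's first governor?",
    "Which country has the largest population?",  # illustrative second question
]

for q in questions:
    out, answer_template, tokens = langchain_impl.execute_agent(q)
    print(f"Q: {q}")
    print(f"A: {out} (completion tokens used: {tokens})")
```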
The Human-IQ framework provides several options for customization:

- Custom Tools: Define your own tools by creating instances of `Tool` and providing them to the `CustomToolProvider`. This allows you to extend the functionality of the Langchain pipeline based on your specific use case.
- Few-Shot Examples: Choose from multiple methods of few-shot example selection, including Determinantal Point Processes (DPP) and sentence similarity. Modify the `TemplateConstruction` class in `template_construction.py` to implement your desired method (a sentence-similarity sketch follows this list).
- Custom Agents: Implement your own custom agents by extending the `BaseAgent` class in `base_agent.py` and providing your implementation in a new file.
- Custom Output Parsers: Implement your own output parsers by extending the `BaseOutputParser` class in `output_parsers.py` and providing your implementation in a new file.
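As a rough illustration of the sentence-similarity option, the sketch below scores a pool of candidate few-shot examples against the incoming question with a sentence-transformer model and keeps the top-k. The model name, the helper function, and the plain-string example pool are assumptions for illustration; they are not the repository's actual logic in `TemplateConstruction`.

```python
# Illustrative sentence-similarity selection using the sentence-transformers
# package (an external dependency, not necessarily the one this repo uses).
import numpy as np
from sentence_transformers import SentenceTransformer


def select_similar_examples(question, example_pool, k=4):
    """Return the k examples from example_pool most similar to the question."""
    model = SentenceTransformer("all-MiniLM-L6-v2")
    # Normalized embeddings make the dot product equal to cosine similarity.
    question_emb = model.encode([question], normalize_embeddings=True)[0]
    pool_emb = model.encode(example_pool, normalize_embeddings=True)
    scores = pool_emb @ question_emb
    top_k = np.argsort(scores)[::-1][:k]
    return [example_pool[i] for i in top_k]


# Usage: pick the single most similar example for a question.
examples = select_similar_examples(
    "Who was Kentucky's first governor?",
    [
        "Who was the first president of the United States?",
        "What is the boiling point of water?",
    ],
    k=1,
)
```

A DPP-based selector, by contrast, balances relevance with diversity across the chosen examples rather than ranking purely by similarity, which is the trade-off behind offering both methods.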