LLMChain
This is an experimental port of langchain (currently v0.0.139) to Android/JVM/Kotlin Multiplatform. Please note that this project is in the proof-of-concept stage and is subject to change.
Maven repository
Only the SNAPSHOT version is published.
Add to repositories
repositories {
mavenCentral()
maven { url "https://s01.oss.sonatype.org/content/repositories/snapshots/" }
}
Add to dependencies
- Android/JVM developers are advised to follow the dependency directions in the android branch README.
- Kotlin Multiplatform developers should add the following:
dependencies {
implementation("io.github.wangmuy.llmchain.kmp:core:0.0.1-SNAPSHOT") { changing = true }
implementation("io.github.wangmuy.llmchain.kmp:serviceprovider-openai:0.0.1-SNAPSHOT") { changing = true }
}
Be sure to check out the support matrix.
Multiplatform support matrix
package | platform | isCompiled | isTested |
---|---|---|---|
core | jvm | true | true |
core | ios | false | false |
core | native | true | true |
core | js | true | true |
core | wasm | true | false |
serviceprovider-openai | jvm | true | true |
serviceprovider-openai | ios | false | false |
serviceprovider-openai | native | true | true |
serviceprovider-openai | js | true | false |
serviceprovider-openai | wasm | false | false |
Quickstart
Here's an almost one-to-one translation of the langchain Quickstart Guide in Quickstart.kt, covering all the modules/components.
LLMs
val llm = OpenAIChat(APIKEY, proxy = PROXY)
llm.invocationParams[OpenAIChat.REQ_TEMPERATURE] = 0.9
val text = "What would be a good company name for a company that makes colorful socks?"
val output = llm.invoke(text, null)
println("output=\n$output")
Prompt templates
val prompt = PromptTemplate(
inputVariables = listOf("product"),
template = "What is a good name for a company that makes {product}?")
val formatted = prompt.format(mapOf("product" to "colorful socks"))
assertEquals("What is a good name for a company that makes colorful socks?", formatted)
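Conceptually, `PromptTemplate.format` is plain placeholder substitution: each `{variable}` in the template is replaced by its mapped value. A minimal stand-alone sketch of the idea (not the library's implementation):

```kotlin
// Conceptual sketch of template formatting (illustrative, not the library code):
// replaces each {variable} placeholder with its value from the map.
fun formatTemplate(template: String, values: Map<String, String>): String =
    values.entries.fold(template) { acc, (key, value) ->
        acc.replace("{$key}", value)
    }

fun main() {
    val out = formatTemplate(
        "What is a good name for a company that makes {product}?",
        mapOf("product" to "colorful socks")
    )
    println(out) // What is a good name for a company that makes colorful socks?
}
```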
Chains
val llm = OpenAIChat(APIKEY)
llm.invocationParams[OpenAIChat.REQ_TEMPERATURE] = 0.9
val prompt = PromptTemplate(
inputVariables = listOf("product"),
template = "What is a good name for a company that makes {product}?")
val chain = LLMChain(llm = llm, prompt = prompt)
val output = chain.run(mapOf("product" to "colorful socks"))
println("output=\n$output")
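An LLMChain conceptually just composes prompt formatting with an LLM call. A self-contained sketch of that composition, with a fake LLM lambda standing in for `OpenAIChat`:

```kotlin
// Conceptual sketch of a chain (illustrative, not the library code):
// format the prompt from the inputs, then pass it to the LLM.
fun runChain(
    template: String,
    inputs: Map<String, String>,
    llm: (String) -> String
): String {
    val prompt = inputs.entries.fold(template) { acc, (k, v) -> acc.replace("{$k}", v) }
    return llm(prompt)
}

fun main() {
    // A fake LLM that ignores its prompt; a real chain would call the model here.
    val fakeLlm: (String) -> String = { "Rainbow Socks Co." }
    val output = runChain(
        "What is a good name for a company that makes {product}?",
        mapOf("product" to "colorful socks"),
        fakeLlm
    )
    println("output=\n$output")
}
```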
Agents
val llm = OpenAIChat(APIKEY)
llm.invocationParams[OpenAIChat.REQ_TEMPERATURE] = 0.0
val fakeSerpApiTool = Tool(
name = "Search",
description = "A search engine. Useful for when you need to answer questions about current events. Input should be a search query.",
func = {_, _ -> "San Francisco Temperature Yesterday. Maximum temperature yesterday: 57 °F (at 1:56 pm) Minimum temperature yesterday: 49 °F (at 1:56 am)"}
)
val llmMathTool = LLMMathChain.asTool(llm)
val agentExecutor = Factory.initializeAgent(listOf(fakeSerpApiTool, llmMathTool), llm,
Factory.AGENT_TYPE_ZERO_SHOT_REACT_DESCRIPTION)
agentExecutor.maxIterations = 4
val output = agentExecutor.run(mapOf("input" to "What was the high temperature in SF yesterday in Fahrenheit? What is that number raised to the .023 power?"))
println("output=\n$output")
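The zero-shot ReAct agent behind `initializeAgent` alternates LLM calls and tool calls until the model emits a final answer or `maxIterations` is reached. A heavily simplified, self-contained sketch of that loop (the `Action:`/`Final Answer:` parsing and the scripted LLM below are illustrative, not the library's actual prompt format):

```kotlin
// Conceptual sketch of a ReAct agent loop (illustrative, not the library code):
// each round, ask the LLM for the next step; if it names a tool, run it and
// append the observation to the scratchpad; stop on a final answer.
fun runAgent(
    question: String,
    tools: Map<String, (String) -> String>,
    llm: (String) -> String,
    maxIterations: Int = 4
): String {
    var scratchpad = "Question: $question"
    repeat(maxIterations) {
        val step = llm(scratchpad)
        if ("Final Answer:" in step) return step.substringAfter("Final Answer:").trim()
        // Expect "Action: <tool>: <input>" (simplified parse).
        val action = step.substringAfter("Action:").trim()
        val toolName = action.substringBefore(":").trim()
        val toolInput = action.substringAfter(":").trim()
        val observation = tools[toolName]?.invoke(toolInput) ?: "unknown tool"
        scratchpad += "\n$step\nObservation: $observation"
    }
    return "stopped after $maxIterations iterations"
}

fun main() {
    val tools = mapOf("Search" to { _: String -> "Maximum temperature yesterday: 57 °F" })
    // A scripted LLM: first asks for a search, then answers.
    var turn = 0
    val scriptedLlm: (String) -> String = {
        if (turn++ == 0) "Action: Search: SF temperature yesterday"
        else "Final Answer: 57 °F"
    }
    println(runAgent("What was the high temperature in SF yesterday?", tools, scriptedLlm))
}
```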
Memory
val logCallbackHandler = object: DefaultCallbackHandler() {
override fun onText(text: String, verbose: Boolean) {
println(text)
}
}
val callbackManager = CallbackManager(mutableListOf(logCallbackHandler))
val llm = OpenAIChat(APIKEY).apply {
invocationParams[OpenAIChat.REQ_MAX_TOKENS] = 50
}
llm.callbackManager = callbackManager
val conversation = ConversationChain(llm, verbose = true, callbackManager = callbackManager)
var output: Map<String, Any> = emptyMap()
var outputStr: String = ""
output = conversation.invoke(mapOf("input" to "Hi there!"))
outputStr = output[conversation.outputKey]!!.toString()
println("output=\n$outputStr")
output = conversation.invoke(mapOf("input" to "I'm doing well! Just having a conversation with an AI."))
outputStr = output[conversation.outputKey]!!.toString()
println("output=\n$outputStr")
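ConversationChain works because the memory prepends earlier turns to each new prompt, so the model sees the history. A simplified, self-contained sketch of that buffer idea (illustrative, not the library's ChatMemory implementation):

```kotlin
// Conceptual sketch of conversation buffer memory (illustrative, not the
// library code): each turn's human input and AI output are appended to a
// buffer that is prepended to the next prompt.
class BufferMemory {
    private val turns = mutableListOf<String>()
    fun load(): String = turns.joinToString("\n")
    fun save(input: String, output: String) {
        turns += "Human: $input"
        turns += "AI: $output"
    }
}

fun main() {
    val memory = BufferMemory()
    // A fake LLM that just echoes the last prompt line.
    val llm: (String) -> String = { prompt -> "I heard: ${prompt.lines().last()}" }
    // First turn: no history yet.
    val firstPrompt = (memory.load() + "\nHuman: Hi there!").trim()
    memory.save("Hi there!", llm(firstPrompt))
    // Second turn: the prompt now carries the first exchange.
    val secondPrompt = memory.load() + "\nHuman: I'm doing well!"
    println(secondPrompt)
}
```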
Implementations
- Schema
  - nearly all basic schema interfaces
- LLM
  - BaseLLM
  - LLM
- Prompt template
  - BasePromptTemplate
  - StringPromptTemplate
  - StringPromptValue
  - PromptTemplate
- Chain
  - Chain
  - LLMChain
  - LLMMathChain: currently uses the LLM, not an actual calculator
  - ConversationChain
  - RouterChain/MultiRouteChain/LLMRouterChain/MultiPromptChain. Note: only available on jvm/native due to the limited implementation of RegexOption.DOT_MATCHES_ALL
  - SequentialChain/SimpleSequentialChain
- Agent
  - Agent
  - AgentExecutor
  - ZeroShotAgent
- Tools
  - BaseTool
  - Tool
  - InvalidTool
- Memory
  - SimpleMemory
  - ChatMemory
- DocStore
  - DocStore
  - InMemoryDocStore
- VectorStore
  - VectorStore
  - SimpleVectorStore
  - VectorStoreRetriever
- Embedding
  - Embeddings
- LLM service provider
  - OpenAI
  - OpenAIChat
  - OpenAIEmbedding
  - Function Calling
    - OpenAI
License
Copyright 2023 wangmuy
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.