# LLMNode

Represents a node in a graph that uses a Large Language Model (LLM).
YAML declaration:

```yaml
kind: LLMNode
name: StartNode
prompts:
  system:
    en: "You are a helpful assistant."
    ru: "Вы помощник."
tools:
  - WeatherTool
  - EmailTool
```

## Language
Prompts can be defined in multiple languages. The `fallback_lang` is used when a prompt is not available in the requested language. The order of the language keys in `prompts` does not matter.
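A minimal sketch of this default/fallback selection behavior (the helper name is hypothetical, for illustration only; the library's internal logic may differ):

```python
def select_prompt(prompts: dict[str, str], lang: str, fallback_lang: str = "en") -> str:
    """Pick the prompt text for `lang`, falling back to `fallback_lang`.

    `prompts` maps language codes to prompt text, e.g. the `system` entry.
    Hypothetical helper, not part of the library's public API.
    """
    if lang in prompts:
        return prompts[lang]
    if fallback_lang in prompts:
        return prompts[fallback_lang]
    raise KeyError(f"No prompt for {lang!r} or fallback {fallback_lang!r}")


system = {"en": "You are a helpful assistant.", "ru": "Вы помощник."}
select_prompt(system, "ru")  # → "Вы помощник."
select_prompt(system, "de")  # no "de" prompt, falls back → "You are a helpful assistant."
```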
So:

```yaml
prompts:
  system:
    en: "You are a helpful assistant."
    ru: "Вы помощник."
```

is equivalent to:

```yaml
prompts:
  en:
    system: "You are a helpful assistant."
  ru:
    system: "Вы помощник."
```

## Usage
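The `declaration` form shown below takes an already-parsed YAML mapping. A sketch of the shape `yaml_dict` is assumed to have, mirroring the declaration above (the exact schema is defined by `LLMNodeSpec`):

```python
# Assumed shape of a parsed LLMNode declaration; mirrors the YAML example above.
yaml_dict = {
    "kind": "LLMNode",
    "name": "StartNode",
    "prompts": {
        "system": {
            "en": "You are a helpful assistant.",
            "ru": "Вы помощник.",
        },
    },
    "tools": ["WeatherTool", "EmailTool"],
}
# node = LLMNode(declaration=yaml_dict)  # requires the liman runtime
```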
```python
LLMNode(declaration=yaml_dict)
```

or

```python
LLMNode(yaml_path="llm_node.yaml")
```

## Attributes

- `__slots__` = `BaseNode.__slots__ + ('prompts', 'registry')`
- `spec_type` = `LLMNodeSpec`
- `state_type` = `LLMNodeState`
- `registry` = `registry`

## Functions
### `__init__`

```python
__init__(self, /, spec, registry, *, initial_data=None, yaml_path=None, strict=False, default_lang='en', fallback_lang='en') -> None
```

Initialize LLM node with specification and registry.

**Args:**
- `spec` (`LLMNodeSpec`): LLM node specification defining prompts and tools
- `registry` (`Registry`): Component registry for tool and dependency resolution
- `initial_data` (`dict[str, Any] | None`, default `None`): Optional initial data for the component
- `yaml_path` (`str | None`, default `None`): Optional path to the YAML file this node was loaded from
- `strict` (`bool`, default `False`): Whether to enforce strict validation
- `default_lang` (`str`, default `'en'`): Default language code for prompt selection
- `fallback_lang` (`str`, default `'en'`): Fallback language code when the default is unavailable

**Returns:** `None`

### `add_tools`

```python
add_tools(self, /, tools) -> None
```

Add tool nodes to this LLM node for function calling.
**Args:**
- `tools` (`list[ToolNode]`): List of `ToolNode` instances to register with this LLM node

**Raises:**
- `TypeError`: If any item in `tools` is not a `ToolNode` instance

**Returns:** `None`

### `compile`

```python
compile(self) -> None
```

Compile the LLM node for execution.
Initializes the prompts bundle and prepares the node for invocation. Must be called before `invoke()`.

**Raises:**
- `LimanError`: If the node is already compiled

**Returns:** `None`

### `invoke`

```python
invoke(self, /, llm, inputs, lang=None, **kwargs) -> LangChainMessage
```

Execute the LLM node with the given inputs.
Combines the system prompts with the input messages and invokes the LLM with the available tools. Returns the LLM's response message.

**Args:**
- `llm` (`BaseChatModel`): Language model instance to use for generation
- `inputs` (`Sequence[BaseMessage]`): Sequence of input messages for the conversation
- `lang` (`LanguageCode | None`, default `None`): Language code for prompt selection (`default_lang` is used if `None`)
- `**kwargs` (`Any`): Additional arguments passed to LLM invocation

**Returns:** `LangChainMessage` — the response message from the language model

**Raises:**
- `LimanError`: If the node is not compiled or a tool is not found in the registry

### `get_new_state`

```python
get_new_state(self) -> LLMNodeState
```

Create a new state instance for this LLM node.
**Returns:** `LLMNodeState` — a fresh `LLMNodeState` with empty message history

### `_init_prompts`

```python
_init_prompts(self) -> None
```

**Returns:** `None`
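For illustration, the two equivalent prompt layouts shown under Language can be normalized to a single mapping. A minimal sketch of that equivalence (helper and role set are hypothetical; the library's internal prompt-bundle representation may differ):

```python
ROLES = {"system"}  # assumed prompt roles; extend as needed

def normalize_prompts(prompts: dict) -> dict:
    """Return prompts as {lang: {role: text}} regardless of input layout.

    Hypothetical helper, for illustration only.
    """
    out: dict[str, dict[str, str]] = {}
    for outer, inner in prompts.items():
        if outer in ROLES:  # role-first layout: {role: {lang: text}}
            for lang, text in inner.items():
                out.setdefault(lang, {})[outer] = text
        else:               # lang-first layout: {lang: {role: text}}
            for role, text in inner.items():
                out.setdefault(outer, {})[role] = text
    return out


role_first = {"system": {"en": "You are a helpful assistant.", "ru": "Вы помощник."}}
lang_first = {
    "en": {"system": "You are a helpful assistant."},
    "ru": {"system": "Вы помощник."},
}
assert normalize_prompts(role_first) == normalize_prompts(lang_first)
```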