Liman

LLMNode

Represents a node in a graph that uses a Large Language Model (LLM).

YAML declaration:

kind: LLMNode
name: StartNode
prompts:
  system:
    en: "You are a helpful assistant."
    ru: "Вы помощник."
tools:
  - WeatherTool
  - EmailTool

Language: Prompts can be defined in multiple languages. The fallback_lang is used when a prompt is not available in the requested language. The nesting order of languages in prompts does not matter, so:

prompts:
  system:
    en: "You are a helpful assistant."
    ru: "Вы помощник."

is equivalent to:

prompts:
  en:
    system: "You are a helpful assistant."
  ru:
    system: "Вы помощник."
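The equivalence between the two declaration shapes, and the fallback_lang behavior, can be sketched in plain Python. The helper names below (normalize_prompts, get_prompt) are hypothetical illustrations, not part of the Liman API:

```python
# Sketch of prompt-shape normalization and language fallback.
# normalize_prompts / get_prompt are hypothetical helpers, not Liman API.

LANG_CODES = {"en", "ru"}  # assumed set of recognized language codes

def normalize_prompts(prompts: dict) -> dict[str, dict[str, str]]:
    """Return prompts keyed as {lang: {prompt_name: text}}, regardless of
    whether the YAML nested language under prompt name or vice versa."""
    normalized: dict[str, dict[str, str]] = {}
    for outer, inner in prompts.items():
        if outer in LANG_CODES:
            # shape: {lang: {prompt_name: text}}
            normalized.setdefault(outer, {}).update(inner)
        else:
            # shape: {prompt_name: {lang: text}}
            for lang, text in inner.items():
                normalized.setdefault(lang, {})[outer] = text
    return normalized

def get_prompt(prompts: dict, name: str, lang: str, fallback_lang: str = "en") -> str:
    """Pick the prompt in `lang`, falling back to `fallback_lang`."""
    by_lang = normalize_prompts(prompts)
    if name in by_lang.get(lang, {}):
        return by_lang[lang][name]
    return by_lang[fallback_lang][name]

shape_a = {"system": {"en": "You are a helpful assistant.", "ru": "Вы помощник."}}
shape_b = {"en": {"system": "You are a helpful assistant."},
           "ru": {"system": "Вы помощник."}}

assert normalize_prompts(shape_a) == normalize_prompts(shape_b)
assert get_prompt(shape_a, "system", "ru") == "Вы помощник."
assert get_prompt(shape_a, "system", "de") == "You are a helpful assistant."  # falls back
```

Both shapes normalize to the same language-keyed mapping, and an unknown language code falls back to fallback_lang.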

Usage:

LLMNode(declaration=yaml_dict)
or
LLMNode(yaml_path="llm_node.yaml")

Attributes

__slots__ = BaseNode.__slots__ + ('prompts', 'registry')
spec_type = LLMNodeSpec
state_type = LLMNodeState
registry = registry

Functions

__init__(self, /, spec, registry, *, initial_data=None, yaml_path=None, strict=False, default_lang='en', fallback_lang='en') -> None

Initialize LLM node with specification and registry.

Args:
    spec: LLM node specification defining prompts and tools
    registry: Component registry for tool and dependency resolution
    initial_data: Optional initial data for the component
    yaml_path: Optional path to the YAML file this node was loaded from
    strict: Whether to enforce strict validation
    default_lang: Default language code for prompt selection
    fallback_lang: Fallback language code when default is unavailable

Parameters:
    self
    spec: LLMNodeSpec
    registry: Registry
    initial_data: dict[str, Any] | None = None
    yaml_path: str | None = None
    strict: bool = False
    default_lang: str = 'en'
    fallback_lang: str = 'en'

Returns

None
add_tools(self, /, tools) -> None

Add tool nodes to this LLM node for function calling.

Args:
    tools: List of ToolNode instances to register with this LLM node

Raises:
    TypeError: If any item in tools is not a ToolNode instance

Parameters:
    self
    tools: list[ToolNode]

Returns

None
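The TypeError contract described above can be illustrated with a minimal stand-in. ToolNode and LLMNodeSketch here are simplified placeholder classes, not the real Liman types:

```python
# Minimal stand-in illustrating add_tools' type contract.
# ToolNode / LLMNodeSketch are placeholders, not the real Liman classes.

class ToolNode:
    def __init__(self, name: str) -> None:
        self.name = name

class LLMNodeSketch:
    def __init__(self) -> None:
        self.tools: list[ToolNode] = []

    def add_tools(self, tools: list[ToolNode]) -> None:
        # Validate every item before registering any of them.
        for tool in tools:
            if not isinstance(tool, ToolNode):
                raise TypeError(f"Expected ToolNode, got {type(tool).__name__}")
        self.tools.extend(tools)

node = LLMNodeSketch()
node.add_tools([ToolNode("WeatherTool"), ToolNode("EmailTool")])
assert [t.name for t in node.tools] == ["WeatherTool", "EmailTool"]

try:
    node.add_tools(["not-a-tool"])  # rejected, as documented
except TypeError:
    pass
```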
compile(self) -> None

Compile the LLM node for execution.

Initializes prompts bundle and prepares the node for invocation. Must be called before invoke().

Raises:
    LimanError: If the node is already compiled

Parameters:
    self

Returns

None
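The compile-once, compile-before-invoke contract can be sketched as follows. LimanError and CompilableNode here are simplified stand-ins, not the real Liman implementation:

```python
# Sketch of the compile/invoke ordering contract.
# LimanError / CompilableNode are stand-ins, not the real Liman classes.

class LimanError(Exception):
    """Placeholder for Liman's error type."""

class CompilableNode:
    def __init__(self) -> None:
        self._compiled = False

    def compile(self) -> None:
        if self._compiled:
            raise LimanError("Node is already compiled")
        # ... initialize the prompts bundle here ...
        self._compiled = True

    def invoke(self) -> str:
        if not self._compiled:
            raise LimanError("Node must be compiled before invoke()")
        return "ok"

node = CompilableNode()
node.compile()
assert node.invoke() == "ok"
```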
invoke(self, /, llm, inputs, lang=None, **kwargs) -> LangChainMessage

Execute the LLM node with given inputs.

Combines system prompts with input messages and invokes the LLM with available tools. Returns the LLM's response message.

Args:
    llm: Language model instance to use for generation
    inputs: Sequence of input messages for the conversation
    lang: Language code for prompt selection (uses default_lang if None)
    **kwargs: Additional arguments passed to LLM invocation

Returns:
    Response message from the language model

Raises:
    LimanError: If the node is not compiled or a tool is not found in the registry

Parameters:
    self
    llm: BaseChatModel
    inputs: Sequence[BaseMessage]
    lang: LanguageCode | None = None
    **kwargs: Any

Returns

LangChainMessage
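The invoke flow described above (system prompt selected by language, prepended to the inputs, then passed to the model) can be sketched with a stub chat model. Message, StubChatModel, and invoke_node are simplified stand-ins, not the LangChain or Liman types:

```python
# Sketch of the invoke flow: pick system prompt by language, prepend it
# to the inputs, pass the conversation to the model.
# Message / StubChatModel / invoke_node are stand-ins, not real types.
from dataclasses import dataclass

@dataclass
class Message:
    role: str
    content: str

class StubChatModel:
    """Echoes the conversation length so the call flow is visible."""
    def invoke(self, messages: list[Message], **kwargs) -> Message:
        return Message("ai", f"saw {len(messages)} messages")

def invoke_node(llm, inputs, system_prompts, lang=None,
                default_lang="en", fallback_lang="en"):
    lang = lang or default_lang
    system = system_prompts.get(lang, system_prompts[fallback_lang])
    conversation = [Message("system", system), *inputs]
    return llm.invoke(conversation)

prompts = {"en": "You are a helpful assistant.", "ru": "Вы помощник."}
reply = invoke_node(StubChatModel(), [Message("human", "Hi!")], prompts, lang="ru")
assert reply.content == "saw 2 messages"  # system prompt + one input
```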
get_new_state(self) -> LLMNodeState

Create new state instance for this LLM node.

Returns:
    Fresh LLMNodeState with empty message history

Parameters:
    self

Returns

LLMNodeState
_init_prompts(self) -> None

Parameters:
    self

Returns

None
