AIAgent
```yaml
type: "io.kestra.plugin.ai.agent.AIAgent"
```

Examples
```yaml
id: simple_summarizer_agent
namespace: company.ai

inputs:
  - id: summary_length
    displayName: Summary Length
    type: SELECT
    defaults: medium
    values:
      - short
      - medium
      - long
  - id: language
    displayName: Language ISO code
    type: SELECT
    defaults: en
    values:
      - en
      - fr
      - de
      - es
      - it
      - ru
      - ja
  - id: text
    type: STRING
    displayName: Text to summarize
    defaults: |
      Kestra is an open-source orchestration platform that:
      - Allows you to define workflows declaratively in YAML
      - Allows non-developers to automate tasks with a no-code interface
      - Keeps everything versioned and governed, so it stays secure and auditable
      - Extends easily for custom use cases through plugins and custom scripts.
      Kestra follows a "start simple and grow as needed" philosophy. You can schedule a basic workflow in a few minutes, then later add Python scripts, Docker containers, or complicated branching logic if the situation calls for it.

tasks:
  - id: multilingual_agent
    type: io.kestra.plugin.ai.agent.AIAgent
    systemMessage: |
      You are a precise technical assistant.
      Produce a {{ inputs.summary_length }} summary in {{ inputs.language }}.
      Keep it factual, remove fluff, and avoid marketing language.
      If the input is empty or non-text, return a one-sentence explanation.
      Output format:
      - 1-2 sentences for 'short'
      - 2-5 sentences for 'medium'
      - Up to 5 paragraphs for 'long'
    prompt: |
      Summarize the following content: {{ inputs.text }}

  - id: english_brevity
    type: io.kestra.plugin.ai.agent.AIAgent
    prompt: Generate exactly 1 sentence English summary of "{{ outputs.multilingual_agent.textOutput }}"

pluginDefaults:
  - type: io.kestra.plugin.ai.agent.AIAgent
    values:
      provider:
        type: io.kestra.plugin.ai.provider.GoogleGemini
        modelName: gemini-2.5-flash
        apiKey: "{{ kv('GEMINI_API_KEY') }}"
```
```yaml
id: agent_with_docker_mcp_server_tool
namespace: company.ai

inputs:
  - id: prompt
    type: STRING
    defaults: What is the current UTC time?

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    prompt: "{{ inputs.prompt }}"
    provider:
      type: io.kestra.plugin.ai.provider.OpenAI
      apiKey: "{{ kv('OPENAI_API_KEY') }}"
      modelName: gpt-5-nano
    tools:
      - type: io.kestra.plugin.ai.tool.DockerMcpClient
        image: mcp/time
```
```yaml
id: agent_with_memory
namespace: company.ai

tasks:
  - id: first_agent
    type: io.kestra.plugin.ai.agent.AIAgent
    prompt: Hi, my name is John and I live in New York!

  - id: second_agent
    type: io.kestra.plugin.ai.agent.AIAgent
    prompt: What's my name and where do I live?

pluginDefaults:
  - type: io.kestra.plugin.ai.agent.AIAgent
    values:
      provider:
        type: io.kestra.plugin.ai.provider.OpenAI
        apiKey: "{{ kv('OPENAI_API_KEY') }}"
        modelName: gpt-5-mini
      memory:
        type: io.kestra.plugin.ai.memory.KestraKVStore
        memoryId: JOHN
        ttl: PT1M
        messages: 5
```
```yaml
id: agent_with_content_retriever
namespace: company.ai

inputs:
  - id: prompt
    type: STRING
    defaults: What is the latest Kestra release and what new features does it include? Name at least 3 new features added exactly in this release.

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    prompt: "{{ inputs.prompt }}"
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      modelName: gemini-2.5-flash
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
    contentRetrievers:
      - type: io.kestra.plugin.ai.retriever.TavilyWebSearch
        apiKey: "{{ kv('TAVILY_API_KEY') }}"
```
```yaml
id: agent_with_structured_output
namespace: company.ai

inputs:
  - id: customer_ticket
    type: STRING
    defaults: >-
      I can't log into my account. It says my password is wrong, and the reset link never arrives.

tasks:
  - id: support_agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.MistralAI
      apiKey: "{{ kv('MISTRAL_API_KEY') }}"
      modelName: open-mistral-7b
    systemMessage: |
      You are a classifier that returns ONLY valid JSON matching the schema.
      Do not add explanations or extra keys.
    configuration:
      responseFormat:
        type: JSON
        jsonSchema:
          type: object
          required: ["category", "priority"]
          properties:
            category:
              type: string
              enum: ["ACCOUNT", "BILLING", "TECHNICAL", "GENERAL"]
            priority:
              type: string
              enum: ["LOW", "MEDIUM", "HIGH"]
    prompt: |
      Classify the following customer message:
      {{ inputs.customer_ticket }}
```
```yaml
id: market_research_agent
namespace: company.ai

inputs:
  - id: prompt
    type: STRING
    defaults: |
      Research the latest trends in workflow and data orchestration.
      Use web search to gather current, reliable information from multiple sources.
      Then create a well-structured Markdown report that includes an introduction,
      key trends with short explanations, and a conclusion.
      Save the final report as `report.md` in the `/tmp` directory.

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    prompt: "{{ inputs.prompt }}"
    systemMessage: |
      You are a research assistant that must always follow this process:
      1. Use the TavilyWebSearch content retriever to gather the most relevant and up-to-date information for the user prompt. Do not invent information.
      2. Summarize and structure the findings clearly in Markdown format. Use headings, bullet points, and links when appropriate.
      3. Save the final Markdown report as `report.md` in the `/tmp` directory by using the provided filesystem tool.
      Important rules:
      - Never output raw text in your response. The final result must always be written to `report.md`.
      - If no useful results are retrieved, write a short note in `report.md` explaining that no information was found.
      - Do not attempt to bypass or ignore the retriever or the filesystem tool.
    contentRetrievers:
      - type: io.kestra.plugin.ai.retriever.TavilyWebSearch
        apiKey: "{{ kv('TAVILY_API_KEY') }}"
        maxResults: 10
    tools:
      - type: io.kestra.plugin.ai.tool.DockerMcpClient
        image: mcp/filesystem
        command: ["/tmp"]
        binds: ["{{workingDir}}:/tmp"] # mount host_path:container_path to access the generated report
    outputFiles:
      - report.md
```
```yaml
id: agent_with_code_execution_stats
namespace: company.ai

inputs:
  - id: series
    type: STRING
    defaults: |
      12, 15, 15, 18, 21, 99, 102, 102, 104

tasks:
  - id: stats_agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    systemMessage: |
      You are a data analyst.
      Always use the CodeExecution tool for computations.
      Then summarize clearly in English.
    prompt: |
      Here is a numeric series: {{ inputs.series }}
      1) Compute mean, median, min, max, and standard deviation.
      2) Detect outliers using a z-score greater than 2.
      3) Explain the distribution in 5-8 lines.
    tools:
      - type: io.kestra.plugin.ai.tool.CodeExecution
        apiKey: "{{ kv('RAPID_API_KEY') }}"
```
```yaml
id: agent_with_google_custom_search_release_notes
namespace: company.ai

inputs:
  - id: prompt
    type: STRING
    defaults: |
      Find the most recent Kestra release and summarize:
      - release date
      - 5 major new features
      - 3 important bug fixes
      Answer in English.

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    systemMessage: |
      You are a release-notes assistant.
      If you need up-to-date information, call GoogleCustomWebSearch.
      Summarize sources and avoid hallucinations.
    prompt: "{{ inputs.prompt }}"
    tools:
      - type: io.kestra.plugin.ai.tool.GoogleCustomWebSearch
        apiKey: "{{ kv('GOOGLE_SEARCH_API_KEY') }}"
        csi: "{{ kv('GOOGLE_SEARCH_CSI') }}"
```
```yaml
id: incident_triage_orchestrator
namespace: company.ai

inputs:
  - id: incident
    type: STRING
    defaults: |
      The "billing-prod" SaaS data has been stale for 2 hours.
      We suspect an API extraction failure from an external provider.

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.OpenAI
      apiKey: "{{ kv('OPENAI_API_KEY') }}"
      modelName: gpt-5-mini
    systemMessage: |
      You are an incident triage agent.
      Decide which flow to run to mitigate the issue.
      Use the kestra_flow tool to trigger it with relevant inputs.
    prompt: |
      Incident:
      {{ inputs.incident }}
      You can run the following flows in the "prod.ops" namespace:
      - restart-billing-extract (inputs: service, reason)
      - run-billing-backfill (inputs: service, sinceHours)
      - notify-oncall (inputs: team, severity, message)
      Pick the best flow and execute it using the tool.
    tools:
      - type: io.kestra.plugin.ai.tool.KestraFlow
```
```yaml
id: multi_flow_planner_agent
namespace: company.ai

inputs:
  - id: objective
    type: SELECT
    defaults: ingestion
    values:
      - ingestion
      - cleanup
      - alerting

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    prompt: |
      User objective: {{ inputs.objective }}
      Execute the most appropriate flow for this objective.
    tools:
      - type: io.kestra.plugin.ai.tool.KestraFlow
        namespace: prod.data
        flowId: ingest-daily-snapshots
        description: Daily ingestion of snapshots
      - type: io.kestra.plugin.ai.tool.KestraFlow
        namespace: prod.data
        flowId: purge-stale-partitions
        description: Cleanup of obsolete partitions
      - type: io.kestra.plugin.ai.tool.KestraFlow
        namespace: prod.ops
        flowId: send-severity-alert
        description: Send an on-call alert
```
```yaml
id: agent_using_kestra_task_self_healing
namespace: company.ai

inputs:
  - id: error_message
    type: STRING
    defaults: "Disk usage >= 95% on node worker-3"

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    systemMessage: |
      You are a self-healing automation agent.
      When remediation is needed, call the KestraTask tool.
    prompt: |
      Detected issue: {{ inputs.error_message }}
      1) Propose a safe remediation action.
      2) Execute the corresponding task using the tool.
    tools:
      - type: io.kestra.plugin.ai.tool.KestraTask
        tasks:
          - id: cleanup
            type: io.kestra.plugin.scripts.shell.Commands
            commands:
              - "..." # Placeholder: the agent will decide real commands.
            timeout: PT10M
```
```yaml
id: agent_with_sse_mcp_places
namespace: company.ai

inputs:
  - id: city
    type: STRING
    defaults: Lyon, France
  - id: cuisine
    type: STRING
    defaults: "bistronomic"

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    systemMessage: |
      You are a local guide.
      Use the MCP places tool to search restaurants.
      Return a short ranked list with brief reasons.
    prompt: |
      Find 3 {{ inputs.cuisine }} restaurants in {{ inputs.city }}.
      Criteria: rating > 4.5, quiet atmosphere, mid-range budget.
      Provide name, address, and two short reasons for each.
    tools:
      - type: io.kestra.plugin.ai.tool.SseMcpClient
        sseUrl: https://mcp.apify.com/?actors=compass/crawler-google-places
        timeout: PT3M
        headers:
          Authorization: Bearer {{ kv('APIFY_API_TOKEN') }}
```
```yaml
id: agent_research_and_validate_forecast
namespace: company.ai

inputs:
  - id: topic
    type: STRING
    defaults: "workflow and data orchestration market"
  - id: year
    type: INT
    defaults: 2028

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    provider:
      type: io.kestra.plugin.ai.provider.GoogleGemini
      apiKey: "{{ kv('GEMINI_API_KEY') }}"
      modelName: gemini-2.5-flash
    systemMessage: |
      You are a market research analyst.
      1) Use TavilyWebSearch to gather current market size and CAGR.
      2) Use CodeExecution to project the market size to the target year.
      3) Summarize in English with sources.
    prompt: |
      Topic: {{ inputs.topic }}
      1) Find credible sources for the current market size and CAGR.
      2) Project the market size for {{ inputs.year }} using the CAGR.
      3) Write a compact report (2 paragraphs) plus a list of sources.
    tools:
      - type: io.kestra.plugin.ai.tool.TavilyWebSearch
        apiKey: "{{ kv('TAVILY_API_KEY') }}"
      - type: io.kestra.plugin.ai.tool.CodeExecution
        apiKey: "{{ kv('RAPID_API_KEY') }}"
```
Properties

prompt (string, required)

provider (required, non-dynamic)

Definitions
Use Amazon Bedrock models
- accessKeyId (string, required)
- modelName (string, required)
- secretAccessKey (string, required)
- type (object, required): io.kestra.plugin.ai.provider.AmazonBedrock or io.kestra.plugin.langchain4j.provider.AmazonBedrock
- baseUrl (string)
- caPem (string)
- clientPem (string)
- modelType (string): default COHERE; possible values: COHERE, TITAN

Use Anthropic Claude models
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.Anthropic or io.kestra.plugin.langchain4j.provider.Anthropic
- baseUrl (string)
- caPem (string)
- clientPem (string)
- maxTokens (integer or string)

Use Azure OpenAI deployments
- endpoint (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.AzureOpenAI or io.kestra.plugin.langchain4j.provider.AzureOpenAI
- apiKey (string)
- baseUrl (string)
- caPem (string)
- clientId (string)
- clientPem (string)
- clientSecret (string)
- serviceVersion (string)
- tenantId (string)

Use DashScope (Qwen) models
- apiKey (string, required)
- modelName (string, required)
- type (object, required)
- baseUrl (string): default https://dashscope-intl.aliyuncs.com/api/v1
- caPem (string)
- clientPem (string)
- enableSearch (boolean or string)
- maxTokens (integer or string)
- repetitionPenalty (number or string)

Use DeepSeek models
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.DeepSeek or io.kestra.plugin.langchain4j.provider.DeepSeek
- baseUrl (string): default https://api.deepseek.com/v1
- caPem (string)
- clientPem (string)

Use GitHub Models via Azure AI Inference
- gitHubToken (string, required)
- modelName (string, required)
- type (object, required)
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use Google Gemini models
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.GoogleGemini or io.kestra.plugin.langchain4j.provider.GoogleGemini
- baseUrl (string)
- caPem (string)
- clientPem (string)
- embeddingModelConfiguration (io.kestra.plugin.ai.provider.GoogleGemini-EmbeddingModelConfiguration)
  - maxRetries (integer or string)
  - outputDimensionality (integer or string)
  - taskType (string): possible values: RETRIEVAL_QUERY, RETRIEVAL_DOCUMENT, SEMANTIC_SIMILARITY, CLASSIFICATION, CLUSTERING, QUESTION_ANSWERING, FACT_VERIFICATION
  - timeout (string)
  - titleMetadataKey (string)

Use Google Vertex AI models
- endpoint (string, required)
- location (string, required)
- modelName (string, required)
- project (string, required)
- type (object, required): io.kestra.plugin.ai.provider.GoogleVertexAI or io.kestra.plugin.langchain4j.provider.GoogleVertexAI
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use Hugging Face Inference endpoints
- apiKey (string, required)
- modelName (string, required)
- type (object, required)
- baseUrl (string): default https://router.huggingface.co/v1
- caPem (string)
- clientPem (string)

Use LocalAI OpenAI-compatible server
- baseUrl (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.LocalAI or io.kestra.plugin.langchain4j.provider.LocalAI
- caPem (string)
- clientPem (string)

Use Mistral models
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.MistralAI or io.kestra.plugin.langchain4j.provider.MistralAI
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use OCI Generative AI models
- compartmentId (string, required)
- modelName (string, required)
- region (string, required)
- type (object, required)
- authProvider (string)
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use local Ollama models
- endpoint (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.Ollama or io.kestra.plugin.langchain4j.provider.Ollama
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use OpenAI models
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.OpenAI or io.kestra.plugin.langchain4j.provider.OpenAI
- baseUrl (string): default https://api.openai.com/v1
- caPem (string)
- clientPem (string)

Use OpenRouter models
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.OpenRouter or io.kestra.plugin.langchain4j.provider.OpenRouter
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use IBM watsonx.ai models
- apiKey (string, required)
- modelName (string, required)
- projectId (string, required)
- type (object, required)
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use Cloudflare Workers AI models
- accountId (string, required)
- apiKey (string, required)
- modelName (string, required)
- type (object, required): io.kestra.plugin.ai.provider.WorkersAI or io.kestra.plugin.langchain4j.provider.WorkersAI
- baseUrl (string)
- caPem (string)
- clientPem (string)

Use ZhiPu AI models
- apiKey (string, required)
- modelName (string, required)
- type (object, required)
- baseUrl (string): default https://open.bigmodel.cn/
- caPem (string)
- clientPem (string)
- maxRetries (integer or string)
- maxToken (integer or string)
- stops (array of strings)
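
As a minimal sketch of swapping in one of these providers, the flow below points the agent at a local Ollama server; only the `endpoint` and `modelName` property names come from the definitions above, while the endpoint URL and model name are illustrative assumptions.

```yaml
id: agent_with_ollama_provider
namespace: company.ai

tasks:
  - id: agent
    type: io.kestra.plugin.ai.agent.AIAgent
    prompt: Summarize what Kestra does in two sentences.
    provider:
      type: io.kestra.plugin.ai.provider.Ollama
      endpoint: http://localhost:11434 # assumed local Ollama endpoint
      modelName: llama3.2              # assumed locally pulled model
```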
configuration (non-dynamic): default {}

Definitions

io.kestra.plugin.ai.domain.ChatConfiguration
- logRequests (boolean or string)
- logResponses (boolean or string)
- maxToken (integer or string)
- responseFormat (io.kestra.plugin.ai.domain.ChatConfiguration-ResponseFormat)
  - jsonSchema (object)
  - jsonSchemaDescription (string)
  - type (string): default TEXT; possible values: TEXT, JSON
- returnThinking (boolean or string)
- seed (integer or string)
- temperature (number or string)
- thinkingBudgetTokens (integer or string)
- thinkingEnabled (boolean or string)
- topK (integer or string)
- topP (number or string)
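
A minimal sketch of tuning the chat configuration on an agent task: the property names come from the ChatConfiguration definition above, while the chosen values are illustrative assumptions.

```yaml
- id: agent_with_tuned_configuration
  type: io.kestra.plugin.ai.agent.AIAgent
  prompt: Summarize today's pipeline failures in three bullet points.
  provider:
    type: io.kestra.plugin.ai.provider.OpenAI
    apiKey: "{{ kv('OPENAI_API_KEY') }}"
    modelName: gpt-5-mini
  configuration:
    temperature: 0.2  # assumed value: lower randomness for factual summaries
    seed: 42          # assumed value: improves reproducibility where the model supports it
    logRequests: true # log request payloads for debugging
    logResponses: true
```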
contentRetrievers

Definitions

Retrieve context from an embedding store
- embeddingProvider (required): accepts the same provider definitions as the provider property above
- embeddings (required): one of the embedding stores below
- type (object, required)
- maxResults (integer or string): default 3
- minScore (number or string): default 0.0

Store embeddings in Chroma
- baseUrl (string, required)
- collectionName (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.Chroma or io.kestra.plugin.langchain4j.embeddings.Chroma

Store embeddings in Elasticsearch
- connection (io.kestra.plugin.ai.embeddings.Elasticsearch-ElasticsearchConnection, required)
  - hosts (array of strings, required, min items 1)
  - basicAuth
  - headers (array of strings)
  - pathPrefix (string)
  - strictDeprecationMode (boolean or string)
  - trustAllSsl (boolean or string)
- indexName (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.Elasticsearch or io.kestra.plugin.langchain4j.embeddings.Elasticsearch

Prototype embeddings in Kestra KV
- type (object, required): io.kestra.plugin.ai.embeddings.KestraKVStore or io.kestra.plugin.langchain4j.embeddings.KestraKVStore
- kvName (string): default {{flow.id}}-embedding-store

Store embeddings in MariaDB
- createTable (boolean or string, required)
- databaseUrl (string, required)
- fieldName (string, required)
- password (string, required)
- tableName (string, required)
- type (object, required)
- username (string, required)
- columnDefinitions (array of strings)
- indexes (array of strings)
- metadataStorageMode (string): default COLUMN_PER_KEY

Store embeddings in Milvus
- token (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.Milvus or io.kestra.plugin.langchain4j.embeddings.Milvus
- autoFlushOnDelete (boolean or string)
- autoFlushOnInsert (boolean or string)
- collectionName (string)
- consistencyLevel (string)
- databaseName (string)
- host (string)
- idFieldName (string)
- indexType (string)
- metadataFieldName (string)
- metricType (string)
- password (string)
- port (integer or string)
- retrieveEmbeddingsOnSearch (boolean or string)
- textFieldName (string)
- uri (string)
- username (string)
- vectorFieldName (string)

Store embeddings in MongoDB Atlas
- collectionName (string, required)
- host (string, required)
- indexName (string, required)
- scheme (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.MongoDBAtlas or io.kestra.plugin.langchain4j.embeddings.MongoDBAtlas
- createIndex (boolean or string)
- database (string)
- metadataFieldNames (array of strings)
- options (object)
- password (string)
- username (string)

Store embeddings with pgvector
- database (string, required)
- host (string, required)
- password (string, required)
- port (integer or string, required)
- table (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.PGVector or io.kestra.plugin.langchain4j.embeddings.PGVector
- user (string, required)
- useIndex (boolean or string): default false

Store embeddings in Pinecone
- apiKey (string, required)
- cloud (string, required)
- index (string, required)
- region (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.Pinecone or io.kestra.plugin.langchain4j.embeddings.Pinecone
- namespace (string)

Store embeddings in Qdrant
- apiKey (string, required)
- collectionName (string, required)
- host (string, required)
- port (integer or string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.Qdrant or io.kestra.plugin.langchain4j.embeddings.Qdrant

Store embeddings in Redis
- host (string, required)
- port (integer or string, required)
- type (object, required)
- indexName (string): default embedding-index

Store embeddings in Alibaba Tablestore
- accessKeyId (string, required)
- accessKeySecret (string, required)
- endpoint (string, required)
- instanceName (string, required)
- type (object, required)
- metadataSchemaList (array of com.alicloud.openservices.tablestore.model.search.FieldSchema)
  - analyzer (string): possible values: SingleWord, MaxWord, MinWord, Split, Fuzzy
  - analyzerParameter
  - dateFormats (array of strings)
  - enableHighlighting (boolean)
  - enableSortAndAgg (boolean)
  - fieldName (string)
  - fieldType (string): possible values: LONG, DOUBLE, BOOLEAN, KEYWORD, TEXT, NESTED, GEO_POINT, DATE, VECTOR, FUZZY_KEYWORD, IP, JSON, UNKNOWN
  - index (boolean)
  - indexOptions (string): possible values: DOCS, FREQS, POSITIONS, OFFSETS
  - isArray (boolean)
  - jsonType (string): possible values: FLATTEN, NESTED
  - sourceFieldNames (array of strings)
  - store (boolean)
  - subFieldSchemas (array)
  - vectorOptions

Store embeddings in Weaviate
- apiKey (string, required)
- host (string, required)
- type (object, required): io.kestra.plugin.ai.embeddings.Weaviate or io.kestra.plugin.langchain4j.embeddings.Weaviate
- avoidDups (boolean or string)
- consistencyLevel (string): possible values: ONE, QUORUM, ALL
- grpcPort (integer or string)
- metadataFieldName (string)
- metadataKeys (array of strings)
- objectClass (string)
- port (integer or string)
- scheme (string)
- securedGrpc (boolean or string)
- useGrpcForInserts (boolean or string)

Retrieve web results with Google Custom Search
- apiKey (string, required)
- csi (string, required)
- type (object, required): io.kestra.plugin.ai.retriever.GoogleCustomWebSearch or io.kestra.plugin.langchain4j.retriever.GoogleCustomWebSearch
- maxResults (integer or string): default 3

Retrieve context from SQL (experimental)
- databaseType (object, required)
- password (string, required)
- provider (required): accepts the same provider definitions as the provider property above
- type (object, required)
- username (string, required)
- configuration: default {}; same chat configuration options as the configuration property above
- driver (string)
- jdbcUrl (string)
- maxPoolSize (integer or string): default 2

Retrieve web results with Tavily
- apiKey (string, required)
- type (object, required): io.kestra.plugin.ai.retriever.TavilyWebSearch or io.kestra.plugin.langchain4j.retriever.TavilyWebSearch
- maxResults (integer or string): default 3

maxSequentialToolsInvocations (integer or string)
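
As a sketch of the Google Custom Search content retriever defined above (complementing the Tavily retriever example earlier), the snippet below reuses the kv() key names from the Google Custom Search tool example; the prompt and maxResults value are illustrative.

```yaml
- id: agent_with_google_search_retriever
  type: io.kestra.plugin.ai.agent.AIAgent
  prompt: What changed in the latest Kestra release?
  provider:
    type: io.kestra.plugin.ai.provider.GoogleGemini
    apiKey: "{{ kv('GEMINI_API_KEY') }}"
    modelName: gemini-2.5-flash
  contentRetrievers:
    - type: io.kestra.plugin.ai.retriever.GoogleCustomWebSearch
      apiKey: "{{ kv('GOOGLE_SEARCH_API_KEY') }}"
      csi: "{{ kv('GOOGLE_SEARCH_CSI') }}"
      maxResults: 5 # default is 3
```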
memory (non-dynamic)

Definitions

Persist chat memory in Kestra KV
- type (object, required): io.kestra.plugin.ai.memory.KestraKVStore, io.kestra.plugin.ai.memory.KestraKVMemory, or io.kestra.plugin.langchain4j.memory.KestraKVMemory
- drop (string): default NEVER; possible values: NEVER, BEFORE_TASKRUN, AFTER_TASKRUN
- memoryId (string): default {{ labels.system.correlationId }}
- messages (integer or string): default 10
- ttl (string): default PT1H

Persist chat memory in PostgreSQL
- database (string, required)
- host (string, required)
- password (string, required)
- type (object, required)
- user (string, required)
- drop (string): default NEVER; possible values: NEVER, BEFORE_TASKRUN, AFTER_TASKRUN
- memoryId (string): default {{ labels.system.correlationId }}
- messages (integer or string): default 10
- port (integer or string): default 5432
- tableName (string): default chat_memory
- ttl (string): default PT1H

Persist chat memory in Redis
- host (string, required)
- type (object, required)
- drop (string): default NEVER; possible values: NEVER, BEFORE_TASKRUN, AFTER_TASKRUN
- memoryId (string): default {{ labels.system.correlationId }}
- messages (integer or string): default 10
- port (integer or string): default 6379
- ttl (string): default PT1H
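
A sketch of the `drop` and default `memoryId` behavior documented above, using the Kestra KV memory already shown in the memory example; the ttl value is an assumption.

```yaml
- id: agent_with_scoped_memory
  type: io.kestra.plugin.ai.agent.AIAgent
  prompt: Remember the context of this conversation for follow-up questions.
  provider:
    type: io.kestra.plugin.ai.provider.OpenAI
    apiKey: "{{ kv('OPENAI_API_KEY') }}"
    modelName: gpt-5-mini
  memory:
    type: io.kestra.plugin.ai.memory.KestraKVStore
    # memoryId defaults to {{ labels.system.correlationId }} when omitted
    drop: AFTER_TASKRUN # clear the stored conversation once this task run finishes
    ttl: PT30M          # assumed retention window
    messages: 10        # default number of messages kept
```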
outputFiles (array of strings)

systemMessage (string)
tools (non-dynamic)

Definitions

Invoke remote AI agent over A2A
- description (string, required)
- serverUrl (string, required)
- type (object, required)
- name (string): default tool

Expose a nested AI Agent as a tool
- description (string, required)
- provider (required): accepts the same provider definitions as the provider property above
- type (object, required)
- configuration: default {}; same chat configuration options as the configuration property above
- contentRetrievers: accepts the same content retriever definitions as the contentRetrievers property above
- maxSequentialToolsInvocations (integer or string)
- name (string): default tool
- systemMessage (string)
- tools: accepts the same tool definitions as this tools property
Execute JavaScript via Judge0
- apiKey (string, required)
- type (object, required)

Run MCP tools in Docker
- image (string, required)
- type (object, required)
- apiVersion (string)
- binds (array of strings)
- command (array of strings)
- dockerCertPath (string)
- dockerConfig (string)
- dockerContext (string)
- dockerHost (string)
- dockerTlsVerify (boolean or string)
- env (object of strings)
- logEvents (boolean or string): default false
- registryEmail (string)
- registryPassword (string)
- registryUrl (string)
- registryUsername (string)

Search the web with Google CSE
- apiKey (string, required)
- csi (string, required)
- type (object, required): io.kestra.plugin.ai.tool.GoogleCustomWebSearch or io.kestra.plugin.langchain4j.tool.GoogleCustomWebSearch

Execute Kestra flows from an agent
- type (object, required)
- description (string)
- flowId (string)
- inheritLabels (boolean or string): default false
- inputs (object)
- labels (array or object)
- namespace (string)
- revision (integer or string)
- scheduleDate (string)

Expose Kestra runnable tasks as tools
- tasks (array, required)
- type (object, required)

Call MCP server over SSE
- sseUrl (string, required)
- type (object, required): io.kestra.plugin.ai.tool.SseMcpClient, io.kestra.plugin.ai.tool.HttpMcpClient, or io.kestra.plugin.langchain4j.tool.HttpMcpClient
- headers (object of strings)
- logRequests (boolean or string): default false
- logResponses (boolean or string): default false
- timeout (string)

Run MCP tools over stdio
- command (array of strings, required)
- type (object, required): io.kestra.plugin.ai.tool.StdioMcpClient or io.kestra.plugin.langchain4j.tool.StdioMcpClient
- env (object of strings)
- logEvents (boolean or string): default false
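
A sketch of the stdio MCP client defined above, since the examples only cover the Docker and SSE variants; the `uvx mcp-server-time` command is an assumption and requires that binary to be available on the worker.

```yaml
- id: agent_with_stdio_mcp_tool
  type: io.kestra.plugin.ai.agent.AIAgent
  prompt: What is the current UTC time?
  provider:
    type: io.kestra.plugin.ai.provider.GoogleGemini
    apiKey: "{{ kv('GEMINI_API_KEY') }}"
    modelName: gemini-2.5-flash
  tools:
    - type: io.kestra.plugin.ai.tool.StdioMcpClient
      command: ["uvx", "mcp-server-time"] # assumed MCP server command available on the worker
      logEvents: true                     # default is false
```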
Call MCP server over HTTP streaming
- type (object, required)
- url (string, required)
- headers (object of strings)
- logRequests (boolean or string): default false
- logResponses (boolean or string): default false
- timeout (string)

Search the web with Tavily
- apiKey (string, required)
- type (object, required): io.kestra.plugin.ai.tool.TavilyWebSearch or io.kestra.plugin.langchain4j.tool.TavilyWebSearch

Outputs
finishReason (string): possible values: STOP, LENGTH, TOOL_EXECUTION, CONTENT_FILTER, OTHER

intermediateResponses (array of io.kestra.plugin.ai.domain.AIOutput-AIResponse)
- completion (string)
- finishReason (string): possible values: STOP, LENGTH, TOOL_EXECUTION, CONTENT_FILTER, OTHER
- id (string)
- requestDuration (integer)
- tokenUsage (io.kestra.plugin.ai.domain.TokenUsage)
  - inputTokenCount (integer)
  - outputTokenCount (integer)
  - totalTokenCount (integer)
- toolExecutionRequests (array of io.kestra.plugin.ai.domain.AIOutput-AIResponse-ToolExecutionRequest)
  - arguments (object)
  - id (string)
  - name (string)

jsonOutput (object)

outputFiles (object of strings)

requestDuration (integer)

sources (array of io.kestra.plugin.ai.domain.AIOutput-ContentSource)
- content (string)
- metadata (object)

textOutput (string)

thinking (string)

tokenUsage (io.kestra.plugin.ai.domain.TokenUsage)
- inputTokenCount (integer)
- outputTokenCount (integer)
- totalTokenCount (integer)

toolExecutions (array of io.kestra.plugin.ai.domain.AIOutput-ToolExecution)
- requestArguments (object)
- requestId (string)
- requestName (string)
- result (string)
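
A sketch of referencing these outputs from a downstream task, assuming the agent task id is `agent`:

```yaml
- id: log_agent_result
  type: io.kestra.plugin.core.log.Log
  message: |
    Answer: {{ outputs.agent.textOutput }}
    Finish reason: {{ outputs.agent.finishReason }}
    Total tokens: {{ outputs.agent.tokenUsage.totalTokenCount }}
```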
Metrics

input.token.count (counter): unit token
output.token.count (counter): unit token
total.token.count (counter): unit token