Source
```yaml
id: aws-log-shipper
namespace: system

tasks:
  - id: log_export
    type: io.kestra.plugin.ee.core.log.LogShipper
    logLevelFilter: INFO
    lookbackPeriod: P1D
    offsetKey: log_shipper_aws_cloudwatch_state
    delete: false
    logExporters:
      - id: aws_cloudwatch
        type: io.kestra.plugin.ee.aws.cloudwatch.LogExporter
        accessKeyId: "{{ secret('AWS_ACCESS_KEY_ID') }}"
        secretKeyId: "{{ secret('AWS_SECRET_KEY_ID') }}"
        region: "{{ vars.region }}"
        logGroupName: kestra
        logStreamName: kestra-log-stream

triggers:
  - id: daily
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "@daily"
```
About this blueprint
Tags: Cloud, Core, System
This system blueprint demonstrates how to export workflow and system logs to AWS CloudWatch, centralizing them for monitoring, auditing, and observability in production environments.
The flow runs on a daily schedule and performs the following actions:
- Collects logs emitted by the orchestration platform over a configurable lookback period.
- Filters logs by severity level to control which events are exported (a tuning sketch for the filter and lookback window follows this list).
- Ships logs to AWS CloudWatch Logs using a dedicated log group and log stream.
- Maintains state under an offset key so that each log is exported exactly once, with no duplicates across runs.
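As an illustration of the filter and lookback settings, a hedged variant of the task, assuming you only care about warnings and errors and want a shorter scan window (both values are illustrative; `PT12H` is an ISO-8601 duration like `P1D` above):

```yaml
# Variant of the log_export task above (exporter list unchanged, omitted here)
- id: log_export
  type: io.kestra.plugin.ee.core.log.LogShipper
  logLevelFilter: WARN     # assumed: standard levels, so WARN exports WARN and ERROR only
  lookbackPeriod: PT12H    # ISO-8601 duration: look back 12 hours instead of a full day
  offsetKey: log_shipper_aws_cloudwatch_state
  delete: false
```

If you shorten the lookback window, pair it with a correspondingly frequent schedule so no interval goes unscanned; the offset key guards against duplicates either way.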
This pattern is particularly useful for:
- Centralized logging across workflows and services
- Operational monitoring and incident investigation
- Compliance, auditing, and retention policies
- Integrating logs with CloudWatch alarms, dashboards, and downstream observability tools (see the metric-filter sketch after this list)
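As a concrete example of the last item, a CloudWatch metric filter can turn exported ERROR lines into a metric that an alarm or dashboard can track. A minimal CloudFormation sketch, assuming the `kestra` log group from the flow above (the filter pattern and metric names are illustrative):

```yaml
Resources:
  KestraErrorFilter:
    Type: AWS::Logs::MetricFilter
    Properties:
      LogGroupName: kestra           # must match logGroupName in the flow
      FilterPattern: '"ERROR"'       # match log events containing the literal ERROR
      MetricTransformations:
        - MetricName: KestraErrorCount
          MetricNamespace: Kestra/Logs
          MetricValue: "1"           # each matching event adds 1 to the metric
```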
Configuration:
- Provide AWS credentials with permission to write to CloudWatch Logs (a minimal policy sketch follows this list).
- Adjust the log group and stream names to match your AWS logging conventions.
- Tune the log level filter and lookback period to match your operational requirements.
- Set the `delete` flag to control whether logs are retained locally after export (here they are retained).
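For the first item, a minimal IAM policy sketch granting write access to CloudWatch Logs, shown in CloudFormation-style YAML (the equivalent JSON works in the IAM console); the resource ARN is illustrative and should be scoped to your account and region:

```yaml
Version: "2012-10-17"
Statement:
  - Effect: Allow
    Action:
      - logs:CreateLogGroup      # only needed if the log group does not already exist
      - logs:CreateLogStream
      - logs:PutLogEvents
      - logs:DescribeLogStreams
    Resource: arn:aws:logs:*:*:log-group:kestra*
```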
By shipping logs to CloudWatch, this blueprint enables production-grade observability and integrates workflow execution logs into your existing AWS monitoring stack.