# cx-data-pipeline: Data Pipeline Skill
Use this skill when configuring how Coralogix processes, enriches, and transforms data. It covers parsing rules (extract structured fields from raw logs), enrichments (add context from lookup tables), Events2Metrics (derive metrics from log/span events), and recording rules (precompute PromQL expressions).
## CLI Commands

| Command | Subcommands | Purpose |
|---|---|---|
| `cx parsing-rules` | list, get, create, update, delete, bulk-delete, usage-limits | Manage log parsing rules |
| `cx enrichments` | list, add, remove, overwrite, limit, settings | Manage enrichment rules |
| `cx enrichments custom` | list, get, create, update, delete, search | Manage custom enrichment tables |
| `cx e2m` | list, get, create, update, delete, labels-cardinality, limits | Manage Events2Metrics definitions |
| `cx recording-rules` | list, get, create, update, delete | Manage Prometheus recording rule groups |
Key flags:

- All create/update operations use `--from-file <path>` (or `-` for stdin)
- All commands support `-o json` for structured output and `-p <profile>` for profile selection
- `cx parsing-rules update` and `cx recording-rules update` require both `--from-file` and the rule group ID
- `cx enrichments custom search` requires `--id <table-id>` and `--query <text>`
- `cx parsing-rules bulk-delete` requires `--ids <id1> <id2> ...`
## Working with JSON Payloads

These commands use complex JSON structures. Always template from an existing resource to avoid format errors:

```shell
# 1. Get an existing resource as a template
cx parsing-rules get <rule-group-id> -o json > template.json

# 2. Modify the template (change fields, remove the ID for create operations)

# 3. Create or update
cx parsing-rules create --from-file template.json
cx parsing-rules update --from-file template.json <rule-group-id>
```

This pattern applies to all create/update operations across all 4 commands. It prevents payload format errors, which are the #1 cause of failed attempts.
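Step 2 above can be scripted with `jq`, which the workflows below already use for filtering. A minimal sketch, assuming the server-assigned identifier lives in a top-level `id` field (check your actual template for the key the API returns):

```shell
# Simulated template; in practice it comes from:
#   cx parsing-rules get <rule-group-id> -o json > template.json
echo '{"id": "abc123", "name": "my-rules", "enabled": true}' > template.json

# Drop the server-assigned id so the payload is valid for a create
jq 'del(.id)' template.json > new-group.json
cat new-group.json
```

The same `jq 'del(.id)'` edit can be piped straight into `--from-file -` to skip the intermediate file.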
## Parsing Rules Workflow

### 1. List Existing Rules

```shell
cx parsing-rules list -o json
cx parsing-rules list -o json | jq '[.[] | {id, name, enabled, rule_count: (.rules | length)}]'
```

### 2. Get a Template

```shell
cx parsing-rules get <existing-rule-group-id> -o json > rule-template.json
```

### 3. Create New Rule Group

Edit the template for your new service, then:

```shell
cx parsing-rules create --from-file rule-template.json
```

### 4. Verify Parsing

Query recent logs to confirm fields are extracted (load `cx-telemetry-querying` for log querying):

```shell
cx logs 'source logs | filter $d.subsystem == "my-service" | limit 10' -o json
```

### 5. Check Usage Limits

```shell
cx parsing-rules usage-limits -o json
```
## Enrichment Workflow

### 1. List Enrichment Rules

```shell
cx enrichments list -o json
cx enrichments settings -o json
cx enrichments limit -o json
```

### 2. Create Custom Enrichment Table (if needed)

```shell
cx enrichments custom list -o json
cx enrichments custom create --from-file table-definition.json
```

### 3. Add Enrichment Rules

```shell
cx enrichments add --from-file enrichment-rules.json
```

### 4. Search Custom Table Data

```shell
cx enrichments custom search --id <table-id> --query "search term"
```

### 5. Verify Enriched Fields

Query logs on hot storage (FrequentSearch tier) to confirm enriched fields appear. Avoid querying the archive for verification - ingestion delays can cause false negatives.

```shell
cx logs 'source logs | filter $d.enriched_field != null | limit 5' -o json
```
## Events2Metrics Workflow

### 1. Design the Metric

Decide the metric name, labels, and aggregation type before creating.

### 2. Check Limits

```shell
cx e2m limits -o json
cx e2m labels-cardinality -o json
```

### 3. Get a Template

```shell
cx e2m list -o json
cx e2m get <existing-e2m-id> -o json > e2m-template.json
```

### 4. Create E2M Definition

```shell
cx e2m create --from-file e2m-definition.json
```

### 5. Verify Metric

Confirm the new metric appears (load `cx-telemetry-querying` for metrics querying):

```shell
cx metrics search --name "new_metric_name"
```
## Recording Rules Workflow

### 1. List Existing Recording Rules

```shell
cx recording-rules list -o json
cx recording-rules list -o json | jq '[.[] | {id, name, rules: [.rules[]?.record]}]'
```

### 2. Get a Template

```shell
cx recording-rules get <existing-id> -o json > recording-rule-template.json
```

### 3. Create Recording Rule Group

```shell
cx recording-rules create --from-file recording-rule-group.json
```
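For orientation only: Coralogix recording rule groups follow the Prometheus rule-group model (a group name, an evaluation interval, and `record`/`expr` pairs). The exact field names and value types below are assumptions for illustration - always copy the real shape from `cx recording-rules get <existing-id> -o json` rather than writing a payload from scratch:

```json
{
  "name": "service-latency-rules",
  "interval": 60,
  "rules": [
    {
      "record": "job:http_request_duration_seconds:p95",
      "expr": "histogram_quantile(0.95, sum(rate(http_request_duration_seconds_bucket[5m])) by (le, job))"
    }
  ]
}
```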
### 4. Verify with PromQL

Confirm the precomputed metric is available (load `cx-telemetry-querying` for metrics querying):

```shell
cx metrics query "new_precomputed_metric" --time now
```
## Key Principles

- **Always template from existing** - run `cx <command> get <id> -o json > template.json` before any create
- **Verify after create** - query logs/metrics to confirm the pipeline change took effect
- **Use `-o json`** - all payload inspection and creation should use JSON output
- **Check limits first** - run `cx parsing-rules usage-limits` and `cx e2m limits` before creating to avoid hitting caps
- **Bulk operations** - use `cx parsing-rules bulk-delete --ids` for cleanup, not individual deletes
## Related Skills

- `cx-telemetry-querying` - discover what data is available before configuring the pipeline, and verify parsing results and enriched fields via log/metrics queries