wren-connection-info


# Wren Connection Info Reference

Connection info can only be configured through the MCP server Web UI at http://localhost:9001. Do not attempt to set it programmatically. Always direct the user to the Web UI.


## Supported data sources

| Value | Database | Fields reference |
|-------|----------|------------------|
| `POSTGRES` | PostgreSQL | databases.md |
| `MYSQL` | MySQL / MariaDB | databases.md |
| `MSSQL` | SQL Server | databases.md |
| `CLICKHOUSE` | ClickHouse | databases.md |
| `ORACLE` | Oracle | databases.md |
| `DORIS` | Apache Doris | databases.md |
| `REDSHIFT` | Amazon Redshift | databases.md |
| `TRINO` | Trino | databases.md |
| `BIGQUERY` | Google BigQuery | databases.md |
| `SNOWFLAKE` | Snowflake | databases.md |
| `DUCKDB` | DuckDB | databases.md |
| `ATHENA` | AWS Athena | databases.md |
| `DATABRICKS` | Databricks | databases.md |
| `SPARK` | Apache Spark | databases.md |
| `S3_FILE` | Amazon S3 | file-sources.md |
| `GCS_FILE` | Google Cloud Storage | file-sources.md |
| `MINIO_FILE` | MinIO | file-sources.md |
| `LOCAL_FILE` | Local files | file-sources.md |

Read the linked reference file for the user's data source to get required fields, default ports, and setup notes.


## Common patterns

Most database connectors need: `host`, `port`, `user`, `password`, `database`.

Exceptions:

- **BigQuery**: uses `project_id`, `dataset_id`, `credentials` (base64-encoded). See databases.md for encoding instructions.
- **Snowflake**: uses `account` instead of `host`, plus `schema`.
- **Trino**: needs `catalog` and `schema` instead of `database`.
- **Databricks**: uses `serverHostname`, `httpPath`, `accessToken` (or a service principal with `clientId`, `clientSecret`).
- **Spark**: only `host` and `port` (Spark Connect protocol, no auth fields).
- **File sources**: use `url`, `format`, plus bucket/credentials. See file-sources.md.
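
The base64 encoding BigQuery expects can be prepared with the standard library; databases.md has the authoritative instructions, so this is only a rough sketch (the function name and the commented file path are our own):

```python
import base64

def encode_credentials(raw: bytes) -> str:
    """Base64-encode raw service-account JSON bytes for the
    BigQuery `credentials` field."""
    return base64.b64encode(raw).decode("ascii")

# Typical usage (file name is hypothetical):
# with open("service-account.json", "rb") as f:
#     print(encode_credentials(f.read()))
```

The resulting string is what the user pastes into the `credentials` field in the Web UI.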

## Docker host hint

If the database runs on the host machine and Wren Engine runs inside Docker, `localhost` in the connection info resolves to the container itself, not the host. Use `host.docker.internal` instead:

| Original | Inside Docker |
|----------|---------------|
| `localhost` | `host.docker.internal` |
| `127.0.0.1` | `host.docker.internal` |
| Cloud/remote hostname | No change needed |
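
The mapping above amounts to a one-line rewrite of loopback hostnames; a minimal sketch (the helper name is our own):

```python
def docker_host(hostname: str) -> str:
    """Rewrite loopback hostnames so Wren Engine inside Docker can
    reach a database on the host machine; cloud/remote hostnames
    pass through unchanged."""
    if hostname in ("localhost", "127.0.0.1"):
        return "host.docker.internal"
    return hostname
```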

## Sensitive fields

Never log, display, or pass sensitive values through the AI agent unnecessarily.

| Connector | Sensitive fields |
|-----------|------------------|
| Postgres / MySQL / MSSQL / ClickHouse / Oracle / Doris / Redshift | `password` |
| BigQuery | `credentials` |
| Snowflake | `password` |
| Athena | `aws_access_key_id`, `aws_secret_access_key` |
| Databricks (token) | `accessToken` |
| Databricks (service principal) | `clientId`, `clientSecret` |
| S3 / MinIO | `access_key`, `secret_key` |
| GCS | `key_id`, `secret_key`, `credentials` |
| Trino / Spark / Local files | (none) |
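
When connection info must be echoed back to the user, masking the fields listed above first avoids accidental leaks. A minimal sketch (the helper and constant names are our own; the field names come from the table above):

```python
# Field names drawn from the sensitive-fields table above.
SENSITIVE_FIELDS = {
    "password", "credentials", "accessToken", "clientId", "clientSecret",
    "aws_access_key_id", "aws_secret_access_key",
    "access_key", "secret_key", "key_id",
}

def redact(info: dict) -> dict:
    """Return a copy of connection info with sensitive values masked."""
    return {k: ("***" if k in SENSITIVE_FIELDS else v) for k, v in info.items()}
```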