Connections
Connecting to databases and object storage providers
Overview
Framework provides a unified interface for connecting to databases and object storage. Define connections once in settings.yml and Framework handles the rest. For running queries and working with database helpers, see the Database Integration guide.
Configuring Connections
Define database connections in settings.yml:
connections:
  analytics:
    driver: postgres
    host: env("DB_HOST")
    port: 5432
    database: analytics
    user: env("DB_USER")
    password: env("DB_PASSWORD")
  warehouse:
    driver: duckdb
    database: data/warehouse.duckdb
  local:
    driver: sqlite
    database: data/local.db
The env() syntax reads values from environment variables, keeping credentials out of your config files.
Store sensitive values in .env:
# .env (gitignored)
DB_HOST=db.example.com
DB_USER=analyst
DB_PASSWORD=secret
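Once defined, a connection is referred to by name in R code. The sketch below is illustrative only: db_list() appears later in this guide, while db_query() is a hypothetical helper standing in for the query functions covered in the Database Integration guide.
library(framework)

# Confirm the connections defined in settings.yml are visible
db_list()

# Hypothetical helper: run a query against the "analytics" connection by name
# (the real query helpers are documented in the Database Integration guide)
results <- db_query("analytics", "SELECT count(*) FROM events")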
Object Storage Connections
Define object storage endpoints and buckets in settings.yml, then use helpers like s3_ls(), s3_download(), and s3_upload() to browse or sync data:
object_storage:
  raw_data:
    provider: s3
    bucket: env("S3_BUCKET")
    region: env("AWS_REGION")
    access_key_id: env("AWS_ACCESS_KEY_ID")
    secret_access_key: env("AWS_SECRET_ACCESS_KEY")
Both the GUI and the R helpers respect these connection profiles, so you can browse buckets in framework::gui() or automate transfers in code.
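For example, a script might browse and sync files through the raw_data profile defined above. A minimal sketch, assuming the first argument names the connection profile and the remaining arguments are remote and local paths; the object keys are made up for illustration, so check the helper documentation for the exact signatures:
library(framework)

# Browse objects in the bucket behind the "raw_data" profile
s3_ls("raw_data")

# Assumed argument order: profile, then source path, then destination path
s3_download("raw_data", "exports/events.csv", "data/events.csv")
s3_upload("raw_data", "data/summary.csv", "processed/summary.csv")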
Listing Connections
View all configured connections (databases and object storage) with connections_list(), or limit to only database entries using db_list():
connections_list()
db_list() # Databases only
Best Practices
Use env() for Credentials
Never hardcode credentials:
# Good
password: env("DB_PASSWORD")
# Never do this
password: "my_secret_password"
Name Connections Clearly
connections:
  production_analytics:   # Clear purpose
    ...
  staging_analytics:      # Environment obvious
    ...