
Amazon S3


Connect an Amazon S3 bucket to Agent Context. Agent Context reads structured files from your bucket and makes them queryable with SQL. Each file becomes a table.

  1. Go to Agent Context in the dashboard
  2. Click Add Source → Amazon S3
  3. Enter your credentials and the S3 path to the file
  4. Click Test Connection to verify, then Save

| Parameter | Required | Description |
|---|---|---|
| Region | No | AWS region (e.g., `us-east-1`) |
| Endpoint | No | Custom S3-compatible endpoint (for MinIO, Cloudflare R2, etc.) |
| Access Key | No | AWS access key ID |
| Secret Key | No | AWS secret access key |
| S3 URI | Yes | Path to the file: `bucket/path/to/file.parquet` |
| File Format | No | Auto-detected from extension. Override: `csv`, `tsv`, `parquet`, `json`, `jsonl` |

Alternatively, you can paste a connection string with all parameters.

| Format | Extension | Notes |
|---|---|---|
| Parquet | `.parquet` | Columnar format. Best performance for large datasets. |
| CSV | `.csv` | Auto-detects delimiter, headers, and column types. |
| TSV | `.tsv` | Tab-separated values. |
| JSON | `.json` | JSON array format. |
| JSONL | `.jsonl` | Newline-delimited JSON. One record per line. |
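
To illustrate the JSONL shape, here is a minimal example with made-up sample records, one JSON object per line:

```python
import json

# Sample records (illustrative data only).
records = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

# Serialize as JSONL: each record on its own line.
jsonl = "".join(json.dumps(r) + "\n" for r in records)
print(jsonl)

# Reading it back: each line parses independently.
parsed = [json.loads(line) for line in jsonl.splitlines()]
assert parsed == records
```

Unlike the JSON array format, JSONL can be processed line by line without loading the whole file.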

The S3 connector works with any S3-compatible storage. Set the Endpoint field to your provider’s URL:

| Service | Endpoint |
|---|---|
| Amazon S3 | Leave blank (default) |
| MinIO | Your MinIO URL |
| Cloudflare R2 | Your R2 endpoint |
| DigitalOcean Spaces | `https://<region>.digitaloceanspaces.com` |
| Backblaze B2 | Your B2 S3-compatible URL |
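
The endpoint logic above can be summarized in a small sketch. The helper name and service identifiers are hypothetical; only the DigitalOcean Spaces URL pattern is fixed, the other providers supply their own URLs:

```python
def endpoint_for(service: str, region: str = "", provider_url: str = "") -> str:
    """Return the value to put in the Endpoint field for a given provider."""
    if service == "amazon-s3":
        return ""  # leave the field blank to use AWS's default endpoint
    if service == "digitalocean-spaces":
        return f"https://{region}.digitaloceanspaces.com"
    # MinIO, Cloudflare R2, Backblaze B2: use the URL from your provider
    return provider_url

print(endpoint_for("digitalocean-spaces", region="nyc3"))
# https://nyc3.digitaloceanspaces.com
```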

| Issue | Fix |
|---|---|
| Access denied | Verify the access key has `s3:GetObject` and `s3:ListBucket` permissions on the bucket. |
| No tables found | Check that the S3 URI points to a supported file format. |
| Schema mismatch | Ensure the file has consistent column structure. |
| Timeout | Check network connectivity to the S3 region. |
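
For the access-denied case, an IAM policy granting the two required permissions looks like this (`your-bucket` is a placeholder for your bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-bucket",
        "arn:aws:s3:::your-bucket/*"
      ]
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while `s3:GetObject` applies to objects within it (the `/*` resource), so both ARNs are needed.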