
For data teams & enterprise

Pipe NZ data straight into your warehouse: a Snowflake zero-copy share, one-command Meltano / Fivetran / Azure Data Factory connector generators, a written SLA, and direct support.

Three ways into your stack

Pick the one that matches your existing infrastructure. They all return the same data.

Snowflake data share

Zero-copy, read-only Iceberg tables surfaced in your Snowflake account. You pay your own compute; we don't sit in the data path.

CREATE DATABASE eolas
  FROM SHARE MRWTYQY-WZ79363.vs_warehouse_share;

SELECT * FROM eolas.stats_nz.nz_cpi
WHERE date >= '2020-01-01';

Queryable from anything that talks Snowflake — Tableau, dbt, Power BI, Sigma, Mode.


Connector generators

One command produces a ready-to-run config for Meltano, Fivetran, or Azure Data Factory. Drop it into your existing pipeline runner — done.

eolas integrate meltano \
  --datasets nz_cpi,nz_gdp,sa2_2023 \
  --output ./my-pipeline/

# Or
eolas integrate fivetran --datasets nz_cpi
eolas integrate azure-data-factory ...

The generators run server-side and are available on the Enterprise plan only.

🔌

REST API (unlimited)

Same API as Pro, no monthly cap. Useful if you have an ETL platform you'd rather not change and just want to call our API from it.

curl -H "X-API-Key: $KEY" \
  "https://api.eolas.fyi/v1/datasets/nz_cpi/data"

# No row cap. No monthly quota.
# Streaming response for huge tables.

Plus a written SLA — typically 99.5% monthly uptime.

Connector generators, in detail

The thing nobody else does. One command, three platforms, no copy-pasting from docs.

🚢

Meltano

Generates a complete Meltano project: meltano.yml with tap-eolas + a default target (Postgres / Snowflake / BigQuery), .env.example, and a README.md. meltano install && meltano run and you're loading data.

eolas integrate meltano --datasets nz_cpi,nz_gdp --output ./my-pipeline/
cd my-pipeline && meltano install && meltano run

Fivetran (Custom Connector)

Generates a Fivetran Custom Connector deployable as a Lambda/Cloud Function. Returns NDJSON in Fivetran's expected schema; we maintain it so you don't have to.

eolas integrate fivetran --datasets nz_cpi
# Outputs: connector.py, fivetran.yml, requirements.txt, README.md

Azure Data Factory

Produces an ARM-template pipeline plus linked-service JSON for ADF. Drop them into your ADF instance, configure the sink (Synapse, Data Lake, SQL DB), and schedule it.

eolas integrate azure-data-factory --datasets nz_cpi,nz_gdp
# Outputs: pipeline.json, linkedService.json, README.md

The generators are also exposed at /v1/integrations/{meltano,fivetran,azure-data-factory} if you'd rather call them from your own automation.
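Calling a generator endpoint from your own automation might look like the sketch below. Only the /v1/integrations/{platform} path comes from above; the datasets query parameter and the response format are assumptions, so check the API reference before wiring this into CI:

```python
import urllib.parse
import urllib.request

API_ROOT = "https://api.eolas.fyi/v1"

def integration_request(platform: str, datasets: list[str],
                        api_key: str) -> urllib.request.Request:
    """Build a request for a server-side connector-generator run.

    The endpoint path matches /v1/integrations/{platform}; the
    `datasets` query parameter is a hypothetical illustration.
    """
    query = urllib.parse.urlencode({"datasets": ",".join(datasets)})
    return urllib.request.Request(
        f"{API_ROOT}/integrations/{platform}?{query}",
        headers={"X-API-Key": api_key},
    )

# urllib.request.urlopen(integration_request("meltano",
#     ["nz_cpi", "nz_gdp"], key)) would then fetch the generated project.
```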

Procurement-ready

We've published answers to the usual security-questionnaire questions on a public page so your procurement team can self-serve.

  • Data lineage: every dataset shows its source URL and licence (see §1 of the security page).
  • Hosting: AWS Sydney, open Iceberg format.
  • SLA: written agreement, typically 99.5% monthly uptime, with per-dataset refresh-cadence guarantees.
  • Incident response: automatic paging on alert; customers are notified within 72 hours.
  • Vendor questionnaires: happy to fill one in on request; typical turnaround is two working days.

Pricing

Enterprise is custom — priced by usage, data volume, support tier, and whether you need the Snowflake share. Typical entry point: $500/mo.

Get a quote

Email response within one working day.