Airbyte is an open-source data integration platform used for ETL. With this integration, you can connect Lago billing data to any warehouse.
You can push Lago billing data to destinations such as Snowflake, BigQuery, Redshift, S3 buckets or Azure. The entire list of data destinations enabled by Airbyte is listed in their destinations documentation.
Data available for extraction
With Airbyte’s native integration of Lago, you can push the following billing data to warehouses:
At present, this connector only supports full refresh syncs, meaning that each sync pulls all available records from scratch. Use it cautiously if you expect your API to return a large number of records.
Find the full documentation of Airbyte’s native Lago integration.
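Because the connector is full-refresh only, each sync re-delivers the complete dataset, so a destination table should be overwritten rather than appended to. Here is a minimal Python sketch of that overwrite semantic (the in-memory "warehouse" and record fields are illustrative, not the connector's actual schema):

```python
# Simulate full-refresh sync semantics: every sync replaces the
# destination table wholesale instead of appending to it.

def full_refresh_load(destination: dict, table: str, records: list) -> None:
    """Overwrite `table` in `destination` with the freshly synced records."""
    destination[table] = list(records)  # replace, never append

warehouse = {}

# First sync delivers two invoices.
full_refresh_load(warehouse, "invoices", [{"id": "inv_1"}, {"id": "inv_2"}])

# Second sync re-delivers everything, plus one new invoice.
full_refresh_load(
    warehouse,
    "invoices",
    [{"id": "inv_1"}, {"id": "inv_2"}, {"id": "inv_3"}],
)

# Appending would have produced 5 rows; overwriting keeps 3.
print(len(warehouse["invoices"]))  # 3
```

This is why a large source dataset makes frequent syncs expensive: there is no incremental delta, only full reloads.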
1. Connect Lago to Airbyte
First, you need your Lago private API key. In Airbyte:
- Go to Sources;
- Select getLago as a source of data; and
- Paste your Lago private API key.
Lago data source in Airbyte
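Before pasting the key into Airbyte, you can sanity-check it directly against the Lago API. This sketch assumes Lago Cloud's public base URL and Bearer-token authentication as described in Lago's API docs; adjust `LAGO_API_URL` if you self-host:

```python
# Build an authenticated request against the Lago API to verify a
# private API key. Base URL and auth scheme are assumptions based on
# Lago's public API documentation.
import urllib.request

LAGO_API_URL = "https://api.getlago.com/api/v1"

def lago_request(path: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request against the Lago API."""
    return urllib.request.Request(
        f"{LAGO_API_URL}{path}",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = lago_request("/customers", "your-private-api-key")
print(req.full_url)  # https://api.getlago.com/api/v1/customers
# To actually test the key, send the request with
# urllib.request.urlopen(req): a 401 response means the key is invalid.
```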
2. Select a destination
You can select any of the data destinations available in Airbyte. It could be a warehouse (BigQuery, Redshift, Snowflake…) or a file storage tool (S3, for instance). The entire list of data destinations available in Airbyte can be found in their documentation.
Destination in Airbyte
3. Sync billing data
In the following example, we connect Lago billing data to a Snowflake data warehouse. You can select another destination if needed.
- Create a data sync between Lago source and your destination;
- Define a sync frequency; and
- Activate the sync in Airbyte between Lago source and your destination.
This action pushes Lago billing data into your warehouse (Snowflake in our example).
Lago data in Snowflake
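Beyond the UI, a self-hosted Airbyte deployment also exposes an HTTP API that can trigger a sync on demand, which is handy for orchestration. The host, port, endpoint path and connection id below are assumptions about a local open-source deployment; check your Airbyte instance's API docs:

```python
# Build the POST request that triggers a manual sync of an Airbyte
# connection. URL and payload shape are assumptions about the
# open-source Airbyte API; the connection id is a placeholder.
import json
import urllib.request

AIRBYTE_URL = "http://localhost:8000/api/v1"  # assumed local OSS deployment

def sync_request(connection_id: str) -> urllib.request.Request:
    """Build a request asking Airbyte to sync one connection now."""
    payload = json.dumps({"connectionId": connection_id}).encode()
    return urllib.request.Request(
        f"{AIRBYTE_URL}/connections/sync",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = sync_request("00000000-0000-0000-0000-000000000000")  # placeholder id
print(req.method, req.full_url)
# Send with urllib.request.urlopen(req) once the placeholders are real.
```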
4. Query Lago billing data
Once the data has been populated in your destination (a warehouse, in our example), you can easily query your billing data. Here is a query that calculates your monthly revenue with Lago:
Query in Snowflake
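To illustrate the shape of such a query, here is a runnable sketch that aggregates revenue per month over a local SQLite copy of the data. The column names (`issuing_date`, `total_amount_cents`) are assumptions about the synced invoices table, and the sample rows are made up; check your own schema, and in Snowflake use `DATE_TRUNC('month', issuing_date)` in place of SQLite's `strftime`:

```python
# Monthly-revenue aggregation demonstrated on SQLite; the GROUP BY
# shape carries over to Snowflake. Column names and sample data are
# illustrative assumptions, not the exact Lago schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE invoices (issuing_date TEXT, total_amount_cents INTEGER)"
)
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("2023-01-05", 1000), ("2023-01-20", 2500), ("2023-02-03", 4000)],
)

# Revenue per month, in cents.
rows = conn.execute(
    """
    SELECT strftime('%Y-%m', issuing_date) AS month,
           SUM(total_amount_cents) AS revenue_cents
    FROM invoices
    GROUP BY month
    ORDER BY month
    """
).fetchall()
print(rows)  # [('2023-01', 3500), ('2023-02', 4000)]
```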