DataHub API
Ingest data from various sources for contextualized analysis of your spend.
The DataHub API allows you to import data from internal and external sources into the DoiT Console. Use this functionality to monitor your internal resource usage, optimize resource allocation, and manage operational expenses.
Operations
- POST /datahub/v1/events: Sends a batch of events to DataHub using a JSON payload.
- DELETE /datahub/v1/events: Deletes all previously sent events (a minimal call is sketched below).
- DELETE /datahub/v1/events/delete: Deletes specific events using either eventId or provider as the filter.
- POST /datahub/v1/csv/upload: Sends a batch of events to DataHub using a CSV file, either uncompressed or compressed in ZIP or GZ format.
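For instance, clearing previously ingested data takes a single call. A minimal sketch of the delete-all operation, assuming no request body is required:

curl --request DELETE \
  --url https://api.doit.com/datahub/v1/events \
  --header 'Authorization: Bearer {YOUR_API_KEY}'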
Data formats
The DataHub API supports two data formats:
Data format | Media type | Limits |
---|---|---|
JSON | application/json | Up to 255 events per request |
CSV file | multipart/form-data | Maximum file size: 30 MB |
If you choose CSV, be aware that:

- The DataHub API only accepts files no larger than 30 MB. If your CSV exceeds 30 MB, compress it using ZIP or GZ (see the example after this list).
- The DataHub API doesn't accept ZIP or GZ archives that contain more than one CSV file.
See also rate limits.
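For example, an oversized CSV can be gzipped before upload (the file name here is illustrative); the resulting archive contains exactly one CSV, as required:

# Compress a large CSV before uploading; -k keeps the original file
gzip -k events.csv    # produces events.csv.gz (a GZ archive with a single CSV)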
Data latency
Once you receive a success response (HTTP 201 Created), the ingested data becomes available in the DoiT console within 15 minutes, regardless of the data format.
JSON format
Events schema
Property | Type | Description |
---|---|---|
provider (required) | string | Identifies the data provider. Choose a human-readable value so you can easily recognize the dataset, for example, Datadog. Allowed characters: alphanumeric (0-9, a-z, A-Z), underscore (_), space, dash (-) |
id | string | A unique identifier for the event. If not set, a UUIDv4 is generated at ingestion time. |
dimensions.type | string | The type of the dimension. Possible values: fixed, label, project_label |
dimensions.key | string | The key of the dimension. If the type is fixed, the key must be one of the allowed keys listed below. |
dimensions.value | string | The value of the dimension. |
time (required) | string (RFC3339) | The time that the event occurred. For example, 2024-02-23T12:00:00.00Z |
metrics.type | string | The type of the metric, for integration with billing data. Possible values: cost, usage |
metrics.value | number | Value for the metric. |
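Putting the schema together, a minimal event might look like the following sketch (the provider and values are illustrative):

{
  "provider": "my_provider",
  "dimensions": [
    { "key": "project_id", "value": "my-project", "type": "fixed" }
  ],
  "time": "2024-02-23T12:00:00.00Z",
  "metrics": [
    { "value": 10.5, "type": "cost" }
  ]
}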
Allowed keys for fixed dimensions

Event dimension key | Report field |
---|---|
billing_account_id | Billing Account ID |
country | Country |
is_marketplace | Marketplace |
operation | Operation |
pricing_unit | Unit |
project_id | Project/Account ID |
project_name | Project/Account Name |
project_number | Project/Account Number |
region | Region |
resource_id | Resource |
resource_global_id | Global Resource |
service_description | Service |
service_id | Service ID |
sku_description | SKU |
sku_id | SKU ID |
zone | Zone |
Note: See Standard dimensions for details about the report fields.
Sample request: Publish events
Replace YOUR_API_KEY with your actual API key as explained in Get Started.
curl --request POST \
--url https://api.doit.com/datahub/v1/events \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer {YOUR_API_KEY}' \
--data '{
"events": [
{
"provider": "Datadog",
"id": "12345678-1234-1234-1234-123412341234",
"dimensions": [
{
"key": "service_description",
"value": "Devices",
"type": "fixed"
},
{
"key": "region",
"value": "us-east-1b",
"type": "fixed"
},
{
"key": "zone",
"value": "us-east-1b",
"type": "fixed",
},
{
"key": "env",
"value": "prod",
"type": "label",
},
{
"key": "team",
"value": "alpha",
"type": "label",
},
{
"key": "department",
"value": "rnd",
"type": "project_label",
}
],
"time": "2024-02-23T12:00:00.00Z",
"metrics": [
{
"value": 352,
"type": "cost",
},
{
"value": 36,
"type": "usage",
}
]
}
]
}
'
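Since a single JSON request carries at most 255 events, larger datasets must be sent in batches. One possible client-side sketch using jq (the input file all_events.json and the loop are assumptions, not part of the API):

# Split all_events.json into batches of at most 255 events and post each one
jq -c '{events: (.events | range(0; length; 255) as $i | .[$i:$i+255])}' all_events.json |
while read -r batch; do
  curl --request POST \
    --url https://api.doit.com/datahub/v1/events \
    --header 'Content-Type: application/json' \
    --header 'Authorization: Bearer {YOUR_API_KEY}' \
    --data "$batch"
done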
CSV file
See CSV ingestion for the syntax, conventions, and limitations when sending data using CSV.
Sample: Publish events using a CSV compressed in ZIP

Replace YOUR_API_KEY with your actual API key as explained in Get Started.
curl --request POST \
  --url https://api.doit.com/datahub/v1/csv/upload \
  --header 'Content-Type: multipart/form-data' \
  --header 'Authorization: Bearer {YOUR_API_KEY}' \
  --form 'provider="mysuperdataset"' \
  --form 'file=@"/Users/me/test_csv.zip"'
A successful upload returns the batch identifier and the number of ingested rows:

{
"batch": "test_csv.zip_1730885309695",
"ingestedRows": 2
}
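To remove a test dataset like this one afterwards, you can call the delete-by-filter endpoint with provider as the filter. The request body shape below is an assumption, so check the API reference for the exact filter syntax:

# Delete events by provider (body shape is an assumption; see the API reference)
curl --request DELETE \
  --url https://api.doit.com/datahub/v1/events/delete \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer {YOUR_API_KEY}' \
  --data '{"provider": "mysuperdataset"}'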