Availability
This API is available to enterprise customers only. If you would like to use this feature, please reach out to your Branch account team.
This is a quick start guide. Visit our Scheduled Log Exports API Reference to learn more about the API calls below.
Overview
Scheduled Log Exports allow you to schedule daily or hourly exports of log-level data from Branch. The results are made available in one of two ways:
- Data stored in Branch's AWS S3 bucket
- Data delivered to your cloud data service (e.g. AWS S3)
Use Cases
- Automated Branch log-level data retrieval based on a schedule (hourly or daily)
- Select your own data columns and filters.
- Use Scheduled Log Exports to set up notifications on sudden day-over-day increases/decreases in attributed installs.
- Export your data directly to your S3 bucket. No need to manually request exports. Instead, your data goes directly to your S3 bucket where you can access it at any time.
- Set up regular exports to agencies with the ability to hide certain data from partners to protect privacy, increasing your control over your data.
- Set up recurring emails with custom reports to share agency-attributed installs with your agency via email; share ad network postback records with your ad network.
Limitations
Limitation | Details |
---|---|
Subscription Limit | 35 subscriptions |
Scheduled Log Exports Terms
Term | Definition |
---|---|
subscription | One report (data source, columns, filters) that is set up to be exported repeatedly at a regular cadence. |
job | One run of a subscription's report. Each run produces one or more files of exported data. Depending on the cadence, a job exports data for either one hour or one day. |
Implementation
Access Token
An Access Token is required for the following APIs. You can create and retrieve it on the User page of the Branch dashboard (learn more here).
Accessing Data via API
1. Create Scheduled Log Export Subscription
First, you need to create the Scheduled Log Export subscription. Part of the request uses syntax similar to Branch's Custom Export API.
Example Request/Response:
- Export all attributed installs (the filter below keeps only records where attributed is true)
curl -X POST -H 'Content-Type: application/json' -H 'Accept: application/json' \
-H 'access-token: api_app_99999999999999999999999999999999' -d '{
"report": {
"cadence": "hourly",
"response_format": "csv",
"response_format_compression": "gz",
"report_type": "eo_install",
"report_fields": [
"id",
"timestamp",
"user_data_os",
"user_data_idfa",
"user_data_idfv",
"user_data_aaid"
],
"filter": [
"eq",
"attributed",
"true"
]
},
"destination": {
"subscription_type": "branch"
}
}' 'https://api2.branch.io/scheduled-exports/logs/subscribe?app_id=123456789009876543'
{
"subscription_id": "13dbe05c-175b-11ec-9621-0242ac130002",
"status": "PENDING",
"description": "Generating sample report. Subscription will finish after Report is uploaded",
  "subscription_url": "https://api2.branch.io/scheduled-exports/logs/subscription/13dbe05c-175b-11ec-9621-0242ac130002?app_id=123456789009876543"
}
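For programmatic use, the request body above can be assembled before POSTing it with your HTTP client of choice. A minimal Python sketch (the helper name is ours; field names are copied from the curl example):

```python
import json

def build_subscription_payload(cadence, report_type, fields, filter_expr, destination):
    """Assemble the JSON body for POST /scheduled-exports/logs/subscribe.

    Mirrors the curl example above; key names come from the request shown
    in this guide.
    """
    return {
        "report": {
            "cadence": cadence,                # "hourly" or "daily"
            "response_format": "csv",
            "response_format_compression": "gz",
            "report_type": report_type,        # e.g. "eo_install"
            "report_fields": fields,
            "filter": filter_expr,             # Custom Export filter syntax
        },
        "destination": destination,
    }

payload = build_subscription_payload(
    cadence="hourly",
    report_type="eo_install",
    fields=["id", "timestamp", "user_data_os"],
    filter_expr=["eq", "attributed", "true"],
    destination={"subscription_type": "branch"},
)
body = json.dumps(payload)  # send as the POST body with your access-token header
```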
2. Check Subscription Status
You can then check the subscription status via the `subscription_url` provided. Make sure `subscription_status` is set to `ACTIVE`.
Example Request/Response:
curl -H 'access-token: api_app_99999999999999999999999999999999' \
'https://api2.branch.io/scheduled-exports/logs/subscription/13dbe05c-175b-11ec-9621-0242ac130002?app_id=123456789009876543'
{
"subscription_id": "13dbe05c-175b-11ec-9621-0242ac130002",
"report": {
"cadence": "hourly",
"filter": [
"eq",
"attributed",
"true"
],
"response_format": "csv",
"response_format_compression": "gz",
"report_type": "eo_install",
"report_fields": [
"id",
"timestamp",
"user_data_os",
"user_data_idfa",
"user_data_idfv",
"user_data_aaid"
],
"subscription_status": "ACTIVE"
},
"destination": {
"cloud": null,
"bucket": null,
"prefix": null,
"subscription_type": "branch",
"resource_access_id": null
}
}
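Polling for `ACTIVE` can be scripted. A minimal Python sketch using only the standard library (the helper names are ours; the response shape follows the example above, where `subscription_status` sits under `report`):

```python
import json
import urllib.request

def fetch_subscription(subscription_url, access_token, opener=urllib.request.urlopen):
    """GET the subscription_url returned when the subscription was created."""
    req = urllib.request.Request(subscription_url, headers={"access-token": access_token})
    with opener(req) as resp:
        return json.load(resp)

def is_active(subscription):
    # In the response shown above, subscription_status lives under "report".
    return subscription.get("report", {}).get("subscription_status") == "ACTIVE"
```

You would call `fetch_subscription` in a loop with a short sleep until `is_active` returns True.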
3. Query for Export Jobs
Once your subscription is active, you can start querying for the exported data.
Example Request/Response:
curl -H 'access-token: api_app_99999999999999999999999999999999' \
'https://api2.branch.io/scheduled-exports/logs/subscription/13dbe05c-175b-11ec-9621-0242ac130002/jobs?start_date=2021-09-16T01:00:00.000&end_date=2021-09-16T02:59:59.999&app_id=123456789009876543'
[
{
"job_id": "185c97bb-f928-4f0a-b295-63f2a087780e",
"status": "SUCCEEDED",
"start_date": "2021-09-16T01:00:00",
"end_date": "2021-09-16T01:59:59",
"export_url": [
"https://branch-exports-web.s3.us-west-1.amazonaws.com/v2/y%3D2021/m%3D09/d%3D16/app_id%3D123456789009876543/job_id%3D185c97bb-f928-4f0a-b295-63f2a087780e/task_id%3D0/123456789009876543-2021-09-16-eo_install-v2-7b7b72ee4e1a68a1ab57b626d3bc3701f0f39f4b976141fc5fcadf83e5ba5605-FSQEXZ-0.csv.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20210916T023737Z&X-Amz-SignedHeaders=host&X-Amz-Expires=604800&X-Amz-Credential=AKIA3HUFQARV3GLL7FKD%2F20210916%2Fus-west-1%2Fs3%2Faws4_request&X-Amz-Signature=19ad618f5c8a13bdf072925f664699c522b525187a968fb33442cb4abc915680"
],
"lines_exported": 43
},
{
"job_id": "cdf3be6a-1f6e-4ba2-8918-905e4c886ea0",
"status": "SUCCEEDED",
"start_date": "2021-09-16T02:00:00",
"end_date": "2021-09-16T02:59:59",
"export_url": [
"https://branch-exports-web.s3.us-west-1.amazonaws.com/v2/y%3D2021/m%3D09/d%3D16/app_id%3D123456789009876543/job_id%3Dcdf3be6a-1f6e-4ba2-8918-905e4c886ea0/task_id%3D0/123456789009876543-2021-09-16-eo_install-v2-7b7b72ee4e1a68a1ab57b626d3bc3701f0f39f4b976141fc5fcadf83e5ba5605-DTihs8-0.csv.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20210916T034610Z&X-Amz-SignedHeaders=host&X-Amz-Expires=604800&X-Amz-Credential=AKIA3HUFQARV3GLL7FKD%2F20210916%2Fus-west-1%2Fs3%2Faws4_request&X-Amz-Signature=376e666f124d113c774a33b90d40984fc908f6475adf945dbcd3cd2445aed4de"
],
"lines_exported": 56
}
]
Once you successfully locate jobs, as in the example above, you can download the data via the `export_url` array. For data accessible via API, the files are stored in Branch's S3 bucket, and you are provided a pre-signed S3 URL for each file. Data is typically accessible via API for 7 days, so be sure to download it shortly after it is generated.
Accessing Data via Emailed Links
1. Create Scheduled Log Export Subscription
First, you need to create the Scheduled Log Export subscription. Part of the request uses syntax similar to Branch's Custom Export API.
Note that emails are sent after the end of the UTC day.
Example Request/Response:
curl -X POST -H 'Content-Type: application/json' -H 'Accept: application/json' \
-H 'access-token: api_app_99999999999999999999999999999999' -d '{
"report": {
    "cadence": "daily",
"response_format": "csv",
"response_format_compression": "gz",
"report_type": "eo_install",
"report_fields": [
"id",
"timestamp",
"user_data_os",
"user_data_idfa",
"user_data_idfv",
"user_data_aaid"
],
"filter": [
"eq",
"attributed",
"true"
]
},
"destination": {
"subscription_type": "email",
"recipient_emails": [ "[email protected]" ]
}
}' 'https://api2.branch.io/scheduled-exports/logs/subscribe?app_id=123456789009876543'
{
"subscription_id": "6c13330c-5073-4aba-b96b-800386245363",
"status": "PENDING",
"description": "Generating sample report. Subscription will finish after Report is uploaded",
  "subscription_url": "https://api2.branch.io/scheduled-exports/logs/subscription/6c13330c-5073-4aba-b96b-800386245363?app_id=123456789009876543"
}
2. Check Subscription Status
You can then check the subscription status via the `subscription_url` provided. Make sure `subscription_status` is set to `ACTIVE`.
Example Request/Response:
curl -H 'access-token: api_app_99999999999999999999999999999999' \
'https://api2.branch.io/scheduled-exports/logs/subscription/6c13330c-5073-4aba-b96b-800386245363?app_id=123456789009876543'
{
"subscription_id": "6c13330c-5073-4aba-b96b-800386245363",
"report": {
"cadence": "daily",
"filter": [
"eq",
"attributed",
"true"
],
"response_format": "csv",
"response_format_compression": "gz",
"report_type": "eo_install",
"report_fields": [
"id",
"timestamp",
"user_data_os",
"user_data_idfa",
"user_data_idfv",
"user_data_aaid"
],
"subscription_status": "ACTIVE"
},
"destination": {
"cloud": null,
"bucket": null,
"prefix": null,
"subscription_type": "email",
"resource_access_id": null,
"recipient_emails": [
"[email protected]"
]
}
}
3. Check your email
Once your subscription is active, you will start to receive links for your recurring export at the email addresses you entered when creating the subscription.
Uploading Data Directly to your Cloud
Prerequisite
This section assumes that you have created a new S3 bucket for Branch exports, logically separate from your existing S3 buckets, and that you have created a folder (prefix) within that bucket.
For example, you may want Branch exports to be uploaded to `s3://my-branch-export-bucket/my-folder/`.
- Bucket Name: `my-branch-export-bucket`
- Prefix: `my-folder`
1. Get External ID from Branch
Example Request/Response:
curl -H 'access-token: api_app_99999999999999999999999999999999' \
'https://api2.branch.io/scheduled-exports/logs/subscription/externalId?app_id=123456789009876543'
{"external_id":"9da763b0-175b-11ec-9621-0242ac130002"}
For more information on exact request and response body parameters, see the Scheduled Log Exports API Reference.
2. Configure your S3 Bucket Role
Once you have your `external_id` and your dedicated S3 bucket for exports from Branch, you can configure the S3 bucket:
a. Log into AWS and navigate to the IAM Dashboard → Roles
b. Click "Create role"
c. Click "S3" → "S3" (in "Select your use case" section)
3. Configure your S3 Bucket Permissions
a. Click "Next: Permissions".
b. Click "Create policy" (in the "Attach permissions policy" section). This should open a new browser tab.
c. Choose the "JSON" tab
d. Paste the following JSON:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "ListObjectsInBucket",
"Effect": "Allow",
"Action": [
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::{bucket_name}"
]
},
{
"Sid": "BranchExportsPut",
"Effect": "Allow",
"Action": [
"s3:AbortMultipartUpload",
"s3:DeleteObject",
"s3:GetObject",
"s3:ListBucket",
"s3:ListBucketMultipartUploads",
"s3:ListMultipartUploadParts",
"s3:PutObject"
],
"Resource": [
"arn:aws:s3:::{bucket_name}/{prefix}/*"
]
}
]
}
Replace both instances of `{bucket_name}` with your S3 bucket name, and replace `{prefix}` with the S3 prefix (folder) where you would like Branch to upload data.
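If you script the substitution, something like this Python sketch works (the template is the policy above; the helper name is ours):

```python
import json

# Permissions policy template from the step above, with placeholders intact.
POLICY_TEMPLATE = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListObjectsInBucket",
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::{bucket_name}"],
        },
        {
            "Sid": "BranchExportsPut",
            "Effect": "Allow",
            "Action": [
                "s3:AbortMultipartUpload",
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts",
                "s3:PutObject",
            ],
            "Resource": ["arn:aws:s3:::{bucket_name}/{prefix}/*"],
        },
    ],
}

def render_policy(bucket_name, prefix):
    """Return the policy JSON text with both placeholders replaced."""
    text = json.dumps(POLICY_TEMPLATE, indent=2)
    return text.replace("{bucket_name}", bucket_name).replace("{prefix}", prefix)

policy = json.loads(render_policy("my-branch-export-bucket", "my-folder"))
```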
e. Click "Next: Tags", and add any tags.
f. Click "Next: Review", and review your choices for this policy. Enter a descriptive name for the policy, and remember it for the next step.
- We recommend specifying "branch" in the policy name.
g. Click "Create policy"
4. Attach the Policy and Create the Role
a. Back in the original browser tab, click the refresh button above the table of policies.
b. Search by name for the policy you just created.
c. Click the checkbox next to the policy, and click "Next: Tags" and add any tags.
d. Click "Next: Review" and review your choices made for this role.
e. Enter a name for this role. The role name must start with the substring `Branch-Export-Upload` for Branch to recognize it.
f. Click "Create role" to complete the role creation.
5. Configure your S3 Bucket Role's Trusted Entities
a. Click your newly created role to edit it.
b. You should see a summary including a Role ARN. Note down this Role ARN (e.g. `arn:aws:iam::xxxxxxxxxxxx:role/Branch-Export-Upload-*`), as you will use it when setting up a new log export subscription.
c. Click the "Trust relationships" tab.
d. Click "Edit trust relationship".
e. Paste the following JSON:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::772300342379:role/BranchTMRoleProdDataPlatformExports"
},
"Action": "sts:AssumeRole",
"Condition": {
"StringEquals": {
"sts:ExternalId": "{EXTERNAL_ID}"
}
}
}
]
}
Replace `{EXTERNAL_ID}` with the `external_id` retrieved from Branch in step 1 above.
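As with the permissions policy, this substitution can be scripted. A Python sketch (helper name ours; the example `external_id` is the one retrieved earlier in this guide):

```python
import json

# Trust policy template from the step above, kept as a string so the
# {EXTERNAL_ID} placeholder can be swapped in verbatim.
TRUST_POLICY = """{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::772300342379:role/BranchTMRoleProdDataPlatformExports"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {"sts:ExternalId": "{EXTERNAL_ID}"}
      }
    }
  ]
}"""

def render_trust_policy(external_id):
    """Fill in the external_id retrieved from Branch in step 1."""
    return TRUST_POLICY.replace("{EXTERNAL_ID}", external_id)

doc = json.loads(render_trust_policy("9da763b0-175b-11ec-9621-0242ac130002"))
```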
f. Click "Update Trust Policy".
6. Create Scheduled Log Export Subscription
Now you are ready to set up a new log export subscription. You can use the API call below, customizing the `report` and `destination` objects based on your needs.
Example Request/Response:
curl -X POST -H 'Content-Type: application/json' -H 'Accept: application/json' \
-H 'access-token: api_app_99999999999999999999999999999999' -d '{
"report": {
"cadence": "hourly",
"response_format": "csv",
"response_format_compression": "gz",
"report_type": "eo_install",
"report_fields": [
"id",
"timestamp",
"user_data_os",
"user_data_idfa",
"user_data_aaid",
"user_data_idfv"
],
"filter": [
"eq",
"attributed",
"true"
]
},
"destination": {
"subscription_type": "cloud",
"cloud": "s3",
"resource_access_id": "arn:aws:iam::xxxxxxxxxxxx:role/Branch-Export-Upload-tbd",
"bucket": "my-branch-export-bucket",
"prefix": "my-folder"
}
}' 'https://api2.branch.io/scheduled-exports/logs/subscribe?app_id=123456789009876543'
{
"subscription_id": "1d7d7c2e-175b-11ec-9621-0242ac130002",
"status": "PENDING",
"description": "Generating sample report. Subscription will finish after Report is uploaded",
"subscription_url": "https://api2.branch.io/scheduled-exports/logs/subscription/1d7d7c2e-175b-11ec-9621-0242ac130002?app_id=123456789009876543"
}
7. Check Subscription Status
You can then check the subscription status via the `subscription_url` provided. Make sure `subscription_status` is set to `ACTIVE`.
Example Request/Response:
curl -H 'access-token: api_app_99999999999999999999999999999999' \
'https://api2.branch.io/scheduled-exports/logs/subscription/1d7d7c2e-175b-11ec-9621-0242ac130002?app_id=123456789009876543'
{
"subscription_id": "1d7d7c2e-175b-11ec-9621-0242ac130002",
"report": {
"cadence": "hourly",
"filter": [
"eq",
"attributed",
"true"
],
"response_format": "csv",
"response_format_compression": "gz",
"report_type": "eo_install",
"report_fields": [
"id",
"timestamp",
"user_data_os",
"user_data_idfa",
"user_data_aaid",
"user_data_idfv"
],
"subscription_status": "ACTIVE"
},
"destination": {
"cloud": "s3",
"bucket": "my-branch-export-bucket",
"prefix": "my-folder",
"subscription_type": "cloud",
"resource_access_id": "arn:aws:iam::xxxxxxxxxxxx:role/Branch-Export-Upload-tbd"
}
}
8. Find your Exported Data
Once your subscription is active, you should check your S3 bucket for Branch data. The subscription will only be active if we've successfully uploaded a file. Remember to check inside the folder that you specified when creating the log export subscription.
Exports are placed into subfolders, as demonstrated in this example:
s3://my-branch-export-bucket/my-folder/subscription_id=1d7d7c2e-175b-11ec-9621-0242ac130002/report_type=eo_install/y=2021/m=09/d=17/h=00/
Remember to uncompress the files before attempting to read the data.
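The `key=value` subfolder segments can be parsed programmatically, for example to route files by report type or date. A minimal Python sketch (the helper name is ours):

```python
def parse_export_path(path):
    """Extract key=value partition segments (subscription_id, report_type,
    y/m/d/h) from an export path like the example above."""
    parts = {}
    for segment in path.strip("/").split("/"):
        if "=" in segment:
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts

info = parse_export_path(
    "my-folder/subscription_id=1d7d7c2e-175b-11ec-9621-0242ac130002/"
    "report_type=eo_install/y=2021/m=09/d=17/h=00/"
)
```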
That's it! Congratulations on setting up a scheduled log export directly to your cloud!
Testing and Best Practices
Test Export & Cadence
The basic steps are to set up a subscription, ensure that `subscription_status` is `ACTIVE`, then wait for a test export run to complete. We recommend choosing an hourly cadence: the first export arrives sooner, so you can validate sooner, and data is delivered more frequently with less delay.
Subscription Status Pending
If you check the subscription status and it returns `PENDING`, wait a few minutes; you should then see it update to `ACTIVE`. For more information on exact request and response body parameters, see the Scheduled Log Exports API Reference.
Querying Exported Data
After your subscription is active, we recommend initially searching through the full day (set via the `start_date` and `end_date` parameters), then narrowing down after that if you plan to export data one hour at a time.
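If you settle on hour-by-hour queries, the `start_date`/`end_date` windows can be generated rather than hand-written. A Python sketch (helper name ours; the timestamp format matches the example jobs request above):

```python
from datetime import datetime, timedelta

def hourly_windows(day):
    """Yield (start_date, end_date) string pairs covering each hour of a UTC
    day, in the timestamp format used by the jobs endpoint above."""
    start = datetime(day.year, day.month, day.day)
    for hour in range(24):
        s = start + timedelta(hours=hour)
        e = s + timedelta(hours=1) - timedelta(milliseconds=1)
        yield (s.strftime("%Y-%m-%dT%H:%M:%S.000"),
               e.strftime("%Y-%m-%dT%H:%M:%S") + ".999")

windows = list(hourly_windows(datetime(2021, 9, 16)))
```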