Configure your AWS or GCP account to store Scheduled Log Exports API data.
This guide walks you through configuring the cloud account you want to send your Branch Scheduled Log Exports API data to.
For general information about the Scheduled Log Exports API, visit the API Reference guide.
Overview
Branch makes Scheduled Log Exports API data available to you in one of three ways:
- The data is delivered to your own cloud data service.
- The data is stored in Branch's AWS S3 bucket.
- The data is delivered to you via email.
This guide will walk you through the cloud configuration steps for option 1: sending the data to your own cloud service provider.
Branch supports exporting to either Amazon Web Services (AWS) or Google Cloud Platform (GCP).
Access
Access to the Scheduled Log Exports API requires our Advanced Data Feeds add-on.
The Advanced Data Feeds add-on also includes access to:
- The Cross-Events Export API.
- The Unified Analytics Export API.
- Data integrations for exports.
- Webhook functionality.
Learn more on our pricing page.
Cloud Configuration
Amazon Web Services
1. Get External ID from Branch
Branch provides a unique external ID for your Branch account. You will need this ID when you configure your S3 bucket role's trusted entities in step 7.
The Scheduled Log Exports API has an endpoint which you can use to retrieve your external ID.
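As a rough illustration, retrieving the external ID might look like the following Python sketch. The base URL, endpoint path, and `Access-Token` header here are assumptions for illustration only; consult the Scheduled Log Exports API reference for the actual URL and authentication scheme.

```python
import requests

# Hypothetical endpoint and auth header, for illustration only; consult the
# Scheduled Log Exports API reference for the real values.
API_BASE = "https://api2.branch.io/scheduled-exports"  # assumed base URL
ACCESS_TOKEN = "your-branch-api-access-token"          # assumed auth scheme

resp = requests.get(
    f"{API_BASE}/external-id",  # assumed path for the external ID endpoint
    headers={"Access-Token": ACCESS_TOKEN},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # the response is assumed to contain your external ID
```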
2. Create S3 Bucket
To start, create a new bucket in S3 that will store your Branch Scheduled Log Exports API data. Make sure that it is logically separate from your existing S3 buckets. Note the name of this bucket.
3. Create Folder Within S3 Bucket
Create a folder within your new S3 bucket. This folder name will be considered the prefix in the destination path you pass to the API.
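If you prefer to script steps 2 and 3 rather than use the console, a minimal boto3 sketch might look like this (bucket and folder names are placeholders; pick your own region):

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")

# Create a dedicated bucket for Branch exports (name is a placeholder).
# Note: omit CreateBucketConfiguration if you create the bucket in us-east-1.
s3.create_bucket(
    Bucket="my-branch-export-bucket",
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# S3 has no true folders; a zero-byte object whose key ends in "/" acts as
# the folder marker. This folder name becomes the prefix you pass to the API.
s3.put_object(Bucket="my-branch-export-bucket", Key="my-folder/")
```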
4. Create Your S3 Bucket Role
- Log in to AWS and navigate to the IAM dashboard, then go to Roles.
- Click "Create role".
- Click "S3" → "S3" (in the "Select your use case" section).
5. Create Your S3 Bucket Policy
- Click "Next: Permissions".
- Click "Create policy" (in the "Attach permissions policy" section) - this will open a new browser tab.
- Choose the "JSON" tab.
- Paste the following JSON, replacing instances of `<bucket_name>` with your S3 bucket name and `<prefix>` with the name of the folder you created:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListObjectsInBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::<bucket_name>"]
    },
    {
      "Sid": "BranchExportsPut",
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:DeleteObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:ListMultipartUploadParts",
        "s3:PutObject"
      ],
      "Resource": ["arn:aws:s3:::<bucket_name>/<prefix>/*"]
    }
  ]
}
```
- Click "Next: Tags", and add any tags.
- Click "Next: Review", and review your choices for this policy.
- Enter a descriptive name for the policy. We recommend specifying "branch" in the policy name.
- Click "Create policy".
6. Configure Your S3 Bucket Policy
- Navigate back to your original browser tab. Click the refresh button above the table of policies.
- Search by name for the policy you just created.
- Click the checkbox next to the policy, then click "Next: Tags" and add any tags.
- Click "Next: Review" and review the choices you made for this role.
- Enter a name for this role. The role name must start with the substring `Branch-Export-Upload` for Branch to recognize it.
- Click "Create role" to complete role creation.
7. Configure Your S3 Bucket Role's Trusted Entities
- Click your newly created role to edit it.
- You will see a summary, including a Role ARN. Note this value, as you will use it when setting up a new Scheduled Log Exports API subscription. A Role ARN has the format `arn:aws:iam::xxxxxxxxxxxx:role/Branch-Export-Upload-*`.
- Click the "Trust relationships" tab.
- Click "Edit trust relationship".
- Paste the following JSON, replacing `<external_id>` with the Branch external ID you retrieved from the Scheduled Log Exports API earlier:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::772300342379:role/BranchTMRoleProdDataPlatformExports"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<external_id>"
        }
      }
    }
  ]
}
```
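If you manage IAM with scripts rather than the console, you can attach the same trust policy with boto3. This is a minimal sketch, assuming the role from step 6 already exists; the role name and external ID are placeholders:

```python
import json

import boto3

iam = boto3.client("iam")

# Same trust policy as above; replace the external ID placeholder with yours.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::772300342379:role/BranchTMRoleProdDataPlatformExports"
            },
            "Action": "sts:AssumeRole",
            "Condition": {"StringEquals": {"sts:ExternalId": "<external_id>"}},
        }
    ],
}

# Role name is a placeholder; remember it must start with "Branch-Export-Upload".
iam.update_assume_role_policy(
    RoleName="Branch-Export-Upload-Example",
    PolicyDocument=json.dumps(trust_policy),
)
```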
8. Export Data From Branch to AWS
Once you have completed all the steps above, you can create a new Scheduled Log Exports subscription to start exporting data.
Make sure to set `destination.subscription_type` to `cloud` and `destination.cloud` to `s3` in the request body. You must also set the following parameters (see the example request after this list):
- `destination.resource_access_id` (the ARN for the AWS role)
- `destination.bucket` (the name of the S3 bucket)
- `destination.prefix` (for `s3://my-branch-export-bucket/my-folder/`, the prefix would be `my-folder`)
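As a hedged sketch of such a request: the base URL, path, and `Access-Token` header below are assumptions for illustration, while the `destination` fields are the ones described above. Check the API reference for the full request schema (report type, schedule, and so on), which is omitted here.

```python
import requests

API_BASE = "https://api2.branch.io/scheduled-exports"  # assumed base URL
ACCESS_TOKEN = "your-branch-api-access-token"          # assumed auth scheme

body = {
    # Other required fields (report type, schedule, etc.) are omitted;
    # see the Scheduled Log Exports API reference for the full schema.
    "destination": {
        "subscription_type": "cloud",
        "cloud": "s3",
        "resource_access_id": "arn:aws:iam::xxxxxxxxxxxx:role/Branch-Export-Upload-Example",
        "bucket": "my-branch-export-bucket",
        "prefix": "my-folder",
    },
}

resp = requests.post(
    f"{API_BASE}/subscriptions",  # assumed path for creating subscriptions
    headers={"Access-Token": ACCESS_TOKEN},
    json=body,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```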
9. Check S3 for Data
Once you've created a subscription and a job associated with it has successfully run, the `report.subscription_status` field will be in the `ACTIVE` state. You can check the state with the Scheduled Log Exports API.
At this point, you can visit your S3 bucket to look for Branch data.
Exports are placed into subfolders, as demonstrated in this example:
```
s3://my-branch-export-bucket/my-folder/subscription_id=1d7d7c2e-175b-11ec-9621-0242ac130002/report_type=eo_install/y=2021/m=09/d=17/h=00/
```
Remember to decompress the files before attempting to read the data.
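For example, assuming the exports are gzip-compressed (adjust if your files arrive in a different format), you could list and read them with boto3:

```python
import gzip

import boto3

s3 = boto3.client("s3")
BUCKET = "my-branch-export-bucket"
PREFIX = "my-folder/"  # the folder you configured earlier

# List the exported objects under your prefix.
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in listing.get("Contents", []):
    print(obj["Key"])

# Download and decompress one export file. The file name below is
# hypothetical; use a key printed by the listing above.
key = (
    PREFIX
    + "subscription_id=1d7d7c2e-175b-11ec-9621-0242ac130002/"
    + "report_type=eo_install/y=2021/m=09/d=17/h=00/part-00000.csv.gz"
)
raw = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
print(gzip.decompress(raw).decode("utf-8")[:500])  # first 500 characters
```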
Google Cloud Platform
1. Get External ID from Branch
Branch provides a unique external ID for your Branch account.
The Scheduled Log Exports API has an endpoint which you can use to retrieve your external ID, as illustrated in the AWS section above.
2. Create Custom Roles for Bucket Access
The predefined storage roles in GCP grant more permissions than the Scheduled Log Exports API requires, so Branch recommends that you create two custom roles with more limited permissions.
To create the first role:
- Navigate to the IAM & Admin dashboard, then go to Roles.
- Use the drop-down to make sure you are creating the role within the correct project.
- Click "Create Role", and give the role a name, such as
Branch SLE Object List
. - Click "Add Permissions", and give the role the
storage.objects.list
permission.
To create the second role:
- Click "Create Role", and give the role a name, such as
Branch SLE Upload
. - Click "Add Permissions", and give the role three permissions:
storage.objects.create
(allows for creating objects in a bucket)storage.objects.get
(primarily used for folder cleanup in the event of upload failures)storage.objects.delete
(primarily used for folder cleanup in the event of upload failures)
3. Create New Service Account
- Within the IAM & Admin dashboard, go to Service Accounts.
- Click "Create Service Account".
- Give the new service account a name and ID, for example `branch-sle-service-acct`. You can give the service account any name you want, and you can use the same value for the name as you do for the ID.
4. Configure Service Account
- Go to the "Permissions" tab for the new service account.
- In the "View By Principals" section, click the "Grant Access" button.
- Add a new principal that points to Branch's service account: [email protected]
- Give the new principal the preexisting GCP role called `Service Account Token Creator`.
5. Create New Bucket
Create a new bucket that will store your Branch Scheduled Log Exports API data. Make sure it is logically separate from your other buckets.
- Navigate to "Cloud Storage", then "Buckets".
- Click the "Create" button, and give the bucket a name.
6. Create Folder Within Bucket
Create a folder within your new bucket. This folder name will be considered the prefix in the destination path you pass to the API.
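If you prefer to script steps 5 and 6 rather than use the console, a sketch using the google-cloud-storage client library might look like this (names are placeholders):

```python
from google.cloud import storage  # pip install google-cloud-storage

client = storage.Client()  # uses your default project and credentials

# Create a dedicated bucket for Branch exports (name is a placeholder).
bucket = client.create_bucket("my-branch-bucket")

# Like S3, GCS has no true folders; an empty object whose name ends in "/"
# acts as the folder marker. This path becomes the prefix you pass to the API.
bucket.blob("branch/important_reports/").upload_from_string("")
```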
7. Configure Bucket
- Go to the bucket details, then to the "Permissions" tab.
- In the "View By Principals" section, click the "Grant Access" button.
- In the "New principals" box, add the ID for the new service account you created, which we called `branch-sle-service-acct` in this example.
- Under the "Assign roles" section, assign access to the two roles you created earlier, which we called `Branch SLE Object List` and `Branch SLE Upload` in this example.
- For the `Branch SLE Upload` role, add an IAM condition. Give the condition a descriptive title, such as `Permit SLE Upload to Bucket`.
- In the "Condition Editor" tab for the IAM condition, add a `resource.name.startsWith` statement, which contains the following values:
  - `<YOUR-BUCKET>` is the name you gave to the new bucket you created
  - `<YOUR-PREFIX>` is the folder path within the bucket where you want to put Branch data
  - `<YOUR-EXTERNAL-ID>` is the external ID you retrieved from the Scheduled Log Exports API

Generic format:

```
resource.name.startsWith("projects/_/buckets/<YOUR-BUCKET>/objects/<YOUR-PREFIX>/external_id=<YOUR-EXTERNAL-ID>/")
```

Example:

```
// `my-branch-bucket` is the bucket name
// `branch/important_reports` is the prefix
// `0a0a0-a0a0a0a-0a0a0a` is the external ID
resource.name.startsWith("projects/_/buckets/my-branch-bucket/objects/branch/important_reports/external_id=0a0a0-a0a0a0a-0a0a0a/")
```
8. Export Data From Branch to GCP
Once you have completed all the steps above, you can create a new Scheduled Log Exports subscription to start exporting data.
Make sure to set `destination.subscription_type` to `cloud` and `destination.cloud` to `gcs` in the request body. You must also set the following parameters (see the example after this list):
- `destination.resource_access_id` (the full service account name, which looks like an email address, for example [email protected])
- `destination.bucket` (the name of the GCP bucket; for `/buckets/my-branch-bucket/objects/branch/important_reports/`, this would be `my-branch-bucket`)
- `destination.prefix` (the prefix path for the bucket; for `/buckets/my-branch-bucket/objects/branch/important_reports/`, this would be `branch/important_reports`)
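The request itself mirrors the S3 example shown earlier; only the `destination` object changes. A sketch of that object, with placeholder values that must match your GCP configuration from the steps above:

```python
# Placeholder values; the service account address follows the standard
# <name>@<project>.iam.gserviceaccount.com format.
destination = {
    "subscription_type": "cloud",
    "cloud": "gcs",
    "resource_access_id": "branch-sle-service-acct@my-project.iam.gserviceaccount.com",
    "bucket": "my-branch-bucket",
    "prefix": "branch/important_reports",
}
```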
9. Check GCP for Data
Once you've created a subscription and a job associated with it has successfully run, the `report.subscription_status` field will be in the `ACTIVE` state. You can check the state with the Scheduled Log Exports API.
At this point, you can visit your GCP bucket to look for Branch data. Make sure to look inside the prefix path you defined during configuration, which may include folders.
Remember to decompress the files before attempting to read the data.