Configure your AWS or GCP account to store Scheduled Log Exports API data.

This guide walks you through configuring the cloud account you want to send your Branch Scheduled Log Exports API data to.

For general information about the Scheduled Log Exports API, visit the API Reference guide.

Overview

Branch makes Scheduled Log Exports API data available to you in one of three ways:

  1. The data is delivered to your own cloud data service.
  2. The data is stored in Branch's AWS S3 bucket.
  3. The data is delivered to you via email.

This guide will walk you through the cloud configuration steps for option 1: sending the data to your own cloud service provider.

Branch supports exporting to either Amazon Web Services (AWS) or Google Cloud Platform (GCP).

Access Restrictions

Before you configure your cloud account, make sure your Branch account has access enabled for the Scheduled Log Exports API. Access is restricted to Enterprise accounts. Please contact your Branch Account Manager to confirm whether your account is eligible for enablement.

Cloud Configuration

Amazon Web Services

1. Get External ID from Branch

Branch provides a unique external ID for your Branch account. You will need this ID when you configure your S3 bucket role's trusted entities in step 7.

The Scheduled Log Exports API has an endpoint which you can use to retrieve your external ID.
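
If you prefer to script the lookup, here is a minimal sketch using Python's requests library. The endpoint URL and auth header below are placeholders, not the documented API surface; substitute the real values from the API Reference guide.

    import requests

    # Placeholder URL: the real endpoint is documented in the
    # Scheduled Log Exports API Reference.
    EXTERNAL_ID_URL = "https://<scheduled-log-exports-host>/<external-id-endpoint>"

    response = requests.get(
        EXTERNAL_ID_URL,
        headers={"Access-Token": "<your_api_access_token>"},  # placeholder auth
    )
    response.raise_for_status()
    print(response.json())  # the response body contains your external ID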

2. Create S3 Bucket

To start, create a new bucket in S3 that will store your Branch Scheduled Log Exports API data. Make sure that it is logically separate from your existing S3 buckets. Note the name of this bucket.

3. Create Folder Within S3 Bucket

Create a folder within your new S3 bucket. This folder name will be considered the prefix in the destination path you pass to the API.
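
If you prefer to script steps 2 and 3, here is a minimal boto3 sketch. The bucket and folder names are hypothetical, and it assumes the us-east-1 region:

    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")

    # Create a bucket dedicated to Branch export data. In regions other
    # than us-east-1, also pass
    # CreateBucketConfiguration={"LocationConstraint": "<region>"}.
    s3.create_bucket(Bucket="my-branch-export-bucket")

    # S3 has no true folders; uploading a zero-byte object whose key ends
    # in "/" makes the prefix appear as a folder in the console.
    s3.put_object(Bucket="my-branch-export-bucket", Key="my-folder/")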

4. Create Your S3 Bucket Role

  1. Log in to AWS and navigate to the IAM dashboard, then go to Roles.
  2. Click "Create role".
  3. Click "S3" → "S3" (in the "Select your use case" section).

5. Create Your S3 Bucket Policy

  1. Click "Next: Permissions".
  2. Click "Create policy" (in the "Attach permissions policy" section) - this will open a new browser tab.
  3. Choose the "JSON" tab.
  4. Paste the following JSON, replacing instances of <bucket_name> with your S3 bucket name and <prefix> with the name of the folder you created:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListObjectsInBucket",
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket"
                ],
                "Resource": [
                    "arn:aws:s3:::<bucket_name>"
                ]
            },
            {
                "Sid": "BranchExportsPut",
                "Effect": "Allow",
                "Action": [
                    "s3:AbortMultipartUpload",
                    "s3:DeleteObject",
                    "s3:GetObject",
                    "s3:ListBucket",
                    "s3:ListBucketMultipartUploads",
                    "s3:ListMultipartUploadParts",
                    "s3:PutObject"
                ],
                "Resource": [
                    "arn:aws:s3:::<bucket_name>/<prefix>/*"
                ]
            }
        ]
    }
    
  5. Click "Next: Tags", and add any tags.
  6. Click "Next: Review", and review your choices for this policy.
  7. Enter a descriptive name for the policy. We recommend specifying "branch" in the policy name.
  8. Click "Create policy".

6. Configure Your S3 Bucket Policy

  1. Navigate back to your original browser tab. Click the refresh button above the table of policies.
  2. Search by name for the policy you just created.
  3. Click the checkbox next to the policy, then click "Next: Tags" and add any tags.
  4. Click "Next: Review" and review the choices you made for this role.
  5. Enter a name for this role. The role name must start with the substring Branch-Export-Upload for Branch to recognize it.
  6. Click "Create role" to complete role creation.

7. Configure Your S3 Bucket Role's Trusted Entities

  1. Click your newly created role to edit it.
  2. You will see a summary, including a Role ARN. Note this value, as you will use it when setting up a new Scheduled Log Exports API subscription. The format of a Role ARN looks like arn:aws:iam::xxxxxxxxxxxx:role/Branch-Export-Upload-*.
  3. Click the "Trust relationships" tab.
  4. Click "Edit trust relationship".
  5. Paste the following JSON, replacing <external_id> with the Branch external ID you retrieved from the Scheduled Log Exports API earlier:
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": {
            "AWS": "arn:aws:iam::772300342379:role/BranchTMRoleProdDataPlatformExports"
          },
          "Action": "sts:AssumeRole",
          "Condition": {
            "StringEquals": {
              "sts:ExternalId": "<external_id>"
            }
          }
        }
      ]
    }
    

8. Export Data From Branch to AWS

Once you have completed all the steps above, you can create a new Scheduled Log Exports subscription to start exporting data.

Make sure to set destination.subscription_type to cloud and set destination.cloud to s3 in the request body. You must also set the following parameters (a request sketch follows this list):

  • destination.resource_access_id (the ARN for the AWS Role)
  • destination.bucket (name of S3 bucket)
  • destination.prefix (for s3://my-branch-export-bucket/my-folder/, the prefix would be my-folder)
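
As a rough illustration, the request might look like the sketch below. Only the destination fields come from this guide; the endpoint URL, auth header, and any remaining required fields are placeholders, so consult the API Reference for the exact schema.

    import requests

    # Placeholder URL: the real endpoint is documented in the
    # Scheduled Log Exports API Reference.
    SUBSCRIBE_URL = "https://<scheduled-log-exports-host>/<subscribe-endpoint>"

    payload = {
        "destination": {
            "subscription_type": "cloud",
            "cloud": "s3",
            # The Role ARN you noted in step 7.
            "resource_access_id": "arn:aws:iam::<your_account_id>:role/Branch-Export-Upload-Example",
            "bucket": "my-branch-export-bucket",
            "prefix": "my-folder",
        },
        # Other required fields (report type, frequency, and so on)
        # are defined in the API Reference.
    }

    response = requests.post(
        SUBSCRIBE_URL,
        json=payload,
        headers={"Access-Token": "<your_api_access_token>"},  # placeholder auth
    )
    response.raise_for_status()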

9. Check S3 for Data

Once you've created a subscription and a job associated with it has successfully run, the report.subscription_status field will be in the ACTIVE state. You can check the state with the Scheduled Log Exports API.
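
For instance, a status check might look like this sketch; the endpoint is a placeholder, but report.subscription_status is the field described above.

    import requests

    # Placeholder URL: the real endpoint is documented in the
    # Scheduled Log Exports API Reference.
    STATUS_URL = "https://<scheduled-log-exports-host>/<subscription-status-endpoint>"

    response = requests.get(
        STATUS_URL,
        headers={"Access-Token": "<your_api_access_token>"},  # placeholder auth
    )
    response.raise_for_status()

    # Assumes the response nests the field as report.subscription_status.
    status = response.json()["report"]["subscription_status"]
    print(status)  # "ACTIVE" once a job has run successfully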

At this point, you can visit your S3 bucket to look for Branch data.

Exports are placed into subfolders, as demonstrated in this example:
s3://my-branch-export-bucket/my-folder/subscription_id=1d7d7c2e-175b-11ec-9621-0242ac130002/report_type=eo_install/y=2021/m=09/d=17/h=00/

Remember to decompress the files before attempting to read the data.
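
For example, assuming the exports are gzip-compressed CSV files (the file name below is hypothetical), you could read one with Python's standard library:

    import csv
    import gzip

    # Hypothetical local copy of a single exported file.
    path = "eo_install-v2-00000.csv.gz"

    with gzip.open(path, mode="rt", newline="") as f:
        for row in csv.DictReader(f):
            print(row)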

Google Cloud Platform

1. Get External ID from Branch

Branch provides a unique external ID for your Branch account. You will need this ID when you configure your bucket's IAM condition in step 7.

The Scheduled Log Exports API has an endpoint which you can use to retrieve your external ID.

2. Create Custom Roles for Bucket Access

Preexisting bucket roles in GCP offer more permissions than are required by the Scheduled Log Exports API, so Branch recommends that you create two custom roles with more limited permissions.

To create the first role:

  1. Navigate to the IAM & Admin dashboard, then go to Roles.
  2. Use the drop-down to make sure you are creating the role within the correct project.
  3. Click "Create Role", and give the role a name, such as Branch SLE Object List.
  4. Click "Add Permissions", and give the role the storage.objects.list permission.

To create the second role:

  1. Click "Create Role", and give the role a name, such as Branch SLE Upload.
  2. Click "Add Permissions", and give the role three permissions:
    1. storage.objects.create (allows for creating objects in a bucket)
    2. storage.objects.get (primarily used for folder cleanup in the event of upload failures)
    3. storage.objects.delete (primarily used for folder cleanup in the event of upload failures)

3. Create New Service Account

  1. Within the IAM & Admin dashboard, go to Service Accounts.
  2. Click "Create Service Account".
  3. Give the new service account a name and ID, for example branch-sle-service-acct. You can give the service account any name you want, and you can use the same value for the name as you do for the ID.

4. Configure Service Account

  1. Go to the "Permissions" tab for the new service account.
  2. In the "View By Principals" section, click the "Grant Access" button.
  3. Add a new principal that points to Branch's service account: [email protected]
  4. Give the new principal the preexisting GCP role called Service Account Token Creator.

5. Create New Bucket

Create a new bucket that will store your Branch Scheduled Log Exports API data. Make sure it is logically separate from your other buckets.

  1. Navigate to "Cloud Storage", then "Buckets".
  2. Click the "Create" button, and give the bucket a name.

6. Create Folder Within Bucket

Create a folder within your new bucket. This folder name will be considered the prefix in the destination path you pass to the API.
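
As with the AWS steps, steps 5 and 6 can also be scripted. Here is a minimal sketch using the google-cloud-storage client library, with the example bucket and prefix names used later in this guide:

    from google.cloud import storage

    client = storage.Client()

    # Create a bucket dedicated to Branch export data.
    bucket = client.create_bucket("my-branch-bucket")

    # GCS has no true folders; uploading a zero-byte object whose name
    # ends in "/" makes the prefix appear as a folder in the console.
    bucket.blob("branch/important_reports/").upload_from_string("")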

7. Configure Bucket

  1. Go to the bucket details, then to the "Permissions" tab.

  2. In the "View By Principals" section, click the "Grant Access" button.

  3. In the "New principals" box, add the ID for the new service account you created, which we called branch-sle-service-acct in this example.

  4. Under the "Assign roles" section, assign access to the two roles you created earlier, which we called Branch SLE Object List and Branch SLE Upload in this example.

  5. For the Branch SLE Upload role, add an IAM condition. Give the condition a descriptive title, such as Permit SLE Upload to Bucket.

  6. In the "Condition Editor" tab for the IAM condition, add a resource.name.startsWith statement, which contains the following values:

    1. <YOUR-BUCKET> is the name you gave to the new bucket you created
    2. <YOUR-PREFIX> is the folder path within the bucket where you want to put Branch data
    3. <YOUR-EXTERNAL-ID> is the external ID you retrieved from the Scheduled Log Exports API

    Generic format

    resource.name.startsWith("projects/_/buckets/<YOUR-BUCKET>/objects/<YOUR-PREFIX>/external_id=<YOUR-EXTERNAL-ID>/")
    

    Example

    // `my-branch-bucket` is the bucket name
    // `branch/important_reports` is the prefix
    // `0a0a0-a0a0a0a-0a0a0a` is the external ID
    
    resource.name.startsWith("projects/_/buckets/my-branch-bucket/objects/branch/important_reports/external_id=0a0a0-a0a0a0a-0a0a0a/")
    

8. Export Data From Branch to GCP

Once you have completed all the steps above, you can create a new Scheduled Log Exports subscription to start exporting data.

Make sure to set destination.subscription_type to cloud and set destination.cloud to gcs in the request body. You must also set the following parameters (a sketch of the destination object follows this list):

  • destination.resource_access_id (the full name of the service account you created, which looks like an email address, for example branch-sle-service-acct@<your-project>.iam.gserviceaccount.com)
  • destination.bucket (name of GCP bucket; for /buckets/my-branch-bucket/objects/branch/important_reports/ this would be my-branch-bucket)
  • destination.prefix (the prefix path for the bucket; for /buckets/my-branch-bucket/objects/branch/important_reports/ this would be branch/important_reports)
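
The destination object would then look like this sketch, using the example service account and bucket names from the steps above; the rest of the request body has the same shape as in the AWS example.

    # Destination fields for a GCS subscription, mirroring the
    # parameters listed above.
    destination = {
        "subscription_type": "cloud",
        "cloud": "gcs",
        "resource_access_id": "branch-sle-service-acct@<your-project>.iam.gserviceaccount.com",
        "bucket": "my-branch-bucket",
        "prefix": "branch/important_reports",
    }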

9. Check GCP for Data

Once you've created a subscription and a job associated with it has successfully run, the report.subscription_status field will be in the ACTIVE state. You can check the state with the Scheduled Log Exports API.

At this point, you can visit your GCP bucket to look for Branch data. Make sure to look inside the prefix path you defined during configuration, which may include folders.

Remember to decompress the files before attempting to read the data.