Google Cloud Pub/Sub logging
Start logging in 5 minutes
Forward logs from Google Cloud Platform to Better Stack.
You will need a Pub/Sub Subscription, which you can find or create in Google Cloud Console → Pub/Sub → Subscriptions.
Note the Subscription name for later use; it has the format projects/<YOUR_PROJECT>/subscriptions/<YOUR_SUBSCRIPTION>.
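If you prefer the command line, you can also list or create subscriptions with the gcloud CLI; the topic and subscription names below are placeholders:

# List existing subscriptions in the current project
gcloud pubsub subscriptions list

# Create a new subscription on an existing topic
gcloud pubsub subscriptions create <YOUR_SUBSCRIPTION> --topic=<YOUR_TOPIC>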
Deploy the Dataflow job using the Web UI
- Go to Google Cloud Console → Dataflow → Create job from template.
- Fill in Job name and Region.
- Under Dataflow template, select Custom template.
- As Template path, use betterstack/pubsub-to-betterstack.json
- Fill in the Required Parameters:
  - Input Pub/Sub Subscription: projects/$PROJECT/subscriptions/$NAME
  - Better Stack Source Token: $SOURCE_TOKEN
  - Better Stack Ingesting Host: $INGESTING_HOST
Then, click Run job 🚀
You should see your logs in Better Stack → Live tail.
Deploy the Dataflow job using the CLI (alternative)
If you prefer using the gcloud CLI, you can run the job using the following command in your active project and region:
PROJECT="$(gcloud config get-value project)"
REGION="$(gcloud config get-value compute/region)"
SUBSCRIPTION="projects/$PROJECT/subscriptions/<YOUR_SUBSCRIPTION_NAME>"
# Use the Source Token and Ingesting Host from your Better Stack source
SOURCE_TOKEN="<YOUR_SOURCE_TOKEN>"
INGESTING_HOST="<YOUR_INGESTING_HOST>"

gcloud dataflow flex-template run "pubsub-to-betterstack-$(date +%Y%m%d-%H%M%S)" \
  --template-file-gcs-location="gs://betterstack/pubsub-to-betterstack.json" \
  --parameters input_subscription="$SUBSCRIPTION" \
  --parameters better_stack_source_token="$SOURCE_TOKEN" \
  --parameters better_stack_ingesting_host="$INGESTING_HOST" \
  --region="$REGION"
You should see your logs in Better Stack → Live tail.
Please note it may take a few minutes for the job to initialize.
Need help?
Please let us know at hello@betterstack.com.
We're happy to help! 🙏
Additional information
Managing cost
When not specified, the streaming job uses the default machine type for the Google Compute Engine instances that run your pipeline, e.g. n1-standard-1.
You can customize the machine type in the Optional Parameters section when creating the job: uncheck Use default machine type and pick a different one. Alternatively, use the --worker-machine-type CLI parameter.
You can also customize the autoscaling options or the zone your Dataflow job runs in.
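For example, starting from the CLI command above, the worker flags could look like this; the machine type, worker cap, and zone suffix are illustrative values, not recommendations:

# Append optional worker flags to the flex-template run command
gcloud dataflow flex-template run "pubsub-to-betterstack-$(date +%Y%m%d-%H%M%S)" \
  --template-file-gcs-location="gs://betterstack/pubsub-to-betterstack.json" \
  --parameters input_subscription="$SUBSCRIPTION" \
  --parameters better_stack_source_token="$SOURCE_TOKEN" \
  --parameters better_stack_ingesting_host="$INGESTING_HOST" \
  --region="$REGION" \
  --worker-machine-type="e2-small" \
  --max-workers=2 \
  --worker-zone="$REGION-a"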
You can read more about pricing in the official Google Cloud docs.
Need to fine-tune the log forwarding?
Supply the following options in the Optional Parameters section of the Web UI, or via --parameters in the CLI command:
- batch_size - Number of messages to batch before sending. Default: 100
- window_size - Window size in seconds for batching messages. Default: 10
- max_retries - Maximum number of retry attempts for failed requests. Uses exponential backoff between retries. Default: 3
- initial_retry_delay - Initial delay in seconds between retries. The delay doubles with each retry attempt. Default: 1
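In the CLI, these options ride along as extra --parameters flags. A sketch reusing the variables from the command above, with illustrative values rather than recommendations:

# Run the template with a larger batch and a longer batching window
gcloud dataflow flex-template run "pubsub-to-betterstack-$(date +%Y%m%d-%H%M%S)" \
  --template-file-gcs-location="gs://betterstack/pubsub-to-betterstack.json" \
  --parameters input_subscription="$SUBSCRIPTION" \
  --parameters better_stack_source_token="$SOURCE_TOKEN" \
  --parameters better_stack_ingesting_host="$INGESTING_HOST" \
  --parameters batch_size=200 \
  --parameters window_size=30 \
  --region="$REGION"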
You can also fork the open-source repository on GitHub and fully customize the template.