GCP - Vertex AI Enum
Vertex AI
Vertex AI is Google Cloud’s unified machine learning platform for building, deploying, and managing AI models at scale. It combines various AI and ML services into a single, integrated platform, enabling data scientists and ML engineers to:
- Train custom models using AutoML or custom training
- Deploy models to scalable endpoints for predictions
- Manage the ML lifecycle from experimentation to production
- Access pre-trained models from Model Garden
- Monitor and optimize model performance
Key Components
Models
Vertex AI models represent trained machine learning models that can be deployed to endpoints for serving predictions. Models can be:
- Uploaded from custom containers or model artifacts
- Created through AutoML training
- Imported from Model Garden (pre-trained models)
- Versioned with multiple versions per model
Each model has metadata including its framework, container image URI, artifact location, and serving configuration.
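If gcloud is not available, the same model metadata can be pulled straight from the Vertex AI REST API with an access token. A minimal sketch (project, region and the token source are placeholders):
# List models through the REST API using the caller's access token
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://<region>-aiplatform.googleapis.com/v1/projects/<project-id>/locations/<region>/models"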
Endpoints
Endpoints are resources that host deployed models and serve online predictions. Key features:
- Can host multiple deployed models (with traffic splitting)
- Provide HTTPS endpoints for real-time predictions
- Support autoscaling based on traffic
- Can use private or public access
- Support A/B testing through traffic splitting
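Because predictions are served over a predictable HTTPS URL, an endpoint can also be queried directly over REST. A hedged sketch (request.json is a hypothetical payload containing an "instances" array):
# Send an online prediction request straight to the REST :predict method
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @request.json \
  "https://<region>-aiplatform.googleapis.com/v1/projects/<project-id>/locations/<region>/endpoints/<endpoint-id>:predict"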
Custom Jobs
Custom jobs allow you to run custom training code using your own containers or Python packages. Features include:
- Support for distributed training with multiple worker pools
- Configurable machine types and accelerators (GPUs/TPUs)
- Service account attachment for accessing other GCP resources
- Integration with Vertex AI Tensorboard for visualization
- VPC connectivity options
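Since a custom job runs arbitrary container code as the attached service account, this is interesting from an attacker's perspective. A minimal creation sketch (all values are placeholders, not taken from a real environment):
# Sketch: run a chosen container as a chosen service account via a custom job
gcloud ai custom-jobs create \
  --region=<region> \
  --display-name=<job-name> \
  --service-account=<sa-email> \
  --worker-pool-spec=machine-type=n1-standard-4,replica-count=1,container-image-uri=<image-uri>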
Hyperparameter Tuning Jobs
These jobs automatically search for optimal hyperparameters by running multiple training trials with different parameter combinations.
Model Garden
Model Garden provides access to:
- Pre-trained Google models
- Open-source models (including Hugging Face)
- Third-party models
- One-click deployment capabilities
Tensorboards
Tensorboards provide visualization and monitoring for ML experiments, tracking metrics, model graphs, and training progress.
Service Accounts & Permissions
By default, Vertex AI workloads run as the Compute Engine default service account (PROJECT_NUMBER-compute@developer.gserviceaccount.com), which typically holds the project-wide Editor role unless default role grants have been restricted. However, you can specify custom service accounts when:
- Creating custom jobs
- Uploading models
- Deploying models to endpoints
This service account is used to:
- Access training data in Cloud Storage
- Write logs to Cloud Logging
- Access secrets from Secret Manager
- Interact with other GCP services
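A quick way to see what that default identity can do is to resolve the project number and look up the roles bound to the default compute service account (a sketch; substitute your own project values):
# Resolve the project number used in the default SA email
gcloud projects describe <project-id> --format="value(projectNumber)"
# List roles granted to the default compute service account
gcloud projects get-iam-policy <project-id> \
  --flatten="bindings[].members" \
  --format="table(bindings.role)" \
  --filter="bindings.members:<project-number>-compute@developer.gserviceaccount.com"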
Data Storage
- Model artifacts are stored in Cloud Storage buckets
- Training data typically resides in Cloud Storage or BigQuery
- Container images are stored in Artifact Registry or Container Registry
- Logs are sent to Cloud Logging
- Metrics are sent to Cloud Monitoring
Encryption
By default, Vertex AI encrypts data with Google-managed encryption keys. You can instead configure customer-managed encryption keys (CMEK) from Cloud KMS, which then apply to model artifacts, training data, and endpoints.
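Whether CMEK is in use can usually be read from the resource's encryptionSpec field (field name assumed from the Vertex AI resource schema; empty output suggests Google-managed keys):
# Check for a customer-managed KMS key on a model or custom job
gcloud ai models describe <model-id> --region=<region> --format="value(encryptionSpec.kmsKeyName)"
gcloud ai custom-jobs describe <job-id> --region=<region> --format="value(encryptionSpec.kmsKeyName)"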
Networking
Vertex AI resources can be configured for:
- Public internet access (default)
- VPC peering for private access
- Private Service Connect for secure connectivity
- Shared VPC support
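The network setup of an endpoint can be checked from its describe output (the privateServiceConnectConfig field name is an assumption based on the Endpoint schema):
# Check whether an endpoint is VPC-peered or uses Private Service Connect
gcloud ai endpoints describe <endpoint-id> --region=<region> --format="value(network)"
gcloud ai endpoints describe <endpoint-id> --region=<region> --format="value(privateServiceConnectConfig)"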
Enumeration
# List models
gcloud ai models list --region=<region>
gcloud ai models describe <model-id> --region=<region>
gcloud ai models list-version <model-id> --region=<region>
# List endpoints
gcloud ai endpoints list --region=<region>
gcloud ai endpoints describe <endpoint-id> --region=<region>
gcloud ai endpoints list --list-model-garden-endpoints-only --region=<region>
# List custom jobs
gcloud ai custom-jobs list --region=<region>
gcloud ai custom-jobs describe <job-id> --region=<region>
# Stream logs from a running job
gcloud ai custom-jobs stream-logs <job-id> --region=<region>
# List hyperparameter tuning jobs
gcloud ai hp-tuning-jobs list --region=<region>
gcloud ai hp-tuning-jobs describe <job-id> --region=<region>
# List model monitoring jobs
gcloud ai model-monitoring-jobs list --region=<region>
gcloud ai model-monitoring-jobs describe <job-id> --region=<region>
# List Tensorboards
gcloud ai tensorboards list --region=<region>
gcloud ai tensorboards describe <tensorboard-id> --region=<region>
# List indexes (for vector search)
gcloud ai indexes list --region=<region>
gcloud ai indexes describe <index-id> --region=<region>
# List index endpoints
gcloud ai index-endpoints list --region=<region>
gcloud ai index-endpoints describe <index-endpoint-id> --region=<region>
# Get operations (long-running operations status)
gcloud ai operations describe <operation-id> --region=<region>
# Test endpoint predictions (if you have access)
gcloud ai endpoints predict <endpoint-id> \
--region=<region> \
--json-request=request.json
# Make direct predictions (newer API)
gcloud ai endpoints direct-predict <endpoint-id> \
--region=<region> \
--json-request=request.json
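Since every gcloud ai command is region-scoped, a quick sweep over a few common Vertex AI regions helps locate resources (the region list below is just an example subset):
# Sweep several regions for models, endpoints and custom jobs
for r in us-central1 us-east1 us-west1 europe-west1 europe-west4 asia-east1; do
  echo "== $r =="
  gcloud ai models list --region=$r 2>/dev/null
  gcloud ai endpoints list --region=$r 2>/dev/null
  gcloud ai custom-jobs list --region=$r 2>/dev/null
done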
Model Information Gathering
# Get detailed model information including versions
gcloud ai models describe <model-id> --region=<region>
# Check specific model version
gcloud ai models describe <model-id>@<version> --region=<region>
# List all versions of a model
gcloud ai models list-version <model-id> --region=<region>
# Get model artifact location (usually a GCS bucket)
gcloud ai models describe <model-id> --region=<region> --format="value(artifactUri)"
# Get container image URI
gcloud ai models describe <model-id> --region=<region> --format="value(containerSpec.imageUri)"
Endpoint Details
# Get endpoint details including deployed models
gcloud ai endpoints describe <endpoint-id> --region=<region>
# Get the display name of the first deployed model on the endpoint
gcloud ai endpoints describe <endpoint-id> --region=<region> --format="value(deployedModels[0].displayName)"
# Get service account used by endpoint
gcloud ai endpoints describe <endpoint-id> --region=<region> --format="value(deployedModels[0].serviceAccount)"
# Check traffic split between models
gcloud ai endpoints describe <endpoint-id> --region=<region> --format="value(trafficSplit)"
Custom Job Information
# Get job details including command, args, and service account
gcloud ai custom-jobs describe <job-id> --region=<region>
# Get service account used by job
gcloud ai custom-jobs describe <job-id> --region=<region> --format="value(jobSpec.workerPoolSpecs[0].serviceAccount)"
# Get container image used
gcloud ai custom-jobs describe <job-id> --region=<region> --format="value(jobSpec.workerPoolSpecs[0].containerSpec.imageUri)"
# Check environment variables (may contain secrets)
gcloud ai custom-jobs describe <job-id> --region=<region> --format="value(jobSpec.workerPoolSpecs[0].containerSpec.env)"
# Get network configuration
gcloud ai custom-jobs describe <job-id> --region=<region> --format="value(jobSpec.network)"
Access Control
# Note: IAM policies for individual Vertex AI resources are managed at the project level
# Check project-level permissions
gcloud projects get-iam-policy <project-id>
# Check service account permissions
gcloud iam service-accounts get-iam-policy <service-account-email>
# Vertex AI endpoints require IAM authentication; who can invoke them is governed
# by project-level bindings such as roles/aiplatform.user
gcloud projects get-iam-policy <project-id> \
--flatten="bindings[].members" \
--filter="bindings.role:roles/aiplatform.user"
Storage and Artifacts
# Models and training jobs often store artifacts in GCS
# List buckets that might contain model artifacts
gsutil ls
# Common artifact locations:
# gs://<project>-aiplatform-<region>/
# gs://<project>-vertex-ai/
# gs://<custom-bucket>/vertex-ai/
# Download model artifacts if accessible
gsutil -m cp -r gs://<bucket>/path/to/artifacts ./artifacts/
# Check for notebooks in AI Platform Notebooks
gcloud notebooks instances list --location=<location>
gcloud notebooks instances describe <instance-name> --location=<location>
Model Garden
# List Model Garden endpoints
gcloud ai endpoints list --list-model-garden-endpoints-only --region=<region>
# Model Garden models are often deployed with default configurations
# Check for publicly accessible endpoints
Privilege Escalation
In the following page, you can check how to abuse Vertex AI permissions to escalate privileges: