Model Armor integration with Google Cloud services
Model Armor integrates with various Google Cloud services:
Google Kubernetes Engine (GKE) and Service Extensions
Vertex AI
GKE and Service Extensions
Model Armor can be integrated with GKE through
Service Extensions. Service Extensions let you add internal (Google Cloud) or
external (user-managed) services that process
traffic. You can configure a service extension on application load balancers,
including GKE inference gateways, to screen traffic to and from a
GKE cluster. This helps ensure that all interactions with the AI models
are protected by Model Armor. For more information, see
Integration with GKE.
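As a rough sketch of what this setup can look like with the gcloud CLI, the commands and
fields below follow the general Service Extensions callout workflow and are assumptions for
illustration only. The resource names, forwarding rule, and extension service value are
placeholders; the Model Armor-specific extension configuration is covered in the
Integration with GKE guide.

```bash
# Rough sketch (assumptions): describe a traffic extension that sends request
# and response traffic to an extension service for screening, then import it.
# PROJECT_ID, REGION, FORWARDING_RULE, and SERVICE are placeholders; see the
# Integration with GKE guide for the Model Armor-specific fields and values.
cat > traffic-extension.yaml <<'EOF'
name: model-armor-traffic-ext
forwardingRules:
- https://www.googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/forwardingRules/FORWARDING_RULE
loadBalancingScheme: EXTERNAL_MANAGED
extensionChains:
- name: model-armor-chain
  matchCondition:
    celExpression: 'request.path.startsWith("/")'
  extensions:
  - name: model-armor-extension
    service: SERVICE        # extension service that screens traffic with Model Armor
    failOpen: false
    supportedEvents:
    - REQUEST_HEADERS
    - REQUEST_BODY
    - RESPONSE_BODY
EOF

gcloud service-extensions lb-traffic-extensions import model-armor-traffic-ext \
    --source=traffic-extension.yaml \
    --location=REGION
```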
Vertex AI
Model Armor can be directly integrated into Vertex AI using either
floor settings or
templates.
This integration screens Gemini requests and responses and blocks those that
violate floor settings. It provides prompt and response protection within the
Gemini API in Vertex AI for the generateContent method. Enable Cloud Logging to
get visibility into the sanitization results of prompts and responses. For more
information, see
Integration with Vertex AI.
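For example, once the integration is configured, an ordinary generateContent request such as
the following minimal sketch is screened transparently; PROJECT_ID, LOCATION, and the model
ID are placeholders, and no change to the request itself is required.

```bash
# Ordinary Gemini request through the generateContent method. Once the
# Model Armor integration is configured, the prompt and the model response
# are screened automatically. PROJECT_ID, LOCATION, and the model ID are
# placeholders.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/gemini-2.0-flash:generateContent" \
  -d '{
        "contents": [{
          "role": "user",
          "parts": [{"text": "Summarize our refund policy."}]
        }]
      }'
```

A prompt or response that violates the floor settings is blocked, and the sanitization
results are visible in Cloud Logging when logging is enabled.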
Before you begin
Enable APIs
You must enable the Model Armor API before you can use Model Armor.
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and
displays a command-line prompt. Cloud Shell is a shell environment with the
Google Cloud CLI already installed and with values already set for your
current project. It can take a few seconds for the session to initialize.
Run the following command to set the API endpoint for the
Model Armor service.
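```bash
gcloud config set api_endpoint_overrides/modelarmor "https://modelarmor.LOCATION.rep.googleapis.com/"
```

Replace LOCATION with the region where you want to use Model Armor.

Run the following command to enable Model Armor.

```bash
gcloud services enable modelarmor.googleapis.com --project=PROJECT_ID
```

Replace PROJECT_ID with the ID of the project.

Options when integrating Model Armor
Model Armor offers the following integration options. Each option provides different
features and capabilities.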
With the REST API integration option, Model Armor functions as a detector
only, using templates. This means it primarily identifies and reports potential
policy violations based on predefined templates rather than actively preventing
them.
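As a minimal sketch of this detector-style call, the request below asks Model Armor to
screen a user prompt against an existing template; PROJECT_ID, LOCATION, TEMPLATE_ID, and
the prompt text are placeholders, and the calling application decides how to act on the
reported findings.

```bash
# Screen a user prompt against an existing Model Armor template.
# PROJECT_ID, LOCATION, and TEMPLATE_ID are placeholders; the response reports
# any findings, and the calling application decides whether to proceed.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://modelarmor.LOCATION.rep.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/templates/TEMPLATE_ID:sanitizeUserPrompt" \
  -d '{"user_prompt_data": {"text": "USER_PROMPT"}}'
```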
With the Vertex AI integration option, Model Armor provides inline enforcement
using floor settings or templates. This means Model Armor actively enforces
policies by intervening directly in the request and response flow, without
requiring modifications to your application code.
Similar to Vertex AI, the GKE integration option also offers inline
enforcement, but only using templates. This means Model Armor can enforce
policies directly within the inference gateway, without requiring modifications
to your application code.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[],[],null,["# Model Armor integration with Google Cloud services\n\n| **Preview**\n|\n|\n| This feature is subject to the \"Pre-GA Offerings Terms\" in the General Service Terms section\n| of the [Service Specific Terms](/terms/service-terms#1).\n|\n| Pre-GA features are available \"as is\" and might have limited support.\n|\n| For more information, see the\n| [launch stage descriptions](/products#product-launch-stages).\n\nModel Armor integrates with various Google Cloud services:\n\n- Google Kubernetes Engine (GKE) and Service Extensions\n- Vertex AI\n\nGKE and Service Extensions\n--------------------------\n\nModel Armor can be integrated with GKE through\nService Extensions. Service Extensions allow you to integrate\ninternal (Google Cloud services) or external (user-managed) services to process\ntraffic. You can configure a service extension on application load balancers,\nincluding GKE inference gateways, to screen traffic to and from a\nGKE cluster. This verifies that all interactions with the AI models\nare protected by Model Armor. For more information, see\n[Integration with GKE](/security-command-center/docs/model-armor-gke-integration).\n\nVertex AI\n---------\n\nModel Armor can be directly integrated into Vertex AI using either\n[floor settings](/security-command-center/docs/model-armor-vertex-integration#configure-floor-settings) or\n[templates](/security-command-center/docs/model-armor-vertex-integration#configure-templates).\nThis integration screens Gemini model requests and responses, blocking\nthose that violate floor settings. This integration provides prompt and response\nprotection within Gemini API in Vertex AI for the\n`generateContent` method. You need to enable Cloud Logging to get visibility\ninto the sanitization results of prompts and responses. For more information, see\n[Integration with Vertex AI](/security-command-center/docs/model-armor-vertex-integration).\n\nBefore you begin\n----------------\n\n### Enable APIs\n\nYou must enable Model Armor APIs before you can use Model Armor. \n\n### Console\n\n1.\n\n\n Enable the Model Armor API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=modelarmor.googleapis.com)\n\n \u003cbr /\u003e\n\n2. Select the project where you want to activate Model Armor.\n\n### gcloud\n\nBefore you begin, follow these steps using the Google Cloud CLI with the\nModel Armor API:\n\n1.\n\n\n In the Google Cloud console, activate Cloud Shell.\n\n [Activate Cloud Shell](https://console.cloud.google.com/?cloudshell=true)\n\n\n At the bottom of the Google Cloud console, a\n [Cloud Shell](/shell/docs/how-cloud-shell-works)\n session starts and displays a command-line prompt. Cloud Shell is a shell environment\n with the Google Cloud CLI\n already installed and with values already set for\n your current project. It can take a few seconds for the session to initialize.\n\n \u003cbr /\u003e\n\n2. 
Run the following command to set the API endpoint for the\n Model Armor service.\n\n ```bash\n gcloud config set api_endpoint_overrides/modelarmor \"https://modelarmor.\u003cvar translate=\"no\"\u003eLOCATION\u003c/var\u003e.rep.googleapis.com/\"\n ```\n\n Replace \u003cvar translate=\"no\"\u003eLOCATION\u003c/var\u003e with the region where you want to use Model Armor.\n\nRun the following command to enable Model Armor.\n\n\u003cbr /\u003e\n\n```bash\n gcloud services enable modelarmor.googleapis.com --project=PROJECT_ID\n \n```\n\n\u003cbr /\u003e\n\nReplace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of the project.\n\nOptions when integrating Model Armor\n------------------------------------\n\nModel Armor offers the following integration options. Each option provides different\nfeatures and capabilities.\n\nFor the REST API integration option, Model Armor functions as a detector\nonly using templates. This means it primarily identifies and reports potential\npolicy violations based on predefined templates, rather than actively preventing\nthem.\n\nWith the Vertex AI integration option, Model Armor provides\ninline enforcement using floor settings or templates. This means\nModel Armor actively enforces policies by intervening directly\nin the process without requiring modifications to your application code.\n\nSimilar to Vertex AI, the GKE integration option also\noffers inline enforcement only using templates. This indicates that\nModel Armor can enforce policies directly within the inference gateway\nwithout requiring modifications to your application code."]]