# Collect Identity and Access Management (IAM) Analysis logs

Supported in: Google SecOps [SIEM](/chronicle/docs/secops/google-secops-siem-toc)

**Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).

This document explains how to export and ingest Identity and Access Management (IAM) Analysis logs into Google Security Operations using Cloud Storage. The parser extracts user and resource information from the IAM JSON data and maps the extracted fields to the Unified Data Model (UDM), creating user entities with associated roles and resource relationships and thereby enriching security context within the Google SecOps platform.
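The field names referenced throughout this document assume IAM Analysis records shaped roughly like the following hypothetical example; the values are made up for illustration and your exported logs may differ.

```python
# Hypothetical IAM Analysis record, for illustration only.
# Field names match those referenced in the UDM mapping table below;
# the values are invented and may not match your exported logs exactly.
sample_record = {
    "timestamp": "2025-01-15T12:00:00Z",
    "iamBinding": {"role": "roles/storage.objectViewer"},
    "identityList": {
        "identities": [
            {
                "name": "serviceAccount:reader@example-project.iam.gserviceaccount.com",
                "product_object_id": "1234567890",
            }
        ]
    },
    "attachedResourceFullName": "//storage.googleapis.com/projects/_/buckets/example-bucket",
}
```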
## Before you begin

Ensure that you have the following prerequisites:

- A Google SecOps instance.
- IAM is set up and active in your Google Cloud environment.
- Privileged access to Google Cloud and appropriate permissions to access IAM logs.
## Create a Cloud Storage bucket

1. Sign in to the [Google Cloud console](https://console.cloud.google.com/).
2. Go to the **Cloud Storage Buckets** page.
3. Click **Create**.
4. On the **Create a bucket** page, enter your bucket information. After each of the following steps, click **Continue** to proceed to the next step:
    1. In the **Get started** section, do the following:
        1. Enter a unique name that meets the bucket name requirements; for example, **google-cloud-iam-logs**.
        2. To enable hierarchical namespace, click the expander arrow to expand the **Optimize for file oriented and data-intensive workloads** section, and then select **Enable Hierarchical namespace on this bucket**.
           **Note:** You cannot enable hierarchical namespace on an existing bucket.
        3. To add a bucket label, click the expander arrow to expand the **Labels** section.
        4. Click **Add label**, and specify a key and a value for your label.
    2. In the **Choose where to store your data** section, do the following:
        1. Select a **Location type**.
        2. Use the location type menu to select a **Location** where object data within your bucket will be permanently stored.
           **Note:** If you select the **dual-region** location type, you can also choose to enable **turbo replication** by using the relevant checkbox.
        3. To set up cross-bucket replication, expand the **Set up cross-bucket replication** section.
    3. In the **Choose a storage class for your data** section, either select a **default storage class** for the bucket, or select **Autoclass** for automatic storage class management of your bucket's data.
    4. In the **Choose how to control access to objects** section, select **not** to enforce **public access prevention**, and select an **access control model** for your bucket's objects.
       **Note:** If public access prevention is already enforced by your project's organization policy, the **Prevent public access** checkbox is locked.
    5. In the **Choose how to protect object data** section, do the following:
        1. Select any of the options under **Data protection** that you want to set for your bucket.
        2. To choose how your object data will be encrypted, click the expander arrow labeled **Data encryption**, and select a **Data encryption method**.
5. Click **Create**.

**Note:** Be sure to grant your Google SecOps service account Read or Read & Write permissions on the newly created bucket.
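If you prefer to script this step, the following is a minimal sketch using the `google-cloud-storage` Python client. The bucket name, location, and storage class are placeholder assumptions; adjust them to match the choices described above.

```python
# Minimal sketch: create the log bucket with the google-cloud-storage client
# instead of the console. Bucket name, location, and storage class are
# placeholders -- adjust them to your environment.
from google.cloud import storage

client = storage.Client()  # uses Application Default Credentials

bucket = storage.Bucket(client, name="google-cloud-iam-logs")
bucket.storage_class = "STANDARD"  # default storage class for the bucket
bucket.iam_configuration.uniform_bucket_level_access_enabled = True  # access control model
created = client.create_bucket(bucket, location="US")

print(f"Created bucket {created.name} in {created.location}")
```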
## Configure IAM Analysis logs export

1. Sign in to the [Google Cloud console](https://console.cloud.google.com/).
2. Go to **Logging > Log Router**.
3. Click **Create Sink**.
4. Provide the following configuration parameters:
    - **Sink Name**: enter a meaningful name; for example, `IAM-Analysis-Sink`.
    - **Sink Destination**: select **Cloud Storage** and enter the URI for your bucket; for example, `gs://gcp-iam-analysis-logs`.
    - **Log Filter**:

          logName="*iam*"
          resource.type="gce_instance"

## Configure permissions for Cloud Storage

1. Go to **IAM & Admin > IAM**.
2. Locate the **Cloud Logging** service account.
3. Grant it the **roles/storage.admin** role on the bucket.

## Set up feeds

To configure a feed, follow these steps:

1. Go to **SIEM Settings** > **Feeds**.
2. Click **Add New Feed**.
3. On the next page, click **Configure a single feed**.
4. In the **Feed name** field, enter a name for the feed; for example, **IAM Analysis Logs**.
5. Select **Google Cloud Storage V2** as the **Source type**.
6. Select **GCP IAM Analysis** as the **Log type**.
7. Click **Get Service Account** next to the **Chronicle Service Account** field.
8. Click **Next**.
9. Specify values for the following input parameters:
    - **Storage Bucket URI**: the Cloud Storage bucket URL; for example, `gs://gcp-iam-analysis-logs`.
    - **Source deletion options**: select the deletion option according to your preference.
      **Note:** If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted appropriate permissions to the service account.
    - **Maximum File Age**: includes files modified in the last number of days. Default is 180 days.
10. Click **Next**.
11. Review your new feed configuration on the **Finalize** screen, and then click **Submit**.

## UDM mapping table

| Log field | UDM mapping | Logic |
| --- | --- | --- |
| attachedResourceFullName | | For STORAGE_BUCKET, taken directly from the attachedResourceFullName field in the raw log, with any trailing resource names removed. For BigQuery datasets, it's the projectName (extracted from attachedResourceFullName) followed by a colon and the datasetName (also extracted from attachedResourceFullName). |
| attachedResourceFullName | relations.entity.resource.resource_type | Determined by the pattern of the attachedResourceFullName field in the raw log. |
| | relations.entity_type | Set to RESOURCE, except for SERVICE_ACCOUNT, where it's set to USER. |
| | relations.relationship | Set to MEMBER. |
| timestamp | metadata.collected_timestamp | Directly from the timestamp field in the raw log. |
| | metadata.entity_type | Set to USER. |
| | metadata.product_name | Set to GCP IAM ANALYSIS. |
| | metadata.vendor_name | Set to Google Cloud Platform. |
| iamBinding.role | entity.user.attribute.roles.name | Directly from the iamBinding.role field in the raw log. |
| identityList.identities.name | entity.user.attribute.roles.type | Set to SERVICE_ACCOUNT if the identityList.identities.name field contains the string serviceAccount. |
| identityList.identities.name | entity.user.email_addresses | If the identityList.identities.name field contains an @ symbol, it's treated as an email address. |
| identityList.identities.name | entity.user.userid | If the identityList.identities.name field doesn't contain an @ symbol, it's treated as a userid. |
| identityList.identities.product_object_id | entity.user.product_object_id | Directly from the identityList.identities.product_object_id field in the raw log. |
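As a reading aid for the identity rows above, here is a minimal, hypothetical Python sketch of the classification logic. It is not the Google SecOps parser itself, and the example identity values are invented.

```python
# Illustrative sketch only -- not the actual Google SecOps parser.
# It applies the identity-related rules from the table above to a single
# hypothetical identityList.identities entry.

def map_identity(identity: dict) -> dict:
    """Build a UDM-style user entity fragment from one identity entry."""
    name = identity.get("name", "")
    user: dict = {}

    # Constant metadata values, as listed in the table.
    entity = {
        "metadata": {
            "entity_type": "USER",
            "product_name": "GCP IAM ANALYSIS",
            "vendor_name": "Google Cloud Platform",
        },
        "entity": {"user": user},
    }

    # name containing "serviceAccount" -> roles.type is SERVICE_ACCOUNT.
    if "serviceAccount" in name:
        user["attribute"] = {"roles": [{"type": "SERVICE_ACCOUNT"}]}

    # "@" present -> treat the value as an email address; otherwise a userid.
    if "@" in name:
        user["email_addresses"] = [name]
    else:
        user["userid"] = name

    # product_object_id is copied through directly when present.
    if "product_object_id" in identity:
        user["product_object_id"] = identity["product_object_id"]

    return entity


# Example with a made-up identity entry:
print(map_identity({
    "name": "serviceAccount:reader@example-project.iam.gserviceaccount.com",
    "product_object_id": "1234567890",
}))
```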
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[[["\u003cp\u003eThis document guides you through exporting and ingesting Google Cloud IAM Analysis logs into Google Security Operations (SecOps) using Cloud Storage.\u003c/p\u003e\n"],["\u003cp\u003eThe process involves creating a Cloud Storage bucket, configuring Google Cloud IAM Analysis log exports to that bucket, and setting up a feed in Google SecOps to ingest the logs.\u003c/p\u003e\n"],["\u003cp\u003eThe logs are parsed to extract user and resource information, which are then mapped to the Unified Data Model (UDM) within Google SecOps for enriched security context.\u003c/p\u003e\n"],["\u003cp\u003eYou will need a Google SecOps instance, an active Google Cloud IAM setup, and appropriate permissions to access IAM logs and configure Cloud Storage.\u003c/p\u003e\n"],["\u003cp\u003eThe document provides a UDM mapping table that details how various log fields from Cloud IAM Analysis are translated into UDM fields within the Google SecOps platform, enabling a standardized view of user and resource data.\u003c/p\u003e\n"]]],[],null,["# Collect Identity and Access Management (IAM) Analysis logs\n==========================================================\n\nSupported in: \nGoogle secops [SIEM](/chronicle/docs/secops/google-secops-siem-toc)\n| **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).\n\nThis document explains how to export and ingest IAM Analysis logs into Google Security Operations using Cloud Storage. The parser extracts user and resource information from IAM JSON data. It then maps the extracted fields to the UDM, creating user entities with associated roles and resource relationships, ultimately enriching security context within the Google SecOps platform.\n\nBefore you begin\n----------------\n\nEnsure that you have the following prerequisites:\n\n- Google SecOps instance.\n- IAM is set up and active in your Google Cloud environment.\n- Privileged access to Google Cloud and appropriate permissions to access IAM logs.\n\nCreate a Cloud Storage bucket\n-----------------------------\n\n1. Sign in to the [Google Cloud console](https://console.cloud.google.com/).\n2. Go to the **Cloud Storage Buckets** page.\n\n [Go to Buckets](https://console.cloud.google.com/storage/browser)\n3. Click **Create**.\n\n4. On the **Create a bucket** page, enter your bucket information. After each of the following steps, click **Continue** to proceed to the next step:\n\n 1. In the **Get started** section, do the following:\n\n 1. Enter a unique name that meets the bucket name requirements; for example, **google-cloud-iam-logs**.\n 2. 
To enable hierarchical namespace, click the expander arrow to expand the **Optimize for file oriented and data-intensive workloads** section, and then select **Enable Hierarchical namespace on this bucket**.\n\n | **Note:** You cannot enable hierarchical namespace on an existing bucket.\n 3. To add a bucket label, click the expander arrow to expand the **Labels** section.\n\n 4. Click **Add label**, and specify a key and a value for your label.\n\n 2. In the **Choose where to store your data** section, do the following:\n\n 1. Select a **Location type**.\n 2. Use the location type menu to select a **Location** where object data within your bucket will be permanently stored.\n\n | **Note:** If you select the **dual-region** location type, you can also choose to enable **turbo replication** by using the relevant checkbox.\n 3. To set up cross-bucket replication, expand the **Set up cross-bucket replication** section.\n\n 3. In the **Choose a storage class for your data** section, either select a **default storage class** for the bucket, or select **Autoclass** for automatic storage class management of your bucket's data.\n\n 4. In the **Choose how to control access to objects** section, select **not** to enforce **public access prevention** , and select an **access control model** for your bucket's objects.\n\n | **Note:** If public access prevention is already enforced by your project's organization policy, the **Prevent public access** checkbox is locked.\n 5. In the **Choose how to protect object data** section, do the following:\n\n 1. Select any of the options under **Data protection** that you want to set for your bucket.\n 2. To choose how your object data will be encrypted, click the expander arrow labeled **Data encryption** , and select a **Data encryption method**.\n5. Click **Create**.\n\n| **Note:** Be sure to provide your Google SecOps service account with permissions to Read or Read \\& Write to the newly created bucket.\n\nConfigure IAM Analysis logs export\n----------------------------------\n\n1. Sign in to the [Google Cloud console](https://console.cloud.google.com/).\n2. Go to **Logging \\\u003e Log Router**.\n3. Click **Create Sink**.\n4. Provide the following configuration parameters:\n\n - **Sink Name** : enter a meaningful name; for example, `IAM-Analysis-Sink`.\n - **Sink Destination** : select **Cloud Storage Storage** and enter the URI for your bucket; for example, `gs://gcp-iam-analysis-logs`.\n - **Log Filter**:\n\n logName=\"*iam*\"\n resource.type=\"gce_instance\"\n\n Configure permissions for Cloud Storage\n ---------------------------------------\n\n5. Go to **IAM \\& Admin \\\u003e IAM**.\n\n6. Locate the **Cloud Logging** service account.\n\n7. Grant the **roles/storage.admin** on the bucket.\n\nSet up feeds\n------------\n\nTo configure a feed, follow these steps:\n\n1. Go to **SIEM Settings** \\\u003e **Feeds**.\n2. Click **Add New Feed**.\n3. On the next page, click **Configure a single feed**.\n4. In the **Feed name** field, enter a name for the feed; for example, **IAM Analysis Logs**.\n5. Select **Google Cloud Storage V2** as the **Source type**.\n6. Select **GCP IAM Analysis** as the **Log type**.\n7. Click **Get Service Account** next to the **Chronicle Service Account** field.\n8. Click **Next**.\n9. 
Specify values for the following input parameters:\n\n - **Storage Bucket URI** : Cloud Storage bucket URL; for example, `gs://gcp-iam-analysis-logs`.\n - **Source deletion options**: select the deletion option according to your preference.\n\n | **Note:** If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted appropriate permissions to the service account.\n - **Maximum File Age**: Includes files modified in the last number of days. Default is 180 days.\n\n10. Click **Next**.\n\n11. Review your new feed configuration in the **Finalize** screen, and then click **Submit**.\n\nUDM Mapping Table\n-----------------\n\n**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)"]]