# Create pipeline alerts

This page describes how to set up alerts for data pipelines in
Cloud Data Fusion.

You can create alerts through the following services:

- **Alerts through Cloud Data Fusion**: you set up these alerts when you
  design a batch pipeline in Cloud Data Fusion. After you deploy the pipeline,
  you can't add or edit an alert.
- **Log-based alerts through Cloud Monitoring**: you set up these alerts in
  the Google Cloud console before and after you deploy the pipeline.
Before you begin
----------------

You need a Cloud Data Fusion instance. For more information, see
[Creating a Cloud Data Fusion instance](/data-fusion/docs/how-to/create-instance).

Add an alert in Cloud Data Fusion
---------------------------------

**Note:** You can't add or edit alerts on deployed pipelines in
Cloud Data Fusion.

1. Open your instance.

   1. In the Google Cloud console, go to the Cloud Data Fusion page.

   2. To open the instance in the Cloud Data Fusion Studio,
      click **Instances**, and then click **View instance**.

      [Go to Instances](https://console.cloud.google.com/data-fusion/locations/-/instances)

2. Go to the **Studio** page where you're designing your pipeline.

3. Click **Configure** > **Pipeline alerts** > **Add**.

4. Select and configure the alert. For more information about configuring
   alerts, see
   [Batch pipeline alerts](https://cdap.atlassian.net/wiki/spaces/DOCS/pages/705265700/Batch+Pipeline+Alerts).

   **Note:** For email alerts, if your pipeline doesn't have a long-running
   Dataproc cluster with a Simple Mail Transfer Protocol (SMTP) server, enter
   the mail server or service's information, such as its host and port.

5. After you configure the alert, click **Validate** to check for errors, and
   then click **Confirm**.

Create an alert in Monitoring
-----------------------------

1. [Enable Cloud Logging](/data-fusion/docs/how-to/viewing-stackdriver-logs#enabling-stackdriver)
   in your Cloud Data Fusion instance.

2. [Create a log-based alert](/logging/docs/alerting/log-based-alerts):

   1. In the Google Cloud console, go to the **Logs Explorer** page.

      [Go to the Logs Explorer](https://console.cloud.google.com/logs/query)

      This page lets you view and query your logs.

   2. Enter the following query:

          resource.type="cloud_dataproc_cluster"
          logName="projects/PROJECT_ID/logs/datafusion-pipeline-logs"
          resource.labels.project_id="PROJECT_ID"
          severity=ERROR
          jsonPayload.message=~"Pipeline '.*' failed."

      Replace PROJECT_ID with your project ID.

   3. Click **Create alert** and configure the alert.

An alert is generated only when a pipeline that runs on a Dataproc
cluster fails. No alert is generated for Dataproc cluster creation
failures.

What's next
-----------

- Refer to the CDAP OS [Batch Pipeline Alerts](https://cdap.atlassian.net/wiki/spaces/DOCS/pages/705265700/Batch+Pipeline+Alerts) reference pages.

Last updated 2025-08-29 UTC.
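If you provision log-based alerts with a script rather than in the console, the filter used in the Monitoring steps can be assembled programmatically. The following is a minimal sketch: `build_pipeline_alert_filter` is a hypothetical helper, not part of any Google Cloud SDK; it only formats the query shown in the steps above for a given project ID.

```python
# Sketch: assemble the Cloud Logging filter for pipeline-failure alerts.
# build_pipeline_alert_filter is a hypothetical helper (not part of any
# Google Cloud SDK); it formats the query from the steps above.
def build_pipeline_alert_filter(project_id: str) -> str:
    return "\n".join([
        'resource.type="cloud_dataproc_cluster"',
        f'logName="projects/{project_id}/logs/datafusion-pipeline-logs"',
        f'resource.labels.project_id="{project_id}"',
        "severity=ERROR",
        "jsonPayload.message=~\"Pipeline '.*' failed.\"",
    ])

print(build_pipeline_alert_filter("example-project"))
```

The resulting string can also be passed as the filter argument to `gcloud logging read` to confirm that it matches failed pipeline runs before you create the alerting policy.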