If you selected the Backfill historical data checkbox, then Datastream will stream all existing data, in addition to changes to the data, from the source into the destination.
If you didn't select this checkbox, then Datastream will stream only changes to the data. To have Datastream stream a snapshot of all existing data from the source to the destination, you must initiate backfill for the objects that contain this data. The objects are in the form of database schemas, tables, and columns.
Another reason for initiating backfill for an object is if data is out of sync between the source and the destination. For example, a user might inadvertently delete data in the destination, and that data is now lost. In this case, initiating backfill for the object serves as a "reset mechanism" because all data is streamed into the destination in one shot. As a result, the data is synced between the source and the destination.
After initiating backfill for an object, you can also stop backfill for it. For example, suppose a user modifies the database schema, and the schema or data becomes corrupted. You don't want this corrupted schema or data to be streamed into the destination, so you stop backfill for the object.
You can also stop backfill for objects for load balancing purposes. Datastream can run multiple backfills in parallel. This may put an additional load on the source. If the load is significant, stop backfill for the objects, and then initiate backfill for them, one by one.
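If you automate this pattern, the Datastream API exposes the same start and stop operations on each stream object. The following is a minimal sketch, assuming the google-cloud-datastream Python client (v1); the project, location, and stream IDs are placeholders, and the method and field names (list_stream_objects, start_backfill_job, stop_backfill_job, get_stream_object, backfill_job.state) reflect the v1 client as we understand it, not a verified, production-ready implementation.

```python
# Minimal sketch: re-run backfills one object at a time to limit source load.
# Assumes the google-cloud-datastream v1 Python client; resource IDs are
# placeholders, and the polling loop is illustrative only.
import time

from google.cloud import datastream_v1

client = datastream_v1.DatastreamClient()
parent = "projects/my-project/locations/us-central1/streams/my-stream"

# The stream's objects (schemas and tables), and the states that mean a
# backfill is still running.
objects = list(client.list_stream_objects(parent=parent))
in_progress = (
    datastream_v1.BackfillJob.State.PENDING,
    datastream_v1.BackfillJob.State.ACTIVE,
)

# Stop any backfills that are currently pending or active.
for obj in objects:
    if obj.backfill_job.state in in_progress:
        client.stop_backfill_job(object_=obj.name)  # keyword name is assumed

# Re-initiate backfill one object at a time, waiting for each backfill to
# leave the pending/active states before starting the next one.
for obj in objects:
    client.start_backfill_job(object_=obj.name)  # keyword name is assumed
    while client.get_stream_object(name=obj.name).backfill_job.state in in_progress:
        time.sleep(60)
```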
Object statuses
The various statuses in the lifecycle of initiating and stopping backfill for an object include:
- No status (represented in the UI as -): Reasons for an object receiving this status include:
  - The stream hasn't been started.
  - The Backfill historical data checkbox wasn't selected (so the backfill is defined as manual).
  - The object is excluded explicitly from being backfilled automatically.
  - The stream is configured to include future tables. In this case, when new tables are added to the source, no automatic backfill task is created for them (because new tables typically don't have any "historical" data to backfill).
- Pending: backfill hasn't yet started for the object.
- Active: backfill is in progress for the object.
- Completed: backfill is completed for the object.
- Stopped: backfill is stopped for the object. If backfill is initiated again for the object, then Datastream will stream all existing data associated with the object from the source into the destination.
- Failed: backfill failed for the object, and backfill must be initiated again.
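In the Datastream API, these statuses surface on each stream object's backfill job. The following minimal sketch prints them for every object in a stream, assuming the google-cloud-datastream Python client (v1) with placeholder resource names; the backfill_job.state field and its values are assumptions based on the v1 API.

```python
# Minimal sketch: print the backfill status of each object in a stream.
# Assumes the google-cloud-datastream v1 Python client; resource IDs are
# placeholders, and field names reflect the v1 API as we understand it.
from google.cloud import datastream_v1

client = datastream_v1.DatastreamClient()
parent = "projects/my-project/locations/us-central1/streams/my-stream"

for obj in client.list_stream_objects(parent=parent):
    # backfill_job.state corresponds to the statuses listed above, for
    # example PENDING, ACTIVE, COMPLETED, STOPPED, or FAILED.
    print(obj.display_name, obj.backfill_job.state.name)
```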
Initiate backfill
1. Go to the Streams page in the Google Cloud Console.
2. Click the stream that contains objects for which you want to initiate backfill.
3. Click the OBJECTS tab.
4. Select the checkbox for each object for which you want to initiate backfill.
5. Click INITIATE BACKFILL.
   Note: If an object has a status of Pending or Active, then you can't initiate backfill for it.
6. If you selected only one object, then in the dialog, click INITIATE OBJECT BACKFILL. Otherwise, if you selected multiple objects, then click INITIATE OBJECT BACKFILLS.

Datastream will start backfill for the objects that you selected, and the status of each object will change from Pending to Active to Completed. When an object has a status of Completed, Datastream has read all the data for the object, but the data might still be loading to the destination.
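If you manage streams programmatically rather than in the console, initiating backfill for an object corresponds to a single start-backfill call. The following is a minimal sketch, assuming the google-cloud-datastream Python client (v1); the object resource name is a placeholder, and start_backfill_job and its object_ keyword are assumptions about the generated client.

```python
# Minimal sketch: initiate backfill for a single stream object through the API.
# Assumes the google-cloud-datastream v1 Python client; the resource name is a
# placeholder, and the method and keyword names are assumptions (v1 client).
from google.cloud import datastream_v1

client = datastream_v1.DatastreamClient()

# Full resource name of the object to backfill (placeholder values).
object_name = (
    "projects/my-project/locations/us-central1/streams/my-stream"
    "/objects/my-object"
)

# The response is expected to describe the object and its backfill job.
response = client.start_backfill_job(object_=object_name)
print(response)
```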
Stop backfill
1. Go to the Streams page in the Google Cloud Console.
2. Click the stream that contains objects for which you want to stop backfill.
3. Click the OBJECTS tab.
4. Select the checkbox for each object for which you want to stop backfill.
5. Click STOP BACKFILL.
   Note: If an object has a status of Completed, Stopped, or Failed, then you can't stop backfill for it.
6. If you selected only one object, then in the dialog, click STOP OBJECT BACKFILL. Otherwise, if you selected multiple objects, then click STOP OBJECT BACKFILLS.

Datastream will stop backfill for the objects that you selected, and the status of each object will change to Stopped. If backfill is initiated again for a stopped object, then Datastream will stream all existing data associated with the object from the source into the destination.
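The programmatic counterpart is a single stop-backfill call per object. The following is a minimal sketch under the same assumptions as the earlier examples: the google-cloud-datastream Python client (v1), a placeholder resource name, and an assumed stop_backfill_job method with an object_ keyword.

```python
# Minimal sketch: stop backfill for a single stream object through the API.
# Assumes the google-cloud-datastream v1 Python client; the resource name is a
# placeholder, and the method and keyword names are assumptions (v1 client).
from google.cloud import datastream_v1

client = datastream_v1.DatastreamClient()

# Full resource name of the object whose backfill you want to stop.
object_name = (
    "projects/my-project/locations/us-central1/streams/my-stream"
    "/objects/my-object"
)

# The response is expected to describe the object and its backfill job.
response = client.stop_backfill_job(object_=object_name)
print(response)
```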
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[[["\u003cp\u003eDatastream can backfill historical data and stream ongoing changes from a source database into a destination.\u003c/p\u003e\n"],["\u003cp\u003eBackfill can be initiated to stream all existing data from source objects (schemas, tables, columns) into the destination, or to resync data if it's out of sync.\u003c/p\u003e\n"],["\u003cp\u003eBackfill can be stopped to prevent corrupted schema or data from being streamed, or for load balancing if multiple backfills are overloading the source.\u003c/p\u003e\n"],["\u003cp\u003eObjects in a stream have various statuses like \u003ccode\u003ePending\u003c/code\u003e, \u003ccode\u003eActive\u003c/code\u003e, \u003ccode\u003eCompleted\u003c/code\u003e, \u003ccode\u003eStopped\u003c/code\u003e, and \u003ccode\u003eFailed\u003c/code\u003e, which indicate the state of the backfill process.\u003c/p\u003e\n"],["\u003cp\u003eThe object tab can be used to initiate or stop backfill for objects, and view additional information including status, recent events, total size of data streamed, and details of the table columns.\u003c/p\u003e\n"]]],[],null,["# Manage backfill for the objects of a stream\n\nA stream in Datastream can backfill historical data, as well as stream ongoing changes into a destination. As part of creating a stream, you [configured information about the source database for the stream](/datastream/docs/create-a-stream#configuresourcedbinfo).\n\nIf you selected the **Backfill historical data** checkbox, then Datastream will stream all existing data, in addition to changes to the data, from the source into the destination.\n\nIf you didn't select this checkbox, then Datastream will stream only changes to the data. To have Datastream stream a snapshot of all existing data from the source to the destination, you must initiate backfill for the objects that contain this data. The objects are in the form of database schemas, tables, and columns.\n\nAnother reason for initiating backfill for an object is if data is out of sync between the source and the destination. For example, a user can delete data in the destination inadvertently, and the data is now lost. In this case, initiating backfill for the object serves as a \"reset mechanism\" because all data is streamed into the destination in one shot. As a result, the data is synced between the source and the destination.\n\nAfter initiating backfill for an object, you can stop backfill for it. In the preceding example, the user modifies the database schema, and the schema or data is corrupted. You don't want this schema or data to be streamed into the destination, and so you stop backfill for the object.\n\nYou can also stop backfill for objects for load balancing purposes. Datastream can run multiple backfills in parallel. This may put an additional load on the source. 
If the load is significant, stop backfill for the objects, and then initiate backfill for them, one by one.\n\nObject statuses\n---------------\n\nThe various statuses in the lifecycle of initiating and stopping backfill for an object include:\n\n- No status (represented in the UI as `-`): Reasons for an object receiving\n this status include:\n\n - The stream hasn't been started.\n - The **Backfill historical data** checkbox wasn't selected (so the backfill is defined as manual).\n - The object is excluded explicitly from being backfilled automatically.\n - The stream is configured to include future tables. If this happens, then when new tables are added to the source, there's no automatic backfill task created for them (because new tables typically don't have any \"historical\" data to backfill).\n\n | **Note:** For more information, see [Configure information about the source database for the stream](/datastream/docs/create-a-stream#configuresourcedbinfo).\n- `Pending`: backfill hasn't yet started for the object.\n\n- `Active`: backfill is in progress for the object.\n\n- `Completed`: backfill is completed for the object.\n\n- `Stopped`: backfill is stopped for the object. If backfill is initiated\n again for the object, then Datastream will stream all existing data\n associated with the object from the source into the destination.\n\n- `Failed`: backfill failed for the object and the backfill must be initiated\n again.\n\nInitiate backfill\n-----------------\n\n1. Go to the **Streams** page in the Google Cloud Console.\n\n [Go to the Streams page](https://console.cloud.google.com/datastream/streams)\n2. Click the stream that contains objects for which you want to initiate backfill.\n\n3. Click the **OBJECTS** tab.\n\n4. Select the checkbox for each object for which you want to initiate backfill.\n\n5. Click **INITIATE BACKFILL**.\n\n \u003cbr /\u003e\n\n | If an object has a status of `Pending` or `Active`, then you can't initiate backfill for the object.\n\n \u003cbr /\u003e\n\n6. If you selected only one object, then in the dialog, click **INITIATE OBJECT BACKFILL** . Otherwise, if you selected multiple objects, then click **INITIATE OBJECT BACKFILLS**.\n\n Datastream will start backfill for the objects that you selected, and\n the status of each object will change from `Pending` to `Active` to\n `Completed`. When an object has a status of `Completed`, this means that\n Datastream has read all the data for the object, but the data might\n still be loading to the destination.\n | If an object has a status of `Failed`, then backfill failed for the object, and you must initiate the backfill again.\n\nStop backfill\n-------------\n\n1. Go to the **Streams** page in the Google Cloud Console.\n\n [Go to the Streams page](https://console.cloud.google.com/datastream/streams)\n2. Click the stream that contains objects for which you want to stop backfill.\n\n3. Click the **OBJECTS** tab.\n\n4. Select the checkbox for each object for which you want to stop backfill.\n\n5. Click **STOP BACKFILL**.\n\n \u003cbr /\u003e\n\n | If an object has a status of `Completed`, `Stopped`, or `Failed`, then you can't stop backfill for the object.\n\n \u003cbr /\u003e\n\n6. If you selected only one object, then in the dialog, click **STOP OBJECT BACKFILL** . 
Otherwise, if you selected multiple objects, then click **STOP OBJECT BACKFILLS**.\n\n Datastream will stop backfill for the objects that you selected, and the status of each object will change to `Stopped`.\n\n When an object has this status, backfill is stopped for the object. If backfill is initiated again for the object, then Datastream will stream all existing data associated with the object from the source into the destination.\n\n| You can also use the **OBJECTS** tab to view additional information about the objects of a stream. This information includes:\n|\n| - The status of the objects.\n| - How many events Datastream processed and loaded into the destination for an object in the last 7 days.\n| - The total size (in GB) of all events that Datastream processed and loaded into the destination for an object in the last 30 days.\n| - Details about the table columns of the database schemas that are streamed from the source into the destination.\n\nWhat's next\n-----------\n\n- To learn more about streams, see [Stream lifecycle](/datastream/docs/stream-states-and-actions).\n- To learn how to view information about your stream, see [View a stream](/datastream/docs/view-a-stream).\n- To learn how to modify a stream, see [Modify a stream](/datastream/docs/modify-a-stream).\n- To learn how to monitor a stream, see [Monitor a stream](/datastream/docs/monitor-a-stream).\n- To learn how to recover a failed stream, see [Recover a stream](/datastream/docs/recover-a-stream)."]]