
This section describes the transaction behavior of the GCP BigQuery agent. For more information about general transactions, see Transactions in Workflow Monitor.

The GCP BigQuery batch forwarding agent uses the streaming insert API, which is designed to load data at very high volumes while making the loaded data available to queries in real time. To achieve both, newly inserted data is first added to a streaming buffer, where it is immediately available to queries. However, the data is not moved into standard storage until more than an hour after being loaded. While the data is in the streaming buffer, it can only be queried; it cannot be updated, deleted, or copied. Refer to the GCP documentation for further information on the streaming insert API and the streaming buffer.
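As a rough illustration of the streaming insert behavior described above, the following sketch uses the google-cloud-bigquery Python client. The table name and the UDR-to-row mapping are assumptions made for the example, not the agent's actual implementation.

```python
# Illustrative sketch of a streaming insert with the google-cloud-bigquery
# client library. The table name and the UDR-to-row mapping below are
# assumptions for this example, not the agent's actual implementation.

def udrs_to_rows(udrs):
    """Map hypothetical UDR dicts to BigQuery row payloads."""
    return [{"id": u["id"], "status": u.get("status", 0)} for u in udrs]

def stream_insert(udrs, table_id="my-project.user_analytics.data_tbl1"):
    # Imported inside the function so the sketch stays importable without
    # the client library or GCP credentials present.
    from google.cloud import bigquery

    client = bigquery.Client()
    # insert_rows_json writes through the streaming insert API; rows land
    # in the streaming buffer and are immediately queryable, but cannot
    # yet be updated, deleted, or copied.
    errors = client.insert_rows_json(table_id, udrs_to_rows(udrs))
    if errors:
        # insert_rows_json returns per-row errors; an empty list means
        # all rows were accepted.
        raise RuntimeError(f"streaming insert failed: {errors}")
```

The `stream_insert` call requires GCP credentials at runtime; `udrs_to_rows` is a pure helper and can be exercised locally.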


Info
Example - View Joining Data Table and Batch Status Table

The following DDL query, run in the BigQuery Query Editor, creates a view named view1 under the user_analytics dataset.

Code Block
CREATE VIEW IF NOT EXISTS user_analytics.view1 AS
SELECT *
FROM user_analytics.data_tbl1 AS t1
FULL JOIN user_analytics.batch_status_tbl1 AS t2
USING (id)
WHERE t2.status = 0;
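Because the view itself filters on t2.status = 0, a query against it returns only rows from batches that completed successfully. A minimal sketch with the Python client follows; the project setup and helper names are illustrative, not part of the product.

```python
# Sketch of querying the example view1 view. The view already applies the
# status = 0 filter, so no extra WHERE clause is needed here. Names default
# to those in the example DDL above.

def build_view_query(dataset="user_analytics", view="view1", limit=100):
    """Build the SELECT statement against the view (names are assumptions)."""
    return f"SELECT * FROM `{dataset}.{view}` LIMIT {int(limit)}"

def fetch_completed_rows():
    # Requires GCP credentials and the google-cloud-bigquery library.
    from google.cloud import bigquery

    client = bigquery.Client()
    return list(client.query(build_view_query()).result())
```

Only `fetch_completed_rows` touches BigQuery; `build_view_query` can be tested locally.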

Input/Output data 

The agent emits commands that change the state of the file currently being processed.

Command: Cancel Batch
Description: Emitted if any error occurs during the insertion of rows into the Data table, or if an error occurs when mapping the UDR to rows.

The agent retrieves commands from other agents and, based on them, generates a state change of the file currently being processed.

Command: Begin Batch
Description: Retrieves a Transaction ID.

Command: End Batch
Description: Updates the Batch status to indicate the successful insertion of all the UDRs into the Data table. This status is loaded into the Batch Status table.

Command: Cancel Batch