EXPORT.DATA: Basic Examples

The examples in the official documentation are reproduced in this section.

Export data to CSV format

The following example exports data to a CSV file. It includes options to overwrite the destination location, write a header row, and use ; as the field delimiter. Note that the variables must be declared at the beginning of the script, before the SET statements, for the call to execute successfully; the same declarations apply to each of the following examples, as sketched below.
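For reference, the declarations could look like the following. This is a minimal sketch assuming BigQuery scripting syntax; the variable names match the examples and the types are inferred from the values assigned to them.

DECLARE export_uri STRING;
DECLARE export_format STRING;
DECLARE query_statement STRING;
DECLARE optional_arguments JSON;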

SET export_uri = "gs://bucket/folder/*.csv";
SET export_format = "CSV";
SET query_statement = "SELECT field1, field2 FROM myproject.mydataset.table1 ORDER BY field1 LIMIT 10";
SET optional_arguments = JSON '{"overwrite": true, "header": true, "field_delimiter": ";"}';

CALL `EXPORT.DATA`(
    query_statement, 
    export_format, 
    export_uri, 
    optional_arguments
    );

It is also possible to execute the function more concisely, without variables, by passing the arguments directly in the call:

CALL `EXPORT.DATA`(
    "SELECT field1, field2 FROM myproject.mydataset.table1 ORDER BY field1 LIMIT 10", 
    "CSV", 
    "gs://bucket/folder/*.csv", 
     JSON '{"overwrite": true, "header": true, "field_delimiter": ";"}'
    );

However, the first version is typically preferred for readability, especially when executing more complex logic.

Export data to Avro format

The following example exports data to Avro format using Snappy compression.

SET export_uri = "gs://bucket/folder/*";
SET export_format = "AVRO";
SET query_statement = "SELECT field1, field2 FROM myproject.mydataset.table1 ORDER BY field1 LIMIT 10";
SET optional_arguments = JSON '{"compression": "SNAPPY"}';

CALL `EXPORT.DATA`(
    query_statement, 
    export_format, 
    export_uri, 
    optional_arguments
    );

Export data to Parquet format

The following example exports data to Parquet format. It includes the option to overwrite the destination location.

SET export_uri = "gs://bucket/folder/*";
SET export_format = "PARQUET";
SET query_statement = "SELECT field1, field2 FROM myproject.mydataset.table1 ORDER BY field1 LIMIT 10";
SET optional_arguments = JSON '{"overwrite": true}';

CALL `EXPORT.DATA`(
    query_statement, 
    export_format, 
    export_uri, 
    optional_arguments
    );

Export data to an Amazon S3 bucket

The following example exports the results of a query that runs against a BigLake table based on Amazon S3 to your Amazon S3 bucket.

SET export_uri = "s3://bucket/folder/*";
SET export_format = "JSON";
SET query_statement = "SELECT field1, field2 FROM myproject.mydataset.table1 ORDER BY field1 LIMIT 10";
SET optional_arguments = JSON '{"overwrite": true}';

CALL `EXPORT.DATA`(
    query_statement, 
    export_format, 
    export_uri, 
    optional_arguments
    );