Java idiomatic client for Google Cloud BigQuery.
Note: This client is a work-in-progress, and may occasionally make backwards-incompatible changes.
If you are using Maven, add this to your pom.xml file
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-bigquery</artifactId>
  <version>0.30.0-beta</version>
</dependency>
If you are using Gradle, add this to your dependencies
compile 'com.google.cloud:google-cloud-bigquery:0.30.0-beta'
If you are using SBT, add this to your dependencies
libraryDependencies += "com.google.cloud" % "google-cloud-bigquery" % "0.30.0-beta"
BigQueryExample - A simple command line interface providing some of Cloud BigQuery's functionality. Read more about using this application on the BigQueryExample docs page.
See the Authentication section in the base directory's README.
Google Cloud BigQuery is a fully managed, NoOps, low-cost data analytics service. Data can be streamed into BigQuery at millions of rows per second to enable real-time analysis. With BigQuery you can easily deploy petabyte-scale databases.
Be sure to activate the Google Cloud BigQuery API on the Google Developers Console to use BigQuery from your project.
See the BigQuery client library docs to learn how to interact with Google Cloud BigQuery using this Client Library.
For this tutorial, you will need a Google Developers Console project with the BigQuery API enabled. You will need to enable billing to use Google Cloud BigQuery. Follow these instructions to get your project set up. You will also need to set up the local development environment by installing the Google Cloud SDK and running the following commands on the command line:
gcloud auth login
gcloud config set project [YOUR PROJECT ID]
You'll need to obtain the google-cloud-bigquery library. See the Quickstart section to add google-cloud-bigquery as a dependency in your code.
To make authenticated requests to Google Cloud BigQuery, you must create a service object with credentials. You can then make API calls by calling methods on the BigQuery service object. The simplest way to authenticate is to use Application Default Credentials. These credentials are automatically inferred from your environment, so you only need the following code to create your service object:
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
For other authentication options, see the Authentication page.
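As one alternative, credentials can also be supplied explicitly from a service account key file. The following is only a sketch, not part of the original walkthrough; the key file path /path/to/key.json and the project id my-project-id are placeholders you would replace with your own values:
import com.google.auth.oauth2.ServiceAccountCredentials;
import java.io.FileInputStream;
// Build a BigQuery client from an explicit service account key (path and project id are placeholders)
BigQuery bigquery = BigQueryOptions.newBuilder()
    .setProjectId("my-project-id")
    .setCredentials(
        ServiceAccountCredentials.fromStream(new FileInputStream("/path/to/key.json")))
    .build()
    .getService();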
With BigQuery you can create datasets. A dataset is a grouping mechanism that holds zero or more tables. Add the following import at the top of your file:
import com.google.cloud.bigquery.DatasetInfo;
Then, to create the dataset, use the following code:
// Create a dataset
String datasetId = "my_dataset_id";
bigquery.create(DatasetInfo.newBuilder(datasetId).build());
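As a small optional extension (not part of the original snippet), you can give the dataset a description and check whether it already exists before creating it; getDataset returns null when the dataset is missing:
import com.google.cloud.bigquery.Dataset;
// Check for the dataset first; create it with a description if it does not exist
Dataset existing = bigquery.getDataset(datasetId);
if (existing == null) {
  bigquery.create(DatasetInfo.newBuilder(datasetId)
      .setDescription("Dataset created from this walkthrough")
      .build());
}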
With BigQuery you can create different types of tables: standard tables with an associated schema, external tables backed by data stored on Google Cloud Storage, and views created from a BigQuery SQL query. In this code snippet we show how to create a standard table with a single string field; a sketch of a view follows after it. Add the following imports at the top of your file:
import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.LegacySQLTypeName;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.Table;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;
Then add the following code to create the table:
TableId tableId = TableId.of(datasetId, "my_table_id");
// Table field definition
Field stringField = Field.of("StringField", LegacySQLTypeName.STRING);
// Table schema definition
Schema schema = Schema.of(stringField);
// Create a table
StandardTableDefinition tableDefinition = StandardTableDefinition.of(schema);
Table createdTable = bigquery.create(TableInfo.of(tableId, tableDefinition));
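As mentioned above, a table can also be a view defined by a SQL query. The following is only a rough sketch; the view name my_view_id is a placeholder and the query simply selects from the table created above:
import com.google.cloud.bigquery.ViewDefinition;
// Create a view over the table created above (view name is a placeholder)
TableId viewId = TableId.of(datasetId, "my_view_id");
ViewDefinition viewDefinition =
    ViewDefinition.of("SELECT StringField FROM my_dataset_id.my_table_id");
Table createdView = bigquery.create(TableInfo.of(viewId, viewDefinition));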
BigQuery provides several ways to load data into a table: streaming rows or loading data from a Google Cloud Storage file. In this code snippet we show how to stream rows into a table; a sketch of a Cloud Storage load job follows after it. Add the following imports at the top of your file:
import com.google.cloud.bigquery.InsertAllRequest;
import com.google.cloud.bigquery.InsertAllResponse;
import java.util.HashMap;
import java.util.Map;
Then add the following code to insert data:
Map<String, Object> firstRow = new HashMap<>();
Map<String, Object> secondRow = new HashMap<>();
firstRow.put("StringField", "value1");
secondRow.put("StringField", "value2");
// Create an insert request
InsertAllRequest insertRequest = InsertAllRequest.newBuilder(tableId)
    .addRow(firstRow)
    .addRow(secondRow)
    .build();
// Insert rows
InsertAllResponse insertResponse = bigquery.insertAll(insertRequest);
// Check if errors occurred
if (insertResponse.hasErrors()) {
  System.out.println("Errors occurred while inserting rows");
}
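Loading from a Google Cloud Storage file, the other option mentioned above, goes through a load job instead. The sketch below is illustrative only; the gs:// URI is a placeholder and the file is assumed to be CSV:
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.LoadJobConfiguration;
// Configure a load job that reads a CSV file from Cloud Storage (URI is a placeholder)
LoadJobConfiguration loadConfig =
    LoadJobConfiguration.newBuilder(tableId, "gs://my-bucket/my-data.csv")
        .setFormatOptions(FormatOptions.csv())
        .build();
// Create the job and block until it completes
Job loadJob = bigquery.create(JobInfo.of(loadConfig));
loadJob = loadJob.waitFor();
if (loadJob.getStatus().getError() != null) {
  System.out.println("Load failed: " + loadJob.getStatus().getError());
}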
BigQuery enables querying data by running queries and waiting for the result. Queries can be run directly or through a query job. In this code snippet we show how to run a query directly and wait for the result; a sketch of the query-job variant follows after it. Add the following imports at the top of your file:
import com.google.cloud.bigquery.BigQuery.QueryOption;
import com.google.cloud.bigquery.BigQuery.QueryResultsOption;
import com.google.cloud.bigquery.FieldValue;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.QueryResponse;
import java.util.Iterator;
import java.util.List;
Then add the following code to run the query and wait for the result:
// Create a query request
QueryJobConfiguration queryConfig =
    QueryJobConfiguration.of("SELECT * FROM my_dataset_id.my_table_id");
// Request query to be executed and wait for results
QueryResponse queryResponse = bigquery.query(
    queryConfig,
    QueryOption.of(QueryResultsOption.maxWaitTime(60000L)),
    QueryOption.of(QueryResultsOption.pageSize(1000L)));
// Read rows
System.out.println("Table rows:");
for (List<FieldValue> row : queryResponse.getResult().iterateAll()) {
  System.out.println(row);
}
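The same queryConfig can also be submitted as an explicit query job, which lets you inspect the job's status before reading results. This is only a sketch of that variant, reusing queryConfig from above:
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
// Submit the query as an explicit job instead of running it directly
Job queryJob = bigquery.create(JobInfo.of(queryConfig));
// Block until the job finishes, then check for errors
queryJob = queryJob.waitFor();
if (queryJob.getStatus().getError() != null) {
  System.out.println("Query failed: " + queryJob.getStatus().getError());
}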
In InsertDataAndQueryTable.java we put together all the code shown above into one program. The program assumes that you are running on Compute Engine or from your own desktop. To run the example on App Engine, simply move the code from the main method to your application's servlet class and change the print statements to display on your webpage.
To get help, follow the instructions in the shared Troubleshooting document.
BigQuery uses HTTP for the transport layer.
Java 7 or above is required for using this client.
This library provides tools to help write tests for code that uses Cloud BigQuery.
See TESTING to read more about testing.
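For example, integration tests can use the RemoteBigQueryHelper from the library's testing package to obtain a preconfigured client and clean up after themselves. This is only a rough sketch based on the helpers described in TESTING; the dataset name is arbitrary:
import com.google.cloud.bigquery.testing.RemoteBigQueryHelper;
// Obtain a client configured for tests and create a throwaway dataset
RemoteBigQueryHelper helper = RemoteBigQueryHelper.create();
BigQuery bigquery = helper.getOptions().getService();
String testDataset = "my_test_dataset";
bigquery.create(DatasetInfo.newBuilder(testDataset).build());
// ... exercise the code under test ...
// Delete the dataset and its contents when done
RemoteBigQueryHelper.forceDelete(bigquery, testDataset);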
This library follows Semantic Versioning. It is currently in major version zero (0.y.z), which means that anything may change at any time and the public API should not be considered stable.
Contributions to this library are always welcome and highly encouraged.
See CONTRIBUTING for more information on how to get started.
Apache 2.0 - See LICENSE for more information.