There are no limitations on dataset size, so you can get reports over billion-row datasets in near real-time. This YouTube video illustrates how a sample 1.3GB dataset is aggregated by BigQuery in seconds and displayed as a pivot table in SeekTable.
Connection String should be a valid connection string for the Simba ODBC Driver for Google BigQuery; for example:
```
Driver=Simba ODBC Driver for Google BigQuery;OAuthMechanism=1;RefreshToken=your_google_oauth_refresh_token;Catalog=your_api_project_id;
```
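As an illustration, the connection string above is just a set of `key=value` pairs joined with semicolons, and can be assembled programmatically. This is a minimal sketch; all values below are placeholders, not real credentials:

```python
# Sketch: assemble a Simba BigQuery ODBC connection string from key/value pairs.
# All values are placeholders (assumptions), not working credentials.
params = {
    "Driver": "Simba ODBC Driver for Google BigQuery",
    "OAuthMechanism": "1",  # user authentication via an OAuth refresh token
    "RefreshToken": "your_google_oauth_refresh_token",
    "Catalog": "your_api_project_id",
}
# ODBC connection strings are "Key=Value;" pairs; dict order is preserved.
conn_str = ";".join(f"{key}={value}" for key, value in params.items()) + ";"
print(conn_str)
```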
|Parameter|Description|
|---|---|
|Catalog|The name of your BigQuery project. This project is the default project that the Simba ODBC Driver for Google BigQuery queries against.|
|RefreshToken|The refresh token that you obtain from Google for authorizing access to BigQuery. The section below explains how to generate the token.|
If you don't have a refresh token, you can get it in one of the following ways:
ClientSecret in the connection string.
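For reference, a refresh token comes from Google's standard OAuth 2.0 code-for-token exchange. The sketch below only builds the form-encoded request for Google's token endpoint without sending it; all credential values are placeholders, and the `redirect_uri` shown is the legacy out-of-band value (an assumption — your OAuth client configuration may require a different one):

```python
# Sketch of Google's OAuth 2.0 authorization-code exchange that returns a
# refresh token. The request is built but not sent; values are placeholders.
from urllib.parse import urlencode

TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

payload = {
    "grant_type": "authorization_code",
    "code": "authorization_code_from_consent_screen",  # placeholder
    "client_id": "your_client_id",                     # placeholder
    "client_secret": "your_client_secret",             # placeholder
    "redirect_uri": "urn:ietf:wg:oauth:2.0:oob",       # assumption: legacy OOB flow
}
body = urlencode(payload)  # form-encoded body for a POST to TOKEN_ENDPOINT
print(body)
```

The JSON response to this POST includes the `refresh_token` value to paste into the connection string (provided the original consent request asked for offline access).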
If your table has ARRAY-type columns and you get an error like `Error converting invalid input with source encoding UTF-8 using ICU`:
`SELECT * except(array_col1, array_col2) FROM some_table` and save the cube form.
`SELECT * FROM some_table` do NOT check Infer dimensions and measures by dataset. Save the form.
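The workaround above relies on BigQuery's `SELECT * EXCEPT(...)` syntax to drop the problematic ARRAY columns. As a sketch, a hypothetical helper that builds such a query from a table name and a list of columns to exclude (the names here are illustrative, not from any real schema):

```python
# Hypothetical helper: build a BigQuery query that excludes ARRAY columns
# using the "SELECT * EXCEPT(...)" syntax described above.
def build_query(table: str, array_columns: list[str]) -> str:
    if not array_columns:
        return f"SELECT * FROM {table}"
    cols = ", ".join(array_columns)
    return f"SELECT * EXCEPT({cols}) FROM {table}"

print(build_query("some_table", ["array_col1", "array_col2"]))
# → SELECT * EXCEPT(array_col1, array_col2) FROM some_table
```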