bigQueryR Versions

R Interface with Google BigQuery

v0.5.0

4 years ago
  • Support listing more than 50 datasets in bqr_list_datasets
  • Change bqr_list_tables to list all tables in a dataset by default
  • Add bqr_copy_dataset
  • Add Table and bqr_update_table
  • Support uploading nested lists via toJSON
  • Add writeDisposition to table loads
  • Allow creation of empty tables
  • Support supplying SQL to bqr_query() via a file ending in .sql (see the sketch after this list)
  • Update to googleAuthR > 1.1.1
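
For example, the .sql-file support and the new dataset copy could be used roughly like this (a minimal sketch; the exact argument names of bqr_copy_dataset are assumptions, so check the package documentation):

    library(bigQueryR)
    bqr_auth()

    # Supply the query as a path to a file ending in .sql
    # instead of an inline SQL string
    result <- bqr_query(projectId = "my-project",
                        datasetId = "my_dataset",
                        query = "queries/report.sql")

    # Copy a dataset within (or across) projects -- argument names assumed
    bqr_copy_dataset(source_datasetid = "my_dataset",
                     destination_datasetid = "my_dataset_copy",
                     source_projectid = "my-project",
                     destination_projectid = "my-project")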

v0.4.0

6 years ago
  • Support nullMarker, maxBadRecords, and fieldDelimiter in upload jobs (see the sketch after this list)
  • Support BigQuery type DATE for R Date data.frame columns (BigQuery type TIMESTAMP is still the default for POSIXct columns) (#48)
  • Allow custom user schema for uploads of data.frames (#48)
  • Rename misnamed global functions from bq_ prefix to bqr_ prefix
  • Add allowJaggedRows and allowQuotedNewlines options to upload via bqr_upload_data()
  • bqr_get_job now accepts a job object as well as the jobId
  • Fix bug with bqr_upload_data where autodetect=TRUE didn't work with gcs:// loads from Cloud Storage
  • Fix bug with bqr_query() that sometimes caused a 404 error
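
A minimal upload sketch using the new job options (the option names come from the notes above; treat the exact call as illustrative rather than definitive):

    library(bigQueryR)

    df <- data.frame(id = 1:3,
                     day = Sys.Date() + 0:2)  # Date columns now load as BigQuery DATE

    # Upload a data.frame with the new upload-job options
    job <- bqr_upload_data(projectId = "my-project",
                           datasetId = "my_dataset",
                           tableId = "my_table",
                           upload_data = df,
                           autodetect = TRUE,    # auto-detect the schema
                           maxBadRecords = 10,   # tolerate up to 10 bad records
                           nullMarker = "NA")    # treat "NA" as NULL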

v0.3.2

6 years ago

bigQueryR 0.3.2

  • Move to new batch endpoint (#41)
  • Remove Travis env arg

bigQueryR 0.3.1

  • Fix async job failure if the user previously called set.seed() (#37)
  • Skip tests on CRAN that were causing errors
  • Fix warning in scope check (#40)

v0.3.0

7 years ago
  • Add support for real-time (uncached) queries via useQueryCache = FALSE (see the sketch after this list)
  • Add support for standard SQL (#21)
  • Add support for hms/timestamp class uploads (#27)
  • Add support for partitioned tables (#28)
  • Fix bug that only returned one row for single column queries (#31 - thanks Rob)
  • Allow loading of data from Google Cloud Storage to BigQuery for large files
  • No error when deleting a non-existent table (#26)
  • Add auto-authentication if the environment variable BQ_AUTH_FILE is set to the location of an auth file
  • Add a default project if the environment variable BQ_DEFAULT_PROJECT_ID is set to a project ID
  • Add a default dataset if the environment variable BQ_DEFAULT_DATASET is set to a dataset ID
  • Make it clearer when jobs resulted in errors in the job print methods
  • Migrate to googleCloudStorageR for Cloud Storage operations
  • Set default authentication scope to https://www.googleapis.com/auth/cloud-platform
  • Add unit tests
  • Table uploads now correctly report errors
  • More user feedback on BigQuery jobs while they run
  • Allow upload of data.frames asynchronously
  • Allow auto-detection of schema for uploads
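
The auto-authentication and query options combine as follows (a sketch; useQueryCache comes from the notes above, while useLegacySql = FALSE as the switch for standard SQL is an assumption):

    # In .Renviron, so the package authenticates and picks defaults on load:
    # BQ_AUTH_FILE=/path/to/auth.json
    # BQ_DEFAULT_PROJECT_ID=my-project
    # BQ_DEFAULT_DATASET=my_dataset

    library(bigQueryR)

    # Standard SQL, bypassing BigQuery's query cache
    bqr_query(projectId = "my-project",
              datasetId = "my_dataset",
              query = "SELECT COUNT(*) AS n FROM `my-project.my_dataset.my_table`",
              useLegacySql = FALSE,
              useQueryCache = FALSE)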

v0.2.0

7 years ago
  • Download asynchronous queries straight to disk via googleCloudStorageR (sketched below)
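
A rough sketch of that flow, assuming the asynchronous helpers bqr_query_asynch, bqr_wait_for_job, bqr_extract_data, and bqr_download_extract (argument names may differ; treat this as illustrative):

    library(bigQueryR)

    # Run the query as an asynchronous job writing to a destination table
    job <- bqr_query_asynch(projectId = "my-project",
                            datasetId = "my_dataset",
                            query = "SELECT * FROM my_table",
                            destinationTableId = "my_results")
    job <- bqr_wait_for_job(job)

    # Extract the result table to Cloud Storage, then download it to disk
    extract <- bqr_extract_data(projectId = "my-project",
                                datasetId = "my_dataset",
                                tableId = "my_results",
                                cloudStorageBucket = "my-bucket")
    extract <- bqr_wait_for_job(extract)
    bqr_download_extract(extract, filename = "results.csv")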

v0.1.0

8 years ago
  • Initial release