ddotta/parquetize
R package for converting databases of different formats to the parquet format
R · Apache-2.0 license
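
For context, a minimal usage sketch of the package. The converter names `csv_to_parquet()` and `table_to_parquet()` and the `path_to_parquet` argument appear in the issues below; the `path_to_file` argument name and the exact signatures are assumptions, so check the package documentation before relying on them.

```r
# Hedged usage sketch; argument names other than path_to_parquet are assumed,
# see ?csv_to_parquet and ?table_to_parquet for the actual API.
library(parquetize)

# Convert a CSV file to parquet
csv_to_parquet(
  path_to_file    = "data/input.csv",      # hypothetical input path
  path_to_parquet = "data/parquet_output"  # hypothetical output location
)

# Convert a statistical-software file (SAS/SPSS/Stata, read via haven)
table_to_parquet(
  path_to_file    = "data/input.sas7bdat", # hypothetical input path
  path_to_parquet = "data/parquet_output"
)
```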
Issues
- #56 Replace `read_delim` by `read_delim_arrow` (opened by ddotta; see the sketch after this list)
- #52 Fix error on fedora-clang OS (opened by ddotta)
- #51 rds gzfile cannot open connection (opened by ChristosMichaliaslis)
- #11 Add the feature to convert txt files (opened by ddotta)
- #40 `table_to_parquet`: SPSS file is not correctly converted to .parquet when it has user-defined missings (opened by Schakel17)
- #46 Specify minimal version for haven (opened by ddotta)
- #36 Arguments `compression` and `compression_level` are never passed to `write_parquet_at_once` (opened by ddotta)
- #28 Add the chunked processing proposed in `table_to_parquet()` to the other parquetize functions (opened by ddotta)
- #24 Update vignette once PR #23 is merged (opened by ddotta)
- #16 Add metrics (opened by ddotta)
- #15 Use a callback function in `read_by_chunk()`? (opened by ddotta)
- #14 Add the feature to convert duckdb files (opened by ddotta)
- #13 Add the feature to convert sqlite files (opened by ddotta)
- #10 Add the feature to convert pickle files (opened by ddotta)
- #3 Add the feature to convert rds files (opened by ddotta)
- #12 Add the feature to convert json files (opened by ddotta)
- #9 Improve code coverage with utility functions (opened by ddotta)
- #8 Check if `path_to_parquet` exists (opened by py-b)
- #2 Add a function for RData files (opened by ddotta)
- #5 Add a function for SPSS files (opened by ddotta)
- #4 Add a function for SAS files (opened by ddotta)
- #6 Add compression argument to `csv_to_parquet` (opened by ddotta)
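
As a point of reference for issues #56, #36 and #6, here is a hedged sketch, not the package's actual implementation, of reading a delimited file with `arrow::read_delim_arrow()` instead of `readr::read_delim()` and forwarding `compression` and `compression_level` to `arrow::write_parquet()`. The file paths are placeholders.

```r
library(arrow)

csv_file     <- "data/input.csv"       # placeholder input path
parquet_file <- "data/output.parquet"  # placeholder output path

# read_delim_arrow() returns a tibble by default, so it can stand in for
# readr::read_delim() without changing downstream code (issue #56)
tbl <- read_delim_arrow(csv_file, delim = ",")

# compression and compression_level are forwarded explicitly to write_parquet(),
# the kind of plumbing issues #36 and #6 ask for
write_parquet(tbl, parquet_file, compression = "zstd", compression_level = 5)
```

Relying on `arrow` for both the read and the write keeps the conversion in a single dependency and avoids an extra in-memory copy via `readr`.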