Add dbWriteTable method for the CSV #254
Conversation
Thanks! Can you comment on whether this approach could also be a viable solution for the performance problems in RMariaDB?
My goal in this PR was just to reduce memory usage for more efficient ETL.
krlmlr left a comment
Thanks. Instead of adding yet another dbWriteTable() method, I'd rather implement a new postgresAppendTableCSV() that does everything minus table creation/removal. We can later decide if we add a new dbAppendTable() method or implement a new dbAppendTableCSV() generic.
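The split proposed here could look roughly like this from the caller's side (a sketch only; `postgresAppendTableCSV()` does not exist yet, and the table name and column spec are invented for illustration):

```r
library(DBI)

con <- dbConnect(RPostgres::Postgres(), dbname = "test")

# Table creation/removal stays with the existing DBI generics ...
dbCreateTable(con, "flights", fields = c(origin = "text", dep_delay = "integer"))

# ... while the proposed helper only appends rows from a CSV file via COPY.
postgresAppendTableCSV(con, "flights", "flights.csv",
                       fields = c("origin", "dep_delay"))

dbDisconnect(con)
```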
Would you like to help with a
Sure. Can you provide more details? I didn't find anything in this repo.
DBI now has
A fast draft of `postgresAppendTableCSV()`:

```r
postgresAppendTableCSV <- function(conn, name, value, fields = NULL,
                                   sep = ",", header = TRUE,
                                   na.strings = "NA", encoding = "UTF-8") {
  # Build a COPY ... FROM STDIN statement; COPY options must be comma-separated,
  # and HEADER takes an unquoted boolean rather than a quoted string.
  sql <- paste0(
    "COPY ", dbQuoteIdentifier(conn, name),
    " (", paste(dbQuoteIdentifier(conn, fields), collapse = ", "), ") ",
    "FROM STDIN ",
    "(FORMAT csv, DELIMITER '", sep, "', ",
    "HEADER ", if (isTRUE(header)) "true" else "false", ", ",
    "NULL '", na.strings, "', ENCODING '", encoding, "')"
  )
  # Stream the file through the libpq COPY interface
  connection_copy_file(conn@ptr, sql, path.expand(value))
  invisible(TRUE)
}
```
This loads the CSV file directly into the table via PostgreSQL's `COPY ... FROM STDIN`.
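Assuming the draft above, a data frame could be staged to disk and appended like this (a sketch; `con` is an open RPostgres connection and the `flights` table is invented for illustration — note that base `write.csv()` defaults match the draft's `header = TRUE` and `na.strings = "NA"`):

```r
# Sketch: stage a data frame as CSV, then append it via the draft helper.
df <- data.frame(origin = c("EWR", "JFK"), dep_delay = c(4L, NA))
tmp <- tempfile(fileext = ".csv")
write.csv(df, tmp, row.names = FALSE, na = "NA")  # header row, "NA" for missing

postgresAppendTableCSV(con, "flights", tmp, fields = names(df))
```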