I cannot import data from a CSV into PostgreSQL with dbWriteTable from R

I am trying to import a CSV into a PostgreSQL database from R. On the PostgreSQL server, I created an empty database called "Dados".

    library(RPostgreSQL)
    library(sqldf)

    drv <- dbDriver("PostgreSQL")

    con <- dbConnect(drv,
                     dbname = "Dados",
                     port = 1704,
                     host = "localhost",
                     user = "postgres",
                     password = "dados")

The connection is successful:

    dbListConnections(drv)
    # [[1]]
    # An object of class "PostgreSQLConnection"
    # Slot "Id":
    # [1] 8652    0

Then I create a data.frame and save it to a file:

    tabela <- data.frame(var1 = c(1, 2, 3, NA),
                         var2 = c(2, 3, 4, 1))

    write.table(tabela,"tabela.csv", sep = "\t")

But when I execute dbWriteTable, an error occurs:

    dbWriteTable(conn = con, name = "tabela1", value = paste0(getwd(),"/tabela.csv"))
    # Error in postgresqlExecStatement(conn, statement, ...) : 
    #   RS-DBI driver: (could not Retrieve the result : ERROR: could not
    #   open file "c:/users/rogerio/desktop/tabela.csv" for reading:
    #   Permission denied )
    # [1] FALSE
    # Warning message:
    # In postgresqlImportFile(conn, name, value, ...) :
    #   could not load data into table

"Tabela1" is effectively created in Database, in Postgresql, but no data is imported.

The same procedure works perfectly over an SQLite connection...
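
For reference, this is roughly the SQLite version that works for me (a sketch; teste.db is just a throwaway database file):

    library(RSQLite)

    con_lite <- dbConnect(SQLite(), "teste.db")

    # RSQLite reads the file inside the R process itself (SQLite is an
    # embedded database), so only the R session's file permissions matter.
    dbWriteTable(conn = con_lite, name = "tabela1",
                 value = paste0(getwd(), "/tabela.csv"),
                 sep = "\t", header = TRUE)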

Does anyone have any tips?

1 answer

The cause seems to be that, when dbWriteTable is given a file path, RPostgreSQL runs a server-side COPY, so it is the PostgreSQL server process, not the R session, that must be able to read the file; hence the "Permission denied" coming back from the server. I've found that one way around this is the following:

    library(data.table)

    postgresqlWriteTable(con = con, name = "tabela1",
                         value = fread("tabela.csv"))

But I still don't know if this is a great alternative... fread loads the whole file into RAM as a data.table, which is then passed to the SQL server.
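
For comparison, the same idea should also work through the documented DBI interface rather than the internal postgresqlWriteTable helper; a minimal sketch, assuming the same connection con and the tab-separated file from the question:

    library(RPostgreSQL)

    # Read the file into R first; the PostgreSQL server process never
    # touches the file, so no server-side permissions are involved.
    tabela <- read.table("tabela.csv", sep = "\t", header = TRUE)

    dbWriteTable(conn = con, name = "tabela1", value = tabela,
                 overwrite = TRUE, row.names = FALSE)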

If the data set is too big, that could be a problem...
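
In that case, one possible workaround is to load and append the file in chunks, so only one batch sits in memory at a time. A sketch, assuming the file was written without row names (row.names = FALSE in write.table) and with chunk_size as an illustrative tuning knob:

    library(data.table)
    library(RPostgreSQL)

    chunk_size <- 100000                                # rows per batch; tune to RAM
    header     <- names(fread("tabela.csv", nrows = 0)) # column names only
    n_rows     <- nrow(fread("tabela.csv", select = 1L))# count data rows cheaply

    rows_done <- 0
    while (rows_done < n_rows) {
      chunk <- fread("tabela.csv", sep = "\t", header = FALSE,
                     skip = 1 + rows_done, nrows = chunk_size,
                     col.names = header)
      # append = TRUE adds rows to the existing table instead of recreating it
      dbWriteTable(conn = con, name = "tabela1", value = chunk,
                   append = TRUE, row.names = FALSE)
      rows_done <- rows_done + nrow(chunk)
    }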
