How do I insert a data frame into a SQL Server table?

Date: 2021-10-24 16:52:03

I'm trying to upload a data frame to a SQL Server table. I tried breaking it down into a simple SQL query string:


library(RODBC)
con <- odbcDriverConnect("driver=SQL Server; server=database")

df <- data.frame(a = 1:10, b = 10:1, c = 11:20)

# Build a single "(a,b,c),(a,b,c),..." VALUES list from the data frame
values <- paste("(", df$a, ",", df$b, ",", df$c, ")", sep = "", collapse = ",")

cmd <- paste("insert into MyTable values ", values)

result <- sqlQuery(con, cmd, as.is = TRUE)

…which seems to work, but does not scale very well. Is there an easier way?


3 Answers

#1 (18 votes)

[edited] Perhaps pasting the names(df) would solve the scaling problem:


values <- paste(" df[  , c(",
                paste(names(df), collapse = ","),
                ")] ", collapse = "")
values
# [1] " df[  , c( a,b,c )] "

You say your code is "working", but note that one would normally use sqlSave rather than sqlQuery if one wanted to "upload".


I would have guessed this would be more likely to do what you described:


 sqlSave(con, df, tablename = "MyTable")

#2 (5 votes)

Since a single INSERT INTO … VALUES statement is limited to 1000 rows in SQL Server, you can use dbBulkCopy from the rsqlserver package.


dbBulkCopy is a DBI extension that wraps bcp, Microsoft SQL Server's popular command-line bulk-copy utility, to quickly bulk-copy large files into a table. For example:


url <- "Server=localhost;Database=TEST_RSQLSERVER;Trusted_Connection=True;"
conn <- dbConnect('SqlServer', url = url)
## I assume the table already exists
dbBulkCopy(conn, name = 'T_BULKCOPY', value = df, overwrite = TRUE)
dbDisconnect(conn)
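If you would rather stay with plain INSERT statements, the 1000-row cap can also be worked around by issuing one INSERT per batch. A minimal sketch of that idea (the `build_insert_batches` helper is hypothetical, not part of RODBC or rsqlserver, and it does no quoting, so it only suits purely numeric frames like the one in the question):

```r
# Hypothetical helper: split the rows of a data frame into batches of at
# most `batch_size` rows and build one INSERT statement per batch.
build_insert_batches <- function(df, table, batch_size = 1000) {
  # One "(v1,v2,...)" tuple per row
  rows <- apply(df, 1, function(r) paste0("(", paste(r, collapse = ","), ")"))
  # Group the tuples into consecutive batches of at most batch_size
  chunks <- split(rows, ceiling(seq_along(rows) / batch_size))
  vapply(chunks,
         function(ch) paste0("insert into ", table, " values ",
                             paste(ch, collapse = ",")),
         character(1))
}

df <- data.frame(a = 1:2500, b = 2500:1)
cmds <- build_insert_batches(df, "MyTable")
length(cmds)  # 3 batches: 1000 + 1000 + 500 rows
# for (cmd in cmds) sqlQuery(con, cmd)   # then run each batch in turn
```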

#3 (1 vote)

This worked for me and I found it to be simpler.


library(DBI)   # dbConnect() and dbWriteTable() come from DBI, not sqldf
library(odbc)
con <- dbConnect(odbc(),
                 Driver = "SQL Server",
                 Server = "ServerName",
                 Database = "DBName",
                 UID = "UserName",
                 PWD = "Password")
dbWriteTable(conn = con,
             name = "TableName",
             value = x)  ## x is any data frame
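As a self-contained illustration of the same dbWriteTable() call, here is identical DBI code run against an in-memory SQLite database (RSQLite stands in for SQL Server so the sketch runs without a server; only the dbConnect() line differs):

```r
library(DBI)
library(RSQLite)

# In-memory SQLite database in place of the SQL Server connection above
con <- dbConnect(RSQLite::SQLite(), ":memory:")

x <- data.frame(a = 1:3, b = c("x", "y", "z"))
dbWriteTable(conn = con, name = "TableName", value = x)

out <- dbReadTable(con, "TableName")  # the same three rows come back
dbDisconnect(con)
```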
