dplyr left_join on less-than / greater-than conditions

kiz8lqtg  posted on 2023-01-06  in Other

This question is somewhat related to *Efficiently merging two data frames on a non-trivial criteria* and *Checking if date is between two dates in r*. The question I am posting here asks whether the feature exists: GitHub issue
I want to join two data frames with dplyr::left_join(). The conditions I am using for the join are less-than and greater-than, i.e. <= and >. Does dplyr::left_join() support this, or can keys only be compared with the = operator? This can be done directly in SQL (assuming the data frames were in a database).

Here is the MWE: I have two data sets, one of firm-years (fdata) and one of survey data collected every 5 years, so for all years in fdata that fall between two survey years I want to join the data of the corresponding survey year.

id <- c(1,1,1,1,
        2,2,2,2,2,2,
        3,3,3,3,3,3,
        5,5,5,5,
        8,8,8,8,
        13,13,13)

fyear <- c(1998,1999,2000,2001,1998,1999,2000,2001,2002,2003,
       1998,1999,2000,2001,2002,2003,1998,1999,2000,2001,
       1998,1999,2000,2001,1998,1999,2000)

byear <- c(1990,1995,2000,2005)
eyear <- c(1995,2000,2005,2010)
val <- c(3,1,5,6)

sdata <- as_tibble(data.frame(byear, eyear, val))  # tbl_df() is defunct in current dplyr

fdata <- as_tibble(data.frame(id, fyear))

test1 <- left_join(fdata, sdata, by = c("fyear" >= "byear","fyear" < "eyear"))

I get

Error: cannot join on columns 'TRUE' x 'TRUE': index out of bounds

Or can left_join actually handle this condition and I am just missing something in my syntax?
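As noted above, this works directly in SQL. A minimal sketch of that route using the sqldf package (which runs the query against the data frames in an in-memory SQLite database; the data are abridged from the MWE for illustration):

```r
library(sqldf)

# abridged versions of the MWE data
fdata <- data.frame(id = c(1, 2), fyear = c(1998, 2003))
sdata <- data.frame(byear = c(1990, 1995, 2000, 2005),
                    eyear = c(1995, 2000, 2005, 2010),
                    val   = c(3, 1, 5, 6))

# the inequality join expressed directly in SQL
res <- sqldf("SELECT id, fyear, byear, eyear, val
              FROM fdata
              LEFT JOIN sdata
              ON fyear >= byear AND fyear < eyear")
res
```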


8dtrkrch1#

data.table added non-equi joins as of v1.9.8:

library(data.table) #v>=1.9.8
setDT(sdata); setDT(fdata) # converting to data.table in place

fdata[sdata, .(id, x.fyear, byear, eyear, val),
      on = .(fyear >= byear, fyear < eyear), nomatch = 0]
#    id x.fyear byear eyear val
# 1:  1    1998  1995  2000   1
# 2:  2    1998  1995  2000   1
# 3:  3    1998  1995  2000   1
# 4:  5    1998  1995  2000   1
# 5:  8    1998  1995  2000   1
# 6: 13    1998  1995  2000   1
# 7:  1    1999  1995  2000   1
# 8:  2    1999  1995  2000   1
# 9:  3    1999  1995  2000   1
#10:  5    1999  1995  2000   1
#11:  8    1999  1995  2000   1
#12: 13    1999  1995  2000   1
#13:  1    2000  2000  2005   5
#14:  2    2000  2000  2005   5
#15:  3    2000  2000  2005   5
#16:  5    2000  2000  2005   5
#17:  8    2000  2000  2005   5
#18: 13    2000  2000  2005   5
#19:  1    2001  2000  2005   5
#20:  2    2001  2000  2005   5
#21:  3    2001  2000  2005   5
#22:  5    2001  2000  2005   5
#23:  8    2001  2000  2005   5
#24:  2    2002  2000  2005   5
#25:  3    2002  2000  2005   5
#26:  2    2003  2000  2005   5
#27:  3    2003  2000  2005   5
#    id x.fyear byear eyear val

You can also do this in 1.9.6 using foverlaps, with a little extra effort.
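For users stuck on data.table 1.9.6 (before non-equi `on=` existed), the `foverlaps()` route just mentioned can be sketched as follows. `foverlaps()` matches an interval in both tables, so `fdata` gets a zero-width interval, and since its bounds are inclusive, a post-filter restores the strict `fyear < eyear` condition (data abridged from the question):

```r
library(data.table)

# abridged data from the question
fdata <- data.table(id = c(1, 1, 2), fyear = c(1998, 2000, 2003))
sdata <- data.table(byear = c(1990, 1995, 2000, 2005),
                    eyear = c(1995, 2000, 2005, 2010),
                    val   = c(3, 1, 5, 6))

fdata[, fyear2 := fyear]     # zero-width interval [fyear, fyear]
setkey(sdata, byear, eyear)  # foverlaps needs y keyed on its interval columns

res <- foverlaps(fdata, sdata,
                 by.x = c("fyear", "fyear2"),
                 type = "within", nomatch = 0L)

# foverlaps bounds are inclusive; drop rows violating the strict fyear < eyear
res <- res[fyear < eyear][, fyear2 := NULL]
res
```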


50few1ms2#

Use filter. (But note that this answer does *not* produce a correct LEFT JOIN; the MWE just happens to give the right result because an INNER JOIN works for it.)
dplyr does not like merging two tables that have nothing to merge on, so in the code below I create a dummy variable in both tables to join on, then filter, and finally drop dummy:

fdata %>% 
    mutate(dummy=TRUE) %>%
    left_join(sdata %>% mutate(dummy=TRUE)) %>%
    filter(fyear >= byear, fyear < eyear) %>%
    select(-dummy)

Note that if you do this in PostgreSQL (for example), the query optimizer sees through the dummy variable, as the following two query plans show:

> fdata %>% 
+     mutate(dummy=TRUE) %>%
+     left_join(sdata %>% mutate(dummy=TRUE)) %>%
+     filter(fyear >= byear, fyear < eyear) %>%
+     select(-dummy) %>%
+     explain()
Joining by: "dummy"
<SQL>
SELECT "id" AS "id", "fyear" AS "fyear", "byear" AS "byear", "eyear" AS "eyear", "val" AS "val"
FROM (SELECT * FROM (SELECT "id", "fyear", TRUE AS "dummy"
FROM "fdata") AS "zzz136"

LEFT JOIN 

(SELECT "byear", "eyear", "val", TRUE AS "dummy"
FROM "sdata") AS "zzz137"

USING ("dummy")) AS "zzz138"
WHERE "fyear" >= "byear" AND "fyear" < "eyear"

<PLAN>
Nested Loop  (cost=0.00..50886.88 rows=322722 width=40)
  Join Filter: ((fdata.fyear >= sdata.byear) AND (fdata.fyear < sdata.eyear))
  ->  Seq Scan on fdata  (cost=0.00..28.50 rows=1850 width=16)
  ->  Materialize  (cost=0.00..33.55 rows=1570 width=24)
        ->  Seq Scan on sdata  (cost=0.00..25.70 rows=1570 width=24)

whereas doing it more cleanly in SQL yields *exactly* the same plan:

> tbl(pg, sql("
+     SELECT *
+     FROM fdata 
+     LEFT JOIN sdata 
+     ON fyear >= byear AND fyear < eyear")) %>%
+     explain()
<SQL>
SELECT "id", "fyear", "byear", "eyear", "val"
FROM (
    SELECT *
    FROM fdata 
    LEFT JOIN sdata 
    ON fyear >= byear AND fyear < eyear) AS "zzz140"

<PLAN>
Nested Loop Left Join  (cost=0.00..50886.88 rows=322722 width=40)
  Join Filter: ((fdata.fyear >= sdata.byear) AND (fdata.fyear < sdata.eyear))
  ->  Seq Scan on fdata  (cost=0.00..28.50 rows=1850 width=16)
  ->  Materialize  (cost=0.00..33.55 rows=1570 width=24)
        ->  Seq Scan on sdata  (cost=0.00..25.70 rows=1570 width=24)

avwztpqn3#

This looks like a task for the fuzzyjoin package, whose various functions look and work much like the dplyr join functions.
In this case one of the fuzzy_*_join functions will work for you. The main difference between dplyr::left_join and fuzzyjoin::fuzzy_left_join is that you supply a list of functions to be used in the matching via the match_fun argument. Note that the by argument is still written the same way as in left_join.
Here is an example where the functions I use for matching are >= and <, for the fyear/byear comparison and the fyear/eyear comparison respectively.

library(fuzzyjoin)

fuzzy_left_join(fdata, sdata, 
             by = c("fyear" = "byear", "fyear" = "eyear"), 
             match_fun = list(`>=`, `<`))

Source: local data frame [27 x 5]

      id fyear byear eyear   val
   (dbl) (dbl) (dbl) (dbl) (dbl)
1      1  1998  1995  2000     1
2      1  1999  1995  2000     1
3      1  2000  2000  2005     5
4      1  2001  2000  2005     5
5      2  1998  1995  2000     1
6      2  1999  1995  2000     1
7      2  2000  2000  2005     5
8      2  2001  2000  2005     5
9      2  2002  2000  2005     5
10     2  2003  2000  2005     5
..   ...   ...   ...   ...   ...

xqkwcwgp4#

The dev version of dplyr now includes the ability to perform non-equi joins, with syntax almost exactly like what you tried. For data with many partial matches this should also be more efficient than fuzzyjoin or a filter step after an over-inclusive join.

#devtools::install_github("tidyverse/dplyr") # currently v1.0.99.9000
library(dplyr)
left_join(fdata, sdata, join_by(fyear >= byear, fyear < eyear))
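For later readers: join_by() non-equi joins shipped in dplyr 1.1.0 on CRAN, so the dev install above should no longer be necessary. A minimal self-contained sketch assuming dplyr >= 1.1.0 (data abridged from the question):

```r
library(dplyr)  # requires dplyr >= 1.1.0 for join_by()

# abridged data from the question
fdata <- data.frame(id = c(1, 2), fyear = c(1998, 2003))
sdata <- data.frame(byear = c(1990, 1995, 2000, 2005),
                    eyear = c(1995, 2000, 2005, 2010),
                    val   = c(3, 1, 5, 6))

# non-equi left join using join_by()
res <- left_join(fdata, sdata, join_by(fyear >= byear, fyear < eyear))
res
```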

fkvaft9z5#

One option is to attach the matching row(s) as a list-column and then unnest that column:

# evaluate each row individually
fdata %>% rowwise() %>% 
    # insert list column of single row of sdata based on conditions
    mutate(s = list(sdata %>% filter(fyear >= byear, fyear < eyear))) %>% 
    # unnest the list column (tidyr >= 1.0 syntax)
    tidyr::unnest(s)

# Source: local data frame [27 x 5]
# 
#       id fyear byear eyear   val
#    (dbl) (dbl) (dbl) (dbl) (dbl)
# 1      1  1998  1995  2000     1
# 2      1  1999  1995  2000     1
# 3      1  2000  2000  2005     5
# 4      1  2001  2000  2005     5
# 5      2  1998  1995  2000     1
# 6      2  1999  1995  2000     1
# 7      2  2000  2000  2005     5
# 8      2  2001  2000  2005     5
# 9      2  2002  2000  2005     5
# 10     2  2003  2000  2005     5
# ..   ...   ...   ...   ...   ...
