Is GraphX available in PySpark for Spark 3.0+?

yv5phkfx  posted 2021-07-13 in Spark
Follow (0) | Answers (2) | Views (422)

I'd like to know whether the GraphX API is available in PySpark for Spark 3.0+. I can't find anything about this in the official documentation, and all of the examples are written in Scala. Where can I get more up-to-date information?
Thanks, Darshan

lf5gs5x2  #1

No.
GraphX computation is only supported from Scala, via the RDD API.
See https://docs.databricks.com/spark/latest/graph-analysis/graph-analysis-graphx-tutorial.html
GraphX is essentially a legacy component at this point, so this makes sense.
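
(A small illustration, not part of the original answer: PySpark ships no GraphX module at all, so any attempt to import one from Python fails immediately; the only Python route to GraphX-style algorithms is a wrapper such as GraphFrames, shown in the next answer.)

# Quick check in any PySpark 3.x session: there is no pyspark.graphx module,
# so this import raises ModuleNotFoundError.
try:
    import pyspark.graphx  # no such module exists in PySpark
except ModuleNotFoundError:
    print("GraphX has no Python bindings; use Scala, or GraphFrames from Python")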

dgenwo3n  #2

According to the documentation at http://ampcamp.berkeley.edu/big-data-mini-course/graph-analytics-with-graphx.html:
"The GraphX API is currently only available in Scala but we plan to provide Java and Python bindings in the future."
However, you should look at GraphFrames (https://github.com/graphframes/graphframes), which wraps GraphX algorithms under the DataFrame API and provides a Python interface.
Below is an example from https://graphframes.github.io/graphframes/docs/_site/quick-start.html, slightly modified so that it works.
First, start pyspark with the graphframes package loaded: pyspark --packages graphframes:graphframes:0.1.0-spark1.6
Python code:

from graphframes import *

# Create a Vertex DataFrame with unique ID column "id"
v = sqlContext.createDataFrame([
  ("a", "Alice", 34),
  ("b", "Bob", 36),
  ("c", "Charlie", 30),
], ["id", "name", "age"])

# Create an Edge DataFrame with "src" and "dst" columns
e = sqlContext.createDataFrame([
  ("a", "b", "friend"),
  ("b", "c", "follow"),
  ("c", "b", "follow"),
], ["src", "dst", "relationship"])

# Create a GraphFrame
g = GraphFrame(v, e)

# Query: Get in-degree of each vertex.
g.inDegrees.show()

# Query: Count the number of "follow" connections in the graph.
g.edges.filter("relationship = 'follow'").count()

# Run PageRank algorithm, and show results.
results = g.pageRank(resetProbability=0.01, maxIter=20)
results.vertices.select("id", "pagerank").show()
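
(Not part of the original answer: on Spark 3.x you would launch with a GraphFrames build published for Spark 3 and use the SparkSession entry point instead of sqlContext. The sketch below assumes a Spark-3-compatible package coordinate such as graphframes:graphframes:0.8.1-spark3.0-s_2.12; check https://spark-packages.org/package/graphframes/graphframes for the version that matches your Spark release.)

# Launch (package version is an assumption; pick the build matching your Spark 3.x release):
#   pyspark --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12

from pyspark.sql import SparkSession
from graphframes import GraphFrame

spark = SparkSession.builder.appName("graphframes-quickstart").getOrCreate()

# Same vertices and edges as above, built through the SparkSession
v = spark.createDataFrame(
    [("a", "Alice", 34), ("b", "Bob", 36), ("c", "Charlie", 30)],
    ["id", "name", "age"])
e = spark.createDataFrame(
    [("a", "b", "friend"), ("b", "c", "follow"), ("c", "b", "follow")],
    ["src", "dst", "relationship"])

g = GraphFrame(v, e)
g.inDegrees.show()
results = g.pageRank(resetProbability=0.01, maxIter=20)
results.vertices.select("id", "pagerank").show()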
