Where can I find the K-means source code in the pyspark.ml package?

Date: 2022-09-10 15:09:26

From the PySpark 2.0.1 documentation, I can only find code like this:

the code from the pyspark.ml package

This code just assigns values to some properties. Where can I find the key Python code that shows how the algorithm actually runs?

1 Solution

#1


Spark is implemented primarily in Scala, and PySpark wraps a subset of that API to expose it in Python. You can find the algorithm's implementation in the MLlib codebase.
