Commit 8b87450

updated README (#22)
1 parent 36e654a commit 8b87450

1 file changed: README.md (+3 -3)
@@ -7,7 +7,7 @@ Introduction
 MACHINE LEARNING Algorithm library, running on KUNPENG chipset, is an accelerated library that provides a rich set of higher-level tools for machine learning algorithms. It is based on the original APIs from Apache [Spark 2.3.2](https://github.com/apache/spark/tree/v2.3.2) and [breeze 0.13.1](https://github.com/scalanlp/breeze/tree/releases/v0.13.1). The accelerated library for performance optimization greatly improves the computational performance of big data algorithm scenarios.
-The current version provides three common learning algorithms: Support Vector Machine (SVM) Algorithm, Random Forest Classifier (RFC) algorithm, and Gradient Boosting Decision Tree (GBDT) algorithm.
+The current version provides nine common learning algorithms: Support Vector Machine (SVM) Algorithm, Random Forest Classifier (RFC) algorithm, Gradient Boosting Decision Tree (GBDT) algorithm, Decision Tree (DT) algorithm, K-means Clustering algorithm, Linear Regression algorithm, Logistic Regression algorithm, Principal Component Analysis (PCA) algorithm, and Singular Value Decomposition (SVD) algorithm.
 You can find the latest documentation, including a programming guide, on the project web page. This README file only contains basic setup instructions.
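Since the README states the library keeps the original Spark 2.3.2 APIs, existing spark.ml code should run against it unchanged. Below is a minimal sketch of calling one of the newly listed algorithms (Random Forest Classifier) through the stock Spark 2.3.2 ML interface; the dataset path, application name, and parameters are illustrative placeholders, not taken from this repository.

```scala
import org.apache.spark.ml.classification.RandomForestClassifier
import org.apache.spark.sql.SparkSession

object RFCExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RFCExample") // placeholder application name
      .getOrCreate()

    // Placeholder LIBSVM-format dataset path.
    val data = spark.read.format("libsvm").load("data/sample_libsvm_data.txt")
    val Array(train, test) = data.randomSplit(Array(0.8, 0.2), seed = 42L)

    // Standard Spark 2.3.2 ML API; no library-specific calls are assumed here.
    val rfc = new RandomForestClassifier()
      .setLabelCol("label")
      .setFeaturesCol("features")
      .setNumTrees(20)

    val model = rfc.fit(train)
    model.transform(test).select("label", "prediction").show(5)

    spark.stop()
  }
}
```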

@@ -22,9 +22,9 @@ Building And Package
 mvn clean package
-(2) get "sophon-ml-core_2.11-1.0.0.jar" under the "Spark-ml-algo-lib/ml-core/target"
+(2) get "sophon-ml-core_2.11-1.1.0.jar" under the "Spark-ml-algo-lib/ml-core/target"
-get "sophon-ml-acc_2.11-1.0.0.jar" under the "Spark-ml-algo-lib/ml-accelerator/target"
+get "sophon-ml-acc_2.11-1.1.0.jar" under the "Spark-ml-algo-lib/ml-accelerator/target"
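The build step produces two jars (core and accelerator). One possible way to attach them to a Spark job, sketched below under assumptions not stated in this commit, is to pass them to spark-submit; the jar locations, master, application class, and the userClassPathFirst settings are hypothetical placeholders, and the library may document a different deployment method.

```sh
# Hypothetical deployment sketch: paths, master, and application class are placeholders.
# userClassPathFirst is one generic way to let user-supplied jars shadow Spark's bundled
# classes; whether this library requires it is not stated in this commit.
spark-submit \
  --master yarn \
  --class com.example.RFCExample \
  --jars /opt/libs/sophon-ml-core_2.11-1.1.0.jar,/opt/libs/sophon-ml-acc_2.11-1.1.0.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app.jar
```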
