See the [MXNet Scala API Documentation](http://mxnet.io/api/scala/docs/index.html).

MXNet supports the Scala programming language. The MXNet Scala package brings flexible and efficient GPU computing and state-of-the-art deep learning to Scala. It enables you to write seamless tensor/matrix computation with multiple GPUs in Scala. It also lets you construct and customize state-of-the-art deep learning models in Scala.

You can perform tensor or matrix computation in pure Scala:
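The original snippet is elided here; the following is a minimal, hypothetical sketch of the kind of pure-Scala tensor computation the sentence describes (it assumes the `ml.dmlc.mxnet` package used throughout these docs):

```scala
import ml.dmlc.mxnet._

// Create a 2x3 array of ones and do elementwise arithmetic on it.
val a = NDArray.ones(2, 3)
val b = a * 2 + 1          // each element becomes 1 * 2 + 1 = 3
println(b.toArray.mkString(", "))
```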
## Resources

* [MXNet Scala API Documentation](http://mxnet.io/api/scala/docs/index.html)
* [Handwritten Digit Classification in Scala](http://mxnet.io/tutorials/scala/mnist.html)
* [Neural Style in Scala on MXNet](https://github.com/dmlc/mxnet/blob/master/scala-package/examples/src/main/scala/ml/dmlc/mxnetexamples/neuralstyle/NeuralStyle.scala)
---

**docs/api/scala/io.md**

Topics:

* [Data Iterator Parameters](#parameters-for-data-iterator) clarifies the different usages for dataiter parameters.
* [Create a Data Iterator](#create-a-data-iterator) introduces how to create a data iterator in MXNet for Scala.
* [How to Get Data](#how-to-get-data) introduces the data resource and data preparation tools.
* [IO API Reference](http://mxnet.io/api/scala/docs/index.html#ml.dmlc.mxnet.IO$) explains the IO API.

## Data Iterator Parameters
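As a concrete illustration (a hedged sketch; the file paths are hypothetical placeholders), data iterator parameters in the Scala package are passed as a string-keyed map, e.g. for the MNIST iterator:

```scala
import ml.dmlc.mxnet._

// Each iterator parameter is passed as a string key/value pair.
val trainIter = IO.MNISTIter(Map(
  "image" -> "data/train-images-idx3-ubyte",  // hypothetical path
  "label" -> "data/train-labels-idx1-ubyte",  // hypothetical path
  "batch_size" -> "100",
  "shuffle" -> "1",
  "flat" -> "1"))
```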
First, explicitly specify the kind of data (MNIST, ImageRecord, etc.) to fetch.

## How to Get Data

We provide [scripts](https://github.com/dmlc/mxnet/tree/master/scala-package/core/scripts) to download MNIST data and CIFAR10 ImageRecord data. If you want to create your own dataset, we recommend using the Image RecordIO data format.

## Create a Dataset Using RecordIO

RecordIO implements a file format for a sequence of records. We recommend storing data in this format because:
* Packing data together allows continuous reading on the disk.
* RecordIO has a simple way to partition, which simplifies distributed settings. We provide an example later.

We provide the [im2rec tool](https://github.com/dmlc/mxnet/blob/master/tools/im2rec.cc) so you can create an Image RecordIO dataset by yourself. The following walkthrough shows you how.

### Prerequisites

Download the data. You don't need to resize the images manually. You can use `im2rec` to resize them automatically. For details, see "Extension: Using Multiple Labels for a Single Image," later in this topic.
The interface is very similar to the old `FeedForward` class. You can pass in batch-end callbacks using `setBatchEndCallback` and epoch-end callbacks using `setEpochEndCallback`. You can also set parameters using methods like `setOptimizer` and `setEvalMetric`. To learn more about `FitParams()`, see the [API page](http://mxnet.io/api/scala/docs/index.html#ml.dmlc.mxnet.module.FitParams). To predict with a module, call `predict()` with a `DataIter`:

```scala
mod.predict(val_dataiter)
```

The module collects and returns all of the prediction results. For more details about the format of the return values, see the documentation for the [`predict()` function](http://mxnet.io/api/scala/docs/index.html#ml.dmlc.mxnet.module.BaseModule).

When prediction results might be too large to fit in memory, use the `predictEveryBatch` API:
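The snippet for this call is elided above; as a hedged sketch (assuming the `BaseModule` API referenced earlier), `predictEveryBatch` returns the outputs batch by batch rather than concatenating them, so each batch can be processed and released before the next one:

```scala
// Returns one entry per batch; each entry is the sequence of
// output NDArrays for that batch.
val outputsPerBatch = mod.predictEveryBatch(val_dataiter)
for (batchOutputs <- outputsPerBatch) {
  println(batchOutputs.head.shape)  // process one batch at a time
}
```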
We provide some basic ndarray operations, like arithmetic and slice operations.

You can use MXNet functions to save and load a list or dictionary of NDArrays from file systems, as follows:

```scala
scala> import ml.dmlc.mxnet._
scala> val a = NDArray.zeros(100, 200)
scala> val b = NDArray.zeros(100, 200)
scala> // save list of NDArrays
```

Device information is stored in the `mxnet.Context` structure. When creating an NDArray in MXNet, you can use the context argument (the default is the CPU context) to create arrays on specific devices, as follows:

Currently, we *do not* allow operations among arrays from different contexts. To manually enable this, use the `copyto` member function to copy the content to a different device, and continue computation:
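The snippet for this is also elided; here is a minimal sketch, assuming the `Context` and `copyTo` API described above (the GPU lines require a GPU build of MXNet):

```scala
import ml.dmlc.mxnet._

val cpuArr = NDArray.ones(Shape(2, 3), Context.cpu(0))
val gpuArr = NDArray.ones(Shape(2, 3), Context.gpu(0))
// Copy the CPU array onto the GPU before combining the two,
// since cross-context operations are not allowed.
val sum = cpuArr.copyTo(Context.gpu(0)) + gpuArr
```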
The basic arithmetic operators (plus, minus, div, multiplication) are overloaded for symbols.

The following example creates a computation graph that adds two inputs together.

```scala
scala> import ml.dmlc.mxnet._
scala> val a = Symbol.Variable("a")
scala> val b = Symbol.Variable("b")
scala> val c = a + b
```
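To see what the graph built this way is made of, you can query the symbol's free inputs; a small sketch assuming the same setup:

```scala
import ml.dmlc.mxnet._

val a = Symbol.Variable("a")
val b = Symbol.Variable("b")
val c = a + b
// The free inputs (arguments) of the graph are the two variables.
println(c.listArguments())  // e.g. a and b
```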
To attach attributes, you can use `AttrScope`. `AttrScope` automatically adds the specified attributes to all of the symbols created within that scope.

There are two ways to save and load the symbols. You can use the `mxnet.Symbol.save` and `mxnet.Symbol.load` functions to serialize `Symbol` objects. The advantage of using the `save` and `load` functions is that they are language agnostic and cloud friendly. The symbol is saved in JSON format. You can also get a JSON string directly using `mxnet.Symbol.toJson`. Refer to the [API documentation](http://mxnet.io/api/scala/docs/index.html#ml.dmlc.mxnet.Symbol) for more details.

The following example shows how to save a symbol to an S3 bucket, load it back, and compare two symbols using a JSON string.
```scala
scala> import ml.dmlc.mxnet._
scala> val a = Symbol.Variable("a")
scala> val b = Symbol.Variable("b")
scala> val c = a + b
```

To group the symbols together, use the [mxnet.symbol.Group](#mxnet.symbol.Group) function.
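A short, hedged sketch of grouping (assuming the Scala `Symbol` API used above): grouping produces a single symbol whose outputs are the outputs of its members:

```scala
import ml.dmlc.mxnet._

val a = Symbol.Variable("a")
val b = Symbol.Variable("b")
// One symbol with two outputs: a + b and a * b.
val grouped = Symbol.Group(a + b, a * b)
println(grouped.listOutputs())
```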