Software for unsupervised deep architectures
Get uncomplicated access to unsupervised deep neural networks, from building their architecture to their training and evaluation
In order to develop Ruta models, you will need to install its dependencies first and then get the package from CRAN.
Ruta is based on the well-known open source deep learning library Keras, accessed through its R interface, which is integrated into TensorFlow.
In order to install them easily, you can use the keras::install_keras() function. Depending on whether you want to use the system installation, a Conda environment or a Virtualenv, you may need to call use_condaenv() or use_virtualenv() from reticulate.
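For reference, a minimal setup from R could look like the sketch below; the environment name "r-reticulate" is only an example, so adjust it to your own setup.
# Install Keras and TensorFlow into the default Python environment
keras::install_keras()
# If you keep several Python environments, point reticulate at the right one
# before Keras is loaded ("r-reticulate" is an example name)
reticulate::use_condaenv("r-reticulate")
# or: reticulate::use_virtualenv("r-reticulate")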
Another straightforward way to install these dependencies is a global, system-wide installation (sudo pip install) or a user-wide one (pip install --user) with pip.
This is generally not recommended unless you are sure you will not need alternative versions and will not run into conflicts with other packages. The following shell command would install all libraries expected by Keras:
$ pip install --user tensorflow tensorflow-hub tensorflow-datasets scipy requests pyyaml Pillow h5py pandas pydot
Otherwise, you can follow the official installation guides for TensorFlow and Keras.
Check whether Keras is accessible from R by running:
keras::is_keras_available() # should return TRUE
From an R interpreter such as the R REPL or the RStudio console, run one of the following commands to get the Ruta package:
# Just get Ruta from CRAN
install.packages("ruta")
# Or get the latest development version from GitHub
devtools::install_github("fdavidcl/ruta")
All R dependencies will be automatically installed. These include the Keras R interface and purrr.
The easiest way to start working with Ruta is the autoencode() function. It allows you to select a type of autoencoder and transform the feature space of a data set into another one with some desirable properties, depending on the chosen type.
iris[, 1:4] |> as.matrix() |> autoencode(2, type = "denoising")
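The result is the transformed data set itself, so a typical use is to store it for later work. A minimal sketch (the variable name reduced is only illustrative):
# Reduce the four iris features to two learned features
reduced <- iris[, 1:4] |> as.matrix() |> autoencode(2, type = "denoising")
dim(reduced) # should be 150 rows and 2 columns, one per learned feature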
You can learn more about different variants of autoencoders by reading A practical tutorial on autoencoders for nonlinear feature fusion.
Ruta provides the functionality to build diverse neural architectures (see autoencoder()), train them as autoencoders (see train()) and perform different tasks with the resulting models (see reconstruct()), including evaluation (see evaluate_mean_squared_error()). The following is a basic example of a natural pipeline with an autoencoder:
library(ruta)
# Shuffle and normalize dataset
x <- iris[sample(nrow(iris)), 1:4] |> as.matrix() |> scale()
x_train <- x[1:100, ]
x_test <- x[101:150, ]
autoencoder(
input() + dense(256) + dense(36, "tanh") + dense(256) + output("sigmoid"),
loss = "mean_squared_error"
) |>
make_contractive(weight = 1e-4) |>
train(x_train, epochs = 40) |>
evaluate_mean_squared_error(x_test)
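If you want to keep the trained model, for example to reconstruct new samples with reconstruct(), a sketch under the same setup (reusing x_train and x_test from the example above) separates training from evaluation:
# Keep the trained model instead of piping it straight into evaluation
model <- autoencoder(
  input() + dense(36, "tanh") + output("sigmoid"),
  loss = "mean_squared_error"
) |>
  train(x_train, epochs = 40)
# Encode and decode the test data with the trained autoencoder
decoded <- model |> reconstruct(x_test)
# The same model can also be evaluated explicitly
evaluate_mean_squared_error(model, x_test)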
For more details, see other examples and the documentation.