CRAN Package Check Results for Package FCNN4R

Last updated on 2018-01-20 02:49:06 CET.

Flavor                             Version  Tinstall  Tcheck  Ttotal  Status  Flags
r-devel-linux-x86_64-debian-clang  0.6.2       26.16   38.43   64.59  NOTE
r-devel-linux-x86_64-debian-gcc    0.6.2       24.79   28.17   52.96  NOTE
r-devel-linux-x86_64-fedora-clang  0.6.2                       95.15  NOTE
r-devel-linux-x86_64-fedora-gcc    0.6.2                       81.84  NOTE
r-devel-windows-ix86+x86_64        0.6.2      110.00   94.00  204.00  OK
r-patched-linux-x86_64             0.6.2       27.97   36.65   64.62  NOTE
r-patched-solaris-x86              0.6.2                       97.90  ERROR
r-release-linux-x86_64             0.6.2       30.24   40.99   71.23  NOTE
r-release-windows-ix86+x86_64      0.6.2       94.00   93.00  187.00  OK
r-release-osx-x86_64               0.6.2                              OK
r-oldrel-windows-ix86+x86_64       0.6.2       94.00   71.00  165.00  OK
r-oldrel-osx-x86_64                0.6.2                              OK

Check Details

Version: 0.6.2
Check: compiled code
Result: NOTE
    File ‘FCNN4R/libs/FCNN4R.so’:
     Found no calls to: ‘R_registerRoutines’, ‘R_useDynamicSymbols’
    
    It is good practice to register native routines and to disable symbol
    search.
    
    See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
Flavors: r-devel-linux-x86_64-debian-clang, r-devel-linux-x86_64-debian-gcc, r-devel-linux-x86_64-fedora-clang, r-devel-linux-x86_64-fedora-gcc, r-patched-linux-x86_64, r-release-linux-x86_64
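
A possible way to address this NOTE (a sketch only, assuming R >= 3.4.0 and that the package's native entry points are reached via .Call()): generate a registration skeleton from the package source tree and enable registration in NAMESPACE, as described in 'Writing R Extensions'. The file path and NAMESPACE directive below are illustrative, not part of FCNN4R itself.

    # run from the FCNN4R source directory; prints a C registration skeleton
    # (calls to R_registerRoutines and R_useDynamicSymbols) that the maintainer
    # could save as src/init.c and adjust as needed
    tools::package_native_routine_registration_skeleton(".")
    # NAMESPACE would then carry: useDynLib(FCNN4R, .registration = TRUE)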

Version: 0.6.2
Check: examples
Result: ERROR
    Running examples in ‘FCNN4R-Ex.R’ failed
    The error most likely occurred in:
    
    > ### Name: FCNN4R-package
    > ### Title: Fast Compressed Neural Networks for R
    > ### Aliases: FCNN4R-package
    > ### Keywords: package
    >
    > ### ** Examples
    >
    >
    > # set up the XOR problem inputs and outputs
    > inp <- c(0, 0, 1, 1, 0, 1, 0, 1)
    > dim(inp) <- c(4, 2)
    > outp <- c(0, 1, 1, 0)
    > dim(outp) <- c(4, 1)
    > # create a 2-6-1 network
    > net <- mlp_net(c(2, 6, 1))
    > # set activation function in all layers
    > net <- mlp_set_activation(net, layer = "a", "sigmoid")
    > # randomise weights
    > net <- mlp_rnd_weights(net)
    > # tolerance level
    > tol <- 0.5e-4
    > # teach using Rprop, assign trained network and plot learning history
    > netmse <- mlp_teach_rprop(net, inp, outp, tol_level = tol,
    + max_epochs = 500, report_freq = 10)
    Rprop; epoch 10, mse: 0.250534675880111 (desired: 5e-05)
    Rprop; epoch 20, mse: 0.249772549729617 (desired: 5e-05)
    Rprop; epoch 30, mse: 0.244018094767975 (desired: 5e-05)
    Rprop; epoch 40, mse: 0.204328388234391 (desired: 5e-05)
    Rprop; epoch 50, mse: 0.16582866805283 (desired: 5e-05)
    Rprop; epoch 60, mse: 0.108439252597266 (desired: 5e-05)
    Rprop; epoch 70, mse: 0.015991824896717 (desired: 5e-05)
    Rprop; epoch 80, mse: 0.00419098445346732 (desired: 5e-05)
    Rprop; epoch 90, mse: 0.12180017468459 (desired: 5e-05)
    > net <- netmse$net
    > plot(netmse$mse, type = 'l')
    > # plot network with weights
    > mlp_plot(net, TRUE)
    > # if the algorithm had converged, prune using Optimal Brain Surgeon and plot
    > if (mlp_mse(net, inp, outp) <= tol) {
    + net <- mlp_prune_obs(net, inp, outp, tol_level = tol,
    + max_reteach_epochs = 500, report = TRUE)[[1]]
    + mlp_plot(net, TRUE)
    + }
    Error in mlp_prune_obs(net, inp, outp, tol_level = tol, max_reteach_epochs = 500, :
     network should be trained with MSE reduced to given tolerance level (5e-05) before pruning; MSE is 6.39118830017683e-05
    Execution halted
Flavor: r-patched-solaris-x86
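
On this flavor the example's own guard (mlp_mse(net, inp, outp) <= tol) passed, yet mlp_prune_obs() re-checked the MSE against tol_level and stopped because it was still marginally above 5e-05. A more defensive variant of the example would retrain until the MSE sits comfortably below the tolerance before pruning. The sketch below is illustrative only: the 0.9 * tol safety margin and the retry loop are assumptions, not part of FCNN4R's documented interface, and it reuses only the functions and arguments shown in the example above.

    # retrain with Rprop until the MSE is safely below the tolerance,
    # then attempt Optimal Brain Surgeon pruning (margin and retry count
    # are illustrative assumptions)
    attempts <- 0
    while (mlp_mse(net, inp, outp) > 0.9 * tol && attempts < 5) {
        netmse <- mlp_teach_rprop(net, inp, outp, tol_level = 0.9 * tol,
                                  max_epochs = 500, report_freq = 10)
        net <- netmse$net
        attempts <- attempts + 1
    }
    if (mlp_mse(net, inp, outp) <= 0.9 * tol) {
        net <- mlp_prune_obs(net, inp, outp, tol_level = tol,
                             max_reteach_epochs = 500, report = TRUE)[[1]]
        mlp_plot(net, TRUE)
    }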