Machine learning in Perl, Part 3: Deep Convolutional Generative Adversarial Network

Hello all, a quick update on the status of AI::MXNet.
Recently MXNet proper got a cool addition: a new imperative, PyTorch-like interface called Gluon. If you are interested, please read about it on the Gluon home page.
I am pleased to announce that Perl (as of AI::MXNet 1.1) is joining the happy family of Lua and Python, able to express ML ideas with Torch-like elegance and fluidity.
Today's code is from the Perl examples; if you would like to understand it more deeply, please read the details at http://gluon.mxnet.io/chapter14_generative-adversa…
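To give a taste of the imperative style, here is a minimal sketch of defining and running a small Gluon network in Perl. The method names follow the Gluon MNIST example shipped with AI::MXNet and the Python Gluon API; check the examples directory for the authoritative usage.

```perl
use strict;
use warnings;
use AI::MXNet qw(mx);
use AI::MXNet::Gluon qw(gluon);

# Build a small fully connected net imperatively, layer by layer,
# instead of wiring up a static symbolic graph.
my $net = gluon->nn->Sequential();
$net->name_scope(sub {
    $net->add(gluon->nn->Dense(128, activation => 'relu'));
    $net->add(gluon->nn->Dense(10));
});
$net->initialize(mx->init->Normal(sigma => 0.05));

# A forward pass is just a call on real data; no separate bind/compile step.
my $out = $net->(mx->nd->ones([2, 20]));
print join(',', @{ $out->shape }), "\n";    # 2,10
```

Because the network runs eagerly, you can inspect intermediate values with ordinary Perl, which is exactly what makes debugging GANs like the DCGAN example bearable.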

Machine learning in Perl, Part 2: a calculator, handwritten digits and RoboShakespeare.

Hello all,
The eight weeks since that post have been quite fruitful: I've ported the whole Python test suite, fixed multiple bugs, and added docs, examples, and a high-level RNN interface, and Perl API docs have been added to the official MXNet website.
This time I'd like to review in detail three examples from the examples directory.
The first one is a simple calculator: a fully connected net that is structured to learn the four basi…

Machine learning in Perl

Hello everybody, this is my first post, so please go easy on me. I have worked with Perl for the last 19 years, and it has always treated me extremely well in terms of both interesting work and material compensation. Last December my company decided it was finally time to join the fashion of the day and start experimenting with ML.

I started researching and found that my beloved Perl is stuck in the past when it comes to ML support, with no recent developments in this area (essentially the whole last decade).

Now look at Python: TensorFlow, MXNet, Keras, Theano, Caffe, and many, many more. Java has its Deeplearning4j, Lua has Torch, and what did Perl have?

We had AI::FANN, an interface (a good one; I used it) to a C library that has not seen any real development since 2007: only feed-forward neural networks, no convolutional networks, no recurrent networks. All the advances of the last 10 years happened outside of Perl.

Please do not get me wrong: PDL is wonderful, CPAN is livelier than ever, and Perl 5 is actively being developed, but completely ignoring such an important topic does not look like a healthy state of affairs.

So I had a sad day around Dec 15 and decided to try to repay Perl for 19 years of taking care of me. After researching and comparing the existing libraries, I settled on MXNet as the best one around: it is the most scalable, fast, and efficient, with the cleanest design, and it has the backing of Amazon as its official ML library.

I started writing a Perl interface to the library. It took about 1.5 months of work, but it has finally reached a rough usable state (the POD is still mostly Python, and some aspects of the Python interface are not yet ported; I am actively working on the remaining pieces). Today it was accepted into the official MXNet GitHub repository :-) and I hope to keep the Perl interface on par with Python for years to come.

Ok, now for the code example; this is straight from the t/ directory. I am deliberately keeping the outer sugar very close to the original Python usage, so examples written in Python are directly applicable (just add $ sigils :-))

Convolutional NN for recognizing hand-written digits in MNIST dataset

## It's considered "Hello, World" for Neural Networks
## For more info about the MNIST problem please refer to http://neuralnetworksanddeeplearning.com/chap1.html
use strict;
use warnings;
use AI::MXNet qw(mx);
use AI::MXNet::TestUtils qw(GetMNIST_ubyte);
use Test::More tests => 1;
# symbol net
my $batch_size = 100;
### model
my $data = mx->symbol->Variable('data');
my $conv1 = mx->symbol->Convolution(data => $data, name => 'conv1', num_filter => 32, kernel => [3, 3], stride => [2, 2]);
my $bn1   = mx->symbol->BatchNorm(data => $conv1, name => 'bn1');
my $act1  = mx->symbol->Activation(data => $bn1, name => 'relu1', act_type => 'relu');
my $mp1   = mx->symbol->Pooling(data => $act1, name => 'mp1', kernel => [2, 2], stride => [2, 2], pool_type => 'max');
my $conv2 = mx->symbol->Convolution(data => $mp1, name => 'conv2', num_filter => 32, kernel => [3, 3], stride => [2, 2]);
my $bn2   = mx->symbol->BatchNorm(data => $conv2, name => 'bn2');
my $act2  = mx->symbol->Activation(data => $bn2, name => 'relu2', act_type => 'relu');
my $mp2   = mx->symbol->Pooling(data => $act2, name => 'mp2', kernel => [2, 2], stride => [2, 2], pool_type => 'max');
my $fl    = mx->symbol->Flatten(data => $mp2, name => 'flatten');
my $fc1   = mx->symbol->FullyConnected(data => $fl, name => 'fc1', num_hidden => 30);
my $act3  = mx->symbol->Activation(data => $fc1, name => 'relu3', act_type => 'relu');
my $fc2   = mx->symbol->FullyConnected(data => $act3, name => 'fc2', num_hidden => 10);
my $softmax = mx->symbol->SoftmaxOutput(data => $fc2, name => 'softmax');
# check data
GetMNIST_ubyte();
my $train_dataiter = mx->io->MNISTIter({
    image      => 'data/train-images-idx3-ubyte',
    label      => 'data/train-labels-idx1-ubyte',
    data_shape => [1, 28, 28],
    batch_size => $batch_size, shuffle => 1, flat => 0, silent => 0, seed => 10});
my $val_dataiter = mx->io->MNISTIter({
    image      => 'data/t10k-images-idx3-ubyte',
    label      => 'data/t10k-labels-idx1-ubyte',
    data_shape => [1, 28, 28],
    batch_size => $batch_size, shuffle => 1, flat => 0, silent => 0});
my $n_epoch = 1;
my $mod = mx->mod->new(symbol => $softmax);
$mod->fit(
    $train_dataiter,
    eval_data => $val_dataiter,
    optimizer_params => {learning_rate => 0.01, momentum => 0.9},
    num_epoch=>$n_epoch
);
my $res = $mod->score($val_dataiter, mx->metric->create('acc'));
ok($res->{accuracy} > 0.8);
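Once `fit` has run, the trained module can also produce raw predictions. A minimal sketch follows; `predict` exists in the Python Module API, and I am assuming the Perl port mirrors it, so treat this as illustrative rather than authoritative.

```perl
# Rewind the iterator, then run inference over the validation set.
# Each output row is a softmax distribution over the 10 digit classes.
$val_dataiter->reset();
my $probs = $mod->predict($val_dataiter);
print join(',', @{ $probs->shape }), "\n";    # 10000,10 for the MNIST test set
```

This is the step you would use in production: `score` gives you a metric, while `predict` gives you the per-image class probabilities to act on.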

I hope this extension will be useful, and that maybe it can chip away just a bit of the negativity that surrounds Perl these days.

Thank you for reading this far! If you are interested, please check out MXNet at the MXNet GitHub repository.