Update on rakudo.js

Current State

rakudo.js (Rakudo compiled to JavaScript) compiles 70% of the core setting.
I'm working on getting it to compile the whole setting.
The setting executes a bunch of code at compile time (it has BEGIN blocks, constant declarators, etc.), so the code the compiler generates is validated to some degree (the test suite will exercise it much more).
I'm mostly fixing bugs and implementing missing features in the backend (most are small; some required bigger changes to the way we handle things, like nqp::attrinited).
While doing that I'm a…

Nqp-js update

nqp-js/rakudo.js is now targeting ECMAScript 6

Scott McWhirter helped a ton with the transition (as well as with some general cleanup).
Most modern browsers now support ECMAScript 6, so I feel it makes sense to target it.
When targeting old ones that don't, we can use polyfills and compilers from ECMAScript 6 to 5.
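As a hypothetical illustration (not actual nqp-js output), here are a few ES6 features of the sort involved: const, classes, arrow functions, and Map. They run natively in modern engines and can be compiled down to ES5 for older ones:

```javascript
// A few of the ES6 features that matter for generated code:
// block scoping (const), classes, arrow functions, and Map.
'use strict';

class Counter {
  constructor() {
    this.counts = new Map(); // ES6 Map instead of a plain object
  }
  add(key) {
    this.counts.set(key, (this.counts.get(key) || 0) + 1);
  }
}

const c = new Counter();
['a', 'b', 'a'].forEach(k => c.add(k)); // arrow function
console.log(c.counts.get('a')); // 2
console.log(c.counts.get('b')); // 1
```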

After doing most of the obvious and promising nqp-js optimizations, I'm focusing again on getting rakudo.js to work.
Before that I'm cleaning up the nqp-js code base to remove hacks that might shoot us in the back while working on rakudo.js.

Adding profiling support to nqp-js

Currently the focus of the work on the js backend is on making nqp-js emit code that runs at a reasonable speed (so that compiling Rakudo and its setting doesn't take eons and I can iterate on it more easily). Being able to easily profile nqp-js code is very useful for that.

The JS profilers I have tried didn't work out so well:

  • devtools had trouble with native modules as it runs in
  • running directly inside Chrome requires webpacking
  • node-inspector didn't support console.profile/console.profileEnd, and its interface locked up while profiling
  • some other ones were bitrotted

Saving data using v8-profiler and loading it via Chrome itself proved to work the best. The run_profiled($what, $kind, $filename) method of the backend passed to HLL::Compiler proved to be a good fit for hooking it up. On the MoarVM backend $kind is either 'instrumented' or 'heap': 'instrumented' is useful for profiling CPU usage and 'heap' for memory usage. For now I implemented only CPU profiling on the js backend, so there $kind is ignored. The profiling is exposed by the --profile-compile option (which profiles the compilation of the code) and --profile (which profiles the code itself).

Example: node nqp-bootstrapped.js --profile-compile --profile-filename literals.cpuprofile t/nqp/001-literals.t

literals.cpuprofile can then be loaded using the developer tools in Chrome.
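A .cpuprofile file is just JSON, so it can also be inspected without Chrome. A rough sketch (assuming the older tree-shaped format with head, children, and hitCount fields; the sample object stands in for JSON.parse of a real file):

```javascript
'use strict';

// Walk the call tree of a .cpuprofile and tally hits per function name.
function tallyHits(node, totals = new Map()) {
  const name = node.functionName || '(anonymous)';
  totals.set(name, (totals.get(name) || 0) + (node.hitCount || 0));
  for (const child of node.children || []) tallyHits(child, totals);
  return totals;
}

// Tiny hand-made stand-in for JSON.parse(fs.readFileSync(file, 'utf8')).
const profile = {
  head: {
    functionName: '(root)', hitCount: 0,
    children: [
      { functionName: 'compile', hitCount: 5, children: [
        { functionName: 'slurp', hitCount: 12, children: [] },
      ]},
      { functionName: 'compile', hitCount: 3, children: [] },
    ],
  },
};

const totals = tallyHits(profile.head);
console.log(totals.get('compile')); // 8
console.log(totals.get('slurp'));   // 12
```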

The first test file whose compilation I profiled was t/nqp/001-literals.t repeated 20 times.

It turned out that removing some debugging leftovers cut the compilation time roughly in half.
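Debug leftovers tend to be expensive even when nobody reads their output, because the message string is built unconditionally. A hypothetical sketch of the pattern (names and the NQP_DEBUG variable are illustrative, not from the nqp-js source):

```javascript
'use strict';

const DEBUG = process.env.NQP_DEBUG === '1';

// Bad: the debug string is built on every call, even when DEBUG is off
// and the result is thrown away.
function describeEager(node) {
  const msg = 'compiling: ' + JSON.stringify(node);
  return DEBUG ? msg : null;
}

// Better: check the flag first, so the common (non-debug) case does
// no stringification work at all.
function describeLazy(node) {
  return DEBUG ? 'compiling: ' + JSON.stringify(node) : null;
}

const node = { op: 'say', args: ['Hello World'] };
console.log(describeLazy(node)); // null when NQP_DEBUG is unset
```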

A bunch of other promising stuff I'll work on next also popped up.

Working on getting the Perl 6 setting to compile.

Currently rakudo.js is at the point where some things work, but node rakudo.js -e 'say "Hello World"' doesn't.

What's needed for the latter is to get rakudo.js (Rakudo compiled to JavaScript) to compile the setting.

The general work-flow for that is:

  1. Try to compile the setting with rakudo.js.

  2. While rakudo.js is compiling, some error appears.

  3. I then figure out whether it's a result of a missing…

Optimizing nqp-js-on-js to make it fast enough to compile Rakudo

Having failed to find a working profiler on npm, I ended up webpacking nqp-js-on-js and profiling it directly in Chrome.
I implemented them.
The second big slowdown was actually the slurp() function.
MoarVM doesn't handle concatenating large numbers of huge strings very well, so in the cross-compiler, instead of concatenating bits of JavaScript code, it's often much faster to write them to disk and then slurp them back in.
In nqp-js-on-js, slurp() turned out to be sluggish due to a misset buffer size.