Adding profiling support to nqp-js

Currently the focus of the work on the js backend is on making nqp-js emit code that runs at a reasonable speed (so that compiling Rakudo and its setting doesn't take eons and I can iterate on it more easily). Being able to easily profile nqp-js code is very useful for that.

The js profilers I tried didn't work out so well:

  • devtools had trouble with native modules because of the environment it runs in
  • running directly inside Chrome requires webpacking
  • node-inspector didn't support console.profile/console.profileEnd (see the snippet below) and its interface locked up while profiling
  • some other ones were bitrotted
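
For context, console.profile/console.profileEnd is just a pair of calls wrapping whatever code should be measured; here's a minimal sketch (compileSomething() is a made-up stand-in for the code being profiled):

    // Start a named CPU profile, run the code of interest, then stop it.
    console.profile('nqp-compile');
    compileSomething();
    console.profileEnd('nqp-compile');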

Saving the data using v8-profiler and loading it in Chrome itself proved to work the best. The run_profiled($what, $kind, $filename) method of the backend passed to the HLL::Compiler proved to be a good fit for hooking it up. On the MoarVM backend $kind is either 'instrumented' or 'heap': 'instrumented' is useful for profiling CPU usage and 'heap' for memory usage. For now I have implemented only CPU profiling on the js backend, so there $kind is ignored. The profiling is exposed through the --profile-compile option (which profiles the compilation of the code) and the --profile option (which profiles the code itself).
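
The v8-profiler side of this is roughly as follows (a simplified sketch rather than the actual nqp-js code; profileCompilation() and the output filename are placeholders):

    // Save a CPU profile with the v8-profiler npm module so Chrome can load it.
    const fs = require('fs');
    const profiler = require('v8-profiler');

    profiler.startProfiling('compile', true);   // second argument enables sample recording
    profileCompilation();                       // hypothetical: the code being profiled
    const profile = profiler.stopProfiling('compile');

    // Serialize the profile and write it to a file the Chrome devtools can open.
    profile.export(function (error, result) {
        fs.writeFileSync('literals.cpuprofile', result);
        profile.delete();
    });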

Example: node nqp-bootstrapped.js --profile-compile --profile-filename literals.cpuprofile t/nqp/001-literals.t

literals.cpuprofile can be then loaded using the developer tools in Chrome.

The first test file whose compilation I profiled was t/nqp/001-literals.t, repeated 20 times.

It turned out that removing some debugging leftovers cut the compilation time roughly in half.

A bunch of other promising stuff I'll work on next also popped up.

Working on getting the Perl 6 setting to compile

Currently rakudo.js is at the point where:
… works, but node rakudo.js -e 'say "Hello World"' doesn't.

What's needed for the latter is to get rakudo.js (Rakudo compiled to JavaScript) to compile the setting.

The general work-flow for that is:


  1. Try to compile the setting with rakudo.js.

  2. While rakudo.js is compiling, some error appears.

  3. I then figure out whether it's a result of a missing…

Optimizing nqp-js-on-js to make it fast enough to compile Rakudo

Having failed to find a working profiler on npm, I ended up webpacking nqp-js-on-js and profiling it directly in Chrome.
I implemented them.
The second big slowdown was actually the slurp() function.
MoarVM doesn't handle concatenating large amounts of huge strings very well, so in the cross compiler, instead of concatenating bits of JavaScript code, it's often much faster to write them to disk and then slurp them back in.
On nqp-js-on-js, slurp() turned out to be sluggish due to a misset buffer size.
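
To illustrate the kind of problem (a hypothetical sketch, not the actual nqp-js-on-js code): a slurp built around a fixed-size read buffer degrades badly when that buffer is set too small, because reading a large file then takes a huge number of read calls and produces a huge number of intermediate strings.

    // Hypothetical chunked slurp; bufferSize is the knob that was misset.
    // A tiny bufferSize means thousands of readSync calls for a big file.
    const fs = require('fs');

    function slurp(path, bufferSize) {
        const fd = fs.openSync(path, 'r');
        const chunk = Buffer.alloc(bufferSize);
        const pieces = [];
        let bytesRead;
        while ((bytesRead = fs.readSync(fd, chunk, 0, bufferSize, null)) > 0) {
            // Decoding per chunk ignores multi-byte characters split across
            // chunk boundaries; good enough for a performance sketch.
            pieces.push(chunk.toString('utf8', 0, bytesRead));
        }
        fs.closeSync(fd);
        return pieces.join('');
    }

With a sensibly sized buffer the loop runs only a handful of times per file, so getting that constant right matters a lot for files the size of the generated JavaScript.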

Short summary of the current state of the rakudo-js grant

nqp-js-on-js (NQP compiled to JavaScript and running on node.js) passes its test suite (almost: there is a bug with how regexes compiled at runtime capture stuff, which I haven't figured out yet).
While nqp-js-on-js compiles parts of Rakudo (with a minor bug fix), it turns out that for some reason it's unacceptably slow on some of the larger files (like Perl6::World).
As such I have turned my attention to figuring out what the problem is and speeding nqp-js-on-js up.
Hopefully the next blog posts will be more detailed and contain descriptions of some nifty optimizations.

What should Rakudo-js aim for first?

I'm considering applying for a TPF grant to allow me to fully focus on working on getting Rakudo to target JS. To focus the grant application (and pin down the deliverables) I need to choose a use case for rakudo-js to focus on. Possible ones include (ideas for new ones are appreciated):
  • running a single page app in a browser (using react.js/jquery or just vanilla js).
  • running on top of node.js
  • running on top of react.native on a mobile phone
  • exploring Perl 6 in your browser (having an awesome REPL, being able to execute snippets, etc.)