Mould King 13106 Forklift Review

I got myself the Mould King 13106 Forklift, which is based on the MOC 3681 by KevinMoo, and want to share my impressions with you.

First of all, Mould King actually improved the set by exclusively using black technic pins instead of the blue ones found in the MOC. Also, they are officially cooperating with the MOC designer – so he is likely getting a share of the sales.

The set comes with the “New PowerModule 4.0”, which means it supports proportional output. If you use the new joystick controller (like I do in the video) or the app, you get smooth control of the motors instead of just the binary 0% or 100% throttle of the standard remote.

As you can see, I actually put on some of the stickers. Some purists never do this, arguing that after some time the stickers start peeling off and look used. This is certainly a good point if you are building a sports car – with a forklift, however, I would argue that worn stickers add to the looks.

Compared to the original MOC, Mould King removed the lights, but added a pallet similar to the one found in the Lego 42079 Forklift.


Manual errata & comments

Generally, I prefer the Mould King manual to the original by Kevin Moo, as I like renderings more than photographs. However, it's nice to have the original at hand if something looks fishy. While building, I noticed the following:

Step 34: Cable management is almost completely skipped in the manual. I laid all cables through the opening behind the threads, which keeps them out of the way later. Then fiddle through the cables of the motors that you add at steps 52 & 55.

Step 100: The battery-box position is wrong. It will collide with the bar you added at step 96. To make it fit, just rotate the battery-box by 180°.

Also, the direction of motor A has to be reversed. Press and hold left-shoulder, up and down for 3 seconds for this.

Step 111: The arms that you added in steps 89/ 90 should be oriented upwards to hold the footstep.

Step 143: Use a black bush instead of the 2-pin-axle beam, so things look symmetrical. This is a leftover from the original MOC, which squashed the IR receiver in there.

Step 156: Attach the levers to the front console at step 173 instead of attaching them to the seat here. After all they are supposed to control the fork and not the backrest.

Step 214: I suggest using gray 2-axles at step 230 instead of the suggested white ones – this way, the front-facing axles will all be gray. To compensate, just use the white 2-axles here; they won't be visible at all anyway.

Step 277: When adding the fork to the lift-arm, make sure that it has as much play as possible. Otherwise the fork will get stuck when moved all the way up.

Interior with fixes at step 143 & step 156

Step 278: Do not fix the threads yet. Wait until the end so you can correctly measure the lowest position of the fork (which gives you the length of the threads).

Step 286: Make sure that the 3-pin pops out towards the 8-axle. This will make joining things at step 288 much easier.

Comparing water filters

Let's say you want to reduce the carbonate hardness of your water because you got a shiny coffee machine and descaling it is a time-consuming mess.

If you don't happen to run a coffee shop, a water filter jug is totally sufficient for this. Unfortunately, while the jug itself is quite cheap, the filters you need will cost you an arm and a leg – similar to how the printer-ink business works.

The setup

Here, we want to look at the different filter options and compare their performance. The contenders are

Name               Pricing
Brita Classic      ~15.19 €
PearlCo Classic    12.90 €
PearlCo Protect+   15.90 €

As said initially, the primary goal of using these filters is to reduce the carbonate hardness – any other changes, like the pH mythology, will not be considered.

To measure the performance in this regard, I am using a digital total dissolved solids (TDS) meter – just like the one shown in the Wikipedia article. To make the measurement robust against environmental variations, I am not only measuring the PPM in the filtered water, but also in the tap water before filtering. The main indicator is then the reduction factor.
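
Expressed in code, the indicator boils down to the following (a minimal sketch – the example readings are made up):

def reduction_factor(tap_ppm, filtered_ppm):
    # fraction of dissolved solids removed, relative to the unfiltered tap water
    return 1 - filtered_ppm / tap_ppm

# made-up example readings: 300 PPM from the tap, 210 PPM after filtering
print(reduction_factor(300, 210))  # 0.3, i.e. a 30% reduction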

Also, you do not use a filter only once, so I repeated the measurements over the course of 37 days. Why 37? Well, most filters are specified for 30 days of usage – but I want to see how much cushion we have there.

So – without further ado – the results:

Results

Name               Ø PPM reduction    Ø absolute PPM
Brita Classic      31%                206
PearlCo Classic    24%                218
PearlCo Protect+   32%                191

As motivated above, the difference in absolute PPM can be explained by environmental variation – after all the measurements took place over the course of more than 3 months.

However, we see that the pricing difference is indeed reflected by filtering performance. By paying ~20% more, you get a ~30% higher PPM reduction.

The only thing missing is the time-series to see beyond 30 days:

As you can see, the filtering performance is continuously declining after a peak at about 10-15 days of use.

And for completeness, the absolute PPM values:

How to generate random points on a sphere

This question often pops up when you need a random direction vector to place things in 3D or want to do a particle simulation.

We recall that a 3D unit-sphere (and hence a direction) is parametrized by only two variables: the elevation \theta \in [0; \pi] and the azimuth \varphi \in [0; 2\,\pi], which can be converted to Cartesian coordinates as

\begin{aligned} x &= \sin\theta \, \cos\varphi \\ y &= \sin\theta \, \sin\varphi \\ z &= \cos\theta \end{aligned}

If one takes the easy way and uniformly samples this parametrization in numpy like

import numpy as np

phi = np.random.rand() * 2 * np.pi  # azimuth, uniform in [0, 2*pi]
theta = np.random.rand() * np.pi    # elevation, uniform in [0, pi]

One (i.e. you, as you are reading this) ends up with something like this:

While the 2D surface of the polar parametrization is sampled uniformly (left), we observe a bias of the sampling density towards the poles when converting to Cartesian coordinates (right).
The issue is that the \cos mapping of the elevation has an uneven step size in Cartesian space, as you can easily verify: \cos'(x) = -\sin(x).
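
More formally, the surface area element of the unit sphere is

\mathrm{d}A = \sin\theta \, \mathrm{d}\theta \, \mathrm{d}\varphi

so uniform sampling in \theta places too many samples where \sin\theta is small, i.e. at the poles. Substituting z = \cos\theta gives \mathrm{d}z = -\sin\theta \, \mathrm{d}\theta and hence \mathrm{d}A = \mathrm{d}z \, \mathrm{d}\varphi – being uniform in z therefore means being uniform on the sphere.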

The solution is to simply sample the elevation in the Cartesian space instead of the spherical space – i.e. sampling z \in [-1; 1]. From that we can get back to our elevation as \theta = \arccos z:

z = 1 - np.random.rand() * 2 # convert rand() range 0..1 to -1..1
theta = np.arccos(z)

As desired, this compensates for the spherical parametrization such that we end up with uniform sampling in Cartesian space:

Custom opening angle

If you want to further restrict the opening angle instead of sampling the full sphere, you can easily extend the above. For this, you re-map the \cos values from [1; -1] to [0; 2] as

cart_range = -np.cos(angle) + 1 # maximal range in cartesian coords
z = 1 - np.random.rand() * cart_range
theta = np.arccos(z)

Optimized computation

If you do not actually need the angle \theta itself, you can save some trigonometric function calls by using \sin\theta = \sqrt{1 - z^2} as

\begin{aligned} x &= \sqrt { 1 - z^2} \, \cos\varphi \\ y &= \sqrt { 1 - z^2} \, \sin\varphi \end{aligned}
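
Putting everything together, here is a minimal numpy sketch of the whole procedure (the function name and the vectorized form are mine, not part of the snippets above):

import numpy as np

def sample_sphere(n, opening_angle=np.pi):
    # azimuth is uniform in [0, 2*pi] in any case
    phi = np.random.rand(n) * 2 * np.pi
    # sample z uniformly in [cos(opening_angle), 1]; opening_angle=pi gives the full sphere
    cart_range = 1 - np.cos(opening_angle)
    z = 1 - np.random.rand(n) * cart_range
    # sin(theta) = sqrt(1 - z^2) avoids the arccos/ cos round-trip
    sin_theta = np.sqrt(1 - z**2)
    return np.stack([sin_theta * np.cos(phi), sin_theta * np.sin(phi), z], axis=-1)

points = sample_sphere(1000)  # 1000 uniformly distributed direction vectors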

Self-built NAS for Nextcloud hosting

With Google cutting its unlimited storage and ending the Play Music service, I decided to use my own Nextcloud more seriously.
In part because Google forced all its competitors out of the market, but mostly because I want to be independent of any cloudy services.

The main drawback of my existing Nextcloud setup, which I have written about here, was the missing redundancy; the nice thing about putting your stuff in the cloud is that you do not notice if one of the storage devices fails – Google takes care of providing you with a backup copy of your data.

Unfortunately, the Intel NUC based build I used, while offering great power efficiency, did not support adding a second HDD to create a fail-safe RAID1 setup. Therefore, I had to upgrade.

As I still wanted to keep things power-efficient in a small form-factor, my choice fell on the ASRock DeskMini series. Here, I went with the AMD Variant (A300) in order to avoid paying the toll of spectre mitigations with Intel (resulting in just 80% of baseline performance).

The photos above show you the size difference, which is considerable – yet necessary to cram two 2.5″ SATA drives next to each other.
Keep in mind that while the NUC devices have their CPU soldered on, the A300 uses the standardized STX form-factor, which means you can replace and upgrade the mainboard and the CPU as you wish – with a NUC you are basically stuck with what you bought initially.

The full config of the build is as follows and totals at about 270€:

  • ASRock DeskMini A300
  • AMD Athlon 3000G
  • 8GB Crucial DDR4-2666 RAM
  • WD Red SA500 NAS 500GB
  • Crucial MX500 500GB

Note that I deliberately chose SSDs from different vendors to reduce the risk of simultaneous failure.
Also, while the 3000G is not the fastest AMD CPU, it is sufficient to host Nextcloud and still a nice upgrade from the Intel Celeron I used previously.
Furthermore, its 35W TDP nicely fits the constrained cooling options. Note that you can limit Ryzen 3/5 CPUs to 35W in the BIOS as well, so there is no need to get their GE variants.
However, for a private server you probably do not need that much CPU power anyway, so just go with the Athlon 3000G for half the price.

Unfortunately, the A300 system is not designed for passive cooling and comes with a quite annoying CPU fan. To me, the fan that comes with the Athlon 3000G was less annoying, so I used that instead.
Anyway, you should set the fan speed to 0% below 50°C in the BIOS, which results in 800 RPM – inaudible, while keeping the CPU reasonably cool.

Power Consumption

As the machine will run 24/7, power consumption is an important factor. As a rule of thumb, you can calculate with 1€ per year for every watt drawn non-stop.
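
The arithmetic behind this rule is easy to verify – the electricity price below is an assumption, so plug in your own tariff:

hours_per_year = 24 * 365                  # 8760 h
kwh_per_watt_year = hours_per_year / 1000  # 1 W running all year = 8.76 kWh
price_per_kwh = 0.12                       # assumed tariff in €/kWh – adjust to yours
print(kwh_per_watt_year * price_per_kwh)   # ~1.05 € per watt-year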

The 35W TDP gives us an upper limit of what the system will consume under sustained load – however, the more interesting measure is the idle consumption, as that is the state the system will be in most of the time.

As I have already tried builds with different architectures, there are some interesting values to compare against, putting the A300 build in perspective:

Build            CPU             Idle     Load
Odroid U3        Exynos 4412     3.7 W    9 W
Gigabyte BRIX    Intel N3350     4.5 W    9.6 W
A300             Athlon 3000G    7.3 W    33.6 W

While you can obviously push the system towards 35W with multiple simultaneous users, the 7.3 W idle consumption is quite nice.
Keep in mind that the A300 was measured with two SATA drives operating as RAID1. If you only use one, you can subtract 1 W – at which point it is only 1.5 W away from the considerably weaker NUC system.

You might now wonder whether the load or the idle measure is closer to the typical consumption. For this, I measured the consumption over one day, which totalled 0.18 kWh – or 7.5 W on average.

Power optimizations

To reach that 7.3 W idle figure, you need to tune some settings though. The most important one – and luckily the easiest to fix – is using a recent kernel.
If you are on Ubuntu 18.04, upgrade to 20.04 or install the HWE kernel (5.4.0) – it saves you 4 watts (11.3 W down to 7.3 W).

To save about another 0.5 W, you can downgrade the network interface from 1 Gbit to 100 Mbit by executing

ethtool -s enp2s0 speed 100 duplex full autoneg on

Additionally, you can use Intel's powertop to tune your system settings for power saving:

powertop --auto-tune

Lecture on Augmented Reality

Due to the current circumstances, I had to record the lectures on augmented reality which I typically hold live. This was much more work than anticipated..
On the other hand, it means that I can now make them available via YouTube.

So, if you ever wanted to learn about the basic algorithms behind Augmented Reality, now is your chance.

The lecture is structured in two parts:

  • Camera Calibration, and
  • Object Pose Estimation.

You can click on the TOC links below to directly jump to a specific topic.

Camera Calibration

Object Pose Estimation

Xiaomi AirDots Pro 2 / Air2 Review

So after having made fun of people for “wearing toothbrushes”, I finally came to buy such headphones for myself.

Having used non-true-wireless Bluetooth headphones before, I was curious what the usability advantage would feel like.

Here, I went for the Xiaomi AirDots Pro 2 aka Air2, which I could grab for 399 Yuan (about 51€) – this seems like the right price point for this kind of accessory.

Keep in mind that the built-in battery only survives so many charging cycles and once it dies you can throw them away.

The initial feeling of using true wireless headphones is surprisingly relieving – there is simply no cord to untangle or to be aware of while wearing.
This is especially true during phone calls, where one needs to keep the microphone aligned.

The downside is that the headphones are too small to accommodate any buttons for volume and playback control.

The Air 2 kind of make up for it by automatically connecting to your phone once you put them on, and by automatically pausing the music when you take one out of the ear. The latter is achieved by a built-in brightness sensor.

Furthermore you have double-tap actions, which default to play/ pause on the right headphone and launching the voice assistant (e.g. Google Assistant) on the left headphone.

The battery life is stated as 4 hours per charge, with 2 extra charges in the case. I can confirm those numbers after a long-distance flight.

Compared to the Airpods 2

Looking at the feature-list above or simply at the images, the similarity to the Apple Airpods is apparent.

Out of curiosity, I borrowed some from a friend for comparison. The most important point is probably sound quality. Here, we found the two virtually indistinguishable. But keep in mind that we only did a quick test and did not use them extensively.

The second point is likely the form. Here, both earphones have the same ear-part and only differ by the shaft. So if one fits your ear, so should the other.

The shaft however is considerably wider on the Airdots. This is less apparent when viewed from the side as the thickness is similar.

For me, the more important difference is being able to control the headphones from my Android smartphone. This is currently not possible with the Airpods, while there is a way for the Airdots:

Companion app & Software integration

To control the earphones, you have to sideload the Xiao Ai Lite app. Its main purpose is to provide the Xiaomi voice assistant; the Air2 options likely just ended up there instead of in Xiaomi Home because the earphones offer an always-on assistant integration just like the Airpods.

It handles firmware updates and allows you to configure the double-tap action per earphone, as well as displaying the charging status of the earphones and the case. By default, Android will only display the charging status of the least charged earphone.

Furthermore, you can use fast-pairing if the app is running. Here, it is sufficient to hold the earphone case close to the phone and just open it; the app will ask for confirmation. This is only slightly more convenient than holding the pairing button and using the normal Bluetooth pairing procedure.

The downside is that the app is currently only available in Chinese and, consequently, the voice assistant also only works in Chinese.

Addendum: “mauronfrio” translated most of the app and provides a modified APK with English language support at XDA Developers. So there is no need to fiddle with Google Lens any more.

Below you find some views of the earphone-related settings, translated with Google Lens

I tried out some voice commands via Google Translate and everything works as it should. However, if you are not fluent in Chinese, it is far from practical. Most people should disable the assistant in the settings to avoid accidentally triggering it.

A serious advantage of Xiaomi/ Huawei phones is the availability of the LHDC Bluetooth codec, which offers superior bandwidth and latency.
While I am fine with the bandwidth provided by AAC when listening to music, there is still a noticeable and annoying delay when watching videos and playing games.

Noise Shielding

The firmware upgrade to v2.6.9.0 significantly improved the acoustic profile of the headphones by boosting the low bands (bass), and thus the general sound quality.

This results in a very noticeable noise reduction compared to v2.6.2.0 – especially in the lower frequencies; things like your footsteps get filtered out. Higher frequencies like car motor sounds are still perceivable though. This is however a good compromise for me.

Addendum: previously I stated that the headphones provide active noise cancellation. They do not. I was misled by the significantly improved noise shielding through the boosted bass.

Compared to True Wireless Earphones 2 Basic / Air 2 SE (Addendum)

Ultimately my left earphone died (see below) – likely due to a deep discharge. The second pair of the Air 2 still works though. I will update this review when I find out whether I just had bad luck or whether there is a systematic issue.

Anyway.. as a replacement, I ordered myself the Air2 SE earphones, given my phone does not support LHDC and other online reviews described them as “the Air2 without LHDC”.

Well, I guess some sites try to sell an unboxing story as a review..

As you can see on the image above the Air 2 SE are considerably longer than the normal Air 2 and also have different vents.

While they are now slightly too long for my taste, my largest complaint is the sound quality. Basically, the bass that improved sound quality and shielding on the Air2 after the firmware update is missing here. So we get no shielding and a noticeably worse sound quality.
Next, the microphones on the Air 2 SE are noticeably less sensitive, and I had to raise my voice above normal level so the people I called could hear me.
Finally, the Air2 SE are not supported by the XiaoAI app, so you get no firmware updates and no configurable tap-actions.

Given the small pricing difference between the Air 2 and the Air 2 SE, I therefore recommend always going for the former.

Also, if you encounter some sites that tell you the Air2 and Air2 SE are the same, please enable your adblocker. Especially if they jabber about different touch zones, even though the marketing material makes it quite clear that up/ middle/ down means you can touch the single zone at any position.

Charging issues

Over time, I noticed that the left earphone consistently runs out of battery before the right one.
The issue seems to be that it discharges while stored inside the case. Taking it out and putting it back in resolves the issue – but only until it is fully charged again. This makes keeping the earphones pre-charged and ready quite an issue.

Hopefully this can be addressed with a future firmware update. (reproduced with firmwares up to v2.7.1.0)

Addendum: the replacement pair (see above) does not have any charging issues. Therefore, I guess the issue was just a singular manufacturing defect.

Beyond the Raspberry Pi for Nextcloud hosting

When using Nextcloud it makes some sense to host it yourself at home to get the maximum benefit of having your own cloud.

If you used a virtual private server or shared hosting instead, your data would still be exposed to a third party, and the storage would be limited as you would have to rent it.

When setting up a server at home, one is tempted to use a Raspberry Pi or a similar ARM-based device. Those are quite cheap and consume only little power. Especially the latter property is important, as the machine will run 24/7.

I was tempted as well and started my self-hosting experience with ARM-based boards, so here are my experiences.

Do not use a Raspberry Pi for hosting

Actually, this is true for any ARM-based board. As for the Pi itself, only the most recent Pi 4B has a decent enough CPU and enough RAM to handle multiple PHP requests (WebCAL, Contacts, WebDAV) from different clients without slowdown.
Also, only with the Pi 4B can you properly attach storage over USB 3.0 – previously, transfer rates were limited by the USB 2.0 bus.

One might argue that other ARM-based computers are better suited. Indeed, you could get the decently equipped Odroid U3 long before the Pi 4B was available.
However, non-Pi boards have their own set of problems. Typically, they are based on a smartphone design (e.g. the Odroid U3 essentially is a Galaxy Note 2).

This leaves them plagued by the same update issues as Android: the boards require a custom kernel that includes board-specific patches, which means you cannot just grab an Ubuntu ARM build.
Instead, you have to wait for a special image from the vendor – and just as with Android, at some point there will be no more updates.

Furthermore, ARM boards are actually not that cheap. While the Pi board itself is indeed inexpensive at ~60€, you still have to add a power supply, housing and storage.

Intel NUC devices are a great choice

While everyone was looking at cheap and efficient ARM-based boards, Intel released some great competitors in its NUC lineup.
Those went largely unnoticed, as typically only the high-end NUCs get news coverage – it is more impressive to report how much power one can cram into a small form-factor.

However, one can obviously also put only a little power in there – more precisely, Intel's tablet Celeron chips, which range around 4-6 W TDP and thus compete with ARM boards power-wise. (Still, they are an order of magnitude faster than a Raspberry Pi.)

Device           Power (idle)    Power (load)
Odroid U3        3.7 W           9 W
GB-BPCE-3350C    4.5 W           9.6 W

Here, you get the advantages of the mature x86 platform, namely interchangeable RAM, interchangeable WiFi modules, SATA & M.2 SSD ports and, notably, upstream Linux compatibility (and Windows, for that matter).

As you might have guessed from the hardware choice above, I made the switch some time ago already. On the one hand, this means you only get reports for the by now outdated N3350 CPU – but on the other hand, it makes this a long-term evaluation.

Regarding the specific NUC model, I went with the Gigabyte GB-BPCE-3350C, which is less expensive (currently priced around 90€) than the Intel models.

Consequently, the C probably stands for “cheap”, as it lacks a second SO-DIMM slot and an SD-card reader. However, it is fanless and thus perfectly fine for hosting.

So after two years of usage and a successful upgrade between two Ubuntu LTS releases, I can report that switching to the x86 platform was worth it.

If anything, I would choose a NUC model that also supports M.2/ M-Key in addition to SATA, to build a software RAID1.

Ubuntu on the Lenovo D330

The Lenovo D330 2-in-1 convertible (or netbook, as we used to say) is a quite interesting device. It is based on Intel's current low-power core platform, Gemini Lake (GLK), and thus offers great battery life and a fanless design.

This is similar to what you would get from an ARM-based tablet. However, the device being x86-based and Windows-focused, we can expect to get Ubuntu Linux running – without requiring any out-of-tree drivers or custom kernels that never get updated, as we are used to from the ARM world.
This post will be about my experiences doing so.

For this, I will use the most recent Ubuntu 19.04 release, as it contains fractional scaling support, which is essential for a 10″ 1920x1200px device. Also, compared to the 18.04 LTS release, the orientation sensor (mostly) works out of the box.


Do not use Meson

Recently the Meson Build System gained some momentum. It is time to stop that.
Not that Meson is a bad piece of software – on the contrary, it is quite well designed.
Still it makes building C/C++ applications worse, by (quoting xkcd) basically creating this:

It sets out to create a cross-platform, more readable and faster alternative to autotools. But there already is CMake, which solves exactly this.

You might say that CMake is ugly, but note that the CMake 2.x you might have tried is not the same CMake 3.x that is available today. Many patterns have improved and are now both more logical and more readable.

Nowadays the difference between Meson and CMake is just a matter of syntactic preference. The Meson authors seem to agree here.

The actual criterion for selecting a build system however should be tooling support and community spread. CMake easily wins here:

After the introduction of the server mode, it got native support in QtCreator, CLion, Android Studio (NDK) and even Microsoft's Visual Studio. Native means that you do not have to generate any intermediate project files; the CMakeLists.txt is used directly by the IDE.

On the community spread side, we have e.g. KDE, OpenCV, zlib, libpng, freetype and, as of recently, Boost. These projects using CMake not only guarantees that you can easily use them, but also that you can include them in your build via add_subdirectory, such that they become part of your project. This is especially useful if you are cross-compiling – for instance, to a Raspberry Pi.

On the other hand, reinventing a wheel that is tailored to the needs of a specific community (Gnome), means that it will fall behind and eventually die. This is what is currently happening to the Vala language that had a similar birth to Meson.

The Meson devs might object that Meson generates build files that run faster on a Raspberry Pi. However, if your cross-compilation works, you do not need that. And honestly, that particular improvement could also have been achieved by providing a patch to the CMake Ninja generator..

Addendum 27-11-2018
Stumbled over another great guide to modern CMake.

Addendum 15-06-2018
A new guide for CMake called CGold can be found here, which is of comparable quality to the Meson docs.

Addendum 4-1-2018
Some comments (rightfully) note that Meson generally has better documentation and avoids some of CMake's pitfalls. However, this is mostly because Meson has not been around long enough for the way you do things to have changed, nor has it seen such widespread use as CMake yet. (Think of corner cases.)

But even if you argue that this is precisely why you should use Meson, I would argue that improving the existing documentation in CMake and adding more educational warnings is easier than writing something from scratch.

Addendum 29-02-2019
Part of the perceived superiority of Meson was that it simply had not been in use long enough for its flaws to surface – contrary to the long-lasting legacy of CMake.
With its adoption, things like having only one global namespace are starting to get attention – things that are already solved in CMake..

Creating PyGTK app snaps with snapcraft

Snap is a new packaging format introduced by Ubuntu as a successor to dpkg aka the Debian package. It offers sandboxing and transactional updates, and thus is a competitor to the flatpak format and resembles docker images.

As with every new technology the weakest point of working with snaps is the documentation. Your best bet so far is the snappy-playpen repository.

There are also some rough edges regarding desktop integration and Python interoperability, so this is what this post will be about.

I will introduce some quirks that were needed to get teatime running, which is written in Python3 and uses Unity and GTK3 via GObject introspection.

The most important thing to be aware of is that snaps are similar to containers in that each snap has its own rootfs and only restricted access outside of it. This is basically what the sandboxing is about.
However a typical desktop application needs to know quite a lot about the outside world:

  • It must know which theme the user currently uses, and it also needs access to the theme files.
  • For saving anything, it needs access to /home.
  • If it should access the internet, it needs system-level access as well, e.g. for querying whether there actually is an active internet connection.

To declare that we want to write to home, play back sound and use Unity features, we use the plugs keyword like this:

apps:
    teatime:
        # ...
        plugs: [unity7, home, pulseaudio]

However, we must also tell our app to look for the according libraries inside its snap instead of the system paths. For this, one must manually change quite a few environment variables. Fortunately, Ubuntu provides wrapper scripts that take care of this for us; they are called desktop-launchers.

To use the launcher that configures the GTK3 environment, we have to extend the teatime part like this:

apps:
    teatime:
        command: desktop-launch $SNAP/usr/share/teatime/teatime.py
        # ...
parts:
    teatime:
        # ...
        after: [desktop/gtk3]

The desktop-launch script takes care of telling PyGTK where the GI repository files are.

You can see the full snapcraft.yml here.

Update:

Before my fix, one had to use this rather lengthy startup command

env GI_TYPELIB_PATH=$SNAP/usr/lib/girepository-1.0:$SNAP/usr/lib/x86_64-linux-gnu/girepository-1.0 desktop-launch $SNAP/usr/share/teatime/teatime.py

which hard-coded the architecture.

End Update

After this, teatime will start, but the paths still have to be fixed. Inside a snap, “/” still refers to the system root, so all absolute paths must be prefixed with $SNAP.
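
For a Python app like teatime, this boils down to something like the following sketch (the resource path is a hypothetical example):

import os

# inside a snap, the SNAP environment variable points to the snap's read-only rootfs;
# outside of a snap it is unset, so fall back to the system root
SNAP = os.environ.get("SNAP", "/")
icon_path = os.path.join(SNAP, "usr/share/teatime/icon.svg")  # hypothetical resource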

Actually I think the design of flatpak is more elegant in this regard where “/” points to the local rootfs and one does not have to change absolute paths. To bring in the system parts flatpak uses bind mounts.

Conclusions

Once you get the hang of how snaps work, packaging becomes quite straightforward. However, there are currently still some drawbacks:

  • the snap package results in 120MB compared to a 12KB deb. This is actually a feature, as shipping all the dependencies makes the snap installable on every Linux distribution. However, I hope that we can get this down by introducing shared frameworks (like GTK3, Python) that do not have to be included in the app package.
  • [Update: fixed] Due to another issue, your snap has only the C locale available and thus will not be localized.
  • [Update: fixed] Unity desktop notifications do not work. You will get a DBus exception at the corresponding call.
  • [Update: fixed] The shipped .desktop file is not hooked up with the system, so you can only launch the app via the command line.