Saturday 26 November 2016

Installing Lua LGI on macOS

I wanted to see how easy it would be to build some very simple GUI apps in Lua. As I already have GTK+3 installed on my machine, I thought it would be a good idea to try to use it from Lua. LGI is a gobject-introspection-based dynamic Lua binding to GObject-based libraries, including GTK+3. It can be installed via Luarocks.

A simple attempt at installing LGI fails with an error because Luarocks can't find the libffi library that LGI requires (Homebrew installs libffi keg-only, so pkg-config doesn't see it by default). It is a similar problem to the one I encountered when installing the Ruby GTK+3 gem, and applying a similar solution to the one kindly supplied by the Ruby-GNOME2 team did the trick.

This is the command that worked:


  sudo PKG_CONFIG_PATH=/usr/local/opt/libffi/lib/pkgconfig luarocks-jit install lgi
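
Once that completes, a quick way to check that the binding actually loads (assuming luajit is on your PATH) is:

  luajit -e "require 'lgi'"

If that exits silently, LGI is installed correctly.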

Notes:

I installed GTK+3 using Homebrew
I am using LuaJIT 2.0.4
I am using luarocks-jit 2.3.0


Sunday 13 November 2016

Swift Int32.divideWithOverflow doesn't always actually protect against overflows!!

Good news update.

I've started looking into Swift with the intention of using it to build personal applications running under macOS. I needed to generate some tests of Red/System's integer processing. I would normally generate such tests in Rebol or Ruby but, as Red/System integers overflow rather than raising exceptions, I needed to use a language that supports overflowing integers.

Swift has a collection of <operation>WithOverflow functions, which return the wrapped result together with an overflow flag rather than trapping. So I went ahead and wrote a Swift program to generate the tests. Then I came across a problem: I got an overflow exception when using divideWithOverflow. It seemed strange. Here is the culprit:

Int32.divideWithOverflow(Int32.min, -1)

Playground execution failed: error:myPlayground.playground:31:7: 
error: division '-2147483648 / -1' results in an overflow
Int32.divideWithOverflow(Int32.min, -1)

I checked on the swift-users mailing list whether that is the expected behaviour. Apparently it is. Perhaps the name of the function needs to be updated to divideWithOverflowExceptWhenDividingMinByMinusOne? ;-)

As a sanity check, I wrote a very simple C program to check what should be returned when dividing -2147483648 by -1.

Here's the code:
#include <stdio.h>
#include <stdint.h>  /* for int32_t */

int main(int argc, const char * argv[]) {

    int32_t i = -2147483648 / -1;
    printf("-2147483648 / -1 = %d\n", i);

    return 0;
}

Here's the output:
-2147483648 / -1 = -2147483648

Based on that, I went ahead and wrote the test generator in C. I got quite a surprise when performing the calculations: an overflow exception. It seems that the compiler is happy when the operands are integer literals but not when they are 32-bit integer variables. The following code gives an overflow exception. (Funnily enough, it doesn't when multiplying.)

This code

  #include <stdio.h>
  #include <stdint.h>  /* for int32_t */

  int main(int argc, const char * argv[]) {

      int32_t i = -2147483648;
      int32_t j = -1;
      int32_t k = i * j;
      printf("-2147483648 * -1 = %d\n", k);
      int32_t m = i / j;
      printf("-2147483648 / -1 = %d\n", m);

      return 0;
  }

prints

  -2147483648 * -1 = -2147483648

and then fails with an overflow exception.
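
The literal version works because -2147483648 is really the unary minus applied to the constant 2147483648, which doesn't fit in a 32-bit int, so the compiler gives the constant a wider type and the division is done in 64-bit arithmetic (typically folded at compile time); truncating the 64-bit result back to int32_t is where the -2147483648 in the earlier output comes from. A quick way to see the literal's wider type is to print its size (a minimal sketch; on a 64-bit system I'd expect it to print 8):

  #include <stdio.h>

  int main(int argc, const char * argv[]) {

      /* 2147483648 doesn't fit in a 32-bit int, so the constant
         expression -2147483648 gets a wider (64-bit) type. */
      printf("sizeof(-2147483648) = %zu\n", sizeof(-2147483648));

      return 0;
  }

With genuine int32_t variables, though, the division has to happen at run time.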

Thanks to Nenad Rakocevic, the creator of the Red language, I found that the compilers are delegating the division to the CPU, so the behaviour is CPU dependent: x86's idiv instruction raises a divide error when the quotient doesn't fit in the destination (whereas imul simply wraps, which is why the multiplication succeeded). The Intel processor in my laptop raises the overflow error, whilst the ARM processor in a Raspberry Pi doesn't (though it does give somewhat inconsistent results).

So I've learnt that if there is even a small chance that you could be dividing -2147483648 by -1 in a program with "overflowing" integers, you need to protect against it.
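
A minimal sketch of such a guard in C (the helper name wrapping_div32 is my own invention): besides division by zero, INT32_MIN / -1 is the only 32-bit quotient that can trap, so special-casing it and returning the two's-complement wrap-around value is enough.

  #include <stdio.h>
  #include <stdint.h>

  /* Divide with wrap-around semantics: the one trapping case,
     INT32_MIN / -1, yields INT32_MIN instead of an exception. */
  int32_t wrapping_div32(int32_t a, int32_t b) {
      if (a == INT32_MIN && b == -1) {
          return INT32_MIN;
      }
      return a / b;
  }

  int main(int argc, const char * argv[]) {

      printf("-2147483648 / -1 = %d\n", wrapping_div32(INT32_MIN, -1));

      return 0;
  }

This prints -2147483648 / -1 = -2147483648 rather than raising an overflow exception.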