Alright, so here’s something I didn’t think was actually possible: binary literals in standard C, built entirely out of macro abuse. (The preprocessor trickery is all ANSI C; only the <stdint.h> typedefs ask for C99.) I’ve tried to do something like this before, but I ran into limitations with the preprocessor and couldn’t figure out a way to do it. However, someone appears to have found a solution over here, and I modified it a little to fit my stylistic desires. Here it is, in modified form:
#include <stdint.h>

/* Paste a run of binary digits onto a 0x prefix, so that e.g.
   10101010 becomes the hex constant 0x10101010LU. */
#define HEXIFY_(n) 0x##n##LU

/* Collapse each nibble of that constant back into a single bit:
   !! maps any nonzero nibble to 1, and the shift puts it in place. */
#define B8_(n) \
((!!(n & 0x0000000FLU) << 0) | \
 (!!(n & 0x000000F0LU) << 1) | \
 (!!(n & 0x00000F00LU) << 2) | \
 (!!(n & 0x0000F000LU) << 3) | \
 (!!(n & 0x000F0000LU) << 4) | \
 (!!(n & 0x00F00000LU) << 5) | \
 (!!(n & 0x0F000000LU) << 6) | \
 (!!(n & 0xF0000000LU) << 7))
#define B8(n) ((uint8_t)B8_(HEXIFY_(n)))

/* The wider variants take one eight-digit group per byte, most
   significant first. The uint32_t casts in B32 keep the top byte's
   shift from pushing a 1 into the sign bit of a promoted int,
   which would be undefined behavior. */
#define B16(a, b) ((uint16_t)((B8(a) << 8) | B8(b)))
#define B32(a, b, c, d) (((uint32_t)B8(a) << 24) | \
                         ((uint32_t)B8(b) << 16) | \
                         ((uint32_t)B8(c) << 8)  | \
                         ((uint32_t)B8(d) << 0))
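To see why this works, trace one expansion by hand (the intermediate forms below are my own annotation, not part of the code):

/* B8(10101010)
   -> (uint8_t)B8_(HEXIFY_(10101010))
   -> (uint8_t)B8_(0x10101010LU)        token pasting: 0x ## 10101010 ## LU
   -> each nonzero nibble of 0x10101010LU contributes one set bit,
      so bits 7, 5, 3, and 1 are set
   -> (uint8_t)0xAA */

Each binary digit of the argument lands in its own nibble of the hex constant, so a digit is 1 exactly when its nibble is nonzero; the !! and shift machinery just reassembles those tests into a byte.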
Pretty interesting code, for all the evil of C macro abuse.
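For completeness, here’s a quick usage sketch (my own, assuming the macros above are in scope). Note that leading zeros are harmless: the digits are pasted into a hex literal, never parsed as octal.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t  flags = B8(10101010);                                /* 0xAA */
    uint16_t mask  = B16(11110000, 00001111);                     /* 0xF00F */
    uint32_t word  = B32(10000000, 00000000, 00000000, 00000001); /* 0x80000001 */

    /* prints: AA F00F 80000001 */
    printf("%02X %04X %08lX\n", (unsigned)flags, (unsigned)mask, (unsigned long)word);
    return 0;
}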