If the input string, with leading whitespace and possible +/- signs removed, begins with 0x or 0X (a zero, followed by lowercase or uppercase X), radix is assumed to be 16 and the rest of the string is parsed as a hexadecimal number.
If the input string begins with any other value, the radix is 10 (decimal).
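To make those rules concrete, here is a minimal sketch in plain JavaScript (assuming a modern engine; the expected results are shown in comments). Passing the radix explicitly is the safest way to sidestep the prefix detection entirely.

```js
// Radix detection based on the string prefix, per the rules quoted above.
parseInt("0x1F");      // 31  -> "0x" prefix, parsed as hexadecimal
parseInt("  -0X1f");   // -31 -> leading whitespace and sign are stripped first
parseInt("1F");        // 1   -> no prefix, radix 10; parsing stops at "F"
parseInt("0x1F", 10);  // 0   -> explicit radix 10; parsing stops at "x"
parseInt("08");        // 8   -> decimal in modern engines (older ones sometimes guessed octal)
```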
Hell, no. If I wanted to save bytes, I’d use a binary format, or just fucking zip the JSON. Looking at a request-response pair and quickly understanding the transferred data is invaluable.
Speaking at a software conference in 2009, Tony Hoare hyperbolically apologized for "inventing" the null reference:[26][27]
I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.