What the heck is a real life number?
An actual measured data point, as opposed to a randomly generated number. Also, this principle applies specifically to the first digit. Overall the title is a complete mess.
Basically, when you gather a bunch of data points about real world quantitative phenomena (e.g. town population, lake surface area, etc), you find this distribution curve of leading digits where 1 is the most frequent at about 30%, with frequencies gradually decreasing down to 9 as the least frequent.
This is called Benford's Law; it's basically an emergent property of how orders of magnitude work. It's useful for detecting fake data: if your data faker doesn't know about it, they'll generate numbers that look random but don't follow this distribution.
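The exact expected frequencies have a simple closed form. A minimal Python sketch of the standard formula:

```python
import math

# Benford's Law: P(leading digit = d) = log10(1 + 1/d)
for d in range(1, 10):
    print(f"{d}: {math.log10(1 + 1/d) * 100:.1f}%")
# 1 comes out to about 30.1%, falling steadily to about 4.6% for 9
```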
something that isn’t an imaginary life number
This is used to catch tax fraud. People who forge receipts tend to use random numbers, so their leading digits stand out as outliers, and they get caught that way.
Title needs some work. I would suggest:
TIL about Benford's law: in many real-life data sets, the leading digit is '1' about 30% of the time.
Also, you could've included the wiki link in the post, so people could read up on what you just wrote:
This is a bit weird. I was just listening to Infinity 2 today (great book. Totally recommend), and there’s a section where the characters use Benford’s Law to prove reality. I then had to look it up myself.
Just a super weird coincidence…unless Lemmy is listening to me…
We are not listening to you Travis.
That had a 1 in a million chance, but I had to try.
It was worth the shot if you ask me, Michael
This is called the Baader–Meinhof phenomenon, or frequency illusion.
but… why?
https://en.m.wikipedia.org/wiki/Benford's_law
Look at the logarithmic scale. This law applies to number sets in the wild, where the data tends to be spread roughly evenly across orders of magnitude. If you look at the distribution of those numbers on a logarithmic scale, they are evenly distributed. If you looked at the same numbers on a linear scale, they would become more and more sparse as they grow in size.
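A quick way to see this: take a sequence that's evenly spaced on a log scale, like successive powers of 2, and its leading digits land almost exactly on Benford's distribution. A sketch (the 1000-term cutoff is just an arbitrary choice for illustration):

```python
import math
from collections import Counter

# Leading digits of 2^0 .. 2^999 — evenly spaced on a log scale
leads = Counter(int(str(2 ** n)[0]) for n in range(1000))

for d in range(1, 10):
    observed = leads[d] / 1000
    expected = math.log10(1 + 1 / d)
    print(f"{d}: observed {observed:.3f}, Benford predicts {expected:.3f}")
```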
Digits, not numbers.
Does anybody know if this is a feature of a decimal system?
The distribution shown in this post is for base 10, but Benford's Law has distributions for other bases too. The wiki article linked in another comment goes into detail on that.
The percentages change. In the extreme case of binary, every number that isn't 0 itself starts with a 1.
This fact is actually used to save one bit in the format that computers usually use to store floating point (fractional instead of integer) numbers.
If you were in base 12 or something, it would still lean towards 1, but the percentages would be a little different.
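The same formula works in any base b: P(d) = log_b(1 + 1/d) for digits 1 through b-1. A quick sketch for base 12:

```python
import math

b = 12
# Generalized Benford: P(leading digit = d) = log_b(1 + 1/d)
for d in range(1, b):
    print(f"{d}: {math.log(1 + 1/d, b) * 100:.1f}%")
# Digit 1 still dominates, at roughly 27.9% instead of base 10's 30.1%
```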