The definition of “randomness” is “you don’t know what’s coming next”.
How do you prove you don’t know something? You can’t. There are various statistical tests for randomness, but remember that a suitably encrypted message can pass every one of them, and a person who knows the message
knows that the bitstream is not truly random.
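That claim is easy to demonstrate. Here is a minimal sketch in Python (construction mine, purely illustrative) of a monobit frequency test: a fully deterministic stream built from a hash over a counter passes it just as comfortably as output from the OS entropy pool, even though anyone holding the seed can predict every bit.

```python
import hashlib
import os

def bit_balance(data: bytes) -> float:
    """Fraction of 1-bits in the stream; ~0.5 for random-looking data."""
    ones = sum(bin(b).count("1") for b in data)
    return ones / (len(data) * 8)

def hash_counter_stream(seed: bytes, n: int) -> bytes:
    """Deterministic stream: SHA-256 over seed||counter, a toy CTR-style keystream."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

true_random = os.urandom(1 << 16)
predictable = hash_counter_stream(b"known seed", 1 << 16)

# Both hover near 0.5; the frequency test cannot tell them apart.
print(bit_balance(true_random), bit_balance(predictable))
```

The same holds for more elaborate test batteries: they measure the absence of detectable patterns, not the absence of knowledge.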
On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:
> The definition of “randomness” is “you don’t know what’s coming next”.
> How do you prove you don’t know something? You can’t. There are various
> statistical tests for randomness, but remember that a suitably encrypted
> message can pass every one of them, and a person who knows the message
> knows that the bitstream is not truly random.
Here’s an even simpler proof, by reductio ad absurdum.
Suppose you have a sequence of numbers which is provably random. Simply pregenerate a large bunch of numbers according to that sequence, and store them. Then supply them one by one to another party. The other party doesn’t know what’s coming next, but you do. Therefore they are not random
to you.
Which contradicts the original assumption of provable randomness. QED.
Random is without a predictable pattern or plan.
A lot of people are pushing QRNGs of various kinds right now. I’ve yet
to be convinced, personally.
> A lot of people are pushing QRNGs of various kinds right now. I've yet
> to be convinced, personally.
Talked to a guy at MIT in the 90s who was trying to extract random
numbers from the turbulence of gas surrounding a hard drive. Never
learned the tech or theoretical details -- above my amateur pay grade.
On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:
> Random is without a predictable pattern or plan.
Let’s say I collect and store a sequence that meets your definition. Then I play it back when you ask me for a random number sequence. Does it still meet your definition? If not, what has changed?
On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:
>> The definition of “randomness” is “you don’t know what’s coming next”.
>> How do you prove you don’t know something? You can’t. There are various
>> statistical tests for randomness, but remember that a suitably encrypted
>> message can pass every one of them, and a person who knows the message
>> knows that the bitstream is not truly random.
> Here’s an even simpler proof, by reductio ad absurdum.
> Suppose you have a sequence of numbers which is provably random. Simply
> pregenerate a large bunch of numbers according to that sequence, and
> store them. Then supply them one by one to another party. The other
> party doesn’t know what’s coming next, but you do. Therefore they are
> not random to you.
> Which contradicts the original assumption of provable randomness. QED.
From the original article:
As deterministic systems, classical computers cannot create true
randomness on demand. As a result, to offer true randomness in
classical computing, we often resort to specialized hardware that
harvests entropy from unpredictable physical sources, for instance,
by looking at mouse movements, observing fluctuations in
temperature, monitoring the movement of lava lamps or, in extreme
cases, detecting cosmic radiation. These measures are unwieldy,
difficult to scale and lack rigorous guarantees, limiting our
ability to verify whether their outputs are truly random.
Physical sources can be found in pretty much every commodity CPU for the
last decade. So not that "difficult to scale", apparently.
There's also an interesting paper by Anna Johnston on entropy, in which
she makes the (correct, in my opinion) remark that entropy really is a relative notion.
I get the feeling here that, by the same token, you could never have a provably secure cryptosystem because someone knows the private key?
On 29 Mar 2025 18:38:08 -0300, Mike Spencer wrote:
> Talked to a guy at MIT in the 90s who was trying to extract random
> numbers from the turbulence of gas surrounding a hard drive. Never
> learned the tech or theoretical details -- above my amateur pay grade.
That is in production use today. I believe it's a standard part of the entropy-gathering process in the Linux kernel.
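The kernel's pooled result is what applications actually consume; a minimal sketch in Python (the `kernel_random` helper name is mine) of drawing from that pool, via getrandom(2) where Python exposes it:

```python
import os

def kernel_random(n: int) -> bytes:
    """Read n bytes from the OS entropy pool.

    On Linux, os.getrandom() wraps the getrandom(2) syscall, whose pool
    is fed by interrupt and device timing; os.urandom() is the portable
    fallback with equivalent behaviour on modern kernels."""
    if hasattr(os, "getrandom"):
        return os.getrandom(n)
    return os.urandom(n)

print(kernel_random(32).hex())
```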
As a tech and math amateur, I made a setup to try to extract random
numbers from serial images of a plasma ball taken by a consumer-grade
web cam.
Simple circuits using the (ancient) 2N3904 transistor abound on the
internet, and pre-date it as well.
Here's a newer circuit design specifically for battery-powered
cryptographic use and with lots of analysis and comparison with
another circuit:
https://betrusted.io/avalanche-noise
None of it requires cutting-edge technology. The main issue in the
past has simply been that it wasn't part of the original PC
architecture, so things like "looking at mouse movements" needed to
be done at first until it was added to modern hardware.
Exactly! All the stuff about lava lamps, helium motion inside hard
disks, etc is just gimmicks. Real random number generators are tiny
electronic components built into CPUs, HSMs, etc.
Richard Kettlewell <invalid@invalid.invalid> writes:
> Exactly! All the stuff about lava lamps, helium motion inside hard
> disks, etc is just gimmicks. Real random number generators are tiny
> electronic components built into CPUs, HSMs, etc.
Strictly I should probably say “entropy sources”, since there’s
generally a DRBG between the electronics and the application, as well.
On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:
> There's also an interesting paper by Anna Johnston on entropy, in which
> she makes the (correct, in my opinion) remark that entropy really is a
> relative notion.
That makes sense. I’ve long thought that one’s estimates of the probabilities of various events depend very much on one’s point of view.
> I get the feeling here that, by the same token, you could never have a
> provably secure cryptosystem because someone knows the private key?
None of our cryptosystems are provably secure.
RSA depends on the assumed difficulty of factorizing large integers,
while related systems such as Diffie-Hellman depend on computing
discrete logarithms; each would break if its underlying problem were
solved. There is no proof that either of these problems is actually
hard: we simply don’t know of any good algorithms for them, after
decades, even centuries, of looking.
The definition of ``probability'' (in the sense of how to interpret it)
is sort of an open problem.
Thus far we have interpreted the probability of an event of a given
experiment as being a measure of how frequently the event will occur
when the experiment is continually repeated.
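That frequency reading is at least easy to illustrate numerically. A quick sketch in Python (simulation mine, seed fixed only for repeatability): the running relative frequency of a fair coin drifts toward 1/2 as trials accumulate, though of course nothing in a finite simulation proves the limit exists.

```python
import random

random.seed(12345)  # fixed seed so the illustration is repeatable

flips = [random.randint(0, 1) for _ in range(100_000)]

def running_frequency(outcomes, checkpoints):
    """Relative frequency of 1s after each checkpoint number of trials."""
    return {n: sum(outcomes[:n]) / n for n in checkpoints}

# The estimates tighten around 0.5 as the trial count grows.
freqs = running_frequency(flips, [100, 10_000, 100_000])
print(freqs)
```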
On 30 Mar 2025 09:31:01 +1000, Computer Nerd Kev wrote:
> The main issue in the past has simply been that it wasn't part of
> the original PC architecture, so things like "looking at mouse
> movements" needed to be done at first until it was added to modern
> hardware.
The trouble with building in a purported random-number source is: how can you be sure you can trust it?
Intel added random-number generation instructions to the x86 architecture; but how can we be sure they work as they're advertised?
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
> On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:
>> Random is without a predictable pattern or plan.
> Let’s say I collect and store a sequence that meets your definition.
> Then I play it back when you ask me for a random number sequence. Does
> it still meet your definition? If not, what has changed?
Because you have stored it, it is predictable by you and you have a
plan.
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
> Intel added random-number generation instructions to the x86
> architecture; but how can we be sure they work as they're advertised?
How can you be sure anything works as advertised?
The definition of ``probability'' (in the sense of how to interpret
it) is sort of an open problem.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
> On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:
>> I get the feeling here that, by the same token, you could never have a
>> provably secure cryptosystem because someone knows the private key?
> None of our cryptosystems are provably secure.
One example of a provably secure system is the one-time pad.
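The pad mechanics themselves are trivial; a sketch in Python (function names mine, purely illustrative). Note that the security argument lives entirely in where `pad` comes from, which is exactly the question under dispute here.

```python
import os

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message with a pad of equal length, used exactly once."""
    if len(pad) != len(message):
        raise ValueError("pad must match message length exactly")
    return bytes(m ^ p for m, p in zip(message, pad))

# Decryption is the same XOR, since (m ^ p) ^ p == m.
otp_decrypt = otp_encrypt

message = b"attack at dawn"
pad = os.urandom(len(message))  # the pad is only as good as this source
ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message
```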
|In spite of its superficial appeal, the limit frequency
|interpretation has been widely discarded, primarily because there
|is no assurance that the above limit really exists for the actual
|sequences of events to which one wishes to apply probability
|theory.
|
"Quantum Mechanics" (1998) - Leslie E. Ballentine
On Sun, 30 Mar 2025 11:19:00 -0300, Ethan Carter wrote:
> Lawrence D'Oliveiro <ldo@nz.invalid> writes:
>> On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:
>>> I get the feeling here that, by the same token, you could never have a
>>> provably secure cryptosystem because someone knows the private key?
>> None of our cryptosystems are provably secure.
> One example of provably secure system is the one-time pad.
But it’s not. Where do you get the pad from? Proof of security of the system relies on proof of the randomness of the pad. Which takes us back
to square one.
On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:
> |In spite of its superficial appeal, the limit frequency
> |interpretation has been widely discarded, primarily because there
> |is no assurance that the above limit really exists for the actual
> |sequences of events to which one wishes to apply probability
> |theory.
> |
> "Quantum Mechanics" (1998) - Leslie E. Ballentine
Discarded or not, it’s the definition used in gambling. In other words, people literally bet money on it.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
> On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:
>> |In spite of its superficial appeal, the limit frequency
>> |interpretation has been widely discarded, primarily because there
>> |is no assurance that the above limit really exists for the actual
>> |sequences of events to which one wishes to apply probability
>> |theory.
>> |
>> "Quantum Mechanics" (1998) - Leslie E. Ballentine
> Discarded or not, it’s the definition used in gambling. In other words,
> people literally bet money on it.
Discarded in its theoretical use, which is where the discussion is. I
think nearly nobody disputes how useful the limit-frequency
interpretation is.
I think your ``square one'' is that no system is provably secure.
Richard Kettlewell <invalid@invalid.invalid> wrote:
> Richard Kettlewell <invalid@invalid.invalid> writes:
>> Exactly! All the stuff about lava lamps, helium motion inside hard
>> disks, etc is just gimmicks. Real random number generators are tiny
>> electronic components built into CPUs, HSMs, etc.
> Strictly I should probably say “entropy sources”, since there’s
> generally a DRBG between the electronics and the application, as
> well.
The problem with those genuine random number generators is that they
are usually comparatively slow. They take milliseconds to spit out a
number, sometimes tens or even hundreds of them. So we use the
genuine RNG to seed a PRNG in situations where we don't need complete
randomness but need pretty good randomness and need a lot of it fast.
Knuth has a discussion of this.
--scott
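That seed-then-expand pattern is easy to sketch. Here is a toy version in Python using a hash-chain generator as a stand-in for a real DRBG (the class name and construction are illustrative only, not a vetted design -- production systems use something like the NIST SP 800-90A mechanisms):

```python
import hashlib
import os

class ToyHashDRBG:
    """Expand a short true-random seed into a long pseudorandom stream.

    Illustrative only: a real DRBG adds reseeding, backtracking
    resistance, and a standardized construction."""

    def __init__(self, seed: bytes):
        self.state = hashlib.sha256(seed).digest()

    def generate(self, n: int) -> bytes:
        out = bytearray()
        while len(out) < n:
            self.state = hashlib.sha256(self.state).digest()
            out += self.state
        return bytes(out[:n])

drbg = ToyHashDRBG(os.urandom(32))   # slow, genuine entropy: 32 bytes
stream = drbg.generate(1 << 20)      # fast pseudorandom output: 1 MiB
```

The asymmetry is the whole point: a few dozen bytes from the slow hardware source are enough to key megabytes per second of pseudorandom output.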
kludge@panix.com (Scott Dorsey) wrote:
> Richard Kettlewell <invalid@invalid.invalid> wrote:
>> Richard Kettlewell <invalid@invalid.invalid> writes:
>>> Exactly! All the stuff about lava lamps, helium motion inside hard
>>> disks, etc is just gimmicks. Real random number [generators] are tiny
>>> electronic components built into CPUs, HSMs, etc.
>> Strictly I should probably say “entropy source”, since there’s
>> generally a DRBG between the electronics and the application, as
>> well.
> The problem with those genuine random number generators is that they
> are usually comparatively slow. They take milliseconds to spit out a
> number, sometimes tens or even hundreds of them. So we use the
> genuine RNG to seed a PRNG in situations where we don't need complete
> randomness but need pretty good randomness and need a lot of it fast.
> Knuth has a discussion of this.
im no expert but can't you just amplify thermal (white) noise and just
sample it? it's completely random.
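Raw amplified noise is typically biased and correlated, so it needs conditioning before use. The classic debiasing step is von Neumann's extractor, sketched here in Python against a simulated biased sampler (the 70/30 bias is invented purely for illustration):

```python
import random

def von_neumann_extract(bits):
    """Debias a bit stream: take non-overlapping pairs, emit 0 for the
    pair (0, 1) and 1 for (1, 0), and discard (0, 0) and (1, 1).
    Output is unbiased if input bits are independent, at the cost of
    throwing most of the stream away."""
    out = []
    it = iter(bits)
    for a, b in zip(it, it):
        if a != b:
            out.append(a)
    return out

random.seed(7)  # fixed seed so the illustration is repeatable
# Simulated noise sampler with a strong 70/30 bias toward 1.
raw = [1 if random.random() < 0.7 else 0 for _ in range(200_000)]
clean = von_neumann_extract(raw)
print(sum(raw) / len(raw), sum(clean) / len(clean))
```

Real designs go further, hashing the conditioned bits through a cryptographic extractor, since physical samples are rarely independent either.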