• Truly Random Numbers On A Quantum Computer??

    From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Fri Mar 28 21:16:29 2025
    From Newsgroup: comp.misc

    These researchers claim to have a technique, based on quantum
    computing, that can generate provably random numbers <https://www.csoonline.com/article/3855710/researchers-claim-their-protocol-can-create-truly-random-numbers-on-a-current-quantum-computer.html>.

    Trouble is, there ain’t no such thing. This part doesn’t make any
    sense:

    Then, to verify that true random numbers had been generated, the
    randomness of the results was mathematically certified to be
    genuine using classical supercomputers at the US Department of
    Energy.

    The definition of “randomness” is “you don’t know what’s coming next”.
    How do you prove you don’t know something? You can’t. There are
    various statistical tests for randomness, but remember that a suitably encrypted message can pass every one of them, and a person who knows
    the message knows that the bitstream is not truly random.
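    A toy illustration of that point, using only the standard library (SHA-256 in counter mode stands in for a real cipher, and a simple monobit count stands in for a real test battery — both are stand-ins, not anyone's actual protocol):

    ```python
    # A deterministic, key-derived bitstream passes a basic statistical
    # randomness check, even though whoever holds the key can reproduce
    # every bit.  SHA-256 in counter mode plays the role of "a suitably
    # encrypted message"; the monobit test plays the role of a test suite.
    import hashlib

    def keyed_stream(key: bytes, nbytes: int) -> bytes:
        out = bytearray()
        counter = 0
        while len(out) < nbytes:
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(out[:nbytes])

    def monobit_ok(data: bytes) -> bool:
        # For truly random bits, ones/total should hover near 0.5;
        # allow roughly 4 standard deviations of slack.
        ones = sum(bin(b).count("1") for b in data)
        total = len(data) * 8
        return abs(ones - total / 2) < 2 * (total ** 0.5)

    stream = keyed_stream(b"shared secret", 1 << 16)
    print(monobit_ok(stream))  # passes, yet the key holder knows every bit
    ```

    The test says nothing about who can predict the stream — which is exactly the objection above.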
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Fri Mar 28 23:10:36 2025
    From Newsgroup: comp.misc

    On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:

    The definition of “randomness” is “you don’t know what’s coming next”.
    How do you prove you don’t know something? You can’t. There are various statistical tests for randomness, but remember that a suitably encrypted message can pass every one of them, and a person who knows the message
    knows that the bitstream is not truly random.

    Here’s an even simpler proof, by reductio ad absurdum.

    Suppose you have a sequence of numbers which is provably random. Simply pregenerate a large bunch of numbers according to that sequence, and store them. Then supply them one by one to another party. The other party
    doesn’t know what’s coming next, but you do. Therefore they are not random to you.

    Which contradicts the original assumption of provable randomness. QED.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Richmond@dnomhcir@gmx.com to comp.misc on Sat Mar 29 11:50:06 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:

    The definition of “randomness” is “you don’t know what’s coming next”.
    How do you prove you don’t know something? You can’t. There are various
    statistical tests for randomness, but remember that a suitably encrypted
    message can pass every one of them, and a person who knows the message
    knows that the bitstream is not truly random.

    Here’s an even simpler proof, by reductio ad absurdum.

    Suppose you have a sequence of numbers which is provably random. Simply pregenerate a large bunch of numbers according to that sequence, and store them. Then supply them one by one to another party. The other party doesn’t know what’s coming next, but you do. Therefore they are not random
    to you.

    Which contradicts the original assumption of provable randomness. QED.

    I think your definition of randomness is wrong. If the sequence can be
    repeated by anyone, then it is pseudo random, not random.

    Random is without a predictable pattern or plan.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Richard Kettlewell@invalid@invalid.invalid to comp.misc on Sat Mar 29 15:05:58 2025
    From Newsgroup: comp.misc

    Richmond <dnomhcir@gmx.com> writes:
    [...]
    Random is without a predictable pattern or plan.

    I can think of worse definitions.

    From the original article:

    As deterministic systems, classical computers cannot create true
    randomness on demand. As a result, to offer true randomness in
    classical computing, we often resort to specialized hardware that
    harvests entropy from unpredictable physical sources, for instance,
    by looking at mouse movements, observing fluctuations in
    temperature, monitoring the movement of lava lamps or, in extreme
    cases, detecting cosmic radiation. These measures are unwieldy,
    difficult to scale and lack rigorous guarantees, limiting our
    ability to verify whether their outputs are truly random.

    Physical sources can be found in pretty much every commodity CPU for the
    last decade. So not that “difficult to scale”, apparently.

    A lot of people are pushing QRNGs of various kinds right now. I’ve yet
    to be convinced, personally.
    --
    https://www.greenend.org.uk/rjk/
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From kludge@kludge@panix.com (Scott Dorsey) to comp.misc on Sat Mar 29 12:58:05 2025
    From Newsgroup: comp.misc

    Richard Kettlewell <invalid@invalid.invalid> wrote:
    A lot of people are pushing QRNGs of various kinds right now. I’ve yet
    to be convinced, personally.

    The QRNG may not in fact be random, but if they turn out not to be random
    this indicates some sort of currently-unknown determinism in the
    universe and that in itself is really interesting... far more interesting
    than the mere quality of a random number generator.

    One of the traditional high-entropy RNGs has been related to the decay
    of a radioactive source since you can never tell when an atom in a sample
    is going to decay. If you COULD tell, it would be extremely useful and
    worth a Nobel at the absolute minimum.
    --scott
    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Mike Spencer@mds@bogus.nodomain.nowhere to comp.misc on Sat Mar 29 18:38:08 2025
    From Newsgroup: comp.misc


    Richard Kettlewell <invalid@invalid.invalid> writes:

    A lot of people are pushing QRNGs of various kinds right now. I've yet
    to be convinced, personally.

    As a tech and math amateur, I made a setup to try to extract random
    numbers from serial images of a plasma ball taken by a consumer-grade
    web cam. Really random stuff happening in there, right? I never got
    any results anywhere near acceptable, despite experiments with various
    datum-selection strategies, image formats, etc.

    The concept still seems to me to be potentially usable, but
    whaddoiknow?

    Talked to a guy at MIT in the 90s who was trying to extract random
    numbers from the turbulence of gas surrounding a hard drive. Never
    learned the tech or theoretical details -- above my amateur pay
    grade.
    --
    Mike Spencer Nova Scotia, Canada
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sat Mar 29 22:08:11 2025
    From Newsgroup: comp.misc

    On 29 Mar 2025 18:38:08 -0300, Mike Spencer wrote:

    Talked to a guy at MIT in the 90s who was trying to extract random
    numbers from the turbulence of gas surrounding a hard drive. Never
    learned the tech or theoretical details -- above my amateur pay grade.

    That is in production use today. I believe it’s a standard part of the entropy-gathering process in the Linux kernel.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sat Mar 29 22:09:46 2025
    From Newsgroup: comp.misc

    On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:

    Random is without a predictable pattern or plan.

    Let’s say I collect and store a sequence that meets your definition. Then
    I play it back when you ask me for a random number sequence. Does it still meet your definition? If not, what has changed?
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Richmond@dnomhcir@gmx.com to comp.misc on Sat Mar 29 22:39:26 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:

    Random is without a predictable pattern or plan.

    Let’s say I collect and store a sequence that meets your definition. Then I play it back when you ask me for a random number sequence. Does it still meet your definition? If not, what has changed?

    Because you have stored it, it is predictable by you and you have a
    plan.

    If I took some numbers from the square root of 2, maybe thousands of
    digits into it, and then presented them to you, to you they would be
    random because you wouldn't know where they came from or what the next
    digit in the sequence would be. But they aren't random, because I know
    and I can repeat them. I don't even need to store them.
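    Richmond's square-root-of-2 example is easy to make concrete. A sketch: `isqrt(2 * 10**(2n))` gives the first n decimal digits, deterministic and reproducible by anyone who knows the recipe, yet patternless-looking to anyone who doesn't:

    ```python
    # Digits of sqrt(2): fully deterministic, yet they pass for "random"
    # if you don't know where they came from.
    from math import isqrt

    def sqrt2_digits(n: int) -> str:
        # floor(sqrt(2) * 10**n) yields "1" followed by the first n
        # decimal digits of sqrt(2).
        return str(isqrt(2 * 10 ** (2 * n)))

    # Present someone a slice from deep inside the expansion:
    deep_slice = sqrt2_digits(2000)[1000:1020]
    print(deep_slice)
    ```

    The presenter can regenerate the slice at will; the recipient sees only an opaque digit string.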
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Ethan Carter@ec1828@gmail.com to comp.misc on Sat Mar 29 20:25:23 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:

    The definition of “randomness” is “you don’t know what’s coming next”.
    How do you prove you don’t know something? You can’t. There are various
    statistical tests for randomness, but remember that a suitably encrypted
    message can pass every one of them, and a person who knows the message
    knows that the bitstream is not truly random.

    Knuth gives a nice lecture about the definition of randomness in TAoCP,
    volume 2, section 3.5---what is a random sequence? He gives a nice
    definition (definition R1, page 152), which doesn't quite work, though
    it's quite simple; he then patches it various times, reaching definition
    R6, which he claims works against all criticisms. It's quite a
    precise definition, so it's worthy of mention.

    There's also an interesting paper by Anna Johnston on entropy, in which
    she makes the (correct, in my opinion) remark that entropy really is a
    relative notion.

    --8<-------------------------------------------------------->8---
    Note that entropy is relative. It is not a solid, physical
    entity. Entropy depends on perspective or what is known and unknown
    about the data to a given entity. Once viewed, all information in the
    data is known to the viewer (zero entropy in the viewer's perspective),
    but the data still contains entropy to non-viewers. The belief that
    entropy is something that has a classical, fixed measure is false and
    causes many interpretation issues.
    -- Anna Johnston, ``Comments on Cryptographic Entropy Measurement'',
    2019, section 2, page 3.

    Source: <https://eprint.iacr.org/2019/1263.pdf>
    --8<-------------------------------------------------------->8---

    Here’s an even simpler proof, by reductio ad absurdum.

    Suppose you have a sequence of numbers which is provably random. Simply pregenerate a large bunch of numbers according to that sequence, and store them. Then supply them one by one to another party. The other party doesn’t know what’s coming next, but you do. Therefore they are not random
    to you.

    Which contradicts the original assumption of provable randomness. QED.

    I get the feeling here that, by the same token, you could never have a
    provably secure cryptosystem because someone knows the private key?
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Sun Mar 30 09:31:01 2025
    From Newsgroup: comp.misc

    Richard Kettlewell <invalid@invalid.invalid> wrote:
    From the original article:

    As deterministic systems, classical computers cannot create true
    randomness on demand. As a result, to offer true randomness in
    classical computing, we often resort to specialized hardware that
    harvests entropy from unpredictable physical sources, for instance,
    by looking at mouse movements, observing fluctuations in
    temperature, monitoring the movement of lava lamps or, in extreme
    cases, detecting cosmic radiation. These measures are unwieldy,
    difficult to scale and lack rigorous guarantees, limiting our
    ability to verify whether their outputs are truly random.

    Physical sources can be found in pretty much every commodity CPU for the
    last decade. So not that "difficult to scale", apparently.

    Simple circuits using the (ancient) 2N3904 transistor abound on the
    internet, and pre-date it as well.

    Here's a newer circuit design specifically for battery-powered
    cryptographic use and with lots of analysis and comparison with
    another circuit:
    https://betrusted.io/avalanche-noise

    None of it requires cutting-edge technology. The main issue in the
    past has simply been that it wasn't part of the original PC
    architecture, so things like "looking at mouse movements" needed to
    be done at first until it was added to modern hardware.
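    A classic piece of the post-processing such circuits need is von Neumann debiasing, which turns a biased but independent bit source into unbiased bits. A sketch (the biased source is simulated here; a real design would read the avalanche-noise circuit instead):

    ```python
    # Von Neumann debiasing: read non-overlapping bit pairs from a biased
    # but independent source; emit 0 for "01", 1 for "10", discard "00"
    # and "11".  Output bits are unbiased if input bits are independent.
    import random

    def von_neumann(bits):
        out = []
        for a, b in zip(bits[::2], bits[1::2]):
            if a != b:
                out.append(a)
        return out

    # Simulated noisy source with a heavy 80% bias toward 1:
    rng = random.Random(1)
    biased = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]

    unbiased = von_neumann(biased)
    print(sum(unbiased) / len(unbiased))  # close to 0.5 despite the bias
    ```

    The price is throughput: on average only 2p(1-p) of the input pairs yield an output bit, which is part of why raw hardware sources are slow.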
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sun Mar 30 04:58:15 2025
    From Newsgroup: comp.misc

    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

    There's also an interesting paper by Anna Johnston on entropy, in which
    she makes the (correct, in my opinion) remark that entropy really is a relative notion.

    That makes sense. I’ve long thought that one’s estimates of the probabilities of various events depend very much on one’s point of view.

    I think Bayes’ Theorem says as much.

    I get the feeling here that, by the same token, you could never have a provably secure cryptosystem because someone knows the private key?

    None of our cryptosystems are provably secure. Public-key cryptography depends on the
    assumed difficulty of two problems: factorizing large integers (RSA), and computing
    discrete logarithms (Diffie-Hellman), and would break if either one was solved. There is no proof that either of these problems is actually hard: we simply don’t know of any good algorithms for them, after decades, even centuries, of looking.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Mike Spencer@mds@bogus.nodomain.nowhere to comp.misc on Sun Mar 30 04:37:59 2025
    From Newsgroup: comp.misc


    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On 29 Mar 2025 18:38:08 -0300, Mike Spencer wrote:

    Talked to a guy at MIT in the 90s who was trying to extract random
    numbers from the turbulence of gas surrounding a hard drive. Never
    learned the tech or theoretical details -- above my amateur pay grade.

    That is in production use today. I believe it's a standard part of the entropy-gathering process in the Linux kernel.

    Cool. I hope my friend, with whom I've lost contact, has been able to
    cash in on the development, either academically or financially.
    --
    Mike Spencer Nova Scotia, Canada
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From ram@ram@zedat.fu-berlin.de (Stefan Ram) to comp.misc on Sun Mar 30 09:14:24 2025
    From Newsgroup: comp.misc

    Mike Spencer <mds@bogus.nodomain.nowhere> wrote or quoted:
    As a tech and math amateur, I made a setup to try to extract random
    numbers from serial images of a plasma ball taken by a consumer-grade
    web cam.

    Even stuff like the current CPU load or the exact time right
    now adds a bit of "entropy." Plus, my computer here has a
    microphone input that probably picks up some noise too.
    I'm guessing you could get roughly evenly distributed values
    in a certain range by using modulo or XOR operations on that.
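    A minimal sketch of that mixing idea, using only timer jitter as the weak source (a stand-in for microphone noise, CPU load, etc.; XOR of independent sources is at least as unpredictable as the best of them):

    ```python
    # Fold several weak, noisy measurements into one byte via XOR.
    # This illustrates the mixing idea only -- raw timer jitter is a
    # weak source and nothing here is fit for cryptographic use.
    import time

    def jitter_byte() -> int:
        # Low bits of a high-resolution timer, sampled around a yield,
        # so scheduler noise can creep in between the two reads.
        t = time.perf_counter_ns()
        time.sleep(0)
        return (time.perf_counter_ns() ^ t) & 0xFF

    def mixed_byte(rounds: int = 8) -> int:
        b = 0
        for _ in range(rounds):
            b ^= jitter_byte()
        return b

    print(mixed_byte())  # one weakly-random byte
    ```

    In practice the kernel does this mixing far more carefully, feeding many such sources through a cryptographic hash rather than a bare XOR.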

    The stats for quantum random numbers can differ from those of
    classical random numbers - but honestly, asking whether "quantum
    randomness" or "classical randomness" is the "real randomness"
    seems kind of pointless to me. Random values for observables are
    definitely central to quantum physics, though whether the world
    is fundamentally deterministic or random is still not fully
    understood! ("Measurement problem in quantum physics").

    Here's something I've posted before in comp.lang.javascript,
    in <randomness-20170601030554@ram.dialup.fu-berlin.de>:

    |I'd say, a bit source is truly random when the probability
    |of any party to correctly guess the next bit is 0.5.
    |
    |(Possibly interesting in this context:
    |
    |"In contrast with software-generated randomness (called
    |pseudo-randomness), quantum randomness is provable
    |incomputable, i.e., it is not exactly reproducible by any
    |algorithm. We provide experimental evidence of incomputability
    |--- an asymptotic property --- of quantum randomness by
    |performing finite tests of randomness inspired by algorithmic
    |information theory."
    |
    |arXiv.org > quant-ph > arXiv:1004.1521
    |
    |and also
    |
    |arXiv:quant-ph/0611029v2
    |
    |.)

    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Richard Kettlewell@invalid@invalid.invalid to comp.misc on Sun Mar 30 11:14:21 2025
    From Newsgroup: comp.misc

    not@telling.you.invalid (Computer Nerd Kev) writes:
    Simple circuits using the (ancient) 2N3904 transistor abound on the
    internet, and pre-date it as well.

    Here's a newer circuit design specifically for battery-powered
    cryptographic use and with lots of analysis and comparison with
    another circuit:
    https://betrusted.io/avalanche-noise

    None of it requires cutting-edge technology. The main issue in the
    past has simply been that it wasn't part of the original PC
    architecture, so things like "looking at mouse movements" needed to
    be done at first until it was added to modern hardware.

    Exactly! All the stuff about lava lamps, helium motion inside hard
    disks, etc is just gimmicks. Real random numbers are tiny electronic
    components built into CPUs, HSMs, etc.
    --
    https://www.greenend.org.uk/rjk/
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Richard Kettlewell@invalid@invalid.invalid to comp.misc on Sun Mar 30 11:28:51 2025
    From Newsgroup: comp.misc

    Richard Kettlewell <invalid@invalid.invalid> writes:
    Exactly! All the stuff about lava lamps, helium motion inside hard
    disks, etc is just gimmicks. Real random numbers are tiny electronic
    ^generators
    components built into CPUs, HSMs, etc.

    Strictly I should probably say “entropy sources”, since there’s
    generally a DRBG between the electronics and the application, as well.
    --
    https://www.greenend.org.uk/rjk/
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From kludge@kludge@panix.com (Scott Dorsey) to comp.misc on Sun Mar 30 09:11:47 2025
    From Newsgroup: comp.misc

    Richard Kettlewell <invalid@invalid.invalid> wrote:
    Richard Kettlewell <invalid@invalid.invalid> writes:
    Exactly! All the stuff about lava lamps, helium motion inside hard
    disks, etc is just gimmicks. Real random numbers are tiny electronic
    ^generators
    components built into CPUs, HSMs, etc.

    Strictly I should probably say “entropy sources”, since there’s
    generally a DRBG between the electronics and the application, as well.

    The problem with those genuine random number generators is that they are
    usually comparatively slow. They take milliseconds to spit out a number,
    sometimes tens or even hundreds of them. So we use the genuine RNG to
    seed a PRNG in situations where we don't need complete randomness but need
    pretty good randomness and need a lot of it fast. Knuth has a discussion
    of this.
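    The pattern looks something like this sketch: draw from the slow, entropy-backed source once, then stretch it with a fast deterministic generator (`os.urandom` here stands in for the genuine RNG; `random.Random` is a fast non-cryptographic PRNG):

    ```python
    # Seed a fast PRNG from a slow genuine entropy source, used once.
    # For anything security-sensitive you would use the secrets module
    # or a proper DRBG instead of random.Random.
    import os
    import random

    seed = os.urandom(32)        # slow, kernel-entropy-backed, one call
    fast = random.Random(seed)   # cheap deterministic stream from that seed

    samples = [fast.randrange(1_000_000) for _ in range(5)]
    print(samples)
    ```

    The design trade-off is exactly as described: one expensive draw of real entropy buys an arbitrarily long stream of pretty-good numbers.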
    --scott
    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Ethan Carter@ec1828@gmail.com to comp.misc on Sun Mar 30 11:19:00 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

    There's also an interesting paper by Anna Johnston on entropy, in which
    she makes the (correct, in my opinion) remark that entropy really is a
    relative notion.

    That makes sense. I’ve long thought that one’s estimates of the probabilities of various events depend very much on one’s point of view.

    The definition of ``probability'' (in the sense of how to interpret it)
    is sort of an open problem. Knuth takes note of that in section 3.5,
    page 142, where he also points the reader to von Mises's book on the
    subject.

    What you're describing here is a certain notion, which is discussed in
    Sheldon Ross's ``A First Course in Probability'', section 2.7, chapter 2,
    8th edition.

    --8<-------------------------------------------------------->8---
    Thus far we have interpreted the probability of an event of a given
    experiment as being a measure of how frequently the event will occur
    when the experiment is continually repeated. However, there are also
    other uses of the term probability. For instance, we have all heard
    such statements as “It is 90 percent probable that Shakespeare actually
    wrote Hamlet” or “The probability that Oswald acted alone in
    assassinating Kennedy is .8.” How are we to interpret these statements?

    The most simple and natural interpretation is that the probabilities
    referred to are measures of the individual’s degree of belief in the
    statements that he or she is making. In other words, the individual
    making the foregoing statements is quite certain that Oswald acted alone
    and is even more certain that Shakespeare wrote Hamlet. This
    interpretation of probability as being a measure of the degree of one’s
    belief is often referred to as the personal or subjective view of
    probability.

    It seems logical to suppose that a “measure of the degree of one’s
    belief” should satisfy all of the axioms of probability. For example, if
    we are 70 percent certain that Shakespeare wrote Julius Caesar and 10
    percent certain that it was actually Marlowe, then it is logical to
    suppose that we are 80 percent certain that it was either Shakespeare or
    Marlowe. Hence, whether we interpret probability as a measure of belief
    or as a long-run frequency of occurrence, its mathematical properties
    remain unchanged.
    --8<-------------------------------------------------------->8---

    I get the feeling here that, by the same token, you could never have a
    provably secure cryptosystem because someone knows the private key?

    None of our cryptosystems are provably secure.

    One example of a provably secure system is the one-time pad. It is likely
    the simplest. But there are many more. For example, Michael Rabin's
    cryptosystem based on modular square roots is provably secure. For a
    definition of ``provably secure'', take a look at chapter 3 of Douglas
    Stinson's ``Cryptography: Theory and Practice'', 4th edition.

    --8<-------------------------------------------------------->8---
    Another approach is to provide evidence of security by means of a
    reduction. In other words, we show that if the cryptosystem can be
    “broken” in some specific way, then it would be possible to efficiently
    solve some well-studied problem that is thought to be difficult. For
    example, it may be possible to prove a statement of the type “a given
    cryptosystem is secure if a given integer n cannot be factored.”
    Cryptosystems of this type are sometimes termed provably secure, but it
    must be understood that this approach only provides a proof of security
    relative to some other problem, not an absolute proof of security.
    --8<-------------------------------------------------------->8---
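    The Rabin system mentioned above is small enough to sketch. A toy version with tiny primes (illustration only; real moduli are enormous, and this omits the padding a real deployment needs): encryption is squaring mod n = pq, and decryption recovers the four modular square roots, which is easy for the key holder because p and q are both 3 (mod 4).

    ```python
    # Toy Rabin cryptosystem.  Recovering m from m^2 mod n without
    # knowing p and q is provably as hard as factoring n.
    p, q = 499, 547          # toy primes, both congruent to 3 mod 4
    n = p * q

    def encrypt(m: int) -> int:
        return (m * m) % n

    def decrypt(c: int):
        # Square roots mod p and mod q (easy when p, q = 3 mod 4),
        # then combine via the Chinese Remainder Theorem: four roots.
        mp = pow(c, (p + 1) // 4, p)
        mq = pow(c, (q + 1) // 4, q)
        yp = pow(p, -1, q)   # inverse of p mod q
        yq = pow(q, -1, p)   # inverse of q mod p
        r = (mp * q * yq + mq * p * yp) % n
        s = (mp * q * yq - mq * p * yp) % n
        return {r, n - r, s, n - s}

    m = 12345
    roots = decrypt(encrypt(m))
    print(m in roots)  # True: the plaintext is among the four roots
    ```

    The ambiguity among the four roots is inherent to the scheme; practical variants add redundancy to the plaintext so the right root is recognizable.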

    RSA depends on the assumed difficulty of two problems: factorizing
    large integers, and computing discrete logarithms, and would break if
    either one was solved. There is no proof that either of these problems
    is actually hard: we simply don’t know of any good algorithms for
    them, after decades, even centuries of looking.

    That has no effect on the property of a system being provably secure.
    See section 6.8 of Stinson's book, for example. Notice his use of the
    words ``provably secure'' and ``secure''.

    --8<-------------------------------------------------------->8---
    In this section, we describe the Rabin Cryptosystem, which is
    computationally secure against a chosen-plaintext attack provided that
    the modulus n = pq cannot be factored. Therefore, the Rabin Cryptosystem
    provides an example of a provably secure cryptosystem: assuming that the
    problem of factoring is computationally infeasible, the Rabin
    Cryptosystem is secure.
    --8<-------------------------------------------------------->8---
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From ram@ram@zedat.fu-berlin.de (Stefan Ram) to comp.misc on Sun Mar 30 14:32:46 2025
    From Newsgroup: comp.misc

    Ethan Carter <ec1828@gmail.com> wrote or quoted:
    The definition of ``probability'' (in the sense of how to interpret it)
    is sort of an open problem.

    |The probability P(A|C) is interpreted as a measure of the
    |tendency, or propensity, of the physical conditions described
    |by C to produce the result A. It differs logically from the
    |older limit-frequency theory in that probability is
    |interpreted, but not redefined or derived from anything more
    |fundamental. It remains, mathematically, a fundamental
    |undefined term.
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

    Thus far we have interpreted the probability of an event of a given
    experiment as being a measure of how frequently the event will occur
    when the experiment is continually repeated.

    |One of the oldest interpretations is the /limit frequency/
    |interpretation. If the conditioning event /C/ can lead
    |to either A or "not A", and if in /n/ repetitions of such
    |a situation the event A occurs /m/ times, then it is asserted
    |that P(A|C) = lim n-->oo (m/n). This provides not only
    |an interpretation of probability, but also a definition
    |of probability in terms of a numerical frequency ratio.
    |Hence the axioms of abstract probability theory can
    |be derived as theorems of the frequency theory.
    |
    |In spite of its superficial appeal, the limit frequency
    |interpretation has been widely discarded, primarily because
    |there is no assurance that the above limit really exists for
    |the actual sequences of events to which one wishes to apply
    |probability theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine


    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sun Mar 30 21:18:59 2025
    From Newsgroup: comp.misc

    On 30 Mar 2025 09:31:01 +1000, Computer Nerd Kev wrote:

    The main issue in the past has simply been that it wasn't part of
    the original PC architecture, so things like "looking at mouse
    movements" needed to be done at first until it was added to modern
    hardware.

    The trouble with building in a purported random-number source is: how can
    you be sure you can trust it?

    Intel added random-number generation instructions to the x86 architecture;
    but how can we be sure they work as they’re advertised?
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From not@not@telling.you.invalid (Computer Nerd Kev) to comp.misc on Mon Mar 31 08:15:54 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
    On 30 Mar 2025 09:31:01 +1000, Computer Nerd Kev wrote:

    The main issue in the past has simply been that it wasn't part of
    the original PC architecture, so things like "looking at mouse
    movements" needed to be done at first until it was added to modern
    hardware.

    The trouble with building in a purported random-number source is: how can you be sure you can trust it?

    That's the justification the designer of the circuit I linked to
    stated for why they decided to use a separate circuit made from
    discrete components. USB devices using similar circuits can also be
    purchased for the same reason. Anyway, you don't need a quantum
    computer to do it.

    Intel added random-number generation instructions to the x86 architecture; but how can we be sure they work as they're advertised?

    How can you be sure anything works as advertised? There's always
    the risk of backdoors in the Intel Management Engine enabling all
    sorts of possible attacks. That designer likes FPGA-based CPUs for
    this reason, although there's still a small risk that the FPGAs
    could be maliciously designed to specifically sabotage that
    approach too.
    --
    __ __
    #_ < |\| |< _#
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Mon Mar 31 01:29:30 2025
    From Newsgroup: comp.misc

    On Sat, 29 Mar 2025 22:39:26 +0000, Richmond wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:

    Random is without a predictable pattern or plan.

    Let’s say I collect and store a sequence that meets your definition.
    Then I play it back when you ask me for a random number sequence. Does
    it still meet your definition? If not, what has changed?

    Because you have stored it, it is predictable by you and you have a
    plan.

    But you don’t.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Mon Mar 31 01:30:09 2025
    From Newsgroup: comp.misc

    On 31 Mar 2025 08:15:54 +1000, Computer Nerd Kev wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:

    Intel added random-number generation instructions to the x86
    architecture; but how can be we sure they work as they're advertised?

    How can you be sure anything works as advertised?

    There are ways to test things. But not (easily) with randomness.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Mon Mar 31 01:32:52 2025
    From Newsgroup: comp.misc

    On Sun, 30 Mar 2025 11:19:00 -0300, Ethan Carter wrote:

    The definition of ``probability'' (in the sense of how to interpret
    it) is sort of an open problem.

    It’s a term which can be defined in more than one way. One obvious
    definition is the relative frequency of different possible outcomes. I
    think there are others.
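    The relative-frequency reading can be shown numerically: repeat an experiment and watch m/n settle toward the probability (a simulated die here, with a fixed seed purely for reproducibility of the sketch):

    ```python
    # Estimate P(rolling a six) as the relative frequency m/n over many
    # repetitions of the experiment.
    import random

    rng = random.Random(0)
    n = 200_000
    m = sum(1 for _ in range(n) if rng.randint(1, 6) == 6)
    print(m / n)  # hovers near 1/6 ~ 0.1667
    ```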

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

    I get the feeling here that, by the same token, you could never have a
    provably secure cryptosystem because someone knows the private key?

    None of our cryptosystems are provably secure.

    One example of provably secure system is the one-time pad.

    But it’s not. Where do you get the pad from? Proof of security of the
    system relies on proof of the randomness of the pad. Which takes us back
    to square one.
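    For reference, the one-time pad itself is two XORs. A sketch using the standard library's `secrets` module as the source of the contested "truly random" pad (the perfect-secrecy proof assumes the pad's randomness and single use, which is exactly the assumption in dispute here):

    ```python
    # One-time pad: XOR with a random, never-reused key of equal length.
    import secrets

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    msg = b"attack at dawn"
    pad = secrets.token_bytes(len(msg))   # the contested "truly random" pad
    ct = xor(msg, pad)
    assert xor(ct, pad) == msg            # decryption is the same XOR
    print(ct.hex())
    ```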
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Mon Mar 31 01:34:54 2025
    From Newsgroup: comp.misc

    On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:

    |In spite of its superficial appeal, the limit frequency
    |interpretation has been widely discarded, primarily because there
    |is no assurance that the above limit really exists for the actual
    |sequences of events to which one wishes to apply probability
    |theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

    Discarded or not, it’s the definition used in gambling. In other words, people literally bet money on it.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Ethan Carter@ec1828@somewhere.edu to comp.misc on Tue Apr 1 10:25:30 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sun, 30 Mar 2025 11:19:00 -0300, Ethan Carter wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

    I get the feeling here that, by the same token, you could never have a
    provably secure cryptosystem because someone knows the private key?

    None of our cryptosystems are provably secure.

    One example of provably secure system is the one-time pad.

    But it’s not. Where do you get the pad from? Proof of security of the system relies on proof of the randomness of the pad. Which takes us back
    to square one.

I think your ``square one'' is that no system is provably secure.
This denies the work of various thinkers who have written definitions
and proofs. A proof is usually a work of a mathematical nature, not an
engineering one. Randomness is assumed in all of these proofs, so
there is not a single step in them that's flawed in any way.

So I think your position is that the assumption of randomness is not a
good idea. You seem rather to prefer assuming that randomness doesn't
exist. But that's just another assumption, and not an interesting one:
it destroys a lot of good work.

Why is randomness assumed? We can't calculate without it. For
instance, what's the probability of rolling a 6 with a fair die? It's
1/6. But that's not available under your choice of assumptions,
because you reject the assumption of randomness. What do you get as a
result? I think nothing---you wouldn't have a model to work with.

--8<-------------------------------------------------------->8---

    What about the practical world? We have enough randomness to run the
    entire world as it is currently done despite the accidents we've had
    and could still have. So I don't think it's a good idea to say that
    we don't have provably secure systems because someone may have
    criticisms with respect to the quality of random number generators: we
    have various systems that satisfy the definition of provably secure.
    --- Synchronet 3.20c-Linux NewsLink 1.2
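[Editor's note: the 1/6 figure above is exactly the limit-frequency reading debated earlier in the thread. A quick simulation (Python, purely illustrative, nothing from the posts) shows the observed frequency of a 6 settling toward 1/6 as the number of rolls grows:]

```python
# Relative-frequency illustration: the observed frequency of a 6
# on a simulated fair die approaches 1/6 as the roll count grows.
import random

rng = random.Random(12345)   # fixed seed so the run is reproducible
rolls = 600_000
sixes = sum(1 for _ in range(rolls) if rng.randrange(1, 7) == 6)
freq = sixes / rolls
print(f"observed frequency: {freq:.4f}  (1/6 is about 0.1667)")
```

With 600,000 rolls the sampling error is well under 1%, so the printed frequency lands close to 0.1667 on every run with this seed.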
  • From Ethan Carter@ec1828@somewhere.edu to comp.misc on Tue Apr 1 10:31:59 2025
    From Newsgroup: comp.misc

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:

    |In spite of its superficial appeal, the limit frequency
    |interpretation has been widely discarded, primarily because there
    |is no assurance that the above limit really exists for the actual
    |sequences of events to which one wishes to apply probability
    |theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

    Discarded or not, it’s the definition used in gambling. In other words, people literally bet money on it.

    Discarded in its theoretical use, which is where the discussion is. I
    think nearly nobody disputes how useful the limit-frequency
    interpretation is.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Fri Apr 4 19:05:05 2025
    From Newsgroup: comp.misc

    On Tue, 01 Apr 2025 10:31:59 -0300, Ethan Carter wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:

    |In spite of its superficial appeal, the limit frequency
|interpretation has been widely discarded, primarily because there
|is no assurance that the above limit really exists for the actual
|sequences of events to which one wishes to apply probability
|theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

    Discarded or not, it’s the definition used in gambling. In other words,
    people literally bet money on it.

    Discarded in its theoretical use, which is where the discussion is. I
    think nearly nobody disputes how useful the limit-frequency
    interpretation is.

    “The difference between theory and practice is, in theory there is no difference, but in practice there is.”

    I wonder who said that ... ?
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Fri Apr 4 19:05:50 2025
    From Newsgroup: comp.misc

    On Tue, 01 Apr 2025 10:25:30 -0300, Ethan Carter wrote:

    I think your ``square one'' is that no system is provably secure.

    We were talking about randomness.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Toaster@toaster@dne3.net to comp.misc on Fri Apr 4 20:16:55 2025
    From Newsgroup: comp.misc

    On Sun, 30 Mar 2025 09:11:47 -0400 (EDT)
    kludge@panix.com (Scott Dorsey) wrote:
    Richard Kettlewell <invalid@invalid.invalid> wrote:
    Richard Kettlewell <invalid@invalid.invalid> writes:
    Exactly! All the stuff about lava lamps, helium motion inside hard
disks, etc is just gimmicks. Real random number [generators] are tiny
electronic components built into CPUs, HSMs, etc.

Strictly I should probably say “entropy sources”, since there’s
    generally a DRBG between the electronics and the application, as
    well.

    The problem with those genuine random number generators is that they
    are usually comparatively slow. They take milliseconds to spit out a
    number, sometimes tens or even hundreds of them. So we use the
genuine RNG to seed a PRNG in situations where we don't need complete
randomness but need pretty good randomness and need a lot of it fast.
    Knuth has a discussion of this.
    --scott
    im no expert but can't you just amplify thermal (white) noise and just
    sample it? it's completely random.
    --- Synchronet 3.20c-Linux NewsLink 1.2
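[Editor's note: the seed-a-PRNG arrangement Scott describes in the quoted text can be sketched in a few lines. The choice of Python's `os.urandom` as the genuine entropy source and `random.Random` as the fast PRNG is illustrative only; `random.Random` is a Mersenne Twister and is not suitable for cryptographic use:]

```python
# Sketch: draw a little genuine entropy from the OS pool, then let a
# fast deterministic PRNG stretch it into as many numbers as needed.
import os
import random

# os.urandom reads the OS entropy pool (hardware-backed on most systems);
# it is the slow, "genuine" source, so we take only 32 bytes of it.
seed = int.from_bytes(os.urandom(32), "big")

# random.Random is a fast deterministic PRNG, fully determined by its seed.
prng = random.Random(seed)

samples = [prng.randrange(1_000_000) for _ in range(5)]
print(samples)
```

The same seed always reproduces the same stream, which is exactly why the quality of the whole scheme rests on the entropy source and not on the PRNG.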
  • From kludge@kludge@panix.com (Scott Dorsey) to comp.misc on Fri Apr 4 20:56:58 2025
    From Newsgroup: comp.misc

    Toaster <toaster@dne3.net> wrote:

    im no expert but can't you just amplify thermal (white) noise and just
    sample it? it's completely random.

    Yes, but first of all you need to make sure you are only getting thermal
    noise and not anything else leaking in that might be repetitive. Secondly
    the rate at which you can generate random numbers is directly tied to the bandwidth of the noise source. But this is in fact how hardware RNGs often work.
    --scott
    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Lawrence D'Oliveiro@ldo@nz.invalid to comp.misc on Sat Apr 5 02:13:13 2025
    From Newsgroup: comp.misc

    On Fri, 4 Apr 2025 20:16:55 -0400, Toaster wrote:

    im no expert but can't you just amplify thermal (white) noise and just
    sample it? it's completely random.

In theory, there are lots of natural sources of “completely random” numbers.

    The problem is, how do you construct a mechanism to sample those numbers,
    and prove that there are no bugs introduced (whether accidentally or deliberately) somewhere along the way that subvert the randomness of the output?
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Richard Kettlewell@invalid@invalid.invalid to comp.misc on Sat Apr 5 09:08:35 2025
    From Newsgroup: comp.misc

    Toaster <toaster@dne3.net> writes:
    kludge@panix.com (Scott Dorsey) wrote:
    Richard Kettlewell <invalid@invalid.invalid> wrote:
    Richard Kettlewell <invalid@invalid.invalid> writes:
    Exactly! All the stuff about lava lamps, helium motion inside hard
    disks, etc is just gimmicks. Real random number [generators] are tiny
    electronic components built into CPUs, HSMs, etc.

    Strictly I should probably say “entropy source”, since there’s
    generally a DRBG between the electronics and the application, as
    well.

    The problem with those genuine random number generators is that they
    are usually comparatively slow. They take milliseconds to spit out a
    number, sometimes tens or even hundreds of them. So we use the
genuine RNG to seed a PRNG in situations where we don't need complete
    randomness but need pretty good randomness and need a lot of it fast.
    Knuth has a discussion of this.

    im no expert but can't you just amplify thermal (white) noise and just
    sample it? it's completely random.

    The physics isn’t my department but I think you’re on the right track.
The point is that what you get out of the hardware component needs some additional processing before it’s usable in practice, e.g. to generate cryptographic keys of a chosen strength. (Scott is for some reason
    repeating my remark about using a DRBG.)
    --
    https://www.greenend.org.uk/rjk/
    --- Synchronet 3.20c-Linux NewsLink 1.2
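[Editor's note: the entropy-source-plus-DRBG split Richard describes can be sketched as below. This is a toy HMAC-based construction for illustration only, not the NIST SP 800-90A HMAC_DRBG; the class name and interface are invented:]

```python
# Toy DRBG sketch: raw entropy is conditioned into a fixed-size key,
# then expanded deterministically with HMAC-SHA256 in counter mode.
import hmac
import hashlib
import os

class ToyDRBG:
    def __init__(self, entropy: bytes):
        # Condition the raw entropy into a uniform 32-byte key.
        self.key = hashlib.sha256(entropy).digest()
        self.counter = 0

    def generate(self, n: int) -> bytes:
        # Produce n bytes by HMACing an incrementing counter.
        out = b""
        while len(out) < n:
            self.counter += 1
            out += hmac.new(self.key, self.counter.to_bytes(8, "big"),
                            hashlib.sha256).digest()
        return out[:n]

drbg = ToyDRBG(os.urandom(32))   # stand-in for a hardware entropy source
key_material = drbg.generate(16)
print(key_material.hex())
```

The design point mirrors the thread: the electronics supply unpredictability, while the deterministic construction supplies speed and uniform output.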