• Re: Rewriting SSA. Is This A Chance For GNU/Linux?

    From -hh@recscuba_google@huntzinger.com to comp.os.linux.advocacy,comp.os.linux.misc on Mon Apr 7 16:39:40 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/5/25 18:27, c186282 wrote:
    On 4/5/25 3:40 PM, The Natural Philosopher wrote:
    On 05/04/2025 20:22, c186282 wrote:
    Analog ...

    Massive arrays of non-linear analogue circuits for modelling things
    like the Navier-Stokes equations would be possible: Probably make a
    better stab at climate modelling than the existing shit.

      Again with analog, it's the sensitivity, especially to
      temperature conditions, that adds errors in. Keep
      carrying those errors through several stages and soon
      all you have is error, pretending to be The Solution.
      Again, perhaps some meta-material that's NOT sensitive
      to what typically throws off analog electronics MIGHT
      be made.

      I'm trying to visualize what it would take to make
      an all-analog version of, say, a payroll spreadsheet :-)

    Woogh! That makes my brain hurt.


      Now discrete use of analog - as you suggested, doing
      multiplication/division/logs initiated and read by
      digital ... ?

      Oh well, we're out in sci-fi land with most of this ...
      may as well talk about using giant evil brains in
      jars as computers  :-)

      As some here have mentioned, we may be closer to the
      limits of computer power than we'd like to think.
      Today's big trick is parallelization, but only some
      kinds of problems can be modeled that way.

      Saw an article the other day about using some kind
      of disulfide for de-facto transistors, but did not
      get the impression that they'd be fast. I think
      temperature resistance was the main thrust - industrial
      apps, Venus landers and such.

    Actually, one of the things that analog's still good at is real-world
    control systems with feedback loops and the like.

    I had one project some time 'way back in the 80s where we were
    troubleshooting a line that had a 1960s era analog control system, and
    one of the conversations that came up was whether to replace it with
    digital. It got looked into, and it was determined that digital process controls
    weren't fast enough for the line.

    Fast-forward to ~2005. While back visiting that department, I found out
    that that old analog beast was still running the line and they were
    trolling eBay for parts to keep it running.

    On another visit ~2015, the update: they finally found a new digitally
    based control system that was fast enough to replace it & did.


    -hh

    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Mon Apr 7 17:59:45 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/7/25 4:39 PM, -hh wrote:
    On 4/5/25 18:27, c186282 wrote:
    On 4/5/25 3:40 PM, The Natural Philosopher wrote:
    On 05/04/2025 20:22, c186282 wrote:
    Analog ...

    Massive arrays of non-linear analogue circuits for modelling things
    like the Navier-Stokes equations would be possible: Probably make a
    better stab at climate modelling than the existing shit.

       Again with analog, it's the sensitivity, especially to
       temperature conditions, that adds errors in. Keep
       carrying those errors through several stages and soon
       all you have is error, pretending to be The Solution.
       Again, perhaps some meta-material that's NOT sensitive
       to what typically throws off analog electronics MIGHT
       be made.

       I'm trying to visualize what it would take to make
       an all-analog version of, say, a payroll spreadsheet :-)

    Woogh!  That makes my brain hurt.


    Indeed ! However ... probably COULD be done, it's
    a bunch of shifting values - input to some accts,
    calx ops, shift to other accts ....... lots and
    lots of rheostats ........

    I'm not gonna try it ! :-)


       Now discrete use of analog - as you suggested, doing
       multiplication/division/logs initiated and read by
       digital ... ?

       Oh well, we're out in sci-fi land with most of this ...
       may as well talk about using giant evil brains in
       jars as computers  :-)

       As some here have mentioned, we may be closer to the
       limits of computer power than we'd like to think.
       Today's big trick is parallelization, but only some
       kinds of problems can be modeled that way.

       Saw an article the other day about using some kind
       of disulfide for de-facto transistors, but did not
       get the impression that they'd be fast. I think
       temperature resistance was the main thrust - industrial
       apps, Venus landers and such.

    Actually, one of the things that analog's still good at is real-world control systems with feedback loops and the like.

    As long as it's pretty straightforward, analog can
    sometimes do it quicker and simpler. I oft wonder
    whether the problem of a self-balancing android
    might be handled better with analog feedback schemes.

    Of course nerves are, ultimately, 'digital' - pulses
    of varying rate/spacing but always the same strength.
    Some of the sensory stuff even gets 'compressed'/encoded
    before going to the brain. Every little leg hair does
    not get its own direct nerve to the brain.

    I had one project some time 'way back in the 80s where we were troubleshooting a line that had a 1960s era analog control system, and
    one of the conversations that came up was whether to replace it with digital.
    It got looked into, and it was determined that digital process controls
    weren't fast enough for the line.

    Fast-forward to ~2005.  While back visiting that department, I found out that that old analog beast was still running the line and they were
    trolling eBay for parts to keep it running.

    Hey, so long as it works well !

    On another visit ~2015, the update:  they finally found a new digitally based control system that was fast enough to replace it & did.

    What was the thing doing ?

    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Farley Flud@fsquared@fsquared.linux to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 11:59:33 2025
    From Newsgroup: comp.os.linux.advocacy

    On Mon, 07 Apr 2025 17:59:45 -0400, c186282 wrote:


    Indeed ! However ... probably COULD be done, it's
    a bunch of shifting values - input to some accts,
    calx ops, shift to other accts ....... lots and
    lots of rheostats ........


    How is conditional branching (e.g. an if-then-else statement)
    to be implemented with analog circuits? It cannot be
    done.

    Analog computers are good for modelling systems that are
    described by differential equations. Adders, differentiators,
    and integrators can all be easily implemented with electronic
    circuits. But beyond differential-equation systems analog
    computers are useless.
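
    To make that concrete, here's the idea as a toy in Python terms
    (purely illustrative; on a real machine the integrators are
    op-amp/capacitor stages and the initial conditions are preset
    voltages):

      # Two integrators in a feedback loop solve x'' = -x,
      # i.e. x = cos(t) for x(0) = 1, x'(0) = 0.
      dt = 1e-3
      x, xdot = 1.0, 0.0
      for _ in range(int(6.28318 / dt)):   # run for one period, 2*pi
          xdot += -x * dt                  # first integrator: -x -> x'
          x += xdot * dt                   # second integrator: x' -> x
      print(round(x, 2))                   # ~1.0, back where it started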

    The Norden bombsight of WWII was an electro-mechanical
    computer. Its job was to calculate the trajectory of
    a bomb released by an aircraft, and the trajectory is described
    by a differential equation.

    One of my professors told a story about a common "analog"
    practice among engineers of the past. To calculate an integral,
    which can be described as the area under a curve, they would plot
    the curve on well made paper and then cut out (with scissors)
    the plotted area and weigh it (on a lab balance). The ratio
    of the cut-out area with a unit area of paper would be the
    value of the integral. (Multi-dimensional integrals would
    require carving blocks of balsa wood or a similar material.)
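
    (The arithmetic behind the trick, as a throwaway Python note with
    made-up numbers - the paper's density cancels out of the ratio:

      density = 0.8                  # g per plot-unit^2, same everywhere
      mass_cutout = density * 2.0    # "weighing" a sin(x) cutout, 0..pi
      mass_ref = density * 1.0       # a 1x1 reference square, same paper
      print(mass_cutout / mass_ref)  # 2.0, the true value of the integral
    )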

    Of course it worked but today integration is easy to perform
    to unlimited accuracy using digital means.
    --
    Hail Linux! Hail FOSS! Hail Stallman!
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From The Natural Philosopher@tnp@invalid.invalid to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 13:17:42 2025
    From Newsgroup: comp.os.linux.advocacy

    On 08/04/2025 12:59, Farley Flud wrote:
    On Mon, 07 Apr 2025 17:59:45 -0400, c186282 wrote:


    Indeed ! However ... probably COULD be done, it's
    a bunch of shifting values - input to some accts,
    calx ops, shift to other accts ....... lots and
    lots of rheostats ........


    How is conditional branching (e.g. an if-then-else statement)
    to be implemented with analog circuits? It cannot be
    done.

    You can do better than boolean logic.
    You can use an adder and a comparator:
    if the sum of all the inputs is greater than X, then Y, else not Y.
    You can use Y to switch another set of analogue circuits off or on.

    Or even use the sum of all the inputs to modify the gain of an amplifier.
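
    In code terms, the same idea (an illustrative Python sketch; the
    voltages and threshold below are invented):

      def branch(inputs, X):
          v = sum(inputs)        # the adder (summing amplifier)
          return v > X           # the comparator: Y or not-Y

      Y = branch([0.8, 0.9, 0.5], X=2.0)  # True here: 2.2 > 2.0
      gain = 10.0 if Y else 0.0           # Y switches the next stage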

    If you have a digital problem, use a digital computer; if not, then think again.


    Analog computers are good for modelling systems that are
    described by differential equations. Adders, differentiators,
    and integrators can all be easily implemented with electronic
    circuits. But beyond differential-equation systems analog
    computers are useless.

    Exactly

    The Norden bombsight of WWII was an electro-mechanical
    computer. Its job was to calculate the trajectory of
    a bomb released by an aircraft, and the trajectory is described
    by a differential equation.

    And it was the biggest heap of shit ever missold to the USAAF

    One of my professors told a story about a common "analog"
    practice among engineers of the past. To calculate an integral,
    which can be described as the area under a curve, they would plot
    the curve on well made paper and then cut out (with scissors)
    the plotted area and weigh it (on a lab balance). The ratio
    of the cut-out area with a unit area of paper would be the
    value of the integral. (Multi-dimensional integrals would
    require carving blocks of balsa wood or a similar material.)

    Of course it worked but today integration is easy to perform
    to unlimited accuracy using digital means.

    Not unlimited either in precision or in compute time



    --
    For in reason, all government without the consent of the governed is the
    very definition of slavery.

    Jonathan Swift


    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 10:24:22 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/8/25 7:59 AM, Farley Flud wrote:
    On Mon, 07 Apr 2025 17:59:45 -0400, c186282 wrote:


    Indeed ! However ... probably COULD be done, it's
    a bunch of shifting values - input to some accts,
    calx ops, shift to other accts ....... lots and
    lots of rheostats ........


    How is conditional branching (e.g. an if-then-else statement)
    to be implemented with analog circuits? It cannot be
    done.

    Ummmmmm ... a lightly-latching op amp maybe, "IF (v1) > (v2)
    THEN amp = ON" ?

    Analog computers are good for modelling systems that are
    described by differential equations. Adders, differentiators,
    and integrators can all be easily implemented with electronic
    circuits. But beyond differential-equation systems analog
    computers are useless.

    The Norden bombsight of WWII was an electro-mechanical
    computer. Its job was to calculate the trajectory of
    a bomb released by an aircraft, and the trajectory is described
    by a differential equation.

    Note those kinds of systems require the use of
    something called 'dithering' ... essentially a
    small mechanical vibrating device. This would
    help overcome the natural 'stickiness' of parts
    on parts, making finer actions possible by
    adding just a little chaos. Some pure-electronics
    systems use a form of 'dithering' too.
    https://www.planetanalog.com/can-adding-noise-actually-improve-system-performance/
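
    A quick simulation of why that works (illustrative Python; a bare
    1-LSB quantizer versus the same quantizer with added noise):

      import random
      quantize = lambda v: float(round(v))   # 1-LSB steps
      N = 10000
      # A constant 0.4-LSB signal is invisible to the bare quantizer,
      # but adding noise and averaging recovers it.
      bare = sum(quantize(0.4) for _ in range(N)) / N
      dithered = sum(quantize(0.4 + random.uniform(-0.5, 0.5))
                     for _ in range(N)) / N
      print(bare, round(dithered, 2))        # 0.0 vs ~0.4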

    One of my professors told a story about a common "analog"
    practice among engineers of the past. To calculate an integral,
    which can be described as the area under a curve, they would plot
    the curve on well made paper and then cut out (with scissors)
    the plotted area and weigh it (on a lab balance). The ratio
    of the cut-out area with a unit area of paper would be the
    value of the integral. (Multi-dimensional integrals would
    require carving blocks of balsa wood or a similar material.)

    That's impressively clever !

    Of course it worked but today integration is easy to perform
    to unlimited accuracy using digital means.

    Most things are ... but at the cost of great complexity
    and often power consumption. IF you can find ways to make
    use of 'natural calculations' they may be worth using,
    or at least including in the mostly-digital machine. If
    Ohm's law can do near-instant floating-point calx it
    MAY be easier to have a tiny circuit of a few resistors
    and then read it with an A/D converter than to do all
    the calx step by step with dozens/hundreds/thousands of digital
    instruction steps.
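
    For instance, a toy sketch (Python, with invented component values):

      # Let Ohm's law "compute" a ratio: a two-resistor divider gives
      # Vout = Vin * R2/(R1+R2) essentially instantly, and a 10-bit
      # ADC reads the answer back into the digital world.
      Vin, R1, R2 = 5.0, 3300.0, 1800.0
      Vout = Vin * R2 / (R1 + R2)        # the analog "calculation"
      counts = int(Vout / 5.0 * 1023)    # ADC readout, 0..1023
      print(round(Vout, 3), counts)      # 1.765 V -> 361 counts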
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 10:28:54 2025
    From Newsgroup: comp.os.linux.advocacy

    Oh, on-theme, apparently Team Musk's nerd squad
    managed to CRASH a fair segment of the SSA customer
    web sites while trying to add some "anti-fraud"
    feature :-)

    PROBABLY no COBOL involved ... well, maybe ....
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Chris Ahlstrom@OFeem1987@teleworm.us to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 11:51:18 2025
    From Newsgroup: comp.os.linux.advocacy

    -hh wrote this post while blinking in Morse code:

    <snip>

    Fast-forward to ~2005. While back visiting that department, I found out that that old analog beast was still running the line and they were
    trolling eBay for parts to keep it running.

    I was on a project where the manager(s) ended up buying the box on eBay, causing prices to rise.

    On another visit ~2015, the update: they finally found a new digitally based control system that was fast enough to replace it & did.
    --
    Dear Miss Manners:
    Please list some tactful ways of removing a man's saliva from your face.

    Gentle Reader:
    Please list some decent ways of acquiring a man's saliva on your face. If
    the gentleman sprayed you inadvertently to accompany enthusiastic
    discourse, you may step back two paces, bring out your handkerchief, and
    go through the motions of wiping your nose, while trailing the cloth along
    your face to pick up whatever needs mopping along the route. If, however,
    the substance was acquired as a result of enthusiasm of a more intimate
    nature, you may delicately retrieve it with a flick of your pink tongue.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From -hh@recscuba_google@huntzinger.com to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 16:17:44 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/7/25 17:59, c186282 wrote:
    On 4/7/25 4:39 PM, -hh wrote:
    ...
    I had one project some time 'way back in the 80s where we were
    troubleshooting a line that had a 1960s era analog control system, and
    one of the conversations that came up was whether to replace it with
    digital. It got looked into, and it was determined that digital process
    controls weren't fast enough for the line.

    Fast-forward to ~2005.  While back visiting that department, I found
    out that that old analog beast was still running the line and they
    were trolling eBay for parts to keep it running.

      Hey, so long as it works well !

    It did, so long as there were parts for it.


    On another visit ~2015, the update:  they finally found a new
    digitally based control system that was fast enough to replace
    it & did.

      What was the thing doing ?

    It was running a high speed manufacturing line. If memory serves,
    roughly 1200ppm, so 20 parts per second.

    For a digital system that's a budget of ~50 milliseconds total
    processing time per part, from which one can see how early digital stuff
    couldn't maintain that pace, but as PCs got faster, it wasn't really
    clear why it remained a "too hard".
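
    (The budget arithmetic, spelled out in Python for the record:

      parts_per_minute = 1200
      parts_per_second = parts_per_minute / 60   # 20.0 parts/s
      budget_ms = 1000 / parts_per_second        # 50.0 ms per part

    ...so every sensor read, decision, and adjustment has to fit
    inside that window.)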

    That seemed to have come from the architecture. It's a series of linked tooling station heads, where each head has 22? sets of tools running
    basically in parallel, but because everything was indexed, a part that
    went through Station 1 on Head A then went through Station 1 on
    Head B, Station 1 on C, 1 on D, 1 on E, etc ...

    The process had interactive feedback loops all over the place between
    multiple heads (& other stuff), such that if head E started to report
    its hydraulic psi was running high, that was because of an insufficient
    anneal back between B & C, so turn up the voltage on the annealing station...and if that was already running high, then turn up the voltage
    on an earlier annealing station.

    But that wasn't all: it would make similar on-the-fly adjustments for
    each of the individual Stations too, so if Tool 18 on Head G was
    complaining, they could adjust settings on Tool 18 on Heads ABCDEF
    upstream of G ... and HIJK downstream too if that was a fix.

    It must have been an incredible project back in the 1960s to get it all
    so thoroughly figured out and well balanced.

    The modernization eventually came along because the base machines were expensive - probably "lost art" IMO - but were known to be capable of
    running much faster, and it was ultimately the push to have it run
    faster that got digitization over the goal line. I think they ended
    up just a shade over 2000ppm; I'll ask the next time I stop by.


    -hh

    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From -hh@recscuba_google@huntzinger.com to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 16:26:43 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/8/25 10:28, c186282 wrote:
    Oh, on-theme, apparently Team Musk's nerd squad
    managed to CRASH a fair segment of the SSA customer
    web sites while trying to add some "anti-fraud"
    feature  :-)

    PROBABLY no COBOL involved ... well, maybe ....


    Oh, it's worse than that.

    "The network crashes appear to be caused by an expansion initiated by
    the Trump team of an existing contract with a credit-reporting agency
    that tracks names, addresses and other personal information to verify customers’ identities. The enhanced fraud checks are now done earlier in
    the claims process and have resulted in a boost to the volume of
    customers who must pass the checks."

    <https://gizmodo.com/social-security-website-crashes-blamed-on-doge-software-update-2000586092>


    Translation:

    They *moved* where an existing credit agency check is done, but didn't
    load test it before going live ... and golly, they broke it!

    But the more important question here is:

    **WHY** did they move where this check is done?

    Because this check already existed, so moving where it's done isn't going
    to catch more fraud.

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down. Yes, that's a deliberate waste of taxpayer dollars.

    The only motivation I can see is propaganda: this change will find more 'fraud' at the contractor's check ... but not more fraud in total.

    Expect them to use the before/after contractor numbers only to falsely
    claim that they've found 'more' fraud. No, they're committing fraud.


    -hh
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 19:03:00 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/8/25 4:17 PM, -hh wrote:
    On 4/7/25 17:59, c186282 wrote:
    On 4/7/25 4:39 PM, -hh wrote:
    ...
    I had one project some time 'way back in the 80s where we were
    troubleshooting a line that had a 1960s era analog control system,
    and one of the conversations that came up was whether to replace it with
    digital. It got looked into, and it was determined that digital process
    controls weren't fast enough for the line.

    Fast-forward to ~2005.  While back visiting that department, I found
    out that that old analog beast was still running the line and they
    were trolling eBay for parts to keep it running.

       Hey, so long as it works well !

    It did, so long as there were parts for it.


    On another visit ~2015, the update:  they finally found a new
    digitally based control system that was fast enough to
    replace it & did.

       What was the thing doing ?

    It was running a high speed manufacturing line.  If memory serves,
    roughly 1200ppm, so 20 parts per second.

    For a digital system that's a budget of ~50 milliseconds total
    processing time per part, from which one can see how early digital stuff
    couldn't maintain that pace, but as PCs got faster, it wasn't really
    clear why it remained a "too hard".

    That seemed to have come from the architecture.  It's a series of linked tooling station heads, where each head has 22? sets of tools running basically in parallel, but because everything was indexed, a part that
    went through Station 1 on Head A then went through Station 1 on
    Head B, Station 1 on C, 1 on D, 1 on E, etc ...

    The process had interactive feedback loops all over the place between multiple heads (& other stuff), such that if head E started to report
    its hydraulic psi was running high, that was because of an insufficient anneal back between B & C, so turn up the voltage on the annealing station...and if that was already running high, then turn up the voltage
    on an earlier annealing station.

    But that wasn't all:  it would make similar on-the-fly adjustments for
    each of the individual Stations too, so if Tool 18 on Head G was complaining, they could adjust settings on Tool 18 on Heads ABCDEF
    upstream of G ... and HIJK downstream too if that was a fix.

    It must have been an incredible project back in the 1960s to get it all
    so thoroughly figured out and well balanced.

    The modernization eventually came along because the base machines were expensive - probably "lost art" IMO - but were known to be capable of running much faster, and it was ultimately the push to have it run faster that got digitization over the goal line.  I think they ended
    up just a shade over 2000ppm; I'll ask the next time I stop by.

    These days it's difficult to even imagine such a complex
    equation being handled by anything BUT digital and lots
    of rule tables - but they had what they had back then and
    made do anyway.

    MY wonder ... who initially DESIGNED all that ? Real Genius
    at work from the good old Can-Do days. Those are the kind of
    people who are rarely remembered in the histories.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Charlie Gibbs@cgibbs@kltpzyxm.invalid to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 23:18:35 2025
    From Newsgroup: comp.os.linux.advocacy

    On 2025-04-08, -hh <recscuba_google@huntzinger.com> wrote:

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down. Yes, that's a deliberate waste of taxpayer dollars.

    You'd think someone would want to try to reduce that waste.
    Maybe set up a Department Of Government Efficiency or something...
    --
    /~\ Charlie Gibbs | Growth for the sake of
    \ / <cgibbs@kltpzyxm.invalid> | growth is the ideology
    X I'm really at ac.dekanfrus | of the cancer cell.
    / \ if you read it the right way. | -- Edward Abbey
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 20:04:24 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/8/25 4:26 PM, -hh wrote:
    On 4/8/25 10:28, c186282 wrote:
    Oh, on-theme, apparently Team Musk's nerd squad
    managed to CRASH a fair segment of the SSA customer
    web sites while trying to add some "anti-fraud"
    feature  :-)

    PROBABLY no COBOL involved ... well, maybe ....


    Oh, it's worse than that.

    "The network crashes appear to be caused by an expansion initiated by
    the Trump team of an existing contract with a credit-reporting agency
    that tracks names, addresses and other personal information to verify customers’ identities. The enhanced fraud checks are now done earlier in the claims process and have resulted in a boost to the volume of
    customers who must pass the checks."

    <https://gizmodo.com/social-security-website-crashes-blamed-on-doge-software-update-2000586092>



    Translation:

    They *moved* where an existing credit agency check is done, but didn't
    load test it before going live ... and golly, they broke it!

    But the more important question here is:

    **WHY** did they move where this check is done?

    Because this check already existed, so moving where it's done isn't going
    to catch more fraud.


    "Well ... just jam the new code in ... *somewhere* ..."

    Oh, DOUBT many/any even knew the checks WERE done,
    just somewhere ELSE.


    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down.  Yes, that's a deliberate waste of taxpayer dollars.

    The only motivation I can see is propaganda:  this change will find more 'fraud' at the contractor's check ... but not more fraud in total.


    There's a fundamental political rule, esp in 'democracies',
    that goes "ALWAYS be seen as *DOING SOMETHING*"

    Spin it however needed.

    ONLY possible maybe perhaps reason to move the checks
    is to not let fraudsters/Putin deeper into the system/
    process where there may be more little flaws to exploit.

    We know ALL code has those little flaws, logic/field/
    buffer issues. Even M$ can't clean all that junk out of
    its products despite decades of debugging, 'AI' and
    such. Check their security notes - there's still the
    dreaded "buffer-overflow vulnerability of the week".
    SO ... block perps earlier = less for them to attack.

    Maybe ....

    Expect them to use the before/after contractor numbers only to falsely
    claim that they've found 'more' fraud.  No, they're committing fraud.


    Nah ! They're *doing something* !!! :-)
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Tue Apr 8 22:29:19 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/8/25 7:18 PM, Charlie Gibbs wrote:
    On 2025-04-08, -hh <recscuba_google@huntzinger.com> wrote:

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down. Yes,
    that's a deliberate waste of taxpayer dollars.

    You'd think someone would want to try to reduce that waste.
    Maybe set up a Department Of Government Efficiency or something...


    Hey ... humans are only JUST so smart, AI is
    even more stupid, and govt agencies .........

    Likely the expense of the earlier checks does NOT add
    up to much.

    I did mention one possible gain in doing the ID checks
    earlier - giving Vlad and friends less access to the
    deeper pages/system, places where more exploitable
    flaws live.

    In short, put up a big high city wall - then you
    don't have to worry AS much about the inner layers
    of the city.

    Hmmmmm ... wonder what kind of code they were
    screwing with ... lots of JS ? No WONDER it all
    blew up :-)

    Lucky it wasn't the old COBOL stuff ..... I only
    know ONE guy who is still a competent COBOL
    programmer. I did a little, but ......
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Charlie Gibbs@cgibbs@kltpzyxm.invalid to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 16:49:38 2025
    From Newsgroup: comp.os.linux.advocacy

    On 2025-04-09, c186282 <c186282@nnada.net> wrote:

    There's a fundamental political rule, esp in 'democracies',
    that goes "ALWAYS be seen as *DOING SOMETHING*"

    Something must be done. This is something.
    Therefore, this must be done.
    -- Yes, Prime Minister
    --
    /~\ Charlie Gibbs | Growth for the sake of
    \ / <cgibbs@kltpzyxm.invalid> | growth is the ideology
    X I'm really at ac.dekanfrus | of the cancer cell.
    / \ if you read it the right way. | -- Edward Abbey
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From Charlie Gibbs@cgibbs@kltpzyxm.invalid to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 16:49:39 2025
    From Newsgroup: comp.os.linux.advocacy

    On 2025-04-09, c186282 <c186282@nnada.net> wrote:

    I did mention one possible gain in doing the ID checks
    earlier - giving Vlad and friends less access to the
    deeper pages/system, places where more exploitable
    flaws live.

    In short, put up a big high city wall - then you
    don't have to worry AS much about the inner layers
    of the city.

    A friend once told me about an interesting concept raised
    by a science fiction story he read. By assuming that entry
    controls are perfect, the presence of someone in a restricted
    area automatically means that he is authorized to be there.

    If the entry controls fail, at least you have a convenient
    scapegoat - which is of prime importance in politics.
    --
    /~\ Charlie Gibbs | Growth for the sake of
    \ / <cgibbs@kltpzyxm.invalid> | growth is the ideology
    X I'm really at ac.dekanfrus | of the cancer cell.
    / \ if you read it the right way. | -- Edward Abbey
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 13:09:32 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/9/25 12:49 PM, Charlie Gibbs wrote:
    On 2025-04-09, c186282 <c186282@nnada.net> wrote:

    There's a fundamental political rule, esp in 'democracies',
    that goes "ALWAYS be seen as *DOING SOMETHING*"

    Something must be done. This is something.
    Therefore, this must be done.
    -- Yes, Prime Minister


    Heh !

    But often truer than we'd like to think and hear.



    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From -hh@recscuba_google@huntzinger.com to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 14:18:10 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/8/25 22:29, c186282 wrote:
    On 4/8/25 7:18 PM, Charlie Gibbs wrote:
    On 2025-04-08, -hh <recscuba_google@huntzinger.com> wrote:

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down.  Yes, that's a deliberate waste of taxpayer dollars.

    You'd think someone would want to try to reduce that waste.
    Maybe set up a Department Of Government Efficiency or something...


      Hey ... humans are only JUST so smart, AI is
      even more stupid, and govt agencies .........

      Likely the expense of the earlier checks does NOT add
      up to much.

    It might not be, but in this case, the benefit of the change is
    literally zero ... and the expenses are not only more money to the
    contractor who gets paid by the check request, but also the cost of
    higher bandwidth demands which is what caused the site to crash.


      I did mention one possible gain in doing the ID checks
      earlier - giving Vlad and friends less access to the
      deeper pages/system, places where more exploitable
      flaws live.

      In short, put up a big high city wall - then you
      don't have to worry AS much about the inner layers
      of the city.

    I don't really buy that, because of symmetry: when the workflow is that
    a request has to successfully pass three gates, it's functionally
    equivalent to (A x B x C) and the sequence doesn't matter: one gets the
    same outcome for (C x B x A), and (A x C x B), etc.

    The primary motivation for order selection comes from optimization
    factors, such as the 'costs' of each gate: one puts the cheap gates
    which knock down the most early, and puts the slow/expensive gates late,
    after the dataset's size has already been minimized.
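
    A toy model of that point (Python; the per-gate costs and pass
    rates below are invented):

      # The pass/fail outcome is order-independent, but the expected
      # cost per request is not.
      gates = {'A': (0.01, 0.30),    # (cost to run, fraction passed)
               'B': (0.10, 0.80),
               'C': (1.00, 0.90)}    # e.g. the expensive outside check

      def expected_cost(order):
          cost, reach = 0.0, 1.0     # reach = fraction still in play
          for g in order:
              c, p = gates[g]
              cost += reach * c      # everyone reaching this gate pays
              reach *= p             # only passers continue
          return cost

      print(expected_cost('ABC'))    # ~0.28: cheap, high-rejection gate first
      print(expected_cost('CBA'))    # ~1.10: expensive gate first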

    -hh
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 16:51:26 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/9/25 2:18 PM, -hh wrote:
    On 4/8/25 22:29, c186282 wrote:
    On 4/8/25 7:18 PM, Charlie Gibbs wrote:
    On 2025-04-08, -hh <recscuba_google@huntzinger.com> wrote:

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down. Yes,
    that's a deliberate waste of taxpayer dollars.

    You'd think someone would want to try to reduce that waste.
    Maybe set up a Department Of Government Efficiency or something...


       Hey ... humans are only JUST so smart, AI is
       even more stupid, and govt agencies .........

       Likely the expense of the earlier checks does NOT add
       up to much.

    It might not be, but in this case, the benefit of the change is
    literally zero ... and the expenses are not only more money to the contractor who gets paid by the check request, but also the cost of
    higher bandwidth demands which is what caused the site to crash.


       I did mention one possible gain in doing the ID checks
       earlier - giving Vlad and friends less access to the
       deeper pages/system, places where more exploitable
       flaws live.
       In short, put up a big high city wall - then you
       don't have to worry AS much about the inner layers
       of the city.

    I don't really buy that, because of symmetry: when the workflow is that
    a request has to successfully pass three gates, it's functionally
    equivalent to (A x B x C) and the sequence doesn't matter:  one gets the same outcome for (C x B x A), and (A x C x B), etc.

    The primary motivation for order selection comes from optimization
    factors, such as the 'costs' of each gate: one puts the cheap gates
    which knock down the most early, and puts the slow/expensive gates late, after the dataset's size has already been minimized.


    I understand your reasoning here.

    The point I was trying to make is a bit different
    however - less to do with people trying to
    defraud the system than with those seeking to
    corrupt/destroy it. I see every web page, every
    bit of HTML/PHP/JS executed, every little database
    opened, as a potential source of fatal FLAWS enemies
    can find and exploit to do great damage.

    In that context, the sooner you can lock out pretenders
    the better - less of the system exposed to the state-
    sponsored hackers to analyze and pound at relentlessly.

    Now Musk's little group DID make a mistake in
    not taking bandwidth into account (and we do
    not know how ELSE they may have screwed up
    jamming new code into something they didn't
    write) but 'non-optimal' verification order
    MIGHT be worth the extra $$$ in an expanded
    'security' context.
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From -hh@recscuba_google@huntzinger.com to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 22:33:54 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/9/25 16:51, c186282 wrote:
    On 4/9/25 2:18 PM, -hh wrote:
    On 4/8/25 22:29, c186282 wrote:
    On 4/8/25 7:18 PM, Charlie Gibbs wrote:
    On 2025-04-08, -hh <recscuba_google@huntzinger.com> wrote:

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down. Yes,
    that's a deliberate waste of taxpayer dollars.

    You'd think someone would want to try to reduce that waste.
    Maybe set up a Department Of Government Efficiency or something...


       Hey ... humans are only JUST so smart, AI is
       even more stupid, and govt agencies .........

       Likely the expense of the earlier checks does NOT add
       up to much.

    It might not be, but in this case, the benefit of the change is
    literally zero ... and the expenses are not only more money to the
    contractor who gets paid by the check request, but also the cost of
    higher bandwidth demands which is what caused the site to crash.


       I did mention one possible gain in doing the ID checks
       earlier - giving Vlad and friends less access to the
       deeper pages/system, places where more exploitable
       flaws live.
       In short, put up a big high city wall - then you
       don't have to worry AS much about the inner layers
       of the city.

    I don't really buy that, because of symmetry: when the workflow is
    that a request has to successfully pass three gates, it's functionally
    equivalent to (A x B x C) and the sequence doesn't matter:  one gets
    the same outcome for (C x B x A), and (A x C x B), etc.

    The primary motivation for order selection comes from optimization
    factors, such as the 'costs' of each gate: one puts the cheap gates
    which knock down the most early, and puts the slow/expensive gates
    late, after the dataset's size has already been minimized.


      I understand your reasoning here.

      The point I was trying to make is a bit different
      however - less to do with people trying to
      defraud the system than with those seeking to
      corrupt/destroy it. I see every web page, every
      bit of HTML/PHP/JS executed, every little database
      opened, as a potential source of fatal FLAWS enemies
      can find and exploit to do great damage.

      In that context, the sooner you can lock out pretenders
      the better - less of the system exposed to the state-
      sponsored hackers to analyze and pound at relentlessly.

    Sure, but that's not relevant here, because from a threat-vulnerability perspective, it's just one big 'black box' process. Anyone attempting to
    probe doesn't receive intermediary milestones/checkpoints to know if
    they successfully passed/failed a gate.


      Now Musk's little group DID make a mistake in
      not taking bandwidth into account (and we do
      not know how ELSE they may have screwed up
      jamming new code into something they didn't
      write) but 'non-optimal' verification order
      MIGHT be worth the extra $$$ in an expanded
      'security' context.

    Might be worth it if it actually enhanced security. It failed to do so, because their change was just a "shuffling of the existing deck chairs".


    -hh
    --- Synchronet 3.20c-Linux NewsLink 1.2
  • From c186282@c186282@nnada.net to comp.os.linux.advocacy,comp.os.linux.misc on Wed Apr 9 23:11:06 2025
    From Newsgroup: comp.os.linux.advocacy

    On 4/9/25 10:33 PM, -hh wrote:
    On 4/9/25 16:51, c186282 wrote:
    On 4/9/25 2:18 PM, -hh wrote:
    On 4/8/25 22:29, c186282 wrote:
    On 4/8/25 7:18 PM, Charlie Gibbs wrote:
    On 2025-04-08, -hh <recscuba_google@huntzinger.com> wrote:

    Plus front-loading it before you've run your in-house checks means that
    your operating expenses to this contractor service go UP not down. Yes,
    that's a deliberate waste of taxpayer dollars.

    You'd think someone would want to try to reduce that waste.
    Maybe set up a Department Of Government Efficiency or something...


       Hey ... humans are only JUST so smart, AI is
       even more stupid, and govt agencies .........

       Likely the expense of the earlier checks does NOT add
       up to much.

    It might not be, but in this case, the benefit of the change is
    literally zero ... and the expenses are not only more money to the
    contractor who gets paid by the check request, but also the cost of
    higher bandwidth demands which is what caused the site to crash.


       I did mention one possible gain in doing the ID checks
       earlier - giving Vlad and friends less access to the
       deeper pages/system, places where more exploitable
       flaws live.
       In short, put up a big high city wall - then you
       don't have to worry AS much about the inner layers
       of the city.

    I don't really buy that, because of symmetry: when the workflow is
    that a request has to successfully pass three gates, it's functionally
    equivalent to (A x B x C) and the sequence doesn't matter:  one gets
    the same outcome for (C x B x A), and (A x C x B), etc.

    The primary motivation for order selection comes from optimization
    factors, such as the 'costs' of each gate: one puts the cheap gates
    which knock down the most early, and puts the slow/expensive gates
    late, after the dataset's size has already been minimized.


       I understand your reasoning here.

       The point I was trying to make is a bit different
       however - less to do with people trying to
       defraud the system than with those seeking to
       corrupt/destroy it. I see every web page, every
       bit of HTML/PHP/JS executed, every little database
       opened, as a potential source of fatal FLAWS enemies
       can find and exploit to do great damage.

       In that context, the sooner you can lock out pretenders
       the better - less of the system exposed to the state-
       sponsored hackers to analyze and pound at relentlessly.

    Sure, but that's not relevant here, because from a threat-vulnerability perspective, it's just one big 'black box' process.  Anyone attempting to probe doesn't receive intermediary milestones/checkpoints to know if
    they successfully passed/failed a gate.

    Alas, the box is only "black" to OUR people.

    Remember a few months ago when China got into
    several major US phone/net carriers - and DID
    mess with them ?

    They got partway into the systems, then probed
    everything they found and found FLAWS they could
    exploit. THEY put 100 times more effort into it
    than the corp people spent looking for flaws.

    EVERY page is likely to have one or two tiny
    flaws - so the FEWER pages "They" can get into
    the system the BETTER.


       Now Musk's little group DID make a mistake in
       not taking bandwidth into account (and we do
       not know how ELSE they may have screwed up
       jamming new code into something they didn't
       write) but 'non-optimal' verification order
       MIGHT be worth the extra $$$ in an expanded
       'security' context.

    Might be worth it if it actually enhanced security.  It failed to do so, because their change was just a "shuffling of the existing deck chairs".

    THIS time, maybe. But some deck chairs are better
    than others.

    Alas, as long experience shows, there's likely NO
    way to solidly secure any public-facing system -
    esp with interactive web pages and such. Instead
    it becomes a statistical exercise. Can the damage
    be kept SMALL/RARE ?

    The cyber-access paradigm DOES seem to be the
    harbinger of doom. "They" have proven they CAN
    get into ANYTHING net-connected - govt, banks,
    utilities, nuke plants ... anything. There are
    ALWAYS sneaky back-doors, ALWAYS flaws that can
    be exploited.

    I'll go back awhile to when the USA exploited
    flaws in Siemens industrial-process units to
    trash all the Iranian uranium centrifuges.
    Alas, now, "They" can instruct yer local nuke
    reactor to pull all its control rods, or just
    shut down a cooling pump, or distort sensor
    readings, or .... the more access the more
    "in"s ..........

    2FA ? 3FA ? Required biometrics ? It can ALL
    be faked these days. "Security" is more a
    collective illusion/delusion.
    --- Synchronet 3.20c-Linux NewsLink 1.2