• Prolog missed a Billion Dollar Business Model [DGX Spark]

    From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Mon Oct 20 21:32:34 2025
    From Newsgroup: comp.lang.prolog

    Hi,

    With their stupid fixation on Emacs,
    Prolog missed a Billion Dollar Business Model.

    Now people will start paying 4'000 USD per
    year and per GPU to Red Hat for an AI
    model deployment and training environment:

    DGX Spark Arrives at SpaceX
    https://www.youtube.com/watch?v=peaIkB0NzS0

    But these environments are nothing but
    Python virtual environments. Can Prolog not
    also push some evaluation graphs to
    a GPU or NPU? Well, it seems that with its
    current philosophy, only if there is a spare
    key combination in Emacs for it.
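
    Just to illustrate what "evaluation graph"
    means here, a minimal sketch in plain Prolog:
    the graph is an ordinary term, and eval/2 is
    the CPU fallback that a GPU or NPU backend
    would replace. Nothing below is an existing
    API, it only shows the shape of the idea.

    % An evaluation graph as a Prolog term.
    % eval/2 walks it on the CPU; a backend
    % could compile the same term to a kernel.
    eval(num(X), X).
    eval(add(A,B), R) :-
        eval(A, X), eval(B, Y), R is X + Y.
    eval(mul(A,B), R) :-
        eval(A, X), eval(B, Y), R is X * Y.

    % ?- eval(add(mul(num(2),num(3)),num(4)), R).
    % R = 10.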

    Bye
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Mon Oct 20 21:50:12 2025
    From Newsgroup: comp.lang.prolog

    Hi,

    Prolog has become the Useful Fool.
    Over the years it was trained that
    constraints are more important than
    anything else. Even this talk leaves
    out natural language processing:

    Manuel Hermenegildo - 50 Years of Prolog and Beyond
    https://prologyear.logicprogramming.org/videos/PrologDay_Session_1_talk.mp4

    And if you give them some IBM Roadrunner
    precursor, all they can do is play
    some stupid constraint games:

    Parallel local search for solving Constraint
    Problems on the Cell Broadband Engine
    (Preliminary Results)
    https://www.researchgate.net/publication/220481722

    And it is worth mentioning in passing how
    useless blunt constraint solving is, as
    implemented by most CLP(X) systems: they
    can't even solve a 10 x 10 magic square.
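
    For the record, this is the blunt formulation
    I mean, in SWI-Prolog's library(clpfd). A
    4 x 4 instance labels quickly; the very same
    program with plain labeling([ff], ...) does
    not come back on 10 x 10 in any reasonable
    time, which is exactly the point.

    :- use_module(library(clpfd)).

    % N x N magic square, all cells distinct,
    % every row, column and diagonal summing
    % to the magic constant N*(N*N+1)/2.
    magic_square(N, Rows) :-
        NN is N*N,
        Magic is N*(NN+1)//2,
        length(Rows, N),
        maplist(same_length(Rows), Rows),
        append(Rows, Cells),
        Cells ins 1..NN,
        all_different(Cells),
        maplist(sum_eq(Magic), Rows),
        transpose(Rows, Cols),
        maplist(sum_eq(Magic), Cols),
        diagonal(Rows, D1), sum_eq(Magic, D1),
        maplist(reverse, Rows, Revs),
        diagonal(Revs, D2), sum_eq(Magic, D2),
        labeling([ff], Cells).

    sum_eq(S, L) :- sum(L, #=, S).

    diagonal(Rows, Ds) :-
        foldl(nth_diag, Rows, Ds, 1, _).
    nth_diag(Row, D, I, I1) :-
        nth1(I, Row, D), I1 is I + 1.

    % ?- magic_square(4, Rows).   % fast
    % ?- magic_square(10, Rows).  % good luck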

    Bye

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to comp.lang.prolog on Tue Oct 21 00:43:57 2025
    From Newsgroup: comp.lang.prolog

    Hi,

    On small clusters, a failed GPU process
    usually crashes visibly. On large TPU systems,
    you can get “silent” or “partial” failures
    where a chip is alive but unhealthy:
    producing NaNs, stalling on the interconnect,
    or returning corrupted data that’s
    not immediately flagged.

    At hyperscale, training becomes a kind of
    “distributed computer architecture problem”,
    and engineers do end up inventing their own
    “parity-check”-like systems, not for
    individual bits (as in DRAM ECC), but for
    entire tensors, gradients, and replicas.
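
    A toy sketch of the idea, in Prolog for fun:
    every replica reports a checksum of its
    gradient, and whatever deviates from the
    median gets flagged as unhealthy. The names
    grad_checksum/2 and flag_replicas/3 are made
    up; real systems do this on whole tensors
    across the interconnect, this only shows
    the shape of the check.

    % Checksum of one replica's gradient
    % (here simply the sum of its entries).
    grad_checksum(Grad, Sum) :-
        sum_list(Grad, Sum).

    % Flag the indices of replicas whose
    % checksum deviates from the median by
    % more than Tol.
    flag_replicas(Checksums, Tol, Bad) :-
        msort(Checksums, Sorted),
        length(Sorted, L),
        I is L // 2,
        nth0(I, Sorted, Median),
        findall(K,
                ( nth0(K, Checksums, C),
                  abs(C - Median) > Tol ),
                Bad).

    % ?- maplist(grad_checksum,
    %        [[1.0,2.0],[1.0,2.1],[9.9,9.9]], Cs),
    %    flag_replicas(Cs, 0.5, Bad).
    % Bad = [2].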

    See also:

    Gemini 2.5: Pushing the Frontier
    https://arxiv.org/abs/2507.06261

    Bye

    --- Synchronet 3.21a-Linux NewsLink 1.2