
Looks like sorting of rational trees needs an existential type (Re: Prolog totally missed the AI Boom)

From Mild Shock <janburse@fastmail.fm>
Newsgroups sci.logic
Subject Looks like sorting of rational trees needs an existential type (Re: Prolog totally missed the AI Boom)
Date 2025-07-23 13:58 +0200
Message-ID <105qioh$2no9m$2@solani.org>
References <vpcele$is1s$3@solani.org>



Looks like sorting of rational trees
needs an existential type, if we go full “logical”.
If I use my old code from 2023, which computes
a finest (*), i.e. non-monster, bisimulation
pre-quotient (**) in prefix order:

% factorize(+Term, +Context, -Skeleton)//  DCG over the list of
% emitted equations; Context pairs each visited subterm with its
% introduced variable.
factorize(T, _, T) --> {var(T)}, !.                              % variables pass through
factorize(T, C, V) --> {compound(T), member(S-V, C), S == T}, !. % seen before: reuse its variable
factorize(T, C, V) --> {compound(T)}, !,
    [V = S],                            % emit the equation V = S
    {T =.. [F|L]},
    factorize_list(L, [T-V|C], R),      % recurse, remembering T
    {S =.. [F|R]}.
factorize(T, _, T) --> [].              % atomic terms pass through

I see that it always generates new
intermediate variables:

?- X = f(f(X)), factorize(X, [], T, L, []), write(L-T), nl.
[_8066=f(_8066)]-_8066

?- X = f(f(X)), factorize(X, [], T, L, []), write(L-T), nl.
[_10984=f(_10984)]-_10984
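A side note not in the original post: in SWI-Prolog the built-in
variant test =@= already equates two such results when they differ
only in the names of their variables, so the two runs above are
recognizably "the same"; but =@= only gives an equivalence check,
not a canonical term one could sort by:

```prolog
% =@= (variant check, SWI-Prolog) succeeds when two terms are
% identical up to a renaming of their variables.
variant_demo :-
    T1 = ([A = f(A)] - A),   % shape of the first run's result
    T2 = ([B = f(B)] - B),   % shape of the second run's result
    T1 =@= T2.               % succeeds: alpha-equivalent
```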

It would be swell if it generated an
existential quantifier, something like T^([T = f(T)]-T)
in the above case. Then, using alpha conversion,
different factorization runs would be equal
when they only differ in the introduced
intermediate variables. But Prolog has no alpha
conversion; only λ-Prolog has such things.
So what can we do, how can we produce a
representation that can be used for sorting?
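One pragmatic workaround, sketched here as my own suggestion (the
predicate name canonical/2 is not from the post): since factorize//3
returns an acyclic equation list, the introduced variables can be
renamed canonically in order of appearance with numbervars/3. That
plays the role of the missing alpha conversion and yields a ground
term usable as a sort key under the standard order of terms:

```prolog
% canonical(+Result, -Canon): rename the fresh variables in a
% factorization result Equations-Root in order of appearance,
% so runs that differ only in variable identity map to the
% same ground term.
canonical(Equations-Root, Canon) :-
    copy_term(Equations-Root, Canon),   % keep the caller's variables unbound
    numbervars(Canon, 0, _).            % '$VAR'(0), '$VAR'(1), ... in order
```

With this, both runs above map to the same ground term
['$VAR'(0)=f('$VAR'(0))]-'$VAR'(0), so msort/2 or predsort/3
over canonicalized results becomes well-defined.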

(*) Why finest and not coarsest? Because it uses
non-monster instructions and not monster
instructions.

(**) Why only pre-quotient? Because a
XXX_with_stack algorithm does not fully
deduplicate the equations; it would
probably need a XXX_with_memo algorithm.

Mild Shock wrote:
> 
> Inductive logic programming at 30
> https://arxiv.org/abs/2102.10556
> 
> The paper contains not a single reference to autoencoders!
> Still they show this example:
> 
> Fig. 1 ILP systems struggle with structured examples that
> exhibit observational noise. All three examples clearly
> spell the word "ILP", with some alterations: 3 noisy pixels,
> shifted and elongated letters. If we were to learn a
> program that simply draws "ILP" in the middle of the picture,
> without noisy pixels and elongated letters, that would
> be a correct program.
> 
> I guess ILP is 30 years behind the AI boom. An early autoencoder
> turned into transformer was already reported here (*):
> 
> SERIAL ORDER, Michael I. Jordan - May 1986
> https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf
> 
> Well, ILP might have its merits; maybe we should not ask
> for a marriage of LLM and Prolog, but of Autoencoders and ILP.
> But it's tricky, I am still trying to decode the da Vinci code of
> things like stacked tensors. Are they related to k-literal clauses?
> The paper I referenced is found in this excellent video:
> 
> The Making of ChatGPT (35 Year History)
> https://www.youtube.com/watch?v=OFS90-FX6pg


