Path: csiph.com!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: rbowman
Newsgroups: comp.os.linux.misc,alt.folklore.computers
Subject: Re: naughty Python
Date: 31 Dec 2025 07:55:28 GMT
Lines: 32
Message-ID:
References: <10i8usb$2oo2c$3@dont-email.me> <10icd30$2ck7$1@gal.iecc.com>
 <10idu04$7inn$1@dont-email.me> <10if4lo$jr1h$1@dont-email.me>
 <6decndo7ib2Df8z0nZ2dnZfqn_adnZ2d@giganews.com> <10iu02q$1029n$12@dont-email.me>
 <10iu3g7$11u10$3@dont-email.me> <10iutjt$1c0aq$2@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
X-Trace: individual.net b9DR67ZYEi2eUk/8BTji1AQwZkBBDBYgLVyWGeoa0diM6lkMUi
Cancel-Lock: sha1:O/lq603q9aoWKs3Dz75JrWS16fI= sha256:LaPbgTT5PzioO4Q8VT1Bdy1whGnnCAKzfa5a8PAcCoc=
User-Agent: Pan/0.162 (Pokrosvk)
Xref: csiph.com comp.os.linux.misc:80155 alt.folklore.computers:232930

On Tue, 30 Dec 2025 23:42:19 -0500, c186282 wrote:

> NNs are a distinct tech from LLMs. In *theory* they
> have great potential ... and could be rather compact,
> fit inside a bot. However to realize this, special hardware is
> required, elements that are sort-of like neurons. They ARE getting
> there, I see articles from time to time.

https://en.wikipedia.org/wiki/Transformer_(deep_learning)

Transformers are based on NNs and are an improvement on recurrent
neural networks. I sort of miss one of my favorite terms, 'long
short-term memory'. The wiki page starts off slow and then gets deep
into the murk.

The whole field tends to be headache-inducing. Humans do fine in
three-dimensional Euclidean space with x, y, and z. Throw in w for the
fourth dimension and the woo-woo starts, particularly if it's
interpreted as time. After that it's all Cloud Cuckoo Land. The tensor
operations that work in 3 or 4 dimensions work in any number of
dimensions since it's all math, but let's not think too hard about
what it means. Consider Word2Vec...
https://en.wikipedia.org/wiki/Word2vec

Three dimensions and you can visualize that the words over in that
corner of the cube sort of hang together. 400 dimensions and you have
sort of a problem. Then you get to the improvements on Word2Vec...

I've got a feeling that the people who are really, really good at this
are also really, really weird.
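The corner-of-the-cube point can be sketched in a few lines of Python.
The vectors below are made up for illustration, not real Word2Vec
output; the words and numbers are arbitrary:

```python
# Toy word-vector similarity in 3 dimensions. Real Word2Vec embeddings
# typically have 100-400 dimensions, but the arithmetic is identical.
import numpy as np

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: "cat" and "dog" sit in one corner of the
# cube, "truck" sits in another.
vectors = {
    "cat":   np.array([0.9, 0.8, 0.1]),
    "dog":   np.array([0.8, 0.9, 0.2]),
    "truck": np.array([0.1, 0.2, 0.9]),
}

print(cosine(vectors["cat"], vectors["dog"]))    # close to 1.0
print(cosine(vectors["cat"], vectors["truck"]))  # much smaller

# Swap in 400-component arrays and nothing above changes; the math is
# indifferent. It just stops being something you can picture.
```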