From Newsgroup: comp.ai.philosophy
On 10/19/2025 7:39 PM, Mild Shock wrote:
Hi,
Now there is a neuro-symbolic hybrid of a triple store and
artificial neural networks. Already in 2019, Deepak Nathani
et al. proposed an attention-based embedding mechanism:
GraphMERT: Efficient and Scalable Distillation
of Reliable Knowledge Graphs from Unstructured Data
https://www.researchgate.net/publication/396457862
Neurosymbolic 80M AI from Princeton beats GPT,
SuperIntelligence without OpenAI:
https://www.youtube.com/watch?v=xh6R2WR49yM
Have Fun!
Bye
This is essentially the same idea as my basic
facts, in that the goals are identical. It is
enormously more advanced than what I have,
because I only have the goal; I basically
have no idea how to achieve it.
GraphMERT actually achieves this goal with
high reliability. It is exactly the kind of
bridge that LLM systems need in order to
eventually become extremely reliable.
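
To make the "triple store plus neural network" bridge concrete, here is a
minimal sketch in plain Python. It is not GraphMERT and not Nathani et al.'s
model; the entities, relations, embedding dimension, and dot-product scoring
function are all illustrative assumptions. The point is only the hybrid shape:
exact symbolic facts are trusted outright, and unknown candidate triples are
ranked by a learned-style vector score instead of being guessed.

```python
# A minimal neuro-symbolic sketch (illustrative only, not GraphMERT):
# symbolic facts live in a triple store, while toy vector embeddings
# stand in for a trained neural scorer over candidate triples.
import random
from itertools import product

random.seed(0)
DIM = 8

# Symbolic side: an exact, queryable triple store.
triples = {
    ("water", "is_a", "liquid"),
    ("ice", "is_a", "solid"),
    ("ice", "melts_into", "water"),
}

entities = {s for s, _, o in triples} | {o for _, _, o in triples}
relations = {p for _, p, _ in triples}

# Neural side: dense vectors standing in for trained embeddings.
embed = {name: [random.gauss(0, 1) for _ in range(DIM)]
         for name in entities | relations}

def score(s, p, o):
    """Dot-product compatibility of subject+relation with object."""
    return sum((embed[s][i] + embed[p][i]) * embed[o][i] for i in range(DIM))

def known(s, p, o):
    """Symbolic lookup: exact membership in the triple store."""
    return (s, p, o) in triples

# Hybrid query: known facts are reported as facts; everything else
# gets a (toy) neural plausibility score rather than a hallucination.
for s, p, o in product(entities, relations, entities):
    tag = "FACT" if known(s, p, o) else f"score={score(s, p, o):+.2f}"
    print(f"{s:>6} {p:<11} {o:<7} {tag}")
```

Running it prints every candidate triple, marking the stored ones as FACT and
ranking the rest by score; a real system would learn the embeddings from data
rather than drawing them at random.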
--
Copyright 2025 Olcott "Talent hits a target no one else can hit; Genius
hits a target no one else can see." Arthur Schopenhauer