Hi,
Now there is a neuro-symbolic hybrid of a triple store and
artificial neural networks. An attention-based embedding mechanism
for knowledge graphs was already proposed back in 2019 by Deepak
Nathani et al.; the recent paper is GraphMERT: Efficient and Scalable
Distillation of Reliable Knowledge Graphs from Unstructured Data
https://www.researchgate.net/publication/396457862
Neurosymbolic 80M AI from Princeton beats GPT,
SuperIntelligence without OpenAI:
https://www.youtube.com/watch?v=xh6R2WR49yM
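For the curious, here is a minimal toy sketch of the idea: symbolic
triples sit in a tiny store, and a network weighs their embeddings with
attention. This is my own illustration, not code from GraphMERT or from
Nathani et al.; every entity, relation, and dimension below is made up.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy "triple store": (head, relation, tail) facts kept as plain symbols.
triples = [
    ("insulin", "treats", "diabetes"),
    ("insulin", "is_a", "hormone"),
    ("metformin", "treats", "diabetes"),
]

# Random embeddings stand in for learned ones (assumption: trained in practice).
entities = {e for h, _, t in triples for e in (h, t)}
relations = {r for _, r, _ in triples}
emb = {name: rng.normal(size=DIM) for name in entities | relations}

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def attend(head):
    """Aggregate the outgoing triples of `head` with scaled dot-product attention."""
    neighbours = [(r, t) for h, r, t in triples if h == head]
    # Each neighbour message mixes the relation and tail embeddings.
    messages = np.stack([emb[r] + emb[t] for r, t in neighbours])
    scores = messages @ emb[head] / np.sqrt(DIM)   # attention logits
    weights = softmax(scores)                      # attention weights over triples
    return weights @ messages                      # attended neighbourhood vector

print(attend("insulin"))
```

In the real systems the embeddings are trained and the attention runs
over larger neighbourhoods, but the division of labour is the same: the
triple store keeps the symbols, the network weighs them.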
Have Fun!
Bye