We'll know what we node, we'll grok what we've graphed: Neo4j nails graph machine learning to data science workbench

Neo4j has added graph embeddings to its machine learning workbench in the hope that data scientists using its graph database will gain a productivity boost. Version 1.4 of the Graph Data Science (GDS) workbench was rolled out today. It is similar to tools such as H2O.ai, but built specifically for graph databases, and now supports " …
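
To make the new feature concrete: in GDS 1.4 the embedding algorithms are exposed as Cypher procedures that run against an in-memory projection of the graph. Below is a minimal sketch of that workflow using the official neo4j Python driver. The connection details, the Person/KNOWS projection, the assumption that nodes carry a name property, and the exact FastRP configuration are illustrative guesses rather than a copy of Neo4j's documentation, so check the GDS 1.4 manual for the precise signatures.

    from neo4j import GraphDatabase

    # Hypothetical connection details.
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

    with driver.session() as session:
        # Project :Person nodes and KNOWS relationships into GDS's in-memory catalog.
        session.run("CALL gds.graph.create('people', 'Person', 'KNOWS')")

        # Stream a FastRP embedding vector per node; the vectors can then feed a
        # downstream classifier, clustering step, or similarity search.
        result = session.run(
            "CALL gds.fastRP.stream('people', {embeddingDimension: 128}) "
            "YIELD nodeId, embedding "
            "RETURN gds.util.asNode(nodeId).name AS name, embedding "
            "LIMIT 5"
        )
        for record in result:
            # Print the node name and the first few embedding components.
            print(record["name"], record["embedding"][:4], "...")

    driver.close()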

  1. Robert Grant Silver badge

    but very frequently it is two or more databases for a particular project

Very, very frequently it's Postgres. Occasionally it's Neo4j, if you have loads of self-referential data that you need to traverse more than 5 levels deep before it beats Postgres, according to a benchmark I read once that I can't find now. [A sketch of that kind of deep-traversal query appears after the comments.]

    Their visualiser is neat, though. Good for exploratory stuff.

  2. Eclectic Man Silver badge
    Boffin

    Well-Quasi-Ordering

Gosh, minor graph embeddings brought back my PhD thesis, or rather the knowledge that the set of finite graphs is well-quasi-ordered did. I couldn't find the proof, but I did find a paper on Kruskal's tree theorem for finite trees, which pre-dates the same theorem for finite graphs.

    https://www.cis.upenn.edu/~jean/kruskal.pdf

I recommend C. St. J. A. Nash-Williams' paper if you want to understand the Kruskal tree theorem, as it is a model of clarity and exposition. [A short formal statement of well-quasi-ordering and the theorem appears after the comments.]

    Sorry, just a bit of nostalgia. I'll shut up now.
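
The deep-traversal case in the first comment is the kind of query where a graph database's pointer-chasing model is usually said to pay off: following a self-referential relationship to an arbitrary depth. The sketch below phrases the same question, all reports under a given manager at any depth, first as a Postgres recursive CTE and then as a Cypher variable-length pattern, via the psycopg2 and neo4j Python drivers. The table, label and relationship names (employees/manager_id, :Employee, :REPORTS_TO) are made up for illustration; this is a comparison of query shapes, not the benchmark the commenter mentions.

    import psycopg2
    from neo4j import GraphDatabase

    # Postgres: a recursive CTE walks the self-referential manager_id column.
    PG_QUERY = """
    WITH RECURSIVE reports AS (
        -- direct reports of the given manager
        SELECT id, name, manager_id FROM employees WHERE manager_id = %s
        UNION ALL
        -- then reports of those reports, and so on
        SELECT e.id, e.name, e.manager_id
        FROM employees e
        JOIN reports r ON e.manager_id = r.id
    )
    SELECT id, name FROM reports;
    """

    # Neo4j: a variable-length path over the REPORTS_TO relationship,
    # with no upper bound on depth.
    CYPHER_QUERY = """
    MATCH (boss:Employee {id: $id})<-[:REPORTS_TO*1..]-(report:Employee)
    RETURN report.id AS id, report.name AS name
    """

    def reports_via_postgres(conn_str, boss_id):
        with psycopg2.connect(conn_str) as conn, conn.cursor() as cur:
            cur.execute(PG_QUERY, (boss_id,))
            return cur.fetchall()

    def reports_via_neo4j(uri, auth, boss_id):
        driver = GraphDatabase.driver(uri, auth=auth)
        with driver.session() as session:
            result = session.run(CYPHER_QUERY, id=boss_id)
            rows = [(r["id"], r["name"]) for r in result]
        driver.close()
        return rows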
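
For readers who haven't met the term, here is a short statement of the notions the second comment refers to, given as the standard textbook formulations rather than anything taken from the thesis or the linked paper:

    A quasi-order $\le$ (a reflexive, transitive relation) on a set $Q$ is a
    \emph{well-quasi-order} if every infinite sequence $q_1, q_2, \ldots$ in $Q$
    contains indices $i < j$ with $q_i \le q_j$.

    \textbf{Kruskal's tree theorem.} Finite trees are well-quasi-ordered under
    homeomorphic (topological) embedding: any infinite sequence of finite trees
    $T_1, T_2, \ldots$ contains $i < j$ such that $T_i$ embeds into $T_j$.

    \textbf{Robertson--Seymour graph minor theorem.} Finite graphs are
    well-quasi-ordered under the minor relation: any infinite sequence of finite
    graphs $G_1, G_2, \ldots$ contains $i < j$ such that $G_i$ is a minor of $G_j$.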
