
Does Gemma phone home?
Google has released a family of "open" large language models named Gemma that are compact enough to run on a personal computer. Gemma comes in two sizes: two billion parameters and seven billion parameters. The larger version is intended for GPU- and TPU-accelerated systems, while the smaller one is billed as suitable for …
But is it much better than Eliza?
Inside Emacs, type M-x doctor to launch Eliza. For those of you who, like me, come from a Vim background and have no idea what that means: hit Escape, type x, then type doctor and press Enter.
Is it a toy or really useful?
How do we know it wasn't partially trained on copyrighted material? I regard that as plagiarism and unethical, not fair use.
What does it tell Google?
It seems that if it is based on scraped text, however sanitized, then IP infringement is likely.
Google's exposure for shipping the model is probably in a bigger grey area than that of those who use it to generate content.
As to what it tells Google: run it offline, and all you have to worry about is whether the output is watermarked or fingerprinted.
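For what it's worth, here is a minimal sketch of running Gemma entirely locally, assuming the Hugging Face transformers library and the google/gemma-2b checkpoint (gated; it has to be downloaded once after accepting Google's terms). With local_files_only=True the library refuses to touch the network during inference.

```python
# Minimal sketch: offline inference with Gemma, assuming the weights for
# "google/gemma-2b" are already in the local Hugging Face cache.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2b"

# local_files_only=True means nothing is fetched from the network,
# so nothing "phones home" during generation.
tokenizer = AutoTokenizer.from_pretrained(model_id, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(model_id, local_files_only=True)

inputs = tokenizer("Does Gemma phone home?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Whether the generated text carries a watermark is a separate question, and not one this kind of setup can rule out.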
"In the right hands, it carries incredible opportunity, but in the wrong hands, it can pose a threat to public safety."
Since these LLMs don't learn anything new once initially trained, how long before they cease to be useful (supposing of course...)? It's very unlikely that retraining would be possible on 'your computer', even if you had access to the training data set.
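A back-of-envelope calculation illustrates why. The byte counts below are assumptions based on common mixed-precision training with the Adam optimiser, not Gemma-specific figures, but they show that merely holding the training state for the 7B model needs on the order of 100 GiB, before activations or the data set.

```python
# Rough sketch of the memory needed to fully retrain/fine-tune a 7B model.
# Assumed per-parameter costs for mixed-precision training with Adam:
params = 7e9

bytes_per_param = {
    "fp16 weights": 2,
    "fp16 gradients": 2,
    "fp32 master weights": 4,
    "fp32 Adam first moment": 4,
    "fp32 Adam second moment": 4,
}

total_bytes = params * sum(bytes_per_param.values())
print(f"~{total_bytes / 2**30:.0f} GiB of model state alone")
# -> roughly 104 GiB, versus the ~24 GiB of a high-end consumer GPU
```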