We all live in a yellow subroutine
So, basically, MIT have reduced thousands of lines of code to fifty that call thousands of lines of code.
"Don't write code when you can re-use code" is a well-known principle in writing software, but using it to replace thousands of lines of code with 50 seems positively parsimonious. That's what MIT is claiming with an upcoming demonstration of what it calls “probabilistic code”, an approach which it says it's applied to the field …
The principles behind it could prove of benefit to 3D engines though.
Imagine specific code for 'rocky terrain' based on machine learning using a large batch of 2D pictures as the source material - you'd never see the same rock twice.
Repeat for all other types of terrain, mix them up with some basic rules about what goes next to what and what that should look like and you can build a unique planetary landscape. Add in some villages/towns/cities/transport etc. and you could, in theory, add massive replayability to sand-box games by having a different world to play in every time you start a new game.
Or is that a bit ambitious?
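Not that ambitious, really - the mixing-by-rules part is straightforward. Here's a minimal sketch of the idea (tile names and adjacency rules are invented for illustration; a real game would use far richer rules, and possibly learned textures as imagined above):

```python
import random

# Hypothetical adjacency rules: which terrain types may sit next to which.
ALLOWED_NEIGHBOURS = {
    "water": {"water", "sand"},
    "sand": {"water", "sand", "grass"},
    "grass": {"sand", "grass", "rock"},
    "rock": {"grass", "rock"},
}

def generate_row(width, rng):
    """Build one row of terrain where each tile respects its left neighbour."""
    row = [rng.choice(sorted(ALLOWED_NEIGHBOURS))]
    for _ in range(width - 1):
        row.append(rng.choice(sorted(ALLOWED_NEIGHBOURS[row[-1]])))
    return row

# A fresh seed gives a fresh world; the same seed reproduces the same one.
rng = random.Random(42)
world = [generate_row(8, rng) for _ in range(3)]
```

Swap the seed for the system clock and every new game gets a different but internally consistent world.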
That's what some people call "procedurally generated content" and there are games that do this now; Din's Curse, a Diablo clone, for example.
but that has now been given a trendy techster name by some kids, who think they invented something new.
I think that is very much the case. It isn't as if the term is intrinsically new - I recall skip lists for example were described as probabilistic when first presented a quarter of a century ago. This is what these guys have to overcome - in that case the description made perfect sense and usefully and succinctly described the behaviour of the data structure. Yes, I know we only have a press release but it sounds very much as if this is a new "trendy" handle for a not so novel approach.
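For anyone who wasn't around for the original paper: skip lists earned the "probabilistic" label because each node's height comes from coin flips, so good balance is expected rather than guaranteed. A sketch of just that level-assignment step (not a full skip list):

```python
import random

def random_level(rng, p=0.5, max_level=16):
    """Flip a coin; keep promoting the node one level while it comes up heads."""
    level = 1
    while rng.random() < p and level < max_level:
        level += 1
    return level

rng = random.Random(0)
levels = [random_level(rng) for _ in range(1000)]
# Roughly half the nodes stay at level 1, a quarter reach level 2, and so on,
# which is what gives the structure its expected O(log n) search time.
```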
Running through what we have been told here and inferring what it actually means sans cute fluff, it seems we have a combination of Monte Carlo techniques defining a set of initial seed values and a feedback mechanism to improve the values for a subsequent round of iteration. That doesn't sound too far removed from the nondeterministic techniques LISP programmers in particular have been using for decades.
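One classic instance of that "Monte Carlo seeds plus a feedback loop" pattern is the cross-entropy method - sample candidates, score them, keep the best, refit, repeat. To be clear, this is my own illustrative sketch of the pattern, not whatever MIT's actual algorithm is:

```python
import random
import statistics

def f(x):
    """Toy objective to minimise; the optimum is at x = 3."""
    return (x - 3.0) ** 2

rng = random.Random(1)
mu, sigma = 0.0, 5.0                      # initial Monte Carlo seed distribution
for _ in range(30):
    samples = [rng.gauss(mu, sigma) for _ in range(100)]
    elites = sorted(samples, key=f)[:10]  # feedback: keep the best candidates
    mu = statistics.mean(elites)          # refit for the next round of iteration
    sigma = statistics.stdev(elites) + 1e-6

# mu ends up near 3.0, the minimiser of f.
```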
something that is completely obvious, that loads of people are doing anyway, but that has now been given a trendy techster name by some kids, who think they invented something new.
Yeah, this is fast becoming a curse in the industry. It's not so different from 5 years ago when "they" stumbled on the GoF design patterns and decided everything had to be a pattern. Queue lots and lots of things that were never a pattern being relabelled and spoken of in revered hushed tones as though they were new, and not something the kids should have been doing since day one.
I had one yoot tell me with a straight face each of the following in the past 3 years:
"MVC is dead. It's all MVVM now." Ok genius, how's that going to work over the web?
"RDBMS & SQL are dead. It's all NoSQL now." hahahahahahahahahahahahaha! Ok. Whatever.
"Anyone still writing SQL is stealing from their employer: use an ORM" Aye, righto. Tell me again what an ORM does, why it does it, and why you need it for your small one man onshore project with about 30 db objects? And why that'll get better performance when all the ops guys are sat in India and the db server is in london.
With the (sub)standard of grads being churned out these days, I do really think it's time we had some form of regulator for the industry to minimise the damage these geniuses will do before they learn how to do what they think they've already mastered after five minutes experience.
In summary then, I don't know anyone with more than 3 years experience and less than 15 who isn't an expert in whatever tools they use, yet I don't know anyone with over 20 who claims to be an expert. Odd that.
@LucreLout : I agree heartily with what you say, and I've even upvoted you.
But I can't avoid the suspicion that when you say "Queue lots and lots of things that were never a pattern being ..." you probably mean "Cue lots and lots ...". The metaphor refers to the stage (or possibly film and TV studios), as in "Cue music, cue lights, cue Hamlet".
Lord knows, none of you twerps bashing these researchers - or, for that matter, Richard - would want to do any of what we in the computer science world call "research". If you had, you might have discovered that "probabilistic programming" is a term of art that's been in widespread CS use for at least a decade.
A simple search of the CACM archives or ACM Digital Library would have told you that. All you self-professed experts are ACM members, right? You might want to skim, say, Gordon et al, "Probabilistic Programming", Proceedings FOSE 2014.
Honestly. What a bunch of anti-intellectual asses. "Oh, there's never anything new and I thought of all that years ago because I'm so damn smart." Come back when you can show even a glancing familiarity with the current state of research.
Lord knows, none of you twerps bashing these researchers - or, for that matter, Richard - would want to do any of what we in the computer science world call "research". If you had, you might have discovered that "probabilistic programming" is a term of art that's been in widespread CS use for at least a decade.
A simple search of the CACM archives or ACM Digital Library would have told you that. All you self-professed experts are ACM members, right? You might want to skim, say, Gordon et al, "Probabilistic Programming", Proceedings FOSE 2014.
Did you even read the comments before slagging them off? If you had you would have seen my point:
I recall skip lists for example were described as probabilistic when first presented
So that'll be CACM back in 1990 then. Your point is what, exactly?