
Perfectly possible
Perfectly possible to have a ball made only of hexagons -- all you need is a non-Euclidean space. If I remember my college math correctly, a positively curved space in the vicinity of the ball should do the trick.
185 publicly visible posts • joined 26 Nov 2014
And: ThoughtWorks is proof that you can have a company full of smart, curious people who deliver really good stuff without being assholes.
I once had the privilege of delivering a lunch-and-learn session at their Chicago office, with Martin in the audience. Probably the most intimidating audience I ever presented in front of, even without him -- I spent the whole time up there thinking "is there anything I can tell them they don't already know?".
some people clearly like something about vinyl (handling discs? bigger artwork?) enough to cause a revival.
I do think this is part of the story: the rituals and practices heighten the anticipation of the moment. The fact that many of the rituals are tactile (e.g. the special way they wipe the dust, the care with which they lower the arm) is important. Regardless of what it may (or may not) objectively do to the music reproduction, it really can change some people's subjective enjoyment. (Compare: tea-making rituals).
And then behind that is the opportunity for geekery: knowing more than the average person about selecting and matching components for the "best" reproduction, reading endless magazine articles about the specs of the latest equipment, agonizing over whether to upgrade now or hold out for something better. For many of this type, the equipment is more important than the music.
And third, there's the lure of exclusivity and collectability: vinyl, especially older vinyl, is a physical artifact that exists in finite quantities and is found in physical locations. The search for a rare copy in good condition of a particular edition can itself be rewarding in a way that finding an MP3 online really is not.
And to be honest, I have no problem with people who enjoy vinyl in any of those ways, so long as they don't insist that their end result "must" be better than mine and that I'm doing it wrong by listening to MP3s on a tiny SanDisk through ear buds to drown out the noise of the lawn mower.
Yep. SSDs provide a lot of bang for the buck.
I upgraded my wife's aging Thinkpad to a 480GB SSD and it's massively faster, especially startup. There's not another investment that would have remotely come close in terms of performance improvement versus cost. And when I do eventually replace the Thinkpad, I'll move the drive over.
The only downside is that here in consumer-land we're still stuck with legacy interfaces (SATA) designed for spinning rust. It's going to get really interesting when NVMe reaches consumer-friendly prices...
A lot of what you describe here is goodness, but I think it's important to add: It's orders of magnitude more realistic to aspire to this if you are running a cloud-delivered business:
1. You only have one image to manage, and you control it. You don't have to rely on customers consuming and deploying your updates.
2. You can do partial roll-outs, e.g. with feature flags, to progressively test whether (a) something works and (b) makes the users' lives better...
3. ... and if it doesn't, you can [more] easily roll it back.
And that's not just for consumer stuff. Look at Salesforce.com for an example.
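To make point 2 concrete: a common way to do partial roll-outs (a minimal sketch, not any particular vendor's API -- the flag names and function are hypothetical) is to hash a stable user ID into a bucket, so each user deterministically lands on the same side of the rollout as you dial the percentage up:

```python
import hashlib

def flag_enabled(flag_name: str, user_id: str, rollout_percent: int) -> bool:
    """Deterministically bucket a user into a 0-99 bucket and compare
    against the rollout percentage. Same user + flag always gives the
    same bucket, so a user's experience is stable across requests."""
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100
    return bucket < rollout_percent

# Dialing the percentage from 0 to 100 progressively admits more users,
# and rolling back is just setting the percentage down again.
print(flag_enabled("new-checkout", "user-42", 100))  # True
print(flag_enabled("new-checkout", "user-42", 0))    # False
```

The hash keyed on flag name plus user ID means different flags slice the user base differently, so the same unlucky users aren't always the guinea pigs.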
Yes, obviously, Google Nearline is disk-based. However, time to last byte is much slower than disk speed because throughput is throttled. AWS is also disk-based but slow to last byte (and first too).
The question, though, is what kind of performance *Oracle* is going to deliver with its tape-based system.
"IBM adopted the Eclipse framework early on, making it the basis of its Rational programming tools."
That's not quite right, although the full story is a bit more complicated.
IBM actually *created* the software that would become Eclipse -- IIRC as the basis of its VisualAge IDE. It then donated the code to open source as Eclipse, primarily to counter the then-dominance of Microsoft's Visual Studio (although industry rumors also suggest that the real target was Sun's control of the Java ecosystem, hence the name...).
A little later, as Rational was creating its next-gen tools under the Jazz branding, it adopted Eclipse for that work. Later, other IBM software brands that needed a rich client selected (possibly with some amount of internal "encouragement") Eclipse too.
This is true, so far, but bear in mind this is the earliest of early access. Some of the stuff I've read suggests that at least some of these things will be coming along, including survival mode (the monsters will be skeletons), and vehicles / other moving parts. I haven't seen anything like Redstone, though.
For me the biggest difference in the experience is -- as you mention -- the way you gather resources for "crafting". It looks like Lego Worlds will not require lots of grinding, possibly reflecting the fact that it's aimed at a younger audience who would not have the patience for such things.
Anyway, I hope there's room for both.
Similar in spirit to the famous Tyson quote is the military maxim, "No plan survives first contact with the enemy".
There are many variations on this, variously attributed to whoever is the most famous general of the moment, but the oldest appears to be from Helmuth von Moltke in the mid-nineteenth century. His original was a great deal less elegant and succinct, but that's not entirely his fault since he spoke German.
One of the first serious pieces of code I wrote and got paid for was to add user-defined event scheduling to an operating system that was very good at milliseconds and microseconds, not so good at days and months. And one of the primary uses was to schedule the clock changes.
Since I was 17 and clueless at the time my code was, of course, hopelessly wrong*. I don't know what happened the first time the system hit the 2am "set the clocks back to 1am" event as I had buggered off to Uni at that point, but I suspect the phrase "rinse, repeat" may have been relevant.
*Among many other things, it was perfectly happy for you to set events for times that didn't exist, such as the middle of the skipped hour in the Spring.
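That particular bug is still easy to write today. As a minimal sketch of the missing validation (using Python's zoneinfo, with an arbitrary US timezone and DST date chosen purely for illustration), you can detect a wall-clock time that falls in the skipped spring-forward hour by round-tripping it through UTC -- a nonexistent local time won't survive the trip:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def local_time_exists(naive_local: datetime, tz: ZoneInfo) -> bool:
    """Return True if this naive wall-clock time actually occurs in tz.
    Times in the spring-forward gap get normalized to a different
    wall-clock time when converted to UTC and back, so they fail the
    round-trip comparison."""
    aware = naive_local.replace(tzinfo=tz)
    round_trip = aware.astimezone(ZoneInfo("UTC")).astimezone(tz)
    return round_trip.replace(tzinfo=None) == naive_local

tz = ZoneInfo("America/Chicago")
# On 2015-03-08 clocks in Chicago jumped from 2:00am straight to 3:00am,
# so 2:30am that morning never happened.
print(local_time_exists(datetime(2015, 3, 8, 2, 30), tz))  # False
print(local_time_exists(datetime(2015, 3, 8, 3, 30), tz))  # True
```

The fall-back case is the mirror image: the repeated 1am-2am hour exists *twice*, which is exactly the "rinse, repeat" failure mode above, and modern APIs disambiguate it with a fold flag rather than rejecting it.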
The original pilot is worth seeing if you get a chance. It gives a glimpse of an alternate future, narrowly avoided, in which there was no Captain Kirk, and the character of Spock was intelligent but quite normal emotionally. When it was rejected by NBC, the role of captain in the second pilot fell to Shatner, and Spock's character became an amalgam of his original role plus that of an emotionally flat female bridge officer called Number One, giving us the Spock we know today.
Incidentally Number One, played by Roddenberry's then-girlfriend, later his wife, Majel Barrett, was cut at the insistence of NBC partly because they didn't appreciate Roddenberry casting his girlfriend in such a major role, and partly because the character tested really badly with audiences. Roddenberry slipped her back into the series as Nurse Chapel, and I like to think of her character's unrequited love for Spock, the man who took her job on the bridge, as an inside joke.
I wonder how much information about this mechanism and its predecessors -- and predecessors there must be, something this complex does not emerge fully formed from even the most brilliant inventor and craftsman -- is "hidden in plain sight", in ancient documents that have been mistranslated, because the translators had no idea that such a thing might exist, and therefore entirely missed what we would now recognize as references to it.
One thing that often gets minimized or overlooked is that team forming is itself a major cost in any kind of knowledge work. There's also the soft stuff about adjusting to each others' personal styles and quirks, discovering each others' strengths and weaknesses, negotiating the division of work, ... In brief, the less the work is sharply defined, easily partitioned and clearly delimited, the harder it is to just bring a team of strangers together and allocate tasks to them.
This is why even in modern Hollywood, the ultimate "contractor market", you often find the same group of people -- director, actor, cinematographer, set designer, ... -- working together. And even in more prosaic fields such as construction, you find that people prefer to work as a crew with familiar people, even if they each have their own specialty.
In other words, TechCrunch's thesis is the kind of thing that is most appealing to somebody who thinks that people are essentially like soft, squishy Web services that you can call for transactions whenever you need one.
I gave you an upvote for pointing out the WTF in Edge of Tomorrow and especially managing to do so without spoilers, but it turns out that the reason for the problem is actually the opposite of what you suspected.
Allegedly, the original script made complete sense (at least, within the rules of the film) but the ending was much darker. Test audiences wanted a more upbeat ending, and given the constraints of time, budget, and availability of actors for re-shoots, that's what you got.
BTW, it's possible to make excuses for the ending -- and I'm sure some people will do so -- but I agree with you, there's no way to make that ending "right" based purely on the internal evidence of the movie itself.
The thing that bothers me most about movie and TV time travel -- far more than causality paradoxes, which will probably sort themselves out in the end -- is that the idea of "changing the timeline" raises profound philosophical questions that almost never get addressed. For example, if lots of people die, then the hero goes back in time and saves them, so now they are not dead and never have been... what does that mean? Did they experience "being dead" and then un-experience it? If you're of a religious bent, does that mean their souls were in Heaven but then got yanked back, leaving God going "hey, where'd they go? they were here a moment ago!". Even if you're not, what does it mean for the nature of experience that something can become un-experienced?
Travel into the future raises even more profound questions about the nature of consciousness and free will. For example, suppose I travel to next week and meet up with my friend Alice (let's call her Alice+7). She appears to be -- and believes herself to be -- a perfectly normal, conscious, freely-acting person. And if I ask Alice+7 for her personal perspective, she will probably say that she exists and is conscious at that instant in time, continuously moving forward into the future. But if I travel back to my own time, Alice+0 will say the same; and so would Alice+14 (Alice two weeks from now). It seems like there is not one Alice existing from moment to moment, as we normally think of our conscious selves, but an infinity of Alices each existing in their own moment, any of which I can visit, and each convinced of their own continuity (and of course, if Alice had the time machine she would say the same about me). If the future (relative to my personal present) already "exists" in some sense that allows me to visit it, this is the inevitable conclusion.
Most TV and movies ignore this completely -- in fact, they act as if the perspective of the protagonist is uniquely special, that his or her "now" is the one definitive, privileged, real "now".
And yet... when a couple of Star Trek characters travel in time to save the day, it's as if they say "well, we just had an adventure that not only raises profound questions about causality and paradox, but also throws into doubt all our concepts of self, consciousness, free will, and the nature of experience. Let's never speak of this again."