" an operating system that crosses the streams ......"
That might not be a good thing to do.
Until last week, Microsoft’s $2.5 billion purchase of Mojang AB made no sense to me. Undeniably popular in the current generation of kids, at some point, Mojang’s Minecraft will fade, like every other fad before it. Mojang doesn’t even have a follow-up to its breakthrough first title. Back in 2013, Minecraft creator Markus ‘ …
"Until last week, Microsoft’s $2.5 billion purchase of Mojang AB made no sense to me"
Well, if they port Minecraft from Java to .NET, then the average graphics card will likely be able to run it with high-resolution textures instead of ugly 16x16 chunks...
Java really does suck though.
I almost took it seriously, but then I read past the headline and stopped believing after the first paragraph. Someone has screwed up and posted his 1st of April story three months early. It's probably no great tragedy, though, since a good April Fool's spoof is supposed to be somewhat plausible so that the more gullible sort fall for it, while this one is just way over the top.
For the 1st of April, try something like "Windows 8 sales skyrocket", or "fans line up around the block to buy the latest Windows Phone". You'll get at least some people to fall for either of those.
I'd take it seriously. With MS's development of the HoloLens, their future direction is a three-dimensional operating system. A game where you can connect blocks together and make something work suddenly holds a new interest for MS.
Watch some HoloLens videos and Minecraft shows up quite a bit. A lot of Minecraft concepts will bleed into the Windows OS.
"Games are one area where Microsoft consistently innovates"
Stopped reading at this point. Couldn't take the article seriously after that.
I've never seen a game MS actually made themselves that was "innovative". Every good game I've seen on an MS OS has always been from a third party, or purchased from a third party (factoid: MS bought Bungie just to get Halo after seeing it demoed at a conference).
There might be exceptions but I've never heard of them or played them.
Most of Apple's technology - including multitouch - was purchased as well. Google's Android is built on borrowed technologies developed elsewhere, and other Google products were acquisitions as well.
Small companies far too often innovate more, and faster, than bigger ones, and get purchased by the latter. A purchase can be just wasted money, or you can build successful products on it.
"Direct3D has always performed better than OpenGL"
"|Yeah, I'm not sure that stands up to analysis."
Historically, OpenGL was for sure always playing catch-up - not just in performance but in functionality too. Maybe OpenGL is finally close to parity. Until DirectX 12 arrives, anyway:
http://www.pcgamer.com/intel-show-off-the-directx-12-performance-advantage/
Didn't Gabe of Valve and Steam fame port Left 4 Dead to Linux and come out with 30% better graphics performance when leaving Direct3D in favour of OpenGL?
Or did I imagine that in the Reg article?
I can think of lots of MS games; most are Ensemble or Bungie. Not sure any are innovative. Halo is a clone of most things going, and based on a book; Ensemble makes very good RTS games.
I digress.
I'm a total Halo fanboy and I approve this message.
Aside from the single-player story, every iteration has only made it more and more CoD-ified.
Love that Halo Wars, though. It was nice to see it in its original vision, even if it was a simplistic version.
While Microsoft Game Studios has funded some of the best games of the past decade, I don't think that's what the author was thinking. I'm leaning towards achievements outside of the games themselves, like pushing Sony into online multiplayer after the Dreamcast failed and console online multiplayer died with it.
"Stopped reading at this point. Couldn't take the article seriously after that.
I've never seen a game MS actually made themselves that was "innovative". Every good game I've seen on an MS OS has always been from a third party, or purchased from a third party "
He is talking about Direct3D and DirectX, not the games themselves.
I'd say Kinect - gaming hardware (used to make "some" innovative games), in case you haven't heard of it either - was an innovation in gaming. As one can see, the future of computing seems to be coming out of a blend between the real and the virtual through that little innovation.
I know it irks you because it's Microsoft, but don't worry, Apple will catch up next decade and you can point your little thumbs skyward again.
I'm sorry, but I think going from kids mining pixelated blocks in Minecraft to data mining is a bit of a stretch, other than both having the word "mining" in common. I appreciate that the author was probably pointing out how these tools would be interfaced, but I'm still not convinced (putting aside that the original Minecraft didn't have motion controls). While I believe AR like the HoloLens and presumably Magic Leap will have some very cool, very real applications, I'm not convinced it will be seen as a fit for all business use cases for the foreseeable future.
For one, some of the very complaints about Google Glass apply to the HoloLens - even more so, as it can truly obstruct your vision, from what testers have claimed. Even if you're not using it outside, I don't think this is tech you'll be using in cubicle farms, unless you find watching your coworkers stumble over furniture and themselves funny.
But the biggest bottleneck, as always, is adoption, and I don't mean simply units sold. Who is going to build the databases and SQL queries that these data mining tools expose visually? Who's going to take the extra time to identify and group every element of a 3D design so that someone in management can take it apart like a jigsaw puzzle? The point is that data in any form isn't going to magically format itself to adapt to our new 3D visualization hardware. It's an investment that will pay off in some areas and not in others, despite the cool factor.
*edit* On a second read I realized that I failed to detect the sarcasm the first time.
Not many people have been exposed to the tools involved - sheer cost being just the first limitation - but computer-aided SQL database engineering, related software engineering, and administration tools have been in use for at least a decade [http://www.embarcadero.com]. Back then I was using E/R Studio to do that work, and it was almost entirely automated on the code generation side. I haven't heard of similar graphical design/code automation tools for Hadoop-related work; that section of the field isn't that old, so I wouldn't be surprised to see an unfulfilled niche there.
Judging from the rest of the post, it's all sarcastic. However, given how much businesses, government, and the military waste tweaking PowerPoint presentations, it shouldn't surprise anyone if such techniques become popular, given the lack of qualified people available to do this work professionally. While they may call it "data science" today, the field is actually quite old, as these things go: econometrics.
I'll admit to having a bias against expensive development tools. I often found that features that automated any aspect of my job actually created more work at the end of the day. That said, I've not looked at Embarcadero's products, though I had a chuckle when I read that they bought InterBase from Borland. For better or worse, I had some experience dealing with a custom application written on top of InterBase a long time ago.
When I query a database, in my head I visualise the joins and the filters. It's not much of a stretch (pun intended) to allow me (well, let's say management) to make those joins and place filter blocks in 3D space instead.
Each block a component. I think it maps very well to almost any case, since I think and act in 3D myself. Don't you?
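Something like that is easy enough to sketch even in today's languages. A minimal Python sketch of the idea - the Table/Filter/Join "blocks" are made-up stand-ins for whatever components a 3D interface would actually render:

# Query "blocks" you could snap together in 3D space. The classes are
# hypothetical stand-ins for whatever a HoloLens-style UI would render.
class Table:
    def __init__(self, name, rows):
        self.name, self.rows = name, rows
    def run(self):
        return self.rows

class Filter:
    def __init__(self, source, predicate):
        self.source, self.predicate = source, predicate
    def run(self):
        return [row for row in self.source.run() if self.predicate(row)]

class Join:
    def __init__(self, left, right, key):
        self.left, self.right, self.key = left, right, key
    def run(self):
        index = {row[self.key]: row for row in self.right.run()}
        return [{**row, **index[row[self.key]]}
                for row in self.left.run() if row[self.key] in index]

# Snap the blocks together: customers JOIN orders, filtered to big orders.
customers = Table("customers", [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}])
orders = Table("orders", [{"id": 1, "total": 120}, {"id": 2, "total": 30}])
big_spenders = Filter(Join(customers, orders, key="id"), lambda r: r["total"] > 100)
print(big_spenders.run())  # [{'id': 1, 'name': 'Ann', 'total': 120}]

The plumbing is the boring bit; the interesting question is whether grabbing and snapping those blocks in mid-air beats typing the SQL.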
Projecting a nice big TV in front of my eyes, instead of my having to carry one around, would be quite nice, though I'd pay more for a 3D TV projection.
I am not entirely convinced they have found their killer app.
The HoloLens may not redefine the entire computing landscape, but it certainly could be a really useful tool for lots of applications, just as Google Glass has/had its applications but was marketed as a wide-adoption device, which it obviously wasn't.
Neither is going to see phone-level adoption.
I imagine games, e.g. Army Men or Micro Machines, would be fun if the game could incorporate the objects in the room around the user as obstacles. If the headsets could be reduced to somewhere near Google Glass size, I think you might manage to convince a fair percentage into adoption (might not be as high as Nintendo Wii level, but still fair).
> Right, now how do we do it? You've got to mine the data first to find some way to display it in 3D so that it can be mined.
Maybe some massive software organisation with thousands of programmers to hand will allocate some resources to invent/build tools to automate such a thing?
Perhaps this organisation will add APIs and SQL hooks so that current systems can instrument data as it is created, as well as updating existing systems.
Perhaps Excel will get additions to visualize 3D spreadsheets that already exist, provided they were 3D for logical reasons.
Perhaps some CAD software, Solidworks etc., will update to visualize output as something that you can 'enter' like a house (or a cellphone - you could shrink, or it could become somewhat larger) and view from all angles. I admit it is less AR than Land of the Giants, but it is still a sensible application.
Perhaps the BBC will supply weather data in such a form that a meteorologist could look at the weather in 3D, traverse the planet and determine weather patterns for us. Even cooler, send us the data and let consumers look at the weather out of the window FOR TOMORROW!
For a system that works, even a relatively bulky one, there are huge possibilities, the list is practically endless.
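And to be fair, the data-preparation half of that list isn't exotic. A rough sketch of the idea in Python, with made-up points and cell size: bin records into a coarse 3D grid so a renderer could draw one Minecraft-ish block per occupied cell:

# Bin (x, y, z) records into a coarse 3D grid so a renderer (HoloLens or
# otherwise) could draw one block per occupied cell. Data and cell size
# are made up for illustration.
from collections import defaultdict

def voxelise(records, cell=10.0):
    """Map (x, y, z) points to integer grid cells and count hits per cell."""
    grid = defaultdict(int)
    for x, y, z in records:
        grid[(int(x // cell), int(y // cell), int(z // cell))] += 1
    return grid

points = [(3.0, 14.0, 2.0), (7.5, 12.1, 4.4), (55.0, 3.0, 9.0)]
for cell, count in sorted(voxelise(points).items()):
    print(f"block at {cell}: density {count}")
# block at (0, 1, 0): density 2
# block at (5, 0, 0): density 1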
I have no problems envisioning [pun thoroughly intended] all of that, and that's barely scratching the Surface [yep, again with the pun]. Back in 1985, I picked up one of the first Amiga 1000s off the production line (#2038, to be exact) with an eye for using it to replace the mainframes I was using "back in the day." Truly preemptive multitasking, huge (for the time) memory, co-processors, SCSI hard drives, etc. Games were nice, but that wasn't what I picked it for at the time. Ditto the Amiga 2000, which received a massive upgrade to its baseline stats (68030/68882, 4 MB 32-bit RAM, 24-bit Retina graphics, 386sx Bridgeboard...) as they became available. Serious money went into these, but when I was in the military my income was completely disposable, and I wasn't into cars at all. These machines were fantastic game machines, but that wasn't the intended purpose; I was totally off into the realm of computer-aided engineering in multiple fields (software, hardware, database, electronic, ...). Still, it was their being fantastic game machines that did the trick.
I can't say if that principle still holds, although my best machine here can eat games for a snack and barely notice the load. Seriously. Come to think of it, I've sunk about the same amount of cash into this one as I did into that Amiga 2000 back then (more than $6K, and I'm not done yet). I can understand where Microsoft is going with this, as I've been there, done that (and burned the T-shirt) before. I wonder what their price target for the HoloLens is? I think this could be fun, although I'm not at all looking forward to how much the upgrades to AutoCAD and the rest of the tools are going to cost. [Hamster on treadmill?]
Lots of thumbs up. No, you merely need to categorise it, distinct from mining it. You know that bit where they bring rocks out of the ground and say this bit is iron, this bit is stone, and this bit is mud? Finally someone asks: how much iron vs rock did we get? That's the data bit of the mining that took place.
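And that tallying step is the trivial bit. A quick Python sketch with a made-up haul:

# The "data bit" of mining: categorise what came out of the ground, then tally.
# The haul itself is invented for illustration.
from collections import Counter

haul = ["iron", "stone", "mud", "iron", "stone", "stone", "iron"]
tally = Counter(haul)
print(f"iron vs rock: {tally['iron']} vs {tally['stone'] + tally['mud']}")
# iron vs rock: 3 vs 4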
>"Direct3D has always performed better than OpenGL,"
LOL.
Read this: http://www.extremetech.com/gaming/133824-valve-opengl-is-faster-than-directx-even-on-windows
[Note: to be fair, I used to make 3D engines in the video game industry for a living, and my view is biased, as I find both of them too far from the metal to get decent performance... I have a slight preference for OpenGL in terms of standards, but not the ES one.]
"Read this: http://www.extremetech.com/gaming/133824-valve-opengl-is-faster-than-directx-even-on-windows"
They spent ages optimising it for Linux and OpenGL - so, surprise, it's faster! Valve was desperate at the time to try to move people away from the Windows Store...
If OpenGL was really faster, then why not use it on Windows, seeing as OpenGL on the latest Windows is faster than on Linux? They compared the latest Ubuntu Linux with an old version of Windows - Windows 8 has significantly faster graphics performance...
http://www.phoronix.com/scan.php?page=article&item=intel_haswell_win8&num=1
... in the Tenuous Extrapolation Awards 2015.
The PC is going nowhere, not because of some pie-in-the-sky new world of 3D visualisation but because, as with its supposed heat-death in the furnace of phone and tablet adoption, it remains the best tool for an awful lot of jobs.
"..[The PC] remains the best tool for an awful lot of jobs."
I suspect you mean, "monitor, keyboard and mouse remain the best tool for an awful lot of jobs."
It's becoming irrelevant where the CPUs are: they could be in your mobile phone wirelessly connected to a dock, they could be on the other end of a network connection, or they could be in a box on, or under, your desk. If you want to convince me there's a future for the PC, you need to convince me the box on the desk is better placed than the box in your pocket or the box in the datacentre. (And remember, cheapest normally wins out over best engineered or most usable.)
> (And remember, cheapest normally wins out over best engineered or most usable.)
No, it doesn't. "Easy is better than free" was applied to iTunes (over illegal downloads).
Apparently, iTunes was easy (I have never used it, but have read awful things here). I may have little trouble using BitTorrent, but apparently this is not common.
Netflix charge money and have a large user base - it is not all about morality; for many it is just simpler and easier, available everywhere (and reasonably priced).
A PC in your house - connected, obviously, but capable on its own in any case - will probably remain the choice for many.
It may even be the case that a compute-intensive set of applications, such as the HoloLens might create, causes more households to get a PC system, even if it is wirelessly connected and has no direct inputs such as a mouse. Many will call it a games console, but it is a Personal Computer all the same.
I get that we would not need local compute power if we all had gigabit connections with near-zero latency that never failed, but one of those could exist in five years; the other... not so much.
What is a PC anyway?
A PC is a personal computer. Macs are PCs too :-). Tablets and smartphones also.
But a true PC is one running a Linux distribution like Debian. It's very personal. :-)
Yes. The keyboard, mouse and display are the parts that are important, but the cloud is not personal until it's yours. Someone said that distributed computing is when your computer does not work because some other computer you know nothing about is broken. I don't like that happening to me.
Depends what you need from a PC. If you need a basic browser, office applications or low- or medium-end games, you can get by on a mobile phone or tablet with an external screen and keyboard (I frequently work like this) or some sort of virtual desktop system.
If you want access to high-end games running at high resolution (I know a few PC gamers who consider even 1080p to be medium resolution, and every streaming game service I know of struggles with 1080p), 3D graphics or video editing software, you need a machine with a decent graphics card that can run at high resolutions and has lots of CPU power.
As an experiment, I installed Premiere on a VM running on our VMware cluster at work. Even running over PCoIP (which is supposed to reduce lag), and on a gigabit network, there was a one- to two-second lag between any action the user took (including moving the pointer) and the results of that action appearing on screen.
It is a fairly interesting use of AR, but I'm thinking most will stick to virtual tabletop applications to share their maps and positions. It's bound to be cheaper and can support remote gaming sessions. Not that you couldn't do a remote session with AR too, but what would it add to the experience? I suppose it would be great for those who really like to RP, in that you could have outfits superimposed on each other. But I'm assuming there are some technical hurdles in rigging 3D models to move in sync with video of a person. Well, at least convincingly.
I was actually thinking of animated characters on the table, similar to the Star Wars holo-chess game.
It also brings in the whole idea of having a visual display on each character for things like weapon range, the distance you can reach by walking or running, etc.
A friend also had the idea of using dice with little accelerometers in them, so they automatically tell the computer what you've rolled (no more "it fell on the floor, but it was a 20, honest!").
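The face detection is simple in principle: once the die is at rest, the accelerometer sees only gravity, so the face whose normal points most nearly straight up is the one showing. A rough Python sketch for a d6 - the sign convention and the axis-to-face mapping are assumptions a real die would need calibrating for:

# Sketch of the accelerometer-dice idea for a cubic d6. At rest the sensor
# sees only gravity; assume here the reading is the gravity vector (pointing
# down) in g units - real IMUs differ in sign convention, which calibration
# would absorb. The axis-to-face mapping is a made-up calibration.
FACE_NORMALS = {
    1: (0, 0, 1), 6: (0, 0, -1),   # opposite faces of a d6 sum to 7
    2: (0, 1, 0), 5: (0, -1, 0),
    3: (1, 0, 0), 4: (-1, 0, 0),
}

def face_up(accel):
    """accel: (x, y, z) gravity reading in g; 'up' is its opposite."""
    up = tuple(-a for a in accel)
    return max(FACE_NORMALS,
               key=lambda face: sum(n * u for n, u in zip(FACE_NORMALS[face], up)))

# Die resting flat, gravity reading ~1 g straight down the z axis:
print(face_up((0.02, -0.01, -0.98)))  # -> 1 (the +z face is up)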
Actually, thinking about it, with multiple HoloLenses, and night vision and infravision programmed for each character, you could seriously limit some players' views, and checks for secret doors by the rogues and elves become interesting, as no one else would know if they've spotted something. They don't even have to have the same map displayed, so splitting the party up becomes really simple (a rough sketch of that per-character filtering follows below).
Think multiplayer Baldur's Gate in AR with real dice :)
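That per-character filtering is just set logic under the hood. A rough Python sketch, with made-up vision types and map features:

# Per-player map filtering: each headset only renders the features its
# character can perceive. The vision types and map are invented examples.
FEATURES = [
    {"what": "corridor",    "needs": set()},
    {"what": "secret door", "needs": {"spotted_secret"}},
    {"what": "dark alcove", "needs": {"darkvision"}},
]

def visible_to(character):
    """Return only the features this character's senses unlock."""
    return [f["what"] for f in FEATURES if f["needs"] <= character["senses"]]

rogue   = {"name": "rogue",   "senses": {"spotted_secret"}}
elf     = {"name": "elf",     "senses": {"darkvision"}}
fighter = {"name": "fighter", "senses": set()}

for ch in (rogue, elf, fighter):
    print(ch["name"], "->", visible_to(ch))
# rogue -> ['corridor', 'secret door']
# elf -> ['corridor', 'dark alcove']
# fighter -> ['corridor']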
"Undeniably popular in the current generation of kids, at some point, Mojang’s Minecraft will fade, like every other fad before it"
Actually, you're wrong about that. The current generation will be replaced by the next generation, and the next, and the next... It's going to be popular for a long, long time, if not forever - like chess.
Minecraft is all about a cube... that's its strength. It doesn't need to update its graphics either. Microsoft's move will turn out to be a super move, I can guarantee you that.
Btw, you do know that a lot of middle-aged people play Minecraft too, right?
Randy: Why, why would we punch trees?
Corey: Just use your fucking brain! How do you get wood? [Randy isn't sure] 'Ow do you get wood?!
Randy: [nervously] Watching informative murder porn?
Corey: NOO NOO, in this Minecraft forest, how do you get wood?!
Randy: Puh, punching trees?
Corey: Riiight. You punch the trees to get the wood, you get the wood to build a cabin.
Randy: Oh, I see. So when does the game start?
Corey: You are playing the game; this is the game!
Randy: ...I don't get it.
Corey: That's because you're thinkin' like a dad. Minecraft, it don't got no winner. It don't got no objective. You just fuckin' build an' shit. And seein' if other things can come and knock it down. Now, let's click on the inventory, and let's filter through the skin!
Randy: Yes ah, I'm getting it now.
Gerald: You are?
Randy: No.
I think Microsoft's purchase of Minecraft for $2.5 billion falls into the same category as Facebook paying $2 billion for Oculus. They are both massive overpayments in a fairly frothy market, and now analysts are struggling to come up with rational justifications for them. The real explanation is the simple one: it is probably a bubble.
There were a few winners from the first dot-com boom (Amazon, eBay, PayPal, Skype). I think a few current giants are just spraying money around hoping to buy the next big thing. But I think marketplaces and social networks are natural monopolies, as the more people who use them, the more valuable they are. I am not sure that either Minecraft or Oculus has any network effects, however.
For a 3D printer you buy a really cheap PC, or an Arduino (which is a bare-bones microcontroller board).
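The host side really is just streaming G-code down a serial port. A rough Python sketch using pyserial - the port, baud rate, and the wait-for-"ok" handshake are assumptions based on typical RepRap-style firmware:

# Stream G-code lines over serial, waiting for the firmware's "ok" ack
# before sending the next line. Port, baud rate, and file are illustrative.
import serial  # pyserial

def stream_gcode(port, path, baud=115200):
    with serial.Serial(port, baud, timeout=10) as printer, open(path) as gcode:
        for line in gcode:
            line = line.split(";")[0].strip()  # drop comments and blank lines
            if not line:
                continue
            printer.write((line + "\n").encode("ascii"))
            while not printer.readline().strip().startswith(b"ok"):
                pass  # firmware still busy; keep waiting for the ack

# stream_gcode("/dev/ttyUSB0", "benchy.gcode")  # hypothetical port and file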
You're stupid if you buy 8.1 for 3D printing. Also, on high-end PCs Linux runs pretty well.
The only area might be high-end gaming, but I think these days people play more simple games on Android than on Windows.
A move away from traditional desktop software and Windows as MS's primary sources of income makes sense. OSes are mature products, and there is no real need to upgrade just because a new version is available, for any OS (Windows, OS X, or Linux). There are very few "features" that can be added to an OS that are really needed, or would ever be used, by the average user. The same is true of office software: the major features users need are already included. A new release or subscription does not truly make economic sense for most users; skipping releases is a very viable option.
Additionally, the OEM desktop/laptop market is behaving more like a mature market, where most of the new equipment sold is replacement for older, deceased equipment.
Quote: "a company that until last week seemed destined for irrelevancy"
Reply: A company that's still destined for irrelevancy. Yet again Microsoft has lost another banking system: NCR is moving hundreds of thousands of its ATM money machines from Windows to a Linux OS. It's never mentioned how many Linux users have stopped using Minecraft or Skype since Microsoft took them over. Buying companies out doesn't guarantee paying customers. Will these customers be happy when they have to buy new computer systems because M$'s Windows 10 won't run on their old ones?