"work every time without fail"
Plenty of examples where that hasn't happened, but the taxpayer had to pay for it anyway, PLUS the extra cost of fixing the problem.
I used to work in video games. The company I worked for in the late '90s/early 2000s had some very expensive and secretive PlayStation 2 (or Xbox, I forget which) devkits. Thieves broke in and stole them, ignoring all the fairly valuable PC workstations nearby. Almost like somebody put them up to it... because these were not useful things to flog to your average man in a pub car park, being fairly useless unless you know what you want them for. Very fishy business!
There's a small but significant crowd of people like me using them for content creation (3D rendering, non-game Unreal Engine visuals, live shows, etc). We don't need Quadros, because they aren't really faster and cost a lot more, but we can justify this price for the extra VRAM, which is actually useful. But yeah, it's a niche, admittedly.
One for the children of the '70s/'80s. Hanna-Barbera made a Godzilla cartoon, but the execs presumably thought an aloof city-destroying atomic monster was too much darkness for the kiddies, so they gave him a comedy sidekick called Godzooky. Needless to say, they were wrong, and everyone despised Godzooky. He even ruins the title sequence:
See also: Scrappy-Doo
Well, you might remember games from 8 years ago looking not that much worse, but if you do a direct A/B comparison on a 4K monitor you'll see there has actually been quite a jump, especially in texture fidelity and the general standard of human models and faces. Plus, there's ray tracing. Have a look at Control on RTX - 10 years ago this was what architectural (non-realtime) renders looked like. I personally think it's pretty incredible.
Urgh, QuickTime. I still have to use that player on PC to view work files in an esoteric codec that isn't supported by any other player. I will never understand why it takes 4-5 seconds to start, and it has been this way for 20 years, no matter how fast CPUs/HDDs get. Almost like they deliberately gimped it for Windows by adding a timer to delay it starting up. They wouldn't do that, would they? Hmmmmmm
Hey ho. At least our police force isn't a thinly disguised paramilitary death squad free to gun down innocent people at will, unlike certain other large western democracies I could mention, and neither do we execute children or the mentally ill. So it could be worse.
It reserves a whole wodge of VRAM on your video card that you can't do anything with - even if there is no monitor connected. Win 7 and Linux don't do it. It's an issue for those of us doing things like 3D rendering on the GPU or machine learning work. Plenty of people have tried getting Microsoft to change it, and they've admitted they aren't going to bother any time soon.
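If you want to see the reservation for yourself, here's a rough Python sketch - it just shells out to nvidia-smi, so it assumes an NVIDIA card with that tool on the PATH (purely illustrative, nothing official):

import subprocess

# Ask nvidia-smi for per-GPU memory figures, so you can see how much VRAM
# is already claimed before you've launched any renders or ML jobs yourself.
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,memory.total,memory.used,memory.free",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    idx, total, used, free = (int(x) for x in line.split(", "))
    print(f"GPU {idx}: {used} MiB already claimed of {total} MiB, {free} MiB left")

Run it on an idle machine with no monitor plugged into the card and you can compare what Win 10 grabs against Win 7 or Linux on the same box.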
Wonder what he'd think of the Apple Pencil then?
Btw, I'm a designer and I love the stylus on the Note phones. It's genuinely useful, works as well as a Wacom, and it's really convenient for sketching ideas out wherever you are and painting them up a bit.
Just because YOU don't need it doesn't mean it's useless.
I use my £2.5k PC laptop for 4K video editing, in a professional capacity, for money. It's 2 years old and it's a monster desktop replacement with 4 drive slots (including 2 NVMe), a 4K screen with a fair crack at Adobe RGB, 64GB RAM, a proper desktop i7 and a desktop-level GTX 1080 - basically STILL better than this new MacBook. It's a bit heavy and hot, but at least I had the option to buy it with that tradeoff!
Nearly every diehard motion graphics artist/editor/3D designer I know who's still using a Mac and hasn't already switched (not many, any more) has finally started asking me discreet questions about moving to PC. I've even got a stock document written for them. It's dated 2016. I've sent it to 15 people so far. They are all happy with their switch.
Apple do not care about professionals. That is the message that's been going out for years, and it has been received by professionals, loud and clear.
"my effusive 15-year-old never expressed any interest in returning to the virtual world."
This was my feeling exactly. We got hold of a couple of Oculus DK2 devkits a while ago, and my first experience playing Alien: Isolation was that it was quite simply the best thing since sliced bread (or sliced space trucker). This is the future, it's amazing, so immersive, I thought. Fast forward about 2 weeks and it was on my shelf, never to be touched again for over a year, apart from as a party novelty to show visitors (and work never asked for it back - they lost interest too). I have no idea why the initial compelling nature of it faded so rapidly. Maybe it was the multiple-cable faff every time I used it, or maybe it's because the thing is so inherently antisocial.
Well, since AMD finally got their arses in gear last year and challenged the lazy/greedy Intel gits by releasing Ryzen/Threadripper, we finally got a decent jump in performance again, provided you are running multithreaded apps. For an average punter who isn't 3D rendering etc, these advances aren't that noticeable, as single cores haven't got all that much faster since about 2012 (at least not in line with the old Pentium days).
FWIW my new Threadripper is about 2x faster at rendering than my old Intel eight-core 5960X, for a bit less money, but it took more than 2.5 years rather than 18 months to come along. That eight-core was about twice as fast as the Sandy Bridge quad core it replaced, but again there were 3 years in it, it cost more than double the amount, and it's actually slower on single threads. So Moore's law had definitely broken by that point.
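If you want the back-of-envelope behind that, here it is in a few lines of Python (the figures are just the rough ones above, nothing more scientific):

def yearly_gain(speedup, years):
    # What per-year improvement does "speedup x over years" imply?
    return speedup ** (1 / years)

print(f"2x in 1.5 years (classic Moore-ish cadence): {yearly_gain(2, 1.5):.2f}x per year")
print(f"2x in 2.5 years (5960X -> Threadripper):     {yearly_gain(2, 2.5):.2f}x per year")
print(f"2x in 3 years (quad core -> 5960X):          {yearly_gain(2, 3.0):.2f}x per year")

That works out at roughly 1.59x a year on the old cadence versus about 1.32x and 1.26x for my two upgrades - a noticeably flatter curve.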
Another thumbs up for the Note 8 here. Absolutely love it: the stylus is better than ever, the screen is amazing, battery life is better than the old Note 4 (even with a new battery) and NO NOTCH. That notch makes my OCD twitch. I can't believe that abomination popped out of such a supposedly design-focused company.
Since they have a monopoly on a tightly integrated stylus that works brilliantly, I decided to suck it up and buy one. My Note 4 is finally packing in, I absolutely love having a stylus (I do bits of sketches, drawings and idea concepts every day with it), and on the Note 8 it works better than ever. It's pretty much a Wacom Cintiq in my pocket. Could do without the price tag and the Bixby nonsense, but I'm very happy so far.
There are a ton of those old Sandy Bridge dual-Xeon blades knocking around for mega cheap on eBay. We bought a bunch of them for a 3D rendering project - £350 each, and they are nearly as fast at rendering as the eight-core i7 PCs which cost nearly 2 grand each. A bit apples and oranges, as they can't be used as workstations (no GPU), but for CPU number crunching, which is what we bought them for, they are great.
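Rough price-per-performance sums, purely as a sketch (the "nearly as fast" figure below is my own ballpark, not a proper benchmark):

# Ballpark £ per unit of CPU render throughput - numbers are rough guesses
machines = {
    "used dual-Xeon blade":  {"price_gbp": 350,  "relative_speed": 0.9},  # "nearly as fast" guess
    "8-core i7 workstation": {"price_gbp": 2000, "relative_speed": 1.0},
}
for name, spec in machines.items():
    cost_per_speed = spec["price_gbp"] / spec["relative_speed"]
    print(f"{name}: roughly £{cost_per_speed:.0f} per unit of render speed")

On those numbers the blades come out around five times cheaper per unit of render throughput, which is why they stack up for a pure CPU farm.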
I've freelanced at two different motion graphics companies over the last year who finally got fed up and dumped all their Mac Pros for better-specced Windows PCs (and this despite them all being Apple fans in general - they just couldn't deal with the limited-spec hardware and lack of upgradeability any more). That's the way it's going to go in more creative companies, especially ones that do video and animation, unless Apple pull their finger out soon...
Presumably he meant media servers, which are very much targeted at running lots of large displays/projectors etc, and a lot of them run on some version of Windows - e.g. Hippotizer, Avolites Ai, d3, etc. However, the Windows they run is usually pretty locked/stripped down, and you'd like to think they'd have this covered.
We do the odd live event using cheaper desktop machines, and back in the day I was caught out by a random Windows update which rebooted the machine while I wasn't there, showing the whole reboot-to-desktop cycle on the big projectors. That was embarrassing. But it's a good way to learn for the future...