Please stop saying 'Brit'. I hate the term so much; all I can hear is an American saying it harshly and tastelessly.
Every version of Xcode and the OS removes more useful things and prevents old software from working, like 32-bit stuff. And users get a 'free' OS update, which means that software magically stops working, so they have to update it at cost.
You have to pay $100 a year just to be a developer on one OS, otherwise it warns the user that your software is from an 'unknown' developer when installing, even if you paid Apple at the time the software was first sold.
That makes far more than seven, but I can't be arsed to detail any of it, because as a developer I fucking hate Apple, sorry. They've got to be the biggest twats of them all.
If I wrote some software and it took me two years to write... oh right, I did. And yes, it makes my blood run cold whenever I see it hacked, and the hacker sends me a message to inform me how clever they are at breaking the registration code. Well, yeah, they need to be in prison for as long as I want them to be, but I think they're based in China somewhere, so that idea's gone out the window then!
You wouldn't mind if I nicked it and posted a complete e-book version online for free?
No, because you think it's OK to copy, right? It's advertising right?
It's a very good book. Now, how many years did it take you to write that book? A couple of years?
Oh, and you wanted to make a little money from it, just a little bit of holiday money perhaps?
Well, tough, the internet has it now, wuhahahaaaa!!!
[/devil's advocate mode]
"So, if this news station had been running an enterprise version of what looks like either Vista or 7, the popup would never have ruined their lovely weather report."
But how would they have known this? They're just meteorologist presenters with a laptop.
Besides, the next day they'll reboot it and it'll get stuck in a crash loop, like after the last update.
It's the 'find in page...' feature; it's very useful when looking through research.
Also, they were the first to introduce the single address-bar-as-search-bar idea.
They were also the first to run WebGL, which is nice...
I find the Firefox UI is quite broken in places, and its font rendering, to start with, was truly horrible and pixelated; they eventually fixed it, but it took a while.
The Patent Office - a great big mother cow surrounded by ravenous lawyers.
They don't protect your invention, look into whether it actually works, or even check whether it's been done before. They just take the money and generate a lot of business for lawyers and patent writers. It all sounds like a scam. And don't get me started on software patents...
Are places like DinoPC included in this list? Are self builds in this report? Yeah, some people have gone to the dark side that is Apple, but journalists and money wasters don't really count. And some have gone to Linux. But Windows remains the most popular OS out there despite journo efforts to make it go away.
The trouble is that the show implied he'd been waiting that long to get out of the test. But the truth is that he was re-made every time he got caught, and didn't remember what had happened before. So in reality he was only in there a few days or hours or whatever.
Moffat's scripts tend to be collections of ideas that don't often make sense, but are there to create emotions in the viewer. The last two were not that bad, but let's hope the Christmas one doesn't have magic elves or more bad Santas. It's a sci-fi programme, BBC, not a pantomime; by all means make pantomimes, but please leave the Doctor alone.
They were just genuine experiences at the time, not opinions born from nothing. And I've seen some bloody awful coding techniques.
Writing ARM assembly for the Archimedes was great fun, and I could test it instantly.
C++ compilers these days often do a better job of optimising code than writing assembly yourself, unless you've got a simple block of intrinsics to do. (I write time-critical DSP code now.)
But those compilers don't stop coders leaking memory everywhere, needlessly and lazily.
I used to love ARM assembler coding. Every instruction was exactly the same size, pipelined in a humanly understandable way. The MLA instruction amazed me at the time - multiply two registers, add a third, then store the result in a fourth. RISC made CISC processors look like a joke, with their internal microcode burning away power.
There are a worrying number of coders out there who couldn't give a shit about that, of course, their wasteful programs chomping up resources like they're infinite. And these fucking idiots wonder why things crash after a few hours! ;p
Biting the hand that feeds IT © 1998–2020