Re: Love Wayland
“You're doing it wrong”
Oh of course. Silly us.
53 publicly visible posts • joined 20 Nov 2012
You might like to compare the following workflow for updating my wife's laptop (say) with a Windows updateathon:
$ ssh me@wifeslaptop
$ sudo pacman -Syu
<hit a few keys, then ask her to reboot when she's ready, or don't bother mentioning it; it'll still work>
I notice your wife isn't doing the updating. She might if it were Windows.
What I'm trying to say is that to most of the population, a command line is about as inviting as a swimming pool with a shark in it. The mistake that Linuxistas sometimes make is to dismiss this type of user as ignorant or stupid. The fact is they have different life goals and interests; to them, computers are not interesting gadgets, just the tools they're forced to use.
C++ is a language designed by comp sci profs, for comp sci students; to appeal to an ideology
By "ideology" I think you mean "methodology". Well... Perish the thought that we should be using the proven OOD methodology that has successfully delivered increasingly large and complex applications over the last 20 years.
As for unpredictable, take a look at the varying behaviour of STL implementations
STLs vary across implementations, but each implementation is predictable in itself. However, a JIT, garbage collector and the other runtime management routines are unpredictable and uncontrollable by their very nature, even within a single implementation. You cannot have it both ways.
C is lighter and more predictable. For higher level stuff
C is just as predictable as C++, because both compile to machine-code executables (not bytecode) and are consistent at runtime; there's no comparison to be made here.
Java runs rings around it unless you can waste many magnitudes more time fine tuning your C++ code.
No, the opposite is actually true. C++ is compiled to machine code and runs, end of. Java is compiled to bytecode, which needs a JVM at runtime, and a JIT cannot hope to optimise to the same degree as an ahead-of-time compiler. For evidence, google "performance c++ vs java" or see:
http://benchmarksgame.alioth.debian.org/u64q/java.html
Define "OS"
By OS I mean kernel, monitor, hardware drivers, UI shell, plus a bit of the "shovel-ware" such as the browser, editors, whatever. Performance is everything for these components; it can make or break them. C++ can give you this and still allow some OOD implementation. Absolute no-brainer.
All K&R and/or assembler, not C++.
By K&R you mean C? Why not just say "C"? It's less to type. Yes, that would be a good choice too, but why choose it over C++? That doesn't make sense; it just smacks of personal preference, idealism, stubbornness and missed opportunities.
Assembler? What, you're joking, right? Lots of development time, no portability, hard to maintain.
I don't think C++ is going down the pan any time soon, simply because of one thing: if you have performance-critical systems then C++ is a sound choice. There's no interpreter, unpredictable JIT or garbage collection. It has a proven success record, excellent support (paid and free) and a huge developer community.
OS development, real-time systems, network software and drivers: without C++ they would slow your whole system down, and instead people would be complaining about "too many layers of cr@p", "inefficient software", "I remember when...", etc.
C++ is here to stay and keeps doing its job.
BTW I'm not a C++ evangelist, I like Python too.
That's luxury. In the 90s I switched to 10-exposure medium format because I thought 36 exposures was making me too slap-dash. Medium format (in my case a second-hand Mamiya TLR) was great because developing and enlarging it was easier than 35mm: detail and tones were fantastic, dust was less obvious, and handling was easier.
In the early 2000s I remember the great Usenet flame-wars about 35mm-vs-digital and later medium format-vs-digital.
Nowadays my 20MP Sony Xperia Z3 Compact phone can probably pick out more detail than Fuji Velvia, but it still doesn't have the colour, tonality and veracity of film. RIP silver.
I cannot work this out. IPX7 says one metre is OK. Would have expected more. My £20 Casio is good for 50M and it's been swimming with me >20 times. Yes I know Casio != iWatch but if a £20 watch can be usefully waterproof IMO then why can't a $300 smart watch? Would have been a good selling point I think.
Interesting(ish) factoid: the original price of the BBC Micro in 1982 was £399. The effect of inflation means that in today's money that's about £1,300. That will get you (for example) a Toshiba Satellite P50-B with an i7, 4K display, 8GB RAM, etc. However, the Tosh won't generate the same excitement in the average household as the Beeb did in 1982.
This is the most over-hyped non-existent product in history. Why someone would wear a watch that will surely (unless something exceptional emerges) need charging every few days is beyond me. Other smartwatches have bombed; this one may be different, but only because the iSheep will obey and "go buy".
Expect queues in Oxford Street, ecstatic young males looking like some weird church sect, high-fiving the blue-shirted iClergy.
"Is my life 43690.67 times better for this technology though?"
I asked myself a similar question when I purchased a Raspberry Pi last year and calculated that BBC Basic runs about 2,500 times faster on the Pi (running RISC OS) than on the original Beeb (and the Pi is 1/20 the price in real terms).
Is the Pi 2,500 times better than the BBC Micro? Not quite ;)
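Being literal about it, the speed-up and the price difference above can be folded into one price/performance figure. A toy calculation (both ratios are my own rough numbers from the Pi test, so treat them as illustrative):

```python
# Toy price/performance comparison using the figures above.
# Both ratios are rough, illustrative numbers from my own test.
speedup = 2500     # BBC Basic runs ~2,500x faster on the Pi
price_factor = 20  # the Beeb cost ~20x the Pi in real terms

# Speed gained per pound spent, relative to the BBC Micro.
bang_per_buck = speedup * price_factor
print(bang_per_buck)  # 50000
```

So on pounds-per-MIPS the Pi is something like 50,000 times "better" — which rather shows why a single "times better" number is meaningless.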
Then why oh why don't future civilisations use LibreOffice 9538485.3 on Ubuntu 58486324.1 instead, thereby "sticking it to the M$ man"? It's free and stable, and they've finally got the desktop UI and printer drivers nailed (remaining printer, UI and network issues TBA in release 58486324.2 and later).
@Charles Manning:
"The only way to get charge out of a capacitor is to run it down to 0 volts."
I didn't think that "zero volts" was absolutely required, whatever you mean by that.
"... but then you're really making a perpetual motion machine"
No, you are not "really" making anything of the sort; in fact you are not even trying to. You are just trying to make a more efficient machine. I don't know how you got to perpetual motion machines.
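For what it's worth, the stored energy in a capacitor is E = ½CV², so most of the energy comes out long before you hit zero volts. A toy calculation (all numbers here are made up for illustration):

```python
# Energy stored in a capacitor: E = 0.5 * C * V^2.
# All numbers below are illustrative, not from the discussion.
def energy(capacitance_f, volts):
    """Stored energy in joules."""
    return 0.5 * capacitance_f * volts ** 2

C = 1.0                # farads (made up)
full = energy(C, 5.0)  # charged to 5 V -> 12.5 J
left = energy(C, 2.5)  # run down to only half voltage -> 3.125 J

# Discharging to half the voltage already yields 75% of the energy.
fraction_extracted = (full - left) / full
print(fraction_extracted)  # 0.75
```

So running the capacitor down to half its voltage extracts three quarters of the energy; "zero volts" is nowhere near required.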
"Pretty much. Except maybe in freetarded companies.."
Erm... What about "Oracled" companies?
Yes, I too was expecting an article about the SQL language "standard" being updated, not SQL Server. I think it's very bad that "SQL" is becoming synonymous with a single database product (if this is true).
I went to a computer fair at Earls Court (1985?) with my father and brother and remember getting lots of leaflets about the Elan Enterprise and thinking "wow, this has everything and it looks pretty". As a 15-year-old I didn't realise at the time that this wasn't the future but instead the present (which already had one foot in the 8-bit grave). I remember going home to our BBC Micro, not realising how lucky I was to have got one of the best machines of this magnificent era.
Proof that looks are everything (if you're an adolescent)...
"Since when did the iPad or iPhone enjoy a first mover advantage in the tablet computer/smartphone markets?!"
Apple have occasionally set the scene for other manufacturers:
...Retina Display? It was damn cool at the time, admit it.
...iTunes & App Store? Love 'em or hate 'em, they set the landscape for Android and Windows.
...Round Corners? Ermm, there's another one.
BTW I have switched to Android in the last six months (not a fanboi)
Yes, very fond memories of playing this classic game on my Amstrad PC. Other classic games I enjoyed around that time:
.....Gods - A platform classic.
.....Xenon 2 - A vertical-scrolling space shoot-'em-up.
These games were polished; pure quality.
OK, here's a US receipt from 1981:
http://www.retroist.com/2011/01/11/how-much-did-an-atari-2600-cost-in-1981/
So the answer is between about $20 and $30, which would have been (guessing) £15 to £25. A retail price index calculator (http://www.hl.co.uk/news/inflation-calculator) tells me that in today's money that would be £52 to £87.
That's not good value. No wonder only my "rich kid" friends had an Atari 2600. The rest of us had ping-pong selector consoles playing on our Grundigs.
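The conversion above is easy to sanity-check. A quick sketch (the 1981-to-today RPI multiplier is my own rough figure read off the calculator linked above, so treat it as an assumption):

```python
# Quick check of the cartridge-price arithmetic above. The
# 1981 -> today RPI multiplier (~3.47) is my own rough figure
# from the inflation calculator linked above: an assumption.
RPI_MULTIPLIER = 3.47

def todays_money(price_1981_gbp):
    """Scale a 1981 sterling price to today's money."""
    return price_1981_gbp * RPI_MULTIPLIER

low, high = todays_money(15), todays_money(25)
print(low, high)  # roughly 52 and 87
```

That lands in the £52-£87 range quoted, so the calculator and my guessed exchange rate at least agree with each other.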
...But you still have to pre-charge the other battery, which *is* hassle. Saying "I always have a battery pre-charged" kind of hides the extra hassle of managing more than one battery, I think.
Also, I'm guessing that it's actually more than 10 seconds to change a phone battery if you include power-down/power-up/relaunch. More like 30 seconds on a good day.
You don't have to pre-charge a charger.
Why? I know lots of people with Samsungs and they never swap the battery. Why? Because you have to:
1. pre-charge the spare battery,
2. power the phone down,
3. swap the battery,
4. boot it up again and
5. relaunch the app you were using...
...which is far more hassle than carrying a wall charger or car charger. Unless you are going on an expedition to the dark ages, you will always find a power socket.
The SD card slot - maybe. But the battery? After a two-year contract you get a new phone anyway.
Please can we have all future measurements in "Wales". Use of the imperial "bus" measure is non-standard and frankly insulting to anyone under the age of 40. I was a child of the 70s, and by then the "Wales" was already established in geography classes for measuring rainforests. Everyone knows these days that 1 hectare = 5 nanoWales.