The problem is, we all want the functionality of a smartphone. Until there is another game in town, consumers have no real power or choice to influence the situation, except to entirely opt-out of what's essentially modern society.
Corporate desperation or solution to real need?
There are currently 2.5 web browser engines in the world: WebKit; Blink, which is based on WebKit but potentially different/better; and Mozilla's Servo.
I get it, Mozilla needs an income stream. I am happy, for now, that they provide some competition to the One Browser. (Anyone remember MS IE6?)
The VPN thing is a side-hustle. Definitely useful for some; most people who need it will know how to get it either way.
I dream of the day the open source community (or frankly anyone) comes forward with an alternative browser engine. Forget the baggage. Support current web standards. Modular architecture with the right vertical and horizontal abstractions so you can build a browser with only the features you need... you see what I am putting down. Is it possible? Or will Chrome be IE6 all over again?
The reporting on this has been confusing.
Initially it was reported that the pilot could not counteract malfunctioning MCAS.
When I posted about how absurd/negligent that appeared, I was downvoted to hell because apparently MCAS can be disabled by flipping a switch.
Boeing now says the new MCAS will ensure the pilot can always overrule it using the stick alone. So clearly there was room for improvement. But all the same...
Now this report says that the crew "tried to undo [MCAS trim] changes [33 times]." And that they were fully trained on the type. So what went wrong? I agree that in the scenario of flying a complex aircraft like the 737 MAX 8, there is the potential for information overload, and with the lack of positive warning of sensor/MCAS malfunction, it might take some effort to gain full situational awareness concerning aircraft flight systems.
But if the crew positively observed uncommanded trim changes on 33 separate occasions--how many possible causes could there be? How long was the checklist that would have identified possible AoA/MCAS failure, leading to disengaging MCAS? In the event of multiple uncommanded inputs for an unknown reason, would it not be recommended practice in general to disable non flight-critical automated control systems and regain manual control of the aircraft? Would not multiple uncommanded trim changes clue in the crew pretty much immediately? How many other systems are able to input trim changes in that way?
Certainly 7 minutes is not much time to figure all that out. But nonetheless, the whole scenario remains puzzling to me.
Click here to see the New Zealand livestream mass-murder vid! This is the internet Facebook, YouTube, Twitter built!
May I suggest--purely as a Devil's advocate--that free speech is more important than hand-wringing over censoring unpleasant information.
I find it unlikely that censorship (or lack thereof) by Facebook made a material difference to what happened. No matter how thoroughly the video is shut down, it will be reposted somewhere else. In fact, such action may even trigger a fight-back where the video that some can't stomach is reproduced in every possible medium.
I present an alternate view: all speech is good. Meaning, everyone gets to decide what they choose to say, read or see. Anyone who is deeply offended by seeing such incidents can choose not to watch the video.
This eliminates the sense of martyrdom on which perpetrators of many such atrocities rely (which might further popularise their material): "facebook took down my video. What happened to freedom of speech? You can find the video here, here and here. Fight the repressors!"
Re: Trumpian Diplomacy
Let's be real. We already know the CIA probably has implants to intercept internet traffic anywhere in the world. Even among friends, espionage is pretty much expected as part of the game.
China is different. They have never been frenemies with European powers, and they are gearing up to fight if they have to. They also have an impressive track record of stealing secrets from corporate and government organisations via network intrusion. Just as (I imagine) very little Cisco kit goes out of the door without implants, you can guarantee China is doing the same. The difference is, we (the West, the free world, whatever you want to call it) have a pact going on whereby the data is used responsibly by governments that we the people elect.
This is information warfare. Don't think the companies in the Chinese government's pocket are giving away their kit at lower prices for no reason.
Which would you prefer: to live under the thumb of a (very arguably) benevolent dictator with a record on civil liberties that would put the USA's efforts to shame, or...?
This image-recognition neural net can be trained from 1.2 million pictures in the time it takes to make a cup o' tea
Re: Can you get 58% accuracy with a much smaller training set in 90 seconds?
It's the AI winter all over again, just with improved technology. After genetic algorithms were the in thing, we now have neural nets and deep learning. Don't get me wrong, the stuff we are doing now is phenomenal and a step change from past attempts. But as you rightly point out, a computer would remain utterly unable to comprehend meaningfully something as simple as an image classification task if not trained extensively. Training it to handle all aspects of life and become a "general AI"? Not a hope. The magnitude of computing power available has helped remarkably, but bear in mind the human brain does this on 20 watts. We have much to learn. fMRI and other techniques have given us just a fraction more of a glimpse at how our remarkable brain has evolved. I'm sure more practical and fundamental research will advance this area rapidly.
I share some of your commenters' concerns, such as: how can we reason about an AI system? This is being addressed, but, frankly, the problem runs much deeper. We are not (yet) able to effectively articulate how a trained neural network works. So now the big deal is the ability to describe "how" a neural net arrived at a certain result. But before long, words like "how" start to lose their meaning. "How" did I decide to get up and have a sandwich today? "How" did I choose my academic and professional career?
As I said...much to learn.
But, what a time to be alive!
Microsoft reveals terrible trio of bugs that knocked out Azure, Office 362.5 multi-factor auth logins for 14 hours
What's in the fine print?
How many 9's of uptime are you promised as a paying commercial MS customer? What is the recourse if they fail? This is basic due diligence you would do on any vendor. I wonder if MS etc. are perceived as "too big to fail," and the (arguable) convenience of SaaS leads organizations to move to services that simply have no assurance of service level.
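Purely as a back-of-envelope illustration (this is simple arithmetic, not any vendor's actual SLA terms), the "nines" translate to allowed downtime like so:

```python
# Allowed downtime per year implied by common uptime "nines".
# Simple arithmetic only -- not a statement of any vendor's real SLA.
MINUTES_PER_YEAR = 365 * 24 * 60

downtime = {}
for pct in (99.0, 99.9, 99.99, 99.999):
    downtime[pct] = MINUTES_PER_YEAR * (1 - pct / 100)
    print(f"{pct}% uptime allows {downtime[pct]:.1f} minutes of downtime per year")
```

By that arithmetic, 99.9% permits under nine hours of downtime per year, so a single 14-hour outage blows the annual budget for anything beyond two nines.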
There is a real point to QUIC though--my inferences only, haven't read the (draft) specs.
The big one is maintaining connection state while moving between networks. It's great when the PPP session to your mobile provider hands over cleanly from cell-to-cell, but although this can happen as others noted, it can by no means be relied upon (especially if dropping in and out of coverage) and certainly not when moving between wifi networks.
Now, in my mind this might all be best solved by IPv6 and appropriate routing protocols. But we don't have those deployed in a way an average (or even technically-savvy) person can even remotely use. So, pragmatically, onus of fixing this increasingly falls to the application layer.
The dream is that no matter where you are, which networks you might be connected via, or plain inadequacies in mobile infrastructure... your connections should retain their state and continue to work. If you suspend your laptop or shut down your phone radios, then 10 hours later and in another timezone, the connections should remain established and working the instant you pick up your phone again. They should also get differential service, so that e.g. your html, css and js are received fully and in-order at the expense of possible retransmissions, while the RTP traffic as you talk can trade packet loss for reduced latency. Techniques not used before at this layer, such as FEC, could reduce the impact of retransmission.
Not to mention revamping TCP's strategies for handling packet loss to recover much more rapidly in a fast changing network environment.
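The key idea behind surviving network moves can be sketched in a few lines. This is a toy illustration of the connection-ID concept (my own sketch, not the real QUIC protocol or wire format): the server routes on an explicit connection ID carried in each datagram rather than on the sender's (IP, port), so session state survives an address change.

```python
import socket
import secrets

# Toy sketch of QUIC-style connection IDs (illustrative, not real QUIC):
# the server keys sessions on a connection ID in the payload, NOT on the
# client's (IP, port) tuple.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server_addr = server.getsockname()

sessions = {}  # connection ID -> accumulated per-connection state

def serve_one():
    data, addr = server.recvfrom(1024)
    cid, payload = data[:8], data[8:]
    state = sessions.setdefault(cid, [])
    state.append(payload.decode())
    server.sendto(str(len(state)).encode(), addr)  # reply with message count

cid = secrets.token_bytes(8)  # client picks a random connection ID

# Client on its first "network" (socket A = one source address/port)
a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
a.sendto(cid + b"hello", server_addr)
serve_one()
reply1 = a.recvfrom(64)[0]

# Client "roams": socket B = a brand-new source port, same connection ID
b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
b.sendto(cid + b"again", server_addr)
serve_one()
reply2 = b.recvfrom(64)[0]

print(reply1, reply2)  # the count keeps climbing: state survived the move
```

A TCP connection, identified by its 4-tuple, would have been reset by the same move; that is precisely the failure mode QUIC is designed to avoid.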
Everybody wants this. Apple and Google have been hammering at the problem internally and implemented some proprietary solutions.
For UN*X users, the mosh project has implemented the same concepts for some years now. It is a breeze. I can have a dozen ssh sessions on my laptop, suspend it, wake up in another country, unsuspend, and in no more than 1 to 2 seconds, all my ssh sessions continue uninterrupted, with any output produced while I was offline immediately refreshed.
This is the end-game and it is ever more compelling. I will be interested to learn more about QUIC and see how it performs in the real world.
Re: 3 Letters
Having been in this industry, time-to-market pressures and a lack of experienced developers on what sounds like an embedded Linux device make it likely this was a simple mistake. Cisco probably acquired the system or implemented it from scratch without adequate resourcing and review/oversight... it's common for developers to set a trivial root password to simplify development and testing. It's very easy to imagine that being overlooked when it came to release time.
Not that this is any excuse for operating in that way. But Cisco is so oversized at present that the left hand certainly doesn't know what the right hand is doing. I doubt they have any rigorous and effective dedicated IoT security function that applies consistently across their diverse product lines, some developed originally in-house and some acquired.
Believe me, this happens every day, not out of malice (though I don't rule that out) but simply because of organizational inertia.
Is this related to SQLite?
Apple has long made extensive use of SQLite in both Mac OS X (e.g. Time Machine is built on it) and iOS (it is a standard datastore API made available to all apps).
I am curious as to Apple's internal roadmap on this, and whether they see FoundationDB as more capable and will eventually replace SQLite, or if each DBMS will continue to have its own niche within Apple products.
Conspiracy theory central
I struggle to guess whether the majority of responses to this piece are serious.
Apply Occam's Razor. Putin is a KGB man who mourns the fall of the Soviet Union and prides himself on doing everything possible to bring it back. The Cold War per se may be over, but the modus operandi remains the same. If anything, the FSB is perhaps more free to act than the KGB ever was; protocols for these things existed, now it is a free-for-all.
Anyone could throw out conspiracy theories one after another. The Russians have a history of assassinating defectors, or anyone they don't particularly like; I think the body count since the fall of the USSR stands at about 16, these being only those high-profile cases we know about.
Talk about false-flag and other conspiracies all you like, but the simplest answer remains that Putin's Russia is sending a consistent and clear signal to anyone who crosses them: you are not safe anywhere. This is their entire purpose.
I hope the Reg readership is above being taken in by the well-documented Russian disinformation machine. To sow discord and doubt has always been the most effective propaganda tool. Rise above it, please.
I just hope there is a large exodus to Pale Moon, which continues to support XUL, with associated UI customisation etc. With an influx of FF refugees, Pale Moon, which is already well-maintained, could gather a decent number of new (core and extension) developers and become the new FF, in the spirit of what FF used to be.
Onwards and upwards!
I also use Qi all the time (and have even retro-fitted Qi charging to all my devices, including phones and tablets that don't come with it--not with a case, but actually adding the electronics inside the phone/tablet). I can hardly remember the last time I had to plug a micro USB cable in, as I simply have a Qi charger sitting anywhere I am likely to want to put my phone or tablet.
To answer your question, Qi is a bit fraught in some ways and the differently priced chargers on the market definitely reflect varying quality--but sometimes not in the way you'd expect. There is a lot of stuff in the Qi spec about e.g. communicating the actual power needs of the device to the power supply in real time, so it can continually adjust its output accordingly. But if the charger is a cheap rubbish one, it may not support this protocol correctly, and blast out maximum power all the time, causing overheating. Then again, I have a stack of cheap $2 no-name Qi pads and they seem to be as good as or better than some of the very expensive ones with fancy marketing. YMMV.
I don't doubt that Apple has this absolutely nailed with their end-to-end implementation. And Samsung and Google Nexus devices have always had great wireless charging support--they seem to have more flexibility about positioning, and smartly PWM the charging, or stop charging with a message if too far off-centre. Also, Samsung phones show an alignment marker on-screen the moment you move the phone near the charger, making it trivially easy to position exactly right.
The elephant in the room with Qi is that you have to line up the phone to within about 10 mm to achieve optimal inductive coupling without too much loss to heat (and corresponding early death of your phone battery). Also, the phone has to sit almost directly on the charger, so that even a slightly-thick case can make charging unreliable in some cases. In theory max Z-height is 6 mm, but this is not always achieved, depending also on the materials in the gap--air is different from a case with possible metallic parts. Supposedly the resonant charging features present from Qi 1.2 address this (giving 30mm+ Z height/offset), but I have yet to find a Qi 1.2 compliant charger or phone with those specs.
Despite its limitations, you soon get used to wireless charging, and I would never be without it.
I've not observed any adverse effects on magstripes (another feature of Qi that may be supported to better or worse degree in each charger is detection and avoidance of foreign objects), but the thickness of your card and case may be enough to make the Z-height out of spec. Depending a lot on the charger and phone in question.
If you understand the limitations and it still fits your use-case, I highly recommend it. You could always dip your toe in the water by buying a cheap charger and see how it goes.
Re: This was my grandfather's HDD.
Yeah! In data recovery I've popped the lid on a few HDDs over the years, giving the spindle a gentle push to overcome stiction. Certain 2.5" laptop disks appeared particularly prone to this issue back in the day, perhaps due to lower-power/smaller/lower-torque spindle motors, or plain bad mechanical design.
I tend to think the article missed the mark somewhat on where storage is going. NAND Flash is cheap(ish) now but new technologies such as phase-change memory will probably supersede Flash *and* possibly spinning disks for some (most?) applications in the next 10-20 years.
As for tape, it's here to stay for some time. Not much else can store 150+ TB (300+ TB demonstrated recently by IBM and Sony) and stream at insanely high speed on a single small, robust, portable cartridge.
Re: I've been told that SSD isn't good for cold data storage @Matthew
Actually, SCSI has been extremely long-lived, SAS is simply SCSI over a different physical layer. Controllers/adapters to support plain old parallel SCSI going right back to the 80's are easily obtained. I should know something about this, I designed SCSI controller ASICs *cough* years ago.
I can't imagine that anyone sensible would ever have used [E]IDE in the enterprise.
Consumer home users? Well, that's a different kettle of fish entirely, and for the majority of home users who do only light tasks, choice of storage is not something they know or care much about. With the accelerating move to web-based SaaS applications and things like tablets and Google Chromebooks, combined with ubiquitous high-speed network access (even when away from home/office, via HSPA, LTE), local storage for the bulk of consumer and even business end-user devices is rapidly becoming less relevant.
Re: Old school
I am with those pointing out that email has always been fundamentally a text medium. Certainly, email *could* be something else. But do we want that? Surely the purpose of email is to efficiently communicate ideas.
When writing a paper letter, I hand-write or type a plain text missive. Logos, garish colored text and fonts, overuse of bold and italics etc. could only detract from the simple act of communicating an idea in words in the English (or any other) language.
I am in the privileged position of being selective about whom I choose to hear from. If I engage with a client who cannot communicate ideas effectively in plain English, this is not a client I want.
Re: Sir Tim is 62
"Even if the TV is built with tamper resistance? And if the BluRay discs contain stuff like ROM-Marks or the like that can't be read by PC-class players?"
Yes. Obviously not for the average consumer, but for a well-resourced and motivated person/organisation. It requires only one such organisation to crack the DRM and make the content available to everyone else.
"You say ransom, I say purchase.."
There is a fault in your reasoning. Were content not DRM-protected, I could also use a non-DRM-protected player to play back content I have purchased.
In other words, owning a DRM-protected player (as opposed to a non-protected one) need not be a necessary condition for watching content purchased legally in accordance with copyright law.
Re: Sir Tim is 62
Still hackable. Even if the BluRay player is locked down to the max, there will be a weak point in the TV.
This is all a security sham, designed to inconvenience the average consumer, while still allowing dedicated pirates to copy the content and make it available to everyone else. I don't think the film industry understands what game is being played here.
Re: Sir Tim is 62
It's relatively trivial to reverse-engineer a 4K TV, or use numerous other attack vectors, to sidestep any content-protection measure with full quality. The main reason this has not yet happened much is probably that there just isn't much demand for 4K content, and the bitrate makes it not impractical, but inconvenient, to transmit over the internet, for little gain compared with 1080p. As demand picks up and network speeds/data caps go up, I'm sure the "pirates" will step in.
Re: How is this going to help; cost to consumers
"Not necessarily. 4K BluRay players haven't been cracked AFAIK because they demand locked-down dedicated players AND protected hardware paths from end to end. That includes the TV (splitters can be detected and blocked with HDMI 2.0+ IIRC)...."
It's only a matter of a short time before these new technical measures are circumvented. Yes, HDMI has progressively introduced an absurd level of hardware lock-down including measurement of the cable transmission line characteristics to detect tampering. It is still possible to spoof these checks, or simply step around them by, say, a) extracting the video signal from a compliant television at the point where it is stored in the framebuffer or drawn to the LCD panel; or b) copying decrypted video stream or decoded frames from a computer's main memory.
An interesting point is how much all of this lock-down is costing the consumer. Implementing DRM software and hardware requires programmers and signal-processing and analogue chip-design engineers who do not come gratis, plus hardware that costs silicon area, and hence cash (and possibly patent licence fees), per unit; let alone all the overheads of standards committees and the like. Perhaps if purveyors of audiovisual content, and computer and TV manufacturers, scrapped all these overheads and passed the savings on to consumers through lower film prices, consumers might be more likely to pay for the content.
Re: Mixed feelings
Agreed. The supposed need for DRM is predicated on a default assumption of criminal intent among all of the content providers' clientele, and that this has to be mitigated by technical means.
What if we started from the assumption that most people just want an efficient and effective way to buy and watch movies.
Nobody (in recent cultural memory) has successfully argued that books must be chained to the shelves to prevent IP theft, for example.
Re: I don't see a problem.
But if I understand correctly, it is not going to be based on open-source implementations.
The proposed W3C standard provides a framework for audiovisual content providers to deploy their DRM binary blobs to the browser.
There is no provision for code review, ensuring the binary blobs do not contain backdoors, vulnerabilities, performance problems etc., all of which have been demonstrated in abundance in prior DRM implementations.
Since the audiovisual content eventually has to be emitted in unencrypted form so that my eyes and ears can perceive it, it will always be possible for motivated individuals to rip DRM-protected content anyway.
For me, it is not about the money; it is about ease of use, and my freedom to consume media in the format and on the device I prefer. Until content providers make it easier to buy their content than to pirate it, people will pirate it. The music industry realised this some time ago and now DRM is all but nonexistent there. Meanwhile music artists' overall revenues from sales of recordings have increased. My guess is the film industry is at least 5 years behind.
Re: Highly amusing to the cognosenti but utterly baffling to the rest of us.
You're one of the few commentards to "get it," Robot W. Functional programming is not some obscure pastime for nerds, but a powerful way of thinking about programming and the management of complexity in the systems we create. From experience, a solid foundational knowledge of FP makes programmers more effective in any language.
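For anyone wondering what that mindset looks like in practice, here is a tiny illustrative sketch in plain Python (names like `compose` and `normalise` are my own, invented for the example): build a pipeline of pure functions instead of mutating state in a loop.

```python
from functools import reduce

def compose(*fns):
    """Compose single-argument functions, applied left to right."""
    return lambda x: reduce(lambda acc, f: f(acc), fns, x)

# Each step is a pure function; the pipeline is just their composition.
normalise = compose(str.strip, str.lower)

result = [normalise(s) for s in ["  Hello ", "WORLD  "]]
print(result)  # ['hello', 'world']
```

No hidden state, no mutation: each piece can be reasoned about and tested in isolation, which is exactly the complexity-management point.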
The article was still mildly funny though, if only for the excellent TtTE reference.
Re: An alternate perspective
Precisely my point. What is so wrong about a company, Uber or another, creating an app connecting people who need transportation with those who can provide it, enabling people to use their cars more efficiently, reducing congestion, and providing transportation to people who could never have afforded it before?
If you don't like Uber connecting buyers and sellers of transportation, what about eBay connecting buyers and sellers of goods?
I don't see how the concept of an app that connects drivers and passengers to enable carpooling can raise so many ethical issues in some people's eyes.
Clearly, Uber could have executed better in this instance, but actually everyone I know who has used Uber has only good things to say about the service--easier to use app, more prompt, and cheaper than any of the established taxicab operators in the city where I live.
I don't understand what some people believe is fundamentally wrong with this.
An alternate perspective
For better or worse, Uber is doing what it must to survive in a hostile environment.
Whatever the management issues at Uber, they actually provide a great service. Disruptive technology and business models are what we need to move the world forward.
A mobility-impaired friend of mine has had a new lease on life since Uber came to her city. Do you want to take that away?
Bring on the flames!
I don't live in London (Auckland, New Zealand here) but we have the same problem of only a half-hearted attempt to make roads and footpaths friendly to all.
Agree 100%, especially re electric-assisted bikes and (my addition) novel transport modes such as electric-assisted foot-scooters (take a look e.g. at the Israeli brand Inokim). These are the machines that have the potential to revolutionise 'last mile' transit or replace cars entirely for short commutes.
I applaud cyclists (I used to commute 20 km on one) but personally find it too scary sharing dual carriageways with fast-moving trucks and cars that give no room to cyclists.
I've noted that in some countries, it is commonplace for pedestrians, cyclists and riders of novel transport modes to share broad pedestrian/cycleways with no problems. Councils everywhere need to learn from this, and change by-laws and urban design to encourage the use of cycles and novel transport modes rather than designing around the car.
Fortran 90 "weird?" I don't think so.
Fortran is a very common language (particularly in research and supercomputing).
Referring to the original competition rules, the allowed languages list is highly biased towards Algol-derived, object oriented languages. The competition would have been more interesting if other languages were included, e.g.: Common Lisp, Haskell, Scala, Smalltalk, SQL etc.
Killing spreadsheets for fun and profit
Spreadsheets just need to die.
If you think you need one, you should probably instead be using a database that actually stores your data in a well-specified, consistent schema with named tables and attributes (not meaningless column names like A, B, C) and type safety. SQLite is ideally suited to small-scale ad-hoc data storage/analysis tasks.
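To make the point concrete, here is a minimal sketch of the kind of ad-hoc tallying a spreadsheet is often (mis)used for, done with Python's built-in sqlite3 module instead. The table and column names are invented for illustration; the point is named, typed columns and queries in place of cell formulas.

```python
import sqlite3

# An in-memory SQLite database standing in for an ad-hoc spreadsheet.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE expenses ("
    "  item     TEXT NOT NULL,"   # named attributes, not column 'A'
    "  category TEXT NOT NULL,"   # not column 'B'
    "  amount   REAL NOT NULL)"   # declared type, not 'whatever the cell holds'
)
con.executemany("INSERT INTO expenses VALUES (?, ?, ?)", [
    ("train ticket", "travel", 42.50),
    ("hotel",        "travel", 120.00),
    ("sandwich",     "food",     6.80),
])

# A query replaces a pile of SUMIF formulas.
rows = list(con.execute(
    "SELECT category, SUM(amount) FROM expenses "
    "GROUP BY category ORDER BY category"))
for category, total in rows:
    print(category, total)
```

The schema travels with the data, so six months later there is no guessing what column C meant.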
I agree with Linus
If you take a moment to look, the styles he is complaining about *are* plain crappy. If you must use asterisks throughout your comment, at least align them. That's kind of the whole point.
May sound petty, but for professionals, basic code tidiness and consistency should go without saying. If my team were submitting sub-par code like this and expecting me to deal with it all the time, I would soon be on a short fuse too.
God. This is not how I wanted to end my week. I may not have known you Lester, but there is no question you are a fellow I would have liked to meet down at the local, had circumstances somehow made that possible. Through the unique style and wit that characterised your articles, I feel like I knew you Lester--such is the odd relationship between celebrities, writers and their public. We never really believe that the people we know, personally or at a distance, could just not be there any more.
Perhaps I am over-sentimental today because I also recently lost a loved one, so the emotions are nearer the surface and the value of human life dearer to me just now than usual.
I think I speak for all Reg readers when I say each of us has felt this as a body blow.
RIP Lester. Thank you for taking the time to lighten our days through your contributions to the Reg. Sincere condolences to your family and close friends.
The world is less one good man.
Icon because if it was Lester's time to go out, he would have wished to do so in style.
Re: Galaxy S6 runs Marshmallow
Same with my Nexus 7. On my Photon Q I had to install Cyanogenmod. So far the only device I have that can't run Android M is the original LG G1, which truly has done its time and can serve only as a museum-piece now.
Actually, now that I think about it my Samsung Galaxy Note Pro 12.1 (top of the line, expensive tablet) is still stuck on Android 5 with little sign of movement. Now that *is* frustrating. Could CM it, but in this case the Samsung supplied apps (to support the inbuilt Wacom digitizer) are essential.
Re: CM *absolutely* is a way to get a newer version of Android on your phone
Have a virtual beer for being a fellow Photon Q holdout. I've recently upgraded my Photon Q to CM13 and it's never run better! Just a pity we can't swap in a slightly larger RAM chip... the 1 GB of RAM is really all that holds this phone back. Who would have thought the day would come when we need 2 gigs of RAM to make a phone call (all right, I jest a little).
I just wish Motorola would make a new Droid 4/Photon Q, exactly the same but with up-to-date innards. Not likely to happen unfortunately. But have you ever tried ssh'ing into a server on a phone without a physical keyboard? Not a pleasant experience. And my dad will let his Droid 4 (hand me down from me) be pried from his cold dead hands, and all he uses it for is texting. There is a market niche here...
Allow me to present a dissenting opinion.
I've found my Android Wear watch very practical. Maybe the UI could be improved, but the existing Wear UI based on simple swipe actions is a damn good effort and works well. I actually don't see a need for major changes.
Main uses I get from the watch:
- Telling the time. Why do you think wristwatches were invented in the first place and took over the market from pocket watches? Our phones are now our pocket watches. Any time I leave my wristwatch at home nowadays I find myself glancing at my wrist... oh... dig phone out of pocket, turn on screen... yeah that sure is just as easy as glancing at the wrist. No benefit to a wristwatch here at all...
- Receiving messages. Receive an SMS, glance at the wrist and there it is. The majority can be dismissed with a flick of the wrist. If a response is warranted it can be made by voice dictation in a fraction of the time it would take to pull the phone out of the pocket. And the voice dictation is good! Less error-prone than the auto-correcting on-screen phone keyboards my friends seem to use (based on experience of trying to interpret their messages).
- Sending messages. The number of times I've been stuck in traffic and fired off a quick SMS using my watch's voice dictation feature without having to use my hands (illegal here).
- Customisable watchfaces. I can add whatever data strikes my fancy to my watch, which I'd probably never be able to find in a traditional wristwatch. Examples: ISO week number, 12 and 24 hour time, UTC time, sunset and sunrise points indicated on the dial, current weather report... All while looking beautiful like a traditional wristwatch. Do I need to take an umbrella with me today? Just glance at my watch. Sounds silly until you try it. When working in project management, glancing at the watch to get the week number is so handy. Doubtless there is a mechanical watch somewhere out there with ISO week, but does it have all the other features? Likely not. Point is I can make my watch work how I need and want.
- Apps. Stuff I could do on a phone, but it's so much easier on the watch. Like whipping out a calendar when discussing meeting/operational dates. Even checking my bank balances to see if I need to move money around before a big purchase--takes two taps to the screen. Done before I could have even got my phone out of my pocket.
- Novel applications which could never be done with just a phone. Guess what one of the best features of my smart watch is? It acts as a viewfinder for the camera in my phone. This is invaluable when trying to see the cabling behind some networking gear poorly packed into a rack, for example. I was on-site with a colleague just the other day, and this feature blew his mind. Sure I could find some remote camera hardware to serve the purpose, or fiddle around with a mirror on a stick, but this feature is there *in the watch already on my wrist*.
I've only scratched the surface here of all the smartwatch features I use on a regular basis.
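As an aside for the project managers: the ISO week number mentioned above is just the ISO 8601 week, which you can sanity-check against any watchface in one line of Python:

```python
import datetime

# ISO 8601 week numbering, as shown on the watchface complication.
# 2016-01-04 was a Monday and always falls in ISO week 1.
d = datetime.date(2016, 1, 4)
year, week, weekday = d.isocalendar()
print(year, week, weekday)  # 2016 1 1
```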
If you've never used a smart watch then don't be so quick to dismiss its usefulness. If you have and disagree, I respect your opinion. But for me it all comes down to convenience. Sure, I *could* do many of these things with a phone. But nowadays I hardly ever take my phone out of my pocket except to answer calls. Smart watches have reached the point where they are unobtrusive and highly functional. In my opinion they are right in the sweet spot.
As to the people moaning about having to recharge it every night... how many of you wear your conventional wristwatch all night? Or do you take it off and put it on your nightstand? Thought so. Well, you do the same with your smartwatch, and it stays charged. There are zero instances where lack of battery life has been an issue for me with either of my smartwatches.
Disclaimer: I haven't used the Apple watch. My experience is only with the Moto 360 and G Watch R, both of which run Android Wear.
Mainframe and Fortran > LAMP
So what are they going to replace it with? Some unwieldy Java EE platform that nobody really understands? Maybe MySQL and PHP/Python on commodity x86 hardware that throws runtime errors whenever the version of the PHP interpreter is updated? You have to be joking.
I feel far more secure with half the world's nukes controlled by a straight-forward, time-tested IBM mainframe designed from the ground up for reliability, using a language like Fortran that has been around for a long time, is not going away, and does the job it was designed to do well. BTW, Fortran is arguably the most popular language in scientific computing.
Replacing a bespoke COBOL HR program running on zSeries just because it is only 8 years old? Insanity. I bet the system they have works fine and does its intended job. If there is a problem with it, address the problem.
If this practice were applied in any other engineering discipline, we would laugh. What, we're going to tear down this skyscraper because it is 8 years old and there are now new materials that are arguably better? Replace that multi-million dollar bridge because it is a suspension design from the 50s, and these days concrete/steel truss is in vogue?
Haven't looked into the implementation, but I like the concept. There is no phone on the market today that has the combination of features I want. Being able to put together a phone that suits my very particular needs sounds fantastic in theory.
I am worried that in practice this won't really be very modular. After all the SoC and the screen are a large part of a smartphone, and comments seem to imply these are *not* modular. Talk about shooting yourself in the foot.
Re: Do we all need to feel like we're in a sci-fi movie?
An obvious solution is to run structured cabling everywhere in every new house build, and use PoE. No need for fat mains wiring and expensive, large, inefficient power supplies on every little device. Plus it is safe, and the wattage provided is perfect for just about any kind of smart monitoring and control device.
Soyuz v. Ariane
I must say I am also curious about the economics of setting up an entire Soyuz launch infrastructure. Why are they not simply using Ariane launchers--are Soyuz and Ariane suited for different mission profiles?
Or did Russia lease the facility and set up the infrastructure themselves, competing on cost with alternatives?