Re: The next question: are the problems so severe that they can't be fixed?
Oh, you cynic.
See you at the L-Street bar later?
This isn't really about the quality of the engineers or programmers but about how heavily politicised any large project at NASA is. Funding is continually being allocated, reallocated and removed. As a result, companies favour lobbying over product: it only costs a couple of million to get a law passed that requires your company to be involved.
The next question: are the problems so severe that they can't be fixed? Seeing as the failure was not catastrophic, that doesn't look to be the case.
Well, yes, we'll never really know whether Oracle had plans for commercialising OpenOffice. Though as yet, they've not looked like entering the end-user space: setting up the sales and marketing for "Oracle OpenOffice" would have been significantly more expensive than paying for a few developers. And getting a product that could really compete with MS Office would have required several years of development with no revenue stream. But who really knows?
There was even a grant awarded jointly to both projects that the LO team used the GPL to sabotage. It's a pity, and now that OpenOffice is with Apache, it would make sense to align the licences again so that code can flow freely in both directions. Not that I expect this to happen, but it would still be nice.
That statement doesn't really mean very much. OpenOffice was already an open source project under Oracle's ownership. While there was reasonable concern that Oracle would try and turn the project into a commercial cash cow, some people also saw the opportunity to advance their own ideology with the fork.
I am no fan of Oracle, but it turns out some of the predictions of doom were unfounded: MySQL is still around and arguably in a much better state than under MySQL AB's or Sun's tenure, and VirtualBox is still around. As for OpenOffice, they obviously decided there was no market for a commercial version and, as developers cost money, handed it off to the Apache Foundation.
If carrier-grade network address translation is applied to them all, it could permit hundreds of millions of devices to hook up to the internet using IPv4 rather than IPv6
Carrier-grade NAT is in place all over Asia, which is why they know all about it. China and India could each easily use the entire IPv4 address space. Seeing as the growth of devices is driven by mobile devices, which generally spend most of their time on carrier networks, which already have to support IPv6, carriers probably aren't that interested in downgrading to NAT. These addresses will make more sense for legacy systems where 6-to-4 gateways aren't practicable. But I reckon "cloud" providers will be the biggest users.
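As a back-of-the-envelope check on the claim that a single country could use the whole IPv4 space (the population figure below is a rough assumption for illustration, not from the article):

```python
import ipaddress

# Total IPv4 space: 2^32 addresses, before subtracting reserved and
# private ranges, which shrink the usable pool further.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses
print(ipv4_total)  # 4294967296

# Rough population figure for China; an assumption for illustration only.
china = 1_340_000_000
print(round(ipv4_total / china, 1))  # roughly 3.2 addresses per person
```

At only a few addresses per head, one phone plus one or two other connected devices per person exhausts the space, which is why CGNAT or IPv6 is unavoidable there.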
OLED was always due to get cheaper over time as the technology improved: in theory much simpler than LCD. In theory.
But, while it was initially restricted to a couple of manufacturers, it's now much more widely available, though still relatively expensive, which is why production is still devoted to phones and not TV screens. And the high resolution screens are still expensive.
Both of those are just shiny baubles to me with the latter being a battery sucker
I never use NFC myself, but it is generally passive, otherwise how would wireless card payments happen? But then that's what I use for paying for stuff. Why should I get my phone out to make a payment when a far less fragile card will do?
Wireless charging is very convenient and becoming more available. But, particularly with USB-C, it's not as if plugging in a cable is beyond people.
Failures can be catastrophic and the consequences devastating, and some images cannot be unseen, doubly so with the internet, so this is standard practice. You can always release any unseen footage later, but more important for investigations will be the telemetry and how forthcoming the company is with the data.
Certainly, using the constellation for broadband provision would make sense (particularly for those UK households unable to get decent speeds). It is, after all, what it was designed to do.
Given how tiny the UK is, you could easily provide the same service with a few of Google's dirigibles. Satellites are designed for covering large areas of the earth with their signals.
Oh, hang on. Is that the 19:21 Gravy Train? I have to catch that one! If you find it, mine's the one with "A Short History of Barnard Castle" in the pocket, but you can keep that.
Well done, you! While it's easy to laugh at e-bikes as somehow a soft option (they are, but so what?), they do get people out in the fresh air and, as you have to pedal, there is always going to be some gentle cardiovascular exercise. And, of course, they make light work of any shopping.
I remember when I first saw them and realised that it meant the grandparents could enjoy a day out with the grandchildren. What's not to like about that? Also met a 70-year-old out in the hills who was still able to cycle around 100 km with the assistance.
There are still problems with them, largely related to separate motor/gearing systems, which mean that many people leave them in the highest gear with the highest degree of support, so they don't half accelerate at junctions, which is also where most accidents with them occur. I suspect we'll start seeing combined control systems so that people don't have to think about the gears either.
Another interesting effect they're having, by driving up the average price of bikes, is that bikes are gaining status, which means people are more willing to spend money on keeping them maintained: so they go in for regular inspections rather than being ridden until they start falling apart. Though I also doubt that you can get a really usable bike for less than £100.
Dear Def,
billions of clicks from your less enlightened brethren beg to differ…
Actually, targeting ads generally appeals to the same sort of companies and users of services like Groupon: they have a shitty product that they're desperate to get rid of.
Context is everything, so just wait for targeted product placement coming to a streaming service near you, with products being placed in real time on breakfast tables, cars, etc. of the shows you love. I wish I was making this up but AFAIK it's already been patented.
True. This is one of the reasons I didn't go with Withings.
But there are exceptions: my digital scales from Soehnle have an online storage option, but the default is storage on the device with the option to sync to my phone. I think the same is true for their other devices, but I only really want to know how fat I am…
Why, one might ask, does the same reasoning not apply to the entire JavaScript engine? Should Mozilla just migrate to V8?
Well, regexes really count as DSLs (domain specific languages), which is why the two browsers have been using the same approach for years. The change presumably includes some kind of binding so that the V8 engine can be called directly and doesn't have to be ported.
The same does not necessarily apply to the JS runtime. But I do think we will start to move towards consolidation in other areas such as HTML, CSS and JS parsing, areas where Mozilla's use of Rust might have advantages.
I think the main reason cyclists have so few serious accidents in the Netherlands is that bikes are always taken into consideration when planning roads, thus minimising competing traffic streams. Add to this the fact that most Dutch people cycle, and that liability nearly always lies with the most powerful vehicle.
Only according to Nathan Barley.
These toys do not:
Oh, and helmets are pretty fucking useless for all the knee and hand injuries that people suffer with them.
Depends on your definition of "on time", considering it was originally due in April. And it only got approval of its handling of data privacy from the CCC after massive initial criticism. The €20m will certainly have helped pay for a few of the strategy boutique meetings.
More importantly, if the Entsendegesetz had been applied to the meat processing industry years ago, thousands of people in meat processing plants wouldn't have been put unnecessarily at risk or the good burghers of Gütersloh forced to stay at home. Still waiting for the technological solution to treating employees like shit that doesn't involve replacing them by robots.
I'm sure Apple knew that benchmarks would be released so I wonder why they decided to bother banning developers from providing them; apart from the usual paranoia that is. Presumably, because it gives them plausible deniability over benchmarks for code that is almost certainly not optimised, while at the same time knowing that the leaked scores will keep people talking about the speed of the new chips.
Will we see improvements over time as the compiler and emulator get better? Will we see a baseline that allows Apple to gather wows™ when new hardware is released?
Interestingly, Geekbench notes that the chip running on the Transition Kits has four cores…
Not really, there's no point in shipping big.LITTLE silicon for what is a non-mobile developer workstation, and Apple will presumably be aiming to ship beefier kit in its own hardware later this year, otherwise it would probably be selling it already.
They certainly tend to stick around once they're in house. Got a similar problem with a client currently migrating from MS Office 2010 to Office 2019 and sticking with 32-bit because of the fecking add-ons, which are probably no longer even maintained.
Good that there are now other options with things like PyXLL.
Not really, a table at best. But at least Excel provides some kind of data typing, which is why it's become so ubiquitous. I've recently seen this confirmed in a project which relies on data passing through several hands before it can finally get into the database. Using Excel as the file format has led to far fewer errors than I would ever have imagined.
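One way to see why even Excel's modest typing helps: round-trip the same row through CSV and every field, numbers, dates and IDs alike, comes back as a plain string that each pair of hands must re-parse. A minimal stdlib sketch, not tied to the project described above:

```python
import csv
import io

# Round-trip one row through CSV: all type information is lost, so every
# consumer downstream must re-parse dates, decimals and IDs (and can
# silently mis-parse them, e.g. dropping leading zeros from "00042").
buf = io.StringIO()
csv.writer(buf).writerow(["00042", "2020-07-01", "3.5"])
buf.seek(0)
row = next(csv.reader(buf))
print([type(v).__name__ for v in row])  # ['str', 'str', 'str']
```

An .xlsx cell, by contrast, carries a number/date/text type along with the value, which is one less place for each step in the chain to introduce errors.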
which is now in the 2013 version rather than the 2007 version. This may cause problems for Word 2010 users, to whom the advice is to "upgrade to LibreOffice".
Unlikely: the transitional spec was really developed for MS Office 2007 and was incomplete, so changes were made even after it shipped. Microsoft itself deprecated Office 2007 a while back, and it's not uncommon to come across OOXML documents that Office 2007 cannot process completely. But I've not seen such problems with Office 2010, which saw far greater adoption, largely on the promise of stable document formats.
Funny how people from Google never comment on how it always seems to be an iPhone that the F.B.I., or whoever, has difficulty decrypting. It never seems to be an Android phone. Funny that.
Nice bit of whataboutery, which isn't true. iPhones are encrypted by default; with Android you normally have to enable it manually. But that doesn't mean Android isn't secure and, just like Apple, Google cannot provide the keys to decrypt an encrypted Android device.
Apple's decision is more about scope: how much functionality can a browser provide safely? For several years there have been many people championing the browser as an OS. In order to do so this requires replicating OS services, which works well in some situations: notifications, hardware acceleration and even location services can make a lot of sense on (mobile) devices. But in other situations it essentially means breaking open the browser sandbox.
We got it once a year from my grandma in Plymouth, back when it took 8 hours to drive there… A few years ago I found my local American-British store was stocking it here in Germany. And why not? It's damn fine stuff!
Oi! El Reg, where's the scone with jam and cream icon?
It was probably chosen for two reasons: tax and regulation. Wirecard AG didn't fall under the remit of BaFin (the German FCA) but under the very parochial local government of Upper Bavaria. Mind you, it does look like BaFin also fucked up, but it has a history of this on big money projects. German savers seem to be magically drawn to "too good to be true" schemes in other countries.
Yes, I've seen that behaviour in some companies as well. It used to be the argument against open source, until it turned out that commercial software was just as shit but you wouldn't have a chance of finding out until it was exploited.
Personally, I don't agree with either the current practice (all problems can be solved by an update) or the disclaimers in many licences. But the point is that I can't think of any court cases. Unlike, say, those that have upheld the GPL. No doubt there will be some case at some point, but IIRC in the US there are some wide-ranging exemptions. Otherwise Microsoft would probably have been bankrupted multiple times in class action suits over ActiveX, which wasn't just an oversight but a design goal waiting to be exploited.
I don't think the law is on your side. Currently, software developers are not subject to strict liability, which is why we live in a world of updates. Then there are the licences: most open source licences explicitly exclude liability, and I've not seen the clauses invalidated by any court yet. It's not as if commercial software is immune to such stuff either: both flaws in their own code and in the liberal use of open source libraries.
While it is possible to check for known vulnerabilities in libraries, there's basically no way around extensive pen testing for modern web-based applications. But getting customers to pay for these is another matter. As is paying for updates of the software stack as new vulns get discovered.
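Checking libraries against known vulnerabilities is the easy, automatable half of that: in essence it is just a lookup of pinned versions against an advisory database. A minimal sketch, with package names and advisories entirely invented for illustration (real tools query a live database such as OSV instead):

```python
# Hypothetical advisory data: package -> versions with known vulns.
# These names and version numbers are made up for illustration only.
ADVISORIES = {
    "examplepkg": {"1.0.2"},
    "otherlib": {"2.1.0", "2.1.1"},
}

def audit(pinned):
    """Return (package, version) pairs that match a known advisory.

    pinned: dict mapping package name -> pinned version string.
    """
    return [(pkg, ver) for pkg, ver in pinned.items()
            if ver in ADVISORIES.get(pkg, set())]

print(audit({"examplepkg": "1.0.2", "thirdlib": "0.9"}))
# [('examplepkg', '1.0.2')]
```

Pen testing, by contrast, has to exercise the running application and can't be reduced to a lookup like this, which is where the real cost sits.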
For the next few years there will be x86 Macs sold, including new Intel Macs that haven't been announced yet.
Apple said they expect the transition to take about 2 years, ie. 2 years from now they will not be selling x86 machines. x86 will continue to be supported by the OS for "years" (my guess would be 5-6 in line with current practice of deprecating hardware chez Apple) but it may soon become "maintenance" mode if they can sell enough of the new ones.
We won't really know until the new devices appear towards the end of the year and we can see whether Apple silicon does have better TDP and memory performance than x86. For example, same battery life / performance as now but in machines < 1 kg. That would be serious bragging rights. But let's see what they come up with.
I've been running Windows in VMs on MacOS for over 10 years and never found it to be slow. As long as the VMs can use the hardware hypervisor there's no reason why it should be.
There will be a hit for Windows 10 x86 on MacOS ARM because of the emulation. Apple is clearly saying to Parallels and VMWare: if you want performance, it's up to you to provide it, though, again, the hypervisor might help here for CPU stuff. Bigger problems will come with anything wanting to use x86 hardware acceleration (MMX, etc.) because Apple might not make the optimisations it has done for this available to other software, as is already the case on MacOS for codecs. But Apple also has a vested interest in Windows on MacOS not being completely unviable. Guess we'll soon see reports from people running QEMU on the new developer boxes.