* Posts by Tom7

281 publicly visible posts • joined 3 Aug 2010


Green recycling goals? Pending EU directive could hammer used mobile market

Tom7

To some degree, though for phone charging you can still use a lower-powered PD device, it'll just charge slower. That's the point of PD - there's a negotiated agreement between charger and device over how much power will be delivered so they can work with different-rated chargers.

Tom7

Apple

I kind of see Apple's point - eventually this will prevent the adoption of the next improved connector.

But to hear it coming from Apple is ludicrous. Apple, who until a couple of years ago were clinging onto Lightning, with its order-of-magnitude-lower power delivery and two-orders-of-magnitude-lower data rates. If you wanted fast charging for your Apple phone, you had to buy a USB-C to Lightning converter ffs.

'Error' causes Alexa to endorse Kamala Harris, refuse to discuss Trump

Tom7

That might be the simpler explanation, but according to some reports doing the rounds today it gets a bit hard to sustain. Apparently, asking for reasons not to vote for Harris produced the "no politics here, guv" answer, while asking for reasons not to vote for Trump produced copious responses.

NASA pushes back missions to the ISS to buy time for Starliner analysis

Tom7

Options

What other options are there though? Let's face it, Boeing might be confident of Starliner's ability to undock and return, but if everyone was similarly confident then it would be undocking and returning, not sitting at the ISS nearly eight weeks into its eight-day mission.

Presumably, the reason that Crew-9 is being delayed is that there are no docking slots left available at the ISS because Starliner has been there longer than anticipated. But the ISS has six docking bays, with four currently taken up by cargo / resupply vessels. Why not return one of them earlier than planned and leave the Crew-9 schedule where it was?

SpaceX's Falcon anomaly could have serious implications for the space industry

Tom7

Surely it would be better to track this down to a design fault or a quality control issue, either of which could be rectified?

Better than if they just shrugged their shoulders and said, "Oh well, it was an isolated anomaly, let's try another one..."

The UK reveals it's spending millions on quantum navigation

Tom7

I'm curious on performance and cost here. I've worked with a gyro-based INS in the past that could achieve on the order of a few km of error over 24 hours of operation. It was - how can I put this - pretty seriously not cheap. Though the system was, admittedly, decades old by the time I worked with it.

This is the kind of thing that attracts pretty well-policed export controls (it turns out to be quite useful if you're building a cruise missile, for instance) so I'm guessing those details aren't going to be public for a while.

Malicious SSH backdoor sneaks into xz, Linux world's data compression library

Tom7

Re: What about the culprit

It appears to be one of the maintainers of xz who committed the backdoor and has since gone in to bat pretty hard, claiming it's an unrelated GCC bug while working around the symptoms caused by the backdoor.

Tom7

Re: systemd was responsible for injecting the vulnerability into the SSH daemon

There was a potential opportunity for signing to help here - although the payload that got built into the library was in git, the M4 macro that deployed it into the build was not and was added to the release tarball presumably to circumvent review and delay detection.

So there are three ways this could have been avoided: use the code from git with a release tag, or use the tarballs generated from git on the fly by GitHub, or compare checksums between the tarball and git.
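
The third option, comparing the tarball against git, could be automated along these lines (a hypothetical sketch; the function names are mine, not from any real release tooling):

```python
import hashlib
import pathlib

def tree_digests(root):
    """Map each file's relative path to its SHA-256 digest."""
    digests = {}
    root = pathlib.Path(root)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digests[str(path.relative_to(root))] = hashlib.sha256(
                path.read_bytes()
            ).hexdigest()
    return digests

def tarball_discrepancies(tarball_dir, git_dir):
    """Files present only in the release tarball, or whose contents differ
    from the git checkout -- exactly where the malicious M4 macro hid."""
    tar, git = tree_digests(tarball_dir), tree_digests(git_dir)
    extra = set(tar) - set(git)
    changed = {p for p in set(tar) & set(git) if tar[p] != git[p]}
    return extra, changed
```

Run against the xz release, a check like this would have flagged the injected macro as a file with no counterpart in the repository.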

Tom7

Re: a symptom of) the brain virus

Technically true, but the backdoor was specifically aimed at sshd - it only triggers if `argv[0]` is `/usr/bin/sshd`, for a start. The purpose appears to be to short-circuit certificate verification if the certificate fingerprint matches a known value.

So the backdoor is invoked on anything that uses liblzma (and I think that's actually a lot more than the packages that declare a dependency on it) but it's not quite accurate to say they are all backdoored - they don't all open a back door into your system.
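
In caricature, the gating described above looks something like this (pure illustration: the real payload is obfuscated native code hooked in via the dynamic linker, and the names and fingerprint value here are invented):

```python
# Caricature of the backdoor's gating. The path check matches the one
# quoted above; the fingerprint value is a placeholder, not the real one.
ATTACKER_KEY_FINGERPRINT = "deadbeefcafe"  # invented placeholder

def payload_armed(argv0):
    """The payload only activates inside the sshd process."""
    return argv0 == "/usr/bin/sshd"

def check_certificate(fingerprint, genuine_check):
    """Short-circuit verification when the attacker's key appears;
    behave normally for everyone else - which is what delayed detection."""
    if fingerprint == ATTACKER_KEY_FINGERPRINT:
        return True
    return genuine_check(fingerprint)
```

Every liblzma user loads the code, but only sshd, presented with the attacker's key, ever takes the malicious path.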

Linux kernel 4.14 gets a life extension, thanks to OpenELA

Tom7

BSPs

There are plenty of Board Support Packages (BSPs) for embedded hardware out there that have 4.14 as their standard kernel; I use one from Qualcomm for a product that's still being manufactured. It's a puzzle why OpenELA are interested in it, though.

Your PC can probably run inferencing just fine – so it's already an AI PC

Tom7

Re: So I can run a local chatbot

My system has 32GB RAM and an RTX 4070 with 8GB of VRAM. It can run all of the freely-licensed chatbots from GPT4All - but maybe that's not saying so much. There is a definite difference in quality of responses between those models and the ones you get online from Google or OpenAI. It can run Stable Diffusion's image generation models ... just ... if there's not too much else running and you use the version that's been optimised to need less memory.

As to why you'd want it, they are genuinely useful. I'm using them for language learning at the moment. They're perfectly capable of setting you exercises, correcting them, explaining what you got wrong, chatting informally, assessing formal writing, introducing new grammar and so on and so on, all stuff you'd ordinarily pay a language teacher real money to do for you. Does it get the odd thing wrong? Probably. But it gives you about 98% of what a paid language tutor gives you, is free and is available 24 hours a day, for a few minutes or a few hours whenever you want it.

Forgetting the history of Unix is coding us into a corner

Tom7

Re: Not *everything* is a file

It's slightly complicated by the fact that the Wayland protocol doesn't specify the IPC mechanism it uses, the compositor and the client have to agree on it. Weston, the reference implementation, uses UNIX domain sockets but there's no particular reason that a different IPC mechanism couldn't be used.
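
The usual transport is easy to demonstrate: a Wayland client typically connects to a socket such as `$XDG_RUNTIME_DIR/wayland-0`, but a plain socketpair shows the same primitive (the message strings below are stand-ins, not real Wayland wire-protocol messages):

```python
import socket

# The Wayland wire protocol is message-passing over a local byte stream;
# Weston uses UNIX domain sockets as that stream. A socketpair stands in
# for the compositor/client connection here.
compositor, client = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)

client.sendall(b"get_registry")        # stand-in for a client request
request = compositor.recv(64)

compositor.sendall(b"global: wl_shm")  # stand-in for a compositor event
event = client.recv(64)

client.close()
compositor.close()
```

Swap the socketpair for shared memory rings or any other local IPC and, as long as both ends agree, the protocol on top doesn't care.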

When red flags are just office decoration: Edinburgh Uni's Oracle IT disaster

Tom7

Re: ...And Then There Is Birmingham To Consider......

Hell, at least Edinburgh got a system delivered.

A certain supplier in the USA, who we shall call Seer and who have a big, bright blue logo, have been in court repeatedly recently because of a new gimmick they've devised. It goes something like this:

A large public body comes to Seer with a proposal to roll out a new ERP system. "Fabulous!" say Seer. "It'll take three years and cost you $300 million."

"Okay," says the public body, "what's the payment plan for that look like? 30% upfront, the remainder on delivery of key milestones?"

"Ah," says Seer, "Actually we have a new policy on that. All projects have to have payment up-front in full."

"What???" says the public body. "Even if we wanted to do that, there's no way we have that sort of money in this year's budget."

"Well," say Seer, "What about we _lend_ you the money so you can pay us up front and then pay it back in nice, easy monthly installments?"

The public body shakes its head but says, "Well, if that's how you want to do it, okay..."

As soon as the contract is signed, Seer sell the loan on to Seer Financial Services, a company that _sounds_ closely related but when you go digging into the paperwork turns out to be very, very carefully separated from Seer. SFS starts collecting the easy monthly payments.

Two years later, it becomes readily apparent that Seer have done essentially nothing on the project and the original three-year project is still a solid four years away from delivery. The public body starts withholding payment to try to get action, only to get sued by SFS because project delivery is nothing to do with them, they're just providing financial services. If the public body has a problem with the project delivery, they need to take that up with Seer.

Thankfully, courts have started ruling that this is a scam.

Tom7

So easy

It's so easy for large organisations to see anyone warning of problems as nay-sayers who just don't like change.

It's uncertain where personal technology is heading, but judging from CES, it smells

Tom7

So? Who did Ida say is the greatest automaker on earth?

CEO Satya Nadella thinks Microsoft hung up on Windows Phone too soon

Tom7

Re: Microsoft Lumia 650

No nagging to update - or, in other words, vulnerable to absolutely everything.

Let's take a look at those US Supreme Court decisions and how they will affect tech

Tom7

Re: What about signs

The whole article is a serious beat-up. The equal protection clause applies to governments, not businesses. Even universities are free to use affirmative action in their admissions, so long as they don't receive government funding for those places.

For an article about "how [those decisions] will affect tech" there is not a lot here about how those decisions will affect tech; just a bunch of left-wing bashing of the decisions and some scaremongering "maybe this will have big implications for tech companies." What implications?

Whose line is it anyway, GitHub? Innovation, not litigation, should answer

Tom7

Summarising open source as "the creator reasonably wishes people to be able to read it and put them to use" is a vast simplification. Open source licenses have conditions, and those conditions are full of minefields for AI ingesting source code. Does use of the code require attribution? How exactly does an AI coding tool comply with that? How does an AI ingesting source code even know that the repo it is ingesting from belongs to the original author and is correctly following the license terms? If it's been forked within GitHub then it's reasonably straightforward, but there are many, many examples of software being copied into GitHub from other places by other people. It's not worth it for the author to go around telling them not to do it, but that doesn't mean the author is okay with it or that they're okay with AI then ingesting it.

Open source AI makes modern PCs relevant, and subscriptions seem shabby

Tom7

Re: TIME

I got all interested after this article and went and figured out how to install Stable Diffusion locally. About half an hour later, I had the GPU version installed, which immediately died because I have an AMD GPU and it will only work with NVIDIA. Okay.

So I went for the CPU version. I've just run my first generation; ten and a quarter minutes to generate a 512x512 image that isn't really what I asked for.

The difference between SD and the more commercial offerings is still large. They just work, they produce reasonable results and they're fast enough to just dabble with and adjust your prompt if the output isn't quite what you were after. SD takes quite a bit of nous to know how to get it to work at all, the results are a bit disappointing and it's slow enough that you'll have got bored by the time your first image is delivered.

Fed up with Python setup and packaging? Try a shot of Rye

Tom7

Re: No mention of pip and venv?

I can't figure out what was wrong with this. I blew it all away and started again and it all just worked. It had somehow got it into its head that it needed quite an old version of pylint in that one project and nothing I did seemed able to convince it otherwise.

Tom7

Re: No mention of pip and venv?

As with a number of other systems, this only solves half the problem. It helps you manage a collection of other people's packages, not to package your own.

Tom7

So I'm curious - how do you install Python packages without pip? Or do you insist that all the Python you write use nothing but the standard library?

Tom7

Re: No mention of pip and venv?

They work if:

* You want to use the version of Python that is installed by default on your system

* You don't want to package your work

As soon as you want to use a different version of Python for a project, or you want to publish your work on PyPI or distribute it in wheel form, there is a whole world of pain that venv and pip don't help you with.

That said, rye seems to still have some pretty rough edges. After an hour of faffing around, I still can't get it to install pylint into a venv...

Croquet for Unity: Live, network-transparent 3D gaming... but it's so much more

Tom7

Re: It's not FLOSS :(

That's not how they describe it in the documentation. The reflectors also serve to serialise the event stream, ensuring that each client sees the event stream in the same order with the same timestamps and so ends up with the same simulation. At any rate, you can't use Croquet without using the reflectors and you pay for their use (subject to a free usage allowance).
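
That serialisation role is simple to sketch as a toy (my model, not Croquet's actual protocol; names and timestamps are invented):

```python
import itertools

class Reflector:
    """Toy model of the reflector role described above: a single relay
    stamps every event with a sequence number and timestamp, so all
    clients replay an identical stream and therefore compute an
    identical simulation."""

    def __init__(self):
        self._seq = itertools.count()
        self._clock = itertools.count(start=1000)  # fake monotonic timestamps
        self._inboxes = []

    def join(self):
        inbox = []
        self._inboxes.append(inbox)
        return inbox

    def publish(self, event):
        # One authoritative ordering, fanned out to every client.
        stamped = (next(self._seq), next(self._clock), event)
        for inbox in self._inboxes:
            inbox.append(stamped)

reflector = Reflector()
client_a, client_b = reflector.join(), reflector.join()
reflector.publish("avatar moved")   # could originate from either client
reflector.publish("door opened")
assert client_a == client_b         # same order, same timestamps
```

Because every client receives the same stamped stream, deterministic replay gives each one the same world state without ever shipping the state itself - which is also why you can't opt out of the reflectors.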

Tom7

Re: It's not FLOSS :(

It's also not really clear what your comments about it being "serverless" mean, since it's dependent on a network of "reflectors" that are operated by Croquet and are closed-source.

Tom7

Re: It's not FLOSS :(

Yep. And "Contact Us" if you're interested in running your own reflector.

I get it. They need a revenue model. But this is a lot less exciting now.

Enter Tinker: Asus pulls out RISC-V board it hopes trumps Raspberry PI

Tom7

Re: Yikes.

I hadn't thought of that but that's really neat. An EtherCAT driver for this would be awesome; one interface for networking, the other for control. It's curious, in a way, that it's running Linux and not FreeRTOS, which would be a more useful OS for robotics etc I think (although Yocto seems to have decent support for PREEMPT_RT, it's always still a pain to actually make something do what you want it to in RT).

The Great Graph Database Debate: Relational can't do everything

Tom7

The crux of our disagreement is simply with the claim that some future "well-architected" relational database engine could render the use of today's useful, existing, in-production graph databases unnecessary

Not a great way to start: mis-quoting (apparently deliberately) the other side. No-one has so far mentioned some future "well-architected relational database engine" but rather well-architected databases (ie schemas) within existing relational database engines.

This, and a big pile of thinly-veiled neo4j sales-speak, appears to be about the level this contribution to the debate is operating on.

How to get the latest Linux kernel on your Ubuntu box

Tom7

Why do instructions of this sort always include `sudo apt update` when `add-apt-repository` has, for some years now, done this automatically?

Massive energy storage system goes online in UK

Tom7

Re: Decommissioning?

Solar is a really dumb idea in the UK and makes the problem worse; all the generation is at the time of minimum demand.

Nuclear is not a bad idea but extremely expensive as it is currently implemented. Much more expensive than wind.

Hydro is a good idea but we've run out of suitable geography to build more.

Geothermal is a good idea but it's not obvious that there is a large, economic resource in the UK.

I really shouldn't have to explain why an interconnector from Moroccan solar output is not a solution here.

So wind is pretty much what we're stuck with. It is far from impossible for there to be very little wind across the whole of Europe - there have been several electricity price shocks in the past few years that have been linked to exactly that.

Yet another thing you've not taken into account is that "powering 300,000 homes" means meeting the current electricity demand of 300,000 homes. That's about 1/3 of the current energy demand, with the other 2/3 roughly split between transport and gas use. If we're going to transition to a future with no fossil fuels, that triples the electricity demand.

Tom7

Re: Decommissioning?

Not to mention that to make wind energy actually reliable in the UK, you'd need to build somewhere around 9,000 of these - that would allow you to power 30 million homes for about a week.

Someone has to say it: Voice assistants are not doing it for big tech

Tom7

"the dream of a cross-platform voice-assisted future"

I think one of the issues is that none of these assistants is actually really cross-platform.

If you own an Echo, then you probably have Alexa on your Echo, either Google Assistant or Siri on your phone and Cortana on your laptop/desktop. Amazon and Google both appear to have abandoned the desktop space. Google and Apple dominate the phone space; there is an Alexa app but it's so inconvenient to reach that it's pointless. Google has a stand-alone device of some sort but Amazon seem to dominate this space.

In a way, the disparity helps the metaphor. People expect computers to be like people and it's just weird when you have six different devices all called "Alexa" that appear in different places and you interact with in different ways.

Twitter engineer calls out Elon Musk for technical BS in unusual career move

Tom7

Sooooo....

The number of RPC calls isn't the problem but... "we spend a lot of time waiting for network responses." Sounds a lot like someone playing word games with what is and isn't an RPC to win a point.

How GitHub Copilot could steer Microsoft into a copyright storm

Tom7

Re: I am not a lawyer

You haven't actually read the terms of service, have you?

If you had, you would have spotted this in the definitions section:

The "Service" refers to the applications, software, products, and services provided by GitHub, including any Beta Previews

Licenses mean what they say they mean, not what you'd like them to mean.

Tom7

Re: I am not a lawyer

Do you have an alternative interpretation? Those words seem pretty clear to me. Anyone who posts code on GitHub licenses it to GitHub for the purpose of providing any service that GitHub provides - including Copilot.

That doesn't extend to the people who use Copilot of course - they're just SOL. But Microsoft is covered for their use in training Copilot.

Tom7

Re: No Solidarity with A.I.'s run for profit!

Check the definitions though - "the Service" is defined as any service or application provided by GitHub, not just the GitHub service itself.

Tom7

Re: Liability has already been defined

What's the relevance of that? No-one denies that everything posted on GitHub remains under copyright; the point is that the authors have, by using GitHub, granted Microsoft a license to use that code.

Tom7

Re: I am not a lawyer

The author of said code agreed to the GitHub terms of service, which includes a license for Microsoft to use your code for essentially any purpose "as necessary to provide the Service" (quote from the ToS). Here 'The “Service” refers to the applications, software, products, and services provided by GitHub, including any Beta Previews.'

Tom7

Re: No Solidarity with A.I.'s run for profit!

PAAAAAhahaha. You posted it on GitHub! You realise that involved choosing a license, right? Specifically this in the GitHub terms of service:

4. License Grant to Us

We need the legal right to do things like host Your Content, publish it, and share it. You grant us and our legal successors the right to store, archive, parse, and display Your Content, and make incidental copies, as necessary to provide the Service, including improving the Service over time. This license includes the right to do things like copy it to our database and make backups; show it to you and other users; parse it into a search index or otherwise analyze it on our servers; share it with other users; and perform it, in case Your Content is something like music or video.

Tom7

Re: Liability has already been defined

Given that it's trained on GitHub public repositories, this language in the ToS looks relevant:

You grant us and our legal successors the right to store, archive, parse, and display Your Content, and make incidental copies, as necessary to provide the Service, including improving the Service over time. This license includes the right to do things like copy it to our database and make backups; show it to you and other users; parse it into a search index or otherwise analyze it on our servers; share it with other users

If Copilot is part of the GitHub service, then anyone who posted their code on GitHub has implicitly licensed their code for this purpose.

Note that, in this agreement, the "Service" is defined as "the applications, software, products, and services provided by GitHub, including any Beta Previews."

Tom7

This is very much the issue and it's not nearly as clear-cut as your example makes out.

Most people learn by looking at what others have done, then build something themselves based on what they've learned. Whether that is a copyright violation or not depends on just how close it is to what they've seen from others.

One way of looking at Copilot is that it's a tool to make that process of looking at other people's work and using what you learn a lot more efficient. But it lacks any sort of "hang on, that's too similar to what we've seen elsewhere" filter and also hides the source of the material from the human who is using the tool, so they have no reasonable basis to assess whether the code it's just produced is a copyright violation or not.

Copilot, as an AI, is not a legal person who can be sued for the copyright violation and, naturally, the Copilot terms of use make the end user completely responsible for assessing whether the output is a copyright violation or not.

The only sane course from here is to avoid Copilot like the plague.

OpenAI, Microsoft, GitHub hit with lawsuit over Copilot

Tom7

Yes, absolutely it is. However, the GitHub terms of service include the grant of a license to GitHub "to store, archive, parse, and display Your Content, and make incidental copies, as necessary to provide the applications, software, products, and services provided by GitHub, including any Beta Previews" (note this is a synthetic quote, generated by substituting definitions from the "Definitions" section of the ToS for the terms so defined). I expect Microsoft / GitHub will simply rely on this part of the terms of service; by posting your code, you gave them a license to use it pretty much however they like in the course of their business. The fact that Copilot wasn't conceived at the time the code was posted is irrelevant, as this section of the ToS also includes "including improving the Service [the applications, software, products and services provided by GitHub] over time."

What will be very interesting is how the court treats people who posted someone else's copyleft-licensed code. Such a person has every right to make copies of the code, make derivative works, post it all on the internet etc etc; what they don't have a right to do is to grant a non-copyleft license to GitHub, which they implicitly purport to do when they post it on the site.

It's worth noting that GitHub has separate terms of service for corporate customers. Those terms have a similar license grant, but crucially define "the Service" much more narrowly, as "GitHub's hosted service and any applicable documentation" instead of "the applications, software, products and services provided by GitHub, including any Beta Previews."

It's official: UK telcos legally obligated to remove Huawei kit

Tom7

It's official

UK publications obliged to stop using the stupid Americanism "obligated".

Open source databases: What are they and why do they matter?

Tom7

Free is, well, free

It seems odd to talk about how FOSS databases are dominant in startup culture without mentioning that an Oracle database license costs five figures per CPU. If you're using Postgres or similar and you become capacity constrained and want to expand, you shell out an extra $10 per month to AWS or whoever and spend a few hours configuring it all. If you run Oracle and want to do the same, you call your local salesman and tell him you're bent over, ready and waiting. In a time when software scalability is everything, what sort of startup wants to expose themselves to the risk that they'll be successful and need to buy more Oracle licenses at whatever the hell the going rate is then? It's not like you'll have a choice; shell out or your service will fall over.

The crime against humanity that is the modern OS desktop, and how to kill it

Tom7

Re: It does suck

I agree that Windows 7 was the peak of Windows usability. 10 was sort of okay and sort of not. I haven't used Windows regularly since 7.

IMO the current Ubuntu / GNOME desktop gets it right. I'm keyboard-centric so the 'super key + start typing' thing works really well for me; it's the Windows 7 scheme without the folder-structure to fall back on.

The only drawback I can see is that it doesn't work very well for touch. The Android-like page after page of unsorted app icons is not exactly usability plus.

I think the author has missed one of the key reasons that OS makers keep on messing with desktops - they're still searching around for a desktop metaphor that feels equally natural when you're sat at a screen, keyboard and mouse as it does on a tablet or phone. Moan as much as you like that tablet UIs have no place on the desktop, but personally I have a laptop that folds around into a tablet and turns into a touch screen. Which UI metaphor should it use?

This tiny Intel Xeon-toting PC board can take your Raspberry Pi any day

Tom7

It has 16 digital IOs. The specs make no mention of any peripheral controllers on them. It's hard to see that as more useful than the Pi's 27 GPIOs, most of which can be configured to some alternative function (I2C, SPI, UARTs, PWM).

Or, for that matter, an ESP32's extremely impressive list of peripherals and very good RTOS.

Tom7

No mention of any GPIO, ADC, touch or PWM peripherals accessible on the board. Also no mention of WiFi. It's hard to see this as an RPi competitor. It's just a SFF PC. Impressively SFF, maybe, but it doesn't offer any of the things that make the RPi distinctive.

Modeling software spins up plans for floating wind turbines

Tom7

Re: Oil rig technology?

Yes, for a given value of "solved". An offshore oil rig is pumping thousands of barrels of oil per day, some of them hundreds of thousands. A barrel of oil is equivalent to around 1.7MWh of energy, so an offshore platform is producing anything up to around half a million MWh per day. A 10MW turbine, operating at a 30% capacity factor, produces about 72MWh per day. Not all oil platforms are that big, but neither are all wind turbines. A turbine support structure has to cost about 15% of what an oil platform's support structure does to make the economics comparable. That's before you consider that a turbine also needs a cable capable of carrying its full rated output back to shore, while your average oil platform stores it all internally until a ship comes along and takes it away.
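
For concreteness, here is that arithmetic with the assumptions spelled out (the platform's daily output is my round figure for a large installation, not a quoted statistic):

```python
# Back-of-envelope comparison of offshore oil and wind energy output.
BARREL_TO_MWH = 1.7                 # approx. energy content of a barrel of oil

platform_barrels_per_day = 300_000  # assumed figure for a large platform
platform_mwh_per_day = platform_barrels_per_day * BARREL_TO_MWH  # ~510,000 MWh

turbine_rating_mw = 10
capacity_factor = 0.30
turbine_mwh_per_day = turbine_rating_mw * 24 * capacity_factor   # 72 MWh

# One big platform delivers the energy of several thousand such turbines:
turbines_per_platform = platform_mwh_per_day / turbine_mwh_per_day
assert round(turbines_per_platform) == 7083
```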

In the medium term, I think wind turbines will have chemical plants built into them that produce synthetic fuels. There is a pilot (onshore) plant in Iceland producing 1.4 million litres of methanol per year from geothermal energy; there's no particular reason that the same could not be built into a turbine tower. Then, again, the stuff could be stored until a ship comes and collects it. Similar chemistry is available to produce ethylene and ammonia, major energy-intensive feedstocks for industrial processes.

Tom7

Re: Oil rig technology?

It's a crap idea. The life of a wind turbine is already largely limited by the life of the blades under constant flexing from wind loads. So someone's invented a turbine that requires much more flexing of the blades to control it. Slow clap.

The challenge with deep-offshore wind is not the bit above the water but the bit below it. People have been prototyping floating conventional windmills for well over a decade. If this new turbine was a good idea, it would be a good idea on land as well as offshore. It isn't.

The thing about deep offshore is that you can't just let it bob around aimlessly. You still need a grid connection to each turbine that's capable of carrying several MW (or whatever the rated output of the turbine is - some are up to 10MW these days). So you've still got to lay a cable on the ocean floor, and that upsets environmental types because no doubt there is some fragile sea grass somewhere on the path between the turbine and the shore. You then also need a way to anchor the turbine to that location, in a way where it's not going to break loose, snap its grid connection cable, smash up any other turbines in its path and become a hazard to navigation in rough weather. This all makes it terribly expensive to install. There are enough shallow-water locations where turbines could be installed but haven't been to make deep-water offshore wind a solution to a problem we don't have yet.

Why the end of Optane is bad news for all IT

Tom7

Re: Insane

In a way, I think Optane was a good idea poorly timed.

Ten years ago we all had spinning disks in our laptops; it was transformative to replace the spinning disk with an SSD five years or so ago. Workloads had been disk-bound for decades while everything else on the system got orders of magnitude faster; suddenly, storage caught up several orders of magnitude. For most people, most of the time, their systems are now fast enough for their needs. Most people now look at their laptop and see how much slicker it is than five or seven years ago; the idea that storage could improve by another order of magnitude just doesn't hold that much attraction. If we'd had another ten years to get used to SSDs, we might be feeling the limits a bit more and faster storage would be more attractive.

To interact a bit with the author's ideas, they write this as though we could have jumped straight back to a 1960s paradigm because Optane appeared. Never mind that back then software amounted to hundreds of bytes and running a programme was expected to take hours or days; the idea of having more than one programme running at once simply didn't make sense to people then. Attacking the filesystem as an abstraction for managing storage is all very well, but unless your software is going to go back to being a single process of a few hundred bytes, you have to have *some* sort of abstraction for managing it. No-one really seems to have done any work towards figuring out what that abstraction could be.

Saying you just install an application into primary memory and run it from there, where it maintains its state forever, is all very well; how does that work if you want to run two copies of the same piece of software? If your answer is to separate data from code and have multiple copies of the data, how do you tell your computer to run a new one or pick up an old one? There is a new category of thing that is persistent process memory; how do you identify and refer to that thing? How does that model even work for something like a compiler, where you feed it a file and it produces another file as output? Is persistent state even useful there? If not, how does the abstraction work?
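
The closest thing we have to "persistent process memory" today is a memory-mapped file: state lives in a mapped region and survives process exit without the program ever explicitly serialising it. A minimal sketch (the file path and 8-byte layout are my invention):

```python
import mmap
import os
import struct

def bump_persistent_counter(path):
    """Increment a 64-bit counter held in a memory-mapped region.
    The value survives process restarts, loosely mimicking what
    persistent main memory would give every data structure for free -
    but note we still need a *name* (a file path) to find it again,
    which is exactly the abstraction question raised above."""
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.truncate(8)  # reserve room for one little-endian uint64
    with open(path, "r+b") as f, mmap.mmap(f.fileno(), 8) as mem:
        (counter,) = struct.unpack_from("<Q", mem, 0)
        struct.pack_into("<Q", mem, 0, counter + 1)
        return counter + 1
```

Even in this toy, the filesystem sneaks back in as the naming scheme - which rather suggests that "abolish the filesystem" needs a replacement answer to "how do I refer to my persistent state" before it can go anywhere.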
