Posts by Michael Wojcik
10664 publicly visible posts • joined 21 Dec 2007
Minnesota governor OKs broad right-to-repair tech law

Well, they got the initial law passed, and by exempting agricultural equipment they avoided a huge lobbying blitz by Deere. Now a few years later as R-to-R gets established in other states and civilization doesn't collapse from it, they'll have ammunition to push back when they amend the law to cover agricultural equipment as well. Or Deere and the other tractor OEMs will see this is a Prisoner's Dilemma and one or more will defect and open their stuff, for the PR and sales benefits; they must see that the writing is on the wall.
Twitter Spaces groans under weight of Ron DeSantis and Elon Musk's egos
Intel mulls cutting ties to 16 and 32-bit support

Re: It's about time.
X86 has come closer than anything else in history
Has it? Last I looked, there were still a lot of Z80 cores inside little dedicated-purpose embedded chips, for things like kitchen-appliance controls. That era may have passed now too, but it was still the case circa 2010, which would give the 8080 architecture a good 35+ years. Wikipedia says there's still a Z80 variant in TI calculators as of 2015. True, x86 is only 5 years younger, so it won't take x86 long to surpass 8080 after the last version of the latter ships, but for the moment I believe it's still telling its 16/32/64-bit younger sibling to get off its lawn.
(Of course you could pretend that 8080 is part of the x86 family, if you really wanted to loosen your definitions. And you could pretend the ROMP is part of the POWER family, which makes it more or less contemporary with x86. In any case, 360 beats them all handily, with a solid decade on 8080.)

IA-64 didn't fail because of marketing. It was a lousy CPU which was dreadful to use, particularly if you were trying to write a compiler backend for it, or (god help you) debug something in assembly.
Even the things that seemed like a good idea in theory – hey, a trap representation for integer registers! – were terrible in practice because they weren't handled well by the OSes that ran on IA-64. Which was mostly HP-UX.
Google wants to target you – yes, YOU – with AI-generated ads

Algorithms selling to algorithms
We're likely not far away from the point where many consumers let algorithms do a lot of their discretionary shopping for them. Already we have plenty of popular "sell me stuff without making me choose" services, like those meal-preparation ones or Bespoke Post. I know people who have gotten random-clothing-of-the-month subscriptions and the like.
It's a small leap to just turning a budget over to a purchasing app that "learns your preferences". Or even not budgeting it, if the app is reasonably successful at matching what the victim consumer would have spent on unnecessary crap per unit time anyway. I wouldn't do it, but I'm a contrarian curmudgeon and I hate all joyful things such as spending money. You know there's no shortage of people who would. Every day is Christmas!
Then those machine-generated ads will gradually shift to targeting purchasing algorithms, rather than human purchasers, and we'll have an arms race between sell-bots and buy-bots. Which, again, should not be surprising; that's what we've seen happen in financial trading, for example, and in IT security. We can just take humans right out of the loop, and simply spend a lot of resources making and shipping crap around the world to be briefly admired and then tossed in a corner.
O brave new world (same as the old world).

Re: Bring it
Exactly. The expected return from advertising to a consumer something they already frequently buy is minimal, unless you think you can tempt them away from a competitor. Well, there's the theory about "brand prominence", which is what supposedly justifies the excessive advertising of things like Coca-Cola and F-150s here in the US; but I'm dubious about its value, frankly.
Ford in reverse gear over AM radio removal after Congress threatens action
SF cops got warrant-free OK to watch protest via private security cameras

Yes, it's just possible there's some imbalance of power potentially affecting the camera owner's decision to permit or refuse access.
There is a reason for the principle of requiring warrants for searches. That reason does not magically go away when the police promise to be nice and a craven city government rubber-stamps their requests.
IBM asks UChicago, UTokyo for help building a 100K qubit quantum supercomputer

Well, this bit is nonsense
From the article: a 100,000 qubit "quantum-centric supercomputer" allegedly capable of solving the world's most intractable problems
Did IBM actually claim that? Because it's prima facie bullshit.
There are only a handful of areas in which we know of algorithms with a quantum advantage – i.e. for problems in complexity class BQP that are believed to lie outside its classical counterpart BPP (or in some related classes; see the Complexity Zoo). They most certainly do not cover most of "the world's most intractable problems".

Re: Updated headline
Err ... we have quantum computing today. We might have quantum advantage today, e.g. in Google's quantum-circuit experiments.
If you had written "useful quantum computing to be 10 years away..." you might have been right, though there's a decent chance that one or another of the systems under development will eventually scale up to a number of error-corrected qubits that is actually useful for physical-simulation tasks.
That Meta GDPR fine is €1.2B. Plus biz must stop sending EU data to US

Re: Honest question
Shareholders have no control, but they can sell shares, depressing the stock price, reducing equity and raising the cost of borrowing. A significant shareholder revolt could hurt Meta, in principle.
In practice, there won't be one. Institutional investors mostly move slowly, and Meta have the capital to pacify them with buybacks (or dividends, in theory, but dividends are passé). Unmanaged funds, such as index funds, that hold Meta shares buy and sell according to their rules, not sentiment. Big activist investors generally retain their holdings in the hope of having influence.
Google settles location tracking lawsuit for only $39.9M
One of the world's most prominent blockchain apps looks like being binned

Re: Blockchain not at fault
There are certainly many good applications for Merkle graphs in general, such as some types of filesystems, git,1 some database systems, and so on. And there are applications which are of theoretical interest even if they're not practical, such as hash-based signature schemes.
But blockchain is just a dumbed-down, degenerate Merkle graph, and I've yet to see a good use case for it. The one where it supposedly provides something useful – as an append-only ledger in a Byzantine environment – fails all over the place in practice (e.g. partitioning attacks) while consuming egregious amounts of resources.
1Well, git is a popular application of a Merkle tree. I'll leave the question of whether it's a good one to the side.
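To make the contrast concrete, here's a minimal Python sketch – purely illustrative, not any real system's on-disk or on-wire format – of a Merkle tree root versus the degenerate linear-chain case: with the tree you can verify any single leaf by hashing O(log n) siblings, while the chain only lets you verify a block by re-hashing everything before it.
```python
import hashlib

def h(data: bytes) -> bytes:
    """SHA-256, the workhorse of most Merkle constructions."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold a list of leaves up to a single root, pairwise."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def chain_head(blocks: list[bytes]) -> bytes:
    """Degenerate case: a linear hash chain, i.e. the blockchain shape."""
    head = b"\x00" * 32
    for block in blocks:
        head = h(head + block)
    return head

data = [b"alpha", b"beta", b"gamma", b"delta"]
print(merkle_root(data).hex())  # any leaf verifiable with ~log2(n) sibling hashes
print(chain_head(data).hex())   # verifying block k means re-hashing the whole prefix
```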
Microsoft and Helion's fusion deal has an alternative energy
China becomes the 37th country to approve Microsoft's Activision buyout

Re: "Regulator outreach"
I don't have any personal dog in this fight either – I've never bought any Blizzard or Activision games, and don't have any Microsoft gaming stuff – but it still seems like a lousy move for consumers. We really do not need more entertainment-industry consolidation in the US.
Russian businesses want to party like it's 1959 with 6-day workweek

Re: What do you expect?
It's not just to glorify Putin. Russia's ability to project naval power quickly anywhere south of Siberia depends heavily on Sevastopol, the only good deep-water port Russia currently has that isn't in the far north. And Crimea's only natural connection to the mainland – indeed its only road connection aside from the Kerch Strait Bridge (which Ukraine has already disabled once, though Russia recently managed to reopen it) – runs through Eastern Ukraine. So there's a real military objective behind the invasion. That doesn't justify it, morally or even strategically, of course; but there is a reason.
Now that Finland has joined NATO, imperiling Russia's Baltic and nearby Arctic naval resources, the strategic situation for Russia is even more dire.
Parent discovers the cost of ignoring Roblox: £2,500 and heart palpitations

Re: the next AI peril
So long as they don't get into eBay Platinum Reserve and buy Airwolf and Keith Moon's head.

Re: Two and a half grand on a game
Ah, remember the days when stories of a romantic relationship (or attempt thereat) which started with meeting online were actually rare enough to be interesting? Before Eternal September (to say nothing of today's swamp of eHarmony and match.com and Tinder and god knows what else) made it all so mundane?
That sort of thing probably died for good when The Shop Around the Corner was remade as You've Got Mail.1 Meeting someone on IRC or Usenet or a BBS had a certain nerd cachet; meeting them by actual handwritten letters is sentimental; meeting them by AOL email is just sad. By website or app doesn't bear thinking about.2
1Went to Wikipedia to confirm the titles and discovered TSAtC was itself a film adaptation of László's Parfumerie. Credit where it's due.
2Though I used to enjoy the television commercials for chemistry.com. "I was blocked from eHarmony! What's wrong with me? Find me on chemistry.com!" Yes, yes, that is an appealing prospect. Donovan's Not a Match is also mildly amusing in this regard.

Re: Dystopian
Indeed. I've often remarked that there is not a single person in the world who understands all the features of the software products I work on. That is almost certainly true for any significant contemporary OS or other large software product. Not only is it impossible for end users to know all the quirks and features,1 regardless of technical expertise; it's impossible even for those who work on the thing.
1Apologies to Doug Demuro.

Re: Dystopian
Often there's some click-through agreement stating that the purchaser is an adult, or has permission from one. That could be a tough disclaimer of liability for the plaintiff class to climb over: it implicates the purchasers in fraud (even if, as children, they're presumed to have reduced capacity), and there's a paucity of practical alternatives for verifying age in online purchases.
And every big organization that sells stuff online would be amicussing up a storm in such a case. Not many law firms are going to want to go up against Amazon.

Re: Dystopian
From the article it sounded as though the parent didn't even bother checking her bank statement until the bill was enormous.
Does it? The article says it was a series of small purchases, but unless I've missed something, not over what period of time. Looks like it would have been around a hundred purchases, which could be made quite quickly.

Re: Dystopian
Roblox gets money for nothing – they're selling in-game content, which is a non-rival intangible asset. They lose nothing but transaction fees if a purchase is reversed; there's no good they've lost the ability to sell to someone else.
So they have no motivation whatsoever to limit purchases. Occasional "whale" purchases either will stand, in which case they've made a heap of profit in exchange for essentially nothing; or be reversed, in which case they suffer only a negligible loss (plus they get the PR benefit of bragging about how they'll reverse unwanted charges). Preventing the latter would prevent the former.
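A back-of-the-envelope Python sketch of that incentive; the reversal probability and fee rate are invented purely for illustration (only the £2,500 figure comes from the article):
```python
# Toy expected-value calculation: why a seller of non-rival digital goods
# has little reason to block "whale" purchases.
purchase_total = 2500.00     # the disputed spend from the article, in GBP
reversal_probability = 0.5   # hypothetical: chance the charges get reversed
transaction_fee_rate = 0.03  # hypothetical payment-processing fee

gain_if_kept = purchase_total                       # marginal cost of the goods ~ 0
loss_if_reversed = purchase_total * transaction_fee_rate

expected_value = ((1 - reversal_probability) * gain_if_kept
                  - reversal_probability * loss_if_reversed)
print(f"Expected value of allowing the purchases: £{expected_value:.2f}")
# Positive for any plausible reversal probability, so why limit it?
```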
First ever 64-bit version of Windows rediscovered … and a C compiler for it too

Re: Given its dominance on the desktop and corporate server world
SQL Server is ... OK.
I understand some people like VSCode; I have no reason to try it, m'self.
As managed-code environments go, I like some aspects of CLR and the .NET Framework.
All of Office is horrible. Venomous Studio is horrible, as is MSVC when used outside VS (though marginally less horrible that way). Teams is excruciating. Windows keeps getting worse.
Phones' facial recog tech 'fooled' by low-res 2D photo

Re: Biometrics!
law enforcement probably quite like face unlocking
In the US, courts have generally held that law enforcement can compel you to unlock a device with a biometric, but not to give up a passcode. That's a huge difference, and sufficient reason not to use biometrics at all, IMO.
Personally, I never use biometrics, for all of these reasons. They're terrible, terrible authenticators, and there are no benefits for any of my use cases. I don't use my phone for payments, either; that's a completely unnecessary expansion of the attack surface with minimal or no benefit.
SCOTUS rules Google and Twitter didn't contribute to terrorist attacks

Well, what it does is close one avenue to further restrictions on free expression. To say that it "reaffirmed" anything is overrating the importance of the decision, I think. Siding with the plaintiffs would have been a disaster; tossing the case, which is what SCOTUS did here, basically just kicks the can down the road. It's a good outcome but not a great one.
UK's GDPR replacement could wipe out oversight of live facial recognition

Re: What's the problem?
Or leave your phone at home, which again is tough to do with your face. Or "forget" your phone on a train. Or put it in a Faraday cage.
OK, you can put your head in a Faraday cage, but 1) that's not terribly effective, and 2) it'll look kind of stupid.
(On the other hand, "Tin Foil Facemask" is a good band name.)

Let's hope it's legal. During COVID mask mandates in New York, Kevin Underhill noted that wearing masks in public is actually illegal in the state of New York. That's under the "loitering" statute, which is one of those catch-all laws beloved of the police when they're inclined to put the boot in.

Re: "the fight against crime and terrorism"
Drugging is unnecessary. It's quite feasible to control a majority of the population with ordinary rhetorical devices, and the infrastructure for deploying those is orders of magnitude better in pretty much every respect than at any other time in history.
It's not Nineteen Eighty-Four. It's Brave New World, where most people subscribe to their own oppression. Enjoy Your Symptom, as Žižek put it.

Re: Suppose it's time for me...
Various attempts along these lines show up on Hackaday periodically. It's a good place to search / keep an eye on if you're interested in the idea, since it's an interesting area for hackers and the bar to entry is low – there are open-source facial-recognition systems to test against, besides the ones that come with certain popular operating systems.
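As a sketch of what such a test rig can look like, here's a minimal Python example against the open-source face_recognition library; the filenames are hypothetical placeholders for an enrolled reference photo and a probe photo taken with whatever countermeasure is being tried.
```python
# pip install face_recognition  (wraps dlib's face-embedding model)
import face_recognition

# Hypothetical filenames: an enrolled reference photo and a probe photo
# taken while wearing the countermeasure under test.
enrolled = face_recognition.load_image_file("enrolled.jpg")
probe = face_recognition.load_image_file("probe_with_countermeasure.jpg")

enrolled_enc = face_recognition.face_encodings(enrolled)
probe_enc = face_recognition.face_encodings(probe)

if not enrolled_enc:
    raise SystemExit("No face found in the enrolled image")
if not probe_enc:
    print("Countermeasure worked at the detection stage: no face found at all")
else:
    # A distance below ~0.6 is the library's default "same person" threshold.
    distance = face_recognition.face_distance([enrolled_enc[0]], probe_enc[0])[0]
    verdict = "match" if distance < 0.6 else "no match"
    print(f"Embedding distance: {distance:.3f} ({verdict} at the default threshold)")
```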
Sci-fi author 'writes' 97 AI-generated tales in nine months

Re: Nope, this is nonsense.
I broadly agree (per my post above about the human-cognition multi-agent model; it might be possible to make a transformer model big enough to simulate all those processes, but it would have to be much bigger).
That said: a recent paper explored a method for making transformer compute cost linear, rather than quadratic, in the length of the token stream. If that works out, it would be possible (albeit expensive) to have a context window of, say, 50K-100K words, which would let the LLM operate over the entirety of a novel draft and maintain logical consistency. I doubt the output would be particularly interesting – there would almost certainly be the occasional surprising gradient, but the parameter space of an LLM trained on a large, diverse corpus necessarily tends toward minimizing information. But the size of the context window alone does not appear to be an inherent limitation of the LLM architecture for something novel-sized.
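For the curious, a rough numpy sketch of the general idea – this is the generic kernel-feature-map trick behind "linear attention" variants, not necessarily the specific paper's method: rewrite softmax(QKᵀ)V, which is quadratic in sequence length, as φ(Q)(φ(K)ᵀV), which is linear.
```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1024, 64  # sequence length, head dimension
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))

def standard_attention(Q, K, V):
    """O(n^2) in sequence length: materialises the full n x n score matrix."""
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0) + 1e-6):
    """O(n) in sequence length: apply a feature map, then reassociate the product."""
    Qf, Kf = phi(Q), phi(K)
    kv = Kf.T @ V                # d x d summary, independent of n
    norm = Qf @ Kf.sum(axis=0)   # per-position normaliser
    return (Qf @ kv) / norm[:, None]

out_quadratic = standard_attention(Q, K, V)  # exact softmax attention
out_linear = linear_attention(Q, K, V)       # a different, linearisable kernel, not exact softmax
```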

Re: Self publish hell
To be fair, even before ebook self-publishing pulled down the capital costs of vanity publication, we were already producing books at such a furious rate that only a small portion could get widespread attention. And there was certainly no correlation between popularity and quality, by any metric (other than the tautological one of popularity). I can name a dozen contemporary young-adult fantasy novels off the top of my head that are vastly superior to the Harry Potter series in every way (logical and intricate plotting, well-conceived and -realized characters, imagination of concepts and rigorous exploration of them, prose style...).
Few readers have time for more than, what, perhaps a hundred novels a year if they're eager? You could easily fill that with the winners and runners-up for various awards, say, without ever touching on the ones that didn't get a look in.
We can regret the talented authors and terrific novels we'll never hear about, much less read. We can celebrate that we have this abundance of great literature available to us, so if we allow ourselves to be even a little ecumenical in our tastes we need never run out of wonderful books to read.1 Or both.
Will a large increase in the number of machine-generated novels make it harder to find the good ones? Perhaps. Personally, I keep finding that recommendations from reviewers and friends and the local bookshop and even, astonishingly, Amazon point me to new (to me) authors that I enjoy tremendously.
1And that's if you're not already an incorrigible re-reader, like Jo Walton is (or I am, for that matter).

It can be argued that under that sort of surplus economy many/most humans would find productive tasks to undertake for their own sake. That's basically the milieu of Doctorow's Down and Out in the Magic Kingdom, for example, where people form "adhocracies" to work on projects that inspire them. Some would even argue that there's historical precedent in some pre-modern societies; you can find examples in the wide array of sociopolitical systems Graeber & Wengrow document in The Dawn of Everything.
Personally, I find my life most rewarding when it includes a substantial amount of productive labor – I like to get things done – and I like to have some of the requirements for that labor come from external sources, as puzzles or problems that I'd be unlikely to conceive of myself, and so that there are external stakeholders to motivate me. So I wouldn't say that what I want is to maximize "free leisure time". But I would enjoy being able to work on interesting projects in an environment of abundance.

Re: Did he correctly credit the sources
This would be a more persuasive argument if you could define "spark of creativity".
Per my previous posts, I think the current crop of LLMs, and the simplistic unidirectional deep-layered transformer architecture in general, are highly overrated. I don't think they do a good job at simulating the processes humans use to produce what we call "creative" text. But I also don't think that handwaving gestures toward undefined mystical essences clarify the problem in any way.
(What might be a better argument? I'd start with the Multi-Agent Model hypothesis; point to the heterogeneity of those agents, particularly the abundance of human senses (including e.g. the "passage of time" sense) and qualia; emphasize the importance of unconscious and somatic (see e.g. the research of the Damasios and their team) influences; consider a framework such as Pickering's "mangle of practice" for analyzing how inputs have unexpected effects on human cognition; suggest a high-level arbitration process – think "concept rectification" – based on attention economy; and note the important role of distraction. It's also useful to have a more-sophisticated model of how human languages work in practice – I like Davidson/Rorty – and how the writing process works.)

Well, rate of production for fiction prose varies wildly, even among authors who can be considered "good" by some metric. Anthony Trollope famously wrote in his "spare time" (alongside his day job at the Post Office), according to a strict schedule he devised, and cranked out his novels – which are widely considered at least good entertainment, and in some quarters serious literature – quite quickly. Kazuo Ishiguro has an anecdote about spending one morning adding a comma to one of his manuscripts, and the afternoon removing it.
But, yes, in general there's nothing particularly unusual about being able to generate a few thousand words of fiction prose1 in a day or so, particularly when the author is in the groove and fleshing out a structure that's already well-developed.
1Or substantial non-fiction, for that matter. When I was in graduate school it wasn't unusual for me to produce 5000-word essays in a day. That represented quite a few hours of research, sketching out arguments, outlining, etc over the previous several weeks; but the actual writing I typically did in a few passes within a 24-hour period.

Re: each hour’s earnings is about $2.95.
if you 'do' it in your spare time
... you're still paying opportunity costs, and the cost of your own labor, and any capital that has to be invested.
If you're doing it for intangible reasons – because it's a hobby, or an experiment, or whatever – that's fine. But if you're doing it for the income, then the "every little bit helps" argument only applies to the extent that the return on investment is large relative to alternatives.
If you're constrained in what types of work you can do, then that decreases the opportunity cost and so increases the rate of return. If you're constrained in what you want to do, that increases the return (because there's an intangible psychological benefit). But merely saying "every little bit helps" is sophomoric; it's not a rational calculation.
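A toy Python sketch of the calculation being gestured at here; every number in it is invented for illustration.
```python
def hourly_return(revenue, hours, out_of_pocket_costs, capital_invested,
                  capital_rate=0.05):
    """Net return per hour of labour, after expenses and a charge on capital.

    capital_rate is an assumed annual opportunity cost on the capital tied up.
    """
    net = revenue - out_of_pocket_costs - capital_invested * capital_rate
    return net / hours

# Hypothetical side-project numbers, purely for illustration.
side_gig = hourly_return(revenue=1200, hours=400,
                         out_of_pocket_costs=150, capital_invested=500)
alternative_wage = 15.00  # what the same hours could plausibly earn elsewhere

print(f"Side project: ${side_gig:.2f}/hour vs. alternative ${alternative_wage:.2f}/hour")
print("Worth it financially" if side_gig > alternative_wage
      else "Only worth it for the intangibles")
```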