Re: Matthew and Son
The cake is a lie.
1537 posts • joined 10 Jun 2009
"Cheap last gen tech is awful therefore this cutting edge future tech will be awful too."
Yes, it's highly likely they're overegging the pudding here, but you can't know for sure it will be in any way comparable to what we've seen so far.
When they first started, they expressly didn't want people doing the version number thing, and they deliberately did not build the infrastructure to facilitate purchasing of upgrades. In fact, I'm pretty sure they mandated no version numbers in app titles to begin with. What a lot of publishers ended up doing instead was using the "bundles" feature and creating a bundle of the new piece of software and the obsolete version so that you'd get a discount based on it subtracting the money you'd already spent on the old one from the cost of the new one. At the start, they did this by branding the new version as a new product rather than a version-number upgrade, and no, there was nothing Apple could do to stop them, so slowly they started letting people do the version number thing instead.
Upgrade by in-app purchase is antisocial, as you're using your customers' storage space on things they can't use. I bought the Pinnacle Lite+Pro bundle (the same price as Pinnacle Pro on its own -- one of those upgrade workarounds) precisely because I was short of space on my iPad and couldn't afford the space for the full app with its filtering etc included, so only installed it when I specifically wanted to do the fancier stuff and just kept the Lite version on for occasional casual use. (Of course, it turned out the workflow was utterly awful, so I stopped using it entirely.)
" consumers could choose when deciding to buy an Android device or an iPhone "
Ah, that hoary old chestnut. The thing is, Apple went out of their way to push app developers to adopt a free-updates-for-life model, which doesn't benefit devs in any way whatsoever... Who benefits...? Well, Apple would say the consumer, but in the end it's Apple, because all of us with iOS devices are then actively discouraged from ever switching to Android.
I was given an iPad as a gift, bought some video and audio apps, and Pythonista, and then when I went to (finally) buy a smartphone, I bought the one I already owned the apps I wanted on... i.e. an iPhone.
Switching to Android would be an expensive proposition, as suddenly I've got to reinvest in all of that.
They're not feigning ignorance -- they're deliberately challenging it as sharp practice and suggesting that it is an unfair contract, and they wouldn't have taken that step if they didn't think they had a very strong case. The key word here is "tying" -- they're making the point that Apple's business practices are exactly the same as those of car manufacturers who actively prevent 3rd-party servicing of vehicles, which is now banned in most major jurisdictions.
Apple created the situation for themselves, though.
They started out with a goal to remove versions and upgrades, and create a buy-once-for-life ecosystem. Prior to this, the productivity software market relied on upgrade cycles for steady income to fund ongoing development.
They encouraged low pricing with a claim that what you lost out in individual sales would be made up for in volume, and you got apps providing 90% of the functionality of desktop apps for a tenth of the price... with free upgrades for life instead of a £50 or more every year or two.
They set up this gold rush race to the bottom, which led to a massive marketplace of "ad supported" apps when they must have known that the market wouldn't support it, and devs the world over wasted innumerable person-years making software that no-one ever got properly paid for.
The only people who ever benefited from this setup were Apple. They made lots of money from cheap app sales. They made lots of money from advertising. Because they were the ones who had scale -- while everyone else got a minuscule slice of the pie, they got 30%. And the reason I have an iPhone is not because I wanted an iPhone, but because I was given an iPad as a gift, and I bought apps for many things (mostly sound, video and programming) that I liked, and I didn't want to have to buy them again on Android, or hunt around for suitable substitutes if they weren't available. So I give money to Apple to avoid having to give money to the people who made the apps I use.
It's a messed up market, and it was never sustainable. That's why apps have moved more and more towards subscription models -- it's the only revenue stream that stays (mostly) open to them.
Yes, it's also happening in the desktop space, but as yet on a smaller scale. The death of physical media, coupled with growing confidence that Windows 10 will offer the long-term stability Microsoft promised at its launch, is probably making desktop app devs worried that most users will treat a licence-for-life as exactly that, rather than buying upgrades to get round backwards-compatibility bugs in their OS.
So yes, the market was going that way, but Apple accelerated it, and they created a rod for their own back.
" Thus all this government-sponsored life support for languages is quite likely a waste of time. "
That is a perfectly logical argument, because we all know Wikipedia is run by the Scottish government.
You see, once you join in on a discussion about a news article and say something about government/subsidies/taxpayers, you inadvertently show that all the talk about government money is just a smokescreen to present your bigoted opinions as calm rational analysis...
As I understand it, one of the biggest social pressures in Czechoslovakia was linguistic: "Czechoslovakian" essentially meant "Czech" and "Slovakian" was considered "Bad Czechoslovakian". Slovakian people were considered stupid, and the evidence was that they spoke "bad Czechoslovakian". Even to this day, we have the same problem in Scotland -- that speaking like yer mammy is taken as a sign you're thick, and speaking like someone from miles away is taken as a sign you're sophisticated.
It is not as much a matter of identity [i]construction[/i] as it is of "identity positivity" -- what's being called "reclaiming" these days. For people to stand up for the identity [b]they already have[/b] and defend their right to speak [b]in a way they already do[/b] is very much not "construction".
Quite the opposite, in fact. The real "constructed" identity is the one that denies variation and tries to impose a single uniform cultural identity on diverse peoples.
There are many people inside the independence movement that want nothing to do with Scots, and there are many outside the independence movement that support it. Same with Gaelic. Neither is a party-political issue.
As I was told it, most of the variety in KFC is just down to frying temperature. With all frying, if the fat isn't hot enough when the food goes in, it doesn't crisp up immediately and the batter absorbs shedloads of grease.
It was suggested to me that many UK-based KFC branches let the fryers cool too much and don't wait till they're back up to temperature when the customers come in, and reuse the fryers too quickly when busy, meaning the cold chicken hitting the fat cools things back down to "greasy sponge" point again.
" The disposable income people are still prepared to spend will go somewhere. "
China, as we buy more cheap tat on eBay and Amazon Marketplace.
The trend over time has always been to reduce the amount of our outgoings spent on labour costs. Sandwich shops were one of the last few labour-intensive drains on our wallets.
On the other hand, recent decades have seen society degenerate further and further into siloed "tribes" that rarely mix. When we spend all our time with people "like us", we let ourselves get narrow-minded.
Socialising by geography (a.k.a. "Talking to the neighbours") means exposing yourself to more ways of thinking.
"Asking somebody to explain it forces them to think about it. "
Unless they've already thought about it (or had someone think about it for them) and they already have an answer, such as "the brain--why do you think tinfoil hats exist?"
But if the question is repeated again and again with no malice or implication of stupidity, over time it might have an effect. Lots of us here will remember the long, slow decline of racism, sexism and homophobia... that is still going on, decades after it started. Ignorance is never dispelled overnight. Patience is required.
If you can't change their minds, don't talk to them.
Educating the ignorant is a slow process, and for those of us who try to act with patience and help people slowly move out of their cognitive traps, it is supremely frustrating when people blunder in and shout "YOU FUCKING NUTTER!" then run away proud of themselves for speaking up in the name of truth, having demolished our attempts to get the ignorant to start to understand the truth a bit better.
" The 4 hours extra of working a week, can be offset by the reduced time wasted commuting. I've worked in jobs where commuting wasted 10 hours a week, so spending 4 of those working and 6 of them relaxing is a win for both sides, and helps justify the idea to the company. "
Your company hasn't paid you for that time, and hasn't "gifted" you that time -- that's your time, and you shouldn't be giving it away for free. If you need to put in the extra time to buy yourself the ability to work from home, that's a bit shitty, but go ahead. But to rationalise doing more work for no active recompense from your employer... that's being complicit in your own exploitation.
Unshackle the proletariat, and fight for a better tomorrow, comrades!
" 2) "VR" Games. They suck. I don't want an immersive VR experience, where I have to mime getting out of the car and twisting off the gas cap in order to refuel. I want to press "Y" while next to the pump and have my tank meter zip to "F" on its own. In fact, I don't want to even stand up.
3) No/crap support for non-VR games. The main reason I want 3d is for stereo 3d, not for a VR experience. I like traditional styled games, where you sit and use the keyboard or maybe (for the less cerebral games) a console style controller. "
Lots of development houses thought that way, and started trying to do traditional gaming in VR (the original Oculus dev kits didn't come with motion controllers, after all) but in the end, they all came to the same conclusion -- that the sense of presence was too much, and using a controller just felt weird.
VR security is a non-issue. Not because it isn't a potential problem, but because either your headset is just a fancy display controlled by an external computer/console or it's a display with an Android device built in. That means VR security is IT security, and business as usual, not a new category. VR attacks will exploit the exact same attack vectors as every other attack, but there will be far fewer of them, as it's a small target group. Furthermore, with the exception of the Quest and similar standalone units, it's going to be pretty much impossible to identify potential targets from internet metadata, as there's nothing to set a VR-equipped PC or Playstation apart from non-VR ones unless you happen to be browsing the web from inside a headset.
" both financing its operation and financing its abolition. "
Well, there is the question of morality vs pragmatism. If the world's run by slaves, conscientious objection costs money that might render you uncompetitive. Then eventually you get the chance to do something about it.
Or maybe it's just that families are made up of different people with different views.
If you look at the PEP, one of Guido's justifications was that reviewing existing Python code, he was finding plenty of examples of people duplicating work (eg [ f(x) for x in x_list if f(x)>0 ]) or doing redundant work to avoid nested ifs, and he saw this as a solution for real-world problems.
The other advantage is removing what the linguist part of me would call "long-range dependencies". The closer the assignment is to its use, the easier it is to reason through the code. Particularly, mathematicians and scientists are used to reasoning through things in semantically dense formulas, and less used to the step-by-step imperative style.
When you remove the possibility of =/== substitution bugs, the main danger of C-style assignment is gone.
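A quick sketch of that point (hypothetical snippet, not from the post): in Python, a plain `=` inside a condition is rejected outright as a syntax error, so the classic C bug of `if (x = 0)` silently compiling as assignment can't sneak in, and the walrus has to be written deliberately.

```python
# C's classic bug -- `if (x = 0)` compiling as an assignment -- can't
# happen in Python: plain `=` in a condition is a SyntaxError.
try:
    compile("if x = 0: pass", "<demo>", "exec")
    accidental_assignment_allowed = True
except SyntaxError:
    accidental_assignment_allowed = False

print(accidental_assignment_allowed)  # False

# Assignment-in-expression must be asked for explicitly with `:=`,
# which also reads very differently from `==`:
if (x := 10) > 5:
    print(x)  # 10
```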
The main reason I see the walrus as a good thing is in list comprehensions.
newlist = [ f(x) for x in oldlist if f(x) > 0 ]
Now imagine that f(x) is O(2^n) or O(n!) -- you've either got to take the performance hit of running it twice or expand the code out. For a scientist or mathematician (and academia is a major target audience for Python), this one-line, pseudo-mathematical, pseudo-functional approach is much more readable.
My personal preference would have been for local variable declarations, eg:
newlist = [ result for x in oldlist if result > 0 where result = f(x) ]
but that's not the way they chose to go. Instead we have:
newlist = [ result for x in oldlist if (result := f(x)) > 0 ]
...which is less restrictive than my way, and Guido set out good reasons for it. But when you look back at the examples, I think it boils down to this: a great many users of Python aren't immersed in the imperative programming style the way most pro software devs are, and the reliance on branching blocks and lines is more confusing to them as it renders code logic implicit. I'm surprised it took Guido as long as it did (v2.5) to introduce conditional expressions, actually, as that's really useful for bringing a single mathematical function into a single line.
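To make the double-evaluation point concrete, here's a minimal sketch (the call counter and this particular `f` are illustrative, not from the post):

```python
calls = 0

def f(x):
    """Stand-in for an expensive function; counts how often it runs."""
    global calls
    calls += 1
    return x * x - 10

oldlist = [1, 2, 3, 4, 5]

# Pre-walrus: f(x) runs in the filter for all 5 elements, then again
# in the body for the 2 elements that pass -- 7 calls in total.
newlist = [f(x) for x in oldlist if f(x) > 0]
print(newlist, calls)  # [6, 15] 7

# With the walrus (Python 3.8+): exactly one call per element.
calls = 0
newlist = [result for x in oldlist if (result := f(x)) > 0]
print(newlist, calls)  # [6, 15] 5
```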
On the other hand, it is now getting more and more widely recognised that the YouTube algorithm is a slave-driver, and if you don't work yourself beyond the point of being able to produce decent content, you will be punished by being made irrelevant in the listings.
Whole multi-person production teams spend months to create a couple of hours of television, and YouTube is pushing solo producers to knock out around an hour of video per week to get a basic income. Of course corners get cut.
I'm not defending Raval here, though -- he chose how to react to the pressure. If he'd exited YouTube with his reputation at a high point, he could probably have got himself a fairly decent job. Instead he tried to continue in an unsustainable market.
" It seems such a counter-intuitive thing to do and yet is so widespread. The only explanation I can think of is that users learn it from each other. "
The most underestimated force in UI design is the path of least resistance. Deleting is one of our quickest actions (thanks to having a dedicated key) and if deletion is (initially) non-destructive, the apparent continued availability makes it seem like the most efficient means of archiving.
If we also had an "archive" key on our keyboards, more people would archive properly.
All very much true.
However, the good course designer starts with a particular demographic in mind as his/her target audience, and builds the course around them. A good teacher will adapt the course on the fly if students are finding it too hard or too easy.
Modern digital courseware wants to sell to as wide an audience as possible, which means all notions of prerequisite learning go out the window. Then there's the tendency for everything to be live coding instead of lectures, which means everything's paced by lines of code rather than complexity of concepts.
I'd be more generous and say that he's fallen into the greatest cognitive trap of the crowdfunding era -- the idea that money comes first, then everything else falls into place.
Kickstarter was so swamped with this sort of thinking (for example, all the non-engineers who said "give me some money and I'll hire an engineer to create my technologically impossible and/or financially infeasible games console") that they insisted on prototypes first (and now they all head to Indiegogo instead).
The notion of building a course around freely-available information is so much the mainstream that the notion on its own is valueless (how many of your uni lecturers invented and/or discovered the stuff they taught you?) -- it's the execution that matters.
What I note as missing from the article is any discussion of quality assurance. If you're delivering a course to that many people, there should be several layers of oversight, and ideally also a fairly rigorous testing process involving teaching beginners... I doubt there was any of that.
It's the thing that bugged me most about Coursera and Udacity when they first came out -- 1000s of people taking courses that had never been run before, and many of which (on Coursera and EdX, anyway) were only ever run once. No beta testing, no refinement or improvement... all people ever saw was the first draft. In that situation, at least these courses were based heavily on tried and tested university modules, but even then, the change of medium really called for a lot more in terms of adaptation and testing.
That's an odd comparison on several levels.
First of all, the 80s were full of one-man dev teams, like Matthew Smith (the Miner Willy games), Andrew Braybrook (although he was part of a software house, most of his genius games were solo coded) or Jeff Minter. With 16, 48 or 64K to play with, it was pretty easy for one person to fill all the memory in a few weeks' work.
But Mike Singleton... well, he did some clever maths-based programming to make games that most people wouldn't have thought possible -- even now part of me wants to believe that Lords of Midnight on a 48K Spectrum is just a Mandela effect.
We're now living in an era where technological marvels are beyond the reach of the sole coder, and what indies like Pope are doing is finding ways to use what's already there efficiently to build compelling story-driven games without having to worry about the tech.
OK, so Pope did have to write a custom shader to get the "dithering" effect for monochrome shading, so it was far from a totally non-technical project, but the comparison just seems weird to me.
" Unlike the wizardry used by the HoloLens 2 to calculate the position of the user's extremities, the Quest will employ its monochrome cameras and "AI magic".
Good luck with that – the Quest is hardly the processing powerhouse. "
Processing not required, allegedly. Oculus seem to be saying (https://www.youtube.com/watch?v=K2zLneGGbk8) that this is all done on the device's DSP hardware, and that the DSP hardware was specced up deliberately to leave enough capacity, after implementing the basic inside-out tracking, for later hand-tracking to be included.
I haven't looked into developer specs in any depth (if I play with VR dev, it's always with the dev kits in Unity or UE) but I'm not aware of any API to allow developers to access the DSP for custom functionality, so I don't believe it's going to have any real impact on device performance. Yes, there's reportedly currently noticeable latency in it, but that's likely to be down to the DSP itself.
"Algorithms have to crunch through tons of data in order to make accurate predictions. Relying on a single individual's data is probably not enough to get it to work. Even if a tool is effective for groups of people, it doesn't mean it'll necessarily be accurate for a single person."
It's even worse than that, though. Setting negative expectations regarding an individual is likely to increase that person's alienation, making the negative prediction a self-fulfilling prophecy. In a famous education experiment (Rosenthal and Jacobson's "Pygmalion in the Classroom"), teachers were told that a set of kids were going to be "late bloomers" and highly intelligent. The teachers treated them differently, and the prediction came true. There was nothing particularly special about these kids, other than the teachers' expectations.
If convicted criminals are treated with constant suspicion by beat cops, they'll be convinced that rehabilitation isn't possible and bam! -- self-fulfilling recidivism prophecy.
I haven't really been able to examine Eiffel closely, because as soon as I look at it, I see yet another impenetrable jumble of fixed-width letters.
Why are we so obsessed with fixed width? It's hard to read, which makes it easy to make mistakes.
Why are we so obsessed with plaintext? It's so inconvenient that we spend our lives hacking IDEs to try to present layers of visual meaning on top of the text through colouring and font weight. And still it lets us type illegal code -- syntax errors.
Creating a truly smart language integrated with the IDE would be simpler than hacking the editor to highlight errors.
Just look at the STRIDE language, invented as a teaching language. Almost all the flexibility of Java with practically no scope for syntax errors.
" I take it that you hadn't heard that this process was completed in 1760 under George III? The Monarch's assets were separated off and paid to the Treasury, "
Which is why anything since taken from the people would be accurately described as stolen in a legal sense. Under previous monarchs, many state assets have gone mysteriously missing from high-security locations and no charges have been brought, including several generations of crown jewels. The crown jewels were quite certainly stolen. Who by, we can't say, but the lack of any action suggests someone who is above the law. Some of these disappearances are said to line up quite coincidentally with cashflow problems in the households of the then-reigning (now dead) monarchs.
" and in return the government ran a Civil List returning a set figure "
Ah, so we got back what they stole from us... by buying it? You think having to buy back what was stolen is justice. If so, I'll sell you your own car next Sunday.
Oh, and then you mention the Crown Estate. How did they get the Crown Estate? Now I'm not saying they stole it, but I think the previous poster would. Why did they have rights to own the land in the first place?
Biting the hand that feeds IT © 1998–2021