* Posts by andy 103

435 posts • joined 18 Aug 2009

Even Facebook struggles: Zuck's titanic database upgrade hits numerous legacy software bergs

andy 103
FAIL

Re: Facebook's key motto...

I can take a project from embryonic to mega-buster

It's exactly this sort of bullshit I'm talking about. Go on, name your project that has 2.8 billion users and virtually zero downtime in ~16 years?

I very much doubt I'm alone in that on these forums.

Agreed, it's full of people who also criticise yet never seem to be able to mention an equivalent/better thing they've done themselves. Primarily because they haven't, and can't.

It just seems that FB let it run away a bit too much.

... and yet, it continues to work without any significant problems.

andy 103

Re: Facebook's key motto...

As I said in my first post, people are quick to criticise Facebook. I bet none of those people would have the skills to create something that went from a bedroom project to concurrently supporting billions of users and holding nearly two decades' worth of data for all of them.

Generally speaking, Facebook is very stable, and that is _incredibly_ hard to achieve when you have over 2.8 billion users. Even if only 50% of them were active, that's still way more than your average application handles. The sort of people who criticise them generally don't even have 1 million users on their own platforms.

I'm yet to see an example of anybody who has taken an amateur project and scaled it to what Facebook has become. Granted, it's somewhat annoying when the use-case is the Karens of the world sharing posts about how disappointed they are at their local school... but the tech and processes behind it are nonetheless very impressive.

andy 103

Didn't know they were still using MySQL

When Facebook was originally built - early 2004 - it was a fairly basic PHP application which used MySQL for all storage (user profiles etc). I don't think it used anything else, except perhaps memcached for caching (Redis didn't appear until 2009).
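That classic LAMP-era stack typically used memcached in a "cache-aside" arrangement: check the cache first, fall back to MySQL on a miss, then populate the cache for next time. A minimal sketch of the pattern - plain dicts stand in for memcached and MySQL here so it runs without any external services, and all names are illustrative:

```python
CACHE = {}                      # stands in for memcached
DB = {1: {"name": "Alice"}}     # stands in for a MySQL user table

def get_user(user_id):
    """Return a user profile, consulting the cache before the database."""
    key = f"user:{user_id}"
    profile = CACHE.get(key)
    if profile is None:             # cache miss: hit the "database"
        profile = DB.get(user_id)
        if profile is not None:
            CACHE[key] = profile    # populate the cache for next time
    return profile

print(get_user(1))   # first call misses the cache and reads the DB
print(get_user(1))   # second call is served from the cache
```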

I doubt MySQL alone would scale to hold all of the user data it now needs to. What are they still storing in MySQL, and what other storage technologies are they using? There must be a vast amount of data, especially given that you can pretty much go back to when you opened your account and find posts.

People are quick to criticise Facebook but the way in which they've scaled that is nothing short of incredible.

Windows 11 still doesn't understand our complex lives – and it hurts

andy 103

Re: Browser sessions don't work as you've described

@sictransit what steps are you following to do that in Chrome?

andy 103
WTF?

Browser sessions don't work as you've described

"I know of no system that allows different simultaneous workspaces with their own IDs, nor browser that allows the same with tabs"

How are you expecting that to work in a web browser? Each tab is using the same browser session - they don't have a "separate session per tab" feature, and that would be annoying as hell if they did. Imagine if you'd signed into Gmail and then had to log in separately to Google Docs, Google Sheets etc... per tab.

This is why if, for example, you wanted to log in to 2 different Gmail accounts you'd have to do it in 2 separate browsers, or in a separate Incognito window. You can't be logged in with 2 separate accounts under 1 session.
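The mechanics are simple enough to sketch: the browser keeps one cookie jar per profile, so every tab presents the same session cookie, and the server resolves them all to the same login. A toy server-side illustration - the session store and account name here are made up for the example:

```python
import secrets

SESSIONS = {}   # server-side store: session id -> logged-in account

def login(account):
    """Issue a session id (the value a Set-Cookie header would carry)."""
    sid = secrets.token_hex(8)
    SESSIONS[sid] = account
    return sid

def whoami(cookie):
    """Resolve a presented cookie back to the logged-in account."""
    return SESSIONS.get(cookie)

# One cookie jar per browser profile: every tab sends the same cookie,
# so every tab resolves to the same logged-in account.
browser_cookie = login("alice@example.com")
tab1 = whoami(browser_cookie)
tab2 = whoami(browser_cookie)
print(tab1 == tab2)   # True: both tabs share the session
```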

I'm not sure what you're expecting here but this seems to be a lack of understanding of how sessions work.

British Airways data breach lawsuit settled: Airline coughs up potentially millions to make sueball bounce away

andy 103
Facepalm

saving a Windows domain admin username and password in plain text

I'd be willing to bet if you went deep enough into literally any enterprise/large organisation on the planet, you'd find something equivalent to this.

The trouble with this is that it only becomes known _after_ it has caused a problem. The person (or persons) responsible will never be held to account, and in some cases never even identified.

As much as these large businesses have thousands of pages of information security policies - getting them working in practice and enforced is a completely different matter.

New mystery AWS product 'Infinidash' goes viral — despite being entirely fictional

andy 103
WTF?

Reading between the lines of recruitment bullshit

You see this on LinkedIn all the time.

Adverts that say things like "we're looking for a frontend PHP developer...". Oh, so a backend developer that also has knowledge of frontend technologies? Or a frontend developer who knows a bit of PHP?

The reality is usually that the company asking for the candidate wants somebody who can do "any work" they have, and/or doesn't really know what they need. The recruiter - generally - has no idea of the difference between backend and frontend tools, and doesn't care, as long as they get their commission. Somebody has already alluded to this: you get people asking for 5 years' experience in something that hasn't even existed for half that time. That's amongst many other giveaways, such as a real lack of understanding of where the technologies fit into a particular stack or toolchain.

Exactly the same principle applies when you get a new or "cool" technology. Not so long ago it was Kubernetes and Docker. So many adverts asking for people with experience of them. What were these companies doing before they had them, and why did they suddenly _all_ need them simultaneously? Oh yes, because it's a bandwagon on which you must jump otherwise your company will fade into obscurity? Well, if you know as little about running a company as you do about what you're advertising for, it may do just that.

The M in M1 is for moans: How do you turn a new MacBook Pro into a desktop workhorse?

andy 103

Re: Why only M1? Also applies to Intel MacBooks

@John Robson - this seems to have become a bit pedantic although I think you missed the point I was originally making.

My current setup of a 2020 MacBook Pro and an iPhone 12 cost me about £2000. Anyone spending that shouldn't even have to spend "only" £20 more to get... connectivity. I don't recall ever having this problem on older Apple laptops (my 2015 MacBook Air being a good example) or even on cheaper PC hardware.

If it's "only" £20 then Apple can fucking well throw one in the box - at trade prices it would be pennies for them. But no... have to be bastards for the sake of it. I like Apple hardware but stuff like this really is insulting.

Incidentally this wouldn't happen with anything - or from anyone - else. Imagine spending £2k on a washing machine to then be told, oh if you want to connect it to water/power you need to spend £20 on an extra adaptor? Literally nobody would.

andy 103
WTF?

Re: Why only M1? Also applies to Intel MacBooks

@John Robson - The first line of my post - the bit about TWO Thunderbolt ports - is a copy/pasted version of what's under "About this Mac". It's not like I need confirmation though since I can see it only has 2 Thunderbolt ports.

The only other port is a 3.5mm headphone jack.

So either:

- Use 1 for the monitor + 1 for the keyboard/mouse = 2 ports used

- Use 1 for the charger + 1 for the keyboard/mouse = 2 ports used

- Use 1 for the charger + 1 for the monitor = 2 ports used

If you can make those numbers add up any other way please let me know.

Unlike my older MacBook Air there is no separate/dedicated power port; charging is done via a Thunderbolt 3 (USB-C) charger that was supplied by Apple with the laptop. Have a look at "What's in the box" on https://www.apple.com/uk/shop/buy-mac/macbook-pro/13-inch-space-grey-apple-m1-chip-with-8-core-cpu-and-8-core-gpu-256gb#

The reason you can do it is because your model has 4 ports... not 2! I'm guessing it's an earlier model.

andy 103
Facepalm

Why only M1? Also applies to Intel MacBooks

I have a MacBook Pro (13-inch, 2020, Two Thunderbolt 3 ports). Intel i5.

In my home office I have a single Samsung HDMI monitor and a keyboard/mouse that needs a USB adaptor to connect. This means I can either:

1. Charge my laptop and plug in the monitor (via a separate HDMI adaptor), but not connect my keyboard/mouse

2. Not charge my laptop (use it off battery) whilst also being able to use my monitor and keyboard/mouse.

The problem is that when the battery runs out, option (2) stops working entirely. I can use either my monitor *OR* my keyboard/mouse whilst it's being charged!

Imagine if I'd also bought an iPhone 12 and wanted to charge that via my shiny Apple laptop. Oh yeah, I did.

The irony of this is that my 2015 MacBook Air doesn't have this problem. I can connect it to power, use the keyboard/mouse and connect it to my monitor (via a Mini DisplayPort to HDMI adaptor)... and still have a spare USB port.

Japan assembles superteam of aircraft component manufacturers to build supersonic passenger plane

andy 103

Re: What baffles me about Concorde

"Maybe traveling at this speed made no sense in the first place"

It depends where you're travelling to and from. If you were going from the UK for a beach holiday in Spain then no, it wouldn't make any sense. But if you wanted to go halfway around the world, being able to do it in, for example, 3 hours instead of 9 definitely makes sense. Especially when you factor in travel times at the start and end of the journey, cutting down the flight time really is advantageous because you're less weary from travelling for so long.

The only thing I can see being different now compared to, say, the 1980s and 1990s is business customers: back then you would essentially have to go to another country to "see" somebody, whereas now you can just get on Zoom.

andy 103
Meh

What baffles me about Concorde

My grandparents flew on Concorde once, in the very early 1990s. They did it for a milestone wedding anniversary as a treat. What baffles me is that they were born in a time when it wasn't even possible for people to fly to destinations; they then went through the 1970s(?) period, which saw a huge increase in 747s and the like flying to destinations around the world; then they flew on Concorde. It was taken out of service just over 10 years after they flew on it... Now it's not possible to travel from London to Canada (that was the trip they did) in the time they did it.

It seems to me like we've really taken a backwards step. I've watched the programmes about how Concorde was a financial loss, but a lot of that seems to be due to political shenanigans and terrible project management as opposed to the actual technology.

Everyone moans about budget airlines letting the great unwashed go to places for £20 after spending more than that on booze in Wetherspoons at the airport. There are about 2 million millionaires in the UK alone as of today. What's the missing piece here? Surely there is a market for this?

Apple settles with student after authorized repair workers leaked her naked pics to her Facebook page

andy 103

Re: "Apple believes everyone has a right to privacy"

"Nobody has the right to drive my car away if I leave the key in it. But I'm definitely not going to do it."

That's exactly my point. You shouldn't do it, but even if you did, it doesn't mean all of a sudden you're in the wrong and the person who stole your car has done something acceptable because you indirectly gave them that opportunity. You didn't consent to your car being taken, and the thief taking it has committed a crime.

In the same way, somebody giving a technician their passcode or access to their device doesn't mean they've implied the technician can do whatever they want with that person's data. Granting access doesn't put you in the wrong, because even doing that doesn't imply they're OK to "do whatever they want".

andy 103
Stop

"Apple believes everyone has a right to privacy"

I can't remember the exact wording but when you set up a new iPhone/Mac it has some statement along the lines of "Apple believes everyone has a right to privacy".

Which is correct, they do.

It seems some people are quick to blame others for storing *their* data of *their* choice on *their* own device. It doesn't matter if you think someone is stupid for storing nudes on their own phone - the bottom line is that it's their own personal data on their own device, and there is absolutely no implied consent for other people to "have" it. They have a right to privacy, and just because it's "technically" possible for the data to be shared or leaked doesn't mean that's acceptable. On the balance of probabilities the victim didn't want that data shared publicly and gave nobody the right to do so.

This also goes further. Some people seem very quick to blame others for what they consider to be weak security practices. Again, it doesn't give anybody else the right to take advantage of that, and the law should rightly take that into account - was there clear intent from a victim to have their data stolen? Of course in 99.9% of cases - no, absolutely not. Stop blaming people when you know damn well others have done them wrong.

'We want to try and remove tools rather than add more...' Netlify founder on simplifying the feedback loop and more

andy 103

It is up there for the world to see.

This depends more on developers understanding basics, like not having hardcoded credentials in any code that's in version control. That, plus a private (Enterprise) GitHub account, takes care of a fair number of concerns.
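The hardcoded-credentials point is easy to get right in practice: read secrets from the environment at startup and refuse to run without them, so nothing sensitive ever lands in version control. A minimal sketch - the variable names are illustrative, not any particular framework's convention:

```python
import os

def get_db_credentials():
    """Fetch DB credentials from the environment rather than source code."""
    user = os.environ.get("DB_USER")
    password = os.environ.get("DB_PASSWORD")
    if user is None or password is None:
        # Fail loudly at startup instead of limping on without credentials
        raise RuntimeError("DB_USER/DB_PASSWORD not set - refusing to start")
    return user, password

# In production these would be set by the deploy environment, not the code:
os.environ["DB_USER"] = "app"
os.environ["DB_PASSWORD"] = "s3cret"
print(get_db_credentials())
```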

My concern is more about the infrastructure and set up of applications in general. In the past having a web framework and using a dedicated server was, and still is, more than good enough to produce something of production quality. Having decent developers who understand security principles is essential.

Where things have gone wrong is that introducing lots of other layers of complexity and trying to "separate" things out does not necessarily mean they are "more secure". But the message is always the opposite - you should use X new technology because it's "more secure" / "better" than what you're currently doing. From what I've seen, it most often isn't, at all.

andy 103
FAIL

Such sites do not require a web server

Well, they do. It's just that you've separated out the bits that need a web server (an API) from the frontend of the application... which needs a web server to be, erm, served.

They can slate monolithic applications all they want. All these modern "solutions" that "decouple" everything and run on cloud infrastructure simply shift complexity elsewhere for no real benefit.

Give me a proper web framework and dedicated server any day over any of this hipster bullshit.

What the heck is FinOps? It's controlling cloud spend – and new report says it ain't easy

andy 103

@Korev, no it doesn't. Because you can buy your own hardware and put it in a datacentre for a fixed price. Or lease a server from the datacentre... for a fixed price.

The key with it is that the pricing is known. Whereas with cloud infrastructure the "cost per CPU cycle, read/write operation" and all that bullshit means you can effectively get a bill for £-anything.

A counter argument is that if you lease a dedicated server you don't pay more than your known, fixed cost regardless of whether the server is doing nothing or 100 computationally intensive processes. My original post on this topic sums that up - having a server that's way above the spec the company I described needed was actually significantly cheaper than cloud infrastructure. This was entirely because of the false premise of things like "you only pay for usage", when in fact you also pay to migrate/re-train people etc but this is never advertised by the cloud provider, naturally!

andy 103

Re: on-demand pricing is bullshit

For that matter, it doesn't matter WHAT you intend to lease

Effectively the dedicated server I mentioned in my original post was leased. It was leased from the hosting company, i.e. the company I worked at paid the host £250/month. But here's the thing... that cost is known and fixed. It doesn't matter if we ran nothing on it and it had 0 bytes traffic, or if we ran 100 web applications that maxed out the CPU. The cost was known and didn't vary month-to-month. The existing apps and procedures for running/deploying to that host were known and understood by the employees.

The issues with the cloud are:

1. The costs aren't fixed - or in some cases, even known.

2. The transition / migration costs are extremely high if you factor in how long it takes a company to do it, get their employees up to speed with how it works, and then factor in how much other paid work they could have been doing in this time.

andy 103
Mushroom

on-demand pricing is bullshit

I worked at a company that had a dedicated web server for various web applications they produced for customers. They paid £250/month for it and got well over that in revenue from clients as the hosting cost of their applications. There was nothing wrong with it.

The company knew that the spec of the server was in excess of what they needed. Some (young, inexperienced) "DevOps guru" came in and told them that they could use cloud hosting and save £££

Let me illustrate the problem:

- The previous dedicated server had a fixed, known cost. It was always £3000 per year (250 x 12)

- The cloud server started with an "amazing" monthly price of about £60 in the initial month. Management thought wow, we've saved over £2000 a year since this new one will only cost us £720/year (60 x 12)

- yes, APART FROM..... as they added more applications, the £60/month went up. They never knew what it would be since the calculator for said host seemed to pick numbers out of thin air

- The company charged around £150/hour for development. They spent - no word of a lie - around 70-80 hours moving stuff from one host to another and learning about the new infrastructure. Given their hourly rate the "cost" of this to the company was £12k.

End result: they spent a minimum of around £13k in one year, when something they had previously that was perfectly suited cost them a fixed price of £3k.
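Taking the figures quoted above at face value (and the top of the 70-80 hour range, which is where the "around £12k" comes from), the arithmetic works out like this:

```python
# Cost comparison from the story above, worked through with the
# quoted figures. The hours figure uses the top of the 70-80 range.

dedicated_per_year = 250 * 12        # fixed-price dedicated server
cloud_base_per_year = 60 * 12        # the "amazing" initial cloud rate
migration_hours = 80                 # time spent moving and learning
hourly_rate = 150                    # the company's development rate

migration_cost = migration_hours * hourly_rate
first_year_cloud = cloud_base_per_year + migration_cost

print(dedicated_per_year)   # 3000 - the known, fixed annual cost
print(migration_cost)       # 12000 - the "around £12k" of lost billable time
print(first_year_cloud)     # 12720 - before the monthly bill even crept up
```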

If they'd have gone as far as employing a consultant to tell them this, said consultant would have been laughing all the way to the bank!

Good businesses need to know their costs. Having fixed costs is actually good because everyone knows where they stand. The idea that you can "save" with cloud tech simply isn't true for most people, especially because no cloud provider factors in the transition cost. If you add in the cost of your FinOps consultant... well...

Salesforce: Forget the ping-pong and snacks, the 9-to-5 working day is just so 2019, it's over and done with

andy 103
FAIL

Prediction

Pre 2020 :

Employee - "We need flexibility to WFH"

Employer - "No, you can't be trusted to do a full X hours per day like we pay you"

2020:

Employer - "Please can you WFH so our business can continue to operate? Or we can furlough you maybe"

Employee - "Yeah, I'm happy to have a job or just be paid a percentage of my wage so I can still pay my bills. Whilst finishing that decorating... in between home schooling my kids."

Mid 2020 :

Employer / employee - "this is working pretty well"

2022 :

Employee: "I really wish I had a remote office to work from so I don't have to live and work in the same place"

Employer: "We're done with offices. They cost us loads of money, and it's cheaper (for us) if you fund that yourself. But that was your idea in 2019. Well done. A bonus? No, there's no money for that."

Brit IBM veteran wins unfair dismissal case after 2018's Global Technology Services redundancy bloodbath

andy 103

Re: Like people producing something useful, the Title Is Optional...

Because managers never factor in their own salaries when producing Excel sheets showing cost "savings".

It's a pretty simple formula. All the people who actually do real work are a cost (salaries) but all the people who manage those costs (even though they often have much higher salaries) are seen as saviours because they can "save the company money". Often by trying to replace highly skilled workers with less skilled workers. When things go tits up said managers leave, and then repeat the process at another company. All whilst extracting a massive salary + benefits for themselves, naturally.

andy 103

Because it's not really about a score. A score is just their artificial way of justifying why they want to get rid of you. In a similar manner if they had a project and desperately needed staff, said scores would suddenly become a lot higher.

They've decided they don't want you. You've decided them getting rid of you is unfair. Nobody is a winner.

andy 103

Re: redundancy is laying off workers where the job no longer exists

Re - Intentionally laying people off and moving their jobs to another country in order to lower costs is not redundancy.

Absolutely. But the only thing which will stop employers doing it is if there is some default massive fine (let's say £1m plus) per employee if they're found doing it. They do it, because they can get away with it.

andy 103
WTF?

This is ridiculous

So many employers don't even follow their own procedures when it comes to doing this.

But what really really f-ks me off is this line:

job cuts, known in IBM-ese as a "resource action." His job was due to be offshored from the UK to Bulgaria

How can an employer claim someone's job isn't required, then offshore it? Clearly it is required, but they feel it should be done "differently" - i.e. at a lower cost to them.

Cases like this shouldn't even be given the time of day. It's a clear case of an employer taking the p**s. If someone's job isn't required then they shouldn't be allowed to replace them. It should be put into law, with a massive, massive payout to anyone who's unfairly screwed over by this.

Spare a thought for Asos.com techies: Topshop acquisition coincides with deadline for global retail system go-live

andy 103

Re: I have faith in them

@Tom38 That's why I quite like ASOS though. Because of the free delivery even so much time after you placed the order. You don't have to necessarily make a trip to the post office. I tend to do it when I'm passing whilst doing other jobs like food shopping. Knowing you have so long to return stuff puts less pressure on. You can also return it from places like petrol stations. They're pretty quick on refunds too so if something doesn't fit you soon have the money back to buy again. Like I said, this is what people are after, although appreciate it's not completely without downsides.

andy 103
Thumb Up

I have faith in them

I'm a loyal Asos customer, and one of the reasons is that - aside from the shit delivery companies they use (i.e. Hermes) - the process of getting stuff is absolutely flawless. Without wishing to sound like an advert, being able to get free next day delivery and return anything over a month later - I think it's still about £10/year for this - is absolutely responding to the way people want to shop.

Whenever I hear of these "death of the high street" stories, I think, well if you can compete with this good luck to you. What Asos offer is what a considerable number of people want. That's why it works. It's not rocket science. The high street hasn't adapted to shopping habits very well. Lockdown will inevitably have helped online retailers but companies like this already had it nailed long before then.

I wish them luck and hope they don't screw it up because what they have now is about as good as it gets from an online shopping perspective.

The killing of CentOS Linux: 'The CentOS board doesn't get to decide what Red Hat engineering teams do'

andy 103
WTF?

Am I missing the point? Why does anyone care?

So in our case we had a few servers that ran CentOS. We'd heard about this way before it was reported on The Register and decided to just switch to a different distro. In our case, Ubuntu. Far from it being a royal pain in the arse that we perhaps anticipated, everything worked fine. There were some differences in the software installation processes that we used but nothing insurmountable.

One of the reasons we never bothered with RHEL is that, for our particular use-case, Red Hat offer absolutely nothing of value to us. So any other distro would probably have been fine. Given that all Linux distros are built around the same kernel, the only "downside" of using an alternative is that some of the processes/procedures (apt-get compared to yum, and so on) may differ. But I can't imagine anyone who's competent having a real struggle with that.

A new take on programming trends: You know what's not a bunch of JS? Devs learning Python and Java ahead of JavaScript

andy 103

Re: PHP?

I think this is to do with the fact that WordPress alone makes up a significant chunk of what's on the web. With it being built in PHP, it's often hard to separate people who do actual PHP development from people who do no development at all but use an application that happens to be written in PHP.

andy 103

Re: 25 years ago

I’m actually describing both a browser and a server, amongst other things. Yes you need a server side component to save a document, but the likes of Ajax/async requests (which you need to send the request to save) and support for UI controls built with CSS3 and SVG just weren’t available in browsers that long ago. The slickness of applications now is reliant on everything from the client, server and networks in between. Fault tolerance and redundancy (network / server side) have moved on considerably. In my view it is progress.

andy 103

25 years ago

25 years ago you couldn’t write a document in a browser, have it auto save without human interaction, and have multiple collaborators.

That’s just a very basic example. What you can do in a browser now is nothing short of incredible progress. Progress which people who don’t understand how it works behind the scenes get annoyed about. If you’re the sort of person who doesn’t understand why you can’t just copy/paste HTML to get stuff like that to work I can’t help you. The nature of some of these apps has also brought people together to do great things which *is* the very essence which the openness the internet was intended for.

Smartphones are becoming like white goods, says analyst, with users only upgrading when their handsets break

andy 103
Facepalm

Like second hand cars

In 2019 I went into an Apple store and paid £749 for an iPhone XR on a finance agreement that was £31/month, which is just the phone without a data/call plan. I pay about £10/month on a SIM only deal which I change when I want as there's no contract.

Fast forward to today and a family friend managed to get a refurbished (read: same quality as mine is now) XR - including airtime - for about £17/month. I'm still paying over £40 for the same thing.
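For what it's worth, the figures quoted above check out:

```python
# The phone finance arithmetic from the comment, verified. A £749
# handset at £31/month implies a roughly two-year finance term.

handset_price = 749
handset_per_month = 31
sim_only_per_month = 10
refurb_bundle_per_month = 17   # refurbished XR including airtime

term_months = handset_price / handset_per_month
total_per_month = handset_per_month + sim_only_per_month

print(round(term_months, 1))   # ~24.2, i.e. about a two-year term
print(total_per_month)         # 41 - the "over £40" in the comment
print(total_per_month - refurb_bundle_per_month)   # 24/month difference
```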

The analogy I use is that what I did is like buying a brand new car. Why would anyone do that when you can get one 18-24 months old for a fraction of the cost, and effectively get the same thing? I bought it on a whim because I genuinely thought it was a good phone that would last me 4-5 years (and hopefully it will). But of course had I waited and just gone for a second hand one, that could still have been the case.

The point being that "new tech" isn't good enough when compared to tech from a year or two ago to justify the cost. Manufacturers know this and that's why you get these crap adverts from EE and such about "envy" when in reality everyone thinks you're a bit stupid for spending £80/month on a phone just because it's shiny and supports 5G. We all know in a year or so what customers will actually be prepared to pay for that.

If you're a WhatsApp user, you'll have to share your personal data with Facebook's empire from next month – or stop using the chat app

andy 103

Re: Does this apply here?

"if someone has my details on their phone, I am not consulted nor consent about my details being passed on to Whatsapp (or any of those other 'allow access to contacts' apps)"

It's called a loophole. In this case *if someone has my details on their phone* and they *allow access to contacts* it is actually they (the "someone") who is technically at fault in that scenario. That person allowing access to their contacts gives permission for Whatsapp to have the data.

Search history can calculate better credit ratings than pay slips, says International Monetary Fund

andy 103
FAIL

I have an iPhone 12 on contract and search for luxury holidays which I don't go on.

Guess I'm rich.

World+dog share in collective panic attack as Google slides off the face of the internet

andy 103

Re: Try again *later*

How long does it take YOU to make a cup of tea?

Well I made at least 2 in the time it took Google to fix this.

So, considerably longer than 5 mins.

andy 103
FAIL

Try again *later*

It always makes me chuckle when you get error messages that say

"Try again in five minutes"

Why five? Where has that figure come from? They don't even know what the error is! So how can you reliably determine when someone can "try again"?

But it's somehow better than "try again later". When is later? 10 mins, tomorrow, next year?
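For what it's worth, clients that retry programmatically don't pick a fixed wait out of thin air either - the usual approach is exponential backoff with jitter, where only the *ceiling* of the wait grows with each attempt. A sketch (the parameter values here are illustrative):

```python
import random

def backoff_schedule(attempts, base=1.0, cap=300.0, seed=None):
    """Exponential backoff with 'full jitter': each wait is drawn
    uniformly from [0, min(cap, base * 2**attempt)] seconds."""
    rng = random.Random(seed)
    waits = []
    for attempt in range(attempts):
        ceiling = min(cap, base * 2 ** attempt)
        waits.append(rng.uniform(0, ceiling))
    return waits

# Five retries: the ceiling doubles each time (1, 2, 4, 8, 16 seconds)
# rather than being a fixed "five minutes" nobody can justify.
print(backoff_schedule(5, seed=42))
```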

Either way we'll all keep madly hitting Ctrl+F5 until we get the response we want.

Bare-metal Macs-as-a-service come to AWS. Intel for now, M1 silicon in 2021

andy 103
WTF?

What is the use case for this?

I must have misunderstood this, but what use-case does anyone have for running a Mac within cloud infrastructure when by definition they want a bare metal device anyway? Wouldn't it just be better to... buy one? It's not like buying your own is mega expensive, especially when you consider that over a period of 24 hours it costs pretty much $30 to rent one from AWS.

What applications or services are people running where they need a remote Mac to run them?

None of the examples given by Amazon are "people who don't already own Macs".

AWS reveals it broke itself by exceeding OS thread limits, sysadmins weren’t familiar with some workarounds

andy 103
Mushroom

Isn't the point of cloud infrastructure that you can do more with less?

That's funny because most cloud providers including AWS are telling everyone the main point of their existence (as far as customers are concerned) is that you can do everything using fewer resources, therefore costing you less.

Except when it goes wrong. Then you need more resources to fix the problem.

Almost as if you hadn't bothered with any of it, everything would have been better.

Redis becomes the most popular database on AWS as complex cloud application deployments surge

andy 103
FAIL

I wouldn't use the phrase "poorly-sourced" when there are 2 errors in your first sentence. No, I'm afraid Redis is very much a database, and it isn't limited to in-memory storage since it can read/write to disk as well.
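For anyone doubting the disk part: persistence is configured in redis.conf via RDB snapshots and/or an append-only file. A typical fragment (the thresholds shown are the stock defaults or close to them):

```conf
# RDB snapshotting: dump the dataset to disk if at least 1 key changed
# in 900 seconds (plus a tighter threshold for busier periods).
save 900 1
save 300 10

# Append-only file: log every write and fsync once per second, giving
# durable on-disk storage rather than memory-only caching.
appendonly yes
appendfsync everysec
```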

Dell online store charges 16 million dollars for new laptop with paint job

andy 103
Joke

TCO

Well, the total cost of ownership of Dell hardware has always been a bit absurd.

Either you get lucky and it's very good value, or it's absolutely shite hardware and you may as well piss your money down the grid.

It seems all they're doing here is making the latter an explicit option.

Joke icon. But also sort of not a joke.

What does everyone make of today's Google antitrust action? Only the stock market is happy with the status quo

andy 103

15 quid if they pay within 14 days and acknowledge that "lessons have been learnt"

andy 103
Thumb Down

How to disrupt a monopoly

Offer something better than they do.

Go and speak to Ask Jeeves if you need further info.

*still waiting*

Apple re-arms the iMac with 10th-gen Intel Core silicon

andy 103

Their older hardware is good enough (and that's the problem)

21" model with a decent second monitor would be a more than adequate setup for me.

I have one of the 2016 27-inchers and it's been a great value machine, still blazingly fast and with the 5k Retina display. But that's the problem: that machine is more than good enough and there's not much tempting me to upgrade. In the same way I have a 2015 Macbook Air which is still - IMO - better than the new models.

Basically, nothing in Apple's current line-up tempts me to upgrade. Even if these machines are better than my current setup, they're not *that* much better that I'd bother for a considerable time, by which point they'd probably be available second-hand. Have we reached a point where hardware several years old is good enough for most people? See also: phones.

You wait ages for a mid-air collision spoofing attack and along come two at once: More boffins take a crack at hoodwinking TCAS

andy 103

Re: Security

I would never advocate security through obscurity. But I also don't feel that on a system of this nature the details of how it works should be in the public domain. Personally I'd prefer it if these things were kept within closed circles rather than "hey everyone here's how this works, please can you try and break it?". Not for this application anyway. Opening everything up to the world for scrutiny doesn't automatically make it "secure".

andy 103

Re: Security

It's more concerning how people who come up with these proofs of concept get the information required to simulate them in the first place.

The way this article is written suggests all they did was a bit of Googling, then bought a mid-to-high spec computer from their local PC World. Frightening that that's "all" it takes. The information needed to do something like this shouldn't be in the public domain, simple as.

It isn't security through obscurity because they clearly knew enough about how it worked to simulate an attack. Absolutely crazy.

One year ago, Apple promised breakthrough features to help iPhone, iPad, Mac owners with disabilities. It failed them

andy 103
Unhappy

What are the penalties for excluding disabled people?

It makes me cringe to write this, but it reminds me of published guidance on website accessibility with regards to those who have - for just one example - visual impairments up to and including blind people.

Unfortunately it isn't particularly well enforced. So you can get a web developer who makes a "beautiful" looking website and then goes "meh, making this accessible for screen readers is too much work / makes my work look un-sexy". If they take this approach, there are very few consequences. Theoretically they could be fined. When was the last time you heard about someone being fined for that?
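Part of the frustration is that the most common failures are trivially detectable. As a sketch of the kind of automated check that could catch one class of screen-reader problem - images without alt text - using only Python's standard library (the checker class and sample markup are my own illustration, not any official tooling):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute,
    i.e. images a screen reader cannot describe to the user."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

checker = AltTextChecker()
checker.feed('<img src="chart.png" alt="Q3 sales chart"><img src="hero.png">')
print(checker.missing)  # the images with no description for screen readers
```

If a ten-line script can flag this, "too much work" is a weak excuse.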

UK Gov are apparently very focused on making all of their own digital assets accessible to those with disabilities. But you get the impression they don't really want to do this, they're just doing it to make it look like they comply with guidance - guidance which they probably don't even understand, or care about.

If there were severe penalties for producing any technology that was seen as exclusive (as in excluding) towards disabled people some of these companies may think twice about where they put their efforts.

Ex-barrister reckons he has a privacy-preserving solution to Britain's smut ban plans

andy 103
FAIL

the content is labelled by the creator

If our proposal is accepted, the content is labelled by the creator so it's easy to filter out inappropriate content from young children, using the lightweight filter.

Right so - just off the top of my head - several problems.

1. The content creator actually labels their content? To be fair, this article acknowledges that might not happen. If that's the case, how do consumers set up devices to handle a mixture of content that carries these labels and content that doesn't? It seems all or nothing.

2. Who regulates how content creators use this? Can a smut company put material out there and simply label it as "safe" to gain viewers? Who enforces that and what are the penalties if they don't comply with the spirit of using it correctly? If they get "banned" from using it then surely we're back to square one?

3. "using the lightweight filter" - it's not that lightweight because the playback devices also then have to implement controls to toggle using it. Presumably behind some other layer of security that has nothing to do with this proposal.

4. As with DVD / Blu-ray, if you can play it back, you can distribute it. This problem has existed for years. Play the recording back on one authorised device, record it on another (a smartphone camera will do; nobody cares about it being *that* HD). Distribute the decrypted content as much as you want on one of the many apps young people use, i.e. Snapchat, Messenger, etc. Or if you want to go old-skool, a USB flash drive, or god forbid an optical disc.
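To make points 1 and 2 concrete, the whole "lightweight filter" boils down to something like the sketch below. Everything here - the header name, the label values, the function - is my own assumption about how such a scheme might work, not the actual proposal:

```python
# Labels a creator might attach to their content. Note that nothing
# stops a bad actor simply claiming "safe" (my point 2).
ALLOWED_LABELS = {"all-ages", "safe"}

def should_fetch(headers: dict, strict: bool = True) -> bool:
    """Decide whether a child-profile device may fetch this content.

    `strict` is the all-or-nothing problem from point 1: unlabelled
    content is blocked when True, allowed when False.
    """
    label = headers.get("X-Content-Label")  # hypothetical header name
    if label is None:
        return not strict           # unlabelled: depends on device policy
    return label in ALLOWED_LABELS  # trusts the creator's own label

print(should_fetch({"X-Content-Label": "safe"}))  # True
print(should_fetch({}))                           # False under strict policy
```

The filter itself is trivial; every hard problem lives in who sets the label and who punishes lying about it, which is exactly what the proposal hand-waves.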

Apple to keep Intel at Arm's length: macOS shifts from x86 to homegrown common CPU arch, will run iOS apps

andy 103
Facepalm

Re: RIP Hackintosh

Apple is primarily a hardware company, and they actively oppose running their software on other hardware, as the software is mainly there to sell the hardware.

There's also a bigger reason. Consider all the variants of Android, running on a huge number of different devices comprising hardware from different manufacturers.

There's a reason people talk about Apple's offerings saying they "just work". Because in terms of iOS, it's literally 1 OS with no variations (aside from version releases) running on hardware which doesn't vary much. They know their software works on their hardware. So supporting it isn't as expensive as if you throw in a load of variations.

That's why it "just works". That's why it sells. The last part is what Apple (a business) care about.

"not everyone can afford to buy their expensive hardware" - but there are plenty who can, and they are Apple's target market. Apple doesn't care about people who won't give them money. Because they are, you know, a business.

andy 103
Thumb Up

No different to changing ports

This whole thing reminds me of the debates that have gone on about Apple dropping certain ports from their hardware, whether it's the 3.5mm jack on a phone or a USB 3 port on a MacBook.

Things move on.

Years ago I had an AMD-based laptop with a parallel port, a serial port, infrared (yes, infrared) and a CD drive. Do I miss any of that now? No, not at all. Same goes for the 3.5mm jack on my previous iPhone compared to the one I have now, or the fact that I can't plug that 13-year-old USB 2 printer into my MacBook.

The clear reason they've done this is that they feel it's a genuine improvement over any of the Intel offerings, particularly given how little control they have over the production and design of those chips. Given Apple's propensity to make money, they must be pretty confident consumers and end-users won't miss an Intel processor. Whatever your view, it's not a step backwards, even though at the time it might seem that way. In the same way, nobody will care that their phone doesn't have a 3.5mm jack once they accept that things naturally progress, and that you don't carry on doing things a certain way just because "it's what we know".

Also remember that ~80% of users of their hardware are non-tech people who don't even know - or care - what processor their device has, as long as it works, and works well. Their target market isn't "person who gives a shit what architecture the CPU is". It's, "person who will spend money on this device". Interestingly, Apple has a very good track record at the latter.



Biting the hand that feeds IT © 1998–2021