90 minutes at 3mph. It's not exactly a wide-ranging beast, is it? I bet that 90 minutes isn't flat out fully loaded either.
34 posts • joined 26 Apr 2012
"They say piracy killed the Amiga..." but maybe they lied?
Amiga Power (ex-Amiga games mag, I guess they're all ex now) covered this, and it's still online in AP2 (https://theweekly.co.uk/ap2 look for "The Amiga's Death Sentence"). Basically, at an industry gathering pulled together by Future Publishing in 1993, a large selection of major game publishers met up and said they didn't want to support the Amiga, not because of piracy (the PC had piracy issues that made the Amiga's look laughable) but because the profit markup on consoles (SNES and Megadrive at the time) was so much better.
Yes, Yes, Yes BASIC, very nice but...
Here are the damning lines:
"They also created time-sharing to open up access to all students at the college. The idea was that the computers should be used by all students, not just those studying technical subjects."
Non-technical students? In the labs? Ack! At last we have found the root cause of the favourite University Dean quote: "But the CS department doesn't need extra money for their labs; all faculties use computers, it's not like biology/physics/chemistry..." usually from the head of the Maths, Computing and Science Dept., who used to be a biologist/physicist/chemist.
A memorial? For them? Truly they were history's greatest monsters.
Re: Back off another notch?
Oh God no, half the problem with the field is renaming stuff when it becomes apparent it doesn't do all the bullshit the PR people said it would, so we can all start again under a different name (I'm looking at you, "deep learning"). It was christened AI back in the day and I don't see a reason to change it. The main issue is too much focus from the press on the "I" and not enough on the "A"; I doubt Gardeners' Question Time has to field that many questions about the pollination of plastic chrysanthemums. Machine learning, on the other hand, is supposed to be algorithms adapting their results based on received data. It's part of AI but not all of it (somewhat ironically, in many cases once we've trained an ML process to a required level its adaptive process is locked and it stops learning).
Back in the day I was told the difference between Expert Systems (ES) and Decision Support Systems (DSS) (remember them?) was that if you wanted to publish an academic paper on it, it was an ES, but if you wanted to sell it, it was a DSS.
Should a robo-car run over a kid or a grandad? Healthy or ill person? Let's get millions of folks to decide for AI...
Hasn't this already been decided?
According to El Reg, Mercedes sorted this out ages ago:
So with the in-car problems answered, I'm assuming a Merc / BMW / Audi key fob will just emit a signal and the car will aim for the group with the least number of owners in it (or the one with the most of the competition's, looking at you, VW techs). If no signals are detected maybe they can find a way to run over both sets of losers to teach them a lesson.
Infosec short of people? What a shock.
Looked into infosec a few years back when I was looking to leave academia (didn't in the end). All the jobs required full CISSP certification, and part of the CISSP cert was 12 months' industrial experience. And now you say they don't have enough folk? Well, I'm stunned.
Re: > 640k base memory
Ah, the memory allocation table layout: knowing where Windows 3 loaded the CGA driver in memory so you could overwrite it with something else, because you were posh and had a VGA card. Those were the days.
At uni, though, I had an Amiga and a hardware PC emulator (an actual board with a 286 on it; you lifted out the Amiga's 68000 chip, stuck it on this board with the 286 and plugged it back into the CPU socket). This had the fantastic feature that when you booted it into PC mode you had access to the Amiga's entire 1MB of RAM in DOS. The luxury. Only ever used it for Turbo Pascal and dBase IV, obviously, because the games were better on the Amiga side.
Re: Average IQ
Nobody's done this one yet?
Three people with IQs of 100, 100, 100 = average (mean) 100
Three people with IQs of 80, 80, 140 = average (mean) 100
Two people below the average IQ in the second example; no one below the average in the first.
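The arithmetic above in a couple of lines of Python (a throwaway sketch, figures straight from the example):

```python
# Same mean, very different spread: most of one group can sit below the average.
mean = lambda xs: sum(xs) / len(xs)

group_a = [100, 100, 100]
group_b = [80, 80, 140]

assert mean(group_a) == mean(group_b) == 100   # identical averages

# Count how many in each group fall below their group's mean.
below_a = sum(iq < mean(group_a) for iq in group_a)  # 0
below_b = sum(iq < mean(group_b) for iq in group_b)  # 2
print(below_a, below_b)
```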
The only problem I can see is that your friend is assuming someone, somewhere is getting higher IQs (whether this actually relates to intelligence is an entirely different matter), and I'm not seeing much evidence of that these days.
I've been teaching programming for well over a decade to undergraduates on degree programmes where being able to code is a requirement for holding down a job in the field (first CS, then Electronic and Robotic Engineering). So reasonably intelligent folk with at least a small amount of motivation (should be more, but hey...). Let's kill off the great lie here and now: not everyone can code, and not everyone will be able to code, not at a consistent, competent level anyway.
The future is bright...
I, for one, am ecstatically looking forward to the day when I get in my car ready to set off (with excited kids / to an important business meeting / wherever the hell people go) and am greeted by:
"We are updating your driving software. Please do not switch off the engine. 1% complete."
Adding to the confusion
Out of interest, why is Facebook, a company presumably quite interested in data analytics and large amounts of data, releasing a software package with the same name as a much larger project run by Apache that's focused on such things? Do they just hate their software dev teams that much? I know they don't do the same things, but presumably it has the potential to cause internal confusion?
"I'm developing for Yarn, no not that one the other one."
20 straight months of non-growth?
They could do with seeing some external consultants to help grow their business. I was at an event recently where there were these guys who had just the thing to help them "out-think business challenges"; see, they've got this AI solution called Watson that "drives innovation and growth"...
Well, the three UK universities I've worked at have all taught ethics and professional practice to their CS / Engineering students; it's common sense and, I'm fairly sure, required for both IET and BCS accredited courses. Never mind AI, what about control and safety-critical software? Projects generally not in the public's interest? Personal ethical considerations regarding fitness for purpose etc.? It's more of a worry that US academics are calling for it to start.
I've no problem with this...
...as long as they sit down with the rest of us and learn how a support vector machine checks boundary conditions across multiple dimensions, how a multi-layered perceptron network is just a big minimisation function, how a deep belief network is usually just an MLP that's figured out its own reduced input set via restricted Boltzmann machines, how to tweak the many, many parameters of an ant colony algorithm to get the best results, etc. You know, learn the actual AI stuff. Once they've done that and actually understand what it is they are commenting on, I'll welcome their input. Same for the robotics wonks.
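The "big minimisation function" point can be sketched in a few lines of numpy. This is my own toy example (XOR, one hidden layer, plain full-batch gradient descent on squared error), not anything from a particular library or paper:

```python
import numpy as np

# Toy sketch: training a one-hidden-layer perceptron is nothing mystical,
# it is just gradient descent driving a loss function downhill.
rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])           # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)    # hidden layer weights
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)    # output layer weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
loss = lambda p: float(np.mean((p - y) ** 2))    # the function being minimised

initial = loss(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2))

for _ in range(5000):                            # full-batch gradient descent
    h = sigmoid(X @ W1 + b1)                     # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)          # chain rule through the sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out;  b2 -= d_out.sum(0)       # step against the gradient
    W1 -= X.T @ d_h;    b1 -= d_h.sum(0)

final = loss(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2))
print(initial, final)                            # the loss ends up much smaller
```

Everything that makes real deep learning hard (architecture, regularisation, scale) is layered on top of exactly this loop.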
I don't hate my family so I won't use Barclays
Barclays are bastards. Yeah, I know all banks have their problems, but we (my family) have been unfortunate enough to have a couple of deaths in the family over the last couple of years, and every other bank we've dealt with (NatWest, Co-op, Lloyds) has been professional and at least shown some human decency. Not Barclays. Barclays do not deal with deaths face to face in branch offices; you have to ring up a call centre. They emptied my brother's partner's account as there was an unsecured loan with them (no, Barclays, you do not get paid before HMRC), took months to give out account information, delayed payment for funeral expenses, only reversed actions that were illegal when threatened with the banking ombudsman, and were generally as utterly obnoxious to deal with as they could possibly be.
Re: unable to write a "for" loop in C/C++/Java
That wouldn't surprise me if the degree was Computing; chances are the student hasn't done any programming in anger since their first year. Computing degrees concentrate on using technology, not writing code. If you're looking for someone to work in the IT dept. doing database, networking and server configuration etc. and writing the odd script or SQL query, then Computing may be the go-to place, but for coding look for Software Engineering, CS or even Games Programming (those students will have done (or should have done) a lot of C/C++ with maths, and some realise they aren't going to be the next Gabe Newell).
Have you been interviewing CS grads or grads with CS-sounding titles? I used to be a CS lecturer, and there are a huge number of CS-related degrees out there with hugely different curricula, even at the same university: CS isn't Software Engineering, which isn't Networks and Communications, which isn't Computing, which isn't Intelligent Systems, etc. One of our frustrations (one of many) was that it's sometimes really hard to know what the hell industry wants from a new graduate. They need solid fundamentals (OK, so we'll teach them C, data structures, algorithm design etc.), but they need to be up to speed on industry standards (OK, so Java / C# etc.), and they need to know the full software lifecycle (software design, testing, UML, agile etc.), and the new shiny thing (cloud, IoT, functional, containers etc.), and of course the course needs to be accredited (big database component, study skills, personal development, ethics), and it's academic so they need that too (research methods etc.). Look at that: it's a crap degree because they haven't even learnt basic comms, web development etc. What kind of university sends out graduates in 2016 not knowing Android/iOS/Linux/Oracle/Windows 10/Azure/AWS/whatever the hell your company thinks is important but can't be arsed to pay for training in?
As others have said, the point of a degree is to give students some basic knowledge in an area and instruct them on how to learn and keep learning. But the IT field is so big now that employers really need to look at what students have actually done, not simply see a CS-sounding course and think it's the same thing they did 15 years ago. If you want coders, look at Software Engineering, not Computing, and ask for transcripts.
To answer another question: CS grads are typically employed by the 12-month mark, but government stats insist on checking after six months (another frustration).
Bit of basic maths seems to be the issue here.
Honestly, all of this seems less about cloud and more about tech companies' failure to understand the implications of the term "unlimited" (again). First it was internet access with dialup, then it was data on mobile, now it's cloud storage.
Question, tech company: is the service / offer you are providing actually free for you to operate (zero pounds, zero fractions of a penny, including maintenance)? If not, then you can't afford to offer an infinite amount of it.
Fun game to play.
Step 1. - Go and have a look at what billions of dollars of research funding has got us in the way of general-purpose autonomous robots; I suggest this year's DARPA challenge (the IEEE YouTube vid of the failures is a good place to start).
Step 2. - Have another read of the reg article on Japanese research on how people (especially children) behave around robots (http://www.theregister.co.uk/2015/08/07/engineers_help_robot_escape_tots/)
Step 3. - Imagine sending one of the cutting edge DARPA challenge systems into a room with constantly changing layout and dozens of people. Load it up with hot liquids and sharp objects for added fun.
Step 4. - Check the BBC's guide to which jobs are most at risk and see that Waiter / Waitress is rated at 90% (i.e. "quite likely" to be automated).
Step 5. - Consider how much time, effort and, above all else, money you are going to have to throw at this thing to prevent your restaurant being sued out of existence, all in order to replace a bunch of people working for minimum wage plus tips.
Step 6. - Realise that most of the stuff being written is by people who haven't spent more than about 10 minutes of their lives with an actual robot (and that one running a manufacturer's demo), let alone worked with one.
But I'm just bitter, being someone who has to do research with the bloody things rather than getting paid to write tosh as a "Robot Ethicist" or "Futurist" or whatever the hell these people like to call themselves this week.