C# is one of the fastest-growing languages in the world. Practically everyone who writes VB is moving over to C#.
A-level computer science students will no longer be taught C, C# or PHP from next year following a decision to withdraw the languages by the largest exam board. Schools teaching the Assessment and Qualifications Alliance's (AQA) COMP1 syllabus have been asked to use one of its other approved languages - Java, Pascal/Delphi, …
What a marvellous idea, let's stop teaching kids things that are useful and instead have them learn Pascal and Delphi - what wonderful careers that will set them up for.
Instead let's just prep them for a degree in computer science at some uni, then when they come out they won't have a clue, like all the other grads with no useful knowledge.
They are there to learn computer science at university, not to get work experience in it. You get work experience at work, not at uni.
Fundamentally it is a version of the difference between a civil engineer and a builder. A builder can become a builder by work experience alone. A civil engineer needs to study lots of boring stuff like maths and such to become a civil engineer. He cannot become a civil engineer on work experience alone.
Coming back to your comment. First of all there is a reason for this. See this for an example:
You cannot teach even the basics of algorithms and data structures in Java, C# or PHP. They simply deny you access to pointers and the low-level operations required to show students a data structures problem. Try writing a doubly linked list in Java, for example. That is good for the trade, but it is bad for the teaching.
While it is possible in C or Perl, making it readable and presentable is a nightmare.
While I am not a great fan of Pascal, credit where credit is due - it is a language designed for teaching. It is one of the very few languages which can be used to learn things like linked list manipulation, pointers, etc., and it is _READABLE_. You simply cannot do that in C#, Java or PHP.
It is the university's job to provide higher education. It produces the CS equivalent of engineers, not builders. A person who knows data structures and the fundamentals of CS can start coding in _ANY_ language in a couple of days, including Java, C# or PHP.
If, however, the universities are to follow your great advice, they will produce "builders" who can practise only their trade and cannot do any of the real stuff. No better way to ensure that this country never ever has a Google, an Infovista, or even a measly Yahoo to be proud of.
I can only say - applause, long overdue.
El Reg readers would be the first to complain if school children were taught not 'word processing' or 'spreadsheets' but Word and Excel.
How is teaching the use of a specific language any different?
Education is there to teach the concepts and principles behind computer science, so that they can be applied anywhere, to any language. Not how to write simple Java programs.
Learning how to use a particular programming language is something that is learned after leaving school/university at work. What is commercially important changes rapidly and the educational system neither needs nor can afford to continually follow what is in fashion.
In an educational setting the aim should be to first teach principles. In the context of creating software that means things like how to manipulate data, what algorithm to apply to a problem, or how to translate a system into objects. An established and stable programming language that can be used to learn these things - and I think that Pascal and related languages are very good - is what is needed.
Any competent person can program a solution in several programming languages. Teaching the basics properly should produce someone able to do that.
Modern coding may well be done in C# or Python, but using languages which are so high-level as to be almost pseudo-code teaches nothing about how to write efficient or elegant code.
I did VB in my CompSci A-Level. Can you guess how much I learned about writing efficient sorting algorithms?
I applaud moving away from object-oriented languages, at least until a firm grasp of good basic coding practices is established. Up until the end of college, maybe...
What on earth do you think is efficient about writing your own sorting algorithms!? Other people have written plenty of well honed well considered sorting algorithms for you, all you need to know is which one to use if the standard one proves too slow...
Sorting an object by typing whatever.sort() as opposed to spending a whole afternoon debugging and testing your homebrew barely remembered un-reusable quicksort, now that's efficiency right there! Supporting features like duck typing so you don't have to rewrite your quicksort function for every different type you might ever want to use, now that's efficiency.
Personally I'd advocate a Beauty and the Beast strategy - I think Python and C (proper C, not #, not ++) should be the only languages in use at A level. Early exposure to (the beauty of) Python should ensure they are naturally repelled by all the other awful, backwards, compiled, non-dynamic languages out there, and that if they really need recourse to some serious low-level voodoo there's really no need for anything other than C. The two play together fairly well too.
Props to them for ditching PHP BTW, I've had to write far more code in that ugly terrible language than I care to recall. You know a language is a dog when programming in Adobe Actionscript feels like taking a holiday!
This does appear to be a somewhat remarkable decision. I could understand if the languages were defunct, but C is still widely used, as are C++, C# et al.
When I did O level computer studies (y-e-a-r-s ago) we had to do a project in BASIC. We could use one of the school's PETs or our home computers (Speccy, C64, VIC-20, BBC, etc). My mate was a dab hand at Z80 and wrote a good 80% of his project in machine code, only to be told to re-write it as the examiners wouldn't be able to disassemble it. Then, at college, we were told to submit projects in 6502 machine code as that was the way forward.....
Seems that 25 years on some people haven't learned a darn thing....
Sure, why teach em something that might prove useful.
Just about any language can be used to teach programming fundamentals, but leaving out C (but keeping Delphi?) seems about as practical as teaching general driving skills with the use of a tractor.
Should we be looking forward to a generation of VB-wielding engineers?
If Pascal/Delphi are better Computer Science learning tools then it may, indeed, be sensible to teach these instead of C#. The amount of time actually spent learning a language in school is small compared to that which an employee will spend learning a scripting or coding language.
I'm neither a software engineer nor a teacher, but I don't believe a school's primary duty is to turn out only productive worker bees.
Yes, you're right.
However, I'm not sure how VB gives you access to the low-level facilities required for teaching things like pointers and memory management, which are important aspects of any computer science course. Actually, C is generally used to teach these. Strange one, that.
Dropping C, and C# in favour of Pascal, Delphi and VB?
Seriously, it may be easier to _teach_ those languages, but it's hardly going to be of any value to the student once they have their A-level, unless they want to go into a career where their job involves maintaining spreadsheets, and software written 40 years ago.
If anything, they should drop the obsolete and 'toy' languages in favour of languages like C#. It might be harder for the students, and they might not all get A-grades, but they'd get an employable skill.
The school should be teaching the student how advanced programming works NOT how a specific language works.
This way the student will have sufficient knowledge to pick up any language from the basics learned in school.
The learning of VB is because it is a nice easy language to begin with, the student can see instant results and not be bogged down with any complexities too early on. Once they have the basics mastered, they progress to something like C in Uni.
Remember, it is up to the individual student to learn the language themselves. I (and I'm sure a lot more of you) taught myself several languages (Java, C++, PHP, Perl etc) all from the basics I learned when in School.
As someone once said, give a man a fish and he will eat a meal. Give a man a fishing rod and he will provide for himself!
"Most centres offer Pascal/Delphi and Visual Basic as the language of choice for their students. This selection is based on the experience of the teacher in that centre and their own comfort with that language."
So students have to learn a language which won't get them a proper job because it's the only language their teacher knows? Anybody else sensing a vicious cycle?
That may be taught in higher education? Well, let's not teach anybody English, French or whatever until they get to Uni if they're studying Modern Languages courses.. See how that flies..
Pascal/Delphi/Ada isn't bad as a 'basics' language.. Maybe for O level..
VB? Don't make me laugh.
While C/C# may have a bit of a learning curve, A levels aren't meant to be too easy, and you don't need to know all of C/C#/C++ to code basic programs in it.
Removing some of the industry's most used and effective languages, while leaving niche languages on a syllabus just strikes me as wrong.
No more than Visual Basic, and certainly much less than C.
I say C++. Teaching a multi-paradigm programming language is the best basis for students moving on at a later date to other technologies. C++ is one of the most fundamental technologies in the industry.
Please, not Visual Basic. It teaches almost nothing about technology.
I can understand dropping C, simply because it's very low level (for the most part), whereas C++ allows for some relatively high-level development with the right tools.
I think you've missed the point of computer science. It isn't about "technology" at all. Technology is the end result of people with an education trying to solve real-world problems. Computer science is (in part) about teaching the different theories and methodologies used to make computers do things. Object oriented programming is one, as is functional programming, imperative programming and so on. Each is an entirely distinct system of thought, each has its own particular strengths which promote its use for solving certain classes of problem, and each has weaknesses which are revealed when trying to use it for the wrong thing.
It's necessary to learn the low level in order to understand why the high level works as it does. One needs this in order to create code which is sympathetic to the way the machine operates and is therefore efficient. Jumping straight in at the high level leads to quick results, but without the deep, intuitive understanding of what's going on under the covers the student cannot build on their knowledge. It's the difference between a tourist phrasebook and properly learning a foreign language. One will get you a beer, the other will get you a wife.
A "multi-paradigm" language like C++ is probably the worst possible case. Mixing imperative (or structured, or procedural) and object oriented programming muddies both concepts. Many students struggle with the difference; presenting them in the same language is likely to hinder their understanding. C is perfect for teaching structured programming because it is so close to the underlying machine code and so intimately bound up with what the computer is actually doing. You can easily see why one algorithm may be much less efficient than another, because you can see what the machine is actually doing. C is not in any way hard. Once you understand the _concepts of that paradigm_ it is very simple, it just gets out of the way. The "learning curve" you refer to is actually the difficulty in learning to program properly, which is what comp sci teaches!
Do you have a computer science degree?
... but they need a good grounding in data structures, pointers, and the machine model. If that can be combined with a basic understanding of OO all the better. Sadly few coming out of UK universities have the skills, which is why we mostly recruit overseas.
Computer science, is not, or shouldn't be, about learning to program in language X, it's about learning the full software development methodology.
Its goal is not to teach you C#, Java, C, whatever; its goal is to teach you about memory, variables, data structures, code flow, etc.
If you understand these concepts, the actual implementation language doesn't matter as much. Yes, you can code some pretty impressive stuff, quite easily in c#, but if it doesn't work exactly as you expected, if you don't understand what basic concepts sit underneath it, you'll never know why.
I did a software engineering degree, and at no point during it was I ever taught to actually program. You were expected to go away and learn 3-4 different languages on your own, no specifics given; all of the lectures and concepts were done in a form of pseudocode.
When you come across sorting code that chunders along slowly, and find out it's because someone is copying memory all over the place constantly instead of just swapping pointers around. That's the sort of stuff computer science should be teaching, not how to use library sorting functions. Once you understand how to do it yourself, then you can start using the libraries!
I am now a professional, and guess what - in reality, it's not about the language. If you're needed to do some C, you get a CBT course, then you're coding in C. Yes, experienced people are needed to support and review, but in the main, unless you are doing something specific, the actual language is unimportant. You need to work in VB, you learn VB; it doesn't take long to get going if you have the basic grounding in the fundamentals that it is the purpose of these courses to teach.
I'm quite sick of people saying that the programming languages used in schools must be popular in industry. After all, a lot can change between the time the students learn about programming to the time they have to apply for a job. Of course, you can try to predict which languages will be popular in the future, but you are as likely to be wrong as right, and even if you guess right, the languages are likely to change even if their names don't.
Also, in my experience, if you have learned to really program (as opposed to doing trivial cut-and-paste exercises) , then you can very quickly apply this to new languages. So instead of using a language where you have to do a lot of apparently meaningless mumbo-jumbo to just write a hello-world program, use one where you only write things that have direct relevance to the problem you solve. And use a simple language. Instead of being introduced to a new language construct or library function every time they need to solve a new problem, the students should learn only a few constructs and build everything up from these. They might not be able to create flashy animations or games as quickly, but they learn more.
Tories take over and the world ends.
PHP is the only language where pay held up and there weren't mass redundancies during the election.
C is a crazy important language anyway - everything useful is written in it.
C# is the future, and coincidentally is a great reference language, because it takes everything that's been learnt about writing software over the years and boils it down into a new language with an excellent feature set. Microsoft's implementation taking the most beautiful language ever written and effectively turning it into faster Java notwithstanding.
Java is just useless across the board and it can only be a good thing it's going out the window.
Delphi? Come on. Python is pretty ugly too. Not saying PHP isn't but at least PHP knows it is, Python thinks it's God's gift to software developers and falls waaaay short of the mark.
I mostly take issue with the killing of C and C#, they're quite possibly the most important languages on the planet right now, everything we have is written in C so you need to know it to maintain all other code, and C# is the future.
And no; this stuff isn't too hard, if they can't get to grips at A Level they probably won't at Uni either.
If we're really trying to create a generation of totally unemployable people why not just teach them all Fortran and have done with it?
"This selection is based on the experience of the teacher in that centre"
On that basis, I should have been taught COBOL using punch cards by Mr Crusty McOldfart at Technical College.
I suppose it's too much to ask that the teachers are actually competent in a more 'current' language rather than just the one they were taught 25 years ago?
The only job you'll get working with Pascal is teaching it.
The OU has a stubborn attachment to Java for some reason. Yes, it's getting a lot of support from mobile devices to enterprise stuff, but I'd rather stick my spuds in a vice than do another Java module. How about offering some other languages?
I've been working with C# for a couple of years now. As an old C/C++ dinosaur, I'm hugely impressed by the way it gently pushes towards good, clean programming practice.
It's just a shame the IDE decides to eat its own feet occasionally.
OK, who the hell thought teaching Microsoft languages was a good idea? Really?! Aren't they quite likely to be dropped without trace (J++) or to have the name applied to something totally different (Visual Basic)?
Don't even get me started about Delphi...
I guess some things never change. I had to study COBOL, dBase IV and Fortran 77 - never used any of them. I also did Pascal, but I suppose I can see the point as an "introductory" language.
Clearly educationalists are slow learners (ironically).
It's an A-level not a complete programming course. As long as the teacher is competent enough to teach the basics of programming using whatever chosen language then does it matter? If the students really want careers as developers then they will learn all that they need either in higher education or on line.
My higher education programming consisted of:
Pascal and VB6 for beginners programming
C for more than beginners programming
Motorola 68k assembler for hardware principles
Java for Object Oriented stuff
A, B, Z and C++ for formal methods
And now, professionally, I write Databasic for Reality.
If the intention is to teach students the concepts of programming, then using a language suited to that purpose (pascal/delphi) is not a bad thing. With pascal/delphi you can code in a functional or object oriented style, for example.
Once you know programming concepts and how to code in one language, you should be able to pick up other languages reasonably easily - this is what is required of programmers in the real world.
Don Knuth used a made-up assembly language in The Art of Computer Programming. Anyone who learned that language and worked through the examples in that book is likely to have a much deeper knowledge of programming, and to be more useful in the commercial world, than someone who learned and only knows PHP/C#/Java/whatever.
'but a course that covers the fundamentals of computing' - last time I looked most OSs were written in C/C++, and C syntax still forms the basis of many modern languages. So I guess the main reason for dropping it is simply that the teachers don't have the depth of knowledge to teach it.
You might want to look up what the __stdcall prefix is for in some of the older Windows APIs.
As for the C family's undeserved popularity: perhaps it's still common because many operating systems are still built on the lumbering dinosaur known as UNIX. How a 40-year-old OS (and OS design) has managed to remain relevant to today's IT needs is a mystery.
Schools don't have huge IT budgets—hence the continued support for VB6, or did everyone miss that?—so it's hardly a great shock that some older languages remain. Teachers have to teach with whatever they have to hand. If that means a classroom stuffed with ageing Pentium 4s running Windows 2000 and Delphi, so be it. The teacher doesn't get to demand new PCs capable of running the latest toolchains.
Few companies are hiring programmers fresh out of 6th form. They're hiring them out of *uni*. So there's at least another 3-4 years of learning after the student has their A or AS Level.
(Of course, when I were a lad, we were satisfied with a lone ZX80, Commodore Pet, and the school's mighty Research Machines "Link 380Z" running its "Cassette Operating System". Kids these days don't know they're born. [INSERT PYTHON SKETCH HERE]. Etc.)
"Schools don't have huge IT budgets—hence the continued support for VB6, or did everyone miss that?"
Hence them using some outdated proprietary shit as opposed to a real language with a free compiler/interpreter? Or did you miss that?
I think a large part of why they use VB and Delphi is that either A) the teachers know nothing else and are too scared or lazy to change environments and course materials, or B) they don't credit their students with enough intelligence to grasp a language not inherently entombed in a point-and-click graphical IDE. Neither of these is a very good excuse, and nor is money.
Most people here seem to be subscribers to a false dichotomy. There is no reason a language can't be both useful for teaching and useful in the real world. Find me a computing concept that can't be well demonstrated using Python or C, or both - they are both useful real-world languages, no?
"Hence them using some outdated proprietary shit as opposed to a real language with a free compiler/interpreter? Or did you miss that?"
No, I didn't "miss that". Believe it or not, the cost of a toolchain is not an indicator of its quality.
I've used vim and gcc. I've written in low level and high level languages, from various assembly languages through C++ to .NET and Objective-C. I've written entire games using HiSoft's GenST tool and even the Picturesque Assembler for the ZX Spectrum (at a time when I couldn't afford anything other than a cassette deck for storage; believe me, you learn to write good, quality code *fast* under such environments.)
Microsoft, for all their recent management problems, actually do know what they're doing when it comes to developer tools. They've been making them for much longer than they've been making operating systems. And they're actually pretty good. One of their key philosophies is to make programming *easier*. Visual IDEs are a part of that. (The GNU / Linux community appears to believe in making programming *harder*. I'm not yet sure why, although, having used Emacs, I suspect masochism may have a lot to do with it.)
Laboriously writing your commands out in dumb, flat text files and telling the computer how to link them together is a *terrible* solution for most software development problems. That it's how it was done 40 years ago does not mean it's a good way to do it now. Unfortunately, the programming fraternity is ridiculously conservative.
Windows and OS X each provide a single, homogenous platform to target, with easy development tools that let you get results quickly. The myriad variants of UNIX do not: their development platforms are fragmented and barely coherent, let alone cohesive.
My job as a teacher was teaching *programming*. And, since "BASIC" actually stands for "Beginner's All-purpose Symbolic Instruction Code", I think it's fair to say that it's not a terrible choice of programming language for—you know—*beginners*.
Knowing how to choose the right algorithm or library is far more valuable than understanding the finer details of the "make" command or Emacs.
Similarly, the argument that lots of people use "C" or "C++" does not wash. Lots of people love "X Factor" and "Pop Idol" and drinking themselves to the edge of alcohol poisoning too. Doesn't mean they're *right*.
That the tools are *easy to learn* and *easy to teach with* is of far more value to a teacher than whether they're particularly popular. Most languages are very similar syntactically, and by the time the student has been through university, he'll have had plenty of experience with other programming languages already.
Linux (no window manager necessary even, so you could practically use pentium 3's) + GCC and Python interpreter.
Bearing in mind these are introductory courses, you don't need an IDE and not having one will avoid having the students becoming too reliant on the 'helper' features. The coding exercises in these courses are small enough to be done on a simple text editor. Learning why consistent program structure, proper naming conventions, etc. are so important the _hard_ way is probably more valuable at this stage than having experience with any particular advanced developer tool.
So given this scenario the impact on the IT budget is .......
cost of installation time.
...teaches you nothing about programming apart from bad habits.
At University I had to study Oberon (pascal variant - procedural programming), Haskell (functional programming), and Java (object oriented programming). Now I program mainly in C#, and sometimes in F#. I wouldn't touch VB.NET with a bargepole, as C# is so much cleaner and efficient.
Either use totally obscure languages to prove a point and so that nobody has the inherent advantage of experience, or let people use the most popular languages in commercial use.
...is actually a proper choice to learn programming. The strategy is like "we first show you how you SHOULD do it with Pascal. Then we will show you the C#/Java/VisualBasic hairball that eats memory like a Polar Bear and is slow as a slug. If you are brave, we will then proceed to self-mutilation using C/C++".
Delphi is clean, fast, efficient and small. A good reason to be used for teaching.
We ask our junior programmers what 0x1200 + 0x34 is if they claim to know C or assembler. I've started keeping a list of professors at universities who are giving these idiots references, and blacklisting on that. At some point I'm going to call up the idiot professors and suggest they owe lots of their former students money.
It probably doesn't matter too much which language you learn to start with. And it is probably better to avoid C because it makes it difficult to teach OO concepts - and, unless you are actually going to become a C/C++ programmer, there doesn't seem much value in learning what a pointer is (not really transferable knowledge).
Even so, the main reason to settle on Delphi and BASIC is that the teachers learnt it twenty years ago* and aren't in any hurry to learn a new language. Tragic.
*It was probably Turbo Pascal not Delphi back then
With the country facing 'a new age of austerity' this does seem to play into the hands of the suppliers. I'm not aware of any competition for the supply of Visual Basic or Delphi, so how are the education buyers supposed to get the price down and ensure taxpayers are getting value for money? Java, Python and C all have competing implementations, with prices ranging from free to insanely expensive.
When I was at school the argument was that we should be using Windows 3.0 because that was what was used in industry. By the time I actually got into industry the world had moved on to Windows NT 4.0, and nowadays it's all Macs (in my corner of the world), which has a lot more in common with RISC OS than Windows. The only constant is change. What schools need to teach is how to learn; they can do that as well in Algol as they can in C.
Free Pascal & Lazarus - gives you Delphi for nothing, even if CodeGear's new owner stopped offering free/cheap education versions.
MS gives the compiler away free for VB.NET as part of the SDK (you need a Windows licence though) - there are free open-source IDEs, and there is an open-source version of .NET.
"This selection is based on the experience of the teacher in that centre and their own comfort with that language."
Those that can't, teach. I was running rings around my IT teachers by age 13. Your average developer earns more than your average teacher, and has the potential to earn considerably more.
I'd agree with this very much. To be frank, in my experience plain IT teachers tend to have little or no actual knowledge beyond teaching how to use M$ Office and the like - to the point where, when asked to develop a basic site, if you (as I did...) code it manually with decent code, you get accused of having stolen it from the internet. I spent about an hour trying to convince that teacher I actually made it myself, and in the end I'm still pretty sure he didn't believe me.
Most of the courses I applied for (admittedly this was 10 years ago) didn't even require Computing A-Level. I was accepted into Durham with Physics and Maths instead, knowing nothing more than a few dialects of BASIC, and I went on to gain a very strong grasp of Java in my first year there.
"Most of the courses I applied for (admittedly this was 10 years ago) didn't even require Computing A-Level."
Just as well, since in Scotland you can't get one, you get Highers and CSYS instead...
Having said that, I did Comp Sci at Uni and I never had any Highers or CSYS in Computing - I did have an HND in 'Applicable Mathematics with Computing' though ;)
Interesting choice of languages there... if they dropped Java ("as happened in the US") then there's not a single C-type curly braces language there.
Now, I know it's about teaching the theory of computer science, the problem solving, almost artistic side of it, rather than the nuts-and-bolts get-it-working side of things... but once you've settled into a way of writing things does it not become "uncomfortable" doing things in an alien way?
I've never got on with VBs or Python because I started with C++ and Java and the BASIC "if this then this" style of writing things, or even Python's line-breaking, never really sat well with me - it's not that I can't do it but that it just feels wrong :)
I was told by quite a few prospective Unis to avoid Comp Sci A level as they end up spending half the first year of a degree undoing the rubbish they teach at A level.
That was back in the days they did basic and pascal though.
Then again, went to uni and they taught Ada as the first language. Still, I actually believe that was a good grounding. Frustrating language as it's so strict, but good in that sense. Moving from Ada with its package spec/body concept to C++ with class headers and implementation was fairly easy. C, on the other hand, was left to much later for systems stuff. Start out on C and you learn so many bad habits.
Makes you wonder why they bother teaching anything really. I mean History - c'mon, it's in the past, who cares. Geography - well there's an app for that. Politics - they're all as bad as each other. English - pfft that's my first language so what do I need to learn? Other languages - pfft let other people learn English.
Of course it could be argued that learning to learn and learning to think are important and that this is what is actually being taught. But then where would that leave commentards eh.
I teach overseas in an international school. We have been following CIE's A-Level Computing syllabus, but will be changing next year to another syllabus / exam board.
The reason? CIE have decided that the best way to examine programming skills in the first year of the course is not through the current Programming Project (seems like a good way of testing skills - actual programming), but through a *written* exam - pseudocode, dry-runs, etc.
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
Who is still using VB6 when the .NET version has a lot of stupidity-promoting constructs (e.g. Variant) removed?? Teachers should update their knowledge, especially if they teach it.
I started on BASIC (ZX Spectrum), then after a break onto ASP and VB.
Since then I have written proper code in Java, PHP and C#. It really takes a long time to undo Basic syntax, and I would never recommend starting on Basic syntax.
Anyway, I was never taught programming at school; perhaps that's how I manage so well now!
You don't teach the language, you teach language theory and how to think.
C is one of those 'perfect' languages that everyone should know. Pascal? Not so much.
Students should be taught assembler, C, and then some OO language along with a scripting language (python, ruby, groovy).
This is an A-Level in computer science, not a vocational course in programming.
I learnt Pascal at A-Level - and it was great for learning about ALGORITHMS. Algorithms is what Computer Science is fundamentally about. Pascal is perfectly adequate for teaching this, whilst hiding some of the complexities of C (strings and pointers I'm looking at you).
At degree level I did more pascal - this time looking in more depth at algorithms with more complex data structures (trees etc). Only in my second year did I touch C, but armed with a trusty K & R it didn't take me long. I also did modules on Functional programming, predicate logic based languages, and object orientation. So that's four different paradigms, two of which are rarely used in industry. Practically all the rest of the course was algorithms and concepts. Why? BECAUSE IT'S COMPUTER SCIENCE!
If you can't pick up a manual and get up to speed quickly I would question whether you had learnt anything about computer science at all.
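The point that algorithms outlive any particular syntax is easy to illustrate: a binary search reads almost identically in Pascal, C or (sketched here, purely as an example) Python — the invariant and the halving step are the lesson, not the keywords.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # midpoint of the remaining range
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # 3
print(binary_search([1, 3, 5, 7, 9], 4))   # -1
```

Translate the same loop into Pascal or C and only the punctuation changes.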
Already knew BASIC (BBC Micro, to be specific), so turning up for Computing A level in the mid-80s I learnt Fortran (and COBOL) - quite a culture shock. Then Pascal in the first year at University (note, not Uni, I went to a proper 'University'), then C in the 2nd and 3rd years.
Both Fortran and Pascal gave me, I think, a good grounding, but especially Pascal. As many others have said - you are not learning in order to work, you are learning how to program. Pascal does that well. I've not used it since, but the techniques and concepts HAVE been used.
As one of my lecturers at University said (Professor Shuman or something) - you learn a procedural computer language, then forget it again until you need it. They all have very similar constructs - it's just the syntax that changes.
Of course, this was before OO, so I have had to learn C++ since then, but again, once you have the concepts, you can move from language to language fairly easily - because the concepts don't change.
Now, I work on some bleeding-edge tech for a semiconductor company, on state-of-the-art chips and software. In C, because even with modern processors you still need efficient code (somebody, please tell that to MS), and C is the fastest out there. You need to be able to talk to hardware but maintain reasonable readability. C does well at this.
"Java, Pascal/Delphi, Python 2.6, Python 3.1, Visual Basic 6 and VB.Net 2008" is just the choice given to schools, not to the kids.
You can bet your bottom dollar that schools will choose VB because Microshit made it and that's the end of that. 30 more Microsoft shills per school per year, you can't buy that kind of publicity (or more accurately, you can, but you have to keep it a secret)
Schools will not be able to get VB6 - it's out of support.
As for putting more shillings Micro$haft's way, how do you figure that? Give them a version of the DotNet framework (free), and a copy of the SharpDevelop IDE (also free) and leave 'em to it. This also has the advantage of giving them a set up they can easily replicate at home (being free) to further expand their learning and experience.
A very similar case could be made for Java with Eclipse.
Python and Ruby? Not sure I'd be making someone's first introduction into programming a dynamically typed language. Give the compiler a fighting chance in helping them out lol!
... I'm afraid I live in the real world, and in the real world A Level Comp Sci students are more likely to have Windows on their machine than any variant of Linux - not that having them work with Linux would be a bad thing, because it wouldn't...
Their school / college is likely to be running Windows as well - not many places at that level would have Linux support on tap...
But if they have Linux, or Windows, or just about anything else, then they still have the option of Eclipse for Java...
The word 'shill' can be used as a noun that refers to someone 'who publicizes or praises something or someone for reasons of self-interest, personal profit, or friendship or loyalty'*, or as a verb: 'to work as a shill: "He shills for a large casino."'*.
Pascal has always been a good language for teaching programming fundamentals - data structures, pointers (for linked lists, etc) while remaining readable and manageable.
The scripting languages are hopeless for that, and I'd argue that OO and very high level languages such as C# obscure too much of the fundamentals.
Of course any new programmer should be quickly directed towards C to really learn how to do it properly.
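For what it's worth, here is the kind of doubly linked list a student ends up writing in a pointer-free language, with object references standing in for the pointers — exactly the sort of exercise the pro-Pascal/C camp argues gets obscured. This is an illustrative sketch, not anyone's syllabus material:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None   # reference playing the role of a back pointer
        self.next = None   # reference playing the role of a forward pointer

class DoublyLinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, value):
        node = Node(value)
        if self.tail is None:        # empty list: node is both ends
            self.head = self.tail = node
        else:
            node.prev = self.tail    # wire the new node in at the tail
            self.tail.next = node
            self.tail = node

    def to_list(self):
        out, cur = [], self.head
        while cur is not None:
            out.append(cur.value)
            cur = cur.next
        return out

dll = DoublyLinkedList()
for v in (1, 2, 3):
    dll.append(v)
print(dll.to_list())        # [1, 2, 3]
print(dll.tail.prev.value)  # 2
```

The links work, but the student never sees addresses, allocation or deallocation — which is precisely the complaint being made above.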
All commenters are missing the point.
You don't go to school CS classes to learn about how computers are programmed *now*, or will be programmed in the *future*. You go to learn about how computers were programmed 10-20 years ago.
When I did 4th-year CS the teacher was an old ICL employee for whom computer programming meant writing COBOL on punched cards and having someone else feed it into the mainframe and run it. This was in 1985, when kids could already write software in BASIC on a £200 box plugged into the TV. Silly me, I'd thought CS classes might help me become better at my hobby. Instead I learned about such technological marvels as MCR and EAN checksums and binary coded fucking decimal.
I suspect whatever we teach our kids today will be equally laughable in 25 years. Particularly if it involves Visual Basic in any way, shape or form.
Clearly, someone is slipping today in approving comments. There's not a single reference in this one to the supreme evil of Apple, Google or Microsoft. And there's no "shouty shouty" name calling either (too bad), which of course means this unreasonable person is just giving us a good, strong, clear point of view. Must mean the Apocalypse is upon us.
Well written, Eddie, though I'm sure someone will agree with my sarcasm on a literal level and we'll be pulled back down to comment purgatory.
Nobody wanting to continue with their CompSci A level let alone a degree.
Would you rather do some weak subject like geography or take on C and C++ at A level? All while you want to spend most of your time chasing skirt.
You know the kid who gets the straight A's in geography, business and media studies will get into a better uni and still be able to do a CompSci course?
Revolved around learning Pascal and building databases in Access. I'd taught myself BASIC at a young age, so I had a fair grasp of some of the essentials of programming. I found Pascal to be an unfriendly language to use and one that has been of no value to me since.
As for Access, well, I got increasingly bored with the final year project of building a database and learned nothing about building queries and the like. I found out (by accident) that it was possible to include Visual Basic code into the database and eventually made several different card games with AI opponents. Looked really cool, ran stable and was a great way to pass a boring afternoon. For some reason, it wasn't what the course examiners were looking for, so I came away with a D. If I had been given the choice of using a programming language of some USE, I'd have probably been more interested in the coursework.
Oddly enough, the software I use at work in the hearing aid department looks like it was put together by a CompSci A-Level student in Access and crashes more times than a British Grand Prix driver.
The point of 'C' in particular is that it is so very low-level. While it's true that in any class/project situation, you won't be able to produce much - that's not the idea of the lesson. In a 'C' program, you can point at any one line and describe, in a sentence, what it does. The OO and interpreted languages perform so much help in the background (this is why they make programming quicker) that you would need an entire paragraph to fully describe the type-inferring, dereferencing, garbage-collecting, automatic-object-creating magic that happens to execute a line as simple as a=b+c
Once the student understands the essential proceduralness of the underlying execution, the magic and gloss can be introduced. The result of the current policy is more numpties who don't know why their server pauses every so often to garbage collect.
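The hidden machinery behind `a = b + c` is easy to demonstrate in a dynamic language: the `+` is really a method dispatch that allocates a fresh object. A small Python sketch (the `Noisy` class is invented here purely to make the dispatch visible):

```python
class Noisy:
    """Integer wrapper that reports the dispatch behind 'a = b + c'."""
    calls = []

    def __init__(self, n):
        self.n = n

    def __add__(self, other):
        # '+' is really a method call, and it allocates a brand-new object
        Noisy.calls.append(f"{self.n}.__add__({other.n})")
        return Noisy(self.n + other.n)

b, c = Noisy(2), Noisy(3)
a = b + c                   # one line of source: dispatch + allocation
print(a.n)                  # 5
print(Noisy.calls)          # ['2.__add__(3)']
```

One innocuous-looking line turns into a type lookup, a method call and a heap allocation — and later, garbage collection of the intermediates — none of which is visible to the student.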
Python is excellent as a teaching language, because it is cross platform and you can get useful stuff done with very little knowledge of it and this then leads naturally to more advanced concepts. Python also has no redundant syntax. Once you learn that the code won't compile until you indent it the way you should indent code in any language to make it readable, you'll be over the one arguable nuisance factor Python seems to have. Also to make indentation bugs less of a nuisance, the student will very soon be splitting coding into smaller components and files which has to be a great improvement over what most beginners do.
I agree that 'C' is better for teaching data structures and algorithms from the point of view of resource optimisation, but I really don't think 'C' pointers should have to be handled by a student learning their first programming language. Also relatively few programmers who already know how to use language provided and library sorts and hashes etc. need to worry much about optimisation.
> Python is excellent
No matter how much you fudge around it, the fact of Guido van Rossum mandating indentation and whitespace as integral to program syntax is a huge flaw in Python, and one that turns many people off.
A shame, because otherwise it's an excellent OO scripting language.
I think you're confusing "fudge" with stroke of bloody genius! I'm not disputing it turns some crusty old stalwarts off, but those people will grow old and die eventually.
Not having to terminate every bleedin' line with a semicolon
No opening a closing brackets around everything and hence...
No arguments over how to lay your code out and
No more wasted time tracing stray & missing brackets
Semantic whitespace is a quantum leap forward in common sense and usability, and I feel sorry for those poor souls who can't see that. I've programmed in all sorts of languages over the years and I have to say Python is the first one that's literally beautiful...
def sqr( x ): return x * x
Go on, make that more elegant and readable with your wrinkly old curly brackets and your wizened owd semicolons.
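A slightly longer fragment makes the same point: the block structure below is carried entirely by indentation, with no braces or semicolons to mismatch (a throwaway example, not a style prescription):

```python
def classify(numbers):
    """Split a list into evens and odds; indentation alone marks the blocks."""
    evens, odds = [], []
    for n in numbers:
        if n % 2 == 0:
            evens.append(n)
        else:
            odds.append(n)
    return evens, odds

print(classify([1, 2, 3, 4]))   # ([2, 4], [1, 3])
```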
I did a "Computing" A-level and I'm guessing it's the same thing as this. We had people that had never done programming before in the class - I don't think there were any pre-requisites - so Delphi and Pascal were good choices. At the level we were doing it didn't matter how much real-word use the language had outside of the classroom. Once you have the basic concepts, the syntax is just that and you can go away and apply what you know to whatever language you want. Is this class really the place for the more advanced features of these languages? I would personally not class the detailed implementation of an operating system as a computing fundamental for an A-level student.
In my opinion any broad programming course should consist of
- Short course on assembly (just enough to understand registers and memory management). Too many developers these days who only know managed languages (PHP/Java/.NET) don't understand fundamental principles of resource management and end up with poorly performing code.
- Short course in a procedural language (C or pascal)
- Short course in a declarative language (XSLT)
- Long course in an OO language (C++/.NET/Java)
Each one of those components is essential in getting students to understand the fundamentals of programming.
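The procedural/declarative distinction in that list can even be shown in miniature within a single language — the same computation written first as an explicit step-by-step loop, then in a declarative style that states what is wanted (Python is used here purely for illustration; XSLT is the declarative example the comment actually names):

```python
data = [3, 1, 4, 1, 5, 9]

# Procedural: say *how*, step by step
squares_proc = []
for n in data:
    if n > 2:
        squares_proc.append(n * n)

# Declarative-flavoured: say *what*, and let the language do the stepping
squares_decl = [n * n for n in data if n > 2]

print(squares_proc)   # [9, 16, 25, 81]
print(squares_decl)   # [9, 16, 25, 81]
```

Seeing both forms side by side is a cheap way to open the "more than one paradigm" discussion before committing a whole course to any of them.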
Computer Science is fundamentally about the thoughts that go into the algorithms--less syntax, more semantics. Coding is simply to put the fundamentals into practice. In comp. sci. you're learning more about things like the different searching and sorting algorithms, code efficiency. Later courses may even delve into the historic roots of computing such as Turing machines, and so on.
Agree. I get depressed when hearing someone say "I did computer science and learnt how to program some cool stuff in VB!"
A real CS course is more akin to a mathematics course than anything else. The hands-on aspect is merely a way to prove that you understood the theory.
How would Frank Skinner describe it I wonder?
This is true, but...
Without a grounding in how you actually express these fine ideas in practice, such a comp. sci. course would merely produce people who could "talk about it". I think at A-level the balance should err on the side of being grounded rather than appreciating the view from 10,000 feet. Notice also that the suggestion was for all these courses to be short ones. The objective is to spend long enough with a concrete example of each paradigm to open the student's mind to its merits and drawbacks.
And you're in danger of confusing comp.sci with soft.eng. Personally I loved programming and wanted to get into games so in retrospect I should have picked comp.sci as they got to fuse their learning with the real languages while we mostly had to tit about with the conceptual ones like ML & lisp & smalltalk etc - interesting and expansive to be sure but of little practical use to any mid 90s games companies!
Yes ubiquity != quality, look at Windows. It kills me writing stuff in PHP that I know I can write much more elegantly and concisely in another language but sadly it's what most web hosts traditionally support and consequently what most of the large open source web based codebases are written in and what I have to interact with, that and PERL which isn't much better IMHO :(
Honestly, it's a pig of a language; they've struggled to bolt on namespaces and OO features over the years, but it still shows its procedural, speedily cobbled-together roots all too often.
I learnt Pascal, then C, during my time at North Staffs Poly (now Staffordshire University). I've never used Pascal during my career; it's pretty much been nothing but C (well, apart from a bit of shell scripting and 68000 assembler in the late '90s). Someone made a comment about Pascal having good structures and pointers... Well, I'm quite happy with structures and pointers under C, myself.
One more thing. Can anyone think of any open source projects that use Pascal?
There's about 800 projects on Sourceforge alone, including a lot of libraries and compilers to assist with writing apps.
Delphi is a very good language to learn in, as you can get a graphical result with little work, it compiles very quickly (so you can check a change quickly), and you are forced to think ahead - you must define all your variables at the top of each method. Plus you have pointers, references, OOP, exceptions etc so can learn all of those concepts if you want.
So it's a really good teaching language.
The executables are really small as well - much smaller than MS Visual C++, and don't need external libraries for the majority of apps. (Unless you *want* some external libraries for some function or other of course)
Delphi is actually used a fair bit in business. I use it at work - I had never seen Delphi before my current job, and now I do half my programming in it. The other half is in C++ with Qt, which is how our 'high performance' stuff gets coded.
VB on the other hand... I learnt BBC BASIC first, then moved to VB6. I got so many bad habits from that - I have probably only just squashed them all a decade later.
Anon because I feel dirty admitting that I used VB.
"Frustrating language as it's so strict, but good in that sense."
Ada and Pascal have very similar behaviour in that they force programmers to think before they debug. With C it is the other way around in many cases. Also, Ada and Pascal have a similar syntax, type system and philosophy. Coincidence?
Now Mr Lewis, please tell us why Pascal is utter crap, because it HasNotBeenInventedInMerkina(TM).
You *could* use it as a server-side scripting language, but with so many better options why *would* you?
The schools don't need to be going for free, pseudo-OO languages with serious OO caveats.
They need to pick one language in which to teach programming principles, something that, once taught in the proper way, can be applied in almost any language (even Cobol), and I seriously doubt that those languages that cannot be written object-ively would be in any enterprise the new crop of IT yoof will encounter. Java would work, as would Delphi/Pascal, though the former would have the advantage that you might actually be able to use the bugger after school finishes.
Then the schools need to pick a second language, one that is actually being used to make real things in the real world, because presumably the little darlings will mostly be going on to places where they will be expected to do that and there should be something to show for all that time, money and effort. That would be C#, Java or (shudder) C++ as a last resort.
C is a teaching language - bizarre that this has been dropped, as everything you ever need to know about programming you learn in C, pretty much right at the beginning.
But really, AQA... Visual Basic and PHP? Are you kidding me?
Well, I finished my A-levels at the end of the last academic year.
AQA Computing was the course (It was revamped for those in the year below though).
Languages: Pascal, Delphi, Visual Basic
Now at Uni of Bath doing Computer Science
Year 1 (my current year): Python (3/4 of a semester), Java (rest of the year), quick intro to Lisp
Year 2: (Going by Unit Descriptions) will involve these in varying degrees:
Java, Haskell, Prolog, Lisp, answer set programming, C or C++, and quite a few others
I can't say for Year 3 or further as the unit descriptions haven't been published yet.
I've never met anyone who didn't have to use VB or Delphi for a Computer Science A level. God, they're what I had to use 12 years ago, and I lost interest in those languages pretty quickly. I suppose they can be fine for teaching those with little computing experience, and considering that for GCSE Computing at the time the only language they tried to teach was LOGO, and 90% of A level computing students at my school didn't have computers at home, many of the students on my course back in the 90's couldn't really have got to grips with anything more.
I wouldn't have minded if they had concentrated more on basic cs theory and less on practical applications. Often the closest we got were lessons on theoretical information processing systems, but even they were informed by the mainframe generation. So this sounds the same really.
When I did computer science A, we learned Basic using punched tape & teletypes, talking to the OU over flippin' great 4U modems. When I got to uni, the first year programming course for physicists used Fortran on punched cards... how I enjoyed that!
Now, just a few years later, I'm sat here reading El Reg while waiting for some C++ to compile. I got here via all sorts of assembler, scripting, Fortran, Java, C... oh, and some hardware too.
It doesn't matter what language you start with, you need to learn to think, problem solve, and structure your solution sensibly.
Did I miss the functional language? Did I miss the LISP-derivative? This is an incredibly narrow-minded list.
I note an earlier comment that you study to learn how things were done 20 years ago. This is probably true, but it would be nicer to learn how things might be done in 20 years time, and our best guide to that might be looking at real (if historical) examples of languages that aren't imperative and single-threaded.
That is why Silicon Valley is in California and not in Cambridge.
Most people with C, C# & PHP experience have proper developer jobs. Those who only know Pascal/Delphi and Visual Basic are obviously out of work in the field, and their only option is to train others in their outdated languages.
When I was at college many many years ago, I did BTEC and we did everything using Pascal. The lecturer also taught A-level Computing as it was known then. He would teach in Pascal because as he said it was very useful for learning algorithms and structured programming. I have used my Pascal knowledge to adapt to other languages like Java and C. I also used the knowledge that I learnt from algorithms in Pascal to other disciplines.
I say that Pascal is good and it introduces good programming habits. Remember, this is an A-level course in Computing, not a degree or just programming. They have one, or two exams on all the subjects regarding Computing, so they have to cover things like programming, social aspects of computers, databases, spreadsheets and so on. Programming is just a small section of what is taught at A-level computing.
After all - learn to program in something like Pascal, understand OO and the discipline required for memory management etc., and when you switch to Java/C# you have the skills to learn the language properly without someone from education teaching you wrongly.
Of course I don't think that is their objective ;)
If your sole objective is a deep and thorough grounding in OO theory and implementation, you should be talking in terms of Eiffel.
Or, at a pinch, Java. But Eiffel is the only OO-purist language out there.
It's also the only OO language that effectively attacks the problem underlying the "reinventing the wheel" issue, but that's not relevant to the discussion at hand.
While having a knowledge of how HTML (and other XML-based stuff) works is important, CS courses are more likely to concentrate on the php/jsp/ruby/whatever and SQL stuff (i.e. data processing) rather than presenting the results in an aesthetically pleasing manner (making it look pretty).
People who want to do HTML for a living have much more relevant routes through university than CS.
On my Uni course (which has many paths), the guys doing web development are more likely to be using JavaScript and Flash etc., but then their degree would be called something like "computing and multimedia", not "computer science" or, in my case, "software engineering".
I would not expect them to be able to use formal logic to simplify a network of gates, and they would not expect me to know in what ways internet explorer is not "standards compliant"
That is why the middle ground is where the big bucks are being made.
There is an element of elitism amongst some CS types who think (wrongly) that the web front end is the easy part. Spending hours trying to get various web browsers to behave is not their idea of fun (or a career), but they find the intellectual challenge of other areas of programming interesting.
Conversely, many web types love how powerful the mark-up can be, but the idea of working with abstract numbers all day (rather than numbers that have a directly viewable effect on what they see), reminds them a little too much of maths lessons back at school. Seeing the results of a change almost instantly is a lot more gratifying than the rather dry "Unit testing" and subsequent bug report merry-go-round.
I apologise for rambling (again)
CS != web development
I'm glad to see them choosing a selection of languages that actually makes sense. All the people out there saying that teaching Delphi will produce unemployable graduates... how does that work? People write new apps in whatever language they can find programmers for. (How else do you explain Java? Sun marketed it aggressively to all the universities, they started turning out huge waves of Java coders, and since it's what the grads know, it's what they end up using.)
Besides, I never had any trouble getting a job as a Delphi coder. I'm making good money at it right now. And if more Delphi coders leads to more apps being written in it, that's good for everybody. It's a good language that makes writing good code easier. Plenty of good programs, even top-of-their-class programs, are written in it, with Skype being one of the most visible examples. What actually gets written (and published and used) in C#?
Most real programs these days seem to be written in Delphi or C++, and the less C++ production code out there, the less glitchy crap I'll have to put up with. And the less .NET production code out there, the less slow, bloated crap I'll have to put up with. So I'm all for it.
I did A-Level Computing in 2007/2008 and we did VB, Assembler, and PHP. It was really good. It helped me a lot for my first year doing Applied Computing at uni. The ASM, documentation skills, and computer architecture parts really helped.
My actual course doesn't exist anymore; they've dumbed it down so much. :(
And what the guy before said is BS; for uni I just needed BBB with one science subject or maths, or an A in Computing instead.
Pascal I can understand, as its syntax is excellent for data structure concepts.
Delphi... I haven't seen that in at least a decade.
VB should be banned from any course, *especially* VB6!!!!
Java would be the only language on that list that I actually think should be there, and maybe Pascal because of the aforementioned "learning syntax".
Removing C doesn't look so bright, though...
All the curly-brace languages started out life as variants of Algol, a language which dates back to the late 1950s. Algol's quite a good language to learn the trade with, but it's now not used by anyone I know.
Pascal/Delphi's an odd one -- I know that Pascal was conceived as a teaching language that could be compiled in a single pass, just right for lots of student jobs through an old timeshare system, but it's hardly relevant to modern programming. 'C' is a systems programming language that's good to learn on, especially as mistakes get punished severely.
What these guys don't realize is that any language that has keywords is really just a kludge, so they really shouldn't be learning stuff like BASIC or 'C' to start with. Since it's probably not a good idea to spring books like "Structure and Interpretation of Computer Programs" on high schoolers, it's probably OK to teach them a practical language -- provided you don't dilute the work by confusing programming theory with knowing how to work different flavors of GUI.
To my old fashioned mind, education is about teaching ideas, principles, problem solving, critical ability, culture, foreign languages (because these can broaden the mind and are best learnt when young) and general knowledge. One learns, of course, some basic tools such as reading, writing and arithmetic in order to be able to do the rest. One does not learn, in maths, for example, how to operate a cash register or do double-entry book-keeping or drive Excel (I hope). One learns algebra, geometry, basic mathematical concepts and whatever else can be applied to whatever you have to learn or work with in the future; i.e. you should be learning the intellectual tools to survive, grow, learn and adapt. The result should be the ability to change fields and learn new skills as the world of work changes, not to be trained as a programming drone.
University may be a step towards training in some fields, e.g. medicine, engineering. But its main purpose was always to develop intellectual ability to equip one to tackle new ideas, be open to new disciplines.
Even when I left school, computers with any power were the size of large houses, housed in buildings that looked like concrete silos: an amazing number of things we take for granted now, in everyday life, were barely a twinkle in somebody's eye even ten years ago; many of us ended up in jobs or fields whose existence we did not even know of when at university.
So, why are so many people keen to teach children or even students specific working tools? Why are they not keener to teach them problem-solving and an open-minded, adaptable ability to absorb and adapt ideas from the past, the present and to come? In other words, the priority should be to educate to provide flexible, open minds for the unknowable future. Nobody could train Trevithick in the technology of steam power, nor Stephenson in building engines, nor Darwin in genetics. They had good education that empowered them with good minds and basic ability.
The other way, for the technicians and those expected to stay on well-worn paths until their skills are redundant, is training. I am out of touch with the UK now, but these were covered by apprenticeships, Higher National Diplomas/Certificates and so on. You do not go to university to learn to be a plumber, nor a car mechanic, and nor should you to be a programmer in a specific technology. Those technically trained who develop the interest and ability can always convert to the more academic courses and vice versa. Some of our best, most imaginative people have done that.
I think it is very appropriate if A level pupils and junior undergraduates learn the history, culture and context of the field, with a good dose of the sorts of things that work training is unlikely to cover for most, e.g. BCD, number systems ....
Of course teaching ideas, principles, problem solving, critical ability etc. are what education is about. Teaching language and mathematics is fundamental to these things. When it comes to the natural language in which teaching and learning occurs, schools have little choice, but a natural language has to be used or little education could occur. I'm all in favour of children learning a second natural language at a young age as this promotes much greater flexibility of thought.
If learning computer use in any deep sense is to be a required part of education, this clearly cannot occur from the perspective of a dashboard set of controls liable to change every few years. That is all most schools do, unfortunately, and this in practice is analogous to teaching a bit of reading while leaving all writing to a priesthood whose mystical secret knowledge is considered too abstract and elusive for mere mortals to understand. This is how things were with literacy in the middle ages, and we are now repeating this mistake in computing education if we don't teach programming in some language.
So either an artificial language choice has to be made or our computing education won't go beyond the superficial. Clearly the choice of artificial language for the purpose of teaching is important - in the sense of the choice we make not putting students off, and helping them to learn the underlying principles and concepts of computing as well as possible. The number of responses in this discussion topic suggests others think so too.
>I don't think I'd agree with the comment that C doesn't know what a file is. AFAIK, C treats everything as a file.
I should have clarified. The C *compiler* doesn't know what a file is. It knows includes and "translation units," but it has no concept of source code files as modules.
(It also doesn't know the difference between a number and a boolean, but that's a completely different flaw.)
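As an aside, the number/boolean blur isn't unique to C: Python's `bool` is literally a subclass of `int`, so the same arithmetic-on-truth-values works there too (whether that counts as a flaw is a matter of taste):

```python
# In Python, bool is a subclass of int: True behaves as 1, False as 0
print(isinstance(True, int))              # True
print(True + True)                        # 2
print(sum(x > 2 for x in [1, 3, 5]))      # counts the matches: 2
```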
I took Computing A-Level in 2003-2005 and found it quite beneficial to my understanding of computing. It was taught by a PhD student at the local university (I still consider him one of the best teachers I've ever had).
The syllabus was pretty good, you got to learn loads of cool low level stuff as well as the more waffly high level crap like Databases and so on.
I don't remember really being 'taught' a language as such, we used quite a few different environments as part of the course (ASM, Java, Haskell and......unfortunately.....VB6) but it shaped me well for university.
One of my fondest memories of Computing A-Level was when we did a practical on altering the values in memory during a game of Minesweeper so you could cheat a bit - useless in reality but it was a lot of fun at the time
...and since then (something like 20 years...) *never* touched it again. Taught myself C, VB, ARM assembler, and am now getting on with PHP.
Given that a person's brain can only cope with so much, doesn't it seem illogical to remove what may be the two most widely used languages (C/PHP) in favour of languages that are supposed to teach you HOW to program? The biggest lesson anybody can learn is "pseudocode", as in sit your ass down and think about what the program is supposed to do, and how. You don't need Pascal for that. I can't quite get over dropping solid deployed languages (C/PHP) in favour of VB6, even if kept "without comment"? What the hell does VB6 teach except how to sit at a Windows machine and "just bash out some code" with no thought to stability or future, and scant regard to method.
Skip the middleman, just teach 'em ARM code. All this curly and not-so-curly language nonsense is just an abstraction from reality. :-)
Way back in 1979, I got my Computing Science (as it was then called) A level with the first ever submission based on a microcomputer - 'Conway's game of life in 6502 machine code for the Apple II'.
Things have changed a lot since then!
I think the problem with choosing what to teach is due both to the sheer breadth of the subject nowadays and to the attempt to fulfil two goals at once, since it is a fact that not all people go to university after A levels.
I therefore think that the subject should be split into two streams: one to equip people to enter the workforce directly with a fair handle on the practical aspects, and another for those more interested in the 'philosophy', or underlying principles, of programming languages.
Whilst I live and breathe C programming, which is king of the embedded world, I can attest that it is a bad place to start for subsequently learning C++, which is much more common in banking (been there, done that!), games etc. Since C is definitely the easier of the two to teach yourself, I would choose C++ as the programming language for the practical fork of computer science A level.
As a former lecturer in software engineering I agree that C/C++/C# is not a language to teach beginners the basics of programming. Pascal is probably the best language for teaching programming precepts, but only in the first two semesters. Once the students have grasped the basics of programming with it, it ceases to be useful for many purposes.
And yet they're keeping VB? That isn't even a proper programming language! It might be OK for your spreadsheet-designing script monkeys, but it's of no use whatsoever to anyone working at the back end. Seems to me there's going to be a severe shortage of good server programmers in a few years' time.
"As a former lecturer in software engineering I agree that C/C++/C# is not a language to teach beginners the basics of programming."
I'm amazed by how prevalent this perception is amongst graduates.
VB's not even a proper programming language eh? Erm, VB.Net is every bit as capable a language as C# and Java matey. VB6 is a bit lacking, but regardless was used to write half the world's GUI applications.
The VB you're referring to is actually VBA (Visual Basic for Applications) - and the fact you could confuse this with proper VB perhaps might provide a hint as to why you were teaching programming fundamentals rather than being trusted to do anything meaty out there in the real world...
As a Physics grad my programming knowledge is limited to C++, VB and 68K Assembler...
Glad to see that they are taking a proactive step to ensure pupils are taught more about the theory and methodology behind computer science, but is it reasonable to expect teachers to just learn new languages?
You wouldn't say to an English teacher "Right, we need you to teach German from September. English and German are from the same family of languages and have similar structure so you're basically 50% there already!"
What the hell is behind this thinking? If one .NET language is suitable for teaching then both should be. Arguably C# also has a secondary benefit: its syntax is similar to Java and C++, making those easier to learn later. Which will be needed, as you ain't going to get a job with Delphi or Pascal!
"Most centres offer Pascal/Delphi and Visual Basic as the language of choice for their students. This selection is based on the experience of the teacher in that centre and their own comfort with that language."
So because teachers dabbled in some VB a decade ago, we're going to be teaching kids on VB. Oh joy. VB is such a terrible choice that it would be right off my list: its exception handling and object orientation are poor, and it's dead.
Python, C#, Java or Ruby are more sensible choices (what are universities teaching if not Java?)
The course is in two halves. Candidates' choice of language for the assessment in the first half has been reduced, but there is no restriction on programming language in the second half, when candidates undertake a major project. A candidate may learn Pascal/Object Pascal/Delphi in the first year but then choose to program their major project in year 2 in C#, PHP, F#, Haskell or whatever. Once a student has acquired experience in one block-structured OOP language they are able to pick up a second language fairly quickly.
I would have thought that the way to go would be to teach the basics from ground up to get a fundamental understanding of how computers work.
While RADs like VB and Delphi have their place and I have used them (well, actually I would use Delphi but not VB), I am not sure that a top-down approach to teaching is superior to a bottom-up one.
I learnt from ground up so that's maybe where my bias stems from.
IMHO, leaving out C and what it actually represents is a serious failing, but it may reflect the existing knowledge base of the educators currently in practice.
IMHO, you can't do C without some rudimentary understanding of how a compiler actually works and some exposure to the underlying architecture. And usually, when you learn C on a specific platform, while you can just do 'standard C' and be blissfully ignorant, you invariably pick up on how the OS has been put together (be this Linux, Windows or Mac OS X). In other words, you learn a bit about OSes too.
With this basic knowledge gleaned it's usually not hard to transition from platform to platform, language to language.
Again I would say: do not underestimate the value of learning C.
Maybe this is deemed no longer important. Maybe the goal of getting results without understanding anything too deeply is more important. Maybe that's why .NET stuff is there.
The first half of the course emphasises console/command-line programming. The assessment involves students extending a console-mode application; this year's task is noughts and crosses. Centres that teach Pascal use the console mode in Delphi.
Console mode then leads on to Object Pascal programming, which can be introduced in console mode, including creating a forms/windows-based Delphi application from within a console-mode application. Indeed, in Delphi it is possible to have both a console window and GUI application windows open at the same time, so students get to see how a GUI event-driven application is created from the bottom up, with the application writing to and reading from both the console and a window. This empowers students to create their own objects at runtime in a Windows environment, gaining a good insight into how a graphical user interface is programmed, and leads on to students creating their own classes.
It is also possible in Delphi to mix assembly language programming with high-level programming, as well as to examine the effect on the registers of the underlying machine. Thus, without leaving Delphi, students can experience the full range from the bottom up.
One of my students this year created an equivalent of Michael Kölling's Java-based Greenfoot system in Delphi, as a proof of concept that a Greenfoot approach to teaching OOP could be based on Delphi rather than Java. Is anyone interested in supporting the further development of this?
I learned to program in BBC Basic, then in Forth, then 6502 assembler - the language you learn is in many ways irrelevant; it's the ways of thinking you learn that matter a lot more. The danger of teaching what industry demands is that what industry uses changes so frequently - much better to teach the transferable skills that let you get up to speed in any language pretty quickly.
Prepare your young for the future by teaching them something they can actually use in real life!
C# is just as good a vehicle to teach computing as any one other structured OO language. The difference is that it is widely used in industry. Who uses Delphi or Pascal in the real world?
And VB?!? This was surely included because it was felt that the teaching staff are incapable of teaching a 'clever' language. No one uses VB in serious applications. Those who did, have regretted it - I have seen it many times...
What can we expect next? That the education wizards dumb down history education by using Hello magazine as the text book?
(Paris - for surely she is behind this dumbing-down conspiracy, so that she can look clever)
American Airlines' flight reservation system is written in Delphi, I believe, and the train arrivals/departures board at Shanghai central railway station is written in Delphi. I know because an ex-student of mine wrote it.
C# is a very nicely designed language, after all it was designed by the designer of Delphi, I believe. But why not F#?
C had two very important qualities that made it a wonderful teaching tool.
1: It quickly showed you which students refused to follow best practice. Prepare your code on paper first and it's easy to work out in C - and that discipline stays true in anything you go on to program in.
2: Everything that was built after C looks like it. Some of the names change and most do away with pointers, but it still looks the same. Even SQL is easier once you've done C.
I forgot another one: it showed most of us that the $60 book, or the $10k diploma, is worthless if you can't just pick up a new language quickly. Don't bet on what the paper says.
Why does anyone care what A-Level Comp Sci students do? Don't you do Maths or similar at A-Level, and then do Comp Sci/programming at uni, in order to prove you're not a complete idiot who calls a spreadsheet macro a program, and can instead actually do programming? Not as bad as the "Computer Game Programming" courses, mind.
You mean the "Computer Game Programming" courses that generally teach physics, maths and C++ development? A lot of students enrol on them for the wrong reasons but their content tends to be far more practically useful than a good portion of the other compsci degrees you get these days (Media Technology anyone?). Also you can do more than one A-Level you know, room for maths AND comp sci.
>Who uses Delphi or Pascal in the real world?
Plenty of people, actually. Delphi is currently #9 on the TIOBE index, and it sees a lot of use in the real world. What it doesn't get a lot of is publicity. It's such a productivity booster, even compared with "more modern" languages, that a lot of companies treat it as a competitive advantage and they keep real quiet about it so their competitors don't pick up on it.
List of known delphi applications: http://delphi.wikia.com/wiki/Good_Quality_Applications_Built_With_Delphi Examples of the list: Skype, The Bat!, Spybot Search and Destroy
And if anybody is in doubt what can be done with pascal, have a look at: http://www.kanzelsberger.com/pixel/
To anyone that hasn't tried it, Delphi is really nice. It's object Pascal.
I came up the commodore PET route, Commodore Basic, then BBC Basic, then 6502 machine code, Z80, a bit of 68000 then 8086, VB, C and Delphi.
Given completely free rein I go straight for Delphi: strictly typed and readable, not like C's compile-anything-but-who-knows-what-it'll-do style! And the "please don't make me come back and modify it later" syntax!
It would be very hard to write the kind of security flaws we see every day if the code had been written in Delphi; 99% of the time it just wouldn't compile. Buffer overflows of strings would be hard to exploit, as Delphi has very powerful native string handling that doesn't rely on the programmer keeping tabs on how big his lump of allocated memory is and performing bounds checking manually.
That's not to say you are insulated from the real guts of the machine, you could do it C style if you wanted to, but you'd have to be insane to really want to!
Unfortunately on a day to day basis I have to use C... Oh well, that's microcontrollers for you, but I cringe every time I have to do anything with strings. I might as well be programming in machine code.
If you want to give it a try, go grab FPC (the Free Pascal Compiler) and the GUI front end Lazarus. Free and open source, for Windows and Linux.
If you can code pointers and recursion, everything else should follow. So long as you pick a language that allows both of these then I don't care. I code hardcore C++ in a high-flying job now. My language at O-level was Basic (on a Commodore PET); at A-level was Pascal. My first year at University we used Modula-2; and subsequently we used C++, and various specialist functional or parallel languages (Haskell and Occam spring to mind amongst others).
Since then, I've coded in various 4GLs, C++, and a bunch of domain specific languages. What I learnt when I was young was useful for principles, not detail (C++ at University looks very different to C++ now).
Learn ISO machine tool language.
That will teach the little buggers the validity of error checking their code before some hulking great robot tries dismembering them
Follow that up with a heavy dose of assembly... a nice obscure RISC processor will do and the horrid little things will never do Comp.Sci at university..... thus leaving the field wide open for us oldsters with real knowledge to charge an arm and a leg for it.
We used to complain at university when various languages were jammed down our throats: spend a year learning C++ and Smalltalk, then move on to C and Java, with a dose of UML and SQL thrown in for good measure.
We were taught how to solve problems, then how to implement the solutions in whatever language we were using at the time.
Which is exactly the way my apprentice gets taught on the robots.
Look at the problem
Produce a solution
Write the solution into code
There again.... I still think 2 yrs of assembly will do them a lot of good
Of course, none of the languages is really natively suited to parallel programming; mostly they just have extensions, or hidden ways the compiler thinks it can gain a bit of a performance boost, all being well.
Since we are now commonly at 6 desktop cores and 12 threads, and the GPGPU advent will make this hundreds of cores, they are teaching languages which will be obsolete and inadequate just as this batch of victims leaves school/uni!
Paul: Nope. The Delphi development team is currently doing some very interesting research into parallel execution as a language feature, and there are at least two very nice open-source parallel libraries written by community members.
I'm not sure where the perception of Delphi as outdated or not suited to modern programming tasks comes from, but it's just not true at all.
For a techie site, it's amazing just how many people here don't get that schools are there to teach the concepts, not a precise implementation - you know, language constructs, what goes on under the hood and so on. They are not production lines for ready-made coders for industry. As some have alluded to, if you teach someone the underlying fundamentals then they should be able to pick up the various implementations (e.g. C#, C, Java). It really is that simple. I remember having to do assembly language at school in order to demonstrate how things work under the hood. Have I used it since? No. Was it useful? At the time, yes.
For all its critics, Pascal is a pretty standard teaching language - I certainly studied it 16 years ago at uni (along with Prolog and Miranda for the declarative and functional aspects). Fortran 77 was for those in the Physics dept.
VB though? Sketchy, real sketchy - unless they want to demonstrate how things shouldn't be done.
As we all know, Pascal was originally designed to be taught and many (including myself) have benefited from good structured programming practice using Pascal since school and have easily turned that balanced foundation into professional use of many languages.
Object Pascal has taken the language to another level and paired with the Delphi IDE sees continuing and extensive, academic and commercial use around the globe.
Embarcadero Technologies took over Delphi, from Borland, a couple of years back, where it has gone from strength to strength under their stewardship, investment and dedication. I would expect a significant gesture of support to the UK academic community from Embarcadero to support this AQA announcement.
I work at a company now that does VB.NET and I have to say, every day that I learn more about VB is another day I want to jump off a higher bridge. I learned C++ in high school and a bit in college, learned Java in college, taught myself Python, did C# in a previous job and am now learning Ruby on Rails in my spare time. I have to say VB is the worst thing I have ever had to use. It's ugly and wordy, and the way handlers work is backwards. The board needs to take VB off and put another good language back on, like C/C++ or C#. Something that resembles a coding language and not a 7th grader's love letter to her boyfriend. "Not IsNothing AndAlso..." yuck.
"If anything, they should drop the obsolete and 'toy' languages in favour of languages like C#. "
Except that Pascal is much less of a toy than C#. It is possible to write fast and efficient programs in Pascal, as it is in C/C++, but that cannot be said of Java or .NET. The latter two are the toys that do all the beancounting in the big corps.
As soon as you want to crunch 10 Terabyte of data in the shortest possible time you will prefer Pascal over C#. Or if you want to process 100Mbit/s of RADAR data. Or images larger than a stamp. Or simulate airflow around a car. Or anything else which is not beancounting.