Come back APL
All is forgiven....
Programmers often talk about writing "beautiful code," but computer scientist Ramsey Nasser has taken that idea to new lengths by developing the first programming language that uses Arabic script for its source code. The language is called قلب – roughly pronounced "alb," after the Arabic word for "heart" – and as Nasser …
"It's just a translation after all."
Thank you. You saved me the effort of typing in those exact words.
{checks calendar, discovers it's 2013} I would have thought that development environments would have had language switches by now, it being so perfectly trivial and obvious. Every time I scratch the surface I discover yet again how primitive you humans really are. I think you need more than seven billion just to finish up the obvious loose ends.
Will someone translate (port) this new language to English script? It might take a day or two.
No actually - this is called a genius who has become mentally ill, knows too much, but cannot function.
Assemble 100 different written languages, from every century, going back 1000 years. Substitute ONE letter from the English code with one letter from one of the languages, then go back to the previous century's set and take one letter from the second language, and so on, and then repeat the cycle back through the centuries and languages, and when you get to the first million characters of your code, tell me what the 497,227th letter stands for.
Idiot comment - but I would like to have an enormous capacity far beyond the human capacity.
I'd also like a 50,000 year life span too...
but...
APL a language was, with syntax worse than JOSS
And everywhere that language went, it caused financial loss.
(the use of APL was blamed, in part, for 1980s hedge fund failures, because its use of matrices in the restricted memory of the day meant that people simply didn't sum enough of the outlying cases and so failed to estimate correctly the probability of loss - or so the books, including the one by Bookstaber, tell me)
> the use of APL was blamed, in part, for 1980s hedge fund failures
1) Build rickety financial shite that you only pretend to understand (because "PhD in maths", natch)
2) Crash like the fat-arsed pretentious prick that you are, taking people's pension schemes with you
3) ???
4) "It was the programming language! Honestly!"
"1) Build rickety financial shite that you only pretend to understand (because "PhD in maths", natch)
2) Crash like the fat-arsed pretentious prick that you are, taking people's pension schemes with you
3) ???
4) "It was the programming language! Honestly!""
This does sound like it meets Occam's razor quite well.
"(the use of APL was blamed, in part, for 1980s hedge fund failures, because its use of matrices in the restricted memory of the day meant that people simply didn't sum enough of the outlying cases and so failed to estimate correctly the probability of loss - or so the books, including the one by Bookstaber, tell me)"
Note that this is down to the implementation, not the language itself.
A criticism that could be leveled at any language used in this application. APL, due to its terseness, might have been more memory efficient, allowing larger ranges to be considered.
Not an APL fanboi, just looking for fairness. I still think that a bunch of coked-up ar**heads making the decisions seems more plausible than a poor language implementation, but that's just me.
No, according to Bookstaber the problem was somewhere between implementation and design.
The issue was that at the time APL was an interpreted language. As a result, it was efficient if one of its terse constructs was able to manipulate a lot of data in one hit, but not if loops were needed. This meant that it slowed down dramatically if a range of values had to be calculated that required more than the available memory space.
I am prepared to concede that hedge fund traders, despite their PhDs in maths, failed to realise that they shouldn't have been doing it that way. But 6/6 hindsight is given to all of us.
What next? Klingon programming already exists.
* Specifications are for the weak and timid!!
* This machine is a piece of GAGH! I need dual Pentium processors if I am to do battle with this code.
* You cannot really appreciate Dilbert unless you've read it in the original Klingon.
* Indentation?! I will show you how to indent when I indent your skull!
* What is this talk of 'release'? Klingons do not make software 'releases'. Our software escapes, leaving a bloody trail of designers and quality assurance people in its wake!
* Klingon function calls do not have "parameters" - they have "arguments"- and they ALWAYS WIN THEM.
* Debugging? Klingons do not debug. Our software does not coddle the weak.
* I have challenged the entire Quality Assurance team to a bat'leth contest! They will not concern us again.
* A TRUE Klingon warrior does not comment his code.
* By filing this bug report you have challenged the honor of my family. Prepare to die!
* You question the worthiness of my code? I should kill you where you stand!
* Our users will know fear and cower before our software! Ship it! Ship it and let them flee like the dogs they are!
"Why next?KLingon programming already exists.
* Specifications are for the weak and timid!!
* This machine is a piece of GAGH! I need dual Pentium processors if I am to do battle with this code.
* You cannot really apprecaite Dilbert unless you've read it in the original Klingon.
* Indentation?! I will show you how to indent when I indent your skull!
* What is this talk of 'release'? Klingons do not make software 'releases'. Our software escapes, leaving a bloody trail of designers and quality assurance people in its wake!
* Klingon function calls do not have "parameters" - they have "arguments"- and they ALWAYS WIN THEM.
* Debugging? Klingons do not debug. Our software does not coddle the weak.
* I have challenged the entire Quality Assurance team to a Bat-Leh contest! They will not concern us again.
* A TRUE Klingon warrior does not comment his code.
* By filing this bug report you have challenged the honor of my family. Prepare to die!
* You question the worthiness of my code? I should kill you where you stand!
* Our users will know fear and cower before our software! Ship it! Ship it and let them flee like the dogs they are!"
Martin?
"Elvish? Klingon? Old hat. Go look up FiM++ if you want real crazy"
Nah, the COW language is tops (http://www.bigzaphod.org/cow/), check this out.....
generate fibonacci sequence
MoO
moO
MoO
mOo
[[ main loop ]]
MOO
[[ print first number ]]
OOM
[[ temp copy of first number ]]
MMM
moO
moO
MMM
mOo
mOo
[[ store second number off in the first position now ]]
moO
MMM
mOo
MMM
[[ move back to temp number ]]
moO
moO
[[ use temp to add to first and store in second in loop ]]
MOO
MOo
mOo
MoO
moO
moo
mOo
mOo
moo
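For anyone who doesn't fancy tracing the MoOs by hand, this is roughly what the program above boils down to, sketched in C++ (my own reading of the standard COW semantics from the linked page, not anything from the original post): an unbounded Fibonacci printer that runs until you kill it or the numbers overflow.

#include <iostream>

int main() {
    unsigned long long a = 1, b = 1;   // the two memory cells set up before the main loop
    while (true) {                     // MOO ... moo: keep looping while the first cell is non-zero
        std::cout << a << '\n';        // OOM: print the first number
        unsigned long long temp = a;   // MMM ... MMM: copy it into the temp cell
        a = b;                         // shuffle the second number into the first position
        b += temp;                     // inner MOO/moo loop: add the temp back into the second
    }
}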
".......Arabic has given more to computing than most cultures." Like what? As has been pointed out, the vast majority of so-called Arab scientific achievements were simply recycled from the Romans, Greeks, Persians, Indians, Chinese, Phoenicians, etc., etc., in fact from just about everyone except Arabs.
at the height of the post-9/11 hysteria, when TPTB were introducing RIPA and PATRIOT, quite a few people pointed out that if the bad guys *really* wanted to communicate in secret, they'd be best faxing each other in Arabic, given how few people the West has who could actually read it that way.
There was a similar problem in the 50/60s in the USA. Anybody who could translate secret Russian messages was obviously a commie and so couldn't have clearance to see secret Russian messages.
Even the BBC used to blacklist reporters who spoke Russian/Chinese as potential security risks. In one famous case this included an historian with a PhD in medieval Chinese.
Well, the IRA might delight in infusing Arabic with Gaelic, Welsh, and limericks.... With mandatory use of the fax tone as punctuation and pause indicators...
Wait... I think the NSA is working with Section 31 to arrest me before this post completes...
An excerpt from a list of publisher's queries re spelling, and T. E. Lawrence's answers:
Query: "Slip [galley sheet] 20. Nuri, Emir of the Ruwalla, belongs to the 'chief family of the Rualla'. On Slip 23 'Rualla horse', and Slip 38, 'killed one Rueli'. In all later slips 'Rualla'."
Answer: "should have also used Ruwala and Ruala." .
Query: "Slip 47. Jedha, the she-camel, was Jedhah on Slip 40." .
Answer: "she was a splendid beast." .
Query: "Slip 78. Sherif Abd el Mayin of Slip 68 becomes el Main, el Mayein, el Muein, el Mayin, and el Muyein." .
Answer: "Good egg. I call this really ingenious."
For reasons that are too tedious to go into, I once had to assist* in applying for US study visas for 50 members of the Iraqi Navy. Due to there being no prescribed method for translating between Arabic and Latin script, it was a rare day when I found they'd written their own names the same way on more than one form, or, on occasion, on the same form.
*When I say assist I pretty much mean do. From their successful travel to and from the USA I can only conclude it doesn't matter if you make up ~75% of the information on the forms.
> are built with commands based on English words, such as "function," "for," "if," "loop," and so on.
If you can't remember a dozen words (or symbols) then you are probably never going to make it as a programmer anyway. Some of us old timers had to learn the hex codes for several different processors.
Fair enough (and those of us who are real old timers learnt octal, not hex codes...) but it isn't a dozen or so.
If you use a left-to-right language based on the Roman alphabet you can create understandable function names no problem, but right-to-left languages do present a bit of a problem. I expect someone will tell me they have a C compiler that accepts mixed Arabic and Roman for C, because some people will do anything, but it is hardly mainstream. The program on which I currently work has thousands of descriptive function names; remembering a couple of hundred opcodes is easy compared to having all of those in a foreign language you don't know.
If you can't write a pre-processor to convert text from your chosen language into the character set expected by your compiler then you aren't a real programmer.
C++ makes it particularly easy because it allows a huge range of Unicode characters in variable names. (This ought to be true for any serious language, but I imagine there are exceptions.) In fact, *your* pre-processor would only have to translate the half-dozen tokens used by the C++ pre-processor and you could then use a header file with some well-chosen #defines for all the keywords of the actual language. Since this replacement doesn't actually change line-numbers, compiler error messages would still point to the right place and source-level debugging would still work.
Remember to write another translator for the reverse direction, so that you can import other people's code.
Translating the comments is "left as an exercise for the reader". For most code, simply removing them is probably the best bet, even if they were written in your own language.
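For concreteness, here's a minimal sketch of that header-of-#defines idea. The macro names (entier, si, sinon, tantque, retourner) are just an illustrative French-ish choice, not any established convention; the preprocessor directives themselves are the handful of tokens you'd still have to translate separately.

#include <cstdio>

// Hypothetical localized keywords, each expanding to the real C++ keyword.
#define entier    int
#define si        if
#define sinon     else
#define tantque   while
#define retourner return

entier main() {
    entier n = 3;
    tantque (n > 0) {
        si (n % 2 == 0) {
            std::printf("%d est pair\n", n);    // "is even"
        } sinon {
            std::printf("%d est impair\n", n);  // "is odd"
        }
        n = n - 1;
    }
    retourner 0;
}

Because the macros expand without adding or removing lines, compiler errors still point at the right place, as noted above.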
Exactly. If your code needs comments, consider re-writing it.
Anyway, this type of translation could be done along with the code-tidy macros. (Sometimes I wonder how people who use Visual Studio can get their code so badly formatted when it does it for you when you type the closing brace.)
This is all so wrong - a street performance artist rather than an artist with any insight into the subject matter.
We've already got more than one computer language per week for 50 years. Now you want to compound that with my Scheiß vs. your merde ("shit", in German and French respectively)? بذاءة ("obscenity")!
"As a result, all of the most popular programming languages, libraries, and APIs in use today are built with commands based on English words, such as "function," "for," "if," "loop," and so on. That can make learning programming especially difficult for students whose native language doesn't even use the Latin alphabet, for whom the keywords are little more than abstract symbols."
They're keywords; they represent abstractions. It does not matter what symbols you use as long as they succinctly represent the idea. Does using "如果" ("if") really change anything? How about "assuming"?
Oh, look, "إفعل" is "do", "إذا" is "if", "حدد" is "set" (see qlb/qlb.js), "قول" is "say" (qlb/primitives.js), etc. Not a revolution here I think.
The only possible hook this argument might have is that the first education in a new area should be as 'comfortable' as possible for the student. That's why introductory texts are in native languages? However, nothing you can do will make learning programming less difficult for the majority of people.
""If we are going to really push for coding literacy, which I do; if we are going to push to teach code around the world, then we have to be aware of what the cultural biases are and what it means for someone who doesn't share that background to be expected to be able to reason in those languages," Nasser says."
Examples would be helpful, as this looks like drawing pictures with waving flashlights in the dark.
I fail to understand how computer languages are based on human languages to such an extent that this is a worry. The only things essential are consistency and sequentiality. Is this some hangup over SVO vs. SOV or adverb/verb and adjective/noun ordering? Hey, does he know about Forth?
"What makes قلب unique, however, is that it allows Nasser to write programs that are not only functional, but also visually pleasing. By varying the lengths of the lines that connect the Arabic letters that make up the language's commands, Nasser can reshape the appearance of his code without altering its function, producing programs that are both practical and artistic."
ORLY? Okay, maybe not Python, but you can alter the appearance, the layout, of code in many languages at will. That is, if you are perverse enough. "That's ART, you fool!" Ah, my mistake, I thought it was supposed to be readable.
Strangely, looks like code to me, just with Arabic symbols. Sorry, I'm unconvinced.
I chuckled at the thought of an alternate version that refuses to compile if it isn't also readable as valid holy scripture... I think this idea has appeared in science fiction before. IIRC, the object was to design a perfect program, that was also a prayer, which would result in either a simulation representing a religious utopia, or the actual alteration of reality itself...
Nah, that ain't the reason:
SONY. Because Caucasians are just too damn tall.
http://www.youtube.com/watch?v=96iJsdGkl44
Seriously though, there is a marked difference between the West and Japan in the culture of mental arithmetic, and notation may play a small part in that - so there may be a grain of truth in your hypothesis. Manufacturing is a different matter, but post-WWII it was influenced by an American manufacturing engineer (JIT, philosophy of perpetual improvement), as well as their own traditions.
To quote my tutor in manufacturing technology,
"British and American operational researchers wrote books on how to improve manufacturing. They were read in the UK, the US, and Japan. The difference was that the Japanese, not being privy to what went on in Western factories, believed them."
"an American manufacturing engineer (JIT, philosophy of perpetual improvement)"
I presume we're talking about W Edwards Deming (RIP)?
For folk who've not benefited from Deming's insights yet, the Wikipedia article is a decent start. But don't expect your corporate-sponsored "continuous improvement" drones to talk about his ideas, even if they've heard of him, because much of what he says is not convenient for traditional Western management.
Bit of history for you.
The number system was invented in India and was referred to as Hindu Numerals by the Persians. The Europeans then got them from the Persians and called them Arabic Numerals.
The actual European representation of the numerals (0,1,2,3,4,5,6,7,8,9) didn't happen until the late 15th century when they were used in printing presses.
Whilst rarely seen in computer fonts (especially programs), our Roman letters can be written in "joined-up writing", as most people do when writing with a pen. Nothing stopping you making these 'joins' longer or shorter to fit some style.
Still, if you want code to also be artistic, don't forget the source code for that Fujitsu web page that had the HTML indented to look like the landscape as seen from Tokyo (I think)...
"The gutteral consonant is often dropped in standard [Egyptian Arabic]"
Ok, thanks for that. I had assumed the reporter could not read Arabic and made a mistake.
[ I have a passing familiarity with Arabic. I can read it and often make sense of it through extrapolation from another Semitic language. I love the way Modern Standard Arabic sounds though, especially when spoken by an educated speaker. ]
I think that putting the basic commands of a computer language in Arabic - or Thai, or Armenian, or French - is a perfectly legitimate idea. Really, to be neutral, the Latin equivalents of "if", "then", "goto" and so on should be the standard.
However, while an attractive appearance of formatted programs is a good thing, there are high-quality Arabic typefaces that implement a feature called "kashida" properly, with graceful curved extensions to the joins between letters. Presumably, a programming language works with ordinary fonts, so I am a bit concerned that this aesthetic feature will not succeed.
Neutral language? Latin? The vengeful ghosts of the Persians and the Germans will prove you wrong. As for the Chinese and the Russians, they didn't even notice the Roman Empire.
That's what is wrong with Esperanto - it is an "international" language which assumes that everybody either knows Latin or speaks a Romance language.
"..the most popular programming languages, libraries, and APIs in use today are built with commands based on English words, such as "function," "for," "if," "loop," and so on. That can make learning programming especially difficult for students whose native language doesn't even use the Latin alphabet.."
It's not just Johnny Foreigner that's inconvenienced. Anyone who speaks and writes "proper" English has to try and develop the habit of mis-spelling words like "colour" and "centre" when using them in the context of writing code. I mentally read the 'Merkin versions as "coll-OR" and "sen-TER" when coding to try and force my wee brain to differentiate, but even then I do still occasionally stick an accidental "colour" or "centre" in my code, as a result of decades of accumulated muscle memory trying to spell things correctly.
I went the other way. I learned to program first on an old DOS machine running QuickBASIC that I got second-hand, and picked up American spelling from there. My English teacher constantly marked my spelling as incorrect, resulting in a battle of wills that lasted for years: I refused to change my spellings, arguing that spelling is a consensus and the US, with its much greater population, was now the greater authority on English spelling.
You seem to have forgotten about the English (first or second) language speakers of the Commonwealth nations, as well as all of the English speakers in the EU (just guessing, but I doubt that too many Europeans use the American spellings). I think you're outnumbered.
As for the "art" in this article, I present "Hello World" as art (some help from google translate of course).
# تشمل <iostream>
باستخدام مساحة الأمراض المنقولة جنسيا؛
باطلة الرئيسي ()
{
محكمة << "أهلا بالعالم!" << ENDL؛ محكمة << "أنا فنان أكثر من رائع!" << ENDL؛}
well... to accommodate the latest god Google
[ASCII-art caricature of a face]
Apparently that's the guy, can I haves virgins now?
Boffins, cause I read it on the internets.
Can any Chinese programmers/coders tell us about their early learning experiences of coding in English? Was it a confusing experience and did they wish they could use Chinese characters instead? Has a Chinese character coding tool been developed?
As others have already said, the words 'if', 'then', 'else', 'for', ...etc are symbols with a rigidly defined meaning and could be replaced by any other combination of symbols to make code that does exactly the same thing.
".....Has a Chinese character coding tool been developed?....." A friend from uni wrote his own C compiler in Chinese in his final year. It still used the standard English phrases, just the equivalent Chinese character set. He even wrote a script to convert source code files in English into Chinese chars, just so we could see what our programs looked like. A bit pointless in the long run, I suppose, but a bit amusing at the time.
Have you watched the video? The guy's a complete idiot.
"The first programming language which is a conceptual artpiece" Yeah. Much in the same way this comment of mine on El Reg is one. Oh look I just farted, make it two.
"The language can express any kind of computation" ...unlike any other Turing-complete programming language. I love the way he then proceeds to demonstrate Hello World... on a REPL. YAWN!
"A big part of قلب was, could you really build a language that doesn't use the latin alphabet?" Wow, a tough challenge. Worthy of Turing award, to be sure.
Why did you give this clown any space on the Register?
I think it's cool.
Went to an Italian Aerospace company to sort out some integration issues and was surprised to find that even the comments in their code were written in English.
It is understandable how code came to use English, but there's no reason there can't be ports for others who use different alphabets, etc.
I once had to work with a Thomson-developed RTOS for which we had documented source (and we needed it). The comments were written in a mixture of English and French. The English comments were the usual standard of 1980s documentation, i.e. they were formally correct and told you nothing of any use. The French comments were often rather amusing, and frequently referred to a "tas de merde" ("pile of shit"), which was good to know when you were wondering what the hell something actually did.
Round here we use a liberal mix of English, Spanish and Catalan. Choice seems to depend as much on that particular coder's own political beliefs more than anything else.
But let's not talk about the efficiency gains made by having to search through the codebase three different times for something...
Couldn't you just have a translation stage before the compiler which maps keywords in different languages to the English equivalents of if, for, echo, etc.? Variables and class names you could just leave in the native script/language. So long as they're consistent throughout the program it shouldn't matter. It needs the programming language to allow multibyte characters in variable/class names, but that could be one solution if this is really causing problems.
"a translation stage before a compiler which maps keywords in different languages to English equivalents to if, for, echo etc? "
You mean a bit like a preprocessor? K&R C does part of it; more flexible and less well known macro processors do more.
"Why" is an entirely different question.
... it will be much used for writing Christmas carols.
I recall reading in an article about the early days of computer science that there was some discussion of whether English was the best language for code. A team led by Grace Hopper was tasked with studying the problem and they found that English was the best.
The interesting thing about this will be if it becomes possible to write programs that could not exist in the languages current today.
RE: "programs that could not exist in the languages current today".
Language creation follows certain well-defined rules, and much like the plethora of algebras created before those rules were understood, many new ones are destined to fall by the wayside.
However, I do wonder if new language creation rules might not flow out of cultural differences, such as cultures with number systems like "one, two, many", or those that do not distinguish between past, present, and future. A language created by such a culture might be particularly useful in a multiprocessing environment. However, developers using that language might need to learn to speak Hopi (a particularly difficult language). A fitting revenge for all of the English-based programming languages we've inflicted on the non-English speaking world.
Multiprogramming still needs a concept of past and future - or at least "before operation is done" and "after operation is done".
Discarding the idea of the "present" might help avoid race conditions though - a variable has no value "now", only a value before doing X and after doing X.
No idea how you'd express that though, and my head may asplode.
class MyValue{
public:
    // A value with no "present": only what it was before X and what it is after X.
    MyValue(int initVal){
        startVal = initVal;
        hasEnd = false;
    }
    ~MyValue(){}
    MyValue& operator=(int newVal){
        endVal = newVal;
        hasEnd = true;
        return *this;
    }
    int get_val(){
        // No "now": report the after-value if X has happened, otherwise the before-value.
        if (hasEnd){
            return endVal;
        }else{
            return startVal;
        }
    }
    int get_initval(){return startVal;}
    int get_endVal(){return endVal;}
private:
    int startVal;
    int endVal;
    bool hasEnd;
};
...I got bored. Haven't even tested it. It's up to you to override things to support more than just ints.
"But Nasser had another reason for developing قلب, too – namely, that the Euro-centric nature of most programming languages puts people from other regions at a disadvantage when learning computation and software development."
The difference here is that all regions know Latin script, at least to some degree.
If this idea ever became successful (which I highly doubt, luckily), the only thing that would come from this is even more of a divide, with each region using their own script, for no good reason.
I am Russian and when I started learning programming I was amazed that people agreed to use the same letters for that.
But now some man comes and says that this is wrong and everybody should use their own language because it is easier...
In my opinion the following is also easier:
- not to learn programming;
- not to learn math;
- not to learn how to write in a foreign language;
- or, better still, not to learn to write your own language at all.
Having said that, in my opinion all of the above is not right, since only smart and hard work makes you even smarter.
There's always Puroderu/プロデル (http://rdr.utopiat.net/) - which seems to pack a nice collection of methods/features into its standard library - including a "Guguru"/ググる method for doing Web searches (according to http://rdr.utopiat.net/docs/reference/core/core.htm).
... to see if anyone else has mentioned this, but I take issue with the assertion that this is "the first programming language that is a conceptual art piece".
The curious may wish to have a look at this venerable collection of languages that in many cases are conceptual art pieces: http://esolangs.org/wiki/Main_Page
Furthermore, they tend to go a bit further than just assigning an ungrounded (for most of us) symbol set to well-worn functional programming elements.