This should be at the Bletchley park museum
Full stop.
Auction house Bonhams says it has a lost Alan Turing notebook for sale. The auctioneer says the notebook “dates from 1942 when he was working at Bletchley Park to break the German Enigma Code.” The 56 hand-written pages reportedly include “... works on the foundations of mathematical notation and computer science”. The …
Notations are important: you think through them, you frame things according to their contours, if that makes sense. Heaviside's D operator is a good example: it reduces certain classes of differential equations to polynomials.
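For anyone who hasn't met it, the D-operator trick mentioned above goes like this (a standard textbook example, nothing to do with the notebook itself):

```latex
% Writing D = d/dx, a constant-coefficient linear ODE
% becomes a polynomial in D, which can be factored:
y'' + 3y' + 2y = 0
  \;\Longleftrightarrow\; (D^2 + 3D + 2)\,y = 0
  \;\Longleftrightarrow\; (D+1)(D+2)\,y = 0
% The roots of the operator polynomial, -1 and -2,
% read the general solution straight off:
y = A e^{-x} + B e^{-2x}
```

The differential equation hasn't changed, but in the operator notation it looks like, and is solved like, a quadratic.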
About the notebook: just publish high-res scans of the Turing pages so historians of maths can mine them. The book itself can then go to the collectors, as far as I'm concerned.
Notation is everything, but one notation for everything isn't the best idea. The point of notation is that any given one is better in some cases and worse in others, and merely CHANGING the notation you use between the two instantly makes a certain class of problems easier.
Think about chess games. You notate moves in the classic co-ordinate systems, but that's no good if you want to record the board position itself (you'd have to replay all the moves), so we have Forsyth notation for writing down the position directly. Even among game-record notations, sometimes it's easier and more sensible to talk about particular ranks from the viewpoint of each player, which gives us descriptive notation, while beginners are probably more comfortable with algebraic. Both are really just simplifications of a much more long-winded notation that records every square moved from even when it's unnecessary.
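To make the coordinate-vs-Forsyth distinction concrete, here's a minimal sketch of the piece-placement field of Forsyth (FEN) notation. The board representation and function name are my own illustration; a full FEN string also carries side-to-move, castling rights, etc., which this skips:

```python
# A board as an 8x8 grid of characters, rank 8 first
# (uppercase = White, lowercase = Black, '.' = empty square).
START = [
    "rnbqkbnr",
    "pppppppp",
    "........",
    "........",
    "........",
    "........",
    "PPPPPPPP",
    "RNBQKBNR",
]

def to_fen_placement(board):
    """Encode a board as the piece-placement field of FEN:
    pieces are written as-is, runs of empty squares collapse
    to a digit, and ranks are joined with '/'."""
    ranks = []
    for rank in board:
        field = ""
        empties = 0
        for sq in rank:
            if sq == ".":
                empties += 1
            else:
                if empties:
                    field += str(empties)
                    empties = 0
                field += sq
        if empties:
            field += str(empties)
        ranks.append(field)
    return "/".join(ranks)

print(to_fen_placement(START))
# rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR
```

The whole position fits on one line, which is exactly what a move-list notation can't give you without a replay.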
But computers have their own notations for games, and there are even odd things like chess notations that side-stepped censorship of coded messages sent through the post office during certain wars, etc.
A different notation for EXACTLY the same problem can make it trivial to solve, but no one notation can do that for ALL problems. This is true across mathematics: merely describing a problem in the notation of a different branch can reveal links between the two that nobody ever thought of and make solving complex problems trivial. That's pretty much where a lot of the "universal theory" mathematics is heading at the moment, for instance.
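A classic concrete case of this (my example, not the commenter's): the game of Nim looks opaque with pile sizes written in decimal, but in binary it becomes trivial. The player to move can force a win exactly when the bitwise XOR of the pile sizes is nonzero:

```python
from functools import reduce

def nim_xor(piles):
    """XOR ('nim-sum') of all pile sizes. Nonzero means the
    player to move has a winning strategy; zero means they lose
    against best play."""
    return reduce(lambda a, b: a ^ b, piles, 0)

# In decimal, piles of 3, 4 and 5 tell you nothing. In binary:
#   011 ^ 100 ^ 101 = 010, which is nonzero,
# so the player to move can win.
print(nim_xor([3, 4, 5]))
# 2
```

Same game, same piles; only the notation changed, and the whole theory of the game falls out of it.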
I used Leibniz in high school and college, and Heaviside as well in college, but I don't think I ever had a course where we used Newton's notation. I may go look it up just out of curiosity.
Of course, some fields, like linear algebra, seem to spawn new notations all over the place. How many ways are there of writing complex vectors in common use?
Which leads me to ask - does anyone know of a site that's good for answering a "what the hell is that notation supposed to be?" question. I was reading some mathematical paper a couple of years ago - might have been Zadeh's original fuzzy sets paper - and ran into something I don't think I'd ever seen before. Spent a while poking around Wikipedia and Mathworld to no avail.
"The "U-571" of Turing Biopics (What a complete load of tosh, dramatised up for Hollywood)."
Not seen it so can't comment, but the publicity drive here in the UK does mean my students have at least heard of Turing. Silly code games in maths lessons went down very well, and a bit of safety stuff from CEOPS made them think about https and the little padlock...
In Bletchley Park, you can get some nice books explaining very well the mathematics behind the first breach of the Enigma machine (reconstructing the complete internals from the first six letters of about 80 encrypted messages and probably a cleaner writing down the cabling on an Enigma that stood around with settings unchanged for several weeks).
"True story" movies are boring. It's as simple as that.
Hence why every "True story" is "based on", not "this is what actually happened".
You have to go into movies KNOWING this. U-571, however, is a different class of tosh. Imitation Game is merely embellishment and "artistic licence" with the story so that grannies aren't saying "That was boring, what the hell was going on" for everything.
NEVER watch a movie expecting historical re-enactments. You won't get them. Ever. You think Mrs Brown was accurate? Or The Iron Lady? Or Made in Dagenham? Or The Queen? Or The King's Speech? No. Never. Not even close, any of them.
What you can get, having seen the movie, is a fun run-around in a Turing-like world. A homage to the man. Something embellished and polished but fun and interesting and insightful and true to the SPIRIT if not the word of the law. I'm a massive Turing fan. Sorry, I'm a mathematician and, from there, a computer scientist. I'm a programmer. I'm a computer theorist. I'm into coding theory and cryptography. I couldn't help but be anything else. I do not go to watch movies in cinema. I did for this. It was great fun, close enough and good enough that I can point out the problems (and that's half the fun of knowing the subject truly, like half the fun of knowing The Silmarillion and LOTR inside out is pointing out the bad bits of those movies) but my girlfriend can enjoy the movie as much as I do, and we can get some kids in the cinema going "Oh, cool, I never knew about this guy but this looks interesting".
You want factual representations? There is NO VENUE for them whatsoever. Even the "science" channels on TV are a load of tosh, the Royal Institution Christmas Lectures nothing more than QI without the questions, etc. YOU WILL NOT GET IT.
But you can enjoy a good movie that you can poke holes in as an expert in the subject, if you like.
You missed two of the greatest examples: "Braveheart", which managed to compress decades of actual history into mere weeks of story timeline, and "The Other Boleyn Girl" which I watched recently, and which concludes its credits with the boilerplate disclaimer along the lines of "This motion picture is a work of fiction and any resemblances to historical persons and events are purely coincidental". Amen to that.
Many years ago I asked Robin Gandy about Andrew Hodges' biography of Turing. He said it was very accurate. There are some more biographies of him now, but I've not read them.
The problem with mathematical notation is that it is usually designed by accountants, physicists, and, of course, mathematicians. None of which groups are noted for their prowess in graphic design or cognitive psychology. (Ok, so Robert Recorde's equals sign is an example of superb graphic design, but I agree with Turing on Leibniz's notation for calculus.)
> The problem with mathematical notation is that it is usually designed by accountants, physicists, and, of course, mathematicians.
That's not the problem, that's the strong point. Mathematical notation is created (rarely designed) to express some concept concisely and in a way that helps *the person creating it* think about the concept/use it correctly. Anyone doing non-trivial math develops new notations all the time. If lots of other people find a particular notation helpful, it sticks. It's natural selection following the development of the field.
The last thing we need is notation designed by cognitive psychologists who don't even understand the mathematical concepts. As for graphical design, notations are typically first created for writing by hand (scribbling, actually). The time for typographical refinement comes when (if) they do stick.
> Mathematical notation is created (rarely designed) to express some concept concisely and in a way that helps *the person creating it* think about the concept/use it correctly.
There's a nice bit in Beckmann's A History of Pi on this point, where he quotes a medieval text on geometry that was written without the benefit of that eponymous bit of notation, and instead uses the breezy quantitas, in quam cum multiplicetur diameter, proveniet circumferentia ("the quantity which, when the diameter is multiplied by it, yields the circumference"). Put that in your equation and smoke it.