Humans don't have to worry about leap seconds/minutes. The issue is that computers have to worry about them, and it's much easier if they only have to worry once a century instead of every year.
RE: Dislikes new FF UI
Interpreted languages always include their compiler because they are compiled at runtime. That's pretty much the definition of an interpreted language.
I tried the experiment you mentioned in FF. That's stupid. I don't know who thought that up, but it's moronic. They might have thought it wasn't necessary since now you can just right-click on any tab and "Bookmark All Tabs", but having it in the bookmarks menu only under some circumstances is ridiculous.
Though TBH I'm a little surprised that anyone even uses that feature now that we have tab groups.
Pretty much all interpreted languages have some equivalent of eval, and there are legitimate uses for it, but way too many devs use it as an excuse to be lazy.
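To illustrate (this snippet is mine, not from any particular codebase): a classic lazy use of eval in JS is parsing JSON text with it, when a purpose-built parser that never executes code does the same job safely.

```javascript
var text = '{"a": 1, "b": 2}';

// Lazy: eval will happily run ANY code embedded in the string
var lazy = eval('(' + text + ')');

// Better: JSON.parse only parses data, it never executes code
var safe = JSON.parse(text);

console.log(lazy.a, safe.b); // 1 2
```

Same result on well-formed input, but only one of them turns a malicious string into arbitrary code execution.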
I'm not sure I get what you mean by "capabilities" though. Are you saying there should be more APIs? Or that you like the native data manipulation allowed by Types like Queue and StringBuffer?
Also, what's wrong with FF's UI? They're getting rid of all the crap in the chrome so we can actually see webpages. What do you think that browser users really need, design-wise, and how does it differ from the minimalist approach that's come into vogue recently?
Note: If you don't like it, it's pretty easy to switch the UI back to the old one. Just right-click the upper UI, check "Menu Bar" and uncheck "Tabs on Top".
I don't get it
It almost looks like they're trying to forgo JS's event-driven nature to build more linear programs. Just what the web needs, more UI code that looks like it came out of Visual Studio.
Not to mention their choice to go back to class-based inheritance, which I can only assume was a move born out of programmer familiarity rather than functionality. There is nothing inherently wrong with prototypal inheritance such that it needs to be replaced with a class-based solution. Sure, it takes a little getting used to, but it's perfectly usable (and quite powerful) once you get the hang of it.
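For anyone who hasn't played with it, here's a tiny sketch of what I mean (names are made up): prototypal inheritance is just objects delegating to other objects, no class declarations required.

```javascript
// A plain object acting as the "parent"
var animal = {
  describe: function () {
    return this.name + ' says ' + this.sound;
  }
};

// Object.create makes a new object whose prototype is `animal`,
// so `describe` is found via the prototype chain
var dog = Object.create(animal);
dog.name = 'Rex';
dog.sound = 'woof';

console.log(dog.describe()); // "Rex says woof"
```

No constructors, no class hierarchy; you can reshape the chain at runtime, which is where the power comes from.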
The only explanation I can think of for this is that it was written by the same sort of people who came up with GWT. Programmers who really want to be web developers but refuse to let go of Java.
PS: If anyone really wants a good way to make development of web apps easier, try Backbone. I just found out about it the other day and have been geekgasming over it ever since.
What's the big deal?
I fail to see how this is a big deal. How do someone's personal actions have any bearing upon their ability to do their job (as long as they don't drink heavily the night before they need to make important decisions)? It's stupid that he lied about it, but I can see why he did given the furore that always crops up around this sort of crap.
They did fix it
I looked through the Dropship source code and did a few tests.
Normally, when the client is going to upload a file, it first hashes it and then sends the hashes to the server. If the hashes already exist, the server makes a new pointer to the file data in that person's account and tells the client not to bother uploading the file. If the hashes don't exist, then the client must upload the file.
From what I can tell, they have just changed the server code so that it always responds with a "Hashes not found" message.
This looks like a quick fix, as it's probably going to result in increased bandwidth expenditure for them until they come up with a better solution.
One solution might be to only allow hashing on a per-user basis, and still require uploads even if the file already exists for another user. Once the file is stored they can safely use pointers as long as they don't implement it such that the timing of availability (or similar) reveals whether the file already existed in the datastore. This wouldn't fully solve the bandwidth problem, but it would make the system more secure.
First, you said that they wanted to have more than 6 users able to collaborate on a document at once. That's not the problem. The problem is having more than 6 documents open at once.
Second, to Ammaross Danan.
SPDY isn't used for everything. It's just being used experimentally right now. I did a packet capture to double check, and normal websites are all still loaded with vanilla HTTP.