It sounds like building an outback datacenter might be decent for developing techniques for building a lunar base — maximize use of local materials, minimize human presence, etc. Putting a data archive on the Moon has been mentioned as semi-plausible.
22 publicly visible posts • joined 30 Jan 2012
Datacenter architect creates bonkers designs to illustrate the craft, and quirks, of building bit barns
Dear chip designers, if you're struggling to get components made, try 28nm. Supply set to overtake demand
Re: Sort of inbetween
Even for a new design, a less dense process might not make financial sense. Microcontrollers often use older processes because pads and analogue circuits do not scale as well as logic or memory and because minimum cost rather than price-performance is a common design goal (high voltage support and on-chip support for persistent memory are also factors). Production volume also has an impact since newer processes have larger non-recurring expenses (e.g., more expensive mask sets).
Older process equipment can be fully depreciated, so selling capacity near incremental cost makes sense (though there is also pressure to upgrade a fab if all the new equipment fits the space, power, etc. requirements, which is not a given as new processes tend to add steps). However, if one gets 90% more chips per wafer on 20nm than on 28nm — very good design scaling — and gains some power and/or performance that adds value (or perhaps improves cost/power/performance by integration), the process one major node older will have to be a lot less expensive unless NRE is a major factor.
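The trade-off above can be sketched with some back-of-envelope arithmetic. All the numbers below are hypothetical illustrations (not real foundry pricing); the point is that NRE amortized over volume can outweigh a better chips-per-wafer figure:

```python
# Rough per-chip cost comparison between an older and a newer process.
# Wafer costs, mask/NRE costs, and yields here are made-up examples.

def per_chip_cost(wafer_cost, chips_per_wafer, nre, volume):
    """Amortize wafer cost and NRE (e.g., mask set) over total volume."""
    return wafer_cost / chips_per_wafer + nre / volume

volume = 1_000_000  # total production run

# 28nm: cheaper wafers and masks, fewer chips per wafer
cost_28 = per_chip_cost(wafer_cost=3000, chips_per_wafer=1000,
                        nre=2_000_000, volume=volume)

# 20nm: 90% more chips per wafer, but pricier wafers and masks
cost_20 = per_chip_cost(wafer_cost=6000, chips_per_wafer=1900,
                        nre=5_000_000, volume=volume)

print(f"28nm: ${cost_28:.2f}/chip, 20nm: ${cost_20:.2f}/chip")
```

With these (illustrative) figures the older node wins at this volume; at ten times the volume the NRE term shrinks and the comparison can flip.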
Stallman's final interview as FSF president: Last week we quizzed him over Microsoft visit. Now he quits top roles amid rape remarks outcry
A page can be broadcast to a somewhat large area via satellite. This wastes bandwidth (broadcast when there is only a single intended receiver) and provides some location information (not being a global broadcast as that would really waste bandwidth). However, I would not be surprised if Richard Stallman is just out of touch with the technology.
IBM, Intel tease 2020's specialist chips: Power9 'bandwidth beast' – and Spring Crest Nervana neural-net processor
Re: Is the world going to switch from Microsoft Windows...
Linux is not ISA-neutral. E.g., it effectively requires caches to support virtual address aliasing. It is also strongly oriented toward a ring-based permission system and linking translation and permission. The application environment for Linux is certainly not oriented toward use of capabilities (i.e., passing permissions in a fat pointer).
Hate your IT job? Sick of computers? Good news: An electronics-frying Sun superflare may hit 'in next 100 years'
Strong female characters
One does not even need lead characters to be female to have a significant presentation of strong female characters. I (and I suspect more than a few male nerdy science fiction fans) enjoyed the early, highly intelligent presentation of Romanadvoratrelundar (sadly the character later became "just a companion"). Having a peer (superior?) in intelligence albeit with less experience provided nice opportunities to show the Doctor's strengths as well as present a positive female role model. (What other kind of character would be able to say both as a jibe and respectfully that the Doctor wins by making mistakes?)
While Doctor Who is not exactly known for consistency, it would have been nice if the established expectation that regeneration does not change gender had been maintained (an expectation abandoned at least as early as "The Doctor's Wife"). It would not have been difficult to make up some technobabble to explain the exceptional case, but those managing the series chose to support an arbitrary view of gender (which is distinct from equality in worth).
(The series also seems to have diminished the superiority of the Doctor. Part of this is normal (bad) enemy/conflict inflation (saving worlds, galaxies, and even universes can become old hat), but I suspect part of this comes from trying to make the companions more significant. This could have been done in other ways than making human companions peers (or superiors), but it is easier to present quantitative value than qualitative value.)
I very much disagree. Government is a compensation mechanism for human injustice: "If men were angels, no government would be necessary." (James Madison). That people fall short of being perfectly just and perfectly compassionate might be called bad news (though it is hardly news).
Religion works against some of the ills of crass materialism (apathy, despair, the attitude that might makes right) by declaring that there is ultimate purpose and ultimate justice.
While there are advantages to a unification of the offices of prophet (proclaiming truth both as a comfort and a correction; the role of the press in modern democracies), priest (reconciling truth and relationships; religion in a broad sense plays this role), and king (enforcing justice; the role of government), their unification most often results in no one speaking uncomfortable truths to the king or inciteful truth to the people and no one reconciling the king and the people (making repentance ineffective and so unattractive) or the king and truth (making cover-ups seductive). The efficiency of unification degrades into a fragility of execution. (This is a common optimization issue, not limited to human organization.)
The unification of government and religion also embraces that "masculine" thinking which "has a little trouble recognizing as such some forms of power which are not synonymous with force" (Lois McMaster Bujold, Barrayar). Rejecting the power of personal and social change is very problematic.
Re: Not exactly
You do realize that the Opteron A1100 has significantly higher per-core performance than Avoton and that power use is super-linear in core performance?
One might argue that its higher per-thread performance is unnecessary for the workloads where it will be used, but that is not obvious.
(It might also be noted that the A1100 is using a Cortex-A57 primarily to more quickly enter the market. Since Cortex-A57 is a more generic high performance 64-bit ARM design, later designs by AMD may use custom cores more suited to the targeted workload.)
Price is a significant consideration. While higher revenue per mm-squared allows Intel to invest more in processor design and (especially) manufacturing technology, the performance and power-efficiency return-on-investment is sub-linear for processor design. This means that a company can be profitable with lower prices even with lower volume in an area with high fixed costs.
Alternative market targets can also provide a significant advantage. A one-size-fits-most design will be less optimal than a design more focused on a specific workload. If one is only seeking 10% of a somewhat large and diverse market (claims of ARM having 25% of the server market by 2019), there is usually opportunity for specialization. This may be particularly significant in components outside of the core.
Intel's volume advantage (which allows more aggressive binning and reduces the impact of fixed costs which are significant for processors) and process technology advantage make competition difficult. However, it is not a foregone conclusion that Intel x86 is unassailable in the server market.
That reminds me of a quote from H.C. Andersen's "The Old Oak Tree's Last Dream" (http://www.andersen.sdu.dk/vaerk/hersholt/TheOldOakTreesLastDream_e.html)--a mayfly replying to an oak tree:
"No, I don't understand you at all. You have thousands of my days to live, but I have thousands of moments in which to be happy and joyous. Will all the beauty of this world die when you die?"
"No," said the tree. "It will probably last longer, infinitely longer, than I am able to imagine."
"Well, then, we each have an equally long lifetime, only we figure differently."
Pricing is not so simple
Even books have a substantial amount of Non-Recurring Expenses--e.g., editing, typesetting, even to some extent marketing--associated with their production and sale. Publishers must also cover costs associated with unexpectedly unpopular products (much as pharmaceutical companies must pay for research that turns out to be unprofitable). If a flat price were charged, the sales volume would be lower, the mean price would be higher, and many would be unable to justify the expense of purchase. (In addition, at lower volume some efficiencies of production and distribution would be lost.)
I am not suggesting that publishers are efficient (i.e., that costs could not be lower) or willing to sacrifice profit for the common good (i.e., that prices could not be lower) but that pricing is not simple and selling just above incremental costs in some markets can be beneficial to all.
On an emotional level, artificial market segmentation seems wrong, but the economics are *not* simple.
See the Open Source Definition
"It's entirely possible to release open source software that directly forbids commercial use"
No. That would violate the Open Source Definition: http://opensource.org/osd#fields-of-endeavor
(Prohibiting selling the software would be a violation of the first point in the definition: http://opensource.org/osd#free-redistribution )
Re: I Think Wear Leveling Will Still be Needed
Since MLC flash must be erased before a write and erasure is slower and uses a larger block size, erasure is generally handled as a background process (so some fresh blocks are kept ready at all times). This means that some redistribution of writes will occur anyway. Whether this redistribution is sufficient for enterprise use with a 100M endurance is questionable, but the endurance behavior is not as simple as 100M writes to the same (virtual) location.
Since decent wear-leveling techniques are relatively mature, providing adequate wear-leveling should not require special algorithms when write endurance is relatively high. Such high endurance flash *might* reduce the benefit of clever flash management (or trade-off cost of the flash chips vs. cost of the flash controller).
Since write endurance is one of the factors limiting flash cell scaling, this technique *might* help flash scale down further than previously expected.
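The redistribution of writes mentioned above can be illustrated with a minimal dynamic wear-leveling sketch: each logical write is directed to the least-worn free physical block, and the old block is queued for (background) erasure. This is an illustration of the idea, not a real flash translation layer:

```python
# Minimal sketch of dynamic wear leveling. Hammering a single logical
# block still spreads erasures across all physical blocks.
import heapq

class WearLeveler:
    def __init__(self, num_blocks):
        self.mapping = {}                     # logical -> physical block
        self.erase_counts = [0] * num_blocks
        # free blocks kept in a heap ordered by erase count
        self.free = [(0, b) for b in range(num_blocks)]
        heapq.heapify(self.free)

    def write(self, logical_block):
        # retire the old physical block; erasure runs in the background
        # on real drives, so fresh blocks are kept ready at all times
        if logical_block in self.mapping:
            old = self.mapping[logical_block]
            self.erase_counts[old] += 1
            heapq.heappush(self.free, (self.erase_counts[old], old))
        # place the write on the least-worn free block
        _, new = heapq.heappop(self.free)
        self.mapping[logical_block] = new
        return new

wl = WearLeveler(4)
for _ in range(100):
    wl.write(0)  # repeatedly write the same logical block
print(wl.erase_counts)  # wear is spread nearly evenly
```

Even this naive scheme keeps per-block wear within one erase of even, which is why high-endurance cells mainly shift the cost trade-off rather than eliminate the need for a mapping layer.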
A better answer for interview question?
“Which one of the five principles of software engineering is the most important?”
A better answer might have been: The one that is least treated as important by the team.
Such an answer would have affirmed the idea that they are all important (while still providing a single answer--though a tie for "most" is possible) and established that the interviewee understands that complacency is a significant danger and that there are team responsibilities. Such an answer may even jar (and perhaps impress) the interviewer if he was complacently hoping for the canned answer of "all of them".
Of course, getting snotty because of a weak answer to something of a trick question does demonstrate poor interviewing skills. Outsmarting such an interviewer would probably also keep one from being inflicted with that interviewer's work environment; such a person is more likely to be annoyed than impressed by an answer that is better than the one being sought.
Re: How did they miss it?
My very slightly informed guess for why this extremely hot gas cloud was much more difficult to detect than the cold cosmic background radiation is that the black body radiation at extremely high temperatures is skewed toward the far blue end of the spectrum (hence the use of X-ray detection satellites). In addition to the difficulty of detecting X-rays (even using a satellite), the high energy per photon for X-rays reduces the number of photons emitted for a given energy. This cloud is also extremely diffuse, so the X-ray glow viewed from a telescope's relatively narrow cone would be relatively faint (I suspect).
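The photon-count point can be made concrete with Wien's law and the photon energy E = hc/λ. A rough stdlib-only calculation (constants hard-coded in SI units; the 10^6 K gas temperature is just an illustrative order of magnitude):

```python
# Back-of-envelope: peak wavelength and photons per joule for cold
# microwave background vs. hot X-ray-emitting gas.

h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light, m/s
b_wien = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength(T):
    """Wien's law: peak emission wavelength for temperature T (kelvin)."""
    return b_wien / T

def photons_per_joule(wavelength):
    """Photon count per joule at a given wavelength (1 / (hc/lambda))."""
    return wavelength / (h * c)

T_cmb = 2.7  # cosmic microwave background
T_gas = 1e6  # hot diffuse gas, illustrative

for T in (T_cmb, T_gas):
    lam = peak_wavelength(T)
    print(f"T={T:g} K: peak {lam:.3e} m, {photons_per_joule(lam):.3e} photons/J")
```

The hot gas peaks around a few nanometres (soft X-rays), and each joule of emission carries several orders of magnitude fewer photons than the same energy at microwave wavelengths--fewer photons to catch, on top of the cloud being diffuse.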
GPL is a license
Relatively little free software is in the public domain. Also the FSF has lawyers (ordinary businesses are unlikely to hear from them, of course) and RHEL (which includes support) is sold under a contract.
While I agree that FLOSS can be easier to manage, the commercial Linux distributions do have some compliance issues.
Not according to the OSD (I think)
See point 6 ("No Discrimination Against Fields of Endeavor")
While the main point of this is to expand the community by allowing commercial use, the general application means that one cannot discriminate against military use much less some difficult to evaluate concept of just war (i.e., what actions are sufficiently unjust). The military certainly does not have a monopoly on injustice--no human being acts perfectly justly.
One also has the problem (even aside from any inability to enforce compliance) that restrictions tend to be insufficiently flexible (e.g., the classic no South African use restriction) or excessively flexible ("do not harm").
Using an auxiliary salt
Dom 3: "That's why you need a second site-wide salt that is not stored in the database. Database compromises seem to happen all too frequently, but source code doesn't seem to leak so often."
While this would help, for sites like LinkedIn which are very open about setting up accounts the attacker could have already set up an account and so be able to derive the site-wide auxiliary salt. Basing the auxiliary salt on a hash of the user name and password (and a constant) might help a little.
A higher cost mechanism could use checks from multiple sites to authenticate (presumably with tolerance if one site is determined to be down at the moment or perhaps a secondary, more secure but more cumbersome authentication method).
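The per-user salt plus site-wide secret scheme under discussion can be sketched with the standard library's PBKDF2. The pepper constant and function names here are hypothetical; a real deployment would load the pepper from configuration outside the database:

```python
# Sketch: per-user random salt (stored in the DB) combined with a
# site-wide "pepper" kept out of the database. Illustrative only.
import hashlib, hmac, os

PEPPER = b"site-wide-secret-not-in-database"  # hypothetical value

def hash_password(password, salt=None):
    if salt is None:
        salt = os.urandom(16)  # per-user random salt, stored alongside hash
    digest = hashlib.pbkdf2_hmac(
        "sha256",
        password.encode() + PEPPER,  # mix the pepper into the input
        salt,
        iterations=100_000,
    )
    return salt, digest

def verify_password(password, salt, stored_digest):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

As noted above, a static pepper only helps so much against an attacker who can register accounts and probe the scheme, which is why deriving the extra input from per-account data (or keeping it in an HSM) is sometimes suggested.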
If a high-profile site cannot be bothered with using salts, there is not much hope for widespread use of more sophisticated authentication.