Those things happen all the time
Essentially many important projects are "maintained" by a single person. The IANA used to be a single person.
"You are erroneously assuming, as a programmer/techie, that every business house had the ability in terms of technical wherewithal to just open up the source code and make it their own."
Well if your business depends on software you cannot maintain yourself, maybe you should not be doing that business. It's like running a restaurant without having staff that can cook. It's like running a factory without a mechanic on hand.
And that's why the Unix philosophy works so well with Free Software. If your program is small and simple enough that someone can just take the manual and re-implement it from scratch, the software is truly free. If your software package is huge and it takes dozens of people just to maintain it, it cannot be free.
BTW this has nothing to do with funding. Mozilla, for example, is a hugely overfunded company which could hire hundreds of programmers for decades on a single year of income. Yet they let their main "product" fall into disrepair.
I mean IPv6 is not inherently more complex than IPv4, in fact it's much easier in many regards (like stateless auto configuration for networks without DHCP).
My guess is that it's because of the "hype" people who crammed more and more "experimental and optional" (read: unused) features into it, like "IP Mobility" or "NAT64" or "NAT46". However, nobody really uses those. In reality IPv6 is not much different from IPv4. It's a separate network sharing some infrastructure, and it codifies some nifty ideas you have in IPv4 in a cleaner way (e.g. your local nameserver should always listen on a fixed local anycast address so you don't need to configure it). Nobody uses those advanced features except for experiments.
1. Mozilla doesn't care about its users. They fail to understand what they exist for and instead work against the user.
2. Web standards are so utterly complex that it's impossible to write a truly free browser. The code of browsers is too complex for a single person or small group to make meaningful changes. The code is, in a way, unfree, not because of its license, but because of its complexity.
Servers are not the problem here. Servers can be secured physically and they typically only run "trusted" code. (=code that you deliberately installed)
The main issue here is with browsers. Browsers continue to have a misfeature that allows people to send code along with their documents. The fig leaf is that "sandboxes" will prevent that from getting dangerous. Ignoring for a moment that the mere act of computation on a client can be an attack, this is yet another example of sandboxes failing in more or less unexpected ways.
We must stop using sandboxes as an excuse to do highly dangerous things. A sandbox can be an additional barrier against exploitation, however it is not a cure-all that allows you to execute random malware.
Most of the JS from other domains is malware by now. Usually it's code that manages ad providers to do things like holding an auction to determine what ad will be displayed to you. I can accept advertisements, but I do not accept such behaviour.
"Native code means porting if you have more than one target, which in itself can be imperfect and can result in bugs."
Yes, but if you choose one of the sensible ways to do this, porting is easy and reliable. I maintained a large-ish software package and there were something like 3 lines of code with ifdefs around them to handle differences in platforms. The platforms were Linux, Windows and MacOSX.
"I fail to see how this argument is any different from JS. That can suck your battery too."
It is in no way different than JS, but that's my point. WebAssembly is essentially like JS, but you don't even get the (potentially obfuscated) source code.
If we want to have server-dependent "Apps", we should perhaps ditch any kind of code executing locally and instead define a sort of "terminal". This doesn't need to be based on character terminals, but could instead be a DOM-tree controlled via Web sockets.
... that it's a gigantic security nightmare. Even if your sandbox is somehow "secure", it can still be used to suck your battery empty or mine $cryptocurrency without your consent.
Then again, the need for a platform-independent "bytecode" for programs might have existed in the 1990s, but today we have moved on to distributing software in source form. Why turn back the clock to when software was distributed in opaque binary files you had to disassemble in order to adapt to your needs?
It's not just DWDM on a chip, it's OFDM on a chip. The carriers can be much closer together relative to their bandwidth than on DWDM as they are all coherent and have a well defined distance. Essentially the neighbouring frequencies will interfere with the middle one, but those interference patterns will cancel themselves out over a symbol period.
Honestly I do not think this will require a lot of rack space. After all this is "just" 422 times as much as the already established 100 Gbit/s, and it is apparently compatible with CMOS, so you could even place your routing logic on the same piece of silicon.
that there may be countries somewhere where your ISP is less trustworthy than Cloudflare. Of course this doesn't apply to Europe, where your ISP could easily get shut down if they were caught exploiting your DNS traffic, whereas Cloudflare only makes a non-enforceable "promise" that they won't mess with your queries.
After all, that is just a sign that can display text. Serial lines are more than adequate to get the data there. This probably uses something like RS-485, which works much better over long lines but is otherwise nearly identical to "normal" serial lines. It also works over normal twisted pair and doesn't need the higher-grade cable you would use for Ethernet. BTW you can run such a system in unidirectional mode so that an attacker on the bus can eavesdrop on and modify the data, but not access the master.
I mean this makes a lot more sense than what a German railway company did, using Windows PCs to directly drive their station signage... with the obvious result that eventually they were hit by ransomware.
For example on average there are only around 1300 credit card transactions per second in the US. While this may sound like a lot, it's probably less computation than playing an MP3 file takes.
Of course there is _way_ more database activity, but we live in an age where storing your database in RAM or on fast flash memory is feasible.
To put this into context, every fixed-line call in Germany has to go through a complete lookup of the portability database. That's a database listing every number that has ever been ported. That's millions of records. The lookup works with a simple, barely optimized program which rarely takes more than a millisecond to look up a record, even on a very modest computer.
"I wouldn't have thought there would be much reason not to do something like that as a web app, that way anything with a recent browser from a single board computer (such as a Pi) to a full on workstation costing thousands could be powering the screen."
Yes, but web standards change very quickly, and web developers always want to have the newest technology to fail in. Also web browsers are hugely complex systems (more complex than operating system kernels) which are therefore likely to fail in unpredictable ways.
I think the problem is that we do not have proper "graphical multimedia terminal" standards. Sure, we have VT100, to which we have added truecolour and mouse support, but if you want to display a photograph or play a sound, your choices are severely limited.
First of all, it's extremely easy to do something and pin the blame on someone else. Want the Russians to be the culprit? Buy a Russian PC to develop your code on and leave Russian-language clues. Attribution is basically impossible, unless you are dealing with stupid people.
Then there's the whole area of side effects of doing this. If you want to make attribution easier you have to make sure that things like anonymous communication disappear. This endangers large groups of the population, from whistleblowers to homosexuals. Probably even people like security analysts.
Third, it doesn't fix anything. The security holes are still there. If they are not used by criminals, they are probably used by "lawful" organisations.
In short it's an insane idea, not well thought out and based on assumptions which have been proven wrong many times.
Since the planning of Galileo began, both Russia and China have created their own satellite-based navigation systems. Most mobile phones today support all 3 fully operational systems, and they are all operated by different entities, meaning that even if one decides that Europe is evil, there are still 2 other systems.
There actually was a talk about this problem at the 36c3. The proposed solution was to mark your library, "Geek-Code"-style, to indicate whether you consider it fit for use in security-critical things.
A typical example would be a crypto library someone started because they wanted to experiment with it. Of course one could use it for serious things, but since it wasn't meant for that, there could be serious issues. Nevertheless, releasing such code may still be beneficial as a demonstration.
"Are you suggesting using CSVs because they are "standard"?"
No of course not, I'm suggesting that because in 99% of the cases it can be done in a very simple way. Often you don't need the ability to have the delimiter character in your data fields, you can simply replace it with another character or reject that input as invalid.
For example if you just have numerical values, scanf can easily read them for you. With slightly more effort it can also read space-delimited columns of strings.
Even if you need arbitrary data, there are way simpler ways than the "Windows CSV". Just use no quoting and add an escape character. That way your parser only needs to read the input character by character and only has 2 modes. The first is the normal mode, the second is the "after escape character" mode.
One of the worst examples of how you can mess up a simple format is probably the "Windows CSV", which adds things like quoting that make parsing very hard.
XML and JSON may have their advantages for complex and dynamic data structures. However, one rarely needs that. Relying on standards is not always a good idea, particularly when you need more code to use a separate library than an implementation of your own parser would need.
A good summary of the state of the art is here:
Although you can never be 100% safe, you can always lower your risk by lowering your dependencies.
For example, if you have a simple list, using XML or JSON adds complexity without providing value. If you use simple delimiter separated files you can often use standard library features to parse such a list.
Beware of environments where adding a new dependency is simple. Adding a dependency is a potentially dangerous thing to do, think before you do it, think before pulling in code that adds new dependencies.
> Which raises the question of what actually is the native look and feel of Windows these days?
Well actually that still is the same as in the Windows 9x era. You can see that when all of the modern GUI extensions crash. I think it even reverts to the "System" bitmap font.
Of course if you don't like the look of the GUI elements you can use the OwnerDraw event and draw them yourself.
Take a look at Lazarus, it's a Free (as in speech) alternative to Delphi with all the nasty bits taken out. Software natively compiles on at least Windows, Linux and MacOSX, and since it uses the native GUI toolkits it'll always look and feel native. For all of those platforms you get a fairly large (10 Megabytes) static binary you can just drop onto the system and run.
For example the search function in Outlook can only search for whole words. So if you have a compound noun (as is common in German), "Ticket" won't match "Carrierticket". Of course in the age of multi-gigabyte RAM in PCs, doing a full-text search of the subject still seems to be too hard for Microsoft.
Sonos always was more of a lifestyle product aimed at people with more money than brains. I mean it was always obvious that those things were bound to happen as everything relied on proprietary and closed standards.
Normal audio equipment, on the other hand, is designed to rely on open and simple standards. The analogue line-in that virtually every device has will still work in 50 years, just like it did 50 years ago. Bluetooth and HDMI, while probably not around in 50 years, are widely supported by many different manufacturers.
and that's to boost sales, as shortages mean that it will become harder to source something. Considering this, it also makes sense to lower the prices, this also boosts sales.
To me it looks as if Intel wants to reduce its stockpiles of older processors.
It's just that more and more little toy projects get shared on GitHub. There the idea is that someone wrote some code that isn't worth thinking about copyright for, so they simply slap on some BSD license as they don't care what is done with that code.
It's more a sign of the rise of casual code sharing on GitHub than a fight against copyleft.
After all the Trusted Computing Initiative was not about protecting user data but about protecting business models. If it was about making computers safer they would lobby for the elimination of scripting languages in browsers, and the elimination of "Service Mode" features and "security enclaves".
"Well apparently they're willing to trade sales figures for style so I guess we'll see how that pans out.
No great surprise if they manage to sell scant few and end up scrapping the line within 18 months."
Nah, that's not how sales works. If they sell some, they'll proclaim it was an ingenious idea, but that since the market is shrinking it couldn't sell as well as previous models. If they sell very few, they will also blame it on the market.
It's not there to store data or numbers or anything like that, that should be obvious to anyone using it.
It would be nice if Microsoft would release any indication on what Excel is supposed to be. If I was allowed to put on my tinfoil hat I'd say they won't, because that would mean they would have to actually respond to bug reports.
Cisco has had abysmally bad security for decades now... on different product lines... with completely new implementations. I mean just by chance they should have found some competent programmers when starting a new project.
I mean even Microsoft managed to clean up their mess for a while after XP.
The de-allocation method wouldn't be a problem as such... if C++ didn't have weird stuff like implicit object copies which can, if you aren't super careful, cause 2 "copies" of an object to have their members point to the same memory locations, paving the way to use-after-free and double-free.
Pascal, for example, doesn't have that: the destructive assignment operator := does not copy objects; at best it copies references to objects.
... that people who use it think it's a high-level language when it's in fact more like several layers of complexity above plain C. Since C++ is so insanely complex, members of a team will spend most of their time learning new features. The result is typical "beginner" code where someone tried to use a feature for the first time without fully understanding it. This distracts people from writing good code, as the amount of "thinking" they can spend on a given amount of code is constant.
C has the same intrinsic problems as C++, but it's much simpler and much closer to Assembler. So if you know some Assembler you can spend more of your "thinking" on making sure your code is correct.
Of course today what we need is a language with about the complexity of C, but with type checking and without the problems of making it too easy to include libraries.
Slow or absent changes mean that your code will work for a long time. Any change in the language can mean that your code breaks, resulting in more of a motive to replace it. That's why the slow development of COBOL made it indispensable for many banks. If Java were changing every couple of years, no bank would seriously consider it.
I mean Java didn't even include the really sensible feature ideas of J2K yet.
Biting the hand that feeds IT © 1998–2021