Re: walked to the lab to disconnect the switch?
I think "most likely to respond" should be read as "responded more quickly than the probably slowish uni servers".
Postscript: this comment was based on the article here and what the linked South China Morning Post article writes. I cannot read Chinese, but from the English abstract, the cited references at the end, and the abbreviations and formulas in the text of the linked paper, it seems to be about the application of global minimization algorithms to searching for prime factors. Not sure how the SCMP article and the paper are related.
Not an expert, but this is interesting in that so far the quantum threat to cryptography seemed largely confined to asymmetric systems. Now this is explicitly about substitution boxes, as used e.g. in AES, *but* it is explicitly about the GIFT-64/128, RECTANGLE, and PRESENT algorithms. All three are lightweight algorithms, which seems to mean: how much security can we get out of as few joules of energy and as little hardware complexity as possible.
I'm not saying it has no relevance to AES at all, but no results for AES are mentioned in the article, and I have to assume the researchers probably do not yet know what it could mean either. Maybe it shaves a bit or two off at some point, but if they had found a direct way to shave a large number of bits off AES, or even actually reduce the problem complexity, either this would have landed with very explicit claims towards AES and a huge bang, or they would have been stuck into a research cave in a military facility before the paper even concluded its review phase.
I'm partial to Rust. I think borrowing is a good addition to the memory model, and stepping back from class hierarchies (inheritance) is a very nice aspect; I would like to see both in whatever general-purpose high-performance language I'd like to use long term. But the two issues I have with Rust today are
a) call it a personal problem: *if* a language is so obviously inspired by C/C++ syntax, why deliberately introduce constructs that seem just alien to that style? (If b is solved I guess I can get used to it over time, but reading the docs today for a prospective language to get used to, there seemed to be far more "WTH were you thinking" aspects than e.g. Java and C# introduced over a long time.)
b) (actually somewhat on topic) no way I'm voluntarily investing in such a moving target as Rust appears to be today. The projects I care about are measured in decades. At work I maintain pre-C99 code, C++ that kinda compiles on semi-modern Visual Studio but was written for VS2003, and a project written from the ground up in std C++14 as the cutting edge. Some used libraries were ported from the 1970s or maybe even earlier. Now that's the job, but my hobby project is also largely in that dimension: it's mostly C++ today and I rewrite some algos from 10 years ago, but introducing a new language to take over even parts of what modern C++ can do well today requires some confidence in there being real benefits (the private project is RAII, make_shared and stack variables; you could say HPC, afaict no memory violations in lots of allocations - no idea about numbers, running even a desktop Ryzen at full tilt today rips through terabytes of memory in no time) and in language stability. In some parts I feel C++ is dragging, in other parts I also think the standard is moving rather fast. Now for Rust: once there is a proper versioned standard (written down as a standard, preferably ISO, but IETF, IEEE, ECMA etc. ok fine) where you can expect in 2035 to still be able to compile a large code base written in standard Rust version 2024, yes, let's go. Absolutely.
But while people are committing to a language that in a way still tastes like some ephemeral, move-fast-break-things-before-the-first-coffee-in-the-morning GitHub project... well. Not surprised.
Upvoted, even if in the industry I'm in it is basically irrelevant whether it takes you one minute or half a day to add a library. What matters are the hours and days every additional library may generate in clearing it for use, tracking safety issues, and configuration management, over the decade(s) the application is likely going to exist.
Of course I cannot know exactly what rich2 meant, but I upvoted the post because for me RAII in C++ provides a very clean way to initialize and deinitialize "hardware". And that's not only file descriptors or the like, but communicating with a device that hangs off a specific serial port, network address etc. Initializing is done in the constructor, and it can allocate memory (well, I don't new/malloc, but use automatic variables or make_shared), open sockets etc., and fail any way it likes; throwing from a ctor is basically the RAII way to signal errors. The important thing is to avoid anything that could throw or allocate additional memory in destructors, and then only catch by const reference (else the exception gets copied, which is a memory allocation). See the sketch after this post.
Sticking to this strategy, afaict you can safely feed the dogs, park all motors and turn off the big freaking laser (basically, keep hardware safe) before the process* exits in any situation short of the operating system forcefully killing the process, or dying on its own.
*I'm aware of threads; newer C++ has appropriate tools to keep that topic irrelevant here. The process is what matters in this context, because that is what the operating system handles.
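To make that concrete, here is a minimal sketch of the pattern: acquire in the constructor (and throw on failure), release in a non-throwing destructor, catch by const reference. SerialDevice, open_port and close_port are made-up names standing in for whatever real I/O the hardware needs.

```cpp
// Minimal RAII sketch: constructor acquires (or throws), destructor releases
// without throwing or allocating. The device and port helpers are hypothetical.
#include <stdexcept>
#include <string>

class SerialDevice {
public:
    explicit SerialDevice(const std::string& port) {
        handle_ = open_port(port);                 // hypothetical helper
        if (handle_ < 0)
            throw std::runtime_error("cannot open " + port);  // ctor throw = RAII error signal
    }
    ~SerialDevice() noexcept {
        if (handle_ >= 0)
            close_port(handle_);                   // no throwing, no allocation here
    }
    SerialDevice(const SerialDevice&) = delete;
    SerialDevice& operator=(const SerialDevice&) = delete;
private:
    int handle_ = -1;
    static int  open_port(const std::string&) { return 3; }   // stand-ins for real I/O
    static void close_port(int) {}
};

int main() {
    try {
        SerialDevice laser("/dev/ttyUSB0");        // initialized here, or the ctor throws
        // ... use the device ...
    } catch (const std::exception& e) {            // catch by const reference, no copy
        // report; if construction had succeeded and something later threw,
        // ~SerialDevice already ran and put the hardware back in a safe state
    }
}
```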
DH allows two peers to construct a shared key (used to encrypt at the sender and decrypt at the recipient) while talking entirely in public.
The recipient still needs to store the decryption key if she opts to not decrypt right now but only later.
For a scenario where a person wants to decrypt their own data later on, the procedure is pointless.
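For illustration only, a toy sketch of the DH exchange with tiny, insecure numbers: only p, g and the two public values ever travel in the clear, yet both sides end up with the same key. Real implementations use big-number libraries and vetted parameters, not uint64_t.

```cpp
// Toy Diffie-Hellman sketch with tiny, insecure numbers, just to show that
// both peers derive the same key while only p, g, A, B are public.
#include <cstdint>
#include <iostream>

// modular exponentiation: (base^exp) mod m
static uint64_t powmod(uint64_t base, uint64_t exp, uint64_t m) {
    uint64_t result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1) result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}

int main() {
    const uint64_t p = 23, g = 5;     // public parameters (toy values)
    const uint64_t a = 6, b = 15;     // private values, never transmitted

    const uint64_t A = powmod(g, a, p);   // sender publishes A
    const uint64_t B = powmod(g, b, p);   // recipient publishes B

    const uint64_t key_sender    = powmod(B, a, p);   // sender's shared key
    const uint64_t key_recipient = powmod(A, b, p);   // recipient's shared key

    std::cout << key_sender << " == " << key_recipient << "\n";  // same value
}
```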
Now, if the ecosystem companies set up an industry association to promote the paid support or customized commercial variants and the engine batch et al. mentioned in those slides, it would make sense. That, and TDF coordinating as needed. But for this to happen under the TDF umbrella seems really poorly thought out.
I don't know Fortnite, but it doesn't appear to be a Q3A-class game, i.e. no need for sub-degree rotation accuracy within milliseconds, even for beginners. As Wikipedia lists Switch, PS4, iOS and Android as platforms, why not just use a smartphone or get a console...
/ex-PC, now longtime console, but the change was made along with a shift in the kinds of games I play (no FPS any longer)
Now you made me curious. What progress.
Hexchat with 13 channels across 3 networks eats less than 0.01% CPU (left idle; up to 6% when busily tabbing between channels) and 14 MB of RAM, and that's on Windows. Something like BitchX in its native habitat probably eats not even a fraction of that.
FB could only use the uploaded nude as an authorization token (face-checked against existing tagged photos or some such) to block all nudes tagged with your profile name on other profiles. After the outcry, version 2 would then give you the opportunity to allow select profiles to post nudes of you again.
Not sure where all the downvotes derive from. At least in my industry, static linking as much as possible (i.e. everything but libc + pthread/msvc*) is the sensible thing to do. But that's a highly regulated industry where, post final build, a new version can take weeks to months and I guess six- or seven-figure money until it can be shipped in any market. After that it should be frozen, but with the bad bad internet you still have to accept that turning off OS security updates is probably not the best option, and hope for the relevant authorities to accept that the OS somewhat changes underneath (bar APIs, I hope).
So, upvoted
Your post holds for about every single patch, but it can't be stressed enough.
Theoretical patch/updateability may actually be counter-productive where it really matters, because it may be an incentive for carelessness or cost cutting at release, while critical systems often cannot "just update" without considerable cost and interrupted service. IANAL, but something might need to be recertified, years and millions spent.
I have only worked for a year with an Oracle DB and generally am not a DB guy (more image processing, physics simulation, hardware control etc.), but what I gathered back then was that Oracle can be a beast in DB performance. Features... errr, dunno, but in my eyes they are all about being fast at relational queries.
A) how do you brag about that, other than by comparing to how others do in that very same field?
B) I can see them actually doing extremely well in /that/ field also in the cloud, but hypey things like big data seem to be about unstructured data.
Dunno why; the things I would guess are the most valuable for business intelligence, where probably the money lies, /should/ be structured to have value. IOW, if Oracle manages to offer relevant help at BI also with "AI" (though I still shudder at the thought of "pretrained" AI models for use in BI), they should be set up well to help others make money (in contrast to sinking money into hype).
Disclaimer: of course I'm no expert on any of this, just seeing +/- from the sidelines what some few companies in one very specific field (medtech) are interested in/struggle with, apart from the, inherent to that field, "how do we remain or even get compliant with the gazillion and ever-changing regulations".
I'm not going to vote either way, but I think the "synchronization" list is filled with functions that operate within the same process. As such they'd better not actually switch to kernel mode to do their magic, and thus can be implemented by anyone else without additional Win API support. I prefer to use the C++ language features (where available, and newer versions really add a lot of useful stuff in that regard) rather than bolting myself very firmly onto the Win API (sketch below). Of course MS's stdlib may or may not use Win API calls under the hood. Or the Win API functions use their stdlib under the hood, who knows.
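As a sketch of what I mean by staying with the language instead of the Win API, assuming C++11 or later: the standard mutex/condition_variable pair covers the usual in-process signalling without calling EnterCriticalSection and friends directly.

```cpp
// Minimal in-process signalling with standard C++ synchronization primitives,
// portable to any platform with a conforming C++11 (or later) library.
#include <condition_variable>
#include <mutex>
#include <thread>

std::mutex m;
std::condition_variable cv;
bool data_ready = false;

void producer() {
    {
        std::lock_guard<std::mutex> lock(m);  // scoped lock, RAII again
        data_ready = true;
    }
    cv.notify_one();                          // wake the waiting thread
}

void consumer() {
    std::unique_lock<std::mutex> lock(m);
    cv.wait(lock, [] { return data_ready; }); // blocks until the flag is set
}

int main() {
    std::thread c(consumer), p(producer);
    p.join();
    c.join();
}
```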
..or 4 versions of Visual Studio, each complete with half a dozen links named the same for the most part. Well, except if you started with "v", you lost, because although the folder and all other links might start with "Visual Studio", the main IDE itself is named "Microsoft Visual Studio XXXX".
"primarily intended for use in phishing attacks rather than giving access to full systems"
OK, maybe I should wait until details are out, but "phishing" sounds like a brain attack, not a system attack.
A browser can't really defend against that *unless* it phones home all the time in order to block what its home base considers insecure.
Not exactly what i want my browser to do.
(I use various versions of FF at home, as far back as 3.0.x {ofc always with NS and AB}, use IE at work, but Chrome? uuuuhhh)
Yeah, it's funny how some "tech" people think computing is reinvented every two years. Funny again how most of the most fundamental algorithms and protocols I actually make use of were developed mostly somewhere around 20 years before I was even born.
Upvoted, not because I'm an "open source" advocate, but because buying binary can lead to some unfortunate circumstances where you are forced to a) hack around failings in the manufacturer's firmware and b) are either forced to threaten to kill them if they don't give out source/permission to hack, or b2) buy the company as a whole.
All because someone thought it was OK to buy in custom drivers without insisting on the source code, and the manufacturer is incapable of fixing the problems by themselves (i.e. driving multiple cards in the same device; not related to parallel ports but close).
IrfanView: more of a viewer with limited editing
(fast start, can set minimal interface, walk folders with space/backspace, crop, rotate, adjust curves/gamma/brightness, *no* drawing (lines etc), supports something like 9000+ formats e.g. also custom specified raw interpretation)
Paint.NET: more of a lightweight editor ("mini-PS/PSP")
you know what it is
I use
-Irfan for viewing, batch conversion, simple stuff like rotate and resize photos
-PDN for labeling stuff on screenshots, collages (at work)
OTOH, often-used antialiasing does render at higher-than-screen-res, then scales down (interpolates) to make the jaggies less visible.
If your screen has twice the pixels/mm you can probably dial down the AA by a factor of 2.
So you render at the same resolution as before and have a slightly crisper image, at the risk that people might be able to make out the jaggies a little better (or not, because the single pixel is too damn small and looks exactly the same as the larger interpolated-value pixel before).
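If it helps, here is a minimal sketch of the 2x supersampling step in isolation, assuming a plain grayscale framebuffer: render at twice the target resolution, then average each 2x2 block down to one output pixel. The buffer layout and function name are made up for illustration.

```cpp
// Box-filter downscale of a 2x supersampled grayscale image:
// the source is (2*w) x (2*h), the result is w x h.
#include <cstdint>
#include <vector>

std::vector<uint8_t> downsample2x(const std::vector<uint8_t>& src, int w, int h) {
    std::vector<uint8_t> dst(static_cast<size_t>(w) * h);
    const int srcW = 2 * w;
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            // average the 2x2 block of supersampled pixels covering this output pixel
            const int sum = src[(2 * y)     * srcW + 2 * x]
                          + src[(2 * y)     * srcW + 2 * x + 1]
                          + src[(2 * y + 1) * srcW + 2 * x]
                          + src[(2 * y + 1) * srcW + 2 * x + 1];
            dst[static_cast<size_t>(y) * w + x] = static_cast<uint8_t>(sum / 4);
        }
    }
    return dst;
}
```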