Yep. It is 2022-01-04 everywhere in the world EXCEPT the US, which has to stubbornly be "different" with the likes of Imperial measurements.
YYYY-MM-DD only makes sense - it sorts properly in string form. :)
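The sorting point takes only a couple of lines to demonstrate: because ISO-style dates run from most to least significant field and are zero-padded to fixed width, a plain byte-wise string comparison doubles as a chronological one. A minimal sketch (the function name is just illustrative):

```c
#include <string.h>

/* Lexicographic comparison of ISO 8601 dates is also a chronological
 * comparison, because the fields run year -> month -> day and are
 * zero-padded to fixed width. */
int iso_date_cmp(const char *a, const char *b)
{
    return strcmp(a, b);  /* <0: a earlier, 0: same day, >0: a later */
}
```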
Microsoft has ushered in 2022 with an amusing (and now deleted) tweet from its Windows Developer account that answers oh so many questions about the quality of code emitted from Redmond nowadays. As is so often the case, the code (which looked like it aimed to greet 2022 with a perky "Happy New Year") doesn't appear to have …
This post has been deleted by its author
Some of us have always written it as YYYY.MM.DD, even when threatened by our school teachers with detention for "doing it wrong".
I'd simply point to my military background, highlight the fact that all the date+time stamps on every bit of paperwork that came from there were in said format, & ask them to take it up with the government.
It's fun to force a teacher to eat their words every now & then. =-)p
*Hands you a plate of homemade cookies still hot from the oven*
They're your favourite flavour thanks to the magic of my insanity! =-)p
Yeah, and don't get me started on time zones. I work on a project where some brilliant person (before my time) decided to store date and time in local time. That didn't matter much when most of our data came from business hours (not the case for oh such a long time now), and then there's daylight saving on top. What. A. Mess.
This is how the NHS tend to do it and it drives me up the wall. yyyy-mm-dd and dd/mm/yyyy are our official date formats. Nobody uses YDM anywhere, dd/mm/yyyy should be retired, and yyyy-mm-dd is ISO-standard (even though ISO are nobs and 8601's messy, it's the closest thing we have to an international standard). No need for Anglocentric nonsense, especially when it's presented as a "solution" to an i18n problem that was solved decades ago.
In this day and age, I would regard the use of any of the Etc/ timezones in a system as a defect that requires fixing.
However, at least you have the consolation that someone at least thought enough about things to use the IANA zone-names, rather than just specifying the zone-offset numerically — a “great idea” that condemns consumer code to an eternity of calculating Daylight Savings Time transitions (oh, and for the USA, the added joy of accommodating Arizona, which skips DST entirely apart from the Navajo Nation within it).
I have a simple rule: never, ever persist any timestamp that is not UTC. For situations where the code needs to do zone-aware calculations, I will still store the timestamp as UTC, but then I store the relevant time-zone beside it: so the turn of the New Year in New York City is stored as (“2022-01-01 05:00:00”, “America/New_York”). That way, all your general queries by timestamp still work, and if you need to calculate locale-aware stuff, you can still do it easily enough.
The request in question came from Maine, US. My suspicion is that the user has set Etc/GMT+5 so that they WON'T EVER get DST switched on via an OS update or some other shenanigans. Windows allows Etc timezones (although interestingly I can set Etc/GMT+8 where I am but not Etc/GMT+5). I would expect other operating systems to also allow it.
Yes we also store everything in UTC. However, we do use the current timezone sent by the browser to correctly query the database for snapshots, in this case, by the user's current 'day'.
As if by complete coincidence there is a recent XKCD Cartoon about date formatting!
And of course, the classic-but-not-so-coincidental ISO 8601 of 8 years ago.
Which misses the point of standards entirely. They do not define "the right" way to do anything, just "a" way. It's the fact that it has been defined which matters. There is absolutely nothing wrong with having more than one standard in existence, as long as you make clear which one you are using.
A very nice point, which amply demonstrates that you have never had to parse a 10000+ line CSV file with dates from over 60 users and at least eight different date formats.
Did you know that some people use . to separate year.month.day ?
I swear, if I had had a gun that day, you would have read of another shooting in the news.
@Pascal Monett ... nope, I didn't know that. So, now I'm wondering if I should support that in my date recognition code :)
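For what it's worth, a sketch of what such recognition code might look like if it accepted '-', '.', or '/' between big-endian fields (hypothetical `parse_ymd` helper; a real recogniser would need far more validation, and it would still happily mis-read little-endian dates):

```c
#include <stdio.h>

/* Accept a big-endian date with '-', '.', or '/' separators.
 * Returns 0 on success, -1 on anything it can't make sense of. */
int parse_ymd(const char *s, int *y, int *m, int *d)
{
    char s1, s2;
    if (sscanf(s, "%4d%c%2d%c%2d", y, &s1, m, &s2, d) != 5)
        return -1;
    if (s1 != s2)                               /* mixed separators */
        return -1;
    if (s1 != '-' && s1 != '.' && s1 != '/')    /* unknown separator */
        return -1;
    if (*m < 1 || *m > 12 || *d < 1 || *d > 31) /* crude range check */
        return -1;
    return 0;
}
```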
BTW, https://site.uit.no/english/punctuation/dates/ agrees with you.
I've found Australian and German sites that say dash, slash (although incorrectly calling it forward slash), and dot ... no U.S. sites so far :(
Thanks!
This post has been deleted by its author
This post has been deleted by its author
I know enough about coding to know I'd rather leave it to someone who knows what they're doing. I know enough to vaguely recognise it as C on account of the ==, but can't remember why C did that. Isn't 1 equal enough? Which I guess is possible if you're not too fussed about data types. Or if one = just means it's good enough for government work, and 2 equals means you really want it to equal that.
Sadly my copy of K&R succumbed to gravity, and then the Cam. I think I ended up putting Numerical Recipes in the microwave. Guess if I'd waited for YT, I could have made that into torture pron for programmers.
One = means assignment, two means comparison. As to why that was done way back when, I don't know for sure. It does mean that both of these are valid in C although they behave differently (in most cases):
if ( x = y ) // assigns value of y to x, then tests that value (0 is false, anything non-zero is true)
if ( x == y ) // compares x to y, doesn't change value of either
The fact that the top option works and will compile must have been the source of millions of bugs in C programs over the years. I will note that, for my sins, I have used the method of assigning a variable in an "if" statement intentionally (in Java, FWIW, although the methods are the same) and the code worked as expected.
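Both flavours can be sketched side by side: the first function shows the classic bug, the second the deliberate assign-and-test idiom, where the extra parentheses are the conventional signal to readers (and to compiler warnings) that the single '=' is intended. Function names are illustrative only:

```c
#include <stdio.h>

/* The classic pitfall: assignment where a comparison was meant. */
int buggy_is_zero(int x)
{
    if (x = 0)        /* assigns 0 to x; the condition is always false */
        return 1;
    return 0;         /* so this always runs, whatever x was */
}

/* The intentional idiom: assign and test in one expression. The
 * doubled parentheses mark the single '=' as deliberate. */
int read_first_char(FILE *fp)
{
    int c;
    if ((c = fgetc(fp)) != EOF)
        return c;
    return -1;        /* empty stream or read error */
}
```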
Compilers will warn about common possible errors nowadays. As a software engineer I used to write extra checks for anything I could work out how to check, so the code was as thoroughly and automatically vetted as possible before it even hit the compiler. Even in the 80s there was a shedload of code and tricks to keep your C free of most of the things that still haunt many today.
Back in the day when I dabbled with programming, it was a source of pride to get something to compile with no errors or warnings. Nowadays, that seems to be impossible because, as you and others have said, the compiler authors keep trying to second guess the programmers and, in effect, asking "Are you sure?" over anything that might be a common error, even when it's intentional.
A little late but may help someone somewhere, courtesy of archive.org
V1: https://archive.org/details/TheCProgrammingLanguageFirstEdition
V2: https://archive.org/details/brianw.kernighandennism.ritchietheansicprogramminglanguageprenticehall1988/
The perfect example of the insanity of "everyone can code" and "everyone needs to learn coding". It is the road to disaster. Imagine these people coding a part of the fly-by-wire system...
Unlike the professionals who coded the payments systems for Santander (did 'em twice) or Nationwide (didn't do them at all) you mean? Or the professionals who coded the Crossrail signalling system (three years late and counting) and every NHS computerisation ever.
Face it - standards of coding right across the industry are abysmal and getting worse.
Ian Johnston,
"Face it - standards of coding right across the industry are abysmal and getting worse."
Sorry, when did things improve ...... they were 'abysmal and getting worse' 20 years ago !!!
By now we should be writing/cut & pasting 20GB of code for a simple 'Hello World' test compile.
Where is the AI system where you explain your problem and it writes the code for you ???
:)
Gah. Upvoted. I get so furious when I google “how to fix $error” and the top 10 results are youtube videos. I don’t want a video! A simple numbered list of steps would be more than enough, have more clarity, and save megabytes of download!
Can’t decide whether to use thumbs up icon, for you, or exploding nuke icon, for my current state of mind. Let’s compromise and have a drink. Cheers.
Perhaps a bit of Google Fu? -site:youtube.com
Actually, searching for nearly anything iphone related takes one to a bunch of sites trying to peddle their "fix anything software" for iphones. Keep a list of these strings to add to the search (because I never remember the first time).
The video makes a lot of sensible points, including that in general date comparisons should be done in UTC. But in the case of printing a message at new year, local time would be what you are interested in. Otherwise someone in New Zealand would be halfway through New Year's Day before the computer was hit by the clue bat.
Try ensuring CSV files created to every possible country's 'local' standards can be parsed by the same bit of code....
(as a hint... not everyone uses the ' . ' character as the delimiter between integer figures and the following decimals...)
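As a toy illustration of the decimal-separator problem, here's one naive way to accept either '.' or ',' before handing the string to strtod() (a hypothetical helper, and it deliberately ignores thousands grouping, which is where the real pain starts):

```c
#include <stdlib.h>

/* Normalise ',' to '.' so strtod() can parse either convention.
 * Naive on purpose: "1.234,56"-style grouped numbers will be
 * mangled, which is exactly the ambiguity the thread is about. */
double parse_decimal(const char *s)
{
    char buf[64];
    size_t i;
    for (i = 0; s[i] != '\0' && i < sizeof buf - 1; i++)
        buf[i] = (s[i] == ',') ? '.' : s[i];
    buf[i] = '\0';
    return strtod(buf, NULL);
}
```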
void PrintTime( )
{
    time_t current_time;
    char* c_time_string;

    /* Obtain current time. */
    current_time = time(NULL);
    if (current_time != ((time_t)-1))
    {
        /* Convert to local time format. */
        c_time_string = ctime(&current_time);
        if (c_time_string != NULL)
        {
            /* Print to stdout. ctime() has already added a terminating newline character. */
            (void) printf("%s", c_time_string);
        }
    }
}
I guess that would mostly work. For a given value of "work". Until it doesn't.
PrintTime should be declared as void PrintTime( void ). In C, PrintTime() declares a function that takes an unspecified number of arguments, not necessarily one that takes no arguments (it's C++ where an empty parameter list means no arguments).
The actual encoding of the value returned by time is unspecified, so while most systems conform to the POSIX specification and return number of seconds since epoch, they don't have to.
ctime always returns a pointer to a static buffer, so no need to check the return value. (There's no way to detect a failure.)
ctime doesn't support any localisation, and the format of the output string is... weird:
Www Mmm dd hh:mm:ss yyyy\n
ctime isn't thread safe.
POSIX has obsoleted ctime, and the C standard recommends using strftime instead which is locale sensitive, more flexible, and thread safe.
And finally, there's no indication whether the function was successful, either through a return code, or by printing an error to stderr.
So, if I'd been reviewing this function in 1991 I'd probably let it pass. Today, not a chance.
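For comparison, here is one way the function might look after taking that review on board: strftime() into a caller-supplied buffer, localtime_r() for thread safety, and a return code for failure. A sketch only — the name FormatTime and the exact format string are my choices, not anything from the original:

```c
#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <time.h>

/* Format the current local time into buf. Returns 0 on success,
 * -1 on failure, so callers can actually tell the difference. */
int FormatTime(char *buf, size_t len)
{
    time_t now = time(NULL);
    if (now == (time_t)-1)
        return -1;

    struct tm local;                      /* thread-safe, unlike ctime() */
    if (localtime_r(&now, &local) == NULL)
        return -1;

    /* ISO-ish layout instead of ctime()'s "Www Mmm dd hh:mm:ss yyyy\n". */
    if (strftime(buf, len, "%Y-%m-%d %H:%M:%S", &local) == 0)
        return -1;                        /* buffer too small */

    return 0;
}
```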
Guess what? With any compiler written by even a fourth year Computer Science student nowadays in a 400-series course, the compiler will optimize the double call out of existence. Even Java will.
If you are following a functional paradigm in C/C++, then the example of invoking the same methods twice is not only valid, it is preferred. Leave the optimization to the compiler, and stop wasting your time and making code hard to read trying to tune it by hand.
If your compiler doesn't optimize that out of existence, do check your compiler options and set some flags...
To justify the word "still" in the output, the program needs to wish you a happy new year when it is run for the first time, regardless of the date, and then store the year between runs so that it can detect the change.
Unless the spec is to do exactly what the program does, in which case the programmer gets full marks and the analyst is just deranged.
right?
The sort of coding that would get a FAIL mark in my assignment
But maybe it helps understand m$ and its failings.... somewhere, on a manager's PC, there's the same sort of code claiming it's 1995, and also claiming we don't need more than 640Kb of memory..... or netscape or some free OS called Linux....
So....why the escape on the apostrophe in the second "WriteLine"?
I don't use C#, thankfully, but C, C++, Lua, ALGOL, Pascal, FORTRAN, APL, BASIC, COBOL, and some others ...
none require escaping an apostrophe *unless* the string is delimited by apostrophes, and this one isn't (it's delimited by double quote marks (")).
And, why the braces? Nothing like making the code harder to read :)
(Braces would be indicated if one is using a language requiring them...apologies if C# does, or (to me) they'd indicate the programmer's intention to add more code within them later.)
Escaping the apostrophe is weird. C# does actually accept \' inside a double-quoted string, but it's pointless there and certainly not normal.
Using braces is quite normal IMO. It's so easy to screw up one-line if/else statements I certainly wouldn't pick fault with it. Without even reading the code, braces stick out like a sore thumb that there is a logic block there.
Braces are also a requirement (i.e. something you do unless you can justify why you shouldn't) for MISRA compliance, so for some of us, seeing code written without braces doesn't just look weird, it makes us actively shudder at the thought of what other crap the coder has left lurking in there for us to discover...
Maybe back in the days of yore, when paring source code down to the absolute minimum to save on storage space/pages of fanfold required to print said code meant that you'd pull every trick in the book to make your source as small as possible, adding braces where they weren't strictly required by the language itself might have been seen as an extravagance, but these days there's no reason not to use them, and a bunch of good reasons why they ought to be used.
I very quickly got out of that habit once I started using debuggers where, in such a piece of code, it'd have been impossible to set a breakpoint that fired *only* when the if condition was true. And even now that I'm working with debuggers that could do it, it's still easier to do it when the statement is arranged over multiple lines.
Console.WriteLine(DateTime.Now.Year >= 2022 ? DateTime.Now.Year > 2022 ? "Well Ive missed 2022" : "Its at least 2022" : "Lame its 2021 or before");
Or let's really abuse the string formatter
Console.WriteLine($"{(DateTime.Now.Year==2022 ? "Happy New Year its " : "Commiserations its still ")} {DateTime.Now.Year}");
Heh, I like to craft dreadful examples; I find it helps to show new hires what not to do. Generally if I find string interpolation or ternary operators in anything but UI code it's a code review fail (same with control statements missing curly braces).
Reminds me of the ad that was doing the rounds on social media a few years ago trying to promote the benefits of using AWS vs your own solution, but which ended up suggesting that AWS was no better than anything you could come up with yourself due to whoever'd written the ad not knowing the difference between pre and post increment...