DD/MM/YY or DD/MM/YYYY?

  • There have been discussions about the order of DD, MM, YYYY, but never really any discussion about why designers choose to use YYYY over YY (for instance, 01/01/2017 rather than 01/01/17).

    Any idea why there is such a preference?

    I can think of a few instances where including the entire YYYY is useful based on context, but assuming that the system I'm designing won't be dealing with century-old objects, I wouldn't need the full YYYY. Am I right? Isn't it fine to just use YY then?

    In the year 2099, *somebody* is going to curse your name...

    Use neither. https://xkcd.com/1179/ ...and a curse on your code if you ever use YYYY/DD/MM - that is an abomination.

    I did not expect such a question just 17 years after Y2K.

    https://en.wikipedia.org/wiki/ISO_8601 "an unambiguous and well-defined method of representing dates and times": YYYY-MM-DD. For example, September 27, 2012 is represented as 2012-09-27. No slashes. With time: YYYY-MM-DDThh:mm:ss (or YYYY-MM-DDThhmmss for filenames), even including milliseconds if needed: .mmm. The "T" separator really helps remove every ambiguity, and is easy to parse as well. ISO 8601 is a good thing, and should be used EVERYWHERE! (I have log-search scripts that need to "guess" among 12 date formats :'( )
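
    As a quick illustration of the comment above, here is a minimal Python sketch (standard library only) of producing and parsing the "T"-separated form; the sample timestamp is made up:

    from datetime import datetime

    # An example timestamp; any datetime behaves the same way.
    dt = datetime(2012, 9, 27, 11, 34, 26)

    # ISO 8601 with the "T" separator, as produced by the standard library.
    print(dt.isoformat())                 # 2012-09-27T11:34:26

    # A filename-friendly variant without colons, keeping the same big-endian order.
    print(dt.strftime("%Y%m%dT%H%M%S"))   # 20120927T113426

    # Parsing the "T"-separated form back is just as unambiguous.
    print(datetime.fromisoformat("2012-09-27T11:34:26") == dt)   # True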

    If it says that someone was born on 01/01/13 then it is ambiguous... That person could be 4 or 104 years old. Specifying the full year makes the date unambiguous, while two-digit years require context and exercising one's brain in order to interpret.

    There is no situation when one should use anything but ISO 8601: `YYYY-MM-DD`. No exceptions.

    _assuming that the system I'm designing won't be dealing with century old objects_ That's already a good reason to use YYYY: not having to make such assumptions.

    I think it makes a big difference whether you are talking about how to display it or how to store it, and you haven't mentioned which one you are referring to.

    Not century old. 17-year-old. If you have an object from '99 and another from '01, we all know you mean 1999 and 2001, but your computer will need special programming to know that, or it will order them incorrectly.

    @yitzih et al. I believe it's an automatic and 100% safe assumption that here we are exclusively talking about displaying and not about storing. But it's true that it would not have hurt to make that explicit - just like the year in a date ;)

    Just because you display 07/04/17 doesn't mean the database is only storing 17. There's an advantage in saving space by not showing the redundant "20" - just so long as you record it as 2017.
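
    A tiny sketch of that split between storage and display, in Python; the field name "paid_on" is just a made-up example:

    from datetime import date

    # Store the full, unambiguous date (e.g. an ISO 8601 string or a DATE column).
    stored = date(2017, 4, 7)
    record = {"paid_on": stored.isoformat()}    # '2017-04-07' goes into the database

    # Shorten only at the presentation layer.
    display = stored.strftime("%d/%m/%y")       # '07/04/17' shown to the user
    print(record["paid_on"], "->", display)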

    If this is a desktop application you're developing, or some other context where you have access to a user's global date/locale preferences as configured by the host operating system, then *use those*. Otherwise, for a web app or something else, I'd say follow everyone else's advice and use the ISO format. (Every web site does not need yet another configuration page, and certainly not for something as trivial as a date format.)

    Dude, Y2K already has the answer to your question!

    "Those who cannot remember the past are condemned to repeat it" - George Santayana

    Only use 31/12/2017. Why? Because I say so!

    @sds Starting in the year 10000, the format YYYY-MM-DD will no longer do. That is one exception. Only one, but it will last a long time.

  • There is no universally good answer to this question, but there are definitely two pros of YYYY:

    • by showing the two leading digits you can easily tell e.g. 1911 from 2011,
    • you know exactly where the year is, even in cases when its last two digits fall in the range 01-12 and could otherwise be mistaken for a month or a day.

    In other words:

    Notation     Possible interpretations:
    -----------  -------------------------
    09/10/11     Four:
                 September 10th 2011
                 9th of October 2011
                 2009, 10th of November
                 2009, October 11th
    
    09/10/2011   Two:
                 September 10th 2011
                 9th of October 2011
    
    2009/10/11   Two as well:
                 2009, 10th of November
                 2009, October 11th
    

    So, as you can see, by telling the User where the year is you reduce the ambiguity. Should the date be from a year whose last two digits are higher than 31 (the highest number possible in the other fields), something interesting happens:

    Notation     Possible interpretations:
    -----------  -------------------------
    09/10/33     Two:
                 September 10th 2033
                 9th of October 2033
    

    However, the above interpretation requires the User to first analyse the contents of the string, so it increases the cognitive load significantly. The thinking would be:

    "Is 09 the year? Not sure. The middle one is not year. Oh, 33 is the year. So 09 must be the day or month."

    Of course this happens in the blink of an eye (Well, two of them. Well, three), but it is still a cognitive load, and if Users need to deal with a lot of dates in this form, they may need to go through the same unwelcome process of searching for the year many times until they learn. And they should not have to learn.

    With the formats below, on the other hand, you do not need to examine the contents at all; you can tell where the year is just by looking at the obscured string:

    • ▓▓-▓▓-▓▓▓▓
    • ▓▓/▓▓/▓▓▓▓
    • ▓▓▓▓/▓▓/▓▓
    • ▓▓▓▓-▓▓-▓▓
    • ▓▓▓▓-▓▓
    • ▓▓-▓▓▓▓

    The day vs month problem: gradual versus cultural approach

    Now we get to the real culprit behind why dates are so unclear: month versus day. Let us say we have solved our problem with the year and still need to tell one from the other here: ▓▓/▓▓/▓▓▓▓

    Notation     Possible interpretations:
    -----------  -------------------------
    09/10/2033   Two:
                 September 10th 2033
                 9th of October 2033
    

    For me, the gradual approach, where the time units consistently go from lower to higher (so: DD/MM/YYYY) or the other way around (YYYY/MM/DD), makes much more sense. Unfortunately, in a system that Users only approach from time to time it does not matter whether you use this approach, because they will not remember that you have used it.

    On the other hand, the MM/DD/YYYY format is common in the US, Canada, Greenland, the Philippines and several African countries (source), as an abbreviation of the way the date is pronounced: "September 10th, 2017". However, since other ways of saying the date are also in use, this only adds confusion.

    To get out of this madness you may consider changing MM to a textual form (e.g. shortened: OCT/10/2017 or 09/SEP/2017), but then you run into a translation problem for international Users.
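
    To see that translation problem concretely, here is a minimal sketch using the third-party Babel library (an assumption on my part; it ships the CLDR locale data): the same date with an abbreviated textual month renders differently per locale.

    from datetime import date
    from babel.dates import format_date

    d = date(2017, 9, 9)
    for loc in ("en_US", "pl_PL", "ka"):            # English, Polish, Georgian
        print(loc, format_date(d, format="dd/MMM/yyyy", locale=loc))

    # Typical output (exact abbreviations depend on the CLDR version):
    #   en_US 09/Sep/2017
    #   pl_PL 09/wrz/2017
    #   ka    09/სექ./2017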

    Professional usage

    One situation where you do not need to worry about the notation is when Users deal with date data a lot, mostly in a professional way. Two examples off the top of my head would be financial analysts (observing changes in the market) or photographers dealing with a lot of photos named using some convention they know by heart. If they know it by heart, this is not a concern.

    "Now" context anchor

    Another situation where it becomes less important which part of the date is the year is when Users are oriented towards "now", for example when dates are shown relative to the present ("2 hours ago"). Facebook is a good example.

    Saving space

    Saving space can sometimes be a really important factor in these decisions. Again, in dashboards containing a lot of data, the year may be either completely redundant or may need to be truncated. But I believe these dashboards fall into the professional-usage basket most of the time, so there is no need to worry about them too much.

    Combining into one text string and sorting

    In some cases you may face a situation when you need to combine the date into one big chunk of text. For example, the naming convention I use for photo files is YYYYMMDD_HHmmSS.ext (e.g. 20170911_113426.RAF). This, again, falls into the "pro" usage basket; however, it also provides a means of sorting by date without needing to worry that the date attribute of a file might change (e.g. because it was moved to a file system that does not support this kind of attribute, or edited in an app that would clear it). This usage scenario brings two conclusions (see the sketch after the list below):

    • it is good to have a full year, because at least the photos from 2000 will be after those from 1999,
    • it is good to use the gradual order, progressing from higher to lower unit.
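
    A minimal sketch of why that convention works; the filenames here are made up for illustration:

    # Big-endian YYYYMMDD_HHMMSS names sort chronologically as plain text.
    photos = [
        "20000101_000501.RAF",
        "19991231_235959.RAF",
        "20170911_113426.RAF",
    ]
    print(sorted(photos))
    # ['19991231_235959.RAF', '20000101_000501.RAF', '20170911_113426.RAF']

    # A two-digit year breaks this: the year 2000 lands before 1999.
    short = ["000101_000501.RAF", "991231_235959.RAF"]
    print(sorted(short))
    # ['000101_000501.RAF', '991231_235959.RAF']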

    Wrap-up:

    • Do your Users know the system by heart? Do they use the date attribute on an everyday basis? If so, do not worry about making it obvious where the year is, but consider additional things like sortability or uniqueness (e.g. telling 1911 from 2011 when the date range is wide).

    • Do your Users approach the date attribute only occasionally? If so, make it easy to recognise which part of the date is which without a high cognitive load on their side: expand the year to four digits and make it clear where the month is. Unless space is critical for you, in which case you need to prepare for trade-offs.

    EDIT:

    As many comments below refer to the ISO 8601 standard, I would like to explain why I did not include it in my original answer.

    I believe that the word "standard" has a twofold meaning: a norm and a convention.

    A convention is a common approach to something, used by a limited group of people. For a specific topic like this one there can be (and there usually are) various conventions, some of which contradict one another (again: as in this case). What is more, most of the conventions contradict the norm, and the norm contradicts most of the conventions.

    Conventions have historical, linguistic, practical and other roots. In the case of date conventions, for example, MM/DD/YY comes from the American way of saying the date, "November 5th, 2008", whereas elsewhere it can be different.

    Now to the norm.

    A norm's role is to supersede most of the conventions used so far, to replace them. The norm can be one of the existing conventions, but most of the others have to be rejected if just one is to remain. A norm is usually well thought out. The norm in this particular case makes a lot of sense, as each successive unit is smaller than the previous one (which allows easier comparison between dates, sorting by date as a text string, etc.).

    There are definitely two ways to go from here.

    • One is to push the norm until it is used everywhere, providing, in the long term, a coherent standard used all over the world. Forcing people to use something different from what they have always used has its drawbacks, so this way is - to some extent - against usability.

    • The other option is to adapt to the local conventions people understand. Having derived from cultural, linguistic and practical reasons, the conventions feel locally more adequate. But at the same time, people used to one convention also meet the others while browsing the web, and may become confused when they see something different from what they are used to, so this approach is also - to some extent - against usability.

    This is why I still believe that there is no universally good answer to this question, and one may not appear any time soon. What can be done for now is to limit some bits of the confusion - as in the case of the year being written as four digits, not two.

    Why no mention of the ISO standard? YYYY-MM-DD. Surely the more people are aware of the standard and the more people get used to seeing it, the better it works and the lower the confusion.

    https://www.iso.org/iso-8601-date-and-time-format.html "ISO 8601 tackles this uncertainty by setting out an internationally agreed way to represent dates: YYYY-MM-DD. For example, September 27, 2012 is represented as 2012-09-27." No slashes. YYYY-MM-DDThh:mm:ss (or YYYY-MM-DDThhmmss for filenames), even including milliseconds if needed: .mmm. Get used to the "T" separator: it really helps remove every ambiguity, and is easy to parse as well. ISO 8601 is a good thing, and should be used EVERYWHERE :) (I have log-search scripts that need to "guess" among 12 date formats :'( )

    -1 for "There is no universally good answer to this question" ISO-8601 exists. It is *the* "universally good answer to this question". Period.

    While ISO 8601 is nice for programmers, it's not nice for users. In reality, the date should be formatted according to the user's language settings. Yes, the year should never be shortened, and ideally the month should be text, not a number. That's why we have "Unicode Technical Standard #35".
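
    As a sketch of what locale-driven formatting looks like in practice, here is an example using the third-party Babel library (an assumption; it is one implementation built on the CLDR data behind UTS #35):

    from datetime import date
    from babel.dates import format_date

    d = date(2017, 10, 1)
    for loc in ("en_US", "en_GB", "de_DE"):
        print(loc, format_date(d, format="medium", locale=loc))

    # Typical output (exact strings depend on the CLDR version shipped with Babel):
    #   en_US Oct 1, 2017
    #   en_GB 1 Oct 2017
    #   de_DE 01.10.2017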

    @Sulthan "date should be formatted according to user language settings". The problem with this is that, more often than not, *the user doesn't know if the date has been formatted according to their language settings*. If a British person visits an American website and sees `01/10/17`, how do they know which format is being used? Of course you can solve that with `01 Oct 17` but then you have added a new internationalization problem. I agree with using the ISO format. The unfamiliarity problem is smaller than other problems, and the more it is used, the more this problem goes away.

    @dan1111 - I agree to some extent, but not entirely. First, a standard is a good thing when it is commonly used; ISO 8601 is not. Using locally common notation makes the date unfamiliar to those who come from somewhere else. Using text for the month raises a linguistic problem - I would recognise March, but not მარტი, which is Georgian. This is why I still believe there is no universally good notation, even though there is a standardised one. Should this standard become commonly used, yes, it would be the correct answer.

    I agree that there is no *perfect* solution. But I would lean much more toward that ISO standard as the best of the imperfect solutions.

    One additional point about the sorting topic: beginning with (I believe it was) Windows Vista, Microsoft changed the operating system's alphanumeric sorting of filenames from a simple dictionary-style sort to one that treats any sequence of consecutive digits as a single number within the filename. As a result, the hex strings properly sorted as 777, 98F, 990, 99A, 9A0 are instead sorted as **9**A0, **98**F, **99**A, **777**, **990**, and any date in the format MM-DD-YYYY will sort all January items together regardless of year. (And DD-MM-YYYY is even worse!)
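
    A rough sketch of that kind of "numeric chunk" sort (an approximation for illustration, not Microsoft's actual implementation), reproducing the examples above:

    import re

    # Compare every run of digits as a number, everything else as text.
    def natural_key(name):
        return [int(part) if part.isdigit() else part
                for part in re.split(r"(\d+)", name)]

    hex_names = ["777", "98F", "990", "99A", "9A0"]
    print(sorted(hex_names))                    # ['777', '98F', '990', '99A', '9A0']
    print(sorted(hex_names, key=natural_key))   # ['9A0', '98F', '99A', '777', '990']

    # MM-DD-YYYY filenames (made up here) group all January items together,
    # regardless of year:
    dates = ["01-05-2016.txt", "12-01-2015.txt", "01-20-2017.txt"]
    print(sorted(dates, key=natural_key))
    # ['01-05-2016.txt', '01-20-2017.txt', '12-01-2015.txt']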

    @Shane Not according to every business I have ever worked for in the last 20 years in the US. They all demand MM/DD/YYYY.

    As an American who has lived in and done business with commonwealth countries, please please ensure your format isn't simply _implied_ by whatever region your office happens to be in. Be explicit. Personally I write "6 April 2017", or "April 6, 2017". Zero confusion. Your users **do not** bring the same assumptions you do.

    You write "2009/10/11 Two as well:" There is not and never will be a YYYY/DD/MM date code, which is why your argument, while long, is wasted. The ISO-8601 YYYY-MM-DD option is the solution that you hide from by inventing a nonexistent conflict to try to obscure the obvious case in favour of the standard.

    @DominikOslizlo _standard is a good thing when it is commonly used._ I disagree; a standard is a good thing when there is a need for one. Then, if for any reason it doesn't get used by the majority (in this case, mostly due to laziness or resistance to change), it's still good that it's there, because that way it still has the chance of becoming commonly used; once enough people get fed up with the guessing and switch to the standard, the problem is solved forever and for everybody. The problem of not knowing what format a date is in is caused by knowing that it might be one of several.
