tux3 16 hours ago

The NIST report that explains the decision: https://nvlpubs.nist.gov/nistpubs/ir/2025/NIST.IR.8545.pdf#p...

  • buu700 13 hours ago

    Really glad to see it. It sounds like their reasoning is essentially the same as mine for implementing HQC alongside Kyber in Cyph three years ago: having a different theoretical basis makes it highly complementary to Kyber/ML-KEM, but its public key sizes are much more practical than those of the better-known code-based candidate, Classic McEliece.

whimsicalism 16 hours ago

Given how quickly quantum is potentially coming, I wonder if we should/could find some way of using multiple quantum-resistant algorithms simultaneously as a default, in case a fault is found in one of them after the limited time we have to verify that there are none.

Also - should we not be switching over to these algorithms starting like... now? Am I wrong that anyone collecting https traffic now will be able to break it in the future?

  • jonmon6691 15 hours ago

    Correct, KEMs should be replaced ASAP since they are currently vulnerable to store-now-decrypt-later attacks. Digital signature algorithms are less urgent, but considering how long it takes to roll out new cryptography standards, the post-quantum ones should be preferred for any new designs. That said, the new PQC signatures are much larger than the current 64-byte Ed25519 signatures that are most common, and that could end up being very difficult to integrate into embedded systems with low bandwidth or limited memory, e.g. CAN bus secure diagnostics, Meshtastic nodes, etc.

  • MJGrzymek 13 hours ago

    Isn't that trivial in a sense? Encrypt with layer 1, then use that encrypted channel to send layer 2 (and so on). Not sure about the performance.

    Signal has a post about using pre and post-quantum together: https://signal.org/blog/pqxdh/

    > The essence of our protocol upgrade from X3DH to PQXDH is to compute a shared secret, data known only to the parties involved in a private communication session, using both the elliptic curve key agreement protocol X25519 and the post-quantum key encapsulation mechanism CRYSTALS-Kyber. We then combine these two shared secrets together so that any attacker must break both X25519 and CRYSTALS-Kyber to compute the same shared secret.
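    A minimal sketch of that combining step in Python, assuming an HMAC-based extract over the concatenated secrets (the real PQXDH KDF inputs differ in detail; the salt and placeholder secrets here are made up):

      import hashlib, hmac

      # Placeholder shared secrets from the two key agreements; in PQXDH
      # these come from X25519 and CRYSTALS-Kyber respectively.
      ss_x25519 = b"\x00" * 32
      ss_kyber = b"\x00" * 32

      # Concatenate and extract: deriving the session key requires BOTH
      # inputs, so an attacker must break both X25519 and Kyber.
      session_key = hmac.new(b"example-salt", ss_x25519 + ss_kyber,
                             hashlib.sha256).digest()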

    • chowells 11 hours ago

      You generally don't want to layer encryption like that. It apparently really does introduce new kinds of attacks, which have been observed in the real world.

      The pattern typically used for this is that the key for the high-speed symmetric encryption is split into multiple parts, each of which is encrypted with a separate public key system. One classical, one (or two, now?) with a post-quantum algorithm. As long as each part of the key is big enough, this still means you'd need to crack all the separate public key algorithms, but doesn't introduce any of the layering weaknesses.
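      A minimal sketch of that split, assuming simple XOR shares, one per public-key system (real hybrid combiners typically also run the parts through a KDF):

        import secrets
        from functools import reduce

        def xor_bytes(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        def split_key(key: bytes, n: int) -> list[bytes]:
            # n-1 random shares, plus one share chosen so all n XOR back to the key.
            shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
            shares.append(reduce(xor_bytes, shares, key))
            return shares

        key = secrets.token_bytes(32)
        s1, s2 = split_key(key, 2)  # encrypt s1 under a classical scheme, s2 under a PQ KEM
        assert reduce(xor_bytes, [s1, s2]) == key  # every share is needed to recover the key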

      • immibis 10 hours ago

        Running TLS over TLS is fine, or SSH over SSH, or SSH over TLS, and so on. If it weren't, an attacker could take the TLS traffic they intercepted from you, run it through their own TLS tunnel, and somehow extract more information from it.

        In the early days of SSL there were cross-protocol information leaks if you used the same key or related keys for different protocols or protocol versions. In the DROWN attack, I can get some ciphertext from you in TLS, then feed related ciphertexts back to you in SSLv2 (an ancient version) if you're using the same key for both and have both enabled. With enough tries - a practical number of tries, not 2^64 - I can find the decryption of that ciphertext, and then I can calculate the key for the TLS session I intercepted.

        Well, I can't because I'm not a leading cryptographer, but some people can.

        • 0xDEAFBEAD 7 hours ago

          This is a great point. If layering encryption really did introduce new attacks, an attacker could simply add their own layer on top of single-layer ciphertext and mount the layered-ciphertext attack anyway. So I find myself skeptical of chowells' claim here.

          Here's Wikipedia: https://en.wikipedia.org/wiki/Multiple_encryption

          I'm no expert here, but if I understand Wikipedia correctly (a sketch of both points follows after the list):

          * Be sure to use distinct keys and IVs for each individual layer.

          * Be aware that encrypting ciphertext could lead to a known-plaintext attack on the outer cipher, if the inner ciphertext starts in a standard way (file header etc.)
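          A minimal sketch of both points, assuming AES-GCM from the third-party `cryptography` package (a hypothetical two-layer setup, not anyone's production design):

            import os
            from cryptography.hazmat.primitives.ciphers.aead import AESGCM

            plaintext = b"%PDF-1.7 ..."  # predictable file header

            # Distinct key and nonce per layer, per the first point.
            k1, k2 = AESGCM.generate_key(bit_length=256), AESGCM.generate_key(bit_length=256)
            n1, n2 = os.urandom(12), os.urandom(12)

            inner = AESGCM(k1).encrypt(n1, plaintext, None)
            outer = AESGCM(k2).encrypt(n2, inner, None)

            # The predictable header gives known plaintext against the inner
            # layer only; the outer layer sees pseudorandom ciphertext. But if
            # the inner output were wrapped in an envelope format (magic bytes,
            # version field), that would hand the outer cipher known plaintext
            # too, per the second point.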

        • bawolff 6 hours ago

          One thing you do have to be careful about is layering hash functions, which generally does not work.

      • api 6 hours ago

        Layering independent crypto is safe; if it weren't, that would mean one of the layers has some nasty vulnerability on its own. Security should not depend on the type of traffic. If it does, something is wrong.

        In fact it’s common practice in high security government use cases to mandate two layers built by two different vendors for defense in depth. That way a nasty bug in one doesn’t compromise anything, and the odds of a nasty exploit in both at once are quite low.

        You might be thinking of bespoke combinations of algorithms at the cryptographic construction level where the programmer is trying to be clever. Being clever with crypto is dangerous unless you really know what you are doing.

  • __MatrixMan__ 3 hours ago

    It's not clear whether NIST's recommendations are influenced by their relationship with the NSA (who would have an incentive to pressure NIST to recommend an algorithm which they--and hopefully they alone--can break). So perhaps the reasonable move is to use only one NIST-recommended algorithm and have the other layer be one that they do not recommend.

    • jeffrallen an hour ago

      Or just ask NIST to STFU about cryptographic systems entirely, because their integrity is completely destroyed by their association with NSA.

  • throw_pm23 11 hours ago

    Sorry if my question appears ignorant, but how quickly is quantum really coming? If your prior belief is "nothing practical is ever likely to come out of quantum computing", then so far there is nothing that would seriously suggest you reconsider it.

    I do not say this lightly, having followed the academic side of QC for more than a decade.

    • FateOfNations 10 hours ago

      Given how seriously the spooky parts of the US government are taking it, I would treat it with a similar level of urgency. While we obviously aren't privy to everything they know, their public actions indicate that they don't think it's a hypothetical risk, and that it's something we will need to be ready for within the next decade or so, given technology refresh cycles: they need to get these crypto algorithms in place now, for devices that will still be in production use well into the 2030s.

      There's also a good chance that the first compromises of the classical algorithms won't be made public, at least initially. There are multiple nation-state actors working on the problem in secret, in addition to the academic and commercial entities working on it more publicly.

      • dist-epoch 9 hours ago

        > Given how seriously the spookie parts of the US government are taking it, I would treat it with a similar level of urgency

        Various US standards require encryption algorithms to be considered safe for the next 30 years.

        Sufficiently big quantum computers are likely in the next 30 years, but it's not urgent in any other meaning of the word.

        • mort96 8 hours ago

          Well, that depends on whether or not you care about "store now, decrypt later". Will the info you're sending now be totally uninteresting in 5 years? Great, you're probably good. Do you still want it to be secret in 20 years? Implementing post-quantum cryptography might be urgent.

        • adgjlsfhk1 5 hours ago

          given how sticky crypto algorithms are, transitioning early is a really good idea. git is still stuck with SHA1, and there's plenty of triple DES hiding in the boring types of critical infrastructure that no one can update.

    • grayhatter 11 hours ago

      It's a reasonable question. The need for quantum-resistant crypto isn't because the practical attack is right around the corner. Although I do really enjoy the analogy that predicting when we'll get QC-based crypto attacks is like predicting when humans would land on the moon by looking at the altitude of the highest manned flight. It has more to do with the level of effort it takes to replace infra as critical as cryptography.

      Imagine if next year, via magic wand, all the current TLS systems were completely and totally broken, such that the whole of the internet using TLS became effectively unencrypted. How much damage would that do to the ecosystem? Suppose we also just invented a new protocol that works: how long would it take to deploy it to just 50%? Or to 80%? And how long would it take to replace the long tail?

      I'll also leave record now decrypt later for another commenter.

      • MidnightRider39 7 hours ago

        The problem is more that people concentrate a lot of energy on hypothetical future quantum attacks when the actual threats have been the same since the 00s: unvalidated input, buffer overflow, bad auth, xss, injection etc.

        All the big important systems are again and again vulnerable to these attacks (Cisco, M$, Fortinet, etc.) - but of course those aren't "sexy" problems to research and resolve, so we get the same stuff over and over again while everyone gushes over protecting against science-fiction crypto attacks that are, and have been for the last 30 years, complete fantasy. It's all a bit tiring to be honest.

        • grayhatter 7 hours ago

          It's a mistake to conflate cryptography with application logic errors.

          Your argument is akin to,

          > The problem is that a lot of physicians concentrate on diabetes, or hypertension, when there are people who have been stabbed, or shot. Constantly hearing about how heart disease is a big problem is tiring to be honest.

          Also, I'm not sure what circles you run in, but if you asked any of my security friends whether they wanted to spend time on a buffer overflow, an XSS injection, or upgrading crypto primitives for quantum resistance... not a single one would pick quantum resistance.

          > The problem is more that people concentrate a lot of energy on hypothetical future quantum attacks when the actual threats have been the same since the 00s

          Just so I can be sure... you meant having the qubits to deploy such an attack, right? Because really the only thing stopping some of the quantum-computing-based attacks is the number of stable qubits. They're not hypothetical attacks; they've been shown to work.

          • MidnightRider39 5 hours ago

            > any of my security friends if they wanted to spend time on … quantum

            I commend your friends but many people in these HN threads seem to be ready to implement post-quantum encryption right now to protect against some future threats.

            > you meant having the qbits to deploy such an attack, right

            Yes - last time I checked it was like 3 stable qubits. It's just so far off from being a reality I really can't take that research seriously. I feel like a lot of resources are wasted on this kind of research when we are still dealing with very basic problems that just aren't as sexy to tackle.

            Edit: heart disease is a real thing, so your analogy is lacking - there have been 0 security risks because of quantum in the real world. It's more like "physicians concentrating on possible alien diseases for when we colonise the universe in the future while ignoring heart disease"

    • kerkeslager 10 hours ago

      Based on my own following of QC, I agree with you that we're still pretty far away from QC attacks on current crypto.

      The problem is, this sort of question suffers from a lot of unknown unknowns. How confident are you that we won't see crypto broken by QC in the next 10 years? The next 20? Whatever your confidence, the answer is probably "not confident enough", because the costs of that prediction being wrong are incalculable for a lot of applications.

      I'd say I'm 99% confident we will not see QC break any crypto considered secure now in the next 20 years. But I'll also say that the remaining 1% is more than enough risk that I think governments and industry should be taking major steps to address that risk.

      • MidnightRider39 7 hours ago

        You don’t need any QC attacks if you can far easier find exploits in the same top10 vulns that were used 20 years ago… Industry should first address that very real and serious risk that is present _right now_ before thinking about QC.

bdcravens 6 hours ago

Completely off-topic, but in today's climate, I worry about any department like this having their efforts rendered moot if they become a DOGE target.

  • rsfern 5 hours ago

    It’s not that off topic IMO, NIST (along with all the other federal agencies) is finalizing restructuring/reorganization plans this week in response to one of last month’s executive orders. It’s not clear when the extent of the cuts will be announced, or what the criteria will be for which programs are to be affected. Though in the Commerce secretary’s confirmation hearing there was mostly good will towards this kind of light touch non-regulatory standardization approach that NIST tries to take

some_furry 10 hours ago

The title is subtly incorrect: HQC is a fifth post-quantum cryptography algorithm selected for standardization by NIST, but it's only the second KEM (encryption-like).

> NIST selects HQC as fifth algorithm for post-quantum encryption

The other 3 are digital signature algorithms, not encryption.

throw0101c 15 hours ago

Meta: I can understand the math problems behind RSA and DH, and the general concepts of EC, but I have yet to develop an intuitive understanding of any of the post-quantum algorithms, even after reading / watching a bunch of videos trying to explain things.

  • tptacek 9 hours ago

    The intuition for ML-KEM-style lattice cryptography is actually kind of straightforward. Generate a secret vector x and a random matrix A; multiply Ax, then corrupt it with a random vector of errors e, so instead of Ax=b you have Ax+e=b.

    Solving Ax=b is like week 2 of undergraduate linear algebra. Solving Ax+e=b, in a lattice setting, is Hard in the same sense factoring is: as you run the steps of elimination to attempt to solve it, the errors compound catastrophically.
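    A toy numeric sketch of that, assuming NumPy (illustrative parameters only; real ML-KEM uses structured lattices and far larger dimensions):

      import numpy as np

      q, n, m = 97, 4, 8  # toy modulus, secret size, sample count
      rng = np.random.default_rng(0)

      A = rng.integers(0, q, size=(m, n))  # public random matrix
      x = rng.integers(0, q, size=n)       # secret vector
      e = rng.integers(-2, 3, size=m)      # small error vector
      b = (A @ x + e) % q                  # published samples: Ax + e = b (mod q)

      # Without e, x falls straight out of Gaussian elimination. With e,
      # every elimination step mixes and scales the errors, which is the
      # intuition for why recovery becomes hard at cryptographic sizes.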

    • j_chance 9 hours ago

      Solving `Ax=b` is easy if `x` can be any value. But if you bound `x` such that all entries must be small (e.g. only zeroes and ones) then the problem is hard: https://en.wikipedia.org/wiki/Short_integer_solution_problem

      What you described would be closer to learning with errors (https://en.m.wikipedia.org/wiki/Learning_with_errors) combined with SIS/SVP. Learning with errors is based on the parity learning problem (https://en.m.wikipedia.org/wiki/Parity_learning) in machine learning, which I take as a positive sign of security.

      Interestingly you can get information theoretically secure constructions using lattices by tuning the parameters. For example, if you make the `A` matrix large enough it becomes statistically unlikely that more than 1 solution exists (e.g. generate a random binary vector `x` and it's unlikely that an `x'` exists that solves for `b`).
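      A toy sketch of that uniqueness claim, brute-forcing a binary secret at sizes where the search is still feasible (with NumPy; real parameters make the candidate space astronomically large):

        import itertools
        import numpy as np

        q, n, m = 31, 12, 8  # 2^12 candidate secrets vs. 31^8 possible values of b
        rng = np.random.default_rng(1)
        A = rng.integers(0, q, size=(m, n))
        x = rng.integers(0, 2, size=n)  # secret restricted to {0, 1}
        b = (A @ x) % q

        hits = [c for c in itertools.product([0, 1], repeat=n)
                if np.array_equal((A @ np.array(c)) % q, b)]
        print(len(hits))  # almost certainly 1: with q^m >> 2^n a second solution is unlikely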

  • FateOfNations 10 hours ago

    Yeah, I'm going to have to spend some time trying to understand the math too. I was mostly waiting for things to calm down standardization wise. This newly selected algorithm is based on error correction codes (the same ones used in communications protocols), which look interesting.

  • WaitWaitWha 12 hours ago

    I was in the same exact boat. I just could not digest it.

    I found AI (a combo of Grok and ChatGPT-4o) to be the best resource for this. It was able to break it down into digestible chunks, then pull it together in a way that made sense. It even made suggestions about what math areas I need to brush up on.

    • bryan0 9 hours ago

      I just did the same thing. Claude (sonnet 3.7) walked me through a toy implementation of the algorithm.

EGreg 14 hours ago

What is your favorite post-quantum encryption approach?

I think lattice-based ones will eventually be broken by a quantum algorithm. I am fully on board with Lamport signatures and SPHINCS+.

  • buu700 13 hours ago

    In Cyph, I went with Kyber (lattice-based) combined with HQC (code-based) for encryption. NTRU Prime also may be a good option if Kyber is ever broken in a way that doesn't fundamentally break all lattice crypto.

    For signing, I treat Dilithium (lattice-based) as "standard security" and SPHINCS+ (hash-based) as "high security". In particular, the former is used for end user public keys and certificates, while the latter is used for code signing where the larger public key and signature sizes are less of an issue.

    In all cases, I wouldn't use PQC without combining it with classical crypto, in Cyph's case X25519/Ed25519. Otherwise you run the risk of a situation like SIDH/SIKE where future cryptanalysis of a particular PQC algorithm finds that it isn't even classically secure, and then you're hosed in the present.

    • coppsilgold 5 hours ago

      > In all cases, I wouldn't use PQC without combining it with classical crypto

      With hash-based signatures, hybridization isn't required. They are the most powerful signature scheme approach by far. The security assumption hash-based signatures rely on is also shared with every other signature scheme (hashes are what are signed). Other schemes come with additional assumptions.

      It's unfortunate that hash-based public key exchange is not practical. Merkle Puzzles require 2^n data exchange for 2^2n security.
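      To illustrate how minimal the assumptions are, here is a sketch of a Lamport one-time signature, the simplest hash-based scheme; its security rests only on the hash function (SPHINCS+ builds a stateless many-time scheme out of related primitives):

        import hashlib, secrets

        H = lambda data: hashlib.sha256(data).digest()

        def keygen():
            # 256 pairs of secret preimages; the public key is their hashes.
            sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
            pk = [(H(a), H(b)) for a, b in sk]
            return sk, pk

        def bits(msg):
            d = H(msg)
            return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

        def sign(sk, msg):
            # Reveal one preimage per message-digest bit. Strictly one-time:
            # signing a second message leaks more preimages and enables forgery.
            return [sk[i][b] for i, b in enumerate(bits(msg))]

        def verify(pk, msg, sig):
            return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(msg)))

        sk, pk = keygen()
        assert verify(pk, b"hello", sign(sk, b"hello"))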

      • buu700 4 hours ago

        That's fair. I'd basically agree, which is why I decided not to hybridize SPHINCS+ with Dilithium as I did with the two post-quantum encryption algorithms.

        Having said that, while SPHINCS+ seems highly likely to be safe (particularly as far as PQC goes), it isn't impossible that someone finds a flaw in e.g. the construction used to implement statelessness. It's probably fine on its own, and stacking it with something like RSA is maybe more trouble than it's worth, but there's also very little downside to hybridizing with Ed25519 given its overhead relative to SPHINCS+; 64 bytes on top of a ~30 KB signature is practically a rounding error.

        (Also, small correction to my last comment: only the SPHINCS+ signatures are large, not the public keys.)

bobsmooth 8 hours ago

Don't get me wrong, it's good to be prepared, but what are the chances we'll need these algorithms by 2050?

  • mmooss 5 hours ago

    There are many secrets created 25 years ago that need to be secure now; think of national security secrets.

  • bawolff 6 hours ago

    Predicting anything 25 years out is a fool's errand.

    Cryptographically relevant quantum computers are definitely not happening in the near term, but 25 years is a long enough time horizon that it is plausible.

    Just consider what tech was like 25 years ago. Would anyone in 1999 (without the benefit of hindsight) really have been able to predict modern AI, the ubiquity of smartphones, etc.? Heck, 25 years ago people still thought the internet thing was a fad. Anyone trying to predict 25 years out is full of crap.

    https://xkcd.com/678/

    • tptacek 5 hours ago

      Except for RC4, which people were side-eyeing on Usenet in the 1990s, and key-size progress, I think the cryptography primitives we were using 25 years ago have held up reasonably well. A lot of things that have been major issues since then are things that we knew theoretically back in 2000, but just didn't know viscerally, because people hadn't written exploits. That's not the same thing as, like, Peter Schwabe's take on whether a particular key exchange or KEM is likely to survive until 2050.

Starlord2048 15 hours ago

[flagged]

  • neapolisbeach 15 hours ago

    This comment reads as 100% LLM generated

    • nartho 15 hours ago

      I wonder why? It doesn't to me. Maybe I'm just bad at recognizing LLM patterns.

      • whimsicalism 14 hours ago

        the question/conclusion phrasing at the end is a suspicious red flag to me; a quick click on their comment history is confirmation

        • hallway_monitor 14 hours ago

          The pattern of a top level comment on each post and no replies gives it away for me. Surprised to see a bot here.

          • whimsicalism 14 hours ago

            frankly, it's not a bad reply IMO - probably one of the best LLM replies that I've noticed, although perhaps there are better ones flying under my radar

    • jeremyjh 14 hours ago

      If you read their other comment history it's even more obvious.

    • air7 14 hours ago

      Wow, this is our reality from now on...

garbageman 16 hours ago

[flagged]

  • kmeisthax 12 hours ago

    No, and that wasn't the case under prior administrations either. Remember Dual_EC_DRBG[0]?

    NIST is an untrustworthy government agency that occasionally produces useful encryption standards. The answer to "should we use a NIST standard" is to look at what the wider academic cryptography community is talking about. Dual_EC_DRBG was complained about immediately (for various strange statistical properties that made it impractical) and people found the ability to hide a backdoor in Dual_EC_DRBG in 2004.

    If anything, the biggest issue is that the security experts pointing out the obvious and glaring flaws with NIST standards don't get listened to enough.

    [0] A random number generator standard designed specifically with a back door that only the creator of its curve constants could make use of or even prove had been inserted. It was pushed by NIST during the Bush Jr. administration.

  • qzx_pierri 16 hours ago

    I recommend this video by Computerphile - he talks about how NIST may have been pressured into enforcing compromised (backdoored?) cryptography methods as a standard - Dual_EC_DRBG, to be exact. He also gives a super cool/intuitive breakdown of how this came to be. It will definitely give you some food for thought.

    https://www.youtube.com/watch?v=nybVFJVXbww

    • diggan 16 hours ago

      Small summary, courtesy of Wikipedia, which makes a stronger claim than "may have been pressured":

      > In September 2013, both The Guardian and The New York Times reported that NIST allowed the National Security Agency (NSA) to insert a cryptographically secure pseudorandom number generator called Dual EC DRBG into NIST standard SP 800-90 that had a kleptographic backdoor that the NSA can use to covertly predict the future outputs of this pseudorandom number generator. [...] the NSA worked covertly to get its own version of SP 800-90 approved for worldwide use in 2006. The whistle-blowing document states that "eventually, NSA became the sole editor".

      https://en.wikipedia.org/wiki/National_Institute_of_Standard...

    • tptacek 16 hours ago

      Dual EC was not the product of a contest. The NIST PQC algorithms are all designed by academic cryptographers, many of them not US nationals.

      • natch 13 hours ago

        And chosen by NIST…

        • tptacek 12 hours ago

          And? Finish that thought.

          • natch 11 hours ago

            You are tptacek; I believe you know exactly what I meant. But to indulge you: do you think we can know that the selection process is not compromised?

            • tptacek 11 hours ago

              Explain what the compromised selection process does here. NIST doesn't control the submissions.

              • natch 5 hours ago

                Your question presupposes a claim that the selection process is compromised. I'm not saying it is. I just wonder how we know it's not.

                In NIST's position one could analyze the submissions for vulnerabilities to closely held (non-public) attacks, then select submissions having those vulnerabilities.

              • immibis 10 hours ago

                Seems pretty obvious no?

                1. Pretend to be someone else and enter a backdoored algorithm. Or pressure someone to enter a backdoored algorithm for you. Or just give them the algorithm for the reward of being the winner.

                2. Be NIST, and choose that algorithm.

                • tptacek 9 hours ago

                  You think someone is going to pretend to be Chris Peikert and submit a backdoored construction as him, and that's going to work?

                  This is the problem with all these modern NIST contest theories. They're not even movie plots. Your last bit, about them paying someone like Peikert off, isn't even coherent; they could do that with or without the contest.

                  • mistercheph 4 hours ago

                    > they could do that with or without the contest

                    Then why does the contest give you any more confidence that the selection isn't backdoored?

  • affinepplan 16 hours ago

    if it's any consolation, decisions like these (normally) have a very long lead time measured in years. now, these are not normal times, but even so I'd be more concerned about NIST decisions coming out nearer to the end of this administration rather than just now at the beginning.

    • dylan604 16 hours ago

      More telling would be any reversals in the next couple of years

      • immibis 10 hours ago

        Just yesterday I read someone say that post-quantum cryptography is woke. Not sarcastically. Someone actually believes that post-quantum cryptography is woke, because it's something academics dreamed up and then told everyone they should be using. They view it the same way as DEI.

      • natch 13 hours ago

        If you think the bad actors are only on one side, you are in a bubble that is feeding you bad information.

        • affinepplan 10 hours ago

          yes, and if you think the _number_ of bad actors on each side is even CLOSE to similar in magnitude then you are in a bubble that is feeding you bad information

          • natch 5 hours ago

            I would just encourage you to keep your eyes open and listen to the other side, and revisit this post and see how it ages.

  • dialup_sounds 16 hours ago

    Not to defend the practice, but the layoffs have been for employees with 0-3 years in role, which probably does not include the people selecting post-quantum encryption algorithms.

    • mikeyouse 16 hours ago

      Perhaps, but what surprised the DOGE folks is that "in role" included some people who were recently promoted or had changed teams, so many of the laid-off employees were actually long-time employees with a ton of institutional knowledge. Perhaps they would have learned as much if they had done literally any due diligence to understand the departments they were tasked with organizing, but I guess we'll never know.

      • dylan604 16 hours ago

        You're implying that these people are not smart enough to already have known what they were doing. You will now be logged and noted as a dissenter, and are now officially warned against further dissent. You don't want to FAFO. --DOGE management

        it's only the next natural step

    • nartho 15 hours ago

      If I'm not mistaken, this is inaccurate. It applied to all probationary employees, which includes anyone promoted in the last 2 years.

    • doikor 14 hours ago

      The 0 to 3 year counter also resets when you get promoted (you are "probationary" for the new role)

  • natch 14 hours ago

    Claiming loyalty is a litmus test for layoffs is a bit incendiary and a needless introduction of a strongly biased view of politics into the conversation. No doubt for leadership levels an active disinterest in helping enable open inquiry into the state of things would be fireable, but calling this a loyalty test is a strong spin. One that’s been normalized lately to be sure, but there’s no need to further it.

    I’d be more concerned with whether NIST colludes with the NSA to approve algorithms they could crack.

    • krunck 13 hours ago

      > I’d be more concerned with whether NIST colludes with the NSA to approve algorithms they could crack.

      It's more than a concern that the US government will select algorithms that their top spook agency can crack. One must assume it is the case.

      • natch 13 hours ago

        Exactly.

  • ZiiS 14 hours ago

    "can we trust this is the best encryption standard" is by definition "no". Doesn't matter who you are asking about what.

  • grayhatter 15 hours ago

    I was unable to trust NIST before

  • rdtsc 15 hours ago

    Did you trust them before?

    • garbageman 15 hours ago

      I trusted them to at least pick an encryption scheme that only they could break... rather than one that other nation states may be able to crack as well.

      • ziddoap 14 hours ago

        If it's breakable by them, it's breakable by anyone. Full stop. It is not possible to create an encryption scheme only breakable by NIST (or NSA, or whatever 3-4 letter agency).

        • timewizard 14 hours ago

          Have people forgotten about this already?

          https://en.wikipedia.org/wiki/Dual_EC_DRBG

          It's not at all impossible to put a backdoor in a protocol that requires knowledge of a key in order to exploit. This isn't even the only example where this is thought to have occurred.

          • ziddoap 13 hours ago

            I haven't forgotten about it, no, and I stand by my original comment.

            If you introduce a deliberate weakness to your encryption, the overall security is reduced to the security level of that weakness.

            Relying on NOBUS ("nobody but us") is hubris (see shadow brokers, snowden, etc.).

            • tarlinian 13 hours ago

              This just doesn't make technical sense. I completely agree that backdooring encryption standards is a bad thing. But Dual EC DRBG is a clear example of a NOBUS backdoor actually being that. The backdoor is equivalent to "knowing" a private key. The weakness is not some sort of computational reduction. Using this logic, you would say that no encryption method is possibly secure because you can't rely on its security once the key is exposed.

              • ziddoap 12 hours ago

                The reason it remained a "NOBUS" backdoor is that the whole world noticed something was funky with it pretty much immediately (even prior to standardization), and security researchers loudly told people not to use it. At that point the value of cracking open the backdoor is reduced significantly. It was standardized, and barely used except where mandated by governments, for less than 10 years.

                There's no reason to think it would have remained a "NOBUS" backdoor forever. Especially if it was more widely used (i.e. higher value), and/or used for longer.

                >Using this logic, you would say that no encryption method is possibly secure

                I mean, to the extent that a little waterboarding will beat any encryption method, yes, I would say that.

                But, for 99.99% of people, your data isn't worth the waterboarding. On the flipside, a backdoor to, say, all TLS communication, would be very worth waterboarding people.

        • whimsicalism 14 hours ago

          Really? It seems to me like it should be theoretically possible in some cases, for the same reasons public/private key cryptography is possible.

      • rdtsc 14 hours ago

        Heh. That's fair.

        I wonder what other countries do. Do their agencies trust NIST, or do they recommend their own and run their own programs for algorithms? I am thinking of, say, Germany, France, Britain, etc.

  • jedisct1 15 hours ago

    This is a key exchange mechanism, not an encryption system.

  • whimsicalism 16 hours ago

    can we not turn every thread into a politics one? i doubt the Trump admin has strongly vested interest in which post-quantum scheme is selected

    e: and yes, i am aware of the history around nist and crypto

    • grayhatter 15 hours ago

      That's a good question, the answer is no.

      That's the thing about politics... they touch everything. There's a popular youtuber that I like; he's got a funny saying: "You might not fuck with politics, but politics will fuck with you!" Fits well here.

      You might wanna ignore politics when talking about something that should be pure math, but now we're talking about which crypto is going to be the standard that all commercial software must support. Suddenly we need to consider how confident we are in it. And really, all crypto boils down to is confidence in the difficulty of some math. Was this recommended (soon mandated) with more or less care than the other options? How would we be able to tell? Is NIST likely to repeat their previous unethical mistakes?

      • timewizard 13 hours ago

        > they touch everything.

        No. They don't. The level at which politics has actually intersected with my life in the past year is zero. I suspect the same is true for the majority of people in the US.

        Your politics are mostly a fashion choice. You don't need to put them on display in literally every conversation. You also cannot possibly change the world around you with this behavior, so I can't understand why so many people feel the need to engage in it.

        > Suddenly we now need to consider

        The government is massive. You always need to consider this. Pretending that a choice of a single federal official is the difference maker here takes bizarre fashion choices into the completely absurd. The only thing you're doing is alienating half the audience with churlish behavior.

        • ziddoap 13 hours ago

          >The level at which politics has actually intersected with my life in the past year is zero.

          Road maintenance, sewer connections, water and air quality, food safety, and a million other things that you interact with daily are all results of various levels of politics.

        • grayhatter 11 hours ago

          > Your politics are mostly a fashion choice. You don't need to put them on display in literally ever conversation. You also cannot possibly change the world around you with this behavior so I can't understand why so many people feel the need to engage in it.

          Given your obvious disgust with someone else thinking or talking about it (you weren't tagged; you decided to invite yourself into the conversation to proclaim that someone else is wrong for understanding the world differently from you), it's not much of a shock you can't understand why.

          > you also cannot possibly change the world around you with this behavior

          This feels like the thing you're actually mad about. Complaining at other people for talking about politics (instead of ignoring them) will have even less of an effect. If you want to have an impact ask more questions, don't berate people for not being as smart as you already are, or for daring to see things differently from the way you see them.

          As for the change in the world I want to see: first, I want people to be nicer to each other. This us-vs-them thing needs to stop. Second, I don't need to change the world, I'm happy to just improve my little corner of it a bit. Security (and crypto) is my corner, and NIST has made some mistakes that I find problematic. The idea that the leadership of any org *does* influence that org isn't normally controversial, so if the leadership changes, it's good to know the trust level is gonna change. If the leadership changes in a less trustworthy direction, I'd hope more people learn about it, so blind trust in NIST drops. I would call that a small improvement.

          > The government is massive. You always need to consider this. Pretending that a choice of a single federal official is the difference maker here takes bizarre fashion choices into the completely absurd. The only thing you're doing is alienating half the audience with churlish behavior.

          I mean, the people replying to you while you rage at them must care a bit about the politics of the system. So I can't imagine that calling something they care about a "fashion choice", and then insulting them wouldn't feel alienating. Is this a do as I say, not as I do kinda thing?

        • malcolmgreaves 13 hours ago

          I don’t believe that you have absolutely zero money invested in public stock markets.

    • Analemma_ 16 hours ago

      > i doubt the Trump admin has strongly vested interest in which post-quantum scheme is selected

      That's not the argument being made; you're using that as a strawman to distract from the actual position, which is that indiscriminate layoffs (which is what DOGE is doing) reduce institutional competence and increase the likelihood that whatever scheme is selected is not fit for purpose. Address that argument, not the one you've invented in your head.

    • kewho 16 hours ago

      [dead]