Talk:HKDF

From Wikipedia, the free encyclopedia

Untitled[edit]

The Python example doesn't take into account some limitations on variables imposed by the RFC, such as length <= 255*hash_len, and that neither length nor ikm may be empty. — Preceding unsigned comment added by Hackancuba (talkcontribs) 8 March 2017 (UTC)
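For reference, the checks described above could be sketched like this (a minimal illustration only; the function name and signature are mine, not the article's, and SHA-256 is assumed as the hash):

```python
import hashlib

def check_hkdf_args(length, ikm, hash_name="sha256"):
    """Validate HKDF inputs per the limits discussed above (RFC 5869)."""
    hash_len = hashlib.new(hash_name).digest_size
    if not ikm:
        raise ValueError("IKM must not be empty")
    # The expand step uses a one-byte counter, so at most 255 blocks fit.
    if not 0 < length <= 255 * hash_len:
        raise ValueError("length must be between 1 and 255 * hash_len")
```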

The wikipedia page does not explain how HKDF works, or the steps performed within the algorithm. HappyDragon* (talk) 04:03, 18 August 2017 (UTC)[reply]

The Python example also incorrectly concatenates the byte value of "i" -- in my testing, it added the bytes for "[1]" (the ASCII for the brackets plus the number in between). This should probably just be "t + info + chr(1+i)", also with bounds checking on lengths > 255 * hash_len as suggested above. (Though the error raised when it tries to generate chr(256) will provide bounds checking too. :) ) Dschuetz (talk) 18:23, 5 September 2018 (UTC)[reply]
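The difference between the two concatenations being discussed can be shown in a few lines (a standalone illustration in Python 3; `t`, `info`, and `i` are hypothetical stand-ins for the loop variables):

```python
# Python 3: the HKDF block counter must be appended as a single octet.
t = b""                 # previous block (empty on the first iteration)
info = b"some-context"  # hypothetical context string
i = 1                   # block counter, 1-based per RFC 5869

msg = t + info + bytes([i])          # correct: ends with the byte 0x01
assert msg == b"some-context\x01"

buggy = t + info + str([i]).encode() # the bug described: literal "[1]"
assert buggy == b"some-context[1]"
```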

In general, I also don't like having a code example without first explaining the HKDF() parameters. You're right about some missing checks. However, bytes([1]) is not "[1]" but b"\x01", unless you are using the ~obsolete Python 2 :) . Applying the first two RFC test cases,
testcases = [
    ("0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b",
     "000102030405060708090a0b0c",
     "f0f1f2f3f4f5f6f7f8f9",
     42,
     "077709362c2e32df0ddc3f0dc47bba6390b6c73bb50f9c3122ec844ad7c2b3e5",
     "3cb25f25faacd57a90434f64d0362f2a2d2d0a90cf1a5a4c5db02d56ecc4c5bf34007208d5b887185865"),

    ("000102030405060708090a0b0c0d0e0f"
     "101112131415161718191a1b1c1d1e1f"
     "202122232425262728292a2b2c2d2e2f"
     "303132333435363738393a3b3c3d3e3f"
     "404142434445464748494a4b4c4d4e4f",

     "606162636465666768696a6b6c6d6e6f"
     "707172737475767778797a7b7c7d7e7f"
     "808182838485868788898a8b8c8d8e8f"
     "909192939495969798999a9b9c9d9e9f"
     "a0a1a2a3a4a5a6a7a8a9aaabacadaeaf",

     "b0b1b2b3b4b5b6b7b8b9babbbcbdbebf"
     "c0c1c2c3c4c5c6c7c8c9cacbcccdcecf"
     "d0d1d2d3d4d5d6d7d8d9dadbdcdddedf"
     "e0e1e2e3e4e5e6e7e8e9eaebecedeeef"
     "f0f1f2f3f4f5f6f7f8f9fafbfcfdfeff",

     82,

     "06a6b88c5853361a06104c9ceb35b45c"
     "ef760014904671014a193f40c15fc244",

     "b11e398dc80327a1c8e7f78c596a4934"
     "4f012eda2d4efad8a050cc4c19afa97c"
     "59045a99cac7827271cb41c65e590e09"
     "da3275600c2f09b8367793a9aca3db71"
     "cc30c58179ec3e87c14c01d5c1f3434f"
     "1d87")
]

for t in testcases:
    ikm, salt, info, l, prk, okm = [ bytes.fromhex(x) if isinstance(x, str) else x for x in t ]
    k = hkdf(l, ikm, salt, info)
    if k == okm:
        print("OK")
    else:
        print("FAIL")
it passes. —Mykhal (talk) 20:55, 13 April 2021 (UTC)[reply]
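For anyone wanting to run the harness above standalone: a minimal HKDF-SHA256 sketch in the style of RFC 5869 that matches the hkdf(length, ikm, salt, info) signature used there (my own sketch, not the article's code):

```python
import hashlib
import hmac

def hkdf(length, ikm, salt=b"", info=b""):
    """Minimal HKDF-SHA256 sketch per RFC 5869 (extract-then-expand)."""
    hash_len = hashlib.sha256().digest_size
    if length > 255 * hash_len:
        raise ValueError("length too large for a one-byte block counter")
    # Extract: PRK = HMAC-Hash(salt, IKM); empty salt defaults to HashLen zeros.
    if not salt:
        salt = b"\x00" * hash_len
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    # Expand: T(i) = HMAC-Hash(PRK, T(i-1) | info | i), counter as one octet.
    t = b""
    okm = b""
    for i in range(1, -(-length // hash_len) + 1):  # ceil(length / hash_len) blocks
        t = hmac.new(prk, t + info + bytes([i]), hashlib.sha256).digest()
        okm += t
    return okm[:length]
```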

Uses[edit]

The Uses section states: "To "extract" (condense/blend) entropy from a larger random source to provide a more uniformly unbiased and higher entropy", however, section 4 of the RFC is very clear that this is not possible (emphasis mine): "The extract step in HKDF can concentrate existing entropy but cannot amplify entropy" — Preceding unsigned comment added by ColinA3 (talkcontribs) 20:49, 9 May 2018 (UTC)[reply]

Example code[edit]

I hope it's not too much to ask that contributors test the example code when editing it. The previous couple of edits were blatantly wrong. Ewx (talk) 19:18, 18 February 2022 (UTC)[reply]