Some services, like Signal and Tutanota, are slowly developing post-quantum-resistant protocols. When will this be a thing for the web?

    • Sources have dried up now that the Snowdens and Assanges of the world have gone into hiding, but they definitely used to collect massive amounts of data for years.

      The NSA capturing traffic for later decryption is mere speculation; they're not exactly going to admit it, and I doubt anyone is going to commit career suicide and flee to Russia just to get this out. But they'd be wasting money on all those servers if they weren't doing it. There have been plenty of vulnerabilities, weak RNGs and protocol bugs, that made decryption after the fact possible once they were discovered. And it wasn't that long ago that a typical HTTPS site used a single paid certificate key, renewed yearly, with no forward secrecy, so obtaining one key per website per year was enough to decrypt all of its recorded traffic. Give it a year or five and a mid-sized company can cheaply brute-force 2005-style HTTPS keys (1024-bit RSA was still common then). Surely those NSA billions are ahead of the curve.
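      A minimal sketch of why one key per site was enough, using Python's `cryptography` package (the key, the premaster secret, and the "capture" are all simulated here, so this is an illustration, not a decryption tool): in the old TLS_RSA_* cipher suites, the client encrypted the premaster secret directly to the server's long-lived certificate key.

      ```python
      import os
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      # The server's long-lived certificate key, typically reused for a year.
      server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

      # Handshake time: the client encrypts the 48-byte premaster secret to
      # the server's public key (PKCS#1 v1.5, as TLS RSA key exchange did).
      premaster = b"\x03\x03" + os.urandom(46)
      captured = server_key.public_key().encrypt(premaster, padding.PKCS1v15())

      # Years later: anyone who recorded `captured` and has since obtained
      # (leaked, seized, or factored) the private key recovers the premaster
      # secret, hence the session keys -- for this and every other session
      # ever encrypted to that certificate key.
      recovered = server_key.decrypt(captured, padding.PKCS1v15())
      assert recovered == premaster
      ```

      Forward-secret (EC)DHE key exchange, now mandatory in TLS 1.3, is what closed this particular hole: a stolen certificate key no longer decrypts previously recorded traffic.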

      With a little preselection based on the plaintext domain name (the SNI field sent in the clear during the TLS handshake) or the target's IP address, they could easily gather all the relevant traffic while filtering out bulk content like images and videos. That's still a lot of data, but it's manageable.
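      A hypothetical sketch of that preselection step: even on fully encrypted connections, the hostname travels in the clear in the ClientHello's SNI extension (RFC 6066), so a passive tap can decide per flow whether to keep it. This hand-rolled parser assumes a single, well-formed TLS record and skips the bounds checking a real tool would need:

      ```python
      def sni_from_client_hello(record: bytes) -> str | None:
          """Extract the plaintext server_name from a raw TLS ClientHello, if present."""
          if len(record) < 44 or record[0] != 0x16:              # not a TLS handshake record
              return None
          pos = 5 + 4                                            # record header + handshake header
          pos += 2 + 32                                          # client version + random
          pos += 1 + record[pos]                                 # session id
          pos += 2 + int.from_bytes(record[pos:pos + 2], "big")  # cipher suites
          pos += 1 + record[pos]                                 # compression methods
          end = pos + 2 + int.from_bytes(record[pos:pos + 2], "big")
          pos += 2
          while pos + 4 <= end:                                  # walk the extension list
              ext_type = int.from_bytes(record[pos:pos + 2], "big")
              ext_len = int.from_bytes(record[pos + 2:pos + 4], "big")
              if ext_type == 0:                                  # 0 = server_name (SNI)
                  name_len = int.from_bytes(record[pos + 7:pos + 9], "big")
                  return record[pos + 9:pos + 9 + name_len].decode("ascii", "replace")
              pos += 4 + ext_len
          return None

      # Hypothetical use on a tap: keep the flow only if the name matches a watch list.
      # if (name := sni_from_client_hello(payload)) in watched_domains: record_flow()
      ```

      Encrypted Client Hello is meant to plug exactly this leak, but until it's widely deployed the hostname (and the server IP, which is never hidden) make this kind of filtering cheap.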