Everyday life on the internet is insecure. Hackers can break into bank accounts or steal digital identities. Driven by AI, attacks are becoming increasingly sophisticated. Quantum cryptography promises more effective protection. It makes communication secure against eavesdropping by relying on the laws of quantum physics. However, the path toward a quantum internet is still fraught with technical hurdles.
I think my question on all this is whether it would ultimately cause problems in terms of data integrity.
Currently, most amplifiers for digital information capture the information in the light, probably strip off any modulation to get to the raw data, and then re-modulate it using a new emitter.
The advantage of doing this over just amplifying the original light signal is the same reason switches/routers are store-and-forward (or at least decode to binary and re-modulate): when you decode the data from the modulated signal and then reproduce it, you remove any noise that was present and emit a clean signal again.
If you just amplify light (or electrical) signals as-is, you generally add noise every time you do it, reducing the SNR a small amount. After enough hops the signal becomes non-recoverable.
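To make that concrete, here's a toy back-of-the-envelope sketch (Python, with made-up numbers for the hop count, per-hop noise penalty, and decode threshold, none of which come from the article) of why the analog chain eventually dies while the regenerated chain doesn't:

    HOPS = 40
    START_SNR_DB = 20.0   # made-up SNR at the first receiver
    AMP_NOISE_DB = 0.5    # made-up per-hop SNR penalty for analog amplification
    DECODE_SNR_DB = 10.0  # made-up SNR below which the bits can't be recovered

    # Analog chain: signal and accumulated noise get amplified together, and
    # each amplifier adds a little noise of its own, so SNR only ever drops.
    snr = START_SNR_DB
    for hop in range(1, HOPS + 1):
        snr -= AMP_NOISE_DB
        if snr < DECODE_SNR_DB:
            print(f"analog: unrecoverable after {hop} hops (SNR {snr:.1f} dB)")
            break
    else:
        print(f"analog: still recoverable after {HOPS} hops (SNR {snr:.1f} dB)")

    # Regenerated (store-and-forward) chain: each hop decodes to bits and
    # re-emits a clean signal, so SNR is reset at every hop. Any number of
    # hops works as long as each individual hop stays above the threshold.
    ok = START_SNR_DB >= DECODE_SNR_DB  # every hop sees a freshly generated signal
    print("regenerated:", "clean end to end" if ok else "a single hop failed")

Which is basically the store-and-forward point from above: the regenerated chain's limit is per-hop, not end-to-end.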
So I guess my question is: does this process have the same issue, an ultimate limit on how many times you can re-transmit the signal without degradation?
The way I interpreted the article, you aren't amplifying the signal but transferring it, same as store and forward. I think that implies degradation isn't a problem as long as the new photon profile is a match. The real problem is matching the profile, which they only managed at 10 meters.
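If the match isn't perfect, my naive way to model the degradation (my own toy model, not anything from the article) is per-hop fidelity multiplying out over the chain:

    # Toy model (my assumption, not from the article): each hand-off copies
    # the photon profile with some fidelity f < 1, and the end-to-end
    # fidelity is roughly the product of the per-hop fidelities.
    def end_to_end_fidelity(per_hop_fidelity: float, hops: int) -> float:
        return per_hop_fidelity ** hops

    for f in (0.999, 0.99, 0.95):
        row = [f"{end_to_end_fidelity(f, hops):.3f}" for hops in (10, 100, 1000)]
        print(f"per-hop fidelity {f}: after 10/100/1000 hops ->", ", ".join(row))

Even 0.99 per hop is down around 0.37 after 100 hops, so how good the per-hop match is seems like the whole game.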
what about “bit rot” (but, y’know, tiny tiny bits)
Yep, those are also valid concerns. But those seem to be their next steps. I just wondered if there would be degradation. You wouldn't even be able to tell until it reached the destination.
Definitely interesting stuff.