fiber – Improvements to Gbps per wavelength over time

Closed. This question is off-topic. It is not currently accepting answers.

NE is a site to ask and provide answers about professionally managed networks in a business environment. Your question falls outside the areas our community decided are on topic. Please visit the help center for more details. If you disagree with this closure, please ask on Network Engineering Meta.

I want to build a sense of how fiber optic transceiver technology has improved over time, and what relevant physical limits constrain it in the limit.
For instance, ChatGPT tells me that in 2010 typical bandwidths were on the order of 10 Gbps per wavelength, while now I see deployed systems that achieve 100 Gbps. Is this right? Where can I find a reliable source tracking this improvement over time? Are there expectations of further improvement? What is the theoretical limit on how much more it could improve?
Complementarily, how has the number of usable DWDM channels increased over time? My understanding is that standard DWDM technology allows for simultaneous use of 80 channels. Is this right? Has this changed much over time?
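For a sense of scale, the aggregate capacity of a single fiber is roughly the per-wavelength line rate times the DWDM channel count. A minimal back-of-the-envelope sketch using the figures quoted above (the 10/100 Gbps rates and the 80-channel grid are the question's own approximations, not authoritative data):

```python
# Back-of-the-envelope fiber capacity: per-wavelength rate x channel count.
# All figures are the approximate values quoted in the question, not a spec.

CHANNELS = 80  # standard C-band DWDM grid at 50 GHz spacing (question's figure)

# Rough per-wavelength line rates by era, in Gbps (illustrative assumption)
per_wavelength_gbps = {2010: 10, 2024: 100}

for year, rate in per_wavelength_gbps.items():
    total_tbps = CHANNELS * rate / 1000  # aggregate capacity of one fiber
    print(f"{year}: {rate} Gbps/wavelength x {CHANNELS} channels = {total_tbps:.1f} Tbps")
```

Under these assumptions, a fully populated 80-channel system at 100 Gbps per wavelength carries about 8 Tbps per fiber, versus roughly 0.8 Tbps at 10 Gbps per wavelength.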
