Abstract
Chirp-induced dispersion penalties in high-bit-rate optical fiber transmission are assessed using numerical integration of laser rate equations and a Fourier-transform fiber dispersion routine. The roles of the imposed modulation waveform and laser design parameters are evaluated from computer-generated eye diagrams and simple analytical observations. Consistent with experiment, we find device-dependent optimum laser extinction ratios. In addition, we address the delicate balance between nonlinear chirp-induced dispersion penalties and the speed limitations imposed by linear current filtering on both the laser transmitter and the receiver. These considerations become increasingly important at higher bit rates such as 8 Gbit/s.
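As a rough illustration of the simulation approach summarized above, the sketch below integrates a standard single-mode form of the laser rate equations (carrier density, photon density, and optical phase, whose derivative carries the frequency chirp) for an NRZ drive current, then applies linear fiber dispersion in the frequency domain with an FFT, in the spirit of the Fourier-transform dispersion routine. This is a minimal sketch, not the paper's implementation: the particular rate-equation model (gain compression, linewidth-enhancement factor) and every parameter value (g0, tau_p, alpha_H, the drive currents, D, L) are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# --- illustrative device parameters (assumed; not taken from the paper) ---
q       = 1.602e-19   # electron charge [C]
V       = 1.5e-16     # active-region volume [m^3]
tau_n   = 2.0e-9      # carrier lifetime [s]
tau_p   = 1.5e-12     # photon lifetime [s]
g0      = 2.5e-12     # differential gain coefficient [m^3/s]
N_tr    = 1.0e24      # transparency carrier density [m^-3]
eps     = 1.0e-23     # gain-compression factor [m^3]
Gamma   = 0.3         # optical confinement factor
alpha_H = 5.0         # linewidth-enhancement factor (sets the chirp)
beta_sp = 1.0e-4      # spontaneous-emission factor

def rate_eqs(t, y, I_of_t):
    """Single-mode rate equations: carrier density N, photon density S,
    and optical phase phi; dphi/dt is the instantaneous frequency chirp."""
    N, S, phi = y
    gain = g0 * (N - N_tr) / (1.0 + eps * S)
    dN = I_of_t(t) / (q * V) - N / tau_n - gain * S
    dS = Gamma * gain * S - S / tau_p + Gamma * beta_sp * N / tau_n
    dphi = 0.5 * alpha_H * (Gamma * gain - 1.0 / tau_p)
    return dN, dS, dphi

# --- 8 Gbit/s NRZ drive current with a simple rise-time filter ---
bit_rate, dt = 8e9, 1e-12
bits = np.random.default_rng(0).integers(0, 2, 32)
n_per_bit = round(1.0 / (bit_rate * dt))               # samples per bit period
t_grid = np.arange(len(bits) * n_per_bit) * dt
I_on, I_off = 40e-3, 15e-3                             # illustrative currents [A]
i_wave = np.repeat(np.where(bits == 1, I_on, I_off), n_per_bit)
i_wave = np.convolve(i_wave, np.ones(30) / 30.0, mode="same")  # ~30 ps edges
I_of_t = lambda t: np.interp(t, t_grid, i_wave)

# Integrate from a rough initial state; the first few bits are transient.
y0 = [1.5 * N_tr, 1.0e18, 0.0]
sol = solve_ivp(rate_eqs, (t_grid[0], t_grid[-1]), y0, t_eval=t_grid,
                method="LSODA", args=(I_of_t,), rtol=1e-6)
N, S, phi = sol.y

# --- Fourier-transform fiber dispersion routine (linear, single pass) ---
E_in = np.sqrt(S) * np.exp(1j * phi)                   # chirped field envelope
w = 2.0 * np.pi * np.fft.fftfreq(E_in.size, dt)        # baseband angular freq.
D, lam, c, L = 17e-6, 1.55e-6, 3.0e8, 80e3             # 17 ps/(nm km), 80 km
beta2 = -D * lam**2 / (2.0 * np.pi * c)                # GVD parameter [s^2/m]
E_out = np.fft.ifft(np.fft.fft(E_in) * np.exp(0.5j * beta2 * w**2 * L))
P_out = np.abs(E_out) ** 2                             # received power waveform
```

Folding `P_out` modulo the bit period then yields the kind of computer-generated eye diagram used in the assessment; sweeping `I_off` against `I_on` traces out the extinction-ratio trade-off, and shortening the convolution window in `i_wave` exposes the interplay between drive-current filtering and chirp.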