The "spectrum-slicing" technique employing incoherent light has been shown to be a highly practical, low-cost and hence attractive approach for future all-optical networks. In this work, the use of Semiconductor Optical Amplifier (SOA) gain saturation to reduce the intensity noise of incoherent light is investigated, with the aim of determining the SOA injection current and input power conditions that yield the best possible intensity noise reduction, in terms of OSNR, BER, noise power and Q-factor. The results reported herein give designers knowledge of the SOA operating conditions that enhance overall system performance while still providing signal gain from the SOA.
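As background to the figures of merit quoted above, BER and Q-factor are commonly linked by the standard Gaussian-noise approximation BER = (1/2) erfc(Q / sqrt(2)). A minimal sketch of this relation (the function name `q_to_ber` is illustrative, not from the paper):

```python
from math import erfc, sqrt

def q_to_ber(q: float) -> float:
    """Bit-error rate implied by a given Q-factor under the standard
    Gaussian-noise approximation: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * erfc(q / sqrt(2))

# A Q-factor of about 6 corresponds to the commonly quoted BER of ~1e-9.
print(q_to_ber(6.0))
```

Under this approximation, reducing the intensity noise raises the Q-factor and therefore lowers the BER exponentially, which is why both metrics are tracked together.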