Impact of Inverter Loading Ratio on Solar Photovoltaic System Performance
Due to decreasing solar module prices, some solar developers are increasing their projects’ inverter loading ratio (ILR), defined as the ratio of DC module capacity to AC inverter capacity. In this study, we examine the operational impacts of this trend. Using minute-level solar data, we examine the relationship between inverter-induced clipping losses and AC generation. We find minimal clipping losses at an ILR of 1.25; at an ILR of 2.0, we observe that 16% of potential annual generation is lost. Minute-level data prove to be essential in determining the generation lost to clipping, as hourly data mask key clipping and ramping events. Higher ILRs lead to a greater share of time spent at maximum generation, but also a greater frequency and magnitude of large solar ramping events. Module degradation can attenuate the impacts of inverter clipping over time. We observe that the effective degradation rate (net of any changes to inverter clipping losses) can be as little as half the actual degradation rate for projects with high ILRs. The diurnal and seasonal trends in clipping correspond with solar insolation patterns, with the highest clipping occurring around noon. For fixed-tilt installations with tilt angles at latitude, we observe the highest clipping near the autumnal and vernal equinoxes. Increasing the tilt angle leads to more winter clipping, while lower tilt angles shift the clipping to summer months. Shifting from fixed tilt to north–south single-axis tracking increases the generation lost to clipping significantly. At an ILR of 1.25, annual clipping approximately doubles to 1% compared to fixed tilt at latitude, while clipping under an ILR of 2.0 increases to 22%, compared to 16% for the fixed configuration. As expected, more clipping occurs during the hours preceding and following noon when using single-axis tracking.
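The mechanics described above can be sketched numerically. The following Python snippet is a simplified, hypothetical illustration (not the study's actual methodology or data): it clips an idealized clear-sky minute-level DC power profile at the inverter's AC capacity for two ILRs, and shows how averaging to hourly resolution understates clipping losses when output is variable. The 2 MW-DC plant size, sinusoidal irradiance profile, and random cloud factors are all assumptions chosen for illustration, so the loss fractions it prints are far larger than the annual figures reported above.

```python
import numpy as np

def clipping_loss(dc_power_w, ac_capacity_w):
    """Fraction of potential generation lost to inverter clipping.

    dc_power_w: array of DC-side power samples (e.g., minute-level).
    ac_capacity_w: inverter AC nameplate (conversion losses ignored).
    """
    ac_power = np.minimum(dc_power_w, ac_capacity_w)
    potential = dc_power_w.sum()
    return (potential - ac_power.sum()) / potential

# Idealized clear-sky day at minute resolution: sunrise 06:00, sunset 18:00.
minutes = np.arange(24 * 60)
irradiance = np.clip(np.sin((minutes - 6 * 60) / (12 * 60) * np.pi), 0, None)

dc_capacity_w = 2_000_000  # hypothetical 2 MW-DC array
dc_power = dc_capacity_w * irradiance

losses = {}
for ilr in (1.25, 2.0):
    ac_capacity_w = dc_capacity_w / ilr
    losses[ilr] = clipping_loss(dc_power, ac_capacity_w)
    print(f"ILR {ilr}: {losses[ilr]:.1%} of the day's generation clipped")

# Why resolution matters: with cloud-driven variability, individual minutes
# exceed the inverter limit even when the hourly average does not, so
# hourly data understate the true (minute-level) clipping loss.
rng = np.random.default_rng(0)
cloudy = dc_power * rng.uniform(0.3, 1.0, size=dc_power.size)
ac_cap = dc_capacity_w / 1.25
minute_loss = clipping_loss(cloudy, ac_cap)
hourly = cloudy.reshape(24, 60).mean(axis=1)
hourly_loss = clipping_loss(hourly, ac_cap)
print(f"Cloudy day, ILR 1.25: minute-level loss {minute_loss:.1%}, "
      f"hourly-level loss {hourly_loss:.1%}")
```

Because `min(x, c)` is concave in `x`, the loss computed from time-averaged power is always less than or equal to the true loss (Jensen's inequality), which is the masking effect the abstract attributes to hourly data.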