Hot streaks are known to have a significant impact on the wall temperature distributions of first-stage turbine rotors. The experimental geometry most often employed to simulate hot streak migration is the Large-Scale Rotating Rig (LSRR) turbine model used by Butler et al. (1989) and Roback and Dring (1992). The LSRR is a large-scale, low-speed, rotating-rig wind-tunnel facility designed to simulate the flow field in an axial-flow turbine. The migration of hot streaks through the LSRR has been simulated by many researchers, including Krouthen and Giles (1988), Rai and Dring (1990), Takahashi and Ni (1990, 1991), and Dorney et al. (1992, 1993, 1996). While these numerical simulations have produced significant insights into the mechanisms controlling hot streak migration, they have (for the most part) neglected surface heat transfer, which can be important in high-pressure turbines. The design of efficient blade cooling schemes requires knowledge of both the hot streak migration path and the local heat transfer coefficients.
The focus of the present effort has been to study the combined effects of a combustor hot streak and heat transfer in a three-dimensional viscous flow environment. To this end, three-dimensional unsteady Navier-Stokes simulations have been performed for the 1-1/2 stage configuration of the LSRR turbine. The predicted aerodynamic data (time-averaged surface pressures) have been compared with the available experimental data. The time-averaged and unsteady temperature data have been used to analyze the effects of the hot streaks on the airfoil heat transfer.
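The post-processing steps mentioned above, averaging unsteady surface data and forming a local heat transfer coefficient, can be sketched as follows. This is an illustrative sketch only, not the authors' code: the function names, array layout, and the choice of reference temperature are assumptions for illustration.

```python
import numpy as np

def time_average(signal, samples_per_period, n_periods):
    """Average an unsteady surface signal over an integer number of
    blade-passing periods (axis 0 is time), as is typically done when
    reporting time-averaged surface pressures or temperatures."""
    n = samples_per_period * n_periods
    return signal[-n:].mean(axis=0)

def heat_transfer_coefficient(q_wall, t_wall, t_ref):
    """Local heat transfer coefficient h = q_w / (T_ref - T_w), where
    T_ref is a reference temperature (e.g. inlet total temperature);
    the specific definition of T_ref is an assumption here."""
    return q_wall / (t_ref - t_wall)
```

Averaging over an integer number of periods matters: truncating mid-period leaves a residual of the periodic (blade-passing) component in the "mean".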