The continual shrinking of transistor dimensions has reduced voltage noise margins and exacerbated power integrity challenges. This trend raises concerns about the accuracy of conventional static timing analysis (STA), which traditionally assumes a constant supply voltage and therefore yields imprecise, overly conservative results. To address this, this paper proposes a dynamic-noise-aware STA engine enhanced by just-in-time (JIT) machine learning (ML) integration. The approach employs the Weibull cumulative distribution function to accurately represent dynamic power supply noise (PSN). We perform gate-level characterization, measuring the delay and transition time of each timing arc under variations in input transition time, output capacitance, and three PSN-aware parameters. The timing of each arc is then predicted by a multilayer perceptron (MLP) trained on the characterization data. Finally, by incorporating JIT compilation techniques, we integrate the trained MLP models into the STA engine, achieving both computational efficiency and flexibility. Experimental results show that the proposed method accurately estimates timing fluctuations due to dynamic PSN, with average relative errors of 4.89% for single-cell estimations and 6.27% for path delay estimations.
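As a rough illustration of the Weibull-CDF idea mentioned above, the sketch below models a supply-voltage droop that recovers toward nominal VDD along a Weibull-CDF profile. The specific parameterization (shape `k`, scale `lam`, and droop magnitude `v_droop` standing in for the "three PSN-aware parameters") is an assumption for illustration only, not the paper's exact formulation.

```python
import numpy as np

def weibull_cdf(t, k, lam):
    """Weibull CDF F(t) = 1 - exp(-(t/lam)^k), defined for t >= 0."""
    t = np.asarray(t, dtype=float)
    return 1.0 - np.exp(-np.power(np.clip(t, 0.0, None) / lam, k))

def noisy_vdd(t, vdd_nom, v_droop, k, lam):
    """Hypothetical supply waveform: a droop of magnitude v_droop at t=0
    that recovers toward nominal VDD following a Weibull-CDF profile."""
    return vdd_nom - v_droop * (1.0 - weibull_cdf(t, k, lam))

# Example: a 10% droop on a 0.9 V supply recovering over a few nanoseconds.
t = np.linspace(0.0, 6.0, 7)  # time in ns
vdd = noisy_vdd(t, vdd_nom=0.9, v_droop=0.09, k=2.0, lam=1.5)
```

In such a setup, the Weibull parameters of the waveform would join input transition time and output capacitance as features for the characterization and MLP training the abstract describes.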