There are several things that contribute to phase shift in a CT:
- Frequency - as you point out.
- Primary current
- Individual variation in manufacture and installation
The device tables address all three:
As you point out, there are different specifications for 50Hz and 60Hz. Thankfully there are no common power systems that use other frequencies.
For each frequency, there is an array of shift/cutoff current entries to deal with primary current.
The table values are the average of two representative samples of the device (typically chosen after testing more units).
As you can see above, this 200A CT varies from about 0.5° down to a fairly steady 0.25° in midrange.
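A minimal sketch of how such a frequency-keyed shift/cutoff lookup might work. The table name, cutoff/shift values, and function here are all hypothetical stand-ins for illustration, not the actual device tables or firmware:

```python
# Hypothetical shift tables, keyed by line frequency (Hz).
# Each entry is (cutoff_amps, shift_degrees): the shift applies to
# primary currents up to that cutoff. Values are illustrative only.
SHIFT_TABLES = {
    50: [(5.0, 0.55), (50.0, 0.40), (200.0, 0.30)],
    60: [(5.0, 0.50), (50.0, 0.35), (200.0, 0.25)],
}

def phase_shift(primary_amps, hz):
    """Return the phase-shift correction (degrees) for a primary current."""
    table = SHIFT_TABLES[hz]
    for cutoff, shift in table:
        if primary_amps <= cutoff:
            return shift
    return table[-1][1]  # beyond the last cutoff, hold the last entry

print(phase_shift(3.0, 60))    # low current: larger shift
print(phase_shift(100.0, 60))  # midrange: fairly steady shift
```

A real implementation would likely interpolate between entries rather than step, but the step lookup shows how primary current selects the correction within each frequency's array.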
Realistically, quality larger CTs, and 0.5% CTs, have relatively low shift. Moreover, shift isn't really a big source of measurement discrepancy until the power factor gets down below, say, 0.70. If you are having accuracy issues compared to another measurement source, you might first compare VA between the two. VA is not affected by phase correction. Graph+ does not have a direct way to access VAh, but if you plot VA over any time period, the Data Statistics will show VAh.
If the VAh are within acceptable margins and kWh is not, then you might suspect phase-correction as a source of the discrepancy.
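To see why VA is immune to phase error while W is not, here is a small worked sketch (voltage, current, and the 0.5° uncorrected shift are assumed example values):

```python
import math

def power_readings(vrms, irms, pf_angle_deg, shift_error_deg):
    """Apparent power (VA) vs real power (W) under a small phase error.
    shift_error_deg models an uncorrected CT phase shift (assumed value)."""
    va = vrms * irms  # VA uses magnitudes only: phase drops out entirely
    w_true = va * math.cos(math.radians(pf_angle_deg))
    w_meas = va * math.cos(math.radians(pf_angle_deg + shift_error_deg))
    return va, w_true, w_meas

# At PF = 0.70 (about 45.6 degrees), a 0.5 degree uncorrected shift
# skews W by nearly 1%, while VA is unchanged.
angle = math.degrees(math.acos(0.70))
va, w_true, w_meas = power_readings(120.0, 10.0, angle, 0.5)
print(f"VA {va:.1f}  W true {w_true:.1f}  W measured {w_meas:.1f}")
```

Run the same numbers at PF near 1.0 and the error on W all but vanishes, which is why agreement on VAh combined with disagreement on kWh points at phase correction.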