A crude filtering algorithm was implemented in the UCoMP code to prevent the Lyot filter from reacting to noise in the temperature readings, to slow the reaction to temperature changes, and to better match the filter's thermal mass. The rationale for slowing down the temperature updates is that the heaters and temperature probes are mounted on the outer radius of the filter, while the temperature we actually want to know is closer to the optical axis in the middle of the filter. If heat flows into the filter along its cylindrical radius and out mostly through its top and bottom, then any change measured at the outer radius is a leading indicator of what the optical elements inside the filter will do.
The filtering algorithm reads a new temperature every 30 seconds, saving both Current_Temp_Read and New_Filtered_Temp into the UCoMP FITS headers:
while (True)
{
    sleep 30 seconds                                            // 30 s update cadence
    // move the old filtered value 5% of the way toward the latest raw reading
    New_Filtered_Temp = Old_Filtered_Temp - (Old_Filtered_Temp - Current_Temp_Read)*0.05
    Old_Filtered_Temp = New_Filtered_Temp                       // carry the state to the next iteration
}
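As an aside (a sketch for context, not instrument code): the update above can be rewritten as New_Filtered_Temp = 0.05*Current_Temp_Read + 0.95*Old_Filtered_Temp, so it is already an exponential moving average with a smoothing coefficient of 0.05. At the 30 s cadence that corresponds to a time constant of roughly ten minutes:

import math

alpha = 0.05   # per-update smoothing coefficient used above
dt = 30.0      # seconds between updates

# 1/e time constant of the equivalent first-order low-pass filter
tau = -dt / math.log(1.0 - alpha)
print(f"time constant ~ {tau:.0f} s")   # ~585 s, roughly 10 minutes

For a steadily drifting input, a first-order filter like this settles to a lag of about one time constant; when the drift rate changes over the day, the apparent lag and offset change with it, which matches the behavior described below.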
This filtering achieves the nominal goal of stopping the filter from reacting to noise in the readouts. It would probably perform adequately if we did not see a thermal variation over the day. However, real-world UCoMP thermal data show a slow, Gaussian-like profile with peak filter temperatures in the afternoon. The filtered temperatures sent to the Lyot filter tuning code roughly follow the actual temperatures, but there is a time lag, and the magnitude of this lag changes over time. Worse (but not shown here), the filter state resets every time the LabVIEW code is stopped and restarted, so the filtering algorithm also has a hidden input: how long it has been since the last crash (or other stop in observing).
Keeping some filtering seems important, at a minimum to keep the tuning from chasing a single noisy read and to reject high-frequency measurement noise. It is unclear whether we should build a time lag into the filtering at all, and, if we do keep the lag, how we should remove the dependence on how long the observing code has been running.
One possible replacement for the proportional filter we used is an exponential moving average (EMA) filter:
new_smoothed_value = alpha * new_measurement + (1 - alpha) * old_smoothed_value
Where alpha is a value between 0.1 and 0.3.
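A minimal sketch of what that could look like, in Python just for illustration (the real implementation would live in the LabVIEW code; read_lyot_temperature, ALPHA, and CADENCE_S are hypothetical names, and the 35.0 stand-in temperature is made up):

import random
import time

ALPHA = 0.2        # smoothing coefficient; somewhere in the suggested 0.1-0.3 range
CADENCE_S = 30     # keep the existing 30 s update cadence

def read_lyot_temperature():
    # Hypothetical stand-in for the real temperature readout; returns a
    # made-up value plus noise just so the sketch runs on its own.
    return 35.0 + random.gauss(0.0, 0.05)

# Seed the filter from the first real reading so a restart does not begin
# far from the measured value (the bad-initial-condition problem noted
# below for 20220902).
smoothed = read_lyot_temperature()

while True:
    time.sleep(CADENCE_S)
    measurement = read_lyot_temperature()
    smoothed = ALPHA * measurement + (1 - ALPHA) * smoothed
    # 'smoothed' is what would be sent to the Lyot tuning code and written
    # to the FITS headers alongside the raw measurement.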
If implemented, we would tune the Lyot filter with the green values instead of the blue values.
This should drastically improve things on days when we had funky Cropico measurements, like 20220902, where, due to a bad initial condition, the applied value was 1-2 degrees lower than the measured value for over half the observing day.
We don't know if this properly addresses the thermal inertia of the LN crystals. We still have the problem that the probe samples the outside of the crystal near the heater, while the light more or less passes through the center.
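One way to make that concern concrete (a toy model only, with made-up names and numbers): if the crystal core roughly behaves as a first-order thermal lag of the rim temperature, then the smoothing time constant should come from the measured rim-to-core lag rather than from an arbitrarily chosen alpha. A sketch, where tau_core_s is a placeholder that would have to be determined from lab data:

import math

CADENCE_S = 30.0      # seconds between temperature reads
tau_core_s = 1800.0   # placeholder: assumed rim-to-core thermal time constant

# Discrete first-order lag: the modeled core temperature relaxes toward the
# rim reading with time constant tau_core_s.  This is an EMA whose alpha is
# set by the physics instead of chosen by hand.
alpha_core = 1.0 - math.exp(-CADENCE_S / tau_core_s)

def update_core_estimate(core_estimate, rim_reading):
    # Advance the modeled core temperature by one read cadence.
    return core_estimate + alpha_core * (rim_reading - core_estimate)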