Designed and programmed a software system to analyze the Signal-to-Noise Ratio in spacecraft-to-ground RF links throughout space missions:
LEAP missions prior to LEAP3 (and some other program launches) had experienced periods when telemetry data 'dropped out', meaning that the received signal power fell below the Signal-to-Noise Ratio threshold required for error-free data transmission. When data is transmitted over a radio link, the combination of the transmit power, the distance between the spacecraft and the ground station, and the gains of the spacecraft and ground station antennas determines what fraction of the transmitted power is received at the ground station. The received power must be significantly higher than the background noise, or errors start appearing in the data.
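The following is a minimal sketch in C of the kind of link-budget arithmetic this describes, using the standard free-space (Friis) path-loss formula; the transmit power, frequency, antenna gains, and noise floor below are made-up illustrative numbers, not values from the actual missions or program.

    #include <math.h>
    #include <stdio.h>

    /* Free-space path loss in dB for a distance in km and frequency in MHz.
     * Standard Friis form: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.45 */
    static double path_loss_db(double dist_km, double freq_mhz)
    {
        return 20.0 * log10(dist_km) + 20.0 * log10(freq_mhz) + 32.45;
    }

    /* Received power in dBm: transmit power plus both antenna gains minus path loss. */
    static double received_power_dbm(double tx_dbm, double tx_gain_dbi,
                                     double rx_gain_dbi, double dist_km,
                                     double freq_mhz)
    {
        return tx_dbm + tx_gain_dbi + rx_gain_dbi - path_loss_db(dist_km, freq_mhz);
    }

    int main(void)
    {
        /* Illustrative numbers only: 10 W (40 dBm) S-band transmitter,
         * 0 dBi vehicle antenna, 30 dBi ground dish, 500 km slant range. */
        double prx = received_power_dbm(40.0, 0.0, 30.0, 500.0, 2250.0);
        double noise_floor_dbm = -120.0;   /* assumed receiver noise floor */
        printf("Received power: %.1f dBm, SNR: %.1f dB\n",
               prx, prx - noise_floor_dbm);
        return 0;
    }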
These dropouts were especially problematic because the launch range considers some of the telemetry streams mission-critical: during launch, range personnel need information about motor pressures, remaining fuel, and nozzle positions in order to verify that the rocket can be expected to continue traveling in a safe direction. If the range Safety Officer determines there is a danger of the rocket going off course, they can send a radio signal to destroy the rocket, ending the mission.
The Director of Electrical Engineering at OSC tasked me with determining why these data dropouts were occurring and what we could do to eliminate them. I first investigated a number of possible hardware failures, but none of them turned out to be the cause of the data issues.
I and other RF engineers at OSC suspected that the dropouts occurred because the antennas on the rockets were not perfect. In an ideal world, the RF energy from the rocket's transmitters would radiate in a perfectly even sphere, equally in all directions, but real-world antennas do not behave this way. They have 'nulls': directions in which less (and sometimes much less) energy is radiated.
There is always a deep null along the rocket's axis because of its metallic skin, so the real antenna pattern looks like a 'donut', with the strongest radiation directly perpendicular to the axis. Even in that perpendicular plane, however, the gain varies as you move around the rocket's circumference.
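To illustrate that 'donut' shape, here is an idealized dipole-like pattern in C showing how the gain falls off toward the null along the axis; the actual analysis used a measured antenna pattern map rather than a closed-form model like this, so this is only a sketch of the general behavior.

    #include <math.h>
    #include <stdio.h>

    /* Idealized "donut" pattern: gain peaks broadside to the rocket axis and
     * nulls along it, like a dipole. theta is the angle off the rocket's long
     * axis in radians. The real program stored a measured pattern map rather
     * than a formula; this only shows the shape. */
    static double donut_gain_dbi(double theta)
    {
        double s = fabs(sin(theta));   /* 0 on axis, 1 broadside */
        if (s < 1e-6)
            s = 1e-6;                  /* clamp to avoid log10(0) in the null */
        return 2.15 + 20.0 * log10(s); /* ~2.15 dBi peak, deep null on axis */
    }

    int main(void)
    {
        const double pi = acos(-1.0);
        for (int deg = 0; deg <= 90; deg += 15)
            printf("%3d deg off axis: %7.1f dBi\n",
                   deg, donut_gain_dbi(deg * pi / 180.0));
        return 0;
    }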
We suspected that at various times during a mission, the rocket might be oriented so that the line of sight to a particular ground station passed through one of these nulls in the vehicle's antenna pattern.
To investigate this, I wrote a program in C that accepted the mission's trajectory file, which gives the rocket's position and rotational orientation for every second of the mission, along with the location of a particular ground station. The software also stored an accurate 'map' of the vehicle's antenna pattern. Using detailed vector and coordinate-transformation calculations, it determined the direction in which the ground station appeared in the vehicle's coordinate system, looked up the corresponding antenna gain in the vehicle's antenna pattern, and combined that gain with the distance between the vehicle and the ground station to calculate the expected Signal-to-Noise Ratio in the ground station receiver.
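Below is a simplified sketch in C of that per-second calculation. The rotation convention, function names, coordinate frames, and all numeric values are illustrative assumptions rather than details of the actual program, and the idealized donut pattern from the earlier sketch stands in for the real measured pattern map.

    #include <math.h>
    #include <stdio.h>

    typedef struct { double x, y, z; } Vec3;

    /* Rotate an Earth-frame vector into the vehicle body frame with a
     * direction cosine matrix built from the trajectory's roll/pitch/yaw.
     * The attitude convention in the real trajectory file may differ;
     * this 3-2-1 (yaw-pitch-roll) sequence is an assumption. */
    static Vec3 to_body_frame(Vec3 v, double roll, double pitch, double yaw)
    {
        double cr = cos(roll),  sr = sin(roll);
        double cp = cos(pitch), sp = sin(pitch);
        double cy = cos(yaw),   sy = sin(yaw);
        double m[3][3] = {                    /* rows of the Earth-to-body DCM */
            { cp * cy,                cp * sy,               -sp      },
            { sr * sp * cy - cr * sy, sr * sp * sy + cr * cy, sr * cp },
            { cr * sp * cy + sr * sy, cr * sp * sy - sr * cy, cr * cp }
        };
        Vec3 b = { m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z,
                   m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z,
                   m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z };
        return b;
    }

    /* Stand-in for the measured antenna-pattern map; the real program looked
     * up a table indexed by look angles. Here: idealized donut, no phi term. */
    static double pattern_gain_dbi(double theta, double phi)
    {
        (void)phi;
        double s = fabs(sin(theta));
        return 2.15 + 20.0 * log10(s < 1e-6 ? 1e-6 : s);
    }

    /* Expected SNR at the ground station for one trajectory time step. */
    static double snr_db(Vec3 veh_km, Vec3 sta_km,
                         double roll, double pitch, double yaw,
                         double tx_dbm, double rx_gain_dbi,
                         double freq_mhz, double noise_floor_dbm)
    {
        /* Line-of-sight vector from vehicle to station in the Earth frame. */
        Vec3 los = { sta_km.x - veh_km.x, sta_km.y - veh_km.y, sta_km.z - veh_km.z };
        double range_km = sqrt(los.x*los.x + los.y*los.y + los.z*los.z);

        /* Same vector expressed in the vehicle's body frame, then as the look
         * angles that index the antenna pattern: theta off the long (x) axis,
         * phi around it. */
        Vec3 b = to_body_frame(los, roll, pitch, yaw);
        double theta = acos(b.x / range_km);
        double phi   = atan2(b.z, b.y);

        /* Link budget: transmit power + both gains - free-space path loss. */
        double fspl = 20.0 * log10(range_km) + 20.0 * log10(freq_mhz) + 32.45;
        double prx  = tx_dbm + pattern_gain_dbi(theta, phi) + rx_gain_dbi - fspl;
        return prx - noise_floor_dbm;
    }

    int main(void)
    {
        /* One illustrative trajectory point: vehicle 400 km up, station 300 km
         * downrange; all numbers are made up for demonstration. */
        Vec3 veh = { 0.0, 0.0, 400.0 }, sta = { 300.0, 0.0, 0.0 };
        printf("Predicted SNR: %.1f dB\n",
               snr_db(veh, sta, 0.0, 0.3, 0.0, 40.0, 30.0, 2250.0, -120.0));
        return 0;
    }

In the actual program, this calculation was repeated for every one-second trajectory point to produce a time history of predicted Signal-to-Noise Ratio for each ground station.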
Once I had finished developing this software, I tested it against previous LEAP mission data and found that it accurately predicted the periods when the Signal-to-Noise Ratio would be low, and that those periods corresponded to the times when the data dropouts had occurred.
We then used this software to predict when we might have problems on the LEAP3 mission and were able to switch to a secondary ground station location to receive data during those times, avoiding the problem.