Processing and Quality Control (QC) of the ADCP current data was done following QARTOD
recommendations. The following are the methods used to achieve the final, processed result.
The processing was done using a Matlab script written by Cody Benton titled "ADCP_Processing.m".
The script and more information can be found on GitHub: https://github.com/Cody-Benton/ADCP_Processing

The raw binary ADCP data is imported into Matlab using the function "rdradcp" written
by R. Pawlowicz. If the ADCP was set to measure waves, RDI's WavesMon software
must be run first, and the resulting .PD0 file is the one imported into Matlab.
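
As a rough sketch of the import step (the file name and path below are placeholders, and
the exact fields of the returned structure depend on the rdradcp version):

    addpath('path/to/rdradcp');                       % R. Pawlowicz's RDI binary reader
    [adcp, cfg] = rdradcp('OB27_deployment.000');     % raw binary (or WavesMon-produced .PD0) file
    % adcp is a structure of ensembles; fields such as adcp.east_vel, adcp.north_vel,
    % adcp.depth, adcp.heading, adcp.corr, and adcp.intens are used in the steps below.
    % cfg holds configuration information such as the bin ranges (cfg.ranges).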

The final data set has had pre- and post-deployment data, faulty pressure measurements, and
extra fields removed.

The pre- and post-deployment data are found and removed using several methods.
The first method looks for measured depths of less than 5m in records whose mean depth is
greater than the deployed depth. These shallow depths correspond to data collected while the
ADCP is being deployed or retrieved; the intervals vary but tend to occur every few months.
The mean depth must be greater than the deployed depth because a failing pressure sensor could
read any value as it drifts over time; without that check, the entire data set would be thrown
out if, say, the sensor read 3m the whole time. So we first confirm that the pressure sensor is
recording a depth reasonably close to the actual depth before screening the values that
correspond to depths of less than 5m.
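
A minimal sketch of this check in Matlab might look like the following (deployed_depth is a
placeholder for the nominal mooring depth, and the field names follow the structure returned
by rdradcp):

    deployed_depth = 27;                              % nominal depth in metres (placeholder value)
    if mean(adcp.depth, 'omitnan') > deployed_depth   % only trust the record if its mean depth is sensible
        in_air = adcp.depth < 5;                      % ensembles shallower than 5m: deployment or recovery
        adcp.east_vel(:, in_air)  = NaN;              % screen those ensembles
        adcp.north_vel(:, in_air) = NaN;
    end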
The second method for finding the pre- and post-deployment data looks for an echo intensity
of less than 90 decibels at the surface. The sea surface produces the highest echo intensity
values, so a low echo intensity there most likely indicates the ADCP is pinging in air rather
than water, since echo intensity is lower in air. A threshold of 90 decibels was chosen
because it is significantly below the average surface value.
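
A sketch of this check, assuming the bin nearest the sea surface has already been identified
(surface_bin is a placeholder index) and that adcp.intens is arranged as bins x beams x ensembles:

    % surface_bin: placeholder index of the bin nearest the sea surface
    surf_ei = squeeze(mean(adcp.intens(surface_bin, :, :), 2));  % beam-averaged echo intensity at that bin
    in_air  = surf_ei(:)' < 90;                                  % below 90 dB: likely pinging in air
    adcp.east_vel(:, in_air)  = NaN;
    adcp.north_vel(:, in_air) = NaN;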
The next step is to manually determine the pre- and post-deployment data by visually
inspecting it. Section 4 of the ADCP_Processing code creates an image of the instrument's
heading for the first 10 ensembles and the last 10 ensembles. If the heading appears to be
unstable near the beginning or end of the time series, the data is screened. The echo
intensity at the beginning and end of the time series is also screened in a similar manner.
Although some pressure data was screened at the beginning, there could still be erroneous
pressure measurements due to a failing or compromised sensor, so the pressure data is plotted
and checked for validity; if it appears to be wrong or drifts over a deployment, it is deleted
from the data set. After this step the pre- and post-deployment data should be completely
screened, and we are ready to apply the QC measures.
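
The visual checks described above can be reproduced with a few plotting commands; the sketch
below is illustrative and does not match the figures produced by Section 4 of ADCP_Processing.m
exactly:

    n = numel(adcp.heading);
    figure;
    subplot(2,1,1); plot(1:10, adcp.heading(1:10), 'o-');
    title('Heading, first 10 ensembles'); xlabel('Ensemble'); ylabel('Heading (deg)');
    subplot(2,1,2); plot(n-9:n, adcp.heading(n-9:n), 'o-');
    title('Heading, last 10 ensembles'); xlabel('Ensemble'); ylabel('Heading (deg)');

    figure;                                            % pressure record for the drift/validity check
    plot(adcp.mtime, adcp.pressure); datetick('x');
    xlabel('Time'); ylabel('Pressure (dbar)');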

Three QC tests are applied to the data: a velocity error test, a correlation magnitude test,
and an echo intensity test. These tests were recommended by documentation from the Quality
Assurance/Quality Control of Real-Time Oceanographic Data (QARTOD) program. Since OB27 is not
a real-time mooring, not all QARTOD tests are applicable. A presentation on ADCP QC by
Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO) was also
helpful in developing these tests. The thresholds for these tests will vary depending on the
region where the ADCP is deployed.

The velocity error test screens any data with an error velocity greater than the threshold of
0.05m/s. This value screened roughly 75% of the values above the surface, which are known to
be bad, while keeping the vast majority of the values in the water column. Because multiple QC
tests were being applied, a less stringent threshold could be used to keep as many good values
as possible. The second test is a correlation magnitude test, with the threshold set to 110.
This threshold screened about 80% of the data at the surface boundary, approaching 100%
farther above the surface. The final QC test uses echo intensity. The echo intensity for each
beam is used, and the difference between adjacent bins is computed. If the echo intensity
increases by more than 10 decibels from one bin to the next, all the data above that point is
screened. This resulted in nearly 100% of the data at and above the surface being screened.
This test is not applied to the bottom half of the water column, to prevent good data near the
bottom from being screened; sediment in the lower part of the water column can cause a sudden
increase in echo intensity. When all three tests are applied we get a good data set with
erroneous data removed.
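
A condensed sketch of the three tests is given below. The thresholds are those quoted above;
whether the correlation test is applied per beam or to the beam average is an implementation
detail, and the halfway index used to limit the echo intensity test is only illustrative:

    bad = false(size(adcp.east_vel));                  % bins x ensembles flag array

    % 1. Velocity error test: error velocity above 0.05 m/s
    bad = bad | abs(adcp.error_vel) > 0.05;

    % 2. Correlation magnitude test: beam-averaged correlation below 110
    bad = bad | squeeze(mean(adcp.corr, 2)) < 110;

    % 3. Echo intensity test (upper half of the water column only): a jump of more
    %    than 10 dB between adjacent bins in any beam screens everything above it
    [nbins, nbeams, nens] = size(adcp.intens);
    half = ceil(nbins / 2);
    for k = 1:nens
        jump = any(diff(adcp.intens(half:end, :, k), 1, 1) > 10, 2);
        first_jump = find(jump, 1, 'first');
        if ~isempty(first_jump)
            bad(half + first_jump:end, k) = true;
        end
    end

    adcp.east_vel(bad)  = NaN;
    adcp.north_vel(bad) = NaN;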


In the final step, the mean depth from pressure or from echo intensity is used to screen any
values above the surface that may have passed the QC tests. When there is no valid pressure
data, echo intensity is used to calculate a conservative estimate of the surface: the first
bin where the echo intensity test fails is taken to be the surface. This estimate tends to be
1-2m below the actual surface.
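
A minimal sketch of this last step, assuming an upward-looking instrument, cfg.ranges from
rdradcp (distance of each bin from the transducer), and ei_fail, a bins x ensembles logical
array recording where the echo intensity test failed; how the processing code picks a
representative first-failing bin is a detail not spelled out here:

    if any(isfinite(adcp.depth))
        surface_range = mean(adcp.depth, 'omitnan');   % mean instrument depth from pressure
    else
        % no valid pressure: use a typical first bin failing the echo intensity
        % test as a conservative surface estimate (usually 1-2m below the true surface)
        first_fail = nan(1, size(ei_fail, 2));
        for k = 1:size(ei_fail, 2)
            idx = find(ei_fail(:, k), 1, 'first');
            if ~isempty(idx), first_fail(k) = idx; end
        end
        surface_range = cfg.ranges(round(median(first_fail(~isnan(first_fail)))));
    end
    above = cfg.ranges(:) >= surface_range;            % bins at or above the estimated surface
    adcp.east_vel(above, :)  = NaN;
    adcp.north_vel(above, :) = NaN;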