Initial Set-Up and Calibration
for the MSA

Updated Aug. 11, 2017,  Update for MSA Software Version 118 Rev 0. Note: Cypress USB must be re-programmed.
Updated June 9, 2016,  Update Frequency and Path Calibration
Updated Mar. 3, 2014,  Notice for Win7 and later for Liberty Basic Users
Special Notice for XP and later users,  Possible XP Problems (at end of this page)

    This page will describe the Initial Set-Up for a new MSA and the procedures to Calibrate it.  The MSA Software can be downloaded from this page.  The procedures apply to both the Original MSA and the SLIM MSA.  Separate pages will describe the Operation of the MSA for its different Functions.  Before you begin this Initial Set-Up and Calibration, I suggest you read and become familiar with  MSA Control and Operation.

I.  Initial Set-Up Procedure for the MSA
    Before the MSA can become operational, these procedures must be followed:
    A.  MSA Software Download and Installation
            Initial Software Installation
            MSA Software Updates
            Reverting to Original Software
    B.  Hardware Configuration
            MSA Hardware Configuration
            MSA Hardware Adjustments
            Computer Interface Configuration
                For USB Operation
                For LPT Operation
    C.  MSA Program Configuration
            Run the MSA Program
            Configuration Manager Window
            Files created by the Software
    D.  Coarse Calibration using the Calibration File Manager

II.  Calibration Procedures for the MSA
    For the MSA to be accurate, calibrations must be performed, in this order:
    A.  Coaxial Cavity Filter, Tuning Procedure
    B.  Master Oscillator Calibration
    C.  Resolving Filters in the MSA
            Resolving the Final Crystal Filters for each Path
            Resolving the DDS Crystal Filters (in-work)
    D.  Phase Detector Module Calibration (VNA only)
    E.  Path Calibration for Magnitude (and Phase for VNA)
    F.  Frequency Calibration for Magnitude
        F1.  Manual Frequency Calibration for the Basic MSA, or
        F2.  Semi-Automatic Frequency Calibration for the MSA/TG


I.  Initial Set-Up for MSA
A.  MSA Software Download and Installation
    The MSA software is written in Liberty Basic and, as the name implies, is code written in Basic.  I won't get into the total capabilities of Liberty Basic; you can get their application software and information at their website.  Find it at http://www.libertybasic.com.  It is free, but will continually "Nag" you to buy it.  If you do download Liberty Basic, it is important to place it directly under the C: drive, i.e. C:\Liberty Basic.  In earlier Windows versions, the folder was placed into C:\Program Files\Liberty Basic.
    The MSA software is released as a "spectrumanalyzer.bas" file and as a "spectrumanalyzer.tkn" file.  "spectrumanalyzer.bas" is opened, manipulated, and run using the Liberty Basic Application Software, version 4.03 or newer.  "spectrumanalyzer.tkn" is a "tokenized" version and is executed without having to download Liberty Basic.  If upgrading MSA software from a previous version, skip the following paragraph and go to MSA Software Updates.

Initial Software Installation
    For a fresh installation on a computer, download the file MSA_Software.zip and temporarily place it on your desktop. This is a zipped folder containing all of the files required to support the MSA. Unzip it and place the unzipped folder in a convenient location; I prefer mine to be on the desktop. The .zip file can now be deleted. Read the instructions in the ReadMe117.txt file for the full software installation procedure.

MSA Software Updates
    The following are the latest versions and are included in the folder MSA_Software.zip. They only need downloading if upgrading from a previous version. Download them to your MSA_Software folder and replace your previous versions with the latest.
spectrumanalyzer.bas   (Version 118 Rev 0, 8-11-17)  This is the MSA source code, written in Liberty Basic, version 4.03.  It can be viewed with any text program on your computer, such as Word or Notepad.  It will "Run" with the Liberty Basic application, version 4.03, or higher.
spectrumanalyzer.tkn   (Version 118 Rev 0, 8-11-17)  The MSA source code, "tokenized". The name of this file is important. When replacing a previous version of spectrumanalyzer.tkn, delete or change the name of the previous version before placing this new file into the MSA_Software folder.
msa-fw-cb-v33.hex (Version 33, 8-6-17)  This is the firmware installed into the EEPROM on the Cypress USB Board. After downloading, go to the USB Converter page and follow the instructions at step 6. Updating EEPROM Firmware.
ReadMe118.txt        A "ReadMe" text file explaining update changes.

Reverting to Original Software
    It is possible, although unlikely, that you could manipulate the software configurations into a condition where your MSA does not operate. You can always revert to the original software and start over.  Do this by finding and opening the "MSA_Software" folder.  Click and highlight the folder named "MSA_Info".  Either delete this folder (send it to the computer's "trash can") or change its name to "errorMSA_Info".  Then run spectrumanalyzer.exe (or spectrumanalyzer.bas for Liberty users). The folder named "MSA_Info" will be re-created. It is as though you are starting with a fresh MSA with default topology. Previous versions of MSA Software are available on the MSA Archives Page (in-work).

Special Note:  MSA Software may have some bugs. As they are found and fixed, the software will be re-released as a later version or with a Rev update. If you ever have a problem with your MSA software, return to this page and see if a more recent "Rev" might cure your problem. If it doesn't, email me a "Bug Report" at:   wsprowls (at) yahoo.com.

B.  Hardware Configuration
MSA Hardware Configuration
The MSA Program default assumes the SLIM MSA is initially configured for 1G Band Operation:
  Verify that Mixer 1 output is connected to the Coaxial Cavity Filter.
  Verify that Mixer 2 output is connected to the I.F. Amplifier.
  If installed, verify that Mixer 3 output is the TG output (Tracking Generator Option).

MSA Hardware Adjustments
The SLIM MSA has only one mechanical adjustment, tuning of the Cavity Filter. The initial positions of the tuning screws are not important.
The SLIM Modules used in the MSA have no adjustments.

Computer Interface Configuration
    The MSA can be controlled by either the Parallel LPT Port or by USB, using the Cypress FX2.
For USB Operation
    If using the Cypress USB interface, go to the USB Conversion web page and follow the instructions for installing the software. This must be done before attempting to run the MSA Program.
For LPT Operation
    If using the computer's LPT (Parallel Printer Port) the computer must be updated with two files, "NTport.dll" and "Zntport.sys". These files are in the folder called "Redist", which is in the main folder MSA_Software. Open "Redist" and run the file called "NTPortDrvSetup.exe". This will install the two files into the proper locations in your computer. This must be done before attempting to run the MSA Program. You may not have to make any other configuration changes to your computer for proper operation of the MSA. However, if your computer has an extremely high speed processor or its PCI bus speed is very fast, you may have to re-configure your LPT port via the computer's BIOS. If your LPT is an add-on card, it may not show up in the BIOS.
    Entering your computer's BIOS is normally done from a cold start. I have to press the "DEL" key while the computer is booting. Find your configuration for the LPT Port. Depending on your computer, your choices will be Normal, SPP, ECP, EPP, Bi-Directional, EPP+ECP, etc. For a moderate to slow computer, any of these modes should work for MSA. For high speed computers, select EPP. Save, and allow the computer to continue its boot. All Microsoft Windows Platforms support the MSA LPT interface, but for Win XP (and newer) Parallel Port operation is weird, see the XP Problem at the end of this page.

C.  MSA Program Configuration
Run the MSA Program
    Configure the MSA for 1G Band Operation. Assure there is no input signal to the MSA, just leave the MSA Input unterminated. Select the Magnitude Video switch to Wide (if manual switch is used). Connect either the computer LPT or the Cypress USB to the MSA Control Board. Apply power to the MSA.
   
Open the MSA_Software folder. For Liberty users, run spectrumanalyzer.bas. For Non-Liberty users, run spectrumanalyzer.exe. This is a Runtime Engine created by Liberty Basic. When it is "Run" it looks for a file called "spectrumanalyzer.tkn" and runs it. When either is run for the first time, the Configuration Manager Window will open automatically. Subsequently, it will be opened from the Graph Menu Item, "Setup". The variables that are initially in place are defaults for a SLIM MSA with Tracking Generator and VNA. The user is able to change the variable values in the Configuration Manager window to match the topology of the user's MSA. They can be modified at any time.

Configuration Manager Window
[Screenshot: msascreens/confgmgrdefault.gif]

   
The values shown in this Configuration Manager Window are the defaults for the SLIM MSA with Tracking Generator and VNA.  You will now change the values in this Window to match the topology of your particular MSA.
    If you have followed a standard build for the SLIM MSA using SLIM modules, you need only to:
Click "Delete VNA" if you installed the TG but did not install the Phase Detector Module.
Click "Delete TG" if you did not install the tracking generator feature.

The Buttons in the upper right quadrant of the Configuration Manager Window have these functions:
Set to SLIM Defaults - This will insert all SLIM defaults into the Configuration Manager Window. Upon Initial Set-Up, the default values are already inserted. But, if the Configuration Manager Window is opened after the Initial Set-Up, it is a quick way to change all the values back to SLIM defaults.
Re-Load File - This will read the config.txt file and enter its values into the Configuration Manager Window.
Delete TG - This will delete all options in the Configuration Manager Window that pertain to a Tracking Generator.  Click this if you do not have the Tracking Generator installed.
Delete VNA - This will delete all options in the Configuration Manager Window that pertain to the Vector Network Analyzer.  Click this if you do not have the VNA installed.
Help - This will open a window for more explanations. I will add more to this window.
Save Configuration - This button will save the entries into the config.txt file, closing the Configuration Manager Window and returning to the main program.
Return to MSA Without Saving - This button does not appear on the Initial Set-Up.  Subsequently, this will just close the Configuration Manager Window and return to the main program without saving the changed entries.

Change any of the following variables to match the configuration of your MSA. SLIM MSA defaults are underlined.

PLL 1
PLL Type - Select for LMX2325, LMX 2326, ADF 4118, LMX 2350, LMX 2353, ADF 4112,4113.  "0" is an option reserved for future use.
PLL Loop Filter - For non-inverting loop filter, use 1(non-inv). For inverting op amp, use 0(invert)
PLL Phase Detector Freq (Mhz) - The value will determine the approximate Reference phase detector frequency of PLL 1. If the DDS 1 Center Freq is 10.7 and if DDS 1 Bandwidth is greater than .010 (MHz), then enter .974. For other topologies, this number can get rather involved. There are several factors that determine the value to be used. It can be determined by using the following formula: PLL1 Reference = (VCO 1 minimum frequency) x (DDS 1 Bandwidth)/(DDS 1 Center Freq). But, it cannot be greater than 1.02 (MHz).
Integer N or Fractional N Counter - 0(Integer) or 1(Fract)ional Mode.  Use Fractional Mode only when using Fractional N type PLL's (LMX 2350, 2353).  Even if using a Fractional N PLL, I recommend using the Integer Mode for the MSA.  It is less noisy.
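
The PLL 1 Reference formula above can be sketched as a short calculation. This is an illustration only: the Python helper is my own, and the VCO 1 minimum frequency of roughly 1013 MHz is an assumed value for the example, not one taken from this page.

```python
def pll1_reference(vco1_min_mhz, dds1_bw_mhz, dds1_center_mhz):
    """PLL1 Reference = (VCO 1 minimum freq) x (DDS 1 Bandwidth) / (DDS 1 Center Freq),
    capped at the 1.02 MHz maximum stated in the text."""
    ref = vco1_min_mhz * dds1_bw_mhz / dds1_center_mhz
    return min(ref, 1.02)

# Assumed VCO 1 minimum of 1013.3 MHz with a 10.7 MHz DDS 1 filter:
print(round(pll1_reference(1013.3, 0.0103, 10.7), 3))  # close to the .974 default
print(pll1_reference(1013.3, 0.0150, 10.7))            # raw value exceeds 1.02, so capped
```

Note that a wider DDS 1 filter bandwidth pushes the raw result over the 1.02 MHz ceiling, which is why the cap is part of the rule.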
PLL 2
PLL Type - Select for LMX2325, LMX 2326, ADF 4118, LMX 2350, LMX 2353, ADF 4112,4113. The selection of "0" is reserved for topologies that use a frequency multiplier scheme to replace PLL 2.
PLL Loop Filter - For non-inverting loop filter, use 1(non-inv). For inverting op amp, use 0(invert)
PLL Phase Detector Freq (Mhz) - The value will determine the approximate Reference phase detector frequency of PLL 2. Use 4 . Other values can be used for experimentation. I suggest you contact me for more information.  This value can get extremely involved.
PLL 3
PLL Type - Select for LMX2325, LMX 2326, ADF 4118, LMX 2350, LMX 2353, ADF 4112,4113. Select "0" (zero) for no Tracking Generator.  Better yet, for no Tracking Generator, click the "Delete TG" button.
PLL Loop Filter - For non-inverting loop filter, use 1(non-inv). For inverting op amp, use 0(invert)
PLL Phase Detector Freq (Mhz) - The value will determine the approximate Reference phase detector frequency of PLL 3. Use .974. If PLL 3 is being steered by DDS 3 then use the same rules as for PLL 1 Reference. If PLL 3 is a fixed frequency, as used in the Original MSA with Original Tracking Generator, I suggest you contact me for more information. This value can get extremely involved.
Integer N or Fractional N Counter - 0(Integer) or 1(Fract)ional Mode.  Use Fractional Mode only when using Fractional N type PLL's. (LMX 2350, 2353).  Even if using a Fractional N PLL, I recommend using the Integer Mode for the MSA.  It is less noisy.
DDS 1
DDS Filter Center Freq (MHz) - The value (in MHz) is the center frequency of the DDS 1 crystal filter. 10.7
DDS Filter Bandwidth (MHz) -  The value (in MHz) is the bandwidth of the DDS 1 crystal filter. .015
DDS 1 Parser - Select the command mode for DDS 1, (serial) or (parallel)
DDS 3
DDS Filter Center Freq (MHz) - The value (in MHz) is the center frequency of the DDS 3 crystal filter. 10.7
DDS Filter Bandwidth (MHz) -  The value (in MHz) is the bandwidth of the DDS 3 crystal filter. .015
LO 2 (MHz) - This is the fixed frequency of Local Oscillator 2. By default, it is 1024.  If PLL 2 is used in the Original MSA with Original Tracking Generator (PLL3 is a fixed frequency), I suggest you contact me for more information. This value can get extremely involved.  If PLL 2 is replaced with a multiplier scheme (PLL2 = 0), this value needs to be a whole number multiple of the Master Clock nominal value.
Mast Clock (MHz) - Enter the exact frequency of the Master Oscillator (in MHz). If the Master Oscillator Module is adjustable, enter the oscillator's nominal value (64.0 in this case).  If it is not adjustable, enter the actual frequency that the clock is creating (in MHz).  If you are not sure, enter the nominal value of the Master Oscillator; you can change this value during calibration. My Mast Clock value is 63.9995093 (a 0.3 Hz resolution).
Auto Switches - Prior to version 118, these switches were selected if electronic switching was configured into the MSA hardware. Now, the MSA Program will send the commands even if switches are not configured. There is no need to check any of them. The next revision will remove the check boxes.
Video Filters (uF) - Enter into each box, the value of the capacitor (in microFarads) used in the Video Filter switching. The defaults shown are values used in the Video Filter Switch, designed by Sam Wetterlin. If your MSA build uses mechanical switches, use the appropriate values. If only three switch positions are used, as in the recommended MSA design, the values for Mid are 0.1 for both Mag and Phase.
The values for Narrow are 10 for both Mag and Phase. There is no fourth position, XNarrow, so enter "0" for both Mag and Phase.
Max PDM out - VNA only. Nominally, 65535.  This is the Bit Count output of the Phase Analog to Digital Converter when the Phase Detector Module is reading 360 degrees. For the SLIM Phase Detector Module and SLIM AtoD Module, this value is the same for the 16 Bit or the 12 Bit. For the Original MSA using the Original AtoD this value is adjustable, and is determined during calibration (65535 for 16 bit, 4095 for 12 bit, or 255 for the 8 bit). If the VNA is not installed, click the "Delete VNA" button.
Inv Deg - VNA only. Inversion in Degrees.  This is the actual amount of phase change when the Phase Detector Module has its state changed from Normal to Inverted.  A perfect PDM would have a 180 phase change.  The actual value is determined during the PDM calibration.
ADC / Mux - select:
    8 ladder, no mux          For Original MSA using 8 bit parallel ADC
    12 ladder, no mux        For Original MSA using 12 bit parallel ADC
    Dual ser. no Mux         For SLIM MSA using either one or two 12 bit or 16 bit ADC's. (Default, and what most MSA users have)
    Dual ser. w/Mux          For SLIM MSA using the SLIM-ADCMUX for Magnitude, and a 12 bit or 16 bit ADC for Phase
    Single ser. w/Mux        For SLIM MSA using SLIM-ADCMUX for both Magnitude and Phase

TG Topology - Select
    0(None)                    no Tracking Generator is installed.
    1(orig)                      Tracking Generator but no VNA capability.
    2(DDS3/PLL3)         Tracking Generator or VNA configuration (default)
Control Board - Select
    0(Orig CB-LPT)         Original Control Board with LPT, Parallel Port interface only
    1(Future Use)             Reserved for any future Interface to the Control Board (such as RasPi or Beaglebone)
    2(SLIM CB-LPT)      SLIM Control Board using LPT, Parallel Port interface (default)
    3(SLIM CB-USB)      SLIM Control Board using Cypress USB Interface
LPT Port Address - Hex 378.  If your computer does not command the MSA, this may be the problem and needs to be changed. The standard home computer LPT address is Hex 378. Plug-in parallel cards will likely be different. If this value is changed, highlight only the hex value (378) and modify it. My computer is using a plug-in card with an address, Hex E800. To find your computer's LPT Address, click "How do I find out?"
List your final filters:  This is a table listing the Resolution filters that are installed in the MSA. The default is a single filter (Path 1) with frequency 10.7  (MHz) and bandwidth 15 (KHz).

To change this default to match your Path 1 final filter, highlight the data by clicking it with the Mouse. Four Buttons will appear: AddPrior, AddAfter, Delete, and Replace. Enter the correct data for your Path 1 filter in the Freq(MHz) box and BW(KHz) box. Then click the Replace button. If you use more than one Resolution filter, enter the correct data for your next filter in the Freq(MHz) box and BW(KHz) box. Then click the AddPrior or AddAfter button.  This will become Path 2. Repeat this process for adding filters. You can have up to 15 lines of filter values, Paths 1 through 15. Note: The MSA software can electronically switch only Paths 1 through 4. If a higher Path is selected, Path 4 will be electronically commanded but the higher Path's values will be used in calculations.
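
The Path-switching note above amounts to a simple rule, sketched here in Python for illustration (the function name is my own, not from the MSA software):

```python
def commanded_path(selected_path):
    """Paths 1-4 can be electronically switched; for Paths 5-15 the
    software commands Path 4 but still uses the selected Path's own
    filter values in its calculations."""
    if not 1 <= selected_path <= 15:
        raise ValueError("Paths run 1 through 15")
    return min(selected_path, 4)

print(commanded_path(2))   # Path 2: switched directly
print(commanded_path(7))   # Path 4 commanded; Path 7 values used in math
```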
    After the configuration values are entered, click the Button called "Save Configuration".  The Configuration Manager Window will close, and the Working Window and Graph Window will open. The MSA will begin sweeping in the Spectrum Analyzer Mode.

    The following graph of the Magnitude response is what you would get if your MSA was Calibrated.
[Screenshot: msascreens/graphrun1.gif]
The Magnitude response is the actual response of the Final Crystal Filter (Resolution Filter) centered at Zero MHz. The actual peak Magnitude will vary among different MSA's due to the variation of losses of mixers and filters. Expect this "Zero Response" Magnitude to be about -30 dBm, +/- 10 dBm.
    NOTE: If your graph initially shows a trace response and then "goes away", you probably have the notorious XP Problem. Prove this by clicking "Halt" and then clicking "Restart". The trace response will return, and then "go away" again. Halt here and fix your computer by following the advice in the paragraph, XP Problems (at end of this page).

Below is a more realistic display of a first-time sweep.
[Screenshot: msascreens/graphwinuncal.gif]

    This graph of the Magnitude response is more likely what you will see during the Initial Set-Up. The cavity filter is not tuned (low amplitude), the Master Oscillator is not calibrated, and the Final Crystal Filter (Resolution Filter) is not exactly as planned (note that the center frequency of the response is not at Zero MHz). The power measurements are certain to be incorrect, since no magnitude calibrations have been performed. We will perform those critical calibrations in the Calibration section. After the next paragraph, you will perform a Coarse Calibration by installing values in the Calibration File Manager Window. This will assure a Magnitude response for a "freshly built" MSA.
    As I stated earlier, this page is written with the assumption that the MSA is in working condition. If your Graph response is "not even close" to either of these shown, you may have a problem. But, do not be dismayed until after the Coarse Calibration.

Files created by the Software
When "spectrumanalyzer.bas" (or .tkn) is run for the first time, several background operations will be performed by the software.  A new folder called "MSA_Info" will be automatically created and placed in the same folder that "spectrumanalyzer.xxx" is located.  This is usually the folder, "MSA_Software".  Within the "MSA_Info" folder, three folders and one text file will be automatically created:
1.  "MSA_Cal" folder.  Within this folder will be a minimum of two text files:
    a. MSA_CalFreq.txt  This is the Magnitude vs. Frequency Calibration factors. Defaults will be 0.
    b. MSA_CalPath1.txt  This is the Magnitude vs. AtoD Calibration factors. Defaults will be 0.
    c. MSA_CalPath2.txt, if a Path 2 is specified in the Configuration Manager
    d. MSA_CalPath3.txt, if a Path 3 is specified in the Configuration Manager
    e. MSA_CalPath4.txt, if a Path 4 is specified in the Configuration Manager
2.  "OperatingCal" folder.  It will be empty until the first time a line calibration is performed.
3.  config.txt  This is the information about the MSA hardware that was collected in the Hardware Configuration Manager Window.

4.  "MSA_Prefs" folder. This contains files with information about sweep and appearance settings. After the initial software run, there will be a Prefs.txt file with default settings, but the user can replace that file by the File->Save Prefs menu item, or he can save the preferences under a new name so he can have multiple preference files in this folder. The Prefs.txt file is loaded on startup, and that file or any other preferences file can be loaded at any time with the File->Load Prefs menu item.
    During MSA operation, there may be more folders and files that will be created and installed into the "MSA_Software" folder.

D.  Coarse Calibration using the Calibration File Manager
    The Calibration File Manager creates and controls all of the Calibration Files that are used by the main MSA Program.  During the Initial Set-Up and Running of the MSA Program, Calibration Files are created and filled with SLIM MSA default values.  The user has the option to change any, or all of these values.  To access the Calibration File Manager Window, "Halt" the sweep, then select from the Graph Menu, Setup, Initial Cal Manager.

[Screenshot: msascreens/calmgrdefault.gif]

*The right hand "Available Files" menu will display all of the MSA's Calibration files.
*The Calibration File Manager will control these files:
        * The Frequency Calibration File.
        * The Path Calibration Files.  There may be up to 15 Path Calibration Files,
            depending upon the number of Paths, as entered during the Initial Configuration Management
            Procedure.
*Within "Available Files" menu, the 0(Frequency) is highlighted.  This is an opening default.
    The left text box is named "Frequency Calibration Table", and will display the most recently saved Frequency Calibration File.  Initially, it displays the SLIM default values of 0 MHz and 1000 MHz.  All Magnitude measurements at frequencies between 0 and 1000 MHz will be given a calibration factor of 0.00 dB.

Buttons in the Calibration File Manager Window:
Clean Up - This will sort the displayed Table's Calibration values.
Display Defaults - This will change the displayed Calibration values to the nominal SLIM defaults.
Re-Load File - This will load the last saved MSA Calibration Table values into the displayed table.
Save File - This will replace the MSA Calibration Table with the displayed Table's Calibration values.
Return to MSA - This will close the Calibration File Manager Window and give the option to Save.
Start Data Entry - This will allow the user to Calibrate the MSA, semi-automatically.  When clicked, more boxes and buttons will appear, depending on the type of calibration requested.  These will be described during the Calibration Procedures.

*Within "Available Files" menu, select and highlight 1(xxx yy).  The Calibration File Manager Window will change and display the Path Calibration Table for Path 1.
[Screenshot: msascreens/calmgrpath1.gif]
    The left text box is named "Path Calibration Table", and will display the most recently saved Path 1 Calibration File.  Initially there are only two lines of values.  These are for the minimum and maximum magnitude dynamic range points for the MSA.  The default values are coarse and the final values will be characterized during the Path Calibrations.
    If your Analog to Digital Converter is a 16 bit version, the ADC value at 0.000 dBm should be 32767.  If not, change it.
    If your Analog to Digital Converter is a 12 bit version, the ADC value at 0.000 dBm should be 4095.  If not, change it.
    If your Analog to Digital Converter is an 8 bit version, the ADC value at 0.000 dBm should be 255.  If not, change it.
    For any version, the ADC value at -120.000 dBm should be 0 (zero).
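
As a sketch, the coarse anchor points above amount to a two-line table per Path. This Python illustration uses names of my own invention; only the counts themselves come from the text.

```python
# ADC count expected at 0.000 dBm for each converter width (from the text);
# every version reads 0 counts at -120.000 dBm.
ADC_AT_0_DBM = {16: 32767, 12: 4095, 8: 255}

def coarse_path_table(adc_bits):
    """Return the two coarse calibration points as (dBm, ADC count) pairs."""
    return [(-120.000, 0), (0.000, ADC_AT_0_DBM[adc_bits])]

print(coarse_path_table(16))  # [(-120.0, 0), (0.0, 32767)]
```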
    If changes are made in either the Frequency Calibration Table or Path Calibration Table, click the "Save File" button.  Then click the "Return to MSA" button.
    "Restart" the sweep.  If the Magnitude response is traced on the bottom -100 dBm scale line, "Halt" the sweep.  Open the Axis Y2 Window and change the Bot Ref to -120.  Click "OK".  The Graph will change its Magnitude scale and the trace should be seen. If you still have absolutely no Magnitude response, you may assume that your MSA has a "hard failure".
    If you have any value of Magnitude response, perform a coarse tuning of the Coaxial Cavity Filter. While sweeping, adjust each of the four tuning screws on the Cavity Filter for maximum amplitude of the Magnitude response. Do not expect the response to change in frequency, just maximum amplitude. The resulting Magnitude response should be at least -50 dBm. If it is lower, it is likely that there is high insertion loss in either the Coaxial Cavity Filter or the Final Crystal Filter. I term this condition, "soft failure".
     Again, this page is written with the assumption that the MSA is in working condition. If you have no Graph response, you have what I term, "hard failure". Since your MSA is fully integrated, stop here and fix the problem. Go to the page, Testing the Integrated MSA, and follow the Troubleshooting Guide.

II.  Calibration Procedures for the MSA

    The MSA can be constructed with a variety of topologies.  There are no two MSA's that have identical characteristics.  The purpose of calibration is to measure, characterize, and quantify the effects of those characteristics.  Once calibrated, an MSA can perform with the accuracy of an expensive, commercial unit.  These are the main factors that affect MSA measurement accuracy:
*  The coaxial cavity filter affects the gain/loss characteristic of the MSA.  It is sensitive to its source and load impedance, and will need tuning, even if pre-tuned independently from the MSA.  This is accomplished in the Coaxial Cavity Filter, Tuning Procedure.
*  The Master Oscillator determines the frequency accuracy of the MSA.  It is usually quite stable, once it reaches its operating temperature, but may not be absolutely accurate.  The software can be compensated for this inaccuracy.  This is determined in Master Oscillator Calibration.
*  The Resolution Bandpass Filter(s) may not be exactly at the expected center frequency.  However, it can be characterized, and the software can be compensated.  This is accomplished in Resolution Bandpass Filter Calibration.
*  MSA Magnitude Measurement is, basically, a linear function of input power.  MSA linearity is quite good in most of its dynamic range, but deviates significantly when its input power is close to its upper and lower dynamic range limits.  The full range of magnitude response can be characterized and compensated by software.  Magnitude nonlinearity is characterized in Path Calibration for Magnitude and Phase.
*  MSA Magnitude Measurement is affected when using different Resolution Bandpass Filters.  This is due to different insertion losses and filter bandwidths.  Therefore, the MSA gain can be characterized for each Resolution Bandpass Filter that is used.  Gain is characterized in Path Calibration for Magnitude and Phase.
*  MSA Magnitude Measurements are affected by frequency changes within the MSA.  There are multiple components in the MSA whose gain/loss characteristics change when frequency changes.  MSA gain can change greater than 2 dB over the frequency range of 0 to 1000 MHz.  These gain vs. frequency changes can be characterized and compensated by software.  Magnitude Accuracy versus Frequency is characterized in the Frequency Calibration for Magnitude.
*  MSA/VNA Phase Measurement accuracy is affected by the power of the input signal and the frequency of operation.  The Phase Accuracy versus input signal power level is characterized in Path Calibration for Magnitude and Phase.
*  MSA/VNA Phase Measurement accuracy is affected by the frequency at which the MSA is operating.  The Phase Accuracy versus Frequency is characterized during normal VNA operation each time a Line Reference Calibration is performed.  But, a one-time VNA Baseline Calibration is performed to provide a coarse calibration for "uncalibrated" VNA operation.
    The measurement accuracy of the MSA can be optimized by adding a permanent, 50 ohm attenuator on the input of the MSA.  A 3 dB to 10 dB attenuator is best.  If you plan to use a permanent attenuator, it must be attached during the Calibration procedures.  Padding the MSA does not change its dynamic range, but it does shift it in the positive direction.  Example: Range without: -20 dBm to -110 dBm, Range with: -10 dBm to -100 dBm.  The same consideration can be made for the output of the Tracking Generator.  Its output will decrease by the amount of padding placed on its output connector.  I have determined that 8 dB of padding on both the TG output and the MSA input is optimum.
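
The padding arithmetic from the paragraph above can be checked with a one-liner. The helper below is a hypothetical illustration in Python; only the example numbers come from the text.

```python
def padded_range(high_dbm, low_dbm, pad_db):
    """An input pad leaves the width of the dynamic range unchanged but
    shifts both limits upward (toward 0 dBm) by the pad value."""
    return (high_dbm + pad_db, low_dbm + pad_db)

# The example from the text: a 10 dB pad shifts -20...-110 dBm to -10...-100 dBm.
print(padded_range(-20, -110, 10))  # (-10, -100)
```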

A.  Coaxial Cavity Filter, Tuning Procedure:
    This is not really a "calibration". It is a tuning procedure. If the coaxial cavity filter has been pre-adjusted with the mechanical information given in the construction procedures, it will be fairly close. For correct adjustment, perform the following steps. No other test equipment is required. If the MSA has only a single Final Filter (Path 1) or if all Paths use the same center frequency (or within .5 MHz) the following procedure will give good results.
    * Open and Run the MSA Program (spectrumanalyzer.exe).
    * Halt the sweep
    * Open the Sweep Parameters Window
    * Select the Video Filter BW to Wide
    * For an MSA with a single Final Filter, select Path 1
    * For an MSA with multiple Path Filters, select the Path with a center frequency that is closest to the average center frequency of all paths.
    * Verify "0" as the Center Frequency in the "Cent" box.
    * In the Span Box, enter 10 times the bandwidth of the Final Crystal Filter (in MHz).
    * Click "OK", then "Restart". The Graph should show a response curve, even if the cavity filter is
        badly mistuned. It is also possible that the response is below the Bottom Reference Line. If so,
        Halt sweep and change "Bot Ref" box to -120.  Click "Restart".

    * It is very possible that the center of the response curve will not be in the center of the Graph.
        Halt the sweep.  Place the Mouse pointer over the center of the response peak.
        Left double click the mouse. This will place the "L" marker and show the Magnitude in dBm below the Graph.

    * Click "Continue".

    * Adjust the tuning of the Cavity Filter for maximum Magnitude at the "L" marker.
    * Tuning is complete. Halt the sweep.

B.  Master Oscillator Calibration:
    If your system is the Basic MSA, and has no Tracking Generator, use Method A.  If your MSA has the Tracking Generator addition, with or without VNA extension, you can use Method A or Method B.

Method A.  For the Basic MSA (no Tracking Generator). Beat Frequency Method.
    This method requires an external AM radio receiver (and appropriate antenna) that will receive WWV at 2.5 MHz, 5 MHz, 10 MHz, or 20 MHz.  This is for North America.  For Europe or other countries, you can use a Frequency Standard radio station, operating below 32 MHz.  The DDS 1 spare signal is used as a "beat" frequency oscillator.
    1.  Tune the external receiver to WWV, 10 MHz.  Use an antenna, if necessary.  I will use 10 MHz during this procedure, but others may be used.
    2.  Open and Run the MSA Program (spectrumanalyzer.exe). Halt the sweep.
    3.  Connect a length of hook-up wire to the DDS 1 spare output, and position the wire close to the radio receiver or antenna input.  If the MSA's DDS 1 spare output is brought out to the front panel, the center conductor of the hook-up wire should fit snugly in the center pin of the connector.  If the DDS 1 spare output is not brought out, it is available on the bottom of the SLIM-DDS-107 and is J3.  Use a hook-up wire size so that its center conductor will fit snugly in the pwb hole.
    4.  (Allow the MSA to warm up for 30 minutes.)  Select from Menu, Setup, "Special Tests".  In the Special Tests Window, enter 10 (MHz, the frequency of WWV) into the "Command DDS 1" box.  The "with DDS Clock at" box will display the value of the default global variable, "masterclock" (64.xxxyyyz).  Click the "Command DDS 1" button.  DDS 1 will immediately be commanded to approximately 10 MHz.  The software uses the value in the "with DDS Clock at" box as "masterclock" for its calculation.  Leave the Special Tests Window open.
     5.  Couple the DDS 1 spare output wire close to the receiver to obtain an audio beat signal.  If the DDS 1 and WWV frequencies are more than a few hundred Hz apart, this "beat" may sound like a tone.  For best results, the WWV input power to the radio receiver and the DDS 1 signal input power to the radio receiver should be equal.  Move the DDS 1 signal wire to a location near the radio to obtain best results.
    6.  To adjust the Master Oscillator for zero beat, use a. or b.
        a.  If you have a mechanical adjustment for the master oscillator, the nominal Master Oscillator frequency value should be in the "with DDS Clock at" box.  If not, Halt the sweep and enter it, then click the "Command DDS 1" button.  Manually adjust the Master Oscillator for zero beat.  A final zero beat is less than 1 noticeable cycle per second.  When found, you are finished.  Skip b.
        b.  If you don't have a mechanical adjustment for the master oscillator, zero beat is found by changing the value in the "with DDS Clock at" box and clicking the "Command DDS 1" button.  The goal is to find the lowest frequency zero beat.  If the beat frequency increases when changing values, you are changing in the wrong direction.  When the final value in the "with DDS Clock at" box is determined, you are finished.
    7.  Exit the Special Tests Window.
    8.  From the Graph Menu, select Setup, Hardware Configuration Manager.
    9.  In the Configuration Manager Window, change the "Mast Clock" to the final value that was last entered in the "with DDS Clock at" box.
    10. Click the "Save Configuration" Button.  The MSA program will close.
If a zero beat to within 1 cycle per second can be obtained, the Master Oscillator is calibrated to within 1 part in 10 million, (using WWV, 10 MHz).  If WWV, 5 MHz is used, the calibration is within 1 part in 5 million, etc.  This is a one-time calibration.
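As a rough numerical sketch of why zero beat calibrates the clock (this assumes an AD985x-style DDS whose output frequency scales linearly with its master clock; the function and the example beat value are ours, not part of the procedure):

```python
# Hypothetical illustration: if the DDS is commanded to the WWV frequency
# but a residual beat is still heard, a corrected "Mast Clock" entry can be
# estimated from the beat, since DDS output frequency scales with the clock.
def corrected_masterclock(clock_mhz, wwv_mhz, beat_hz):
    """A positive beat means the DDS runs high, so scale the clock value down."""
    f_out_mhz = wwv_mhz + beat_hz / 1e6  # actual DDS output frequency
    return clock_mhz * wwv_mhz / f_out_mhz

# A 10 Hz beat against WWV 10 MHz is a 1 part-per-million clock error.
print(corrected_masterclock(64.0, 10.0, 10.0))  # slightly below 64.0
print(corrected_masterclock(64.0, 10.0, 0.0))   # zero beat: 64.0 unchanged
```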

Method B.  For the MSA with Tracking Generator or VNA. This is a Beat Frequency Method, but no external receiver is required.  The MSA acts as a receiver.  This method requires that the cavity filter be adjusted first.  Otherwise, there may not be enough signal to perform the calibration.
    This method uses the MSA as a radio receiver for WWV at 2.5 MHz, 5 MHz, 10 MHz, or 20 MHz.  This is for North America.  For Europe or other countries, you can use a Frequency Standard radio station, operating below 32 MHz.  DDS 3 is used as the "beat" frequency oscillator.
    1.  Connect an antenna or long wire into the input of the MSA.  This injects WWV into the MSA.
    2.  If the MSA program is not running, RUN the program.
    3.  Halt the sweep.
    4.  Open the Magnitude Axis Window
    5.  Enter "-20" into the "Top Ref" box.  Enter "-120" into the "Bot Ref" box.
    6.  Click "OK", "Restart", then "Halt"
    7.  Open the Sweep Parameters Window
    8.  Command the MSA Center Frequency to WWV, 10 MHz: "Cent" box = 10.0
         I will use 10 MHz during this procedure, but other WWV's may be used.
    9.  Uncheck the "Refresh Screen Each Scan"
    10.  Click "OK", then "Restart".
    11.  Verify the signal response is in the center of the Graph.  If not, Halt and center the signal.  Click Restart.  Allow the MSA to warm up for at least 30 minutes.  The Master Oscillator should stabilize in this time period.
    12.  Verify the input signal level has at least 10 dB of signal to noise ratio.  Take note of this input power level, as WWV power. Example, -90 dBm.
    13.  Halt the sweep.
    14.  Open the Sweep Parameters Window and enter "0" into the "Span" box.
    15.  Click "OK", "Restart", then "Halt"
    16.  Open the Magnitude Axis Window and enter "-70" into the "Top Ref" box.  Enter "-110" into the "Bot Ref" box.  (Use +20 dB above and -20 dB below the noted WWV power.)
    17.  Click "OK" then "Restart".
    18.  A uniform, horizontal trace will be displayed, along with some noise ripple.  Some magnitude change will occur if the WWV signal is fading or modulating.
    19.  Halt the sweep.
    20.  Select from Menu, Setup, "Special Tests".  In the Special Tests Window, enter 10 (MHz, the frequency of WWV) into the "Command DDS 3" box.  The "with DDS Clock at" box will display the value of the default global variable, "masterclock" (64.xxxyyyz).  Click the "Command DDS 3" button.  DDS 3 will immediately be commanded to approximately 10 MHz.  The software uses the value in the "with DDS Clock at" box as "masterclock" for its calculation.  Leave the Special Tests Window open.
    21.  Combine both the DDS 3 spare output signal, and the antenna input, using a "T" connection on the input to the MSA.  For best results, the WWV input power to the MSA and the DDS 3 signal input power to the MSA should be equal.  See a. and b. next.
        a.  If the MSA's DDS 3 spare output is brought out to a front panel coaxial connector, its power level is very high, about -8 dBm.  Add an appropriate attenuator so that the DDS 3 power into the MSA is approximately equal to the level of the WWV signal.
        b.  If the DDS 3 spare output is not connectorized, it is available on the bottom of the SLIM-DDS-107 and is J3.  Use a hook-up wire with a center conductor that will fit snugly in the pwb hole.  The end of the wire can be loosely coupled to the WWV antenna input to the MSA, so that its power level is somewhat equal to the WWV power level.
    22.  Click "Continue".  The previous uniform magnitude trace will look like waves on water.  These waves are a result of the beat frequency between WWV and DDS 3.  There could be many "waves" per sweep, meaning the Master Oscillator is far off frequency.  You can "grab" and move the Special Tests Window out of the way to see the Graph display.
    23.  Adjust the Master Oscillator for zero beat, use a. or b.
        a.  If you have a mechanical adjustment for the master oscillator, the nominal Master Oscillator frequency value should be in the "with DDS Clock at" box.  Example, "64.00".  If not, Halt.  Enter the correct Master Oscillator value, then click the "Command DDS 3" button, then click "Continue".  Manually adjust the Master Oscillator for zero beat.  Zero beat occurs when the "waves" occur very slowly (less than one per second).  The sweep can be slowed for better display of the very slow waves.  Halt the sweep, enter "20" into the "Wait" box, "Continue".  When this adjustment is found, you are finished.  Halt the sweep and skip the next step b.  I have a mechanical adjustment in my Original MSA.  It is very easy to adjust to 1 wave (1 beat) every 5 seconds.
        b.  If you don't have a mechanical adjustment for the master oscillator, zero beat is found by changing the value in the "with DDS Clock at" box.  The goal is to find the lowest frequency zero beat.  If the beat frequency increases when changing values, you are changing in the wrong direction.  The procedure is: Halt the sweep, change the value in the "with DDS Clock at" box, then click the "Command DDS 3" button, then click "Continue".  Repeat, until the value in the "with DDS Clock at" box creates the slowest waves (less than one per second).  Halt the sweep.  You are finished.  In the SLIM MSA, I was able to command the value of the Master Oscillator until I got 1 wave (1 beat) every 3 seconds.
    24.  Exit the Special Tests Window.
    25.  From the Graph Menu, select Setup, then, Hardware Configuration Manager.
    26.  In the Configuration Manager Window, change the "Mast Clock" value to the final value that was entered in the "with DDS Clock at" box.
    27.  Click the "Save Configuration" Button.  The MSA program will close.
If a zero beat to within 1 cycle per second can be obtained, the Master Oscillator is calibrated to within 1 part in 10 million, (using WWV, 10 MHz).  If WWV, 5 MHz is used, the calibration is within 1 part in 5 million, etc.  This is a one-time calibration.

C.  Resolving Filters in the MSA:
Resolving the Final Crystal Filters for each Path
    The center frequency of a Resolution Bandpass Filter (Final Crystal Filter) may not be exactly as the manufacturer states.  For wide-band filters of 20 kHz or greater, this is not much of a concern.  But for narrow filters, this error will be indicated when the swept response is not in the center of the graph, when it should be.  To determine the real center frequency of the Final Xtal Filter, follow these steps.
    1.  The Master Oscillator must have been calibrated.
    2.  Run the MSA Program (spectrumanalyzer.exe).
    3.  Select Magnitude Video Bandwidth to Wide.
    4.  Halt sweep.
    5.  Open the Sweep Parameters Window
    6.  Verify or change the MSA Center Frequency to "0", and the Filter Path to P1
    7.  In the Span box, enter 5 times the bandwidth of the Resolution Filter in Path 1 (Final Xtal Filter)
    8.  Enter "20" into the "Wait" box
    9.  Click "OK" then "Restart".
    10.  The trace on the Graph is the actual frequency response curve of the Resolution Filter.  A perfectly tuned Resolution Filter will have low ripple within the 3 dB bandwidth.
    11.  Verify the response is centered.  Centered means that the 3 dB points are equally distanced from the center of the Graph, and the maximum power indication is in the center of the Graph.
    12.  If centered, verification is complete.
    13.  If the response is not centered, Halt the sweep.  Position the Mouse pointer over the center of the response curve.  Double Left click or single Right click the Mouse.  The "L" marker frequency in the Marker Box will indicate the MSA tuning frequency.  A negative value is valid.  We will call this value "L Mark Freq".
    14.  To determine the true center frequency of the Resolution Filter:
        a.  true center frequency = value in the "Select Final Filter Path:" box - "L Mark Freq"
        b.  example: if the "L" marker is at 0.0011 then true center frequency = 10.7 - 0.0011 = 10.6989 (MHz)
        c.  or, if the "L Mark Freq" was at -0.0015 then true center frequency = 10.7 - (-0.0015) = 10.7015
        d.  this "true center frequency" will be entered into the Configuration Manager Window for Path 1
        e.  or, for subsequent Paths, "finalfreq2", "finalfreq3", and "finalfreq4"
    15.  To determine the unknown Bandwidth of the Resolution Filter:
        a.  With the sweep Halted, select Marker "L".  Position the Mouse cursor directly on the trace at the lower -3 dB point of the filter response curve.  Double Left click the Mouse.  The frequency will be displayed in the Marker Box as the "L" frequency.
        b.  Select Marker "R".  Position the Mouse cursor directly on the trace at the upper -3 dB point of the filter response curve.  Double Left click the Mouse.  The frequency will be displayed in the Marker Box as the "R" frequency.
        c.  The actual Bandwidth of the Resolution Filter is the "R" frequency minus the "L" frequency.
        d.  This actual Bandwidth must be entered into the Configuration Manager Window for Path 1.
    16.  If you have more than one Resolution Path, repeat steps 4 through 15 for each Path (2, 3, 4).
    17.  After the center frequencies and bandwidths of the Resolution filters have been determined, change the Path Variables in the Configuration Manager Window:
       a.  If sweeping, Halt the sweep
       b.  Select Menu item, "Setup", Configuration Manager.
       c.  In the Configuration Manager Window, change the appropriate Filter Paths for correct frequency and bandwidth.
       d.  Click the "Save Configuration" Button.  The MSA program will close.
       e.  You have completed the characterization of the Resolution Filter Paths
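The arithmetic of steps 14 and 15 can be sketched as follows (illustration only, using the text's example values; the function names are ours):

```python
# Step 14: true center = Path frequency entry minus the "L Mark Freq".
# A negative marker value is valid and raises the result.
def true_center(path_freq_mhz, l_mark_mhz):
    return path_freq_mhz - l_mark_mhz

# Step 15: actual bandwidth = "R" marker frequency minus "L" marker frequency.
def filter_bandwidth(l_3db_mhz, r_3db_mhz):
    return r_3db_mhz - l_3db_mhz

print(round(true_center(10.7, 0.0011), 4))   # example b: 10.6989
print(round(true_center(10.7, -0.0015), 4))  # example c: 10.7015
```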

Resolving the DDS Crystal Filters (in-work)
    The frequency response of the Crystal Filter used in DDS 1 and DDS 3 may not be centered exactly at 10.7 MHz.  It does not need to be.  The bandwidth of the Crystal Filter should be 15 kHz and the MSA only utilizes 10 kHz of this bandwidth.  So, there is plenty of "head-room".  This is an optional procedure to characterize this filter.  It is written so that no other special test equipment is required.  In this test we will find two equidistant frequency points on the crystal filter's response curve and calculate the true center frequency.  This will be used to update the Configuration Manager.  (This is in-work, more to follow.)

D.  Phase Detector Module Calibration (VNA only):
    The Phase Detector Module (PDM) is very accurate when the differential phase of its two input signals is between +72 degrees and +288 degrees. If the differential phase is outside these boundaries, the PDM will automatically be inverted, and the phase measurement is repeated. The inversion of a "perfect" phase detector would create a 180 degree phase shift, and that inversion could be compensated by factoring out 180 degrees. But, the MSA's PDM is not "perfect". Its actual phase shift will be between 170 deg and 190 deg, possibly more. We must calibrate the PDM to find out what the actual phase shift is when the PDM is inverted. The PDM is calibrated using the following procedure:
    *  Open and Run the MSA Program (spectrumanalyzer.exe).
    *  Halt sweep.
    *  Enter the VNA Transmission Mode
    *  Halt the sweep
    *  Allow at least a 30 minute warm-up to ensure valid measurements
    *  Select Menu, Operating Cal, Reference To, and check "No Reference", exit References window
    *  Connect Tracking Generator output to MSA input with 1-2 foot cable.
    *  Set the Phase Video Filter Switch to WIDE bandwidth.

    *  Open the Sweep Parameters Window
    *  Select Video Filter BW box to Wide
    *  Select Final Filter Path 1, if not already displayed.
    *  Enter 200 into the "Cent" box, center frequency = 200 MHz
    *  Enter 200 into the "Span" box, sweep width will be 200 MHz
    *  Click "OK" then "Restart".
    *  Verify a sawtooth response.
    *  Halt the sweep. The Magnitude power level will display the power level of the Tracking Generator and is not important, unless it is less than -70 dBm.  Nominal power level should be between -10 dBm and -20 dBm.
    *  Select the "L" marker and position the Mouse cursor on the slope that is near +90 degrees (left phase scale).  Double Left click the Mouse.  The "L" marker phase is displayed in the Marker Box.  Reposition the "L" marker to obtain 90 degrees, if necessary.
    *  Click the "Mark->Cent" box.
    *  Click "Restart"
    *  The sawtooth will shift with the "L" marker in the center of the Graph
    *  Halt the sweep
    *  Set the Video Filter Switch to NARROW bandwidth.
    *  Open the Sweep Parameters Window
    *  Change the Span to 0 (MHz)
    *  Select Video Filter BW box to Narrow
    *  Click "OK", then "Restart"
    *  Both the Magnitude and Phase traces will be horizontal lines.
    *  Halt the sweep.  Phase at the "L" marker should be approximately 90 degrees.  The Magnitude is not important.
    *  Select Menu, Setup, PDM Calibration.  The "PDM Calibration" window will open
       *  Click the "PDM Inversion Cal" button.  The following will take place:

            *  the button will change to "Be Patient".  (This calibration takes about 10 seconds)
            *  the "Current Inversion =" will go blank. It was displaying the current PDM calibrated value.
            *  the computer will command the PDM to its uninverted state
            *  the computer will beep and take the first phase measurement.
            *  in about 5 seconds the computer will command the PDM to its inverted state
            *  the computer will take the second phase measurement.
            *  when the measurements are finished, the computer will beep again
            *  the button will revert back to "PDM Inversion Cal"
            *  the software will calculate the PDM Inversion Phase Shift, using the two measurements.
            *  the "Current Inversion =" will display the newly calculated PDM Inversion Phase Shift
                * you should expect a value of 180 degrees, plus or minus 10 degrees.
            *  The Message area will display the two phase values taken during the measurement.
        *  You may repeat the measurements by clicking the "PDM Inversion Cal" button as many times
             as you wish.  I suggest 4 or 5 times to verify repeatability.

        *  You may choose to save this new value as the permanent PDM Calibration value, or
            click the "Cancel" button to exit without saving.

            *  For a permanent Calibration, click the "Save New Value and Quit" button.  This will install
             the value into the Configuration Manager file, automatically.  The new value will be valid for
             this and all future MSA sessions.  If you wish to change the value manually, open the
             Hardware Configuration Manager and change the value in the "Inv Deg" box.

    *  PDM Calibration is complete.  This is a one-time calibration and should never have to be repeated,
         unless there is some future modification to the PDM.
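The calculation behind the displayed "Current Inversion =" value can be sketched as follows (our assumption of the arithmetic; the two reading values shown are hypothetical):

```python
# Hypothetical illustration: the PDM Inversion Phase Shift is the difference
# between the averaged inverted and uninverted phase readings, folded into
# the 0-360 degree range. A perfect PDM would give exactly 180 degrees.
def pdm_inversion_shift(uninverted_deg, inverted_deg):
    return (inverted_deg - uninverted_deg) % 360.0

print(pdm_inversion_shift(90.0, 273.5))  # 183.5, within the expected 180 +/- 10
```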


E.  Path Calibration for Magnitude (and Phase for VNA):
        The MSA can be described as a dual conversion receiver with fixed gain. The final frequency power level is converted into a digital word by the Log Detector and A to D Converter. For VNA, the final frequency is compared to an internal reference signal and the difference in phase is converted into a digital word. The MSA software will then convert the digital words back to power (in dBm) and phase (in degrees). Under ideal conditions, there would be an absolute difference in magnitude (dB), due to the fixed gain, and an absolute difference in phase, due to the fixed time delay of the conversion circuits.
    However, the MSA is not ideal throughout its dynamic range, and variations of gain/loss and time delay (phase) can be expected, due to these factors:
    *  Mixer 1 conversion loss is not linear in its compression range, but we can use this range if we quantify it.
    *  The Log Detector, used to convert RF power to voltage, is not linear.  Close, but not perfect.
    *  The A to D Converter, used to convert Log Detector voltage to binary bits, is not exact.
    *  MSA gain/loss is dependent on the characteristics of each Resolution Filter Path. Gain difference between Paths can be as much as 10 dB.
    *  MSA gain/loss is also dependent on the Frequency of the measured signal. This is addressed in section F. "Frequency Calibration for Magnitude".
The purpose of the Path Calibration is to measure and record the variations over the dynamic range of the MSA.
    Path Calibration for Magnitude will measure the digitized output power at different input power levels, at a single frequency. The input power levels and digital words are recorded in a "Path Calibration File".  This will be done for each Resolution Filter Path. There can be up to 4 Path Calibrations, each creating its own "Path Calibration File".
    Path Calibration for Phase records the measured phase change versus power level change in the same "Path Calibration File".

 Overview of Path Calibration:
     A Path is selected and a signal with a known power level is injected into the MSA. The output power level is measured as a digitized bit count and recorded in that Path's Calibration Table as "ADC". The known input power level (in dBm) is also recorded in that Path's Calibration Table as "dbm".  If the MSA has the VNA capability, a digitized phase measurement is recorded as "Phase". The input power is then changed to another known power level (at the same frequency) and the measurements are recorded. This process is repeated for multiple input power levels. The final accuracy of the MSA depends on the accuracy of the known input signal level and the number of calibration points taken. The more calibration points taken, the more accurate the MSA becomes.
    For an MSA with a dynamic range of 100 dB, to be accurate to within .1 dB, each Path Calibration would require 1000 different input power level measurements. The MSA can have one or more Resolution Bandwidth Filter Paths.  Calibrating 1000 times would not be practical, so each Path Calibration will calibrate about 30 input power levels or less.  During normal MSA/VNA measurements, the software will access the Path Calibration File and interpolate between the two closest calibration levels to calculate the actual MSA input Magnitude and Phase.
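The interpolation step described above can be sketched as follows (a minimal illustration; the table rows and function name are hypothetical, and the real software may interpolate differently):

```python
from bisect import bisect_left

# Hypothetical sketch: each row mirrors the Path Calibration File columns
# (ADC count, input power in dBm, phase correction), sorted by ADC count.
def interpolate_dbm(table, adc):
    """Linearly interpolate input power between the two closest rows."""
    adcs = [row[0] for row in table]
    i = min(max(bisect_left(adcs, adc), 1), len(table) - 1)
    (a0, d0, _), (a1, d1, _) = table[i - 1], table[i]
    return d0 + (d1 - d0) * (adc - a0) / (a1 - a0)

cal = [(5000, -110.0, 0.0), (20000, -60.0, 1.2), (32000, -20.0, 2.5)]
print(interpolate_dbm(cal, 12500))  # halfway between the first two rows: -85.0
```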

Step by Step Procedure for Path Calibration:
     Note:  These calibration procedures must have been performed before a Path Calibration.
        Tune coaxial cavity filter
        Master Oscillator Calibration
        Resolve the Center Frequency of Path 1, and others, if installed.
        Phase Detector Module Calibration, if the MSA has the VNA installed.

    1.  Configure MSA for Path Calibration:
        a.  Open and Run the MSA Program (spectrumanalyzer.exe).
            *  If an MSA session is already running, close the session and re-open it.
            *  Verify MSA is sweeping in Spectrum Analyzer Mode
            *  Halt the sweep.
        b.  Select MSA Mode of operation, Spectrum Analyzer or VNA Transmission:
            * If the MSA has VNA capability, Path Calibration will include Phase Calibration. It is
              important to select VNA
Transmission. The MSA's Tracking Generator must be used as
              the calibration source.
            * If the MSA does not have VNA capability, and:
               * if you are using an external Signal Source, select Spectrum Analyzer.
               * if you are using the MSA's internal Signal Generator, select Spectrum Analyzer with TG.
            * If the MSA has VNA capability, but you select Spectrum Analyzer mode, phase will not
               be calibrated.
        c.  Open Sweep Parameters Window and configure the Path to be calibrated:
            *  Verify that the Select Final Filter Path box has the correct Path selected. If not, change it.
                    Note that up to 30 Filter Paths can be calibrated and each maintained as a Path File.
                    However, if a Path number greater than "4" is selected, the Auto Switch provision will
                    command the Filter Bank to Path 4.
            *  Select "Narrow" in Video Filter BW pull-down box.
            *  If Video BW selection is manual, select appropriate switches to Narrow.
            *  Enter the Calibration Frequency into the "Cent" box. If Phase calibration is included, a
                Calibration Frequency between 1 MHz and 2 MHz is preferred. If Phase calibration is
                not included, any frequency within the Band of operation is allowed.
            *  Enter 0 into "Span" box.
            *  Verify or select Band 1 operation.
            *  Click "OK", then "Restart". Sweeping will begin. The Magnitude trace will be very low and
                if in VNA mode, the Phase will be erratic.
            *  The Tracking Generator (if being used) will now be commanded to the Calibration Frequency.

            *  Halt the Sweep
    2.  Configure the Calibrated Signal Source for injecting into the MSA
        a.  Configure Signal Source
            *  For a Basic MSA (no VNA or Tracking Generator), using an external CW Signal Source:
                *  Adjust the external CW Signal Source to the Calibration Frequency.
                *  Adjust the power level of the external Source to approximately, -10 dBm.
                *  Most MSA's will be in saturation with an input of -10 dBm.  Therefore, higher
                    calibration levels are usually unnecessary.

            *  If you use the Tracking Generator (internal Signal Generator) as the Signal Source:
                *  The output power level of the TG (Sig Gen) is approximately -10 dBm
                *  Most MSA's will be in saturation with an input of -10 dBm.  Therefore, higher
                    calibration levels are usually unnecessary.

                *  If you need an input power higher than -10 dBm, you will require an amplifier.
            *  For the very best Path Calibration results, connect the output of the Signal Source to a
                low pass or band pass filter that will pass only the fundamental calibration frequency.
                Harmonic effects are rather minor, but a filter will help.  This is a user option.

        b.  Configure Step Attenuator
            *  Connect the Signal Source to the input of a precision selectable attenuator.  An ideal
                attenuator would have 120 dB of range, with 1 dB resolution, and an accuracy of .01 dB.
                It would also not deviate in Phase for any step change.  For an MSA without VNA
                capability, Phase change is not important.
            *  No matter what Signal Source you are using, the Calibration Power level injected on the
                input connector of the MSA must be known, as accurately as possible.  The final MSA
                operating accuracy depends on the known amplitude of the MSA input signal.

            *  Calibrate the Step Attenuator's Output with a precision power measurement instrument.
                *  With the attenuator at 0 dB, measure the Signal Power and record it. __________dBm.
                *  This "Signal Power" value will be used for each Path Calibration. Example: -10.75 dBm.
                    *  Precision power meters are not available to most people. But, a fair substitute is an
                        Oscilloscope with at least 5 MHz bandwidth and a 50 ohm termination on its input.
                        For an o'scope reading, dBm = 20 x log(peak to peak volts / .6324555 volts)

            *  Connect the Step Attenuator's Output to the input of the MSA.
        c.  Adjust the step attenuator for an output level of approximately, -30 dBm, +/- 5 dBm.
            *  It does not have to be exactly -30 dBm, but it does have to be accurately known.
                *  Whatever the power level is, it must be known to within .1 dBm (.01 dBm is preferred).
                *  This power level will be referred to as "True Power Level", the level entering the MSA.
                *  True Power Level = Signal Power - Attenuator setting (20 dB here). Example: -10.75 - 20 = -30.75
            *  If the VNA is installed, it is important to use a power level of about -30 dBm for the first
                Calibration Point.  The first point is used as a Phase Reference for the subsequent calibration
                points.  A very high input power level may saturate the Log Detector.  A very low input
                power level will result in very noisy digital conversion.  For either extreme, the digitized
                phase would be very much in error, and we would not want to use it as a reference for
                the subsequent calibration points.  However, if the MSA does not have VNA capability, the
                power level of the first Data entry does not matter, even if the MSA is in saturation.
            *  Click Restart and verify a solid horizontal Magnitude trace. If Phase is taken, the Phase
                trace should be a solid horizontal line but may take a few sweeps to stabilize.
            *  Halt the sweep. We are now configured to calibrate the selected Path.
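The power arithmetic used in step 2 can be sketched as follows (using the text's example values; the function names are ours):

```python
import math

# True Power Level entering the MSA = measured Signal Power minus the
# step attenuator setting.
def true_power_dbm(signal_power_dbm, attenuator_db):
    return signal_power_dbm - attenuator_db

# The oscilloscope substitute from the text: power into 50 ohms is
# dBm = 20 * log10(peak-to-peak volts / 0.6324555 volts).
def scope_dbm(v_pp):
    return 20.0 * math.log10(v_pp / 0.6324555)

print(true_power_dbm(-10.75, 20))      # -30.75, as in the example
print(round(scope_dbm(0.6324555), 2))  # 0.0: 632.5 mVpp is 0 dBm into 50 ohms
```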
    3.  Configure the Calibration File Manager Window
         a.  In Graph Window, select Menu, Setup, Initial Cal Manager. The Calibration File Manager
              window will open.

         b.  In the Calibration File Manager window, "Available Files" box, select and highlight the Path to be
              calibrated. The Path Calibration Table will be displayed with the latest Path Calibration File.
              (screen image: msascreens/calmgrpath.gif)
         c.  The Path Calibration File will contain three columns of data with these headings,
                ADC  The bit value of the digitized Magnitude value of the Log Detector
                dbm  The actual MSA input power to generate the ADC value
                Phase  The "Phase Error vs Input Power" Correction Factor, in +/- degrees
            *  The table that is displayed on initial set-up will have only two rows of data, which
                are approximate values for a SLIM MSA.
         d.  Click the "Start Data Entry" box.  (screen image: msascreens/calmgrpath1cal.gif)

         e.  The Calibration Boxes and Buttons will be displayed.
                Input (dBm) box - The True Power Level injected into the MSA, in dBm
                ADC value box - The measured ADC Bit value, correlated to the Input Power Level
                Phase (degrees) box - The phase measurement, correlated to the True Power Level
                Ref Freq (MHz) box - The Frequency at which the calibration is performed.
         f.  Delete the old calibration data, leaving the Header information.
            *  The table that is displayed on initial set-up will have only two rows of data. If a Path
                Calibration has been previously taken, several rows of data will be displayed.
            *  These are old calibration values that we do not want in a new Path Calibration.
            *  Move the Mouse cursor into the displayed Path Calibration Table, Left Click and Highlight
               all the rows of data under the Header information (*  ADC      dBm      Phase).
            *  Delete the data by pressing the "Delete" key on the keyboard.
            *  The line "*  ADC      dBm      Phase" and the information above it must remain.
    4.  Calibrate this MSA Path, using multiple steps and True Input Power levels:
         a.  Enter into the "Input (dBm)" box, the True Power Level currently injected into the MSA.
            *  True Power Level = Signal Power with no attenuation - Attenuator setting
            *  Example: -10.75 - 20 = -30.75 (type the value -30.75, without the suffix "dBm")
         b.  Click the "Measure" button. This measurement will take several seconds. Be patient.
         c.  Automatic measurement and data entry will occur:
            *  The software will test for an error condition for Center Frequency and Sweep Width.
            *  The software will read the Center Frequency and install it in the "Ref Freq (MHz)" box.
            *  The software will read the Magnitude Analog to Digital Converter 30 times, and average
                 the Bit values.  It will then enter the average Bit value in the "ADC value" box.
            *  If VNA is installed, the software will read the Phase Analog to Digital Converter 30 times,
                 convert Bits to Phase, and average the Phase values.  It will then enter the Phase value
                 into the "Phase (degrees)" box, in degrees.
            *  The software will not enter any value into the "Input (dBm)" box. You must manually
                enter the True Power currently injected into the MSA (in dBm).  You may do this before
                or after clicking the "Measure" button, but certainly before clicking the "Enter" button.
            *  Note: You may make repeated measurements at a Calibration Point without clicking
                 the "Enter" button.  Simply re-click the "Measure" button.
         d.  Click the "Enter" button to transfer the box data.
            *  If this is the first Calibration Point for this Path Calibration:
                *  A new box will be created, called Ref Phase (deg).  For VNA, this becomes
                     the Reference Phase for all subsequent Calibration Points.
                *  The Phase values are only used for VNA operation.  If the VNA is not installed,
                    the values will be meaningless, or the "Phase" boxes may not even be shown.
            *  For all Calibration Points:
                *  The Input Power and its correlated ADC bit value are entered into the Path Calibration
                    Table under the headings "dbm" and "ADC".
                *  The Reference Phase is subtracted from the Measured Phase, and the result is entered
                    into the Path Calibration Table under the heading "Phase".  This is the "Phase Error
                    vs Input Power" Correction Factor.  It is used to compensate for the phase error created
                    at different input power levels to the VNA.  Used only for VNA.
                *  The boxes will clear and be ready for the next Calibration Point.
         e.  Change the attenuator setting for a new Input Power Level (for the next Calibration Point)
             *  You may use higher or lower power; it makes no difference.  But it is important
                 that no two Calibration Points have the same input power level.

         f.  Return to 4-a. and follow the steps for each Calibration Point. Basically, the steps are:
             Apply a new Power Level, enter its True Power into the Input Box, click the Measure button,
             and finally click the Enter button to insert the data into the table. Repeat for all Calibration Points.
             *  You would like to take as many Calibration Points as possible.  An MSA may have a
                 dynamic range between 100 dB and 120 dB. Saturation can be as high as 0 dBm input and
                 the noise floor as low as -130 dBm, depending on the MSA's topology.
             *  When calibrating points in the input level range of 0 dBm to -35 dBm, attenuation
                 steps of 2 dB or 3 dB are advised.
             *  When calibrating points in the input level range of -35 dBm to -75 dBm, attenuation
                 steps of 5 dB or 10 dB are fine. This is a very linear range for the MSA.
             *  When calibrating points in the input level range of -75 dBm to the noise floor
                 (approximately -110 dBm), attenuation steps of 2 dB or 3 dB are advised.
                 You will know you have reached the noise floor when changing the attenuator does not
                 change the average Bit count.
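    The arithmetic behind steps 4-a through 4-d can be sketched as follows. This is an illustrative sketch only, not the MSA software's own code; read_adc (the lambda below) stands in for the real Magnitude ADC interface, and the function names are invented for clarity.

```python
def true_power(source_dbm, attenuator_db):
    # True Power Level = Signal Power with no attenuation - Attenuator setting
    return source_dbm - attenuator_db

def measure_adc(read_adc, n_reads=30):
    # "Measure" reads the Magnitude ADC 30 times and averages the bit values
    return sum(read_adc() for _ in range(n_reads)) / n_reads

def phase_correction(measured_phase_deg, ref_phase_deg):
    # "Enter" subtracts the Reference Phase (from the first Calibration Point)
    # from the Measured Phase; the result goes in the "Phase" column (VNA only)
    return measured_phase_deg - ref_phase_deg

# Example from the text: -10.75 dBm source through a 20 dB attenuator
p = true_power(-10.75, 20)          # -30.75, typed into the "Input (dBm)" box
bits = measure_adc(lambda: 12345)   # averaged bit value for the "ADC value" box
```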
    5.  After the last Calibration Point is taken, you will manipulate the Path Calibration Table
         a.  Click the "Clean Up" button.  This will sort the data points.
         b.  The first row of data in the displayed Path Calibration Table is the lowest input power
                 Point taken during Path Calibration.  However, this may not be the ultimate noise floor
                 of the MSA for this Path.
             *  Remove the signal connection from the MSA input connector.  Install a 50 ohm load on
                 the MSA input connector.
             *  Click the "Measure" button.
             *  The "ADC value" box will display the bit value for the ultimate noise floor for this Path.
                 Take this value and subtract 1% .  Highlight the value in the
"ADC value" box and replace
                 it with the resulting value. Example: If the displayed value was 4900 (bits), 4900-49 = 4851
             *  Read the "dbm" column value of the first row, and subtract 10 (dB).  Type this value into
                 the "Input (dbm)" box.  However, if this value is greater than -120, use the value, -120.0
                 Examples:
If it was -125.33, use -125.33 : If it was -106.88, use -120.0
             *  Click the "Enter" button, to install this new data into the Path Calibration Table
             *  Click the "Clean Up" button.  This will sort the data points.
             *  Now, the first row will contain the "ADC" Bit value and "dbm" value of the noise floor.
         c.  Highlight the "Phase" value of this first row.
             *  Change this value to the same "Phase" value as displayed in the second row. (The Phase
                values of the first row and second row will be the same.)  Highlight the value in the Table
                with the Mouse cursor and type in the new value.
         d.  Verify the data in the Path Calibration Table is acceptable, before Saving.
             *  Click the "Clean Up" button.  This will sort the data points.
             *  Make sure all data points are monotonic, that is, an increase in Bit count corresponds
                  to an increase of input power.
             *  Do not allow any two data points to have the same ADC bit value.  If this has occurred,
                  delete one of the rows and click the "Clean Up" button.
             *  Do not allow any two data points to have the same "dbm" value.  If this has occurred,
                  delete one of the rows and click the "Clean Up" button.
             *  Click the "Save File" button.  This will replace the MSA Path Calibration File with
                 the Table that is displayed.
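    The noise-floor row arithmetic and the table checks in step 5 can be sketched as below. This is an illustrative sketch, assuming the table is a list of (ADC bits, dbm) rows; the function names are invented, not from the MSA software.

```python
def noise_floor_row(measured_floor_bits, lowest_cal_dbm):
    # ADC: the measured floor value minus 1%
    adc = measured_floor_bits - round(measured_floor_bits * 0.01)
    # dbm: the first row's value minus 10 dB, but never higher than -120.0
    dbm = min(lowest_cal_dbm - 10, -120.0)
    return (adc, dbm)

def table_is_valid(rows):
    # Monotonic, no duplicate ADC values, no duplicate dbm values
    rows = sorted(rows)  # "Clean Up" sorts the data points
    for (a1, p1), (a2, p2) in zip(rows, rows[1:]):
        if a2 <= a1 or p2 <= p1:   # higher bit count must mean higher power
            return False
    return True
```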
    6.  The Path Calibration is complete.
         a.  Exit the Calibration Manager Window by clicking the "Return to MSA" button.
         b.  If you wish to calibrate another Path:
             *  Open the Sweep Parameters window
             *  Change the Path number in the "Select Final Filter Path" box
             *  Click "OK", "Restart", then "Halt"
             *  Return to Step 2-c. and follow the procedure, replacing any reference to Path 1 with Path X.

F.  Frequency Calibration for Magnitude:
    The MSA's Magnitude measurement accuracy is dependent on both the input power to the MSA and the frequency at which the measurement is taken. The previous Path Calibration compensates for the input power (the Magnitude Error), but at only one frequency. At other frequencies, the Magnitude Error is different, due mainly to the Mixer 1 conversion loss versus frequency. This effect is called "Magnitude Error vs. Frequency".
    This Frequency Calibration will characterize the Magnitude Error at multiple frequencies, and will create a "Frequency Calibration File". The main MSA software will use both this "Frequency Calibration File" and the "Path Calibration File" to calculate input power measurements.
    The Frequency Calibration is accomplished by injecting a signal of Known Power Level, and of known frequency into the MSA input connector.  The Magnitude is read and converted to "Measured Power" in dBm using the Magnitude Correction that was determined in the previous Path Calibration.  The "Measured Power" is compared to the Known Power Level and the difference is called the Calibration Correction Factor (Magnitude Error vs. Frequency). The value of Frequency, and the value of "Magnitude Error vs. Frequency" Correction Factor are both installed in a "Frequency Calibration Table".  This process is repeated for multiple input signal frequencies.  The completed Frequency Calibration Table is then saved as the MSA Frequency Calibration File, and placed into the MSA Software Folder.
    The final accuracy of the MSA depends on the accuracy of the Known Power Level at each frequency, and the number of frequency calibration points taken. The more points taken, the more accurate the MSA becomes.  Of course, we do not calibrate at "every" frequency. This would require millions of calibration points. Instead, we calibrate at several frequencies and allow the software to interpolate between these frequencies. Frequency Calibration is performed in only one Resolution Filter Path, normally Path 1.
    Updated June 9, 2016    The above explanation is valid when calibrating any frequency from 0 Hz to 3 GHz. All frequencies being calibrated will be placed in a single Frequency Calibration File. It is important that the same Resolution Filter Path is used for all frequencies, even though the MSA Band configuration may be changed for frequency groups.
    Because there is only one Frequency Calibration File, it is important that no two calibration points have the exact same frequency. Also, frequencies within one Band cannot overlap frequencies in another Band. As examples: Band 1 is normally specified for frequencies from 0 Hz to 1000 MHz. However, some MSA's in Band 1 will extend to as high as 1200 MHz. Band 2 can be as low as 950 MHz and as high as 2100 MHz. Band 3 can be as low as 1960 MHz to as high as 3100 MHz. If the frequency of 1000 MHz is calibrated while configured for Band 1, then Band 2 cannot include the frequency of 1000 MHz or lower. However, it can include a frequency of 1000.000001 MHz (1 Hz higher).
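    The interpolation between calibration points described above can be sketched as a simple linear lookup over (MHz, dB) rows like those in the Frequency Calibration Table. This is an illustrative sketch, not the MSA software's own routine, and the table values below are made up.

```python
import bisect

# (frequency MHz, correction dB) rows, sorted by frequency, no duplicates
cal_table = [(0.0, 0.0), (2.0, 0.0), (500.0, -1.23), (1000.0, -2.10)]

def correction_at(freq_mhz, table=cal_table):
    # Linearly interpolate the "Magnitude Error vs. Frequency" correction
    freqs = [f for f, _ in table]
    i = bisect.bisect_left(freqs, freq_mhz)
    if i == 0:
        return table[0][1]          # at or below the lowest calibrated point
    if i == len(table):
        return table[-1][1]         # above the highest calibrated point
    (f0, c0), (f1, c1) = table[i - 1], table[i]
    if f1 == freq_mhz:
        return c1                   # exact calibrated frequency
    return c0 + (c1 - c0) * (freq_mhz - f0) / (f1 - f0)
```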
    Frequency Calibration can be either a manual procedure or a semi-automatic procedure.
       
        * The Basic MSA uses Manual Frequency Calibration.  The MSA will be manually commanded, and each Calibration Point will be manually entered.
        * The MSA/VNA can use Semi-Automatic Frequency Calibration.  The MSA will be swept, and each Calibration Point will be manually entered.
    Note:  The following calibration procedures MUST be performed before a Frequency Calibration:
        Tune coaxial cavity filter
        Master Oscillator Calibration
        Resolve the Center Frequency of Path 1.
        Phase Detector Module Calibration, if installed.
        Path Calibration for the Path used during this Frequency Calibration, usually Path 1. Other Path Calibrations can be taken at any time.

F1.  Manual Frequency Calibration for the Basic MSA:
       The MSA will be manually commanded, and each Calibration Point will be manually entered. A calibrated Signal Source is used, either an external source or the internal Signal Generator, if installed.

Step by Step Procedure for Manual Frequency Calibration:
    1.  Start with a "Fresh" Frequency Calibration File.
         a.  Open and Run the MSA Program (spectrumanalyzer.exe).
         b.  Halt sweep.
         c.  In the Graph Window menu, Setup, select Initial Cal Manager
         d.  In Calibration File Manager Window's "Available Files" box, select and highlight (Frequency)
         e.  The "Frequency Calibration Table" will display the latest Frequency Calibration File.
               *  If this is the initial Frequency Calibration, the Table will display only two rows of entries,
                 0.00 MHz and 1000.00 MHz, with a corresponding calibration correction factor of 0.00 under
                 the "dB" column.  This is what we want.
               *  If a previous Frequency Calibration has been performed, multiple entries will be displayed.
                 If so, we want a "fresh table" for a Frequency Calibration.  Click the "Display Defaults" button.
                 The Frequency Calibration Table will be replaced with the default Table, showing the two rows.
         f. Click "Save File". Click the "Return to MSA" button.

    2.  Configure the Calibrated Signal Source.  There are two methods of obtaining a Calibrated
        Signal Source, External Source or internal Tracking Generator.  The following are the
        requirements for the calibration signal, which is injected into the input of the MSA.
        a.  The frequency must be adjustable from 100 KHz to 1000 MHz, or greater.  A narrower
             frequency range is usable, but the final MSA will be "uncertain" at any frequency that is
             outside of the calibrated frequency range.
        b.  The frequency must be stable to within 1 KHz, or within the Final Resolution Bandwidth.
        c.  The output power level must be between -20 dBm and -40 dBm. -30 dBm is optimum.
        d.  Whatever the power level is, it must be known to within .1 dBm (.01 dBm is preferred).
            The input power level will be referred to as "True Power".  The Magnitude Measurement
             calibration accuracy of the MSA is dependent on the accuracy of this "True Power Level".
        e.  Connect the Calibrated Signal Source output to the Input of the MSA, using attenuators
             if necessary to obtain the proper input level of -30 dBm +/- 10 dB.

    3.  Configure the MSA to sweep a Calibration Point.
        a.  If sweeping, Halt the sweep.
        b.  Verify that the MSA is in Spectrum Analyzer Mode, or Spectrum Analyzer with TG Mode
            if using the internal Tracking/Signal Generator.
        c.  If not in the correct Mode, halt the sweep and select it in the MSA Graph Window menu / Mode.
        d.  Select the Magnitude Video Bandwidth Switch to Narrow.
        e.  Open Magnitude Axis Window
            *  Enter 0 into the "Top Ref" box and -100 into the "Bot Ref" box.
            *  Select Magnitude (dBm) in "Graph Data" pull-down box.
        f.  Open Sweep Parameters Window
            *  Verify the Select Final Filter Path box is Path 1.  If not, select it.
            *  Verify the Frequency Band is 1G.  If not, select it.
            *  In the "Span" box, enter 0 (zero).
            *  Enter 100 into the "Steps/Sweep" box
            *  Enter 50 into the "Wait" box
            *  Change the "Cent" box to the frequency of your first Frequency Calibration Point.  This
                 will be the same frequency that was used for Path 1 Calibration.
  If you don't know
                 what it was, use the following procedure:

                *  Graph Window menu, Setup, select Initial Cal Manager
                *  In Calibration File Manager Window's "Available Files" box, select and highlight Path 1
                *  The Calibration Table for Path 1 will be displayed in the Path Calibration Table.
                *  The top of the Calibration Table will be, *Calibrated (date) at (xx.xx) MHz.
                *  The (xx.xx) value is the frequency, in MHz, used in Path 1 Calibration
                *  Click "Return to MSA" button
            *  Click "OK"

    4.  Create a Working Calibration Chart for the Frequencies you will use.
        a.  Create a Table with these Row and Column Headings:
                 Frequency        Measured Power       True Power            dB Correction (T-M)
Point 1      __2__MHz       _________dBm         _______dBm        ________0_____dB
Point 2      _____MHz       _________dBm         _______dBm        ______________dB
Point 3      _____MHz       _________dBm         _______dBm        ______________dB
etc, to      _____MHz       _________dBm         _______dBm        ______________dB
Point X      _____MHz       _________dBm         _______dBm        ______________dB
        b.  In the "Frequency" column, fill in the frequency values you plan to use for each Point.
            *  Point 1 should be as close as possible to the frequency that was used for Path 1 Calibration.
            *  Points 2 through Point X can be at any frequency that is within the range of the MSA.
                They can be a lower frequency than Point 1 and be taken in any order.
                Keep in mind that all SLIM MSA's will operate higher than 1000 MHz.  Use Calibration
                Points up to the frequency limit of your MSA. When calibrating a frequency outside of
                Band 1 (1G, 0-1000 MHz) the Frequency Band must be changed.

            *  We would like to have as many Calibration Points as possible.  More points taken will
                result in better accuracy of the MSA, but more than 50 points is probably unnecessary.
                I used 20 points for Calibration within Band 1 and obtained good results for the MSA.

        c.  "Measured Power" will be the power of the input signal, as measured by the MSA.
              Fill this column with the Measured Power Level at each Frequency Point.
        d.  "True Power" is the actual input power to the MSA, as provided by the Calibrated
               Signal Source.
Fill this column with the True Power Level for each Frequency Point.
        e.  "dB Correction" is the Calibration Correction Factor (Magnitude Error vs. Frequency).
             It is found by subtracting the Measured power from the True Power. For Point 1 it
             should be 0. If not within .1 dB, the Path 1 Calibration is in error.
For Point 2 and
             subsequent Points, t
he dB Correction can be a positive or negative number.
             Round off to two decimal places. i.e., -1.23 dB
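    The "dB Correction (T-M)" column arithmetic can be sketched as below. The helper names are invented for illustration; the rule itself is the one stated above: correction = True Power minus Measured Power, rounded to two decimal places, with Point 1 expected to come out near 0.

```python
def db_correction(true_dbm, measured_dbm):
    # dB Correction (T-M), rounded to two decimal places
    return round(true_dbm - measured_dbm, 2)

def path1_cal_ok(true_dbm, measured_dbm, tol_db=0.1):
    # Point 1 should be 0 within .1 dB; otherwise the Path 1 Calibration is in error
    return abs(true_dbm - measured_dbm) <= tol_db
```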

    5.  Command the Calibrated Signal Source and the MSA for a Calibration Point Sweep
        a.  Open the Sweep Parameters Window and change the "Center Frequency" to the same
             frequency as the Calibration Point.  Click "OK".
        b.  Change the Calibrated Signal Source Frequency to the same frequency. If using internal
            Tracking / Signal Generator, this is automatic.
        c.  Click "Restart".  Magnitude will be displayed as a horizontal trace. The measured power
             
will be displayed below the graph.
        d.  Allow at least one full sweep, then Halt.
        e.  Enter the Magnitude measurement into your Working Calibration Chart, under
             the header "Measured Power", for this Frequency Point.

        f.  Return to Step 5 and repeat steps a. through f. until you have completed your Working
             Calibration Chart for all Frequency Calibration Points.

    6.  Open the Calibration Manager and access the Frequency Calibration Table
         a.  In Graph Window menu, Setup, select Initial Cal Manager
          b.  In Calibration File Manager Window's "Available Files" box, select and highlight (Frequency)

msascreens/calmgrdefault.gif

         c.  The "Frequency Calibration Table" will display the default Frequency Calibration File.
              The Table will display only two rows of entries, 0.00 MHz and 1000.00 MHz, with
              corresponding values of 0.00 under the "db" column.  This is the "Correction Error" column.
         d.  Manually enter the new Calibration values from your Working Calibration Chart into the
              Frequency Calibration Table.  We will use the "text editor" process.
            *  Place the Mouse Cursor under the "1000.00" row entry, and left click to apply the cursor.
            *  Enter the Frequency of Point 1 (in MHz)
            *  Press the space bar on your computer to move the cursor to the right
            *  Enter the db Correction for Point 1 (in dB), using the correct sign
            *  Press the "Enter" key on your computer to move the cursor to a new row
            *  Enter the Frequency of Point 2
            *  Press the space bar on your computer to move the cursor to the right
            *  Enter the db Correction for Point 2
            *  Press the "Enter" key on your computer
            *  Repeat this process for all Calibration Points, Point 3 through Point X
            *  They can be entered into the Frequency Calibration Table in any order
            *  When finished, click the "Clean Up" button. This will sort the rows by frequency
    7.  Save the Frequency Calibration Table
         a.  The row containing the default data at 1000.00 MHz must be changed, or deleted.
            *  If you have entered a Calibration point that is higher in frequency than 1000 MHz,
                delete the default 1000.00 row.  Highlight the "1000.00", and its column values,
                and delete them.
             *  If your highest frequency Calibration point is lower than 1000 MHz, change the
                 dB value in the 1000.00 row.  Do this by highlighting the correction value in the
                 1000.00 row, and replacing it with the same value as your highest
                 Frequency Calibration Point's correction value.
          b.  The row containing the default data at 0.00 MHz may be changed, but must not be deleted.
             *  It is advisable to make its data value the same as the value of the next closest
                 Calibration Point.  Highlight its correction value and replace it with the value of the
                 next closest Calibration Point.

         c.  Click the "Clean Up" button.  This will sort the data points again.
            *  Look through the Frequency Calibration Table and verify that no two rows contain
                the same frequency.
         d.  Click the "Save File" button.  This will replace the MSA Frequency Calibration File
                with the displayed Frequency Calibration Table.
         e.  The Frequency Calibration is complete.
         f.   Exit the Calibration Manager Window by clicking the "Return to MSA" button.
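    What "Clean Up" and the duplicate-frequency check amount to can be sketched as below, assuming (MHz, dB) rows like those in the Frequency Calibration Table. This is illustrative only; the MSA software performs these steps internally when the buttons are clicked.

```python
def clean_up(rows):
    # "Clean Up" sorts the table rows by frequency
    return sorted(rows, key=lambda r: r[0])

def no_duplicate_freqs(rows):
    # Verify that no two rows contain the same frequency
    freqs = [f for f, _ in rows]
    return len(freqs) == len(set(freqs))
```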

F2.  Semi-Automatic Frequency Calibration for the MSA/TG
    The MSA will be automatically swept, but each Calibration Point will be manually entered.

Step by Step Procedure for Semi-Automatic Frequency Calibration:
    1.  Start with a "Fresh" Frequency Calibration File.
         a.  Open and Run the MSA Program (spectrumanalyzer.exe).
         b.  Halt sweep.
         c.  In the Graph Window menu, Setup, select Initial Cal Manager
         d.  In Calibration File Manager Window's "Available Files" box, select and highlight (Frequency)
         e.  The "Frequency Calibration Table" will display the latest Frequency Calibration File.
              *  If this is the initial Frequency Calibration, the Table will display only two rows of entries,
                0.00 MHz and 1000.00 MHz, with a corresponding value of 0.00 under the "Error"
                column.  This is what we want.
              *  If a previous Frequency Calibration has been performed, multiple entries will be
                displayed. We want a "fresh table" for a Frequency Calibration.  Therefore, click the
                "Display Defaults" button.  The Frequency Calibration Table will be replaced with
                the SLIM default Table, showing only the two rows.  Click "Save File".

         f.  Click the "Return to MSA" button.
    2.  Configure the MSA and Tracking Generator
        a.  Open and Run the MSA Program (spectrumanalyzer.exe).
        b.  Verify that the MSA is in the Spectrum Analyzer Mode
              *  If not, halt the sweep and select it in Graph Window Mode menu.
        c.  Halt the sweep.
        d.  Select the Magnitude Video Bandwidth Switch to Narrow.
        e.  Open Magnitude Axis Window
            *  Enter 0 into the "Top Ref" box and -100 into the "Bot Ref" box.
            *  Select Magnitude (dBm) in "Graph Data" pull-down box.
        f.  Open Sweep Parameters Window
            *  Verify the Select Final Filter Path box is Path 1.  If not, select it.
            *  Enter 1000 into the "Span" box.
            *  Enter into the "Steps/Sweep" box, the number of Frequency Calibration Steps you wish
                to make.  The more Steps, the better the resolution of the final calibration.  I entered the
                value, 20.  This actually creates 21 Calibration Steps, since step number 0 is included.
            *  Enter 200 into the "Wait" box
            *  Change the "Cent" box to 500 plus the same frequency that was used for Path 1
                Calibration.
  Example: if the Path 1 Calibration Frequency was 2 MHz, then enter
                the value "502".  If you don't know what it was, use the following procedure:

                *  Graph Window menu, Setup, select Initial Cal Manager
                *  In Calibration File Manager Window's "Available Files" box, select and highlight Path 1
                *  The Calibration Table for Path 1 will be displayed in the Path Calibration Table.
                *  The top of the Calibration Table will be, *Calibrated (date) at (xx.xx) MHz.
                *  The (xx.xx) value is the frequency, in MHz, used in Path 1 Calibration
                *  Click "Return to MSA" button
            *  Click the "Signal Generator" button to change to "Tracking Generator.  This will configure
                the MSA to use the Tracking Generator output as the Calibration Source.
            *  Click "OK"
    3.  Configure the Tracking Generator as a Calibrated Signal Source.
        a.  Requirements of a Calibrated Signal Source, which is injected into the input of the MSA:
          * The frequency should be adjustable from 100 KHz to 1000 MHz, or greater.
          * The frequency must be stable to within 1 KHz.
          * The power level must be between -20 dBm and -40 dBm.
          * The power level must be characterized over a frequency range of .1 MHz to 1000 MHz
              * That is, whatever the power level is, it must be a known value to within .1 dBm
                 (.01 dBm is preferred).  This power level will be referred to as "True Power Level".
                 The Magnitude Measurement accuracy of the MSA is dependent on the accuracy of
                 this "True Power Level".
        b.  The specifications of the Tracking Generator:
          * Frequency range is from 1 KHz to greater than 1050 MHz.
          * Frequency stability to within 3 Hz.
          * The RF output level is approximately -10 dBm.  However, the level is not uniform
             across its entire range of .1 MHz to 1000 MHz (and above); expected ripple is about 2 dB.
          * The output level of the Tracking Generator can be characterized (calibrated) over frequency.
             There are several methods of characterizing the Tracking Generator, and I will not explain
             them here.  But, if the Tracking Generator is used directly as the Calibrated Signal Source,
             it must be characterized.
        c.  If the Tracking Generator is used to drive a leveling circuit, such as a Limiter or Leveler, the
             Tracking Generator does not need to be characterized.  Only the leveling circuit needs to be
             characterized.  It is considered the Calibrated Signal Source.
        d.  For either method, connect the Calibrated Signal Source output to an attenuator.
          * The attenuator should attenuate the Signal Source output power to approximately, -30 dBm.
        e.  Connect the attenuator output to the Input of the MSA.
          * The coaxial interconnections should be low loss and as short as possible.
    4.  This step not used.
    5.  Critically Sweep, to read the Calibration Points.
        a.  Click "Restart".  Magnitude will be measured at each step in the sweep.
        b.  Verify that the graphed Magnitude response is a horizontal line, although there may
              be a large amount of ripple.  The measured levels should be close to the True Power Level
              of the Signal Source output, +/- 2 dB.
        c.  Halt after at least two full sweeps.  The MSA has now recorded Magnitude data for each
            Frequency Point step in the sweep.

    6.  Calibrate for each Frequency Point of the sweep.
         a.  In Graph Window menu, Setup, select Initial Cal Manager
         b.  In Calibration File Manager window, "Available Files" box, select and highlight (Frequency)
         c.  The "Frequency Calibration Table" will display the default Frequency Calibration File.
            *  The Table will display only two rows of entries, 0.00 MHz and 1000.00 MHz, with
                corresponding values of 0.00 under the "Error" column.
         d.  Click "Start Data Entry" box
msascreens/calmgrfreq1.gif
         e.  The following Buttons will be displayed
            *  "Next Point".  This will increment through each of the recorded Magnitude steps
            *  "Prev Point".  This will decrement through each of the recorded Magnitude steps
            *  "Enter".  This will calculate and enter this step's Calibration into the Calibration Table.
            *  "Enter All". This will calculate and enter all steps into the Calibration Table.
         f.  The following Boxes will be displayed with data already entered.
            *  "Point Number" box will display 0. This is the first data Point and will become the reference.
            *  "Freq (MHz)" box will display the Frequency of this Point
            *  "Measured Power (dB)" box will display the Magnitude reading of this Point in dBm.  It
                 is important that this measured value, in dBm, be the same as the
True Power Level of
                 the Calibrated Signal Source injected into the MSA, within 0.1 dB or better, if possible.
                 If it is not, the Path 1 Calibration may be in error.
         g.  Click the "Enter" button, this occurs:
            *  Software reads the "Freq (MHz)" box and install its value into the Frequency
                Calibration Table, under the column, "MHz"
            *  A new box appears with the label "True Power (dBm)".  It contains the same
                value as the
"Measured Power (dB)" box.  This becomes the Reference value for
                subsequent calibration points.

            *  Software reads the "Measured Power (dB)" box and subtracts its value from the value
                in the
"True Power (dBm)" box.  For the first Frequency Calibration Point, the result is
                0.00
.  It installs this result of 0.00 into the Frequency Calibration Table, under the
                column, "db", in the same row as the just entered, Frequency.  This is the "Magnitude
                Error vs. Frequency" Correction Factor.  It will be used in the main MSA software when
                determining the true input power to the MSA, during normal Magnitude Measurements.

    7.  Select the next Frequency Calibration Point.
         a.  In the Calibration File Manager Window, click the "Next Point" button.
            *  If this is not the correct Frequency Calibration Point, continue clicking the "Next Point"
                button until you reach the correct Point.

            *  The "Point Number" box will increment by 1.
            *  The "Freq (MHz)" box will display the frequency of the new Frequency Calibration Point
            *  The "Measured Power (dB)" box will display the Magnitude reading of the Point, in dBm.
                This value will probably not be the True Power Level on the input of the MSA.

            *  The "True Power (dBm)" box will display the True Power of the previous point.
         b.  Enter the Calibrated Signal Source's True Power Level at this Calibration Point into
              the "True Power (dBm)" box.
            *  Highlight the value in the "True Power (dBm)" box and replace it with the value of the
                Calibrated Signal Source's True Power Level.
         c.  Click the "Enter" button.
            *  Software reads the "Freq (MHz)" box and installs its value into the Frequency
                Calibration Table, under the column, "MHz".

            *  Software reads the "Measured Power (dB)" box and subtracts its value from the value
                in the "True Power (dBm)" box.  It installs the result into the Frequency Calibration
                Table, in the same row, under the column, "db".  It can be a positive or negative value.
                This is the "Magnitude Error vs. Frequency" Correction Factor for this step.

         d.  Return to step 7 and repeat the procedure for the next Frequency Calibration Point.
            *  Basically, the steps are:  Select each Frequency Point, insert the True Power, and click
                "Enter" to add the point to the Calibration Table.  Repeat for all Calibration Points.
            *  Use as many Calibration Points as practical.  More points will result in better
                accuracy for the MSA, but more than 50 points is probably unnecessary.
    8.  Save the Frequency Calibration Table
         a.  The row containing the default data at 1000.00 MHz must be changed, or deleted.
            *  If you have selected a Calibration point that is higher in frequency than 1000 MHz,
                delete the default 1000.00 row.  Highlight the "1000.00", and its column values,
                and delete them.
            *  If your highest frequency Calibration point is lower than 1000 MHz, change the
                error value in the 1000.00 row.  Do this by highlighting the error value in the
                1000.00 row and replacing it with the same value as your highest
                Frequency Calibration Point's error value.
         b.  The row containing the default data at 0.00 MHz may be changed, but must not be deleted.
            *  It is advisable to make its data values the same as the values of the next closest
                Calibration Point.  Highlight its error value and replace it with the value of the
                next closest Calibration Point.

         c.  Click the "Clean Up" button.  This will sort the data points.
            *  Look through the Frequency Calibration Table and verify that no two rows contain
                the same frequency.
         d.  Click the "Save File" button.  This will replace the MSA Frequency Calibration File
                with the displayed Frequency Calibration Table.
         e.  The Frequency Calibration is complete.
         f.   Exit the Calibration Manager Window by clicking the "Return to MSA" button.
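The "Clean Up" and verification steps above amount to sorting the rows by frequency, checking for duplicates, and confirming the 0.00 and 1000.00 MHz boundary rows survive. A minimal sketch, with illustrative row values (not real calibration data):

```python
# Sketch of the "Clean Up" step.  Example rows only.
rows = [(1000.00, -0.42), (0.00, 0.15), (144.00, -0.10), (432.00, -0.30)]

rows.sort(key=lambda r: r[0])          # sort ascending by MHz
freqs = [f for f, _ in rows]

# No two rows may contain the same frequency.
assert len(freqs) == len(set(freqs)), "duplicate frequency rows present"

# The 0.00 MHz row must remain, and a 1000.00 MHz (or highest-frequency)
# row should bound the table from above.
assert freqs[0] == 0.00 and freqs[-1] == 1000.00
```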

XP Problem
    Here is a strange problem that I had when I first started using my new WinXP Pro computer and the MSA.  After opening and running the MSA software, the display would graph the correct results for the Spectrum Analyzer for a few seconds.  Then the Magnitude trace would begin to disappear.  I could "Halt" the sweep and click "Restart".  The Graph would be normal for a few more seconds and go away again.  I would repeat this process for about a minute and the MSA would be normal for the rest of the time I had the MSA session open. Even a new MSA session would work fine if I had not re-booted the computer. The same problem exists on my new Win 7 computer. I found this Question and Answer on the internet:
    Question: If a logic 1 is written to the Control Port, bit 0, (Strobe), my PC clears all of the port bits once every five seconds for about a minute.
    Answer: Some versions of Windows XP look for devices by periodically writing to the port. A registry key can disable this behavior. 
You can make these changes in Windows' regedit utility.
    The following registry setting disables the port writes:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport\Parameters]
"DisableWarmPoll"=dword:00000001
    The following registry setting enables the port writes:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport\Parameters]
"DisableWarmPoll"=dword:00000000
    This must be for a version of XP which I don't have.  I could not find the "Parameters" file.  However, I did find this:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport]
"Start" = 3
    I double clicked the "Start", and an Edit window opened that allowed me to change the value.  I changed the value from 3 to 4.  I saw this somewhere else on the internet.  This fixed my problem.
    As always, use caution when working with the registry, which contains critical values for configuring and running the PC.

    Windows XP and Win 7 (and likely later versions of Windows) have another feature that previous Windows versions do not have; it should be disabled to prevent "weird" displays when resizing or moving the MSA Graph Window.
     Perform the following procedure for XP:
Start => Control Panel => Display and Design => Display => Appearance => Effects
=> Show Window Content while dragging.    Uncheck this box.
   
 Perform the following procedure for Win 7:
Start => Control Panel => Performance Information and Tools => Adjust visual effects =>Performance Options window, Visual Effects tab
=> Show Window Contents while dragging.    Uncheck this box.

End of Page