
GPS tutorial: basics of GAMIT/GLOBK

Basic instructions and examples for processing a set of survey GPS sites with GAMIT/GLOBK (version 10.7)

 

Last updated August 2019, Eric Lindsey


Modified from an earlier version posted at igppweb.ucsd.edu/~elindsey/ (this page no longer exists).


The workflow described below was used during a GPS processing workshop I taught with U Pyae Sone Aung in Naypyidaw, July 9-13, 2018, and at EOS in Singapore, August 19-22, 2019.


Disclaimer: This tutorial is not guaranteed to be a definitive processing workflow for your GPS data. There are many ways to use this software and your data may require special care that is not discussed here. Please feel free to email me (elindsey -at- ntu.edu.sg) if you have any trouble with this tutorial, but don’t complain to any of the authors of GAMIT at MIT (with whom I am not affiliated) if you find a bug in this workflow or it is not working for you - they will just redirect your question to me. 

 

You are encouraged to read the GAMIT/GLOBK manual available at http://geoweb.mit.edu/gg/ in detail to understand the reason behind each step, along with the many other options that this brief tutorial completely skips over. There are also helpful lecture notes available from the short courses held at UNAVCO - https://www.unavco.org/education/professional-development/short-courses/course-materials/gps/gps.html. There are many other great resources on the UNAVCO web page as well. If you are going to use this software for research or professional purposes, I strongly recommend attending one of these workshops.

 

 

Summary


1. Run sh_setup -yr
2. Place rinex data in the rinex/ folder
3. Edit sites.defaults and station.info (list all IGS / reference sites you want to include, and set up metadata)
4. Edit lfile. (set a priori positions for all your sites)
5. Check sittbl. (optional: constrain one or more CGPS sites with known positions)
6. Run sh_gamit -expt expt -s YYYY DDD DDD -pres ELEV -orbit IGSF
7. Run sh_glred to create the .org file, plots, and time series
8. Make a multi-year combined .org file and time series
9. Run globk to make velocities


Key folders in the GAMIT/GLOBK file structure


2018/ etc. - you should have one top directory for each year:

    tables/ - all the setup and metadata information for this year goes here.

    rinex/ - your rinex data for this year goes here.

    001/ etc. - there will be one folder containing the intermediate processing results from each day of the year

    gsoln/ - the final coordinate solutions are here, after running sh_glred

vsoln/ - if you are doing velocity solutions, place a folder named vsoln/ next to all your year directories. This step is not currently part of this tutorial.

 


Getting started


- For installation, the GAMIT home directory needs a soft link: ~/gg -> /path/to/GAMIT. See instructions at the GAMIT/GLOBK website for software download and installation.

- Don't run GAMIT from a folder whose name contains spaces, periods, or other non-alphanumeric characters. This can cause unexpected errors.

- Double-check that you run each command from the proper directory, e.g. sh_upd_stnfo must be run from within tables/.

- For this tutorial, make extra sure your rinex header information is correct for every file (antenna type matching those in rcvant.dat, correct antenna height, etc.). This is the most common source of an undetected error in GPS positions. It may be a good idea to get someone else to double-check your file headers.
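
A quick way to eyeball the key header fields of one file (the file name here is just an example; for compressed files, uncompress first as described in Step 2):

   grep -E 'MARKER NAME|REC # / TYPE|ANT # / TYPE|ANTENNA: DELTA' rinex/yngn0710.18o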


Step 1. Set up the year/ folder and get default tables

 

First, set up the year directory. Create a directory named for the year (for example, 2018 if our data is from 2018). If your data spans multiple years, you will have to create several separate directories and do all the steps below in each one.

 

   mkdir 2018

   cd 2018

 

Next, we need to set up the gamit tables/ directory. In the year directory, run:

 

   sh_setup -yr 2018

 

This copies the default tables/ folder from ~/gg/ so that gamit can get started. Now we will need to edit these files to suit our needs.

 


Step 2. Set up the rinex/ folder

 

    mkdir rinex/

 

Place rinex data from the year 2018 (any files ending in .18o, .18d, .18o.gz, or .18d.gz) in the rinex/ folder. Double-check to make sure all the rinex headers have correct information, for every day and every site! Even a single typo here can cause big problems later.
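
For example, if your field data are sitting in another folder (the source path here is hypothetical - use wherever your data actually live):

   cp /path/to/fieldwork/2018/*.18o rinex/
   ls rinex/ | head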


Optional: get rinex for IGS sites

 

This is only necessary if we are going to run our processing in an offline mode. If you have a good internet connection, it is much easier to skip this, and follow the “with internet” instructions below.

 

Example: get 14 days of data in March 2018:

   cd rinex/

   sh_get_rinex -archive cddis sopac unavco olg -yr 2018 -doy 70 -ndays 14 -sites lhaz cusv hyde iisc ntus bako pimo hksl hkws jfng shao bjfs coco xmis pol2 dgar


Notes on file compression

 

Manually compressing and uncompressing rinex data is not needed for GAMIT (it knows how to read both compressed and uncompressed data), but you may want to compress the data if you are sending someone a copy:

 

sh_rnx2crx -c y -d y -f *.18o

 

If you need to uncompress the rinex (for example, to double-check the header information), you can use the following commands:

 

gunzip: this will convert any files ending in (for example) .18d.gz to .18d, or .18o.gz to .18o. Example:

 

gunzip *.gz

 

sh_crx2rnx: this will convert files ending in d (Hatanaka compressed rinex) to o (standard rinex). Example:

 

sh_crx2rnx -c n -d y -f *.18d

 


Step 3. Set up IGS site information and metadata


Option 1: with internet: add ftprnx and xstinfo (run sh_gamit online, and pre-create the station.info)

 

edit sites.defaults

 

Because the IGS rinex data have not been downloaded, here you must list all the continuous sites that you want to include in your network for double-differencing and to fix the absolute position. For example, here is a list of 16 sites that work well for SE Asia.

 

** KEY NOTE: there is one invisible space character (‘ ‘) before every line. If you do not include this space, GAMIT will ignore that line. This is true for nearly all gamit configuration files. Take note! ** 

 

 all_sites expt xstinfo

 lhaz expt ftprnx xstinfo

 cusv expt ftprnx xstinfo

 hyde expt ftprnx xstinfo

 iisc expt ftprnx xstinfo

 ntus expt ftprnx xstinfo

 bako expt ftprnx xstinfo

 pimo expt ftprnx xstinfo

 hksl expt ftprnx xstinfo

 hkws expt ftprnx xstinfo

 jfng expt ftprnx xstinfo

 shao expt ftprnx xstinfo

 bjfs expt ftprnx xstinfo

 coco expt ftprnx xstinfo

 xmis expt ftprnx xstinfo

 pol2 expt ftprnx xstinfo

 dgar expt ftprnx xstinfo

 

These are most of the IGS sites located within about 3000km of the survey data that we were processing when this example was written.

Older sites you might consider for older data in the same region (they may no longer exist): kunm, wuhn, lck3, pbr2. To search for sites that exist near a specific location and date range, you can try UNAVCO's DAI, which has a good graphical interface: https://www.unavco.org/data/gps-gnss/data-access-methods/dai2/app/dai2.html (requires Adobe Flash)

 

 - The 4-letter code 'expt' is the name of our experiment. If you prefer, you could use any other 4-letter name, like 'sv18', that is more descriptive of your project. This name does not affect any of the results, but you have to make sure to use the same name throughout all the processing below.

 

 - The code ‘ftprnx’ tells GAMIT to download the rinex files for this site via FTP. This will require a good internet connection when you run sh_gamit, because it may download several hundred MB of data. If you already downloaded the data for this site, you don’t have to include ‘ftprnx’. 

 

 - The code ‘xstinfo’ tells GAMIT not to update the information in the station.info file for this site.

 

Now, we need to update the station.info table with the antenna heights, model and serial numbers that are included in the RINEX headers. 

 

So that this doesn’t take forever, we first need to strip the huge default station.info down to just the header plus the stations we will use. We keep only those stations specified in our sites.defaults (those that have the argument ‘ftprnx’), using the command:

 

   sh_upd_stnfo -l sd

   mv station.info.new  station.info

 

This should create a much smaller station.info file. 

 

Next, make sure the rinex files for all sites we want to process are in the rinex/ directory (see the diagram above for the file structure). Now enter (still from the tables/ directory):

 

   sh_upd_stnfo -files ../rinex/*o

 

Now, have a look at the station.info file and make sure that the metadata has been read correctly for your sites. To find the newly updated sites you can search for the string “mstinf”.
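
For example, from the tables/ directory, this lists the newly added entries:

   grep mstinf station.info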

 

(Optionally, we could have left off the 'xstinfo' commands in sites.defaults, then we wouldn't have to do this step. But this gives us a way to double-check that everything has been prepared correctly, before we run sh_gamit.)

 


Option 2. without ftprnx and xstinfo (for pre-downloaded IGS rinex data, using automatic station.info creation)

 

If you have pre-downloaded the IGS rinex data (see instructions at the end of this file), you could modify sites.defaults to have only one line:

 

 all_sites expt

 

(note, there is still one space at the start of the line!) This is the simplest way to set up your sites.defaults. Because we left out “xstinfo”, sh_gamit will automatically create station.info during its processing, and we don’t have to set it up beforehand.

 

However, note that the current station.info file is huge, with over 20,000 lines! Because of this, sh_gamit could take up to several hours to update the file with your rinex data, a pointless waste of time. To avoid this issue, we should first enter all our IGS sites in sites.defaults with the option “ftprnx”, as described in Option 1 above, and run “sh_upd_stnfo -l sd”. This will create a new file station.info.new, containing only the lines relevant to the IGS stations we are using. You can then rename this file to station.info (‘mv station.info.new station.info’) and delete the list of IGS stations from sites.defaults.
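
A minimal sketch of that one-time trimming (run from tables/, and assuming sites.defaults temporarily lists the IGS sites with ‘ftprnx’ as in Option 1):

   sh_upd_stnfo -l sd
   mv station.info.new station.info

Then edit sites.defaults back down to the single ‘all_sites expt’ line.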

 


Step 4. Update lfile.

 

We need to input good a-priori coordinates for every site (~1-10m accuracy) to the file named “lfile.” to give gamit a starting point for the processing. These can be taken from an earlier occupation, or other prior information like a handheld GPS measurement, or from the rinex header. If they are not known, several options exist for obtaining good coordinates. In order of increasing complexity, these are:

 

Option a. rx2apr rinex header extraction. The easiest way to get the coordinates from the rinex headers. From your rinex/ folder:

 

   cd rinex/

   grep POSITION *18o > positions.txt

   rx2apr positions.txt 2018 071

 

Here, the date you enter can be anything, but should be close to the time of your survey. This method works well if the coordinates in the rinex header are good, but note that they are occasionally very wrong (e.g. zero), which will cause gamit to fail (you may get an error like “geodetic height unreasonable”). In this case, check the coordinate for that site and use another method (below) to update it.

 

Option b. Positions from a handheld GPS, or google earth, converted to Earth-Centered, Earth-Fixed coordinates (XYZ), using the GAMIT interactive command 'tform' or another conversion program. These need to be accurate to work well; be careful!
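
If you'd rather script the conversion than run 'tform' interactively, here is a minimal sketch using awk and the standard WGS84 ellipsoid formulas (the latitude, longitude, and ellipsoid height values below are just an example; note that a handheld's height above sea level can differ from the ellipsoid height by tens of meters):

   echo "16.80 96.15 20.0" | awk '{
     pi = atan2(0, -1); a = 6378137.0; e2 = 0.00669437999014   # WGS84 constants
     lat = $1*pi/180; lon = $2*pi/180; h = $3
     N = a / sqrt(1 - e2*sin(lat)^2)                           # prime vertical radius
     printf "%.3f %.3f %.3f\n", (N+h)*cos(lat)*cos(lon), (N+h)*cos(lat)*sin(lon), (N*(1-e2)+h)*sin(lat)
   }'

This prints the ECEF XYZ coordinates in meters; you still need to format them as an .apr-style line for the lfile.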

      

Option c. Automated online processing: Upload a single rinex file via ftp; this method is slow but does a full GAMIT run, so it generally finds the coordinates to within ~1cm or better, using nearby continuous stations. There are several services that can do this processing for us:

    - SOPAC: http://sopac.ucsd.edu/cgi-bin/SCOUT.cgi

    - APPS: https://apps.gdgps.net/apps/index.php

    - AUSPOS: http://www.ga.gov.au/bin/gps.pl

      

Option d. Teqc qc-full mode: first, get nav info, eg.:

        sh_get_nav -archive sopac -yr 2018 -doy 071 -ndays 1

        

      Then run teqc in qc mode with the nav data as input, eg.:

        teqc +qc -nav brdc0710.18n rinex/yngn0710.18o

        

      The resulting positions are frequently off by ~10m or so, but they may be good enough for a first pass in GAMIT.

 

Once you have all your site coordinates in the .apr file format, add them to the file called  “lfile.” in the tables folder (yes, it has a dot at the end of the name).
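
For example, if you used Option a above (check the actual output file name that rx2apr reports - ‘positions.apr’ here is a guess based on the input name):

   cd rinex/
   cat positions.apr >> ../tables/lfile.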

 


Step 5. Check sittbl.

 

We need to constrain the adjustments GAMIT is permitted to make to at least one station, or it will never find a good solution. As long as you are using some core IGS sites for your reference stations (step 3 above), you probably don’t need to change anything. Just make sure that at least two or three of your selected IGS sites have an entry in sittbl. that looks like this:

  

SHAO              NNN    0.050 0.050  0.050

 

Caution: Note that this file is very sensitive to the exact format. The number of spaces between each column must be exactly the same as in the current format for the existing lines. If you have a missing or extra space, or a tab character, sh_gamit will fail. Modify at your own risk!
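
One way to catch stray tabs before sh_gamit does: ‘cat -A’ (GNU coreutils; use ‘cat -et’ on macOS) prints tabs as ^I and line ends as $, so every column boundary should show only ordinary spaces:

   cat -A sittbl. | head -20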

 


Step 6. Run GAMIT

 

At last, we're ready to run gamit!

 

Option 1. with an internet connection.

 

Gamit is much simpler to run with an internet connection, because it needs to download orbit information. This is still true even if you previously downloaded IGS rinex data.

 

   cd .. (now we are in the YYYY directory)

   sh_gamit -expt expt -d YYYY DDD DDD ... -pres ELEV -orbit IGSF

   

For example, for days 071 – 074 in year 2018, the command would be:

 

   sh_gamit -expt expt -s 2018 071 074 -pres ELEV -orbit IGSF

 

This will run the gamit programs makexp, makex, and fixdrv, and then the batch file (csh bexpt2.bat), all automatically. The options we gave mean: process the experiment named 'expt' for each day within the range (days 071 – 074 of 2018); plot residuals vs. elevation; and use the IGS Final orbits. (Note that -d takes a year and an explicit list of days, while -s takes a year and a start/stop day range; either works here. Other orbit options include IGSR for “rapid” orbits, which are available shortly after the day of data; IGSF orbits are released about 2 weeks after each day, or slightly later.)

 

Option 2. without internet.

 

It’s possible to run sh_gamit without a live internet connection, but you need to download some data first. You will need to get orbits and nav files for each day, plus one day before and one day after. For example, if we are running days 071 – 074 from year 2018, we should first run:

 

   mkdir gfiles

   cd gfiles

   sh_get_orbits -orbit igsf -yr 2018 -doy 070 -ndays 6

   cd ..

 

   mkdir brdc

   cd brdc

   sh_get_nav -yr 2018 -doy 070 -ndays 6

   cd ..

 

Now, we can run sh_gamit with a slightly modified set of options. We change the “-orbit” flag to “-orbt” (no i for no internet?) with the option igsg instead of igsf, and add “-noftp”:

 

   sh_gamit -expt expt -s 2018 071 074 -pres ELEV -orbt igsg -noftp

 

For either of these two options (with/without internet), sh_gamit is the slowest step in the process, and it will be even slower if you have included a lot of IGS or continuous stations. For a total of 20 stations including IGS, I estimate about 5-10 minutes runtime per day of data on a normal laptop.

 


Inspecting the sh_gamit solution.

 

After running sh_gamit on each day’s data, you should check to make sure that the results are sensible. A good way is to inspect the summary file; for example, for day 071:

 

   cat 071/sh_gamit_071.summary

 

Here, you should see some statistics about the run.

 

Look at the RMS for the two best/worst sites: there is one column for each satellite, with an average in the first column. According to the GAMIT authors, we expect to see all values less than 10 mm. For our example processing with Topcon receivers and small (non-ground-plane) antennas, we seem to get RMS values around 15 mm. This is not ideal, but still seems to work OK for this survey.

 

If you want to look at the RMS for all sites, you can use the command: “grep RMS 071/autcln.post.sum”. This is the output of the postfit solution from the GAMIT program “autcln”.

 

In the summary file, the 4 lines below the RMS values list prefit and postfit nrms values. According to the GAMIT authors, the postfit nrms should be less than 0.2. In practice, it is always about 0.18 for a good solution, and may only be 0.21 for a bad solution.

 

At the bottom, there may be a list of coordinates that were updated in the lfile. If you know that you put very good coordinates in the lfile. (from a previous gamit run), then you should not see any updated sites (they are only updated if the difference is more than 30 cm). If your initial coordinates were approximate (from the rinex header, or another approximate source), then you may see some updated coordinates. If any coordinates were updated, you should delete the whole day directory and run sh_gamit a second time for that day’s data.
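
For example, to re-run day 071 by itself, with the same options as before:

   rm -rf 071/
   sh_gamit -expt expt -s 2018 071 071 -pres ELEV -orbit IGSF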


Because we added the option “-pres ELEV”, you can also look at sky plots of the residuals, and residuals vs. elevation, found in the folder figs/. Look for sites with very large residuals compared to the others - that may be an indication of a noisy site, probably due to multipath or water vapor. If you see a very strange (asymmetric) pattern in the plot of residuals vs. elevation angle, this may be an indication that you have the wrong antenna name. Check the rinex header! If it is wrong, delete that line from station.info and create it again, then delete the day directory and run sh_gamit again.


 

Step 7. Run GLOBK: use sh_glred to solve for the reference frame and get final coordinates

 

First, get a default set of .cmd files:

 

   sh_glred -cmd

 

Now, we need to modify the glorg.cmd file to specify our stabilization sites, which should be all the IGS sites we used in the processing above.

 

The default set of stabilization sites does not include enough IGS sites from our current processing. Comment it out (by adding an 'x' at the start of the line):

x source ~/gg/tables/igs14_comb.stab_site

 

Then, add a new line (note, space at the start of the line): 

 stab_site lhaz cusv hyde iisc ntus bako pimo hksl hkws jfng shao bjfs coco xmis pol2 dgar 

 

Now, we can run sh_glred and specify the full timespan for which we ran sh_gamit above:

 

   sh_glred -expt expt -s 2018 071 2018 074 -opt H G T

 

Done! There are plots in gsoln/plots_[date], and averaged coordinates with residuals in gsoln/res_[date]. You can get a nicely formatted time series by running:

 

   mkdir pos

   tssum pos pbo.final_frame -R gsoln/*org

 

This specifies to create the time series in the folder pos/, with the product format “pbo.final_frame” (this is a nice standard time series format), and to read all the .org files from the gsoln directory (the -R flag tells tssum the inputs are .org files). The org files contain all the final information from the solution for each day, so tssum is really just reformatting that information.
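
Each site gets its own .pos file in pos/. Assuming tssum follows the <SITE>.<product id>.pos naming convention (run ‘ls pos/’ to see the real names - the site YNGN here is hypothetical), you can inspect one with:

   head -12 pos/YNGN.pbo.final_frame.pos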


Optional: Combine data from each survey (skip for continuous GPS processing)


If we are doing survey processing, it is helpful to first combine the results from each survey into one file, to ensure consistency of the solution. From each year directory we have to run sh_glred with the -ncomb option to specify the number of days to combine:


   sh_glred -expt expt -globk_cmd_prefix COMB -opt G -ncomb 4 -s 2018 071 2018 074

 

If you have some surveys conducted at different times during the year, you should combine these separately for each different timespan, as in the sketch below. The GAMIT authors suggest that you should not combine more than a few weeks of data into a single file.
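
For example, if the same year also contained a second 4-day survey on days 200-203 (hypothetical dates), you would run the combination once per survey:

   sh_glred -expt expt -globk_cmd_prefix COMB -opt G -ncomb 4 -s 2018 071 2018 074
   sh_glred -expt expt -globk_cmd_prefix COMB -opt G -ncomb 4 -s 2018 200 2018 203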


Step 8. Make multi-year time series


After completing the steps above to process each year of data, it’s time to combine the results together to create our full time series and estimate velocities! Compared to the previous steps (especially sh_gamit), this is a very quick step now that all the low-level processing is done.

 

Go up one directory, so that you are in the folder that contains all the year folders.


Running sh_glred has produced ‘h’ files, which are the equivalent of the .org file but in a computer-readable format, with the full covariance matrices preserved. After running all years of data through sh_gamit and sh_glred, we can use these to create a combined solution for all years. From the directory above the individual year/ directories:


   mkdir vsoln
   cd vsoln


Get a list of all the h-files created in the last step:


   ls ../????/gsoln/h*glx > expt.gdl


Next, run glred (note, not sh_glred here) to produce one combined .org file. Note that we have to delete any existing .org files in this directory first, since glred will not do it automatically.


   \rm globk_replong.org globk_replong.log
   glred 6 globk_replong.prt globk_replong.log expt.gdl globk.cmd > glred.out


Now, create the time series from the combined files:
 

   mkdir pos
   tssum pos eos.final.itr14 -R globk_replong.org


Step 9. Make velocities


Finally, we can get velocities and plot arrows:

 

    globk 6 globk_vel.prt globk_vel.log expt.gdl globk.cmd VEL
    sh_plotvel -ps mmtest -f globk_vel.org -R85/105/15/25 -factor 0.75 -arrow_value 20  -page L

 

This creates a file ‘mmtest.ps’ that is a very bare-bones map of the velocities. Check it out:

 

    gs mmtest.ps 

 

If we want to plot our own, we can get a text file with all the velocities:

 

    sh_org2vel -file globk_vel.org

 

This creates a file ‘globk_vel.vel’ that has a nicely formatted list of velocities. A GMT command to plot them might look like this:

 

   # note: these lines assume $ps names a PostScript file already started by an
   # earlier GMT call (e.g. pscoast) that set the -R region and -J projection
   awk '{if(NR>5 && $7<5) printf("%f %f %f %f %f %f %f %.4s\n", $1, $2, $3, $4, $7, $8, $9, $13)}' globk_vel.vel | gmt psvelo -J -R -Se0.06/0.68/5 -A10p+p1p+e1p -W1p -Gblack -L -V -O -K >> $ps
   # plot a velocity scale arrow
   echo '92.5 16 20 0 0 0 0 20 mm/yr' | gmt psvelo -J -R -A10p+p1p+e1p -W1p -Gblack -Se0.06/0.99/8 -L -V -O -K >> $ps

 

These velocities are still in the default reference frame (ITRF2014) because of the default reference file (apr_file) we used. If we want to use (for example) the Sunda plate reference frame, we can set a different option in the glorg.cmd file:

 

    globk 6 globk_vel_sunda.prt globk_vel_sunda.log expt.gdl globk.cmd VEL+SUND14
    sh_org2vel -file globk_vel_sunda.org

 

This produces a new file, globk_vel_sunda.vel, in the GAMIT-defined Sunda plate frame. This works because SUND14 is an option in the glorg.cmd file: it is a word at the start of a line, and that line is activated when we pass the word to globk. In this case, activating the line causes GLOBK to use a different .apr file for the IGS sites - one that lists their velocities in the Sunda frame - so our velocities are computed relative to those. Check out the .cmd file for other options.
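
As an illustration only (the .apr file name below is hypothetical - open your own glorg.cmd to see the real entries), an option line of this kind looks something like:

SUND14  apr_file ~/gg/tables/itrf14_sunda.apr

Passing VEL+SUND14 on the command line activates every line that starts with one of those words.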

 

Of course, you can also rotate the velocities from ITRF2014 later by applying your own plate rotation parameters using a separate program, but this is beyond the scope of this document.


Appendix A: Notes on scripting


In my experience teaching people to use GAMIT/GLOBK, one of the main causes of trouble with the software is that it is very easy to make a small typo or forget a step when running on the command line, which lacks a nice graphical interface to display the status of your run. To avoid these pitfalls, I suggest putting your commands in scripts whenever possible so that (1) you have a record of what exactly you ran, and (2) you can type as little as possible directly into the command line.


For example, suppose you have already created your customized 'sites.defaults', 'station.info', 'lfile.', 'glorg.cmd' and 'globk.cmd' files, listing your chosen regional IGS sites and with all the correct info for your own sites. Let's put them in a folder my_table_defaults/ .

 

Then you can use a simple script to run steps 1-7 all together: save these lines as the file 'run_year.sh':


#!/bin/bash

if [[ $# -lt 3 ]]; then
  echo "Usage: $0 year doy1 doy2"
  echo "Runs sh_gamit and sh_glred for specified days in the year directory"
  exit 1
fi

year=$1
day1=$2
day2=$3

cd $year
sh_setup -yr $year
cp ../my_table_defaults/lfile. tables/
cp ../my_table_defaults/sites.defaults tables/
cd tables
# trim the default station.info to the sites listed in sites.defaults
# (if you maintain a fully customized station.info, copy it here instead)
sh_upd_stnfo -l sd
mv station.info.new station.info
cd ..
sh_gamit -expt expt -s $year $day1 $day2 -pres ELEV -orbit IGSF
mkdir -p gsoln    # make sure gsoln/ exists before copying the .cmd files into it
cp ../my_table_defaults/*cmd gsoln/
sh_glred -expt expt -s $year $day1 $year $day2 -opt H G T


This script assumes that you have put your data in the folder (for example) 2018/rinex, and the customized versions of 'sites.defaults', 'station.info', 'lfile.', 'glorg.cmd' and 'globk.cmd' in the folder my_table_defaults/ . To use it, first use 'chmod' to make the file executable:


    chmod +x run_year.sh


Now run it with:


    ./run_year.sh 2018 071 074


This will do all the processing described above through step 7. If you want to run for a different set of data (for example, from 2017, days 310 - 321), it's as easy as putting the rinex files in place (in 2017/rinex) and then running a single command:


    ./run_year.sh 2017 310 321

 

A similar script that allows you to make the velocity solutions for all data from all years in one step is left as an exercise for the reader. Also note: the fact that you have automated many steps at once does not exempt you from the need to double-check your results at each step afterwards. Many things can still go wrong without causing a 'FATAL' error. Happy scripting!


Appendix B: Common errors and bug fixes - send me yours!

 

Problem:

   “No g- or sp3-files available and noftp = Y, cannot continue” 

Solution:

   You are running sh_gamit with the -noftp option, but forgot to get the gfiles (orbits) in the gfiles/ directory. Run (for example):

   cd gfiles/

   sh_get_orbits -orbit igsf -yr 2012 -doy 070 -ndays 6

   cd ..

 

Problem:

   “-noftp = Y : You must have /home/geodesy/test_processing3/2018/brdc/brdc0710.18n or /home/geodesy/test_processing3/2018/brdc/brdc0710.18n.Z available to continue processing”

Solution:

   You are running sh_gamit with the -noftp option, but forgot to get the brdc (nav) files in the brdc/ directory. Run (for example):

   cd brdc

   sh_get_nav -yr 2012 -doy 070 -ndays 6

   cd ..

 

Problem: 

   “FATAL  :160615:1736:49.0 FIXDRV/bmake: Ocean loading requested no list or grid file”

Solution: 

   You may not have downloaded the otl (ocean tidal loading) grid, which is not included in the default gamit installation. Ftp to everest.mit.edu (anonymous login) and download the otl*.grid files from pub/GRIDS/. Place them in your ~/gg/tables directory (you may have to delete some broken links there in order to replace the files).

Then delete all the intermediate products in the day directory (e.g. 071/) produced by sh_gamit and run again.

 

Problem:

   “FATAL  :180713:1848: 8.0 FIXDRV/lib/rstnfo: No match for BAKO 2018  71  0  0  0 in station.info”

Solution:

   Your station.info is missing some entries. You need to run “sh_upd_stnfo -l sd”, or “sh_upd_stnfo -files ../rinex/*o”, or remove the command “xstinfo” from your sites.defaults. See the discussion in step 3 above regarding these issues.

 

Problem:

   “FATAL  :180713:1850:44.0 FIXDRV/fixdrv: Sestbl or sittbl errors--see GAMIT.warning”

Solution:

   Probably there is a formatting error in your sittbl. This occurs very easily if you have manually edited sittbl., because it is space-sensitive. Any misalignment of the columns or even inclusion of “tab” characters can cause this issue. Check your sittbl. carefully to make sure no lines are misaligned. In the worst case, copy the file again from ~/gg/tables/ and start over.

