Getting Started #

The Alaska Vegetation (AKVEG) Database is built in PostgreSQL and hosted on a cloud server that can be queried in numerous ways. These instructions will enable you to connect to and query the AKVEG Database using R, Python, or ArcGIS Pro.

To query the database, you will need server credentials. To request them, fill out a Database Access Form. The database is public and access is free; the server credentials simply help us prevent excessive loads on the server and track how many people are connecting.

Connecting with R or Python #

Once you have received server credentials, unzip them to a location accessible to your script environment. The credentials include three SSL certificate files and a CSV file. In the CSV file, update the file paths in the following rows: 9 (sslrootcert), 10 (sslcert), and 11 (sslkey).

Then, visit our GitHub repository, download the SQL queries from the queries folder, and follow along with the scripts in the connection_examples folder.
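
For orientation, a Python connection might look like the following minimal sketch. The CSV parsing, key names, and query file name are assumptions rather than the project's exact conventions; the scripts in the connection_examples folder show the authoritative approach.

```python
# Minimal connection sketch using psycopg2. The credentials CSV is assumed
# to hold one (name, value) pair per row; adjust the parsing and key names
# to match the actual layout of your file.
import pandas as pd
import psycopg2

credentials = pd.read_csv('authentication_akveg_public_read.csv',
                          header=None, index_col=0).iloc[:, 0].to_dict()

# Open an SSL-secured connection; the three certificate paths are the ones
# you updated in rows 9-11 of the CSV.
connection = psycopg2.connect(
    host=credentials['host'],
    port=5432,                         # default PostgreSQL port
    dbname='akveg_public',             # public database named in the ArcGIS section
    user=credentials['username'],      # assumed key name
    password=credentials['password'],  # assumed key name
    sslmode='verify-ca',               # assumption; match your CSV if it differs
    sslrootcert=credentials['sslrootcert'],
    sslcert=credentials['sslcert'],
    sslkey=credentials['sslkey'],
)

# Run one of the SQL queries downloaded from the queries folder.
with open('site_visit.sql') as query_file:  # placeholder file name
    site_visits = pd.read_sql(query_file.read(), connection)
```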

Prerequisites #

  1. Python 3.9+

    1. psycopg2
    2. pandas
    3. geopandas
    4. os
    5. akutils
  2. R 4.4.0+

    1. dplyr
    2. fs
    3. janitor
    4. lubridate
    5. readr
    6. readxl
    7. RPostgres
    8. sf
    9. stringr
    10. terra
    11. tibble
    12. tidyr
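
To verify that the Python environment is ready, you can check that the required packages import cleanly. This is only a quick sanity check; note that os ships with the Python standard library, and akutils is a project-specific utility package.

```python
# Quick sanity check that the Python prerequisites import cleanly.
# 'os' ships with the standard library; the others must be installed first.
import os

import akutils
import geopandas
import pandas
import psycopg2

print('All Python dependencies for the example scripts are importable.')
```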

Preparing files for example scripts #

The example scripts use three input files that are not included in the GitHub repository. Each of the three inputs is available for download from a publicly accessible location (see links for each dataset below).

  • domain_input (‘region_data/AlaskaYukon_ProjectDomain_v2.0_3338.shp’): A shapefile containing the spatial domain of the AKVEG Map project, which is the focal area of data integration and taxonomy for the AKVEG Database. You can download the domain input and unzip it into an accessible local folder.
  • region_input (‘region_data/AlaskaYukon_Regions_v2.0_3338.shp’): A shapefile containing features for major biomes and vegetation regions of Alaska & adjacent Canada (Yukon and Northern British Columbia). You can download the region input and unzip it into an accessible local folder.
  • fireyear_input (‘ancillary_data/AlaskaYukon_FireYear_10m_3338.tif’): A 10 m raster dataset of the most recent burn year from 1940 to 2023, where zero indicates that no burn was documented during that time frame. You can download the fire year input (~800 MB) and unzip it into an accessible local folder. Alternatively, you can remove interactions with this dataset from the example script, but you will then not be able to compare observed year to burn year.

These files are provided to illustrate how data in the AKVEG Database can be related to other geospatial data; they are not required to use or access data in the AKVEG Database. To run the example scripts with the example data included, modify the file paths in the scripts to match where you store the geospatial datasets locally.
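
As a minimal sketch of this setup, the two shapefile inputs can be read in Python with geopandas; the data folder below is a placeholder. (Reading the fire year raster would additionally require a raster library such as rasterio, which is not listed in the prerequisites above; the R examples use terra.)

```python
# Sketch of loading the example shapefile inputs with geopandas.
# 'C:/Data' is a placeholder; point these paths at your local copies.
import os

import geopandas as gpd

data_folder = 'C:/Data'

# Project domain and regions (the _3338 suffix indicates EPSG:3338,
# NAD83 / Alaska Albers).
domain_input = gpd.read_file(
    os.path.join(data_folder, 'region_data/AlaskaYukon_ProjectDomain_v2.0_3338.shp'))
region_input = gpd.read_file(
    os.path.join(data_folder, 'region_data/AlaskaYukon_Regions_v2.0_3338.shp'))

print(domain_input.crs)
print(region_input.head())
```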

Column aliases for shapefiles #

The queries employ shorthand field names where the database field names exceed the 10-character field name limit of the ESRI shapefile format. When viewing the shapefile attributes, please refer to the scripts for the relationship between the shapefile aliases and the original field names used in the AKVEG Database.
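
For example, once you know the alias pairs from the scripts, the full field names can be restored in Python with a simple rename. The alias pairs and file name below are hypothetical placeholders; take the real mapping from the scripts.

```python
# Hypothetical illustration of restoring database field names from
# shapefile aliases; the alias pairs and file name are placeholders only.
import geopandas as gpd

alias_to_field = {
    'st_code': 'site_code',        # hypothetical alias pair
    'obs_date': 'observe_date',    # hypothetical alias pair
}

sites = gpd.read_file('akveg_sites.shp')  # placeholder file name
sites = sites.rename(columns=alias_to_field)
```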

Connecting with ArcGIS Pro #

To connect to the database using ArcGIS Pro, you will have to copy the three SSL certificate files (server-ca, client-cert, and client-key) to a location that will be recognized by ArcGIS Pro. Perhaps the easiest default location to copy your SSL files to is:

C:\Users\<Your_Username>\AppData\Roaming\postgresql\

  1. Navigate to the default location. If there is no folder called ‘postgresql’, create it. ‘AppData’ is a hidden folder; if you don’t see it, enable “Show hidden files, folders, and drives” in File Explorer options.
  2. Copy the 3 SSL files into the postgresql folder (we recommend copying the files rather than moving them in case something goes wrong). Rename the 3 files as follows; you can ignore the warnings about changing file extensions:
    1. server-ca.pem –> root.pem
    2. client-cert.pem –> postgresql.crt (note that the file extension is crt, without the ‘e’).
    3. client-key.pem –> postgresql.key
  3. If ArcGIS Pro is open, close it and launch it again.
  4. In ArcGIS Pro, go to Insert / Connections / New Database / New Database Connection. Choose ‘PostgreSQL’ as the Database Platform. Use the information that we provided in the authentication_akveg_public_read.csv to populate the Instance, Username, and Password fields (the ‘Instance’ is what we call ‘host’ in the CSV file). Keep the Authentication Type as the default ‘Database authentication’.
  5. You should see two database names appear in the ‘Database’ field. Select akveg_public. Click OK.
  6. All of the tables should now appear in the Catalog Pane / Databases. The ‘akveg_public.public.site’ table can be converted into a Point Feature Class using the latitude_dd and longitude_dd fields (EPSG: 4269); a scripted alternative using arcpy is sketched below. All other tables are non-spatial.

Although there will be a slight delay when working with the data in this way, it is preferable to exporting the data into a local geodatabase. Maintaining a direct connection to the database ensures that you’re always working with the latest version of the database.
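
If you prefer to script the point conversion from step 6 rather than run it through the geoprocessing pane, a minimal arcpy sketch might look like this; the connection and output paths are placeholders.

```python
# Sketch of converting the site table to a point feature class with arcpy.
# The database connection (.sde) and output paths are placeholders.
import arcpy

site_table = r'C:\Users\<Your_Username>\Documents\ArcGIS\akveg.sde\akveg_public.public.site'
output_points = r'C:\Data\akveg_sites.shp'

# Latitude/longitude fields are stored in NAD83 (EPSG: 4269).
arcpy.management.XYTableToPoint(
    in_table=site_table,
    out_feature_class=output_points,
    x_field='longitude_dd',
    y_field='latitude_dd',
    coordinate_system=arcpy.SpatialReference(4269),
)
```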

Common connection issues #

If you are having trouble connecting to the database:

  1. Confirm that your paths to the credentials files are correct.
  2. Verify your security settings. Many company networks and VPNs block access to port 5432, the port used to establish a database connection between the PostgreSQL server and the client; a quick connectivity check is sketched after this list. If your security settings do not allow access to the database:
    1. Contact your IT administrator.
    2. Access the database from another Internet connection (e.g., a residential connection).
    3. If you are comfortable using the command line interface and have a Google account, you may be able to use the Cloud SQL Auth Proxy to bypass firewall restrictions on port 5432. Please contact us if you would like to explore this option.
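
To quickly test whether port 5432 is reachable from your network, you can run a short socket check in Python; replace the placeholder host with the host value from your credentials CSV.

```python
# Quick check of outbound connectivity to PostgreSQL's default port (5432).
# Replace 'your-akveg-host' with the host value from your credentials CSV.
import socket

host = 'your-akveg-host'

try:
    socket.create_connection((host, 5432), timeout=5).close()
    print('Port 5432 is reachable; check credential paths and SSL settings instead.')
except OSError as error:
    print(f'Port 5432 appears blocked or unreachable: {error}')
```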

Next steps #

Once you have successfully connected to the database 🎉, it’s time to explore it. In the next section, you will learn how the database is structured and how its tables and fields relate to each other. With this knowledge in hand, you’ll be able to develop your own queries to extract the data you are interested in.