Check that Node.js and npm are installed:
node -v
npm -v
If you don’t have them, install Node.js and npm following https://docs.npmjs.com/downloading-and-installing-node-js-and-npm, relaunch your terminal, and then run:
nvm install 18.17.0
nvm use 18.17.0
(or the latest version)
npm install
(this will install the correct version of node and all node modules that the repo depends on by looking at the package.json and package-lock.json files)
WARNING: If you are on a Windows machine, you’ll need to launch the server through a Linux terminal, such as WSL, to test the maps.
Create a virtual environment and activate it (see the sketch below), then install the Python dependencies:
pip install -r requirements.txt
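If you’re not sure how to create the virtual environment, one common approach on Linux/macOS/WSL is the following (the environment name venv is arbitrary, not something the repo requires):
python3 -m venv venv          # create the environment
source venv/bin/activate      # activate it
pip install -r requirements.txt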
cp creds_TEMPLATE.py creds.py
(on Windows: copy creds_TEMPLATE.py creds.py)
Open creds.py with a text editor or IDE, and populate this file with the following:
Update the map tracker log sheet (tab name prep_file) with the new data’s Google Sheet key from the copy of official data saved above.
Responsibilities of this repo (hint: maybe we separate this out to other repos soon)
Update all_config.py based on your needs (primarily these initial four variables and any local file path):
trackers_to_update = ['Bioenergy']  # official tracker tab name in map tracker log sheet
new_release_date = 'June_2025'  # for find-and-replace within the about page; NEEDS TO BE FULL MONTH
releaseiso = '2025-06'  # YYYY-MM-DD (day optional)
simplified = False  # True or False
priority = ['gbpt']  # allows you to prioritize global, regional, or internal output files
Then run:
python run_maps.py
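Putting the pieces together, a typical session might look like this (the venv name and output path follow the earlier sketches, not anything fixed by the repo):
source venv/bin/activate
# edit all_config.py as described above, then:
python run_maps.py
ls trackers/bioenergy/compilation_output/  # confirm the new output files are there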
The output is written to the relevant trackers/{mapname} subfolder. For example, the updated Bioenergy output will be in trackers/bioenergy/compilation_output/ and the updated Africa output will be in trackers/africa/compilation_output/.
Each map’s config includes the path to its data file:
var config = {
  geojson: 'path/to/file',
  // ...other config variables for that map
};
To preview locally, run python -m http.server 8000 at the root of the repo and navigate to trackers/map_folder_name.
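Assuming the default behavior of Python’s built-in http.server, that means opening something like this in your browser (the folder name is whichever tracker you’re testing):
http://localhost:8000/trackers/map_folder_name/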
Be sure to rename the remotes so that you can only push to the original if you explicitly specify upstream:
git remote set-url origin NEW_URL
git remote add upstream OLD_URL
Warning: you’ll need to have the testing repo cloned to your machine and perhaps already open in an IDE window. You should also have set up two remotes: one called official that is linked to the official repo, and another that is linked to the testing repo. (See the section above on Steps to creating a new testing map repo that pulls from official remote for how to do that; note you DO NOT need to create a new testing repo for this, just clone the testing repo instead of the maps one.)
In the official repo IDE window, push the branch you have with the new data to the official remote repo; do not merge into the live branch called “gitpages-production”. Then go to the IDE window where you have the testing repo cloned and set up. Pull your branch from the official remote repo, accept all merges from the official remote since they will override anything going on there, and then push to the test remote repo. Note that currently the test remote repo branch connected to its own gitpages is called “testmaplive”. Now you can share the updated map preview via the testing repo’s gitpages link.
Here are the steps on my machine:
git push origin yourbranchname  [in official repo IDE window]
git pull official yourbranchname  [in test repo IDE window]
(accept merges)
git push origin testmaplive  [in test repo IDE window]
GEM Tracker Maps are served entirely statically, with no build process. Each tracker requires only a JSON-based configuration file and a data file (mostly hosted on DigitalOcean as GeoJSON files).
/src/ contains the site code, styling information, layout, and supporting assets like images.
site-config.js contains site-wide configuration that applies to all trackers.
/trackers/ contains a directory for each tracker.
To create a new tracker: Clone the repo. Create a new directory under /trackers/. Place the data for the tracker there. Create a symlink to index.html: while in the new directory, run ln -s ../../src/index.html. Create a config.js. Commit to GitHub.
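Putting those steps together, a minimal sketch (the repo URL, tracker name, and data filename below are placeholders, not the real ones):
git clone REPO_URL && cd REPO_NAME
mkdir trackers/my-new-tracker
cp ~/data/my-new-tracker.geojson trackers/my-new-tracker/
cd trackers/my-new-tracker
ln -s ../../src/index.html
# create config.js in this directory (see the coal-plant config.js for documented parameters)
git add . && git commit -m "Add my-new-tracker" && git push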
First, there are sitewide configurations in site-config.js. Any parameter can be configured site-wide. Documentation on the typical site-wide parameters is in that file.
The config.js for coal-plant has documentation on the parameters typically set for a tracker.
Create a new branch. Place the new data file in the appropriate tracker directory. Test and do quality checks locally by running python -m http.server 8000 at the root of the directory. When ready, make a pull request to the main repository, and accept the pull request to make the update.
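A minimal sketch of that flow, with a hypothetical branch name, tracker directory, and data file:
git checkout -b update-solar-data
cp ~/Downloads/new-solar-data.geojson trackers/solar/
python -m http.server 8000  # quality check locally, then stop the server
git add trackers/solar/ && git commit -m "Update solar data"
git push origin update-solar-data  # then open a pull request against the main repository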
Currently only used for the GIPT map. Adjusted in the tracker/map’s config file with the flag “tile” instead of “csv” or “json”.
Detailed GEM Specific Instructions for creating and updating GIPT tiles
Install csv2geojson and tippecanoe
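The repo doesn’t prescribe an install method; one common way, assuming you have npm and (on macOS) Homebrew available, is:
npm install -g csv2geojson
brew install tippecanoe  # on macOS; on Linux, build tippecanoe from source instead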
% csv2geojson --numeric-fields "Capacity (MW)" Global\ Integrated\ Power\ data\ 2024-02-14.xlsx\ -\ Sheet1.csv > integrated.geojson
% tippecanoe -e integrated-2024-02-14.dir --no-tile-compression -r1 -pk -pf --force -l integrated < integrated.geojson
Copy local files to DigitalOcean Spaces recursively and set them public:
aws s3 cp --endpoint-url https://nyc3.digitaloceanspaces.com PATH/TO/DIR/TILES/FROM/TIPPECANOE s3://$BUCKETEER_BUCKET_NAME/NAME_OF_FOLDER_IN_DIGITAL_OCEAN/NAME_OF_SUB_FOLDER_IN_DIGITAL_OCEAN --recursive --acl public-read
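This assumes the aws CLI is already configured with your DigitalOcean Spaces access key and secret, and that the bucket (Space) name is exported as an environment variable, for example:
export BUCKETEER_BUCKET_NAME=your-space-name  # hypothetical Space name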
This can be hosted directly from GitPages.
If hosting on another webserver, the entire repo should be available from a directory on the webserver.
Official Maps can be found at this repo:
Live branch is gitpages-production
Maps spun up for PM review before being pushed to live can be found in this repo:
Live branch is testmaplive