For anyone who doesn't know, Project Gorgon provides some publicly available data on

I decided to make a local copy of this data, as suggested on that site, since I search it a lot. I'm not sure how trivial a task this is for most people, but I found it involving enough that I thought I'd share my solution.

I used Python. To use this script, you'll need Python installed. You'll also need to edit the third line (the os.chdir call, using any text editor) to point at the directory where you want the JSON files stored; the script creates a new subfolder there for each game version when you run it. Hopefully it's useful to someone.

import os
import urllib.request

os.chdir(r'C:\Users\Tim\Root\Code\Python 3.x Scripts\Project Gorgon')

# The version-check URL was missing from the original post; fill it in here.
version = urllib.request.urlopen('').read().decode('UTF-8')
print("version = " + version)

prefix = ''  # the base data URL was missing from the original post; fill it in here
midfix = '/data/'
postfix = '.json'

dirName = './v' + version
if os.path.isdir(dirName):
    print("Directory", dirName, "already exists")
else:
    os.mkdir(dirName)
    print("Directory", dirName, "created")

JsonNames = [
    # the list of file names was cut off here; add the ones you want to cache
]

for name in JsonNames:
    fileName = dirName + '/' + name + postfix

    if os.path.isfile(fileName):
        print("File", fileName, "already exists")
    else:
        url = prefix + version + midfix + name + postfix
        print("Reading " + name + ".json from " + url)
        dataIn = urllib.request.urlopen(url).read().decode('UTF-8')
        print("Caching " + name + " json")
        with open(fileName, 'w', newline='') as dataOut:
            dataOut.write(dataIn)
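Once the files are cached, you can search them locally with the standard json module. Here's a minimal sketch; the file name 'items.json' and its contents are placeholders (substitute whichever files the script actually downloaded into its version folder), and I'm using a temp directory to stand in for the cache folder:

```python
import json
import os
import tempfile

# Placeholder setup: pretend this temp dir is the './v<version>' cache folder
# and that the script already downloaded a file called items.json into it.
cacheDir = tempfile.mkdtemp()
with open(os.path.join(cacheDir, 'items.json'), 'w') as f:
    json.dump({"item_1": {"Name": "Sword"}}, f)  # fake cached data

# Load a cached file and search it locally.
with open(os.path.join(cacheDir, 'items.json')) as f:
    data = json.load(f)

# Collect the keys of every entry whose Name field matches the search term.
matches = [key for key, entry in data.items() if entry["Name"] == "Sword"]
print(matches)  # -> ['item_1']
```

Loading the cached copy this way is much faster than hitting the site each time, which is the whole point of the script.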