I have been using baltic 0.1.1, and if I use `bt.loadJSON()` the way I did in 0.1.0, I get an error regarding the metadata. An example of when this happens:

```python
nx_tree = json.load(open(path, 'r'))
tree = nx_tree['tree']  # tree data
meta = nx_tree['meta']  # metadata
json_translation = {'absoluteTime': lambda k: k.traits['node_attrs']['num_date']['value'], 'name': 'name'}
json_meta = {'file': meta, 'traitName': 'region'}
bt_tree = bt.loadJSON(tree, json_translation, json_meta)
```
which results in the following error:

```
Traceback (most recent call last):
  File "C:/Users/28Cha/AppData/Roaming/JetBrains/PyCharmCE2020.2/scratches/scratch_7.py", line 39, in
    bt_tree=bt.loadJSON(tree,json_translation,json_meta) ## give loadJSON the name of the tree file, the translation dictionary and (optionally) the meta file
  File "C:\Users\28Cha\Documents\Python\venv\lib\site-packages\baltic\baltic.py", line 1288, in loadJSON
    json_meta=auspice_json['meta']
KeyError: 'meta'
```
If I instead run `bt_tree = bt.loadJSON(nx_tree, json_translation, json_meta)`, it runs, but the result is a tuple, which makes it hard to work with.
I believe the issue you're having has to do with the outdated notebooks on here. Up until a few months ago, auspice JSONs came in at least two parts, a meta file and a tree file; these have since been merged into a single JSON file (the v2 auspice JSON format). The `loadJSON` function assumes you're giving it the new v2 auspice JSON that includes both the tree and meta parts in a single JSON, and after parsing it returns them as separate entities (a baltic tree object and a meta dictionary, hence the tuple), because there's information in the meta part that wouldn't make sense as part of the baltic tree object. You can simply unpack these in your function call like this:
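A minimal sketch of the unpacking pattern, using a hypothetical stand-in (`load_json_stub`) for `bt.loadJSON` so it runs without baltic installed; with the real library the call would be `bt_tree, meta = bt.loadJSON(nx_tree, json_translation)`:

```python
# Hypothetical stand-in illustrating the shape of bt.loadJSON's return
# value: a (tree, meta) tuple, as described above.
def load_json_stub(auspice_json, json_translation):
    # A v2 auspice JSON carries both parts under 'tree' and 'meta' keys.
    return auspice_json['tree'], auspice_json['meta']

# Minimal v2-style auspice JSON with both 'tree' and 'meta' parts.
nx_tree = {'tree': {'name': 'root'}, 'meta': {'title': 'example'}}

# Unpack the returned tuple into two variables instead of keeping it whole.
bt_tree, meta = load_json_stub(nx_tree, {})
```

After unpacking, `bt_tree` holds the tree and `meta` holds the metadata dictionary, so neither needs to be indexed out of a tuple.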