Hi, thank you for your great work!
Where can I find the pre-training code that implements the "Structure-Text Contrastive loss," "Cross-Modal Matching loss," "Masked Language Modeling loss," and "Knowledge Graph Embedding loss" described in Figure 1 and Section 3.2 (Pre-training Objectives) of the MolFM paper?
In other words, how can I reproduce the ckpt file?
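For reference, here is how I currently picture the combined objective. This is only a minimal sketch under my own assumptions: the loss implementations, helper names, random stand-in tensors, and equal weighting below are placeholders I made up, not your actual code.

```python
import torch
import torch.nn.functional as F

# NOTE: everything below is my own placeholder sketch, not code from this repo.

def structure_text_contrastive_loss(graph_emb, text_emb, temperature=0.07):
    """InfoNCE-style loss aligning molecule-graph and text embeddings."""
    graph_emb = F.normalize(graph_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = graph_emb @ text_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

def cross_modal_matching_loss(match_logits, match_labels):
    """Binary classification: does this (molecule, text) pair belong together?"""
    return F.binary_cross_entropy_with_logits(match_logits, match_labels.float())

def masked_language_modeling_loss(mlm_logits, mlm_labels):
    """Cross-entropy over masked tokens; positions labeled -100 are ignored."""
    return F.cross_entropy(mlm_logits.view(-1, mlm_logits.size(-1)),
                           mlm_labels.view(-1), ignore_index=-100)

def knowledge_graph_embedding_loss(head, rel, tail, neg_tail, margin=1.0):
    """TransE-style margin ranking loss over knowledge-graph triples."""
    pos = torch.norm(head + rel - tail, p=2, dim=-1)
    neg = torch.norm(head + rel - neg_tail, p=2, dim=-1)
    return F.relu(margin + pos - neg).mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    B, D, V, L = 8, 256, 30522, 64  # batch size, hidden dim, vocab size, seq length
    mlm_labels = torch.randint(0, V, (B, L))
    mlm_labels[:, ::2] = -100  # pretend only odd positions were masked
    total = (structure_text_contrastive_loss(torch.randn(B, D), torch.randn(B, D))
             + cross_modal_matching_loss(torch.randn(B), torch.randint(0, 2, (B,)))
             + masked_language_modeling_loss(torch.randn(B, L, V), mlm_labels)
             + knowledge_graph_embedding_loss(torch.randn(B, D), torch.randn(B, D),
                                              torch.randn(B, D), torch.randn(B, D)))
    print("total pre-training loss:", total.item())
```

My main uncertainty is how the four terms are actually combined (a simple sum as above, or weighted) and where the corresponding code lives in this repository.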
In addition, the ckpt file is currently hosted only on Baidu, which makes it difficult to download. Could you provide the file through another channel?
Sincerely, Syzseisus
P.S. (to the authors of the paper) Why did you use the knowledge graph only in the molecular property prediction task and not in the other downstream tasks among the four mentioned in the paper?