Erva Ulusoy committed on
Commit 1d133b0 · 1 Parent(s): 0d1479e

changed workflow pdf to png, it was not working

figures/ProtHGT_workflow.pdf DELETED
Binary file (467 kB)
 
figures/ProtHGT_workflow.png ADDED
pages/About.py CHANGED
@@ -37,10 +37,7 @@ Overall workflow of ProtHGT is shown below.
     """)
 
     st.subheader('Schematic overview of ProtHGT', anchor='schematic-overview')
-    with open("figures/ProtHGT_workflow.pdf", "rb") as pdf_file:
-        base64_pdf = base64.b64encode(pdf_file.read()).decode('utf-8')
-    pdf_display = F'<iframe src="data:application/pdf;base64,{base64_pdf}" width="750" height="550" type="application/pdf"></iframe>'
-    st.markdown(pdf_display, unsafe_allow_html=True)
+    st.image('figures/ProtHGT_workflow.png')
 
     st.markdown(
         '<p style="text-align:center"><em><strong>Schematic representation of the ProtHGT framework. a)</strong> Diverse biological datasets, including proteins, pathways, domains, and GO terms, are integrated into a unified knowledge graph; <strong>b)</strong> the heterogeneous graph is constructed, capturing multi-relational biological associations; <strong>c)</strong> feature vectors for each node type are generated using state-of-the-art embedding methods; <strong>d)</strong> protein function prediction models are trained separately for molecular function, biological process, and cellular component sub-ontologies; <strong>e)</strong> heterogeneous graph transformer (HGT) layers process and refine node representations through multi-relational message passing. Final protein function predictions are obtained by linking proteins to GO terms based on learned embeddings and attention-weighted relationships.</em></p>', unsafe_allow_html=True)
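For context, the removed branch embedded the PDF as a base64 data URI inside an iframe, a pattern that commonly fails because some browsers refuse to render `data:` PDFs in iframes (the commit message notes it "was not working"). A minimal stdlib sketch of that pattern, for reference (the helper name `pdf_data_uri_iframe` is illustrative, not from the repo):

```python
import base64

def pdf_data_uri_iframe(pdf_bytes: bytes, width: int = 750, height: int = 550) -> str:
    """Build an HTML iframe whose src is a base64 data URI of the PDF.

    This mirrors the deleted code path in pages/About.py: the PDF bytes
    are base64-encoded and inlined directly into the page, so no file is
    served, but the browser must be willing to render a data: PDF.
    """
    b64 = base64.b64encode(pdf_bytes).decode('utf-8')
    return (f'<iframe src="data:application/pdf;base64,{b64}" '
            f'width="{width}" height="{height}" type="application/pdf"></iframe>')

# Example with a stand-in byte string instead of a real PDF file:
html = pdf_data_uri_iframe(b'%PDF-1.4')
```

The replacement, `st.image('figures/ProtHGT_workflow.png')`, sidesteps the rendering problem entirely, since Streamlit serves the PNG as a plain image rather than relying on the browser's embedded PDF viewer.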