---
license: afl-3.0
---

## Suri: Develop new algorithms that are more efficient at training models on scRNA-seq data

## Data

## Usage

- Pretrain on single-cell RNA-seq data

```
python pretrain.py --data_path "data_path"
```
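
The repository's `pretrain.py` is not reproduced in this README; as a rough orientation only, a script exposing the `--data_path` flag above might be structured along these lines. Everything except the flag name is an illustrative assumption (including the use of AnnData `.h5ad` files, a common on-disk format for scRNA-seq matrices), not the project's actual code.

```python
# Hypothetical outline only: the real pretrain.py is not shown in this README.
# The --data_path flag comes from the usage example above; the data format
# (AnnData .h5ad) and everything else here are illustrative assumptions.
import argparse

import anndata  # pip install anndata


def main():
    parser = argparse.ArgumentParser(description="Pretrain on scRNA-seq data")
    parser.add_argument("--data_path", type=str, required=True,
                        help="Path to the preprocessed scRNA-seq dataset")
    args = parser.parse_args()

    # Load the cells-by-genes expression matrix.
    adata = anndata.read_h5ad(args.data_path)
    print(f"Loaded {adata.n_obs} cells x {adata.n_vars} genes")

    # ... tokenize expression profiles, build the model, run pretraining ...


if __name__ == "__main__":
    main()
```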

## Time cost

Typical install time on a "normal" desktop computer is about 30 minutes.

Expected run time for inferring 10,000 cells on a "normal" desktop computer is about 25 minutes.

## Disclaimer

This project is intended for academic research purposes.

## Citations

You can find more information in these citations if you are interested in the technical details.

```bibtex
@article{yang2022scbert,
  title={scBERT as a large-scale pretrained deep language model for cell type annotation of single-cell RNA-seq data},
  author={Yang, Fan and Wang, Wenchuan and Wang, Fang and Fang, Yuan and Tang, Duyu and Huang, Junzhou and Lu, Hui and Yao, Jianhua},
  journal={Nature Machine Intelligence},
  volume={4},
  number={10},
  pages={852--866},
  year={2022},
  publisher={Nature Publishing Group UK London}
}
```

```bibtex
@inproceedings{choromanski2020rethinking,
  title={Rethinking Attention with Performers},
  author={Krzysztof Choromanski and Valerii Likhosherstov and David Dohan and Xingyou Song and Andreea Gane and Tamas Sarlos and Peter Hawkins and Jared Davis and Afroz Mohiuddin and Lukasz Kaiser and David Belanger and Lucy Colwell and Adrian Weller},
  booktitle={International Conference on Learning Representations},
  year={2021}
}
```

```bibtex
@article{liu2023sophia,
  title={Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training},
  author={Liu, Hong and Li, Zhiyuan and Hall, David and Liang, Percy and Ma, Tengyu},
  journal={arXiv preprint arXiv:2305.14342},
  year={2023}
}
```
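
For context on the Performer citation above, the following is a minimal sketch of its FAVOR+ positive-random-feature approximation of softmax attention, which is what makes attention linear in sequence length. It is a generic illustration of the cited technique under assumed tensor shapes and a made-up function name, not code from this repository.

```python
# Generic illustration of Performer-style (FAVOR+) linear attention,
# as described in Choromanski et al. (2021). Not this project's code.
import torch


def performer_attention(q, k, v, num_features=256, eps=1e-6):
    """Approximate softmax attention in O(n * m * d) instead of O(n^2 * d).

    q, k, v: tensors of shape (batch, seq_len, dim)."""
    b, n, d = q.shape
    # Scale so that q'.k' == q.k / sqrt(d), matching standard softmax attention.
    scale = d ** -0.25
    q, k = q * scale, k * scale

    # Random projection omega ~ N(0, I). The paper uses orthogonal random
    # features to reduce variance; plain Gaussian sampling keeps this short.
    omega = torch.randn(d, num_features, device=q.device, dtype=q.dtype)

    def feature_map(x):
        # phi(x) = exp(x @ omega - ||x||^2 / 2) / sqrt(m)  (positive features).
        # A max-subtraction stabilizer is usually added for numerical safety.
        x_norm = (x ** 2).sum(dim=-1, keepdim=True) / 2
        return torch.exp(x @ omega - x_norm) / (num_features ** 0.5)

    q_prime = feature_map(q)  # (b, n, m)
    k_prime = feature_map(k)  # (b, n, m)

    # Contract keys with values first, then apply queries: linear in n.
    kv = torch.einsum('bnm,bnd->bmd', k_prime, v)                             # (b, m, d)
    normalizer = q_prime @ k_prime.sum(dim=1, keepdim=True).transpose(1, 2)   # (b, n, 1)
    return torch.einsum('bnm,bmd->bnd', q_prime, kv) / (normalizer + eps)
```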