Amelie-Schreiber
Independent researcher and mathematician working on protein language models, equivariant transformers, Low-Rank Adaptations (LoRA), and QLoRA
Pinned Repositories
anomaly_detection_persistent_homology
Detecting anomalous texts with persistent homology of context vectors computed by individual attention heads in language transformers
confiboot
Confidence bootstrapping for DiffDock
esm2_LoRA_binding_sites
Training a Low Rank Adaptation of the protein language model ESM-2 as an RNA binding site predictor
esm2_loras
Trying to train LoRAs for ESM-2
esm2_masked_lm
Various Ideas for Protein Masked LM with ESMFold/ESM-2
hebrew_context_persistent_homology
Persistent homology of context vectors computed by individual attention heads in Hebrew language models
protein_mutation_scoring
Some scoring functions for predicting the effects of mutations on protein sequences using ESM-2
sense-ppi
ShapeMol
Shape conditioned diffusion model for small molecule generation and drug design
transformers_proteins_and_persistent_homology
Transformers, Protein Sequences, and Persistent Homology
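Several of the pinned repositories apply persistent homology to context vectors produced by attention heads. As a minimal, self-contained sketch of the core computation (not the repositories' actual code), zero-dimensional persistence of a Vietoris-Rips filtration reduces to Kruskal's algorithm over pairwise distances: each point is a component born at filtration value 0, and a component dies when the edge merging it into another component appears.

```python
import itertools
import math

def zero_dim_persistence(points):
    """0-dimensional persistence bars (birth, death) of a Vietoris-Rips
    filtration on a point cloud. Every component is born at 0; a component
    dies at the length of the edge that merges it into another one.
    Equivalent to running Kruskal's MST algorithm on pairwise distances."""
    n = len(points)
    # All pairwise Euclidean distances, sorted: the filtration's edge order.
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(n), 2)
    )
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # this edge merges two components: one bar dies at d
            parent[ri] = rj
            deaths.append(d)
    # n - 1 finite bars, plus one infinite bar for the surviving component.
    return [(0.0, d) for d in deaths] + [(0.0, math.inf)]
```

In practice the input points would be context vectors extracted from a single attention head, and a library such as Ripser would be used to compute higher-dimensional persistence as well; this sketch only covers dimension zero.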
Amelie-Schreiber's Repositories
Amelie-Schreiber/persistent_homology_of_attention
Persistent homology analysis of individual attention heads in transformer models
Amelie-Schreiber/attention_persistent_homology
Studying how well the persistent homology of collocations and keyphrases is preserved by various models.
Amelie-Schreiber/chatgpt
Experiments with ChatGPT
Amelie-Schreiber/lora_models
Some ideas and code for LoRA models (Low Rank Adaptation)
Amelie-Schreiber/persistent_homology_of_entanglement
Amelie-Schreiber/quantum_surface_codes
Quantum Surface Codes
Amelie-Schreiber/the-singularity
Quantum machine learning and quantum enhanced AI
Amelie-Schreiber/topology_of_attention_heads
Persistent topological representation of attention heads within a transformer model.
Amelie-Schreiber/yiddish_context_persistent_homology
Persistent homology of context vectors computed by attention heads in various language models for Yiddish.
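Several repositories above train Low-Rank Adaptations of ESM-2. The core LoRA idea is to freeze a pretrained weight matrix W and learn a rank-r update scaled by alpha/r, so the effective weight is W + (alpha/r) · BA while only r·(d_in + d_out) extra parameters are trained. A minimal sketch with NumPy (an illustration of the technique, not the repositories' training code, which uses Hugging Face PEFT against ESM-2):

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16.0):
    """Forward pass through a frozen linear layer plus a LoRA update.
    x: (batch, d_in) inputs
    W: (d_out, d_in) frozen pretrained weight
    A: (r, d_in) and B: (d_out, r): trainable low-rank factors, r << d_in
    Effective weight is W + (alpha / r) * B @ A, applied factor-by-factor
    so the full d_out x d_in update matrix is never materialized."""
    r = A.shape[0]
    scaling = alpha / r
    return x @ W.T + scaling * (x @ A.T) @ B.T
```

Following the standard LoRA initialization, A is random and B starts at zero, so the adapted layer initially reproduces the frozen model exactly and training only gradually moves it away from the pretrained behavior.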