Pinned Repositories
094295_hw1
Early Prediction of Sepsis from Clinical Data
094295_hw2
Improving model performance through data alone
fine-tuning-Bert-for-NER-task
Transformers are the state of the art for NLP tasks, and BERT models are usually fine-tuned on datasets built for a specific task. In this work we apply a BERT model that was first fine-tuned for NER (Named Entity Recognition) and then fine-tuned a second time for a specific domain. We observed a large gap between the model's performance after only the general NER fine-tuning and its performance after the additional domain-specific fine-tuning.
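The repository's exact training setup isn't shown here; as a minimal sketch of the two-stage fine-tuning described above, using the Hugging Face `transformers` Trainer API (the model name, label set, and datasets are placeholders, not the repo's actual choices):

```python
# Hypothetical sketch of the two-stage fine-tuning described above.
# Model name, label list, and datasets are placeholders, not the repo's actual values.
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          Trainer, TrainingArguments)

labels = ["O", "B-ENT", "I-ENT"]  # placeholder NER tag set
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels))

def fine_tune(model, dataset, output_dir):
    """One fine-tuning stage; `dataset` is a tokenized, label-aligned Dataset."""
    args = TrainingArguments(output_dir=output_dir,
                             num_train_epochs=3,
                             per_device_train_batch_size=16)
    Trainer(model=model, args=args, train_dataset=dataset).train()
    return model

# Stage 1: general NER fine-tuning (e.g. on a CoNLL-style corpus).
# model = fine_tune(model, general_ner_dataset, "out/general-ner")
# Stage 2: continue fine-tuning on the domain-specific NER corpus.
# model = fine_tune(model, domain_ner_dataset, "out/domain-ner")
```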
generating-anime-faces
Generating Anime Faces Using a Variational Autoencoder (VAE)
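The description only names the technique; as a hedged sketch of the core VAE ingredients (the reparameterization trick and the reconstruction-plus-KL loss) in PyTorch, not the repository's actual model:

```python
# Minimal VAE building blocks (illustrative only; not the repo's actual architecture).
import torch
import torch.nn.functional as F

def reparameterize(mu, logvar):
    """Sample z = mu + sigma * eps with eps ~ N(0, I) (the reparameterization trick)."""
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

def vae_loss(recon_x, x, mu, logvar):
    """Reconstruction term plus KL divergence between q(z|x) and N(0, I)."""
    recon = F.mse_loss(recon_x, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kld
```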
Statistic-data-analysis
In this project, we took a dataset and ran statistical analyses on it, including hypothesis testing, confidence intervals, linear regression, logistic regression, the bootstrap, Bayesian methods, and handling missing data.
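As one hedged example of the listed methods (not the project's code, and using synthetic placeholder data), a percentile bootstrap confidence interval for a sample mean:

```python
# Illustrative bootstrap confidence interval for a sample mean
# (not taken from the project; the data here is synthetic).
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=200)   # placeholder dataset

boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(10_000)
])
low, high = np.percentile(boot_means, [2.5, 97.5])  # 95% percentile CI
print(f"mean = {sample.mean():.3f}, 95% bootstrap CI = [{low:.3f}, {high:.3f}]")
```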
Word-Embeddings-and-the-Brain
Extension of the paper Pereira, F., Lou, B., Pritchett, B., Ritter, S., Gershman, S. J., Kanwisher, N., Botvinick, M., & Fedorenko, E. (2018). Toward a universal decoder of linguistic meaning from brain activation. Nature Communications, 9(1), 1–13, focusing on decoding words from fMRI data.
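Pereira et al. learn a regularized linear map from fMRI voxel patterns to semantic word vectors; below is a minimal sketch of that kind of decoder with scikit-learn, where the data, dimensions, and train/test split are placeholders rather than this project's actual setup:

```python
# Sketch of a linear decoder from fMRI voxels to word-embedding space,
# in the spirit of Pereira et al. (2018); data and shapes are placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics.pairwise import cosine_similarity

n_words, n_voxels, emb_dim = 180, 5000, 300
rng = np.random.default_rng(0)
X = rng.standard_normal((n_words, n_voxels))   # fMRI pattern per word (placeholder)
Y = rng.standard_normal((n_words, emb_dim))    # word embedding per word (placeholder)

decoder = Ridge(alpha=1.0).fit(X[:150], Y[:150])   # train on held-in words
pred = decoder.predict(X[150:])                    # decode held-out words

# Rank-style evaluation: is each predicted vector closest to its own word's embedding?
sims = cosine_similarity(pred, Y[150:])
top1 = (sims.argmax(axis=1) == np.arange(sims.shape[0])).mean()
print(f"top-1 identification accuracy on held-out words: {top1:.2f}")
```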
gisser770
Config files for my GitHub profile.
github-projects-playground