A PyTorch Lightning framework for prototyping retriever architectures.
A "just a few lines of code" utility for fine-tuning (not only) Llama models.
A word-level Transformer layer based on PyTorch and 🤗 Transformers.
A template for initializing PyTorch projects that use PyTorch Lightning as their backbone framework.
Wrappers for NLP preprocessing pipelines.
Simple multilingual NER model serving using Docker, FastAPI, ONNX, and multilingual MiniLM.
A simple NER model showcasing the Transformer Embedder library.
Reimplementation of a BERT-based model (Shi et al., 2019), currently the state of the art for English SRL. The model also performs predicate disambiguation.
TensorFlow 2 implementation of the Super SloMo paper.
A BERT-based model for state-of-the-art Chinese word segmentation, implemented in PyTorch.
A supervised Bi-LSTM architecture for the Word Sense Disambiguation task.
BabelNet (and WordNet) sense embeddings trained with Word2Vec and FastText.
Convolutional Neural Networks trained to classify different types of boats.
Three different classifiers trained to distinguish malware applications from benign ones and to recognize the family they belong to.
Bachelor’s thesis: Densest Subgraph in Fork/Join, a Fork/Join parallel algorithm for the densest subgraph problem.
A solver that uses the Fork/Join framework to solve Sudoku puzzles in parallel, built during a multi-core programming course.
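The two Fork/Join projects above share the same divide-and-conquer skeleton: a task splits its work, forks one half, computes the other in the current thread, then joins. The sketch below illustrates that pattern only; it is not code from either project. For simplicity it sums an array (the class and threshold names are made up here) rather than exploring Sudoku branches or subgraphs.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Minimal sketch of the Fork/Join divide-and-conquer pattern.
// ForkJoinSketch, SumTask, and THRESHOLD are illustrative names,
// not identifiers from the projects described above.
public class ForkJoinSketch {
    static class SumTask extends RecursiveTask<Long> {
        private static final int THRESHOLD = 1_000; // below this, work sequentially
        private final long[] data;
        private final int lo, hi;

        SumTask(long[] data, int lo, int hi) {
            this.data = data;
            this.lo = lo;
            this.hi = hi;
        }

        @Override
        protected Long compute() {
            if (hi - lo <= THRESHOLD) {               // base case: sequential sum
                long sum = 0;
                for (int i = lo; i < hi; i++) sum += data[i];
                return sum;
            }
            int mid = (lo + hi) >>> 1;
            SumTask left = new SumTask(data, lo, mid);
            left.fork();                               // run left half asynchronously
            long right = new SumTask(data, mid, hi).compute(); // right half here
            return left.join() + right;                // wait for the forked half
        }
    }

    public static void main(String[] args) {
        long[] data = new long[10_000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        long sum = ForkJoinPool.commonPool().invoke(new SumTask(data, 0, data.length));
        System.out.println(sum); // sum of 0..9999 = 49995000
    }
}
```

Forking only one half and computing the other in place is the idiomatic Fork/Join style: it keeps the current worker thread busy instead of parking it while both children run.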