About me.

Hi, I'm Riccardo Orlando, a Computer Scientist and NLP enthusiast. Currently, I'm a first-year PhD student in the SapienzaNLP group at Sapienza University of Rome. Before that, I worked as an NLP Engineer at Babelscape.


Now.

PhD Student - SapienzaNLP Group

Sapienza University of Rome, Italy

Pursuing a PhD in Natural Language Processing with the SapienzaNLP group.

Nov. 2021 - Present


Publications.

Universal Semantic Annotator: the First Unified API for WSD, SRL and Semantic Parsing.

Riccardo Orlando, Simone Conia, Stefano Faralli, and Roberto Navigli

In Proceedings of LREC, 2022

June 2022

AMuSE-WSD: An All-in-one Multilingual System for Easy Word Sense Disambiguation.

Riccardo Orlando, Simone Conia, Fabrizio Brignone, Francesco Cecconi, and Roberto Navigli

In Proceedings of EMNLP, 2021

November 2021

InVeRo-XL: Making Cross-Lingual Semantic Role Labeling Accessible with Intelligible Verbs and Roles.

Simone Conia, Riccardo Orlando, Fabrizio Brignone, Francesco Cecconi, and Roberto Navigli

In Proceedings of EMNLP, 2021

November 2021


Past.

NLP Engineer - Babelscape

Rome, Italy

Worked on two end-to-end, highly efficient multilingual pipelines for Word Sense Disambiguation (AMuSE-WSD) and Semantic Role Labeling (InVeRo-XL).

Feb. 2021 - Dec. 2021

Master of Science in Engineering in Computer Science

Sapienza University of Rome, Italy

Thesis title: An automatic approach to produce multilingual training data for Semantic Role Labelling.

Grade: 110/110 with honours

2018 - 2020

Software Engineer - Capgemini

Rome, Italy

Part of a team that re-platformed a series of internal applications for an insurance company. Worked as a software engineer, mainly on the backend, but also contributed to the frontend.

Mar. 2018 - Sept. 2018

Bachelor of Computer Science

Sapienza University of Rome, Italy

Thesis title: Densest subgraph in Fork/Join.

2015 - 2018


Personal projects.

These are some projects I'm involved in:

transformers-embedder

A word-level Transformer layer based on PyTorch and 🤗 Transformers.

ner-serve

Simple multilingual NER model serving using Docker, FastAPI, ONNX, and Multilingual MiniLM.

transformers-ner

A simple NER model showcasing the transformers-embedder library.

transformer-srl

Reimplementation of a BERT-based model (Shi et al., 2019), currently the state of the art for English SRL. The model also performs predicate disambiguation.

Super SloMo TF2

TensorFlow 2 implementation of the Super SloMo paper.

Chinese word segmentation

A BERT-based model for state-of-the-art Chinese word segmentation, implemented in PyTorch.

Word Sense Disambiguation

A supervised Bi-LSTM architecture for the Word Sense Disambiguation task.

Sense Embeddings

BabelNet (and WordNet) sense embeddings trained with Word2Vec and FastText.

CNN Image Classification

Convolutional Neural Networks trained to classify different types of boats.

Machine Learning Malware Analysis

Three different classifiers trained to distinguish malware applications from benign ones and to recognize the family they belong to.

Densest Subgraph in Fork/Join

Bachelor’s thesis: a Fork/Join parallel algorithm for the densest subgraph problem.

Fork/Join Sudoku Solver

A solver that uses the Fork/Join framework to solve Sudoku in parallel, built during a multi-core programming course.