I'm a researcher in computational linguistics. My work is about making machines understand the meaning of texts (mostly French).
For this I combine insights from theoretical linguistics with empirical methods from computational linguistics.
I am affiliated with the German collaborative research centre SFB 732 at the University of Stuttgart.
I am also still affiliated with the Logoscope project at the University of Strasbourg.
Did it really happen?
Natural language inference, factuality
Texts not only describe events, but also convey information about their factuality, i.e. they encode whether these events correspond to real situations in the world, or to uncertain, (im)probable or (im)possible situations.
We investigate the influence of clause-embedding French verbs on the factuality of the event described by the embedded clause.
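As an illustration, the effect of such clause-embedding verbs can be approximated by a lexicon that maps each matrix verb to a default factuality signature for its embedded clause. A minimal Python sketch, where the verb entries and labels are hypothetical simplifications, not the project's actual lexicon or classification:

```python
# Default factuality a matrix verb assigns to the event in its embedded clause.
# Hypothetical entries: factive verbs presuppose the truth of their complement,
# others leave it uncertain or cast doubt on it.
EMBEDDING_VERBS = {
    "savoir":    "factual",    # 'to know' (factive)
    "regretter": "factual",    # 'to regret' (factive)
    "croire":    "uncertain",  # 'to believe' (non-factive)
    "prétendre": "doubtful",   # 'to claim' (casts doubt)
}

def embedded_factuality(matrix_verb: str) -> str:
    """Return the default factuality of the embedded event."""
    return EMBEDDING_VERBS.get(matrix_verb, "unknown")

# "Marie sait que Paul est parti." -> the departure is presented as factual.
print(embedded_factuality("savoir"))  # factual
print(embedded_factuality("croire"))  # uncertain
```

In practice the default can be overridden by context (negation, questions, conditionals), which is precisely what makes the problem interesting.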
How complete is this event?
Learning degrees of telicity, predicting aspectual values
The extent to which an event described by a sentence is completed, i.e. its aspectual value, results from a complex interplay between lexical features of the predicate and its linguistic context.
We predict the aspectual value of verb senses based on semantic and morpho-syntactic features extracted from a French valence lexicon.
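As a rough illustration of this kind of prediction, here is a minimal Python sketch. The lexicon entries and the heuristic rules are hypothetical simplifications of what feature-based prediction from a valence lexicon involves, not the actual features or model:

```python
# Hypothetical verb-sense entries; the real features come from a French
# valence lexicon (argument structure, semantic class, etc.).
verb_senses = [
    {"lemma": "atteindre", "has_goal_argument": True,  "is_stative": False},
    {"lemma": "courir",    "has_goal_argument": False, "is_stative": False},
    {"lemma": "savoir",    "has_goal_argument": False, "is_stative": True},
]

def predict_aspect(entry: dict) -> str:
    """Toy heuristic: statives are atelic; a goal argument suggests telicity."""
    if entry["is_stative"]:
        return "atelic (state)"
    return "telic" if entry["has_goal_argument"] else "atelic (activity)"

for e in verb_senses:
    print(e["lemma"], "->", predict_aspect(e))
```

A real system would replace the hand-written rules with a classifier trained over many such morpho-syntactic and semantic features.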
Monitoring new words
Detecting Neologisms in French Newspapers
The Logoscope is a tool which (semi-)automatically collects new French words, documents them and provides public access to them through a web interface. In contrast to most existing tools, it aims to give a more complete account of the context in which the word creation occurred.
Words are collected daily by browsing the online versions of several French newspapers. The Logoscope not only provides morpho-syntactic information about the new words but also describes their textual and discursive context. In particular, it automatically determines the (journalistic) topics of the texts containing the new words.
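The core detection step can be sketched as an exclusion-lexicon filter: any word form not found in a reference lexicon becomes a neologism candidate. A minimal Python sketch, where the toy lexicon and the single filter are illustrative assumptions; a real system uses large morphological lexicons and many more filters (named entities, typos, borrowings) before human review:

```python
import re

# Toy reference lexicon; in practice a large French word list.
LEXICON = {"le", "gouvernement", "veut", "les", "villes"}

def neologism_candidates(text: str) -> list[str]:
    """Return word forms absent from the reference lexicon."""
    # Lowercase, then keep runs of French letters (incl. accents and hyphens).
    tokens = re.findall(r"[a-zàâäéèêëîïôöùûüç-]+", text.lower())
    return [t for t in tokens if t not in LEXICON]

article = "Le gouvernement veut végétaliser les villes."
print(neologism_candidates(article))  # ['végétaliser']
```

Each surviving candidate is then documented together with its article, so the textual and discursive context of the creation is preserved.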
This is a project funded by the German Research Foundation (DFG), which was established in 2006 and renewed in 2010 and 2014. It brings together more than 40 researchers from the Institute of Natural Language Processing and the Institute of Linguistics at the University of Stuttgart.
I joined its research project B5 - Polysemy in a Conceptual System in 2015. In B5 we investigate how inferences triggered by polysemous French predicates vary in context.
The Logoscope - Monitoring the creation of new words.
The Logoscope is a project funded by the University of Strasbourg (2012-2015) and the DGLFLF (Délégation générale à la langue française et aux langues de France) (since 2015).
In this project we developed a tool for (semi-)automatically collecting and documenting new words appearing in French newspaper texts. We document not only the morpho-syntactic properties of the new words but also the larger textual and discursive context in which the word creation happened.
I was funded by this project from 2013 to 2016 and developed the first prototype of the tool.
Allegro was an EU-funded INTERREG IV A project focused on the development of new technologies for foreign language learning. Its aim was to make computer-based foreign language learning more interactive, more engaging, and more fun! It provided learners with tools for practising, with a computer, the kinds of conversations they might encounter in real life. I worked on modeling the linguistic and lexical knowledge and its interface with the domain (ontological) knowledge base.
During my PhD work I was employed by INRIA to work on the SEMbySEM project, a European project whose aim was to provide open-source software dedicated to the semantic management and monitoring of systems. Our team's contribution was to address the linguistic requirements: how to represent the ontological objects in language, in English and in the other participants' languages.
I was born and grew up in Bucharest, in communist Romania. I was educated at the German school in Bucharest, where I received a bicultural German and Romanian schooling - which later I learned to value very much.
After moving to Germany I studied Mathematics at the University of Bonn, where I appreciated the motivating and inspiring scientific environment and graduated in 1988 with a master's thesis in Algebraic Number Theory under the supervision of Prof. Fritz Grunewald.
Despite the pleasant years with Mathematics in Bonn, I left academia and went to Berlin to work as a software engineer, first in telecommunications (Alcatel SEL AG), then in ERP (PSIPENTA AG). It is there that my daughters Lea and Kari were born, and this is only one of the reasons that Berlin was and is my favourite town.
In 1998, when my husband finally obtained a permanent position as a researcher, we moved to Nancy, France. There I worked as a software engineer in the Synalp team (then called LED), and this is how I got into Computational Linguistics and research. In 2008 I obtained a Master's degree in Computer Science/Computational Linguistics and started a PhD under the supervision of Claire Gardent and Samuel Cruz-Lara, which I defended in 2012.
In 2013 we moved to Strasbourg, where we can enjoy the French and German ways of life -- and our very own mix of both. I first worked with LiLPa, the NLP team at the University of Strasbourg, developing a (semi-)automatic neology detection and documentation system. Since 2015 I have been a postdoctoral researcher with the Romance linguistics and computational linguistics departments at the University of Stuttgart.
Playing the Clarinet.
In 2006 I plucked up my courage and started playing the clarinet, which has given me great pleasure ever since.
in Germany (organised by Velociped).