Contact Addresses

There is a moderated mailing list mrp-users@nlpl.eu for task participants (or candidate future users of the data and evaluation software).  All interested parties are kindly asked to self-subscribe to this list.  The archives of the mailing list are open to the public, as the list is intended both for updates from the organizers and for general discussion among participants.

Additionally, the organizers of the MRP 2019 parsing task can be reached at the following address:

   mrp-organizers@nlpl.eu

News & Updates

June 24, 2019
A re-release of the MRP companion data provides reference (if not gold-standard) ‘alignments’ (i.e. anchoring) for the AMR training graphs, obtained from the JAMR and ISI aligners.  mtool now offers basic graph validation.
June 16, 2019
The MRP evaluation software (mtool, the Swiss Army Knife of Meaning Representation) now provides an implementation of the official MRP cross-framework metric.  Debugging and refinement are still ongoing.
June 3, 2019
An initial release of the MRP evaluation tool is available on GitHub.  Unified cross-framework evaluation, however, is still under development.  Please monitor the repository and its issue tracker for continuous updates.
May 25, 2019
We have clarified the constraints on which data resources can be used in addition to the training and companion data distributed by the task organizers.  The deadline for nominations of additional data has been extended to Monday, June 3, 2019.
May 21, 2019
We have released an update to the UCCA training graphs (improving consistency and adding more annotations), i.e. an ‘overlay’ to the original, full training data.  Furthermore, premium-quality tokenization and morpho-syntactic parses of the training data are now available.
April 26, 2019
We have moved the mid-May target date for extended (UCCA) training data and the morpho-syntactic ‘companion’ parses back by one week.  Also, the official scorer may not be available before early June, but its approach will mirror extant, framework-specific evaluation tools.
April 9, 2019
New information has been added to the task web site, including a description of the uniform serialization; a public sample of sentences annotated in all frameworks; and the no-cost evaluation license for access to the training data.
March 6, 2019
The initial task web site is on-line, and the first call for participation has been posted to major mailing lists.  Please sign up for the task mailing list to receive continuous updates.
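The uniform serialization mentioned in the April 9 update is a JSON Lines format, with one graph per line.  As a rough illustration only (the field names below follow the public MRP samples; the single-graph string is a hypothetical, heavily abbreviated example, not actual task data), a minimal reader might look like:

```python
import json

def read_mrp(path):
    """Yield graphs from an MRP-style JSON Lines file: one JSON object per line."""
    with open(path) as stream:
        for line in stream:
            line = line.strip()
            if line:
                yield json.loads(line)

# Hypothetical, abbreviated single-graph record for illustration.
example = ('{"id": "0", "framework": "eds", "tops": [0], '
           '"nodes": [{"id": 0, "label": "_want_v_1"}], "edges": []}')
graph = json.loads(example)
print(graph["framework"], len(graph["nodes"]))  # prints: eds 1
```

Real graphs carry additional top-level fields (e.g. the underlying input string); please consult the task documentation and the public sample annotations for the authoritative description of the format.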

Task Co-Organizers

Acknowledgements

Several colleagues have assisted in designing the task and preparing its data and software resources.  Sebastian Schuster kindly made available a pre-release of the converter from PTB-style constituent trees to (basic) UD 2.x dependency graphs.  Milan Straka provided invaluable assistance in training and running the latest development version of his UDPipe system, to generate the morpho-syntactic companion trees for the MRP sentences.  Zdeňka Urešová graciously took the time to provide (and quality-control) fresh gold-standard annotations for the PSD evaluation graphs.  The ‘companion’ alignments for AMR graphs were most helpfully prepared by Jayeol Chun, including forcing the aligners to use the tokenization from the MRP morpho-syntactic companion parses.

We are grateful to the Nordic e-Infrastructure Collaboration for its support of the Nordic Language Processing Laboratory (NLPL), which provides technical infrastructure for the MRP 2019 task.  Also, we warmly acknowledge the assistance of the Linguistic Data Consortium (LDC) in distributing the training data for the task to participants at no cost.

Last updated: 2019-06-24 (21:06)