Organisation | Year | Campaign |
---|---|---|
A2iA | 2010 | OpenHaRT2010 (not scored) http://www.nist.gov/itl/iad/mig/hart2010.cfm |
A2iA | 2011 | ICDAR’11: 1st place in the Worldwide Handwriting Recognition Competition http://www.icdar2011.org/EN/volumn/home.shtml Beijing, China, September 2011 |
GREYC | 2011 | ImageCLEF 2011: 2nd place, Photo Annotation Task (image concept recognition) http://www.imageclef.org/2011 Amsterdam, The Netherlands, September 2011 |
GREYC | | "Labeled Faces in the News": face similarity |
GREYC | | FERET: face recognition |
GREYC | | JAFFE: facial expression recognition |
GREYC | | VIPeR + iLIDS: person re-identification in videos |
GREYC | | HATDB: recognition of distinctive personal attributes |
INRA | 2011 | Best result at BioNLP ST’11, BB Task http://2011.bionlp-st.org/ |
Inria | 2008 | Excellent results on the TRECVID 2008 copy detection task, Inria Lear http://www-nlpir.nist.gov/projects/tv2008/tv2008.html Gaithersburg, USA, November 2008 |
Inria | 2008 | 1st in PASCAL Visual Object Classes Challenge (VOC2008) http://pascallin.ecs.soton.ac.uk/challenges/VOC/voc2008/ Marseille, France, March 2008 |
Inria | 2009 | ImageCLEF’09: second place among the 19 participating teams for the Photo Annotation and Image Retrieval tasks, Inria Lear http://www.imageclef.org/2009 Corfu, Greece, Sep – Oct 2009 |
Inria | 2010 | PASCAL VOC 2010: INRIA-LEAR’s work on human action recognition achieved the best results on three out of nine action classes, and the best average results. http://pascallin.ecs.soton.ac.uk/challenges/VOC/voc2010/ |
Inria | 2012 | TRECVID MED challenge 2012: first and second place. The Multimedia Event Detection (MED) evaluation track is part of the TRECVID evaluation. http://www.nist.gov/itl/iad/mig/med12.cfm |
IRCAM | 2008 | Best results in several categories at MIREX 2008 http://www.music-ir.org/mirex/wiki/2008:Main_Page Philadelphia, USA, September 2008 |
IRCAM | 2009 | MIREX’09 http://www.music-ir.org/mirex/wiki/2009:Main_Page Kobe, Japan, October 2009 |
IRCAM | 2010 | 1st rank at MIREX’10 in several different categories http://www.music-ir.org/mirex/wiki/2010:Main_Page Utrecht, The Netherlands, August 2010 |
IRCAM | 2011 | 1st rank at MIREX’11 in several different categories http://www.music-ir.org/mirex/wiki/2011:Main_Page Miami, USA, October 2011 |
IRCAM | 2012 | 1st rank at MIREX’12 in several different categories http://www.music-ir.org/mirex/wiki/2012:Main_Page Porto, Portugal, October 2012 |
JOUVE | 2010 | 2nd rank in CLEF-IP evaluation campaign http://www.ir-facility.org/clef-ip Padua, Italy, September 2010 |
JOUVE | 2011 | ICDAR 2011: 1st in category "Historical Document Layout Analysis" http://www.icdar2011.org/EN/volumn/home.shtml Beijing, China, September 2011 |
KIT | 2011 | IWSLT 2011 Talk Task evaluation: second place in ASR, first place in MT http://www.iwslt2011.org/ San Francisco, USA, December 2011 |
KIT | 2011 | WMT 2011: 3-time winner in the constrained condition http://www.statmt.org/wmt11/ Edinburgh, UK, July 2011 |
KIT | 2012 | IWSLT 2012 Talk Task evaluation: second place; first place in the SLT task http://hltc.cs.ust.hk/iwslt/ Hong Kong, December 2012 |
KIT | 2012 | WMT 2012: 4-time winner in the constrained condition http://www.statmt.org/wmt12/ Montreal, Canada, June 2012 |
LIMSI | 2008 | 1st in QA@CLEF’08 (main track and audio track – QAST evaluation) http://www.clef-initiative.eu/edition/clef2008 Aarhus, Denmark, September 2008 |
LIMSI | 2008 | LIMSI system ranked 1st in Answer Validation Exercise (AVE) 2008 for French, 2nd among all systems (all languages) http://www.clef-initiative.eu/edition/clef2008 Aarhus, Denmark, September 2008 |
LIMSI | 2008 | GALE 2008, part of the Agile team led by BBN http://www.itl.nist.gov/iad/mig/tests/gale/2008/ |
LIMSI | 2008 | 2008 NIST Speaker Recognition Evaluation http://www.itl.nist.gov/iad/mig/tests/sre/2008/index.html Montreal, Canada, June 2008 |
LIMSI | 2009 | 1st in QA@CLEF’09 (main track and audio track – QAST evaluation) http://www.lsi.upc.edu/~qast/2009/ Corfu, Greece, Sep – Oct 2009 |
LIMSI | 2009 | ESTER 2 evaluation campaign 2009 (segmentation, speaker diarization, STT; top-ranked results for all submissions) http://www.afcp-parole.org/camp_eval_systemes_transcription/ |
LIMSI | 2009 | GALE 2009, part of the Agile team led by BBN http://www.itl.nist.gov/iad/mig/tests/gale/2009/ |
LIMSI | 2009 | 2009 NIST Language Recognition Evaluation http://www.itl.nist.gov/iad/mig/tests/lre/2009/ |
LIMSI | 2010 | GALE 2010, part of the Agile team led by BBN http://projects.ldc.upenn.edu/gale/ |
LIMSI | 2010 | 2010 NIST Speaker Recognition Evaluation http://www.itl.nist.gov/iad/mig/tests/sre/2010/index.html |
LIMSI | 2011 | GALE 2011, part of the Agile team led by BBN http://projects.ldc.upenn.edu/gale/ |
LIMSI | 2011 | 2011 NIST Language Recognition Evaluation http://www.nist.gov/itl/iad/mig/lre11.cfm |
LIMSI | 2011 | WMT 2011: Best MT system for English to French translation http://www.statmt.org/wmt11/ Edinburgh, UK, July 2011 |
LIMSI | 2011 | IWSLT 2011: Best MT system for English to French translation http://www.iwslt2011.org/ San Francisco, USA, December 2011 |
LIMSI | 2012 | 2012 NIST Speaker Recognition Evaluation http://www.nist.gov/itl/iad/mig/sre12.cfm |
LIMSI | 2012 | REPERE dry run http://www.defi-repere.fr/ January 2012 |
LIMSI | 2012 | WMT’12: 1st French/English MT system http://www.statmt.org/wmt12/ Montreal, Canada, June 2012 |
LIMSI | | ETAPE: speaker diarization, named entities |
LIMSI/RWTH/KIT/Systran | 2011 | Jointly (with Aachen, KIT and Systran) best German/English system at WMT 2011 http://www.statmt.org/wmt11/ Edinburgh, UK, July 2011 |
LIMSI/RWTH/KIT/Systran | 2012 | Jointly (with Aachen, KIT and Systran) best German/English system at WMT 2012 http://www.statmt.org/wmt12/ Montreal, Canada, June 2012 |
LIMSI/Vocapia | 2008 | 1st in Technolangue ESTER’08 http://www.technolangue.net/article60.html Aarhus, Denmark, September 2008 |
LIMSI/Vocapia | 2008 | NBest 2008 with Vocapia (top ranked results for all submissions) |
RWTH | 2009 | 3rd at the ICDAR’09 Arabic Handwriting Recognition Competition http://www.cvc.uab.es/icdar2009/ Barcelona, Spain, July 2009 |
RWTH | 2010 | Best BLEU Score for English to German translation in the WMT 2010 Shared Translation Task and many other top positions in translation and system combination tasks http://www.statmt.org/wmt10/ Uppsala, Sweden, July 2010 |
RWTH | 2010 | 2nd rank in ICFHR 2010 (Arabic Handwriting Recognition Competition) http://www.isical.ac.in/~icfhr2010/ Kolkata, India, November 2010 |
RWTH | 2010 | 2nd rank in IWSLT 2010: BTEC Arabic to English translation task among 13 participants, according to the automatic evaluation http://iwslt2010.fbk.eu/ Paris, France, December 2010 |
RWTH | 2012 | IWSLT 2012: RWTH best system (Ar->En) http://hltc.cs.ust.hk/iwslt/ Hong Kong, December 2012 |
Synapse | 2008 | 1st in QA@CLEF’08 (main track and audio track – QAST evaluation) http://www.clef-initiative.eu/edition/clef2008 Aarhus, Denmark, September 2008 |
Synapse | 2009 | 1st in QA@CLEF’09 (main track and audio track – QAST evaluation) http://www.clef-initiative.eu/edition/clef2009 Corfu, Greece, Sep – Oct 2009 |
Télécom ParisTech | 2008 | Participation in the MIREX’08 evaluation campaign: Multiple Fundamental Frequency Estimation & Tracking (rank 5th/10) http://www.music-ir.org/mirex/wiki/2008:Main_Page Philadelphia, USA, September 2008 |
Télécom ParisTech | 2008 | Participation in the MIREX’08 evaluation campaign: Melody Extraction (rank 2nd/5) http://www.music-ir.org/mirex/wiki/2008:Main_Page Philadelphia, USA, September 2008 |
Télécom ParisTech | 2008 | Participation in the 2008 French ESTER 2 evaluation campaign, audio segmentation task (very good results) http://www.afcp-parole.org/camp_eval_systemes_transcription/ |
Télécom ParisTech | 2009 | MIREX’09 evaluation campaign: Chord Detection (rank 2nd/9 for score 1, rank 1st/8 for score 2) http://www.music-ir.org/mirex/wiki/2009:Main_Page Kobe, Japan, October 2009 |
Télécom ParisTech | 2009 | MIREX’09 evaluation campaign: Melody Extraction (rank 2nd/9) http://www.music-ir.org/mirex/wiki/2009:Main_Page Kobe, Japan, October 2009 |
Télécom ParisTech | 2010 | MIREX’10 evaluation campaign: Chord Detection (Rank 7th / 12) http://www.music-ir.org/mirex/wiki/2010:Main_Page Utrecht, The Netherlands, August 2010 |
Télécom ParisTech | 2011 | MIREX’11 evaluation campaign: Beat Tracking (Rank 2nd/6 on MCK dataset) http://www.music-ir.org/mirex/wiki/2011:Main_Page Miami, USA, October 2011 |
Télécom ParisTech | 2011 | Participation in the SiSEC 2011 Signal Separation Evaluation Campaign (the only technique capable of achieving high performance on large files) http://sisec2011.wiki.irisa.fr/tiki-index.php |
Télécom ParisTech | 2012 | Participation in the MIREX’12 evaluation campaign: Beat Tracking (rank 3rd/9 on the MCK dataset) http://www.music-ir.org/mirex/wiki/2012:Main_Page Porto, Portugal, October 2012 |
Télécom ParisTech | 2012 | Participation in the MIREX’12 evaluation campaign: Multiple Fundamental Frequency Estimation & Tracking (rank 1st/6 on the Note Tracking task) http://www.music-ir.org/mirex/wiki/2012:Main_Page Porto, Portugal, October 2012 |
Université Joseph Fourier | 2011 | 3rd at the TRECVID semantic indexing task in 2011 http://www-nlpir.nist.gov/projects/tv2011/tv2011.html |
Université Joseph Fourier | 2012 | 3rd at the TRECVID semantic indexing task in 2011 and 2012 http://www-nlpir.nist.gov/projects/tv2012/tv2012.html |
LIMSI/Vocapia | 2011 | 1st rank at the ASR benchmark task of Evalita 2011 http://www.evalita.it/2011 |