Multieurlex #817

Open · wants to merge 7 commits into base: eval-hackathon
283 changes: 283 additions & 0 deletions promptsource/templates/multi_eurlex/all_languages/templates.yaml
@@ -0,0 +1,283 @@
dataset: multi_eurlex
subset: all_languages
templates:
38ddea55-1710-4615-bbfa-fe5803e21e43: !Template
answer_choices: null
id: 38ddea55-1710-4615-bbfa-fe5803e21e43
jinja: 'If the French version says: {{text["fr"]}}; then the English version
should say:
||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: version-fr-en-source+target
reference: ''
2bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 2bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'If the English version says: {{text["en"]}}; then the French version
should say:
||| {{text["fr"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: version-en-fr-source+target
reference: ''
73dc1b77-e8ea-4dc8-8a12-0abc3b0dbba0: !Template
answer_choices: null
id: 73dc1b77-e8ea-4dc8-8a12-0abc3b0dbba0
jinja: 'Given the following source text in French: {{text["fr"]}}, a good
English translation is: ||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: a_good_translation-fr-en-source+target
reference: ''
63dc1b77-e8ea-4dc8-8a12-0abc3b0dbba0: !Template
answer_choices: null
id: 63dc1b77-e8ea-4dc8-8a12-0abc3b0dbba0
jinja: 'Given the following source text in English: {{text["en"]}}, a good
French translation is: ||| {{text["fr"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: a_good_translation-en-fr-source+target
reference: ''
3bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 3bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in English: {{text["en"]}}

Translate the previous document to proper French:
||| {{text["fr"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc-en-fr
reference: ''
5bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 5bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in French: {{text["fr"]}}

Translate the previous document to proper English:
||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc-fr-en
reference: ''
9bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 9bc0e46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in French: {{text["fr"]}}

Translate the entire previous document to proper English sentence by sentence (min 100 words):
||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc_long-fr-en
reference: ''
49ddea55-1710-4615-bbfa-fe5803e21e43: !Template
answer_choices: null
id: 49ddea55-1710-4615-bbfa-fe5803e21e43
jinja: 'The French version says: {{text["fr"]}}; thus the Spanish version
should say:
||| {{text["es"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: version-fr-es-source+target
reference: ''
2bc1f46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 2bc1f46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'The Spanish version says: {{text["es"]}}; hence the Portuguese version
should say:
||| {{text["pt"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: version-es-pt-source+target
reference: ''
74ec1b99-e8ea-4dc8-8a12-0abc3b0dbba0: !Template
answer_choices: null
id: 74ec1b99-e8ea-4dc8-8a12-0abc3b0dbba0
jinja: 'Given the following source text in Portuguese: {{text["pt"]}}, a good
Spanish translation is: ||| {{text["es"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: a_good_translation-pt-es-source+target
reference: ''
63dc1b99-e8ea-4dc8-8a12-0abc3b0dbba0: !Template
answer_choices: null
id: 63dc1b99-e8ea-4dc8-8a12-0abc3b0dbba0
jinja: 'Source text in English: {{text["en"]}}, a good
Portuguese translation is: ||| {{text["pt"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: a_good_translation-en-pt-source+target
reference: ''
3bc0e88c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 3bc0e88c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in Spanish: {{text["es"]}}

Translate the previous document to proper French:
||| {{text["fr"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc-es-fr
reference: ''
5bc1f46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 5bc1f46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in French: {{text["fr"]}}

Translate the previous document to proper Portuguese:
||| {{text["pt"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc-fr-pt
reference: ''
9bc1e46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 9bc1e46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in Portuguese: {{text["pt"]}}

Translate the entire previous document to proper English sentence by sentence (min 100 words):
||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc_long-pt-en
reference: ''
49ddee85-1710-4615-bbfa-fe5803e21e43: !Template
answer_choices: null
id: 49ddee85-1710-4615-bbfa-fe5803e21e43
jinja: 'The English version says: {{text["en"]}}; thus the Spanish version
should say:
||| {{text["es"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: version-en-es-source+target
reference: ''
2bc2e46c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 2bc2e46c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'The Portuguese version says: {{text["pt"]}}; hence the French version
should say:
||| {{text["fr"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: version-pt-fr-source+target
reference: ''
73dc1b99-e8ea-4dc8-8a12-0abc3b0dbba0: !Template
answer_choices: null
id: 73dc1b99-e8ea-4dc8-8a12-0abc3b0dbba0
jinja: 'Given the following source text in French: {{text["fr"]}}, a good
Portuguese translation is: ||| {{text["pt"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: a_good_translation-fr-pt-source+target
reference: ''
63dc1b99-e8eb-4dc8-8a12-0abc3b0dbba0: !Template
answer_choices: null
id: 63dc1b99-e8eb-4dc8-8a12-0abc3b0dbba0
jinja: 'Source text in English: {{text["en"]}}, a good
Spanish translation is: ||| {{text["es"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: a_good_translation-en-es-source+target
reference: ''
3bc0e99c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 3bc0e99c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in Spanish: {{text["es"]}}

Translate the previous document to proper English:
||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc-es-en
reference: ''
5bc1f55c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 5bc1f55c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in Portuguese: {{text["pt"]}}

Translate the previous document to proper English:
||| {{text["en"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc-pt-en
reference: ''
9bc0e73c-d1fe-4bc9-99d1-9b61aa42cd02: !Template
answer_choices: null
id: 9bc0e73c-d1fe-4bc9-99d1-9b61aa42cd02
jinja: 'Document in English: {{text["en"]}}

Translate the entire previous document to proper Spanish sentence by sentence (min 100 words):
||| {{text["es"]}}'
metadata: !TemplateMetadata
choices_in_prompt: false
languages: []
metrics:
- BLEU
original_task: true
name: prev_doc_long-en-es
reference: ''
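For reviewers trying these templates locally: promptsource renders the `jinja` field with Jinja2 and splits the result into prompt and target on the `|||` marker. The sketch below imitates that convention with plain string substitution standing in for Jinja2, and the sample sentences in `record` are invented for illustration:

```python
# Minimal sketch of how one of these templates turns a multi_eurlex record
# into a (prompt, target) pair. The hand-rolled substitution below only
# stands in for promptsource's real Jinja2 rendering; the sample sentences
# are invented.
record = {"text": {
    "en": "The Commission adopted the regulation.",
    "fr": "La Commission a adopté le règlement.",
}}

template = ('If the French version says: {{text["fr"]}}; then the English '
            'version should say:\n||| {{text["en"]}}')

rendered = template
for lang, doc in record["text"].items():
    rendered = rendered.replace('{{text["%s"]}}' % lang, doc)

# The ||| marker separates the prompt shown to the model from the target
# used for scoring (BLEU, per the template metadata).
prompt, target = (part.strip() for part in rendered.split("|||"))
print(prompt)
print(target)
```

Anything after `|||` never reaches the model as input, which is why every template above places exactly one target field after the marker.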