Compact models that mimic the vector space of crisistransformers/CT-M1-Complete. Like BERT, these models are intended to be fine-tuned on downstream tasks.
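A minimal sketch of what such fine-tuning looks like with the `transformers` library, assuming the checkpoints load through the standard Auto classes. The two-class label scheme (crisis-related vs. not) is a hypothetical example, not part of the released models.

```python
# Hedged sketch: fine-tuning a CrisisTransformers checkpoint for sequence
# classification, the same way a BERT checkpoint would be fine-tuned.
# The label set here is a hypothetical placeholder.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "crisistransformers/CT-M1-Complete"


def build_classifier(num_labels: int = 2):
    """Load the tokenizer and the model with a fresh classification head."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME, num_labels=num_labels
    )
    return tokenizer, model


def training_step(tokenizer, model, texts, labels):
    """Run one gradient step on a mini-batch of (text, label) pairs."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    batch["labels"] = torch.tensor(labels)
    outputs = model(**batch)  # loss is computed internally from `labels`
    outputs.loss.backward()
    return outputs.loss.item()


# Example (requires network access to download the checkpoint):
#   tokenizer, model = build_classifier(num_labels=2)
#   loss = training_step(tokenizer, model,
#                        ["Need water and shelter after the flood",
#                         "Lovely weather today"],
#                        [1, 0])  # hypothetical labels
```

In practice the step above would sit inside a standard training loop (or be replaced by the `Trainer` API) with an optimizer, a learning-rate schedule, and a labeled crisis dataset.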
CrisisTransformers
University organization

AI & ML interests: Natural Language Processing, Social Computing, Crisis Informatics
CrisisTransformers | State-of-the-art contextual and semantically meaningful sentence embeddings for crisis-related social media texts
CrisisTransformers is a family of pre-trained language models and sentence encoders introduced in the following papers:
- Pre-trained models and sentence encoders: CrisisTransformers: Pre-trained language models and sentence encoders for crisis-related social media texts
- Multi-lingual sentence encoders: Semantically Enriched Cross-Lingual Sentence Embeddings for Crisis-related Social Media Texts
- Mini models: "Actionable Help" in Crises: A Novel Dataset and Resource-Efficient Models for Identifying Request and Offer Social Media Posts
The models were trained on a corpus of more than 15 billion word tokens from tweets associated with over 30 crisis events, including disease outbreaks, natural disasters, and conflicts.
Models (14)
- crisistransformers/CT-XLMR-SE (Sentence Similarity)
- crisistransformers/CT-mBERT-SE (Sentence Similarity)
- crisistransformers/CT-M1-Complete (Fill-Mask)
- crisistransformers/CT-M2-OneLook (Fill-Mask)
- crisistransformers/CT-M2-BestLoss (Fill-Mask)
- crisistransformers/CT-M2-Complete (Fill-Mask)
- crisistransformers/CT-M3-OneLook (Fill-Mask)
- crisistransformers/CT-M3-BestLoss (Fill-Mask)
- crisistransformers/CT-M3-Complete (Fill-Mask)
- crisistransformers/CT-M1-BestLoss (Fill-Mask)
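The CT-M* checkpoints are masked-language models, so they can be queried directly with the `transformers` fill-mask pipeline. A minimal sketch (the mask token is taken from the model's own tokenizer rather than assumed):

```python
# Hedged sketch: querying a CrisisTransformers fill-mask checkpoint.
# The prompt below is an illustrative example, not from the papers.
from transformers import pipeline

DEFAULT_MLM = "crisistransformers/CT-M1-Complete"


def top_fills(template: str, model_name: str = DEFAULT_MLM, k: int = 3):
    """Return the top-k predicted tokens for the [MASK] slot in `template`."""
    unmasker = pipeline("fill-mask", model=model_name, top_k=k)
    # Substitute the model's registered mask token into the template.
    masked = template.replace("[MASK]", unmasker.tokenizer.mask_token)
    return [result["token_str"] for result in unmasker(masked)]


# Example (requires network access to download the checkpoint):
#   top_fills("A powerful [MASK] struck the coastal city overnight.")
```

The OneLook/BestLoss/Complete suffixes denote different checkpoints of each pre-training run, as described in the papers above.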
Datasets (0): none public yet