---
license: apache-2.0
language:
- en
library_name: transformers
tags:
- mortality
- actuary
- healthcare
- llama
- text-generation
datasets:
- world-mortality
widget:
- text: "What is the life expectancy in United States for 2024?"
---

# Morbid.AI v0.0.4 - Mortality Prediction Model

## Model Description

Morbid.AI is a specialized language model fine-tuned for mortality analysis and actuarial prediction. Built on Llama-2-7b and fine-tuned on the World Mortality Dataset, it provides insights on:

- Life expectancy calculations
- Mortality trend analysis
- Death probability estimations
- Actuarial assessments
- Country-specific mortality comparisons

## Intended Use

This model is designed for:
- Actuarial analysis
- Healthcare research
- Mortality trend analysis
- Educational purposes

**Note:** This model should NOT be used for personal medical advice or life insurance underwriting decisions.

## Training Data

Fine-tuned on:
- World Mortality Dataset (2015-2024)
- 34,537 training examples
- Coverage: 200+ countries
- Mortality metrics from official statistics

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("h3ir/morbid0.0.4")
model = AutoModelForCausalLM.from_pretrained("h3ir/morbid0.0.4")

# Build the prompt and generate up to 200 new tokens
prompt = "What are the mortality trends for Japan in 2023?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
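Because the base model is a 7B-parameter Llama-2, loading it in half precision with automatic device placement is usually preferable on GPU hardware. A minimal sketch, assuming `torch` and `accelerate` are installed (the dtype and device settings are illustrative, not requirements of this model):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load in float16 and let accelerate spread layers across available devices
tokenizer = AutoTokenizer.from_pretrained("h3ir/morbid0.0.4")
model = AutoModelForCausalLM.from_pretrained(
    "h3ir/morbid0.0.4",
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "What is the life expectancy in the United States for 2024?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```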

## API Usage

```bash
curl https://api-inference.huggingface.co/models/h3ir/morbid0.0.4 \
  -X POST \
  -d '{"inputs": "What is the life expectancy in France?"}' \
  -H "Authorization: Bearer YOUR_TOKEN"
```
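The same endpoint can be called from Python. A minimal sketch using the `requests` library, assuming you substitute a valid Hugging Face access token for `YOUR_TOKEN`:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/h3ir/morbid0.0.4"
headers = {"Authorization": "Bearer YOUR_TOKEN"}  # replace with your HF token

def query(payload):
    # POST the prompt to the hosted Inference API and return the parsed JSON
    response = requests.post(API_URL, headers=headers, json=payload)
    response.raise_for_status()
    return response.json()

print(query({"inputs": "What is the life expectancy in France?"}))
```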

## Model Performance

- Training Loss: 0.42
- Validation Accuracy: 87%
- Specialization: Mortality & Actuarial Data

## Limitations

- Data limited to 2015-2024
- Predictions are statistical estimates
- Should not replace professional actuarial advice
- May have biases from source data

## Citation

```bibtex
@misc{morbidai2024,
  author = {h3ir},
  title = {Morbid.AI: Mortality Prediction Model},
  year = {2024},
  publisher = {HuggingFace},
  url = {https://huggingface.co/h3ir/morbid0.0.4}
}
```

## Contact

For questions, visit [morbid.ai](https://morbid.ai).