LFM2.5-1.2B-JP is fine-tuned for Japanese language tasks, delivering high-quality Japanese text generation, translation, and conversation. Built on LFM2.5 with specialized Japanese training data.

Specifications

Property         Value
Parameters       1.2B
Context Length   32K tokens
Architecture     LFM2.5 (Dense)

Japanese NLP: Native Japanese text generation
Translation: English-Japanese translation
Fine-tunable: TRL compatible (SFT, DPO, GRPO); see the fine-tuning sketch below
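
Fine-tune (sketch): the TRL compatibility noted above can be exercised with SFTTrainer. This is a minimal sketch assuming a chat-formatted dataset; the dataset id below is a placeholder, not an official resource.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder dataset id; replace with your own chat-formatted ("messages") data.
dataset = load_dataset("your-org/japanese-chat-sft", split="train")

trainer = SFTTrainer(
    model="LiquidAI/LFM2.5-1.2B-JP",
    args=SFTConfig(output_dir="lfm2.5-1.2b-jp-sft"),
    train_dataset=dataset,
)
trainer.train()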

Quick Start

Install:
pip install transformers torch accelerate
Download & Run:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer (device_map="auto" requires accelerate)
model = AutoModelForCausalLM.from_pretrained("LiquidAI/LFM2.5-1.2B-JP", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("LiquidAI/LFM2.5-1.2B-JP")

# Build a chat-formatted prompt ("What is machine learning?")
input_ids = tokenizer.apply_chat_template(
    [{"role": "user", "content": "機械学習とは何ですか?"}],
    add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode the response
output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
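
Translate (sketch): the English-Japanese translation capability listed above can be used through the same chat interface. The prompt wording below is illustrative, not a required format.
# Reuse the model and tokenizer loaded above for a translation-style prompt.
prompt = "Translate the following English sentence into Japanese: 'The weather is nice today.'"
input_ids = tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}],
    add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))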