---
license: mit
language:
- en
base_model:
- mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
tags:
- technology
- QA
---
# TechChat

**TechChat** is a domain-specific chatbot fine-tuned from [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) using LoRA.

## Model Details
- **Base model:** mistralai/Mistral-7B-v0.1
- **Fine-tuning method:** LoRA (Low-Rank Adaptation); an adapter-loading sketch follows this list
- **Max sequence length:** 512 tokens
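
If this repository publishes only the LoRA adapter weights rather than a merged checkpoint, the adapter can be attached to the base model with the `peft` library. This is a minimal sketch assuming the adapter lives under the same repo id, `hari7261/TechChat`; adjust the id if the adapter is hosted elsewhere.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the frozen base model that the LoRA adapter was trained on.
base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

# Attach the LoRA adapter (assumed repo id; see note above).
model = PeftModel.from_pretrained(base, "hari7261/TechChat")

# Optionally fold the adapter into the base weights for faster inference.
model = model.merge_and_unload()
```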

## Intended Use
- Technical Q&A on general technology topics
- Chat-style interactions

## Example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned model and its tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained("hari7261/TechChat")
tokenizer = AutoTokenizer.from_pretrained("hari7261/TechChat")

prompt = "Explain DNS in simple terms."
# Keep inputs within the 512-token maximum sequence length.
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=512)
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
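
For chat-style interactions, sampling usually produces more natural replies than greedy decoding. A short sketch with illustrative generation parameters (the values below are common defaults, not settings tuned for TechChat):

```python
# Sampling settings are illustrative, not values tuned for this model.
outputs = model.generate(
    **inputs,
    max_new_tokens=150,
    do_sample=True,   # sample instead of greedy decoding
    temperature=0.7,  # soften the distribution for more varied replies
    top_p=0.9,        # nucleus sampling
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```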