How to use vedhamani/CareMinds-AI with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="vedhamani/CareMinds-AI")

# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("vedhamani/CareMinds-AI")
model = AutoModelForCausalLM.from_pretrained("vedhamani/CareMinds-AI")

CareMinds-AI is a lightweight, offline-capable hybrid medical AI system fine-tuned on domain-specific healthcare data. It operates fully offline, without requiring external APIs.
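Since the model is meant to run fully offline, it can help to tell Transformers explicitly never to reach the network. This is a minimal sketch, assuming the model files have already been downloaded to a local directory (the path and helper name are illustrative):

```python
# Hedged sketch: forcing strictly offline operation. Assumes the model
# files already exist locally (e.g. at ./CareMinds-AI).
import os

# Tell transformers / huggingface_hub never to contact the Hub.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

def load_offline(model_path: str = "./CareMinds-AI"):
    """Load tokenizer and model strictly from local files."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # local_files_only=True raises an error instead of attempting a download.
    tokenizer = AutoTokenizer.from_pretrained(model_path, local_files_only=True)
    model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True)
    return tokenizer, model
```

With both the environment variables and `local_files_only=True` set, any missing file fails fast locally rather than silently triggering a network request.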
Vedhamani Prabakar A
CareMinds-AI can be used locally as follows:
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the model and tokenizer from a local directory
model_path = "./CareMinds-AI"
model = AutoModelForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Move the model to GPU if one is available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

# Build a prompt in the instruction format used during fine-tuning
prompt = """### Instruction:
Explain about diabetes

### Response:
"""

inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
### Test the model:
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("vedhamani/CareMinds-AI")
tokenizer = AutoTokenizer.from_pretrained("vedhamani/CareMinds-AI")

prompt = "### Instruction:\nExplain about diabetes\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
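Because causal LMs echo the prompt back in the generated sequence, it can be convenient to wrap the instruction template in small helpers. This is a hedged sketch: the exact "### Instruction: / ### Response:" template is taken from the snippets above, and the helper names are illustrative, not part of the model's API:

```python
# Hedged sketch: helpers around the instruction format shown above.

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the model's expected prompt template."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def extract_response(decoded: str) -> str:
    """Keep only the model's answer, dropping the echoed prompt."""
    marker = "### Response:\n"
    return decoded.split(marker, 1)[-1].strip()

prompt = build_prompt("Explain about diabetes")
# After generation:
#   answer = extract_response(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keeping the template in one place makes it easy to change later if generations suggest the model was fine-tuned with a slightly different format.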