
Issues: pytorch/torchtune

v0.7.0 tracker (Open)
#2538 opened Mar 30, 2025 by joecummings

Testing tracker (Open)
#1890 opened Oct 23, 2024 by felipemello1

Issues list

[Multi-Node] Test recipes/knowledge_distillation_distributed.py on a multi-node setup (labels: community help wanted, good first issue, testing)
#2649 opened Apr 29, 2025 by joecummings
Set log level for all recipes from the config (labels: community help wanted, enhancement, good first issue)
#2648 opened Apr 29, 2025 by joecummings
Support Qwen3 (labels: enhancement)
#2645 opened Apr 29, 2025 by pocca2048
QAT LoRA-related files for Qwen2.5 (labels: discussion)
#2640 opened Apr 27, 2025 by altria-zewei-wang
Rename LoRATrainable enum (labels: community help wanted)
#2636 opened Apr 25, 2025 by ebsmothers
Triton and edge device support
#2634 opened Apr 25, 2025 by moghadas76
torchtune + vLLM (labels: discussion)
#2632 opened Apr 23, 2025 by haydn-jones
Add tool calling to ShareGPTToMessages Transform (labels: enhancement)
#2618 opened Apr 21, 2025 by joecummings
Add tool calling to OpenAIToMessages Transform (labels: enhancement)
#2617 opened Apr 21, 2025 by joecummings
Support for multi-GPU PPO (labels: enhancement)
#2606 opened Apr 17, 2025 by parthsarthi03
How to try FP8 for LLM finetuning? (labels: discussion)
#2600 opened Apr 15, 2025 by kailashg26
vLLM support for QAT models (labels: discussion)
#2597 opened Apr 15, 2025 by mces89
_generation token mask error
#2590 opened Apr 14, 2025 by xueyan-lii