
Conversation

Contributor

Copilot AI commented Jan 19, 2026

What does this change

Adds support for Alibaba Cloud's Qwen3-0.6B model to the pythainlp.lm module.

What was wrong

The issue requested adding Qwen/Qwen3-0.6B to pythainlp.lm as a lightweight model option suitable for small tasks.

How this fixes it

New Qwen3 wrapper class (pythainlp/lm/qwen3.py), with a sketch of the interface after this list:

  • load_model(): Loads Qwen3-0.6B from HuggingFace with configurable device and dtype
  • generate(): Basic text generation from prompts
  • chat(): Chat-based generation with message history support
  • Proper device handling using .to() method
  • Graceful error handling with validation
  • The file uses a 2026 copyright year as it is newly created in 2026
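
For orientation, a minimal sketch of such a wrapper is shown below. It is illustrative only: the method names (load_model, generate, chat) and the default model ID follow this PR's description, while the internals (the Hugging Face Auto* classes, the chat-template call, and the max_new_tokens default) are assumptions and may differ from the actual pythainlp/lm/qwen3.py.

# Illustrative sketch only; the real pythainlp/lm/qwen3.py may differ.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


class Qwen3:
    def __init__(self, model_name: str = "Qwen/Qwen3-0.6B"):
        self.model_name = model_name
        self.model = None
        self.tokenizer = None
        self.device = "cpu"

    def load_model(self, device: str = "cpu", torch_dtype=torch.bfloat16):
        # Download tokenizer and weights from Hugging Face, then move to the target device.
        self.tokenizer = AutoTokenizer.from_pretrained(self.model_name)
        self.model = AutoModelForCausalLM.from_pretrained(
            self.model_name, torch_dtype=torch_dtype
        ).to(device)
        self.device = device

    def generate(self, prompt: str, max_new_tokens: int = 256) -> str:
        # Plain prompt-in, text-out generation.
        if self.model is None:
            raise RuntimeError("Call load_model() before generate().")
        inputs = self.tokenizer(prompt, return_tensors="pt").to(self.device)
        output = self.model.generate(**inputs, max_new_tokens=max_new_tokens)
        new_tokens = output[0][inputs["input_ids"].shape[-1]:]
        return self.tokenizer.decode(new_tokens, skip_special_tokens=True)

    def chat(self, messages: list, max_new_tokens: int = 256) -> str:
        # Format the message history with the tokenizer's chat template, then generate.
        if self.model is None:
            raise RuntimeError("Call load_model() before chat().")
        text = self.tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=True
        )
        return self.generate(text, max_new_tokens=max_new_tokens)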

Module integration:

  • Exported from pythainlp.lm via a conditional import
  • Falls back to a helpful error message when the optional dependencies are missing (see the sketch after this list)
  • Added a qwen3 optional dependency group (torch>=1.0.0, transformers>=4.22.1), placed in alphabetical order
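
The conditional export could follow the usual optional-dependency pattern sketched below; the placeholder class and the exact error wording are assumptions, not the code merged in this PR.

# Sketch of a conditional export; the actual pythainlp/lm/__init__.py may differ.
try:
    from pythainlp.lm.qwen3 import Qwen3
except ImportError:
    class Qwen3:  # placeholder used only when the extras are missing
        def __init__(self, *args, **kwargs):
            raise ImportError(
                "Qwen3 requires torch and transformers. "
                "Install them with: pip install pythainlp[qwen3]"
            )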

Tests:

  • Tests added in tests/extra/testx_lm.py for Qwen3 functionality
  • Tests gracefully skip when the optional dependencies are not installed (see the sketch after this list)
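
A skip guard of this kind typically looks like the sketch below; the test body is a hypothetical placeholder rather than the actual contents of tests/extra/testx_lm.py.

# Sketch of an optional-dependency skip guard; the real testx_lm.py may differ.
import unittest

try:
    import torch  # noqa: F401
    import transformers  # noqa: F401

    HAS_QWEN3_DEPS = True
except ImportError:
    HAS_QWEN3_DEPS = False


@unittest.skipUnless(HAS_QWEN3_DEPS, "torch and transformers are not installed")
class TestQwen3(unittest.TestCase):
    def test_import(self):
        from pythainlp.lm import Qwen3

        self.assertTrue(callable(Qwen3))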

Compatibility:

  • Merged dev branch to include Python 3.13 compatibility fixes
  • All tests pass with the latest changes

Usage example:

from pythainlp.lm import Qwen3
import torch

# Install with: pip install pythainlp[qwen3]

model = Qwen3()
model.load_model(device="cpu", torch_dtype=torch.bfloat16)

# Basic generation
result = model.generate("สวัสดี")

# Chat format
messages = [{"role": "user", "content": "สวัสดีครับ"}]
response = model.chat(messages)

Your checklist for this pull request

  • Passed code styles and structures
  • Passed code linting checks and unit test
Original prompt

This section details the original issue you should resolve

<issue_title>Add Qwen/Qwen3-0.6B to pythainlp.lm</issue_title>
<issue_description>I found Qwen/Qwen3-0.6B. It is good enough for small tasks. I think we can add Qwen/Qwen3-0.6B to pythainlp.lm.

Not using

Model: https://huggingface.co/Qwen/Qwen3-0.6B

Apache license 2.0</issue_description>

Comments on the Issue (you are @copilot in this section)



Copilot AI and others added 4 commits January 19, 2026 14:56
Co-authored-by: wannaphong <8536487+wannaphong@users.noreply.github.com>
@wannaphong
Member

@copilot move test to tests/extra

Copilot AI changed the title from "[WIP] Add Qwen/Qwen3-0.6B to pythainlp.lm" to "Add Qwen3-0.6B language model support to pythainlp.lm" on Jan 19, 2026
Copilot AI requested a review from wannaphong January 19, 2026 15:06
Co-authored-by: wannaphong <8536487+wannaphong@users.noreply.github.com>
Contributor Author

Copilot AI commented Jan 19, 2026

@copilot move test to tests/extra

Done. Moved Qwen3 tests to tests/extra/testx_lm.py and reverted tests/core/test_lm.py to its original state. Commit: 2018813

Co-authored-by: bact <128572+bact@users.noreply.github.com>
Copilot AI requested a review from bact January 19, 2026 15:52
@bact
Member

bact commented Jan 19, 2026

@copilot There are changes in the dev branch to fix the errors that occurred in Python 3.13. Merge the changes and run the tests again.

Contributor Author

Copilot AI commented Jan 20, 2026

@copilot There are changes in the dev branch to fix the errors that occurred in Python 3.13. Merge the changes and run the tests again.

Merged dev branch with Python 3.13 fixes. All tests pass. Merge commit: 0d104dd

@sonarqubecloud

@coveralls

Coverage Status

coverage: 53.298% (-0.3%) from 53.624% when pulling 0d104dd on copilot/add-qwen3-0-6b-model into 65e6c54 on dev.

@bact bact marked this pull request as ready for review January 20, 2026 08:00
@bact bact added the enhancement (enhance functionalities) label on Jan 20, 2026
