Junwon Hwang (nuxlear)
AI & ML interests: LLM / Optimization
Recent Activity
New activity 10 days ago on LGAI-EXAONE/EXAONE-Deep-7.8B: Cannot be used with transformers / cannot try this model via transformers library
New activity about 2 months ago on LGAI-EXAONE/EXAONE-Deep-32B: float32 please!!!!
New activity about 2 months ago on LGAI-EXAONE/EXAONE-Deep-32B: Add links to paper and Github repository
Organizations
nuxlear's activity
Cannot be used with transformers / cannot try this model via transformers library
1 · #6 opened 12 days ago by jjwsl
float32 please!!!!
1 · #9 opened about 2 months ago by ctranslate2-4you
Add links to paper and Github repository
1 · 3 · #3 opened about 2 months ago by nielsr
Chat template difference with 32b
3 · #2 opened about 2 months ago by nbroad
Update configuration_exaone.py
1 · #4 opened about 2 months ago by sukrucildirr
How to properly run EXAONE-Deep-32B-AWQ with vLLM?
1 · 2 · #1 opened about 2 months ago by hyunw55
<thought> was not being invoked when typing Korean.
6 · #1 opened about 2 months ago by JDNOH
Space to test it online?
1 · #1 opened about 2 months ago by celsowm
Getting an error on LM Studio + Parameters are not optimized also for LM Studio, please add a parameters file
1 · #1 opened about 2 months ago by alberkheijn
Getting an error on LM Studio + Parameters are not optimized also for LM Studio, please add a parameters file
7 · #1 opened about 2 months ago by alberkheijn
No module named 'transformers_modules.LGAI-EXAONE.EXAONE-3'
1 · #1 opened 4 months ago by Yesimm
Question about the Tokenizer
2 · 1 · #4 opened 4 months ago by min913
Help with llama
3 · #1 opened 5 months ago by urtuuuu
Model upload incomplete?
1 · 9 · #1 opened 5 months ago by bartowski
