refactor(global_config): reduce LLM_MODEL_MAX_INPUT_LENGTH to 1000
The input length limit for the LLM model is reduced from 20,000 to 1,000 characters to keep prompt sizes small and avoid the processing overhead and resource usage of excessively long inputs.
global_config.py (CHANGED, +1 -1)
@@ -97,7 +97,7 @@ class GlobalConfig:
     DEFAULT_MODEL_INDEX = int(os.environ.get('DEFAULT_MODEL_INDEX', '4'))
     LLM_MODEL_TEMPERATURE = 0.2
     MAX_PAGE_COUNT = 50
-    LLM_MODEL_MAX_INPUT_LENGTH = 20000
+    LLM_MODEL_MAX_INPUT_LENGTH = 1000  # characters

     LOG_LEVEL = 'DEBUG'
     COUNT_TOKENS = False
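For context, a minimal sketch of how a character cap like LLM_MODEL_MAX_INPUT_LENGTH is typically applied before text reaches the model. The truncate_for_llm helper below is an assumption for illustration only; it is not taken from this repository, which may enforce the limit elsewhere.

from global_config import GlobalConfig


def truncate_for_llm(text: str) -> str:
    """Clip input text to the configured character budget before prompting.

    Hypothetical helper: shows one way the 1000-character limit could be
    enforced; the actual enforcement point in the app may differ.
    """
    limit = GlobalConfig.LLM_MODEL_MAX_INPUT_LENGTH  # 1000 after this change
    return text if len(text) <= limit else text[:limit]


# Example (hypothetical call site): clip user input before building the prompt
# prompt_topic = truncate_for_llm(user_supplied_topic_description)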