Memri / plugins / llm_api / Merge requests / !2

Update to mistral preset. Double context size

Merged: Alp Deniz Ogut requested to merge mistral into dev (1 year ago)
1 commit, 1 pipeline, 3 files changed (+4 / -4)
Comparing dev (base) with the latest version, 5c8483e1 (1 commit, 1 year ago)

Files changed:
  • llm_api/constants.py (+2 / -2)
  • llm_api/db.py (+1 / -1)
  • llm_api/model.py (+1 / -1)
llm_api/constants.py (+2 / -2)

 import os
-CONTEXT_SIZE = 2048
-CONTEXT_SIZE_LIMIT = CONTEXT_SIZE - 200
+CONTEXT_SIZE = 4096
+CONTEXT_SIZE_LIMIT = CONTEXT_SIZE - 400
 REDIS_HOST = os.getenv("REDIS_HOST", "localhost")
 REDIS_PORT = os.getenv("REDIS_PORT", 6379)
 REDIS_PASSWORD = os.getenv("REDIS_PASSWORD", None)
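The net effect: the usable prompt budget grows from 2048 - 200 = 1848 tokens to 4096 - 400 = 3696 tokens, with a larger margin reserved for generation. As a hedged illustration only (the repository's actual prompt-truncation code is not part of this diff, and count_tokens / fit_messages below are hypothetical helpers), a limit like CONTEXT_SIZE_LIMIT is typically enforced along these lines:

# Hypothetical sketch; not taken from llm_api.
CONTEXT_SIZE = 4096
CONTEXT_SIZE_LIMIT = CONTEXT_SIZE - 400  # reserve headroom for the model's reply

def count_tokens(text: str) -> int:
    # Placeholder: a real implementation would use the model's tokenizer.
    return len(text.split())

def fit_messages(messages: list[str]) -> list[str]:
    # Keep the most recent messages whose total token count fits the limit.
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):
        total += count_tokens(msg)
        if total > CONTEXT_SIZE_LIMIT:
            break
        kept.append(msg)
    return list(reversed(kept))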
llm_api/db.py (+1 / -1)

@@ -13,7 +13,7 @@ r_db = redis.Redis(
     socket_connect_timeout=REDIS_SOCKET_CONNECT_TIMEOUT,
 )
-DEFAULT_MODEL = "llama-2-13b-chat.ggmlv3.q4_0"
+DEFAULT_MODEL = "mistral-7b-instruct-v0.1.Q4_K_M"
 def make_request_stack_key(model: str = DEFAULT_MODEL):
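Only the default model name changes here; make_request_stack_key presumably derives a per-model Redis key from it. A minimal sketch of that pattern, where the key format and the lpush call are assumptions rather than the project's actual implementation:

import os
import redis

# Connection settings mirroring constants.py.
r_db = redis.Redis(
    host=os.getenv("REDIS_HOST", "localhost"),
    port=int(os.getenv("REDIS_PORT", 6379)),
    password=os.getenv("REDIS_PASSWORD", None),
)

DEFAULT_MODEL = "mistral-7b-instruct-v0.1.Q4_K_M"

def make_request_stack_key(model: str = DEFAULT_MODEL) -> str:
    # Assumed key format; the real function body is not visible in this diff.
    return f"request_stack:{model}"

# Example: queue a serialized request on the stack for the default model.
r_db.lpush(make_request_stack_key(), '{"prompt": "hello"}')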
llm_api/model.py (+1 / -1)

@@ -12,7 +12,7 @@ class Message(BaseModel):
 class Chat(BaseModel):
     system: str = "You are a helpful assistant."
-    model: str = "llama-2-13b-chat.ggmlv3.q4_0"
+    model: str = "mistral-7b-instruct-v0.1.Q4_K_M"
     messages: list[Message]
     def as_llm_query(self):
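For orientation, a sketch of how callers likely use these Pydantic models after the default switch; the Message fields (role, content) and the behaviour of as_llm_query are assumptions, since neither is visible in this hunk:

from pydantic import BaseModel

class Message(BaseModel):
    # Assumed fields; the real Message definition sits above this hunk.
    role: str
    content: str

class Chat(BaseModel):
    system: str = "You are a helpful assistant."
    model: str = "mistral-7b-instruct-v0.1.Q4_K_M"
    messages: list[Message]

# A request now defaults to the Mistral preset unless a model is given explicitly.
chat = Chat(messages=[Message(role="user", content="Summarize my notes.")])
print(chat.model)  # mistral-7b-instruct-v0.1.Q4_K_M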
Assignees: none
Reviewer: Aziz Berkay Yesilyurt
Labels: none
Milestone: none
Time tracking: no estimate or time spent
Merge request lock: unlocked
2 participants: Aziz Berkay Yesilyurt, Alp Deniz Ogut
Reference: memri/plugins/llm_api!2
Source branch: mistral
