
Difficulty getting started! #352

@mcchung52

Description


LMQL looks very promising (I've played with Guidance), so I want to make it work, but I'm running into issues from the get-go trying to run it locally. I'm really hoping I can get some help.

IMMEDIATE GOAL: What is the simplest way to make this work?

Context:
I have several gguf models on my machine that I want to run on my MacBook Pro (pre-M, Intel), i.e. on CPU. I've run them this way many times before from Python code, though slowly.

I want to:
1. Run the model directly in Python code.
2a. Run the model by exposing it via an API, e.g. localhost:8081.
2b. (Not possible on my Mac, but possible on my PC) Run the gguf via LM Studio on the PC, expose ip:port there, and have Python code on the Mac connect to it.
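
From what I've read in the docs, I think the setups above correspond to the following model references (a sketch of my understanding; the paths, port, and the `lmql serve-model` invocation are my guesses from the documentation, so corrections welcome):

```python
# How I think the three setups map onto LMQL model references
# (paths/ports are examples from my setup, not verified).

model_path = "/Users/mchung/Desktop/proj-ai/models/"
model = "codeqwen-1_5-7b-chat-q8_0.gguf"

# 1. In-process: llama.cpp runs inside the Python process itself.
in_process_ref = f"local:llama.cpp:{model_path + model}"

# 2a. Served by LMQL's own inference server, started separately with e.g.:
#       lmql serve-model llama.cpp:/Users/mchung/Desktop/proj-ai/models/codeqwen-1_5-7b-chat-q8_0.gguf
#     then referenced WITHOUT the "local:" prefix, plus an endpoint:
served_ref = f"llama.cpp:{model_path + model}"
served_endpoint = "localhost:8080"  # the default serve-model port, I believe

# 2b. LM Studio exposes an OpenAI-compatible API; I don't know whether LMQL
#     can target a non-OpenAI host that way, so 2b might instead require
#     running `lmql serve-model` on the PC and using the 2a form.

print(in_process_ref)
print(served_ref, served_endpoint)
```

For 2a I'd then expect something like `lmql.model(served_ref, endpoint=served_endpoint)` in the query, but again, that's my reading of the docs, not tested.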

Code:

```python
import lmql

model_path = "/Users/mchung/Desktop/proj-ai/models/"
# model = "wizardcoder-python-13b-v1.0.Q4_K_S.gguf"
model = "codeqwen-1_5-7b-chat-q8_0.gguf"
# model = "mistral-7b-instruct-v0.2.Q5_K_M.gguf"

m = f"local:llama.cpp:{model_path + model}"
print(m)

@lmql.query(model=lmql.model(m, verbose=True))
def query_function():
    '''lmql
    """A great dad joke. A indicates the punchline.
    Q:[JOKE]
    A:[PUNCHLINE]""" where STOPS_AT(JOKE, "?") and \
                           STOPS_AT(PUNCHLINE, "\n")
    '''
    return "What's the best way to learn Python?"

response = query_function()
print(response)
```

Thanks in advance.
