Conversation

cebtenzzre (Member)

Now that our Vulkan backend has been merged into llama.cpp, we have a much smaller diff. This PR includes the updated llama.cpp as well as the necessary changes to make GPT4All compatible.

@cebtenzzre cebtenzzre added backend gpt4all-backend issues vulkan labels Jan 29, 2024
@cebtenzzre cebtenzzre requested a review from manyoso January 29, 2024 21:01
@cebtenzzre cebtenzzre force-pushed the update-llamacpp-vulkan branch from e022a5e to 6b6a8fe on January 29, 2024 21:42
@manyoso manyoso merged commit 38c6149 into main Jan 29, 2024
@koech-v

koech-v commented Feb 1, 2024

Intel Arc got Windows support 2 days ago...

@cebtenzzre (Member, Author)

> Intel Arc got Windows support 2 days ago...

If you're talking about SYCL, that's a completely different backend that we don't support. GPT4All is built on top of llama.cpp's Kompute-based Vulkan backend.
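The distinction drawn above can be made concrete at build time: llama.cpp selects its GPU backends via CMake options, so a Kompute-based Vulkan build (the one GPT4All targets) and a SYCL build are produced by entirely different flags. A minimal sketch, assuming the `LLAMA_KOMPUTE` and `LLAMA_SYCL` options as they existed in llama.cpp around early 2024 (both were later renamed under the `GGML_` prefix); exact flags may differ by version:

```shell
# Kompute-based Vulkan backend (the one GPT4All is built on):
cmake -B build-kompute -DLLAMA_KOMPUTE=ON
cmake --build build-kompute

# SYCL backend (the separate Intel Arc path mentioned above),
# which additionally requires Intel's oneAPI compilers:
cmake -B build-sycl -DLLAMA_SYCL=ON \
      -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx
cmake --build build-sycl
```

Enabling one of these options does not make the other available; a binary built with only Kompute cannot use SYCL kernels, which is why SYCL-only hardware support does not carry over to GPT4All.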

@koech-v

koech-v commented Feb 2, 2024

> > Intel Arc got Windows support 2 days ago...
>
> If you're talking about SYCL, that's a completely different backend that we don't support. GPT4All is built on top of llama.cpp's Kompute-based Vulkan backend.

That's okay. I'll look around for a backend that supports it.

@cebtenzzre cebtenzzre deleted the update-llamacpp-vulkan branch February 10, 2025 16:38