Conversation

@vvnsrzn commented Feb 10, 2025

Hi @nsarrazin!

I'm trying to refactor this component in the most minimal way possible!

I hope CI will pass too 🤞

Following #1691

@nsarrazin left a comment

Hi, thanks for the contrib! 🚀

I tried to run this locally but I'm getting the following error:

Uncaught Svelte error: effect_update_depth_exceeded
Maximum update depth exceeded. This can happen when a reactive block or effect repeatedly sets a new value. Svelte limits the number of nested updates to prevent infinite loops

Last ten effects were:  
Array(10) [ ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn(), ScrollToBottomBtn() ]

I think there's something going on with the reactivity there, do you also get this locally? 👀
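For reference, the failure mode can be sketched outside Svelte: an effect that unconditionally writes the state it depends on re-schedules itself forever, and the usual fix is to guard the write. A minimal plain-JS simulation (the `makeReactive` helper is hypothetical, not Svelte's actual API; Svelte's real depth limit and scheduling differ):

```javascript
// Simulate a reactive system where every write re-runs the effect,
// with a depth limit like Svelte's effect_update_depth_exceeded guard.
function makeReactive(initial, effect, maxDepth = 100) {
  let value = initial;
  let depth = 0; // never reset here; real frameworks track depth per flush
  const set = (next) => {
    if (++depth > maxDepth) throw new Error("effect_update_depth_exceeded");
    value = next;
    effect(value, set); // every write re-triggers the effect
  };
  effect(value, set);
  return () => value;
}

// Buggy: the effect writes on every run, so it never settles.
let looped = false;
try {
  makeReactive(0, (v, set) => set(v)); // even re-writing the same value loops
} catch (e) {
  looped = e.message === "effect_update_depth_exceeded";
}

// Fixed: only write when the value would actually change.
const read = makeReactive(0, (v, set) => {
  if (v !== 1) set(1); // guard makes the effect settle after one write
});

console.log(looped, read());
```

The same shape applies to the `$effect` in `ScrollToBottomBtn`: if it sets reactive state that it also reads, the write needs a change check (or the dependency needs `untrack`) so the effect converges.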

@vvnsrzn commented Feb 11, 2025

Sorry about that! I'd like to double-check, but I'm no longer able to run it locally:

WARN (277839): No tokenizer found for model CohereForAI/c4ai-command-r-plus-08-2024, using default template. Consider setting tokenizer manually or making sure the model is available on the hub.
ERROR (277839): Failed to load tokenizer mistralai/Mistral-Nemo-Instruct-2407 make sure the model is available on the hub and you have access to any gated models.
    err: {
      "type": "Error",
      "message": "Forbidden access to file: \"https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407/resolve/main/tokenizer_config.json\".",
      "stack":
          Error: Forbidden access to file: "https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407/resolve/main/tokenizer_config.json".
              at handleError (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/transformers/dist/transformers.mjs:29672:11)
              at getModelFile (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/transformers/dist/transformers.mjs:29905:24)
              at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
              at async getModelJSON (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/transformers/dist/transformers.mjs:30007:18)
              at async Promise.all (index 1)
              at async loadTokenizer (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/transformers/dist/transformers.mjs:23494:18)
              at async AutoTokenizer.from_pretrained (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/transformers/dist/transformers.mjs:27771:50)
              at async getTokenizer (/home/vivian/labs/oss/chat-ui/src/lib/utils/getTokenizer.ts:6:12)
              at async getChatPromptRender (/home/vivian/labs/oss/chat-ui/src/lib/server/models.ts:87:17)
              at async processModel (/home/vivian/labs/oss/chat-ui/src/lib/server/models.ts:218:21)
    }

Is it something related to my HF token?

Sorry for the fuss!

@nsarrazin

If you're using the production huggingchat config, some models require that you accept terms on the Hugging Face Hub. For example, https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407/ should show

[screenshot: gated model access notice]

near the top. I think there are a few more; there should be some warnings in the logs.

Otherwise feel free to remove some models from the config for testing 😄
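For anyone following along, one way to do that is to override `MODELS` in `.env.local` with a single ungated model. This is only a sketch; the full `MODELS` schema (endpoints, prompt templates, etc.) is documented in the chat-ui README, and the model named here is just an example of an ungated repo:

```env
# .env.local — replace the production model list with one ungated model
# so no Hub terms need to be accepted (sketch; see the chat-ui README
# for the complete MODELS schema)
MODELS=`[
  {
    "name": "HuggingFaceH4/zephyr-7b-beta"
  }
]`
```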

@vvnsrzn commented Feb 11, 2025

Thanks for your patience, I think I've found the problem with my previous implementation.

I'm sorry, even a bit ashamed, because for some reason I was redirected to the chat... in production during my development! 🤦

However, I'm having a problem with the proxy right now.

[09:53:24.738] ERROR (293713): fetch failed
    err: {
      "type": "TypeError",
      "message": "fetch failed: getaddrinfo ENOTFOUND proxy.serverless.api-inference.huggingface.tech",
      "stack":
          TypeError: fetch failed
              at node:internal/deps/undici/undici:13178:13
              at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
              at async streamingRequest (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/inference/dist/index.js:317:20)
              at async textGenerationStream (file:///home/vivian/labs/oss/chat-ui/node_modules/@huggingface/inference/dist/index.js:715:3)
              at async generateFromDefaultEndpoint (/home/vivian/labs/oss/chat-ui/src/lib/server/generateFromDefaultEndpoint.ts:12:20)
              at async getReturnFromGenerator (/home/vivian/labs/oss/chat-ui/src/lib/utils/getReturnFromGenerator.ts:6:14)
              at async generateTitle (/home/vivian/labs/oss/chat-ui/src/lib/server/textGeneration/title.ts:50:10)
              at async generateTitleForConversation (/home/vivian/labs/oss/chat-ui/src/lib/server/textGeneration/title.ts:13:19)
          caused by: Error: getaddrinfo ENOTFOUND proxy.serverless.api-inference.huggingface.tech
              at GetAddrInfoReqWrap.onlookupall [as oncomplete] (node:dns:120:26)
              at GetAddrInfoReqWrap.callbackTrampoline (node:internal/async_hooks:130:17)
    }

It's not a big deal, since I created some fake messages to test the ScrollToBottom component 😉

@nsarrazin

Ooh yeah, the prod config has been updated recently to use an internal proxy; my bad, I forgot to update the docs!

@nsarrazin

Alright, I've updated updateLocalEnv on the latest main (see commit); it should now be more compatible with local development again.

Will review the PR now 😄

@nsarrazin left a comment

Works great thanks a lot! 🚀

@nsarrazin nsarrazin merged commit 64f5938 into huggingface:main Feb 11, 2025
3 checks passed
@vvnsrzn vvnsrzn deleted the svelte-5-leftover-scroll-to-bottom branch February 11, 2025 10:36
maksym-work pushed a commit to siilats/chat-ui that referenced this pull request Jul 2, 2025
* refactor: migrating ScrollToBottomBtn to svelte-5

* fix: infinite loop on the scrolltobottom effect

---------

Co-authored-by: Nathan Sarrazin <[email protected]>
Matsenas pushed a commit to Matsenas/chat-ui that referenced this pull request Jul 4, 2025

gary149 pushed a commit to gary149/chat-ui that referenced this pull request Aug 29, 2025