The problem
When loading a model from Hugging Face, I kept hitting the 403 error below. My internet connection was fine and I had been granted access to the repository, but the error kept occurring.
Exception has occurred: OSError
We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like google/paligemma-3b-pt-224 is not the path to a directory containing a file named model-00002-of-00003.safetensors.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/google/paligemma-3b-pt-224/resolve/main/model-00002-of-00003.safetensors
The above exception was the direct cause of the following exception:
huggingface_hub.errors.HfHubHTTPError: (Request ID: Root=1-66fbfa79-7a5aa8fa55958d782e36283c;59f87a96-47fd-435a-9ba0-8f1f5614a37c)
403 Forbidden: Please enable access to public gated repositories in your fine-grained token settings to view this repository..
Cannot access content at: https://huggingface.co/google/paligemma-3b-pt-224/resolve/main/model-00002-of-00003.safetensors.
Make sure your token has the correct permissions.
The above exception was the direct cause of the following exception:
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
The above exception was the direct cause of the following exception:
File "/home/UserID/gemmasprint/error.py", line 15, in <module>
model = PaliGemmaForConditionalGeneration.from_pretrained(model_id).eval()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like google/paligemma-3b-pt-224 is not the path to a directory containing a file named model-00002-of-00003.safetensors.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
The solution
The token type was the problem! The model would only download once I used a Write-type token. If you don't have one, you need to issue a new token: go to https://huggingface.co/settings/tokens, click "Create new token", and when creating it, set the token type to Write.
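As a minimal sketch of how to use the new token (the token string below is a placeholder, not a real credential), you can export it via the `HF_TOKEN` environment variable, which `transformers` and `huggingface_hub` pick up automatically, or pass it directly through the `token` argument of `from_pretrained`:

```python
import os

# Placeholder value; substitute the Write-type token issued at
# https://huggingface.co/settings/tokens.
os.environ["HF_TOKEN"] = "hf_xxxxxxxxxxxxxxxxxxxx"

# With HF_TOKEN set, the failing call from error.py should now
# authenticate against the gated repo. (Commented out here because it
# downloads a ~3B-parameter checkpoint.)
# from transformers import PaliGemmaForConditionalGeneration
# model_id = "google/paligemma-3b-pt-224"
# model = PaliGemmaForConditionalGeneration.from_pretrained(
#     model_id, token=os.environ["HF_TOKEN"]
# ).eval()
```

Alternatively, running `huggingface-cli login` in a terminal and pasting the token stores it in the local credential cache, so no code changes are needed at all.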