The Hugging Face Bedrock Importer is a command-line tool that simplifies the process of downloading Hugging Face models and deploying them to Amazon Bedrock. This tool automates the workflow of model download, Amazon S3 upload, and Bedrock Custom Model Import, making it easier for developers to leverage Hugging Face language models in their AWS environment.
📚 Hugging Face models overview: https://huggingface.co/models
With this importer, you can quickly bring powerful language models from Hugging Face into your Bedrock ecosystem. It handles the complexities of model download, S3 storage management, and Bedrock custom model creation, allowing you to focus on utilizing these models in your applications.
To use the Hugging Face Bedrock Importer, ensure you have Python 3.11+ installed. Then install the tool with pip:

```shell
pip install git+https://github.com/masquare/huggingface-bedrock-importer.git
```

Alternatively, install it as a standalone tool with uv:

```shell
uv tool install git+https://github.com/masquare/huggingface-bedrock-importer.git
```
Note
Ensure your AWS credentials are properly set up, either through environment variables or the AWS CLI configuration (see docs). Make sure Amazon Bedrock Custom Model Import is supported in the AWS Region you are planning to use. You can check region support here.
To import a Hugging Face model to Bedrock, use the following command:
```shell
hf-bedrock-import --model-id <model_id> --s3-uri <s3_uri>
```
Replace `<model_id>` with the Hugging Face model ID and `<s3_uri>` with the S3 URI where you want to store the model files (e.g., `s3://amzn-s3-demo-bucket/hf_models/`).
Example:
```shell
hf-bedrock-import --model-id deepseek-ai/DeepSeek-R1-Distill-Llama-8B --s3-uri s3://amzn-s3-demo-bucket/hf_models/
```
The importer supports the following command-line options:
- `--model-id`: Hugging Face model ID (default: `deepseek-ai/DeepSeek-R1-Distill-Llama-8B`)
- `--s3-uri`: S3 URI for model storage
- `--cleanup-resources`: Clean up AWS resources (Bedrock custom model, IAM role, S3 model files)
- `--cleanup-model`: Clean up local model files
- `--test`: Test the model after importing it
- Import a model and test it:

```shell
hf-bedrock-import --model-id deepseek-ai/DeepSeek-R1-Distill-Llama-8B --s3-uri s3://amzn-s3-demo-bucket/models/ --test
```
The script prints the model ARN that you can use to invoke the model, as well as a link to the Bedrock Playground where you can experiment with the model:

```
Model ARN: arn:aws:bedrock:{AWS_REGION}:{ACCOUNT_ID}:imported-model/{MODEL_ID}
Link to the Bedrock playground for the model: https://{AWS_REGION}.console.aws.amazon.com/bedrock/home#/text-generation-playground?mode=text&modelId=arn%3Aaws%3Abedrock%3A{AWS_REGION}%3A{ACCOUNT_ID}%3Aimported-model%2F{MODEL_ID}
```
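The playground link is simply the model ARN percent-encoded into the console URL's `modelId` query parameter. A minimal sketch of that encoding, using the Python standard library and placeholder region/account/model values:

```python
from urllib.parse import quote

# placeholder values for illustration only
aws_region = "us-east-1"
account_id = "123456789012"
model_id = "my-imported-model"

model_arn = f"arn:aws:bedrock:{aws_region}:{account_id}:imported-model/{model_id}"

# percent-encode the full ARN (including ':' and '/') for the query string
playground_url = (
    f"https://{aws_region}.console.aws.amazon.com/bedrock/home"
    f"#/text-generation-playground?mode=text&modelId={quote(model_arn, safe='')}"
)
print(playground_url)
```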
Note: Using custom models in Amazon Bedrock incurs costs. See the Amazon Bedrock pricing page for more details.
- Clean up AWS resources for a specific model:

```shell
hf-bedrock-import --model-id deepseek-ai/DeepSeek-R1-Distill-Llama-8B --s3-uri s3://amzn-s3-demo-bucket/models/ --cleanup-resources
```
- Clean up local model files:

```shell
hf-bedrock-import --model-id bert-base-uncased --cleanup-model
```
You can also integrate the Hugging Face Bedrock Importer in your own code. Example:

```python
import json

import boto3

from huggingface_bedrock_importer import importer

S3_URI = "s3://amzn-s3-demo-bucket/hf-models/"
MODEL_ID = "deepseek-ai/DeepSeek-R1-Distill-Llama-8B"

# import model to Bedrock
model_arn = importer.import_model_to_bedrock(MODEL_ID, S3_URI)

# use the model
bedrock_runtime = boto3.client("bedrock-runtime")
prompt = "What is the capital of France?"
invoke_response = bedrock_runtime.invoke_model(
    modelId=model_arn, body=json.dumps({"prompt": prompt})
)
invoke_response["body"] = json.loads(invoke_response["body"].read().decode("utf-8"))
print(json.dumps(invoke_response, indent=4))

# cleanup
importer.cleanup_aws_resources(S3_URI, MODEL_ID)
importer.cleanup_local_resources(MODEL_ID)
```
- S3 Access Issues:
- Problem: "Access Denied" errors when uploading to S3
- Solution: Ensure your AWS credentials have the necessary permissions to write to the specified S3 bucket
- Diagnostic steps:
- Check your AWS credentials configuration
- Verify IAM user/role permissions for S3 access
- Try uploading a test file to the S3 bucket using the AWS CLI
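When checking permissions, it can help to compare against a minimal IAM policy covering the S3 actions the importer relies on (uploading, reading, and listing model files). This is a sketch with a placeholder bucket name; the exact set of actions your setup needs may differ:

```python
import json

# hypothetical bucket name; replace with the bucket from your --s3-uri
bucket = "amzn-s3-demo-bucket"

# minimal S3 policy sketch: write, read, and list on the model bucket
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",       # ListBucket applies to the bucket
                f"arn:aws:s3:::{bucket}/*",     # object actions apply to keys
            ],
        }
    ],
}
print(json.dumps(policy, indent=2))
```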
- Model Download Failures:
- Problem: Unable to download the model from Hugging Face
- Solution: Verify internet connection and Hugging Face API status
- Diagnostic steps:
- Check your internet connection
- Ensure the model ID is correct and publicly accessible
- Try downloading the model manually from the Hugging Face website
- Bedrock Import Errors:
- Problem: Model import to Bedrock fails
- Solution: Check IAM role permissions and S3 bucket accessibility
- Diagnostic steps:
- Verify the IAM role has the correct permissions for Bedrock and S3
- Ensure the S3 bucket is in the same region as your Bedrock endpoint
- Check Bedrock service quotas to ensure you haven't exceeded limits
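One common cause of import failures is an IAM role that Bedrock cannot assume. The import role's trust policy must name the Bedrock service principal; a sketch of what that trust policy typically looks like:

```python
import json

# trust policy sketch: allows the Bedrock service to assume the import role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "bedrock.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}
print(json.dumps(trust_policy, indent=2))
```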
The Hugging Face Bedrock Importer follows this data flow when importing a model:
- Download model from Hugging Face Hub to local storage
- Upload model files from local storage to specified S3 bucket
- Create or retrieve IAM role for Bedrock model import
- Initiate Bedrock model import job using S3 location and IAM role
- Wait for import job completion and retrieve model ARN
- (Optional) Test the imported model with a sample prompt
```mermaid
flowchart TD
    hf[Hugging Face Hub] --> local[Local Storage]
    local --> s3[Amazon S3 Bucket]
    s3 --> bedrock[Amazon Bedrock]
```
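The import flow ends with the model ARN from step 5, which encodes the Region, account, and model name. A small sketch showing how those parts can be pulled back out of such an ARN (the ARN values here are placeholders):

```python
import re

# example ARN with placeholder region/account/name
arn = "arn:aws:bedrock:us-east-1:123456789012:imported-model/my-imported-model"

# split the ARN into its region, account, and model-name components
match = re.fullmatch(
    r"arn:aws:bedrock:(?P<region>[^:]+):(?P<account>\d+):imported-model/(?P<name>.+)",
    arn,
)
assert match is not None
print(match["region"], match["account"], match["name"])
```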
Note: Ensure sufficient storage capacity in your local environment, as language models can be several gigabytes in size.