This repo demonstrates how to get started with AWS services and how to make a basic call to Amazon Bedrock in Python.

Before jumping into the code, we first need to make a new user. An AWS account has a root user that grants permissions, with other users that run specific services. Because the hackathon is so short, we're only going to create one user and give it all the permissions you need.
- Navigate to your provided workshop account URL
  - Should look something like `https://catalog.us-east-1.prod.workshops.aws/<...>`
- Click the `Email one-time password (OTP)` option in orange
- Enter your SCU email address
- Check your email, copy-paste the code into the box, and click `Sign in`
- Scroll to the bottom of the Terms and Conditions, check the box to agree, then click `Join event`
- On the Event dashboard page it directs you to, go to the sidebar and click `Open AWS console` in the `AWS account access` section
- Type `IAM` in the search bar at the top, and click on the top result
- Click on the `Users` table under the Access management tab on the left sidebar
- You should see your root user, which will have a bunch of Access denied text over it. Ignore that and click the orange `Create user` button in the top right
- Give it a name, whatever you think is best
  - You do NOT need to give it access to the AWS console, as the root user can already do that
- In the permission options, click the `Attach policies directly` box
  - If you create more users, you can copy permissions, but do NOT copy from the root
- There are many permission policies, but I would recommend searching for the service you want to use, followed by FullAccess
  - Ex: `AmazonBedrockFullAccess`; click the checkbox to add it
  - You can add policies later if you find out you missed one
  - I would recommend Bedrock, SageMaker, EC2, and S3 Full Access to start
- Once you've selected your services, click the `Next` button at the bottom
- If everything is correct, click `Create user`
This will allow us to use AWS services in our code.
- After creating your user, the `Users` IAM section will now show the root and your new account's name. Click on your new account
- This page is where you can add permissions later if necessary, but we need to create an Access Key. In the `Summary` box, click the `Create access key` link on the right side
- It will ask you for a use case; select the `Local code` option. Ignore the suggestion it gives you, check the confirmation box, and click `Next`
- The description is optional, so leave it blank and click `Create access key`
- This gives you an access key and a secret. Copy both of these to a place you won't lose them
  - THE SECRET CAN NEVER BE VIEWED AGAIN ONCE YOU CLOSE THIS SCREEN
  - If you lose it, just create another key
- NOTE: These are extremely sensitive keys! Do NOT push them to GitHub; share them with your team via Discord DM, iMessage, etc. instead
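As a quick sketch of where those two values end up: AWS SDKs such as boto3 automatically read the standard `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables. The values and the region below are placeholders (the workshop URL suggests `us-east-1`, but confirm yours in the console):

```python
import os

# Placeholder values -- substitute the access key and secret you copied.
# boto3 reads these exact variable names automatically if they are set.
os.environ["AWS_ACCESS_KEY_ID"] = "YOUR_ACCESS_KEY"
os.environ["AWS_SECRET_ACCESS_KEY"] = "YOUR_SECRET_KEY"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"  # assumed region; check your console

print("region:", os.environ["AWS_DEFAULT_REGION"])
```

Later in this README we'll put the keys in a `.env` file instead of hard-coding them, which is the safer habit.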
If you plan to use AWS Bedrock models in your code (or plan on following this README past this point), you need to get the model ID in one of two ways.
- In the search bar, type `Bedrock` and click on that service (it has a green icon)
- Feel free to click on `Model catalog` under the `Discover` section on the left sidebar and explore the different models, determining what would be best for your use case
  - You can use the playgrounds in the `Test` section on the left sidebar to explore models further
  - Keep note of values such as temperature, top P, etc. as you tune your model responses to your liking
- Once you have one you'd like to use in your code, click on the model in the `Model catalog` section and look for the `Inference type` field
  - If your model does NOT have one (ex: OpenAI models), you can just copy the `Model ID`
  - If this field is defined (ex: Claude models), you need a different parameter: the inference profile ID
    - Under the `Infer` section on the left sidebar, click `Cross-region inference`
    - Search for your model and copy the `Inference profile ID` field
    - Use the `US` option if possible, although global profiles work as well
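The two identifiers differ only in shape. As a rough sketch (the exact strings below are illustrative examples, not guaranteed to match a model you have access to -- always copy the real value from the console):

```python
# Illustrative only -- copy the exact string from the Bedrock console.
# A plain model ID looks roughly like this (provider.model-name:version):
model_id = "anthropic.claude-3-5-sonnet-20240620-v1:0"

# A cross-region inference profile ID adds a region-group prefix
# ("us.", "eu.", ...) in front of the model ID:
inference_profile_id = "us." + model_id

print(inference_profile_id)
```

Whichever one applies to your model is the string you will paste into `model_id` in the example scripts later in this README.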
First, we need to create a virtual environment for Python, which installs your packages for one project instead of globally.

- Create a virtual environment by running the following in the terminal at the root directory of your project
  - Windows: `py -m venv venv`
  - Mac/Linux: `python3 -m venv venv`
- Then, you need to actually use the environment, so type the following
  - Windows: `venv\Scripts\Activate.ps1`
  - Mac/Linux: `source ./venv/bin/activate`
- Your terminal should now have `(venv)` at the beginning of it
  - I would recommend setting your VSCode interpreter to the venv path in the bottom right when a Python file is open
- I have made a `requirements.txt` to install all of the packages you will need, so run the following to install them
  - `pip install -r requirements.txt`
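If you're ever unsure whether the venv is actually active, you can ask Python itself with a stdlib-only check:

```python
import sys

# Inside an activated venv, sys.prefix points into the venv folder,
# while sys.base_prefix still points at the system-wide Python.
in_venv = sys.prefix != sys.base_prefix

print("virtual environment active:", in_venv)
print("interpreter:", sys.executable)
```

If it prints `False`, re-run the activate command for your OS before installing packages.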
In order to not push your keys to GitHub, we need to put them in a `.env` file.

- Open that file and replace `YOUR_ACCESS_KEY` and `YOUR_SECRET_KEY` with the ones you copied earlier
- Go to the `.gitignore` file and uncomment (remove the `#` from) line 4, which says `.env`
  - A `.gitignore` file tells git to not push certain files and to keep them local to your computer
  - Remember, your AWS credentials are EXTREMELY CONFIDENTIAL, hence why we do this
- Now you are ready to call AWS services!
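The provided scripts handle loading the `.env` file for you, but if you're curious what that amounts to, here is a minimal stdlib-only sketch (assuming simple `KEY=VALUE` lines; the `demo.env` file and its contents below are just for illustration):

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> dict:
    """Minimal .env parser: KEY=VALUE lines; blank lines and '#' comments ignored."""
    values = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    os.environ.update(values)  # make the keys visible to boto3 etc.
    return values

# Demo with a throwaway file so nothing secret is hard-coded:
Path("demo.env").write_text("AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY\n# a comment\n")
creds = load_env("demo.env")
print(creds)
```

In practice you would use the `python-dotenv` package's `load_dotenv()` (it is a common choice for exactly this), but the idea is the same: read the file, put the values in the process environment.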
There are 2 files I have made for you: `bedrock.py` and `converse.py`. Both call Amazon Bedrock, using the model you enabled and specify, to generate a response to a given prompt.

- Change `model_id` to the model you selected on the AWS console
  - This changes depending on whether you just need the `Model ID` or a model that requires an `Inference profile ID`
- Change `user_message` to the prompt you want to use
- Change the `inferenceConfig` if you want to specify temperature, topP, etc.
- Run the file you changed with the following, with the terminal showing `(venv)`
  - `python bedrock.py` or `python converse.py`
- You should see the response in the console!
  - `bedrock.py` may take a long time depending on the prompt, as it waits for the entire response to complete
  - `converse.py` acts more like ChatGPT, where it generates the response in real-time
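To make the three settings above concrete, here is a sketch of the request shape the Bedrock Converse API expects (the helper name and the model ID are illustrative, and the provided scripts may organize this differently):

```python
def build_converse_request(model_id, user_message,
                           temperature=0.5, top_p=0.9, max_tokens=512):
    """Assemble the keyword arguments for a Bedrock Converse call."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": user_message}]},
        ],
        "inferenceConfig": {
            "temperature": temperature,  # randomness of the response
            "topP": top_p,               # nucleus-sampling cutoff
            "maxTokens": max_tokens,     # response length cap
        },
    }

request = build_converse_request(
    "us.anthropic.claude-3-5-sonnet-20240620-v1:0",  # illustrative inference profile ID
    "Hello! What can you do?",
)
print(request["inferenceConfig"])

# To actually send it (requires the venv, your .env keys, and model access):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**request)       # waits for the full response
#   stream = client.converse_stream(**request)  # streams it chunk by chunk
```

The commented-out calls mirror the two scripts: `converse()` blocks until the whole answer is ready (like `bedrock.py`'s behavior), while `converse_stream()` yields the answer as it is generated (like `converse.py`).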