Automate Posting Hugo Blog to Social Sites (with a db) Part 4
GCP Flask App
I created a db. Now I need to update a few tables. Since the db is in GCP, I am going to create a quick Flask app that will update the tables.
I am using the quick start to get this one going.
Source: https://github.com/justin-napolitano/python-docs-samples/tree/main/cloud-sql/mysql/sqlalchemy
Create a New Local Service Account
Create a new service account (and a key for it) so that you can test the app locally.
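A minimal sketch with gcloud, where the account name and project are placeholders and the key is saved as the secret.json the app will load:
gcloud iam service-accounts create rss-updater-local --project=your-project-name
gcloud iam service-accounts keys create secret.json \
  --iam-account=rss-updater-local@your-project-name.iam.gserviceaccount.com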
Create a new role for the cloud run service account
Reference: Reference Material
Add the Cloud Run client role to the service account. Follow the directions above.
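As a sketch, granting a role to the service account looks like this; the Cloud SQL Client role shown here is an assumption based on what the app needs to reach the instance, so follow the reference above for your exact roles:
gcloud projects add-iam-policy-binding your-project-name \
  --member="serviceAccount:your-service-account@your-project-name.iam.gserviceaccount.com" \
  --role="roles/cloudsql.client"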
To run locally we need some environment variables.
There is a Secret Manager available from Google. I will update all code to use that in the future.
I created a .env file with the following:
GOOGLE_APPLICATION_CREDENTIALS='secret.json'
INSTANCE_CONNECTION_NAME='smart-axis-421517:us-west2:jnapolitano-site'
DB_USER='cobra'
DB_PASS='your pass'
DB_NAME='jnapolitano'
Install Reqs
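Assuming a virtual environment and the requirements.txt from the quickstart sample:
python -m venv venv && source venv/bin/activate
pip install -r requirements.txt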
Test the Application
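Start the app first; with the quickstart sample that is typically:
python app.py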
Navigate to http://127.0.0.1:8080 to verify your application is running correctly.
Modify the app for our needs
The base app is a simple app that creates a table called votes and records the information. I want to add two routes that will update two separate tables in my database.
Feed Route
# Sketch of the /update-feed route; the feeds table and its columns are assumptions, adjust to your schema.
# Assumes `from flask import request, jsonify`, plus the `db` pool and `logger` from the quickstart app.py.
@app.route("/update-feed", methods=["POST"])
def update_feed():
    data = request.get_json()
    feed_url = data.get("feed_url")
    last_build_date = data.get("last_build_date")
    stmt = sqlalchemy.text("UPDATE feeds SET last_build_date = :last_build_date WHERE feed_url = :feed_url")
    try:
        with db.connect() as conn:
            conn.execute(stmt, parameters={"feed_url": feed_url, "last_build_date": last_build_date})
            conn.commit()
    except Exception as e:
        logger.exception(e)
        return jsonify({"error": "unable to update feed"}), 500
    return jsonify({"message": "feed updated successfully"}), 201
Update Build Route
# Sketch of the /update-build route; the builds table and its columns are assumptions, adjust to your schema.
@app.route("/update-build", methods=["POST"])
def update_build():
    data = request.get_json()
    build_id = data.get("build_id")
    status = data.get("status")
    build_time = data.get("build_time")
    stmt = sqlalchemy.text(
        "INSERT INTO builds (build_id, status, build_time) "
        "VALUES (:build_id, :status, :build_time) "
        "ON DUPLICATE KEY UPDATE status = :status, build_time = :build_time"
    )
    try:
        with db.connect() as conn:
            conn.execute(stmt, parameters={"build_id": build_id, "status": status, "build_time": build_time})
            conn.commit()
    except Exception as e:
        logger.exception(e)
        return jsonify({"error": "unable to update build"}), 500
    return jsonify({"message": "build updated successfully"}), 201
Test it out
Update-feed
Create a bash script called test-update-feed.sh, chmod +x it, and then run it. You should get a response that says the update was successful.
#!/bin/bash
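# The payload fields here are assumptions matching the /update-feed sketch above; adjust to your route.
curl -X POST http://127.0.0.1:8080/update-feed \
  -H "Content-Type: application/json" \
  -d '{"feed_url": "https://example.com/index.xml", "last_build_date": "2024-01-01T00:00:00Z"}'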
Update build
Same as above but with this code
#!/bin/bash
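# The payload fields here are assumptions matching the /update-build sketch above; adjust to your route.
curl -X POST http://127.0.0.1:8080/update-build \
  -H "Content-Type: application/json" \
  -d '{"build_id": "example-build-1", "status": "success", "build_time": "2024-01-01T00:00:00Z"}'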
Testing the Container Locally
Once the app is confirmed to work we should test that the container builds as expected.
Prerequisites
- Docker installed on your local machine
- Flask application with a Dockerfile
- .env file with environment variables
- secret.json file for Google Application Credentials
Step 1: Prepare the .env File
Create a .env file in your project directory with the following content:
GOOGLE_APPLICATION_CREDENTIALS=/app/secret.json
INSTANCE_CONNECTION_NAME=your-instance-connection-name
DB_USER=your-user
DB_PASS=your-pass
DB_NAME=your-db
PORT=8080
Step 2: Ensure secret.json is Included in the Docker Image
Your Dockerfile should include a line to copy the secret.json file into the /app directory in the Docker image. Here is an example Dockerfile:
# Use an official Python runtime as a parent image
FROM python:3.9-slim
# Set the working directory
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Ensure the secret.json file is copied into the container
COPY secret.json /app/secret.json
# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# Make port 8080 available to the world outside this container
EXPOSE 8080
# Run gunicorn when the container launches
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 app:app
Step 3: Build the Docker Image
If you have made changes to the Dockerfile or added the secret.json file, build the Docker image:
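For example, from the project root (the local tag name here is arbitrary):
docker build -t rss-updated-image .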
Step 4: Run the Docker Container with the .env File
Use the --env-file option to pass the environment variables from your .env file to the Docker container:
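For example, using the tag from Step 3:
docker run --env-file .env -p 8080:8080 rss-updated-image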
Step 5: Verify Environment Variables Inside the Container
To verify that the environment variables are set correctly, start the container in interactive mode:
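For example, overriding the default command with a shell (again using the tag from Step 3):
docker run --env-file .env -it rss-updated-image /bin/bash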
Once inside the container, check the environment variables:
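printenv | grep -E 'DB_|INSTANCE_CONNECTION_NAME|GOOGLE_APPLICATION_CREDENTIALS|PORT'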
Step 6: Test the Endpoints Locally
Create test scripts to test the endpoints.
Create test_update_builds_local.sh
Create a file named test_update_builds_local.sh with the following content:
#!/bin/bash
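# Assumes the container from Step 4 is listening on localhost:8080; payload matches the /update-build sketch above.
curl -X POST http://127.0.0.1:8080/update-build \
  -H "Content-Type: application/json" \
  -d '{"build_id": "example-build-1", "status": "success", "build_time": "2024-01-01T00:00:00Z"}'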
Create test_update_feeds_local.sh
Create a file named test_update_feeds_local.sh with the following content:
#!/bin/bash
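# Assumes the container from Step 4 is listening on localhost:8080; payload matches the /update-feed sketch above.
curl -X POST http://127.0.0.1:8080/update-feed \
  -H "Content-Type: application/json" \
  -d '{"feed_url": "https://example.com/index.xml", "last_build_date": "2024-01-01T00:00:00Z"}'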
Make the Scripts Executable
Make the scripts executable:
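chmod +x test_update_builds_local.sh test_update_feeds_local.sh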
Run the Scripts to Test the Endpoints Locally
Run the scripts:
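./test_update_builds_local.sh
./test_update_feeds_local.sh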
Expected Response
You should receive a JSON response indicating the success or failure of the request. If successful, it will look like this:
Expected Response for test_update_builds_local.sh
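With the /update-build sketch above, a successful call returns something like:
{"message": "build updated successfully"}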
Expected Response for test_update_feeds_local.sh
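With the /update-feed sketch above, a successful call returns something like:
{"message": "feed updated successfully"}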
Troubleshooting
If you encounter any issues:
- Check Docker Logs: To view logs from the running container, use the docker logs command shown below.
- Database Connectivity: Ensure that your local environment can connect to the database. If you're using a local database, make sure the credentials and host are correct.
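Find the container ID with docker ps, then view its logs:
docker logs <container-id>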
Building and Deploying the Container to GCP
Step 1: Create and Configure Environment Variables
Create a .env file in your project directory with the following content:
PROJECT_NAME=your-project-name
PROJECT_NUMBER=67904901121
REGION=us-west2
SERVICE_ACCOUNT=your-service-account
INSTANCE_CONNECTION_NAME=your-instance-connection-name
DB_USER=your-db-user
DB_PASS=your-db-password
DB_NAME=your-db-name
Replace the placeholder values (your-project-name, your-service-account, your-instance-connection-name, your-db-user, your-db-password, and your-db-name) with your actual values.
Step 2: Manage Secrets
#!/bin/bash
# Set your variables in a local .env and source
set -a; source .env; set +a
# Enable the Secret Manager API
gcloud services enable secretmanager.googleapis.com --project="$PROJECT_NAME"
# Create secrets in Secret Manager
echo -n "$INSTANCE_CONNECTION_NAME" | \
  gcloud secrets create INSTANCE_CONNECTION_NAME --data-file=- --project="$PROJECT_NAME"
echo -n "$DB_USER" | \
  gcloud secrets create DB_USER --data-file=- --project="$PROJECT_NAME"
echo -n "$DB_PASS" | \
  gcloud secrets create DB_PASS --data-file=- --project="$PROJECT_NAME"
echo -n "$DB_NAME" | \
  gcloud secrets create DB_NAME --data-file=- --project="$PROJECT_NAME"
# Grant access to the Cloud Run service account
for s in INSTANCE_CONNECTION_NAME DB_USER DB_PASS DB_NAME; do
  gcloud secrets add-iam-policy-binding "$s" --member="serviceAccount:$SERVICE_ACCOUNT" --role="roles/secretmanager.secretAccessor" --project="$PROJECT_NAME"
done
Create the Artifact Repository
Create a script with the following content:
#!/bin/bash
# Set your variables in a local .env and source
set -a; source .env; set +a
# Create the Artifact Registry repository
gcloud artifacts repositories create rss-updated --repository-format=docker --location="$REGION" --project="$PROJECT_NAME"
# Verify the repository creation
gcloud artifacts repositories list --location="$REGION" --project="$PROJECT_NAME"
Make sure to chmod +x the script
Build the Docker Image with Cloud Build
Create cloudbuild.yaml
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args:
      - 'build'
      - '-t'
      - 'us-west2-docker.pkg.dev/${_PROJECT_NAME}/rss-updated/rss-updated-image:latest'
      - '.'
images:
  - 'us-west2-docker.pkg.dev/${_PROJECT_NAME}/rss-updated/rss-updated-image:latest'
substitutions:
  _PROJECT_NAME: "your-project-name"
Trigger the Build
#!/bin/bash
# Set your variables in a local .env and source
set -a; source .env; set +a
# Trigger the Cloud Build with custom substitution
gcloud builds submit --config cloudbuild.yaml --substitutions=_PROJECT_NAME="$PROJECT_NAME" --project="$PROJECT_NAME"
Deploy to Cloud Run
#!/bin/bash
# Set your variables in a local .env and source
set -a; source .env; set +a
# Deploy to Cloud Run
gcloud run deploy rss-updated --image="us-west2-docker.pkg.dev/$PROJECT_NAME/rss-updated/rss-updated-image:latest" \
  --region="$REGION" --service-account="$SERVICE_ACCOUNT" --add-cloudsql-instances="$INSTANCE_CONNECTION_NAME" \
  --set-secrets="INSTANCE_CONNECTION_NAME=INSTANCE_CONNECTION_NAME:latest,DB_USER=DB_USER:latest,DB_PASS=DB_PASS:latest,DB_NAME=DB_NAME:latest"