
Let’s say you’re new to some of the nuances of cloud, e.g. a Network Engineer who’s been asked to get involved in cloud networking. You may be familiar with tools like ping and traceroute for testing network reachability, but as you learned in Part 1 of this blog series, PSC Endpoints and PGA IP ranges do not respond to ping or traceroute, so in Part 1 we explored other ways to validate private network access to managed services using PSC and PGA. In this blog we’ll go a step further and validate the operation of PSC / PGA for a specific managed service, Cloud Storage, which is simple to test.
Part 1 provides the steps to validate that your communications to managed services are sent over private access. Once you’ve validated, here are the extra steps to test Cloud Storage operation over private access.
Things to note
- The commands below are run from a Linux command prompt with gcloud.
- Running the commands found in this blog from Cloud Shell does not test private access since Cloud Shell exists outside of your VPC network. Run these commands from a Google Cloud Compute Engine VM, or from a Linux machine in your on-prem network that’s connected to GCP and configured for PSC or PGA from on-prem (see Part 1 for more on configuring on-prem for PSC or PGA).
- There is a Common Errors section below. If you’re getting errors, please refer to it.
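Before creating anything, it can also be worth a quick re-check (covered in depth in Part 1) that storage.googleapis.com actually resolves to a Private Google Access range on the machine you’re testing from. A minimal sketch, assuming the private.googleapis.com (199.36.153.8/30) or restricted.googleapis.com (199.36.153.4/30) DNS configuration from Part 1:

```shell
# Resolve storage.googleapis.com and check whether the answer falls in a
# Private Google Access range (199.36.153.8/30 for private.googleapis.com,
# 199.36.153.4/30 for restricted.googleapis.com).
RESOLVED_IP=$(getent hosts storage.googleapis.com | awk '{print $1; exit}')
echo "storage.googleapis.com resolves to: ${RESOLVED_IP:-<no answer>}"
case "$RESOLVED_IP" in
  199.36.153.*) echo "PGA range: private access DNS looks correct" ;;
  "")           echo "No DNS answer: check resolver configuration" ;;
  *)            echo "Public range: private access DNS is NOT in effect" ;;
esac
```

If the resolved address is a public Google IP, your traffic is not taking the private path, and the DNS zones from Part 1 need another look.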
Create a Cloud Storage bucket
Pre-verify the Cloud Storage buckets that are already created (if any) by going to the GCP console under Cloud Storage > Buckets to view them. Take a screen capture if you’d like as a point of comparison.
From the command line, save these values in shell variables (change the values to match your setup)…
PROJECT_ID="project-id"
BUCKET_NAME="bucketname"
FILE_NAME="file.txt"
LOCATION="US-CENTRAL1"
STORAGE_CLASS="STANDARD"
…and run these each time you restart the command line session.
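If you’d rather not retype the variables after every restart, one option is to keep them in a small env file and source it in each new session (the file name here is just an example):

```shell
# Write the test variables to an env file once...
cat > "$HOME/psc-test-env.sh" <<'EOF'
PROJECT_ID="project-id"
BUCKET_NAME="bucketname"
FILE_NAME="file.txt"
LOCATION="US-CENTRAL1"
STORAGE_CLASS="STANDARD"
EOF
# ...then load them in any new session:
. "$HOME/psc-test-env.sh"
echo "Project: $PROJECT_ID, bucket: $BUCKET_NAME"
```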
Acquire an access token for these calls using gcloud auth print-access-token and then run curl with the access token. Here’s one way…
ACCESS_TOKEN=$(gcloud auth print-access-token)
curl -X POST \
-H "Authorization: Bearer $ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d "{\"name\": \"$BUCKET_NAME\", \"location\": \"$LOCATION\", \"storageClass\": \"$STORAGE_CLASS\"}" \
"https://storage.googleapis.com/storage/v1/b?project=$PROJECT_ID"
Reminder: If you get an error, there’s a Common Errors section below.
You can also embed the gcloud auth print-access-token command in the curl command…
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "x-goog-user-project: $PROJECT_ID" \
-H "Content-Type: application/json" \
-d "{\"name\": \"$BUCKET_NAME\", \"location\": \"$LOCATION\", \"storageClass\": \"$STORAGE_CLASS\"}" \
"https://storage.googleapis.com/storage/v1/b?project=$PROJECT_ID"
Issuing a curl command makes it clear that we’re leveraging a googleapis.com API, in this case the Cloud Storage JSON API at storage.googleapis.com.
You can also create a Cloud Storage Bucket with the following gcloud command…
gcloud storage buckets create gs://$BUCKET_NAME \
--project=$PROJECT_ID \
--location=$LOCATION
…though it is not as obvious how a gcloud command connects, which is why we started with curl commands.
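If you do want to see where a gcloud command connects, one option is gcloud’s global --log-http flag, which prints each HTTPS request and response, endpoint URL included. A sketch, assuming the bucket already exists and gcloud is installed and authenticated:

```shell
# Reuse the bucket name from earlier (fallback shown for a fresh shell).
BUCKET_NAME="${BUCKET_NAME:-bucketname}"
GCLOUD_CMD="gcloud storage buckets describe gs://$BUCKET_NAME --log-http"
echo "Running: $GCLOUD_CMD"
if command -v gcloud >/dev/null 2>&1; then
  # The request log should show a storage.googleapis.com URL.
  $GCLOUD_CMD || echo "gcloud command failed (check auth / project)"
else
  echo "gcloud not installed on this machine; skipping"
fi
```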
To verify that the new Cloud Storage bucket is created, go to the GCP console under Cloud Storage > Buckets to view the new bucket.
Copy a file to Cloud Storage
Pre-verify the files that are already created (if any) in the Cloud Storage bucket by going to the GCP console under Cloud Storage > Buckets to view them. Take a screen capture if you’d like to have a point of comparison.
At the command line, run: touch $FILE_NAME
Optionally, run nano $FILE_NAME, add some text, and save the file. Then run this command…
curl -X POST --data-binary @$FILE_NAME \
-H "Authorization: Bearer $ACCESS_TOKEN" \
-H "Content-Type: application/octet-stream" \
"https://storage.googleapis.com/upload/storage/v1/b/$BUCKET_NAME/o?uploadType=media&name=$FILE_NAME"
You can also use gcloud instead of the curl command to copy a file…
gcloud storage cp $FILE_NAME gs://$BUCKET_NAME
To verify, go to the GCP console under Cloud Storage > Buckets > choose Bucket name to view the file.
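You can also verify the upload from the command line over the same private path, by listing the bucket and downloading the object back through the JSON API. A sketch, assuming the variables above are still set:

```shell
BUCKET_NAME="${BUCKET_NAME:-bucketname}"
FILE_NAME="${FILE_NAME:-file.txt}"
# JSON API download URL; alt=media returns the object data itself.
DOWNLOAD_URL="https://storage.googleapis.com/storage/v1/b/$BUCKET_NAME/o/$FILE_NAME?alt=media"
echo "GET $DOWNLOAD_URL"
if command -v gcloud >/dev/null 2>&1; then
  gcloud storage ls "gs://$BUCKET_NAME" || echo "listing failed (check auth)"
  curl -s -m 10 \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "$DOWNLOAD_URL" || echo "download failed (check auth)"
else
  echo "gcloud not installed; skipping live check"
fi
```

Note that object names with special characters need URL-encoding in the path; a plain name like file.txt is fine as-is.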
Common errors to curl commands
Permissions errors:
{
  "error": {
    "code": 403,
    "message": "Provided scope(s) are not authorized",
    "errors": [
      {
        "message": "Provided scope(s) are not authorized",
        "domain": "global",
        "reason": "forbidden"
      }
    ]
  }
}
{
  "error": {
    "code": 403,
    "message": "412809883496-compute@developer.gserviceaccount.com does not have storage.buckets.create access to the Google Cloud project. Permission 'storage.buckets.create' denied on resource (or it may not exist).",
    "errors": [
      {
        "message": "412809883496-compute@developer.gserviceaccount.com does not have storage.buckets.create access to the Google Cloud project. Permission 'storage.buckets.create' denied on resource (or it may not exist).",
        "domain": "global",
        "reason": "forbidden"
      }
    ]
  }
}
The fix: You may see these permissions errors when testing curl from a Google Compute Engine (GCE) VM in Google Cloud. To resolve the second one, find the service account associated with the GCE VM instance by selecting the VM in the GCP console and searching the page for “Service account” (if you haven’t changed it, it’s the default Compute Engine service account, which follows the format PROJECT_NUMBER-compute@developer.gserviceaccount.com). Then, in the GCP console under IAM & Admin > IAM, select the Grant Access button and assign the Storage Admin role to that service account. You’ll see the message: “Policy updated. It may take a few minutes for these changes to become active.” After a few minutes, test again. For the first error (“Provided scope(s) are not authorized”), also check the VM’s access scopes (edit the VM while it is stopped): the default scopes only allow read access to Storage, so set the Storage scope to Read Write or allow full access to all Cloud APIs.
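The role grant can also be done from the command line. A sketch, assuming the VM still uses the default Compute Engine service account (and noting that Storage Admin is a broad role, fine for a test but worth narrowing afterwards):

```shell
PROJECT_ID="${PROJECT_ID:-project-id}"
ROLE="roles/storage.admin"   # broad; use a narrower role outside of tests
if command -v gcloud >/dev/null 2>&1; then
  # The default GCE service account is
  # PROJECT_NUMBER-compute@developer.gserviceaccount.com
  PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" --format='value(projectNumber)')
  SA="${PROJECT_NUMBER}-compute@developer.gserviceaccount.com"
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:$SA" \
    --role="$ROLE" || echo "binding failed (check your own IAM permissions)"
else
  echo "gcloud not installed; skipping"
fi
```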
Bucket name error:
{
  "error": {
    "code": 409,
    "message": "The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.",
    "errors": [
      {
        "message": "The requested bucket name is not available. The bucket namespace is shared by all users of the system. Please select a different name and try again.",
        "domain": "global",
        "reason": "conflict"
      }
    ]
  }
}
The fix: Bucket names are globally unique across all of Cloud Storage, so pick a more distinctive name, e.g. by adding extra letters / numbers at the end of the name, such as “abc123”.
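Since the bucket namespace is global, a timestamp or random suffix is an easy way to land on an unused name (the naming scheme here is just an example):

```shell
# Append a date stamp and a short random suffix to reduce collisions.
BUCKET_NAME="bucketname-$(date +%Y%m%d)-$RANDOM"
echo "Trying bucket name: $BUCKET_NAME"
# Most bucket names must be 3-63 characters: lowercase letters, digits,
# dashes, dots, and underscores, starting and ending with a letter or digit.
echo "$BUCKET_NAME" | grep -Eq '^[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]$' \
  && echo "name format looks valid" || echo "name format invalid"
```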
Summary
The steps here are simple and quick, and they test the full operation of a managed service over private access, whether PSC or PGA.
Further Validate PSC & PGA in Google Cloud and On-prem was originally published in Google Cloud – Community on Medium.
Source Credit: https://medium.com/google-cloud/further-validate-pga-psc-in-google-cloud-and-on-prem-97d345ad74c0?source=rss—-e52cf94d98af—4
