Postman and Google Cloud Platform private keys

We have an API endpoint running on Google Cloud Platform that writes data to Google BigQuery, and we use a never-expiring Bearer token to talk to that endpoint.
I am writing automated tests in Postman that make API calls to the endpoint and to BigQuery as well. The problem I am having is that Google authenticates with something called a service account (private key) in the form of a .json file, and as far as I can see Postman does not support this for Google Cloud Platform. As a workaround I use the service account to generate a Bearer token and query BigQuery with that, but the token expires in 1 hour, so I have to manually generate a new one (GOOGLE_APPLICATION_CREDENTIALS=~/<path-to-service-account.json> gcloud auth application-default print-access-token) every time I run the automated tests. My question is:

  1. Is there a way, using the Postman app and a Google service account (Google private keys), to have my tests run without user intervention?
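For reference, the manual workaround today looks roughly like the sketch below. The key path and project name are placeholders, and the BigQuery jobs.query call is only an illustration of how the token gets used:

    # Generate a short-lived token from the service-account key (expires after ~1 hour).
    export GOOGLE_APPLICATION_CREDENTIALS=~/service-account.json
    TOKEN=$(gcloud auth application-default print-access-token)

    # Paste the token into Postman's Authorization header, or equivalently call BigQuery directly:
    curl -X POST \
      "https://bigquery.googleapis.com/bigquery/v2/projects/MY_PROJECT/queries" \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      -d '{ "query": "SELECT 1", "useLegacySql": false }'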

Hi Pushyamig,
I have the same use case. May I know whether you found any solution for this?

I start my newman collection from a Jenkins pipeline (Jenkinsfile), so generating the token at the beginning of the pipeline with the command above is what works for me: a temporary access token that exists just for the execution of the test suite.
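Roughly, a sketch of what the shell step can look like (paths, collection name, and the variable name are placeholders, and the key file would normally be injected from a Jenkins credentials binding rather than kept in the workspace):

    # Inside a Jenkins 'sh' step.
    export GOOGLE_APPLICATION_CREDENTIALS="$WORKSPACE/service-account.json"

    # Short-lived token, valid only for this pipeline run.
    TOKEN=$(gcloud auth application-default print-access-token)

    # Pass the token into the collection so requests can reference {{access_token}}.
    newman run my-collection.json \
      --env-var "access_token=$TOKEN" \
      --reporters cli,junit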

Could you give an example of how you configure your Jenkins pipeline to set GOOGLE_APPLICATION_CREDENTIALS and run Postman?

I can’t wrap my head around it and am hitting a wall.

Hi there, did anyone get a handle on this? I am facing the same situation.

@docking-module-ast23

Can you post a link to the Google documentation you are using for this?

I can’t see this on the main authentication pages.

Authentication methods at Google | Google Cloud

This perhaps should also have been its own topic. Can you confirm it's the same issue the original poster had?

Hi Michael, thanks for replying. Yes, the post does capture most of what I am trying to do, i.e. there's an external call being made through an application (similar to a webhook) to post messages to Pub/Sub.

For testing purposes, we did it manually, i.e.:

  1. created a Service Account (SA) in GCP and gave it the required Pub/Sub roles,

  2. downloaded its key in the GCP environment and used Cloud Shell (GCP) to activate it (gcloud auth activate-service-account),

  3. then issued another command in Cloud Shell (gcloud auth print-access-token) to get an access token.

  4. Using that token, we did 2 things:
    a) Checked connectivity from Postman:
    curl -X GET \
      "https://pubsub.googleapis.com/v1/projects/PRJNAME/topics/TOPICNAME" \
      -H 'Authorization: Bearer <ACCESS_TOKEN>' \
      -H 'Content-Type: application/json'

    b) Published a message through Postman:
    curl -X POST \
      "https://pubsub.googleapis.com/v1/projects/PRJNAME/topics/TOPICNAME:publish" \
      -H 'Authorization: Bearer <ACCESS_TOKEN>' \
      -H 'Content-Type: application/json' \
      -H 'Accept: application/json' \
      -d '{ "messages": [ { "data": "SGVsbG8gV29ybGQhCg==" } ] }'

Both 4a and 4b succeeded when run manually from Postman / the command prompt, which is what the first post in this thread describes as well.
So I am trying to figure out how steps 2 & 3 can be automated, and how step 4 can be parameterized.