Add files to Environments (JSON schemas, CSV files etc.)

Tags: json, schema, jsondatafile

#1

I’ve started implementing JSON schema validation in my tests and it’s a bit of a pain and, of course, a maintenance nightmare.

I’ve currently got my schema defined in the “Pre-request Script” of a request, where I save it as an environment variable.

If this request is used several times in a test collection, I have to make sure the “original” request runs first, as that’s where the JSON schema environment variable is defined (unless I define it in every “Pre-request Script” for this request, which would be an even bigger maintenance nightmare!)
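For context, the current workaround looks roughly like the sketch below. The variable name `userSchema`, the schema itself, and the tiny validator are made-up examples; in Postman the two commented lines would be `pm.environment.set` / `pm.environment.get`, and validation would use the sandbox’s built-in `tv4` or Ajv. A plain object stands in for the environment so the sketch runs anywhere:

```javascript
// Stand-in for pm.environment so this runs outside Postman.
const environment = {};

// --- "Pre-request Script" of the ORIGINAL request only ---
const userSchema = {
  type: "object",
  required: ["id", "name"],
  properties: { id: { type: "number" }, name: { type: "string" } }
};
environment.userSchema = JSON.stringify(userSchema); // pm.environment.set("userSchema", ...)

// --- "Tests" tab of any later request ---
const schema = JSON.parse(environment.userSchema);   // pm.environment.get("userSchema")

// Tiny stand-in validator covering just this example; in Postman you
// would use the built-in tv4 or Ajv instead.
function validates(s, data) {
  if (s.type === "object") {
    if (typeof data !== "object" || data === null) return false;
    if ((s.required || []).some((k) => !(k in data))) return false;
    return Object.entries(s.properties || {}).every(
      ([k, sub]) => !(k in data) || validates(sub, data[k])
    );
  }
  return typeof data === s.type;
}

console.log(validates(schema, { id: 1, name: "Ada" })); // true
console.log(validates(schema, { id: "not-a-number" })); // false
```

The fragility is the first block: the schema lives in one request’s script, so any collection run that skips that request leaves the environment variable undefined.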

Proposed Solution:

Have the ability to upload/attach files (JSON/CSV) to an environment.

  • This means I can define the JSON schema for given requests in ONE place
  • This also removes the “snowballing” effect that the approach above requires.
  • The user should be able to edit/create the files within Postman (in a separate window). This way, if a schema has changed the user can update the file quickly instead of having to open a text editor, find the file, edit it, save it, re-import into Postman and then re-run the test.

I’m sure this feature could be used for other things, but JSON schema validation is the main headache for me.

There’s a great article below about making this sort of test easier to maintain, but it would be nicer if Postman handled this better

#2

If you keep your collection constrained to a small set of related tests, you can also place your schemas into the collection’s Variables tab (select the “…” next to the collection name, choose Edit, and open the Variables tab). This lets you specify a schema once for the entire collection, and it is available straight away to all requests without relying on a certain request being made first.
In the future, I would want Postman to have a way of separating these into their own tab so that they don’t muddy up the library of common functions I use elsewhere in my collection.
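A minimal sketch of the collection-variable approach, assuming a made-up variable name `orderSchema`. A `Map` stands in for `pm.collectionVariables` so the sketch runs anywhere; the point is that the value is defined once on the collection and readable from the very first request:

```javascript
// Stand-in for pm.collectionVariables: populated once on the collection
// itself (Variables tab), not by any request's script.
const collectionVariables = new Map([
  ["orderSchema", JSON.stringify({ type: "object", required: ["orderId"] })]
]);

// In any request's "Tests" tab this would be:
//   const schema = JSON.parse(pm.collectionVariables.get("orderSchema"));
const schema = JSON.parse(collectionVariables.get("orderSchema"));

// Available on the very first request - no "schema-setting" request
// needs to run beforehand.
console.log(schema.required); // [ 'orderId' ]
```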

#3

I tried connecting directly to Swagger and using Ajv (pulled in via require) - I can’t get it working yet, and could do with a good how-to to work through.
In the meantime, for environment variables I have a solution where I run a localhost server and GET updated JSON files from it (it’s just an Apache server with JSON files in the documents folder - couldn’t be easier), setting them as environment or global variables.
Makes maintenance much easier…

Have to say, though, by this point the complexity of maintaining the library of Postman collections is at the point where I am seriously looking at migrating a lot of it to a Java REST-Assured project with Serenity reporting - I need convincing it hasn’t reached ‘critical mass’ to stop thinking this way…

#4

Our company has already moved away and is currently looking at REST-Assured or JavaScript. We’re leaning more towards JavaScript, since it handles JSON natively and there are libraries out there to handle the request/response promises.

We had a few reasons to move. We test about 120 APIs and the number keeps growing, which meant that keeping all the collections up to date with environment changes was becoming difficult; e.g. everything requires a token, so when we changed the security API we had to change 119 APIs in multiple places. The peer review process is convoluted, unlike most version-controlled repositories where you can see the code change within the tool. And there is no way to create template methods that can be used everywhere; e.g. we have a method that returns true if the response code is valid, so that we can put the rest of the test behind a check, but when that code changes we have to remember all the places it has been used.
Postman stopped being useful to us once we got beyond about 30 APIs, with a collection for each, and once we had more than 5 people working on the APIs (we now have 15 QAs and 30 devs actively working in Postman).
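The “template method” pain point can be made concrete with a hypothetical sketch of the guard helper described above. Because Postman has no shared module system for test scripts, a snippet like this has to be copy-pasted into every request’s “Tests” tab, so changing it means editing every copy by hand:

```javascript
// Hypothetical guard helper: duplicated into each request's test script.
function responseOk(code) {
  // treat any 2xx status as a valid response
  return code >= 200 && code < 300;
}

// Gate the remaining assertions behind the check.
const status = 200; // in Postman: pm.response.code
if (responseOk(status)) {
  console.log("running the remaining tests");
}
```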

#5

Makes sense - I value Postman for its immediacy and flexibility when you are dealing with the here and now, but committing longer-term work to a library and reusing functions, templates, etc. becomes complex and needs further tooling written to service it, e.g. scripts to export and import between CSV and JSON, and libraries of globals that get run across workspaces for shared functions.
Kept simple, it serves best; leave the complex reusable stuff to a more collaborative test development approach with a tool better aligned to version control and collaboration. REST-Assured already serves some test stages in the strategy, so it makes sense to run other repos alongside it for uniformity and to leverage common work.