
Passing Test Inputs into pytest


Mar 22, 2024
Somebody recently asked me this question:

I’m developing a pytest project to test an API. How can I pass environment information into my tests? I need to run tests against different environments like DEV, TEST, and PROD. Each environment has a different URL and a unique set of users.

This is a common problem for automated test suites, not just in Python or pytest. Any information a test needs about the environment under test is called configuration metadata. URLs and user accounts are common configuration metadata values. Tests need to know what site to hit and how to authenticate.

Using config files with an environment variable

There are many ways to handle inputs like this. I like to create JSON files to store the configuration metadata for each environment. So, something like this:

  • dev.json
  • test.json
  • prod.json

Each one might look like this:

"base_url": "
"username": "pandy",
"password": "DandyAndySugarCandy"

The structure of each file must be the same so that tests can handle them interchangeably.
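For instance, two environment files with identical keys can be swapped without touching any test code. A minimal sketch, parsing hypothetical file contents inline:

```python
import json

# Hypothetical contents of dev.json and prod.json with identical keys.
dev = json.loads('{"base_url": "https://dev.example.test", "username": "pandy", "password": "x"}')
prod = json.loads('{"base_url": "https://prod.example.test", "username": "pandy", "password": "y"}')

# Because the structure matches, the same lookups work for either environment.
for config in (dev, prod):
    print(config["base_url"])
```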

I like using JSON files because:

  • they’re plain text files with a standard format
  • they’re easy to diff
  • they store data hierarchically
  • Python’s standard json module turns them into dictionaries in 2 lines flat
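As a quick illustration of that last point, here is a minimal sketch (it writes a throwaway config file first so the snippet is self-contained; the file name and values are hypothetical):

```python
import json

# Create a throwaway config file just for this demo.
with open("demo.json", "w") as f:
    f.write('{"base_url": "https://dev.example.test", "username": "pandy"}')

# The "2 lines flat": open the file and parse it into a dictionary.
with open("demo.json") as config_file:
    config_data = json.load(config_file)

print(config_data["username"])  # → pandy
```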

Then, I create an environment variable to set the desired config file:

export TARGET_ENV=dev.json

In my pytest project, I write a fixture to get the config file path from this environment variable and then read that file as a dictionary:

import json
import os
import pytest

@pytest.fixture(scope='session')
def target_env():
    config_path = os.environ['TARGET_ENV']
    with open(config_path) as config_file:
        config_data = json.load(config_file)
    return config_data

I’ll put this fixture in a conftest.py file so all tests can share it. Since it uses session scope, pytest will execute it one time before all tests. Test functions can call it like this:

import requests

def test_api_get(target_env):
    url = target_env['base_url']
    creds = (target_env['username'], target_env['password'])
    response = requests.get(url, auth=creds)
    assert response.status_code == 200

Selecting the config file with a command line argument

If you don’t want to use environment variables to select the config file, you could instead create a custom pytest command line argument. Bas Dijkstra wrote an excellent article showing how to do this. Basically, you could add the following function to conftest.py to add the custom argument:

def pytest_addoption(parser):
    parser.addoption(
        '--target-env',
        action='store',
        default='dev.json',
        help='Path to the target environment config file')

Then, update the target_env fixture:

import json
import pytest

@pytest.fixture(scope='session')
def target_env(request):
    config_path = request.config.getoption('--target-env')
    with open(config_path) as config_file:
        config_data = json.load(config_file)
    return config_data

When running your tests, you’ll specify the config file path like this:

python -m pytest --target-env dev.json
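The custom option follows argparse-style semantics, so its behavior can be sketched with Python's standard argparse module (a standalone illustration, assuming a hypothetical default of dev.json; this is not pytest's actual implementation):

```python
import argparse

# Stand-in for the custom pytest option, modeled with plain argparse.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--target-env",
    action="store",
    default="dev.json",  # assumed default: fall back to DEV when the flag is omitted
    help="Path to the target environment config file",
)

print(parser.parse_args([]).target_env)                             # → dev.json
print(parser.parse_args(["--target-env", "prod.json"]).target_env)  # → prod.json
```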

Why bother with JSON files?

In theory, you could pass all inputs into your tests with pytest command line arguments or environment variables. You don’t need config files. However, I find that storing configuration metadata in files is much more convenient than setting a bunch of inputs every time I need to run my tests. In our example above, passing one value for the config file path is much easier than passing three different values for base URL, username, and password. Real-world test projects might need even more inputs. Plus, configurations don’t change frequently, so it’s okay to save them in a file for repeated use. Just make sure to keep your config files safe if they hold any secrets.

Validating inputs

Whenever reading inputs, it’s good practice to make sure their values are good. Otherwise, tests could crash! I like to add a few basic assertions as safety checks:

import json
import os
import pytest

@pytest.fixture(scope='session')
def target_env(request):
    config_path = request.config.getoption('--target-env')
    assert os.path.isfile(config_path)

    with open(config_path) as config_file:
        config_data = json.load(config_file)

    assert 'base_url' in config_data
    assert 'username' in config_data
    assert 'password' in config_data

    return config_data

Now, pytest will stop immediately if any inputs are incorrect.
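The effect of those safety checks can be sketched outside of pytest with a small hypothetical helper that mirrors the fixture's assertions:

```python
def validate_config(config_data):
    # Hypothetical helper mirroring the fixture's safety checks.
    for key in ("base_url", "username", "password"):
        assert key in config_data, f"Missing config key: {key}"
    return config_data

# A complete config passes silently.
validate_config({"base_url": "https://dev.example.test", "username": "pandy", "password": "x"})

# An incomplete config fails fast with a clear message.
try:
    validate_config({"base_url": "https://dev.example.test"})
except AssertionError as error:
    print(error)  # → Missing config key: username
```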