
Step 3: Add tests #71

Open · wants to merge 4 commits into master
Conversation

annie444 (Collaborator)
This PR adds pytest tests to the project for automated testing. To get started, you can either use uv sync --all-extras, or, more specifically, uv sync --extra test. If you're not using uv yet, you can also use pip install -e .[test] to install pytest and the dependencies for these tests.

The tests are designed to spin up an instance of docker.io/ipac/firefly:latest and run the tests (as defined in the examples/ directory) against the container. I also added a mock for the Python webbrowser module to keep the tests contained. As you have probably gathered by looking over the files, all Python files that start with test_ will be scanned for tests by pytest, and all functions in those files that start with test_ will be executed as tests.
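As a minimal sketch of these conventions (the file and function names here are hypothetical, not part of this PR), a file named test_example.py like the following would be collected automatically, with the browser call mocked out much as the suite does via pytest-mock:

```python
# test_example.py -- a hypothetical minimal module; pytest collects any
# file named test_*.py and runs every function in it named test_*.
import webbrowser
from unittest import mock


def test_browser_is_mocked():
    # Patch webbrowser.open so no real browser window opens; the suite
    # achieves the same effect with pytest-mock's `mocker.patch` fixture.
    with mock.patch("webbrowser.open") as mocked_open:
        webbrowser.open("http://127.0.0.1:8080/firefly")
        mocked_open.assert_called_once_with("http://127.0.0.1:8080/firefly")
```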

After the pytest command-line tool runs and all tests complete (successfully or not), you'll get a coverage.xml report and an HTML coverage report in the coverage/ directory. These show a breakdown of how much of the code base is currently tested. I recommend using open coverage/index.html to view the coverage report, unless you have a code-analysis tool that supports coverage.xml files.
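Reports like these are what pytest-cov produces; a pyproject.toml fragment along these lines (an assumption on my part, the PR's actual configuration and output paths may differ) would generate both the XML and HTML outputs on every run:

```toml
[tool.pytest.ini_options]
addopts = "--cov=firefly_client --cov-report=xml:coverage.xml --cov-report=html:coverage"
```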

NOTE: The test_adv_table_imgs_upload test in the test/test_demo_advanced_tables_images_upload.py file is currently failing as it relies on a local file that is not included in the repo, so I was never able to test that function.


@annie444 annie444 requested a review from robyww February 20, 2025 22:30
@annie444 annie444 self-assigned this Feb 20, 2025
@annie444 annie444 mentioned this pull request Feb 20, 2025
@jaladh-singhal (Member) left a comment:
Thanks for incorporating pytest. I've left several comments.

Another thing I noticed: you converted the example notebooks/scripts into test scripts. There are a few issues with this:

  • They will go out of sync as the notebook content is updated. That's why the CI at https://github.com/Caltech-IPAC/irsa-tutorials/tree/main uses jupytext to convert MyST notebooks to test.py files at run time. I see firefly-client going a similar route in the longer term.
  • Many of these examples are obsolete and redundant. We have a ticket to fix all of these problems with the documentation notebooks, which I'll be working on in the near future when I get time.

So I'd recommend holding off on creating test files for examples/*.ipynb. I'm OK with leaving the test files for examples/*.py as a proof of concept.

"pytest-mock>=3.14.0",
"pytest-xdist>=3.6.1",
]
astropy = ["astropy>=6"]
Member
This can go in the test list defined above.

Comment on lines +7 to +19
FIREFLY_CONTAINER = Container(
url="docker.io/ipac/firefly:latest",
extra_launch_args=["--memory=4g"],
entry_point=EntrypointSelection.AUTO,
forwarded_ports=[
PortForwarding(
container_port=8080,
protocol=NetworkProtocol.TCP,
host_port=random.randint(8000, 65534),
bind_ip="127.0.0.1",
)
],
)
Member
I love this! I wasn't aware this was possible in the pytest world. @bsipocz this should also help with the CI pipeline in irsa-tutorials - we will have to do some mocking though.



@pytest.mark.parametrize("container", [FIREFLY_CONTAINER], indirect=["container"])
def test_5_instances_3_channels(container: ContainerData, mocker: MockerFixture):
Member
I think this is a great addition - I'll test it locally after doing setup.

Comment on lines +14 to +23
assert container.forwarded_ports[0].host_port >= 8000
assert container.forwarded_ports[0].host_port <= 65534
assert container.forwarded_ports[0].container_port == 8080
assert container.forwarded_ports[0].bind_ip == "127.0.0.1"

time.sleep(5)

mocker.patch("webbrowser.open")

host = f"http://{container.forwarded_ports[0].bind_ip}:{container.forwarded_ports[0].host_port}/firefly"
Member
Can we perhaps move this repeated code into a fixture that returns the fc object?
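A sketch of such a fixture (hypothetical, not code from this PR): it assumes the container and mocker fixtures from pytest_container and pytest-mock used elsewhere in the suite, and that FireflyClient.make_client accepts url and launch_browser keywords as in current firefly_client releases.

```python
# conftest.py -- hypothetical shared fixture for the Firefly tests
import pytest


@pytest.fixture
def fc(container, mocker):
    """Return a FireflyClient wired to the test container, with the
    browser call mocked out so the tests stay headless."""
    # Imported lazily so collection does not require firefly_client.
    from firefly_client import FireflyClient

    mocker.patch("webbrowser.open")
    port = container.forwarded_ports[0]
    host = f"http://{port.bind_ip}:{port.host_port}/firefly"
    return FireflyClient.make_client(url=host, launch_browser=False)
```

Each test would then take fc as an argument instead of repeating the port lookup and URL construction.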

Comment on lines +64 to +73
# Set stretch using hue-preserving algorithm with scaling coefficients .15 for RED, 1.0 for GREEN, and .4 for BLUE.
fc.set_stretch_hprgb(
plot_id="wise_m101", asinh_q_value=4.2, scaling_k=[0.15, 1, 0.4]
)
# Set per-band stretch
fc.set_stretch(plot_id="wise_m101", stype="ztype", algorithm="asinh", band="ALL")
# Set contrast and bias for each band
fc.set_rgb_colors(plot_id="wise_m101", bias=[0.4, 0.6, 0.5], contrast=[1.5, 1, 2])
# Set to use red and blue band only, with default bias and contrast
fc.set_rgb_colors(plot_id="wise_m101", use_green=False)
Member
The Firefly client doesn't throw any error if these commands fail to make the change in the web application. But I think what we can do instead is capture their result and assert result["success"]
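One way to do this uniformly (a hypothetical helper, not part of firefly_client) is a small assertion wrapper applied to each command's dict-like return value:

```python
def assert_success(result, context=""):
    """Fail loudly when a FireflyClient command reports that it did not
    take effect (fc.set_stretch, fc.set_rgb_colors, etc. return a
    dict-like status)."""
    assert result.get("success"), f"Firefly command failed: {context or result}"


# Hypothetical usage inside a test:
# assert_success(fc.set_rgb_colors(plot_id="wise_m101", use_green=False),
#                "set_rgb_colors")
```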

annie444 (Collaborator, Author)

Thank you! I wasn't diving into the logic very much, but just focusing on getting things working with pytest. I'll update the tests accordingly!

hips_url = "http://alasky.u-strasbg.fr/AllWISE/RGB-W4-W2-W1"
status = fc.show_hips(plot_id="aHipsID2", hips_root_url=hips_url)

assert status is not None
Member
Same here; I think we should explicitly test that status has its success key set to True.

annie444 (Collaborator, Author)
To this I would recommend an alternative approach: refactor the tests into true unit tests and keep the examples separate. If you look now, the code coverage is just okay, and the goal there is to get as close to 100% coverage as possible. However, testing to 100% coverage through the examples alone is often a bit pedantic and would only confuse users. Additionally, unit tests will give you/me/the developers of firefly_client a clear picture of what breaking changes have been made during a version change.

So I'd recommend holding off on creating test files for examples/*.ipynb. I'm OK with leaving the test files for examples/*.py as a proof of concept.

These tests are definitely just a proof of concept. I literally just pulled the examples and reformatted them as pure Python.
