
ci(doc): add doc build caching #2342


Open · PProfizi wants to merge 17 commits into master

Conversation

@PProfizi (Contributor) commented Jun 5, 2025

The goal is to accelerate doc building using cached Sphinx build files whenever possible.

The caching logic retrieves, in order of priority:

  • the cache from last run on this branch
  • the cache from master

The cache should only contain files for which Sphinx can detect changes:

  • this is true for modules used by AutoAPI
  • this is true for example scripts

The cache key could be based on the hash of a lock file, as differences in installed graphics dependencies could change the rendered output even when Sphinx detects no changes in the scripts themselves.
That means, however, that the cache will only be reused when the dependencies are unchanged.
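
A minimal sketch of what such a step could look like with GitHub's actions/cache is shown below. The cached path (doc/_build), the lock-file name (doc.lock), and the key scheme are illustrative assumptions, not the workflow actually used in this PR:

    # Hypothetical caching step; paths and file names are assumptions.
    - name: Restore Sphinx build cache
      uses: actions/cache@v4
      with:
        # Cache only the Sphinx build artifacts (doctrees and HTML output).
        path: doc/_build
        # Unique key per run, so an updated cache is saved at the end of each job.
        key: doc-build-${{ github.ref_name }}-${{ hashFiles('doc.lock') }}-${{ github.run_id }}
        # Fallbacks, in order of priority: last run on this branch, then master.
        # Keeping the lock-file hash in the prefix means no cache is restored
        # when dependencies change.
        restore-keys: |
          doc-build-${{ github.ref_name }}-${{ hashFiles('doc.lock') }}-
          doc-build-master-${{ hashFiles('doc.lock') }}-

Because cache entries are immutable per key, appending github.run_id lets each run save a fresh cache, while the restore-keys prefixes provide the branch-then-master fallback described above.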

@PProfizi PProfizi self-assigned this Jun 5, 2025
@PProfizi PProfizi added the CI/CD Related to CI/CD label Jun 5, 2025

codecov bot commented Jun 5, 2025

❌ 2 Tests Failed:

Tests completed | Failed | Passed | Skipped
25718           | 2      | 25716  | 3617
View the full list of 2 ❄️ flaky tests
tests/test_data_tree.py::test_read_from_txt_data_tree[ansys-grpc-dpf]

Flake rate in main: 15.87% (Passed 53 times, Failed 10 times)

Stack Traces | 0.016s run time
server_type = <ansys.dpf.core.server_types.LegacyGrpcServer object at 0x7f46aa3f6410>

    @conftest.raises_for_servers_version_under("4.0")
    def test_read_from_txt_data_tree(server_type):
        data_tree = dpf.DataTree(server=server_type)
        with data_tree.to_fill() as to_fill:
            to_fill.int = 1
            to_fill.double = 1.0
            to_fill.string = "hello"
            to_fill.list_int = [1, 2]
            to_fill.list_double = [1.5, 2.5]
            to_fill.add(list_string=["hello", "bye"])
>       txt = data_tree.write_to_txt()

tests/test_data_tree.py:243: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
..../test-api/lib/python3.10.../dpf/core/data_tree.py:300: in write_to_txt
    return self._serialize(path, op)
..../test-api/lib/python3.10.../dpf/core/data_tree.py:269: in _serialize
    return operator.get_output(0, core.types.string)
..../test-api/lib/python3.10.../dpf/core/dpf_operator.py:603: in get_output
    internal_obj = type_tuple[1](self, pin)
..../test-api/lib/python3.10.../dpf/core/dpf_operator.py:325: in _getoutput_string
    out = Operator._getoutput_string_as_bytes(self, pin)
..../test-api/lib/python3.10.../dpf/core/dpf_operator.py:340: in _getoutput_string_as_bytes
    return self._api.operator_getoutput_string(self, pin)
..../test-api/lib/python3.10.../dpf/gate/errors.py:38: in wrapper
    out = func(*args, **kwargs)
..../test-api/lib/python3.10.../dpf/gate/operator_grpcapi.py:351: in operator_getoutput_string
    return OperatorGRPCAPI.get_output_finish(op, request, stype, subtype)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (<ansys.dpf.core.operators.serialization.data_tree_to_txt.data_tree_to_txt object at 0x7f46aad216f0>, op {
  id {
    id: 87
    server_address: "127.0.0.1:50059"
  }
  name: "data_tree_to_txt"
}
, 'string', '')
kwargs = {}, _InactiveRpcError = <class 'grpc._channel._InactiveRpcError'>
_MultiThreadedRendezvous = <class 'grpc._channel._MultiThreadedRendezvous'>
details = 'DPF issue due to licensing context: execution stopped. Apply Premium context to unlock this capability.'

    @wraps(func)
    def wrapper(*args, **kwargs):
        """Capture gRPC exceptions."""
        from grpc._channel import _InactiveRpcError, _MultiThreadedRendezvous
        try:
            out = func(*args, **kwargs)
        except (_InactiveRpcError, _MultiThreadedRendezvous) as error:
            details = error.details()
            if "object is null in the dataBase" in details:
                raise DPFServerNullObject(details) from None
            elif "Unable to open the following file" in details:
                raise DPFServerException(
                    "The result file could not be found or could not be opened, the server raised an error message: \n" + details) from None
>           raise DPFServerException(details) from None
E           ansys.dpf.gate.errors.DPFServerException: DPF issue due to licensing context: execution stopped. Apply Premium context to unlock this capability.

..../test-api/lib/python3.10.../dpf/gate/errors.py:46: DPFServerException
tests/test_data_tree.py::test_write_to_file_data_tree[ansys-grpc-dpf]

Flake rate in main: 14.29% (Passed 54 times, Failed 9 times)

Stack Traces | 0.024s run time
tmpdir = local('.../pydpf-core/pydpf-core/.tox.../pytest-of-runner/pytest-0/test_write_to_file_data_tree_a2')
server_type = <ansys.dpf.core.server_types.LegacyGrpcServer object at 0x7f46aa3f6410>

    @conftest.raises_for_servers_version_under("4.0")
    def test_write_to_file_data_tree(tmpdir, server_type):
        data_tree = dpf.DataTree(server=server_type)
        with data_tree.to_fill() as to_fill:
            to_fill.int = 1
            to_fill.double = 1.0
            to_fill.string = "hello"
            to_fill.list_int = [1, 2]
            to_fill.list_double = [1.5, 2.5]
            to_fill.list_string = ["hello", "bye"]
>       data_tree.write_to_txt(str(Path(tmpdir) / "file.txt"))

tests/test_data_tree.py:180: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
..../test-api/lib/python3.10.../dpf/core/data_tree.py:300: in write_to_txt
    return self._serialize(path, op)
..../test-api/lib/python3.10.../dpf/core/data_tree.py:260: in _serialize
    operator.run()
..../test-api/lib/python3.10.../dpf/core/dpf_operator.py:780: in run
    self.get_output()
..../test-api/lib/python3.10.../dpf/core/dpf_operator.py:588: in get_output
    return self._api.operator_run(self)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (<ansys.dpf.core.operators.serialization.data_tree_to_txt.data_tree_to_txt object at 0x7f46aa35c610>,)
kwargs = {}, _InactiveRpcError = <class 'grpc._channel._InactiveRpcError'>
_MultiThreadedRendezvous = <class 'grpc._channel._MultiThreadedRendezvous'>
details = 'DPF issue due to licensing context: execution stopped. Apply Premium context to unlock this capability.'

    @wraps(func)
    def wrapper(*args, **kwargs):
        """Capture gRPC exceptions."""
        from grpc._channel import _InactiveRpcError, _MultiThreadedRendezvous
        try:
            out = func(*args, **kwargs)
        except (_InactiveRpcError, _MultiThreadedRendezvous) as error:
            details = error.details()
            if "object is null in the dataBase" in details:
                raise DPFServerNullObject(details) from None
            elif "Unable to open the following file" in details:
                raise DPFServerException(
                    "The result file could not be found or could not be opened, the server raised an error message: \n" + details) from None
>           raise DPFServerException(details) from None
E           ansys.dpf.gate.errors.DPFServerException: DPF issue due to licensing context: execution stopped. Apply Premium context to unlock this capability.

..../test-api/lib/python3.10.../dpf/gate/errors.py:46: DPFServerException


@PProfizi PProfizi requested review from moe-ad and removed request for moe-ad June 6, 2025 07:55

github-actions bot commented Jun 6, 2025

The documentation for this pull request will be available at https://dpf.docs.pyansys.com/pull/2342. Please allow some time for the documentation to be deployed.

@PProfizi (Contributor, Author) commented:

@Revathyvenugopal162 pinging you FYI, this PR tries to use a few Sphinx tricks to improve config change detection.
It then adds caching logic to the pipelines to see how short we can make the doc build step.
This is a work in progress.
As you mentioned on the ansys-sphinx-theme issue, there are definitely extensions still triggering a rebuild somewhere.

Labels: CI/CD (Related to CI/CD)