
FIX: preserve original keys after key_builder transformation #564

Merged

Conversation

@haizhou (Contributor) commented Apr 11, 2022

What do these changes do?

Preserve the original keys for the underlying function call after the key_builder transformation takes place in multi_cached.

Are there changes in behavior for the user?

There shouldn't be, assuming the intent of key_builder is to allow cache keys to be formatted differently from the original keys passed in.
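For illustration, here is a minimal standalone stand-in for the intended behavior, using a plain dict as the cache and a hypothetical key_builder (this is a sketch, not aiocache's actual implementation):

```python
# Standalone sketch (not aiocache itself): the wrapped function must see the
# ORIGINAL keys; key_builder-transformed keys are used only against the cache.

cache = {}

def key_builder(key):
    return f"prefix:{key}"  # hypothetical transformation

def multi_cached_call(fn, keys):
    cache_keys = {k: key_builder(k) for k in keys}
    missing = [k for k in keys if cache_keys[k] not in cache]
    if missing:
        result = fn(missing)  # original keys reach the function
        for k, v in result.items():
            cache[key_builder(k)] = v  # transformed keys go into the cache
    return {k: cache[cache_keys[k]] for k in keys}

def fetch(keys):
    return {k: k.upper() for k in keys}

print(multi_cached_call(fetch, ["a", "b"]))  # {'a': 'A', 'b': 'B'}
print(sorted(cache))  # ['prefix:a', 'prefix:b']
```

The caller and the wrapped function only ever deal in original keys; the prefixed keys exist solely inside the cache.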

Related issue number

#563

Checklist

  • I think the code is well written
  • Unit tests for the changes exist
  • Documentation reflects the changes
  • Add a new news fragment into the CHANGES folder
    • name it <issue_id>.<type> (e.g. 588.bugfix)
    • if you don't have an issue_id change it to the pr id after creating the PR
    • ensure type is one of the following:
      • .feature: Signifying a new feature.
      • .bugfix: Signifying a bug fix.
      • .doc: Signifying a documentation improvement.
      • .removal: Signifying a deprecation or removal of public API.
      • .misc: A ticket has been closed, but it is not of interest to users.
    • Make sure to use full sentences with correct case and punctuation, for example: Fix issue with non-ascii contents in doctest text files.

@Dreamsorcerer (Member)

Looks right to me, but can you add a test to demonstrate the problem (and merge master)?

Sorry for the delay.

@padraic-shafer (Contributor)

Hmmm, there still seems to be some ambiguity regarding the dict keys that are returned by the decorated function and the role of key_builder in multi_cached. I see two interpretations. Can someone help to clarify the intended usage (or the prominent use case)?

Interpretation 1

The values of the iterable passed to keys_from_attr exactly match the dict keys returned by the decorated function. key_builder enables these keys to be mapped to some other value before they are stored in the cache.

This appears to be aligned with the changes made by this PR.

Interpretation 2

The dict keys returned by the decorated function correspond 1-to-1 with the values of the iterable passed to keys_from_attr, but they need not match. It is the job of key_builder to provide this mapping so that the cache keys match the returned dict keys. [The cache keys might get further modified by the cache's build_key(key, namespace) but that is not relevant here.]

This was the context for my earlier comments on this PR.
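Interpretation 2 can be sketched standalone (hypothetical names; the mapping here is simply str.upper, and this is not aiocache code):

```python
# Interpretation 2, simulated: the function returns keys that differ from the
# inputs, and key_builder maps each input key to its returned/cache key.

cache = {}

def key_builder(key):
    return key.upper()  # hypothetical input-key -> returned-key mapping

def fetch(keys):
    return {k.upper(): len(k) for k in keys}  # returns transformed keys

def call(keys):
    missing = [k for k in keys if key_builder(k) not in cache]
    if missing:
        cache.update(fetch(missing))
    return {key_builder(k): cache[key_builder(k)] for k in keys}

print(call(["a", "bb"]))  # {'A': 1, 'BB': 2}
```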

@haizhou (Contributor, Author) commented Jan 12, 2023

Agree with @pshafer-als on the need for clarification about the intended use case for key_builder in multi_cached. I added an example, but as of the latest release it raises a KeyError at https://github.com/aio-libs/aiocache/blob/0.11.1/aiocache/decorators.py#L366, because the keys in the returned result have already been transformed by key_builder but are passed through it again.
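The failure mode described above can be reproduced in miniature (standalone sketch, not the actual aiocache code path):

```python
# Before this fix: if the already-transformed keys leak into the wrapped
# function call, the returned dict is keyed by transformed keys, and indexing
# it with the original keys raises KeyError.

def key_builder(key):
    return f"ns:{key}"  # hypothetical transformation

def fetch(keys):
    return {k: k.upper() for k in keys}

def buggy_call(fn, keys):
    cache_keys = [key_builder(k) for k in keys]
    result = fn(cache_keys)  # BUG: transformed keys passed to the function
    return {k: result[k] for k in keys}  # result has 'ns:a', not 'a'

try:
    buggy_call(fetch, ["a"])
except KeyError as exc:
    print("KeyError:", exc)  # KeyError: 'a'
```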

@Dreamsorcerer (Member)

> Hmmm, there still seems to be some ambiguity regarding the dict keys that are returned by the decorated function and the role of key_builder in multi_cached. I see two interpretations. Can someone help to clarify the intended usage (or the prominent use case)?

If I'm understanding correctly, you're suggesting in interpretation 2, that a function can return keys which don't match the input? Like:

def foo(keys=(1, 2)):
    return {"a": "foo", "b": "bar"}

That's not my understanding of the documentation.

> By default, the callable’s name and call signature are not incorporated into the cache key, so if there is another cached function returning a dict with same keys, those keys will be overwritten. To avoid this, use a specific namespace in each cache decorator or pass a key_builder.

So, the primary purpose is to avoid cache key collisions with other functions. Not so sure why you'd use it over namespace though...

> the values of keys_from_attr will be transformed before requesting them from the cache.

So, we use key_builder for the key to request from cache.

> Equivalently, the keys in the dict-like mapping returned by the decorated callable will be transformed before storing them in the cache.

And we use key_builder on the returned keys to set them in the cache.

In other words, it seems like key_builder just transforms keys to deal with the cache. The function should have no knowledge of the caching details, and the transformed keys should not be visible outside of the caching code.

Which means something like this should work:

def my_func(*args):
    ...

f1 = multi_cached("args", key_builder=builder1)(my_func)
f2 = multi_cached("args", key_builder=builder2)(my_func)
my_func(1)
f1(1)
f2(1)
# All 3 calls should return the same result, with different caches used for the latter 2 functions.
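That invariant can be checked with a toy stand-in (plain dict cache and a hypothetical cached wrapper; not aiocache's actual decorator):

```python
# Two wrappers with different key_builders: callers get identical results,
# and the builders only change where values land in the shared cache.

cache = {}

def cached(key_builder):
    def deco(fn):
        def wrapper(key):
            ck = key_builder(key)
            if ck not in cache:
                cache[ck] = fn(key)
            return cache[ck]
        return wrapper
    return deco

def my_func(key):
    return key * 2

f1 = cached(lambda k: f"b1:{k}")(my_func)
f2 = cached(lambda k: f"b2:{k}")(my_func)

assert my_func(1) == f1(1) == f2(1) == 2
assert sorted(cache) == ["b1:1", "b2:1"]
```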

@Dreamsorcerer (Member)

So, I think the changes here are still correct. If we can get a new test for this (in tests/ut/test_decorators.py using a mock cache should work), then we can merge this.

@Dreamsorcerer (Member) commented Jan 13, 2023

@haizhou There are 6 failing tests, due to the change of return value in .get_cache_keys(). Can you enable write access to your branch or update those tests?

All the failing tests are under tests/ut/test_decorators.py::TestMultiCached

@haizhou (Contributor, Author) commented Jan 13, 2023

> @haizhou There are 6 failing tests, due to the change of return value in .get_cache_keys(). Can you enable write access to your branch or update those tests?
>
> All the failing tests are under tests/ut/test_decorators.py::TestMultiCached

@Dreamsorcerer thanks for the collaboration. I'm not sure if we have enough seats left for adding an outside collaborator to our repos, but I made an attempt to fix the tests.

@Dreamsorcerer (Member)

> > @haizhou There are 6 failing tests, due to the change of return value in .get_cache_keys(). Can you enable write access to your branch or update those tests?
> > All the failing tests are under tests/ut/test_decorators.py::TestMultiCached
>
> @Dreamsorcerer thanks for the collaboration. I'm not sure if we have enough seats left for adding an outside collaborator to our repos, but I made an attempt to fix the tests.

No problem. I think there's usually a checkbox to enable maintainer write access when opening a PR (not sure how to find it after opening though). That shouldn't affect those seats (and probably only applies to the branch, rather than the entire repo).

@codecov bot commented Jan 13, 2023

Codecov Report

Merging #564 (6ea2f43) into master (349cf86) will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master     #564   +/-   ##
=======================================
  Coverage   99.67%   99.67%           
=======================================
  Files          35       35           
  Lines        3712     3713    +1     
=======================================
+ Hits         3700     3701    +1     
  Misses         12       12           
Flag | Coverage Δ
unit | 99.67% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files | Coverage Δ
aiocache/decorators.py | 100.00% <100.00%> (ø)
tests/ut/test_decorators.py | 100.00% <100.00%> (ø)
tests/ut/test_serializers.py | 100.00% <0.00%> (ø)
tests/acceptance/test_serializers.py | 100.00% <0.00%> (ø)
aiocache/serializers/serializers.py | 95.45% <0.00%> (+0.06%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 349cf86...6ea2f43.

@Dreamsorcerer Dreamsorcerer merged commit 5007dc2 into aio-libs:master Jan 13, 2023
@haizhou haizhou deleted the hai/multi-cached-key-fix branch January 13, 2023 22:10