Process bulk suggestions action async (PR #2148)

https://community.ankihub.net/t/bulk-suggestion-performance-issue/234916


Codecov Report

Attention: Patch coverage is 88.88889% with 2 lines in your changes missing coverage. Please review.

Project coverage is 43.36%. Comparing base (5082026) to head (947c5ad).

Additional details and impacted files
@@             Coverage Diff             @@
##              dev    #2148       +/-   ##
===========================================
- Coverage   97.82%   43.36%   -54.46%     
===========================================
  Files         156      155        -1     
  Lines       21841    21841               
===========================================
- Hits        21365     9472    -11893     
- Misses        476    12369    +11893     


For now, it’s OK, but maybe we should think more about the UI/UX changes before merging this PR; moving the use case calls into a Celery task will change the way users interact with this feature.

Not all use cases will have this execute method; maybe we should rewrite this function to be less generic.

Not sure why this check was here. It seems unnecessary, so I removed it.

I’m not able to reproduce the pyupgrade errors locally :thinking:

:seedling: Would be great to define an ABC that all use cases inherit from.
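For illustration, a minimal sketch of what such an ABC could look like (all names here are hypothetical, not from the PR):

```python
from abc import ABC, abstractmethod


class UseCase(ABC):
    """Hypothetical base class all use cases would inherit from."""

    @abstractmethod
    def execute(self):
        """Run the use case; every subclass must implement this."""


class ApproveSuggestion(UseCase):  # example subclass, not real code
    def __init__(self, suggestion_id):
        self.suggestion_id = suggestion_id

    def execute(self):
        return f"approved suggestion {self.suggestion_id}"
```

With a shared base class, code that dispatches use cases can rely on `execute` always existing instead of duck-typing.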

The title of this pull request changed from “Exec use case async” to “Process bulk suggestions action async”

It’s good practice to use the same name for the task and the function.

:recycle: The way you are calling it is synchronous; it’s the same as calling use_case(**arguments).execute(). To run it asynchronously you should call process_bulk_suggestion_action.delay(use_case, arguments), but that will probably fail because you are passing a use_case instance as an argument, which Celery doesn’t support since it’s not serializable. You’ll need to refactor the function.
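One way to keep the task payload serializable (a sketch with hypothetical names, not the PR’s actual code) is to pass a use-case *name* plus plain keyword arguments, and resolve the class inside the task body. In real code the task function would be decorated with @shared_task; the dispatch logic is the same either way:

```python
# Hypothetical registry mapping use-case names to classes, so only
# JSON-serializable data (a string + a dict) crosses the broker.
USE_CASE_REGISTRY = {}


def register_use_case(cls):
    USE_CASE_REGISTRY[cls.__name__] = cls
    return cls


@register_use_case
class BulkSuggestionAction:  # stand-in for a real use case
    def __init__(self, suggestion_ids):
        self.suggestion_ids = suggestion_ids

    def execute(self):
        return f"processed {len(self.suggestion_ids)} suggestions"


# In the real project this would carry @shared_task; callers would then
# use process_bulk_suggestion_action.delay(name, arguments).
def process_bulk_suggestion_action(use_case_name, arguments):
    use_case_cls = USE_CASE_REGISTRY[use_case_name]
    return use_case_cls(**arguments).execute()
```

This avoids the pickle serializer entirely, since the instance is rebuilt on the worker side from plain data.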

@pedroven, after setting serializer="pickle" on the shared_task decorator I got this error: kombu.exceptions.ContentDisallowed: Refusing to deserialize untrusted content of type pickle (application/x-python-serialize). So I updated the Celery conf. However, that caused the Celery worker to refuse to start with this error:

2024-07-13 15:07:41 Running a worker with superuser privileges when the
2024-07-13 15:07:41 worker accepts messages serialized with pickle is a very bad idea!
2024-07-13 15:07:41 
2024-07-13 15:07:41 If you really want to continue then you have to set the C_FORCE_ROOT
2024-07-13 15:07:41 environment variable (but please think about this before you do).

Hence, the changes to the Dockerfile and compose.yaml
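For reference, the settings change described above amounts to something like the following (a sketch using Celery’s documented lowercase setting names; the exact keys in this project’s conf may differ):

```python
# Sketch of the Celery configuration needed to accept pickle payloads.
# Without "pickle" in accept_content, the worker raises
# kombu.exceptions.ContentDisallowed on receiving such a message.
CELERY_CONFIG = {
    "task_serializer": "pickle",
    "result_serializer": "pickle",
    "accept_content": ["json", "pickle"],
}
```

Note that a worker running as root will still refuse to start with pickle enabled unless the C_FORCE_ROOT environment variable is set, or the container is switched to a non-root user.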

If you think this is acceptable (cc @cjasm) we will need to update the Dockerfile on Appliku and maybe the Procfile in this repo first.

I understand there can be security issues with using pickle, but I think it’s OK for this particular task.

@pedroven, I’m not exactly sure about the test failure. It obviously has something to do with calling that task async. However, I think that test just needs to check that the task is called, not the Note contents.