[BUILD-432] convert chatbot streaming response view to async (PR #2157)

Related issues

  • Closes #

Proposed changes

Describe the big picture of your changes here to communicate to the reviewers why we should accept this pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue.

How to reproduce

Explain as if someone who doesn’t know this project is reviewing your changes, and describe how they can replicate them.

Screenshots and videos

Paste here any screenshots or videos related to your changes, if applicable.

Further comments

If this is a relatively large or complex change, kick off the discussion by explaining why you chose the solution you did and what alternatives you considered, etc…


Codecov Report

Attention: Patch coverage is 68.00000% with 16 lines in your changes missing coverage. Please review.

Project coverage is 67.05%. Comparing base (ac2086c) to head (79f6039).

Additional details and impacted files
@@             Coverage Diff             @@
##              dev    #2157       +/-   ##
===========================================
- Coverage   97.81%   67.05%   -30.77%     
===========================================
  Files         160      159        -1     
  Lines       22258    22289       +31     
===========================================
- Hits        21772    14945     -6827     
- Misses        486     7344     +6858     


Does this need to be an async def? Also, LangChain shows examples of using astream like so: async for chunk in llm.astream

So does this function need to asynchronously yield from the stream?

The login method isn’t being called here.

    await sync_to_async(client.login)()  # FIXME: logging in is not working!

Lol, I’m an idiot. Ignore me. Maybe this is progress?

    await sync_to_async(client.force_login)(user)
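The force_login call works because sync_to_async wraps the function object first, and the returned awaitable is then called with the arguments. A toy wrapper with the same shape (no Django required, names hypothetical) makes the ordering visible:

```python
import asyncio


def to_async(fn):
    # Toy stand-in for asgiref's sync_to_async: wrap the *function*,
    # return an awaitable wrapper that forwards the arguments.
    async def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)

    return wrapper


def force_login(user):
    # Stand-in for client.force_login.
    return f"logged in {user}"


# Correct order: wrap the function, then call the async wrapper with args.
result = asyncio.run(to_async(force_login)("alice"))
```

Calling the function first (to_async(force_login("alice"))) would wrap the *return value* instead of the function, which is the shape of the earlier broken attempt.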

New error: FAILED ankihub/ai/tests/test_views.py::test_stream_chatbot_response - django.db.utils.DatabaseError: Save with update_fields did not affect any rows.

Some questions about test_stream_chatbot_response:

  • Does the test itself really need to be async? Why not write a synchronous test using the normal client?

    If you merely want to test the output of your asynchronous views, the standard test client will run them inside their own asynchronous loop without any extra work needed on your part.
    — Testing tools, Django documentation

  • Mocking the test

    • Instead of mocking "ankihub.ai.views.llm_conversation_stream", shouldn’t we mock ChatOpenAI.astream?
    • There is a warning in Django docs about using decorators on async tests:

    If you are using test decorators, they must be async-compatible to ensure they work correctly. Django’s built-in decorators will behave correctly, but third-party ones may appear to not execute (they will “wrap” the wrong part of the execution flow and not your test).
    If you need to use these decorators, then you should decorate your test methods with async_to_sync() inside of them instead.
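The warning from the docs can be reproduced without Django at all: a decorator written for synchronous functions ends up "wrapping" coroutine *creation*, not the awaited test body, so its before/after logic fires at the wrong time. A minimal stdlib sketch:

```python
import asyncio
import functools

events = []


def sync_only(fn):
    # A decorator written for synchronous functions. When fn is async,
    # fn(*args) merely *creates* a coroutine, so this before/after logic
    # runs around coroutine creation, not around the awaited test body.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        events.append("decorator: before")
        result = fn(*args, **kwargs)  # a coroutine object, never awaited here
        events.append("decorator: after")
        return result

    return wrapper


@sync_only
async def my_async_test():
    events.append("test body")
    return "ok"


# Both decorator events fire before the test body ever runs.
outcome = asyncio.run(my_async_test())
```

This is exactly the "wrap the wrong part of the execution flow" failure mode: the decorator appears to run, but anything it does (patching, flag overrides) has already been undone by the time the body executes.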

Does the test itself really need to be async? Why not write a synchronous test using the normal client?

Because the view is async, we need to await it.

Instead of mocking “ankihub.ai.views.llm_conversation_stream”, shouldn’t we mock ChatOpenAI.astream?

I don’t think it matters. The expected return value should be the same: an async generator.
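Either way, the replacement only has to honor the contract the view relies on: an async generator of chunks. A minimal stdlib sketch (the function names here are hypothetical stand-ins, not the project's real helpers):

```python
import asyncio


async def fake_stream(prompt):
    # Stand-in for whatever gets patched (llm_conversation_stream or
    # ChatOpenAI.astream): the view only needs an async generator.
    for token in ["Hello", ", ", "world"]:
        yield token


async def render_reply(stream, prompt):
    # Mirrors what the view does with the stream: async-iterate and join.
    return "".join([chunk async for chunk in stream(prompt)])


reply = asyncio.run(render_reply(fake_stream, "hi"))
```

Since both candidates return the same shape, the test passes regardless of which layer is patched; patching the lower layer (ChatOpenAI.astream) just exercises more of the view's own code.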

There is a warning in Django docs about using decorators on async tests

Yup! I’ve seen that. But we’re covered on most of the decorators we’re using. The only one that doesn’t work async is the @override_flag decorator, which I’ve already worked around.
