
Exception in ASGI application #435

Open
kaziu007 opened this issue Apr 6, 2024 · 9 comments

Comments

@kaziu007

kaziu007 commented Apr 6, 2024

I encountered the following error for the query "what is the market of commercial cleaning in usa?" (long report).

I have generated numerous short reports already, but long reports fail almost every time with this error. I am using macOS with a conda virtual environment.


ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/uvicorn/protocols/websockets/wsproto_impl.py", line 233, in run_asgi
result = await self.app(self.scope, self.receive, self.send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/middleware/errors.py", line 151, in __call__
await self.app(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
await self.middleware_stack(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
await route.handle(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/routing.py", line 373, in handle
await self.app(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/routing.py", line 96, in app
await wrap_app_handling_exceptions(app, session)(scope, receive, send)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/starlette/routing.py", line 94, in app
await func(session)
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/site-packages/fastapi/routing.py", line 348, in app
await dependant.call(**values)
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/backend/server.py", line 50, in websocket_endpoint
report = await manager.start_streaming(task, report_type, websocket)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/utils/websocket_manager.py", line 57, in start_streaming
report = await run_agent(task, report_type, websocket)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/utils/websocket_manager.py", line 75, in run_agent
report = await researcher.run()
^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/report_type/detailed_report/detailed_report.py", line 42, in run
_, report_body = await self._generate_subtopic_reports(subtopics)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/report_type/detailed_report/detailed_report.py", line 92, in _generate_subtopic_reports
result = await fetch_report(subtopic)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/report_type/detailed_report/detailed_report.py", line 70, in fetch_report
subtopic_report = await self._get_subtopic_report(subtopic)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/report_type/detailed_report/detailed_report.py", line 116, in _get_subtopic_report
await subtopic_assistant.conduct_research()
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/master/agent.py", line 79, in conduct_research
self.context = await self.get_context_by_search(self.query)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/master/agent.py", line 142, in get_context_by_search
sub_queries = await get_sub_queries(query, self.role, self.cfg, self.parent_query, self.report_type) + [query]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Documents/Behavio.one/Projects/AIresearcher/gpt_researcher/master/functions.py", line 101, in get_sub_queries
sub_queries = json.loads(response)
^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Marcin/Anaconda3n/anaconda3/envs/AIResearcher/lib/python3.11/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
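The last line of the traceback is the actual failure: `json.loads` was handed a string whose very first character is not valid JSON, which typically happens when the LLM wraps its answer in a markdown code fence or prepends prose. A minimal reproduction of the same error (the response text below is an invented example of such an LLM reply, not captured output):

```python
import json

# A typical "almost JSON" LLM reply: the list is wrapped in a markdown
# code fence, so parsing fails at the very first backtick.
response = '```json\n["query one", "query two"]\n```'

try:
    json.loads(response)
except json.JSONDecodeError as e:
    # Same message as in the traceback above:
    # "Expecting value: line 1 column 1 (char 0)"
    print(e)
```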

@omkamal

omkamal commented Apr 7, 2024

The same issue appeared for me as well.

@kaziu007
Author

kaziu007 commented Apr 7, 2024

> Same issue appeared with me also

@omkamal Did you find any workaround?

@omkamal

omkamal commented Apr 7, 2024

I just tried installing it on Ubuntu instead of macOS, and it worked fine.

So it seems to be a macOS-specific issue.

@assafelovic
Owner

This looks like an issue with the OpenAI call not returning a valid list of sub-queries. It would help to see your full example so we can investigate, @kaziu007.

@kaziu007
Author

kaziu007 commented Apr 9, 2024

@assafelovic what exactly should I share to give you more details? The issue occurs only occasionally, and only for detailed reports. When it does happen, it tends to repeat on subsequent runs with different queries.

@assafelovic
Owner

For example, what was your query? It seems related not to the detailed report itself but to generating the sub-queries.

@boriside

I got the same error multiple times. The query can be, for example: tell me about nike.
In this case I picked detailed research.

@Lego4005

I modified the gpt_researcher/master/functions.py file, and it hasn't happened since.

Modifications Made
Error Handling: Introduced a try/except block specifically to catch json.JSONDecodeError. This error occurs when json.loads() tries to parse a string that isn't valid JSON. By catching it, the application can handle the failure gracefully instead of crashing.

Logging: Added a print statement before the JSON parsing occurs. This logs the raw response received from the create_chat_completion call, so if the JSON is malformed you can see exactly what the data looked like right before the failure. If an error is caught, another print statement logs the error along with the problematic data.

Response Handling: If JSON parsing fails, instead of letting the application crash, we set sub_queries to a default value (an empty list in this case). This lets the application keep running even when the data isn't as expected.

I put the full file in a txt if you need the whole thing. I also commented out the original code.

functions.txt

import asyncio
import json
from fastapi import HTTPException

import markdown

from gpt_researcher.master.prompts import *
from gpt_researcher.scraper.scraper import Scraper
from gpt_researcher.utils.llm import *

# ... [other parts of your code] ...

async def get_sub_queries(query: str, agent_role_prompt: str, cfg, parent_query: str, report_type: str):
    """
    Gets the sub queries
    Args:
        query: original query
        agent_role_prompt: agent role prompt
        cfg: Config
        parent_query: Parent query for context
        report_type: Type of the report to generate

    Returns:
        sub_queries: List of sub queries

    """
    max_research_iterations = cfg.max_iterations if cfg.max_iterations else 1
    response = await create_chat_completion(
        model=cfg.smart_llm_model,
        messages=[
            {"role": "system", "content": f"{agent_role_prompt}"},
            {"role": "user", "content": generate_search_queries_prompt(query, parent_query, report_type, max_iterations=max_research_iterations)}],
        temperature=0,
        llm_provider=cfg.llm_provider
    )
    
    # Log the response for debugging purposes
    print(f"Response received from create_chat_completion: {response}")
    
    # Initialize sub_queries to None or a default value
    sub_queries = None
    
    # Try to parse the JSON, and handle exceptions if parsing fails
    try:
        sub_queries = json.loads(response)
    except json.JSONDecodeError as e:
        # Logging the error
        print(f"JSON decoding failed: {e} - Response content: {response}")
        # You can also use a logging library here if you prefer.
        
        # Optionally, raise an HTTPException for FastAPI to return a HTTP 400 response to the client
        # raise HTTPException(status_code=400, detail="Invalid response format received.")
        
        # If the function must not raise an exception, set sub_queries to a default or empty value
        sub_queries = []  # or {} or None, depending on how your code expects to handle this
    
    return sub_queries

# ... [rest of your code] ...

@boriside

Thank you, I did something similar, but for me the rest of the code failed with an empty object, so I left it as: sub_queries = []
