Python: Bug: 404-resource-not-found error when using kernel.invoke from Azure web app #10040
Comments
@moonbox3 / @eavanvalkenburg can you take a look?
Hi @Shavivek, without being able to debug the code, my best guess would be that some of the environment variables are not being configured correctly. Could you add logging to verify that these environment variables are being set as expected in your deployment?
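For example, a minimal logging sketch along these lines (the variable names are the ones used in this issue; a configured logger is assumed) could run at app startup to confirm what the deployed app actually sees:

    import logging
    import os

    logger = logging.getLogger(__name__)

    # Log the Azure OpenAI settings as the deployed app actually sees them.
    for name in ("AZURE_OPENAI_DEPLOYMENT_NAME", "AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_API_VERSION"):
        logger.info("%s=%r", name, os.environ.get(name))
    # Never log the key itself; just confirm whether it is present.
    logger.info("AZURE_OPENAI_API_KEY set: %s", "AZURE_OPENAI_API_KEY" in os.environ)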
Hi @Shavivek, is this accurate: In any case, as @alliscode mentioned, adding some logging can help -- it does look like something may be amiss with the environment variables that are configured in Azure -- if it's working locally, and it's not working in the cloud, then there should be some discrepancy. There very well could be a bug in the old package version (1.1.2) as well, so please do upgrade and let us know if that fixes it. Thanks!
@moonbox3 I upgraded to semantic-kernel 1.17.1 and also had to upgrade the openai package to 1.59.3. But even after that, I get the same error message. This is the POST URL that gets generated: <Request('POST', 'https://xxxxxx-yyyyyy.openai.azure.com/openai/deployments/gpt-4o-mini/chat/completions?api-version=2024-07-18')>
Hello @Shavivek, thanks for your response. I am not seeing the api-version
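If the api-version turns out to be the discrepancy, it can also be pinned explicitly when constructing the service instead of being read from the environment. This is only a sketch, assuming the api_version parameter of AzureChatCompletion in semantic-kernel 1.17.1; the placeholder values and the version string are illustrative, not taken from the issue:

    from semantic_kernel import Kernel
    from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

    kernel = Kernel()
    kernel.add_service(AzureChatCompletion(
        service_id="default",
        deployment_name="gpt-4o-mini",                    # your deployment name
        endpoint="https://<resource>.openai.azure.com/",  # your resource endpoint
        api_key="<api-key>",
        api_version="2024-06-01",  # example GA version; use one your resource supports
    ))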
Describe the bug
I have a Django web app (Python stack) with a few features that use Azure OpenAI models through semantic-kernel. The app works perfectly on my local machine, but when I deploy it to an Azure web app, the Azure OpenAI calls fail with the error: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
To Reproduce
Steps to reproduce the behavior:
I am unable to reproduce the issue locally. My app has 2 main files:
1 - a file that configures the kernel:
    import os
    from configparser import ConfigParser

    from semantic_kernel import Kernel
    from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion


    def getKernel():
        kernel = Kernel()
        # WEBSITE_HOSTNAME is only set on Azure App Service; locally, read the config file.
        if 'WEBSITE_HOSTNAME' not in os.environ:
            config_file = "app_settings.config"
            config_object = ConfigParser()
            config_object.read(config_file)
            deployment = config_object['azureOpenAI']['AZURE_OPENAI_DEPLOYMENT_NAME']
            api_key = config_object['azureOpenAI']['AZURE_OPENAI_API_KEY']
            endpoint = config_object['azureOpenAI']['AZURE_OPENAI_ENDPOINT']
        else:
            deployment = os.environ['AZURE_OPENAI_DEPLOYMENT_NAME']
            api_key = os.environ['AZURE_OPENAI_API_KEY']
            endpoint = os.environ['AZURE_OPENAI_ENDPOINT']
        service_id = "default"
        kernel.add_service(AzureChatCompletion(
            deployment_name=deployment,
            api_key=api_key,
            endpoint=endpoint,  # Used to point to your service
            service_id=service_id,
        ))
        return kernel
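To separate the configuration problem from the Django request path, a minimal standalone check could build the kernel the same way and run a trivial prompt on the deployed instance. This is a sketch only; the Diagnostics/Echo names and the prompt text are made up for illustration:

    import asyncio

    from semantic_kernel.functions import KernelArguments

    async def smoke_test():
        # Build the kernel exactly as the app does, then run a trivial prompt function.
        kernel = getKernel()
        func = kernel.add_function(
            plugin_name="Diagnostics",
            function_name="Echo",
            prompt="Repeat the following text: {{$mssg}}",
        )
        result = await kernel.invoke(function=func, arguments=KernelArguments(mssg="ping"))
        print(result)

    if __name__ == "__main__":
        asyncio.run(smoke_test())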
and 2 - a file that calls into semantic functions to get responses from the OpenAI models. Its entry point is:
    async def getSpendDetailsLLM(mssg):
        ...
The await kernel.invoke call inside this function raises an exception when executed from the Azure instance but works as expected when run locally. A hypothetical reconstruction of the call site is sketched below, followed by the full traceback.
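Since the body of getSpendDetailsLLM is not shown, the following is only a reconstruction based on the traceback: the plugin/function names come from the exception message, while the prompt text and the way the function is registered are assumptions, not the author's actual code.

    from semantic_kernel.functions import KernelArguments

    async def getSpendDetailsLLM(mssg):
        kernel = getKernel()
        # "ExpensePlugin" / "SpendDetails" are the names reported in the exception;
        # the prompt text here is a placeholder, not the real prompt.
        getSpendDetailsFunc = kernel.add_function(
            plugin_name="ExpensePlugin",
            function_name="SpendDetails",
            prompt="Extract the spend details from this message: {{$mssg}}",
        )
        # This is the call that fails with the 404 on the Azure instance.
        spendDetails = await kernel.invoke(
            function=getSpendDetailsFunc,
            arguments=KernelArguments(mssg=mssg),
        )
        return spendDetails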
Traceback (most recent call last):
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_handler.py", line 51, in _send_request
response = await self.client.chat.completions.create(**request_settings.prepare_settings_dict())
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 1289, in create
return await self._post(
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/openai/_base_client.py", line 1805, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/openai/_base_client.py", line 1503, in request
return await self._request(
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/openai/_base_client.py", line 1599, in _request
raise self._make_status_error_from_response(err.response) from None
The above exception (Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}) was the direct cause of the following exception:
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/functions/kernel_function_from_prompt.py", line 178, in _invoke_internal
chat_message_contents = await prompt_render_result.ai_service.get_chat_message_contents(
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py", line 111, in get_chat_message_contents
return await self._send_chat_request(settings)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py", line 272, in _send_chat_request
response = await self._send_request(request_settings=settings)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_handler.py", line 67, in _send_request
raise ServiceResponseException(
The above exception (("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", NotFoundError("Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}"))) was the direct cause of the following exception:
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/kernel.py", line 180, in invoke
return await function.invoke(kernel=self, arguments=arguments, metadata=metadata)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/functions/kernel_function.py", line 211, in invoke
await stack(function_context)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/functions/kernel_function_from_prompt.py", line 184, in _invoke_internal
raise FunctionExecutionException(f"Error occurred while invoking function {self.name}: {exc}") from exc
The above exception (Error occurred while invoking function SpendDetails: ("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", NotFoundError("Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}"))) was the direct cause of the following exception:
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/core/handlers/exception.py", line 47, in inner
response = get_response(request)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/core/handlers/base.py", line 181, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/contrib/auth/decorators.py", line 21, in _wrapped_view
return view_func(request, *args, **kwargs)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/base.py", line 69, in view
return self.dispatch(request, *args, **kwargs)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/base.py", line 101, in dispatch
return handler(request, *args, **kwargs)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/edit.py", line 170, in get
return super().get(request, *args, **kwargs)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/edit.py", line 135, in get
return self.render_to_response(self.get_context_data())
File "/tmp/8dd28badf0397ed/spend/viewClasses/smsProcessviews.py", line 295, in get_context_data
context = super(SMSRecordCreateViewLLM, self).get_context_data(**kwargs)
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/edit.py", line 68, in get_context_data
kwargs['form'] = self.get_form()
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/edit.py", line 35, in get_form
return form_class(**self.get_form_kwargs())
File "/tmp/8dd28badf0397ed/spend/viewClasses/smsProcessviews.py", line 385, in get_form_kwargs
kwargs = super(SMSRecordCreateViewLLM, self).get_form_kwargs()
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/edit.py", line 107, in get_form_kwargs
kwargs = super().get_form_kwargs()
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/django/views/generic/edit.py", line 40, in get_form_kwargs
'initial': self.get_initial(),
File "/tmp/8dd28badf0397ed/spend/viewClasses/smsProcessviews.py", line 316, in get_initial
smsDetails = asyncio.run(lu.getSpendDetailsLLM(sms))
File "/opt/python/3.12.2/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
File "/opt/python/3.12.2/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
File "/opt/python/3.12.2/lib/python3.12/asyncio/base_events.py", line 685, in run_until_complete
return future.result()
File "/tmp/8dd28badf0397ed/azureOAI/llmExpense.py", line 112, in getSpendDetailsLLM
spendDetails = await kernel.invoke(function=getSpendDetailsFunc, arguments=KernelArguments(mssg=msgg))
File "/tmp/8dd28badf0397ed/antenv/lib/python3.12/site-packages/semantic_kernel/kernel.py", line 189, in invoke
raise KernelInvokeException(
Exception Type: KernelInvokeException at /spend/sms/new/3096
Exception Value: Error occurred while invoking function:
'ExpensePlugin-SpendDetails'
I have ensured that the environment variables for the API version, deployment name, etc. are the same in the local config file and in the Azure app settings.
Expected behavior
The functions should behave the same whether executed locally or from the Azure instance.
Screenshots
If applicable, add screenshots to help explain your problem.
Platform