Support warm-up of loader caches#1462
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff            @@
##             main    #1462      +/-  ##
==========================================
+ Coverage   93.86%   93.92%   +0.05%
==========================================
  Files          77       77
  Lines        8218     8298      +80
==========================================
+ Hits         7714     7794      +80
  Misses        504      504
```
I can't really review this in-depth, just left a few infra/testing comments.

Fair enough, and thank you anyway!
I tried to validate this in Home Assistant (IDrive e2 backup provider), but Home Assistant is currently pinned to aiobotocore 2.x. Would it be possible to make this warm-up API available (or backported) on the 2.x line as well, so downstream projects pinned to 2.x can adopt and test it?
We typically don't provide backports and there are no plans to publish additional 2.x releases. It should be straightforward to experiment with the code snippets I posted previously or the proposed changes in this PR. That should work with any version of aiobotocore, including 2.x releases. |
Home Assistant is currently using aiobotocore 2.21.1, so it is probably worth checking anyhow whether it can upgrade to the latest version.
Possibly, but that won't change anything: you still need to duplicate my proposed cache-warming code if you want to test it. So far, it's only contained in this pending PR, and we are awaiting community feedback before agreeing on a way to move forward.
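For anyone who wants to experiment on a 2.x pin before this lands, the pattern can be sketched without aiobotocore at all. Here `load_service_model` is a hypothetical stand-in for botocore's blocking, disk-reading loader, and the executor hand-off mirrors what `hass.async_add_executor_job` does:

```python
import asyncio
import time


def load_service_model(service: str) -> dict:
    # Hypothetical stand-in for botocore's blocking, disk-reading
    # loader; a real warm-up would call the session's Loader here.
    time.sleep(0.01)
    return {"service": service}


async def warm_up(service: str) -> dict:
    loop = asyncio.get_running_loop()
    # Run the blocking load in the default thread-pool executor so
    # the event loop stays responsive (the same idea as Home
    # Assistant's async_add_executor_job).
    return await loop.run_in_executor(None, load_service_model, service)


model = asyncio.run(warm_up("s3"))
print(model)  # {'service': 's3'}
```

Once the blocking loads have run, botocore's in-memory caches are populated, and later client creation no longer touches the disk.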
I applied the change manually to aiobotocore 2.21.1 and preloaded the caches:

```python
session = AioSession()
await hass.async_add_executor_job(session.warm_up_loader_caches, "s3")
```

Some detections like the […]
Sounds to me like a confirmation that the proposed approach works in the HA use case. |
Is there also a way to get rid of the SSL one? `ssl_context.load_verify_locations(ca_certs, None, None)`
Glancing at the stacktrace you shared, I don't see how that would be related to aiobotocore. |
It seems to be triggered by:

```python
await cast(Any, client).head_bucket(Bucket=entry.data[CONF_BUCKET])
```

which is part of aiobotocore.
Unblocking setup of the SSL context is another can of worms. This is out of scope for this PR. Please create a separate issue.
The SSL context thing sounds like it could be easily handled by my version, where we have an executor. Sorry, I haven't had a chance to review the comments across all the issues yet in case this was already discussed; I'll try to catch up soon.
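As a sketch of that executor idea (an illustration of the pattern only, not code from this project): `ssl.create_default_context()` ends up calling `load_verify_locations()`, which reads CA bundles from disk, so moving the whole context creation into an executor keeps the blocking I/O off the event loop:

```python
import asyncio
import ssl


async def get_ssl_context() -> ssl.SSLContext:
    loop = asyncio.get_running_loop()
    # ssl.create_default_context() loads CA certificates from disk
    # (via load_verify_locations), which is blocking I/O; run it in
    # the default thread-pool executor instead of on the event loop.
    return await loop.run_in_executor(None, ssl.create_default_context)


ctx = asyncio.run(get_ssl_context())
print(type(ctx).__name__)  # SSLContext
```

The resulting context can then be cached and reused for subsequent connections, so the disk read only happens once.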
It would be great if there were already a solution in place that could work.
OK, I'll try to get AI to enhance mine to support the SSL case, hopefully tonight. Sorry, I've been crazy busy the last couple of months. Lots of plans for aiobotocore now with AI!
That would be great, thank you for looking into it!
@claude can you read the comments in this PR regarding the SSL context loading issue raised by patrickvorgers; would it be easy to handle in #1451?
Code review
Bug: raises for services without optional model types (confidence: 95)
The warm_up_loader_caches method calls load_service_model for waiters-2, paginators-1, and examples-1 without catching DataNotFoundError. Many services (e.g. iot, accessanalyzer, account) don't have a waiters-2 file, causing UnknownServiceError (a subclass of DataNotFoundError) to be raised and failing client creation when warm_up_loader_caches=True.
Verified locally:
```python
>>> from botocore.loaders import Loader
>>> Loader().load_service_model('iot', 'waiters-2')
# raises UnknownServiceError
```
Botocore's own client.py wraps these calls in try/except DataNotFoundError (see _get_waiter_config). The warm-up method should do the same:
```python
# from session.py
loader.load_data_with_path('endpoints')
loader.load_data('sdk-default-configuration')
try:
    loader.load_service_model(service_name, 'waiters-2', api_version)
except DataNotFoundError:
    pass
try:
    loader.load_service_model(service_name, 'paginators-1', api_version)
except DataNotFoundError:
    pass
loader.load_service_model(
    service_name, type_name='service-2', api_version=api_version
)
loader.list_available_services(type_name='service-2')

# from client.py
loader.load_data('partitions')
loader.load_service_model(
    service_name, 'service-2', api_version=api_version
)
loader.load_service_model(
    service_name, 'endpoint-rule-set-1', api_version=api_version
)
loader.load_data('_retry')

# from docs/service.py
try:
    loader.load_service_model(service_name, 'examples-1', api_version)
except DataNotFoundError:
    pass
```

You'll also need to import DataNotFoundError from botocore.exceptions, and add a test case for a service that lacks waiters-2 (e.g. iot or accessanalyzer).
🤖 Generated with Claude Code
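The subclass relationship the review relies on (UnknownServiceError derives from DataNotFoundError, so one except clause covers both) can be illustrated with stdlib-only stubs. The exception names mirror botocore.exceptions, but the loader below is a made-up stand-in, not botocore's:

```python
# Stub hierarchy mirroring botocore.exceptions: UnknownServiceError
# subclasses DataNotFoundError, so `except DataNotFoundError` also
# catches UnknownServiceError.
class DataNotFoundError(Exception):
    pass


class UnknownServiceError(DataNotFoundError):
    pass


def load_service_model(service: str, type_name: str) -> dict:
    # Made-up loader: pretend 'iot' ships no waiters-2 file, as the
    # real data files do not include one for that service.
    if service == "iot" and type_name == "waiters-2":
        raise UnknownServiceError(f"{service} has no {type_name}")
    return {"service": service, "type": type_name}


def warm_up(service: str) -> list:
    loaded = []
    for type_name in ("waiters-2", "paginators-1", "examples-1"):
        try:
            loaded.append(load_service_model(service, type_name))
        except DataNotFoundError:
            # Optional model type: skip silently, the same way
            # botocore's _get_waiter_config does.
            pass
    return loaded


print(len(warm_up("iot")))  # 2: waiters-2 was skipped
print(len(warm_up("s3")))   # 3: all optional types present
```

With the try/except in place, warming up a service that lacks one of the optional model files degrades to a no-op for that file instead of failing client creation.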
Description of Change
Support warm-up of loader caches
Assumptions
Checklist for All Submissions
Checklist when updating botocore and/or aiohttp versions