Checks for erroneous data; new data types; reduced the number of photos needed to trigger a shorter timespan #13
Changed the lower limit for raising DownloadBatchIsTooLargeError – the API seems to return somewhere around 3500 photos even if the batch is not complete.
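The batch-size guard described above could look roughly like this (a hedged sketch only; the actual class, constant, and function names in the project may differ):

```python
# Sketch of the batch-size guard discussed in this PR; names are
# hypothetical. The Flickr API seems to cap responses around ~3500
# photos even when a batch is incomplete, so a conservative lower
# limit of 3000 (down from 4000) is used to trigger a split.
MAX_BATCH_SIZE = 3000


class DownloadBatchIsTooLargeError(Exception):
    """Raised when a query would return more photos than one batch can hold."""


def check_batch_size(total_photos: int) -> None:
    """Raise if the reported result count exceeds the safe batch size."""
    if total_photos >= MAX_BATCH_SIZE:
        raise DownloadBatchIsTooLargeError(
            f"Query matches {total_photos} photos; "
            f"shorten the time span to stay below {MAX_BATCH_SIZE}."
        )
```

The caller would catch this error and retry with a shorter timespan, which is what the reduced photo threshold in the PR title refers to.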
Hi @Tadusko, this looks kinda good. I have a few suggestions, though:
christophfink left a comment:
See my comments in this reply: #13 (comment)
Hi Chris! Thank you for reviewing this and for the comments! I'll get back to it soon – I need to dig a bit to check what the exact errors I got were and why I ended up with the current checks (ChatGPT-assisted fixes, for full disclosure).
- photo.date_posted not available in photodownloaderthread.py:62
- race conditions with tags / duplicate keys
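A defensive way to handle the missing photo.date_posted attribute mentioned above might look like this (illustrative only; the actual record format and helper names in photodownloaderthread.py may differ):

```python
import datetime
from typing import Optional

# Hypothetical helper: fall back gracefully when a photo record lacks
# the upload-date field instead of failing mid-download. Flickr returns
# upload dates as unix timestamp strings in a "dateupload" field.
def get_date_posted(photo: dict) -> Optional[datetime.datetime]:
    raw = photo.get("dateupload")
    if raw is None:
        return None
    try:
        return datetime.datetime.fromtimestamp(
            int(raw), tz=datetime.timezone.utc
        )
    except (TypeError, ValueError):
        # Malformed timestamp: treat it the same as a missing one.
        return None
```

Returning `None` rather than raising lets the download thread record the photo without a date instead of aborting the batch.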
@Tadusko sorry for making such massive changes; in the end, once you start touching one thing, a million other things pop up. This is now working quite fine. I would still like to add another worker to fetch missing tags, license, and geo-accuracy information for existing photos.
Addressed all of my own concerns; requesting @Tadusko's review.
@Tadusko I took all your changes in, but restructured a lot of things in general. I'm going to merge this and release a dev-release. Could you then test it on a 'real' installation? (I'll write to you on Slack about it.)
Hi! Sorry that this is a bit of a mess – these changes came over time as I kept working on it locally.
This branch has multiple changes:
- Check for erroneous data: `if "photos" not in results or "photo" not in results["photos"]:`. I don't know if this is the best approach.
- Lowered the limit for raising `DownloadBatchIsTooLargeError` from 4000 to 3000. A more conservative estimate; the API may return more photos.
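The erroneous-data check quoted in the description could be wrapped in a small helper (a sketch; the nested `"photos"` / `"photo"` keys follow the standard Flickr JSON response shape, but the helper name is hypothetical):

```python
def is_valid_photos_response(results) -> bool:
    """Return True if a Flickr search response contains the expected
    photo list, mirroring the guard quoted in the PR description."""
    # Reject non-dict responses and responses missing the nested
    # results["photos"]["photo"] structure before any parsing happens.
    return (
        isinstance(results, dict)
        and isinstance(results.get("photos"), dict)
        and "photo" in results["photos"]
    )
```

A helper like this keeps the validation in one place, so the same check does not have to be repeated before every access to `results["photos"]["photo"]`.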