I merged the bug from a few weeks ago back into the code while trying to reorganize it to make it easier to manage. Thankfully, because of the first time, I now have tools to deal with this kind of thing. I'm in the process of purging over 500 posts from the feed that are all dupes.
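For the curious, the purge tool boils down to something like this (just a sketch; the `posts` table and column names here are assumptions, not the bot's actual schema). It keeps the oldest post for each URL and flags the rest as dupes to delete:

```python
import sqlite3

# Sketch of a dedupe purge, assuming a posts table keyed by URL.
# Keeps the earliest post per URL and returns the rest for deletion.
def find_dupes(db: sqlite3.Connection) -> list[int]:
    rows = db.execute(
        """
        SELECT id FROM posts
        WHERE id NOT IN (
            SELECT MIN(id) FROM posts GROUP BY url
        )
        """
    )
    return [row[0] for row in rows]
```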


Just a heads up: I found the issue again. In my haste to fix it, I didn't fully deploy the fix. I did see your comments before they were deleted along with the posts they were under, though:
Yeah, I'll have to add a dupe check to the queue itself and have it self-purge before any posting happens. That's a good idea. Good task for my day off tomorrow!

Actually, that might not work. The issue isn't that the queue has dupes; it's that posts were never getting marked as posted. It's because presstv.ir is odd: the URL doesn't load right without www. in front, but the RSS feed provides the article URL without the www. It gets queued that way, so I'm always matching against what their server is delivering. I change the URL at posting time so the links function, but I was doing it in a way that also stored the changed URL in the DB, which was the original issue. That made a mismatch between the RSS feed and the DB, so the system thought the posts were never posted.
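For anyone curious, the fix looks roughly like this (a minimal sketch; the function names and `queue` table are made up for illustration, not the bot's actual code). The URL exactly as the feed delivers it is the only thing that touches the DB, and the www. rewrite happens at post time only:

```python
import sqlite3
from urllib.parse import urlparse, urlunparse

def needs_www_fix(url: str) -> bool:
    # presstv.ir links don't load without the www. prefix,
    # but their RSS feed delivers them without it.
    return urlparse(url).netloc == "presstv.ir"

def posting_url(url: str) -> str:
    # Rewrite for posting only; never write this back to the DB,
    # or dedupe matching against the feed breaks.
    if needs_www_fix(url):
        parts = urlparse(url)
        return urlunparse(parts._replace(netloc="www." + parts.netloc))
    return url

def mark_posted(db: sqlite3.Connection, feed_url: str) -> None:
    # Always key on the URL exactly as the feed delivered it.
    db.execute("UPDATE queue SET posted = 1 WHERE url = ?", (feed_url,))
    db.commit()
```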
It shouldn’t be an issue going forward. This was a unique situation. I can’t imagine any other site will have such an odd issue that requires me to mess with the URL like that.
This is what I get for making changes on a Friday/Thursday. Lol.
Maybe this is just stuff already queued, but note that it is still happening: https://hexbear.net/post/7667593
Looks like I cleared it earlier; I just double-checked and didn't see new dupes. It shouldn't be doing it again. I tested it in a test instance and made sure the server running the code had a fresh Docker image build and is running the current code.
I don’t expect it to act up more. Let me know if you see anything off.
Looks good at the moment, thanks again!
Thanks for all the work you’ve put into this!