I reintroduced the bug from a few weeks ago while trying to reorganize the code to make it easier to manage. Thankfully, because of the first round, I now have tools to deal with this kind of thing. I’m in the process of purging over 500 duplicate posts from the feed.

  • RedWizardOPMA · 25 days ago

    Just a heads up: I found the issue again. In my haste, I didn’t fully deploy the fix. I did see your comments before they were deleted along with the posts they were under, though:

    Yeah, I’ll have to add a dupe check to the queue itself and have it self-purge before any posting happens. That’s a good idea. Good task for my day off tomorrow!

    Actually, that might not work. The issue isn’t that the queue has dupes; it’s that posts were never getting marked as posted. presstv.ir is odd: the site doesn’t load right without www. in front, but the RSS feed provides article URLs without the www., and they get queued that way, so I’m always matching against what their server delivers. I rewrite the URL at posting time so the links work, but I was doing it in a way that also stored the rewritten URL in the DB. That was the original issue: it created a mismatch between the RSS feed and the DB, so the system thought the posts were never posted.
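    A minimal sketch of the fix described above (the bot’s actual code isn’t shown here, so function names and the set-as-DB stand-in are hypothetical): rewrite the URL only for the copy that gets posted, and always store and match on the exact URL the feed delivered.

    ```python
    # Hypothetical sketch of the described fix, not the bot's real code.
    # Invariant: the DB key is always the feed-provided URL, so future
    # feed fetches match and already-posted articles are skipped.

    def make_postable(url: str) -> str:
        """presstv.ir pages don't load without www., but their RSS feed
        omits it; rewrite only the copy we post, never the stored key."""
        prefix = "https://presstv.ir"
        if url.startswith(prefix):
            return "https://www.presstv.ir" + url[len(prefix):]
        return url

    def post_entry(posted_db: set, feed_url: str) -> bool:
        """Return True if the entry was posted, False if it was a dupe."""
        if feed_url in posted_db:
            return False  # already marked as posted; no duplicate post
        postable_url = make_postable(feed_url)  # used only for the post itself
        # ... submit the post using postable_url ...
        posted_db.add(feed_url)  # store the ORIGINAL feed URL, not the rewrite
        return True
    ```

    Storing the rewritten URL instead (the original bug) would mean the next feed fetch never finds a match, so every article looks new again.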

    It shouldn’t be an issue going forward. This was a unique situation. I can’t imagine any other site will have such an odd issue that requires me to mess with the URL like that.

    This is what I get for making changes on a Friday/Thursday. Lol.