
Block Domain

Solved Let's Build It
  • I have some working code to share. It modifies both the link URL and the link text (description), replacing them with a warning message and a redirect URL of my choice for each blocked domain. I’m currently redirecting to my rules page.

    The code also distinguishes text before and after the link and leaves it untouched.

    It works:

    • In topics
    • In the carousel
    • In the previews of the latest post on the Recent page, Unread page, etc.
    • In Chat

    There are certainly things to improve (one possible refactor is sketched after the code below), but I’ll share the code with you here.

    –> This will be my Christmas present 🎅

    If you have any improvements or suggestions, don’t hesitate @phenomlab

    // ------------------------------------------------------------------
    // Block domain URLs in topics, topic teasers & carousel topic info
    // ------------------------------------------------------------------
    
    $(document).ready(function () {
      // Function to replace URLs and link text in a given HTML content
      function replaceUrls(html) {
        // Replace the old domain URLs with the new domain URL and update link text
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?francesoir\.fr(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?francais\.rt\.com(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a domain name of disinformation and propaganda] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?fr\.sputniknews\.africa(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a domain name of disinformation and propaganda] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?lecourrierdesstrateges\.fr(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?lelibrepenseur\.org(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?lesmoutonsenrages\.fr(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The anti fake news system detected a domain name spreading hate speech and anti-Semitic comments] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?upr\.fr(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?lecourrier-du-soir\.com(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖</strong></a>');
        html = html.replace(/(<a[^>]*href="https?:\/\/(?:www\.)?fdesouche\.com(?:\/[^\s>]*)?"[^>]*>)([^<]*)<\/a>/g, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>🤖 [Oups. The anti-fake news system detected a domain name disseminating racist hate articles] 🤖</strong></a>');
        // Add more lines for additional domains if needed
        return html;
      }
    
      // Function to replace URLs and link text in a given content element
      function replaceUrlsInContent($content) {
        $content.html(function (_, oldHtml) {
          return replaceUrls(oldHtml);
        });
      }
    
      // Replace URLs and link text when new posts are loaded
      $(window).on('action:posts.loaded', function () {
        $('[component="post"]').each(function () {
          var $postContent = $(this).find('.content, .topic-info.text-sm.text-break, .content.mt-2.text-break, .stretched-link, .message-body.ps-0.py-0.overflow-auto.text-break');
          // Replace URLs and link text in each matched content element
          replaceUrlsInContent($postContent);
        });
      });

      // Replace URLs and link text when a topic is loaded
      $(window).on('action:topic.loaded', function () {
        $('[component="post"]').each(function () {
          var $postContent = $(this).find('.content, .topic-info.text-sm.text-break, .content.mt-2.text-break, .stretched-link, .message-body.ps-0.py-0.overflow-auto.text-break');
          // Replace URLs and link text in each matched content element
          replaceUrlsInContent($postContent);
        });
      });
    
      // Replace URLs and link text when a new page is loaded or content is updated (topic teasers & carousel topic info)
      $(document).ajaxComplete(function () {
        $('.post-content.text-xs.ps-2.line-clamp-sm-2.lh-sm.text-break, .topic-info.text-sm.text-break, .content.mt-2.text-break, .stretched-link, .message-body.ps-0.py-0.overflow-auto.text-break').each(function () {
          var $content = $(this);
          replaceUrlsInContent($content);
        });
      });
    });
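
    One possible improvement (a sketch only, reusing the warning messages and redirect URL from the code above) is to drive the replacements from a single domain-to-message map, so adding a new blocked domain is a one-line change rather than another copy of the regex:

    // Sketch: same behaviour as replaceUrls() above, but table-driven.
    // Adding a domain means adding one entry to this map.
    var blockedDomains = {
      'francesoir\\.fr': '🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖',
      'fdesouche\\.com': '🤖 [Oups. The anti-fake news system detected a domain name disseminating racist hate articles] 🤖'
      // ...add the remaining domains and their messages here
    };

    function replaceUrls(html) {
      Object.keys(blockedDomains).forEach(function (domain) {
        // Build the same anchor-matching pattern as the hand-written regexes above
        var pattern = new RegExp(
          '<a[^>]*href="https?:\\/\\/(?:www\\.)?' + domain + '(?:\\/[^\\s>]*)?"[^>]*>[^<]*<\\/a>',
          'g'
        );
        html = html.replace(pattern, '<a href="https://YOUR_URL_REDIRECTION/rules"><strong>' + blockedDomains[domain] + '</strong></a>');
      });
      return html;
    }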
    

    PS: unrelated to the above, but there’s a display bug with code blocks in topics:
    the code block is no longer displayed in its entirety, only on a single line:

    f28171b1-e396-49dc-a8f3-97be40f04371-image.png

  • DownPW has marked this topic as solved.
  • @DownPW that code looks great. I seem to have trained you well!

    I’ll investigate the single-line code block issue. I think that’s probably because of the modified version of highlight.js I use.

  • @phenomlab

    My code works great, but there’s one more thing 😉
    It conflicts with your OGProxy code.

    Do you have any idea how the two can live together?

  • @DownPW Potentially, yes, but what is the conflict exactly?

  • I don’t know; I just see that OGProxy doesn’t format links when my domain-blocking code is active.

    If I disable it, OGProxy works fine.

    I would have to make sure that OGProxy doesn’t run when a blacklisted link is detected, or that the two run one after the other.

    Actually, I don’t really know at the moment.

    EDIT:

    Maybe add an ignore list to OGProxy

  • @DownPW said in Block Domain:

    I don’t know; I just see that OGProxy doesn’t format links when my domain-blocking code is active.

    This just means that the callback / hook isn’t monitored by OGProxy and can probably be quite easily rectified.

    @DownPW said in Block Domain:

    Maybe add an ignore list to OGProxy

    There is one already in function.js - see below

    https://github.com/phenomlab/ogproxy/blob/03d5ff125611361700d785bd82a6ab16fcd68bfc/function.js#L9
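
    For reference, an ignore list of this kind is typically just an array of hostnames that is checked before a link gets rewritten. Here is a rough sketch of the idea only; the actual variable name and format used in OGProxy's function.js may differ:

    // Sketch only: OGProxy's real ignore list in function.js may use a
    // different name or structure. The idea is to skip rewriting any
    // host that the block-domain script already handles.
    var ignoredHosts = [
      'francesoir.fr',
      'fdesouche.com'
      // ...the other blocked domains
    ];

    function shouldIgnore(url) {
      try {
        var host = new URL(url).hostname.replace(/^www\./, '');
        return ignoredHosts.indexOf(host) !== -1;
      } catch (e) {
        return true; // unparsable URL: leave it alone
      }
    }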

  • In any case, I don’t think adding exclusions to OGProxy would change anything, because even links that are not supposed to be ignored are not being formatted by OGProxy.

    @phenomlab said in Block Domain:

    This just means that the callback / hook isn’t monitored by OGProxy and can probably be quite easily rectified.

    I’ll look and try to figure it out, but can you help me with this?

  • @DownPW said in Block Domain:

    I’ll look and try to figure it out, but can you help me with this?

    Yes, of course.

  • no luck for now

  • @DownPW Can you provide a summary of what you’ve tried?

  • I haven’t kept a list of everything I’ve tried.

    I’m mainly trying to understand what’s going on, otherwise I’m just poking around everywhere, and I don’t have any errors in the console.

    My impression is that both scripts act on the <a> elements and sometimes conflict: when I enter a topic via the Recent page (or elsewhere), OGProxy does its job; I refresh and it works again; I refresh once more and OGProxy does nothing.

    I have the impression that both scripts are manipulating the DOM and getting in each other’s way.
    Sometimes OGProxy works, sometimes it works on every second link, and sometimes not at all.

    Anyway, I’m a little lost at the moment.

    I don’t really know what changed, or which script changed.

  • @DownPW I don’t think this is OGProxy. I think it’s your script manipulating the DOM afterwards that causes the issue. Is there somewhere I can see and review the script (dev)?
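
    A likely reason is that the block-domain script replaces each container’s entire innerHTML, which discards anything OGProxy has already injected into it. A gentler approach (a sketch only, reusing the redirect URL and one of the warning messages from the script above, with hypothetical names) would be to modify only the matching <a> elements in place and leave the rest of the DOM, including OGProxy’s markup, untouched:

    // Sketch: instead of replacing the container's HTML wholesale, adjust
    // only the anchors whose hostname is on the block list. Everything
    // else in the container (including OGProxy's markup) is left alone.
    var blockedHosts = ['francesoir.fr', 'fdesouche.com' /* , ... */];

    function blockAnchorsIn($container) {
      $container.find('a[href]').each(function () {
        var $a = $(this);
        var host;
        try {
          host = new URL($a.attr('href'), window.location.origin).hostname.replace(/^www\./, '');
        } catch (e) {
          return; // skip anchors with unparsable hrefs
        }
        if (blockedHosts.indexOf(host) !== -1) {
          $a.attr('href', 'https://YOUR_URL_REDIRECTION/rules')
            .html('<strong>🤖 [Oups. The PW anti fake news system detected a disinformation domain name] 🤖</strong>');
        }
      });
    }

    The existing action:posts.loaded / action:topic.loaded handlers could then call blockAnchorsIn($postContent) instead of rewriting the HTML of each container.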

  • Yes, the script is on dev.

  • @DownPW is OGProxy also on dev? I’ll need both running

  • Yes, OGProxy is also functional on dev.


