
Adjusting HSTS settings for public Wi-Fi

Solved Security
  • Hello there! Yesterday, I visited IKEA and connected to public Wi-Fi. Google and other websites worked as expected, but when I tried to access my own website, it didn’t load for some reason. I received an HSTS error and had to switch to mobile data to access my site.

    I understand we can’t just turn off HSTS, since we’ve already told browsers that we will use HSTS for the next 12 months.

    I’ve been using these Cloudflare settings for the past two years. If there are any adjustments that need to be made to ensure my site functions on public Wi-Fi, please let me know.

    [Screenshot: Cloudflare HSTS settings]

  • @Hari You already have the correct settings here. It’s more likely to be an issue with the Wi-Fi configuration at IKEA than an issue with CF or your own site. You can test this from another public Wi-Fi access point to either prove or disprove this theory.

    I would certainly not make any changes without validating this as I’ve mentioned above. If it does prove problematic from a completely different connection source, then fair enough, it needs review. A quick way to compare what the site actually serves on different connections is sketched below.
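
For context, a 12-month HSTS commitment corresponds to max-age=31536000 in the Strict-Transport-Security response header. Below is a minimal sketch for comparing what the site serves on different connections; the domain is a placeholder, and the captive-portal note in the comments is an assumption about why public Wi-Fi can trip HSTS, not a confirmed diagnosis of the IKEA network.

```python
# Compare what the site serves from public Wi-Fi versus mobile data.
# "example.com" is a placeholder for the actual domain.
import requests

try:
    resp = requests.get("https://example.com/", timeout=10)
    hsts = resp.headers.get("Strict-Transport-Security")
    print("Strict-Transport-Security:", hsts or "<header not present>")
    # A 12-month policy typically looks like: max-age=31536000; includeSubDomains
except requests.exceptions.SSLError as exc:
    # A certificate failure here usually means the connection is being intercepted
    # (for example by a captive portal), which is exactly the situation HSTS tells
    # the browser to refuse rather than click through.
    print("TLS validation failed on this connection:", exc)
```

If the header looks right on mobile data but the request fails on the public Wi-Fi, that points at the network rather than the Cloudflare configuration, which is consistent with the advice above.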

  • @phenomlab Thanks for the reply. I will test this again when I visit IKEA, and also using the railway station Wi-Fi. For now, let’s mark this discussion as solved 🙂

  • Hari has marked this topic as solved
  • @Hari Ok, no issues. Keep me posted…



  • 4 Votes
    11 Posts
    373 Views

    @Hari Really? Can you elaborate a bit more here?

  • SSL certificates

    Solved Configure
    2 Votes
    4 Posts
    167 Views

    @Panda Go for shared - don’t look at dedicated 😄

  • 3 Votes
    10 Posts
    767 Views

    @Hari DDoS protection is not just a switch, or one component. It’s a collection of different and often disparate technologies that, when grouped together, form the basis of a combined toolset that can be used in defence.

    Typically these consist of IDS (Intrusion Detection System) and IPS (Intrusion Prevention System) components that detect irregularities in network traffic and take decisive action based on predefined rulesets, or in the case of more modern systems, AI and ML.

    Traditional “traffic shaping” technology is also deployed, so if an attack cannot be easily identified as malicious, the bandwidth available to that connection is severely limited to nothing more than a trickle rather than a full flow (a minimal illustration of the idea is sketched after this post).

    Years ago, ISPs used traffic shaping (also called “policers”) as an effective means of stopping applications such as BearShare, eDonkey, Napster, and other P2P-based sharing systems from functioning correctly - essentially reducing the “appeal” of distributing and seeding illegal downloads. This was the ISP’s way of saying “stop what you are doing, please” without actually pulling the plug.

    These days, DDoS attacks are designed to overwhelm - not assume control of - webservers and other public-facing components. It’s rare for small entities to be attacked unless there is some form of political agenda driven by your site or product. A classic example is governmental institutions or lawmakers, who are effectively classed as “enforcers”; those who disagree make their statement in the form of a Denial of Service.

    DDoS protection is effectively the responsibility of the hosting provider, but you shouldn’t just assume that they will protect you or your site. Their responsibility stops at their infrastructure, so it’s then up to you to decide how you fill the gap between your host and the website.

    Typically, you’d leverage something like Imunify360, which you can get for Plesk (and something I’d strongly recommend), but it’s not free - it’s a paid subscription (not expensive per month). If you want to use Virtualmin, then there are a variety of tools readily available out of the box, such as firewalls and fail2ban.
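
The “traffic shaping” point above is essentially rate limiting. As a purely illustrative sketch (not part of Imunify360, Plesk, or Virtualmin), here is the classic token-bucket idea: a suspect connection is only allowed a trickle of bytes per second, with a small burst allowance.

```python
import time


class TokenBucket:
    """Minimal token-bucket shaper: allows traffic up to `rate` bytes/second,
    with short bursts of up to `capacity` bytes."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate              # refill rate in bytes per second
        self.capacity = capacity     # maximum burst size in bytes
        self.tokens = capacity       # start with a full bucket
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False  # over the limit: the shaper would drop or queue this traffic


# A suspect connection gets a "trickle": 10 KB/s with a 20 KB burst allowance.
shaper = TokenBucket(rate=10_000, capacity=20_000)
print(shaper.allow(15_000))  # True  - within the initial burst allowance
print(shaper.allow(15_000))  # False - bucket exhausted, traffic is throttled
```

Real IPS and shaping appliances apply this per flow and combine it with the detection rulesets described above; the sketch only shows the throttling mechanic.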

  • 3 Votes
    18 Posts
    1k Views

    @justoverclock Not necessarily. You only need to use this if it fails from the Virtualmin window

  • 1 Votes
    2 Posts
    265 Views

    @hari The cache level for WooCommerce should always be bypass. Any caching of WooCommerce will cause you serious issues and will result in the checkout process not functioning correctly.

    This does mean that the overall experience will be slower (depending on geographic location), although CF is known to cause significant issues here, hence the need to bypass.

    If you want to cache as much as possible, then set rules to bypass caching on the cart and account pages, etc. A quick way to verify the bypass is taking effect is sketched below.
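
To confirm the bypass rules are actually taking effect, you can look at the cf-cache-status header Cloudflare adds to responses; it should read BYPASS (or DYNAMIC) on the cart and checkout pages. The domain and paths below are placeholders for a typical WooCommerce layout.

```python
# Spot-check Cloudflare's cache behaviour on a few WooCommerce URLs.
# Replace example.com and the paths with the real site's layout.
import requests

PAGES = [
    "https://example.com/",           # normal page: HIT / MISS is fine
    "https://example.com/cart/",      # should be BYPASS or DYNAMIC
    "https://example.com/checkout/",  # should be BYPASS or DYNAMIC
]

for url in PAGES:
    resp = requests.get(url, timeout=10)
    status = resp.headers.get("cf-cache-status", "<no Cloudflare cache header>")
    print(f"{url}  ->  cf-cache-status: {status}")
```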

  • 0 Votes
    1 Posts
    248 Views
    No one has replied
  • 1 Votes
    12 Posts
    852 Views

    @ash3t Great 🙂 Glad everything has worked out.

  • 5 Votes
    6 Posts
    447 Views

    @hari That’s it. Yes. Nothing more to do