
Surface Web, Deep Web, And Dark Web Explained

Blog
    When you think about the internet, what’s the first thing that comes to mind? Online shopping? Gaming? Gambling sites? Social media? Each of these depends on internet access to make it possible, and it would be almost impossible to imagine life without the web as we know it today. But how well do we really know the internet and its underlying components?

    Let’s first understand the origins of the Internet

    The “internet” as we know it today in fact began life as a product called ARPANET. The first workable version arrived in the late 1960s and went by that acronym rather than the less friendly “Advanced Research Projects Agency Network”. Initially funded by the U.S. Department of Defense, it used early forms of packet switching to allow multiple computers to communicate on a single network - the forerunner of the computer networks we use today.

    The internet itself isn’t one machine or server. It’s an enormous collection of networking components - switches, routers, servers and much more - located all over the world, all communicating via common “protocols” (agreed rules that govern how data travels between connected entities) such as TCP (Transmission Control Protocol) and UDP (User Datagram Protocol). Both TCP and UDP use “ports” to create connections, and each connected device requires an internet address (known as an IP address) which is unique to that device, meaning it can be identified individually amongst millions of other interconnected devices.
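    To make the IP-and-port model concrete, here’s a minimal sketch in PHP (example.com is just a placeholder host) that resolves a hostname to its IP address and then opens a TCP connection to port 80, the standard HTTP port:

    <?php
    // Resolve a hostname to its unique IP address (DNS lookup)
    $host = 'example.com';
    $ip = gethostbyname($host);
    echo "$host resolves to $ip\n";

    // Open a TCP connection to port 80 (HTTP) with a 5 second timeout
    $socket = fsockopen($ip, 80, $errno, $errstr, 5);
    if ($socket === false) {
        die("Connection failed: $errstr ($errno)\n");
    }
    echo "TCP connection established to $ip:80\n";
    fclose($socket);
    ?>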

    In 1983, ARPANET adopted the newly available TCP/IP protocol suite, which enabled scientists and engineers to assemble a “network of networks” - laying the foundation, or “web”, on which the internet as we know it today operates. The icing on the cake came in 1990, when Tim Berners-Lee created the World Wide Web (the “www” we affectionately know), effectively allowing websites and hyperlinks to work together to form the internet we know and use daily.

    However, over time, the internet model changed as various sites sought to remain outside the reach of search engines such as Google, Bing, Yahoo, and the like. This approach also gave content owners a mechanism to charge users for access to content - referred to today as a “paywall”. Out of this new model came, effectively, three layers of the internet.

    Three “Internets”?

    To make this easier to understand (hopefully), I’ve put together the diagram below.
    [Diagram: the iceberg model - the Surface Web above the waterline, with the Deep Web and Dark Web beneath it]

    The “Surface Web”

    Ok - with the history lesson out of the way, we’ll get back to the underlying purpose of this article, which is to reveal the three “layers” of the internet. The easiest way to explain this is with the “iceberg model”.

    The “internet” that forms part of our everyday lives - the tip of the iceberg - consists of sites such as Google, Bing, Yahoo (to a lesser extent) and Wikipedia (common examples - there are thousands more), all indexed by search engines and open to anyone.

    The “Deep Web”

    The next layer is known as the “Deep Web”, which typically consists of sites that do not expose themselves to search engines. They cannot be “crawled” and will not feature in Google searches, in the sense that you cannot access a direct link without first having to log in. Sites in this category include Netflix, your Amazon or eBay account, PayPal, Google Drive, and LinkedIn - essentially, anything that requires a login for you to gain access.

    The “Dark Web”

    The third layer down is known as the “Dark Web” - and it’s “dark” for a reason. These are sites that truly live underground, out of reach of most standard internet users. Typically, access is gained via a Tor (The Onion Router - a bit more about that below) enabled browser, with links to websites made up of seemingly random characters (which change often to avoid detection) and the suffix .onion. If I were asked to describe the Dark Web, I’d describe it as an underground online marketplace where literally anything goes - and I do mean “anything”.

    Examples include:

    • Ransomware
    • Botnets
    • Bitcoin trading
    • Hacker services and forums
    • Financial fraud
    • Illegal pornography
    • Terrorism
    • Anonymous journalism
    • Drug marketplaces for sale and distribution (good examples being Silk Road and Silk Road II)
    • Whistleblowing sites
    • Information leakage sites (a bit like WikiLeaks, but often containing information that even that site cannot obtain and make freely available)
    • Murder for hire (hitmen etc.)
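    As for the “onion” in The Onion Router: traffic is wrapped in several layers of encryption, and each relay in the circuit removes exactly one layer, so no single relay sees both who sent the message and where it is ultimately going. Below is a toy sketch of that layering idea only - it is not real Tor, and the relay names and keys are invented for the example:

    <?php
    // Toy illustration of onion routing's layered encryption (NOT real Tor).
    $relays = ['entry', 'middle', 'exit'];

    // Hypothetical per-relay symmetric keys; real Tor negotiates these per circuit.
    $keys = [];
    foreach ($relays as $r) {
        $keys[$r] = random_bytes(32); // AES-256 key
    }

    function wrapLayer(string $payload, string $key): string {
        $iv = random_bytes(16);
        $ct = openssl_encrypt($payload, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
        return $iv . $ct; // prepend the IV so the relay can decrypt its layer
    }

    function peelLayer(string $blob, string $key): string {
        $iv = substr($blob, 0, 16);
        $ct = substr($blob, 16);
        return openssl_decrypt($ct, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
    }

    // The sender builds the onion: the innermost layer uses the exit relay's key.
    $onion = 'GET http://exampleaddress.onion/ HTTP/1.1';
    foreach (array_reverse($relays) as $r) {
        $onion = wrapLayer($onion, $keys[$r]);
    }

    // Each relay then peels exactly one layer, in path order.
    foreach ($relays as $r) {
        $onion = peelLayer($onion, $keys[$r]);
        echo "$r relay peeled its layer\n";
    }
    echo "Exit relay now sees: $onion\n";
    ?>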

    Takeaway

    The Surface, Deep, and Dark Web are in fact interconnected. The purpose of these classifications is to determine where certain activities that take place on the internet fit. While internet activity on the Surface Web is, for the most part, secure, activities in the Deep Web are hidden from view, but not necessarily harmful by nature. It’s very different in the case of the Dark Web. Thanks to its (virtually) anonymous nature, little is known about its true content. Various attempts have been made to “map” the Dark Web, but given that URLs change frequently and there is generally no trail of breadcrumbs leading back to the surface, it’s almost impossible to do so.
    In summary, the Surface Web is where search engine crawlers go to fetch useful information. By direct contrast, the Dark Web plays host to an entire range of nefarious activity, and is best avoided on security grounds alone.

  • @phenomlab Some months ago I took a look at the dark web… and I never want to see it again…

    It’s really… dark. The content I saw scared me a lot…

  • @justoverclock Yes, I completely understand that. It’s a haven for criminal gangs, and literally everything is on the table: drugs, weapons, money laundering, cyber attacks for rent, and even murder for hire.

    Nothing, it seems, is off limits. The dark web is truly a place where the only limitation is the amount you are prepared to spend.



    @DownPW 🙂 most of this really depends on your desired security model. In all cases with firewalls, less is always more, although it’s never as clear cut as that, and there are always bespoke ports you’ll need to open periodically.

    Hetzner’s DDoS protection is superior, and I know they have invested a lot of time, effort, and money into making it extremely effective. However, if you consider that the largest ever DDoS attack hit Cloudflare at 71m rps (and they were able to deflect it), and that each attack can last anywhere between 8 and 24 hours depending on how determined the attacker(s) is/are, you can never be fully prepared - nor can you trace an attack’s true origin.

    DDoS (Distributed Denial of Service) attacks are, by their nature, conducted by large numbers of devices that have become part of a “bot army” - and in most cases, the owners of these devices are blissfully unaware that they have been compromised and are under command and control from a nefarious source. Given that the attacks originate from multiple sources, the real attacker can observe from a distance whilst concealing their own identity and origin in the process.

    If you consider the desired effect of DDoS, it is not an attempt to access ports that are typically closed, but to flood (and eventually overwhelm) the target (such as a website) with millions of requests per second in an attempt to force it offline. Victims of DDoS attacks are often financial services, for example, with either extortion or financial gain being the primary objective - in other words, pay the originator to stop the attack.
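    To make the “flood” aspect concrete, below is a toy sketch (my own illustration, not something from this thread) of a naive per-IP rate limiter. It shows the application-level idea - and also why it’s insufficient on its own, since a distributed attack spreads requests across thousands of source IPs and saturates bandwidth long before code like this ever runs:

    <?php
    // Naive sliding-window rate limiter, for illustration only.
    class RateLimiter {
        private array $hits = []; // ip => list of request timestamps

        public function __construct(private int $limit, private int $window) {}

        public function allow(string $ip): bool {
            $now = microtime(true);
            // Keep only the timestamps that fall inside the sliding window
            $this->hits[$ip] = array_filter(
                $this->hits[$ip] ?? [],
                fn ($t) => $now - $t < $this->window
            );
            if (count($this->hits[$ip]) >= $this->limit) {
                return false; // over the limit - reject
            }
            $this->hits[$ip][] = $now;
            return true;
        }
    }

    // Simulate 10 rapid requests from one IP, limited to 5 per second.
    $limiter = new RateLimiter(limit: 5, window: 1);
    for ($i = 1; $i <= 10; $i++) {
        $ok = $limiter->allow('203.0.113.10');
        echo "request $i: " . ($ok ? 'allowed' : 'rejected (429)') . "\n";
    }
    ?>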

    It’s even possible to get DDoS as a service these days - with a credit card, a few clicks of a mouse, and a target IP, you can have your own campaign running in minutes, typically involving “booters” or “stressers” - see below for more

    https://heimdalsecurity.com/blog/ddos-as-a-service-attacks-what-are-they-and-how-do-they-work

    @DownPW said in Setting for high load and prevent DDoS (sysctl, iptables, crowdsec or other):

    in short, if you have any advice to give on how best to secure things

    It’s not just about DDoS or firewalls. There are a number of vulnerabilities on all systems which, if not patched, will expose that same system to exploits. One of my favourite online testers, which does a lot more than most basic ones, is below

    https://www.immuniweb.com/websec/

    I’d start with the findings reported here and use that to branch outwards.


    @phenomlab
    Sorry for the delay in responding. Yes, as I mentioned above, I had to remove my Redis container from Docker and reinstall a new image with this command

    docker run --name=redis -p 127.0.0.1:6379:6379 -d -t redis:alpine

    and now, when I test my IP and port on
    https://www.yougetsignal.com/tools/open-ports/

    the status of my Redis port is closed - the 127.0.0.1:6379:6379 mapping publishes the port on the loopback interface only, so it is not reachable externally. I think configuring the firewall on the DigitalOcean droplet is a good idea too, and I will set that up soon.
    Thanks for the help!
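    For anyone verifying a similar setup later, here’s a quick sketch that checks the port from both sides (203.0.113.50 is a documentation placeholder - substitute your droplet’s public address, and run the external check from a machine outside the droplet for a true test):

    <?php
    // Returns true if a TCP connection to host:port succeeds within 3 seconds
    function portOpen(string $host, int $port): bool {
        $s = @fsockopen($host, $port, $errno, $errstr, 3);
        if ($s === false) {
            return false;
        }
        fclose($s);
        return true;
    }

    var_dump(portOpen('127.0.0.1', 6379));    // true - Redis bound to loopback
    var_dump(portOpen('203.0.113.50', 6379)); // false - port not published externally
    ?>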


    @phenomlab

    No, they have a free and a pro console instance.
    We can see alerts with IP, source AS, attack scenario, etc.

    Installation on the NodeBB server went without problems. A very good tool.

    [Screenshots: CrowdSec console dashboard showing alerts]

    We can also do research on IPs via the CrowdSec analyzer.

    I believe it’s 500 per month on the free version.

    [Screenshot: CrowdSec IP analyzer]


    @mike-jones Hi Mike,

    There are multiple answers to this, so I’m going to provide some of the most important ones here

    JS runs on the client side, so you shouldn’t rely on it solely for validation. Any values collected by JS will need to be passed back to the PHP backend for processing, and must be fully sanitised first to ensure that your database is not exposed to SQL injection. To pass those values back to PHP, you’ll need something like

    <script>
    $(document).ready(function() {
        // Read the value once the DOM is ready
        var myvalue = $('#id').val();
        $.ajax({
            type: "POST",
            url: "https://myserver/myfile.php?id=" + encodeURIComponent(myvalue),
            success: function() {
                // Refresh the target div with the server's response for this id
                $("#targetdiv").load('myfile.php?id=' + encodeURIComponent(myvalue) + ' #targetdiv');
            }
            //error: ajaxError
        });
    });
    </script>

    Then collect that in PHP via a POST / GET request, such as

    <?php
    // Basic example - escape on output to avoid reflected XSS
    $myvalue = $_GET['id'];
    echo "The value is " . htmlspecialchars($myvalue);
    ?>

    Of course, the above is a basic example, but it is fully functional. Here, the risk level is low in the sense that you are not attempting to manipulate data, but simply request it. However, this in itself would still be vulnerable to SQL injection if the query were built by concatenating user input rather than by using prepared statements (here via PDO). Here’s an example of how to get the data safely

    <?php
    function getid($theid) {
        global $db; // assumes an existing PDO connection
        // The placeholder (?) keeps user input out of the SQL string itself
        $stmt = $db->prepare("SELECT * FROM data WHERE id = ?");
        $stmt->execute([$theid]);
        while ($result = $stmt->fetch(PDO::FETCH_ASSOC)) {
            $name = $result['name'];
            $address = $result['address'];
            $zip = $result['zip'];
        }
        return array(
            'name' => $name,
            'address' => $address,
            'zip' => $zip
        );
    }
    ?>

    Essentially, using prepared statements, we send placeholders rather than actual values. The database driver binds the supplied value separately from the query text, ensuring we only return what is being asked for and nothing else. This prevents typical injections such as “ OR 1=1”, which would otherwise end up returning everything - not what you want at all for security reasons.

    When calling the function, you’d simply use

    <?php
    $row = getid($myvalue);
    echo $row['name'] . " " . $row['address'] . " " . $row['zip'];
    ?>

    @mike-jones said in Securing javascript -> PHP mysql calls on Website:

    I am pretty sure the user could just use the path to the PHP file and type the web address into the address bar

    This is correct, although with no parameters, no data would be returned. You can actually prevent the PHP script from being called directly using something like

    <?php if(!defined('MyConst')) { die('Direct access not permitted'); } ?>

    then, on the pages that need to include it

    <?php define('MyConst', TRUE); ?>

    Obviously, requests coming directly are not going via your chosen route, so the script will die because MyConst has not been defined

    @mike-jones said in Securing javascript -> PHP mysql calls on Website:

    Would it be enough to just check that the number is between 1-100, and that the dropdown is one of the 5 specific words, and then not run the rest of the code if the input doesn’t fit one of those parameters?

    In my view, no - those checks happen in the browser, so on their own they still leave the PHP file exposed to SQL injection. The same validation needs to be repeated server side, as in the sketch below.
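    As a rough sketch of what that server-side validation might look like (the field names, the five words, and the table are hypothetical stand-ins - adjust them to your form and schema):

    <?php
    // The five permitted dropdown values (stand-ins for your real words)
    $allowedWords = ['red', 'green', 'blue', 'black', 'white'];

    // Reject anything that is not an integer between 1 and 100
    $number = filter_input(INPUT_POST, 'quantity', FILTER_VALIDATE_INT, [
        'options' => ['min_range' => 1, 'max_range' => 100],
    ]);
    $word = $_POST['colour'] ?? '';

    if ($number === false || $number === null || !in_array($word, $allowedWords, true)) {
        http_response_code(400);
        die('Invalid input');
    }

    // Even after validation, still use a prepared statement for the query
    // ($db is the PDO connection from the earlier examples)
    $stmt = $db->prepare("SELECT * FROM data WHERE quantity = ? AND colour = ?");
    $stmt->execute([$number, $word]);
    ?>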

    Hope this is of some use to start with. Happy to elaborate if you’d like.


    @crazycells I guess the worst part for me was the trolling - made so much worse by the fact that the moderators allowed it to continue, insisting that the PeerLyst community was setting an example by being allowed to “self moderate” - a completely ridiculous statement - and it wasn’t until someone other than myself pointed out that all of this toxic activity could in fact be crawled by Google that they decided to step in and start deleting posts.

    In fact, it reached boiling point when the CEO herself had to step in and post an article justifying “self moderation” - which simply doesn’t work.

    The evidence here speaks for itself.

  • is my DMARC configured correctly?

    @phenomlab said in is my DMARC configured correctly?:

    you’ll get one from every domain that receives email from yours.

    Today I received another DMARC report email from Outlook. I was referring to your reply again and found it very helpful and informative - thanks again.
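    In case it helps anyone else reading this, here’s a quick way to inspect the DMARC record a domain publishes (a minimal sketch - example.com is a placeholder for your own domain):

    <?php
    // DMARC policies are published as a TXT record at _dmarc.<domain>
    $domain = 'example.com';
    $records = dns_get_record('_dmarc.' . $domain, DNS_TXT);
    foreach ($records as $r) {
        if (isset($r['txt']) && strpos($r['txt'], 'v=DMARC1') === 0) {
            echo $r['txt'] . "\n"; // e.g. v=DMARC1; p=quarantine; rua=mailto:...
        }
    }
    ?>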

    I wish sudonix 100 more great years ahead!