
How to use the deep web as a force for good: Internet 2.0, Hyperreality and net neutrality


Introduction

The deep web houses the dark web’s illegal and provocative material. This paper attempts to reframe the deep web as a force for good, effectively distancing it from the dark web. This matters because a resurgence of popular interest in the deep web is always a possibility.

Report it

If you see something negative, report it. Many activities on the deep web depict behavior that is not only deviant but illegal. Those who upload such content can be held liable, and those who took part in the behavior will most likely be charged with a crime. The FBI is the best point of contact for these issues, though your local police force may also help if the criminal or the owner of the website resides locally. Reporting is essential if the deep web is to be cleared of deviancy; the deep web acts as a honeypot that attracts deviants, so they can be marked and later listed or pursued.

Criminals on the deep web and law enforcement would rather not deal with each other at all, yet the deep web is their digital meeting point. The concern grows when we observe how the clearnet has become more deviant over the years: more people will become worse off and flock to the deep web, where terrorist organizations and abuse are found. That flock, whether it arrives from TikTok or Discord, will deteriorate further as it mixes with other deviants. This is an environment for societal disintegration, and the threat is real. As more and more people claim to have started using the deep web, and with unrestricted access to the internet, children will also flock there, as if the clearnet did not contain enough abuse already. Those children will develop troublesome habits that keep getting worse, a recipe for catastrophe.

We must try to clean the dark web of its negativities, of all the abuse and deviancy. It seems too late for the clearnet: even if identity documentation and age restrictions were put in place, a way around them would be found. The deep web must be cleared before the general population reaches it; once the hype dies down and people see there is nothing there, they will not continue their journey on the deep web, nor will they continue to lurk there.

Internet 2.0

Once the deep web is clean, it could serve as a bridge to internet 2.0; at present, there is no insurance or backup against the loss of the current internet through some kind of disaster. We could ready Tor and similar services to become internet 2.0, and even if that plan fails, a cleaned deep web at least ensures that internet 2.0 will not be shaped by the negativities many people will otherwise come to acquire. If internet 2.0 instead turns out to be the internet of all things, the crossing of reality and the virtual world, then it too would be influenced by the abuse that is now spreading and becoming more extreme by the day. Deviancies would then leak into reality and become even more degrading. It will keep getting worse, which is why we need to clean the deep web; moreover, if it is not cleaned, the deep web could merge into hyperreality in the future. Imagine the damage this could cause, with many gaining the ability to commit financial crimes: bribery, corruption, money laundering, financing of terrorism, black-market currency exchange and counterfeiting are only some of them. Nor should we forget other crimes, such as producing or watching violence and abuse; we must look ahead to cleaning the deep web.

In 2014, Europol and U.S. intelligence agencies launched Operation Onymous to shut down as many non-indexed sites as possible on the deep web. As we can see, the deep web’s extreme content relies mostly on illegal activities, meaning that as digital progress continues, technology would lean toward the more harmful side. The deep web should therefore be cleaned before it merges with artificial intelligence, virtual reality, hyperreality or the internet of things. If that proves too hard to achieve, another strategy might be to split the dark web away from the deep web: the onion router could host only the dark web, while other non-indexed websites could be reached through a different router service. The illegal and abusive depictions and services would then stay on Tor, while “good” deep web services could remain non-indexed elsewhere. What kind of good services could exist in such a scenario? We will discuss this in the sections that follow.

Archive it

The deep web could be used to keep archives of websites. This matters because, as time passes, many websites close down or get shut down. Saving those pages on non-indexed websites could prevent the loss of internet history. Many pages have been saved on the Wayback Machine, but because they were not also saved locally, many websites were still lost when the originals closed down. Another problem with the Wayback Machine is that its copies are not dynamic enough, and many websites block archiving through bots and passwords. This is not to say that personal data should be archived, only that many websites vanish from certain search engines. The Wayback Machine does work on non-indexed websites when accessed from Tor, yet many of these sites lose their dynamic ability and become mere pictures. This is why a new digital archive service is needed; it would be a more reliable way of saving websites and an alternative to saving them on the Internet Archive. Such a service could also collect private archives, similar to the Internet Archive, but if the Internet Archive itself were lost or damaged without a backup, we would need another backup, not only for private archives, whether digital or physical, but for archiving the pages of services as well.

At any point in time, these archives might disappear, which relates to the fear of losing the internet’s first form in some disaster. While some theorists describe web 1, 2 and 3 as progressions of user-generated content, here I am referring to the complete loss of the internet, which is what internet 2.0 refers to in this context. These fears are exacerbated by the fact that there is no real method of backing up the internet in physical form. Whether this is a socio-technological concern, as outlined above, a techno-sophical concern, or a question for the philosophy of technology, a solution should be found. Having built the theoretical basis for a physical backup of the internet, the question remains: how do we back up the internet in physical form without oversimplifying its digital content? Would a library archive of printed webpages be a solution, or is that not dynamic enough?
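As a rough illustration of the “save it locally” idea above, the sketch below fetches a page and writes a timestamped copy to disk. It is a minimal example using only the Python standard library; the archive folder name and the example URL are placeholders, not part of any existing archive service.

import datetime
import os
import urllib.request

ARCHIVE_DIR = "web_archive"  # hypothetical local folder for snapshots

def snapshot(url: str) -> str:
    """Fetch a page and store its raw HTML under a timestamped file name."""
    os.makedirs(ARCHIVE_DIR, exist_ok=True)
    with urllib.request.urlopen(url, timeout=30) as response:
        body = response.read()
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    safe_name = url.replace("://", "_").replace("/", "_")
    path = os.path.join(ARCHIVE_DIR, f"{safe_name}_{stamp}.html")
    with open(path, "wb") as f:
        f.write(body)
    return path

if __name__ == "__main__":
    # Placeholder address; a real archive job would loop over a list of URLs.
    print(snapshot("https://example.com/"))

A sketch like this only captures the raw HTML of a single page; a genuine archive service would also need to save linked assets and preserve the dynamic behavior discussed above.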

Hyperreality

Likewise, we should observe that the internet around us is becoming ever more embodied; this web-manifestation is materialized through several methods. The first is turning the ideas and forms we see online into products, which starts with creativity and ends with application; 3D printing is an example of how this method is evolving, as it lets more digital items be imagined and then applied. Another method is shipping, where items are advertised online and transported physically. Moreover, money and value are moved digitally as currency, and ideas are moved around digitally and then manifested physically in the world. Some services are also made digitally and consumed digitally; think of research papers, for example. Demand creates the supply and progress of these consumable items, and competition drives that progress even further. This is why we will want more and more of this digital manifestation. As such, we can categorize digital permeability into digital-space consumables, physical-space consumables and hyperphysical consumables; this is a techno-ontological categorization of the matter, framed in terms of hyperreal manifestation.

Net neutrality

The principle of net neutrality ensures that companies do not make a set of websites exclusive to a certain internet provider, so that providers cannot block websites their competitors might monopolize, charge extra for loading particular websites or censor content as they please. This became an issue when Ajit Pai, head of the Federal Communications Commission (FCC) in 2017, moved to end net neutrality in the interest of carriers such as AT&T, Verizon and T-Mobile (according to Cecilia Kang of The New York Times). The Commission voted to repeal the rules in December 2017, and Pai stepped down when Joe Biden became president. Why is this issue relevant right now? New FCC leadership is in place and preparations are underway to ensure that the 2017 episode does not repeat itself; even so, other options should be considered as a failsafe. The threat of losing net neutrality will keep lurking, so laws should be passed to ensure it is not lost again. If it happens, it will not only affect the U.S. but spread to the whole world.

So how does the deep web tie into this? If search engines and rights groups are unable to solve the problem, Tor could be used as an alternative router: de-indexing websites (by having them use the noindex metatag) would keep companies from monopolizing the internet, since non-indexed websites simply cannot be monopolized. Providers might still try, but would be forced to stop, as doing so would put too much of a spotlight on the deep web and Tor-related services, leading a majority of people to seek security on the deep web, which intelligence agencies would not welcome. This could be a way to inscribe net neutrality into U.S. law, though de-indexing websites should only be a temporary measure until the issue of net neutrality is fully resolved.
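To make the de-indexing idea concrete, the following sketch checks whether a given page already carries a noindex directive, either in a robots meta tag or in an X-Robots-Tag response header. It is a minimal example using only the Python standard library; the URL shown is a placeholder.

import urllib.request
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects whether any <meta name="robots"> tag contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag.lower() != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def is_deindexed(url: str) -> bool:
    """Return True if the page asks search engines not to index it."""
    with urllib.request.urlopen(url, timeout=30) as response:
        header = (response.headers.get("X-Robots-Tag") or "").lower()
        html = response.read().decode("utf-8", errors="replace")
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in header or parser.noindex

if __name__ == "__main__":
    print(is_deindexed("https://example.com/"))

Adding the directive itself is just as simple: a site owner places a robots meta tag with the content "noindex" in each page’s head, and compliant search engines drop the page from their indexes.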

Under-the-counter medication

Many types of medication require a prescription; in some countries it is illegal for pharmacies to dispense medication without one, while the countries that do allow it permit whatever amount the patient pleases. As such, the deep web could be used to move pharmaceuticals, rather than recreational drugs, to those in need. Many could thus obtain treatment without paying a doctor’s fee, or in most cases, without paying the doctor over and over again. For example, according to The New York Times, the rights to Daraprim, a prescription medication, were acquired by Turing Pharmaceuticals in August 2015; the company raised the price to $750 US per pill, as opposed to its early 2015 price of $13.50 US, while in other countries Daraprim costs far less. A suggestion would be to sell under-the-counter medication to and from countries that do not prohibit or curtail access to medication, or that do not require prescriptions. In that case, neither the buyer nor the seller is engaged in illegal activity; buyers in countries that require prescriptions must consider the legal statutes where they live and are responsible for themselves. In any case, this would be a strong way to fight ‘Big Pharma’ and its monopoly on life.

How to access for beginners

The best and most secure way is to use the onion router, otherwise known as Tor, together with the Tails operating system; you can use Ahmia.fi to search for relevant onion websites. One disclaimer: in theory, most Tor exit nodes are watched by governments, and Tor itself could be used as a honeypot for the deviant. Other than that, as long as an individual’s behavior is not suspicious, it should be fine; entering non-indexed websites (otherwise known as the deep web) is not in itself suspicious, but delving further into the dark web might be, especially if an illegal service is paid for. While the dark web is part of the deep web, it mostly hosts illegal services and depicts illegal behavior. Much of this paper offers both technical and philosophical analysis of the possibilities at hand.
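For readers who want to script against onion services rather than browse them, the sketch below routes a request through a local Tor client’s SOCKS proxy. It assumes a Tor daemon is already running and listening on 127.0.0.1:9050 (the Tor Browser uses port 9150 instead), and that SOCKS support for the requests library is installed via pip install requests[socks]; it is an illustrative sketch, not a hardened setup.

import requests

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h resolves hostnames inside Tor,
    "https": "socks5h://127.0.0.1:9050",  # which is required for .onion addresses
}

def fetch_over_tor(url: str) -> str:
    """Fetch a URL through the local Tor SOCKS proxy and return the body text."""
    response = requests.get(url, proxies=TOR_PROXIES, timeout=60)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # check.torproject.org reports whether the request really left via Tor.
    print(fetch_over_tor("https://check.torproject.org/")[:200])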

Conclusion

With the resurgence in the popularity of Tor-related services, we can see the need to clean up the deep web and rid it of the deviant behavior it contains. We could use non-indexed websites to archive websites, ensure net neutrality, fight pharmaceutical companies and secure a better future for the internet. Further, by cleaning up the deep web, we not only ensure better, less deviant societies and a fight against monopolies, but also push back against the possible progression of the dark web into hyperreality in the future.

On the Web:

https://rb.gy/wsu0r

https://rb.gy/rcmgo

https://rb.gy/wyic4

https://rb.gy/c5r2a

https://rb.gy/0xlbq

References:

Iskandar, Tatiana, Lee Semien, and Daniel Vinegrad. n.d. "Net Neutrality." Stanford University. https://cs.stanford.edu/people/eroberts/cs201/projects/201011/NetNeutrality/Articles/Proponents.html#:~:text=One%20of%20the%20most%20basic.

Feiner, Lauren. 2020. "FCC Chairman Ajit Pai Will Step Down on Jan. 20." CNBC, November 30, 2020. https://www.cnbc.com/2020/11/30/fcc-chairman-ajit-pai-will-step-down-on-january-20.html

Kang, Cecilia. 2017. "Trump’s F.C.C. Pick Quickly Targets Net Neutrality Rules." The New York Times, February 5, 2017. https://www.nytimes.com/2017/02/05/technology/trumps-fcc-quickly-targets-net-neutrality-rules.html

Pollack, Andrew. 2015. "Drug Goes From $13.50 a Tablet to $750, Overnight." The New York Times, September 2015. https://www.nytimes.com/2015/09/21/business/a-huge-overnight-increase-in-a-drugs-price-raises-protests.html

About the author

I have worked in journalism for the past two years. I am a journalistic writer and a curious person, interested in the social and humanities-related aspects of the topics I cover. My first entry was about the deep web, which I have surfed sparingly for many years.

