Wikipedia:Reference desk/Archives/Computing/2019 December 19

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 19

How to access deep web and dark web?

There are some websites which can be found through Tor only. — Preceding unsigned comment added by Темная планета (talkcontribs) 06:38, 19 December 2019 (UTC)

The term "dark" here has multiple meanings that can be nit-picked to death. In some cases, a person will use "dark" to mean that a website practically never shows up in any search engine. You can use Google and Bing all day and never find the website. That isn't particularly fascinating. Google has estimated multiple times in the past that they index about 10% of all available websites. I estimate that you don't want to know about 80% of them, so Google's doing a good job if they can index half of the ones you might care about. To access this form of "dark" website, you simply need to know it exists. Just about every sequence of 4 letters exists, so I'm sure there is a website named qwer.com (yep - it is a redirect to another website).
In other cases, "dark" means that the common domain name registration system isn't used. So, if I made "mysupersecretsite.sec" and you wanted to use it, you would have to set up your machine to use a domain name server that maps that domain name to the IP address. There are many unofficial DNS systems in use. All they do is map a domain name to an IP address. You can even make your own if you like and map "w.c" to Wikipedia's IP address.
In the area that most people mean when they say "dark web", domain names are not in use. Computers are referred to by IP address, such as 184.168.131.241. I put a web page or some other file on a computer. I give only those I trust the IP address. They can access it. If you don't know the IP address, you can't get to it.
There's one more level of "dark" that has to do with the user's IP address. Many websites alter content based on the IP address of the person requesting the web page. A dark page can check the user's IP address and refuse to send anything. But, if you are using an authorized IP address, you get a web page. That is where VPN systems come in. I might authorize a specific IP address used by a VPN system. You use a VPN system. Tor can work in this case, but it would be a completely different model. A VPN disguises your IP address by bouncing traffic off a VPN server - you get the IP address of the server, not your computer. Tor bounces your traffic around a lot of servers, so you get a constantly changing IP address. But, I could set up my server so it only allows Tor servers, which is hard because they change.
All of this is about hiding websites (or other files) on the web. In this case, the web is defined as any server that uses the HTTP or HTTPS protocol. It is not about being anonymous when using the web. 135.84.167.41 (talk) 17:18, 19 December 2019 (UTC)
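To make the "authorized IP address" idea above a little more concrete, here is a minimal sketch in Python using only the standard library's http.server module. This is not how a production site would do it (that normally lives in the web server or firewall configuration), and the addresses and port below are made up for illustration: the handler simply refuses to send anything unless the requesting IP is on a small allow list.

    # Minimal sketch of an IP-allow-list "dark" page (standard library only).
    # The addresses and port are illustrative, not real ones.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    ALLOWED = {"127.0.0.1", "198.51.100.23"}  # hypothetical trusted addresses

    class DarkHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            client_ip = self.client_address[0]  # IP address of whoever connected
            if client_ip not in ALLOWED:
                self.send_response(403)         # unknown visitors get nothing useful
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Secret page for trusted addresses only\n")

    if __name__ == "__main__":
        HTTPServer(("", 8000), DarkHandler).serve_forever()

Run it locally and a request from 127.0.0.1 gets the page, while a request from any other address gets a bare 403 and nothing else.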
  1. Read this: https://helpdeskgeek.com/how-to/a-quick-guide-to-navigating-the-dark-web/
  2. Read this: https://www.techadvisor.co.uk/how-to/internet/dark-web-3593569/
  3. Read this: https://www.cnet.com/news/darknet-dark-web-101-your-guide-to-the-badlands-of-the-internet-tor-bitcoin/
  4. Read this: https://www.techrepublic.com/article/how-to-safely-access-and-navigate-the-dark-web/
  1. Go to wikitjerrta4qgz4.onion/ -- The Hidden Wiki
  2. Go to torlinkbgs6aabns.onion/ -- TorLinks .onion Link List
  3. Go to msydqstlz2kzerdg.onion/ -- Ahmia Tor search engine
  4. Go to hss3uro2hsxfogfq.onion/ -- Not Evil Tor search engine
  5. Go to gjobqjj7wyczbqie.onion/ -- Candle Tor search engine
  6. Go to tor66sezptuu2nta.onion/ -- Tor66 Tor search engine
  7. Go to darkfailllnkf4vf.onion/ -- Darkfail onion list
--Guy Macon (talk) 20:38, 19 December 2019 (UTC)
My understanding of "deep web" is that much content available from web servers is only available via a custom site search, behind a login form or paywall, or through some kind of query that prevents the content from being indexed by search engines. For example, on Wikipedia, we have maintenance queries such as looking at a user's contributions, their block log, or a page's history. That stuff is the deep web. If you log into a web mail service such as Gmail or Outlook, your email folders and messages are in the deep web. If you go to Amazon.com and make an order, your package tracking and receipt are in the deep web. So in the normal course of using websites, I am confident that you access the deep web quite often already. Elizium23 (talk) 21:48, 19 December 2019 (UTC)
The OP made it clear that they were talking about "websites which can be found through Tor only", which is a description of the dark web, not the deep web. As part of my job I often access parts of the deep web, then I lock them down so that it is very difficult for anyone else to access them. --Guy Macon (talk) 11:06, 20 December 2019 (UTC)

What's wrong with this code?

What's wrong with this code? :

   def snail(snail_map):
       newArray = []
       a = 0
       b = 0
       c = 0
       d = len(snail_map[a])-1
       while len(snail_map) > 0:
           while b < len(snail_map[a]):
               newArray.append(snail_map[a][b])
               snail_map[a].pop(b)
           if len(snail_map[a]) == 0:
               snail_map.pop(a)
           while c < len(snail_map):
               newArray.append(snail_map[c][d])
               snail_map[c].pop(d)
               c += 1
           c -= 1
           d -= 1
           while d >= 0:
               print(snail_map)
               print(newArray)
               print(c,d)
               print(snail_map[c][d])
               newArray.append(snail_map[c][d])
               snail_map[c].pop(d)
               if len(snail_map[c]) == 0:
                   snail_map.pop(c)
               d -= 1
           c -= 1
           while c >= 0:
               newArray.append(snail_map[c][0])
               snail_map[c].pop(0)
               c -= 1
   
   snail([[1,2,3],[8,9,4],[7,6,5]])
   
   snail([[1,2,3,4],[12,13,14,5],[11,16,15,6],[10,9,8,7]])

Basically, this code is meant to fulfill this exercise:

https://www.codewars.com/kata/snail/train/python

For some reason, however, I get this error message:

"Traceback (most recent call last):
  File "C:/Users/Josh/AppData/Local/Programs/Python/Python37-32/snail_map.py", line 37, in <module>
    snail([[1,2,3],[8,9,4],[7,6,5]])
  File "C:/Users/Josh/AppData/Local/Programs/Python/Python37-32/snail_map.py", line 14, in snail
    newArray.append(snail_map[c][d])
IndexError: list index out of range"

I don't know exactly why it states that the list index here is out of range, considering that "snail_map[c][d]" means "snail_map[1][0]" and that the value of "snail_map[c][d]" is "7". So, why exactly does it state that the list index here is out of range?

Any thoughts on this? Futurist110 (talk) 23:15, 19 December 2019 (UTC)

I don't know what's wrong with the code, but I added print (c,d) before line 16, and c and d both have the value -1 when the error occurs. AndrewWTaylor (talk) 11:27, 20 December 2019 (UTC)
You are getting index out of range because you are popping values from an array and expecting the array to stay the same length. This is a simple example to explain what you are doing. Assume I have an array named a with the values [8,20,70]. You want to do something with each index, like something(a[0]), something(a[1]), something(a[2]). So, you set b=0 and then call something(a[b]). Then, you pop that element off the array with a.pop(b). Then, you add 1 to b and repeat it for the second index: something(a[b]), a.pop(b). You add 1 to b and repeat it for the last index. But... index out of bounds. When b was zero, you popped index 0 with a.pop(b). Now, a is not [8,20,70]. It is [20,70]. You don't see an error when b=1. You can do something with a[b] and pop it. When you call a.pop(b) in this case, a becomes just [20]. You add 1 to b and it is 2. You try to access a[b], but it doesn't work. There is no a[b] when b=2 because a only has one value. All in all, you don't really need to pop for this algorithm. It looks like you are trying to use recursion. You can manually hit every index around the outside and then recursively send the inner matrix back through the function itself. Another method is to increment an offset. It begins at zero. You check indexes offset to length-offset (all of them). Then, you add 1 to offset and check indexes offset to length-offset on the next row. No need for recursion. 135.84.167.41 (talk) 14:14, 20 December 2019 (UTC)
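Here is a short, runnable demonstration of the failure mode described above, using the same a and b names from that walkthrough: the list shrinks with each pop, so the loop skips an element and then runs off the end.

    # Popping while also advancing the index: one element is skipped,
    # and the last access raises IndexError because the list has shrunk.
    a = [8, 20, 70]
    b = 0
    try:
        while b < 3:       # 3 is the *original* length of a
            print(a[b])    # prints 8, then 70 -- the 20 gets skipped
            a.pop(b)       # a shrinks: [20, 70], then [20]
            b += 1
    except IndexError as err:
        print("IndexError at b =", b, "-", err)   # fails at b = 2; a is just [20]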

It looks like 135.84.167.41 has done a good job analyzing this issue. At a wider level, part of the reason this code is so confusing is all the values (variables and arrays) that keep changing. That is considered an old-fashioned style these days, and while going full-on functional programming (making everything immutable) creates its own problems, you will generally have an easier time debugging if you use immutable values instead of mutable ones when it's convenient to do so. Python itself makes that difficult because its iterators mutate, but if you only use them once you can look the other way. So my version of this code goes:

from typing import Tuple,Iterator,List
def squares(n: int) -> Iterator[Tuple[int,int]]:
    yield from [(0, i) for i in range(n)] # go along the top, left to right
    yield from [(i, n-1) for i in range(1,n)] # then down the right edge
    yield from [(n-1, i) for i in range(n-2,-1,-1)] # then along bottom, right to left
    yield from [(i, 0) for i in range(n-2,0,-1)] # then up the left edge
    if n > 2: # now recursively do the nested square inside the edge
        yield from [(a+1,b+1) for a,b in squares(n-2)]

def snail(array: List[List[int]]) -> List[int]:
    n = len(array)
    assert all(len(a) == n for a in array)
    return list(array[i][j] for i,j in squares(n))

array = [[1,2,3],
         [4,5,6],
         [7,8,9]]
print (snail(array))

I made some errors getting the ranges to not overlap, but they were very easy to debug by just examining the output.

With Python 3, using type annotations (like above) can help keep your thoughts clear, and Mypy can check the annotations and frequently spot bugs in your code before you even try to run it.

I noticed just now that the code above only does square arrays rather than arbitrary rectangular ones that might have been expected. However, that should be straightforwardly fixable. 173.228.123.190 (talk) 12:07, 21 December 2019 (UTC)
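For what it's worth, here is one hedged sketch of that rectangular generalization, essentially the boundary/offset bookkeeping 135.84.167.41 mentioned earlier. The name snail_rect is just for illustration (the kata itself only uses square, n x n arrays):

    from typing import List

    def snail_rect(grid: List[List[int]]) -> List[int]:
        """Spiral-order traversal of a rectangular (not necessarily square) grid."""
        out: List[int] = []
        if not grid or not grid[0]:
            return out
        top, bottom = 0, len(grid) - 1
        left, right = 0, len(grid[0]) - 1
        while top <= bottom and left <= right:
            out.extend(grid[top][j] for j in range(left, right + 1))        # top row, left to right
            out.extend(grid[i][right] for i in range(top + 1, bottom + 1))  # right column, downwards
            if top < bottom:
                out.extend(grid[bottom][j] for j in range(right - 1, left - 1, -1))  # bottom row, right to left
            if left < right:
                out.extend(grid[i][left] for i in range(bottom - 1, top, -1))        # left column, upwards
            top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1
        return out

    print(snail_rect([[1, 2, 3], [8, 9, 4], [7, 6, 5]]))              # [1, 2, 3, 4, 5, 6, 7, 8, 9]
    print(snail_rect([[1, 2, 3, 4], [10, 11, 12, 5], [9, 8, 7, 6]]))  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]

The two if-guards keep a remaining single row or single column from being traversed twice once the boundaries meet.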