Which package should I use to fetch and open a URL?
I am using Python 3.5, and there seem to be four versions:

urllib2
urllib3
urllib4
urllib5

Common sense tells me to use the latest version, but I am not sure
whether my common sense is fooling me here ;-)

Then there is yet another package, alongside a dozen other
urllib-related packages (such as aiourllib). I thought this one does
what I need:

urllib.request
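
For what it's worth, here is a minimal sketch of how I understand
urllib.request is supposed to be used (https://example.com is just a
placeholder URL):

import urllib.request

# Fetch the page and decode the response bytes into text
with urllib.request.urlopen("https://example.com") as response:
    html = response.read().decode("utf-8")

print(html[:200])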

I found urllib.request mentioned on http://docs.python-requests.org,
along with these encouraging words:

"Warning: Recreational use of the Python standard library for HTTP may
result in dangerous side-effects, including: security vulnerabilities,
verbose code, reinventing the wheel, constantly reading documentation,
depression, headaches, or even death."
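
If I read that page correctly, the third-party requests package
(installed with pip install requests) does the same fetch with less
ceremony; this is only a rough sketch based on its documented API:

import requests

# requests wraps the HTTP call in a friendlier interface
response = requests.get("https://example.com")
response.raise_for_status()  # raise an error on 4xx/5xx status codes
print(response.text[:200])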

How do I know where to find the right package - on python.org or elsewhere?
I found some code samples that show how to use urllib.request, and now
I am trying to understand why I should use urllib.request in particular.
Would it also be possible to make requests using urllib5 or any other
version, like 2 or 3? Just trying to understand.
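
For instance, urllib3 (apparently a third-party package, installed with
pip install urllib3) seems to work along these lines, though I have no
idea whether that is the recommended approach:

import urllib3

# Despite the stdlib-sounding name, urllib3 is a separate package on PyPI
http = urllib3.PoolManager()
response = http.request("GET", "https://example.com")
print(response.status)
print(response.data.decode("utf-8")[:200])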

I am lost here. Feedback appreciated. Thank you!

BTW, here is some (working) example code I have been using for
educational purposes:

import urllib.request
from bs4 import BeautifulSoup

# Fetch the page and hand the HTML to BeautifulSoup for parsing
theurl = "https://twitter.com/rafaelknuth"
thepage = urllib.request.urlopen(theurl)
soup = BeautifulSoup(thepage, "html.parser")

print(soup.title.text)

# Print each tweet's text, numbered from 1
for i, tweet in enumerate(soup.find_all("div", {"class": "content"}), start=1):
    print(i)
    print(tweet.find("p").text)

I am assuming there are different solutions for fetching and opening URLs?
Or is the above the only viable one?