yggscr package

Submodules
class yggscr.sbrowser.SBrowser(log, scraper=None, browser=None, proxy=None, **kwargs)
    Bases: object

    General scraper browser
    proxify(https_proxy=None)
        Sets an HTTPS-only proxy:
        - https://user:pass@host:port : HTTP proxy
        - socks5h://user:pass@host:port : SOCKS proxy (remote DNS resolution)
        - socks5://user:pass@host:port : SOCKS proxy (local DNS resolution)
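A minimal usage sketch of the above; it assumes a standard logging.Logger satisfies the log parameter, and the proxy host and port are placeholders:

    import logging
    from yggscr.sbrowser import SBrowser

    log = logging.getLogger("yggscr")   # any standard logger; the name is arbitrary
    browser = SBrowser(log)             # general scraper browser, no proxy yet

    # Route traffic through a SOCKS proxy with remote DNS resolution
    # (host and port below are placeholders).
    browser.proxify("socks5h://127.0.0.1:9050")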
class yggscr.shell.YggShell(log=None, ygg_browser=None, **kwargs)
    Bases: cmd2.cmd2.Cmd

    Ygg command line interface
    do_proxify(line)
        Sets or resets proxy settings.
        Usage: proxify [xxx://user:pass@host:port]
        - proxify with no argument clears the proxy
        - proxify http://1.1.1.1:8080 selects an HTTP proxy
        - proxify socks5h://1.1.1.1 selects a SOCKS proxy with remote DNS
        - proxify socks5://1.1.1.1 selects a SOCKS proxy with local DNS
    do_search_torrents(line)
        Searches torrents.
        Usage: search_torrents q:<pattern> [c:<category>] [s:<subcategory>] [s:<subcategory>] [opt1:val] [opt2:val1] [opt2:val2] [d:False] [n:3]
        Any other option is passed unchanged to the web server.
        d enables detail mode, which fetches each torrent URL for more details.
        n is the number of torrents to display (all by default).
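A hedged sketch of driving the shell programmatically rather than interactively; it assumes the default constructor arguments are usable, relies on cmd2's onecmd_plus_hooks entry point, and uses placeholder proxy and search values:

    from yggscr.shell import YggShell

    shell = YggShell()   # log and ygg_browser default to None

    # onecmd_plus_hooks runs a single command line non-interactively;
    # cmdloop() would start an interactive session instead.
    shell.onecmd_plus_hooks("proxify socks5h://127.0.0.1:9050")
    shell.onecmd_plus_hooks("search_torrents q:debian n:3 d:False")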
class yggscr.shout.ShoutMessage(shout, soup=None, mtime=None, user=None, group=None, message=None)
    Bases: object

class yggscr.shout.YggShout(log, robs=None, debug=False, irc=False, colour=False)
    Bases: object
class yggscr.torrents.Torrent(torrent_title, torrent_comm, torrent_age, torrent_size, torrent_completed, torrent_seed, torrent_leech, href, log=None, thref=None, tid=None, cat=None, subcat=None, uploader=None)
    Bases: object
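For illustration only, a Torrent built directly from the documented constructor; the field values, their types, and their meanings are inferred from the parameter names and are not taken from the library:

    from yggscr.torrents import Torrent

    t = Torrent(
        torrent_title="Some Linux ISO",   # all values below are made up
        torrent_comm=4,
        torrent_age="2 days",
        torrent_size="1.9GB",
        torrent_completed=120,
        torrent_seed=35,
        torrent_leech=2,
        href="https://example.invalid/torrent/12345",
    )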
class yggscr.ygg.YggBrowser(log=None, scraper=None, browser=None, proxy=None)
    Bases: yggscr.sbrowser.SBrowser

    Ygg scraper
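Since YggBrowser extends SBrowser, the proxify call shown earlier applies here as well; a brief sketch, assuming the no-argument constructor is usable and using a placeholder proxy URL:

    from yggscr.ygg import YggBrowser

    ygg = YggBrowser()                        # log, scraper, browser, proxy all default to None
    ygg.proxify("socks5h://127.0.0.1:9050")   # inherited from SBrowser; placeholder proxy URL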
yggscr.ylogging: Logger object. As with logging, only a single instance is needed and enforced.

yggscr.ylogging.add_stdout_handler(logger)
    If no handlers are already registered, add a formatted stream handler at level WARN.
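A small sketch of wiring up logging with this helper; the logger name and the message are arbitrary:

    import logging
    from yggscr.ylogging import add_stdout_handler

    logger = logging.getLogger("yggscr")    # logger name is arbitrary here
    add_stdout_handler(logger)              # adds a formatted stream handler at WARN if none is registered
    logger.warning("no proxy configured")   # illustrative message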
Module contents

Yggtorrent scraper library: web server, RSS, shell.

Copyright: © 2018-2019, Laurent Kislaire.
License: ISC (see /LICENSE).