Uname: Linux business55.web-hosting.com 4.18.0-553.lve.el8.x86_64 #1 SMP Mon May 27 15:27:34 UTC 2024 x86_64
Software: LiteSpeed
PHP version: 8.1.31 | PHP OS: Linux
Server IP: 162.213.251.212
Your IP: 18.118.30.58
User: allssztx (535) | Group: allssztx (533)
Safe Mode: OFF
Disabled Functions: NONE

name : robotparser.cpython-313.pyc
"""robotparser.py

Copyright (C) 2000  Bastian Kleineidam

You can choose between two licenses when using this package:
1) GNU GPLv2
2) PSF license for Python 2.2

The robots.txt Exclusion Protocol is implemented as specified in
http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.error
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file."""

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically."""
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time."""
        import time
        self.last_checked = time.time()

    def set_url(self, url):
        """Sets the URL referring to a robots.txt file."""
        self.url = url
        self.host, self.path = urllib.parse.urlparse(url)[1:3]

    def read(self):
        """Reads the robots.txt URL and feeds it to the parser."""
        try:
            f = urllib.request.urlopen(self.url)
        except urllib.error.HTTPError as err:
            if err.code in (401, 403):
                self.disallow_all = True
            elif err.code >= 400 and err.code < 500:
                self.allow_all = True
        else:
            raw = f.read()
            self.parse(raw.decode("utf-8").splitlines())

    def _add_entry(self, entry):
        if "*" in entry.useragents:
            # the first default entry wins
            if self.default_entry is None:
                self.default_entry = entry
        else:
            self.entries.append(entry)

    def parse(self, lines):
        """Parse the input lines from a robots.txt file.

        We allow that a user-agent: line is not preceded by
        one or more blank lines."""
        # states: 0 start, 1 saw user-agent line, 2 saw allow/disallow line
        state = 0
        entry = Entry()

        self.modified()
        for line in lines:
            if not line:
                if state == 1:
                    entry = Entry()
                    state = 0
                elif state == 2:
                    self._add_entry(entry)
                    entry = Entry()
                    state = 0
            # remove optional comment and strip line
            i = line.find('#')
            if i >= 0:
                line = line[:i]
            line = line.strip()
            if not line:
                continue
            line = line.split(':', 1)
            if len(line) == 2:
                line[0] = line[0].strip().lower()
                line[1] = urllib.parse.unquote(line[1].strip())
                if line[0] == "user-agent":
                    if state == 2:
                        self._add_entry(entry)
                        entry = Entry()
                    entry.useragents.append(line[1])
                    state = 1
                elif line[0] == "disallow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], False))
                        state = 2
                elif line[0] == "allow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], True))
                        state = 2
                elif line[0] == "crawl-delay":
                    if state != 0:
                        # verify the value is numeric before converting
                        if line[1].strip().isdigit():
                            entry.delay = int(line[1])
                        state = 2
                elif line[0] == "request-rate":
                    if state != 0:
                        numbers = line[1].split('/')
                        # check if all values are sane
                        if (len(numbers) == 2 and numbers[0].strip().isdigit()
                                and numbers[1].strip().isdigit()):
                            entry.req_rate = RequestRate(int(numbers[0]),
                                                         int(numbers[1]))
                        state = 2
                elif line[0] == "sitemap":
                    # the sitemap directive is independent of the user-agent
                    # line, so the parser state does not change
                    self.sitemaps.append(line[1])
        if state == 2:
            self._add_entry(entry)

    def can_fetch(self, useragent, url):
        """using the parsed robots.txt decide if useragent can fetch url"""
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        # Until the robots.txt file has been read or found not to exist,
        # assume that no url is allowable.
        if not self.last_checked:
            return False
        # search for the given user agent; the first match counts
        parsed_url = urllib.parse.urlparse(urllib.parse.unquote(url))
        url = urllib.parse.urlunparse(('', '', parsed_url.path,
            parsed_url.params, parsed_url.query, parsed_url.fragment))
        url = urllib.parse.quote(url)
        if not url:
            url = "/"
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.allowance(url)
        # try the default entry last
        if self.default_entry:
            return self.default_entry.allowance(url)
        # agent not found ==> access granted
        return True

    def crawl_delay(self, useragent):
        if not self.mtime():
            return None
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.delay
        if self.default_entry:
            return self.default_entry.delay
        return None

    def request_rate(self, useragent):
        if not self.mtime():
            return None
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.req_rate
        if self.default_entry:
            return self.default_entry.req_rate
        return None

    def site_maps(self):
        if not self.sitemaps:
            return None
        return self.sitemaps

    def __str__(self):
        entries = self.entries
        if self.default_entry is not None:
            entries = entries + [self.default_entry]
        return '\n\n'.join(map(str, entries))


class RuleLine:
    """A rule line is a single "Allow:" (allowance==True) or "Disallow:"
    (allowance==False) followed by a path."""

    def __init__(self, path, allowance):
        if path == '' and not allowance:
            # an empty value means allow all
            allowance = True
        path = urllib.parse.urlunparse(urllib.parse.urlparse(path))
        self.path = urllib.parse.quote(path)
        self.allowance = allowance

    def applies_to(self, filename):
        return self.path == "*" or filename.startswith(self.path)

    def __str__(self):
        return ("Allow" if self.allowance else "Disallow") + ": " + self.path


class Entry:
    """An entry has one or more user-agents and zero or more rulelines"""

    def __init__(self):
        self.useragents = []
        self.rulelines = []
        self.delay = None
        self.req_rate = None

    def __str__(self):
        ret = []
        for agent in self.useragents:
            ret.append(f"User-agent: {agent}")
        if self.delay is not None:
            ret.append(f"Crawl-delay: {self.delay}")
        if self.req_rate is not None:
            rate = self.req_rate
            ret.append(f"Request-rate: {rate.requests}/{rate.seconds}")
        ret.extend(map(str, self.rulelines))
        return '\n'.join(ret)

    def applies_to(self, useragent):
        """check if this entry applies to the specified agent"""
        # split the name token and make it lower case
        useragent = useragent.split("/")[0].lower()
        for agent in self.useragents:
            if agent == '*':
                # we have the catch-all agent
                return True
            agent = agent.lower()
            if agent in useragent:
                return True
        return False

    def allowance(self, filename):
        """Preconditions:
        - our agent applies to this entry
        - filename is URL decoded"""
        for line in self.rulelines:
            if line.applies_to(filename):
                return line.allowance
        return True
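The file listed above is the compiled form of the standard-library urllib.robotparser module. As a minimal usage sketch (the robots.txt body, bot name, and URLs below are made-up examples, not taken from this server), parse() can be fed lines directly, which avoids any network fetch:

```python
# Minimal usage sketch for the stdlib urllib.robotparser module.
# The robots.txt body, bot name, and URLs are illustrative examples only.
import urllib.robotparser

robots_txt = """\
User-agent: *
Allow: /private/public-note.html
Disallow: /private/
Crawl-delay: 5
Sitemap: https://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())   # feed lines directly; no network fetch

print(rp.can_fetch("ExampleBot", "https://example.com/index.html"))
print(rp.can_fetch("ExampleBot", "https://example.com/private/data.csv"))
# Rule lines are matched first-to-last, so the more specific Allow line
# must precede the broader Disallow line to take effect:
print(rp.can_fetch("ExampleBot", "https://example.com/private/public-note.html"))
print(rp.crawl_delay("ExampleBot"))
print(rp.site_maps())
```

Because the "User-agent: *" group becomes the parser's default entry, it answers for any bot name that has no more specific group of its own.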