# Source reconstructed from: /usr/lib/python3/dist-packages/pygments/lexers/__pycache__/special.cpython-312.pyc
"""
    pygments.lexers.special
    ~~~~~~~~~~~~~~~~~~~~~~~

    Special lexers.

    :copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

import ast

from pygments.lexer import Lexer, line_re
from pygments.token import Token, Error, Text, Generic
from pygments.util import get_choice_opt

__all__ = ['TextLexer', 'OutputLexer', 'RawTokenLexer']


class TextLexer(Lexer):
    """
    "Null" lexer, doesn't highlight anything.
    """
    name = 'Text only'
    aliases = ['text']
    filenames = ['*.txt']
    mimetypes = ['text/plain']
    priority = 0.01

    def get_tokens_unprocessed(self, text):
        yield 0, Text, text

    def analyse_text(text):
        return TextLexer.priority


class OutputLexer(Lexer):
    """
    Simple lexer that highlights everything as ``Token.Generic.Output``.

    .. versionadded:: 2.10
    """
    name = 'Text output'
    aliases = ['output']

    def get_tokens_unprocessed(self, text):
        yield 0, Generic.Output, text


_ttype_cache = {}


class RawTokenLexer(Lexer):
    """
    Recreate a token stream formatted with the `RawTokenFormatter`.

    Additional options accepted:

    `compress`
        If set to ``"gz"`` or ``"bz2"``, decompress the token stream with
        the given compression algorithm before lexing (default: ``""``).
    """
    name = 'Raw token data'
    aliases = []
    filenames = []
    mimetypes = ['application/x-pygments-tokens']

    def __init__(self, **options):
        self.compress = get_choice_opt(options, 'compress',
                                       ['', 'none', 'gz', 'bz2'], '')
        Lexer.__init__(self, **options)

    def get_tokens(self, text):
        if self.compress:
            if isinstance(text, str):
                text = text.encode('latin1')
            try:
                if self.compress == 'gz':
                    import gzip
                    text = gzip.decompress(text)
                elif self.compress == 'bz2':
                    import bz2
                    text = bz2.decompress(text)
            except OSError:
                yield Error, text.decode('latin1')
        if isinstance(text, bytes):
            text = text.decode('latin1')
        # Normalize trailing newlines before re-tokenizing.
        text = text.strip('\n') + '\n'
        for i, t, v in self.get_tokens_unprocessed(text):
            yield t, v

    def get_tokens_unprocessed(self, text):
        length = 0
        for match in line_re.finditer(text):
            try:
                ttypestr, val = match.group().rstrip().split('\t', 1)
                ttype = _ttype_cache.get(ttypestr)
                if not ttype:
                    ttype = Token
                    ttypes = ttypestr.split('.')[1:]
                    for ttype_ in ttypes:
                        if not ttype_ or not ttype_[0].isupper():
                            raise ValueError('malformed token name')
                        ttype = getattr(ttype, ttype_)
                    _ttype_cache[ttypestr] = ttype
                val = ast.literal_eval(val)
                if not isinstance(val, str):
                    raise ValueError('expected str')
            except (SyntaxError, ValueError):
                # Fall back to emitting the whole line as an error token.
                val = match.group()
                ttype = Error
            yield length, ttype, val
            length += len(val)
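As a usage sketch, `RawTokenLexer` is the inverse of Pygments' `RawTokenFormatter`: the formatter serializes a token stream as tab-separated `Token.Type\t'value'` lines, and this lexer parses those lines back into `(tokentype, value)` pairs. The snippet below is illustrative (assuming Pygments is installed; the sample `code` string is arbitrary):

```python
from pygments import highlight
from pygments.formatters import RawTokenFormatter
from pygments.lexers.python import PythonLexer
from pygments.lexers.special import RawTokenLexer

code = "print('hi')\n"

# Serialize a token stream: RawTokenFormatter emits one
# "Token.Type\t'value'" line per token, as bytes.
raw = highlight(code, PythonLexer(), RawTokenFormatter())

# RawTokenLexer.get_tokens accepts str or bytes and yields
# (tokentype, value) pairs parsed back from those lines.
tokens = list(RawTokenLexer().get_tokens(raw))

# Concatenating the values reproduces the original text.
assert ''.join(v for _, v in tokens) == code
```

Note that `get_tokens` (not `get_tokens_unprocessed`) decodes bytes as Latin-1 and optionally decompresses first, which is why the raw formatter output can be passed in directly.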