merge heads.

Gregory P. Smith 2011-06-04 23:05:19 -07:00
commit d64b2bae9b
223 changed files with 11831 additions and 3529 deletions


@ -75,6 +75,7 @@ a69a031ac1402dede8b1ef80096436bca6d371f3 v3.1
960efa327c5d9c18df995437b0ac550cb89c9f85 v3.1.2
d18e9d71f369d8211f6ac87252c6d3211f9bd09f v3.1.3rc1
a4f75773c0060cee38b0bb651a7aba6f56b0e996 v3.1.3
32fcb9e94985cb19ce37ba9543f091c0dbe9d7dd v3.1.4rc1
b37b7834757492d009b99cf0ca4d42d2153d7fac v3.2a1
56d4373cecb73c8b45126ba7b045b3c7b3f94b0b v3.2a2
da012d9a2c23d144e399d2e01a55b8a83ad94573 v3.2a3


@ -55,7 +55,7 @@ as much as it can.
Return the referenced object from a weak reference, *ref*. If the referent is
no longer live, returns :const:`Py_None`.
.. warning::
.. note::
This function returns a **borrowed reference** to the referenced object.
This means that you should always call :c:func:`Py_INCREF` on the object


@ -11,7 +11,7 @@
library/index.rst
extending/index.rst
c-api/index.rst
distutils/index.rst
packaging/index.rst
install/index.rst
documenting/index.rst
howto/index.rst


@ -21,7 +21,7 @@ setup script). Indirectly provides the :class:`distutils.dist.Distribution` and
.. function:: setup(arguments)
The basic do-everything function that does most everything you could ever ask
for from a Distutils method. See XXXXX
for from a Distutils method.
The setup function takes a large number of arguments. These are laid out in the
following table.
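For orientation, a minimal call looks like this sketch (the project name and
module list are placeholders)::

    from distutils.core import setup

    setup(name='example',
          version='1.0',
          py_modules=['example'])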
@ -147,11 +147,11 @@ setup script). Indirectly provides the :class:`distutils.dist.Distribution` and
In addition, the :mod:`distutils.core` module exposed a number of classes that
live elsewhere.
* :class:`Extension` from :mod:`distutils.extension`
* :class:`~distutils.extension.Extension` from :mod:`distutils.extension`
* :class:`Command` from :mod:`distutils.cmd`
* :class:`~distutils.cmd.Command` from :mod:`distutils.cmd`
* :class:`Distribution` from :mod:`distutils.dist`
* :class:`~distutils.dist.Distribution` from :mod:`distutils.dist`
A short description of each of these follows, but see the relevant module for
the full reference.
@ -1678,8 +1678,8 @@ lines, and joining lines with backslashes.
===================================================================
.. module:: distutils.cmd
:synopsis: This module provides the abstract base class Command. This class is subclassed
by the modules in the distutils.command subpackage.
:synopsis: This module provides the abstract base class Command. This class
is subclassed by the modules in the distutils.command subpackage.
This module supplies the abstract base class :class:`Command`.
@ -1689,20 +1689,84 @@ This module supplies the abstract base class :class:`Command`.
Abstract base class for defining command classes, the "worker bees" of the
Distutils. A useful analogy for command classes is to think of them as
subroutines with local variables called *options*. The options are declared in
:meth:`initialize_options` and defined (given their final values) in
:meth:`finalize_options`, both of which must be defined by every command class.
The distinction between the two is necessary because option values might come
from the outside world (command line, config file, ...), and any options
dependent on other options must be computed after these outside influences have
been processed --- hence :meth:`finalize_options`. The body of the subroutine,
where it does all its work based on the values of its options, is the
:meth:`run` method, which must also be implemented by every command class.
subroutines with local variables called *options*. The options are declared
in :meth:`initialize_options` and defined (given their final values) in
:meth:`finalize_options`, both of which must be defined by every command
class. The distinction between the two is necessary because option values
might come from the outside world (command line, config file, ...), and any
options dependent on other options must be computed after these outside
influences have been processed --- hence :meth:`finalize_options`. The body
of the subroutine, where it does all its work based on the values of its
options, is the :meth:`run` method, which must also be implemented by every
command class.
The class constructor takes a single argument *dist*, a :class:`Distribution`
The class constructor takes a single argument *dist*, a :class:`Distribution`
instance.
Creating a new Distutils command
================================
This section outlines the steps to create a new Distutils command.
A new command lives in a module in the :mod:`distutils.command` package. There
is a sample template in that directory called :file:`command_template`. Copy
this file to a new module with the same name as the new command you're
implementing. This module should implement a class with the same name as the
module (and the command). So, for instance, to create the command
``peel_banana`` (so that users can run ``setup.py peel_banana``), you'd copy
:file:`command_template` to :file:`distutils/command/peel_banana.py`, then edit
it so that it's implementing the class :class:`peel_banana`, a subclass of
:class:`distutils.cmd.Command`.
Subclasses of :class:`Command` must define the following methods.
.. method:: Command.initialize_options()
Set default values for all the options that this command supports. Note that
these defaults may be overridden by other commands, by the setup script, by
config files, or by the command-line. Thus, this is not the place to code
dependencies between options; generally, :meth:`initialize_options`
implementations are just a bunch of ``self.foo = None`` assignments.
.. method:: Command.finalize_options()
Set final values for all the options that this command supports. This is
always called as late as possible, i.e. after any option assignments from the
command-line or from other commands have been done. Thus, this is the place
to code option dependencies: if *foo* depends on *bar*, then it is safe to
set *foo* from *bar* as long as *foo* still has the same value it was
assigned in :meth:`initialize_options`.
.. method:: Command.run()
A command's raison d'etre: carry out the action it exists to perform, controlled
by the options initialized in :meth:`initialize_options`, customized by other
commands, the setup script, the command-line, and config files, and finalized in
:meth:`finalize_options`. All terminal output and filesystem interaction should
be done by :meth:`run`.
.. attribute:: Command.sub_commands
*sub_commands* formalizes the notion of a "family" of commands,
e.g. ``install`` as the parent with sub-commands ``install_lib``,
``install_headers``, etc. The parent of a family of commands defines
*sub_commands* as a class attribute; it's a list of 2-tuples ``(command_name,
predicate)``, with *command_name* a string and *predicate* a function, a
string or ``None``. *predicate* is a method of the parent command that
determines whether the corresponding command is applicable in the current
situation. (E.g. ``install_headers`` is only applicable if we have any C
header files to install.) If *predicate* is ``None``, that command is always
applicable.
*sub_commands* is usually defined at the *end* of a class, because
predicates can be methods of the class, so they must already have been
defined. The canonical example is the :command:`install` command.
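Putting the pieces above together, a minimal sketch of the ``peel_banana``
command described earlier might look like this (the option and output are
illustrative only)::

    from distutils.cmd import Command

    class peel_banana(Command):
        """(illustrative) a new command implemented as described above"""

        description = "peel a banana"
        user_options = [('flavor=', None, "which flavor of banana to peel")]

        def initialize_options(self):
            # defaults only; no dependencies between options here
            self.flavor = None

        def finalize_options(self):
            # compute anything that depends on other options or commands
            if self.flavor is None:
                self.flavor = 'plain'

        def run(self):
            # all real work, terminal output and filesystem access happen here
            self.announce("peeling a %s banana" % self.flavor)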
:mod:`distutils.command` --- Individual Distutils commands
==========================================================
@ -1750,7 +1814,7 @@ This module supplies the abstract base class :class:`Command`.
.. module:: distutils.command.bdist_msi
:synopsis: Build a binary distribution as a Windows MSI file
.. class:: bdist_msi(Command)
.. class:: bdist_msi
Builds a `Windows Installer`_ (.msi) binary package.
@ -1829,9 +1893,9 @@ This module supplies the abstract base class :class:`Command`.
:synopsis: Build the .py/.pyc files of a package
.. class:: build_py(Command)
.. class:: build_py
.. class:: build_py_2to3(build_py)
.. class:: build_py_2to3
Alternative implementation of build_py which also runs the
2to3 conversion library on each .py file that is going to be
@ -1942,6 +2006,7 @@ This is described in more detail in :pep:`301`.
.. % todo
:mod:`distutils.command.check` --- Check the meta-data of a package
===================================================================
@ -1954,63 +2019,3 @@ For example, it verifies that all required meta-data are provided as
the arguments passed to the :func:`setup` function.
.. % todo
Creating a new Distutils command
================================
This section outlines the steps to create a new Distutils command.
A new command lives in a module in the :mod:`distutils.command` package. There
is a sample template in that directory called :file:`command_template`. Copy
this file to a new module with the same name as the new command you're
implementing. This module should implement a class with the same name as the
module (and the command). So, for instance, to create the command
``peel_banana`` (so that users can run ``setup.py peel_banana``), you'd copy
:file:`command_template` to :file:`distutils/command/peel_banana.py`, then edit
it so that it's implementing the class :class:`peel_banana`, a subclass of
:class:`distutils.cmd.Command`.
Subclasses of :class:`Command` must define the following methods.
.. method:: Command.initialize_options()
Set default values for all the options that this command supports. Note that
these defaults may be overridden by other commands, by the setup script, by
config files, or by the command-line. Thus, this is not the place to code
dependencies between options; generally, :meth:`initialize_options`
implementations are just a bunch of ``self.foo = None`` assignments.
.. method:: Command.finalize_options()
Set final values for all the options that this command supports. This is
always called as late as possible, ie. after any option assignments from the
command-line or from other commands have been done. Thus, this is the place
to to code option dependencies: if *foo* depends on *bar*, then it is safe to
set *foo* from *bar* as long as *foo* still has the same value it was
assigned in :meth:`initialize_options`.
.. method:: Command.run()
A command's raison d'etre: carry out the action it exists to perform, controlled
by the options initialized in :meth:`initialize_options`, customized by other
commands, the setup script, the command-line, and config files, and finalized in
:meth:`finalize_options`. All terminal output and filesystem interaction should
be done by :meth:`run`.
*sub_commands* formalizes the notion of a "family" of commands, eg. ``install``
as the parent with sub-commands ``install_lib``, ``install_headers``, etc. The
parent of a family of commands defines *sub_commands* as a class attribute; it's
a list of 2-tuples ``(command_name, predicate)``, with *command_name* a string
and *predicate* a function, a string or None. *predicate* is a method of
the parent command that determines whether the corresponding command is
applicable in the current situation. (Eg. we ``install_headers`` is only
applicable if we have any C header files to install.) If *predicate* is None,
that command is always applicable.
*sub_commands* is usually defined at the \*end\* of a class, because predicates
can be methods of the class, so they must already have been defined. The
canonical example is the :command:`install` command.


@ -15,8 +15,8 @@ want to modify existing commands; many simply add a few file extensions that
should be copied into packages in addition to :file:`.py` files as a
convenience.
Most distutils command implementations are subclasses of the :class:`Command`
class from :mod:`distutils.cmd`. New commands may directly inherit from
Most distutils command implementations are subclasses of the
:class:`distutils.cmd.Command` class. New commands may directly inherit from
:class:`Command`, while replacements often derive from :class:`Command`
indirectly, directly subclassing the command they are replacing. Commands are
required to derive from :class:`Command`.


@ -14,6 +14,10 @@ the module developer's point of view, describing how to use the Distutils to
make Python modules and extensions easily available to a wider audience with
very little overhead for build/release/install mechanics.
.. deprecated:: 3.3
:mod:`packaging` replaces Distutils. See :ref:`packaging-index` and
:ref:`packaging-install-index`.
.. toctree::
:maxdepth: 2
:numbered:
@ -29,3 +33,10 @@ very little overhead for build/release/install mechanics.
extending.rst
commandref.rst
apiref.rst
Another document describes how to install modules and extensions packaged
following the above guidelines:
.. toctree::
install.rst

Doc/distutils/install.rst (new file, 1005 lines): diff suppressed because it is too large.


@ -14,6 +14,7 @@ Using make
Luckily, a Makefile has been prepared so that on Unix, provided you have
installed Python and Subversion, you can just run ::
cd Doc
make html
to check out the necessary toolset in the :file:`tools/` subdirectory and build


@ -136,7 +136,7 @@ Good example (establishing confident knowledge in the effective use of the langu
Economy of Expression
---------------------
More documentation is not necessarily better documentation. Error on the side
More documentation is not necessarily better documentation. Err on the side
of being succinct.
It is an unfortunate fact that making documentation longer can be an impediment
@ -198,7 +198,7 @@ Audience
The tone of the tutorial (and all the docs) needs to be respectful of the
reader's intelligence. Don't presume that the readers are stupid. Lay out the
relevant information, show motivating use cases, provide glossary links, and do
our best to connect-the-dots, but don't talk down to them or waste their time.
your best to connect-the-dots, but don't talk down to them or waste their time.
The tutorial is meant for newcomers, many of whom will be using the tutorial to
evaluate the language as a whole. The experience needs to be positive and not


@ -247,7 +247,7 @@ Glossary
processing, remembering the location execution state (including local
variables and pending try-statements). When the generator resumes, it
picks-up where it left-off (in contrast to functions which start fresh on
every invocation.
every invocation).
.. index:: single: generator expression


@ -23,8 +23,8 @@ It's not really a tutorial - you'll still have work to do in getting things
working. It doesn't cover the fine points (and there are a lot of them), but I
hope it will give you enough background to begin using them decently.
I'm only going to talk about INET sockets, but they account for at least 99% of
the sockets in use. And I'll only talk about STREAM sockets - unless you really
I'm only going to talk about INET (i.e. IPv4) sockets, but they account for at least 99% of
the sockets in use. And I'll only talk about STREAM (i.e. TCP) sockets - unless you really
know what you're doing (in which case this HOWTO isn't for you!), you'll get
better behavior and performance from a STREAM socket than anything else. I will
try to clear up the mystery of what a socket is, as well as some hints on how to
@ -208,10 +208,10 @@ length message::
totalsent = totalsent + sent
def myreceive(self):
msg = ''
msg = b''
while len(msg) < MSGLEN:
chunk = self.sock.recv(MSGLEN-len(msg))
if chunk == '':
if chunk == b'':
raise RuntimeError("socket connection broken")
msg = msg + chunk
return msg
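For reference, the matching send loop from the same example class looks
roughly like this (``MSGLEN`` and ``self.sock`` are assumed from the
surrounding class, as above)::

    def mysend(self, msg):
        totalsent = 0
        while totalsent < MSGLEN:
            sent = self.sock.send(msg[totalsent:])
            if sent == 0:
                raise RuntimeError("socket connection broken")
            totalsent = totalsent + sent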

File diff suppressed because it is too large.

Doc/install/install.rst (new file, 1029 lines): diff suppressed because it is too large.


@ -0,0 +1,44 @@
.. _packaging-pysetup-config:
=====================
Pysetup Configuration
=====================
Pysetup supports two configuration files: :file:`.pypirc` and :file:`packaging.cfg`.
.. FIXME integrate with configfile instead of duplicating
Configuring indexes
-------------------
You can configure additional indexes in :file:`.pypirc` to be used for index-related
operations. By default, all configured index-servers and package-servers will be used
in an additive fashion. To limit operations to specific indexes, use the :option:`--index`
and :option:`--package-server` options::
$ pysetup install --index pypi --package-server django some.project
Adding indexes to :file:`.pypirc`::
[packaging]
index-servers =
pypi
other
package-servers =
django
[pypi]
repository: <repository-url>
username: <username>
password: <password>
[other]
repository: <repository-url>
username: <username>
password: <password>
[django]
repository: <repository-url>
username: <username>
password: <password>


@ -0,0 +1,61 @@
.. _packaging-pysetup-servers:
===============
Package Servers
===============
Pysetup supports installing Python packages from *Package Servers* in addition
to PyPI indexes and mirrors.
Package Servers are simple directory listings of Python distributions. Directories
can be served via HTTP or a local file system. This is useful when you want to
dump source distributions in a directory and not worry about the full index structure.
Serving distributions from Apache
---------------------------------
::
$ mkdir -p /var/www/html/python/distributions
$ cp *.tar.gz /var/www/html/python/distributions/
<VirtualHost python.example.org:80>
ServerAdmin webmaster@domain.com
DocumentRoot "/var/www/html/python"
ServerName python.example.org
ErrorLog logs/python.example.org-error.log
CustomLog logs/python.example.org-access.log common
Options Indexes FollowSymLinks MultiViews
DirectoryIndex index.html index.htm
<Directory "/var/www/html/python/distributions">
Options Indexes FollowSymLinks MultiViews
Order allow,deny
Allow from all
</Directory>
</VirtualHost>
Add the Apache based distribution server to :file:`.pypirc`::
[packaging]
package-servers =
apache
[apache]
repository: http://python.example.org/distributions/
Serving distributions from a file system
----------------------------------------
::
$ mkdir -p /data/python/distributions
$ cp *.tar.gz /data/python/distributions/
Add the directory to :file:`.pypirc`::
[packaging]
package-servers =
local
[local]
repository: file:///data/python/distributions/

Doc/install/pysetup.rst (new file, 163 lines)

@ -0,0 +1,163 @@
.. _packaging-pysetup:
================
Pysetup Tutorial
================
Getting started
---------------
Pysetup is a simple script that supports the following features:
- install, remove, list, and verify Python packages;
- search for available packages on PyPI or any *Simple Index*;
- verify installed packages (md5sum, installed files, version).
Finding out what's installed
----------------------------
Pysetup makes it easy to find out what Python packages are installed::
$ pysetup search virtualenv
virtualenv 1.6 at /opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info
$ pysetup search --all
pyverify 0.8.1 at /opt/python3.3/lib/python3.3/site-packages/pyverify-0.8.1.dist-info
virtualenv 1.6 at /opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info
wsgiref 0.1.2 at /opt/python3.3/lib/python3.3/wsgiref.egg-info
...
Installing a distribution
-------------------------
Pysetup can install a Python project from the following sources:
- PyPI and Simple Indexes;
- source directories containing a valid :file:`setup.py` or :file:`setup.cfg`;
- distribution source archives (:file:`project-1.0.tar.gz`, :file:`project-1.0.zip`);
- HTTP (http://host/packages/project-1.0.tar.gz).
Installing from PyPI and Simple Indexes::
$ pysetup install project
$ pysetup install project==1.0
Installing from a distribution source archive::
$ pysetup install project-1.0.tar.gz
Installing from a source directory containing a valid :file:`setup.py` or
:file:`setup.cfg`::
$ cd path/to/source/directory
$ pysetup install
$ pysetup install path/to/source/directory
Installing from HTTP::
$ pysetup install http://host/packages/project-1.0.tar.gz
Retrieving metadata
-------------------
You can gather metadata from two sources, a project's source directory or an
installed distribution. The `pysetup metadata` command can retrieve one or
more metadata fields using the `-f` option and a metadata field as the
argument. ::
$ pysetup metadata virtualenv -f version -f name
Version:
1.6
Name:
virtualenv
$ pysetup metadata virtualenv --all
Metadata-Version:
1.0
Name:
virtualenv
Version:
1.6
Platform:
UNKNOWN
Summary:
Virtual Python Environment builder
...
.. seealso::
There are three metadata versions: 1.0, 1.1 and 1.2. The following PEPs
describe the field names, their semantics and their usage: :PEP:`241` for 1.0,
:PEP:`314` for 1.1, and :PEP:`345` for 1.2.
Removing a distribution
-----------------------
You can remove one or more installed distributions using the `pysetup remove`
command::
$ pysetup remove virtualenv
removing 'virtualenv':
/opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info/dependency_links.txt
/opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info/entry_points.txt
/opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info/not-zip-safe
/opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info/PKG-INFO
/opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info/SOURCES.txt
/opt/python3.3/lib/python3.3/site-packages/virtualenv-1.6-py3.3.egg-info/top_level.txt
Proceed (y/n)? y
success: removed 6 files and 1 dirs
The optional '-y' argument auto-confirms, skipping the confirmation prompt::
$ pysetup remove virtualenv -y
Getting help
------------
All pysetup actions accept the `-h` and `--help` options, which print the command's
help string to stdout. ::
$ pysetup remove -h
Usage: pysetup remove dist [-y]
or: pysetup remove --help
Uninstall a Python package.
positional arguments:
dist installed distribution name
optional arguments:
-y auto confirm package removal
Getting a list of all pysetup actions and global options::
$ pysetup --help
Usage: pysetup [options] action [action_options]
Actions:
run: Run one or several commands
metadata: Display the metadata of a project
install: Install a project
remove: Remove a project
search: Search for a project
graph: Display a graph
create: Create a Project
To get more help on an action, use:
pysetup action --help
Global options:
--verbose (-v) run verbosely (default)
--quiet (-q) run quietly (turns verbosity off)
--dry-run (-n) don't actually do anything
--help (-h) show detailed help message
--no-user-cfg ignore pydistutils.cfg in your home directory
--version Display the version


@ -1,3 +1,5 @@
.. _abstract-base-classes:
:mod:`abc` --- Abstract Base Classes
====================================
@ -12,7 +14,7 @@
--------------
This module provides the infrastructure for defining an :term:`abstract base
class` (ABCs) in Python, as outlined in :pep:`3119`; see the PEP for why this
class` (ABC) in Python, as outlined in :pep:`3119`; see the PEP for why this
was added to Python. (See also :pep:`3141` and the :mod:`numbers` module
regarding a type hierarchy for numbers based on ABCs.)


@ -37,14 +37,18 @@ All of the classes in this module may safely be accessed from multiple threads.
*fileobj*), or operate directly on a named file (named by *filename*).
Exactly one of these two parameters should be provided.
The *mode* argument can be either ``'r'`` for reading (default), or ``'w'``
for writing.
The *mode* argument can be either ``'r'`` for reading (default), ``'w'`` for
overwriting, or ``'a'`` for appending. If *fileobj* is provided, a mode of
``'w'`` does not truncate the file, and is instead equivalent to ``'a'``.
The *buffering* argument is ignored. Its use is deprecated.
If *mode* is ``'w'``, *compresslevel* can be a number between ``1`` and
``9`` specifying the level of compression: ``1`` produces the least
compression, and ``9`` (default) produces the most compression.
If *mode* is ``'w'`` or ``'a'``, *compresslevel* can be a number between
``1`` and ``9`` specifying the level of compression: ``1`` produces the
least compression, and ``9`` (default) produces the most compression.
If *mode* is ``'r'``, the input file may be the concatenation of multiple
compressed streams.
:class:`BZ2File` provides all of the members specified by the
:class:`io.BufferedIOBase`, except for :meth:`detach` and :meth:`truncate`.
@ -70,6 +74,10 @@ All of the classes in this module may safely be accessed from multiple threads.
.. versionchanged:: 3.3
The *fileobj* argument to the constructor was added.
.. versionchanged:: 3.3
The ``'a'`` (append) mode was added, along with support for reading
multi-stream files.
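A short sketch of the new append mode (the filename is illustrative; requires
Python 3.3+)::

    import bz2

    with bz2.BZ2File('archive.bz2', 'w') as f:
        f.write(b'first stream\n')
    with bz2.BZ2File('archive.bz2', 'a') as f:   # append starts a new stream
        f.write(b'second stream\n')
    with bz2.BZ2File('archive.bz2', 'r') as f:   # reads both streams
        data = f.read()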
Incremental (de)compression
---------------------------
@ -106,14 +114,20 @@ Incremental (de)compression
incrementally. For one-shot compression, use the :func:`decompress` function
instead.
.. note::
This class does not transparently handle inputs containing multiple
compressed streams, unlike :func:`decompress` and :class:`BZ2File`. If
you need to decompress a multi-stream input with :class:`BZ2Decompressor`,
you must use a new decompressor for each stream.
.. method:: decompress(data)
Provide data to the decompressor object. Returns a chunk of decompressed
data if possible, or an empty byte string otherwise.
Attempting to decompress data after the end of stream is reached raises
an :exc:`EOFError`. If any data is found after the end of the stream, it
is ignored and saved in the :attr:`unused_data` attribute.
Attempting to decompress data after the end of the current stream is
reached raises an :exc:`EOFError`. If any data is found after the end of
the stream, it is ignored and saved in the :attr:`unused_data` attribute.
.. attribute:: eof
@ -127,6 +141,9 @@ Incremental (de)compression
Data found after the end of the compressed stream.
If this attribute is accessed before the end of the stream has been
reached, its value will be ``b''``.
One-shot (de)compression
------------------------
@ -145,5 +162,11 @@ One-shot (de)compression
Decompress *data*.
If *data* is the concatenation of multiple compressed streams, decompress
all of the streams.
For incremental decompression, use a :class:`BZ2Decompressor` instead.
.. versionchanged:: 3.3
Support for multi-stream inputs was added.
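A small sketch of the multi-stream behaviour described above (the data values
are arbitrary; requires Python 3.3+)::

    import bz2

    data = bz2.compress(b'spam') + bz2.compress(b'eggs')
    bz2.decompress(data)              # b'spameggs': both streams are handled

    d = bz2.BZ2Decompressor()
    first = d.decompress(data)        # b'spam'; d.eof is now True
    rest = d.unused_data              # compressed bytes of the second stream
    second = bz2.BZ2Decompressor().decompress(rest)   # b'eggs'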


@ -458,7 +458,8 @@ define in order to be compatible with the Python codec registry.
.. method:: reset()
Reset the encoder to the initial state.
Reset the encoder to the initial state. The output is discarded: call
``.encode('', final=True)`` to reset the encoder and to get the output.
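A small sketch of the difference (with UTF-8 nothing is usually pending, but
the pattern is the same for stateful codecs)::

    import codecs

    enc = codecs.getincrementalencoder('utf-8')()
    enc.encode('spam')
    pending = enc.encode('', final=True)   # flush: reset and keep the output
    enc.reset()                            # reset and discard pending output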
.. method:: IncrementalEncoder.getstate()


@ -23,7 +23,7 @@ example, whether it is hashable or whether it is a mapping.
.. versionchanged:: 3.3
Formerly, this module was part of the :mod:`collections` module.
.. _abstract-base-classes:
.. _collections-abstract-base-classes:
Collections Abstract Base Classes
---------------------------------


@ -34,7 +34,7 @@ Python's general purpose built-in containers, :class:`dict`, :class:`list`,
===================== ====================================================================
.. versionchanged:: 3.3
Moved :ref:`abstract-base-classes` to the :mod:`collections.abc` module.
Moved :ref:`collections-abstract-base-classes` to the :mod:`collections.abc` module.
For backwards compatibility, they continue to be visible in this module
as well.


@ -29,6 +29,8 @@ this module.
Hashing Methods
---------------
.. versionadded:: 3.3
The :mod:`crypt` module defines the list of hashing methods (not all methods
are available on all platforms):
@ -37,33 +39,26 @@ are available on all platforms):
A Modular Crypt Format method with 16 character salt and 86 character
hash. This is the strongest method.
.. versionadded:: 3.3
.. data:: METHOD_SHA256
Another Modular Crypt Format method with 16 character salt and 43
character hash.
.. versionadded:: 3.3
.. data:: METHOD_MD5
Another Modular Crypt Format method with 8 character salt and 22
character hash.
.. versionadded:: 3.3
.. data:: METHOD_CRYPT
The traditional method with a 2 character salt and 13 characters of
hash. This is the weakest method.
.. versionadded:: 3.3
Module Attributes
-----------------
.. versionadded:: 3.3
.. attribute:: methods
@ -71,8 +66,6 @@ Module Attributes
``crypt.METHOD_*`` objects. This list is sorted from strongest to
weakest, and is guaranteed to have at least ``crypt.METHOD_CRYPT``.
.. versionadded:: 3.3
Module Functions
----------------
@ -108,9 +101,8 @@ The :mod:`crypt` module defines the following functions:
different sizes in the *salt*, it is recommended to use the full crypted
password as salt when checking for a password.
.. versionchanged:: 3.3
Before version 3.3, *salt* must be specified as a string and cannot
accept ``crypt.METHOD_*`` values (which don't exist anyway).
.. versionchanged:: 3.3
Accept ``crypt.METHOD_*`` values in addition to strings for *salt*.
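A brief sketch of the new *salt* handling (``METHOD_SHA512`` availability is
platform-dependent, as noted above)::

    import crypt

    hashed = crypt.crypt('s3kr3t', crypt.METHOD_SHA512)
    # equivalent explicit form using mksalt()
    hashed = crypt.crypt('s3kr3t', crypt.mksalt(crypt.METHOD_SHA512))
    assert crypt.crypt('s3kr3t', hashed) == hashed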
.. function:: mksalt(method=None)
@ -124,25 +116,27 @@ The :mod:`crypt` module defines the following functions:
16 random characters from the set ``[./a-zA-Z0-9]``, suitable for
passing as the *salt* argument to :func:`crypt`.
.. versionadded:: 3.3
.. versionadded:: 3.3
Examples
--------
A simple example illustrating typical use::
import crypt, getpass, pwd
import pwd
import crypt
import getpass
def login():
username = input('Python login:')
username = input('Python login: ')
cryptedpasswd = pwd.getpwnam(username)[1]
if cryptedpasswd:
if cryptedpasswd == 'x' or cryptedpasswd == '*':
raise "Sorry, currently no support for shadow passwords"
raise ValueError('no support for shadow passwords')
cleartext = getpass.getpass()
return crypt.crypt(cleartext, cryptedpasswd) == cryptedpasswd
else:
return 1
return True
To generate a hash of a password using the strongest available method and
check it against the original::
@ -151,4 +145,4 @@ check it against the original::
hashed = crypt.crypt(plaintext)
if hashed != crypt.crypt(plaintext, hashed):
raise "Hashed version doesn't validate against original"
raise ValueError("hashed version doesn't validate against original")

Binary file not shown (new image, 24 KiB).


@ -12,18 +12,26 @@ additional modules into a Python installation. The new modules may be either
100%-pure Python, or may be extension modules written in C, or may be
collections of Python packages which include modules coded in both Python and C.
This package is discussed in two separate chapters:
.. deprecated:: 3.3
:mod:`packaging` replaces Distutils. See :ref:`packaging-index` and
:ref:`packaging-install-index`.
User documentation and API reference are provided in another document:
.. seealso::
:ref:`distutils-index`
The manual for developers and packagers of Python modules. This describes
how to prepare :mod:`distutils`\ -based packages so that they may be
easily installed into an existing Python installation.
easily installed into an existing Python installation. It also contains
instructions for end-users wanting to install a distutils-based package,
:ref:`install-index`.
:ref:`install-index`
An "administrators" manual which includes information on installing
modules into an existing Python installation. You do not need to be a
Python programmer to read this manual.
.. trick to silence a Sphinx warning
.. toctree::
:hidden:
../distutils/index


@ -290,19 +290,18 @@ are always available. They are listed here in alphabetical order.
The resulting list is sorted alphabetically. For example:
>>> import struct
>>> dir() # doctest: +SKIP
>>> dir() # show the names in the module namespace
['__builtins__', '__doc__', '__name__', 'struct']
>>> dir(struct) # doctest: +NORMALIZE_WHITESPACE
>>> dir(struct) # show the names in the struct module
['Struct', '__builtins__', '__doc__', '__file__', '__name__',
'__package__', '_clearcache', 'calcsize', 'error', 'pack', 'pack_into',
'unpack', 'unpack_from']
>>> class Foo:
... def __dir__(self):
... return ["kan", "ga", "roo"]
...
>>> f = Foo()
>>> dir(f)
['ga', 'kan', 'roo']
>>> class Shape(object):
def __dir__(self):
return ['area', 'perimeter', 'location']
>>> s = Shape()
>>> dir(s)
['area', 'perimeter', 'location']
.. note::
@ -333,15 +332,21 @@ are always available. They are listed here in alphabetical order.
:meth:`__next__` method of the iterator returned by :func:`enumerate` returns a
tuple containing a count (from *start* which defaults to 0) and the
corresponding value obtained from iterating over *iterable*.
:func:`enumerate` is useful for obtaining an indexed series: ``(0, seq[0])``,
``(1, seq[1])``, ``(2, seq[2])``, .... For example:
>>> for i, season in enumerate(['Spring', 'Summer', 'Fall', 'Winter']):
... print(i, season)
0 Spring
1 Summer
2 Fall
3 Winter
>>> for i, season in enumerate('Spring Summer Fall Winter'.split(), start=1):
print(i, season)
1 Spring
2 Summer
3 Fall
4 Winter
Equivalent to::
def enumerate(sequence, start=0):
n = start
for elem in sequence:
yield n, elem
n += 1
.. function:: eval(expression, globals=None, locals=None)
@ -580,7 +585,7 @@ are always available. They are listed here in alphabetical order.
Two objects with non-overlapping lifetimes may have the same :func:`id`
value.
.. impl-detail:: This is the address of the object.
.. impl-detail:: This is the address of the object in memory.
.. function:: input([prompt])
@ -652,10 +657,10 @@ are always available. They are listed here in alphabetical order.
One useful application of the second form of :func:`iter` is to read lines of
a file until a certain line is reached. The following example reads a file
until ``"STOP"`` is reached: ::
until the :meth:`readline` method returns an empty string::
with open("mydata.txt") as fp:
for line in iter(fp.readline, "STOP"):
with open('mydata.txt') as fp:
for line in iter(fp.readline, ''):
process_line(line)
@ -1169,8 +1174,9 @@ are always available. They are listed here in alphabetical order.
It can be called either on the class (such as ``C.f()``) or on an instance (such
as ``C().f()``). The instance is ignored except for its class.
Static methods in Python are similar to those found in Java or C++. For a more
advanced concept, see :func:`classmethod` in this section.
Static methods in Python are similar to those found in Java or C++. Also see
:func:`classmethod` for a variant that is useful for creating alternate class
constructors.
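A sketch of that alternate-constructor idiom (the class and string format are
illustrative)::

    class Date:
        def __init__(self, year, month, day):
            self.year, self.month, self.day = year, month, day

        @classmethod
        def from_string(cls, s):       # alternate constructor
            return cls(*map(int, s.split('-')))

    d = Date.from_string('2011-06-04')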
For more information on static methods, consult the documentation on the
standard type hierarchy in :ref:`types`.
@ -1270,6 +1276,10 @@ are always available. They are listed here in alphabetical order.
references. The zero argument form automatically searches the stack frame
for the class (``__class__``) and the first argument.
For practical suggestions on how to design cooperative classes using
:func:`super`, see `guide to using super()
<http://rhettinger.wordpress.com/2011/05/26/super-considered-super/>`_.
.. function:: tuple([iterable])


@ -1019,6 +1019,19 @@ as internal buffering of data.
Availability: Unix, Windows.
.. function:: pipe2(flags=0)
Create a pipe with *flags* set atomically.
*flags* is optional and can be constructed by ORing together zero or more of
these values: :data:`O_NONBLOCK`, :data:`O_CLOEXEC`.
Return a pair of file descriptors ``(r, w)`` usable for reading and writing,
respectively.
Availability: some flavors of Unix.
.. versionadded:: 3.3
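A minimal sketch (Unix only; flag values as documented above)::

    import os

    r, w = os.pipe2(os.O_NONBLOCK | os.O_CLOEXEC)
    os.write(w, b'ping')
    print(os.read(r, 4))    # b'ping'
    os.close(r)
    os.close(w)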
.. function:: posix_fallocate(fd, offset, len)
Ensures that enough disk space is allocated for the file specified by *fd*


@ -0,0 +1,27 @@
.. temporary file for modules that don't need a dedicated file yet
:mod:`packaging.errors` --- Packaging exceptions
================================================
.. module:: packaging.errors
:synopsis: Packaging exceptions.
Provides exceptions used by the Packaging modules. Note that Packaging modules
may raise standard exceptions; in particular, SystemExit is usually raised for
errors that are obviously the end-user's fault (e.g. bad command-line arguments).
This module is safe to use in ``from ... import *`` mode; it only exports
symbols whose names start with ``Packaging`` and end with ``Error``.
:mod:`packaging.manifest` --- The Manifest class
================================================
.. module:: packaging.manifest
:synopsis: The Manifest class, used for poking about the file system and
building lists of files.
This module provides the :class:`Manifest` class, used for poking about the
filesystem and building lists of files.


@ -0,0 +1,111 @@
:mod:`packaging.command` --- Standard Packaging commands
========================================================
.. module:: packaging.command
:synopsis: Standard packaging commands.
This subpackage contains one module for each standard Packaging command, such as
:command:`build` or :command:`upload`. Each command is implemented as a
separate module, with the command name as the name of the module and of the
class defined therein.
:mod:`packaging.command.cmd` --- Abstract base class for Packaging commands
===========================================================================
.. module:: packaging.command.cmd
:synopsis: Abstract base class for commands.
This module supplies the abstract base class :class:`Command`. This class is
subclassed by the modules in the packaging.command subpackage.
.. class:: Command(dist)
Abstract base class for defining command classes, the "worker bees" of the
Packaging. A useful analogy for command classes is to think of them as
subroutines with local variables called *options*. The options are declared
in :meth:`initialize_options` and defined (given their final values) in
:meth:`finalize_options`, both of which must be defined by every command
class. The distinction between the two is necessary because option values
might come from the outside world (command line, config file, ...), and any
options dependent on other options must be computed after these outside
influences have been processed --- hence :meth:`finalize_options`. The body
of the subroutine, where it does all its work based on the values of its
options, is the :meth:`run` method, which must also be implemented by every
command class.
The class constructor takes a single argument *dist*, a
:class:`~packaging.dist.Distribution` instance.
Creating a new Packaging command
--------------------------------
This section outlines the steps to create a new Packaging command.
.. XXX the following paragraph is focused on the stdlib; expand it to document
how to write and register a command in third-party projects
A new command lives in a module in the :mod:`packaging.command` package. There
is a sample template in that directory called :file:`command_template`. Copy
this file to a new module with the same name as the new command you're
implementing. This module should implement a class with the same name as the
module (and the command). So, for instance, to create the command
``peel_banana`` (so that users can run ``setup.py peel_banana``), you'd copy
:file:`command_template` to :file:`packaging/command/peel_banana.py`, then edit
it so that it's implementing the class :class:`peel_banana`, a subclass of
:class:`Command`. It must define the following methods:
.. method:: Command.initialize_options()
Set default values for all the options that this command supports. Note that
these defaults may be overridden by other commands, by the setup script, by
config files, or by the command line. Thus, this is not the place to code
dependencies between options; generally, :meth:`initialize_options`
implementations are just a bunch of ``self.foo = None`` assignments.
.. method:: Command.finalize_options()
Set final values for all the options that this command supports. This is
always called as late as possible, i.e. after any option assignments from the
command line or from other commands have been done. Thus, this is the place
to code option dependencies: if *foo* depends on *bar*, then it is safe to
set *foo* from *bar* as long as *foo* still has the same value it was
assigned in :meth:`initialize_options`.
.. method:: Command.run()
A command's raison d'etre: carry out the action it exists to perform,
controlled by the options initialized in :meth:`initialize_options`,
customized by other commands, the setup script, the command line, and config
files, and finalized in :meth:`finalize_options`. All terminal output and
filesystem interaction should be done by :meth:`run`.
Command classes may define this attribute:
.. attribute:: Command.sub_commands
*sub_commands* formalizes the notion of a "family" of commands,
e.g. ``install_dist`` as the parent with sub-commands ``install_lib``,
``install_headers``, etc. The parent of a family of commands defines
*sub_commands* as a class attribute; it's a list of 2-tuples ``(command_name,
predicate)``, with *command_name* a string and *predicate* a function, a
string or ``None``. *predicate* is a method of the parent command that
determines whether the corresponding command is applicable in the current
situation. (E.g. ``install_headers`` is only applicable if we have any C
header files to install.) If *predicate* is ``None``, that command is always
applicable.
*sub_commands* is usually defined at the *end* of a class, because
predicates can be methods of the class, so they must already have been
defined. The canonical example is the :command:`install_dist` command.
.. XXX document how to add a custom command to another one's subcommands


@ -0,0 +1,672 @@
:mod:`packaging.compiler` --- Compiler classes
==============================================
.. module:: packaging.compiler
:synopsis: Compiler classes to build C/C++ extensions or libraries.
This subpackage contains an abstract base class representing a compiler and
concrete implementations for common compilers. The compiler classes should not
be instantiated directly, but created using the :func:`new_compiler` factory
function. Compiler types provided by Packaging are listed in
:ref:`packaging-standard-compilers`.
Public functions
----------------
.. function:: new_compiler(plat=None, compiler=None, verbose=0, dry_run=0, force=0)
Factory function to generate an instance of some
:class:`~.ccompiler.CCompiler` subclass for the requested platform or
compiler type.
If no argument is given for *plat* and *compiler*, the default compiler type
for the platform (:attr:`os.name`) will be used: ``'unix'`` for Unix and
Mac OS X, ``'msvc'`` for Windows.
If *plat* is given, it must be one of ``'posix'``, ``'darwin'`` or ``'nt'``.
An invalid value will not raise an exception but use the default compiler
type for the current platform.
.. XXX errors should never pass silently; this behavior is particularly
harmful when a compiler type is given as first argument
If *compiler* is given, *plat* will be ignored, allowing you to get for
example a ``'unix'`` compiler object under Windows or an ``'msvc'`` compiler
under Unix. However, not all compiler types can be instantiated on every
platform.
.. function:: customize_compiler(compiler)
Do any platform-specific customization of a CCompiler instance. Mainly
needed on Unix to plug in the information that varies across Unices and is
stored in CPython's Makefile.
.. function:: gen_lib_options(compiler, library_dirs, runtime_library_dirs, libraries)
Generate linker options for searching library directories and linking with
specific libraries. *libraries* and *library_dirs* are, respectively, lists
of library names (not filenames!) and search directories. Returns a list of
command-line options suitable for use with some compiler (depending on the
two format strings passed in).
.. function:: gen_preprocess_options(macros, include_dirs)
Generate C preprocessor options (:option:`-D`, :option:`-U`, :option:`-I`) as
used by at least two types of compilers: the typical Unix compiler and Visual
C++. *macros* is the usual thing, a list of 1- or 2-tuples, where ``(name,)``
means undefine (:option:`-U`) macro *name*, and ``(name, value)`` means
define (:option:`-D`) macro *name* to *value*. *include_dirs* is just a list
of directory names to be added to the header file search path (:option:`-I`).
Returns a list of command-line options suitable for either Unix compilers or
Visual C++.
.. function:: get_default_compiler(osname, platform)
Determine the default compiler to use for the given platform.
*osname* should be one of the standard Python OS names (i.e. the ones
returned by ``os.name``) and *platform* the common value returned by
``sys.platform`` for the platform in question.
The default values are ``os.name`` and ``sys.platform``.
.. function:: set_compiler(location)
Add or change a compiler
.. function:: show_compilers()
Print list of available compilers (used by the :option:`--help-compiler`
options to :command:`build`, :command:`build_ext`, :command:`build_clib`).
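A minimal usage sketch using only functions and methods documented in this
module's reference (paths and names are illustrative; see the
:class:`~.ccompiler.CCompiler` methods below)::

    from packaging.compiler import new_compiler

    cc = new_compiler()                  # default compiler for this platform
    cc.add_include_dir('include')
    objects = cc.compile(['src/spam.c'], output_dir='build')
    cc.create_static_lib(objects, 'spam', output_dir='build')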
.. _packaging-standard-compilers:
Standard compilers
------------------
Concrete subclasses of :class:`~.ccompiler.CCompiler` are provided in submodules
of the :mod:`packaging.compiler` package. You do not need to import them, using
:func:`new_compiler` is the public API to use. This table documents the
standard compilers; be aware that they can be replaced by other classes on your
platform.
=============== ======================================================== =======
name description notes
=============== ======================================================== =======
``'unix'`` typical Unix-style command-line C compiler [#]_
``'msvc'`` Microsoft compiler [#]_
``'bcpp'`` Borland C++ compiler
``'cygwin'`` Cygwin compiler (Windows port of GCC)
``'mingw32'`` Mingw32 port of GCC (same as Cygwin in no-Cygwin mode)
=============== ======================================================== =======
.. [#] The Unix compiler class assumes this behavior:
* macros defined with :option:`-Dname[=value]`
* macros undefined with :option:`-Uname`
* include search directories specified with :option:`-Idir`
* libraries specified with :option:`-llib`
* library search directories specified with :option:`-Ldir`
* compile handled by :program:`cc` (or similar) executable with
:option:`-c` option: compiles :file:`.c` to :file:`.o`
* link static library handled by :program:`ar` command (possibly with
:program:`ranlib`)
* link shared library handled by :program:`cc` :option:`-shared`
.. [#] On Windows, extension modules typically need to be compiled with the same
compiler that was used to compile CPython (for example Microsoft Visual
Studio .NET 2003 for CPython 2.4 and 2.5). The AMD64 and Itanium
binaries are created using the Platform SDK.
Under the hood, there are actually two different subclasses of
:class:`~.ccompiler.CCompiler` defined: one is compatible with MSVC 2005
and 2008, the other works with older versions. This should not be a
concern for regular use of the functions in this module.
Packaging will normally choose the right compiler, linker etc. on its
own. To override this choice, the environment variables
*DISTUTILS_USE_SDK* and *MSSdk* must both be set. *MSSdk* indicates that
the current environment has been setup by the SDK's ``SetEnv.Cmd``
script, or that the environment variables had been registered when the
SDK was installed; *DISTUTILS_USE_SDK* indicates that the user has made
an explicit choice to override the compiler selection done by Packaging.
.. TODO document the envvars in Doc/using and the man page
:mod:`packaging.compiler.ccompiler` --- CCompiler base class
============================================================
.. module:: packaging.compiler.ccompiler
:synopsis: Abstract CCompiler class.
This module provides the abstract base class for the :class:`CCompiler`
classes. A :class:`CCompiler` instance can be used for all the compile and
link steps needed to build a single project. Methods are provided to set
options for the compiler --- macro definitions, include directories, link path,
libraries and the like.
.. class:: CCompiler([verbose=0, dry_run=0, force=0])
The abstract base class :class:`CCompiler` defines the interface that must be
implemented by real compiler classes. The class also has some utility
methods used by several compiler classes.
The basic idea behind a compiler abstraction class is that each instance can
be used for all the compile/link steps in building a single project. Thus,
attributes common to all of those compile and link steps --- include
directories, macros to define, libraries to link against, etc. --- are
attributes of the compiler instance. To allow for variability in how
individual files are treated, most of those attributes may be varied on a
per-compilation or per-link basis.
The constructor for each subclass creates an instance of the Compiler object.
Flags are *verbose* (show verbose output), *dry_run* (don't actually execute
the steps) and *force* (rebuild everything, regardless of dependencies). All
of these flags default to ``0`` (off). Note that you probably don't want to
instantiate :class:`CCompiler` or one of its subclasses directly; use the
:func:`~packaging.compiler.new_compiler` factory function instead.
The following methods allow you to manually alter compiler options for the
instance of the Compiler class.
.. method:: CCompiler.add_include_dir(dir)
Add *dir* to the list of directories that will be searched for header
files. The compiler is instructed to search directories in the order in
which they are supplied by successive calls to :meth:`add_include_dir`.
.. method:: CCompiler.set_include_dirs(dirs)
Set the list of directories that will be searched to *dirs* (a list of
strings). Overrides any preceding calls to :meth:`add_include_dir`;
subsequent calls to :meth:`add_include_dir` add to the list passed to
:meth:`set_include_dirs`. This does not affect any list of standard
include directories that the compiler may search by default.
.. method:: CCompiler.add_library(libname)
Add *libname* to the list of libraries that will be included in all links
driven by this compiler object. Note that *libname* should *not* be the
name of a file containing a library, but the name of the library itself:
the actual filename will be inferred by the linker, the compiler, or the
compiler class (depending on the platform).
The linker will be instructed to link against libraries in the order they
were supplied to :meth:`add_library` and/or :meth:`set_libraries`. It is
perfectly valid to duplicate library names; the linker will be instructed
to link against libraries as many times as they are mentioned.
.. method:: CCompiler.set_libraries(libnames)
Set the list of libraries to be included in all links driven by this
compiler object to *libnames* (a list of strings). This does not affect
any standard system libraries that the linker may include by default.
.. method:: CCompiler.add_library_dir(dir)
Add *dir* to the list of directories that will be searched for libraries
specified to :meth:`add_library` and :meth:`set_libraries`. The linker
will be instructed to search for libraries in the order they are supplied
to :meth:`add_library_dir` and/or :meth:`set_library_dirs`.
.. method:: CCompiler.set_library_dirs(dirs)
Set the list of library search directories to *dirs* (a list of strings).
This does not affect any standard library search path that the linker may
search by default.
.. method:: CCompiler.add_runtime_library_dir(dir)
Add *dir* to the list of directories that will be searched for shared
libraries at runtime.
.. method:: CCompiler.set_runtime_library_dirs(dirs)
Set the list of directories to search for shared libraries at runtime to
*dirs* (a list of strings). This does not affect any standard search path
that the runtime linker may search by default.
.. method:: CCompiler.define_macro(name[, value=None])
Define a preprocessor macro for all compilations driven by this compiler
object. The optional parameter *value* should be a string; if it is not
supplied, then the macro will be defined without an explicit value and the
exact outcome depends on the compiler used (XXX true? does ANSI say
anything about this?)
.. method:: CCompiler.undefine_macro(name)
Undefine a preprocessor macro for all compilations driven by this compiler
object. If the same macro is defined by :meth:`define_macro` and
undefined by :meth:`undefine_macro` the last call takes precedence
(including multiple redefinitions or undefinitions). If the macro is
redefined/undefined on a per-compilation basis (i.e. in the call to
:meth:`compile`), then that takes precedence.
.. method:: CCompiler.add_link_object(object)
Add *object* to the list of object files (or analogues, such as explicitly
named library files or the output of "resource compilers") to be included
in every link driven by this compiler object.
.. method:: CCompiler.set_link_objects(objects)
Set the list of object files (or analogues) to be included in every link
to *objects*. This does not affect any standard object files that the
linker may include by default (such as system libraries).
The following methods provide autodetection of compiler options, offering some
functionality similar to GNU :program:`autoconf`.
.. method:: CCompiler.detect_language(sources)
Detect the language of a given file, or list of files. Uses the instance
attributes :attr:`language_map` (a dictionary), and :attr:`language_order`
(a list) to do the job.
.. method:: CCompiler.find_library_file(dirs, lib[, debug=0])
Search the specified list of directories for a static or shared library file
*lib* and return the full path to that file. If *debug* is true, look for a
debugging version (if that makes sense on the current platform). Return
``None`` if *lib* wasn't found in any of the specified directories.
.. method:: CCompiler.has_function(funcname [, includes=None, include_dirs=None, libraries=None, library_dirs=None])
Return a boolean indicating whether *funcname* is supported on the current
platform. The optional arguments can be used to augment the compilation
environment by providing additional include files and paths and libraries and
paths.
.. method:: CCompiler.library_dir_option(dir)
Return the compiler option to add *dir* to the list of directories searched for
libraries.
.. method:: CCompiler.library_option(lib)
Return the compiler option to add *lib* to the list of libraries linked into the
shared library or executable.
.. method:: CCompiler.runtime_library_dir_option(dir)
Return the compiler option to add *dir* to the list of directories searched for
runtime libraries.
.. method:: CCompiler.set_executables(**args)
Define the executables (and options for them) that will be run to perform the
various stages of compilation. The exact set of executables that may be
specified here depends on the compiler class (via the 'executables' class
attribute), but most will have:
+--------------+------------------------------------------+
| attribute | description |
+==============+==========================================+
| *compiler* | the C/C++ compiler |
+--------------+------------------------------------------+
| *linker_so* | linker used to create shared objects and |
| | libraries |
+--------------+------------------------------------------+
| *linker_exe* | linker used to create binary executables |
+--------------+------------------------------------------+
| *archiver* | static library creator |
+--------------+------------------------------------------+
On platforms with a command line (Unix, DOS/Windows), each of these is a string
that will be split into executable name and (optional) list of arguments.
(Splitting the string is done similarly to how Unix shells operate: words are
delimited by spaces, but quotes and backslashes can override this. See
:func:`packaging.util.split_quoted`.)
The following methods invoke stages in the build process.
.. method:: CCompiler.compile(sources[, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None])
Compile one or more source files. Generates object files (e.g. transforms a
:file:`.c` file to a :file:`.o` file.)
*sources* must be a list of filenames, most likely C/C++ files, but in reality
anything that can be handled by a particular compiler and compiler class (e.g.
an ``'msvc'`` compiler can handle resource files in *sources*). Return a list of
object filenames, one per source filename in *sources*. Depending on the
implementation, not all source files will necessarily be compiled, but all
corresponding object filenames will be returned.
If *output_dir* is given, object files will be put under it, while retaining
their original path component. That is, :file:`foo/bar.c` normally compiles to
:file:`foo/bar.o` (for a Unix implementation); if *output_dir* is *build*, then
it would compile to :file:`build/foo/bar.o`.
*macros*, if given, must be a list of macro definitions. A macro definition is
either a ``(name, value)`` 2-tuple or a ``(name,)`` 1-tuple. The former defines
a macro; if the value is ``None``, the macro is defined without an explicit
value. The 1-tuple case undefines a macro. Later
definitions/redefinitions/undefinitions take precedence.
*include_dirs*, if given, must be a list of strings, the directories to add to
the default include file search path for this compilation only.
*debug* is a boolean; if true, the compiler will be instructed to output debug
symbols in (or alongside) the object file(s).
*extra_preargs* and *extra_postargs* are implementation-dependent. On platforms
that have the notion of a command line (e.g. Unix, DOS/Windows), they are most
likely lists of strings: extra command-line arguments to prepend/append to the
compiler command line. On other platforms, consult the implementation class
documentation. In any event, they are intended as an escape hatch for those
occasions when the abstract compiler framework doesn't cut the mustard.
*depends*, if given, is a list of filenames that all targets depend on. If a
source file is older than any file in depends, then the source file will be
recompiled. This supports dependency tracking, but only at a coarse
granularity.
Raises :exc:`CompileError` on failure.
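
   Putting these options together, a hypothetical compilation step could look
   like this (source paths, macro names and directories are invented, and
   ``cc`` is assumed to be a compiler object)::

      objects = cc.compile(['src/foo.c', 'src/bar.c'],
                           output_dir='build',
                           macros=[('NDEBUG', None), ('DEBUG',)],  # define NDEBUG, undefine DEBUG
                           include_dirs=['include'],
                           debug=0)
      # On a Unix implementation this might return
      # ['build/src/foo.o', 'build/src/bar.o'].
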
.. method:: CCompiler.create_static_lib(objects, output_libname[, output_dir=None, debug=0, target_lang=None])
Link a bunch of stuff together to create a static library file. The "bunch of
stuff" consists of the list of object files supplied as *objects*, the extra
object files supplied to :meth:`add_link_object` and/or
:meth:`set_link_objects`, the libraries supplied to :meth:`add_library` and/or
:meth:`set_libraries`, and the libraries supplied as *libraries* (if any).
*output_libname* should be a library name, not a filename; the filename will be
inferred from the library name. *output_dir* is the directory where the library
file will be put. XXX defaults to what?
*debug* is a boolean; if true, debugging information will be included in the
library (note that on most platforms, it is the compile step where this matters:
the *debug* flag is included here just for consistency).
*target_lang* is the target language for which the given objects are being
compiled. This allows specific linkage time treatment of certain languages.
Raises :exc:`LibError` on failure.
.. method:: CCompiler.link(target_desc, objects, output_filename[, output_dir=None, libraries=None, library_dirs=None, runtime_library_dirs=None, export_symbols=None, debug=0, extra_preargs=None, extra_postargs=None, build_temp=None, target_lang=None])
Link a bunch of stuff together to create an executable or shared library file.
The "bunch of stuff" consists of the list of object files supplied as *objects*.
*output_filename* should be a filename. If *output_dir* is supplied,
*output_filename* is relative to it (i.e. *output_filename* can provide
directory components if needed).
*libraries* is a list of libraries to link against. These are library names,
not filenames, since they're translated into filenames in a platform-specific
way (e.g. *foo* becomes :file:`libfoo.a` on Unix and :file:`foo.lib` on
DOS/Windows). However, they can include a directory component, which means the
linker will look in that specific directory rather than searching all the normal
locations.
*library_dirs*, if supplied, should be a list of directories to search for
libraries that were specified as bare library names (i.e. no directory
component). These are on top of the system default and those supplied to
:meth:`add_library_dir` and/or :meth:`set_library_dirs`. *runtime_library_dirs*
is a list of directories that will be embedded into the shared library and used
to search for other shared libraries that \*it\* depends on at run-time. (This
may only be relevant on Unix.)
*export_symbols* is a list of symbols that the shared library will export.
(This appears to be relevant only on Windows.)
*debug* is as for :meth:`compile` and :meth:`create_static_lib`, with the
slight distinction that it actually matters on most platforms (as opposed to
:meth:`create_static_lib`, which includes a *debug* flag mostly for form's
sake).
*extra_preargs* and *extra_postargs* are as for :meth:`compile` (except of
course that they supply command-line arguments for the particular linker being
used).
*target_lang* is the target language for which the given objects are being
compiled. This allows specific linkage time treatment of certain languages.
Raises :exc:`LinkError` on failure.
.. method:: CCompiler.link_executable(objects, output_progname[, output_dir=None, libraries=None, library_dirs=None, runtime_library_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, target_lang=None])
   Link an executable. *output_progname* is the name of the executable file, while
*objects* are a list of object filenames to link in. Other arguments are as for
the :meth:`link` method.
.. method:: CCompiler.link_shared_lib(objects, output_libname[, output_dir=None, libraries=None, library_dirs=None, runtime_library_dirs=None, export_symbols=None, debug=0, extra_preargs=None, extra_postargs=None, build_temp=None, target_lang=None])
Link a shared library. *output_libname* is the name of the output library,
while *objects* is a list of object filenames to link in. Other arguments are
as for the :meth:`link` method.
.. method:: CCompiler.link_shared_object(objects, output_filename[, output_dir=None, libraries=None, library_dirs=None, runtime_library_dirs=None, export_symbols=None, debug=0, extra_preargs=None, extra_postargs=None, build_temp=None, target_lang=None])
Link a shared object. *output_filename* is the name of the shared object that
will be created, while *objects* is a list of object filenames to link in.
Other arguments are as for the :meth:`link` method.
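
   Continuing the hypothetical :meth:`compile` example above, the resulting
   object files could then be linked into a shared object (names, directories
   and extra arguments are again illustrative)::

      cc.link_shared_object(objects,
                            'build/_speedups.so',
                            libraries=['m'],
                            library_dirs=['/usr/local/lib'],
                            extra_postargs=['-Wl,-O1'])
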
.. method:: CCompiler.preprocess(source[, output_file=None, macros=None, include_dirs=None, extra_preargs=None, extra_postargs=None])
   Preprocess a single C/C++ source file, named in *source*. Output will be written
   to a file named *output_file*, or to *stdout* if *output_file* is not supplied.
*macros* is a list of macro definitions as for :meth:`compile`, which will
augment the macros set with :meth:`define_macro` and :meth:`undefine_macro`.
*include_dirs* is a list of directory names that will be added to the default
list, in the same way as :meth:`add_include_dir`.
Raises :exc:`PreprocessError` on failure.
The following utility methods are defined by the :class:`CCompiler` class, for
use by the various concrete subclasses.
.. method:: CCompiler.executable_filename(basename[, strip_dir=0, output_dir=''])
Returns the filename of the executable for the given *basename*. Typically for
non-Windows platforms this is the same as the basename, while Windows will get
a :file:`.exe` added.
.. method:: CCompiler.library_filename(libname[, lib_type='static', strip_dir=0, output_dir=''])
Returns the filename for the given library name on the current platform. On Unix
a library with *lib_type* of ``'static'`` will typically be of the form
:file:`liblibname.a`, while a *lib_type* of ``'dynamic'`` will be of the form
:file:`liblibname.so`.
.. method:: CCompiler.object_filenames(source_filenames[, strip_dir=0, output_dir=''])
Returns the name of the object files for the given source files.
*source_filenames* should be a list of filenames.
.. method:: CCompiler.shared_object_filename(basename[, strip_dir=0, output_dir=''])
Returns the name of a shared object file for the given file name *basename*.
.. method:: CCompiler.execute(func, args[, msg=None, level=1])
   Invokes :func:`packaging.util.execute`. This method invokes a Python function
   *func* with the given arguments *args*, after logging and taking into account
   the *dry_run* flag. XXX see also.
.. method:: CCompiler.spawn(cmd)
Invokes :func:`packaging.util.spawn`. This invokes an external process to run
the given command. XXX see also.
.. method:: CCompiler.mkpath(name[, mode=511])
Invokes :func:`packaging.dir_util.mkpath`. This creates a directory and any
missing ancestor directories. XXX see also.
.. method:: CCompiler.move_file(src, dst)
Invokes :meth:`packaging.file_util.move_file`. Renames *src* to *dst*. XXX see
also.
:mod:`packaging.compiler.extension` --- The Extension class
===========================================================
.. module:: packaging.compiler.extension
:synopsis: Class used to represent C/C++ extension modules.
This module provides the :class:`Extension` class, used to represent C/C++
extension modules.
.. class:: Extension
   The Extension class describes a single C or C++ extension module. It accepts
   the following keyword arguments in its constructor; a short usage sketch
   follows the table.
+------------------------+--------------------------------+---------------------------+
| argument name | value | type |
+========================+================================+===========================+
| *name* | the full name of the | string |
| | extension, including any | |
| | packages --- i.e. *not* a | |
| | filename or pathname, but | |
| | Python dotted name | |
+------------------------+--------------------------------+---------------------------+
| *sources* | list of source filenames, | string |
| | relative to the distribution | |
| | root (where the setup script | |
| | lives), in Unix form (slash- | |
| | separated) for portability. | |
| | Source files may be C, C++, | |
| | SWIG (.i), platform-specific | |
| | resource files, or whatever | |
| | else is recognized by the | |
| | :command:`build_ext` command | |
| | as source for a Python | |
| | extension. | |
+------------------------+--------------------------------+---------------------------+
| *include_dirs* | list of directories to search | string |
| | for C/C++ header files (in | |
| | Unix form for portability) | |
+------------------------+--------------------------------+---------------------------+
| *define_macros* | list of macros to define; each | (string, string) tuple or |
| | macro is defined using a | (name, ``None``) |
| | 2-tuple ``(name, value)``, | |
| | where *value* is | |
| | either the string to define it | |
| | to or ``None`` to define it | |
| | without a particular value | |
| | (equivalent of ``#define FOO`` | |
| | in source or :option:`-DFOO` | |
| | on Unix C compiler command | |
| | line) | |
+------------------------+--------------------------------+---------------------------+
| *undef_macros* | list of macros to undefine | string |
| | explicitly | |
+------------------------+--------------------------------+---------------------------+
| *library_dirs* | list of directories to search | string |
| | for C/C++ libraries at link | |
| | time | |
+------------------------+--------------------------------+---------------------------+
| *libraries* | list of library names (not | string |
| | filenames or paths) to link | |
| | against | |
+------------------------+--------------------------------+---------------------------+
| *runtime_library_dirs* | list of directories to search | string |
| | for C/C++ libraries at run | |
| | time (for shared extensions, | |
| | this is when the extension is | |
| | loaded) | |
+------------------------+--------------------------------+---------------------------+
| *extra_objects* | list of extra files to link | string |
| | with (e.g. object files not | |
| | implied by 'sources', static | |
| | library that must be | |
| | explicitly specified, binary | |
| | resource files, etc.) | |
+------------------------+--------------------------------+---------------------------+
| *extra_compile_args* | any extra platform- and | string |
| | compiler-specific information | |
| | to use when compiling the | |
| | source files in 'sources'. For | |
| | platforms and compilers where | |
| | a command line makes sense, | |
| | this is typically a list of | |
| | command-line arguments, but | |
| | for other platforms it could | |
| | be anything. | |
+------------------------+--------------------------------+---------------------------+
| *extra_link_args* | any extra platform- and | string |
| | compiler-specific information | |
| | to use when linking object | |
| | files together to create the | |
| | extension (or to create a new | |
| | static Python interpreter). | |
| | Similar interpretation as for | |
| | 'extra_compile_args'. | |
+------------------------+--------------------------------+---------------------------+
| *export_symbols* | list of symbols to be exported | string |
| | from a shared extension. Not | |
| | used on all platforms, and not | |
| | generally necessary for Python | |
| | extensions, which typically | |
| | export exactly one symbol: | |
| | ``init`` + extension_name. | |
+------------------------+--------------------------------+---------------------------+
| *depends* | list of files that the | string |
| | extension depends on | |
+------------------------+--------------------------------+---------------------------+
| *language* | extension language (i.e. | string |
| | ``'c'``, ``'c++'``, | |
| | ``'objc'``). Will be detected | |
| | from the source extensions if | |
| | not provided. | |
+------------------------+--------------------------------+---------------------------+
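
   As a minimal sketch (the module name, source paths, macro value and library
   are invented for illustration), an extension might be described like this::

      from packaging.compiler.extension import Extension

      ext = Extension('spam._speedups',
                      sources=['src/_speedups.c'],
                      include_dirs=['include'],
                      define_macros=[('SPEEDUPS_VERSION', '"1.0"')],
                      libraries=['m'])
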

View File

@ -0,0 +1,324 @@
:mod:`packaging.database` --- Database of installed distributions
=================================================================
.. module:: packaging.database
:synopsis: Functions to query and manipulate installed distributions.
This module provides an implementation of :PEP:`376`. It was originally
intended to land in :mod:`pkgutil`, but with the inclusion of Packaging in the
standard library, it was thought best to include it in a submodule of
:mod:`packaging`, leaving :mod:`pkgutil` to deal with imports.
Installed Python distributions are represented by instances of
:class:`Distribution`, or :class:`EggInfoDistribution` for legacy egg formats.
Most functions also provide an extra argument ``use_egg_info`` to take legacy
distributions into account.
Classes representing installed distributions
--------------------------------------------
.. class:: Distribution(path)
Class representing an installed distribution. It is different from
:class:`packaging.dist.Distribution` which holds the list of files, the
metadata and options during the run of a Packaging command.
Instantiate with the *path* to a ``.dist-info`` directory. Instances can be
compared and sorted. Other available methods are:
.. XXX describe how comparison works
.. method:: get_distinfo_file(path, binary=False)
Return a read-only file object for a file located at
      :file:`{project-version}.dist-info/{path}`. *path* should be a
``'/'``-separated path relative to the ``.dist-info`` directory or an
absolute path; if it is an absolute path and doesn't start with the path
to the :file:`.dist-info` directory, a :class:`PackagingError` is raised.
If *binary* is ``True``, the file is opened in binary mode.
.. method:: get_resource_path(relative_path)
.. TODO
.. method:: list_distinfo_files(local=False)
Return an iterator over all files located in the :file:`.dist-info`
directory. If *local* is ``True``, each returned path is transformed into
a local absolute path, otherwise the raw value found in the :file:`RECORD`
file is returned.
.. method:: list_installed_files(local=False)
Iterate over the files installed with the distribution and registered in
the :file:`RECORD` file and yield a tuple ``(path, md5, size)`` for each
line. If *local* is ``True``, the returned path is transformed into a
local absolute path, otherwise the raw value is returned.
A local absolute path is an absolute path in which occurrences of ``'/'``
have been replaced by :data:`os.sep`.
.. method:: uses(path)
Check whether *path* was installed by this distribution (i.e. if the path
is present in the :file:`RECORD` file). *path* can be a local absolute
path or a relative ``'/'``-separated path. Returns a boolean.
Available attributes:
.. attribute:: metadata
Instance of :class:`packaging.metadata.Metadata` filled with the contents
of the :file:`{project-version}.dist-info/METADATA` file.
.. attribute:: name
Shortcut for ``metadata['Name']``.
.. attribute:: version
Shortcut for ``metadata['Version']``.
.. attribute:: requested
      Boolean indicating whether this distribution was requested by the user or
      automatically installed as a dependency.
.. class:: EggInfoDistribution(path)
Class representing a legacy distribution. It is compatible with distutils'
and setuptools' :file:`.egg-info` and :file:`.egg` files and directories.
.. FIXME should be named EggDistribution
Instantiate with the *path* to an egg file or directory. Instances can be
compared and sorted. Other available methods are:
.. method:: list_installed_files(local=False)
.. method:: uses(path)
Available attributes:
.. attribute:: metadata
Instance of :class:`packaging.metadata.Metadata` filled with the contents
of the :file:`{project-version}.egg-info/PKG-INFO` or
:file:`{project-version}.egg` file.
.. attribute:: name
Shortcut for ``metadata['Name']``.
.. attribute:: version
Shortcut for ``metadata['Version']``.
Functions to work with the database
-----------------------------------
.. function:: get_distribution(name, use_egg_info=False, paths=None)
Return an instance of :class:`Distribution` or :class:`EggInfoDistribution`
for the first installed distribution matching *name*. Egg distributions are
considered only if *use_egg_info* is true; if both a dist-info and an egg
file are found, the dist-info prevails. The directories to be searched are
given in *paths*, which defaults to :data:`sys.path`. Return ``None`` if no
matching distribution is found.
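
   For example, reusing the ``choxie`` project from the examples below (the
   output shown is what such a query might return, not a guaranteed result)::

      >>> from packaging.database import get_distribution
      >>> dist = get_distribution('choxie', use_egg_info=True)
      >>> dist.name, dist.version
      ('choxie', '2.0.0.9')
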
.. FIXME param should be named use_egg
.. function:: get_distributions(use_egg_info=False, paths=None)
Return an iterator of :class:`Distribution` instances for all installed
distributions found in *paths* (defaults to :data:`sys.path`). If
*use_egg_info* is true, also return instances of :class:`EggInfoDistribution`
for legacy distributions found.
.. function:: get_file_users(path)
Return an iterator over all distributions using *path*, a local absolute path
or a relative ``'/'``-separated path.
.. XXX does this work with prefixes or full file path only?
.. function:: obsoletes_distribution(name, version=None, use_egg_info=False)
Return an iterator over all distributions that declare they obsolete *name*.
*version* is an optional argument to match only specific releases (see
:mod:`packaging.version`). If *use_egg_info* is true, legacy egg
distributions will be considered as well.
.. function:: provides_distribution(name, version=None, use_egg_info=False)
Return an iterator over all distributions that declare they provide *name*.
*version* is an optional argument to match only specific releases (see
:mod:`packaging.version`). If *use_egg_info* is true, legacy egg
distributions will be considered as well.
Utility functions
-----------------
.. function:: distinfo_dirname(name, version)
   Escape *name* and *version* into a filename-safe form and return the
   directory name built from them, for example
   :file:`{safename}-{safeversion}.dist-info`. In *name*, runs of
   non-alphanumeric characters are replaced with one ``'_'``; in *version*,
   spaces become dots, and runs of other non-alphanumeric characters (except
   dots) are replaced by one ``'-'``.
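
   For instance, following the escaping rules just described (the project name
   is illustrative)::

      >>> from packaging.database import distinfo_dirname
      >>> distinfo_dirname('python-ldap', '2.5')
      'python_ldap-2.5.dist-info'
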
.. XXX wth spaces in version numbers?
For performance purposes, the list of distributions is cached internally.
Caching is enabled by default, but you can control it with these
functions:
.. function:: clear_cache()
Clear the cache.
.. function:: disable_cache()
Disable the cache, without clearing it.
.. function:: enable_cache()
Enable the internal cache, without clearing it.
Examples
--------
Print all information about a distribution
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Given a path to a ``.dist-info`` distribution, we shall print out all
information that can be obtained using functions provided in this module::
import sys
import packaging.database
path = input()
# first create the Distribution instance
try:
dist = packaging.database.Distribution(path)
except IOError:
sys.exit('No such distribution')
print('Information about %r' % dist.name)
print()
print('Files')
print('=====')
for path, md5, size in dist.list_installed_files():
print('* Path: %s' % path)
print(' Hash %s, Size: %s bytes' % (md5, size))
print()
print('Metadata')
print('========')
for key, value in dist.metadata.items():
print('%20s: %s' % (key, value))
print()
print('Extra')
print('=====')
if dist.requested:
print('* It was installed by user request')
else:
print('* It was installed as a dependency')
If we save the script above as ``print_info.py``, we can use it to extract
information from a :file:`.dist-info` directory. By typing in the console:
.. code-block:: sh
$ echo /tmp/choxie/choxie-2.0.0.9.dist-info | python3 print_info.py
we get the following output:
.. code-block:: none
Information about 'choxie'
Files
=====
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9/truffles.py
Hash 5e052db6a478d06bad9ae033e6bc08af, Size: 111 bytes
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9/choxie/chocolate.py
Hash ac56bf496d8d1d26f866235b95f31030, Size: 214 bytes
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9/choxie/__init__.py
Hash 416aab08dfa846f473129e89a7625bbc, Size: 25 bytes
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9.dist-info/INSTALLER
Hash d41d8cd98f00b204e9800998ecf8427e, Size: 0 bytes
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9.dist-info/METADATA
Hash 696a209967fef3c8b8f5a7bb10386385, Size: 225 bytes
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9.dist-info/REQUESTED
Hash d41d8cd98f00b204e9800998ecf8427e, Size: 0 bytes
* Path: ../tmp/distutils2/tests/fake_dists/choxie-2.0.0.9.dist-info/RECORD
Hash None, Size: None bytes
Metadata
========
Metadata-Version: 1.2
Name: choxie
Version: 2.0.0.9
Platform: []
Supported-Platform: UNKNOWN
Summary: Chocolate with a kick!
Description: UNKNOWN
Keywords: []
Home-page: UNKNOWN
Author: UNKNOWN
Author-email: UNKNOWN
Maintainer: UNKNOWN
Maintainer-email: UNKNOWN
License: UNKNOWN
Classifier: []
Download-URL: UNKNOWN
Obsoletes-Dist: ['truffles (<=0.8,>=0.5)', 'truffles (<=0.9,>=0.6)']
Project-URL: []
Provides-Dist: ['truffles (1.0)']
Requires-Dist: ['towel-stuff (0.1)']
Requires-Python: UNKNOWN
Requires-External: []
Extra
=====
* It was installed as a dependency
Find out obsoleted distributions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Now we tackle a different problem: we are interested in finding out which
distributions have been obsoleted. This can be easily done as follows::
import packaging.database
# iterate over all distributions in the system
for dist in packaging.database.get_distributions():
name, version = dist.name, dist.version
# find out which distributions obsolete this name/version combination
replacements = packaging.database.obsoletes_distribution(name, version)
if replacements:
print('%r %s is obsoleted by' % (name, version),
', '.join(repr(r.name) for r in replacements))
This is what the output might look like:
.. code-block:: none
'strawberry' 0.6 is obsoleted by 'choxie'
'grammar' 1.0a4 is obsoleted by 'towel-stuff'

View File

@ -0,0 +1,199 @@
:mod:`packaging.depgraph` --- Dependency graph builder
======================================================
.. module:: packaging.depgraph
:synopsis: Graph builder for dependencies between releases.
This module provides the means to analyse the dependencies between various
distributions and to create a graph representing these dependency relationships.
In this document, "distribution" refers to an instance of
:class:`packaging.database.Distribution` or
:class:`packaging.database.EggInfoDistribution`.
.. XXX terminology problem with dist vs. release: dists are installed, but deps
use releases
.. XXX explain how to use it with dists not installed: Distribution can only be
instantiated with a path, but this module is useful for remote dist too
.. XXX functions should accept and return iterators, not lists
The :class:`DependencyGraph` class
----------------------------------
.. class:: DependencyGraph
   Represent a dependency graph between releases. The nodes are distribution
   instances; the edges model dependencies. An edge from ``a`` to ``b`` means
   that ``a`` depends on ``b``.
.. method:: add_distribution(distribution)
Add *distribution* to the graph.
.. method:: add_edge(x, y, label=None)
Add an edge from distribution *x* to distribution *y* with the given
*label* (string).
.. method:: add_missing(distribution, requirement)
Add a missing *requirement* (string) for the given *distribution*.
.. method:: repr_node(dist, level=1)
Print a subgraph starting from *dist*. *level* gives the depth of the
subgraph.
Direct access to the graph nodes and edges is provided through these
attributes:
.. attribute:: adjacency_list
Dictionary mapping distributions to a list of ``(other, label)`` tuples
where ``other`` is a distribution and the edge is labeled with ``label``
(i.e. the version specifier, if such was provided).
.. attribute:: reverse_list
Dictionary mapping distributions to a list of predecessors. This allows
efficient traversal.
.. attribute:: missing
Dictionary mapping distributions to a list of requirements that were not
provided by any distribution.
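
   A minimal sketch of building a graph by hand; the project names, version
   specifiers and the no-argument constructor are assumptions for illustration,
   and :func:`generate_graph` below does this automatically for installed
   distributions::

      from packaging.database import get_distribution
      from packaging.depgraph import DependencyGraph

      towel = get_distribution('towel-stuff', use_egg_info=True)
      bacon = get_distribution('bacon', use_egg_info=True)

      graph = DependencyGraph()
      graph.add_distribution(towel)
      graph.add_distribution(bacon)
      # towel-stuff depends on bacon...
      graph.add_edge(towel, bacon, label='bacon (<=0.2)')
      # ...and on something that is not installed at all.
      graph.add_missing(towel, 'sandwich (>=1.0)')

      print(graph.adjacency_list[towel])  # [(<bacon distribution>, 'bacon (<=0.2)')]
      print(graph.missing[towel])         # ['sandwich (>=1.0)']
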
Auxiliary functions
-------------------
.. function:: dependent_dists(dists, dist)
Recursively generate a list of distributions from *dists* that are dependent
on *dist*.
.. XXX what does member mean here: "dist is a member of *dists* for which we
are interested"
.. function:: generate_graph(dists)
Generate a :class:`DependencyGraph` from the given list of distributions.
.. XXX make this alternate constructor a DepGraph classmethod or rename;
'generate' can suggest it creates a file or an image, use 'make'
.. function:: graph_to_dot(graph, f, skip_disconnected=True)
Write a DOT output for the graph to the file-like object *f*.
If *skip_disconnected* is true, all distributions that are not dependent on
any other distribution are skipped.
.. XXX why is this not a DepGraph method?
Example Usage
-------------
Depict all dependencies in the system
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
First, we shall generate a graph of all the distributions on the system
and then create an image out of it using the tools provided by
`Graphviz <http://www.graphviz.org/>`_::
from packaging.database import get_distributions
from packaging.depgraph import generate_graph
dists = list(get_distributions())
graph = generate_graph(dists)
It would be interesting to print out the missing requirements. This can be done
as follows::
   for dist, reqs in graph.missing.items():
       if reqs:
           reqs = ', '.join(repr(req) for req in reqs)
           print('Missing dependencies for %r: %s' % (dist.name, reqs))
Example output is:
.. code-block:: none
   Missing dependencies for 'TurboCheetah': 'Cheetah'
   Missing dependencies for 'TurboGears': 'ConfigObj', 'DecoratorTools', 'RuleDispatch'
   Missing dependencies for 'jockey': 'PyKDE4.kdecore', 'PyKDE4.kdeui', 'PyQt4.QtCore', 'PyQt4.QtGui'
   Missing dependencies for 'TurboKid': 'kid'
   Missing dependencies for 'TurboJson': 'DecoratorTools', 'RuleDispatch'
Now, we proceed with generating a graphical representation of the graph. First
we write it to a file, and then we generate a PNG image using the
:program:`dot` command-line tool::
from packaging.depgraph import graph_to_dot
with open('output.dot', 'w') as f:
# only show the interesting distributions, skipping the disconnected ones
graph_to_dot(graph, f, skip_disconnected=True)
We can create the final picture using:
.. code-block:: sh
$ dot -Tpng output.dot > output.png
An example result is:
.. figure:: depgraph-output.png
:alt: Example PNG output from packaging.depgraph and dot
If you want to include egg distributions as well, then the code requires only
one change, namely the line::
dists = list(packaging.database.get_distributions())
has to be replaced with::
dists = list(packaging.database.get_distributions(use_egg_info=True))
On many platforms, a richer graph is obtained because at the moment most
distributions are provided in the egg format rather than the new standard
``.dist-info`` format.
.. XXX missing image
An example of a more involved graph for illustrative reasons can be seen
here:
.. image:: depgraph_big.png
List all dependent distributions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
We will list all distributions that are dependent on some given distribution.
This time, egg distributions will be considered as well::
import sys
from packaging.database import get_distribution, get_distributions
from packaging.depgraph import dependent_dists
dists = list(get_distributions(use_egg_info=True))
dist = get_distribution('bacon', use_egg_info=True)
if dist is None:
sys.exit('No such distribution in the system')
deps = dependent_dists(dists, dist)
deps = ', '.join(repr(x.name) for x in deps)
print('Distributions depending on %r: %s' % (dist.name, deps))
And this is an example of the output:
.. with the dependency relationships as in the previous section
(depgraph_big)
.. code-block:: none
Distributions depending on 'bacon': 'towel-stuff', 'choxie', 'grammar'

View File

@ -0,0 +1,102 @@
:mod:`packaging.dist` --- The Distribution class
================================================
.. module:: packaging.dist
:synopsis: Core Distribution class.
This module provides the :class:`Distribution` class, which represents the
module distribution being built/packaged/distributed/installed.
.. class:: Distribution(arguments)
A :class:`Distribution` describes how to build, package, distribute and
install a Python project.
The arguments accepted by the constructor are laid out in the following
table. Some of them will end up in a metadata object, the rest will become
data attributes of the :class:`Distribution` instance.
.. TODO improve constructor to take a Metadata object + named params?
(i.e. Distribution(metadata, cmdclass, py_modules, etc)
.. TODO also remove obsolete(?) script_name, etc. parameters? see what
py2exe and other tools need
+--------------------+--------------------------------+-------------------------------------------------------------+
| argument name | value | type |
+====================+================================+=============================================================+
| *name* | The name of the project | string |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *version* | The version number of the | See :mod:`packaging.version` |
| | release | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *summary* | A single line describing the | a string |
| | project | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *description* | Longer description of the | a string |
| | project | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *author* | The name of the project author | a string |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *author_email* | The email address of the | a string |
| | project author | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *maintainer* | The name of the current | a string |
| | maintainer, if different from | |
| | the author | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *maintainer_email* | The email address of the | |
| | current maintainer, if | |
| | different from the author | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *home_page* | A URL for the proejct | a URL |
| | (homepage) | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *download_url* | A URL to download the project | a URL |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *packages* | A list of Python packages that | a list of strings |
| | packaging will manipulate | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *py_modules* | A list of Python modules that | a list of strings |
| | packaging will manipulate | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *scripts* | A list of standalone scripts | a list of strings |
| | to be built and installed | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *ext_modules* | A list of Python extensions to | A list of instances of |
| | be built | :class:`packaging.compiler.extension.Extension` |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *classifiers* | A list of categories for the | The list of available |
| | distribution | categorizations is at |
| | | http://pypi.python.org/pypi?:action=list_classifiers. |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *distclass* | the :class:`Distribution` | A subclass of |
| | class to use | :class:`packaging.dist.Distribution` |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *script_name* | The name of the setup.py | a string |
| | script - defaults to | |
| | ``sys.argv[0]`` | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *script_args* | Arguments to supply to the | a list of strings |
| | setup script | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *options* | default options for the setup | a string |
| | script | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *license* | The license for the | a string; should be used when there is no suitable License |
| | distribution | classifier, or to specify a classifier |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *keywords* | Descriptive keywords | a list of strings; used by catalogs |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *platforms* | Platforms compatible with this | a list of strings; should be used when there is no |
| | distribution | suitable Platform classifier |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *cmdclass* | A mapping of command names to | a dictionary |
| | :class:`Command` subclasses | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *data_files* | A list of data files to | a list |
| | install | |
+--------------------+--------------------------------+-------------------------------------------------------------+
| *package_dir* | A mapping of Python packages | a dictionary |
| | to directory names | |
+--------------------+--------------------------------+-------------------------------------------------------------+
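
   As a hedged sketch only: assuming the constructor accepts a single dict of
   the options above (as the distutils ``Distribution`` class does), a
   distribution object for a made-up project could be created like this; in
   normal use this object is created for you by the packaging machinery::

      from packaging.dist import Distribution

      dist = Distribution({
          'name': 'FooBar',            # invented project
          'version': '1.1',
          'summary': 'A hypothetical example project',
          'packages': ['foobar'],
      })
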

View File

@ -0,0 +1,75 @@
:mod:`packaging.fancy_getopt` --- Wrapper around the getopt module
==================================================================
.. module:: packaging.fancy_getopt
:synopsis: Additional getopt functionality.
.. warning::
This module is deprecated and will be replaced with :mod:`optparse`.
This module provides a wrapper around the standard :mod:`getopt` module that
provides the following additional features:
* short and long options are tied together
* options have help strings, so :func:`fancy_getopt` could potentially create a
complete usage summary
* options set attributes of a passed-in object
* boolean options can have "negative aliases" --- e.g. if :option:`--quiet` is
the "negative alias" of :option:`--verbose`, then :option:`--quiet` on the
command line sets *verbose* to false.
.. function:: fancy_getopt(options, negative_opt, object, args)
Wrapper function. *options* is a list of ``(long_option, short_option,
help_string)`` 3-tuples as described in the constructor for
:class:`FancyGetopt`. *negative_opt* should be a dictionary mapping option names
to option names, both the key and value should be in the *options* list.
*object* is an object which will be used to store values (see the :meth:`getopt`
method of the :class:`FancyGetopt` class). *args* is the argument list. Will use
``sys.argv[1:]`` if you pass ``None`` as *args*.
.. class:: FancyGetopt([option_table=None])
The option_table is a list of 3-tuples: ``(long_option, short_option,
help_string)``
If an option takes an argument, its *long_option* should have ``'='`` appended;
*short_option* should just be a single character, no ``':'`` in any case.
*short_option* should be ``None`` if a *long_option* doesn't have a
corresponding *short_option*. All option tuples must have long options.
The :class:`FancyGetopt` class provides the following methods:
.. method:: FancyGetopt.getopt([args=None, object=None])
Parse command-line options in args. Store as attributes on *object*.
If *args* is ``None`` or not supplied, uses ``sys.argv[1:]``. If *object* is
``None`` or not supplied, creates a new :class:`OptionDummy` instance, stores
option values there, and returns a tuple ``(args, object)``. If *object* is
supplied, it is modified in place and :func:`getopt` just returns *args*; in
both cases, the returned *args* is a modified copy of the passed-in *args* list,
which is left untouched.
.. TODO and args returned are?
.. method:: FancyGetopt.get_option_order()
   Returns the list of ``(option, value)`` tuples processed by the previous run of
   :meth:`getopt`. Raises :exc:`RuntimeError` if :meth:`getopt` hasn't been called
   yet.
.. method:: FancyGetopt.generate_help([header=None])
Generate help text (a list of strings, one per suggested line of output) from
the option table for this :class:`FancyGetopt` object.
If supplied, prints the supplied *header* at the top of the help.
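
A minimal sketch of the whole flow, using an invented option table (remember
that this module is deprecated, so this is for illustration only)::

   from packaging.fancy_getopt import FancyGetopt

   options = [
       ('verbose', 'v', 'run verbosely'),
       ('output=', 'o', 'write the result to FILE'),
   ]

   parser = FancyGetopt(options)
   args, obj = parser.getopt(['-v', '--output', 'result.txt', 'extra-arg'])
   print(obj.verbose, obj.output)  # option values stored as attributes
   print(args)                     # remaining positional arguments
   print('\n'.join(parser.generate_help('Usage: example [options]')))
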

View File

@ -0,0 +1,112 @@
:mod:`packaging.install` --- Installation tools
===============================================
.. module:: packaging.install
:synopsis: Download and installation building blocks
Packaging provides a set of tools to deal with downloads and installation of
distributions. Their role is to download the distribution from indexes, resolve
the dependencies, and provide a safe way to install distributions. An operation
that fails will cleanly roll back, not leave half-installed distributions on the
system. Here's the basic process followed:
#. Move all distributions that will be removed to a temporary location.
#. Install all the distributions that will be installed in a temporary location.
#. If the installation fails, move the saved distributions back to their
location and delete the installed distributions.
#. Otherwise, move the installed distributions to the right location and delete
the temporary locations.
This is a higher-level module built on :mod:`packaging.database` and
:mod:`packaging.pypi`.
Public functions
----------------
.. function:: get_infos(requirements, index=None, installed=None, \
prefer_final=True)
Return information about what's going to be installed and upgraded.
   *requirements* is a string containing the requirements for this
   project, for example ``'FooBar 1.1'`` or ``'BarBaz (<1.2)'``.
.. XXX are requirements comma-separated?
If you want to use another index than the main PyPI, give its URI as *index*
argument.
   *installed* is a list of already installed distributions used to find
   satisfied dependencies, obsoleted distributions and possible conflicts.
By default, alpha, beta and candidate versions are not picked up. Set
*prefer_final* to false to accept them too.
The results are returned in a dictionary containing all the information
needed to perform installation of the requirements with the
:func:`install_from_infos` function:
   >>> get_infos("FooBar (<=1.2)")
   {'install': [<FooBar 1.1>], 'remove': [], 'conflict': []}
.. TODO should return tuple or named tuple, not dict
.. TODO use "predicate" or "requirement" consistently in version and here
.. FIXME "info" cannot be plural in English, s/infos/info/
.. function:: install(project)
.. function:: install_dists(dists, path, paths=None)
Safely install all distributions provided in *dists* into *path*. *paths* is
a list of paths where already-installed distributions will be looked for to
find satisfied dependencies and conflicts (default: :data:`sys.path`).
Returns a list of installed dists.
.. FIXME dists are instances of what?
.. function:: install_from_infos(install_path=None, install=[], remove=[], \
conflicts=[], paths=None)
Safely install and remove given distributions. This function is designed to
work with the return value of :func:`get_infos`: *install*, *remove* and
*conflicts* should be list of distributions returned by :func:`get_infos`.
If *install* is not empty, *install_path* must be given to specify the path
where the distributions should be installed. *paths* is a list of paths
where already-installed distributions will be looked for (default:
:data:`sys.path`).
This function is a very basic installer; if *conflicts* is not empty, the
system will be in a conflicting state after the function completes. It is a
building block for more sophisticated installers with conflict resolution
systems.
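
   A hedged sketch tying :func:`get_infos` and :func:`install_from_infos`
   together (the requirement string and the installation path are invented)::

      from packaging.install import get_infos, install_from_infos

      info = get_infos('FooBar (<=1.2)')
      if not info['conflict']:
          install_from_infos(install_path='/opt/myapp/site-packages',
                             install=info['install'],
                             remove=info['remove'])
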
.. TODO document typical value for install_path
.. TODO document integration with default schemes, esp. user site-packages
.. function:: install_local_project(path)
Install a distribution from a source directory, which must contain either a
Packaging-compliant :file:`setup.cfg` file or a legacy Distutils
:file:`setup.py` script (in which case Distutils will be used under the hood
to perform the installation).
.. function:: remove(project_name, paths=None, auto_confirm=True)
Remove one distribution from the system.
.. FIXME this is the only function using "project" instead of dist/release
..
Example usage
--------------
Get the scheme of what's going to be installed if we install "foobar".
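
As a hedged sketch, reusing :func:`get_infos` as documented above (the release
shown in the output is invented), this could look like::

   >>> from packaging.install import get_infos
   >>> get_infos('foobar')
   {'install': [<FooBar 1.2>], 'remove': [], 'conflict': []}
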

View File

@ -0,0 +1,122 @@
:mod:`packaging.metadata` --- Metadata handling
===============================================
.. module:: packaging.metadata
:synopsis: Class holding the metadata of a release.
.. TODO use sphinx-autogen to generate basic doc from the docstrings
.. class:: Metadata
This class can read and write metadata files complying with any of the
defined versions: 1.0 (:PEP:`241`), 1.1 (:PEP:`314`) and 1.2 (:PEP:`345`). It
implements methods to parse Metadata files and write them, and a mapping
interface to its contents.
The :PEP:`345` implementation supports the micro-language for the environment
markers, and displays warnings when versions that are supposed to be
:PEP:`386`-compliant are violating the specification.
Reading metadata
----------------
The :class:`Metadata` class can be instantiated
with the path of the metadata file, and provides a dict-like interface to the
values::
>>> from packaging.metadata import Metadata
>>> metadata = Metadata('PKG-INFO')
>>> metadata.keys()[:5]
('Metadata-Version', 'Name', 'Version', 'Platform', 'Supported-Platform')
>>> metadata['Name']
'CLVault'
>>> metadata['Version']
'0.5'
>>> metadata['Requires-Dist']
["pywin32; sys.platform == 'win32'", "Sphinx"]
The fields that support environment markers can be automatically ignored if
the object is instantiated using the ``platform_dependent`` option. In this
case, :class:`Metadata` will interpret the markers and automatically remove
the fields that are not compliant with the running environment. Here's an
example under Mac OS X. The win32 dependency we saw earlier is ignored::
>>> from packaging.metadata import Metadata
>>> metadata = Metadata('PKG-INFO', platform_dependent=True)
>>> metadata['Requires-Dist']
['Sphinx']
If you want to provide your own execution context, let's say to test the
metadata under a particular environment that is not the current environment,
you can provide your own values in the ``execution_context`` option, which
is a dict that may contain one or more keys of the context the micro-language
expects.
Here's an example, simulating a win32 environment::
>>> from packaging.metadata import Metadata
>>> context = {'sys.platform': 'win32'}
>>> metadata = Metadata('PKG-INFO', platform_dependent=True,
... execution_context=context)
...
>>> metadata['Requires-Dist'] = ["pywin32; sys.platform == 'win32'",
... "Sphinx"]
...
>>> metadata['Requires-Dist']
['pywin32', 'Sphinx']
Writing metadata
----------------
Writing metadata can be done using the ``write`` method::
>>> metadata.write('/to/my/PKG-INFO')
The class will pick the best version for the metadata, depending on the values
provided. If all the values provided exist in all versions, the class will
use :attr:`PKG_INFO_PREFERRED_VERSION`. It is set by default to 1.0, the most
widespread version.
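
For example, a metadata object can also be built from scratch using the
empty constructor and the mapping interface, and then written out (the field
values here are invented)::

   >>> from packaging.metadata import Metadata
   >>> metadata = Metadata()
   >>> metadata['Name'] = 'FooBar'
   >>> metadata['Version'] = '1.1'
   >>> metadata['Summary'] = 'A hypothetical example project'
   >>> metadata.write('PKG-INFO')
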
Conflict checking and best version
----------------------------------
Some fields in :PEP:`345` have to comply with the version number specification
defined in :PEP:`386`. When they don't comply, a warning is emitted::
>>> from packaging.metadata import Metadata
>>> metadata = Metadata()
>>> metadata['Requires-Dist'] = ['Funky (Groovie)']
"Funky (Groovie)" is not a valid predicate
>>> metadata['Requires-Dist'] = ['Funky (1.2)']
See also :mod:`packaging.version`.
.. TODO talk about check()
:mod:`packaging.markers` --- Environment markers
================================================
.. module:: packaging.markers
:synopsis: Micro-language for environment markers
This is an implementation of environment markers `as defined in PEP 345
<http://www.python.org/dev/peps/pep-0345/#environment-markers>`_. It is used
for some metadata fields.
.. function:: interpret(marker, execution_context=None)
Interpret a marker and return a boolean result depending on the environment.
Example:
>>> interpret("python_version > '1.0'")
True
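
   A marker can also be evaluated against a custom context instead of the
   running environment; this sketch reuses the ``sys.platform`` key seen in the
   :mod:`packaging.metadata` examples and assumes *execution_context* is a
   plain dict::

      >>> interpret("sys.platform == 'win32'", execution_context={'sys.platform': 'linux'})
      False
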

View File

@ -0,0 +1,114 @@
:mod:`packaging.pypi.dist` --- Classes representing query results
=================================================================
.. module:: packaging.pypi.dist
:synopsis: Classes representing the results of queries to indexes.
Information coming from the indexes is held in instances of the classes defined
in this module.
Keep in mind that each project (e.g. FooBar) can have several releases
(e.g. 1.1, 1.2, 1.3), and each of these releases can be provided in multiple
distributions (e.g. a source distribution, a binary one, etc.).
ReleaseInfo
-----------
Each release has a project name, version, metadata, and related distributions.
This information is stored in :class:`ReleaseInfo`
objects.
.. class:: ReleaseInfo
DistInfo
---------
:class:`DistInfo` is a simple class that contains
information related to distributions; mainly the URLs where distributions
can be found.
.. class:: DistInfo
ReleasesList
------------
The :mod:`~packaging.pypi.dist` module provides a class which works with
lists of :class:`ReleaseInfo` objects; it is used to filter and order results.
.. class:: ReleasesList
Example usage
-------------
Build a list of releases and order them
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Assuming we have a list of releases::
   >>> from packaging.pypi.dist import ReleasesList, ReleaseInfo
   >>> fb10 = ReleaseInfo("FooBar", "1.0")
   >>> fb11 = ReleaseInfo("FooBar", "1.1")
   >>> fb11a = ReleaseInfo("FooBar", "1.1a1")
   >>> releases = ReleasesList("FooBar", [fb11, fb11a, fb10])
   >>> releases.sort_releases()
   >>> releases.get_versions()
   ['1.1', '1.1a1', '1.0']
   >>> releases.add_release("1.2a1")
   >>> releases.get_versions()
   ['1.1', '1.1a1', '1.0', '1.2a1']
   >>> releases.sort_releases()
   >>> releases.get_versions()
   ['1.2a1', '1.1', '1.1a1', '1.0']
   >>> releases.sort_releases(prefer_final=True)
   >>> releases.get_versions()
   ['1.1', '1.0', '1.2a1', '1.1a1']
Add distribution related information to releases
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
It's easy to add distribution information to releases::
>>> from packaging.pypi.dist import ReleasesList, ReleaseInfo
>>> r = ReleaseInfo("FooBar", "1.0")
>>> r.add_distribution("sdist", url="http://example.org/foobar-1.0.tar.gz")
>>> r.dists
{'sdist': FooBar 1.0 sdist}
>>> r['sdist'].url
{'url': 'http://example.org/foobar-1.0.tar.gz', 'hashname': None, 'hashval':
None, 'is_external': True}
Getting attributes from the dist objects
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To abstract querying information returned from the indexes, attributes and
release information can be retrieved directly from dist objects.
For instance, if you have a release instance that does not contain the metadata
attribute, it can be fetched by using the "fetch_metadata" method::
   >>> r = ReleaseInfo("FooBar", "1.1")
   >>> print(r.metadata)
   None # metadata field is actually set to "None"
   >>> r.fetch_metadata()
   <Metadata for FooBar 1.1>
.. XXX add proper roles to these constructs
It's possible to retrieve a project's releases (`fetch_releases`),
metadata (`fetch_metadata`) and distributions (`fetch_distributions`) using
a similar workflow.
.. XXX what is possible?
Internally, this is possible because while retrieving information about
projects, releases or distributions, a reference to the client used is
stored, and can be accessed through the objects' `_index` attribute.

View File

@ -0,0 +1,53 @@
:mod:`packaging.pypi` --- Interface to projects indexes
=======================================================
.. module:: packaging.pypi
:synopsis: Low-level and high-level APIs to query projects indexes.
Packaging queries PyPI to get information about projects or download them. The
low-level facilities used internally are also part of the public API designed to
be used by other tools.
The :mod:`packaging.pypi` package provides those facilities, which can be
used to access information about Python projects registered at indexes, the
main one being PyPI, located at http://pypi.python.org/.
There are two ways to retrieve data from these indexes: a screen-scraping
interface called the "simple API", and XML-RPC. The first one uses HTML pages
located under http://pypi.python.org/simple/, the second one makes XML-RPC
requests to http://pypi.python.org/pypi/. All functions and classes also work
with other indexes such as mirrors, which typically implement only the simple
interface.
Packaging provides a class that wraps both APIs to provide full query and
download functionality: :class:`packaging.pypi.client.ClientWrapper`. If you
want more control, you can use the underlying classes
:class:`packaging.pypi.simple.Crawler` and :class:`packaging.pypi.xmlrpc.Client`
to connect to one specific interface.
:mod:`packaging.pypi.client` --- High-level query API
=====================================================
.. module:: packaging.pypi.client
   :synopsis: Wrapper around :mod:`packaging.pypi.xmlrpc` and
              :mod:`packaging.pypi.simple` to query indexes.
This module provides a high-level API to query indexes and search
for releases and distributions. The aim of this module is to choose the best
way to query the API automatically, either using XML-RPC or the simple index,
with a preference toward the latter.
.. class:: ClientWrapper
Instances of this class will use the simple interface or XML-RPC requests to
query indexes and return :class:`packaging.pypi.dist.ReleaseInfo` and
:class:`packaging.pypi.dist.ReleasesList` objects.
.. method:: find_projects
.. method:: get_release
.. method:: get_releases
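
   A hedged sketch of the wrapper in use, based on the methods listed above
   (the project name and the result shown are invented)::

      >>> from packaging.pypi.client import ClientWrapper
      >>> client = ClientWrapper()
      >>> client.get_release('FooBar (<= 1.2)')
      <ReleaseInfo "FooBar 1.1">
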

View File

@ -0,0 +1,157 @@
:mod:`packaging.pypi.simple` --- Crawler using the PyPI "simple" interface
==========================================================================
.. module:: packaging.pypi.simple
:synopsis: Crawler using the screen-scraping "simple" interface to fetch info
and distributions.
`packaging.pypi.simple` can process Python Package Indexes and provides
useful information about distributions. It can also crawl local indexes.
You should use `packaging.pypi.simple` to:

* search for distributions by name and version,
* process an index's external pages,
* download distributions by name and version.

It should not be used for:

* operations that require processing large parts of the index (such as finding
  all distributions with a specific version, no matter the name).
API
---
.. class:: Crawler
Usage Examples
--------------

To help you understand how to use the `Crawler` class, here are some basic
usage examples.
Request the simple index to get a specific distribution
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Suppose you want to scan an index to get a list of distributions for
the "foobar" project. You can use the "get_releases" method for that.
The get_releases method will browse the project page and return
:class:`ReleaseInfo` objects for each download link found. ::
>>> from packaging.pypi.simple import Crawler
>>> crawler = Crawler()
>>> crawler.get_releases("FooBar")
[<ReleaseInfo "Foobar 1.1">, <ReleaseInfo "Foobar 1.2">]
Note that you can also ask the client for specific versions, using version
specifiers (described in `PEP 345
<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_)::
>>> client.get_releases("FooBar < 1.2")
[<ReleaseInfo "FooBar 1.1">, ]
`get_releases` returns a list of :class:`ReleaseInfo` objects, but you can also
get the best distribution that fulfills your requirements, using "get_release"::
>>> client.get_release("FooBar < 1.2")
<ReleaseInfo "FooBar 1.1">
Download distributions
^^^^^^^^^^^^^^^^^^^^^^
As it can get the URLs of distributions provided by PyPI, the `Crawler`
client can also download the distributions and put them for you in a temporary
destination::
>>> client.download("foobar")
/tmp/temp_dir/foobar-1.2.tar.gz
You can also specify the directory you want to download to::
>>> client.download("foobar", "/path/to/my/dir")
/path/to/my/dir/foobar-1.2.tar.gz
While downloading, the MD5 hash of the archive is checked; if it does not
match, the download is tried a second time, and if it fails again,
`MD5HashDoesNotMatchError` is raised.
Internally, it is not the Crawler that downloads the distributions but the
`DistributionInfo` class. Please refer to its documentation for more details.
Following PyPI external links
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The default behavior for packaging is to *not* follow the links provided
by HTML pages in the "simple index" when looking for distribution downloads.
It's possible to tell the crawler to follow external links by setting the
`follow_externals` attribute, at instantiation time or afterwards::
>>> client = Crawler(follow_externals=True)
or ::
>>> client = Crawler()
>>> client.follow_externals = True
Working with external indexes, and mirrors
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The default `Crawler` behavior is to rely on the Python Package index stored
on PyPI (http://pypi.python.org/simple).
If you need to work with a local index or with private indexes, you can specify
it using the *index_url* parameter::
>>> client = Crawler(index_url="file://filesystem/path/")
or ::
>>> client = Crawler(index_url="http://some.specific.url/")
You can also specify mirrors to fall back on in case the first *index_url* you
provided does not respond, or does not respond correctly. The default behavior
for `Crawler` is to use the mirror list provided by the python.org DNS records,
as described in :PEP:`381` about mirroring infrastructure.
If you don't want to rely on these, you can give the list of mirrors you
want to try by setting the `mirrors` attribute. It's a simple iterable::
>>> mirrors = ["http://first.mirror","http://second.mirror"]
>>> client = Crawler(mirrors=mirrors)
Searching in the simple index
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
It's possible to search for projects with specific names in the package index.
Assuming you want to find all projects containing the "distutils" keyword::
>>> c.search_projects("distutils")
[<Project "collective.recipe.distutils">, <Project "Distutils">, <Project
"Packaging">, <Project "distutilscross">, <Project "lpdistutils">, <Project
"taras.recipe.distutils">, <Project "zerokspot.recipe.distutils">]
You can also search for projects whose names start or end with a specific
string, using a wildcard::
>>> c.search_projects("distutils*")
[<Project "Distutils">, <Project "Packaging">, <Project "distutilscross">]
>>> c.search_projects("*distutils")
[<Project "collective.recipe.distutils">, <Project "Distutils">, <Project
"lpdistutils">, <Project "taras.recipe.distutils">, <Project
"zerokspot.recipe.distutils">]

View File

@ -0,0 +1,143 @@
:mod:`packaging.pypi.xmlrpc` --- Crawler using the PyPI XML-RPC interface
=========================================================================
.. module:: packaging.pypi.xmlrpc
:synopsis: Client using XML-RPC requests to fetch info and distributions.
Indexes can be queried using XML-RPC calls, and Packaging provides a simple
way to interface with XML-RPC.
You should **use** XML-RPC when:
* Searching the index for projects **on fields other than project
names**. For instance, you can search for projects based on the
author_email field.
* Searching all the versions that have existed for a project.
* You want to retrieve metadata information from releases or
distributions.
You should **avoid using** XML-RPC method calls when:
* Retrieving the latest version of a project.
* Getting projects with a specific name and version.
* The simple index can match your needs.
When dealing with indexes, keep in mind that the index queries will always
return :class:`packaging.pypi.dist.ReleaseInfo` and
:class:`packaging.pypi.dist.ReleasesList` objects.
Some methods here share a common API with the ones you can find in
:class:`packaging.pypi.simple`; internally, :class:`Client` inherits from
:class:`packaging.pypi.client`.
API
---
.. class:: Client
Usage examples
--------------
The use cases described here are those that are not common to the other
clients. If you want to see all the methods, please refer to the API section
or to the usage examples described in :class:`packaging.pypi.client.Client`.
Finding releases
^^^^^^^^^^^^^^^^
It's a common use case to search for "things" within the index. We can
basically search for projects by their name, which is the most common way for
users to search (e.g. "give me the latest version of the FooBar project").
This can be accomplished using the following syntax::
>>> client = xmlrpc.Client()
>>> client.get_release("FooBar (<= 1.3)")
<FooBar 1.2.1>
>>> client.get_releases("FooBar (<= 1.3)")
[FooBar 1.1, FooBar 1.1.1, FooBar 1.2, FooBar 1.2.1]
We can also search on specific fields::
>>> client.search_projects(field=value)
You can specify the operator to use; the default is "or"::
>>> client.search_projects(field=value, operator="and")
The specific fields you can search are:
* name
* version
* author
* author_email
* maintainer
* maintainer_email
* home_page
* license
* summary
* description
* keywords
* platform
* download_url
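For instance, a combined search on several of these fields could look like the
following sketch (the project name, e-mail address and result shown here are
purely illustrative)::

    >>> client = xmlrpc.Client()
    >>> client.search_projects(name="foo", author_email="jane@example.org",
    ...                        operator="and")
    [<Project "foo">]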
Getting metadata information
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
XML-RPC is the preferred way to retrieve metadata information from indexes.
It's really simple to do::
>>> client = xmlrpc.Client()
>>> client.get_metadata("FooBar", "1.1")
<ReleaseInfo FooBar 1.1>
Assuming we already have a :class:`packaging.pypi.ReleaseInfo` object defined,
it's possible to pass it to the xmlrpc client to retrieve and complete its
metadata::
>>> foobar11 = ReleaseInfo("FooBar", "1.1")
>>> client = xmlrpc.Client()
>>> returned_release = client.get_metadata(release=foobar11)
>>> returned_release
<ReleaseInfo FooBar 1.1>
Get all the releases of a project
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To retrieve all the releases for a project, use `get_releases`::
>>> client = xmlrpc.Client()
>>> client.get_releases("FooBar")
[<ReleaseInfo FooBar 0.9>, <ReleaseInfo FooBar 1.0>, <ReleaseInfo FooBar 1.1>]
Get information about distributions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Indexes have information about projects, releases **and** distributions.
If you're not familiar with those, please refer to the documentation of
:mod:`packaging.pypi.dist`.
It's possible to retrieve information about distributions, e.g. "what are the
existing distributions for this release? How can I retrieve them?"::
>>> client = xmlrpc.Client()
>>> release = client.get_distributions("FooBar", "1.1")
>>> release.dists
{'sdist': <FooBar 1.1 sdist>, 'bdist': <FooBar 1.1 bdist>}
As you see, this does not return a list of distributions, but a release,
because a release can be used like a list of distributions.

Doc/library/packaging.rst
View File

@ -0,0 +1,78 @@
:mod:`packaging` --- Packaging support
======================================
.. module:: packaging
:synopsis: Packaging system and building blocks for other packaging systems.
.. sectionauthor:: Fred L. Drake, Jr. <fdrake@acm.org>, distutils and packaging
contributors
The :mod:`packaging` package provides support for building, packaging,
distributing and installing additional projects into a Python installation.
Projects may include Python modules, extension modules, packages and scripts.
:mod:`packaging` also provides building blocks for other packaging systems
that are not tied to the command system.
This manual is the reference documentation for those standalone building
blocks and for extending Packaging. If you're looking for the user-centric
guides to install a project or package your own code, head to `See also`__.
Building blocks
---------------
.. toctree::
:maxdepth: 2
:numbered:
packaging-misc
packaging.version
packaging.metadata
packaging.database
packaging.depgraph
packaging.pypi
packaging.pypi.dist
packaging.pypi.simple
packaging.pypi.xmlrpc
packaging.install
The command machinery
---------------------
.. toctree::
:maxdepth: 2
:numbered:
packaging.dist
packaging.command
packaging.compiler
packaging.fancy_getopt
Other utilities
----------------
.. toctree::
:maxdepth: 2
:numbered:
packaging.util
packaging.tests.pypi_server
.. XXX missing: compat config create (dir_util) run pypi.{base,mirrors}
.. __:
.. seealso::
:ref:`packaging-index`
The manual for developers of Python projects who want to package and
distribute them. This describes how to use :mod:`packaging` to make
projects easily found and added to an existing Python installation.
:ref:`packaging-install-index`
A user-centered manual which includes information on adding projects
into an existing Python installation. You do not need to be a Python
programmer to read this manual.

View File

@ -0,0 +1,105 @@
:mod:`packaging.tests.pypi_server` --- PyPI mock server
=======================================================
.. module:: packaging.tests.pypi_server
:synopsis: Mock server used to test PyPI-related modules and commands.
When you are testing code that works with Packaging, you might find these tools
useful.
The mock server
---------------
.. class:: PyPIServer
PyPIServer is a class that implements an HTTP server running in a separate
thread. All it does is record the requests for further inspection. The recorded
data is available under the ``requests`` attribute. The default
HTTP response can be overridden with the ``default_response_status``,
``default_response_headers`` and ``default_response_data`` attributes.
By default, when the server is accessed with URLs beginning with `/simple/`,
it also records your requests, but will look for files under
the `/tests/pypiserver/simple/` path.
You can tell the server to serve static files for other paths. This can be
accomplished by using the `static_uri_paths` parameter, as below::
server = PyPIServer(static_uri_paths=["first_path", "second_path"])
You need to create the content that will be served under the
`/tests/pypiserver/default` path. If you want to serve content from another
place, you can also specify another filesystem path (which needs to be under
`tests/pypiserver/`). This will replace the default behavior of the server, and
it will not serve content from the `default` dir ::
server = PyPIServer(static_filesystem_paths=["path/to/your/dir"])
If you just need to add some paths to the existing ones, you can do as shown,
keeping in mind that the server will always try to load paths in reverse order
(e.g. here, try "another/super/path" then the default one) ::
server = PyPIServer(test_static_path="another/super/path")
server = PyPIServer("another/super/path")
# or
server.static_filesystem_paths.append("another/super/path")
As a result, when you need to use the PyPIServer in your tests, the best
practice for isolating test cases is to place the common files in the
`default` folder and to create a directory for each specific test case::
server = PyPIServer(static_filesystem_paths = ["default", "test_pypi_server"],
static_uri_paths=["simple", "external"])
Base class and decorator for tests
----------------------------------
.. class:: PyPIServerTestCase
``PyPIServerTestCase`` is a test case class with setUp and tearDown methods that
take care of a single PyPIServer instance attached as a ``pypi`` attribute on
the test class. Use it as one of the base classes in your test case::
class UploadTestCase(PyPIServerTestCase):
def test_something(self):
cmd = self.prepare_command()
cmd.ensure_finalized()
cmd.repository = self.pypi.full_address
cmd.run()
environ, request_data = self.pypi.requests[-1]
self.assertEqual(request_data, EXPECTED_REQUEST_DATA)
.. decorator:: use_pypi_server
You can also use a decorator for your tests if you do not need the same server
instance throughout your test case. This lets you specify, for each test
method, some initialization parameters for the server.
For this, you need to add a `server` parameter to your method, like this::
class SampleTestCase(TestCase):
@use_pypi_server()
def test_something(self, server):
...
The decorator will instantiate the server for you, and run and stop it just
before and after your method call. You can also pass arguments to the server
initializer, like this::
class SampleTestCase(TestCase):
@use_pypi_server("test_case_name")
def test_something(self, server):
...

View File

@ -0,0 +1,186 @@
:mod:`packaging.util` --- Miscellaneous utility functions
=========================================================
.. module:: packaging.util
:synopsis: Miscellaneous utility functions.
This module contains various helpers for the other modules.
.. XXX a number of functions are missing, but the module may be split first
(it's ginormous right now, some things could go to compat for example)
.. function:: get_platform()
Return a string that identifies the current platform. This is used mainly to
distinguish platform-specific build directories and platform-specific built
distributions. Typically includes the OS name and version and the
architecture (as supplied by 'os.uname()'), although the exact information
included depends on the OS; e.g. for IRIX the architecture isn't particularly
important (IRIX only runs on SGI hardware), but for Linux the kernel version
isn't particularly important.
Examples of returned values:
* ``linux-i586``
* ``linux-alpha``
* ``solaris-2.6-sun4u``
* ``irix-5.3``
* ``irix64-6.2``
For non-POSIX platforms, currently just returns ``sys.platform``.
For Mac OS X systems the OS version reflects the minimal version on which
binaries will run (that is, the value of ``MACOSX_DEPLOYMENT_TARGET``
during the build of Python), not the OS version of the current system.
For universal binary builds on Mac OS X the architecture value reflects
the universal binary status instead of the architecture of the current
processor. For 32-bit universal binaries the architecture is ``fat``,
for 64-bit universal binaries the architecture is ``fat64``, and
for 4-way universal binaries the architecture is ``universal``. Starting
from Python 2.7 and Python 3.2 the architecture ``fat3`` is used for
a 3-way universal build (ppc, i386, x86_64) and ``intel`` is used for
a universal build with the i386 and x86_64 architectures.
Examples of returned values on Mac OS X:
* ``macosx-10.3-ppc``
* ``macosx-10.3-fat``
* ``macosx-10.5-universal``
* ``macosx-10.6-intel``
.. XXX reinvention of platform module?
.. function:: convert_path(pathname)
Return *pathname* as a name that will work on the native filesystem, i.e.
split it on '/' and put it back together again using the current directory
separator. Needed because filenames in the setup script are always supplied
in Unix style, and have to be converted to the local convention before we
can actually use them in the filesystem. Raises :exc:`ValueError` on
non-Unix-ish systems if *pathname* either starts or ends with a slash.
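As an illustration, a Unix-style path from a setup script would be converted
like this on Windows (the path is made up; on POSIX systems the name is
returned unchanged)::

    >>> from packaging.util import convert_path
    >>> convert_path('docs/build/html')
    'docs\\build\\html'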
.. function:: change_root(new_root, pathname)
Return *pathname* with *new_root* prepended. If *pathname* is relative, this
is equivalent to ``os.path.join(new_root, pathname)``. Otherwise, it requires
making *pathname* relative and then joining the two, which is tricky on
DOS/Windows.
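For example, on a POSIX system (the paths are illustrative)::

    >>> from packaging.util import change_root
    >>> change_root('/tmp/stage', '/usr/lib/python3.3/foo.py')
    '/tmp/stage/usr/lib/python3.3/foo.py'
    >>> change_root('/tmp/stage', 'lib/foo.py')
    '/tmp/stage/lib/foo.py'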
.. function:: check_environ()
Ensure that ``os.environ`` has all the environment variables we guarantee that
users can use in config files, command-line options, etc. Currently this
includes:
* :envvar:`HOME` - user's home directory (Unix only)
* :envvar:`PLAT` - description of the current platform, including hardware
and OS (see :func:`get_platform`)
.. function:: find_executable(executable, path=None)
Search the path for a given executable name.
.. function:: subst_vars(s, local_vars)
Perform shell/Perl-style variable substitution on *s*. Every occurrence of
``$`` followed by a name is considered a variable, and the variable is
substituted by the value found in the *local_vars* dictionary, or in
``os.environ`` if it's not in *local_vars*. ``os.environ`` is first
checked/augmented to guarantee that it contains certain values: see
:func:`check_environ`. Raise :exc:`ValueError` for any variables not found
in either *local_vars* or ``os.environ``.
Note that this is not a fully-fledged string interpolation function. A valid
``$variable`` can consist only of upper and lower case letters, numbers and
an underscore. No { } or ( ) style quoting is available.
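A short illustrative session (the variable names and values are made up for
the example)::

    >>> from packaging.util import subst_vars
    >>> subst_vars('$name-$version.tar.gz', {'name': 'foo', 'version': '1.0'})
    'foo-1.0.tar.gz'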
.. function:: split_quoted(s)
Split a string up according to Unix shell-like rules for quotes and
backslashes. In short: words are delimited by spaces, as long as those spaces
are not escaped by a backslash, or inside a quoted string. Single and double
quotes are equivalent, and the quote characters can be backslash-escaped.
The backslash is stripped from any two-character escape sequence, leaving
only the escaped character. The quote characters are stripped from any
quoted string. Returns a list of words.
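For example, following the rules described above::

    >>> from packaging.util import split_quoted
    >>> split_quoted('gcc -DNDEBUG "-I/opt/my include" -O2')
    ['gcc', '-DNDEBUG', '-I/opt/my include', '-O2']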
.. TODO Should probably be moved into the standard library.
.. function:: execute(func, args[, msg=None, verbose=0, dry_run=0])
Perform some action that affects the outside world (for instance, writing to
the filesystem). Such actions are special because they are disabled by the
*dry_run* flag. This method takes care of all that bureaucracy for you;
all you have to do is supply the function to call and an argument tuple for
it (to embody the "external action" being performed), and an optional message
to print.
.. function:: newer(source, target)
Return true if *source* exists and is more recently modified than *target*,
or if *source* exists and *target* doesn't. Return false if both exist and
*target* is the same age or newer than *source*. Raise
:exc:`PackagingFileError` if *source* does not exist.
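A typical usage pattern is to guard a costly rebuild step; this is only a
sketch, and the file names and rebuild action are hypothetical::

    from packaging.util import newer

    # rebuild only when the config has changed or the stamp file is missing
    # (file names are hypothetical)
    if newer('setup.cfg', 'build/stamp'):
        print('setup.cfg changed, rebuilding...')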
.. function:: strtobool(val)
Convert a string representation of truth to true (1) or false (0).
True values are ``y``, ``yes``, ``t``, ``true``, ``on`` and ``1``; false
values are ``n``, ``no``, ``f``, ``false``, ``off`` and ``0``. Raises
:exc:`ValueError` if *val* is anything else.
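For instance, following the accepted values listed above::

    >>> from packaging.util import strtobool
    >>> strtobool('yes')
    1
    >>> strtobool('off')
    0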
.. TODO Add :term: markup to bytecode when merging into the stdlib
.. function:: byte_compile(py_files[, optimize=0, force=0, prefix=None, base_dir=None, verbose=1, dry_run=0, direct=None])
Byte-compile a collection of Python source files to either :file:`.pyc` or
:file:`.pyo` files in the same directory. *py_files* is a list of files to
compile; any files that don't end in :file:`.py` are silently skipped.
*optimize* must be one of the following:
* ``0`` - don't optimize (generate :file:`.pyc`)
* ``1`` - normal optimization (like ``python -O``)
* ``2`` - extra optimization (like ``python -OO``)
If *force* is true, all files are recompiled regardless of timestamps.
The source filename encoded in each bytecode file defaults to the filenames
listed in *py_files*; you can modify these with *prefix* and *base_dir*.
*prefix* is a string that will be stripped off of each source filename, and
*base_dir* is a directory name that will be prepended (after *prefix* is
stripped). You can supply either or both (or neither) of *prefix* and
*base_dir*, as you wish.
If *dry_run* is true, doesn't actually do anything that would affect the
filesystem.
Byte-compilation is either done directly in this interpreter process with the
standard :mod:`py_compile` module, or indirectly by writing a temporary
script and executing it. Normally, you should let :func:`byte_compile`
figure out whether to use direct compilation or not (see the source for details).
The *direct* flag is used by the script generated in indirect mode; unless
you know what you're doing, leave it set to ``None``.
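As a sketch, byte-compiling a couple of files from a build directory while
stripping the build prefix from the recorded source paths (the paths are
hypothetical)::

    from packaging.util import byte_compile

    byte_compile(['build/lib/foo.py', 'build/lib/bar/baz.py'],
                 optimize=0,            # produce .pyc files
                 prefix='build/lib/',   # strip this from recorded filenames
                 verbose=1)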
.. function:: rfc822_escape(header)
Return a version of *header* escaped for inclusion in an :rfc:`822` header, by
ensuring there are 8 spaces after each newline. Note that it does no
other modification of the string.
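For example, assuming the behavior described above::

    >>> from packaging.util import rfc822_escape
    >>> rfc822_escape('first line\nsecond line')
    'first line\n        second line'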
.. TODO this _can_ be replaced

View File

@ -0,0 +1,104 @@
:mod:`packaging.version` --- Version number classes
===================================================
.. module:: packaging.version
:synopsis: Classes that represent project version numbers.
This module contains classes and functions useful to deal with version numbers.
It's an implementation of version specifiers `as defined in PEP 345
<http://www.python.org/dev/peps/pep-0345/#version-specifiers>`_.
Version numbers
---------------
.. class:: NormalizedVersion(self, s, error_on_huge_major_num=True)
A specific version of a distribution, as described in PEP 345. *s* is a
string object containing the version number (for example ``'1.2b1'``);
*error_on_huge_major_num* is a boolean specifying whether an apparent use of a
year or full date as the major version number should be considered an error.
The rationale for the second argument is that some projects have used years
or full dates as version numbers, which could confuse the sorting done by
some packaging systems.
Instances of this class can be compared and sorted::
>>> NormalizedVersion('1.2b1') < NormalizedVersion('1.2')
True
:class:`NormalizedVersion` is used internally by :class:`VersionPredicate` to
do its work.
.. class:: IrrationalVersionError
Exception raised when an invalid string is given to
:class:`NormalizedVersion`.
>>> NormalizedVersion("irrational_version_number")
Traceback (most recent call last):
...
IrrationalVersionError: irrational_version_number
.. function:: suggest_normalized_version(s)
Before standardization in PEP 386, various schemes were in use. Packaging
provides a function to try to convert any string to a valid, normalized
version::
>>> suggest_normalized_version('2.1-rc1')
'2.1c1'
If :func:`suggest_normalized_version` can't make sense of the given string,
it will return ``None``::
>>> print(suggest_normalized_version('not a version'))
None
Version predicates
------------------
.. class:: VersionPredicate(predicate)
This class deals with the parsing of field values like
``ProjectName (>=version)``.
.. method:: match(version)
Test if a version number matches the predicate:
>>> version = VersionPredicate("ProjectName (<1.2, >1.0)")
>>> version.match("1.2.1")
False
>>> version.match("1.1.1")
True
Validation helpers
------------------
If you want to use :term:`LBYL`-style checks instead of instantiating the
classes and catching :class:`IrrationalVersionError` and :class:`ValueError`,
you can use these functions:
.. function:: is_valid_version(predicate)
Check whether the given string is a valid version number. Examples of valid
strings: ``'1.2'``, ``'4.2.0.dev4'``, ``'2.5.4.post2'``.
.. function:: is_valid_versions(predicate)
Check whether the given string is a valid value for specifying multiple
versions, such as in the Requires-Python field. Example: ``'2.7, >=3.2'``.
.. function:: is_valid_predicate(predicate)
Check whether the given string is a valid version predicate. Examples:
``'some.project == 4.5, <= 4.7'``, ``'speciallib (> 1.0, != 1.4.2, < 2.0)'``.
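A short illustrative session (the strings below follow the examples and rules
given above)::

    >>> from packaging.version import (is_valid_version, is_valid_versions,
    ...                                is_valid_predicate)
    >>> is_valid_version('1.2.0b1')
    True
    >>> is_valid_version('not a version')
    False
    >>> is_valid_versions('2.7, >=3.2')
    True
    >>> is_valid_predicate('FooBar (>=1.0, <2.0)')
    True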

View File

@ -193,7 +193,7 @@ Example
-------
To demonstrate several uses of the :func:`pprint` function and its parameters,
let's fetch information about a package from PyPI::
let's fetch information about a project from PyPI::
>>> import json
>>> import pprint
@ -201,8 +201,8 @@ let's fetch information about a package from PyPI::
>>> with urlopen('http://pypi.python.org/pypi/configparser/json') as url:
... http_info = url.info()
... raw_data = url.read().decode(http_info.get_content_charset())
>>> package_data = json.loads(raw_data)
>>> result = {'headers': http_info.items(), 'body': package_data}
>>> project_info = json.loads(raw_data)
>>> result = {'headers': http_info.items(), 'body': project_info}
In its basic form, :func:`pprint` shows the whole object::

View File

@ -25,4 +25,5 @@ overview:
inspect.rst
site.rst
fpectl.rst
packaging.rst
distutils.rst

View File

@ -43,6 +43,12 @@ The :mod:`random` module also provides the :class:`SystemRandom` class which
uses the system function :func:`os.urandom` to generate random numbers
from sources provided by the operating system.
.. warning::
The generators of the :mod:`random` module should not be used for security
purposes. Use :func:`ssl.RAND_bytes` if you require a cryptographically
secure pseudorandom number generator.
Bookkeeping functions:

View File

@ -1301,24 +1301,27 @@ The text categories are specified with regular expressions. The technique is
to combine those into a single master regular expression and to loop over
successive matches::
Token = collections.namedtuple('Token', 'typ value line column')
import collections
import re
Token = collections.namedtuple('Token', ['typ', 'value', 'line', 'column'])
def tokenize(s):
keywords = {'IF', 'THEN', 'FOR', 'NEXT', 'GOSUB', 'RETURN'}
tok_spec = [
('NUMBER', r'\d+(\.\d*)?'), # Integer or decimal number
('ASSIGN', r':='), # Assignment operator
('END', ';'), # Statement terminator
('ID', r'[A-Za-z]+'), # Identifiers
('OP', r'[+*\/\-]'), # Arithmetic operators
('NEWLINE', r'\n'), # Line endings
('SKIP', r'[ \t]'), # Skip over spaces and tabs
keywords = {'IF', 'THEN', 'ENDIF', 'FOR', 'NEXT', 'GOSUB', 'RETURN'}
token_specification = [
('NUMBER', r'\d+(\.\d*)?'), # Integer or decimal number
('ASSIGN', r':='), # Assignment operator
('END', r';'), # Statement terminator
('ID', r'[A-Za-z]+'), # Identifiers
('OP', r'[+*\/\-]'), # Arithmetic operators
('NEWLINE', r'\n'), # Line endings
('SKIP', r'[ \t]'), # Skip over spaces and tabs
]
tok_re = '|'.join('(?P<%s>%s)' % pair for pair in tok_spec)
gettok = re.compile(tok_re).match
tok_regex = '|'.join('(?P<%s>%s)' % pair for pair in token_specification)
get_token = re.compile(tok_regex).match
line = 1
pos = line_start = 0
mo = gettok(s)
mo = get_token(s)
while mo is not None:
typ = mo.lastgroup
if typ == 'NEWLINE':
@ -1330,13 +1333,15 @@ successive matches::
typ = val
yield Token(typ, val, line, mo.start()-line_start)
pos = mo.end()
mo = gettok(s, pos)
mo = get_token(s, pos)
if pos != len(s):
raise RuntimeError('Unexpected character %r on line %d' %(s[pos], line))
statements = '''\
total := total + price * quantity;
tax := price * 0.05;
statements = '''
IF quantity THEN
total := total + price * quantity;
tax := price * 0.05;
ENDIF;
'''
for token in tokenize(statements):
@ -1344,17 +1349,22 @@ successive matches::
The tokenizer produces the following output::
Token(typ='ID', value='total', line=1, column=8)
Token(typ='ASSIGN', value=':=', line=1, column=14)
Token(typ='ID', value='total', line=1, column=17)
Token(typ='OP', value='+', line=1, column=23)
Token(typ='ID', value='price', line=1, column=25)
Token(typ='OP', value='*', line=1, column=31)
Token(typ='ID', value='quantity', line=1, column=33)
Token(typ='END', value=';', line=1, column=41)
Token(typ='ID', value='tax', line=2, column=9)
Token(typ='ASSIGN', value=':=', line=2, column=13)
Token(typ='ID', value='price', line=2, column=16)
Token(typ='OP', value='*', line=2, column=22)
Token(typ='NUMBER', value='0.05', line=2, column=24)
Token(typ='END', value=';', line=2, column=28)
Token(typ='IF', value='IF', line=2, column=5)
Token(typ='ID', value='quantity', line=2, column=8)
Token(typ='THEN', value='THEN', line=2, column=17)
Token(typ='ID', value='total', line=3, column=9)
Token(typ='ASSIGN', value=':=', line=3, column=15)
Token(typ='ID', value='total', line=3, column=18)
Token(typ='OP', value='+', line=3, column=24)
Token(typ='ID', value='price', line=3, column=26)
Token(typ='OP', value='*', line=3, column=32)
Token(typ='ID', value='quantity', line=3, column=34)
Token(typ='END', value=';', line=3, column=42)
Token(typ='ID', value='tax', line=4, column=9)
Token(typ='ASSIGN', value=':=', line=4, column=13)
Token(typ='ID', value='price', line=4, column=16)
Token(typ='OP', value='*', line=4, column=22)
Token(typ='NUMBER', value='0.05', line=4, column=24)
Token(typ='END', value=';', line=4, column=28)
Token(typ='ENDIF', value='ENDIF', line=5, column=5)
Token(typ='END', value=';', line=5, column=10)

View File

@ -187,10 +187,9 @@ The :mod:`signal` module defines the following functions:
Send the signal *signum* to the thread *thread_id*, another thread in the same
process as the caller. The signal is asynchronously directed to thread.
*thread_id* can be read from the :attr:`~threading.Thread.ident` attribute
of :attr:`threading.Thread`. For example,
``threading.current_thread().ident`` gives the identifier of the current
thread.
Use :func:`threading.get_ident()` or the :attr:`~threading.Thread.ident`
attribute of :attr:`threading.Thread` to get a 'thread identifier' for
*thread_id*.
If *signum* is 0, then no signal is sent, but error checking is still
performed; this can be used to check if a thread is still running.

View File

@ -129,6 +129,10 @@ empty, and the path manipulations are skipped; however the import of
unless the :program:`python` interpreter was started with the :option:`-S`
flag.
.. versionchanged:: 3.3
This function used to be called unconditionally.
.. function:: addsitedir(sitedir, known_paths=None)
Adds a directory to sys.path and processes its pth files.

View File

@ -153,8 +153,21 @@ Server Objects
.. method:: BaseServer.serve_forever(poll_interval=0.5)
Handle requests until an explicit :meth:`shutdown` request. Polls for
shutdown every *poll_interval* seconds.
shutdown every *poll_interval* seconds. It also calls
:meth:`service_actions`, which may be used by a subclass or mixin to provide
various cleanup actions. For example, the :class:`ForkingMixIn` class uses
:meth:`service_actions` to clean up zombie child processes.
.. versionchanged:: 3.3
Added service_actions call to the serve_forever method.
.. method:: BaseServer.service_actions()
This is called by the :meth:`serve_forever` loop. This method can be
overridden by subclasses or mixin classes to add cleanup or service-specific
actions.
.. versionadded:: 3.3
.. method:: BaseServer.shutdown()

View File

@ -162,6 +162,35 @@ instead.
Random generation
^^^^^^^^^^^^^^^^^
.. function:: RAND_bytes(num)
Returns *num* cryptographically strong pseudo-random bytes. Raises an
:class:`SSLError` if the PRNG has not been seeded with enough data or if the
operation is not supported by the current RAND method. :func:`RAND_status`
can be used to check the status of the PRNG and :func:`RAND_add` can be used
to seed the PRNG.
Read the Wikipedia article, `Cryptographically secure pseudorandom number
generator (CSPRNG)
<http://en.wikipedia.org/wiki/Cryptographically_secure_pseudorandom_number_generator>`_,
to get the requirements of a cryptographically secure generator.
.. versionadded:: 3.3
.. function:: RAND_pseudo_bytes(num)
Returns (bytes, is_cryptographic): bytes are *num* pseudo-random bytes,
is_cryptographic is True if the bytes generated are cryptographically
strong. Raises an :class:`SSLError` if the operation is not supported by the
current RAND method.
Generated pseudo-random byte sequences will be unique if they are of
sufficient length, but are not necessarily unpredictable. They can be used
for non-cryptographic purposes and for certain purposes in cryptographic
protocols, but usually not for key generation etc.
.. versionadded:: 3.3
.. function:: RAND_status()
Returns True if the SSL pseudo-random number generator has been seeded with
@ -171,7 +200,7 @@ Random generation
.. function:: RAND_egd(path)
If you are running an entropy-gathering daemon (EGD) somewhere, and ``path``
If you are running an entropy-gathering daemon (EGD) somewhere, and *path*
is the pathname of a socket connection open to it, this will read 256 bytes
of randomness from the socket, and add it to the SSL pseudo-random number
generator to increase the security of generated secret keys. This is
@ -182,8 +211,8 @@ Random generation
.. function:: RAND_add(bytes, entropy)
Mixes the given ``bytes`` into the SSL pseudo-random number generator. The
parameter ``entropy`` (a float) is a lower bound on the entropy contained in
Mixes the given *bytes* into the SSL pseudo-random number generator. The
parameter *entropy* (a float) is a lower bound on the entropy contained in
string (so you can always use :const:`0.0`). See :rfc:`1750` for more
information on sources of entropy.

View File

@ -48,6 +48,17 @@ This module defines the following functions and objects:
returned.
.. function:: get_ident()
Return the 'thread identifier' of the current thread. This is a nonzero
integer. Its value has no direct meaning; it is intended as a magic cookie
to be used e.g. to index a dictionary of thread-specific data. Thread
identifiers may be recycled when a thread exits and another thread is
created.
.. versionadded:: 3.3
.. function:: enumerate()
Return a list of all :class:`Thread` objects currently alive. The list
@ -332,10 +343,10 @@ impossible to detect the termination of alien threads.
.. attribute:: ident
The 'thread identifier' of this thread or ``None`` if the thread has not
been started. This is a nonzero integer. See the
:func:`thread.get_ident()` function. Thread identifiers may be recycled
when a thread exits and another thread is created. The identifier is
available even after the thread has exited.
been started. This is a nonzero integer. See the :func:`get_ident()`
function. Thread identifiers may be recycled when a thread exits and
another thread is created. The identifier is available even after the
thread has exited.
.. method:: is_alive()

Doc/packaging/builtdist.rst
View File

@ -0,0 +1,307 @@
.. _packaging-built-dist:
****************************
Creating Built Distributions
****************************
A "built distribution" is what you're probably used to thinking of either as a
"binary package" or an "installer" (depending on your background). It's not
necessarily binary, though, because it might contain only Python source code
and/or byte-code; and we don't call it a package, because that word is already
spoken for in Python. (And "installer" is a term specific to the world of
mainstream desktop systems.)
A built distribution is how you make life as easy as possible for installers of
your module distribution: for users of RPM-based Linux systems, it's a binary
RPM; for Windows users, it's an executable installer; for Debian-based Linux
users, it's a Debian package; and so forth. Obviously, no one person will be
able to create built distributions for every platform under the sun, so the
Distutils are designed to enable module developers to concentrate on their
specialty---writing code and creating source distributions---while an
intermediary species called *packagers* springs up to turn source distributions
into built distributions for as many platforms as there are packagers.
Of course, the module developer could be his own packager; or the packager could
be a volunteer "out there" somewhere who has access to a platform which the
original developer does not; or it could be software periodically grabbing new
source distributions and turning them into built distributions for as many
platforms as the software has access to. Regardless of who they are, a packager
uses the setup script and the :command:`bdist` command family to generate built
distributions.
As a simple example, if I run the following command in the Distutils source
tree::
python setup.py bdist
then the Distutils builds my module distribution (the Distutils itself in this
case), does a "fake" installation (also in the :file:`build` directory), and
creates the default type of built distribution for my platform. The default
format for built distributions is a "dumb" tar file on Unix, and a simple
executable installer on Windows. (That tar file is considered "dumb" because it
has to be unpacked in a specific location to work.)
Thus, the above command on a Unix system creates
:file:`Distutils-1.0.{plat}.tar.gz`; unpacking this tarball from the right place
installs the Distutils just as though you had downloaded the source distribution
and run ``python setup.py install``. (The "right place" is either the root of
the filesystem or Python's :file:`{prefix}` directory, depending on the options
given to the :command:`bdist_dumb` command; the default is to make dumb
distributions relative to :file:`{prefix}`.)
Obviously, for pure Python distributions, this isn't any simpler than just
running ``python setup.py install``\ ---but for non-pure distributions, which
include extensions that would need to be compiled, it can mean the difference
between someone being able to use your extensions or not. And creating "smart"
built distributions, such as an executable installer for
Windows, is far more convenient for users even if your distribution doesn't
include any extensions.
The :command:`bdist` command has a :option:`--formats` option, similar to the
:command:`sdist` command, which you can use to select the types of built
distribution to generate: for example, ::
python setup.py bdist --format=zip
would, when run on a Unix system, create :file:`Distutils-1.0.{plat}.zip`\
---again, this archive would be unpacked from the root directory to install the
Distutils.
The available formats for built distributions are:
+-------------+------------------------------+---------+
| Format | Description | Notes |
+=============+==============================+=========+
| ``gztar`` | gzipped tar file | (1),(3) |
| | (:file:`.tar.gz`) | |
+-------------+------------------------------+---------+
| ``ztar`` | compressed tar file | \(3) |
| | (:file:`.tar.Z`) | |
+-------------+------------------------------+---------+
| ``tar`` | tar file (:file:`.tar`) | \(3) |
+-------------+------------------------------+---------+
| ``zip`` | zip file (:file:`.zip`) | (2),(4) |
+-------------+------------------------------+---------+
| ``wininst`` | self-extracting ZIP file for | \(4) |
| | Windows | |
+-------------+------------------------------+---------+
| ``msi`` | Microsoft Installer. | |
+-------------+------------------------------+---------+
Notes:
(1)
default on Unix
(2)
default on Windows
(3)
requires external utilities: :program:`tar` and possibly one of :program:`gzip`,
:program:`bzip2`, or :program:`compress`
(4)
requires either external :program:`zip` utility or :mod:`zipfile` module (part
of the standard Python library since Python 1.6)
You don't have to use the :command:`bdist` command with the :option:`--formats`
option; you can also use the command that directly implements the format you're
interested in. Some of these :command:`bdist` "sub-commands" actually generate
several similar formats; for instance, the :command:`bdist_dumb` command
generates all the "dumb" archive formats (``tar``, ``ztar``, ``gztar``, and
``zip``). The :command:`bdist` sub-commands, and the formats generated by
each, are:
+--------------------------+-----------------------+
| Command | Formats |
+==========================+=======================+
| :command:`bdist_dumb` | tar, ztar, gztar, zip |
+--------------------------+-----------------------+
| :command:`bdist_wininst` | wininst |
+--------------------------+-----------------------+
| :command:`bdist_msi` | msi |
+--------------------------+-----------------------+
The following sections give details on the individual :command:`bdist_\*`
commands.
.. _packaging-creating-dumb:
Creating dumb built distributions
=================================
.. XXX Need to document absolute vs. prefix-relative packages here, but first
I have to implement it!
.. _packaging-creating-wininst:
Creating Windows Installers
===========================
Executable installers are the natural format for binary distributions on
Windows. They display a nice graphical user interface, display some information
about the module distribution to be installed taken from the metadata in the
setup script, let the user select a few options, and start or cancel the
installation.
Since the metadata is taken from the setup script, creating Windows installers
is usually as easy as running::
python setup.py bdist_wininst
or the :command:`bdist` command with the :option:`--formats` option::
python setup.py bdist --formats=wininst
If you have a pure module distribution (only containing pure Python modules and
packages), the resulting installer will be version independent and have a name
like :file:`foo-1.0.win32.exe`. These installers can even be created on Unix
platforms or Mac OS X.
If you have a non-pure distribution, the extensions can only be created on a
Windows platform, and will be Python version dependent. The installer filename
will reflect this and now has the form :file:`foo-1.0.win32-py2.0.exe`. You
have to create a separate installer for every Python version you want to
support.
.. TODO Add :term: markup to bytecode when merging into the stdlib
The installer will try to compile pure modules into bytecode after installation
on the target system in normal and optimizing mode. If you don't want this to
happen for some reason, you can run the :command:`bdist_wininst` command with
the :option:`--no-target-compile` and/or the :option:`--no-target-optimize`
option.
By default the installer will display the cool "Python Powered" logo when it is
run, but you can also supply your own 152x261 bitmap which must be a Windows
:file:`.bmp` file with the :option:`--bitmap` option.
The installer will also display a large title on the desktop background window
when it is run, which is constructed from the name of your distribution and the
version number. This can be changed to another text by using the
:option:`--title` option.
The installer file will be written to the "distribution directory" --- normally
:file:`dist/`, but customizable with the :option:`--dist-dir` option.
.. _packaging-cross-compile-windows:
Cross-compiling on Windows
==========================
Starting with Python 2.6, packaging is capable of cross-compiling between
Windows platforms. In practice, this means that with the correct tools
installed, you can use a 32bit version of Windows to create 64bit extensions
and vice-versa.
To build for an alternate platform, specify the :option:`--plat-name` option
to the build command. Valid values are currently 'win32', 'win-amd64' and
'win-ia64'. For example, on a 32bit version of Windows, you could execute::
python setup.py build --plat-name=win-amd64
to build a 64bit version of your extension. The Windows Installers also
support this option, so the command::
python setup.py build --plat-name=win-amd64 bdist_wininst
would create a 64bit installation executable on your 32bit version of Windows.
To cross-compile, you must download the Python source code and cross-compile
Python itself for the platform you are targeting - it is not possible from a
binary installation of Python (as the .lib and other files for other platforms
are not included). In practice, this means the user of a 32 bit operating
system will need to use Visual Studio 2008 to open the
:file:`PCBuild/PCbuild.sln` solution in the Python source tree and build the
"x64" configuration of the 'pythoncore' project before cross-compiling
extensions is possible.
Note that by default, Visual Studio 2008 does not install 64bit compilers or
tools. You may need to reexecute the Visual Studio setup process and select
these tools (using Control Panel->[Add/Remove] Programs is a convenient way to
check or modify your existing install.)
.. _packaging-postinstallation-script:
The Postinstallation script
---------------------------
Starting with Python 2.3, a postinstallation script can be specified with the
:option:`--install-script` option. The basename of the script must be
specified, and the script filename must also be listed in the scripts argument
to the setup function.
This script will be run at installation time on the target system after all the
files have been copied, with ``argv[1]`` set to :option:`-install`, and again at
uninstallation time before the files are removed with ``argv[1]`` set to
:option:`-remove`.
The installation script runs embedded in the Windows installer; all output
(``sys.stdout``, ``sys.stderr``) is redirected into a buffer and will be
displayed in the GUI after the script has finished.
Some functions especially useful in this context are available as additional
built-in functions in the installation script.
.. currentmodule:: bdist_wininst-postinst-script
.. function:: directory_created(path)
file_created(path)
These functions should be called when a directory or file is created by the
postinstall script at installation time. They will register *path* with the
uninstaller, so that it will be removed when the distribution is uninstalled.
To be safe, directories are only removed if they are empty.
.. function:: get_special_folder_path(csidl_string)
This function can be used to retrieve special folder locations on Windows like
the Start Menu or the Desktop. It returns the full path to the folder.
*csidl_string* must be one of the following strings::
"CSIDL_APPDATA"
"CSIDL_COMMON_STARTMENU"
"CSIDL_STARTMENU"
"CSIDL_COMMON_DESKTOPDIRECTORY"
"CSIDL_DESKTOPDIRECTORY"
"CSIDL_COMMON_STARTUP"
"CSIDL_STARTUP"
"CSIDL_COMMON_PROGRAMS"
"CSIDL_PROGRAMS"
"CSIDL_FONTS"
If the folder cannot be retrieved, :exc:`OSError` is raised.
Which folders are available depends on the exact Windows version, and probably
also the configuration. For details refer to Microsoft's documentation of the
:c:func:`SHGetSpecialFolderPath` function.
.. function:: create_shortcut(target, description, filename[, arguments[, workdir[, iconpath[, iconindex]]]])
This function creates a shortcut. *target* is the path to the program to be
started by the shortcut. *description* is the description of the shortcut.
*filename* is the title of the shortcut that the user will see. *arguments*
specifies the command-line arguments, if any. *workdir* is the working directory
for the program. *iconpath* is the file containing the icon for the shortcut,
and *iconindex* is the index of the icon in the file *iconpath*. Again, for
details consult the Microsoft documentation for the :class:`IShellLink`
interface.
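Putting these helpers together, a minimal postinstallation script could look
like the following sketch; the project name, shortcut title and arguments are
hypothetical, and the helper functions above only exist when the script is run
by the installer::

    # postinstall.py -- listed in the scripts argument and passed to
    # bdist_wininst via --install-script=postinstall.py
    import os
    import sys

    if sys.argv[1] == '-install':
        # create a Start Menu shortcut and register it for removal at
        # uninstallation time
        menu_dir = get_special_folder_path("CSIDL_PROGRAMS")
        shortcut = os.path.join(menu_dir, "My Project.lnk")
        create_shortcut(os.path.join(sys.prefix, "pythonw.exe"),
                        "Run My Project", shortcut, "-m myproject")
        file_created(shortcut)
    elif sys.argv[1] == '-remove':
        # nothing to do here: files registered with file_created() are
        # removed automatically by the uninstaller
        pass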
Vista User Access Control (UAC)
===============================
Starting with Python 2.6, bdist_wininst supports a :option:`--user-access-control`
option. The default is 'none' (meaning no UAC handling is done), and other
valid values are 'auto' (meaning prompt for UAC elevation if Python was
installed for all users) and 'force' (meaning always prompt for elevation).

View File

@ -0,0 +1,31 @@
=============
Command hooks
=============
Packaging provides a way of extending its commands by the use of pre- and
post-command hooks. The hooks are simple Python functions (or any callable
objects) and are specified in the config file using their fully qualified names.
The pre-hooks are run after the command is finalized (its options are
processed), but before it is run. The post-hooks are run after the command
itself. Both types of hooks receive an instance of the command object.
Sample usage of hooks
=====================
First, you need to make sure your hook is importable. This is usually
done by dropping it into the same folder where the `setup.py` file lives ::
# file: myhooks.py
def my_install_hook(install_cmd):
print "Oh la la! Someone is installing my project!"
Then, you need to point to it in your `setup.cfg` file, under the appropriate
command section ::
[install_dist]
pre-hook.project = myhooks.my_install_hook
The hooks defined in different config files (system-wide, user-wide and
package-wide) do not override each other as long as they are specified with
different aliases (additional names after the dot). The alias in the example
above is ``project``.

View File

@ -0,0 +1,349 @@
.. _packaging-command-reference:
*****************
Command Reference
*****************
This reference briefly documents all standard Packaging commands and some of
their options.
.. FIXME does not work: Use pysetup run --help-commands to list all
standard and extra commands available on your system, with their
description. Use pysetup run <command> --help to get help about the options
of one command.
Preparing distributions
=======================
:command:`check`
----------------
Perform some tests on the metadata of a distribution.
For example, it verifies that all required metadata fields are provided in the
:file:`setup.cfg` file.
.. TODO document reST checks
:command:`test`
---------------
Run a test suite.
When doing test-driven development, or running automated builds that need
testing before they are installed for downloading or use, it's often useful to
be able to run a project's unit tests without actually installing the project
anywhere. The :command:`test` command runs project's unit tests without
actually installing it, by temporarily putting the project's source on
:data:`sys.path`, after first running :command:`build_ext -i` to ensure that any
C extensions are built.
You can use this command in one of two ways: either by specifying a
unittest-compatible test suite for your project (or any callable that returns
it) or by passing a test runner function that will run your tests and display
results in the console. Both options take a Python dotted name in the form
``package.module.callable`` to specify the object to use.
If none of these options are specified, Packaging will try to perform test
discovery using either unittest (for Python 3.2 and higher) or unittest2 (for
older versions, if installed).
.. this is a pseudo-command name used to disambiguate the options in indexes and
links
.. program:: packaging test
.. cmdoption:: --suite=NAME, -s NAME
Specify the test suite (or module, class, or method) to be run. The default
for this option can be set in the project's :file:`setup.cfg` file:
.. code-block:: cfg
[test]
suite = mypackage.tests.get_all_tests
.. cmdoption:: --runner=NAME, -r NAME
Specify the test runner to be called.
:command:`config`
-----------------
Perform distribution configuration.
The build step
==============
This step is mainly useful to compile C/C++ libraries or extension modules. The
build commands can be run manually to check for syntax errors or packaging
issues (for example if the addition of a new source file was forgotten in the
:file:`setup.cfg` file), and is also run automatically by commands which need
it. Packaging checks the mtime of source and built files to avoid re-building
if it's not necessary.
:command:`build`
----------------
Build all files of a distribution, delegating to the other :command:`build_*`
commands to do the work.
:command:`build_clib`
---------------------
Build C libraries.
:command:`build_ext`
--------------------
Build C/C++ extension modules.
:command:`build_py`
-------------------
Build the Python modules (just copy them to the build directory) and
byte-compile them to .pyc files.
:command:`build_scripts`
------------------------
Build the scripts (just copy them to the build directory and adjust their
shebang if they're Python scripts).
:command:`clean`
----------------
Clean the build tree of the release.
.. program:: packaging clean
.. cmdoption:: --all, -a
Remove build directories for modules, scripts, etc., not only temporary build
by-products.
Creating source and built distributions
=======================================
:command:`sdist`
----------------
Build a source distribution for a release.
It is recommended that you always build and upload a source distribution. Users
of OSes with easy access to compilers and users of advanced packaging tools will
prefer to compile from source rather than using pre-built distributions. For
Windows users, providing a binary installer is also recommended practice.
:command:`bdist`
----------------
Build a binary distribution for a release.
This command will call other :command:`bdist_*` commands to create one or more
distributions depending on the options given. The default is to create a
.tar.gz archive on Unix and a zip archive on Windows or OS/2.
.. program:: packaging bdist
.. cmdoption:: --formats
Binary formats to build (comma-separated list).
.. cmdoption:: --show-formats
Dump list of available formats.
:command:`bdist_dumb`
---------------------
Build a "dumb" installer, a simple archive of files that could be unpacked under
``$prefix`` or ``$exec_prefix``.
:command:`bdist_wininst`
------------------------
Build a Windows installer.
:command:`bdist_msi`
--------------------
Build a `Microsoft Installer`_ (.msi) file.
.. _Microsoft Installer: http://msdn.microsoft.com/en-us/library/cc185688(VS.85).aspx
In most cases, the :command:`bdist_msi` installer is a better choice than the
:command:`bdist_wininst` installer, because it provides better support for Win64
platforms, allows administrators to perform non-interactive installations, and
allows installation through group policies.
Publishing distributions
========================
:command:`register`
-------------------
This command registers the current release with the Python Package Index. This
is described in more detail in :PEP:`301`.
.. TODO explain user and project registration with the web UI
:command:`upload`
-----------------
Upload source and/or binary distributions to PyPI.
The distributions have to be built on the same command line as the
:command:`upload` command; see :ref:`packaging-package-upload` for more info.
.. program:: packaging upload
.. cmdoption:: --sign, -s
Sign each uploaded file using GPG (GNU Privacy Guard). The ``gpg`` program
must be available for execution on the system ``PATH``.
.. cmdoption:: --identity=NAME, -i NAME
Specify the identity or key name for GPG to use when signing. The value of
this option will be passed through the ``--local-user`` option of the
``gpg`` program.
.. cmdoption:: --show-response
Display the full response text from server; this is useful for debugging
PyPI problems.
.. cmdoption:: --repository=URL, -r URL
The URL of the repository to upload to. Defaults to
http://pypi.python.org/pypi (i.e., the main PyPI installation).
.. cmdoption:: --upload-docs
Also run :command:`upload_docs`. Mainly useful as a default value in
:file:`setup.cfg` (on the command line, it's shorter to just type both
commands).
:command:`upload_docs`
----------------------
Upload HTML documentation to PyPI.
PyPI now supports publishing project documentation at a URI of the form
``http://packages.python.org/<project>``. :command:`upload_docs` will create
the necessary zip file out of a documentation directory and will post to the
repository.
Note that to upload the documentation of a project, the corresponding version
must already be registered with PyPI, using the :command:`register` command ---
just like with :command:`upload`.
Assuming there is an ``Example`` project with documentation in the subdirectory
:file:`docs`, for example::
Example/
example.py
setup.cfg
docs/
build/
html/
index.html
tips_tricks.html
conf.py
index.txt
tips_tricks.txt
You can simply specify the directory with the HTML files in your
:file:`setup.cfg` file:
.. code-block:: cfg
[upload_docs]
upload-dir = docs/build/html
.. program:: packaging upload_docs
.. cmdoption:: --upload-dir
The directory to be uploaded to the repository. By default, documentation
is searched for in the ``docs`` (or ``doc``) directory in the project root.
.. cmdoption:: --show-response
Display the full response text from server; this is useful for debugging
PyPI problems.
.. cmdoption:: --repository=URL, -r URL
The URL of the repository to upload to. Defaults to
http://pypi.python.org/pypi (i.e., the main PyPI installation).
The install step
================
These commands are used by end-users of a project using :program:`pysetup` or
another compatible installer. Each command will run the corresponding
:command:`build_*` command and then move the built files to their destination on
the target system.
:command:`install_dist`
-----------------------
Install a distribution, delegating to the other :command:`install_*` commands to
do the work.
.. program:: packaging install_dist
.. cmdoption:: --user
Install in user site-packages directory (see :PEP:`370`).
:command:`install_data`
-----------------------
Install data files.
:command:`install_distinfo`
---------------------------
Install files recording details of the installation as specified in :PEP:`376`.
:command:`install_headers`
--------------------------
Install C/C++ header files.
:command:`install_lib`
----------------------
Install all modules (pure Python and extension modules) built by the build commands.
:command:`install_scripts`
--------------------------
Install scripts.

View File

@ -0,0 +1,125 @@
.. _packaging-setup-config:
************************************
Writing the Setup Configuration File
************************************
Often, it's not possible to write down everything needed to build a distribution
*a priori*: you may need to get some information from the user, or from the
user's system, in order to proceed. As long as that information is fairly
simple---a list of directories to search for C header files or libraries, for
example---then providing a configuration file, :file:`setup.cfg`, for users to
edit is a cheap and easy way to solicit it. Configuration files also let you
provide default values for any command option, which the installer can then
override either on the command line or by editing the config file.
The setup configuration file is a useful middle-ground between the setup script
---which, ideally, would be opaque to installers [#]_---and the command line to
the setup script, which is outside of your control and entirely up to the
installer. In fact, :file:`setup.cfg` (and any other Distutils configuration
files present on the target system) are processed after the contents of the
setup script, but before the command line. This has several useful
consequences:
.. If you have more advanced needs, such as determining which extensions to
build based on what capabilities are present on the target system, then you
need the Distutils auto-configuration facility. This started to appear in
Distutils 0.9 but, as of this writing, isn't mature or stable enough yet
for real-world use.
* installers can override some of what you put in :file:`setup.py` by editing
:file:`setup.cfg`
* you can provide non-standard defaults for options that are not easily set in
:file:`setup.py`
* installers can override anything in :file:`setup.cfg` using the command-line
options to :file:`setup.py`
The basic syntax of the configuration file is simple::
[command]
option = value
...
where *command* is one of the Distutils commands (e.g. :command:`build_py`,
:command:`install_dist`), and *option* is one of the options that command supports.
Any number of options can be supplied for each command, and any number of
command sections can be included in the file. Blank lines are ignored, as are
comments, which run from a ``'#'`` character until the end of the line. Long
option values can be split across multiple lines simply by indenting the
continuation lines.
You can find out the list of options supported by a particular command with the
universal :option:`--help` option, e.g. ::
> python setup.py --help build_ext
[...]
Options for 'build_ext' command:
--build-lib (-b) directory for compiled extension modules
--build-temp (-t) directory for temporary files (build by-products)
--inplace (-i) ignore build-lib and put compiled extensions into the
source directory alongside your pure Python modules
--include-dirs (-I) list of directories to search for header files
--define (-D) C preprocessor macros to define
--undef (-U) C preprocessor macros to undefine
--swig-opts list of SWIG command-line options
[...]
.. XXX do we want to support ``setup.py --help metadata``?
Note that an option spelled :option:`--foo-bar` on the command line is spelled
:option:`foo_bar` in configuration files.
For example, say you want your extensions to be built "in-place"---that is, you
have an extension :mod:`pkg.ext`, and you want the compiled extension file
(:file:`ext.so` on Unix, say) to be put in the same source directory as your
pure Python modules :mod:`pkg.mod1` and :mod:`pkg.mod2`. You can always use the
:option:`--inplace` option on the command line to ensure this::
python setup.py build_ext --inplace
But this requires that you always specify the :command:`build_ext` command
explicitly, and remember to provide :option:`--inplace`. An easier way is to
"set and forget" this option, by encoding it in :file:`setup.cfg`, the
configuration file for this distribution::
[build_ext]
inplace = 1
This will affect all builds of this module distribution, whether or not you
explicitly specify :command:`build_ext`. If you include :file:`setup.cfg` in
your source distribution, it will also affect end-user builds---which is
probably a bad idea for this option, since always building extensions in-place
would break installation of the module distribution. In certain peculiar cases,
though, modules are built right in their installation directory, so this is
conceivably a useful ability. (Distributing extensions that expect to be built
in their installation directory is almost always a bad idea, though.)
Another example: certain commands take options that vary from project to
project but not from one installation to another; for example,
:command:`test` needs to know where your test suite is located and what test
runner to use; likewise, :command:`upload_docs` can find HTML documentation in
a :file:`doc` or :file:`docs` directory, but needs an option to find files in
:file:`docs/build/html`. Instead of having to type out these options each
time you want to run the command, you can put them in the project's
:file:`setup.cfg`::
[test]
suite = packaging.tests
[upload_docs]
upload-dir = docs/build/html
.. seealso::
:ref:`packaging-config-syntax` in "Installing Python Projects"
More information on the configuration files is available in the manual for
system administrators.
.. rubric:: Footnotes
.. [#] This ideal probably won't be achieved until auto-configuration is fully
supported by the Distutils.

334
Doc/packaging/examples.rst Normal file
View File

@ -0,0 +1,334 @@
.. _packaging-examples:
********
Examples
********
This chapter provides a number of basic examples to help get started with
Packaging.
.. _packaging-pure-mod:
Pure Python distribution (by module)
====================================
If you're just distributing a couple of modules, especially if they don't live
in a particular package, you can specify them individually using the
:option:`py_modules` option in the setup script.
In the simplest case, you'll have two files to worry about: a setup script and
the single module you're distributing, :file:`foo.py` in this example::
<root>/
setup.py
foo.py
(In all diagrams in this section, *<root>* will refer to the distribution root
directory.) A minimal setup script to describe this situation would be::
from packaging.core import setup
setup(name='foo',
version='1.0',
py_modules=['foo'])
Note that the name of the distribution is specified independently with the
:option:`name` option, and there's no rule that says it has to be the same as
the name of the sole module in the distribution (although that's probably a good
convention to follow). However, the distribution name is used to generate
filenames, so you should stick to letters, digits, underscores, and hyphens.
Since :option:`py_modules` is a list, you can of course specify multiple
modules, e.g. if you're distributing modules :mod:`foo` and :mod:`bar`, your
setup might look like this::
<root>/
setup.py
foo.py
bar.py
and the setup script might be ::
from packaging.core import setup
setup(name='foobar',
version='1.0',
py_modules=['foo', 'bar'])
You can put module source files into another directory, but if you have enough
modules to do that, it's probably easier to specify modules by package rather
than listing them individually.
.. _packaging-pure-pkg:
Pure Python distribution (by package)
=====================================
If you have more than a couple of modules to distribute, especially if they are
in multiple packages, it's probably easier to specify whole packages rather than
individual modules. This works even if your modules are not in a package; you
can just tell the Distutils to process modules from the root package, and that
works the same as any other package (except that you don't have to have an
:file:`__init__.py` file).
The setup script from the last example could also be written as ::
from packaging.core import setup
setup(name='foobar',
version='1.0',
packages=[''])
(The empty string stands for the root package.)
If those two files are moved into a subdirectory, but remain in the root
package, e.g.::
<root>/
setup.py
src/
foo.py
bar.py
then you would still specify the root package, but you have to tell the
Distutils where source files in the root package live::
from packaging.core import setup
setup(name='foobar',
version='1.0',
package_dir={'': 'src'},
packages=[''])
More typically, though, you will want to distribute multiple modules in the same
package (or in sub-packages). For example, if the :mod:`foo` and :mod:`bar`
modules belong in package :mod:`foobar`, one way to lay out your source tree is
::
<root>/
setup.py
foobar/
__init__.py
foo.py
bar.py
This is in fact the default layout expected by the Distutils, and the one that
requires the least work to describe in your setup script::
from packaging.core import setup
setup(name='foobar',
version='1.0',
packages=['foobar'])
If you want to put modules in directories not named for their package, then you
need to use the :option:`package_dir` option again. For example, if the
:file:`src` directory holds modules in the :mod:`foobar` package::
<root>/
setup.py
src/
__init__.py
foo.py
bar.py
an appropriate setup script would be ::
from packaging.core import setup
setup(name='foobar',
version='1.0',
package_dir={'foobar': 'src'},
packages=['foobar'])
Or, you might put modules from your main package right in the distribution
root::
<root>/
setup.py
__init__.py
foo.py
bar.py
in which case your setup script would be ::
from packaging.core import setup
setup(name='foobar',
version='1.0',
package_dir={'foobar': ''},
packages=['foobar'])
(The empty string also stands for the current directory.)
If you have sub-packages, they must be explicitly listed in :option:`packages`,
but any entries in :option:`package_dir` automatically extend to sub-packages.
(In other words, the Distutils does *not* scan your source tree, trying to
figure out which directories correspond to Python packages by looking for
:file:`__init__.py` files.) Thus, if the default layout grows a sub-package::
<root>/
setup.py
foobar/
__init__.py
foo.py
bar.py
subfoo/
__init__.py
blah.py
then the corresponding setup script would be ::
from packaging.core import setup
setup(name='foobar',
version='1.0',
packages=['foobar', 'foobar.subfoo'])
(Again, the empty string in :option:`package_dir` stands for the current
directory.)
.. _packaging-single-ext:
Single extension module
=======================
Extension modules are specified using the :option:`ext_modules` option.
:option:`package_dir` has no effect on where extension source files are found;
it only affects the source for pure Python modules. The simplest case, a
single extension module in a single C source file, is::
<root>/
setup.py
foo.c
If the :mod:`foo` extension belongs in the root package, the setup script for
this could be ::
from packaging.core import setup, Extension
setup(name='foobar',
version='1.0',
ext_modules=[Extension('foo', ['foo.c'])])
If the extension actually belongs in a package, say :mod:`foopkg`, then with
exactly the same source tree layout, this extension can be put in the
:mod:`foopkg` package simply by changing the name of the extension::
from packaging.core import setup, Extension
setup(name='foobar',
version='1.0',
packages=['foopkg'],
ext_modules=[Extension('foopkg.foo', ['foo.c'])])
Checking metadata
=================
The ``check`` command allows you to verify if your project's metadata
meets the minimum requirements to build a distribution.
To run it, just call it using your :file:`setup.py` script. If something is
missing, ``check`` will display a warning.
Let's take an example with a simple script::
from packaging.core import setup
setup(name='foobar')
.. TODO configure logging StreamHandler to match this output
Running the ``check`` command will display some warnings::
$ python setup.py check
running check
warning: check: missing required metadata: version, home_page
warning: check: missing metadata: either (author and author_email) or
(maintainer and maintainer_email) must be supplied
If you use the reStructuredText syntax in the ``description`` field and
`Docutils <http://docutils.sourceforge.net/>`_ is installed you can check if
the syntax is fine with the ``check`` command, using the ``restructuredtext``
option.
For example, if the :file:`setup.py` script is changed like this::
from packaging.core import setup
desc = """\
Welcome to foobar!
===============
This is the description of the ``foobar`` project.
"""
setup(name='foobar',
version='1.0',
author=u'Tarek Ziadé',
author_email='tarek@ziade.org',
      summary='Foobar utilities',
description=desc,
home_page='http://example.com')
Where the long description is broken, ``check`` will be able to detect it
by using the :mod:`docutils` parser::
$ python setup.py check --restructuredtext
running check
warning: check: Title underline too short. (line 2)
warning: check: Could not finish the parsing.
.. _packaging-reading-metadata:
Reading the metadata
====================
The :func:`packaging.core.setup` function provides a command-line interface
that allows you to query the metadata fields of a project through the
:file:`setup.py` script of a given project::
$ python setup.py --name
foobar
This call reads the ``name`` metadata by running the
:func:`packaging.core.setup` function. When a source or binary
distribution is created with Distutils, the metadata fields are written
in a static file called :file:`PKG-INFO`. When a Distutils-based project is
installed in Python, the :file:`PKG-INFO` file is copied alongside the modules
and packages of the distribution under :file:`NAME-VERSION-pyX.X.egg-info`,
where ``NAME`` is the name of the project, ``VERSION`` its version as defined
in the Metadata, and ``pyX.X`` the major and minor version of Python like
``2.7`` or ``3.2``.
You can read back this static file by using the
:class:`packaging.metadata.Metadata` class and its
:meth:`read_pkg_file` method::
>>> from packaging.metadata import Metadata
>>> metadata = Metadata()
>>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info'))
>>> metadata.name
'distribute'
>>> metadata.version
'0.6.8'
>>> metadata.description
'Easily download, build, install, upgrade, and uninstall Python packages'
Notice that the class can also be instantiated with a metadata file path to
load its values::
>>> pkg_info_path = 'distribute-0.6.8-py2.7.egg-info'
>>> Metadata(pkg_info_path).name
'distribute'
.. XXX These comments have been here for at least ten years. Write the
sections or delete the comments (we can maybe ask Greg Ward about
the planned contents). (Unindent to make them section titles)
.. multiple-ext::
Multiple extension modules
==========================
Putting it all together
=======================

View File

@ -0,0 +1,95 @@
.. _extending-packaging:
*******************
Extending Distutils
*******************
Distutils can be extended in various ways. Most extensions take the form of new
commands or replacements for existing commands. New commands may be written to
support new types of platform-specific packaging, for example, while
replacements for existing commands may be made to modify details of how the
command operates on a package.
Most extensions of Packaging are made within :file:`setup.py` scripts that
want to modify existing commands; many simply add a few file extensions that
should be copied into packages in addition to :file:`.py` files as a
convenience.
Most packaging command implementations are subclasses of the
:class:`packaging.cmd.Command` class. New commands may directly inherit from
:class:`Command`, while replacements often derive from :class:`Command`
indirectly, directly subclassing the command they are replacing. Commands are
required to derive from :class:`Command`.
.. .. _extend-existing:
Extending existing commands
===========================
.. .. _new-commands:
Writing new commands
====================
Integrating new commands
========================
There are different ways to integrate new command implementations into
packaging. The most difficult is to lobby for the inclusion of the new features
in packaging itself, and wait for (and require) a version of Python that
provides that support. This is really hard for many reasons.
The most common, and possibly the most reasonable for most needs, is to include
the new implementations with your :file:`setup.py` script, and cause the
:func:`packaging.core.setup` function to use them::
from packaging.core import setup
from packaging.command.build_py import build_py as _build_py
class build_py(_build_py):
"""Specialized Python source builder."""
# implement whatever needs to be different...
setup(..., cmdclass={'build_py': build_py})
This approach is most valuable if the new implementations must be used in
order to install a particular package, as everyone interested in the package
will need to have the new command implementation.
Beginning with Python 2.4, a third option is available, intended to allow new
commands to be added which can support existing :file:`setup.py` scripts without
requiring modifications to the Python installation. This is expected to allow
third-party extensions to provide support for additional packaging systems, but
the commands can be used for anything packaging commands can be used for. A new
configuration option, :option:`command_packages` (command-line option
:option:`--command-packages`), can be used to specify additional packages to be
searched for modules implementing commands. Like all packaging options, this
can be specified on the command line or in a configuration file. This option
can only be set in the ``[global]`` section of a configuration file, or before
any commands on the command line. If set in a configuration file, it can be
overridden from the command line; setting it to an empty string on the command
line causes the default to be used. This should never be set in a configuration
file provided with a package.
This new option can be used to add any number of packages to the list of
packages searched for command implementations; multiple package names should be
separated by commas. When not specified, the search is only performed in the
:mod:`packaging.command` package. When :file:`setup.py` is run with the option
:option:`--command-packages` :option:`distcmds,buildcmds`, however, the packages
:mod:`packaging.command`, :mod:`distcmds`, and :mod:`buildcmds` will be searched
in that order. New commands are expected to be implemented in modules of the
same name as the command by classes sharing the same name. Given the example
command-line option above, the command :command:`bdist_openpkg` could be
implemented by the class :class:`distcmds.bdist_openpkg.bdist_openpkg` or
:class:`buildcmds.bdist_openpkg.bdist_openpkg`.
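
As an illustration only, a hypothetical :file:`distcmds/bdist_openpkg.py` module
could look roughly like the following sketch; the description string, the empty
option list and the method bodies are placeholders, not a real implementation::

   from packaging.cmd import Command

   class bdist_openpkg(Command):

       description = "create an OpenPKG distribution (hypothetical example)"
       user_options = []

       def initialize_options(self):
           # set default values for the command's options
           pass

       def finalize_options(self):
           # compute final option values, possibly from other commands
           pass

       def run(self):
           # do the actual packaging work
           pass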
Adding new distribution types
=============================
Commands that create distributions (files in the :file:`dist/` directory) need
to add ``(command, filename)`` pairs to ``self.distribution.dist_files`` so that
:command:`upload` can upload them to PyPI. The *filename* in the pair contains no
path information, only the name of the file itself. In dry-run mode, pairs
should still be added to represent what would have been created.
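
As a minimal sketch, the :meth:`run` method of a hypothetical
distribution-creating command might end like this; the ``bdist_foo`` command
name and the archive file name are made up for illustration::

   def run(self):
       # name of the archive that would be created under dist/; note that
       # only the file name is recorded, never any path information
       filename = 'example-1.0.foo'
       if not self.dry_run:
           pass  # actually build dist/example-1.0.foo here
       # record the file so that the upload command can find it; in dry-run
       # mode the pair is still added to reflect what would have been built
       self.distribution.dist_files.append(('bdist_foo', filename))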

45
Doc/packaging/index.rst Normal file
View File

@ -0,0 +1,45 @@
.. _packaging-index:
##############################
Distributing Python Projects
##############################
:Authors: The Fellowship of the Packaging
:Email: distutils-sig@python.org
:Release: |version|
:Date: |today|
This document describes Packaging for Python authors, explaining how to use the
module to make Python applications, packages or modules easily available to a
wider audience with very little overhead for build/release/install mechanics.
.. toctree::
:maxdepth: 2
:numbered:
tutorial
setupcfg
introduction
setupscript
configfile
sourcedist
builtdist
packageindex
uploading
examples
extending
commandhooks
commandref
.. seealso::
:ref:`packaging-install-index`
A user-centered manual which includes information on adding projects
into an existing Python installation. You do not need to be a Python
programmer to read this manual.
:mod:`packaging`
A library reference for developers of packaging tools wanting to use
standalone building blocks like :mod:`~packaging.version` or
:mod:`~packaging.metadata`, or extend Packaging itself.

View File

@ -0,0 +1,193 @@
.. _packaging-intro:
*****************************
An Introduction to Packaging
*****************************
This document covers using Packaging to distribute your Python modules,
concentrating on the role of developer/distributor. If you're looking for
information on installing Python modules you should refer to the
:ref:`packaging-install-index` chapter.
Throughout this documentation, the terms "Distutils", "the Distutils" and
"Packaging" will be used interchangeably.
.. _packaging-concepts:
Concepts & Terminology
======================
Using Distutils is quite simple both for module developers and for
users/administrators installing third-party modules. As a developer, your
responsibilities (apart from writing solid, well-documented and well-tested
code, of course!) are:
* writing a setup script (:file:`setup.py` by convention)
* (optional) writing a setup configuration file
* creating a source distribution
* (optional) creating one or more "built" (binary) distributions of your
project
All of these tasks are covered in this document.
Not all module developers have access to multiple platforms, so one cannot
expect them to create built distributions for every platform. To remedy
this, it is hoped that intermediaries called *packagers* will arise to address
this need. Packagers take source distributions released by module developers,
build them on one or more platforms and release the resulting built
distributions. Thus, users on a greater range of platforms will be able to
install the most popular Python modules in the most natural way for their
platform without having to run a setup script or compile a single line of code.
.. _packaging-simple-example:
A Simple Example
================
A setup script is usually quite simple, although since it's written in Python
there are no arbitrary limits to what you can do with it.  Be careful, though,
about putting expensive operations in your setup script: unlike, say,
Autoconf-style configure scripts, the setup script may be run multiple times in
the course of building and installing a module distribution.
If all you want to do is distribute a module called :mod:`foo`, contained in a
file :file:`foo.py`, then your setup script can be as simple as::
from packaging.core import setup
setup(name='foo',
version='1.0',
py_modules=['foo'])
Some observations:
* most information that you supply to the Distutils is supplied as keyword
arguments to the :func:`setup` function
* those keyword arguments fall into two categories: package metadata (name,
version number, etc.) and information about what's in the package (a list
of pure Python modules in this case)
* modules are specified by module name, not filename (the same will hold true
for packages and extensions)
* it's recommended that you supply a little more metadata than we have in the
example. In particular your name, email address and a URL for the
project if appropriate (see section :ref:`packaging-setup-script` for an example)
To create a source distribution for this module you would create a setup
script, :file:`setup.py`, containing the above code and run::
python setup.py sdist
which will create an archive file (e.g., tarball on Unix, ZIP file on Windows)
containing your setup script :file:`setup.py`, and your module :file:`foo.py`.
The archive file will be named :file:`foo-1.0.tar.gz` (or :file:`.zip`), and
will unpack into a directory :file:`foo-1.0`.
If an end-user wishes to install your :mod:`foo` module all he has to do is
download :file:`foo-1.0.tar.gz` (or :file:`.zip`), unpack it, and from the
:file:`foo-1.0` directory run ::
python setup.py install
which will copy :file:`foo.py` to the appropriate directory for
third-party modules in their Python installation.
This simple example demonstrates some fundamental concepts of Distutils.
First, both developers and installers have the same basic user interface, i.e.
the setup script. The difference is which Distutils *commands* they use: the
:command:`sdist` command is almost exclusively for module developers, while
:command:`install` is more often used by installers (although some developers
will want to install their own code occasionally).
If you want to make things really easy for your users, you can create more
than one built distribution for them. For instance, if you are running on a
Windows machine and want to make things easy for other Windows users, you can
create an executable installer (the most appropriate type of built distribution
for this platform) with the :command:`bdist_wininst` command. For example::
python setup.py bdist_wininst
will create an executable installer, :file:`foo-1.0.win32.exe`, in the current
directory. You can find out what distribution formats are available at any time
by running ::
python setup.py bdist --help-formats
.. _packaging-python-terms:
General Python terminology
==========================
If you're reading this document, you probably have a good idea of what Python
modules, extensions and so forth are. Nevertheless, just to be sure that
everyone is on the same page, here's a quick overview of Python terms:
module
The basic unit of code reusability in Python: a block of code imported by
some other code. Three types of modules are important to us here: pure
Python modules, extension modules and packages.
pure Python module
A module written in Python and contained in a single :file:`.py` file (and
possibly associated :file:`.pyc` and/or :file:`.pyo` files). Sometimes
referred to as a "pure module."
extension module
A module written in the low-level language of the Python implementation: C/C++
for Python, Java for Jython. Typically contained in a single dynamically
loaded pre-compiled file, e.g. a shared object (:file:`.so`) file for Python
extensions on Unix, a DLL (given the :file:`.pyd` extension) for Python
extensions on Windows, or a Java class file for Jython extensions. Note that
currently Distutils only handles C/C++ extensions for Python.
package
A module that contains other modules, typically contained in a directory of
the filesystem and distinguished from other directories by the presence of a
file :file:`__init__.py`.
root package
The root of the hierarchy of packages. (This isn't really a package,
since it doesn't have an :file:`__init__.py` file. But... we have to
call it something, right?) The vast majority of the standard library is
in the root package, as are many small standalone third-party modules that
don't belong to a larger module collection. Unlike regular packages,
modules in the root package can be found in many directories: in fact,
every directory listed in ``sys.path`` contributes modules to the root
package.
.. _packaging-term:
Distutils-specific terminology
==============================
The following terms apply more specifically to the domain of distributing Python
modules using Distutils:
module distribution
A collection of Python modules distributed together as a single downloadable
resource and meant to be installed all as one. Examples of some well-known
module distributions are NumPy, SciPy, PIL (the Python Imaging
Library) or mxBase. (Module distributions would be called a *package*,
except that term is already taken in the Python context: a single module
distribution may contain zero, one, or many Python packages.)
pure module distribution
A module distribution that contains only pure Python modules and packages.
Sometimes referred to as a "pure distribution."
non-pure module distribution
A module distribution that contains at least one extension module. Sometimes
referred to as a "non-pure distribution."
distribution root
The top-level directory of your source tree (or source distribution). The
directory where :file:`setup.py` exists. Generally :file:`setup.py` will
be run from this directory.

View File

@ -0,0 +1,104 @@
.. _packaging-package-index:
**********************************
Registering with the Package Index
**********************************
The Python Package Index (PyPI) holds metadata describing distributions
packaged with Packaging. The Packaging command :command:`register` is used to
submit your distribution's metadata to the index. It is invoked as follows::
python setup.py register
Distutils will respond with the following prompt::
running register
We need to know who you are, so please choose either:
1. use your existing login,
2. register as a new user,
3. have the server generate a new password for you (and email it to you), or
4. quit
Your selection [default 1]:
Note: if your username and password are saved locally, you will not see this
menu.
If you have not registered with PyPI, then you will need to do so now. You
should choose option 2, and enter your details as required. Soon after
submitting your details, you will receive an email which will be used to confirm
your registration.
Once you are registered, you may choose option 1 from the menu. You will be
prompted for your PyPI username and password, and :command:`register` will then
submit your metadata to the index.
You may submit any number of versions of your distribution to the index. If you
alter the metadata for a particular version, you may submit it again and the
index will be updated.
PyPI holds a record for each (name, version) combination submitted. The first
user to submit information for a given name is designated the Owner of that
name. They may submit changes through the :command:`register` command or through
the web interface. They may also designate other users as Owners or Maintainers.
Maintainers may edit the package information, but not designate other Owners or
Maintainers.
By default PyPI will list all versions of a given package. To hide certain
versions, the Hidden property should be set to yes. This must be edited through
the web interface.
.. _packaging-pypirc:
The .pypirc file
================
The format of the :file:`.pypirc` file is as follows::
[packaging]
index-servers =
pypi
[pypi]
repository: <repository-url>
username: <username>
password: <password>
The *packaging* section defines an *index-servers* variable that lists the
names of all sections describing a repository.
Each section describing a repository defines three variables:
- *repository*, which defines the URL of the PyPI server. Defaults to
  ``http://www.python.org/pypi``.
- *username*, which is the registered username on the PyPI server.
- *password*, which will be used to authenticate. If omitted, the user
  will be prompted to type it when needed.
If you want to define another server a new section can be created and
listed in the *index-servers* variable::
[packaging]
index-servers =
pypi
other
[pypi]
repository: <repository-url>
username: <username>
password: <password>
[other]
repository: http://example.com/pypi
username: <username>
password: <password>
:command:`register` can then be called with the ``-r`` option to point to the
repository to work with::
python setup.py register -r http://example.com/pypi
For convenience, the name of the section that describes the repository
may also be used::
python setup.py register -r other

648
Doc/packaging/setupcfg.rst Normal file
View File

@ -0,0 +1,648 @@
.. highlightlang:: cfg
*******************************************
Specification of the :file:`setup.cfg` file
*******************************************
.. :version: 1.0
This document describes :file:`setup.cfg`, an ini-style configuration file
(compatible with :class:`configparser.RawConfigParser`) used by Packaging to
replace the :file:`setup.py` file.
Each section contains a description of its options.
- Options that are marked *multi* can have multiple values, one value per
line.
- Options that are marked *optional* can be omitted.
- Options that are marked *environ* can use environment markers, as described
in :PEP:`345`.
The sections are:
global
Global options not related to one command.
metadata
Name, version and other information defined by :PEP:`345`.
files
Modules, scripts, data, documentation and other files to include in the
distribution.
command sections
Options given for specific commands, identical to those that can be given
on the command line.
Global options
==============
Contains global options for Packaging. This section is shared with Distutils.
commands
Defines additional Packaging commands.  A command is defined by its fully
qualified name. *optional*, *multi*
Examples::
[global]
commands =
package.setup.CustomSdistCommand
package.setup.BdistDeb
compilers
Defines additional Packaging compilers.  A compiler is defined by its fully
qualified name. *optional*, *multi*
Example::
[global]
compilers =
hotcompiler.SmartCCompiler
setup_hook
defines a callable that will be called right after the
:file:`setup.cfg` file is read. The callable receives the configuration
in the form of a mapping and can make some changes to it. *optional*
Example::
[global]
setup_hook = package.setup.customize_dist
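
As a purely illustrative sketch, such a hook is just a callable taking the
configuration mapping; the section and key access below assumes the mapping is
keyed by section name, and the value filled in is made up::

   # in package/setup.py
   def customize_dist(config):
       # tweak the parsed setup.cfg contents in place before they are used
       metadata = config['metadata']
       if 'summary' not in metadata:
           metadata['summary'] = 'Summary added by the setup hook'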
Metadata
========
The metadata section contains the metadata for the project as described in
:PEP:`345`. Field names are case-insensitive.
Fields:
name
Name of the project.
version
Version of the project. Must comply with :PEP:`386`.
platform
Platform specification describing an operating system
supported by the distribution which is not listed in the "Operating System"
Trove classifiers (:PEP:`301`). *optional*, *multi*
supported-platform
Binary distributions containing a PKG-INFO file will
use the Supported-Platform field in their metadata to specify the OS and
CPU for which the binary distribution was compiled. The semantics of
the Supported-Platform field are free form. *optional*, *multi*
summary
A one-line summary of what the distribution does.
(Used to be called *description* in Distutils1.)
description
A longer description. (Used to be called *long_description*
in Distutils1.) A file can be provided in the *description-file* field.
*optional*
description-file
path to a text file that will be used for the
**description** field. *optional*
keywords
A list of additional keywords to be used to assist searching
for the distribution in a larger catalog. Comma or space-separated.
*optional*
home-page
The URL for the distribution's home page.
download-url
The URL from which this version of the distribution
can be downloaded. *optional*
author
Author's name. *optional*
author-email
Author's e-mail. *optional*
maintainer
Maintainer's name. *optional*
maintainer-email
Maintainer's e-mail. *optional*
license
A text indicating the terms of use, when a trove classifier does
not match. *optional*.
classifiers
Classification for the distribution, as described in PEP 301.
*optional*, *multi*, *environ*
requires-dist
name of another packaging project required as a dependency.
The format is *name (version)* where version is an optional
version declaration, as described in PEP 345. *optional*, *multi*, *environ*
provides-dist
name of another packaging project contained within this
distribution. Same format as *requires-dist*. *optional*, *multi*,
*environ*
obsoletes-dist
name of another packaging project this version obsoletes.
Same format as *requires-dist*. *optional*, *multi*, *environ*
requires-python
Specifies the Python version the distribution requires.
The value is a version number, as described in PEP 345.
*optional*, *multi*, *environ*
requires-externals
a dependency in the system. This field is free-form,
and just a hint for downstream maintainers. *optional*, *multi*,
*environ*
project-url
A label, followed by a browsable URL for the project.
"label, url". The label is limited to 32 signs. *optional*, *multi*
Example::
[metadata]
name = pypi2rpm
version = 0.1
author = Tarek Ziadé
author-email = tarek@ziade.org
summary = Script that transforms an sdist archive into a RPM package
description-file = README
home-page = http://bitbucket.org/tarek/pypi2rpm/wiki/Home
project-url:
Repository, http://bitbucket.org/tarek/pypi2rpm/
RSS feed, https://bitbucket.org/tarek/pypi2rpm/rss
classifier =
Development Status :: 3 - Alpha
License :: OSI Approved :: Mozilla Public License 1.1 (MPL 1.1)
You should not give any explicit value for metadata-version: it will be guessed
from the fields present in the file.
Files
=====
This section describes the files included in the project.
packages_root
the root directory containing all packages and modules
(default: current directory). *optional*
packages
a list of packages the project includes *optional*, *multi*
modules
a list of modules the project includes *optional*, *multi*
scripts
a list of scripts the project includes *optional*, *multi*
extra_files
a list of patterns to include extra files *optional*,
*multi*
Example::
[files]
packages_root = src
packages =
pypi2rpm
pypi2rpm.command
scripts =
pypi2rpm/pypi2rpm.py
extra_files =
setup.py
README
.. Note::
The :file:`setup.cfg` configuration file is included by default. Contrary to
Distutils, :file:`README` (or :file:`README.txt`) and :file:`setup.py` are
not included by default.
Resources
---------
This section describes the files used by the project that must not be installed
in the same place as Python modules or libraries; these files are called
**resources**.  Examples are documentation files, script files, databases,
etc.
For declaring resources, you must use this notation::
source = destination
Data-files are declared in the **resources** field in the **file** section, for
example::
[files]
resources =
source1 = destination1
source2 = destination2
The **source** part of each declaration is a relative path to a resource file
(using the Unix path separator **/**).  For example, if you have this source
tree::
foo/
doc/
doc.man
scripts/
foo.sh
Your setup.cfg will look like::
[files]
resources =
doc/doc.man = destination_doc
scripts/foo.sh = destination_scripts
The final path where a file will be placed is composed of **destination** +
**source**.  In the previous example, **doc/doc.man** will be placed in
**destination_doc/doc/doc.man** and **scripts/foo.sh** will be placed in
**destination_scripts/scripts/foo.sh**.  (If you want more control over the
final path, take a look at base_prefix_.)
The **destination** part of a resource declaration is a path containing
categories.  Giving an absolute path is generally a bad idea, since it will not
be portable across systems; instead, you should use resource categories in your
**destination** declaration.  Categories will be replaced by their real paths at
installation time.  Using categories is all benefit: your declarations will be
simpler and cross-platform, and packagers will be able to place resource files
where they want without breaking your code.
Categories can be specified by using this syntax::
{category}
Default categories are:
* config
* appdata
* appdata.arch
* appdata.persistent
* appdata.disposable
* help
* icon
* scripts
* doc
* info
* man
A special category, **{distribution.name}**, also exists; it will be replaced
by the name of the distribution.  Since most of the default categories already
use it, it is usually not necessary to add **{distribution.name}** to your
destination.
If you use categories in your declarations (and you are encouraged to do so),
the final path will be::

   destination_expanded + source
.. _example_final_path:
For example, if you have this setup.cfg::
[metadata]
name = foo
[files]
resources =
doc/doc.man = {doc}
And if **{doc}** is replaced by **{datadir}/doc/{distribution.name}**, final
path will be::
{datadir}/doc/foo/doc/doc.man
where the {datadir} category is platform-dependent.
More control on source part
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Glob syntax
"""""""""""
When you declare source files, you can use a glob-like syntax to match multiple
files, for example::

   scripts/* = {scripts}

will match all the files in the scripts directory and place them in the
{scripts} category.
Glob tokens are:
* ``*``: match all files.
* ``?``: match any single character.
* ``**``: match any level of tree recursion (even 0).
* ``{}``: will match any part separated by comma (example: ``{sh,bat}``).
.. TODO Add examples
Order of declaration
""""""""""""""""""""
The order of declaration is important if one file matches multiple rules: the
last rule that matches the file is used.  This is useful if you have this
source tree::
foo/
doc/
index.rst
setup.rst
documentation.txt
doc.tex
README
and you want all the files in the doc directory to be placed in the {doc}
category, but README to be placed in the {help} category.  Instead of listing
all the files one by one, you can declare them this way::
[files]
resources =
doc/* = {doc}
doc/README = {help}
Exclude
"""""""
You can exclude some files from the resources declaration by giving them no
destination.  This can be useful if you have a non-resource file in the same
directory as resource files::
foo/
doc/
RELEASES
doc.tex
documentation.txt
docu.rst
Your **files** section will be::
[files]
resources =
doc/* = {doc}
doc/RELEASES =
More control on destination part
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. _base_prefix:
Defining a base prefix
""""""""""""""""""""""
When you define your resources, you can have more control over how the final
path is computed.
By default, the final path is::
destination + source
This can generate long paths, for example (example_final_path_)::
{datadir}/doc/foo/doc/doc.man
When you declare your source, you can use whitespace to split the source into a
**prefix** and a **suffix**.  So, for example, if you have this source::

   docs/ doc.man

the **prefix** is "docs/" and the **suffix** is "doc.man".
.. note::
Separator can be placed after a path separator or replace it. So these two
sources are equivalent::
docs/ doc.man
docs doc.man
.. note::
Glob syntax works the same way with standard and split sources.  So these
rules::
docs/*
docs/ *
docs *
will match all the files in the docs directory.
When you use a split source, the final path is computed in this way::
destination + prefix
So for example, if you have this setup.cfg::
[metadata]
name = foo
[files]
resources =
doc/ doc.man = {doc}
And if **{doc}** is replaced by **{datadir}/doc/{distribution.name}**, final
path will be::
{datadir}/doc/foo/doc.man
Overwriting paths for categories
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This part is intended for system administrators or downstream OS packagers.
The real paths of categories are registered in the *sysconfig.cfg* file
installed in your Python installation. This file uses an ini format too.
The content of the file is organized into several sections:
* globals: Standard categories' paths.
* posix_prefix: Standard paths for categories and installation paths for POSIX
  systems.
* other ones XXX

Standard category paths are platform-independent; they generally refer to
other categories, which are platform-dependent.  :mod:`sysconfig` will choose
these categories from sections matching ``os.name``.  For example::
doc = {datadir}/doc/{distribution.name}
It refers to the datadir category, which can differ between platforms.  On
POSIX systems, it may be::
datadir = /usr/share
So the final path will be::
doc = /usr/share/doc/{distribution.name}
The platform-dependent categories are:
* confdir
* datadir
* libdir
* base
Defining extra categories
^^^^^^^^^^^^^^^^^^^^^^^^^
.. TODO
Examples
^^^^^^^^
These examples are incremental, but each one also works on its own.
Resources in root dir
"""""""""""""""""""""
Source tree::
babar-1.0/
README
babar.sh
launch.sh
babar.py
:file:`setup.cfg`::
[files]
resources =
README = {doc}
*.sh = {scripts}
So babar.sh and launch.sh will be placed in the {scripts} directory.
Now let's move all the scripts into a scripts directory.
Resources in sub-directory
""""""""""""""""""""""""""
Source tree::
babar-1.1/
README
scripts/
babar.sh
launch.sh
LAUNCH
babar.py
:file:`setup.cfg`::
[files]
resources =
README = {doc}
scripts/ LAUNCH = {doc}
scripts/ *.sh = {scripts}
It's important to use the separator after scripts/ to install all the shell
scripts into {scripts} instead of {scripts}/scripts.
Now let's add some docs.
Resources in multiple sub-directories
"""""""""""""""""""""""""""""""""""""
Source tree::
babar-1.2/
README
scripts/
babar.sh
launch.sh
LAUNCH
docs/
api
man
babar.py
:file:`setup.cfg`::
[files]
resources =
README = {doc}
scripts/ LAUNCH = {doc}
scripts/ *.sh = {scripts}
        docs/ * = {doc}
        docs/ man = {man}
We want to place all the files in the docs directory into the {doc} category,
except man, which must be placed into the {man} category.  We use the order of
declaration of the globs to choose the destination: the last glob that matches
a file is used.
Now let's add some scripts for windows users.
Complete example
""""""""""""""""
Source tree::
babar-1.3/
README
doc/
api
man
scripts/
babar.sh
launch.sh
babar.bat
launch.bat
LAUNCH
:file:`setup.cfg`::
[files]
resources =
README = {doc}
scripts/ LAUNCH = {doc}
scripts/ *.{sh,bat} = {scripts}
doc/ * = {doc}
doc/ man = {man}
We use brace expansion syntax to place all the shell and batch scripts into
the {scripts} category.
Command sections
================
To pass options to commands without having to type them on the command line
for each invocation, you can write them in the :file:`setup.cfg` file, in a
section named after the command. Example::
[sdist]
# special function to add custom files
manifest-builders = package.setup.list_extra_files
[build]
use-2to3 = True
[build_ext]
inplace = on
[check]
strict = on
all = on
Option values given in the configuration file can be overridden on the command
line. See :ref:`packaging-setup-config` for more information.

View File

@ -0,0 +1,689 @@
.. _packaging-setup-script:
************************
Writing the Setup Script
************************
The setup script is the center of all activity in building, distributing, and
installing modules using Distutils. The main purpose of the setup script is
to describe your module distribution to Distutils, so that the various
commands that operate on your modules do the right thing. As we saw in section
:ref:`packaging-simple-example`, the setup script consists mainly of a
call to :func:`setup` where most of the information is supplied as
keyword arguments to :func:`setup`.
Here's a slightly more involved example, which we'll follow for the next couple
of sections: a setup script that could be used for Packaging itself::
#!/usr/bin/env python
from packaging.core import setup, find_packages
setup(name='Packaging',
version='1.0',
summary='Python Distribution Utilities',
keywords=['packaging', 'packaging'],
author=u'Tarek Ziadé',
author_email='tarek@ziade.org',
home_page='http://bitbucket.org/tarek/packaging/wiki/Home',
license='PSF',
packages=find_packages())
There are only two differences between this and the trivial one-file
distribution presented in section :ref:`packaging-simple-example`: more
metadata and the specification of pure Python modules by package rather than
by module. This is important since Packaging consists of a couple of dozen
modules split into (so far) two packages; an explicit list of every module
would be tedious to generate and difficult to maintain. For more information
on the additional metadata, see section :ref:`packaging-metadata`.
Note that any pathnames (files or directories) supplied in the setup script
should be written using the Unix convention, i.e. slash-separated. The
Distutils will take care of converting this platform-neutral representation into
whatever is appropriate on your current platform before actually using the
pathname. This makes your setup script portable across operating systems, which
of course is one of the major goals of the Distutils. In this spirit, all
pathnames in this document are slash-separated.
This, of course, only applies to pathnames given to Distutils functions. If
you, for example, use standard Python functions such as :func:`glob.glob` or
:func:`os.listdir` to specify files, you should be careful to write portable
code instead of hardcoding path separators::
glob.glob(os.path.join('mydir', 'subdir', '*.html'))
os.listdir(os.path.join('mydir', 'subdir'))
.. _packaging-listing-packages:
Listing whole packages
======================
The :option:`packages` option tells the Distutils to process (build, distribute,
install, etc.) all pure Python modules found in each package mentioned in the
:option:`packages` list. In order to do this, of course, there has to be a
correspondence between package names and directories in the filesystem. The
default correspondence is the most obvious one, i.e. package :mod:`packaging` is
found in the directory :file:`packaging` relative to the distribution root.
Thus, when you say ``packages = ['foo']`` in your setup script, you are
promising that the Distutils will find a file :file:`foo/__init__.py` (which
might be spelled differently on your system, but you get the idea) relative to
the directory where your setup script lives. If you break this promise, the
Distutils will issue a warning but still process the broken package anyway.
If you use a different convention to lay out your source directory, that's no
problem: you just have to supply the :option:`package_dir` option to tell the
Distutils about your convention. For example, say you keep all Python source
under :file:`lib`, so that modules in the "root package" (i.e., not in any
package at all) are in :file:`lib`, modules in the :mod:`foo` package are in
:file:`lib/foo`, and so forth. Then you would put ::
package_dir = {'': 'lib'}
in your setup script. The keys to this dictionary are package names, and an
empty package name stands for the root package. The values are directory names
relative to your distribution root. In this case, when you say ``packages =
['foo']``, you are promising that the file :file:`lib/foo/__init__.py` exists.
Another possible convention is to put the :mod:`foo` package right in
:file:`lib`, the :mod:`foo.bar` package in :file:`lib/bar`, etc. This would be
written in the setup script as ::
package_dir = {'foo': 'lib'}
A ``package: dir`` entry in the :option:`package_dir` dictionary implicitly
applies to all packages below *package*, so the :mod:`foo.bar` case is
automatically handled here. In this example, having ``packages = ['foo',
'foo.bar']`` tells the Distutils to look for :file:`lib/__init__.py` and
:file:`lib/bar/__init__.py`. (Keep in mind that although :option:`package_dir`
applies recursively, you must explicitly list all packages in
:option:`packages`: the Distutils will *not* recursively scan your source tree
looking for any directory with an :file:`__init__.py` file.)
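
Putting the pieces of this last convention together, a setup script for that
layout might look like the following sketch (the project name and version are
placeholders)::

   from packaging.core import setup

   setup(name='foobar',
         version='1.0',
         package_dir={'foo': 'lib'},
         packages=['foo', 'foo.bar'])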
.. _packaging-listing-modules:
Listing individual modules
==========================
For a small module distribution, you might prefer to list all modules rather
than listing packages---especially in the case of a single module that goes in the
"root package" (i.e., no package at all). This simplest case was shown in
section :ref:`packaging-simple-example`; here is a slightly more involved
example::
py_modules = ['mod1', 'pkg.mod2']
This describes two modules, one of them in the "root" package, the other in the
:mod:`pkg` package. Again, the default package/directory layout implies that
these two modules can be found in :file:`mod1.py` and :file:`pkg/mod2.py`, and
that :file:`pkg/__init__.py` exists as well. And again, you can override the
package/directory correspondence using the :option:`package_dir` option.
.. _packaging-describing-extensions:
Describing extension modules
============================
Just as writing Python extension modules is a bit more complicated than writing
pure Python modules, describing them to the Distutils is a bit more complicated.
Unlike pure modules, it's not enough just to list modules or packages and expect
the Distutils to go out and find the right files; you have to specify the
extension name, source file(s), and any compile/link requirements (include
directories, libraries to link with, etc.).
.. XXX read over this section
All of this is done through another keyword argument to :func:`setup`, the
:option:`ext_modules` option. :option:`ext_modules` is just a list of
:class:`Extension` instances, each of which describes a single extension module.
Suppose your distribution includes a single extension, called :mod:`foo` and
implemented by :file:`foo.c`. If no additional instructions to the
compiler/linker are needed, describing this extension is quite simple::
Extension('foo', ['foo.c'])
The :class:`Extension` class can be imported from :mod:`packaging.core` along
with :func:`setup`. Thus, the setup script for a module distribution that
contains only this one extension and nothing else might be::
from packaging.core import setup, Extension
setup(name='foo',
version='1.0',
ext_modules=[Extension('foo', ['foo.c'])])
The :class:`Extension` class (actually, the underlying extension-building
machinery implemented by the :command:`build_ext` command) supports a great deal
of flexibility in describing Python extensions, which is explained in the
following sections.
Extension names and packages
----------------------------
The first argument to the :class:`Extension` constructor is always the name of
the extension, including any package names. For example, ::
Extension('foo', ['src/foo1.c', 'src/foo2.c'])
describes an extension that lives in the root package, while ::
Extension('pkg.foo', ['src/foo1.c', 'src/foo2.c'])
describes the same extension in the :mod:`pkg` package. The source files and
resulting object code are identical in both cases; the only difference is where
in the filesystem (and therefore where in Python's namespace hierarchy) the
resulting extension lives.
If you have a number of extensions all in the same package (or all under the
same base package), use the :option:`ext_package` keyword argument to
:func:`setup`. For example, ::
setup(...,
ext_package='pkg',
ext_modules=[Extension('foo', ['foo.c']),
Extension('subpkg.bar', ['bar.c'])])
will compile :file:`foo.c` to the extension :mod:`pkg.foo`, and :file:`bar.c` to
:mod:`pkg.subpkg.bar`.
Extension source files
----------------------
The second argument to the :class:`Extension` constructor is a list of source
files. Since the Distutils currently only support C, C++, and Objective-C
extensions, these are normally C/C++/Objective-C source files. (Be sure to use
appropriate extensions to distinguish C++\ source files: :file:`.cc` and
:file:`.cpp` seem to be recognized by both Unix and Windows compilers.)
However, you can also include SWIG interface (:file:`.i`) files in the list; the
:command:`build_ext` command knows how to deal with SWIG extensions: it will run
SWIG on the interface file and compile the resulting C/C++ file into your
extension.
.. XXX SWIG support is rough around the edges and largely untested!
This warning notwithstanding, options to SWIG can be currently passed like
this::
setup(...,
ext_modules=[Extension('_foo', ['foo.i'],
swig_opts=['-modern', '-I../include'])],
py_modules=['foo'])
Or on the command line like this::
> python setup.py build_ext --swig-opts="-modern -I../include"
On some platforms, you can include non-source files that are processed by the
compiler and included in your extension. Currently, this just means Windows
message text (:file:`.mc`) files and resource definition (:file:`.rc`) files for
Visual C++. These will be compiled to binary resource (:file:`.res`) files and
linked into the executable.
Preprocessor options
--------------------
Three optional arguments to :class:`Extension` will help if you need to specify
include directories to search or preprocessor macros to define/undefine:
``include_dirs``, ``define_macros``, and ``undef_macros``.
For example, if your extension requires header files in the :file:`include`
directory under your distribution root, use the ``include_dirs`` option::
Extension('foo', ['foo.c'], include_dirs=['include'])
You can specify absolute directories there; if you know that your extension will
only be built on Unix systems with X11R6 installed to :file:`/usr`, you can get
away with ::
Extension('foo', ['foo.c'], include_dirs=['/usr/include/X11'])
You should avoid this sort of non-portable usage if you plan to distribute your
code: it's probably better to write C code like ::
#include <X11/Xlib.h>
If you need to include header files from some other Python extension, you can
take advantage of the fact that header files are installed in a consistent way
by the Distutils :command:`install_header` command. For example, the Numerical
Python header files are installed (on a standard Unix installation) to
:file:`/usr/local/include/python1.5/Numerical`. (The exact location will differ
according to your platform and Python installation.) Since the Python include
directory---\ :file:`/usr/local/include/python1.5` in this case---is always
included in the search path when building Python extensions, the best approach
is to write C code like ::
#include <Numerical/arrayobject.h>
.. TODO check if it's d2.sysconfig or the new sysconfig module now
If you must put the :file:`Numerical` include directory right into your header
search path, though, you can find that directory using the Distutils
:mod:`packaging.sysconfig` module::
from packaging.sysconfig import get_python_inc
incdir = os.path.join(get_python_inc(plat_specific=1), 'Numerical')
setup(...,
Extension(..., include_dirs=[incdir]))
Even though this is quite portable---it will work on any Python installation,
regardless of platform---it's probably easier to just write your C code in the
sensible way.
You can define and undefine preprocessor macros with the ``define_macros`` and
``undef_macros`` options. ``define_macros`` takes a list of ``(name, value)``
tuples, where ``name`` is the name of the macro to define (a string) and
``value`` is its value: either a string or ``None``. (Defining a macro ``FOO``
to ``None`` is the equivalent of a bare ``#define FOO`` in your C source: with
most compilers, this sets ``FOO`` to the string ``1``.) ``undef_macros`` is
just a list of macros to undefine.
For example::
Extension(...,
define_macros=[('NDEBUG', '1'),
('HAVE_STRFTIME', None)],
undef_macros=['HAVE_FOO', 'HAVE_BAR'])
is the equivalent of having this at the top of every C source file::
#define NDEBUG 1
#define HAVE_STRFTIME
#undef HAVE_FOO
#undef HAVE_BAR
Library options
---------------
You can also specify the libraries to link against when building your extension,
and the directories to search for those libraries. The ``libraries`` option is
a list of libraries to link against, ``library_dirs`` is a list of directories
to search for libraries at link-time, and ``runtime_library_dirs`` is a list of
directories to search for shared (dynamically loaded) libraries at run-time.
For example, if you need to link against libraries known to be in the standard
library search path on target systems ::
Extension(...,
libraries=['gdbm', 'readline'])
If you need to link with libraries in a non-standard location, you'll have to
include the location in ``library_dirs``::
Extension(...,
library_dirs=['/usr/X11R6/lib'],
libraries=['X11', 'Xt'])
(Again, this sort of non-portable construct should be avoided if you intend to
distribute your code.)
.. XXX Should mention clib libraries here or somewhere else!
Other options
-------------
There are still some other options which can be used to handle special cases.
The :option:`optional` option is a boolean; if it is true,
a build failure in the extension will not abort the build process, but
instead simply not install the failing extension.
The :option:`extra_objects` option is a list of object files to be passed to the
linker. These files must not have extensions, as the default extension for the
compiler is used.
:option:`extra_compile_args` and :option:`extra_link_args` can be used to
specify additional command-line options for the respective compiler and linker
command lines.
:option:`export_symbols` is only useful on Windows. It can contain a list of
symbols (functions or variables) to be exported. This option is not needed when
building compiled extensions: Distutils will automatically add
``PyInit_modulename`` to the list of exported symbols.
The :option:`depends` option is a list of files that the extension depends on
(for example header files). The build command will call the compiler on the
sources to rebuild the extension if any of these files has been modified since
the previous build.
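
For example, an extension exercising a few of these options might be declared
as in the following sketch; the compiler flags and file names are illustrative
only::

   Extension('foo', ['foo.c'],
             optional=True,
             extra_compile_args=['-g'],
             extra_link_args=['-Wl,-O1'],
             depends=['foo.h'])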
Relationships between Distributions and Packages
================================================
.. FIXME rewrite to update to PEP 345 (but without dist/release confusion)
A distribution may relate to packages in three specific ways:
#. It can require packages or modules.
#. It can provide packages or modules.
#. It can obsolete packages or modules.
These relationships can be specified using keyword arguments to the
:func:`packaging.core.setup` function.
Dependencies on other Python modules and packages can be specified by supplying
the *requires* keyword argument to :func:`setup`. The value must be a list of
strings. Each string specifies a package that is required, and optionally what
versions are sufficient.
To specify that any version of a module or package is required, the string
should consist entirely of the module or package name. Examples include
``'mymodule'`` and ``'xml.parsers.expat'``.
If specific versions are required, a sequence of qualifiers can be supplied in
parentheses. Each qualifier may consist of a comparison operator and a version
number. The accepted comparison operators are::
< > ==
<= >= !=
These can be combined by using multiple qualifiers separated by commas (and
optional whitespace). In this case, all of the qualifiers must be matched; a
logical AND is used to combine the evaluations.
Let's look at a bunch of examples:
+-------------------------+----------------------------------------------+
| Requires Expression | Explanation |
+=========================+==============================================+
| ``==1.0`` | Only version ``1.0`` is compatible |
+-------------------------+----------------------------------------------+
| ``>1.0, !=1.5.1, <2.0`` | Any version after ``1.0`` and before ``2.0`` |
| | is compatible, except ``1.5.1`` |
+-------------------------+----------------------------------------------+
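For instance, a sketch of declaring these dependencies in the setup script,
using the example names from above plus an illustrative ``othermodule``::
    setup(...,
          requires=['mymodule',
                    'xml.parsers.expat',
                    'othermodule (>1.0, !=1.5.1, <2.0)'])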
Now that we can specify dependencies, we also need to be able to specify what we
provide that other distributions can require. This is done using the *provides*
keyword argument to :func:`setup`. The value for this keyword is a list of
strings, each of which names a Python module or package, and optionally
identifies the version. If the version is not specified, it is assumed to match
that of the distribution.
Some examples:
+---------------------+----------------------------------------------+
| Provides Expression | Explanation |
+=====================+==============================================+
| ``mypkg`` | Provide ``mypkg``, using the distribution |
| | version |
+---------------------+----------------------------------------------+
| ``mypkg (1.1)`` | Provide ``mypkg`` version 1.1, regardless of |
| | the distribution version |
+---------------------+----------------------------------------------+
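In the setup script this might look like the following sketch::
    setup(...,
          provides=['mypkg (1.1)'])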
A package can declare that it obsoletes other packages using the *obsoletes*
keyword argument. The value for this is similar to that of the *requires*
keyword: a list of strings giving module or package specifiers. Each specifier
consists of a module or package name optionally followed by one or more version
qualifiers. Version qualifiers are given in parentheses after the module or
package name.
The versions identified by the qualifiers are those that are obsoleted by the
distribution being described. If no qualifiers are given, all versions of the
named module or package are understood to be obsoleted.
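For example, a project replacing an older one might declare (the names and
versions here are illustrative)::
    setup(...,
          obsoletes=['oldpkg', 'otherpkg (<2.0)'])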
.. _packaging-installing-scripts:
Installing Scripts
==================
So far we have been dealing with pure and non-pure Python modules, which are
usually not run by themselves but imported by scripts.
Scripts are files containing Python source code, intended to be started from the
command line. Scripts don't require Distutils to do anything very complicated.
The only clever feature is that if the first line of the script starts with
``#!`` and contains the word "python", the Distutils will adjust the first line
to refer to the current interpreter location. The :option:`--executable` (or
:option:`-e`) option allows the interpreter path to be explicitly overridden.
The :option:`scripts` option simply is a list of files to be handled in this
way. From the PyXML setup script::
setup(...,
scripts=['scripts/xmlproc_parse', 'scripts/xmlproc_val'])
All the scripts will also be added to the ``MANIFEST`` file if no template is
provided. See :ref:`packaging-manifest`.
.. _packaging-installing-package-data:
Installing Package Data
=======================
Often, additional files need to be installed into a package. These files are
often data that's closely related to the package's implementation, or text files
containing documentation that might be of interest to programmers using the
package. These files are called :dfn:`package data`.
Package data can be added to packages using the ``package_data`` keyword
argument to the :func:`setup` function. The value must be a mapping from
package name to a list of relative path names that should be copied into the
package. The paths are interpreted as relative to the directory containing the
package (information from the ``package_dir`` mapping is used if appropriate);
that is, the files are expected to be part of the package in the source
directories. They may contain glob patterns as well.
The path names may contain directory portions; any necessary directories will be
created in the installation.
For example, if a package should contain a subdirectory with several data files,
the files can be arranged like this in the source tree::
setup.py
src/
mypkg/
__init__.py
module.py
data/
tables.dat
spoons.dat
forks.dat
The corresponding call to :func:`setup` might be::
setup(...,
packages=['mypkg'],
package_dir={'mypkg': 'src/mypkg'},
package_data={'mypkg': ['data/*.dat']})
All the files that match ``package_data`` will be added to the ``MANIFEST``
file if no template is provided. See :ref:`packaging-manifest`.
.. _packaging-additional-files:
Installing Additional Files
===========================
The :option:`data_files` option can be used to specify additional files needed
by the module distribution: configuration files, message catalogs, data files,
anything which doesn't fit in the previous categories.
:option:`data_files` specifies a sequence of (*directory*, *files*) pairs in the
following way::
setup(...,
data_files=[('bitmaps', ['bm/b1.gif', 'bm/b2.gif']),
('config', ['cfg/data.cfg']),
('/etc/init.d', ['init-script'])])
Note that you can specify the directory names where the data files will be
installed, but you cannot rename the data files themselves.
Each (*directory*, *files*) pair in the sequence specifies the installation
directory and the files to install there. If *directory* is a relative path, it
is interpreted relative to the installation prefix (Python's ``sys.prefix`` for
pure-Python packages, ``sys.exec_prefix`` for packages that contain extension
modules). Each file name in *files* is interpreted relative to the
:file:`setup.py` script at the top of the package source distribution. No
directory information from *files* is used to determine the final location of
the installed file; only the name of the file is used.
You can specify the :option:`data_files` option as a simple sequence of files
without specifying a target directory, but this is not recommended, and the
:command:`install_dist` command will print a warning in this case. To install data
files directly in the target directory, an empty string should be given as the
directory.
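For instance, to install a single file directly into the target directory
(the file name is illustrative)::
    setup(...,
          data_files=[('', ['README.txt'])])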
All the files that match ``data_files`` will be added to the ``MANIFEST`` file
if no template is provided. See :ref:`packaging-manifest`.
.. _packaging-metadata:
Metadata reference
==================
The setup script may include additional metadata beyond the name and version.
This table describes required and additional information:
.. TODO synchronize with setupcfg; link to it (but don't remove it, it's a
useful summary)
+----------------------+---------------------------+-----------------+--------+
| Meta-Data | Description | Value | Notes |
+======================+===========================+=================+========+
| ``name`` | name of the project | short string | \(1) |
+----------------------+---------------------------+-----------------+--------+
| ``version`` | version of this release | short string | (1)(2) |
+----------------------+---------------------------+-----------------+--------+
| ``author`` | project author's name | short string | \(3) |
+----------------------+---------------------------+-----------------+--------+
| ``author_email`` | email address of the | email address | \(3) |
| | project author | | |
+----------------------+---------------------------+-----------------+--------+
| ``maintainer`` | project maintainer's name | short string | \(3) |
+----------------------+---------------------------+-----------------+--------+
| ``maintainer_email`` | email address of the | email address | \(3) |
| | project maintainer | | |
+----------------------+---------------------------+-----------------+--------+
| ``home_page`` | home page for the project | URL | \(1) |
+----------------------+---------------------------+-----------------+--------+
| ``summary`` | short description of the | short string | |
| | project | | |
+----------------------+---------------------------+-----------------+--------+
| ``description`` | longer description of the | long string | \(5) |
| | project | | |
+----------------------+---------------------------+-----------------+--------+
| ``download_url`` | location where the | URL | |
| | project may be downloaded | | |
+----------------------+---------------------------+-----------------+--------+
| ``classifiers`` | a list of classifiers | list of strings | \(4) |
+----------------------+---------------------------+-----------------+--------+
| ``platforms`` | a list of platforms | list of strings | |
+----------------------+---------------------------+-----------------+--------+
| ``license`` | license for the release | short string | \(6) |
+----------------------+---------------------------+-----------------+--------+
Notes:
(1)
These fields are required.
(2)
It is recommended that versions take the form *major.minor[.patch[.sub]]*.
(3)
Either the author or the maintainer must be identified.
(4)
The list of classifiers is available from the `PyPI website
<http://pypi.python.org/pypi>`_. See also :mod:`packaging.create`.
(5)
The ``description`` field is used by PyPI when you are registering a
release, to build its PyPI page.
(6)
The ``license`` field is a text indicating the license covering the
distribution where the license is not a selection from the "License" Trove
classifiers. See the ``Classifier`` field. Notice that
there's a ``licence`` distribution option which is deprecated but still
acts as an alias for ``license``.
'short string'
A single line of text, not more than 200 characters.
'long string'
Multiple lines of plain text in reStructuredText format (see
http://docutils.sf.net/).
'list of strings'
See below.
In Python 2.x, "string value" means a unicode object. If a byte string (str or
bytes) is given, it has to be valid ASCII.
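As a rough sketch of how these fields fit together, assuming :func:`setup`
accepts the field names exactly as listed in the table above (all values are
placeholders)::
    setup(name='mypkg',
          version='0.1.0',
          author='A. N. Author',
          author_email='author@example.org',
          summary='Short description of mypkg',
          home_page='http://example.org/mypkg',
          license='PSF',
          classifiers=['Development Status :: 4 - Beta'])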
.. TODO move this section to the version document, keep a summary, add a link
Encoding the version information is an art in itself. Python projects generally
adhere to the version format *major.minor[.patch][sub]*. The major number is 0
for initial, experimental releases of software. It is incremented for releases
that represent major milestones in a project. The minor number is incremented
when important new features are added to the project. The patch number
increments when bug-fix releases are made. Additional trailing version
information is sometimes used to indicate sub-releases. These are
"a1,a2,...,aN" (for alpha releases, where functionality and API may change),
"b1,b2,...,bN" (for beta releases, which only fix bugs) and "pr1,pr2,...,prN"
(for final pre-release release testing). Some examples:
0.1.0
the first, experimental release of a project
1.0.1a2
the second alpha release of the first patch version of 1.0
:option:`classifiers` are specified in a Python list::
setup(...,
classifiers=[
'Development Status :: 4 - Beta',
'Environment :: Console',
'Environment :: Web Environment',
'Intended Audience :: End Users/Desktop',
'Intended Audience :: Developers',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Python Software Foundation License',
'Operating System :: MacOS :: MacOS X',
'Operating System :: Microsoft :: Windows',
'Operating System :: POSIX',
'Programming Language :: Python',
'Topic :: Communications :: Email',
'Topic :: Office/Business',
'Topic :: Software Development :: Bug Tracking',
])
Debugging the setup script
==========================
Sometimes things go wrong, and the setup script doesn't do what the developer
wants.
Distutils catches any exceptions when running the setup script, and prints a
simple error message before the script is terminated. The motivation for this
behaviour is to not confuse administrators who don't know much about Python and
are trying to install a project. If they get a big long traceback from deep
inside the guts of Distutils, they may think the project or the Python
installation is broken because they don't read all the way down to the bottom
and see that it's a permission problem.
.. FIXME DISTUTILS_DEBUG is dead, document logging/warnings here
On the other hand, this doesn't help the developer find the cause of the
failure. For this purpose, the :envvar:`DISTUTILS_DEBUG` environment variable
can be set to anything except an empty string, and Packaging will then print
detailed information about what it is doing and the full traceback in case an
exception occurs.

View File

@ -0,0 +1,273 @@
.. _packaging-source-dist:
******************************
Creating a Source Distribution
******************************
As shown in section :ref:`packaging-simple-example`, you use the :command:`sdist` command
to create a source distribution. In the simplest case, ::
python setup.py sdist
(assuming you haven't specified any :command:`sdist` options in the setup script
or config file), :command:`sdist` creates the archive of the default format for
the current platform. The default format is a gzip'ed tar file
(:file:`.tar.gz`) on Unix, and a ZIP file on Windows.
You can specify as many formats as you like using the :option:`--formats`
option, for example::
python setup.py sdist --formats=gztar,zip
to create a gzipped tarball and a zip file. The available formats are:
+-----------+-------------------------+---------+
| Format | Description | Notes |
+===========+=========================+=========+
| ``zip`` | zip file (:file:`.zip`) | (1),(3) |
+-----------+-------------------------+---------+
| ``gztar`` | gzip'ed tar file | \(2) |
| | (:file:`.tar.gz`) | |
+-----------+-------------------------+---------+
| ``bztar`` | bzip2'ed tar file | |
| | (:file:`.tar.bz2`) | |
+-----------+-------------------------+---------+
| ``ztar`` | compressed tar file | \(4) |
| | (:file:`.tar.Z`) | |
+-----------+-------------------------+---------+
| ``tar`` | tar file (:file:`.tar`) | |
+-----------+-------------------------+---------+
Notes:
(1)
default on Windows
(2)
default on Unix
(3)
requires either external :program:`zip` utility or :mod:`zipfile` module (part
of the standard Python library since Python 1.6)
(4)
requires the :program:`compress` program. Notice that this format is pending
deprecation and will be removed in future versions of Python.
When using any ``tar`` format (``gztar``, ``bztar``, ``ztar`` or
``tar``) under Unix, you can specify the ``owner`` and ``group`` names
that will be set for each member of the archive.
For example, if you want all files of the archive to be owned by root::
python setup.py sdist --owner=root --group=root
.. _packaging-manifest:
Specifying the files to distribute
==================================
If you don't supply an explicit list of files (or instructions on how to
generate one), the :command:`sdist` command puts a minimal default set into the
source distribution:
* all Python source files implied by the :option:`py_modules` and
:option:`packages` options
* all C source files mentioned in the :option:`ext_modules` or
:option:`libraries` options
* scripts identified by the :option:`scripts` option
See :ref:`packaging-installing-scripts`.
* anything that looks like a test script: :file:`test/test\*.py` (currently,
Packaging doesn't do anything with test scripts except include them in source
distributions, but in the future there will be a standard for testing Python
module distributions)
* the configuration file :file:`setup.cfg`
* all files that match the ``package_data`` metadata.
See :ref:`packaging-installing-package-data`.
* all files that match the ``data_files`` metadata.
See :ref:`packaging-additional-files`.
Unlike Distutils, :file:`README` (or :file:`README.txt`) and
:file:`setup.py` are not included by default.
Sometimes this is enough, but usually you will want to specify additional files
to distribute. The typical way to do this is to write a *manifest template*,
called :file:`MANIFEST.in` by default. The manifest template is just a list of
instructions for how to generate your manifest file, :file:`MANIFEST`, which is
the exact list of files to include in your source distribution. The
:command:`sdist` command processes this template and generates a manifest based
on its instructions and what it finds in the filesystem.
If you prefer to roll your own manifest file, the format is simple: one filename
per line, regular files (or symlinks to them) only. If you do supply your own
:file:`MANIFEST`, you must specify everything: the default set of files
described above does not apply in this case.
:file:`MANIFEST` files start with a comment indicating they are generated.
Files without this comment are not overwritten or removed.
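For instance, a hand-maintained :file:`MANIFEST` could be as short as this
(the file names are illustrative)::
    README.txt
    setup.py
    src/mypkg/__init__.py
    src/mypkg/module.py
    src/mypkg/data/tables.dat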
See :ref:`packaging-manifest-template` section for a syntax reference.
.. _packaging-manifest-options:
Manifest-related options
========================
The normal course of operations for the :command:`sdist` command is as follows:
* if the manifest file, :file:`MANIFEST` doesn't exist, read :file:`MANIFEST.in`
and create the manifest
* if neither :file:`MANIFEST` nor :file:`MANIFEST.in` exist, create a manifest
with just the default file set
* if either :file:`MANIFEST.in` or the setup script (:file:`setup.py`) are more
recent than :file:`MANIFEST`, recreate :file:`MANIFEST` by reading
:file:`MANIFEST.in`
* use the list of files now in :file:`MANIFEST` (either just generated or read
in) to create the source distribution archive(s)
There are a couple of options that modify this behaviour. First, use the
:option:`--no-defaults` and :option:`--no-prune` options to disable the
standard "include" and "exclude" sets.
Second, you might just want to (re)generate the manifest, but not create a
source distribution::
python setup.py sdist --manifest-only
:option:`-o` is a shortcut for :option:`--manifest-only`.
.. _packaging-manifest-template:
The MANIFEST.in template
========================
A :file:`MANIFEST.in` file can be added in a project to define the list of
files to include in the distribution built by the :command:`sdist` command.
When :command:`sdist` is run, it will look for the :file:`MANIFEST.in` file
and interpret it to generate the :file:`MANIFEST` file that contains the
list of files that will be included in the package.
This mechanism can be used when the default list of files is not enough.
(See :ref:`packaging-manifest`).
Principle
---------
The manifest template has one command per line, where each command specifies a
set of files to include or exclude from the source distribution. For an
example, let's look at Packaging's own manifest template::
include *.txt
recursive-include examples *.txt *.py
prune examples/sample?/build
The meanings should be fairly clear: include all files in the distribution root
matching :file:`\*.txt`, all files anywhere under the :file:`examples` directory
matching :file:`\*.txt` or :file:`\*.py`, and exclude all directories matching
:file:`examples/sample?/build`. All of this is done *after* the standard
include set, so you can exclude files from the standard set with explicit
instructions in the manifest template. (Or, you can use the
:option:`--no-defaults` option to disable the standard set entirely.)
The order of commands in the manifest template matters: initially, we have the
list of default files as described above, and each command in the template adds
to or removes from that list of files. Once we have fully processed the
manifest template, we remove files that should not be included in the source
distribution:
* all files in the Packaging "build" tree (default :file:`build/`)
* all files in directories named :file:`RCS`, :file:`CVS`, :file:`.svn`,
:file:`.hg`, :file:`.git`, :file:`.bzr` or :file:`_darcs`
Now we have our complete list of files, which is written to the manifest for
future reference, and then used to build the source distribution archive(s).
You can disable the default set of included files with the
:option:`--no-defaults` option, and you can disable the standard exclude set
with :option:`--no-prune`.
Following Packaging's own manifest template, let's trace how the
:command:`sdist` command builds the list of files to include in the Packaging
source distribution:
#. include all Python source files in the :file:`packaging` and
:file:`packaging/command` subdirectories (because packages corresponding to
those two directories were mentioned in the :option:`packages` option in the
setup script---see section :ref:`packaging-setup-script`)
#. include :file:`README.txt`, :file:`setup.py`, and :file:`setup.cfg` (standard
files)
#. include :file:`test/test\*.py` (standard files)
#. include :file:`\*.txt` in the distribution root (this will find
:file:`README.txt` a second time, but such redundancies are weeded out later)
#. include anything matching :file:`\*.txt` or :file:`\*.py` in the sub-tree
under :file:`examples`,
#. exclude all files in the sub-trees starting at directories matching
:file:`examples/sample?/build`\ ---this may exclude files included by the
previous two steps, so it's important that the ``prune`` command in the manifest
template comes after the ``recursive-include`` command
#. exclude the entire :file:`build` tree, and any :file:`RCS`, :file:`CVS`,
:file:`.svn`, :file:`.hg`, :file:`.git`, :file:`.bzr` and :file:`_darcs`
directories
Just like in the setup script, file and directory names in the manifest template
should always be slash-separated; Packaging will take care of converting
them to the standard representation on your platform. That way, the manifest
template is portable across operating systems.
Commands
--------
The manifest template commands are:
+-------------------------------------------+-----------------------------------------------+
| Command | Description |
+===========================================+===============================================+
| :command:`include pat1 pat2 ...` | include all files matching any of the listed |
| | patterns |
+-------------------------------------------+-----------------------------------------------+
| :command:`exclude pat1 pat2 ...` | exclude all files matching any of the listed |
| | patterns |
+-------------------------------------------+-----------------------------------------------+
| :command:`recursive-include dir pat1 pat2 | include all files under *dir* matching any of |
| ...` | the listed patterns |
+-------------------------------------------+-----------------------------------------------+
| :command:`recursive-exclude dir pat1 pat2 | exclude all files under *dir* matching any of |
| ...` | the listed patterns |
+-------------------------------------------+-----------------------------------------------+
| :command:`global-include pat1 pat2 ...` | include all files anywhere in the source tree |
| | matching --- & any of the listed patterns |
+-------------------------------------------+-----------------------------------------------+
| :command:`global-exclude pat1 pat2 ...` | exclude all files anywhere in the source tree |
| | matching --- & any of the listed patterns |
+-------------------------------------------+-----------------------------------------------+
| :command:`prune dir` | exclude all files under *dir* |
+-------------------------------------------+-----------------------------------------------+
| :command:`graft dir` | include all files under *dir* |
+-------------------------------------------+-----------------------------------------------+
The patterns here are Unix-style "glob" patterns: ``*`` matches any sequence of
regular filename characters, ``?`` matches any single regular filename
character, and ``[range]`` matches any of the characters in *range* (e.g.,
``a-z``, ``a-zA-Z``, ``a-f0-9_.``). The definition of "regular filename
character" is platform-specific: on Unix it is anything except slash; on Windows
anything except backslash or colon.
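For instance, a template exercising several of these commands and glob forms
might read (the directory and pattern names are illustrative)::
    graft docs
    recursive-exclude docs *.tmp
    global-exclude *.py[co]
    prune examples/old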

Doc/packaging/tutorial.rst Normal file
View File

@ -0,0 +1,112 @@
==================
Packaging tutorial
==================
Welcome to the Packaging tutorial! We will learn how to use Packaging
to package your project.
.. TODO merge with introduction.rst
Getting started
---------------
Packaging works with the *setup.cfg* file. It contains all the metadata for
your project, as defined in PEP 345, but also declares what your project
contains.
Let's say you have a project called *CLVault* containing one package called
*clvault*, and a few scripts inside. You can use the *pysetup* script to create
a *setup.cfg* file for the project. The script will ask you a few questions::
$ mkdir CLVault
$ cd CLVault
$ pysetup create
Project name [CLVault]:
Current version number: 0.1
Package description:
>Command-line utility to store and retrieve passwords
Author name: Tarek Ziade
Author e-mail address: tarek@ziade.org
Project Home Page: http://bitbucket.org/tarek/clvault
Do you want to add a package ? (y/n): y
Package name: clvault
Do you want to add a package ? (y/n): n
Do you want to set Trove classifiers? (y/n): y
Please select the project status:
1 - Planning
2 - Pre-Alpha
3 - Alpha
4 - Beta
5 - Production/Stable
6 - Mature
7 - Inactive
Status: 3
What license do you use: GPL
Matching licenses:
1) License :: OSI Approved :: GNU General Public License (GPL)
2) License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)
Type the number of the license you wish to use or ? to try again:: 1
Do you want to set other trove identifiers (y/n) [n]: n
Wrote "setup.cfg".
A setup.cfg file is created, containing the metadata of your project and the
list of packages it contains::
$ cat setup.cfg
[metadata]
name = CLVault
version = 0.1
author = Tarek Ziade
author_email = tarek@ziade.org
description = Command-line utility to store and retrieve passwords
home_page = http://bitbucket.org/tarek/clvault
classifier = Development Status :: 3 - Alpha
License :: OSI Approved :: GNU General Public License (GPL)
[files]
packages = clvault
Our project will depend on the *keyring* project. Let's add it to the
[metadata] section::
[metadata]
...
requires_dist =
keyring
Running commands
----------------
You can run useful commands on your project once the setup.cfg file is ready:
- sdist: creates a source distribution
- register: register your project to PyPI
- upload: upload the distribution to PyPI
- install_dist: install it
All commands are run using the run script::
$ pysetup run install_dist
$ pysetup run sdist
$ pysetup run upload
If you want to push a source distribution of your project to PyPI, do::
$ pysetup run sdist register upload
Installing the project
----------------------
The project can be installed by manually running the packaging install command::
$ pysetup run install_dist

View File

@ -0,0 +1,80 @@
.. _packaging-package-upload:
***************************************
Uploading Packages to the Package Index
***************************************
The Python Package Index (PyPI) not only stores the package info, but also the
package data if the author of the package wishes to. The packaging command
:command:`upload` pushes the distribution files to PyPI.
The command is invoked immediately after building one or more distribution
files. For example, the command ::
python setup.py sdist bdist_wininst upload
will cause the source distribution and the Windows installer to be uploaded to
PyPI. Note that these will be uploaded even if they are built using an earlier
invocation of :file:`setup.py`, but that only distributions named on the command
line for the invocation including the :command:`upload` command are uploaded.
The :command:`upload` command uses the username, password, and repository URL
from the :file:`$HOME/.pypirc` file (see section :ref:`packaging-pypirc` for more on this
file). If a :command:`register` command was previously called in the same
command, and if the password was entered in the prompt, :command:`upload` will
reuse the entered password. This is useful if you do not want to store a clear
text password in the :file:`$HOME/.pypirc` file.
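As a rough sketch only (treat the exact section names as an assumption here;
see :ref:`packaging-pypirc` for the authoritative layout), such a file
typically looks like::
    [distutils]
    index-servers =
        pypi
    [pypi]
    username: myusername
    password: mysecret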
You can specify another PyPI server with the :option:`--repository=*url*`
option::
python setup.py sdist bdist_wininst upload -r http://example.com/pypi
See section :ref:`packaging-pypirc` for more on defining several servers.
You can use the :option:`--sign` option to tell :command:`upload` to sign each
uploaded file using GPG (GNU Privacy Guard). The :program:`gpg` program must
be available for execution on the system :envvar:`PATH`. You can also specify
which key to use for signing using the :option:`--identity=*name*` option.
Other :command:`upload` options include :option:`--repository=<url>` or
:option:`--repository=<section>` where *url* is the url of the server and
*section* the name of the section in :file:`$HOME/.pypirc`, and
:option:`--show-response` (which displays the full response text from the PyPI
server for help in debugging upload problems).
PyPI package display
====================
The ``description`` field plays a special role at PyPI. It is used by
the server to display a home page for the registered package.
If you use the `reStructuredText <http://docutils.sourceforge.net/rst.html>`_
syntax for this field, PyPI will parse it and display an HTML output for
the package home page.
The ``description`` field can be filled from a text file located in the
project::
from packaging.core import setup
fp = open('README.txt')
try:
description = fp.read()
finally:
fp.close()
setup(name='Packaging',
description=description)
In that case, :file:`README.txt` is a regular reStructuredText text file located
in the root of the package, next to :file:`setup.py`.
To prevent registering broken reStructuredText content, you can use the
:program:`rst2html` program that is provided by the :mod:`docutils` package
and check the ``description`` from the command line::
$ python setup.py --description | rst2html.py > output.html
:mod:`docutils` will display a warning if there's something wrong with your
syntax.

View File

@ -20,10 +20,10 @@
<span class="linkdescr">tutorial for C/C++ programmers</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("c-api/index") }}">Python/C API</a><br/>
<span class="linkdescr">reference for C/C++ programmers</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("install/index") }}">Installing Python Modules</a><br/>
<span class="linkdescr">information for installers &amp; sys-admins</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("distutils/index") }}">Distributing Python Modules</a><br/>
<span class="linkdescr">sharing modules with others</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("install/index") }}">Installing Python Projects</a><br/>
<span class="linkdescr">finding and installing modules and applications</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("packaging/index") }}">Distributing Python Projects</a><br/>
<span class="linkdescr">packaging and distributing modules and applications</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("documenting/index") }}">Documenting Python</a><br/>
<span class="linkdescr">guide for documentation authors</span></p>
<p class="biglink"><a class="biglink" href="{{ pathto("faq/index") }}">FAQs</a><br/>

View File

@ -106,6 +106,11 @@ connection when done::
os
--
* The :mod:`os` module has a new :func:`~os.pipe2` function that makes it
possible to create a pipe with :data:`~os.O_CLOEXEC` or
:data:`~os.O_NONBLOCK` flags set atomically. This is especially useful to
avoid race conditions in multi-threaded programs.
* The :mod:`os` module has a new :func:`~os.sendfile` function which provides
an efficient "zero-copy" way for copying data from one file (or socket)
descriptor to another. The phrase "zero-copy" refers to the fact that all of
@ -124,6 +129,27 @@ os
(Patch submitted by Giampaolo Rodolà in :issue:`10784`.)
packaging
---------
:mod:`distutils` has undergone additions and refactoring under a new name,
:mod:`packaging`, to allow developers to break backward compatibility.
:mod:`distutils` is still provided in the standard library, but users are
encouraged to transition to :mod:`packaging`. For older versions of Python, a
backport compatible with 2.4+ and 3.1+ will be made available on PyPI under the
name :mod:`distutils2`.
.. TODO add examples and howto to the packaging docs and link to them
pydoc
-----
The Tk GUI and the :func:`~pydoc.serve` function have been removed from the
:mod:`pydoc` module: ``pydoc -g`` and :func:`~pydoc.serve` have been deprecated
in Python 3.2.
sys
---
@ -152,6 +178,16 @@ signal
instead of a RuntimeError: OSError has an errno attribute.
ssl
---
The :mod:`ssl` module has new functions:
* :func:`~ssl.RAND_bytes`: generate cryptographically strong
pseudo-random bytes.
* :func:`~ssl.RAND_pseudo_bytes`: generate pseudo-random bytes.
Optimizations
=============

View File

@ -36,6 +36,8 @@ typedef struct _keyword *keyword_ty;
typedef struct _alias *alias_ty;
typedef struct _withitem *withitem_ty;
enum _mod_kind {Module_kind=1, Interactive_kind=2, Expression_kind=3,
Suite_kind=4};
@ -64,10 +66,9 @@ struct _mod {
enum _stmt_kind {FunctionDef_kind=1, ClassDef_kind=2, Return_kind=3,
Delete_kind=4, Assign_kind=5, AugAssign_kind=6, For_kind=7,
While_kind=8, If_kind=9, With_kind=10, Raise_kind=11,
TryExcept_kind=12, TryFinally_kind=13, Assert_kind=14,
Import_kind=15, ImportFrom_kind=16, Global_kind=17,
Nonlocal_kind=18, Expr_kind=19, Pass_kind=20, Break_kind=21,
Continue_kind=22};
Try_kind=12, Assert_kind=13, Import_kind=14,
ImportFrom_kind=15, Global_kind=16, Nonlocal_kind=17,
Expr_kind=18, Pass_kind=19, Break_kind=20, Continue_kind=21};
struct _stmt {
enum _stmt_kind kind;
union {
@ -128,8 +129,7 @@ struct _stmt {
} If;
struct {
expr_ty context_expr;
expr_ty optional_vars;
asdl_seq *items;
asdl_seq *body;
} With;
@ -142,12 +142,8 @@ struct _stmt {
asdl_seq *body;
asdl_seq *handlers;
asdl_seq *orelse;
} TryExcept;
struct {
asdl_seq *body;
asdl_seq *finalbody;
} TryFinally;
} Try;
struct {
expr_ty test;
@ -383,6 +379,11 @@ struct _alias {
identifier asname;
};
struct _withitem {
expr_ty context_expr;
expr_ty optional_vars;
};
#define Module(a0, a1) _Py_Module(a0, a1)
mod_ty _Py_Module(asdl_seq * body, PyArena *arena);
@ -421,18 +422,16 @@ stmt_ty _Py_While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno,
#define If(a0, a1, a2, a3, a4, a5) _Py_If(a0, a1, a2, a3, a4, a5)
stmt_ty _Py_If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno,
int col_offset, PyArena *arena);
#define With(a0, a1, a2, a3, a4, a5) _Py_With(a0, a1, a2, a3, a4, a5)
stmt_ty _Py_With(expr_ty context_expr, expr_ty optional_vars, asdl_seq * body,
int lineno, int col_offset, PyArena *arena);
#define With(a0, a1, a2, a3, a4) _Py_With(a0, a1, a2, a3, a4)
stmt_ty _Py_With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset,
PyArena *arena);
#define Raise(a0, a1, a2, a3, a4) _Py_Raise(a0, a1, a2, a3, a4)
stmt_ty _Py_Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset,
PyArena *arena);
#define TryExcept(a0, a1, a2, a3, a4, a5) _Py_TryExcept(a0, a1, a2, a3, a4, a5)
stmt_ty _Py_TryExcept(asdl_seq * body, asdl_seq * handlers, asdl_seq * orelse,
int lineno, int col_offset, PyArena *arena);
#define TryFinally(a0, a1, a2, a3, a4) _Py_TryFinally(a0, a1, a2, a3, a4)
stmt_ty _Py_TryFinally(asdl_seq * body, asdl_seq * finalbody, int lineno, int
col_offset, PyArena *arena);
#define Try(a0, a1, a2, a3, a4, a5, a6) _Py_Try(a0, a1, a2, a3, a4, a5, a6)
stmt_ty _Py_Try(asdl_seq * body, asdl_seq * handlers, asdl_seq * orelse,
asdl_seq * finalbody, int lineno, int col_offset, PyArena
*arena);
#define Assert(a0, a1, a2, a3, a4) _Py_Assert(a0, a1, a2, a3, a4)
stmt_ty _Py_Assert(expr_ty test, expr_ty msg, int lineno, int col_offset,
PyArena *arena);
@ -547,6 +546,9 @@ arg_ty _Py_arg(identifier arg, expr_ty annotation, PyArena *arena);
keyword_ty _Py_keyword(identifier arg, expr_ty value, PyArena *arena);
#define alias(a0, a1, a2) _Py_alias(a0, a1, a2)
alias_ty _Py_alias(identifier name, identifier asname, PyArena *arena);
#define withitem(a0, a1, a2) _Py_withitem(a0, a1, a2)
withitem_ty _Py_withitem(expr_ty context_expr, expr_ty optional_vars, PyArena
*arena);
PyObject* PyAST_mod2obj(mod_ty t);
mod_ty PyAST_obj2mod(PyObject* ast, PyArena* arena, int mode);

View File

@ -44,7 +44,7 @@ PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock(
const char *name /* UTF-8 encoded string */
);
PyAPI_FUNC(PyObject *) PyImport_ImportModuleLevel(
char *name, /* UTF-8 encoded string */
const char *name, /* UTF-8 encoded string */
PyObject *globals,
PyObject *locals,
PyObject *fromlist,

View File

@ -558,7 +558,11 @@ class RawIOBase(IOBase):
if not data:
break
res += data
return bytes(res)
if res:
return bytes(res)
else:
# b'' or None
return data
def readinto(self, b):
"""Read up to len(b) bytes into bytearray b.
@ -940,6 +944,12 @@ class BufferedReader(_BufferedIOMixin):
# Special case for when the number of bytes to read is unspecified.
if n is None or n == -1:
self._reset_read_buf()
if hasattr(self.raw, 'readall'):
chunk = self.raw.readall()
if chunk is None:
return buf[pos:] or None
else:
return buf[pos:] + chunk
chunks = [buf[pos:]] # Strip the consumed bytes.
current_size = 0
while True:

View File

@ -76,6 +76,10 @@ class BZ2File(io.BufferedIOBase):
mode = "wb"
mode_code = _MODE_WRITE
self._compressor = BZ2Compressor()
elif mode in ("a", "ab"):
mode = "ab"
mode_code = _MODE_WRITE
self._compressor = BZ2Compressor()
else:
raise ValueError("Invalid mode: {!r}".format(mode))
@ -155,20 +159,31 @@ class BZ2File(io.BufferedIOBase):
if not self.seekable():
self._check_not_closed()
raise io.UnsupportedOperation("Seeking is only supported "
"on files opening for reading")
"on files open for reading")
# Fill the readahead buffer if it is empty. Returns False on EOF.
def _fill_buffer(self):
if self._buffer:
return True
if self._decompressor.eof:
self._mode = _MODE_READ_EOF
self._size = self._pos
return False
rawblock = self._fp.read(_BUFFER_SIZE)
if self._decompressor.unused_data:
rawblock = self._decompressor.unused_data
else:
rawblock = self._fp.read(_BUFFER_SIZE)
if not rawblock:
raise EOFError("Compressed file ended before the "
"end-of-stream marker was reached")
if self._decompressor.eof:
self._mode = _MODE_READ_EOF
self._size = self._pos
return False
else:
raise EOFError("Compressed file ended before the "
"end-of-stream marker was reached")
# Continue to next stream.
if self._decompressor.eof:
self._decompressor = BZ2Decompressor()
self._buffer = self._decompressor.decompress(rawblock)
return True
@ -384,9 +399,15 @@ def decompress(data):
"""
if len(data) == 0:
return b""
decomp = BZ2Decompressor()
result = decomp.decompress(data)
if not decomp.eof:
raise ValueError("Compressed data ended before the "
"end-of-stream marker was reached")
return result
results = []
while True:
decomp = BZ2Decompressor()
results.append(decomp.decompress(data))
if not decomp.eof:
raise ValueError("Compressed data ended before the "
"end-of-stream marker was reached")
if not decomp.unused_data:
return b"".join(results)
# There is unused data left over. Proceed to next stream.
data = decomp.unused_data

View File

@ -269,6 +269,8 @@ class {typename}(tuple):
'Return a new OrderedDict which maps field names to their values'
return OrderedDict(zip(self._fields, self))
__dict__ = property(_asdict)
def _replace(_self, **kwds):
'Return a new {typename} object replacing specified fields with new values'
result = _self._make(map(kwds.pop, {field_names!r}, _self))

View File

@ -137,9 +137,7 @@ elif os.name == "posix":
rv = f.close()
if rv == 10:
raise OSError('objdump command not found')
with contextlib.closing(os.popen(cmd)) as f:
data = f.read()
res = re.search(r'\sSONAME\s+([^\s]+)', data)
res = re.search(r'\sSONAME\s+([^\s]+)', dump)
if not res:
return None
return res.group(1)

View File

@ -2001,9 +2001,9 @@ class Decimal(object):
nonzero. For efficiency, other._exp should not be too large,
so that 10**abs(other._exp) is a feasible calculation."""
# In the comments below, we write x for the value of self and
# y for the value of other. Write x = xc*10**xe and y =
# yc*10**ye.
# In the comments below, we write x for the value of self and y for the
# value of other. Write x = xc*10**xe and abs(y) = yc*10**ye, with xc
# and yc positive integers not divisible by 10.
# The main purpose of this method is to identify the *failure*
# of x**y to be exactly representable with as little effort as
@ -2011,13 +2011,12 @@ class Decimal(object):
# eliminate the possibility of x**y being exact. Only if all
# these tests are passed do we go on to actually compute x**y.
# Here's the main idea. First normalize both x and y. We
# express y as a rational m/n, with m and n relatively prime
# and n>0. Then for x**y to be exactly representable (at
# *any* precision), xc must be the nth power of a positive
# integer and xe must be divisible by n. If m is negative
# then additionally xc must be a power of either 2 or 5, hence
# a power of 2**n or 5**n.
# Here's the main idea. Express y as a rational number m/n, with m and
# n relatively prime and n>0. Then for x**y to be exactly
# representable (at *any* precision), xc must be the nth power of a
# positive integer and xe must be divisible by n. If y is negative
# then additionally xc must be a power of either 2 or 5, hence a power
# of 2**n or 5**n.
#
# There's a limit to how small |y| can be: if y=m/n as above
# then:
@ -2089,21 +2088,43 @@ class Decimal(object):
return None
# now xc is a power of 2; e is its exponent
e = _nbits(xc)-1
# find e*y and xe*y; both must be integers
if ye >= 0:
y_as_int = yc*10**ye
e = e*y_as_int
xe = xe*y_as_int
else:
ten_pow = 10**-ye
e, remainder = divmod(e*yc, ten_pow)
if remainder:
return None
xe, remainder = divmod(xe*yc, ten_pow)
if remainder:
return None
if e*65 >= p*93: # 93/65 > log(10)/log(5)
# We now have:
#
# x = 2**e * 10**xe, e > 0, and y < 0.
#
# The exact result is:
#
# x**y = 5**(-e*y) * 10**(e*y + xe*y)
#
# provided that both e*y and xe*y are integers. Note that if
# 5**(-e*y) >= 10**p, then the result can't be expressed
# exactly with p digits of precision.
#
# Using the above, we can guard against large values of ye.
# 93/65 is an upper bound for log(10)/log(5), so if
#
# ye >= len(str(93*p//65))
#
# then
#
# -e*y >= -y >= 10**ye > 93*p/65 > p*log(10)/log(5),
#
# so 5**(-e*y) >= 10**p, and the coefficient of the result
# can't be expressed in p digits.
# emax >= largest e such that 5**e < 10**p.
emax = p*93//65
if ye >= len(str(emax)):
return None
# Find -e*y and -xe*y; both must be integers
e = _decimal_lshift_exact(e * yc, ye)
xe = _decimal_lshift_exact(xe * yc, ye)
if e is None or xe is None:
return None
if e > emax:
return None
xc = 5**e
@ -2117,19 +2138,20 @@ class Decimal(object):
while xc % 5 == 0:
xc //= 5
e -= 1
if ye >= 0:
y_as_integer = yc*10**ye
e = e*y_as_integer
xe = xe*y_as_integer
else:
ten_pow = 10**-ye
e, remainder = divmod(e*yc, ten_pow)
if remainder:
return None
xe, remainder = divmod(xe*yc, ten_pow)
if remainder:
return None
if e*3 >= p*10: # 10/3 > log(10)/log(2)
# Guard against large values of ye, using the same logic as in
# the 'xc is a power of 2' branch. 10/3 is an upper bound for
# log(10)/log(2).
emax = p*10//3
if ye >= len(str(emax)):
return None
e = _decimal_lshift_exact(e * yc, ye)
xe = _decimal_lshift_exact(xe * yc, ye)
if e is None or xe is None:
return None
if e > emax:
return None
xc = 2**e
else:
@ -5529,6 +5551,27 @@ def _normalize(op1, op2, prec = 0):
_nbits = int.bit_length
def _decimal_lshift_exact(n, e):
""" Given integers n and e, return n * 10**e if it's an integer, else None.
The computation is designed to avoid computing large powers of 10
unnecessarily.
>>> _decimal_lshift_exact(3, 4)
30000
>>> _decimal_lshift_exact(300, -999999999) # returns None
"""
if n == 0:
return 0
elif e >= 0:
return n * 10**e
else:
# val_n = largest power of 10 dividing n.
str_n = str(abs(n))
val_n = len(str_n) - len(str_n.rstrip('0'))
return None if val_n < -e else n // 10**-e
def _sqrt_nearest(n, a):
"""Closest integer to the square root of the positive integer n. a is
an initial approximation to the square root. Any positive integer

View File

@ -57,12 +57,15 @@ class BuildPyTestCase(support.TempdirManager,
self.assertEqual(len(cmd.get_outputs()), 3)
pkgdest = os.path.join(destination, "pkg")
files = os.listdir(pkgdest)
self.assertTrue("__init__.py" in files)
if not sys.dont_write_bytecode:
self.assertTrue("__init__.pyc" in files)
self.assertTrue("README.txt" in files)
self.assertIn("__init__.py", files)
self.assertIn("README.txt", files)
# XXX even with -O, distutils writes pyc, not pyo; bug?
if sys.dont_write_bytecode:
self.assertNotIn("__init__.pyc", files)
else:
self.assertIn("__init__.pyc", files)
def test_empty_package_dir (self):
def test_empty_package_dir(self):
# See SF 1668596/1720897.
cwd = os.getcwd()
@ -110,7 +113,7 @@ class BuildPyTestCase(support.TempdirManager,
finally:
sys.dont_write_bytecode = old_dont_write_bytecode
self.assertTrue('byte-compiling is disabled' in self.logs[0][1])
self.assertIn('byte-compiling is disabled', self.logs[0][1])
def test_suite():
return unittest.makeSuite(BuildPyTestCase)

View File

@ -124,7 +124,7 @@ class HTMLParser(_markupbase.ParserBase):
_markupbase.ParserBase.reset(self)
def feed(self, data):
"""Feed data to the parser.
r"""Feed data to the parser.
Call this as often as you want, with as little or as much text
as you want (may include '\n').

View File

@ -249,15 +249,7 @@ class IMAP4:
def read(self, size):
"""Read 'size' bytes from remote."""
chunks = []
read = 0
while read < size:
data = self.file.read(min(size-read, 4096))
if not data:
break
read += len(data)
chunks.append(data)
return b''.join(chunks)
return self.file.read(size)
def readline(self):

View File

@ -41,10 +41,9 @@ except ImportError: #pragma: no cover
codecs = None
try:
import _thread as thread
import threading
except ImportError: #pragma: no cover
thread = None
threading = None
__author__ = "Vinay Sajip <vinay_sajip@red-dove.com>"
__status__ = "production"
@ -199,7 +198,7 @@ def _checkLevel(level):
#the lock would already have been acquired - so we need an RLock.
#The same argument applies to Loggers and Manager.loggerDict.
#
if thread:
if threading:
_lock = threading.RLock()
else: #pragma: no cover
_lock = None
@ -278,8 +277,8 @@ class LogRecord(object):
self.created = ct
self.msecs = (ct - int(ct)) * 1000
self.relativeCreated = (self.created - _startTime) * 1000
if logThreads and thread:
self.thread = thread.get_ident()
if logThreads and threading:
self.thread = threading.get_ident()
self.threadName = threading.current_thread().name
else: # pragma: no cover
self.thread = None
@ -773,7 +772,7 @@ class Handler(Filterer):
"""
Acquire a thread lock for serializing access to the underlying I/O.
"""
if thread:
if threading:
self.lock = threading.RLock()
else: #pragma: no cover
self.lock = None

View File

@ -128,6 +128,7 @@ class bdist(Command):
for i in range(len(self.formats)):
cmd_name = commands[i]
sub_cmd = self.get_reinitialized_command(cmd_name)
sub_cmd.format = self.formats[i]
# passing the owner and group names for tar archiving
if cmd_name == 'bdist_dumb':

View File

@ -32,7 +32,7 @@ class check(Command):
# XXX we could use a special handler for this, but would need to test
# if it works even if the logger has a too high level
self._warnings.append((msg, args))
return logger.warning(self.get_command_name() + msg, *args)
return logger.warning('%s: %s' % (self.get_command_name(), msg), *args)
def run(self):
"""Runs the command."""

View File

@ -1,10 +1,9 @@
"""Create a source distribution."""
import os
import sys
import re
import sys
from io import StringIO
from glob import glob
from shutil import get_archive_formats, rmtree
from packaging import logger
@ -203,45 +202,14 @@ class sdist(Command):
def add_defaults(self):
"""Add all the default files to self.filelist:
- README or README.txt
- test/test*.py
- all pure Python modules mentioned in setup script
- all files pointed by package_data (build_py)
- all files defined in data_files.
- all files defined as scripts.
- all C sources listed as part of extensions or C libraries
in the setup script (doesn't catch C headers!)
Warns if (README or README.txt) or setup.py are missing; everything
else is optional.
Everything is optional.
"""
standards = [('README', 'README.txt')]
for fn in standards:
if isinstance(fn, tuple):
alts = fn
got_it = False
for fn in alts:
if os.path.exists(fn):
got_it = True
self.filelist.append(fn)
break
if not got_it:
logger.warning(
'%s: standard file not found: should have one of %s',
self.get_command_name(), ', '.join(alts))
else:
if os.path.exists(fn):
self.filelist.append(fn)
else:
logger.warning('%s: standard file %r not found',
self.get_command_name(), fn)
optional = ['test/test*.py', 'setup.cfg']
for pattern in optional:
files = [f for f in glob(pattern) if os.path.isfile(f)]
if files:
self.filelist.extend(files)
for cmd_name in get_command_names():
try:
cmd_obj = self.get_finalized_command(cmd_name)

View File

@ -83,19 +83,16 @@ def customize_compiler(compiler):
# patterns. Order is important; platform mappings are preferred over
# OS names.
_default_compilers = (
# Platform string mappings
# on a cygwin built python we can use gcc like an ordinary UNIXish
# compiler
('cygwin.*', 'unix'),
('os2emx', 'emx'),
# OS name mappings
('posix', 'unix'),
('nt', 'msvc'),
)
)
def get_default_compiler(osname=None, platform=None):
""" Determine the default compiler to use for the given platform.

View File

@ -352,7 +352,7 @@ class CCompiler:
return macros, objects, extra, pp_opts, build
def _get_cc_args(self, pp_opts, debug, before):
# works for unixccompiler, emxccompiler, cygwinccompiler
# works for unixccompiler and cygwinccompiler
cc_args = pp_opts + ['-c']
if debug:
cc_args[:0] = ['-g']

View File

@ -18,7 +18,7 @@ __all__ = [
'get_distributions', 'get_distribution', 'get_file_users',
'provides_distribution', 'obsoletes_distribution',
'enable_cache', 'disable_cache', 'clear_cache',
]
'get_file_path', 'get_file']
# TODO update docs
@ -627,3 +627,17 @@ def get_file_users(path):
for dist in get_distributions():
if dist.uses(path):
yield dist
def get_file_path(distribution_name, relative_path):
"""Return the path to a resource file."""
dist = get_distribution(distribution_name)
if dist != None:
return dist.get_resource_path(relative_path)
raise LookupError('no distribution named %r found' % distribution_name)
def get_file(distribution_name, relative_path, *args, **kwargs):
"""Open and return a resource file."""
return open(get_file_path(distribution_name, relative_path),
*args, **kwargs)

View File

@ -13,7 +13,6 @@ It is used under the hood by the command classes. Do not use directly.
import getopt
import re
import sys
import string
import textwrap
from packaging.errors import PackagingGetoptError, PackagingArgError
@ -142,20 +141,20 @@ class FancyGetopt:
for option in self.option_table:
if len(option) == 3:
integer, short, help = option
longopt, short, help = option
repeat = 0
elif len(option) == 4:
integer, short, help, repeat = option
longopt, short, help, repeat = option
else:
# the option table is part of the code, so simply
# assert that it is correct
raise ValueError("invalid option tuple: %r" % option)
# Type- and value-check the option names
if not isinstance(integer, str) or len(integer) < 2:
if not isinstance(longopt, str) or len(longopt) < 2:
raise PackagingGetoptError(
("invalid long option '%s': "
"must be a string of length >= 2") % integer)
"must be a string of length >= 2") % longopt)
if (not ((short is None) or
(isinstance(short, str) and len(short) == 1))):
@ -163,55 +162,55 @@ class FancyGetopt:
("invalid short option '%s': "
"must be a single character or None") % short)
self.repeat[integer] = repeat
self.long_opts.append(integer)
self.repeat[longopt] = repeat
self.long_opts.append(longopt)
if integer[-1] == '=': # option takes an argument?
if longopt[-1] == '=': # option takes an argument?
if short:
short = short + ':'
integer = integer[0:-1]
self.takes_arg[integer] = 1
longopt = longopt[0:-1]
self.takes_arg[longopt] = 1
else:
# Is option is a "negative alias" for some other option (eg.
# "quiet" == "!verbose")?
alias_to = self.negative_alias.get(integer)
alias_to = self.negative_alias.get(longopt)
if alias_to is not None:
if self.takes_arg[alias_to]:
raise PackagingGetoptError(
("invalid negative alias '%s': "
"aliased option '%s' takes a value") % \
(integer, alias_to))
(longopt, alias_to))
self.long_opts[-1] = integer # XXX redundant?!
self.takes_arg[integer] = 0
self.long_opts[-1] = longopt # XXX redundant?!
self.takes_arg[longopt] = 0
else:
self.takes_arg[integer] = 0
self.takes_arg[longopt] = 0
# If this is an alias option, make sure its "takes arg" flag is
# the same as the option it's aliased to.
alias_to = self.alias.get(integer)
alias_to = self.alias.get(longopt)
if alias_to is not None:
if self.takes_arg[integer] != self.takes_arg[alias_to]:
if self.takes_arg[longopt] != self.takes_arg[alias_to]:
raise PackagingGetoptError(
("invalid alias '%s': inconsistent with "
"aliased option '%s' (one of them takes a value, "
"the other doesn't") % (integer, alias_to))
"the other doesn't") % (longopt, alias_to))
# Now enforce some bondage on the long option name, so we can
# later translate it to an attribute name on some object. Have
# to do this a bit late to make sure we've removed any trailing
# '='.
if not longopt_re.match(integer):
if not longopt_re.match(longopt):
raise PackagingGetoptError(
("invalid long option name '%s' " +
"(must be letters, numbers, hyphens only") % integer)
"(must be letters, numbers, hyphens only") % longopt)
self.attr_name[integer] = integer.replace('-', '_')
self.attr_name[longopt] = longopt.replace('-', '_')
if short:
self.short_opts.append(short)
self.short2long[short[0]] = integer
self.short2long[short[0]] = longopt
def getopt(self, args=None, object=None):
"""Parse command-line options in args. Store as attributes on object.
@ -297,10 +296,10 @@ class FancyGetopt:
# First pass: determine maximum length of long option names
max_opt = 0
for option in self.option_table:
integer = option[0]
longopt = option[0]
short = option[1]
l = len(integer)
if integer[-1] == '=':
l = len(longopt)
if longopt[-1] == '=':
l = l - 1
if short is not None:
l = l + 5 # " (-x)" where short == 'x'
@ -340,20 +339,20 @@ class FancyGetopt:
lines = ['Option summary:']
for option in self.option_table:
integer, short, help = option[:3]
longopt, short, help = option[:3]
text = textwrap.wrap(help, text_width)
# Case 1: no short option at all (makes life easy)
if short is None:
if text:
lines.append(" --%-*s %s" % (max_opt, integer, text[0]))
lines.append(" --%-*s %s" % (max_opt, longopt, text[0]))
else:
lines.append(" --%-*s " % (max_opt, integer))
lines.append(" --%-*s " % (max_opt, longopt))
# Case 2: we have a short option, so we have to include it
# just after the long option
else:
opt_names = "%s (-%s)" % (integer, short)
opt_names = "%s (-%s)" % (longopt, short)
if text:
lines.append(" --%-*s %s" %
(max_opt, opt_names, text[0]))
@ -378,68 +377,6 @@ def fancy_getopt(options, negative_opt, object, args):
return parser.getopt(args, object)
WS_TRANS = str.maketrans(string.whitespace, ' ' * len(string.whitespace))
def wrap_text(text, width):
"""Split *text* into lines of no more than *width* characters each.
*text* is a str and *width* an int. Returns a list of str.
"""
if text is None:
return []
if len(text) <= width:
return [text]
text = text.expandtabs()
text = text.translate(WS_TRANS)
chunks = re.split(r'( +|-+)', text)
chunks = [_f for _f in chunks if _f] # ' - ' results in empty strings
lines = []
while chunks:
cur_line = [] # list of chunks (to-be-joined)
cur_len = 0 # length of current line
while chunks:
l = len(chunks[0])
if cur_len + l <= width: # can squeeze (at least) this chunk in
cur_line.append(chunks[0])
del chunks[0]
cur_len = cur_len + l
else: # this line is full
# drop last chunk if all space
if cur_line and cur_line[-1][0] == ' ':
del cur_line[-1]
break
if chunks: # any chunks left to process?
# if the current line is still empty, then we had a single
# chunk that's too big too fit on a line -- so we break
# down and break it up at the line width
if cur_len == 0:
cur_line.append(chunks[0][0:width])
chunks[0] = chunks[0][width:]
# all-whitespace chunks at the end of a line can be discarded
# (and we know from the re.split above that if a chunk has
# *any* whitespace, it is *all* whitespace)
if chunks[0][0] == ' ':
del chunks[0]
# and store this line in the list-of-all-lines -- as a single
# string, of course!
lines.append(''.join(cur_line))
# while chunks
return lines
class OptionDummy:
"""Dummy class just used as a place to hold command-line option
values as instance attributes."""
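
Aside from the integer → longopt rename above, this file drops its private wrap_text helper: the help formatter now calls textwrap.wrap directly (see the `text = textwrap.wrap(help, text_width)` line earlier in the hunk). A minimal sketch of the replacement behaviour, using only the standard library; the sample help string is made up for illustration:

import textwrap

# textwrap.wrap splits a string into lines of at most `width` characters,
# which is what the removed wrap_text helper was doing for option help text.
help_text = ("include all files in the source distribution, "
             "even those excluded by default")
for line in textwrap.wrap(help_text, width=40):
    print(line)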

View File

@ -6,7 +6,6 @@ obtained from an index (e.g. PyPI), with dependencies.
This is a higher-level module built on packaging.database and
packaging.pypi.
"""
import os
import sys
import stat
@ -14,7 +13,7 @@ import errno
import shutil
import logging
import tempfile
from sysconfig import get_config_var
from sysconfig import get_config_var, get_path
from packaging import logger
from packaging.dist import Distribution
@ -28,6 +27,8 @@ from packaging.depgraph import generate_graph
from packaging.errors import (PackagingError, InstallationException,
InstallationConflict, CCompilerError)
from packaging.pypi.errors import ProjectNotFound, ReleaseNotFound
from packaging import database
__all__ = ['install_dists', 'install_from_infos', 'get_infos', 'remove',
'install', 'install_local_project']
@ -75,6 +76,7 @@ def _run_distutils_install(path):
def _run_setuptools_install(path):
cmd = '%s setup.py install --record=%s --single-version-externally-managed'
record_file = os.path.join(path, 'RECORD')
os.system(cmd % (sys.executable, record_file))
if not os.path.exists(record_file):
raise ValueError('failed to install')
@ -88,8 +90,10 @@ def _run_packaging_install(path):
dist.parse_config_files()
try:
dist.run_command('install_dist')
name = dist.metadata['name']
return database.get_distribution(name) is not None
except (IOError, os.error, PackagingError, CCompilerError) as msg:
raise SystemExit("error: " + str(msg))
raise ValueError("Failed to install, " + str(msg))
def _install_dist(dist, path):
@ -115,18 +119,20 @@ def install_local_project(path):
If the source directory contains a setup.py, install it using distutils1.
If a setup.cfg is found, install using the install_dist command.
Returns True on success, False on failure.
"""
path = os.path.abspath(path)
if os.path.isdir(path):
logger.info('installing from source directory: %s', path)
_run_install_from_dir(path)
logger.info('Installing from source directory: %s', path)
return _run_install_from_dir(path)
elif _is_archive_file(path):
logger.info('installing from archive: %s', path)
logger.info('Installing from archive: %s', path)
_unpacked_dir = tempfile.mkdtemp()
shutil.unpack_archive(path, _unpacked_dir)
_run_install_from_archive(_unpacked_dir)
return _run_install_from_archive(_unpacked_dir)
else:
logger.warning('no projects to install')
logger.warning('No projects to install.')
return False
def _run_install_from_archive(source_dir):
@ -152,7 +158,13 @@ def _run_install_from_dir(source_dir):
func = install_methods[install_method]
try:
func = install_methods[install_method]
return func(source_dir)
try:
func(source_dir)
return True
except ValueError as err:
# failed to install
logger.info(str(err))
return False
finally:
os.chdir(old_dir)
@ -174,16 +186,16 @@ def install_dists(dists, path, paths=sys.path):
installed_dists = []
for dist in dists:
logger.info('installing %s %s', dist.name, dist.version)
logger.info('Installing %r %s...', dist.name, dist.version)
try:
_install_dist(dist, path)
installed_dists.append(dist)
except Exception as e:
logger.info('failed: %s', e)
logger.info('Failed: %s', e)
# reverting
for installed_dist in installed_dists:
logger.info('reverting %s', installed_dist)
logger.info('Reverting %s', installed_dist)
_remove_dist(installed_dist, paths)
raise e
return installed_dists
@ -292,7 +304,7 @@ def get_infos(requirements, index=None, installed=None, prefer_final=True):
# or remove
if not installed:
logger.info('reading installed distributions')
logger.debug('Reading installed distributions')
installed = list(get_distributions(use_egg_info=True))
infos = {'install': [], 'remove': [], 'conflict': []}
@ -306,7 +318,7 @@ def get_infos(requirements, index=None, installed=None, prefer_final=True):
if predicate.name.lower() != installed_project.name.lower():
continue
found = True
logger.info('found %s %s', installed_project.name,
logger.info('Found %s %s', installed_project.name,
installed_project.metadata['version'])
# if we already have something installed, check it matches the
@ -316,7 +328,7 @@ def get_infos(requirements, index=None, installed=None, prefer_final=True):
break
if not found:
logger.info('project not installed')
logger.debug('Project not installed')
if not index:
index = wrapper.ClientWrapper()
@ -331,7 +343,7 @@ def get_infos(requirements, index=None, installed=None, prefer_final=True):
raise InstallationException('Release not found: "%s"' % requirements)
if release is None:
logger.info('could not find a matching project')
logger.info('Could not find a matching project')
return infos
metadata = release.fetch_metadata()
@ -348,7 +360,7 @@ def get_infos(requirements, index=None, installed=None, prefer_final=True):
# Get what the missing deps are
dists = depgraph.missing[release]
if dists:
logger.info("missing dependencies found, retrieving metadata")
logger.info("Missing dependencies found, retrieving metadata")
# we have missing deps
for dist in dists:
_update_infos(infos, get_infos(dist, index, installed))
@ -376,7 +388,10 @@ def _remove_dist(dist, paths=sys.path):
def remove(project_name, paths=sys.path, auto_confirm=True):
"""Removes a single project from the installation"""
"""Removes a single project from the installation.
Returns True on success
"""
dist = get_distribution(project_name, use_egg_info=True, paths=paths)
if dist is None:
raise PackagingError('Distribution "%s" not found' % project_name)
@ -384,13 +399,26 @@ def remove(project_name, paths=sys.path, auto_confirm=True):
rmdirs = []
rmfiles = []
tmp = tempfile.mkdtemp(prefix=project_name + '-uninstall')
def _move_file(source, target):
try:
os.rename(source, target)
except OSError as err:
return err
return None
success = True
error = None
try:
for file_, md5, size in files:
if os.path.isfile(file_):
dirname, filename = os.path.split(file_)
tmpfile = os.path.join(tmp, filename)
try:
os.rename(file_, tmpfile)
error = _move_file(file_, tmpfile)
if error is not None:
success = False
break
finally:
if not os.path.isfile(file_):
os.rename(tmpfile, file_)
@ -401,7 +429,12 @@ def remove(project_name, paths=sys.path, auto_confirm=True):
finally:
shutil.rmtree(tmp)
logger.info('removing %r: ', project_name)
if not success:
logger.info('%r cannot be removed.', project_name)
logger.info('Error: %s' % str(error))
return False
logger.info('Removing %r: ', project_name)
for file_ in rmfiles:
logger.info(' %s', file_)
@ -444,21 +477,41 @@ def remove(project_name, paths=sys.path, auto_confirm=True):
if os.path.exists(dist.path):
shutil.rmtree(dist.path)
logger.info('success: removed %d files and %d dirs',
logger.info('Success: removed %d files and %d dirs',
file_count, dir_count)
return True
def install(project):
logger.info('getting information about %r', project)
"""Installs a project.
Returns True on success, False on failure
"""
logger.info('Checking the installation location...')
purelib_path = get_path('purelib')
# trying to write a file there
try:
with tempfile.NamedTemporaryFile(suffix=project,
dir=purelib_path) as testfile:
testfile.write(b'test')
except OSError:
# was unable to write a file
logger.info('Unable to write in "%s". Do you have the permissions?'
% purelib_path)
return False
logger.info('Getting information about %r...', project)
try:
info = get_infos(project)
except InstallationException:
logger.info('cound not find %r', project)
return
logger.info('Could not find %r', project)
return False
if info['install'] == []:
logger.info('nothing to install')
return
logger.info('Nothing to install')
return False
install_path = get_config_var('base')
try:
@ -470,6 +523,8 @@ def install(project):
projects = ['%s %s' % (p.name, p.version) for p in e.args[0]]
logger.info('%r conflicts with %s', project, ','.join(projects))
return True
def _main(**attrs):
if 'script_args' not in attrs:
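
The net effect of the install.py changes above is that the public helpers now signal success with a boolean instead of raising SystemExit or returning None: install_local_project and install return True/False, and remove returns True once the files are gone. A hedged sketch of a caller, assuming the packaging.install module from this branch is importable; the project name and path are placeholders:

import sys
from packaging.install import install, install_local_project, remove

# '/tmp/checkout' and 'SomeProject' are illustrative placeholders only.
if not install_local_project('/tmp/checkout'):
    sys.exit(1)              # details were already logged by the helper

if install('SomeProject'):   # resolve on the index, then install
    remove('SomeProject')    # uninstall again; returns True on success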

View File

@ -396,22 +396,24 @@ class Metadata:
value = []
if logger.isEnabledFor(logging.WARNING):
project_name = self['Name']
if name in _PREDICATE_FIELDS and value is not None:
for v in value:
# check that the values are valid predicates
if not is_valid_predicate(v.split(';')[0]):
logger.warning(
'%r is not a valid predicate (field %r)',
v, name)
'%r: %r is not a valid predicate (field %r)',
project_name, v, name)
# FIXME this rejects UNKNOWN, is that right?
elif name in _VERSIONS_FIELDS and value is not None:
if not is_valid_versions(value):
logger.warning('%r is not a valid version (field %r)',
value, name)
logger.warning('%r: %r is not a valid version (field %r)',
project_name, value, name)
elif name in _VERSION_FIELDS and value is not None:
if not is_valid_version(value):
logger.warning('%r is not a valid version (field %r)',
value, name)
logger.warning('%r: %r is not a valid version (field %r)',
project_name, value, name)
if name in _UNICODEFIELDS:
if name == 'Description':

View File

@ -1,6 +1,6 @@
"""Spider using the screen-scraping "simple" PyPI API.
This module contains the class SimpleIndexCrawler, a simple spider that
This module contains the class Crawler, a simple spider that
can be used to find and retrieve distributions from a project index
(like the Python Package Index), using its so-called simple API (see
reference implementation available at http://pypi.python.org/simple/).
@ -118,9 +118,10 @@ class Crawler(BaseClient):
def __init__(self, index_url=DEFAULT_SIMPLE_INDEX_URL, prefer_final=False,
prefer_source=True, hosts=DEFAULT_HOSTS,
follow_externals=False, mirrors_url=None, mirrors=None,
timeout=SOCKET_TIMEOUT, mirrors_max_tries=0):
timeout=SOCKET_TIMEOUT, mirrors_max_tries=0, verbose=False):
super(Crawler, self).__init__(prefer_final, prefer_source)
self.follow_externals = follow_externals
self.verbose = verbose
# mirroring attributes.
parsed = urllib.parse.urlparse(index_url)
@ -177,14 +178,14 @@ class Crawler(BaseClient):
def get_releases(self, requirements, prefer_final=None,
force_update=False):
"""Search for releases and return a ReleaseList object containing
"""Search for releases and return a ReleasesList object containing
the results.
"""
predicate = get_version_predicate(requirements)
if predicate.name.lower() in self._projects and not force_update:
return self._projects.get(predicate.name.lower())
prefer_final = self._get_prefer_final(prefer_final)
logger.info('reading info on PyPI about %s', predicate.name)
logger.debug('Reading info on PyPI about %s', predicate.name)
self._process_index_page(predicate.name)
if predicate.name.lower() not in self._projects:
@ -321,8 +322,9 @@ class Crawler(BaseClient):
infos = get_infos_from_url(link, project_name,
is_external=not self.index_url in url)
except CantParseArchiveName as e:
logger.warning(
"version has not been parsed: %s", e)
if self.verbose:
logger.warning(
"version has not been parsed: %s", e)
else:
self._register_release(release_info=infos)
else:
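
The constructor change above adds a verbose flag to the crawler; archive names that cannot be parsed are only reported when it is set. A small usage sketch, assuming the class is exposed as packaging.pypi.simple.Crawler and 'FooBar' is a placeholder project name:

from packaging.pypi.simple import Crawler

# verbose=True re-enables the "version has not been parsed" warnings
# that are silenced by default after this change.
crawler = Crawler(verbose=True)
releases = crawler.get_releases('FooBar')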

View File

@ -31,11 +31,11 @@ class Client(BaseClient):
If no server_url is specified, use the default PyPI XML-RPC URL,
defined in the DEFAULT_XMLRPC_INDEX_URL constant::
>>> client = XMLRPCClient()
>>> client = Client()
>>> client.server_url == DEFAULT_XMLRPC_INDEX_URL
True
>>> client = XMLRPCClient("http://someurl/")
>>> client = Client("http://someurl/")
>>> client.server_url
'http://someurl/'
"""
@ -69,7 +69,7 @@ class Client(BaseClient):
informations (eg. make a new XML-RPC call).
::
>>> client = XMLRPCClient()
>>> client = Client()
>>> client.get_releases('Foo')
['1.1', '1.2', '1.3']
@ -189,7 +189,7 @@ class Client(BaseClient):
If no server proxy is defined yet, creates a new one::
>>> client = XmlRpcClient()
>>> client = Client()
>>> client.proxy()
<ServerProxy for python.org/pypi>

View File

@ -1,25 +0,0 @@
"""Data file path abstraction.
Functions in this module use sysconfig to find the paths to the resource
files registered in project's setup.cfg file. See the documentation for
more information.
"""
# TODO write that documentation
from packaging.database import get_distribution
__all__ = ['get_file_path', 'get_file']
def get_file_path(distribution_name, relative_path):
"""Return the path to a resource file."""
dist = get_distribution(distribution_name)
if dist != None:
return dist.get_resource_path(relative_path)
raise LookupError('no distribution named %r found' % distribution_name)
def get_file(distribution_name, relative_path, *args, **kwargs):
"""Open and return a resource file."""
return open(get_file_path(distribution_name, relative_path),
*args, **kwargs)

View File

@ -5,10 +5,11 @@ import re
import sys
import getopt
import logging
from copy import copy
from packaging import logger
from packaging.dist import Distribution
from packaging.util import _is_archive_file
from packaging.util import _is_archive_file, generate_setup_py
from packaging.command import get_command_class, STANDARD_COMMANDS
from packaging.install import install, install_local_project, remove
from packaging.database import get_distribution, get_distributions
@ -37,6 +38,14 @@ Usage: pysetup create
Create a new Python package.
"""
generate_usage = """\
Usage: pysetup generate-setup
or: pysetup generate-setup --help
Generates a setup.py script for backward-compatibility purposes.
"""
graph_usage = """\
Usage: pysetup graph dist
or: pysetup graph --help
@ -203,6 +212,13 @@ def _create(distpatcher, args, **kw):
return main()
@action_help(generate_usage)
def _generate(distpatcher, args, **kw):
generate_setup_py()
print('The setup.py was generated')
@action_help(graph_usage)
def _graph(dispatcher, args, **kw):
name = args[1]
@ -224,15 +240,22 @@ def _install(dispatcher, args, **kw):
if 'setup.py' in listing or 'setup.cfg' in listing:
args.insert(1, os.getcwd())
else:
logger.warning('no project to install')
return
logger.warning('No project to install.')
return 1
target = args[1]
# installing from a source dir or archive file?
if os.path.isdir(args[1]) or _is_archive_file(args[1]):
install_local_project(args[1])
if os.path.isdir(target) or _is_archive_file(target):
if install_local_project(target):
return 0
else:
return 1
else:
# download from PyPI
install(args[1])
if install(target):
return 0
else:
return 1
@action_help(metadata_usage)
@ -335,13 +358,21 @@ def _run(dispatcher, args, **kw):
def _list(dispatcher, args, **kw):
opts = _parse_args(args[1:], '', ['all'])
dists = get_distributions(use_egg_info=True)
if 'all' in opts:
if 'all' in opts or opts['args'] == []:
results = dists
else:
results = [d for d in dists if d.name.lower() in opts['args']]
number = 0
for dist in results:
print('%s %s at %s' % (dist.name, dist.metadata['version'], dist.path))
number += 1
print('')
if number == 0:
print('Nothing seems to be installed.')
else:
print('Found %d projects installed.' % number)
@action_help(search_usage)
@ -351,8 +382,9 @@ def _search(dispatcher, args, **kw):
It is able to search for a specific index (specified with --index), using
the simple or xmlrpc index types (with --type xmlrpc / --type simple)
"""
opts = _parse_args(args[1:], '', ['simple', 'xmlrpc'])
#opts = _parse_args(args[1:], '', ['simple', 'xmlrpc'])
# 1. what kind of index is requested ? (xmlrpc / simple)
raise NotImplementedError()
actions = [
@ -364,6 +396,7 @@ actions = [
('list', 'Search for local projects', _list),
('graph', 'Display a graph', _graph),
('create', 'Create a Project', _create),
('generate-setup', 'Generates a backward-compatible setup.py', _generate)
]
@ -399,6 +432,14 @@ class Dispatcher:
msg = 'Unrecognized action "%s"' % self.action
raise PackagingArgError(msg)
self._set_logger()
self.args = args
# for display options we return immediately
if self.help or self.action is None:
self._show_help(self.parser, display_options_=False)
def _set_logger(self):
# setting up the logging level from the command-line options
# -q gets warning, error and critical
if self.verbose == 0:
@ -416,13 +457,11 @@ class Dispatcher:
else: # -vv and more for debug
level = logging.DEBUG
# for display options we return immediately
option_order = self.parser.get_option_order()
self.args = args
if self.help or self.action is None:
self._show_help(self.parser, display_options_=False)
# setting up the stream handler
handler = logging.StreamHandler(sys.stderr)
handler.setLevel(level)
logger.addHandler(handler)
logger.setLevel(level)
def _parse_command_opts(self, parser, args):
# Pull the current command from the head of the command line
@ -567,8 +606,6 @@ class Dispatcher:
if isinstance(command, str):
command = get_command_class(command)
name = command.get_command_name()
desc = getattr(command, 'description', '(no description available)')
print('Description: %s' % desc)
print('')
@ -635,11 +672,17 @@ class Dispatcher:
def main(args=None):
dispatcher = Dispatcher(args)
if dispatcher.action is None:
return
old_level = logger.level
old_handlers = copy(logger.handlers)
try:
dispatcher = Dispatcher(args)
if dispatcher.action is None:
return
return dispatcher()
finally:
logger.setLevel(old_level)
logger.handlers[:] = old_handlers
return dispatcher()
if __name__ == '__main__':
sys.exit(main())
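
main() now saves the packaging logger's level and handler list before building the Dispatcher and restores them in a finally clause, so repeated invocations (for example from the test suite) do not leak handlers or level changes. The same pattern in isolation, as a minimal sketch built only on the standard logging module:

import logging
from copy import copy

logger = logging.getLogger('packaging')

def call_with_clean_logger(func, *args):
    # remember the logger state so the caller is not affected
    old_level = logger.level
    old_handlers = copy(logger.handlers)
    try:
        logger.addHandler(logging.StreamHandler())
        logger.setLevel(logging.INFO)
        return func(*args)
    finally:
        # put everything back, mirroring what main() does above
        logger.setLevel(old_level)
        logger.handlers[:] = old_handlers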

View File

@ -65,14 +65,17 @@ class LoggingCatcher:
configured to record all messages logged to the 'packaging' logger.
Use get_logs to retrieve messages and self.loghandler.flush to discard
them.
them. get_logs automatically flushes the logs; if you test code that
generates logging messages but don't use get_logs, you have to flush
manually before doing other checks on logging messages, otherwise you
will get irrelevant results. See example in test_command_check.
"""
def setUp(self):
super(LoggingCatcher, self).setUp()
self.loghandler = handler = _TestHandler()
self.old_level = logger.level
logger.addHandler(handler)
self.addCleanup(logger.setLevel, logger.level)
logger.setLevel(logging.DEBUG) # we want all messages
def tearDown(self):
@ -84,22 +87,29 @@ class LoggingCatcher:
for ref in weakref.getweakrefs(handler):
logging._removeHandlerRef(ref)
del self.loghandler
logger.setLevel(self.old_level)
super(LoggingCatcher, self).tearDown()
def get_logs(self, *levels):
"""Return all log messages with level in *levels*.
Without explicit levels given, returns all messages.
*levels* defaults to all levels. For log calls with arguments (i.e.
logger.info('bla bla %s', arg)), the messages
Returns a list.
Without explicit levels given, returns all messages. *levels* defaults
to all levels. For log calls with arguments (i.e.
logger.info('bla bla %r', arg)), the messages will be formatted before
being returned (e.g. "bla bla 'thing'").
Returns a list. Automatically flushes the loghandler after being
called.
Example: self.get_logs(logging.WARN, logging.DEBUG).
"""
if not levels:
return [log.getMessage() for log in self.loghandler.buffer]
return [log.getMessage() for log in self.loghandler.buffer
if log.levelno in levels]
messages = [log.getMessage() for log in self.loghandler.buffer]
else:
messages = [log.getMessage() for log in self.loghandler.buffer
if log.levelno in levels]
self.loghandler.flush()
return messages
class TempdirManager:
@ -252,6 +262,15 @@ def create_distribution(configfiles=()):
return d
def fake_dec(*args, **kw):
"""Fake decorator"""
def _wrap(func):
def __wrap(*args, **kw):
return func(*args, **kw)
return __wrap
return _wrap
try:
from test.support import skip_unless_symlink
except ImportError:
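
With this change get_logs flushes the captured records itself, so each call returns only the messages logged since the previous one, and tests that never call it must flush self.loghandler by hand (as test_command_check now does). A hypothetical test built on the mixin, assuming the packaging.tests.support module and the shared packaging logger:

import logging
import unittest

from packaging import logger
from packaging.tests import support

class ExampleTestCase(support.LoggingCatcher, unittest.TestCase):
    # hypothetical test, for illustration only
    def test_get_logs_flushes(self):
        logger.warning('first problem')
        self.assertEqual(self.get_logs(logging.WARNING), ['first problem'])
        # the previous call flushed the buffer, so this one sees nothing
        self.assertEqual(self.get_logs(logging.WARNING), [])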

View File

@ -265,7 +265,7 @@ class BuildExtTestCase(support.TempdirManager,
def test_get_outputs(self):
tmp_dir = self.mkdtemp()
c_file = os.path.join(tmp_dir, 'foo.c')
self.write_file(c_file, 'void PyInit_foo(void) {};\n')
self.write_file(c_file, 'void PyInit_foo(void) {}\n')
ext = Extension('foo', [c_file], optional=False)
dist = Distribution({'name': 'xx',
'ext_modules': [ext]})
@ -370,8 +370,8 @@ def test_suite():
src = _get_source_filename()
if not os.path.exists(src):
if verbose:
print ('test_build_ext: Cannot find source code (test'
' must run in python build dir)')
print('test_command_build_ext: Cannot find source code (test'
' must run in python build dir)')
return unittest.TestSuite()
else:
return unittest.makeSuite(BuildExtTestCase)

View File

@ -61,9 +61,12 @@ class BuildPyTestCase(support.TempdirManager,
pkgdest = os.path.join(destination, "pkg")
files = os.listdir(pkgdest)
self.assertIn("__init__.py", files)
if not sys.dont_write_bytecode:
self.assertIn("__init__.pyc", files)
self.assertIn("README.txt", files)
# XXX even with -O, distutils writes pyc, not pyo; bug?
if sys.dont_write_bytecode:
self.assertNotIn("__init__.pyc", files)
else:
self.assertIn("__init__.pyc", files)
def test_empty_package_dir(self):
# See SF 1668596/1720897.
@ -93,7 +96,7 @@ class BuildPyTestCase(support.TempdirManager,
try:
dist.run_commands()
except PackagingFileError as e:
except PackagingFileError:
self.fail("failed package_data test when package_dir is ''")
finally:
# Restore state.

View File

@ -36,7 +36,6 @@ class CheckTestCase(support.LoggingCatcher,
# now let's add the required fields
# and run it again, to make sure we don't get
# any warning anymore
self.loghandler.flush()
metadata = {'home_page': 'xxx', 'author': 'xxx',
'author_email': 'xxx',
'name': 'xxx', 'version': '4.2',
@ -50,8 +49,10 @@ class CheckTestCase(support.LoggingCatcher,
self.assertRaises(PackagingSetupError, self._run,
{'name': 'xxx', 'version': 'xxx'}, **{'strict': 1})
# and of course, no error when all metadata fields are present
# clear warnings from the previous calls
self.loghandler.flush()
# and of course, no error when all metadata fields are present
cmd = self._run(metadata, strict=True)
self.assertEqual([], self.get_logs(logging.WARNING))
@ -70,7 +71,6 @@ class CheckTestCase(support.LoggingCatcher,
'name': 'xxx', 'version': '4.2',
'requires_python': '2.4',
}
self.loghandler.flush()
cmd = self._run(metadata)
self.assertEqual([], self.get_logs(logging.WARNING))
@ -85,9 +85,11 @@ class CheckTestCase(support.LoggingCatcher,
self.assertRaises(PackagingSetupError, self._run, metadata,
**{'strict': 1})
# clear warnings from the previous calls
self.loghandler.flush()
# now with correct version format again
metadata['version'] = '4.2'
self.loghandler.flush()
cmd = self._run(metadata, strict=True)
self.assertEqual([], self.get_logs(logging.WARNING))
@ -100,7 +102,6 @@ class CheckTestCase(support.LoggingCatcher,
cmd.check_restructuredtext()
self.assertEqual(len(self.get_logs(logging.WARNING)), 1)
self.loghandler.flush()
pkg_info, dist = self.create_dist(description='title\n=====\n\ntest')
cmd = check(dist)
cmd.check_restructuredtext()
@ -123,6 +124,17 @@ class CheckTestCase(support.LoggingCatcher,
cmd.check_hooks_resolvable()
self.assertEqual(len(self.get_logs(logging.WARNING)), 1)
def test_warn(self):
_, dist = self.create_dist()
cmd = check(dist)
self.assertEqual([], self.get_logs())
cmd.warn('hello')
self.assertEqual(['check: hello'], self.get_logs())
cmd.warn('hello %s', 'world')
self.assertEqual(['check: hello world'], self.get_logs())
cmd.warn('hello %s %s', 'beautiful', 'world')
self.assertEqual(['check: hello beautiful world'], self.get_logs())
def test_suite():
return unittest.makeSuite(CheckTestCase)

View File

@ -67,6 +67,10 @@ class InstallLibTestCase(support.TempdirManager,
cmd.distribution.packages = [pkg_dir]
cmd.distribution.script_name = 'setup.py'
# make sure the build_lib is set to the temp dir
build_dir = os.path.split(pkg_dir)[0]
cmd.get_finalized_command('build_py').build_lib = build_dir
# get_output should return 4 elements
self.assertEqual(len(cmd.get_outputs()), 4)

View File

@ -33,7 +33,6 @@ setup(name='fake')
MANIFEST = """\
# file GENERATED by packaging, do NOT edit
README
inroot.txt
data%(sep)sdata.dt
scripts%(sep)sscript.py
@ -129,7 +128,7 @@ class SDistTestCase(support.TempdirManager,
content = zip_file.namelist()
# making sure everything has been pruned correctly
self.assertEqual(len(content), 3)
self.assertEqual(len(content), 2)
@requires_zlib
@unittest.skipIf(find_executable('tar') is None or
@ -214,7 +213,7 @@ class SDistTestCase(support.TempdirManager,
# Making sure everything was added. This includes 9 code and data
# files in addition to PKG-INFO.
self.assertEqual(len(content), 10)
self.assertEqual(len(content), 9)
# Checking the MANIFEST
with open(join(self.tmp_dir, 'MANIFEST')) as fp:
@ -331,7 +330,7 @@ class SDistTestCase(support.TempdirManager,
with open(cmd.manifest) as f:
manifest = [line.strip() for line in f.read().split('\n')
if line.strip() != '']
self.assertEqual(len(manifest), 4)
self.assertEqual(len(manifest), 3)
# Adding a file
self.write_file((self.tmp_dir, 'somecode', 'doc2.txt'), '#')
@ -348,7 +347,7 @@ class SDistTestCase(support.TempdirManager,
if line.strip() != '']
# Do we have the new file in MANIFEST?
self.assertEqual(len(manifest2), 5)
self.assertEqual(len(manifest2), 4)
self.assertIn('doc2.txt', manifest2[-1])
@requires_zlib

View File

@ -150,8 +150,7 @@ class TestTest(TempdirManager,
cmd.tests_require = [phony_project]
cmd.ensure_finalized()
logs = self.get_logs(logging.WARNING)
self.assertEqual(1, len(logs))
self.assertIn(phony_project, logs[0])
self.assertIn(phony_project, logs[-1])
def prepare_a_module(self):
tmp_dir = self.mkdtemp()

View File

@ -176,9 +176,14 @@ class ConfigTestCase(support.TempdirManager,
self.addCleanup(os.chdir, os.getcwd())
tempdir = self.mkdtemp()
self.working_dir = os.getcwd()
os.chdir(tempdir)
self.tempdir = tempdir
def tearDown(self):
os.chdir(self.working_dir)
super(ConfigTestCase, self).tearDown()
def write_setup(self, kwargs=None):
opts = {'description-file': 'README', 'extra-files': '',
'setup-hook': 'packaging.tests.test_config.hook'}

Some files were not shown because too many files have changed in this diff.