pyprotocols-1.0a.svn20070625/CHANGES.txt

Fixes and changes since PyProtocols 0.9.3

 - Added support to make 'protocols.advise()' operate correctly in a doctest
   or other 'exec' scenario.  'protocols.advice.getFrameInfo()' now returns
   a 'kind' of '"class"' when in a class body being exec'd.

 - There is a new 'protocols.advice.add_assignment_advisor' function that
   allows you to implement PEP 318-like decorators in Python 2.2 and 2.3.
   There is also a convenience function, 'as', that lets you use PEP
   318-style decorators directly.  (DOCS NEEDED)

 - 'StickyAdapter' is now a one-argument adapter factory; as a result, the
   'protocol' attribute is now completely gone, and you *must* use the
   'attachForProtocols' attribute in order to get any actual "stickiness".
   See the reference manual for details on the 'attachForProtocols'
   attribute.

 - 'adapt()' no longer supports the 'factory' argument that was deprecated
   in 0.9.3.

 - Using two-argument adapter factories now produces a DeprecationWarning;
   please update your code, since support for them will be gone entirely in
   version 1.1.


Fixes and changes since PyProtocols 0.9.2

 - Adapter factories are now only called with one argument: the object to
   adapt.  For backward compatibility, any adapter factories that require
   more than one argument are wrapped in a converter.  It's highly
   recommended that you transition to one-argument adapters as soon as
   practical, since using two-argument adapters will cause deprecation
   warnings in PyProtocols version 1.0 (and causes
   PendingDeprecationWarnings in 0.9.3).  This change was made for symmetry
   with Zope and Twisted adapters, as well as Pythonic adapter factories
   like 'int' et al.  (Note that as a result of this change, 'Adapter'
   objects no longer have a 'protocol' attribute, and 'StickyAdapter'
   objects will also lose their 'protocol' attribute in 1.0.)

 - The 'factory' parameter to 'adapt()' is DEPRECATED.  An informal survey
   of PyProtocols' users indicated that nobody was using it to any
   significant degree, and its removal was unopposed.  This feature was an
   extension to PEP 246, so this brings PyProtocols into closer conformance
   with the PEP.  If you are currently using it, you will receive a
   'DeprecationWarning', and in PyProtocols 1.0 your code will break.

 - Fixed 'protocols.sequenceOf()' being unable to directly imply a
   non-sequence protocol.

 - Raise 'AdaptationFailure' instead of 'NotImplementedError' when
   adaptation is unsuccessful.  'AdaptationFailure' is a subclass of both
   'TypeError' and 'NotImplementedError', so code written according to
   either PEP 246 or older versions of PyProtocols will still catch the
   error.

 - There is now an 'AbstractBase' class, similar to 'Interface', that can
   be used for the "ABC" (Abstract Base Class) style of interface usage,
   where the interface may contain implementation code, and can be
   subclassed to create concrete implementations.  In previous versions,
   you could use 'Interface' as such a base class, but now calling an
   'Interface' object performs adaptation rather than instantiation, unless
   the subclass redefines '__init__'.

 - 'Protocol' instances (except for 'AbstractBase' subclasses) are now
   callable with a signature similar to 'adapt()'.  E.g.
   'ISomething(foo,*args)' is equivalent to 'adapt(foo,ISomething,*args)'.
   This convenient API, pioneered by Twisted and later adopted by Zope X3,
   is now available in PyProtocols as well.  (Note that as a result of this
   change, the PyProtocols test suite now requires a Zope X3 alpha release
   or better.)

 - 'setup.py' now accepts a '--without-speedups' global option to disable
   the C speedups module.

 - We now support the latest 'adapter_hooks' protocol provided by Zope X3
   interfaces, allowing multiple interface registry systems to participate
   in Zope interfaces' '__adapt__()' implementation.

 - Declaring an adapter from an instance to a protocol that was part of a
   circular implication path resulted in infinite recursion.  Correcting
   the problem required a change in the return signature of the
   'declareProvides()' method in the 'IOpenProvider' interface.  Please see
   the docstring or the updated reference manual for details.  Thanks to
   Bob Ippolito for discovering the problem and bringing it to my
   attention.

 - Defining an adapter from one protocol to another, when that adapter does
   not shorten the adaptation path, would produce a spurious 'KeyError'.


Fixes since PyProtocols 0.9.1

 - Fixed missing unit tests for the 'Variation' class, and the two bugs in
   'Variation' that weren't found because of the missing tests.


Fixes and Enhancements since PyProtocols 0.9

 - Added the 'factoryMethod' and 'equivalentProtocols' keywords to
   'advise()'.

 - Added 'sequenceOf()', allowing you to easily create a protocol that
   represents a sequence of some base protocol, and automatically adapt
   basic sequences (e.g. lists and tuples) to a "sequence of" the base
   protocol, as long as all members of the input sequence can be adapted to
   the base protocol.  By default, only lists and tuples are considered to
   support 'IBasicSequence'.  (An illustrative sketch appears below.)

 - Added 'protocolForType()' and 'protocolForURI()', which allow you to
   link interfaces by intended semantics, not just by having identical
   instances.  For example, you can use 'protocolForType(file,["read"])' to
   symbolize a file-like object with a 'read()' method, or
   'protocolForURI("some UUID")' to symbolize some documented interface.
   In addition to compact declarations, this also allows a module to refer
   to an interface without importing a specific definition of it.  Then,
   when that module is used in a larger program, the linkage between the
   symbolic and actual forms of the interface can be accomplished
   semi-automatically.

 - Enhanced Zope 3 support: Now, adapters can be registered between Zope
   interfaces, and any types or instances.  Note, however, that
   interface-to-interface adaptation may not work if a class only declares
   what it implements using the Zope interface API.  This limitation may be
   removed in a later release.  Zope interfaces can now pass a much larger
   segment of the test suite than before.

 - Added 'protocols.Variation(baseProtocol,context=None)'; this implements
   the 'LocalProtocol' example in the documentation.

 - Added 'Adapter' and 'StickyAdapter' convenience base classes.  'Adapter'
   offers a ready-made '__init__()' method suitable for most adapter
   classes, while 'StickyAdapter' instances automatically declare
   themselves as an instance-specific adapter for any object they're used
   on.  Thus, a 'StickyAdapter' can maintain its state across 'adapt()'
   calls for the same object, so long as the object can have
   instance-specific adapters declared.  (See "Protocol Declarations for
   Individual Objects" in the reference manual for more information on
   this.)

 - Added experimental support for 'ExtensionClass'; previously, PyProtocols
   could raise bizarre errors and/or behave strangely when 'adapt()' was
   called on 'ExtensionClass' instances.
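   An illustrative sketch of the declaration and calling APIs described
   above ('IFoo' and 'Target' are hypothetical names used only for this
   example; they are not part of the distribution):

       from protocols import Interface, advise, sequenceOf

       class IFoo(Interface):
           """A marker interface"""

       class Target:
           advise(instancesProvide=[IFoo])

       t = Target()
       assert IFoo(t) is t                    # same as adapt(t, IFoo)
       assert sequenceOf(IFoo)([t]) == [t]    # lists adapt member-wise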
 - Fixed some problems with the test suite when running under Python 2.3.
   PyProtocols itself worked fine, but the test suite was bitten by two
   minor semantic changes that took effect in 2.3, resulting in lots of
   error messages about ModuleType needing a parameter, and a test failure
   for 'checkClassInfo' in the 'FrameInfoTest' test class.

 - Fixed a memory leak in the C "speedups" module that leaked unbound
   '__conform__' and '__adapt__' methods, as well as '__class__' and
   '__mro__' references.  Also, fixed the C code potentially reraising
   invalid error tracebacks under certain circumstances.


pyprotocols-1.0a.svn20070625/TODO.txt

Items needing investigation or implementation for PyProtocols 1.0

 * implement the declaration API shortcut decorators/advisors


pyprotocols-1.0a.svn20070625/src/setup/common.py

"""This is a set of commonly useful distutils enhancements, designed to be
'execfile()'d in a 'setup.py' file.  Don't import it, it's not a package!

It also doesn't get installed with the rest of the package; it's only
actually used while 'setup.py' is running."""

# Set up default parameters

if 'HAPPYDOC_OUTPUT_PATH' not in globals():
    HAPPYDOC_OUTPUT_PATH = 'docs/html/reference'

if 'HAPPYDOC_IGNORE' not in globals():
    HAPPYDOC_IGNORE = ['-i', 'tests']

if 'HAPPYDOC_TITLE' not in globals():
    HAPPYDOC_TITLE = PACKAGE_NAME + ' API Reference'

from setuptools import Command
from setuptools.command.sdist import sdist as old_sdist


class happy(Command):
    """Command to generate documentation using HappyDoc

    I should probably make this more general, and contribute it to either
    HappyDoc or the distutils, but this does the trick for PEAK for now...
    """

    description = "Generate docs using happydoc"
    user_options = []

    def initialize_options(self):
        self.happy_options = None
        self.doc_output_path = None
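    # (distutils command lifecycle: the command object is instantiated,
    # initialize_options() is called, any user-supplied options are
    # applied, and then finalize_options() and run() are invoked, in that
    # order.)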
    def finalize_options(self):
        if self.doc_output_path is None:
            self.doc_output_path = HAPPYDOC_OUTPUT_PATH

        if self.happy_options is None:
            self.happy_options = [
                '-t', HAPPYDOC_TITLE, '-d', self.doc_output_path,
            ] + HAPPYDOC_IGNORE + ['.']
            if not self.verbose:
                self.happy_options.insert(0, '-q')

    def run(self):
        from distutils.dir_util import remove_tree, mkpath
        from happydoclib import HappyDoc

        mkpath(self.doc_output_path, 0755, self.verbose, self.dry_run)
        remove_tree(self.doc_output_path, self.verbose, self.dry_run)

        if not self.dry_run:
            HappyDoc(self.happy_options).run()


class sdist(old_sdist):
    """Variant of 'sdist' that (re)builds the documentation first"""

    def run(self):
        # Build docs before source distribution
        try:
            import happydoclib
        except ImportError:
            pass
        else:
            self.run_command('happy')

        # Run the standard sdist command
        old_sdist.run(self)


SETUP_COMMANDS = {
    'sdist': sdist,
    'happy': happy,
    'sdist_nodoc': old_sdist,
}


pyprotocols-1.0a.svn20070625/src/protocols/generate.py

"""Autogenerated protocols from type+method names, URI, sequence, etc."""

from interfaces import Protocol, allocate_lock, Interface
from advice import metamethod, supermeta
from api import declareAdapterForProtocol, declareAdapterForType
from api import declareAdapter, adapt
from adapters import NO_ADAPTER_NEEDED

__all__ = [
    'protocolForType', 'protocolForURI', 'sequenceOf', 'IBasicSequence',
    'URIProtocol', 'TypeSubset', 'WeakSubset', 'ADAPT_SEQUENCE',
    'SequenceProtocol'
]


class URIProtocol(Protocol):
    """Protocol representing a URI, UUID, or other unique textual identifier"""

    def __init__(self, uri):
        self.URI = uri
        Protocol.__init__(self)

    def __repr__(self):
        return "URIProtocol(%r)" % self.URI

    def __reduce__(self):
        return protocolForURI, (self.URI,)


class TypeSubset(Protocol):
    """Protocol representing some set of a type's methods"""

    def __init__(self, baseType, methods):
        self.baseType = baseType
        self.methods = methods
        Protocol.__init__(self)

    def __repr__(self):
        return "TypeSubset(%r,%r)" % (self.baseType, self.methods)

    def __reduce__(self):
        return protocolForType, (self.baseType, self.methods, False)


class WeakSubset(TypeSubset, object):
    """TypeSubset that accepts any object with the right attributes"""

    __metaclass__ = type    # new-style so we can use super()

    def __adapt__(self, ob):
        result = supermeta(TypeSubset, self).__adapt__(ob)
        if result is not None:
            return result

        for name in self.methods:
            if not hasattr(ob, name):
                return None
        else:
            return ob

    __adapt__ = metamethod(__adapt__)

    def __repr__(self):
        return "WeakSubset(%r,%r)" % (self.baseType, self.methods)

    def __reduce__(self):
        return protocolForType, (self.baseType, self.methods, True)


class SequenceProtocol(Protocol):
    """Protocol representing a "sequence of" some base protocol"""

    def __init__(self, baseProtocol):
        self.baseProtocol = baseProtocol
        Protocol.__init__(self)

    def __repr__(self):
        return "sequenceOf(%r)" % self.baseProtocol

    def __reduce__(self):
        return sequenceOf, (self.baseProtocol,)


class IBasicSequence(Interface):
    """Non-string, iterable object sequence"""

    def __iter__():
        """Return an iterator over the sequence"""


declareAdapter(
    NO_ADAPTER_NEEDED, provides=[IBasicSequence], forTypes=[list, tuple]
)


def ADAPT_SEQUENCE(ob, proto):
    """Convert iterable 'ob' into list of objects implementing 'proto'"""

    marker = object()
    out = []
    proto = proto.baseProtocol      # get the protocol to adapt to

    for item in ob:
        item = adapt(item, proto, marker)
        if item is marker:
            return None     # can't adapt unless all members adapt
        out.append(item)

    return out
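# Note on the adaptation rule implemented above: sequenceOf(IThing) adapts
# a list or tuple by adapting each member in turn, producing a *new* list,
# e.g. adapt([a, b], sequenceOf(IThing)) yields [IThing(a), IThing(b)].
# If any member fails to adapt, ADAPT_SEQUENCE returns None and the
# sequence as a whole does not adapt.  ('IThing', 'a', and 'b' are
# hypothetical names used only for illustration.)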
__registryLock = allocate_lock()
registry = {}


def protocolForURI(uri):
    """Return a unique protocol object representing the supplied URI/UUID"""

    __registryLock.acquire()
    try:
        try:
            return registry[uri]
        except KeyError:
            proto = registry[uri] = URIProtocol(uri)
            return proto
    finally:
        __registryLock.release()


def protocolForType(baseType, methods=(), implicit=False):
    """Return a protocol representing a subset of methods of a specific type"""

    # Normalize 'methods' to a sorted tuple w/no duplicate names
    methods = dict([(k, k) for k in methods]).keys()
    methods.sort()
    methods = tuple(methods)

    key = baseType, methods, (not not implicit)  # ensure implicit is true/false
    return __protocolForType(key)


def sequenceOf(baseProtocol):
    """Return a protocol representing a sequence of a given base protocol"""

    key = (sequenceOf, baseProtocol)
    __registryLock.acquire()
    try:
        try:
            return registry[key]
        except KeyError:
            proto = registry[key] = SequenceProtocol(baseProtocol)
            declareAdapterForProtocol(
                proto, lambda o: ADAPT_SEQUENCE(o, proto), IBasicSequence
            )
            return proto
    finally:
        __registryLock.release()


def __protocolForType(key):
    """Recursive implementation of protocolForType; assumes standardized key"""

    __registryLock.acquire()
    try:
        try:
            return registry[key]
        except KeyError:
            baseType, methods, implicit = key
            if implicit:
                proto = WeakSubset(baseType, methods)
            else:
                proto = TypeSubset(baseType, methods)
            registry[key] = proto
    finally:
        __registryLock.release()

    # declare that proto implies all subset-method protocols
    if len(methods) > 1:
        for method in methods:
            subset = tuple([m for m in methods if m != method])
            implied = __protocolForType((baseType, subset, implicit))
            declareAdapterForProtocol(implied, NO_ADAPTER_NEEDED, proto)

    # declare that explicit form implies implicit form
    if implicit:
        impliedBy = __protocolForType((baseType, methods, False))
        declareAdapterForProtocol(proto, NO_ADAPTER_NEEDED, impliedBy)

    # declare that baseType implements this protocol
    declareAdapterForType(proto, NO_ADAPTER_NEEDED, baseType)

    return proto
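# Usage sketch (mirrors the examples given in CHANGES.txt; the names on the
# left are hypothetical):
#
#     IReadFile = protocolForType(file, ['read'])
#     ILooksReadable = protocolForType(file, ['read'], implicit=True)
#     IMyProtocol = protocolForURI("some UUID")
#
# The default (explicit) form matches only the named type and its
# subclasses, while implicit=True builds a WeakSubset that accepts any
# object having all of the named attributes, whatever its type.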
pyprotocols-1.0a.svn20070625/src/protocols/api.py

"""Adapter and Declaration API"""

__all__ = [
    'adapt', 'declareAdapterForType', 'declareAdapterForProtocol',
    'declareAdapterForObject', 'advise', 'declareImplementation',
    'declareAdapter', 'adviseObject',
]

_marker = object()

from sys import _getframe, exc_info, modules
from types import ClassType

ClassTypes = ClassType, type

from adapters import NO_ADAPTER_NEEDED, DOES_NOT_SUPPORT, AdaptationFailure
from adapters import bindAdapter
from peak.util.decorators import decorate_class, frameinfo
from interfaces import IOpenProtocol, IOpenProvider, IOpenImplementor
from interfaces import Protocol, InterfaceClass


def adapt(obj, protocol, default=_marker):
    """PEP 246-alike: Adapt 'obj' to 'protocol', return 'default'

    If 'default' is not supplied and no implementation is found,
    'AdaptationFailure' is raised."""

    if isinstance(protocol, ClassTypes) and isinstance(obj, protocol):
        return obj

    try:
        _conform = obj.__conform__
    except AttributeError:
        pass
    else:
        try:
            result = _conform(protocol)
            if result is not None:
                return result
        except TypeError:
            if exc_info()[2].tb_next is not None:
                raise

    try:
        _adapt = protocol.__adapt__
    except AttributeError:
        pass
    else:
        try:
            result = _adapt(obj)
            if result is not None:
                return result
        except TypeError:
            if exc_info()[2].tb_next is not None:
                raise

    if default is _marker:
        raise AdaptationFailure("Can't adapt", obj, protocol)

    return default

try:
    from _speedups import adapt
except ImportError:
    pass


# Fundamental, explicit interface/adapter declaration API:
# All declarations should end up passing through these three routines.

def declareAdapterForType(protocol, adapter, typ, depth=1):
    """Declare that 'adapter' adapts instances of 'typ' to 'protocol'"""
    adapter = bindAdapter(adapter, protocol)
    adapter = adapt(protocol, IOpenProtocol).registerImplementation(
        typ, adapter, depth
    )

    oi = adapt(typ, IOpenImplementor, None)

    if oi is not None:
        oi.declareClassImplements(protocol, adapter, depth)


def declareAdapterForProtocol(protocol, adapter, proto, depth=1):
    """Declare that 'adapter' adapts 'proto' to 'protocol'"""
    adapt(protocol, IOpenProtocol)  # src and dest must support IOpenProtocol
    adapt(proto, IOpenProtocol).addImpliedProtocol(
        protocol, bindAdapter(adapter, protocol), depth
    )


def declareAdapterForObject(protocol, adapter, ob, depth=1):
    """Declare that 'adapter' adapts 'ob' to 'protocol'"""
    adapt(protocol, IOpenProtocol).registerObject(
        ob, bindAdapter(adapter, protocol), depth
    )


# Bootstrap APIs to work with Protocol and InterfaceClass, without needing
# to give Protocol a '__conform__' method that's hardwired to IOpenProtocol.

# Note that InterfaceClass has to be registered first, so that when the
# registration propagates to IAdaptingProtocol and IProtocol, InterfaceClass
# will already be recognized as an IOpenProtocol, preventing infinite
# regress.

IOpenProtocol.registerImplementation(InterfaceClass)    # VERY BAD!!
IOpenProtocol.registerImplementation(Protocol)          # NEVER DO THIS!!

# From this line forward, the declaration APIs can work.  Use them instead!


# Interface and adapter declarations - convenience forms, explicit targets

def declareAdapter(factory, provides,
                   forTypes=(), forProtocols=(), forObjects=()):
    """'factory' is an IAdapterFactory providing 'provides' protocols"""

    for protocol in provides:
        for typ in forTypes:
            declareAdapterForType(protocol, factory, typ)

        for proto in forProtocols:
            declareAdapterForProtocol(protocol, factory, proto)

        for ob in forObjects:
            declareAdapterForObject(protocol, factory, ob)


def declareImplementation(typ, instancesProvide=(), instancesDoNotProvide=()):
    """Declare information about a class, type, or 'IOpenImplementor'"""

    for proto in instancesProvide:
        declareAdapterForType(proto, NO_ADAPTER_NEEDED, typ)

    for proto in instancesDoNotProvide:
        declareAdapterForType(proto, DOES_NOT_SUPPORT, typ)


def adviseObject(ob, provides=(), doesNotProvide=()):
    """Tell an object what it does or doesn't provide"""

    for proto in provides:
        declareAdapterForObject(proto, NO_ADAPTER_NEEDED, ob)

    for proto in doesNotProvide:
        declareAdapterForObject(proto, DOES_NOT_SUPPORT, ob)


# And now for the magic function...
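# Usage sketch for 'advise()' ('ISomething' and 'SomeClass' are
# hypothetical names used only for illustration):
#
#     class MyAdapter(object):
#         advise(instancesProvide=[ISomething],
#                asAdapterForTypes=[SomeClass])
#
#         def __init__(self, ob):
#             self.subject = ob
#
# The keyword arguments accepted in class and module bodies are validated
# below.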
def advise(**kw):
    kw = kw.copy()
    frame = _getframe(1)
    kind, module, caller_locals, caller_globals = frameinfo(frame)

    if kind == "module":
        moduleProvides = kw.setdefault('moduleProvides', ())
        del kw['moduleProvides']

        for k in kw:
            raise TypeError(
                "Invalid keyword argument for advising modules: %s" % k
            )

        adviseObject(module, provides=moduleProvides)
        return

    elif kind != "class":
        raise SyntaxError(
            "protocols.advise() must be called directly in a class or"
            " module body, not in a function or exec."
        )

    classProvides = kw.setdefault('classProvides', ())
    classDoesNotProvide = kw.setdefault('classDoesNotProvide', ())
    instancesProvide = kw.setdefault('instancesProvide', ())
    instancesDoNotProvide = kw.setdefault('instancesDoNotProvide', ())
    asAdapterForTypes = kw.setdefault('asAdapterForTypes', ())
    asAdapterForProtocols = kw.setdefault('asAdapterForProtocols', ())
    protocolExtends = kw.setdefault('protocolExtends', ())
    protocolIsSubsetOf = kw.setdefault('protocolIsSubsetOf', ())
    factoryMethod = kw.setdefault('factoryMethod', None)
    equivalentProtocols = kw.setdefault('equivalentProtocols', ())

    map(kw.__delitem__,
        "classProvides classDoesNotProvide instancesProvide"
        " instancesDoNotProvide asAdapterForTypes asAdapterForProtocols"
        " protocolExtends protocolIsSubsetOf factoryMethod"
        " equivalentProtocols".split())

    for k in kw:
        raise TypeError(
            "Invalid keyword argument for advising classes: %s" % k
        )

    def callback(klass):
        if classProvides or classDoesNotProvide:
            adviseObject(klass,
                provides=classProvides, doesNotProvide=classDoesNotProvide
            )

        if instancesProvide or instancesDoNotProvide:
            declareImplementation(klass,
                instancesProvide=instancesProvide,
                instancesDoNotProvide=instancesDoNotProvide
            )

        if asAdapterForTypes or asAdapterForProtocols:
            if not instancesProvide:
                raise TypeError(
                    "When declaring an adapter, you must specify what"
                    " its instances will provide."
                )
            if factoryMethod:
                factory = getattr(klass, factoryMethod)
            else:
                factory = klass

            declareAdapter(factory, instancesProvide,
                forTypes=asAdapterForTypes,
                forProtocols=asAdapterForProtocols
            )
        elif factoryMethod:
            raise TypeError(
                "'factoryMethod' is only used when declaring an adapter type"
            )

        if protocolExtends:
            declareAdapter(NO_ADAPTER_NEEDED, protocolExtends,
                forProtocols=[klass]
            )

        if protocolIsSubsetOf:
            declareAdapter(NO_ADAPTER_NEEDED, [klass],
                forProtocols=protocolIsSubsetOf
            )

        if equivalentProtocols:
            declareAdapter(
                NO_ADAPTER_NEEDED, equivalentProtocols, forProtocols=[klass]
            )
            declareAdapter(
                NO_ADAPTER_NEEDED, [klass], forProtocols=equivalentProtocols
            )

        return klass

    decorate_class(callback)


pyprotocols-1.0a.svn20070625/src/protocols/tests/test_direct.py

"""Tests for default IOpenProvider (etc.)
adapters TODO: - Test Zope interface registrations """ from unittest import TestCase, makeSuite, TestSuite from protocols import * from checks import ProviderChecks, AdaptiveChecks, ClassProvidesChecks from checks import makeClassProvidesTests, makeInstanceTests from checks import makeMetaClassProvidesTests class IA(Interface): pass class IB(IA): pass class IPure(Interface): # We use this for pickle/copy tests because the other protocols # imply various dynamically created interfaces, and so any object # registered with them won't be picklable pass class BasicChecks(AdaptiveChecks, ProviderChecks): """Checks to be done on every object""" IA = IA IB = IB Interface = Interface IPure = IPure def checkCircularRegister(self): P1 = Protocol() P2 = Protocol() declareAdapter(NO_ADAPTER_NEEDED,provides=[P2],forProtocols=[P1]) declareAdapter(NO_ADAPTER_NEEDED,provides=[P1],forProtocols=[P2]) self.declareObImplements([P1]) # implies P1->P2->P1 class ClassChecks(ClassProvidesChecks, BasicChecks): """Checks to be done on classes and types""" class InstanceConformChecks: """Things to check on adapted instances""" def checkBadConform(self): def __conform__(proto): pass self.ob.__conform__ = __conform__ self.assertBadConform(self.ob, [self.IA], __conform__) def assertBadConform(self, ob, protocols, conform): try: adviseObject(ob, provides=protocols) except TypeError,v: assert v.args==( "Incompatible __conform__ on adapted object", ob, conform ), v.args else: raise AssertionError("Should've detected invalid __conform__") class ClassConformChecks(InstanceConformChecks): """Things to check on adapted classes""" def checkInheritedConform(self): class Base(self.ob): def __conform__(self,protocol): pass class Sub(Base): pass self.assertBadConform(Sub, [self.IA], Base.__conform__.im_func) def checkInstanceConform(self): class Base(self.ob): def __conform__(self,protocol): pass b = Base() self.assertBadConform(b, [self.IA], b.__conform__) class AdviseMixinInstance(BasicChecks): def setUp(self): self.ob = ProviderMixin() # Notice that we don't test the *metaclass* of the next three configurations; # it would fail because the metaclass itself can't be adapted to an open # provider, because it has a __conform__ method (from ProviderMixin). For # that to work, there'd have to be *another* metalevel. 
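# AdviseMixinMultiMeta1 (below) runs the basic checks against an *instance*
# of a ProviderMixin class whose metaclass is itself derived from both
# ProviderMixin and type.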
class AdviseMixinMultiMeta1(BasicChecks): def setUp(self): class Meta(ProviderMixin, type): pass class Test(ProviderMixin,object): __metaclass__ = Meta self.ob = Test() class InstanceTestsBase(BasicChecks, InstanceConformChecks): pass class ClassTestsBase(ClassChecks, ClassConformChecks): pass class Picklable: # Pickling needs classes in top-level namespace pass class NewStyle(object): pass TestClasses = ( AdviseMixinInstance, AdviseMixinMultiMeta1, ) TestClasses += makeMetaClassProvidesTests(ClassChecks) TestClasses += makeClassProvidesTests(ClassTestsBase) TestClasses += makeInstanceTests(InstanceTestsBase,Picklable,NewStyle) def test_suite(): return TestSuite([makeSuite(t,'check') for t in TestClasses]) pyprotocols-1.0a.svn20070625/src/protocols/tests/test_advice.py0000644000175000017500000000436110570131557022554 0ustar kovkov"""Tests for advice""" from unittest import TestCase, makeSuite, TestSuite from protocols.advice import * import sys from types import InstanceType class SuperTest(TestCase): def checkMetaSuper(self): class Meta(type): def foo(self,arg): return arg foo = metamethod(foo) class Class(object): __metaclass__ = Meta def foo(self,arg): return arg*2 # Verify that ob.foo() and ob.__class__.foo() are different assert Class.foo(1)==1 assert Class().foo(1)==2 # Verify that supermeta() works for such methods class SubMeta(Meta): def foo(self,arg): return -supermeta(SubMeta,self).foo(arg) foo = metamethod(foo) class ClassOfSubMeta(Class): __metaclass__ = SubMeta assert ClassOfSubMeta.foo(1)==-1 assert ClassOfSubMeta().foo(1)==2 def checkPropSuper(self): class Base(object): __slots__ = 'foo' class Sub(Base): def getFoo(self): return supermeta(Sub,self).foo * 2 def setFoo(self,val): Base.foo.__set__(self,val) foo = property(getFoo, setFoo) ob = Sub() ob.foo = 1 assert ob.foo == 2 def checkSuperNotFound(self): class Base(object): pass b = Base() try: supermeta(Base,b).foo except AttributeError: pass else: raise AssertionError("Shouldn't have returned a value") class MROTests(TestCase): def checkStdMRO(self): class foo(object): pass class bar(foo): pass class baz(foo): pass class spam(bar,baz): pass assert getMRO(spam) is spam.__mro__ def checkClassicMRO(self): class foo: pass class bar(foo): pass class baz(foo): pass class spam(bar,baz): pass basicMRO = [spam,bar,foo,baz,foo] assert list(getMRO(spam)) == basicMRO assert list(getMRO(spam,True)) == basicMRO+[InstanceType,object] TestClasses = SuperTest, MROTests def test_suite(): return TestSuite([makeSuite(t,'check') for t in TestClasses]) pyprotocols-1.0a.svn20070625/src/protocols/tests/test_classes.py0000644000175000017500000000244407670214215022756 0ustar kovkov"""Tests for implementor declarations (i.e. 
instancesProvides)""" from unittest import TestCase, makeSuite, TestSuite from protocols import * from checks import ImplementationChecks, AdaptiveChecks, makeClassTests class IA(Interface): pass class IB(IA): pass class IPure(Interface): # We use this for pickle/copy tests because the other protocols # imply various dynamically created interfaces, and so any object # registered with them won't be picklable pass class BasicChecks(AdaptiveChecks, ImplementationChecks): """PyProtocols-only class-instances-provide checks""" IA = IA IB = IB Interface = Interface IPure = IPure def checkChangingBases(self): # Zope and Twisted fail this because they rely on the first-found # __implements__ attribute and ignore a class' MRO/__bases__ M1, M2 = self.setupBases(self.klass) m1 = self.make(M1) m2 = self.make(M2) declareImplementation(M1, instancesProvide=[self.IA]) declareImplementation(M2, instancesProvide=[self.IB]) self.assertM1ProvidesOnlyAandM2ProvidesB(m1,m2) self.assertChangingBasesChangesInterface(M1,M2,m1,m2) TestClasses = makeClassTests(BasicChecks) def test_suite(): return TestSuite([makeSuite(t,'check') for t in TestClasses]) pyprotocols-1.0a.svn20070625/src/protocols/tests/test_zope.py0000644000175000017500000000401710607570201022266 0ustar kovkov"""Zope Interface tests""" from unittest import TestCase, makeSuite, TestSuite from protocols import * import protocols.zope_support from zope.interface import Interface # Dummy interfaces and adapters used in tests class IA(Interface): pass class IB(IA): pass class IPure(Interface): # We use this for pickle/copy tests because the other protocols # imply various dynamically created interfaces, and so any object # registered with them won't be picklable pass class Picklable: # Pickling needs classes in top-level namespace pass class NewStyle(object): pass from checks import ImplementationChecks, makeClassTests, makeInstanceTests from checks import ProviderChecks, BasicClassProvidesChecks from checks import makeMetaClassProvidesTests, AdaptiveChecks class BasicChecks(AdaptiveChecks, ImplementationChecks): IA = IA IB = IB Interface = Interface IPure = IPure class InstanceChecks(AdaptiveChecks, ProviderChecks): IA = IA IB = IB Interface = Interface IPure = IPure class ClassChecks(BasicClassProvidesChecks, ProviderChecks): IA = IA IB = IB Interface = Interface IPure = IPure TestClasses = makeClassTests(BasicChecks) TestClasses += makeMetaClassProvidesTests(ClassChecks) TestClasses += makeInstanceTests(InstanceChecks,Picklable,NewStyle) class IB(protocols.Interface): advise(protocolExtends = [IA]) class BasicChecks2(AdaptiveChecks, ImplementationChecks): IA = IA IB = IB Interface = Interface IPure = IPure class InstanceChecks2(AdaptiveChecks, ProviderChecks): IA = IA IB = IB Interface = Interface IPure = IPure class ClassChecks2(AdaptiveChecks, BasicClassProvidesChecks, ProviderChecks): IA = IA IB = IB Interface = Interface IPure = IPure TestClasses += makeClassTests(BasicChecks2) TestClasses += makeMetaClassProvidesTests(ClassChecks2) TestClasses += makeInstanceTests(InstanceChecks2,Picklable,NewStyle) def test_suite(): return TestSuite([makeSuite(t,'check') for t in TestClasses]) pyprotocols-1.0a.svn20070625/src/protocols/tests/__init__.py0000644000175000017500000001716410607570201022020 0ustar kovkovfrom unittest import TestSuite, TestCase, makeSuite from protocols import adapt, advise, Interface, Attribute, declareAdapter from protocols import AbstractBase, AdaptationFailure class APITests(TestCase): def checkAdaptTrapsTypeErrorsOnConform(self): 
class Conformer: def __conform__(self,ob): return [] assert adapt(Conformer,list,None) is None assert adapt(Conformer(),list,None) == [] def checkAdaptHandlesIsInstance(self): assert adapt([1,2,3],list,None) == [1,2,3] assert adapt('foo',str,None) == 'foo' assert adapt('foo',list,None) is None def checkAdviseFailsInCallContext(self): try: advise() except SyntaxError: pass else: raise AssertionError( "Should've got SyntaxError for advise() in function" ) def checkAdviseClassKeywordsValidated(self): try: class X: advise(moduleProvides=list) except TypeError,v: assert v.args==( "Invalid keyword argument for advising classes: moduleProvides", ) else: raise AssertionError("Should've caught invalid keyword") def checkAdviseClassKeywordsValidated(self): try: class X: advise(moduleProvides=list) except TypeError,v: assert v.args==( "Invalid keyword argument for advising classes: moduleProvides", ) else: raise AssertionError("Should've caught invalid keyword") def checkAdviseModuleKeywordsValidated(self): try: exec "advise(instancesProvide=[IProtocol1])" in globals(),globals() except TypeError,v: assert v.args==( "Invalid keyword argument for advising modules: instancesProvide", ) else: raise AssertionError("Should've caught invalid keyword") def checkAdviseInClassExec(self): d = {'advise':advise,'IProtocol1':IProtocol1} exec "class Foo: advise(instancesProvide=[IProtocol1])" in d def checkSimpleAdaptation(self): class Conformer: def __conform__(self,protocol): if protocol==42: return "hitchhiker",self class AdaptingProtocol: def __adapt__(klass,ob): return "adapted", ob __adapt__ = classmethod(__adapt__) c = Conformer() assert adapt(c,42,None) == ("hitchhiker",c) assert adapt(c,AdaptingProtocol,None) == ("adapted",c) assert adapt(42,AdaptingProtocol,None) == ("adapted",42) assert adapt(42,42,None) is None def checkAdaptFiltersTypeErrors(self): class Nonconformist: def __conform__(self,ob): raise TypeError("You got me!") class Unadaptive: def __adapt__(self,ob): raise TypeError("You got me!") # These will get a type errors calling __conform__/__adapt__ # but should be ignored since the error is at calling level assert adapt(None, Unadaptive, None) is None assert adapt(Nonconformist, None, None) is None # These will get type errors internally, and the error should # bleed through to the caller self.assertTypeErrorPassed(None, Unadaptive(), None) self.assertTypeErrorPassed(Nonconformist(), None, None) self.assertRaises(AdaptationFailure, adapt, None, None) def assertTypeErrorPassed(self, *args): try: # This should raise TypeError internally, and be caught adapt(*args) except TypeError,v: assert v.args==("You got me!",) else: raise AssertionError("Should've passed TypeError through") def checkImplicationBug(self): class I1(Interface): pass class I2(I1): pass declareAdapter(lambda o: o, provides=[I1],forProtocols=[I2]) def checkAttribute(self): for i in range(10)+[None]: class Abstract(AbstractBase): value = Attribute("testing", "value", i) ob = Abstract() assert ob.value == i for j in range(10): ob.value = j assert ob.value==j def checkStickyAdapter(self): class T: pass t = T() class I(Interface): pass from protocols.adapters import StickyAdapter class A(StickyAdapter): attachForProtocols = I, advise(instancesProvide=[I], asAdapterForTypes=[T]) a = I(t) assert adapt(t, I) is a assert a.subject is t n = T() a2 = I(n) assert a2 is not a assert a2.subject is n assert adapt(n, I) is a2 from protocols import protocolForType, protocolForURI, sequenceOf, advise from protocols import declareImplementation, 
Variation from UserDict import UserDict IGetSetMapping = protocolForType(dict,['__getitem__','__setitem__']) IGetMapping = protocolForType(dict,['__getitem__']) ISimpleReadFile = protocolForType(file,['read']) IImplicitRead = protocolForType(file,['read'], implicit=True) IProtocol1 = protocolForURI("http://peak.telecommunity.com/PyProtocols") multimap = sequenceOf(IGetMapping) declareImplementation(UserDict,[IGetSetMapping]) IMyUnusualMapping = Variation(IGetSetMapping) class MyUserMapping(object): pass declareImplementation(MyUserMapping,[IMyUnusualMapping]) class GenerationTests(TestCase): def checkTypeSubset(self): d = {} assert IGetSetMapping(d,None) is d assert IGetMapping(d,None) is d def checkImplications(self): d = UserDict() assert IGetMapping(d,None) is d assert IImplicitRead(d,None) is None def checkWeak(self): from cStringIO import StringIO s = StringIO("foo") assert ISimpleReadFile(s,None) is None assert IImplicitRead(s,None) is s def checkURI(self): p = protocolForURI("http://www.python.org/") assert p is not IProtocol1 p = protocolForURI("http://peak.telecommunity.com/PyProtocols") assert p is IProtocol1 def checkSequence(self): d1,d2 = {},{} seq = multimap([d1,d2]) assert seq == [d1,d2] assert seq[0] is d1 and seq[1] is d2 class ISequenceLike(Interface): advise(protocolIsSubsetOf=[multimap]) ISequenceLike([d1,d2]) def checkVariation(self): d = {} assert IMyUnusualMapping(d,None) is d # GetSet implies variation d = MyUserMapping(); assert IMyUnusualMapping(d,None) is d assert IGetSetMapping(d,None) is None # but not the other way self.assertEqual(repr(IMyUnusualMapping), "Variation(TypeSubset(,('__getitem__', '__setitem__')))") self.assertEqual(repr(Variation(Interface,42)), "Variation(,42)") def test_suite(): from protocols.tests import test_advice, test_direct, test_classes tests = [ test_advice.test_suite(), test_classes.test_suite(), test_direct.test_suite(), makeSuite(APITests,'check'), makeSuite(GenerationTests,'check'), ] try: import zope.interface except ImportError: pass else: from protocols.tests import test_zope tests.append( test_zope.test_suite() ) try: from twisted.python.components import Interface except (ImportError, SystemError): pass else: from protocols.tests import test_twisted tests.append( test_twisted.test_suite() ) return TestSuite( tests ) pyprotocols-1.0a.svn20070625/src/protocols/tests/checks.py0000644000175000017500000003056610607570201021522 0ustar kovkov"""Basic test setups""" __all__ = [ 'TestBase', 'ImplementationChecks', 'ProviderChecks', 'InstanceImplementationChecks', 'makeClassTests', 'ClassProvidesChecks', 'AdaptiveChecks', 'SimpleAdaptiveChecks', 'makeMetaClassProvidesTests', 'BasicClassProvidesChecks', ] from unittest import TestCase, makeSuite, TestSuite from protocols import * # Dummy interfaces and adapters used in tests def a1(ob): return 'a1',ob def a2(ob): return 'a2',ob class TestBase(TestCase): """Non-adapter instance tests""" a1 = staticmethod(a1) a2 = staticmethod(a2) def assertObProvidesOnlyA(self): assert self.IA(self.ob, None) is self.ob assert self.IB(self.ob, None) is None assert self.IA(None, None) is None assert adapt(self.IB, None, None) is None assert adapt(self.ob, None, None) is None def assertObProvidesAandB(self): assert self.IA(self.ob, None) is self.ob assert self.IB(self.ob, None) is self.ob assert adapt(self.IA, None, None) is None assert adapt(self.IB, None, None) is None assert adapt(self.ob, None, None) is None def assertAmbiguous(self, a1, a2, d1, d2, ifaces): try: self.declareObAdapts(a2,ifaces) except 
TypeError,v: assert v.args == ("Ambiguous adapter choice", a1, a2, d1, d2) def make(self,klass): # This is overridden by tests where 'klass' is a metaclass return klass() def assertObProvidesSubsetOfA(self): # Assert that self.ob provides a new subset of self.IA # (Caller must ensure that self.ob provides self.IA) class IC(self.Interface): advise(protocolIsSubsetOf=[self.IA]) if 'self' in locals(): del locals()['self'] # how the heck??? assert IC(self.ob, None) is self.ob def setupBases(self,base): class M1(base): pass class M2(base): pass return M1, M2 def assertM1ProvidesOnlyAandM2ProvidesB(self,M1,M2): assert self.IA(M1,None) is M1 assert self.IB(M1,None) is None assert self.IB(M2,None) is M2 def assertChangingBasesChangesInterface(self,M1,M2,m1,m2): try: M1.__bases__ = M2, except TypeError: # XXX 2.2 doesn't let newstyle __bases__ change pass else: assert self.IA(m1,None) is m1 assert self.IB(m1,None) is m1 def assertObProvidesABCD(self,IC,ID): assert self.IA(self.ob, None) is self.ob assert self.IB(self.ob, None) is self.ob assert IC(self.ob, None) is self.ob assert ID(self.ob, None) is self.ob def assertObProvidesCandDnotAorB(self,IC,ID): assert self.IA(self.ob, None) is None assert self.IB(self.ob, None) is None assert IC(self.ob, None) is self.ob assert ID(self.ob, None) is self.ob class ProviderChecks(TestBase): """Non-adapter specific-object-provides checks""" def declareObImplements(self,ifaces): adviseObject(self.ob, provides=ifaces) def declareObAdapts(self,factory,ifaces): declareAdapter(factory,provides=ifaces,forObjects=[self.ob]) def checkSimpleRegister(self): self.declareObImplements([self.IA]) self.assertObProvidesOnlyA() def checkImpliedRegister(self): self.declareObImplements([self.IB]) self.assertObProvidesAandB() class SimpleAdaptiveChecks: """Simple adapter-oriented checks that Twisted can handle (Well, it handles them as long as all the interfaces are Twisted)""" def checkDelayedImplication(self): self.declareObImplements([self.IA]) self.assertObProvidesSubsetOfA() def checkIndirectImplication(self): # Zope fails this because it does not support adding after-the-fact # implication. 
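        # That is: the object is declared to provide ID (and therefore IC);
        # only afterwards is IC declared to imply IB -- which already
        # implies IA -- so the existing declaration must now transitively
        # yield all four protocols.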
# IB->IA + ID->IC + IC->IB = ID->IA class IC(self.Interface): pass class ID(IC): pass self.declareObImplements([ID]) self.assertObProvidesCandDnotAorB(IC,ID) declareAdapter( NO_ADAPTER_NEEDED, provides=[self.IB], forProtocols=[IC] ) self.assertObProvidesABCD(IC,ID) class AdaptiveChecks(SimpleAdaptiveChecks): """General adapter/protocol implication checks Twisted can't handle these.""" def checkAmbiguity(self): self.declareObAdapts(self.a1,[self.IA]) self.assertAmbiguous(self.a1,self.a2,1,1,[self.IA]) def checkOverrideDepth(self): self.declareObAdapts(self.a1,[self.IB]) self.assertEquals(self.IA(self.ob,None), ('a1',self.ob)) self.declareObAdapts(self.a2,[self.IA]) self.assertEquals(self.IA(self.ob,None), ('a2',self.ob)) def checkComposed(self): class IC(self.Interface): pass declareAdapter(self.a2,provides=[IC],forProtocols=[self.IA]) self.declareObAdapts(self.a1,[self.IA]) self.assertEqual(IC(self.ob,None), ('a2',('a1',self.ob))) def checkLateDefinition(self): # Zope fails this because it has different override semantics self.declareObAdapts(DOES_NOT_SUPPORT, [self.IA]) assert self.IA(self.ob,None) is None self.declareObImplements([self.IA]) assert self.IA(self.ob,None) is self.ob # NO_ADAPTER_NEEDED at same depth should override DOES_NOT_SUPPORT self.declareObAdapts(DOES_NOT_SUPPORT, [self.IA]) assert self.IA(self.ob,None) is self.ob class InstanceImplementationChecks(TestBase): """Non-adapter class-instances-provide checks""" # Everybody can handle these def declareObImplements(self,ifaces): declareImplementation(self.klass, ifaces) def declareObAdapts(self,factory,ifaces): declareAdapter(factory,provides=ifaces,forTypes=[self.klass]) def checkSimpleRegister(self): self.declareObImplements([self.IA]) self.assertObProvidesOnlyA() def checkImpliedRegister(self): self.declareObImplements([self.IB]) self.assertObProvidesAandB() class ImplementationChecks(InstanceImplementationChecks): """Non-adapter class-instances vs. 
class-provide checks""" # Twisted fails these tests because it has no way to distinguish the # interfaces an object provides from the interfaces its class provides def checkNoClassPassThru(self): self.declareObImplements([self.IA]) assert self.IA(self.klass, None) is None def checkInheritedDeclaration(self): self.declareObImplements([self.IB]) class Sub(self.klass): pass inst = self.make(Sub) assert self.IB(inst,None) is inst assert self.IA(inst,None) is inst assert self.IA(Sub,None) is None # check not passed up to class assert self.IB(Sub,None) is None def checkRejectInheritanceAndReplace(self): self.declareObImplements([self.IB]) class Sub(self.klass): advise(instancesDoNotProvide=[self.IB]) inst = self.make(Sub) assert self.IA(inst,None) is inst assert self.IB(inst,None) is None declareImplementation(Sub, instancesProvide=[self.IB]) assert self.IB(inst,None) is inst class BasicClassProvidesChecks: """Object-provides checks for classes and types""" # Twisted doesn't support these because it makes no object/type distinction def checkNoInstancePassThru(self): inst = self.ob() adviseObject(self.ob, provides=[self.IA]) assert self.IA(inst, None) is None def checkChangingBases(self): M1, M2 = self.setupBases(self.ob) adviseObject(M1, provides=[self.IA]) adviseObject(M2, provides=[self.IB]) self.assertM1ProvidesOnlyAandM2ProvidesB(M1,M2) self.assertChangingBasesChangesInterface(M1,M2,M1,M2) class ClassProvidesChecks(BasicClassProvidesChecks): # Twisted doesn't support these because it makes no object/type distinction def checkInheritedDeclaration(self): class Sub(self.ob): pass adviseObject(self.ob, provides=[self.IB]) assert self.IB(Sub, None) is Sub assert self.IA(Sub, None) is Sub def checkRejectInheritanceAndReplace(self): adviseObject(self.ob, provides=[self.IB]) class Sub(self.ob): advise(classDoesNotProvide=[self.IB]) assert self.IA(Sub,None) is Sub assert self.IB(Sub,None) is None adviseObject(Sub,provides=[self.IB]) assert self.IB(Sub,None) is Sub def makeInstanceTests(base,Picklable,NewStyle): """Generate a set of instance-oriented test classes using 'base'""" class AdviseFunction(base): __module__ = base.__module__ def setUp(self): def aFunc(foo,bar): pass self.ob = aFunc AdviseFunction.__name__ = base.__name__+'.AdviseFunction' class AdviseModule(base): __module__ = base.__module__ def setUp(self): from types import ModuleType self.ob = ModuleType('x') AdviseModule.__name__ = base.__name__+'.AdviseModule' class AdviseInstance(base): __module__ = base.__module__ def setUp(self): self.ob = Picklable() def checkPickling(self): from cPickle import loads,dumps # pickle has a bug! 
adviseObject(self.ob, provides=[self.IPure]) newOb = loads(dumps(self.ob)) assert self.IPure(newOb,None) is newOb AdviseInstance.__name__ = base.__name__+'.AdviseInstance' class AdviseNewInstance(AdviseInstance): __module__ = base.__module__ def setUp(self): self.ob = NewStyle() AdviseNewInstance.__name__ = base.__name__+'.AdviseNewInstance' return AdviseFunction, AdviseModule, AdviseInstance, AdviseNewInstance def makeClassProvidesTests(base): """Generate a set of class-provides-oriented test classes using 'base'""" class AdviseClass(base): def setUp(self): class Classic: pass self.ob = Classic class AdviseType(AdviseClass): def setUp(self): class Class(object): pass self.ob = Class return AdviseClass, AdviseType def makeMetaClassProvidesTests(base): """Generate a set of class-provides-oriented test classes using 'base'""" # Notice that we don't test the *metaclass* of the next two configurations; # it would fail because the metaclass itself can't be adapted to an open # provider, because it has a __conform__ method (from ProviderMixin). For # that to work, there'd have to be *another* metalevel. class AdviseMixinClass(base): def setUp(self): class Meta(ProviderMixin, type): pass class Test(object): __metaclass__ = Meta self.ob = Test class AdviseMixinMultiMeta2(base): def setUp(self): class Meta(ProviderMixin, type): pass class Test(ProviderMixin,object): __metaclass__ = Meta self.ob = Test return AdviseMixinClass, AdviseMixinMultiMeta2 def makeClassTests(base): """Generate a set of class-oriented test classes using 'base'""" class TestClassic(base): __module__ = base.__module__ def setUp(self): class Classic: pass self.klass = Classic self.ob = Classic() TestClassic.__name__ = base.__name__+'.TestClassic' class TestBuiltin(base): __module__ = base.__module__ def setUp(self): # Note: We need a type with a no-arguments constructor class Newstyle(list): __slots__ = () self.klass = Newstyle self.ob = Newstyle() TestBuiltin.__name__ = base.__name__+'.TestBuiltin' class TestMetaclass(base): __module__ = base.__module__ def setUp(self): class Meta(type): pass self.klass = Meta class Base(object): __metaclass__ = Meta self.ob = Base def make(self,klass): return klass('Dummy',(object,),{}) TestMetaclass.__name__ = base.__name__+'.TestMetaclass' class TestMetaInstance(base): __module__ = base.__module__ def setUp(self): class Meta(type): pass class Base(object): __metaclass__ = Meta self.klass = Base self.ob = Base() TestMetaInstance.__name__ = base.__name__+'.TestMetaInstance' return TestClassic, TestBuiltin, TestMetaclass, TestMetaInstance pyprotocols-1.0a.svn20070625/src/protocols/tests/test_twisted.py0000644000175000017500000000312307670216374023007 0ustar kovkov"""Twisted Interface tests""" from unittest import TestCase, makeSuite, TestSuite from protocols import * import protocols.twisted_support from twisted.python.components import Interface # Dummy interfaces and adapters used in tests class IA(Interface): pass class IB(IA): pass class IPure(Interface): # We use this for pickle/copy tests because the other protocols # imply various dynamically created interfaces, and so any object # registered with them won't be picklable pass from checks import InstanceImplementationChecks, makeClassTests, ProviderChecks from checks import makeInstanceTests, SimpleAdaptiveChecks class Picklable: # Pickling needs classes in top-level namespace pass class NewStyle(object): pass class BasicChecks(SimpleAdaptiveChecks, InstanceImplementationChecks): IA = IA IB = IB Interface = Interface IPure = IPure class 
InstanceChecks(ProviderChecks): IA = IA IB = IB Interface = Interface IPure = IPure TestClasses = makeClassTests(BasicChecks) TestClasses += makeInstanceTests(InstanceChecks,Picklable,NewStyle) class IB(protocols.Interface): advise(protocolExtends = [IA]) class BasicChecks(InstanceImplementationChecks): IA = IA IB = IB Interface = Interface IPure = IPure class InstanceChecks(ProviderChecks): IA = IA IB = IB Interface = Interface IPure = IPure TestClasses += makeClassTests(BasicChecks) TestClasses += makeInstanceTests(InstanceChecks,Picklable,NewStyle) def test_suite(): return TestSuite([makeSuite(t,'check') for t in TestClasses]) pyprotocols-1.0a.svn20070625/src/protocols/tests/doctest.py0000644000175000017500000030260210521700707021721 0ustar kovkov# Module doctest. # Released to the public domain 16-Jan-2001, by Tim Peters (tim@python.org). # Major enhancements and refactoring by: # Jim Fulton # Edward Loper # Provided as-is; use at your own risk; no warranty; no promises; enjoy! try: basestring except NameError: basestring = str,unicode try: enumerate except NameError: def enumerate(seq): return zip(range(len(seq)),seq) r"""Module doctest -- a framework for running examples in docstrings. In simplest use, end each module M to be tested with: def _test(): import doctest doctest.testmod() if __name__ == "__main__": _test() Then running the module as a script will cause the examples in the docstrings to get executed and verified: python M.py This won't display anything unless an example fails, in which case the failing example(s) and the cause(s) of the failure(s) are printed to stdout (why not stderr? because stderr is a lame hack <0.2 wink>), and the final line of output is "Test failed.". Run it with the -v switch instead: python M.py -v and a detailed report of all examples tried is printed to stdout, along with assorted summaries at the end. You can force verbose mode by passing "verbose=True" to testmod, or prohibit it by passing "verbose=False". In either of those cases, sys.argv is not examined by testmod. There are a variety of other ways to run doctests, including integration with the unittest framework, and support for running non-Python text files containing doctests. There are also many ways to override parts of doctest's default behaviors. See the Library Reference Manual for details. """ __docformat__ = 'reStructuredText en' __all__ = [ # 0, Option Flags 'register_optionflag', 'DONT_ACCEPT_TRUE_FOR_1', 'DONT_ACCEPT_BLANKLINE', 'NORMALIZE_WHITESPACE', 'ELLIPSIS', 'IGNORE_EXCEPTION_DETAIL', 'COMPARISON_FLAGS', 'REPORT_UDIFF', 'REPORT_CDIFF', 'REPORT_NDIFF', 'REPORT_ONLY_FIRST_FAILURE', 'REPORTING_FLAGS', # 1. Utility Functions 'is_private', # 2. Example & DocTest 'Example', 'DocTest', # 3. Doctest Parser 'DocTestParser', # 4. Doctest Finder 'DocTestFinder', # 5. Doctest Runner 'DocTestRunner', 'OutputChecker', 'DocTestFailure', 'UnexpectedException', 'DebugRunner', # 6. Test Functions 'testmod', 'testfile', 'run_docstring_examples', # 7. Tester 'Tester', # 8. Unittest Support 'DocTestSuite', 'DocFileSuite', 'set_unittest_reportflags', # 9. Debugging Support 'script_from_examples', 'testsource', 'debug_src', 'debug', ] import __future__ import sys, traceback, inspect, linecache, os, re, types import unittest, difflib, pdb, tempfile import warnings from StringIO import StringIO # Don't whine about the deprecated is_private function in this # module's tests. 
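# (The filter below keys on the message text "is_private", the
#  DeprecationWarning category, and this module's own __name__, so only
#  doctest's internal use of the deprecated function is silenced.)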
warnings.filterwarnings("ignore", "is_private", DeprecationWarning, __name__, 0) # There are 4 basic classes: # - Example: a pair, plus an intra-docstring line number. # - DocTest: a collection of examples, parsed from a docstring, plus # info about where the docstring came from (name, filename, lineno). # - DocTestFinder: extracts DocTests from a given object's docstring and # its contained objects' docstrings. # - DocTestRunner: runs DocTest cases, and accumulates statistics. # # So the basic picture is: # # list of: # +------+ +---------+ +-------+ # |object| --DocTestFinder-> | DocTest | --DocTestRunner-> |results| # +------+ +---------+ +-------+ # | Example | # | ... | # | Example | # +---------+ # Option constants. OPTIONFLAGS_BY_NAME = {} def register_optionflag(name): flag = 1 << len(OPTIONFLAGS_BY_NAME) OPTIONFLAGS_BY_NAME[name] = flag return flag DONT_ACCEPT_TRUE_FOR_1 = register_optionflag('DONT_ACCEPT_TRUE_FOR_1') DONT_ACCEPT_BLANKLINE = register_optionflag('DONT_ACCEPT_BLANKLINE') NORMALIZE_WHITESPACE = register_optionflag('NORMALIZE_WHITESPACE') ELLIPSIS = register_optionflag('ELLIPSIS') IGNORE_EXCEPTION_DETAIL = register_optionflag('IGNORE_EXCEPTION_DETAIL') COMPARISON_FLAGS = (DONT_ACCEPT_TRUE_FOR_1 | DONT_ACCEPT_BLANKLINE | NORMALIZE_WHITESPACE | ELLIPSIS | IGNORE_EXCEPTION_DETAIL) REPORT_UDIFF = register_optionflag('REPORT_UDIFF') REPORT_CDIFF = register_optionflag('REPORT_CDIFF') REPORT_NDIFF = register_optionflag('REPORT_NDIFF') REPORT_ONLY_FIRST_FAILURE = register_optionflag('REPORT_ONLY_FIRST_FAILURE') REPORTING_FLAGS = (REPORT_UDIFF | REPORT_CDIFF | REPORT_NDIFF | REPORT_ONLY_FIRST_FAILURE) # Special string markers for use in `want` strings: BLANKLINE_MARKER = '' ELLIPSIS_MARKER = '...' ###################################################################### ## Table of Contents ###################################################################### # 1. Utility Functions # 2. Example & DocTest -- store test cases # 3. DocTest Parser -- extracts examples from strings # 4. DocTest Finder -- extracts test cases from objects # 5. DocTest Runner -- runs test cases # 6. Test Functions -- convenient wrappers for testing # 7. Tester Class -- for backwards compatibility # 8. Unittest Support # 9. Debugging Support # 10. Example Usage ###################################################################### ## 1. Utility Functions ###################################################################### def is_private(prefix, base): """prefix, base -> true iff name prefix + "." + base is "private". Prefix may be an empty string, and base does not contain a period. Prefix is ignored (although functions you write conforming to this protocol may make use of it). Return true iff base begins with an (at least one) underscore, but does not both begin and end with (at least) two underscores. >>> is_private("a.b", "my_func") False >>> is_private("____", "_my_func") True >>> is_private("someclass", "__init__") False >>> is_private("sometypo", "__init_") True >>> is_private("x.y.z", "_") True >>> is_private("_x.y.z", "__") False >>> is_private("", "") # senseless but consistent False """ warnings.warn("is_private is deprecated; it wasn't useful; " "examine DocTestFinder.find() lists instead", DeprecationWarning, stacklevel=2) return base[:1] == "_" and not base[:2] == "__" == base[-2:] def _extract_future_flags(globs): """ Return the compiler-flags associated with the future features that have been imported into the given namespace (globs). 
""" flags = 0 for fname in __future__.all_feature_names: feature = globs.get(fname, None) if feature is getattr(__future__, fname): flags |= feature.compiler_flag return flags def _normalize_module(module, depth=2): """ Return the module specified by `module`. In particular: - If `module` is a module, then return module. - If `module` is a string, then import and return the module with that name. - If `module` is None, then return the calling module. The calling module is assumed to be the module of the stack frame at the given depth in the call stack. """ if inspect.ismodule(module): return module elif isinstance(module, (str, unicode)): return __import__(module, globals(), locals(), ["*"]) elif module is None: return sys.modules[sys._getframe(depth).f_globals['__name__']] else: raise TypeError("Expected a module, string, or None") def _indent(s, indent=4): """ Add the given number of space characters to the beginning every non-blank line in `s`, and return the result. """ # This regexp matches the start of non-blank lines: return re.sub('(?m)^(?!$)', indent*' ', s) def _exception_traceback(exc_info): """ Return a string containing a traceback message for the given exc_info tuple (as returned by sys.exc_info()). """ # Get a traceback message. excout = StringIO() exc_type, exc_val, exc_tb = exc_info traceback.print_exception(exc_type, exc_val, exc_tb, file=excout) return excout.getvalue() # Override some StringIO methods. class _SpoofOut(StringIO): def getvalue(self): result = StringIO.getvalue(self) # If anything at all was written, make sure there's a trailing # newline. There's no way for the expected output to indicate # that a trailing newline is missing. if result and not result.endswith("\n"): result += "\n" # Prevent softspace from screwing up the next test case, in # case they used print with a trailing comma in an example. if hasattr(self, "softspace"): del self.softspace return result def truncate(self, size=None): StringIO.truncate(self, size) if hasattr(self, "softspace"): del self.softspace # Worst-case linear-time ellipsis matching. def _ellipsis_match(want, got): """ Essentially the only subtle case: >>> _ellipsis_match('aa...aa', 'aaa') False """ if want.find(ELLIPSIS_MARKER)==-1: return want == got # Find "the real" strings. ws = want.split(ELLIPSIS_MARKER) assert len(ws) >= 2 # Deal with exact matches possibly needed at one or both ends. startpos, endpos = 0, len(got) w = ws[0] if w: # starts with exact match if got.startswith(w): startpos = len(w) del ws[0] else: return False w = ws[-1] if w: # ends with exact match if got.endswith(w): endpos -= len(w) del ws[-1] else: return False if startpos > endpos: # Exact end matches required more characters than we have, as in # _ellipsis_match('aa...aa', 'aaa') return False # For the rest, we only need to find the leftmost non-overlapping # match for each piece. If there's no overall match that way alone, # there's no overall match period. for w in ws: # w may be '' at times, if there are consecutive ellipses, or # due to an ellipsis at the start or end of `want`. That's OK. # Search for an empty string succeeds, and doesn't change startpos. startpos = got.find(w, startpos, endpos) if startpos < 0: return False startpos += len(w) return True def _comment_line(line): "Return a commented form of the given line" line = line.rstrip() if line: return '# '+line else: return '#' class _OutputRedirectingPdb(pdb.Pdb): """ A specialized version of the python debugger that redirects stdout to a given stream when interacting with the user. 
    Stdout is *not* redirected when traced code is executed.
    """
    def __init__(self, out):
        self.__out = out
        pdb.Pdb.__init__(self)

    def trace_dispatch(self, *args):
        # Redirect stdout to the given stream.
        save_stdout = sys.stdout
        sys.stdout = self.__out
        # Call Pdb's trace dispatch method.
        try:
            return pdb.Pdb.trace_dispatch(self, *args)
        finally:
            sys.stdout = save_stdout

# [XX] Normalize with respect to os.path.pardir?
def _module_relative_path(module, path):
    if not inspect.ismodule(module):
        raise TypeError, 'Expected a module: %r' % module
    if path.startswith('/'):
        raise ValueError, 'Module-relative files may not have absolute paths'

    # Find the base directory for the path.
    if hasattr(module, '__file__'):
        # A normal module/package
        basedir = os.path.split(module.__file__)[0]
    elif module.__name__ == '__main__':
        # An interactive session.
        if len(sys.argv) > 0 and sys.argv[0] != '':
            basedir = os.path.split(sys.argv[0])[0]
        else:
            basedir = os.curdir
    else:
        # A module w/o __file__ (this includes builtins)
        raise ValueError("Can't resolve paths relative to the module %r "
                         "(it has no __file__)" % (module,))

    # Combine the base directory and the path.
    return os.path.join(basedir, *(path.split('/')))

######################################################################
## 2. Example & DocTest
######################################################################
## - An "example" is a <source, want> pair, where "source" is a
##   fragment of source code, and "want" is the expected output for
##   "source."  The Example class also includes information about
##   where the example was extracted from.
##
## - A "doctest" is a collection of examples, typically extracted from
##   a string (such as an object's docstring).  The DocTest class also
##   includes information about where the string was extracted from.

class Example:
    """
    A single doctest example, consisting of source code and expected
    output.  `Example` defines the following attributes:

      - source: A single Python statement, always ending with a newline.
        The constructor adds a newline if needed.

      - want: The expected output from running the source code (either
        from stdout, or a traceback in case of exception).  `want` ends
        with a newline unless it's empty, in which case it's an empty
        string.  The constructor adds a newline if needed.

      - exc_msg: The exception message generated by the example, if
        the example is expected to generate an exception; or `None` if
        it is not expected to generate an exception.  This exception
        message is compared against the return value of
        `traceback.format_exception_only()`.  `exc_msg` ends with a
        newline unless it's `None`.  The constructor adds a newline
        if needed.

      - lineno: The line number within the DocTest string containing
        this Example where the Example begins.  This line number is
        zero-based, with respect to the beginning of the DocTest.

      - indent: The example's indentation in the DocTest string.
        I.e., the number of space characters that precede the
        example's first prompt.

      - options: A dictionary mapping from option flags to True or
        False, which is used to override default options for this
        example.  Any option flags not contained in this dictionary
        are left at their default value (as specified by the
        DocTestRunner's optionflags).  By default, no options are set.
    """
    def __init__(self, source, want, exc_msg=None, lineno=0, indent=0,
                 options=None):
        # Normalize inputs.
        if not source.endswith('\n'):
            source += '\n'
        if want and not want.endswith('\n'):
            want += '\n'
        if exc_msg is not None and not exc_msg.endswith('\n'):
            exc_msg += '\n'
        # Store properties.
        self.source = source
        self.want = want
        self.lineno = lineno
        self.indent = indent
        if options is None:
            options = {}
        self.options = options
        self.exc_msg = exc_msg

class DocTest:
    """
    A collection of doctest examples that should be run in a single
    namespace.  Each `DocTest` defines the following attributes:

      - examples: the list of examples.

      - globs: The namespace (aka globals) that the examples should
        be run in.

      - name: A name identifying the DocTest (typically, the name of
        the object whose docstring this DocTest was extracted from).

      - filename: The name of the file that this DocTest was extracted
        from, or `None` if the filename is unknown.

      - lineno: The line number within filename where this DocTest
        begins, or `None` if the line number is unavailable.  This
        line number is zero-based, with respect to the beginning of
        the file.

      - docstring: The string that the examples were extracted from,
        or `None` if the string is unavailable.
    """
    def __init__(self, examples, globs, name, filename, lineno,
                 docstring):
        """
        Create a new DocTest containing the given examples.  The
        DocTest's globals are initialized with a copy of `globs`.
        """
        assert not isinstance(examples, basestring), \
               "DocTest no longer accepts str; use DocTestParser instead"
        self.examples = examples
        self.docstring = docstring
        self.globs = globs.copy()
        self.name = name
        self.filename = filename
        self.lineno = lineno

    def __repr__(self):
        if len(self.examples) == 0:
            examples = 'no examples'
        elif len(self.examples) == 1:
            examples = '1 example'
        else:
            examples = '%d examples' % len(self.examples)
        return ('<DocTest %s from %s:%s (%s)>' %
                (self.name, self.filename, self.lineno, examples))

    # This lets us sort tests by name:
    def __cmp__(self, other):
        if not isinstance(other, DocTest):
            return -1
        return cmp((self.name, self.filename, self.lineno, id(self)),
                   (other.name, other.filename, other.lineno, id(other)))

######################################################################
## 3. DocTestParser
######################################################################

class DocTestParser:
    """
    A class used to parse strings containing doctest examples.
    """
    # This regular expression is used to find doctest examples in a
    # string.  It defines three groups: `source` is the source code
    # (including leading indentation and prompts); `indent` is the
    # indentation of the first (PS1) line of the source code; and
    # `want` is the expected output (including leading indentation).
    _EXAMPLE_RE = re.compile(r'''
        # Source consists of a PS1 line followed by zero or more PS2 lines.
        (?P<source>
            (?:^(?P<indent> [ ]*) >>>    .*)    # PS1 line
            (?:\n           [ ]*  \.\.\. .*)*)  # PS2 lines
        \n?
        # Want consists of any non-blank lines that do not start with PS1.
        (?P<want> (?:(?![ ]*$)    # Not a blank line
                     (?![ ]*>>>)  # Not a line starting with PS1
                     .*$\n?       # But any other line
                  )*)
        ''', re.MULTILINE | re.VERBOSE)

    # A regular expression for handling `want` strings that contain
    # expected exceptions.  It divides `want` into three pieces:
    #    - the traceback header line (`hdr`)
    #    - the traceback stack (`stack`)
    #    - the exception message (`msg`), as generated by
    #      traceback.format_exception_only()
    # `msg` may have multiple lines.  We assume/require that the
    # exception message is the first non-indented line starting with a word
    # character following the traceback header line.
    _EXCEPTION_RE = re.compile(r"""
        # Grab the traceback header.  Different versions of Python have
        # said different things on the first traceback line.
        ^(?P<hdr> Traceback\ \(
            (?: most\ recent\ call\ last
            |   innermost\ last
            ) \) :
        )
        \s* $                # toss trailing whitespace on the header.
        (?P<stack> .*?)      # don't blink: absorb stuff until...
        ^ (?P<msg> \w+ .*)   #     a line *starts* with alphanum.
        """, re.VERBOSE | re.MULTILINE | re.DOTALL)

    # A callable returning a true value iff its argument is a blank line
    # or contains a single comment.
    _IS_BLANK_OR_COMMENT = re.compile(r'^[ ]*(#.*)?$').match

    def parse(self, string, name='<string>'):
        """
        Divide the given string into examples and intervening text,
        and return them as a list of alternating Examples and strings.
        Line numbers for the Examples are 0-based.  The optional
        argument `name` is a name identifying this string, and is only
        used for error messages.
        """
        string = string.expandtabs()
        # If all lines begin with the same indentation, then strip it.
        min_indent = self._min_indent(string)
        if min_indent > 0:
            string = '\n'.join([l[min_indent:] for l in string.split('\n')])

        output = []
        charno, lineno = 0, 0
        # Find all doctest examples in the string:
        for m in self._EXAMPLE_RE.finditer(string):
            # Add the pre-example text to `output`.
            output.append(string[charno:m.start()])
            # Update lineno (lines before this example)
            lineno += string.count('\n', charno, m.start())
            # Extract info from the regexp match.
            (source, options, want, exc_msg) = \
                     self._parse_example(m, name, lineno)
            # Create an Example, and add it to the list.
            if not self._IS_BLANK_OR_COMMENT(source):
                output.append( Example(source, want, exc_msg,
                                    lineno=lineno,
                                    indent=min_indent+len(m.group('indent')),
                                    options=options) )
            # Update lineno (lines inside this example)
            lineno += string.count('\n', m.start(), m.end())
            # Update charno.
            charno = m.end()
        # Add any remaining post-example text to `output`.
        output.append(string[charno:])
        return output

    def get_doctest(self, string, globs, name, filename, lineno):
        """
        Extract all doctest examples from the given string, and
        collect them into a `DocTest` object.

        `globs`, `name`, `filename`, and `lineno` are attributes for
        the new `DocTest` object.  See the documentation for `DocTest`
        for more information.
        """
        return DocTest(self.get_examples(string, name), globs,
                       name, filename, lineno, string)

    def get_examples(self, string, name='<string>'):
        """
        Extract all doctest examples from the given string, and return
        them as a list of `Example` objects.  Line numbers are 0-based,
        because it's most common in doctests that nothing interesting
        appears on the same line as opening triple-quote, and so the
        first interesting line is called \"line 1\" then.

        The optional argument `name` is a name identifying this
        string, and is only used for error messages.
        """
        return [x for x in self.parse(string, name)
                if isinstance(x, Example)]

    def _parse_example(self, m, name, lineno):
        """
        Given a regular expression match from `_EXAMPLE_RE` (`m`),
        return a tuple `(source, options, want, exc_msg)`, where
        `source` is the matched example's source code (with prompts
        and indentation stripped); and `want` is the example's
        expected output (with indentation stripped).

        `name` is the string's name, and `lineno` is the line number
        where the example starts; both are used for error messages.
        """
        # Get the example's indentation level.
        indent = len(m.group('indent'))

        # Divide source into lines; check that they're properly
        # indented; and then strip their indentation & prompts.
        source_lines = m.group('source').split('\n')
        self._check_prompt_blank(source_lines, indent, name, lineno)
        self._check_prefix(source_lines[1:], ' '*indent + '.', name, lineno)
        source = '\n'.join([sl[indent+4:] for sl in source_lines])

        # Divide want into lines; check that it's properly indented; and
        # then strip the indentation.
Spaces before the last newline should # be preserved, so plain rstrip() isn't good enough. want = m.group('want') want_lines = want.split('\n') if len(want_lines) > 1 and re.match(r' *$', want_lines[-1]): del want_lines[-1] # forget final newline & spaces after it self._check_prefix(want_lines, ' '*indent, name, lineno + len(source_lines)) want = '\n'.join([wl[indent:] for wl in want_lines]) # If `want` contains a traceback message, then extract it. m = self._EXCEPTION_RE.match(want) if m: exc_msg = m.group('msg') else: exc_msg = None # Extract options from the source. options = self._find_options(source, name, lineno) return source, options, want, exc_msg # This regular expression looks for option directives in the # source code of an example. Option directives are comments # starting with "doctest:". Warning: this may give false # positives for string-literals that contain the string # "#doctest:". Eliminating these false positives would require # actually parsing the string; but we limit them by ignoring any # line containing "#doctest:" that is *followed* by a quote mark. _OPTION_DIRECTIVE_RE = re.compile(r'#\s*doctest:\s*([^\n\'"]*)$', re.MULTILINE) def _find_options(self, source, name, lineno): """ Return a dictionary containing option overrides extracted from option directives in the given source string. `name` is the string's name, and `lineno` is the line number where the example starts; both are used for error messages. """ options = {} # (note: with the current regexp, this will match at most once:) for m in self._OPTION_DIRECTIVE_RE.finditer(source): option_strings = m.group(1).replace(',', ' ').split() for option in option_strings: if (option[0] not in '+-' or option[1:] not in OPTIONFLAGS_BY_NAME): raise ValueError('line %r of the doctest for %s ' 'has an invalid option: %r' % (lineno+1, name, option)) flag = OPTIONFLAGS_BY_NAME[option[1:]] options[flag] = (option[0] == '+') if options and self._IS_BLANK_OR_COMMENT(source): raise ValueError('line %r of the doctest for %s has an option ' 'directive on a line with no example: %r' % (lineno, name, source)) return options # This regular expression finds the indentation of every non-blank # line in a string. _INDENT_RE = re.compile('^([ ]*)(?=\S)', re.MULTILINE) def _min_indent(self, s): "Return the minimum indentation of any non-blank line in `s`" indents = [len(indent) for indent in self._INDENT_RE.findall(s)] if len(indents) > 0: return min(indents) else: return 0 def _check_prompt_blank(self, lines, indent, name, lineno): """ Given the lines of a source string (including prompts and leading indentation), check to make sure that every prompt is followed by a space character. If any line is not followed by a space character, then raise ValueError. """ for i, line in enumerate(lines): if len(line) >= indent+4 and line[indent+3] != ' ': raise ValueError('line %r of the docstring for %s ' 'lacks blank after %s: %r' % (lineno+i+1, name, line[indent:indent+3], line)) def _check_prefix(self, lines, prefix, name, lineno): """ Check that every line in the given list starts with the given prefix; if any line does not, then raise a ValueError. """ for i, line in enumerate(lines): if line and not line.startswith(prefix): raise ValueError('line %r of the docstring for %s has ' 'inconsistent leading whitespace: %r' % (lineno+i+1, name, line)) ###################################################################### ## 4. 
DocTest Finder ###################################################################### class DocTestFinder: """ A class used to extract the DocTests that are relevant to a given object, from its docstring and the docstrings of its contained objects. Doctests can currently be extracted from the following object types: modules, functions, classes, methods, staticmethods, classmethods, and properties. """ def __init__(self, verbose=False, parser=DocTestParser(), recurse=True, _namefilter=None, exclude_empty=True): """ Create a new doctest finder. The optional argument `parser` specifies a class or function that should be used to create new DocTest objects (or objects that implement the same interface as DocTest). The signature for this factory function should match the signature of the DocTest constructor. If the optional argument `recurse` is false, then `find` will only examine the given object, and not any contained objects. If the optional argument `exclude_empty` is false, then `find` will include tests for objects with empty docstrings. """ self._parser = parser self._verbose = verbose self._recurse = recurse self._exclude_empty = exclude_empty # _namefilter is undocumented, and exists only for temporary backward- # compatibility support of testmod's deprecated isprivate mess. self._namefilter = _namefilter def find(self, obj, name=None, module=None, globs=None, extraglobs=None): """ Return a list of the DocTests that are defined by the given object's docstring, or by any of its contained objects' docstrings. The optional parameter `module` is the module that contains the given object. If the module is not specified or is None, then the test finder will attempt to automatically determine the correct module. The object's module is used: - As a default namespace, if `globs` is not specified. - To prevent the DocTestFinder from extracting DocTests from objects that are imported from other modules. - To find the name of the file containing the object. - To help find the line number of the object within its file. Contained objects whose module does not match `module` are ignored. If `module` is False, no attempt to find the module will be made. This is obscure, of use mostly in tests: if `module` is False, or is None but cannot be found automatically, then all objects are considered to belong to the (non-existent) module, so all contained objects will (recursively) be searched for doctests. The globals for each DocTest is formed by combining `globs` and `extraglobs` (bindings in `extraglobs` override bindings in `globs`). A new copy of the globals dictionary is created for each DocTest. If `globs` is not specified, then it defaults to the module's `__dict__`, if specified, or {} otherwise. If `extraglobs` is not specified, then it defaults to {}. """ # If name was not specified, then extract it from the object. if name is None: name = getattr(obj, '__name__', None) if name is None: raise ValueError("DocTestFinder.find: name must be given " "when obj.__name__ doesn't exist: %r" % (type(obj),)) # Find the module that contains the given object (if obj is # a module, then module=obj.). Note: this may fail, in which # case module will be None. if module is False: module = None elif module is None: module = inspect.getmodule(obj) # Read the module's source code. This is used by # DocTestFinder._find_lineno to find the line number for a # given object's docstring. 
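        # A minimal usage sketch for the finder (an illustrative aside;
        # `sample` is a throwaway function, not part of this module):
        #
        #     def sample(x):
        #         """
        #         >>> sample(2)
        #         4
        #         """
        #         return x * x
        #
        #     finder = DocTestFinder(recurse=False)
        #     tests = finder.find(sample, 'sample', module=False,
        #                         globs={'sample': sample})
        #
        # `tests` is then a list holding one DocTest of one Example, and
        # each DocTest gets its own copy of the supplied globals.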
        try:
            file = inspect.getsourcefile(obj) or inspect.getfile(obj)
            source_lines = linecache.getlines(file)
            if not source_lines:
                source_lines = None
        except TypeError:
            source_lines = None

        # Initialize globals, and merge in extraglobs.
        if globs is None:
            if module is None:
                globs = {}
            else:
                globs = module.__dict__.copy()
        else:
            globs = globs.copy()
        if extraglobs is not None:
            globs.update(extraglobs)

        # Recursively explore `obj`, extracting DocTests.
        tests = []
        self._find(tests, obj, name, module, source_lines, globs, {})
        return tests

    def _filter(self, obj, prefix, base):
        """
        Return true if the given object should not be examined.
        """
        return (self._namefilter is not None and
                self._namefilter(prefix, base))

    def _from_module(self, module, object):
        """
        Return true if the given object is defined in the given
        module.
        """
        if module is None:
            return True
        elif inspect.isfunction(object):
            return module.__dict__ is object.func_globals
        elif inspect.isclass(object):
            return module.__name__ == object.__module__
        elif inspect.getmodule(object) is not None:
            return module is inspect.getmodule(object)
        elif hasattr(object, '__module__'):
            return module.__name__ == object.__module__
        elif isinstance(object, property):
            return True # [XX] no way to be sure.
        else:
            raise ValueError("object must be a class or function")

    def _find(self, tests, obj, name, module, source_lines, globs, seen):
        """
        Find tests for the given object and any contained objects, and
        add them to `tests`.
        """
        if self._verbose:
            print 'Finding tests in %s' % name

        # If we've already processed this object, then ignore it.
        if id(obj) in seen:
            return
        seen[id(obj)] = 1

        # Find a test for this object, and add it to the list of tests.
        test = self._get_test(obj, name, module, globs, source_lines)
        if test is not None:
            tests.append(test)

        # Look for tests in a module's contained objects.
        if inspect.ismodule(obj) and self._recurse:
            for valname, val in obj.__dict__.items():
                # Check if this contained object should be ignored.
                if self._filter(val, name, valname):
                    continue
                valname = '%s.%s' % (name, valname)
                # Recurse to functions & classes.
                if ((inspect.isfunction(val) or inspect.isclass(val)) and
                    self._from_module(module, val)):
                    self._find(tests, val, valname, module, source_lines,
                               globs, seen)

        # Look for tests in a module's __test__ dictionary.
        if inspect.ismodule(obj) and self._recurse:
            for valname, val in getattr(obj, '__test__', {}).items():
                if not isinstance(valname, basestring):
                    raise ValueError("DocTestFinder.find: __test__ keys "
                                     "must be strings: %r" %
                                     (type(valname),))
                if not (inspect.isfunction(val) or inspect.isclass(val) or
                        inspect.ismethod(val) or inspect.ismodule(val) or
                        isinstance(val, basestring)):
                    raise ValueError("DocTestFinder.find: __test__ values "
                                     "must be strings, functions, methods, "
                                     "classes, or modules: %r" %
                                     (type(val),))
                valname = '%s.__test__.%s' % (name, valname)
                self._find(tests, val, valname, module, source_lines,
                           globs, seen)

        # Look for tests in a class's contained objects.
        if inspect.isclass(obj) and self._recurse:
            for valname, val in obj.__dict__.items():
                # Check if this contained object should be ignored.
                if self._filter(val, name, valname):
                    continue
                # Special handling for staticmethod/classmethod.
                if isinstance(val, staticmethod):
                    val = getattr(obj, valname)
                if isinstance(val, classmethod):
                    val = getattr(obj, valname).im_func

                # Recurse to methods, properties, and nested classes.
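                # For instance (an illustrative sketch): given
                #
                #     class C:
                #         def method(self):
                #             """
                #             >>> 1 + 1
                #             2
                #             """
                #
                # the recursion below yields a DocTest named after the
                # dotted path, e.g. 'mymod.C.method' (names here are
                # illustrative).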
if ((inspect.isfunction(val) or inspect.isclass(val) or isinstance(val, property)) and self._from_module(module, val)): valname = '%s.%s' % (name, valname) self._find(tests, val, valname, module, source_lines, globs, seen) def _get_test(self, obj, name, module, globs, source_lines): """ Return a DocTest for the given object, if it defines a docstring; otherwise, return None. """ # Extract the object's docstring. If it doesn't have one, # then return None (no test for this object). if isinstance(obj, basestring): docstring = obj else: try: if obj.__doc__ is None: docstring = '' else: docstring = obj.__doc__ if not isinstance(docstring, basestring): docstring = str(docstring) except (TypeError, AttributeError): docstring = '' # Find the docstring's location in the file. lineno = self._find_lineno(obj, source_lines) # Don't bother if the docstring is empty. if self._exclude_empty and not docstring: return None # Return a DocTest for this object. if module is None: filename = None else: filename = getattr(module, '__file__', module.__name__) if filename[-4:] in (".pyc", ".pyo"): filename = filename[:-1] return self._parser.get_doctest(docstring, globs, name, filename, lineno) def _find_lineno(self, obj, source_lines): """ Return a line number of the given object's docstring. Note: this method assumes that the object has a docstring. """ lineno = None # Find the line number for modules. if inspect.ismodule(obj): lineno = 0 # Find the line number for classes. # Note: this could be fooled if a class is defined multiple # times in a single file. if inspect.isclass(obj): if source_lines is None: return None pat = re.compile(r'^\s*class\s*%s\b' % getattr(obj, '__name__', '-')) for i, line in enumerate(source_lines): if pat.match(line): lineno = i break # Find the line number for functions & methods. if inspect.ismethod(obj): obj = obj.im_func if inspect.isfunction(obj): obj = obj.func_code if inspect.istraceback(obj): obj = obj.tb_frame if inspect.isframe(obj): obj = obj.f_code if inspect.iscode(obj): lineno = getattr(obj, 'co_firstlineno', None)-1 # Find the line number where the docstring starts. Assume # that it's the first line that begins with a quote mark. # Note: this could be fooled by a multiline function # signature, where a continuation line begins with a quote # mark. if lineno is not None: if source_lines is None: return lineno+1 pat = re.compile('(^|.*:)\s*\w*("|\')') for lineno in range(lineno, len(source_lines)): if pat.match(source_lines[lineno]): return lineno # We couldn't find the line number. return None ###################################################################### ## 5. DocTest Runner ###################################################################### class DocTestRunner: """ A class used to run DocTest test cases, and accumulate statistics. The `run` method is used to process a single DocTest case. It returns a tuple `(f, t)`, where `t` is the number of test cases tried, and `f` is the number of test cases that failed. >>> tests = DocTestFinder().find(_TestClass) >>> runner = DocTestRunner(verbose=False) >>> for test in tests: ... print runner.run(test) (0, 2) (0, 1) (0, 2) (0, 2) The `summarize` method prints a summary of all the test cases that have been run by the runner, and returns an aggregated `(f, t)` tuple: >>> runner.summarize(verbose=1) 4 items passed all tests: 2 tests in _TestClass 2 tests in _TestClass.__init__ 2 tests in _TestClass.get 1 tests in _TestClass.square 7 tests in 4 items. 7 passed and 0 failed. Test passed. 
      (0, 7)

    The aggregated number of tried examples and failed examples is
    also available via the `tries` and `failures` attributes:

      >>> runner.tries
      7
      >>> runner.failures
      0

    The comparison between expected outputs and actual outputs is
    done by an `OutputChecker`.  This comparison may be customized
    with a number of option flags; see the documentation for
    `testmod` for more information.  If the option flags are
    insufficient, then the comparison may also be customized by
    passing a subclass of `OutputChecker` to the constructor.

    The test runner's display output can be controlled in two ways.
    First, an output function (`out`) can be passed to
    `TestRunner.run`; this function will be called with strings that
    should be displayed.  It defaults to `sys.stdout.write`.  If
    capturing the output is not sufficient, then the display output
    can also be customized by subclassing DocTestRunner, and
    overriding the methods `report_start`, `report_success`,
    `report_unexpected_exception`, and `report_failure`.
    """
    # This divider string is used to separate failure messages, and to
    # separate sections of the summary.
    DIVIDER = "*" * 70

    def __init__(self, checker=None, verbose=None, optionflags=0):
        """
        Create a new test runner.

        Optional keyword arg `checker` is the `OutputChecker` that
        should be used to compare the expected outputs and actual
        outputs of doctest examples.

        Optional keyword arg 'verbose' prints lots of stuff if true,
        only failures if false; by default, it's true iff '-v' is in
        sys.argv.

        Optional argument `optionflags` can be used to control how the
        test runner compares expected output to actual output, and how
        it displays failures.  See the documentation for `testmod` for
        more information.
        """
        self._checker = checker or OutputChecker()
        if verbose is None:
            verbose = '-v' in sys.argv
        self._verbose = verbose
        self.optionflags = optionflags
        self.original_optionflags = optionflags

        # Keep track of the examples we've run.
        self.tries = 0
        self.failures = 0
        self._name2ft = {}

        # Create a fake output target for capturing doctest output.
        self._fakeout = _SpoofOut()

    #/////////////////////////////////////////////////////////////////
    # Reporting methods
    #/////////////////////////////////////////////////////////////////

    def report_start(self, out, test, example):
        """
        Report that the test runner is about to process the given
        example.  (Only displays a message if verbose=True)
        """
        if self._verbose:
            if example.want:
                out('Trying:\n' + _indent(example.source) +
                    'Expecting:\n' + _indent(example.want))
            else:
                out('Trying:\n' + _indent(example.source) +
                    'Expecting nothing\n')

    def report_success(self, out, test, example, got):
        """
        Report that the given example ran successfully.  (Only
        displays a message if verbose=True)
        """
        if self._verbose:
            out("ok\n")

    def report_failure(self, out, test, example, got):
        """
        Report that the given example failed.
        """
        out(self._failure_header(test, example) +
            self._checker.output_difference(example, got, self.optionflags))

    def report_unexpected_exception(self, out, test, example, exc_info):
        """
        Report that the given example raised an unexpected exception.
        """
        out(self._failure_header(test, example) +
            'Exception raised:\n' + _indent(_exception_traceback(exc_info)))

    def _failure_header(self, test, example):
        out = [self.DIVIDER]
        if test.filename:
            if test.lineno is not None and example.lineno is not None:
                lineno = test.lineno + example.lineno + 1
            else:
                lineno = '?'
            out.append('File "%s", line %s, in %s' %
                       (test.filename, lineno, test.name))
        else:
            out.append('Line %s, in %s' % (example.lineno+1, test.name))
        out.append('Failed example:')
        source = example.source
        out.append(_indent(source))
        return '\n'.join(out)

    #/////////////////////////////////////////////////////////////////
    # DocTest Running
    #/////////////////////////////////////////////////////////////////

    def __run(self, test, compileflags, out):
        """
        Run the examples in `test`.  Write the outcome of each example
        with one of the `DocTestRunner.report_*` methods, using the
        writer function `out`.  `compileflags` is the set of compiler
        flags that should be used to execute examples.  Return a tuple
        `(f, t)`, where `t` is the number of examples tried, and `f`
        is the number of examples that failed.  The examples are run
        in the namespace `test.globs`.
        """
        # Keep track of the number of failures and tries.
        failures = tries = 0

        # Save the option flags (since option directives can be used
        # to modify them).
        original_optionflags = self.optionflags

        SUCCESS, FAILURE, BOOM = range(3) # `outcome` state

        check = self._checker.check_output

        # Process each example.
        for examplenum, example in enumerate(test.examples):

            # If REPORT_ONLY_FIRST_FAILURE is set, then suppress
            # reporting after the first failure.
            quiet = (self.optionflags & REPORT_ONLY_FIRST_FAILURE and
                     failures > 0)

            # Merge in the example's options.
            self.optionflags = original_optionflags
            if example.options:
                for (optionflag, val) in example.options.items():
                    if val:
                        self.optionflags |= optionflag
                    else:
                        self.optionflags &= ~optionflag

            # Record that we started this example.
            tries += 1
            if not quiet:
                self.report_start(out, test, example)

            # Use a special filename for compile(), so we can retrieve
            # the source code during interactive debugging (see
            # __patched_linecache_getlines).
            filename = '<doctest %s[%d]>' % (test.name, examplenum)

            # Run the example in the given context (globs), and record
            # any exception that gets raised.  (But don't intercept
            # keyboard interrupts.)
            try:
                # Don't blink!  This is where the user's code gets run.
                exec compile(example.source, filename, "single",
                             compileflags, 1) in test.globs
                self.debugger.set_continue() # ==== Example Finished ====
                exception = None
            except KeyboardInterrupt:
                raise
            except:
                exception = sys.exc_info()
                self.debugger.set_continue() # ==== Example Finished ====

            got = self._fakeout.getvalue()  # the actual output
            self._fakeout.truncate(0)
            outcome = FAILURE   # guilty until proved innocent or insane

            # If the example executed without raising any exceptions,
            # verify its output.
            if exception is None:
                if check(example.want, got, self.optionflags):
                    outcome = SUCCESS

            # The example raised an exception:  check if it was expected.
            else:
                exc_info = sys.exc_info()
                exc_msg = traceback.format_exception_only(*exc_info[:2])[-1]
                if not quiet:
                    got += _exception_traceback(exc_info)

                # If `example.exc_msg` is None, then we weren't expecting
                # an exception.
                if example.exc_msg is None:
                    outcome = BOOM

                # We expected an exception:  see whether it matches.
                elif check(example.exc_msg, exc_msg, self.optionflags):
                    outcome = SUCCESS

                # Another chance if they didn't care about the detail.
                elif self.optionflags & IGNORE_EXCEPTION_DETAIL:
                    m1 = re.match(r'[^:]*:', example.exc_msg)
                    m2 = re.match(r'[^:]*:', exc_msg)
                    if m1 and m2 and check(m1.group(0), m2.group(0),
                                           self.optionflags):
                        outcome = SUCCESS

            # Report the outcome.
            if outcome is SUCCESS:
                if not quiet:
                    self.report_success(out, test, example, got)
            elif outcome is FAILURE:
                if not quiet:
                    self.report_failure(out, test, example, got)
                failures += 1
            elif outcome is BOOM:
                if not quiet:
                    self.report_unexpected_exception(out, test, example,
                                                     exc_info)
                failures += 1
            else:
                assert False, ("unknown outcome", outcome)

        # Restore the option flags (in case they were modified)
        self.optionflags = original_optionflags

        # Record and return the number of failures and tries.
        self.__record_outcome(test, failures, tries)
        return failures, tries

    def __record_outcome(self, test, f, t):
        """
        Record the fact that the given DocTest (`test`) generated `f`
        failures out of `t` tried examples.
        """
        f2, t2 = self._name2ft.get(test.name, (0,0))
        self._name2ft[test.name] = (f+f2, t+t2)
        self.failures += f
        self.tries += t

    __LINECACHE_FILENAME_RE = re.compile(r'<doctest '
                                         r'(?P<name>[\w\.]+)'
                                         r'\[(?P<examplenum>\d+)\]>$')
    def __patched_linecache_getlines(self, filename, module_globals=None):
        m = self.__LINECACHE_FILENAME_RE.match(filename)
        if m and m.group('name') == self.test.name:
            example = self.test.examples[int(m.group('examplenum'))]
            return example.source.splitlines(True)
        elif self.save_linecache_getlines.func_code.co_argcount > 1:
            return self.save_linecache_getlines(filename, module_globals)
        else:
            return self.save_linecache_getlines(filename)

    def run(self, test, compileflags=None, out=None, clear_globs=True):
        """
        Run the examples in `test`, and display the results using the
        writer function `out`.

        The examples are run in the namespace `test.globs`.  If
        `clear_globs` is true (the default), then this namespace will
        be cleared after the test runs, to help with garbage
        collection.  If you would like to examine the namespace after
        the test completes, then use `clear_globs=False`.

        `compileflags` gives the set of flags that should be used by
        the Python compiler when running the examples.  If not
        specified, then it will default to the set of future-import
        flags that apply to `globs`.

        The output of each example is checked using the runner's
        `OutputChecker`, and the results are formatted by the
        `DocTestRunner.report_*` methods.
        """
        self.test = test

        if compileflags is None:
            compileflags = _extract_future_flags(test.globs)

        save_stdout = sys.stdout
        if out is None:
            out = save_stdout.write
        sys.stdout = self._fakeout

        # Patch pdb.set_trace to restore sys.stdout during interactive
        # debugging (so it's not still redirected to self._fakeout).
        # Note that the interactive output will go to *our*
        # save_stdout, even if that's not the real sys.stdout; this
        # allows us to write test cases for the set_trace behavior.
        save_set_trace = pdb.set_trace
        self.debugger = _OutputRedirectingPdb(save_stdout)
        self.debugger.reset()
        pdb.set_trace = self.debugger.set_trace

        # Patch linecache.getlines, so we can see the example's source
        # when we're inside the debugger.
        self.save_linecache_getlines = linecache.getlines
        linecache.getlines = self.__patched_linecache_getlines

        try:
            return self.__run(test, compileflags, out)
        finally:
            sys.stdout = save_stdout
            pdb.set_trace = save_set_trace
            linecache.getlines = self.save_linecache_getlines
            if clear_globs:
                test.globs.clear()

    #/////////////////////////////////////////////////////////////////
    # Summarization
    #/////////////////////////////////////////////////////////////////
    def summarize(self, verbose=None):
        """
        Print a summary of all the test cases that have been run by
        this DocTestRunner, and return a tuple `(f, t)`, where `f` is
        the total number of failed examples, and `t` is the total
        number of tried examples.
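        For example, a runner that has not yet recorded any outcomes
        prints nothing when it is not verbose, and returns zero counts
        (a small self-contained illustration):

            >>> DocTestRunner(verbose=False).summarize()
            (0, 0)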
        The optional `verbose` argument controls how detailed the
        summary is.  If the verbosity is not specified, then the
        DocTestRunner's verbosity is used.
        """
        if verbose is None:
            verbose = self._verbose
        notests = []
        passed = []
        failed = []
        totalt = totalf = 0
        for x in self._name2ft.items():
            name, (f, t) = x
            assert f <= t
            totalt += t
            totalf += f
            if t == 0:
                notests.append(name)
            elif f == 0:
                passed.append( (name, t) )
            else:
                failed.append(x)
        if verbose:
            if notests:
                print len(notests), "items had no tests:"
                notests.sort()
                for thing in notests:
                    print "   ", thing
            if passed:
                print len(passed), "items passed all tests:"
                passed.sort()
                for thing, count in passed:
                    print " %3d tests in %s" % (count, thing)
        if failed:
            print self.DIVIDER
            print len(failed), "items had failures:"
            failed.sort()
            for thing, (f, t) in failed:
                print " %3d of %3d in %s" % (f, t, thing)
        if verbose:
            print totalt, "tests in", len(self._name2ft), "items."
            print totalt - totalf, "passed and", totalf, "failed."
        if totalf:
            print "***Test Failed***", totalf, "failures."
        elif verbose:
            print "Test passed."
        return totalf, totalt

    #/////////////////////////////////////////////////////////////////
    # Backward compatibility cruft to maintain doctest.master.
    #/////////////////////////////////////////////////////////////////
    def merge(self, other):
        d = self._name2ft
        for name, (f, t) in other._name2ft.items():
            if name in d:
                print "*** DocTestRunner.merge: '" + name + "' in both" \
                      " testers; summing outcomes."
                f2, t2 = d[name]
                f = f + f2
                t = t + t2
            d[name] = f, t

class OutputChecker:
    """
    A class used to check whether the actual output from a doctest
    example matches the expected output.  `OutputChecker` defines two
    methods: `check_output`, which compares a given pair of outputs,
    and returns true if they match; and `output_difference`, which
    returns a string describing the differences between two outputs.
    """
    def check_output(self, want, got, optionflags):
        """
        Return True iff the actual output from an example (`got`)
        matches the expected output (`want`).  These strings are
        always considered to match if they are identical; but
        depending on what option flags the test runner is using,
        several non-exact match types are also possible.  See the
        documentation for `TestRunner` for more information about
        option flags.
        """
        # Handle the common case first, for efficiency:
        # if they're string-identical, always return true.
        if got == want:
            return True

        # The values True and False replaced 1 and 0 as the return
        # value for boolean comparisons in Python 2.3.
        if not (optionflags & DONT_ACCEPT_TRUE_FOR_1):
            if (got,want) == ("True\n", "1\n"):
                return True
            if (got,want) == ("False\n", "0\n"):
                return True

        # <BLANKLINE> can be used as a special sequence to signify a
        # blank line, unless the DONT_ACCEPT_BLANKLINE flag is used.
        if not (optionflags & DONT_ACCEPT_BLANKLINE):
            # Replace <BLANKLINE> in want with a blank line.
            want = re.sub('(?m)^%s\s*?$' % re.escape(BLANKLINE_MARKER),
                          '', want)
            # If a line in got contains only spaces, then remove the
            # spaces.
            got = re.sub('(?m)^\s*?$', '', got)
            if got == want:
                return True

        # This flag causes doctest to ignore any differences in the
        # contents of whitespace strings.  Note that this can be used
        # in conjunction with the ELLIPSIS flag.
        if optionflags & NORMALIZE_WHITESPACE:
            got = ' '.join(got.split())
            want = ' '.join(want.split())
            if got == want:
                return True

        # The ELLIPSIS flag says to let the sequence "..." in `want`
        # match any substring in `got`.
        if optionflags & ELLIPSIS:
            if _ellipsis_match(want, got):
                return True

        # We didn't find any match; return false.
        return False

    # Should we do a fancy diff?
    def _do_a_fancy_diff(self, want, got, optionflags):
        # Not unless they asked for a fancy diff.
        if not optionflags & (REPORT_UDIFF |
                              REPORT_CDIFF |
                              REPORT_NDIFF):
            return False

        # If expected output uses ellipsis, a meaningful fancy diff is
        # too hard ... or maybe not.  In two real-life failures Tim saw,
        # a diff was a major help anyway, so this is commented out.
        # [todo] _ellipsis_match() knows which pieces do and don't match,
        # and could be the basis for a kick-ass diff in this case.
        ##if optionflags & ELLIPSIS and ELLIPSIS_MARKER in want:
        ##    return False

        # ndiff does intraline difference marking, so can be useful even
        # for 1-line differences.
        if optionflags & REPORT_NDIFF:
            return True

        # The other diff types need at least a few lines to be helpful.
        return want.count('\n') > 2 and got.count('\n') > 2

    def output_difference(self, example, got, optionflags):
        """
        Return a string describing the differences between the
        expected output for a given example (`example`) and the actual
        output (`got`).  `optionflags` is the set of option flags used
        to compare `want` and `got`.
        """
        want = example.want
        # If <BLANKLINE>s are being used, then replace blank lines
        # with <BLANKLINE> in the actual output string.
        if not (optionflags & DONT_ACCEPT_BLANKLINE):
            got = re.sub('(?m)^[ ]*(?=\n)', BLANKLINE_MARKER, got)

        # Check if we should use diff.
        if self._do_a_fancy_diff(want, got, optionflags):
            # Split want & got into lines.
            want_lines = want.splitlines(True)  # True == keep line ends
            got_lines = got.splitlines(True)
            # Use difflib to find their differences.
            if optionflags & REPORT_UDIFF:
                diff = difflib.unified_diff(want_lines, got_lines, n=2)
                diff = list(diff)[2:] # strip the diff header
                kind = 'unified diff with -expected +actual'
            elif optionflags & REPORT_CDIFF:
                diff = difflib.context_diff(want_lines, got_lines, n=2)
                diff = list(diff)[2:] # strip the diff header
                kind = 'context diff with expected followed by actual'
            elif optionflags & REPORT_NDIFF:
                engine = difflib.Differ(charjunk=difflib.IS_CHARACTER_JUNK)
                diff = list(engine.compare(want_lines, got_lines))
                kind = 'ndiff with -expected +actual'
            else:
                assert 0, 'Bad diff option'
            # Remove trailing whitespace on diff output.
            diff = [line.rstrip() + '\n' for line in diff]
            return 'Differences (%s):\n' % kind + _indent(''.join(diff))

        # If we're not using diff, then simply list the expected
        # output followed by the actual output.
        if want and got:
            return 'Expected:\n%sGot:\n%s' % (_indent(want), _indent(got))
        elif want:
            return 'Expected:\n%sGot nothing\n' % _indent(want)
        elif got:
            return 'Expected nothing\nGot:\n%s' % _indent(got)
        else:
            return 'Expected nothing\nGot nothing\n'

class DocTestFailure(Exception):
    """A DocTest example has failed in debugging mode.

    The exception instance has variables:

    - test: the DocTest object being run

    - example: the Example object that failed

    - got: the actual output
    """
    def __init__(self, test, example, got):
        self.test = test
        self.example = example
        self.got = got

    def __str__(self):
        return str(self.test)

class UnexpectedException(Exception):
    """A DocTest example has encountered an unexpected exception

    The exception instance has variables:

    - test: the DocTest object being run

    - example: the Example object that failed

    - exc_info: the exception info
    """
    def __init__(self, test, example, exc_info):
        self.test = test
        self.example = example
        self.exc_info = exc_info

    def __str__(self):
        return str(self.test)

class DebugRunner(DocTestRunner):
    r"""Run doc tests but raise an exception as soon as there is a failure.

       If an unexpected exception occurs, an UnexpectedException is raised.
       It contains the test, the example, and the original exception:

         >>> runner = DebugRunner(verbose=False)
         >>> test = DocTestParser().get_doctest('>>> raise KeyError\n42',
         ...                                    {}, 'foo', 'foo.py', 0)
         >>> try:
         ...     runner.run(test)
         ... except UnexpectedException, failure:
         ...     pass

         >>> failure.test is test
         True

         >>> failure.example.want
         '42\n'

         >>> exc_info = failure.exc_info
         >>> raise exc_info[0], exc_info[1], exc_info[2]
         Traceback (most recent call last):
         ...
         KeyError

       We wrap the original exception to give the calling application
       access to the test and example information.

       If the output doesn't match, then a DocTestFailure is raised:

         >>> test = DocTestParser().get_doctest('''
         ...      >>> x = 1
         ...      >>> x
         ...      2
         ...      ''', {}, 'foo', 'foo.py', 0)

         >>> try:
         ...    runner.run(test)
         ... except DocTestFailure, failure:
         ...    pass

       DocTestFailure objects provide access to the test:

         >>> failure.test is test
         True

       As well as to the example:

         >>> failure.example.want
         '2\n'

       and the actual output:

         >>> failure.got
         '1\n'

       If a failure or error occurs, the globals are left intact:

         >>> del test.globs['__builtins__']
         >>> test.globs
         {'x': 1}

         >>> test = DocTestParser().get_doctest('''
         ...      >>> x = 2
         ...      >>> raise KeyError
         ...      ''', {}, 'foo', 'foo.py', 0)

         >>> runner.run(test)
         Traceback (most recent call last):
         ...
         UnexpectedException: <DocTest foo from foo.py:0 (2 examples)>

         >>> del test.globs['__builtins__']
         >>> test.globs
         {'x': 2}

       But the globals are cleared if there is no error:

         >>> test = DocTestParser().get_doctest('''
         ...      >>> x = 2
         ...      ''', {}, 'foo', 'foo.py', 0)

         >>> runner.run(test)
         (0, 1)

         >>> test.globs
         {}

       """

    def run(self, test, compileflags=None, out=None, clear_globs=True):
        r = DocTestRunner.run(self, test, compileflags, out, False)
        if clear_globs:
            test.globs.clear()
        return r

    def report_unexpected_exception(self, out, test, example, exc_info):
        raise UnexpectedException(test, example, exc_info)

    def report_failure(self, out, test, example, got):
        raise DocTestFailure(test, example, got)

######################################################################
## 6. Test Functions
######################################################################
# These should be backwards compatible.

# For backward compatibility, a global instance of a DocTestRunner
# class, updated by testmod.
master = None

def testmod(m=None, name=None, globs=None, verbose=None, isprivate=None,
            report=True, optionflags=0, extraglobs=None,
            raise_on_error=False, exclude_empty=False):
    """m=None, name=None, globs=None, verbose=None, isprivate=None,
       report=True, optionflags=0, extraglobs=None, raise_on_error=False,
       exclude_empty=False

    Test examples in docstrings in functions and classes reachable
    from module m (or the current module if m is not supplied), starting
    with m.__doc__.  Unless isprivate is specified, private names
    are not skipped.

    Also test examples reachable from dict m.__test__ if it exists and is
    not None.  m.__test__ maps names to functions, classes and strings;
    function and class docstrings are tested even if the name is private;
    strings are tested directly, as if they were docstrings.

    Return (#failures, #tests).

    See doctest.__doc__ for an overview.

    Optional keyword arg "name" gives the name of the module; by default
    use m.__name__.

    Optional keyword arg "globs" gives a dict to be used as the globals
    when executing examples; by default, use m.__dict__.  A copy of this
    dict is actually used for each docstring, so that each docstring's
    examples start with a clean slate.
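    The canonical way to run a module's own examples is from a
    self-test block at the bottom of the module:

        if __name__ == "__main__":
            import doctest
            doctest.testmod()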
Optional keyword arg "extraglobs" gives a dictionary that should be merged into the globals that are used to execute examples. By default, no extra globals are used. This is new in 2.4. Optional keyword arg "verbose" prints lots of stuff if true, prints only failures if false; by default, it's true iff "-v" is in sys.argv. Optional keyword arg "report" prints a summary at the end when true, else prints nothing at the end. In verbose mode, the summary is detailed, else very brief (in fact, empty if all tests passed). Optional keyword arg "optionflags" or's together module constants, and defaults to 0. This is new in 2.3. Possible values (see the docs for details): DONT_ACCEPT_TRUE_FOR_1 DONT_ACCEPT_BLANKLINE NORMALIZE_WHITESPACE ELLIPSIS IGNORE_EXCEPTION_DETAIL REPORT_UDIFF REPORT_CDIFF REPORT_NDIFF REPORT_ONLY_FIRST_FAILURE Optional keyword arg "raise_on_error" raises an exception on the first unexpected exception or failure. This allows failures to be post-mortem debugged. Deprecated in Python 2.4: Optional keyword arg "isprivate" specifies a function used to determine whether a name is private. The default function is treat all functions as public. Optionally, "isprivate" can be set to doctest.is_private to skip over functions marked as private using the underscore naming convention; see its docs for details. Advanced tomfoolery: testmod runs methods of a local instance of class doctest.Tester, then merges the results into (or creates) global Tester instance doctest.master. Methods of doctest.master can be called directly too, if you want to do something unusual. Passing report=0 to testmod is especially useful then, to delay displaying a summary. Invoke doctest.master.summarize(verbose) when you're done fiddling. """ global master if isprivate is not None: warnings.warn("the isprivate argument is deprecated; " "examine DocTestFinder.find() lists instead", DeprecationWarning) # If no module was given, then use __main__. if m is None: # DWA - m will still be None if this wasn't invoked from the command # line, in which case the following TypeError is about as good an error # as we should expect m = sys.modules.get('__main__') # Check that we were actually given a module. if not inspect.ismodule(m): raise TypeError("testmod: module required; %r" % (m,)) # If no name was given, then use the module's name. if name is None: name = m.__name__ # Find, parse, and run all tests in the given module. finder = DocTestFinder(_namefilter=isprivate, exclude_empty=exclude_empty) if raise_on_error: runner = DebugRunner(verbose=verbose, optionflags=optionflags) else: runner = DocTestRunner(verbose=verbose, optionflags=optionflags) for test in finder.find(m, name, globs=globs, extraglobs=extraglobs): runner.run(test) if report: runner.summarize() if master is None: master = runner else: master.merge(runner) return runner.failures, runner.tries def testfile(filename, module_relative=True, name=None, package=None, globs=None, verbose=None, report=True, optionflags=0, extraglobs=None, raise_on_error=False, parser=DocTestParser()): """ Test examples in the given file. Return (#failures, #tests). Optional keyword arg "module_relative" specifies how filenames should be interpreted: - If "module_relative" is True (the default), then "filename" specifies a module-relative path. By default, this path is relative to the calling module's directory; but if the "package" argument is specified, then it is relative to that package. 
To ensure os-independence, "filename" should use "/" characters to separate path segments, and should not be an absolute path (i.e., it may not begin with "/"). - If "module_relative" is False, then "filename" specifies an os-specific path. The path may be absolute or relative (to the current working directory). Optional keyword arg "name" gives the name of the test; by default use the file's basename. Optional keyword argument "package" is a Python package or the name of a Python package whose directory should be used as the base directory for a module relative filename. If no package is specified, then the calling module's directory is used as the base directory for module relative filenames. It is an error to specify "package" if "module_relative" is False. Optional keyword arg "globs" gives a dict to be used as the globals when executing examples; by default, use {}. A copy of this dict is actually used for each docstring, so that each docstring's examples start with a clean slate. Optional keyword arg "extraglobs" gives a dictionary that should be merged into the globals that are used to execute examples. By default, no extra globals are used. Optional keyword arg "verbose" prints lots of stuff if true, prints only failures if false; by default, it's true iff "-v" is in sys.argv. Optional keyword arg "report" prints a summary at the end when true, else prints nothing at the end. In verbose mode, the summary is detailed, else very brief (in fact, empty if all tests passed). Optional keyword arg "optionflags" or's together module constants, and defaults to 0. Possible values (see the docs for details): DONT_ACCEPT_TRUE_FOR_1 DONT_ACCEPT_BLANKLINE NORMALIZE_WHITESPACE ELLIPSIS IGNORE_EXCEPTION_DETAIL REPORT_UDIFF REPORT_CDIFF REPORT_NDIFF REPORT_ONLY_FIRST_FAILURE Optional keyword arg "raise_on_error" raises an exception on the first unexpected exception or failure. This allows failures to be post-mortem debugged. Optional keyword arg "parser" specifies a DocTestParser (or subclass) that should be used to extract tests from the files. Advanced tomfoolery: testmod runs methods of a local instance of class doctest.Tester, then merges the results into (or creates) global Tester instance doctest.master. Methods of doctest.master can be called directly too, if you want to do something unusual. Passing report=0 to testmod is especially useful then, to delay displaying a summary. Invoke doctest.master.summarize(verbose) when you're done fiddling. """ global master if package and not module_relative: raise ValueError("Package may only be specified for module-" "relative paths.") # Relativize the path if module_relative: package = _normalize_module(package) filename = _module_relative_path(package, filename) # If no name was given, then use the file's name. if name is None: name = os.path.basename(filename) # Assemble the globals. if globs is None: globs = {} else: globs = globs.copy() if extraglobs is not None: globs.update(extraglobs) if raise_on_error: runner = DebugRunner(verbose=verbose, optionflags=optionflags) else: runner = DocTestRunner(verbose=verbose, optionflags=optionflags) # Read the file, convert it to a test, and run it. 
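    # For callers, a typical invocation is simply (a sketch; the file
    # name is illustrative):
    #
    #     failures, tests = testfile('example_usage.txt',
    #                                optionflags=ELLIPSIS)
    #
    # mirroring the (#failures, #tests) contract of testmod() above.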
s = open(filename).read() test = parser.get_doctest(s, globs, name, filename, 0) runner.run(test) if report: runner.summarize() if master is None: master = runner else: master.merge(runner) return runner.failures, runner.tries def run_docstring_examples(f, globs, verbose=False, name="NoName", compileflags=None, optionflags=0): """ Test examples in the given object's docstring (`f`), using `globs` as globals. Optional argument `name` is used in failure messages. If the optional argument `verbose` is true, then generate output even if there are no failures. `compileflags` gives the set of flags that should be used by the Python compiler when running the examples. If not specified, then it will default to the set of future-import flags that apply to `globs`. Optional keyword arg `optionflags` specifies options for the testing and output. See the documentation for `testmod` for more information. """ # Find, parse, and run all tests in the given module. finder = DocTestFinder(verbose=verbose, recurse=False) runner = DocTestRunner(verbose=verbose, optionflags=optionflags) for test in finder.find(f, name, globs=globs): runner.run(test, compileflags=compileflags) ###################################################################### ## 7. Tester ###################################################################### # This is provided only for backwards compatibility. It's not # actually used in any way. class Tester: def __init__(self, mod=None, globs=None, verbose=None, isprivate=None, optionflags=0): warnings.warn("class Tester is deprecated; " "use class doctest.DocTestRunner instead", DeprecationWarning, stacklevel=2) if mod is None and globs is None: raise TypeError("Tester.__init__: must specify mod or globs") if mod is not None and not inspect.ismodule(mod): raise TypeError("Tester.__init__: mod must be a module; %r" % (mod,)) if globs is None: globs = mod.__dict__ self.globs = globs self.verbose = verbose self.isprivate = isprivate self.optionflags = optionflags self.testfinder = DocTestFinder(_namefilter=isprivate) self.testrunner = DocTestRunner(verbose=verbose, optionflags=optionflags) def runstring(self, s, name): test = DocTestParser().get_doctest(s, self.globs, name, None, None) if self.verbose: print "Running string", name (f,t) = self.testrunner.run(test) if self.verbose: print f, "of", t, "examples failed in string", name return (f,t) def rundoc(self, object, name=None, module=None): f = t = 0 tests = self.testfinder.find(object, name, module=module, globs=self.globs) for test in tests: (f2, t2) = self.testrunner.run(test) (f,t) = (f+f2, t+t2) return (f,t) def rundict(self, d, name, module=None): import new m = new.module(name) m.__dict__.update(d) if module is None: module = False return self.rundoc(m, name, module) def run__test__(self, d, name): import new m = new.module(name) m.__test__ = d return self.rundoc(m, name) def summarize(self, verbose=None): return self.testrunner.summarize(verbose) def merge(self, other): self.testrunner.merge(other.testrunner) ###################################################################### ## 8. Unittest Support ###################################################################### _unittest_reportflags = 0 def set_unittest_reportflags(flags): """Sets the unittest option flags. The old flag is returned so that a runner could restore the old value if it wished to: >>> old = _unittest_reportflags >>> set_unittest_reportflags(REPORT_NDIFF | ... 
REPORT_ONLY_FIRST_FAILURE) == old True >>> import doctest >>> doctest._unittest_reportflags == (REPORT_NDIFF | ... REPORT_ONLY_FIRST_FAILURE) True Only reporting flags can be set: >>> set_unittest_reportflags(ELLIPSIS) Traceback (most recent call last): ... ValueError: ('Only reporting flags allowed', 8) >>> set_unittest_reportflags(old) == (REPORT_NDIFF | ... REPORT_ONLY_FIRST_FAILURE) True """ global _unittest_reportflags if (flags & REPORTING_FLAGS) != flags: raise ValueError("Only reporting flags allowed", flags) old = _unittest_reportflags _unittest_reportflags = flags return old class DocTestCase(unittest.TestCase): def __init__(self, test, optionflags=0, setUp=None, tearDown=None, checker=None): unittest.TestCase.__init__(self) self._dt_optionflags = optionflags self._dt_checker = checker self._dt_test = test self._dt_setUp = setUp self._dt_tearDown = tearDown def setUp(self): test = self._dt_test if self._dt_setUp is not None: self._dt_setUp(test) def tearDown(self): test = self._dt_test if self._dt_tearDown is not None: self._dt_tearDown(test) test.globs.clear() def runTest(self): test = self._dt_test old = sys.stdout new = StringIO() optionflags = self._dt_optionflags if not (optionflags & REPORTING_FLAGS): # The option flags don't include any reporting flags, # so add the default reporting flags optionflags |= _unittest_reportflags runner = DocTestRunner(optionflags=optionflags, checker=self._dt_checker, verbose=False) try: runner.DIVIDER = "-"*70 failures, tries = runner.run( test, out=new.write, clear_globs=False) finally: sys.stdout = old if failures: raise self.failureException(self.format_failure(new.getvalue())) def format_failure(self, err): test = self._dt_test if test.lineno is None: lineno = 'unknown line number' else: lineno = '%s' % test.lineno lname = '.'.join(test.name.split('.')[-1:]) return ('Failed doctest test for %s\n' ' File "%s", line %s, in %s\n\n%s' % (test.name, test.filename, lineno, lname, err) ) def debug(self): r"""Run the test case without results and without catching exceptions The unit test framework includes a debug method on test cases and test suites to support post-mortem debugging. The test code is run in such a way that errors are not caught. This way a caller can catch the errors and initiate post-mortem debugging. The DocTestCase provides a debug method that raises UnexpectedException errors if there is an unexepcted exception: >>> test = DocTestParser().get_doctest('>>> raise KeyError\n42', ... {}, 'foo', 'foo.py', 0) >>> case = DocTestCase(test) >>> try: ... case.debug() ... except UnexpectedException, failure: ... pass The UnexpectedException contains the test, the example, and the original exception: >>> failure.test is test True >>> failure.example.want '42\n' >>> exc_info = failure.exc_info >>> raise exc_info[0], exc_info[1], exc_info[2] Traceback (most recent call last): ... KeyError If the output doesn't match, then a DocTestFailure is raised: >>> test = DocTestParser().get_doctest(''' ... >>> x = 1 ... >>> x ... 2 ... ''', {}, 'foo', 'foo.py', 0) >>> case = DocTestCase(test) >>> try: ... case.debug() ... except DocTestFailure, failure: ... 
pass DocTestFailure objects provide access to the test: >>> failure.test is test True As well as to the example: >>> failure.example.want '2\n' and the actual output: >>> failure.got '1\n' """ self.setUp() runner = DebugRunner(optionflags=self._dt_optionflags, checker=self._dt_checker, verbose=False) runner.run(self._dt_test) self.tearDown() def id(self): return self._dt_test.name def __repr__(self): name = self._dt_test.name.split('.') return "%s (%s)" % (name[-1], '.'.join(name[:-1])) __str__ = __repr__ def shortDescription(self): return "Doctest: " + self._dt_test.name def DocTestSuite(module=None, globs=None, extraglobs=None, test_finder=None, **options): """ Convert doctest tests for a module to a unittest test suite. This converts each documentation string in a module that contains doctest tests to a unittest test case. If any of the tests in a doc string fail, then the test case fails. An exception is raised showing the name of the file containing the test and a (sometimes approximate) line number. The `module` argument provides the module to be tested. The argument can be either a module or a module name. If no argument is given, the calling module is used. A number of options may be provided as keyword arguments: setUp A set-up function. This is called before running the tests in each file. The setUp function will be passed a DocTest object. The setUp function can access the test globals as the globs attribute of the test passed. tearDown A tear-down function. This is called after running the tests in each file. The tearDown function will be passed a DocTest object. The tearDown function can access the test globals as the globs attribute of the test passed. globs A dictionary containing initial global variables for the tests. optionflags A set of doctest option flags expressed as an integer. """ if test_finder is None: test_finder = DocTestFinder() module = _normalize_module(module) tests = test_finder.find(module, globs=globs, extraglobs=extraglobs) if globs is None: globs = module.__dict__ if not tests: # Why do we want to do this? Because it reveals a bug that might # otherwise be hidden. raise ValueError(module, "has no tests") tests.sort() suite = unittest.TestSuite() for test in tests: if len(test.examples) == 0: continue if not test.filename: filename = module.__file__ if filename[-4:] in (".pyc", ".pyo"): filename = filename[:-1] test.filename = filename suite.addTest(DocTestCase(test, **options)) return suite class DocFileCase(DocTestCase): def id(self): return '_'.join(self._dt_test.name.split('.')) def __repr__(self): return self._dt_test.filename __str__ = __repr__ def format_failure(self, err): return ('Failed doctest test for %s\n File "%s", line 0\n\n%s' % (self._dt_test.name, self._dt_test.filename, err) ) def DocFileTest(path, module_relative=True, package=None, globs=None, parser=DocTestParser(), **options): if globs is None: globs = {} if package and not module_relative: raise ValueError("Package may only be specified for module-" "relative paths.") # Relativize the path. if module_relative: package = _normalize_module(package) path = _module_relative_path(package, path) # Find the file and read it. name = os.path.basename(path) doc = open(path).read() # Convert it to a test, and wrap it in a DocFileCase. test = parser.get_doctest(doc, globs, name, path, 0) return DocFileCase(test, **options) def DocFileSuite(*paths, **kw): """A unittest suite for one or more doctest files. 
The path to each doctest file is given as a string; the interpretation of that string depends on the keyword argument "module_relative". A number of options may be provided as keyword arguments: module_relative If "module_relative" is True, then the given file paths are interpreted as os-independent module-relative paths. By default, these paths are relative to the calling module's directory; but if the "package" argument is specified, then they are relative to that package. To ensure os-independence, "filename" should use "/" characters to separate path segments, and may not be an absolute path (i.e., it may not begin with "/"). If "module_relative" is False, then the given file paths are interpreted as os-specific paths. These paths may be absolute or relative (to the current working directory). package A Python package or the name of a Python package whose directory should be used as the base directory for module relative paths. If "package" is not specified, then the calling module's directory is used as the base directory for module relative filenames. It is an error to specify "package" if "module_relative" is False. setUp A set-up function. This is called before running the tests in each file. The setUp function will be passed a DocTest object. The setUp function can access the test globals as the globs attribute of the test passed. tearDown A tear-down function. This is called after running the tests in each file. The tearDown function will be passed a DocTest object. The tearDown function can access the test globals as the globs attribute of the test passed. globs A dictionary containing initial global variables for the tests. optionflags A set of doctest option flags expressed as an integer. parser A DocTestParser (or subclass) that should be used to extract tests from the files. """ suite = unittest.TestSuite() # We do this here so that _normalize_module is called at the right # level. If it were called in DocFileTest, then this function # would be the caller and we might guess the package incorrectly. if kw.get('module_relative', True): kw['package'] = _normalize_module(kw.get('package')) for path in paths: suite.addTest(DocFileTest(path, **kw)) return suite ###################################################################### ## 9. Debugging Support ###################################################################### def script_from_examples(s): r"""Extract script from text with examples. Converts text with examples to a Python script. Example input is converted to regular code. Example output and all other words are converted to comments: >>> text = ''' ... Here are examples of simple math. ... ... Python has super accurate integer addition ... ... >>> 2 + 2 ... 5 ... ... And very friendly error messages: ... ... >>> 1/0 ... To Infinity ... And ... Beyond ... ... You can use logic if you want: ... ... >>> if 0: ... ... blah ... ... blah ... ... ... ... Ho hum ... ''' >>> print script_from_examples(text) # Here are examples of simple math. 
    #
    #     Python has super accurate integer addition
    #
    2 + 2
    # Expected:
    ## 5
    #
    #     And very friendly error messages:
    #
    1/0
    # Expected:
    ## To Infinity
    ## And
    ## Beyond
    #
    #     You can use logic if you want:
    #
    if 0:
       blah
       blah
    #
    #     Ho hum
    """
    output = []
    for piece in DocTestParser().parse(s):
        if isinstance(piece, Example):
            # Add the example's source code (strip trailing NL)
            output.append(piece.source[:-1])
            # Add the expected output:
            want = piece.want
            if want:
                output.append('# Expected:')
                output += ['## '+l for l in want.split('\n')[:-1]]
        else:
            # Add non-example text.
            output += [_comment_line(l)
                       for l in piece.split('\n')[:-1]]

    # Trim junk on both ends.
    while output and output[-1] == '#':
        output.pop()
    while output and output[0] == '#':
        output.pop(0)
    # Combine the output, and return it.
    return '\n'.join(output)


def testsource(module, name):
    """Extract the test sources from a doctest docstring as a script.

    Provide the module (or dotted name of the module) containing the
    test to be debugged and the name (within the module) of the object
    with the doc string with tests to be debugged.
    """
    module = _normalize_module(module)
    tests = DocTestFinder().find(module)
    test = [t for t in tests if t.name == name]
    if not test:
        raise ValueError(name, "not found in tests")
    test = test[0]
    testsrc = script_from_examples(test.docstring)
    return testsrc


def debug_src(src, pm=False, globs=None):
    """Debug a single doctest docstring, in argument `src`"""
    testsrc = script_from_examples(src)
    debug_script(testsrc, pm, globs)


def debug_script(src, pm=False, globs=None):
    "Debug a test script.  `src` is the script, as a string."
    import pdb

    # Note that tempfile.NamedTemporaryFile() cannot be used.  As the
    # docs say, a file so created cannot be opened by name a second time
    # on modern Windows boxes, and execfile() needs to open it.
    srcfilename = tempfile.mktemp(".py", "doctestdebug")
    f = open(srcfilename, 'w')
    f.write(src)
    f.close()

    try:
        if globs:
            globs = globs.copy()
        else:
            globs = {}

        if pm:
            try:
                execfile(srcfilename, globs, globs)
            except:
                print sys.exc_info()[1]
                pdb.post_mortem(sys.exc_info()[2])
        else:
            # Note that %r is vital here.  '%s' instead can, e.g., cause
            # backslashes to get treated as metacharacters on Windows.
            pdb.run("execfile(%r)" % srcfilename, globs, globs)

    finally:
        os.remove(srcfilename)


def debug(module, name, pm=False):
    """Debug a single doctest docstring.

    Provide the module (or dotted name of the module) containing the
    test to be debugged and the name (within the module) of the object
    with the docstring with tests to be debugged.
    """
    module = _normalize_module(module)
    testsrc = testsource(module, name)
    debug_script(testsrc, pm, module.__dict__)

######################################################################
## 10. Example Usage
######################################################################
class _TestClass:
    """
    A pointless class, for sanity-checking of docstring testing.

    Methods:
        square()
        get()

    >>> _TestClass(13).get() + _TestClass(-12).get()
    1
    >>> hex(_TestClass(13).square().get())
    '0xa9'
    """

    def __init__(self, val):
        """val -> _TestClass object with associated value val.

        >>> t = _TestClass(123)
        >>> print t.get()
        123
        """
        self.val = val

    def square(self):
        """square() -> square TestClass's associated value

        >>> _TestClass(13).square().get()
        169
        """
        self.val = self.val ** 2
        return self

    def get(self):
        """get() -> return TestClass's associated value.

        >>> x = _TestClass(-42)
        >>> print x.get()
        -42
        """
        return self.val
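
# Illustrative sketch, not part of the original module: a hypothetical
# helper showing how the unittest integration from section 8 can be
# driven by hand.  '_example_suite_run' is an invented name; it simply
# wraps DocTestSuite() (defined above) with the standard unittest text
# runner.  Note that DocTestSuite raises ValueError when the module
# defines no doctests.

def _example_suite_run(module, flags=ELLIPSIS):
    import unittest
    suite = DocTestSuite(module, optionflags=flags)
    return unittest.TextTestRunner(verbosity=2).run(suite)
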
__test__ = {"_TestClass": _TestClass,
            "string": r"""
                      Example of a string object, searched as-is.
                      >>> x = 1; y = 2
                      >>> x + y, x * y
                      (3, 2)
                      """,

            "bool-int equivalence": r"""
                In 2.2, boolean expressions displayed
                0 or 1.  By default, we still accept
                them.  This can be disabled by passing
                DONT_ACCEPT_TRUE_FOR_1 to the new
                optionflags argument.
                    >>> 4 == 4
                    1
                    >>> 4 == 4
                    True
                    >>> 4 > 4
                    0
                    >>> 4 > 4
                    False
                """,

            "blank lines": r"""
                Blank lines can be marked with <BLANKLINE>:
                    >>> print 'foo\n\nbar\n'
                    foo
                    <BLANKLINE>
                    bar
                    <BLANKLINE>
                """,

            "ellipsis": r"""
                If the ellipsis flag is used, then '...' can be used to
                elide substrings in the desired output:
                    >>> print range(1000) #doctest: +ELLIPSIS
                    [0, 1, 2, ..., 999]
                """,

            "whitespace normalization": r"""
                If the whitespace normalization flag is used, then
                differences in whitespace are ignored.
                    >>> print range(30) #doctest: +NORMALIZE_WHITESPACE
                    [0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13,
                     14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26,
                     27, 28, 29]
                """,
           }

def _test():
    r = unittest.TextTestRunner()
    r.run(DocTestSuite())

if __name__ == "__main__":
    _test()
pyprotocols-1.0a.svn20070625/src/protocols/zope_support.py0000644000175000017500000000637310607570201021670 0ustar  kovkov"""Declaration support for Zope Interfaces"""

__all__ = []

from types import ClassType

from adapters import *
from api import declareImplementation, advise, adapt
from interfaces import IOpenProtocol, Protocol
from advice import metamethod, supermeta


# Monkeypatch Zope Interfaces

try:
    import zope.interface as zi

except ImportError:
    ZopeInterfaceTypes = []
    zi = None

else:
    def __adapt__(self, obj):
        return adapt(self,IOpenProtocol).__adapt__(obj)

    try:
        from zope.interface import adapter_hooks
    except ImportError:
        try:
            from zope.interface.interface import adapter_hooks
        except ImportError:
            zi.Interface.__class__.__adapt__ = __adapt__
            adapter_hooks = []

    adapter_hooks.append(__adapt__)
    ZopeInterfaceTypes = [zi.Interface.__class__]
    del __adapt__, adapter_hooks


# Adapter for Zope X3 Interfaces

class ZopeInterfaceAsProtocol(StickyAdapter, Protocol):

    advise(
        instancesProvide=[IOpenProtocol],
        asAdapterForTypes=ZopeInterfaceTypes,
    )

    attachForProtocols = IOpenProtocol,

    def __init__(self, ob):
        StickyAdapter.__init__(self,ob)
        Protocol.__init__(self)

    if ZopeInterfaceTypes and hasattr(ZopeInterfaceTypes[0],'providedBy'):
        def __adapt__(self, obj):
            if self.subject.providedBy(obj):
                return obj
            return supermeta(ZopeInterfaceAsProtocol,self).__adapt__(obj)
    else:
        def __adapt__(self, obj):
            if self.subject.isImplementedBy(obj):
                return obj
            return supermeta(ZopeInterfaceAsProtocol,self).__adapt__(obj)

    def registerImplementation(self,klass,adapter=NO_ADAPTER_NEEDED,depth=1):
        if adapter is NO_ADAPTER_NEEDED:
            zi.classImplements(klass, self.subject)
        elif adapter is DOES_NOT_SUPPORT:
            ifaces = zi.Declaration(
                [i.__iro__ for i in zi.implementedBy(klass)]
            ) - self.subject
            zi.classImplementsOnly(klass, ifaces)
        return supermeta(ZopeInterfaceAsProtocol,self).registerImplementation(
            klass,adapter,depth
        )

    registerImplementation = metamethod(registerImplementation)

    def registerObject(self, ob, adapter=NO_ADAPTER_NEEDED, depth=1):
        if adapter is NO_ADAPTER_NEEDED:
            zi.directlyProvides(ob,self.subject)
        elif adapter is DOES_NOT_SUPPORT:
            zi.directlyProvides(ob, zi.directlyProvidedBy(ob)-self.subject)
        return supermeta(ZopeInterfaceAsProtocol,self).registerObject(
            ob, adapter, depth
        )

    registerObject = metamethod(registerObject)
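
    # Usage sketch, for illustration only -- 'IFoo' and 'Foo' are
    # hypothetical names, and zope.interface must be importable:
    #
    #     import zope.interface as zi
    #
    #     class IFoo(zi.Interface):
    #         pass
    #
    #     class Foo:
    #         pass
    #
    #     declareImplementation(Foo, instancesProvide=[IFoo])
    #     adapt(Foo(), IFoo)    # succeeds: returns the Foo instance
    #
    # Routing declarations through this adapter keeps the PyProtocols
    # registry and the zope.interface declarations (classImplements,
    # directlyProvides) in sync, as the two methods above show.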
    def getImpliedProtocols(self):
        protos = super(ZopeInterfaceAsProtocol,self).getImpliedProtocols()
        return list(protos) + [
            (i,(NO_ADAPTER_NEEDED,1))
                for i in self.subject.__bases__ if i is not zi.Interface
        ]

    def __getstate__(self):
        state = self.__dict__.copy()
        del state['_Protocol__lock']         # locks can't be pickled
        del state['_Protocol__listeners']    # and neither can weakref dict
        return state

    def __hash__(self):
        return hash(self.subject)

    def __cmp__(self,other):
        return cmp(self.subject, other)
pyprotocols-1.0a.svn20070625/src/protocols/_speedups.pyx0000644000175000017500000002177110202024402021261 0ustar  kovkov"""C Speedups for commonly-used operations"""

__all__ = [
    'NO_ADAPTER_NEEDED', 'DOES_NOT_SUPPORT', 'adapt',
    'Protocol__adapt__', 'metamethod', 'classicMRO', 'getMRO',
    'Protocol__call__',
]

cdef extern from "Python.h":
    int PyType_Check(object ob)
    int PyClass_Check(object ob)
    int PyInstance_Check(object ob)
    int PyObject_TypeCheck(object ob, object tp)
    int PyObject_IsInstance(object inst, object cls)
    int PyErr_ExceptionMatches(void *exc)
    void *PyExc_AttributeError
    void *PyObject_GetAttr(object ob, object attr)
    void PyErr_Clear()
    object PyString_InternFromString(char *v)
    object PyMethod_New(object func, object self, object cls)

    ctypedef struct PyTupleObject:
        void *ob_item   # we don't use this, but we can't use 'pass' here

    ctypedef struct PyListObject:
        void *ob_item   # we don't use this, but we can't use 'pass' here

    ctypedef struct PyTypeObject:
        PyTupleObject *tp_mro

    ctypedef struct PyObject:
        PyTypeObject *ob_type

    ctypedef struct PyClassObject:
        PyTupleObject *cl_bases

    ctypedef struct PyInstanceObject:
        PyClassObject *in_class

    int PyObject_IsSubclass(PyClassObject *derived, object cls)
    int PyList_Append(PyListObject *list, object item) except -1
    int PyTuple_GET_SIZE(PyTupleObject *p)
    int PyList_GET_SIZE(PyListObject *p)
    int PyTuple_Check(object op)
    int PyList_Check(object op)
    int len "PyObject_Length" (object o) except -1
    object type "PyObject_Type" (object o)

    # These macros return borrowed references, so we make them void *
    # When Pyrex casts them to objects, it will incref them
    void * PyTuple_GET_ITEM(PyTupleObject *p, int pos)
    void * PyList_GET_ITEM(PyListObject *p, int pos)
    void * PyDict_GetItem(object dict,object key)

    PyTypeObject PyInstance_Type
    PyTypeObject PyBaseObject_Type
    void Py_DECREF(PyObject *p)
    object __Pyx_GetExcValue()

cdef object _marker, __conform, __adapt, __mro, __ECType

from sys import exc_info
from protocols.adapters import AdaptationFailure

try:
    from ExtensionClass import ExtensionClass
    __ECType = ExtensionClass
except ImportError:
    __ECType = type(object)

_marker   = object()
__conform = PyString_InternFromString("__conform__")
__adapt   = PyString_InternFromString("__adapt__")
__class   = PyString_InternFromString("__class__")
__mro     = PyString_InternFromString("__mro__")


# Fundamental Adapters

def NO_ADAPTER_NEEDED(obj, protocol=None):
    """Assume 'obj' implements 'protocol' directly"""
    return obj

def DOES_NOT_SUPPORT(obj, protocol=None):
    """Prevent 'obj' from supporting 'protocol'"""
    return None


cdef class metamethod:
    """Wrapper for metaclass method that might be confused w/instance method"""

    cdef object func

    def __init__(self, func):
        self.func = func

    def __get__(self, ob, typ):
        if ob is None:
            return self
        return PyMethod_New(self.func, ob, typ)

    def __set__(self, ob, value):
        raise AttributeError("Read-only attribute")

    def __delete__(self, ob):
        raise AttributeError("Read-only attribute")
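
# Reference sketch (illustration only; '_adapt_sketch' is a hypothetical
# name and is not exported): a pure-Python rendering of the cdef '_adapt'
# below, showing the PEP 246 lookup order it implements -- an isinstance
# shortcut first, then the object's '__conform__', then the protocol's
# '__adapt__', then 'default' or AdaptationFailure.  The C version's
# finer error handling (e.g. re-raising TypeErrors raised inside the
# hooks) is omitted here for brevity.

def _adapt_sketch(obj, protocol, default=_marker):
    try:
        if isinstance(obj, protocol):
            return obj
    except TypeError:
        pass    # 'protocol' is not a type or classic class; try the hooks
    conform = getattr(obj, '__conform__', None)
    if conform is not None:
        result = conform(protocol)
        if result is not None:
            return result
    adapt_hook = getattr(protocol, '__adapt__', None)
    if adapt_hook is not None:
        result = adapt_hook(obj)
        if result is not None:
            return result
    if default is _marker:
        raise AdaptationFailure("Can't adapt", obj, protocol)
    return default
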
cdef object _adapt(obj, protocol, default):

    # We use nested 'if' blocks here because using 'and' causes Pyrex to
    # convert the return values to Python ints, and then back to booleans!

    cdef void *tmp

    if PyType_Check(protocol):
        if PyObject_TypeCheck(obj, protocol):
            return obj

    if PyClass_Check(protocol):
        if PyInstance_Check(obj):
            if PyObject_IsInstance(obj,protocol):
                return obj

    tmp = PyObject_GetAttr(obj, __conform)

    if tmp:
        meth = <object> tmp
        Py_DECREF(<PyObject *>tmp)
        try:
            result = meth(protocol)
            if result is not None:
                return result
        except TypeError:
            if exc_info()[2].tb_next is not None:
                raise

    elif PyErr_ExceptionMatches(PyExc_AttributeError):
        PyErr_Clear()

    else:
        err = __Pyx_GetExcValue()
        raise

    tmp = PyObject_GetAttr(protocol, __adapt)

    if tmp:
        meth = <object> tmp
        Py_DECREF(<PyObject *>tmp)
        try:
            result = meth(obj)
            if result is not None:
                return result
        except TypeError:
            if exc_info()[2].tb_next is not None:
                raise

    elif PyErr_ExceptionMatches(PyExc_AttributeError):
        PyErr_Clear()

    else:
        err = __Pyx_GetExcValue()
        raise

    if default is _marker:
        raise AdaptationFailure("Can't adapt", obj, protocol)

    return default


def adapt(obj, protocol, default=_marker):
    """PEP 246-alike: Adapt 'obj' to 'protocol', return 'default'

    If 'default' is not supplied and no implementation is found,
    raise 'AdaptationFailure'."""
    return _adapt(obj,protocol,default)


def Protocol__call__(self, ob, default=_marker):
    """Adapt to this protocol"""
    return _adapt(ob,self,default)


cdef buildClassicMRO(PyClassObject *cls, PyListObject *list):

    cdef PyTupleObject *bases
    cdef int i

    PyList_Append(list, <object> cls)
    bases = cls.cl_bases

    if bases:
        for i from 0 <= i < PyTuple_GET_SIZE(bases):
            tmp = <object> PyTuple_GET_ITEM(bases, i)
            buildClassicMRO(<PyClassObject *>tmp, list)


def classicMRO(ob, extendedClassic=False):
    if PyClass_Check(ob):
        mro = []
        buildClassicMRO(<PyClassObject *>ob, <PyListObject *>mro)
        if extendedClassic:
            PyList_Append(<PyListObject *>mro, <object> &PyInstance_Type)
            PyList_Append(<PyListObject *>mro, <object> &PyBaseObject_Type)
        return mro
    raise TypeError("Not a classic class", ob)


cdef buildECMRO(object cls, PyListObject *list):
    PyList_Append(list, cls)
    for i in cls.__bases__:
        buildECMRO(i, list)


def extClassMRO(ob, extendedClassic=False):
    mro = []
    buildECMRO(ob, <PyListObject *>mro)
    if extendedClassic:
        PyList_Append(<PyListObject *>mro, <object> &PyInstance_Type)
        PyList_Append(<PyListObject *>mro, <object> &PyBaseObject_Type)
    return mro


def getMRO(ob, extendedClassic=False):

    if PyClass_Check(ob):
        return classicMRO(ob,extendedClassic)

    elif PyType_Check(ob):
        return ob.__mro__

    elif PyObject_TypeCheck(ob,__ECType):
        return extClassMRO(ob, extendedClassic)

    return ob,


def Protocol__adapt__(self, obj):

    cdef void *tmp
    cdef int i

    if PyInstance_Check(obj):
        cls = <object> ((<PyInstanceObject *>obj).in_class)
    else:
        # We use __class__ instead of type to support proxies
        tmp = PyObject_GetAttr(obj, __class)

        if tmp:
            cls = <object> tmp
            Py_DECREF(<PyObject *>tmp)

        elif PyErr_ExceptionMatches(PyExc_AttributeError):
            # Some objects have no __class__; use their type
            PyErr_Clear()
            cls = <object> ((<PyObject *>obj).ob_type)

        else:
            # Some other error, pass it on up the line
            err = __Pyx_GetExcValue()
            raise

    tmp = 0

    if PyType_Check(cls):
        # It's a type, we can use its mro directly
        tmp = <void *> ((<PyTypeObject *>cls).tp_mro)
        if tmp:
            mro = <object> tmp

    elif PyClass_Check(cls):
        # It's a classic class, build up its MRO
        mro = []
        buildClassicMRO(<PyClassObject *>cls, <PyListObject *>mro)
        PyList_Append(<PyListObject *>mro, <object> &PyInstance_Type)
        PyList_Append(<PyListObject *>mro, <object> &PyBaseObject_Type)

    else:
        # Fallback to getting __mro__ (for e.g. security proxies/ExtensionClass)
        tmp = PyObject_GetAttr(cls, __mro)

        if tmp:
            mro = <object> tmp
            Py_DECREF(<PyObject *>tmp)

        # No __mro__?  Is it an ExtensionClass?
        elif PyObject_TypeCheck(cls,__ECType):
            # Yep, toss out the error and compute a reasonable MRO
            PyErr_Clear()
            mro = extClassMRO(cls, 1)

        # Okay, we give up... reraise the error so somebody smarter than us
        # can figure it out.
:( else: err = __Pyx_GetExcValue() raise get = self._Protocol__adapters.get if PyTuple_Check(mro): #print "tuple",mro for i from 0 <= i < PyTuple_GET_SIZE(mro): cls = PyTuple_GET_ITEM(mro, i) factory=get(cls) if factory is not None: return factory[0](obj) elif PyList_Check(mro): #print "list",mro for i from 0 <= i < PyList_GET_SIZE(mro): cls = PyList_GET_ITEM(mro, i) factory=get(cls) if factory is not None: return factory[0](obj) else: #print "other",mro for cls in mro: factory=get(cls) if factory is not None: return factory[0](obj) pyprotocols-1.0a.svn20070625/src/protocols/advice.py0000644000175000017500000001216310570131557020352 0ustar kovkovfrom __future__ import generators from new import instancemethod from types import ClassType, FunctionType, InstanceType import sys __all__ = [ 'metamethod', 'supermeta', 'getMRO', 'classicMRO', 'mkRef', 'StrongRef', # XXX these should be deprecated 'addClassAdvisor', 'isClassAdvisor', 'add_assignment_advisor', 'determineMetaclass', 'getFrameInfo', 'minimalBases', ] # No sense duplicating all this functionality any more... from peak.util import decorators def addClassAdvisor(callback, depth=2,frame=None): "protocols.advice.addClassAdvisor is deprecated, please use" " peak.util.decorators.decorate_class instead" from warnings import warn warn(addClassAdvisor.__doc__, DeprecationWarning, 2) return decorators.decorate_class(callback, (depth or 0)+1, frame) def add_assignment_advisor(callback,depth=2,frame=None): "protocols.advice.add_assignment_advisor is deprecated, please use" " peak.util.decorators.decorate_assignment instead" from warnings import warn warn(add_assignment_advisor.__doc__, DeprecationWarning, 2) return decorators.decorate_assignment(callback, (depth or 0)+1, frame) def getFrameInfo(frame): "protocols.advice.getFrameInfo is deprecated, please use" " peak.util.decorators.frameinfo instead" from warnings import warn warn(getFrameInfo.__doc__, DeprecationWarning, 2) return decorators.frameinfo(frame) def determineMetaclass(bases, explicit_mc=None): "protocols.advice.determineMetaclass is deprecated, please use" " peak.util.decorators.metaclass_for_bases instead" from warnings import warn warn(determineMetaclass.__doc__, DeprecationWarning, 2) return decorators.metaclass_for_bases(bases, explicit_mc) def isClassAdvisor(ob): "protocols.advice.isClassAdvisor is deprecated, please use" " peak.util.decorators.metaclass_is_decorator instead" from warnings import warn warn(isClassAdvisor.__doc__, DeprecationWarning, 2) return decorators.metaclass_is_decorator(ob) def metamethod(func): """Wrapper for metaclass method that might be confused w/instance method""" return property(lambda ob: func.__get__(ob,ob.__class__)) try: from ExtensionClass import ExtensionClass except ImportError: ClassicTypes = ClassType else: ClassicTypes = ClassType, ExtensionClass def classicMRO(ob, extendedClassic=False): stack = [] push = stack.insert pop = stack.pop push(0,ob) while stack: cls = pop() yield cls p = len(stack) for b in cls.__bases__: push(p,b) if extendedClassic: yield InstanceType yield object def getMRO(ob, extendedClassic=False): if isinstance(ob,ClassicTypes): return classicMRO(ob,extendedClassic) elif isinstance(ob,type): return ob.__mro__ return ob, try: from _speedups import metamethod, getMRO, classicMRO except ImportError: pass # property-safe 'super()' for Python 2.2; 2.3 can use super() instead def supermeta(typ,ob): starttype = type(ob) mro = starttype.__mro__ if typ not in mro: starttype = ob mro = starttype.__mro__ mro = iter(mro) for cls 
in mro: if cls is typ: mro = [cls.__dict__ for cls in mro] break else: raise TypeError("Not sub/supertypes:", starttype, typ) typ = type(ob) class theSuper(object): def __getattribute__(self,name): for d in mro: if name in d: descr = d[name] try: descr = descr.__get__ except AttributeError: return descr else: return descr(ob,typ) return object.__getattribute__(self,name) return theSuper() def minimalBases(classes): """DEPRECATED""" from warnings import warn warn("protocols.advice.minimalBases is deprecated; please do not use it", DeprecationWarning, 2) classes = [c for c in classes if c is not ClassType] candidates = [] for m in classes: for n in classes: if issubclass(n,m) and m is not n: break else: # m has no subclasses in 'classes' if m in candidates: candidates.remove(m) # ensure that we're later in the list candidates.append(m) return candidates from weakref import ref class StrongRef(object): """Like a weakref, but for non-weakrefable objects""" __slots__ = 'referent' def __init__(self,referent): self.referent = referent def __call__(self): return self.referent def __hash__(self): return hash(self.referent) def __eq__(self,other): return self.referent==other def __repr__(self): return 'StrongRef(%r)' % self.referent def mkRef(ob,*args): """Return either a weakref or a StrongRef for 'ob' Note that extra args are forwarded to weakref.ref() if applicable.""" try: return ref(ob,*args) except TypeError: return StrongRef(ob) pyprotocols-1.0a.svn20070625/src/protocols/__init__.py0000644000175000017500000000057110056452644020660 0ustar kovkov"""Trivial Interfaces and Adaptation""" from api import * from adapters import NO_ADAPTER_NEEDED,DOES_NOT_SUPPORT,Adapter,StickyAdapter from adapters import AdaptationFailure from interfaces import * from advice import metamethod, supermeta from classic import ProviderMixin from generate import protocolForType, protocolForURI from generate import sequenceOf, IBasicSequence pyprotocols-1.0a.svn20070625/src/protocols/twisted_support.py0000644000175000017500000000753310072357530022401 0ustar kovkov"""Declaration support for Twisted Interfaces""" __all__ = [] from adapters import * from api import advise from interfaces import IOpenProtocol from weakref import WeakKeyDictionary # Twisted uses an approach to __adapt__ that is largely incompatible with # PEP 246, so we have to jump through some twisty hoops to convince it to work # for our purposes, without breaking Twisted's test suite. class TwistedAdaptMethod(object): """__adapt__ implementation for Twisted interfaces""" __slots__ = 'iface' def __init__(self,iface): self.iface = iface def __call__(self, obj): # This is the __adapt__ method that you get # for ISomething.__adapt__()... if TwistedImplements(obj, self.iface): return obj # Get Twisted to try and adapt return self.iface(obj, None) def im_func(self, ob, default): # And this is what MetaInterface.__call__ calls when # it goes after __adapt__.im_func! 
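        # In other words: Twisted's MetaInterface.__call__ reaches for
        # '__adapt__.im_func' as though '__adapt__' were an unbound
        # method, roughly 'self.__adapt__.im_func(ob, default)'; exposing
        # an 'im_func' attribute on this callable lets a plain object
        # stand in for the method Twisted expects.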
meth = self.iface.__dict__.get('__adapt__') if meth is None: return default return meth(ob,default) # Monkeypatch Twisted Interfaces try: from twisted.python.components import \ implements as TwistedImplements, \ MetaInterface as TwistedInterfaceClass, \ getInterfaces as TwistedGetInterfaces except ImportError: TwistedInterfaceTypes = [] else: # Force all Twisted interfaces to have an __adapt__ method TwistedInterfaceClass.__adapt__ = property(lambda s: TwistedAdaptMethod(s)) TwistedInterfaceTypes = [TwistedInterfaceClass] class TwistedInterfaceAsProtocol(object): __slots__ = 'iface' advise( instancesProvide=[IOpenProtocol], asAdapterForTypes=TwistedInterfaceTypes, ) def __init__(self, iface): self.iface = iface def __adapt__(self, obj): return self.iface.__adapt__(obj) def registerImplementation(self,klass,adapter=NO_ADAPTER_NEEDED,depth=1): oldImplements = TwistedGetInterfaces(klass) if adapter is NO_ADAPTER_NEEDED: klass.__implements__ = self.iface, tuple(oldImplements) elif adapter is DOES_NOT_SUPPORT: if self.iface in oldImplements: oldImplements.remove(self.iface) klass.__implements__ = tuple(oldImplements) else: raise TypeError( "Twisted interfaces can only declare support, not adapters", self.iface, klass, adapter ) def addImpliedProtocol(self, proto, adapter=NO_ADAPTER_NEEDED, depth=1): iface = self.iface # XXX need to ensure 'proto' is usable w/Twisted! self.iface.adaptWith(lambda o: adapter(o, iface), proto) # XXX is the above sufficient? # XXX What are Twisted's adapter override semantics? listeners = iface.__dict__.get('_Protocol__listeners',{}) for listener in listeners.keys(): # Must use keys()! listener.newProtocolImplied(self, proto, adapter, depth) def registerObject(self, ob, adapter=NO_ADAPTER_NEEDED, depth=1): oldImplements = TwistedGetInterfaces(ob) if adapter is NO_ADAPTER_NEEDED: ob.__implements__ = self.iface, tuple(oldImplements) elif adapter is DOES_NOT_SUPPORT: if self.iface in oldImplements: oldImplements.remove(self.iface) ob.__implements__ = tuple(oldImplements) else: raise TypeError( "Twisted interfaces can only declare support, not adapters", self.iface, ob, adapter ) def addImplicationListener(self, listener): listeners = self.iface.__dict__.setdefault( '_Protocol__listeners',WeakKeyDictionary() ) listeners[listener] = True pyprotocols-1.0a.svn20070625/src/protocols/_speedups.c0000644000175000017500000022615010570123574020703 0ustar kovkov/* Generated by Pyrex 0.9.5.1 on Wed Jan 31 16:35:25 2007 */ #include "Python.h" #include "structmember.h" #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif __PYX_EXTERN_C double pow(double, double); typedef struct {PyObject **p; char *s;} __Pyx_InternTabEntry; /*proto*/ typedef struct {PyObject **p; char *s; long n;} __Pyx_StringTabEntry; /*proto*/ static PyObject *__pyx_m; static PyObject *__pyx_b; static int __pyx_lineno; static char *__pyx_filename; static char **__pyx_f; static char __pyx_mdoc[] = "C Speedups for commonly-used operations"; static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list); /*proto*/ static PyObject *__Pyx_GetName(PyObject *dict, PyObject *name); /*proto*/ static PyObject *__Pyx_GetExcValue(void); /*proto*/ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb); /*proto*/ static void __Pyx_ReRaise(void); /*proto*/ static int __Pyx_InternStrings(__Pyx_InternTabEntry *t); /*proto*/ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); /*proto*/ static void 
__Pyx_AddTraceback(char *funcname); /*proto*/ /* Declarations from _speedups */ struct __pyx_obj_9_speedups_metamethod { PyObject_HEAD PyObject *func; }; static PyTypeObject *__pyx_ptype_9_speedups_metamethod = 0; static PyObject *__pyx_v_9_speedups__marker; static PyObject *__pyx_v_9_speedups___conform; static PyObject *__pyx_v_9_speedups___adapt; static PyObject *__pyx_v_9_speedups___mro; static PyObject *__pyx_v_9_speedups___ECType; static PyObject *__pyx_k19; static PyObject *__pyx_k20; static PyObject *__pyx_k21; static PyObject *__pyx_k22; static PyObject *__pyx_k23; static PyObject *__pyx_k24; static PyObject *__pyx_k25; static PyObject *(__pyx_f_9_speedups__adapt(PyObject *,PyObject *,PyObject *)); /*proto*/ static PyObject *(__pyx_f_9_speedups_buildClassicMRO(PyClassObject (*),PyListObject (*))); /*proto*/ static PyObject *(__pyx_f_9_speedups_buildECMRO(PyObject *,PyListObject (*))); /*proto*/ /* Implementation of _speedups */ static char (__pyx_k11[]) = "protocols.adapters"; static char (__pyx_k15[]) = "__conform__"; static char (__pyx_k16[]) = "__adapt__"; static char (__pyx_k17[]) = "__class__"; static char (__pyx_k18[]) = "__mro__"; static PyObject *__pyx_n___all__; static PyObject *__pyx_n_exc_info; static PyObject *__pyx_n_AdaptationFailure; static PyObject *__pyx_n_ExtensionClass; static PyObject *__pyx_n___class; static PyObject *__pyx_n_NO_ADAPTER_NEEDED; static PyObject *__pyx_n_DOES_NOT_SUPPORT; static PyObject *__pyx_n_adapt; static PyObject *__pyx_n_Protocol__call__; static PyObject *__pyx_n_classicMRO; static PyObject *__pyx_n_extClassMRO; static PyObject *__pyx_n_getMRO; static PyObject *__pyx_n_Protocol__adapt__; static PyObject *__pyx_n_metamethod; static PyObject *__pyx_n_sys; static PyObject *__pyx_n_ImportError; static PyObject *__pyx_n_object; static PyObject *__pyx_n_False; static PyObject *__pyx_k11p; static PyObject *__pyx_f_9_speedups_NO_ADAPTER_NEEDED(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static char __pyx_doc_9_speedups_NO_ADAPTER_NEEDED[] = "Assume \'obj\' implements \'protocol\' directly"; static PyObject *__pyx_f_9_speedups_NO_ADAPTER_NEEDED(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_obj = 0; PyObject *__pyx_v_protocol = 0; PyObject *__pyx_r; static char *__pyx_argnames[] = {"obj","protocol",0}; __pyx_v_protocol = __pyx_k19; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "O|O", __pyx_argnames, &__pyx_v_obj, &__pyx_v_protocol)) return 0; Py_INCREF(__pyx_v_obj); Py_INCREF(__pyx_v_protocol); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":87 */ Py_INCREF(__pyx_v_obj); __pyx_r = __pyx_v_obj; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); __pyx_L0:; Py_DECREF(__pyx_v_obj); Py_DECREF(__pyx_v_protocol); return __pyx_r; } static PyObject *__pyx_f_9_speedups_DOES_NOT_SUPPORT(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static char __pyx_doc_9_speedups_DOES_NOT_SUPPORT[] = "Prevent \'obj\' from supporting \'protocol\'"; static PyObject *__pyx_f_9_speedups_DOES_NOT_SUPPORT(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_obj = 0; PyObject *__pyx_v_protocol = 0; PyObject *__pyx_r; static char *__pyx_argnames[] = {"obj","protocol",0}; __pyx_v_protocol = __pyx_k20; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "O|O", __pyx_argnames, &__pyx_v_obj, &__pyx_v_protocol)) return 0; Py_INCREF(__pyx_v_obj); Py_INCREF(__pyx_v_protocol); /* 
"/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":91 */ Py_INCREF(Py_None); __pyx_r = Py_None; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); __pyx_L0:; Py_DECREF(__pyx_v_obj); Py_DECREF(__pyx_v_protocol); return __pyx_r; } static int __pyx_f_9_speedups_10metamethod___init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_f_9_speedups_10metamethod___init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_func = 0; int __pyx_r; static char *__pyx_argnames[] = {"func",0}; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "O", __pyx_argnames, &__pyx_v_func)) return -1; Py_INCREF(__pyx_v_self); Py_INCREF(__pyx_v_func); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":100 */ Py_INCREF(__pyx_v_func); Py_DECREF(((struct __pyx_obj_9_speedups_metamethod *)__pyx_v_self)->func); ((struct __pyx_obj_9_speedups_metamethod *)__pyx_v_self)->func = __pyx_v_func; __pyx_r = 0; Py_DECREF(__pyx_v_self); Py_DECREF(__pyx_v_func); return __pyx_r; } static PyObject *__pyx_f_9_speedups_10metamethod___get__(PyObject *__pyx_v_self, PyObject *__pyx_v_ob, PyObject *__pyx_v_typ); /*proto*/ static PyObject *__pyx_f_9_speedups_10metamethod___get__(PyObject *__pyx_v_self, PyObject *__pyx_v_ob, PyObject *__pyx_v_typ) { PyObject *__pyx_r; int __pyx_1; PyObject *__pyx_2 = 0; Py_INCREF(__pyx_v_self); Py_INCREF(__pyx_v_ob); Py_INCREF(__pyx_v_typ); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":103 */ __pyx_1 = __pyx_v_ob == Py_None; if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":104 */ Py_INCREF(__pyx_v_self); __pyx_r = __pyx_v_self; goto __pyx_L0; goto __pyx_L2; } __pyx_L2:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":105 */ __pyx_2 = PyMethod_New(((struct __pyx_obj_9_speedups_metamethod *)__pyx_v_self)->func,__pyx_v_ob,__pyx_v_typ); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 105; goto __pyx_L1;} __pyx_r = __pyx_2; __pyx_2 = 0; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_2); __Pyx_AddTraceback("_speedups.metamethod.__get__"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_self); Py_DECREF(__pyx_v_ob); Py_DECREF(__pyx_v_typ); return __pyx_r; } static PyObject *__pyx_n_AttributeError; static PyObject *__pyx_k26p; static char (__pyx_k26[]) = "Read-only attribute"; static int __pyx_f_9_speedups_10metamethod___set__(PyObject *__pyx_v_self, PyObject *__pyx_v_ob, PyObject *__pyx_v_value); /*proto*/ static int __pyx_f_9_speedups_10metamethod___set__(PyObject *__pyx_v_self, PyObject *__pyx_v_ob, PyObject *__pyx_v_value) { int __pyx_r; PyObject *__pyx_1 = 0; PyObject *__pyx_2 = 0; PyObject *__pyx_3 = 0; Py_INCREF(__pyx_v_self); Py_INCREF(__pyx_v_ob); Py_INCREF(__pyx_v_value); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":108 */ __pyx_1 = __Pyx_GetName(__pyx_b, __pyx_n_AttributeError); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 108; goto __pyx_L1;} __pyx_2 = PyTuple_New(1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 108; goto __pyx_L1;} Py_INCREF(__pyx_k26p); PyTuple_SET_ITEM(__pyx_2, 0, __pyx_k26p); __pyx_3 = PyObject_CallObject(__pyx_1, __pyx_2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 108; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; Py_DECREF(__pyx_2); __pyx_2 = 0; __Pyx_Raise(__pyx_3, 0, 0); Py_DECREF(__pyx_3); __pyx_3 = 0; {__pyx_filename = __pyx_f[0]; __pyx_lineno = 108; goto __pyx_L1;} __pyx_r = 0; goto 
__pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_1); Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); __Pyx_AddTraceback("_speedups.metamethod.__set__"); __pyx_r = -1; __pyx_L0:; Py_DECREF(__pyx_v_self); Py_DECREF(__pyx_v_ob); Py_DECREF(__pyx_v_value); return __pyx_r; } static PyObject *__pyx_k27p; static char (__pyx_k27[]) = "Read-only attribute"; static int __pyx_f_9_speedups_10metamethod___delete__(PyObject *__pyx_v_self, PyObject *__pyx_v_ob); /*proto*/ static int __pyx_f_9_speedups_10metamethod___delete__(PyObject *__pyx_v_self, PyObject *__pyx_v_ob) { int __pyx_r; PyObject *__pyx_1 = 0; PyObject *__pyx_2 = 0; PyObject *__pyx_3 = 0; Py_INCREF(__pyx_v_self); Py_INCREF(__pyx_v_ob); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":111 */ __pyx_1 = __Pyx_GetName(__pyx_b, __pyx_n_AttributeError); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 111; goto __pyx_L1;} __pyx_2 = PyTuple_New(1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 111; goto __pyx_L1;} Py_INCREF(__pyx_k27p); PyTuple_SET_ITEM(__pyx_2, 0, __pyx_k27p); __pyx_3 = PyObject_CallObject(__pyx_1, __pyx_2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 111; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; Py_DECREF(__pyx_2); __pyx_2 = 0; __Pyx_Raise(__pyx_3, 0, 0); Py_DECREF(__pyx_3); __pyx_3 = 0; {__pyx_filename = __pyx_f[0]; __pyx_lineno = 111; goto __pyx_L1;} __pyx_r = 0; goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_1); Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); __Pyx_AddTraceback("_speedups.metamethod.__delete__"); __pyx_r = -1; __pyx_L0:; Py_DECREF(__pyx_v_self); Py_DECREF(__pyx_v_ob); return __pyx_r; } static PyObject *__pyx_n_TypeError; static PyObject *__pyx_n_tb_next; static PyObject *__pyx_k28p; static char (__pyx_k28[]) = "Can't adapt"; static PyObject *__pyx_f_9_speedups__adapt(PyObject *__pyx_v_obj,PyObject *__pyx_v_protocol,PyObject *__pyx_v_default) { void (*__pyx_v_tmp); PyObject *__pyx_v_meth; PyObject *__pyx_v_result; PyObject *__pyx_v_err; PyObject *__pyx_r; int __pyx_1; PyObject *__pyx_2 = 0; PyObject *__pyx_3 = 0; PyObject *__pyx_4 = 0; Py_INCREF(__pyx_v_obj); Py_INCREF(__pyx_v_protocol); Py_INCREF(__pyx_v_default); __pyx_v_meth = Py_None; Py_INCREF(Py_None); __pyx_v_result = Py_None; Py_INCREF(Py_None); __pyx_v_err = Py_None; Py_INCREF(Py_None); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":131 */ __pyx_1 = PyType_Check(__pyx_v_protocol); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":132 */ __pyx_1 = PyObject_TypeCheck(__pyx_v_obj,__pyx_v_protocol); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":133 */ Py_INCREF(__pyx_v_obj); __pyx_r = __pyx_v_obj; goto __pyx_L0; goto __pyx_L3; } __pyx_L3:; goto __pyx_L2; } __pyx_L2:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":135 */ __pyx_1 = PyClass_Check(__pyx_v_protocol); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":136 */ __pyx_1 = PyInstance_Check(__pyx_v_obj); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":137 */ __pyx_1 = PyObject_IsInstance(__pyx_v_obj,__pyx_v_protocol); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":138 */ Py_INCREF(__pyx_v_obj); __pyx_r = __pyx_v_obj; goto __pyx_L0; goto __pyx_L6; } __pyx_L6:; goto __pyx_L5; } __pyx_L5:; goto __pyx_L4; } __pyx_L4:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":140 */ __pyx_v_tmp = PyObject_GetAttr(__pyx_v_obj,__pyx_v_9_speedups___conform); /* 
"/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":141 */ __pyx_1 = (__pyx_v_tmp != 0); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":142 */ __pyx_2 = (PyObject *)__pyx_v_tmp; Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_meth); __pyx_v_meth = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":143 */ Py_DECREF(((PyObject (*))__pyx_v_tmp)); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":144 */ /*try:*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":145 */ __pyx_2 = PyTuple_New(1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 145; goto __pyx_L8;} Py_INCREF(__pyx_v_protocol); PyTuple_SET_ITEM(__pyx_2, 0, __pyx_v_protocol); __pyx_3 = PyObject_CallObject(__pyx_v_meth, __pyx_2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 145; goto __pyx_L8;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_v_result); __pyx_v_result = __pyx_3; __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":146 */ __pyx_1 = __pyx_v_result != Py_None; if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":147 */ Py_INCREF(__pyx_v_result); __pyx_r = __pyx_v_result; goto __pyx_L0; goto __pyx_L10; } __pyx_L10:; } goto __pyx_L9; __pyx_L8:; Py_XDECREF(__pyx_2); __pyx_2 = 0; Py_XDECREF(__pyx_3); __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":148 */ __pyx_2 = __Pyx_GetName(__pyx_b, __pyx_n_TypeError); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 148; goto __pyx_L1;} __pyx_1 = PyErr_ExceptionMatches(__pyx_2); Py_DECREF(__pyx_2); __pyx_2 = 0; if (__pyx_1) { __Pyx_AddTraceback("_speedups._adapt"); __pyx_3 = __Pyx_GetExcValue(); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 148; goto __pyx_L1;} Py_DECREF(__pyx_3); __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":149 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n_exc_info); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 149; goto __pyx_L1;} __pyx_3 = PyObject_CallObject(__pyx_2, 0); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 149; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_2 = PyInt_FromLong(2); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 149; goto __pyx_L1;} __pyx_4 = PyObject_GetItem(__pyx_3, __pyx_2); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 149; goto __pyx_L1;} Py_DECREF(__pyx_3); __pyx_3 = 0; Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_3 = PyObject_GetAttr(__pyx_4, __pyx_n_tb_next); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 149; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; __pyx_1 = __pyx_3 != Py_None; Py_DECREF(__pyx_3); __pyx_3 = 0; if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":150 */ __Pyx_ReRaise(); {__pyx_filename = __pyx_f[0]; __pyx_lineno = 150; goto __pyx_L1;} goto __pyx_L11; } __pyx_L11:; goto __pyx_L9; } goto __pyx_L1; __pyx_L9:; goto __pyx_L7; } __pyx_1 = PyErr_ExceptionMatches(PyExc_AttributeError); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":152 */ PyErr_Clear(); goto __pyx_L7; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":154 */ __pyx_2 = __Pyx_GetExcValue(); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 154; goto __pyx_L1;} Py_DECREF(__pyx_v_err); __pyx_v_err = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":155 */ __Pyx_ReRaise(); {__pyx_filename = __pyx_f[0]; __pyx_lineno 
= 155; goto __pyx_L1;} } __pyx_L7:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":165 */ __pyx_v_tmp = PyObject_GetAttr(__pyx_v_protocol,__pyx_v_9_speedups___adapt); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":166 */ __pyx_1 = (__pyx_v_tmp != 0); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":167 */ __pyx_4 = (PyObject *)__pyx_v_tmp; Py_INCREF(__pyx_4); Py_DECREF(__pyx_v_meth); __pyx_v_meth = __pyx_4; __pyx_4 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":168 */ Py_DECREF(((PyObject (*))__pyx_v_tmp)); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":169 */ /*try:*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":170 */ __pyx_3 = PyTuple_New(1); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 170; goto __pyx_L13;} Py_INCREF(__pyx_v_obj); PyTuple_SET_ITEM(__pyx_3, 0, __pyx_v_obj); __pyx_2 = PyObject_CallObject(__pyx_v_meth, __pyx_3); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 170; goto __pyx_L13;} Py_DECREF(__pyx_3); __pyx_3 = 0; Py_DECREF(__pyx_v_result); __pyx_v_result = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":171 */ __pyx_1 = __pyx_v_result != Py_None; if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":172 */ Py_INCREF(__pyx_v_result); __pyx_r = __pyx_v_result; goto __pyx_L0; goto __pyx_L15; } __pyx_L15:; } goto __pyx_L14; __pyx_L13:; Py_XDECREF(__pyx_4); __pyx_4 = 0; Py_XDECREF(__pyx_3); __pyx_3 = 0; Py_XDECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":173 */ __pyx_4 = __Pyx_GetName(__pyx_b, __pyx_n_TypeError); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 173; goto __pyx_L1;} __pyx_1 = PyErr_ExceptionMatches(__pyx_4); Py_DECREF(__pyx_4); __pyx_4 = 0; if (__pyx_1) { __Pyx_AddTraceback("_speedups._adapt"); __pyx_3 = __Pyx_GetExcValue(); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 173; goto __pyx_L1;} Py_DECREF(__pyx_3); __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":174 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n_exc_info); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 174; goto __pyx_L1;} __pyx_4 = PyObject_CallObject(__pyx_2, 0); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 174; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_3 = PyInt_FromLong(2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 174; goto __pyx_L1;} __pyx_2 = PyObject_GetItem(__pyx_4, __pyx_3); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 174; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; Py_DECREF(__pyx_3); __pyx_3 = 0; __pyx_4 = PyObject_GetAttr(__pyx_2, __pyx_n_tb_next); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 174; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_1 = __pyx_4 != Py_None; Py_DECREF(__pyx_4); __pyx_4 = 0; if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":175 */ __Pyx_ReRaise(); {__pyx_filename = __pyx_f[0]; __pyx_lineno = 175; goto __pyx_L1;} goto __pyx_L16; } __pyx_L16:; goto __pyx_L14; } goto __pyx_L1; __pyx_L14:; goto __pyx_L12; } __pyx_1 = PyErr_ExceptionMatches(PyExc_AttributeError); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":177 */ PyErr_Clear(); goto __pyx_L12; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":179 */ __pyx_3 = __Pyx_GetExcValue(); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; 
__pyx_lineno = 179; goto __pyx_L1;} Py_DECREF(__pyx_v_err); __pyx_v_err = __pyx_3; __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":180 */ __Pyx_ReRaise(); {__pyx_filename = __pyx_f[0]; __pyx_lineno = 180; goto __pyx_L1;} } __pyx_L12:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":182 */ __pyx_1 = __pyx_v_default == __pyx_v_9_speedups__marker; if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":183 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n_AdaptationFailure); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 183; goto __pyx_L1;} __pyx_4 = PyTuple_New(3); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 183; goto __pyx_L1;} Py_INCREF(__pyx_k28p); PyTuple_SET_ITEM(__pyx_4, 0, __pyx_k28p); Py_INCREF(__pyx_v_obj); PyTuple_SET_ITEM(__pyx_4, 1, __pyx_v_obj); Py_INCREF(__pyx_v_protocol); PyTuple_SET_ITEM(__pyx_4, 2, __pyx_v_protocol); __pyx_3 = PyObject_CallObject(__pyx_2, __pyx_4); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 183; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_4); __pyx_4 = 0; __Pyx_Raise(__pyx_3, 0, 0); Py_DECREF(__pyx_3); __pyx_3 = 0; {__pyx_filename = __pyx_f[0]; __pyx_lineno = 183; goto __pyx_L1;} goto __pyx_L17; } __pyx_L17:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":185 */ Py_INCREF(__pyx_v_default); __pyx_r = __pyx_v_default; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); Py_XDECREF(__pyx_4); __Pyx_AddTraceback("_speedups._adapt"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_meth); Py_DECREF(__pyx_v_result); Py_DECREF(__pyx_v_err); Py_DECREF(__pyx_v_obj); Py_DECREF(__pyx_v_protocol); Py_DECREF(__pyx_v_default); return __pyx_r; } static PyObject *__pyx_f_9_speedups_adapt(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static char __pyx_doc_9_speedups_adapt[] = "PEP 246-alike: Adapt \'obj\' to \'protocol\', return \'default\'\n\n If \'default\' is not supplied and no implementation is found,\n raise \'AdaptationFailure\'."; static PyObject *__pyx_f_9_speedups_adapt(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_obj = 0; PyObject *__pyx_v_protocol = 0; PyObject *__pyx_v_default = 0; PyObject *__pyx_r; PyObject *__pyx_1 = 0; static char *__pyx_argnames[] = {"obj","protocol","default",0}; __pyx_v_default = __pyx_k21; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "OO|O", __pyx_argnames, &__pyx_v_obj, &__pyx_v_protocol, &__pyx_v_default)) return 0; Py_INCREF(__pyx_v_obj); Py_INCREF(__pyx_v_protocol); Py_INCREF(__pyx_v_default); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":194 */ __pyx_1 = __pyx_f_9_speedups__adapt(__pyx_v_obj,__pyx_v_protocol,__pyx_v_default); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 194; goto __pyx_L1;} __pyx_r = __pyx_1; __pyx_1 = 0; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_1); __Pyx_AddTraceback("_speedups.adapt"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_obj); Py_DECREF(__pyx_v_protocol); Py_DECREF(__pyx_v_default); return __pyx_r; } static PyObject *__pyx_f_9_speedups_Protocol__call__(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static char __pyx_doc_9_speedups_Protocol__call__[] = "Adapt to this protocol"; static PyObject *__pyx_f_9_speedups_Protocol__call__(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject 
*__pyx_v_self = 0; PyObject *__pyx_v_ob = 0; PyObject *__pyx_v_default = 0; PyObject *__pyx_r; PyObject *__pyx_1 = 0; static char *__pyx_argnames[] = {"self","ob","default",0}; __pyx_v_default = __pyx_k22; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "OO|O", __pyx_argnames, &__pyx_v_self, &__pyx_v_ob, &__pyx_v_default)) return 0; Py_INCREF(__pyx_v_self); Py_INCREF(__pyx_v_ob); Py_INCREF(__pyx_v_default); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":198 */ __pyx_1 = __pyx_f_9_speedups__adapt(__pyx_v_ob,__pyx_v_self,__pyx_v_default); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 198; goto __pyx_L1;} __pyx_r = __pyx_1; __pyx_1 = 0; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_1); __Pyx_AddTraceback("_speedups.Protocol__call__"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_self); Py_DECREF(__pyx_v_ob); Py_DECREF(__pyx_v_default); return __pyx_r; } static PyObject *__pyx_f_9_speedups_buildClassicMRO(PyClassObject (*__pyx_v_cls),PyListObject (*__pyx_v_list)) { PyTupleObject (*__pyx_v_bases); int __pyx_v_i; PyObject *__pyx_v_tmp; PyObject *__pyx_r; PyObject *__pyx_1 = 0; int __pyx_2; __pyx_v_tmp = Py_None; Py_INCREF(Py_None); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":211 */ __pyx_1 = (PyObject *)__pyx_v_cls; Py_INCREF(__pyx_1); __pyx_2 = PyList_Append(__pyx_v_list,__pyx_1); if (__pyx_2 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 211; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":212 */ __pyx_v_bases = __pyx_v_cls->cl_bases; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":214 */ __pyx_2 = (__pyx_v_bases != 0); if (__pyx_2) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":215 */ __pyx_2 = PyTuple_GET_SIZE(__pyx_v_bases); for (__pyx_v_i = 0; __pyx_v_i < __pyx_2; ++__pyx_v_i) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":216 */ __pyx_1 = (PyObject *)PyTuple_GET_ITEM(__pyx_v_bases,__pyx_v_i); Py_INCREF(__pyx_1); Py_DECREF(__pyx_v_tmp); __pyx_v_tmp = __pyx_1; __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":217 */ __pyx_1 = __pyx_f_9_speedups_buildClassicMRO(((PyClassObject (*))__pyx_v_tmp),__pyx_v_list); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 217; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; } goto __pyx_L2; } __pyx_L2:; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_1); __Pyx_AddTraceback("_speedups.buildClassicMRO"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_tmp); return __pyx_r; } static PyObject *__pyx_k29p; static char (__pyx_k29[]) = "Not a classic class"; static PyObject *__pyx_f_9_speedups_classicMRO(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyObject *__pyx_f_9_speedups_classicMRO(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_ob = 0; PyObject *__pyx_v_extendedClassic = 0; PyObject *__pyx_v_mro; PyObject *__pyx_r; int __pyx_1; PyObject *__pyx_2 = 0; PyObject *__pyx_3 = 0; PyObject *__pyx_4 = 0; static char *__pyx_argnames[] = {"ob","extendedClassic",0}; __pyx_v_extendedClassic = __pyx_k23; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "O|O", __pyx_argnames, &__pyx_v_ob, &__pyx_v_extendedClassic)) return 0; Py_INCREF(__pyx_v_ob); Py_INCREF(__pyx_v_extendedClassic); __pyx_v_mro = Py_None; Py_INCREF(Py_None); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":222 */ 
__pyx_1 = PyClass_Check(__pyx_v_ob); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":223 */ __pyx_2 = PyList_New(0); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 223; goto __pyx_L1;} Py_DECREF(__pyx_v_mro); __pyx_v_mro = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":224 */ __pyx_2 = __pyx_f_9_speedups_buildClassicMRO(((PyClassObject (*))__pyx_v_ob),((PyListObject (*))__pyx_v_mro)); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 224; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":225 */ __pyx_1 = PyObject_IsTrue(__pyx_v_extendedClassic); if (__pyx_1 < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 225; goto __pyx_L1;} if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":226 */ __pyx_2 = (PyObject *)(&PyInstance_Type); Py_INCREF(__pyx_2); __pyx_1 = PyList_Append(((PyListObject (*))__pyx_v_mro),__pyx_2); if (__pyx_1 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 226; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":227 */ __pyx_2 = (PyObject *)(&PyBaseObject_Type); Py_INCREF(__pyx_2); __pyx_1 = PyList_Append(((PyListObject (*))__pyx_v_mro),__pyx_2); if (__pyx_1 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 227; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; goto __pyx_L3; } __pyx_L3:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":228 */ Py_INCREF(__pyx_v_mro); __pyx_r = __pyx_v_mro; goto __pyx_L0; goto __pyx_L2; } __pyx_L2:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":230 */ __pyx_2 = __Pyx_GetName(__pyx_b, __pyx_n_TypeError); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 230; goto __pyx_L1;} __pyx_3 = PyTuple_New(2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 230; goto __pyx_L1;} Py_INCREF(__pyx_k29p); PyTuple_SET_ITEM(__pyx_3, 0, __pyx_k29p); Py_INCREF(__pyx_v_ob); PyTuple_SET_ITEM(__pyx_3, 1, __pyx_v_ob); __pyx_4 = PyObject_CallObject(__pyx_2, __pyx_3); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 230; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_3); __pyx_3 = 0; __Pyx_Raise(__pyx_4, 0, 0); Py_DECREF(__pyx_4); __pyx_4 = 0; {__pyx_filename = __pyx_f[0]; __pyx_lineno = 230; goto __pyx_L1;} __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); Py_XDECREF(__pyx_4); __Pyx_AddTraceback("_speedups.classicMRO"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_mro); Py_DECREF(__pyx_v_ob); Py_DECREF(__pyx_v_extendedClassic); return __pyx_r; } static PyObject *__pyx_n___bases__; static PyObject *__pyx_f_9_speedups_buildECMRO(PyObject *__pyx_v_cls,PyListObject (*__pyx_v_list)) { PyObject *__pyx_v_i; PyObject *__pyx_r; int __pyx_1; PyObject *__pyx_2 = 0; PyObject *__pyx_3 = 0; Py_INCREF(__pyx_v_cls); __pyx_v_i = Py_None; Py_INCREF(Py_None); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":248 */ __pyx_1 = PyList_Append(__pyx_v_list,__pyx_v_cls); if (__pyx_1 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 248; goto __pyx_L1;} /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":249 */ __pyx_2 = PyObject_GetAttr(__pyx_v_cls, __pyx_n___bases__); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 249; goto __pyx_L1;} __pyx_3 = PyObject_GetIter(__pyx_2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 249; goto __pyx_L1;} Py_DECREF(__pyx_2); 
__pyx_2 = 0; for (;;) { __pyx_2 = PyIter_Next(__pyx_3); if (!__pyx_2) { if (PyErr_Occurred()) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 249; goto __pyx_L1;} break; } Py_DECREF(__pyx_v_i); __pyx_v_i = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":250 */ __pyx_2 = __pyx_f_9_speedups_buildECMRO(__pyx_v_i,__pyx_v_list); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 250; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; } Py_DECREF(__pyx_3); __pyx_3 = 0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); __Pyx_AddTraceback("_speedups.buildECMRO"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_i); Py_DECREF(__pyx_v_cls); return __pyx_r; } static PyObject *__pyx_f_9_speedups_extClassMRO(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyObject *__pyx_f_9_speedups_extClassMRO(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_ob = 0; PyObject *__pyx_v_extendedClassic = 0; PyObject *__pyx_v_mro; PyObject *__pyx_r; PyObject *__pyx_1 = 0; int __pyx_2; static char *__pyx_argnames[] = {"ob","extendedClassic",0}; __pyx_v_extendedClassic = __pyx_k24; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "O|O", __pyx_argnames, &__pyx_v_ob, &__pyx_v_extendedClassic)) return 0; Py_INCREF(__pyx_v_ob); Py_INCREF(__pyx_v_extendedClassic); __pyx_v_mro = Py_None; Py_INCREF(Py_None); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":254 */ __pyx_1 = PyList_New(0); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 254; goto __pyx_L1;} Py_DECREF(__pyx_v_mro); __pyx_v_mro = __pyx_1; __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":255 */ __pyx_1 = __pyx_f_9_speedups_buildECMRO(__pyx_v_ob,((PyListObject (*))__pyx_v_mro)); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 255; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":256 */ __pyx_2 = PyObject_IsTrue(__pyx_v_extendedClassic); if (__pyx_2 < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 256; goto __pyx_L1;} if (__pyx_2) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":257 */ __pyx_1 = (PyObject *)(&PyInstance_Type); Py_INCREF(__pyx_1); __pyx_2 = PyList_Append(((PyListObject (*))__pyx_v_mro),__pyx_1); if (__pyx_2 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 257; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":258 */ __pyx_1 = (PyObject *)(&PyBaseObject_Type); Py_INCREF(__pyx_1); __pyx_2 = PyList_Append(((PyListObject (*))__pyx_v_mro),__pyx_1); if (__pyx_2 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 258; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; goto __pyx_L2; } __pyx_L2:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":259 */ Py_INCREF(__pyx_v_mro); __pyx_r = __pyx_v_mro; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_1); __Pyx_AddTraceback("_speedups.extClassMRO"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_mro); Py_DECREF(__pyx_v_ob); Py_DECREF(__pyx_v_extendedClassic); return __pyx_r; } static PyObject *__pyx_n___mro__; static PyObject *__pyx_f_9_speedups_getMRO(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyObject *__pyx_f_9_speedups_getMRO(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_ob = 0; PyObject 
*__pyx_v_extendedClassic = 0; PyObject *__pyx_r; int __pyx_1; PyObject *__pyx_2 = 0; PyObject *__pyx_3 = 0; PyObject *__pyx_4 = 0; static char *__pyx_argnames[] = {"ob","extendedClassic",0}; __pyx_v_extendedClassic = __pyx_k25; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "O|O", __pyx_argnames, &__pyx_v_ob, &__pyx_v_extendedClassic)) return 0; Py_INCREF(__pyx_v_ob); Py_INCREF(__pyx_v_extendedClassic); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":265 */ __pyx_1 = PyClass_Check(__pyx_v_ob); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":266 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n_classicMRO); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 266; goto __pyx_L1;} __pyx_3 = PyTuple_New(2); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 266; goto __pyx_L1;} Py_INCREF(__pyx_v_ob); PyTuple_SET_ITEM(__pyx_3, 0, __pyx_v_ob); Py_INCREF(__pyx_v_extendedClassic); PyTuple_SET_ITEM(__pyx_3, 1, __pyx_v_extendedClassic); __pyx_4 = PyObject_CallObject(__pyx_2, __pyx_3); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 266; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_3); __pyx_3 = 0; __pyx_r = __pyx_4; __pyx_4 = 0; goto __pyx_L0; goto __pyx_L2; } __pyx_1 = PyType_Check(__pyx_v_ob); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":269 */ __pyx_2 = PyObject_GetAttr(__pyx_v_ob, __pyx_n___mro__); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 269; goto __pyx_L1;} __pyx_r = __pyx_2; __pyx_2 = 0; goto __pyx_L0; goto __pyx_L2; } __pyx_1 = PyObject_TypeCheck(__pyx_v_ob,__pyx_v_9_speedups___ECType); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":272 */ __pyx_3 = __Pyx_GetName(__pyx_m, __pyx_n_extClassMRO); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 272; goto __pyx_L1;} __pyx_4 = PyTuple_New(2); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 272; goto __pyx_L1;} Py_INCREF(__pyx_v_ob); PyTuple_SET_ITEM(__pyx_4, 0, __pyx_v_ob); Py_INCREF(__pyx_v_extendedClassic); PyTuple_SET_ITEM(__pyx_4, 1, __pyx_v_extendedClassic); __pyx_2 = PyObject_CallObject(__pyx_3, __pyx_4); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 272; goto __pyx_L1;} Py_DECREF(__pyx_3); __pyx_3 = 0; Py_DECREF(__pyx_4); __pyx_4 = 0; __pyx_r = __pyx_2; __pyx_2 = 0; goto __pyx_L0; goto __pyx_L2; } __pyx_L2:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":274 */ __pyx_3 = PyTuple_New(1); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 274; goto __pyx_L1;} Py_INCREF(__pyx_v_ob); PyTuple_SET_ITEM(__pyx_3, 0, __pyx_v_ob); __pyx_r = __pyx_3; __pyx_3 = 0; goto __pyx_L0; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); Py_XDECREF(__pyx_4); __Pyx_AddTraceback("_speedups.getMRO"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_ob); Py_DECREF(__pyx_v_extendedClassic); return __pyx_r; } static PyObject *__pyx_n__Protocol__adapters; static PyObject *__pyx_n_get; static PyObject *__pyx_f_9_speedups_Protocol__adapt__(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyObject *__pyx_f_9_speedups_Protocol__adapt__(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_self = 0; PyObject *__pyx_v_obj = 0; void (*__pyx_v_tmp); int __pyx_v_i; PyObject *__pyx_v_cls; PyObject *__pyx_v_err; PyObject *__pyx_v_mro; PyObject *__pyx_v_get; PyObject *__pyx_v_factory; PyObject *__pyx_r; int __pyx_1; PyObject 
*__pyx_2 = 0; PyObject *__pyx_3 = 0; PyObject *__pyx_4 = 0; int __pyx_5; PyObject *__pyx_6 = 0; static char *__pyx_argnames[] = {"self","obj",0}; if (!PyArg_ParseTupleAndKeywords(__pyx_args, __pyx_kwds, "OO", __pyx_argnames, &__pyx_v_self, &__pyx_v_obj)) return 0; Py_INCREF(__pyx_v_self); Py_INCREF(__pyx_v_obj); __pyx_v_cls = Py_None; Py_INCREF(Py_None); __pyx_v_err = Py_None; Py_INCREF(Py_None); __pyx_v_mro = Py_None; Py_INCREF(Py_None); __pyx_v_get = Py_None; Py_INCREF(Py_None); __pyx_v_factory = Py_None; Py_INCREF(Py_None); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":293 */ __pyx_1 = PyInstance_Check(__pyx_v_obj); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":294 */ __pyx_2 = (PyObject *)((PyInstanceObject (*))__pyx_v_obj)->in_class; Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_cls); __pyx_v_cls = __pyx_2; __pyx_2 = 0; goto __pyx_L2; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":297 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n___class); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 297; goto __pyx_L1;} __pyx_v_tmp = PyObject_GetAttr(__pyx_v_obj,__pyx_2); Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":299 */ __pyx_1 = (__pyx_v_tmp != 0); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":300 */ __pyx_2 = (PyObject *)__pyx_v_tmp; Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_cls); __pyx_v_cls = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":301 */ Py_DECREF(((PyObject (*))__pyx_v_tmp)); goto __pyx_L3; } __pyx_1 = PyErr_ExceptionMatches(PyExc_AttributeError); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":305 */ PyErr_Clear(); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":306 */ __pyx_2 = (PyObject *)((PyObject (*))__pyx_v_obj)->ob_type; Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_cls); __pyx_v_cls = __pyx_2; __pyx_2 = 0; goto __pyx_L3; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":310 */ __pyx_2 = __Pyx_GetExcValue(); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 310; goto __pyx_L1;} Py_DECREF(__pyx_v_err); __pyx_v_err = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":311 */ __Pyx_ReRaise(); {__pyx_filename = __pyx_f[0]; __pyx_lineno = 311; goto __pyx_L1;} } __pyx_L3:; } __pyx_L2:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":313 */ __pyx_v_tmp = ((void (*))0); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":315 */ __pyx_1 = PyType_Check(__pyx_v_cls); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":317 */ __pyx_v_tmp = ((void (*))((PyTypeObject (*))__pyx_v_cls)->tp_mro); goto __pyx_L4; } __pyx_L4:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":329 */ __pyx_1 = (__pyx_v_tmp != 0); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":330 */ __pyx_2 = (PyObject *)__pyx_v_tmp; Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_mro); __pyx_v_mro = __pyx_2; __pyx_2 = 0; goto __pyx_L5; } __pyx_1 = PyClass_Check(__pyx_v_cls); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":334 */ __pyx_2 = PyList_New(0); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 334; goto __pyx_L1;} Py_DECREF(__pyx_v_mro); __pyx_v_mro = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":335 */ __pyx_2 = 
__pyx_f_9_speedups_buildClassicMRO(((PyClassObject (*))__pyx_v_cls),((PyListObject (*))__pyx_v_mro)); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 335; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":336 */ __pyx_2 = (PyObject *)(&PyInstance_Type); Py_INCREF(__pyx_2); __pyx_1 = PyList_Append(((PyListObject (*))__pyx_v_mro),__pyx_2); if (__pyx_1 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 336; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":337 */ __pyx_2 = (PyObject *)(&PyBaseObject_Type); Py_INCREF(__pyx_2); __pyx_1 = PyList_Append(((PyListObject (*))__pyx_v_mro),__pyx_2); if (__pyx_1 == (-1)) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 337; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; goto __pyx_L5; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":341 */ __pyx_v_tmp = PyObject_GetAttr(__pyx_v_cls,__pyx_v_9_speedups___mro); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":342 */ __pyx_1 = (__pyx_v_tmp != 0); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":343 */ __pyx_2 = (PyObject *)__pyx_v_tmp; Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_mro); __pyx_v_mro = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":344 */ Py_DECREF(((PyObject (*))__pyx_v_tmp)); goto __pyx_L6; } __pyx_1 = PyObject_TypeCheck(__pyx_v_cls,__pyx_v_9_speedups___ECType); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":349 */ PyErr_Clear(); /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":350 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n_extClassMRO); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 350; goto __pyx_L1;} __pyx_3 = PyInt_FromLong(1); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 350; goto __pyx_L1;} __pyx_4 = PyTuple_New(2); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 350; goto __pyx_L1;} Py_INCREF(__pyx_v_cls); PyTuple_SET_ITEM(__pyx_4, 0, __pyx_v_cls); PyTuple_SET_ITEM(__pyx_4, 1, __pyx_3); __pyx_3 = 0; __pyx_3 = PyObject_CallObject(__pyx_2, __pyx_4); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 350; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_4); __pyx_4 = 0; Py_DECREF(__pyx_v_mro); __pyx_v_mro = __pyx_3; __pyx_3 = 0; goto __pyx_L6; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":355 */ __pyx_2 = __Pyx_GetExcValue(); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 355; goto __pyx_L1;} Py_DECREF(__pyx_v_err); __pyx_v_err = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":356 */ __Pyx_ReRaise(); {__pyx_filename = __pyx_f[0]; __pyx_lineno = 356; goto __pyx_L1;} } __pyx_L6:; } __pyx_L5:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":370 */ __pyx_4 = PyObject_GetAttr(__pyx_v_self, __pyx_n__Protocol__adapters); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 370; goto __pyx_L1;} __pyx_3 = PyObject_GetAttr(__pyx_4, __pyx_n_get); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 370; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; Py_DECREF(__pyx_v_get); __pyx_v_get = __pyx_3; __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":372 */ __pyx_1 = PyTuple_Check(__pyx_v_mro); if (__pyx_1) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":374 */ __pyx_1 = PyTuple_GET_SIZE(((PyTupleObject 
(*))__pyx_v_mro)); for (__pyx_v_i = 0; __pyx_v_i < __pyx_1; ++__pyx_v_i) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":375 */ __pyx_2 = (PyObject *)PyTuple_GET_ITEM(((PyTupleObject (*))__pyx_v_mro),__pyx_v_i); Py_INCREF(__pyx_2); Py_DECREF(__pyx_v_cls); __pyx_v_cls = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":376 */ __pyx_4 = PyTuple_New(1); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 376; goto __pyx_L1;} Py_INCREF(__pyx_v_cls); PyTuple_SET_ITEM(__pyx_4, 0, __pyx_v_cls); __pyx_3 = PyObject_CallObject(__pyx_v_get, __pyx_4); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 376; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; Py_DECREF(__pyx_v_factory); __pyx_v_factory = __pyx_3; __pyx_3 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":377 */ __pyx_5 = __pyx_v_factory != Py_None; if (__pyx_5) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":378 */ __pyx_2 = PyInt_FromLong(0); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 378; goto __pyx_L1;} __pyx_4 = PyObject_GetItem(__pyx_v_factory, __pyx_2); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 378; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_3 = PyTuple_New(1); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 378; goto __pyx_L1;} Py_INCREF(__pyx_v_obj); PyTuple_SET_ITEM(__pyx_3, 0, __pyx_v_obj); __pyx_2 = PyObject_CallObject(__pyx_4, __pyx_3); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 378; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; Py_DECREF(__pyx_3); __pyx_3 = 0; __pyx_r = __pyx_2; __pyx_2 = 0; goto __pyx_L0; goto __pyx_L10; } __pyx_L10:; } goto __pyx_L7; } __pyx_5 = PyList_Check(__pyx_v_mro); if (__pyx_5) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":382 */ __pyx_1 = PyList_GET_SIZE(((PyListObject (*))__pyx_v_mro)); for (__pyx_v_i = 0; __pyx_v_i < __pyx_1; ++__pyx_v_i) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":383 */ __pyx_4 = (PyObject *)PyList_GET_ITEM(((PyListObject (*))__pyx_v_mro),__pyx_v_i); Py_INCREF(__pyx_4); Py_DECREF(__pyx_v_cls); __pyx_v_cls = __pyx_4; __pyx_4 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":384 */ __pyx_3 = PyTuple_New(1); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 384; goto __pyx_L1;} Py_INCREF(__pyx_v_cls); PyTuple_SET_ITEM(__pyx_3, 0, __pyx_v_cls); __pyx_2 = PyObject_CallObject(__pyx_v_get, __pyx_3); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 384; goto __pyx_L1;} Py_DECREF(__pyx_3); __pyx_3 = 0; Py_DECREF(__pyx_v_factory); __pyx_v_factory = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":385 */ __pyx_5 = __pyx_v_factory != Py_None; if (__pyx_5) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":386 */ __pyx_4 = PyInt_FromLong(0); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 386; goto __pyx_L1;} __pyx_3 = PyObject_GetItem(__pyx_v_factory, __pyx_4); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 386; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; __pyx_2 = PyTuple_New(1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 386; goto __pyx_L1;} Py_INCREF(__pyx_v_obj); PyTuple_SET_ITEM(__pyx_2, 0, __pyx_v_obj); __pyx_4 = PyObject_CallObject(__pyx_3, __pyx_2); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 386; goto __pyx_L1;} Py_DECREF(__pyx_3); __pyx_3 = 0; Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_r = __pyx_4; __pyx_4 = 0; goto 
__pyx_L0; goto __pyx_L13; } __pyx_L13:; } goto __pyx_L7; } /*else*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":391 */ __pyx_3 = PyObject_GetIter(__pyx_v_mro); if (!__pyx_3) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 391; goto __pyx_L1;} for (;;) { __pyx_2 = PyIter_Next(__pyx_3); if (!__pyx_2) { if (PyErr_Occurred()) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 391; goto __pyx_L1;} break; } Py_DECREF(__pyx_v_cls); __pyx_v_cls = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":392 */ __pyx_4 = PyTuple_New(1); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 392; goto __pyx_L1;} Py_INCREF(__pyx_v_cls); PyTuple_SET_ITEM(__pyx_4, 0, __pyx_v_cls); __pyx_2 = PyObject_CallObject(__pyx_v_get, __pyx_4); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 392; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; Py_DECREF(__pyx_v_factory); __pyx_v_factory = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":393 */ __pyx_5 = __pyx_v_factory != Py_None; if (__pyx_5) { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":394 */ __pyx_4 = PyInt_FromLong(0); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 394; goto __pyx_L1;} __pyx_2 = PyObject_GetItem(__pyx_v_factory, __pyx_4); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 394; goto __pyx_L1;} Py_DECREF(__pyx_4); __pyx_4 = 0; __pyx_4 = PyTuple_New(1); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 394; goto __pyx_L1;} Py_INCREF(__pyx_v_obj); PyTuple_SET_ITEM(__pyx_4, 0, __pyx_v_obj); __pyx_6 = PyObject_CallObject(__pyx_2, __pyx_4); if (!__pyx_6) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 394; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_4); __pyx_4 = 0; __pyx_r = __pyx_6; __pyx_6 = 0; Py_DECREF(__pyx_3); __pyx_3 = 0; goto __pyx_L0; goto __pyx_L16; } __pyx_L16:; } Py_DECREF(__pyx_3); __pyx_3 = 0; } __pyx_L7:; __pyx_r = Py_None; Py_INCREF(Py_None); goto __pyx_L0; __pyx_L1:; Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_3); Py_XDECREF(__pyx_4); Py_XDECREF(__pyx_6); __Pyx_AddTraceback("_speedups.Protocol__adapt__"); __pyx_r = 0; __pyx_L0:; Py_DECREF(__pyx_v_cls); Py_DECREF(__pyx_v_err); Py_DECREF(__pyx_v_mro); Py_DECREF(__pyx_v_get); Py_DECREF(__pyx_v_factory); Py_DECREF(__pyx_v_self); Py_DECREF(__pyx_v_obj); return __pyx_r; } static __Pyx_InternTabEntry __pyx_intern_tab[] = { {&__pyx_n_AdaptationFailure, "AdaptationFailure"}, {&__pyx_n_AttributeError, "AttributeError"}, {&__pyx_n_DOES_NOT_SUPPORT, "DOES_NOT_SUPPORT"}, {&__pyx_n_ExtensionClass, "ExtensionClass"}, {&__pyx_n_False, "False"}, {&__pyx_n_ImportError, "ImportError"}, {&__pyx_n_NO_ADAPTER_NEEDED, "NO_ADAPTER_NEEDED"}, {&__pyx_n_Protocol__adapt__, "Protocol__adapt__"}, {&__pyx_n_Protocol__call__, "Protocol__call__"}, {&__pyx_n_TypeError, "TypeError"}, {&__pyx_n__Protocol__adapters, "_Protocol__adapters"}, {&__pyx_n___all__, "__all__"}, {&__pyx_n___bases__, "__bases__"}, {&__pyx_n___class, "__class"}, {&__pyx_n___mro__, "__mro__"}, {&__pyx_n_adapt, "adapt"}, {&__pyx_n_classicMRO, "classicMRO"}, {&__pyx_n_exc_info, "exc_info"}, {&__pyx_n_extClassMRO, "extClassMRO"}, {&__pyx_n_get, "get"}, {&__pyx_n_getMRO, "getMRO"}, {&__pyx_n_metamethod, "metamethod"}, {&__pyx_n_object, "object"}, {&__pyx_n_sys, "sys"}, {&__pyx_n_tb_next, "tb_next"}, {0, 0} }; static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_k11p, __pyx_k11, sizeof(__pyx_k11)}, {&__pyx_k26p, __pyx_k26, sizeof(__pyx_k26)}, {&__pyx_k27p, __pyx_k27, sizeof(__pyx_k27)}, 
{&__pyx_k28p, __pyx_k28, sizeof(__pyx_k28)}, {&__pyx_k29p, __pyx_k29, sizeof(__pyx_k29)}, {0, 0, 0} }; static PyObject *__pyx_tp_new_9_speedups_metamethod(PyTypeObject *t, PyObject *a, PyObject *k) { PyObject *o = (*t->tp_alloc)(t, 0); struct __pyx_obj_9_speedups_metamethod *p = (struct __pyx_obj_9_speedups_metamethod *)o; p->func = Py_None; Py_INCREF(Py_None); return o; } static void __pyx_tp_dealloc_9_speedups_metamethod(PyObject *o) { struct __pyx_obj_9_speedups_metamethod *p = (struct __pyx_obj_9_speedups_metamethod *)o; Py_XDECREF(p->func); (*o->ob_type->tp_free)(o); } static int __pyx_tp_traverse_9_speedups_metamethod(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_9_speedups_metamethod *p = (struct __pyx_obj_9_speedups_metamethod *)o; if (p->func) { e = (*v)(p->func, a); if (e) return e; } return 0; } static int __pyx_tp_clear_9_speedups_metamethod(PyObject *o) { struct __pyx_obj_9_speedups_metamethod *p = (struct __pyx_obj_9_speedups_metamethod *)o; Py_XDECREF(p->func); p->func = Py_None; Py_INCREF(Py_None); return 0; } static PyObject *__pyx_tp_descr_get_9_speedups_metamethod(PyObject *o, PyObject *i, PyObject *c) { PyObject *r = 0; if (!i) i = Py_None; if (!c) c = Py_None; r = __pyx_f_9_speedups_10metamethod___get__(o, i, c); return r; } static int __pyx_tp_descr_set_9_speedups_metamethod(PyObject *o, PyObject *i, PyObject *v) { if (v) { return __pyx_f_9_speedups_10metamethod___set__(o, i, v); } else { return __pyx_f_9_speedups_10metamethod___delete__(o, i); } } static struct PyMethodDef __pyx_methods_9_speedups_metamethod[] = { {0, 0, 0, 0} }; static PyNumberMethods __pyx_tp_as_number_metamethod = { 0, /*nb_add*/ 0, /*nb_subtract*/ 0, /*nb_multiply*/ 0, /*nb_divide*/ 0, /*nb_remainder*/ 0, /*nb_divmod*/ 0, /*nb_power*/ 0, /*nb_negative*/ 0, /*nb_positive*/ 0, /*nb_absolute*/ 0, /*nb_nonzero*/ 0, /*nb_invert*/ 0, /*nb_lshift*/ 0, /*nb_rshift*/ 0, /*nb_and*/ 0, /*nb_xor*/ 0, /*nb_or*/ 0, /*nb_coerce*/ 0, /*nb_int*/ 0, /*nb_long*/ 0, /*nb_float*/ 0, /*nb_oct*/ 0, /*nb_hex*/ 0, /*nb_inplace_add*/ 0, /*nb_inplace_subtract*/ 0, /*nb_inplace_multiply*/ 0, /*nb_inplace_divide*/ 0, /*nb_inplace_remainder*/ 0, /*nb_inplace_power*/ 0, /*nb_inplace_lshift*/ 0, /*nb_inplace_rshift*/ 0, /*nb_inplace_and*/ 0, /*nb_inplace_xor*/ 0, /*nb_inplace_or*/ 0, /*nb_floor_divide*/ 0, /*nb_true_divide*/ 0, /*nb_inplace_floor_divide*/ 0, /*nb_inplace_true_divide*/ }; static PySequenceMethods __pyx_tp_as_sequence_metamethod = { 0, /*sq_length*/ 0, /*sq_concat*/ 0, /*sq_repeat*/ 0, /*sq_item*/ 0, /*sq_slice*/ 0, /*sq_ass_item*/ 0, /*sq_ass_slice*/ 0, /*sq_contains*/ 0, /*sq_inplace_concat*/ 0, /*sq_inplace_repeat*/ }; static PyMappingMethods __pyx_tp_as_mapping_metamethod = { 0, /*mp_length*/ 0, /*mp_subscript*/ 0, /*mp_ass_subscript*/ }; static PyBufferProcs __pyx_tp_as_buffer_metamethod = { 0, /*bf_getreadbuffer*/ 0, /*bf_getwritebuffer*/ 0, /*bf_getsegcount*/ 0, /*bf_getcharbuffer*/ }; PyTypeObject __pyx_type_9_speedups_metamethod = { PyObject_HEAD_INIT(0) 0, /*ob_size*/ "_speedups.metamethod", /*tp_name*/ sizeof(struct __pyx_obj_9_speedups_metamethod), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_9_speedups_metamethod, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ 0, /*tp_compare*/ 0, /*tp_repr*/ &__pyx_tp_as_number_metamethod, /*tp_as_number*/ &__pyx_tp_as_sequence_metamethod, /*tp_as_sequence*/ &__pyx_tp_as_mapping_metamethod, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ &__pyx_tp_as_buffer_metamethod, 
/*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ "Wrapper for metaclass method that might be confused w/instance method", /*tp_doc*/ __pyx_tp_traverse_9_speedups_metamethod, /*tp_traverse*/ __pyx_tp_clear_9_speedups_metamethod, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_9_speedups_metamethod, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ __pyx_tp_descr_get_9_speedups_metamethod, /*tp_descr_get*/ __pyx_tp_descr_set_9_speedups_metamethod, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_f_9_speedups_10metamethod___init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_9_speedups_metamethod, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ }; static struct PyMethodDef __pyx_methods[] = { {"NO_ADAPTER_NEEDED", (PyCFunction)__pyx_f_9_speedups_NO_ADAPTER_NEEDED, METH_VARARGS|METH_KEYWORDS, __pyx_doc_9_speedups_NO_ADAPTER_NEEDED}, {"DOES_NOT_SUPPORT", (PyCFunction)__pyx_f_9_speedups_DOES_NOT_SUPPORT, METH_VARARGS|METH_KEYWORDS, __pyx_doc_9_speedups_DOES_NOT_SUPPORT}, {"adapt", (PyCFunction)__pyx_f_9_speedups_adapt, METH_VARARGS|METH_KEYWORDS, __pyx_doc_9_speedups_adapt}, {"Protocol__call__", (PyCFunction)__pyx_f_9_speedups_Protocol__call__, METH_VARARGS|METH_KEYWORDS, __pyx_doc_9_speedups_Protocol__call__}, {"classicMRO", (PyCFunction)__pyx_f_9_speedups_classicMRO, METH_VARARGS|METH_KEYWORDS, 0}, {"extClassMRO", (PyCFunction)__pyx_f_9_speedups_extClassMRO, METH_VARARGS|METH_KEYWORDS, 0}, {"getMRO", (PyCFunction)__pyx_f_9_speedups_getMRO, METH_VARARGS|METH_KEYWORDS, 0}, {"Protocol__adapt__", (PyCFunction)__pyx_f_9_speedups_Protocol__adapt__, METH_VARARGS|METH_KEYWORDS, 0}, {0, 0, 0, 0} }; static void __pyx_init_filenames(void); /*proto*/ PyMODINIT_FUNC init_speedups(void); /*proto*/ PyMODINIT_FUNC init_speedups(void) { PyObject *__pyx_1 = 0; PyObject *__pyx_2 = 0; int __pyx_3; PyObject *__pyx_4 = 0; __pyx_init_filenames(); __pyx_m = Py_InitModule4("_speedups", __pyx_methods, __pyx_mdoc, 0, PYTHON_API_VERSION); if (!__pyx_m) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; __pyx_b = PyImport_AddModule("__builtin__"); if (!__pyx_b) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; if (__Pyx_InternStrings(__pyx_intern_tab) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; if (__Pyx_InitStrings(__pyx_string_tab) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 1; goto __pyx_L1;}; __pyx_v_9_speedups__marker = Py_None; Py_INCREF(Py_None); __pyx_v_9_speedups___conform = Py_None; Py_INCREF(Py_None); __pyx_v_9_speedups___adapt = Py_None; Py_INCREF(Py_None); __pyx_v_9_speedups___mro = Py_None; Py_INCREF(Py_None); __pyx_v_9_speedups___ECType = Py_None; Py_INCREF(Py_None); __pyx_type_9_speedups_metamethod.tp_free = _PyObject_GC_Del; if (PyType_Ready(&__pyx_type_9_speedups_metamethod) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 94; goto __pyx_L1;} if (PyObject_SetAttrString(__pyx_m, "metamethod", (PyObject *)&__pyx_type_9_speedups_metamethod) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 94; goto __pyx_L1;} __pyx_ptype_9_speedups_metamethod = &__pyx_type_9_speedups_metamethod; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":3 */ __pyx_1 = PyList_New(8); if (!__pyx_1) 
{__pyx_filename = __pyx_f[0]; __pyx_lineno = 3; goto __pyx_L1;} Py_INCREF(__pyx_n_NO_ADAPTER_NEEDED); PyList_SET_ITEM(__pyx_1, 0, __pyx_n_NO_ADAPTER_NEEDED); Py_INCREF(__pyx_n_DOES_NOT_SUPPORT); PyList_SET_ITEM(__pyx_1, 1, __pyx_n_DOES_NOT_SUPPORT); Py_INCREF(__pyx_n_adapt); PyList_SET_ITEM(__pyx_1, 2, __pyx_n_adapt); Py_INCREF(__pyx_n_Protocol__adapt__); PyList_SET_ITEM(__pyx_1, 3, __pyx_n_Protocol__adapt__); Py_INCREF(__pyx_n_metamethod); PyList_SET_ITEM(__pyx_1, 4, __pyx_n_metamethod); Py_INCREF(__pyx_n_classicMRO); PyList_SET_ITEM(__pyx_1, 5, __pyx_n_classicMRO); Py_INCREF(__pyx_n_getMRO); PyList_SET_ITEM(__pyx_1, 6, __pyx_n_getMRO); Py_INCREF(__pyx_n_Protocol__call__); PyList_SET_ITEM(__pyx_1, 7, __pyx_n_Protocol__call__); if (PyObject_SetAttr(__pyx_m, __pyx_n___all__, __pyx_1) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 3; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":64 */ __pyx_1 = PyList_New(1); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 64; goto __pyx_L1;} Py_INCREF(__pyx_n_exc_info); PyList_SET_ITEM(__pyx_1, 0, __pyx_n_exc_info); __pyx_2 = __Pyx_Import(__pyx_n_sys, __pyx_1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 64; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; __pyx_1 = PyObject_GetAttr(__pyx_2, __pyx_n_exc_info); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 64; goto __pyx_L1;} if (PyObject_SetAttr(__pyx_m, __pyx_n_exc_info, __pyx_1) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 64; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":65 */ __pyx_2 = PyList_New(1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 65; goto __pyx_L1;} Py_INCREF(__pyx_n_AdaptationFailure); PyList_SET_ITEM(__pyx_2, 0, __pyx_n_AdaptationFailure); __pyx_1 = __Pyx_Import(__pyx_k11p, __pyx_2); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 65; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; __pyx_2 = PyObject_GetAttr(__pyx_1, __pyx_n_AdaptationFailure); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 65; goto __pyx_L1;} if (PyObject_SetAttr(__pyx_m, __pyx_n_AdaptationFailure, __pyx_2) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 65; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; Py_DECREF(__pyx_1); __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":67 */ /*try:*/ { /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":68 */ __pyx_1 = PyList_New(1); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 68; goto __pyx_L17;} Py_INCREF(__pyx_n_ExtensionClass); PyList_SET_ITEM(__pyx_1, 0, __pyx_n_ExtensionClass); __pyx_2 = __Pyx_Import(__pyx_n_ExtensionClass, __pyx_1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 68; goto __pyx_L17;} Py_DECREF(__pyx_1); __pyx_1 = 0; __pyx_1 = PyObject_GetAttr(__pyx_2, __pyx_n_ExtensionClass); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 68; goto __pyx_L17;} if (PyObject_SetAttr(__pyx_m, __pyx_n_ExtensionClass, __pyx_1) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 68; goto __pyx_L17;} Py_DECREF(__pyx_1); __pyx_1 = 0; Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":69 */ __pyx_2 = __Pyx_GetName(__pyx_m, __pyx_n_ExtensionClass); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 69; goto __pyx_L17;} Py_DECREF(__pyx_v_9_speedups___ECType); __pyx_v_9_speedups___ECType = __pyx_2; __pyx_2 = 
0; } goto __pyx_L18; __pyx_L17:; Py_XDECREF(__pyx_1); __pyx_1 = 0; Py_XDECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":70 */ __pyx_1 = __Pyx_GetName(__pyx_b, __pyx_n_ImportError); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 70; goto __pyx_L1;} __pyx_3 = PyErr_ExceptionMatches(__pyx_1); Py_DECREF(__pyx_1); __pyx_1 = 0; if (__pyx_3) { __Pyx_AddTraceback("_speedups"); __pyx_2 = __Pyx_GetExcValue(); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 70; goto __pyx_L1;} Py_DECREF(__pyx_2); __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":71 */ __pyx_1 = __Pyx_GetName(__pyx_b, __pyx_n_object); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 71; goto __pyx_L1;} __pyx_2 = PyObject_Type(__pyx_1); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 71; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; Py_DECREF(__pyx_v_9_speedups___ECType); __pyx_v_9_speedups___ECType = __pyx_2; __pyx_2 = 0; goto __pyx_L18; } goto __pyx_L1; __pyx_L18:; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":73 */ __pyx_1 = __Pyx_GetName(__pyx_b, __pyx_n_object); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 73; goto __pyx_L1;} __pyx_2 = PyObject_CallObject(__pyx_1, 0); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 73; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; Py_DECREF(__pyx_v_9_speedups__marker); __pyx_v_9_speedups__marker = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":74 */ __pyx_1 = PyString_InternFromString(__pyx_k15); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 74; goto __pyx_L1;} Py_DECREF(__pyx_v_9_speedups___conform); __pyx_v_9_speedups___conform = __pyx_1; __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":75 */ __pyx_2 = PyString_InternFromString(__pyx_k16); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 75; goto __pyx_L1;} Py_DECREF(__pyx_v_9_speedups___adapt); __pyx_v_9_speedups___adapt = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":76 */ __pyx_1 = PyString_InternFromString(__pyx_k17); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 76; goto __pyx_L1;} if (PyObject_SetAttr(__pyx_m, __pyx_n___class, __pyx_1) < 0) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 76; goto __pyx_L1;} Py_DECREF(__pyx_1); __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":77 */ __pyx_2 = PyString_InternFromString(__pyx_k18); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 77; goto __pyx_L1;} Py_DECREF(__pyx_v_9_speedups___mro); __pyx_v_9_speedups___mro = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":85 */ Py_INCREF(Py_None); __pyx_k19 = Py_None; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":89 */ Py_INCREF(Py_None); __pyx_k20 = Py_None; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":188 */ Py_INCREF(__pyx_v_9_speedups__marker); __pyx_k21 = __pyx_v_9_speedups__marker; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":196 */ Py_INCREF(__pyx_v_9_speedups__marker); __pyx_k22 = __pyx_v_9_speedups__marker; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":220 */ __pyx_1 = __Pyx_GetName(__pyx_b, __pyx_n_False); if (!__pyx_1) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 220; goto __pyx_L1;} __pyx_k23 = __pyx_1; __pyx_1 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":253 */ __pyx_2 = 
__Pyx_GetName(__pyx_b, __pyx_n_False); if (!__pyx_2) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 253; goto __pyx_L1;} __pyx_k24 = __pyx_2; __pyx_2 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":263 */ __pyx_4 = __Pyx_GetName(__pyx_b, __pyx_n_False); if (!__pyx_4) {__pyx_filename = __pyx_f[0]; __pyx_lineno = 263; goto __pyx_L1;} __pyx_k25 = __pyx_4; __pyx_4 = 0; /* "/home/pje/projects/PyProtocols/src/protocols/_speedups.pyx":288 */ return; __pyx_L1:; Py_XDECREF(__pyx_1); Py_XDECREF(__pyx_2); Py_XDECREF(__pyx_4); __Pyx_AddTraceback("_speedups"); } static char *__pyx_filenames[] = { "_speedups.pyx", }; /* Runtime support code */ static void __pyx_init_filenames(void) { __pyx_f = __pyx_filenames; } static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list) { PyObject *__import__ = 0; PyObject *empty_list = 0; PyObject *module = 0; PyObject *global_dict = 0; PyObject *empty_dict = 0; PyObject *list; __import__ = PyObject_GetAttrString(__pyx_b, "__import__"); if (!__import__) goto bad; if (from_list) list = from_list; else { empty_list = PyList_New(0); if (!empty_list) goto bad; list = empty_list; } global_dict = PyModule_GetDict(__pyx_m); if (!global_dict) goto bad; empty_dict = PyDict_New(); if (!empty_dict) goto bad; module = PyObject_CallFunction(__import__, "OOOO", name, global_dict, empty_dict, list); bad: Py_XDECREF(empty_list); Py_XDECREF(__import__); Py_XDECREF(empty_dict); return module; } static PyObject *__Pyx_GetName(PyObject *dict, PyObject *name) { PyObject *result; result = PyObject_GetAttr(dict, name); if (!result) PyErr_SetObject(PyExc_NameError, name); return result; } static PyObject *__Pyx_GetExcValue(void) { PyObject *type = 0, *value = 0, *tb = 0; PyObject *result = 0; PyThreadState *tstate = PyThreadState_Get(); PyErr_Fetch(&type, &value, &tb); PyErr_NormalizeException(&type, &value, &tb); if (PyErr_Occurred()) goto bad; if (!value) { value = Py_None; Py_INCREF(value); } Py_XDECREF(tstate->exc_type); Py_XDECREF(tstate->exc_value); Py_XDECREF(tstate->exc_traceback); tstate->exc_type = type; tstate->exc_value = value; tstate->exc_traceback = tb; result = value; Py_XINCREF(result); type = 0; value = 0; tb = 0; bad: Py_XDECREF(type); Py_XDECREF(value); Py_XDECREF(tb); return result; } static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb) { Py_XINCREF(type); Py_XINCREF(value); Py_XINCREF(tb); /* First, check the traceback argument, replacing None with NULL. */ if (tb == Py_None) { Py_DECREF(tb); tb = 0; } else if (tb != NULL && !PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto raise_error; } /* Next, replace a missing value with None */ if (value == NULL) { value = Py_None; Py_INCREF(value); } /* Next, repeatedly, replace a tuple exception with its first item */ while (PyTuple_Check(type) && PyTuple_Size(type) > 0) { PyObject *tmp = type; type = PyTuple_GET_ITEM(type, 0); Py_INCREF(type); Py_DECREF(tmp); } if (PyString_Check(type)) { if (PyErr_Warn(PyExc_DeprecationWarning, "raising a string exception is deprecated")) goto raise_error; } else if (PyType_Check(type) || PyClass_Check(type)) ; /*PyErr_NormalizeException(&type, &value, &tb);*/ else { /* Raising an instance. The value should be a dummy. 
*/ if (value != Py_None) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto raise_error; } /* Normalize to raise <class>, <instance> */ Py_DECREF(value); value = type; if (PyInstance_Check(type)) type = (PyObject*) ((PyInstanceObject*)type)->in_class; else type = (PyObject*) type->ob_type; Py_INCREF(type); } PyErr_Restore(type, value, tb); return; raise_error: Py_XDECREF(value); Py_XDECREF(type); Py_XDECREF(tb); return; } static void __Pyx_ReRaise(void) { PyThreadState *tstate = PyThreadState_Get(); PyObject *type = tstate->exc_type; PyObject *value = tstate->exc_value; PyObject *tb = tstate->exc_traceback; Py_XINCREF(type); Py_XINCREF(value); Py_XINCREF(tb); PyErr_Restore(type, value, tb); } static int __Pyx_InternStrings(__Pyx_InternTabEntry *t) { while (t->p) { *t->p = PyString_InternFromString(t->s); if (!*t->p) return -1; ++t; } return 0; } static int __Pyx_InitStrings(__Pyx_StringTabEntry *t) { while (t->p) { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); if (!*t->p) return -1; ++t; } return 0; } #include "compile.h" #include "frameobject.h" #include "traceback.h" static void __Pyx_AddTraceback(char *funcname) { PyObject *py_srcfile = 0; PyObject *py_funcname = 0; PyObject *py_globals = 0; PyObject *empty_tuple = 0; PyObject *empty_string = 0; PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; py_srcfile = PyString_FromString(__pyx_filename); if (!py_srcfile) goto bad; py_funcname = PyString_FromString(funcname); if (!py_funcname) goto bad; py_globals = PyModule_GetDict(__pyx_m); if (!py_globals) goto bad; empty_tuple = PyTuple_New(0); if (!empty_tuple) goto bad; empty_string = PyString_FromString(""); if (!empty_string) goto bad; py_code = PyCode_New( 0, /*int argcount,*/ 0, /*int nlocals,*/ 0, /*int stacksize,*/ 0, /*int flags,*/ empty_string, /*PyObject *code,*/ empty_tuple, /*PyObject *consts,*/ empty_tuple, /*PyObject *names,*/ empty_tuple, /*PyObject *varnames,*/ empty_tuple, /*PyObject *freevars,*/ empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ __pyx_lineno, /*int firstlineno,*/ empty_string /*PyObject *lnotab*/ ); if (!py_code) goto bad; py_frame = PyFrame_New( PyThreadState_Get(), /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ py_globals, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; py_frame->f_lineno = __pyx_lineno; PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_srcfile); Py_XDECREF(py_funcname); Py_XDECREF(empty_tuple); Py_XDECREF(empty_string); Py_XDECREF(py_code); Py_XDECREF(py_frame); } pyprotocols-1.0a.svn20070625/src/protocols/classic.py0000644000175000017500000001224710072357530020541 0ustar kovkov"""Declaration support for Python built-in types""" __all__ = ['ProviderMixin'] from types import FunctionType, ModuleType, InstanceType, ClassType from adapters import * from api import declareImplementation, advise, declareAdapterForObject, adapt from interfaces import * from new import instancemethod from advice import getMRO, metamethod, mkRef class ProviderMixin: """Mixin to support per-instance declarations""" advise( instancesProvide=[IOpenProvider, IImplicationListener] ) def declareProvides(self,protocol,adapter=NO_ADAPTER_NEEDED,depth=1): registry = self.__dict__.get('__protocols_provided__') if registry is None: self.__protocols_provided__ = registry = {} if updateWithSimplestAdapter(registry,protocol,adapter,depth): adapt(protocol,IOpenProtocol).addImplicationListener(self) return True declareProvides = 
metamethod(declareProvides) def newProtocolImplied(self, srcProto, destProto, adapter, depth): registry = self.__dict__.get('__protocols_provided__',()) if srcProto not in registry: return baseAdapter, d = registry[srcProto] adapter = composeAdapters(baseAdapter,srcProto,adapter) declareAdapterForObject( destProto, adapter, self, depth+d ) newProtocolImplied = metamethod(newProtocolImplied) def __conform__(self,protocol): for cls in getMRO(self): conf = cls.__dict__.get('__protocols_provided__',()) if protocol in conf: return conf[protocol][0](self) __conform__ = metamethod(__conform__) class conformsRegistry(dict): """Helper type for objects and classes that need registration support""" def __call__(self, protocol): # This only gets called for non-class objects if protocol in self: subject = self.subject() if subject is not None: return self[protocol][0](subject) def findImplementation(self, subject, protocol, checkSelf=True): for cls in getMRO(subject): conf = cls.__dict__.get('__conform__') if conf is None: continue if not isinstance(conf,conformsRegistry): raise TypeError( "Incompatible __conform__ in base class", conf, cls ) if protocol in conf: return conf[protocol][0](subject) def newProtocolImplied(self, srcProto, destProto, adapter, depth): subject = self.subject() if subject is None or srcProto not in self: return baseAdapter, d = self[srcProto] adapter = composeAdapters(baseAdapter,srcProto,adapter) declareAdapterForObject( destProto, adapter, subject, depth+d ) def __hash__(self): # Need this because dictionaries aren't hashable, but we need to # be referenceable by a weak-key dictionary return id(self) def __get__(self,ob,typ=None): if ob is not None: raise AttributeError( "__conform__ registry does not pass to instances" ) # Return a bound method that adds the retrieved-from class to the # argument list return instancemethod(self.findImplementation, typ, type(typ)) def __getstate__(self): return self.subject(), self.items() def __setstate__(self,(subject,items)): self.clear() self.update(dict(items)) self.subject = mkRef(subject) class MiscObjectsAsOpenProvider(object): """Supply __conform__ registry for funcs, modules, & classic instances""" advise( instancesProvide=[IOpenProvider], asAdapterForTypes=[ FunctionType, ModuleType, InstanceType, ClassType, type, object ] ) def __init__(self,ob): obs = list(getMRO(ob)) for item in obs: try: reg = item.__dict__.get('__conform__') if reg is None and obs==[ob]: # Make sure we don't obscure a method from the class! 
reg = getattr(item,'__conform__',None) except AttributeError: raise TypeError( "Only objects with dictionaries can use this adapter", ob ) if reg is not None and not isinstance(reg,conformsRegistry): raise TypeError( "Incompatible __conform__ on adapted object", ob, reg ) reg = ob.__dict__.get('__conform__') if reg is None: reg = ob.__conform__ = self.newRegistry(ob) self.ob = ob self.reg = reg def declareProvides(self, protocol, adapter=NO_ADAPTER_NEEDED, depth=1): if updateWithSimplestAdapter(self.reg, protocol, adapter, depth): adapt(protocol,IOpenProtocol).addImplicationListener(self.reg) return True def newRegistry(self,subject): # Create a registry that's also set up for inheriting declarations reg = conformsRegistry() reg.subject = mkRef(subject) return reg pyprotocols-1.0a.svn20070625/src/protocols/adapters.py0000644000175000017500000001211710132267136020716 0ustar kovkov"""Basic Adapters and Adapter Operations""" __all__ = [ 'NO_ADAPTER_NEEDED','DOES_NOT_SUPPORT', 'Adapter', 'minimumAdapter', 'composeAdapters', 'updateWithSimplestAdapter', 'StickyAdapter', 'AdaptationFailure', 'bindAdapter', ] from types import FunctionType,ClassType,MethodType try: PendingDeprecationWarning except NameError: class PendingDeprecationWarning(Warning): 'Base class for warnings about features which will be deprecated in the future.' class AdaptationFailure(NotImplementedError,TypeError): """A suitable implementation/adapter could not be found""" # Fundamental Adapters def NO_ADAPTER_NEEDED(obj, protocol=None): """Assume 'obj' implements 'protocol' directly""" return obj def DOES_NOT_SUPPORT(obj, protocol=None): """Prevent 'obj' from supporting 'protocol'""" return None try: from _speedups import NO_ADAPTER_NEEDED, DOES_NOT_SUPPORT except ImportError: pass class Adapter(object): """Convenient base class for adapters""" def __init__(self, ob): self.subject = ob class StickyAdapter(object): """Adapter that attaches itself to its subject for repeated use""" attachForProtocols = () def __init__(self, ob): self.subject = ob # Declare this instance as a per-instance adaptation for the # given protocols provides = list(self.attachForProtocols) from protocols.api import declareAdapter declareAdapter(lambda s: self, provides, forObjects=[ob]) # Adapter "arithmetic" def minimumAdapter(a1,a2,d1=0,d2=0): """Shortest route to implementation, 'a1' @ depth 'd1', or 'a2' @ 'd2'? Assuming both a1 and a2 are interchangeable adapters (i.e. have the same source and destination protocols), return the one which is preferable; that is, the one with the shortest implication depth, or, if the depths are equal, then the adapter that is composed of the fewest chained adapters. If both are the same, then prefer 'NO_ADAPTER_NEEDED', followed by anything but 'DOES_NOT_SUPPORT', with 'DOES_NOT_SUPPORT' being least preferable. If there is no unambiguous choice, and 'not a1 is a2', TypeError is raised. 
""" if d1=maxargs: newAdapter = lambda ob: adapter(ob,proto) newAdapter.__adapterCount__ = getattr( adapter,'__adapterCount__',1 ) newAdapter.__unbound_adapter__ = adapter from warnings import warn warn("Adapter %r to protocol %r needs multiple arguments" % (adapter,proto), PendingDeprecationWarning, 6) return newAdapter return adapter def updateWithSimplestAdapter(mapping, key, adapter, depth): """Replace 'mapping[key]' w/'adapter' @ 'depth', return true if changed""" new = adapter old = mapping.get(key) if old is not None: old, oldDepth = old new = minimumAdapter(old,adapter,oldDepth,depth) if old is new and depth>=oldDepth: return False mapping[key] = new, depth return True pyprotocols-1.0a.svn20070625/src/protocols/interfaces.py0000644000175000017500000002770110072402627021242 0ustar kovkov"""Implement Interfaces and define the interfaces used by the package""" from __future__ import generators __all__ = [ 'Protocol', 'InterfaceClass', 'Interface', 'AbstractBase', 'AbstractBaseMeta', 'IAdapterFactory', 'IProtocol', 'IAdaptingProtocol', 'IOpenProtocol', 'IOpenProvider', 'IOpenImplementor', 'IImplicationListener', 'Attribute', 'Variation' ] import api from advice import metamethod, classicMRO, mkRef from adapters import composeAdapters, updateWithSimplestAdapter from adapters import NO_ADAPTER_NEEDED, DOES_NOT_SUPPORT from types import InstanceType # Thread locking support try: from thread import allocate_lock except ImportError: try: from dummy_thread import allocate_lock except ImportError: class allocate_lock(object): __slots__ = () def acquire(*args): pass def release(*args): pass # Trivial interface implementation class Protocol: """Generic protocol w/type-based adapter registry""" def __init__(self): self.__adapters = {} self.__implies = {} self.__listeners = None self.__lock = allocate_lock() def getImpliedProtocols(self): # This is messy so it can clean out weakrefs, but this method is only # called for declaration activities and is thus not at all # speed-critical. It's more important that we support weak refs to # implied protocols, so that dynamically created subset protocols can # be garbage collected. out = [] add = out.append self.__lock.acquire() # we might clean out dead weakrefs try: for k,v in self.__implies.items(): proto = k() if proto is None: del self.__implies[k] else: add((proto,v)) return out finally: self.__lock.release() def addImpliedProtocol(self,proto,adapter=NO_ADAPTER_NEEDED,depth=1): self.__lock.acquire() try: key = mkRef(proto) if not updateWithSimplestAdapter( self.__implies, key, adapter, depth ): return self.__implies[key][0] finally: self.__lock.release() # Always register implied protocol with classes, because they should # know if we break the implication link between two protocols for klass,(baseAdapter,d) in self.__adapters.items(): api.declareAdapterForType( proto,composeAdapters(baseAdapter,self,adapter),klass,depth+d ) if self.__listeners: for listener in self.__listeners.keys(): # Must use keys()! listener.newProtocolImplied(self, proto, adapter, depth) return adapter addImpliedProtocol = metamethod(addImpliedProtocol) def registerImplementation(self,klass,adapter=NO_ADAPTER_NEEDED,depth=1): self.__lock.acquire() try: if not updateWithSimplestAdapter( self.__adapters,klass,adapter,depth ): return self.__adapters[klass][0] finally: self.__lock.release() if adapter is DOES_NOT_SUPPORT: # Don't register non-support with implied protocols, because # "X implies Y" and "not X" doesn't imply "not Y". 
In effect, # explicitly registering DOES_NOT_SUPPORT for a type is just a # way to "disinherit" a superclass' claim to support something. return adapter for proto, (extender,d) in self.getImpliedProtocols(): api.declareAdapterForType( proto, composeAdapters(adapter,self,extender), klass, depth+d ) return adapter registerImplementation = metamethod(registerImplementation) def registerObject(self, ob, adapter=NO_ADAPTER_NEEDED,depth=1): # Object needs to be able to handle registration if api.adapt(ob,IOpenProvider).declareProvides(self,adapter,depth): if adapter is DOES_NOT_SUPPORT: return # non-support doesn't imply non-support of implied # Handle implied protocols for proto, (extender,d) in self.getImpliedProtocols(): api.declareAdapterForObject( proto, composeAdapters(adapter,self,extender), ob, depth+d ) registerObject = metamethod(registerObject) def __adapt__(self, obj): get = self.__adapters.get try: typ = obj.__class__ except AttributeError: typ = type(obj) try: mro = typ.__mro__ except AttributeError: # Note: this adds 'InstanceType' and 'object' to end of MRO mro = classicMRO(typ,extendedClassic=True) for klass in mro: factory=get(klass) if factory is not None: return factory[0](obj) try: from _speedups import Protocol__adapt__ as __adapt__ except ImportError: pass __adapt__ = metamethod(__adapt__) def addImplicationListener(self, listener): self.__lock.acquire() try: if self.__listeners is None: from weakref import WeakKeyDictionary self.__listeners = WeakKeyDictionary() self.__listeners[listener] = 1 finally: self.__lock.release() addImplicationListener = metamethod(addImplicationListener) def __call__(self, ob, default=api._marker): """Adapt to this protocol""" return api.adapt(ob,self,default) # Use faster __call__ method, if possible # XXX it could be even faster if the __call__ were in the tp_call slot # XXX directly, but Pyrex doesn't have a way to do that AFAIK. try: from _speedups import Protocol__call__ except ImportError: pass else: from new import instancemethod Protocol.__call__ = instancemethod(Protocol__call__, None, Protocol) class AbstractBaseMeta(Protocol, type): """Metaclass for 'AbstractBase' - a protocol that's also a class (Note that this should not be used as an explicit metaclass - always subclass from 'AbstractBase' or 'Interface' instead.) """ def __init__(self, __name__, __bases__, __dict__): type.__init__(self, __name__, __bases__, __dict__) Protocol.__init__(self) for b in __bases__: if isinstance(b,AbstractBaseMeta) and b.__bases__<>(object,): self.addImpliedProtocol(b) def __setattr__(self,attr,val): # We could probably support changing __bases__, as long as we checked # that no bases are *removed*. But it'd be a pain, since we'd # have to do callbacks, remove entries from our __implies registry, # etc. So just punt for now. if attr=='__bases__': raise TypeError( "Can't change interface __bases__", self ) type.__setattr__(self,attr,val) __call__ = type.__call__ class AbstractBase(object): """Base class for a protocol that's a class""" __metaclass__ = AbstractBaseMeta class InterfaceClass(AbstractBaseMeta): """Metaclass for 'Interface' - a non-instantiable protocol (Note that this should not be used as an explicit metaclass - always subclass from 'AbstractBase' or 'Interface' instead.) 
""" def __call__(self, *__args, **__kw): if self.__init__ is Interface.__init__: return Protocol.__call__(self,*__args, **__kw) else: return type.__call__(self,*__args, **__kw) def getBases(self): return [ b for b in self.__bases__ if isinstance(b,AbstractBaseMeta) and b.__bases__<>(object,) ] class Interface(object): __metaclass__ = InterfaceClass class Variation(Protocol): """A variation of a base protocol - "inherits" the base's adapters See the 'LocalProtocol' example in the reference manual for more info. """ def __init__(self, baseProtocol, context = None): self.baseProtocol = baseProtocol self.context = context # Note: Protocol is a ``classic'' class, so we don't use super() Protocol.__init__(self) api.declareAdapterForProtocol(self,NO_ADAPTER_NEEDED,baseProtocol) def __repr__(self): if self.context is None: return "Variation(%r)" % self.baseProtocol return "Variation(%r,%r)" % (self.baseProtocol, self.context) # Semi-backward compatible 'interface.Attribute' class Attribute(object): """Attribute declaration; should we get rid of this?""" def __init__(self,doc,name=None,value=None): self.__doc__ = doc self.name = name self.value = value def __get__(self,ob,typ=None): if ob is None: return self if not self.name: raise NotImplementedError("Abstract attribute") try: return ob.__dict__[self.name] except KeyError: return self.value def __set__(self,ob,val): if not self.name: raise NotImplementedError("Abstract attribute") ob.__dict__[self.name] = val def __delete__(self,ob): if not self.name: raise NotImplementedError("Abstract attribute") del ob.__dict__[self.name] def __repr__(self): return "Attribute: %s" % self.__doc__ # Interfaces and adapters for declaring protocol/type/object relationships class IAdapterFactory(Interface): """Callable that can adapt an object to a protocol""" def __call__(ob): """Return an implementation of protocol for 'ob'""" class IProtocol(Interface): """Object usable as a protocol by 'adapt()'""" def __hash__(): """Protocols must be usable as dictionary keys""" def __eq__(other): """Protocols must be comparable with == and !=""" def __ne__(other): """Protocols must be comparable with == and !=""" class IAdaptingProtocol(IProtocol): """A protocol that potentially knows how to adapt some object to itself""" def __adapt__(ob): """Return 'ob' adapted to protocol, or 'None'""" class IConformingObject(Interface): """An object that potentially knows how to adapt to a protocol""" def __conform__(protocol): """Return an implementation of 'protocol' for self, or 'None'""" class IOpenProvider(Interface): """An object that can be told how to adapt to protocols""" def declareProvides(protocol, adapter=NO_ADAPTER_NEEDED, depth=1): """Register 'adapter' as providing 'protocol' for this object Return a true value if the provided adapter is the "shortest path" to 'protocol' for the object, or false if a shorter path already existed. 
""" class IOpenImplementor(Interface): """Object/type that can be told how its instances adapt to protocols""" def declareClassImplements(protocol, adapter=NO_ADAPTER_NEEDED, depth=1): """Register 'adapter' as implementing 'protocol' for instances""" class IOpenProtocol(IAdaptingProtocol): """A protocol that be told what it implies, and what supports it Note that these methods are for the use of the declaration APIs only, and you should NEVER call them directly.""" def addImpliedProtocol(proto, adapter=NO_ADAPTER_NEEDED, depth=1): """'adapter' provides conversion from this protocol to 'proto'""" def registerImplementation(klass, adapter=NO_ADAPTER_NEEDED, depth=1): """'adapter' provides protocol for instances of klass""" def registerObject(ob, adapter=NO_ADAPTER_NEEDED, depth=1): """'adapter' provides protocol for 'ob' directly""" def addImplicationListener(listener): """Notify 'listener' whenever protocol has new implied protocol""" class IImplicationListener(Interface): def newProtocolImplied(srcProto, destProto, adapter, depth): """'srcProto' now implies 'destProto' via 'adapter' at 'depth'""" pyprotocols-1.0a.svn20070625/docs/0000755000175000017500000000000010640056704014655 5ustar kovkovpyprotocols-1.0a.svn20070625/docs/ref/0000755000175000017500000000000010640056704015431 5ustar kovkovpyprotocols-1.0a.svn20070625/docs/ref/libprotocols.tex0000644000175000017500000044256510167066622020712 0ustar kovkov\section{\module{protocols} --- Protocol Definition, Declaration, and Adaptation} \declaremodule{}{protocols} \moduleauthor{Phillip J. Eby}{pje@telecommunity.com} \sectionauthor{Phillip J. Eby}{pje@telecommunity.com} \modulesynopsis{Protocol declaration and adaptation functions as described in PEP XXX.} \begin{quotation} The typical Python programmer is an integrator, someone who is connecting components from various vendors. Often times the interfaces between these components require an intermediate adapter. Usually the burden falls upon the programmer to study the interface exposed by one component and required by another, determine if they are directly compatible, or develop an adapter. Sometimes a vendor may even include the appropriate adapter, but then searching for the adapter and figuring out how to deploy the adapter takes time. \hfill --- Martelli \& Evans, PEP 246 \end{quotation} This package builds on the object adaptation protocol presented in \pep{246} to make it easier for component authors, framework suppliers, and other developers to: \begin{itemize} \item Specify what behavior a component requires or provides \item Specify how to adapt the interface provided by one component to that required by another \item Specify how to adapt objects of a particular type or class (even built-in types) to a particular required interface \item Automatically adapt a supplied object to a required interface, and \item Do all of the above, even when the components or frameworks involved were not written to take advantage of this package, and even if the frameworks have different mechanisms for defining interfaces. \end{itemize} Assuming that a particular framework either already supports this package, or has been externally adapted to do so, then framework users will typically use this package's declaration API to declare what interfaces their classes or objects provide, and/or to declare adaptations between interfaces or components. 
For framework developers, this package offers an opportunity to replace
tedious and repetitive type-checking code (such as \function{isinstance()},
\function{type()}, \function{hasattr()}, or interface checks) with single
calls to \function{adapt()} instead.  In addition, if the framework has
objects that represent interfaces or protocols, the framework developer can
make them usable with this package's declaration API by adding adapters for
(or direct implementations of) the \class{IOpenProtocol} interface provided
herein.

If the developer of a framework does not do these things, it may still be
possible for a framework user or third-party developer to do them, in order
to be able to use this package's API.  The user of a framework can often
call \function{adapt()} on a component before passing it to a non-adapting
framework.  And, it's possible to externally adapt a framework's interface
objects as well.

For example, the \module{protocols.zope_support} and
\module{protocols.twisted_support} modules define adapters that implement
\class{IOpenProtocol} on behalf of Zope and Twisted \class{Interface}
objects.  This allows them to be used as arguments to this package's
protocol declaration API.  This works even though Zope and Twisted are
completely unaware of the \module{protocols} package.  (Of course, this does
not give Zope or Twisted \class{Interface} objects all of the capabilities
that \class{Protocol} objects have, but it does make most of their existing
functionality accessible through the same API.)

Finally, framework and non-framework developers alike may also wish to use
the \class{Protocol} and \class{Interface} base classes from this package to
define protocols or interfaces of their own, or perhaps use some of the
adaptation mechanisms supplied here to implement ``double dispatching'' or
the ``visitor pattern''.

\begin{seealso}
\seepep{246}{Object Adaptation}{PEP 246 describes an early version of the
adaptation protocol used by this package.}
\end{seealso}

\subsection{Big Example 1 --- A Python Documentation
Framework\label{protocols-example1}}

To provide the reader with a ``feel'' for the use and usefulness of the
\module{protocols} package, we will begin with a motivating example: a
simple Python documentation framework.  To avoid getting bogged down in
details, we will only sketch a skeleton of the framework, highlighting areas
where the \module{protocols} package would play a part.

First, let's consider the background and requirements.  Python has many
documentation tools available, ranging from the built-in \program{pydoc} to
third-party tools such as \program{HappyDoc}, \program{epydoc}, and
\program{Synopsis}.  Many of these tools generate documentation from
``live'' Python objects, some using the Python \module{inspect} module to
do so.

However, such tools often encounter difficulties in the Python 2.2 world.
Tools that use \function{type()} checks break with custom metaclasses, and
even tools that use \function{isinstance()} break when dealing with custom
descriptor types.  These tools often handle other custom object types poorly
as well: for example, Zope \class{Interface} objects can cause some versions
of \function{help()} and \program{pydoc} to crash!

The state of Python documentation tools is an example of the problem that
both \pep{246} and the \module{protocols} package were intended to solve:
introspection makes frameworks brittle and inextensible.
We can't easily plug new kinds of ``documentables'' into our documentation
tools, or control how existing objects are documented.  These are exactly
the kind of problems that component adaptation and open protocols were
created to address.

So let's review our requirements for the documentation framework.  First, it
should work with existing built-in types, without requiring a new version of
Python.  Second, it should allow users to control how objects are recognized
and documented.  Third, we want the framework to be flexible enough to
create different kinds of documentation, like JavaDoc-style HTML, PDF
reference manuals, plaintext online help or manpages, and so on -- with
whatever kinds of documentable objects exist.  If user A creates a new
``documentable object,'' and user B creates a new documentation format, user
C should be able to combine the two.

To design a framework with the \module{protocols} package, the best place to
start is often with an ``ideal'' interface.  We pretend that every object is
already the kind of object that would do everything we need it to do.  In
the case of documentation, we want objects to be able to tell us their name,
what kind of object they should be listed as, their base classes (if
applicable), their call signature (if callable), and their contents (if a
namespace).  So let's define our ideal interface, using
\class{protocols.Interface}.  (\note{This is not the only way to define an
interface; we could also use an abstract base class, or other techniques.
Interface definition is discussed further in section
\ref{protocols-defining}.})

\begin{verbatim}
from protocols import Interface

class IDocumentable(Interface):
    """A documentable object"""

    def getKind():
        """Return "kind" string, e.g. "class", "interface", "package", etc."""

    def getName():
        """Return an unqualified (undotted) name of the object"""

    def getBases():
        """Return a sequence of objects this object is derived from
        (or an empty sequence if object is not class/type-like)"""

    def getSignature():
        """Return a description of the object's call signature, or None"""

    def getSummary():
        """Return a one-line summary description of the object"""

    def getDoc():
        """Return documentation for the object (not including its contents)"""

    def getContents(includeInherited):
        """Return list of (key,value) pairs for namespace contents, if any"""
\end{verbatim}

Now, in the ``real world,'' no existing Python objects provide this
interface.  But, if every Python object did, we'd be in great shape for
writing documentation tools.  The tools could focus purely on issues of
formatting and organization, not ``digging up dirt'' on what to say about
the objects.

Notice that we don't need to care whether an object is a type or a module or
a staticmethod or something else.  If a future version of Python made
modules callable or functions able to somehow inherit from other functions,
we'd be covered, as long as those new objects supported the methods we
described.  Even if a tool needs to care about the ``package'' kind vs. the
``module'' kind for formatting or organizational reasons, it can easily be
written to assume that new ``kinds'' might still appear.  For example, an
index generator might just generate separate alphabetical indexes for each
``kind'' it encounters during processing.

Okay, so we've envisioned our ideal scenario, and documented it as an
interface.  Now what?
Well, we could start writing documentation tools that expect to be given
objects that support the \class{IDocumentable} interface, but that wouldn't
be very useful, since we don't have any \class{IDocumentable} objects yet.
So let's define some \strong{adapters} for built-in types, so that we have
something for our tools to document.

\begin{verbatim}
from protocols import advise
from types import FunctionType

class FunctionAsDocumentable(object):

    advise(
        # Declare that this class provides IDocumentable for FunctionType
        # (we'll explain this further in later sections)
        provides = [IDocumentable],
        asAdapterForTypes = [FunctionType],
    )

    def __init__(self, ob):
        self.func = ob

    def getKind(self):
        return "function"

    def getName(self):
        return self.func.func_name

    def getBases(self):
        return ()

    # ... etc.
\end{verbatim}

The \class{FunctionAsDocumentable} class wraps a function object with the
methods of an \class{IDocumentable}, giving us the behavior we need to
document it.  Now all we need to do is define similar wrappers for all the
other built-in types, and for any user-defined types, and then pick the
right kind of wrapper to use when given an object to document.

But wait!  Isn't this just as complicated as writing a documentation tool
the ``old fashioned'' way?  We still have to write code to get the data we
need, \emph{and} we still need to figure out what code to use.  Where's the
benefit?

Enter the \pep{246} \function{adapt()} function.  \function{adapt()} has two
required arguments: an object to adapt, and a \strong{protocol} that you
want it adapted to.  Our documentation tool, when given an object to
document, will simply call \code{adapt(object,IDocumentable)} and receive an
instance of the appropriate adapter.  (Or, if the object has already
declared that it supports \class{IDocumentable}, we'll simply get the same
object back from \function{adapt()} that we passed in.)

But that's only the beginning.  If we create and distribute a documentation
tool based on \class{IDocumentable}, then anyone who creates a new kind of
documentable object can write their own adapters, and register them via the
\module{protocols} package, \emph{without} changing the documentation tool,
or needing to give special configuration options to the tool or call
tool-specific registration functions.  (Which means we don't have to design
or code those options or registration functions into our tool.)
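To make the tool side concrete, here is a minimal sketch of what such a
tool's core routine might look like.  (\note{\function{renderDoc} and
\function{sampleFunction} are hypothetical names invented for this sketch;
only \function{adapt()} and \class{IDocumentable} come from the example
above.})

\begin{verbatim}
from protocols import adapt

def renderDoc(ob):
    # Ask for an IDocumentable view of 'ob'.  This returns 'ob' itself
    # if it already provides IDocumentable, or a registered adapter
    # (such as FunctionAsDocumentable) otherwise.
    doc = adapt(ob, IDocumentable)
    return "%s %s" % (doc.getKind(), doc.getName())

def sampleFunction():
    """An example function to document"""

print renderDoc(sampleFunction)    # prints "function sampleFunction"
\end{verbatim}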
Also, let's say I use some kind of ``finite state machine'' library written
by vendor A, and I'd like to use it with this new documentation tool from
vendor B.  I can write and register adapters from such types as ``finite
state machine,'' ``state,'' and ``transition'' to \class{IDocumentable}.  I
can then use vendor B's tool to document vendor A's object types.

And it goes further.  Suppose vendor C comes out with a new super-duper
documentation framework with advanced features.  To use the new features,
however, a more sophisticated interface than \class{IDocumentable} is
needed.  So vendor C's tool requires objects to support his new
\class{ISuperDocumentable} interface.  What happens then?  Is the new
package doomed to sit idle because everybody else only has
\class{IDocumentable} objects?

Heck no.  At least, not if vendor C starts by defining an adapter from
\class{IDocumentable} to \class{ISuperDocumentable} that supplies reasonable
default behavior for ``older'' objects.  Then he or she writes adapters from
built-in types to \class{ISuperDocumentable} that provide more
useful-than-default behaviors where applicable.

Now, the super-framework is instantly usable with existing adapters for
other object types.  And, if vendor C defined \class{ISuperDocumentable} as
\emph{extending} \class{IDocumentable}, there's another benefit, too.

Suppose vendor A upgrades their finite state machine library to include
direct or adapted support for the new \class{ISuperDocumentable} interface.
Do I need to switch to vendor C's documentation tool now?  Must I continue
to maintain my FSM-to-\class{IDocumentable} adapters?  No, to both
questions.  If \class{ISuperDocumentable} is a strict extension of
\class{IDocumentable}, then I may use an \class{ISuperDocumentable} anywhere
an \class{IDocumentable} is accepted.  Thus, I can use vendor A's new
library with my existing (non-``super'') documentation tool from vendor B,
without needing my old adapters any more.

As you can see, replacing introspection with adaptation makes frameworks
more flexible and extensible.  It also simplifies the design process, by
letting you focus on what you \emph{want} to do, instead of on the details
of which objects are doing it.

As we mentioned at the beginning of this section, this example is only a
sketch of a skeleton of a real documentation framework.  For example, a real
documentation framework would also need to define an \class{ISignature}
interface for objects returned from \method{getSignature()}.  We've also
glossed over many other issues that the designers of a real documentation
framework would face, in order to focus on the problems that can be readily
solved with adaptable components.

And that's our point, actually.  Every framework has two kinds of design
issues: the ones that are specific to the framework's subject area, and the
ones that would apply to any framework.  The \module{protocols} package can
save you a lot of work dealing with the latter, so you can spend more time
focusing on the former.  Let's start looking at how, beginning with the
concepts of ``protocols'' and ``interfaces''.

\subsection{Protocols and Interfaces \label{protocol-concepts}}

Many languages and systems provide ways of defining \strong{interfaces} that
components provide or require.  Some mechanisms are purely for
documentation, others are used at runtime to obtain or verify an
implementation.  Typically, interfaces are formal, intended for
compiler-verified static type checking.  As a dynamic language, Python more
often uses a looser notion of interface, known as a \strong{protocol}.
While protocols are often very precisely specified, their intended audience
is a human reader or developer, not a compiler or automated verification
tool.

Automated verification tools, however, usually exact a high overhead cost
from developers.  The Java language, for example, requires that all methods
of an interface be defined by a class that claims to implement the
interface, even if those methods are never used in the program being
compiled!  And yet, the more important \emph{dynamic} behavior of the
interface at runtime is not captured or verifiable by the compiler, so
written documentation for human readers is still required!

In the Python language, the primary uses for objects representing protocols
or interfaces are at runtime, rather than at compile time.  Typically, such
objects are used to ask for an implementation of the interface, or supplied
by an object to claim that it provides an implementation of that interface.

In principle, any Python object may be used as a \strong{protocol object}.
However, for a variety of practical reasons, it is best that protocol
objects be hashable and comparable.  That is, protocol objects should be
usable as dictionary keys.

This still allows for a wide variety of protocol object implementations,
however.  One might assign meaning to the number 42, for example, as
referring to some hypothetical ``hitchhiker'' protocol.  More realistically,
the Microsoft COM framework uses UUIDs (Universally Unique Identifiers) to
identify interfaces.  UUIDs can be represented as Python strings, and thus
are usable as protocol objects.

But a simple string or number is often not very useful as a protocol object.
Aside from the issue of how to assign strings or numbers to protocols, these
passive protocol objects cannot \emph{do} anything, and by themselves they
document nothing.  There are thus two more common approaches to creating
protocol objects in Python: classes (such as abstract base classes or
``ABCs''), and \strong{interface objects}.  Interface objects are typically
also defined using Python \code{class} statements, but use a custom
metaclass to create an object that may not be usable in the same ways as a
``real'' Python class.  Many Python frameworks (such as Twisted, Zope, and
this package) provide their own framework-specific implementations of this
``interface object'' approach.

Since classes and most interface object implementations can be used as
dictionary keys, and because their Python source code can serve as (or be
converted to) useful documentation, both of these approaches are viable ways
to create protocol objects usable with the \module{protocols} package.  In
addition, inheriting from a class or interface object is a simple way to
define implication relationships between protocol objects.  Inheriting from
a protocol to create a new protocol means that the new protocol
\strong{implies} the old protocol.  That is, any implementation of or
adaptation to the new protocol is implied to be usable in a place where the
old protocol was required.  (We will have more to say about direct and
adapted implication relationships later on, in section
\ref{proto-implication}.)

At this point, we still haven't described any mechanisms for making adapters
available, or declaring what protocols are supported by a class or object.
To do that, we need to define two additional kinds of protocol objects, that
have more specialized abilities.

An \strong{adapting protocol} is a protocol object that is potentially able
to adapt components to support the protocol it represents, or at least to
recognize that a component supports (or claims to support) the protocol.  To
do this, an adapting protocol must have an \method{__adapt__} method, as
will be described in section \ref{adapt-protocol}.  (Often, this method can
be added to an existing class, or patched into an interface object
implementation.)

An \strong{open protocol} is an adapting protocol that is also capable of
accepting adapter declarations, and managing its implication relationships
with other protocols.  Open protocols can be used with this package's
protocol declaration API, as long as they implement (or can be adapted to)
the \class{IOpenProtocol} interface, as will be described in section
\ref{protocols-declaration-interfaces}.

Notice that the concepts of protocol objects, adapting protocols, and open
protocols are themselves ``protocols''.
The \module{protocols} package supplies three interface objects that
symbolize these concepts: \class{IProtocol}, \class{IAdaptingProtocol}, and
\class{IOpenProtocol}, respectively.  Just as the English phrases represent
the concepts in this text, the interface objects represent these concepts at
runtime.

Whether a protocol object is as simple as a string, or as complex as an
\class{IOpenProtocol}, it can be used to request that a component provide
(or be adaptable to) the protocol that it symbolizes.  In the next section,
we'll look at how to make such a request, and how the different kinds of
protocol objects participate (or not) in fulfilling such requests.

\subsection{\function{adapt()} and the Adaptation Protocol
\label{adapt-protocol}}

Component adaptation is the central focus of the \module{protocols} package.
All of the package's protocol declaration API depends on component
adaptation in order to function, and the rest of the package is just there
to make it easier for developers to use component adaptation in their
frameworks and programs.

Component adaptation is performed by calling the \function{adapt()}
function, whose design is based largely on the specification presented in
\pep{246}:

\begin{funcdesc}{adapt}{component, protocol\optional{, default}}
Return an implementation of \var{protocol} (a protocol object) for
\var{component} (any object).  The implementation returned may be
\var{component}, or a wrapper that implements the protocol on its behalf.
If no implementation is available, return \var{default}.  If no
\var{default} is provided, raise \exception{protocols.AdaptationFailure}.
\end{funcdesc}

The component adaptation process performed by \function{adapt()} proceeds in
four steps:

\begin{enumerate}

\item If \var{protocol} is a class or type, and \var{component} is an
instance of that class or type, the component is returned unchanged.  (This
quickly disposes of the most trivial cases).

\item If \var{component} has a \method{__conform__} method, it is called,
passing in the protocol.  If the method returns a value other than
\constant{None}, it is returned as the result of \function{adapt()}.

\item If \var{protocol} has an \method{__adapt__} method, it is called,
passing in \var{component}.  If the method returns a value other than
\constant{None}, it is returned as the result of \function{adapt()}.

\item Perform default processing as described above, returning \var{default}
or raising \exception{protocols.AdaptationFailure} as appropriate.

\end{enumerate}

This four-step process is called the \strong{adaptation protocol}.  Note
that it can be useful even in the case where neither the component nor the
protocol object are aware that the adaptation protocol exists, and it
gracefully degrades to a kind of \function{isinstance()} check in that case.
However, if either the component or the protocol object has been constructed
(or altered) so that it has the appropriate \method{__conform__} or
\method{__adapt__} method, then much more meaningful results can be
achieved.

Throughout the rest of this document, we will say that a component
\strong{supports} a protocol, if calling \code{adapt(component,protocol)}
does not raise an error.  That is, a component supports a protocol if its
\method{__conform__} method or the protocol's \method{__adapt__} method
return a non-\constant{None} value.  This is different from saying that an
object \strong{provides} a protocol.  An object provides a protocol if
\code{adapt(ob,protocol) is ob}.  Thus, if an object \emph{provides} a
protocol, it \emph{supports} the protocol, but an object can also support a
protocol by having an adapter that provides the protocol on its behalf.
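To make the four steps concrete, here is an illustrative sketch of the
process in plain Python.  (\note{This is \emph{not} the package's actual
implementation -- in particular, it omits the \exception{TypeError} handling
discussed in this section's ``mini-appendix'' below -- and the
\code{_marker} sentinel and function name are invented for the sketch.})

\begin{verbatim}
import types
from protocols import AdaptationFailure

_marker = object()      # sentinel meaning "no default was supplied"

def sketch_adapt(component, protocol, default=_marker):

    # Step 1: the trivial case
    if isinstance(protocol, (type, types.ClassType)):
        if isinstance(component, protocol):
            return component

    # Step 2: ask the component
    conform = getattr(component, '__conform__', None)
    if conform is not None:
        result = conform(protocol)
        if result is not None:
            return result

    # Step 3: ask the protocol
    adapt_hook = getattr(protocol, '__adapt__', None)
    if adapt_hook is not None:
        result = adapt_hook(component)
        if result is not None:
            return result

    # Step 4: default processing
    if default is _marker:
        raise AdaptationFailure("Can't adapt", component, protocol)
    return default
\end{verbatim}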
Now that you know how \function{adapt()} works, you can actually make use of
it without any of the other tools in the \module{protocols} package.  Just
define your own \method{__conform__} and \method{__adapt__} methods, and off
you go!

In practice, however, this is like creating a new kind of Python ``number''
type.  That is, it's certainly possible, but can be rather tedious and is
perhaps best left to a specialist.  For that reason, the \module{protocols}
package supplies some useful basic protocol types, and a ``declaration API''
that lets you declare how protocols, types, and objects should be adapted to
one another.  The rest of this document deals with how to use those types
and APIs.  You don't need to know about those types and APIs to create your
own kinds of protocols or components, just as you don't need to have studied
Python's numeric types or math libraries to create a numeric type of your
own.  But, if you'd like your new types to interoperate well with existing
types, and conform to users' expectations of how such a type behaves, it
would be a good idea to be familiar with existing implementations, such as
the ones described here.

\subsubsection{Creating and Using Adapters, Components, and Protocols}

Because the adaptation protocol is so simple and flexible, there are a few
guidelines you should follow when using \function{adapt()} or creating
\method{__conform__} and \method{__adapt__} methods, to ensure that adapted
objects are as usable as unadapted objects.

First, adaptation should be \strong{idempotent}.  That is, if you
\function{adapt()} an object to a protocol, and then \function{adapt()} the
return value to the same protocol, the same object should be returned the
second time.  If you are using the \module{protocols} declaration API, it
suffices to declare that instances of the adapter class provide the protocol
they adapt to.  That is, if an adapter class provides protocol P for objects
of type X, then it should declare that it provides protocol P.  If you are
not using the declaration API, but relying only upon your custom
\method{__conform__} and \method{__adapt__} methods, you need to ensure that
any adapters you return will return themselves when asked to support the
protocol that they were returned as an adapter for.

Second, adaptation is not automatically \strong{symmetric}.  That is, if I
have an object X that provides protocol P1, and I \function{adapt()} it to
protocol P2, it is not guaranteed that I can \function{adapt()} the
resulting object to P1 and receive the original object.  Ideally, someone
who defines an adapter function would also declare an inverse adapter
function to ``unwrap'' an adapted object to its original identity.  In
practice, however, this can be complex, since the adapter might need some
fairly global knowledge of the system to know when it is better to unwrap
and rewrap, and when it is better to further wrap the existing wrapper.

Another issue that occurs with such wrapper-based adaptation, is that the
wrapper does not have the same object identity as the base object, and may
not hash or compare equal to it, either.  Further, it is not guaranteed that
subsequent calls to \function{adapt()} will yield the same wrapper object --
in fact it's quite unlikely.
These characteristics of adapted objects can be easily dealt with, however,
by following a few simple rules:

\begin{itemize}

\item Always \function{adapt()} from the ``original'' object you're
supplied; avoid adapting adaptations.

\item Always pass ``original'' objects to functions or methods that expect
their input to support more than one protocol; only pass adapted objects to
functions or methods that expect support for only one protocol.

\item Always use ``original'' objects for equality or identity comparisons
-- or else ensure that callers know they will need to provide you with an
equal or identical adapter.  (One good way to document this, is to include
the requirement in the definition of the interface or protocol that your
system requires.)

\end{itemize}

In some respects, these rules are similar to dealing with objects in
statically typed languages like Java.  In Java, if one simply has an
``object'', it is not possible to perform operations specific to an
interface, without first ``casting'' the object to that interface.  But, the
object that was ``cast'' can't be stored in the same variable that the
``object'' was in, because it is of a different type.

\newpage

\subsubsection{Replacing Introspection with
Adaptation\label{replintrowadapt}}

\begin{quotation}
To summarize: don't type check.

\hfill --- Alex Martelli, on \newsgroup{comp.lang.python}
\end{quotation}

Component adaptation is intended to completely replace all non-cooperative
introspection techniques, such as \function{type()},
\function{isinstance()}, \function{hasattr()}, and even interface checks.
Such introspection tends to limit framework flexibility by unnecessarily
closing policies to extension by framework users.  It often makes code
maintenance more difficult as well, since such checks are often performed in
more than one place, and must be kept in sync whenever a new interface or
type must be checked.

Some common use cases for such introspection are:

\begin{itemize}
\item To manually adapt a supplied component to a needed interface

\item To select one of several possible behaviors, based on the kind of
component supplied

\item To select another component, or take some action, using information
about the interfaces supported by the supplied component
\end{itemize}

Obviously, the first case is handled quite well by \function{adapt()}, at
least in an environment where it's easy to declare adapters between types
and protocols.  The second and third cases may at first seem to demand an
ability to introspect what interfaces are supported by a component.  But
they are almost always better served by defining new protocols that supply
the required behavior or metadata, and then requesting implementations of
those protocols.

In all three use cases, replacing introspection with adaptation opens the
framework to third party extensions, without further modifications being
required -- and without the need to do extensive design or documentation of
a new hook or extension point to be added to the framework.  Indeed, the
availability of a standard mechanism for adaptation means that the extension
mechanism need only be documented once: right here in this document.

In section \ref{introspect-elim}, we will present techniques for refactoring
all three kinds of introspection code to purely adaptation-driven code,
showing how the flexibility and readability of the code improves in the
process.
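As a small foretaste of that refactoring, compare the two styles below.
(\note{\class{Person}, \class{Company}, and \class{INamed} are hypothetical
names invented for this sketch.})

\begin{verbatim}
# Introspective style: the set of supported types is closed
def describeIntrospectively(ob):
    if isinstance(ob, Person):
        return ob.full_name
    elif isinstance(ob, Company):
        return ob.legal_name
    else:
        raise TypeError("Don't know how to describe", ob)

# Adaptive style: anyone can register an INamed adapter later,
# without this function ever changing
from protocols import adapt

def describeAdaptively(ob):
    return adapt(ob, INamed).getName()
\end{verbatim}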
But first, we will need to cover how protocols and interfaces can be
defined, declared, and adapted, using the API provided by the
\module{protocols} package.

\begin{seealso}
\seetitle[http://www.canonical.org/\symbol{126}kragen/isinstance/]
{isinstance() Considered Harmful}{A brief critique of common justifications
for using introspection}
\end{seealso}

\newpage

\subsubsection{Differences Between \function{protocols.adapt()} and
\pep{246}}

If you have read \pep{246} or are looking for an exact implementation of it,
you should know that there are a few differences between the
\module{protocols} implementation of \function{adapt()} and the \pep{246}
specification.  If you don't care about these differences, you can skip this
mini-appendix and proceed directly to section \ref{protocols-defining},
``Defining and Subclassing Interfaces''.

The first difference is that \exception{TypeError} is treated differently in
each implementation.  \pep{246} says that if a \method{__conform__} or
\method{__adapt__} method raises a \exception{TypeError}, it should be
treated in the same way as if the method returned \constant{None}.  This was
a workaround for the issue of accidentally calling an unbound class method,
in the case where a component or protocol supplied to \function{adapt()} was
a class.  The \module{protocols} implementation of \function{adapt()}
attempts to catch such errors also, but will reraise any exception that
appears to come from \emph{within} the execution of the \method{__conform__}
or \method{__adapt__} method.  So if these methods raise a
\exception{TypeError}, it will be passed through to the caller of
\function{adapt}.  Thus, if you are writing one of these methods, you should
not raise a \exception{TypeError} to signal the lack of an adaptation.
Rather, you should return \constant{None}.

Second, \exception{protocols.AdaptationFailure} is raised when no adaptation
is found, and no default is supplied, rather than the \exception{TypeError}
specified by \pep{246}.  (\note{\exception{protocols.AdaptationFailure} is a
subclass of \exception{TypeError} and \exception{NotImplementedError}, so
code written to catch either of these errors will work.})

These differences are the result of experience using the \module{protocols}
package with PEAK, and advances in the Python state-of-the-art since
\pep{246} was written (over two years ago).  We believe that they make the
adaptation protocol more robust, more predictable, and easier to use for its
most common applications.

\subsubsection{Convenience Adaptation API (NEW in
0.9.3)\label{protocols-calling}}

As of version 0.9.3, PyProtocols supports the simplified adaptation API that
was pioneered by Twisted, and later adopted by Zope.  In this simplified
API, a protocol can be called, passing in the object to be adapted.  So, for
example, instead of calling \code{adapt(foo,IBar)}, one may call
\code{IBar(foo)}.  The optional \var{default} argument can also be supplied,
following the \var{component} parameter.

All of the protocol types supplied by PyProtocols now support this simpler
calling scheme, except for \class{AbstractBase} subclasses, because calling
an \class{AbstractBase} subclass should create an instance of that subclass,
not attempt to adapt an arbitrary object.

Notice, by the way, that you should only use this simplified API if you know
for certain that the protocol supports it.  For example, it's safe to invoke
a known, constant interface object in this way.  But if you're writing code
that may receive a protocol object as a parameter or via another object, you
should use \function{adapt()} instead, because you may receive a protocol
object that does not support this shortcut API.
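The two calling styles, and the use of the \var{default} argument, can be
summarized in a short sketch.  (\note{\class{IBar} and \code{foo} are
hypothetical stand-ins, continuing the example above.})

\begin{verbatim}
from protocols import adapt, AdaptationFailure

bar = IBar(foo)               # convenience form: safe only for a known,
                              # constant protocol object such as IBar
bar = adapt(foo, IBar)        # equivalent, and safe for any protocol object

bar = adapt(foo, IBar, None)  # returns None instead of raising

try:
    bar = adapt(foo, IBar)
except AdaptationFailure:
    # subclass of both TypeError and NotImplementedError
    bar = None
\end{verbatim}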
\newpage

\subsection{Defining and Subclassing Interfaces \label{protocols-defining}}

The easiest way to define an interface with the \module{protocols} package
is to subclass \class{protocols.Interface}.  \class{Interface} does not
supply any data or methods of its own, so you are free to define whatever
you need.  There are two common styles of defining interfaces, illustrated
below:

\begin{verbatim}
from protocols import Interface, AbstractBase

# "Pure" interface style

class IReadMapping(Interface):
    """A getitem-only mapping"""

    def __getitem__(key):
        """Return value for key"""


# Abstract Base Class (ABC) style

class AbstractMapping(AbstractBase):
    """A getitem-only mapping"""

    def __getitem__(self, key):
        """Return value for key"""
        raise NotImplementedError
\end{verbatim}

The ``pure'' style emphasizes the interface as seen by the caller, and is
not intended to be subclassed for implementation.  Notice that the
\code{self} parameter is not included in its method definitions, because
\code{self} is not supplied when calling the methods.  The ``ABC'' style, on
the other hand, emphasizes implementation, as it is intended to be
subclassed for that purpose.  Therefore, it includes method bodies, even for
abstract methods.  Each style has different uses: ``ABC'' is a popular rapid
development style, while the ``pure'' approach has some distinct
documentation advantages.

\class{protocols.AbstractBase} may be used as a base class for either style,
but \class{protocols.Interface} is only usable for the ``pure'' interface
style, as it supports the convenience adaptation API (see section
\ref{protocols-calling}).  (\note{Both base classes use an explicit
metaclass, so keep in mind that if you want to subclass an abstract base for
implementation using a different metaclass, you may need to create a third
metaclass that combines \class{protocols.AbstractBaseMeta} with your desired
metaclass; a sketch of such a combination appears below.})

Subclassing a subclass of \class{Interface} (or \class{AbstractBase})
creates a new interface (or ABC) that implies the first interface (or ABC).
This means that any object that supports the second interface (or ABC), is
considered to implicitly support the first interface (or ABC).  For example:

\begin{verbatim}
class IReadWriteMapping(IReadMapping):
    """Mapping with getitem and setitem only"""

    def __setitem__(key, value):
        """Store value for key, return None"""
\end{verbatim}

The \code{IReadWriteMapping} interface implies the \code{IReadMapping}
interface.  Therefore, any object that supports \code{IReadWriteMapping} is
understood to also support the \code{IReadMapping} interface.  The reverse,
however, is not true.
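Here is the metaclass combination mentioned in the note above -- a sketch
only, with \class{OtherMeta} standing in for whatever metaclass your
implementation classes need:

\begin{verbatim}
from protocols import AbstractBase, AbstractBaseMeta

class OtherMeta(type):
    """Some other metaclass your implementations must use"""

class CombinedMeta(AbstractBaseMeta, OtherMeta):
    pass

class AbstractMapping(AbstractBase):
    __metaclass__ = CombinedMeta

    def __getitem__(self, key):
        """Return value for key"""
        raise NotImplementedError
\end{verbatim}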
Inheritance is only one way to declare that one interface implies another,
however, and its uses are limited.  Let's say, for example, that some
package \code{A} supplies objects that support \code{IReadWriteMapping},
while package \code{B} needs objects that support \code{IReadMapping}.  But
each package declared its own interface, neither inheriting from the other.
As developers reading the documentation of these interfaces, it is obvious
to us that \code{IReadWriteMapping} implies \code{IReadMapping}, because we
understand what they do.  But there is no way for Python to know this,
unless we explicitly state it, like this:

\begin{verbatim}
import protocols
from A import IReadWriteMapping
from B import IReadMapping

protocols.declareAdapter(
    protocols.NO_ADAPTER_NEEDED,
    provides = [IReadMapping],
    forProtocols = [IReadWriteMapping]
)
\end{verbatim}

In the above example, we use the \module{protocols} declaration API to say
that no adapter is needed to support the \code{B.IReadMapping} interface for
objects that already support the \code{A.IReadWriteMapping} interface.  At
this point, if we supply an object that supports \code{IReadWriteMapping},
to a function that expects an \code{IReadMapping}, it should work, as long
as we call \code{adapt(ob,IReadMapping)} (or \code{IReadMapping(ob)}) first,
or the code we're calling does so.

There are still other ways to declare that one interface implies another.
For example, if the author of our example package \code{B} knew about
package \code{A} and its \code{IReadWriteMapping} interface, he or she might
have defined \code{IReadMapping} this way:

\begin{verbatim}
import protocols
from protocols import Interface
from A import IReadWriteMapping

class IReadMapping(Interface):
    """A getitem-only mapping"""

    protocols.advise(
        protocolIsSubsetOf = [IReadWriteMapping]
    )

    def __getitem__(key):
        """Return value for key"""
\end{verbatim}

This is syntax sugar for creating the interface first, and then using
\code{protocols.declareAdapter(NO_ADAPTER_NEEDED)}.  Of course, you can only
use this approach if you are the author of the interface!  Otherwise, you
must use \function{declareAdapter()} after the fact, as in the previous
example.

In later sections, we will begin looking at the \module{protocols}
declaration APIs -- like \function{declareAdapter()} and \function{advise()}
-- in more detail.  But first, we must look briefly at the interfaces that
the \module{protocols} package expects from the protocols, adapters, and
other objects supplied as parameters to the declaration API.

\newpage

\subsection{Interfaces Used by the Declaration
API\label{protocols-declaration-interfaces}}

Like any other API, the \module{protocols} declaration API has certain
expectations regarding its parameters.  These expectations are documented
and referenced in code using interfaces defined in the
\module{protocols.interfaces} module.  (The interfaces are also exported
directly from the top level of the \module{protocols} package.)

You will rarely use or subclass any of these interface objects, unless you
are customizing or extending the system.  Four of the interfaces exist
exclusively for documentation purposes, while the rest are used in
\function{adapt()} calls made by the API.

First, let's look at the documentation-only interfaces.  It is not necessary
for you to declare that an object supports these interfaces, and the
\module{protocols} package never tries to \function{adapt()} objects to
them.

\begin{description}

\item[IAdapterFactory] \hfill \\
Up until this point, we've been talking about ``adapters'' rather loosely.
The \class{IAdapterFactory} interface formalizes the concept.  An
\strong{adapter factory} is a callable object that accepts an object to be
adapted, and returns an object that provides the protocol on behalf of the
passed-in object.  Declaration API functions that take ``adapter'' or
``factory'' arguments must be adapter factories.  The \module{protocols}
package supplies two functions that provide this interface:
\function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT}.
\function{NO_ADAPTER_NEEDED} is used to declare that an object provides a
protocol directly, and thus it returns the object passed into it, rather
than some kind of adapter.  \function{DOES_NOT_SUPPORT} is used to declare
that an object does not support a protocol, even with an adapter.  (Since
this is the default case, \function{DOES_NOT_SUPPORT} is rarely used, except
to indicate that a subclass does not support an interface that one of its
superclasses does.)

\item[IProtocol] \hfill \\
This interface formalizes the idea of a ``protocol object''.  As you will
recall from section \ref{protocol-concepts}, a protocol object is any object
that can be used as a Python dictionary key.  The second argument to
\function{adapt()} must be a protocol object according to this definition.

\item[IAdaptingProtocol] \hfill \\
This interface formalizes the idea of an ``adapting protocol'', specifically
that it is a protocol object (i.e., it provides \class{IProtocol}) that also
has an \method{__adapt__} method as described in section
\ref{adapt-protocol}.  \class{IAdaptingProtocol} is a subclass of
\class{IProtocol}, so of course \function{adapt()} accepts such objects as
protocols.

\item[IImplicationListener] \hfill \\
This interface is for objects that want to receive notification when new
implication relationships (i.e. adapters) are registered between two
protocols.  If you have objects that want to keep track of what interfaces
they support, you may want those objects to implement this interface so they
can be kept informed of new protocol-to-protocol adapters.

\end{description}

The other three interfaces are critical to the operation of the declaration
API, and thus must be supported by objects supplied to it.  The
\module{protocols} package supplies and registers various adapter classes
that provide these interfaces on behalf of many commonly used Python object
types.  So, for each interface, we will list ``known supporters'' of that
interface, whether they are classes supplied by \module{protocols}, or
built-in types that are automatically adapted to the interface.  We will
not, however, go into details here about the methods and behavior required
by each interface.  (Those details can be found in section
\ref{protocols-interfaces-module}.)

\begin{description}

\item[IOpenProtocol] \hfill \\
This interface formalizes the ``open protocol'' concept that was introduced
in section \ref{protocol-concepts}.  An \class{IOpenProtocol} is an
\class{IAdaptingProtocol} that can also accept declarations made by the
\module{protocols} declaration API.

The \module{protocols} package supplies two implementations of this
interface: \class{Protocol} and \class{InterfaceClass}.  Thus, any
\class{Interface} subclass or \class{Protocol} instance is automatically
considered to provide \class{IOpenProtocol}.  \note{\class{Interface} is an
instance of \class{InterfaceClass}, and thus provides \class{IOpenProtocol}.
But if you create an instance of an \class{Interface}, that object does not
provide \class{IOpenProtocol}, because the interfaces provided by an object
and its class (or its instances) can be different.}

In addition to its built-in implementations, the \module{protocols} package
also supplies and can declare adapter factories that adapt Zope X3 and
Twisted's interface objects to the \class{IOpenProtocol} interface, thus
allowing you to use Zope and Twisted interfaces in calls to the declaration
API.
Similar adapters for other frameworks' interfaces may be added, if there is
sufficient demand and/or contributed code, and the frameworks' authors do
not add the adapters to their frameworks.

\item[IOpenImplementor] \hfill \\
An \class{IOpenImplementor} is a class or type that can be told (via the
declaration API) what protocols its instances provide (or support via an
\class{IAdapterFactory}).  Note that this implies that the instances have a
\method{__conform__} method, or else they would not be able to tell
\function{adapt()} about the declared support!

Support for this interface is optional, since types that don't support it
can still have their instances be adapted by \class{IOpenProtocol} objects.
The \module{protocols} package does not supply any implementations or
adapters for this interface, either.  It is intended primarily as a hook for
classes to be able to receive notification about protocol declarations for
their instances.

\item[IOpenProvider] \hfill \\
Because objects' behavior usually comes from a class definition, it's not
that often that you will declare that a specific object provides or supports
an interface.  But objects like functions and modules do not have a class
definition, and classes themselves sometimes provide an interface.  (For
example, one could say that class objects provide an \class{IClass}
interface.)  So, the declaration API needs to also be able to declare what
protocols an individual object (such as a function, module, or class)
supports or provides.  That's what the \class{IOpenProvider} interface is
for.

An \class{IOpenProvider} is an object with a \method{__conform__} method,
that can be told (via the declaration API) what protocols it provides (or
supports via an \class{IAdapterFactory}).  Notice that this is different
from \class{IOpenImplementor}, which deals with a class or type's instances.
\class{IOpenProvider} deals with the object itself.  A single object can
potentially be both an \class{IOpenProvider} and an
\class{IOpenImplementor}, if it is a class or type.

The \module{protocols} package supplies and declares an adapter factory that
can adapt most Python objects to support this interface, assuming that they
have a \code{__dict__} attribute.  Thus, it is acceptable to pass a Python
function, module, or instance of a ``classic'' class to any declaration API
that expects an \class{IOpenProvider} argument.  We'll talk more about
making protocol declarations for individual objects (as opposed to types) in
section \ref{protocols-instances}, ``Protocol Declarations for Individual
Objects''.

\end{description}

\newpage

\subsection{Declaring Implementations and Adapters}

There are three kinds of relationships that a protocol can participate in:

\begin{itemize}

\item A relationship between a class or type, and a protocol its instances
provide or can be adapted to,

\item A relationship between an instance, and a protocol it provides or can
be adapted to, and

\item A relationship between a protocol, and another protocol that it
implies or can be adapted to.

\end{itemize}

Each of these relationships is defined by a \strong{source} (a type,
instance or protocol), a \strong{destination} (desired) protocol, and an
\strong{adapter factory} used to convert from one to the other.  If no
adapter is needed, we can say that the adapter factory is the special
\function{NO_ADAPTER_NEEDED} function.

To declare relationships like these, the \module{protocols} declaration API
provides three ``primitive'' declaration functions.  Each accepts a
destination protocol (that must support the \class{IOpenProtocol}
interface), an adapter factory (or \function{NO_ADAPTER_NEEDED}), and a
source (type, instance, or protocol).  These three functions are
\function{declareAdapterForType()}, \function{declareAdapterForObject()},
and \function{declareAdapterForProtocol()}, respectively.
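For concreteness, here is a sketch of a direct call to one of these
primitives, using a trivial one-argument adapter factory.
(\note{\class{FileAsDocumentable} is hypothetical, reusing the
\class{IDocumentable} interface from section \ref{protocols-example1}.})

\begin{verbatim}
from protocols import declareAdapterForType

class FileAsDocumentable(object):
    # An adapter factory is simply called with the object to adapt
    def __init__(self, ob):
        self.file = ob

    def getKind(self):
        return "file"

    # ... etc.

# destination protocol, adapter factory, source type
declareAdapterForType(IDocumentable, FileAsDocumentable, file, depth=1)
\end{verbatim}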
You will not ordinarily use these primitives, however, unless you are
customizing or extending the framework.  It is generally easier to call one
of the higher level functions in the declaration API.  These higher-level
functions may make several calls to the primitive functions on your behalf,
or supply useful defaults for certain parameters.  They are, however, based
entirely on the primitive functions, which is important for customizations
and extensions.

The next higher layer of declaration APIs are the explicit declaration
functions: \function{declareImplementation}, \function{declareAdapter}, and
\function{adviseObject}.  These functions are structured to support the most
common declaration use cases.

For declaring protocols related to a type or class:

\begin{funcdesc}{declareImplementation}{typ \optional{, instancesProvide=[ ]}
\optional{, instancesDoNotProvide=[ ]}}
Declare that instances of class or type \var{typ} do or do not provide
implementations of the specified protocols.  \var{instancesProvide} and
\var{instancesDoNotProvide} must be sequences of protocol objects that
provide (or are adaptable to) the \class{IOpenProtocol} interface, such as
\class{protocols.Interface} subclasses, or \class{Interface} objects from
Zope or Twisted.

This function is shorthand for calling \function{declareAdapterForType()}
with \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT} as
adapters from the type to each of the specified protocols.  Note, therefore,
that the listed protocols must be adaptable to \class{IOpenProtocol}.  See
\function{declareAdapterForType()} in section \ref{protocols-contents} for
details.
\end{funcdesc}

For declaring protocols related to a specific, individual instance:

\begin{funcdesc}{adviseObject}{ob \optional{, provides=[ ]}
\optional{, doesNotProvide=[ ]}}
Declare that \var{ob} provides (or does not provide) the specified
protocols.  This is shorthand for calling
\function{declareAdapterForObject()} with \function{NO_ADAPTER_NEEDED} and
\function{DOES_NOT_SUPPORT} as adapters from the object to each of the
specified protocols.  Note, therefore, that \var{ob} may need to support
\class{IOpenProvider}, and the listed protocols must be adaptable to
\class{IOpenProtocol}.  See \function{declareAdapterForObject()} in section
\ref{protocols-contents} for details.  Also, see section
\ref{protocols-instances}, ``Protocol Declarations for Individual Objects'',
for more information on using \function{adviseObject}.
\end{funcdesc}

And for declaring all other kinds of protocol relationships:

\begin{funcdesc}{declareAdapter}{factory, provides
\optional{, forTypes=[ ]} \optional{, forProtocols=[ ]}
\optional{, forObjects=[ ]}}
Declare that \var{factory} is an \class{IAdapterFactory} whose return value
provides the protocols listed in \var{provides} as an adapter for the
classes/types listed in \var{forTypes}, for objects providing the protocols
listed in \var{forProtocols}, and for the specific objects listed in
\var{forObjects}.

This function is shorthand for calling the primitive declaration functions
for each of the protocols listed in \var{provides} and each of the sources
listed in the respective keyword arguments.
\end{funcdesc}
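A sketch showing each of the three in use, reusing the hypothetical
documentation example from section \ref{protocols-example1}
(\class{MyDocTool} and \module{mymodule} are invented for the sketch):

\begin{verbatim}
from types import FunctionType
from protocols import declareImplementation, declareAdapter, adviseObject

# instances of MyDocTool provide IDocumentable directly
declareImplementation(MyDocTool, instancesProvide=[IDocumentable])

# FunctionAsDocumentable adapts functions to IDocumentable
declareAdapter(FunctionAsDocumentable,
               provides=[IDocumentable], forTypes=[FunctionType])

# this one specific module object provides IDocumentable itself
import mymodule
adviseObject(mymodule, provides=[IDocumentable])
\end{verbatim}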
Although these forms are easier to use than raw \code{declareAdapterForX}
calls, they still require explicit reference to the types or objects
involved.  For the most common use cases, such as declaring protocol
relationships to a class, or declaring an adapter class, it is usually
easier to use the ``magic'' \function{protocols.advise()} function, which we
will discuss next.

\subsubsection{Convenience Declarations in Class, Interface and Module
Bodies \label{protcols-advise}}

Adapters, interfaces, and protocol implementations are usually defined in
Python \code{class} statements.  To make it more convenient to make protocol
declarations for these classes, the \module{protocols} package supplies the
\function{advise()} function.  This function can make declarations about a
class, simply by being called from the body of that class.  It can also be
called from the body of a module, to make a declaration about the module.

\begin{funcdesc}{advise}{**kw}
Declare protocol relationships for the containing class or module.  All
parameters must be supplied as keyword arguments.  This function must be
called directly from a class or module body, or a \exception{SyntaxError}
results at runtime.

Different arguments are accepted, according to whether the function is
called within a class or module.  When invoked in the top-level code of a
module, this function only accepts the \code{moduleProvides} keyword
argument.  When invoked in the body of a class definition, this function
accepts any keyword arguments \emph{except} \code{moduleProvides}.  The
complete list of keyword arguments follows.  Unless otherwise specified,
protocols must support the \class{IOpenProtocol} interface.

\note{When used in a class body, this function works by temporarily
replacing the \code{__metaclass__} of the class.  If your class sets an
explicit \code{__metaclass__}, it must do so \emph{before}
\function{advise()} is called, or the protocol declarations will not occur!}
\end{funcdesc}

Keyword arguments accepted by \function{advise()}:

\begin{description}

\item[instancesProvide = \var{protocols}] \hfill \\
A sequence of protocols that instances of the containing class provide,
without needing an adapter.  Supplying this argument is equivalent to
calling \code{declareImplementation(\var{containing class},
\var{protocols})}.

\item[instancesDoNotProvide = \var{protocols}] \hfill \\
A sequence of protocols that instances of the containing class do not
provide.  This is primarily intended for ``rejecting'' protocols provided or
supported by base classes of the containing class.  Supplying this argument
is equivalent to calling \code{declareImplementation(\var{containing class},
instancesDoNotProvide=\var{protocols})}.

\item[asAdapterForTypes = \var{types}] \hfill \\
Declare the containing class as an adapter for \var{types}, to the protocols
listed by the \code{instancesProvide} argument (which must also be
supplied).  Supplying this argument is equivalent to calling
\code{declareAdapter(\var{containing class}, \var{instancesProvide},
forTypes=\var{types})}.  (Note that this means the containing class must be
an object that provides \class{IAdapterFactory}; i.e., its constructor
should accept being called with a single argument: the object to be
adapted.)

\item[asAdapterForProtocols = \var{protocols}] \hfill \\
Declare the containing class as an adapter for \var{protocols}, to the
protocols listed by the \code{instancesProvide} argument (which must also be
supplied).
Supplying this argument is equivalent to calling
\code{declareAdapter(\var{containing class}, \var{instancesProvide},
forProtocols=\var{protocols})}.  (Note that this means the containing class
must be an object that provides \class{IAdapterFactory}; i.e., its
constructor should accept being called with a single argument: the object to
be adapted.)

\item[factoryMethod = \var{methodName}] \hfill \\
\versionadded{0.9.1}
When using \code{asAdapterForTypes} or \code{asAdapterForProtocols}, you can
also supply a factory method name, using this keyword.  The method named
must be a \emph{class} method, and it will be used in place of the class'
normal constructor.  (Note that this means the named method must be able to
be called with a single argument: the object to be adapted.)

\item[protocolExtends = \var{protocols}] \hfill \\
Declare that the containing class is a protocol that extends (i.e., implies)
the listed protocols.  This keyword argument is intended for use inside
class statements that themselves define protocols, such as \class{Interface}
subclasses, and that need to ``inherit'' from incompatible protocols.  For
example, an \class{Interface} cannot directly subclass a Zope interface,
because their metaclasses are incompatible.  But using \code{protocolExtends}
works around this:

\begin{verbatim}
import protocols
from mypackage import ISomeInterface
from zope.something.interfaces import ISomeZopeInterface

class IAnotherInterface(ISomeInterface):

    protocols.advise(
        protocolExtends = [ISomeZopeInterface]
    )

    #... etc.
\end{verbatim}

In the above example, \code{IAnotherInterface} wants to extend both
\code{ISomeInterface} and \code{ISomeZopeInterface}, but cannot do so
directly because the interfaces are of incompatible types.
\code{protocolExtends} informs the newly created interface that it implies
\code{ISomeZopeInterface}, even though it isn't derived from it.  Using this
keyword argument is equivalent to calling
\code{declareAdapter(NO_ADAPTER_NEEDED, \var{protocols},
forProtocols=[\var{containing class}])}.  Note that this means that the
containing class must be an object that supports \class{IOpenProtocol}, such
as an \class{Interface} subclass.

\item[protocolIsSubsetOf = \var{protocols}] \hfill \\
Declare that the containing class is a protocol that is implied (extended)
by the listed protocols.  This is just like \code{protocolExtends}, but in
the ``opposite direction''.  It allows you to declare (in effect) that some
other interface is actually a subclass of (extends, implies) this one.  See
the examples in section \ref{protocols-defining} for illustration.  Using
this keyword argument is equivalent to calling
\code{declareAdapter(NO_ADAPTER_NEEDED, [\var{containing class}],
forProtocols=\var{protocols})}.

\item[equivalentProtocols = \var{protocols}] \hfill \\
\versionadded{0.9.1}
Declare that the containing class is a protocol that is equivalent to the
listed protocols.  That is, the containing protocol both implies, and is
implied by, the listed protocols.  This is a convenience feature intended
mainly to support the use of generated protocols.  Using this keyword is
equivalent to using both the \code{protocolExtends} and
\code{protocolIsSubsetOf} keywords, with the supplied protocols.

\item[classProvides = \var{protocols}] \hfill \\
Declare that the containing class \emph{itself} provides the specified
protocols.  Supplying this argument is equivalent to calling
\code{adviseObject(\var{containing class}, \var{protocols})}.
Note that this means that the containing class may need to support the \class{IOpenProvider} interface.  The \module{protocols} package supplies default adapters to support \class{IOpenProvider} for both classic and new-style classes, as long as they do not have custom \method{__conform__} methods.  See section \ref{protocols-instances}, ``Protocol Declarations for Individual Objects'' for more details.

\item[classDoesNotProvide = \var{protocols}] \hfill \\
Declare that the containing class \emph{itself} does not provide the specified protocols.  This is for classes that need to reject inherited class-level \code{classProvides} declarations.  Supplying this argument is equivalent to calling \code{adviseObject(\var{containing class}, doesNotProvide=\var{protocols})}, and the \class{IOpenProvider} requirements mentioned above for \code{classProvides} apply here as well.

\item[moduleProvides = \var{protocols}] (module context only) \hfill \\
A sequence of protocols that the enclosing module provides.  Equivalent to \code{adviseObject(\var{containing module}, \var{protocols})}.

\end{description}

\newpage
\subsubsection{Protocol Declarations for Individual Objects \label{protocols-instances}}

Because objects' behavior usually comes from a class definition, it's not too often that you will declare that an individual object provides or supports an interface, as opposed to making a blanket declaration about an entire class or type of object.

But objects like functions and modules do not \emph{have} a class definition that encompasses their behavior, and classes themselves sometimes provide an interface (e.g. via \function{classmethod} objects).  So, the declaration API needs to also be able to declare what protocols an individual object (such as a function, module, or class) supports or provides.  This is what \function{adviseObject()} and the \code{classProvides}/\code{classDoesNotProvide} keywords of \function{advise()} do.

In most cases, for an object to be usable with \function{adviseObject()}, it must support the \class{IOpenProvider} interface.  Since many of the objects one might wish to use with \function{adviseObject()} (such as modules, functions, and classes) do not directly provide this interface, the \module{protocols.classic} module supplies and declares an adapter factory that can adapt most Python objects to support this interface, assuming that they have a \code{__dict__} attribute.

This default adapter works well for many situations, but it has some limitations you may need to be aware of.  First, it works by ``poking'' a new \method{__conform__} method into the adapted object.  If the object already has a \method{__conform__} method, a \exception{TypeError} will be raised.  So, if you need an object to be an \class{IOpenProvider}, but it has a \method{__conform__} method, you may want to have its class include \class{ProviderMixin} among its base classes, so that your objects won't rely on the default adapter for \class{IOpenProvider}.  (See \class{ProviderMixin} in section \ref{protocols-contents} for more on this.)

Both the default adapter and \class{ProviderMixin} support inheritance of protocol declarations, when the object being adapted is a class or type.  In this way, \code{advise(classProvides=\var{protocols})} declarations (or \code{adviseObject(someClass, provides=\var{protocols})} calls) are inherited by subclasses.
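
For instance, here is a minimal sketch of an inherited class-level declaration (the \code{IFactory} interface and the classes shown are hypothetical):

\begin{verbatim}
import protocols
from protocols import Interface, adapt

class IFactory(Interface):
    """Hypothetical interface provided by classes themselves"""

class Widget(object):
    protocols.advise(classProvides=[IFactory])

class FancyWidget(Widget):
    pass    # inherits Widget's class-level declaration

# The classes themselves (not their instances) provide IFactory:
adapt(Widget, IFactory)       # returns Widget
adapt(FancyWidget, IFactory)  # returns FancyWidget
\end{verbatim}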
Of course, you can always reject inherited protocol information using \code{advise(classDoesNotProvide=\var{protocols})} or \function{adviseObject(newClass,doesNotProvide=\var{protocols})}. Both the default adapter and \class{ProviderMixin} work by keeping a mapping of protocols to adapter factories. Keep in mind that this means the protocols and adapter factories will continue to live until your object is garbage collected. Also, that means for your object to be pickleable, all of the protocols and adapter factories used must be pickleable. (This latter requirement can be quite difficult to meet, since composed adapter factories are dynamically created functions at present.) Note that none of these restrictions apply if you are only using declarations about types and protocols, as opposed to individual objects. (Or if you only make individual-object declarations for functions, modules, and classes.) Also note that if you have some objects that need to dynamically support or not support a protocol on a per-instance basis, then \function{adviseObject()} is probably not what you want anyway! Instead, give your objects' class a \method{__conform__()} method that does the right thing when the object is asked to conform to a protocol. \function{adviseObject()} is really intended for adding metadata to objects that ``don't know any better''. In general, protocol declarations are a \emph{static} mechanism: they cannot be changed or removed at will, only successively refined. All protocol declarations made must be consistent with the declarations that have already been made. This makes them unsuitable as a mechanism for dynamic behavior such as supporting a protocol based on an object's current state. In the next section, we'll look more at the static nature of declarations, and explore what it means to make conflicting (or refining) protocol declarations. \newpage \subsection{Protocol Implication and Adapter Precedence \label{proto-implication}} So far, we've only dealt with simple one-to-one relationships between protocols, types, and adapter factories. We haven't looked, for example, at what happens when you define that class X instances provide interface IX, that AXY is an adapter factory that adapts interface IX to interface IY, and class Z subclasses class X. (As you might expect, what happens is that Z instances will be wrapped with an AXY adapter when you call \code{adapt(instanceOfZ, IY)}.) Adaptation relationships declared via the declaration API are \strong{transitive}. This means that if you declare an adaptation from item A to item B, and from item B to item C, then there is an \strong{adapter path} from A to C. An adapter path is effectively a sequence of adapter factories that can be applied one by one to get from a source (type, object, or protocol) to a desired destination protocol. Adapter paths are automatically composed by the types, objects, and protocols used with the declaration API, using the \function{composeAdapters()} function. Adapter paths are said to have a \strong{depth}, which is the number of steps taken to get from the source to the destination protocol. For example, if factory AB adapts from A to B, and factory BC adapts from B to C, then an adapter factory composed of AB and BC would have a depth of 2. However, if we registered another adapter, AC, that adapts directly from A to C, this adapter path would have a depth of 1. 
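
To make this concrete, here is a minimal sketch of transitive adaptation; the interfaces and adapter classes are hypothetical:

\begin{verbatim}
import protocols
from protocols import Interface, adapt

class IA(Interface): pass
class IB(Interface): pass
class IC(Interface): pass

class Source(object):
    protocols.advise(instancesProvide=[IA])

class AtoB(protocols.Adapter):
    protocols.advise(instancesProvide=[IB], asAdapterForProtocols=[IA])

class BtoC(protocols.Adapter):
    protocols.advise(instancesProvide=[IC], asAdapterForProtocols=[IB])

# No direct IA-to-IC adapter was declared, but a composed path
# (AtoB, then BtoC) of depth 2 is found automatically:
c = adapt(Source(), IC)   # a BtoC wrapping an AtoB wrapping the Source
\end{verbatim}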
Naturally, adapter paths with lesser depth are more desirable, as they are less likely to be a ``lossy'' conversion, and are more likely to be efficient.  For this reason, shorter paths take precedence over longer paths.  Whenever an adapter factory is declared between two points that previously required a longer path, all adapter paths that previously included the longer path segment are updated to use the newly shortened route.  Whenever an adapter factory is declared that would \emph{lengthen} an existing path, it is ignored.  The net result is that the overall network of adapter paths will tend to stabilize over time.  As an added benefit, it is safe to define circular adapter paths (e.g. A to B, B to C, C to A), as only the shortest useful adapter paths are generated.

We've previously mentioned the special adapter factories \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT}.  There are a couple of special rules regarding these adapters that we need to add.  Any adapter path that contains \function{DOES_NOT_SUPPORT} can be reduced to a single instance of \function{DOES_NOT_SUPPORT}, and any adapter path that contains \function{NO_ADAPTER_NEEDED} is equivalent to the same adapter path without it.  These simplifications can be used to shorten adapter paths, but they are only taken into consideration when comparing paths whose ``unsimplified'' versions are the same length.

Let's consider two adapter paths between A and C.  Each proceeds by way of B.  (i.e., they go from A to B to C.)  Which one is preferable?  Both paths have a depth of 2, because there are two steps (A to B, B to C).  But suppose one adapter path contains two arbitrary adapter factories, and the other is composed of one factory plus \function{NO_ADAPTER_NEEDED}.  Clearly, that path is superior, since it effectively contains only one adapter instead of two.

This simplification, however, can \emph{only} be applied when the unsimplified paths are of the same length.  Why?  Consider our example of two paths from A to B to C.  If someone declares a direct path from A to C (i.e. not via B or any other intermediate protocol), we want this path to take precedence over an indirect path, even if both paths ``simplify'' to the same length.  Only if we are choosing between two paths with the same number of steps can we use the length of their simplified forms as a ``tiebreaker''.

So what happens when choosing between paths of the same number of steps and the same simplified length?  A \exception{TypeError} occurs, unless one of these conditions applies:

\begin{itemize}
\item One of the paths simplifies to \function{DOES_NOT_SUPPORT}, in which case the other path is considered preferable.  (Some ability is better than none.)
\item One of the paths simplifies to \function{NO_ADAPTER_NEEDED}, in which case it is considered preferable.  (It's better not to have to adapt.)
\item Both of the paths are the same object, in which case no change is required to the existing path.  (The declaration is redundant.)
\end{itemize}

Notice that this means that it is not possible to override an existing adapter path unless you are improving on it in a way visible to the system.  This doesn't mean, however, that you can't take advantage of existing declarations, while still overriding some of them.  Suppose that there exists a set of existing adapters and protocols defined by some frameworks, and we are writing an application using them.  We would like, however, for our application to be able to override certain existing relationships.
Say for example that we'd like to have an adapter path from A to C that's custom for our application, but we'd like to ``inherit'' all the other adaptations to C, so that by default any C implementation is still useful for our application.  The simple solution is to define a new protocol D as a \strong{subset} of protocol C.  This is effectively saying that \function{NO_ADAPTER_NEEDED} can adapt from C to D.  All existing declarations adapting to C are now usable as adaptations to D, but they will have lower precedence than any direct adaptation to D.  So now we define our direct adaptation from A to D, and it will take precedence over any A to C to D path.  But any existing path that goes to C will be ``inherited'' by D.

Speaking of inheritance, please note that inheritance between types/classes has no effect on adapter path depth calculations.  Instead, any path defined for a subclass takes absolute precedence over paths defined for a superclass, because the subclass is effectively a different starting point.  In other words, if A is a class, and Q subclasses A, then an adapter path between Q and some protocol is a different path than the path between A and that protocol.  There is no comparison between the two, and no conflict.  However, if a path from Q to a desired protocol does not exist, then the existing best path for A will be used.

Sometimes, one wishes to subclass a class without taking on its full responsibilities.  It may be that we want Q to use A's implementation, but we do not want to support some of A's protocols.  In that case, we can declare \function{DOES_NOT_SUPPORT} adapters for those protocols, and these will ensure that the corresponding adapter paths for A are not used.  This is called \strong{rejecting inherited declarations}.  It is not, generally speaking, a good idea.  If you want to use an existing class' implementation, but do not wish to abide by its contracts (protocols), you should be using \strong{delegation} rather than inheritance.  That is, you should define your new class so that it has an attribute that is an instance of the old class.  For example, if you are tempted to subclass Python's built-in dictionary type, but you do not want your subclass to really \emph{be} a dictionary, you should simply have an attribute that is a dictionary.

Because rejecting inherited declarations is a good indication that inheritance is being used improperly, the \module{protocols} package does not encourage the practice.  Declaring a protocol as \function{DOES_NOT_SUPPORT} does not propagate to implied protocols, so every rejected protocol \emph{must} be listed explicitly.  If class A provides protocol B, and protocol B derives from (i.e. implies) protocol C, then you must explicitly reject both B and C if you do not want your subclass to support them.

\begin{seealso}
The logic of composing and comparing adapter paths is implemented via the \function{composeAdapters()} and \function{minimumAdapter()} functions in the \module{protocols.adapters} module.  See section \ref{protocol-adapters-module} for more details on these and other functions that relate to adapter paths.
\end{seealso}

\newpage
\subsection{Dynamic Protocols (NEW in 0.9.1)\label{protocols-generated}}

For many common uses of protocols, it may be inconvenient to subclass \class{protocols.Interface} or to manually create a \class{Protocol} instance.  So, the \module{protocols} package includes a number of utility functions to make these uses more convenient.
\subsubsection{Defining a protocol based on a URI or UUID}\label{protocols-generated-uri}

\begin{funcdesc}{protocolForURI}{uri}
\versionadded{0.9.1}
Return a protocol object that represents the supplied URI or UUID string.  It is guaranteed that you will receive the same protocol object if you call this routine more than once with equal strings.  This behavior is preserved even across pickling and unpickling of the returned protocol object.

The purpose of this function is to permit modules to refer to protocols defined in another module that may or may not be present at runtime.  To do this, a protocol author can declare that their protocol is equivalent to a URI string:

\begin{verbatim}
from protocols import advise, Interface, protocolForURI

class ISomething(Interface):
    advise(
        equivalentProtocols = [protocolForURI("some URI string")]
    )
    # etc...
\end{verbatim}

Then, if someone wishes to use this protocol without importing \code{ISomething} (and thereby becoming dependent on the module that provides it), they can do something like:

\begin{verbatim}
from protocols import advise, protocolForURI

class MyClass:
    advise(
        instancesProvide = [protocolForURI("some URI string")]
    )
    # etc...
\end{verbatim}

Thus, instances of \code{MyClass} will be considered to support \code{ISomething}, if needed.  But if \code{ISomething} doesn't exist, no error occurs.
\end{funcdesc}

\subsubsection{Defining a protocol as a subset of an existing type}\label{protocols-generated-type}

\begin{funcdesc}{protocolForType}{baseType, \optional{methods=(), implicit=False}}
Return a protocol object that represents the subset of \var{baseType} denoted by \var{methods}.  It is guaranteed that you will receive the same protocol object if you call this routine more than once with equivalent parameters.  This behavior is preserved even across pickling and unpickling of the returned protocol object.

\var{baseType} should be a type object, and \var{methods} should be a sequence of attribute or method names.  (The order of the names is not important.)  The \var{implicit} flag allows adapting objects that don't explicitly declare support for the protocol.  (More on this later.)

Typical usage of this function is to quickly define a simple protocol based on a Python built-in type such as \class{list}, \class{dict}, or \class{file}:

\begin{verbatim}
IReadFile = protocols.protocolForType(file, ['read','close'])
IReadMapping = protocols.protocolForType(dict, ['__getitem__'])
\end{verbatim}

The advantage of using this function instead of creating an \class{Interface} subclass is that users do not need to import your specific \class{Interface} definition.  As long as they declare support for a protocol based on the same type, and with at least the required methods, then their object will be considered to support the protocol.  For example, declaring that you support \code{protocolForType(file, ['read','write','close'])} automatically implies that you support \code{protocolForType(file, ['read','close'])} and \code{protocolForType(file, ['write','close'])} as well.  (Note: instances of the \var{baseType} and its subclasses will also be considered to provide the returned protocol, whether or not they explicitly declare support.)

If you supply a true value for the \var{implicit} flag, the returned protocol will also adapt objects that have the specified methods or attributes.
In other words, \code{protocolForType(file, ['read','close'], True)} returns a protocol that will consider any object with \method{read} and \method{close} methods to provide that protocol, as well as objects that explicitly support \code{protocolForType(file, ['read','close'])}.

In order to automatically declare the relationships between the protocols for different subsets, this function internally generates all possible subsets of a requested \var{methods} list.  So, for example, requesting a protocol with 8 method names may cause as many as 255 protocol objects (one per non-empty subset) to be created.  Of course, these are generated only once in the lifetime of the program, but you should be aware of this if you are using large method subsets.  Using as few as 32 method names would create over four billion protocols!

Note also that the supplied \var{baseType} is used only as a basis for semantic distinctions between sets of similar method names, and to declare that the \var{baseType} and its subclasses support the returned protocol.  No protocol-to-protocol relationships are automatically defined between protocols requested for different base types, regardless of any subclass/superclass relationship between the base types.
\end{funcdesc}

\subsubsection{Defining a protocol for a sequence}\label{protocols-generated-sequence}

\begin{funcdesc}{sequenceOf}{protocol}
\versionadded{0.9.1}
Return a protocol object that represents a sequence of objects adapted to \var{protocol}.  Thus, \code{protocols.sequenceOf(IFoo)} is a protocol that represents a \class{protocols.IBasicSequence} of objects supporting the \class{IFoo} protocol.  It is guaranteed that you will receive the same protocol object if you call this routine more than once with the same protocol, even across pickling and unpickling of the returned protocol object.

When this function creates a new sequence protocol, it automatically declares an adapter function from \class{protocols.IBasicSequence} to the new protocol.  The adapter function returns the equivalent of \code{[adapt(x,protocol) for x in sequence]}, unless one of the adaptations fails, in which case it returns \code{None}, causing the adaptation to fail.

The built-in \class{list} and \class{tuple} types are declared as implementations of \class{protocols.IBasicSequence}, so protocols returned by \function{sequenceOf()} can be used immediately to convert lists or tuples into lists of objects supporting \var{protocol}.  If you need to adapt other kinds of sequences using your \function{sequenceOf()} protocol, you will need to declare that those sequences implement \class{protocols.IBasicSequence}, unless they subclass \class{tuple}, \class{list}, or some other type that implements \class{protocols.IBasicSequence}.
\end{funcdesc}

\subsubsection{Defining a protocol as a local variation of another protocol}\label{protocols-generated-local}

\begin{classdesc}{Variation}{baseProtocol \optional{, context=None}}
\versionadded{0.9.1}
A \class{Variation} is a \class{Protocol} that "inherits" adapter declarations from an existing protocol.  When you create a \class{Variation}, it declares that it is implied by its \var{baseProtocol}, and so any adapter suitable for adapting to the base protocol is therefore suitable for the \class{Variation}.  This allows you to then declare adapters to the variation protocol, without affecting those declared for the base protocol.  In this way, you can have a protocol object that represents the use of the base protocol in a particular context.
You can optionally specify that context via the \var{context} argument, which will then serve as the \member{context} attribute of the protocol. For more background on how this works and what it might be used for, see section \ref{protocols-context}. \end{classdesc} \newpage \subsection{Package Contents and Contained Modules\label{protocols-contents}} The following functions, classes, and interfaces are available from the top-level \module{protocols} package. \begin{funcdesc}{adapt}{component, protocol \optional{, default}} Return an implementation of \var{protocol} (a protocol object) for \var{component} (any object). The implementation returned may be \var{component}, or an adapter that implements the protocol on its behalf. If no implementation is available, return \var{default}. If no \var{default} is provided, raise \exception{protocols.AdaptationFailure}. A detailed description of this function's operations and purpose may be found in section \ref{adapt-protocol}. \end{funcdesc} \begin{excdesc}{AdaptationFailure} \versionadded{0.9.3} A subclass of \exception{TypeError} and \exception{NotImplementedError}, this exception type is raised by \function{adapt()} when no implementation can be found, and no \var{default} was supplied. \end{excdesc} \begin{classdesc}{Adapter}{ob, proto} \versionadded{0.9.1} This base class provides a convenient \method{__init__} method for adapter classes. To use it, just subclass \class{protocols.Adapter} and add methods to implement the desired interface(s). (And of course, declare what interfaces the adapter provides, for what types, and so on.) Your subclass' methods can use the following attribute, which will have been set by the \method{__init__} method: \begin{memberdesc}{subject} The \member{subject} attribute of an \class{Adapter} instance is the \var{ob} supplied to its constructor. That is, it is the object being adapted. \end{memberdesc} \end{classdesc} \begin{funcdesc}{advise}{**kw} Declare protocol relationships for the containing class or module. All parameters must be supplied as keyword arguments. This function must be called directly from a class or module body, or a \exception{SyntaxError} results at runtime. Different arguments are accepted, according to whether the function is called within a class or module. When invoked in the top-level code of a module, this function only accepts the \code{moduleProvides} keyword argument. When invoked in the body of a class definition, this function accepts any keyword arguments \emph{except} \code{moduleProvides}. The complete list of keyword arguments can be found in section \ref{protcols-advise}. \note{When used in a class body, this function works by temporarily replacing the \code{__metaclass__} of the class. If your class sets an explicit \code{__metaclass__}, it must do so \emph{before} \function{advise()} is called, or the protocol declarations will not occur!} \end{funcdesc} \begin{funcdesc}{adviseObject}{ob \optional{, provides=[ ]} \optional{, doesNotProvide=[ ]}} Declare that \var{ob} provides (or does not provide) the specified protocols. This is shorthand for calling \function{declareAdapterForObject()} with \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT} as adapters from the object to each of the specified protocols. Note, therefore, that \var{ob} may need to support \class{IOpenProvider}, and the listed protocols must be adaptable to \class{IOpenProtocol}. 
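
For example, here is a minimal sketch of declaring that an individual function provides an interface (the \code{IPlugin} interface and \code{load} function are hypothetical):

\begin{verbatim}
from protocols import adviseObject, Interface

class IPlugin(Interface):
    """Hypothetical interface, for illustration only"""

def load():
    pass

adviseObject(load, provides=[IPlugin])
\end{verbatim}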
See section \ref{protocols-instances}, ``Protocol Declarations for Individual Objects'', for more information on using \function{adviseObject}.
\end{funcdesc}

\begin{classdesc}{Attribute}{doc\optional{,name=\constant{None}, value=\constant{None}}}
This class is used to document attributes required by an interface.  An example usage:

\begin{verbatim}
from protocols import Interface, Attribute

class IFoo(Interface):
    Bar = Attribute("""All IFoos must have a Bar attribute""")
\end{verbatim}

If you are using the ``Abstract Base Class'' or ABC style of interface documentation, you may wish to also use the \var{name} and \var{value} attributes.  If supplied, the \class{Attribute} object will act as a data descriptor, supplying \var{value} as a default value, and storing any newly set value in the object's instance dictionary.  This is useful if you will be subclassing the abstract base and creating instances of it, but still want to have documentation appear in the interface.  When the interface is displayed with tools like \program{pydoc} or \function{help()}, the attribute documentation will be shown.
\end{classdesc}

\begin{funcdesc}{declareAdapter}{factory, provides \optional{, forTypes=[ ]} \optional{, forProtocols=[ ]} \optional{, forObjects=[ ]}}
Declare that \var{factory} is an \class{IAdapterFactory} whose return value provides the protocols listed in \var{provides} as an adapter for the classes/types listed in \var{forTypes}, for objects providing the protocols listed in \var{forProtocols}, and for the specific objects listed in \var{forObjects}.

This function is shorthand for calling the primitive declaration functions (\function{declareAdapterForType}, \function{declareAdapterForProtocol}, and \function{declareAdapterForObject}) for each of the protocols listed in \var{provides} and each of the items listed in the respective keyword arguments.
\end{funcdesc}

\begin{funcdesc}{declareImplementation}{typ \optional{, instancesProvide=[ ]} \optional{, instancesDoNotProvide=[ ]}}
Declare that instances of class or type \var{typ} do or do not provide implementations of the specified protocols.  \var{instancesProvide} and \var{instancesDoNotProvide} must be sequences of protocol objects that provide (or are adaptable to) the \class{IOpenProtocol} interface, such as \class{protocols.Interface} subclasses, or \class{Interface} objects from Zope or Twisted.

This function is shorthand for calling \function{declareAdapterForType()} with \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT} as adapters from the type to each of the specified protocols.  Note, therefore, that the listed protocols must be adaptable to \class{IOpenProtocol}.
\end{funcdesc}

\begin{funcdesc}{DOES_NOT_SUPPORT}{component, protocol}
This function simply returns \constant{None}.  It is a placeholder used whenever an object, type, or protocol does not implement or imply another protocol.  Whenever adaptation is not possible, but the \module{protocols} API function you are calling requires an adapter, you should supply this function as the adapter.  Some protocol implementations, such as the one for Zope interfaces, are unable to handle adapters other than \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT}.
\end{funcdesc}

\begin{classdesc*}{Interface}
Subclass this to create a "pure" interface.  See section \ref{protocols-defining} for more details.
\end{classdesc*}

\begin{classdesc*}{AbstractBase}
\versionadded{0.9.3}
Subclass this to create an "abstract base class" or "ABC" interface.
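
For example, a rough ABC-style sketch (the \code{IStack} interface and \code{ListStack} implementation are hypothetical):

\begin{verbatim}
from protocols import AbstractBase

class IStack(AbstractBase):
    def push(self, item):
        """Add an item to the top of the stack"""
        raise NotImplementedError

    def pushAll(self, items):
        """Default implementation, written in terms of push()"""
        for item in items:
            self.push(item)

class ListStack(IStack):
    """Concrete implementation, inheriting pushAll()"""
    def __init__(self):
        self.data = []
    def push(self, item):
        self.data.append(item)
\end{verbatim}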
See section \ref{protocols-defining} for more details.
\end{classdesc*}

\begin{funcdesc}{NO_ADAPTER_NEEDED}{component, protocol}
This function simply returns \var{component}.  It is a placeholder used whenever an object, type, or protocol directly implements or implies another protocol.  Whenever an adapter is not required, but the \module{protocols} API function you are calling requires an adapter, you should supply this function as the adapter.  Some protocol implementations may be unable to handle adapters other than \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT}.
\end{funcdesc}

\begin{funcdesc}{protocolForType}{baseType, \optional{methods=(), implicit=False}}
\versionadded{0.9.1}
Return a protocol object that represents the subset of \var{baseType} denoted by \var{methods}.  It is guaranteed that you will receive the same protocol object if you call this routine more than once with equivalent parameters.  This behavior is preserved even across pickling and unpickling of the returned protocol object.

\var{baseType} should be a type object, and \var{methods} should be a sequence of attribute or method names.  (The order of the names is not important.)  The \var{implicit} flag allows adapting objects that don't explicitly declare support for the protocol.  If you supply a true value for the \var{implicit} flag, the returned protocol will also adapt objects that have the specified methods or attributes.  In other words, \code{protocolForType(file, ['read','close'], True)} returns a protocol that will consider any object with \method{read} and \method{close} methods to provide that protocol, as well as objects that explicitly support \code{protocolForType(file, ['read','close'])}.

A more detailed description of this function's operations and purpose may be found in section \ref{protocols-generated-type}.  (Note: this function may generate up to \code{2**len(\var{methods})} protocol objects, so beware of using large method lists.)
\end{funcdesc}

\begin{funcdesc}{protocolForURI}{uri}
\versionadded{0.9.1}
Return a protocol object that represents the supplied URI or UUID string.  It is guaranteed that you will receive the same protocol object if you call this routine more than once with equal strings.  This behavior is preserved even across pickling and unpickling of the returned protocol object.  The purpose of this function is to permit modules to refer to protocols defined in another module that may or may not be present at runtime.  A more detailed description of this function's operations and purpose may be found in section \ref{protocols-generated-uri}.
\end{funcdesc}

\begin{funcdesc}{sequenceOf}{protocol}
\versionadded{0.9.1}
Return a protocol object that represents a sequence of objects adapted to \var{protocol}.  Thus, \code{protocols.sequenceOf(IFoo)} is a protocol that represents a \class{protocols.IBasicSequence} of objects supporting the \class{IFoo} protocol.  It is guaranteed that you will receive the same protocol object if you call this routine more than once with the same protocol, even across pickling and unpickling of the returned protocol object.

When this function creates a new sequence protocol, it automatically declares an adapter function from \class{protocols.IBasicSequence} to the new protocol.  The adapter function returns the equivalent of \code{[adapt(x,protocol) for x in sequence]}, unless one of the adaptations fails, in which case it returns \code{None}, causing the adaptation to fail.
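
For example, here is a minimal sketch that converts a plain list; the \code{IDocument} protocol and \code{StringDoc} adapter are hypothetical:

\begin{verbatim}
import protocols
from protocols import Interface, sequenceOf, adapt

class IDocument(Interface):
    """Hypothetical protocol, for illustration only"""

class StringDoc(protocols.Adapter):
    protocols.advise(instancesProvide=[IDocument],
                     asAdapterForTypes=[str])

IDocuments = sequenceOf(IDocument)

docs = adapt(["intro.txt", "body.txt"], IDocuments)
# 'docs' is a list of StringDoc adapters; if any member could not
# be adapted to IDocument, the adaptation as a whole would fail
\end{verbatim}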
The built-in \class{list} and \class{tuple} types are declared as implementations of \class{protocols.IBasicSequence}, so protocols returned by \function{sequenceOf()} can be used immediately to convert lists or tuples into lists of objects supporting \var{protocol}.  If you need to adapt other kinds of sequences using your \function{sequenceOf()} protocol, you will need to declare that those sequences implement \class{protocols.IBasicSequence}, unless they subclass \class{tuple}, \class{list}, or some other type that implements \class{protocols.IBasicSequence}.
\end{funcdesc}

\begin{classdesc}{StickyAdapter}{ob, proto}
\versionadded{0.9.1}
This base class is the same as the \class{Adapter} class, but with an extra feature.  When a \class{StickyAdapter} instance is created, it declares itself as an adapter for its \member{subject}, so that subsequent \function{adapt()} calls will return the same adapter instance.  (Technically, it declares an adapter function that returns itself.)  This approach is useful when an adapter wants to hold information on behalf of its subject that must not be lost when the subject is adapted in more than one place.

Note that for a \class{StickyAdapter} subclass to be useful, the types it adapts \emph{must} support \class{IOpenProvider}.  See section \ref{protocols-instances}, ``Protocol Declarations for Individual Objects'' for more information on this.  Also, you should never declare that a \class{StickyAdapter} subclass adapts an individual object (as opposed to a type or protocol), since such a declaration would create a conflict when the adapter instance tries to register itself as an adapter for that same object and protocol.

\class{StickyAdapter} adds one attribute to those defined by \class{Adapter}:

\begin{memberdesc}{attachForProtocols}
A tuple of protocols to be declared by the constructor.  Define this in your subclass' body to indicate which protocols it should attach itself for.
\end{memberdesc}
\end{classdesc}

\newpage
\subsubsection{Classes and Functions typically used for Customization/Extension}

These classes and functions are also available from the top-level \module{protocols} package.  In contrast to the items already covered, these classes and functions are generally needed only when extending the protocols framework, as opposed to merely using it.

\begin{classdesc*}{Protocol}
\class{Protocol} is a base class that implements the \class{IOpenProtocol} interface, supplying internal adapter registries for adapting from other protocols or types/classes.  Note that you do not necessarily need to use this class (or any other \class{IOpenProtocol} implementation) in your programs.  Any object that implements the simpler \class{IProtocol} or \class{IAdaptingProtocol} interfaces may be used as protocols for the \function{adapt()} function.  Compliance with the \class{IOpenProtocol} interface is only required to use the \module{protocols} declaration API.  (That is, functions whose names begin with \code{declare} or \code{advise}.)

To create protocols dynamically, you can create individual \class{Protocol} instances, and then use them with the declaration API.  You can also subclass \class{Protocol} to create your own protocol types.  If you override \function{__init__}, however, be sure to call \function{Protocol.__init__()} in your subclass' \function{__init__} method.
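
For example, here is a minimal sketch of a dynamically created protocol (the names are hypothetical):

\begin{verbatim}
from protocols import Protocol, declareAdapter, NO_ADAPTER_NEEDED, adapt

ISessionData = Protocol()   # an anonymous, dynamically created protocol

# the new protocol works with the declaration API like any other
declareAdapter(NO_ADAPTER_NEEDED, provides=[ISessionData],
               forTypes=[dict])

session = adapt({'user': 'alice'}, ISessionData)  # returns the dict itself
\end{verbatim}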
\end{classdesc*}

\begin{classdesc}{Variation}{baseProtocol \optional{, context=None}}
\versionadded{0.9.1}
A \class{Variation} is a \class{Protocol} that "inherits" adapter declarations from an existing protocol.  When you create a \class{Variation}, it declares that it is implied by its \var{baseProtocol}, and so any adapter suitable for adapting to the base protocol is therefore suitable for the \class{Variation}.  This allows you to then declare adapters to the variation protocol, without affecting those declared for the base protocol.  In this way, you can have a protocol object that represents the use of the base protocol in a particular context.  You can optionally specify that context via the \var{context} argument, which will then serve as the \member{context} attribute of the protocol.  For more background on how this works and what it might be used for, see section \ref{protocols-context}.
\end{classdesc}

\begin{classdesc}{AbstractBaseMeta}{name, bases, dictionary}
\versionadded{0.9.3}
\class{AbstractBaseMeta}, a subclass of \class{Protocol} and \class{type}, is a metaclass used to create new "ABC-style" protocol objects, using class statements.  You can use this metaclass directly, but it's generally simpler to just subclass \class{AbstractBase} instead.  Normally, you will only use \class{AbstractBaseMeta} if you need to combine it with another metaclass.
\end{classdesc}

\begin{classdesc}{InterfaceClass}{name, bases, dictionary}
\class{InterfaceClass} is a subclass of \class{AbstractBaseMeta} that implements the convenience adaptation API (see section \ref{protocols-calling}) for its instances.  This metaclass is used to create new "pure-style" interfaces (i.e., protocol objects) using class statements.  Normally, you will only use \class{InterfaceClass} directly if you need to combine it with another metaclass, as it is usually easier just to subclass \class{Interface}.
\end{classdesc}

\begin{classdesc}{IBasicSequence}
\versionadded{0.9.1}
This interface represents the ability to iterate over a container-like object, such as a list or tuple.  An \class{IBasicSequence} object must have an \method{__iter__()} method.  By default, only the built-in \class{list} and \class{tuple} types are declared as having instances providing this interface.  If you want to be able to adapt to \function{sequenceOf()} protocols from other sequence types, you should declare that their instances support this protocol.
\end{classdesc}

\begin{classdesc*}{ProviderMixin}
If you have a class with a \method{__conform__} method for its instances, but you also want the instances to support \class{IOpenProvider} (so that \function{adviseObject} can be used on them), you may want to include this class as one of your class' bases.  The default adapters for \class{IOpenProvider} can only adapt objects that do not already have a \method{__conform__} method of their own.  So, to support \class{IOpenProvider} with a custom \method{__conform__} method, subclass \class{ProviderMixin}, and have your \method{__conform__} method invoke the base \method{__conform__} method as a default, using \function{supermeta()}.  (E.g. \code{return supermeta(MyClass,self).__conform__(protocol)}.)  See below for more on the \function{supermeta()} function.
\end{classdesc*}

\begin{funcdesc}{metamethod}{func}
Wrap \var{func} in a manner analogous to \function{classmethod} or \function{staticmethod}, but as a metaclass-level method that may be redefined by metaclass instances for their instances.
For example, if a metaclass wants to define a \method{__conform__} method for its instances (i.e. classes), and those instances (classes) want to define a \method{__conform__} method for \emph{their} instances, the metaclass should wrap its \method{__conform__} method with \function{metamethod}. Otherwise, the metaclass' \method{__conform__} method will be hidden by the class-level \method{__conform__} defined for the class' instances. \end{funcdesc} \begin{funcdesc}{supermeta}{typ, ob} Emulates the Python built-in \function{super()} function, but with support for metamethods. If you ordinarily would use \function{super()}, but are calling a \function{metamethod}, you should use \function{supermeta()} instead. This is because Python 2.2 does not support using super with properties (which is effectively what metamethods are). Note that if you are subclassing \class{ProviderMixin} or \class{Protocol}, you will need to use \function{supermeta()} to call almost any inherited methods, since most of the methods provided are wrapped with \function{metamethod()}. \end{funcdesc} \begin{funcdesc}{declareAdapterForType}{protocol, adapter, typ \optional{, depth=1}} Declare that \var{adapter} adapts instances of class or type \var{typ} to \var{protocol}, by adapting \var{protocol} to \class{IOpenProtocol} and calling its \function{registerImplementation} method. If \var{typ} is adaptable to \class{IOpenImplementor}, its \function{declareClassImplements} method is called as well. \end{funcdesc} \begin{funcdesc}{declareAdapterForObject}{protocol, adapter, ob \optional{, depth=1}} Declare that \var{adapter} adapts the object \var{ob} to \var{protocol}, by adapting \var{protocol} to \class{IOpenProtocol} and calling its \function{registerObject} method. Typically, \var{ob} must support \class{IOpenProvider}. See section \ref{protocols-instances} for details. \end{funcdesc} \begin{funcdesc}{declareAdapterForProtocol}{protocol, adapter, proto \optional{, depth=1}} Declare that \var{adapter} adapts objects that provide protocol \var{proto} to \var{protocol}, by calling \code{adapt(\var{proto},IOpenProtocol).addImpliedProtocol(\var{protocol},\var{adapter},\var{depth})}. \end{funcdesc} \newpage \subsubsection{\module{protocols.interfaces} --- Package Interfaces\label{protocols-interfaces-module}} \declaremodule{}{protocols.interfaces} \note{All of the interfaces listed here can also be imported directly from the top-level \module{protocols} package. However, you will probably only need them if you are extending the framework, as opposed to merely using it.} \begin{classdesc*}{IOpenProtocol} This interface documents the behavior required of protocol objects in order to be used with the \module{protocols} declaration API (the functions whose names begin with \code{declare} or \code{advise}.) The declaration API functions will attempt to \function{adapt()} supplied protocols to this interface. The methods an \class{IOpenProtocol} implementation must supply are: \begin{methoddesc}{addImpliedProtocol}{proto, adapter, depth} Declare that this protocol can be adapted to protocol \var{proto} via the \class{IAdapterFactory} supplied in \var{adapter}, at the specified implication level \var{depth}. 
The protocol object should ensure that the implied protocol is able to adapt objects implementing its protocol (typically by recursively invoking \function{declareAdapterForType()} with increased depth and appropriately composed adapters), and notify any registered implication listeners via their \method{newProtocolImplied()} methods.  If the protocol already implied \var{proto}, this method should have no effect and send no notifications unless the new \var{adapter} and \var{depth} represent a ``shorter path'' as described in section \ref{proto-implication}.
\end{methoddesc}

\begin{methoddesc}{registerImplementation}{klass, adapter, depth}
Declare that instances of type or class \var{klass} can be adapted to this protocol via the \class{IAdapterFactory} supplied in \var{adapter}, at the specified implication level \var{depth}.  Unless \var{adapter} is \function{DOES_NOT_SUPPORT}, the protocol object must ensure that any protocols it implies are also able to perform the adaptation (typically by recursively invoking \function{declareAdapterForType()} with increased depth and appropriately composed adapters for its implied protocols).  If the protocol already knew a way to adapt instances of \var{klass}, this method should be a no-op unless the new \var{adapter} and \var{depth} represent a ``shorter path'' as described in section \ref{proto-implication}.
\end{methoddesc}

\begin{methoddesc}{registerObject}{ob, adapter, depth}
Ensure that the specific object \var{ob} will be adapted to this protocol via the \class{IAdapterFactory} supplied in \var{adapter}, at the specified implication level \var{depth}.  The protocol object must also ensure that the object can be adapted to any protocols it implies.  This method may be implemented by adapting \var{ob} to \class{IOpenProvider}, calling the \method{declareProvides()} method, and then recursively invoking \function{declareAdapterForObject} with increased depth and appropriately composed adapters for the protocol's implied protocols.
\end{methoddesc}

\begin{methoddesc}{addImplicationListener}{listener}
Ensure that \var{listener} (an \class{IImplicationListener}) will be notified whenever an implied protocol is added to this protocol, or an implication path from this protocol is shortened.  The protocol should at most retain a weak reference to \var{listener}.  Note that if a protocol can guarantee that no notices will ever need to be sent, it is free to implement this method as a no-op.  For example, Zope interfaces cannot imply any protocols besides their base interfaces, which are not allowed to change.  Therefore, no change notifications would ever need to be sent, so the \class{IOpenProtocol} adapter for Zope interfaces implements this method as a no-op.
\end{methoddesc}

\note{\class{IOpenProtocol} is a subclass of \class{IAdaptingProtocol}, which means that implementations must therefore meet its requirements as well, such as having an \method{__adapt__()} method.}
\end{classdesc*}

\begin{classdesc*}{IOpenProvider}
This interface documents the behavior required of an object to be usable with \function{adviseObject()}.  Note that some protocol objects, such as the \class{IOpenProtocol} adapter for Zope interfaces, can handle \function{adviseObject()} operations without adapting the target object to \class{IOpenProvider}.  This should be considered an exception, rather than the rule.
However, the \module{protocols} package declares default adapters so that virtually any Python object that doesn't already have a \method{__conform__()} method can be adapted to \class{IOpenProvider} automatically.

\begin{methoddesc}{declareProvides}{protocol, adapter, depth}
Declare that this object can provide \var{protocol} if adapted by the \class{IAdapterFactory} supplied in \var{adapter}, at implication level \var{depth}.  Return a true value if the new adapter was used, or a false value if the object already knew a ``shorter path'' for adapting to \var{protocol} (as described in section \ref{proto-implication}).  Typically, an implementation of this method will also adapt \var{protocol} to \class{IOpenProtocol}, and then register with \method{addImplicationListener()} to receive notice of any protocols that might be implied by \var{protocol} in future.
\end{methoddesc}
\end{classdesc*}

\begin{classdesc*}{IImplicationListener}
This interface documents the behavior required of an object supplied to the \method{IOpenProtocol.addImplicationListener()} method.  Such objects must be weak-referenceable, usable as a dictionary key, and supply the following method:

\begin{methoddesc}{newProtocolImplied}{srcProto, destProto, adapter, depth}
Receive notice that an adaptation was declared from \var{srcProto} to \var{destProto}, using the \class{IAdapterFactory} \var{adapter}, at implication level \var{depth}.  When used as part of an \class{IOpenProvider} implementation, this method is typically used to recursively invoke \function{declareAdapterForObject()} with increased depth and appropriately composed adapters from protocols already supported by the object.
\end{methoddesc}
\end{classdesc*}

\begin{classdesc*}{IOpenImplementor}
If a class or type supplied to \function{declareAdapterForType} supports this interface, it will be notified of the declaration and any future declarations that affect the class, due to current or future protocol implication relationships.  Supporting this interface is not necessary; it is provided as a hook for advanced users.

Note that to declare a class or type as an \class{IOpenImplementor}, you must call \code{adviseObject(theClass, provides=[IOpenImplementor])} after the class definition or place \code{advise(classProvides=[IOpenImplementor])} in the body of the class, since this interface must be provided by the class itself, not by its instances.  (Of course, if you implement this interface via a metaclass, you can declare that the metaclass' instances provide the interface.)

Notification to classes supporting \class{IOpenImplementor} occurs via the following method:

\begin{methoddesc}{declareClassImplements}{protocol, adapter, depth}
Receive notice that instances of the class support \var{protocol} via the \class{IAdapterFactory} supplied in \var{adapter}, at implication level \var{depth}.
\end{methoddesc}
\end{classdesc*}

\begin{classdesc*}{IAdapterFactory}
An interface documenting the requirements for an object to be used as an adapter factory: i.e., that it be a callable accepting an object to be adapted.  This interface is not used by the \module{protocols} package except as documentation.
\end{classdesc*}

\begin{classdesc*}{IProtocol}
An interface documenting the basic requirements for an object to be used as a protocol for \function{adapt()}: i.e., that it be usable as a dictionary key.  This interface is not used by the \module{protocols} package except as documentation.
\end{classdesc*}

\begin{classdesc*}{IAdaptingProtocol}
An interface documenting the requirements for a protocol object to be able to adapt objects when used with \function{adapt()}: i.e., that it have a \method{__adapt__} method that accepts the object to be adapted and returns either an object providing the protocol or \constant{None}.  This interface is not used by the \module{protocols} package except as documentation.  It is a subclass of \class{IProtocol}, so any implementation of this interface must support the requirements defined by \class{IProtocol} as well.
\end{classdesc*}

\newpage
\subsubsection{\module{protocols.adapters} --- ``Adapter arithmetic'' support \label{protocol-adapters-module}}
\declaremodule{}{protocols.adapters}

The \module{protocols.adapters} module provides support for doing ``adapter arithmetic'' such as determining which of two adapter paths is shorter, composing a new adapter from two existing adapters, and updating an adapter registry with a new adapter path.  See section \ref{proto-implication} for a more general discussion of adapter arithmetic.

\begin{funcdesc}{minimumAdapter}{a1, a2 \optional{, d1=0, d2=0}}
Find the ``shorter'' of two adapter paths: \var{a1} at depth \var{d1}, and \var{a2} at depth \var{d2}.  Assuming \var{a1} and \var{a2} are adapter factories that accept similar input and return similar output, this function returns the one which is the ``shortest path'' between its input and its output.  That is, the one with the smallest implication depth (\var{d1} or \var{d2}), or, if the depths are equal, the adapter factory that is composed of the fewest chained factories (as composed by \function{composeAdapters()}).  If neither factory is composed of multiple factories, or they are composed of the same number of intermediate adapter factories, then the following preference order is used:

\begin{enumerate}
\item If one of the adapters is \function{NO_ADAPTER_NEEDED}, it is returned.
\item If one of the adapters is \function{DOES_NOT_SUPPORT}, the \emph{other} adapter is returned.
\item If both adapters are the exact same object (i.e. \code{a1 is a2}), either one is returned.
\end{enumerate}

If none of the above conditions apply, then the adapter precedence is considered ambiguous, and a \exception{TypeError} is raised.  This function is used by \function{updateWithSimplestAdapter} to determine whether a new adapter declaration should result in a registry update.  Note that the determination of adapter composition length uses the \member{__adapterCount__} attribute, if present.  (It is assumed to be \constant{1} if not present.  See \function{composeAdapters()} for more details.)
\end{funcdesc}

\begin{funcdesc}{composeAdapters}{baseAdapter, baseProtocol, extendingAdapter}
Return a new \class{IAdapterFactory} composed of the input adapter factories \var{baseAdapter} and \var{extendingAdapter}.  If either input adapter is \function{DOES_NOT_SUPPORT}, \function{DOES_NOT_SUPPORT} is returned.  If either input adapter is \function{NO_ADAPTER_NEEDED}, the other input adapter is returned.  Otherwise, a new adapter factory is created that will return \code{\var{extendingAdapter}(\var{baseAdapter}(object))} when called with \code{object}.  (Note: the actual implementation verifies that \var{baseAdapter} didn't return \constant{None} before it calls \var{extendingAdapter}.)
If this function creates a new adapter factory, the factory will have an \member{__adapterCount__} attribute set to the sum of the \member{__adapterCount__} attributes of the input adapter factories.  If an input factory does not have an \member{__adapterCount__} attribute, it is assumed to equal \constant{1}.  This is done so that \function{minimumAdapter()} can compare the length of composed adapter chains.
\end{funcdesc}

\begin{funcdesc}{updateWithSimplestAdapter}{mapping, key, adapter, depth}
Treat \var{mapping} as an adapter registry, replacing the entry designated by \var{key} with an \code{(\var{adapter},\var{depth})} tuple, if and only if the new entry would be a ``shorter path'' than the existing entry, if any.  (I.e., if \code{minimumAdapter(old, \var{adapter}, oldDepth, \var{depth})} returns \var{adapter}, and \var{adapter} is not the existing registered adapter.)  The function returns a true value if it updates the contents of \var{mapping}.

This function is used to manage type-to-protocol, protocol-to-protocol, and object-to-protocol adapter registries, keyed by type or protocol.  The \var{mapping} argument must be a mapping providing \method{__setitem__()} and \method{get()} methods.  Values stored in the mapping will be \code{(\var{adapter},\var{depth})} tuples.
\end{funcdesc}

\newpage
\subsubsection{\module{protocols.zope_support} --- Support for Zope Interfaces}
\declaremodule[protocols.zopesupport]{}{protocols.zope_support}

Importing this module enables experimental support for using Zope X3 \class{Interface} objects with the \module{protocols} package, by registering an adapter from Zope X3's \class{InterfaceClass} to \class{IOpenProtocol}.  The adapter supports the following subset of the declaration API:

\begin{itemize}
\item The only adapters supported via Zope APIs are \function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT}.  By using PyProtocols APIs, you may declare and use other adapters for Zope interfaces, but Zope itself will not use them, since the Zope interface API does not directly support adaptation.
\item Zope's interface APIs do not conform to \module{protocols} package ``shortest path wins'' semantics.  Instead, new declarations override older ones.
\item Interface-to-interface adaptation may not work if a class only declares what it implements using Zope's interface API.  That is, if a class declares that it implements \class{ISomeZopeInterface}, and you define an adaptation from \class{ISomeZopeInterface} to \class{ISomeOtherInterface}, PyProtocols may not recognize that the class can be adapted to \class{ISomeOtherInterface}.
\item Changing the \member{__bases__} of a class that has Zope interfaces declared for it (either as ``class provides'' or ``instances provide'') may have unexpected results, because Zope uses inheritance of a single descriptor to control declarations.  In general, it will only work if the class whose \member{__bases__} are changed has no declarations of its own.
\item You cannot declare an implication relationship from a Zope \class{Interface}, because Zope only supports implication via inheritance, which is fixed at interface definition time.  Therefore, you cannot create a ``subset'' of a Zope \class{Interface}, and subscribing an \class{IImplicationListener} to an adapted Zope \class{Interface} silently does nothing.
\item You can, however, declare that a \class{protocols.Interface} extends a Zope \class{Interface}.
Declaring that a class' instances or that an object provides the extended interface will automatically declare that the class' instances or the object provides the Zope \class{Interface} as well.  For example:

\begin{verbatim}
import protocols
from zope.somepackage.interfaces import IBase

class IExtended(protocols.Interface):
    protocols.advise(
        protocolExtends = [IBase]
    )

class AnImplementation:
    protocols.advise(
        instancesProvide = [IExtended]
    )
\end{verbatim}

The above code should result in Zope recognizing that instances of \class{AnImplementation} provide the Zope \class{IBase} interface.

\item You cannot extend both a Zope interface and a Twisted interface in the same \class{protocols.Interface}.  Although this may not give you any errors, Twisted and Zope both expect to use an \member{__implements__} attribute to store information about what interface a class or object provides.  But each has a different interpretation of the contents, and does not expect to find ``foreign'' interfaces contained within.  So, until this issue between Zope and Twisted is resolved, it is not very useful to create interfaces that extend both Zope and Twisted interfaces.
\item Zope does not currently appear to support classes inheriting direct declarations (e.g. \code{classProvides}).  This appears to be a by-design limitation.
\end{itemize}

Support for Zope X3 interfaces is currently based on Zope X3 beta 1; it may not work with older releases.  Zope X3 requires Python 2.3.4 or better, so even though PyProtocols works with 2.2.2 and up in general, you will need 2.3.4 to use PyProtocols with Zope X3.

\newpage
\subsubsection{\module{protocols.twisted_support} --- Support for Twisted Interfaces}
\declaremodule[protocols.twistedsupport]{}{protocols.twisted_support}

Importing this module enables experimental support for using Twisted 1.1.0 \class{Interface} objects with the \module{protocols} package, by registering an adapter from Twisted's \class{MetaInterface} to \class{IOpenProtocol}.  The adapter supports the following subset of the declaration API:

\begin{itemize}
\item Only protocol-to-protocol adapters defined via the \module{protocols} declaration API will be available to implication listeners.  If protocol-to-protocol adapters are registered via Twisted's \function{registerAdapter()}, implication listeners are \emph{not} notified.
\item You cannot usefully create a ``subset'' of a Twisted interface, or an adaptation from a Twisted interface to another interface type, as Twisted insists that interfaces must subclass its interface base class.  Also, Twisted does not support transitive adaptation, nor can it notify the destination interface(s) of any new incoming adapter paths.
\item If you register an adapter factory that can return \constant{None} with a Twisted interface, note that Twisted does not check for a \constant{None} return value from \function{getAdapter()}.  This means that code in Twisted might receive \constant{None} when it expected either an implementation or an error.
\item Only Twisted's global adapter registry is supported for declarations and \function{adapt()}.
\item Twisted doesn't support classes providing interfaces (as opposed to their instances providing them).  You may therefore obtain unexpected results if you declare that a class provides a Twisted interface or an interface that extends a Twisted interface.
\item Changing the \member{__bases__} of a class that has Twisted
interfaces declared for it may have unexpected results, because Twisted
uses inheritance of a single descriptor to control declarations.  In
general, it will only work if the class whose \member{__bases__} are
changed has no declarations of its own.

\item Any adapter factory may be used for protocol-to-protocol adapter
declarations.  But, for any other kind of declaration,
\function{NO_ADAPTER_NEEDED} and \function{DOES_NOT_SUPPORT} are the only
adapter factories that can be used with Twisted.

\item Twisted interfaces do not conform to \module{protocols} package
``shortest path wins'' semantics.  For protocol-to-protocol adapter
declarations, only one adapter declaration between a given pair of
interfaces is allowed.  Any subsequent declarations with the same source
and destination will result in a \exception{ValueError}.  For all other
kinds of adapter declarations, new declarations override older ones.

\item You cannot extend both a Zope interface and a Twisted interface in
the same \class{protocols.Interface}.  Although this may not give you any
errors, Twisted and Zope both expect to use an \member{__implements__}
attribute to store information about what interface a class or object
provides.  But each has a different interpretation of the contents, and
does not expect to find ``foreign'' interfaces contained within.  So,
until this issue between Zope and Twisted is resolved, it is not very
useful to create interfaces that extend both Zope and Twisted interfaces.

\end{itemize}

\newpage
\subsubsection{\module{protocols.advice} --- Metaclasses and other ``Magic''}
\declaremodule{}{protocols.advice}

This module provides a variety of utility functions and classes used by
the \module{protocols} package.  None of them are really specific to the
\module{protocols} package, and so may be useful to other libraries or
applications.

\begin{funcdesc}{addClassAdvisor}{callback \optional{, depth=\constant{2}}}
Set up \var{callback} to be called with the containing class, once it is
created.  This function is designed to be called by an ``advising''
function (such as \function{protocols.advise()}) executed in the body of a
class suite.  The ``advising'' function supplies a callback that it wishes
to have executed when the containing class is created.  The callback will
be given one argument: the newly created containing class.  The return
value of the callback will be used in \emph{place} of the class, so the
callback should return the input class if it does not wish to replace it.

The optional \var{depth} argument determines the number of frames between
this function and the targeted class suite.  \var{depth} defaults to 2,
since this skips this function's frame and one calling function frame.  If
you use this function from a function called directly in the class suite,
the default will be correct; otherwise, you will need to determine the
correct depth yourself.

This function works by installing a special class factory function in
place of the \member{__metaclass__} of the containing class.  Therefore,
only callbacks added \emph{after} the last \member{__metaclass__}
assignment in the containing class will be executed.  Be sure that classes
using ``advising'' functions declare any \member{__metaclass__}
\emph{first}, to ensure that all callbacks are run.
\end{funcdesc}

\begin{funcdesc}{isClassAdvisor}{ob}
Return a true value if \var{ob} is a class advisor function.
This is used to determine if a \member{__metaclass__} value is a ``magic''
metaclass installed by \function{addClassAdvisor()}.  If so, then \var{ob}
will have a \member{previousMetaclass} attribute pointing to the previous
metaclass, if any, and a \member{callback} attribute containing the
callback that was given to \function{addClassAdvisor()}.
\end{funcdesc}

\begin{funcdesc}{getFrameInfo}{frame}
Return a \code{(\var{kind},\var{module},\var{locals},\var{globals})} tuple
for the supplied frame object.  The returned \var{kind} is a string:
either ``exec'', ``module'', ``class'', ``function call'', or
``unknown''.  \var{module} is the module object the frame is/was executed
in, or \constant{None} if the frame's globals could not be correlated with
a module in \code{sys.modules}.  \var{locals} and \var{globals} are the
frame's local and global dictionaries, respectively.  Note that they can
be the same dictionary, and that modifications to locals may not have any
effect on the execution of the frame.

This function is used by functions like \function{addClassAdvisor()} and
\function{advise()} to verify where they're being called from, and to work
their respective magics.
\end{funcdesc}

\begin{funcdesc}{getMRO}{ob \optional{, extendedClassic=\constant{False}}}
Return an iterable over the ``method resolution order'' of \var{ob}.  If
\var{ob} is a ``new-style'' class or type, this returns its
\member{__mro__} attribute.  If \var{ob} is a ``classic'' class, this
returns \code{classicMRO(\var{ob},\var{extendedClassic})}.  If \var{ob} is
not a class or type of any kind, a one-element sequence containing just
\var{ob} is returned.
\end{funcdesc}

\begin{funcdesc}{classicMRO}{ob \optional{, extendedClassic=\constant{False}}}
Return an iterator over the ``method resolution order'' of classic class
\var{ob}, following the ``classic'' method resolution algorithm of
recursively traversing \member{__bases__} from left to right.  (Note that
this may return the same class more than once, for some inheritance
graphs.)  If \var{extendedClassic} is a true value, \class{InstanceType}
and \class{object} are added at the end of the iteration.  This is used by
\class{Protocol} objects to allow generic adapters for
\class{InstanceType} and \class{object} to be used with ``classic'' class
instances.
\end{funcdesc}

\begin{funcdesc}{determineMetaclass}{bases \optional{, explicit_mc=\constant{None}}}
Determine the metaclass that would be used by Python, given a non-empty
sequence of base classes, and an optional explicitly supplied
\member{__metaclass__}.  Returns \class{ClassType} if all bases are
``classic'' and there is no \var{explicit_mc}.  Raises
\exception{TypeError} if the bases' metaclasses are incompatible, just
like Python would.
\end{funcdesc}

\begin{funcdesc}{minimalBases}{classes}
Return the shortest ordered subset of the input sequence \var{classes}
that still contains the ``most specific'' classes.  That is, the result
sequence contains only classes that are not subclasses of each other.
This function is used by \function{determineMetaclass()} to narrow down
its list of candidate metaclasses, but is also useful for dynamically
generating metaclasses.
\end{funcdesc}

\begin{funcdesc}{mkRef}{ob \optional{, callable}}
If \var{ob} is weak-referenceable, returns
\code{weakref.ref(\var{ob},\var{callable})}.  Otherwise, returns a
\code{StrongRef(\var{ob})}, emulating the interface of
\code{weakref.ref()}.  This is used by code that wants to use weak
references, but may be given objects that are not weak-referenceable.
Note that \var{callable}, if supplied, will \emph{not} be called if
\var{ob} is not weak-referenceable.
\end{funcdesc}

\begin{classdesc}{StrongRef}{ob}
An object that emulates the interface of \code{weakref.ref()}.  When
called, an instance of \class{StrongRef} will return the \var{ob} it was
created for.  Also, it will hash the same as \var{ob} and compare equal to
it.  Thus, it can be used as a dictionary key, as long as the underlying
object can.  Of course, since it is not really a weak reference, it does
not contribute to the garbage collection of the underlying object, and may
in fact hinder it, since it holds a live reference to the object.
\end{classdesc}

\newpage
\subsection{Big Example 2 --- Extending the Framework for Context\label{protocols-context}}

Now it's time for our second ``big'' example.  This time, we're going to
add an extension to the \module{protocols} framework to support
``contextual adaptation''.  The tools we've covered so far are probably
adequate to support 80-90\% of situations requiring adaptation.  But they
are essentially global in nature: only one adapter path is allowed between
any two points.  What if we need to define a different adaptation in a
specific context?

For example, let's take the documentation framework we began designing in
section \ref{protocols-example1}.  Suppose that, for the duration of a
single documentation run, we'd like to replace the factory that adapts
from \class{FunctionType} to \class{IDocumentable}.  We might want to do
this so that functions used by our ``finite state machine'' objects as
``transitions'' are documented differently than regular functions.

Using only the tools described so far, we can't do this if
\class{IDocumentable} is a single object.  The framework that registered
the \class{FunctionAsDocumentable} adapter effectively ensured that we
cannot replace that adapter with another, since it is already the shortest
adapter path.  What can we do?

In section \ref{proto-implication}, we discussed how we could create
``subset'' protocols and ``inherit'' adapter declarations from existing
protocols.  In this way, we could create a new subset protocol of
\class{IDocumentable}, and then register our context-specific adapters
with that subset.  These subset protocols are just as fast as the original
protocols in looking up adapters, so there's no performance penalty.  But
who creates the subset protocol?  The client or the framework?  And how do
we get the framework to use our subset instead of its built-in
\class{IDocumentable} protocol?

To answer these questions, we will create an extension to the
\module{protocols} framework that makes it easy for frameworks to manage
``contextual'' or ``local'' protocols.  Then, framework creators will have
a straightforward way to support context-specific adapter overrides.

As before, we'll start by envisioning our ideal situation.  Let's assume
that our documentation tools are object-based.  That is, we instantiate a
``documentation set'' or ``documentation run'' object in order to generate
documentation.  How do we want to register adapters?
Well, we could have the framework add a bunch of methods to do this, but
it seems more straightforward to simply supply the interfaces as
attributes of the ``documentation set'' or ``documentation run'' object,
e.g.:

\begin{verbatim%
}from theDocTool import DocSet
from myAdapters import specialFunctionAdapter
from types import FunctionType
import protocols

myDocs = DocSet()

protocols.declareAdapter(
    specialFunctionAdapter,
    provides = [myDocs.IDocumentable],
    forTypes = [FunctionType]
)

myDocs.run()
\end{verbatim}

So, instead of importing the interface, we access it as an attribute of
some relevant ``context'' object, and declare adapters for it.  Anything
we don't declare a ``local'' adapter for will use the adapters declared
for the underlying ``global'' protocol.

Naturally, the framework author could implement this by writing code in
the \class{DocSet} class' \method{__init__} method to create the new
``local'' protocol and register it as a subset of the ``global''
\class{IDocumentable} interface.  But that would be time-consuming and
error-prone, and would therefore discourage the use of such ``local''
protocols.  Again, let's consider what our ideal situation would be.  The
author of the \class{DocSet} class should be able to do something like:

\begin{verbatim%
}class DocSet:

    from doctool.interfaces import IDocumentable, ISignature

    IDocumentable = subsetPerInstance(IDocumentable)
    ISignature    = subsetPerInstance(ISignature)

    # ... etc.
\end{verbatim}

Our hypothetical \class{subsetPerInstance} class would be a descriptor
that did all the work needed to provide a ``localized'' version of each
interface for each instance of \class{DocSet}.  Code in the \class{DocSet}
class would always refer to \code{self.IDocumentable} or
\code{self.ISignature}, rather than using the ``global'' versions of the
interfaces.  Thus, we can now register adapters that are unique to a
specific \class{DocSet}, but still use any globally declared adapters as
defaults.

Okay, so that's our hypothetical ideal.  How do we implement it?  I
personally like to try writing the ideal thing, to find out what other
pieces are needed.  So let's start with writing the
\class{subsetPerInstance} descriptor, since that's really the only piece
we know we need so far:

\begin{verbatim%
}from protocols import Protocol, declareAdapterForProtocol, NO_ADAPTER_NEEDED

class subsetPerInstance(object):

    def __init__(self,protocol,name=None):

        self.protocol = protocol
        self.name = name or getattr(protocol,'__name__',None)

        if not self.name:
            raise TypeError("Descriptor needs a name for", protocol)

    def __get__(self,ob,typ=None):

        if ob is None:
            return self

        name = self.name

        if getattr(type(ob),name) is not self or name in ob.__dict__:
            raise TypeError(
                "Descriptor is under more than one name or the wrong name",
                self, name, type(ob)
            )

        local = Protocol()
        declareAdapterForProtocol(local,NO_ADAPTER_NEEDED,self.protocol)

        # save it in the instance's dictionary so we won't be called again
        ob.__dict__[name] = local
        return local

    def __repr__(self):
        return "subsetPerInstance(%r)" % self.protocol
\end{verbatim}

Whew.  Most of the complexity above comes from the need for the descriptor
to know its ``name'' in the containing class.  As written, it will guess
its name to be the name of the wrapped interface, if available.  It can
also detect some potential aliasing/renaming issues that could occur.  The
actual work of the descriptor occurs in just two lines, buried deep in the
middle of the \method{__get__} method.  As written, it's a handy enough
tool.
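To see the descriptor's intended behavior in action, here's a quick sketch
(given the \class{subsetPerInstance} definition above; the
\class{IDocumentable} interface shown is just a stand-in for the real
framework interface, and the assertions merely illustrate the semantics):

\begin{verbatim%
}import protocols

class IDocumentable(protocols.Interface):
    pass    # stand-in for the "real" doctool interface

class DocSet(object):
    # each DocSet instance gets its own local subset of IDocumentable
    IDocumentable = subsetPerInstance(IDocumentable)

docs1 = DocSet()
docs2 = DocSet()

# each instance gets a distinct local protocol...
assert docs1.IDocumentable is not docs2.IDocumentable

# ...but repeated access on one instance returns the cached protocol
assert docs1.IDocumentable is docs1.IDocumentable
\end{verbatim}

Adapters declared for the ``global'' \class{IDocumentable} are inherited
by both local protocols, while adapters declared for
\code{docs1.IDocumentable} affect only \code{docs1}.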
We could leave things where they are right now and still get the job done.
But that would hardly be an example of extending the framework, since we
didn't even subclass anything!  So let's add another feature.

As it sits, our descriptor should work with both old and new-style
classes, automatically generating one subset protocol for each instance of
its containing class.  But the subset protocol doesn't \emph{know} it's a
subset protocol, or of what context.  If we were to print
\code{DocSet().IDocumentable}, we'd just get something like
\code{<protocols.interfaces.Protocol instance at 0x...>}.  Here's what
we'd like it to do instead.  We'd like it to say something like
\code{LocalProtocol(IDocumentable, <DocSet instance>)}.  That is, we want
the local protocol to:

\begin{itemize}
\item ``know'' it's a local protocol
\item know what protocol it's a local subset of
\item know what ``context'' object it's a local protocol for
\end{itemize}

What does this do for us?  Aside from debugging, it gives us a chance to
find related interfaces, or access methods or data available from the
context.  So, let's create a \class{LocalProtocol} class:

\begin{verbatim%
}class LocalProtocol(Protocol):

    def __init__(self, baseProtocol, context):

        self.baseProtocol = baseProtocol
        self.context = context

        # Note: Protocol is a ``classic'' class, so we don't use super()
        Protocol.__init__(self)

        declareAdapterForProtocol(self,NO_ADAPTER_NEEDED,baseProtocol)

    def __repr__(self):
        return "LocalProtocol(%r,%r)" % (self.baseProtocol, self.context)
\end{verbatim}

And now, we can replace these two lines in our earlier \method{__get__}
method:

\begin{verbatim%
}        local = Protocol()
        declareAdapterForProtocol(local,NO_ADAPTER_NEEDED,self.protocol)
\end{verbatim}

with this one:

\begin{verbatim%
}        local = LocalProtocol(self.protocol, ob)
\end{verbatim}

Thus, the new local protocol will know that its context is the instance it
was retrieved from.

Of course, to make this new extension really robust, we would need to add
some more documentation.  For example, it might be good to add an
\class{ILocalProtocol} interface that documents what local protocols do.
Context-sensitive adapters would then be able to verify whether they are
working with a local protocol or a global one.  Framework developers would
also want to document what local interfaces are provided by their
frameworks' objects, and authors of context-sensitive adapters need to
document what interface they expect their local protocols'
\member{context} attribute to supply!  Also, see below for a web site with
some interesting papers on patterns for using localized adaptation of this
kind.

\note{In practice, the idea of having local protocols turned out to be
useful enough that as of version 0.9.1, our \class{LocalProtocol} example
class was added to the protocols package as \class{protocols.Variation}.
So, if you want to make use of the idea, you don't need to type in the
source or write your own any more.}

\begin{seealso}
\seetitle[http://www.objectteams.org/]{Object Teams}{If you find the idea
of context-specific interfaces and adapters interesting, you'll find
``Object Teams'' intriguing as well.  In effect, the ideas we've presented
here map onto a subset of the ``Object Teams'' concept.  Our local
interfaces correspond to their ``abstract roles'', our local adapters'
instances map to their ``role instances'', and our contexts are their
``team instances''.  Adapting an object corresponds to their ``lifting'',
and so on.  The main concept that's not directly supported by our
implementation here is ``callin binding''.
(Callin binding is a way of (possibly temporarily) injecting hooks into an
adapted object so that the adapter can be informed when the adapted
object's methods are called directly by other code.)}
\end{seealso}

\newpage
\subsection{Additional Examples and Usage Notes}

If you have any ideas or examples you'd like to share for inclusion in
this section, please contact the author.  In the meantime, here are a few
additional examples of things you can do with \function{adapt()} and the
\module{protocols} package.

\subsubsection{Double Dispatch and the ``Visitor'' Pattern\label{dispatch-example}}

Double dispatch and the ``Visitor'' pattern are mechanisms for selecting a
method to be executed, based on the types of two objects at the same time.
To implement either pattern, both object types must have code specifically
to support the pattern.  Object adaptation makes this easier by requiring
at most one of the objects to directly support the pattern; the other side
can provide support via adaptation.  This is useful both for writing new
code clearly and for adapting existing code to use the pattern.

First, let's look at double dispatching.  Suppose we are creating a
business application GUI that supports drag-and-drop.  We have various
kinds of objects that can be dragged and dropped onto other objects:
users, files, folders, a trash can, and a printer.  When we drop a user on
a file, we want to grant the user access to the file, and when we drop a
file on the user, we want to email them the file.  If we drop a file on a
folder, it should be filed in the folder, but if we drop the folder on the
file, that's an error.  The classic ``double dispatch'' approach would
look something like:

\begin{verbatim%
}class Printer:
    def drop(self,thing):
        thing.droppedOnPrinter(self)

class Trashcan:
    def drop(self,thing):
        thing.droppedInTrash(self)

class User:
    def drop(self,thing):
        thing.droppedOnUser(self)

    def droppedOnPrinter(self,printer):
        printer.printUser(self)

    def droppedInTrash(self,trash):
        self.delete()

class File:
    def drop(self,thing):
        thing.droppedOnFile(self)

    def droppedOnPrinter(self,printer):
        printer.printFile(self)

    def droppedOnUser(self,user):
        user.sendMail(self)

    def droppedInTrash(self,trash):
        self.delete()
\end{verbatim}

We've left out any of the methods that actually \emph{do} anything, of
course, and all of the methods for things that the objects don't do.  For
example, the \class{Trashcan} should have methods for
\method{droppedInTrash()}, \method{droppedOnPrinter()}, etc., that display
an error or beep or whatever.  (Of course, in Python you can just trap the
\exception{AttributeError} from the missing method to do this; but we
didn't show that here either.)  Every time another kind of object is added
to this system, new \code{droppedOnX} methods spring up everywhere like
weeds.

Now let's look at the adaptation approach:

\begin{verbatim%
}class Printer:
    def drop(self,thing):
        IPrintable(thing).printOn(self)

class Trashcan:
    def drop(self,thing):
        IDeletable(thing).delete(self)

class User:
    protocols.advise(instancesProvide=[IDeletable,IPrintable])

    def drop(self,thing):
        IMailable(thing).mailTo(self)

class File:
    protocols.advise(instancesProvide=[IDeletable,IMailable,IPrintable])

    def drop(self,thing):
        IInsertable(thing).insertInto(self)

class Undroppable(protocols.Adapter):

    protocols.advise(
        instancesProvide=[IPrintable,IDeletable,IMailable,IInsertable],
        asAdapterForTypes=[object]
    )

    def printOn(self,printer):
        print "Can't print", self.subject

    def mailTo(self,user):
        print "Can't mail", self.subject

    # ... etc.
\end{verbatim}

Notice how our default \class{Undroppable} adapter class implements the
\class{IPrintable}, \class{IDeletable}, \class{IMailable}, and
\class{IInsertable} protocols on behalf of arbitrary objects, by giving
user feedback that the operation isn't possible.  (This technique of using
a default adapter factory that provides an empty or error-raising
implementation of an interface is an example of the \strong{null object
pattern}.)

Notice that the adaptation approach is much more scalable, because new
methods are not required for every new droppable item.  Third parties can
declare adaptations between two other developers' objects, making drag and
drop between them possible.

Now let's look at the ``Visitor'' pattern.  The ``Visitor'' pattern is a
specialized form of double dispatch, used to apply an algorithm to a
structured collection of objects.  For example, the Python
\program{docutils} toolkit implements the visitor pattern to create
various kinds of output from a document node tree (much like an XML DOM).
Each node has a \method{walk()} method that accepts a ``visitor''
argument.  The visitor must provide a set of \code{visit_X} methods, where
X is the name of a type of node.  The idea of the approach is that one can
write new visitor types that perform different functions.  One visitor
writes out HTML, another writes out LaTeX or maybe plain ASCII text.  The
nodes don't care what the visitor does; they just tell it what kind of
object is being visited.

Like double dispatch, this pattern is definitely an improvement over
writing large if-then-else blocks to introspect types.  But it does have a
few drawbacks.  First, all the types must have unique names.  Second, the
visitor must have methods for all possible node types (or the caller must
handle the absence of the methods).  Third, there is no way for the
methods to mimic the inheritance or interface structure of the source
types.  So, if there are node types like \class{Shape} and \class{Square},
you must write \method{visit_Shape} and \method{visit_Square} methods,
even if you would like to treat all subtypes of \class{Shape} the same.

The object adaptation approach is to define the visitor(s) as adapters
from the objects being traversed to an interface that supplies the desired
behavior.  For example, one might define \class{IHTMLWriter} and
\class{ILaTeXWriter} interfaces, with \method{writeHTML()} and
\method{writeLaTeX()} methods.  Then, by defining adapters from the
appropriate node base types to these interfaces, the desired behavior is
achieved.  Just use \code{IHTMLWriter(document).writeHTML()}, and off you
go.

This approach is far less fragile, since new node types do not require new
methods in the visitors, and if a new node type specializes an existing
type, the default adaptation might be reasonable.  Also, the approach is
non-invasive, so it can be applied to existing frameworks that don't
support the visitor pattern (such as \module{xml.dom.minidom}).  Further,
the adapters can exercise fine-grained control over any traversal that
takes place, since it is the adapter rather than the adaptee that controls
the visiting order.  Last, but not least, notice that by adapting from
interfaces rather than types, one can apply this pattern to multiple
implementations of the interface.  For example, Python has many XML DOM
implementations; to the extent that two implementations provide the same
interface, the adapters you write could be used with any of them, even if
each package has different names for their node types.
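To make this concrete, here's a rough sketch of what the adapter-based
``visitor'' might look like for \module{xml.dom.minidom}.  (The
\class{IHTMLWriter} interface and both adapter classes are hypothetical
illustrations, not part of PyProtocols or \module{minidom}; a real
implementation would also need to escape markup, write attributes, and
handle more node types.)

\begin{verbatim%
}import sys, protocols
from xml.dom import minidom

class IHTMLWriter(protocols.Interface):
    def writeHTML(stream):
        """Write an HTML rendering of the node to 'stream'"""

class TextAsHTML(protocols.Adapter):
    protocols.advise(instancesProvide=[IHTMLWriter],
                     asAdapterForTypes=[minidom.Text])

    def writeHTML(self, stream):
        stream.write(self.subject.data)    # no escaping in this sketch

class ElementAsHTML(protocols.Adapter):
    protocols.advise(instancesProvide=[IHTMLWriter],
                     asAdapterForTypes=[minidom.Element])

    def writeHTML(self, stream):
        # the adapter, not the node, controls the traversal order
        stream.write('<%s>' % self.subject.tagName)
        for child in self.subject.childNodes:
            IHTMLWriter(child).writeHTML(stream)
        stream.write('</%s>' % self.subject.tagName)

doc = minidom.parseString('<doc>Hello, <b>world</b>!</doc>')
IHTMLWriter(doc.documentElement).writeHTML(sys.stdout)
\end{verbatim}

Supporting additional node types (comments, processing instructions, and
so on) is then just a matter of declaring more adapters; neither the node
classes nor the existing adapters need to change.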
Are there any downsides to using adaptation over double dispatch or the
Visitor pattern?  The total size of your program may be larger, because
you'll be writing lots of adapter classes.  But your program will also be
more modular, and you'll be able to group the classes in ways that make
more sense for the reader.  Using adaptation may also be faster or slower
than not using it, depending on various implementation factors.  It's rare
that the difference is significant, however.  In most uses of these
patterns, runtime is dominated by the useful work being done, not by the
dispatching.

The exception is when a structure to be visited contains many thousands of
elements that need virtually no work done to them.  (For example, if an
XML visitor wrote text nodes out unchanged, and the input was mostly text
nodes.)  Under such conditions, the time taken by the dispatch mechanism
(whether name-based or adapter-based) would be more visible.  The author
has found, however, that in that situation, one can gain more speed by
registering null adapter factories or \function{DOES_NOT_SUPPORT} for the
element types in question.  This shortens the adapter lookup time enough
to make the adaptation overhead competitive with name-based approaches.
But this only needs to be done when ``trivial'' elements dominate the
structures to be processed, \emph{and} performance is critical.

\begin{seealso}
\seetitle[http://citeseer.nj.nec.com/nordberg96variations.html]{Variations
on the Visitor Pattern}{A critique of the Visitor pattern that raises some
of the same issues we raise here, and with similar solutions.  Reading its
C++ examples, however, will increase your appreciation for the simplicity
and modularity that \function{adapt()} offers!}

\seetitle[http://citeseer.nj.nec.com/woolf96null.html]{The Null Object
Pattern}{The original write-up of this handy approach to simplifying
framework code.}
\end{seealso}

\newpage
\subsubsection{Replacing Introspection with Adaptation, Revisited\label{introspect-elim}}

\begin{quotation}
``Potentially-idempotent adapter functions are a honking great idea --
let's do more of those'', to paraphrase the timbot.
\hfill --- Alex Martelli, on \newsgroup{comp.lang.python}
\end{quotation}

All programs that use GOTOs can be rewritten without GOTOs, using
higher-level control-flow constructs like function calls, \code{while}
loops, and so on.  In the same way, type checks -- and even interface
checks -- are not essential in the presence of higher-level control
constructs such as adaptation.  Just as getting rid of GOTO ``spaghetti
code'' helped make programs easier to read and understand, so too can
replacing introspection with adaptation.

In section \ref{replintrowadapt}, we listed three common uses of type or
interface checks (e.g. using \function{isinstance()}):

\begin{itemize}
\item To manually adapt a supplied component to a needed interface
\item To select one of several possible behaviors, based on the kind of
component supplied
\item To select another component, or take some action, using information
about the interfaces supported by the supplied component
\end{itemize}

By now, you've seen enough uses of the \module{protocols} package that it
should be apparent that all three of the above use cases can -- in
principle -- be handled by adaptation.  However, in the course of moving
PEAK from using introspection to adaptation, I ran into some use cases
that at first seemed very difficult to ``adapt''.
However, once I understood how to handle them, I realized that there was a
straightforward approach to refactoring any introspection use cases I
encountered.  Although this approach seems to have more than one step, in
reality they are all variations on the same theme: expose the hidden
interface, then adapt to it.  Here's how you do it:

\begin{enumerate}

\item First, is this just a case of adapting different types to a common
interface?  If yes, then just declare the adapters and use
\function{adapt()} normally.  If the interface isn't explicit or
documented, make it so.

\item Is this a case of choosing a behavior based on the type?  If yes,
then \emph{define the missing interface} that you want to adapt to.  In
other words, code that switches on type to select a behavior really wants
the behavior to be in the \emph{other} object.  So, there is in effect an
``undocumented implicit interface'' that the code is adapting the other
object to.  Make the interface explicit and documented, move the code into
adapters (or into the other classes!), and use \function{adapt()}.

\item Is this a case of choosing a behavior or a component based on using
interfaces as metadata?  If so, this is really a special case of \#2.  An
example of this use case is where Zope X3 provides UI components based on
what interfaces an object supports.  In this case, the ``undocumented
implicit interface'' is the ability to select an appropriate UI component!
Or perhaps it's an ability to provide a set of ``tags'' or ``keys'' that
can be used to look up UI components or other things.  You'll have to
decide what the real ``essence'' is.  But either way, you make the needed
behavior explicit (as an interface), and then use \function{adapt()}.

\end{enumerate}

Notice that in each case, the code is demonstrably improved.  First, there
is more documentation of the \emph{intended} behavior (as opposed to
merely the actual behavior, which might be broken).  Second, there is
greater extensibility, because it isn't necessary to change the code to
add more type cases.  Third, the code is more readable, because the code's
purpose is highlighted, not all the possible variations of its
implementation.  In the words of Tim Peters, ``Explicit is better than
implicit.  Simple is better than complex.  Sparse is better than dense.
Readability counts.''

Now that we've covered how to replace all forms of introspection with
adaptation, I'll readily admit that I still write code that does
introspection when I'm in ``first draft'' mode!  Brevity is the soul of
prototyping, and I don't mind banging out a few quick
\code{if isinstance():} checks in order to figure out what it is I want
the code to do.  But then, I refactor, because I want my code to be...
adaptable!  Chances are good that you will, too.
pyprotocols-1.0a.svn20070625/docs/ref/ref.tex0000644000175000017500000000400710132267136016727 0ustar kovkov% Complete documentation on the extended LaTeX markup used for Python
% documentation is available in ``Documenting Python'', which is part
% of the standard documentation for Python.  It may be found online
% at:
%
%     http://www.python.org/doc/current/doc/doc.html

\documentclass{manual}

\title{Component Adaptation + Open Protocols \\ = The PyProtocols Package}

\author{Phillip J. Eby}

% Please at least include a long-lived email address;
% the rest is at your discretion.
\authoraddress{
	% Organization name, if applicable \\
	% Street address, if you want to use it \\
	Email: \email{pje@telecommunity.com}
}

\date{October 10, 2004}		% update before release!
\release{1.0a0}			% release version; this is used to define the
				%     \version macro
\makeindex			% tell \index to actually write the .idx file
\makemodindex			% If this contains a lot of module sections.

\begin{document}

\maketitle

% This makes the contents more accessible from the front page of the HTML.
%\ifhtml
%\chapter*{Front Matter\label{front}}
%\fi

%\input{copyright}

\begin{abstract}
\noindent
The Python Protocols package provides framework developers and users with
tools for defining, declaring, and adapting components between interfaces,
even when those interfaces are defined using different mechanisms.
\end{abstract}

\tableofcontents

\chapter{Reference}
\input{libprotocols.tex}

%\appendix
%\chapter{...}
%My appendix.
%The \code{\e appendix} markup need not be repeated for additional
%appendices.

%
%  The ugly "%begin{latexonly}" pseudo-environments are really just to
%  keep LaTeX2HTML quiet during the \renewcommand{} macros; they're
%  not really valuable.
%
%  If you don't want the Module Index, you can remove all of this up
%  until the second \input line.
%
%begin{latexonly}
\renewcommand{\indexname}{Module Index}
%end{latexonly}
\input{mod\jobname.ind}		% Module Index

%begin{latexonly}
\renewcommand{\indexname}{Index}
%end{latexonly}
\input{\jobname.ind}		% Index

\end{document}
pyprotocols-1.0a.svn20070625/docs/ref/.cvsignore0000644000175000017500000000001407661024473017433 0ustar kovkov*.l2h
*.pdf
pyprotocols-1.0a.svn20070625/setup.py0000755000175000017500000000214310570131557015444 0ustar kovkov#!/usr/bin/env python
"""Distutils setup file"""

import ez_setup, sys
ez_setup.use_setuptools()
from setuptools import setup, Feature, Extension, find_packages

# Metadata
PACKAGE_NAME = "PyProtocols"
PACKAGE_VERSION = "1.0a0"
HAPPYDOC_IGNORE = ['-i', 'tests', '-i', 'setup', '-i', 'setuptools']

execfile('src/setup/common.py')

speedups = Feature(
    "optional C speed-enhancement modules",
    standard = True,
    ext_modules = [
        Extension("protocols._speedups", ["src/protocols/_speedups.pyx"]),
    ]
)

setup(
    name=PACKAGE_NAME,
    version=PACKAGE_VERSION,
    description="Open Protocols and Component Adaptation for Python",
    author="Phillip J. Eby",
    author_email="peak@eby-sarna.com",
    license="PSF or ZPL",
    url="http://peak.telecommunity.com/PyProtocols.html",
    zip_safe = sys.version>='2.3.5',
    test_suite = 'protocols.tests.test_suite',
    package_dir = {'':'src'},
    package_data = {'': ['*.txt']},
    packages = find_packages('src'),
    cmdclass = SETUP_COMMANDS,
    features = {'speedups': speedups},
    install_requires = ['DecoratorTools>=1.3'],
)
pyprotocols-1.0a.svn20070625/UPGRADING.txt0000644000175000017500000000321010132267136016003 0ustar kovkovUpgrading from older versions of PyProtocols

Due to some features added in PyProtocols 0.9.3, some programs may have
compatibility issues upon upgrading.  The changes affect only a handful of
features that are very infrequently used, so if you don't know what the
following features are, you have nothing to worry about:

* The 'protocol' attribute of 'Adapter' and 'StickyAdapter' objects was
  deprecated in 0.9.3, and removed in 1.0a0.

* If you use the "ABC" (abstract base class) style of interfaces, you
  *must* define an __init__ method in either the ABC itself, or in any of
  its concrete subclasses that are to be instantiated.  If you do not, the
  resulting behavior may not be as expected.

* The 'factory' argument to 'adapt()' has been deprecated and using it
  will issue a 'DeprecationWarning'.

* Adapter factories are now only called with one argument: the object to
  adapt.
  For backward compatibility, any adapter factories that require more than
  one argument are wrapped in a converter.  It's highly recommended that
  you transition to one-argument adapters as soon as practical, since
  using two-argument adapter factories is deprecated and will cause
  deprecation warnings to appear on 'sys.stderr' at runtime.  (And, by
  version 1.1, support for two-argument adapters will be removed
  completely.)  This change was made for symmetry with Zope and Twisted
  adapters, as well as Pythonic adapter factories like 'int'.

* 'StickyAdapter' subclasses must define an 'attachForProtocols'
  attribute, or they will stop working correctly.  See the reference
  manual for details on the 'attachForProtocols' attribute.
pyprotocols-1.0a.svn20070625/README.txt0000644000175000017500000001211010072402627015416 0ustar kovkovPyProtocols Release 0.9.3 (release candidate 1)

Copyright (C) 2003 by Phillip J. Eby.  All rights reserved.  This software
may be used under the same terms as Zope or Python.  THERE ARE ABSOLUTELY
NO WARRANTIES OF ANY KIND.

Package Description

Do you hate having to write lots of if-then logic to test what type
something is?  Wouldn't it be nice if you could just declare "I want this
object to have this behavior" and magically convert whatever value you
have, to the type you need?

PyProtocols lets you do just that, cleanly, quickly, and robustly -- even
with built-in types or other people's classes.

PyProtocols extends the PEP 246 adapt() function with a new "declaration
API" that lets you easily define your own interfaces and adapters, and
declare what adapters should be used to adapt what types, objects, or
interfaces.  In addition to its own Interface type, PyProtocols can also
use Twisted and Zope's Interface types.  (Of course, since Twisted and
Zope interfaces aren't as flexible, only a subset of the PyProtocols API
works with them.  Specific limitations are listed in the documentation.)

If you're familiar with Interface objects in Zope, Twisted, or PEAK, the
Interface objects in PyProtocols are very similar.  But, they can also do
many things that no other Python interface types can do.  For example,
PyProtocols supports "subsetting" of interfaces, where you can declare
that one interface is a subset of another existing interface.  This is
like declaring that somebody else's existing interface is actually a
subclass of the new interface.  Twisted and Zope don't allow this, which
makes them very hard to use if you're trying to define interfaces like
"Read-only Mapping" as a subset of "Mapping Object".

Unlike Zope and Twisted, PyProtocols also doesn't force you to use a
particular interface coding style or even a specific interface type.  You
can use its built-in interface types, or define your own.  If there's
another Python package out there with interface types that you'd like to
use (CORBA?  COM?), you can even create your own adapters to make them
work with the PyProtocols API.

PyProtocols is also the only interface package that supports automatic
"transitive adaptation".  That is, if you define an adapter from interface
A to interface B, and another from B to C, PyProtocols automatically
creates and registers a new adapter from A to C for you.  If you later
declare an explicit adapter from A to C, it silently replaces the
automatically created one.

PyProtocols may be used, modified, and distributed under the same terms
and conditions as Python or Zope.
Version 0.9.3 Release Notes

For **important** notes on upgrading from previous releases, and
information about changes coming in 1.0, please see the UPGRADING.txt
file.  For the complete list of changes from 0.9.2, please see the
CHANGES.txt file.

Note that the 0.9.x release series is now in "maintenance mode", and no
new features will be added in future 0.9.x releases.  From now on, new
features will only be added to the 1.x releases, beginning with 1.0a1
later this year.

If you'd like to use Zope interfaces with PyProtocols, you must use Zope
X3 beta 1 or later, as PyProtocols' Zope support uses the latest Zope
interface declaration API.  If you'd like to use Twisted interfaces with
PyProtocols, you must use Twisted 1.0.5 or later.

Obtaining the Package and Documentation

Please see the "PyProtocols Home
Page":http://peak.telecommunity.com/PyProtocols.html for download links,
CVS links, reference manuals, etc.

Installation Instructions

Python 2.2.2 or better is required.  To install, just unpack the archive,
go to the directory containing 'setup.py', and run::

    python setup.py install

PyProtocols will be installed in the 'site-packages' directory of your
Python installation.  (Unless directed elsewhere; see the "Installing
Python Modules" section of the Python manuals for details on customizing
installation locations, etc.).

(Note: for the Win32 installer release, just run the .exe file.)

If you wish to run the associated test suite, you can also run::

    python setup.py test

which will verify the correct installation and functioning of the package.

PyProtocols includes an optional speed-enhancing module written in Pyrex
and C.  If you do not have a C compiler available, you can disable
installation of the C module by invoking 'setup.py' with
'--without-speedups', e.g.::

    python setup.py --without-speedups install

or::

    python setup.py --without-speedups test

You do not need to worry about this if you are using the Win32 binary
installer, since it includes a pre-compiled speedups module.

Note: if you have installed Pyrex on your Python path, be sure it is Pyrex
version 0.7.2.  You do *not* have to have Pyrex installed, even to build
the C extension, but if you do have it installed, make sure it's up to
date before building the C extension.
pyprotocols-1.0a.svn20070625/.cvsignore0000644000175000017500000000003510104460235015714 0ustar kovkovbuild
dist
MANIFEST
.project
pyprotocols-1.0a.svn20070625/setup.cfg0000644000175000017500000000020610570131557015546 0ustar kovkov[sdist_nodoc]
formats=gztar,zip

[sdist]
formats=gztar,zip

[bdist]
formats=wininst

[egg_info]
tag_build = dev
tag_svn_revision = 1