After my last blog, many megabytes on the precise definition of adaptation were posted to python-dev (search for "PEP-246"). Overwhelmed by the discussion, I'm going to propose a somewhat Solomonic alternative.
Tony Lownds posted the following in response to my last blog:
There is a property that a typing system for python could have: that removing the type declarations from a program, or ignoring them at run-time, doesn't stop any programs from working.
I like this idea, not so much because it rules out implicit adaptation (the main bone of contention in the python-dev discussions), but because it stresses the optional aspect of the type declarations. But at the same time I still like the adaptation interpretation. So here's a possible way out: make the semantics of type checking configurable at run time.
Specifically, I propose to let module authors set a global variable named __typecheck__. This could be a function defined as follows:
def __typecheck__(x, T):
    if "your condition here":
        raise TypeError("your error message here")  # Other errors are okay too
    return x  # The wrapper below uses the return value
This is to be used as follows. Consider a function with type declarations:
def foo(a: t1, b: t2) -> t3:
    "your code here"
This would be replaced by something like the following:
def foo__(a, b):  # The original function
    "your code here"

def foo(a, b):  # Typechecking wrapper
    a = __typecheck__(a, t1)
    b = __typecheck__(b, t2)
    r = foo__(a, b)
    r = __typecheck__(r, t3)
    return r
(However, the expressions for t1, t2, t3 and foo__ should be evaluated only once, at function definition time. Probably, __typecheck__ should also be evaluated at this time.)
There would be a very conservative default __typecheck__ function (a built-in) which would simply assert isinstance(x, T). The -O option could disable typechecking entirely by not generating wrappers at all, if the __typecheck__ global is not defined at function definition time.
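For illustration, here is how the proposed scheme might be emulated with a decorator in current Python. The names `typechecked` and `default_typecheck` and the `returns` keyword are inventions for this sketch, not part of the proposal:

```python
def default_typecheck(x, T):
    # Conservative default: a plain isinstance check, as described above.
    if not isinstance(x, T):
        raise TypeError("%r is not an instance of %r" % (x, T))
    return x

def typechecked(*arg_types, returns=object, check=default_typecheck):
    # Hypothetical decorator emulating the proposed wrapper; the types
    # and the check function are captured once, at definition time.
    def decorate(func):
        def wrapper(*args):
            checked = [check(a, T) for a, T in zip(args, arg_types)]
            return check(func(*checked), returns)
        return wrapper
    return decorate

@typechecked(int, int, returns=int)
def add(a, b):
    return a + b

print(add(1, 2))  # 3; add(1, "x") would raise TypeError
```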
If you like implicit adaptation, you can put this in your module:
from adaptation import adapt as __typecheck__
You could also define __typecheck__ inside a class, just like for __metaclass__ (although its scope would only be the current class, not its subclasses, unlike for __metaclass__, which is inherited). In either case, -O won't remove the type checking wrapper if __typecheck__ is set. (This can be done dynamically, e.g. by the metaclass checking for __debug__ if __typecheck__ is not set explicitly.)
You can also implement duck typing as defined by adapt(), as follows:
from adaptation import adapt
def __typecheck__(x, T):
    if adapt(x, T) is not x:
        raise TypeError("%r does not conform to %r" % (x, T))
    return x
This accepts all values for x that conform to T according to the adaptation registry.
After a long conversation with Phillip Eby, and awaiting the new PEP 246, it's likely that this latter version ("duck typing as defined by adapt()") would become the default if PEP 246 is accepted. (Feb 6, 2005.)
> Specifically, I propose to let module authors set a global variable named __typecheck__. This could be a function defined as follows:

> This is to be used as follows. Consider a function with type declarations:
>
>     def foo(a: t1, b: t2) -> t3:
>         "your code here"

(nitpicking here, really) Using the approach you propose, why not retain the standard Python notation for functions, including optional default values for parameters, and use a 'decorator notation' for type, something like the following:

    @typecheck(t1, t2; t3)  # return types separated by ;
    def foo(a, b=42):
        "your code here"

When reading, one can easily skip over the (optional) type information. It also does not introduce any new meanings to symbols (i.e. ":") nor introduce new ones ("->").
I, for one, strongly agree with the per-module and per-class approach you describe here. I was angling for this in a comment 'Switching adaption on and off' on the previous blog entry.
Universal, automatic adaptation acts like 'overloaded whitespace', to borrow an old C++ joke. Implicit conversions in that language are confusing and expensive if not kept under control. (I think this point got a run in python-dev.)
The function signature syntax is just too useful, as documentation and as fodder for static analysis, to saddle with mandatory runtime overheads. People will use it for documentation, I'm sure.
ISTM that this makes it very easy for people to implement overly strict type checks, with no way to override them. In fact, the default type checking you propose implements an overly strict checking (isinstance) unless you use -O!
I think this makes it too easy to change the "flavor" of Python, because people used to Java will get the idea that you should define __typecheck__ to do isinstance() checking and then we'll all have to put up with it. And we won't get to bypass it by just declaring that our types conform, so this is actually worse than your original conformance-based proposal.
If you must allow per-module customization, at least let's have a default behavior that doesn't lock out alternative implementations of things.
> Using the approach you propose, why not retain the standard python notation for function, including with optional value for parameters, and using a 'decorator notation' for type, something like the following:
>
>     @typecheck(t1, t2; t3)  # return types separated by ;
>     def foo(a, b=42):
>         "your code here"
>
> When reading, one can easily skip over the (optional) type information. It also does not introduce any new meanings to symbols (i.e. ":") nor introduce new ones ("->").
>
> André
This is proposed as an interim solution until the real syntax is implemented, but it really is too verbose and error-prone to be the only syntax for this feature.
> ISTM that this makes it very easy for people to implement overly strict type checks, with no way to override them. In fact, the default type checking you propose implements an overly strict checking (isinstance) unless you use -O!
That was just a strawman; if adaptation is ready on time we could make the adaptation-based duck typing test from my example the default.
Another way to prevent this from happening is to introduce a set of standard interfaces (iterable, sequence, etc.) at the same time (with an appropriate conformance check), so people will start using the right kind of types in their declarations from day zero.
Also, remember what people (outside Zope, Twisted, PEAK) are using today: assert isinstance(x, (int, long))!
> I think this makes it too easy to change the "flavor" of Python, because people used to Java will get the idea that you should define __typecheck__ to do isinstance() checking and then we'll all have to put up with it.
No we don't have to put up with it. Let them do that in their own internal code (believe me, they already commit all sorts of similar sins). If folks attempt to share libraries with such poorly designed type checking, they will quickly be told to fix it by their users.
> And we won't get to bypass it by just declaring that our types conform, so this is actually worse than your original conformance-based proposal.
I think you're too worried about poorly designed libraries. A poorly designed library is just that, and the forces of evolution will take care of this "problem".
> If you must allow per-module customization, at least let's have a default behavior that doesn't lock out alternative implementations of things.
I want the default behavior to be easily understandable, and I want it to be purely optional (so -O can disable it), and I want it to catch blatant mistakes like passing a string where a mutable sequence is expected.
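As an aside, the abstract base classes later added to the standard library express exactly that last check; a quick illustration using collections.abc, which postdates this discussion:

```python
from collections.abc import MutableSequence

# A str is a sequence but not a mutable one, so even a plain
# isinstance check against MutableSequence catches the classic
# mistake of passing a string where a list was expected:
print(isinstance([1, 2, 3], MutableSequence))  # True
print(isinstance("abc", MutableSequence))      # False
```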
Unfortunately, I don't have the depth of knowledge in computer science to comment on the technical merits of the details of the language changes being proposed. However, I would like to throw in the following high-level observations:
Will the changes make Python more verbose? Yes. Will the changes make Python more complex? Yes. Will the changes enable Python to do things it couldn't do before? Maybe, but probably nothing significant. Will the changes improve Python? For some people: yes; for others: no. Overall, about neutral.
What I see is a lot of effort and fiddling going on at a very low level in the syntax of the language. This is small scale tactical thinking that adds nothing to, and conceivably detracts from, the strategic value of Python.
These changes (regardless of their technical merit) won't make Python easier to use and they won't make computers easier to use. I think it's time to sit back and take a long look at the future of Python (and other languages, and software development in general) to see where it is that we think we are going and how we intend to get there. And I don't think 'being able to do everything' is a useful strategic goal for Python.
Having started coding with Fortran and moved through Cobol, PL/1, Visual Basic, Java to Python; I find the differences between all these languages to be far, far smaller than their proponents would have us believe. 90% of the differences amount to nothing more than different syntax for doing exactly the same thing as was being done before.
When I get a customer requirement that says something along the lines of: "Give me something that shows me which staff are scheduled to travel to the US today." then, despite their differences, any of the languages - and many others - would provide equally good solutions.
So please, don't waste too much of your life fiddling with the detail when it's not the detail that's the problem.
Well, that's my ranting done for today. Back to fiddling with the @#$!* fiddly detail of why Java's definition of a CLOB is not compatible with Oracle's.
It's quite a long time since my last post... in fact, all the (really interesting) things I wanted to say were already said in the first two blog posts...
Now it seems we are coming nearer to a "solution". Several points of consensus have emerged, and I agree with them:
- use decorators first, while waiting for a syntax modification
- make adaptation the default typechecking for Python as soon as possible (ASAP for the "geeky" guys :þ)
And even with this last post we can see how typechecking can be turned ON/OFF or modified so that it conforms to the writer's view of typechecking
Here is one question:

> (However, the expressions for t1, t2, t3 and foo__ should be evaluated only once, at function definition time. Probably, __typecheck__ should also be evaluated at this time.)

How would you do this? Any proposition for an implementation?
Guido writes:

> Also, remember what people (outside Zope, Twisted, PEAK) are using today: assert isinstance(x, (int, long))!
But that's not true... what people are doing today is mostly putting nothing at all! And I fear that Phillip is right: once people learn "the way to do typechecking in Python", there are a large number of people who will use it everywhere.
The @typechecking decorator would produce a wrapper that holds references to t1, t2, t3 and the wrapped function object. (It will have to use sys._getframe() to access __typecheck__ and bind that to the wrapper too.)
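A rough sketch of how that could look. The decorator name `typechecking` is taken from the comment above, while the lookup order (the defining scope's globals, then a conservative built-in default) is my assumption:

```python
import sys

def _default_typecheck(x, T):
    # Conservative fallback when no __typecheck__ is defined.
    if not isinstance(x, T):
        raise TypeError("%r is not an instance of %r" % (x, T))
    return x

def typechecking(*arg_types, returns=object):
    # Capture __typecheck__ from the defining scope's globals once,
    # at decoration time, via sys._getframe() as suggested above.
    check = sys._getframe(1).f_globals.get("__typecheck__",
                                           _default_typecheck)
    def decorate(func):
        def wrapper(*args):
            checked = [check(a, T) for a, T in zip(args, arg_types)]
            return check(func(*checked), returns)
        return wrapper
    return decorate

def __typecheck__(x, T):
    # A permissive module-level policy: coerce instead of raising.
    return x if isinstance(x, T) else T(x)

@typechecking(int, int, returns=int)
def mul(a, b):
    return a * b

print(mul("3", 4))  # 12 -- "3" was coerced to 3 by __typecheck__
```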
Maybe the way to do it is to have the default type-checker print out a warning to stderr if the type-check fails, using a fairly simple and conservative check such as isinstance(), and then work on a system for suppressing warnings selectively for usages that are actually okay. Note that you need to be able to suppress warnings for method calls that are fairly deep in the stack. For example, if you write function foo() which calls standard library function bar(), which calls another function baz(), which raises a warning, there needs to be a way for you to suppress certain warnings raised by baz() without disabling useful warnings for unrelated calls to baz().
Standard library authors could then suppress a bunch of warnings for common usages that they know are okay, without having to worry about getting every last case. A caller of the library could try doing something new, and if it works, suppress the warning.
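The warn-instead-of-raise half of this idea can be sketched with the standard warnings module; the per-call-site suppression described above would need additional machinery beyond the stock filters:

```python
import warnings

class TypecheckWarning(UserWarning):
    """Issued when a declared type does not match."""

def __typecheck__(x, T):
    # Warn on a mismatch instead of raising: the program keeps
    # running, and callers can filter warnings they know are okay.
    if not isinstance(x, T):
        warnings.warn("%r is not an instance of %r" % (x, T),
                      TypecheckWarning, stacklevel=2)
    return x

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    __typecheck__("abc", int)   # mismatch: recorded, not fatal
    __typecheck__(42, int)      # fine: no warning recorded
print(len(caught))              # 1
```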