The Artima Developer Community

Weblogs Forum
Optional Static Typing -- Stop the Flames!

79 replies. Most recent reply: Feb 4, 2008 9:52 AM by Wolfgang Lipp

Guido van Rossum

Posts: 359
Nickname: guido
Registered: Apr, 2003

Optional Static Typing -- Stop the Flames! Posted: Jan 6, 2005 5:05 PM
Summary
My two posts on adding optional static typing to Python have been widely misunderstood, and spurred some flames from what I'll call the NIMPY (Not In My PYthon) crowd. In this post I'm describing a scaled-down proposal with run-time semantics based on interfaces and adaptation.

In a Nutshell

  • Argument and return type declarations
  • Attribute declarations (maybe)
  • Interface declarations
  • Design by contract (maybe)

Argument and Return Type Declarations

Let's go back to basics. A function defined like this:

def foo(x: t1, y: t2) -> t3:
    ...body...

is more or less equivalent to this:

def foo__(x, y):  # original function
    ...body...

def foo(x, y):    # wrapper function
    x = adapt(x, t1)
    y = adapt(y, t2)
    r = foo__(x, y)
    return adapt(r, t3)

Here t1, t2 and t3 are expressions that are evaluated once, at function definition time (i.e. at the same time as argument default values). Also, the wrapper doesn't really access the original by name; more likely it is an object that has a reference to the original function somehow.

The types are standard Python expressions; there's no separate syntax for type expressions. A type can be combined with a default value:

def foo(x: int = 42):
    pass

The default value gets adapted to the given type at function declaration time.

A function has a __signature__ attribute from which the names, types, and default values of the arguments can be introspected, as well as the return type (and the types for *args and **kwds, if specified).

Until the parser has been modified to accept this syntax, we can experiment with decorators like this:

@arguments(t1, t2)
@returns(t3)
def foo(x, y):
    ...body...

but that notation has little to recommend it in the long run.
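For experimentation, here is one minimal way such decorators could be written today. This is only a sketch: adapt() is stubbed out as a pass-through standing in for a real PEP 246 implementation, keyword arguments are ignored, and the decorator names simply mirror the ones used above.

def adapt(obj, protocol):
    # Stand-in for a PEP 246 adapt(); a real implementation would
    # consult __conform__ and __adapt__ instead of passing through.
    return obj

def arguments(*arg_types):
    # Adapt each positional argument to its declared type.
    # (Keyword arguments are ignored in this sketch.)
    def decorate(func):
        def wrapper(*args):
            adapted = [adapt(a, t) for a, t in zip(args, arg_types)]
            return func(*(adapted + list(args[len(arg_types):])))
        wrapper.__name__ = func.__name__
        return wrapper
    return decorate

def returns(return_type):
    # Adapt the return value to the declared type.
    def decorate(func):
        def wrapper(*args, **kwds):
            return adapt(func(*args, **kwds), return_type)
        wrapper.__name__ = func.__name__
        return wrapper
    return decorate

@arguments(int, int)
@returns(int)
def add(x, y):
    return x + y

assert add(1, 2) == 3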

Attribute Declarations (Maybe)

It makes some sense to allow attribute declarations like this:

class C:
    x: t1

which would create a property x that calls adapt(value, t1) upon assignment.

This is syntactic sugar for something we can write today:

x = typedAttribute(t1)

The syntax can be combined with a default value:

x : t1 = default

meaning:

x = typedAttribute(t1, default)

The implementation of typedAttribute() is left as an exercise to the reader. This is only for classes, and only defines instance variables. (A mutable default will suck just like it always did.)
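One possible answer to the exercise, sketched as a data descriptor: the value is adapted on every assignment and stored in the instance __dict__ under a private per-descriptor key. The adapt() stub, the storage key, and the class names are all illustrative assumptions; note that the owning class has to be new-style for the descriptor protocol to apply.

def adapt(obj, protocol):          # stand-in for a PEP 246 adapt()
    return obj

_missing = object()

class typedAttribute(object):
    # Data descriptor: adapt on assignment, store per instance.
    def __init__(self, t, default=_missing):
        self.t = t
        self.default = default
        self.key = '_typed_%x' % id(self)

    def __get__(self, obj, cls=None):
        if obj is None:
            return self
        try:
            return obj.__dict__[self.key]
        except KeyError:
            if self.default is _missing:
                raise AttributeError("typed attribute not set")
            return adapt(self.default, self.t)

    def __set__(self, obj, value):
        obj.__dict__[self.key] = adapt(value, self.t)

class C(object):                   # new-style, so the descriptor works
    x = typedAttribute(int)

c = C()
c.x = 42
assert c.x == 42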

Interfaces

This is the only other place where I still think that new syntax is needed. My syntax proposal is still:

interface I1(I2, I3):
    def foo(a: t1, b: t2) -> t3:
        "docstring"

class C(I1):    # implements I1
    def foo(a, b):
        return a+b

The metaclass gives C.foo the __signature__ attribute and adaptation wrappers gleaned from the interface at run time. You don't have to specify argument and return types in the interface declaration; if the type is absent adapt() is not called.

The interfaces don't show up in the __bases__ attribute of the class; rather, they show up in a new __implements__ attribute (the metaclass can tell the difference between an interface and a class).
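To make the bookkeeping half of this concrete, here is a rough sketch of such a metaclass. Everything here is an assumption for illustration: the adaptation wrappers and __signature__ machinery are omitted, and the convention that an interface lists Interface as its first base is just one way the metaclass could tell interface definitions apart from implementing classes.

class InterfaceMeta(type):
    # Sketch only: keep interfaces out of __bases__ and record them in
    # __implements__; wrapping methods with adapt() calls is omitted.
    def __new__(mcl, name, bases, namespace):
        faces = [b for b in bases if isinstance(b, InterfaceMeta)]
        if not faces or bases[0] is Interface:
            # Defining the Interface root, or a new interface.
            return type.__new__(mcl, name, bases, namespace)
        # Defining an implementing class: strip the interfaces out of
        # the bases and record them in __implements__ instead.
        real_bases = tuple(b for b in bases
                           if not isinstance(b, InterfaceMeta)) or (object,)
        namespace['__implements__'] = tuple(faces)
        return type.__new__(type, name, real_bases, namespace)

Interface = InterfaceMeta('Interface', (object,), {})

class I1(Interface):
    def foo(self, a, b):
        "docstring"

class C(I1):                       # implements I1
    def foo(self, a, b):
        return a + b

assert C.__implements__ == (I1,)
assert I1 not in C.__bases__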

This proposal is even simpler than PEP 245 (Python Interface Syntax); I don't think the "implements" keyword proposed there is needed.

Design by Contract (Maybe)

This has received an inordinate amount of attention in the discussion forum, but I'm not very impressed with the resulting designs. There are basically two styles of proposals: statement-based and expression-based.

I think the expression-based proposals are too limited: they don't handle guards involving multiple arguments very well, and the proposed overloading of type expressions and boolean guards feels error-prone (what if I make my guard 'True' while awaiting inspiration for something better?). Also, there are clear use cases for guards that (in Python) can only be expressed using multiple statements.

But the statement-based designs are pretty cumbersome too, and I expect that in practice these will be used only in large projects. At the moment I am leaning towards not defining any new syntax for these, but instead using a decorator until we've got more usage experience. Here's a strawman proposal:

def _pre_foo(self, a, b):  # The pre-condition has the same signature as the function
    assert a > 0
    assert b > a

def _post_foo(self, rv, a, b):  # The signature inserts the return value in front!
    assert rv > b

@dbc(_pre_foo, _post_foo)   # design-by-contract decorator
def foo(self, a, b):
    return a+b

In this example, _pre_foo and _post_foo are just names I picked; they are associated with the foo method by the @dbc decorator.
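A minimal sketch of the decorator itself, with no magic beyond closures; the pre- and post-conditions are assumed to be ordinary functions that raise (here via assert) on violation, so the foo example above works as written whenever assertions are enabled:

def dbc(precondition, postcondition):
    # Run the pre-condition before the body and the post-condition
    # (which also sees the return value) after it.
    def decorate(func):
        def wrapper(self, *args, **kwds):
            precondition(self, *args, **kwds)
            rv = func(self, *args, **kwds)
            postcondition(self, rv, *args, **kwds)
            return rv
        wrapper.__name__ = func.__name__
        return wrapper
    return decorate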

An alternative proposal could use an implicit binding based on naming conventions; then pre- and post-conditions could automatically be inherited, but the metaclass has to do more work.

But if you really want my opinion, I think these should not become a part of standard Python just yet -- I'd rather see others experiment with the ideas sketched here, write a PEP, and then we can talk about standardization.

That's It!

I'm dropping the advanced and untried ideas for now, such as overloaded methods, parameterized types, variable declarations, and 'where' clauses. I'm also dropping things like unions and cartesian products, and explicit references to duck typing (the adapt() function can default to duck typing). Most of these (except for 'where' clauses) can be added back later without introducing new syntax when people feel the need, but right now they just act as red flags for the NIMPY (Not In My PYthon) crowd.

Most importantly, I'm dropping any direct connection to compile-time type checking or generating more efficient code. The adaptation wrappers will slow things down -- a price some people will gladly pay for the flexibility offered by adaptation and better run-time error checking. I expect that interface declarations will be helpful to PyChecker-like static bug finders and to optimizers using type inferencing, but these will have to deal with pretty much the entire range of dynamic usage that's possible in Python, or they will have to explicitly say that certain programming styles are not supported.

A Note on Adaptation

PEP 246 (Object Adaptation) has lots of good things to say about adaptation that I won't repeat.
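For readers who haven't followed PEP 246, the core of the protocol is small enough to sketch here. This is a much-simplified rendition (the real PEP also covers Liskov violations, error-handling details, and more), but it shows the two hooks -- __conform__ on the object and __adapt__ on the protocol -- that everything below relies on.

class AdaptationError(TypeError):
    pass

_missing = object()

def adapt(obj, protocol, alternate=_missing):
    # Simplified sketch of PEP 246's adapt():
    # 1. the object may know how to conform to the protocol;
    conform = getattr(obj, '__conform__', None)
    if conform is not None:
        result = conform(protocol)
        if result is not None:
            return result
    # 2. the protocol may know how to adapt the object;
    adapt_hook = getattr(protocol, '__adapt__', None)
    if adapt_hook is not None:
        result = adapt_hook(obj)
        if result is not None:
            return result
    # 3. otherwise fall back on an isinstance check (a duck-typing
    #    check or a registry could go here instead).
    try:
        if isinstance(obj, protocol):
            return obj
    except TypeError:
        pass                       # protocol is not a class
    if alternate is not _missing:
        return alternate
    raise AdaptationError("cannot adapt %r to %r" % (obj, protocol))

assert adapt(3, int) == 3          # plain isinstance fallback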

When deferring to adapt() for all our type checking needs, we could give built-in types like int and list a suitably wide meaning. For example:

def foo(a: int, b: list) -> list:
    return b[:a]

This should accept a long value for a, because (presumably) adapt(x, int) returns x when x is a long; and it should accept any sequence object for b. But what if I have a sequence object that doesn't implement sort()? That method isn't used here, but it's defined by the built-in list type, so won't the default duck adaptation to list fail here?

There are a few interesting ideas here (e.g. Eiffel conformance), but in practice we'll likely end up declaring a bunch of standard interfaces that finally define carefully what it means to be an integer, sequence, mapping, or file-like object (etc.), and we'll be writing things like this instead:

def foo(a: integer, b: sequence) -> sequence:
    return b[:a]

(But most of the time you'll still be writing just this:

def foo(a, b):
    return b[:a]

:-)


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 6, 2005 6:07 PM
Posted by: Phillip J. Eby    Posts: 28 / Nickname: pje / Registered: Dec, 2004
+1.

The interface portion will need some fleshing out, as far as whether people will be able to define their own interface types, or whether it's even 100% necessary to have a special interface type at all. For "pure" PEP 246 purposes, plain old abstract classes would suffice, and no special syntax is needed for that. (And by implication, neither is any of the potential bickering over what features a built-in interface type should have. :)

Then there are little things like the distinction between instance.__implements__ and class.__implements__ which bedeviled both Zope and Twisted's implementations at one time or another. (Not that they do now; but it looks like your proposal might be based on Zope interfaces circa PEP 245, which is a few generations ago for Zope's interface implementation, and PEP 245 hasn't been kept up to date.)

But I think all of the open issues are quite resolvable, and even without an official interface type, this sounds at least like a blessing on PEP 246 - which means that maybe I should polish off the C implementation of 'adapt()' I have floating around in PyProtocols and prepare a patch for 2.5. :)

An interesting question there is whether __conform__/__adapt__ should actually become type slots, not just special names. Becoming slots would probably be beneficial for performance if adaptation is widely used. On the other hand, the non-slot nature of __conform__ currently allows PyProtocols to define per-instance adapters for modules and functions, to indicate that they implement a particular interface or to define how to adapt to that interface. Zope also does per-instance interface declarations like these, but using a different mechanism. (Because Zope considers "implements" to be the fundamental concept and "adapts" as derived from that, while PyProtocols goes the other way and says that "implements" is just a special case of "adapts to" where no adapter is needed.)

Anyway, I think these are all *good* questions to have, because I think they're reasonably answerable ones in the near term. :)

Btw, my condolences on being misunderstood/flamed; if it's any consolation, I think it's inherent to blogging, as I have been discovering in the last few months. The bigger your audience, the more likely that you will be misunderstood, misquoted, and flamed for things you didn't even say, or get figuratively called on the carpet for ideas you were just exploring but people took as some kind of gospel proclamation. On the other hand, you'll also get insightful helpful comments from quarters you never expected. Welcome to the internet! :)


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 6, 2005 7:23 PM
Posted by: D H    Posts: 13 / Nickname: dug / Registered: Jul, 2003
Yes, adding built-in support for adaptation (PEP 246, http://www.python.org/peps/pep-0246.html) will be great, especially if it could come as soon as Phillip suggested.

I hope you will consider the syntax again though. Notice this sample, posted earlier:

def draw(items: iter, canvas):
    for item: Shape in items:
        item.draw(canvas)

Your eyes stop at each colon, which isn't a good thing in this case.


The "as" keyword was proposed for adapation almost a year ago:
http://mail.python.org/pipermail/python-dev/2004-February/042786.html
and this was the syntax later adopted by the boo programming language.

Here is the sample again:

def draw(items as iter, canvas):
    for item as Shape in items:
        item.draw(canvas)


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 6, 2005 8:00 PM
Posted by: Phillip J. Eby    Posts: 28 / Nickname: pje / Registered: Dec, 2004

> I hope you will consider the syntax again though. Notice
> this sample, posted earlier:
>
> def draw(items: iter, canvas):
>     for item: Shape in items:
>         item.draw(canvas)
>
> Your eyes stop at each colon, which isn't a good thing in
> this case.


He's not proposing it for arbitrary variable bindings; just instance attribute definitions in class bodies. Not for class attributes (unless you put them on a metaclass), and not for local or global variables. Just functions, methods, and instance attributes. These are the only places where it makes sense to add them; in normal code 'adapt(x,IFoo)' or 'IFoo(x)' suffices to indicate your intent, and it's not going to give a type inferencer any new problems to solve that it doesn't already have.

(The latter syntax, btw, is a shortcut currently used in Twisted, Zope, and PyProtocols for 'adapt(x,IFoo)' when 'IFoo' is an interface object.)
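For readers unfamiliar with that shortcut, a toy sketch of the idea: an interface's metaclass overrides __call__ so that calling the interface means adapting to it. The adapt() stub and the names are assumptions, not the actual Zope/Twisted/PyProtocols code, and overriding __call__ means the interface can't be instantiated normally (which is fine for a pure interface).

def adapt(obj, protocol):          # stand-in for a PEP 246 adapt()
    return obj

class InterfaceClass(type):
    # Calling an interface adapts the argument to it,
    # so IFoo(x) is shorthand for adapt(x, IFoo).
    def __call__(cls, obj):
        return adapt(obj, cls)

IFoo = InterfaceClass('IFoo', (object,), {})   # a do-nothing example interface

x = [1, 2, 3]
assert IFoo(x) is x                # pass-through with the stub adapt()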


No syntax debate, please Posted: Jan 7, 2005 8:00 AM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
Doug Holton: "I hope you will consider the syntax again though."

In my first two posts I tried to discourage a syntax debate because it tends to never end until I break the tie, so I might as well just stick to my own preference and avoid the whole hullabaloo. :-)

But since 'as' keeps being proposed: the problem with using 'as' is that Python already uses 'as' with a very different meaning: like in SQL, "import X as Y" is a local renaming of X to Y. Using the same keyword for type declarations is confusing; it would also preclude adding optional type declarations to imports in the distant future (not an entirely unreasonable extension).

Besides, I'd rather inherit from Pascal than from VB. :-)


Re: No syntax debate, please Posted: Jan 7, 2005 8:45 AM
Posted by: Cameron Aycock    Posts: 1 / Nickname: cameron / Registered: Jan, 2005
I am a Python neophyte. I have been researching, and am genuinely enthusiastic about, Python for the past month. I am a professional Delphi developer, and have used Java, C/C++, x86 asm, BASIC, etc. for nearly 10 years.

I have one question and possible suggestion:

One of the things that I REALLY like about Python as I am exploring it, and ultimately what attracted me to it, is the common-sense nature of the language. The main turning point was indentation-based scoping. A lot of developers hear this and freak out, but the more I thought about it, the more it just made sense. Humans read the scope that way; why not require the compiler to do the same? I love it.

I am not entering the debate about putting OPTIONAL static typing in the language; there are FAR more people with much more experience in that arena. I hope it stays optional. I do have an idea about implementation that, on the surface, seems "pythonic".

Most developers use a prefix notation when declaring a variable of a certain type. Why not use this as part of the language, as indentation scoping is? I don't know how this would affect function signatures and such, but just ponder:

iSomeVariable: int
or
def gcd(a: int, b: int) -> int:

vs

def int_gcd(int_a, int_b)
or
int_SomeVariable

Obviously int_ could be anything ... i_ ... but I didn't want to promote a cryptic syntax.

Just musings from a Python outsider looking in.


Re: No syntax debate, please Posted: Jan 7, 2005 9:44 AM
Posted by: Phillip J. Eby    Posts: 28 / Nickname: pje / Registered: Dec, 2004
> Why not use this as part of the
> language, as indentation scoping is?

For one thing, it would disallow the use of type expressions, like "sequence[integer]". The current proposal still allows this, if you implement your own parameterized interfaces.


Re: No syntax debate, please Posted: Jan 7, 2005 10:24 AM
Posted by: Daniel Fackrell    Posts: 1 / Nickname: intchanter / Registered: Jan, 2005
> Most developers use a prefix notation when declaring a
> variable of a certain type. Why not use this as part of the
> language, as indentation scoping is? I don't know how this
> would affect function signatures and such, but just
> ponder:
>
> iSomeVariable: int
> or
> def gcd(a: int, b: int) -> int:
>
> vs
>
> def int_gcd(int_a, int_b)
> or
> int_SomeVariable
>
> Obviously int_ could be anything ... i_ ... but I didn't want
> to promote a cryptic syntax.
>
> Just musings from a Python outsider looking in.

I've been following threads in several locations about this, but have been dismayed that what seems to me to be an "obvious way to do it" hasn't yet been mentioned. Maybe I just missed it?

def foo(int(a, b), str(c), list(d)):

This has the benefits that any type or class could be specified, it doesn't mean anything in the language yet (tested on 2.3.4 and 2.4), reads more like natural English (I don't yet know Dutch ;), evokes a similarity to a factory call, and can reduce duplication of type specifiers.

My ignorance:
- implementation
- a way to extend to specifying return types
- extending to more complex specification (though this could be worked around by using these throughout a class)

I do apologize for mentioning syntax, and I promise not to debate it or even post on this article again unless asked.


Re: No syntax debate, please Posted: Jan 7, 2005 10:26 AM
Posted by: Nick Coghlan    Posts: 13 / Nickname: ncoghlan / Registered: Dec, 2004
> But since 'as' keeps being proposed: the problem with
> using 'as' is that Python already uses 'as' with a very
> different meaning: like in SQL, "import X as Y" is a local
> renaming of X to Y.

I've been a proponent of 'as' for adaptation myself, but recently realised what you describe here: that it conflicts with the 'as' for name binding that import statements already use, and that Python 3.0 except clauses are expected to use. (I realised this in the context of a discussion where someone suggested a *third* distinct meaning for 'as'.)

So, Guido's syntax it is for now, since picking an adaptation syntax is well into the future. I still dislike colons in expressions, though :)


Re: No syntax debate, please Posted: Jan 7, 2005 12:38 PM
Posted by: D H    Posts: 13 / Nickname: dug / Registered: Jul, 2003
> But since 'as' keeps being proposed:

Actually I see Paul Prescod proposed using "as" over 5 years ago when you were debating adding static typing to Python 1.6: http://www.prescod.net/pytypes/#AEN231

And you and Greg Stein gave the same response against "as" here:
http://www.python.org/~guido/types-sig.html
http://www.lyra.org/greg/python/typesys/type-proposal.html

Sorry, I wasn't aware there was already a debate about this back then.

Here is Greg Stein's argument against "as":

"""
I have been maintaining that the word "as" is the wrong semantic for declarations or the type-assert operator. For example, consider the following code fragments:

def foo(x as Int):
    ...
foo("123")

z = "123"
y = z as Int


In the first example, does the "as" mean that "123" will be converted to an integer and then bound to x? In the second example, will z be converted to an integer before its assignment to y?

I propose the use of a colon for all declarations. For the type-assert operator, I propose the exclamation point. If a word is required (by Guido :-) for the type-assert operator, then "isa" is closest to the proper semantic.
"""

So his proposal is for this syntax:

def foo(x : Int):
    ...
foo("123")

z = "123"
y = z ! Int #or y = z isa Int


(the answer to both his hypothetical questions is yes, but they would throw an error because you cannot cast a string to an integer. "isa" is used in other languages to check the type of an object at runtime "if x isa string: ...")


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 6, 2005 8:35 PM
Posted by: Brett C.    Posts: 4 / Nickname: drifty / Registered: Jan, 2005
First off, I am disappointed people flipped out so much over your blog posts. People just didn't seem to grasp that they were just brain dumps of your ideas. They were not PEPs; they had not been thrown to python-dev to be torn apart, nor officially presented to the Python community to comment upon. Hopefully people will come to realize this and let up a little, so that you can at least share any and all ideas, reasonable or not, with us all.

Anyway, on to the topics covered:

- Argument and return type declarations

I already liked the idea and this just keeps it going. Moving it explicitly to use 'adapt' is great and making it all run-time is also good since that will help discourage extraneous usage.


- Attribute declarations (maybe)

I think these are going to be important if thorough coverage of type checking is wanted. If people go with heavy OOP design then checking function signatures will not be enough; checking instance attributes will definitely be needed.


- Interface declarations

Once again I was already happy with what you were initially laying out and this version is still good with me.

I think once we have a stdlib module with common interfaces for sequences, iterables, iterators, lists, ints, etc. these will be used extensively with run-time type checking and help alleviate defensive hasattr checks and even EAFP code by doing it at the entry and exit points of methods.


- Design by contract (maybe)

I say give a decorator a shot (we should probably start thinking about a decorator collection module in the stdlib, huh?). The only reason I favor that over a metaclass is that reusable conditions then become possible. Some people might come up with auto-generated conditions and thus want to be able to pass those in to a decorator rather than wrap that generation in an explicit method definition.


And I love the new NIMPY term. =)


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 1:25 AM
Posted by: Mark Williamson    Posts: 14 / Nickname: mjw / Registered: Jun, 2003
I have to say that proposing controversial changes to Python and then complaining when people react is a little disingenuous, especially when using an emotive phrase like NIMPY.

I have a huge problem with adding typing to Python. Python's strength is its agility and its dynamic behavior. Adding code to essentially help the compiler increases the amount of code I write and decreases the adaptability of the code (if, as I have often done, I decide that the way I pass values around has changed, I will now need to go round and change types, etc. etc.).

You may argue that this is all optional. I think that this is also a very disingenuous argument - it might be optional in the next version of Python, but as people start to use it, it will be required. Style guides will recommend it, newcomers from other languages will use it out of habit, and like design patterns and over-engineering in Java it will become the culture of the language.

The question that has to be asked is: where is all this going? Is Python stepping up to the plate to become the next Java (or C#) and take on enterprise culture? Or is it a slick and different way of doing things that gives a real edge?

Having worked in many languages for more years than I care to remember, it seems to me that language culture is just as important as the language itself. Part of my problem with Java (in which I have designed and built a number of enterprise-scale products) is that the culture that guides it assumes the programmer is essentially stupid - and that he or she needs to be protected from this. This ends up with a language that can only be efficiently written in an IDE, which in turn protects you from the quite considerable amount of work (the "help": types, structure, etc.) needed to protect the programmer from themselves.

Python is *not* like this at the moment. It assumes that its practitioners can think for themselves, and in return you get to write less code, which you can create faster. I have seen no evidence, either from direct experience or from research, that shows this produces less production-ready code than Java. In fact my personal experience (which is obviously not real evidence) is that it's the other way round.

This is not just a "not in my python" argument: I am embarking on a big product soon - I have currently chosen Python because of the edge it will give our company. I have to say I am now worried about this choice - if python goes down this route and I loose the commercial edge that writing in Python gives us then I need to rethink our whole strategy.

I've been doing this far too long not to recognize warning signs when I see them.


Re: Stopping the Flames! Posted: Jan 7, 2005 4:24 AM
Posted by: Michael Chermside    Posts: 17 / Nickname: mcherm / Registered: Jan, 2004
Hey, Guido... I just wanted to say two things. First of all, this sounds pretty good! I'm still having difficulty fully grokking how adaptation will work. Yes, I see the obvious uses, but once those are in place, how will it influence people? Will people feel forced to use adaptation? How will it alter their behavior? But even allowing for such reservations, it's pretty clear that adaptation is useful, and this syntax would make it quite useful. It seems to me that you started out with a wide and ambitious set of ideas and over the course of a week managed to pare it down to the simplest and best ideas at the core. (That doesn't preclude adding more ideas later, just starting with the core.)

The second point I wanted to make is that this is a pretty good way of working. I want to publicly THANK you for putting up these postings, and for putting up with the flames and complaints that resulted. It may be an annoying process, but it's a productive one. Please don't let the flames scare you off.

(PS: I've got some other thoughts, particularly around the definitions of standard types for Python, but I think perhaps I'll wait until there's a PEP before chiming in with those.)


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 7:01 AM
Posted by: Nick Coghlan    Posts: 13 / Nickname: ncoghlan / Registered: Dec, 2004
> Most of these (except for
> 'where' clauses) can be added back later without
> introducing new syntax

Just to demonstrate this point, Guido's previous post, and some helpful feedback I got from PJE led me to come up with the following classes:
class strict(object): # Explicit type check
    def __init__(self, face):
        object.__init__(self)
        self.face = face

    def __adapt__(self, obj, params=None):
        obj = adapt(obj, self.face)
        # True strictness would need a check that disallows
        # instances of subclasses in the next line
        if not isinstance(obj, self.face):
            raise AdaptError("Strict adaptation requires instance")
        return obj


class any_of(object): # Interface union
    def __init__(self, faces):
        object.__init__(self)
        self.faces = faces

    def __adapt__(self, obj):
        for face in self.faces:
            try:
                return adapt(obj, face)
            except AdaptError:
                pass
        raise AdaptError("Unable to adapt to any interface")


class all_of(object): # Interface intersection
    def __init__(self, faces):
        object.__init__(self)
        self.faces = faces

    def __adapt__(self, obj):
        for face in self.faces:
            obj = adapt(obj, face)
        return obj

class each_of(object): # Interface Cartesian product
    def __init__(self, faces):
        object.__init__(self)
        self.faces = faces

    def __adapt__(self, seq):
        return tuple(adapt(x, face) for x, face in zip(seq, self.faces))

class any(object):
    def __adapt__(self, obj):
        return obj

any = any()

Hopefully what these are for is fairly self-explanatory.

Parameterised types can also be done without any new syntax. However, Guido's suggested __getitem__ method for type (which looks like new syntax, even though it isn't really) makes them a lot easier to use. Using list and dict as examples:
class parameterised_face(object):
    def __init__(self, face, *params):
        # Hopefully the next line can be replaced with
        # something based on __signature__
        if face.__adapt__.func_code.co_argcount < 3:
            raise TypeError("Unparameterisable interface")
        self.face = face
        self.params = params

    def __adapt__(self, obj):
        return self.face.__adapt__(obj, *self.params)

class type(object):
    # Rest of type's definition. . .
    def __getitem__(self, params):
        if isinstance(params, tuple):
            return parameterised_face(self, *params)
        else:
            return parameterised_face(self, params)

class list(object):
    # Rest of list's definition. . .
    def __adapt__(self, obj, item_face=None):
        try:
            obj = list(obj)
        except StandardError, ex:
            raise AdaptError(ex)
        if item_face is not None:
            return [adapt(x, item_face) for x in obj]
        return obj

class dict(object):
    # Rest of dict's definition. . .
    def __adapt__(self, obj, key_face=None, value_face=None):
        try:
            obj = dict(obj)
        except StandardError, ex:
            raise AdaptError(ex)
        if key_face is None:
            return obj
        if value_face is None:
            return dict((adapt(k, key_face), v) for k, v in obj.iteritems())
        return dict((adapt(k, key_face), adapt(v, value_face)) for k, v in obj.iteritems())

adapt(s, list[int])
adapt(d, dict[int])
adapt(d, dict[int, str])
adapt(d, dict[any, str])


Obviously, none of the above has actually been tested, but it gives the general idea. . .


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 7:04 AM
Posted by: Michael Hudson    Posts: 8 / Nickname: mwh / Registered: Jul, 2003
This seems a reasonable proposal. Calling it "Optional Static Typing" doesn't seem especially accurate, though :)

As for your previous posts being misunderstood, I thought your first post on the topic was pretty hard to understand! That's not to say people were justified in seeing all their pet hates in it and ranting, but I certainly wasn't sure what your real goal was.

I don't really see the point of attribute declarations. Something about the auto-adaptation of return values strikes me as a bit odd (not sure just why, though).

And also, the real issue is clearly going to be the details of interfaces and adaptation. I don't know so much about that, but I guess the twisted and zope crowd, at least, do by now...


implements vs "subclassing" Posted: Jan 7, 2005 7:41 AM
Posted by: Matt Goodall    Posts: 1 / Nickname: mg / Registered: Jan, 2003
I really like the direction typing in Python is heading, with the focus on adaptation.

The one issue I have is about the proposed syntax of "subclassing" for signalling that an interface is implemented by a class. I prefer something more like the zope.interface style:


class Foo(Base):
    implements(IFoo)


or even


class Foo(Base) implements(IFoo):
    pass


(although I prefer the first version).

implements() makes it clear that Base is a class and IFoo is an interface. This is especially true if the interface name does not begin with an 'I'.

I also suspect that API documentation generators that only parse the source code (i.e. don't walk the AST) will have a much easier time with implements(). Unless IFoo is already known to be an interface, the parser has no way of knowing what to do with it and will probably have to assume it's a class. If IFoo is listed inside implements() it is unambiguous.


Re: implements vs "subclassing" Posted: Jan 7, 2005 7:52 AM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
You have a point about doc generators, but I don't know if there are any that actually use this -- when I worked for Zope they were importing the modules AFAICT.

I don't mind adding new syntax, but I'd like to avoid introducing two new reserved words. (The implements(...) function uses a horrible hack which I don't want to standardize.)


Re: implements vs "subclassing" Posted: Jan 7, 2005 1:02 PM
Posted by: Blake Winton    Posts: 4 / Nickname: bwinton / Registered: Jan, 2005
First, a syntax suggestion, because I can't resist. Following the:
def foo( a, b ) -> int:
    pass

idea, how about:
class MyClass( object ) -> IMyInterface1, IMyInterface2:
    pass

You'll already need to reserve the "->" for return value typing, and the syntax is invalid today. (Also, the interfaces are kind of like the type of the class's constructor... Okay, so that's a bit of a stretch.)

Second, for me at least, the scariest thing about your first two posts was their complexity. Python feels very simple to use currently, and the things you were talking about are inherently complicated. There are a lot of twisty edge cases, and if you start trying to think about and plan for all of them, there is a very large chance of vastly complicating the language. I think you should be allowed to think about them, and even post some thoughts on them, but surely you can understand my worries. This post, on the other hand, seemed to get much closer to the simplicity I've come to know and love (perhaps only because it ignored the edge cases).


Re: implements vs "subclassing" Posted: Jan 11, 2005 3:00 AM
Posted by: Jim Fulton    Posts: 3 / Nickname: j1m / Registered: Jan, 2005
I would really hate to see subclassing used, because it mixes concerns. Subclasses provide implementation. Interfaces provide specification. I think it would be very confusing for the reader of a class definition to mix base classes with interfaces.

> I don't mind adding new syntax, but I'd like to avoid
> introducing two new reserved words. (The
> implements(...) function uses a horrible hack which I
> don't want to standardize.)

I understand your distaste; however, this has worked out very, very well from a readability point of view. So much so that we've recently added an adapts function that uses the same trick:

class FooBar:

    adapts(IFoo)
    implements(IBar)

    ...

I and others find this very readable. I wonder if there is some way, in the long term, to make this or something close to it work without the hack.

Or, perhaps this is a job for class decorators:

@adapts(IFoo)
@implements(IBar)
class FooBar:
    ...

But please don't mix interfaces and base classes.


Re: implements vs "subclassing" Posted: Jan 11, 2005 4:04 AM
Posted by: Ka-Ping Yee    Posts: 24 / Nickname: ping / Registered: Dec, 2004
> I would really hate to see subclassing used because it
> mixes concerns. Subclasses provide implementation.
> Interfaces provide specification.

Okay, i hear your concern. I'm trying to raise an issue that seems worth discussing, but i don't intend to claim that i have the only or the best answer.

I agree with you that interfaces are for specification. The question is, how does an interface specify the relationship of one method to another? For example, how would an interface say, "In order for your class to properly implement the mapping interface, its get(key, default) method should behave like __getitem__(key), except that if the key is not present it should return default instead of failing."

There are a few ways one might do this, of which code in the interface is only one. Putting code in the interface ensures that by default the methods will have the expected relationship. Alternatively, you could have a mechanism for the interface to specify an associated mix-in class and have implementations automatically inherit from that class. Or instead of automatic inheritance you could just give implementors a way to say "I want the default mix-in". Or you could just establish a convention for specifying the mix-in in the interface documentation, and let the implementor figure it out. Or you could establish a naming convention that relates the name of the mix-in to the name of the interface. Or you could explain the expected behaviour of all the extra methods in the docstring for each method. Or you could decide this isn't worth it, and do nothing.

Anyway, i'm sure there are other possibilities. I just wanted to see if this is something anyone cares about.

Of greater practical concern, i think, is the scenario: Your function expects an argument that behaves like a mapping. You would like to have the convenience of using the expanded mapping interface (e.g. get(), setdefault(), etc.). How do you do this safely? Ideally, you would like your function to work with previously written mappings that might not implement the entire expanded interface.

Specifying the expanded API in the interface or providing a mix-in are ways of getting at this problem on the implementation end. On the client end, it would be nice to have a mechanism to widen an object's interface. I'm curious to know whether such a mechanism would be interesting to you.

(Without any additional mechanism, inheritance doesn't help on the client end; you would have to write a separate class that expands the API by delegation, like UserDict. Would interfaces typically come in three parts, like dict, UserDict, and DictMixin?)


Re: implements vs "subclassing" Posted: Jan 11, 2005 8:23 AM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
> The question is, how does an interface specify the
> relationship of one method to another? For example, how
> would an interface say, "In order for your class to
> properly implement the mapping interface, its
> get(key, default) method should behave like
> __getitem__(key), except that if the key is
> not present it should return default instead
> of failing."

This is an interesting problem, but scratches only the surface of a bigger issue, which touches on the associated invariants of an interface. (For example, after x.append(y), len(x) has increased by one, and x[len(x)-1] is y, etc.)

I don't know of any realistic programming language that even begins to touch this one, and I think it's an interesting Ph.D. topic, but not something we should try to solve today by adding semantics to Python's interfaces.

Mathematically, some simple data structures like sets, lists and dicts (and I suspect certain kinds of trees) can probably be specified with extreme rigor, but that would (a) exclude a lot of interesting implementations that cheat in some way (I can imagine all sort of useful mappings that violate some dict invariants) and (b) be impossible to validate. (Even in Java this would be impossible, and Python is a lot more dynamic. I expect one would quickly encounter the halting problem if one tried.)

What happens in practice is that you should trust that something is a mapping when it claims to conform to the mapping interface, and if it breaks, too bad.

I believe that Zope uses so-called "marker" interfaces which have no methods but are intended to convey to the user that an object has certain semantics that aren't expressible in method signatures. This, and documenting intended semantics, is IMO sufficient.

So, with respect, I ask you again to give this up. :-)


Re: implements vs "subclassing" Posted: Jan 11, 2005 5:05 AM
Posted by:    Posts: 1 / Nickname: hdima / Registered: Jan, 2005
> > I don't mind adding new syntax, but I'd like to avoid
> > introducing two new reserved words. (The
> > implements(...) function uses a horrible hack which I
> > don't want to standardize.)
>
> I understand your distaste, however, this has worked our
> very very well from a readability point of view. So much
> so, that we've recently added an adapts function that uses
> the same trick:
>

> class FooBar:
>
> adapts(IFoo)
> implements(IBar)
>
> ...
>

> I and others find this very readable. I wonder if there
> is some way, in the long term to make this or someting
> close to it work without the hack.

Just an idea:

class FooBar(adapts(IFoo), implements(IBar)):
    pass

The metaclass should know about adapts and implements objects.


Re: implements vs "subclassing" Posted: Jan 11, 2005 8:07 AM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
> class FooBar(adapts(IFoo), implements(IBar)):
>     pass

I like this, not just because it means that adapts() and implements() can be builtins rather than keywords, but also because it can be extended to other functionality without requiring adding even more keywords.

(Come to think of it, that was, subconsciously, my main reason for disliking implements as a keyword: there are other relationships between classes and interfaces and requiring a keyword for each relationship feels like a waste of keywords, since the keywords have no use elsewhere in the syntax.)
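A rough sketch of how that could work without the hack: adapts() and implements() return throwaway marker classes, and a metaclass strips the markers back out of the bases and records the declared interfaces as class attributes. All names here are illustrative, not an existing API.

class DeclMeta(type):
    # Strip adapts()/implements() markers out of the bases and record
    # the declared interfaces as class attributes instead.
    def __new__(mcl, name, bases, namespace):
        declared = {'adapts': [], 'implements': []}
        real_bases = []
        for base in bases:
            kind = getattr(base, '_decl_kind', None)
            if kind in declared:
                declared[kind].extend(base._decl_faces)
            else:
                real_bases.append(base)
        namespace.setdefault('__adapts__', tuple(declared['adapts']))
        namespace.setdefault('__implements__', tuple(declared['implements']))
        return type.__new__(mcl, name, tuple(real_bases) or (object,), namespace)

def _declaration(kind, faces):
    # Build a one-off marker class carrying the declared interfaces.
    return DeclMeta('_decl', (object,), {'_decl_kind': kind, '_decl_faces': faces})

def adapts(*faces):
    return _declaration('adapts', faces)

def implements(*faces):
    return _declaration('implements', faces)

class IFoo(object): pass
class IBar(object): pass

class FooBar(adapts(IFoo), implements(IBar)):
    pass

assert FooBar.__adapts__ == (IFoo,)
assert FooBar.__implements__ == (IBar,)
assert FooBar.__bases__ == (object,)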


A few responses Posted: Jan 7, 2005 7:48 AM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
Mark Williamson: "I have to say proposing controversial changes to Python and then complaining when people react is a little disingenuous. Especially by using an emotive phrase like NIMPY."

Sorry to sound defensive, but you're still missing the point. I was (and still am) thinking out loud and (I thought) made that pretty clear. I was hoping to spur people's thoughts. Instead, several folks (e.g. Chris Petrilli) wrote their own blog posts representing my post in the worst possible light and then tearing it apart to look cool.

"Adding code to essentially help the compiler increases the amount of code I write, decreases the adaptability of the code"

Did you read and understand the proposal? It is not there for helping the compiler, it is a concise way to cause adaptation to happen automatically at run-time. Adaptation is something that large projects like Zope and Twisted already use with great success.

Michael Hudson: "Calling it "Optional Static Typing" doesn't seem especially accurate, though :)"

I know; I figured it would be better to keep it in the title for continuity with the previous two posts. The next installment may be titled "Optional (Not So) Static Typing" :-)


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 10:40 AM
Posted by: Alex Martelli    Posts: 3 / Nickname: aleax / Registered: Jan, 2005
" +1 " doesn't even come _close_ to describing my delight in this proposal, Guido -- it's just wonderful. As you asked, I'm not even _looking_ at the specific syntax sugar (if I were, I might notice that the colon is even more overloaded in Python than the keyword 'as' -- so, fortunately, I'm not;-); I'm focusing on the semantics.

One great aspect of this proposal is that, with slightly clumsier syntax to be sure, we can start experimenting with it right now, just about -- using a decorator to stand for declarations of arguments, as you suggested, and maybe a metaclass plus some kind of marker to make interfaces, e.g.:

class MetaInterface(type):
    def __new__(mcl, cnam, cbas, cdic):
        if cbas and cbas[0] is not Interface:
            """ this is a class which implements interfaces, deal with that """
        else:
            """ this is an interface, deal with that """

class Interface:
    __metaclass__ = MetaInterface

so, instead of the future syntax:

interface IFoo(IBar, IBaz): ...

we'd have for these experiments to code:

class IFoo(Interface, IBar, IBaz): ...

with Interface as the first base class. A few simple conventions such as these would let us assemble a "coalition of the willing" to do the hard work of sketching out "interfaces" (I'd rather call them "protocols" or "concepts" if they include syntax and semantics and pragmatics -- but if they're just syntax, i.e. method names and signatures, "interfaces" is just fine, I think).

Do you think I should include these issues in the long-promised, not-yet-delivered rewrite of PEP 246? Having 'adapt' work with ``interfaces'' only would make a big difference, etc, etc...


Alex


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 10:42 AM
Posted by: Stephen Birch    Posts: 3 / Nickname: sgbirch / Registered: Jan, 2005
Although I have been watching python with great interest for about 5 years now, my corp has only just started to actually use it.

I have to say that the inability to optionally specify type in function calls has been one of the biggest show-stoppers.

Thank you so much for starting down the path of optional static typing. Python just keeps getting better.

Steve


Re: No syntax debate, please. Posted: Jan 7, 2005 1:02 PM
Posted by: Ka-Ping Yee    Posts: 24 / Nickname: ping / Registered: Dec, 2004
Sigh. I guess this goes to show you can't avoid a syntax debate no matter what you do.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 1:29 PM
Posted by: Ka-Ping Yee    Posts: 24 / Nickname: ping / Registered: Dec, 2004
I like this proposal a lot. The concepts make a lot of sense to me.

1. Argument types and return types: Perfect. Those are exactly the semantics i'd want.

One question: could you clarify what specifying types for *args and **kwds would mean? I must admit this possibility didn't even occur to me. If a function signature says

def foo(a: int, b: str, *c: int, **d: str):

does that mean c will be a tuple of ints, i.e. each non-keyword argument after the second will be adapted to an int? And does it mean all the values in the dictionary d will be adapted to strings?

2. Attribute declarations: Clever! Also something i hadn't considered, but i think it's a good idea. It will be very handy to have attributes that obey known protocols.

3. Interfaces: Looks great to me. The issue of standardizing protocols isn't addressed, though, and i think it's worth looking at (see my next post).

4. Design by contract: I think your suggestion is workable, but i agree with you that it shouldn't become part of Python until something better is found. Let me just make a couple of points: you described a dichotomy between statement-based and expression-based contracts, with the two options being too cumbersome and too limited respectively.

I don't think it's really an exclusive choice. With a statement-based design, you must define contracts in separate blocks, so the contracts are cumbersome no matter what. But with an expression-based design, you have the option of writing your contracts in expressions or statements — you can always put a block of statements in a reusable function and call it. So using expressions doesn't prevent you from writing more complex guards.

You do have a point that guards don't handle conditions involving multiple arguments well, in terms of violating TOOWTDI. On the other hand, i think this flexibility is a small matter (about the same magnitude as Python letting you put the arguments in any order you want instead of enforcing that the arguments have to be in alphabetical order), and it's outweighed by the simplicity advantage of using one mechanism for both adaptation and contracts instead of introducing a whole separate mechanism for contracts.

Oh, and about defining True.__adapt__(x) to return x — i don't buy your claim that anyone would "just use True while waiting for something better". Huh?

Part of my brain still longs for an even slightly more general mechanism — a single mechanism allowing aspect-oriented programming, for which adaptation and contracts are both special cases. Nothing beautiful has come to mind yet, though.


Providing standard protocols Posted: Jan 7, 2005 2:07 PM
Posted by: Ka-Ping Yee    Posts: 24 / Nickname: ping / Registered: Dec, 2004
In my wild and crazy fantasies, i wish that Python could do either or both of these things:

1. To implement a class that provides the complete mapping protocol, i only have to write 3-4 basic methods. When i derive my class from a standard Mapping interface, the rest of the methods get filled in for me.

2. I'm writing a function that expects a mapping as one of its arguments. Even if the value passed to my function only implements part of the standard Mapping protocol, i can get the complete protocol filled in by declaring that the argument is to be adapted to a Mapping.

I think that making it effortless to provide standard protocols would be more than just a convenience; it would also make programs more reliable and less error-prone. What i'm not so sure about is the best way to do this.

Strategy 1 (inheritance) is pretty straightforward to use and easy from a language design standpoint: just allow code in interfaces. I'm not sure whether Guido has made a decision about this, but i'm assuming since his post didn't mention it that he's not supporting this idea yet. The inheritance strategy has the advantage of clear semantics, but the disadvantage that you have to rely on your caller to supply an object derived from the correct interface.

Strategy 2 (adaptation) is more flexible from the called function's point of view, because the argument doesn't have to derive from any particular interface. The argument just has to provide implementations for the methods that interface requires (the "basic" methods or what C++ would call the "pure virtual methods"), and then everything will work. The disadvantage is that it's a little more complicated to define a mechanism that will take care of this.

Here's one strawman idea for how to do strategy 2. Suppose that if you define an interface without explicitly writing out its __adapt__ method, it comes with a standard adaptor that does this:
def __adapt__(iface, value):
    if implements(value, iface):
        return value
    if implements(value, pure_virtual_part_of(iface)):
        return iface(value)
    raise AdaptationError

This example assumes that it's possible to flag which methods on the interface are "pure virtual". If we allow code in interfaces, then these would be the methods with no code (or we could allow a token such as "..." to indicate that a method is intentionally declared with a missing body). The line that says pure_virtual_part_of would look at just that part of the interface and make sure that these methods have been implemented; instantiating the interface then fills in the rest of the methods.

I'm not assuming that this is the only or the best way to do this, but it seems to me that combining these two strategies would give us a lot of flexibility, convenience, and resilience.

More generally, it seems to me that one way or another we have to decide what the default adaptor should do. If we try to adapt an object to an interface and the object implements all the methods in the interface even though its class doesn't declare the interface, should the adaptor still fail?

These ideas also led me to a couple of other questions:

Q1. How should interfaces indicate that they require their implementors to implement certain methods?

Q2. Should a missing implementation for a method cause an error when the class that claims to implement an interface is compiled, when the class is instantiated, when an instance of the class is adapted to the interface, or only when a client of the instance tries to call the missing method? Or at some other time?


Re: Providing standard protocols Posted: Jan 10, 2005 3:12 AM
Posted by: Alex Martelli    Posts: 3 / Nickname: aleax / Registered: Jan, 2005
> In my wild and crazy fantasies, i wish that Python could
> do either or both of these things:
>
> 1. To implement a class that provides the complete mapping
> protocol, i only have to write 3-4 basic methods. When i
> derive my class from a standard Mapping interface, the
> rest of the methods get filled in for me.

The time machine seems to have solved this one for you: see UserDict.DictMixin which appears to be exactly what you ask.
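For concreteness, a small example of DictMixin at work (Python 2.3+, matching the era of this thread): supply __getitem__ and keys(), and the mix-in fills in get(), items(), __contains__() and the rest of the read-only mapping API. The SquareMap class is just an illustration.

from UserDict import DictMixin

class SquareMap(DictMixin):
    # Read-only mapping from n to n*n for 0 <= n < limit;
    # only the two basic methods are written by hand.
    def __init__(self, limit):
        self.limit = limit
    def __getitem__(self, key):
        if 0 <= key < self.limit:
            return key * key
        raise KeyError(key)
    def keys(self):
        return range(self.limit)

m = SquareMap(4)
assert m.get(3) == 9 and m.get(10, 'missing') == 'missing'
assert 2 in m and sorted(m.items()) == [(0, 0), (1, 1), (2, 4), (3, 9)]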


Alex


Re: Providing standard protocols Posted: Jan 10, 2005 1:04 PM
Posted by: Ka-Ping Yee    Posts: 24 / Nickname: ping / Registered: Dec, 2004
> > 1. To implement a class that provides the complete mapping
> > protocol, i only have to write 3-4 basic methods. When i
> > derive my class from a standard Mapping interface, the
> > rest of the methods get filled in for me.
>
> The time machine seems to have solved this one for you:
> see UserDict.DictMixin which appears to be exactly what
> you ask.

That's not quite what i'm talking about. I'm aware that this functionality can be brought in with a mix-in class; what i'm suggesting is that this be part of the standard interface. For example, if a Mapping interface is introduced, i'd like it to do this as part of its definition so we can be confident that all implementors have it. More generally, i'd like this to be part of the way interfaces typically work, not just to have a couple of one-off mix-ins in the library for mappings and sequences.

Both of the things i suggested in the original comment, which i guess you could call API-widening-by-interface and API-widening-by-adaptation, are aimed at improving flexibility and resilience by giving the client of an object more assurance that all the methods will be present and have the expected semantics.

Does that make sense?


Re: Providing standard protocols Posted: Jan 10, 2005 1:42 PM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
> Both of the things i suggested in the original comment,
> which i guess you could call API-widening-by-interface and
> API-widening-by-adaptation, are aimed at improving
> flexibility and resilience by giving the client of an
> object more assurance that all the methods will be present
> and have the expected semantics.

-1

Since you keep defending this...

I don't think that anybody who has experience with Pythonic interfaces (e.g. the Zope and Twisted folks) has tried this yet, so I'm reluctant to accept either proposal. The 2nd one can be prototyped within the framework of a PEP 246 implementation; in fact, PEP 246 won't stop any interface from implementing this.

I'm not keen on your proposal for having code in interfaces. It introduces some ambiguity, and you will need to mark which methods are *not* optional (even if it were possible by analyzing all the code to automatically determine this, I think the human reader would be better off if it was explicit). Also, it means the code in the interface can't be used for invariant testing (or else it would introduce even more ambiguity).

Personally, I expect that this will be useful mostly in "toy" examples or for specific backwards compatibility situations, where the adaptation trick may be handy.


Re: Providing standard protocols Posted: Jan 11, 2005 4:14 AM
Posted by: Ka-Ping Yee    Posts: 24 / Nickname: ping / Registered: Dec, 2004
> I'm not keen on your proposal for having code in interfaces.

All right.

> you will need to mark which methods are *not* optional

If i may ask, what's your current thinking on how adapt() will behave with an interface? I wasn't clear about whether it should require the interface to be listed in __implements__, require the presence of all the methods with the correct signatures, either, or both.

At what point (compilation, instantiation, adaptation, call, etc.) do you think it's best to have missing methods cause exceptions?


Re: Providing standard protocols Posted: Jan 11, 2005 8:02 AM
Posted by: Guido van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
> If i may ask, what's your current thinking on how
> adapt() will behave with an interface? I
> wasn't clear about whether it should require the interface
> to be listed in __implements__, require the
> presence of all the methods with the correct signatures,
> either, or both.

This is really a question for PEP 246, not for me, but I believe that it is entirely up to the objects implementing __adapt__ and __conform__. If either wants to cheat, that's fine -- after all, this is Python, and there may be a reason to cheat (maybe the interface is wrong).


Re: Providing standard protocols Posted: Jan 10, 2005 2:06 PM
Posted by: Alex Martelli    Posts: 3 / Nickname: aleax / Registered: Jan, 2005
> > > 1. To implement a class that provides the complete
> mapping
> > > protocol, i only have to write 3-4 basic methods.
> When i
> > > derive my class from a standard Mapping interface,
> the
> > > rest of the methods get filled in for me.
> >
> > The time machine seems to have solved this one for you:
> > see UserDict.DictMixin which appears to be exactly what
> > you ask.
>
> That's not quite what i'm talking about. I'm aware that
> this functionality can be brought in with a mix-in class;
> what i'm suggesting is that this be part of the
> standard interface. For example, if a Mapping

So, basically, you want to make the interface into an abstract class instead, offering "template methods" which call back to a few lower-level "hook" methods -- just like, say, DictMixin does today, except that DictMixin is more flexible. If you have, say, a readonly mapping, you can still use DictMixin to give you many readonly "richer" methods -- if somebody tries to call a read-write "richer" method, it will fail one or two levels down when DictMixin's implementation calls to the __setitem__ (say) which you don't implement (or in which you raise an exception). You can also use DictMixin without claiming to conform to the dictionary protocol. For example, consider shelve -- can it really be said to conform to the dictionary protocol, when it can only accept strings, not arbitrary hashables, as keys? For DictMixin it doesn't matter: thanks to signature-based polymorphism, it can still work just fine to provide higher-level methods by Template DP even under such restrictions.

So, even if Mapping (carefully and painstakingly segregated -- at least -- into readonly and readwrite variants), RandomAddressableSequence (ditto), etc etc, all came with the relevant mixins bundled in, I would still want to have those mixins available in the standard library separately from the interfaces, because pragmatically speaking they're still very useful.

> interface is introduced, i'd like it to do this as part of
> its definition so we can be confident that all
> implementors have it. More generally, i'd like this to be
> part of the way interfaces typically work, not just to
> have a couple of one-off mix-ins in the library for
> mappings and sequences.

Maybe a special mechanism is warranted to let a "pure" interface (where the contents of methods should really represent checks, pre/post conditions, and the like, rather than default implementations) indicate one (or more, see below) "supporting mixins" to be added to classes claiming to implement the interface. I believe that the existence of a good set of pure interfaces and supporting mixins for the typical protocols found in the standard library is more likely to be the determinant of how interfaces will typically be designed, rather than any such "bundling" mechanism, but that's just a guess, and I could be wrong. In any case, the ability to use the mixins separately from the interfaces would remain important, as above indicated.


> Both of the things i suggested in the original comment,
> which i guess you could call API-widening-by-interface and
> API-widening-by-adaptation, are aimed at improving
> flexibility and resilience by giving the client of an
> object more assurance that all the methods will be present
> and have the expected semantics.
>
> Does that make sense?

I think it does, and indeed it prompts me to raise the ante by going back to my original idea, which was to have in Python a closer equivalent to Haskell's typeclasses, rather than to pure interfaces or abstract classes.

A typeclass is more powerful and flexible than an abstract class because it need not specify which methods are abstract, aka lower-level, in other terms exactly which methods need to be supplied by an object to claim compliance (and to possibly get the Template DP implementations of the others in terms of the ones it does supply). As a toy example:

typeclass IWriteable:
def write(self, onething):
self.writelines([onething])
def writelines(self, manythings):
for onething in manythings:
self.write(onething)

this looks like write and writelines are mutually recursive, but that's not what it means: it means that a class claiming to implement IWriteable must supply either write or writelines (it may also supply both, for performance reasons). If it supplies only EITHER one, the other is supplied, Template-DP style, by the typeclass.

If we had typeclasses, then we'd need the metaclass to determine the dependencies at typeclass level, and then, when a concrete class is being built, find out which methods the concrete class supplies and pick the others appropriately (or give an error if the concrete class just doesn't supply a sufficient base set of concrete methods).
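For concreteness, a rough sketch of the kind of metaclass machinery this implies; all names here (WriteableMeta, ListWriter, the _FALLBACKS table) are invented, and this is only an approximation of what full typeclass support would need:

def _default_write(self, onething):
    self.writelines([onething])

def _default_writelines(self, manythings):
    for onething in manythings:
        self.write(onething)

# each fallback implementation, paired with the hook method it relies on
_FALLBACKS = {
    'write': (_default_write, 'writelines'),
    'writelines': (_default_writelines, 'write'),
}

class WriteableMeta(type):
    def __init__(cls, name, bases, ns):
        super(WriteableMeta, cls).__init__(name, bases, ns)
        supplied = [m for m in _FALLBACKS if m in ns]
        if not supplied:
            raise TypeError(name + " supplies neither write nor writelines")
        for missing, (impl, needs) in _FALLBACKS.items():
            if missing not in ns and needs in supplied:
                setattr(cls, missing, impl)

class ListWriter(object):
    __metaclass__ = WriteableMeta
    def __init__(self):
        self.chunks = []
    def write(self, onething):      # the one concrete method supplied
        self.chunks.append(onething)

w = ListWriter()
w.writelines(['a', 'b'])            # filled in by the metaclass
print w.chunks                      # prints ['a', 'b']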

An almost equivalent alternative might be to have an interface be able to specify several mixins depending on the sets of concrete methods supplied by the concrete class which is claiming to implement the interface; this would offer less automatism, more explicitness and control, perhaps more readability, and also accomplish the key purpose of ensuring the mixins are also available for separate use.

I don't really know which of the alternatives on a growing scale of depth and elegance -- provide no special mechanism, provide a special mechanism that can only deal with one mixin, provide one that can deal with several, implement full-fledged typeclasses... -- would give the best bang for the buck. Maybe we should widen this discussion by moving to python-dev...


Alex


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 2:59 PM
Reply to this message Reply
Posted by: Jayson Vantuyl    Posts: 7 / Nickname: kagato / Registered: Jan, 2005
Things That Are Good

I can handle adapt as a function name.

I also like the separate interface syntax well enough to not complain.

Questionables

I would caution against using a colon for type specifiers. The colon is threatening to become Python's curly brace. If there are 18 colons on a line, it will become difficult to read.

What do types on *args and **kwargs mean? Will the internally used tuple and dict be adapted to that type? That's the only thing I can think of that even comes close to being intuitive...

Is it good to evaluate the type expressions at define-time? Could using namespacing to evaluate these save us the later trouble of doing parameterized types?

Serious Issues

I like the interface idea. However, could type specifiers only accept interfaces or types, not classes?

The reasoning goes like this: you can't duck-type integers or strings. They are what they are. This is what types do: they specify which data is core data--it's primitive. Classes are constructed to implement interfaces of one type or another. This is why allowing classes as type specifiers could completely hose duck-typing.

If we type based on core-data, then only a certain type or subtype is acceptable, since you can't "imitate" an integer or a string--only subtype them to adjust behavior or add methods. However, when we specify classes, the case is not so clear-cut. I will now describe my worst nightmare.

Take the situation where I have to use a framework that does this:


def file_mangler(x: myFileHandleClass):
    ...


Now assume myFileHandleClass does something unacceptable in one of the base __init__'s (or worse, in __new__). For example, let's say I'm going to put network support into this framework. Before, I could implement my own class and duck-type it to behave like the myFileHandleClass.

With non-Pythonic type specifiers, I have the problem of having to either implement a subclass of a broken class or the other coder must have the foresight to use an interface, not a class, in the type specifier. Now, I will get an exception, because my duck-typed class cannot be adapted to myFileHandleClass. There is no good way that an adapt function can predict the duck-typing. This MUST be addressed for Python to remain "My Python". Again, it should not be possible to specify that only a certain CLASS may be passed in.

My gut reaction, then, is to only allow types or interfaces to be specified. In most other Statically Typable languages, classes can be specified here. As far as I am concerned, this is what hamstrings their ability to duck-type. A class is not a type, it shouldn't be treated as one. A class has behavior, that's what's important, and that's what an interface indicates.

Also, it is important to support specifying interfaces post-hoc.

Without either of the above, other programmers' mistakes limit my ability to duck-type and clever-glue-class my way out of those mistakes.

Actually, should interfaces be special types? I understand that they wouldn't be useful for actually instantiating data, but perhaps that's what we're looking for. It would be especially useful for bridging the gap created with lists and dicts. Right now, we don't insist that things passed around actually be subclassed from list or dict. We simply duck-type list and dict methods.

This actually has an elegance to it. Our adapt implementation, assuming it attempts to duck-type interfaces, will need to match interface signatures to object signatures. Special cases will be needed for internal interfaces. Additionally, there are some interfaces that are more nebulous than just being an internal type primitive (generators/iterators). If we stash the signature information in the built-in types and have the interface syntax generate abstract types as interfaces, then the adapt information (and adaptation registry) could be encapsulated entirely in the existing type structures.

Look at the utility of the following example:


import types
# Note, this type is not actually instantiated, it just exists to integrate interfaces into the base language

interface RestartableIterator(types.IteratorType):
    def restart():
        ...

class randomNumberGenerator implements RestartableIterator:
    ...

def encrypt(r: types.IteratorType):
    ...


This example means what it says and says what it means. No weird inheritance acrobatics involved to trick it into accepting our class.

Summary

Again, I cannot stress enough, enforcing a certain CLASS is folly. Enforcing a certain TYPE is not (with some informational types provided, such as IteratorType, NumberType). Making interfaces generate types, retrofitting built-in types, making a few informational types, and allowing post-hoc interface specification will make this a feature with real utility.

Self-Criticism

You probably have a different feeling for types than I do. It might be that there is an impedance mismatch between what you envision a type is and what I do.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 3:55 PM
Reply to this message Reply
Posted by: Phillip J. Eby    Posts: 28 / Nickname: pje / Registered: Dec, 2004
> I like the interface idea. However, could type specifiers
> only accept interfaces or types, not classes?
>
> ...snip...
>
> Take the situation where I have to use a framework that
> does this:
>
> def file_mangler(x: myFileHandleClass):
>     ...
>
> Now assume myFileHandleClass does something unacceptable
> in one of the base __init__'s (or worse, in __new__).
> For example, let's say I'm going to put network support
> into this framework. Before, I could implement my own
> class and duck-type it to behave like the
> myFileHandleClass.
>
> With non-Pythonic type specifiers, I have the problem of
> having to either implement a subclass of a broken class or
> the other coder must have the foresight to use an
> interface, not a class, in the type specifier. Now, I
> will get an exception, because my duck-typed class cannot
> be adapted to myFileHandleClass.

Try this:

class DuckType(object):
    def __conform__(self, protocol):
        return self


And then inherit from it. Your DuckType subclasses will happily claim they support any type, even concrete types. For extra credit, expand the definition of __conform__ to actually check for matching public signatures, and only return self if they match.
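One possible sketch of that "extra credit" version, matching on method names only (checking argument signatures with the inspect module is left out for brevity):

import inspect

class DuckType(object):
    def __conform__(self, protocol):
        # Claim conformance only if every public routine of the protocol
        # is present (and callable) on this object.
        for name, member in inspect.getmembers(protocol, inspect.isroutine):
            if name.startswith('_'):
                continue
            if not callable(getattr(self, name, None)):
                return None        # decline; adapt() can keep looking
        return self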

Alternatively, the default __adapt__ of class objects could do the same thing, i.e. check for conformance of the class' signatures and then cache the result for future reference.


> Again, I cannot stress enough, enforcing a certain CLASS
> is folly. Enforcing a certain TYPE is not (with some
> informational types provided, such as IteratorType,
> NumberType). Making interfaces generate types,
> retrofitting built-in types, making a few informational
> types, and allowing post-hoc interface specification will
> make this a feature with real utility.

Allow me to rephrase/redirect: instead of forbidding classes to be used as types, simply treat a class as an *implied* type based on signature. This is actually pretty expensive computationally, but it might be easier to get people to refrain from using concrete classes instead of interfaces if you can convince them it's bad for performance. :)


Re: Serious Issues Posted: Jan 7, 2005 5:13 PM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
Jayson Vantuyl: "There is no good way that an adapt function can predict the duck-typing."

I think PEP 246 actually has a provision for this. Your class that duck-types myFileHandleClass can implement an __adapt__() method that returns self when asked to adapt to myFileHandleClass.

Even if both myFileHandleClass and the class that duck-types it are 3rd party classes that you can't modify, I believe PEP 246 has a provision whereby you can tell adapt how to adapt one to the other by modifying a global table.
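Purely as an illustration of the registry idea (this is not PEP 246's actual API, and every name below is made up), such a table might look like:

_adapter_registry = {}

def register_adapter(cls, protocol, factory):
    # third-party glue code can teach adaptation about classes it doesn't own
    _adapter_registry[(cls, protocol)] = factory

def adapt_via_registry(obj, protocol):
    factory = _adapter_registry.get((obj.__class__, protocol))
    if factory is None:
        raise TypeError("cannot adapt %r to %r" % (obj, protocol))
    return factory(obj)

# e.g., if MyDuckFile already quacks like myFileHandleClass:
#   register_adapter(MyDuckFile, myFileHandleClass, lambda obj: obj)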


Switching adaption on and off Posted: Jan 7, 2005 3:34 PM
Reply to this message Reply
Posted by: Arnold deVos    Posts: 18 / Nickname: arnoldd / Registered: Dec, 2002
I think people will quickly adopt any standard notation for function signatures despite quibbles about the details and fears that it would spoil the language. Reason: people are already inventing such notations and using them in docstrings. It is hard to accurately describe your function to your users in plain english.

Driving dynamic adaption from these declarations makes python more dynamic. The topic should be "Optional Dynamic Features" instead of "Optional Static Typing"!

My question: how can we turn off adaption when it is not wanted but leave it on in places where it is (and where the semantics of the program depend on it)?

Perhaps a per-module switch of some kind?

I know one can always redefine adapt() globally - but that might break some modules that use it. One can always opt out of the adaption protocol by defining a __conform__ method that raises the appropriate exception - but it should be possible to opt out with zero runtime overhead.


Re: Switching adaption on and off Posted: Jan 7, 2005 3:51 PM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
Arnold de Vos: "how can we turn off adaption when it is not wanted but leave it on in places where it is (and where the semantics of the program depend on it)?"

By not using the optional syntax for declaring argument interfaces:

def foo(a, b):
    ...

will not invoke adaptation for its arguments.

I don't think it would be very Pythonic to have a way to let you use the notation while disabling its intended semantics.


Re: Switching adaption on and off Posted: Jan 7, 2005 3:56 PM
Reply to this message Reply
Posted by: Arnold deVos    Posts: 18 / Nickname: arnoldd / Registered: Dec, 2002
>
> I don't think it would be very Pythonic to have a way to
> let you use the notation while disabling its intended
> semantics.

No way to use the notation just for documentation, PyChecker and other static analysis without getting the runtime adaption semantics?


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 7, 2005 6:45 PM
Reply to this message Reply
Posted by: Charles Hixson    Posts: 1 / Nickname: charlesh / Registered: Jan, 2005
+1 on optional static typing (as shown)

I do hope that eventually Python and Pyrex could combine into a single project...and that optimization could be done smoothly in various degrees as needed.

But +1 anyway, even if that's not possible.


+1 on argument declarations also, with the same considerations. I hope it will improve efficiency, I expect it will improve error checking. I see minimal downsides (as long as it's optional).


==0 on Interfaces. I really have no opinion, and can't see why I would use them.


Design by Contract is a really good idea, but I'm not sure what the implementation should be. And I'm not totally certain how the basic idea translates from Eiffel into Python.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 8, 2005 12:13 AM
Reply to this message Reply
Posted by: Roeland Rengelink    Posts: 5 / Nickname: rengelink / Registered: Jan, 2005
I certainly like the wording of current proposal better than the original one. Although I'm not sure that they are in fact different ;).

Maybe the problem is in the examples. Integers are really not very interesting as interfaces (and very interesting as optimization hints). Hence, your gcd example seemed to be about compile-time type-checks and optimization, rather than documentation and/or adaptation.



I have a couple of questions/suggestions.


1. I would expect the following invariant:

x.__conform__(T) is T.__adapt__(x.__conform__(T))

That is, if I (the function writer) say that x needs to be adaptable to T, can x lie about that? Is Python going to enforce the invariant?


2. I was wondering, if this would be more correct:

def f(x:T1) -> T2

is equivalent to:

def f(x):
    x = adapt(x, T1)
    r = f__(x)
    assert adaptable(r, T2)
    return r

I would be very surprised if I saw a function definition that claimed to return a T2, but in fact returned something that only became a T2 after adaptation to T2.


3. Actually, I would expect a function definition to be very concrete about its return type. For example, this would surprise me very much:

def weird(x: list) -> sequence:

This, on the other hand, nicely illustrates the difference between interface and type:

def not_weird(x: iterable) -> iterator

Should Python enforce that argument types are interfaces, while return types are concrete types? (in that case, replace 'adaptable' with 'isinstance' in my previous question/suggestion)


4. Type unions (T1|T2) seem to be out. However, I often return either a T or None from my function. How would I deal with that? Also, I often return tuples. Is (T1, T2) still in? How does adapt(x, (T1, T2)) work?


5. About pre and post conditions. How about using interfaces for pre-conditions? E.g:

interface posint(Int):
    """A positive integer"""
    def __adapt__(self, x):
        r = Int.__adapt__(x)
        assert r>=0
        return r

def f(x: posint):
    # adaptation now asserts that x>=0

In your current proposal, where the return value is adapted to the return type, this would also work for post-conditions. However, as I noted previously, I think return types should be concrete types, and should be checked with isinstance(), rather than adapt(). In fact, I think there is a nice asymmetry between pre-and post conditions, that reflects the asymmetry between interfaces for arguments and concrete types for return types.

I.e, rather than:

interface Ipos_int_list(List):
    def __adapt__(self, x):
        r = List.__adapt__(x)
        for i in r:
            assert isinstance(i, int) and i>0

def f()->Ipos_int_list # adapt

I'd write

class pos_int_list(list):
    def append(self, item):
        assert isinstance(item, int) and item>0
        list.append(self, item)
    # other methods

def f()->pos_int_list # isinstance

and get the errors when they occur, rather than in a post-condition check. In other words, forget about post-conditions, and let interfaces deal with pre-conditions.


Re: Roeland Rengelink Posted: Jan 8, 2005 10:49 AM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
Roeland Rengelink: "x.__conform__(T) is T.__adapt__(x.__conform__(T))"

One would hope so, otherwise this should be classified as a bug in the __adapt__ or __conform__ implementations. I don't think Python can ensure this.

"I would be very surprised if I saw a function definition that claimed to return a T2, but in fact returned something that only became a T2 after adaptation to T2."

To the contrary. Remember that T2 may be specified by the interface. The implementation may return something suitable and can leave the adaptation to the desired type to the framework. In C or Java, would you be surprised to see "return 0;" in a function declared to return a float?

"I would expect a function definition to be very concrete about its return type."

Again, not at all. If I can write a function that takes a sequence (any sequence at all), why couldn't I write code that receives a sequence from something it calls? Why not allow conforming implementations of an interface to return either a list or a tuple?

"Type unions (T1|T2) seem to be out."

Good point. Perhaps adapt(None, T) should return None for all T? Or perhaps this should be up to T? I haven't thought much about this; perhaps someone who has used adaptation in Zope or Twisted can comment?

"How about using interfaces for pre-conditions?"

Sounds messy, but this can all be done by the interface implementer without my approval. I doubt we'll have standard types to represent lists of positive ints, though.


Re: Roeland Rengelink Posted: Jan 8, 2005 2:05 PM
Reply to this message Reply
Posted by: Roeland Rengelink    Posts: 5 / Nickname: rengelink / Registered: Jan, 2005
>
> "I would be very surprised if I saw a function
> definition that claimed to return a T2, but in fact
> returned something that only became a T2 after adaptation
> to T2."

>
> To the contrary. Remember that T2 may be specified by the
> interface. The implementation may return something
> suitable and can leave the adaptation to the desired type
> to the framework. In C or Java, would you be surprised to
> see "return 0;" in a function declared to return a float?
>

I wouldn't be surprised, but that's why we call C and Java weakly typed. I haven't been thinking of adaptation as casting yet. I'm not sure that I want to ;)

You're right: if Python produced an error here:

def f()->float:
    return 0

I would be pissed. Although that's also because Python already 'casts' ints where I think it should. (Note that this wouldn't be an error if int is a subclass of float)

And I even know I'm inconsistent, because what I'm saying is that I would be pissed if this didn't produce an error:

def f()->list:
    return ()


> "I would expect a function definition to be very
> concrete about its return type."

>
> Again, not at all. If I can write a function that takes a
> sequence (any sequence at all), why couldn't I write code
> that receives a sequence from something it calls?

Sure, but f()->list allows you to use f() as a sequence. Saying that the return type of f should be concrete, doesn't restrict the user of f(). It does restrict the implementor of f. I think that's good.

Suppose that, as a user, I want to sort the result of f, and that f returns a sequence. I now have to
write:

x = f()
x = adapt(x, list) # or adapt(x, sortable)?
x.sort()

This wouldn't be necessary if f just said that it returned a list.

My point would be that the default 'user' of an interface is someone who uses a concrete implementation, not someone who writes a concrete implementation. And the interface should cater to the default user. I must admit that in the case of frameworks, this may actually be the other way around. Well, maybe that is the point that I've been missing.

> Why not
> allow conforming implementations of an interface to return
> either a list or a tuple?
>

Probably because I think of the interface definition first as a promise by the original implementor, not as a requirement on the writer of a conforming implementation.

Moreover, to paraphrase what I said before, I think that you can ask the writer of a conforming implementation to be liberal about what he accepts and strict about what he produces too.

> "Type unions (T1|T2) seem to be out."
>
> Good point. Perhaps adapt(None, T) should return None for
> all T? Or perhaps this should be up to T? I haven't
> thought much about this; perhaps someone who has used
> adaptation in Zope or Twisted can comment?
>

Hmm. That would mean that the writer of a conforming implementation of f(x:T) should accept None as an input. Would you really want that in the case of f(x:int)?

Probably easiest to change your wrapper to:

def f(x, y):
    x = adapt(x, T1)
    y = adapt(y, T2)
    res = f__(x, y)
    if res is not None:
        res = adapt(res, T3)
    return res


On the other hand it could actually be useful to write

def f(x:int=None):


Thanks for your reply. This is interesting.


Re: Roeland Rengelink Posted: Jan 9, 2005 12:14 AM
Reply to this message Reply
Posted by: Phillip J. Eby    Posts: 28 / Nickname: pje / Registered: Dec, 2004
> Suppose that, as a user, I want to sort the result of
> f, and that f returns a sequence. I now have to
> write:
>

> x = f()
> x = adapt(x, list) # or adapt(x, sortable)?
> x.sort()
>

> This wouldn't be necessary if f just said that it returned
> a list.

Actually, 'x = sorted(f())' should work just fine, since 'sorted()' takes an iterable now.

Even if sorted didn't exist, it would suffice to say 'x=list(f()); x.sort()' anyway.


> My point would be that the default 'user' of an interface
> is someone who uses a concrete implementation, not someone
> who writes a concrete implementation. And the interface
> should cater to the default user. I must admit that in the
> case of frameworks, this may actually be the other way
> around.

Both kinds of user occur quite a bit, but for a framework one of the big reasons for having an interface is to document what the framework expects of an implementation created by the user. IOW, the user of an interface is quite frequently the author of a concrete implementation.



> Probably because I think of the interface definition first
> as a promise by the original implementor, not as a
> requirement on the writer of a conforming implementation.

It's both. You can't really have an agreement unless both sides agree; this is what "programming by contract" means. :)


> Moreover, to parafrase what I said before, I think that
> you can ask the writer of conforming implementation, to be
> liberal about what he accepts and strict about what he
> produces too.

In practice this isn't a good idea for the framework writer; an upgraded version of the framework may provide a different concrete class, or one of several concrete classes. It's better for the author to only guarantee as much as the client needs, and that's an interface.


> > "Type unions (T1|T2) seem to be out."
> >
> > Good point. Perhaps adapt(None, T) should return None for
> > all T? Or perhaps this should be up to T? I haven't
> > thought much about this; perhaps someone who has used
> > adaptation in Zope or Twisted can comment?
> >
>
> Hmm. That would mean that the writer of a conforming
> implementation of f(x:T) should accept None
> as an input. Would you really want that in the case of
> f(x:int)?

Returning None from __adapt__ and __conform__ means that there is *no* adaptation. The way the current PEP 246 adaptation scheme works, there's no way to interpret 'None' as being a valid output of *any* adaptation other than 'adapt(None,type(None))' and 'adapt(None,object)'. If you wanted for 'None' to be considered valid, it would be necessary to have a change in the design of the adaptation protocol.
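A simplified sketch of the control flow being described (the real PEP 246 reference implementation has more cases, e.g. an isinstance() shortcut and an optional default argument, which are omitted here):

def adapt(obj, protocol):
    conform = getattr(obj, '__conform__', None)
    if conform is not None:
        result = conform(protocol)
        if result is not None:
            return result          # a None return just falls through
    adapt_hook = getattr(protocol, '__adapt__', None)
    if adapt_hook is not None:
        result = adapt_hook(obj)
        if result is not None:
            return result
    raise TypeError("adaptation failed")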


> On the other hand it could actually be useful to write
> def f(x:int=None):

If the adaptation protocol could be made to support None as a valid adaptation target, then it would probably make sense to have an 'optional' type, used as e.g. 'optional[int]' to do the equivalent of 'int|None' in the earlier proposal.

However, I have no idea how to allow None to be an adaptation target in a PEP 246-style protocol, without having to use exceptions for control flow (very bad in this case) or introducing another special singleton value like 'CantAdapt' or something like that (which breaks existing PEP 246-based code). I suppose we could raise a special exception to mean "No, I really mean to return None", but that seems really kludgy.


Adaptation and None Posted: Jan 9, 2005 2:37 AM
Reply to this message Reply
Posted by: Nick Coghlan    Posts: 13 / Nickname: ncoghlan / Registered: Dec, 2004
I certainly like the ability to 'fall off the end' of __adapt__ or __conform__ in order to tell the adaptation machinery "I got nuthin'".

Then raising an exception can be used to say, "Not only do I not implement this conversion, but I want you to abort the whole attempt completely".

The main use for "this type or None" is in declaring return types.

Without syntactic support, this can be done manually (by only doing the adaptation if the result is not None).

Once a syntax goes in, I think the issue is better addressed by having the ability to flag "may be None" in the declaration of the return type.

e.g.:
def f() -> int or None:
    pass


Re: Adaptation and None Posted: Jan 9, 2005 10:05 AM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
"Once a syntax goes in, I think the issue is better addressed by having the ability to flag "may be None" in the declaration of the return type."

I agree. Over the past year I've read and written a lot of Java code; one of the most pernicious problems we ran into was unexpected null pointers, mostly due to programmers being unclear (or changing their minds!) on which APIs accepted or could return null and which didn't.

As long as we're going to add interface declarations to Python, I think we should be explicit about whether null is okay, both for arguments and for return values.


Re: Roeland Rengelink Posted: Jan 9, 2005 7:35 AM
Reply to this message Reply
Posted by: Roeland Rengelink    Posts: 5 / Nickname: rengelink / Registered: Jan, 2005
> > Suppose that, as a user, I want to sort the result of
> > f, and that f returns a sequence. I now
> > have to write:
> >

> > x = f()
> > x = adapt(x, list) # or adapt(x, sortable)?
> > x.sort()
> >

> > This wouldn't be necessary if f just said that it
> > returned a list.
>
> Actually, 'x = sorted(f())' should work just fine, since
> 'sorted()' takes an iterable now.
>

s/sort/append

> Even if sorted didn't exist, it would suffice to say
> 'x=list(f()); x.sort()' anyway.
>

Ahh, but that's my point. This would be a bad solution if f() did return a list, because then list(f()) is an unnecessary (potentially expensive) operation. So, you'd have to use adapt(f(), list), which just returns the result of f() if f() did indeed produce a list.

>
> > My point would be that the default 'user' of an
> > interface is someone who uses a concrete
> > implementation, not someone who writes a concrete
> > implementation. And the interface
> > should cater to the default user. I must admit that in
> > the case of frameworks, this may actually be the other
> > way around.
>
> Both kinds of user occur quite a bit, but for a framework
> one of the big reasons for having an interface is to
> document what the framework expects of an implementation
> created by the user. IOW, the user of an interface is
> quite frequently the author of a concrete implementation.
>

Let's make this distinction:

- The default user of a library is someone who would call f

- The default user of a framework is someone who would implement f

I.e., libraries and frameworks are distinguished by the way in which a user would use the interface.

I wrote my original comment as a library user. And I do think that a library interface shouldn't be vague about what it returns. Moreover, even if the library is not vague about what it returns, as a reader of library code, I wouldn't like to see something like:

def f()->list:
    return ()

that returns an empty list due to the adaptation magic. Hence, my only suggestion to Guido was to remove the adaptation step on the return value and replace it with an assertion.

Note that when I write 'list' as the return type, I mean the list interface (List?), i.e.: an interface with 1 concrete implementation (possibly sub-classed).

I'm thinking of a new built-in conforms(obj, interface), where conforms(x, T) returns True iff adapt(x, T) would return x. And my suggestion would be to replace

def f(x:T1, y:T2) -> T3:

with

def f(x, y):
    x = adapt(x, T1)
    y = adapt(y, T2)
    r = f__(x, y)
    assert conforms(r, T3)
    return r


This would still allow one to write

def f()->sequence:
    return ()

but wouldn't allow:

def f()->List:
    return ()

This doesn't hurt the framework writers, but keeps the library writers honest.
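For what it's worth, the proposed conforms() built-in could be sketched in terms of the hypothetical PEP 246 adapt() like so:

def conforms(obj, protocol):
    # True iff adaptation would hand back the very same object
    try:
        return adapt(obj, protocol) is obj
    except TypeError:          # adaptation failed outright
        return False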

>
> > Probably because I think of the interface definition
> > first as a promise by the original implementor, not as
> > a requirement on the writer of a conforming
> > implementation.
>
> It's both. You can't really have an agreement unless both
> sides agree; this is what "programming by contract" means.
> :)
>

But it's library writers and framework writers who have to agree with their respective users. They don't necessarily have to agree with each other (except about the implementation of Guido's wrapper)


Re: Roeland Rengelink Posted: Jan 9, 2005 9:58 AM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
"And I do think that a library interface shouldn't be vague about what it returns."

This isn't a discussion for here. The proposed syntax and semantics will let library and interface writers do what they want or need; if you need to be vague about a return type or you need to be concrete about an input type, Python won't stop you. What's best for a given situation depends; we shouldn't (in this stage, without much more experience) try to legislate this.


Re: Roeland Rengelink Posted: Jan 9, 2005 1:56 PM
Reply to this message Reply
Posted by: Roeland Rengelink    Posts: 5 / Nickname: rengelink / Registered: Jan, 2005
> "And I do think that a library interface shouldn't be
> vague about what it returns."

>
> This isn't a discussion for here. The proposed syntax and
> semantics will let library and interface writers do what
> they want or need; if you need to be vague about a return
> type or you need to be concrete about an input type,
> Python won't stop you. What's best for a given situation
> depends; we shouldn't (in this stage, without much more
> experience) try to legislate this.

Don't worry. I'll quit about this one, if only because you (and Phillip Eby) have actually convinced me ;)

Now, the other thing you were discussing, was maybe adding attribute typing as in:

class C:
    x:int = 3

as syntactic sugar for

class C:
    x = typedAttribute(int, 3)

which can currently be implemented with properties. Since this is an idiom I use occasionally in the following form:

class C:
    x = typedAttribute(int, 3, "a nice doc string")

I was wondering if you had considered attribute docstrings. Would this work?

class C:
    x:int=3
    """a nice doc string"""


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 8, 2005 3:16 PM
Reply to this message Reply
Posted by: Bruce Eckel    Posts: 875 / Nickname: beckel / Registered: Jun, 2003
A small brainstorm on coverage: would it make sense for all objects (classes, functions) to have some standard method, say __tests__(), or perhaps a property __tests__[] that would be respectively overridden or populated with tests. To run the tests you find all the objects in your system and call the functions, and to discover coverage, you find out how many objects have overloaded/populated the __tests__ thing.

Each object could then be analyzed to see whether all of its code paths are exercised via its __tests__, so instead of trying to discover this using some complex tool that looks at all the code at once, it's done at a small level of granularity.

This or something like it might go a long way to alleviate the concern of unexercised code paths, but using dynamic tests rather than static ones.
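A rough sketch of what the discovery side of this brainstorm might look like; the __tests__ attribute, the module walk, and the per-object coverage count are all invented here:

import types

def run_and_measure(module):
    tested = untested = 0
    for name in dir(module):
        obj = getattr(module, name)
        if not isinstance(obj, (types.FunctionType, types.ClassType, type)):
            continue
        tests = getattr(obj, '__tests__', None)
        if tests:
            for test in tests:
                test()
            tested += 1
        else:
            untested += 1
    return tested, untested

# tested, untested = run_and_measure(some_module)
# print "objects with tests: %d of %d" % (tested, tested + untested)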


Did I understand correctly? Posted: Jan 8, 2005 4:06 PM
Reply to this message Reply
Posted by: nes    Posts: 137 / Nickname: nn / Registered: Jul, 2004
Just seeing if I understood correctly.
We are talking about something like the following?

Argument and Return Type Declarations

.......considering......


def foo(x,y):
    x=str(x)
    y=str(y)
    ...body...
    return str(r)

(x and y have a __str__ method)


........analogy to............


def foo(x,y):
    x=t1(x)
    y=t2(y)
    ...body...
    return t3(r)


......and object x has...

def __t1__(self):
    if cannot_convert: raise adaptation_error
    return t1(...some stuff here...)


.....and object y has...

def __t2__(self):
    if cannot_convert: raise adaptation_error
    return t2(...something here...)

...etc..same with r....
I assume subclasses will automatically be converted to their superclasses?


Attribute Declarations

class C:
    def __init__(self,p):
        x=t1(p)

..etc...

....We will have a lot of fun once we get overloaded methods:

def foo(x:t1):
    ...body...

def foo(y:t2):
    ...body...

r=foo(z:t3)


....where z can be converted either to t1 or t2.
....(The compiler will have fun choosing :-))

Interfaces

def bar(x:I1)->I1:
    return I1()

r=bar(C())


....what kind of object will x and r be after conversion?

Design by Contract

We are forgetting invariants. Those are assertions that get checked after each method or attribute access to the object. They ensure that the object is always kept in a sane state.

e.g.

class Person:
    def __init__(self):
        self.age=Int()
        self.children=List()
    Invariants:
        if self.age<9:
            assert len(self.children)==0
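Something close to this can already be approximated today; a sketch (InvariantMeta and _invariant are invented names, and checking after direct attribute assignment would additionally need __setattr__ support, which is omitted):

def _checked(method):
    def wrapper(self, *args, **kwds):
        result = method(self, *args, **kwds)
        self._invariant()          # re-check the object after every call
        return result
    return wrapper

class InvariantMeta(type):
    def __new__(mcl, name, bases, ns):
        for attr, value in ns.items():
            if callable(value) and not attr.startswith('_'):
                ns[attr] = _checked(value)
        return type.__new__(mcl, name, bases, ns)

class Person(object):
    __metaclass__ = InvariantMeta
    def __init__(self):
        self.age = 0
        self.children = []
    def set_age(self, age):
        self.age = age
    def add_child(self, child):
        self.children.append(child)
    def _invariant(self):
        if self.age < 9:
            assert len(self.children) == 0

p = Person()
p.set_age(5)
p.add_child('junior')    # AssertionError: a 5-year-old with children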


Related note:
I would really like to have pychecker in its current form integrated into python, maybe with a python -warn option. It is the first package I download after python itself. This stuff is lower priority for me.

Nestor


Which user communities for Python 3/3000? Posted: Jan 9, 2005 9:03 AM
Reply to this message Reply
Posted by: Edward C. Jones    Posts: 3 / Nickname: edcjones / Registered: Dec, 2004
I think that the design of Python 3/3000 should start with user needs.

What types of applications should Python 3/3000 be designed for? Teaching, scripting, wrapping, algorithm development, ...?

What can be added to Python to make writing these applications easier?

But please keep Python easy to read and write.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 9, 2005 4:35 PM
Reply to this message Reply
Posted by: Jaroslaw Zabiello    Posts: 1 / Nickname: zbiru / Registered: Jan, 2005
There is another statically typed language which uses a Pythonic syntax: http://boo.codehaus.org/. Its syntax looks very cool and may be an inspiration for this debate.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 9, 2005 6:40 PM
Reply to this message Reply
Posted by: SX    Posts: 2 / Nickname: shrimpx / Registered: Jan, 2005
I just want to say that the idea of static typing was a sound one and it should by no means be abandoned.

First of all, everyone seems to be ignoring the _optional_ part of it. I agree that Python shouldn't become statically typed overnight (or ever), but if you can make it an option that many people will enjoy, why not?!

Second, static typing has definite uses and a long, successful history. As a Web programmer programming in Perl and Python, I witnessed countless errors and crashes that exhibited themselves weeks or months after I did the coding, and which my unit tests didn't catch. If your program passes a correctly implemented type check, you have a formal proof that your code will never crash due to a particular set of errors. That's pretty nice!

Third, ideas are in the very early stages, so giving up or slandering are premature. Let it cook for a couple of months, check out ideas in other languages -- Haskell's type classes have not been mentioned in the whole interface debacle, for example.

Finally, I have to critique what many people seem to think Python culture means. Somehow, Python has become "a really really really dynamic language" whose dynamic ways should in no way be messed with, unless of course they were made even more dynamic. For some reason I thought that Python was more than that.

I've always seen Python as a "bridge" language, willing with no hesitation to borrow, assimilate, and imitate in an effort to preserve and improve expressiveness and usefulness. In this, it has always managed to maintain originality and freshness. List comprehensions? They were borrowed directly from Miranda and Haskell. Generators? Lazy languages have them mandatorily and many imperative languages have generator libraries and call them "lazy lists." filter? reduce? map? lambdas? closures? That's functional programming!

Python is, in my opinion, the one language that makes an ongoing effort to bring advanced programming to the mainstream in a nice, elegant manner, and it is underestimated by the research community as well as its own. Please, let it grow and expand and don't hinder it with cultural drivel, blaming it for robbing you of your sense of identity.

People are very afraid of big changes but are willing to admit one little change at a time. Eventually, you will end up with an optional annotation for every little disjoint, scattered feature that could have been nicely unified in a sound type system.

With a powerful module system and static typing, Python would be very, very hard to beat in ANY context.

Thanks, Guido, for Python.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 9, 2005 7:37 PM
Reply to this message Reply
Posted by: John M Camara    Posts: 15 / Nickname: camara / Registered: Dec, 2004
SX

> As a Web programmer programming in
> Perl and Python, I witnessed countless errors and crashes
> that exhibited themselves weeks or months after I did the
> coding, and which my unit tests didn't catch.

Sounds like you need to write better unit tests. Don't blame these problems on the lack of type checking. To do so shows you have a lack of understanding of the use or application of unit tests.

John


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 9, 2005 8:55 PM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
[SX]
> > As a Web programmer programming in
> > Perl and Python, I witnessed countless errors and
> > crashes that exhibited themselves weeks or months
> > after I did the coding, and which my unit tests
> > didn't catch.

[John M Camara]
> Sounds like you need to write better unit tests. Don't
> blame these problems on the lack of type checking. To do
> so, shows you have a lack of understanding on the use or
> application of unit tests.

Now, now, John. We all know static typing is no panacea. But neither is unit testing. They each have their strengths and weaknesses, and SX has a point: one of Python's strengths is its borrowing from many other languages and paradigms (and another strength is my restraint in so borrowing :-).


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 10, 2005 6:00 PM
Reply to this message Reply
Posted by: SX    Posts: 2 / Nickname: shrimpx / Registered: Jan, 2005
>Sounds like you need to write better unit tests. Don't blame these
>problems on the lack of type checking. To do so, shows you have
>a lack of understanding on the use or application of unit tests.

Are you saying that you can write a spread of unit tests that basically guarantees correctness? Sorry, but that's not possible, even if you have a better understanding of applications of unit tests than I do.

I hope we can all agree that writing unit tests to catch type errors is utterly superfluous.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 10, 2005 2:05 PM
Reply to this message Reply
Posted by: Kurt B. Kaiser    Posts: 1 / Nickname: kbk / Registered: Jan, 2005
Eiffel has several levels of assertions associated with DBC:

preconditions
postconditions
invariants
loop invariants & variants
checks [groups of assertions]

These can be enabled in increasing order using the LACE compiler control. Or switched completely off for performance reasons.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 11, 2005 11:52 AM
Reply to this message Reply
Posted by: Nicolas Fleury    Posts: 7 / Nickname: nidoizo / Registered: Dec, 2004
I just want to say I agree with the root message (Stop the Flames). I think the syntax should be used for duck-typing and simplifying our day-to-day life, and forgetting about compiler optimization is the way to go. I'm very happy with this direction.

I just hope that the chosen syntax will be scalable to specify types at additional places (and consequently enable auto-completion at more places). For example the iteration of a typeless container:


for i:MyClass in mylist:
    # cool, auto-completion-and-doc for i.


Regards,
Nicolas


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 11, 2005 1:49 PM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
> I just hope that the chosen syntax will be scalable to
> specify types at additional places (and consequently
> enable auto-completion at more places). For example the
> iteration of a typeless container:
>
> for i:MyClass in mylist:
> # cool, auto-completion-and-doc for i.

That syntax would work (and it is one of the reasons why I like my syntax better than other proposals), although I'm -1 on adding it now; let's reserve it for the future.

But this reminds me of Java code where we have to declare the type of the loop variable even if it's totally evident from the type of the container. I'd much rather eventually evolve to declaring the (presumably) argument mylist as sequence[MyClass] or list[MyClass] or iterable[MyClass] or whatever is appropriate, so that i's type can be inferred without having to say it.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 11, 2005 2:06 PM
Reply to this message Reply
Posted by: Andre Roberge    Posts: 6 / Nickname: aroberge / Registered: Dec, 2004
If Python were to allow assert: to introduce a block of code, instead of having to repeat assert at the beginning of each line, and also allow
isinstance(object, class-or-type-or-tuple)
to be written as
object: class-or-type-or-tuple
as you (GvR) have suggested, one could (almost) write the following as valid Python code:

def gcd(a, b):
    assert:
        a: int
        b: int
        a != 0 and b != 0 # fake pre-condition
    return c assert:
        c: int
        c != 0 # fake post-condition
    '''Returns the Greatest Common Divisor,
    implementing Euclid's algorithm.
    Input arguments must be integers;
    return value is an integer.'''
    while a:
        a, b = b%a, a
    return b
The more general form would be:

def foo(...):
    assert:
        type information and/or
        pre-conditions
    return [...] assert:
        type information and/or
        post-conditions
    '''docstring'''
    ...body...
where we have three blocks at the same level:
1) the static typing information and/or pre/post-conditions
2) the docstring (as a visual separator)
3) the actual function body statement.

I have written about using a similar syntax in my blog (aroberge.blogger.com) and it has received some (limited) positive feedback.

Disclaimer: I am a relative newbie to Python and have no experience in programming language design.

André Roberge


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 12, 2005 9:31 PM
Reply to this message Reply
Posted by: John M Camara    Posts: 15 / Nickname: camara / Registered: Dec, 2004
Four years ago when I was a Python newbie I would have argued strongly for static types. I had felt that this feature was badly needed and wondered how much better the language would be with them. It's not that I thought Python was a bad language, I just believed it would be a better language with types.

Now I thought I had many good reasons why types should be added. For well over a decade I had bought into all the common arguments that were made to prove that types were a necessary evil. You know, basically the benefits outweigh the costs. After all, some really smart people came up with very convincing and sound arguments over the years, and experiences I had seemed to back up their arguments.

Over the years I have programmed in countless languages and any time I used a language that lacked types it was always a painful experience. In languages where types were optional I would eventually use them for nearly all variables. I would make exceptions for some of them like i, j, k, and other common variables.

Something was different about Python, as it wasn't a painful experience. It was a pleasure and it even put the fun back into programming. Although, it took a while for me to accept the fact that it was a pleasure. I had felt like I was breaking a taboo. Programming without the safety net of types. Have I sold my soul to the devil? Have I forgotten the painful lessons I have learned over the years? Why could I not stop using Python? It made no sense. All I knew was that the programs I wrote in Python were more reliable than the ones I wrote in other languages. I was also far more productive. I also felt that I was becoming a better programmer at a faster rate. The light bulbs were turning on in my head far more often than in the past.

After more than 12 years of faithfully believing in types I realized that they just aren't that important in creating quality code. When you think about it, what type of protection do you really get from them? If you say you need an int, is that what you really need? Do you need the full range? Do you need just positive numbers (oh, that's what unsigned int is for)? Maybe even or odd numbers? Maybe values from 0 to 100? Does it protect you from a divide by zero? I could go on and on here.

OK, so we add constraints. How complex do we make these constraints? After all, we can dream up so many possibilities. At some point isn't it just simpler to write the code out than use a complex constraint syntax.

In an earlier reply to this blog I had made a harsh comment. I didn't intend to be harsh but was too lazy to write a proper response. Guido slapped my hands and stated that unit tests, like static types were no panacea. I agree with Guido that unit tests are no panacea but I do believe they can play the role of protector that some programmers have come to expect from static types.

Now some will argue that it's crazy to define a unit test to test an int when a static type of an int would do the trick (assuming that what you really need is the full range of an int). To some degree I also agree, but I also wonder what consequences adding static types would have on the language, or better yet on the users of the language. Will it add roadblocks to the learning curve for the many users?

I know I became a better programmer because Python lacked types. If they were optional I would have used them and I might not have realized how dependent I became on them and the false sense of protection that they gave me. How good it makes you feel when you compile code and it has no errors. Even better when there are no warnings. Even better yet when you set the compiler options to increase the number of warnings it will provide and none are given. Makes you feel real good but it doesn't really make much of a difference in the quality or correctness of the code. It's just an illusion.

Python, because of its lack of static types and the fact that it is a great language, made me come to this realization (made the light bulb turn on). I wonder just how many others out there have had a similar experience. Could this be the reason some are strongly against adding static types to the language? Do others feel that it may be a disservice to all those programmers who haven't quite got the point that static types are not always the best protection, or that they only provide solid protection in limited cases? I know this is not the only reason why some are against static types but it's the reason I would have against it.

For the record, I'm not 100% against adding static types. I'm just more against it than for it at this point in time. I could accept them if they can be added in such a way that they are truly optional: and I don't mean optional in the sense that you don't have to use them, but that the syntax somewhat discourages their use unless they are needed. Needed for reasons other than protection, like being used for optimization in cases where performance is an issue.

You may ask what syntax discourages its use unless it is needed. I believe decorators fit this bill. It may not seem like it at this point in time as everyone under the sun is trying to find ways to use and abuse decorators, but this will change when decorators are no longer new toys. After all, many of us like to play with new toys. In time, though, only the useful uses of decorators will prevail.

Unfortunately I'm not convinced that adding static types will follow this pattern. If the syntax of static types is not enough of an inconvenience to use, they will likely be used and abused forever. This is why I believe it may be a disservice to add static types.

Now, are there times that I wished Python had static types? Sure. Not for their false sense of protection but for their compact form of documentation or the intent that they show (what a variable/function/method is for or how it is to be used). I miss that kind of documentation/intent in Python. It's a good thing that Python code tends to be very clean and compact so normally you can quickly skim the code to get that same level of understanding that you would get from types. It's just that it is more work than would be required if types existed. Not that I believe that this is a good reason to add types.

I believe that there is something fundamentally missing from Python that would take care of this issue but I have no idea what it is. Some would say “that's what doc strings are for” but I'm not convinced that leaving it to doc strings, without a syntax to address such an issue, is the best solution. Doc strings can certainly play this role but if everyone uses a different format I'm not sure it's best for the common good.

I want to see this intent in the code I write so badly that I find myself abusing __slots__. I add __slots__ to far too many of my classes. I even add them when I know they will cause me to write more code, as I feel that the intent they show outweighs the cost of adding them. I wonder just how many others do the same?

Well, just some of my thoughts.

John


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 12, 2005 9:52 PM
Reply to this message Reply
Posted by: Guido van van Rossum    Posts: 359 / Nickname: guido / Registered: Apr, 2003
> Four years ago when I was a Python newbie I would have
> argued strongly for static types. [...]

Nice post John. You might want to subscribe to the python-dev list and follow the, um, lively PEP 246 discussion there.

Your admitted overuse of __slots__ is interesting. I've seen this in others too, though I almost never use them myself. It is similar to the "cast of characters" printed at the start of a play -- a helpful reference for the reader.

Maybe my "name: type = value" syntax could take over this role. It would be nice if we could also work a docstring into that syntax, but the proposal made earlier here (put the docstring on the next line) is too ambiguous to work, and putting the docstring on the same line doesn't work either. Bonus points for making the same syntax work for arguments.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 13, 2005 6:30 AM
Reply to this message Reply
Posted by: Roeland Rengelink    Posts: 5 / Nickname: rengelink / Registered: Jan, 2005
>
> Maybe my "name: type = value" syntax could take over this
> role. It would be nice if we could also work a docstring
> in that syntax, but the proposal done earlier here (put
> the docstring on the next line) is too ambiguous to work,
> and neither does putting the docstring on the same line.
> Bonus points to make the same syntax work for arguments.

I'm going to assume that for attributes x:protocol is syntactic sugar for x=typed_prop(protocol), where:

class typed_prop(property):
    def __init__(self, protocol):
        def getter(inst):
            return adapt(inst.__dict__[self], protocol)
        def setter(inst, obj):
            inst.__dict__[self] = adapt(obj, protocol)
        def deleter(inst):
            ...
        property.__init__(self, getter, setter, deleter,
                          protocol.__doc__)
        self.__doc__ = protocol.__doc__

Then the following 'protocol wrapper' does pretty much what you'd expect:

class info:
    def __init__(self, protocol, doc):
        self.protocol = protocol
        self.__doc__ = doc
    def __adapt__(self, obj):
        return adapt(obj, self.protocol)

I.e:

>>> class X(object):
...     attr : info(float, 'a doc string')
...     def f(arg:info(sequence, 'another doc string')):
...         pass
...
>>> X.attr.__doc__
'a doc string'
>>> X.f.__signature__['arguments']['arg'].__doc__
'another doc string'


But, this goes the way of 'protocol expressions' again, which I understand you want to avoid at the moment. So, I will stop here.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 13, 2005 12:25 PM
Reply to this message Reply
Posted by: Christian Tismer    Posts: 2 / Nickname: stackless / Registered: Dec, 2004
John, Guido,

I just would like to add that I like how this whole
topic morphed from static types to something much
more useful, and that my own opinion about types
has changed drastically over the last years
when I was using Python.

Like John, I thought types were something very valuable, and that it would be great to add static typing to Python, especially since I'm so keen on optimization, but also because certain errors would be known at compile time. I still remember those Python 1.6 discussions.

Meanwhile I learned that this is nonsense.

Speed:
There are better ways to get speed than static typing.

Correctness:
I agree completely, this is an illusion.
The "no errors encountered" during compilation
means just that you didn't make some silly errors,
and that you followed the rules that the language
forced you to obey. But these things are imposed by
the language which tells me it needs (correct) types.
But this helps me almost None in writing good code.


Well, I don't want to repeat things others have said before; this is just to express my happiness at how this whole thing has evolved. When the blog started, I was really afraid of what it would do to Python. That fear is gone now.

Congrats, Guido and all the contributors!


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 15, 2005 10:56 AM
Reply to this message Reply
Posted by: Neal Norwitz    Posts: 2 / Nickname: neal / Registered: Jan, 2005
I finally got around to reading all Guido's blogs and the replies. Here's the sum of my comments. Some apply primarily to the second blog, but they should all be fairly relevant.

Simplicity - Motivation:

Very good summary. The timing of Oliver Steele's blog entry is a very funny coincidence. I also think a retrospective on adding bool to Python is in order. There was a huge outcry about how bool was going to ruin Python (yes, I'm exaggerating, but only a bit). Many brought up dire consequences and predicted all kinds of compatibility problems. While there are always compatibility issues when changing versions, I have not heard about any major ones. Nearly all of the concern about the change was misplaced. I believe adding "Optional Static Typing" will be more of the same. Certainly, this change would be much deeper and will lead to more changes in Python software. However, many of the fears are overly cautious.

---

I think there are two major hurdles for people to get past when considering "Optional Static Typing". I've grown to dislike this term because of the animosity and assumptions that go along with it. This is also not exactly how I think about it.

I would prefer to call it "Conformance Checking". (It would still be optional.) I think this is a more descriptive term than "duck typing". My view is that conformance checking is primarily about checking an interface, rather than checking a type. If I declare a function:

def foo(x: int) -> float: ...

I am not thinking about int and float as concrete types, but rather more like abstract interfaces. int and float are merely a short-hand, concise, and convenient way to name the concept (interface) that I mean.

I have not used Eiffel, nor do I know a great deal about Design by Contract. However, I view adding type information as a very specific and terse use of DbC, basically a limited short-hand. For a more complete approach to using DbC, decorators can be employed. This may not be optimal, but it should be good enough until we learn how to better implement DbC in Python. (Heh, I wrote this last sentence before reading Guido's third post.)

I wonder if we should use function attributes for pre/post conditions. This might lead to better re-use. Either way, there is the problem of choosing magical names.
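A rough sketch of what the function-attribute variant might look like today; the attribute names 'precondition' and 'postcondition' are invented here, and this is only one of many possible wirings:

def dbc(func):
    # Wrap func so that optional function attributes act as its contract.
    def wrapper(*args, **kwds):
        pre = getattr(wrapper, 'precondition', None)
        post = getattr(wrapper, 'postcondition', None)
        if pre is not None and not pre(*args, **kwds):
            raise AssertionError('precondition failed for %s' % func.__name__)
        result = func(*args, **kwds)
        if post is not None and not post(result):
            raise AssertionError('postcondition failed for %s' % func.__name__)
        return result
    wrapper.__name__ = func.__name__
    return wrapper

@dbc
def isqrt(x):
    return int(x ** 0.5)

isqrt.precondition = lambda x: x >= 0
isqrt.postcondition = lambda r: r >= 0

The "magical names" problem is still there, of course: the wrapper has to know what the attributes are called.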

---

I'm wondering if we should keep a list of potential future keywords. We seem to want new ones often enough. This would really just be for guidance, although we could enforce it (with a warning).

The two words I recall are: as, interface. But I know there were others.

---

I like the | and & proposals in "Unions and Such". The syntax reminds me of set notation (even if they are really C operators), which is how I think about it. I'm not sure I like * though. I don't really like the idea of using words like intersection or cartesian for the same reason; they are too verbose.

On the other hand, | and & are very C-ish to me. Perhaps we should not try to borrow from it. But if not C, then what?
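For what it's worth, the set-like reading can be prototyped today without any new syntax; the Union/Intersection names below are invented, and isinstance() stands in for whatever conformance test (adapt, interface check) would really be used:

class Union(object):
    def __init__(self, *alternatives):
        self.alternatives = alternatives
    def check(self, obj):
        # conforms if obj conforms to at least one alternative
        return any(isinstance(obj, t) for t in self.alternatives)

class Intersection(object):
    def __init__(self, *parts):
        self.parts = parts
    def check(self, obj):
        # conforms only if obj conforms to every part
        return all(isinstance(obj, t) for t in self.parts)

number_or_text = Union(int, float, str)
number_or_text.check(3.5)    # True
number_or_text.check([])     # False

Making the bare "int | str" spelling work would require the builtin types to grow __or__ and __and__, which is exactly where new language support would come in.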

---

I wonder if it might be better to require a command-line switch to enable type declaration syntax. This would add a hurdle for newbies who might otherwise reach for types by default.


Re: Optional Static Typing -- Stop the Flames! Posted: Jan 15, 2005 12:00 PM
Reply to this message Reply
Posted by: Tony Lownds    Posts: 2 / Nickname: tlownds / Registered: Jan, 2005

> def foo(x: t1, y: t2) -> t3:
>     ...body...
>
> def foo__(x, y):  # original function
>     ...body...
>
> def foo(x, y):    # wrapper function
>     x = adapt(x, t1)
>     y = adapt(y, t2)
>     r = foo__(x, y)
>     return adapt(r, t3)


There is a property that a typing system for Python could have: that removing the type declarations from a program, or ignoring them at run-time, doesn't stop any programs from working.

Here is an example. Let's say that adapt(1.0, int) returns 1.


from __future__ import division

def medianindex(seq: sequence) -> int:
    return len(seq)/2

def pseudomedian(seq: sequence) -> any:
    if not seq: raise ValueError, "median of empty sequence not defined"
    index = medianindex(seq)
    return seq[index]


Now, this code works when adapt() is implicitly called, because of the implicit conversion. But if adapt() isn't called, we suddenly get "TypeError: list indices must be integers".

Some reasons that adapt() may not be called are:
1. -O switch or some other switch that elides run-time type-checking
2. back-porting code to older pythons (perhaps mechanically)

IMO, it would be better to call a type-checking function rather than a type-coercion function -- one that can only raise TypeErrors.

With type "adapting" there is no TypeError in my example; with type "checking" the TypeError is raised in the correct spot (the return statement of medianindex).

Automatic coercion has risks. Being able to know that working programs will work without type checking is a nice property. For both of those reasons, it seems to me that the conservative approach would be to type check only.


def foo(x, y):    # wrapper function
    typecheck(x, t1)
    typecheck(y, t2)
    r = foo__(x, y)
    typecheck(r, t3)
    return r
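typecheck() itself isn't spelled out above; a minimal stand-in, treating conformance as a plain isinstance() test (which glosses over abstract interfaces like 'sequence' and 'any'), could be:

def typecheck(obj, expected):
    # Pure checking: never converts, only raises TypeError on a mismatch.
    if not isinstance(obj, expected):
        raise TypeError('expected %s, got %s'
                        % (expected.__name__, type(obj).__name__))
    return obj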


Here is a little proposal that would give implementors the choice. Build the (nice) type annotation syntax but don't hard-code the run-time semantics.
Instead, put those semantics into a standard decorator. Allow module implementors to define a per-module decorator using __decorator__, and have __builtin__.__decorator__ default to the standard checking decorator.
This would let adaptation frameworks get the implicit adaptation behavior that they want.
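Concretely, the lookup might go something like this; default_decorator and apply_signature are invented names, and only the __decorator__ hook itself is the proposal:

import sys

def default_decorator(func, arg_types, return_type=None):
    # Standard behaviour: pure type checking, no adaptation.
    def wrapper(*args):
        for a, t in zip(args, arg_types):
            if not isinstance(a, t):
                raise TypeError('bad argument %r for %s()' % (a, func.__name__))
        result = func(*args)
        if return_type is not None and not isinstance(result, return_type):
            raise TypeError('bad return value %r from %s()' % (result, func.__name__))
        return result
    return wrapper

def apply_signature(func, arg_types, return_type=None):
    # What the compiler would do for an annotated def: defer to the
    # module's own __decorator__ if it defines one.
    module = sys.modules[func.__module__]
    decorator = getattr(module, '__decorator__', default_decorator)
    return decorator(func, arg_types, return_type)

def add(a, b):
    return a + b

add = apply_signature(add, (int, int), int)   # what "def add(a: int, b: int) -> int" would mean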

-Tony


Expression-based guards Posted: Jan 17, 2005 9:23 PM
Reply to this message Reply
Posted by: Robert Brewer    Posts: 7 / Nickname: fumanchu / Registered: Jan, 2005
> I think the expression-based proposals are too limited:
> they don't handle guards involving multiple arguments
> very well, and the proposed overloading of type
> expressions and boolean guards feels error-prone
> (what if I make my guard 'True' while awaiting
> inspiration for something better?). Also, there
> are clear use cases for guards that (in Python)
> can only be expressed using multiple statements.

Why can't you just farm complex guards out to full functions? It seems to me that a guard which requires multiple statements should be broken out and named for readability's sake. Simple example:

def nz(x):
    return (x != 0)

def gcd(a, b):
    -> a: int and b: int and nz(a) and nz(b)
    <- (output) output: int and nz(output)
    '''Euclid's Greatest Common Divisor.'''

    while a:
        a, b = b%a, a
    return b


Doing so would probably be a show-stopper for pychecker's ability to benefit from _those portions_ of the guard expressions, but ISTM that this would be the case in any statement-based syntax. By using an expression-based syntax, we could at least enable partial pychecker support. In the many, many cases in which we do *not* need multiple statements, we could even have a sys.enable_guards flag, which (when turned off) would skip the guards at runtime, leaving the check to be performed by a (possibly built-in) pychecker.

As a side note, I prefer the explicit binding of the output value (as I've done above) when asserting the postconditions. But then, arg-checking is already orthogonal to getting "x: int" into the language, IMO. ;)
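Until such syntax exists, the same guards can be hung on gcd() with a decorator plus a module-level switch standing in for the proposed sys.enable_guards; 'guarded', 'pre', and 'post' are invented names here, and the nz() checks are simply inlined as lambdas:

ENABLE_GUARDS = True   # stand-in for the proposed sys.enable_guards flag

def guarded(pre=None, post=None):
    def decorate(func):
        def wrapper(*args, **kwds):
            if not ENABLE_GUARDS:
                return func(*args, **kwds)
            if pre is not None:
                assert pre(*args, **kwds), 'input guard failed'
            result = func(*args, **kwds)
            if post is not None:
                assert post(result), 'output guard failed'
            return result
        return wrapper
    return decorate

@guarded(pre=lambda a, b: isinstance(a, int) and isinstance(b, int) and a != 0 and b != 0,
         post=lambda output: isinstance(output, int) and output != 0)
def gcd(a, b):
    '''Euclid's Greatest Common Divisor.'''
    while a:
        a, b = b % a, a
    return b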


Re: Optional Static Typing -- Stop the Flames! Posted: Feb 12, 2005 5:10 PM
Reply to this message Reply
Posted by: Jim Nisbet    Posts: 1 / Nickname: jimniz / Registered: Feb, 2005
Having both written and read large Python programs, I do think it would be very helpful to be able to "bind a type restriction" to a variable. It certainly helps the reader of a function to know for certain that (say) a list is expected (or returned). The same can be said for local variables.

If this could be integrated with PyChecker that would be even better.

One question: I would expect the following to fail:

def foo(i: int):
    i = "junk"

because "junk" cannot be adapted to an int. In my view the type adaptation applies to all assignments to that variable.

For the syntax, instead of:

def foo(x: int, y: str) -> list:
    ...body...

could you not support:

def foo(int x, str y) returns list:
    ...body...

You could also support typed variable creation/assignment:

int i = 1
for int ino in items:
    etc.

This seems to me to be more in keeping with Guido's comment from his previous posting re syntax: "Python's parser generator is so lame, but that in turn is intentional -- it is so lame to prevent me from inventing syntax that is either hard to write a parser for or hard to disambiguate by human readers, who always come first in Python's design".

/j


Re: Optional Static Typing -- Stop the Flames! Posted: Mar 7, 2005 8:02 AM
Reply to this message Reply
Posted by: Edson César Cunha de Oliveira    Posts: 1 / Nickname: ecco / Registered: Mar, 2005
Well, maybe I'm too late for this, but anyway...

Guido, I read all your posts about the static type thing and concluded that making it optional is the way to go. But I had learned a Python principle that says "there's only one way to do it!" and liked it because it makes my code more readable. I'm not trying to stick to it in a radical way, but I think we have some alternatives to evaluate.

As you said, interfaces are the only place where new syntax is needed. So why not put all those new features (types and contracts) in the interface? That way you keep backward-compatible source code, and "only one way" to use the new features.

Interfaces could still have an implementation body... this way you open up a lot of opportunities for new styles of programming, like pre/post conditions:

interface I1(I2, I3):
    def foo(a: t1, b: t2) -> t3:
        "docstring"
        assert a > 0
        assert b > a
        assert Do() > b    # Where Do() calls the wrapped function!

class C(I1):    # implements I1
    def foo(a, b):
        return a+b

Or aspect-oriented programming...

Well, that was my 2 cents...


Re: Optional Static Typing -- Stop the Flames! Posted: Mar 14, 2005 6:20 PM
Reply to this message Reply
Posted by: Ronald Adam    Posts: 5 / Nickname: ron100 / Registered: Mar, 2005
Ok, here's my keep-it-simple opinion. :)

How about just putting constraints on names in the namespace using an "as" keyword?

syntax:
name [as 'type'] = object

The "as 'type'" statement would have the meaning of:

"From now on, this name can only point to objects of this type."

For single variables:

a as 'int' = 1
b as 'float' = 3.5

To take a type constraint off:

a as 'none' = 'mars' # can be assigned to anything.

Or to change it to another type:

a as 'int' = 5
a as 'float' = 23.5

It wouldn't make sense to change a name to a different type without pointing it at an object of that type. To do so should generate an error.

a as 'string' = '123'
a as 'int' = a.int()

For functions, we primarily need types to check for errors; we really only need to put constraints on variables somewhere in the function.


def gcd(a, b):
    a as 'int' = a
    b as 'int' = b
    while a:
        a, b = b%a, a
    return b


This function of course will only return an int, and will generate errors if it receives anything other than ints as arguments.

To check the return value we only need to:

z as 'int' = gcd(u, v)

Now, after saying this, I have no idea how difficult it would be to add type constraints to names in a namespace. But this seems, to me, from an average Python programmer's point of view, to be simple, easy to understand, easy to read, and consistent with the rest of Python, and it doesn't create any compatibility problems that I can think of.

I'm not sure if there would be a big performance hit... or maybe a gain, compared with other methods. I've heard that doing things in the namespace is one of the things that makes Python as fast as it is.
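Today's Python has no hook for constraining a bare name in a module or function namespace, but the flavour of the idea can be approximated for attributes with __setattr__; everything below (the class name, constrain(), the error message) is invented purely for illustration:

class ConstrainedNamespace(object):
    # Once a name has been given a type constraint, it only accepts that type.
    def __init__(self):
        object.__setattr__(self, '_constraints', {})

    def constrain(self, name, typ, value):
        self._constraints[name] = typ
        setattr(self, name, value)

    def __setattr__(self, name, value):
        expected = self._constraints.get(name)
        if expected is not None and not isinstance(value, expected):
            raise TypeError('%s may only hold %s objects'
                            % (name, expected.__name__))
        object.__setattr__(self, name, value)

ns = ConstrainedNamespace()
ns.constrain('a', int, 1)     # roughly: a as 'int' = 1
ns.a = 5                      # fine
# ns.a = 'mars'               # would raise TypeError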

Ronald Adam


Re: Optional Static Typing -- Stop the Flames! Posted: Mar 14, 2005 7:02 PM
Reply to this message Reply
Posted by: Ronald Adam    Posts: 5 / Nickname: ron100 / Registered: Mar, 2005
> How about just putting constraints on names in the name
> space using an "as" keyword?
>
> syntax:
> name [as 'type'] = object
>
> The "as 'type'" statement would have the meaning of:
>
> "From now on, this name can only point to objects of
> this type."


Change all references to 'as' to 'of' in my previous post. I didn't remember that 'as' was used elsewhere.

It's the concept I'm suggesting anyway, not the specific word 'as'.

syntax:
name [of 'type'] = object

x of 'int' = 123
y of 'float' = 2.5

etc...

The use of 'of' implies the name belongs to a set of 'type'. That might be a useful way to look at it.

It still keeps the readability that made 'as' attractive.

Ron


Re: Optional Static Typing -- Stop the Flames! Posted: Jul 18, 2005 4:04 AM
Reply to this message Reply
Posted by: Richard Prosser    Posts: 3 / Nickname: rprosser / Registered: Jul, 2005
I may be naive but I am surprised by the complexity of the debate here. To my mind, what is required in essence is a text pre-processor ('macro definition language'), applied before run-time. Decorators appear to be heading in that direction, but I confess that I don't really understand them yet.

A pre-processor would also permit 'conditional compilation' in the manner of C, to cater for different host environments perhaps. In such cases you do not want to check the OS type or whatever every time you run your program. So if you had a statement like '#IF OS == "Unix":', the relevant code could be included in the main text (or not, as the case may be).
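For comparison, the check-once-per-run behaviour is already available without a preprocessor, because def and if are ordinary statements executed at import time; the tiny sketch below (temp_dir is an invented example) only illustrates that point and is not a substitute for true conditional compilation:

import os

if os.name == 'posix':
    def temp_dir():
        return '/tmp'
else:
    def temp_dir():
        return os.environ.get('TEMP', '.')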

Coming from a C background, I find Static Typing to be useful, but I also have strong scripting experience and I appreciate that enforcement of ST can be a real burden for a programmer, and doesn't always deliver the promised benefits in terms of reduced development/testing effort. So I definitely favour the Optional flavour, but perhaps the pre-processor could issue Warnings rather than reject particular constructs, to permit a 'strong' implementation.

As for particular ST use cases, my inclination would be to cater for the specific rather than the generic. In other words, if min() is defined for an 'int' type then it should *not* accept a 'long'; rather, a new type would need to be declared - 'Natural' in this case perhaps? Intuitively, a similar argument applies to the min() sequence examples: as these (appear to) have different behaviours, they should be different types. If you really need a generic min() function, then write the type checker yourself, or persuade the Python developers to do so.

Finally, the syntax that I prefer is along the lines of C - i.e., declare the type first on the same line as the variable or function, such as 'Integer a=0'. If a text pre-processor is employed as I suggest, then it could easily (?) convert such syntactic sugar to decorators or whatever.

I'm not an expert in such matters, so I hope that these ideas are taken in the spirit of Guido's blog as a whole - i.e. "brainstorming" topics that could be used as stepping-stones to better/stronger proposals.


Cheers ...

Richard Prosser


Re: Optional Static Typing -- Stop the Flames! Posted: Jul 18, 2005 5:02 AM
Reply to this message Reply
Posted by: Richard Prosser    Posts: 3 / Nickname: rprosser / Registered: Jul, 2005
Apologies for commenting on my own post, but I feel that some clarification would be desirable.

If min() - for example - is currently defined to accept an 'int' type rather than a 'long', then it should obviously be modified to accept the latter. This is because these are essentially the same type, simply of different sizes. Indeed, 'int' and 'long' are artificial to a large extent, included for historical or (perhaps) performance reasons, rather than being natural constructs that people normally use.

On the other hand, a min() function that compares strings is clearly a different beast - i.e. the characteristics differ from numbers in this case - and so it should be considered a separate implementation. Given a declaration of the form 'String s = min('a', 'b')', the compiler/pre-processor would employ the String version of min() rather than the "Numeric" one. This could be implemented as 'if ... elif' statements as described elsewhere, but there is no need for a generic-type min() function as such - just a collection of particular types.
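Spelled out, the 'collection of particular types' approach with if ... elif dispatch might look roughly like this; numeric_min, string_min and dispatching_min are invented names, and isinstance() stands in for whatever test a pre-processor or checker would actually use:

def numeric_min(a, b):
    if a < b:
        return a
    return b

def string_min(a, b):
    # Identical body today, but free to diverge (e.g. locale-aware comparison).
    if a < b:
        return a
    return b

def dispatching_min(a, b):
    # Choose a specific implementation by type; no single generic min().
    if isinstance(a, str) and isinstance(b, str):
        return string_min(a, b)
    elif isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return numeric_min(a, b)
    raise TypeError('no min() defined for %s and %s'
                    % (type(a).__name__, type(b).__name__))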

Finally, I note that min(a, b) applies to a sub-set of a sequence; it is a special case of min(a, b, c, ...).


Richard


Call signatures Posted: Nov 4, 2005 11:16 AM
Reply to this message Reply
Posted by: Ernesto Posse    Posts: 1 / Nickname: eposse / Registered: Nov, 2005
I was wondering: what is the proposed syntax for call signatures?

Guido mentioned some alternatives:

1) x: def(int, int) -> str
2) x: def(a: int, b: int) -> str
3) x: def(:int, :int) -> str
4) x: def(_: int, _: int) -> str
5) x: lambda(int, int) -> str
6) x: (int, int) -> str

I like all except 3 and 4.

I suppose other alternatives along the same line could use "function" or "fun" instead of "def" or "lambda".


What is Guido's preference these days?


Re: Call signatures Posted: Jan 27, 2006 2:46 PM
Reply to this message Reply
Posted by: Brantley Harris    Posts: 3 / Nickname: brantley / Registered: Jan, 2006
I guess I don't get it.

I mean I know Python is trying to get away from C/Java, but come on...

Why not this?
def int foo(int a, list b):
    int something = 5

You're changing syntax anyway, so you might as well.
And since you're messing with some of the core concepts of Python anyway, just drop the other shoe and allow:

__strict__ = True

at the module, class, or even function level.

And interfaces could be handled perfectly well with class @decorators:

class Interface:
    yadda yadda...

@Interface
class Whatever:
    yadda yadda...
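A sketch of how such a decorator could verify an interface at class-creation time; the implements() helper and IGreeter are invented here, and bear in mind that class decorators were not yet legal syntax in the Python of 2006:

def implements(interface):
    # Class decorator: insist that cls supplies every public method of interface.
    def check(cls):
        for name in dir(interface):
            if name.startswith('_'):
                continue
            if callable(getattr(interface, name)) and not hasattr(cls, name):
                raise TypeError('%s does not implement %s.%s'
                                % (cls.__name__, interface.__name__, name))
        return cls
    return check

class IGreeter(object):
    def greet(self):
        raise NotImplementedError

@implements(IGreeter)
class Greeter(object):
    def greet(self):
        return 'hello'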


Re: Optional Static Typing -- Stop the Flames! Posted: Feb 4, 2008 9:52 AM
Reply to this message Reply
Posted by: Wolfgang Lipp    Posts: 17 / Nickname: wolf2005 / Registered: Sep, 2005
I have found that the Cobra language deals with the problem of optional static typing and programming by contract in a very nice and unified way: http://cobra-language.com/docs/quality/

The big deal that I see here is that within Cobra blocks you can use all of the language without having to introduce arcane constructs:


class Customer

    var _contacts as List<of Contact>

    get contacts from var

    def addContact(contact as Contact)
        require
            contact not in .contacts
            contact.name
            contact.customer is nil
        ensure
            contact.customer == this
            .contacts.count == old .contacts.count + 1
        body
            contact.customer = this
            _contacts.add(contact)



While not all of the above can or should be readily implemented in Python, maybe it's possible to do something very similar this way:


@contract
def f(x, y):
    def require():
        isinstance(x, int)
        isinstance(y, float)
        x > y
    ...
...



So there is only a single decorator, and pre- and postconditions can use full-fledged Python. Testing could be integrated, too. Of course, method definitions would suffer a slight bloat.

