Is a type system a hindrance to programming creatively?

Just thinking out loud, for the sake of clarity.

Consider music improvisation. Should a musician stop and restart because one note was a bit off?

(I am not talking about reproducing Mozart here but improvisation, i.e., coming up with new tunes, maybe even a Mozart remix.)

I think not.

Programming languages without type systems don't complain much. This makes it easier to keep programming despite obvious errors, and that in turn makes it easy to _improvise_ algorithms.

Most improvised music can sound pretty _dirty_. This is why, after an improvisation session, what follows is an editing session that adds *structure* and *corrections* to make it sound _clean_.

- unit tests --> music idea
- programming without a type system --> improvisation
- functional testing / benchmarks --> editing the music

Interestingly, classical music, with all of its "harmony" formulae and chord sequences, maps analogically to type systems and design patterns.

So what do you prefer?

Stingy Mozart

or

Smooth jazz?

7 Comments

Ugh – I don't think your metaphor holds. A type system does not mean that you need to predeclare your types. For example, you can program in Haskell without declaring the types your functions accept as parameters or their return types. See type inference.

Also, you could say that Perl is itself very strongly typed: you declare the type of every value by its sigil, constantly. I'm not sure where that fits into your metaphor either.
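To make the sigil point concrete, here is a minimal sketch (my own variable names): the sigil announces the kind of container at every declaration *and* every access, so in a sense the "type" is declared over and over.

```perl
my $count = 42;                  # $ marks a scalar
my @notes = ('C', 'E', 'G');     # @ marks an array
my %chord = (root => 'C');       # % marks a hash

# The sigil is repeated at every use, so the kind of data
# is re-declared constantly, as the comment above says.
print $notes[0], "\n";      # prints "C"
print $chord{root}, "\n";   # prints "C"
```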

Actually, I kind of like the metaphor, although I prefer Mozart to most of the jazz I've made an effort to listen to. Then again, Moondog is pretty cool.

Tangential question: why do we feel Perl is an easier language to program in than many others? For me, not having to prepare a sophisticated type hierarchy is a big part of that. For example:


sub some_sub {
    do_stuff();
    return { a => 'b', c => 'd' };
}

Later in life:

sub some_sub {
    do_stuff();
    return { a => 'b', c => 'd', e => 'f' };
}

Sure - I return a hash ref in both cases; with stricter typing they would be different types. Now my client code that worked with the earlier version will still work with the newer version - no changes needed.
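A minimal self-contained sketch of that claim (with `do_stuff` stubbed out for illustration): the client only looks up the keys it knows about, so the extra key in the later version is simply ignored.

```perl
sub do_stuff { }    # stub: stands in for whatever work the sub really does

# the "later in life" version, with the extra key
sub some_sub {
    do_stuff();
    return { a => 'b', c => 'd', e => 'f' };
}

# Client code written against the earlier two-key version:
# it only reads keys a and c, so the new key e changes nothing.
my $result = some_sub();
print "$result->{a} $result->{c}\n";   # prints "b d"
```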

I could just as easily manipulate the symbol table to get a context-dependent some_sub:

{
    no strict 'refs';

    local *some_sub = sub {
        do_stuff();
        return { a => 'b', c => 'd', e => 'f' };
    };

    # .... call some_sub somewhere ....
}


Rigorous type safety surely has its place - but there can't be much argument that development under that kind of type system will be less 'improvised'.


Datatypes were invented by compiler writers to make their job easier. They do not make programming easier because people do not think in terms of datatypes.

If you go out into the streets and ask people how to multiply a number by ten, they say: put a zero on the end of it. That's string manipulation, not arithmetic. Apparently people can instantaneously convert between strings and numbers and back again.
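In a language without declared numeric types, that man-in-the-street answer maps directly onto code. A minimal Perl sketch (variable names are my own): both routes agree, because Perl converts between strings and numbers automatically.

```perl
my $n = 4;

my $by_string = $n . '0';   # "put a zero on the end": string manipulation
my $by_math   = $n * 10;    # ordinary arithmetic

# "40" and 40 compare equal numerically; the conversion is implicit
print $by_string == $by_math ? "same\n" : "different\n";   # prints "same"
```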

If you look at the code of any strongly-typed language you see things like this:

int iCount = 0;

Notice the "i". That's a reminder of what type Count is. If people thought in terms of datatypes, why would they need constant reminders of a variable's type?

Datatypes are just an extra burden for a programmer and should be done away with. People do not think in terms of datatypes.

Here's what Doug Crockford says:

"JavaScript's loose typing is a big benefit here because we are not burdened with a type system that is concerned about the lineage of classes. Instead, we can focus on the character of their contents."

"JavaScript: The Good Parts", Chapter 5: Inheritance.

The context is flexible composition of objects via functional means.

Flexible composition of objects ~~ flexible composition of relations between what you decide are your data and your algorithms.

Type systems seem to me an attempt to impose an immutable structure on a dynamic world. Nothing new in that - imagine a simplified reverse timeline: Go4 -> Carl Linnaeus -> Aristotle -> "That's a sabre-tooth, not a duck". It's even necessary - you simply haven't got time to analyse every item you encounter - but all classification systems are arbitrary and incomplete. Cows are mammals but also good bait for velociraptors.

I don't know that you can abolish type systems in programming languages entirely. There are some cultural norms that probably need to be accommodated: 4 plus 2 will be 6 to most people, 42 to a relative minority.


You don't need type systems to distinguish 6 from 42. You simply say that "+" is an operator that yields the sum of its two operands, and "." is an operator that yields the concatenation of the characters of its operands. Thus 4 + 2 yields 6, and 4 . 2 yields 42.
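That is exactly how Perl behaves - the operator, not the type of the operands, decides the meaning. A minimal sketch:

```perl
my $sum    = 4 + 2;   # numeric addition       -> 6
my $concat = 4 . 2;   # string concatenation   -> "42"

print "$sum\n";      # prints "6"
print "$concat\n";   # prints "42"
```

The design choice here is to disambiguate at the operator rather than at the variable: the same scalar can serve as a number or a string depending on which operator touches it.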


About [deleted]

I blog about Perl.