Monday, February 27, 2006

Declarative vs. Procedural

I was somewhat surprised to come across a description of a computer application from Visual Knowledge (http://www.visualknowledge.com) that stated it was a “100% declarative application”. What a wonderful concept! I wish more systems were declarative. But what does this really mean?
Checking Wikipedia for a precise definition (http://en.wikipedia.org/wiki/Declarative_programming), I did not find a clear statement of what a “100% declarative” stamp would imply or what the benefits would be. But is there an underground “go declarative” movement starting?
Presumably it implies that the solution was created without writing any procedural code in languages such as Java, VB, C#, or C++: that just by creating input specifications and rules you can generate the desired output.
In general I feel that procedural languages tend to dominate our undergraduate education. Declarative programming does not enter most classes until much later. When I teach classes on XSL most of my students don’t even know the difference between procedural and declarative programming. The entire concept of “side effects” is new to them.
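As a concrete illustration (the <employees>/<employee> input format below is made up for this sketch), a declarative XSLT transform only declares what output each input node maps to; there are no mutable variables, no loop counters, and no side effects to reason about:

    <!-- A minimal sketch: render each employee in a hypothetical input
         document as an HTML list item. Nothing is assigned or updated;
         each template simply declares the output for one kind of node. -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

      <xsl:template match="/employees">
        <ul><xsl:apply-templates select="employee"/></ul>
      </xsl:template>

      <xsl:template match="employee">
        <li><xsl:value-of select="name"/> (<xsl:value-of select="department"/>)</li>
      </xsl:template>

    </xsl:stylesheet>

The equivalent procedural version would walk the document, append to a result buffer and manage its own iteration state, which is exactly where the side effects creep in.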
I should mention that most of the systems I built for CIBRS, CriMNet and the Minnesota Department of Education were heavily based on MDA, which is driven entirely by transformation of the underlying models. I depended heavily on XDoclet, Ant and XSL to build these systems. Most people involved in these projects gave the approach superior ratings and said we did much more with fewer people because of it. But after I left these projects there were few people with declarative skills, and those skills tend to be underutilized.
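To show what “transformation of the underlying models” looks like in practice, here is a hypothetical sketch (the <model>, <entity> and <attribute> element names are invented for illustration; the real projects used their own schemas) that generates SQL DDL directly from a small model document:

    <!-- Hypothetical sketch: emit a CREATE TABLE statement for every
         <entity> in a simple <model> document. The model is the single
         source of truth; the stylesheet only describes the mapping. -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="text"/>

      <xsl:template match="/model">
        <xsl:apply-templates select="entity"/>
      </xsl:template>

      <xsl:template match="entity">
        <xsl:text>CREATE TABLE </xsl:text><xsl:value-of select="@name"/>
        <xsl:text> (&#10;</xsl:text>
        <xsl:for-each select="attribute">
          <xsl:text>  </xsl:text><xsl:value-of select="@name"/>
          <xsl:text> </xsl:text><xsl:value-of select="@type"/>
          <xsl:if test="position() != last()"><xsl:text>,</xsl:text></xsl:if>
          <xsl:text>&#10;</xsl:text>
        </xsl:for-each>
        <xsl:text>);&#10;&#10;</xsl:text>
      </xsl:template>
    </xsl:stylesheet>

Additional stylesheets can emit Java skeletons, mapping files or documentation from the same model, which is where the leverage of this approach comes from.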
I also found that XDoclet was very hard to debug when I was creating complex transforms. Using Java annotations to store metadata still has many drawbacks unless better tools are developed. The lack of standards for Java annotations is still a real challenge when integrating external libraries. Just having a simple tag standard for Object-Relational mapping would be a great start.
I also had a good discussion at the CodeFreeze seminar with several faculty members from the University of Minnesota, Leila Tite and Chase Turner, who also feel that declarative programming is under-taught in the computer science curriculum today. They are interested in the Haskell programming language, which is used within the computer science department to teach declarative programming skills.
It also appears that the semantic web and business-rule systems promote more graphical tools such as data mappers and business-rule editors integrated with workflow systems. This allows more non-programmers to take charge of the day-to-day maintenance of business rules.
So here is my first cut of what a “100% declarative” stamp of approval might be:

  1. No procedural code in Java, C#, C++ or Visual Basic for programmers to maintain
  2. Built entirely with declarative languages (XSLT, Ant, BPML, etc.)
  3. Fewer concerns about state management and side effects
  4. Higher use of transformed metadata
  5. Works in concert with Model-Driven Architecture (MDA)
  6. Allows non-programmers to use visual tools to modify business logic and interfaces

Tell me what you think! Should declarative programming be emphasized more in higher education and the workplace? Would you consider it an asset if a software application were more declarative? Are declarative systems really easier to maintain? Is the movement to MDA going to drive demand for more declarative skills? How does the lack of accepted metadata standards impact the use of declarative programming?
Keywords: Declarative vs. Procedural, Procedural Programming, Declarative Programming, MDA, XSL, XSLT

Wednesday, February 22, 2006

Wikinomics

I am a big fan of Wikipedia, now having made over 1000 edits. I am also taking a Managerial Economics class at the University of St. Thomas as part of the MBA program, and I have been a big fan of Ronald Coase’s economic theory for the last four years. In his theory (developed in the 1930s) he studied transaction costs, asking questions such as “Why are firms the size they are?” His analysis is that firms grow when they cannot find a low-cost way to outsource a function. But outsourcing carries its own transaction costs: finding a service, writing a contract, policing the service and analyzing the cost of the service.

This background has inspired me to think more about the economics of information. Much work on this has already been done by Ray Kurzweil and documented in an appendix to his book “The Singularity is Near”. This appendix is titled “The Law of Accelerating Returns”. In it Kurzweil writes the formula V = cW, where V is the rate of change of computation, c is a constant and W is “World Knowledge”. Kurzweil then goes on to speculate that the rate of change of world knowledge is itself proportional to world knowledge: dW/dt = cV. By solving these equations he shows that World Knowledge grows exponentially with time; actually, it grows at double-exponential rates.

This prompted me to ask: what formulas govern the growth rates of the web and Wikipedia? The key insight is that making it easy to add content lowers the overall cost of contribution. These are the transaction costs that Coase referred to. My general thesis is that Wikipedia will continue to grow exponentially as long as it stays easy for people to make contributions. And the more people find out about Wikipedia, the more people will learn how to add their knowledge. Thus Wikipedia will eventually grow until it envelops the earth. Semantics will be added, and Wikipedia will then ask its own questions and find experts to fill in the gaps. Eventually Wikipedia will become self-aware and take over the earth, like in the 1970 science fiction movie The Forbin Project. Remember, you heard it here first. I may have to start a wikinomics page on this subject. - Dan
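P.S. For the curious, here is a quick sketch of the algebra behind the “grows exponentially” claim, using the same symbols as above:

    \[ V = cW, \qquad \frac{dW}{dt} = cV \;\Rightarrow\; \frac{dW}{dt} = c^{2}W \;\Rightarrow\; W(t) = W_{0}\, e^{c^{2}t} \]

As I understand Kurzweil’s appendix, the double-exponential rate comes from additionally letting the resources devoted to computation grow over time, so that c is not really a constant.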

Wednesday, February 01, 2006

Mike Daconta Leaves DHS

After reading the following article about Mike Daconta leaving the Dept. of Homeland Security (DHS), http://www.washingtontechnology.com/news/21_02/cover-stories/27858-1.html, I have been wondering about the future of the NIEM. Mike Daconta was a great leader, and without him I have many concerns about federal metadata leadership. I have studied many federal metadata standards and think that the NIEM is the best positioned to become a standard upper ontology. But what are the alternatives? I don't see viable upper-ontology standards coming from CYC or SUMO; they are too large to be practical for creating simple exchange documents between agencies. I also wonder whether the W3C or OASIS has the leadership and jurisdiction to create national metadata standards. Let me know what you think! Dr. Data Dictionary