Thursday, December 31, 2009

Congratulations Andrew

Congratulations to Andrew on his OBE! It's been a while since I've caught up with him, but not only was he a critical part of ANSA when we were starting on Arjuna, more importantly he was also the external examiner for my PhD thesis! We did some work with him when he started Digitivity and during his stint at Citrix, and it's always been stimulating to talk with him about this or that, even if the topics aren't work related. Congratulations again Andrew. Now maybe I should nominate Santosh for next year.

A quick look back over 10 years

It's been an interesting decade as far as I'm concerned. Going from co-creating Arjuna Solutions through the Bluestone acquisition and the Bluestone-Arjuna Labs, then the HP acquisition and the HP-Arjuna Labs, spinning out and co-creating Arjuna Technologies, then the move to JBoss, followed by the Red Hat acquisition, a lot of work around SOA (which dates back to at least my time at JBoss) and finally my new role with the departure of Sacha.

Very busy and despite a few ups and downs, very interesting and a lot of fun. It's not a period where I'd change anything professionally. However, I hope the next decade has somewhat less company movement than the previous! But I'm sure the interest and fun will continue.

Wednesday, December 30, 2009

Where does the time go?

There are only a few days left in 2009 and that means just a few more until I'm officially back to work. Although I have managed to find time (make time?) to work on a few things, as usual there hasn't been as much of it as I'd expected (hoped?). Family life and seasonal activities have had an impact (as they should?). But it's better to have a longer list of things to do and not get to them all than to sit twiddling my thumbs watching TV.

Over the next few days I expect to finish off some more work related things but it's quite possible that drink and merriment will have to take priority: after all, once I'm officially back to work they won't get much chance for the next 12 months!

Friday, December 25, 2009

It's spooky when people know you too well

It's Christmas day and we've had family and friends come from across the country. Lots of good times and good food. But the spooky thing for me was when I started to unwrap gifts from people who hadn't coordinated their giving and yet they had a common theme: Star Trek. I've mentioned before that I'm a fan, but it's not something that comes up day-to-day. So it was all a complete surprise, particularly when the gifts came from so many different and disconnected individuals. My inner child is happy today. Merry Christmas!

Wednesday, December 23, 2009

JBoss in the Cloud

It's nice to see Bob, Marek, Mic and the team release StormGrind. There are a lot of interesting things going on in the Cloud, and in how open source in particular can influence it. Fortunately we've got some of the best people to help us keep pushing ahead with defining the Cloud. I think 2010 is definitely going to be a good year for JBoss and the Cloud.

Saturday, December 19, 2009

Dan O'Bannon

I can't believe I missed the news of the death of Dan O'Bannon! It's sad to see it hidden away in the corner of the news, given how influential he was. I suppose the first time I ran into his sphere of influence was Dark Star: if you've never seen the film then I definitely recommend it. Not many films can say they managed to combine an orange beachball-alien and a surfing astronaut so well! Of course Alien was more successful and probably more influential, but I think I'll always associate him more with Dark Star. It's a sad day, but I'm sure that wherever he is now he'll be teaching them about phenomenology.

Christmas is coming and ...

... I'm on vacation until January, but the next few days will be tying up some loose ends from work, catching up on a few things (work related) that I haven't had a chance to get to recently and then the rest of the holidays are mine ... all mine!! So I plan to get back to some pet projects that have been languishing for the past few months, finish some reading and maybe, just maybe, finish a paper I've been working on for a while.

Thursday, December 10, 2009

JBossWorld 2010

The Call For Papers has been announced. So if you're interested in presenting or just meeting up with JBoss engineers or users, get submitting! It's a great event to attend. Now I've just got to make sure I can present something technical again next year as I did this year: Keynotes are good, but I love the feedback you get from presenting papers, work in progress etc.

One of the first Web sites

While talking with some of my friends from Arjuna today, Stuart reminded me that we started playing with the Web in 1991/1992, when we created a Web site with one of the first releases of the CERN HTTP code for the original Arjuna Project. As Stuart recalled, it was so early in the evolution of the Web that one of the original CERN pages which maintained a list of available Web servers around the world (I suppose you could say a precursor to Google in that regard) had our site on the very first page for a long time. It's a shame the Internet Archive does not go back that far.

I know we often hear people ask "Where were you when Kennedy was shot?" or "Where were you when the Wall came down?", but I suppose in our industry a similar question would be "Where were you when the Web was started?" Well for me I was in the office I shared with Stuart, working on my PhD and playing with HTTP and HTML. Sometimes I'm surprised we got any real work done :-) !

Sunday, November 29, 2009

RESTful transactions round two

I've been working on a second draft of the RESTful transactions work that I've mentioned before. This time I'm doing it for Bill and his REST-* effort. I revised the original for JavaOne but didn't get a chance to use it in our presentation. So I'm taking this opportunity to apply some standards boiler-plate and bring it up to date. Plus it's always good to revisit something you did almost a decade ago and use the benefit of those intervening years and the experience gained.

Monday, November 23, 2009

Enterprise OSGi: two is obviously better than one

I think OSGi is important for several reasons. I think Enterprise OSGi is an interesting approach, particularly as it leverages JEE. I've even contributed to some of the work, for example around the transactions component. JBoss is doing a lot of implementation work around OSGi too.

I have to admit that I haven't been paying close attention to OSGi for a few months. However, I had heard about the new Apache Aries project. Unfortunately I have just heard from a friend about the Eclipse Gemini project. Now I've been involved with standards long enough to have experienced first hand the political games that rivals play with each other. It's unfortunate because it rarely benefits users, tending to obscure the reasons for choosing one approach over another, confuse people, and ultimately delay the uptake of the standard or technology involved.

Maybe I'm missing the underlying reasons why Oracle and SpringSource decided that Aries wasn't the right project for them. However, I really wish that as an industry driven primarily by technologists we could leave the politics behind and try to work far more collaboratively, and particularly where open source is concerned! As a slight aside, that's one of the things I really like about HPTS: it doesn't matter which company you're from, people talk and interact freely to try to better our collective understanding of problems and lessons learnt.

Update: I should point out that in the paragraph above I wasn't siding with Aries over Gemini, simply that Aries started first.

Saturday, November 21, 2009

The future of Java

For one reason or another I've been thinking about the future of Java for a while; more the language than the platform (JEE). I think that the JEE platform will continue to evolve over the coming years, more likely into a series of specific vertical solutions. But this entry isn't about the platform, it's about the language.

Although I've been using Java since it was known as Oak and have written a fair amount with it (for example, I wrote the first ever Java transaction implementation over Christmas 1996), it's never been my favourite programming language. There are some things that I liked about the language from the start, such as threading, but others, such as garbage collection (yes, I like the control of a good delete) and the lack of multiple inheritance, that I didn't. The Java language has certainly evolved over the past decade, mostly for the better (garbage collection is a lot better now and we have generics), but in general the language still takes a lowest common denominator approach. And it has got rather bloated.

Ignoring several assembly languages, over the years I've learnt many high level languages including Pascal, Mesa, Smalltalk-80, Lisp, Prolog, C/C++, Simula, D, Forth and of course Java. My favourite is still C++ though. Yes I know it has its flaws and yes I know it's not the most forgiving of languages. I can't quite put my finger on precisely why I still prefer C++. I remember when we were testing the first pre-releases of Cfront for AT&T back in the mid-1980s and wondering with Graeme and Stuart whether we could port it to our Ataris using Metacomco C. I seem to recall us making some progress, but Linux came on the scene bringing with it gcc. But none of this explains why I prefer C++. Maybe it's the level of direct control it gives you (as with C). Or maybe it's something else. Whatever, I'm sure it's all subjective.

Anyway, I digress. Where does this leave Java? Well I think if you look back over the past 40+ years of high level programming languages one thing is very obvious: change happens. Our ability to reason about complex algorithms and the best way of putting them into code evolves, whether it's declarative, procedural, object-oriented or something else. I think it's fairly clear that Java's dominance in the industry will wane. Yes it'll be legacy for many years to come; the 21st Century COBOL, so it will continue to be important. But something new is coming. Maybe not today and maybe not tomorrow. But it's coming nonetheless.

Friday, November 20, 2009

ArchiteCloud 2010

I have the pleasure of being on the ArchiteCloud 2010 program committee. If you've any papers hiding in your "to do" lists then get them in, as this promises to be a great event! Maybe I can use this as an excuse to visit Australia next year too!

Tuesday, November 17, 2009

Santa is an architect

It's drawing near to that time of the year again when thoughts turn to snow, presents, turkeys and all things festive. So it was that I was watching a program on TV yesterday where Santa was the main character and my 7 year old and I began to discuss the ways in which Santa manages to get presents to all of the good little girls and boys around the globe in a single night. Of course we covered all of the usual ideas, such as time dilation, wormholes and even time travel. My son thought that magic was the solution, but I pointed out that these days what with global warming and the fact that it's been shown that continual use of magic harms the environment, it's doubtful. Let's also not forget that magic reindeer produce a lot of CO2 as well as other effluent.

So where does that leave us (apart from with a rapidly disillusioned child)? The answer was obvious: although in the past he's probably used a combination of all of the above techniques (have to placate the child), today he's taken a software architecture course and figured out that federation works well and scales. He has millions (billions?) of proxies in each country who do his work for him. He sends them information about what needs getting (in advance of course) and relies on them to buy the presents and distribute them locally. Those proxies may themselves have proxies in a recursive manner. Yes we all know it's the elves who build and distribute the toys to the shops, but it's the masses of proxies that get the delivery work done. And of course these helpers are parents, grand-parents etc.

So next time the question arises you'll know the answer: Santa is a coordinator and we're all interposed coordinators in the grand scheme of things ;-)

Monday, November 09, 2009

In-memory durability and HPTS

Back in the 1980s when I was writing the proposal for my PhD work I was looking at various uses for replication (at that point, strong consistency protocols). There are a number of reasons for replicating data or an object, including high availability, fault tolerance through design diversity and improving application performance. For the latter this could include reading data from a physically closer replica, or one that resides on a faster machine or is available through a faster network path.

But in terms of how replication and transactions could play well together it was using replicas as "fast backing store" aka a highly available in-memory log that seemed the logical thing to concentrate on. We certainly had success in this approach, but the general idea of replication for in-memory durability didn't really seem to take off within the industry until relatively recently. I think one of the important reasons for this is that improvements in network speeds and faster processors have continued to outstrip disk performance, making these kinds of optimization less academic and more mainstream. So it was with a lot of interest that I listened to presentation after presentation at this year's HPTS about this approach. Of course there were presentations on improving disk speeds and using flash drives as a second-level cache too, so it was a good workshop all round.
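The core idea can be sketched very simply. This is purely an illustration with hypothetical names, not any real product's API: a log record counts as durable once enough independent in-memory replicas have acknowledged it, so committing need never wait on a disk.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: a transaction log that treats a record as durable
// once a majority of in-memory replicas acknowledge it, rather than
// forcing the record to disk. Losing a minority of machines then cannot
// lose the log.
interface LogReplica {
    boolean store(byte[] record); // true == acknowledged
}

class InMemoryReplica implements LogReplica {
    private final List<byte[]> records = new ArrayList<>();
    public boolean store(byte[] record) {
        records.add(record); // kept only in memory on this replica
        return true;
    }
}

class ReplicatedLog {
    private final LogReplica[] replicas;
    ReplicatedLog(LogReplica... replicas) { this.replicas = replicas; }

    // A record is durable once a majority of replicas hold it.
    boolean write(byte[] record) {
        int acks = 0;
        for (LogReplica r : replicas) {
            if (r.store(record)) acks++;
        }
        return acks > replicas.length / 2;
    }
}
```

A real implementation would of course push the records to the replicas over the network in parallel and handle fail-over and recovery of a replica's state, but the durability argument is exactly this majority-acknowledgement one.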

Friday, October 23, 2009

A great tribute to Jim

Congratulations to Tony, Savas and everyone else involved. This is a great tribute to both the person and scientist that is Jim Gray.

Monday, October 19, 2009

Interesting discussion on SOA Manifesto

I mentioned earlier that I'm on a group collaborating to create a SOA Manifesto along with Steve. Well as part of the effort to inform the public and solicit feedback I produced something for InfoQ. The result of that is a very useful discussion in the comments on that article, which I encourage anyone interested in the SOA Manifesto to check out. Unfortunately I was in Boston last week in meetings so didn't have much chance to participate in the discussion, but as Steve points out, we'll certainly take it to the working group. Thanks to everyone who participated!

Wednesday, September 30, 2009

SOA transactions and reservations

Arnon and I have spoken about SOA transactions in the past, so it was interesting to read his latest article (book chapter?).

It's a nice paper to read, though I disagree with several things he has to say. For instance, I don't like the term "semi-state" or that somehow the use of compensations increases the service's contract footprint. Furthermore compensations (not just Sagas) don't have to use locks. As we kept saying during BTP development, that's a back-end service implementation choice that doesn't have to be exposed to users.

The reservation pattern is important and it's something that's been described several times before, particularly in the area of specific implementation approaches such as BTP, WS-TX and WS-CAF. So although I haven't seen Arnon's book, I can appreciate why he's including this chapter and I liked the discussions on possible risks when using the pattern. However, it would be good to see the aforementioned approaches referenced, particularly since there's a discussion about how you could do this with EJB3.

But the one thing (the elephant in the room) that is glossed over in the discussion (which isn't if you read about the reservation approach elsewhere) is that there's something in the environment directing the flow of interactions across many services. If it looks like a coordinator, behaves like a coordinator and smells like a coordinator then chances are it's a coordinator.
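To make that concrete, the pattern can be sketched as follows. The names here are mine for illustration, not from Arnon's chapter or any of the specifications: each service exposes reserve/confirm/cancel, and something has to drive that flow across all of them.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the reservation pattern. Whatever we call the
// thing driving the flow below, it is making the reserve calls, deciding
// the outcome and delivering confirm or cancel -- i.e. it's a coordinator.
interface ReservableService {
    boolean reserve(String item);
    void confirm(String item);
    void cancel(String item);
}

// A trivial service implementation for illustration.
class SimpleService implements ReservableService {
    private final boolean available;
    boolean confirmed, cancelled;
    SimpleService(boolean available) { this.available = available; }
    public boolean reserve(String item) { return available; }
    public void confirm(String item) { confirmed = true; }
    public void cancel(String item) { cancelled = true; }
}

class ReservationCoordinator {
    // Reserve with every service; confirm all on success, cancel any
    // reservations already made if one service refuses.
    boolean book(String item, ReservableService... services) {
        List<ReservableService> reserved = new ArrayList<>();
        for (ReservableService s : services) {
            if (s.reserve(item)) {
                reserved.add(s);
            } else {
                for (ReservableService r : reserved) r.cancel(item);
                return false;
            }
        }
        for (ReservableService s : reserved) s.confirm(item);
        return true;
    }
}
```

Note how the reserve round and the confirm/cancel round mirror the two phases of a commit protocol, which is rather the point.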

Tuesday, September 29, 2009

Native transactions

It was interesting reading The Power of Native Transactions presentation, which could probably be subtitled: When You Don't Own a Transaction Manager, What Do You Do? Actually I don't mean to pick on Spring as this is a problem we used to come across many times when we were an independent TM vendor. Typically we'd hear "Aren't the transactions in my database good enough?", "Why do I need recovery?" and that old but good one "I'm not interested in distributed transactions so I don't need two-phase commit."

I've said enough times that traditional transactions (aka ACID) aren't suitable for every occasion. The fact they have been shoehorned into situations where they aren't needed or aren't applicable has hurt the industry as well as disillusioned some people. Probably as a result, as we saw throughout the BTP effort people equate two-phase commit with ACID semantics, which is clearly wrong. Or that the two phases have to somehow be tied in with prepare/commit/rollback, which as BTP (and others) showed is again wrong.

At some point I need to finish my paper on consensus in distributed systems, because it's fairly obvious that some people who see transactions (and specifically 2PC) as "evil" and something that can be "programmed around" don't fully understand what's happening under the covers of any good transaction manager. Yes it's possible that some uses of transactions can be redesigned without them, but that doesn't mean they all can be.

Anyway, back to the topic at hand. I read the presentation with interest because it's always good to hear experiences from using transaction systems. What follows is almost a review of the presentation as if I had to review it for inclusion (or not) in a conference. I blame the fact that I sit on so many program committees and review a lot of papers each year: it alters your mindset a bit.

First I agree with one of the core messages in the Spring paper: that sometimes you don't need transactions at all and that one-size doesn't fit all. But sometimes you do need a good transaction manager and you need to understand when and why.

It would have been good to see a discussion of local and global transactions, but then I suppose the point of the talk is to emphasize the use of local transactions (through a native transaction manager). What I find frustrating is that yet again someone equates JDBC save points with nested transactions. Yes you can map a save point to a transaction boundary, but that doesn't give you nested commit semantics or concurrency control, for instance. If you're going to talk about nested transactions, please stick with the standard definition. Even OTS managed to do that, despite breaking the model in other ways.

I know (hope) that the JBossTS team are going to say a thing or two about the common misunderstandings and misconceptions around XA, so I'll try not to steal their thunder. But I do need to point out that the read-only flag is neither a Spring invention nor something only found in specific databases: it's a core part of two-phase commit (through the read-only vote that can be returned during prepare). So JTA does support it. Of course how the resource manager figures out how to return read-only is something that the transaction manager doesn't mandate (and it shouldn't). So in that regard an annotation makes sense. But other approaches are possible (e.g., making locking rules explicit within the business object.)
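For anyone who hasn't seen the optimization before, here's a minimal sketch of how the read-only vote works. The names are illustrative only (not JTA's actual interfaces): a participant that did no work votes read-only during prepare and the coordinator drops it from the second phase entirely.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the read-only optimisation in two-phase commit. A participant
// that made no updates votes READ_ONLY at prepare and can forget the
// transaction; the coordinator never sends it a second-phase message.
enum Vote { COMMIT, ROLLBACK, READ_ONLY }

interface Participant {
    Vote prepare();
    void commit();
    void rollback();
}

// Records second-phase traffic so we can see who got contacted.
class RecordingParticipant implements Participant {
    private final Vote vote;
    int secondPhaseCalls;
    RecordingParticipant(Vote vote) { this.vote = vote; }
    public Vote prepare() { return vote; }
    public void commit() { secondPhaseCalls++; }
    public void rollback() { secondPhaseCalls++; }
}

class TwoPhaseCoordinator {
    // Returns true if the transaction committed.
    boolean complete(Participant... participants) {
        List<Participant> secondPhase = new ArrayList<>();
        for (Participant p : participants) {
            Vote v = p.prepare();
            if (v == Vote.ROLLBACK) {
                for (Participant q : secondPhase) q.rollback();
                return false;
            }
            if (v == Vote.COMMIT) secondPhase.add(p); // READ_ONLY drops out here
        }
        for (Participant p : secondPhase) p.commit();
        return true;
    }
}
```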

The presentation also gives us the notion that 2PC is for distributed transactions only. No! As I said above, 2PC is a consensus protocol so it's needed when you have more than one participant, even if they are on the same machine. Unless of course you're not interested in them reaching agreement. So please, don't equate 2PC with distributed transactions: I also know of many distributed systems using transactions where 2PC isn't the norm!

I hate subjective statements like "XA is Non-Trivial to Set Up." Yes, so is the brake system in my car, but I need it nonetheless! So although XA can be a PITA at times, if used correctly (in the right situations) the cost is well worth paying. Plus it's often a one-off cost that over the lifetime of a deployment fades into insignificance compared to the benefits, or the setup costs of other components. So try to look on this objectively, weighing up the pros and cons (which is maybe what the presentation is trying to say, although perhaps for obvious reasons it seems to push the non-XA approaches more.)

Something else that surprised me about the presentation was the implication that an application that uses an XA transaction manager, or perhaps one that implements 2PC in general, must incur the overhead of a log even if there's a single resource involved in the transaction (something which using "native" transactions would miraculously avoid). So the one-phase commit optimization hasn't been invented? Once again this is something that's part of XA too. Any good transaction manager implementation should take advantage of this, and in this situation you wouldn't get a log created! And I'm not even going to bother about other optimizations such as presumed abort.
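Again purely as an illustration (hypothetical names, not any particular transaction manager's code): with a single resource there is no one for the coordinator to reach agreement with, so it can delegate the decision to the resource itself, skipping the prepare round and the log write entirely.

```java
// Sketch of the one-phase commit optimisation: with exactly one resource
// there is no consensus to reach, so the coordinator hands the commit
// decision to the resource directly -- no prepare round, no log record.
interface Resource {
    void prepare();
    void commit();
    void commitOnePhase();
}

// Counts the calls it receives, so we can see which path was taken.
class CountingResource implements Resource {
    int prepares, commits, onePhaseCommits;
    public void prepare() { prepares++; }
    public void commit() { commits++; }
    public void commitOnePhase() { onePhaseCommits++; }
}

class OnePhaseAwareCoordinator {
    int logWrites;
    void commit(Resource... resources) {
        if (resources.length == 1) {
            resources[0].commitOnePhase(); // resource decides; nothing to log
            return;
        }
        for (Resource r : resources) r.prepare();
        logWrites++; // only a multi-resource outcome must be made durable
        for (Resource r : resources) r.commit();
    }
}
```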

I suppose this misunderstanding leads to another subjective statement: "Native Transactions - As Efficient As It Gets". Followed by the recommendation that you should try to use multiple (independent) native transactions, with a single resource in each of course. Yes, that's also one of the extended transaction models we documented when developing The Additional Structuring Mechanisms for the OTS. However, as was pointed out back then, you don't get the same semantics (remember what I was saying about consensus?) If you need atomicity then there are very few alternatives outside of magic or quantum mechanics!

In conclusion, I think the presentation wasn't too bad if a little vendor specific (which makes sense given the background and conference). But if you're really interested in transactions, where to use them, where not to use them, and precisely what the trade-offs are in a more objective manner then there's a lot of other good information out there.

Vacation, decorating, Spring and reservations

I'm on vacation this week, taking the time to decorate my 7 year old son's bedroom. If you want to get away from it all and let your mind come at things from different angles then I can recommend it (though probably not if you're a painter/decorator by trade!) I find it relaxing enough that my mind wanders in a good way. I used to find the same was true when I was a kid at school making and then painting models while thinking through exam questions.

Anyway, over the next few days I hope to write up my thoughts on a couple of things I noticed over the past month or so. One is a presentation on Native Transactions in Spring and the other is on the Reservation Pattern and SOA Transactions. Yes, there's a common theme there!

Monday, September 21, 2009

Elite is 25 years old?!

I didn't realise that one of my favourite games of all time, Elite, is 25 years old! I remember buying this for my BBC Model B computer and playing it at all hours of the day during my early university life. Happy memories!

Wednesday, September 16, 2009

The best laid plans ...

You go and do something with the best intentions and it suffers a few minor teething problems! With the benefit of hindsight I think we would definitely have gone about it differently, but at least it hasn't all been negative. Hopefully something good will come out of the various discussions, because that was always our aim!

Monday, September 14, 2009

Mind maps

I first came across mind maps back in the late 1980s when a friend/colleague at the University started to use them. I didn't really think any more of them until recently, when another friend pointed me at XMind. Very nice piece of software and it's opened up a whole new way of doing things for me. Gone are my scraps of paper, hastily scrawled notes and other ad hoc approaches!

Monday, August 24, 2009

Google books

Now here's a book I haven't seen in a long time! Three of the authors were/are professors at the University when I was doing my PhD and several of the others visited regularly. A great book and still relevant today!

Saturday, August 15, 2009

JBossWorld 2009

It's almost upon us! This year is unique for a couple of reasons: first we're holding the event at the same time and location as Red Hat Summit, which means we could have some interesting audience cross-overs, and second because this will be my first time as CTO since taking over from Sacha!

This year I'm giving a session on our work on testable architectures as well as my keynote. Normally I don't have any difficulties preparing presentations for conferences and workshops. I've done several keynotes in the past for other events, but never one in my current role. It's quite daunting, given the various people I'm following in this role. Maybe it's the masochist in me, but I'm looking forward to it. Now if only I could finish the blasted presentation!

Back and into the fray

Back from vacation and just spent two days catching up on email (lots of things ignored on the assumption that if they're important I'll see them again!) Now getting ready for JBoss World!

Sunday, July 26, 2009

Canadian downtime

I'm going on vacation for a couple of weeks in a few days. This is our semi-annual visit to the parents-in-law in Canada. Should be good weather and is so far in the "outback" that doing nothing and relaxing is the norm. Looking forward to it, though I'll be taking some papers to read and write. Blogging will take a back seat though.


I don't use Facebook or any social networking site, but I know people who do, including my wife. So I was surprised and annoyed when I heard via Savas that "Facebook has agreed to let third party advertisers use your posted pictures without your permission. Click on SETTINGS up at the top where you see the log out link. Select PRIVACY. Then select NEWS FEEDS AND WALL. Next select the tab that reads FACE BOOK ADS. There is a drop down box, select NO ONE. Then SAVE your changes. Please repost!"

I'm doing more and more work with "the cloud" (whatever that means!) and it's things like this (security, identity, trust) that concern me. Maybe when I finalize my opinions around this and other aspects I'll write something up. For now beware. And if you're on Facebook maybe you should change your setting?

Saturday, July 18, 2009


I grew up on Richard Feynman though wasn't really able to appreciate him until I started my physics undergraduate degree. He's definitely a hero of mine. One of the best programs I've watched about him didn't actually concentrate on his deep physics background, but was about his quest to reach Tuva. So it was with pleasure that I came across Project Tuva from our friends at MSFT.

Monday, July 13, 2009

Memories ...

While looking for something else, I came across this old ANSA/Esprit project proposal from about 1989. One of the earliest external project memories I have and a good one too! Then I thought "Oh, that's 2 decades ago! Oh frack!"

Saturday, July 11, 2009


So far my adventures in JDK 1.6 land have been constrained to a 10 year old Windows box, so I decided to upgrade my Mac. Unfortunately the machine I've got isn't 64 bit so the official Apple version doesn't work. Fortunately the SoyLatte distribution seems to work very well. Yet another example of good open source at work.

Friday, July 10, 2009

Scala round 2

I mentioned earlier this year how Mic had pointed me at Scala. I haven't had nearly enough time to play with it (which usually means me implementing a transaction service in the language!) but from what I've seen I'm impressed. It's interesting to read what others think too.

Wednesday, July 08, 2009

Walking with Dinosaurs

I remember watching Walking with Dinosaurs when it first aired on TV and being impressed with the graphics and story. The rest of my family were equally impressed. So when we heard about the Arena Spectacular version and that it was coming near us, it was something we had to see. I took the day off work to make sure no one could plan anything for me to get in the way and we went to see it: well worth it and I'm extremely impressed! So if you like dinosaurs or your kids do then go and see it if you get the chance. If you don't have any kids then borrow some so you won't look out of place!

Thursday, July 02, 2009

Principles of Transaction Processing

I've known Eric as a colleague and friend for a long time, but my first encounter with him was through his and Phil's first edition of Principles of Transaction Processing. Along with Jim's book, it has a key place on my bookshelf. Over the years I've probably bought a dozen or so copies for various groups I've worked with. Well it's good to be able to say that they've released a second edition. I was one of the reviewers of the book over the past couple of years so I know this is a solid replacement for the original. Well done!

Wednesday, June 24, 2009

Supporting performance?

Given my background in fault tolerant distributed systems, performance metrics are something that come up time and time again. I've spent many a happy hour, day, week, month poring over timings, stack dumps etc. to figure out how to make RPCs go faster, replication scale and transaction logs perform. (Funnily enough I was doing the latter again over Christmas!) But this is something we've all had to do when working on Arjuna.

If you've ever had to do performance tuning and testing then you'll know that it can be fun, but often an almost never ending task. Then there's the comparisons between different implementations. This isn't just a vendor-specific area: even in academia there's always a bit of "mine is faster than yours" mentality. I suppose it's human nature and in some ways it's good and pushes you to improve. However, performance measurements need to be taken in a scientific manner otherwise they are useless. It's no good to simply state that you get "X transactions a second" if you don't specify the conditions under which that figure was achieved. This is true for a number of performance metrics, not just in the computing industry: there's a reason world records in athletics have defined conditions, for instance.

Now of course you may not have the same hardware to test on, and the tests themselves can be notoriously hard to obtain from one vendor/competitor to another. What tuning parameters they use as well as the general configuration can also be difficult to obtain. This is why Jim Gray and others wrote A Measure of Transaction Processing Power (though the principles in there can be applied much more widely.) This eventually produced the TPC. But the general principle is that a scientific approach to performance metrics is the right way (though it's still possible to skew them in your favour.)

Whenever I see or hear of performance figures I always want to know the conditions (hardware, configuration, wind conditions etc.) It's not possible for me to take them seriously unless I have this additional data. Giving out the tests themselves is also a good thing. However, performance is not something you can support like you do with, say, transactions, where you either are transactional or you're not. With performance it can be a never ending struggle to keep improving, especially as hardware and software improve and your competitors get better too; you've always got performance even if it's bad! Performance is something that for some users can be as critical as new capabilities, or even more so. But it can't be performance at the cost of reliability or robustness: something that screams along faster than a speeding bullet yet crashes after 24 hours is unlikely to be as useful as something that is half as fast yet able to stay available indefinitely.

Tuesday, June 23, 2009

Internet dial tone?

The other day I had problems with my ISP that meant my connection to the rest of the world was up and down like a yo-yo. I self-diagnosed the problem (the ISP help desk was pretty useless) and in the course of doing so I realized something: where's the equivalent of the telephone dial tone for the internet? Well, if you're like me it's either Google or BBC News. Over the years, whenever I need to determine if there's a problem with my connection or some site I'm trying to use, I'll use one of these sites to double-check my own connectivity. OK, it's not exactly a science, but I assume that if I can talk to one or other of them then the fault lies elsewhere.
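For what it's worth, the check amounts to something like this little sketch (the class name is mine and the host/port in main is just an arbitrary example of a site you trust to be up):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class DialTone {
    // Try to open a TCP connection to a well-known host within a timeout.
    // If it succeeds, assume my link is fine and the fault lies elsewhere.
    static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("dial tone? " + reachable("www.bbc.co.uk", 80, 3000));
    }
}
```

It's still only a heuristic, of course: the chosen site being down looks exactly like your own connection being down.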

Wednesday, June 17, 2009

CFP for MW4SOC 2009

4th Middleware for Service-Oriented Computing (MW4SOC)
Workshop at the ACM/IFIP/USENIX Middleware Conference

Nov 30 – Dec 4, 2009
Urbana-Champaign, Illinois, USA

This workshop has its own ISBN and will be published as part of the ACM International Conference Proceedings Series and included in the ACM Digital Library.

Important Dates
Paper submission: August 1, 2009
Author notification: September 15, 2009
Camera-ready copies: October 1, 2009
Workshop date: November 30, 2009

Call details
Service Oriented Computing (SOC) is a computing paradigm broadly pushed by vendors, utilizing and providing services to support the rapid and scalable development of distributed applications in heterogeneous environments. However, the influence of SOC today goes far beyond the concepts of the original disciplines that spawned it. Many would argue that areas like business process modelling and management, Web2.0-style applications, data as a service, and even cloud computing emerge mainly due to the shift in paradigm towards SOC. Nevertheless, there is still a strong need to merge technology with an understanding of business processes and organizational structures, a combination of recognizing an enterprise's pain points and the potential solutions that can be applied to correct them.

While the immediate need of middleware support for SOC is evident, current approaches and solutions still fall short by primarily providing support for only the EAI aspect of SOC and do not sufficiently address issues such as service discovery, re-use, re-purpose, composition and aggregation support, service management, monitoring, and deployment and maintenance of large-scale heterogeneous infrastructures and applications. Moreover, quality properties (in particular dependability and security) need to be addressed not only by interfacing and communication standards, but also in terms of integrated middleware support. Recently, massive-scale and mobility were added to the challenges for Middleware for SOC.

The workshop consequently welcomes contributions on how specifically service oriented middleware can address the above challenges, to what extent it has to be service oriented by itself, and in particular how quality properties are supported.

Topics of interest
* Architectures and platforms for Middleware for SOC.
* Core Middleware support for deployment, composition, and interaction.
* Integration of SLA (service level agreement) and/or technical policy support through middleware.
* Middleware support for service management, maintenance, monitoring, and control.
* Middleware support for integration of business functions and organizational structures into Service oriented Systems (SOS).
* Evaluation and experience reports of middleware for SOC and service oriented middleware.

Workshop co-chairs
Karl M. Göschka (chair)
Schahram Dustdar
Frank Leymann
Helen Paik

Organizational chair
Lorenz Froihofer

Program committee
Sami Bhiri, DERI (Ireland)
Paul Brebner, NICTA (Australia)
Gianpaolo Cugola, Politecnico di Milano (Italy)
Francisco Curbera, IBM (USA)
Frank Eliassen, University of Oslo (Norway)
Walid Gaaloul, Institut Telecom (France)
Harald C. Gall, Universität Zürich (Switzerland)
Nikolaos Georgantas, INRIA (France)
Chirine Ghedira, Univ. of Lyon I (France)
Svein Hallsteinsen, SINTEF (Norway)
Yanbo Han, ICT Chinese Academy of Sciences (China)
Valérie Issarny, INRIA (France)
Arno Jacobsen, Univ. Toronto (Canada)
Mehdi Jazayeri, Università della Svizzera Italiana (Switzerland)
Bernd Krämer, University of Hagen (Germany)
Mark Little, JBoss (USA)
Heiko Ludwig, IBM Research (USA)
Hamid Reza Motahari Nezhad, HP Labs (USA)
Nanjangud C. Narendra, IBM Research (India)
Rui Oliveira, Universidade do Minho (Portugal)
Cesare Pautasso, Università della Svizzera Italiana (Switzerland)
Fernando Pedone, Università della Svizzera Italiana (Switzerland)
Jose Pereira, Universidade do Minho (Portugal)
Florian Rosenberg, Vienna University of Technology (Austria)
Regis Saint-Paul, CREATE-NET (Italy)
Dietmar Schreiner, Vienna University of Technology (Austria)
Bruno Schulze, National Lab for Scientific Computing (Brazil)
Stefan Tai, Institut für Angewandte Informatik und Formale Beschreibungsverfahren - AIFB, Karlsruhe (Germany)
Aad van Moorsel, University of Newcastle (UK)
Eric Wohlstadter, University of British Columbia (Canada)
Raymond Wong, UNSW (Australia)
Roman Vitenberg, University of Oslo (Norway)
Liming Zhu, NICTA (Australia)

Sunday, June 14, 2009

The next wave?

I've been asked several times over the past few months what I think will be the next technology wave. If I knew that I'd be writing this entry from my own personal island in the sun! I've been involved in a few of these technology waves over the years, both as a user and a definer, but predicting them is another thing entirely. For instance, who would have guessed at the turn of the century that SOAP would have taken on the major role it has today? Or that open source would have become such a defining wave?

I have my own theory of how technology waves begin, and it wasn't until I watched a BBC programme on rogue waves for the second time that I found a decent analogy. The relatively new theory (based on Schrödinger's equation) goes that these super waves, which are big enough to sink ships, are formed when energy is "stolen" from one wave to feed another. This builds and builds to create these towering monsters. Well, I think technology waves are very similar: something relatively innocuous, such as SOAP, pulls in energy from other fields, such as EAI and the Web, to grow into a disruptive influence when it reaches a tipping point. This also makes it difficult to predict a priori whether or not something will be a wave: many different things contribute.

With that in mind, what do I think will influence the next technology waves? Here are a few ideas, though by no means all. I should add a disclaimer that these are my own personal opinions and not necessarily those of my employer:

  • A new programming language? Over the past 40 years or more we've seen languages come and go. The only constant is binary! Periodically we go back to high-level versus low-level language debates (e.g., can compilers really optimize as well as hand-written machine code?) Java has been influential for a long time, but if history has taught us anything it's that everything has a season, so it's only natural that at some point Java's popularity will wane. But I'm not sure that a single language will replace it. Java didn't replace C, C++, COBOL or Lisp, for example. With the increased popularity of languages such as Erlang and Scala, and even C++ making a comeback, variety is the right approach. Yet again a case of one size doesn't fit all. When I was at university we learnt a lot of different languages, such as Pascal, Occam, Concurrent Euclid, Ada, C, C++, Lisp, Prolog, Forth, Fortran, 6502 assembler (still one of my favourites), 68K, etc. When Java came along that seemed to change, with the focus more on a single language. Hopefully we'll come full circle, because as an industry we can't afford to keep reinventing software every 10 years to suit a new language.
  • What about Cloud/Virtualization? Yes, there's a lot of hype in this area and I do think it offers some very interesting possibilities. But I'm not sure it's a wave in its own right. I suspect that we're still missing something to turn this into a Super Technology Wave. That could be SOA, fault tolerance, the economy, or Skynet.
  • It will finally dawn on the masses that security (including identity, trust etc.) is something we take for granted and yet is not available (at all, or sufficiently) in many of the things we need. (See Cloud, for example.) Security as an afterthought should be replaced with security as an integral part of whatever we do. Yet again not a wave in and of itself, but something that will be pulled into one, I hope.
  • Of course REST has been around for a long time and, given the many discussions and debates that have raged over the past few years, we're definitely seeing more and more take-up. I'm not going to debate the pros and cons here (I've done that before), but I am sure this will become a wave, if it isn't one already.
  • A unified modeling approach to building distributed systems, pulling together events, messages, etc. JJ has been talking about this for a while and it would be nice to see.
So there are a few thoughts. There are more, such as around designing for testability and HCI, but those can wait for another time. Maybe some or all of the above will be sucked into the new technology wave. Maybe none of them will. That's why I'm not sat writing this on some sun-soaked beach on my own private island!

Friday, June 12, 2009

Utter disbelief

I've been involved with standards since the early 1990s, working within the OMG, OASIS, GGF, W3C and others. They all have their rules and regulations, and most of the time the technical committees are populated by vendors. That's not because these organizations are closed to end-users, but I think many of those end-users believe it takes a lot of time and effort to participate. They'd be right.

Over those years I've been involved in a lot of political wrangling. You have to know when to push and when to give in. Priorities are important and today's foe could be tomorrow's ally. Sometimes people can get very emotional about this proposal or that, trying hard to justify why it should be voted for or against. That's good, because passion is important in what we do. But sometimes the arguments can be very fatuous. It wasn't until the other day, though, that I realized I hadn't heard them all!

To protect individuals I won't say which meeting it was, but suffice it to say that there was a face-to-face meeting recently that I attended by phone. Because of the time differences it meant I was doing my normal work from early in the morning my time and then in the evening jumping on a call for another 6 hours. For 3 days running! Anyway, during the 2nd day one of the vendors was trying to get their point across to everyone else and failing: pretty much everyone had decided to vote against, but this vendor kept trying hard. I had assumed that because this vendor (name withheld!) has been involved with standards for a while they knew the rules of the game. But apparently not, or at least the people representing them didn't.

So when it became pretty obvious that they were going to lose (this standards group uses the rule that each vendor has one vote, no matter how many employees it has on the working group), the cry went up: "But what about the thousands of customers we have who want things our way?" Huh?! WTF?! First of all, that's a very subjective statement. How do we know those customers exist and what they really want? If we had to take them all into account then we'd have to solicit their responses on all votes. Secondly, if we went that route it would become a "customer war", with vendor A saying "I've got 1000 customers" and vendor B saying "well, I've got 1001!" This kind of weighted voting doesn't work! If customers really want a say then they can sign up to any of the standards groups. In fact I'd really encourage them to do that, because it makes a lot of difference: the best standards all have customer input!

After I heard this and stopped choking, I realized that it was really a case of desperation on the part of this vendor. Hopefully they'll read the rules before they turn up next time.

Tuesday, June 09, 2009

Web 2.0 architectures

One of the high points of any JavaOne that I go to is getting to meet up with friends such as Duane and James. As always we have fun catching up over a few drinks and good food (hopefully the pictures of this year's "event" won't turn up!) But this year was more special than usual because their book is out! It's been a long time in the works but it is well worth the wait. If you're at all interested in Web 2.0 then take a look at this book!

Friday, June 05, 2009


I've no idea if this is the last JavaOne, as some rumours suggest (others say it may simply become an add-on to Oracle World), but it is with some sadness that I think about that and the acquisition of Sun by Oracle. Every time I think of this I keep hearing the words of Don't Let The Sun Go Down On Me (used to great effect in The Lost Boys.)

Putting aside my role within Red Hat, I've always had a soft spot for Sun, going back to the start of my career. In one way or another I've worked with Sun hardware and software for over 20 years. Back then, in the mid-1980's, the workstation was really starting to come into its own as the old batch-driven mainframe systems tailed off. My first work computer was a Whitechapel, the UK's attempt to enter that market. A great machine, particularly if you were new to administering multi-user Unix.

But in the Arjuna project the real king of machines was the Sun 3/60 running SunOS. For years we all strove to own one, either as a hand-me-down or when we got new funding. We went through most of the Sun-4 series and into the Solaris years, with each year bringing a new top-of-the-line machine for one of us to manage (these were multi-user machines, of course). Our project was so prominent within the University that others followed our lead where they could, and where they couldn't they looked on slightly enviously.

During that time we often spoke with various Sun engineers in the hardware and software groups, e.g., Jim Waldo. This was on things as varied as transactions (not surprising really) through to operating systems (we got one of the first drops of the Spring operating system) and distributed computing. Then when Sun released Oak and eventually Java it was a natural extension for us to get involved with that heavily.

It was in the early 1990's that things started to change, though. We were using Linux from the first public release Linus made (I remember Stuart and I taking it home one evening to replace the Minix we'd been running on the Ataris). Then, when the project bought us Pentium machines, Stuart made the switch from a Sparc 2 running Solaris. He and I ran various benchmarks against my Sparc and, for most of the things we needed at the time (compiling, execution etc.), the Pentium/Linux combination was faster as well as being a lot cheaper! The rest, as they say, is history. Year on year the PC and Linux combo got faster and faster, laptops came into this pretty early too, and the project steadily moved away from everything Sun. By the end of the 1990's the last few Sun workstations still in use were several years old and in the minority.

From conversations I've had with others over the years I think this is a very similar pattern to elsewhere. Was there anything Sun could have done about this? Maybe, and I'm sure that question will be debated for years to come. But for now it is with sadness that I contemplate a world without Sun. They helped shape what I am today, from hardware to operating systems to programming language(s) and so much more. In that regard I will always be in their debt.

Sunday, May 31, 2009

The paperless office?

If you ever come to my house you'll see that we're a very heavily literate family, with books adorning bookshelves in pretty much every room of the place. But recently my wife took to reading in ebook format, for convenience (she can now read with the lights out) and for storage (one SD card can accommodate 1000s of books and is a heck of a lot easier to store than 1000s of books!). I managed to convert an old Jornada 720 that I had from my HP days into an ebook reader, so it was also good for the environment in that regard. But it got me thinking again about paper vs bits as far as reading material goes.

I've had this discussion with friends and colleagues for decades and my preference has always been paper over bits. Excluding the tens/hundreds of papers I read each year for work, or simply because I want to, I usually have at least another hundred or so that I go through for the various conferences/workshops that I'm involved with. Then there are the two or three PhD or MSc theses that I have to read as well.

Most of these papers come in electronic format, although there are the odd exceptions. It's true that I rarely print out papers that come in, say, PDF. But I do print some of them off. Why? For convenience, for a start: I can't remember the last time a stack of A4 sheets ran out of battery, and marking them up is a lot easier. But also because I like the tactile feel of paper. The same goes for books: I much prefer physical books to ebooks, even though the content is the same. Then there's the connectedness aspect of reading an ebook/epaper: when I'm reading them on my laptop and my mind starts to stray I quickly head for the internet or Eclipse, or something else that can take my attention and soak up my time. With a physical copy of the same material I find that my mind wanders less, and if it does it tends not to go too far. Weird.

Probably the only time I yearn for an electronic version of a book or paper is if I want to do a search for something. So I suppose what I really need is for someone to invent an ebook reader that gives you the look and feel of paper!

Saturday, May 23, 2009

Feeling more human again

It's a long weekend here (national holiday on Monday). I have a number of things I need to do for work, but at the moment I'm back coding on JBossTS for fun. Not something I have to do, but definitely something I like doing. It (coding, architecture, etc.) makes me feel good! I'm off to JavaOne next week so I'll take a few more coding tasks with me, probably on some other projects.

Saturday, May 16, 2009

The clock is ticking!

Only a few hours left to finish the SOA/ESB book! It'll be a relief to be honest. A lot of time and effort, but it's definitely time to finish it and move on to the next one.

Sunday, May 03, 2009

What ever happened to the Jiffy?

Back in the 70's when I first started to use computers we were told that computer time was measured in jiffies (a 60th of a second back then). It was still a commonly used term when I started at university in the 80's (I even ran into it in physics). But since then it seems to have vanished from everyday use, at least in my field.

Science fiction to reality

Well over 30 years ago I read Do Androids Dream Of Electric Sheep? I loved the story and it's one I've returned to many times over the years. So when they announced a movie version of it I knew I had to go. I was a mere 16 when I went to see Blade Runner for the first time, on the back of such films as Star Wars, Close Encounters and Star Trek. I liked the film and have liked/owned every version since (yes, even the "original" with Harrison Ford's Deckard voice-over explaining the plot). Of course they missed out bits from the book, but it was still a masterpiece.

One of the things I really liked in the movie is when Deckard uses a computer image scanner to look through photographs for signs of the Replicants. But this is no ordinary scanner: it uses data in the image to extrapolate further, uses reflections to look round corners and has incredibly good resolution. I found that idea fascinating, and it was a high point for me when I played the Blade Runner game on my PC 10 years later. Over the years photographic resolution and contrast capabilities have improved, and we've all probably heard the debates about whether or not we can read newspapers from orbiting satellites. Not quite what Deckard was able to do, but interesting nonetheless.

However, today, en route to Boston for a meeting, I found time to read my monthly dose of Scientific American. In this month's issue (May 2009) there's an interesting article on computer security and specifically how people can steal information from a computer without having direct access to it, or the machine being networked at all. Of course they cover the areas that have been known about for a while, such as catching RF emissions from screens, but it's the work on viewing screens through reflections off teapots, spoons and even the human eye that made me sit up and think! OK, at the moment it requires a lot of time and money to do, but I'm sure it would be worth it for certain secrets. Plus it'll only be a matter of time before the equipment becomes smaller and less expensive.

What can you do about this information leakage? Well as the article says, "privacy filters" can increase the chances of being read in this way and flat-panel displays still emit some information. So curtains or blinds are the best defence at the moment. But I'm not going to worry too much. It was a nice article that brought back memories of a great movie and book.

Thursday, April 30, 2009

One thing that goes missing from backups

My mobile phone decided to die yesterday. Bad enough in and of itself, but it's gone in such a way that I can't get access to those little things I take for granted, such as my address book! Yes, I have a version on the SIM, but it's way out of date. I only copy from the phone to the SIM when I move phones, and that relies on them both being available! I back up my machines regularly. I do the same for my PDA. But it never occurred to me to back up the phone too. I think that'll change now.

Thursday, April 16, 2009

DoD and SOA

I was at the DoD sponsored SOA Symposium a couple of weeks back. I was speaking on Open Source and SOA, which is a little different to the abstract shown (more on that in a minute). Overall I was very pleased and impressed with the conference: it was by invitation only and limited to about 400 attendees (including the presenters). Every session was packed and there was a good mix of vendors and use-cases. I can hear what vendors think any time of the day, so it was the user-driven sessions and interactions that I was particularly interested in.

If this event was anything to go by then the DoD (and the US Government in general) really seems to have embraced SOA and open source. That's really good to see on a number of levels (I'm trying to remain objective here!) What blew me away though was the sheer scale of the projects that they run: lots of people involved and lots of money being spent. It should go without saying that reliability, fault tolerance and security (the Common Criteria is king here) are critical to everything that goes on, which can make it hard for the majority of vendors to get accepted.

Overall it was probably one of the most eye-opening events I've been to in recent years. The sessions were uniformly good, but it was the interactions between the presentations that really made this event shine. It's good to see (open source) SOA being used successfully in large-scale, mission critical applications.

Now back to my presentation. Because this was a DoD sponsored event you have to try to make your sessions vendor-neutral. Well, I obviously missed that memo, so I had a last-minute rush to update the slides and remove all of the SOA Platform references. The presentation I ended up with was much better as a result: given my academic background I don't like white papers or sales-driven presentations anyway.

Wednesday, April 01, 2009

On the road again ...

I'm just about to get into a taxi to head off to the SOA Symposium in Washington on Government and Industry Best Practices in SOA. I'm presenting on Thursday morning on SOA and Open Source so if you're around come over. Since this is a DoD sponsored event I hear that the security will be pretty high, so it's going to be interesting for a number of reasons!

Tuesday, March 31, 2009

Monday, March 30, 2009

Analyze your blog

According to this blog analyzer I'm a doer.

"The active and playful type. They are especially attuned to people and things around them and often full of energy, talking, joking and engaging in physical out-door activities. The Doers are happiest with action-filled work which craves their full attention and focus. They might be very impulsive and more keen on starting something new than following it through. They might have a problem with sitting still or remaining inactive for any period of time."

Sacha is leaving

If you don't know by now, Sacha is leaving JBoss. There's not a lot more I can add to what Bob has said, except that I'll miss our almost daily interactions. Sacha started as a colleague (when he, Bob and Marc persuaded me to move from Arjuna) but grew to be a good friend, and I wish him and his family all the best in the future. Today begins a new chapter in his life as well as that of JBoss. Good luck Sacha!

Sunday, March 29, 2009

First dive of the season

It may only have been Ellerton again, but we managed to get the first dive of the season in today. It was a bit like swimming in tea (less than 2 metres visibility) and cold (8 degrees Centigrade), but it was worth it. Everyone else there was in dry suits but we managed to struggle through in our wet suits. I just wish someone would invent a spray-on suit, because I'm sure I get more of a workout putting the darn thing on than actually swimming afterwards.

Thursday, March 19, 2009

RPC is dead?

From what Steve says it certainly sounds like this year's QCon London was fun. This year Steve was talking on the history of RPC, which overlaps a little with what I was saying at QCon London last year. But Steve goes into a lot more detail and I recommend checking it out.

One slight caveat though: as I've said many times in the past (so many that I leave it as an exercise to the reader to find the entries in my blog) RPC isn't dead and isn't something you should ignore. (I don't think Steve thinks that either.) Look around you and you'll see examples of old and new systems based on it. Are they all wrong to do so? No, of course not. Are some of them wrong to do so? Most probably yes. I know of several small and large-scale systems that are being developed in academia (no vendor pressure there) and industry (maybe some pressure) that are being based on RPC. In all cases those guys did their homework and understand the trade-offs that they're making in using RPC.

BTW, I'm not an RPC fanboy by any means. Yes, I've used them and helped develop them, but I've also used and developed other approaches too.

What I'm trying to say is that RPC has its place. But a bit like ACID transactions, it can be (and has been) easily misused. There are better approaches these days for many of the things we'd once have considered the domain of RPC. But that doesn't mean RPC can't and shouldn't be used. As with everything, such as which language to use, which database, which communication protocol etc., you need to be aware of the pros and cons. This is an education problem more than a technical problem.
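To see why RPC is so seductive, and where the trade-off lives, here's a toy RPC-style stub built with a Java dynamic proxy (the names are mine and the "network" is faked in-process, so this is a sketch rather than a real remoting layer): the caller sees a plain local method call, and the handler is where marshalling and the wire would sit, which is also exactly where partial failure hides.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class RpcSketch {
    interface Greeter {
        String greet(String name);
    }

    // Build a stub whose method calls are intercepted by a handler.
    static Greeter stub() {
        InvocationHandler h = (proxy, method, args) -> {
            // A real stub would marshal 'method' and 'args', send them over
            // the network and unmarshal the reply; here we fake the round
            // trip in-process. Timeouts, retries and server crashes would
            // all surface right here, invisible to the caller's code.
            if (method.getName().equals("greet")) {
                return "hello, " + args[0];
            }
            throw new UnsupportedOperationException(method.getName());
        };
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class }, h);
    }

    public static void main(String[] args) {
        // Looks like a local call; could be a network hop underneath.
        System.out.println(stub().greet("world"));
    }
}
```

That transparency is the whole appeal, and the whole danger: the call site gives no hint that latency and partial failure are now possible, which is why understanding the trade-off matters more than the mechanism.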

I also agree with Steve around Erlang (in fact it was Steve who put me on to it a while back). A very nice language. Now if only I could find the time to get back to it.

Thursday, March 12, 2009

Congratulations Barbara!

Barbara Liskov has won the Turing Award! This is really good news. I've met Barbara a few times, starting during my PhD days. At that point she was leading the Argus project on transactions and replication, which was relevant to my own research. Thoroughly deserved. Well done Barbara.

Wednesday, March 04, 2009

DOA 2009

I'm co-chairing DOA again. Check out the CfP for 2009.

HPTS 2009

HPTS is probably my favourite workshop/conference. I've been lucky enough to be able to attend every one since the 1990's and hopefully this year will be no different. The CfP is now up, so take a look.

Tuesday, February 10, 2009

The ultimate Christmas dinner

OK it's a little late but then I didn't realise this video was online! I watched it for the first time over Christmas 2007 and then again last year. Heston Blumenthal has quickly become a favourite to watch and someday I hope to be able to get to his restaurant. Until then I'll sit and watch in awe!

Zakim paper

I've been using Zakim for a few years through various W3C standards committees. Nice paper.

Wednesday, February 04, 2009

Arjuna and the OTS

A couple of comments on my last reminiscence post got me thinking again. One of the things we used to go on about a lot in the early days of the company was the influence we'd had on standards, particularly the Object Transaction Service from the OMG. If you ever compare the two you'll see the similarities: by the time the OTS started, Graeme, who co-created the Arjuna project (on which he also got his PhD), was working for Transarc and involved in the standards definition. I remember visiting him in Pittsburgh a few times during that period and hearing a few horror stories about this brain-dead approach being pushed by vendor X and that dumb idea from vendor Y. But I'm sure it was all fun; it was definitely influential on the industry and on me personally. When I came to start pushing the OTS in the OMG I finally understood what Graeme had been on about with standards processes!

Tuesday, February 03, 2009

OMG where did the time go?

While writing the previous entry concerning building transactional applications I came across that HP paper by Nigel Edwards. It sent shivers down my spine, because I can recall Nigel sharing the office with Stuart and me for months while he got to grips with Arjuna. We were both still doing our PhDs and also in the throes of making the first code release of Arjuna to the world. Here's the announcement for Release 2 (it would appear that our first release predates the web!):

Arjuna (Distr Prog System)

What: Release 2 of Arjuna Distributed Programming System
From: (Arjuna Project)
Date: Mon, 17 May 1993 12:37:34 GMT

We are pleased to announce the availability of a new version
of Arjuna: a programming system for reliable distributed computing,
and the Arjuna mailing list.

The software and the manual for the Arjuna system can be
obtained by anonymous ftp: (

Arjuna System

This beta release of ArjunaPR2.0 fixes all known bugs present
in ArjunaPR1.2B that have been reported to us or that we have found,
and contains only minimal information about how to use the new features
provided. This release should be compilable with the following

AT&T Cfront Release 2.1, on SunOS 4.1.x,
(using Sun supplied lex and yacc).
AT&T Cfront Release 3.0.1, on SunOS 4.1.x and Solaris 2.1,
(using Sun supplied lex and yacc).
GCC versions 2.1, 2.2.2, on SunOS 4.1.x,
(using flex(v2.3.x) and bison).
Patched GCC version 2.3.3 on SunOS 4.1.x and Solaris 2.1,
(using flex(v2.3.x) and bison).
Sun C++ 2.1, on SunOs 4.1.x,
(using Sun's lex++ and yacc++).
HP C++ (B2402 A.02.34), HP-UX 8.07,
(using HP supplied lex and yacc or lex++ and yacc++).

The major new features are:

- Faster object store.
- Support for replicated objects.
- Memory resident object store.
- Support for ANSAware (not available via ftp)

Arjuna supports nested atomic actions (atomic transactions) for
controlling operations on objects (instances of C++ classes), which can
potentially be persistent. Arjuna has been implemented in C++ to run on
stock platforms (Unix on SUNs, HPs etc). The software available
includes a C++ stub generator which hides much of the details of
client-server based programming, plus a system programmer's manual
containing details of how to install Arjuna and use it to build
fault-tolerant distributed applications. The software and the manual
can be obtained by anonymous ftp:

Several enhancements and ports on various distributed
computing platforms are in progress. We would be pleased to hear from
researchers and teachers interested in using Arjuna. The programmer's
manual contains the e-mail addresses for sending your comments and
problem reports.

ANSAware version of Arjuna

The ANSAware version of Arjuna is available from:

Architecture Projects Management Limited
Poseidon House
Castle Park Phone +44 223 323010
Cambridge Fax +44 223 359779
CB3 0RD Internet
United Kingdom UUCP ...uknet!ansa!apm

Arjuna Mailing List

To enable us to help people using Arjuna, an electronic mail list has
been setup. You can join the Arjuna mailing list by sending an e-mail
message to "" containing:

join arjuna

For example: join arjuna John Smith

Mail messages can then be sent to "", for

Arjuna Project Team
The Department of Computing Science,
The University,
Newcastle upon Tyne.
NE1 7RU, UK.

The work we did with Nigel still seems so fresh in my mind and yet it is so long ago.

Then I also remembered the work on Stabilis that some of our Brazilian PhD students did a few years later. We'd always talked about how the programming model we had in Arjuna was good for building complex applications (and Nigel's work agreed with that), but it took us all by surprise when they presented Stabilis: here was the largest and most complex system (a full relational database) built entirely on Arjuna! This was an impressive demonstration of what was possible with what we'd spent the last 6 years working on.

Definitely a time to reminisce!

Transactions and AIT/TOJ

Jonathan pointed me to a recent discussion on TSS concerning writing transactional resources for non-transactional objects. It expanded into the more general topic of how to write XAResources, which is topical at the moment within Red Hat. Anyway, it got me thinking that many of these requests aren't necessarily "How do I write an XAResource?" but more "How do I make my data transactional?" These are two different questions: the former implies the latter, but the latter doesn't imply the former. Yes, believe it or not, transactions existed before XA came on the scene.
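For the "How do I write an XAResource?" side of the question, the sketch below shows roughly what a minimal, purely in-memory resource looks like. The class name and the backing map are my own invention for illustration; a real resource manager would need durable logging so that recover() could actually return in-doubt transaction branches after a crash.

```java
import javax.transaction.xa.XAException;
import javax.transaction.xa.XAResource;
import javax.transaction.xa.Xid;
import java.util.HashMap;
import java.util.Map;

// Hypothetical volatile XAResource: each Xid gets a tentative value that
// only becomes visible on commit. This is NOT durable -- after a crash
// there is nothing for recover() to report.
public class InMemoryXAResource implements XAResource {
    private final Map<Xid, String> pending = new HashMap<>();
    private volatile String committedValue = "initial";

    // Application-facing helpers (not part of the XAResource contract).
    public void write(Xid xid, String value) { pending.put(xid, value); }
    public String read() { return committedValue; }

    @Override public void start(Xid xid, int flags) { /* enlistment bookkeeping */ }
    @Override public void end(Xid xid, int flags) { /* delistment bookkeeping */ }

    @Override public int prepare(Xid xid) throws XAException {
        if (!pending.containsKey(xid)) throw new XAException(XAException.XAER_NOTA);
        return XA_OK; // vote to commit in phase one
    }

    @Override public void commit(Xid xid, boolean onePhase) throws XAException {
        String value = pending.remove(xid);
        if (value == null) throw new XAException(XAException.XAER_NOTA);
        committedValue = value; // phase two: publish the tentative state
    }

    @Override public void rollback(Xid xid) { pending.remove(xid); }

    @Override public Xid[] recover(int flag) { return new Xid[0]; } // nothing durable
    @Override public void forget(Xid xid) { }
    @Override public boolean isSameRM(XAResource other) { return other == this; }
    @Override public int getTransactionTimeout() { return 0; }
    @Override public boolean setTransactionTimeout(int seconds) { return false; }
}
```

In normal use the transaction manager, not the application, drives start/end/prepare/commit; the point of the sketch is just how little of that machinery has anything to do with making the data itself transactional.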

So it got me thinking and I realised that what many of them really want is something akin to what we did in the original Arjuna system. It didn't have a name back then, since it was the whole reason for doing the Arjuna research, but eventually it became known as Arjuna Integrated Transactions (AIT), or Transactional Objects for Java (TOJ). It hasn't been pushed much (at all) over the past few years, which is a shame because I still think it's a great model. So it may be time to dust it off and bring it back into the light.
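The essence of that model is that the object itself knows how to save and restore its own state, so begin/commit/abort give it transactional behaviour directly, with no XA machinery in sight. The class and method names below are purely illustrative (the real TOJ API is richer, adding persistence and concurrency control); this just sketches the idea, using a snapshot stack so that actions can nest:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative transactional object: state is snapshotted on begin() and
// restored on abort(). The stack of snapshots supports nested actions.
public class TransactionalCounter {
    private int value;
    private final Deque<Integer> snapshots = new ArrayDeque<>();

    public void begin()  { snapshots.push(value); }      // save current state
    public void commit() { snapshots.pop(); }            // discard the snapshot
    public void abort()  { value = snapshots.pop(); }    // roll back to it

    public void increment() { value++; }
    public int get() { return value; }
}
```

With this shape, "making my data transactional" is answered inside the object itself; an aborted inner action undoes only its own changes while an enclosing action carries on.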

Sunday, February 01, 2009

Sock and Anti-Sock Pairs

Back when I was hard at work doing my Physics undergraduate degree (a quarter of a century ago!) I remember a few of us finding creative ways to spend the time during the 4-hour practical sessions we had twice a week. Well, there are only so many times you can do Millikan's experiment or kill a cat in an enclosed box with a stray beta particle. (OK, that last one is something we planned but never quite did, for obvious reasons, and it still scores high on the "what if?" discussions at reunions.) So what we often did was end up spending hours reading New Scientist.

The one article that has stuck in my mind for that length of time was a brief description of why you end up with odd socks in the wash and usually a piece of fluff. From what I can recall, it talked about spontaneous creation of anti-socks and sock/anti-sock annihilation. To us physics students it appealed on at least two levels: the obvious physics analogies concerning matter and anti-matter, but also the fact that for most of us this was the first time we'd been away from home and having to worry about doing our own washing, much of which was full of fluff and stray socks that we were sure weren't our own! Many times over the years I've wished I'd kept that article. I had hoped that the wonder of the digital age would help, but this is the closest I've found.

DOA Program Committee and Topics

I'm chairing DOA 2009 and my co-chairs (Jean-Jacques Dubray and Fabio Panzieri) and I have been finalizing the program committee. Fortunately most of last year's PC have agreed to stay on so we've only had to add a couple of new people. Next step is to figure out the Call for Papers. If you've any thoughts on things you'd like to see at DOA 2009 that maybe weren't covered in the DOA 2008 CfP, let me know. After that we'll have to determine the invited speakers and keynotes. Fun fun fun!

Tuesday, January 27, 2009

Made my day

This article just made my day. Fantastic!

Sunday, January 18, 2009

SOA Design Patterns released

I've been writing a book on SOA and ESB for a while with a couple of friends. It's due to be part of Thomas Erl's SOA series later this year. While writing it Thomas asked us to produce some patterns for his book on SOA. Well, the finished book, SOA Design Patterns, is now out and it's well worth a read. A number of people throughout the industry have helped to contribute some of the patterns within it, so this is definitely a cross-industry collaborative effort. Thomas also wants it to be a living work, inasmuch as new patterns can be contributed by anyone. So take a look at the book and if you see something missing consider contributing it.

Saturday, January 17, 2009

Star Trek interlude

Anyone who's known me for long enough knows that I'm a Star Trek fan. (Not the variety that dresses up as Andorians or learns Klingon!) Having been born in the late sixties, I found sci-fi fairly influential in one way or another, with my fondest TV memories coming from shows like Star Trek, Space 1999, Dr Who, The Six Million Dollar Man and Blake's 7.

I still say that the first couple of seasons of The Original Series were better than much of what was created in recent years. So it's with some apprehension that I look ahead to the new movie. I'm definitely not going to prejudge, but I had hoped they would retain some of the look-and-feel of the original.

One thing that has stayed with me over the past few years is a quote from Kirk to Picard from Star Trek Generations (watch the film if you want to understand the context):

"Don't let them transfer you... don't let anything take you off the bridge of that ship... Because while you're there, you can make a difference."

I've found that it applies to more than just starship captains!

Monday, January 12, 2009

Virtual conference

We're doing a virtual conference next month. I'm recording some sessions at the moment. Very strange, but I think it'll be fun to attend. Maybe this will encourage me to look into SecondLife.

Wednesday, January 07, 2009

SOA is dead (again?)

For the umpteenth time SOA is dead apparently. It's been shot so many times over the past year or so that you'd think it had a role in a Sam Peckinpah movie! However, maybe it's got 9 lives or you need to use a silver bullet, because each time it dies it keeps coming back.

This time it's my friend Anne who is reporting the demise of SOA. I've known Anne for many years and respect her, so unlike a few other obituaries I take note of this one. However, like Duane I have to disagree that SOA is dead. OK, the term may have been overloaded over the past few years almost to the point of being meaningless (analysts are as much at fault as anyone in the industry), but the concepts behind it are still very much alive and relevant (more so in today's economy.) Where we have lost our way is perhaps in not having an agreed architectural definition for SOA: it is the 'A' after all, so surely we can agree on what that means (REST doesn't suffer this problem), as well as pushing the mantra that "SOA is not just a technology" (though given the number of people who have been saying this over the years, I suspect that this is more a case of individuals simply not listening, certain vendors muddying the waters, or it being lost in the noise.)

There is no global panacea for IT woes. There never has been and I doubt there ever will be one. For the most part software engineering practices evolve over time (Darwin would be proud). There are a few evolutionary dead ends on the way. I doubt we'll see any extinction-level events though (not unless they're associated with human extinction.) But these things take time (I'm sure the dinosaurs laughed at the evolutionary path that created the Coelacanth, but who's laughing now?)

We can't afford to keep jumping from one lifeboat to another just because some use cases don't match, or someone didn't quite understand that you need to think about how to use SOA before coding. The underlying requirements of loose coupling, security, federation, reliability etc. date back decades. The term SOA should have been good enough. It still is as far as I'm concerned. If it's not SOA then are we in for yet another round of acronym generation, like SOA 2.0, WOA etc? None of which really add much to help the people at the sharp end of IT problems!

I think in Anne's recent SOA obituary she is trying to indicate that the principles are fine but it's the term that needs to go. In the first case we can agree. In the second I don't think it's worth the industry wasting another 4 years to disagree on yet another acronym (if we do, I'd like to nominate XYZZY.) As I said above, for people actually using these principles and associated technologies it won't make a difference to them. (It may be good for analysts though.) So let's just stop this madness and start concentrating on where it really matters.

In this case I think it's definitely more a case of what Mark Twain once said, "The reports of my death have been greatly exaggerated".

Thursday, January 01, 2009

Scala

My friend Mic and I have discussed a few languages I've been playing with over the past year or so, which include D and Erlang. I think I managed to persuade Mic that there is some benefit to Erlang, but maybe the jury's still out on D. Well, a few weeks back Mic mentioned that he was looking at Scala, so I decided to take a look. I didn't find the time until I took a break from work for the festive season and have just spent a few days giving it a go. My first impressions are pretty positive (I've mentioned a few times in the past that I'm a fan of functional programming languages, particularly Lisp, which I've used for over 20 years). I'm trying to use it through Eclipse but am finding the plugin a bit clunky, so I may go back to the command-line and emacs (I'm a shell/emacs person at heart!)

So far I haven't got into any really complicated projects or applications so there may be problems with the language that I don't know about yet. But at the moment I like it and will persevere with some pet projects that I've been thinking about for a while. Even if it doesn't lead to anything ultimately, it's another language to learn and experience gained. But I have a suspicion it'll be more than a passing fad for me. Thanks Mic.