Friday 27 March 2009

Catch 22

I’ve just read Marco Cantu’s comments about a few of my posts. (I’m honoured, Marco!) The crux of Marco’s comments is that Delphi developers fall into two distinct camps: those who want Unicode and those who don’t. Therefore Codegear had to choose whether to make it easy for camp 1 or camp 2.

There are a few problems with this hypothesis.

  1. There are more than two camps. I fall into neither, for example. I want Unicode. I really do. I wanted (or rather needed) it so much, in fact, that I implemented Unicode support using Delphi 2006. I would like to be able to move to Delphi 2009 (after all, it is the latest and greatest, where I can expect the most support, and where I hope migration to 64-bit will be easiest) without a major dip in the quality of my software.
  2. By choosing to change the meaning of PChar and String and all those methods that ultimately called the Ansi versions of Windows functions, what Codegear did was make it difficult for everybody. Those who are not interested in Unicode have work to do, and those who are interested in it also have work. Anybody who’s done the conversion knows that the biggest problems are places where PChar has been used as a pointer to a byte: buffers and bookmarks in dataset descendants, for example (see the sketch after this list). So it’s never just a re-compile.
  3. All those who are defending Codegear’s decision keep saying it’s easy to make the move: it’s x days’ work, it’s no big deal, etc. That may be the case, but experience tells me that making such a change results in a far greater hit to the testing burden than to the actual development time.
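
A minimal sketch of the kind of code point 2 is talking about; the routine names and parameters here are made up for illustration, but the pattern (PChar arithmetic over a record buffer) is the one that bites:

  { Pre-2009: Char is one byte, so PChar arithmetic steps through the buffer
    byte by byte. In Delphi 2009, PChar = PWideChar and the same expression
    steps two bytes at a time, silently addressing the wrong memory. }
  procedure CopyField(RecBuf: PChar; FieldOffset, FieldSize: Integer; Dest: PByte);
  begin
    Move((RecBuf + FieldOffset)^, Dest^, FieldSize);  // the offset now counts characters, not bytes
  end;

  { A conversion-safe version: treat the buffer explicitly as bytes. }
  procedure CopyFieldAsBytes(RecBuf: PByte; FieldOffset, FieldSize: Integer; Dest: PByte);
  var
    P: PByte;
  begin
    P := RecBuf;
    Inc(P, FieldOffset);          // Inc on a PByte always advances in bytes
    Move(P^, Dest^, FieldSize);
  end;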

I’m almost ready with my migration to D2009. It was probably harder (ironically) than most, because my application was already Unicode compatible through its use of WideStrings and the TNT Unicode components, but it still wasn’t that bad. The thing is, though, I know for a fact that my application has gained a few (only a few mind you, I’m optimistic) bugs. It’s inevitable. And because most of the problems were to do with buffers and pointers, they’re those subtle, “it was ok on my machine” type of bugs.

The testers are not going to be too happy. The first thing they ask when they receive a build is what they should concentrate on. My answer this time is: everything. I’ve touched every single piece of functionality. Most (if not all) of their automated tests are probably broken (I haven’t told them that bit yet!), so not only do they have a lot of testing to do, they have a lot of test code to migrate. And at the end of all this, what does our application gain? Nothing. At least those who have Ansi applications can say they now have Unicode support. We can’t, we had that already!

I think Codegear had a dilemma, and quite a few people must have had a few sleepless nights trying to figure out what the best option was. Whatever they had chosen would have caused someone, somewhere, grief. Had they gone for any other option, I’d probably be writing a post complaining about it. I’m like that!

8 comments:

Anonymous said...

Unicode is a requirement for more and more interfaces (including Win64), and with string = AnsiString, Delphi developers would spend more and more time maintaining conversions. Now that everything is Unicode, except where otherwise noted, running costs are lower.

It takes a small investment to make the conversion, but seriously, it's not that big compared to the benefits. You don't need to convert everything perfectly to make it work as before; the big part of the job only comes if you want to Unicode-enable your app, which is not required.

Ritsaert Hornstra said...

At our shop we have already converted a few projects (as a test case). What we found was that if you only use "simple" code (like a form, a button and a few other standard components) the conversion is indeed a breeze.
The problems start with our own components and utilities that need to communicate with external hardware, communicate with external applications / DLLs / etc., and create binary files, and also with a lot of library code that simply assumes Char = AnsiChar, uses strings as buffers and PChars as pointers, and uses the set of Char type extensively.
For the newer parts we had test cases, but the older units didn't. Two test conversions simply failed because the application did not function correctly anymore, and after a few days of conversion we had an application that was less stable but, yeah, Unicode capable; something we cannot ask any money for. So in the end we just left those applications in ANSI mode (e.g. D2007).
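
To make that concrete, here is a minimal sketch (the routine names are hypothetical) of two of the habits just described: a string used as a raw byte buffer, and a Char tested against a set. Both work as intended when Char is one byte; with string = UnicodeString the first no longer reads what you think it reads, and the second only tests the low byte of each character (compiler warning W1050), with AnsiString/TBytes and CharInSet being the usual fixes:

  { Assumes SysUtils and Classes in the uses clause. }

  { A string used as a raw byte buffer: fine when 1 Char = 1 byte, wrong when
    string = UnicodeString. Keeping it as AnsiString (or moving to TBytes)
    preserves the old byte-for-byte behaviour. }
  procedure ReadBlock(Stream: TStream; Size: Integer; out Data: AnsiString);
  begin
    SetLength(Data, Size);
    if Size > 0 then
      Stream.ReadBuffer(Data[1], Size);
  end;

  { 'C in [...]' on a WideChar only tests the low byte and triggers W1050;
    CharInSet (new in SysUtils) is the drop-in replacement. }
  function IsIdentChar(C: Char): Boolean;
  begin
    Result := CharInSet(C, ['A'..'Z', 'a'..'z', '0'..'9', '_']);
  end;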

In the end it comes down to the supporters saying it is easy (and they are right for some parts), and the opponents saying it means a lot of work for which they cannot send anyone an invoice.
The net result is that Delphi users (not all of them, but a substantial part of them) stick with older versions of Delphi -> they do NOT spend money to buy newer versions -> Codegear earns less -> newer versions of Delphi will progress more slowly -> ... I still think it would have been a wise investment to have a compiler switch / a conditional switch. It should not have been that difficult, and the VCL would not have had to be duplicated, just a few places with ifdefs.
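
Something like the following is the kind of thing meant here; the UNICODE_VCL define and all the names are invented for illustration, and nothing like this exists in the shipping product:

  unit VclStringsSketch;

  interface

  type
  {$IFDEF UNICODE_VCL}
    TVclString = UnicodeString;   // two-byte Char build
    PVclChar   = PWideChar;
  {$ELSE}
    TVclString = AnsiString;      // one-byte Char build, as in D2007
    PVclChar   = PAnsiChar;
  {$ENDIF}

  function VclStrLen(P: PVclChar): Integer;

  implementation

  function VclStrLen(P: PVclChar): Integer;
  begin
    // The same source compiles to a byte-wide or word-wide scan depending on the define.
    Result := 0;
    if P <> nil then
      while P[Result] <> #0 do
        Inc(Result);
  end;

  end.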

Anonymous said...

"I still think it would have been a wise investment to have a compiler switch / a conditional switch. It should not have been that difficult, and the VCL would not have had to be duplicated, just a few places with ifdefs."

Don't you think they would have done that if it was possible?

Anonymous said...

"Don't you think they would have done that if it was possible?"


That's easy to answer: "No".

We know that because it was possible. All the evidence trotted out to "prove" that a switch was impossible was based entirely on the assumption that the Unicode support would be delivered in the chosen way.

"No, we cannot travel East because we are travelling West"

"But if we just turned around, we could travel East"

"No, you don't understand. We cannot travel East because we are travelling West - you see?"

"But *why* can't we travel East?"

"I told you - because we are travelling West. It's impossible to East."

"But we could just turn around."

"But see - we just proved that it's impossible to travel East, so clearly we can't turn around"


We couldn't have a switch because of the way Unicode support was implemented.

A different implementation could have supported a switch.

The question should be not:

Why can't we have a switch?

but:

Why did you adopt an implementation which made a switch impossible?

It's a subtle difference, but a real and important one (in terms of being the "right" question).

In all other respects it's an entirely meaningless question because we have what we have and nothing can change that now.

(shrug)

Ritsaert Hornstra said...

"Don't you think they would have done that if it was possible?"

I think they dismissed it too early. If you look at some code in the D2009 and D2007 sources, you see that *a lot* of VCL code is still very generic and should be simple to use for both ANSI and Unicode.

Let's say it would have cost them 2 man-years of development and 2 man-years of testing to get it done (I think this is an overestimate). That is something like 8000 hrs extra, which at roughly $75 an hour translates to something like $600k of real costs. Now, for each and every D2007 user that is NOT upgrading because of this, they are missing out on a few hundred dollars (I do not know how much, of course); let's say $200. This means that after only about 3000 such upgraders it would already have been cost effective for them. *sigh*
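
For completeness, the arithmetic behind that estimate, using only the figures assumed above (none of them real CodeGear numbers):

  program SwitchBreakEven;
  {$APPTYPE CONSOLE}
  const
    ManYears          = 4;     // 2 of development + 2 of testing
    HoursPerManYear   = 2000;  // ~8000 hrs in total
    CostPerHour       = 75;    // implied by 8000 hrs ~= $600k
    RevenuePerUpgrade = 200;   // assumed margin per retained upgrader
  begin
    Writeln('Total hours:         ', ManYears * HoursPerManYear);                 // 8000
    Writeln('Total cost:          $', ManYears * HoursPerManYear * CostPerHour);  // 600000
    Writeln('Break-even upgrades: ',
      (ManYears * HoursPerManYear * CostPerHour) div RevenuePerUpgrade);          // 3000
  end.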

Naughty Kiwi said...

Hi Babnik

I so want to disagree with you! It would be ridiculous for CodeGear to pour effort into a compiler switch that will ultimately be redundant. Better they force everyone to move to the greater Unicode world of things. Anything less would be an utter waste of resources...

Unfortunately, deep down, I can't disagree. I want to move to Delphi 2009 now because it sounds like it is better and more stable, but I'm not ready to fix up all the little Unicode incompatibilities in my third-party code to do so. So for me the easy answer is to plod along with D2007.

I also want to move to Unicode, but only when I am confident it won't cost me any sales during the transition.

I do think CodeGear made the sane decision, but realistically there are a lot of people scared of upgrading and this must be causing them a huge amount of financial pain.

Anonymous said...

The issue is that if you really cared about Unicode you have already done the work: the stuff that needs to be Unicode is broken out as WideStrings or some custom type, and you are already using custom components that handle Unicode. So when CodeGear 'fixed' the VCL they broke all the hidden, under-the-covers stuff, and the exposed 'needs to be Unicode' stuff is still stuck in the custom components, which are now probably broken on the inside.

So they broke the build needlessly for 1) people who don't want or care about Unicode, and 2) people who really care about Unicode and had already designed their programs to handle it. The only people who really benefit from CodeGear's path to Unicode are those who don't care that much but don't mind getting Unicode for free. Those folks might have some of the hidden breakage, but essentially get a new Unicode-capable app for little effort.

All this really only matters for old, ongoing apps though. CodeGear's approach will make things better for new apps and going forward, once all the painful conversion and library patching is done.

Anonymous said...

The WideString solution means your program SUPPORTS Unicode. It (probably?) still communicates with Windows in the old Win95/98/ME way.

Moving to D2009 means communication with the Windows API is now also Unicode.
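
Concretely, the difference being described is which Windows entry point the text goes through. A minimal sketch (the procedure names are made up, and which entry point a given pre-2009 component library actually calls will vary); assumes a unit that uses Windows:

  procedure SetCaptionAnsi(Wnd: HWND; const S: AnsiString);
  begin
    // The 'A' entry point: Windows converts the text via the active code page.
    SetWindowTextA(Wnd, PAnsiChar(S));
  end;

  procedure SetCaptionWide(Wnd: HWND; const S: WideString);
  begin
    // The 'W' entry point: the UTF-16 text reaches Windows unchanged.
    SetWindowTextW(Wnd, PWideChar(S));
  end;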

Moving VB6 to VB.NET also created an outcry. But it has to be done sometime.

Windows 7 will no longer support 16-bit, I understand. So no DOS-based programs will work any more, and my batch-driven applets will have to be rewritten with no income generated.