Saturday, 28 March 2009

When did the internet become so essential?

Next week I’m going into hospital for a knee operation. I’m not worried about the pain, or the months of rehabilitation afterwards. I’m worried about how I’m going to live five days without an internet connection.

I’d gladly exchange painkillers for a wifi connection. When we get the occasional problem with our broadband connection at home, I go crazy worrying that it’s going to be days or even weeks before we get it back.

I don’t know if it’s just me, but I’d gladly give up TV, telephones, game consoles, (some) food, warmth and comfort rather than lose my internet connectivity. At what point did connectivity become so important? I know there are people out there without connections, but it astounds me. I guess there are more important things, especially in developing countries, but in the first world it’s quickly becoming essential to be within a metre or two of a computer with a connection. I guess with the advent of devices such as the iPhone, we’re going to be even closer to that connection than ever before. And we’ll subsequently fret even more when we’re cut off.

Friday, 27 March 2009

Catch 22

I’ve just read Marco Cantu’s comments about a few of my posts. (I’m honoured, Marco!) The crux of Marco’s comments is that Delphi developers fall into two distinct camps: those who want Unicode and those who don’t. Therefore Codegear had to choose whether to make it easy for camp 1 or camp 2.

There are a few problems with this hypothesis.

  1. There are more than two camps. I fall into neither, for example. I want Unicode. I really do. I wanted (or rather needed) it so much, in fact, that I implemented Unicode support using Delphi 2006. I would like to be able to move to Delphi 2009 (after all, it is the latest and greatest, where I can expect the most support, and where I hope migration to 64-bit will be easiest) without a major dip in the quality of my software.
  2. By choosing to change the meaning of PChar and String, and all those methods that ultimately called the Ansi versions of Windows functions, what Codegear did was make it difficult for everybody. Those who are not interested in Unicode have work to do, and those who are interested in it also have work. Anybody who’s done the conversion knows that the biggest problems are the places where PChar has been used as a pointer to a byte: buffers and bookmarks in dataset descendants, for example (see the sketch after this list). So it’s never just a recompile.
  3. All those defending Codegear’s decision keep saying it’s easy to make the move: it’s x days’ work, it’s no big deal, etc. That may be the case, but experience tells me that making such a change results in a far greater hit to the testing burden than to actual development time.
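
To make point 2 concrete, here’s a minimal sketch of the habit that breaks. It’s invented for illustration (it’s not lifted from our code), but it’s exactly the shape of the problem in dataset buffer and bookmark code:

    program PCharPitfall;
    {$APPTYPE CONSOLE}

    // Pre-2009 habit: PChar doubles as a "pointer to a byte".
    var
      Rec: array[0..7] of Byte;
      ByteView: PByte;
      CharView: PChar;
    begin
      ByteView := PByte(@Rec[0]);
      CharView := PChar(@Rec[0]);
      Inc(ByteView, 3); // always advances 3 bytes
      Inc(CharView, 3); // D2006/D2007: 3 bytes; D2009: 6 bytes, as PChar is now PWideChar
      Writeln('PByte offset: ', Integer(ByteView) - Integer(@Rec)); // 3 everywhere
      Writeln('PChar offset: ', Integer(CharView) - Integer(@Rec)); // 3 pre-2009, 6 in 2009
    end.

Both lines printed 3 for over a decade; under Delphi 2009 the second one quietly prints 6, and anything that was addressing bytes through a PChar is now reading the wrong memory.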

I’m almost done with my migration to D2009. It was probably harder (ironically) than most, because my application was already Unicode compatible through its use of WideStrings and the TNT Unicode components, but it still wasn’t that bad. The thing is, though, I know for a fact that my application has gained a few bugs (only a few, mind you; I’m optimistic). It’s inevitable. And because most of the problems were to do with buffers and pointers, they’re those subtle, “it was ok on my machine” type of bugs.

The testers are not going to be too happy. The first thing they ask when they receive a build is what they should concentrate on. My answer this time is: everything. I’ve touched every single piece of functionality. Most (if not all) of their automated tests are probably broken (I haven’t told them that bit yet!), so not only do they have a lot of testing to do, they have a lot of test code to migrate. And at the end of all this, what does our application gain? Nothing. At least those with Ansi applications can say they now have Unicode support. We can’t; we had that already!

I think Codegear had a dilemma, and quite a few people must have had sleepless nights trying to figure out what the best option was. Whatever they chose would have caused someone, somewhere, grief. Had they gone for any other option, I’d probably be writing a post complaining about that instead. I’m like that!

Tuesday, 24 March 2009

Google is NOT the internet. Is it?

As I’ve mentioned before, I’ve been working late nights on my wife’s website, and I’ve been having a few browser problems. I kind of expected those, and have got round some of the problems and fixed others. What I didn’t expect was my users not being able to find the site.

I used some friends and family as beta testers. Basically, I sent them the link in an email.

I assumed that they’d click on the link and get the site on their default browser, or at worst, they’d cut and paste the link into the address bar of the browser of their choice. I know you should never assume anything when it comes to users, but it seemed a safe assumption to make.

Anyway, a few days later, one particular person said she couldn’t get to the website. I checked, and the site was up. I told her to try again, and she said it still wasn’t working. Baffled, I went away and checked some more. It was there. To cut a long story short, I interrogated her again and found out she was cutting and pasting the URL into Google and doing a search. She just doesn’t use the address bar. The site now comes up first with the URL as the search text, but a day or two after I’d gone live, the Googlebot hadn’t got round to it yet.

This user was baffled as to why she couldn’t find it. I explained about the address bar, and that Google is NOT the internet, just a search engine: one that happens to search a database of the internet that it maintains. And the internet being a fluid thing, that database is by necessity just a snapshot. She just couldn’t understand it. To her, if it isn’t on Google, then it’s not on the internet. Am I the only one feeling slightly uncomfortable about this? I’m sure this person is not the only one in the world who thinks Google is the internet. That’s an awful lot of power in the hands of one company.

Misunderstood. Who me?

My last post generated exactly the type of comments I expected it to; in fact, I could have written them myself, as I knew what was coming. Here’s a sample, and my reactions to them.

The US are only 5% of the world population. That means that 95% of all Delphi users need Unicode.

I never said I didn’t want Unicode. In fact, our application is already Unicode compatible, but I still wanted reference-counted Unicode strings. I just didn’t want my existing code to break. I would have preferred the choice of when to move to UnicodeStrings to be in my hands, not forced upon me (well, not exactly forced; I could always stick with D2007, but you know what I mean).


I adapted my app within 2 or 3 hours.

Great, you now have a Unicode version of the Hello World application. In the real world, applications are a little more complicated and take a little longer. If you managed to upgrade your application in 2 or 3 hours, either your application is very small, or you’re a genius!


...why should I write new code with special TEditW lines - who needs pure ANSI code today?

OK, but why should the quality of my code suddenly take a nosedive just because of a compiler upgrade? And again, I’m not arguing against Unicode and in favour of Ansi. I’m just questioning how the move was made.


You don't think this (Delphi 2007 compatibility switches) has been considered, do you?

I’m pretty sure it was, or at least I hope it was. Was the right decision made? I don’t know, but I’m pretty certain that from a technical point of view, the right decision would have been the one with the fewest possible breaking changes. On the other hand, that would definitely have pushed the release date back quite a bit, so there are also commercial considerations. I’m a technical person, not a salesman, so I can ignore the commercial considerations. I’m sure Codegear can’t afford to.


Apart from the comments above, I got quite a few agreeing with my point of view. These actually surprised me more than the ones above did. I originally thought perhaps I was being a little harsh, but it seems I’m not the only one who’s slightly annoyed.


Anyway, the actual conversion is going pretty well. The biggest problems have been third-party components. I now have an application that compiles but, ironically, while it handled Unicode with no problems in Delphi 2007, the Delphi 2009 version is struggling. We need to do a lot more work to get back to the quality point we were at before. We will have touched probably 70-80% of the units. That’s a major test burden in and of itself.

And the thing is, in the eyes of our users, our application hasn’t gained a thing.

Monday, 23 March 2009

Breaking Existing Code

We finally decided last week to have a go at moving one of our applications from Delphi 2006 to Delphi 2009. Most of our other applications had already been moved to Delphi 2007; this one, due to unfortunate timing, remained with Delphi 2006. I say “have a go” because we are wary of the changes, and if it’s too much upheaval, we may just decide it’s not worth it.

It’s taken us a while to decide whether we should make the move, because this application had already been made fully Unicode compatible using WideStrings and the TNT components. We don’t actually need Delphi 2009’s Unicode support, although we realise the difference between WideStrings and UnicodeStrings. The fact that we’d done that work actually worked against us in this case.
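
For anyone wondering what that difference amounts to: WideString is a thin wrapper over the COM BSTR, so it has no reference counting and every assignment copies the character data, while Delphi 2009’s UnicodeString is reference counted just as AnsiString always was. A rough illustration (the second half obviously needs Delphi 2009):

    procedure StringSemantics;
    var
      W1, W2: WideString;     // a COM BSTR underneath
      U1, U2: UnicodeString;  // Delphi 2009's new default string type
    begin
      W1 := 'hello';
      W2 := W1;  // allocates a new BSTR and copies the characters
      U1 := 'hello';
      U2 := U1;  // just bumps a reference count; the data is shared
    end;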

Now, I’ve done many of these upgrades. I’ve gone from Delphi 1 to Delphi 2, from 2 to 3, and so on, so I’m in a position to compare the experience. So far it hasn’t been easy. We use quite a few third-party components which are no longer actively maintained, so we’re not getting Delphi 2009 versions anytime soon. Luckily, we have the source (actually, luck doesn’t come into it; we’d never use a component if we didn’t have the source!). The convention for creating buffers has always been to use PChar. That obviously doesn’t work anymore, and while our code isn’t too bad in this respect, the third-party components we use are riddled with PChar buffers, pointer arithmetic, etc.
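
The other classic, alongside the pointer arithmetic, is buffer allocation. A minimal sketch of my own (not our actual code) to show the pattern:

    program BufferSizing;
    {$APPTYPE CONSOLE}
    uses Windows;

    var
      BufOld, BufNew: PChar;
    begin
      // The pre-2009 habit: characters and bytes were interchangeable, so a
      // MAX_PATH-byte buffer held MAX_PATH characters. Under Delphi 2009,
      // passing BufOld to GetWindowsDirectory would let the API write up to
      // MAX_PATH WideChars (2 * MAX_PATH bytes) into it: a silent overrun.
      GetMem(BufOld, MAX_PATH);

      // The Unicode-safe version sizes the buffer in characters:
      GetMem(BufNew, MAX_PATH * SizeOf(Char));
      GetWindowsDirectory(BufNew, MAX_PATH);
      Writeln(BufNew);

      FreeMem(BufOld);
      FreeMem(BufNew);
    end.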

Now, I got to thinking: how could Codegear have made my life easier? Well, for starters, they shouldn’t have broken my existing code! If PChar was a pointer to a (single-byte) character, then it should have stayed that way. If a string was an AnsiString, then it should have stayed that way. In fact, I should have been able to recompile my code in Delphi 2009 and have my application work exactly the way it did in Delphi 2007. (I’d accept it if older code didn’t completely work first time, but Delphi 2007 code should more or less work the same.)

Unicode support is a great thing, and hats off to Codegear for implementing it in Delphi 2009, but I would have done it differently. I’d have left string as an AnsiString and created a new type for Unicode strings; PChar wouldn’t have changed, and TEdit would still be, in essence, a TAnsiEdit. They would then have created a parallel VCL, appending, say, a W to the names (TEditW) for the Unicode equivalents. Anyone who wants a Unicode application would need to do a little work. Not too much, but hey, nothing comes for free. Those who don’t want it (and there must be a few out there who don’t really care) would get a compiled application which behaves exactly as it did in Delphi 2007. They could upgrade to Unicode, if they wished, at their own pace.
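
To make the idea concrete, here’s a purely hypothetical sketch of how opting in could have felt. None of these names shipped: I’m writing against Delphi 2007 semantics, with WideString standing in for the new reference-counted Unicode string type I have in mind (a TEditW would follow the same pattern; it just needs a form around it to demonstrate):

    program OptInSketch;
    {$APPTYPE CONSOLE}

    type
      // Hypothetical stand-in for a new, distinct Unicode string type.
      UnicodeText = WideString;

    procedure LegacyCode(const S: string); // string stays AnsiString: old code just recompiles
    begin
      Writeln('length: ', Length(S)); // behaves exactly as it did in Delphi 2007
    end;

    procedure NewCode(const S: UnicodeText); // Unicode is an explicit opt-in
    begin
      Writeln('length: ', Length(S)); // UTF-16 code units, only where you asked for them
    end;

    begin
      LegacyCode('unchanged behaviour');
      NewCode('new behaviour, by choice');
    end.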

If Codegear had really wanted to be helpful, I guess they could have gone the extra mile and added a compiler directive: D2007 compatibility, on or off. That way everybody would be happy.
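
That switch never existed, of course, but Delphi 2009 does define the UNICODE conditional, so the nearest real-world equivalent is bridging by hand in code that has to compile in both worlds. Something like this (RawBuffer and RawString are my own names, not library types):

    type
    {$IFDEF UNICODE}
      RawBuffer = PByte;       // Delphi 2009: say "pointer to byte" explicitly
      RawString = AnsiString;  // keep single-byte semantics where they matter
    {$ELSE}
      RawBuffer = PChar;       // pre-2009: PChar was the de facto byte pointer
      RawString = string;      // string was AnsiString anyway
    {$ENDIF}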

Codegear, and Borland before them, keep making the same mistake. They keep assuming that technology alone can create these magic buttons: press them, and you get a wonderful transformation. That would be nice, but it just doesn’t work that way in the real world. They tried it with Kylix (just compile your existing Win32 code in Kylix, and you have a Linux application) and it didn’t work; they tried it with Delphi for .NET (just compile your existing Win32 code with Delphi 8, and you have a .NET application) and it didn’t work; and they’ve tried it again with Unicode support (just compile your existing non-Unicode code with Delphi 2009, and you have full Unicode support), and in my opinion it hasn’t worked either. Sure, it works with a contrived example, but try it with any decent-sized, real-world application, and you’re in trouble pretty quickly.

I’m sure there are many out there who have compiled their existing Delphi 2007 code in Delphi 2009 and, with a little tweaking here and there, have been able to get it to work. I just think it should be the developer’s decision when to introduce Unicode support. I guess it still is our decision, but currently, if we decide that completely changing the way our code behaves is a step too far, our only option is to remain with Delphi 2007. And if that happens, Codegear are one step closer to losing another customer.

Wednesday, 18 March 2009

Shiny Chrome

I haven’t posted in a while, as I’ve been setting up a website for my wife. Nothing special, just some static HTML. God, do I hate HTML. It’s so illogical! It’s not real programming. I also made the mistake of assuming that modern browsers are now more or less all compatible. It’s only my second ever website, so I could be forgiven. As I was developing, I was testing with Chrome, my default browser. I love Chrome; it’s fast, and it does what you expect. Once I was finished, I tried the site in IE7. ARRGGHH. What a mess. Firefox and Opera were slightly different in rendering the pages, but IE was completely off. I had to compromise to get IE to work, and it still isn’t great. Why do people still use IE?

And don’t get me started on Dreamweaver. It keeps moving text around. I gave up on it and used a text editor, Notepad++. Give me Delphi or Visual Studio development any day!

The site, if you are interested, is Chez Piron. We renovated an old barn, and my wife will be handling letting it out as holiday accommodation.