Archive for the ‘Software’ Category

Woohoo, I’ve changed the world :-)

March 4, 2008

Well, at least I feel like I have. I finally contributed some code back to open source, and there’s a real feel-good factor attached to doing so. Fine, yes, it was less than 50 lines, and yes, the icon doesn’t look all that awesome (but in my defense, it was hand drawn using the Visual Studio icon editor).
 

So what was it?
Deblector, created by Felix, is a plug-in for Reflector which allows you to debug MSIL in the .NET framework. I’ve been using it from very close to the start (almost 1.5 years ago now), when working on a nasty framework being developed by a 3rd-party vendor. Many of the framework releases were premature and we were the guinea pig developers building against it. Many features of this framework just didn’t work, and we had no documentation other than the two people in the other room who created the framework (and who never really gave it a test, as shown by the amount of core functionality that didn’t work and never would have, given how they had written it). Anyway, the great thing about this Reflector plug-in was that it gave you the power to trace through other people’s code: it allowed you to attach to an application (much like the Visual Studio debugger), set breakpoints in the MSIL (helped by the core Reflector features of disassembling the MSIL to C# or VB), see the execution paths, step into, over and out of code, see local variables etc., in order to figure out what might be causing the issues.

So what did I change?
The great feature of mdbg was that you could set it to break at the first exception (so you could break on exceptions even within a try/catch block). This was the main reason I started using Deblector, as the framework I was debugging against always swallowed exceptions; no logging, no messages, no nothing, just poof and gone (anti-patterns anyone???). Deblector supported the “Catch Exception” command within the command window, and when the break point was reached the command window knew this and allowed you to work with it; however, the Deblector UI wouldn’t, and so you could not use the built-in prettiness and power of Deblector. So now, with only a few lines of code and 30 minutes creating a nice icon (yeah, don’t laugh at it, I thought it turned out quite well considering my art skills), I have added a handler for the event and hooked up the code that keeps the UI in check. Very simple and took very little time; however, I figure it’s a really useful feature and tool.
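For the curious, the shape of the change was nothing fancier than the standard WinForms pattern of handling an event raised from a debugger callback thread and marshalling the update back onto the UI thread. A rough sketch in C# (the IDebuggerSession interface, the BreakpointReached event and RefreshDebuggerPanels are made-up names for illustration, not Deblector’s actual API):

using System;
using System.Windows.Forms;

// Sketch only: "IDebuggerSession" and its BreakpointReached event are hypothetical
// stand-ins for whatever the debugger layer actually exposes; the point is the
// handle-the-event-then-marshal-to-the-UI-thread pattern.
public interface IDebuggerSession
{
    event EventHandler BreakpointReached;
}

public class DebuggerWindow : Form
{
    private readonly IDebuggerSession session;

    public DebuggerWindow(IDebuggerSession session)
    {
        this.session = session;
        this.session.BreakpointReached += OnBreakpointReached;
    }

    private void OnBreakpointReached(object sender, EventArgs e)
    {
        // Debugger callbacks arrive on a background thread, so marshal onto the
        // UI thread before touching any WinForms controls.
        if (InvokeRequired)
        {
            Invoke(new EventHandler(OnBreakpointReached), sender, e);
            return;
        }

        RefreshDebuggerPanels();
    }

    private void RefreshDebuggerPanels()
    {
        // Placeholder: refresh locals, call stack, current-line highlight, etc.
    }
}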

So, next time you are digging into someone else’s framework or DLL and you don’t have the code but want the ability to debug… give Deblector a try; it could be the man for the job 🙂



To be or not to be? – Jack of all trades

February 8, 2008

A few days back at the weekly mentoring session at work, an interesting topic came up: to specialize or not to specialize in the IT industry.

As a Microsoft partner, most of the products and solutions we sell are Microsoft based (CRM, MOSS, .NET etc.), so to a degree, as a company, we have already specialized. However, within this space there’s a lot more to think about, for example specializing in WinForms, web services or web apps, or even specializing horizontally across these, for example security, use of application blocks such as policy injection, logging, enterprise search tools etc.

Towards the end of this there was a vote: who would like to specialize and who wouldn’t. From recall, I think it was pretty close to 50/50 in each camp. The idea of having dedicated people or a group to call on in a time of need seemed very appealing; however, being the person known as the expert didn’t appeal to everyone.

A few quick things for consideration on this…

  • It is always good to have help from people and leverage off others’ knowledge (everyone needs experts to climb on).
  • (Most) developers by nature want to learn and grow. When we get a chance to play with new technology, it’s like a little kid opening up their Christmas present.
  • Being given the title ‘Expert’ gives huge kudos amongst your workmates (including managers and PMs), which could help you land the next best project.
  • Being an expert in old technology usually means that you end up supporting legacy systems, and that the only way to get your hands on new toys is by leaving and finding a new job.
  • Being the last expert in an old technology means you never get any work done, as everyone else is asking you for help when they are digging into old systems.
  • When you feel you have gone as far as you can (or want to), you will most likely want to dig into something different; therefore experts would be hard to keep.
  • When you are at the stage in life where you are happy to settle on stuff (maybe when you start having a family?) and sick of the world moving so darn fast, you might appreciate being the master of a technology no matter how old that technology is (COBOL anyone?).
  • The time and investment required to be the first to discover, investigate and solve problems. This requires commitment from the company and, in many cases, the client too. It’s not ideal to combine tight deadlines with playing with new technology; you will only ever focus on the immediate goal and not on the learning required to master the technology.
  • Being an expert could mean the company is now dependent on you (pay rise, anyone?).
  • Being an expert could also mean the company forgets about you and loses visibility over all the things you do during a workday.

Anyway, a few thoughts…
I’m a keen advocate of the phrase “Jack of all trades, master of none”, though the phrase “Jack of all trades, master of most” does sound more appropriate in IT. In IT it really does help if you can provide business solutions independently. It also helps if you have a basic understanding of a wide range of technology. With this basic knowledge, Google becomes a much better friend and the time you spend solving a complex problem becomes shorter. With technologies such as .NET and WCF abstracting complexity, it’s too easy for people to know how to plug things together without actually knowing the details of the underlying technology, so when problems occur, they are lost as to how to begin solving them. For me, I think it’s best to have a breadth of skills and do whatever you are working on well; that way, you may become a natural expert in the areas you have worked on and still have broad knowledge.

Credit: Image shamelessly stolen from http://www.pjsinnam.com  (PJ Udo C.J. Fischer – 1975)

Kiwicon 2k7 – keynote: The Psychology of Computer Insecurity

December 2, 2007

As promised, here is the first of maybe a few (dependent on the quality of my notes, I guess) write-ups around Kiwicon.
Where better to start than with the keynote.
The keynote was given by Peter Gutmann.
A large part of his talk was around the psychology of users and IT professionals when it comes to security. Part of this covered the idea that most “normal” users have a confirmation bias: a person looks for evidence of why something is what they already perceive it to be. For example, when looking at a URL such as “www.ssl-yahoo.com”, many users concluded that it must be a subdomain or directory of yahoo.com.
There is also the idea of disconfirmation bias, which is the opposite of this.
Geeks generally fall into the second group, looking for the padlock when logging into banking sites etc., where most users don’t.
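As a small aside, the “www.ssl-yahoo.com” example is exactly the sort of thing code gets right and tired humans get wrong: a host is only part of yahoo.com if it is yahoo.com or ends in “.yahoo.com”. A quick illustrative check in C# (the URLs are just examples):

using System;

class DomainCheck
{
    static bool BelongsTo(string url, string trustedDomain)
    {
        // A host belongs to yahoo.com only if it *is* yahoo.com or ends with ".yahoo.com".
        string host = new Uri(url).Host;
        return host.Equals(trustedDomain, StringComparison.OrdinalIgnoreCase)
            || host.EndsWith("." + trustedDomain, StringComparison.OrdinalIgnoreCase);
    }

    static void Main()
    {
        Console.WriteLine(BelongsTo("http://www.ssl-yahoo.com", "yahoo.com")); // False: a completely different domain
        Console.WriteLine(BelongsTo("http://mail.yahoo.com", "yahoo.com"));    // True: a real subdomain
    }
}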
One example given was a survey on the statement “All of Ann’s children are blonde. Is it valid to say a subset of Ann’s children are blonde?”. When asked in a lecture theatre full of IT professionals, roughly 90% agreed with the statement; however, 70% of the general population taking the survey disagreed with it, highlighting the different ways of thinking between IT people and general users.
Developers generally expect that users will notice a small stimulus, or the absence of one. For example, when logging into online banking, security depends on the end user noticing the presence or absence of the padlock indicating that the site is using SSL; most users will not notice the difference when performing their normal activity. This goes one step further when looking at user behaviour in dealing with dialogs and popups. People form a pattern of ignoring alerts and clicking on whatever buttons are needed to complete their tasks. If an ActiveX dialog pops up, by force of habit many users will click Yes/Accept without reading or thinking twice.
He also applied the bystander effect to the internet world as another barrier working against computer users. The bystander effect is “a psychological phenomenon in which someone is less likely to intervene in an emergency situation when other people are present and able to help than when he or she is alone”. One explanation for the bystander effect is that if an individual is part of a crowd and no one else is doing anything, that individual believes themselves to be wrong and everyone else’s perception to be correct. On the net this becomes even worse, as the entire world becomes that bystander.
This also applies to open source software. Many IT people believe that if they pick an open source product that is well used by others, then it must be secure, as someone (if not many people) would have audited the code. Generally speaking, this is not the case, as shown by many big open source projects where large security bugs are discovered long after release.
Peter finished up his talk with a thought to take away: error mitigation. When applying for jobs, you are run through numerous tests to prove how capable you are. The greater the need to get things right, the more tests you need to pass; for example, if you were applying for a job running a nuclear reactor, you would expect a lengthy process.
The military relies heavily on psychometric testing and training before anyone is allowed to do anything important outside of being a grunt.
However, the normal computer user gets no training at all before being let loose on a computer and the net, using online shopping and email.

KiwiCon 2k7

November 21, 2007


Last weekend I had the chance to go to Kiwicon, a security / hacking conference held in Wellington over a two-day period. It was presented by a bunch of locals and a few out-of-towners, including one Russian guy with a very strong accent. Many of the presenters had presented at Defcon and obviously knew what they were talking about, and once again the keynote was very impressive. Although not all the content was relevant from a developer’s point of view, all the speakers broadened my way of thinking and highlighted to some degree the illusion of security (which is of course relevant to everyone).

There were a few guys from Security-Assessment presenting on how they do parts of their job, what they look out for and some examples of where things have really gone wrong (extremely useful from a developer’s view, especially because we deal with them a lot, as they check some of the software / web applications we produce).

All in all, a very worthwhile event; no regrets <William Hung style> about spending a sunny Wellington weekend inside listening to computer-related security stuff.

I’m still completely wasted after the conference combined with the amount of overtime I’m putting in at work, so no big write-up for now, but over the next few days I plan to post a few notes on some of the talks <watch this space??? 🙂>

Some free time to build a trademe watchlist gadget

October 26, 2007

Woohoo, I finally managed to get hold of a few hours to try and build a Vista gadget. I call it the “Trademe Watchlist Watcher”. Being a Trademe fan and all, I thought why not do a small gadget that would screen scrape the web site (due to the lack of web services) and display a nice little summary of the next closing item on my watchlist. It works, and if you are a Trademe addict like me and run Vista (like me too 😉) then you might get something out of this.
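The gadget itself is just HTML and JavaScript, but the scraping side boils down to something like the sketch below, shown here in C# for readability. The URL and markup pattern are made up for illustration (and the real thing also has to deal with being logged in), so treat it as the idea rather than working Trademe code:

using System;
using System.Net;
using System.Text.RegularExpressions;

class WatchlistScraper
{
    static void Main()
    {
        // Hypothetical URL: the real gadget scrapes the logged-in watchlist page.
        string html = new WebClient().DownloadString("http://www.trademe.co.nz/MyTradeMe/Watchlist.aspx");

        // Hypothetical markup pattern: pull out (title, closing time) pairs from the page.
        MatchCollection matches = Regex.Matches(html,
            @"<a class=""listingTitle"">(?<title>.+?)</a>.+?closes\s(?<closes>[^<]+)<",
            RegexOptions.Singleline);

        // The gadget would then sort by closing time and show only the first item.
        foreach (Match m in matches)
        {
            Console.WriteLine("{0} - closes {1}", m.Groups["title"].Value, m.Groups["closes"].Value);
        }
    }
}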

The gadget is inspired by Darryl from Microsoft and Jeremy from Mindscape, who did a proper gadget for Trademe a while back; however, while that gadget was more focused on sellers, this one is more for junk collectors (if you’ve seen my house, you know what I mean).

For more information and download link… Click Here

Basic Flash Detection using Javascript

August 16, 2007

Every now and then I have found myself on the hunt for some “simple” yet robust Flash detection. This can be pretty easy to rip off from other sites, but there seem to be quite a few different ways to deal with it, and of course the whole cross-browser compatibility issue is involved. I also really wanted one which didn’t use VBScript tags to try and create objects, so after spending time on the hunt and finding nothing that really fitted what I wanted, I thought I’d hack together a function which would. It incorporates quite a few bits from other sources, really just mangling bits together and tidying them up to fit my purpose.

The end result is a small and clean function that should work in both IE and Firefox / Mozilla browsers.

The JavaScript function is below:

function HasFlash(MM_contentVersion) {
    var MM_FlashCanPlay = false;
    // Firefox / Mozilla (and other Netscape-style browsers): check the plugin list.
    var plugin = (navigator.mimeTypes && navigator.mimeTypes["application/x-shockwave-flash"]) ?
        navigator.mimeTypes["application/x-shockwave-flash"].enabledPlugin : 0;
    if (plugin) {
        // The plugin description looks like "Shockwave Flash 9.0 r45";
        // keep the last numeric token as the installed version.
        var words = navigator.plugins["Shockwave Flash"].description.split(" ");
        var MM_PluginVersion;
        for (var i = 0; i < words.length; ++i) {
            if (isNaN(parseInt(words[i]))) {
                continue;
            }
            MM_PluginVersion = words[i];
        }
        MM_FlashCanPlay = MM_PluginVersion >= MM_contentVersion;
    }
    else if (navigator.userAgent && navigator.userAgent.indexOf("MSIE") >= 0 &&
             navigator.appVersion.indexOf("Win") != -1) {
        // IE on Windows: try to create the ActiveX control for the requested version.
        try {
            var flash = new ActiveXObject("ShockwaveFlash.ShockwaveFlash." + MM_contentVersion);
            MM_FlashCanPlay = true;
        }
        catch (err) {
            MM_FlashCanPlay = false;
        }
    }
    return MM_FlashCanPlay;
}

Tech Ed New Zealand 2007

August 15, 2007

Well, it’s been a real busy three days up at Auckland’s Tech Ed. There were 18 Intergenites all up and, as work would have it, we all showed up in bright yellow camo army-style gear, which after the original shock really looked impressive. We did manage to really enforce our presence there, as you could imagine 18 guys (and girls) dressed up in bright yellow army-style pants would do. It was also very impressive seeing the social aspect/effect of this when walking outside of Tech Ed and seeing the people of Auckland taking double looks at us :-).

Moving on, there were some real highlights at Tech Ed (outside of the Tech Fest party of course); they were:

Key note

The key note was largely delivered by Lou Carbone; unfortunately I wasn’t taking any notes, even though it was really worthy of them. Lou Carbone talked heaps about user experience, emphasising that you can have ultimately crap service (or product) but still walk away with a great experience. One example (out of the many great examples he gave) was around him getting his hair cut during a trip to Canada. The story goes along the lines of this great-looking barber shop that he went to to get his hair cut. He used strong visual words and pictures to describe the traditional nature of this shop: native wood, fragrances etc. The barber himself was dressed in a white outfit, as opposed to the more common jeans and t-shirt. So he goes in for a hair cut and gets the whole treatment: shampoo / conditioner, the massaging of his hair as this is all happening, soothing music playing in the background and the gentle smell of aftershave. In fact, he was so happy with this that he also got his beard trimmed. As an added bonus, he also got his ear lobes gently massaged (and how great that was). So when he looked into the mirror after the cut, he was surprised to see a less than average cut. Anyway, based on his experience there, he just couldn’t help but keep going back; in fact, he started planning his hair cuts around his trips to Canada. I really can’t express this in such powerful words as he could, but the underlying idea is around the experience and not always the quality (though I wouldn’t think this is a good excuse for crap software; there really must be a point where the quality impacts the actual experience???).

Another example was based around staying at a dark motel where you might be too afraid to close your eyes at night vs a 5-star hotel where the toilet paper gets folded into triangles to indicate the room has been cleaned and where your bath towels are folded into shapes, pockets or even swans.

One other important aspect of the experience puzzle is the wording: it is too easy to get the wrong message across with small changes in words, and the flip side is that it is also easy to get the right message across using good words.

I think this relates directly to the software industry, as it’s virtually impossible to create bug-free software; however, if customers feel listened to, feel that the software provider understands their business / goals and believe in the support they are getting, chances are they will keep coming back.

Software + Services: Microsoft’s vision for SOA, SaaS and Web 2.0 (Michael Platt)

This session was a great introduction to some of the movements in the software industry and what is driving Web 2.0.

The main drivers mentioned were business drivers (mentioning both Google using the advertising model and Amazon using virtual marketing with a focus on specialist books) and social drivers (MySpace, Bebo and Facebook being prime examples). One interesting spin-off to this was the fact that different “brands” of social networks had very different uptake levels from country to country. Another interesting point he mentioned, and one that has been in the spotlight, is the protection of people’s data (privacy and identities across these social networks). He mentioned Tim’s blog article “7 rules of identity” for further reading.

Outside of this, he made comparisons between Service Oriented Architecture (SOA) and Software as a Service (SaaS), with strong emphasis on their characteristics and the audiences which end up using them (SOA = loosely coupled, mainly used by the enterprise; SaaS being picked up by / targeted at Independent Software Vendors).

His talk on the general Web 2.0 aspect covered the strong uptake of mashup systems, user experience, architecture considerations and, again, the security concerns.

Building rich web experiences using Silverlight and JavaScript for developers (Joe Stegman)

This session went through some of the basic steps around setting up a basic Silverlight project. It introduced Silverlight as a kind of Flash alternative (even though I don’t think I ever heard Flash mentioned). A few things mentioned led me to this conclusion, yet for the life of me I can’t think of any one strong case which led me down this path. There was also a lot around hooking up events to Silverlight and a lot of demos around the actual JavaScript involved. There was also a fair bit of talk about the actual XAML. All in all, even though I don’t think it was as techie as many C# developers would have liked, it was a great introduction to Silverlight from a high level, and it also highlighted many features to look out for in Silverlight 1.1 (all demos were based on the current 1.0 release).

Fronde Debate – Agile? just another word for hacking!

Nothing to do at lunchtime? Why not show up to a light-hearted debate on agile vs waterfall development? This was a very humorous debate; most of it focused more on the personalities and personal aspects of the actual debaters. However, there were quite a few very good points brought out from both sides, such that at one point I was almost (I do emphasise the ALMOST) convinced that the waterfall methodology could work in quite a few situations, mainly smaller applications where, for whatever reason, the application must be delivered within such a short time frame that changes just cannot happen… Anyway, it was a good debate and the debater from Mainfreight(?) did manage to put some good arguments through in favour of waterfall (without too many personal attacks). The agile team did win in the end by audience vote, but I think (as did the presenter) that the waterfall team put a better argument through. The ending few words came from the Mainfreight debater, who confessed that he was an agile man at heart and that they do in fact work using agile techniques.

MCMS 2001 to 2002 Upgrade

July 2, 2007

For the last few months I have been involved with an MCMS 2001 to 2002 upgrade which involved 10 websites and over 70 templates. This was a huge technology upgrade, as the 2001 stuff was based on old-school ASP and VB code and was being upgraded to .NET 2.0 / MCMS 2002 with SP2 (which adds some basic .NET 2.0 support such as sitemaps and master pages).

This is actually a mammoth task and should not really be called an upgrade, as it is, to a large degree, a rewrite: keeping only the logic and assuming there will be no reusable code (you are converting from procedural code to OO, right?).

Tasks involved with the upgrade:

Setting up the dev environment:

I found that with a major setup such as CMS, it’s best doing dev work in a VM. In my case, I used the free Microsoft Virtual PC 2007, Windows 2003 (for its ability to host multiple websites) and SQL 2000 as the base setup. Installing MCMS 2002 and its service packs (SP1a and SP2) will take a good day. The other thing to note is that I was never actually successful in getting the MCMS project templates integrated correctly within Visual Studio 2005 (maybe to do with the order of install?), but as it turns out we didn’t need them, so chances are you probably wouldn’t either. We used the standalone template manager by Steve Walker to do all the linking between the MCMS templates and the ASPX template files. There are also a few tricks to installing SP2 so that it takes less time to install (it will take at least an hour, and if memory serves me right, it took me more like two or so hours).

Migrating the database:

The really good thing that saved me lots of time was that there was no need to have MCMS 2001 installed; all we needed was a copy of the old MCMS 2001 database from the client (a whole 3.5 gigs worth). Once I had restored this as a local database, I had to clean up the database users (removing all those that the client had and giving myself db_owner permissions). With all this done, the next step was to point the MCMS database configuration tool (DCA) at the old 2001 database and let it work some magic. It will eventually say it has found an old MCMS database and ask if you want to convert it.

The conversion process took me over 30 hours when running from within a VM and hitting a database over the network. Lesson learnt: do not use a networked database. It could have been the fact that the VM was slow, that the network adaptor was doing NAT for the VM, that the database was under heavy load, or just a very chatty connection, but yes, it took over 30 hours to migrate 3.5 gigs or so. It’s much easier to have a local DB inside the VM (which, the second time round, ended up taking two hours max while running the VM off a USB 2.0 portable hard drive). Serious lessons learnt!

You will be prompted to add an admin user account; make sure you remember this, as you will need it to get into MCMS using Site Manager (and in order to add more users in MCMS).

The MCMS 2001 templates, which used to be stored as part of the database, will be slightly altered and spat out onto the file system (<Path>). At this stage, it’s probably a good idea to take backups of your migrated database and your exported templates!

Coding up new .NET 2.0 templates

The easiest and quickest way to do this in my case was to find a nice general page from the current live site, view the HTML (client browser end) and throw that whole chunk into a ‘Sample’ HTML file in the solution. The next step was to clean out the non-template data from the HTML, such as the placeholders’ body data. Now throw the cleaned-up HTML into a master page and start moving functionality out into user controls (for now, they will contain only static data in order to get the site compiling and working quickly); for example, some controls might be “BreadCrumb”, “LeftNav”, “MainMenu” etc.

Once all the basic controls have been taken out of the master page, you can start looking at creating your first template. I would pick off the most commonly used template for this, as it helps sort out, say, 80% of the site with very little effort; in my case it was a basic “general” template. Code this template up using either the standard MCMS placeholders or, as I did, some extended custom controls which encapsulate an “edit” guide text caption for authoring mode and the basic CMS placeholder control all in one.

 Some useful “extended” controls to have:

A placeholder control that combines the standard MCMS placeholder control with a settable “edit caption”.

A panel control which hides the content of the panel if the CMS placeholder is empty / blank (<p>&nbsp;</p>, etc.).

An ImagePlaceholder control that allows setting proportional height/width properties (a feature of MCMS 2001 which is no longer supported in 2002).
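To give a rough idea of the second control in that list, here is a minimal sketch of a panel that suppresses its own output when whatever it wraps (for example an MCMS placeholder containing only <p>&nbsp;</p>) renders down to nothing visible. It uses only standard ASP.NET APIs and a naive “is it blank” check; the real control would also need to behave itself in authoring mode, which is left out here:

using System;
using System.IO;
using System.Text.RegularExpressions;
using System.Web.UI;
using System.Web.UI.WebControls;

// Sketch only: hides the whole panel when the content inside it renders down
// to nothing visible (no text and no images).
public class CollapsingPanel : Panel
{
    protected override void Render(HtmlTextWriter writer)
    {
        // Render the children into a buffer first so the output can be inspected.
        StringWriter buffer = new StringWriter();
        base.Render(new HtmlTextWriter(buffer));
        string html = buffer.ToString();

        // Strip tags and non-breaking spaces; keep the panel if any text or an image remains.
        bool hasText = Regex.Replace(html, @"<[^>]+>|&nbsp;", string.Empty).Trim().Length > 0;
        bool hasImage = html.IndexOf("<img", StringComparison.OrdinalIgnoreCase) >= 0;
        if (!hasText && !hasImage)
        {
            return; // effectively blank: emit nothing at all
        }

        writer.Write(html);
    }
}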

Importance of knowing and talking Design patterns

June 16, 2007

As part of a push from a mate at work (and hopefully from myself), I have started reading the book Head First Design Patterns from O’Reilly (an intro PDF is available on the site). At first glance it’s a very childish book, but once you get past the logic for it being so (well explained in the first few pages of the book / PDF), it’s actually a really informative and invaluable book.

<ShamelessPromotionOfBook>

I have always thought that for me to pick up a good understanding of design patterns, I really have to be reading about them and implementing them as I go through the examples (probably the typical approach to learning). The way this book is written seems to remove this need by (according to the author) making use of different learning styles, ultimately making itself stand out through its use of images, with text and arrows inside the image rather than captions, images to stimulate emotion and attachment, lots of activities to practise and quiz yourself on your understanding, the use of stories (which are a lot easier to remember than just raw information and facts) and a lot more. The end result is a much more interesting book that helps your brain persist, recall and hopefully apply the use of patterns in real-life situations (the book gives heaps of good situations which relate somewhat to the real world). So far the book is looking really good and I’d highly recommend it to anyone who thinks they just won’t be able to read a thick book on design patterns.

</ShamelessPromotionOfBook>

Anyway, back to my post. At the end of the first chapter, the book talks about the huge importance and benefits of using a shared vocabulary when talking about software architecture, the vocabulary mainly being a matter of expressing things in terms of design patterns.
So, fully supporting this comment, here are the key points and benefits of doing so (slightly grouped together and in one (longish) paragraph).

Saying more with less / Turbo-charging the team / Staying in the design longer:
By using patterns to describe architecture, you are really describing and grouping together a whole pile of ideas which another developer should be able to understand without you having to expand on the little details, such as “this has a collection of animals which…”. By not getting into the details of how things are going to be implemented, you can easily form a stronger overview of the system and deal with high-level issues such as potential architecture flaws. This also improves the efficiency of the team (the whole turbo-charge thing) by spending less time actually explaining the individual components of a pattern (outside of the context of the pattern) and by minimising mistakes caused by misunderstanding and misinterpretation.
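To make the “saying more with less” point concrete: compare spelling out “the duck holds a reference to one of a family of interchangeable fly behaviours which…” with simply saying “the fly behaviour uses the Strategy pattern”. A tiny C# take on the book’s duck example (my own sketch, not code from the book) shows everything that one sentence implies:

using System;

// "Fly behaviour uses the Strategy pattern" - that one sentence implies all of this:
public interface IFlyBehaviour
{
    void Fly();
}

public class FlyWithWings : IFlyBehaviour
{
    public void Fly() { Console.WriteLine("Flapping off into the sunset."); }
}

public class FlyNoWay : IFlyBehaviour
{
    public void Fly() { Console.WriteLine("Can't fly (rubber duck)."); }
}

public class Duck
{
    private IFlyBehaviour flyBehaviour;

    public Duck(IFlyBehaviour flyBehaviour)
    {
        this.flyBehaviour = flyBehaviour;
    }

    // Behaviour is delegated, so it can be swapped without touching Duck itself.
    public void PerformFly()
    {
        flyBehaviour.Fly();
    }
}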

<ShamelessPromotionOfBook>
Once again, a highly recommended book and easy to digest. Buy or borrow this book!
</ShamelessPromotionOfBook>