Thursday, December 13, 2012

Product Engineering is different!

Knowing the difference between 'product engineering' and 'application development' is not optional at my workplace... here at 99X Technology, we live and breathe product engineering every day; it is the niche we specialize in.

Anyone thinking of joining us... it's clearly advantageous to know why and how product engineering differs from other disciplines. This blog post... one may even treat it as a leaked interview guide from me... ;)

If you ask someone this question, the popular answer revolves around codebase maintainability. Historically, application development was considered short-lived, for example with a limited number of releases (deploy and forget), whereas products were expected to evolve over longer durations and be regularly maintained.

But this doesn't hold entirely true today. Although most of the best practices of software maintenance (e.g. unit testing, automation) were ideated at product engineering houses, I see the difference diminishing lately as smart organizations demand that internal applications also evolve regularly and rigorously. So while it is true that product engineering mandates long-term maintainability, that's not optional for application development either.

So what are the clear-cut differences? One important aspect is to understand the flexibility demands of your customers. Products are intended to be sold to multiple organizations (or individuals, if in the B2C space). Business rules, user flows and integration requirements can differ substantially from one organization to another.

Unless you verify that your architecture's agility fulfills the 'cultural' demands of your customers, you may have a hard time convincing them at your sales meetings. Note the use of the word 'cultural' above... that was very intentional. You can propose that customers adopt the better/innovative flows in your product, but the last thing you want to negotiate is the culture and values of their organization.

The next significant aspect is to understand the distribution model of your software. Be it SaaS, OTA or push installation on desktop computers, distribution has to be painless for the customer. Internal IT departments are not going to sit for days figuring out how to get your software up and running. In application development, the context and environmental conditions are well known, and buy-in from internal IT is high, as the application is often their own brainchild.

But in product engineering, the assumptions you may make are limited, and your product needs to address a broader combination of packaging, deployment and compatibility scenarios with a smooth distribution model.

Last but not least are the considerations of the sales and marketing department. What is your licensing model? Do you provide online trials and demos? How do you manage the different editions you offer to the market? These are some of the questions unique to product engineering.

Also keep in mind that products are expected to compete in the market with dozens of similar software offerings out there... How attractive is your user interface? Where does your user experience excel? The importance of user experience (UX) design is far more prominent in product engineering for two reasons. First is the obvious 'market competition' aspect. Second is the fact that your product is not custom developed for a particular organization and may therefore appear alien to your users unless you stand out in the user experience you provide.

Ensure you answer these questions consciously when doing product engineering, or be prepared to lose your market to someone who does it correctly! ...and the reason we specialize in product engineering is that we want our customers to be in the latter group!


I have refined the content to be a more structured white paper. See it here: 

Wednesday, December 05, 2012

Simplicity Misunderstood!

Human thinking is complex... and our brain deals with dynamic, non-linear, multi-dimensional problems every day. For example, people easily drive on extremely chaotic Asian streets, simultaneously appreciating the pretty views around (you know what I mean..), while engaged in a heavy political conversation with another person in the car… There is no question the human brain is capable… but the question is… why do we developers wet our pants when it comes to complexity in the user interfaces we create??

Here is my take on it:
People hate 'perceived complexity' 
but love 'power for complexity' !
I know, that statement needs explanation! Here we go... 

Let's take business applications. Business applications deal with power/heavy users… and for power users, the power to perform complex operations is mandatory. Give them a tool that restricts behavior/imagination, and usage will just fall off... Need proof? Who thinks Excel is simple? Despite the criticism, it is still the most valued tool after all..

Ok, now.. What is the difference between 'perceived complexity' and 'power for complexity'? See the screen below for a case study:

The problem with the above interface is that it's just 'static, linear, single-dimensional', and people perceive it as complex… In my view, the secret of creating user interfaces which are not perceived as complex, but which have 'power for complexity', is hidden in three attributes, the exact ones I mentioned in the first sentence of this post!

  • Make the interface dynamic: By dynamic, I mean sensitivity to user state and context. Make the UI relevant to the user by proactively understanding the user's state and operating environment. Dynamic interfaces help take most of the perceived complexity away from data-oriented screens. See how smart LinkedIn is in getting users to complete their CVs... That is a great use of social aspects too. Mobile adoption and new HTML5 capabilities help web applications a lot here.

  • Make the interactions nonlinear: Linear interfaces are navigation centric, whereas nonlinear interfaces are content centric. Paging and form-based traditional UIs are examples of navigation-centric UIs. Most modern single-page UIs are content centric and present users with relevant content at the right time and place, without users performing explicit navigation.

  • Serve with multidimensional content: My favorite.. Multidimensional behavior opens limitless opportunities for rich user interactions. Users are free to imagine and are helped to put their brains to creative use. Facet-analysis-based 'advanced search' interfaces are a good example of this. Multidimensional interfaces can even gamify complex operations and let users enjoy using their brains while better achieving business objectives.
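To make the 'multidimensional' idea a bit more concrete, here is a tiny sketch of facet-based filtering in JavaScript. All names and data below are made up purely for illustration:

```javascript
// Facet-based filtering: every facet the user picks narrows the result
// set along one independent dimension.
function facetFilter(items, facets) {
  // 'facets' maps a dimension name to the selected value, e.g.
  // { color: 'red', size: 'L' }; unselected dimensions are simply absent.
  return items.filter(function (item) {
    return Object.keys(facets).every(function (dim) {
      return item[dim] === facets[dim];
    });
  });
}

var products = [
  { name: 'shirt-a', color: 'red',  size: 'L' },
  { name: 'shirt-b', color: 'blue', size: 'L' },
  { name: 'shirt-c', color: 'red',  size: 'M' }
];

facetFilter(products, { color: 'red' });             // shirt-a, shirt-c
facetFilter(products, { color: 'red', size: 'L' });  // shirt-a only
```

The point is that the user composes a complex query by stacking simple, independent choices, so the power is there but the perceived complexity stays low.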

What do you think? Do you have any addition to the above list?

Thursday, November 29, 2012

How 'multi-page' can a 'single-page' application be?

The title seems confusing??? mmm, yes.. it reflects the state of my mind on this subject!
With the HTML5/JavaScript hype, the trend today is to make every web application single-paged. Look around at the most famous web applications.. most of them are single-page applications with heavy use of JavaScript.
Question: First of all, why do we need to make our web applications single-paged?
Answer: It sounds cool and techie !!!
A funny answer.. but unfortunately the common one too (smile)... Maybe a better answer is.. 'We go single page to improve our user experience'

Now, how can a single-page application improve usability?
  • Multi-page applications re-download the entire application layout with almost every user action, slowing down the response time 
  • The browser spends significant time re-rendering the entire application for each server round trip (DOM manipulation is slow), slowing down the response time 
  • Rendering on the server places high demands on the server side, ultimately impacting user response time under higher loads 
  • The round trip makes the browser blink at reload, distracting the user's engagement with the application 
  • Single-page code bases are cleaner due to a clear separation of concerns (API and UI).. resulting in faster time to market 
  • The single-page approach empowers us to create a non-linear user experience that is completely different from the form/page-driven experience of the multi-page web

Sounds great.. But do these hold true all the time? Maybe not.. there are some pitfalls of the 'single-page' approach too:
  • Browsers are not the best species at memory handling and garbage collection, especially when it comes to the DOM. For example, elements removed from the DOM are still held in memory. If we let users work on the same DOM for a long time without a refresh, the browser might struggle to cope with memory issues 
  • We should ideally reuse as many DOM elements as possible instead of disposing of them. But even with that approach, if the user creates thousands of reusable DOM elements as she uses the application, the browser may struggle to cope. On the other hand, memory leaks created by bad coding practices will also pile up in a long-lived DOM. 
  • If we need the complete product suite functionality to be available as a single-page application, it can be too much JS/CSS code to load at once, impacting the initial load time.
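The 'reuse rather than dispose' idea from the second point can be sketched as a simple element pool. To keep the sketch self-contained it takes a factory function; in a browser that factory would wrap document.createElement. Everything here is illustrative:

```javascript
// A minimal pool that hands back previously released elements instead of
// creating (and later garbage-collecting) new ones on every render.
function ElementPool(factory) {
  this.factory = factory;  // e.g. function () { return document.createElement('div'); }
  this.free = [];          // released elements waiting for reuse
  this.created = 0;        // how many elements we actually allocated
}

ElementPool.prototype.acquire = function () {
  if (this.free.length > 0) return this.free.pop();  // reuse an old element
  this.created++;                                    // create only when the pool is empty
  return this.factory();
};

ElementPool.prototype.release = function (el) {
  this.free.push(el);  // detach from the DOM and keep for reuse, do not discard
};

var pool = new ElementPool(function () { return { tag: 'div' }; });
var a = pool.acquire();  // pool is empty, so one element is created
pool.release(a);
var b = pool.acquire();  // the same element comes back; nothing new is created
```

This keeps the number of live DOM nodes bounded, though as noted above, leaks from bad coding practices will still accumulate in a long-lived DOM.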

Probably we will need to look for a solution that lets us enjoy the best of both worlds..
  • If we study how Facebook operates, we notice it works as a single-page application (long-lived DOM) in general. But from time to time, with a user action, it reloads the page entirely, causing a new DOM to be formed. With this approach Facebook ensures users do not continue working on the same DOM for a long stretch. 
  • If we look at how JIRA works, we can see a different approach. JIRA is a collection of single-page applications. Each sub-application works as a single-page application, but when traversing between sub-applications, it refreshes the DOM (behaving as a multi-page application).
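Both strategies boil down to one routing decision: stay on the long-lived DOM for navigation inside a sub-application, and force a full page load (a fresh DOM) when crossing a sub-application boundary. A hypothetical sketch, with a made-up URL scheme where each sub-application owns a URL prefix:

```javascript
// Decide how to navigate: in-page routing keeps the current DOM alive,
// while a full reload discards it and forms a new one.
function navigationMode(currentSubApp, targetUrl) {
  var targetSubApp = targetUrl.split('/')[1];  // '/issues/1242' -> 'issues'
  return targetSubApp === currentSubApp
    ? 'in-page'       // hash/history routing, long-lived DOM
    : 'full-reload';  // e.g. window.location.href = targetUrl; fresh DOM
}

navigationMode('issues', '/issues/1242');  // 'in-page'
navigationMode('issues', '/boards/5');     // 'full-reload'
```

The periodic full reload is what keeps browser memory in check, exactly as in the Facebook and JIRA examples above.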

What flexibility is there for implementing this with BoilerplateJS?
It is not difficult to implement the approach taken by JIRA with the existing BoilerplateJS architecture. Each sub-application can have a completely different entry point (meaning we will have different main.html files for different sub-applications). This ensures a page refresh when users move between sub-applications, causing the DOM to refresh.
Each entry point will only include the scripts that are necessary for that sub-application to operate. These scripts can be:
  • Third-party scripts such as jQuery. These will be served from a central CDN, allowing caching 
  • Common scripts such as the boilerplate 'core/context.js' 
  • Specific custom scripts of the particular sub-application, such as viewmodels, components, etc.
The latter two will be compiled into a single script per application by the BoilerplateJS optimizer, meaning even the core classes will be in the sub-application script for each and every entry point. In contrast, it is possible to compile the core framework classes separately, allowing them to be cached separately. But the benefits are not worth the trouble in my opinion, because doing so restricts developers from selectively using 'core' scripts in their applications.

Wednesday, August 22, 2012

BoilerplateJS is out!

After several months of hard work, BoilerplateJS - "The JavaScript reference architecture" - is finally out in public. We created the website, samples and some documentation, ready for someone to start boiling complex code with JavaScript.

BoilerplateJS was created to make complex applications possible with JavaScript + HTML5. It is not just another utility library solving a single concern such as MVC or routing, but a solid reference architecture demonstrating the patterns and concepts involved in dealing with complex JavaScript projects.

Don't misinterpret this. We are not reinventing concerns such as DOM manipulation, MVC, routing, etc. Use your favorite JS libraries for those. BoilerplateJS demonstrates the best practices for integrating your favorite utility libraries into a robust, scalable product architecture.

A few of the things BoilerplateJS will help you deal with are:
  • Structuring your solution 
  • JS script dependency management 
  • OO programming with JS 
  • Building a modular product suite 
  • Routing, browser history, bookmarking 
  • Unit testing 
  • JS optimization, obfuscation
  • Document generation 
  • Localization 
  • UI theming and more... 
Have a look and let me know what you think about it! We are working on complete documentation + training videos which will be out soon. A big thank-you to my team Asanka, Himeshi & Janith, who work as interns at 99X Technology.

Thursday, June 28, 2012

Traditional Portals are dead! Long live 'Script Portals'!

A little history of assumptions

As businesses grow and competition toughens, enterprises need to make business information more mobile. Around a decade ago, many believed unified web portals to be the rescuer in this mobility game. Technologists bet on web portals based on two major assumptions:
  • Users will use web as their preferred channel for information consumption 
  • Users would prefer unified information under a single user interface

The reality of generation-i 

The World Wide Web (www) is no longer the talk of the town. Users of generation-i do not believe in unified information anymore. They need information to be available based on relevance. Today, much greater information mobility is demanded, where:
  • access channel is flexible and appropriate (www, mobile, tablets, etc.) 
  • a controlled flow of relevant information push is preferred over pull behaviors

Limitations of traditional portal platforms  

With the massive buzz around portals, most major technology companies invested in their own portal frameworks around a decade ago. Most of these were actually enhancements to existing CMS products, causing the portals to be too heavy for the desired purpose. On the other hand, the portal standards that emerged (such as JSR 168/286, Web Parts, etc.) were always server-side standards, binding developers to a particular technology and to a specific portal platform.

Life is even harder for 'product' portals 

Product portals (built by ISVs) have even more challenges compared to application portals. 
  • If your product serves the B2B domain, your customer may have their own information systems, including portals and other channels. Your product's information needs mobility and integration capabilities towards that existing ecosystem. Yet another separate portal can be challenging to sell, especially to large customers. 
  • Products are expected to be maintained and developed over a long time period. It is vital to select a platform which will stay developer friendly for years to come. Selecting between EPiServer, DotNetNuke, SharePoint, Liferay, etc. is not a simple choice to make. All these platforms will vendor-lock you once you start implementing your custom portlets. 

New architecture for product portals - 'script portal architecture'

Imagine placing a Facebook widget inside a simple HTML portlet of your portal server. It is just a matter of copying and pasting a few lines of script to make the widget appear inside the portlet. For example, the following picture shows the Facebook like-box placed inside a DNN portal.

We can also develop our custom product portlets in a similar manner to Facebook widgets. With the advancement of HTML(5)/JS technologies, one can avoid dependency on server-side portlet rendering completely. Development of each custom product portlet (let's call it a 'script portlet') can be fully decoupled from the server-side portal platform, and it may even be served from a different server altogether.
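For illustration, the client-side rendering of such a 'script portlet' could be as simple as the sketch below. The customer would paste a placeholder div plus a script tag pointing at our server; the portlet name, data shape and markup here are all hypothetical:

```javascript
// Customer's page would contain something like:
//   <div id="orders-portlet"></div>
//   <script src="http://portlets.example.com/orders.js"></script>
// and orders.js would fetch data (JSON/JSONP) and render markup like this:
function renderOrdersPortlet(orders) {
  // Build the portlet markup purely from data fetched from our own server;
  // no server-side portal API is involved anywhere.
  var rows = orders.map(function (o) {
    return '<li>' + o.id + ': ' + o.status + '</li>';
  }).join('');
  return '<div class="portlet"><h3>Recent Orders</h3><ul>' + rows + '</ul></div>';
}
```

In a browser, the returned string would be assigned to the placeholder's innerHTML (or built via DOM calls), which is exactly the "JavaScript DOM manipulation" rendering described below.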

This doesn't necessarily eliminate the need for traditional portal systems. Most of the portal/CMS functionality may be used in conjunction with script portlets; we just do not develop custom portlets using the server-side APIs of the portal platform.

Under such an architecture, all your script portlets may be placed on any web page, not just on the default product portal. If your customer already has their own portal, these components can easily be integrated by copy-pasting a few lines of script.
  • The information portlets of your product are decoupled from the traditional portal platform (DNN in this case) and the rendering is done at the client side using JavaScript DOM manipulation. Developers are now free to move between different portal platforms as the need arises. 
  • The required data is fetched as JSON objects (JSONP for cross-domain communication), and it is our own custom 'script portal' that integrates with business systems in preparing the data. This architecture automatically yields a REST API for business functionality which can be used for deep integrations with third-party systems. 
  • If script portlets need to communicate with each other, it is easy to implement a DOM-based message bus that allows inter-portlet communication. The use of events keeps the portlets independent, without requiring knowledge of related portlets. 
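To make that last point concrete, here is a minimal publish/subscribe sketch. In a browser this could be layered on DOM custom events; plain callbacks are used here to keep the sketch self-contained, and the topic names are illustrative:

```javascript
// A tiny message bus: portlets publish and subscribe by topic name,
// so no portlet needs a direct reference to any other portlet.
var portletBus = {
  topics: {},
  subscribe: function (topic, handler) {
    (this.topics[topic] = this.topics[topic] || []).push(handler);
  },
  publish: function (topic, payload) {
    (this.topics[topic] || []).forEach(function (h) { h(payload); });
  }
};

// The 'orders' portlet announces a selection; the 'customer' portlet
// reacts without knowing who published the event.
var shown = null;
portletBus.subscribe('order.selected', function (e) { shown = e.customerId; });
portletBus.publish('order.selected', { orderId: 1242, customerId: 7 });
// shown is now 7
```

Because portlets only agree on topic names and payload shapes, any of them can be moved to another page or portal platform without breaking the others.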

Monday, June 11, 2012

Leadership is not about winning every battle

A team is formed... goals are set… the competition just begins... a member makes a wrong move… penalty costs apply… right at the start, the team is already behind in the competition…
Leader A thinking: I need the glory of winning every battle. I made my first move right, but he couldn't… we lost points.. I'm hurt… I will do everything possible to make him feel he's paying for it… and I need everyone to know that if we lose, we lose because of him, not me… I need my battle glory!
Leader B thinking: Yes, we lost our first battle.. but more important is to win the war... learn from the mistake… keep the team together.. show appreciation to regain the lost team spirit… value differences and use them for the fightback… finally, what matters most is not whether we 'win the war', but whether we are 'proud of the fight we put together'!
I’d prefer ‘fighting under leader B’ to ‘winning under leader A’!

Thursday, May 31, 2012

Designing RESTful APIs

A restful castle

During a customer call today, I bumped into a question about standards for RESTful API implementations. This is in fact quite a tricky question, because REST is actually an architectural style, not a protocol specification. To prove the point, take Twitter, Facebook, LinkedIn, or any other popular RESTful API for that matter; none of them has a similar protocol specification.

My approach to REST API design is to study a few of these latest and popular APIs to derive a style that matches my needs. In this post I thought of sharing my experience and thoughts on what I believe is a quite elegant RESTful implementation, if someone needs a starting point. Note that these are not standards, and I'm prepared to be challenged if someone convinces me otherwise.

Use of HTTP methods and URLs

Unlike SOAP RPC, which encourages developers to define actions with verbs + nouns [e.g. getAllStudents()], REST proposes keeping these two separated. REST utilizes the common HTTP methods GET, POST, PUT and DELETE to inform the server of the type of CRUD action to perform. Additionally, the URLs are used to identify the data entity/container on which the action should be performed. The following table depicts a simple structure of REST requests for someone to use.

URL              GET (Retrieve)     POST (Create)   PUT (Update)   DELETE (Delete)
/students        Get all students   Create a new    Bulk update    Delete all
/students/1242   Get 1242           -               Update 1242    Delete 1242

(Example URLs shown for a 'students' entity.)

But in real-life projects, not everything is a CRUD operation. For such situations we may use URLs that use action verbs instead of nouns.
An example would be "/api/interest/calculate?rate=15&capital=100&period=2".
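The method + URL convention in the table can be sketched as a tiny dispatcher. The routes, handlers and URL pattern below are illustrative, not any framework's API:

```javascript
// Map 'METHOD path' keys to handlers; ':id' marks a path parameter.
var routes = {
  'GET /students':        function ()   { return 'all students'; },
  'GET /students/:id':    function (id) { return 'student ' + id; },
  'PUT /students/:id':    function (id) { return 'updated ' + id; },
  'DELETE /students/:id': function (id) { return 'deleted ' + id; }
};

function dispatch(method, url) {
  // e.g. '/students/1242' -> container '/students', id '1242'
  var m = url.match(/^(\/\w+)\/(\w+)$/);
  if (m) return routes[method + ' ' + m[1] + '/:id'](m[2]);
  return routes[method + ' ' + url]();
}

dispatch('GET', '/students');         // 'all students'
dispatch('PUT', '/students/1242');    // 'updated 1242'
dispatch('DELETE', '/students/1242'); // 'deleted 1242'
```

Notice the handlers carry no verbs in their URLs; the HTTP method alone selects the action, which is the separation the table above describes.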

Use of HTTP Headers

I believe RESTful services should also make elegant use of the HTTP headers, ensuring that URLs and payload data are not cluttered.
  • For example, the protocol format can be defined in the HTTP header (Content-Type: application/json) rather than being part of the URL (e.g. /students?type=json) 
  • Also, it's important to perform caching on the client side to avoid unnecessary content transfers and rendering. Clients should use the If-Modified-Since header, and the server may respond with 304 (Not Modified) if nothing has changed. 
  • In case of exceptions, the server may respond with a 4xx code (e.g. 404 Not Found) when the client request is erroneous, and 500 (Server Error) when something unexpected happens on the server. I prefer this approach to the payload-driven error handling implemented in the Facebook API.
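As a sketch of the If-Modified-Since point above, the server-side decision could look like this (resource names and data are illustrative):

```javascript
// Compare the client's cached timestamp with the resource's last change
// and decide between a 304 and a full 200 response.
function conditionalGet(resource, ifModifiedSince) {
  if (ifModifiedSince &&
      new Date(resource.lastModified) <= new Date(ifModifiedSince)) {
    return { status: 304 };  // client's copy is still fresh; no body sent
  }
  return {
    status: 200,
    headers: {
      'Last-Modified': resource.lastModified,  // client caches this for next time
      'Content-Type': 'application/json'
    },
    body: resource.data
  };
}

var student = {
  lastModified: 'Tue, 01 May 2012 10:00:00 GMT',
  data: { id: 1242, name: 'Jane' }
};

conditionalGet(student, 'Tue, 01 May 2012 10:00:00 GMT').status;  // 304
conditionalGet(student, null).status;                             // 200
```

The 304 path transfers no payload at all, which is exactly the saving in content transfer and rendering described above.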

Use of Content

It is easy to overlook this aspect, resulting in an unreadable, inconsistent REST API.
  • For example, it is important to agree on and document how the JSON entity structure should look for collections. Although I'm yet to use them, it could be a good idea to use 'JSON Schemas' to define and validate the agreed message formats. 
  • Also, it is important NOT to make the payload too verbose. For that matter, I discourage the use of the JSON structure to transfer binary content. Instead, the JSON response may contain hyperlinks to related documents, images, etc. 
  • Generally, developer code SDKs are very fine-grained. If these are directly exposed as REST APIs, the client-server communication can become chatty and expensive. It is recommended to use an abstraction layer on top of the SDK layer to adapt more coarse-grained services for the REST API.
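Putting the first two points together, one possible shape for a collection response is shown below. The envelope fields and URLs are illustrative, not a standard:

```javascript
// A consistent collection envelope: a count, the items, and hyperlinks
// instead of embedded binary content or inlined related entities.
var studentsResponse = {
  count: 2,
  items: [
    { id: 1242, name: 'Jane',
      photo: 'http://api.example.com/students/1242/photo' },  // a link, not bytes
    { id: 1243, name: 'John',
      photo: 'http://api.example.com/students/1243/photo' }
  ],
  next: 'http://api.example.com/students?page=2'  // paging link for the client
};
```

Whatever shape you pick, the point is to document it once and use it for every collection in the API, so clients never have to guess.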

Support evolution of your API

Versioning is always a debatable subject with services. Ideally, all APIs should be backward compatible, but unfortunately I'm yet to find that ideal project. In the REST way, an entity should have a permanent URL regardless of the version. But for previous versions of the API you could keep limited, short-lived URLs with a version number attached.

Permanent URL:
Deprecated/Beta URL:

Clients should always use the permanent URLs, but in case of a backward-compatibility breach, a client may temporarily indicate the API version in its HTTP request header. The server can then do routing/transfers to serve the client with the correct API version.
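As a sketch of that header-driven fallback, the header name, handler versions and entity shapes below are purely illustrative:

```javascript
// One handler per API version; the permanent URL stays the same, only an
// optional request header selects an older implementation.
var handlers = {
  '1': function (id) { return { id: id, name: 'Jane' }; },  // legacy shape
  '2': function (id) { return { id: id, firstName: 'Jane', lastName: 'Doe' }; }
};

function getStudent(id, headers) {
  // No header means the latest version; a version header routes to legacy.
  var version = (headers && headers['X-Api-Version']) || '2';
  return handlers[version](id);
}

getStudent(1242, {});                        // latest shape
getStudent(1242, { 'X-Api-Version': '1' });  // legacy shape during migration
```

This keeps the permanent URL stable while giving broken clients a temporary escape hatch until they migrate.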

Hopefully you are now convinced to spend some time agreeing on the basics of your protocol specification before jumping into implementation. As said above, REST is just an architectural style; you need to decide the protocol specs before you implement.