Thursday, December 01, 2005

Protegrity: Database Encryption

Need to get more information about what is really going on and how this technology can help in data privacy initiatives.

Application Dependency Scanner

May be worth something to look at for basic applications. But my basic problem remains the dynamic dependencies added through configurable class names, etc., so it probably will not help. Another thing to look at!!

Wednesday, November 30, 2005

A Recipe for Newspaper Survival in the Internet Age

Lessons
  1. Some of the readers know more about the subject, so allow them to contribute.
  2. Others know less about the subject or may have a personal agenda. Let the "community" handle them via a "moderating system" (I think editors must also play a big role in moderation until the community is big enough to be self-regulating).
  3. Malicious or obscene content should not be a reason for not opening up to readers. Let moderation or the editors take care of it.
  4. Reader vs. advertiser - well, if readers make the medium trustworthy, in the long run you will have more revenue. Besides that, allow advertisers to reply to things.
  5. Go "local" and advertise locally.
  6. Think of the internet as the mainstream medium to grow on, since all the others are shrinking.

Thursday, November 17, 2005

Java XML Tech

JDOM
- http://www.jdom.org/

DOM4J
(Better) - http://www.dom4j.org/

STAX
- http://dev2dev.bea.com/xml/stax.html

JAXB
- http://java.sun.com/webservices/jaxb/
- https://jaxb.dev.java.net/

Castor
- http://www.castor.org/

XStream
- http://xstream.codehaus.org/

Jaxen
- http://jaxen.org/

Nux
- http://dsd.lbl.gov
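For a feel of the pull-parsing style StAX brings to this list (in contrast to the tree-building APIs like JDOM and DOM4J), a minimal sketch; the XML snippet and class name are made up:

import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

// Minimal StAX pull-parsing sketch: the application asks for the next
// event, unlike DOM/JDOM (whole tree in memory) or SAX (push callbacks).
public class StaxSketch {
    public static void main(String[] args) throws Exception {
        String xml = "<books><book title='Refactoring'/><book title='TAOCP'/></books>";
        XMLStreamReader reader = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "book".equals(reader.getLocalName())) {
                System.out.println(reader.getAttributeValue(null, "title"));
            }
        }
        reader.close();
    }
}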

James Strachan: Is Ajax gonna kill the web frameworks?

Good Discussion!! The major points seem to be:
  • When the client needs to receive remote events (obviously not by polling, since that adds burden on the server)
  • A very complex "windowing" GUI with a lot of local event generation, validation, etc., which can be too much for JavaScript, an interpreted (thus slower) language. - My interpretation
  • If the processing relies too much on business state/session containing sensitive data (which hence needs to be stored somewhere safe), and in the world of SOA there is no place to save it!! - My interpretation
  • Too much pain w.r.t. browser incompatibility and immature frameworks and tool support
  • In-house applications do not need them, since the customers are on a uniform platform.
  • Debugging JavaScript in the browser is terrible - but faster, since there is no compile step, and Firefox has good tools (I think)
    • JSEclipse - not good enough

    • Thoughts!!
      A browser synched with the latest version of Java.
      Standard browser APIs for accessing the web page DOM + object model.
      A Swing platform and layout manager compatible with HTML.
      That's JavaStart??

Monday, November 14, 2005

Cisco Moves Linksys into Small Business Market

Wow!! $62 per user! That seems way too high. What's with these service companies and their love of "per user" pricing? Why can't they charge on a "per channel" or "per connection" basis, along with a very basic maintenance fee? This would make much more sense for companies that would like to give service access to all their employees, when not all of them would be using the system at the same time.
This model basically seems applicable to all network-based services where not all the people are accessing the service all the time.

Friday, November 11, 2005

Trails: .8 Released

I am not sure whether this is better than the earlier products that I looked at.

Friday, November 04, 2005

The Evolving CIO's Technologies

  • "SOA" - anyhow, anywhere, anytime
  • Document Management
  • BPM/Workflow
  • Virtualization of OS, Storage & VPN, Wireless Network
  • Application streaming(??)
  • Opensource desktop
  • Grid

Thursday, November 03, 2005

AJAX Framework Comparison

Good Selection!! Need to revisit

WSSE 3.0

WSSE implements WS-*, the standards for interoperability: digital signatures, authentication/user tokens. WSE 3.0 makes security easier and integrates with WCF. It ships with 6 turnkey scenarios. Scenario 1:

Client (public certificate)                    Server (private key)
  - send encrypted message
  - send encrypted large key
  - send user/password encrypted with large key

Use a policy file to get it done.
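A minimal sketch of one reading of this scenario, done with plain Java JCE rather than WSE itself (class, payload, and key choices are mine): the "large key" is a symmetric key, wrapped with the certificate's public key so that only the private-key holder can recover it, and the user/password travels encrypted under that key.

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.security.KeyPair;
import java.security.KeyPairGenerator;

// Illustration of the turnkey scenario's hybrid encryption:
// the client wraps a symmetric "large key" with the server's public
// key, then encrypts the user/password payload with that large key.
public class HybridEncryptionSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in for the server's certificate key pair.
        KeyPair serverKeys = KeyPairGenerator.getInstance("RSA").generateKeyPair();

        // Client side: generate the "large" symmetric key.
        SecretKey largeKey = KeyGenerator.getInstance("AES").generateKey();

        // Wrap (encrypt) the symmetric key with the server's public key.
        Cipher rsa = Cipher.getInstance("RSA");
        rsa.init(Cipher.WRAP_MODE, serverKeys.getPublic());
        byte[] wrappedKey = rsa.wrap(largeKey);

        // Encrypt the actual payload (e.g. user/password) with the large key.
        Cipher aes = Cipher.getInstance("AES");
        aes.init(Cipher.ENCRYPT_MODE, largeKey);
        byte[] ciphertext = aes.doFinal("user:password".getBytes("UTF-8"));

        System.out.println("wrapped key: " + wrappedKey.length
                + " bytes, ciphertext: " + ciphertext.length + " bytes");
    }
}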
WCF - shipped with Vista. WSE 3.0 has wire-level interoperability with WCF.
-------------------------------------------------
| Secure | Reliable (new) | Tx |                |
-------------------------------------------------
| SOAP (message)               |      WSDL      |
-------------------------------------------------
| XML/XSD (data)                                |
-------------------------------------------------
| HTTP (transport) | TCP | Custom (UDP)         |
-------------------------------------------------
With 3.0, besides the basic ASMX services, a service can be hosted along with the website.

Bob Lee: Generating sequence diagrams with aspects

Indeed a very good use of the concept of aspects. I had not been able to understand the use of aspects in development and product code. Even the much-touted logging usage does not make sense, since it does not capture the business event, for which you have to write specific event information through the logging API directly.
But it seems aspects have found a good use in debugging and understanding an application's features.
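To make that concrete, a minimal AspectJ sketch of the kind of call tracing such a sequence-diagram generator needs; the com.example package is a placeholder:

import org.aspectj.lang.JoinPoint;

// Traces method entries and exits with indentation: essentially the
// raw material for generating a sequence diagram from a running app.
public aspect CallTracer {
    private int depth = 0;

    pointcut traced(): execution(* com.example..*.*(..)) && !within(CallTracer);

    before(): traced() {
        print("-> ", thisJoinPoint);
        depth++;
    }

    after(): traced() {
        depth--;
        print("<- ", thisJoinPoint);
    }

    private void print(String direction, JoinPoint jp) {
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < depth; i++) line.append("  ");
        line.append(direction).append(jp.getSignature().toShortString());
        System.out.println(line);
    }
}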

Wednesday, November 02, 2005

OctetString Engineer Says ‘Caching is Evil’

Ran into this comment, which took me to the above article. Besides the basic marketing stuff, the thought/comment did go into the core issue of why we cache.
The idea of a cache and cache management arises from the basic tussle between performance and data freshness. If you need better performance, you will go with a cache (well designed, with a good hit rate and a low miss rate), while if freshness is important, a cache may not be your cup of tea (unless it is designed so that updates flow into the cache from the data source).
With regard to that, the cache has its place in identity management for data for which the cache expiry or update rate is much higher than the rate of data staleness (like first name, last name, email id, contact information), while it is not so good vice versa, or when freshness of data overrides the performance requirements.
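To put the expiry side of that trade-off in code, a minimal TTL cache sketch (the names are mine, not from any identity management product): entries simply go stale after a fixed interval, trading freshness for performance.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal TTL cache sketch: entries expire after ttlMillis, so stale
// data can live at most that long before a fresh load is forced.
public class TtlCache<K, V> {
    private static final class Entry<V> {
        final V value;
        final long expiresAt;
        Entry(V value, long expiresAt) { this.value = value; this.expiresAt = expiresAt; }
    }

    private final Map<K, Entry<V>> map = new ConcurrentHashMap<>();
    private final long ttlMillis;

    public TtlCache(long ttlMillis) { this.ttlMillis = ttlMillis; }

    public void put(K key, V value) {
        map.put(key, new Entry<>(value, System.currentTimeMillis() + ttlMillis));
    }

    // Returns null on a miss or when the entry has gone stale.
    public V get(K key) {
        Entry<V> e = map.get(key);
        if (e == null) return null;
        if (System.currentTimeMillis() > e.expiresAt) {
            map.remove(key);
            return null;
        }
        return e.value;
    }
}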

Tuesday, November 01, 2005

Airlines Trying To Cut Out The Middlemen... Again

Is the middleman finally going out of business, especially if these service providers use eBay or Google Base to publish their data?

Oracle Hands Developers a Free, Open-Source Database

I did not see anything about open-sourcing the database.

Saturday, October 29, 2005

RIFE/Crud 1.0: CRUD scaffolding for RIFE released

Two promising technologies in two days, and both do not seem to be going the way I think this development process should move. This one comes closer to my idea that request processing is ultimately a workflow which uses components to get the work done. That idea comes through quite well in this technology. My issue is with reinventing the wheel. I would have really liked the product to use a workflow definition language for achieving the request flow and data flow. On an initial review it looks like it borrowed the idea from BEA/Apache Beehive (did I get it right?). But I would really have loved the idea of extracting the database schema (or the entity relationship diagram) and generating objects, or vice versa, automatically, and of drawing the request and data flow using a GUI instead of editing XML, in a workflow language instead of one developed from scratch.
The metadata about data constraints is fine, but it cannot be extended to the web interface. What needs to be displayed as editable or non-editable, and sorting decisions, are not business logic decisions (though they can be authorization decisions) and thus should not live with the bean definition. They are interface decisions and should be part of that!! This is where even I am stuck w.r.t. how to tie the workflow to the interface. What is the answer? But that is a separate topic...
The technology does look promising and can work as inspiration for other technologies...

How Many Times Should We Pay For Our Software?

The article just tells me that the market is maturing w.r.t. vendors getting ready. I have thought that the PC is most likely a temporary path to the next step, where people will move from accessing content through a personal medium to a shared medium (similar to the way the cable system evolved). With regard to this, at this point the market needs to figure out the model. I think the mediums would be hosting the software, and people may be ready to pay monthly rent for the service. Now the medium could be cable, phone/optical fiber, or a utility provider (maybe electricity, or who knows, the water utility).
Let's leave it at that and let the market figure it out!!

Google And IBM Team Up Search Technology

Interesting development!! I have always liked the idea of using Google Desktop Search as a corporate knowledge management tool, once some facility has been built to securely control and access the agents running on the individual desktops. I have had some thoughts on doing this for my own company but never got around to it.
Besides that, I guess it is a great way to capture the two ends of the information, i.e. databases and desktops. I am not sure whether the Google Search Appliance could not look into these databases, and hence Google has to depend on IBM for this type of data. Another thing this brings back to life is the issues people had with desktop search at the start, i.e. it brought out unwanted things from the system. I guess this goes to the idea of privacy and data access control, i.e. what is searchable and what's not.

Friday, October 28, 2005

Paranoid Penguin - Single Sign-on and the Corporate Directory, Part I

Now that was the quickest way to build the infrastructure, and the consultants are just sucking up the money for doing nothing :)
Guys, let's not build something, attach "identity management" to it, and tell the world we have solved the issue in one section. This article may be good for a small university or a small business. For anything more than that, SSO and "identity management" is a very big project which may run from 4 months to 3 years and needs a lot of things.

Microsoft's Vigilante Investigation of Zombies

This brings back the whole idea of:
  • if you leave the door of your house unlocked and somebody comes in, looks around, and leaves, is it punishable?
  • if you leave the door of your house unlocked and somebody comes in, drinks water from your tap (is that a good analogy for using a wireless access point for basic web surfing?), and leaves, is it punishable?
  • if you leave the door of your house unlocked intentionally and somebody comes in and is caught, is it punishable?

Attention podcasters

The idea of annotating pictures is not new, but doing that with video and audio!! Maybe that is how the next generation of search engines will be able to make sense out of these types of content, till we figure out a way to decipher an arbitrary piece of audio or video.

Artificial Scarcity, Garbage Collection and the Long Tail

Great Article!!

the anatomy of a standard

Redirect This!! Hmm... a money-making/analytics scheme! I do not think that is the way to get it done. It seems more like a browser feature which allows you to select content from a website and then blog it. This way the content website does not have to change, but at the same time the user is able to "grab" the content that is important to him.
Now the only way a third party can get involved in this process is by making sure that the user is not violating copyright by reprinting the information, which means it can provide the capability of generating "URLs" to address the content of interest instead of displaying the entire website, maybe by just selecting the stuff or running autogenerated Greasemonkey scripts on the website in the browser.

A Prescription for Novell's "Cold Realities"

This is where it becomes apparent that just having a great product set and a great relationship with developers does not help unless you get the message across well to your customers, who are more likely to be the CIO et al. That is, your sales and marketing!!

A me shaped hole in the web and other thoughts from Internet Identity Workshop 2005

hmm... identity noise (a great concept!! though morally questionable).
And the difference between enterprise and individual needs w.r.t. identity seems to come from the basic idea that an enterprise is an individual formed by a collection of individuals who have purposely chosen to relinquish some of their identity, culture, etc. to come together. So, even though in private or outside the enterprise an individual can practice and implement his beliefs (obviously under law), the enterprise has relinquished some of those beliefs and requirements for the greater good of the enterprise. I think I lost my chain of thought somewhere...

OpenToro Version 3.0 Released

First Thought -
Ahh!! Finally a product that my father could use (if packaged properly), probably with the OpenOffice database, to develop an application for his office. And then, reading through the tutorial, I think he would lose interest somewhere around editing XML.
Damn!! Will have to wait longer before Microsoft Access is off his machine!!

evolutionNext: "Inline XML in Java Code? WTF?"

Now things are really getting out of hand. The basic Java language has been very stable and has got the work done (maybe not always). These additional "features" which cater to the latest fads are not going to help the language. They will just make the language more bloated, and we will start running into issues like those with operator overloading in C++.
I really miss the simplicity of C!!

Friday, October 14, 2005

Symantec to unleash 'Big Brother' on the world

Hmm.. I am not sure how this system is helpful without the identity flowing with the request. Most applications connect to the database using proxy users, and auditing/monitoring based on that is not very useful, since it cannot be tied back to the user that is running the query. So we need the next generation of monitoring and auditing applications to be able to track the actual user identity. I would expect the application server providers and database providers to develop such technologies to audit and monitor the user id end-to-end. I am not sure how well network security by itself, or host security by itself, will be able to crack this market.
Let's see
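One hedged sketch of what such end-to-end identity tracking could look like at the JDBC level, using the client-info properties that JDBC 4 later standardized ("ClientUser" is one of its suggested property names; the URL and credentials here are hypothetical, and driver support varies):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLClientInfoException;

// Sketch of propagating the end-user identity over a pooled
// (proxy-user) connection so database auditing can see who
// actually ran the query, not just the shared application account.
public class IdentityPropagationSketch {
    public static Connection connectionFor(String endUser) throws Exception {
        // Hypothetical pool/proxy credentials.
        Connection con = DriverManager.getConnection(
                "jdbc:example://db:1521/app", "app_proxy", "secret");
        try {
            con.setClientInfo("ClientUser", endUser);
        } catch (SQLClientInfoException e) {
            // Driver does not support it; auditing falls back to the proxy user.
        }
        return con;
    }
}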

Thursday, October 13, 2005

SOA Maturity Mockery

As far as I remember, the CMM maturity model has nothing to do with how you achieve the level. Another important point is that, by default in the CMM model, everybody is on Level 1. So even though there are good points made by the author, I think he does not understand the concept of CMM and how it just assigns a level to the company based on an audit. A company does not have to go from Level 1 to Level 2; if there is a long-term strategy in place, the company can go to Level 3 directly, if it can prove that what it is doing is good enough for Level 3.

Free the Data

Hmm.. and lose the thing that makes money, and furthermore allow others to make money from it. Does not make a lot of sense, just like a lot of business models that did not make sense back in the dotcom days. Even then the idea was to build the service and set it free: they will come, like my cute little service, and start paying once I ask for the money. Well, we all know what happened to those services. We have to understand that this model does not work unless you are a very large company and the product/data you are giving away is not the core of your existence, or does not bring in any money. And that is why the products being open-sourced by big companies are products that have outlived their shelf life or are not making any money for the company.
That is why you need a syndication model in place. The content generators will syndicate the content and get paid for allowing others to get access to their data. The idea here is that content/data cannot be set free for long, because creation of data takes time and money. Any model that sets the data "free", or uses free data to build services, will always be in jeopardy. This is due to the fact that such ideas look brilliant during boom times, or until you have run out of the VC's money, and go down the drain as soon as the economy goes south.
This brings us to the question of why the almighty Google and other service providers like MSN and Yahoo are providing data for free. Well, we have to understand that Google is foremost in the business of pattern recognition, not in the content provider business. This pattern recognition business means they need to lure users with content, to track and find general patterns which can help them build a system that targets ads and premium content more precisely. The other portals have to provide their premium content free since Google is doing so, or maybe they are building the same structure behind the scenes. So who knows when we will run out of free data!!

IBM Offers Best Practices to Open Source Foundation

Seems like nothing more than making the theory free so that people will purchase tools to implement the theory. There are theories in computer science on software development which would help anybody, but without appropriate tools they are useless from a development point of view. I am not saying that the theory being given out is bad (since it has been "used" by half a million developers), just that theory without tools is as good as the concept of the Turing machine without modern computers.

Tuesday, October 11, 2005

Drools Project Joins JBoss

So finally the workflow engines and rules engines are coming together. I have been looking for an integrated workflow engine, rules engine, and interface engine for easy product development for some time now. SOA needs this interface engine for allowing users to interact with SOA services. Where is this interface engine, integrated with the workflow engine, going to come from? Are XForms or any other web frameworks an answer to this?
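As a toy illustration of what a rules engine embedded as one step of a workflow could look like (this is illustrative plumbing of my own, not the Drools API or any product's):

import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

// A workflow engine would invoke execute() as one step in a larger
// flow, passing the current process fact/context through the rules.
public class RuleStep<T> {
    public static final class Rule<F> {
        final Predicate<F> when;
        final Consumer<F> then;
        public Rule(Predicate<F> when, Consumer<F> then) {
            this.when = when;
            this.then = then;
        }
    }

    private final List<Rule<T>> rules;

    public RuleStep(List<Rule<T>> rules) { this.rules = rules; }

    public void execute(T fact) {
        for (Rule<T> rule : rules) {
            if (rule.when.test(fact)) {   // condition part of the rule
                rule.then.accept(fact);   // action part of the rule
            }
        }
    }
}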

Dan Farber on Web 2.0

The article got me thinking about the way content creation works on TV. I am not a TV history buff, so I may be wrong. The TV medium started out with only the big networks having the know-how and money to create content. This content was broadcast to the viewers. But as time passed and more people became adept in the content creation process, the idea of syndication was born, which allowed content creation to be separate from broadcasting. I think it is this idea that is one of the reasons keeping Google off the content creation wagon.
MSN and Yahoo may continue to be the content providers of the future, with content providers like people (columnists) and companies (big studios) syndicating content to them. Google will be the "public access channel" which allows users to create content and publish it to the world that would like to see it, along with targeted advertisements?

Experts give identity management advice

Points raised:
  1. Process and system integration are challenges.
  2. "Identity management is viewed to be the responsibility of employees in charge of physical security." This is totally against all my experience in the financial industry, where identity management is typically part of the Risk Management group, which coordinates with physical security and HR to develop and implement identity management solutions. But at the same time, HR is the golden data source in most places.
  3. "Get the background check process right," which is typically performed by HR during the on-boarding process.
  4. "One ID across the organization": mostly a dream everybody wants but nobody has (though there are instances where organizations have been able to achieve it, at least for employees if not for customers).
  5. "Biometrics is the key to solve duplication": but a biometric cannot be converted into an identifier. It is used as authentication data, not as an identifier.

Deploying SSO and biometrics in the race to put ou…

Problem Solved: SSO
Product Used: Imprivata OneSign (Reduced Sign On)
Plus points: Appliance, Profile builder, Integration with fingerprint authentication
Issues:
Integration with Citrix in version 2.6 (solved in 2.8)
A few minor issues:
1) A missing finger (that was required by the security policy) on one of the users.
2) Pressing the finger too hard on the device resulted in a poor fingerprint profile, making it useless for comparison.

Security: standards aren't enough

The basic point: web service security is not going to solve the security problem. I think everybody understands that WSS will solve authentication and authorization. For the rest of the things, like
  1. Validate your input
  2. Set size limits on your incoming data
  3. Ensure the attachments do not have any "viruses", etc.
you will be on your own, or will have to purchase XML firewalls. Another point: security services must be centralized. Again a continuing trend, which helps in consolidating administration and security analysis.
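A minimal sketch of the kind of "on your own" checks from that list; the size limit and the id pattern are invented for illustration:

import java.util.regex.Pattern;

// Validate the shape and size of incoming data before handing it
// to the service layer, since WSS itself will not do this for you.
public class InputGuard {
    private static final int MAX_PAYLOAD_BYTES = 64 * 1024;           // size limit
    private static final Pattern ORDER_ID = Pattern.compile("[A-Z0-9]{1,20}");

    public static void check(String orderId, byte[] payload) {
        if (payload.length > MAX_PAYLOAD_BYTES) {
            throw new IllegalArgumentException("payload too large");
        }
        if (!ORDER_ID.matcher(orderId).matches()) {                   // validate input
            throw new IllegalArgumentException("bad order id");
        }
    }
}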

You get what you pay for

I very much agree with the basic idea that, just like in any other country, the price to get a person to break the law depends on purchasing power parity and the salary of the person being bribed. In addition, the strength of law enforcement, and the tangible and intangible costs the person may pay, also help set the price of the bribe.
If the price a person has to pay is raised high enough, it is very much possible to increase the amount that would make a person amenable to breaking his/her contract. In order to ensure that the invoices keep coming, it is important for the company and the country (to which work is being outsourced) to develop the perception that they have taken adequate measures to raise the price of a breach of security.
Even though I am not a great supporter of the outsourcing business, I have worked with companies in India and with some of the large financial institutions (which are supposed to be the most secure) in the US. I think I have more faith in the measures implemented by the Indian companies than in their US counterparts. This could be because I may have worked with the best companies in India and not-so-good companies outside that country. So it would be a lot better to evaluate the company that you are outsourcing to, rather than go by the FUD generated by some people.

Wednesday, August 24, 2005

Symantec firewall Woes and solutions (Notes)

I was having some problems with the Symantec Client Firewall and Gaim. Basically only Yahoo was working properly; other protocols like Jabber and MSN Messenger were not working. I finally found the following link, which tells how to add ports 443 and 5222 (for talk.google.com) to the HTTP port list to get Gaim working with Google and MSN. It is very interesting that this issue arises specifically for Gaim, when MSN Messenger and Google Talk work without any problem.
Another issue I was running into was that the Symantec firewall user session (that appears in the system tray) was crashing during initialization after login to Windows. This was making control of the firewall very cumbersome. The Microsoft website did not give any specific reason for the crash. But today I noticed that after installing Google Desktop 2 over the previous Google Desktop, the firewall session did not crash.
When I tried to install the .NET Framework 1.1, the client firewall crashed along with some Nokia utilities (which were crashing previously), but the firewall seems to be stable after the restart.

Saturday, April 30, 2005

ERP: A thought!

This is way out of my league, since I have never worked with a single ERP product except to integrate it with IDM solutions, but the similarity was hard to miss. This came out of a brief discussion with my colleague, who was very adamant that an ERP implementation is about sitting down and rebuilding the company's processes around the product that was bought, instead of the ERP fitting into the processes already present. To me this is the same as IDM implementations, where the vendors/implementation engineers want companies to change their processes to fit the product they have purchased, rather than vice versa. I can understand that, as part of any IDM or ERP implementation, the business processes may be reviewed to make them more efficient, but that does not mean the implementation has to mimic the product design, because the products were not built to support the requirements.
Besides that, I think another issue with ERP was the non-existence of the service infrastructure that modern IT departments are building now. At the time ERP was being implemented, the product had to implement all the services like identity, access control, audit, transactions, workflow, and so on as part of the system. But now, with the move towards more abstract service definitions for things like identity, access control, and workflow, ERP systems can be lean, mean systems that integrate with the existing infrastructure, instead of monolithic applications that require so many consultants to get right the first time (because no ERP vendor is an expert in developing all these capabilities, they end up implementing each feature the way they think is right).
Assuming this premise is correct, ERP implementations happened before their time. If companies continue on their path to build more abstract services, in the next decade ERP systems may become very thin orchestration engines that tie all these services together. To stay relevant in that scenario, these systems may have to become more audit- and compliance-driven, using BI technology (and input from humans) to fine-tune the orchestration without breaking any laws, while trying to achieve the mission for their users.

Thursday, April 14, 2005

OS Installation on VMWare 5.0.0: Notes

I started working with VMware and found it to be quite cool. As the first steps I installed Solaris 10 and SUSE Enterprise Linux 9. The basic issues that I faced were:
  • Solaris 10 - All the CDs have to be installed right after the first reboot, or else the system may end up in a weird state. For example, I installed the first CD and then allowed it to reboot with the first CD in the drive; in that case it did not ask for any other CDs, and I could only log in through the console, since X11 was not installed. Then I tried installing CDs 2 & 3 using the installer present in the root directory of the CD, at which point everything was installed. After this I rebooted the server and started X11, but I began having problems creating directories in /home and could not load the CD (it failed with the error "device is already mounted or is busy"). After that I completely reinstalled the OS and made sure I swapped in the remaining CDs after the first one was installed, so that all the CDs were installed before going through another reboot. This seems to have done the trick, and the server came up fine with no major issues. The remaining problem is that the server does not get its hostname assigned (because by default it expects it to be provided by DHCP), even after setting it in hostname.pc0 (the interface is named pc0). Need to look more at that!! Some links:
    • Basics
    • Some pointers
  • SUSE Enterprise Linux 9 - The basic issue that I ran into was that during installation VMware's "graphics card" was not detected properly, and the installation ran in text mode. The strange thing about the text mode was that it displayed four separate screens, two of which were updated, which made reading the screen very difficult.
Will keep adding stuff as I run into other issues.

Tuesday, February 15, 2005

CPR of Software performance

While reading one of the articles on chip design, I found that the ideas of Cache, Prediction, and Replication were used to bump up the performance of chips. This is very interesting, because most CRUD (create, read, update, delete) applications also use the same concepts (except prediction) to boost their performance. We cache the data coming from databases, and we replicate the code across multiple servers to provide scalability. But prediction is something that I have not seen in any product I have worked with!! It has bothered me for some time: why is prediction not part of the JDBC implementation, or part of typical object/relational mapping tools (need to check whether it is present in Hibernate)? I understand that even a basic implementation is going to be very complicated, because we need the basic capability of finding patterns using both queries and data, but this is a step that has to be taken to boost application performance. I really need to find more information on any available product that already does this!!
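As a toy sketch of what such prediction could look like when layered over an ordinary cache (purely illustrative, not from JDBC, Hibernate, or any product): remember which key tends to follow which, and prefetch the predicted successor in the background.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Function;

// Learns a crude "key B usually follows key A" pattern and warms the
// cache with the predicted next lookup before it is actually requested.
public class PredictiveCache<K, V> {
    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final Map<K, K> likelyNext = new ConcurrentHashMap<>();
    private final Function<K, V> loader;            // e.g. a database lookup
    private final ExecutorService prefetcher = Executors.newSingleThreadExecutor();
    private volatile K lastKey;

    public PredictiveCache(Function<K, V> loader) { this.loader = loader; }

    public V get(K key) {
        if (lastKey != null) likelyNext.put(lastKey, key);  // learn the pattern
        lastKey = key;
        V value = cache.computeIfAbsent(key, loader);
        K predicted = likelyNext.get(key);
        if (predicted != null) {                            // prefetch prediction
            prefetcher.submit(() -> cache.computeIfAbsent(predicted, loader));
        }
        return value;
    }
}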

Tuesday, February 01, 2005

Some thoughts on the UML and MDA!!

So I am thinking of starting into the world of MDA and UML 2.0, which is supposed to help you achieve all the benefits of MDA. But before I go in that direction, I wanted to put down what it is that I am trying to solve when I start developing an application for a business model. Based on my limited understanding of the applications I have worked with, there are the following things that I should be able to represent in a modelling system.
  • Object Template (Class) Representation - In order to start modelling a system, we need to be able to model the entities of the system. These entities (in an object-oriented world) will have data/properties and methods. But I think we need to extend this list as follows:
    • Properties: Just like in the object-oriented world, these represent the data that actually makes an object the representation of an entity in the real world.
    • Methods: These represent the actions that can be performed on the object.
    • Constraints: Most of the time this is something built as part of the implementation, but I think we really need to allow people to put the constraints where the mouth is, i.e. at the design level (see the sketch after this list). This will ensure that, if required, these constraints can be enforced by the compiler or by the generated code, either at the object definition level or, better still, extrapolated to the user/external input validation step. These constraints are contextual and thus have to be specified at the level where that context is complete, which can be the object template level. They could cover object creation, data validation, data dependency (like you cannot have an interest value without an interest rate and a principal amount), method invocation, object deletion, and so on (TODO: need to complete this list).
    • Relationships: I somehow feel that the UML class diagram has not done enough justice to providing an efficient representation of relationships. Typically, relationships are represented either as an object template/class with pointers to the associated object template(s), or as a line with relationship names and additional information about the relationship type (many-to-one, one-to-one, etc.), but you cannot have both features at the same time (or can you?). Another thing that I see missing is the definition of reverse relationships, like father and son. Most of the time the relationship gets defined as an "is a" relationship, so the design creates a class for the relationship, and later a complete redesign is needed if the reverse relationship comes into the picture. If instead the relationship could be defined as a separate entity that binds at least two objects, it would automatically take on the concept of a "role", which is less restrictive in the sense that it can be assigned to or removed from the base objects as the system evolves. This idea will help the system representation grow more organically as the components become clear, and allows us to avoid some basic design flaws during the design phase. These relationship definitions would obviously have associated properties, methods, and constraints (for relationship creation, relationship deletion, and relationship enforcement, to name a few).
  • Use Case Definition (or getting the work out of the system) - This defines the work that we want to extract out of the system, and how the system components work with each other to achieve it. There are a few things that I think are missing from UML:
    • Object selection constraint/rule definition: The activity or sequence diagrams are used to describe the interaction between the various object instances during the performance of the use case. When we make a call from one object instance to another, we need to ensure that we are sending the message to, or calling the method on, the correct object. This information cannot be built into the UML diagram, since the nearest thing present in the diagram is the descriptive name of the object. We need to be able to extend the model to specify a constraint/rule describing the object selection. There is an interesting consequence to that: when we describe such a rule, it will have input variables that have to be fulfilled by either the calling object or the environment. In case they have to be satisfied by the environment, that automatically brings the concept of a session into the picture, which is basically the minimum set of attributes that uniquely identifies a transaction/process (if the session is not being used as a cache, as most people end up using it).
    • Business logic: Most business logic can be divided into two types: business process and business rule. The business process defines a series of steps that need to be performed, involving interaction with external entities like a person, a web service, a database, or other application modules. This is represented very well by the sequence diagram (and its extensions). But not all aspects are represented well through activity diagrams. For example, they do not allow you to specify which inputs are needed from the users, or how to express the input interface to the user in a platform- or technology-independent way like XUL (though not the best example), or maybe a new standard. This requires that the interface design itself be part of the business logic, expressible in a technology-independent format which can then be used to generate platform-specific code during the next stage.
      Business rules, on the other hand, represent basic business algorithms, like interest rate calculation or user access control calculation, which can be evaluated once all the inputs are available. This piece of the business logic is typically missing from UML diagrams (is it?).
  • Deployment Definition - An important part of the architecture is the decision of how to deploy the model that has been defined. The platform and technology used to deploy an object will decide how the interface gets designed. For example, if the event trigger comes from a user and is deployed over the web, then the interface will be an ASP.NET or J2EE (JSP) implementation, while if it is implemented in a client-server environment, the client may have to be a Swing or Windows application. So it is at the deployment level where we have to make the technology choices, and the next step would be code generation!! Once we have such a platform with automatic code generation capabilities, it is not very far-fetched to see large computers being able to simulate, using the hardware and software specifications (like how many CPU instructions a specific kernel call takes), whether a specific deployment is the best implementation in terms of throughput, response time, and other QoS parameters, before going into production on a specific architecture (I really would like to see that in my lifetime), just as we are able to do for chip designs.
  • Aspects - An important part of the architecture is the basic generic services that are shared by most or all modules, which may or may not be part of the original system models. These include services like logging and measurement (error/performance monitoring, reporting, and correction), audit, security (authentication, authorization, privacy), performance enhancers (cache, replication, prediction), middleware, and so on. These are nowadays termed aspects (am I correct?), which are supposed to be integrated with the business logic but do not form an integral part of it. It is very important to be able to integrate the aspects into the business logic at the deployment level (so that we can change them based on technology constraints, policy constraints, and so on), and they will vary based on the deployment configuration selected. For example, if an interface is deployed as client-server in an NT environment, it may make more sense to use Windows-based Kerberos security for authentication, versus token-based authentication for a web-based application.
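As promised in the constraints item above, a minimal sketch of a design-level constraint enforced by one piece of shared code; the @Positive annotation and the Loan class are made up for illustration:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

// Declare a data constraint once, at the "design level" of the class,
// and let one generic validator enforce it at object creation or at
// the external input boundary.
public class ConstraintSketch {
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    @interface Positive {}

    static class Loan {
        @Positive double principal;
        @Positive double interestRate;
        Loan(double principal, double interestRate) {
            this.principal = principal;
            this.interestRate = interestRate;
        }
    }

    // One shared validator, reusable across layers.
    static void validate(Object o) throws IllegalAccessException {
        for (Field f : o.getClass().getDeclaredFields()) {
            if (f.isAnnotationPresent(Positive.class)) {
                f.setAccessible(true);
                if (f.getDouble(o) <= 0) {
                    throw new IllegalArgumentException(f.getName() + " must be positive");
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        validate(new Loan(1000.0, 0.05));        // passes
        try {
            validate(new Loan(1000.0, -1.0));    // violates the constraint
        } catch (IllegalArgumentException expected) {
            System.out.println("rejected: " + expected.getMessage());
        }
    }
}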
Most of these thoughts may just reflect my not understanding the capabilities of the UML diagrams, and maybe it is me that needs to catch up rather than vice versa. So let me begin my quest for the same...

Thursday, January 27, 2005

3 Dimensional Alert!!

Yesterday I finally posted a letter that had been waiting for a very long time. That triggered the use case of a GPS-enabled phone/PDA that allows you to set appointments in 3 dimensions (I am not sure about the height, though). The idea would be that you have your favorite destinations, like your home, office, and favorite store, stored during configuration. Then when you need to get milk, you can set up an alert with "milk" and the location. On your daily trip back home, it will trigger as soon as you get near the store.
Another important addition I can think of is that the precision factor should be configurable for the location, along with an associated quiet time. So if you are within about 10 feet of the grocery store, it will remind you; but if you are in the office on the 10th floor and there is a letter box nearby, it is not going to bother you every time you walk over to the window.
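A toy sketch of that trigger logic, with a configurable radius and quiet time; all the names and the haversine-based distance helper are my own illustration:

// Fires when within a configurable radius of a saved spot, then goes
// quiet for a while so it does not nag on every GPS fix.
public class LocationAlert {
    private final double lat, lon, radiusMeters;
    private final long quietMillis;
    private long lastFired = 0;

    public LocationAlert(double lat, double lon, double radiusMeters, long quietMillis) {
        this.lat = lat; this.lon = lon;
        this.radiusMeters = radiusMeters; this.quietMillis = quietMillis;
    }

    public boolean shouldFire(double curLat, double curLon, long nowMillis) {
        if (nowMillis - lastFired < quietMillis) return false;      // quiet time
        if (distanceMeters(lat, lon, curLat, curLon) > radiusMeters) return false;
        lastFired = nowMillis;
        return true;
    }

    // Great-circle distance via the haversine formula.
    static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double r = 6371000;                                         // Earth radius, meters
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * r * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }
}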