Friday, November 30, 2007

Why use LDAP?

It's a good question, and as LDAP proliferates across more systems, many people will ask it. It deserves a good answer, so here is my two cents' worth.

What we are really talking about is directory services, not just the protocol. Directories have some serious advantages over a DBMS. Databases are optimized for OLTP (online transaction processing), not for quick searches of information that is read frequently but updated rarely. In other words, a directory can deliver data very quickly on reads, but handles writes more slowly than a database.
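To make the read-optimized point concrete, here is a minimal sketch of a directory lookup using the Python ldap3 library. The server address, bind account, base DN, and attributes are assumptions made up for the example, not values from any real deployment.

from ldap3 import Server, Connection, SUBTREE, ALL

# Minimal directory read; host, DNs, and attributes are illustrative only.
server = Server("ldap://directory.example.com", get_info=ALL)
conn = Connection(server, user="cn=reader,dc=example,dc=com",
                  password="secret", auto_bind=True)

# A typical directory workload: a fast, indexed lookup by a common attribute.
conn.search(search_base="ou=People,dc=example,dc=com",
            search_filter="(mail=jdoe@example.com)",
            search_scope=SUBTREE,
            attributes=["cn", "telephoneNumber", "mail"])

for entry in conn.entries:
    print(entry.cn, entry.telephoneNumber)

conn.unbind()

The point is the shape of the workload: a single filtered search over indexed attributes, exactly what directories are tuned for, rather than a multi-table transactional update.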

Features and Benefits of Using LDAP

§ Cross-platform and industry standards-based (an important consideration for future growth and deployments)

§ Widely accepted standard for the Internet

§ Inexpensive: licensing is usually not based on the number of connections or clients, and open source directories are widely available. Replication and synchronization features are also usually built in, rather than requiring a separate license as is the case with many databases.

§ Replication and synchronization are straightforward compared to complex DBMS implementations with highly specific SQL scripting requirements.

§ ACIs (access control instructions) for delegated administration, so you can set up accounts with highly specific administrative functions for a group (e.g., one account may only be allowed to update phone numbers, another to add new entries with name, email, and phone number but not delete or modify existing entries); see the sketch after this list.

§ High performance: because directory data is stored hierarchically and optimized for reads, read performance can be much higher than with a DBMS, sometimes up to 10 times higher.
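As a rough sketch of the delegated-administration idea above, here is what a narrowly scoped update could look like from the client side with the Python ldap3 library. The helpdesk account, the DNs, and the server-side ACI that would enforce the restriction are all assumptions for illustration; real ACI syntax varies by directory product.

from ldap3 import Server, Connection, MODIFY_REPLACE

# Hypothetical delegated-admin account that server-side ACIs (assumed to be
# configured on the directory) only allow to modify telephoneNumber.
server = Server("ldap://directory.example.com")
conn = Connection(server,
                  user="uid=helpdesk,ou=ServiceAccounts,dc=example,dc=com",
                  password="secret", auto_bind=True)

# Allowed by the assumed ACI: updating a phone number.
conn.modify("uid=jdoe,ou=People,dc=example,dc=com",
            {"telephoneNumber": [(MODIFY_REPLACE, ["+1 555 0100"])]})
print(conn.result["description"])   # expected: "success"

# Denied by the assumed ACI: touching any other attribute.
conn.modify("uid=jdoe,ou=People,dc=example,dc=com",
            {"mail": [(MODIFY_REPLACE, ["new.address@example.com"])]})
print(conn.result["description"])   # expected: "insufficientAccessRights"

conn.unbind()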

Sample Use Cases

The following is a short list of common uses of directory services. These data profiles are fairly static and do not have deep relationships, so they can be stored as relatively “flat” trees.

§ Phone / Address book

§ Infrastructure resource lists (IP addresses, etc.)

§ Public Certificates

§ User credentials, groups, and roles (for authentication/authorization); a minimal authentication sketch follows this list
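Here is a minimal sketch of the classic two-step LDAP authentication flow for that last use case: search for the user's DN with a service account, then bind as that DN with the supplied password. The server, service account, base DN, and attribute names are assumptions, and in real code the username should be escaped (e.g. with ldap3.utils.conv.escape_filter_chars) before being placed in the filter.

from ldap3 import Server, Connection

def authenticate(username, password):
    server = Server("ldap://directory.example.com")

    # Step 1: find the user's DN with a (hypothetical) search account.
    svc = Connection(server, user="cn=search,dc=example,dc=com",
                     password="svc-secret", auto_bind=True)
    svc.search("ou=People,dc=example,dc=com",
               f"(uid={username})", attributes=["cn"])
    if not svc.entries:
        svc.unbind()
        return False
    user_dn = svc.entries[0].entry_dn
    svc.unbind()

    # Step 2: the bind itself is the credential check.
    try:
        user_conn = Connection(server, user=user_dn,
                               password=password, auto_bind=True)
        user_conn.unbind()
        return True
    except Exception:
        return False

print(authenticate("jdoe", "password123"))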

Directories can also be more secure: credentials can be kept “locked”, unable to be read or copied by an outside source, which you cannot always do in a database. Directories are based on a hierarchical storage schema, a “tree” structure. Information that would be reachable bi-directionally in a database is not available that way in a directory. Items lower in the hierarchy can be read, but data higher in the hierarchy is not available to the client. So you could read a person's contact information, but not necessarily see what accounts she has, or the other people in a group she is part of. In a database, records are stored relationally, so if you can read a person in a group, you can read the group and theoretically see the records of everyone in the group if you have direct access to the tables. That is not true in a directory.
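To make the hierarchy point a bit more concrete, here is a hedged sketch of how a client's visibility is bounded by where it searches in the tree and by the server's access controls. The DNs, scopes, and the access-control behavior in the comments are illustrative assumptions, not the behavior of any specific directory product.

from ldap3 import Server, Connection, BASE, SUBTREE

server = Server("ldap://directory.example.com")
conn = Connection(server,
                  user="uid=app,ou=ServiceAccounts,dc=example,dc=com",
                  password="secret", auto_bind=True)

# Reading one person's contact details: a narrow, base-scoped lookup.
conn.search("uid=jdoe,ou=People,dc=example,dc=com",
            "(objectClass=person)", search_scope=BASE,
            attributes=["cn", "mail", "telephoneNumber"])
print(conn.entries)

# Enumerating her group memberships is a separate subtree search under
# ou=Groups, which the server's access controls can simply deny for this
# account (assumed configuration).
conn.search("ou=Groups,dc=example,dc=com",
            "(member=uid=jdoe,ou=People,dc=example,dc=com)",
            search_scope=SUBTREE, attributes=["cn"])
print(conn.result["description"])   # may be "insufficientAccessRights"

conn.unbind()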

Wednesday, November 28, 2007

The 7th Annual eWEEK Excellence Awards: The Winners

RSA Access Manager, Novell Access Manager, and Radiant Logic RadiantOne VDS

Tuesday, November 20, 2007

IT Infrastructure model

Because everyone has a different idea of what infrastructure is, I think it is good to introduce a definition: IT infrastructure is the total set of components that enables applications to function. The following model can be used to visualize the various components.

Identity as Application Infrastructure: Evolution or Revolution?

Jackson Shaw discusses the Earl Perkins and Neil MacDonald conversation at Gartner's IAM event in November 2007. His concluding question is, "Do you think that virtualization might be the force that can overcome the inertia? Maybe, maybe." I think yes: it has all the components and the right approach to the problem.

The inertia Shaw refers to can be overcome when and if a market player brings enough value (in this case, enough of a revolution in how we operate in the IdM/SOA environment) to warrant a big move. With the market obsessed with acquisitions (and rightly so), correcting, consolidating, and making applications and services more standardized, we are not seeing a lot of innovation. Just as in other markets in the past (the calm before the storm) and in other revolutions (industrial, semiconductor, the web, etc.), I think we will see a large leap in technology again once the market is consolidated and strengthened. Only after the dust settles will we know what the real business needs are, and what problems we can solve at our current level of technology.

In the meantime, let's keep working towards integration and SOA concepts.

Friday, November 16, 2007

Single Sign-On beyond the firewall

SSO is only becoming a larger project, as seen with the growing interest and need for federated identity management (FIM).

http://www.infoq.com/news/2007/11/fim

Wednesday, November 14, 2007

The Future of IdM

"Everything you know about identity management is wrong"
A serious look at how IdM is now and what changes are coming; I think the analysis is 80% dead-on.

http://jacksonshaw.blogspot.com/2007/11/everything-you-know-about-identity.html

Tuesday, November 6, 2007

Information Fabric and SOA

A good article with good links to some past discussions on SOA and data virtualization / data abstraction.

Abstraction Layer (Data Virtualization) and SOA

A white paper that does a good job of explaining why data virtualization is important to successful SOA deployments. Planning ahead is key: if your organization is not implementing these principles, it will be left behind in a future of business where data is more agile and accessible for new services and applications. If you have problems getting this white paper, let me know and I will send it to you.

It is a bit light on technical explanation of the problems and might leave you feeling, "so what now?" The principles are good ones, but they need to be developed more. Pass along your questions and I can focus on the material that is most relevant to you.

Monday, November 5, 2007

The basics of identity management

Nice overview of what Federation brings to the table for the enterprise and how it can change the horizon of IdM. It is interesting to note that authentication is mentioned as the first hurdle to overcome before moving to a federated environment. This is one of the most difficult IdM services to implement and requires a lot of planning for the future.

If you implement a point solution for authentication, getting to any federated environment will be very difficult. Design with the future in mind, and make sure you implement the right solutions when you tackle authentication.

Friday, November 2, 2007

DIY software faults are expensive, says survey

This article started a series of thoughts in my head about why problems like these exist. The thoughts and issues came up like an eruption, so this posting is quite long.

If you work in one IT department long enough (which most of us do not), you will see the entire lifecycle of this problem. If you are more transient, you probably only see part of the story: the genesis (the perfect custom-coded solution; everyone loves you and things are great) or the demise (some jerk created this custom-coded band-aid and now we are stuck trying to fix it).

At inception the deployment makes sense for everyone; doing the work in house seems like the best choice based on your needs and gives you a competitive advantage over an out-of-the-box solution. As time passes, something goes wrong, the people who created the custom code have moved on to another job or position, and you are stuck trying to fix something that no one else really knows everything about. There is no product support number to call.

Custom coding is great: it gives you strong advantages over your competition and allows your organization to act according to your business logic instead of the least common denominator of your industry that an out-of-the-box solution usually gives you.

Don't look at the application level first; the key is in your infrastructure. Don't custom code something that should be handled by your infrastructure. Beef up your infrastructure capabilities so that it can adapt to new business needs and can be supported by professionals (who know the product inside and out) if there is a problem. Don't go this part alone! Keep custom coding to how you USE the data, not how it is delivered. If your infrastructure is flexible and adaptive, you will lose nothing and gain reliability, serviceability, and the ability to deploy new initiatives in a fraction of the time of your competitors.

Your infrastructure needs to support flexibility in some key areas (a small correlation sketch follows this list):

1) Data publishing - can you represent data from multiple sources as a single source?

If you can't get the data you need in one location, you will find yourself replicating data from multiple silos into a plethora of new repositories at every turn, re-assembling the data into a view that is optimal for your new service or application. This creates unnecessary complexity. Or you start to change the structure of your existing data silos. Bad idea: there is a reason these silos evolved the way they did. It is like messing with your DNA, or cutting off one of your arms so that you can fit through a new door. Changing the structure or existing integration points of your data silos can have repercussions that are not easily apparent. Legacy systems are often black boxes; you don't know what is really going on inside. Do you really want to start messing around with something whose nature you don't totally understand? This is where data virtualization can help: virtualize access to disparate data silos and recompose the data into a view that you can use for your client (data modeling).

2) Data access - can you access the data through multiple industry standards?

If your new piece of infrastructure is a directory, then you are stuck with LDAP. Sure, most applications are LDAP compliant for, say, authentication, but what about other applications that would benefit from SQL access, or web services that leverage SAML? Don't get pigeon-holed into one protocol; again, it's about flexibility.

3) Data availability - can you get to the data when you need it?

Again, custom coding doesn't usually allow for growth. Your infrastructure needs to be scalable and have features that can be added for increased performance as your needs grow. The best indicator of a useful tool is how often it is used. The same applies here: if you build something useful to your organization, people will find more uses for it. The better your solution is, the more demands will be made on it. You may be the hero this year, but next year, if your solution can't keep up, you will be the next picture in the break room pinned to the dart board.

4) Data integration - can you match disparate data objects accurately?

Integration... sigh... there is a lot to be said here. The problems look simple at first, and then, when it's time for the IT department to implement, gentle Dr. Jekyll turns into Mr. Hyde. Data is never as clean as you think it is. Make sure you plan ahead and your infrastructure can support your need to examine data and integrate it accurately. If you can't aggregate, correlate, and integrate data, then you can't leverage existing data and your efforts will be for naught. This is where virtualization plays a key role again. You have to be able to compare apples to apples, not apples to lawnmowers. Things have to be comparable to be able to compare them! Sounds dumb, but you would be surprised how often this is forgotten. Managers think that since it's data in our network, we can just use it any way we want; that's not true. Computers are still dumb: they don't know that TimothyPaul and TimPaul are the same person, so how can you expect them to recognize uid=9898A in a directory and user=TPaul in a database as the same person? If you can make this correlation between profiles, I can use my user list in SunOne and extend the entries with groups in Active Directory, or even authenticate against AD as if it were a single SunOne directory for web access control applications like TAM or SiteMinder.

5) Synchronization - can you propagate data across disparate data sources?

Here is another painful part of data delivery: synchronization. You have to be able to rely on data when you retrieve it. If solutions are not flexible (as most custom in-house solutions are not), then each time a system requirement changes, everything breaks down. Make sure you have a synchronization component in your infrastructure. If you can incorporate it into your other components, all the better; just make sure you can change your topology and add attributes into the system easily.

This has been a bit long, but the topic is huge: custom in-house solutions become the cause of large problems across an organization as they age. It's time to look at the future and build the solution at the infrastructure level, not try to fit new applications into a rigid enterprise by custom coding around the problem. You don't have to.
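As an illustration of the correlation problem in item 4, here is a minimal sketch in Python that joins a directory entry and a database row on a shared attribute (email) rather than on their unrelated native identifiers. The identifiers, attribute names, and values are all made up for the example.

# uid=9898A in a directory and user=TPaul in a database are linked by a
# shared attribute (email here), not by their unrelated native identifiers.
directory_entries = [
    {"uid": "9898A", "cn": "Timothy Paul", "mail": "tpaul@example.com"},
]

database_rows = [
    {"user": "TPaul", "email": "tpaul@example.com", "department": "Finance"},
]

def correlate(ldap_entries, db_rows):
    """Join the two sources on a normalized email address."""
    by_email = {row["email"].strip().lower(): row for row in db_rows}
    joined = []
    for entry in ldap_entries:
        row = by_email.get(entry["mail"].strip().lower())
        if row:
            # One "virtual" profile combining both sources.
            joined.append({**entry, **row})
    return joined

for profile in correlate(directory_entries, database_rows):
    print(profile)

In a real deployment, picking the join key and the normalization rules is exactly where the planning effort goes; a virtual directory can do this kind of correlation for you, but you still have to decide what makes two profiles "the same person."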

Ideas I got after reading an article about UNIX and distributed authentication

Some interesting information about using UNIX capabilities to get extended functionality from a distributed authentication environment, specifically having laptops cache login information in memory when disconnected from the network.

http://blogs.techrepublic.com.com/opensource/?p=127

I think the ideas here are more interesting than the implementation suggested. If you can cache the identifiers, why not do that across multiple LDAP stores (even from multiple security domains, domain controllers, and AD forests) into a single directory? You couldn't disconnect your laptop, but you could achieve reduced or single sign-on, especially for external applications. The identification step (the search) of the LDAP store would be much faster, and once you have the DN, the credential check would be sent back to the corresponding LDAP store. A rough sketch of that flow follows.
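Here is a hedged sketch of that two-step flow with the Python ldap3 library: identify the user in a single consolidated directory, then send the credential check back to whichever source store the entry came from. The consolidated directory, the source-tracking attributes, and all DNs are assumptions for illustration only.

from ldap3 import Server, Connection

# Assumed consolidated directory caching identifiers from several source
# stores, with hypothetical attributes naming each entry's source.
CONSOLIDATED = "ldap://consolidated.example.com"

def identify_and_check(username, password):
    # Step 1: fast identification against the single consolidated view.
    hub = Connection(Server(CONSOLIDATED),
                     user="cn=search,dc=example,dc=com",
                     password="svc-secret", auto_bind=True)
    hub.search("dc=example,dc=com", f"(uid={username})",
               attributes=["sourceServer", "sourceDN"])  # assumed attributes
    if not hub.entries:
        hub.unbind()
        return False
    entry = hub.entries[0]
    source_server = str(entry.sourceServer)
    source_dn = str(entry.sourceDN)
    hub.unbind()

    # Step 2: delegate the credential check to the original store (e.g. the
    # right AD domain controller) by binding there as the discovered DN.
    try:
        src = Connection(Server(source_server), user=source_dn,
                         password=password, auto_bind=True)
        src.unbind()
        return True
    except Exception:
        return False

print(identify_and_check("jdoe", "password123"))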

If I have confused you, just let me know... I will be happy to fill in more details of where my mind is going on this after reading this article today....
