Wednesday, December 12, 2007

LDAP support via sudo in UNIX

An enterprising blogger explains how you no longer need to be limited in how UNIX users are managed, such as storing user accounts in flat files. Using sudo lets the administrator utilize a directory service for security (i.e. authentication/authorization). There are other solutions, but this one is relatively easy and cheap, since sudo is open source and the functionality is built in.

http://breakablelinux.blogspot.com/2007/12/linux-authentication-and-authorization.html
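For readers who want to see roughly what this looks like in practice, here is a minimal, hypothetical Python sketch (the host, base DN, and directory layout are my assumptions, not from the post) that queries the kind of sudoRole entries sudo's LDAP support consults:

```python
from ldap3 import Server, Connection, SUBTREE

# Connect to the directory holding the sudoers container (names are placeholders).
conn = Connection(Server("ldap://directory.example.com"), auto_bind=True)

# sudo's LDAP backend looks for sudoRole entries; this mimics that lookup for one user.
conn.search(search_base="ou=SUDOers,dc=example,dc=com",
            search_filter="(&(objectClass=sudoRole)(|(sudoUser=tpaul)(sudoUser=ALL)))",
            search_scope=SUBTREE,
            attributes=["cn", "sudoHost", "sudoCommand", "sudoRunAsUser"])

for role in conn.entries:
    print(role.cn, "may run", role.sudoCommand, "on", role.sudoHost)
```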

Thursday, December 6, 2007

Solving the privacy puzzle in a federated identity model

In this article, Rosie Lombardi contrasts virtual directories and meta-directories as the central access point options for creating a federated environment for consolidated authentication via the web. It is a simple overview, but it raises some good points of discussion: how do governments establish a way for people to gain information and access to services across agencies, states, and other governing systems?

Unique identifier or FIM?

She quotes Temoshok at the GSA: "I don't want to simplify too much, but governments have two basic choices for this: a national ID or federated identity management system." So, here are the questions that arise.

>> If you have FIM, how do the silos correlate identities without the national ID# to act as the unique identifier?

>> If you have a national ID, how do you facilitate the data sharing? Where do you verify that the national ID is valid? What if the person in question is using the same national ID# under different aliases? You need a silo to set up the "master" national ID list, creating a huge repository of all your citizens. After the initial verification, what about continued information exchange? What if the person moves and the address is not updated? What about national security concerns?

I see virtual directories as offering more to solving these problems than just being easier and less expensive to deploy (the only benefit asserted by James Quin, senior analyst at Info-Tech Research Group, in the article). A virtual directory solution can be used to solve the correlation problem between identities (without the pesky national ID#), impose policy (logic) to alert administrators of suspicious activity (i.e. the same ID# being used under several different names/aliases), and update (synchronize) information across systems, such as new phone numbers, address changes, or other contact information.

Quin also brings up the question of security. He feels that one meta-directory is more secure (although admittedly the most expensive and complicated to deploy) because there is only one point of failure. With the virtual directory solution, he says, there are multiple points of failure: the virtual directory and all connected sources. How is this not true of the meta-directory system, unless you are planning not to synchronize? And if you don't synchronize, how do you expect to keep the information current? Or perhaps you are not using the virtual directory as the point of access, and are securing the underlying sources behind firewalls, etc.?

"But the virtual directory approach means personal information about citizens resides in many government systems and servers in redundant and potentially inaccurate forms." Here we see a lack of understanding of the functions and features of virtual directories. Virtual directories are perfectly able to perform synchronization services, correlation, identity aggregation, directory replication, and more; they are designed for exactly this problem.

I don't see a downside to virtual directories, I just don't. The more I learn and the more I use virtual directories to solve these problems, the more I love them! Virtual directories can be made just as secure as a metadirectory using SAML, SSL, and ACIs (provided you have a good directory service attached to the front end).

http://www.intergovworld.com/article/abc978260a01040800129dda8cb5dba1/pg1.htm
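As a toy illustration of the policy logic described above (all names, identifiers, and records here are made up), a minimal Python sketch that flags a national ID# appearing under different names across agency silos might look like this:

```python
# Hypothetical records pulled from two agency silos (names and fields are invented).
agency_a = [
    {"national_id": "123-45-6789", "name": "Timothy Paul"},
    {"national_id": "987-65-4321", "name": "Jane Smith"},
]
agency_b = [
    {"national_id": "123-45-6789", "name": "Tim Paulson"},   # same ID, different name
    {"national_id": "555-11-2222", "name": "Carlos Ruiz"},
]

def flag_suspicious_ids(*sources):
    """Collect every name seen per national ID and report IDs with conflicting names."""
    seen = {}
    for source in sources:
        for record in source:
            seen.setdefault(record["national_id"], set()).add(record["name"])
    return {nid: names for nid, names in seen.items() if len(names) > 1}

for nid, names in flag_suspicious_ids(agency_a, agency_b).items():
    print(f"ALERT: national ID {nid} appears under multiple names: {sorted(names)}")
```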

Monday, December 3, 2007

Logical Data Models for SOA Information Exchange

See what some say is the major roadblock to SOA deployments and why it doesn't have to be so hard to solve: you need to think hierarchy, data modeling, object classes, and abstraction (flexibility). If you are used to working with directories, and even more so virtual directories, you will have a leg up on understanding these concepts and how they are useful in simplifying the issues in SOA deployments. It doesn't have to be that bad, REALLY!

SOA in the IdM

Here is a definition of SOA given in the article found at http://www.itbusinessedge.com/item/?ci=23055 - it's a short, understandable definition:
"SOA. Service-oriented architecture refers to a paradigm that focuses on how you maximize the sharing, reuse and interoperability of distributed corporate resources across your network. And to maximize sharing, reuse, etc., you need a universal middleware environment, an integration fabric, a set of standards. So that comes down to things like the Web services standards." I would add LDAP and SQL to these standards. Don't keep this idea only in the world of external users, the internet, or even the intranet; use these concepts at the data integration level inside your IT deployments, especially in the IdM services space....

Virtual Directories are one such middleware component that can accomplish this. SOA is here if you want it, or you can wait until the vendors catch up and start helping you understand how to use their products....
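To make the middleware idea concrete, here is a minimal, hypothetical sketch (the server names, DNs, and table layout are my assumptions, not from the article) of a single lookup API that answers from either an LDAP directory or a SQL database, so callers never care where the profile actually lives:

```python
import sqlite3
from ldap3 import Server, Connection

def get_user_from_ldap(uid):
    """Look up a user profile over LDAP (host and base DN are placeholders)."""
    conn = Connection(Server("ldap://directory.example.com"), auto_bind=True)
    # Note: a real implementation would escape uid before building the filter.
    conn.search("ou=people,dc=example,dc=com",
                f"(uid={uid})", attributes=["cn", "mail", "telephoneNumber"])
    return dict(conn.entries[0].entry_attributes_as_dict) if conn.entries else None

def get_user_from_sql(uid):
    """Look up the same profile shape from a relational table (schema is assumed)."""
    db = sqlite3.connect("hr.db")
    row = db.execute(
        "SELECT cn, mail, telephone FROM employees WHERE uid = ?", (uid,)
    ).fetchone()
    return {"cn": [row[0]], "mail": [row[1]], "telephoneNumber": [row[2]]} if row else None

def get_user(uid):
    """The 'service' interface: callers get one answer regardless of the backend."""
    return get_user_from_ldap(uid) or get_user_from_sql(uid)
```

The design point is that new applications code against get_user(), and the backends can be swapped, consolidated, or virtualized later without touching the callers.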

Security and Data Management

Are Identity Management and Data Management starting to overlap in your mind? Then maybe it's because you have dug into the topic deep enough to see the problems, or maybe you are just losing sight of where the lines are. Certainly some of the issues are the same and have the same solutions, so where is the future of IdM?

Friday, November 30, 2007

Why use LDAP?

Why LDAP?

The question is a good one, and I think that as LDAP proliferates across more systems, many people will have the same question. It deserves a good answer, so here is my two cents' worth.

What we are really talking about is directory services, not just the protocol. Directories have some serious advantages over a DBMS. Databases are optimized for OLTP (online transaction processing), but not for performing quick searches of information that is frequently used but not updated constantly. In other words, directories can deliver data very quickly (reads) compared to a DB, but handle updates (writes) more slowly in comparison to databases.

Features and Benefits using LDAP

§ Cross-platform functionality and industry standards-based (important consideration for future growth and deployments)

§ Widely accepted standard for the Internet

§ Inexpensive, since licensing is usually not based on the number of connections or clients, and open-source directories are widely available. Also, replication and synchronization features are usually built in rather than requiring a separate license, as is the case for many databases.

§ Replication and synchronization are easy compared to complex DBMS implementations with highly specific SQL script requirements.

§ ACIs for delegated administration, so you can set up accounts that are highly specific in what administration functions a group has (e.g. one account may only allow phone numbers to be updated, another may allow new objects (name, email, phone number) to be inserted but not allow objects to be deleted or existing objects to be modified); a sketch of a delegated update follows this list.

§ High performance: since directory data is stored hierarchically, you get very high read performance compared to a DBMS, sometimes up to 10 times higher.
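Here is a minimal, hypothetical sketch (DNs, credentials, and the ACI policy are my assumptions) of what delegated administration looks like from the client side: a "phone admin" account can replace a telephone number, while the server-side ACI would reject any other change from that account:

```python
from ldap3 import Server, Connection, MODIFY_REPLACE

# Bind as a delegated "phone admin" account (DN and password are placeholders).
conn = Connection(Server("ldap://directory.example.com"),
                  user="uid=phoneadmin,ou=admins,dc=example,dc=com",
                  password="secret",
                  auto_bind=True)

# Allowed by the (hypothetical) ACI: replace a user's telephone number.
conn.modify("uid=tpaul,ou=people,dc=example,dc=com",
            {"telephoneNumber": [(MODIFY_REPLACE, ["+1 555 0100"])]})
print(conn.result["description"])   # expect "success"

# Outside the delegated scope: the same account trying to change mail
# would typically be refused by the directory ("insufficientAccessRights").
conn.modify("uid=tpaul,ou=people,dc=example,dc=com",
            {"mail": [(MODIFY_REPLACE, ["new@example.com"])]})
print(conn.result["description"])
```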

Sample Use Cases.

The following is a short list of common uses of directory services, since these data profiles are fairly static and do not have deep relationships; they are stored as relatively “flat” trees.

§ Phone / Address book

§ Infrastructure Resource List (IP addresses, etc.)

§ Public Certificates

§ User credentials, groups, roles (for authentication/ authorization)

Directories are also more secure and can keep credentials “locked”, unable to be read or copied by an outside source, something you cannot do in a database. Directories are based on a hierarchical storage schema, a “tree” structure. Information that would be available bi-directionally in a database is not available in this manner in a directory. Items that are lower in the hierarchy can be read, but data higher in the hierarchy is not available to the client. So you could read a person’s contact information, but not necessarily see what accounts she has, or the other people in a group that she is a part of. In a database, records are stored relationally, so if you can read a person in a group, you can read the group and theoretically see the records of everyone in the group if you have direct access to the tables; this is not true in a directory.
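As a small illustration of that hierarchical scoping (the host, base DN, and attributes are hypothetical), a client bound with a limited account can search a phonebook subtree and read contact details, but nothing above or outside the base it was given:

```python
from ldap3 import Server, Connection, SUBTREE

# Bind with a low-privilege account (all names here are placeholders).
conn = Connection(Server("ldap://directory.example.com"),
                  user="uid=phonebook,ou=apps,dc=example,dc=com",
                  password="secret",
                  auto_bind=True)

# The search is scoped to the people subtree; entries above this base
# (groups, accounts, other branches of the tree) are simply not in view.
conn.search(search_base="ou=people,dc=example,dc=com",
            search_filter="(sn=Paul)",
            search_scope=SUBTREE,
            attributes=["cn", "telephoneNumber", "mail"])

for entry in conn.entries:
    print(entry.entry_dn, entry.telephoneNumber)
```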

Wednesday, November 28, 2007

The 7th Annual eWEEK Excellence Awards: The Winners

RSA Access Manager
Novell Access Manager
Radiant Logic RadiantOne VDS

Tuesday, November 20, 2007

IT Infrastructure model

Because everyone has a different idea of what infrastructure is, I think it is good to introduce a definition of IT infrastructure: IT infrastructure is the total set of components that enables applications to function. The following model can be used to visualize the various components.

Identity as Application Infrastructure: Evolution or Revolution?

Jackson Shaw discusses Earl Perkins and Neil MacDonald's discussion at Gartner's IAM event, 11-07. His concluding question is "Do you think that virtualization might be the force that can overcome the inertia? Maybe, maybe." I think yes: it has all the components and the right approach to the problem.

The inertia Shaw refers to can be overcome when and if a market player brings high enough value (in this case, enough of a revolution in how we operate in the IdM / SOA environment) to warrant a big move. With the market obsessed only with acquisitions (and rightly so), correcting, consolidating, and making applications and services more standardized, we are not seeing a lot of innovation. Just as in other markets in the past (the calm before the storm) and in other revolutions (industrial, semiconductor, WWW, etc.), I think we will see a large improvement in technology again once the market is consolidated and strengthened. Only after the dust settles will we know what the real business needs are and what problems we can solve at our current level of technology.

In the meantime, let's keep working towards integration and SOA concepts.

Friday, November 16, 2007

Single Sign-On beyond the firewall

SSO is only becoming a larger project, as seen with the growing interest in and need for federated identity management (FIM).

http://www.infoq.com/news/2007/11/fim

Wednesday, November 14, 2007

The Future of IdM

"Everything you know about identity management is wrong"
Take a serious look at how IdM is now and what changes are coming; I think the analysis is 80% dead-on.

http://jacksonshaw.blogspot.com/2007/11/everything-you-know-about-identity.html

Tuesday, November 6, 2007

Information Fabric and SOA

Good article with good links to some past discussions on SOA and data virtualization / data abstraction.

Abstraction Layer (Data Virtualization) and SOA

White paper that does a good job of explaining why data virtualization is important to successful SOA deployments. Planning ahead is key: if your organization is not implementing these principles, it will be left behind in a future where data must be more agile and accessible for new services and applications. If you have problems getting this whitepaper, let me know and I will send it to you.

It is a bit light on technical explanation of the problems and might leave you feeling like, "so what now?" The principles are good ones, but they need to be developed more... pass along your questions and I can focus on the material that is most relevant to you...

Monday, November 5, 2007

The basics of identity management

Nice overview of what Federation brings to the table for the enterprise and how it can change the horizon of IdM. It is interesting to note that authentication is mentioned as the first hurdle to overcome before moving to a federated environment. This is one of the most difficult IdM services to implement and requires a lot of planning for the future.

If you implement a point solution for solving authentication, getting to any Federated environment will be very difficult. Design with the future in mind, make sure you implement the right solutions when you tackle Authentication.

Friday, November 2, 2007

DIY software faults are expensive, says survey

This article started a series of thoughts in my head about why problems like these exist. The thoughts and issues started coming up like an eruption, so this posting is quite long.

If you work in one IT department long enough (which most of us do not), you will see the entire lifecycle of this problem. If you are more transient, you probably only see part of the story: the genesis (the perfect custom-coded solution, everyone loves you and things are great) or the demise (some jerk created this custom-coded band-aid and now we are stuck trying to fix it).

At inception the deployment makes sense for everyone; doing the work in house seems like the best choice based on your needs and gives you a competitive advantage over taking an out-of-the-box solution. As time passes, something goes wrong, the people who created the custom code have moved on to another job or position, and you are stuck trying to fix something that no one else really knows everything about. There is no product support number to call.

Custom coding is great: it gives you strong advantages over your competition and allows your organization to act according to your business logic instead of the least common denominator of your industry that an out-of-the-box solution usually gives you.

Don't look at the application level first; the key is in your infrastructure. Don't custom code something that should be handled by your infrastructure. Beef up your infrastructure capabilities, so that it is able to adapt to new business needs and able to be supported by professionals (who know the product inside and out) if there is a problem. Don't go this part alone! Keep custom coding to how you USE the data, not how it is delivered. If your infrastructure is flexible and adaptive you will lose nothing and gain reliability, serviceability, and the ability to deploy new initiatives in a fraction of the time of your competitors.

Your infrastructure needs to support flexibility in some key areas:

1) Data publishing - can you represent data from multiple sources as a single source?

If you can't get the data you need in one location, you will find yourself replicating data from multiple silos into a plethora of new repositories at every turn, reassembling the data into a view that is optimal for your new service or application. This creates unnecessary complexity. OR you start to change the structure of your existing data silos. BAD IDEA. There is a reason these silos exist; it's like messing with your DNA, there is a reason these silos evolved the way they did. It is almost like cutting off one of your arms so that you can fit through a new door. Changing the structure or existing integration points of your data silos can have repercussions that are not easily apparent. Often legacy systems are black boxes; you don't know what is really going on inside. Do you really want to start messing around with something you don't totally understand the nature of? This is where data virtualization can help. Virtualize access to disparate data silos and recompose the data into a view that you can use for your client (data modeling).

2) Data access - can you access the data by multiple industry standards?

If your new piece of infrastructure is a directory, then you are stuck with LDAP. Sure, most applications are LDAP compliant for, say, authentication, but what about other applications that would benefit from SQL access, or web services that leverage SAML? Don't get pigeon-holed into one protocol; again, it's about flexibility.

3) Data availability - can you get to the data when you need it?

Again, custom coding doesn't usually allow for growth. Your infrastructure needs to be scalable and have features that can be added for increased performance as your needs grow. The best indicator of a useful tool is how often it is used; same thing here, if you build something useful to your organization, people will find more uses for it. The better your solution is, the more demands will be made on it. You may be the hero this year, but next year, if your solution can't keep up, you will be the next picture in the break room pinned to the dart board.

4) Data integration - can you match disparate data objects accurately?

Integration... sigh... there is a lot to be said here. The problems look simple at first glance, and then when it's time for the IT department to implement them, gentle Dr. Jekyll turns into Mr. Hyde. Data is never as clean as you think it is... Make sure you plan ahead and your infrastructure can support your need to examine data and integrate it accurately. If you can't aggregate, correlate, and integrate data, then you can't leverage existing data, and your efforts will be for naught. This is where virtualization plays a key role again. You have to be able to compare apples to apples, not apples to lawnmowers. Things have to be comparable to be able to compare them! Sounds dumb, but you would be surprised how often this is forgotten; managers think that since it's data in our network, we can just use it any way we want. It's not true. Computers are still dumb; they don't know TimothyPaul and TimPaul are the same person, so how can you expect them to recognize uid=9898A in a directory and user=TPaul in a database as the same person? If you can make this correlation between profiles, I can use my user list in SunOne and extend the entries with groups in Active Directory, or even authenticate against AD as if it were a single SunOne directory for web access control applications like TAM or SiteMinder. (A toy correlation sketch follows this post.)

5) Synchronization - can you propagate data across disparate data sources?

Here is another painful part of data delivery: synchronization. You have to be able to rely on data when you retrieve it. If solutions are not flexible (as most custom in-house solutions are not), then each time a system requirement changes, everything breaks down. Make sure you have a synchronization component in your infrastructure. If you can incorporate it into your other components, all the better; just make sure you can change your topology and add attributes into the system easily.

This is a bit long, but this topic is huge. Custom in-house solutions are the cause of large problems across organizations as they age. It's time to look at the future: build the solution at the infrastructure level, don't try to fit new applications into a rigid enterprise by custom coding around the problem. You don't have to.
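Here is the toy correlation sketch referenced in point 4. The identifiers, join key, and sample records are all hypothetical; the point is only that some explicit matching logic has to exist before two stores can be treated as one identity:

```python
# Hypothetical exports from two identity silos.
directory_entries = [
    {"uid": "9898A", "cn": "Timothy Paul", "mail": "timothy.paul@example.com"},
    {"uid": "7711B", "cn": "Jane Smith",   "mail": "jane.smith@example.com"},
]
database_rows = [
    {"user": "TPaul",  "email": "timothy.paul@example.com", "groups": ["web-admins"]},
    {"user": "JSmith", "email": "jane.smith@example.com",   "groups": ["help-desk"]},
]

def correlate(entries, rows):
    """Join the two sources on a normalized email address and merge attributes."""
    by_email = {row["email"].lower(): row for row in rows}
    joined = []
    for entry in entries:
        match = by_email.get(entry["mail"].lower())
        if match:
            joined.append({**entry, "user": match["user"], "groups": match["groups"]})
    return joined

for profile in correlate(directory_entries, database_rows):
    print(profile["uid"], "<->", profile["user"], profile["groups"])
```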

Ideas I got after reading an article about UNIX and distributed authentication

Some interesting information about using UNIX capabilities to get extended functionality from a distributed authentication environment, specifically allowing laptops to cache login information in memory when disconnected from the network.

http://blogs.techrepublic.com.com/opensource/?p=127

I think the ideas here are more interesting than the implementation suggested. If you can cache the identifiers, why not do that across multiple LDAP stores (even from multiple security domains, domain controllers, and AD forests) into a single directory? You couldn't disconnect your laptop, but you could achieve reduced or single sign-on, especially for external applications. The identification step (the search) of the LDAP store would be much faster; once you have the DN, the credential check would be sent back to the corresponding LDAP store.
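A minimal sketch of that two-step idea (every host name, DN, and attribute here is a placeholder, including the sourceServer attribute I invented to record where an entry came from): search the aggregated view for the user's entry, then send the actual bind back to the originating store.

```python
from ldap3 import Server, Connection

def authenticate(username, password):
    """Find the user's DN in the aggregated view, then bind against the source store."""
    # Step 1: identification against the single aggregated directory.
    view = Connection(Server("ldap://aggregate.example.com"), auto_bind=True)
    view.search("dc=aggregate,dc=example,dc=com",
                f"(uid={username})",
                attributes=["sourceServer"])   # assumed attribute naming the origin store
    if not view.entries:
        return False
    entry = view.entries[0]
    user_dn = str(entry.entry_dn)      # assumes the aggregate preserves source DNs
    source = str(entry.sourceServer)

    # Step 2: the credential check goes back to the corresponding LDAP store.
    try:
        Connection(Server(source), user=user_dn, password=password, auto_bind=True)
        return True
    except Exception:
        return False

print(authenticate("tpaul", "s3cret"))
```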

If I have confused you, just let me know... I will be happy to fill in more details of where my mind is going on this after reading this article today....

Monday, October 29, 2007

Using AD within Linux for authentication

Does Active Directory top Linux authentication options?

http://searchenterpriselinux.techtarget.com/originalContent/0,289142,sid39_gci1279624,00.html

Integration issues between Linux and Active Directory are discussed by Enk (Gartner): metadirectories and other solutions are covered, and LDAP and Kerberos are seen as a disadvantage because most organizations do not have people with LDAP expertise.

“The cross-platform authentication market will probably remain in flux until at least 2009”

Thursday, October 25, 2007

Common Virtual Directory Scenarios

Excerpts from this posting... good stuff...

http://360tek.blogspot.com/2006_03_01_360tek_archive.html

"
  1. Protocol Translation
  2. Web Service Enablement
  3. Multi-Repository Search
  4. Joined Identity View
  5. Permission-Based Results
  6. Dynamic DIT
  7. Authentication
  8. Real-Time Data Access

Virtual Directory technologies eliminate boundaries. Hassles related to LDAP object types, attribute definitions and other schema-related issues are eliminated by virtualizing the view into the backend identity stores. You're no longer limited by the existing data format or database branding. There's no requirement to migrate the data from a relational database into an LDAP directory in order to make the data LDAP- or Web Service- accessible."


He talks about these issues quickly, but don't think they are not HUGE issues in deployment.

Also, some virtual directories offer more interfaces than just standard LDAP. He alludes to this fact as "web service- accessible", but still implies the use of LDAP for the web service. Some virtual directories can present information via other protocols such as DSML (for web services), SQL, SOAP, SAML, etc. Make sure the virtual directory you use supports different protocols for application access.
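As a rough, hypothetical illustration of the protocol translation scenario (the table layout, DNs, and attribute names are all my assumptions), here is a sketch that exposes rows from a relational table as LDAP-style entries, the kind of mapping a virtual directory performs on the fly:

```python
import sqlite3

# A toy relational source standing in for an existing silo.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employees (login TEXT, fullname TEXT, phone TEXT)")
db.execute("INSERT INTO employees VALUES ('tpaul', 'Timothy Paul', '+1 555 0100')")

def rows_as_ldap_entries(connection, base_dn="ou=people,dc=example,dc=com"):
    """Translate each row into a DN plus attribute map, the shape an LDAP client expects."""
    entries = []
    for login, fullname, phone in connection.execute(
            "SELECT login, fullname, phone FROM employees"):
        entries.append({
            "dn": f"uid={login},{base_dn}",
            "attributes": {
                "objectClass": ["inetOrgPerson"],
                "uid": [login],
                "cn": [fullname],
                "telephoneNumber": [phone],
            },
        })
    return entries

for entry in rows_as_ldap_entries(db):
    print(entry["dn"], entry["attributes"]["telephoneNumber"])
```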

Metadata

What is metadata? A description of metadata and basic applications of metadata:
http://www.addsimplicity.com/adding_simplicity_an_engi/2007/10/what-metadata.html

Metadata can be very useful in managing identities; knowing the context of users is critical for IdM initiatives (e.g. authorization), and leveraging existing policy, groups, and roles means more consistent enforcement of business logic and better security across your organization.

In this case metadata refers to how the system currently defines a user. I like to refer to this information as context. Why? Because the metadata allows me to see the context in which the user operates: what the actor does inside the system.

Understanding your metadata means you can leverage it; as always, "knowledge is power"...
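A tiny sketch of what "leveraging context" can mean in practice (the entry shape and group names are hypothetical): an authorization check that reuses the groups and roles the system already records about a user, rather than inventing new flags:

```python
# Hypothetical user metadata as it might come back from a directory or IdM service.
user_context = {
    "uid": "tpaul",
    "groups": ["employees", "web-admins"],
    "roles": ["approver"],
}

def is_authorized(context, required_group=None, required_role=None):
    """Authorize by reusing existing group/role metadata rather than ad hoc flags."""
    if required_group and required_group not in context.get("groups", []):
        return False
    if required_role and required_role not in context.get("roles", []):
        return False
    return True

print(is_authorized(user_context, required_group="web-admins"))   # True
print(is_authorized(user_context, required_role="auditor"))       # False
```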

synchronization versus virtualization

Virtual directories vs. meta-directories: most of the story is right on, but it misses features of a virtual directory by focusing on the virtual directory as only a proxy engine, which it is not. Virtual directories can offer real-time synchronization AND persistent data, negating most of his "disadvantages". Meta is old; virtual is new and more adaptive.
