I’ve been interested in exploring the data space and how it relates to user virtualization for a while because, in my opinion, data is one of the most critical elements of the user experience. There are many solutions on the market for managing user data; however, when you provide software to enterprise organizations there is much to consider, and unfortunately it’s not just a matter of pulling various products together to provide a solution: true thought is needed.
I’ve spent many years of my career consulting for some of the largest organizations in the world and have had the privilege of being part of some truly inspiring technical designs and solutions. One thing that’s been interesting to observe and look back on is how data was managed within these organizations and how it’s different today.
The concept of the home drive is something that’s been with us for years and exists in pretty much every operating system. At one point the corporate home drive was the pinnacle data store for user data, and managing it was essential to any environment. I remember spending time architecting Citrix and terminal services deployments and spending days on home drive management alone, writing scripts to perform clever folder redirection or using tools like subst.exe, join.exe and junction.exe to create pseudo-virtual data views for users.
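To give a flavor of the kind of scripting involved, a logon script from that era might have looked something like the sketch below. The server, share and folder names are hypothetical, purely for illustration:

```bat
:: Map the corporate home share to a consistent drive letter for the user
net use H: \\fileserver\home\%USERNAME% /persistent:yes

:: Present a deeply nested local folder as its own drive letter (subst.exe)
subst P: "C:\Data\%USERNAME%\Projects"

:: Graft a folder from another volume into the user's profile
:: (junction.exe from Sysinternals; both paths must be on local NTFS volumes)
junction "%USERPROFILE%\My Documents\Archive" "D:\Archive\%USERNAME%"
```

Tricks like these gave each user a tidy, consistent view of scattered storage, but they had to be maintained by hand in logon scripts, which is exactly the burden described above.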
As laptops became prevalent, the demise of the home folder began, with users opting to store their data locally rather than on their central stores. Many organizations tried to implement technologies such as offline folders to make things easier; however, with VPNs and slow Internet connections in the way, users simply took the path of least resistance and did what made them most productive.
The decentralization of data was further exacerbated as the smartphone and tablet space gained traction, with many users expecting access to their data from a growing array of devices. While organizations did their best to provide solutions (or excuses), users once again took the path of least resistance and turned to the now-abundant cloud services for help. While these seemingly magical solutions appear to be the holy grail of data management, they bring their own set of problems, including security, compliance, oversight and ownership, to name a few.
I think it’s fair to say that users don’t necessarily want to be bad corporate citizens and generally want to do the right thing, but at the end of the day, like the rest of the organization, they have deliverables and are going to do whatever makes their lives easier. Bill Gates once said that if you show people the problems and show them the solutions, they will be moved to act, and from these words IT can learn a lot. Organizations can’t control users by restricting, denying or making excuses; this will just cause disruption. If they instead provide solutions to user problems, users will generally adopt them with a positive attitude.
Today our CTO and VP of Product Management, Harry Labana, will talk with Brian Madden and Gabe Knuth about a project that has largely been in stealth mode here at AppSense. Project Orca is the codename for the data management solution we are currently developing, which aims to give enterprise organizations the ability to manage user data in what many consider to be a new age of computing access.
When I first presented Project Orca to the AppSense management team, I spoke about three principal factors that have to be considered when creating a data solution: principles that I personally believe make or break any solution.
As I mentioned earlier in this post, user data is probably the most critical element of the user experience, and it’s probably safe to say that almost every subsystem touches it in some way or another. When creating a data management product, it’s very easy to create a completely new system, or yet another place to store data, without considering the effects of doing so. With Project Orca I wanted to create something that not only made use of an organization’s existing storage hardware but also fit in with existing workflows. If, for example, a user accessed data from their laptop through the Project Orca technology, that same data needed to be accessible to the user through a terminal services session with a simple mapped drive.
With the above in mind, we couldn’t simply create a technology that made use of a specially designated data location; instead, we needed to make use of the data locations that users already use. With Project Orca, the content that users see today through their regular mapped network home folder is available through the various Project Orca clients, and vice versa.
A native experience is also essential to workflow, and we worked exceptionally hard to ensure that our agents integrate into the native operating systems. On Microsoft Windows and Apple OS X, for example, the user sees no difference between accessing data in their regular local documents folder and accessing data in their corporate home folder. As with many online services, at client installation the user specifies the location where they want their corporate data to reside, resulting in a regular folder being available to them. Behind the scenes, the Project Orca technology works hard to ensure data is synchronized and centralized.
Access is essential to the success of a data management platform, and it is one of the very reasons that products such as Dropbox are so prevalent. While I strongly disagree that Apple is going to take the enterprise by storm any time soon with an influx of Macs, today’s IT world is not based solely on Microsoft Windows. The (very successful) introduction of the Apple iPad and iPhone into the enterprise has already changed the way many users work, leading users to expect their data to be accessible on these new devices.
The Project Orca technology has desktop clients for both Microsoft Windows and Apple OS X, along with mobile clients for iOS, Android and Microsoft Windows Phone OS. For those devices where Project Orca cannot provide native access, an HTML5-based web client is available, taking advantage of the language’s new features to provide exceptional functionality.
AppSense is known for providing exceptional enterprise technology, so with data we absolutely had to consider the requirements of this space. Organizations are desperate to provide users with the access and tools they need to get at their data, but they have a duty (sometimes a legal requirement) to protect their assets, and users don’t always understand this.
With Project Orca we have created technology that gives users the same levels of access they expect today while at the same time giving IT control and oversight of the data. It’s easy to tell an organization that the box allows HTTPS connections, making access easier, but if the same box is not secure enough to sit in the corporate DMZ and still requires a VPN connection, then the access problem is not really addressed. With Project Orca, we have worked exceptionally hard to deliver technology secure enough to sit in the corporate DMZ while providing access to corporate data.
Over my next few blog posts I’m going to give more insight into Project Orca: how it’s put together, its fundamental building blocks and architecture, along with insight from the developers, architects and engineers involved in its creation. There are some fantastically talented people creating this exciting technology, and we’re very close to the whole world having access to it.
As always, I’m interested to hear people’s thoughts, and be sure to follow me on Twitter, where I’ll let people know as things develop…